- Sigma 150-600 Contemporary OS Anti-Vibration Algorithm Comparison
This article analyzes the anti-vibration (OS) algorithms available under firmware version 1.02 for the Sigma 150-600 mm Contemporary lens. Sigma hasn’t published any information on the relative effectiveness of the three available OS algorithms, so I took it upon myself to see whether there is any difference. The three algorithms are called “Dynamic View Mode”, “Standard”, and “Moderate View Mode”. The default setting, if you don’t program any customization, is “Standard”. Each of these modes is available regardless of whether you select “OS1” or “OS2”. “OS1” is the normal hand-held mode, while “OS2” is for horizontal panning, such as while mounted on a tripod. All tests reported here use “OS1”. Sigma “OS” is the equivalent of Nikon “VR”.

You must use Sigma’s USB Dock with their Optimization Pro software to program any customization into the lens. The dock also allows programming focus fine-tune adjustments (16 settings: 4 focal lengths at 4 distances each) and focus limiter modifications.

To perform the tests, I rested my fist on a tripod, and then rested the lens on my fist. This arrangement afforded me some level of aiming control, while still letting the lens “wiggle around” to simulate hand-holding. I shot about 10 frames of my A0-size resolution chart at 55 feet for each lens switch setting. The chart images were analyzed using the MTF Mapper program by Frans van den Bergh. The resolution measurements require that the chart images be reasonably level and perpendicular to the lens axis, which is why I didn’t simply hand-hold the lens while shooting the chart. I shot each chart image at 400 mm and f/6.3, using back-button focus with “AF-C” continuous autofocus. These are typical shooting conditions for me, which is why I chose them for the testing. Beyond 400 mm, accurate aiming just gets too difficult for reliable, repeatable measurements.
My lens was programmed with the “C1” switch setting using “Fast AF Priority” focus speed and “Dynamic View Mode” for the OS setting. I have already made tests showing that this “fast” focus mode is essentially as accurate as the “Standard” (default) focus mode, at least under firmware 1.02. The “C2” switch was set up with “Standard” AF focus speed and “Moderate View Mode” for the OS setting. If the customization switch is turned off, you get “Standard” AF focus speed and “Standard” OS as well.

Tests such as these are statistical in nature, since they involve taking measurements with a lens waving around. I have included some data below to give you an idea of how the measurements vary. Yes, I could have made 1,000 measurements at each setting to raise the confidence level; I leave that as an exercise to the reader.

High Shutter Speed Tests

My previous testing has shown an insignificant difference in resolution when you leave OS active at higher shutter speeds (1/1000 and above). This statement is not valid for all lenses! On this lens, any OS algorithm is equally effective (or ineffective, if you wish) at high shutter speeds (you need firmware 1.01 or newer to get this result, however). I only saw a decrease of about 1.0 lp/mm MTF50 from leaving OS active above 1/500 shutter speed.

OS Setting Screen
C1 Switch Settings That I’m Using Now

Medium Shutter Speed Tests

The following tests were conducted at a shutter speed of 1/250 second. For 400 mm on an APS-C sensor (600 mm equivalent), I consider this “medium”, and starting to get into the realm of needing anti-vibration. Some people would benefit from stabilization at this speed, and some wouldn’t.

Dynamic View OS, High-speed AF
MTF50 Measurements: 36, 40, 38, 30, 36, 40, 42, 42, 40, 36, 40. Average = 38.2 lp/mm

Moderate View OS, Normal (Standard) AF
MTF50 Measurements: 42, 40, 42, 40, 44, 40, 45, 38, 42, 42, 42, 42. Average = 41.6 lp/mm

Standard (default) OS, Normal (Standard) AF
MTF50 Measurements: 40, 45, 42, 38, 42, 42, 45, 42. Average = 42.0 lp/mm

OS Off, Normal (Standard) AF
MTF50 Measurements: 44, 42, 40, 42, 40, 34, 38, 42, 38. Average = 40.0 lp/mm

Results here don’t show much difference with OS active or not. The “Standard” OS algorithm got the best results, but not by enough to really matter.

Low Shutter Speed Tests

These tests used a shutter speed of 1/60, or a little more than 3 stops beyond the traditional 1/(focal length) limit of 1/600 for a 600 mm equivalent (DX frame). This is roughly the rated effectiveness of OS for this lens.

Dynamic View OS, High-speed AF
MTF50 Measurements: 23, 26, 34, 30, 26, 30, 22, 28, 28, 28. Average = 27.5 lp/mm

Moderate View OS, Normal (Standard) AF
MTF50 Measurements: 30, 24, 28, 30, 30, 28, 30, 30, 30, 28. Average = 28.8 lp/mm

Standard (default) OS, Normal (Standard) AF
MTF50 Measurements: 24, 23, 24, 26, 32, 22, 24, 18, 28, 24. Average = 24.5 lp/mm

Results here show that “Moderate View” is the winner. I didn’t show “OS Off” here, because those images were mostly blurred beyond recognition. Bear in mind that the MTF Mapper software is extremely picky, so the numbers here may lead you to believe that OS is not that helpful. Not true: the pictures are enormously helped by the OS system when you shoot at slower speeds, but there’s no substitute for high shutter speeds.

Conclusion

It appears that “Moderate View” wins, although not by a huge margin. Sigma (rather cryptically) describes how each OS algorithm “looks” through the viewfinder; to me, what counts more is which algorithm provides the best anti-vibration effect in the final picture. It does appear that the end result in your pictures differs depending upon which OS algorithm you pick. I prefer the “look” of Nikkor VR to Sigma OS when looking through the viewfinder, but both companies seem to provide roughly equivalent results in the final shot. Newer lenses invariably provide better stabilization, though. Sigma has the advantage that future OS algorithm improvements can arrive through a firmware update.
I saw a definite improvement in the Sigma autofocus system (speed and accuracy) after loading firmware versions 1.01 and 1.02. I also saw an improved ability to not “mess up” the shot when forgetting to turn off anti-vibration at high shutter speeds. Don’t be surprised if Sigma has more tricks up their firmware sleeves in the future. Recently, Tamron finally saw the light and is now copying Sigma’s ability to reprogram lens firmware (in their new 150-600 mm offering), with an essentially identical feature set. Nikon et al. haven’t yet seen the light. #review
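The per-setting averages quoted in the medium-speed tests above can be rechecked with a few lines of Python. This is just a sanity check on the article’s arithmetic (recomputing shows the Moderate View run averages closer to 41.6 lp/mm), using the standard library’s statistics module; the sample standard deviation gives a feel for the shot-to-shot spread the article mentions.

```python
# Recompute the medium-speed (1/250 s) MTF50 averages quoted above.
# statistics.stdev is the sample standard deviation (n - 1 divisor).
from statistics import mean, stdev

runs = {
    "Dynamic View OS": [36, 40, 38, 30, 36, 40, 42, 42, 40, 36, 40],
    "Moderate View OS": [42, 40, 42, 40, 44, 40, 45, 38, 42, 42, 42, 42],
    "Standard OS": [40, 45, 42, 38, 42, 42, 45, 42],
    "OS Off": [44, 42, 40, 42, 40, 34, 38, 42, 38],
}

for name, mtf50 in runs.items():
    print(f"{name}: n={len(mtf50)}, "
          f"mean={mean(mtf50):.1f} lp/mm, stdev={stdev(mtf50):.1f} lp/mm")
```

With roughly 10 samples per setting and a spread of a few lp/mm, differences of about 1 lp/mm between settings are well inside the noise, which is exactly the article’s point.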
- The Fallacy of Spray and Pray
Blue Angel Daredevils

Photographers lust after that pro camera model with those high frame rates, so that they won’t miss that crucial shot. Guess again. Let’s say you just got that new D500 and you dialed in 10 frames per second. How could you miss now? Well, let’s look at some simple math. The above shot shows two jets which are cruising at about 500 miles per hour. Their closing speed is 1000 miles per hour, which is about 0.28 miles per second, or almost 1500 feet per second. That D500, taking a picture every tenth of a second, captures those jets every 150 feet or so. Those aren’t very good odds of getting the jets right next to each other, are they? A similar scenario gets played out trying to capture the touchdown catch.

So what to do? How about relying on your own reflexes? You can be quicker than you might think. Something I’ve noticed is that most photographers will close their left eye while looking through the viewfinder with their right eye. Stop that! Train yourself to observe what’s going on with your left eye. You need to be able to anticipate peak action, and you can’t do that if you can’t see it. Next, you need to learn to compensate for the slight delay between squeezing the shutter release and the camera taking the shot, called “shutter release lag”. This lag is usually about 40 or 50 milliseconds, unless you have one of those point-and-shoots that can take eons to respond.

There is no substitute for practice. No matter how much automation your camera has, it will never be able to replace your human intelligence or your anticipation of action. Learn your camera and lens; know which way to twist that zoom ring. Make it second nature to zoom out until you locate your subject, then zoom in to properly frame the shot. I’m not claiming that pro camera features have no value. I’m just saying that you can’t necessarily buy your way to getting that great shot. If this were all trivial, then where’s the fun and challenge in that?
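The frame-spacing arithmetic above can be checked in a couple of lines. The 1000 mph closing speed and 10 fps burst rate are the figures from the article; everything else is unit conversion.

```python
# Two jets closing at a combined 1000 mph, camera shooting 10 frames/s:
# how much ground do they cover between consecutive frames?
MPH_TO_FPS = 5280 / 3600                 # feet-per-second in one mile-per-hour

closing_speed_fps = 1000 * MPH_TO_FPS    # combined closing speed, ft/s
frame_rate = 10.0                        # burst rate, frames per second
feet_per_frame = closing_speed_fps / frame_rate

print(f"closing speed: {closing_speed_fps:.0f} ft/s")
print(f"distance covered between frames: {feet_per_frame:.0f} ft")
```

That works out to roughly 1470 ft/s and about 147 feet between frames, which is the “every 150 feet or so” in the text; even a pro burst rate samples the action far too coarsely to rely on luck.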
Go after that satisfaction of owning the shot that didn’t get away!

Jet Smooch

#howto
- MTF Mapper Version 0.5.8
This version of MTF Mapper has some new features and some changed features. This is the software that I use to evaluate both lens resolution and focus calibration. The author of this program is Frans van den Bergh. You can get this software here: https://sourceforge.net/projects/mtfmapper/

This new revision can still use the original resolution and focus chart designs, which is a real relief if you have invested time, effort, and money in printing and mounting large versions of the charts. If you print the newer charts, you get some new and welcome abilities. What’s changed? You may want to review my MTF Mapper Cliff’s Notes article, detailing the older version’s capabilities.

New Resolution Chart

The biggest change, as far as I’m concerned, is the switch from ‘relative’ measurements to ‘absolute’ measurements in the resolution charts (grid2d and grid3d). The chart scales of earlier MTF Mapper versions would only span the range of the actual measurements, but now the resolution range starts at zero. Another big change is the switch to monochrome color coding in the 2D and 3D charts, instead of the ‘rainbow’ color coding that would auto-scale to the entire measurement range. A small but welcome change is the addition of the photo name under the chart, so you know where the chart came from.

Original resolution chart design. You can still use this chart.
New resolution chart design, showing annotated photograph.
New chart up close. Good edge measurements are blue; “iffy” ones are in yellow.
2D chart with absolute scale for MTF50 lp/mm
Older 2D chart measurements with relative scale for MTF50 lp/mm
New 3D resolution chart showing the absolute (monochrome) scale

MTF 10 and MTF 30 Graphs

Camera companies have traditionally published MTF charts that show 10 lp/mm and 30 lp/mm “theoretical” values. I stress “theoretical” here, because those companies are merely blowing smoke. They don’t actually measure anything (at least Canon and Nikon don’t).
MTF Mapper can now plot the real-deal MTF10 and MTF30 charts, based upon actual measurements. What a concept. These graphs use the same chart design as “grid2d” and “grid3d”.

MTF10 and MTF30 measurements using the new resolution chart

Focus Chart

The new software can use the original focus chart. What’s new is how the ‘annotated’ version of the chart displays the measurements, which are in “cycles per pixel”. The measurements are no longer embedded inside boxes, which makes reading the values and seeing the edges much easier.

Focus chart photo, showing the annotated edge measurements
“Profile” option for focus chart

The chart above shows a very slight focus error. The chart is oriented so that its left side is farther from the camera than its right side; the ideal angle to shoot the chart is 45 degrees relative to vertical. The measurements above indicate that the camera (or lens firmware) needs some “-” focus fine-tune adjustment, to pull the focus toward the camera. I always recommend, by the way, looking at the annotated focus chart measurements; there are occasions when the numbers give you a better idea of how to adjust focus. Also, repeat this test several times to avoid reacting to normal focus variations. Lastly, perform the tests in good light for optimal reliability.

In the focus test above, the camera focus point was placed on the right edge of the large central trapezoid. Because the chart is rotated, the trapezoid looks like a rectangle in the photograph.

Measure Longitudinal Chromatic Aberration

Focus chart with “fiducials” for measuring longitudinal chromatic aberration
Chart zoomed in, showing green-channel focus error.

The other major feature addition is the ability to analyze how a lens focuses in the red, green, and blue channels. When the different channels’ focus measurements don’t coincide, you have longitudinal chromatic aberration.
To create the charts shown above, MTF Mapper needs to be configured as shown in the following picture:

Preferences dialog. Note that the camera sensor “pixel size” must match your camera.

Summary

The new MTF Mapper version 0.5.8 brings many welcome additions. You might want to retain your older version, however, if you prefer the “relative” rather than the “absolute” resolution measurements. Please visit the Frans van den Bergh site and give him some praise for going through all this effort. Frans, you’re the man! #review
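For the curious, the core idea behind MTF Mapper’s edge-based measurements can be sketched in a few lines: photograph a black/white edge, differentiate the edge-spread function to get the line-spread function, take its Fourier transform to get the MTF, and read off where contrast falls to 50%. The sketch below is a deliberately simplified stand-in, assuming a synthetic Gaussian-blurred edge and a hypothetical 3.9 µm pixel pitch; the real program supersamples along a slanted edge for sub-pixel resolution, which this toy version skips.

```python
# Simplified sketch of an edge-based MTF50 measurement:
# edge photo -> edge-spread function (ESF) -> derivative (LSF) ->
# FFT -> MTF curve -> frequency where contrast drops to 50%.
import numpy as np
from math import erf, sqrt

PIXEL_PITCH_MM = 0.0039   # assumed sensor pixel size (about 3.9 um)
sigma = 1.0               # synthetic edge blur, in pixels

# Synthetic ESF: brightness profile across a blurred edge, one sample/pixel.
x = np.arange(-32, 32)
esf = np.array([0.5 * (1 + erf(xi / (sigma * sqrt(2)))) for xi in x])

lsf = np.diff(esf)                    # line-spread function
mtf = np.abs(np.fft.rfft(lsf))
mtf /= mtf[0]                         # normalize so MTF(0) = 1
freqs = np.fft.rfftfreq(lsf.size)     # cycles per pixel

# Locate the first 0.5 crossing, interpolating between frequency bins.
i = int(np.argmax(mtf < 0.5))
f50 = freqs[i - 1] + (freqs[i] - freqs[i - 1]) * \
      (0.5 - mtf[i - 1]) / (mtf[i] - mtf[i - 1])

print(f"MTF50 = {f50:.3f} cycles/pixel = {f50 / PIXEL_PITCH_MM:.1f} lp/mm")
```

Dividing the cycles-per-pixel figure by the pixel pitch converts it to the lp/mm numbers quoted throughout these articles, which is why the program’s “pixel size” preference must match your camera.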
- MTF Curves: Theoretical Versus Actual
All camera companies (with the exception of Sigma and Leica) publish MTF curves for their lenses that are “theoretical” and not actually measured. Should you care? Personally, I believe in the old President Reagan saying “trust, but verify”. What follows is a dose of reality, compared to theory. I have chosen what most people would agree are among Nikon’s best pro lenses for this study, lest I get accused of measuring lenses that were manufactured to lesser standards. The MTF curves I’m referring to are the traditional mix of MTF10 (contrast) and MTF30 (sharpness). I used the MTF Mapper software, version 0.5.8, to create the following charts. Personally, I place much more stock in the two-dimensional MTF50 plots that measure the whole camera sensor; unfortunately, 2-D MTF50 plots are hard to come by outside of this site.

MTF charts are traditionally generated at a wide-open aperture, so that’s how mine are measured. It’s unknown what focus distance Nikon uses; mine are measured at the distance needed to photograph an A0 resolution chart filling the frame. I took the measurements in shade on a clear, sunny day. Light wavelengths can affect measurements; I like to test using the same lighting conditions in which I normally shoot.

105mm f/2.8G ED‑IF AF‑S VR Micro Nikkor

This lens is supposed to be optimized for “close” distances, but I’m measuring it at a more conventional distance.

Nikon theoretical chart (from Nikon site) for the 105mm
Measured MTF10 and MTF30 for the 105mm at f/2.8

I don’t want to appear cynical, but I was 99% sure that my measurements would show less sweetness and light than the Nikon claims. This is pretty much borne out by the measurements. Take a look at the edge of the frame, though: it actually performs better than theoretical!

85mm f/1.4 Nikkor

Measured MTF10 and MTF30 for the 85mm at f/1.4

Again, not quite as good as theoretical. The edges have a few pleasant surprises, however.

85mm at f/4.0

Just for fun, I tried an f/4.0 test.
It really cranks up the quality, doesn’t it?

24-70mm f/2.8E ED VR AF-S Nikkor

The wide end of this lens looks dramatically different from theoretical. Again, at 70mm this lens looks quite a bit different from the claims.

Conclusion

It appears that Reagan had some good advice. Bear in mind that these lens samples don’t represent the whole population; your mileage may vary. My biggest surprise is that the FX frame edges fared better than expected. Trust, but verify. #review
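One theoretical curve anyone can compute, without access to Nikon’s lens-design software, is the diffraction limit: the hard ceiling that no lens at a given f-number can beat, regardless of design. The sketch below assumes green light at 550 nm (my assumption; the published curves don’t state a wavelength) and evaluates the standard aberration-free MTF formula for an f/2.8 lens, for comparison with the measured curves above.

```python
# Diffraction-limited MTF of an ideal (aberration-free) lens.
# This is an upper bound: a real lens always measures at or below it.
from math import acos, sqrt, pi

def diffraction_mtf(freq_lpmm, f_number, wavelength_mm=550e-6):
    """MTF of a perfect lens at spatial frequency freq_lpmm (lp/mm)."""
    cutoff = 1.0 / (wavelength_mm * f_number)   # frequency where MTF hits 0
    v = freq_lpmm / cutoff
    if v >= 1.0:
        return 0.0
    return (2 / pi) * (acos(v) - v * sqrt(1 - v * v))

for f in (10, 30, 50):
    print(f"f/2.8 diffraction limit at {f:2d} lp/mm: {diffraction_mtf(f, 2.8):.3f}")
```

At f/2.8 the cutoff is around 650 lp/mm, so at the traditional 10 and 30 lp/mm frequencies the diffraction ceiling is nearly 1.0; the gap between that ceiling and a measured curve is entirely aberrations, decentering, and manufacturing variation.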
- Focus Stacking With Combine ZM
I have tried a few different programs that let you increase depth of focus by stacking pictures shot at varying focus distances. Most of those programs will readily fail when the subject is too complex or the focus depth is too extreme. Focus stacking is mostly used in two different realms: landscapes and macro photography. Landscape photographers usually want maximum depth of field and maximum resolution, which can be had by stacking photos shot at the sharpest aperture and at multiple focus distances. Macro photographers know only too well that a single close-up can have paper-thin depth of focus; combining a dozen or more shots is often necessary to get sufficient depth. I have had too much grief using Photoshop and Hugin tools, but a (free) program that works pretty well for me is CombineZM by Alan Hadley.

Stacking pictures requires a lack of image movement, so wind can mess up your plans. The pictures need consistent exposure, so manual exposure works best (at a constant aperture). For macro work, I like to use my Nikon PB-4 bellows (with its rack-and-pinion focus rail) to easily move the camera/lens combination from shot to shot, shifting focus by maybe half a millimeter per shot. Good luck finding a Nikon PB-4 bellows.

The default settings in CombineZM don’t always work the best for me, so I thought I’d share how I make it work. It bears mentioning that “macro” in CombineZM means “run a sequence of steps” and has nothing to do with close-ups. My most successful recipe to stack pictures is this:

1. Post-process and convert your pictures into TIF format (16-bit with LZW compression is what I use) using your favorite image editor.
2. Run CombineZM (I use it in Windows 7 and Windows 10, but it works in other operating systems, too).
3. Select File | *New and then select the set of TIF pictures to stack (select in focus order). Wait until the pictures are loaded.
4. Select Macro | Do Weighted Average. I have less success using “Do Stack” or “Do Weighted Average Correction”.
5. After it finishes, use your mouse to draw the diagonals of a rectangle around the ‘good’ part of the result.
6. Select File | Save Rectangle As. I just save the result as JPG, typically at 95% quality.

Here’s a sample finished shot, which is from a stack of 10 files:

“Do Weighted Average” macro used to create the stack

I actually took even more shots in front of and behind what was used above. The software started to mess up with this many pictures, so I omitted some shots to achieve the result shown above.

“Do Weighted Average” with even more pictures in the stack. Note the evil ‘ghosting’.
“Do Stack” macro. Note the strange artifacts with this option.
What a focus stack looks like before you crop it.
Typical single-shot depth of focus (60mm f/10)

Depending upon your subject, you may have to iterate on the selected options or on how many pictures you stack. Life is rarely simple. Make sure your subject is a bit smaller than the frame, because you will have to crop the edges of the “stack”.

Conclusion

Focus stacking is one of those techniques that takes a little tenacity. There are many different tools that can stack pictures, with varying degrees of success (or failure). This is one of those digital tricks that seems to defy optical physics. If you’re willing to put in the effort, the results can be quite rewarding. #howto
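CombineZM doesn’t publish the details of its “Do Weighted Average” algorithm, but the general idea behind sharpness-weighted stacking can be sketched with NumPy: each source frame votes for each pixel in proportion to its local sharpness (here, squared gradient magnitude), so in-focus regions dominate the blend. This is a minimal illustration of the principle, assuming pre-aligned, same-size grayscale frames; the demo frames are synthetic, each sharp in one half and flat in the other.

```python
# Sharpness-weighted focus stack: pixelwise weighted average, where the
# weight of each frame at each pixel is its local gradient energy.
import numpy as np

def sharpness_map(img, eps=1e-6):
    """Per-pixel sharpness: squared gradient magnitude (eps avoids 0/0)."""
    gy, gx = np.gradient(img.astype(float))
    return gx**2 + gy**2 + eps

def weighted_stack(images):
    """Blend a focus stack by local sharpness; in-focus detail wins."""
    stack = np.stack([img.astype(float) for img in images])
    weights = np.stack([sharpness_map(img) for img in images])
    return (stack * weights).sum(axis=0) / weights.sum(axis=0)

# Demo: frame 'a' has detail (a ramp) on the left, frame 'b' on the right.
a = np.zeros((8, 8)); a[:, :4] = [0, 2, 4, 6]
b = np.zeros((8, 8)); b[:, 4:] = [6, 4, 2, 0]
result = weighted_stack([a, b])   # left half follows a, right half follows b
```

Near the seam where the two frames disagree, the weights fight each other, which is the small-scale version of the “evil ghosting” visible in the over-long stacks above.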
- Clean Your Camera Image Sensor
Are you a little intimidated about cleaning those dust bunnies off your camera sensor? Should you punt and pay to have it done for you? It’s a little scary to clean your camera sensor if you haven’t done it before. I used to bring my camera to a Nikon repair facility to get it cleaned. They would keep my camera overnight, and it would cost me $70.00 for something that took about 5 minutes of their labor.

My D7000 camera, for roughly the first 12,000 exposures of its life, would sling oil onto the sensor. The Nikon service center denied this was oil, and suggested I was probably a little sloppy in my camera-handling cleanliness. I beg to differ. One time I cleaned my sensor (a “wet clean”) and then made a 1000-shot time-lapse video. By the end of this 20-minute video, the sensor probably had a hundred oil blobs on it. Arrgh.

There is a cheap solution (one of those bad puns again). Believe me, you can’t get oil off of a camera sensor unless you give it a “wet clean”. But what about the more usual case of mere sticky dust, the kind that a blower can’t budge? There’s now a tool you can buy that can clean off stubborn sensor dust in a safe and easy way. I made a little video that shows how simple and quick it can be to clean your camera sensor yourself. You don’t need to be a fraidy-cat any more! #howto
- The Orton Effect
Michael Orton is a photographer who wanted to re-create the look of a watercolor painting on film. He invented a technique that sandwiches slides of in-focus and defocused versions of the same scene. Michael originally called his technique “Orton Imagery”, but now everybody just calls it “the Orton Effect”. When digital photography came along, people wanted to emulate this effect in software. Perhaps the most famous use of this effect is in the Hobbit movies: people knew the “look” was different, but they couldn’t put their finger on what the difference was.

I really love the look of the Orton Effect for certain kinds of subjects. Just like cupcakes, though, you may like them, but they’re less than ideal as a steady diet. Everything in moderation.

A straight shot
The Orton Effect (Capture NX2, blur radius 25)

A Few Ways To Create The Orton Effect

Many different photo-editing packages can create the Orton Effect; some examples are Gimp, Photoshop, and Nikon Capture NX2. Maybe one of these days your camera will have an “Orton Effect” setting to create it directly.

Nikon Capture NX2

I must be one of the last hold-outs still using Capture NX2. I have a zillion batch files to process pictures with this software; one of them is, of course, the Orton Effect.

1. Set the Output curve to a value of 3 in “Levels & Curves”. Leave the other settings at their defaults.
2. Set a Gaussian Blur value of around 25, with a blending mode of “Multiply”. The radius value here should be set to suit your subject matter.
3. Set a midpoint value of “2” in Levels & Curves, and set the blending mode to “Multiply”.

There you have it. At this point, it would be prudent to save your steps as a Batch Process (Batch | Save Adjustments… | Save As | OrtonEffect). Now you can select photos and run the batch process on them to get the Orton Effect without having to memorize any more steps.
You might want to save a few different batch files, setting a different Gaussian Blur radius in each one (the second step).

The Orton Effect (Capture NX2, blur radius 35)

For simple subject matter, I prefer a larger blur radius. For you, sprinkle to taste.

Adobe Photoshop

1. Duplicate the photo and call the layer “Sharp”.
2. Right-click on the “Sharp” layer, select “Duplicate Layer…”, and name it “Sharp copy”. Select “Screen” for the blending mode of this layer.
3. While “Sharp copy” is still selected, right-click and select “Merge Down”. You will be left with just the “Sharp” and “Background” layers.
4. Right-click on the “Sharp” layer, select “Duplicate Layer…”, and name it “OutOfFocus”.
5. With the “OutOfFocus” layer selected, go to “Filter | Blur | Gaussian Blur” and set a radius suitable to the effect you want. No details should be visible, but you should still be able to make out shapes.
6. Set the “OutOfFocus” layer blending mode to “Multiply”.
7. All that’s left is to save your final image in whatever format you prefer.

Finished Orton Effect using Photoshop

Summary

I would encourage you to explore this processing technique. It can transform a blah photo into something special. I find that purely literal recording of images can start to feel a bit mundane; try something on the wild side once in a while. Michael Orton did some really pioneering work in photography. We owe him a big thank you. It does look a little like a watercolor painting, doesn’t it? #howto
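The Photoshop recipe above boils down to simple per-pixel math, which can be sketched without any editor at all. In the sketch below, pixel values are floats in 0..1, screen(a, b) = 1 - (1 - a)(1 - b) does the brightening, multiply(a, b) = a * b does the darkening, and a repeated box blur stands in for Gaussian Blur (my substitution; three box passes approximate a Gaussian). It illustrates the structure of the effect, not any editor’s exact output.

```python
# The Orton Effect as array math: brighten (screen the image with itself),
# blur a copy, then multiply the two together.
import numpy as np

def box_blur(img, radius, passes=3):
    """Separable box blur; repeated passes approximate a Gaussian blur."""
    k = np.ones(2 * radius + 1) / (2 * radius + 1)
    out = img.astype(float)
    for _ in range(passes):
        out = np.apply_along_axis(lambda r: np.convolve(r, k, "same"), 1, out)
        out = np.apply_along_axis(lambda c: np.convolve(c, k, "same"), 0, out)
    return out

def orton(img, radius=8):
    screened = 1.0 - (1.0 - img) * (1.0 - img)   # the "Screen" layer step
    blurred = box_blur(screened, radius)          # the "OutOfFocus" layer
    return screened * blurred                     # the "Multiply" blend

gray = np.full((32, 32), 0.5)
print(orton(gray, radius=4)[16, 16])   # uniform midtone comes out near 0.5625
```

A uniform 0.5 midtone screens to 0.75 and multiplies back down to about 0.56, which is why the effect brightens midtones a little while the blurred multiply layer softens detail.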
- White Balance Calibration When Colors Go Haywire
Setting the white balance is one of those things that can be laden with a lot of guilt. If you shoot RAW, it’s supposed to be a “don’t care”, but many photographers will look down their noses at you if you don’t “do Kelvin”. I thought it was high time to do a little comparison shopping. Modern cameras have a lot of computing horsepower to figure out what white balance you should be using, à la “Auto”, but is it any good? What about using tables that supply the answers? What about a color meter? What about just setting it in your photo-processing software after the fact? How about using “Live View” mode to help you decide? So many questions.

I always shoot RAW, so white balance decisions aren’t a big deal to me. If I don’t like the color balance, I just change it in photo-editing software. With JPEG, though, it’s not nearly as forgiving. JPEG has very little elbow room for errors, so you want to get it right in the camera. But how can you reliably do that?

I conducted some tests using a Nikon D610 and an Android smartphone program called “Light Meter” (version 2.6), written by Borce Trajkovski. This program lets you measure light levels (in lux) and also color temperature in kelvin. The program has the additional advantage of allowing you to calibrate it for both light (scale and offset) and color (scale and offset), and it gives you approximate calibration values to use for various smartphone models. I used Nikon’s Capture NX2 to adjust and analyze my RAW test shots, but many photo-editing packages would work just as well.

Sunshine

I took 3 shots of my neutral grey card target illuminated by direct sun, using “Auto” white balance, the measured color-meter temperature (5560K was the closest setting), and the “Direct Sunlight” camera white balance selection. Not surprisingly, all of the shots look acceptable (although the “Direct Sunlight” choice was off the most).
Using the histogram view in Capture NX2, I could see that the R, G, B color peaks were nearly on top of each other, as they should be for a neutral grey target. The in-camera histogram showed the same result.

“Auto” white balance in sunshine. 5433K was set automatically.
“5560K” white balance in sunshine. Meter indicated 5600K.
“Direct Sunlight” white balance selection in sunshine. 5209K was set by camera.

Most cameras have sunlight pretty well figured out, so you’d expect those shots to have well-balanced color.

Shade

My next test was in open shade, under a clear blue sky.

“Auto” white balance in shade. 7662K was set automatically.
“7140K” white balance in shade. Meter indicated 7000K.
“Shade” white balance selection in shade. 7989K was set by camera.

The analysis of the histogram peaks indicates that “Auto” white balance is the best here, but again all three are reasonably close to each other. I prefer the color-metered setting.

Indoor LED Lighting

“Auto” white balance inside using LED (ceiling) lighting. 3390K was set automatically.
“4350K” white balance in LED lighting. Meter indicated 4300K.
LED lighting “Live View” guide with “3030K” selected WB.

Now things get interesting. The “Auto” setting was pretty inaccurate, and the color meter was really terrible. It turns out that by using “Live View”, I could really nail the white balance: in “K” WB mode, I could spin my camera’s control dial and instantly see the color change on the screen. An electronic viewfinder would work the same way.

Since I use RAW, though, all is not lost. If I messed up the in-camera white balance, all I have to do is adjust it in my photo editor when I get back home. Here’s the trick: with a neutral grey target, adjust the white balance value until the RGB histogram peaks coincide. Let’s take a look at the badly adjusted shot in Capture NX2 next.
With the original 4350K setting, the RGB peaks aren’t even close to where they should be (they should land on top of each other). By adjusting the “Fine Adjustment” slider to 2950K, the peaks overlap and the picture is now perfectly neutral.

“Live View” really saved the day on the indoor shots. If I were shooting an indoor wedding ceremony, at the mercy of whatever lighting was there, I’d definitely want to consult “Live View” to set my white balance. Outdoors in bright lighting is another animal, however: Live View (unless you use something like a Hoodman Loupe or have an electronic viewfinder) is an underwhelming experience. You should still be able to analyze the histogram peaks on shots after the fact to help you dial in white balance, though. By the way, don’t even think about using published tables of color temperatures for indoor lighting. Indoor lighting color is all over the map, and tables are mostly useless.

Conclusion

So what have we learned today, class? Auto white balance can be your friend and your foe; learn when it’s safe to use it and when it’s not. Carry a grey card with you to calibrate the white balance. Take a shot of the card so that you at least have a good reference picture to dial in white balance at home with your photo editor, and remember to take another shot of the card when the lighting changes. Live View can really be your friend, even if you just use it to dial in the white balance and then turn Live View off. Shooting birds in flight moving in and out of shade, however, leaves a single viable choice: “Auto” WB. This is what RAW format is all about; just fix the color in your photo editor. Happy (calibrated) shooting. #howto
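The “make the RGB histogram peaks coincide” trick can be expressed as simple arithmetic: sample the grey-card region and scale red and blue so their means match green. The channel means below are made-up numbers for illustration (a warm shot where red reads high off the card); the point is the shape of the calculation, not the values.

```python
# Grey-card white balance: per-channel gains that make the card neutral,
# i.e. that bring the R and B histogram peaks onto the G peak.
def grey_card_gains(r_mean, g_mean, b_mean):
    """Multipliers for R, G, B that neutralize a grey-card patch."""
    return g_mean / r_mean, 1.0, g_mean / b_mean

# Hypothetical too-warm shot: red reads high and blue low off the card.
r_gain, g_gain, b_gain = grey_card_gains(140.0, 120.0, 100.0)
print(f"apply gains R x{r_gain:.3f}, G x{g_gain:.3f}, B x{b_gain:.3f}")
# After applying them, all three channel means land on 120: peaks coincide.
```

Photo editors express the same correction as a color-temperature (and tint) slider rather than raw channel gains, but sliding until the histogram peaks overlap is doing exactly this math by eye.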
- Lens Focus Repeatability and Calibration
Many people are under the impression that your camera/lens will auto-focus the same way each time. Nope, nope, nope. Camera designers have to live in the real world of “close enough”, “fast enough”, and “cheap enough”. The holy grail of focus is to make sure your target gets inside the zone of acceptable focus. If your camera misses perfect focus every time (and it probably will), it doesn’t really matter as long as the target is still inside that zone. I’m going to show you some real-world measurements and what kind of compromises you need to make when evaluating and calibrating your focus. I’m assuming you have a camera that supports focus calibration.

It drives me crazy when people make claims about “facts” without the data to back them up and without giving you the tools to repeat the same experiments for yourself. What follows should be reasonably repeatable by anyone, without much expense involved. As always, it bears repeating that “your mileage may vary”. Measurements are affected by the camera, lens, light level, aperture, target size, alignment accuracy, target distance, and stuff I haven’t even thought of.

I decided to try the experiment with two different cameras and two different lenses: a Nikon D7100 with my Sigma 150-600 at 300 mm, and a Nikon D610 with a Nikkor 24-70 at 70 mm. They’re both competent combinations, and should be representative of what an average camera enthusiast might use. I did all tests with the aperture wide open, since stopping down would only obscure the results. The Nikon D7100 has a focus sensitivity down to -2 EV, and the D610 down to -1 EV. That doesn’t mean you should perform a focus test there. I always use a light level of at least 10 EV for testing: I’m after focus repeatability, and repeatability goes out the window if you shoot in dim light. How far out of the window would be an excellent topic of study for another time.
Keeping with the theme of doing things by the numbers, I’m using my go-to analysis software, MTF Mapper (version 0.5.13) by Frans van den Bergh. I used the “focus” option with his “mfperspective_a3.pdf” chart, printed to about 10” by 12” and mounted flat. The chart is oriented 45 degrees to the camera, to capture correct depth information. This arrangement gives me ample accuracy for evaluation. By the way, the camera pixel size doesn’t matter for this particular test, but you need to remember to set it for the other measurements in the program options.

All of my photographs are made in un-sharpened RAW format. I de-focused the lens between shots, alternating between too-near and too-far de-focus to exercise both directions of auto-focus, so there would be no directional bias in the shots. I always use back-button focus.

What the focus chart looks like.

The image above shows what gets photographed and analyzed. The chart’s left side is rotated farther away from the camera’s plane of focus by 45 degrees. The camera focus sensor is pointed dead-center on the chart.

Sample measurement from Nikkor 24-70 mm at f/2.8 and 1 meter.

The above picture shows how MTF Mapper can measure the key elements in the chart and provide focus error measurements. In this picture, the camera missed focus by 2.8 mm at a distance of 1 meter. The camera focus sensor was aimed at the marker under the vertical orange arrows. The depth of sharp focus for this lens/aperture/distance combination is about 15 mm; anything inside that 15 mm focus window counts as “success”.

It’s possible to make the measurements using only the red-, green-, or blue-sensitive sensor pixels in the photo, if desired. Lenses with significant longitudinal chromatic aberration will have focus peaks that are widely separated. For this experiment, all that is needed is to be consistent in using the same settings each time.

Sample measurement from Sigma 150-600 at 300 mm, f/5.6 and 4.22 meters.
The depth of sharp focus for this lens/aperture/distance combination is about 45 mm; anything inside that 45 mm focus window would be “success”. Here, the camera missed focus by 0.7 mm, which is pretty much dead-on.

Sample results for Nikkor 24-70 mm f/2.8 Test

The Nikkor 24-70 mm at 70 mm test follows. It should be noted that this lens is notorious for focusing differently at different focal lengths. This means that it is impossible to “fine tune” focus at a single value and have it correctly calibrated throughout the focal range. I have set a fine-tune value (+16) that under-compensates at 70 mm and over-compensates at 24 mm, with a bias toward 70 mm. The Sigma lens has far smarter firmware in it, and I have it calibrated at 4 different focal lengths and 4 distance settings per focal length, for a total of 16 calibration fine-tune settings.

Measurement errors per photograph (mm): -8.1, -11.2, -10.6, -1.7, -8.4, -7.6, -5.0, -7.2, -4.5, -6.2, -7.3, -8.9, -1.9, -7.1, -1.1, -7.0, -4.3, 2.8, -2.6

N = 19, MEAN = -5.68 mm, STDEV = 3.55 mm

Given a “sharp zone” of about 15 mm (plus or minus 7.5 mm), I’d say this test showed a focus miss about a third of the time. The mean of -5.68 mm is my “bias” focus error, there to help minimize the “+” focus error I get when zoomed to 24 mm. The standard deviation of 3.55 mm is a measure of the typical magnitude of each focus “miss”; call this repeatability. It is actually an impressive value: a typical error of 1 part in 282, or (1000 mm / 3.55 mm).

Bear in mind that the 15 mm “sharp focus depth” criterion is actually quite picky. Also note that the focus chart target was only 1 meter away and the shots were made with a wide-open aperture; longer distances and/or stopping down would make the sharp-focus window much deeper. If I expected to spend the day shooting at 70 mm, I’d certainly adjust the focus fine-tune up to the maximum of +20, and my focus miss rate would go toward zero.
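These summary statistics are easy to reproduce with Python's statistics module. The sketch below recomputes the Nikkor numbers above and, with the same helper, the Sigma numbers reported in the next section (the ±7.5 mm and ±22.5 mm windows are simply half of the 15 mm and 45 mm sharp zones):

```python
import statistics

nikkor = [-8.1, -11.2, -10.6, -1.7, -8.4, -7.6, -5.0, -7.2, -4.5, -6.2,
          -7.3, -8.9, -1.9, -7.1, -1.1, -7.0, -4.3, 2.8, -2.6]
sigma  = [-0.7, -9.8, -14.5, -16.6, 1.1, 14.5, 0.6, -0.19, 9.4, 4.9,
          -15.3, 7.0, 4.9, -8.2, -12.0]

def summarize(errors_mm, half_window_mm):
    """Mean, sample stdev, and count of shots outside the sharp zone."""
    misses = sum(1 for e in errors_mm if abs(e) > half_window_mm)
    return statistics.mean(errors_mm), statistics.stdev(errors_mm), misses

mean, sd, misses = summarize(nikkor, 7.5)
print(f"Nikkor: mean={mean:.2f} mm, stdev={sd:.2f} mm, misses={misses}/19")
# -> mean=-5.68, stdev=3.55, misses=6/19 (about a third)

mean, sd, misses = summarize(sigma, 22.5)
print(f"Sigma:  mean={mean:.2f} mm, stdev={sd:.2f} mm, misses={misses}/15")
# -> mean=-2.33, stdev=9.80, misses=0/15
```

The miss counts confirm the claims in the text: roughly one shot in three escapes the Nikkor's 15 mm window, while nothing escapes the Sigma's 45 mm window.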
Typical focus error plot, showing it needs more “+” focus fine-tune to drive it toward the vertical blue marker.

Sample results for Sigma 150-600 mm f/5.6 Test

The Sigma 150-600 mm at 300 mm test follows. In contrast to the Nikkor, this lens has focus fine-tune calibration throughout its zoom and focusing range (16 calibration fine-tune settings). It makes all the difference.

Measurement errors per photograph (mm): -0.7, -9.8, -14.5, -16.6, 1.1, 14.5, 0.6, -0.19, 9.4, 4.9, -15.3, 7.0, 4.9, -8.2, -12.0

N = 15, MEAN = -2.3 mm, STDEV = 9.8 mm

Given a “sharp zone” of about 45 mm (plus or minus 22.5 mm), this test NEVER missed the focus zone. The standard deviation of 9.8 mm is a measure of the typical magnitude of each focus “miss”. Given the focal length and target distance, this is awesome: a typical error of 1 part in 431, or (4220 mm / 9.8 mm). Bravo, Sigma.

Conclusions

If you really, really want to know how your lenses and cameras perform, these tests are representative of how you would do it. Being an engineer myself, I never fail to be impressed at how far photographic technology has come. Twenty years ago, you couldn’t get this level of camera/lens performance at any price. Thanks again, Frans, for your incredible MTF Mapper program. #howto
- “Safe” Storage of Camera Gear
Many people have purchased a safe to store their camera gear and protect it from theft and fire. This might not be the best idea. Let me explain.

Safes (mostly advertised as “gun safes”) are usually advertised as being fireproof, typically able to withstand a fire for about a half-hour. How is this achieved? Typically in one of two ways: via the gypsum in drywall material, or through clay-like materials.

Typical fire-rated gun safe with electronic lock

Typical gun storage cabinet with key lock

This fireproof insulation material makes up the bulk of the wall thickness of a safe. I bet you thought those safe walls were pure steel. Nope. If that were true, a safe would probably weigh triple what it actually does.

So why should you care how a safe is made fireproof? The key here is moisture. These fire-proofing materials contain water molecules, and that can create a high-humidity environment inside the safe. High humidity is not a friendly environment for cameras or lenses. The way these materials work is to release their water as steam during a fire, which keeps the safe’s flammable contents from igniting. A “steamed camera” is probably a dead camera, so the fire-proofing won’t really help you anyway. Some, but not all, fire-proofing materials keep the inside of the safe at high humidity even when there is no fire. The worst offenders here are “document safes”. I used to use one of these myself, and noticed that my camera LCD screen would always fog up when I started using my camera. Not good.

Another concern with fire-proof safes is formaldehyde: the drywall insulation material (often from China) might contain it. It would be great if you could buy a safe that used Space Shuttle tiles for insulation, but I doubt you could afford it.

So, how do you keep your gear secure without ruining it with high humidity? One solution is a lockable steel storage cabinet. It won’t have insulation that can pose a humidity problem.
Steel cabinets can be secured to the floor or a wall via lag screws, etc. Cabinets made for gun storage typically use the heaviest-gauge steel, rivaling the steel thickness found in modestly-priced safes, and that steel is typically hardened as well. What about storage capacity? Omitting the insulation gains you about 4 inches in every interior dimension, and you will find over time that you ALWAYS need more space.

Professional thieves probably have grinding tools that can penetrate most safes without much more difficulty than a steel cabinet. It’s probably more important to protect your gear from humidity and resign yourself to protection against mere amateur thieves. Fire protection? Forget it. #review
- Test Your Secure Digital Card: Lame and Lamer
Why lame and lamer? I can guarantee that the specifications of your secure digital card are bogus. As I'm fond of saying, they're blowing smoke you know where. You'll find that the read speed isn't as fast as the card manufacturer says (lame), and the write speed will typically be even slower than the read speed (lamer).

When a manufacturer like Nikon tells you how many frames fit in your camera “buffer”, they assume you have the very fastest SD card available. If you don’t have a fast card, your effective buffer size is much smaller. What this really means is that Nikon doesn’t have as big a buffer as they claim; they’re depending upon your SD card to write out the pictures while you’re shooting a sequence.

I'll show you how you can test the card for yourself, outside of your camera, to see how fast it really is. I’m going to demonstrate results from a Windows environment; you can download other free software to perform similar tests in Apple and Linux environments as well. The Windows program "h2testw" can measure both the read and write speeds of your secure digital card.

If your computer cannot take advantage of UHS-II hardware, you can't accurately assess those newer-generation cards for speed. Newer UHS-II cards have an additional row of electrical contacts, allowing for parallel data transmission, which is how they’re able to get so much more speed out of them. Cameras that only have UHS-I capability can still use UHS-II cards, but they ignore the extra row of contacts and run the cards at reduced speed. As a result, you won’t ever get beyond “UHS-I” speed using a “UHS-II” card. If you use a card reader and your computer USB port is USB 3.0, then you may still be able to measure the card accurately, assuming your card reader has the electrical capability of the UHS-II specification. Older computers may in fact be too slow themselves to give you accurate information, so be forewarned.

Many off-brand SD card manufacturers lie even more than the big guys.
The speeds off-brand manufacturers claim aren’t even close to name-brand speed ratings, let alone reality. Buyer beware.

Be advised that the h2testw.exe program will destroy any pictures on the card, so save existing pictures elsewhere before running it. You should also format the SD card once you place it back into your camera after testing.

h2testw.exe user interface

In-progress program screen

SanDisk Extreme Pro 95MB/s test results

SanDisk Extreme Plus 90MB/s test results

Sample computer "SD card slot" tests:

SanDisk Extreme Plus 32GB 90MB/s UHS-I card. Actual read speed: 69.6 MB/s. Actual write speed: 55.2 MB/s.

SanDisk Extreme Pro 32GB 95MB/s UHS-I card. Actual read speed: 66.5 MB/s. Actual write speed: 67.5 MB/s.

Nikon D7100 results (from cameramemoryspeed.com): SanDisk Extreme Pro 32GB 95MB/s UHS-I card = 69.8 MB/s

Conclusions

The computer results and camera results are comparable (within about 3% for write speed). Compared to the “95MB/s” advertised for the Pro version, however, the measured speeds are quite different. Be aware that the newest SD cards (with write speeds approaching 300MB/s) are “UHS-II” or beyond; you’ll have to upgrade to a newer camera (like the D500) if you expect to take advantage of such speeds. #howto
- Photo Noise Reduction: Nik Dfine 2.0
Many photo-editing programs include noise reduction, but it’s typically crude. If you’re the type of person who wants the nth degree of control over this process, you might consider using Nik Dfine 2.0.

Nik Dfine is a “plug-in”, which means it runs inside another program. Many programs, such as Photoshop, Lightroom, Aperture, and Zoner Photo Studio, can use plug-ins. If you use Nik Dfine, you get the same user experience inside any of those programs that can run it. Nik Dfine comes from Google, and when they decided to discontinue it, they made it free, so you can’t beat the price! Google also discontinued the other Nik plug-ins, so they’re all free now. You may need to consult Google for specific procedures on how to install the Nik plug-ins for your particular program. You can still get the plug-ins here:

Why would you want noise reduction in the first place? Two of the biggest reasons that come to mind are small-sensor cameras and dim-light pictures where you were forced to really crank up the ISO. Those color speckles, especially in deep shadows, can look terrible.

You should know that the order in which you process your pictures is important. You want to handle noise reduction first, before any other photo manipulation, and sharpening last; in between these two editing operations, order isn’t too important. If you’re working with “raw” format, don’t apply any sharpening or noise reduction before using Dfine on the image. I noticed that the Dfine user guide says it supports only TIFF format, but I used it in Zoner Photo Studio with Nikon NEF format pictures, so you don’t have to worry about that constraint.

A little noise reduction goes a long way, so you don’t want to overdo it. If you don’t heed this advice, you’ll probably end up with pictures that have a lot of mush instead of fine details. Nik Dfine is extremely smart about how and where it removes noise.
It can aggressively attack featureless areas and barely touch areas with fine detail. You can go as manual as you wish to “take control”, or you can let Dfine do its magic automatically. Personally, I love the automation and the end results.

Many photographers find that they hate color noise but don’t mind luminance noise. I (mostly) fall into that category. I don’t mind the gritty or sandy effect luminance noise can have, and leaving luminance noise alone can result in an overall sharper-looking photo. If you want to remove both types of noise, however, Dfine can deliver. The Dfine developers were very smart about leaving the fine details alone while smoothing the out-of-focus areas.

Dfine is essentially a two-step process. The first step is “Measure”, where it analyzes the photo and decides what to do and where to do it. The second step, “Reduce”, acts on the noise.

Before working on noise reduction, you will probably want to set up some preferences. I prefer the “single” versus a “split” or “side-by-side” view of the photo, and I toggle the “Preview” checkbox to see the “before” and “after” effect on the whole photo. I also prefer the default “RGB Mode”, versus modes such as “Chrominance Only” or “Luminance Only” (which switch to black-and-white). I like to keep the “Loupe” enabled, so that I can selectively look at the pixel level wherever the mouse pointer is. You can also lock its view into a position of your choosing by selecting its “pin” icon.

Some notes on setting up the Dfine functionality

“Measure” showing Automatic mode

Again, click on “Measure” to let Dfine analyze the photo and determine its game plan for reducing noise, prior to clicking “Reduce”.

“Reduce” showing the “Color Ranges” Method

After clicking “Measure”, the interface will change to allow you to fine-tune settings. I typically increase the “Edge Preservation” (under the “More” drop-down) to save really fine details, such as fur or feathers.
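As a conceptual aside, the chrominance/luminance split described above is easy to illustrate in code. The sketch below is NOT Dfine's algorithm, just a minimal pure-Python illustration of chroma-only noise reduction: convert to YCbCr with the standard JPEG-style coefficients, box-blur only the Cb/Cr (color) channels, and pass the Y (luminance) channel through untouched, so the grit survives but color speckles fade:

```python
def rgb_to_ycbcr(r, g, b):
    """JPEG-style RGB -> YCbCr (one luminance + two chroma channels)."""
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 128
    cr =  0.5 * r - 0.418688 * g - 0.081312 * b + 128
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    return r, g, b

def chroma_denoise(img):
    """img: 2-D list of (r, g, b) tuples.

    3x3 box-blur of Cb/Cr only; Y passes through untouched, so
    luminance detail ('grit') is preserved while color speckles fade.
    """
    h, w = len(img), len(img[0])
    ycc = [[rgb_to_ycbcr(*px) for px in row] for row in img]
    out = []
    for i in range(h):
        row = []
        for j in range(w):
            nbrs = [ycc[a][b]
                    for a in range(max(0, i - 1), min(h, i + 2))
                    for b in range(max(0, j - 1), min(w, j + 2))]
            cb = sum(p[1] for p in nbrs) / len(nbrs)
            cr = sum(p[2] for p in nbrs) / len(nbrs)
            row.append(ycbcr_to_rgb(ycc[i][j][0], cb, cr))
        out.append(row)
    return out
```

On a flat gray patch with a single magenta speckle, the speckle's color pulls toward neutral while every pixel keeps its original luminance, which is exactly the "remove color noise, keep luminance noise" trade-off discussed above. Real tools like Dfine are far more sophisticated about edges and detail, but the principle is the same.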
For global changes based upon color, select the “Color Ranges” Method. If you wish to work on areas of the picture, you can select “Control Points” instead, and add as many as you need. People familiar with Nikon Capture NX2 know all about control points, since Nik wrote that program, too. When you’re happy with your noise reduction setup, click on “Reduce”.

Example noise reduction at pixel-level zoom

Typical noise (in shadows)

Noise reduced without detail loss in fur

As camera sensors get better, noise reduction is needed less and less. But when you need to fix a noisy image, Dfine is a great tool to have in your arsenal. #howto











