Search Results
- D610 VS. D7100 VS. D7000 Infrared Comparisons
I happened to be testing an old Nikkor 20mm f/4 (AI-converted). I thought I'd try some infrared shots, since this lens is supposed to be excellent for shooting IR. I use the Hoya R72 IR filter, with the 52mm thread diameter. This 20mm lens is about the smallest and lightest FX lens Nikon ever made, at just 7.4 ounces. The D7000, D7100, and D610 allow aperture-priority auto-exposure after defining the "non-CPU lens data" for this lens. I absolutely love its field of view (94 degrees) on the D610. You'd never miss auto-focus using a lens like this, since everything is typically in focus all the time. You still get the 3-stage focus indicator inside the viewfinder while manually focusing. Still, 20mm on a DX camera isn't that wide; my Tokina 11-16mm f/2.8 has no reason to feel threatened here.

The 20mm, of course, has the little red dot on the focus scale for infrared focus compensation. Back in the day, Nikon really paid attention to stuff like that. Manual-focus lenses are actually superior to auto-focus lenses for shooting infrared with a filter like the Hoya R72, since you can't see through the viewfinder. You frame and focus (and use the lens focus scale red-dot IR shift) before attaching the filter. Those of you who have gone through the pain of framing/pre-focusing a 'G' auto-focus lens and then mounting an IR filter know what I'm talking about.

After all of those digressions, back to the subject at hand: IR shooting comparisons. The D7000 IR results indeed look excellent. Absolutely nothing to complain about here, aside from the gripe about the narrower DX field of view. I was shooting at f/11.0, 15 seconds, ISO 250. (The "Sunny 16" rule would have been 1/500 at f/11, ISO 250. IR needed nearly 13 stops more light; a quick check of that arithmetic appears below.)

Nikon D7000 using Hoya R72 with Nikkor 20mm f/4 AI-converted lens. Excellent.

Now, for the D610 infrared results. How to describe what I got? Epic failure comes to mind. Totally unusable. It appears that the light baffling and anti-reflection coatings inside the D610 act more like a mirror in the infrared spectrum. In comparison, this 20mm lens is wonderful for regular-light photography on the D610, especially for landscapes.

Nikon D610, Hoya R72, Nikkor 20mm f/4 AI-converted lens. Gross.

Next, I head for my D7100. Terrible. Exact same light baffling and anti-reflection coating problem in infrared.

Nikon D7100, Hoya R72, Nikkor 20mm f/4 AI-converted lens. Gag me.

Note the terrible horizontal glare across the entire frame for both the D610 and D7100. The Nikon D7100 misbehaves in a nearly identical way to the D610 when shooting infrared. My guess is that the camera internal baffling and anti-reflection coatings actually reflect instead of absorb infrared wavelengths.

Ahh. I bet it's something wrong with the 20mm lens, you say. I bet the problem goes away with a different lens, you say. There's no way the D610 and D7100 could let me down this badly, you say. How could the D7000 possibly be superior to the D610 and D7100 in any way, you say.

I tried using the 50mm f/1.8 AF-D with the Hoya R72 IR filter on the D610, since I'm apparently a glutton for punishment. Big nasty hot spot in the center of the picture, in addition to the terrible horizontal banding flare. Having used this lens in the past for infrared, I know it's not the lens' fault. I have to conclude that the D610 is useless for infrared photography. I didn't have the heart to try this same lens on the D7100; I know the results would be the same.
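As a quick sanity check of the exposure gap mentioned above (f/11, 15 seconds, ISO 250 through the R72 versus a "Sunny 16" baseline of 1/500 at f/11, ISO 250), here is a minimal Python sketch of the arithmetic; the shutter speeds are the ones quoted in the text.

```python
# Quick check of the IR exposure gap: f/11, 15 s, ISO 250 with the Hoya R72
# versus the "Sunny 16" baseline of f/11, 1/500 s, ISO 250.
import math

baseline_shutter = 1.0 / 500   # seconds (Sunny 16 equivalent at f/11, ISO 250)
ir_shutter = 15.0              # seconds (actual R72 exposure at f/11, ISO 250)

extra_stops = math.log2(ir_shutter / baseline_shutter)
print(f"Extra light needed for IR: {extra_stops:.1f} stops")   # ~12.9 stops
```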
Now, the secret sauce to making the D7100 and D610 succeed with IR photography: you absolutely need to cover the viewfinder eyepiece with the little "DK-5" eyepiece blocker. Unlike the D7000, D60, D50, and D500 cameras I have tested, the light baffling in the D7100 and D610 seems to be inferior. You can clip the DK-5 onto your camera strap so you don't lose it, and you don't even need to take it off of the strap to slip it over your viewfinder!

50mm f/1.8 AF-D on the D610, Hoya R72 filter. Still gross and unacceptable.

The Nikon D7000, like all modern digital cameras that haven't been converted to infrared, is very insensitive to infrared. The filter on top of the image sensor screens out almost all of the infrared wavelengths. Older camera sensor filters (like the D50 and D60) were much better at passing IR (a few stops better, at least). Aside from long exposure times, the D7000 provides top-notch IR results. Just keep in mind that the Bayer sensor only has a quarter of the photo sites sensitive to red, so your effective pixel count is essentially divided by 4 as well.

Nikkor 20mm f/4.0 AI-converted, on a D610. It's only a little bigger than a body cap.

Moral of the story: don't ditch that D7000 if you do infrared photography. For the D610 and D7100 (and probably the D7200), always use the little DK-5 viewfinder eyepiece blocker.

By the way, I typically use Nikon Capture NX2 to convert my 'deep red' RAW shots into the samples you see above. I can't get the D7000 to succeed at measuring a scene to get "preset manual" white balance with infrared, although the preset measurement works for a D50 and D60. I use the Capture NX2 "Camera Settings", "White Balance", "Set Gray Point", "Marquee Sample", "Start", then rectangle-mouse-select the whole picture, then click inside the selection. This is a quick and easy way to get the picture really close to what the in-camera "preset manual" white balance procedure achieves for regular photography. Once the editing steps are entered, just save those steps as a batch process. The batch process can be run on a whole folder of IR shots, to quickly get everything converted. #review
- Nikkor 20mm f/4.0 AI Review
This article is an evaluation of the old manual-focus 20mm f/4.0 lens that has been AI-converted. The conversion makes it possible to get automatic exposure on the better Nikon digital cameras, including the D610 with an FX sensor. The 20mm f/4 was manufactured from 1974 through 1978, back when Nikon was the big man on campus in photography. Mine was purchased in 1975, and virtually never came off of my Nikon F2 when I was back-packing (unless I did some macro shots). Small, light, sharp, tough, elegant, and wide; almost exactly like the ideal woman, except perhaps for the 'wide' part.

This 20mm lens is one of the smallest and lightest FX lenses Nikon ever made. At 7.4 ounces, you hardly notice it's there. It's about 1.4 inches long, like a thick body cap. Cameras like the D7000 series and D610 allow aperture-priority auto-exposure after defining the "non-CPU lens data" for this lens. I absolutely love its field of view (94 degrees) on the D610. The lack of auto-focus using a lens like this isn't a hardship, since everything is typically in focus all the time. You still get the 3-stage focus indicator inside the viewfinder while manually focusing with the better Nikons.

The 20mm, of course, has the little red dot on the focus scale for infrared focus compensation. Back in the day, Nikon really paid attention to stuff like that. One of my main uses for this lens is infrared, but unfortunately cameras like the D7100 and D610 are essentially useless for infrared (see this article: ). My D7000, however, works perfectly for infrared with this lens (using the Hoya R72 52mm filter). Manual-focus lenses are actually superior to auto-focus lenses for shooting infrared with a filter like the Hoya R72, since you can't see through the viewfinder. You frame and focus (and use the lens focus scale red-dot IR shift) before attaching the filter. Those of you who have gone through the pain of framing/pre-focusing a 'G' auto-focus lens and then mounting an IR filter know what I'm talking about.

Focusing is still silky-smooth, unchanged since the day it was manufactured. The focus scale is a thing of beauty. I have every reason to believe that this lens will last not just a lifetime, but multiple lifetimes. Although it works without vignetting, be careful using a polarizer on this lens; the sky will look too un-even because of the wide field of view.

20mm f/4 Nikkor AI-converted on the D610. Sweet.

Resolution Tests

I test lenses using the MTF Mapper software and the recommended resolution charts (printed to A0 size and dry-mounted). The article here explains the software and its use. As you'll see, you are going to want to stop down to f/8 or more to get the corners you want. The center is already very good at f/5.6, and only gets better as you stop down. Avoid going beyond f/16, because of diffraction. All tests were done using the D610, with 24 MP (5.95 micron pixels). I only shoot un-sharpened RAW for the resolution tests. To convert the MTF50 lp/mm measurements into LP/PH, simply multiply readings by 24.0 (the D610 sensor is 24mm tall). To convert into LW/PH, take the LP/PH values and multiply by 2. (A tiny helper script for these conversions appears below.)

Sample Photos

Sun just out of frame. Palm fronds near frame edge are sharp, D610.

The "wavy" distortion is quite minimal, D610.

Infrared, Hoya R72, D7000.

Conclusion

If you're willing to stop this lens down to f/8 or f/11, the results are about as good as any ultra-wide lens made today. You won't find a more compact lens anywhere. I love that it uses 52mm filters, too. This has become my go-to lens for infrared photography.
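To make the LP/PH and LW/PH conversions mentioned above less error-prone, here is a minimal helper sketch. The 24 mm sensor height matches the D610 as stated in the article; the 50 lp/mm input is just a made-up example value.

```python
# Minimal helpers for the chart-reading conversions described above.
# Assumes the D610's 24 mm sensor height; adjust for other bodies.
SENSOR_HEIGHT_MM = 24.0  # Nikon D610 (FX) sensor height

def lpmm_to_lpph(mtf50_lpmm: float, sensor_height_mm: float = SENSOR_HEIGHT_MM) -> float:
    """Convert MTF50 in line pairs/mm to line pairs per picture height (LP/PH)."""
    return mtf50_lpmm * sensor_height_mm

def lpph_to_lwph(lpph: float) -> float:
    """Convert line pairs per picture height to line widths per picture height (LW/PH)."""
    return lpph * 2.0

# Example with a hypothetical 50 lp/mm reading:
print(lpmm_to_lpph(50.0))                   # 1200 LP/PH
print(lpph_to_lwph(lpmm_to_lpph(50.0)))     # 2400 LW/PH
```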
My D610 and D7100, by the way, are basically useless for IR unless I put the DK-5 eyepiece cap over the viewfinder. My other cameras are fine without the cap, unless I switch to my 850nm IR filter, which requires exposures of 2 or 3 minutes; all cameras need an eyepiece cap when exposures get that long. I understand that this lens is a bit rare these days; I have no intentions of ever selling mine. #review
- Measure Axial Chromatic Aberration: MTF Mapper Part Deux
This article will show you how to use the mtf_mapper_gui.exe program to measure axial (longitudinal) chromatic aberrations. Longitudinal chromatic aberration (LoCA) is the optical problem of focusing different colors of light at different distances along the optical axis. My other article about MTF Mapper is located here. The other article is about MTF50 resolution measurement and focus calibration. Be aware that you need to use separate charts to make separate measurements (lens MTF50 resolution, focus calibration, or LoCA analysis). Also be aware that measurements depend upon the color of light used while photographing the targets (I used outdoor lighting with a clear sky in the sun).

LoCA differs from lateral (transverse) chromatic aberration, which instead spreads out different colors perpendicular to the lens optical axis. Lateral chromatic aberration typically shows up as purple corners in the photograph; you won't see it in the center of the image. Lateral chromatic aberration is simple to fix; most modern cameras can even automatically fix it in-camera. Longitudinal chromatic aberration is more difficult to correct, although programs such as Nikon Capture NX2 can largely mask its effects.

Axial (longitudinal) aberration diagram, courtesy of Wikipedia.org.

LoCA can rob a lens of resolution, since some colors will be in focus and some colors will be out of focus. You can recover the resolution by stopping the lens down until all visible wavelengths are in focus, but artistically this is often a poor option. A common visual effect of LoCA is seeing magenta instead of white in specular reflections on the eye. If you're interested in evaluating how much LoCA a lens has, then the MTF Mapper program can help you measure it, and the program is free at the time of this writing.

The MTF Mapper program author is Frans van den Bergh. His software and printable test charts are available here. Frans comes across as a 'brainiac', and his documents can take your breath away. They're worth a read, though, even if you can't grok 100% of their contents. More of his writings about image analysis topics can be found here. If you like his stuff as much as I do, please let him know! I'll bet Frans has spent a gazillion hours working on this software, and he deserves all the praise we can give him.

I'm using version 0.5.7 of mtf_mapper_gui.exe for these tests. You need to print the proper chart and use the correct program preferences to get the desired measurements. The chart I used is called "mfperspective_a3.pdf" (printed on A3 paper), which is also available at the same site where you download the program. You're supposed to rotate the chart 45 degrees about the vertical, with the taller vertical targets farther from the camera than the shorter ones.

Mfperspective_a3.pdf chart photo. Chart rotated 45 degrees.

The MTF Mapper program takes advantage of the fact that camera sensors are separated into red, green, and blue-sensitive pixels. The program can independently analyze single-color pixels or all of them together. By the way, half of all of the camera sensor pixels are sensitive to green, while 25% are sensitive to red and the last 25% are sensitive to blue. It turns out that human eyes are most sensitive to green, so having 50% of the pixels being sensitive to green makes pictures look 'correct'. This kind of sensor design is called "Bayer", named after the inventor Bryce Bayer, who used to work for Kodak. Different combinations of the R,G,B pixels can recreate all of the colors we see.
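To make the per-channel idea concrete, here is a tiny numpy sketch of what slicing a Bayer mosaic into its color channels looks like in principle. The RGGB layout is an assumption (the pattern offset varies by camera), and this is only an illustration of the concept, not MTF Mapper's actual code.

```python
import numpy as np

def split_bayer_rggb(mosaic: np.ndarray):
    """Split a raw Bayer mosaic (assumed RGGB layout) into color-channel sub-images.

    Each sub-image is quarter resolution: R and B each use 25% of the photosites,
    and green gets two sub-images (50% of the photosites), as described above.
    """
    r  = mosaic[0::2, 0::2]
    g1 = mosaic[0::2, 1::2]
    g2 = mosaic[1::2, 0::2]
    b  = mosaic[1::2, 1::2]
    return r, (g1, g2), b

# Example with a fake 4x4 mosaic; a real mosaic would come from a raw decoder such as dcraw.
fake = np.arange(16).reshape(4, 4)
r, greens, b = split_bayer_rggb(fake)
print(r.shape, greens[0].shape, b.shape)   # (2, 2) (2, 2) (2, 2)
```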
MTF Mapper uses a program called "dcraw" (included in the download) that knows most camera "raw" formats, and is regularly updated for new camera models.

Axial R,G,B Focus Measurement

In the following test, I'm using the Sigma 150-600mm zoom. Big lenses typically have the most trouble with axial chromatic aberration, and they also tend to exaggerate any focus errors. Take a photo of the chart, aligning the camera focus sensor on the middle of the chart. Don't worry if the focus isn't perfect; what counts is the difference between the red, green, and blue color channels in the same photo. A perfect lens would focus all three colors at exactly the same distance. The largest aberration errors will be seen with the lens aperture wide open. You'll likely get different measurement results as you change focal lengths, too. Set your camera for RAW mode, so that there are no in-camera compensations for sharpening or chromatic aberrations. Remember to rotate the chart so that it's at about 45 degrees from the sensor, with the tall side of the chart target images farther away than the short images.

MTF Mapper Settings

Preferences to measure axial aberrations.

After getting a photo of the test chart, the program needs to be configured for the color to be analyzed and for the camera pixel dimensions. The Nikon D610 has pixels of 5.95 microns, but the D7100 has 3.92 micron pixels, for instance. The program needs to be run against the same picture three times, each time selecting a different Bayer color channel.

Blue Bayer channel focus results show focus is 14mm in front of the chart center.

Chart close-up (green). The orange arrows are at the chart center.

Green Bayer channel shows focus is 5mm beyond the chart center.

Red Bayer channel shows focus is 4mm beyond the chart center.

The results above indicate that the red and green colors are focused at almost exactly the same distance, but the blue channel is focused nearly 19mm closer than the green and red channels (at a focus distance of about 6 meters). The test results also show the MTF50 resolution, measured in cycles per pixel. The green channel is sharpest at 0.211 c/p (peak sharpness). The red and blue channels both measure about 0.197 c/p. This MTF50 measurement is only valid for the peak focus location; you should be analyzing resolution using the resolution chart (see the other MTF Mapper Cliffs Notes article).

Evaluating the MTF50 Math

MTF50 (lp/mm) = (cycles/pixel) x (pixels tall) / (sensor height in mm)
Line pairs per picture height (lp/ph) = MTF50 (lp/mm) x sensor height (mm)

D7100: 3.92 micron pixels, 4000 x 6000 pixel sensor (24 MP), 15.6mm x 23.5mm sensor
Green MTF50 = 0.211 c/p = 0.211 x 4000 / 15.6 = 54 lp/mm, or 844 lp/ph
Blue MTF50 = 0.197 c/p = 0.197 x 4000 / 15.6 = 50.5 lp/mm, or 788 lp/ph
Red MTF50 = 0.197 c/p = 0.197 x 4000 / 15.6 = 50.5 lp/mm, or 788 lp/ph

(A short script that reproduces these numbers follows at the end of this article.)

Conclusion

The MTF Mapper program is a great way to evaluate axial chromatic aberrations. This optical aberration is typically a bit mysterious to get a handle on, but now there's an easy way to characterize it. When you can analyze by the numbers, then you can actually compare it against other lenses in a meaningful way. And you can't beat the price. #howto
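As referenced above, here is a minimal Python sketch that re-derives the worked D7100 numbers from the cycles/pixel readings; the sensor dimensions are the ones quoted in the article.

```python
# Re-derive the worked example above: convert MTF Mapper's cycles/pixel readings
# into lp/mm and lp/ph for the D7100 (4000 pixels tall, 15.6 mm sensor height).
PIXELS_TALL = 4000
SENSOR_HEIGHT_MM = 15.6

def cpp_to_lpmm(cycles_per_pixel: float) -> float:
    """MTF50 in line pairs/mm = (cycles/pixel) * (pixels tall) / (sensor height in mm)."""
    return cycles_per_pixel * PIXELS_TALL / SENSOR_HEIGHT_MM

def cpp_to_lpph(cycles_per_pixel: float) -> float:
    """MTF50 in line pairs per picture height = lp/mm * sensor height in mm."""
    return cpp_to_lpmm(cycles_per_pixel) * SENSOR_HEIGHT_MM

for name, cpp in [("green", 0.211), ("red", 0.197), ("blue", 0.197)]:
    print(f"{name}: {cpp_to_lpmm(cpp):.1f} lp/mm, {cpp_to_lpph(cpp):.0f} lp/ph")
# green: 54.1 lp/mm, 844 lp/ph; red and blue: 50.5 lp/mm, 788 lp/ph
```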
- Sigma 150-600mm Contemporary Lens Firmware Updates
Sigma sells a USB dock that lets you update and customize their lens firmware. The (free) program used with the dock is called Sigma Optimization Pro. I have an article on it here: . Sigma has been providing firmware updates for my 150-600mm Contemporary (and also for the Sports version). The first update (1.01) improves the auto-focus speed (they claim up to 50%). The second update (1.02) fixes focus issues with the Nikon D500 used with a teleconverter.

I bought the USB dock to enable in-lens focus fine-tune. This style of focus fine-tune goes way beyond what any other lens manufacturer offers; it lets you fine-tune at 4 focal lengths and 4 distances per focal length, giving you a total of 16 fine-tune settings. This feature totally transformed my lens resolution from mediocre to stellar. Nikon's (and Canon's) focus calibration only lets you perform a simple global focus shift; that just doesn't cut it. It is handy, though, when you mount the Sigma on another camera body, where the camera's focus fine-tune gets applied in addition to the Sigma in-lens focus calibration.

My focus calibration fine-tune settings.

Auto-focus Firmware Changes

I tried their auto-focus customization options (fast "Fast AF Priority", medium "Standard AF", or precise (slow)) when I first got the lens, but found the 'fast' algorithm wasn't very accurate. I settled on the default "Standard AF" auto-focus speed, since I'm not willing to sacrifice resolution for speed. I didn't notice any precision improvement trying their "precise" setting.

It took me a long time, but I eventually got around to testing the new focus algorithms that were provided with the 1.01 version of the software. I noticed when I first loaded the new firmware that the default speed (Standard AF) was more responsive than it used to be, and I have been happy shooting with that setting. It never occurred to me to re-try the "fast" auto-focus setting; big mistake. It's great. Accuracy is now essentially the same as the medium "Standard AF" setting, and it is simply faster. It's like I just got a new lens, but for free.

C1 switch settings: focus speed, focus limits, viewfinder stabilization 'effect'.

C2 switch settings: focus speed, focus limits, viewfinder stabilization 'effect'.

Accuracy Comparison: High-Speed Auto-focus vs. Standard Auto-focus

I used the MTF Mapper software to evaluate auto-focus accuracy. This test is a bit 'statistical' in nature, because it's based upon a moving target. I lean on a tripod while hand-holding the lens at 600mm. This technique lets me get more reliable framing of my resolution target, but the lens is still "wiggling" quite a bit. I use back-button auto-focus and "AF-C" continuous focus. I also re-focus by pointing away from and then back onto the resolution target while AF is active. Note that the following MTF results aren't as good as a firmly-mounted lens on a tripod with a remote shutter release. With a long lens, even 1/2000 will have a tiny amount of motion blur when hand-holding.

1) MTF50 maximum results with "Standard" AF speed, 600mm, 1/2000s, f/6.3, hand-held: 28, 28, 26, 28, 28, 26, 30, 32, 26, 24, 24, 30, 28, 28, 26, 26, 26, 28, 30. Average MTF50 maximum: 27.47 lp/mm

2) MTF50 maximum results with "Fast AF Priority" speed, 600mm, f/6.3, hand-held: 28, 30, 23, 24, 28, 22, 28, 26, 32. Average MTF50 maximum: 26.78 lp/mm

Conclusion: There is no real focus accuracy difference using the "fast" auto-focus algorithm versus the "medium" auto-focus algorithm with the 1.01 or 1.02 firmware.
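For the curious, a quick way to reproduce the two averages above (and see the shot-to-shot spread that makes the small difference insignificant) is a few lines of Python; the numbers are the MTF50 samples quoted in the text.

```python
# The two MTF50 runs quoted above, averaged and compared. The spread shows why
# a fraction-of-an-lp/mm difference between the averages isn't meaningful.
from statistics import mean, stdev

standard_af = [28, 28, 26, 28, 28, 26, 30, 32, 26, 24, 24, 30, 28, 28, 26, 26, 26, 28, 30]
fast_af     = [28, 30, 23, 24, 28, 22, 28, 26, 32]

for label, run in [("Standard AF", standard_af), ("Fast AF Priority", fast_af)]:
    print(f"{label}: mean {mean(run):.2f} lp/mm, std dev {stdev(run):.2f} lp/mm, n={len(run)}")
# Standard AF: mean 27.47 lp/mm; Fast AF Priority: mean 26.78 lp/mm
```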
So why wouldn't you just leave it on "fast"? Sigma didn't advertise that their accuracy got better with the new firmware, but I'm seeing a definite improvement in both speed and accuracy.

Vibration Reduction "Optical Stabilization" Firmware Changes?

The good news doesn't end there. When I first got the lens, I experimented with using their vibration reduction (they call it "optical stabilization" or OS). They provide the usual 'OS1' for general hand-held use, and 'OS2' for panning use. I tested the lens using OS1 and shutter speeds beyond the normal VR upper limit of 1/500, and found that resolution was reduced by about 9% at 1/1000. As a result, I would turn off VR (OS) at high shutter speeds, just as I was taught to do for all lenses with VR.

Using the new firmware (1.02), I'm not noticing any measurable degradation in resolution at high shutter speeds (all the way up to 1/8000)! This is just fantastic. I've always hated having to remember to turn VR on and off to accommodate my shutter speed changes. Now, I can just leave VR on and forget it. I'm going to have to re-test my other lenses to see if they really require me to turn VR off at higher shutter speeds or not. The moral of the story is don't blindly believe the urban legend about always turning VR off at high shutter speeds. Test it first!

VR Testing at High Shutter Speed Sample

These tests were performed at 600mm f/6.3 using the "fast" auto-focus setting, hand-held, AF-C "back-button" focus. Shutter speed was 1/2000 throughout. Sigma lens firmware version 1.02. Again, these MTF50 numbers are lower than when using a tripod with a remote shutter release; even high shutter speeds with a big lens aren't as effective as a tripod for static subjects.

OS1 Active MTF50 maximum: 28, 30, 23, 24, 28, 22, 28, 26, 32. MTF50 average: 26.78 lp/mm

OS1 OFF MTF50 maximum: 28, 26, 28, 28, 30, 22, 26, 24, 22. MTF50 average: 26.0 lp/mm

Conclusion: There is essentially no difference with stabilization active or not at this high shutter speed. I tried tests such as these all the way to 1/8000 shutter, without significant changes to MTF50 resolution when leaving vibration reduction active.

Isn't it great that Sigma comes out with these firmware updates? If only Nikon and Canon could catch up to these guys in making smarter lenses. #review
- Sigma 150-600 Contemporary OS Anti-Vibration Algorithm Comparison
This article analyzes the anti-vibration (OS) algorithms available using firmware version 1.02 for the Sigma 150-600mm Contemporary lens. Sigma hasn't published any information on the relative effectiveness of the three available OS algorithms, so I took it upon myself to see if there is any difference. The three available OS algorithms are called "Dynamic View Mode", "Standard", and "Moderate View Mode". The default setting, if you don't program any customization, is "Standard". Each of these modes is available regardless of selecting "OS1" or "OS2". The "OS1" is the normal hand-held mode, while the "OS2" mode is used for horizontal panning, such as while mounted on a tripod. All tests reported here are using "OS1". Sigma "OS" is the same as Nikon "VR".

You must use Sigma's "USB Dock" with their Optimization Pro software to program any customization into the lens. The dock also allows programming focus fine-tune adjustments (16 settings for 4 focal lengths and 4 distances) and focus limiter modifications.

To perform the tests, I used a tripod to rest my fist on, and then I rested the lens on my fist. This arrangement afforded me some level of aiming control, while still letting the lens "wiggle around" to simulate hand-held. I shot about 10 frames of my "A0" size resolution chart at 55 feet for each lens switch setting. The resolution chart images were analyzed using the MTF Mapper program from Frans van den Bergh (see this link). The resolution measurements require that the chart images be pretty level and perpendicular to the lens axis, which is why I didn't simply try hand-holding the lens while shooting the chart. I shot each resolution chart image at 400mm and f/6.3, and I used back-button focus with "AF-C" continuous auto-focus. These are typical shooting conditions for me, which is why I chose them for the testing. Beyond 400mm, accurate aiming just gets too difficult for reliable/repeatable testing measurements.

My lens was programmed with the "C1" switch setting having "Fast AF Priority" focus speed and "Dynamic View Mode" for the OS setting. I have already made tests that show this "fast" focus mode is essentially as accurate as the "Standard" (default) focus mode, at least when using firmware 1.02. The "C2" switch was set up with "Standard" AF focus speed and "Moderate View Mode" for the OS setting. If the customization switch is turned off, then you get "Standard" AF focus speed and "Standard" OS as well.

Tests such as these are 'statistical' in nature, since they involve taking measurements with a lens waving around. I have included some data below, to give you an idea of how the measurements vary. Yes, I could have made 1,000 measurements at each setting to raise the confidence level; I leave that as an exercise to the reader (although a rough confidence-interval sketch appears at the end of this article).

High Shutter Speed Tests

My previous testing has shown an insignificant difference in resolution when you leave OS active at higher shutter speeds (1/1000 and above). This statement is not valid for all lenses! Any OS algorithm is equally effective (or ineffective, if you wish) at high shutter speeds (you need firmware 1.01 or newer to get this result, however). I only saw a decrease of about 1.0 lp/mm MTF50 by leaving OS active above 1/500 shutter speed.

OS Setting Screen.

C1 Switch Settings That I'm Using Now.

Medium Shutter Speed Tests

The following tests were conducted using a shutter speed of 1/250 second. For 400mm on an APS-C sensor (600mm equivalent), I consider this "medium", and starting to get into the realm of needing anti-vibration.
Some people would benefit from stabilization at this speed, and some wouldn't.

Dynamic View OS, High-speed AF MTF50 measurements: 36, 40, 38, 30, 36, 40, 42, 42, 40, 36, 40. Average = 38.2 lp/mm

Moderate View OS, Normal (Standard) AF MTF50 measurements: 42, 40, 42, 40, 44, 40, 45, 38, 42, 42, 42, 42. Average = 41.5 lp/mm

Standard (default) OS, Normal (Standard) AF MTF50 measurements: 40, 45, 42, 38, 42, 42, 45, 42. Average = 42.0 lp/mm

OS Off, Normal (Standard) AF MTF50 measurements: 44, 42, 40, 42, 40, 34, 38, 42, 38. Average = 40.0 lp/mm

Results here don't show much difference with OS active or not. The "Standard" OS algorithm got the best results, but not enough to really matter.

Low Shutter Speed Tests

These tests used a shutter speed of 1/60, or a little more than 3 stops beyond the traditional hand-holding limit of 1/600 for an equivalent of 600mm (DX frame). This is roughly the rated effectiveness of OS for this lens.

Dynamic View OS, High-speed AF MTF50 measurements: 23, 26, 34, 30, 26, 30, 22, 28, 28, 28. Average = 27.5 lp/mm

Moderate View OS, Normal (Standard) AF MTF50 measurements: 30, 24, 28, 30, 30, 28, 30, 30, 30, 28. Average = 28.8 lp/mm

Standard (default) OS, Normal (Standard) AF MTF50 measurements: 24, 23, 24, 26, 32, 22, 24, 18, 28, 24. Average = 24.2 lp/mm

Results here show that the "Moderate View" is the winner. I didn't show the "OS Off" here, because the images were mostly blurred beyond recognition. Bear in mind that the MTF Mapper software is extremely picky, so the numbers here may lead you to believe that OS is not that helpful. Not true. The pictures are enormously helped by the OS system when you shoot at slower speeds, but there's no substitute for high shutter speeds.

Conclusion

It appears that "Moderate View" wins, although not by a huge margin. Sigma (rather cryptically) describes the effect of how each OS algorithm "looks" through the viewfinder. To me, what counts more is which algorithm provides the best anti-vibration effect in the final picture. It seems to me that there is a different end result in your pictures, depending upon which OS algorithm you pick. I prefer the "look" of Nikkor VR to Sigma OS when looking through the viewfinder, but both companies seem to provide roughly equivalent results in the final shot. Newer lenses invariably provide better stabilization, though. Sigma has the advantage of future OS algorithm improvements, however, available through a new firmware update.

I saw a definite improvement in the Sigma auto-focus system (speed and accuracy) after loading firmware versions 1.01 and 1.02. I also saw an improvement in the ability to not "mess up" the shot when forgetting to turn off anti-vibration at high shutter speeds. Don't be surprised if Sigma has more tricks up their firmware sleeves in the future. Recently, Tamron finally saw the light, and is now copying Sigma with the ability to reprogram lens firmware (in their new 150-600mm offering), with an essentially identical set of features as Sigma. Nikon et al. haven't yet seen the light. #review
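As mentioned above, tests like these are statistical, so it helps to know roughly how far a run's average can wander. Here is a minimal sketch that puts a rough confidence interval on one of the runs quoted above (the Dynamic View 1/60s data); treat it as an illustration of the idea, not a rigorous analysis.

```python
# Rough confidence interval for one of the runs above (Dynamic View OS at 1/60 s),
# to help judge whether two run averages differ by more than measurement noise.
from math import sqrt
from statistics import mean, stdev

run = [23, 26, 34, 30, 26, 30, 22, 28, 28, 28]   # MTF50 samples, lp/mm

m, s = mean(run), stdev(run)
se = s / sqrt(len(run))                          # standard error of the mean
print(f"mean {m:.1f} lp/mm, std dev {s:.1f}, rough 95% interval +/- {2 * se:.1f} lp/mm")
# mean 27.5 lp/mm with roughly a +/- 2.2 lp/mm interval, so averages within a
# couple of lp/mm of each other are hard to tell apart with ~10 samples.
```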
- The Fallacy of Spray and Pray
Blue Angel Daredevils.

Photographers lust after that pro camera model with those high frame rates, so that they won't miss that crucial shot. Guess again. Let's say you just got that new D500 and you dialed in 10 frames per second. How could you miss now? Well, let's look at some simple math (a tiny script version of it follows at the end of this article). The above shot shows two jets which are cruising at about 500 miles per hour. Their closing speed is 1000 miles per hour, which is about 0.28 miles per second, or almost 1500 feet per second. That D500 taking a picture every tenth of a second captures those jets every 150 feet or so. Those aren't very good odds to get jets right next to each other, are they? A similar scenario gets played out trying to capture the touchdown catch.

So what to do? How about relying on your own reflexes? You can be quicker than you might think. Something I've noticed is that most photographers will close their left eye while looking through the viewfinder with their right eye. Stop that! Train yourself to observe what's going on with your left eye. You need to be able to anticipate peak action, and you can't do that if you can't see it. Next, you need to learn to compensate for the slight delay between squeezing the shutter release and the camera taking the shot, called "shutter release lag". This lag is usually about 40 or 50 milliseconds, unless you have one of those point-and-shoots that can take eons to respond.

There is no substitute for practice. No matter how much automation your camera has, it will never be able to replace your human intelligence or your anticipation of action. Learn your camera and lens; know which way to twist that zoom ring. Make it become second nature to zoom out until you locate your subject, then zoom in to properly frame the shot. I'm not claiming that pro camera features have no value. I'm just saying that you can't necessarily buy your way to getting that great shot. If this was all trivial, then where's the fun and challenge in that? Go after that satisfaction of owning the shot that didn't get away!

Jet Smooch. #howto
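Here is the back-of-the-envelope math from above as a few lines of Python, in case you want to plug in your own subject speed and frame rate.

```python
# Frame-spacing arithmetic from the article: two jets closing at 1000 mph,
# photographed at 10 frames per second.
closing_mph = 1000
fps = 10

feet_per_second = closing_mph * 5280 / 3600   # ~1467 ft/s closing speed
feet_between_frames = feet_per_second / fps   # ~147 ft of travel between frames
print(f"{feet_per_second:.0f} ft/s closing speed, about {feet_between_frames:.0f} ft between frames")
```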
- MTF Mapper Version 0.5.8
This version of MTF Mapper has some new features and some changed features. This is the software that I use to evaluate both lens resolution and focus calibration. The author of this program is Frans van den Bergh. You can get this software here: https://sourceforge.net/projects/mtfmapper/

This new revision can still use the original resolution and focus chart designs, which is a real relief if you have invested time, effort, and money in printing/mounting large versions of the charts. If you print the newer charts, you get some new and welcome abilities. What's changed? You may want to review my MTF Mapper Cliffs Notes article, detailing the older version capabilities.

New Resolution Chart

The biggest change as far as I'm concerned is the switch from 'relative' measurements to 'absolute' measurements in the resolution charts (grid2d and grid3d). The chart scales of earlier MTF Mapper versions would only have a value range matching the actual measurements, but now the resolution range starts at zero. Another big change is the switch to monochrome color coding in the 2d and 3d charts, instead of the 'rainbow' color coding that would auto-scale to the entire measurement range. A small but welcome change is the addition of the photo name under the chart, so you know where the chart came from.

Original resolution chart design. You can still use this chart.

New resolution chart design, showing annotated photograph.

New chart up close. Good edge measurements are blue; "iffy" ones are in yellow.

2D chart with absolute scale for MTF50 lp/mm.

Older 2D chart measurements with relative scale for MTF50 lp/mm.

3D new resolution chart showing the absolute (monochrome) scale.

MTF 10 and MTF 30 Graphs

Camera companies have traditionally published MTF charts that show 10 lp/mm and 30 lp/mm "theoretical" values. I stress "theoretical" here, because those companies are merely blowing smoke. They don't actually measure anything (at least Canon and Nikon don't). MTF Mapper can now plot the real-deal MTF10 and MTF30 charts, based upon actual reality. What a concept. These graphs use the same chart design used for "grid2d" and "grid3d".

MTF10 and MTF30 measurements using the new resolution chart.

Focus Chart

The new software can use the original focus chart. What's new is how the 'annotated' version of the chart displays the measurements. The measurements are in "cycles per pixel". The measurements are no longer embedded inside boxes, which makes reading the values and seeing the edges much easier.

Focus chart photo, showing the annotated edge measurements.

"Profile" option for focus chart.

The chart above shows a very slight focus error. The chart is oriented to make the left side farther from the camera than the right side. The ideal angle to shoot the chart is at 45 degrees relative to the vertical. The measurements above indicate that the camera (or lens firmware) needs some "-" focus-tune adjustment, to pull the focus toward the camera. I always recommend, by the way, looking at the annotated focus chart measurements. There are occasions when the numbers give you a better idea of how to adjust focus. Also, repeat this test several times to avoid reacting to normal focus variations. Lastly, perform the tests in good light for optimal reliability. In the focus test above, the camera focus point was placed onto the right edge of the large central trapezoid. Because the chart is rotated, the trapezoid looks like a rectangle in the photograph.
Measure Longitudinal Chromatic Aberration

Focus chart with "fiducials" for measuring longitudinal chromatic aberration.

Chart zoomed in. Shows green channel focus error.

The other major feature addition is the ability to analyze how a lens focuses in the red, blue, and green channels. When the different channel color focus measurements don't coincide, then you have longitudinal chromatic aberration. To create the charts shown above, the MTF Mapper needs to be configured as shown in the following picture:

Preferences dialog. Note the camera sensor "pixel size" must match your camera.

Summary

The new MTF Mapper version 0.5.8 brings many welcome additions. You might want to retain your older version, however, if you prefer the "relative" versus the "absolute" resolution measurements. Please visit the Frans van den Bergh site and give him some praise for going through all this effort. Frans, you're the man! #review
- MTF Curves: Theoretical Versus Actual
All camera companies (with the exception of Sigma and Leica) publish MTF curves for their lenses that are "theoretical" and not actually measured. Should you care? Personally, I believe in the old President Reagan saying "trust but verify". What follows is a dose of reality, compared to theory. I have chosen what most people would agree are among Nikon's best pro lenses for this study, lest I get accused of measuring lenses that were manufactured using lesser standards.

The MTF curves I'm referring to are the traditional mix of MTF10 (contrast) and MTF30 (sharpness). I used the "mtfmapper" software version 0.5.8 to create the following charts. Personally, I place much more stock in the 2-dimensional MTF50 plots that measure the whole camera sensor. Unfortunately, 2-D MTF50 plots are hard to come by outside of this site. The MTF charts are traditionally generated for a wide-open aperture, so that's how mine are measured. It's unknown what focus distance is used by Nikon; mine are measured at the distance needed to photograph an "A0" resolution chart filling the frame. I took the measurements in shade on a clear sunny day. Light wavelengths can affect measurements; I like to test using the same lighting conditions that I normally shoot in.

105mm f/2.8G ED-IF AF-S VR Micro Nikkor

This lens is supposed to be optimized for "close" distances, but I'm measuring it at a more conventional distance.

Nikon theoretical chart (from the Nikon site) for the 105mm.

Measured MTF10 and MTF30 for the 105mm at f/2.8.

I don't want to appear cynical, but I was 99% sure that my measurements would show less sweetness and light than the Nikon claims. This is pretty much borne out by the measurements. Take a look at the frame edges, though. The lens actually performs better there than the theoretical curves!

Measured MTF10 and MTF30 for the 85mm at f/1.4.

Again, not quite as good as theoretical. The edges have a few pleasant surprises, however.

85mm at f/4.0

Just for fun, I tried an f/4.0 test. It really cranks up the quality, doesn't it?

24-70mm f/2.8E ED VR AF-S Nikkor

The wide end of this lens looks dramatically different than theoretical. Again, this lens at 70mm looks quite a bit different than the claims.

Conclusion

It appears that Reagan had some good advice. Bear in mind that these lenses don't represent the whole population; your mileage may vary. My biggest surprise is that the FX frame edges fared better than expected. Trust but verify. #review
- Focus Stacking With Combine ZM
I have tried a few different programs that let you increase depth of field by stacking pictures that are shot at varying focus distances. Most of those programs will readily fail when the subject is too complex or the focus depth is too extreme. Focus stacking is mostly used in two different realms, namely landscapes and macro photography. Landscape photographers usually want maximum depth of field and maximum resolution, which can be had by stacking photos shot at the sharpest aperture and at multiple focus distances. Macro photographers know only too well that a single close-up can have paper-thin depth of field; combining a dozen or more shots is often necessary to get sufficient depth. I have had too much grief using Photoshop and Hugin tools, but a (free) program that works pretty well for me is called CombineZM, by Alan Hadley.

Stacking pictures requires a lack of image movement, so wind can mess up your plans. The pictures need consistent exposure, so manual exposure works best (at a constant aperture). For macro work, I like to use my Nikon PB-4 bellows (with its rack-and-pinion focus rail) to easily move the camera/lens combination from shot to shot, shifting focus by maybe half of a millimeter per shot. Good luck finding a Nikon PB-4 bellows.

The default settings in CombineZM don't always work the best for me, so I thought I'd share how I make it work for me. It bears mentioning that "macro" in CombineZM means "run a sequence of steps" and not anything to do with close-ups. My most successful recipe to stack pictures is this (a rough sketch of the general idea behind focus stacking follows at the end of this article):

1. Post-process and convert your pictures into TIF format (16-bit with LZW compression is what I use) using your favorite image editor.
2. Run CombineZM (I use it in Windows 7 and Windows 10, but it works in other operating systems, too).
3. Select File | New and then select the set of TIF pictures to stack (select in focus order). Wait until the pictures are loaded.
4. Select Macro | Do Weighted Average. I have less success using "Do Stack" or "Do Weighted Average Correction".
5. After it finishes, use your mouse to draw the diagonals of a rectangle around the 'good' part of the result.
6. Select File | Save Rectangle As. I just save the result as JPG, with typically 95% quality.

Here's a sample finished shot, which is from a stack of 10 files:

"Do Weighted Average" macro to create the stack.

I actually took even more shots in front of and behind what was used above. The software started to mess up with this many pictures, so I omitted some shots to achieve the result shown above.

"Do Weighted Average" with even more pictures in the stack. Note the evil 'ghosting'.

"Do Stack" macro. Note strange artifacts using this option.

What a focus stack looks like before you crop it.

Typical single-shot depth of field (60mm f/10).

Depending upon your subject, you may have to iterate on the selected options or perhaps how many pictures you can stack. Life is rarely simple… Make sure your subject is a bit smaller than the frame, because you will have to crop the edges of the "stack".

Conclusion

Focus stacking is one of those techniques that takes a little tenacity. There are many different tools that can stack pictures, with varying degrees of success (or failure). This is one of those digital tricks that seems to defy optical physics. If you're willing to put in the effort, the results can be quite rewarding. #howto
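For those curious what stacking software is doing conceptually, here is a bare-bones Python/OpenCV sketch of one common approach: for each pixel, keep the frame with the strongest local detail. This is only an illustration under assumed placeholder filenames; it is not how CombineZM works internally, and real tools align frames and blend the selections far more carefully than this hard per-pixel pick.

```python
# Minimal focus-stacking illustration: pick each output pixel from whichever
# source frame shows the strongest local detail (Laplacian response).
import cv2
import numpy as np

paths = ["stack_01.tif", "stack_02.tif", "stack_03.tif"]   # aligned, focus-ordered shots (placeholders)
images = [cv2.imread(p) for p in paths]

def sharpness(img):
    """Per-pixel sharpness map: absolute Laplacian of a lightly blurred gray copy."""
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (5, 5), 0)
    return np.abs(cv2.Laplacian(gray, cv2.CV_64F))

maps = np.stack([sharpness(img) for img in images])   # shape (N, H, W)
best = np.argmax(maps, axis=0)                        # index of sharpest frame per pixel

# Assemble the output by copying each pixel from its sharpest source frame.
stacked = np.zeros_like(images[0])
for i, img in enumerate(images):
    stacked[best == i] = img[best == i]

cv2.imwrite("stacked.jpg", stacked)
```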
- Clean Your Camera Image Sensor
Are you a little intimidated about cleaning those dust bunnies off your camera sensor? Should you punt and pay to have it done for you? It's a little scary to clean your camera sensor if you haven't done it before. I used to bring my camera to a Nikon repair facility to get it cleaned. They would keep my camera overnight, and it would cost me $70.00 for something that took them about 5 minutes of labor.

My D7000 camera, for roughly the first 12,000 exposures of its life, would sling oil onto the sensor. The Nikon service center denied this was oil, and indicated I was probably a little sloppy in my camera-handling cleanliness. Beg to differ. One time I cleaned my sensor (a "wet clean") and then made a 1000-shot time-lapse video. By the end of this 20-minute video, the sensor probably had a hundred oil blobs on it. Arrgh.

There is a cheap solution (one of those bad puns again). Believe me, you can't get oil off of a camera sensor unless you give it a "wet clean". But what about the more usual case of mere sticky dust? The kind of dust that a blower can't budge? There's now a tool you can buy that can clean off stubborn sensor dust in a safe and easy way. I made a little video that shows how simple and quick it can be to clean your camera sensor yourself. You don't need to be a fraidy-cat any more! #howto
- The Orton Effect
Michael Orton is a photographer who wanted to re-create the look of a watercolor painting with film. He invented a technique that sandwiches in-focus and de-focused slides of the same scene. Michael originally called his technique "Orton Imagery", but now everybody just calls it "the Orton Effect". When digital photography came along, people wanted to emulate this effect using software. Perhaps the most famous use of this effect is in the Hobbit movies. People knew the "look" was different, but they couldn't put their finger on what the difference was.

I really love the look of the Orton Effect for certain kinds of subjects. Just like cupcakes, though, you may like them but they're less than ideal as a steady diet. Everything in moderation.

A straight shot.

The Orton Effect (Capture NX2, blur radius 25).

A Few Ways To Create The Orton Effect

Many different photo-editing packages have the capability to create the Orton Effect. Some examples are Gimp, Photoshop, and Nikon Capture NX2. Maybe one of these days, your camera will have an "Orton Effect" setting to create it directly. (A small code sketch of the same recipe appears at the end of this article.)

Nikon Capture NX2

I must be one of the last hold-outs on using Capture NX2. I have a zillion batch files to process pictures using this software; one of them is, of course, the Orton Effect.

The first step is to set the Output curve to a value of 3 in "Levels & Curves". Leave other settings at their defaults.

Second step: set a Gaussian Blur value to around 25. Use a blending mode of "Multiply". The radius value here should be set to suit your subject matter.

Third step: set a midpoint value to "2" in Levels & Curves. Alter the blending mode to "Multiply".

There you have it. At this point, it would be prudent to save your steps as a Batch Process (Batch | Save Adjustments… | Save As | OrtonEffect). Now, you can select photos and run the batch process on them to get the Orton Effect without having to memorize any more steps. You might want to save a few different batch files, setting a different Gaussian Blur radius in each one (the "second step").

The Orton Effect (Capture NX2, blur radius 35).

For simple subject matter, I prefer a larger blur radius. For you, sprinkle to taste.

Adobe Photoshop

First, duplicate the photo and call it "Sharp".

Second, right-click on the "Sharp" layer, select "Duplicate Layer…", and name it "Sharp copy". Select "Screen" for the blending mode of this layer. While "Sharp copy" is still selected, right-click and select "Merge Down". You will be left with just the "Sharp" and "Background" layers.

Right-click on the "Sharp" layer, select "Duplicate Layer…" and name it "OutOfFocus". With the "OutOfFocus" layer selected, go to "Filter | Blur | Gaussian Blur" and set a radius suitable to the effect you want. No details should be visible, but you can still make out shapes. Set the "OutOfFocus" layer blending mode to "Multiply".

All that's left is to save your final image in whatever format you prefer.

Finished Orton Effect using Photoshop.

Summary

I would encourage you to explore this processing technique. It can transform a blah photo into something special. I find that purely literal recording of images can start to feel a bit mundane. Try something on the wild side once in a while. Michael Orton did some really pioneering work in photography. We owe him a big thank you.

It does look a little like a watercolor painting, doesn't it? #howto
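As referenced above, here is a minimal numpy/PIL sketch that mimics the Photoshop recipe (screen the image with itself, then multiply by a heavily blurred copy of the result). The filename and blur radius are placeholders; treat it as an illustration of the blending math, not a faithful match of either editor's output.

```python
# Orton-style effect: screen blend of the image with itself, then a multiply
# blend with a heavily blurred copy, mirroring the layer recipe described above.
import numpy as np
from PIL import Image, ImageFilter

img = np.asarray(Image.open("input.jpg"), dtype=np.float32) / 255.0   # placeholder filename

# "Screen" blend of the image with itself (brightens it).
screened = 1.0 - (1.0 - img) ** 2

# Heavily blurred copy of the brightened image (the "OutOfFocus" layer).
blurred_pil = Image.fromarray((screened * 255).astype(np.uint8)).filter(
    ImageFilter.GaussianBlur(radius=25))
blurred = np.asarray(blurred_pil, dtype=np.float32) / 255.0

# "Multiply" blend of the sharp brightened copy with the blurred copy.
orton = np.clip(screened * blurred, 0.0, 1.0)
Image.fromarray((orton * 255).astype(np.uint8)).save("orton.jpg")
```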
- White Balance Calibration When Colors Go Haywire
Setting the white balance is one of those things that can be laden with a lot of guilt. If you shoot RAW, it's supposed to be a "don't care", but many photographers will look down their nose at you if you don't "do Kelvin". I thought it's high time to do a little comparison shopping. Modern cameras have a lot of computing horsepower to figure out what white balance you should be using à la "auto", but is it any good? What about using tables that supply the answers? What about a color meter? What about just setting it in your photo-processing software after the fact? How about using "Live View" mode to help you decide? So many questions.

I always shoot RAW, so white balance decisions aren't a big deal to me. If I don't like the color balance, I just change it in photo editing software. With JPEG, though, it's not nearly as forgiving. JPEG has very little elbow room for errors, so you want to get it right in the camera. But how can you reliably get it right in the camera?

I conducted some tests using a Nikon D610 and an Android smartphone program called "Light Meter" (version 2.6) written by Borce Trajkovski. This program lets you measure light levels (as in lux) and also color temperature in degrees Kelvin. The program has the additional advantage of allowing you to calibrate it for both light (scale and offset) and also for color (scale and offset). It gives you approximate calibration values to use for various smartphone models. I used Nikon's Capture NX2 to adjust and analyze my RAW shot tests, but many photo editing packages would work just as well.

Sunshine

I took 3 shots of my neutral grey card target illuminated by direct sun. I used "Auto" white balance, the measured color meter temperature (5560K was the closest camera setting), and the "Direct Sunlight" camera white balance selection. Not surprisingly, all of the shots look acceptable (although the "Direct Sunlight" choice was off the most). Using my histogram view in Capture NX2, I could see that the R,G,B color peaks were nearly on top of each other, as they should be for a neutral grey target. The in-camera histogram showed the same result.

"Auto" white balance in sunshine. 5433K was set automatically.

"5560K" white balance in sunshine. Meter indicated 5600K.

"Direct Sunlight" white balance selection in sunshine. 5209K was set by camera.

Most cameras have sunlight pretty well figured out, so you'd expect those shots to have well-balanced color.

Shade

My next test was in open shade, with a clear blue sky.

"Auto" white balance in shade. 7662K got set automatically.

"7140K" white balance in shade. Meter indicated 7000K.

"Shade" white balance selection in shade. 7989K was set by camera.

The analysis of the histogram peaks indicates that "Auto" white balance is the best here, but again all three are reasonably close to each other. I prefer the "color-metered" setting.

Indoor LED Lighting

"Auto" white balance inside using LED (ceiling) lighting. 3390K was set automatically.

"4350K" white balance in LED lighting. Meter indicated 4300K.

LED lighting "Live View" guide with "3030K" selected WB.

Now things get interesting. The "Auto" setting was pretty inaccurate, and using the color meter was really terrible. It turns out that by using "Live View", I could really nail the white balance. In "K" WB mode, I could spin my camera's control dial and instantly see the color change on the screen in Live View. An electronic viewfinder would work the same way. Since I use RAW, though, all is not lost.
All I have to do if I messed up the in-camera white balance is to adjust it via my photo editor when I get back home. Here's the trick: with a neutral grey target, adjust the white balance value until the RGB histogram peaks coincide (a rough sketch of this check appears at the end of this article). Let's take a look at the badly-adjusted shot in Capture NX2 next. With the 4350K original setting, the RGB peaks aren't even close to where they should be (they should land on top of each other). By adjusting the "Fine Adjustment" slider to 2950K, the peaks overlap and the picture is now perfectly neutral.

"Live View" really saved the day on the indoor shots. If I were shooting an indoor wedding ceremony at the mercy of whatever lighting was there, I'd definitely want to consult "Live View" to set my white balance. Outdoors in bright lighting is another animal, however. Live View (unless you use something like a Hoodman Loupe or have an electronic viewfinder) is an underwhelming experience. You should still be able to analyze the histogram peaks on shots after the fact to help you dial in white balance, though. By the way, don't even think about using published tables of color temperatures for indoor lighting. Indoor lighting color is all over the map, and tables are mostly useless.

Conclusion

So what have we learned today, class?

- Auto white balance can be your friend and your foe. Learn when it's safe to use it and when it's not.
- Carry a grey card with you to calibrate the white balance. Take a shot of the card so that you at least have a good reference picture to dial in white balance at home with your photo editor. Remember to take another shot of the card when the lighting changes.
- Live View can really be your friend, even if you just use it to dial in the white balance and then turn Live View off.
- Shooting birds in flight moving in and out of shade, however, leaves a single viable choice: "Auto WB". This is what RAW format is all about; just fix the color in your photo editor.

Happy (calibrated) shooting. #howto
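To put a number on the "peaks should coincide" idea referenced above, here is a small sketch that compares per-channel averages over a crop of the grey card. The filename and crop coordinates are placeholders, and since a JPEG is gamma-encoded the ratios are only a rough neutrality check, not a substitute for the editor adjustment described in the article.

```python
# Rough gray-card neutrality check: compare the average R, G, and B values of a
# crop covering only the gray card. Ratios near 1.0 mean the card renders neutral.
import numpy as np
from PIL import Image

img = np.asarray(Image.open("graycard_shot.jpg"), dtype=np.float32)   # placeholder filename
crop = img[800:1200, 1500:1900]          # placeholder region covering only the gray card

r, g, b = crop[..., 0].mean(), crop[..., 1].mean(), crop[..., 2].mean()
print(f"R {r:.1f}  G {g:.1f}  B {b:.1f}")
print(f"R/G {r / g:.3f}  B/G {b / g:.3f}   (both near 1.0 means the card renders neutral)")
```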











