
  • Topaz DeNoise AI 3.4.2 with RAW File Processing

    Topaz Labs has recently updated their DeNoise AI to version 3.4.2. The main changes include improved AI engine efficiency, improved raw support, GUI additions, and some bug fixes. I had tried their older DeNoise version 3.3.4 on my Nikon raw (.NEF) files for both the D500 and D850. Version 3.3.4 failed miserably with my Nvidia Quadro FX 880M GPU. The new 3.4.2 version thankfully works perfectly with this GPU. Other file formats worked on all previous DeNoise versions, but the NEF files didn't.

You'll want to do the denoise step before any other raw processing. I do most of my edits in Lightroom, so I'm going to concentrate on the Topaz/Lightroom combination here. Topaz explains that their AI algorithms perform optimally on the original (raw) file information, rather than on files already processed by editors such as Lightroom. Lightroom understands the DNG raw format, so that's what you want to use. This order of processing is superior to letting Lightroom edit the raw file first and then invoking the DeNoise AI plug-in to finish.

Be prepared for huge DNG files, however. The D850 files in DNG format are about 262 MB! You might want to save this technique for your special photos, unless you have loads of storage space. It would be tempting to delete the DNG files after editing and exporting the results to JPEG, but it could be pretty difficult to re-create all of your edits later from the NEF/CR2/CR3/ARW original.

You should skip any sharpening or noise removal in Lightroom, and just let Topaz work its magic instead. You just need to start with DeNoise AI and finish with Lightroom, instead of starting with Lightroom and finishing with DeNoise AI. If you start by editing the raw file in Lightroom and then call the DeNoise AI plug-in from within Lightroom (which is what I always used to do), the quality is still extremely good. Starting with DeNoise AI first just gets slightly better results, at the cost of creating a huge DNG file (and then importing it into Lightroom).

ISO 4000 processing with DeNoise AI

The shot above shows one of my favorite recipes for processing photos that are in the vicinity of ISO 4000. This photo was shot with my Nikon D500. I use the "Clear" algorithm, with "Remove Noise" = Medium, "Enhance Sharpness" = Low, "Recover Original Detail" = 26, and "Color Noise Reduction" = 7. I was never happy with any other photo editor for noise removal and sharpening of photos above ISO 1600. I no longer hesitate to go up to ISO 6400 after adding DeNoise AI to my workflow. In a pinch, I'll sometimes go to ISO 10,000, but I'm not too wild about the results. It wouldn't surprise me if future AI algorithms are able to handle even these high-ISO shots adequately.

Process the raw shot and save as DNG

Procedure (Windows 10) to start with a raw-format input file:
1. Go to the folder with the NEF (or other raw format) files using Windows Explorer.
2. Select (or multi-select) the NEF/CR2/CR3 raw file(s).
3. Left-mouse-drag the file(s) onto the Topaz DeNoise AI desktop icon. This invokes DeNoise AI as a stand-alone application, rather than as a plug-in.

If you drag more than a single shot onto DeNoise AI, you're batch-processing the raw shots. Alternatively, you can start Topaz DeNoise AI first and simply mouse-drag your photos or folder onto the running program.
The finished shot, after processing the DNG file in Lightroom

ISO 4000 (and generally even ISO 6400) looks gorgeous when I process my shots with DeNoise AI. I used to cringe when I had to go up to ISO 3200 using Lightroom or Zoner or Photoshop or DarkTable by itself. Topaz has really changed my opinion of what's possible with high-ISO shots. Both noise and sharpness show huge improvements. By the way, the (Nikon D500, 21MP) shot above was cropped by about 50%.

ISO 4000 never used to look this good

You should still strive to keep ISO as low as is practical, in order to preserve the maximum dynamic range. With big lenses, however, you're stuck with high shutter speeds and therefore higher ISO.

DeNoise AI 3.3.4 was a disaster with my Nvidia GPU

You can see above the kind of results I was getting previously, when I tried to process my raw (.NEF) shots directly with DeNoise AI: I would get a bunch of random black squares. This is now totally repaired in the newer releases 3.4.1 and 3.4.2. You will also find that the algorithms run a bit faster in this newer version. They're still slow compared to conventional noise removal and sharpening in a photo editor, but AI processing delivers vastly superior end results. If your computer doesn't have a GPU (graphics processing unit), then you probably shouldn't use Topaz DeNoise AI; its artificial intelligence algorithms need huge computing resources, which GPUs provide.

I predict that eventually all of the photo editors will be forced to adopt AI algorithms, since non-AI techniques can't even begin to compete with the quality that products like Topaz DeNoise AI provide. I don't get any money from Topaz; I just think photographers need to be aware of just how good the latest version of this product is.
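If you're curious where a number like 262 MB comes from, here's a quick back-of-the-envelope check. It assumes the DNG written from a raw input is an uncompressed 16-bit linear (demosaiced) DNG, storing three 16-bit values per pixel instead of the single mosaicked value per pixel of the compressed NEF; Topaz doesn't spell this out, so treat that as an assumption:

```python
# Rough size estimate for a demosaiced 16-bit DNG from a D850 raw file.
pixels = 45.7e6              # Nikon D850 sensor resolution (~45.7 MP)
bytes_per_pixel = 3 * 2      # RGB x 16 bits per channel, uncompressed

size_bytes = pixels * bytes_per_pixel
print(f"{size_bytes / 1e6:.0f} MB  ({size_bytes / 2**20:.0f} MiB)")
# -> roughly 274 MB (about 261 MiB), in line with the ~262 MB quoted above,
#    versus the much smaller lossless-compressed NEF original.
```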

  • How to Process Infrared Photos with Zoner Photo Studio

    I started processing the raw-format shots from my infrared-converted camera in Zoner Photo Studio. My go-to editor is Lightroom, but it does a poor job of creating a proper white balance for infrared. Infrared photos have a strong red cast to them, and Lightroom just can't handle them properly. To do the effects shown in this article, you'll need an editor that supports swapping color channels directly or through a plug-in.

A handy tool to have when editing color infrared shots is a red/blue channel swap plug-in. This channel swap is how you get your sky to turn blue, instead of the 'tobacco' color of white-balanced color infrared. I found a free plug-in from Flaming Pear that works in Zoner Photo Studio to perform this channel swap.

I got my old Nikon D7000 converted into infrared by Kolari Vision, although there are a few companies that will do a conversion like this. I chose a 590nm infrared filter, which allows part of the visible spectrum through and therefore enables color infrared shooting. Camera conversions that use long-wave infrared, such as 850nm, are strictly for black and white photography. You can of course achieve these same effects by purchasing IR filters, instead of getting a camera converted to infrared. If you use filters instead, then get ready to take a tripod along.

Typical infrared shot with a red/blue channel swap

The photo above started out with a very strange sky color. The procedures that follow show you how to convert your infrared shots to get the classic blue-sky effect. Bear in mind that infrared doesn't contain anything that can be called "color"; any color assignments you make are just as correct (or incorrect) as any other. People tend to prefer color shots that have a blue(ish) sky, however.

Install the free plug-in to swap red/blue color channels

Before trying any editing in Zoner Photo Studio, I installed a library of plug-ins from Flaming Pear. Here's a link to get the free plug-in. The Flaming Pear plug-ins can of course do a lot more than just swap color channels; you should probably try out some of the fun effects they offer.

1. Download the compressed plug-ins into the desired folder. For Windows, the file is called "freebies-win-latest.zip".
2. Decompress this file.
3. Open up Zoner Photo Studio.
4. Go to the "Editor" tab.
5. Click the "Effects" tab to add the plug-in folder. Zoner doesn't like 64-bit plug-ins, so select the 32-bit freebies folder.
6. Select "Settings…" to add plug-ins.
7. Click the "Add…" button to browse to the plug-in folder.

Once the plug-ins are ready for use, you can start editing your raw-format infrared photos. You really, really should be shooting in raw format to get quality results.

Color Processing Steps for a Blue Sky

(Do these steps for the first image, to set up the Zoner defaults.)

Raw photo: use the eyedropper tool to select a neutral area

You'll probably see something like the shot above before you set a proper white balance. To successfully use the eyedropper tool to set a white balance, your photo should contain something neutral in it, like a rock or sidewalk. You only need to do this white-balance operation once, since you can save the result in Zoner and apply it to all subsequent shots.
Finished result of using the eyedropper tool

1. Open the raw photo in Zoner Photo Studio.
2. Go to the "RAW" tab.
3. Select the White Balance eyedropper tool.
4. Pick a neutral area in the photo, such as a sidewalk or stone.
5. Click Settings | Set Current as Default.

Now, opening other photos in the RAW tab will automatically white-balance them, or at least get pretty close. If you want to process non-infrared shots later, then you'll probably want to set up other defaults at that time.

6. Adjust the image to suit your taste in the RAW tab. (Green plants should now look blue, and the sky is tobacco-colored.)
7. Click the "To the Editor" button in the bottom-right corner.

You'll probably want to turn the sky into a shade of blue, which will then turn the plants into shades of yellow/orange. This step is where the Flaming Pear "Swap Red/Blue" plug-in comes into play. Many people are happy leaving the plants with a blue color; color infrared doesn't have any set rules.

8. Click Effects | Plug-in Modules | Flaming Pear | Swap Red/Blue.

After the plug-in finishes, your sky should now be blue and plants will typically look a shade of yellow.

Neutral white balance, before the red/blue channel swap

Red/Blue channels are now swapped

You can now touch up the photo with the usual editing tools, and then save it in TIFF/JPEG format. If you don't like the colors, you can also convert the shot into black and white.

Convert your shot into black and white

I really love the plug-in called Silver Efex Pro 2. It's made by the Nik people, who are presently working for DxO. While editors such as Zoner Photo Studio have built-in features to turn shots into black and white, they're quite primitive compared to Silver Efex. This plug-in is installed using the same techniques shown above for Flaming Pear.

Shot converted into black and white with Silver Efex Pro 2

Once the plug-in is installed, do the following steps to convert shots into black and white:
1. Start in the "Editor".
2. Click Effects | Plug-in Modules… | Nik Collection | Silver Efex Pro 2.
3. Select whichever effect you like best. You can fine-tune the effect, if you wish.

I'm a huge fan of black and white, and infrared landscapes can look stunning in black and white. Actually, black and white infrared photos are probably more "correct" than the color photos shown earlier in this article.

Summary

There are several photo editors available that can perform the red/blue channel swap for handling color infrared. There are far fewer editors that have the necessary range to get a proper white balance for color infrared. I mostly use Lightroom and plug-ins for my photo editing, but here's a case where it just falls to its knees. I stopped searching after I discovered that Zoner had all of the capabilities (including plug-in support) that infrared processing needs. I'm not getting any money from the Zoner people, but I thought some of you would be interested in knowing about useful tools for editing infrared. I'll also mention that I save my sharpening and noise removal for the Topaz DeNoise AI plug-in. Once it's configured properly, I have seen no equal to its capabilities.
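If your editor of choice doesn't have a channel-swap plug-in, the operation itself is simple enough to do outside the editor. Here's a minimal sketch in Python (using Pillow and NumPy on an 8-bit RGB export; the filenames are placeholders) that performs the same red/blue swap:

```python
# Minimal red/blue channel swap, equivalent in spirit to the Flaming Pear
# "Swap Red/Blue" plug-in described above. Assumes an 8-bit RGB TIFF/JPEG
# export of the white-balanced infrared shot.
from PIL import Image
import numpy as np

img = np.array(Image.open("ir_white_balanced.tif"))   # H x W x 3 (RGB)
img[..., [0, 2]] = img[..., [2, 0]]                    # exchange the R and B channels
Image.fromarray(img).save("ir_channel_swapped.tif")
```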

  • Nikon Image Overlay Feature Tutorial

    Most Nikon DSLRs and mirrorless cameras have a feature called Image Overlay. This is a vastly underrated capability that can really spice up your photographs. You can think of it as a "double exposure" technique, but it's more powerful than that simple description. Sample camera models that have this feature include the D5000, D7000, D7100, D610, D500, and D850.

I use this feature to add a moon to landscape shots that are just begging for a little something extra in them. It's particularly interesting to be able to add a telephoto moon to a wide-angle landscape, to create an 'impossible' combination. The fact that these shots are created entirely in-camera makes it that much more powerful. Most people don't realize that the Image Overlay feature actually creates a RAW output result, with an image quality that goes way beyond simple JPEG format. You should be using RAW input images, too.

It's of course possible to use an image editor with 'layers' to accomplish something like this. The in-camera technique, though, will probably yield better quality and also gives you the ability to re-shoot on the spot if you decide that you don't like the alignment, for instance.

I actually keep a little library of protected moon shots on my camera's second card slot. I protect these shots (using the little "key" button) against accidental erasure. This way, I always have the ability to add a moon, for example, to a landscape. I keep a variety of moon shots with different positions and magnifications in the night sky, to allow flexibility with composition. Note that you have to remember not to re-format this "library" memory card; even "protected" images will be lost if you do that. You might want to copy your 'library card' to a backup card for safe keeping.

Aloe reaching for the moon

The example shot above shows how I added a moon to the sky. This kind of image would not have been possible in a single shot. The moon was photographed at a much longer focal length, not to mention that the moon is virtually never in the right spot at the right time for your shot.

Image Overlay Procedures

I demonstrate a typical editing session in the steps that follow. The camera menus allow you to try out different images and balance their exposures, as well. You can select the shots from either memory card slot.

1. Go to the Retouch Menu to find Image Overlay.
2. Press OK to locate your first image to use.
3. Select the desired first image, then press OK.
4. Select the second image to overlay onto the first image.
5. Adjust the brightness of the second shot.

In the example above, I increased the exposure of the second (moon) photo by 1.5X. I could see the final result in the preview window, prior to saving the combined photo. Note that my first photo was selected from my XQD memory card, and my second photo was selected from the SD card slot.

6. Press the OK button to overlay the shots.
7. Press OK to save the result, or Back to adjust further.

The finished shot

Saving Photos To Another Card Slot

If you want to move shots to your second memory card "library", the images below show those details. You should begin by marking each photo you want to move as "protected" by pressing the little "key" button. This step is simple insurance against accidentally erasing them, and it will also simplify photo selection. The operations to copy/save images are found in the Playback menu. The steps assume that you have already marked the shots to copy as "protected".
1. Select the Copy image(s) option.
2. Locate the shots to copy.
3. Choose the "Select protected images" option.

To make things easy, there's an option called "Select protected images". If you have already protected the shots that you want to copy, then this option will grab them all in one step. Now you can save the selected images onto your other memory card, and easily build up a library of "stock" image files (mine are mostly moons at different phases, magnifications, and positions).

Summary

Nikon's engineers did a quality job when they designed the Image Overlay feature. This capability lets you create multiple exposures that have maximum quality, and you can complete the whole process in-camera. I typically only use multiple exposures for things like combining fireworks or adding a moon to the sky; the feature, of course, doesn't care what you decide to combine. You can always copy some of your original raw shots back onto your memory card later, to add a moon (or something else) after the fact. It's gratifying to see that Nikon has chosen to carry this feature over to so many camera models over the years. It's a pity that more people don't use it, or are even aware it exists. This is something that you can have a lot of fun with.

  • Kolari Vision Infrared Camera Anti-Reflection Coating Review

    When I got my Nikon D7000 converted into infrared-only, I decided to use the Kolari Vision company. The main reason I chose this company was because of the specifications they gave in regards to their IR filter covering the camera sensor. Kolari Vision offers (at additional charge) to put an anti-reflection (AR) coating over the surfaces of their glass IR filter that they install in place of the normal (visible-light) camera sensor filter. I was a bit skeptical about this AR coating (it’s adjusted to infrared wavelengths), but I decided to add this option when they converted my camera. They also add materials that are supposed to make this sensor filter easier to clean, if debris gets on it (which it always does). My camera conversion was with their 590nm sensor filter, but they offer the anti-reflection coating option for all of their conversion wavelength options. I wanted to mention that I don’t get any money from Kolari Vision, so I have no stake in anybody buying something from them or not. I just have a deep interest in infrared photography. Kolari Vision claims that un-coated IR sensor glass filters will reflect 7% of the light, while their coated filters reduce this down to 0.4%. They further claim that this reflected light bounces off of the rear of the lens aperture blades, and is a primary contributor to the dreaded hotspot in the middle of your photos (with many lenses). Hotspots, if the lens produces them, always get worse as the lens aperture gets stopped down, and this is the reason why. You can read for yourself about their anti-reflection coating here: The Life Pixel company, which also does IR camera conversions, doesn’t put anti-reflection coatings on their IR filters over the camera sensor. They claim that they tested this technology, and found that it made light transmission worse and didn’t help reduce hotspots in any way. All modern lenses have multi-coating, and they vastly increase light transmission. It doesn’t make any sense to me when Life Pixel claims that the anti-reflection coating reduces light transmission. They doth protest too much, methinks. Sorry, Shakespeare. So who’s telling the truth here? I figured that some testing was in order. I can’t prove that the reflected light bounces off of the aperture blades, but I can at least look at the end result of using a coated sensor filter. For many years, I have done infrared photography by using IR filters on my lenses. I always figured that it got me exactly the same results as an infrared-converted camera, except that the light levels were reduced when using the filters (compared to an IR-converted camera). I have always known that my Nikkor 50mm f/1.8 AF-D lens got a bad hotspot after stopping it down while doing infrared photography. Because of this, I always avoided using it when shooting infrared. The shot below shows exactly what I’m talking about. 50mm f/1.8 at f/7.1, 30s, ISO 800, Neewer 850nm IR filter You can clearly see the hotspot in the middle of the shot above, even though the lens was only stopped down to f/7.1. The hotspot was clearly visible, no matter which IR filter I tried on this lens. The hotspot got breathtakingly bad at f/16. This shot uses my own attempt at a white balance preset, and the color wasn’t modified by any post-processing editor. I could use Lightroom and its “radial filter” to mask the hotspot, but this is definitely a second-rate, band-aid kind of fix. 
I decided to re-create this shot with the same lens at the same aperture, the same filter, and the same lighting conditions. While I was at it, I also decided to try a variety of other infrared filters. The main change here is that I’m now doing these tests on my Kolari Vision converted IR camera. The shot above was done with my Nikon D7100, which has not been converted to infrared. I have seen this same hotspot effect when trying my other cameras with infrared filters, including on my D7000 before I got it converted into infrared-only. My shots would look okay wide-open, but would get ruined after stopping down beyond f/5.6 or so. In the test shots that follow, I am using the factory-set white balance that Kolari Vision provides, without any further modification by me in an editor. In regular photography, I would do lots of post-processing to alter the colors, including improving the white balance. 50mm f/1.8 at f/7.1, 1/800s, ISO 800, Neewer 850nm IR filter The shot above was with my Kolari Vision 590nm conversion, but also using the same Neewer 850nm IR filter on the lens that resulted in the nasty hotspot with my D7100. Notice a few differences from the previous photo using the same Neewer IR filter. The exposure went from 30 seconds to 1/800 second (over 14 stops)! The big thing, however, is the total absence of a hotspot. Kolari Vision’s anti-reflection coating appears to have made a huge difference. Also note the loss of color, due to the double-filtering of the light through two visible-light-cutoff filters (590nm and 850nm). I need to mention that I use Live View with an LCD magnifying viewer to see and focus when I put an IR filter onto a lens. It’s only a minor inconvenience, as long as I’m not trying to follow action. This combination even works in bright sunlight. Hand-held shooting is still possible with this combination of an IR-converted camera and an IR lens filter, although I typically lose about 3 stops of light. I decided to try some other filter tests at the same lens aperture, to see if the hotspot might show up with filtration changes. Kolari Vision 590nm sensor filter conversion only. 50mm f/7.1. No hotspot! Zomei 850nm 50mm f/7.1 with Kolari-converted camera. The Zomei 850nm filter also causes total loss of color information, and is extremely similar to the Neewer 850nm filter. Hoya R72 50mm f/7.1 with Kolari-converted camera. As you can see, the various IR filters on my 50mm lens didn’t make the hotspot appear at f/7.1. Next, I decided to see if I could coax a hotspot to appear with this lens by stopping it down to f/16. I had no filter on the lens, and only adjusted the exposure by stopping down from the previous f/7.1. Kolari Vision 590nm only. 50mm f/16. Tiny hint of a hotspot. Most people wouldn’t even notice it, but at f/16 my 50mm lens shows just the barest trace of a hotspot in the middle of the shot above. This lens is now totally usable for infrared, at any aperture. Despite what Life Pixel claims, I consider that the hotspot issue is indeed repaired (at least with my lenses) at lens apertures down to about f/11. This may not be a total cure, but I think it’s a huge improvement. When stopping down further, the hotspots are weak enough to largely be ignored. I rarely use f/16 or narrower apertures in my photography anyway, since it ruins resolution due to diffraction. I suspect that I’d be seeing hotspots if I had the “standard” Kolari Vision 590nm conversion done, which doesn’t include the anti-reflection coating on the sensor filter. 
I guess I won’t ever find out, since I don’t intend to ever get a camera converted that doesn’t include the AR coating. It’s a cheap investment to make sure your lenses perform as well as possible with infrared photography. Kolari Vision 590nm only. Adjusted white balance. I fixed up the white balance in the shot above, using the tree trunk as neutral gray. Notice that there is a bit of a glow from the bushes on the right. This is known as the “Wood Effect”, which comes from infrared heavily reflecting off of the chlorophyll in the leaves. This isn’t a central hotspot, and is generally considered part of the charm of infrared photography. You’ll have to make up your own mind if you like it or hate it. Most camera lenses demonstrate some of level of “glare” when the shot includes a subject with this heavy infrared glow. The shot above is typical. My Sigma 14-24 f/2.8 Art lens, although awesome with visible light, shows an above-average level of this glare in infrared. With most subjects the glare is ignorable, but sometimes the shots get ruined because of it. Kolari Vision 590nm only. Red/Blue channel swap. The shot above shows the more conventional red/blue channel swap from the white-balanced version, using my photo editor. The leaves changed from blue to yellow/orange, and the sky looks a bit more normal. The 590nm IR conversion retains enough of the visible-light spectrum to enable nice colors. It’s easy enough to also convert the shot into black and white, which I end up doing at least half of the time. I was going to make up a database of my lenses to indicate which ones would work for infrared. After the Kolari Vision 590nm conversion, all of my lenses work with infrared! There are of course some qualifiers in saying “all” of the lenses work; lenses don’t work equally well under all lighting conditions. My Sigma 70-200 f/2.8 Sport, which I initially determined to be totally unsuitable for infrared (using IR filters on un-converted camera bodies), for instance, now works just fine with most subjects. Another bonus from getting a camera converted to infrared is that my lenses that don’t permit a filter on them can finally be used to shoot infrared. Rokinon 8mm fisheye, 590nm infrared, red/blue channel swap The Rokinon 8mm fisheye doesn’t allow a filter to be attached. It works great in infrared, at least with the AR coating option that Kolari Vision provides. The shot above was at f/11. The reason I used f/11 is because this is one of two lenses I have that cannot focus to infinity in infrared. Stopping down to f/11 gets the lens reasonably sharp at infinity. It doesn’t look quite as sharp as with visible light, but still looks pretty good. Tokina 11-16mm at 11mm, f/8. 590nm infrared. I used Silver Efex Pro 2 on the shot above, to convert it into black and white. The 590nm conversion can still provide a nice white-foliage look that people mostly associate with only long-wavelength infrared like 850nm. The distant atmospheric haze was totally eliminated. This Tokina lens isn’t regarded as good for IR, but it looks fine to me. The main problem with this lens, as with several others, is shooting into very bright IR light (or just outside the field of view). The lens elements show lots of reflections and veiling glare when pointed at bright infrared sources. In shots like the one above, you’d never suspect it has any problems shooting infrared. 590nm infrared with Nikkor 24-70 f/2.8 at 24mm, f/9. 
My Nikkor 24-70 f/2.8 AF-S VR also has a poor reputation for IR, even with the Kolari Vision website’s lens database (“bad after f5.6”). It, too, is mostly working fine for me, through about f/11. I have the same caution about shooting into lights, however: don’t do it. Some subjects (especially beyond f/8) will cause an overall central glow, although it’s not quite what you’d call a well-defined hotspot. I used Silver Efex Pro 2 on this shot, too. The Luminescentphoto web site specifically states that their 720nm infrared-converted Nikon Z6 (didn’t say which company converted it) with this Nikkor 24-70 f/2.8 VR: “Hotspots at all apertures”. They rate it “Poor”. The shot above doesn’t have a smidge of a hotspot, and it’s stopped down to f/9. Sigma 70-200 f/2.8 with 1.4X teleconverter, 280mm, 590nm IR The Sigma 70-200 f/2.8 Sport lens isn’t supposed to work with infrared. This lizard obviously didn’t know that fact. Shot at f/4 1/3200s ISO 200. The shot above even throws in the 1.4X Sigma teleconverter. It’s curious that this lizard looks almost exactly the same in both visible and infrared light; most subjects look quite different in infrared. My Sigma 150-600 Contemporary lens works surprisingly well. I would have thought that all of the glass in that lens would have made it terrible at infrared. Maybe the Kolari Vision AR coatings are working their miracles there, too. 590nm, Sigma 14-24 f/2.8 ART, 1/640s f/6.3 14mm ISO 100 I couldn’t find any positive reviews regarding my Sigma 14-24 f/2.8 ART in regards to infrared. If you keep it pointed away from lights, it can produce fine IR images. It’s ironic that my cheap Nikkor 18-55 DX VR f/3.5-5.6 G II lens is excellent with infrared. It has better contrast and glare resistance than most of my other lenses do with infrared. It’s a slow lens, but I virtually never need a fast lens for infrared work (I don’t shoot it at night or even at dusk). It’s a pity that it doesn’t go wider than 18mm, though. Summary I have to admit that an IR-converted camera produces superior results, compared to using the screw-on IR filters over lenses instead. Many of my lenses that were previously unusable for infrared now work just fine in most (not all) circumstances. My super-wide lenses that don’t even have filter threads on them are suddenly my go-to lenses for infrared, such as my Rokinon 8mm fisheye. You might have noticed that just about all of the photos in this article were made with lenses that have been reported as either substandard or unusable for infrared. This was done intentionally. I have to question if anybody making those reports has tried to use an AR coated sensor filter. Again, you will want to avoid shooting into bright lights. Given this limitation, many “unusable” lenses are suddenly usable. Hand-holding shots instead of multi-minute, tripod-anchored exposures is a real treat. I can always use my 10-stop neutral density filter (most lenses) when I want long exposures. I think that the Kolari Vision anti-reflection coating on their infrared sensor cover filter makes a huge difference. The cost increase for this camera conversion optional feature is about the same as buying a single good infrared filter. Money well spent, if you like infrared shooting half as much as me.
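As a quick sanity check on the exposure gap quoted earlier (30 seconds on the un-converted D7100 versus 1/800 second on the converted body, with the same f/7.1, ISO 800, and 850nm filter), the difference works out to roughly 14.5 stops, which matches the "over 14 stops" figure:

```python
# Exposure-stop difference between the two shots described above.
import math

stops = math.log2(30 / (1 / 800))
print(f"{stops:.1f} stops")   # ~14.6, i.e. "over 14 stops"
```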

  • Focus-Trap Shooting on the D500 and D850

    Have you ever heard of focus-trap shooting? That's where you set up your camera to wait until a subject moves into a pre-set zone of sharp focus. As the subject enters that zone, the camera automatically starts shooting. It stops shooting after the subject leaves the focus zone. If the subject re-enters the zone, the camera starts shooting once again.

Hummingbird fly-by: caught in a focus trap

Focus-trap shooting is useful for things like the finish line of races, where the photographer isn't allowed to be present; he sets up his camera to shoot the end of the race unattended. This technique is also used beside a trail where shy or dangerous wild animals will wander by, with your camera in a secured box that has a hole in it for the lens to see through. It's also great for shy bugs moving onto a pre-focused spot over a flower, or for waiting for birds to land on a perch. It's not very straightforward to do focus-trap shooting with the Nikon D500 or D850, but it's possible.

High speed hummer: not easy to react to.

For the shot above, I set up a focus position in mid-air where I knew that a hummingbird would fly past on its way to eat. It's almost impossible to shoot a hummingbird up close if it's not hovering or perching. With a focus trap, the camera could easily do what I find exceedingly difficult: get a shot in flight from just a few feet away.

How to set up the camera

Pre-focus your lens to the distance where you want your camera to trigger shooting. Set the camera to AF-C, and make sure you're in autofocus (not manual-focus) mode; this won't work in manual-focus mode. Set Ch (high-speed continuous shooting) to get lots of shots while the subject is in focus. You might want to set your focus-point selection to 'single' if you want a very selective focus zone, but this isn't mandatory.

Autofocus menu: AF-C shooting priority configuration

Focus priority with AF-C: only shoot an in-focus subject

AF activation menu: select AF-ON only. Don't allow combined shutter and focus

Disable the out-of-focus shutter release

1. Go to the 'Custom Settings' (pencil) menu, "a1" AF-C priority selection. Select "Focus" (you may want to switch back to something like "Focus + release" after you're done with focus-trap shooting).
2. Go to "a8" AF activation | AF-ON only | Out-of-focus release | Disable.

Now, point your camera in the direction of the zone where you want the subject to trigger shooting after it comes into focus. You probably want to set your camera on a tripod at this point, unless you plan on holding the camera yourself. Don't touch the AF-ON button! The trick here is that your camera can't (auto)focus on its own, because this button never gets pressed.

Unattended operation: wired remote with shutter-hold feature

Hold down the shutter, or use a wired remote that has a locking feature on its shutter release to keep the shutter release active. The shutter is "held down" until you unlock it. When the subject moves into the correct-focus region, the camera will start shooting until the subject leaves the zone of focus. If the subject re-enters the zone of focus, the camera will start shooting once again.

Make sure that you test your setup by waving your hand in the desired focus zone and verifying that the camera starts shooting. You'd hate to waste an hour waiting for that animal to arrive, only to discover later that you had overlooked something in the setup that caused the camera to ignore taking the shot.
Remember to go back to the "a1" menu when you're done, and restore the original setting you had (such as "Focus + release"). If you don't remember to restore the old setting, you'll get burned later when you try shooting and your camera behaves strangely.

Many of the flying-bird close-up shots require you to switch to "M" and set a really high shutter speed (1/4000 s and faster) with Auto-ISO and a stopped-down aperture to get some depth of focus. The Auto-ISO options in something like Aperture-priority mode just won't go fast enough for the focal length.

This mode of shooting causes fairly high battery drain, so be aware of that fact. Charge up your battery before you start a focus-trap session.

Samples

Summary

You may just find that some types of difficult or impossible shots become doable with this technique. You wouldn't make a steady diet of this kind of shooting, but when you need it, you need it. You may think that animals wouldn't be any good at taking selfies with a DSLR, but they may just surprise you.

  • Action Shooting: Why Aperture Priority Mode is Superior

    On Nikon cameras that support Auto-ISO with a minimum-shutter-speed option, Aperture Priority is the best mode available when shooting action. Some people are of the opinion that Manual mode is best, but I'll explain why that isn't correct.

Manual Mode

When you set Manual mode with Auto-ISO, your camera will never change the shutter or aperture setting you already set. Instead, it adjusts the ISO higher as the light dims, until it reaches the programmed ISO limit. After that, your shots just start getting under-exposed. It's up to you to manually change the shutter or aperture to get correctly-exposed pictures in this dim lighting. Having to manually alter the shutter speed is the opposite of proper 'action' shooting.

Aperture Priority Mode

When you set Aperture Priority mode with Auto-ISO, your camera will adjust the ISO as the light dims until it reaches the programmed ISO limit. So far, this operates just the same as Manual mode with Auto-ISO. If the light gets even dimmer, your camera will then start lengthening the shutter speed to always maintain the correct exposure. In very bright light, your camera will drop the ISO as needed, and then finally start increasing the shutter speed to maintain good exposure. Your aperture doesn't get touched under any circumstances; it's up to you to manually change the aperture if you need to.

Configuring the Intelligent ISO Settings (D500 Example)

Start here to configure ISO behavior (on Nikons)

Set the ISO min/max range and enable "Auto ISO".

Minimum shutter speed "Auto" uses 1/focal length

Modify the shutter time: quadruple/double/as-is/half/quarter

In the Photo Shooting Menu, under ISO sensitivity settings:
- ISO sensitivity: typically a low value, like 100, to get better quality/dynamic range in bright light.
- Auto ISO sensitivity control: ON. Automatically adjusts ISO to keep proper exposure.
- Maximum sensitivity: typically 6400. Set a maximum ISO limit to retain at least some quality.
- Minimum shutter speed: AUTO*. Auto-adjusts the shutter to (1/focal length). I recommend Auto with a faster shutter (1 or 2 stops); it's configurable +/- 2 EV for acceptable motion blur. A 400mm lens using the 'half' multiple (0.5) gives 1/800s.

With this algorithm, the shutter speed is self-adjusting to your focal length (zoom setting), with extra margin if you want it (+/- 2 EV worth of shutter speed). This is quite intelligent, automatically making the correct trade-offs in the correct priority order. You want the lowest possible ISO setting that will work with your chosen aperture and the action-stopping shutter speed. You lose about one stop of dynamic range every time you double the ISO value, so you don't want to throw away potential dynamic range by setting the default ISO sensitivity high. The camera will automatically increase this ISO value as needed, so there's no advantage to setting a high default ISO.

With long telephotos, you'll freeze the motion better by adjusting the "Auto" minimum shutter speed to use a multiple of at least (0.5), as shown in the shot above. You set this by clicking the 'Auto' setting once in the "Faster" direction. With a 600mm lens and a multiple of 0.5, your shutter speed would be 1/1200 second. Some subjects will require the multiple to be set to (0.25) (or 1/2400s for the 600mm), which is the maximum "Faster" setting allowed in the "Auto" menu for the minimum shutter speed. Configured in this way, you'll always get the maximum shot quality that the lighting allows.
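Here's a small sketch of how I read the "Auto" minimum-shutter-speed rule described above: the base is 1/focal-length, and each click toward "Faster" halves the time (each click toward "Slower" doubles it), up to two clicks either way. Nikon doesn't publish an exact formula, so treat this as an approximation:

```python
# Approximate "Auto" minimum shutter speed with the Faster/Slower adjustment.
def min_shutter_seconds(focal_length_mm: float, steps_faster: int = 0) -> float:
    """steps_faster: +1/+2 = faster (half/quarter), -1/-2 = slower (double/quadruple)."""
    return (1.0 / focal_length_mm) * (0.5 ** steps_faster)

for fl, steps in [(400, 1), (600, 1), (600, 2)]:
    t = min_shutter_seconds(fl, steps)
    print(f"{fl}mm, {steps} step(s) faster -> 1/{round(1 / t)} s")
# 400mm, 1 step  -> 1/800 s   (the 'half' example above)
# 600mm, 1 step  -> 1/1200 s
# 600mm, 2 steps -> 1/2400 s  (the 'quarter' setting)
```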
Summary

Aperture-priority mode with Auto-ISO is simply smarter and more capable than Manual mode with Auto-ISO. Under most circumstances, there's no need to exit this mode just because you're shooting something other than action. In extraordinary circumstances, you might need a shutter speed like 1/8000s; in that case, Manual mode with Auto-ISO might be needed instead, but that scenario would be exceedingly rare. When action is happening, your attention should be on keeping focus and framing, not on maintaining proper exposure. Not all Nikon cameras (or other brands) provide this level of flexibility. If you shoot action, having these camera features available should weigh heavily in your future purchase decisions.

  • Kolari Vision Infrared Camera Conversion Review

    I finally 'bit the bullet' and had a camera converted to have an infrared sensor. The main reason I did it was to see what's going on through my viewfinder. I got really tired of having to compose the shot, screw the opaque IR filter onto the lens, and then take the shot. I wasn't enamored with 2 or 3 minute exposures, either. And I was not pleased with how the wind always blurred the tree branches and leaves. I chose to get my old Nikon D7000 converted, which had mostly been relegated to collecting dust. Now, this camera is "new" again.

I decided to use Kolari Vision, mostly based upon their IR filter specifications. Their filters are glass, thick enough to (mostly) shift focus to match your lens focus scale, and they offer an upgrade to have the IR filter coated with anti-reflection materials tuned to the IR wavelengths. Additionally, the IR filter coating is supposed to be easier to clean, and it's scratch-resistant. Kolari Vision provides a factory white balance, assuming the camera you have converted allows it; their website has a list of cameras that allow custom white balance. If you don't use an IR white balance, then you'll only see red when you review your shots on your camera's screen. You should still be shooting in raw format if you want to get the maximum quality from your camera.

You can buy a camera from them directly, if you wish, and avoid having to send them your camera. If you send them your camera, they have an option to send you a well-padded, insured box to protect your camera for shipment to them (I selected this option, too). Kolari Vision replaces your IR-blocking filter with their IR-passing filter under a filtered clean bench to keep things dust-free. Even so, my sensor had a couple of specks on its surface after I got my camera back from them. I'm well-versed in sensor cleaning, so it was easy to quickly fix that issue. The main thing is to NOT get dust between the camera sensor and the new IR-passing filter; Kolari Vision made sure that didn't happen.

590nm IR spectrum (with red-blue channel swap) 105mm f/2.8

White-balanced 590nm shot, using cloud for neutral color

590nm as black and white

Kolari Vision (like other vendors such as LifePixel) offers many filter options. I chose the 590nm option, since it includes part of the visible spectrum (deep orange and red) in addition to infrared. This option allows you to get fairly vibrant colors, and you can get realistic sky colors if you process your shots with a red/blue channel swap. Converting the shots into black and white can still capture the classic infrared 'glow' off of chlorophyll in plants (the 'Wood' effect, named after Robert W. Wood), even though it's not as dramatic as with 850nm filtering.

I need to mention that you shouldn't even consider getting an infrared camera conversion unless you're willing to do plenty of post-processing in a photo editor. Many editors aren't very suitable for IR editing, including Lightroom. You need a photo editor that is able to handle white balance in the more extreme red end of the spectrum. You should also use a photo editor that allows red/blue channel swapping, unless you want to stick with black and white. I'm finding that Zoner Photo Studio works very well for IR, since it has great white-balancing capabilities (the eyedropper tool) and it supports the plug-ins that I use for color channel swapping. I'll have to work on an article detailing the special tools and steps needed for editing infrared photos, compared to regular photos.
Kolari Vision offers IR conversions of 590nm, 665nm, 720nm, 850nm, and some mixed-spectrum options, too. The 850nm conversion will only produce black-and-white results, because there’s no human-visible-light left to apply color. If you get a conversion like my 590nm one, then you can still put IR filters such as the 850nm onto your lens and shoot long-wavelength IR. It won’t work to shoot short-wavelength filters over a long-wavelength conversion, though. Exposure times are just as short with IR-converted cameras as normal cameras, so you don’t need to use a tripod if you don’t want to. In fact, you can even do infrared video if you’re so inclined. I’d be remiss if I didn’t mention that most people won’t thank you for photographing them in infrared. Skin looks pasty and ghost-like. People that go for the ‘Goth look’ might really like this effect, however. Visible light: about 400nm through 700nm The chart above (thanks to Wikipedia) shows what portion of the light spectrum that most humans can see. The 590nm filter conversion eliminates the blue and most of the green spectrum. Infrared light starts at about 700nm and goes to about 1 millimeter. About half of the energy from our Sun is in infrared. IR Compatible Lenses Many lenses aren’t compatible with infrared photography, mostly because of internal reflections that cause the dreaded “hot spot” in the middle of your photographs. You can consult many web-based databases to try to find out which of your lenses will work. These databases are pretty sketchy, and only mention a fraction of the available lenses. Here’s the link to the lens database by Kolari Vision: https://kolarivision.com/articles/lens-hotspot-list/. An issue with the IR lens databases I’ve seen is that the filter wavelength-cutoff being used isn’t mentioned. The 590nm lens response won’t match the 850nm lens response, for instance. Also, there’s no mention of the Kolari coated/non-coated anti-reflection IR filter upgrade. Kolari Vision states that all lens hotspots will be reduced with their anti-reflection coating, but it won’t cure bad lenses. You’ll probably not even notice most lens “hot spots” unless you have a large expanse of cloudless sky in your shots, because they’re often subtle. If people judge hot spots while photographing trees or buildings, they’ll draw a very different conclusion than shooting a clear sky instead. There are also lenses (mostly super-wide lenses) that cannot focus far enough beyond infinity to achieve correct focus in infrared. My Rokinon 8mm fisheye is one of these, although stopping down to about f/11 still gets infinity into focus for it. I’m working on my own IR-compatible lens database for a future article, but it will take quite a bit of effort to finish it. Many lenses work okay for infrared as long as their apertures are wide-open through roughly f/5.6 or f/8. Most lenses start to develop a noticeable “hot spot” between f/11 and f/16 or beyond. If you care about sharpness, you shouldn’t be using f/16 anyway. All of my old Nikkor AI-converted manual-focus lenses are excellent with infrared. I think that Nikon used to consider IR in their lens designs (IR anti-reflection coatings and focus scale shift markings). I have found that placing my 850nm IR filter on my lenses can usually reduce or rid mild hot spots. For instance, my Tokina 11-16 f/2.8 has fairly poor performance with the camera’s 590nm, but it’s excellent when adding my BCI 850nm filter. 
This may be due to the 850nm shots being black and white, so that sky hotspot discoloration disappears. Some web sites report the opposite result, that the 850nm creates more hotspots than shorter cutoff filters. Every different lens may be a new and unique adventure in what works. Focus Calibration You’ll probably need to recalibrate focus, even with Kolari Vision’s thick glass filter. Every lens is a new and unique adventure with IR focus shift. With my D7000 camera, the focus calibration was typically shifted by +10 (out of the total range of +- 20). I noted that my old Nikkor 20mm f/4 AI-converted lens now focuses perfectly, using its normal “visible light” focus scale. With an IR filter on an un-converted camera, I’d have to use its little infrared ‘red dot’ focus scale shift. This tends to validate Kolari Vision’s claim that their thick IR filter glass does indeed shift IR focus to that of visible light (depending upon the lens). You can of course avoid focus calibration by using Live View and contrast-detect focus. This works fine, but you’ll probably need to invest in an LCD screen viewfinder when shooting in daylight. You can find decent viewfinder magnifiers for pretty cheap these days, such as my Xit finder. LCD viewfinder magnifier to use with Live View The other obvious choice to avoid focus calibration is to get a mirrorless camera converted, but only if your camera uses pure contrast-detect focus off of the sensor. Deep IR Filters As I mentioned earlier, you can still use IR filters on your IR-converted camera. I was pleasantly surprised to find out that deep IR filters, combined with Live View, let me shoot long-wave IR (850nm) hand-held. I just can’t use my optical viewfinder for this combination. 850nm IR lens filter with my 590nm IR camera When I want to explore the look of long-wave IR, I can do it much more easily than I used to. I got accustomed to making 3-minute exposures with other cameras, and now I can do them hand-held! The shot above was 1/100s f/5.6 at ISO 100 on my 85mm lens. I like to use my Silver Efex Pro plug-in to fine-tune my monochrome IR shots. My BCI 850nm IR filter drops the light levels by about 3 stops, as opposed to about 15 stops when shooting with a visible-spectrum camera. If I want to do really long exposures, I can always just add a strong neutral density filter to my lens. If you’re certain you won’t ever want to shoot color IR, you could just get Kolari Vision’s 850nm conversion instead. I’d recommend you invest in the Nik plug-in Silver Efex Pro if you get this type of conversion. You’ll be amazed at how long-wave infrared eliminates distant haze in your landscapes. LifePixel also offers an 850nm IR conversion. Samples Distant hill with all haze removed The hill shown above was quite hard to see, due to atmospheric haze. The haze was utterly removed by shooting in infrared. Note that infrared doesn't see through water vapor, however, such as clouds and fog. Polarized light Summary I’m very pleased with having my camera converted to infrared. I just love the unique look that you can get using IR light, particularly in black and white. I chose a conversion filter that lets me explore both color and black and white, so that I’m not limited in what I can do. I don’t want to forget to mention that Kolari Vision also sent me a camera neck strap (with their name on it, of course). It’s actually a quite nice one, to entice their customers to actually use it and therefore advertise for them. It’s a sort of symbiotic relationship. 
IR photography generally works best around noon on sunny days, which is exactly the opposite of regular photography. This way, you can keep shooting landscapes all day long instead of just during the ‘golden hours’, assuming you didn’t just convert your only camera. I don’t get any money from Kolari Vision, so I’m not trying to sell you anything here. I just thought you might like to know about what to expect if you decide to convert your camera into infrared. I didn’t research what IR conversion companies are available outside of the United States, so you’ll have to do that legwork yourself.

  • Use Nik Plug-ins Inside Nikon NX Studio

    Here's a non-standard way to use the Nik plug-ins: from Nikon's free NX Studio. This is an easy way to vastly increase the power of Nikon's free program. By the way, other plug-ins can be added with these techniques, but only if the plug-in is an ".exe"; the "8-bit filter" plug-ins (.8bf) aren't compatible.

The Nik plug-ins require input files to be converted into JPEG or (preferably) 16-bit TIF format; they can't use raw-format files. If you use NX Studio, you can stick with RAW (NEF) format and have it convert the files into 16-bit TIF prior to running the plug-ins. If you're interested in quality results, please skip using either JPEG or 8-bit TIF files. You can always use NX Studio to convert the 16-bit TIF results into JPEG as a final step before display.

Nikon has abandoned Capture NX-D, which was also able to use the plug-ins. That program was slightly more sophisticated than NX Studio, because it could automatically convert the raw format into TIF/JPEG before calling a plug-in. Now, you have to convert the raw format manually before you can use the plug-ins. So much for progress.

To convert your raw NEF photo into TIF, just select the photo and then click File | Export. Select the option to save it as 16-bit TIF format from the dialog that pops up.

Convert your raw photo into TIFF format

Be aware that all of the plug-ins except HDR Efex Pro 2 will overwrite the input TIF file when you save the results via their "Save" button.

How to run NX Studio with Plug-ins

Nikon's NX Studio is a somewhat limited, but free, program. Because Nikon keeps NX Studio current, it knows how to use raw-format files from its most recent cameras, unlike my beloved Capture NX2. It supports "control points" and the auto retouch brush, but plug-ins such as Silver Efex Pro, Viveza 2, Dfine 2, Sharpener Pro 3, and HDR Efex Pro 2 can greatly expand NX Studio's power.

Register your plug-in first

I am assuming you have already installed your plug-ins. As of this writing, the plug-ins are available from DxO. You'll need to locate where the plug-ins were installed. On my computer, they are installed into folders beneath "C:\Program Files\Google\Nik Collection". I'm using the 64-bit versions, but Nik has also provided 32-bit versions for programs/operating systems that cannot support 64-bit.

Before you can start using the plug-ins, you need to register them in NX Studio. As shown above, begin with File | Open With | Register…

Add a new plug-in

1. Highlight "Open with Application", then click "Add…"
2. Next, click "Other…"
3. Navigate to where your plug-in is located.
4. Select the desired plug-in executable, then click "Open".

Now you can run the added plug-in from NX Studio.

Select the desired plug-in to edit your TIFF photo

If you'd rather, you can edit the selected photo by right-clicking it, selecting "Open with", and then selecting the (registered) plug-in. You will probably get an error dialog that complains "Error: Unsupported Image format". This error alludes to other files in your folder that are in raw format. Just acknowledge these errors, and then you'll finally get to your plug-in to edit your photo.

Running Silver Efex Pro 2 via NX Studio

In the example above, the photo is being edited in Silver Efex Pro 2. After saving the results, you can edit it further when you return to NX Studio. If you're done editing, you can also save the photo in JPEG format.
Once again, be aware that the TIF file you created with NX Studio will be overwritten by the called plug-in when you click the plug-in's "Save" button.

Conclusion

The Nik plug-ins are more generally useful and flexible than most people think. They make a great combination with NX Studio, particularly since NX Studio is a bit limited in the editing feature set that it natively offers. Plug away.

  • Nikon D850 Individual Focus Sensor Actual Coverage Area

    When you look through your viewfinder, you could be forgiven for assuming that the little auto-focus sensor indicator squares on your screen are accurate. Think again.

Active focus sensor square

The little square(s) showing where your camera is focusing isn't even close to showing you how big the actual focus zone is. I'm going to show you how you can discover for yourself the real size of the focus sensor. This technique should work for most (Nikon) camera models, even though I'm demonstrating on the Nikon D850.

My focus-checking chart from the MTFMapper site

I use printed charts like the one shown above to check and calibrate my auto-focus. The chart is meant to be rotated by 45 degrees, and then my software can figure out where my lens actually focused versus where I pointed the focus sensor. In this case, I used this test chart to find out how accurately my D850 indicates the area of sensitivity of its focus points, compared to the little etched squares on the viewfinder's focus screen. This chart makes it easy for the camera to focus on the right-hand edge of the big square in the chart middle, so you don't have to guess where the camera focuses.

Before testing, I set the camera to AF-C mode and select "single-point AF". I want the camera to concentrate on only one focus sensor, so I can find the boundary of sensitivity of that sensor. I use a sturdy tripod, and I pan the camera away from the rectangle edge I have focused on until I lose focus.

To measure the actual size of the focus sensor area, I make use of the little viewfinder focus-confirmation "dot". When the camera sees proper focus, it displays a little dot while you're activating auto-focus. When the camera loses focus, it instead displays a couple of little flashing triangles.

Out-of-focus indicator

I use the transition from the in-focus "dot" to the flashing triangles to know when the camera has lost focus, while pressing the AF-ON button and moving the focus sensor away from the chart's big rectangle edge. When the flashing signal starts, it means that your focus sensor has fallen off of the target edge. This indicator is located along the bottom left-hand side of the viewfinder.

Aim the single focus point at the rectangle edge

As shown above, I begin by centering the focus point over the middle of the target rectangle edge. I slowly pan to the right, watching to see when the camera focus indicator shows that it has lost focus. This operation lets me find the sensor's left-edge boundary. I repeat the test using the rectangle's left edge to find the focus sensor's right-edge boundary while panning to the left. I mount the chart vertically and pan up/down to find the focus sensor boundaries in the vertical dimension.

I tried the test using other focus sensors, to see if they behaved the same; it appears that they behaved similarly. I only tested the "cross-type" focus sensors, since I was looking at both horizontal and vertical details. I'll mention that mirrorless camera focus sensors are usually NOT cross-type sensors, and they cannot detect pure horizontal details.

The real focus sensor boundaries

I drew a square in magenta to show the size and location of the focus sensor, as seen through the viewfinder. I have overlaid (in purple) the extent of actual focus sensitivity for the sensor that I measured in the vertical and horizontal directions. The boundary is almost exactly an entire 'square' away from the viewfinder sensor indicator area.
The total area of sensitivity maps out to a perfect circle, as near as I can tell; I used a round focus target and panned diagonally to estimate the extent of sensitivity in various directions. I drew a red circle around the estimated area of sensitivity for the focus sensor. Summary There are times when it’s good to have some elbow room around a sensor and still hold focus. Other times, it’s the last thing you want. In any case, it’s good to know what your focus sensor is actually seeing. I’m not trying to pick on the D850 here; it’s certainly not alone in exhibiting this sort of “dishonest” focus response. There have probably been times when you noticed some unusual focus behavior; this analysis may shed some light on why that happened.
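For a sense of scale, here is a minimal sketch of the geometry those measurements imply. The numbers are illustrative assumptions based on what I observed (roughly one ‘square’ of overshoot per side and a roughly circular sensitive region), not camera specifications.

import math

indicator_width = 1.0        # width of the etched focus-point square (one "square" unit)
overshoot_per_side = 1.0     # sensitivity extends roughly one full square beyond each edge

sensitive_diameter = indicator_width + 2 * overshoot_per_side   # about 3 "squares" across
sensitive_area = math.pi * (sensitive_diameter / 2) ** 2        # roughly 7.1 square-units
indicated_area = indicator_width ** 2                           # 1 square-unit

print(f"Sensitive area is roughly {sensitive_area / indicated_area:.1f}x "
      f"what the viewfinder square suggests")

Under those assumptions, the sensitive region covers roughly seven times the area of the little square you see in the viewfinder, which is why a busy background a full square away can still steal focus.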

  • Nikkor 18-140 f/3.5-5.6G ED VR Review

    This review will primarily detail the lens MTF50 resolution performance and how well the lens autofocuses. Other reviews already rehash the Nikon specifications of the lens, so I don’t intend to repeat all of that here. I don’t sweat minor lateral chromatic aberration, noticeable distortion, or vignetting issues; software fixes those. Software can’t create missing resolution, fix inaccurate focus, or eliminate focus chatter. I should mention that vignetting at 140mm is breathtaking, but luckily software removes it with virtually the click of a button. My usual disclaimer: this is a look at a single copy of the lens. Yours will be different, but hopefully ‘similar’. The only place I know of that tests lots of copies of lenses is here. These tests were done using a Nikon D7000 (16 MP) with unsharpened 14-bit compressed RAW format. Here is a link to get pretty good information on this lens. My main complaint with them (and most of the other web sites) is that they try to reduce resolution measurement down to a single number for an f/stop and focal length setting. It’s not that simple; resolution is a 2-dimensional thing (and also a sagittal versus meridional thing). Here’s another gripe: Nikon doesn’t even include a lens hood or lens pouch. That’s really insulting, considering the price of this lens. This lens telescopes out into three sections as you zoom. The tolerances are really tight, though, and it doesn’t have any “wiggle” in it. Everything feels surprisingly solid with this lens. Don’t drop it, though (as if ANY lens handles dropping well). Unlike most of their “kit” lenses, the focus ring on this lens is a proper one, nearest the camera. It’s sufficiently wide that I have no complaints about it. Just like their ‘pro’ lenses, you can use it anytime you want to override autofocus. The lens also has a proper metal lens mount and gasket (rubber seal). That’s it as far as dust/weather resistance is concerned. If you damage any Nikon lens with water or dust, including weather-sealed lenses, your warranty plus 5 bucks entitles you to a latte. Nikkor 18-140mm zoomed out to 140mm Autofocus Focus was a bit sluggish on my D7000 in dim light, and quite a bit faster on my D7100. Since this isn’t exactly a sports/wildlife lens, that’s a minor point. Focus is very repeatable, and that is what’s important. There was zero focus chatter, and that’s crucial (lenses that have focus chatter are useless, in my opinion). If you’re too far out of focus, you will probably need to give it a nudge in the right direction using the manual focus ring; this is really common for lenses. I may be pickier than many on this point, but I did notice a focus calibration shift at different focal lengths. It’s not huge (about 4 fine-tune units), but I’d have to say that it’s my least favorite thing about this lens. Stopping down about one stop will mask it. I really wish that Nikon would “invent” a docking station like Sigma has, so you could customize focus tuning at different focal lengths and distances; oh, well. I don’t have a single lens that doesn’t need some focus fine-tune calibration, and this lens is no different. If your camera doesn’t have fine-tune, you need to save up and buy one that does; refurbished cameras are pretty reasonable these days. Your lenses won’t be giving you what you paid for without fine-tune, unless you put up with Live View autofocus. Vibration Reduction (VR) This seems to be the most capable lens VR I have yet tried, at about 4 stops.
Everybody is different in how they support the camera while hand-holding it, so your mileage will vary here. I determine “sharp” versus “un-sharp” by photographing a resolution chart at slow shutter speeds and measuring where the resolution (MTF50 lp/mm) drops by 10% from maximum. I don’t know if there is an industry standard for VR effectiveness, but what counts for me is when pictures just start to show some blur, and I like to do it by the numbers. I haven’t figured out how to calibrate my level of nervousness with hand-holding, so this VR business is literally “hand waving”. Oh, also, I test at the longest focal length. Resolution Testing Here’s where I get to rant about those lens reviewers that grade lens resolution with adjectives like “good”, “fair”, and “excellent”. What the heck does that mean? How about 2 ‘blur’ units??? Really? I want real numbers and I want to see real pictures of things I’d actually bother to photograph. I use a (free!) program called MTF Mapper from here to measure lens resolution. The download site also has files for printing out the resolution targets (mine are A0 size on heavy glossy paper (‘satin’ finish seems to work just as well), dry-mounted onto a board). This program is covered in more detail in another article, but suffice it to say that this is really great stuff; it’s comparable to ‘Imatest’ in the quality of its MTF measurements, and it uses similar “slanted edge” technology. The author of MTF Mapper, Frans van den Bergh, really knows his stuff. Visit his site and give him the praise he deserves. The chart design used for resolution tests orients all of the little black squares to be ‘slanted’, but they’re generally aligned in meridional and sagittal (think spokes on a wheel) directions to correlate better with the usual MTF plots you’re familiar with. There’s often a dramatic difference in sharpness between these two directions, and the chart photographs show it clearly. The meridional/sagittal differences are what “astigmatism” is all about. This lens is decent in the sagittal direction when you get away from the lens optical center, and honestly really bad in the meridional direction. Corners are decent from 18-50mm, but then performance goes south (astigmatism galore). But the middle of the lens, at all focal lengths, is really, really good. In fact, it mostly makes up for the corner performance. What the resolution target looks like. Mine is mounted ‘upside down’. At long last, I’m getting around to some actual resolution results. Tests were done with “Live View” AF-S autofocus, contrast detect, IR remote, VR OFF, and a really big tripod. That’s how I get around any phase-detect problems with focus calibration. The results don’t seem to improve using manual focus and 100% magnification in Live View, so I don’t bother. I use the “best of 10 shots”; not every shot gets the same resolution results. All cameras operate on the “close enough” principle for focus, so many tests are needed to determine the best resolution that the lens can produce. 140mm f/5.6 APS-C corner: note that sagittal is MUCH better than meridional. 18mm center wide open is quite amazing, and even the corners are pretty good. I measured 38mm actual focal length at the 35mm mark on the lens; there’s a big resolution dip from 18mm. 50mm resolution is about the same as 35mm. 70mm wide open: meridional-direction corners aren't looking too good. 140mm wide open: again, weak corners but great in the center. Samples 18mm. You really notice the barrel distortion. 18mm with software lens distortion fix.
Voila, you can ignore distortion (I used Capture NX2). Southern Cal. Christmas tree at 140mm, near closest focus (1.55 feet versus the 1.48-foot closest-focus spec), 1/160, f/7.1, VR ON. #review
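As a rough illustration of the VR stop-counting described in the Vibration Reduction section above, here is a minimal sketch. The MTF50 numbers are made-up placeholders standing in for MTF Mapper results from a series of hand-held chart shots; the 10% drop threshold is the one I use.

import math

def slowest_sharp_speed(mtf50_by_speed, drop_fraction=0.10):
    """Slowest shutter time (seconds) whose MTF50 stays within drop_fraction of the best."""
    best = max(mtf50_by_speed.values())
    sharp = [t for t, mtf in mtf50_by_speed.items() if mtf >= (1 - drop_fraction) * best]
    return max(sharp)

# Hypothetical hand-held results: shutter time (seconds) -> measured MTF50 (lp/mm)
vr_off = {1/250: 40, 1/125: 39, 1/60: 33, 1/30: 25}
vr_on  = {1/250: 40, 1/125: 40, 1/60: 39, 1/30: 38, 1/15: 37, 1/8: 36.5, 1/4: 28}

stops = math.log2(slowest_sharp_speed(vr_on) / slowest_sharp_speed(vr_off))
print(f"VR benefit: about {stops:.1f} stops")

With these placeholder numbers the sharp threshold moves from 1/125 to 1/8 of a second, which works out to about 4 stops; plug in your own measurements and the same arithmetic applies.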

  • Nikon DSLR Camera Focus Speed Comparisons

    I thought it would be fun to see how Nikon has progressed with improving focus speed over the years. I picked cameras starting with the D50, introduced way back in June 2005, and tested a representative sample of their DSLRs up through the D850. For the tests shown in this article, I chose my Sigma 70-200 f/2.8 Sport lens, since it’s about as fast at focusing as anything that I’ve got. My focus speed timing tests are usually done in bright sunlight, but for this series of tests I chose to try heavy overcast conditions. It turns out that focus speed didn’t seem affected by this lower light level (EV 14 to be exact), which is good news. I bet you think that the D50 probably has Alzheimer’s disease by now, and likely has forgotten how to autofocus. You’d be very, very wrong on that point. You’d be right, on the other hand, to think that its sensor is total crap compared to anything modern. For all testing, I chose the center focus point, since it’s always the most accurate and sensitive. I used slow-motion video from my D850 (120 fps) to monitor and time the focus action, except when testing the D850 itself, where I used the D500 with its 60 fps video. I start with the lens set at minimum focus (about 4 feet), and then I have it focus on a distant high-contrast target to force the lens to focus at infinity. The focus target here is “easy”; I didn’t try anything tricky to focus on. I used the Sigma USB dock to program the focus algorithm of my 70-200 f/2.8 Sport lens. I selected the non-default algorithm called “High Speed AF” (or “Fast AF Priority”), which is the fastest of its three focus algorithms. It has proven to be totally accurate and reliable, so it makes zero sense to me to use a slower focus algorithm, such as the default “Standard AF”. That’s right: that's a Nikon D50 behind that 70-200 lens. Sigma USB Dock: program lens focus speed behavior The Sigma Optimization Pro program that is used with the USB Dock to program any of their modern (Global Vision) lenses is shown above. Besides focus speed behavior, it also lets you customize focus fine-tune calibration at various focal lengths and distances, as well as the optical stabilization characteristics. Companies like Nikon and Canon don’t have anything as sophisticated as this for their lenses. This series of speed tests produced some real surprises, both good and bad. First, I’ll show a table of results, followed by an analysis of the character of the focus behavior. Camera comparisons If you study the tabulated results shown above, something should stand out to you: the Nikon D50 performed very, very well. In fact, the D50 beat the D7000, D7100, and D610 cameras. How is this possible? I believe the answer lies in the camera’s “focus algorithm” and not in the camera’s raw focus horsepower. Detailed Focusing Behavior Analysis For each test, I carefully studied, frame by frame, how the camera focused the lens in the slow-motion video. The key to fast focus has a lot more to do with getting straight to the distance goal than with raw speed. Again, each test started at the lens minimum distance, and I had the lens focus on a distant target requiring an infinity setting. A smart focus algorithm won’t cover the same ground more than once; a dumb focus algorithm will move forward, then backward, then forward again to get to its goal (or the reverse for a near target).
To nobody’s surprise, the Nikon D500 and D850 focus algorithms are “smart”; they move straight to the focus goal, and they do so at considerable speed, basically skidding to a stop at infinity. These cameras don’t suffer from getting distracted or being lazy; they’re all business when it comes to the job at hand. Their results are quite close, probably within experimental error of being a tie; they finished within 1/100 second of each other and won the contest. I used the slower D500 video at 60 fps to analyze the D850, but for the other tests I used the D850 video at 120 fps. The focus hardware in these cameras is equivalent to the D5 hardware, so it should behave similarly. The D50, surprisingly, is also quite smart; it went straight to the focus goal, although more sluggishly than the D500 or D850. As the lens got close to infinity, it just coasted at a slow pace to reach the final focus position with no overshoot. The D610 could move the focus at a reasonably quick pace, but it stopped at 4 meters, back-tracked to 3.5 meters, and then went forward to stop at infinity. Had the camera not back-tracked, it would have had a fast focus time. The D7000 focus looked very confused. It focused to 10 meters, back-tracked to 5 meters, went forward to 20 meters and stopped, then moved forward beyond infinity, and finally moved backward to get to the correct infinity setting. This extra motion caused the D7000 to finish in an embarrassing last place in the competition, behind the lowly D50. The D7100 focus moved to 4 meters, back-tracked to 3.8 meters, and then went straight from there to the correct infinity setting. The individual focus motions were quite fast, but the stutter maneuver cost it precious time. I always had the “feeling” that this camera was faster than my D610, and this test proves it. Summary Contrary to the expected results, camera age and processor speed aren’t reliable predictors of how fast a camera can focus a lens. The intelligence of the focus algorithm built into the camera firmware can make a huge difference in focus speed. I remember when Sigma produced a firmware update for my 150-600 zoom (programmed with their USB Dock): the focus became both faster and more consistent. This indicates that firmware in both the camera and the lens factors into focus behavior. Life’s complicated. Different lenses may produce different focus behaviors, which is why I decided to use a single lens in all of my testing. I chose a very fast lens (with a fast focus motor), so that it wouldn’t be the limiting factor in determining how fast a camera can focus. There are of course other factors involved in focus, such as low-light performance, the number of cross-type sensors, frame coverage, and the sheer number of sensors. That kind of information is available on countless sites on the internet. What you won’t find is information comparing camera focus algorithm cleverness, or the lack thereof. Someday, camera manufacturers will probably get around to using artificial intelligence. I think one of the big beneficiaries of this technology will be smart focus algorithms combined with smart subject tracking. That will be a great day. It’s very illuminating to study focus behavior using video; your own eyes are just too slow to perceive what’s really going on. These test videos reminded me of Aesop’s The Tortoise and the Hare. Slow and steady can win the race, or at least get you the bronze medal.
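If you want to repeat this kind of test, the timing arithmetic is simple enough to write down. Here is a minimal sketch; the frame counts are made-up examples, since the real counts come from stepping through your own slow-motion footage.

def focus_time(frame_count, video_fps):
    """Elapsed time implied by a count of slow-motion video frames."""
    return frame_count / video_fps

# Hypothetical example: the focus action spans 42 frames of 120 fps D850 footage
print(f"{focus_time(42, 120):.3f} s")   # 0.350 s

# The D850 itself was filmed at 60 fps with the D500, so each frame counts double
print(f"{focus_time(42, 60):.3f} s")    # 0.700 s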

  • Measure Camera Mirror Blackout Time Yourself

    Camera mirror blackout is what happens in a DSLR when the mirror pivots out of the way so the shutter can open and take the shot. When you’re looking through the viewfinder and take a shot, there’s a brief time when you can’t see anything until the mirror pivots back to its “down” position. Most of the time, you simply ignore mirror blackout, because it’s really brief. It’s a whole different story if you’re taking a series of shots as fast as your camera allows (continuous shooting). During high-speed shooting, such as 10 frames per second, you really need to be able to track the action and follow your moving subject. Also, your camera needs to see the subject in order to update the focus distance as the subject moves. If you’re curious about how much time is spent flipping the mirror out of the way so that the camera can take the shot, a great way to measure it is with video. Most smart phones and many cameras are capable of taking super slow-motion video at high frame rates. I’m going to show you how I used my Nikon D850 to track mirror activity at 120 frames per second (24p * 5 = 120). As an additional benefit of this test, you will be able to measure the actual frames per second your camera is able to shoot, versus what the manufacturer claims it can do. D850 filming a D500 In the shot above, you can see a D850 (with a 105mm Micro Nikkor) looking through the viewfinder of a D500. The D850’s 105mm lens is focused at infinity, and is seeing what the D500 photographer would see. The lens has to be pretty close to the viewfinder to see very much of the frame. Video image of the viewfinder The D850 video, being replayed via its LCD screen, is shown above. The camera multi-selector right-arrow is used to single-step through the video frames to observe when the mirror blocks the viewfinder image during high-speed shooting. With the setup I used, I couldn’t see the entire frame, but that isn’t necessary to conduct a test like this. I attached a 10-pin remote release to the D500 to make it easier to shoot a sequence at 10 frames per second, while also starting and stopping the video recording on the D850. After taking the video footage, I just replayed it in the D850. The time duration of each frame of video is 1/120 second, or 0.0083 seconds. The results were pretty consistent for each D500 shot that I reviewed. What follows is a summary of the D500 shooting chronology video. Frame 1: partial image visibility at the top of the viewfinder. Frames 2-7: total image blackout. Frame 8: partial image visibility at the bottom of the viewfinder. Frames 9-12: full image visible. The 12-frame sequence above repeated itself throughout the video, with minor variations in how much of each “partial frame” was visible. I consider the partial frames to be the same as “blackout”. Adding up the frames, it turns out that the blackout time versus visible time is in a proportion of 8 to 4 (0.0667 seconds blacked out and 0.0333 seconds visible). In other words, the viewfinder is blacked out two-thirds of the time while shooting at 10 frames per second. As a reality check, 0.0667 + 0.0333 = 0.1 seconds per shooting cycle; each shot then takes 0.1 seconds, which is indeed 10 frames per second. Nikon is telling the truth about the D500 frame rate, after all. A partially flipped mirror view You’d swear, while looking through the viewfinder shooting 10 frames per second, that you ‘mostly’ see the subject and that the blackout time is only a small fraction of the cycle. Exactly the opposite is true.
I think that your brain tries to stitch each isolated view together into a continuous image, just like when you watch a movie. Summary Super-slow-motion video can be a great tool for viewing things that are simply too fast for human perception. I showed in a previous article how lens auto-focus speed can be studied and measured in detail. If you’re interested in comparing cameras, don’t overlook video as a great measurement tool. It’s rather amazing that the auto-focus system, which only gets to see the subject one-third of the time, is able to keep focus on a moving subject while shooting at ten frames per second. This may sound like a problem that’s been solved by mirrorless cameras, which by definition can’t have any mirror blackout time. Most mirrorless cameras, however, just trade this problem for another, potentially more serious one. Their electronic viewfinders have finite refresh rates, and many of them are slow enough that you can’t see where your fast-moving subject really is. Instead, you only get to see where the subject “used to be”. Whoops.
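To make the frame-counting arithmetic above explicit, here is a minimal sketch that reproduces the numbers from the D500 test. The frame counts are the ones observed while stepping through the footage; everything else follows from them.

video_fps = 120
frame_time = 1 / video_fps                      # 0.00833 s per video frame

blackout_frames = 8                             # frames 1-8, counting the partial frames as blackout
visible_frames = 4                              # frames 9-12

blackout_time = blackout_frames * frame_time    # about 0.0667 s
visible_time = visible_frames * frame_time      # about 0.0333 s
cycle_time = blackout_time + visible_time       # 0.1 s per shot

print(f"Blackout fraction: {blackout_time / cycle_time:.0%}")   # 67%
print(f"Effective frame rate: {1 / cycle_time:.0f} fps")        # 10 fps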
