
  • SnapBridge and D500 Remote Control

    There’s a lot of web discussion about using SnapBridge with the D500, mostly centered on being able to make it work or not. My own interest in SnapBridge is related to why I might want to use it at all. I’m probably in the minority here, but it’s more useful to me to be able to remotely trigger the shutter than to be able to transfer photos to my phone. The Nikon “pro” cameras have the infrared-trigger feature conveniently removed. I guess it’s convenient to somebody; I suspect that somebody is Nikon only. So the cheap and easy IR shutter control (via the little ML-L3 remote or via a little app running on your smartphone) is out.

Let’s review some ways you can remotely trigger the shutter on your D500 (and most other “pro” models). The most versatile (and therefore the most expensive) way to remotely trigger your camera is the Nikon WR-1 wireless remote. It has a range of 394 feet, has a zillion options, and costs about 470 bucks. Ouch. But if you’ve got your camera at the feet of horses finishing the Kentucky Derby while you’re in the stands, then this is the ticket. If you can get within about 66 feet of your camera, then the Nikon WR-R10/WR-A10/WR-T10 can trigger your shutter and also control your flash without wires. The cost is around 200 bucks. If you can get within about 3 feet of your camera, then you can use the Nikon MC-30A 10-pin wired remote, at a price of about 65 bucks. Or, like me, you can get a cheap 10-pin wired remote (I got the Vello RS-N1II) for about 8 bucks. Now we’re talking. Plus, you can control it with just one hand, and you don’t even have to be looking at it.

Enter SnapBridge. So, how can SnapBridge help me to take photos remotely? After you download SnapBridge to your phone (mine is a Samsung Galaxy S6 running Android 6.0.1) and then connect via wireless, you can select the option for “Remote Photography”. I'd suggest you visit the Nikon website to watch the SnapBridge video. Now, via SnapBridge, you can not only trigger your camera, but you can get Live View right on your phone’s screen. You can’t (as of this writing) alter exposure settings, but at least you can see what settings your camera is using and you can also see the battery level. You’ll notice that your phone’s “live view” will give you the impression that it has just consumed several espressos. It has the jitters. Note that you may want to use SnapBridge wireless in limited doses, because once your camera is out of “Airplane Mode”, it will cause pretty heavy battery drain. At least you can monitor that drain from your phone. The owner’s manual says you can use the D500 wireless from “approximately” 10 meters. I got out a tape measure, and successfully controlled my camera from 30 feet. Not that I’m a skeptic or anything. I wouldn’t suggest you try using a drone with this, but seeing through your camera and shooting from 30 feet away could open up some pretty creative possibilities.

SnapBridge in “Remote Photography” mode. Note above how you can monitor shutter, f-stop, shots remaining, and battery level while shooting. The shots you take will show as thumbnails below the camera settings. The big white circle is the shutter button.

What the camera sees while being controlled by SnapBridge

View the SnapBridge training video at the Nikon web site

My own preference for most remote-release scenarios. Cheap and reliable 10-pin connection.

I would imagine that SnapBridge will get some enhancements in the future. Let’s hope it will someday enable you to alter exposure settings from your phone.

By the way, if you're more interested in transferring photos to your phone and skipping the remote control, you can use Bluetooth. It's slower than wireless, but uses a tiny fraction of the battery power to operate. Don't forget to activate your phone GPS to embed the location data in the pictures. #review

  • How Bright Is Your Camera Viewfinder?

    I have read for many years in marketing literature about how some camera’s viewfinder is “bright”. What does that mean? How would you know if your camera viewfinder is bright or dim? I prefer numbers to hand-waving. After pondering the issue for a while, it occurred to me that most of us possess an instrument to readily figure out what bright means. Our phones have a built-in camera, and photos from those phones contain EXIF data. EXIF data can be inspected for the “brightness”, which is called “Light Value” (EV). Recall that 1 EV difference equals 1 stop of light. I use the free program called “exiftool” to inspect the EXIF data. A link where I give more details on this program is here.

Taking a look at the EXIF data from a phone photo, it’s packed with useful information. Note that this Samsung Galaxy S6 phone has a 4.3mm, f/1.9 lens. Because of its teeny sensor, that’s the equivalent of a 28mm lens with a 65.5 degree field of view. Also note that the EXIF data shows a “Light Value” (8.4 shown above).

Why use your phone camera? Because it has a huge depth of focus and the lens fits neatly within your camera viewfinder while blocking external light. If you use the same lens, aperture, and lighting conditions on each camera, then you can take a picture using your phone through different camera viewfinders and compare them for brightness, via the “Light Value” in the EXIF data.

In the comparisons below, I took a look at the Nikon D610, D7000, and D500 camera viewfinders. I expected the D610 viewfinder to be the biggest and brightest, since it has a full-frame sensor. I was wrong. The D500 is better. Please forgive the poor exposure in the phone photos below. The phone's meter isn't as smart as the Nikon's, and the large black expanse fooled it. I'm only interested in brightness (EV) and size in the frame, so the exposure technique just needs to be consistent for each camera viewfinder.

Nikon D610 Viewfinder. EV 8.4, 35mm
Nikon D7000 Viewfinder. EV 8.1, 35mm
Nikon D500 Viewfinder. EV 8.8, 35mm
Cropped view of D610 viewfinder
Cropped view of D7000 viewfinder
Cropped view of D500 viewfinder

Viewfinder Comparisons
The D610 viewfinder is EV 8.4 and the view width is 1896 pixels in the photo. The D7000 viewfinder is EV 8.1 and the view width is 1694 pixels in the photo. The D500 viewfinder is EV 8.8 and the view width is 1930 pixels in the photo. It struck me that the D500 viewfinder looked bright and large, but I didn't know if it was a psychological effect or real. Now I know it's real. I was surprised to discover that it's even larger and brighter than my full-frame D610 viewfinder. Try this test yourself; it's an easy way to compare camera viewfinder brightness, magnification, and even focus sensor sizes and focus sensor coverage. #howto
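For anyone who wants to see where the “Light Value” number comes from, here is a minimal sketch that mirrors exiftool's usual composite definition (exposure normalized to ISO 100, so one unit equals one stop); the sample aperture, shutter, and ISO values are hypothetical, not taken from my test shots.

    import math

    def light_value(f_number, shutter_s, iso):
        # Exposure normalized to ISO 100; one unit = one stop.
        return (2 * math.log2(f_number)
                - math.log2(shutter_s)
                - math.log2(iso / 100.0))

    # Hypothetical phone exposure made through a viewfinder:
    print(round(light_value(1.9, 1/30, 200), 1))   # -> 5.8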

  • Infrared Photography and the Nikon D500

    How does the D500 stack up shooting infrared? The last pair of Nikon cameras I evaluated shooting infrared were big disappointments: the D7100 and the D610. They had terrible internal reflections that would ruin most infrared shots, requiring use of the DK-5 viewfinder eyepiece blocker to avoid the problem. A link to those results (and the good D7000 results) is here. The D500, on the other hand, is excellent shooting infrared. Ironically, this camera has a built-in eyepiece shutter, although it is probably only needed to help with exposure measurement accuracy on a tripod. I use a Hoya R72 filter for infrared shooting.

Something I consider to be very ironic: probably one of my best lenses for shooting infrared is the lowly 18-55 kit zoom (I’m using a VR model). Most other lenses I have tried that shoot wide angle either have a nasty hot spot in the middle or aren’t as sharp. I pretty much park the lens at 18mm; if it went wider, then I’d zoom wider. Here’s the best site I know for evaluating which lenses work for infrared, and also for the f-stop range that works. Even my 24-70mm f/2.8 VR 'E' zoom is crap for infrared. As a little side-note, many lenses perform fine with infrared until you stop them down too far. Up through f/8, they may be fine, but then start to exhibit the central hot spot. My Nikkor 50mm f/1.8D, for instance, is okay until about f/8.

Nikon seems to try pretty hard to keep people from shooting infrared. The D500 image sensor filter screens out IR very effectively, and exposure times are pretty outrageous. This totally precludes using a direct-measure, pre-set manual white balance (which by the way works perfectly fine on older cameras like the D50 and D60). My open-shade shots are made in the vicinity of ISO 100, f/8, 30 seconds. When you add in long exposure noise reduction, you literally need a lot of time on your hands to do this kind of photography. I like to manually set a gray point by selecting a rectangular picture area (in Nikon Capture NX2, it’s called a ‘marquee sample’) followed by severe hue-shifting. When I use something like Photoshop, then I typically perform a red-blue channel swap instead of the hue-shift. There are plenty of web articles on this topic. Enough talk. Let’s see some actual D500 results. #review
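To put “outrageous” into numbers, here is a rough sketch of how many stops the R72 filter plus the D500's IR-blocking sensor stack cost in that open-shade example. The EV 12 figure for open shade at ISO 100 is an assumption on my part, a commonly quoted ballpark rather than a measured value.

    import math

    def exposure_value(f_number, shutter_s):
        # EV = log2(N^2 / t) at ISO 100
        return math.log2(f_number**2 / shutter_s)

    ir_ev = exposure_value(8, 30)          # ISO 100, f/8, 30 s -> about EV 1
    assumed_open_shade_ev = 12             # assumed typical open-shade EV at ISO 100
    print(round(ir_ev, 1), "EV;", round(assumed_open_shade_ev - ir_ev), "stops lost")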

  • Does the D500 Automatic Focus Fine-Tune Calibration Work?

    Short answer: yes. But you know there’s a catch. There’s always a catch. So here it is: garbage in, garbage out. The ‘secret’ to getting great results for focus calibration is multi-faceted. You must use the proper focus target, your camera must be very stable, you need consistent technique, you need to decide on a zoom setting, you need to pick a target distance, you may need to pick an aperture, and you must use good lighting. As much as people hate to hear this, it’s a fact of life that measurement and calibration always involve statistics. Your gear isn’t perfect, so you are guaranteed to get some variability in measurement. Focus calibration isn’t a “one and done” scenario. Many measurements are required, and then you have to find the average value. You might also have to throw out the measurement values of outliers.

The Nikon D500 “automatic” focus fine-tune calibration assumes that your live-view, contrast-detect focus is dead-on. It compares the distance your phase-detect focus system decided to focus on a target to the distance that the contrast-detect focus system decided to focus on that same target, and then calculates the fine-tune value that would shift the lens phase-detect focus to match the contrast focus. Depending on which direction (near-to-far versus far-to-near) your lens travelled to obtain focus, you may get a different calibration answer. Depending on what your selected focus point ‘sees’ as being the subject, you may get a different answer. Depending on your selected lens aperture (mostly high-speed lenses), the answer may vary. Depending on the zoomed focal length, you can get a different answer (think parfocal optics). Depending on the target distance, you can get a different answer. In dim light, you can get a different answer. The physical lens focus mechanics have some slop. So don’t expect miracles here. You need to decide on your favorite zoomed focal length, distance, and aperture to use while conducting the test (depending on the type of lens being measured, of course). You need to use a proper focus target that has absolutely no ambiguity about what the “target” really is, from the standpoint of your focus sensor.

Sigma is savvy to these ugly facts of optics life, and provides their newer lenses with the ability to calibrate zooms at multiple focal lengths and multiple focus distances. Their (inexpensive) USB dock and free software let users reprogram the lens firmware with this calibration information, in addition to letting users select focus algorithms, upgrade firmware, select anti-vibration modes, and other features. Nikon is not savvy, but I digress.

I use a specific focus target, rotated to a 45-degree angle, to calibrate focus. The target design and its associated analysis software were created by Frans van den Bergh. I wrote an article to explain his software and focus target here. The recipe to perform a single calibration measurement on a D500 goes as follows:
⦁ Lens VR OFF
⦁ Focus mode set to AF-S
⦁ Select the center focus point
⦁ Camera/lens on sturdy support pointing at the focus target
⦁ AF fine-tune ON in setup (wrench) menu
⦁ Live View mode
⦁ Normal-area AF
⦁ Focus on the target. Focus point centered on an edge illuminated by bright light.
⦁ Press the “AF Mode” and “Movie Record” buttons simultaneously and wait patiently for about 3 seconds. Don’t wiggle the camera…
⦁ Highlight “Yes” and press “OK” when prompted. The calibration value will be written for the lens in the usual AF fine-tune menu location.
Rinse and repeat.
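Each pass through that recipe yields a single fine-tune reading, and (as described below) you will want to average several readings after discarding any wild ones. A minimal sketch of that bookkeeping, with made-up readings:

    from statistics import mean, stdev

    readings = [-4, -3, -5, -4, 2, -4, -3]   # hypothetical fine-tune values from repeated runs

    m, s = mean(readings), stdev(readings)
    kept = [r for r in readings if abs(r - m) <= 2 * s]   # drop wild readings beyond 2 sigma
    print(round(mean(kept)))                 # value to enter in the AF fine-tune menu -> -4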
The focus target using the MTF Mapper program, showing measurement labels on squares

Note in the photo above that the camera focus point is centered on the right-hand vertical edge of the large black target. There is no ambiguity about what it is focused on; the focus point can only see a single vertical high-contrast edge. The MTF Mapper program measures the resolution of every black square it finds, and lets the user easily see where the optimal in-focus squares are located relative to the large black central target right-hand vertical edge. Using this software, you can visually see where the sharpest edges are located with each test you make (the edge labels eliminate any judgement calls you’d have to make in an un-labelled photo).

After each AF fine-tune measurement is done, write down the value saved in the AF fine-tune menu for the lens being used. Take several measurements and average their values. You may have to throw out any wild readings first. These readings will give you a feel for the natural focus variation of the lens. I like to manually change the focus to alternate between near and far for each test, prior to initiating AF-S auto-focus. This will let you explore any bias that the lens has for focus direction. Manually enter the averaged fine-tune value for your lens in the AF fine-tune menu. This will give you the best “typical” focus result.

An MTF Mapper “profile” plot of the focus chart photo

Note “AF Tune Value” is displayed in the plot above. The D500 focus calibration feature automatically saves this value in your camera. The MTF Mapper program extracts the value from the focus chart photo EXIF data to add it to the plot.

Conclusion
I used this automatic fine-tune calibration feature dozens of times, and the fine-tune value tracked the actual phase-detect focus error quite well (within about 1 or 2 counts). The problem is that it can only calibrate against a moving target. Your own lens natural focus variation will prevent you from ever getting “the” calibration answer. In my own testing, I would get an auto-calibration tune-value variation of about plus/minus 3 for a typical lens. This is roughly the same as my own manual calibration best efforts; it would get within about 1 or 2 counts of what the measurement software determined to be "best". I have read reports that some users get terrible calibration repeatability, but I suspect that may be largely due to using a poor focus target and/or sloppy technique. I don’t think that phase-detect can ever compete with Live View focus accuracy across the board, because it cannot address issues related to focus-shift-with-zooming, focus-shift-with-distance, or spherical aberration effects. Nonetheless, I applaud Nikon for adding this feature. Now if they could match Sigma lens firmware capabilities, they’d really be leader of the pack. By the way, now even the Samyang (Rokinon) company has added the same firmware-programming features that Sigma offers. Oh, and one thing I'd definitely change with their auto-calibration is the two-button-press thing. There's no way to keep things rock-steady doing that. They need to make it an operation you can do using a remote trigger or timer so that you don't have to jiggle the camera. #review

  • Do Long Lenses Not Like Filters?

    I was trying to solve a mystery about why my trusty Sigma 150-600mm seemed to lose its ability to resolve fine details. I recently got a D500 and was using the Sigma while testing the camera electronic front-curtain shutter feature (to totally eliminate vibrations). I noticed the resolution measurements (using the MTF Mapper program) were much lower than expected. Had I somehow bumped the lens and knocked the optics out of alignment? Was the camera focus system not functioning properly? Were the phase-detect and contrast-detect focus systems both out of whack? Was my vibration reduction accidentally left on while using a tripod? Am I getting sloppy and don’t even know it? Other lenses weren’t showing any resolving problems on this camera, but that only helped to show what the problem was not.

Out of desperation, I removed the 95mm Hasselblad UV filter from the Sigma. Like magic, the resolution numbers were back to where I expected them to be (more than 20% higher). Are you kidding me? My premium Hasselblad filter is no good? I don’t have any other lenses that use 95mm filters, and I don’t have any step-up rings that big, either. I finally figured out how I could use my 24-70mm lens with its lens hood to hold the filter via friction-fit inside the hood. I took shots of my resolution target with this filter in place (at 70mm), and then removed it to repeat the shots with no filter. The MTF Mapper program showed zero resolution differences either with or without the Hasselblad filter! How is this possible? How can the Hasselblad show perfection on this lens and wreck the resolution on the Sigma?

I don’t have any other long lenses to experiment with. I don’t think most users of really big glass use filters at all, except for the kind that fit in the drop-in filter holders near the rear of the lens. Maybe there’s a good reason they don’t use front-mounted filters, aside from the big cost and added weight. Let’s take a look at some resolution test measurements. The following plots compare shots taken with and without the Hasselblad 95mm filter.

Sigma 150-600mm at 600mm with 95mm Hasselblad UV filter. Bad resolution!
Sigma 150-600mm at 600mm without UV filter. Much better resolution.

Note in the above plots that the peak resolution with the filter in place has an MTF50 of about 28 lp/mm. The plot of the photo taken moments later under the same conditions, but without a filter, has an MTF50 peak resolution of about 36 lp/mm. That’s a difference of about 22% (8 of 36 lp/mm). I repeated this same test a dozen times, with the same results. This would lead most people to conclude that the filter is absolutely terrible. But I'm stubborn. I investigated further. Next, I’ll show you some resolution test results using my Nikkor 24-70mm f/2.8 VR lens.

24-70 at 70mm f/2.8 with Hasselblad 95mm UV filter
24-70 at 70mm f/2.8 without filter
24-70 at 70mm f/2.8 without any filter
24-70 at 70mm f/2.8 with Hasselblad 95mm UV filter

Maybe you can tell a difference between the measurement results with or without the filter, but I sure can’t. Based upon the tests with the 24-70 lens, I would have to conclude that there’s nothing at all wrong with the Hasselblad filter. I was taught that even a medium-quality filter would have a negligible effect on lens resolution, although it might increase light reflections or decrease light transmission. This is a whole different ball game. I now have a strong suspicion that even top-quality front-mounted filters on really long lenses have a big negative impact on resolution.
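Since percentages can be counted two ways, here is the arithmetic behind the two figures quoted above, using the measured 28 and 36 lp/mm peaks:

    with_filter, without_filter = 28.0, 36.0   # peak MTF50 in lp/mm

    loss = (without_filter - with_filter) / without_filter   # the filter costs ~22% of the resolution
    gain = (without_filter - with_filter) / with_filter      # removing it buys back ~29% ("more than 20% higher")
    print(f"{loss:.0%} lost with the filter, {gain:.0%} gained without it")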
Take my experience with this filter/lens combination as a heads-up. I mainly use UV filters as cheap insurance against accidents and because they’re easier to clean than the front lens element. From now on, though, I’m just going to rely on the lens hood to protect this Sigma lens. If I ever get a short focal length lens with a 95mm filter thread, I wouldn’t hesitate to use the Hasselblad filter on it. For long lenses, though, beware of filters over the front element. At least do some careful testing before you decide to permanently park a filter in front of your big lens. I can’t help but wonder if there are a lot of people out there that are disappointed in their long lens and don’t know that the blame lies in their filter. #review

  • Focus-Stacking: Camera Hardware Suggestions

    In a previous article, I discussed software to accomplish focus stacking. I glossed over the hardware that lets you get real quality results. I’m going to try to rectify that shortcoming in this article. I’m assuming you’re interested in macro focus-stacking. For those that are unaware of what focus-stacking is, it involves taking multiple photographs that you combine to get greater depth of focus. Macro photography is famous for suffering from paper-thin depth of focus. Optical physics is against you here, but software and digital photography come to the rescue.

What you need to accomplish macro focus-stacking is flexible close-up gear. If you check on web sites such as eBay, and you shoot Nikon, you should be able to locate a bellows setup like mine, the PB-4. I also use rings that let me reverse the lens and attach a filter to the reversed lens. Nikon stopped making bellows hardware decades ago, because the customer base was just too small to make it worth it to them. Some camera bodies may not allow attachment to the bellows; it depends on how much overhang the “pentaprism” portion of the camera has. My D610, for instance, barely fits, but the D500 fits fine. Battery grips, however, don’t allow you to connect or properly use the bellows. You may need to set the bellows into ‘vertical-shooting’ format to be able to mount the camera body; crank the camera-mount portion of the bellows to the rear-most position to mount the camera.

I still use the vintage 55mm Micro-Nikkor f/3.5 lens, circa 1974. Think this old lens couldn’t cut it today? Think again. With focus-stacking, you should always set the sharpest lens aperture (f/8 for the Micro-Nikkor). Remember, stacking will take care of depth of focus, so you don’t need to worry about stopping the aperture down to get sufficient depth of focus. When you get into magnifications greater than life-size, you should reverse the lens to get the best optical results. I use the BR-2 lens reverse ring (52mm). Any other macro lens I have access to doesn’t have the 52mm filter thread, so lens-reversing isn’t an option.

Wind and vibration are the enemy, so you will get best results in dead-calm conditions (such as indoors). I made a custom piece of hardware that fits into the end of my bellows unit; it has an alligator clip to hold small objects at whatever height and rotation I need. This clip hardware is connected to the bellows, so image motion relative to the bellows is virtually eliminated (it would move the same as the camera). The clip can also hold a little platform in front of the lens, allowing you to lay small subjects that aren’t “clip-able” onto the platform.

Lighting is crucial. I often use an LED ring light, which stays cool and provides perfectly even lighting. When my lens is reversed on the bellows, I attach a BR-3 (52mm filter thread) ring to the rear of the lens. I can slip the ring light over the BR-3. The light won’t fit larger diameter lenses. Continuous lighting is really, really nice to see your subject well. I use a remote or wired release, and I set the camera up in the mode to make shooting a two-step procedure: the first release flips up the mirror and the second release triggers the shutter. Electronic front curtain shutter mode is ideal, if your camera supports it. I rotate the focusing knob on the bellows to shift the camera/lens combination toward the subject in increments of about 0.1mm for higher image magnifications. I’ll typically take about 50 shots to stack.
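If you want a starting point for the step size and shot count instead of pure trial and error, the common macro depth-of-field approximation DOF ≈ 2·c·N·(m+1)/m² gets you in the ballpark. This is only a sketch; the circle of confusion, overlap fraction, and subject depth below are assumptions, and N is taken as the marked f-number rather than the effective aperture.

    def stack_plan(magnification, f_number, subject_depth_mm,
                   coc_mm=0.015, overlap=0.3):
        # Per-frame depth of field from the macro approximation above
        dof = 2 * coc_mm * f_number * (magnification + 1) / magnification**2
        step = dof * (1 - overlap)            # keep ~30% overlap between slices
        shots = int(subject_depth_mm / step) + 1
        return dof, step, shots

    dof, step, shots = stack_plan(magnification=2.0, f_number=8, subject_depth_mm=8.0)
    print(f"DOF per frame ~{dof:.2f} mm, step ~{step:.2f} mm, about {shots} shots")

With those assumptions, the answer comes out to roughly a 0.13 mm step and about 64 frames for an 8 mm deep subject, which lines up with the 0.1 mm increments and 50-ish shots I quoted above.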
The zone of sharp focus should overlap from adjacent shots. Be careful to never change the image magnification while shooting a focus stack. Focus-stacking is mostly incompatible with living subjects at high magnification, since image motion is verboten. Some people claim they can chill insects enough to temporarily stop their movement. In the demonstration shots below, I found a recently deceased bee and a beetle to use as subjects. Due to the way stacking works, you won’t want to use tight framing on your subject. Expect to lose about 20% of the image around the edges, which you’ll need to crop out of the final stacked image.

Reverse-mounted lens with ring light and clip to hold subject

I don’t have hardware to reverse my other macro lenses, so the above setup only applies to the 55mm Micro-Nikkor with its 52mm filter thread. This ring light only fits 52mm or smaller diameters. Note how the hardware that holds the small subject is connected to the bellows; subject motion is no longer a problem in still air. The LED light provides perfectly even illumination without heating up the subject. For really gung-ho macro photographers, the PB-4 bellows provides both tilt and shift controls to manipulate the plane of focus and also perspective. Focus stacking pretty much eliminates the need to alter the plane of focus, though.

Flash close-ups with AR-4 release, BR-4 diaphragm control

The above setup demonstrates using a flash instead of a ring light. For lenses that cannot be reversed, I use this lighting arrangement if the subject is far enough away. At higher magnifications, the lens will eventually cast a shadow on the subject. This arrangement doesn’t provide continuous illumination, so the AR-4/BR-4 arrangement can be handy to keep the lens diaphragm opened until you depress the plunger on the AR-4. This cable release was designed for cameras like the Nikon F2, where the second cable would connect to the shutter release.

D500 with wired remote. Subject is lit up.

Focus stack of 69 photos. 55mm Micro-Nikkor at f/8, Nikon D610.

The demonstration photo above was made with the lens reversed and the ring light for illumination. Each shot was made in manual mode, ISO 100 and 1/3 second. You want the sharpest aperture and lowest ISO for this shooting, and please, please use RAW format. Using good light and an optimal aperture, this stacking technique gives you an idea of just how good the 55mm f/3.5 Micro Nikkor lens is. As I mentioned in another article, they used this lens for shooting the original Star Wars film, with good reason. Film cannot compete with this form of digital photography. I think that focus stacking is the perfect blend of art and science. Happy stacking. #howto

  • Convert your fisheye lens into a regular superwide

    Have you ever had to make a decision on which wide angle lens gets to go on your trip? The loser is usually the fisheye lens. Fisheye lenses are just a little too specialized, so they often fall into benign neglect. What if you could magically turn that fisheye lens into a regular rectilinear super-wide whenever you wanted? Would you find that useful? Duh!

I got caught flat-footed in keeping up with lens distortion correction technology. I tried using the lens “profile correction” in Adobe Lightroom with my Rokinon 8mm fisheye, and my jaw dropped. Words can’t do it justice, but pictures can. There are other software vendors, of course, that can do this task even a little bit better, because they can salvage more of the left and right frame edges. But I have Lightroom and I don’t have those other tools. And believe me, there’s plenty of image remaining after the lens profile corrections. Check out my Rokinon 8mm fisheye review here. I discussed a few postprocessing operations to ‘straighten’ shots in the article, but the results left a lot to be desired. The images (at f/8 and beyond) are incredibly sharp. But the lines are curvy. Very curvy.

Boring chart in Lightroom before lens profile correction, usual fisheye effect
Boring chart in Lightroom with lens profile correction. Rectilinear super-wide!

I am thoroughly impressed at how much barrel distortion is corrected. All of those little squares are square again. Yes, you lose some frame edges, but what’s left still covers a huge angle. For all intents and purposes, your fisheye image is now looking like a typical rectilinear super-wide lens. I cannot guarantee that other brands of fisheye lenses will be so well corrected. You can consult the Adobe web site to see if your lens profile exists. The profiles are periodically updated for new lenses. The profiles are available in Photoshop, as well.

Lightroom steps to use lens profile corrections for the Rokinon:
⦁ Select the "Develop" portion of Lightroom
⦁ Remove chromatic aberrations (select)
⦁ Enable Profile Corrections (select)
⦁ Make: Rokinon
⦁ Model: Rokinon 8mm f/3.5 UMC Fisheye CS

Tokina 11mm. I used to think this was a pretty wide angle lens.
Rokinon 8mm with Lightroom profile correction. Now that’s wide.

I chose the above shot to drive home the point that even though you can get super close and super wide, it doesn’t mean that it’s always the best choice. The lens distortion is removed, but the extreme perspective isn’t welcome in every shot. I consider this shot to show a lack of ‘taste’. Now, imagine if a person was sitting in one of those near chairs. Scary. Even with liberal cropping, you can now go really, really, wide.

Rokinon 8mm with no profile corrections. Now that’s a REALLY wide fisheye.

I think that these lens profile corrections are going to give my fisheye a whole new lease on life. All of those curvy lines can be made nearly dead straight. Claustrophobic spaces can be made to look immense. As with all super-wide angle lenses, you still need to watch out for perspectives that are too extreme. Shooting discipline is even more important here, such as keeping the camera level and at a ‘respectful’ distance from the subject. But imagine the world of options that this technique can open up. Science and art live happily ever after once again. #howto

  • Keep Using Capture NX2 with Raw Format

    So what do you do when you get a new Nikon camera and discover that Capture NX2 barfs on your .NEF files? There may be no need to panic. This is definitely a niche article. I am an admitted Capture NX2 holdout. I know there are others in the resistance movement out there, at least in other countries. I refuse to abandon either raw format or Capture NX2. It’s well-known that Capture NX2 doesn’t understand new Nikon camera raw files. Maybe you, like me, don’t want to switch to TIFF or JPEG to keep using Capture NX2. To find out if you’re in luck, you need to check to see if you have a copy of Capture NX2 Version 2.4.6 (not the last one, which is 2.4.7). No? You might have a lot of difficulty finding it, and Nikon won’t be of any help here. I'm a packrat when it comes to computer disk backups, so I had pretty much every Capture NX2 version saved.

Next, you need to download the (free!) “Raw2NEF”, created (and maintained as of this writing) by Miguel Bañón. His download site is here: Miguel is supporting both Sony and Nikon cameras. The (growing) list of supported cameras is listed at his site. What’s the catch? Well, there are two. First, this slightly disrupts your workflow. Now, you have to run his program and do the conversion before using Capture NX2. Second, the NEF files that the Raw2NEF program creates are about triple the size of your original compressed NEF files. You can always "zip" the folder of converted files later, to save about 33% on disk. You don’t have to worry about the Raw2NEF program altering your original files; it just makes new files (CNX2_ prefix) where you tell the program to put them. It only takes about 1.5 seconds per shot to do the conversion on my own computer. There are other options to fine-tune the file conversion process, but I just select the input folder, the output folder, and then click convert. It can recursively run through sub-folders during the conversion process, but it places all of the converted files into a single directory, so just be aware of this. If you have Photoshop CS6, you can later select the "CNX2_" files and convert them into Photoshop-compatible files. You should click the "Adobe Photoshop" button and browse to where Photoshop.exe is located before doing the conversion.

The Raw2NEF interface

Thank you Miguel! And long live Capture NX2. #howto

  • Make Manual Exposure Automatic

    It probably hasn't occurred to many photographers that they can automate exposure while in "manual" mode. Although this sounds like an oxymoron, it's really not. What if you would like a specific aperture to control either resolution or depth of field, and you'd also like to control the shutter speed to work well with lens vibration control or perhaps freeze action, but you don't want to give up automatic exposure? You can have it all simply by switching to "Auto ISO". Now, with the camera set to manual, you can control the aperture and shutter, while the camera adjusts the ISO to suit the required exposure. There's a variation on this theme, when you use a flash while in manual mode. That topic is discussed in an earlier article I wrote here:

For skeptics, there's a way to prove that you still get the correct exposure using the technique described above. It's called a histogram. Simply chimp the shot and verify that your histogram looks correct.

Sample camera Auto ISO menu option

Along with this extra power comes extra responsibility, of course. If you truly want “manual” exposure, then remember to turn Auto ISO OFF… #howto

  • Using MTF Mapper 0.6.3 New Features

    The MTFMapper program that I use to measure lens resolution is constantly improving and learning new tricks. I thought it would be a good time to show you some of its new features that aren’t too obvious to the casual user. If you aren’t familiar with MTFMapper, then I suggest you take a look at a previous article I made for it here. This program is authored by Frans van den Bergh, and is roughly the equivalent of the Imatest program, except that it’s free. The features I’m going to show you are in the new Windows 64-bit version 0.6.3. This version allows use of very large files that are beyond the limits of Frans’ 32-bit versions.

One thing I should mention is that I am still using the previous design of the lens resolution target. The new MTFMapper program still accepts it, but there is a newer chart design that is slightly more accurate at the very center of the chart (due to the “hourglass” shape of the middle target) and is more forgiving with chart rotation errors. I made a giant “A0” print of the earlier resolution target file on quality glossy paper and had it subsequently dry-mounted into a picture frame that I can hang where I want. The target files are also supplied by Frans at his web site (consult the link above).

My “A0” size vintage resolution chart design

It was a bit of a pain, and pricey, to make my large, mounted, and framed chart. I haven’t yet managed to talk myself into abandoning it. I did print and mount a smaller new-style resolution chart that I use when I can get close enough to fill the frame with some of my lenses.

Newer resolution chart design (with measurements on it)

I should also mention that bigger is better. Obviously. In this case, I’m referring to the resolution target. To get truly useful results, you want to take lens resolution measurements at the same distance from which you shoot regular photographs. For my Sigma 150-600mm lens, this means from about 60 feet away. You need a pretty big chart to be able to do that. For wide-angle lenses, it takes a huge target to get very far away from it and still fill the image frame (which once again means you need a big chart). You get the picture. Everybody by now knows I can't resist using puns at every possible opportunity.

A few program versions back, Frans changed his MTFMapper so that it no longer sets an automatic threshold value for locating target edges. As a result, you’ll probably need to change the value his program will otherwise use. For my own photographs, the ideal threshold value is 0.2. I set this in the “Settings, Preferences” dialog, which is the same place where you need to tell MTFMapper the size of your camera’s pixels. If you fail to do this step, you may discover that it refuses to measure many target edges.

MTFMapper 'Settings' dialog showing the plot 'Scale' slider and 'Threshold' value

In the picture above, note that I set the “Threshold” value to 0.2. For my photographs, this gets me the same results that the older versions of the program produced via its “auto-threshold” functionality. Note that I typically change my camera meter to use exposure compensation of about “+.7”, to get the chart white values to look reasonable in a photograph. Even in this day and age, camera meters can be a little stupid; the photographer still needs to supply the brains. The "Threshold" value tells the MTFMapper program how much of a contrast change is required to consider what it sees to be a valid target edge.
By the way, the "Arguments:" box shown above is where you can type in custom arguments for the MTFMapper program, which are the commands that look like "--myargument myvalue".

Measurement Plot Scaling
The first new MTFMapper feature I want to discuss is measurement scaling. For several program versions, the default has been to start the resolution measurement plot scale at zero. My own preference is to “auto-scale” the plot, so that chart resolution values stay strictly within the range of actual measured values. Your own preferences may differ. Frans now lets the user decide, so now you’ll find a slider in the “Settings” dialog called “3D plot z-axis relative scale factor”. If you slide it all the way to the right, then it will “auto-scale” the plot; if it’s at the left side, then the scale will start at zero instead. You can see in the picture above how I have the slider set. By the way, the slider is used for both the “2D” grid plot and the “3D” grid plot scaling.

Auto-scaled 2-D plot of the resolution target
3D Grid (Meridional) plot using "auto-scale" for an 85mm lens at f/2.8
3D Grid (Meridional) plot using "zero-scale" for an 85mm lens at f/2.8

The pair of plots above demonstrate the difference between auto-scale and zero-based scale. I prefer the auto-scale, because it maximizes the differences in measurements and it also shows what the minimum resolution measurement value is, in a simple way. The 0-based type of scale makes it too difficult for me to determine the minimum measured resolution values. I always select the "Line pairs per mm units" in the Settings dialog; otherwise, it will use "cycles per pixel" units.

Automatic MTF Curve Plotting for Any Measured Target Edge
The next new MTFMapper feature I want to cover is the ability to dynamically produce a chart of a selected measured edge of a square from the “Annotated” picture of the resolution target, as seen below.

MTF Curve from clicking on a resolution measurement value

As you can see in the picture above, a plot that shows contrast and frequency data from a single edge measurement in your “Annotated” picture can be generated from a single mouse-click. You merely find the edge of interest and click on the “cycles per pixel” (c/p) value to get the plot. The Annotated picture always displays "cycles per pixel" measured values, even when you have set "lines per mm" for your plot units. This plot has the ability to show you the frequency measurement for the entire MTF range (not just MTF50). To get the answers, you slide the vertical gray bar along the horizontal “Frequency (c/p)” direction and the chart will update the “contrast” value, which is also the MTF value if you multiply it by 100. Once the curve dialog is displayed, you can click on another edge measurement to replace the plot. Once the plot is displayed, you can hold the shift key and select an additional edge measurement that you want to compare, if you wish, such that both edge measurements are plotted together (in two different colors).

Chart shows a frequency of .308 cycles/pixel at MTF30

Conclusion
Frans is busy working on new features and refining existing program features all the time. If you want to track his progress, you might be interested in this link. Beware that most of Frans’ blogs are heavily dosed with matrix algebra and fast Fourier transforms; they're not for the faint of heart. If you have an interest in squeezing the maximum quality out of your camera gear, then you should probably try out this software.
It will enable you to get the best possible focus calibration and resolution measurements from your cameras and lenses at a minimal cost to you. But only if you take the time to print and mount the quality target files that Frans has designed and provided. #howto

  • A Better Way To Test Fisheye Lens Resolution

    I had tried to measure my Rokinon 8mm fisheye lens resolution a while ago, but I couldn’t get very good answers. The link to that article is here. The MTFMapper program I use to measure lens resolution is designed to look for little squares in a test chart to make measurements. A fisheye lens distorts the chart squares until they no longer resemble a square (they look like a rhombus shape), and hence the (previous) program versions skipped measuring them. The MTFMapper author, Frans van den Bergh, has been working on this fisheye measurement problem. He, of course, has come up with a solution (or else I obviously wouldn’t be doing this article, duh). The fix to this problem is to un-distort the photograph of a test chart, so that the little squares in the resolution chart photo look like squares once again. Simple. Unless you had to do it.

I’d like to show some screen shots of the new MTFMapper program, version 0.6.5, that demonstrate how to configure the software to be able to measure fisheye lens resolution. I always use Raw-format un-sharpened pictures of the test chart(s) that MTFMapper requires for proper measurements. The program uses the “Dcraw” program under the covers, which is constantly updated to understand new “raw” format files.

The un-manipulated 8mm fisheye chart picture (18” away)

Note that the chart shown above is the “vintage” resolution chart. There are newer chart designs available, but the latest MTFMapper version still accepts this older chart design.

Old versions of MTFMapper couldn’t measure the chart sides

As you can see above, the highly-distorted chart was just too much for older MTFMapper versions. Frans worked on adding the image manipulations to (mostly) eliminate the fisheye effect. His program needs to be told by the user that the chart shot needs distortion correction, and this is done in the “Settings” dialog.

Distortion correction in the Settings dialog

Different fisheye lenses use different optical “projection” formulas. The Rokinon 8mm lens uses “stereographic projection”. The options in the dialog include “none” (most lenses), “radial”, “equiangular”, and “stereographic”. If you don’t know the projection formula of your lens, you can just try experimenting with the options. You will need to include the actual focal length of your lens, unless you choose “none”.

The MTFMapper “Annotation” dialog

The picture above shows the measurements of the chart squares in the “Annotation” dialog, and incorporates the distortion correction. Note that it’s not perfect, but the program is able to readily measure all target squares now. The slight remaining barrel distortion doesn’t affect the MTF50 measurement accuracy. The sides of the little squares are now straight enough for MTFMapper to work its magic.

The “Profile” dialog. This is a good lens!

If you’re familiar with typical lens MTF50 measurements, then the above measurements should impress you. Although this Rokinon 8mm isn’t very good wide open, f/8 and beyond show that this lens is capable of amazing resolution.

The “2D” grid MTF50 lp/mm measurements

The resolution “fingerprint” of this lens is quite unique. This is a “DX” lens; the results above are using a Nikon D7100 (3.92 micron pixels).

The “Lens Profile” MTF10 and MTF30 measurements

Conclusion
The new MTFMapper V0.6.5 is a resounding success for tackling the fisheye lens measurement problem. I realize that not that many people own fisheye lenses, so this new feature will probably have a limited audience. If you need it, you know who you are.
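To give a feel for what that un-distortion step is doing, here is a conceptual sketch (not MTFMapper's actual code) that maps an image-plane radius from a stereographic fisheye, r = 2·f·tan(θ/2), onto the radius a rectilinear lens, r = f·tan(θ), would produce for the same field angle. The sample radius is just an illustration.

    import math

    def stereographic_to_rectilinear(r_fisheye_mm, focal_mm):
        # Recover the field angle from the stereographic projection,
        # then re-project it the way a rectilinear (pinhole) lens would.
        theta = 2.0 * math.atan(r_fisheye_mm / (2.0 * focal_mm))
        return focal_mm * math.tan(theta)

    # A point 8 mm off-center in an 8 mm stereographic fisheye image:
    print(round(stereographic_to_rectilinear(8.0, 8.0), 2))   # -> 10.67 mm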
I thought I’d mention that I also did an article here that discusses how you can effectively convert your fisheye images into a regular super-wide rectilinear lens (I used Lightroom lens profile corrections). This technique could also get you chart photos that MTFMapper could use, but it’s best to stick with un-sharpened raw pictures when measuring resolution. If you’re interested in getting this free program, take a look here. I made an article that gives a simplified explanation of its use here. If you like this program as much as I do, then please let Frans know! #review

  • Yet another MTF explanation article

    Lens resolution and contrast are discussed in so many different ways that it leaves most photographers dazed and confused. It’s time for my own two cents (or whatever that means to you at your own exchange rate). Most MTF discussions either quickly degrade into theory or never leave theory in the first place. I want to keep it real, with actual measurements and pictures.

“MTF50” Charts
So how do I arrive at my “resolution” measurements? I use “MTF50”, measured over the entire camera sensor. The Modulation Transfer Function I’m using (MTF50) measures how close black/white line pairs can get before they lose 50% of their contrast. As lines get skinnier (and closer), black lines start to get some white ‘contamination’ in them on their edges and white lines start to get some black contamination on their edges, too. When black turns 50% gray and white simultaneously turns 50% gray because this contamination on either side of a really skinny line meets in the middle, that’s the “MTF50” condition. The contrast (MTF) being measured is defined as:

MTF = (“brightest” – “darkest”) / (“brightest” + “darkest”)

A pure white area would measure 1.0, and a pure black area would measure 0. A perfect lens would then have an MTF of (1 - 0)/(1 + 0) = 1.0 no matter how skinny the lines were (ignoring the finite wavelength of light and diffraction effects). If you were shooting black/white line pairs and your lens only lost 5% of contrast, then you’d have an MTF of (.95 - .05)/(.95 + .05) = 0.9. In truth, the lens and the camera sensor both contribute to the loss of contrast. Since it’s more useful to have a camera attached to that lens, the MTF50 measurements shown below are a combination of lens effects and sensor effects. Also, the two-dimensional MTF50 plots show what’s going on over the entire camera sensor from corner to corner. At least half of the cameras out there have an “optical low-pass filter” (OLPF) over their sensors, which fuzzes the image a little to avoid the moiré effect. Since the resolution measurements are performed without any sharpening, this has a slightly negative impact on the results.

When you can get beyond about 30 line pairs per millimeter on the camera sensor before the lines drop to 50% contrast, then you have what most people consider good lens resolution. 30 line pairs per millimeter are really, really skinny lines. Keep in mind that 30 lp/mm is good unless you substantially enlarge an image, so “DX” sensors need 1.5X more resolution than “FX” sensors for the same-size print. The resolution measurements below are separated into “meridional” and “sagittal” directions (see the plots below), measured over the entire camera sensor. Think of the “sagittal” direction like spokes on a wheel, where the center of the lens is the hub. The rim of the wheel, where the spokes attach, is the meridional direction (perpendicular to the spoke directions). Lenses invariably do less well resolving lines in one direction or the other. When the sagittal and meridional resolutions differ, you get astigmatism.

If you’re interested in the total resolution in a picture, you need to take the “line pairs per millimeter” resolution value and multiply by how many millimeters tall the sensor is. This value gets you “line pairs per picture height”, or:

lp/ph = (“MTF50 lp/mm”) * (“sensor_height_mm”)

Bigger camera sensors have more millimeters in them, so you get more total picture resolution than a small sensor.
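As a quick worked example of that formula (using sensor heights of roughly 15.7 mm for DX and 24.0 mm for FX, as listed further down):

    def lp_per_picture_height(mtf50_lp_mm, sensor_height_mm):
        # lp/ph = MTF50 (lp/mm) * sensor height (mm)
        return mtf50_lp_mm * sensor_height_mm

    # The same 30 lp/mm performance on DX versus FX:
    print(lp_per_picture_height(30, 15.7))   # ~471 lp/ph on DX
    print(lp_per_picture_height(30, 24.0))   # 720 lp/ph on FX -> why DX needs ~1.5X more lp/mm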
Besides megapixels, there are factors like focus repeatability, optical low-pass filters, air turbulence (heat shimmer), shooting distance, ambient light level, the aperture setting, and camera vibration that can get the resolution waters muddy in a hurry. Sensor noise is a factor too, but that’s beyond the scope of this article. Most manufacturer MTF plots are shown at the widest lens aperture. The MTF measurements can be dramatically higher when you stop a lens down more, but there are limits to the increase in resolution you can get by stopping down.

A piece of a lens resolution chart photo

The photo above shows a section of a resolution chart that has been analyzed by a program. It has little blue numbers on top of every edge that has been measured. The measurements shown are in units of “cycles per pixel”, which means how many light/dark transitions happen per sensor pixel (less than one transition per pixel). The pictured edges with lower values on them are fuzzier than the higher-valued edges. You’ll notice a pattern that the edges aligned in the sagittal direction are consistently fuzzier than the neighboring meridional direction edges. To get a better idea of lens performance, you need to take resolution measurements at literally hundreds of locations all across your camera sensor.

Close up on a square. Numbers are cycles/pixel.

What are the Limits of Resolution?
The Luminous Landscape website has an interesting discussion on what resolution camera sensors and lenses are capable of producing. That link is here: You have probably heard the lens term “diffraction-limited”, but what exactly does that mean? When light passes an edge, like the edge of a lens diaphragm, it will diffract. If your lens is significantly stopped down, the diffraction gets huge. But how huge? The link above mentions that for MTF50, an aperture of f/1.4 could theoretically produce 494 lp/mm. No real lens is anywhere near this limit. At f/16, however, the theoretical limit is only 43 lp/mm! These numbers are for “yellow-green” light. By f/22 the limit plunges to 31 lp/mm. Many lenses are good enough to resolve more than 43 lp/mm at 50% contrast, but at f/16 and beyond, they never will. These lenses are “diffraction-limited”. On the camera side of things, a sensor has a Nyquist limit, beyond which it won’t record higher resolution (see below for that discussion).

The Lens
I did my testing with a single copy of the Nikkor 85mm f/1.4 AF-S lens. This lens has pretty good street cred, and I didn’t want questions about quality entering into the equation. As an aside, I thought I’d mention that “lenstip.com” reviewed a single copy of this lens on a D3X (24.5 MP) “FX” camera, and found no better than an MTF50 of 30 lp/mm at f/1.4 at the center of the lens. My copy measures between 32 and 42 lp/mm at f/1.4 in the lens center, depending on the camera. Its corner measurements on an FX sensor are as high as 27 lp/mm at f/1.4. I think that LensTip got a bad copy; I don’t think they have sloppy technique.

The Software
I make all of my resolution measurements using the MTF Mapper program, whose author is Frans van den Bergh. His software and printable test plots are available here. I’m using version 0.6.7 of mtf_mapper_gui.exe for these tests. I have an “A0” test chart (33” X 47”) printed on quality glossy paper, dry-mounted onto foam-board. This allows me to be about 16 feet from the resolution target and still fill the frame on DX when using the 85mm lens.
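That 16-foot figure falls straight out of similar triangles. Here is a rough sketch, assuming the A0 chart is framed with its 33-inch side matching the roughly 15.7 mm height of a DX sensor; it ignores close-focus effects, so treat it as a ballpark.

    def fill_frame_distance_mm(focal_mm, chart_height_mm, sensor_height_mm):
        # Similar triangles: distance / chart_height ~= focal_length / sensor_height
        return focal_mm * chart_height_mm / sensor_height_mm

    d_mm = fill_frame_distance_mm(85, 33 * 25.4, 15.7)   # 85mm lens, 33 inch chart side, DX sensor
    print(round(d_mm / 304.8, 1), "feet")                # -> about 14.9 feet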
I wanted to shoot at realistic distances, but not let air turbulence (think heat shimmer) enter into the mix. You must use software to evaluate resolution. It’s far pickier than you are, and totally repeatable. You also need software to properly evaluate focus when calibrating your phase-detect system, which the same MTFMapper program can do, although you need a different target for this.

The Technique
Before I discuss any test results, I’d like to mention that I think the biggest factor in measurement reliability is auto-focus variation. I used live-view, contrast-detect focus throughout. Results show the “best” resolution measurements I got, but the MTF50 results often vary by about 2 lp/mm from shot to shot. I focus in-between every shot. The camera stops focusing when it thinks it’s “good-enough”, so there is always some amount of variability in focus. Some people place their cameras on a moving platform and shift focus by progressively changing the subject distance between photos of the test chart. The next-biggest factor in spoiling resolution is camera motion. I use a big and heavy tripod in all testing, along with a remote release and “mirror-up”. Short of mounting your camera on a granite slab, however, you’re always going to experience some amount of camera shake because of the shutter motion. Except when your camera has an electronic front-curtain shutter (EFC) like the D500. If you have it, use it. I’m convinced that it got me about 2 lp/mm extra resolution. Shutter speeds were all around 1/1600s (the EFC is limited to 1/2000s). Take the photos at the camera base ISO. You don’t want sensor noise to be a part of the test. The tested lens has no vibration reduction; if it did, I’d turn it off for testing.

The Camera
Your camera sensor will affect your MTF measurements, as I already mentioned. Another influence on resolution is called the Nyquist limit. Your measured resolution can’t go higher than this value. I show some camera Nyquist limits below.
D5000: 4288 X 2848, 23.6mm X 15.8mm, 5.5 micron pixel, 12.3MP, OLPF, Nyquist 90.1 lp/mm.
D7000: 4928 X 3264, 23.6mm X 15.6mm, 4.78 micron pixel, 16MP, OLPF, Nyquist 104.6 lp/mm.
D500: 5568 X 3712, 23.6mm X 15.7mm, 4.22 micron pixel, 20.9MP, no OLPF, Nyquist 118.2 lp/mm.
D610: 6016 X 4016, 35.9mm X 24.0mm, 5.95 micron pixel, 24.0MP, OLPF, Nyquist 83.7 lp/mm.
D7100: 6000 X 4000, 23.6mm X 15.6mm, 3.92 micron pixel, 24.0MP, no OLPF, Nyquist 128.2 lp/mm
The Nyquist sensor resolution is: (pixels high / height_mm / 2) lp/mm. If a lens has better resolution than the sensor Nyquist limit, then that extra resolution won’t get recorded.

Now, the dreaded MTF50 math
The program measures the target edges, then converts the “cycles per pixel” into “MTF50 lp/mm”. This number of cycles is measured at a contrast of 50%.

MTF50 lp/mm = cycles_per_pixel * height_pixels / height_mm

For instance, the photo above shows a couple of “0.19” c/p measurements for this D610 (4016 pixels tall, 24.0 mm tall): MTF50 lp/mm = 0.19 * 4016 / 24.0 = 31.8 (Pretty awesome for f/1.4 near the corner of the photo!) The plots show how many line pairs per millimeter can be resolved before they reach the 50% contrast threshold over the whole two-dimensional camera sensor.
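Those two formulas are easy to drop into a couple of helper functions; here is a minimal sketch that reproduces the numbers above:

    def nyquist_lp_mm(pixels_high, sensor_height_mm):
        # Sensor Nyquist limit: half a line pair per pixel
        return pixels_high / sensor_height_mm / 2

    def mtf50_lp_mm(cycles_per_pixel, pixels_high, sensor_height_mm):
        # Convert MTF Mapper's cycles/pixel into line pairs per mm
        return cycles_per_pixel * pixels_high / sensor_height_mm

    print(round(nyquist_lp_mm(4016, 24.0), 1))        # D610 Nyquist: 83.7 lp/mm
    print(round(mtf50_lp_mm(0.19, 4016, 24.0), 1))    # the 0.19 c/p example: 31.8 lp/mm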
Stop the lens down for dramatically better resolution

The most common “MTF” chart style
The “MTF10,30” plots show the lens measurement data in a different way. These are the plots most people are familiar with. The plots have lines that show “contrast” averaged over the lens, moving from the lens center (on the left) to the lens edge (on the right). The contrast is calculated the same way as the formula from above, using a couple of different sets of line frequencies (thicknesses). The chart plots are separated into 10 line pairs/mm and 30 line pairs/mm, in both the sagittal and meridional directions. The vertical contrast range scale goes from 0 to 1.0, where 1.0 represents 100% contrast. Some manufacturers use “radial” and “tangential” terms instead of sagittal and meridional, but they mean the same thing. What you get, then, is the measured contrast for relatively thick lines (10) and thinner lines (30). The “10” is considered the lens contrast, and the “30” is considered lens resolution. The MTF10,30 plots are traditionally shown at the lens maximum aperture only. These plots can be a little underwhelming, especially when compared to a lens at its optimum aperture. I think these “10-30” plots are much less informative than the “MTF50” plots in regards to resolution measurement, but they let you compare lens measurements to the same style of plots that most camera companies publish. Except for Leica and Zeiss (and possibly Sigma), the plots that the camera companies publish are “theoretical” and not actually ever measured. To me, this is “blowing smoke you know where”. I think of these plot types as a decent way to evaluate lens astigmatism, but not “resolution”. You need two-dimensional data to really know how a lens performs.

A wide-open (f/1.4) MTF plot, D610 and 85mm f/1.4 lens
85mm f/1.4 lens stopped down to f/4.0. D7100 camera.

Summary
There are several ways to show the resolution of a lens. The worst way would be a single number. A better way is to show what the whole camera sensor sees, in two dimensions. An even better way is to show two-dimensional measurements that also segregate the sagittal and meridional information. Better yet, gather this information at the different aperture settings. #howto
