
  • White Balance for Infrared Photography

    I haven’t owned a camera since the Nikon D60 that would successfully perform a “preset white balance” when I use my Hoya R72 infrared filter. I have no intention of converting any of my cameras to “infrared only” by having the sensor filter changed. All I’m after is a decent-looking picture on my camera LCD after I take a shot using the R72. All of the newer cameras screen out infrared light so effectively that regular white-balance measurements don’t work. My old D50 even let me take IR shots hand-held while using the Hoya R72!

    If you stick with “auto white balance”, then your IR shots look totally red on your camera LCD. Yuck. They’re not much better if you try setting the lowest “Kelvin” white balance (2500K on my D500). Ditto for “incandescent” white balance. So, what to do?

    I came up with a procedure that “mostly” works to solve this IR-shooting problem. At least I get to see pretty decent images on my camera LCD. My secret procedure involves displaying a special color on my computer monitor, and then setting my camera white balance preset using this displayed color. I previously published an article here explaining how to create special colors and set your camera white balance using such a color. The Hoya R72 leaves your photos with very little blue and green, but a ton of red. It occurred to me that I should be able to create a color to counteract this, so that you could set a camera white-balance preset without using the Hoya R72 at all. The following procedure is what I came up with.

    Auto White Balance with the Hoya R72. Yuck.

    The picture shown above is what your camera viewfinder looks like when you select “Auto White Balance”. It’s really hard to see what’s going on.

    Incandescent White Balance. A teeny bit better.

    2500K White Balance. Slightly better.

    I started working on colors that would emphasize red with much less blue and green, so that I could emulate on my computer monitor the spectrum passed through the Hoya R72 IR filter.

    Red 240, Green 64, Blue 52

    If I displayed the above color on my computer monitor and successfully performed a “preset white balance” against it, then I could use that preset while shooting with the Hoya R72. It turns out that going beyond the color shown above made my camera stop accepting it as a “good” white balance preset.

    Preset white balance from R240, G64, B52 screen color with Hoya R72

    As you can see above, I’m definitely on the right track. Now, my camera screen shows pictures that make a lot more sense. I still need to post-process these pictures to get better white balance, but at least I’m not seeing red while shooting. I realize that color really has no meaning in infrared, but I think you would agree that pure red tones definitely aren’t what you want.

    Post-processed shot. Used a gray-point to get color closer to what I wanted.

    As you can see above, I was able to get a workable color palette from the “R240G64B52” preset color.

    The “correct” RBGG white balance gains for the Hoya R72 from a D60 file

    My goal in getting an optimal white balance preset was to achieve the gain values shown above (the 4 numbers are Red, Blue, Green, Green). This Nikon D60 file shows the results of a “good” preset, based upon using lawn grass in full sun as a target. When I was trying different computer screen colors to preset against, I could never drive the red high enough to reach the “0.507” gain before the camera WB preset operation would fail (showing “no good” feedback).
    I had to stop at a Red value of 240, instead of driving it to the maximum of 255, which left me with a gain of 0.5649 versus the 0.507 goal.

    Exif data showing the R, B, G, G gain values I obtained

    Nevertheless, I’m now getting greatly improved rear-LCD feedback on my D500 when I shoot infrared with the Hoya R72 filter and my custom white balance preset made from the computer monitor. By the way, another article I wrote mentions how the D7100 and D610 cameras are terrible for infrared photography, with both having unacceptable internal reflections unless you use the little DK-5 viewfinder eyepiece blocker. The D500, on the other hand, is a very good camera for infrared photography, even without using its built-in eyepiece shutter. Many cameras still won’t produce a good white balance preset using this procedure, but give it a try.
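    If you don’t want to rebuild the colored rectangle in an image editor every time, a few lines of Python can fill the screen with the test color directly. Below is a minimal sketch using Python’s built-in tkinter; the R240/G64/B52 value is the one from this article, and the same script works for the UniWB screen colors described further down the page. Press Esc to quit.

        # display_wb_color.py -- fill the whole screen with one solid color,
        # for setting a camera white-balance preset against the monitor.
        # Minimal sketch using Python's built-in tkinter.
        import tkinter as tk

        R, G, B = 240, 64, 52  # the color from this article; edit as needed

        root = tk.Tk()
        root.attributes("-fullscreen", True)             # hide borders/taskbar
        root.configure(bg=f"#{R:02x}{G:02x}{B:02x}")     # fill with the color
        root.bind("<Escape>", lambda e: root.destroy())  # press Esc to exit
        root.mainloop()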

  • How to Correct an LED “White” Light Source

    I really love my LED ring light, which I use for macro photography. It’s just one example of photographic lighting based upon white-LED technology. These lights are great: they don’t heat up (no wilting flowers), they last a really long time, they’re small, and they consume much less energy. But there’s a slight downside: they put out kind of weird light. None of my cameras can quite figure out a good “auto” white balance for them. The reason is the blue part of the spectrum. Most, if not all, “white” LED lights put out an unbalanced spectrum that takes a special white-balancing procedure.

    “White” light LED typical spectrum. Lots of blue, little red.

    Camera Auto White Balance just doesn’t cut it. My camera “auto white balance” isn’t smart enough to get the color of a grey card correct under the white LED light.

    Using the Capture NX-D gray-point “eye dropper” isn’t enough to fix it. Still too blue.

    It’s tempting to just use the little “gray point” eye dropper to fix the color, but this isn’t quite enough. The histogram above shows how there’s still too much blue.

    Manually increasing the color temperature helps, but it’s still too blue

    Additional blue adjustment (10000K plus Levels and Curves adjust) finally looks correct

    Capture NX2: adjust color temperature and adjust color balance as well

    Capture NX2 with 10000K and Blue -40 also gets me the correct grey

    As shown in the pictures above, it’s the norm for white LED lights to put out excessive blue light while skimping on red. The camera’s single white-balance adjustment isn’t able to fix the problem, and an image editor’s simple gray-point color picker still leaves too much blue in the shot. In an image editor, however, you can fix the problem with a double adjustment. For this LED light, I start with a high color temperature (roughly 10,000K) to align the red and green channels (use a grey-card image, of course!). Second, shift the blue peak until it lands on top of the red and green peaks. It probably won’t be perfect, but it will be plenty close to look good.

    There are so many occasions where you need to adjust your white balance. Please, please invest in a grey card. They’re really dirt cheap and can greatly improve the look of your pictures, especially when you get into environments with unusual lighting. If I were smart, I’d save these color adjustments as a batch file, so that I could correct all of the pictures shot under the LED light at once. A batch file is also a good idea if you tend to forget after a few months exactly how you corrected the white balance in the first place.

    Don’t avoid getting a white LED light for photography just because you’ve heard that their color is bad. Chances are that you can correct the color in post-processing by using a two-step procedure. Chances are less good that you can correct the white balance in-camera, so jpeg shooters beware.

    My LED ring light in action #howto
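    If you script your post-processing, the same idea can be collapsed into per-channel gains computed from a grey-card crop: scale red and blue so their averages match green, which puts all three histogram peaks on top of each other. A minimal sketch with numpy and Pillow, not the Capture NX2 sliders described above; the file name and the crop box for the card are made up, and it assumes an 8-bit RGB TIF export.

        # led_wb_fix.py -- grey-card white-balance fix for an LED-lit shot.
        # Computes gains that align the R and B channel means with G,
        # the same end state as the two slider moves described above.
        import numpy as np
        from PIL import Image

        img = np.asarray(Image.open("led_shot.tif"), dtype=np.float64)
        card = img[1000:1200, 1500:1700]   # hypothetical crop of the grey card

        r = card[..., 0].mean()
        g = card[..., 1].mean()
        b = card[..., 2].mean()
        gains = np.array([g / r, 1.0, g / b])   # align R and B onto G

        fixed = np.clip(img * gains, 0, 255).astype(np.uint8)
        Image.fromarray(fixed).save("led_shot_fixed.tif")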

  • How to make a crowd disappear in broad daylight

    Have you ever been to a popular tourist attraction and simply couldn’t get a photograph devoid of people? Or wanted one of those cool landscape shots where the ocean waves turn into a cloud of mist, even when it’s high noon and sunny? The most obvious answer for most photographers trying shots like these is to use a really strong neutral density filter. But what happens when you only brought along your lens with the 105 mm filter thread and you never got around to buying that $400 filter (they really can cost this much)? Or you at least remembered to bring your tripod, but forgot your arsenal of filters? Or you brought that cool super-wide lens that doesn’t even let you mount a filter? There’s still a way to get that image you crave. You don’t even have to sacrifice your sharpest aperture, or be forced to put a filter over a long lens and have its resolution ruined.

    The solution to this conundrum is image stacking. For me, I turn to the same free “CombineZM” program that I use for focus-stacking my macro shots. (There’s also a version called “CombineZP”.) Yes, you probably need a tripod, and yes, you need to take several shots of the same scene. You can download this program from a variety of sources, such as here.

    The technique is pretty simple. First, take several shots of the scene without moving the camera. To get rid of that crowd, take a series of shots where people aren’t standing in the same place in each shot. Don’t use auto-focus or change exposure between shots. Then:

    ⦁ Process your shots into a format such as TIF or JPG (CombineZM doesn’t like raw format).
    ⦁ In CombineZM, select File | New.
    ⦁ Browse to your series of pictures and select which ones you want to combine (hold the shift key to select a range). The pictures should all be in the same folder.
    ⦁ Select Stack | Replace Groups, Average.

    The CombineZM program expects your exposure to be correct in each frame for this stacking option. It will “merge” all the selected photos into one shot, and will also reduce any image noise that might be present. In fact, some people use this technique purely for the noise-reduction feature, primarily in the shadows. For the best results, I’d recommend that you use at least 10 shots. If you use 10 pictures, then a person appearing in one of those shots will only contribute 1/10 to the final picture (90% transparent).

    5-shot “average” stack. Still looks a bit ghosty.

    If a ghost-like shot is what you’re after, that is of course completely possible. You also might want to get a model to stand still while everyone else is moving around, leaving the model standing all alone. It’s all about getting creative, isn’t it? #howto
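    If you’d rather not install CombineZM, the “Average” stacking operation itself is simple to reproduce. Here’s a minimal sketch with numpy and Pillow; the folder and file names are placeholders.

        # crowd_eraser.py -- average a burst of tripod shots so that moving
        # people fade toward invisibility; the same "Average" operation that
        # CombineZM's "Replace Groups, Average" performs.
        import glob
        import numpy as np
        from PIL import Image

        files = sorted(glob.glob("stack/*.jpg"))   # 10+ frames, camera untouched
        acc = None
        for f in files:
            frame = np.asarray(Image.open(f), dtype=np.float64)
            acc = frame if acc is None else acc + frame

        avg = (acc / len(files)).round().astype(np.uint8)
        Image.fromarray(avg).save("crowd_free.jpg")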

  • UniWB and ETTR: the Whole Recipe

    UniWB (unitary white balance) and ExposeToTheRight. What are they and why should you care? This is all about jamming the maximum range of light into your raw pictures while being confident you aren’t getting any color-channel blowouts. Think maximum possible quality.

    The “Uni” part of UniWB alludes to uniform signal gain being applied to the red, green, and blue pixels on your sensor. The optimal gain (or signal boost) is ‘no’ gain, or a multiplier of 1. If no gain is applied to your pictures, then the histogram in your camera always speaks the “truth”. If you no longer need to leave a little slop in your exposure to avoid blowouts, because you can now believe your histogram, you can get the maximum light into your shadows. So, there must be a catch. There’s always a catch. Here’s the catch: your pictures all look green. An ugly, sickly green. But not to worry; the green is curable. It’s just not curable inside your camera.

    Here’s the deal. I’m going to show you how you can get set up and start using UniWB at no cost, except maybe a little sweat. I verified these procedures with a D610, D500, and D7000. Canon (or anybody else) user procedures would be nearly identical. These procedures only apply to “raw” photos, though.

    What do I need?

    ⦁ A grey card. So I lied already. You need to get a grey card.
    ⦁ A Windows computer and monitor. You don’t even need a calibrated monitor.
    ⦁ “Exiftool”, a free download. Or something else that can look at exif data.
    ⦁ The photo editor you normally use to process NEF (raw) photos.
    ⦁ Windows Paint, or something that can draw and display a colored rectangle full-screen.

    Make a custom color on your computer monitor

    1. Open up Windows Paint.
    2. Click “Resize”, select “Pixels”, de-select “Maintain Aspect Ratio”.
    3. Assign something like Horizontal 1920, Vertical 1080, or whatever your monitor size is.
    4. Click “Color 1” (foreground color).
    5. Click “Edit Colors”, then select “Color Solid”.
    6. Set “Color Solid” to Red 128, Green 64, Blue 128, then “OK”.
    7. Click “Select All”.
    8. Click “Fill” (the bucket icon).
    9. Left-click inside the big white rectangle to fill it with your new Color Solid.
    10. Click “View”, “Full Screen”.

    You now have your monitor displaying a single (pink-ish) color across your whole screen. You will need to do this in a darkened room, so that you don’t see your own reflection. Later, you will probably need to repeat these steps, but change the values of the “Red” and “Blue” colored rectangle as needed in step (6). We will leave the Green value at 64 throughout the tests. Believe it or not, this pink-ish color is what your camera sensor perceives as “neutral”; the R,G,B gain values applied to it should all land near a value of 1. For my monitor, the rectangle on the right is nearly “perfect UniWB color” for my cameras.

    Set your ‘White Balance Preset’ to the color displayed on your monitor

    1. Put a long-ish focal length lens on your camera, in case your monitor screen distorts brightness/color when you get too close to it.
    2. De-focus your lens, and select “release priority” so it will take an out-of-focus picture.
    3. With your camera in “Raw” mode, set your white balance selection to “preset” (hold the WB button and turn the dial to get ‘PRE’). Select which preset number you want with the other dial, still holding the WB button.
    4. Fill your viewfinder with the screen color, and hold the “WB” button until “PRE” flashes in your viewfinder. Press the shutter. You need to see “GD” displayed to know you have a “good” white balance preset. If you get “no GD”, you need to retry. (Too dim?)
    Now, go outside and take one or more interesting and tasteful photographs using this new white balance preset. You’ll notice that your pictures are that sickly green I mentioned earlier.

    Measure the Red and Blue color channels in your photographs

    Take the memory card out of your camera and copy the photo(s) to your computer for analysis. Drag a green photo onto “exiftool” to get a text file of the exif data.

    Exiftool feedback for Red, Blue sensor channels

    In the picture above, the Red feedback is about 0.95 (less than 1). Blue feedback is about 0.98, also less than 1. If I wanted to get closer to the ideal 1.0 gain, then I would DECREASE the Red value in the colored rectangle and therefore force a BIGGER red gain for the next test. If the Red gain had instead been larger than 1.0, I would use a larger red value in the colored rectangle for my next attempt. Similarly, the Blue gain is smaller than 1.0, so in the next attempt I would decrease the Blue in the colored rectangle, forcing that gain to get larger. To keep it simple, I named each rectangle I created with the RGB values used. In one of my experiments, the following shows the steps I took:

    (Start): Rectangle R128G64B128: exif feedback has R gain = 1.38818, B gain = 0.64941
    (Need bigger R, smaller B): Rectangle R140G64B118: exif feedback has R gain = 1.15869, B gain = 0.73291
    (Need bigger R, smaller B): Rectangle R145G64B100: exif feedback has R gain = 1.0249, B gain = 0.881347
    (Need bigger R, smaller B): Rectangle R147G64B80: exif feedback has R gain = 0.9907, B gain = 1.13183
    (R is good, need larger B): Rectangle R147G64B86: exif feedback has R gain = 0.9907, B gain = 1.0449

    For each iteration shown above, I had to go back into “Paint” and make a new rectangle with the adjusted R and B values. While displaying the new rectangle full-screen, I once again set the white balance preset and took another photo for analysis. As the saying goes, rinse and repeat.

    Once I had a calibrated UniWB with one camera, I could grab a different camera and perform a white balance preset using the screen rectangle color as-is. I have found that the different Nikon cameras are close enough in color response that I don’t need to iterate any further with them; I suspect the same is true of other camera brands. In other words, it was “one and done”. Trust but verify, though.

    Use your photo editor and grey card to get the correct color

    Now that your exif feedback is within 5% (R and B gains between 0.95 and 1.05), you need to photograph that grey card I told you about, in the SAME light as your tasteful photos, using this “good” white balance preset.

    Photo of grey card using the UniWB preset

    You can see above how the “grey” card photo looks anything but grey. The UniWB preset feedback in my photo editor (Capture NX2 here) shows 4938K. The histogram peaks aren’t anywhere close to being on top of each other. To get the grey card to look correct, I just move the “Fine Adjustment” color slider from its 4938K to where the histogram peaks land on top of each other. As shown below in Capture NX-D, a gray-point “eyedropper” picker can also be used.

    Color adjusted to 6457K gets the histogram R,G,B peaks to coincide

    By moving my color slider until the R,G,B peaks fully overlap, I get the correct color temperature. Now the grey card looks grey again. I note this correct color temperature for later use.

    Capture NX-D correcting white balance

    I thought I’d try to see if Capture NX-D could also fix the white balance.
    It could, although I used a slightly different mix of “tint” and color temperature to get the best-looking histogram. Capture NX-D will also let you do batch processing to convert many files at once. The “eye dropper” gray-point color picker works here, too.

    Capture NX-D

    When you have a nice, continuous light spectrum in your photo, you can use the simple technique of picking a spot on the grey card with the “gray point eye dropper”. This is a fast way to get a good white balance, and most photo-editing programs give you a similar option. The only downside is that this eye-dropper technique doesn’t tell you what color temperature is being used!

    ETTR (Expose To The Right)

    The photo above was taken using the “correct” UniWB preset, at 4938K. I could use the camera histogram and note where the now-accurate colors are located. The lighting here matches the lighting where I photographed my grey card. I can now adjust exposure until the brightest color (green here) lands against the right-hand side of the histogram, i.e. ETTR, and I can expose with confidence that I’m not blowing any color channels. I also know, from my analysis of the grey card shot at the same 4938K preset, how to adjust this green photo to make its color balance correct during post-processing.

    Adjusted photo

    All I have to do now is set the color temperature to match the corrected grey card shot, which in this case was 6457K. If I had hundreds of pictures that all needed this color adjustment, I’d create a batch file and run it against all of the pictures.

    Conclusion

    The above procedures should be complete enough that you now know how to find a correct UniWB setting for your own camera(s). From there, you can make better use of your camera’s histogram for fine-tuning exposure, via ETTR. Although I’m a Windows user, the same principles apply on Apple machines, of course. You don’t need a calibrated monitor to use these techniques, but you may be forced to iterate through more steps to finally locate the proper UniWB setting. I realize it feels a bit unsatisfying to constantly see green pictures on your camera LCD screen, but at least you can feel confident that your camera histogram will only show blown color channels that are truly overexposed. You can finally get the absolute maximum amount of light into your shadows. Getting rid of that green is really quite simple, as long as you can wait to process those pictures on your computer. #howto
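    For the iteration loop described above, a small script can read the gains out of each green test shot and tell you which way to nudge the rectangle. This sketch shells out to the free ExifTool; the WB_RBGGLevels tag name matches the “R, B, G, G” gains my Nikon files report, but verify it against your own camera’s exif output (some models report these levels in a different form).

        # uniwb_check.py -- read the white-balance gains from a UniWB test
        # shot and say which way to nudge the screen rectangle's color.
        # Requires ExifTool on your PATH; the file name is a placeholder.
        import re
        import subprocess

        out = subprocess.run(["exiftool", "-WB_RBGGLevels", "green_test.NEF"],
                             capture_output=True, text=True).stdout
        values = [float(v) for v in re.findall(r"\d+\.?\d*", out.partition(":")[2])]
        if len(values) < 2:
            raise SystemExit("WB_RBGGLevels not found -- check your camera's tags")

        r_gain, b_gain = values[0], values[1]
        for name, gain in (("Red", r_gain), ("Blue", b_gain)):
            if gain > 1.05:                 # gain too big: raise the screen value
                print(f"{name} gain {gain:.3f}: raise {name} in the rectangle")
            elif gain < 0.95:               # gain too small: lower the screen value
                print(f"{name} gain {gain:.3f}: lower {name} in the rectangle")
            else:
                print(f"{name} gain {gain:.3f}: within 5%, good")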

  • Nikon D500 Focus Bug

    I was exploring some D500 focus options when I discovered something was definitely amiss. It turns out that I’m not alone.

    Nikon has, for many years, offered auto-focus options that help you keep the subject in focus, even when the subject leaves your selected focus sensor. One set of focus options, in continuous-focus mode, is called “dynamic-area AF”. These modes have names such as [9-point, 21-point, 39-point] or [25-point, 72-point, 153-point], depending upon the camera model.

    Mode selection

    The general idea is to choose a bigger-numbered mode when the subject gets harder to track. For a very erratically-moving subject, you’d want to use the D500 “153-point” dynamic-area AF mode, since it covers nearly the entire field of view. Right? Wrong. If you choose the 153-point mode and your subject moves off of your selected focus sensor, the D500 (and the D5) will simply re-focus on the background as soon as the time has elapsed according to your “Blocked Shot AF Response” and “Subject Motion” values in the “A3 Custom Settings” menu. This is an obvious bug! It’s easy to test, by simply pointing the lens slightly off-target while continuously focusing. The focus will shift to the background nearly immediately, ruining your shot.

    The selected in-viewfinder focus sensor never updates in dynamic-area mode, so you have to view the photograph to see which sensor was used (little red square). You’ll find that the camera isn’t switching focus sensors to keep up with the subject. I’m seeing the same problem with “d72”; it won’t track the subject when it moves off of the selected focus sensor, even when it is still within the ‘box’ of 72 sensors. I always use the back-button for focus (the AF-ON button, or whatever button you assign focus to). I haven’t seen this focus bug on any other Nikon models, so it appears to affect only the D5/D500. I am using firmware version 1.13 (the latest to date).

    I found out that Nikon was informed of this problem many months ago by multiple users, but each time has responded as if they had never heard of it. Here are some links on this bug I found from other users, after I decided to go on a web search:

    DpReview
    FredMiranda

    Note that in the “FredMiranda” link the sample shots mention “d513” when he really means “d153”. A simple little dyslexia-type mistake. Steve used the Nikon D5 to demonstrate the exact same bug I’m seeing with the D500. We can only hope that enough complaints will shame Nikon into finally fixing this firmware bug. By the way, their other camera models don’t have this bug.

    Sample shot by Steve Perry (from the Fred Miranda link above). The D5 has the same “d153” focus bug.

    The best substitute for 153-point is 3D-tracking, the closest (functional) option for following subjects all around your viewfinder. You start by putting your desired beginning focus sensor over the subject, like usual (in continuous-focus mode). Then you start focusing (presumably with the AF-ON button). 3D-tracking uses color information and then shows you the automatically-selected focus point anywhere in the frame as the subject jumps around in your viewfinder. If your subject moves quickly, Nikon recommends you also set “3D-tracking watch area” to “Wide” in the “Custom Settings” (pencil) A5 menu. For quick response, Nikon also has the “Custom Settings” A3 menu to set both the “Blocked Shot AF Response” and “Subject Motion” values.
    For these settings, a lower number means quicker reaction to changes, although note that in 3D-tracking, 1=2=3 for the “Blocked Shot AF Response” setting. If your subject is the same color as its background, then this mode will probably fail. On the D5/D500, this 3D mode has the advantage of actually showing you the selected focus sensor as it tracks the subject around the frame. I think their dynamic-area mode should do the same thing, so that it gives you real-time feedback about what it is doing (or not doing).

    Update 9-24-2017

    Steve Perry (mentioned above and in the Fred Miranda link) has been pursuing this issue, and wrote more updates about the focus problem. He thinks that Nikon fundamentally changed how dynamic-area AF works on the D5/D500, but didn’t document it. Rather than paraphrase Steve, I’m including his comments (from page 9 of the Fred Miranda thread) below. Steve attempted to reverse-engineer what the focus algorithm must be doing.

    OK, I think I finally have an answer. Before I lay it out though, I wanted to thank everyone who helped by posting to this thread and PM’ing me. An extra-special thanks to Snapsy and Keith for their help on this. Literally couldn’t have done it without them.

    So, here’s what I think is happening. Not sure if it’s 100% correct, but it seems to fit the facts and behaviors as we know them. Also, I reserve the right to revise this as time goes on : )

    First, we know for sure that the D5/D500’s Dynamic area is not the same as the previous generation of bodies, no question there. In the past, you would typically acquire the target with the primary AF point, and then if the target slipped off that point, another AF point would jump in and take over – and it would track like that indefinitely. The new system on the other hand seems to let go of the target and go for whatever is under the primary AF point – almost like Dynamic wasn’t even there. This tends to appear broken since when viewing the images in View NX-i or on the back of the camera, the system never seems to move the AF point – it always shows the selected point. (In the past, you could see the point it used.) According to the EXIF data though, it actually is selecting different AF points as the subject leaves the primary point. However, it’s reporting it like Group AF does – just showing you where the selected area (point) is and not the actual AF point that was used. As Snapsy said, you can verify this with the EXIF Tool. The camera is unquestionably selecting different AF points as needed.

    So, after looking at far too many lines of EXIF code and finally seeing a pattern, here’s how the new system works (I think):

    It locks on with the primary AF point and begins tracking. If the subject falls away from the primary AF point, the system will switch to one of the auxiliary points in the selected Dynamic area. However, unlike the old system, the new system has a bias for the primary AF point. After a brief delay, the camera tries the primary AF point again. If there is a good target under the primary AF point, it will go for that. If there is not a good target under the primary point, it will go back to using the auxiliary points. It will continue to go back and forth like that until it can get a lock with the primary AF point again – or you stop focusing.

    Two notes –

    Note that it MUST be a good AF target for the system to switch – just a target that it can technically focus on isn’t good enough. I have tested this with poor targets the camera could just barely focus on.
    While the camera could technically get a lock, it would stutter a little trying to keep it. I would then switch to Dynamic and focus on a printed box with the poor AF target in the background. When I moved my primary AF point over the poor AF target, it would stay with the first one indefinitely. Field tests also seem to confirm that it needs a good target in order to switch points – sadly, there are a LOT of those out there.

    Delay time – In the past, the camera would not invoke the delay time (Blocked AF Response) specified under A3 unless the target had completely left the AF area. However, that’s not the case now. The camera will start the countdown as soon as the target leaves the primary AF point (as a poster noted above) and use the auxiliary points until the time runs out – at which point it will try again with the primary AF point.

    Usage

    So, if this is the new normal, we have to adjust to the change. For some people, this system is actually an advantage; for others, not so much. The advantage favors more experienced shooters. In the past with Dynamic, if the system switched to a different AF point, it would tend to stick with it – but sometimes that’s a problem. With the old system, if I’m photographing a bird coming at me at a 45-degree angle, I would go for his head. However, if I accidentally slid the primary point off, the system would pick a new AF point. If it decided to go to the spot on the bird down by where the wing meets the body, it was an issue. The camera would lock on and just stick there until you refocused – even if for the rest of the sequence you kept the primary AF point on his eye. With the new system, it may still move to the wing, but if you keep the AF point on the eye, the camera will get the idea and switch back to it.

    The downside of course is that if you really are having a difficult time tracking, in the past Dynamic would really help. Just get the initial lock and fire away. Even if the primary AF point never revisited the subject, it would continue to track and not jump to the background. IMO this is the better method – less experienced shooters can use wider areas and more experienced shooters can use smaller areas to restrict where the camera can focus.

    So, the bottom line is this – with the new system, you need to do your best to keep the primary AF point on your subject. If you’re having a hard time, set the delay under A3 to 4 or 5. However, keep in mind even at “5” the delay is short. However, just knowing that it’s critical to keep the AF point on target may be enough to help some shooters. -- Steve Perry

    After more than a YEAR, there was this response from Nikon:

    I apologize for the delay, and for the confusion. According to our design group at Nikon Corp, the Dynamic Area AF function has been enhanced with the newest AF sensor, particularly for subjects moving toward or away from the camera. Dynamic Area AF (9, 53, 72, or 153 point) does not track the subject; however, it will expand the area in which the subject will remain focused should it BRIEFLY leave the initial focus point. If the subject leaves the selected number of AF points, then the camera will refocus. If the subject leaves the initial focus area and enough time has lapsed before the subject is recentered, the camera will refocus. If peripheral data from the initial target area has enough of a difference from the initial target (unspecified), then the camera may refocus as well. This is not the intent of the function, but it may happen at times.
    Choose the numbered area based on your ability to keep the initial AF area on the subject, and also the expected movement path, and always try to follow and center your subject during the burst. If the camera does refocus, and it may, then either let go of the button, reacquire the subject in the center AF area, and continue firing, or, depending on the quality of the subject (the ability of the AF sensor to grab and hold), just keep firing and the lens will refocus on its own. Success is dependent on a combination of subject contrast, user skill, and the speed of the user and the subject. If you want to allow the AF area to track your subject around the frame, then select 3D or Auto. 3D will follow the subject around using a single AF point and Auto will use several points. The intended performance improvement, again, was for subjects moving toward or away from the camera using information from surrounding points; that is one area where the enhancements were noted during testing with this new system.

    (See more on this response here.)

    Conclusion

    It appears that Nikon chose to change the way dynamic-area focus works, but the majority of photographers who have used it don’t like it (including me). You need to use the “exif tool” to query what the focus points are doing (Capture NX-D, for instance, doesn’t give you a clue). There is no official Nikon documentation noting the change or how to cope with it, as of this writing. I’d recommend that you try setting the “Blocked Shot AF Response” to the longest allowable (5), so that the camera waits a little longer before it picks a different focus sensor. Most photographers seem to prefer “Group Autofocus”, although be aware that this mode will always pick the nearest subject. #review
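    If you want to check this behavior on your own bursts, ExifTool can dump the focus-point information that Capture NX-D hides from you. A minimal sketch; “PrimaryAFPoint” and “AFPointsUsed” are the tag names ExifTool typically reports for Nikon bodies, but verify them against your own files before trusting the output.

        # af_points.py -- dump the autofocus tags from a burst of NEFs so you
        # can see which points the camera actually used frame by frame.
        # Requires ExifTool on your PATH; folder name is a placeholder.
        import glob
        import subprocess

        for nef in sorted(glob.glob("burst/*.NEF")):
            out = subprocess.run(
                ["exiftool", "-PrimaryAFPoint", "-AFPointsUsed", nef],
                capture_output=True, text=True).stdout.strip()
            print(nef, "->", out.replace("\n", "; "))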

  • Yet another MTF explanation article

    Lens resolution and contrast are discussed in so many different ways that they leave most photographers dazed and confused. It’s time for my own two cents (or whatever that means to you at your own exchange rate). Most MTF discussions either quickly degrade into theory or never leave theory in the first place. I want to keep it real, with actual measurements and pictures.

    “MTF50” Charts

    So how do I arrive at my “resolution” measurements? I use “MTF50”, measured over the entire camera sensor. The Modulation Transfer Function I’m using (MTF50) measures how close black/white line pairs can get before they lose 50% of their contrast. As lines get skinnier (and closer), black lines start to get some white ‘contamination’ on their edges, and white lines start to get some black contamination on their edges, too. When black turns 50% gray and white simultaneously turns 50% gray, because this contamination from either side of a really skinny line meets in the middle, that’s the “MTF50” condition.

    The contrast (MTF) being measured is defined as:

    (“darkest” – “lightest”) / (“darkest” + “lightest”)

    On this scale, a pure black line measures 1.0 and a pure white line measures 0, so a perfect lens would have an MTF of (1 - 0)/(1 + 0) = 1.0 no matter how skinny the lines were (ignoring the finite wavelength of light and diffraction effects). If you were shooting black/white line pairs and your lens lost only 5% of the contrast, then you’d have an MTF of (0.95 - 0.05)/(0.95 + 0.05) = 0.9.

    In truth, the lens and the camera sensor both contribute to the loss of contrast. Since it’s more useful to have a camera attached to that lens, the MTF50 measurements shown below are a combination of lens effects and sensor effects. Also, the two-dimensional MTF50 plots show what’s going on over the entire camera sensor, from corner to corner. At least half of the cameras out there have an “optical low-pass filter” (OLPF) over their sensors, which fuzzes the image a little to avoid the moiré effect. Since the resolution measurements are performed without any sharpening, this has a slightly negative impact on the results.

    When you can get beyond about 30 line pairs per millimeter on the camera sensor before the lines drop to 50% contrast, then you have what most people consider good lens resolution. 30 line pairs per millimeter are really, really skinny lines. Keep in mind that 30 lp/mm is good unless you substantially enlarge an image, so “DX” sensors need 1.5X more resolution than “FX” sensors for the same-size print.

    The resolution measurements below are separated into “meridional” and “sagittal” directions (see the plots below), measured over the entire camera sensor. Think of the “sagittal” direction like spokes on a wheel, where the center of the lens is the hub. The rim of the wheel, where the spokes attach, is the meridional direction (perpendicular to the spokes). Lenses invariably resolve lines less well in one direction or the other; when the sagittal and meridional resolutions differ, you get astigmatism.

    If you’re interested in the total resolution in a picture, you need to take the “line pairs per millimeter” resolution value and multiply by how many millimeters tall the sensor is. This value gets you “line pairs per picture height”, or:

    lp/ph = (“MTF50 lp/mm”) * (“sensor_height_mm”)

    Bigger camera sensors have more millimeters in them, so you get more total picture resolution than from a small sensor.
    Besides megapixels, there are factors like focus repeatability, optical low-pass filters, air turbulence (heat shimmer), shooting distance, ambient light level, the aperture setting, and camera vibration that can muddy the resolution waters in a hurry. Sensor noise is a factor too, but that’s beyond the scope of this article. Most manufacturer MTF plots are shown at the widest lens aperture. The MTF measurements can be dramatically higher when you stop a lens down, but there are limits to the increase in resolution you can get that way.

    A piece of a lens resolution chart photo

    The photo above shows a section of a resolution chart that has been analyzed by a program. It has little blue numbers on top of every edge that has been measured. The measurements shown are in units of “cycles per pixel”, which means how many light/dark transitions happen per sensor pixel (less than one transition per pixel). The pictured edges with lower values on them are fuzzier than the higher-valued edges. You’ll notice a pattern: the edges aligned in the sagittal direction are consistently fuzzier than the neighboring meridional-direction edges. To get a better idea of lens performance, you need to take resolution measurements at literally hundreds of locations all across your camera sensor.

    Close-up on a square. Numbers are cycles/pixel.

    What are the Limits of Resolution?

    The Luminous Landscape website has an interesting discussion of what resolution camera sensors and lenses are capable of producing. That link is here:

    You have probably heard the lens term “diffraction-limited”, but what exactly does that mean? When light passes an edge, like the edge of a lens diaphragm, it diffracts. If your lens is significantly stopped down, the diffraction gets huge. But how huge? The link above mentions that for MTF50, an aperture of f/1.4 could theoretically produce 494 lp/mm. No real lens is anywhere near this limit. At f/16, however, the theoretical limit is only 43 lp/mm! These numbers are for “yellow-green” light. By f/22 the limit plunges to 31 lp/mm. Many lenses are good enough to resolve more than 43 lp/mm at 50% contrast, but at f/16 and beyond, they never will. These lenses are “diffraction-limited”. On the camera side of things, a sensor has a Nyquist limit, beyond which it won’t record higher resolution (see below for that discussion).

    The Lens

    I did my testing with (a single copy of) the Nikkor 85mm f/1.4 AF-S lens. This lens has pretty good street cred, and I didn’t want questions about quality entering into the equation. As an aside, I thought I’d mention that “lenstip.com” reviewed (a single copy of) this lens on a D3X (24.5 MP) “FX” camera, and found no better than an MTF50 of 30 lp/mm at f/1.4 at the center of the lens. My copy measures between 32 and 42 lp/mm at f/1.4 in the lens center, depending on the camera. Its corner measurements on an FX sensor are as high as 27 lp/mm at f/1.4. I think that LensTip got a bad copy; I don’t think they have sloppy technique.

    The Software

    I make all of my resolution measurements using the MTF Mapper program, whose author is Frans van den Bergh. His software and printable test plots are available here. I’m using version 0.6.7 of mtf_mapper_gui.exe for these tests. I have an “A0” test chart (33” X 47”) printed on quality glossy paper, dry-mounted onto foam-board. This allows me to be about 16 feet from the resolution target and still fill the frame on DX when using the 85mm lens.
    I wanted to shoot at realistic distances, but not let air turbulence (think heat shimmer) enter into the mix. You must use software to evaluate resolution; it’s far pickier than you are, and totally repeatable. You also need software to properly evaluate focus when calibrating your phase-detect system, which the same MTFMapper program can do, although you need a different target for that.

    The Technique

    Before I discuss any test results, I’d like to mention that I think the biggest factor in measurement reliability is auto-focus variation. I used live-view, contrast-detect focus throughout. Results show the “best” resolution measurements I got, but the MTF50 results often vary by about 2 lp/mm from shot to shot. I focus in between every shot. The camera stops focusing when it thinks it’s “good enough”, so there is always some amount of variability in focus. Some people place their cameras on a moving platform and shift focus by progressively changing the subject distance between photos of the test chart.

    The next-biggest factor in spoiling resolution is camera motion. I use a big and heavy tripod in all testing, along with a remote release and “mirror-up”. Short of mounting your camera on a granite slab, however, you’re always going to experience some amount of camera shake from the shutter motion – except when your camera has an electronic front-curtain shutter (EFC) like the D500. If you have it, use it; I’m convinced that it got me about 2 lp/mm of extra resolution. Shutter speeds were all around 1/1600s (the EFC is limited to 1/2000s). Take the photos at the camera’s base ISO; you don’t want sensor noise to be a part of the test. The tested lens has no vibration reduction; if it did, I’d turn it off for testing.

    The Camera

    Your camera sensor will affect your MTF measurements, as I already mentioned. Another influence on resolution is called the Nyquist limit. Your measured resolution can’t go higher than this value. I show some camera Nyquist limits below.

    D5000: 4288 X 2848, 23.6mm X 15.8mm, 5.5 micron pixel, 12.3MP, OLPF, Nyquist 90.1 lp/mm
    D7000: 4928 X 3264, 23.6mm X 15.6mm, 4.78 micron pixel, 16MP, OLPF, Nyquist 104.6 lp/mm
    D500: 5568 X 3712, 23.6mm X 15.7mm, 4.22 micron pixel, 20.9MP, no OLPF, Nyquist 118.2 lp/mm
    D610: 6016 X 4016, 35.9mm X 24.0mm, 5.95 micron pixel, 24.0MP, OLPF, Nyquist 83.7 lp/mm
    D7100: 6000 X 4000, 23.6mm X 15.6mm, 3.92 micron pixel, 24.0MP, no OLPF, Nyquist 128.2 lp/mm

    The Nyquist sensor resolution is: (pixels high / height_mm / 2) lp/mm. If a lens has better resolution than the sensor Nyquist limit, then that extra resolution won’t get recorded.

    Now, the dreaded MTF50 math

    The program measures the target edges, then converts the “cycles per pixel” into “MTF50 lp/mm”. This number of cycles is measured at a contrast of 50%.

    MTF50 lp/mm = cycles_per_pixel * height_pixels / height_mm

    For instance, the photo above shows a couple of “0.19” c/p measurements for this D610 (4016 pixels tall, 24.0 mm tall):

    MTF50 lp/mm = 0.19 * 4016 / 24.0 = 31.8

    (Pretty awesome for f/1.4 near the corner of the photo!) The plots show how many line pairs per millimeter can be resolved before they reach the 50% contrast threshold, over the whole two-dimensional camera sensor.

    Stop the lens down for dramatically better resolution

    The most common “MTF” chart style

    The “MTF10,30” plots show the lens measurement data in a different way. These are the plots most people are familiar with.
    The plots have lines that show “contrast” averaged over the lens, moving from the lens center (on the left) to the lens edge (on the right). The contrast is calculated with the same formula as above, using a couple of different sets of line frequencies (thicknesses). The chart plots are separated into 10 line pairs/mm and 30 line pairs/mm, in both the sagittal and meridional directions. The vertical contrast scale goes from 0 to 1.0, where 1.0 represents 100% contrast. Some manufacturers use the terms “radial” and “tangential” instead of sagittal and meridional, but they mean the same thing. What you get, then, is the measured contrast for relatively thick lines (10) and thinner lines (30). The “10” is considered the lens contrast, and the “30” is considered lens resolution.

    The MTF10,30 plots are traditionally shown at the lens maximum aperture only. These plots can be a little underwhelming, especially when compared to a lens at its optimum aperture. I think these “10-30” plots are much less informative than the “MTF50” plots in regards to resolution measurement, but they let you compare lens measurements to the same style of plots that most camera companies publish. Except for Leica and Zeiss (and possibly Sigma), the plots that the camera companies publish are “theoretical” and not actually ever measured. To me, this is “blowing smoke you know where”. I think of these plot types as a decent way to evaluate lens astigmatism, but not “resolution”. You need two-dimensional data to really know how a lens performs.

    A wide-open (f/1.4) MTF plot, D610 and 85mm f/1.4 lens

    85mm f/1.4 lens stopped down to f/4.0. D7100 camera.

    Summary

    There are several ways to show the resolution of a lens. The worst way would be a single number. A better way is to show what the whole camera sensor sees, in two dimensions. An even better way is to show two-dimensional measurements that also segregate the sagittal and meridional information. Better yet, gather this information at the different aperture settings. #howto
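    To make the arithmetic above concrete, here are the three unit conversions from this article in a few lines of Python, checked against the worked D610 example:

        # mtf_units.py -- the unit conversions used in this article.
        def mtf50_lp_mm(cycles_per_pixel, height_pixels, height_mm):
            # cycles/pixel -> MTF50 line pairs per millimeter
            return cycles_per_pixel * height_pixels / height_mm

        def nyquist_lp_mm(height_pixels, height_mm):
            # sensor Nyquist limit: (pixels high / height_mm / 2) lp/mm
            return height_pixels / height_mm / 2

        def lp_per_picture_height(mtf50, height_mm):
            # lp/ph = MTF50 lp/mm * sensor height in mm
            return mtf50 * height_mm

        # The D610 example: 0.19 c/p on a sensor 4016 pixels and 24.0 mm tall.
        print(mtf50_lp_mm(0.19, 4016, 24.0))      # -> 31.8 lp/mm
        print(nyquist_lp_mm(4016, 24.0))          # -> 83.7 lp/mm (D610 Nyquist)
        print(lp_per_picture_height(31.8, 24.0))  # -> 763 lp/ph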

  • A Better Way To Test Fisheye Lens Resolution

    I had tried to measure my Rokinon 8mm fisheye lens resolution a while ago, but I couldn’t get very good answers. The link to that article is here. The MTFMapper program I use to measure lens resolution is designed to look for little squares in a test chart to make measurements. A fisheye lens distorts the chart squares until they no longer resemble squares (they look like a rhombus shape), and hence previous program versions skipped measuring them. The MTFMapper author, Frans van den Bergh, has been working on this fisheye measurement problem. He has, of course, come up with a solution (or else I obviously wouldn’t be doing this article, duh). The fix is to un-distort the photograph of the test chart, so that the little squares in the resolution chart photo look like squares once again. Simple. Unless you had to do it.

    I’d like to show some screen shots of the new MTFMapper program, version 0.6.5, that demonstrate how to configure the software to measure fisheye lens resolution. I always use raw-format, un-sharpened pictures of the test chart(s) that MTFMapper requires for proper measurements. The program uses the “Dcraw” program under the covers, which is constantly updated to understand new “raw” format files.

    The un-manipulated 8mm fisheye chart picture (18” away)

    Note that the chart shown above is the “vintage” resolution chart. There are newer chart designs available, but the latest MTFMapper version still accepts this older design.

    Old versions of MTFMapper couldn’t measure the chart sides

    As you can see above, the highly-distorted chart was just too much for older MTFMapper versions. Frans worked on adding the image manipulations to (mostly) eliminate the fisheye effect. His program needs to be told by the user that the chart shot needs distortion correction, and this is done in the “Settings” dialog.

    Distortion correction in the Settings dialog

    Different fisheye lenses use different optical “projection” formulas. The Rokinon 8mm lens uses “stereographic projection”. The options in the dialog include “none” (most lenses), “radial”, “equiangular”, and “stereographic”. If you don’t know the projection formula of your lens, you can just experiment with the options. You will need to enter the actual focal length of your lens, unless you choose “none”.

    The MTFMapper “Annotation” dialog

    The picture above shows the measurements of the chart squares in the “Annotation” dialog, incorporating the distortion correction. Note that it’s not perfect, but the program is able to readily measure all target squares now. The slight remaining barrel distortion doesn’t affect the MTF50 measurement accuracy. The sides of the little squares are now straight enough for MTFMapper to work its magic.

    The “Profile” dialog. This is a good lens!

    If you’re familiar with typical lens MTF50 measurements, then the above measurements should impress you. Although this Rokinon 8mm isn’t very good wide open, f/8 and beyond show that this lens is capable of amazing resolution.

    The “2D” grid MTF50 lp/mm measurements

    The resolution “fingerprint” of this lens is quite unique. This is a “DX” lens; the results above are from a Nikon D7100 (3.92 micron pixels).

    The “Lens Profile” MTF10 and MTF30 measurements

    Conclusion

    The new MTFMapper V0.6.5 is a resounding success at tackling the fisheye lens measurement problem. I realize that not many people own fisheye lenses, so this new feature will probably have a limited audience. If you need it, you know who you are.
I thought I’d mention that I also did an article here that discusses how you can effectively convert your fisheye images into a regular super-wide rectilinear lens (I used Lightroom lens profile corrections). This technique could also get you chart photos that MTFMapper could use, but it’s best to stick with un-sharpened raw pictures when measuring resolution. If you’re interested in getting this free program, take a look here. I made an article that gives a simplified explanation of its use here. If you like this program as much as I do, then please let Frans know! #review
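    For reference, the projection options mentioned above correspond to standard fisheye mapping functions. The formulas below are the textbook forms (image radius as a function of the angle off the lens axis); this is my gloss on the Settings options, not MTF Mapper’s actual source code, and the “radial” option is a more general polynomial model that I won’t sketch.

        # projections.py -- textbook lens mapping functions: image radius r
        # (mm) as a function of off-axis angle theta (radians) for focal
        # length f (mm). My gloss on the Settings options; MTF Mapper's
        # internal model may differ in detail.
        import math

        def rectilinear(f, theta):    # "none" -- ordinary lenses
            return f * math.tan(theta)

        def equiangular(f, theta):    # a.k.a. equidistant fisheye
            return f * theta

        def stereographic(f, theta):  # what the Rokinon 8mm uses
            return 2 * f * math.tan(theta / 2)

        # 45 degrees off-axis on an 8mm lens under each projection:
        t = math.radians(45)
        print(rectilinear(8, t))    # ~8.00 mm
        print(equiangular(8, t))    # ~6.28 mm
        print(stereographic(8, t))  # ~6.63 mm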

  • Using MTF Mapper 0.6.3 New Features

    The MTFMapper program that I use to measure lens resolution is constantly improving and learning new tricks. I thought it would be a good time to show you some of its new features that aren’t too obvious to the casual user. If you aren’t familiar with MTFMapper, then I suggest you take a look at a previous article I wrote about it here. The program is authored by Frans van den Bergh, and is roughly the equivalent of the Imatest program, except that it’s free. The features I’m going to show you are in the new Windows 64-bit version 0.6.3. This version allows the use of very large files that are beyond the limits of Frans’ 32-bit versions.

    One thing I should mention is that I am still using the previous design of the lens resolution target. The new MTFMapper program still accepts it, but there is a newer chart design that is slightly more accurate at the very center of the chart (due to the “hourglass” shape of the middle target) and is more forgiving of chart rotation errors. I made a giant “A0” print of the earlier resolution target file on quality glossy paper and had it subsequently dry-mounted into a picture frame that I can hang where I want. The target files are also supplied by Frans at his web site (consult the link above).

    My “A0” size vintage resolution chart design

    It was a bit of a pain, and pricey, to make my large, mounted, and framed chart. I haven’t yet managed to talk myself into abandoning it. I did print and mount a smaller new-style resolution chart that I use when I can get close enough to fill the frame with some of my lenses.

    Newer resolution chart design (with measurements on it)

    I should also mention that bigger is better. Obviously. In this case, I’m referring to the resolution target. To get truly useful results, you want to take lens resolution measurements at the same distance from which you shoot regular photographs. For my Sigma 150-600mm lens, this means from about 60 feet away. You need a pretty big chart to be able to do that. For wide-angle lenses, it takes a huge target to get very far away from it and still fill the image frame (which once again means you need a big chart). You get the picture. Everybody by now knows I can’t resist using puns at every possible opportunity.

    A few program versions back, Frans changed MTFMapper so that it no longer sets an automatic threshold value for locating target edges. As a result, you’ll probably need to change the value the program would otherwise use. For my own photographs, the ideal threshold value is 0.2. I set this in the “Settings, Preferences” dialog, which is the same place where you need to tell MTFMapper the size of your camera’s pixels. If you fail to do this step, you may discover that it refuses to measure many target edges.

    MTFMapper 'Settings' dialog showing the plot 'Scale' slider and 'Threshold' value

    In the picture above, note that I set the “Threshold” value to 0.2. For my photographs, this gets me the same results that the older versions of the program produced via their “auto-threshold” functionality. Note that I typically set my camera meter to use exposure compensation of about “+0.7”, to get the chart white values to look reasonable in a photograph. Even in this day and age, camera meters can be a little stupid; the photographer still needs to supply the brains. The “Threshold” value tells the MTFMapper program how much of a contrast change is required to consider what it sees a valid target edge.
    By the way, the "Arguments:" box shown above is where you can type in custom arguments for the MTFMapper program, which are commands that look like "--myargument myvalue".

    Measurement Plot Scaling

    The first new MTFMapper feature I want to discuss is measurement plot scaling. For several program versions, the default has been to start the resolution measurement plot scale at zero. My own preference is to “auto-scale” the plot, so that chart resolution values stay strictly within the range of actual measured values. Your own preferences may differ. Frans now lets the user decide, so you’ll find a slider in the “Settings” dialog called “3D plot z-axis relative scale factor”. If you slide it all the way to the right, it will “auto-scale” the plot; if it’s at the left side, the scale will start at zero instead. You can see in the picture above how I have the slider set. By the way, the slider is used for both the “2D” grid plot and the “3D” grid plot scaling.

    Auto-scaled 2-D plot of the resolution target

    3D Grid (Meridional) plot using "auto-scale" for an 85mm lens at f/2.8

    3D Grid (Meridional) plot using "zero-scale" for an 85mm lens at f/2.8

    The pair of plots above demonstrates the difference between auto-scale and zero-based scale. I prefer the auto-scale, because it maximizes the differences in measurements and also shows, in a simple way, what the minimum resolution measurement value is. The zero-based scale makes it too difficult for me to determine the minimum measured resolution values. I always select the “Line pairs per mm units” option in the Settings dialog; otherwise, the program will use “cycles per pixel” units.

    Automatic MTF Curve Plotting for Any Measured Target Edge

    The next new MTFMapper feature I want to cover is the ability to dynamically produce a chart of a selected measured edge of a square from the “Annotated” picture of the resolution target, as seen below.

    MTF Curve from clicking on a resolution measurement value

    As you can see in the picture above, a plot showing contrast and frequency data for a single edge measurement in your “Annotated” picture can be generated with a single mouse-click. You merely find the edge of interest and click on the “cycles per pixel” (c/p) value to get the plot. The Annotated picture always displays “cycles per pixel” measured values, even when you have set “lines per mm” as your plot units. This plot can show you the frequency measurement for the entire MTF range (not just MTF50). To get the answers, you slide the vertical gray bar along the horizontal “Frequency (c/p)” direction, and the chart will update the “contrast” value, which is also the MTF value if you multiply it by 100. Once the curve dialog is displayed, you can click on another edge measurement to replace the plot. You can also hold the shift key and select an additional edge measurement to compare, if you wish, such that both edge measurements are plotted together (in two different colors).

    Chart shows a frequency of .308 cycles/pixel at MTF30

    Conclusion

    Frans is busy working on new features and refining existing ones all the time. If you want to track his progress, you might be interested in this link. Beware that most of Frans’ blogs are heavily dosed with matrix algebra and fast Fourier transforms; they’re not for the faint of heart. If you have an interest in squeezing the maximum quality out of your camera gear, then you should probably try out this software.
It will enable you to get the best possible focus calibration and resolution measurements from your cameras and lenses at a minimal cost to you. But only if you take the time to print and mount the quality target files that Frans has designed and provided. #howto

  • Make Manual Exposure Automatic

    It probably hasn't occurred to many photographers that they can automate exposure while in "manual" mode. Although this sounds like an oxymoron, it's really not. What if you would like a specific aperture, to control either resolution or depth of field? And you'd also like to control the shutter speed, to work well with lens vibration control or perhaps freeze action? But you don't want to give up automatic exposure. You can have it all simply by switching to "Auto ISO". Now, with the camera set to manual, you can control the aperture and shutter, while the camera adjusts the ISO to suit the required exposure. There's a variation on this theme when you use a flash in manual mode; that topic is discussed in an earlier article I wrote here:

    For skeptics, there's a way to prove that you still get the correct exposure using the technique described above. It's called a histogram. Simply chimp the shot and verify that your histogram looks correct.

    Sample camera Auto ISO menu option

    Along with this extra power comes extra responsibility, of course. If you truly want “manual” exposure, then remember to turn Auto ISO OFF… #howto
    There's a variation on this theme when you use a flash in manual mode; that topic is discussed in an earlier article I wrote here:

    For skeptics, there's a way to prove that you still get correct exposure using the technique described above: the histogram. Simply chimp the shot and verify that your histogram looks correct.

    Sample camera Auto ISO menu option

    Along with this extra power comes extra responsibility, of course. If you truly want "manual" exposure, then remember to turn Auto ISO OFF… #howto

  • Keep Using Capture NX2 with Raw Format

    So what do you do when you get a new Nikon camera and discover that Capture NX2 barfs on your .NEF files? There may be no need to panic.

    This is definitely a niche article. I am an admitted Capture NX2 holdout, and I know there are others in the resistance movement out there, at least in other countries. I refuse to abandon either raw format or Capture NX2. It's well known that Capture NX2 doesn't understand raw files from the newer Nikon cameras. Maybe you, like me, don't want to switch to TIF or JPEG just to keep using Capture NX2.

    To find out if you're in luck, check whether you have a copy of Capture NX2 Version 2.4.6 (not the final release, which is 2.4.7). No? You might have a lot of difficulty finding it, and Nikon won't be of any help here. I'm a packrat when it comes to computer disk backups, so I had pretty much every Capture NX2 version saved.

    Next, you need to download the (free!) "Raw2NEF", created (and maintained as of this writing) by Miguel Bañón. His download site is here: Miguel supports both Sony and Nikon cameras, and the (growing) list of supported cameras is posted at his site.

    What's the catch? Well, there are two. First, this slightly disrupts your workflow: now you have to run his program and do the conversion before using Capture NX2. Second, the NEF files that Raw2NEF creates are about triple the size of your original compressed NEF files. You can always "zip" the folder of converted files later, to save about 33% on disk. You don't have to worry about Raw2NEF altering your original files; it just makes new files (with a CNX2_ prefix) where you tell the program to put them. The conversion takes only about 1.5 seconds per shot on my own computer.
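    Since the converted files are roughly three times larger, that "zip it later" step is worth automating. Here is a small Python sketch; the folder locations are hypothetical, and the CNX2_ prefix is the one Raw2NEF applies to its output.

```python
import zipfile
from pathlib import Path

# Hypothetical locations; point these at your own folders.
converted = Path("D:/photos/converted")          # where Raw2NEF wrote its output
archive = Path("D:/photos/converted_nef.zip")

# Pack every converted NEF into one deflate-compressed archive.
with zipfile.ZipFile(archive, "w", compression=zipfile.ZIP_DEFLATED) as zf:
    for nef in sorted(converted.glob("CNX2_*.nef")):
        zf.write(nef, arcname=nef.name)          # store the bare file name

print(f"archived {archive.stat().st_size / 2**20:.0f} MiB")
```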
    There are other options to fine-tune the conversion process, but I just select the input folder and the output folder, and then click convert. The program can recursively run through sub-folders during the conversion, but it places all of the converted files into a single directory, so just be aware of this. If you have Photoshop CS6, you can later select the "CNX2_" files and convert them into Photoshop-compatible files; click the "Adobe Photoshop" button and browse to where Photoshop.exe is located before doing that conversion.

    The Raw2NEF interface

    Thank you, Miguel! And long live Capture NX2. #howto

  • Convert Your Fisheye Lens into a Regular Superwide

    Have you ever had to decide which wide-angle lens gets to go on your trip? The loser is usually the fisheye. Fisheye lenses are just a little too specialized, so they often fall into benign neglect. What if you could magically turn that fisheye lens into a regular rectilinear super-wide whenever you wanted? Would you find that useful? Duh!

    I got caught flat-footed in keeping up with lens distortion correction technology. I tried the lens "profile correction" in Adobe Lightroom with my Rokinon 8mm fisheye, and my jaw dropped. Words can't do it justice, but pictures can. There are other software vendors, of course, that can do this task a little better, because they salvage more of the left and right frame edges. But I have Lightroom and I don't have those other tools. And believe me, there's plenty of image remaining after the lens profile corrections.

    Check out my Rokinon 8mm fisheye review here. I discussed a few post-processing operations to 'straighten' shots in that article, but the results left a lot to be desired. The images (at f/8 and beyond) are incredibly sharp. But the lines are curvy. Very curvy.

    Boring chart in Lightroom before lens profile correction, usual fisheye effect

    Boring chart in Lightroom with lens profile correction. Rectilinear super-wide!

    I am thoroughly impressed at how much barrel distortion is corrected. All of those little squares are square again. Yes, you lose some frame edges, but what's left still covers a huge angle. For all intents and purposes, your fisheye image now looks like it came from a typical rectilinear super-wide lens. I cannot guarantee that other brands of fisheye lenses will be corrected this well. You can consult the Adobe web site to see if a profile exists for your lens; the profiles are periodically updated for new lenses, and they are available in Photoshop as well.
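    If you're curious what a profile correction is doing geometrically, the heart of "defishing" is a change of projection: an ideal equidistant fisheye images a ray at angle theta from the optical axis at radius r = f*theta, while a rectilinear lens puts it at r = f*tan(theta). Below is a Python/OpenCV sketch of that idealized remap, assuming an equidistant fisheye model; Adobe's actual profiles add per-lens correction terms on top of a model like this, and the focal length and file names here are made-up illustrations.

```python
import cv2
import numpy as np

def defish(img, f_px):
    """Resample an equidistant-fisheye image onto a rectilinear grid.

    For each output (rectilinear) pixel at radius r from center, the ray
    angle is theta = arctan(r / f); the same ray sits at radius f * theta
    in the fisheye source, so we sample there.
    """
    h, w = img.shape[:2]
    cx, cy = w / 2.0, h / 2.0
    xs, ys = np.meshgrid(np.arange(w) - cx, np.arange(h) - cy)
    r_rect = np.hypot(xs, ys)
    theta = np.arctan2(r_rect, f_px)      # arctan(r / f), without divide-by-zero
    r_fish = f_px * theta                 # equidistant model: r = f * theta
    scale = np.divide(r_fish, r_rect, out=np.ones_like(r_rect), where=r_rect > 0)
    map_x = (cx + xs * scale).astype(np.float32)
    map_y = (cy + ys * scale).astype(np.float32)
    return cv2.remap(img, map_x, map_y, cv2.INTER_LINEAR)

# Hypothetical usage; f_px is the focal length in pixels (assumed, not measured).
img = cv2.imread("fisheye_shot.jpg")
cv2.imwrite("defished.jpg", defish(img, f_px=1600.0))
```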
    Lightroom steps to use lens profile corrections for the Rokinon:

    1. Select the "Develop" portion of Lightroom
    2. Select "Remove chromatic aberrations"
    3. Select "Enable Profile Corrections"
    4. Make: Rokinon
    5. Model: Rokinon 8mm f/3.5 UMC Fisheye CS

    Tokina 11mm. I used to think this was a pretty wide angle lens.

    Rokinon 8mm with Lightroom profile correction. Now that's wide.

    I chose the shot above to drive home the point that even though you can get super close and super wide, that doesn't mean it's always the best choice. The lens distortion is removed, but the extreme perspective isn't welcome in every shot; I consider this shot to show a lack of 'taste'. Now, imagine if a person were sitting in one of those near chairs. Scary. Even with liberal cropping, you can now go really, really wide.

    Rokinon 8mm with no profile corrections. Now that's a REALLY wide fisheye.

    I think these lens profile corrections are going to give my fisheye a whole new lease on life. All of those curvy lines can be made nearly dead straight. Claustrophobic spaces can be made to look immense. As with all super-wide lenses, you still need to watch out for perspectives that are too extreme. Shooting discipline is even more important here, such as keeping the camera level and at a 'respectful' distance from the subject. But imagine the world of options this technique can open up. Science and art live happily ever after once again. #howto

  • Focus-Stacking: Camera Hardware Suggestions

    In a previous article, I discussed software to accomplish focus stacking, but I glossed over the hardware that lets you get real quality results. I'm going to try to rectify that shortcoming in this article. I'm assuming you're interested in macro focus-stacking. For those unaware of what focus-stacking is: it involves taking multiple photographs, each focused at a slightly different depth, and combining them to get greater depth of focus. Macro photography is famous for suffering from paper-thin depth of focus. Optical physics is against you here, but software and digital photography come to the rescue.

    What you need to accomplish macro focus-stacking is flexible close-up gear. If you shoot Nikon and check web sites such as eBay, you should be able to locate a bellows setup like mine, the PB-4. I also use rings that let me reverse the lens and attach a filter to the reversed lens. Nikon stopped making bellows hardware decades ago, because the customer base was just too small to make it worth their while. Some camera bodies may not allow attachment to the bellows; it depends on how much overhang the "pentaprism" portion of the camera has. My D610, for instance, barely fits, but the D500 fits fine. Battery grips, however, don't allow you to connect or properly use the bellows. You may need to set the bellows into 'vertical-shooting' format to be able to mount the camera body; crank the camera-mount portion of the bellows to the rear-most position to mount the camera.

    I still use the vintage 55mm Micro-Nikkor f/3.5 lens, circa 1974. Think this old lens couldn't cut it today? Think again. With focus-stacking, you should always set the sharpest lens aperture (f/8 for the Micro-Nikkor). Remember, stacking will take care of depth of focus, so you don't need to stop the aperture down to get sufficient depth of focus in a single frame. When you get into magnifications greater than life-size, you should reverse the lens to get the best optical results. I use the BR-2 lens reversing ring (52mm). None of the other macro lenses I have access to has a 52mm filter thread, so lens-reversing isn't an option for them.

    Wind and vibration are the enemy, so you will get the best results in dead-calm conditions (such as indoors). I made a custom piece of hardware that fits into the end of my bellows unit; it has an alligator clip to hold small objects at whatever height and rotation I need. Because this clip hardware is connected to the bellows, image motion relative to the camera is virtually eliminated (the subject moves the same as the camera). The clip can also hold a little platform in front of the lens, allowing you to lay small subjects that aren't "clip-able" onto the platform.

    Lighting is crucial. I often use an LED ring light, which stays cool and provides perfectly even lighting. When my lens is reversed on the bellows, I attach a BR-3 ring (52mm filter thread) to the rear of the lens, and I can slip the ring light over the BR-3. The light won't fit larger-diameter lenses. Continuous lighting is really, really nice for seeing your subject well.

    I use a remote or wired release, and I set the camera up to make shooting a two-step procedure: the first release flips up the mirror and the second release trips the shutter. Electronic front-curtain shutter mode is ideal, if your camera supports it. I rotate the focusing knob on the bellows to shift the camera/lens combination toward the subject in increments of about 0.1mm for higher image magnifications. I'll typically take about 50 shots to stack.
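    Where do the 0.1mm steps and the roughly 50 shots come from? You can estimate both from the usual macro depth-of-field formula. Here is a minimal Python sketch, assuming a symmetrical lens, a circle of confusion of about twice the pixel pitch, and a made-up subject depth; treat the numbers as illustrations, not measurements.

```python
def stack_step_mm(f_number, magnification, coc_mm=0.01, overlap=0.2):
    """Suggested bellows step per shot for a focus stack.

    Total depth of field for a symmetrical lens at magnification m:
        DoF = 2 * c * N * (1 + m) / m**2
    The step is set slightly smaller than the DoF so that the sharp zones
    of adjacent shots overlap. A coc_mm of ~2x the pixel pitch is an
    assumed criterion for pixel-level sharpness.
    """
    dof = 2 * coc_mm * f_number * (1 + magnification) / magnification ** 2
    return dof * (1 - overlap)

# 55mm Micro-Nikkor reversed at f/8, around 2x life-size:
step = stack_step_mm(8, 2.0)
print(f"step = {step:.2f} mm")            # about 0.10 mm

# Shots needed to cover a subject ~5 mm deep (hypothetical depth):
print(round(5.0 / step))                  # about 52 shots
```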
    The zones of sharp focus in adjacent shots should overlap, and be careful never to change the image magnification while shooting a focus stack. Focus-stacking is mostly incompatible with living subjects at high magnification, since image motion is verboten. Some people claim they can chill insects enough to temporarily stop their movement. For the demonstration shots below, I found a recently deceased bee and a beetle to use as subjects. Due to the way stacking works, you won't want to frame your subject tightly: expect to lose about 20% of the image around the edges, which you'll need to crop out of the final stacked image.
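    That edge loss follows from how stacking software merges the slices: roughly speaking, it aligns the frames (whose magnification differs slightly from slice to slice), scores per-pixel sharpness, and keeps each pixel from the sharpest slice, and the regions near the edges don't overlap in every frame, so they get discarded. Here is a toy Python sketch of the sharpest-pixel merge, assuming pre-aligned, same-size images and hypothetical file names; real stacking programs are far more sophisticated.

```python
import cv2
import numpy as np

def naive_stack(paths):
    """Toy focus-stack merge: keep each pixel from the sharpest slice."""
    grads, imgs = [], []
    for p in paths:
        img = cv2.imread(p)
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        # Local Laplacian response as a per-pixel sharpness score
        lap = np.abs(cv2.Laplacian(gray.astype(np.float64), cv2.CV_64F))
        # Smooth the sharpness map so the selection isn't pixel-noisy
        grads.append(cv2.GaussianBlur(lap, (9, 9), 0))
        imgs.append(img)
    grads = np.stack(grads)               # (n, h, w)
    best = np.argmax(grads, axis=0)       # index of sharpest slice per pixel
    imgs = np.stack(imgs)                 # (n, h, w, 3)
    h, w = best.shape
    return imgs[best, np.arange(h)[:, None], np.arange(w)[None, :]]

# Hypothetical file names for a 50-shot stack:
# result = naive_stack([f"slice_{i:02d}.tif" for i in range(50)])
```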
    Reverse-mounted lens with ring light and clip to hold subject

    I don't have hardware to reverse my other macro lenses, so the setup above only applies to the 55mm Micro-Nikkor with its 52mm filter thread, and this ring light only fits 52mm or smaller diameters. Note how the hardware that holds the small subject is connected to the bellows; subject motion is no longer a problem in still air. The LED light provides perfectly even illumination without heating up the subject. For really gung-ho macro photographers, the PB-4 bellows provides both tilt and shift controls to manipulate the plane of focus and the perspective. Focus stacking pretty much eliminates the need to alter the plane of focus, though.

    Flash close-ups with AR-4 release, BR-4 diaphragm control

    The setup above demonstrates using a flash instead of a ring light. For lenses that cannot be reversed, I use this lighting arrangement if the subject is far enough away; at higher magnifications, the lens will eventually cast a shadow on the subject. Because this arrangement doesn't provide continuous illumination, the AR-4/BR-4 combination can be handy to keep the lens diaphragm open until you depress the plunger on the AR-4. This cable release was designed for cameras like the Nikon F2, where the second cable would connect to the shutter release.

    D500 with wired remote. Subject is lit up.

    Focus stack of 69 photos. 55mm Micro-Nikkor at f/8, Nikon D610.

    The demonstration photo above was made with the lens reversed and the ring light for illumination. Each shot was made in manual mode at ISO 100 and 1/3 second. You want the sharpest aperture and lowest ISO for this kind of shooting, and please, please use RAW format. Using good light and an optimal aperture, this stacking technique gives you an idea of just how good the 55mm f/3.5 Micro-Nikkor is. As I mentioned in another article, they used this lens for shooting the original Star Wars film, with good reason. Film cannot compete with this form of digital photography.

    I think that focus stacking is the perfect blend of art and science. Happy stacking. #howto