
  • Do Long Lenses Not Like Filters?

    I was trying to solve a mystery: why had my trusty Sigma 150-600mm seemingly lost its ability to resolve fine detail? I recently got a D500 and was using the Sigma while testing the camera's electronic front-curtain shutter feature (to totally eliminate vibrations). I noticed that the resolution measurements (made with the MTF Mapper program) were much lower than expected. Had I somehow bumped the lens and knocked the optics out of alignment? Was the camera focus system not functioning properly? Were the phase-detect and contrast-detect focus systems both out of whack? Had I accidentally left vibration reduction on while using a tripod? Am I getting sloppy and don't even know it? Other lenses weren't showing any resolving problems on this camera, but that only helped to show what the problem was not.

Out of desperation, I removed the 95mm Hasselblad UV filter from the Sigma. Like magic, the resolution numbers were back where I expected them to be (more than 20% higher). Are you kidding me? My premium Hasselblad filter is no good?

I don't have any other lenses that use 95mm filters, and I don't have any step-up rings that big, either. I finally figured out that I could use my 24-70mm lens with its lens hood to hold the filter via a friction fit inside the hood. I took shots of my resolution target with this filter in place (at 70mm), and then removed it and repeated the shots with no filter. The MTF Mapper program showed zero resolution difference with or without the Hasselblad filter! How is this possible? How can the Hasselblad show perfection on this lens and wreck the resolution on the Sigma?

I don't have any other long lenses to experiment with. I don't think most users of really big glass use filters at all, except for the kind that fit in the drop-in filter holders near the rear of the lens. Maybe there's a good reason they don't use front-mounted filters, aside from the big cost and added weight.

Let's take a look at some resolution test measurements. The following plots compare shots taken with and without the Hasselblad 95mm filter.

Sigma 150-600mm at 600mm with 95mm Hasselblad UV filter. Bad resolution!

Sigma 150-600mm at 600mm without UV filter. Much better resolution.

Note in the plots above that the peak resolution with the filter in place has an MTF50 of about 28 lp/mm. The plot of the photo taken moments later under the same conditions, but without the filter, has an MTF50 peak of about 36 lp/mm. That's a difference of about 22%. I repeated this same test a dozen times, with the same results. This would lead most people to conclude that the filter is absolutely terrible. But I'm stubborn. I investigated further.

Next, I'll show you some resolution test results using my Nikkor 24-70mm f/2.8 VR lens.

24-70 at 70mm f/2.8 with Hasselblad 95mm UV filter

24-70 at 70mm f/2.8 without filter

24-70 at 70mm f/2.8 without any filter

24-70 at 70mm f/2.8 with Hasselblad 95mm UV filter

Maybe you can tell a difference between the measurement results with and without the filter, but I sure can't. Based upon the tests with the 24-70 lens, I would have to conclude that there's nothing at all wrong with the Hasselblad filter. I was taught that even a medium-quality filter would have a negligible effect on lens resolution, although it might increase light reflections or decrease light transmission. This is a whole different ball game. I now have a strong suspicion that even top-quality front-mounted filters on really long lenses have a big negative impact on resolution.
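If you want to sanity-check that 22% figure, the arithmetic is trivial. Here's a tiny sketch using the MTF50 peaks read off my plots above:

```python
# Quick sanity check of the resolution loss, using the MTF50
# peaks read from the MTF Mapper plots above (lp/mm).
mtf50_with_filter = 28.0
mtf50_no_filter = 36.0

loss = (mtf50_no_filter - mtf50_with_filter) / mtf50_no_filter
gain = (mtf50_no_filter - mtf50_with_filter) / mtf50_with_filter

print(f"Resolution loss with filter: {loss:.1%}")  # ~22.2%
print(f"Improvement after removal:   {gain:.1%}")  # ~28.6%, i.e. "more than 20% higher"
```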
Take my experience with this filter/lens combination as a heads-up. I mainly use UV filters as cheap insurance against accidents, and because they're easier to clean than the front lens element. From now on, though, I'm just going to rely on the lens hood to protect this Sigma lens. If I ever get a short-focal-length lens with a 95mm filter thread, I wouldn't hesitate to use the Hasselblad filter on it. For long lenses, though, beware of filters over the front element. At the very least, do some careful testing before you decide to permanently park a filter in front of your big lens. I can't help but wonder how many people out there are disappointed in their long lens and don't know that the blame lies with their filter. #review

  • Does the D500 Automatic Focus Fine-Tune Calibration Work?

    Short answer: yes. But you know there's a catch. There's always a catch. So here it is: garbage in, garbage out.

The 'secret' to getting great results from focus calibration is multi-faceted. You must use the proper focus target, your camera must be very stable, you need consistent technique, you need to decide on a zoom setting, you need to pick a target distance, you may need to pick an aperture, and you must use good lighting. As much as people hate to hear this, it's a fact of life that measurement and calibration always involve statistics. Your gear isn't perfect, so you are guaranteed to get some variability in measurement. Focus calibration isn't a "one and done" scenario. Many measurements are required, and then you have to find the average value. You might also have to throw out the outlier readings.

The Nikon D500 "automatic" focus fine-tune calibration assumes that your live-view, contrast-detect focus is dead-on. It compares the distance at which your phase-detect focus system decided to focus on a target to the distance at which the contrast-detect focus system focused on that same target, and then calculates the fine-tune value that would shift the phase-detect focus to match the contrast-detect focus.

Depending on which direction (near-to-far versus far-to-near) your lens travelled to obtain focus, you may get a different calibration answer. Depending on what your selected focus point 'sees' as the subject, you may get a different answer. Depending on your selected lens aperture (mostly with high-speed lenses), the answer may vary. Depending on the zoomed focal length, you can get a different answer (think parfocal optics). Depending on the target distance, you can get a different answer. In dim light, you can get a different answer. The physical lens focus mechanics have some slop. So don't expect miracles here.

You need to decide on your favorite zoomed focal length, distance, and aperture to use while conducting the test (depending on the type of lens being measured, of course). You need to use a proper focus target that leaves absolutely no ambiguity about what the "target" really is, from the standpoint of your focus sensor.

Sigma is savvy to these ugly facts of optics life, and provides its newer lenses with the ability to calibrate zooms at multiple focal lengths and multiple focus distances. Its (inexpensive) USB dock and free software let users reprogram the lens firmware with this calibration information, in addition to letting users select focus algorithms, upgrade firmware, select anti-vibration modes, and more. Nikon is not savvy, but I digress.

I use a specific focus target, rotated to a 45-degree angle, to calibrate focus. The target design, and its associated analysis software, was created by Frans van den Bergh. I wrote an article explaining his software and focus target here.

The recipe for a single calibration measurement on a D500 goes as follows:

⦁ Lens VR OFF
⦁ Focus mode set to AF-S
⦁ Select the center focus point
⦁ Camera/lens on a sturdy support, pointed at the focus target
⦁ AF fine-tune ON in the setup (wrench) menu
⦁ Live View mode
⦁ Normal-area AF
⦁ Focus on the target, with the focus point centered on an edge illuminated by bright light
⦁ Press the "AF Mode" and "Movie Record" buttons simultaneously and wait patiently for about 3 seconds. Don't wiggle the camera…
⦁ Highlight "Yes" and press "OK" when prompted. The calibration value will be written for the lens in the usual AF fine-tune menu location.

Rinse and repeat.
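Since several runs are required, and (as described below) you'll be averaging them and throwing out wild readings, the bookkeeping is easy to script. Here's a minimal sketch; the readings are hypothetical examples, and the outlier rejection rule is my own arbitrary choice, not anything Nikon specifies:

```python
import statistics

# AF fine-tune values recorded after repeated auto-calibration runs
# (hypothetical example readings; write down your own from the menu).
readings = [-4, -3, -5, -4, -12, -3, -4]

median = statistics.median(readings)
# Discard wild readings: here, anything more than 4 counts from the
# median. The threshold is arbitrary; pick one that suits your lens.
kept = [r for r in readings if abs(r - median) <= 4]

fine_tune = round(statistics.mean(kept))
print(f"Discarded {len(readings) - len(kept)} outlier(s)")
print(f"Enter this AF fine-tune value: {fine_tune:+d}")
```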
The focus target in the MTF Mapper program, showing measurement labels on the squares

Note in the photo above that the camera focus point is centered on the right-hand vertical edge of the large black target. There is no ambiguity about what it is focused on; the focus point can only see a single vertical high-contrast edge. The MTF Mapper program measures the resolution of every black square it finds, and lets the user easily see where the optimal in-focus squares are located relative to that right-hand vertical edge of the large central target. Using this software, you can see exactly where the sharpest edges landed in each test you make (the edge labels eliminate the judgement calls you'd have to make with an un-labelled photo).

After each AF fine-tune measurement is done, write down the value saved in the AF fine-tune menu for the lens being used. Take several measurements and average their values. You may have to throw out any wild readings first. These readings will give you a feel for the natural focus variation of the lens. I like to manually change the focus to alternate between near and far before each test, prior to initiating AF-S auto-focus. This lets you explore any bias the lens has for focus direction. Manually enter the averaged fine-tune value for your lens in the AF fine-tune menu. This will give you the best "typical" focus result.

An MTF Mapper "profile" plot of the focus chart photo

Note the "AF Tune Value" displayed in the plot above. The D500 focus calibration feature automatically saves this value in your camera. The MTF Mapper program extracts the value from the focus chart photo's EXIF data and adds it to the plot.

Conclusion

I used this automatic fine-tune calibration feature dozens of times, and the fine-tune value tracked the actual phase-detect focus error quite well (within about 1 or 2 counts). The problem is that it can only calibrate against a moving target: your lens's natural focus variation will prevent you from ever getting "the" calibration answer. In my own testing, I would get an auto-calibration tune-value variation of about plus/minus 3 for a typical lens. This is roughly the same as my own best manual calibration efforts, which would get within about 1 or 2 counts of what the MTF Mapper measurements determined to be "best".

I have read reports that some users get terrible calibration repeatability, but I suspect that is largely due to using a poor focus target and/or sloppy technique. I don't think phase-detect can ever compete with Live View focus accuracy across the board, because it cannot address issues related to focus shift with zooming, focus shift with distance, or spherical aberration effects. Nonetheless, I applaud Nikon for adding this feature. Now if they could match Sigma's lens firmware capabilities, they'd really be the leader of the pack. By the way, even the Samyang (Rokinon) company has now added the same firmware-programming features that Sigma offers.

Oh, and one thing I'd definitely change about the auto-calibration is the two-button-press thing. There's no way to keep things rock-steady doing that. They need to make it an operation you can do with a remote trigger or timer, so that you don't have to jiggle the camera. #review

  • Infrared Photography and the Nikon D500

    How does the D500 stack up shooting infrared? The last pair of Nikon cameras I evaluated for infrared were big disappointments: the D7100 and the D610. They had terrible internal reflections that would ruin most infrared shots, a problem that requires the DK-5 viewfinder eyepiece blocker to work around. A link to those results (and the good D7000 results) is here. The D500, on the other hand, is excellent for infrared. Ironically, this camera has a built-in eyepiece shutter, although it is probably only needed to help with exposure measurement accuracy on a tripod. I use a Hoya R72 filter for infrared shooting.

Something I consider very ironic: probably one of my best lenses for shooting infrared is the lowly 18-55 kit zoom (I'm using a VR model). Most other wide-angle lenses I have tried either have a nasty hot spot in the middle or aren't as sharp. I pretty much park the lens at 18mm; if it went wider, I'd zoom wider. Here's the best site I know for evaluating which lenses work for infrared, and the f-stop range that works for each. Even my 24-70mm f/2.8 VR 'E' zoom is crap for infrared. As a little side note, many lenses perform fine with infrared until you stop them down too far. Up through f/8 they may be fine, but then they start to exhibit the central hot spot. My Nikkor 50mm f/1.8D, for instance, is okay until about f/8.

Nikon seems to try pretty hard to keep people from shooting infrared. The D500 image sensor filter screens out IR very effectively, and exposure times are pretty outrageous. This totally precludes using a direct-measure, pre-set manual white balance (which, by the way, works perfectly fine on older cameras like the D50 and D60). My open-shade shots are made in the vicinity of ISO 100, f/8, 30 seconds. When you add in long-exposure noise reduction, you literally need a lot of time on your hands for this kind of photography.

I like to manually set a gray point by selecting a rectangular picture area (in Nikon Capture NX2, it's called a 'marquee sample'), followed by severe hue-shifting. When I use something like Photoshop, I typically perform a red-blue channel swap instead of the hue shift. There are plenty of web articles on this topic. Enough talk. Let's see some actual D500 results.
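If you process with a scriptable tool instead of Photoshop, the red-blue channel swap is easy to do yourself. Here's a minimal sketch using Python with numpy and Pillow; it's my own illustration (not part of the workflow above), and it assumes an 8-bit RGB file already converted from the raw image. The filenames are placeholders.

```python
# Red-blue channel swap for infrared processing: a minimal sketch
# using Pillow and numpy (assumes an 8-bit RGB input, e.g. a TIFF
# or JPEG already converted from the raw file).
import numpy as np
from PIL import Image

img = np.array(Image.open("ir_shot.tif").convert("RGB"))

# Swap the red and blue channels; green is left alone.
swapped = img[:, :, [2, 1, 0]]

Image.fromarray(swapped).save("ir_shot_swapped.tif")
```

#review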

  • How Bright Is Your Camera Viewfinder?

    I have read for many years in marketing literature about how some camera's viewfinder is "bright". What does that mean? How would you know if your camera viewfinder is bright or dim? I prefer numbers to hand-waving.

After pondering the issue for a while, it occurred to me that most of us already possess an instrument that can readily figure out what "bright" means. Our phones have built-in cameras, and photos from those phones contain EXIF data. The EXIF data can be inspected for the brightness, which is reported as the "Light Value" (EV). Recall that a 1 EV difference equals 1 stop of light. I use the free program "exiftool" to inspect the EXIF data. A link where I give more details on this program is here.

Taking a look at the EXIF data from a phone photo, it's packed with useful information. Note that this Samsung Galaxy S6 phone has a 4.3mm f/1.9 lens. Because of its teeny sensor, that's the equivalent of a 28mm lens, with a 65.5-degree field of view. Also note that the EXIF data shows a "Light Value" (8.4 in the shot above).

Why use your phone camera? Because it has a huge depth of focus, and the lens fits neatly within your camera viewfinder while blocking external light. If you use the same lens, aperture, and lighting conditions on each camera, you can take a picture with your phone through each camera's viewfinder and compare them for brightness via the "Light Value" in the EXIF data.

In the comparisons below, I took a look at the Nikon D610, D7000, and D500 camera viewfinders. I expected the D610 viewfinder to be the biggest and brightest, since it has a full-frame sensor. I was wrong. The D500 is better. Please forgive the poor exposure in the phone photos below; the phone isn't as smart as a Nikon, and the large black expanse fooled its meter. I'm only interested in brightness (EV) and size in the frame, so the exposure technique just needs to be consistent for each camera viewfinder.

Nikon D610 viewfinder. EV 8.4, 35mm

Nikon D7000 viewfinder. EV 8.1, 35mm

Nikon D500 viewfinder. EV 8.8, 35mm

Cropped view of the D610 viewfinder

Cropped view of the D7000 viewfinder

Cropped view of the D500 viewfinder

Viewfinder Comparisons

The D610 viewfinder is EV 8.4, and the view width is 1896 pixels in the photo. The D7000 viewfinder is EV 8.1, and the view width is 1694 pixels in the photo. The D500 viewfinder is EV 8.8, and the view width is 1930 pixels in the photo.

It struck me that the D500 viewfinder looked bright and large, but I didn't know if that was a psychological effect or real. Now I know it's real. I was surprised to discover that it's even larger and brighter than my full-frame D610 viewfinder. Try this test yourself; it's an easy way to compare camera viewfinder brightness, magnification, and even focus sensor sizes and focus sensor coverage.
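If you'd rather script the comparison than eyeball EXIF dumps, here's a small sketch that shells out to exiftool (its "-LightValue" composite tag is computed from aperture, shutter speed, and ISO). The filenames are placeholders for your own through-the-viewfinder shots:

```python
# Compare viewfinder brightness by pulling the "Light Value" EV out
# of phone photos shot through each viewfinder. Requires exiftool
# on your PATH; filenames here are placeholders for your own shots.
import subprocess

shots = {"D610": "d610_finder.jpg",
         "D7000": "d7000_finder.jpg",
         "D500": "d500_finder.jpg"}

for camera, path in shots.items():
    out = subprocess.run(
        ["exiftool", "-s3", "-LightValue", path],
        capture_output=True, text=True, check=True)
    ev = float(out.stdout.strip())
    print(f"{camera}: EV {ev:.1f}")
```

Remember that a 1.0 difference in the printed EV values is a full stop of viewfinder brightness.

#howto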

  • SnapBridge and D500 Remote Control

    There's a lot of web discussion about using SnapBridge with the D500, mostly centered on being able to make it work at all. My own interest in SnapBridge is why I might want to use it in the first place. I'm probably in the minority here, but being able to remotely trigger the shutter is more useful to me than being able to transfer photos to my phone.

The Nikon "pro" cameras have the infrared-trigger feature conveniently removed. I guess it's convenient for somebody; I suspect that somebody is Nikon. So the cheap and easy IR shutter control (via the little ML-L3 remote, or via a little app running on your smartphone) is out. Let's review some ways you can remotely trigger the shutter on your D500 (and most other "pro" models).

The most versatile (and therefore most expensive) way to remotely trigger your camera is the Nikon WR-1 wireless remote. It has a range of 394 feet, a zillion options, and a price of about 470 bucks. Ouch. But if you've got your camera at the feet of horses finishing the Kentucky Derby while you're in the stands, this is the ticket. If you can get within about 66 feet of your camera, the Nikon WR-R10/WR-A10/WR-T10 can trigger your shutter and also control your flash without wires. The cost is around 200 bucks. If you can get within about 3 feet of your camera, you can use the Nikon MC-30A 10-pin wired remote, at a price of about 65 bucks. Or, like me, you can get a cheap 10-pin wired remote (I got the Vello RS-N1II) for about 8 bucks. Now we're talking. Plus, you can control it with one hand, and you don't even have to be looking at it.

Enter SnapBridge. So, how can SnapBridge help me take photos remotely? After you download SnapBridge to your phone (mine is a Samsung Galaxy S6 running Android 6.0.1) and connect via wireless, you can select the option for "Remote Photography". I'd suggest you visit the Nikon website to watch the SnapBridge video. Now, via SnapBridge, you can not only trigger your camera, but also get Live View right on your phone's screen. You can't (as of this writing) alter exposure settings, but at least you can see what settings your camera is using, and you can also see the battery level. You'll notice that your phone's "live view" gives the impression of having just consumed several espressos. It has the jitters.

Note that you may want to use SnapBridge wireless in limited doses, because once your camera is out of "Airplane Mode", it causes pretty heavy battery drain. At least you can monitor that drain from your phone. The owner's manual says you can use the D500 wireless from "approximately" 10 meters. I got out a tape measure and successfully controlled my camera from 30 feet. Not that I'm a skeptic or anything. I wouldn't suggest trying to fly a drone with this, but seeing through your camera and shooting from 30 feet away could open up some pretty creative possibilities.

SnapBridge in "Remote Photography" mode

Note above how you can monitor the shutter speed, f-stop, shots remaining, and battery level while shooting. The shots you take will show as thumbnails below the camera settings. The big white circle is the shutter button.

What the camera sees while being controlled by SnapBridge

View the SnapBridge training video at the Nikon web site.

My own preference for most remote-release scenarios: a cheap and reliable 10-pin wired connection.

I would imagine that SnapBridge will get some enhancements in the future. Let's hope it will someday let you alter exposure settings from your phone.
By the way, if you're more interested in transferring photos to your phone and skipping the remote control, you can use Bluetooth. It's slower than wireless, but uses a tiny fraction of the battery power. Don't forget to activate your phone's GPS to embed location data in the pictures. #review

  • Photo Noise Reduction: Nik Dfine 2.0

    Many photo-editing programs include noise reduction, but it's typically crude. If you're the type of person who wants the nth degree of control over this process, you might consider using Nik Dfine 2.0. Nik Dfine is a "plug-in", which means it runs inside another program. Many programs, such as Photoshop, Lightroom, Aperture, and Zoner Photo Studio, can use plug-ins. If you use Nik Dfine, you get the same user experience inside any of the programs that can run it. Nik Dfine comes from Google, and they decided to discontinue it; they made it free, so you can't beat the price! Google also discontinued the other Nik plug-ins, so they're all free now. You may need to consult Google for specific procedures on how to install the Nik plug-ins for your particular program. You can still get the plug-ins here.

Why would you want noise reduction in the first place? Two of the biggest reasons that come to mind are small-sensor cameras and dim-light pictures where you were forced to really crank up the ISO. Those color speckles, especially in deep shadows, can look terrible.

You should know that the order in which you process your pictures is important. You want to handle noise reduction first, before any other photo manipulation. By the way, you want to handle sharpening last. In between these two editing operations, order isn't too important. I noticed that the Dfine user guide says it supports only TIFF format, but I used it in Zoner Photo Studio with Nikon NEF format pictures, so you don't have to worry about that constraint. A little noise reduction goes a long way, so you don't want to overdo it. If you don't heed this advice, you'll probably end up with pictures that have a lot of mush instead of fine details. If you're working with raw format, don't apply any sharpening or noise reduction before using Dfine on the image.

Nik Dfine is extremely smart about how and where it removes noise. It can aggressively attack featureless areas and barely touch areas with fine detail. You can go as manual as you wish to "take control", or you can let Dfine do its magic automatically. Personally, I love the automation and the end results.

Many photographers find that they hate the color noise but don't mind the luminance noise. I (mostly) fall into that category. I don't mind the gritty or sandy effect luminance noise can have, and leaving luminance noise alone can result in an overall sharper-looking photo. If you want to rid yourself of both types of noise, however, Dfine can deliver. The Dfine developers were very smart about leaving the fine details alone while smoothing the out-of-focus areas. (A little sketch at the end of this post shows the basic idea behind color-only noise reduction.)

Dfine is essentially a two-step process. The first step is "Measure", where it analyzes the photo and decides what to do and where to do it. The second step is to act on the noise, or "Reduce". Before working on noise reduction, you will probably want to set up some preferences. I prefer the "single" versus a "split" or "side-by-side" view of the photo, and I toggle the "Preview" checkbox to see the "before" and "after" effect on the whole photo. I also prefer the default "RGB Mode", versus modes such as "Chrominance Only" or "Luminance Only" (which switch to black-and-white). I like to keep the "Loupe" enabled, so that I can selectively look at the pixel level wherever the mouse pointer is. You can also lock its view to a position of your choosing by selecting its "pin" icon.
Some notes on setting up the Dfine functionality

"Measure", showing Automatic mode

Again, click on "Measure" to let Dfine analyze the photo and determine its game plan for reducing noise, prior to clicking "Reduce".

"Reduce", showing the "Color Ranges" Method

After clicking "Measure", the interface will change to allow you to fine-tune settings. I typically increase the "Edge Preservation" (under the "More" drop-down) to save really fine details, such as fur or feathers. For global changes based upon color, select the "Color Ranges" Method. If you wish to work on specific areas of the picture, you can select "Control Points" instead, and add as many as you need. People familiar with Nikon Capture NX2 know all about control points, since Nik wrote that program, too. When you're happy with your noise reduction setup, click on "Reduce".

Example noise reduction at pixel-level zoom

Typical noise (in shadows)

Noise reduced without detail loss in the fur

As camera sensors get better, noise reduction is needed less and less. But when you need to fix a noisy image, Dfine is a great tool to have in your arsenal.
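As promised, here's a minimal sketch of the color-only idea: smooth the chroma while leaving luminance alone. This is my own illustration of the general technique, not how Dfine actually works internally; it assumes an 8-bit RGB image and uses Pillow.

```python
# Color-only noise reduction sketch: blur the chroma channels and
# keep the luminance untouched. Not Dfine's actual algorithm; just
# the general idea behind "chrominance only" noise reduction.
from PIL import Image, ImageFilter

img = Image.open("noisy.jpg").convert("YCbCr")
y, cb, cr = img.split()

# Smooth only the color (Cb/Cr) channels; set the radius to taste.
cb = cb.filter(ImageFilter.GaussianBlur(radius=2))
cr = cr.filter(ImageFilter.GaussianBlur(radius=2))

result = Image.merge("YCbCr", (y, cb, cr)).convert("RGB")
result.save("denoised.jpg")
```

Because the luminance channel passes through untouched, the gritty "film grain" texture survives while the color speckles get averaged away, which is exactly the trade-off described above.

#howto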

  • Test Your Secure Digital Card: Lame and Lamer

    Why "Lame and Lamer"? I can guarantee that the specifications of your Secure Digital card are bogus. As I'm fond of saying, they're blowing smoke you know where. You'll find that the read speed isn't as fast as the card manufacturer says (lame), and the write speed will typically be even slower than the read speed (lamer).

When a manufacturer like Nikon tells you how many frames your camera "buffer" holds, they assume you have the very fastest SD card available. If you don't have a fast card, your effective buffer is much smaller. What this really means is that Nikon doesn't have as big a buffer as they claim; they're depending on your SD card to write out the pictures while you're shooting a sequence.

I'll show you how to test the card for yourself, outside of your camera, to see how fast it really is. I'm going to demonstrate results from a Windows environment; you can download other free software to perform tests in Apple and Linux environments as well. The Windows program "h2testw" can measure both the read and write speeds of your Secure Digital card.

If your computer cannot take advantage of UHS-II hardware, you can't accurately assess those newer-generation cards for speed. Newer UHS-II cards have an additional row of electrical contacts, allowing for parallel data transmission, which is how they get so much more speed. Cameras that only have UHS-I capability can still use UHS-II cards, but they ignore the extra row of contacts and run the cards at reduced speed. As a result, you won't ever get beyond UHS-I speed using a UHS-II card. If you use a card reader and your computer has a USB 3.0 port, you may still be able to measure the card accurately, assuming your card reader has the electrical capability of the UHS-II specification. Older computers may in fact be too slow themselves to give you accurate information, so be forewarned.

Many off-brand SD card manufacturers lie even more than the big guys. The speeds they claim aren't even close to name-brand speed ratings, let alone reality. Buyer beware.

Be advised that the h2testw.exe program will destroy any pictures on the card, so save existing pictures elsewhere before running it. You should also format the SD card in your camera after testing it.

h2testw.exe user interface

In-progress program screen

SanDisk Extreme Pro 95MB/s test results

SanDisk Extreme Plus 90MB/s test results

Sample computer "SD card slot" tests:

SanDisk Extreme Plus 32GB 90MB/s UHS-I card. Actual read speed: 69.6 MB/s. Actual write speed: 55.2 MB/s.

SanDisk Extreme Pro 32GB 95MB/s UHS-I card. Actual read speed: 66.5 MB/s. Actual write speed: 67.5 MB/s.

Nikon D7100 results (from cameramemoryspeed.com): SanDisk Extreme Pro 32GB 95MB/s UHS-I card = 69.8 MB/s.

Conclusions

The computer results and the camera results are comparable (within about 3% for write speed). Compared to the "95MB/s" being advertised for the Pro version, however, these measured speeds are quite different. Be aware that the newest SD cards (with write speeds approaching 300MB/s) are UHS-II or beyond. You'll have to upgrade to a newer camera (like the D500) if you expect to take advantage of such speeds.
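If you'd rather not use h2testw, a rough version of the same test is easy to script. Here's a minimal sketch that times a large sequential write and read on the card; it's a crude stand-in, the path is a placeholder for your card's drive letter, and (as the comment notes) the read number can be inflated by OS caching:

```python
# Rough SD card speed test: time a large sequential write, then a
# sequential read. A crude stand-in for h2testw; the path below is
# a placeholder for your card's mount point or drive letter.
import os, time

path = "E:/speedtest.bin"            # placeholder: your SD card
chunk = b"\xA5" * (4 * 1024 * 1024)  # 4 MiB chunks
total = 256                          # 256 chunks = 1 GiB

start = time.perf_counter()
with open(path, "wb") as f:
    for _ in range(total):
        f.write(chunk)
    f.flush()
    os.fsync(f.fileno())             # force the data out to the card
elapsed = time.perf_counter() - start
print(f"Write: {len(chunk) * total / elapsed / 1e6:.1f} MB/s")

# Note: reading back immediately may be inflated by the OS cache.
# For an honest read number, eject and re-insert the card first.
start = time.perf_counter()
with open(path, "rb") as f:
    while f.read(len(chunk)):
        pass
elapsed = time.perf_counter() - start
print(f"Read:  {len(chunk) * total / elapsed / 1e6:.1f} MB/s")

os.remove(path)
```

Like h2testw, this writes real data to the card, so run it only on a card whose pictures you've already saved elsewhere.

#howto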

  • “Safe” Storage of Camera Gear

    Many people have purchased a safe to store their camera gear and protect it from theft and fire. This might not be the best idea. Let me explain.

Safes (mostly sold as "gun safes") are usually advertised as being fireproof, typically able to withstand a fire for about a half-hour. How is this achieved? Typically in one of two ways: via the gypsum in drywall material, or through clay-like materials.

A typical fire-rated gun safe with an electronic lock

A typical gun storage cabinet with a key lock

This fireproof insulation material makes up the bulk of the wall thickness of a safe. I bet you thought those safe walls were pure steel. Nope. If that were true, a safe would weigh probably triple what it actually does.

So why should you care how a safe is made fireproof? The key here is moisture. These fire-proofing materials contain water molecules, and that can create a high-humidity environment. High humidity is not a friendly environment for cameras or lenses. The way these materials work is to convert their water into steam during a fire, which keeps the safe's flammable contents from igniting. A "steamed camera" is probably a dead camera, so the fire-proofing won't really help you anyway. Some, but not all, fire-proofing materials will keep a high-humidity environment inside the safe. The worst offenders here are "document safes". I used to use one of these myself, and noticed that my camera LCD screen would always fog up when I started using the camera. Not good.

Another concern with fire-proof safes is formaldehyde. The drywall insulation material (often from China) might contain formaldehyde. It would be great if you could buy a safe that used Space Shuttle tiles for insulation, but I doubt you could afford it.

So, how do you keep your gear secure without ruining it with high humidity? One solution is a lockable steel storage cabinet. It won't have insulation that can pose a humidity problem. Steel cabinets can be secured to the floor or a wall via lag screws, etc. Cabinets made for gun storage typically have the heaviest-gauge steel, rivaling the steel thickness found in modestly-priced safes. These gun-storage cabinets are typically hardened steel, as well.

What about storage capacity? Omit the insulation, and you gain about 4 inches in every interior dimension for extra storage space. You will find over time that you ALWAYS need more space.

Professional thieves probably have grinding tools that can penetrate most safes without much more difficulty than a steel cabinet. It's probably more important to protect your gear from humidity and resign yourself to protection against mere amateur thieves. Fire protection? Forget it. #review

  • Lens Focus Repeatability and Calibration

    Many people are under the impression that a camera/lens combination will auto-focus the same way each time. Nope, nope, nope. Camera designers have to live in the real world of "close enough", "fast enough", and "cheap enough". The holy grail of focus is to make sure your target gets inside the zone of acceptable focus. If your camera misses perfect focus every time (and it probably will), it doesn't really matter, as long as the target is still inside that zone. I'm going to show you some real-world measurements, and what kinds of compromises you need to make when evaluating and calibrating your focus. I'm assuming you have a camera that supports focus calibration.

It drives me crazy when people make claims about "facts" without the data to back them up, and without giving you the tools to repeat the same experiments for yourself. What follows should be reasonably repeatable by anyone, without much expense involved. As always, it bears repeating that "your mileage may vary". Measurements are affected by the camera, lens, light level, aperture, target size, alignment accuracy, target distance, and stuff I haven't even thought of.

I decided to run the experiment with two different cameras and two different lenses. I chose a Nikon D7100 with my Sigma 150-600 at 300 mm, and a Nikon D610 with a Nikkor 24-70 at 70 mm. They're both competent combinations, and should be representative of what an average camera enthusiast might use. I did all tests with the aperture wide open, since stopping down would only obscure the results. The Nikon D7100 has a focus sensitivity down to -2 EV, and the D610 down to -1 EV. That doesn't mean you should perform a focus test there. I always use a light level of at least 10 EV for testing. I'm after focus repeatability, and repeatability goes out the window if you shoot in dim light. How far out the window would be an excellent topic of study for another time.

Keeping with the theme of doing things by the numbers, I'm using my go-to analysis software, MTF Mapper (version 0.5.13) by Frans van den Bergh. I used the "focus" option with his "mfperspective_a3.pdf" chart, printed to about 10" by 12" and mounted flat. The chart is oriented 45 degrees to the camera, to capture correct depth information. This arrangement gives me ample accuracy for evaluation. By the way, the camera pixel size doesn't matter for this particular test, but you do need to set it for the other measurements in the program options.

All of my photographs are made in un-sharpened RAW format. I de-focused the lens between each shot, and I alternated between too-near and too-far de-focus to exercise both directions of auto-focus. I always use back-button focus. I didn't want any directional bias in the shots.

What the focus chart looks like

The image above shows what gets photographed and analyzed. The chart's left side is rotated farther away from the camera's plane of focus, by 45 degrees. The camera focus sensor is pointed dead-center at the chart.

Sample measurement from the Nikkor 24-70 mm at f/2.8 and 1 meter

The picture above shows how MTF Mapper measures the key elements in the chart and provides focus error measurements. In this picture, the camera missed focus by 2.8 mm at a distance of 1 meter. The camera focus sensor was aimed at the marker under the vertical orange arrows. The depth of sharp focus for this lens/aperture/distance combination is about 15 mm. Anything inside the 15 mm focus window counts as "success".
It's possible to make the measurements using only the red-, blue-, or green-sensitive sensor pixels in the photo, if desired. Lenses with significant longitudinal chromatic aberration will have focus peaks that are widely separated. For this experiment, all that's needed is to be consistent and use the same settings each time.

Sample measurement from the Sigma 150-600 at 300 mm, f/5.6, and 4.22 meters

The depth of sharp focus for this lens/aperture/distance combination is about 45 mm. Anything inside the 45 mm focus window counts as "success". Here, the camera missed focus by 0.7 mm, which is pretty much dead-on.

Sample results for the Nikkor 24-70mm f/2.8 test

The Nikkor 24-70 mm at 70 mm test follows. It should be noted that this lens is notorious for focusing differently at different focal lengths. This means that it is impossible to "fine tune" focus with a single value and have it correctly calibrated throughout the focal range. I have set a fine-tune value (+16) that under-compensates at 70 mm and over-compensates at 24 mm, with a bias toward 70 mm. The Sigma lens has far smarter firmware in it; I have it calibrated at 4 different focal lengths and 4 distance settings per focal length, for a total of 16 calibration fine-tune settings.

Measurement errors per photograph (mm): -8.1, -11.2, -10.6, -1.7, -8.4, -7.6, -5, -7.2, -4.5, -6.2, -7.3, -8.9, -1.9, -7.1, -1.1, -7.0, -4.3, 2.8, -2.6

N = 19, MEAN = -5.68 mm, STDEV = 3.55 mm

Given a "sharp zone" of about 15 mm (plus or minus 7.5 mm), I'd say this test showed a focus miss about a third of the time. The mean of -5.68 mm is my "bias" focus error, which helps minimize the "+" focus error I get when zoomed to 24 mm. The standard deviation of 3.55 mm is a measure of the typical magnitude of each focus "miss". Call this repeatability. It is actually an impressive value: a typical error of 1 part in 282 (1000 mm / 3.55 mm). Bear in mind that the 15 mm "sharp focus depth" criterion is actually quite picky. Also note that the focus chart target was only 1 meter away, and the shots were taken with a wide-open aperture. Longer distances and/or stopping down would make the whole target zone quite sharp. If I expected to spend the day shooting at 70 mm, I'd certainly adjust the focus fine-tune up to the maximum of +20, and my focus miss rate would go toward zero.

A typical focus error plot, showing that the lens needs more "+" focus fine-tune to drive it toward the vertical blue marker

Sample results for the Sigma 150-600 mm f/5.6 test

The Sigma 150-600 mm at 300 mm test follows. In contrast to the Nikkor, this lens has focus fine-tune calibration throughout its zoom and focusing range (16 calibration fine-tune settings). It makes all the difference.

Measurement errors per photograph (mm): -0.7, -9.8, -14.5, -16.6, 1.1, 14.5, 0.6, -0.19, 9.4, 4.9, -15.3, 7.0, 4.9, -8.2, -12.0

N = 15, MEAN = -2.3 mm, STDEV = 9.8 mm

Given a "sharp zone" of about 45 mm (plus or minus 22.5 mm), this test NEVER missed the focus zone. The standard deviation of 9.8 mm is the typical magnitude of each focus "miss". Given the focal length and target distance, this is awesome: a typical error of 1 part in 431 (4220 mm / 9.8 mm). Bravo, Sigma.
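If you run this kind of test yourself, the bookkeeping is easy to script. Here's a minimal sketch that reproduces the statistics above from the raw per-photo focus errors, using the Nikkor data set as the example:

```python
# Summarize per-photo focus errors from an MTF Mapper focus test.
# The data below are the Nikkor 24-70 @ 70 mm errors listed above.
import statistics

errors_mm = [-8.1, -11.2, -10.6, -1.7, -8.4, -7.6, -5, -7.2, -4.5,
             -6.2, -7.3, -8.9, -1.9, -7.1, -1.1, -7.0, -4.3, 2.8, -2.6]
sharp_zone_mm = 15.0           # depth of sharp focus at f/2.8, 1 m
half_zone = sharp_zone_mm / 2

mean = statistics.mean(errors_mm)
stdev = statistics.stdev(errors_mm)   # sample standard deviation
misses = sum(abs(e) > half_zone for e in errors_mm)

print(f"N = {len(errors_mm)}, mean = {mean:.2f} mm, stdev = {stdev:.2f} mm")
print(f"Missed the {sharp_zone_mm:.0f} mm sharp zone on "
      f"{misses} of {len(errors_mm)} shots")   # 6 of 19, about a third
```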
Conclusions

If you really, really want to know how your lenses and cameras perform, these tests are representative of how you would do it. Being an engineer myself, I never fail to be impressed at how far photographic technology has come. Twenty years ago, you couldn't get this level of camera/lens performance at any price. Thanks again, Frans, for your incredible MTF Mapper program. #howto

  • White Balance Calibration When Colors Go Haywire

    Setting the white balance is one of those things that can be laden with a lot of guilt. If you shoot RAW, it's supposed to be a "don't care", but many photographers will look down their noses at you if you don't "do Kelvin". I thought it was high time for a little comparison shopping. Modern cameras have a lot of computing horsepower to figure out what white balance you should be using à la "Auto", but is it any good? What about using tables that supply the answers? What about a color meter? What about just setting it in your photo-processing software after the fact? How about using Live View mode to help you decide? So many questions.

I always shoot RAW, so white balance decisions aren't a big deal to me. If I don't like the color balance, I just change it in photo-editing software. With JPEG, though, it's not nearly as forgiving. JPEG has very little elbow room for errors, so you want to get it right in the camera. But how can you reliably do that?

I conducted some tests using a Nikon D610 and an Android smartphone program called "Light Meter" (version 2.6), written by Borce Trajkovski. This program lets you measure light levels (as in lux) and also color temperature in degrees Kelvin. It has the additional advantage of allowing you to calibrate it for both light (scale and offset) and color (scale and offset), and it supplies approximate calibration values for various smartphone models. I used Nikon Capture NX2 to adjust and analyze my RAW test shots, but many photo-editing packages would work just as well.

Sunshine

I took 3 shots of my neutral grey card target illuminated by direct sun: one with "Auto" white balance, one at the measured color-meter temperature (5560K was the closest setting), and one with the "Direct Sunlight" camera white balance selection. Not surprisingly, all of the shots look acceptable (although the "Direct Sunlight" choice was off the most). Using the histogram view in Capture NX2, I could see that the R, G, B color peaks were nearly on top of each other, as they should be for a neutral grey target. The in-camera histogram showed the same result.

"Auto" white balance in sunshine. 5433K was set automatically.

"5560K" white balance in sunshine. The meter indicated 5600K.

"Direct Sunlight" white balance selection in sunshine. 5209K was set by the camera.

Most cameras have sunlight pretty well figured out, so you'd expect those shots to have well-balanced color.

Shade

My next test was in open shade, under a clear blue sky.

"Auto" white balance in shade. 7662K was set automatically.

"7140K" white balance in shade. The meter indicated 7000K.

"Shade" white balance selection in shade. 7989K was set by the camera.

The analysis of the histogram peaks indicates that "Auto" white balance is the best here, but again, all three are reasonably close to each other. I prefer the color-metered setting.

Indoor LED Lighting

"Auto" white balance inside, using LED (ceiling) lighting. 3390K was set automatically.

"4350K" white balance in LED lighting. The meter indicated 4300K.

LED lighting "Live View" guide with "3030K" selected WB.

Now things get interesting. The "Auto" setting was pretty inaccurate, and the color meter was really terrible. It turns out that by using Live View, I could really nail the white balance: in "K" WB mode, I could spin my camera's control dial and instantly see the color change on the screen. An electronic viewfinder would work the same way.

Since I use RAW, though, all is not lost. All I have to do if I messed up the in-camera white balance is adjust it in my photo editor when I get back home. Here's the trick: with a neutral grey target in the shot, adjust the white balance value until the RGB histogram peaks coincide.
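The same peak-matching idea is easy to express in code. Here's a minimal sketch of the underlying check, my own illustration rather than anything the camera or Capture NX2 does: it finds the R, G, and B histogram peaks of a grey-card crop, which is exactly what you're matching by eye with the editor's slider. The filename is a placeholder.

```python
# Grey-card check: find the histogram peak of each RGB channel in a
# crop of the grey card. When white balance is correct, the three
# peaks should land (nearly) on top of each other.
import numpy as np
from PIL import Image

crop = np.array(Image.open("grey_card_crop.jpg").convert("RGB"))

peaks = []
for ch, name in enumerate("RGB"):
    hist, edges = np.histogram(crop[:, :, ch], bins=256, range=(0, 255))
    peak = edges[np.argmax(hist)]
    peaks.append(peak)
    print(f"{name} peak: {peak:.0f}")

spread = max(peaks) - min(peaks)
print(f"Peak spread: {spread:.0f} (smaller is more neutral)")
```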
Let's take a look at the badly-adjusted shot in Capture NX2. With the original 4350K setting, the RGB peaks aren't even close to where they should be (they should land on top of each other). By adjusting the "Fine Adjustment" slider to 2950K, the peaks overlap and the picture is now perfectly neutral.

Live View really saved the day on the indoor shots. If I were shooting an indoor wedding ceremony, at the mercy of whatever lighting was there, I'd definitely want to consult Live View to set my white balance. Outdoors in bright light is another animal, however. Live View out there (unless you use something like a Hoodman Loupe or have an electronic viewfinder) is an underwhelming experience. You should still be able to analyze the histogram peaks on shots after the fact to help you dial in white balance, though.

By the way, don't even think about using published tables of color temperatures for indoor lighting. Indoor lighting color is all over the map, and the tables are mostly useless.

Conclusion

So what have we learned today, class? Auto white balance can be your friend and your foe; learn when it's safe to use and when it's not. Carry a grey card with you to calibrate the white balance. Take a shot of the card so that you at least have a good reference picture for dialing in white balance at home with your photo editor, and remember to take another shot of the card whenever the lighting changes. Live View can really be your friend, even if you just use it to dial in the white balance and then turn it off. Shooting birds in flight moving in and out of shade, however, leaves a single viable choice: Auto WB. This is what RAW format is all about; just fix the color in your photo editor. Happy (calibrated) shooting. #howto

  • The Orton Effect

    Michael Orton is a photographer who wanted to re-create the look of a watercolor painting on film. He invented a technique that sandwiches slides of in-focus and de-focused images. Michael originally called his technique "Orton Imagery", but now everybody just calls it the "Orton Effect". When digital photography came along, people wanted to emulate this effect in software. Perhaps the most famous use of the effect is in the Hobbit movies. People knew the "look" was different, but they couldn't put their finger on what the difference was.

I really love the look of the Orton Effect for certain kinds of subjects. Just like cupcakes, though, you may like them, but they're less than ideal as a steady diet. Everything in moderation.

A straight shot

The Orton Effect (Capture NX2, blur radius 25)

A Few Ways To Create The Orton Effect

Many different photo-editing packages can create the Orton Effect. Some examples are Gimp, Photoshop, and Nikon Capture NX2. Maybe one of these days your camera will have an "Orton Effect" setting to create it directly.

Nikon Capture NX2

I must be one of the last hold-outs using Capture NX2. I have a zillion batch files to process pictures with this software; one of them is, of course, the Orton Effect.

The first step is to set the Output curve to a value of 3 in "Levels & Curves". Leave the other settings at their defaults.

Second step: set a Gaussian Blur value of around 25, with a blending mode of "Multiply". The radius value here should be set to suit your subject matter.

Third step: set a midpoint value of "2" in Levels & Curves, and alter the blending mode to "Multiply".

There you have it. At this point, it would be prudent to save your steps as a Batch Process (Batch | Save Adjustments… | Save As | OrtonEffect). Now you can select photos and run the batch process on them to get the Orton Effect without having to memorize any more steps. You might want to save a few different batch files, setting a different Gaussian Blur radius in each one (the "second step").

The Orton Effect (Capture NX2, blur radius 35)

For simple subject matter, I prefer a larger blur radius. For you, sprinkle to taste.

Adobe Photoshop

First, duplicate the photo layer and call it "Sharp".

Second, right-click on the "Sharp" layer, select "Duplicate Layer…", and name it "Sharp copy". Select "Screen" for the blending mode of this layer. While "Sharp copy" is still selected, right-click and select "Merge Down". You will be left with just the "Sharp" and "Background" layers.

Next, right-click on the "Sharp" layer, select "Duplicate Layer…", and name it "OutOfFocus". With the "OutOfFocus" layer selected, go to "Filter | Blur | Gaussian Blur" and set a radius suitable for the effect you want. No details should be visible, but you should still be able to make out shapes. Set the "OutOfFocus" layer blending mode to "Multiply".

All that's left is to save your final image in whatever format you prefer. (If you'd rather script the whole thing, there's a sketch at the end of this post.)

Finished Orton Effect using Photoshop

Summary

I would encourage you to explore this processing technique. It can transform a blah photo into something special. I find that purely literal recording of images can start to feel a bit mundane. Try something on the wild side once in a while. Michael Orton did some really pioneering work in photography. We owe him a big thank you. It does look a little like a watercolor painting, doesn't it?
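For the scripters: here's a minimal sketch of the same screen-then-multiply recipe using Python with Pillow and numpy. It mirrors the Photoshop steps above; the blur radius and filenames are placeholders to adjust to taste.

```python
# Orton Effect sketch: screen an image with itself to brighten it,
# then multiply by a heavily blurred copy, mimicking the Photoshop
# layer recipe above. Radius and filenames are placeholders.
import numpy as np
from PIL import Image, ImageFilter

img = Image.open("input.jpg").convert("RGB")

# Step 1: "Screen" the image with itself: 1 - (1 - a) * (1 - a).
a = np.asarray(img, dtype=np.float32) / 255.0
sharp = 1.0 - (1.0 - a) ** 2

# Step 2: blur a copy until details vanish but shapes remain.
sharp_img = Image.fromarray((sharp * 255).astype(np.uint8))
blurred = sharp_img.filter(ImageFilter.GaussianBlur(radius=25))
b = np.asarray(blurred, dtype=np.float32) / 255.0

# Step 3: "Multiply" the sharp and blurred versions together.
orton = (sharp * b * 255).astype(np.uint8)
Image.fromarray(orton).save("orton.jpg")
```

Just like the batch-file approach in Capture NX2, you can save a few variants of this with different blur radii and pick whichever suits the subject.

#howto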

  • Clean Your Camera Image Sensor

    Are you a little intimidated about cleaning those dust bunnies off your camera sensor? Should you punt and pay to have it done for you? It's a little scary to clean your camera sensor if you haven't done it before. I used to bring my camera to a Nikon repair facility to get it cleaned. They would keep my camera overnight, and it would cost me $70.00 for something that took them about 5 minutes of labor.

My D7000 camera, for roughly the first 12,000 exposures of its life, would sling oil onto the sensor. The Nikon service center denied this was oil, and suggested I was probably a little sloppy in my camera-handling cleanliness. I beg to differ. One time, I cleaned my sensor (a "wet clean") and then made a 1000-shot time-lapse video. By the end of this 20-minute video, the sensor probably had a hundred oil blobs on it. Arrgh. There is a cheap solution (one of those bad puns again).

Believe me, you can't get oil off of a camera sensor unless you give it a "wet clean". But what about the more usual case of mere sticky dust, the kind a blower can't budge? There's now a tool you can buy that can clean off stubborn sensor dust in a safe and easy way. I made a little video that shows how simple and quick it can be to clean your camera sensor yourself. You don't need to be a fraidy-cat any more! #howto
