- How Lens Optical Stabilization Works
The more you understand about how stuff works, the more amazing it is. I had heard long ago that lenses with optical stabilization use “gyros”, but I never gave it much thought. I had heard of “gyroscopes”, but spinning wheels aren’t what we’re talking about. The name gyro has the Greek root gyros, meaning rotation. A typical lens optical stabilization unit (a Canon lens) The picture above shows a stabilization unit, which moves a compensating lens group to keep the image on the sensor (and in the viewfinder) from moving. The question to ask is how the lens knows the photographer is jiggling the lens, and what to do about it. In a DSLR, it can’t use the camera sensor to help figure out image movement, since the shutter is hiding the sensor. Let’s start with something called the Coriolis Effect. The effect is named after the French mathematician Gaspard-Gustave de Coriolis. On a big scale, it’s what causes large-scale weather patterns in the northern hemisphere moving north to have an eastward velocity, and the opposite effect in the southern hemisphere. On a smaller scale, it’s the force required to keep walking in one direction as you try to move from the center to the edge of a rotating merry-go-round. When a photographer hand-holds a lens, the lens invariably starts to rotate a bit in various directions. This rotation results in the Coriolis Effect, which can be sensed and then compensated for. A really smart person envisioned a gyro design that could notice rotation by jiggling a weight in one direction and sensing a force that was perpendicular to the direction of that jiggle. As shown above, when the weight was moving up, the force would push the weight to the left. The same weight would get pushed to the right if it was moving down. The forces would all reverse when the rotation switched from clockwise to counter-clockwise. This design is known as a vibratory rate-measuring gyro. 
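To get a feel for the size of the effect, here’s a back-of-the-envelope sketch in Python. The 1-micron drive amplitude is my own assumption (real MEMS parts vary); the 10 kHz drive rate and the 20 degrees-per-second rotation rate are typical figures for these devices.

```python
import math

# Illustrative (assumed) parameters for a MEMS vibratory gyro:
drive_freq_hz = 10_000     # ~10 kHz drive frequency
drive_amp_m   = 1e-6       # 1 micron drive amplitude (my assumption)
rot_rate_dps  = 20.0       # a fast hand-shake: 20 degrees per second

omega  = math.radians(rot_rate_dps)                    # rotation rate, rad/s
v_peak = 2 * math.pi * drive_freq_hz * drive_amp_m     # peak drive velocity, m/s

# Coriolis acceleration on the vibrating mass: a = 2 * omega x v
a_coriolis = 2 * omega * v_peak

# Compare against the drive acceleration the mass already experiences
a_drive = (2 * math.pi * drive_freq_hz) ** 2 * drive_amp_m

print(f"peak drive velocity:   {v_peak:.4f} m/s")
print(f"Coriolis acceleration: {a_coriolis:.4f} m/s^2")
print(f"drive acceleration:    {a_drive:.0f} m/s^2")
```

With these numbers, the sideways Coriolis signal comes out roughly five orders of magnitude smaller than the drive motion, which is why the sensing electronics have to be so exquisitely sensitive.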
It’s common to jiggle these weights at about 10,000 cycles per second (or 10 kHz). The “rate-measuring” here is the rotation rate, usually expressed as degrees per second. Using MEMS technology (micro-electro-mechanical systems), the miniature weight, the springs, a drive motor, and position sensors could all be built on a microscopic scale. Power requirements scale down with the size of the parts being used, so a camera battery could drive this system easily. Even with low-power requirements, cameras will typically turn off the stabilization when you take your finger off the focus button. This type of gyro design was introduced in 1991. The typical name for these units is “Coriolis vibratory gyroscope”. It finally found its way into lenses in 1995. The gyro concept shown above can only sense rotation along one axis, so two of them would be needed in a lens to handle the yaw (left-right) and pitch (up-down) axes of potential rotation. A photographer typically wouldn’t be rotating the lens along its optical axis (roll axis), so that motion isn’t compensated for. Typical hand-held rotation rates being counteracted are around ½ degree per second to 20 degrees per second. The center of rotation is roughly the rear of the camera (or the photographer’s eye). The Analog Devices, Inc. description of their gyro design The picture above shows a little bit more detail. The “Coriolis Sense Fingers” in the drawing are little capacitors that sense the gap distance between their little parallel fingers as the “resonating mass” shifts left or right, according to the direction of rotation of the device. The tiny signal from these capacitors can be converted into a voltage that varies according to the rotation. What the “sense capacitors” look like in the silicon design The Sense Capacitors (via a scanning electron microscope) The whole silicon design of the gyro Getting into even more detail, the picture above shows “Comb-Drives”. 
These little guys are given an alternating positive and negative voltage, to force them to have a net positive or negative charge. The moving “Active Mass” has little fingers that fit in-between these Comb-Drive fingers. The active mass is alternately pushed away or pulled toward the stationary comb drive fingers, since its electric charge is either attracted to or repelled by the comb drive fingers as their voltage is switched between positive and negative. Comb-drive actual silicon, via scanning electron microscope The shot above is a close-up of the little silicon comb-drive fingers. They push and pull the “active mass” to keep it vibrating. The “active mass” is suspended on silicon beam “springs” to greatly increase the magnitude of its motion when it vibrates, which enhances the signals produced by the gyro. The whole gyro needs to keep the “active mass” vibrating back and forth in its “drive direction”. The mass will get a sideways vibration (the sense direction) when the device (lens) is rotated, due to that Coriolis Effect. When a sideways vibration happens, that’s when those “sense capacitors” mentioned earlier produce a signal to indicate the device is rotating, and in which direction it’s rotating. The signal coming from the gyro is typically a low current, which is converted into a digital count that is proportional to the rotation rate. These little gyros are so small that even air becomes a problem. When they’re manufactured, they have to be sealed into a package that maintains a vacuum. How small is that gyro, anyway? Small. The lens optical stabilization unit needs a separate little gyro to sense rotation about each axis being controlled (e.g. yaw and pitch). Once the stabilization unit gets the gyro signals to indicate that the lens is rotating, then its microcomputer commands little actuators to move the compensating lens group in the stabilization unit to counteract that rotation. 
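As a rough sketch of the correction math (not Canon’s or Nikon’s actual control loop, which adds filtering and lens-specific scaling), you can integrate the gyro’s rate samples into a rotation angle and convert that into an image displacement at the sensor:

```python
import math

def image_shift_mm(rate_samples_dps, sample_rate_hz, focal_length_mm):
    """Integrate gyro rate samples (degrees/second) into a rotation angle,
    then convert to image displacement at the sensor for a given focal length."""
    dt = 1.0 / sample_rate_hz
    angle_deg = sum(r * dt for r in rate_samples_dps)  # simple rectangular integration
    return focal_length_mm * math.tan(math.radians(angle_deg))

# Hypothetical example: 100 ms of a steady 2 deg/s pitch wobble, sampled
# at an assumed 1 kHz, on a 400mm lens.
samples = [2.0] * 100
shift = image_shift_mm(samples, 1000, 400.0)
print(f"image shift: {shift:.3f} mm")
```

Even that modest 0.2-degree wobble smears the image by well over a millimeter on a 400mm lens, i.e. across hundreds of 4-micron-class pixels, which is why hand-held telephoto shots need this compensation.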
There are other kinds of gyro designs that are much more complicated than the “simple” one I have described. In fact, they can get mind-numbingly complex. Better-quality vibratory gyros are actually able to detect rotation rates of less than 10 degrees per hour. If that isn’t enough to bring tears to your eyes, I don’t know what will. You also probably have MEMS gyros in your smartphone. The next time you complain about having to spend extra money for a lens that has optical stabilization, just think about the technology that goes into it. And try to imagine the brilliance of the people that invented it. By the way, in-lens stabilization is generally preferred over in-camera stabilization. With a DSLR, lens-based stabilization lets you see a steadier viewfinder image and it makes it easier for your camera to focus on a non-moving target. One downside, however, is that the moving stabilization optics can make for slightly worse bokeh. A big thanks to Canon, Analog Devices, Inc. et al. for the visuals used in this article. I don’t yet have my own scanning electron microscope. #howto
- F-stop Fun Facts
Did you ever wonder how they decided upon camera lens f-stop numbering? Are there any other numbering schemes that could be useful? And did you know that the ‘F’ stands for “focal ratio”, which is “the ratio of the system's focal length to the diameter of the entrance pupil”? Nikkor Noct f/1.2 (half-stop faster than f/1.4) F-stops by the numbers F-stop progression Did you know that the standard F-stop numbering scheme comes from a math sequence? Most people know that full stops are based upon doubling or halving light intensity, but not where the actual numbering scheme comes from. Now you know. If you wanted to calculate “standard” F-stop ranges, here’s what you would do: F-stop full scale calculation For the above, the progression solving the above sequence would be: 1, 1.4, 2.0, 2.8, 4.0, 5.6, 8.0, 11.0, 16.0 … You can actually go the other direction, too, if you use (-1 x 0.5), (-2 x 0.5), … for the exponent sequence above! This gets you F-stops like 0.5 and 0.707 for those lenses few mortals will ever be able to afford. F-stop half-scale calculation For the above, the progression solving the above sequence would be: 1, 1.2, 1.4, 1.7, 2.0, 2.4, 2.8, … F-stop third-scale calculation For the above, the progression solving the above sequence would be: 1, 1.12, 1.26, 1.4, 1.6, 1.8, 2.0, 2.2, 2.5, 2.8, … If they ever made them, you should now be able to see how lens manufacturers could mark lenses in fourth-stop, fifth-stop, sixth-stop etc. using fractions in the exponents with denominators like 4, 5, 6. You’d think that could be useful for something like cinema lenses, where they like finer exposure control, but instead they go one better and offer step-less aperture control. Speaking of cinema lenses, those lenses are marked in “T” stops, where the T is for “transmission”. I think all lenses should be marked this way, because what really counts is how much light gets to your camera sensor. 
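Back to the progression math for a moment: these scales are easy to generate yourself. Full stops are powers of √2 (that is, 2 raised to n/2), and the half- and third-stop scales just divide the exponent more finely. A minimal sketch:

```python
def f_stops(steps_per_stop=1, count=10):
    """Generate the f-stop progression: 2 ** (n / (2 * steps_per_stop)).
    steps_per_stop=1 gives full stops, 2 gives half stops, 3 gives third stops."""
    return [round(2 ** (n / (2 * steps_per_stop)), 2) for n in range(count)]

print(f_stops(1, 9))   # full stops:  [1.0, 1.41, 2.0, 2.83, 4.0, 5.66, 8.0, 11.31, 16.0]
print(f_stops(3, 10))  # third stops: [1.0, 1.12, 1.26, 1.41, 1.59, 1.78, 2.0, 2.24, 2.52, 2.83]
```

The values marked on lenses are rounded versions of these exact numbers; f/11 is really f/11.31, for instance.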
The F stops can be off by up to a whole F-stop’s worth of transmission, depending upon the lens design and how good the lens multi-coating is. Zoom lenses are particularly dishonest about their transmission. It doesn’t make you a better photographer, but it’s fun to know how things got to be the way they are. By the way, did you know that early camera shutter speeds had sequences like 1/400, 1/200, 1/100, 1/50… ? That actually seems more logical to me than what they have today. Also, did you know that speeds like 15 and 30 seconds are actually 16 and 32 seconds, respectively? The camera makers just lie about these values. Time them for yourself to see. #howto
- Nikon Custom Settings Banks versus Photo Shooting Banks
Many people have a fundamental confusion about Nikon memory banks on their “pro” model cameras. Nikon keeps the distinction as clear as mud. The “Photo shooting menu bank” is found in the “Photo Shooting Menu” (the little camera icon). You can assign up to four of these (A,B,C,D) to have unique settings in each. You can also name these banks to be something meaningful. I have settings for “Sports”, “Landscape”, “Manual”, and “Live View”, so that’s what I named them. Here’s where you save your unique setups that you configure for things like Auto-ISO, default ISO, Manual shooting mode, picture controls, shutter speed, aperture setting, etc. The “Custom settings bank” is found in the “Custom Setting Menu” (the pencil icon). What you save here are things like custom button assignments. You also get four of these (also named A,B,C,D). This is where the confusion sets in. Fortunately, you can give these banks names, too. If you give these banks different names than the “photo shooting” banks, then it will help eliminate confusion. Both banks are added into "My Menu" for fast access Photo shooting bank Custom settings bank I have the names “Focus buttons” and “Live View” in my custom settings banks. The “Focus buttons” bank saves the way I have configured my AF-ON, joystick, Fn1, Fn2, and PV buttons. These button assignments prevent focusing in Live View unless I use the touch-screen, which I find enormously irritating. The bank I named “Live View” clears the various “AF-ON plus area mode” features I assign to other buttons. Once these custom button assignments are cleared (by selecting my “Live View” custom settings bank), then my “AF-ON” button can once again be used to get Live View to auto-focus. In the “Setup Menu” (the little wrench icon), there’s a “Save/load settings” option to save the shooting configuration (all shooting banks and all custom settings banks). To get at the two memory banks quicker, I assign my “Fn2” button to go to “My Menu”. 
Inside “My Menu”, I added the “Photo shooting menu bank” and the “Custom settings bank”. Even if I don't select a different bank here, it's a very fast way to verify how my camera is presently configured. I still wish Nikon would just stick the “U1”, “U2”, etc. dial on all of their cameras, which is infinitely faster than menu-diving. And just try menu-diving if you're half blinded by bright sunshine. #howto
- Nikon D850 Buffer Capacity Reality Testing
Competitor #1: Sony XQD I tried to look up the D850 camera shot buffer numbers on the internet. All over the map. Either terrible or stunning, depending on who you ask. Nikon claims it should have a 51 shot buffer with the settings I shoot with. I felt compelled to conduct some testing of my own, using a couple of different memory card types, since I’m a natural born skeptic. The manufacturers of those memory cards seem to completely fabricate their numbers. Going by “the specs” is a fool’s errand. So here’s my testing scenario. I set my D850 to ISO 64 (least noise and therefore smallest picture memory) with large-14-bit-lossless-compressed RAW format. I used continuous-high shooting speed, which the “specifications” rate at 7 frames per second. I only populated the camera with a single memory card at a time. I shot landscape pictures in the sunshine with “typical complexity”. Noisy, complex pictures take up more memory and will therefore decrease the buffer numbers. I use a battery grip, but I just use the standard “EN-EL15a” battery in it, so no 9 frames per second for me. The first card I tested is the “Sony G-series 400MB/s write speed“ 32GB XQD card. I have read that in actual reality it writes at very roughly “113.84 MB/s”, according to this site when tested in the Nikon D850 camera. This sounds like a case of “the large Sony print giveth and the small reviewer’s print taketh away”. The second card I tested is the “Lexar Professional 1000X 64GB 150MB/s” card, which the fine print states as being capable of writing at 75MB/s. Competitor #2 fits in the SD card slot. For both cards, I formatted them just prior to testing, so that storage fragmenting wouldn’t be an issue with the timing. Results So here’s what I got. The Sony XQD card managed 37 shots before it hiccupped and slowed down. The Lexar card (Laxar?) got me 24 shots before slowing down. Yikes. Pretty underwhelming. 
If I were shooting on an NFL sideline or an Olympic track, this camera setting probably wouldn’t be my first choice. For most other stuff, I probably couldn’t care less. I have to give credit where credit is due, however: I got a little faster than 7 frames per second. Actually, I got 37 complete (XQD) frames in 5.06 seconds, or 7.31 frames per second. I used the sound track timing from a video to “visualize” each shutter/mirror slap. Nikon wasn’t lying there; they were actually a bit conservative. Next, I set the D850 to 12-bit lossless compressed raw, and voilà, the XQD got 200 shots at full speed! The ‘Laxar’, however, only got me 34 shots with this 12-bit setting. I could change the scene 'complexity' and brightness and get fewer shots; typically about 193 shots in 27.5 seconds (7.02 fps). For a really complex scene, I once got only 101 shots in 13.2 seconds (7.65 fps). Now you can see why people argue about the real buffer size. Addendum 9-21-2019: I got more interested in 'scene complexity' and did a lot more testing. There were times where I got an average of 43 shots in 6.1 seconds (7.05 fps) using the lossless compressed 14-bit. Still not Nikon's 51 shots, but maybe there's a super-fast XQD card out there that can squeeze out the extra 8 shots. The more I test, the murkier the results... Since the D850 is all about quality, why on earth would I ever be willing to go down to 12-bit shooting? I read an article here by ‘Verm’ Sherman that changed my mind about 12 bits. He tried and tried to demonstrate how inferior the 12-bit files are, compared to 14-bit, but was unable to do so. The shots just kept looking spectacular and equal in his tests. I did some tests myself, and I have to agree; I can’t tell the difference. But I did notice the difference of about 15MB smaller file sizes, which really adds up over time. 
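The arithmetic behind these numbers is worth sketching out. Below is a crude fill-rate model in Python; the ~50MB file size and ~1GB of usable buffer RAM are my own assumptions (real cameras complicate this with variable compression and internal pipelining), so treat it as illustrative, not as Nikon's actual buffering scheme.

```python
def frames_per_second(frames, seconds):
    """Measured burst rate from a frame count and elapsed time."""
    return frames / seconds

def shots_before_slowdown(buffer_mb, fps, file_mb, write_mb_s):
    """Crude model: the buffer fills at (fps * file_mb - write_mb_s) MB/s.
    Returns how many shots fit before the camera must slow down."""
    fill_rate = fps * file_mb - write_mb_s
    if fill_rate <= 0:
        return float("inf")   # the card keeps up; the buffer never fills
    seconds_to_fill = buffer_mb / fill_rate
    return int(fps * seconds_to_fill)

# The measured burst above: 37 frames in 5.06 seconds
print(f"{frames_per_second(37, 5.06):.2f} fps")   # 7.31 fps

# Hypothetical model run: 1GB buffer, 50MB files, the ~114MB/s measured card speed
print(shots_before_slowdown(1000, 7.31, 50.0, 114.0))
```

The model also shows why 12-bit raw explodes the buffer count: shrink the file size enough and the fill rate goes negative, meaning the card drains the buffer as fast as the sensor fills it.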
On my own camera, I have a “Sports” photo shooting bank that uses the 12-bit lossless compressed setting, to 'guarantee' that I get the 200-shot buffer. I have a separate “Landscape” photo shooting bank that is set to 14-bit lossless compressed mode, but it’s mostly for “insurance”, just in case some future display can show a difference. Spending an extra 15MB per shot does seem like a painful insurance premium, however. I have to admit that I feel a lot less guilty about having my “Sports” mode on 12-bit, though. The quality to my eye is stunning, and there is still a ton of elbow room in the dynamic range. I can’t resist mentioning that my Nikon D500, using the exact same “Sony G-series 400MB/s write speed“ 32GB XQD card, has a 200-shot buffer, and it shoots 10 frames per second to boot using 14-bit lossless compressed (or any other setting except uncompressed 14-bit). Smokin’. And verified. And no excuses. There are just so many variables when it comes to shot buffer capacity that I have to recommend that you verify yours before you try shooting that once-in-a-lifetime opportunity. You never know when the Loch Ness monster and Bigfoot might happen to show up in that forest clearing at the same instant, and you’re the only witness. #review
- Nikon D500 Un-cropped versus D850 Cropped Shot Comparison
There’s been a lot of talk about using crop sensor cameras for subjects that need that “effective focal length” increase, for distant subjects like birds. For example, a 600mm lens has an effective focal length of (600 X 1.5) or 900mm. This may be close to the truth if both the full-frame and the crop sensors have the same overall resolution, but what about an FX camera that has more resolution than the DX camera, like the D850 versus the D500? Does the Nikon D500 beat the D850 if you crop the D850 picture down to the same field of view as the D500? I decided to find out for myself just how good the D850 sensor is, and see if it can match the D500, even if you crop the D850 shots down to the same view you get with the D500. I know that the D500 can shoot faster and has a bigger frame buffer, but the D850 is no slouch, either. Both cameras are just as sensitive to light and can focus at the same speed, too. I’m not here to talk about the merits of one camera over the other; I’m only interested in knowing if I lose any quality using the D850 and crop the shots, when compared to un-cropped D500 shots. If you like wide-angle shooting, there is of course no substitute for a full-frame camera. There are multiple “full-frame advantage” topics I could talk about, but I want to focus strictly on cropping here. To get some answers, I set up a resolution target and then shot it from the exact same position and with the same lens; I just swapped camera bodies. After running it through my image analysis software, I took a look at the shots up close. The shootout: D850 versus D500 I used my Sigma 70-200mm f/2.8 lens at 70mm and f/2.8 for these tests. I shot in raw format, and I didn’t post-process the shots in any way before analyzing them in my image analysis software. I wanted to mention that I shot the chart with an exposure compensation of +0.7 stops with both cameras. 
The D500 meter consistently ended up with slightly darker images than the D850, but the image analysis software measurements are unaffected by that small difference. The shot above is the MTF50 2-dimensional resolution plot from the D500. The measurements are in units of line pairs per millimeter (lp/mm). The 20.9 megapixel D500 sensor pixels are 4.22 microns, compared to the 45.7 megapixel D850, with pixels of 4.35 microns. The resolution chart is filling the frame left-to-right, and some of the chart goes outside of the frame top-to-bottom. The shot above is the D850 using the same lens at the same distance. The plot looks a little funny, because the resolution chart no longer fills the frame. The actual resolution measurements of the little trapezoids in the target chart are unaffected by the framing difference, however. There are essentially the same number of sensor pixels on each little target trapezoid for each camera. The resolution results between the D500 and D850 look quite close. Given the different pixel dimensions between the cameras, the D500 is expected to have slightly higher MTF50 lp/mm resolution numbers than the D850, if the little target trapezoids have the same “cycles per pixel” resolution. The resolution results are remarkably similar, here, and the D500 numbers are a tiny bit higher, but generally within experimental error. D500 target center The D500 edge measurements of the little trapezoids are in the range of 0.27 c/p to 0.32 c/p. For this sensor, a measurement of 0.31 c/p is equivalent to an MTF50 of 73.3 lp/mm. The peak value of 0.32 equates to an MTF50 of 75.7 lp/mm, which matches the maximum value shown in the D500 MTF50 chart above. D850 target center The D850 edge measurements of the little target trapezoids are in the range of 0.28 c/p to 0.31 c/p. For this D850 sensor, a measurement of 0.31 c/p is equivalent to an MTF50 of 71.6 lp/mm. At least in the lens center, then, there’s essentially no difference between the cameras. 
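Two bits of arithmetic underpin this comparison, and they’re easy to check yourself. Cropping the D850’s 45.7MP frame to the DX field of view divides the pixel count by the crop factor squared, and a cycles-per-pixel reading converts to lp/mm by dividing by the pixel pitch. A sketch, using the 1.5 crop factor and the pixel pitches quoted above:

```python
# Cropping full-frame (FX) to the DX field of view
d850_mp = 45.7
crop_factor = 1.5
cropped_mp = d850_mp / crop_factor**2
print(f"D850 cropped to DX: {cropped_mp:.1f} MP (vs the D500's 20.9 MP)")  # ~20.3 MP

def cpp_to_lpmm(cycles_per_pixel, pitch_microns):
    """Convert a cycles/pixel MTF50 reading to line pairs per millimeter."""
    return cycles_per_pixel / (pitch_microns / 1000.0)

print(f"D500: {cpp_to_lpmm(0.31, 4.22):.1f} lp/mm")   # ~73.5
print(f"D850: {cpp_to_lpmm(0.31, 4.35):.1f} lp/mm")   # ~71.3
```

These land within rounding distance of the 73.3 and 71.6 lp/mm figures quoted in the text (the analysis software presumably carries more decimal places on the pixel pitch), and the cropped-megapixel count shows why the two sensors put essentially the same number of pixels on the same subject.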
Next, let’s take a look at the sensor right edge. D500 right edge D850 right edge Comparing the cameras on the right edge, the D500 fared a little bit better on most of the target edges, but the measurements aren’t hugely different. D500 top edge D850 top edge The readings between the D500 and D850 on the top of the frame are also comparable, but here I’d give the ‘edge’ to the D850 results. Without the little blue measurement values to guide me, I would have a hard time telling which shot was from which camera. Conclusion If I were to take a bunch of shots with both cameras and crop the D850 shots to match the D500 and then hand them to somebody to choose which was which, I’ll bet they couldn’t tell. The bottom line is that cropping the D850 shots gets me the same quality as the D500; there is no DX “effective focal length” advantage to be seen here. I have always been on a big guilt trip when I crop a shot; this is definitely going to make me feel better about myself in that regard. At least with the D850. #review
- Sigma Focus Algorithms: Speed versus Accuracy
Sigma lets you program their “global vision” series of lenses with their USB dock. This includes the Sport, Art, and Contemporary lenses. One of the things you can program is which autofocus algorithm to use. You get three algorithms to choose from: “Fast AF Priority”, “Standard AF”, or “Smooth AF Priority”. By assigning a different algorithm to different custom switches on the lens (C1 and C2), you can change your mind on the fly and pick the appropriate focus algorithm to fit the shooting conditions. The “Smooth AF Priority” algorithm is primarily for video use, so I never use it (it’s the slowest focus algorithm). I’m interested in getting the fastest focus performance that I can get, so I want to use the “Fast AF Priority” whenever I can. I have already measured the speed of the “Fast AF Priority” algorithm versus the “Standard AF” algorithm, and found that the Fast algorithm is about 20 percent quicker than the Standard algorithm. I had used a Nikon D500 and the Sigma 150-600 Contemporary for the speed test. I thought I’d try to determine just how repeatable the focus algorithms are. If a camera/lens combination is super fast to focus but is totally unreliable at getting to the correct target distance, then you haven’t really gained anything. I decided to use my Sigma 70-200 f/2.8 Sport lens for this test. I have programmed the C1 switch for the “Fast AF Priority” algorithm, and the C2 switch is programmed with “Standard AF” (“Standard” is also Sigma’s default algorithm if you don’t program the lens). I used a Nikon D850 for the tests. All of the test shots were done at 190mm and f/2.8 from a distance of 1.88 meters. This is a fairly close subject distance, but I wanted to do a test where I could spot even tiny focus errors. Sigma Custom Switch (C1) settings options The screen above shows how to access the autofocus speed options, via the “AF Speed Setting” button. 
It also shows how my C1 lens switch is currently programmed with the “Fast AF Priority” and “Moderate View Mode” optical stabilization on my 70-200mm lens. All of the same options are available for the C2 lens switch. Sigma’s available programmable AF Speed algorithms The picture above shows you the three autofocus speed selections that are available for programming a lens with their Optimization Pro software and their USB dock. You can always change your mind and reprogram the lens later, if you’re not happy with a selection. Sigma already upgraded the firmware in their 150-600 lenses, which vastly improved focus speed. If I hadn’t purchased their USB dock, I couldn’t have taken advantage of their improvements. Focus Comparison Testing Procedures To perform the tests, I would start by first selecting the desired (already-programmed) custom switch setting. I mounted the camera onto a sturdy tripod, because it’s critical to keep the camera at a fixed distance from the target. The camera was set to phase-detect autofocus, with all of the same settings I’d use for regular action photography (where I want fast autofocus). I only used unsharpened raw format for the testing, although jpeg can be used here if you aren’t concerned with accurate target edge resolution values. I mounted a focus target that is designed for focus evaluation/calibration using the free MTFMapper software. The target is designed such that the (middle) camera focus sensor only sees a single high-contrast edge, and won’t be confused by neighboring details to focus on. The target is mounted at a 45-degree angle relative to the camera sensor. This makes it easy to determine what’s in focus and what isn’t. I focus on the middle of the target, where the big vertical trapezoid edge is located. When the target is rotated about the vertical, the trapezoid shape starts to look like a rectangle. 
I focus with the lens wide open, so that there will be no room for doubt about where the plane of best focus ends up. This is, by the way, the same basic setup that I use to focus fine-tune my lenses at close distances. I have bigger targets for focus calibration at longer distances. To spice up the test a little bit, I shot the photos at a light intensity of EV 7.3, which is typical indoor room lighting, and definitely more of a challenge for a focus system than sunlight. The Focus Target The photo above shows what the focus target looks like. The little blue numbers on each little slanted square are resolution measurements for each measured edge. These numbers are placed there via the MTFMapper program when the photo is analyzed. I’m using a small target, which has overall dimensions of just 8.5 inches tall by 9.5 inches wide, plus some whitespace around that. I want small little squares so that I can discern very small focus errors. The “large” vertical target edge I focus on is just 2 inches tall, and each little square is just a quarter inch on an edge (6.4mm). Since the test shots are done with the target rotated by 45 degrees, the little squares in front and behind the large black target edge go quickly out of focus, and have a very low corresponding measured resolution number. Ideally, the highest resolution measurement would be the large vertical edge in the middle of the shot, since that’s where the focus sensor I’m using is aimed. The little squares that line up with that large vertical edge should have a similar resolution number (assuming the camera sensor edge is aligned parallel to the chart). I start by manually shifting the focus well away from the middle of the target and press my “AF-ON” button to initiate autofocus. If all goes well, then the camera will of course focus perfectly on the large vertical edge in the middle of the field of view. The resolution reading (little blue number on the edge) should be highest on that same edge. 
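To see why this setup can expose tiny focus errors, it helps to compute the depth of field at the test settings. Here’s a sketch using the standard thin-lens DOF approximation; the 0.03mm circle of confusion is a conventional full-frame assumption on my part, not a number from the test itself.

```python
def total_dof_mm(focal_mm, f_number, subject_mm, coc_mm=0.03):
    """Approximate total depth of field: DOF = 2 * N * c * s^2 / f^2.
    A near/far approximation, valid when the subject is much closer
    than the hyperfocal distance."""
    return 2 * f_number * coc_mm * subject_mm**2 / focal_mm**2

# The test conditions: 190mm, f/2.8, 1.88m subject distance
dof = total_dof_mm(190.0, 2.8, 1880.0)
print(f"total DOF: {dof:.1f} mm")   # ~16.4 mm, i.e. roughly +/- 8 mm
```

With only about ±8mm of sharpness around the focal plane, a 45-degree-tilted target makes even a few millimeters of focus miss show up plainly in the resolution numbers.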
I repeat this procedure over and over again; each time I de-focus the lens and press the AF-ON button to re-focus on the target edge and then take the shot. Reality rears its ugly head, however. The resolution measurements will show where the lens actually ended up focusing. If you have quality equipment and have properly calibrated the focus “fine tune”, the best focus should at least be “near” to the desired focus distance. The camera’s phase-detect sensors will tell the camera when focus is “good enough”, and the camera then tells the lens to stop focusing. If you were to shoot in really dim lighting, then you may experience focus-hunting; use bright-enough lighting that your camera doesn't have to struggle with this test. This test, then, is to evaluate the range of distances where focus ended up while using first the “fast” autofocus algorithm (C1 switch), and then using the “standard” autofocus algorithm (C2 switch). Examining the focus target up close In the shot above, I had turned the focus target upside-down, so that the right side of the target is rotated away from me. As you can see, the zone of sharp focus is really narrow. In this shot, the focus was perfect, and the little squares aligned above and below the large vertical edge have the highest resolution numbers (0.18 cycles per pixel). You might notice that your camera will tend to focus too near if you start your focus distance setting in front of the target. As soon as the camera thinks focus is “good enough”, it stops the focus action. If you start from the far side of the target, the focus can tend to be too far (once again, it entered the “good enough” zone and stopped). Keep this in mind when performing focus fine-tune calibration; do a set of shots starting focus nearer and then a set of shots beyond the focus target to verify your camera’s focus behavior. My Nikon D850 doesn't suffer from the stop-focus-too-soon problem, no matter if I focus near-to-far or from far-to-near. 
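Since focus accuracy is a statistical question, summarizing many trials beats eyeballing single shots. Here’s a minimal sketch of how you might tabulate your own runs; the error samples below are made-up numbers for illustration, standing in for the per-shot focus misses (in mm) you would read off the chart.

```python
import statistics

# Hypothetical focus-miss measurements in mm (+ = back-focus, - = front-focus)
misses_mm = [1.5, -2.0, 0.5, 3.0, -1.0, 2.5, -0.5, 1.0, -3.0, 2.0]

mean_miss = statistics.mean(misses_mm)       # systematic bias (fine-tune candidate)
spread    = statistics.stdev(misses_mm)      # shot-to-shot repeatability
worst     = max(abs(m) for m in misses_mm)   # biggest single miss

print(f"mean: {mean_miss:+.2f} mm, stdev: {spread:.2f} mm, worst: {worst:.1f} mm")
```

A mean that sits well away from zero suggests the lens needs a focus fine-tune offset; a large standard deviation points at a repeatability problem that no fine-tune value can fix.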
Test Results I couldn’t detect any difference in the tendency to miss focus with either the Standard or Fast autofocus algorithms. I did half of the tests starting focus too near the target and half starting focus beyond the target; it didn’t alter the results. I didn’t have a single focus miss of more than 7mm at 1.88 meters target distance, no matter which focus algorithm was chosen. I shot about 100 tests overall, to best determine “average” focus behavior. Never make a focus determination on the basis of a single shot; this is one of those “statistical” things. With either focus algorithm, the focus was on average within 3mm of perfect. I had previously done this same testing procedure on my Sigma 150-600 Contemporary lens. I didn’t see any accuracy or repeatability problems by using the Fast algorithm instead of the Standard algorithm on that lens, either. This doesn’t, of course, guarantee that all of Sigma’s lenses behave this well. Always "trust but verify". Here, then, is a case where you get it all: speed, repeatability, and accuracy. If there aren’t any focus repeatability differences between the Fast and the Standard algorithms, then why would you choose the slower Standard algorithm? I have kept my C2 switch programmed with the Standard algorithm as a sort of insurance policy, but I haven’t needed it yet for general photography. It may be that in extremely dim lighting the Standard focus algorithm might be more reliable, but I haven’t tested it. I’ll leave that task to the reader, as they say. I tried to describe my test procedures in painstaking detail, in case you want to verify your own Sigma lens/camera combination. The autofocus algorithm choices, not to mention all of the other programmable choices, are of course unavailable to you if you don’t get the Sigma USB dock. For me, the ability to customize my Sigma lenses using their dock has made all the difference. #review
- Flashpoint Wave Commander Remote Shutter Intervalometer Review
If you have to deal with long exposures or image stacking, here’s a gizmo you might be interested in. The Flashpoint Wave Commander can control taking a long series of photographs. You get to specify how long to wait before taking the shots, the shot duration, how many shots, and the delay between shots. Flashpoint Wave Commander You can see the plug-in cord for the camera. This part is what you can replace to fit other camera models. Use the multi-direction control and its “set” button to program it. The Flashpoint shutter release button is the big round button shown on the left. Connected to camera’s 10-pin plug The Flashpoint connects to your camera’s remote control input plug (e.g. the 10-pin plug on my Nikon D500 and D850). It’s modular, so you can buy cheap (about $8) separate plugs to fit many Nikon, Canon, Sony, Samsung, Matsushita, Pentax, Olympus, and Panasonic cameras. I endured a lot of tedious photography of things like star-scapes and infrared landscapes using my watch to monitor shutter times from around 15 seconds to 4 minutes. I finally got smart and got this unit. This intervalometer lets you specify how long to wait before you take any shots, how long the exposure should be, how many shots to take, and how long to wait between shots. It’s really easy to set up, and it remembers your settings for the next time, unless you turn it off. It has a beeper, if you want sound, and also a screen backlight for night photography. A pair of AAA batteries powers the unit. You can set any of the times from 1 second through 100 hours, and you can take from 1 to 399 shots in a sequence. Just press its little start/stop button to start the program running. To configure your camera to use the intervalometer, you need to be in “manual” mode, and set the shutter on “bulb”. Set the “single-shot” mode, also. Make sure you’re in “release-priority”, so that the camera won’t freeze if it isn’t in focus. 
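If you’re planning a long sequence, the total run time implied by those four settings is simple arithmetic. A sketch (this is my own model of the timing, not Flashpoint’s documented behavior, so verify it against your unit):

```python
def sequence_seconds(start_delay_s, exposure_s, shot_count, interval_s):
    """Total run time: initial delay, then N exposures, with an interval
    between consecutive shots (so only N-1 intervals)."""
    return start_delay_s + shot_count * exposure_s + (shot_count - 1) * interval_s

# Hypothetical example: wait 10 s, then 30 four-minute star frames, 5 s apart
total = sequence_seconds(10, 240, 30, 5)
print(f"{total} s = {total / 3600:.2f} hours")
```

Handy for knowing when to wander back to the tripod, since the whole point of the device is that you can walk away.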
It’s also wise to close the eyepiece shutter (or your viewfinder blocker) so that light can’t enter the viewfinder during the exposure. Even though my Nikons have intervalometer features built into them, I find this device superior. And the price is right. It is “wired” to your camera, but once you program it, you can start it and walk away until the program finishes. You can also use this remote as a simple wired shutter release, even if its batteries go dead. If all you want to do is take a photo without camera shake, then just connect the unit (don’t even bother to turn it on) and press the Flashpoint’s shutter release button instead of its start/stop button. Simple. I don’t get any money from these guys, so I have nothing to gain if you get one or not. I just wanted you to be aware that this device exists; I really like mine. #review
- Fix that Lens Infrared Hotspot with Lightroom
If you have a lens that generates that dreaded hotspot in the middle of your photos when you try infrared photography, you may want to try this trick. Lightroom offers the “radial filter”, which you can use to make that hotspot disappear. Most modern lenses are quite poor at infrared photography, because manufacturers no longer take care to use proper internal anti-reflection coatings that are effective against infrared light. There are of course limits to how bad your lens can be, but for many lenses, you can use the radial filter to darken that hotspot and save the picture. The dreaded hotspot in the middle of the shot The shot above was taken with a Nikkor 18-55mm kit lens (Nikkor 18-55 3.5-5.6 GII DX VR) that most websites will report as “good” for infrared photography. I used an 850nm infrared filter and took the shot at f/11. The picture looks ruined to my eye, due to that pesky hotspot. Next, let’s take a look at what Lightroom can do to rescue the shot. Configure a radial filter to fix that hotspot As shown above, select the radial filter, and click the middle of the hotspot in the picture. Drag the mouse to get the desired diameter for the filter to surround the spot. Make sure to click on “Invert Mask” so that the filter will affect the interior of that circle. Set the feathering amount, so that the edges of the filter circle will blend into the background. You might want to temporarily set the following, also: Tools | Adjustment mask overlay | Show overlay This command will let you see your mask, and it’s quite helpful while you are adjusting the “Feather” amount. After you’re done, select “Hide overlay”. There's also a "Show Selected Mask Overlay" checkbox below the image to turn the mask on/off. Lightroom also lets you change the mask color, if you find it too difficult to evaluate the effect using the default red color. Fine-tune the radial filter Decrease the exposure value, until the hotspot is darkened to match its surroundings.
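Conceptually, an inverted radial filter is just a feathered circular mask with an exposure pull applied inside it. Here’s a rough numpy sketch of that idea — my own approximation for illustration, not Lightroom’s actual algorithm:

```python
import numpy as np

def radial_darken(img, cx, cy, radius, feather, stops):
    """Darken a feathered circular region (an 'inverted' radial mask).

    img: float image (H, W) with values 0..1; cx, cy, radius, feather
    are in pixels; stops = exposure reduction at the mask center.
    """
    yy, xx = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    dist = np.hypot(xx - cx, yy - cy)
    # Mask is 1 well inside the circle, fading to 0 at the radius.
    mask = np.clip((radius - dist) / feather, 0.0, 1.0)
    # Exposure works in powers of two: -1 stop halves the brightness.
    return img * 2.0 ** (-stops * mask)

# Synthetic 'hotspot': a flat grey frame with a bright bump in the middle.
yy, xx = np.mgrid[0:100, 0:100]
shot = 0.5 + 0.3 * np.exp(-((xx - 50) ** 2 + (yy - 50) ** 2) / (2 * 15 ** 2))
fixed = radial_darken(shot, cx=50, cy=50, radius=40, feather=20, stops=0.68)
# The center now roughly matches the 0.5 background; corners are untouched.
```

The feather term is what keeps the repaired circle from showing a hard edge, which is exactly what the “Feather” slider controls in the real tool.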
When you’re happy with the mask settings, click “Done”. Go ahead and perform the usual edits after you’re finished using the radial filter. With the infrared filter I used, I usually prefer to turn the shot into black and white. The hotspot is gone Finished shot As you can see above, the hotspot is basically gone. I converted the shot into black and white, which I almost always do with this particular IR 850nm filter. The plug-in Silver Efex Pro 2 can be very helpful in manipulating the shot as black and white, by the way. You have to be careful that you don’t over-expose the shot to the point where the hotspot gets into the “clipping” region in any of the R, G, or B color channels. At that point, you have to admit defeat; the shot’s not recoverable. I have a few lenses that are so-so when shooting infrared. There’s a mild hotspot in each of them, particularly when I stop the aperture down beyond about f/5.6. This simple trick can save the shots that I’d otherwise send into the trashcan. #howto
- Should You Turn Off Vibration Reduction When Using a Tripod?
I have always read that you must turn off your lens vibration reduction when shooting on a tripod. So what happens if you don’t? Are your shots hopelessly blurred? Do all lenses behave equally badly if you forget to turn VR off? Is Vibration OFF mandatory for tripod use? I tend not to just accept what I’ve read or been told at face value. So, naturally, I decided to conduct a test to find out for myself. I already know that keeping VR active while using a gimbal head works fine. I decided to test a Sigma and a Nikon lens, in case the two different companies use entirely different technology in their anti-vibration systems. In both cases, I chose their latest-generation lenses that should represent the state of the art in vibration reduction (or “optical stabilization” as Sigma calls it). Really old lenses with first-generation VR might give different results, but for now I wanted to try modern gear. I chose to test the Sigma 70-200mm Sport at 70mm and f/2.8 and the Nikkor 24-70mm f/2.8 E VR at 70mm and f/2.8. In both cases, I used a shutter speed of 1/160. There is also lore that says “don’t go beyond 1/500 shutter with VR active”, which I have also debunked with my “modern” lenses. The Sigma lens was set up with their OS algorithm called “Moderate View Mode”, although all of their OS algorithms are supposed to achieve identical anti-shake results on the sensor. The Nikkor lens was set up with the “Normal” VR reduction mode. Both of the selected VR modes mentioned above are my standard ones to use, and therefore the ones I’d forget to turn off when mounting my camera on a tripod. Believe it or not, I have forgotten to turn off VR more than once. In all tests, I used a really heavy tripod, since a flimsy tripod would probably need lens VR active anyway. I mounted the lenses onto my Nikon D850, and I shot the tests using Live View (with contrast detect) and with “Silent Shutter”, to guarantee that there would be zero camera vibrations.
I used a wired remote shutter release. Comparison Resolution Results: Sigma The plots above show the MTF50 resolution (measured in line pairs per millimeter). These 2-D plots show the entire sensor surface results. This kind of plot could be handy in case any vibrations would tend to mess up resolution in either the vertical or horizontal direction. The “meridional” plot measures resolution in what’s often called the “tangential” direction. The “sagittal” plot measures resolution parallel to “spokes” emanating from the lens center. The first plot is a “reference”, since vibration reduction is turned off. Center resolution peaks at about an MTF50 of 62 lp/mm. Again, the camera is on a tripod. In the plots above, vibration reduction was turned on while the camera was mounted on the tripod. The resolution in the “VR ON” mode is actually a tiny bit higher, but essentially the same as the “VR OFF” results, within experimental error. I would conclude from these results that it really doesn’t matter if anti-vibration is active or not. I actually took many shots of my resolution target with both VR=ON and VR=OFF. I really couldn’t discern any overall difference between VR active or not. The average MTF50 for 10 shots with VR ON was 62.3, and the average for 10 shots with VR OFF was 60.2 lp/mm. Given the shot-to-shot variation, these values should be considered to be about the same. Comparison Resolution Results: Nikkor The plots above are my reference standard for my Nikkor 24-70 at f/2.8 without any vibration reduction while mounted to my tripod. Peak resolution is about 52 lp/mm. With VR active, the results don’t look any different. Again, the camera is on the tripod. Peak resolution looks about the same as with the VR OFF shot. The average of 10 shots with VR ON was 50.2 lp/mm and the average of 10 shots with VR OFF was 50.5 lp/mm. Again, these average values should be considered about equal. Slow Shutter Speeds Is there any concern about VR with slow shutter speeds?
I tried using my Sigma 150-600 at 600mm, ISO 64, f/11, and 1/25 second shutter. This is a crazy slow shutter speed for this lens, even on a tripod. VR ‘off’ testing showed a peak MTF50 of 34 lp/mm and an average MTF50 of 31.3 lp/mm. VR ‘on’ testing showed a peak MTF50 of 37 lp/mm and an average MTF50 of 32.6 lp/mm. If anything, leaving VR active helped a little bit. It certainly didn’t harm anything. Conclusion I don’t think I’ll bother to turn VR OFF when I use a tripod for a short period. I will still probably turn it off for extremely long shutter speeds (like several seconds) if for no other reason than to save some battery power. You might want to do some testing of your own if you have some old lenses with ancient vibration reduction hardware. I don’t want to imply that these three different lens test results are guaranteed valid across all lenses (especially other brands). I keep finding that you can’t just take photography rules at face value. Find out what your gear can actually do, and it will enable you to be a better photographer. #howto
- Lens Resolution Measurement: Avoid Sharpened Jpeg Like the Plague
You can see wildly-varying lens resolution measurements for the exact same lens model out there on the internet. Do manufacturers really make lenses with that much variation? I think not. Many (most) internet sites that show lens resolution measurement results don’t divulge how their measurements are done. Some sites actually state that they use jpegs of their resolution target straight out of the camera. Those same sites don’t tell you how much sharpening was used for those jpegs. What you do notice, however, is that they invariably show lens resolution results that are “too good to be true”. The way you’re supposed to capture resolution chart images for analysis is with un-sharpened RAW. Only. And leave them that way. Exposure isn’t too critical here, but generally light meters will make black-and-white charts end up with whites looking too grey, unless you boost the exposure a little. Why does it matter how resolution charts are photographed? Because resolution is based upon the transition from light-to-dark on target edges. Modern resolution measurement software is based upon how many pixels it takes to go from the white chart background to the maximum black on a target edge. The faster that transition occurs, the higher the resolution measurement you’re going to get. Details of the measurement process are discussed in this article. How does sharpening of a photo work? By altering the light-to-dark transition on edges of objects in the photo. Do you see the connection? You can basically dial in the desired test results by adjusting the sharpening. Now your test results are meaningless. Internet sites that provide lens resolution information should also discuss what kind of camera was used (assuming the measurements include the use of a camera sensor). The sensor resolution and whether or not the sensor has an “optical low-pass filter” (OLPF) is important information. An OLPF will lower the measurement numbers that get quoted. 
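To see the connection concretely, here’s a toy model — my own illustration, not how MTFMapper or Imatest actually computes MTF: an ideal step edge blurred by a hypothetical lens, then run through an unsharp mask. The 10%-to-90% transition width stands in for the “how many pixels from white to black” quantity the measurement software extracts from each edge; sharpening shortens it, which reads as higher resolution.

```python
import numpy as np

# A step edge blurred by a hypothetical lens (Gaussian blur, sigma in px).
x = np.arange(-50, 50)
psf = np.exp(-x**2 / (2 * 3.0**2))
edge = np.cumsum(psf / psf.sum())      # black-to-white profile, 0..1

# Unsharp mask: boost the difference between the edge and a smoothed copy.
k = np.exp(-x**2 / (2 * 2.0**2))
k /= k.sum()
sharpened = edge + 1.5 * (edge - np.convolve(edge, k, mode="same"))

def transition_px(profile):
    """Pixels needed to rise from 10% to 90% -- fewer reads as 'sharper'."""
    return int(np.argmax(profile > 0.9) - np.argmax(profile > 0.1))

print(transition_px(edge), transition_px(sharpened))
# The sharpened edge transitions in fewer pixels, so the same lens
# 'measures' dramatically better -- without actually getting any better.
```

The lens never changed; only the edge profile did. That’s the whole trick behind too-good-to-be-true jpeg measurements.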
If you don’t know this information, then you can’t compare one site’s lens measurements against another site’s measurements. I think an example is in order, to prove the point. And because talk is cheap. I am using the MTFMAPPER program, but programs like Imatest work the same way. They all find (slanted) edges in the photo, and count how many pixels it takes to go from white to black. When they know the size of the camera sensor pixels, how many pixels are in a row or column of your sensor, and how big your sensor is, then they can give you resolution measurements in a variety of different ways. You might get readings such as “cycles per pixel”, “line pairs per picture height”, “lines per picture height”, etc. at a particular contrast level (like 50%). The resolution chart with lots of edges to measure The chart shown above is a typical “slanted edge” resolution chart. You photograph it with the lens, camera, aperture, distance, and zoom setting you want to evaluate. Each edge of the little trapezoids will get measured by software to determine the lens resolution at that location in the field of view. For optimal results, the chart should just barely fill the field of view and be absolutely parallel to the camera sensor. The chart should also be parallel to the edge of the camera frame (for an optimal ‘slant’). Resolution Measurement Comparisons RAW, unsharpened chart: How it is supposed to be done Shown above is the two-dimensional MTF50 chart plot, showing the “line pairs per millimeter” (lp/mm) measurements from the un-sharpened Raw photo of the test chart. This is a really good lens, and the peak measurements around 62 lp/mm indicate how good the lens is. This is the picture format of the resolution target that should be used for analysis. Same chart shot, but now Jpeg sharpened in LightRoom. Amount = 36, Radius = 1.0 The moderately-sharpened jpeg of the same chart photo shows some too-good-to-be-true resolution measurements. 
Everybody would be standing in line to buy this baby, if these measurements were actually legitimate. I used Lightroom to adjust the original .NEF raw photo with very modest sharpening, and then exported it into jpeg format. You see a huge jump in resolution; upwards of 118 for the MTF50 lp/mm measurement. Fake! Fraud! Bogus! Jpeg sharpened in Lightroom, Amount = 50, Radius = 1.3 I jacked up the sharpening in this version of the same raw original, exporting it with the “Amount” parameter changed from 36 to 50, and increasing the “Radius” parameter from 1.0 to 1.3. The MTF50 now passes 120! Anybody who’s paying attention would start to get pretty suspicious about these measurements. Faker! Frauder! Boguser! The Chart Up Close Let’s take a look up close to see what’s happening in each shot. Unsharpened Raw shot. Numbers are edge “cycles per pixel” measurement. The close-up above is near the chart center, showing the edge MTF50 measurements in units of “cycles per pixel”. The measurement software overlays the measurements onto each of the edges. This is the raw-format shot without any image processing to adjust it. The measurement of 0.26 above, for instance, is an MTF50 of 60.1 lp/mm on this Nikon D850 sensor. In other units, this measurement is 2873 lines per picture height. If this shot were a landscape, the urge to sharpen it up would be overwhelming, but don’t! The jpeg shot above is the same photograph, at the same chart location, but with minimal sharpening applied. The resolution measurements are hugely different, because the edges have a much shorter transition zone between black and white. The fuzzy grey zone between black and white is mostly removed. This is what makes sharpened photos look so much better than untouched raw versions. The measurement of the same edge has jumped from 0.26 c/p to 0.51 c/p, or from 60.1 to an astonishing MTF50 of 117.8 lp/mm. This same measurement is the equivalent of 5630 lines per picture height.
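The unit conversions quoted above are straightforward arithmetic. Here’s a sketch using this article’s D850 figures (23.9 mm sensor height over 5520 pixel rows, i.e. a pixel pitch of roughly 4.33 microns):

```python
# Converting slanted-edge readings between units, using this article's
# D850 figures: 23.9 mm sensor height over 5520 pixel rows.
SENSOR_HEIGHT_MM = 23.9
PIXEL_ROWS = 5520
PITCH_MM = SENSOR_HEIGHT_MM / PIXEL_ROWS      # ~0.00433 mm per pixel

def cpp_to_lpmm(cycles_per_pixel):
    """MTF50 in cycles/pixel -> line pairs per millimeter."""
    return cycles_per_pixel / PITCH_MM

def lpmm_to_lph(lp_per_mm):
    """Line pairs/mm -> lines per picture height (2 lines per pair)."""
    return lp_per_mm * SENSOR_HEIGHT_MM * 2

print(round(cpp_to_lpmm(0.26), 1))            # ~60.1 lp/mm (raw shot)
print(round(cpp_to_lpmm(0.51), 1))            # ~117.8 lp/mm (sharpened jpeg)
print(round(lpmm_to_lph(cpp_to_lpmm(0.51))))  # ~5630 lines per picture height
```

Note that the same cycles-per-pixel reading converts to different lp/mm numbers on a camera with a different pixel pitch, which is exactly why comparing measurements across sites that use different bodies is so treacherous.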
Almost like getting your hands on some sort of advanced alien technology. More aggressive sharpening makes the edge transitions even more abrupt, which translates into astronomically high resolution measurements. But those measurements are of no use to evaluate actual lens performance. The same edge here jumps to an MTF50 of 122.4 lp/mm, or 5851 lines per picture height! Outrageous! Conclusion There’s lots of bogus information out there on the internet. This is just another example of how that can happen, couched in the cloak of “science”. Editing tools like the “unsharp mask” definitely have their place in photography, but not when trying to analyze how sharp a lens is. As the old saying goes, “buyer beware”. #howto
- Nikon AF Nikkor 75-300 f/4.5-5.6 Zoom
This lens harkens back to the early era of Nikon zoom lenses, when everyone was still using 35mm film. It was manufactured from 1989 through 1999. Your Nikon camera needs to have the in-camera focus motor to use this lens; I performed all of the lens tests using my D850. This is a push-pull kind of zoom, which has long since gone out of favor with photographers. At least you don’t have to worry about which direction to twist a zoom ring. If you want to use manual focus, you have to switch the camera focus switch to “manual”. The lens uses 62mm filters, and the filters (plus the end of the lens) unfortunately rotate while focusing. There’s a focus-limit switch, and I’d recommend that you use it. Try to avoid the “full” focus range setting; focusing through the full range is dog slow. The lens has 13 elements in 11 groups. The lens weighs 850 grams. To me, it feels pretty light. It uses the HN-24 screw-in lens hood, although I got a cheap rubber lens hood for it that works just fine. The lens is about 6.6 inches long un-zoomed. The 9-blade aperture can be stopped down to f/32.0 at 75mm and f/40.0 at 300mm. This lens has the old-style full aperture ring with click-stops, but you lock it at the minimum aperture on modern cameras for auto-exposure. The lens barrel is all metal, and it operates smooth as silk. Nikon really went all-out with mechanical tolerances during this era, and its functionality hasn’t degraded at all over the years. There’s no “wiggle” to be found in this lens. It has, of course, a metal lens mount, but there’s no rubber weather seal or any other sealing. The 75-300 has a non-removable tripod collar that doesn’t have any click stops in it. It’s quite solid, although it’s narrower than today’s tripod collars. The lens isn’t heavy enough to make a tripod collar mandatory, but it does help the balance. 
The collar tripod foot is quite small; I think it should be a bit larger to make it more stable on tripod heads that have plastic or rubber pads on them. This lens predates vibration reduction, and you really notice its absence at 300mm. It’s easy to get spoiled with modern technology. I have to admit that I was anticipating doing little else besides making fun of how poor the sharpness of this lens is. I didn’t give Nikon enough credit, though. If you’re willing to close the aperture down by only about a half-stop, this lens has very good resolution (at least at the shorter focal lengths). The focus distance data (EXIF data) saved in the photos is garbage; it’s not a “D” lens, so there’s no distance data. It focuses from about 5 feet (1.5m) to infinity. The “macro” range (marked in red on the lens barrel) goes from 5 feet to about 10 feet (3m). The focus “limit” switch keeps the lens inside either of these ranges, depending upon what distance the focus is at when you set the “limit” switch. At the macro setting, you can get down to a magnification of about 1:3.8, which is quite good for a telephoto. Speaking of focus, don’t bother using this lens unless your camera has focus fine-tune calibration or you use live view. This lens desperately requires focus fine-tune calibration or else the results are terrible. Also note that focus calibration changes wildly from short to long focal lengths. Nikon’s mirrorless cameras don’t have in-camera focus motors, so they are of no use here, either. The mirrorless cameras require manual focus with this lens, and also require the FTZ (F-mount to Z-mount) adapter. I didn’t notice any distortion in my photographs at any focal length. I didn’t notice enough vignetting to bother fixing it in my photo editor, either. Shots at the end of the article show the extent of vignetting and distortion. There didn’t seem to be much chromatic aberration, which surprised me. I really only noticed it at longer focal lengths with wide apertures.
Subjects like small tree branches against the sky are where you see this purple fringing; see the photos at the end of this article. 75-300 lens at 300mm zoom on Nikon D850 The shot above shows the manual-focus ring near the front of the lens. Note the fairly skinny tripod collar and its tiny foot. There’s no wiggle in this lens or collar, though. The rear of the lens has the full-blown aperture ring. Lens at 200mm Focus scale and limit switch up close Note that there is a white infrared focus-shift dot at both 75mm and 135mm just to the left of the visible-light infinity mark. The limit switch (set at the “limit” position) will keep the lens outside of its macro range as shown above. The macro range (5 feet to 10 feet) is the red stripe on the right. Autofocus Speed and Focus Calibration This lens’ autofocus is pretty slow, or reasonably quick; let me explain this awkward statement. After about 30 seconds of focusing frustration, I slid the focus limit switch from “Full” to the “Limit” position; there was a world of difference in speed. With this switch in “Limit”, it would focus from the regular (about 10 feet) near-distance limit to infinity in 0.415 seconds at 75mm. Using the full focus range, it took 0.933 seconds at 75mm (it feels like an eternity). Using the “Limit” switch position at 300mm, it took 0.433 seconds. Leaving the switch in the “Limit” position, focus was pleasantly responsive. I did the testing in good light; my D850 and D500 cameras got the same focus speed results. Lesser cameras are probably a bit slower than this. The first thing I always do with a lens is to focus-calibrate it. An out-of-focus shot is a useless shot. I found out right away that at 75mm, the focus fine-tune setting (-10 on my D850) was nowhere close to what was needed at 300mm. I determined that 300mm needs a fine-tune setting of +10 on the same camera. Major disappointment. 
Nikon, unlike Sigma, has no way to cope with a focus calibration problem like this other than to tell you to buy one of their mirrorless cameras – oh wait, their mirrorless cameras don’t support screw-drive lenses! I always write the fine-tune calibration settings data on the inside of the lens cap on a sticker (per-camera); it’s too hard to memorize this stuff. If I don’t remember to reprogram the appropriate calibration setting when I zoom in or out, picture sharpness suffers. Chromatic Aberration Worst case chromatic aberration These shots show how bad it can get with lateral chromatic aberration in the corner of the frame (100% magnification). The left-hand f/10.0 shot shows how much it gets improved by stopping down. As the labels indicate, this is at 300mm and the right-hand shot is wide-open f/5.6. The full shots are shown down below; this was taken from about 220 yards away. Given the extreme distance of this shot, I think the lens resolution in the corner of the frame is really remarkable. Infrared Since Nikon added the IR focus-shift white dots on their focus scale, I thought I’d give the infrared capabilities a little test. I used an 850nm IR filter. I found that the focus shift indicators are not very accurate. I actually needed to shift the distance scale marker more to the left (closer distance) by an additional 3mm beyond the white dot at 75mm zoom. I was impressed by the very minimal hotspot in the middle of the shot (it was only brighter by about 0.3 stops). The vast majority of modern lenses are terrible at infrared, and zooms are the worst. 850nm IR 75mm f/8.0 Resolution I do resolution testing with un-sharpened raw-format pictures. My resolution target is 4 feet by 5 feet, to enable me to be at realistic shooting distances. All tests were done using my Nikon D850 (45.7 MP). I used the MTFMapper program to evaluate the results. I used contrast-detect focus to side-step using focus calibration.
As I mentioned above, the phase-detect calibration is all over the place; it depends upon the focal length. I have noticed that this lens prefers distance shots over close-range, especially from 200mm to 300mm. My resolution target (at about 40 feet with 300mm) leaves you with the impression that the lens is worse than it is; some sample distance shots at the end of this article give you a better idea of its sharpness. The resolution measurements are in units of “MTF50 lp/mm”. To convert these units into “lines per picture height”, just multiply the result by (23.9 * 2.0). For instance, an MTF50 of 40.0 lp/mm is (40*23.9*2) = 1912 lines/ph. The D850 sensor is 23.9mm tall. MTF50 lp/mm 75mm f/4.5 Even at a wide open aperture, 75mm is decent. MTF Contrast Plot: 75mm f/4.5 Test chart center detail with MTF50 lp/mm values shown on edges Test chart corner detail. MTF50 lp/mm values shown on edges MTF50 lp/mm 75mm f/5.6 There’s a huge increase in resolution by stopping down just a little from wide open. MTF50 lp/mm 75mm f/8.0 This is the sweet spot for 75mm. It’s only a tiny bit better than f/5.6, though. MTF50 lp/mm 75mm f/11.0 MTF50 lp/mm 75mm f/16.0 MTF50 lp/mm 135mm f/5.0 MTF Contrast Plot: 135mm f/5.0 MTF50 lp/mm 135mm f/5.6 MTF50 lp/mm 135mm f/8.0 MTF50 lp/mm 135mm f/11.0 MTF50 lp/mm 135mm f/16.0 MTF50 lp/mm 200mm f/5.3 Yikes! Avoid 200mm f/5.3 at all costs. MTF Contrast Plot: 200mm f/5.3 MTF50 lp/mm 200mm f/5.6 Stopping down just a tiny bit from wide open really helps sharpness. MTF50 lp/mm 200mm f/8.0 This is probably the sweet spot for 200mm. MTF50 lp/mm 200mm f/11.0 MTF50 lp/mm 200mm f/16.0 MTF50 lp/mm 300mm f/5.6 MTF Contrast Plot: 300mm f/5.6 MTF50 lp/mm 300mm f/8.0 MTF50 lp/mm 300mm f/11.0 MTF50 lp/mm 300mm f/16.0 This is definitely the best aperture for 300mm, even though lens diffraction is setting in just a bit. Sample Pictures 300mm f/5.6 Macro, 5 feet Believe it or not, this is considered one of the worst settings for this lens.
I think the lens did quite well. The background melts away beautifully. This would be an ideal distance to avoid disturbing a butterfly, compared to regular macro lenses. 75mm f/5.6 I don’t see any vignetting here, and the palm fronds are razor sharp. 75mm f/5.6 I don’t see any linear distortion 300mm f/5.6 I don’t see distortion here, either 300mm f/10.0 Very sharp distant branches at about 220 yards 300mm f/5.6 has chromatic aberration & vignetting, but pretty sharp 300mm f/8.0 Decent sharpness Conclusion Before I started testing this lens, I figured there would be little to do besides mock it and talk about how old lenses really show their age. This has been a humbling experience. The mechanical and optical quality is really quite good. By far, my biggest complaint about this lens is the annoying shift in focus calibration as you zoom it. Mirrorless cameras can’t cure it, since they can’t use the screw-drive lenses. It’s easy to imagine many photographers thought it was a generally un-sharp lens, not realizing how to compensate for it. When this lens was introduced, autofocus calibration fine-tune hadn’t even been invented yet. Chromatic aberration at longer focal lengths can be seen in high-contrast scenes, but stopping down greatly improves it. Although my modern Sigma telephoto zooms smoke this lens, I can honestly say that the AF Nikkor 75-300 f/4.5-5.6 takes really beautiful photographs. If you think about the primitive state of computers and software back when this lens got designed, it’s quite amazing what those Japanese engineers were able to accomplish. They should be rightfully proud. Nikon sold this lens for a whole decade; now I can see why it sold for so long.
- High-Res Camera Sensors: Worth It?
It’s assumed that when you double your camera’s megapixels, you get all of that new resolution, right? Not quite. Usually, not even close. Hasselblad X1D-50C: 50 megapixels I did a little test using a Nikon D610 (24 MP) and a Nikon D850 (45.7 MP). I didn't have any Hasselblads handy. The pixel count on the tested cameras is thus: D610 = 4016 X 6016 pixels; the D850 = 5520 X 8280 pixels. The linear change is 5520 / 4016 = 1.37 (37% increase in “linear” resolution). You’d typically expect that whatever lens you use, it would now get about 37% more resolution (as opposed to expecting nearly double the resolution going from 24 to about 46 MP). You’d typically be dead wrong. My testing has shown that the limiting factor in resolution is more the lens than the camera. This might not be a big deal if you’re buying a typical DSLR or a mirrorless camera, but I think it’s a huge deal if you’re shelling out about $17,000 for a medium format camera to get those extra pixels. I understand that there are other factors, such as “color bit depth”, but in actual fact the color bit depth isn’t that much different in going from FX-sized DSLR technology to medium format. Similarly, the dynamic range being captured isn’t very different, either. There are a couple of web sites that evaluate camera sensors, and they bear out what I’m talking about. At DXO, for instance, I saw the following: Hasselblad X1D-50C is 26.2 bits color bit depth versus D850 26.4 bits. Hasselblad X1D-50C dynamic range is 14.8 EV versus D850 14.8 EV. Hasselblad resolution: 50MP versus the Nikon 45.7 MP. Now, what’s the price difference? About $17,000 versus $3,000. Wow. I’d be slightly concerned if I were Hasselblad these days. By the way, the autofocus on the D850 smokes the Hasselblad. I didn’t test the Hasselblad; I’d rather buy a car. But I digress. Getting back to resolution gains, I decided to take a look at a lens with a pretty decent reputation: the Nikkor 85mm f/1.4 AF-S “pro” lens.
How much do you gain in resolution by switching to a camera with nearly double the megapixels? Let’s take a look. Nikkor 85mm at f/1.4 on Nikon D610 Peak resolution is about 36 lp/mm with the D610. Nikkor 85mm at f/1.4 on Nikon D850 Ouch. You can barely tell the difference between the D610 results and the D850 results. What in the heck happened? The lens itself is kind of “treading water” at f/1.4, and more camera sensor resolution doesn’t get you anything extra. Next, let’s try stopping down that lens, to see if that helps the situation: Nikkor 85mm at f/2.8 on Nikon D610 Nikon D610 gets about 47 lp/mm at f/2.8. Nikkor 85mm at f/2.8 on Nikon D850 Within experimental error, the D850 resolution is no better than the D610 resolution in the f/2.8 shots. The overall resolution gets better when you stop down, as expected, but the lens resolution is still maxed out on the D610; the D850 can’t improve it. Sigma 70-200 at 70mm f/2.8 on Nikon D610 The Nikon D610 MTF50 results using the Sigma 70-200 at 70mm and f/2.8 are a better example for resolution comparison. The resolution range is from about 20 lp/mm to 51 lp/mm. Sigma 70-200 at 70mm f/2.8 on Nikon D850 Shifting over to the Nikon D850 shows a resolution range on the Sigma 70-200 at the same 70mm and f/2.8 from about 20 through 62 lp/mm. That’s roughly a 22% resolution gain (or 62/51 = 1.22) by using the higher resolution sensor. We’re still not up to a 37% resolution gain, but I think we’re once again up against a lens resolution limit. Let’s stop the lens down further and see what we get. Sigma 70-200 at 70mm f/4.0 on Nikon D610 Sigma 70-200 at 70mm f/4.0 on Nikon D850 The D610 center resolution is about 60 lp/mm. The D850 center resolution is around 71 lp/mm. That’s 71/60, or roughly an 18% increase over the D610. You can tell by looking at the two-dimensional resolution results that providing “the lens resolution number” is pretty much a fool’s errand.
Resolution is all over the map in different parts of the sensor, and sagittal versus meridional directions are hugely different as well. That’s why I use words like “roughly” and “about”. That’s also why I always show these somewhat messy two-dimensional plots. Conclusion This testing shows why lens manufacturers have their work cut out for them. New camera sensors are now hungry for better lenses. It also shows that you’re wasting your time and money if you think that a new camera is going to make that old lens really excel. The only conclusion that should be drawn from this testing is that the combination of a good lens on a high-resolution sensor will net better resolution than a good lens on a lower-resolution sensor. How much better depends upon many factors; generally speaking, the improvement will be a bit underwhelming. The actual mathematics behind this phenomenon goes like this: System_MTF = Camera_MTF x Lens_MTF The “MTF”, or modulation transfer function, is a measure of resolution-versus-contrast that ranges from 0 to 1.0, where 1.0 would be perfect. This math shows that even a great sensor combined with a poor lens won’t give great results, because the lens drags down the “system”. The same is true for a great lens on a poor sensor. The weakest link in a chain spoils the whole chain. I’m not even considering things like diffraction (by stopping down the lens aperture too far) or poor photographic technique. There’s a whole laundry list of ways to ruin your picture resolution. It’s a good thing that newer cameras offer more features like faster focus, bigger shot buffers, more frames per second, reduced sensor noise, and the ability to basically see in the dark. Increased sensor resolution isn’t going to win them many more fans, unless photographers enter the very expensive avenue of buying new, higher-resolution lenses.
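The weakest-link arithmetic above can be made concrete with a toy example. The contrast numbers below are illustrative, not measured; they just show why a big sensor upgrade buys so little when the lens is the bottleneck:

```python
# System contrast at a given spatial frequency is the product of the
# component MTFs (illustrative numbers, not measured data).
def system_mtf(camera_mtf, lens_mtf):
    return camera_mtf * lens_mtf

old = system_mtf(camera_mtf=0.80, lens_mtf=0.50)  # decent sensor, so-so lens
new = system_mtf(camera_mtf=0.95, lens_mtf=0.50)  # big sensor upgrade, same lens
print(old, new)  # 0.4 vs 0.475: only ~19% better, nothing like 'double'
```

Even a perfect sensor (MTF of 1.0) can’t push this system past the lens’s 0.5, which is the same story the 85mm f/1.4 told on the D850: the lens sets the ceiling.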











