
Test Dynamic Range - What's It All About? A Panasonic GH2 vs Pentax K5 Comparison

Discussion in 'Reviews, Tests, & Shootouts' started by Amin Sabet, May 1, 2011.

  1. Toonman

    Toonman Mu-43 Regular

    Jul 19, 2010
    Interesting, thanks.
  2. RobWatson

    RobWatson Mu-43 Hall of Famer

    DR ... not so simple

    The nit I have to pick is that the referenced definition of dynamic range is incorrect. The most basic definition is the maximum signal divided by the noise level.

    There is also the matter of linearity. Film is not linear. Many 'consumer' grade cameras also have a non-linear response. The human eye is non-linear.

    I recall an amazing photo from an 8-bit digitized image that demonstrated (what was claimed to be) 100 dB of dynamic range. The subject was a fellow holding an incandescent light bulb (which was on): the filament was plainly exposed and not saturated, the man's face was 'naturally' illuminated, and the printing on the top of the bulb (the GE logo) was plainly visible. Clearly there was a very wide range of brightness levels, but the actual dynamic range is not so clear.

    Something important to consider is just how the image is to be presented and viewed. The vast majority of computer monitors do not have anywhere near 1000:1 dynamic range (~10 stops). Despite manufacturer claims of 10,000:1 dynamic range, a simple test is to manufacture an artificial image with blocks of greyscale values that span any range of interest, and then see how many different levels can actually be seen on the monitor.
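    That block-of-greyscale test can be sketched in pure Python by writing a step strip to a PGM file. The step count, block size, and filename below are my own illustrative choices, not anything from the post:

```python
# Sketch of the monitor greyscale test described above: write a strip of
# grey blocks to an ASCII PGM file, open it full-screen, and count how
# many steps you can actually tell apart.
steps = 16
block_w, height = 64, 128
width = steps * block_w  # 16 blocks of 64 px each

with open("greyscale_test.pgm", "w") as f:
    f.write(f"P2\n{width} {height}\n255\n")  # ASCII PGM header
    for _ in range(height):
        # block k gets grey level 255*k/(steps-1), from 0 up to 255
        row = [str(255 * (x // block_w) // (steps - 1)) for x in range(width)]
        f.write(" ".join(row) + "\n")
```

To probe a narrower range of interest (say, deep shadows), swap the 0..255 ramp for a ramp over just the dark values.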

    Another point to consider is the sensor PNU (pixel non-uniformity) spec and to what degree the PNU is mapped and corrected. Yes, the pixel-to-pixel variation in response can be significant and can even form a gradient. In the field of astrophotography the vignetting from the optical system can be significant, but for best quantitative rigor a pixel-wise correction for both PNU and vignetting is required. In the most demanding aperture photometry uses, even placement to sub-pixel accuracy is required, as the pixel actually has a variable response across its physical surface, and 1/4-pixel shifts can result in excessive noise levels (not really noise, but an uncontrolled systematic offset error signal).

    Lastly (for now), let us not forget that we are collecting photons, and photon shot noise becomes the significant limiting factor on noise levels - in most cases much greater than the noise from the sensor and electronics alone. You want a dynamic range of 1000:1? Then you need to collect 1,000,000 photons per pixel. Pixel capacities of most sensors are less than 100,000, and in some cases much, much less than that. 14 stops of DR? No way. Signal range in a non-linear mode - sure.
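    The arithmetic behind that claim is short: shot noise is the square root of the photon count, so a shot-noise-limited SNR of N:1 at full signal needs N squared photons per pixel.

```python
# Worked arithmetic for the shot-noise claim above:
# noise = sqrt(photons), so SNR of N:1 needs N^2 photons per pixel.
import math

target_dr = 1000                     # want 1000:1
photons_needed = target_dr ** 2      # 1,000,000 photons
noise = math.sqrt(photons_needed)    # shot noise = 1,000
snr = photons_needed / noise         # back to 1000:1
print(photons_needed, snr)
```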

    Trying to explain all this, much less understanding how it impacts a photo session, is difficult, but the best approach is a technical image of a controlled subject. I use a series of power-stabilized LEDs with a ~2x drop in intensity from one to the next. A line of 16 such LEDs gives a signal range of about 70,000:1. Simply take an image with the brightest LED just under saturation level and count the visible LEDs. The dimmest LED that is too grainy (aka noisy) can be considered the lower limit.
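    A rough simulation of that LED-ladder test: start at a full-well signal, halve it per LED, and count steps until the signal gets "too grainy". The full-well, read noise, and the SNR-of-3 visibility cutoff are my own illustrative assumptions:

```python
# Simulated LED ladder: how many 2x steps fit between saturation and
# the noise floor? All numbers here are illustrative assumptions.
import math

full_well = 70000.0   # e-, brightest LED set just under saturation
read_noise = 5.0      # e-, assumed read noise
level = full_well
visible = 0
# "too grainy" taken as SNR below ~3, with shot noise + read noise combined
while level / math.sqrt(level + read_noise ** 2) > 3:
    visible += 1
    level /= 2
print(visible)  # LEDs visible above the noise floor
```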

    How do I use all this stuff when taking photos? I don't bother - I'm too busy with focus and framing and worrying about flash and ambient light levels. Under studio/controlled conditions things can get more technical, but running around with the family snapping away ... who has time to bother?
    • Like Like x 3
  3. RobWatson

    RobWatson Mu-43 Hall of Famer

    Direct measurement with no fancy gear!

    Sorry for my rambling .. sometimes I forget stuff.

    There is a fairly easy way to directly measure dynamic range and the same data set yields the response curve so you can check for linearity as well.

    Photon Transfer Curve. A good reference book: Photon Transfer - Janesick | Publications: SPIE

    FYI, Janesick is a real pioneer in the whole CCD imaging field.

    For those already familiar with the transfer curve skip the next part!
    Basically the idea is to take a set of images (in pairs) at different exposure times under constant (moderately even) illumination. For modest light levels the dominant noise source is photon shot noise, so calculate the variance between the pair of images and you get the shot noise. The shot noise variance is also equal to the number of photons. Plot variance versus signal level, and the slope of the curve gives the system gain, which tells you how many photons correspond to each digital count. Things are not so cut and dried for non-linear responses.

    Also plot signal versus exposure time and see the response curve - check for linearity. Depending on the shape of the response curve the interpretation of the photon transfer curve varies.

    Note that the photon transfer curve gives a good estimate of the read noise level as well as an indication of the capacity of the sensor (how many photons per pixel). Real simple: dynamic range is capacity divided by read noise.
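    The pair-variance recipe above can be sketched on simulated flat frames. The gain, read noise, well capacity, and exposure levels below are illustrative assumptions, not measurements from any real camera:

```python
# Photon transfer curve on simulated flat-frame pairs. The "camera" has an
# assumed gain K = 2 e-/DN and 4 e- read noise; we then recover the gain
# from the variance-vs-signal slope, as the post describes.
import math
import random

random.seed(1)
K = 2.0        # true gain, electrons per digital number (to be recovered)
read_e = 4.0   # read noise in electrons
npix = 5000    # analysis patch, roughly the 50x50-ish region suggested

def flat_frame(mean_e):
    """One simulated flat frame in DN: shot + read noise, Gaussian approx."""
    sigma = math.sqrt(mean_e + read_e ** 2)
    return [random.gauss(mean_e, sigma) / K for _ in range(npix)]

signals, variances = [], []
for mean_e in (200, 500, 1000, 2000, 5000, 10000):
    a, b = flat_frame(mean_e), flat_frame(mean_e)
    diff = [x - y for x, y in zip(a, b)]
    m = sum(diff) / npix
    # variance of the pair difference is twice the single-frame variance
    var_one = sum((d - m) ** 2 for d in diff) / (npix - 1) / 2
    signals.append((sum(a) + sum(b)) / (2 * npix))
    variances.append(var_one)

# least-squares slope through the origin: variance ~= signal / K
slope = sum(s * v for s, v in zip(signals, variances)) / sum(s * s for s in signals)
gain = 1.0 / slope  # electrons per DN

# "dynamic range is capacity divided by read noise", with an assumed capacity
full_well_e = 20000
dr_stops = math.log2(full_well_e / read_e)
print(f"recovered gain ~ {gain:.2f} e-/DN, DR ~ {dr_stops:.1f} stops")
```

A real run would replace `flat_frame` with the central patches of actual raw frame pairs at each exposure time.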

    So what is the easy way to build this magical dataset to produce a photon transfer curve? Actually, I use a copy stand to hold the camera, focus far away, then place a piece of laser printer paper over the lens face (camera pointing towards the ceiling) and just image the diffuse room light (I prefer a wide-open aperture, and maybe light from the hallway - or more layers of paper to act as an ND filter). If you have PC control over the camera, just snap a set of image pairs at different exposure times to cover from just over saturation to as dark as you can go. Otherwise use the timer/remote. Use raw and turn off as many of the image processor functions as you can.

    The images are not much to look at, but they do give a good representation of the vignetting present in the system. Pick the central region (say 50x50 pixels) to do the calculations.

    I'll post a sample curve from my recently acquired Oly EP1 sometime this weekend.

    Just how does all this improve one's photos? Well, we'll see - some folks just love to fiddle.

    PS On my new 50,000:1 "Dynamic Contrast" monitor I can only see ~200 shades so maybe it is my eyes ... except when I take a photo of the monitor and check the image data - only 200 shades! HA! I wonder if I can get a rebate?
    • Like Like x 1
  4. m43_user

    m43_user Mu-43 Regular

    Aug 4, 2010
    Thanks for the comparison. Very interesting. I may have to do an informal test on a tripod with my new GH2 and my D700, just to see what the differences are. I expect the D700 to be better of course, but I'm just curious to see how much different.
  5. Brian Mosley

    Brian Mosley Administrator Emeritus

    Dec 15, 2009
    Excellent demonstration Amin, and a great thread too. Much appreciated

  6. Amin Sabet

    Amin Sabet Administrator

    Apr 10, 2009
    Boston, MA (USA)
    Thanks, Brian!
  7. Djarum

    Djarum Super Moderator Subscribing Member

    Dec 15, 2009
    Huntsville, AL, USA
    One of the things that confuses me is the difference between tonal range and dynamic range. What I need when I shoot waterfalls is a camera where the shadows aren't so noisy and the highlights aren't blown out. I'm only seeing about a stop of difference in dynamic range between my EP1 and my 200 point and shoot that is 4 years old. Rather disappointing.
  8. LovinTheEP2

    LovinTheEP2 Mu-43 Top Veteran

    Feb 15, 2011
    Great write-up, Amin, and a great thread.

    I personally think u43's dynamic range issues and shortcomings of 2 years back have been significantly improved over the past 12 months, and the OM-D appears capable of delivering - the gap to APS-C cameras may finally have gotten so small that it's no longer an issue.

    In fact, u43 may have advantages over and above the APS-C option, as the lens selection is greater.

    Autofocus has improved by leaps and bounds compared to my EP2 which is only 16 months old.

    The AA filter has been significantly thinned since the EP2 and is no longer a significant issue with blurring of fine details.

    ISO performance in the OM-D seems to finally be acceptable past 1600.

    DR - still too soon to tell, but I have a feeling it won't compete with the K5; it should be easily in line with other APS-C cameras, though, and no longer an Achilles heel.

    If what the Olympus director said is true - that the OM-D sensor could make its way into the PENs - I'd love to see them finally work on base ISO now that they have high ISO addressed, and launch the EP-4 with a base ISO of 100, no EVF (even though it sounded like they may include one), 5-axis stabilization, the flash kept onboard, and a fixed screen.

    It'd be nice to see a review comparing the EP2, EP3, OM-D, and K5, detailing ISO, DR, and IQ, when the OM-D goes live.
    • Like Like x 1
  9. jyc860923

    jyc860923 Mu-43 Hall of Famer

    Feb 28, 2012
    Shenyang, China
    Judging by that, the GH2 isn't too bad - call me crazy, but I actually prefer it to the K5.

    It's just that I have to admit the DR of the GF3 (quite close to the GH2's) is way behind Sony NEX's, and sometimes it shows when pushing files to the limits.
  10. flaxseedoil1000

    flaxseedoil1000 Mu-43 Regular

    Mar 10, 2011
    Just stumbled upon this from luminous-landscape

    I think that's from back in 2003.
  11. Amin Sabet

    Amin Sabet Administrator

    Apr 10, 2009
    Boston, MA (USA)
    The K5 uses the same sensor as the NEX-5N.
  12. jyc860923

    jyc860923 Mu-43 Hall of Famer

    Feb 28, 2012
    Shenyang, China
    Wow, thanks, I didn't know that - apparently the K5 should be really good.

    I own a GF3 and know the GH2/G3 are supposed to have superior sensors with higher resolution. Am I right to think that the 16MP sensor may bring more noise or other downsides compared to the 12MP? The GX1's 16MP ISO 1600 seems quite clean, however!
  13. jyc860923

    jyc860923 Mu-43 Hall of Famer

    Feb 28, 2012
    Shenyang, China
    That comes from the K5 having a larger sensor and thus less DOF, doesn't it? That makes it easier to get a layered depth of field.
  14. Lawrence A.

    Lawrence A. Mu-43 All-Pro

    Mar 14, 2012
    New Mexico
    Right. And most of the difference in size is on the long end, which I inevitably crop. I've never loved the 3:2 aspect ratio, and when shooting film loved square formats or 4x5. Of course I shot a lot of 35mm too -- and cropped it. I understand why someone would want to go full frame -- sort of the medium format of digital photography in terms of quality, but the hoopla about the quality difference between APS-C and 4/3 is a tempest in a teapot, informed by pixel peeping and chart gazing more than by practical photographic considerations.
  15. Amin Sabet

    Amin Sabet Administrator

    Apr 10, 2009
    Boston, MA (USA)
    It depends on the particular sensors involved. It is not necessarily the case that smaller pixels means more noise or other downsides for a given sensor size, but depending on the particular implementation, it may be the case. There is no downside in going from the GF3 sensor to the GH2 or G3 sensor (other than bigger files being more demanding on your computer during postprocessing and cataloging/storage).
    • Like Like x 1
  16. jimr.pdx

    jimr.pdx Mu-43 Veteran Subscribing Member

    Dec 5, 2010
    ~1hr north of Portland OR
    Jim R
    And of course it's so much more than the sensor - the supporting chips determine why that Sony sensor scores differently in the K-5, D7000, NEX, and wherever else it pops up. The K-01 uses the same or a very similar chip but does 'only' 12-bit raw imaging; since the K-5 does 14-bit raw, the DxOMark scores are higher for the K-5.

    I sure wish I could drop one of those old Olympus E500 chips from Kodak into a new :43: body. In my 2008 test images, that camera gave me most impressive results for a mere 8Mpixel cam. If only...
    • Like Like x 1