I Gave up the Panasonic G9 for? Sanity Check!

speedy

Mu-43 Hall of Famer
Joined
Nov 27, 2015
Messages
2,696
Ah, but they want us to buy 8K next! 4K was already a smokescreen in most households. You'd need a massive TV and sit way too close to it to notice 4K over 1080p. What made the most noticeable difference with the "UHD" spec was not so much the 4K resolution, but the inclusion of HDR. But they've got to get us to buy something new, right? It's funny, for the last 3 HDTVs I purchased, I spent $400 each time. The first was a 32" 720p Sony. The next was a 42" 1080p Vizio, and the last and still household champion is a UHD 48" LG. They are running out of reasons to get us to upgrade, and my last upgrade was only because the Vizio took a dump. I don't see a compelling reason to upgrade from the LG, and thankfully it has also lasted the longest!
What was the thread about again? 🤣🤣🤣🤣🤣
I've simply given up on free-to-air TV - it's garbage in both content and picture quality. Don't have the time to make use of cable/Netflix etc. Wife wanted a new TV a year or so back, so ended up with a 52"? 4K Panasonic, which no one watches any more. Well, occasionally we'll watch something on the YouTube, but not very often. They took MotoGP off free-to-air, so I subscribed to Kayo Sports, which mostly gets watched while lying in bed on my little 11"? Chromebook 🤣🤣🤣🤣
Talking to the young blokes at work, most content seems to be viewed on their phones, tablets and laptops these days. Makes one start to wonder who's buying all the big TVs. And yeah, not sure why the push on 8K. Maybe there's more movie buffs with theatre rooms out there that I'm unaware of 😁
 
Joined
Dec 26, 2019
Messages
45
$320 CAD for the Panasonic 25mm f/1.4 probably wouldn't seem like such a good deal if people realized it is effectively an f/2.8 lens in full-frame terms. Unless people start recognizing those gaps, it will be difficult to move past them and towards full frame.
Although it is a v1, IMHO the price is fair given the quality of the glass. I cannot afford to run two systems and spend $1,500 on a FF camera just so I can save on the cost of an f/2.8 lens.
 

archaeopteryx

Gambian sidling bush
Joined
Feb 25, 2017
Messages
1,775
At a 'normal' viewing distance, glazed, there is little noticeable difference in resolution between 5 and 20 MPx cameras.
That sounds like, given no substantial sensor side issues, negligible difference between ~225 and ~450 dpi input to whatever interpolation the printing process is doing to 2880 dpi. The rules of thumb I know suggest a viewing distance under 40 cm for the eye to start resolving above 225 dpi, maybe closer with the interpolation, which seems probably consistent with what you're saying.

Ah, but they want us to buy 8K next!
At the risk of some redundancy, I find Panasonic's 4k photo stuff handy for obtaining ~8 MP focus stacks. The GH5 II only ups the maximum bit rate from GH5 and G9's 150 Mbps to 400 Mbps, though. That's an average of 1.6 bpp (bits per pixel) per frame rather than the 0.4 bpp of the 100 Mbps available on the G7-G80-G90.

In comparison, fine jpeg is around 3 bpp. So the most cost-effective image quality improvement I could make would be to change from 4k post focus on a G7 to focus bracketing medium jpegs on a G80 in order to stack the SOOCs. The drop to 9 fps from 30 isn't great but, even if the GH6 introduces 800+ Mbps 4k modes (jpeg bpp parity), it'll take a long time for those to become (semi)affordable.

8k at 3 bpp would be 3.2 Gbps and 6k 1.6 Gbps. Even farther over the event horizon.
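If anyone wants to sanity check those figures, here's a minimal sketch (assuming 3840x2160 frames at 30 fps; the exact 4k Photo frame size differs a little):

```python
# Back-of-the-envelope bits-per-pixel check for the bit rates above.
# Assumes 3840x2160 frames at 30 fps; actual 4k Photo frame sizes differ slightly.
def bits_per_pixel(bitrate_mbps, width=3840, height=2160, fps=30):
    bits_per_frame = bitrate_mbps * 1_000_000 / fps
    return bits_per_frame / (width * height)

for mbps in (100, 150, 400):
    print(f"{mbps} Mbps -> {bits_per_pixel(mbps):.2f} bpp")
# 100 Mbps -> 0.40 bpp, 150 Mbps -> 0.60 bpp, 400 Mbps -> 1.61 bpp
```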

Makes one start to wonder who's buying all the big TV's.
I suspect TV enthusiasts on that level are similar to people who buy APS-C mirrorless systems or Olympus PRO/Panasonic-Leica μ43 for hobby use, just with a different interest. The two cost about the same (135 mirrorless is around twice as much).
 

JDS

Mu-43 Top Veteran
Joined
Nov 11, 2014
Messages
784
Location
San Francisco, CA
Real Name
David Schultz
It's crazy how hard it is for folks to get this, on both sides.

Yes, f/1.4 on FF vs 1.4 on m43 captures more photons, but there's also a bigger sensor, so it needs to (assuming here same shutter speed). f/1.4 on m43 has the same aperture diameter (well, not exactly the same, but close) as f/2.8 on FF, but if you close down to f/2.8 on FF then you are getting the same number of photons as 1.4 on m43, but over a larger sensor - hence a darker image and the need to boost ISO or lower shutter speed.

It's like if we measured fish nets in f/stops.

If f/2 in fish nets meant 2X the size of the fish (work with me on this, since a 1/2 fish net doesn't make sense), then you need a much larger physical f/2 net to get an Oscar than if you want, say, a betta fish. Sure, you get more fish with the "full frame" oscar net, but you also need a much larger tank to store it in, larger filter, more food, etc. Sometimes you just want a betta. If the fish analogy is lost on you, try fishsize.com and attach some lenses while you're there.

Oh, and sometimes people get a baby Oscar because it's really not much bigger than the betta, but later on they realize everything has gotten a lot larger and more expensive than expected.
I wish people just explained it simply, because it is simple. Take a M4/3 camera and a FF camera side by side, both shooting the same lens & settings (say 25mm f2.8). The M4/3 image and the middle of the FF photo will look identical: the same amount of light is hitting those areas, with the same depth of field - they are identical. But the FF sensor is bigger, so the field of view is bigger - 2x bigger sensor, 2x bigger field of view. Easy - that FF sensor is capturing image area that's hitting metal inside the M4/3 camera. So, to get the same final result, that FF photographer would likely grab a 50mm lens, and of course 50mm at f2.8 is going to have half the depth of field - we all know this. Then again, the M4/3 owner saved enough money and weight to maybe bring a 25mm f1.2, so back at you, Mr. FF!
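If it helps, the same arithmetic as a tiny illustrative Python sketch - the 2x crop factor is the only real input here, the 25mm lenses are just example numbers:

```python
# Equivalence arithmetic from the posts above: multiply focal length and
# f-number by the crop factor to get the full-frame lens with the same
# field of view and the same depth of field / total light.
def ff_equivalent(focal_mm, f_number, crop=2.0):
    return focal_mm * crop, f_number * crop

print(ff_equivalent(25, 1.4))  # (50.0, 2.8): 25/1.4 on m43 frames like 50/2.8 on FF
print(ff_equivalent(25, 2.8))  # (50.0, 5.6)
```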
 

John King

Member of SOFA
Joined
Apr 20, 2020
Messages
3,732
Location
Beaumaris, Melbourne, Australia
Real Name
John ...
That sounds like, given no substantial sensor side issues, negligible difference between ~225 and ~450 dpi input to whatever interpolation the printing process is doing to 2880 dpi. The rules of thumb I know suggest a viewing distance under 40 cm for the eye to start resolving above 225 dpi, maybe closer with the interpolation, which seems probably consistent with what you're saying.
Jürgens "The Digital Print" covers all these technicalities pretty well. It was an expensive book when I bought it.

I note that it's only USD 60 at Getty these days.
 

archaeopteryx

Gambian sidling bush
Joined
Feb 25, 2017
Messages
1,775
Jürgens "The Digital Print" covers all these technicalities pretty well.
What model does Jürgens use?

The ones I've come across most commonly are
  1. dpi = 90 visus / (viewing distance in meters)
  2. dpi = 1/(2 distance tan(1/2 1/visus 1/60)) ≈ 88 visus / (viewing distance)
The first is often simplified to 180 / distance, though visus ≥ 2 (20/10 vision or more) is fairly uncommon so that's probably a bit on the conservative side. The second is the visual acuity model motivating the simplified form of the first (for unfamiliar readers: 1/visus 1/60 is the visual resolution in degrees and the distance-tangent bit translates it to the smallest visible dot size after unit conversions are included).
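For concreteness, here is the second model as a quick sketch (viewing distance assumed in metres, output in dpi):

```python
import math

# Model 2 above: required print resolution for an observer with the given
# visus at the given viewing distance. Distances in metres, result in dpi.
def required_dpi(distance_m, visus=1.0):
    theta = (1.0 / visus) * (1.0 / 60.0) * math.pi / 180.0  # angular resolution, radians
    dot_m = 2.0 * distance_m * math.tan(theta / 2.0)        # smallest resolvable dot, metres
    return 0.0254 / dot_m                                   # dots per inch

print(round(required_dpi(1.0)))  # ~87 dpi at 1 m, visus 1.0
print(round(required_dpi(0.4)))  # ~218 dpi at 40 cm, close to the ~225 dpi figure above
```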
 

John King

Member of SOFA
Joined
Apr 20, 2020
Messages
3,732
Location
Beaumaris, Melbourne, Australia
Real Name
John ...
What model does Jürgens use?

The ones I've come across most commonly are
  1. dpi = 90 visus / (viewing distance in meters)
  2. dpi = 1/(2 distance tan(1/2 1/visus 1/60)) ≈ 88 visus / (viewing distance)
The first is often simplified to 180 / distance, though visus ≥ 2 (20/10 vision or more) is fairly uncommon so that's probably a bit on the conservative side. The second is the visual acuity model motivating the simplified form of the first (for unfamiliar readers: 1/visus 1/60 is the visual resolution in degrees and the distance-tangent bit translates it to the smallest visible dot size after unit conversions are included).
It's quite a complex book, examining all types of digital print making.

Far too complex for me to summarise!

Your local library might have it?
 

archaeopteryx

Gambian sidling bush
Joined
Feb 25, 2017
Messages
1,775
Far too complex for me to summarise!
It was a specific question about print DPIs meeting perceptual limits. From what I'm able to discern from the answers, the book doesn't obviously contain such guidance and therefore probably wouldn't be useful to consult in this regard. That appears to match Getty's abbreviated summary, and neither library system I have access to has a copy.

Investigations of high acuity observers that I know of (e.g. Gaska 2010, Lloyd 2015) strike me as consistent with the 1.0 arcminute at visus 1.0 basis of the models above. Maybe it should be more like 0.7 arcminutes, but I suspect that's just the difference between a round dot and a square pixel.
 

snegron

Mu-43 Veteran
Joined
May 9, 2013
Messages
217
Location
SW Florida
I currently have (too much money wasted on) cameras/lenses in 3 systems:

- Nikon D750, D200 with several FF and DX lenses.

- Canon 6dmk2 and 7dmk2 with several full frame (L) and EF-S lenses.

- Panasonic GX85 with several lenses.

I haven't switched from one format/system to another; I use all three systems as needed.

I would like to get rid of my Nikon system because of Nikon's new marketing strategy of replacing their awesome F lens mount system with their new mirrorless Z system. If I were to buy into Nikon's new Z system, most of my older Nikon AF-D lenses would not be fully compatible, as their adapters are not that great.

Canon, on the other hand, has an adapter that lets my older EF lenses work perfectly well with their new R mount.

As for my future purchases, I would like to try a higher end Olympus. Also, if I can sell off all my Nikon equipment, I would probably look into Pentax full frame.
 
Joined
Mar 30, 2021
Messages
240
Location
Romania
Real Name
Cata
We aren't too bad here.
That's pretty much true. And another observation I have made relates to the Romanian Olympus forum, where we generally talk m43 gear, but the people over there are also very "civilized".

Just as a quick example from a couple of days ago: when I was surfing an MF forum (to get info on my "new" Hassy), I found people there being quite harsh to one another, which was sort of unexpected to me, to be perfectly honest. I was really expecting people into "heavy stuff" (a.k.a. MF cameras and lenses) to be old and wise. They could've been old.....
 
Joined
Mar 30, 2021
Messages
240
Location
Romania
Real Name
Cata
IMNSHO, most 'noise' seen on monitors is an artifact of this low resolution, and inability to display fine gradations and transitions.
No idea what "IMNSHO" means, but that aside, are you saying that, if one manages to bypass the display for seeing an image (say, making a contact copy print directly from the sensor and then, eventually, enlarging it to A3) then the image will be clean(er)? I don't think I'm buying this. The noise happens at sensor level due to some factors, one being the (electric) signal amplification (I remember reading about those things few time ago, but can't bother to go back and re-read now). I can't imagine "everything" is fine and clean at sensor level, but as soon as we sent it over to our monitor, this monitor is not able to reproduce it, so it creates "artifacts" a.k.a "noise"
 

John King

Member of SOFA
Joined
Apr 20, 2020
Messages
3,732
Location
Beaumaris, Melbourne, Australia
Real Name
John ...
No idea what "IMNSHO" means,
IMNSHO = In My Not So Humble Opinion.
but that aside, are you saying that, if one manages to bypass the display for viewing an image (say, making a contact print directly from the sensor and then, eventually, enlarging it to A3), then the image will be clean(er)? I don't think I'm buying this. The noise happens at sensor level due to several factors, one being the (electric) signal amplification (I remember reading about these things a while ago, but can't be bothered to go back and re-read now). I can't imagine "everything" is fine and clean at sensor level, but as soon as we send it over to our monitor, the monitor is not able to reproduce it, so it creates "artifacts" a.k.a. "noise"
Noise is introduced at a number of levels:

1) Sensor and ADC

2) Digital amplification in camera (do not use intermediate ISO stops ... )

3) By poor editing of low bit depth, small colour space files

4) By low resolution monitors, with low bit colour LUT and panels, and small colour space display

5) By printing on low resolution, low bit depth, low colour space printers.

6) Lack of a colour managed workflow.

Etcetera ...

These are real things, as against your unspecified objections.

[Edit] You seem to think that you print what is on your screen. You don't. The edited file is transferred from your hard disk or memory card via your printer software, graphics card, and output port to your printer.

Your display is merely a side road that allows you to observe what's happening.

[End edit]
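A minimal numpy sketch of point 3 in the list above, using a synthetic gradient and an arbitrary contrast push rather than a real editing workflow:

```python
import numpy as np

# Push the same strong tone curve through an 8-bit and a 16-bit version of a
# smooth gradient, then count the distinct output levels. The 8-bit source
# ends up with roughly half the levels - the gaps show up as banding.
gradient = np.linspace(0.0, 1.0, 4096)

def strong_curve(x):
    return np.clip((x - 0.25) * 2.0, 0.0, 1.0)  # arbitrary shadows/contrast push

from_8bit  = strong_curve(np.round(gradient * 255) / 255)
from_16bit = strong_curve(np.round(gradient * 65535) / 65535)

print(len(np.unique(np.round(from_8bit * 255))))   # ~129 distinct output levels
print(len(np.unique(np.round(from_16bit * 255))))  # 256 distinct output levels
```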
 
Joined
Mar 30, 2021
Messages
240
Location
Romania
Real Name
Cata
As far as "colour" related reasons above, I could not care less, since I'm mostly (now, at least) a B&W guy. So we're left with the first two reasons, and with "etcetera"

The other thing is that most of us enjoy our pictures on digital devices (unfortunately), not on prints, so the way a picture looks on a screen probably is important. Yes, I know, not at 100% magnification, but still on a screen. I do want to print more of my pictures, but I would not have enough physical space and resources to do so on a regular basis. Even my films will be digitized in the end.

Another thing I can't really understand from your statement is how it would be possible for a 4k monitor (for instance) to introduce artifacts and noise into a (say) 16MP image. The opposite is understandable, when the image on the monitor would be "enlarged" from a small file/picture (like a 2MP image) filling a full 4k screen.
If I'm completely off and you were talking about 100% enlarged images seen on the display, then again I can't understand why the display would introduce any noise/artifact which is not present within the digital image, since we are talking/seeing pixel-per-pixel transfer from the file to the display.

[edit] I would assume that whatever happens on the digital path (e.g. sensor-to-display) is more truthful to the digital image (whatever was recorded on the digital sensor) than an (analog) print. So it could be that the print(er) is more forgiving, and this is good news for those whose final output is a print, but for "digital folks" the digital output could be more of a concern. So it might be the print that is merely a side road of the digital path [end edit]
 

John King

Member of SOFA
Joined
Apr 20, 2020
Messages
3,732
Location
Beaumaris, Melbourne, Australia
Real Name
John ...
As far as "colour" related reasons above, I could not care less, since I'm mostly (now, at least) a B&W guy. So we're left with the first two reasons, and with "etcetera"
Not so. Colour gamut and bit depth are just as important with B&W photography as they are with colour.

The other thing is that most of us enjoy our pictures on digital devices (unfortunately), not on prints, so the way a picture looks on a screen probably is important. Yes, I know, not at 100% magnification, but still on a screen. I do want to print more of my pictures, but I would not have enough physical space and resources to do so on a regular basis. Even my films will be digitized in the end.
True. It's why I have three high quality monitors in the house (not counting two high quality TVs). My main monitor is 100% aRGB gamut, with a 14 bit colour LUT and 12 bit panel. I edit, such as I do, using 16 bit ProPhotoRGB.
Another thing I can't really understand from your statement is how it would be possible for a 4k monitor (for instance) to introduce artifacts and noise into a (say) 16MP image.
You are not understanding the physical properties of any digital display. Each pixel on the display is comprised of three sub-pixels.

The opposite is understandable, when the image on the monitor would be "enlarged" from a small file/picture (like a 2MP image) filling a full 4k screen.
See above.
If I'm completely off and you were talking about 100% enlarged images seen on the display, then again I can't understand why the display would introduce any noise/artifact which is not present within the digital image, since we are talking/seeing pixel-per-pixel transfer from the file to the display.
It doesn't "introduce" noise. It appears noisy because of the low (relative) resolution of the display, and how the display works. An analogy - the difference between a fine silk cloth and a coarse cotton one, both printed with the same image.
[edit] I would assume that whatever happens on the digital path (e.g. sensor-to-display) is more truthful to the digital image (whatever was recorded on the digital sensor) than an (analog) print. So it could be that the print(er) is more forgiving, and this is good news for those whose final output is a print, but for "digital folks" the digital output could be more of a concern. So it might be the print that is merely a side road of the digital path [end edit]
If you read and understand what I have already written, printers are far higher resolution devices than monitors. Orders of magnitude higher. I cannot think of an analog printer since typewriters bit the dust ... I'm sure that they still exist (e.g. printing presses and the like), but even the simplest inkjet is a digital device from end to end.
 

RichardC

Pastafarian minister
Joined
Mar 25, 2018
Messages
5,035
Location
The Royal Town of Sutton Coldfield, UK.
Real Name
Richard
What was the thread about again? 🤣🤣🤣🤣🤣
I've simply given up on free-to-air TV - it's garbage in both content and picture quality. Don't have the time to make use of cable/Netflix etc. Wife wanted a new TV a year or so back, so ended up with a 52"? 4K Panasonic, which no one watches any more. Well, occasionally we'll watch something on the YouTube, but not very often. They took MotoGP off free-to-air, so I subscribed to Kayo Sports, which mostly gets watched while lying in bed on my little 11"? Chromebook 🤣🤣🤣🤣
Talking to the young blokes at work, most content seems to be viewed on their phones, tablets and laptops these days. Makes one start to wonder who's buying all the big TVs. And yeah, not sure why the push on 8K. Maybe there's more movie buffs with theatre rooms out there that I'm unaware of 😁
The TV landscape has changed so much in recent years. I used to be a telly addict (watched every episode of Dallas even when it went a bit mad). Modern TV is mostly for the cranially challenged. We have a 55" LG in the corner of the lounge. It's there for decorative purposes only and gets switched on at Christmas to keep the mother-in-law entertained. (See above).
 

speedy

Mu-43 Hall of Famer
Joined
Nov 27, 2015
Messages
2,696
The TV landscape has changed so much in recent years. I used to be a telly addict (watched every episode of Dallas even when it went a bit mad). Modern TV is mostly for the cranially challenged. We have a 55" LG in the corner of the lounge. It's there for decorative purposes only and gets switched on at Christmas to keep the mother-in-law entertained. (See above).
Yeah, I reckon you're right. Even though I'm advancing in years (mid-50s), I simply don't watch it any more. Especially after they announced they'd taken the MotoGP off free-to-air. As I mentioned earlier, most media content I now watch is on my little 11" Chromebook, lying on the sofa or tucked up in bed. If you compare apparent viewing sizes, they're actually not that far off, as you sit the Chromebook on your belly, while the 55" TV is about 5 meters away. For reference, here's a quick snap to illustrate. If anything, the little Chromebook gives you a larger apparent viewing area. Crazy hey. I think it's a bit similar to camera formats. Sure, the larger sensor formats are exactly that - larger, but does that make a difference when actually looking at the output as a whole?
[Attached image: viewing size.jpg]
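For the curious, here's the rough geometry (the 0.5 m Chromebook distance is a guess; the TV distance is the roughly 5 meters mentioned above):

```python
import math

# Apparent (angular) size of a screen diagonal: 11" Chromebook at belly
# distance vs a 55" TV across the room. Assumed distances, not measured.
def angular_size_deg(diagonal_in, distance_m):
    diagonal_m = diagonal_in * 0.0254
    return 2 * math.degrees(math.atan(diagonal_m / (2 * distance_m)))

print(f'11" at 0.5 m: {angular_size_deg(11, 0.5):.0f} deg')  # ~31 deg
print(f'55" at 5.0 m: {angular_size_deg(55, 5.0):.0f} deg')  # ~16 deg
```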
 

archaeopteryx

Gambian sidling bush
Joined
Feb 25, 2017
Messages
1,775
(do not use intermediate ISO stops ... )
This claim merits some qualification and support, I think, as there's no general supporting evidence apparent in DxO, photonstophotos, or other mu-43 threads for the more specific versions of it made elsewhere. Measurement does show minor degradation moving from certain whole stops to certain intermediate stops on certain bodies but, if μ43 bodies are considered as a population, the concern probably applies to something like 2% of the relevant ISO transitions.
 
Joined
Mar 30, 2021
Messages
240
Location
Romania
Real Name
Cata
Not so. Colour gamut and bit depth are just as important with B&W photography as they are with colour.


True. It's why I have three high quality monitors in the house (not counting two high quality TVs). My main monitor is 100% aRGB gamut, with a 14 bit colour LUT and 12 bit panel. I edit, such as I do, using 16 bit ProPhotoRGB.

You are not understanding the physical properties of any digital display. Each pixel on the display is comprised of three sub-pixels.


See above.

It doesn't "introduce" noise. It appears noisy because of the low (relative) resolution of the display, and how the display works. An analogy - the difference between a fine silk cloth and a coarse cotton one, both printed with the same image.

If you read and understand what I have already written, printers are far higher resolution devices than monitors. Orders of magnitude higher. I cannot think of an analog printer since typewriters bit the dust ... I'm sure that they still exist (e.g. printing presses and the like), but even the simplest inkjet is a digital device from end to end.
As far as I understand, the digital sensor works exactly the same as the monitor, so for each photosite there is a set colour assigned, so the ratio should be the same as for the display. Nevertheless, this still does not explain the difference I see @ 100% magnification between a FF image and a m43 image. As I said before, the m43 image usually looks "muddy", the pixel is "dirty". And this is not theory, it is fact, as seen on my calibrated monitor. So why, then, does the FF magnified image not appear as noisy as the m43 image? Because it is far cleaner, that's why.

Let alone the fact that the m43 RAW image is not a RAW anymore, since it's already heavily processed - see the real m43 RAW images that show tremendous geometrical aberrations and so on. Going down this road, then, the phone camera image is processed even further, sometimes yielding a perfectly usable image (for, say, printing purposes). But when looking at 100% magnification on a display, this image will look even worse than a m43 image, just because the "starting point" is much worse. So, the display is not introducing artifacts or noise, it is simply magnifying whatever artifacts/noise exist within the recorded image.

This is my take, at least. And I see images on my monitor/display - that's a fact. And my intention was not to bash m43, but to put it into perspective. And another thing: to me, the more an image is processed (by the camera, I mean), the more "digital" it looks (on a screen). Or artificial, for want of a better word. So yes, the Sony images, for me, look more "natural" than the Olympus ones - sorry! (on the same monitor/display).
 

frankmulder

Mu-43 Top Veteran
Joined
Jun 8, 2019
Messages
548
As I said before, the m43 image usually looks "muddy", the pixel is "dirty".
You probably mean that there's more noise (than with a bigger sensor at the same ISO setting)? That is of course true. In my experience, the muddiness goes away when I process the image with DxO PhotoLab. The pixel-level quality is very good, actually. Much better than what I had with my Canon G7X, for instance (1 inch sensor). I think some muddiness also stems from the lenses that you use. My pictures taken with the Olympus 9-18mm or the 17mm f/1.8 looked a lot "muddier" than the ones taken with the Sigma 56mm, for instance.

Let alone the fact that the m43 RAW image is not a RAW anymore, since it's already heavily processed
To my knowledge, this is not true. Your RAW processor might apply some adjustments by default, but the RAW file itself is what it says on the tin (the raw output from the sensor when you took the image).

So yes, the Sony images, for me, look more "natural" than the Olympus ones - sorry! (on the same monitor/display).
That's to be expected with a bigger sensor. Of course the images are better and need less processing to look good.


People are making things way too complicated in this thread (speaking in general here). Bigger is better, as long as you can put up with the extra cost and weight and the other implications that a bigger system has.
 

John King

Member of SOFA
Joined
Apr 20, 2020
Messages
3,732
Location
Beaumaris, Melbourne, Australia
Real Name
John ...
As far as I understand, the digital sensor works exactly the same as the monitor, so for each photosite there is a set colour assigned, so the ratio should be the same as for the display.
It doesn't.

The camera uses a Bayer array, i.e. a CFA (colour filter array) over the sensor. Each element of the CFA has a response curve centred around its primary colour. The pixels are sampled in 2x2 tiles, usually R-G-B-G, where the two greens are often differentially responsive to the green part of the spectrum.

A monitor pixel consists of a triad of three sub-pixels, R-G-B, plus (on an LCD) a white backlight. These can be seen with a magnifying glass.
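For unfamiliar readers, a rough numpy sketch of what the CFA does to the data (purely illustrative, nothing specific to any camera):

```python
import numpy as np

# Sample a full-colour image through an RGGB colour filter array: each
# photosite keeps only one of the three channels; demosaicing has to
# interpolate the other two later. Purely illustrative values.
def to_bayer_rggb(rgb):
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w), dtype=rgb.dtype)
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # R on even rows, even columns
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # G on even rows, odd columns
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # G on odd rows, even columns
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # B on odd rows, odd columns
    return mosaic

rgb = np.random.randint(0, 256, size=(4, 4, 3), dtype=np.uint8)
print(to_bayer_rggb(rgb))
```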
Nevertheless, this still does not explain the difference I see @ 100% magnification between a FF image and a m43 image. As I said before, the m43 image usually looks "muddy", the pixel is "dirty". And this is not theory, is fact, as seen on my calibrated monitor. So why then, the FF magnified image does not appear as noisy as the m43 image. Because is far more cleaner, that's why.
"100%" is different, depending on the resolution of the original image.
Let alone the fact that the m43 RAW image is not a RAW anymore, since it's already heavily processed - see the real m43 RAW images that show tremendous geometrical aberrations and so on.
This is plain wrong. Digital correction is part of the mFTs standard. To examine an image without it is like removing a couple of lens elements in another system, then complaining about IQ ...
Going down this road then, the phone camera image is further more processed yielding, sometimes, a perfectly usable image (for, say, printing purposes). But when looking at 100% magnification on a display, this image will look even worse than a m43 image, just because the "starting point" is much worse. So, the display is not introducing artifacts or noise, is just simply multiplying whatever artifacts/noise exists within the recorded image.

This is my take, at least. And I see images on my monitor/display - that's a fact. And my intention was not to bash the m43, but to put it into perspective. And another thing is, to me, the more an image is processed (by the camera, I mean), the more "digital" it looks (on a screen). Or artificial, for a better wording. So yes, the Sony images, for me, look more "natural" than the Olympus ones - sorry! (on the same monitor/display).
You really need to go and read the Cambridge in Colour web site tutorials (or similar) to better understand what you are writing about, as you plainly misunderstand much of it.

Sorry to be so blunt, but I hope this helps you to better understand the processes.
 
