Much like the second coming of Christ, there is no sensor tech coming that will make that much of a difference, much less eclipse FF/APS-C, and
As a matter of fact, there is a new technology that could bring m4/3 sensors much closer to FF, at least until it becomes available in larger formats as well.
And the very company that demonstrated a working 8K video camera in 2019 happens to be an m4/3 camera manufacturer, one that promised to bring this technology to consumer models after launching it first in broadcast systems for the 2020 Tokyo Olympics.
I'm of course talking about Panasonic's organic sensor technology, which promises a global shutter, 15 dB dynamic range, substantially less noise in low light, fewer heat problems, and even much cheaper sensors once it's mature enough for high-volume manufacturing.
And this is where m4/3 has a huge advantage over FF, simply because it requires only about 1/4 of the sensor area. Any new technology will have extremely low yield and very high production cost at the beginning, so the first applications will be the ones that require small footprints.
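A rough back-of-the-envelope sketch of why die area dominates early yield, using the simple Poisson yield model Y = exp(-D * A). The defect density D here is an assumed, purely illustrative number, not a real figure for any organic-sensor process:

```python
import math

# Sensor areas in cm^2 (nominal active-area dimensions).
A_M43 = 1.73 * 1.30   # Micro Four Thirds: ~2.25 cm^2
A_FF = 3.60 * 2.40    # full frame 36x24 mm: 8.64 cm^2

def poisson_yield(defect_density, area_cm2):
    """Fraction of dies expected to be defect-free under a
    simple Poisson defect model: Y = exp(-D * A)."""
    return math.exp(-defect_density * area_cm2)

# Assumed defect density for an immature process (illustrative only).
D = 0.5  # defects per cm^2

y_m43 = poisson_yield(D, A_M43)
y_ff = poisson_yield(D, A_FF)
print(f"m4/3 yield: {y_m43:.1%}, FF yield: {y_ff:.1%}")
```

With this assumed defect density the smaller die yields well over ten times more good sensors per defect-limited wafer, which is the economic logic behind launching a new process on small formats first.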
Remember that Canon had to develop the now-obsolete APS-H sensor format because back in the day it was technically and commercially not feasible to produce full-size FF sensors, yet Canon's customers had invested huge amounts of money in 35mm lenses which they wanted to keep using with digital bodies.
Unfortunately, the actual launch of the Panasonic GH6 and organic sensors seems to share a common pattern with the Second Coming you mentioned.
I think I saw an article where they said Panasonic gave up on organic sensor development and sold the company/project to someone else. But they do have a massive video camera with an organic sensor (which costs $$$$$$s), so the tech is somewhat real. Just expensive, and it demands lots of cooling and power, etc.
On a related note: last summer NHK demonstrated a working prototype of an organic sensor which is a direct competitor to Foveon technology. It can detect all three colours (blue, green, and red) at each photosite.
I believe time is of the essence with this tech. Merrill was at JPL working on Foveon in the early-to-mid '90s, long before it ever ended up in a consumer camera. Merrill's tech came from work done in the late '80s by another scientist.
The Foveon sensor has one serious flaw: it's a solution to a problem that doesn't exist.
They had their reasons for not wanting to work with interpolated images, and up until then their only choices were monochrome sensors and beam-splitting.
I remember those. Not only that, but they hadn't invested in any "photographic" colour science. Everything was calibrated to NTSC/PAL television colours. With proper TV lighting, those cameras did get good results. But who carried a lighting rig with those cameras? Nobody.

My first digital camera was a 3CCD Panasonic DV camcorder, which they promoted as a hybrid camera because it even had a real flash for "3.1 MP still images".
They even went so far that the JPEGs the camera captured were automatically extrapolated in-body to 3.1 MP from whatever was the active area of the nominally 0.8 MP CCD sensor. Trying to extrapolate entirely new photosites which were never captured in the first place is an entirely different task than interpolating 2 out of 3 RGB values when one is known and the missing colours were captured in adjacent pixels.