While I am a proponent of 4k for filming, because it offers so much material to work with and so many advantages in every manipulation of the footage, I am convinced that as a consumer product it is yet another 21st-century hoax to sell you a new TV and the same old content.

Why is 4k useless in televisions, cell phones and amateur cameras?

Not to mention that if the footage is not destined for postproduction, capturing in 4k is a waste of resources: CPU load on the capturing device, occupied memory space, overheating. On a device below a certain size, 4k is a useless waste. The visual acuity of an average human being is such that it resolves about 1/10 mm at roughly 50 cm of distance. An average-size 4k panel has a density of about 110 ppi, i.e. on a 40-inch screen we are talking about more than 4 pixels per linear millimeter; too bad that a 40-inch set will be watched from at least 150 cm, where the resolving capacity has dropped to about 0.3 mm (and at a more typical 2.5-3 m, to 0.5-0.6 mm), so the panel carries far more information than an average human being can physically perceive…
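To put rough numbers on this, here is a minimal sketch (the acuity figure is the ~0.1 mm at 50 cm quoted above, scaled linearly with distance; the panel is a nominal 40-inch 16:9 4k screen; both are illustrative assumptions):

```python
import math

# Smallest detail an average eye resolves, scaled linearly with
# distance from the ~0.1 mm at 50 cm figure above (assumption).
def resolvable_mm(distance_cm, acuity_mm_at_50cm=0.1):
    return acuity_mm_at_50cm * distance_cm / 50.0

# Pixel pitch of a nominal 40-inch 16:9 4k panel (3840 x 2160).
diag_in, width_px = 40, 3840
width_in = diag_in * 16 / math.sqrt(16**2 + 9**2)  # horizontal size in inches
pitch_mm = width_in * 25.4 / width_px              # ~0.23 mm per pixel

for d_cm in (50, 150, 300):
    print(f"{d_cm:>3} cm: eye resolves {resolvable_mm(d_cm):.2f} mm, "
          f"pixel pitch {pitch_mm:.2f} mm")
```

Already at 150 cm the pixel pitch is finer than what the eye can resolve, and at typical living-room distances the gap only widens.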

And this calculation is only valid if we have a pure 4k source, because if on that TV we watch a 4k movie from streaming, 4k YouTube, a cell phone or other such sources, we won't actually have all that information: the panel offers that definition, but the transmitted data doesn't carry it, so the extra resolution of the TV goes completely unused…

so why buy a 4k television today?

Let’s do a very simple and quick analysis of the pros and cons:

Pros:

  1. it makes a great high-resolution digital frame for your photographs
  2. if you have a real 4k camera you can see your pictures in their full glory, provided you stand very close to the 40-inch set you bought.

Cons:

  1. there is still no real 4k content with which to take advantage of a 4k TV
  2. the 4k Blu-ray standard is still theoretical, and there are no 4k Blu-rays on the market
  3. there are still no movies shot entirely in 4k, so you would be watching “inflated”, non-native 4k movies
  4. 4k streaming actually offers less than 2.7k of effective resolution; the rest is generated by interpolation, so again useless (see the bits-per-pixel sketch after this list)
  5. 4k broadcasting is theory; in reality, to date there are few channels even in FullHD (1920×1080), and most are at best HD (1280×720)
  6. 4k TVs do not have digital decoders able to receive 4k signals, because 4k transmission standards have not yet been defined; by the time 4k channels exist, today’s TV will already be obsolete and unable to show broadcasts at 4k resolution
  7. playing FullHD movies gives either a blurry view of the content (because it is inflated to four times the pixel count) or an over-sharpened one, because to mask the softness the image is processed, actually eating several levels of detail during the crude sharpening operations; so it is a bad vehicle for viewing a Blu-ray
  8. it costs much more than a quality FullHD equivalent, while not offering, to date, the ability to view truly different images
  9. amateur 4k footage from cell phones, cameras, etc. may not carry enough real detail to produce quality images that take advantage of the 4k matrix
  10. to perceive the real difference between a 4k TV and an FHD TV you have to go to 50 inches and up, which, however, you will have to watch from farther away, and so you are back to the initial absurdity: useless for most people, who do not have the visual ability to appreciate the detail at a physiological level.
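On point 4, a rough way to see why a “4k” stream cannot carry 4k worth of detail is to compare its compressed bit budget per pixel with that of a FullHD Blu-ray. The bitrates below are ballpark public figures, not measurements of any specific service, and codec efficiency differs (HEVC vs AVC), so treat this as an order-of-magnitude sketch:

```python
# Average compressed bits available per pixel per frame.
def bits_per_pixel(width, height, fps, bitrate_mbps):
    pixels_per_second = width * height * fps
    return bitrate_mbps * 1_000_000 / pixels_per_second

# Typical 4k stream: ~16 Mbps at 3840x2160, 24 fps (assumed figures).
print("4k stream  :", round(bits_per_pixel(3840, 2160, 24, 16), 3), "bits/pixel")

# FullHD Blu-ray: ~30 Mbps at 1920x1080, 24 fps (assumed figures).
print("FHD Blu-ray:", round(bits_per_pixel(1920, 1080, 24, 30), 3), "bits/pixel")
```

Even granting the newer codec roughly double the efficiency, the FullHD disc still has several times more data per pixel, which is why the stream's “4k” is largely reconstructed rather than real.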

why is 4k not needed in cinema?

4k projection is a bit of a scam, because in reality most movies are shot with 2k digital cameras, such as the Alexa, so the fact that the result is then projected in 4k gives no real advantage; it is an unnecessary flexing of the whole system's muscles, because it requires four times the storage space of 2k and more resources for playback and content management, without delivering any actual benefit.
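The "four times" is simple arithmetic on the standard DCI frame sizes; this tiny sketch just makes it explicit:

```python
# DCI 2K vs DCI 4K frame sizes: 4K holds four times the pixels,
# so raw storage and decode bandwidth scale by roughly 4x.
dci_2k = 2048 * 1080   # pixels per DCI 2K frame
dci_4k = 4096 * 2160   # pixels per DCI 4K frame
print(dci_4k / dci_2k)  # -> 4.0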

But can you really not see the difference?

Nolan bragged that he shot The Dark Knight Rises and Interstellar in IMAX (an ultra-high-definition film format), and a lot of people said they noticed the difference…
I'd be curious to ask those same people in which shots they noticed the difference, because neither film was shot entirely in IMAX: too expensive, cameras too big and awkward, etc. etc. So traditional Super 35 shots were alternated with IMAX shots (mostly exteriors, where it was easier to handle the bulkier cameras). Above all, in most situations these films were seen in digital theaters on 2k projectors, where everything was flattened and leveled down.

Another point in favor of the overall mix is that many films undergo very heavy digital postproduction, even if only at the level of editing and color correction; in short takes one then cannot distinguish footage from DSLRs, GoPros and camcorders from that of professional cinema cameras. All thanks to the fact that each device is used in the situation that best extracts the visual quality from its sensor, making it work at the top of its ability.

so is it too early for 4k?

Well, you shoot in 4k for the future, and because you can extract a very high quality 2k and FullHD from it, but consuming 4k directly at the home level is a waste, because the difference is not perceptible to the eye in most home situations.
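One reason a 4k master yields an unusually clean FullHD image is supersampling: each FullHD pixel averages a 2×2 block of 4k pixels, which averages out per-pixel noise. A minimal numpy sketch of the idea (the flat-gray frame and noise level are toy assumptions, not any camera's real behavior):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "4k" frame: a flat gray image plus per-pixel sensor noise.
frame_4k = 0.5 + rng.normal(0.0, 0.05, size=(2160, 3840))

# 2x2 box downsample to "FullHD": each output pixel averages 4 inputs.
frame_fhd = frame_4k.reshape(1080, 2, 1920, 2).mean(axis=(1, 3))

# Averaging 4 independent noise samples halves the noise standard
# deviation (sqrt of 4), so the downscaled image looks cleaner.
print("4k noise :", frame_4k.std().round(4))   # ~0.05
print("FHD noise:", frame_fhd.std().round(4))  # ~0.025
```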

why then are 4k televisions and all the 4k peripherals proliferating?

They have to sell you something, don't they?
In marketing, numbers have always been used to give a tangible perception of value, even when those numbers have no real connection to the value of the product.

For example, burners started at the 2x of CDs and climbed all the way to 52x, and onward with DVDs; but no one tells you that true 52x media doesn't exist, because burning is a balance between write speed and the number of errors introduced. Depending on the quality of the media, the speed is dosed to introduce a minimal number of write errors, so that the data can still be read and, thanks to an error-correction system, the original data can be recovered. Error correction on reading was originally created to compensate for manufacturing defects and/or scratches and damage to the media; over time this system has become a way to speed up writing, on the assumption that in the end the data can still be read.

Where does the problem lie? In the fact that if you push a disc to the limit of readability by burning it at 52x instead of 8x, slight wear and tear is all it takes to make the written data unreadable. Not only that: slow writing applies the laser differently and, by introducing fewer errors, also leaves the disc more resistant to heavier wear, UV exposure, deformation of the writable layer, and so on.
Which makes one think about how superficially we write data to a medium, without any notion of how to do it or how to store it. Good luck to the Formula 1 burners; maybe after 3-4 months they will still be able to read something back from their discs.
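To see why that error budget can be "spent" on speed, here is a toy error-correcting code: a 3x repetition code with majority vote. It is nothing like the far stronger Reed-Solomon (CIRC) coding actually used on optical discs, but the principle is the same: as long as errors stay within the code's capacity the data reads back perfectly; one error too many and the original is gone.

```python
# Toy error correction: 3x repetition code with majority vote.

def encode(bits):
    return [b for b in bits for _ in range(3)]       # repeat each bit 3x

def decode(coded):
    out = []
    for i in range(0, len(coded), 3):
        block = coded[i:i + 3]
        out.append(1 if sum(block) >= 2 else 0)      # majority vote
    return out

data = [1, 0, 1, 1, 0]
coded = encode(data)

# One flipped bit per block: correction succeeds.
coded[0] ^= 1
print(decode(coded) == data)   # True: within the error budget

# A second flipped bit in the same block: correction fails silently.
coded[1] ^= 1
print(decode(coded) == data)   # False: budget exceeded, data corrupted
```

A fast burn that leaves blocks near this limit reads fine on day one; a little aging then pushes some of them over the line.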

another example, megapixels in cameras:

It has always seemed that megapixels are an indicator of quality, but if you squeeze 40 megapixels onto a 1/4-inch sensor, you cannot expect the same cleanliness and light as 12 megapixels on a full-frame sensor, because the light captured by each receptor is far greater in the second case. In reality it is not the megapixel count alone but also the ability of each photosite to capture information, and the area it covers, that determine the actual quality of the captured image; but that concept is too complicated, so for the masses megapixels = quality.
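To put numbers on the receptor-size argument, a quick sketch (the sensor dimensions are the usual nominal figures and are approximate, since actual active areas vary by manufacturer):

```python
# Average area available to each photosite, in square microns.
def photosite_area_um2(width_mm, height_mm, megapixels):
    area_um2 = (width_mm * 1000) * (height_mm * 1000)
    return area_um2 / (megapixels * 1_000_000)

full_frame = photosite_area_um2(36.0, 24.0, 12)  # 12 MP full frame
quarter_in = photosite_area_um2(3.2, 2.4, 40)    # 40 MP on a 1/4" sensor

print(f"full frame 12 MP : {full_frame:.2f} um^2 per photosite")   # ~72
print(f'1/4-inch 40 MP   : {quarter_in:.3f} um^2 per photosite')   # ~0.19
print(f"area ratio       : {full_frame / quarter_in:.0f}x")        # ~375x
```

Each full-frame photosite gets on the order of a few hundred times more area, and therefore light, which is where the cleanliness comes from.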

I still remember when I gave my sister a three-megapixel compact camera, in a world where several 5-6 megapixel cameras were already out, but the detail of its photographs was unmatched by the equivalents in its price range, because some of those interpolated, and some did have more receptors but less sensitive ones, etc. etc.

Today one battleground in still and video cameras is sensitivity (actually it was 25 years ago too, since even then people talked about shooting by candlelight).
If a camera can't shoot in the dark, and I'm not talking about low natural light, I'm talking about dark, then it's not worth having… so a RED and an Alexa, the digital cameras used to make movies, with a native sensitivity of only 800 ISO, must be duds…

Why is it already too late to buy a 4k television?

Let's say mine is a provocative statement, but not that provocative…
The Japanese are already experimenting with 8k broadcasts, so why buy an already-outdated product? You might as well go straight to 8k 😀

Jokes aside, the Japanese have always been at the forefront of experimentation. I remember the first FullHD shooting and viewing system, seen in person at SIM Audio HiFi in Milan in 1992, a joint experiment between RAI and the Japanese giant Ikegami; ironically, I captured those images with my mighty 200-line VHS, and that quality and power seemed so far out of reach.

Well before these pioneers, back in 1986, Francis Ford Coppola, produced by George Lucas, made a special 4D video clip (3D plus what is now called augmented reality) using experimental HD cameras and starring the great Michael Jackson: Captain EO.
This is to point out that if HD already existed as a technology in 1986 and today, almost 30 years later, it is still not the TV standard, we should think carefully about how far 4k can penetrate our homes in just a couple of years.
Above all, one has to consider that 4k does not only mean changing the reception systems, which are inexpensive, but replacing the entire production and broadcast chain, and for broadcasters that would be a monstrous investment, which I doubt they will make quickly, since many are still in standard definition.