4K Movies in the Very Beginning
The trend for videophiles to spend thousands of dollars chasing the perfect picture continues. Can the human eye actually perceive the difference between 1080p, 4K and now 8K when sitting 10 feet away from the display? This has become one of the main arguments against ultra high definition and, by extension, 4K movies.
I am skeptical about a lot of things. Chasing after newer and ‘improved’ technology before the previous generation has matured or been implemented to its maximum capability is filling our landfills. Many years ago – possibly 20 at most – there were already arguments for and against which standards would be used to broadcast HD, and judging by what I currently see, nothing was ever set in stone. Worse still, in moving from 1080p to 4K, where does this leave the consumer? The older 525 and 625 line standards were with us for nearly 50 years and are still with us in many countries. In our quest to make things bigger, better and faster we remain limited by what our eyes can or cannot see. Don’t always be fooled by the hype – in reality, the more money you splash out, the closer you can sit to the display. Expert opinion says that you cannot see the difference between a 1080p and a 4K 65″ monitor sitting 12 feet away.
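That expert opinion can be sanity-checked with a little trigonometry. Here is a minimal Python sketch, assuming the common rule of thumb that the eye resolves detail down to roughly one arcminute; if a pixel subtends less than that, making pixels smaller cannot be seen.

```python
import math

def pixel_arcmin(diagonal_in, horiz_px, distance_in, aspect=(16, 9)):
    """Angular size of one pixel, in arcminutes, for a widescreen panel."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)   # panel width from diagonal
    pixel_in = width_in / horiz_px                  # width of a single pixel
    return math.degrees(math.atan2(pixel_in, distance_in)) * 60

# A 65-inch panel viewed from 12 feet (144 inches)
for name, px in [("1080p", 1920), ("4K", 3840)]:
    print(f"{name}: {pixel_arcmin(65, px, 144):.2f} arcmin per pixel")
```

At 12 feet, even a 1080p pixel already subtends around 0.7 arcminutes – below the ~1 arcminute acuity limit – so halving the pixel size again with 4K gains the eye nothing at that distance, which is exactly the sceptic's point.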
The old TV Standards in a nutshell – 525 and 625 lines of colour: NTSC, PAL and SECAM
Many years back the NTSC standard in the USA (RCA) specified 525 horizontal lines at a frame refresh rate of 30Hz, which is half the mains frequency; the human eye does not detect this slow switching. With colour broadcasts, a subcarrier was injected into the signal at 3.58MHz and the frame rate reduced by a small fraction to prevent heterodyning between the chroma and FM sound signals.
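Those odd-looking numbers all fall out of one another. A short Python sketch of the standard NTSC colour derivation – the line rate was defined as an exact submultiple (1/286) of the 4.5 MHz sound intercarrier, and the colour subcarrier as 455/2 times the line rate:

```python
# NTSC colour timing, derived from the 4.5 MHz sound intercarrier
sound_carrier = 4_500_000          # Hz, FM audio intercarrier
line_rate = sound_carrier / 286    # ~15,734.27 Hz horizontal line rate
subcarrier = line_rate * 455 / 2   # ~3,579,545 Hz - the famous "3.58 MHz"
frame_rate = line_rate / 525       # ~29.97 Hz, just under the 30 Hz mono rate

print(f"subcarrier: {subcarrier:.0f} Hz, frame rate: {frame_rate:.3f} Hz")
```

The half-integer 455/2 multiple interleaves the chroma spectrum between the luma harmonics, and the knock-on effect is the familiar 29.97 Hz frame rate rather than a clean 30 Hz.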
NTSC transmissions suffered from colour tone or hue shift caused by phase errors and ghosting, which led Telefunken to produce the PAL or Phase Alternating Line system. PAL used a 4.43MHz colour subcarrier whose phase was reversed 180 degrees on each line scan, cancelling out phase errors. The PAL system used 625 lines at a frame rate of 25Hz interlaced – half the 50Hz mains frequency – likewise undetectable by the human eye.
SECAM was developed by the French company Compagnie Française de Télévision (later part of Technicolor). It uses an FM chroma subcarrier, sending the R-Y and B-Y signals in sequence on alternate lines. SECAM was originally designed to run at 819 lines. The system was not adopted in the West or in most Asian countries (North Korea being an exception), and many SECAM countries later migrated to the PAL system.
Tube Design – above 1280 lines but not yet 4K
The Shadowmask Tube
1 = RGB guns, 2 = RGB electron beams, 3 = Focus ring, 4 = Deflection yoke X/Y axis, 5 = Final anode > 25kV, 6 = Shadowmask with either holes or slots for delta or in-line electron guns, 7 = RGB phosphor, 8 = Zoom to phosphor.
The first delta and in-line tube technologies used shadowmask tubes. By their very nature they succumbed to convergence and efficiency problems, the delta tube being very much the inferior in this regard. Numerous companies tried to improve their tube designs, but none were as successful as Sony, which brought out the Trinitron tube with its aperture grille – vertical wires instead of a shadowmask. The picture was brighter and crisper, and it paved the way for the flat tube technology used in computer monitors.
The Trinitron and most flat-screen tubes had one or two horizontal stabiliser wires to prevent shimmering. Although not really visible on TV-standard tubes at 625 lines, these wires were quite noticeable on computer monitors running a white background, and their visibility was a common complaint from users. Picture and high-speed graphics quality, even above 1600 lines, was excellent.
Digital vs Analogue
The older CRT-based technology was mostly analogue: the tubes were either cathode or control-grid modulated to get the relevant intensity of colour, which was set up by grey-scale, convergence and purity adjustments. Purity rings around the neck of the tube could be turned to compensate for magnetic influences, especially the earth’s magnetism, after degaussing. CRTs are heavy, could radiate X-rays, are energy hogs and suffered from flashover, which would of course damage circuitry if the energy was not properly contained. CRTs, following on from vacuum tube technology, were also ‘old’ tech. LCDs were becoming popular in the 90s, and although manufacturing costs for the first commercial panels and prescaler chips were exorbitant, LCDs paved the road to digital signal processing and our now lower energy footprint.
With all the fanfare, though, for those of you who can remember, this technology was full of surprises. By means of prescaling, an LCD could be run at different resolutions, but those using their computers as text editors were mostly unenthusiastic about the image quality when not run at the native resolution. In fact, early passive-matrix LCD picture quality was pathetic, compounded by smear on detailed, fast-moving graphics, which left gamers and programmers sticking to CRT technology or buying really high-end active-matrix screens – which still exhibited fairly slow response times, sometimes as slow as 20ms.
LCDs, because they are not light-emissive, require backlighting, which in those early days was strictly fluorescent. Early displays were not uniformly illuminated, and in poor designs there was noticeable flicker. Purchasing one would in all likelihood have left you with a feeling of despair as well: many were shipped with dead pixels or, worse still, a sub-pixel which stayed on. These were not A-grade panels – those were reserved for professional-series displays, and even they had their fair share of problems, unlike the ‘super A-grade’ panels reserved for the military and medical sectors.
With improvements in technology over the last ten years, LCD panels with LED backlighting have all but eradicated other forms of display in notebooks, TV receivers and computers. Although LED backlighting has improved the black saturation of LCD displays, they still do not compete with older plasma technology except in light output. LCD television, with Sony’s Bravia technology possibly being a big driving factor, has nevertheless made big inroads.
Plasma, for many, rules the roost – better colour saturation, black levels and faster image reproduction, with an affordable price tag too. Definitions are all HD, like LCD; both LCD and plasma now run at 2K, i.e. roughly 2,000 pixels per horizontal line (1920 × 1080). Panasonic stopped manufacturing plasma in 2013, which has left LG and Samsung on their own – so what next?
The general train of thought is that until 4K matures, users should stick to plasma. I agree. Both Samsung and LG may stop production of plasma as well once pricing for OLED technology drops – this is, after all, a significant factor in the buyer’s decision-making process. Plasma means more bang for your buck.
4K – the step forward…
So with many manufacturers now opting to drop plasma (for the second time) in favour of the 4K standard, we need to look at what is realistic and what is not. What can the eye see, and what can we hear? Will we see new audio amplifiers being released with a 1Hz to 1MHz bandwidth? Can we feel the difference, as many people think they can? As mentioned before, the picture on my plasma is better than the view out of the lounge window on a sunny Saturday afternoon. Have you watched Brain Games? Our eyes and brain are wonderful things; we can even perceive things to be there which just aren’t.
Personally – and I am no neurosurgeon – I am very skeptical. 1080p is not even mature, yet we have moved on to a resolution which our eyes will tell us is better than 2K simply because our brain says so. Just as some people claim they can hear 30kHz or 40kHz and ‘get the feel’ for the sound.
Certainly, we can now make things look better than we originally anticipated – but is it real? Using image-enhancing software to make a photo look better or ‘more realistic’ than the real thing seems to be the way we are going. Comparing 1080p with the 2K models abundantly out there, I see something which is surreal. It concerns me, because we believe that 4K is better than 2K, and already there is talk of 8K. Early digital processing left a lot to be desired: LCD image quality was inferior to CRT, and there certainly was nothing ‘lifelike’ about the movies shown on those devices.
Plasma and 1080p took care of that. LCD with evolved LED backlighting still does not come up to the picture quality standards of plasma, although there have been great strides through 2013 to put matters right. OLED and 4K are technologies which, like early LCD, will become more popular over the next two years. Right now, though, the marketing is all about how close one can sit in front of the wide-screen display, the size of the display, seeing crystal-clear text and simply being in awe of the definition. Our brain and eyes, however, deceive us. There are also limitations.
4K Movies and 3D Television
Here is an article on Wiki which covers the salient points of 3D reproduction. Whilst 3D, or stereoscopy, is not new, manufacturers have devised new ways to reproduce a 3D signal on modern television receivers: active-shutter 3D glasses synchronised via IR, and autostereoscopic displays where the viewer does not need glasses at all. Multiview, using arrays of cameras to capture the stereoscopic effect, has been taken up by Sony and Hitachi. This, IMO, is the best way to capture depth, but as there is no particular standard at present, 3D is really not mature enough for mass retail – although sales of 3D television receivers have nearly doubled every year since 2010.
Although the definition of current 3D models can be breathtaking, viewers complain of headaches and motion sickness. Most viewers prefer the older passive specs to the newer active ones. The one big problem with passive, however, is that each eye only sees half the number of lines at any given time; this is traded against the crosstalk issue with active lenses, which viewers in general find more objectionable. On a 1080-line system, therefore, each eye sees 540 lines, which theoretically puts you in the same space as the older NTSC broadcast system – but with a major difference: missing lines. 4K movies on 4K displays with 2160 lines are a huge jump in the right direction; the resolution is such that the viewer is not aware of artifacting, jagged edges or horizontal lines. This is where 4K shows the most promise, with even 8K a possible standard in the very near future.
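The halving is simple arithmetic, but it is worth spelling out, because it is the whole case for passive 3D on a 4K panel. A quick sketch:

```python
def lines_per_eye(panel_lines):
    # Passive (line-interleaved) 3D sends odd lines to one eye
    # and even lines to the other, so each eye gets half the panel.
    return panel_lines // 2

for name, lines in [("1080p", 1080), ("4K UHD", 2160)]:
    print(f"{name} panel -> {lines_per_eye(lines)} lines per eye")
```

On a 1080p panel each eye is left with 540 lines; on a 2160-line 4K panel each eye still receives a full 1080 lines, so passive 3D no longer costs you HD resolution.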
The Future of 4K
Both the Samsung and Sony 4K displays have incredible definition, but one needs to utilise the technology properly. Just as surely as one does not watch a Blu-ray movie through a 625-line television receiver, a 4K display can very easily be under-utilised. 3D with passive lens technology is where 4K comes into its own – its biggest strength is showing 4K movies in 3D; otherwise, at present, I see no other merit. Boasting the latest display and then using it to view analogue over-the-air content is pretty much not the way to go. Until there is a standard set of rules which all manufacturers of broadcast and media content can abide by – likewise utilising 1080p to get maximum performance, 3D included – we may very well end up chasing our own tails.
The reality is simple, and it’s all in the test. I have watched 4K movies on a Sony and they were simply breathtaking; 3D just made them better. With a US$28,500 price tag, they had to be. Right now I am happy with my $1,000 plasma.
4K and 8K are really ideal for the movie house – now that is an industry standard I understand.
[Ed – The new Samsung Curve I believe is something else. Does this change my mindset? We’ll see].