• Super Hi-Vision 8K TV approved
[url]http://www.bbc.co.uk/news/technology-19370582#sa-ns_mchannel=rss&ns_source=PublicRSS20-sa[/url]
Can we not have 8K? Broadcasts are having enough problems with 1080i; it's expensive for all that bandwidth, and not even every channel is in HD. Then we have to upgrade the cables to carry more data into houses, which means more storage at the cable provider, which means bigger bills than we already have. 2MP is perfectly fine for home viewing of a live format. Megapixels matter more in a still format, since a still image can be studied for longer. Expecting this by 2020 is ridiculous considering HD is only just settling in, and most Blu-rays and games are 720p, not even 1080p.
[QUOTE=Trogdon;37390145]Can we not have 8K? Broadcasts are having enough problems with 1080i; it's expensive for all that bandwidth, and not even every channel is in HD. Then we have to upgrade the cables to carry more data into houses, which means more storage at the cable provider, which means bigger bills than we already have. 2MP is perfectly fine for home viewing of a live format. Megapixels matter more in a still format, since a still image can be studied for longer. Expecting this by 2020 is ridiculous considering HD is only just settling in, and most Blu-rays and games are 720p, not even 1080p.[/QUOTE] You're thinking about it wrong: once we reach 8K, you'll never have to upgrade again.
[QUOTE=Trogdon;37390145]Can we not have 8K? Broadcasts are having enough problems with 1080i; it's expensive for all that bandwidth, and not even every channel is in HD. Then we have to upgrade the cables to carry more data into houses, which means more storage at the cable provider, which means bigger bills than we already have. 2MP is perfectly fine for home viewing of a live format. Megapixels matter more in a still format, since a still image can be studied for longer. Expecting this by 2020 is ridiculous considering HD is only just settling in, and most Blu-rays and games are 720p, not even 1080p.[/QUOTE] It has a long, long way to go before you'll see any broadcasts in 8K. They don't expect sets below $10,000 for another 15 years, and there are only three cameras capable of producing content for it.
I'm still fine with my loyal 24" Bendix Model KM21C television from 1952. It's provided great 240p picture quality in glorious grayscale for 60 years now. [editline]24th August 2012[/editline] Why anyone would need to upgrade is beyond me.
[QUOTE=Trogdon;37390145]Can we not have 8K? Broadcasts are having enough problems with 1080i; it's expensive for all that bandwidth, and not even every channel is in HD. Then we have to upgrade the cables to carry more data into houses, which means more storage at the cable provider, which means bigger bills than we already have. 2MP is perfectly fine for home viewing of a live format. Megapixels matter more in a still format, since a still image can be studied for longer. Expecting this by 2020 is ridiculous considering HD is only just settling in, and most Blu-rays and games are 720p, not even 1080p.[/QUOTE] Just futureproofing. [URL="http://en.wikipedia.org/wiki/List_of_common_resolutions"]There are plenty of impractical official resolutions.[/URL] I'm sitting on QSXGA (not counting multiple monitors) myself.
It even has interlaced frames, which other television sets don't have, so they show coarse horizontal lines.
[QUOTE=Neo Kabuto;37390446]It has a long, long way to go before you'll see any broadcasts in 8K. They don't expect sets below $10,000 for another 15 years, and there are only three cameras capable of producing content for it.[/QUOTE] We're definitely a long time away from this being in widespread use. Just think of the bandwidth required to stream 7680*4320 video at 120fps; even compressed, it would probably be on the order of 1Gb/s. We'll probably see the transition to 4K resolutions first, and even that will take quite a while.
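Quick back-of-the-envelope in Python (the 24-bit colour and the 100:1 codec ratio are just illustrative assumptions, not Super Hi-Vision's actual spec):

[code]
# Back-of-the-envelope bitrate for an 8K, 120fps stream.
width, height = 7680, 4320    # 8K pixel dimensions
fps = 120                     # Super Hi-Vision's target frame rate
bits_per_pixel = 24           # assumed uncompressed 8-bit RGB

raw_bps = width * height * bits_per_pixel * fps
print(f"Raw: {raw_bps / 1e9:.1f} Gb/s")          # ~95.6 Gb/s

compression = 100             # assumed codec ratio, purely illustrative
print(f"Compressed: {raw_bps / compression / 1e9:.2f} Gb/s")  # ~0.96 Gb/s
[/code]

Raw it's closer to 100Gb/s; a realistic codec brings it down to right around that 1Gb/s ballpark.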
Is that camera really fucking huge, or is there some tricky perspective in that photo?
Soon we'll have native RED EPIC devices... [thumb]http://www.evanagee.com/wp-content/uploads/2010/09/28k_RED_CAMERA.png[/thumb]
[QUOTE=C4rnage;37390786]Is that camera really fucking huge, or is there some tricky perspective in that photo?[/QUOTE] Yes, they used the trick of replacing the camera operators with Asians so it looks bigger.
I saw the TV at the Olympic Park; it's ridiculous.
It'd make more sense to perfect the existing standards before jumping to absolutely enormous resolutions.
[QUOTE=alien_guy;37390291]You're thinking about it wrong: once we reach 8K, you'll never have to upgrade again.[/QUOTE] We said the same thing about dial-up ages ago, and we said practically the same thing about games 20 years ago. "Graphics will never be better than this!" "The internet will never be faster than this!" Never say never.
[QUOTE=Trogdon;37390145]Can we not have 8K? Broadcasts are having enough problems with 1080i; it's expensive for all that bandwidth, and not even every channel is in HD. Then we have to upgrade the cables to carry more data into houses, which means more storage at the cable provider, which means bigger bills than we already have. 2MP is perfectly fine for home viewing of a live format. Megapixels matter more in a still format, since a still image can be studied for longer. Expecting this by 2020 is ridiculous considering HD is only just settling in, and most Blu-rays and games are 720p, not even 1080p.[/QUOTE] HD was being introduced around the 90s and was worked on even earlier. We might not see 8K in the home until around 2040 or 2050 because of the point of diminishing returns. The jump from 480p to 1080p is easily seen by most people, but for 8K you'd need a really big screen, probably unmanageably big, for the majority of people to see the difference (rough numbers below). Personally, I can't wait for 8K screens.

Also, you're looking at this from a completely wrong angle. Providers won't force you onto 8K and then proceed to charge you $1000 per month. You won't see a difference in price, because that's not how it works: pricing for average internet service has stayed roughly the same, you simply now have options for higher speeds for more money, and the same person from the early 2000s would be paying roughly the same for internet now. The cables themselves don't really need to be upgraded either, unless you're talking about moving from copper to fiber, which we need to do anyway. Copper can only carry so much, while fiber even allows tricks like twisted-light (orbital angular momentum) multiplexing for terabits of bandwidth. We need that anyway.

8K probably won't come to homes until 2050 or so, because there's no pressing need for 8K broadcasts. You'll see these in venues first, then high-end theatres, then maybe high-end home theatres and enthusiasts.

As far as storage goes, it will continue to progress in step. If people can't store their media on current technology, there will be demand for higher-density storage and the market will push it to be developed. If people stopped needing to store more and more things at higher and higher resolutions, the HDD market would have gone a bit further and then slowed down for lack of demand. Most of this stuff is intertwined and will balance out.

As for resolution, the same could be said from back when we were on 480i: who needs 2MP for video, we can keep that on digital cameras. You don't know you want or need something until you experience it. Once people slowly get a taste of in-home 4K video, I can assure you more and more people will want to own a 4K display.

Also, Blu-rays are set to 1080p because even some theatres are at 1920x1080. It's a standardized format, but 4K and 8K are also standardized formats; we may see the new "DVD" format at 4K. Games are limited not by choice but by hardware: consoles are on 2006 hardware or older right now, but new-gen video cards should have no issue gaming at 4K, I'd imagine. I game at 2.5K right now on a machine I built last year. Yes, it was expensive, but performance always goes up. By the time 8K is standardized, a $500 machine could probably render 8K really well. It's the natural progression of technology.
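To put the diminishing-returns bit in numbers, here's a rough sketch assuming the common rule of thumb that a 20/20 eye resolves about one arcminute per pixel (an approximation, not hard science):

[code]
import math

def max_useful_distance_m(diagonal_in, horizontal_px, aspect=16 / 9):
    """Farthest distance (metres) at which a ~20/20 eye (about one
    arcminute per pixel) can still resolve individual pixels."""
    diag_m = diagonal_in * 0.0254
    width_m = diag_m * aspect / math.hypot(aspect, 1)  # width from diagonal
    pixel_m = width_m / horizontal_px
    return pixel_m / math.tan(math.radians(1 / 60))    # pixel subtends 1'

for px, label in [(1920, "1080p"), (3840, "4K"), (7680, "8K")]:
    d = max_useful_distance_m(60, px)
    print(f'60" {label}: pixels visible only within ~{d:.1f} m')
[/code]

So on a 60-inch set you'd have to sit within about half a metre for 8K to show you anything over 4K; that's the diminishing-returns problem in a nutshell.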
I'd assume modern GPUs can handle 4K pretty easily. Heck, my 570 runs more than fine at 2K and a bit, and it's last generation (not to mention it's below the 580/90). I think the only thing holding some cards back may be the lack of sufficient display connectors.
[QUOTE=FlubberNugget;37392099] I think the only thing holding some cards back may be the lack of sufficient display connectors.[/QUOTE] DisplayPort.
[QUOTE=Silikone;37392341]DisplayPort.[/QUOTE] I know most modern high-end cards use DisplayPort; that's why I said 'some'. Also, dual-link DVI is an option too.
[QUOTE=FlubberNugget;37392362]I know most modern high-end cards use DisplayPort; that's why I said 'some'. Also, dual-link DVI is an option too.[/QUOTE] 4K might work at 24Hz, but dual-link DVI tops out at 2560x1600@60Hz unless you basically overclock the port.
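Rough sanity check, assuming dual-link DVI's ~330MHz combined pixel clock and a ballpark 20% blanking overhead (both approximations; real CVT timings vary):

[code]
DL_DVI_MAX_MHZ = 330          # approx. dual-link DVI pixel clock ceiling
BLANKING_OVERHEAD = 1.20      # assumed ~20% extra for blanking intervals

def pixel_clock_mhz(w, h, hz):
    # Required pixel clock: active pixels per second plus blanking.
    return w * h * hz * BLANKING_OVERHEAD / 1e6

for w, h, hz in [(2560, 1600, 60), (3840, 2160, 24), (3840, 2160, 60)]:
    clk = pixel_clock_mhz(w, h, hz)
    verdict = "fits" if clk <= DL_DVI_MAX_MHZ else "too fast"
    print(f"{w}x{h}@{hz}Hz -> ~{clk:.0f} MHz ({verdict})")
[/code]

So 2560x1600@60Hz and 4K@24Hz squeeze under the ceiling, while 4K@60Hz needs nearly double it.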
[QUOTE=Derp Y. Mail;37391715]We said the same thing about dial-up ages ago, and we said practically the same thing about games 20 years ago. "Graphics will never be better than this!" "The internet will never be faster than this!" Never say never.[/QUOTE] What a stupid thing to say. I'm making the point that once resolution gets high enough that you cannot see the pixels (like iPhone or iPad Retina displays), you no longer have to increase it because it doesn't make a difference. Those examples can get infinitely better because there will actually be a noticeable improvement. [editline]24th August 2012[/editline] [QUOTE=FlubberNugget;37392362]I know most modern high-end cards use DisplayPort; that's why I said 'some'. Also, dual-link DVI is an option too.[/QUOTE] Thunderbolt is a good option.
[QUOTE=Trogdon;37390145]Can we not have 8K? Broadcasts are having enough problems with 1080i; it's expensive for all that bandwidth, and not even every channel is in HD. Then we have to upgrade the cables to carry more data into houses, which means more storage at the cable provider, which means bigger bills than we already have. 2MP is perfectly fine for home viewing of a live format. Megapixels matter more in a still format, since a still image can be studied for longer. Expecting this by 2020 is ridiculous considering HD is only just settling in, and most Blu-rays and games are 720p, not even 1080p.[/QUOTE] It's just prototypes at this stage. While the technology exists now to film, transmit and display 8K signals, the means to do so are incredibly expensive, large and impractical, just as 1080p gear was back in the day. As technology marches on and miniaturization takes hold, this tech will find its way into people's homes naturally, as 1080p has done.
[QUOTE=alien_guy;37392638]What a stupid thing to say. I'm making the point that once resolution gets high enough that you cannot see the pixels (like iPhone or iPad Retina displays), you no longer have to increase it because it doesn't make a difference. Those examples can get infinitely better because there will actually be a noticeable improvement.[/QUOTE] And you think that's going to stop us from trying to increase it anyway? It's not a point of "it's no use"; it's a point of "it'll happen anyway".
[QUOTE=alien_guy;37392638]What a stupid thing to say. I'm making the point that once resolution gets high enough that you cannot see the pixels (like iPhone or iPad Retina displays), you no longer have to increase it because it doesn't make a difference. Those examples can get infinitely better because there will actually be a noticeable improvement.[/QUOTE] I don't think you get the point of Retina displays.
[QUOTE=Trogdon;37390145]Can we not have 8K? Broadcasts are having enough problems with 1080i; it's expensive for all that bandwidth, and not even every channel is in HD. Then we have to upgrade the cables to carry more data into houses, which means more storage at the cable provider, which means bigger bills than we already have. 2MP is perfectly fine for home viewing of a live format. Megapixels matter more in a still format, since a still image can be studied for longer. Expecting this by 2020 is ridiculous considering HD is only just settling in, and most Blu-rays and games are 720p, not even 1080p.[/QUOTE] Fiber is the last cable we'll need. Your cables won't need to be changed in the future unless you're one of those stupid fellas who chose coax instead of fiber. Fiber operates with light; ultimately the cable is only limited by the speed of light, and we can't get anything faster than that. The equipment in your home will eventually have to be changed to keep up, processing that light faster, but that's it, and that's something you'd expect to happen anyway just from upgrading your router or adding new functions to your TV box.

Your bills won't get bigger either, because internet service gets faster as technology progresses. The lowest net speed at the lowest price today will eventually match, and later surpass, the highest net speed your operator can deliver (through fiber or coax) now.

Nobody said anything about Blu-rays and DVDs; we're talking broadcasting here. All they need is hard drives big enough to store the footage and you're done (rough numbers below). Nobody said anything about selling series and whatnot at 8K. There are plenty of movies and series shot on 4K RED cameras today, but they don't release anything higher than 1080p (which is good for cinematographers, as they can crop shots without losing any quality since the image is so big). Bigger resolutions will come as the market demands them and the industry can support them.

Ten years in the future is actually far from far-fetched for this. Just compare how it was 10 years ago: YouTube and Facebook didn't exist, 720p and 1080p were pretty much unheard of, the net was finally starting to reach 1-2Mbit speeds, and 3D movies had just become the big new thing. Just saying.

[editline]25th August 2012[/editline]

[QUOTE=SCopE5000;37390804]Soon we'll have native RED EPIC devices... [thumb]http://www.evanagee.com/wp-content/uploads/2010/09/28k_RED_CAMERA.png[/thumb][/QUOTE] Imagine what it looks like if you film a close-up of an eye. Those details, man. Those details.
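To give a sense of scale for those hard drives, a hedged sketch using the ~1Gb/s compressed figure estimated earlier in the thread (raw camera files would be far bigger; the 3-hour event length is just an example):

[code]
BITRATE_GBPS = 1.0   # assumed compressed 8K stream, per the earlier estimate
HOURS = 3            # e.g. one long broadcast event

# gigabits -> gigabytes -> terabytes
terabytes = BITRATE_GBPS * 3600 * HOURS / 8 / 1000
print(f"{HOURS}h of 8K at {BITRATE_GBPS} Gb/s ~= {terabytes:.2f} TB")  # ~1.35 TB
[/code]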
[QUOTE=alien_guy;37392638]What a stupid thing to say. I'm making the point that once resolution gets high enough that you cannot see the pixels (like iPhone or iPad Retina displays), you no longer have to increase it because it doesn't make a difference. Those examples can get infinitely better because there will actually be a noticeable improvement. [editline]24th August 2012[/editline] Thunderbolt is a good option.[/QUOTE] Correct me if I'm wrong, but isn't Thunderbolt basically a fork of DisplayPort? I mean, it uses the Mini DisplayPort connector. It's also proprietary to Apple, IIRC. Also, er, not being able to see the pixels isn't the only benefit of a 'retina' display: increased workspace, less eyestrain, and it looks better in general.
Thunderbolt isn't proprietary to Apple; it's just that Apple puts it on their laptops. And yes, Thunderbolt uses DP circuitry. I don't understand why they put the chips in the cables themselves, though.
[QUOTE=Brt5470;37396432]Thunderbolt isn't proprietary to Apple; it's just that Apple puts it on their laptops. And yes, Thunderbolt uses DP circuitry. I don't understand why they put the chips in the cables themselves, though.[/QUOTE] Oh, looks like I remembered wrong. Apple registered the trademark, but Intel owns all of the rights to it.
[QUOTE=Brt5470;37391894]It's the natural progression of technology.[/QUOTE] Sure, there's such a thing as the natural progression of technology, but this is 8K video we're talking about; this is not the same jump as 480i to 1080p. This is a jump from 2 megapixels (1080p) to about 33 megapixels (8K), and it's a largely useless one for a video format. Why? It comes down to three important parts of the camera: the video codec, the sensor, and the lenses.

If we look at the lenses, we immediately notice an issue. Cine lenses are designed for a 35mm strip of film, or the usual cine format known as "Super 35". That works out to somewhere around 18-24 megapixels in terms of digital resolution. Now we're looking at something with far more detail than that, 33 megapixels, and we don't even know the size of the sensors in these cameras. This alone is a huge deal. If the sensors don't cover at least a 35mm diagonal, then this is an absolutely useless jump (hint: they probably don't). The pixel pitch of a 35mm diagonal packed with 33 megapixels means that lenses cannot physically make the image any sharper, and that no extra detail is gained by having more pixels. RED cameras already run into this problem; it's no accident you don't see footage from them resolving at native resolution. We don't have lenses that can physically make use of that pixel density, so why bother?

This is a problem among STILL cameras right now. If you compare a 36-megapixel image with a 24-megapixel one, and resize the 24 up to 36 megapixels, you'll have approximately the same level of detail, because the lens physically cannot resolve the difference. Fixing that means bigger lenses, bigger cameras, bigger cooling systems, etc. And all for what? A difference in video quality that exists only "on paper". The gain won't amount to much, because in reality they won't enlarge the lens mount (the PL mount has been around for decades, and the standard "broadcast" mount only covers a 16mm strip of film); they'll just shrink the pixel pitch and cram more pixels onto the same sensor. Big pixels = better picture. We aren't doing that; we're cramming more in and gaining no real resolving power.

And that's assuming there would even be a visible difference between 8K and 4K. Certainly it's double the linear resolution (four times the pixels), so more detail on paper. But how large would the TV need to be to discern the difference? How close would the optimal viewing distance have to be? At what point do natural vision deficiencies make the difference immeasurable? There's a reason 6MP is more than most people will EVER need for still photography: they aren't making gigantic prints. And even if you do make big prints, when will people view them from close enough to see a discernible difference?

The jump from 4K to 8K is not that big a deal. In still frames, at 100% crops, you might be able to tell the difference and see more detail, but that's not how you view the format; you view it at the size of the screen, and it's moving. We should be focusing our efforts on better video codecs, so the resolution we currently have doesn't look like shit (this is why Flip HD footage will never look as good as a high-end camcorder's, despite the same "resolution"), and on better frame rates instead of stupid 120Hz interpolation. I don't support 8K because 1080p isn't even great at the moment.

We have a LOT of work left to do here before we even think about moving up to such a high resolution. Higher pixel count does not mean better quality (rough numbers below). /rant
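To put rough numbers on the pixel-pitch argument (the 24.9mm sensor width, 550nm wavelength and f/5.6 aperture are illustrative assumptions, not specs from the actual cameras):

[code]
SENSOR_WIDTH_MM = 24.9   # assumed Super 35-ish active width
H_PIXELS = 7680          # 8K horizontal resolution
WAVELENGTH_UM = 0.55     # green light
F_NUMBER = 5.6           # a typical shooting aperture

pitch_um = SENSOR_WIDTH_MM * 1000 / H_PIXELS
airy_um = 2.44 * WAVELENGTH_UM * F_NUMBER  # diffraction-limited Airy disk
print(f"Pixel pitch: {pitch_um:.2f} um")                            # ~3.24 um
print(f"Airy disk:   {airy_um:.2f} um (~{airy_um / pitch_um:.1f} px wide)")
[/code]

Even a perfect lens at f/5.6 smears a point of light across more than two of those pixels, so the extra pixels mostly resolve blur.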
[QUOTE=Trogdon;37396998]Sure, there's such a thing as the natural progression of technology, but this is 8K video we're talking about; this is not the same jump as 480i to 1080p. This is a jump from 2 megapixels (1080p) to about 33 megapixels (8K), and it's a largely useless one for a video format. Why? It comes down to three important parts of the camera: the video codec, the sensor, and the lenses.

If we look at the lenses, we immediately notice an issue. Cine lenses are designed for a 35mm strip of film, or the usual cine format known as "Super 35". That works out to somewhere around 18-24 megapixels in terms of digital resolution. Now we're looking at something with far more detail than that, 33 megapixels, and we don't even know the size of the sensors in these cameras. This alone is a huge deal. If the sensors don't cover at least a 35mm diagonal, then this is an absolutely useless jump (hint: they probably don't). The pixel pitch of a 35mm diagonal packed with 33 megapixels means that lenses cannot physically make the image any sharper, and that no extra detail is gained by having more pixels. RED cameras already run into this problem; it's no accident you don't see footage from them resolving at native resolution. We don't have lenses that can physically make use of that pixel density, so why bother?

This is a problem among STILL cameras right now. If you compare a 36-megapixel image with a 24-megapixel one, and resize the 24 up to 36 megapixels, you'll have approximately the same level of detail, because the lens physically cannot resolve the difference. Fixing that means bigger lenses, bigger cameras, bigger cooling systems, etc. And all for what? A difference in video quality that exists only "on paper". The gain won't amount to much, because in reality they won't enlarge the lens mount (the PL mount has been around for decades, and the standard "broadcast" mount only covers a 16mm strip of film); they'll just shrink the pixel pitch and cram more pixels onto the same sensor. Big pixels = better picture. We aren't doing that; we're cramming more in and gaining no real resolving power.

And that's assuming there would even be a visible difference between 8K and 4K. Certainly it's double the linear resolution (four times the pixels), so more detail on paper. But how large would the TV need to be to discern the difference? How close would the optimal viewing distance have to be? At what point do natural vision deficiencies make the difference immeasurable? There's a reason 6MP is more than most people will EVER need for still photography: they aren't making gigantic prints. And even if you do make big prints, when will people view them from close enough to see a discernible difference?

The jump from 4K to 8K is not that big a deal. In still frames, at 100% crops, you might be able to tell the difference and see more detail, but that's not how you view the format; you view it at the size of the screen, and it's moving. We should be focusing our efforts on better video codecs, so the resolution we currently have doesn't look like shit (this is why Flip HD footage will never look as good as a high-end camcorder's, despite the same "resolution"), and on better frame rates instead of stupid 120Hz interpolation. I don't support 8K because 1080p isn't even great at the moment.

We have a LOT of work left to do here before we even think about moving up to such a high resolution. Higher pixel count does not mean better quality. /rant[/QUOTE] Who says we're trying to jump right now? Things like this are designed so developers can make sure their technologies are compatible with each other. You're hardly going to make lots of 8K screens for stadiums and not set a standard resolution for cameras to capture video at. It's just futureproofing.
Well, clearly there is a jump being attempted right now, and it's just premature. Making 8K screens is fine, but having 8K cameras (now) is a little overkill when the field is clearly in its infancy and has so many issues to conquer before it's actually a viable format. Lenses and sensors are not ready for 8K to be accurately resolved, which makes 8K video nothing more than a buzzword.