• HD 5850 + 32" TV = 1080p monitor?
    17 replies, posted
Rather a simple question, and in theory it should work, but I'm curious if anyone has a similar setup. Basically, I have a 32" television that I'm currently running my system on with a very old graphics card (hence the low resolution); with this old card it's running at 1360x768 via VGA. Now to the question: if I were to use a 5850, connecting its HDMI port to the HDMI port on the television, would I get 1920x1080 (i.e. full HD)? In theory it should work, but are there any limitations you can think of that could prevent this?
uh, your TV probably only supports 1366x768. Most 32" TVs don't actually do 1080p, but a bastardised version. Sorry.
While it's true that a lot of 32 inch tvs used to have that panel size, most panels on the market today in that range are 'true hd'. OP - what is the model/manufacturer of your tv?
I'm going to have to go and find the manual for the model, but the make is a Samsung.
I think OP is asking if a newer GPU will make his 720p HDTV do 1080p; to that the answer is no.
[QUOTE=Falubii;24036974]I think OP is asking if a newer GPU will make his 720p HDTV do 1080p; to that the answer is no.[/QUOTE] Not what I'm asking. I'm just wondering if it would be supported, and if there's any real reason to ditch the TV and get a smaller monitor, if one fits into my budget.
Just because the name of the video card has HD in it doesn't mean that everything it's connected to magically turns into HD.
xxncxx, he didn't ask that. He's asking if his tv will support full HD.
My understanding of HD TVs is that they will take the signal you give them and convert it to fit. So if your set is a 32" 720p TV, it'll take whatever you send it via HDMI and turn it into 720p, which is what you'll see on it. Of course, this means that for the best-quality picture you may as well send it 720p from your computer to start with.
Can you provide a link to your TV? Also, what GPU do you currently have?
I use a 5770 and a 30 inch LCD; crystal clear.
[QUOTE=Nipa;24033217]uh, your TV probably only supports 1366x768. Most 32" TVs don't actually do 1080p, but a bastardised version. Sorry.[/QUOTE] actually, plenty of 32" displays now advertise full 1080p, and there are a lot of them. however, there are also a lot of 32" displays marked 1080i, and those are usually 1366x768 panels. i'm not sure why the FCC allows a display that is neither 1920 pixels wide nor 1080 interlaced lines tall to be labelled 1080i, but as long as it's larger than 720p it can be marketed as accepting 1080i. it's funny how that works.

[editline]12:05AM[/editline]

[QUOTE=cecilbdemodded;24047970]My understanding of HD tvs is that they will take the signal you give them and convert it to fit. So if your set is a 32" 720p TV then it'll take whatever you send it via HDMI and turn it into 720P, which is what you'll see on it. Of course, this means for best quality picture you may as well send it 720P from your computer to start with.[/QUOTE] what? are you trying to suggest that...huh?

[editline]12:11AM[/editline]

ok, i'm going to respond to what i [I]think[/I] you're saying. to the first part: out of the box, sort of, but only stretching, which is optional and a matter of preference. to the second part: HDTVs do not upscale sources; a TV will take a 720p source and display it as 720p. upscaling is something the source device has to do (BD/DVD players, the 360 and PS3, some cable/satellite boxes, etc.). so if he sent a 720p signal, of course the TV would show 720p, unless you mean it would turn it into 1080p, which it wouldn't, and which wouldn't be possible anyway if the display itself weren't 1080p (which it evidently is not). however, yes, the best quality comes from sending the display's native resolution, which in this case is 1366x768. that's an unusual resolution, though, and without the display driver's software it's kind of hard to come by in Windows.

[editline]12:13AM[/editline]

[QUOTE=Handy_man;24039525]Not what I'm asking. I'm just wondering if it would be supported, and if there's any real reason to ditch the TV and get a smaller monitor, if one fits into my budget.[/QUOTE] no, there's no real reason to ditch it. of course, your computer could perform better, but that's it. if you really want 1080p, you'll either have to buy a better TV or a new monitor.
OP, tell us the model of the TV and the graphics card. It is possible that the card doesn't support higher resolutions, however it would have to be VERY old. It's more likely that the tv doesn't support 1920x1080 resolution.
His GPU will support up to 2560x1600; all modern ones do for a single monitor. [url]http://www.amd.com/us/products/desktop/graphics/ati-radeon-hd-5000/hd-5850/Pages/ati-radeon-hd-5850-specifications.aspx[/url] It's listed underneath ATI Avivo HD Video & Display technology. Although with HDMI you'll only get 1920x1080, which is what HDMI runs at.
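For anyone curious why 1080p is often quoted as the practical HDMI ceiling while later spec revisions go higher, here's a rough back-of-envelope sketch in Python. It counts only active pixels and ignores blanking intervals, so the real TMDS clock requirements are somewhat higher than these numbers; the clock limits are the commonly cited ones for single-link DVI/early HDMI and HDMI 1.3.

```python
# Back-of-envelope pixel-rate check (assumption: blanking intervals are
# ignored, so real link clocks need some extra headroom beyond these).
def pixel_rate(width, height, refresh_hz):
    """Active pixels per second for a given video mode."""
    return width * height * refresh_hz

# Commonly cited pixel-clock ceilings (Hz):
SINGLE_LINK_MAX = 165_000_000   # single-link DVI / early HDMI
HDMI_1_3_MAX = 340_000_000      # raised ceiling in HDMI 1.3

assert pixel_rate(1920, 1080, 60) < SINGLE_LINK_MAX   # 1080p60 fits
assert pixel_rate(2560, 1600, 60) > SINGLE_LINK_MAX   # needs a newer link
assert pixel_rate(4096, 2160, 24) < HDMI_1_3_MAX      # 4K at film rates fits
```

So 1080p60 sits comfortably under the original limit, while 2560x1600 is why that mode was historically the domain of dual-link DVI or DisplayPort.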
My HD TV (Panny plasma) upscales; not very well, but it does. Before I got a Blu-ray player I had an old DVD player, and we're talking not even progressive scan, with composite output (forget about upscaling). My TV handled it just fine: it took that 480i output and turned it into a viewable 1080p signal. I figured most HDTVs at this point have basic upscaling built in.
[QUOTE=Legend286;24059102]with HDMI you'll only get 1920/1080, which is what HDMI runs at.[/QUOTE] depends; you [I]can[/I] get HDMI to pull off 4096x2160.

[editline]04:06PM[/editline]

[QUOTE=cecilbdemodded;24063540]My HD tv(Panny plasma) upscales, not very well but it does. Before I got a blu ray player I had an old dvd player. We are talking not even a progressive scan with composite output(forget about upscaling). My tv handled it just fine. It took that 480i output and turned it into a viewable 1080p signal. I figured most HDs at this point have basic upscaling built in.[/QUOTE] no, it didn't. it probably applied optional contrast "enhancement", edge enhancement, and noise reduction, but it does not magically produce 1080p out of 480i; you can't make an orange out of an apple. technically speaking, no DVD or Blu-ray player does, either: they all tend to upscale to some odd intermediate resolution between 720p and 1080p, then output it as either a 1080i or a 1080p signal. none of them literally upscales to 1080p, as dividing pixels at a non-integer factor isn't practical (480 lines of height to 1080 lines of height is 2.25 times the pixels). they usually go for something like 1707x960, which is approximately twice the size, then just stretch that out to 1080 for output; that last step is plain scaling, not upscaling, so the upscaling stops at the intermediate resolution. well, not quite in width, as a widescreen DVD is 720x480 stretched to 854x480, but width isn't as obvious to the eye as height. quite a few console games aren't literally 16:9 either (much less rendered at 1080p); they're actually more like 3:2 or 4:3 stretched to 16:9. that doesn't happen too often anymore, and it's not easy to notice anyway.
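The scaling arithmetic above can be sketched out in a few lines of Python. The 854 width quoted for widescreen DVD is itself a rounding: the 720x480 anamorphic frame displayed at 16:9 is about 853.3 pixels wide, which is why an exact 2x step lands on the ~1707x960 figure mentioned above.

```python
# Sketch of the SD-to-HD scaling factors discussed above.
# A widescreen DVD stores 720x480 anamorphic; at a 16:9 display aspect
# the effective width is 480 * 16/9 ~= 853.3 (usually quoted as 854x480).
disp_w = 480 * 16 / 9   # ~853.33
disp_h = 480

# 480 -> 1080 lines is an awkward non-integer 2.25x factor
assert 1080 / disp_h == 2.25

# an integer 2x intermediate step lands on the ~1707x960 figure above
intermediate = (round(disp_w * 2), disp_h * 2)
assert intermediate == (1707, 960)
```

The non-integer 2.25x factor is the crux: doubling pixels is cheap and clean, while scaling by 2.25 forces interpolation, so scalers stop "real" upscaling at the 2x mark and stretch the rest of the way.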
I forget the exact model of the current card, but it is seriously old (from an old pre-built Dell Dimension system); it's something like a Radeon X900 or similar. I'll re-post when I get home from work. From what I can gather, though, people think it should work, although I might not like the outcome if the TV has to be forced into scaling because its native resolution isn't 1080p.
It's a Radeon X600 card; that thing is terribly old. Well, I guess there won't be a definitive answer. Anyway, I didn't really expect one; I just wondered if anyone could think of any limitation the system would have (so long as both ends support the resolution), and none were presented.