• My 32" Philips 720p TV will play 1080i?
Am I missing something? This TV is advertised and spec'd at 720p, yet when I set my Blu-ray player to 1080i the picture still shows up on the screen. When it's set to 1080p, as expected, nothing shows up. Anyone know why this would happen?
Because it isn't built to play 1080? Plug it into your computer though and you'll be able to watch Blu-ray movies.
[QUOTE=Cheezy;18562689]Because it isn't built to play 1080? Plug it into your computer though and you'll be able to watch Blu-ray movies.[/QUOTE] I don't think you're understanding what I'm saying. It DOES play 1080i, but it's not supposed to. Nvm, I found out why: some 720p TVs can accept a 1080i signal, they just downscale it. Seems kind of pointless to me.
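Roughly what I think the TV's scaler is doing, sketched with Python/numpy (this is just my own rough illustration, not anything Philips documents, and real scalers interpolate far better than this):
[code]
# Sketch: a 720p set fed 1080i weaves the two fields back into a
# 1080-line frame, then scales that down to its 720-line panel.
import numpy as np

def weave(top_field, bottom_field):
    """Interleave two 540-line fields into one 1080-line frame."""
    frame = np.empty((1080, 1920, 3), dtype=top_field.dtype)
    frame[0::2] = top_field      # odd scan lines
    frame[1::2] = bottom_field   # even scan lines
    return frame

def downscale_to_720p(frame):
    """Crude nearest-neighbour resize from 1920x1080 to 1280x720."""
    rows = np.linspace(0, 1079, 720).astype(int)
    cols = np.linspace(0, 1919, 1280).astype(int)
    return frame[rows][:, cols]

top = np.zeros((540, 1920, 3), dtype=np.uint8)
bottom = np.zeros((540, 1920, 3), dtype=np.uint8)
print(downscale_to_720p(weave(top, bottom)).shape)   # (720, 1280, 3)
[/code]
So the set isn't really "playing" 1080i; it accepts it, rebuilds a frame, and throws away the extra detail to fit the panel.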
Remember that interlaced logically means double the framerate at half the resolution.
[QUOTE=BmB;18562778]Remember that interlaced logically means double the framerate at half the resolution.[/QUOTE] Well, kind of, not really. The signal comes in at 60 fields per second, but each field contains half the picture information and the field right after it contains the other half. The image is split up by scan line: every second line is dropped, so one field carries the odd lines and the next field carries the even lines. Basically, it's 60 fields per second, but every field is half of the image and the next field is the other half. When it's all put back together on the screen you can't see it happening because it's so fast, so the effective rate you're seeing is 60 interlaced fields = 30 fps. The reason most HD broadcasters use 1080i rather than 1080p, which is better, is that 1080i uses half as much bandwidth as 1080p does, so it saves them bandwidth and money. 1080i and 1080p are the exact same resolution; they're just drawn on the screen differently.
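Here's a quick Python/numpy sketch of that line split, plus the raw pixel-rate math (my own illustration, ignoring compression entirely):
[code]
# One progressive frame splits into two interlaced fields:
# the odd scan lines go into one field, the even lines into the other.
import numpy as np

frame = np.arange(1080 * 1920).reshape(1080, 1920)   # stand-in for a 1080p frame
top_field    = frame[0::2]   # lines 1, 3, 5, ... -> 540 lines
bottom_field = frame[1::2]   # lines 2, 4, 6, ... -> 540 lines
print(top_field.shape, bottom_field.shape)           # (540, 1920) (540, 1920)

# Raw pixel-rate comparison (uncompressed):
pixels_1080p60 = 1920 * 1080 * 60    # sixty full frames per second
pixels_1080i60 = 1920 * 540  * 60    # sixty half-height fields per second
print(pixels_1080i60 / pixels_1080p60)   # 0.5 -- half the raw pixel rate
[/code]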
Each field is still half the resolution of the frame. And interlaced content can have each field represent its own point in time. It doesn't have to be 60 fields = 30 fps; it can also mean 60 fields = 60 fps effective.
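A minimal illustration of that idea, often called "bob" deinterlacing; the line-doubling here is just my stand-in for the interpolation a real deinterlacer would do:
[code]
# If each field is its own moment in time, a deinterlacer can "bob":
# stretch every 540-line field back to full height on its own,
# giving 60 distinct pictures per second instead of weaving pairs into 30.
import numpy as np

def bob(field):
    """Line-double one 540-line field into a full-height frame."""
    return np.repeat(field, 2, axis=0)   # (540, 1920) -> (1080, 1920)

field = np.zeros((540, 1920), dtype=np.uint8)
print(bob(field).shape)   # (1080, 1920) -- one full picture per field
[/code]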
Stick with 720p if you can set it. 1080i is inferior for anything involving movement (it's a TV, not a projector).
720p is better for motion (like sports and action movies). 1080i gets you extra resolution, which is only valuable if you have a TV large enough to show the difference. (Hint: 32" is not large enough.)
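For what it's worth, a back-of-the-envelope calc using the common ~1 arc-minute rule of thumb for visual acuity (the numbers and the acuity assumption are mine, nothing official):
[code]
# Past what distance can the eye no longer resolve individual scan
# lines on a 32" 16:9 screen? Assumes ~1 arc-minute acuity (rule of
# thumb, not gospel).
import math

diagonal_in = 32
height_in = diagonal_in * 9 / math.hypot(16, 9)   # ~15.7 inches tall
one_arcmin = math.radians(1 / 60)                 # ~0.00029 rad

for lines in (720, 1080):
    pitch = height_in / lines                     # line spacing, inches
    limit = pitch / math.tan(one_arcmin)          # distance where lines blur together
    print(f"{lines} lines: detail lost beyond ~{limit / 12:.1f} ft")
# Roughly: 1080-line detail disappears past ~4 ft, 720-line past ~6 ft,
# so from across a room the two look much the same on a 32" panel.
[/code]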
[QUOTE=cecilbdemodded;18576481]720p is better for motion (like sports and action movies). 1080i gets you extra resolution, which is only valuable if you have a TV large enough to show the difference. (Hint: 32" is not large enough.)[/QUOTE] That doesn't make any sense. You can see a difference in resolution with any size television.
I attempted to use my 720p 32" TV as a monitor once. Needless to say, it didn't work very well. First off, the fonts were rubbish even at 720p. When I tried 1080i, the refresh rate hurt my eyes very badly. If you do require a higher resolution, go for it, but I just feel safer with the native one for some reason. >.<