Pixel Question!!!


aviynw

Thread Starter
Joined
Jul 25, 2005
Messages
12
First let me start by saying how I came upon this pixel question. I am trying to set up a home theatre system on a very small budget. For my speaker system I've already decided to get an Onkyo 7.1 surround sound HT-S780 HTIB, refurbished, for $300 when using a 10% discount promo code. To complete my setup I also wanted to get an HDTV.

When trying to decide how much I was willing to spend, a brilliant idea popped into my head. I already have an old Pentium 4 computer with S-video input and a very nice 21" CRT monitor. From my understanding, the only difference between an HDTV and a regular TV is that an HDTV has a much higher resolution, around 2,000,000 pixels, roughly six times that of a normal TV. The max resolution on my monitor is 1920x1440, for a total of 2,764,800 pixels. I'm not an expert in math, but it seemed logical that if I had an image on my computer monitor at its highest possible resolution, it would look better than if it were displayed on an HDTV. I've watched TV on my monitor before, with the signal going through my computer, and it did in fact look significantly better than on my TV.

Still, a few things made me doubt my logic that the picture would look better on my monitor than on an HDTV. Firstly, I thought shows had to be specifically broadcast for HDTV to be viewed at that high resolution. If it worked the way I thought my computer worked, why couldn't an HDTV also convert the signal from low resolution to high resolution? And why can a digital camera with 4 megapixels only print a 4-megapixel picture if my computer can convert a TV signal to such a high resolution? Also, when viewing media at full screen with my monitor set at 800x600 and at 1920x1440, I couldn't notice a difference in quality. That's when I came to the realization that when I increase the resolution I'm just spreading one color across more pixels, not giving the image any more detail. Is that correct? If so, was the only reason for the difference in quality I saw that I was using an S-video connection instead of a composite connection? Could it also be that my higher quality monitor has more displayable colors?
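To put rough numbers on the pixel-count comparison, here is a minimal sketch (the broadcast-standard figures below are the usual nominal values, not anything measured from this setup):

```python
# Rough pixel-count comparison for the displays discussed above. The
# broadcast-standard figures are the usual nominal values; the monitor
# figure comes from the 1920x1440 resolution mentioned in the post.

displays = {
    "Standard-definition TV (~640x480)": 640 * 480,
    "720p HDTV (1280x720)": 1280 * 720,
    "1080i/1080p HDTV (1920x1080)": 1920 * 1080,
    "21-inch CRT monitor (1920x1440)": 1920 * 1440,
}

for name, pixels in displays.items():
    print(f"{name}: {pixels:,} pixels")

# The catch raised above: raising the desktop resolution only changes how
# many screen pixels each source pixel is spread across. A standard-def
# broadcast still carries only 640*480 = 307,200 pixels of real detail no
# matter how large the screen mode it is stretched into.
source = 640 * 480
screen = 1920 * 1440
print(f"Each source pixel covers about {screen / source:.1f} screen pixels when upscaled.")
```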

Either way, I was very satisfied with the conclusion I came up with, and everything seemed to make sense until another thought popped into my mind. If I were watching a video made for 320x240 resolution and my monitor were displaying 640x480, the video would take up half the screen. If I then enlarged that video to full screen, the quality would be noticeably worse. However, if I changed my monitor's resolution to 1920x1440 and still watched that same video in full screen, it would not look any worse than at full screen in 640x480. How come the drop in quality from watching the video at its native resolution (320x240) to 640x480 is noticeable, but there is no difference in quality from 640x480 to 1920x1440? The only thing I can think of is that when you hit full screen, the first 200% increase stretches the pixels differently than the second increase from 640x480 to 1920x1440. But if the image loses quality, why would you want to stretch the pixels using the first method in the first place? Also, the reason I used an example that first stretched the video by 200% is that media players always have an option for a 200% increase or for full screen. I've never seen a media player that lets you stretch a video by 300%. Is that because this particular kind of stretching can't go past 200%? Though I still can't figure out why one would want to. I bet if I gave it some more thought and observation I could figure this out, but I thought I would pose the question to the experts (I hope experts are reading this).
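Here is a minimal sketch of the two stretches, assuming plain nearest-neighbour scaling (real media players usually apply smoother bilinear or bicubic filtering, but the point about detail is the same):

```python
# A minimal sketch, assuming plain nearest-neighbour stretching. Real media
# players usually blend pixels when scaling, but the amount of real detail
# on screen behaves the same way.

def nearest_neighbour_scale(image, new_w, new_h):
    """Enlarge a 2D list of pixel values by repeating source pixels."""
    old_h, old_w = len(image), len(image[0])
    return [
        [image[y * old_h // new_h][x * old_w // new_w] for x in range(new_w)]
        for y in range(new_h)
    ]

# A tiny 4x3 "frame" standing in for the 320x240 video in the post.
frame = [
    [1, 2, 3, 4],
    [5, 6, 7, 8],
    [9, 10, 11, 12],
]

# First stretch (the "200%" step): each source pixel becomes a 2x2 block.
doubled = nearest_neighbour_scale(frame, 8, 6)

# Second, bigger stretch (the jump to full screen at 1920x1440): each
# source pixel now becomes a 6x6 block instead.
huge = nearest_neighbour_scale(frame, 24, 18)

# Either way the screen only ever shows the 12 distinct values of the
# original frame. Enlarging further spreads the same information over more
# screen pixels; it never adds detail, which is why the second jump does
# not look any worse than the first.
print(len({v for row in doubled for v in row}))  # 12
print(len({v for row in huge for v in row}))     # 12
```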

The next thing I was wondering is: if I got an HD receiver that was receiving a broadcast meant for HDTV, would I still be able to use my monitor for viewing, and would it look better or worse than if I used an HDTV, and if so, by how much? I just don't see the need to spend extra cash on an HDTV if I already own parts at home that are as good or close to as good. Though I probably could've made enough money to buy an HDTV in the time that I've spent pondering these issues, ;).

Thanks, and I greatly appreciate any help. :D
 

aviynw

Thread Starter
Joined
Jul 25, 2005
Messages
12
It actually looks like I'm not going to put in the extra cost of an HDTV, even if it would produce a much better picture than my monitor, because there simply aren't enough HDTV channels to justify the cost with my budget (the cost of the HDTV plus the additional channels).

But if I just stick to my monitor, I have another question. Does it matter if I buy a cheap device that has an S-video input and VGA output with a max resolution of 640x480, or one with a max resolution of 2048x1536, since the TV broadcasts are only going to be at 512x400 anyway, right? Then I wouldn't need to use my computer at all and could just buy a $50 device, right? Are there any other advantages to using my computer, such as color quality etc.? I have an ATI All-in-Wonder 8500 graphics card. It's an old card, so would there be any advantage in picture quality to buying a newer card?
 
slipe

Joined
Jun 27, 2000
Messages
6,832
aviynw said:
If I were watching a video made for 320x240 resolution and my monitor were displaying 640x480, the video would take up half the screen. If I then enlarged that video to full screen, the quality would be noticeably worse. However, if I changed my monitor's resolution to 1920x1440 and still watched that same video in full screen, it would not look any worse than at full screen in 640x480. How come the drop in quality from watching the video at its native resolution (320x240) to 640x480 is noticeable, but there is no difference in quality from 640x480 to 1920x1440?
Say your screen is 16 inches wide. When you display the 320-pixel-wide image on the screen set at 640 x 480 you are showing the image 8 inches wide. That is 40 pixels per inch, or probably less than half of what your screen can display – 72 PPI is a myth unless you happen to own a very old Mac. Anything you do to display the 320-pixel-wide image at the full 16-inch width is going to result in 20 PPI. Regardless of how and how much you interpolate the image, you have only 20 PPI of information spread across your screen. That is going to look pretty crappy, and the amount you have interpolated to isn't going to make much difference.
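A quick sanity check on those pixels-per-inch figures, as a minimal sketch (the 16-inch width is only the illustrative figure above, roughly what a 21-inch 4:3 CRT measures across):

```python
# Pixels-per-inch sanity check for the example above. The 16-inch screen
# width is the illustrative figure from the post (roughly right for a
# 21-inch 4:3 CRT), not a measured value.

def ppi(pixels_wide, inches_wide):
    """Effective pixels per inch when an image of a given pixel width
    is spread across a given physical width."""
    return pixels_wide / inches_wide

screen_width_in = 16.0

# 320-pixel-wide video shown at half the screen width (the 640x480 desktop case).
print(f"320 px over {screen_width_in / 2:.0f} in: {ppi(320, screen_width_in / 2):.0f} PPI")

# The same 320 pixels stretched across the full screen width.
print(f"320 px over {screen_width_in:.0f} in: {ppi(320, screen_width_in):.0f} PPI")

# Interpolating up to a higher desktop resolution adds screen pixels but no
# new image information, so the effective detail stays about 20 PPI either way.
```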

aviynw said:
The next thing I was wondering is: if I got an HD receiver that was receiving a broadcast meant for HDTV, would I still be able to use my monitor for viewing, and would it look better or worse than if I used an HDTV, and if so, by how much? I just don't see the need to spend extra cash on an HDTV if I already own parts at home that are as good or close to as good. Though I probably could've made enough money to buy an HDTV in the time that I've spent pondering these issues.
If you bought a pricey HDTV card for your computer and paid for HDTV cable service, you would get a pretty nice picture on your monitor. Digital is a step up from analog, and HD is a big step up from standard digital. And if you had one of the new widescreen monitors and a Dolby setup, the cost of service and the adapter might be justified if you are a hermit and don't share your TV.

Most HDTV sets don't come with a tuner because most people get cable or satellite. But there is a free over-the-air signal from network TV you can get with a tuner and antenna. I think HDTV cards have a tuner. Most prime-time network shows are in HDTV these days, as well as major sports broadcasts.

I’m a football fan and bought a 60 inch HDTV before the 2004 season. It is the best purchase I ever made other than my lawn tractor and two wedding rings. But the ongoing costs are substantial and will exceed the cost of the unit over its lifetime.

I would think your All-in-Wonder 8500 is going to give you as good a picture as is possible without pricey cable upgrades or an HDTV card and a good antenna.
 

aviynw

Thread Starter
Joined
Jul 25, 2005
Messages
12
slipe said:
Anything you do to display the 320-pixel-wide image at the full 16-inch width is going to result in 20 PPI. Regardless of how and how much you interpolate the image, you have only 20 PPI of information spread across your screen.
Thanks for the helpful response, but I am still a little confused. If my screen's resolution doesn't change, the PPI won't change either. Let's use easy numbers for convenience's sake. Say you have a 10-inch by 10-inch monitor with 10 PPI across and down. If I have a 50x50 pixel image it will take up half the screen, 5 inches across and 5 inches down. In that 5-inch by 5-inch space there are still 10 pixels per inch. If you enlarge that image to the full 10 inches by 10 inches it will also still be 10 PPI; the image doesn't skip every other pixel, so how come you say it would be half the PPI?

I also still have this question...
aviynw said:
But if I just stick to my monitor and regular TV, I have another question. Does it matter if I buy a cheap device that has an S-video input and VGA output with a max resolution of 640x480, or one with a max resolution of 2048x1536, since the TV broadcasts are only going to be at 512x400 anyway, right? Then I wouldn't need to use my computer at all and could just buy a $50 device, right? Are there any other advantages to using my computer, such as color quality etc.?
 
Joined
Aug 17, 2003
Messages
17,584
Enlarging the image simply makes the pixels larger, giving fewer effective pixels per inch.

It will use several screen pixels for each image pixel; the actual information is just duplicated across those pixels, giving a lower effective PPI.

Sorry, but I just cannot follow your reasoning on this.
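To put numbers on the duplication, here is a minimal sketch using the 10-inch, 10 PPI monitor from the example a couple of posts up (nearest-neighbour duplication assumed; real scalers often blend neighbouring pixels instead of copying them):

```python
# A tiny numeric sketch of the "effective PPI" point, using the 10-inch,
# 10 PPI monitor from the example above (i.e. a 100x100-pixel screen).
# Nearest-neighbour duplication is assumed; real scalers may blend instead.

screen_ppi = 10   # physical screen pixels per inch
screen_px = 100   # 10 inches * 10 PPI
image_px = 50     # 50x50 source image

# Shown 1:1, the image covers 5 inches and every screen pixel carries a
# distinct image pixel: effective detail = 50 px / 5 in = 10 PPI.
unscaled_effective_ppi = image_px / (image_px / screen_ppi)

# Enlarged to fill all 10 inches, each image pixel is duplicated into a
# 2x2 block of screen pixels. The screen still has 10 physical pixels per
# inch, but only 50 distinct image pixels span the full 10 inches:
# effective detail = 50 px / 10 in = 5 PPI.
fullscreen_effective_ppi = image_px / (screen_px / screen_ppi)

print(f"1:1 display:  {unscaled_effective_ppi:.0f} PPI of image detail")
print(f"Full screen:  {fullscreen_effective_ppi:.0f} PPI of image detail")
```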
 

aviynw

Thread Starter
Joined
Jul 25, 2005
Messages
12
Okay, that makes sense and I get it now.

Thanks!

But if anyone is able to answer my other question, that would be much appreciated.
 