
TV Error - codec error video size wrong

#1 · (Edited)
I use AVC, a free/paid video converter. Mine is the free version.
I don't know what size to make my videos so they play on my 42.5" Smart TV. With my old TV I made them 720x404 and they worked. That size also works on the 42.5", but when I switched to 720x480 it stopped working.
I read that my TV will play 1920x1080, so I am trying that now. Is that the right size?

AND (the question that came up when I was looking at the AVC Settings)

How many kernels should I give AVC to use when converting videos? I have an AMD 8350 CPU.
On my 6100 CPU I gave it 2 kernels; how many should I give it with the 8350? (Right now I have upped it to 3 kernels.)
Does it matter? Does each video file being converted need its own kernel?
I read a Wikipedia article on kernels because I did not even know what one was (as far as that goes, I still don't know much), and the article did not help much. I gather a kernel organizes things, deciding which device or program gets CPU resources.

Anyway

How many kernels should I assign?
(If I assign 3 kernels, should I be converting 3 files at once, or will it help even when converting just one video file?)

What is the right video size?

and Thank You
...............
Also, should I use OpenCL or CUDA? And why does it take so much longer at 1920x1080 than it does at 720x404? It is a big difference in time.

Tech Support Guy System Info Utility version 1.0.0.2
OS Version: Microsoft Windows 10 Home, 64 bit
Processor: AMD FX(tm)-8350 Eight-Core Processor, AMD64 Family 21 Model 2 Stepping 0
Processor Count: 8
RAM: 16330 Mb
Graphics Card: NVIDIA GeForce GTX 1060 6GB, -1 Mb
Hard Drives: C: Total - 487526 MB, Free - 53331 MB; D: Total - 1430346 MB, Free - 419452 MB; F: Total - 1907726 MB, Free - 168059 MB;
Motherboard: ASUSTeK COMPUTER INC., M5A97
Antivirus: Windows Defender, Disabled
 
#2 ·
You don't select the resolution (i.e. video size) based on your TV. Any current TV should accept the standard HD resolutions (480P / 720P / 1080P). For best quality, keep the output resolution the same as the source and adjust the bitrate as needed for quality or to reduce the output file size. If you are trying to make the files as small as possible, then you could try reducing the resolution (depending on the source material).
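AVC isn't scriptable as far as I know, but just to illustrate the idea, here is a rough sketch of a same-resolution re-encode with an adjustable bitrate, done in Python driving ffmpeg (the file names and the 2000k figure are only placeholders, and it assumes ffmpeg is installed):

Code:
# Sketch: re-encode at the source resolution, adjusting only the bitrate.
# Assumes ffmpeg is on the PATH; file names and bitrate are examples.
import subprocess

src = "movie_source.mp4"   # hypothetical input
dst = "movie_smaller.mp4"  # hypothetical output

subprocess.run([
    "ffmpeg", "-i", src,
    "-c:v", "libx264",     # H.264 video; resolution is left untouched
    "-b:v", "2000k",       # lower for a smaller file, raise for quality
    "-c:a", "copy",        # pass the audio through unchanged
    dst,
], check=True)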

I'm not sure about the kernel question, as the tools I've used typically refer to the number of CPU cores. I usually use VidCoder, which is a GUI for HandBrake, and by default it will use all of the available cores and run them at 100%. You are better off running a single file at a time at 100% rather than multiple files at once; aside from CPU core usage, running multiple conversions at once also hits the disk much harder, which slows everything down.
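If you want to see how many cores Windows reports, and what converting a queue one file at a time looks like, here is a minimal sketch (again Python driving ffmpeg; the file list is made up):

Code:
# Sketch: report the core count, then convert a queue of files one at a time,
# letting the encoder spread each encode across every core by itself.
import os
import subprocess

print("CPU cores reported:", os.cpu_count())  # should print 8 on an FX-8350

queue = ["a.mp4", "b.mp4", "c.mp4"]  # hypothetical file list
for src in queue:
    dst = src.replace(".mp4", "_converted.mp4")
    # x264 is multithreaded by default, so one encode at a time already
    # uses all cores, and the disk only serves one read/write pair at once.
    subprocess.run(["ffmpeg", "-i", src, "-c:v", "libx264", dst], check=True)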
 
#3 · (Edited)
OK, so I tried it at several sizes. With 720x404 it works; with any of the other settings I have tried, it does not.
On my first attempt I asked AVC to keep the same video size and to convert to a better definition, so it is a larger file size.

Let me try it by using the same video size with normal definition and see what happens.

My TV screen is 42.5" and it has a 2160 resolution, although it sounds like that does not matter.

EDIT:
OK, so I did that and it did not work. Nevertheless, I still think you are right, and here might be why. I forgot to tell you this:
I am not plugging my flash drive into my TV. I have been plugging it into my DVD player, which has a USB port and will only play Xvid-encoded files.
Is it possible it is the DVD player that cannot deal with the resolution, and not the TV? All I see is an error message saying "wrong size video" (or whatever it says). Perhaps that is coming from the DVD player. It is an old Philips DVD player, and I may have bought it before 1080 came out.

Do you think that might be what is happening?
 
#4 ·
Just a little background information...excuse me if you know and understand this already.

Current LCD-type screens have a fixed resolution. Let's take your 2160P screen as an example. The native resolution of that panel is likely 3840 x 2160, and all video input to the TV will be displayed at 3840 x 2160. The best picture will result from a true 3840 x 2160 source. Any other resolution that is input will be automatically scaled by the TV to match the native resolution: if it's a higher resolution, it gets downscaled; if it's smaller, it gets upscaled. This is performed by the TV for ALL video.
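To put rough numbers on that scaling (plain arithmetic, nothing specific to your model):

Code:
# Sketch: how much area a 3840 x 2160 panel has to fill from each source size.
native = 3840 * 2160  # 8,294,400 panel pixels

for w, h in [(720, 404), (720, 480), (1920, 1080)]:
    factor = native / (w * h)
    print(f"{w}x{h}: each source pixel covers about {factor:.1f} panel pixels")

# Prints roughly 28.5, 24.0 and 4.0 -- the smaller the source,
# the harder the TV's scaler has to work.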

When processing video, you want to preserve quality as much as possible. And each time you encode video, you reduce the quality. It may be by a small amount that you don't notice, but it is still being reduced. It's just a drawback of working with video. So for best results, you want to process as little as possible. For best quality, you will want to keep the resolution the same or lower (depending on your wants/needs).

For example, let's use a 1080P video with a perfect picture. When that video was created, a video bitrate was selected to ensure enough data was provided to maintain that quality at 1920 x 1080. That amount of data is now fixed within the file and accounts for the file's size. Now, you could encode that video to a different format (i.e. codec) at the same resolution and, at best, maintain the quality. If you were concerned with file size, you could reduce the bitrate until you start to notice quality loss, or reduce the resolution and maintain a high bitrate. Or, as you mention, you could increase the resolution. The problem now is that the file only contains enough data for the 1080P resolution, so increasing the resolution to 2160P is going to reduce the quality (the data just isn't there).
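As a concrete illustration of "the data just isn't there" (the 8 Mbps and 24 fps figures below are made up for the example, not taken from your files):

Code:
# Sketch: a fixed amount of data spread over four times as many pixels.
bitrate = 8_000_000  # hypothetical 8 Mbps source
fps = 24             # hypothetical frame rate

for label, w, h in [("1080P", 1920, 1080), ("2160P", 3840, 2160)]:
    bits_per_pixel = bitrate / (w * h * fps)
    print(f"{label}: {bits_per_pixel:.3f} bits per pixel per frame")

# 2160P ends up with a quarter of the bits per pixel --
# upscaling cannot invent the missing detail.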

Now, having said all of that, you will need to spend a little time experimenting. What you are currently doing is using AVC to upscale the video, and even then you aren't creating files at the native resolution of your TV, so the TV is going to upscale the video again for display. What you need to determine is whether AVC or the TV is better at upscaling. In most cases, unless you are using a budget-line model, the TV will be at least as good as the software. And even if it isn't, it's already reprocessing the files you are creating. Your time is better spent ensuring the files you use are the highest quality possible, regardless of resolution.

There could be other factors at play here as I don't know your entire playback path. But I would concentrate on video quality over resolution.
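On the DVD player point: since your Philips unit only handles Xvid, the one experiment worth trying is an Xvid AVI at the 720x404 size that already works. Just as an illustration of the settings involved, here is how that could look with ffmpeg (assuming a build with libxvid; the bitrates are placeholders, and AVC will have its own Xvid preset):

Code:
# Sketch: a DVD-player-friendly Xvid AVI at the size that already plays.
# Assumes an ffmpeg build with libxvid; names and bitrates are examples.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "movie_source.mp4",
    "-vf", "scale=720:404",       # the one size the player has accepted
    "-c:v", "libxvid",            # Xvid video
    "-vtag", "XVID",              # FourCC many standalone players check for
    "-b:v", "1500k",              # hypothetical video bitrate
    "-c:a", "libmp3lame",         # MP3 audio, the usual pairing in AVI
    "-b:a", "128k",
    "movie_xvid.avi",
], check=True)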
 