
Video file types and resolutions

Discussion in 'Other video questions' started by pkrillo, Feb 27, 2012.

  1. pkrillo
    This is actually a two-part question relating to the same topic ... system resources (CPU usage) during playback of video files.

    The first concern relates to different video file types.

    Certain file types offer better data compression than others (from personal experience, it seems the old MPEG files were the least compressed, while the F4V Flash video format is highly compressed).

    Say you had TWO video files of the exact same video (same length, same framerate, same resolution, etc.) ... but one was encoded as an MPEG file and the other as an MP4 file (which offers much greater data compression).

    So the MP4 file takes up much less space than the MPEG file, despite the fact that they are essentially the same video.

    When playing them in a media player on your computer ... which will require more CPU and which will use less? Or would they be essentially the same, considering the framerate and resolution are identical? Would the larger file use more system resources to play back the video or not?
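    One way to actually put numbers on this, if anyone wants to try: the rough sketch below (Python driving ffmpeg, which I'm assuming is installed and on the PATH; the two file names are just stand-ins for your own pair of files) decodes each file as fast as possible with the output thrown away and times it. Whichever takes longer is the one costing more CPU.

    ```python
    import subprocess
    import time

    def decode_time(path):
        # "-f null -" decodes every frame but writes nothing to disk,
        # so the timing is dominated by the decoding work itself.
        start = time.perf_counter()
        subprocess.run(
            ["ffmpeg", "-v", "error", "-i", path, "-f", "null", "-"],
            check=True,
        )
        return time.perf_counter() - start

    # Placeholder names: the same video encoded two different ways.
    for name in ["same_video.mpg", "same_video.mp4"]:
        print(f"{name}: {decode_time(name):.1f} s to decode")
    ```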

    The second concern relates to watching videos at their native size (1:1 pixel mapping) versus stretching them out to fullscreen.

    Say you have TWO video files of the same exact video again (this time, same length, same framerate, and same file type). However, the difference this time is in the RESOLUTION. One video is rendered in 360p ... while the other is in HD (720p). The HD video file is exactly the same as the 360p video ... except when it comes to resolution.

    Assuming that your computer monitor has a native resolution of 720 vertical pixels ... if you played back BOTH files in a media player, which would use up more CPU (system resources): the 360p video in FULLSCREEN (meaning it is upscaled from 360 to 720), or the 720p video at its native resolution (which would fill the whole screen automatically anyway at 1:1 pixel mapping)?

    In other words, which uses more CPU ... having to upscale the 360p video so that it is now displaying in 720p ... or simply playing back the 720p video (automatically fullscreen), which is larger in size and contains more data than the 360p video to begin with?

    Seems to me that if watching the 360p video in fullscreen uses the SAME amount of CPU as watching the 720p video normally ... then you're basically getting ripped off by watching a lower-resolution version of the same video, upscaled (and poorly), are you not? The 360p video would look poor (needless to say) in fullscreen, while the 720p would look perfect at its native resolution ... and you'd still be straining your computer the SAME amount in either case ... total gyp. Or would playing the 360p video in fullscreen actually use MORE CPU ... because now the computer has to figure out how to fill in the missing pixels (and again, quite poorly)?

    So what's the benefit (if any) of watching a lower-resolution version of the video? Simply that it conserves HDD space?

    In addition, in the above example ... when you stretch the 360p video out to fullscreen ... which device is actually doing the upscaling? Is it the COMPUTER or the MONITOR doing the work? When you watch standard-definition video on an HDTV, 90% of the time the TV is doing the upscaling ... on a computer, however, it's the computer and not the monitor that actually does the upscaling, correct?
     
  2. hello_hello
    As a general rule, the more tricks used to compress a video, the more CPU power required to decode it, so MPEG-2 video would probably take less CPU power than MPEG-4. When it comes to video encoded with something like Xvid, it's not overly taxing on a CPU anyway. H.264 can be, but most modern video cards can decode the video rather than having to use the CPU. It just requires the right card and the right player. For example, Media Player Classic Home Cinema will use the video card to decode H.264 video if the card supports it. If it's using the video card to decode, DXVA will be displayed in the status bar when playback starts, and CPU usage will be very low.
    I don't know anything about ATI cards, but I'm pretty sure any Nvidia card from the 8000 series onwards will support DXVA (hardware decoding).
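    If you want to check from the command line whether hardware decoding actually kicks in for a given file, here's a rough sketch (Python calling ffmpeg; it assumes a Windows build of ffmpeg on the PATH, since the dxva2 hwaccel is Windows-only, and "clip.mp4" is a placeholder for your own file). ffmpeg warns and falls back to software decoding rather than failing if the card can't handle the stream, so you read its log to see which path was used:

    ```python
    import subprocess

    # Try the DXVA hardware path and show ffmpeg's log. If the card or
    # driver can't decode the stream, ffmpeg prints a warning and falls
    # back to software decoding instead of erroring out.
    result = subprocess.run(
        ["ffmpeg", "-hwaccel", "dxva2", "-i", "clip.mp4", "-f", "null", "-"],
        capture_output=True,
        text=True,
    )
    print(result.stderr)  # look for dxva2/hwaccel warnings or errors here
    ```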

    I can't say I've ever directly compared them, but I'd assume decoding an HD video would require more CPU power than decoding and upscaling an SD video, on the assumption that the quality is the same and the same encoder and encoder restrictions were used each time, so the HD video will have been encoded at a much higher bitrate. Even if the CPU is doing the decoding, though, the video card might be doing the upscaling. I think it depends on the video player and the video card.
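    For anyone who wants to test that assumption, a rough sketch along the same lines (Python calling ffmpeg, which is assumed to be installed; the file names are placeholders): time decoding the SD file plus a software upscale to 720p against decoding the HD file on its own.

    ```python
    import subprocess
    import time

    def timed(extra_args):
        # Decode (and optionally scale) with the output discarded.
        start = time.perf_counter()
        subprocess.run(
            ["ffmpeg", "-v", "error"] + extra_args + ["-f", "null", "-"],
            check=True,
        )
        return time.perf_counter() - start

    # Decode the 360p file and upscale it to 1280x720 in software.
    sd = timed(["-i", "clip_360p.mp4", "-vf", "scale=1280:720"])
    # Decode the 720p file with no scaling.
    hd = timed(["-i", "clip_720p.mp4"])
    print(f"360p decode + upscale: {sd:.1f} s")
    print(f"720p decode only:      {hd:.1f} s")
    ```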

    HDD space, or just having something your player can actually play.

    If you're using a DVD player or Blu-ray player capable of upscaling (and I'd assume most Blu-ray players can), then it's probably the player doing the upscaling, depending on how you've configured its output options.
    In the case of a PC, the PC does the upscaling, not the monitor. If a PC is connected to a TV at the TV's native resolution (720p or 1080p), then the PC is doing the upscaling. If it's connected at a lower resolution, then the TV is upscaling.
    There are also different ways to upscale video. I've no idea how DVD players, Blu-ray players, or TVs upscale, but a decent PC video player will let you choose between upscaling methods (I prefer a softer upscale to a sharper one, as softer upscaling doesn't enhance the compression artifacts as much). I'm fairly sure not all hardware players upscale the same way. I don't think the Samsung Blu-ray player in this house upscales as well as the Sony player, and I think the PC does a slightly better job than the Sony player.
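    If you want to see the soft-versus-sharp difference for yourself, ffmpeg's scale filter lets you pick the algorithm. A small sketch (Python again; "clip_360p.mp4" is a placeholder, and the outputs are re-encoded with ffmpeg's defaults, which is fine for an eyeball comparison): bilinear gives the softer result, lanczos the sharper one.

    ```python
    import subprocess

    # Upscale the same 360p clip with a soft (bilinear) and a sharp
    # (lanczos) scaler so the two results can be compared side by side.
    for algo in ["bilinear", "lanczos"]:
        subprocess.run(
            ["ffmpeg", "-v", "error", "-y", "-i", "clip_360p.mp4",
             "-vf", f"scale=1280:720:flags={algo}",
             f"upscaled_{algo}.mp4"],
            check=True,
        )
    ```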
     
