It could be a bandwidth issue. With WiFi you will be more limited in the quality of the streaming signal, so if you used the LAN connection instead you may get better resolution. Of course, that also depends on your high-speed WAN connection, which can also be a limiting factor. I have a chart but don't have time to find it right now; this may help for now. It's from MS, so it isn't the best, but it will give you an idea at least: Video Resolution Bandwidth
Our viewing experience on Netflix can vary. It uses a WiFi dongle, and we get a 5Mb connection. I noticed last night that what they were watching was obviously cropped from a wider aspect ratio to 16:9. That would cause the video to look slightly blockier; essentially, you're zooming in on the video a bit. I heard Netflix was doing this not long ago: Cropping the video

I suggest trying to stream a video that has HD available (provided you're at least 5Mb) and seeing how that looks. I believe 7Mb is the recommended minimum, though, which is laughable, because I've streamed HD at 5Mb, it looks really good, and it isn't utilizing my full bandwidth - roughly 85%, but that likely depends on the compression scheme of the video I'm watching.

Resolution isn't necessarily the determining factor. The bitrate of the video frames/audio is what's most important. 500kB/s seems to be plenty for intelligently compressed video. Don't get me wrong though, I'd much prefer upwards of 12Mb! Unfortunately, I have to live with what I can get.
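[Editor's note: the unit juggling in the post above (kB/s vs Mbit/s) trips a lot of people up, so here is a quick sketch of the conversion. The 500 kB/s and 5 Mb figures are the ones quoted in the post; exact arithmetic puts the utilisation at 80%, close to the "roughly 85%" observed.]

```python
# Convert a stream rate quoted in kilobytes/second to megabits/second
# and compare it against a connection speed. Figures from the post above.

def kBps_to_Mbps(kilobytes_per_sec):
    """Convert kilobytes/second to megabits/second (decimal units)."""
    return kilobytes_per_sec * 8 / 1000

stream_rate = kBps_to_Mbps(500)       # 500 kB/s -> 4.0 Mbit/s
utilisation = stream_rate / 5 * 100   # share of a 5 Mbit/s connection

print(f"{stream_rate} Mbit/s, {utilisation:.0f}% of a 5 Mb line")
```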
Thanks guys, I appreciate the feedback. I have so many questions on this subject right now; I don't have the time to get into it, but I will. Anyway Steve, that bandwidth link - wow!! I'm still scratching my head trying to figure it out lol. I have to start somewhere, I guess, to get a grasp on this streaming, networking, etc.
Okay, so for us 30" monitor guys, having twice the detail on the same size screen is going to affect us... how? Just everything super sharp, right?

Fred - it should have looked totally 1080p perfect - that's how the Roku was doing it, although initially I found it had been set to the default 720, and I had to change that to 1080. Those Breaking Bad episodes looked totally awesome - not like VHS. As Kevin said, your bandwidth is a factor - I do have good bandwidth at that house, and down here we also have good 12Mb bandwidth. But like Kevin also says, 500kB/sec - which is only 4Mb/sec - should give you a great 1080p picture.

When we had Netflix down here, I was finding that HD content seemed to be streaming down at 350 to 450 kB/sec. We were streaming on the HTPC, the Phenom, with an HDMI connection to the big screen. This was a year ago or so. I remember there was one little trick I had to do to get it to stream properly, however. I seem to recall that it was trying to stream down at an unreasonably high kB/sec, and I had to manually set it lower to get a smooth stream - it would fall behind, and the video would stop, then start up again. When I took manual control I solved the annoying problem. But with the Roku the other day, there were no issues like that at all.

Rich
The problem is, even if you have 7Mbps, it may not be 7Mbps all the time - they have to allow for variances in speed that ISPs will consider 'normal behaviour' and still deliver an acceptable experience, otherwise they'd be had up for false advertising. 720p on YouTube is actually between 2100-2900kbps bitrate in almost all cases.

Fred: It can seem complicated at first, but it's actually fairly simple. A few pointers: Quality is primarily determined by bitrate. The higher the bitrate, the better the quality, up to the limit of the resolution. Once you have a perfectly clear image (or close enough that you can't tell the difference), your next limit is the resolution, which determines how small the pixels are in the image. On a large HDTV at 720p 'high definition', the pixels are still very large. Moving to 1080p fits more than twice as many pixels into the same space - smaller pixels = a finer image - but if the file bitrate is too low, you may as well have 720p, as it won't look good enough to justify the extra resolution.

Rich: Super-sharp, and super small. Remember Windows still doesn't really have proper DPI scaling. It's there, but it's a bit dodgy and it makes windows look odd, so I don't use it. The higher the dots-per-inch with Windows, the smaller everything gets, as things like the Start menu are always a standard number of pixels - therefore with more pixels per inch, the Start menu is fewer inches wide.
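[Editor's note: the pixel counts behind the 720p-vs-1080p comparison are easy to check. "Twice as many pixels" is actually a slight understatement - the exact ratio is 2.25x.]

```python
# Pixel counts for the two common HD resolutions discussed in the thread.
hd_720  = 1280 * 720    # 720p frame
hd_1080 = 1920 * 1080   # 1080p frame

print(hd_720)              # 921,600 pixels
print(hd_1080)             # 2,073,600 pixels
print(hd_1080 / hd_720)    # 2.25x as many pixels in the same screen area
```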
This is true about bitrate. I've seen 480p DVD rips that could embarrass some 1080p BluRay rips. All about bitrate, how it was encoded, and the wrapper they chose. Resolution doesn't mean much in terms of video quality if the bitrate is poor.
I have the pixel, 720, 1080 stuff pretty much down pat; this bitrate I am totally confused about. Does it have anything to do with, for example: I get a 1080 Blu-ray movie off the net, then of course I need to convert it to be able to burn it onto a disc. I use ConvertXtoDVD, and after the conversion it's not 1080 anymore, and for sure the quality from before the conversion is not there anymore. Would this have anything to do with this bitrate you guys are talking about?
The reason bitrate matters is because of compression (the same is true of audio, by the way). If video were truly uncompressed (this effectively never happens - not on Blu-rays or DVDs, or video cameras), then it'd work like this:

As an example, take a 1280x720 (720p) video, 1 hour long, at 30 frames per second. Each pixel would be three bytes, as that is how colour is represented digitally (one byte each for how much red, green and blue is in each pixel to give it its colour). Now you have 921,600 pixels right there, so that makes each frame 3 x 921,600 bytes = 2.7648 MB, just for one frame. Thirty of those a second and you get 82.94 MB/s - that's more than a lot of devices can read, certainly more than any optical disc drive is able to achieve. Multiply that by 3600 seconds to make the hour, and you get a total file size of 298,584 MB, or roughly 290 GB, just for an hour in 720p!

Clearly, compression is always going to be necessary, and it does this by taking a 'what changes?' approach. From one frame to the next - say it's a sitcom - it may be that what's going on right now is someone talking in a static set. All that's going to change is the person's mouth moving, maybe their hands slightly, but a large proportion of the frame is going to remain unchanged from one frame to the next - depending on the length of the scene, maybe for as many as 200 frames (6-7 seconds at 30fps). Rather than provide the entire detail of the frame every single time, you only need to provide detail on what changed, and how much it changed by. The more stuff that changes, the more data you need; the more stuff that changes at a given time, the higher the bitrate you need.

There are two types of bitrate used for this purpose:

Constant bitrate: You allocate a bitrate of, let's say, 3000kbps all the way through the file. This means you know how big the file will be based on its length.
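[Editor's note: the uncompressed-video arithmetic in the post above is worth checking in code. This sketch uses decimal megabytes/gigabytes; the post's "290 GB" figure comes from the same numbers with slightly different unit rounding.]

```python
# Reproduce the back-of-envelope calculation: 1280x720 frames,
# 24-bit colour (3 bytes per pixel), 30 fps, one hour of video.
width, height   = 1280, 720
bytes_per_pixel = 3        # one byte each for red, green, blue
fps             = 30
seconds         = 3600     # one hour

frame_bytes = width * height * bytes_per_pixel    # 2,764,800 bytes/frame
rate_MBps   = frame_bytes * fps / 1e6             # ~82.9 MB every second
total_GB    = frame_bytes * fps * seconds / 1e9   # ~298.6 GB for the hour

print(f"{frame_bytes:,} bytes/frame, {rate_MBps:.2f} MB/s, {total_GB:.1f} GB")
```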
However, the risk here is that you will be wasting data whilst filming someone standing still (where 3000kbps is far more than you need), but you won't have enough for things such as action scenes/explosions and so on. This creates compression artifacting, blurriness and so on, and is the difference you typically observe between the original Blu-ray/DVD of something and a rip.

Variable bitrate: The same principle applies here, but the bitrate varies up and down depending on what's needed. This takes longer to encode the file in the first place (as you have to examine the file first to determine how much bitrate will be needed where), but it allows the file size to be minimised when nothing is happening, and the picture quality to be preserved when a lot is happening.

If you're converting to DVD specifically, DVDs are only 720x576 (PAL) or 720x480 (NTSC) in resolution, which of course makes a considerable difference. The bitrate of DVD is, however, very high, so this itself should not be a problem.

Another thing to bear in mind when upscaling/downscaling is that if the resolution difference is not an exact multiple (480 does not go into 720 or 1080 a round number of times), then you will get some blurring from interpolation. Say you have a very small part of an image made of a black square and a white square next to each other. If you double the resolution exactly (e.g. from 640x480 to 1280x960) you'd get two black squares and two white squares - it fits perfectly. If not, you only have room for one square in the middle - what colour do you make it? Neither black nor white; it will have to be grey. Thus your nice sharp edge has a grey bit in the middle.

Hopefully that makes some sort of sense!
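[Editor's note: the black-square/white-square example above can be sketched in a few lines. This is a simplified illustration, not a real video scaler: exact doubling duplicates pixels and keeps the edge hard, while a non-integer rescale (here via linear interpolation) has to invent a grey in-between value.]

```python
def double(row):
    """Exact 2x upscale: every pixel is duplicated, so edges stay sharp."""
    return [v for v in row for _ in (0, 1)]

def resize_linear(row, new_len):
    """Rescale a row of greyscale values by linear interpolation."""
    out = []
    for i in range(new_len):
        pos = i * (len(row) - 1) / (new_len - 1)   # position in the source row
        lo = int(pos)
        hi = min(lo + 1, len(row) - 1)
        frac = pos - lo
        out.append(round(row[lo] * (1 - frac) + row[hi] * frac))
    return out

edge = [0, 255]                    # one black pixel next to one white pixel
print(double(edge))                # [0, 0, 255, 255] - still a hard edge
print(resize_linear(edge, 3))      # [0, 128, 255] - a grey pixel appears
```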
In the most basic terms, bitrate is what actually determines the quality of the file. Higher bitrate generally = better. A 480p file at a high bitrate will likely have better quality than a 1080p file with a lower bitrate.
Yes it did, Sammy. I didn't get it all, but it's a start for me. I'm not good, and have never been, with schooling. If something didn't interest me in school I was bored, and the way most teachers explained things, it was boring. I sucked at every subject except history, geography and of course gym lol. I always was, and still am, bad with text explanations; to put a bike together I need to read the manual 50 times before I get it, and at times that won't work - but show me one time and I've got it. I have learned so much more messing around on the Internet than I ever did in school, and if I still can't get something I can YouTube it; more than half the time somebody shows it to you. Anyway, I did get most of what you explained. I know it's small potatoes to you guys, but it's a big thing for me. For a young man, Sammy, you should apply for a teaching job, become a dean or something like that. As kids growing up we all admired the bad asses, the cool ones, the athletes; when you get older you realize how stupid that was. I know I have said it before and I will keep on saying it: I admire and appreciate the help I get from all you guys.
Here's another way to explain it. Ever download a music album, Fred? The general consensus is that 128kbit/s is agreeable. Myself, I prefer a minimum of 192kbit/s. This is bitrate. Go below 128, and it will become obvious that the MP3 file is seriously lacking the original source information/quality. Compression algorithms are coded to remove imperceptible audio ranges efficiently, with size in mind.

The same goes for video. With video, it obviously takes color space into account. Let's say a single frame has 300 shades of red, and 20 of them are highly similar. Compress too much, and you begin to lose the gradient-like effect, and lighting will begin to look wrong. And of course, this is basic, and only the tip of the iceberg.
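[Editor's note: the audio analogy can be put in numbers. Bitrate is literally how many bits are spent per second, so file size follows directly from it. The 4-minute track length below is just an illustrative assumption, not a figure from the thread.]

```python
# Approximate MP3 file size from bitrate and duration
# (audio data only - ignores ID3 tags and container overhead).

def mp3_size_MB(bitrate_kbps, minutes):
    """Size in decimal megabytes of an audio stream at a constant bitrate."""
    return bitrate_kbps * 1000 * minutes * 60 / 8 / 1e6

for rate in (96, 128, 192, 320):
    print(f"{rate:3d} kbit/s -> {mp3_size_MB(rate, 4):.2f} MB for a 4-minute track")
```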
Excellent explanation Kev, and yes I do a lot of music, and I like to go even higher if possible - 256 I think, and 320. Then my question is: how do you know what you are watching, e.g. streaming with Netflix, or downloading a movie from the net? Another example:

RUNTiME.......: 1h 43mn
SiZE..........: 4.35 GiB
VIDEO CODEC...: x264, 2pass, L4.1, 8ReFrames
FRAMERATE.....: 23.976 fps
BiTRATE.......: 4 494 Kbps
RESOLUTiON....: 1 280 x 720
AUDiO.........: English DTS 5.1 1510 Kbps
SUBTiTLES.....: Romanian, English, Bulgarian
ENCODER.......: ?

I know about the resolution, subs and audio. I can see the bitrate, and the codec and framerate, which I know nothing about. But now that I see your so-called bitrate, is this what I look for as far as quality? In this case the bitrate says 4 494. I never paid attention to bitrate in video; with music it didn't take long to figure out the quality difference between 128 and below or above. So is 4 494 considered a good number as far as quality? What are good bitrates to look for in video?
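[Editor's note: the figures in that info block can be cross-checked against each other. The overall bitrate implied by the file size and runtime should roughly equal the video plus audio stream rates; the small gap left over is plausibly subtitles and container overhead.]

```python
# Cross-check the media info quoted above: does size / runtime
# agree with the listed video + audio bitrates?
size_GiB   = 4.35                       # SiZE from the info block
runtime_s  = 1 * 3600 + 43 * 60         # RUNTiME: 1 h 43 min
video_kbps = 4494                       # video BiTRATE
audio_kbps = 1510                       # DTS 5.1 audio bitrate

total_bits   = size_GiB * 2**30 * 8     # GiB -> bits
overall_kbps = total_bits / runtime_s / 1000

print(f"overall from size/runtime: {overall_kbps:.0f} kbit/s")   # ~6046
print(f"video + audio streams:     {video_kbps + audio_kbps} kbit/s")  # 6004
```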
For 720p, 4,000Kb/s and up seems to be agreeable. Generally, that bitrate is good. But bitrate isn't all there is to it. If x264 was instructed to perform the operation at "Extreme" speed, then the quality won't be as good as with the slowest setting. With Netflix, I've not heard of a utility that measures the bitrate of what you're watching, though I'm sure it would not be difficult to code, and one probably exists.
I didn't think there was a way to eye the bitrate while watching something on Netflix, but I had to ask. And this is why I still don't fully get video quality: now we're into it not being just about the bitrate, now we're getting into other things. I can see this video thing is a lot more complicated than music, but I have to start somewhere.
You're taking the bitrate too literally. If you viewed the chart, they show resolution and compression levels versus the bandwidth required to view the full quality of the original source. If the original source is garbage, guess what you'll get: garbage in, garbage out. Obviously if you're careless, nothing you do is going to work well, even if it is 1080p or even UHD quality - please use some common sense here. And as to Netflix, they do modify the original movie sometimes - letter-boxing, over-compressing, cutting parts out and so on - so you won't always get a good 1080p movie like you should.
No, it is exactly the same as with MP3, WMA, OGG, or whatever compression is used. When you compress anything lossily, you typically throw away some of the original information, degrading the music/video quality and often adding noise as well. Lossless, which I prefer for audio compression, avoids that - though not everything marketed as lossless truly is. Also, my TV will tell me what the bitrate of a source is, and the resolution of the source.
Exactly the same as music? OK, if you say so I believe it then; you know much more about this than I do. Although it's the same, music does seem much easier - there really is not much to figure out. I can go lossless or downgrade; I picked WMA Pro. I think it's supposed to be lossless, but I'm not sure - it sounded good lol. As video goes, it is more complicated. When I used to get movies from the net I always picked 1080p, then realized after converting them with ConvertXtoDVD that the clear sharpness was downgraded. Don't get me wrong, it's still a nice picture, but before conversion, on my PC screen, it can look like a 35mm snapshot sometimes. I even bought a Blu-ray burner and tried to see if I could burn some BR movies to a DL disc; the burner won't even play a BR movie, and I had to download this or that to view it, so I just gave up. Russ a while back sent me a copy he made of one of the Batman movies he got off the net. He put it on a DL disc, and I must say the picture was gorgeous, especially for a dark movie. He did it with ImgBurn. I tried it - I wanted to make a 1080p movie look like one on a DL disc - but I also gave up with ImgBurn; it turned out to be problematic, plus all my kids are deaf and getting subtitles to work properly was a nightmare. ConvertXtoDVD, which is a great piece of software, does everything pretty much automatically, but after the conversion, yes, you do lose some quality. It's standard DVD quality, which is still good, and they made applying subtitles very easy, which is important to me and my kids. Yeah, with all this new stuff coming out like streaming etc.
I need to get on the ball; I'm way behind. I'm even considering checking out the docking station thing and putting a lot of my movies on a HDD - that's another thing I need to learn. Remember way back when a lot of us got into a tiff about storing our movies, and which is better, HDD or discs? I held fast with discs lol, and still do, because with my kids and their friends viewing movies in many separate rooms, it's a lot easier to just hand them a disc than to have docking stations with computers sitting next to them in every room, if that's what's needed. But I would like to get into it anyway. My friend who just got into his Netflix thing was interested in that after I told him how you guys do it. I did explain to him that if you store all your movies on a HDD and it takes a crap, well then, lol. He said he didn't care much; he liked the idea better than having discs laying around. He'd rather go with the ease of storing his movies on a HDD, and the worry about losing them was secondary. He doesn't have to worry about kids watching movies all over the house like me; it's just him and his wife, so for him it's a good idea. He wants me to teach him about it all, and I told him that's still out of my league; we will have to learn together.

Steve, when you said "Also my TV will tell me what the bitrate of a source is and the resolution of the source" - how would you do that?
I use a WD TV Live coupled with a docking station (no computer needed, although it can be used with a computer). The WD can also view Netflix and a whole range of apps on the Net. It can also handle a whole range of filetypes, such as VOB and Blu-ray (M2TS) files. I recently purchased another one at Newegg for $39.99 with rebate, but that sale is no longer on. BTW, I back up my movies to another HDD, so if I lose one, I have a full backup. Best of luck.
I was always told you need a computer with your docking station - how would a docking station work without a PC?