All the talk has been about 60fps and 100fps, but apparently the next gen consoles are capable of 120fps (so the Sony president Ken Kutaragi says). TVs can't display the images yet, but with the further development of HD plasmas and LCD screens maybe they will in a few years. That will really bring the graphical power of the new consoles to a screen where it can truly be appreciated. Here's the link: http://www.gamespot.com/news/6136786.html
It's not the TVs that are the problem; the fact that humans can only see something like 30fps makes it seem pointless.
Don't think that's true, man. If the frame rate drops lower than 20-30 it becomes more noticeable to the human eye, but as it increases it becomes harder to notice a rise in quality on our current TVs. For a game like Burnout or any FPS, a higher frame rate would result in a smoother game, so when quickly turning 360 degrees in an FPS or travelling at 200mph in Burnout, the loading of new cars, buildings, characters and surroundings would be seamless and without any jumpy bits, like in reality. Humans can perceive very, very high amounts of information instantaneously. Some Japanese students made a two-minute video at just under the natural resolution of the eye... it had such a high frame rate and resolution that the people they tested it on became sick. It was explained as basically being the opposite of sea sickness (where your eyes don't think you're moving but your body knows it is moving on the waves): your brain thinks you're moving because of the images, but you're not. So if human eyes can be fooled by super images, I'm sure a better frame rate in games would enhance gaming. (Sorry if some of this sounded like gibberish, had to write it quickly... going trick'r'treatin (",) )
That is in fact utter gibberish, the lot of it, lol. p4 is right: the human eye cannot distinguish any improvement in frame quality over 30fps. TV, cinema and cartoons have different frame rates; can't remember the order, should do really, it's my job. I believe cinema is 24fps, TV is 30fps, and cartoons used to be shot at 12fps, but now it's higher for 3D apps like The Incredibles etc., as you want to see fluid, dynamic movements, not jerky movements like in some old Disney films.
Didn't you just say earlier that the people exposed to higher frame rate videos suffered from nausea? How would becoming sick enhance gameplay? Did I miss something?! :-/
Jesus, I posted this ages ago, thought it was well gone. I read that article on afterdawn, I believe. Anyways, I'm just assuming that a more realistic, higher resolution type of gameplay would be great. Imagine playing a game that looked exactly like you would perceive it in the real world... I'm sure anything like that is about a half century away anyways. Even if the PS3 and Xbox 360 can display at HD and at 120fps... I played the 360 the other day in Dublin in Ireland and I wasn't overly impressed at all, although I understand that there were no graphically good PS2 games for at least a year and a half after its release, so I didn't really expect much. Also, I think in gaming a higher FPS would simply result in smoother simulation in fast paced games, but I dunno for sure.
PS3 will run up to 120fps, but that's only how fast the images are moving. Doesn't really matter. Just search for PS3 fps in Google News and you will get the article.
Nah, the max framerate won't be *up to 120 FPS*; that could only happen if TVs had vsync with 120 Hz refresh rates (which only applies to CRT TVs). FPS is ALWAYS changing; it will go way higher than that at times with the hardware in the 360 and PS3, and it will stay high (aka 60+) even in hardcore action. That's how powerful they are (both, regardless of which is the more powerful). The future is bright, and all that stuff.

About human eyes: yes, the human eye can't see the *physical change* of the frames, but the lower the framerate the more jumpy it is and the more 'shit' it is basically; the higher it is, the smoother the game is. And trust me, I can tell the difference between 90FPS and 100FPS; I know games and PCs very well. All that human eye stuff doesn't apply to game framerates, and people who think it does really need to find out about this stuff properly. Playing a game at a max of 24FPS will suck really, really badly. 40FPS is regarded by standard PC gamers as *playable*; 50+ is what people want *at least*. If you watched two game stress tests (e.g. CS:S) with one somehow set at 24FPS (by a tech master or someone?) and one at 60FPS, you would notice a distinct difference, and you would be able to tell 60FPS from 24, regardless of that human eye stuff (I know I've just repeated that).
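To put some rough numbers on why lower framerates feel jumpier, here's a quick illustrative Python sketch (plain arithmetic, nothing console-specific) of how long each frame stays on screen at the framerates mentioned in this thread:

```python
# Illustrative only: the time budget per frame shrinks as fps rises,
# which is why low framerates look jumpy even if you can't count frames.
for fps in (24, 30, 60, 120):
    frame_time_ms = 1000 / fps  # milliseconds each frame is displayed
    print(f"{fps:>3} fps -> {frame_time_ms:5.1f} ms per frame")
```

At 24fps each frame sits on screen for over 40ms, so during a fast 360-degree turn the image jumps a long way between frames; at 120fps it's only about 8ms.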
[bold]Ripped from Wikipedia:[/bold] [bold]For more about this and video game framerates visit:[/bold] http://en.wikipedia.org/wiki/Framerate
They probably meant 120 FPS on average, in a good-performance game (one that doesn't kill even the very, very best system on performance).
BS. At least I can tell the difference between 30fps and 60fps. You must be blind if you can't see the difference.
"Even with expensive monitors that can reach even higher frequencies, the effect is somewhat lost as the human eye has difficulty in perceiving differences in frame rates above around 50-60 fps." There's a more accurate figure for human perception. Thanks for the Wikipedia article, Mr. Grimmace. After reading that, the 120 fps figure for the PS3 isn't that surprising at all.
Isn't fps... well, I know what it stands for, but isn't it just how fast the images go? I don't think fps has anything to do with comparing system specs. Example: on my N64 emulator, the bottom right corner states the fps, regularly 60; when I increase the speed, the fps goes up.
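For what it's worth, an fps counter like the one in that emulator is usually just counting how many frames get rendered each second, which is why it climbs when you speed the emulator up. A minimal Python sketch of the idea (the `render_frame` callback is a hypothetical stand-in for drawing one frame):

```python
import time

def measure_fps(render_frame, seconds=1.0):
    """Count how many times render_frame() completes in `seconds`.

    Hypothetical sketch: render_frame stands in for one frame of work;
    the faster the machine (or emulator) runs it, the more frames fit
    in the measurement window, so the reported fps goes up.
    """
    frames = 0
    deadline = time.perf_counter() + seconds
    while time.perf_counter() < deadline:
        render_frame()
        frames += 1
    return frames / seconds
```

So the counter measures how fast images are being produced at that moment, not a fixed spec of the system.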
I am still more fascinated by the prospect of Cell processing than I am by the PS3 overall. Man, what I would do with a supercomputer... what can I do? One thing that I am interested in, though, is what "bit" are the Nintendo Revolution, PS3 and Xbox 360? I mean, I know that the Dreamcast, Xbox, Gamecube, and PS2 are 128-bit systems, so are the next generation systems 256-bit?
Not sure. I think calling a system 128-bit etc. is outdated now, because of all the more complicated, better hardware and technologies coming out. They might do though? (Though I don't think they will.)