
The Official Graphics Card and PC gaming Thread

Discussion in 'Building a new PC' started by abuzar1, Jun 25, 2008.

  1. sammorris

    sammorris Senior member

    Joined:
    Mar 4, 2004
    Messages:
    33,335
    Likes Received:
    7
    Trophy Points:
    118
    Some leaked info about AMD's HD7 series:

    HD7800 series (Thames)
    1536 VLIW4 @ 950MHz, 2GB GDDR5 @ 5800MHz (256-bit, 186GB/s), 120W TDP (HD7870)
    1408 VLIW4 @ 850MHz, 2GB GDDR5 @ 5200MHz (256-bit, 166GB/s), 90W TDP (HD7850)
    compares to
    HD6900 series (Cayman)
    1536 VLIW4 @ 880MHz, 2GB GDDR5 @ 5500MHz (256-bit, 176GB/s), 210W typical (250W max) (HD6970)
    1408 VLIW4 @ 800MHz, 2GB GDDR5 @ 5000MHz (256-bit, 160GB/s), 170W typical (200W max) (HD6950)


    HD7670/HD7570 (Lombok)
    768 VLIW4 @ 900MHz, 1GB GDDR5 @ 5000MHz (128-bit, 80GB/s), 60W TDP
    768 VLIW4 @ 750MHz, 1GB GDDR5 @ 4000MHz (128-bit, 64GB/s), 50W TDP
    compares to
    HD6700 series (Juniper)
    800 VLIW5 @ 850MHz, 1GB GDDR5 @ 4800MHz (128-bit, 76.8GB/s), 108W TDP (HD6770)
    720 VLIW5 @ 700MHz, 1GB GDDR5 @ 4600MHz (128-bit, 73.6GB/s), 86W TDP (HD6750)

    HD7900 series (Tahiti)
    2048 GCN @ 1000MHz, 2GB XDR2 @ 8000MHz (256-bit, 256GB/s), 190W TDP
    1920 GCN @ 900MHz, 2GB XDR2 @ 7200MHz (256-bit, 230GB/s), 150W TDP
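    As a sanity check, the bandwidth figures in these leaked specs follow directly from effective memory clock and bus width. A minimal sketch (the helper name is my own, not from any AMD tool):

```python
def bandwidth_gbs(effective_clock_mhz: float, bus_width_bits: int) -> float:
    """Memory bandwidth in GB/s: effective clock (MHz) times bus width in bytes."""
    return effective_clock_mhz * (bus_width_bits / 8) / 1000

# HD7870: 5800MHz effective on a 256-bit bus -> ~186GB/s, as quoted above
print(round(bandwidth_gbs(5800, 256)))  # 186
```

    The same formula reproduces every figure in the list, e.g. 4800MHz on a 128-bit bus gives the HD6770's 76.8GB/s.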

    It's becoming pretty apparent that the switch to 28nm will offer HD6950/6970 performance, including the memory capacity and bandwidth, on a single PCIe 6-pin card. Here's hoping the pricing will be similarly low. The performance of the HD7900 series, however, isn't immediately guessable, as it's a new architectural design with a new memory type as well. Theoretically we should be looking at something of the order of a 40% performance increase alongside a 10-20% power reduction. Not bad, but once again AMD are being pretty conservative with the new architecture here, presumably because silicon yields will once again be very low, causing availability to be very poor, much like the HD5 series. Nvidia are several months behind at this stage with Kepler, but with a later release date this time around, it's difficult to say whether AMD will get much of a retail headstart, if any.
     
    Last edited: Sep 14, 2011
  2. omegaman7

    omegaman7 Senior member

    Joined:
    Feb 12, 2008
    Messages:
    6,955
    Likes Received:
    2
    Trophy Points:
    118
    I'll probably be buying one of the HD7 series cards come February, provided it's been released. If info has been leaked, I would imagine it will be by February ;)
     
  3. sammorris

    sammorris Senior member

    Now that 2GB memory has become the standard (as I was hoping), my next hope is that AMD bother to put three DisplayPort connectors as standard on the new cards, to make Eyefinity setups a bit easier. As nice as this new generation looks for efficiency, I don't see much of a reason for high-end HD6/GTX5 users to upgrade to it. Personally, I'm likely to be waiting until the HD8 series.
     
  4. omegaman7

    omegaman7 Senior member

    How long until the 8 series, do you think? Perhaps I could find another GTX260 and be fine for a while longer. Dual 260s, though, would probably only benefit a select number of games...
     
  5. sammorris

    sammorris Senior member

    Depends. HD8 would most likely be a revision of 28nm, since 22nm silicon will be years off yet, perhaps not arriving until early 2015. With the HD7900 series already bringing a brand new graphics architecture to the fore, I'm not entirely sure what they could do. It's becoming increasingly obvious that shrinking the silicon process further is getting harder, and an industry-wide expectation is forming that by 10-15nm the shrinking will stop altogether, and we will likely need to switch to something like optical electronics. Stagnant times are ahead.
    As a reference, one HD7850 (90W TDP) would almost certainly outperform two GTX260s in SLI.
     
    Last edited: Sep 14, 2011
  6. omegaman7

    omegaman7 Senior member

    Thanks for the info. I knew my card was grossly outdated LOL! 10-15nm... 0__0 It boggles the mind LOL!
    Or perhaps at the end of next year we'll be thrown back to the dark ages altogether :S Heaven forbid. Contrary to what my close friends and family think, I'd survive without a computer. I have other passions after all: chemistry, reading, science in general!
     
  7. DXR88

    DXR88 Regular member

    Joined:
    May 29, 2007
    Messages:
    714
    Likes Received:
    2
    Trophy Points:
    28
    We could go backwards and start packing more crap in the chip.
     
  8. omegaman7

    omegaman7 Senior member

    I don't think fecal matter will help us... :p
     
  9. shaffaaf

    shaffaaf Regular member

    Joined:
    Jan 5, 2008
    Messages:
    2,572
    Likes Received:
    4
    Trophy Points:
    46
    I was hoping for more of a jump, considering that we missed a generation's worth of improvements due to 32nm being cancelled.

    Disappointed. Was hoping for 4870X2 speeds at 75W, tbh haha
     
  10. sammorris

    sammorris Senior member

    Well, HD6950 speeds at 90W - that's not far off, to be honest. In high-resolution environments the HD6950 is often faster than the HD4870X2.
     
  11. DXR88

    DXR88 Regular member

    Who knows, it could be the world's first organic microchip, and instead of heatsink compound we would use eatstink compound.

    ...I should patent this.
     
  12. harvrdguy

    harvrdguy Regular member

    Joined:
    Sep 4, 2007
    Messages:
    1,193
    Likes Received:
    0
    Trophy Points:
    46
    DXR quoted this before, but Jeff's "review" deserves to be quoted again. I started playing Mafia 2, and it is amazing - in my opinion, far better than GTA4 - so Kevin, you might like it. The graphics are super sharp: I was amazed that in the car, as you turn into the sunlight, the light breaks apart into its component colors just like in real life - RGB, and you can see overlaps of two colors forming yellow. In fact, headlights and all lighting are handled very well. In addition, there are some incredible snow and icy street maps. Just for the fun of it I took a break from the story centered around Vito, and looked at Jimmy's Vendetta and Joey's story.

    The first main episode of Joey's story involved killing off a main witness - a stool pigeon. As the witness was escaping, there was a car chase over a frozen river, with chunks of ice, and places where the ice had thawed - an incredible car chase if there ever were one.

    If you play it, don't wait for the Garand like I did, wondering where it was. It is in any of the many gun shops spread around the city - just browse the shop like I finally decided to do, and they sell Garands - and THAT rifle is very deadly against these hoods. It has a lot more range than the pistols, plus the single-shot power of the magnum. (To buy tommy guns or others, only the one Army surplus shop at the top of the map sells those items.)

    Anyway, to echo Jeff and DXR - Mafia 2 is a great game!

    -----------------------------------------------

    "Eatstink compound - instead of heat sink"

    hahahahahahaha. Good one DXR!

    ----------------------------------------

    What did you say, Sam - a 40% performance increase versus a 10-20% power drop on the 7000 family? Well, if that is coming in February, maybe that's when I'll take the plunge. It would be nice to put the 9450 in the position of being the bottleneck, and then OC it to 3.4 to fix that. (So is Crysis 30" very-high-settings gaming finally going to be a reality for me, at 30fps minimum all the way through the game including the ice? After Mafia 2 I don't know if I care about Crysis any more, lol.)

    Rich
     
  13. omegaman7

    omegaman7 Senior member

    I may just check it out. I'm trying to figure out a bug first. Until this bug is discovered, I cannot - I will not - install anything else LOL! I wish there was a version of Windows 7 that could fit on a flash drive. Oh wait! I have a 32GB drive! But surely there's a lite version...

    I wonder if "7" will even allow installation on a 32GB USB drive. Apparently it's not designed natively for installation to flashy.

    I think I found a way to install Windows 7 to flashy :) The reason I have to use "7" is that with something like Linux, I may not recognize how the bug manifests. With Windows 7, I'll just know.
    Windows 7 installation to thumbdrive/flashdrive
     
    Last edited: Sep 15, 2011
  14. Griff88

    Griff88 Member

    Joined:
    Apr 3, 2011
    Messages:
    28
    Likes Received:
    0
    Trophy Points:
    11
    Hi,

    I have an older system... a dual-core E8500 @ 4.35GHz. I'll be getting a new 2GB 6950. Do you think there will be a big bottleneck with my CPU?
     
  15. Estuansis

    Estuansis Active member

    Joined:
    Jan 18, 2006
    Messages:
    4,523
    Likes Received:
    8
    Trophy Points:
    68
    Not really. Even though it's a dual core it's at quite an impressive speed and is a very up-to-date design. Now if you were going Crossfire you'd most certainly need a quad core, but with a single card you should be just fine.
     
  16. Griff88

    Griff88 Member

    Thanks for the info Estuansis. Now I can hold off a bit longer on an upgrade.
     
  17. harvrdguy

    harvrdguy Regular member

    Wow, at 4.35GHz - that is one fast chip, like the new Sandy Bridges! I'm eventually going to overclock my 2.6GHz quad core to get to 3.4, and I'll still be a full 1GHz slower. Like Jeff said, it sounds like your system will be balanced with the one card.

    Kevin, for sure, if you can ever get that bug handled - as much as you liked GTA4 you will like Vito, and Jimmy's Vendetta, and Joey's Stories in Mafia 2. Good luck with windows 7 on a usb flash drive.

    Rich
     
    Last edited: Sep 16, 2011
  18. omegaman7

    omegaman7 Senior member

    Luck? I'll need it LOL! Not sure how well it'll run on a drive that can't handle a whole lot of simultaneous read/write operations ;) But if it can handle basic stuff like streaming Netflix, or watching a few videos from other drives, I should be able to determine if something is going on with my Velociraptor, yet again...

    I finally have time to attempt it. The drive is ready to boot and install now. I'll be attempting it within the hour. If it works, I'll post a link to the site that helped me. That way somebody else can be linked there ;)

    Are people who are subscribed to the AMD building thread not getting update emails? I know the thread is plagued with the new page bug...
     
    Last edited: Sep 16, 2011
  19. sammorris

    sammorris Senior member

    It depends on the game; an E8500 will be a massive bottleneck in quad-oriented games like Battlefield 3, and there are several other games for which an E8500 is starting to no longer be sufficient. You might be OK for the time being depending on the game, but you should definitely look at upgrading the CPU before too long as well.
     
  20. sammorris

    sammorris Senior member

    Had a little play with a new performance metric.
    Using previously established values with a default resolution of 1920x1080, a baseline can be calculated that measures how fast everything performs at a fixed screen resolution.
    In other words, of two PCs with the same hardware, the one running a larger monitor effectively becomes 'a slower PC' if run with the same detail settings but the higher resolution.
    Using a metric that considers both the graphics hardware and the resolution, a single figure can be obtained for how demanding a game is, based on benchmark data. By comparing the score for your system against the score for a game, you can more immediately identify how many detail-setting adjustments (if any) will be needed to achieve your desired frame rate, rather than having to recalculate benchmarks based on cards you don't own, or resolutions you don't use :)

    The baseline, as always, is the Radeon HD4870, with a score of 100. So an HD4870 at a resolution of 1920x1080 gives a PC score of 100.
    Thus, a GeForce GTX580 at 1920x1080 gives a PC score of 230, and a Radeon HD6970 at 2560x1600 gives a PC score of 106.

    Here are some example systems based on my own, and various friends' PC systems:
    8800GTX @ 2560x1600: 35
    HD4870X2@ 2560x1600: 91
    HD4870 @ 1680x1050: 118
    HD6970 @ 2560x1440: 118
    HD6850 @ 1920x1200: 126
    HD5850 @ 1920x1200: 140
    HD6870 @ 1600x1200: 173
    2xHD6970 @ 2560x1600: 208
    GTX580 @ 1920x1080: 230
    2xHD6850 @ 1920x1200: 246
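    The scores above appear to scale with pixel count relative to 1920x1080. A quick sketch under that assumption - the 1080p GPU scores in the table below are inferred from the example figures, not official numbers:

```python
BASELINE_PIXELS = 1920 * 1080  # reference resolution where HD4870 = 100

# GPU scores at 1920x1080, inferred from the example systems above
GPU_SCORES = {"HD4870": 100, "GTX580": 230, "HD6970": 209}

def pc_score(gpu: str, width: int, height: int) -> float:
    """Scale the GPU's 1080p score by the ratio of baseline pixels to pixels drawn."""
    return GPU_SCORES[gpu] * BASELINE_PIXELS / (width * height)

print(round(pc_score("HD4870", 1680, 1050)))  # 118, matching the list
print(round(pc_score("HD6970", 2560, 1440)))  # 118, also matching
```

    The lower pixel count at 1680x1050 is what lifts the humble HD4870 to 118 here, while the HD6970's much stronger silicon is pulled down to the same figure by a 2560x1440 panel.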

    With that as a reference, here are the recently benchmarked titles at GameGPU.
    The objective frame rate is 60fps. This means if you only desire 30fps, you can get away with a PC score exactly half the game's score before needing to reduce detail.
    Both average and minimum frame rates are considered here, so for an average frame rate of 60fps, read the first value; for a minimum frame rate of 60fps, read the second value.
    If a single figure is noted with an A, only the average frame rate is available in the benchmark.
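    The 60fps objective makes the arithmetic a simple linear ratio; a hypothetical helper, assuming frame rate scales proportionally with the ratio of PC score to game score:

```python
def estimated_fps(pc_score: float, game_score: float, objective_fps: float = 60) -> float:
    """Estimate frame rate: linear scaling of the 60fps objective by score ratio."""
    return objective_fps * pc_score / game_score

# GTX580 at 1920x1080 (PC score 230) against Dead Island's average score of 105:
print(round(estimated_fps(230, 105)))  # 131
```

    The 30fps rule from above falls out of the same formula: a PC score of exactly half the game's score gives 60 * 0.5 = 30fps.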

    W40K Space Marine: 75A
    Dungeon Siege 3: 80/90
    Call of Juarez 2 The Cartel: 95/115
    HoMAM 6: 100A
    Alice: Madness Returns: 100/115 (Normal), 375/510 (PhysX Med), 445/655 (PhysX High)
    Dead Island: 105/125
    Hard Reset: 100/120 (NoAA), 155/180 (MLAA), 255/340 (FS4)
    From Dust: 110/130
    Duke Nukem Forever: 145/185
    Red Faction Armageddon: 170/205 (DX9), 205/250 (DX11)
    Deus Ex Human Revolution: 180/200
    Battlefield 3 Alpha: 185/250
    Fable 3: 185/260
    DiRT 3 v1.1: 195/230
    F3AR: 235/280
    Crysis 2: 290/375 (DX9), 345/485 (DX11)
    Total War Shogun 2: 305/370 (DX10 MLAA), 440/575 (DX10.1 MS8), 485/650 (DX11 MS8)
    Rift: Planes of Telara: 375/500
    The Witcher 2: 230/315 (High), 760/915 (Ultra), 780/980 (Max), 625/810 (High 3D), 2120/2520 (Max 3D(est.))

    Some interesting things to note here:
    1. Even at 1080p resolution, an old HD4870 can still run the first eight titles pretty close to 60fps.
    2. Even with some concessions, Crysis 2, Shogun 2, Rift and The Witcher 2 all require a higher PC score than any of the example systems listed.
    3. The penalty for PhysX, and for 3D Vision, is enormous. (Also note, GeForce cards are still required to use either of these technologies.)
    4. The Witcher 2 is ridiculous. A performance rating of 2520 for the maximum setting in 3D mode would require two GTX570s in SLI just for 640x480!
     
    Last edited: Sep 16, 2011
