So it begins - upgrades part 1
Colonel Mitch
17th December 2010, 04:24 PM
As some of you know, I'm planning to build a new PC in the next month or so, but with the launch of the new ATI cards I can't resist buying the first round of parts now xD
I'm getting two 6950s in CrossFire, as the value for money they offer is staggering.
http://i513.photobucket.com/albums/t333/piers1989/gfx.png
(the performance index represents average fps / cost, and each fps column is a different game)
Review page here: http://www.anandtech.com/show/4061/amds-radeon-hd-6970-radeon-hd-6950/23
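For anyone wondering how that performance index is worked out, here's a quick Python sketch of the idea (all the fps and price numbers in it are made-up placeholders, not the review's figures):

# Rough sketch of the "performance index" from the chart above:
# average fps across the tested games, divided by the card's cost.
def performance_index(fps_per_game, price):
    average_fps = sum(fps_per_game.values()) / len(fps_per_game)
    return average_fps / price

example_fps = {"game 1": 60.0, "game 2": 75.0, "game 3": 48.0}  # placeholder fps
print(round(performance_index(example_fps, 220.0), 3))  # placeholder price, prints 0.277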
I'm also going to get a shiny new screen. I'm planning on a Dell U2711 right now:
http://www.bit-tech.net/hardware/monitors/2010/03/16/dell-u2711-review/1
To summarize, it's a 27" 2,560 x 1,440 IPS display.
I'm going to be buying this on Monday, so if anyone has any thoughts please post below :)
Calneon
17th December 2010, 04:37 PM
Should be absolutely amazing. ATI naming structures are getting out of hand though; why is the 5000 series better than the 6000 series?
VoX
17th December 2010, 05:51 PM
My only thought is that the U2711 is bloody amazing and that I am incredibly jealous :D.
Colonel Mitch
17th December 2010, 06:14 PM
Should be absolutely amazing. ATI naming structures are getting out of hand though; why is the 5000 series better than the 6000 series?
It's not. The 69xx cards are better.
The first digit just defines the generation, the second defines the level of performance, and the third defines sub-categories.
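Quick illustration of that scheme in Python, using the 6950 as the example (just my reading of the naming, nothing official):

# Break a Radeon model number into the parts described above.
model = "6950"
generation, tier, variant = model[0], model[1], model[2:]
print(f"generation {generation}, performance tier {tier}, variant {variant}")
# prints: generation 6, performance tier 9, variant 50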
Phantom, how do you find the input lag on your IPS? It's the only bad thing I've read about really, with people saying it's noticeable if you play FPS games.
Considering I mostly play WoW now it's not that big a deal to me, but I just wanted to know.
Calneon
17th December 2010, 08:48 PM
It's not. The 69xx cards are better.
The first digit just defines the generation, the second defines the level of performance, and the third defines sub-categories.
Phantom, how do you find the input lag on your IPS? It's the only bad thing I've read about really, with people saying it's noticeable if you play FPS games.
Considering I mostly play WoW now it's not that big a deal to me, but I just wanted to know.
I thought that graph at the top of your link was an FPS benchmark; never mind.
Input lag isn't noticeable to me.
Colonel Mitch
18th December 2010, 04:45 PM
I thought that graph at the top of your link was an FPS benchmark; never mind.
Input lag isn't noticeable to me.
Cool, ty for the info.
Looking forward to getting it all now :P
Also - just found out that the monitor supports 30-bit deep colour, which is 1.07 billion colours (64 times that of standard 24-bit). Unfortunately most non-workstation gfx cards don't support 30-bit colour..... BUT WAIT! THE NEW 6900 SERIES DO!! :D
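The maths behind those numbers, if anyone wants to check (just powers of two):

# 30-bit "deep colour" vs the standard 24-bit colour mentioned above.
colours_30bit = 2 ** 30               # 1,073,741,824 (~1.07 billion) colours
colours_24bit = 2 ** 24               # 16,777,216 colours
print(colours_30bit / colours_24bit)  # 64.0, i.e. 64x as many colours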
VoX
22nd December 2010, 12:45 AM
They only do if you use a DisplayPort connection ;). (Which you can, lol)
Anyway, we demand pics of new shizz :D
Colonel Mitch
22nd December 2010, 01:22 AM
Built and set up ^^
Pics are in the album below, and the best ones are linked here too.
http://img815.imageshack.us/g/20101221235807.jpg/
Isphera
22nd December 2010, 03:17 PM
Damn, I knew my reign as king of awesomesauce PC wouldn't last long ;D
Calneon
22nd December 2010, 03:38 PM
What's the deal with showing MW2 capped at 60fps to demonstrate your awesome PC power? Let's see some 3DMark or Crysis or Metro 2033 or SOMETHING other than a badly ported console game :).
Colonel Mitch
22nd December 2010, 06:02 PM
What's the deal with showing MW2 capped at 60fps to demonstrate your awesome PC power? Let's see some 3DMark or Crysis or Metro 2033 or SOMETHING other than a badly ported console game :).
I didn't show anything to demonstrate the PC's awesome power - Vox nagged me for pics and I'd just installed Windows.
It's Black Ops, not MW2, btw.
Also it's not finished yet, as I don't have a new processor.
Will 3DMark it when I get back to Newcastle, whenever that is; the server is there, so all my files like 3DMark aren't with me in Cumbria.
I can play BC2 @ 2560x1440 with everything in the game set to max and never experience a noticeable fps drop, if that's anything to go on.
EDIT: Downloaded 3DMark Vantage. Here:
http://i513.photobucket.com/albums/t333/piers1989/3dmark.jpg
VoX
31st December 2010, 05:01 PM
Not sure if you've seen this, and it's probs not needed, but it still might be worth it for the lols:
http://downloads.guru3d.com/Radeon-HD-6950-to-HD-6970-Flashing-Tools-download-2658.html
Kiliv
31st December 2010, 05:36 PM
Oooh, now that's some shiny stuff :D
Colonel Mitch
1st January 2011, 07:29 PM
Voxfail, I heart you :D
Now running my two XFX 6950s as two stock overclocked Asus 6970s xD
EDIT: It won't let me run 3DMark though; it says it can't analyse the system.
VoX
1st January 2011, 07:35 PM
:D.
Pure win. Have you got the latest patch for 3DMark? Does everything else work though?
Colonel Mitch
1st January 2011, 07:41 PM
:D.
Pure win. Have you got the latest patch for 3DMark? Does everything else work though?
Yeah, it's up to date, and it works fine in game. It's probably just conflicting information about the cards; unfortunately you can't run it without the system being scanned first.
Vicious Horizon
1st January 2011, 07:50 PM
LOL. WIN.
VoX
1st January 2011, 09:21 PM
Ah well, only a benchmark, just enjoy the free upgrade :p
Isphera
2nd January 2011, 12:15 AM
I love it when companies do this; it's just so funny when they try to cut costs but end up risking losing revenue :P
Colonel Mitch
2nd January 2011, 12:30 AM
I love it when companies do this; it's just so funny when they try to cut costs but end up risking losing revenue :P
To be fair, AMD were happy enough leaving the Phenoms unlocked, and they even added a dual BIOS to these cards with an easy switch to go to the backup. My guess is they aren't fussed about this being possible, as only the most hardcore nerds will go messing with their VBIOS, and for those few nerds it gives them something to brag about and probably inspires some loyalty.
Speaking purely of gains vs losses, my guess is it's a LOT cheaper for them to just mass produce 6970s and have a very small number of people flash the BIOS than it is for them to make two separate cards.
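Purely for illustration, a back-of-the-envelope version of that argument in Python. Every number below is invented; I have no idea what AMD's real costs, volumes, or margins are:

# Hypothetical comparison: one production line (some dies sold cut-down, a few
# buyers flash them) vs. running two separate production lines. Made-up figures.
one_line_setup_cost = 10_000_000     # hypothetical cost of a single line
two_line_setup_cost = 18_000_000     # hypothetical cost of two lines
units_sold = 1_000_000               # hypothetical volume
price_gap_6950_to_6970 = 70          # hypothetical lost margin per flashed card
flash_rate = 0.01                    # assume ~1% of buyers ever flash the BIOS

one_line_total = one_line_setup_cost + units_sold * flash_rate * price_gap_6950_to_6970
print(one_line_total, one_line_total < two_line_setup_cost)  # 10700000.0 True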
Isphera
2nd January 2011, 12:32 AM
To be fair, AMD were happy enough leaving the Phenoms unlocked, and they even added a dual BIOS to these cards with an easy switch to go to the backup. My guess is they aren't fussed about this being possible, as only the most hardcore nerds will go messing with their VBIOS, and for those few nerds it gives them something to brag about and probably inspires some loyalty.
Speaking purely of gains vs losses, my guess is it's a LOT cheaper for them to just mass produce 6970s and have a very small number of people flash the BIOS than it is for them to make two separate cards.
Exactly - which is why they do it. Otherwise they'd need to create a separate production line for the 6950s as well, which costs a lot more in the long run than just disabling parts of one chip. They do it with TVs as well.
VoX
2nd January 2011, 03:03 AM
And it lets them use chips that don't quite cut it (i.e. one core of the CPU is a bit weak, so a tri-core it becomes).
Colonel Mitch
2nd January 2011, 03:07 AM
And it lets them use chips that don't quite cut it (i.e. one core of the CPU is a bit weak, so a tri-core it becomes).
Usually they're only damaged if the laser cutting went slightly wrong, which should be a very small %.
But with these cards they don't cut them off like CPUs, and they didn't separate the pipes in hardware, so every card should be unlockable, though some may struggle with 6970 memory speeds, as the RAM is apparently the only non-identical component.
VoX
2nd January 2011, 03:12 AM
True, but that small % could still impact overall turnover if they were just binned, because while it's a small %, we're talking big numbers.
Yeah, but you could always downclock the memory slightly; there'd still be an improvement on the core.
Colonel Mitch
2nd January 2011, 03:15 AM
True, but that small % could still impact overall turnover if they were just binned, because while it's a small %, we're talking big numbers.
Yeah, but you could always downclock the memory slightly; there'd still be an improvement on the core.
Oh yeah, I didn't mean they should scrap them; I meant the chance of having a non-unlockable dual/tri-core is slim.
VoX
2nd January 2011, 03:23 AM
Ahh yeah, I see what you mean. Tbh, people seem to be able to unlock pretty much anything given a little extra voltage.
Isphera
2nd January 2011, 04:47 AM
I couldn't find a 'Needs more voltage' image, so this will have to do
http://images.cheezburger.com/completestore/2009/9/11/128971632384374957.jpg
Colonel Mitch
2nd January 2011, 05:29 AM
I couldn't find a 'Needs more voltage' image, so this will have to do
http://images.cheezburger.com/completestore/2009/9/11/128971632384374957.jpg
Yays ^^
Waiting for TechPowerUp to release their updated version of RBE (Radeon BIOS Editor) so I can modify it with more voltage and higher clocks :D
Before Vox mentions temps - it runs at whatever temp it wants by default with the fan on auto. If I max the fan it literally sounds like a fighter jet, but even the GPU that's active for Aero and Windows stays at around 35°C when idle, in a room at 24°C. I tested it fully loaded with Folding@home and, though I can't really remember exactly, I think it barely made it to 50°C.
VoX
2nd January 2011, 01:29 PM
Tbh I stopped caring about temps long ago, I just put headphones on and let the auto fan do its thing while making my room nice and warm :D