# Cheap video card to play Tomb Raider?



## DrRingDing (May 30, 2013)

I would like something cheap as chips to play Tomb Raider on.

I've got an i5 processor with 16GB RAM.

Any ideas?

I'd like 2nd hand.


----------



## UnderAnOpenSky (May 30, 2013)

What's cheap?


----------



## ruffneck23 (May 30, 2013)

how much is cheap ?


----------



## stuff_it (May 30, 2013)

Find out what sort of connector you need and get one off ebay. 

Why do you want used? Unless you have an ancient PC chances are you can afford a new one.


----------



## DrRingDing (May 30, 2013)

stuff_it said:


> Why do you want used? Unless you have an ancient PC chances are you can afford a new one.


 
I'm hoping to get more for my money.

£50 - £100. I'd much prefer it to be around the £50 mark.


----------



## ruffneck23 (May 30, 2013)

http://www.ebuyer.com/396969-gigaby...al-dvi-hdmi-pci-e-graphics-card-gv-n650oc-1gi

will probably run it


----------



## stuff_it (May 30, 2013)

DrRingDing said:


> I'm hoping to get more for my money.
> 
> £50 - £100. I'd much prefer it to be around the £50 mark.


 
Look on ebay still. Choose a well rated seller and a brand anyone has ever heard of. So long as it's genuine and does what it says on the tin you should be fine. 

FWIW I have yet to find a modern game that doesn't run ok on my getting on for ancient (in gaming terms) GeForce GT 320, my computer only has 6 GB of RAM as well.


----------



## UnderAnOpenSky (May 30, 2013)

I'd look for something like an ATI 6850 or 6870.


----------



## UnderAnOpenSky (May 30, 2013)

stuff_it said:


> Look on ebay still. Choose a well rated seller and a brand anyone has ever heard of. So long as it's genuine and does what it says on the tin you should be fine.
> 
> FWIW I have yet to find a modern game that doesn't run ok on my getting on for ancient (in gaming terms) GeForce GT 320, my computer only has 6 GB of RAM as well.



Depends on how pretty you want it doesn't it? 

6GB is more than you need for gaming.


----------



## Sunray (May 30, 2013)

Far Cry 3 and Tomb Raider could be considered graphically similar.

http://hexus.net/tech/reviews/graphics/53365-nvidia-geforce-gtx-650-ti-boost-2gb/?page=8

So passably playable at HD, drop that and you'll be fine.


----------



## Chz (May 30, 2013)

I know you'd prefer to keep it at the £50 mark, but I'd really recommend a 2GB version of a 650Ti or 7790 as the very minimum if you intend to keep it for any length of time. These are sufficient for 1080p gaming with some of the eye candy turned down. Getting anything less saves money, but what the heck is the point if it leaves you with an unsatisfying experience? It's a false economy unless you truly can't afford better.


----------



## treelover (May 30, 2013)

Believe it or not, GTX 560s and 570s are going for 85 quid on eBay as gamers upgrade to Titans, etc.


----------



## treelover (May 30, 2013)

Sunray said:


> Far Cry 3 and Tomb Raider could be considered graphically similar.
> 
> http://hexus.net/tech/reviews/graphics/53365-nvidia-geforce-gtx-650-ti-boost-2gb/?page=8
> 
> So passably playable at HD, drop that and you'll be fine.


 
I don't get this. I only have a 460 and I can play most games at high fidelity.


----------



## Chz (May 30, 2013)

I obviously picked a good time a couple of months back to flog my 460. Got £80 for it on Ebay then. 

From experience, a 460 can't even play GTA4 very well at high detail, and that's hardly new. It's still a capable card, about the level of a 7770 or so. You need to go to a 660 (which I did) or 7870 to get a worthwhile upgrade. But then you get to have all the eye candy. New games are sooo beautiful, but you won't see it without the hardware. The 1GB of memory is quite limiting with 1080p resolutions and high def textures.


----------



## treelover (May 31, 2013)

Ah, my rez is 1680 by 1050, not HD then, but can run all games, though Witcher 2 can't use uber sampling


----------



## Firky (May 31, 2013)

HD is a bit surplus IMO, I never really notice the eye candy after ten minutes of playing a game so turn it off for a little performance boost.


----------



## Chz (May 31, 2013)

Most LCD screens get abominably blurry at anything other than native resolution, so turning down the resolution isn't an option for most people.

Treelover, that's the same as my screen. Moving to a 660 was still a big boost. It changes you from "runs tolerably with some/most eye candy on" to "like butter with nearly everything on". And only nearly everything because some games throw in a stupid option or two just to please people with £1000 video cards. Factoring in what I sold the 460 for, it was a worthwhile £80 upgrade.


----------



## UnderAnOpenSky (May 31, 2013)

It's been a bit of a pain for me jumping to 1080p tbh, as it shows I need an upgrade. This won't be a cheap one, as I'll need to do the CPU/mobo/RAM whilst I'm at it.


----------



## Chz (May 31, 2013)

Don't sweat the CPU too much these days. I run a Lynnfield Core i5 and it's amazing how nothing short of video encoding is CPU limited with a nearly 4 year-old CPU. (granted one that I've tweaked to run at 4GHz on one core and down to 3.33 for all four) The teenager's machine next to me runs a Sandy Bridge Pentium for a CPU, and it's not noticeably slower in anything. Well it is, but that's the old 5400RPM "Green" HDD in it vs. my SSD. Everything's perfectly fast once it's got off the disk and into RAM.


----------



## UnderAnOpenSky (May 31, 2013)

Mine is looking a bit older than that; it's an E8400 @ 3.6GHz with a 5850. Some things still run sweet, but some stuff is starting to look juddery. Got an SSD already. It's a little frustrating as everything else I use it for is buttery smooth.


----------



## FaradayCaged (May 31, 2013)

treelover said:


> Ah, my rez is 1680 by 1050, not HD then, but can run all games, though Witcher 2 can't use uber sampling


 

That is a higher definition than 720p (1280x720) but not 1080p (1920x1080), so it could be considered high def in a way; it just doesn't conform to either of those two standards, which were made for TV/films in 16:9, whereas yours is a 16:10 ratio, which is more common in monitors.


----------



## Epona (Jun 4, 2013)

Global Stoner said:


> Mine is looking a bit older than that; it's an E8400 @ 3.6GHz with a 5850. Some things still run sweet, but some stuff is starting to look juddery. Got an SSD already. It's a little frustrating as everything else I use it for is buttery smooth.


 
I know I bang on about this constantly, but I find the 5850 in my secondary PC tends to run hot and can lose performance as a result; completely cleaning the fan blades every 3 months at minimum is a lifesaver. What stuff do you find is juddery? Just curious, because it's the exact same GPU in my other PC (well, the OH's PC), his CPU is a Phenom II 3.4GHz, and he hasn't experienced any problems or judderiness.

Edit to add: I thought I'd mention it just in case - If you play Skyrim and experience 'stuttering' which is common in a modded game, that isn't due to insufficient hardware, it's due to the way RAM is allocated to running .esp files that can cause massive performance issues if you have mods installed that add new worldspace, or loads of objects to exterior cells. I find housing mods to be the worst culprit (that pretty garden out the front can reduce me to a slideshow whenever I enter the cell), but quest mods can also put a bit of a strain on. It's a game engine issue (allocation of resources), not a hardware issue (lack of resources!) The only fix is to start a new game with fewer mods activated!


----------



## Quartz (Jun 10, 2013)

Global Stoner said:


> Mine is looking a bit older than that; it's an E8400 @ 3.6GHz with a 5850. Some things still run sweet, but some stuff is starting to look juddery. Got an SSD already. It's a little frustrating as everything else I use it for is buttery smooth.


 

Your CPU looks fine. Things to check should be your VRAM usage and your RAM usage. VRAM is the memory directly on the graphics card, Video RAM, and very fast. If you run out of VRAM, the PC will have to page stuff from the graphics card to main memory, which is slow. If your PC runs out of RAM it will page to the SSD, assuming that's where your pagefile resides, which again is slow (but not as slow as to a HDD). These days I suggest 8 GB RAM on the motherboard, and a video card with at least 2 GB VRAM. That said, I think there's a good chance that the new games consoles with their unified memory structures will mean that the next generation of video cards will have much more memory. And that's not counting the move to 4K resolution.
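To make the VRAM point above concrete, here's a rough back-of-the-envelope sketch of where 1GB of video memory goes at 1080p. The figures are illustrative assumptions, not measurements from any particular game:

```python
# Rough, illustrative VRAM arithmetic at 1920x1080.
# All figures are assumptions for illustration, not measured values.

def framebuffer_bytes(width, height, bytes_per_pixel=4, buffers=3):
    """Colour buffers (e.g. triple buffering) at 32 bits per pixel."""
    return width * height * bytes_per_pixel * buffers

fb = framebuffer_bytes(1920, 1080)
print(f"Framebuffers at 1080p: {fb / 2**20:.0f} MiB")  # ~24 MiB

# Most VRAM goes on textures: one uncompressed 2048x2048 RGBA texture
# is 16 MiB, so a few dozen high-res textures alone can fill a 1GB
# card, at which point the driver starts paging over the (slow) bus.
tex = 2048 * 2048 * 4
print(f"One 2048x2048 RGBA texture: {tex / 2**20:.0f} MiB")
```

The exact numbers vary hugely with texture compression and extra render targets; the point is just that 1GB fills up quickly at 1080p, which matches the advice to look for 2GB cards.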


----------



## UnderAnOpenSky (Jun 12, 2013)

Quartz said:


> Your CPU looks fine. Things to check should be your VRAM usage and your RAM usage. VRAM is the memory directly on the graphics card, Video RAM, and very fast. If you run out of VRAM, the PC will have to page stuff from the graphics card to main memory, which is slow. If your PC runs out of RAM it will page to the SSD, assuming that's where your pagefile resides, which again is slow (but not as slow as to a HDD). These days I suggest 8 GB RAM on the motherboard, and a video card with at least 2 GB VRAM. That said, I think there's a good chance that the new games consoles with their unified memory structures will mean that the next generation of video cards will have much more memory. And that's not counting the move to 4K resolution.


 

Thanks Quartz. I think it's just moving to 1080p at the same time as the graphics card is looking a few years old. I've only got 4GB of RAM, but it's really not worth adding more as it's DDR2. My 5850 only has 1GB of VRAM, but there's not really much to be done about that either. However, looking at benchmarks, I'd need to spend a fair bit to get something that would give a worthwhile upgrade.

I know there is always something better round the corner, but think I'll try and hang on six months until the new consoles are out and we've seen what they do to machine specs. I wonder if they will make the 8 core AMD chips more competitive? Can't see me playing games in 4k this decade, but 1440p would be nice for my next monitor upgrade.


----------



## Chz (Jun 12, 2013)

Well, the more powerful of the two consoles - the PS4 - essentially has a HD7870 in it. Take that for whatever it's worth.


----------



## UnderAnOpenSky (Jun 12, 2013)

Chz said:


> Well, the more powerful of the two consoles - the PS4 - essentially has a HD7870 in it. Take that for whatever it's worth.


 

I'll wait for the next generation of cards I think. 

The new consoles also have 8-core AMD chips in them... I wonder if that will make AMD more competitive for gaming as games get optimized for all those extra cores?


----------



## Chz (Jun 12, 2013)

I wouldn't especially think so. There may be 8 of them, but they're fairly slow cores by desktop standards. A quad-core i5 should still outrun it on just about anything.


----------



## Quartz (Jun 12, 2013)

Global Stoner said:


> I've only got 4GB of RAM, but it's really not worth adding more as it's DDR2.


 
It's £56 for 2x 2GB from Crucial. But a full revamp in the new year sounds like a good idea.



Chz said:


> Well, the more powerful of the two consoles - the PS4 - essentially has a HD7870 in it. Take that for whatever it's worth.


 

It's not really the same as the programmers can have direct or near-direct access, whereas on a PC there are many layers to traverse. I expect the PS4's performance to handily exceed a PC with a HD7970 once developers have mastered the platform.


----------



## Chz (Jun 12, 2013)

Except that they don't. It's still through AMD's API, which is no different to the PC one. This generation of consoles is different to what we've seen before. The underlying graphics hardware is far too complex to program "on the metal".

And a 7970? That's delusional even if you could write bare-metal code. Graphics work isn't branchy CPU code, so there aren't the same opportunities for that sort of speed-up.


----------



## Quartz (Jun 12, 2013)

Chz said:


> Except that they don't.


 

So you're disagreeing with John Carmack?


----------



## Chz (Jun 12, 2013)

I'm telling you that the Xbone and the PS4 don't work the same way. 

Though he didn't really prove his point with Rage, did he?


----------



## Quartz (Jun 12, 2013)

Chz said:


> I'm telling you that the Xbone and the PS4 don't work the same way.


 

We'll see soon enough.


----------



## Edward Kelly (Jun 12, 2013)

Firky said:


> HD is a bit surplus IMO, I never really notice the eye candy after ten minutes of playing a game so turn it off for a little performance boost.


same here..and tbh don't miss it.


----------



## Epona (Jun 13, 2013)

Oh and forgot to mention that OH plays the latest Tomb Raider on max settings on his PC with my old HD 5850 graphics card (I swapped it out for a 2GB GTX 670 in my own PC) - it's not a game that is massively taxing or requires the latest hardware to play.

If buying an ATI/AMD graphics card, bear in mind that the numbering is not always a case of bigger number = better card because the way they number their cards is slightly odd, at least to the buyer. A 5870 is better performance than a 6870, so check reviews and performance charts before buying.

Here's a very useful page about graphics card performance comparisons:

http://www.videocardbenchmark.net/


----------



## Chz (Jun 13, 2013)

Quartz said:


> We'll see soon enough.


The devkits have been out for ages. There's no "soon enough".




> A 5870 is better performance than a 6870


That's a real oddity though. In general it is true that a 77xx is faster than a 67xx. But the real performance difference usually lies in the second number rather than the first one. There needs to be a several generation difference before x8xx doesn't outperform x7xx.


As for Tomb Raider, a 6870 (still a pretty decent card!) can't even pull a 30fps average at Ultra settings and a 1680x1050 screen res. It's actually a very demanding game at Ultra settings.


----------



## Epona (Jun 15, 2013)

Chz said:


> The devkits have been out for ages. There's no "soon enough".
> 
> 
> That's a real oddity though. In general it is true that a 77xx is faster than a 67xx. But the real performance difference usually lies in the second number rather than the first one. There needs to be a several generation difference before x8xx doesn't outperform x7xx.
> ...


 
No, it's not true. The example you give, the 6870, is actually the series replacement for the 5770, and is lower performance than any 5800-series card. ATI/AMD cards come in two series - a high-performance series (model numbers starting 58xx or 69xx and so on) and a budget series (model numbers starting 57xx or 68xx and so on). The high-performance series is priced higher and outperforms the budget series. You've misunderstood if you think the first two digits of the model number are unimportant in terms of performance. My 5850 outperforms a 6870.

ATI's GPU numbering system has been this way for a while now - we all know it's hard for the consumer to wrap their head around, but it's the way they've decided to do it.
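The scheme above can be sketched as a tiny classifier. The tier sets below cover only the two generations discussed in this thread and are taken from the description above, not from any official AMD naming spec:

```python
def tier(model):
    """Classify an ATI/AMD HD 5000/6000-series model by tier.
    Covers only the pattern described above: 58xx/69xx = performance
    series, 57xx/68xx = budget series (illustrative, not exhaustive)."""
    prefix = str(model)[:2]
    if prefix in {"58", "69"}:
        return "performance"
    if prefix in {"57", "68"}:
        return "budget"
    return "unknown"

# A bigger model number doesn't mean a faster card across generations:
print(tier(5850))  # performance
print(tier(6870))  # budget, despite the larger number
```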


----------



## Quartz (Jun 15, 2013)

Chz said:


> The devkits have been out for ages. There's no "soon enough".


 
The devkits aren't the final silicon. Indeed, for a release date of the end of this year, I would expect the final silicon to have only just been finalised. You do know that the demo systems at E3 may well have been PCs running Windows 7, don't you? And, allegedly, Nvidia's Geforce cards to boot! True, I don't expect the first round of games to be optimised, but once developers get their hands on the actual kit, then we'll see significant performance improvements.


----------



## Epona (Jun 16, 2013)

Quartz said:


> The devkits aren't the final silicon. Indeed, for a release date of the end of this year, I would expect the final silicon to have only just been finalised. You do know that the demo systems at E3 may well have been PCs running Windows 7, don't you? And, allegedly, Nvidia's Geforce cards to boot! True, I don't expect the first round of games to be optimised, but once developers get their hands on the actual kit, then we'll see significant performance improvements.


 
Yeah I have heard from numerous other sources that the xbox One demos at E3 were actually run on PCs with Win 7 or 8 and nVidia graphics cards (given that they don't actually have an xbox one yet!), so not a surprise at all to hear it on Urban also!


----------



## Chz (Jun 16, 2013)

Sony was most definitely using PS4 devkits though. The hardware is quite similar to the Xbox. Sony's is even a bit more powerful. It's just that Sony had their order in before MS and GF only has so much capacity. For god's sake, it's running a CPU you can buy today with a GPU that's just like one you can buy today (but with a slightly different number of pipeline clusters).


----------



## Delroy Booth (Jun 16, 2013)

Global Stoner said:


> Mine is looking a bit older than that; it's an E8400 @ 3.6GHz with a 5850. Some things still run sweet, but some stuff is starting to look juddery. Got an SSD already. It's a little frustrating as everything else I use it for is buttery smooth.


 
If it's a 5850 then you should be able to overclock the tits off it no problem, it's a good card that.


----------



## Epona (Jun 16, 2013)

Delroy Booth said:


> If it's a 5850 then you should be able to overclock the tits off it no problem, it's a good card that.


 
5850 is good but I do have a slight heat issue with mine in my 2nd PC (and also had a heat issue when it was in this PC) - it's slightly overclocked (not as OC'd as my GTX670 mind you), but you'd better have good fans on it if you want to push it. Use a utility to monitor GPU temperature during use.


----------

