7970 vs 670 - Page 2

in Graphics & Displays
January 6, 2013 2:17:18 AM

Go big or go home dude :p 
January 6, 2013 2:19:01 AM

bigshootr8 said:
Go big or go home dude :p 
:lol: 
January 6, 2013 2:22:38 AM

Just get the little brother of the Vapor-X 7970 GHz: the 7950 Vapor-X. I'm telling you, my son has one. It's an excellent price for an extremely good gaming card that will play pretty much everything you throw at it at high to ultra settings.
January 6, 2013 9:50:10 AM

BigMack70 said:
other physics engines e.g. Havok are just as good and aren't constrained to a single party's GPU.


What engine is used is irrelevant. You don't tell Gearbox which to use - they decide for themselves. They chose PhysX. And you can't prevent them (or any other developer) from doing the same in the future.

redeemer said:
PhysX and Adaptive Vsync are gimmicky features that make no difference at all.


You obviously don't know how adaptive v-sync works. If you have basic v-sync enabled on a card that would be putting out 55fps, it'll put out 30fps (if a frame can't be rendered in ~16.7ms for 60Hz, it's held over until the next refresh cycle). 55fps vs 30fps is an 83% performance gain.

EDIT: To clarify, I should have said "your argument for what physics modelling could have been used is irrelevant." I was on my way out, so it was a fast post.
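The basic v-sync arithmetic above can be sketched in Python. This is a simplified model assuming double buffering on a 60Hz display; the numbers are illustrative, not measurements:

```python
import math

REFRESH_HZ = 60  # typical monitor refresh rate

def effective_fps(render_fps: float, refresh_hz: int = REFRESH_HZ) -> float:
    """Displayed framerate once basic v-sync quantizes frame delivery.

    A frame that misses the ~16.7 ms deadline is held over to the next
    refresh cycle, so each frame occupies a whole number of cycles.
    """
    frame_time = 1.0 / render_fps          # time to render one frame
    refresh_period = 1.0 / refresh_hz      # one refresh cycle
    cycles = math.ceil(frame_time / refresh_period - 1e-9)
    return refresh_hz / cycles

print(effective_fps(55))                   # a 55 fps card is held to 30.0
print(effective_fps(75))                   # above refresh: capped at 60.0
print(f"{55 / effective_fps(55) - 1:.0%}") # the 83% gap mentioned above
```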
January 6, 2013 10:04:11 AM

So what you're saying is that v-sync, the way it works, makes a more abrupt framerate change than adaptive v-sync does.

Either way, it gets away from the topic at hand, and this guy has his solution figured out. I see what Sam's doing: he's trying to push back against the fanboyism that the AMD folks on the forums show, but it won't help this thread from either side.
January 6, 2013 12:12:50 PM

Ah, I didn't realise the guy had already chosen :-) I normally read all posts, but there have been a ton of new posts since I was last on here and I was just about to go out, so fast response. It's worth understanding adaptive v-sync though, because it's definitely not just a gimmick (the classic response of the AMD fanboy: if it's an NV feature AMD lacks, then it's a gimmick :-)). I'll admit that 55 vs 30 is a worst-case scenario for basic v-sync; at 45fps the GeForce would have only a 50% lead, and at 35fps, a 16% lead. But it's well worth having over a 30fps limit.

Anyway, to explain: v-sync (as in basic v-sync) is a technique used to manage framerates, so that if your framerate exceeds your monitor's refresh rate (typically 60Hz), you won't experience screen tearing (Google "screen tearing" images and you'll see what I mean). All cards support basic v-sync, including Radeons. The drawback is that if your card drops below 60fps, even just slightly, v-sync will reduce the framerate significantly.

To sync with the monitor's refresh rate, a frame either has to be rendered in about 16.7ms or less (for 60fps) or it's held over to the next refresh cycle. One frame over two refresh cycles (on a 60Hz monitor) results in 30fps. So even if your card would be capable of 50-55fps, you drop to 30fps! V-sync is really bad for performance if your GPU can't stay above 60fps, but it's needed if you sometimes go over 60fps and so need to prevent tearing.

Adaptive v-sync delivers smooth, fluid performance, since it won't restrict your framerate the way basic v-sync does. With adaptive v-sync, when you drop below 60fps, v-sync just switches itself off, so you get 55fps instead of 30fps. As soon as you reach 60fps again, adaptive v-sync switches itself back on. It sounds like a very simple solution, but it must be hard to build into the drivers, because AMD still hasn't done it (and it took nVidia years as well).
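The on/off behaviour described above can be sketched in Python. This is a simplified model of the policy, not NVIDIA's actual driver logic, and the 60Hz threshold is an assumption:

```python
import math

REFRESH_HZ = 60  # assumed monitor refresh rate

def displayed_fps(render_fps: float, adaptive: bool) -> float:
    """Framerate the player actually sees under each v-sync mode."""
    if render_fps >= REFRESH_HZ:
        return float(REFRESH_HZ)  # synced: capped at the refresh rate
    if adaptive:
        return render_fps         # below refresh: v-sync switches itself off
    # basic v-sync: each frame is held over to a whole number of refresh cycles
    return REFRESH_HZ / math.ceil(REFRESH_HZ / render_fps)

for fps in (75, 55, 45):
    basic = displayed_fps(fps, adaptive=False)
    smart = displayed_fps(fps, adaptive=True)
    print(f"{fps} fps card -> basic v-sync: {basic:g}, adaptive: {smart:g}")
```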

You can read about it in detail at http://hardocp.com/article/2012/04/16/nvidia_adaptive_v...

"With Adaptive VSync turned on, the feeling of the game felt smoother compared to regular VSync turned on. The performance felt much like the game felt with VSync turned off. This is the kind of technology we like to see which has improved the innate nature of the gameplay experience. If all you need is 60 FPS in a game for it to be playable, then why not just go ahead and cap the game there so it doesn't exceed your refresh rate. Then, if the game has to fall below that, allow the game to perform at its real-time actual framerate, and Adaptive VSync allows that. It really is the best of all worlds, with no drawbacks. We didn't find any negatives to using Adaptive VSync, and we tried it out in a handful of games."
January 6, 2013 12:29:54 PM

sam_p_lay said:
What engine is used is irrelevant. You don't tell Gearbox which to use - they decide for themselves. They chose PhysX. And you can't prevent them (or any other developer) from doing the same in the future.


Do you know how many upcoming games are currently set to use PhysX? By my count, two of any consequence: Metro: Last Light and Hawken. Now, I can't speak for Hawken, but the PhysX in Metro 2033 runs OK off the CPU, and it's reasonable to expect the same from Metro: Last Light.
January 6, 2013 12:39:15 PM

BigMack70 said:
Do you know how many upcoming games are currently set to use PhysX? By my count, two of any consequence: Metro: Last Light and Hawken. Now, I can't speak for Hawken, but the PhysX in Metro 2033 runs OK off the CPU, and it's reasonable to expect the same from Metro: Last Light.


How many of the existing PhysX games did we know would use PhysX well before their release? You may be right. In fact, there may never be a single new game from now on that uses PhysX. But there might be a few. There might be a whole load. I can't see the future, though, and neither can you.

For me, it's really important that I'm playing the game the way the developer intended it to be played (isn't that why it's important to us to play games at max settings?). If the developer implemented PhysX for the physics modelling (and there's no option to get the same effects via Havok or an in-house implementation), then PhysX is the way they intended it to be seen.

I played Borderlands 2 and the existing DLCs on a Radeon, but I'll be waiting until I get a GeForce to play the remaining DLCs, so I can play them the way Gearbox intended them and so I can see them at their best.
January 6, 2013 12:45:23 PM

sam_p_lay said:
You may be right.


I am right.

And it's never a secret surprise when a game uses GPU-accelerated PhysX... Nvidia's marketing team is always all over it.

Also, I'm enjoying maxed out Borderlands 2 and Metro 2033 (both PhysX games) just fine :bounce: 
January 6, 2013 12:49:19 PM

Please don't make me argue common sense with you. You can't count future events, because they haven't happened yet. Don't let this argument descend into idiocy.
January 6, 2013 12:52:44 PM

I fail to see how you think that in the next year or so there's going to suddenly be all these surprise PhysX games... and I fail to see how you can bring up "common sense" after making the ridiculous claim that PhysX is "gaining momentum".

And if you're thinking about games more than a year out that aren't even announced yet, then the discussion is completely irrelevant, as we don't even know if current cards will be able to max the games that come out then or not, irrespective of PhysX.
January 6, 2013 12:57:33 PM

I'm pretty sure I said "seems to be gaining momentum". As in: barely any big PhysX titles a few years ago; in the last couple of years, some undeniably big titles. I've seen you do this before, selectively editing what people say. Give it a rest; anyone can scroll up and see what I actually wrote. And as for a GTX 670/7970 not handling games more than one year into the future... are you joking?
January 6, 2013 1:02:30 PM

Ah OK well let me apply some common sense to the situation:
With just 27 titles total and only two of those being relevant titles in development, PhysX seems to be going nowhere fast.

I've seen you do this before... repeating Nvidia's marketing line about PhysX over and over and over. Give it a rest.

Also, the GTX 580, released at the end of 2010/early 2011, cannot max out all games smoothly. It's entirely possible that in another year, there will be games that a 7970/680 cannot max out smoothly.
January 6, 2013 1:03:44 PM

In this case I would go for the Asus DirectCU 670; it gives you excellent overclocking and cooling potential that's unmatched by any other manufacturer. Considering your resolution is 1080p or around that, you won't need more than a 670. However, the 7970 does offer higher memory bandwidth and an extra gigabyte of VRAM, and the new 12.11 drivers are fantastic for the 7000 series cards. It's really up to you which path you want to go.
January 6, 2013 1:11:36 PM

BigMack70 said:
Ah OK well let me apply some common sense to the situation:
With just 27 titles total and only two of those being relevant titles in development, PhysX seems to be going nowhere fast.

I've seen you do this before... repeating Nvidia's marketing line about PhysX over and over and over. Give it a rest.

Also, the GTX 580, released at the end of 2010/early 2011, cannot max out all games smoothly. It's entirely possible that in another year, there will be games that a 7970/680 cannot max out smoothly.


Kepler was a major new architecture, with significant gains from the GTX 680 over the GTX 580 compared to the GTX 580 over the GTX 480 (since the 580 was just a refinement of Fermi). The next series will be a refinement of what's out now. I don't think these cards will be struggling a year from now. Anyway, as for games the GTX 580 can't max at 1080p, can you list more than three?
January 6, 2013 1:12:16 PM

And don't tell me about repeatedly recommending nVidia hardware when you do exactly the same for AMD. Why don't you give it a rest?
January 6, 2013 2:04:50 PM

Or look for that Club 3D 7950 that seems to have Godly Overclocking abilities!
January 6, 2013 2:38:19 PM

Thanks guys. I was gonna go with the 670, but then I realized that since the 12.11 drivers, the 7970 has been faster than the 680. I also realized that since I'll be modding Skyrim and Minecraft a lot WHILE recording, the extra VRAM paired with the 12.11 drivers would be awesome.
Yes, I agree: go big or go home.
Ahhh, so I guess I'll have to save up a bit longer.

And one more thing: since I'll be overclocking the 7970, and the CPU (3570K) to 4.2GHz, will a 550W PSU be enough for the job? Or should I look into a 650W? I won't be doing extreme overclocking with the 7970.

Thanks for all the help guys :o 
January 6, 2013 3:28:20 PM

650 watt.
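A rough back-of-envelope budget behind that kind of recommendation. The wattages below are assumed spec-sheet figures (board power / TDP, not measured draws), and the headroom factors are guesses:

```python
# Assumed spec-sheet figures; real draw varies by board and overclock.
components_w = {
    "HD 7970 (stock board power)": 250,
    "i5-3570K (stock TDP)": 77,
    "motherboard, RAM, drives, fans": 75,
}
oc_headroom = 1.25   # assume ~25% extra for CPU + GPU overclocks
load_target = 0.80   # keep the PSU at or below ~80% of its rating

peak_draw = sum(components_w.values()) * oc_headroom
recommended = peak_draw / load_target
print(f"estimated peak draw: {peak_draw:.0f} W")
print(f"comfortable PSU rating: {recommended:.0f} W")
```

On those assumptions a 550W unit is cutting it close, which is why 650W is the safer call.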
January 6, 2013 3:45:36 PM

redeemer said:
650watt

Any specific company?
January 6, 2013 6:08:30 PM

sam_p_lay said:
And don't tell me about repeatedly recommending nVidia hardware when you do exactly the same for AMD. Why don't you give it a rest?


You should read before you accuse me of things I don't do.

I don't try to evangelize AMD, I try to recommend the appropriate card based on what people say their desires/needs are. I just have very, very, VERY low tolerance for people who spit back stock marketing and/or fanboy lines from one company or another, and posts like yours espousing the awesomesauce of PhysX almost always fall in that category.

The PhysX nonsense is no different than AMD fanboys who jump on the vram/bus width bandwagon and use that to bash all GTX 6xx cards even if the data shows that the card performs fine in its given price class.

It's no secret that I am not personally a fan of the Kepler cards, and you can look through my posting history from months back if you want explanations for that. I personally tend to alternate brands each build, and my preference leans slightly to Nvidia overall, but not when they pull some of the nonsense they tried with GK104.

People hop on forums to get something more than the marketing BS they can find at Nvidia and AMD's websites. And the way PhysX gets marketed by Nvidia and their fans is almost nothing but BS.
January 6, 2013 6:18:54 PM

Guys, stop arguing about it. It's a feature you get with NV; some like it, some don't. No need to get into a big debate about it.
January 6, 2013 6:23:09 PM

bigcyco1 said:
Guys, stop arguing about it. It's a feature you get with NV; some like it, some don't. No need to get into a big debate about it.


You just like pushing my buttons about this :kaola: 
January 6, 2013 6:26:45 PM

BigMack70 said:
You just like pushing my buttons about this :kaola: 
I can't lie i sure do. :lol:  ;) 
January 6, 2013 6:43:09 PM



Nooooooooooooooooooooooooooooooooo! Stay away, stay FAR away from a unit like that... Bigcyco's recommendation was a much much much much better one. The safest bet is to get a Corsair or Seasonic power supply from one of their midrange or high end product lines.

You want a quality unit; you absolutely do NOT want to just find the cheapest unit that has the right wattage on the label. This is doubly important since you want to overclock a decent amount. Cheap units don't deliver the stable power that your components need, and especially when you start talking about overclocking this can cause your components to die much faster than they should.

If money is a concern, you would be better off in the long run spending a little extra on the PSU to get something nice and dropping down a bit on the GPU to something like one of the other 7970s or the 7950 Vapor-X (which was a pretty good suggestion).
January 6, 2013 6:45:47 PM

BigMack70 said:
Nooooooooooooooooooooooooooooooooo! Stay away, stay FAR away from a unit like that... Bigcyco's recommendation was a much much much much better one. The safest bet is to get a Corsair or Seasonic power supply from one of their midrange or high end product lines.

You want a quality unit; you absolutely do NOT want to just find the cheapest unit that has the right wattage on the label. This is doubly important since you want to overclock a decent amount. Cheap units don't deliver the stable power that your components need, and especially when you start talking about overclocking this can cause your components to die much faster than they should.

If money is a concern, you would be better off in the long run spending a little extra on the PSU to get something nice and dropping down a bit on the GPU to something like one of the other 7970s or the 7950 Vapor-X (which was a pretty good suggestion).

Alright, will do.
January 6, 2013 6:46:08 PM

You wouldn't put $10 components in a $20,000 car, so why would you do so in a computer? :) It's far better to buy a quality power supply: you'll see less ripple and have cleaner, more stable power throughout your computer. His recommendation was sound.
January 6, 2013 6:50:02 PM

lol, these emoticons from you are killing me, man, hahah!
And something that bothered me a little: normally, games that have PhysX will run it on low with AMD cards, processed by the CPU, and as you said, most of the eye candy is only seen on high. For example, with Borderlands 2 the highest you can get on an AMD card is low. Furthermore, I have Metro 2033 and I don't notice the PhysX in it; I know it's there, but I don't notice it, so maybe they'll improve on it. But again, it's just a feature I prefer in a game because I enjoy the visuals in the games I have. Is it good for the industry? Probably not. It would be nice for both companies to have the ability to render via their own physics engine.
January 6, 2013 6:59:00 PM

bigshootr8 said:
For example, with Borderlands 2 the highest you can get on an AMD card is low.


:heink: 
...

[:pdxalex]

...
........
.............

[:lutfij:6]
January 6, 2013 7:26:21 PM

What, you just don't listen? Why do you think Big throws it in your face, haha! If you're going to judge something only because you can run it on "low", then quit it; it's a matter of opinion.
January 6, 2013 7:42:21 PM

^that confused me XD
January 6, 2013 7:48:12 PM

Okay, with PhysX, some games will implement it for AMD, but only on low, because it's being processed by the CPU. And there are effects you only start seeing once you get to high. That's the picture I was painting, at least with Borderlands 2; you can't touch PhysX in the Batman games with an AMD card at all.
January 6, 2013 7:48:36 PM

But again, for what you're doing, for the millionth time: the 7970 is better for you, sir.
January 6, 2013 7:48:49 PM

bigshootr8 said:
What, you just don't listen? Why do you think Big throws it in your face, haha! If you're going to judge something only because you can run it on "low", then quit it; it's a matter of opinion.


PhysX runs just fine on "high" in Borderlands 2 on my system, and the consensus around the interwebs is that even when the setting gets grayed out (which apparently happens sometimes on AMD rigs), you can still set it to medium through the .ini. Stop making things up.
January 6, 2013 7:50:40 PM

Right, and when you go into the .ini file and set it to medium, what is it drawing from? Huh? The freaking CPU, lol :p Nice try. And you'd need a hacked driver to really push PhysX on the GPU anyway, ha!
January 6, 2013 8:10:56 PM

So you aren't running PhysX off the GPU; you're running it off your CPU, gotcha. And the frames are somewhat acceptable for you. I suppose that makes sense; a DirectX 9 game should run somewhat okay on an overclocked 2600K.
January 6, 2013 8:13:25 PM

Exactly... I do get minimum fps around 30 in hectic areas/fights, as I've said. It's playable then, but not really smooth, obviously. If I drop PhysX to "medium", I can maintain minimum framerates closer to 50 and it's smooth.
January 6, 2013 8:16:30 PM

Right, I've read that as well: people who go down to medium on AMD cards get more playable settings. Now don't get me wrong, I'm not really NVIDIA vs AMD; I'd rather games be built around optimizing both AMD's Havok and NVIDIA's PhysX if a game were to use them. I just prefer the games I play not to offload that onto the CPU. With the Batman games (which will lock you out of it if you don't have an NVIDIA card), I like the PhysX; others like yourself may not, shrugs.
January 6, 2013 8:55:20 PM

Batman:AC pretty much needs an Nvidia GPU for the PhysX to be turned up to meaningful levels - anyone for whom that's an important game should be using an Nvidia GPU.

When I played through it, I just left PhysX off, I think (maybe used it on low, can't remember).
January 6, 2013 9:24:23 PM

When I was using my ATI 5770, I wasn't able to turn it on at all. I mean, all these games can be enjoyed with no PhysX; it's just something I enjoy when it's in a game, whether it's an AMD-based physics engine or Nvidia's PhysX.
January 7, 2013 5:33:49 PM

Nvidia PhysX = unstoppable!
January 7, 2013 6:37:18 PM

I know, Big, you make the AMD girls/men go crazy with your embedded links :p
January 7, 2013 9:00:14 PM

Keep it up by the way :-D It makes them so angry when there's video evidence!