Guest
I've started to wonder how efficiently the Athlon XP uses SSE instructions. Only very rare software even seems to notice the possibility.
SSE is used with AXP processors in, for example, the Intel-optimized DivX 4.11 codec, which seems to speed up quite a bit.
But in gaming it looks like a totally different case. Does DirectX 8.1 automatically use SSE when running on an AXP CPU?
And can 3DNow! and SSE be used simultaneously?
The next line is an excerpt from my wolfconfig.cfg file (Return to Castle Wolfenstein):
<b>seta r_lastValidRenderer "GeForce2 GTS/AGP/3DNOW!"</b>
I suppose this implies that SSE _is_not_used_.
Hopefully some Intel users can cut+paste here their equivalent for this row so that we can be sure.
If anyone has more information about this issue, or just wants to correct my spelling, please go ahead, the stage is yours.
tyia
-raaggu