More Penryn benchmarks!

http://www.divx.com/divx/windows/codec/

Looks like for video encoders, Penryn will dramatically improve performance.
  1. Nice find ^^
  2. Awesome find! :trophy: :D
  3. Quote:
    http://www.divx.com/divx/windows/codec/

    Looks like for video encoders, Penryn will dramatically improve performance.


    It's good to know there is at least one program ready to take advantage of SSE4. ;)

    Now, we need more media encoding programs/codecs to follow suit!

    Does anybody know whether SSE4 provides any instructions that can be beneficial to gaming?
  4. Nice find. I wonder how Barcelona's SSE4 will go...
  5. Even with the previous benchmarks it was evident that SSE4 brings a dramatic improvement in DivX encoding. Even a dual-core Penryn was able to beat the current Core 2 Quad in DivX encoding.
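
    The instruction that most likely explains that jump is SSE4.1's MPSADBW, which was designed for exactly the sum-of-absolute-differences searches encoders do during motion estimation. Whether DivX actually uses it this way I can't say; the sketch below is purely illustrative, and the function name and buffer layout are made up:

        #include <smmintrin.h>   /* SSE4.1: _mm_mpsadbw_epu8 */
        #include <stdint.h>

        /* Compare one 4-pixel group against 8 neighbouring positions in a
           reference row and get all 8 SADs from a single instruction.
           'ref16' must have at least 16 readable bytes. */
        static void sad_8_offsets(const uint8_t *cur4, const uint8_t *ref16,
                                  uint16_t out[8])
        {
            __m128i cur = _mm_cvtsi32_si128(*(const int32_t *)cur4); /* low 4 bytes */
            __m128i ref = _mm_loadu_si128((const __m128i *)ref16);
            /* imm8 = 0: bytes 0..3 of 'cur' slide over bytes 0..10 of 'ref' */
            __m128i sads = _mm_mpsadbw_epu8(ref, cur, 0);
            _mm_storeu_si128((__m128i *)out, sads);
        }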

    nice find anyways... :trophy: :trophy: :trophy:
  6. Quote:
    Does anybody know whether SSE4 provides any instructions that can be beneficial to gaming?


    After a quick look, it doesn't appear so. There aren't really any instructions aimed at gaming; game code doesn't follow the kinds of regular data patterns these extensions target, and it's pretty simplistic code. There are some SSE2 (I think) instructions that added a much quicker way of moving/copying memory, which would help quite a few engines I've seen. Maybe when a more complex problem comes along for gaming there will be something; ray tracing maybe ;)

    http://softwarecommunity.intel.com/articles/eng/1247.htm
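
    If it helps, here's roughly what I mean by the quicker memory moves. This is just a sketch using SSE2's non-temporal stores (MOVNTDQ), with made-up names, assuming 16-byte-aligned buffers and a size that's a multiple of 16:

        #include <emmintrin.h>   /* SSE2 */
        #include <stddef.h>

        /* Bulk copy using non-temporal stores, which bypass the cache and
           avoid polluting it with data that won't be reused soon. */
        static void copy_nt(void *dst, const void *src, size_t n)
        {
            char *d = (char *)dst;
            const char *s = (const char *)src;
            for (size_t i = 0; i < n; i += 16) {
                __m128i v = _mm_load_si128((const __m128i *)(s + i));
                _mm_stream_si128((__m128i *)(d + i), v);   /* MOVNTDQ */
            }
            _mm_sfence();   /* make the streaming stores visible to other cores */
        }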
  7. Quote:
    http://www.divx.com/divx/windows/codec/

    Looks like for video encoders, Penryn will dramatically improve performance.


    It's good to know there is at least one program ready to take advantage of SSE4. ;)

    Now, we need more media encoding programs/codecs to follow suit!

    Does anybody know whether SSE4 provides any instructions that can be beneficial to gaming?

    I don't believe SSE4 benefits gaming. But Penryn does contain a better shuffling engine that greatly improves SSE performance. Expect to see a 40% improvement in games based upon known benchmarks.
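
    To give an idea of what that shuffle engine touches: anything built on in-register swizzles, like the little PSHUFB example below (purely illustrative), should run in a single pass through Penryn's full-width shuffle unit, where current Core 2 chips split it into several micro-ops, as I understand it:

        #include <tmmintrin.h>   /* SSSE3: _mm_shuffle_epi8 (PSHUFB) */

        /* Reverse the 16 bytes of a vector with one PSHUFB -- the kind of
           in-register swizzle the new shuffle unit speeds up. */
        static __m128i reverse_bytes(__m128i v)
        {
            const __m128i rev = _mm_set_epi8(0, 1, 2, 3, 4, 5, 6, 7,
                                             8, 9, 10, 11, 12, 13, 14, 15);
            return _mm_shuffle_epi8(v, rev);
        }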
  8. Quote:
    Nice find. I wonder how Barcelona's SSE4 will go...


    Has it been confirmed that Barcy will even have SSE4?
  9. Sounds good; however, they're comparing SSE2 with SSE4 performance, and that is a bit like comparing a Pentium II with a Pentium 4... a silly parallel, I know, but SSE2 is considered to have brought the least improvement of the whole SSE series; SSE3 is much better, and I'd like to see how SSE4 compares to SSE3.
  10. Quote:
    Sounds good; however, they're comparing SSE2 with SSE4 performance, and that is a bit like comparing a Pentium II with a Pentium 4... a silly parallel, I know, but SSE2 is considered to have brought the least improvement of the whole SSE series; SSE3 is much better, and I'd like to see how SSE4 compares to SSE3.


    I think they had to compare SSE4 performance to SSE2 because DivX doesn't use SSE3. I'm not a programmer, but if I remember correctly, the focus of SSE3 was not video encoding. Not to mention SSE3 was weak to begin with, thus necessitating SSSE3. If someone has more insight into the use of SSE3, that would be appreciated.

    I read through the release notes on the DivX site and found no mention of SSE3, but I did find mention of SSE2 optimizations. Those go back to March 2002, when DivX 5 was released.

    DivX release notes

    Ryan
  11. I am kind of confused now about video encoding, but I know for sure that in rendering I got ~16% more performance when my renderer went from SSE2 to SSE3.
  12. Quote:
    Sounds good; however, they're comparing SSE2 with SSE4 performance, and that is a bit like comparing a Pentium II with a Pentium 4... a silly parallel, I know, but SSE2 is considered to have brought the least improvement of the whole SSE series; SSE3 is much better, and I'd like to see how SSE4 compares to SSE3.

    SSE2 was arguably the biggest, as it supported double precision calculations and effectively replaced x87 going forward, at least for those looking for maximum performance.
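
    For example (just a toy sketch with made-up names), this is the kind of work that moved off x87 and onto SSE2, two double-precision operations per instruction:

        #include <emmintrin.h>   /* SSE2: packed double precision */

        /* Add two double arrays two elements at a time; n is assumed to be
           a multiple of 2. */
        static void add_arrays(double *out, const double *a, const double *b, int n)
        {
            for (int i = 0; i < n; i += 2) {
                __m128d va = _mm_loadu_pd(a + i);
                __m128d vb = _mm_loadu_pd(b + i);
                _mm_storeu_pd(out + i, _mm_add_pd(va, vb));
            }
        }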
  13. So what's the rumor on when Intel is going to be rolling out these 45nm chips? I'm hoping I can get one for my DX10 upgrade by the end of this year.
  14. Quote:
    Sounds good; however, they're comparing SSE2 with SSE4 performance, and that is a bit like comparing a Pentium II with a Pentium 4... a silly parallel, I know, but SSE2 is considered to have brought the least improvement of the whole SSE series; SSE3 is much better, and I'd like to see how SSE4 compares to SSE3.

    SSE2 was arguably the biggest, as it supported double precision calculations and effectively replaced x87 going forward, at least for those looking for maximum performance.
    :oops: You're right; the one that sucked was the original SSE.
  15. Which renderer are/were you using?

    I'm still looking for a good description of what SSE3 was for. All I can remember is that thread synchronization was a big part of it, to assist Hyper-Threading (Prescott's SMT was better than Northwood's). I also remember 3D gaming and voice recognition being mentioned as well. SSE3 is only 13 instructions, so how much could it possibly do? (Compared to SSE at 70 instructions, SSE2 at 144, and SSE4 at 47.) Beyond that, we need a programmer here who is familiar with SSE3 to set the record straight.

    Ryan
  16. I am talking about Blender, its internal renderer more precisely.
  17. That makes sense then. You are referring to a 3D renderer. I thought maybe you meant a video renderer like After Effects. As I mentioned before, 3D applications were supposed to benefit from SSE3.
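
    The usual example given is the horizontal adds SSE3 introduced (HADDPS): something like the 4-element dot product below, which is purely illustrative and not taken from any real codec or renderer:

        #include <pmmintrin.h>   /* SSE3: _mm_hadd_ps (HADDPS) */

        /* Dot product of two 4-float vectors using SSE3's horizontal add. */
        static float dot4(const float *a, const float *b)
        {
            __m128 prod = _mm_mul_ps(_mm_loadu_ps(a), _mm_loadu_ps(b));
            __m128 sum  = _mm_hadd_ps(prod, prod);   /* pairwise sums */
            sum         = _mm_hadd_ps(sum, sum);     /* total in every lane */
            return _mm_cvtss_f32(sum);
        }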

    Ryan
  18. Intel should be worried and not underestimate the people who brought us the amazing

    [image]

    and this incredible roadmap...

    [image]

    But the botched marketing made a debauchery of it; the way they chopped this image says a lot about their expertise...

    [image]

    It would be nice if this chart were of AMD's performance benchmarks showing scalability, but their debt is scaling quite well...

    [image]

    AMD's hope lies here... not that AMD lies...

    [image]

    but they need this technology, it seems, to overcome Penryn...

    [image]

    too bad it seems they have been wasting monies on these boobs...
  19. best damn post ever, Rich.

    HAHAHAHA :lol:

    even better than the benchies themselves

    :trophy: :trophy: :trophy: