Neverwinter is built on a modified version of the Cryptic game engine, an offshoot of the code that powers Champions Online and Star Trek Online. Even so, these are the best visuals we've seen from the company to date.

The art style is vibrant and exaggerated. It's slightly cartoonish, but far more realistic than World of Warcraft. The shading, models, and textures are much prettier than Turbine's Dungeons & Dragons Online, even if they lag far behind the artwork in modern first-person shooters.

The Cryptic engine gives you a lot of control over graphics quality settings. The first tweak worth discussing is the render scale detail slider, which appears to control the resolution of the game's rendered output. For example, if you're playing at 1920x1080 and lower the render scale to 50%, the result looks like 960x540. The only advantage over actually lowering the resolution is that interface elements like buttons and text boxes are still displayed at 1920x1080. This isn't a setting we see in many games, but it's featured in the Cryptic engine (and in other games designed to run on low-end hardware, such as Dota 2).

Plain and simple, we don't like lowering the render scale at all: it has a profound impact on clarity, and it only helps graphics performance if the CPU isn't already the bottleneck. Neverwinter's built-in defaults depend on the video card it detects, and whoever set them up appears to favor high details and a low render scale. We couldn't disagree more, so we're leaving the render scale at 100% for all of our benchmarks and dropping the detail settings instead.
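To make the render scale arithmetic concrete, here is a minimal sketch of how a scale percentage maps to an internal render resolution. It assumes a simple linear scale on each axis; the function name is ours, not Cryptic's.

```python
# Hypothetical illustration of a render scale slider. The display
# resolution stays fixed (so the UI is crisp), while the 3D scene is
# rendered to a smaller internal target and upscaled.
def effective_resolution(width, height, scale_percent):
    """Return the internal render target size for a given display
    resolution and render scale percentage."""
    factor = scale_percent / 100.0
    return int(width * factor), int(height * factor)

# The example from the text: 1920x1080 at a 50% render scale draws the
# scene at roughly 960x540, a quarter of the pixels.
print(effective_resolution(1920, 1080, 50))  # (960, 540)
```

Because the pixel count falls with the square of the scale, even a modest reduction cuts GPU load substantially, which is why the slider only helps when the GPU, not the CPU, is the bottleneck.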

For the most part, we're sticking to the minimum, medium, and maximum detail settings. We did notice that the character detail distance slider has a significant impact on frame rates, and we chose to reduce it to 50% for our medium detail benchmarks. This does mean that it's easier to see the transition from low- to high-detail models as they get closer. But without this concession, frame rates often dip too low, especially when we use mid-range processors. The good news is that anti-aliasing and anisotropic filtering have little effect on this platform-limited title, so we added 2x AA and 8x AF to the medium benchmark configuration.

Clearly, the biggest leap in visual fidelity comes from jumping from the lowest-level settings to our middle configuration. The highest options yield little improvement over medium, despite the lower frame rates (though the character detail difference isn't noticeable at the higher-end options, which we like).

The game looks pretty crude at low detail, but it's a lot more attractive at medium detail settings.
- Never Say Neverwinter Again
- Image Quality And Settings
- Test System And Graphics Hardware
- Results: Low Quality, 1280x800
- Results: Low Quality, 1920x1080
- Results: Medium Quality, 1920x1080
- Results: Medium Quality, 5760x1080
- Results: High Quality, 1920x1080
- Results: CPU Benchmarks
- Neverwinter: Lots Of Fun, Despite The CPU Bottleneck
It's running a 3317U w/HD 4000 and 4GB RAM on Win8 @ 1600x900, and it runs w/o issues on minimum settings (100% scale, 50% high-res character draw distance).
No exact numbers to report, but I can run around the main city (which, being the central congregation point for everyone, tends to be one of the laggier spots) without issues. Sure, it doesn't look the best by any stretch, but it's workable without a doubt in a pinch.
http://www.tomshardware.com/reviews/neverwinter-performance-benchmark,3495-4.html
http://www.tomshardware.com/reviews/neverwinter-performance-benchmark,3495-5.html
i could believe fx8350 sinking itself to core i3 level performance (it's kinda fx8350's routine) but hd4000 significantly outperforming radeon 7660g in min., avg., and frame time variance? with dual core i5 vs quadcore a10 even...
how would an overclocked i5 3550 or 3570k or fx6300 fare in this game?
A good measure of how badly the 4600M is limiting performance would be to give the 5800K a run with its integrated graphics - there's a significant clock speed difference.
You obviously missed where they said it is unrelated to Neverwinter Nights - different studio, totally different game. Neverwinter is merely a place in the Forgotten Realms. So bringing up 'waaaahhh, I want NWN3' is rather pointless here.
And I don't see how an MMO based on the Forgotten Realms ruins everything. Why can't you have both this AND NWN3? Just don't play this one, and play what you want...
I'm sorry, but this comparison is ABSOLUTELY WRONG. Yes, it's easier to trade for pay-to-play content in Neverwinter, but you then say that this is far better than DDO, where you can't...
Except that you can. Playing even a little bit will give you favor with certain patrons. As you get more of this favor, you are AUTOMATICALLY given "Turbine points", which is the currency you buy with money. You can earn everything in the game just by playing; sure, it'll take a little while, but I'm sure that Neverwinter's solution will too.
So don't make a claim that's completely wrong, please. The "review" parts on the game felt so biased it's not even funny.
I sure wish AMD was still in this competition so that we wouldn't have things like Haswell being essentially the same CPU as Ivy Bridge with a few tweaks and a better useless section that gets instantly disabled if you want to actually play games.
Should read:
"But don't bother with a dual-core CPU that is Hyper-Threaded."
or
"But don't bother with a dual-core CPU that has Hyper-Threading."
or
"But don't bother with a dual-core CPU that uses Hyper-Threading."
Thanks
My 1090t @ 4.0GHz on max detail and a single 7950 gives 60+ FPS in explorable areas and mid 40's in populated towns. Happy to say it performs well, much better than the APU in this test.
This game, like Guild Wars 2, seems to rely heavily on a single core render pipe. 2-3 other supporting cores will get used, but at a reduced level. So the intel CPUs will do better thanks to stronger single core performance.
The problem is, single-core development is 5 years old. No matter what CPU you use, modern games will bottleneck there when developed this way. Optimizing to take *FULL* advantage of 4+ cores will greatly improve performance for everyone, intel or AMD.
Because CPU frame latency results are so consistently low (well below 15 milliseconds), they're insignificant. The only time I've seen them be an issue is with APUs, and that's covered.
I know it's getting a little long in the tooth, but I would have loved to see an i7-920 included in the test. I am now very interested to see how my system compares as far as numbers go. While playing, I haven't felt like the game was running slow at all, but I haven't actually looked at fps numbers, nor have I changed any of the graphics settings from default.
Actually, *all* of those are incorrect due to our omission.
Fixed to:
"But don't bother with a dual-core CPU that isn't Hyper-Threaded."
You're absolutely right, we neglected to mention Turbine points. That was an oversight that's been fixed with the following edit:
[edit: we should clarify that DDO does allow you to earn Turbine Points in game, but they're relatively difficult to accumulate. More importantly, DDO requires players to purchase access to game content, but all of the content is free in Neverwinter]
Having said that, if you've played DDO you know that it's not viable to expect access to all the content simply by playing and accumulating Turbine points for "a little while". They're much, much harder to come by than Astral Diamonds.
Astral Diamonds are much easier to accumulate, but the real point here is that you don't need them at all to access every mission in the game. That's a huge distinction.
I've played DDO for years with friends who won't play anything else, so I'm not without experience. It's just not realistic to suggest you can have access to all content in DDO without using actual cash. In Neverwinter, it's not an issue.
I would disagree. Your accusation seems a bit vague. Aside from the Turbine point mistake that we fixed, what "review" parts are you referring to?
Geezzz ... what a horribly coded game. 5% difference between a 650ti and an HD 7970?