GTX770 and Triple-Monitor:
Performance PLUMMETS with three monitors, which is why many people run a 2x GTX 770 (SLI) setup. Here's an EXAMPLE (5760x1080 is three 1920x1080 monitors side by side):
http://www.bit-tech.net/hardware/2013/05/31/nvidia-geforce-gtx-770-2gb-review/5
Crysis: 56FPS (1920x1080) vs 21FPS (5760x1080)
BF3: 91FPS vs 35FPS
BioShock Infinite: 81FPS vs 28FPS
Skyrim: 120FPS vs 45FPS
So you would have to LOWER THE QUALITY quite a lot to achieve 60FPS, or else game at a much lower frame rate, which defeats the point of buying three monitors (a BETTER experience).
My graphics card is nearly identical to yours and I have almost every game (damn Steam sales). Despite my 2560x1440 resolution I play MOST games at 1920x1080 because most, such as Skyrim, look almost EXACTLY the same, while moving up to 1440p at the same quality settings costs about 30% in performance.
So I have a nice, high-res DESKTOP environment, the 27" screen fills my field of view quite nicely, and I game EVERY game at 60FPS with the HIGHEST quality or really close to it. And again, many games do look quite a bit better at 2560x1440.
Here's an interesting COST breakdown in DOLLARS:
1) 2nd GTX 770 - $400
2) 3x 1920x1080 monitors - $500
That's the cost of a good 2560x1440 monitor!! Anyway, not trying to pressure you, just giving you some facts. It's really the PERFORMANCE thing that kills you with a single card.
You really should have MORE than 2GB of VRAM per GPU for 5760x1080 as well. Even setting aside that one 770 isn't enough, many games suddenly DROP in performance when VRAM runs out.
OTHER:
*Read about the new monitors in 2014 that support G-Sync. Unfortunately a little expensive to start. Just thought you'd be interested: http://www.engadget.com/2013/10/18/nvidia-g-sync/
For example:
It would probably be a 120Hz screen. NVidia drivers would cap the output to the monitor's maximum of 120Hz (120FPS). You might play a game and have the frame rate average 90FPS, cap at 120FPS, but drop as low as 30FPS.
G-Sync just has the GPU output each frame as soon as it's rendered, and the screen displays it right away. Normally the graphics card puts the finished frame in a BUFFER while it waits for the next refresh cycle (i.e. 1/60th of a second). This causes a bit of LAG. If you turn VSYNC OFF instead, you get screen tearing.
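The buffering lag above is easy to see with a toy calculation (a minimal sketch, assuming an idealized fixed 60Hz display and zero scan-out time; the function names are mine, not any real API):

```python
import math

REFRESH = 1 / 60  # fixed 60Hz refresh interval, in seconds

def vsync_display_time(render_done):
    """With VSYNC on, the finished frame sits in the buffer until the
    next refresh tick, so it appears at the next multiple of 1/60s."""
    return math.ceil(render_done / REFRESH) * REFRESH

def gsync_display_time(render_done):
    """With G-Sync, the monitor refreshes the moment the frame arrives."""
    return render_done

# A frame that finishes 1ms after a refresh tick waits ~15.7ms with VSYNC:
done = REFRESH + 0.001
print(f"vsync lag:  {(vsync_display_time(done) - done) * 1000:.1f} ms")
print(f"g-sync lag: {(gsync_display_time(done) - done) * 1000:.1f} ms")
```

So in the worst case VSYNC adds almost a full refresh interval of lag, while the G-Sync-style path adds none.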
It would also pretty much eliminate micro-stutter, especially in SLI. Rather than complicated buffering, each GPU alternately sends its finished frame straight to the monitor to display.
This would also be perfect for a new HDTV for game consoles.
**If you have ANY questions on PCs, go ahead as I have some time. I recently picked up playing Fallout 3 and New Vegas again after fixing the stutter completely using the iFPSClamp method.