If you want the amp requirement on paper rather than measuring actual draw:
The formula for amperage is:
Watts / Volts = Amps
So you need to know how many watts your GPU draws under load (look up your card's maximum power consumption in benchmark reviews online), then divide by 12 volts, since the +12V rail is what mainly powers the GPU. That gives you the amps you need on the +12V rail of your PSU.
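The math above can be sketched in a few lines of Python. The 170 W figure below is just an example load (roughly a GTX 560 Ti's rated TDP); substitute whatever benchmarks report for your own card.

```python
def required_amps(load_watts: float, rail_volts: float = 12.0) -> float:
    """Amps = Watts / Volts, for the PSU rail that feeds the GPU."""
    return load_watts / rail_volts

# Example: a card drawing ~170 W under load (hypothetical figure)
amps = required_amps(170)
print(f"{amps:.1f} A needed on the +12V rail")  # ~14.2 A
```

In practice you'd add some headroom on top of this, since the rest of the system also pulls from the +12V rail.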
He's talking about the different cards.
The GTX 560 and GTX 560 Ti are entirely different cards. (Ti stands for titanium; it's the better one.)
The AMP editions are highly overclocked versions of the card.
GTX 560 < GTX 560 AMP < GTX 560 Ti < GTX 560 Ti AMP. There are better cards in that price range, though.