This thread is pretty hilarious. For people who are into hardware and gaming, it's surprising how misinformed a few of the posts are.
I have dealt with flicker and high refresh/FPS quite a lot in my life because my eyes are overly sensitive. As a result, I can tell you that refresh rate does NOT equal FPS at all. Over 99% of humans can't see flicker at 30 FPS. I am one of the unlucky few who can (just barely), so a slight flicker registers, especially when the FPS is lower. Most people can't see the difference between a 75Hz and a 100Hz refresh, but a small number can. Many more can see the difference between 60Hz and higher, as that's the normal threshold for being able to detect flicker.
200 FPS is a hilarious marker, people. It's a number used for testing equipment in a vacuum; no human can see anything close to that. Trust me, I am an outlier and I peak in the 40s at most. 200 FPS is something a graphics card can muster in a benchmark, and there would be no visible difference to us between that and 100 FPS. The only difference shows up when you put the card under stress (heat, bottlenecks, etc.) and performance gets pushed way down. When your 200 FPS video card is under heavy load, perhaps it will show 35 FPS while a lesser card drops to 20 FPS, and then you'll notice the difference. Otherwise, go brag to your friends; it's useless in the real world.
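To put rough numbers on that (my own back-of-the-envelope math, not anything from a benchmark): going from 100 to 200 FPS only shaves 5 ms off each frame, while going from 20 to 35 FPS saves over 20 ms per frame, which is why the second jump is the one you actually feel.

# Rough frame-time math, purely for illustration:
# frame time in milliseconds = 1000 / FPS
for fps in (200, 100, 35, 20):
    print(f"{fps:>3} FPS -> {1000 / fps:.1f} ms per frame")

# 200 FPS -> 5.0 ms, 100 FPS -> 10.0 ms (a 5 ms gap)
#  35 FPS -> 28.6 ms, 20 FPS -> 50.0 ms (a 21 ms gap)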
In any case, LCDs don't refresh in anything like the same way CRTs did, and that whole discussion hasn't caught up in common talk, so it's hard to get people on the same page. 60Hz can look very different on different screens because their vertical/horizontal refreshes are synced differently.
And I'm using an LCD over a DVI connection that's refreshing at 100Hz right now, so unless something has changed since the earlier posts, DVI can support over 60Hz (although someone mentioned that it drops, and I'm not sure if that's the case).
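For anyone wondering how that can work, here's a rough sanity check (the 165 MHz single-link limit is the spec figure; the ~20% blanking overhead is just a ballpark assumption I'm making, since the real overhead depends on the timing standard):

# Rough check of whether a video mode fits under single-link DVI's 165 MHz pixel clock.
SINGLE_LINK_MHZ = 165.0
BLANKING_OVERHEAD = 1.20  # assumed ~20% extra pixels for blanking intervals

def pixel_clock_mhz(width, height, refresh_hz):
    # pixel clock = visible pixels per frame * refresh rate, plus blanking overhead
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

for w, h, hz in [(1280, 1024, 100), (1280, 1024, 60), (1600, 1200, 100)]:
    clock = pixel_clock_mhz(w, h, hz)
    verdict = "fits single-link" if clock <= SINGLE_LINK_MHZ else "needs dual-link"
    print(f"{w}x{h} @ {hz}Hz -> ~{clock:.0f} MHz ({verdict})")

So at a typical LCD resolution like 1280x1024, 100Hz stays under the single-link limit; it's only the bigger resolution/refresh combinations that push you into dual-link territory.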