I've been watching some Digital Foundry videos, and it seems like console versions of games always get (for lack of a better phrase) "torn up" for inconsistent frame rates, while the PC versions shown have wildly inconsistent frame rates too (with VSYNC off).
Why is this? Why does it seem like when a console game's FPS dips, it's a bad thing, but when a PC game's FPS dips, it's never mentioned as negatively?
I'm guessing it has to do with the display: TV versus computer monitor. Maybe FPS dips on TVs are just more detrimental to the gaming experience than FPS dips on monitors.
I'm just curious. I'm coming over from console to PC, but I couldn't help noticing this trend when comparing console vs. PC ports.
Edit: BTW, I'm not insulting DF. I like their videos. Just had a question after watching some of their stuff.