I've recently read that image quality on low-end graphics cards (let's say an HD 7750) is worse than on high-end cards (an HD 7870) when a game's settings are set to high, even with the same system specs and the same resolution, and ignoring framerate.
I would appreciate it if someone could confirm this and explain why, or point me to an article, as I've been unable to find any information on the subject.
They claim that you won't see the same effects, shadows, and reflections on ultra (in short, the same quality) with a 7750 as with a 7870.
As if the 7750 simply couldn't display a game's full ultra-setting beauty.
Thank you very much for your answers, and sorry for my insistence and basic questions.
The only reason it wouldn't look the same in some games is that the 7750 wouldn't give playable FPS at those settings, so you'd have to turn them down. Other than that, at identical settings, the output will look very, very similar (there can be minor driver-dependent differences, but those aren't differences in quality; more like slight variations in color tone or the like).