Technically, it'll run, but there's a good chance that trying to get it to play decently will be too frustrating to deal with. To go over your parts, and why each one is a bad sign for this game:

- CPU: The minimum requirements specify a CPU equivalent to a 2.0GHz Pentium 4 - about equal to an Athlon XP 2100+ (had such a CPU existed). A weak CPU hurts this game in ways you really can't adjust for by changing the settings; no matter what, you're stuck running the same AI and object-prioritization code, which will drag your framerate below 10 in most cities. You'll come to dread going there. Likewise, larger battles (such as once you get near Oblivion gates) will slow you down quite a lot - NOT ideal when you're in combat.
- RAM: 512MB is technically enough, though the game REALLY wants 1GB. With only 512MB, expect frequent and lengthy loading pauses - figure a 30-40 second pit stop for every 200 feet you travel outdoors - and the game could very well crash at load points.
- Video card: Unfortunately, GeForce cards before the GeForce 8 don't age well at all. The 6200 isn't as bad as the GeForce FX series, but nVidia still favored texturing over pixel shaders, and Oblivion is exactly the sort of game that punishes that, piling shaders onto every surface to produce its uber-realistic effects. The card DOES have 256MB of VRAM, which is technically enough to load the "large" textures, but that means larger shader maps as well, which is not good. I'd recommend running at 640x480, killing shadows entirely, and turning the grass/tree settings as low as they'll go. THEN it could very well run tolerably outdoors.
- Motherboard: Okay, maybe I can't really call the motherboard itself a bad sign. Except that it's Socket A, supports only up to a 333MHz FSB (no 400MHz Bartons for you), and uses AGP graphics. In my experience, AGP video cards take longer to load textures in Oblivion than PCI Express ones do. Plus, AGP cards are more expensive.
We fooled around in the lab once and tried to get Oblivion running on somewhat similar specs. You know, just for sh*ts and giggles. Well, the system gagged at several points like a cat trying to hack up an extra-large hairball. Even with the settings toned down, I don't advise it.
There's a mod called Oldblivion; I think it's made to run the game on slower computers, or on SM1.0 cards, dunno.
Yes, there is that, and I did think of it when writing, but I'd point out that performance-wise it's really only of use to GeForce FX cards, and potentially some other third-party chipsets like the Volari V8 or S3 Chrome.
The GeForce 6 series doesn't really lack for shader power: it's fully SM 2.0 and 3.0 compliant, and unlike the GeForce FX series it has one shader unit for every texture unit, rather than one shader unit for every two texture units, which gives the 6200 plenty of power. Using Oldblivion would merely decrease image quality, with a negligible impact on performance.
The real killer here is likely the low memory bandwidth, especially with how the architecture handles things like foliage, which tends to be the biggest source of performance woes on GeForce 6 and 7 cards.
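To put rough numbers on that bandwidth gap, here's a quick back-of-the-envelope calculation - peak memory bandwidth is just bus width times effective memory clock. The specific clocks and bus widths below are approximate and vary between board vendors (the 6200 shipped in several configurations), so treat them as illustrative rather than exact:

```python
def bandwidth_gb_s(bus_width_bits: int, effective_clock_mhz: int) -> float:
    """Peak memory bandwidth in GB/s (1 GB = 1e9 bytes).

    bandwidth = (bus width in bytes) * (effective transfers per second)
    """
    return bus_width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

# Typical budget GeForce 6200 config (approximate): 64-bit bus, ~500 MHz effective DDR
print(bandwidth_gb_s(64, 500))    # 4.0 GB/s
# A mid-range GeForce 6600 GT for comparison: 128-bit bus, ~1000 MHz effective
print(bandwidth_gb_s(128, 1000))  # 16.0 GB/s
```

Roughly a 4x gap, which is why bandwidth-heavy scenes like dense foliage hit the low-end cards so much harder than their shader specs would suggest.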