Just take the DVI-I to DVI-D cable, plug the DVI-I end into the monitor, and plug the DVI-D end into the computer. For whatever reason, manufacturers typically put DVI-I ports on their monitors.
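The fit rule is simple: a DVI-D plug's digital pins exist on every DVI-I port, but a DVI-I plug carries four extra analog pins that a DVI-D port has no holes for. Here is a minimal sketch of that rule (the function name and the two-connector model are my own illustration, ignoring DVI-A):

```python
def plug_fits_port(plug: str, port: str) -> bool:
    """Return True if a male DVI plug physically fits a female DVI port.

    Rule of thumb: a plug with analog pins (DVI-I) needs an
    analog-capable port; a digital-only plug (DVI-D) fits either.
    """
    has_analog_pins = {"DVI-I": True, "DVI-D": False}
    return not (has_analog_pins[plug] and not has_analog_pins[port])

print(plug_fits_port("DVI-D", "DVI-I"))  # True: digital plug fits combined port
print(plug_fits_port("DVI-I", "DVI-D"))  # False: analog pins have nowhere to go
```

This is why a DVI-I to DVI-D cable works in exactly one orientation: the DVI-I end goes to the DVI-I port, the DVI-D end to the DVI-D port.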
Anyway, the reason there are so many different types of connections is that the industry is constantly evolving and improving on old standards. Some monitors need special versions of a connection to support high refresh rates or high resolutions.
Video cards, on the other hand, need to support all the options customers might want. If a customer is using a modern TV, chances are they need HDMI, so cards include HDMI ports. If the customer has a 120 Hz 3D Vision monitor, they need a Dual-Link DVI-D connection. If the customer has an older monitor with a VGA (D-sub) connection, they need an analog signal, so cards include DVI-I ports, which can drive analog VGA through a simple adapter. AMD started using DisplayPort to support multiple monitors and high refresh rates, so DisplayPort outputs are now common on video cards as well.
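The matching described above boils down to a small lookup table. A hedged sketch (the display categories and the helper name are illustrative, not an exhaustive spec):

```python
# Which video-card output each kind of display typically calls for,
# per the cases discussed above. Illustrative values only.
PORT_FOR_DISPLAY = {
    "modern TV": "HDMI",
    "120 Hz 3D Vision monitor": "Dual-Link DVI-D",
    "older VGA/D-sub monitor": "DVI-I (with VGA adapter)",
    "multi-monitor or high refresh rate setup": "DisplayPort",
}

def recommended_output(display: str) -> str:
    """Return the output a card should offer for a given display type."""
    return PORT_FOR_DISPLAY.get(display, "check the monitor's inputs")

print(recommended_output("modern TV"))  # HDMI
print(recommended_output("older VGA/D-sub monitor"))  # DVI-I (with VGA adapter)
```

A card vendor effectively has to cover every row of a table like this, which is why you see three or four different connector types on the back of one card.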
As new hardware comes out with new needs, new connections are needed.