[citation][nom]c123s456[/nom]Firewire was slow to adoption because when it came out it was inferior to USB. Intel had nothing to do with that =/[/citation]
FireWire was slow to be adopted for many reasons:
1) It was more expensive to manufacture than USB 1.1.
2) Its connectors and controller chips were larger than USB's.
3) Very few devices demanded or needed FireWire's 200Mbps (it could actually go up to 400Mbps, but most controllers were limited to 200Mbps until 1394a was released). USB's 12Mbps was more than adequate for most devices at the time.
4) By the time USB2 came out against FireWire's refresh (1394a, which is still oddly the standard for most FW devices in spite of its age and the vast superiority of 1394b), USB had already won the standards war for peripherals, and it was backwards compatible with a multitude of older devices, so people stuck with what they were familiar with. USB2 was superior for small file transfers, while FW400 was faster for large sequential transfers, so FireWire became almost entirely dedicated to video camera connectivity, plus a select few pro audio devices and HDD enclosures.
5) And this is the most important reason: FireWire earned its name because it could literally burn out your equipment. The unregulated bus power runs at a high voltage, and the connector is poorly designed, so the power pins can bridge and fry your (expensive) video cameras and HDD interfaces. And if you burn out the controller on your mobo, it can instantly fry every piece of equipment you plug into it (ask me how I know, lol). USB was simply safer to use, so manufacturers used it.
6) FW800 changed too much and has become nearly entirely useless. USB3, eSATA, Thunderbolt, and even HDMI make for better transfer mediums, so FW800 died before it even reached the mainstream PC market... doomed forever to be that one odd mystery port on Macs that never gets used.
FireWire was (until recently) always superior to USB on paper. It had much faster data rates, much less overhead in its transfers, and a much better ability to power devices, so fewer power adapters were needed. But (like Betamax) it just goes to show that the 'superior' technology is not always 'better', and it certainly has no right to survive in the market when it fails in other areas (like cost and design flaws)... Intel should take note of this with DP/Thunderbolt...