I'm hoping someone can clear this up for me, because I don't have the resources to test for myself. I'd like empirical data, but theory will also be welcome.
If you put 2 IDE drives on the same channel, what (if any) performance implications are there?
I've heard the following:
1) Both drives will function at the speed of the slowest drive, even if only one is in use. I find this unbelievable - I've had a HDD and a CDROM sharing the same channel and I was getting more than 7MB/s off the HDD.
2) Both drives will be limited to the ATA standard of the lowest. For example, an ATA-100 compliant HDD on the same channel as an ATA-33 compliant optical drive will function as an ATA-33 HDD, with a limit of 33MB/s shared between both drives. I see how this could be possible, but I don't think it's true.
3) The only effect is on access time, as only one request for access can be processed at a time. If a request for access for one drive comes in while a request for access to the other drive is being processed, the second request will have to wait until the first has been processed. This is an issue if a HDD and an optical drive are on the same channel, due to the relatively long access time of optical drives. I know this used to be the case, but I thought it no longer applied.
4) Using two drives on the same channel simultaneously is extremely inefficient. Some people have said this is because it's actually impossible and what is happening is time-slicing between the two drives. Others say that there are some vaguely specified overheads in using two drives on one channel that drastically reduce performance.
5) It has no effect at all, unless the combined transfer rates of the drives exceed the practical limit of the motherboard (e.g. probably about 80MB/s for ATA-100).
I'm not sure which of those is true. I can see an effective way to test it - take two HDDs, set them up as a striped RAID on separate channels, and benchmark. Then put them on the same channel and benchmark again. Ditto with some copying, to see the possible effect of reading from one drive and writing to the other at the same time. Unfortunately, I don't have the hardware to do that.
1) This is true for some older controllers that cannot do device-independent timing. They default to the UDMA mode (not the actual transfer speed) of the slower drive. Modern controllers do not have this problem.
2) See 1)
3) Is true. As I see it, the largest disadvantage of having a UDMA33 drive and a UDMA100 drive on the same cable is the slower drive's inefficient use of the bandwidth. Suppose the UDMA100 drive transfers about 10MB/s. That will use about 10% of the available bandwidth. However, if the UDMA33 drive does the same, it will use about 30% of the available bandwidth. Further, if the UDMA33 drive is in the middle of transferring a block of data and a HD request occurs, the request will have to wait until the slower drive's transfer completes.
4) see 3)
5) see 1) - 4) and judge for yourselves
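The occupancy figures in 3) follow from how long each drive holds the cable: a drive delivering some sustained rate occupies the channel in bursts at its negotiated UDMA speed. A rough sketch of that arithmetic (all figures illustrative, assuming this simple occupancy model):

```python
# Fraction of channel time a drive holds the cable, assuming it
# delivers sustained_rate_mb of data but bursts at the speed of its
# negotiated UDMA mode. Illustrative model only, not a measurement.
def bus_fraction(sustained_rate_mb, burst_mode_mb):
    return sustained_rate_mb / burst_mode_mb

# A UDMA100 drive streaming 10 MB/s bursts at 100 MB/s,
# so it holds the cable only 10% of the time.
print(bus_fraction(10, 100))  # 0.1

# A UDMA33 drive streaming the same 10 MB/s bursts at only 33 MB/s,
# so it holds the cable about 30% of the time for the same payload.
print(round(bus_fraction(10, 33), 2))  # 0.3
```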
There are ways of testing some of this without requiring the hardware you mention. You could connect a CD drive and a HD to the same cable and copy a file from the CD to the HD. Then connect the drives to separate channels and repeat the test.
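That copy test is easy to time from a script. A minimal sketch; the source and destination paths in the example are placeholders to be pointed at the actual drives being compared:

```python
# Time a file copy and report throughput in MB/s. Run it with the two
# drives on one channel, then on separate channels, and compare.
import os
import shutil
import time

def copy_throughput(src, dst):
    size = os.path.getsize(src)
    start = time.perf_counter()
    shutil.copyfile(src, dst)
    elapsed = time.perf_counter() - start
    return size / elapsed / (1024 * 1024)  # MB/s

# Hypothetical usage, with paths pointing at the CD and the HD:
# mb_s = copy_throughput("/mnt/cdrom/bigfile.iso", "/home/bigfile.iso")
```

A large file (hundreds of MB) gives a more stable figure, since per-request overhead and caching matter less.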
Thanks for the reply. A couple of things I'd like to clarify:
1) The bandwidth usage issue you refer to. To use an example, say you have a hypothetical ATA-100 HDD capable of transferring 80MB/s and an ATA-33 drive capable of 10MB/s, both on a hypothetical perfect (in this context) motherboard, so 100MB/s is actually possible from the IDE drives. The slower drive is transferring at a rate of 10MB/s. That limits the faster drive to about 68MB/s?
2) Only one drive on a channel may be in use at any one time? (You said that a HD request arriving while the UDMA33 drive is transferring a block has to wait until the slower drive finishes.)
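If the occupancy model in the reply is right, the cap on the faster drive is whatever cable time the slower drive leaves free. A back-of-envelope check using the same hypothetical numbers as in question 1:

```python
# All numbers hypothetical, matching the example in question 1.
channel_mb = 100.0    # assumed usable channel bandwidth
slow_rate_mb = 10.0   # sustained rate of the UDMA33 drive
slow_burst_mb = 33.0  # its negotiated burst mode

# The slow drive occupies 10/33, roughly 30%, of cable time,
# leaving roughly 70% for the fast drive:
free_fraction = 1 - slow_rate_mb / slow_burst_mb
fast_cap_mb = channel_mb * free_fraction
print(round(fast_cap_mb, 1))  # 69.7 - in the same ballpark as the ~68MB/s figure
```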
As for the test you refer to, I could do that but it wouldn't be all that conclusive as my HDD is archaic and ATA-33 might be enough for both the HDD and my CDROM.