OK, I'll do your homework for you...
4 Mbps (megabits per second) equals 500 KB/s (kilobytes per second), since there are 8 bits in a byte.
100 MB (megabytes) equals 100,000 KB.
So, assuming a perfect transfer rate of 500 KB/s (i.e., no TCP/IP overhead), you divide 100,000 KB by the 500 KB/s transfer rate:
= 200 seconds
200 seconds divided by 60 ≈ 3.3 minutes.
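If anyone wants to sanity-check it, here's a quick Python sketch of the same arithmetic (variable names are mine, and it assumes decimal units, 1 MB = 1000 KB, and zero protocol overhead, same as above):

```python
# Sanity check of the download-time math above.
link_mbps = 4    # link speed in megabits per second
file_mb = 100    # file size in megabytes

kb_per_sec = link_mbps * 1000 / 8   # 4 Mbps -> 500 KB/s
file_kb = file_mb * 1000            # 100 MB -> 100,000 KB

seconds = file_kb / kb_per_sec      # 100,000 / 500 = 200 seconds
print(f"{seconds:.0f} s = {seconds / 60:.1f} min")  # 200 s = 3.3 min
```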
There you go. That's it. If anyone cares to correct any flaws in my calculations, go ahead. I won't mind.
Edit: Feel free to flog me for answering this guy's homework question. I also don't mind walking the plank.
Totally fine with me; this is why final exams exist. Can't go on Tom's while writing that final.