Solved

Downloading Large Files

Tags:
  • Cable
  • Boost
  • Error Message
  • Internet
  • Backup
  • Business Computing
Last response: in Business Computing
July 8, 2013 5:27:41 PM

I have cable internet, 30 Mbps down/10 Mbps up with a 20 Mbps boost. I have been trying for days to restore a backup (10k files = 8.9 GB) from a remote location and keep getting the error message 'service interruption'. Called the cable company and they said the files are causing a time-out, buy a booster. I already have a cable-company-provided booster. Not sure what I can do to fix this. This NEWBIE appreciates any help from 'the vast unpaid research department'.


July 8, 2013 5:35:32 PM

Try doing the files in smaller chunks. Not all 10k at once.
July 8, 2013 5:36:14 PM

If they're on an FTP server, try an FTP downloader. If not, there are a few free downloaders that will restart a stalled download. Another trick is to zip the files and then break the archive into smaller download chunks.
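The "smaller chunks" idea can be sketched in code. This is a minimal illustration (the file size and chunk size are made-up numbers, and the HTTP part is only described in comments since it depends on the server supporting range requests): instead of one huge transfer, you plan a series of small byte ranges, so a dropped connection only costs you the current chunk.

```python
# Sketch: plan a large download as a series of small byte ranges.
# A downloader that supports resume fetches one range at a time; after an
# interruption it restarts from the last completed chunk, not from zero.

def byte_ranges(total_size, chunk_size):
    """Yield (start, end) byte offsets covering total_size in chunk_size pieces."""
    start = 0
    while start < total_size:
        end = min(start + chunk_size, total_size) - 1
        yield (start, end)
        start = end + 1

# Each (start, end) pair maps to an HTTP request header like:
#   Range: bytes=0-299999
# which a server that supports partial content will answer with just
# that slice of the file.

if __name__ == "__main__":
    # Hypothetical example: a 1,000,000-byte file in 300,000-byte chunks.
    for start, end in byte_ranges(1_000_000, 300_000):
        print(f"Range: bytes={start}-{end}")
```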

Best solution

July 8, 2013 5:58:31 PM

1) Just because your speed is all hot and fast: unless YOUR cable company ALSO provides service to your remote location, you have to HOP from point to point, which depending on the number of hops could well cause the failure (having so much data and so many files doesn't help either).

2) When you connect to your remote location, think of every single step you take to get 'there' physically. You leave your desk (1), exit your house (2), get in the car and drive down your street (3), connect to the next street (4), then the other streets (5, 6, 7) to reach the freeway (8), and so on. The remote location's parking lot, entering the building, and however many rooms sit between the door and the actual computer all get added too.
To test how many 'hops' there are, open a CMD window and type tracert followed by the IP of the remote location you're copying from. This will tell you how much 'work' each small packet goes through to get back and forth.
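If you save the tracert output, counting the hops can be automated. A rough sketch, assuming typical Windows tracert formatting (the sample text below is invented for illustration; real output varies by OS and network):

```python
# Sketch: count the hops in saved tracert/traceroute output.
# Hop lines start with the hop number, so counting them is enough.

SAMPLE = """\
Tracing route to example.com [93.184.216.34]
over a maximum of 30 hops:

  1    <1 ms    <1 ms    <1 ms  192.168.1.1
  2    12 ms    11 ms    13 ms  10.0.0.1
  3    25 ms    24 ms    26 ms  203.0.113.7
Trace complete.
"""

def count_hops(tracert_output):
    """Count lines whose first token is a hop number."""
    hops = 0
    for line in tracert_output.splitlines():
        parts = line.split()
        if parts and parts[0].isdigit():
            hops += 1
    return hops

if __name__ == "__main__":
    print(count_hops(SAMPLE))
```

More hops generally means more places for a long transfer to stall, which is the point being made above.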

3) The normal, easiest solution for MASS data moves is still 'sneaker net'. That is, someone plugs in an external HD (say 500 GB), copies the files, puts it in a padded envelope, and mails it to you. That is the most reliable way to get LARGE DATA SETS from point A to point B.

4) Do you use any VPN, firewalls, etc. between your computer and the distant one? How are you 'restoring' a backup (with what application?) over the internet?
July 9, 2013 2:41:28 PM

Thanks everyone, good explanation & suggestions. The remote location is Norton online backup. And yes, I have Norton security. So I asked them about everyone's suggestions (FTP, smaller files, etc.). No luck. When this problem started two months ago I decided to buy a 2 TB external hard drive so I don't have to depend on them. Looks like I'm at their mercy and have to wait for a hard copy. Unless there is some other magic wand! 8-D iane
July 9, 2013 3:30:27 PM

RoaminRoman said:
The remote location is Norton online backup. [...] Looks like I'm at their mercy and have to wait for a hard copy.

The normal point-to-point standard at businesses I worked for was:
A) Both machines (target and recipient) were put on the DMZ so they were exposed to the Internet.
B) Run an FTP server on the target end, then use an FTP client to pull the data.
C) Break the data down into small chunks (no more than 250 MB) and download each 'grouping' in a scripted sequence that validates CRCs and so on before doing the next chunk.
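Step C above can be sketched in a few lines. This is a simplified illustration, not the actual scripts used (the chunk size here is tiny for demonstration; in practice each chunk's CRC would travel with it so the receiving side can verify before requesting the next one):

```python
# Sketch: split data into chunks with a CRC32 per chunk, then verify each
# chunk before reassembling. A failed check means only that chunk needs
# to be re-downloaded, not the whole transfer.
import zlib

def split_with_crc(data, chunk_size):
    """Return a list of (chunk, crc32) pairs covering the data."""
    chunks = []
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        chunks.append((chunk, zlib.crc32(chunk)))
    return chunks

def verify_and_join(chunks):
    """Re-check every CRC; raise if a chunk is corrupt, else reassemble."""
    out = bytearray()
    for idx, (chunk, crc) in enumerate(chunks):
        if zlib.crc32(chunk) != crc:
            raise ValueError(f"chunk {idx} failed CRC check, re-download it")
        out += chunk
    return bytes(out)
```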

In your case you're using Norton online backup. This is NOT made for across-the-wire restores like this (them backing up to you); it's built around the Norton servers, which sit in a data center connected to the Internet backbone specifically to alleviate latency from hops, lag, and resend probabilities. The Norton servers are configured along the lines I described, but the client software (Norton online backup) doesn't show these basics on the screen; it does them in the background.

As you have no such investment in the standard practice, I would suggest going back to the SneakerNet solution.
July 9, 2013 5:35:03 PM

I would imagine the main cause of the data errors is the large size of your initial transfer, even if you have a pretty decent connection speed. Break up data downloads into smaller parts as much as possible. If everything is compressed into one big file, that's probably going to be harder to handle. A lot of this is also determined by the type of protocol being used and settings from where you are getting the file. When I first set up a small FTP server for various file usage, I found that I got similar results to what you are getting if I tried to download or upload anything greater than like 300 MB at one time. Break it up smaller than that, and it worked fine.