I needed to transfer a 4GB file the other day. Unfortunately, there is no easy, free way to do it. So I went the tech way.
Upload the file to a server you have access to (e.g. your server slice) via scp:
scp my_large_file my_server:.
In case your connection drops, you can resume the upload like this:
rsync --partial --progress my_large_file my_server:.
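If the link is flaky, the rsync command can be wrapped in a retry loop so the upload resumes automatically each time it fails. A minimal sketch; the helper name retry_rsync and the retry count are mine, not from the post:

```shell
# Retry an interrupted rsync upload until it succeeds (or we give up).
# --partial keeps the partially-transferred file on the remote side,
# so each retry picks up where the last one left off.
retry_rsync() {
    file=$1; dest=$2; tries=${3:-10}
    i=0
    while [ "$i" -lt "$tries" ]; do
        rsync --partial --progress "$file" "$dest" && return 0
        i=$((i + 1))
        echo "rsync failed, retrying ($i/$tries)..." >&2
        sleep 5
    done
    return 1
}

# Usage (commented out to keep the sketch self-contained):
# retry_rsync my_large_file my_server:. 10
```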
Log in to your slice and move the file into a directory of its own:
mkdir public_data
mv my_large_file public_data/
cd public_data
Start a simple HTTP server on the port of your choice (in this case 8080) from that newly-created directory, by running:
python -m SimpleHTTPServer 8080 &
(On Python 3, the equivalent is: python3 -m http.server 8080 &)
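Before sending the link out, it's worth sanity-checking the server locally. A throwaway sketch, using Python 3 syntax, a temp directory, and curl (all my assumptions, not part of the original steps):

```shell
# Serve a throwaway file and fetch it back over HTTP to confirm
# the server works. 'my_large_file' here is just a stand-in.
dir=$(mktemp -d)
echo test > "$dir/my_large_file"
cd "$dir"
python3 -m http.server 8080 >/dev/null 2>&1 &
server_pid=$!
sleep 1                                        # give the server a moment
body=$(curl -s http://localhost:8080/my_large_file)
echo "$body"                                   # the file's contents
kill "$server_pid"
```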
That’s it! Your file is now available for download at http://my_server:8080/my_large_file
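On the receiving end, the other party can resume an interrupted download as well. A sketch wrapping wget's -c (continue) flag in a helper; the helper name is mine, not from the post:

```shell
# Resumable download: -c continues a partially-downloaded file,
# -t 0 keeps retrying on network errors until the transfer finishes.
fetch_resumable() {
    url=$1
    wget -c -t 0 "$url"
}

# Usage, assuming the server set up above:
# fetch_resumable http://my_server:8080/my_large_file
```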
If you want that HTTP server to keep running even after you log out, find its job number by running the jobs command, and then disown that job from the shell like so (assuming the job number was 1):
disown %1
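The jobs/disown dance can be tried safely with a stand-in process; here sleep plays the role of the HTTP server (bash job control assumed):

```shell
# Detach a background job from the shell so it survives logout.
# 'sleep' stands in for the HTTP server process.
sleep 30 &     # start the long-running process in the background
jobs           # lists it, e.g. "[1]+ Running  sleep 30 &"
disown %1      # remove job 1 from the shell's job table; the shell
               # will no longer send it SIGHUP when you log out
```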
The advantages of this approach:
- No need to go through the hassle of installing an FTP server on your own machine.
- No need to leave your own machine turned on, since you don’t know exactly when the other party will fetch the file.
- The other party isn’t slowed down by your ISP’s upload limits while fetching the file from you.