BASH HTTP server evolves

31 Jan 2012 18:34

Some time ago, mainly for fun, I created an HTTP server in just BASH and netcat. The aim was to instantly and simply share files between computers on a local network with a one-line command:

quake@vaio /home/quake/files $ http_server.sh 8000

And voilà, the files in the directory /home/quake/files are accessible via a web browser (or a wget command) on every computer in the local network.

Later I learned that I can achieve the same effect with a simple Python one-liner:

python -m SimpleHTTPServer

This works with a standard Python 2.x installation; for Python 3.x it's even simpler:

python -m http.server

No need for custom scripts, netcat or other fancy stuff: all you need is a standard Python installation and it just works ;-).

But recently I faced the challenge of copying many gigabytes of files over the network. Copying files over SSH was too slow (the data comes from an ARM machine, which is not exactly blazing fast at encryption). I tried to copy the files over FTP, but failed to configure a read-write FTP server in the limited time I had, and I wanted to avoid setting up fancier file servers like Samba/CIFS. I could have used NFS, which is both simple to configure and fast enough, but I decided to go fancier.

I took my old http_server.sh and tweaked it a bit (including replacing the lame cat "$file" | wc -c with stat -c %s to determine file sizes, and replacing gawk with awk in the script), and then created a specialised version of it: tar_server.sh.
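For example, to get the size of a file in bytes without reading the whole file through a pipe (note that stat -c %s is GNU coreutils syntax; BSD stat uses stat -f %z instead):

# old approach: read the entire file just to count its bytes
cat "$file" | wc -c

# new approach: ask the filesystem directly
stat -c %s "$file"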

tar_server.sh is an HTTP server based on http_server.sh: it shows a list of the directories inside the directory it was started from and lets you download each of them as a tar file. The tarring is done on the fly, so you don't waste disk space on intermediate archives.
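The trick is simply that tar can write the archive to standard output, so the server can stream it straight to the client:

# stream the directory "backup" as a tar archive to stdout;
# no intermediate .tar file ever touches the disk
tar -cf - backup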

It's as simple as:

quake@vaio /home/quake/files $ tar_server.sh 8000

Then you can see the list of tar files to download at http://your_ip:8000/. Suppose you have a directory /home/quake/files/backup; you can download it on another machine using:

quake@other-machine /home/someuser/files $ wget http://your_ip:8000/backup.tar

Or, to unpack it on the fly:

quake@other-machine /home/someuser/files $ wget http://your_ip:8000/backup.tar -O - | tar -x

This way you can mirror part of your filesystem with almost no dedicated tools. The script is quite OS-independent and requires only the netcat, awk, tar and stat commands, which are likely to be found on any Unix-like system.

The script also proves that BASH is still a very useful tool, and that adapting simple scripts is easy and FUN :-).

Remember that the scripts (just like the original version) can handle only one client at a time, so if you want to do things in parallel, you need to launch several of them, each on a different port, as shown below.
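For example (a hypothetical invocation, assuming the scripts are on your PATH):

# serve up to three clients in parallel, one server instance per port
for port in 8000 8001 8002; do
    tar_server.sh "$port" &
done
wait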

Here are the scripts:

http_server.sh:
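A minimal sketch of how such a server can look in BASH and netcat; the real script differs in details, and this version assumes a traditional netcat that accepts -l -p (OpenBSD nc takes just -l PORT):

#!/bin/bash
# Sketch of a one-client-at-a-time HTTP file server in BASH + netcat.
# Usage: http_server.sh [port]  (serves files from the current directory)

PORT="${1:-8000}"
FIFO=$(mktemp -u)
mkfifo "$FIFO"
trap 'rm -f "$FIFO"' EXIT

while true; do
    # nc reads the response from the FIFO and pipes the request to the handler
    nc -l -p "$PORT" < "$FIFO" | {
        # request line looks like: GET /some/file HTTP/1.1
        read -r method path version
        file=".${path}"   # crude mapping; URL decoding is omitted in this sketch
        if [ "$method" = "GET" ] && [ -f "$file" ]; then
            size=$(stat -c %s "$file")
            printf 'HTTP/1.0 200 OK\r\nContent-Length: %s\r\n\r\n' "$size"
            cat "$file"
        else
            printf 'HTTP/1.0 404 Not Found\r\n\r\n'
        fi
    } > "$FIFO"
done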

tar_server.sh:
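And a minimal sketch of the tar-serving variant, under the same netcat assumption. Since the archive is produced on the fly, its size is unknown in advance, so the sketch sends no Content-Length and relies on HTTP/1.0 semantics, where closing the connection ends the body:

#!/bin/bash
# Sketch of tar_server.sh: lists the subdirectories of the current directory
# and serves each one as a tar archive generated on the fly.
# Usage: tar_server.sh [port]

PORT="${1:-8000}"
FIFO=$(mktemp -u)
mkfifo "$FIFO"
trap 'rm -f "$FIFO"' EXIT

while true; do
    nc -l -p "$PORT" < "$FIFO" | {
        read -r method path version
        dir="${path#/}"      # "/backup.tar" -> "backup.tar"
        dir="${dir%.tar}"    # "backup.tar"  -> "backup"
        if [ "$path" = "/" ]; then
            # index page: one link per subdirectory
            printf 'HTTP/1.0 200 OK\r\nContent-Type: text/html\r\n\r\n'
            for d in */; do
                [ -d "$d" ] || continue
                d="${d%/}"
                printf '<a href="/%s.tar">%s.tar</a><br>\n' "$d" "$d"
            done
        elif [ "$method" = "GET" ] && [ -d "$dir" ]; then
            # no Content-Length: the tar stream's size is unknown up front
            printf 'HTTP/1.0 200 OK\r\nContent-Type: application/x-tar\r\n\r\n'
            tar -cf - "$dir"
        else
            printf 'HTTP/1.0 404 Not Found\r\n\r\n'
        fi
    } > "$FIFO"
done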

