31 Jan 2012 18:34
TAGS: 100 bash http lines server
Some time ago, mainly for fun, I created an HTTP server in just BASH and netcat. The aim was to instantly and simply share files between computers in a local network with a one-line command:
quake@vaio /home/quake/files $ http_server.sh 8000
And voila, the files in the directory /home/quake/files are accessible via a web browser (or a wget command) on every computer in the local network.
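For example, to fetch a single file (notes.txt is just a made-up name here):
quake@other-machine /home/someuser $ wget http://your_ip:8000/notes.txt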
Some time ago I learned that I can achieve the same effect using a simple Python one-liner:
python -m SimpleHTTPServer
This works with a standard Python 2.x installation; for Python 3.x it's even simpler:
python3 -m http.server
No need for custom scripts, netcat or other fancy stuff; you just need a standard Python installation and it works ;-).
But recently I faced the challenge of copying many gigabytes of files over the network. Copying files over SSH was too slow (the data was copied from an ARM machine, which is not really blazing fast at encryption). I tried to copy the files over FTP, but I failed at configuring a read-write FTP server in the limited time I had. I wanted to avoid configuring other fancy file servers like Samba/CIFS. I could have used NFS, which is both simple to configure and fast enough, but I decided to go fancier.
I took my old http_server.sh, tweaked it a bit (including replacing the lame cat "$file" | wc -c with stat -c %s to determine file sizes, and replacing gawk with awk in the script) and then created a specialised version of it: tar_server.sh.
tar_server.sh is an HTTP server based on http_server.sh that shows a list of the directories inside the directory it was run from and lets you download each of those directories as a tar file. It does the tarring on the fly, so you don't waste disk space.
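If you just want the core idea without the script, a stripped-down one-shot illustration (assuming the traditional netcat with the -p option and a directory named backup; it serves exactly one download and exits) looks like:
{ printf 'HTTP/1.1 200 OK\r\nContent-type: application/x-tar\r\n\r\n'; tar -c backup; } | nc -l -p 8000
tar_server.sh just wraps this idea in a small request-dispatch loop.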
It's as simple as:
quake@vaio /home/quake/files $ tar_server.sh 8000
Then you can see the list of tar files to download at http://your_ip:8000/ . Suppose you have a directory /home/quake/files/backup. You can download it on some other machine using:
quake@other-machine /home/someuser/files $ wget http://your_ip:8000/backup.tar
Or to unpack on the fly:
quake@other-machine /home/someuser/files $ wget http://your_ip:8000/backup.tar -O - | tar -xf -
This way you can mirror part of your filesystem with almost no dedicated tools. The script is quite OS-independent and requires only the netcat, awk, tar and stat commands, which are likely to be found on any Unix-like system.
Also, the script proves that BASH is still a very useful tool and that adapting simple scripts is easy and FUN :-).
Remember, the scripts (just like the original version) can handle only one client at a time, so if you want to do parallel transfers, you need to launch more instances, each on a different port.
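For example, something like this (ports picked arbitrarily) starts three independent instances:
for port in 8000 8001 8002; do tar_server.sh "$port" & done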
Here are the scripts:
http_server.sh:
#!/bin/bash

# Print a timestamped, tab-separated log line to stderr.
function debug {
    local severity="$1"
    shift
    local message="$*"
    echo -n "`date -u`" 1>&2
    echo -ne '\t' 1>&2
    echo -n "$severity" 1>&2
    echo -ne '\t' 1>&2
    echo "$message" 1>&2
}

# Sanitize the requested path: keep only the first line, strip leading
# slashes and dots, and collapse dots after a slash, so a client cannot
# escape the served directory with ".." tricks.
function fix_path {
    echo -n "$1" | head -n 1 | sed 's|^[/.]*||' | sed 's|/\.*|/|g'
}

# Reply with a simple HTML listing of the requested directory.
function serve_dir {
    local dir="`fix_path "$1"`"
    if [ "$dir" = "" ]; then
        dir="./"
    fi
    echo 'HTTP/1.1 200 OK'
    echo 'Content-type: text/html;charset=UTF-8'
    echo
    echo LISTING "$dir"
    echo '<br/>'
    # Turn every entry printed by ls into a link.
    ls -p "$dir" | sed -e 's|^\(.*\)$|<a href="/'"$dir"'\1">\1</a><br/>|'
}

# Reply with the contents of the requested file.
function serve_file {
    local file="`fix_path "$1"`"
    echo 'HTTP/1.1 200 OK'
    echo 'Content-type: application/x-download-this'
    echo 'Content-length: '"`stat -c %s "$file"`"
    echo
    debug INFO serving file "$file"
    cat "$file"
}

# Read the HTTP request line from stdin, take the URL (the second
# field) and dispatch: URLs ending with a slash are directories.
function process {
    local url="`awk '{print $2}' | head -n 1`"
    case "$url" in
        */)
            debug INFO Processing "$url" as dir
            serve_dir "$url"
            ;;
        *)
            debug INFO Processing "$url" as file
            serve_file "$url"
            ;;
    esac
}

# Main loop: spawn netcat anew for every connection, wired to the two
# FIFOs; feed the request line to process and its output back to nc.
function serve {
    local port="$1"
    local sin="$2"
    local sout="$3"
    while debug INFO Running nc; do
        nc -l -p "$port" < "$sin" > "$sout" &
        pid="$!"
        debug INFO Server PID: "$pid"
        trap cleanup SIGINT
        head -n 1 "$sout" | process > "$sin"
        trap - SIGINT
        debug INFO Killing nc
        kill "$pid"
    done
    debug INFO Quitting server
}

# Remove the temporary FIFOs when interrupted.
function cleanup {
    debug INFO Caught signal, quitting...
    rm -Rf "$tmp_dir"
    exit
}

tmp_dir="`mktemp -d -t http_server.XXXXXXXXXX`"
sin="$tmp_dir"/in
sout="$tmp_dir"/out
pid=0
port="$1"
mkfifo "$sin"
mkfifo "$sout"
debug INFO Starting server on port "$port"
serve "$port" "$sin" "$sout"
tar_server.sh:
#!/bin/bash

# Print a timestamped, tab-separated log line to stderr.
function debug {
    local severity="$1"
    shift
    local message="$*"
    echo -n "`date -u`" 1>&2
    echo -ne '\t' 1>&2
    echo -n "$severity" 1>&2
    echo -ne '\t' 1>&2
    echo "$message" 1>&2
}

# Sanitize the requested path, exactly as in http_server.sh.
function fix_path {
    echo -n "$1" | head -n 1 | sed 's|^[/.]*||' | sed 's|/\.*|/|g'
}

# Reply with an HTML list of the direct subdirectories, each linked
# as a downloadable .tar file.
function serve_dir {
    echo 'HTTP/1.1 200 OK'
    echo 'Content-type: text/html;charset=UTF-8'
    echo
    find . -mindepth 1 -maxdepth 1 -type d | sed -e 's|^\./||' -e 's|^\(.*\)$|<a href="/\1.tar">\1.tar</a><br/>|'
}

# Tar the requested directory on the fly and stream it to the client.
function serve_file {
    echo 'HTTP/1.1 200 OK'
    local file="`fix_path "$1"`"
    # Map "backup.tar" back to the directory name "backup".
    local dir="`echo "$file" | sed 's/\.tar$//'`"
    debug INFO serving tarred dir "$dir"
    echo 'Content-type: application/x-tar'
    echo
    tar -c "$dir"
}

# Read the HTTP request line from stdin and dispatch: the root URL
# gets the listing, anything else is treated as a .tar download.
function process {
    local url="`awk '{print $2}' | head -n 1`"
    case "$url" in
        /)
            debug INFO Processing "$url" as dir
            serve_dir "$url"
            ;;
        *)
            debug INFO Processing "$url" as file
            serve_file "$url"
            ;;
    esac
}

# Main loop: spawn netcat anew for every connection, wired to the two
# FIFOs; feed the request line to process and its output back to nc.
function serve {
    local port="$1"
    local sin="$2"
    local sout="$3"
    while debug INFO Running nc; do
        nc -l -p "$port" < "$sin" > "$sout" &
        pid="$!"
        debug INFO Server PID: "$pid"
        trap cleanup SIGINT
        head -n 1 "$sout" | process > "$sin"
        trap - SIGINT
        debug INFO Killing nc
        kill "$pid"
    done
    debug INFO Quitting server
}

# Remove the temporary FIFOs when interrupted.
function cleanup {
    debug INFO Caught signal, quitting...
    rm -Rf "$tmp_dir"
    exit
}

tmp_dir="`mktemp -d -t http_server.XXXXXXXXXX`"
sin="$tmp_dir"/in
sout="$tmp_dir"/out
pid=0
port="$1"
mkfifo "$sin"
mkfifo "$sout"
debug INFO Starting server on port "$port"
serve "$port" "$sin" "$sout"
Comments
HTTP requires headers to be terminated with CRLF, so you should have used echo -e "HeaderName: HeaderValue\r".
I would also prefer the file to be served with the right MIME type (the file utility might be helpful: file -bi "$file") rather than application/x-download-this.
And last but not least, if you're looking for a nice utility able to list directory contents in the form of nice HTML, then I highly recommend tree.
We had the same conclusion for the original http_server.sh, but the point is not to make a full-fledged web server (there are "real" projects for this ;-) ), but to make a fun piece of software that fits a very specific usage and at the same time can be freely modified to one's needs.
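For the curious, a rough, untested sketch of serve_file with both suggestions applied (CRLF-terminated headers and file -bi for the MIME type) could look like:
function serve_file {
    local file="`fix_path "$1"`"
    echo -e 'HTTP/1.1 200 OK\r'
    echo -e "Content-type: `file -bi "$file"`\r"
    echo -e "Content-length: `stat -c %s "$file"`\r"
    echo -e '\r'
    cat "$file"
}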
Piotr Gabryjeluk