A web server in a shell script

Posted by mcortese on Mon 13 Mar 2006 at 07:25

Suppose you want to experiment a little with web pages and CGIs, but you don't want the hassle of installing the full Apache package. This quick and dirty shell script could be just what you need.

Put simply, a web server is an application that sends local text files over the network to the clients that request them. If you let another program (for example inetd) deal with the network part, the web server could be reduced to a mere cat "$filename" to stdout. Of course, the difficult part would be to extract that filename out of the HTTP request string: nothing that a Bash script cannot easily do!

The script

Step 1: Our script begins like any other script. The Bash magic, plus some definitions:

#!/bin/bash
base=/var/www

Step 2: inetd will feed our script the data received from the remote host: the first line is the standard HTTP request, followed by zero or more header lines. Let's record the request and discard the rest:

read request

while /bin/true; do
  read header
  [ "$header" == $'\r' ] && break;
done
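
For reference, the raw data that inetd pipes into the script for a simple page fetch looks something like this (a made-up request; real browsers send more headers):

GET /index.html HTTP/1.1
Host: localhost
User-Agent: Mozilla/5.0

The first line ends up in $request, while the loop swallows the header lines and stops at the blank line, which still carries its carriage return (that is why the test compares against $'\r').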

Step 3 (the tricky part): extract the URL from the request string and locate the document on the local file system:

url="${request#GET }"
url="${url% HTTP/*}"
filename="$base$url"
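
To see what the two parameter expansions do, here is a worked example with a hypothetical request for /index.html:

request='GET /index.html HTTP/1.1'
url="${request#GET }"    # strip the leading method:     "/index.html HTTP/1.1"
url="${url% HTTP/*}"     # strip the trailing protocol:  "/index.html"
filename="$base$url"     # "/var/www/index.html"

Note that with real input the request line also ends in a carriage return, which the second expansion conveniently removes along with the protocol version.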

Step 4 (the heart of the script): send the file (if it exists) with a leading standard header:

if [ -f "$filename" ]; then
  echo -e "HTTP/1.1 200 OK\r"
  echo -e "Content-Type: `/usr/bin/file -bi \"$filename\"`\r"
  echo -e "\r"
  cat "$filename"
  echo -e "\r"
else
  echo -e "HTTP/1.1 404 Not Found\r"
  echo -e "Content-Type: text/html\r"
  echo -e "\r"
  echo -e "404 Not Found\r"
  echo -e "Not Found
           The requested resource was not found\r"
  echo -e "\r"
fi

That's all.
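
Before wiring it into inetd, you can exercise the script by hand, feeding it a hand-crafted request on standard input (assuming you saved it as webd, made it executable, and /var/www/index.html exists):

printf 'GET /index.html HTTP/1.0\r\n\r\n' | ./webd

You should see the status line, the Content-Type header and the contents of the file on standard output.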

Installation

To make it work, you must add the following line to your /etc/inetd.conf file:

www stream tcp nowait nobody /usr/local/bin/webd webd
where webd is the name you gave to the script.

After instructing inetd to re-read its configuration with /etc/init.d/inetd restart, you are ready to test it. Make the directory /var/www, place some HTML files in it, fire up your favourite web browser and try the URL http://localhost/FILENAME.html
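
If your machine runs xinetd instead of the classic inetd, a roughly equivalent service entry (a sketch I have not tested, adjust to taste) would be a file such as /etc/xinetd.d/webd containing:

service www
{
	socket_type = stream
	protocol    = tcp
	wait        = no
	user        = nobody
	server      = /usr/local/bin/webd
	disable     = no
}

followed by /etc/init.d/xinetd reload.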

Please note that this installation is not a wise thing to do if your computer is connected to an insecure network, because you are exposing your hard disk contents to anyone who can establish a connection to your port 80. A better idea would be to use the tcpd wrapper and only allow local connections. I'll leave the implementation details to your imagination!
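
If you want a concrete starting point, one possible shape (a sketch following the usual tcp-wrappers conventions, not something from the article) is to let tcpd spawn the script and then restrict access by client address:

# /etc/inetd.conf
www stream tcp nowait nobody /usr/sbin/tcpd /usr/local/bin/webd

# /etc/hosts.allow
webd: 127.0.0.1

# /etc/hosts.deny
webd: ALL

tcp-wrappers matches on the daemon name (here the basename webd), so only connections from the local host would be accepted.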

What about CGIs?

As it stands, this web server is pretty useless: it does nothing more than you could already do by simply pointing your web browser at a local file. We need support for (very simple) CGIs.

Theory: instead of sending a text file over the network, we run an executable file and send its output. Before doing that, we have to decode the HTTP request a little further, building a variable called QUERY_STRING that we must export to the executable.

Practice: in the above script, replace Step 3 with this slightly less trivial version:

url="${request#GET }"
url="${url% HTTP/*}"
query="${url#*\?}"
url="${url%%\?*}"

filename="$base$url"

if [ "$query" != "$url" -a -x "$filename" ]; then
  export QUERY_STRING="$query"
  echo -e "HTTP/1.1 200 OK\r"
  "$filename"
  echo -e "\r"
  exit 0
fi
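
To try it out, a hypothetical CGI can be as small as a shell script dropped into /var/www (say /var/www/hello.cgi, made executable) that prints its own Content-Type header, a blank line and then the page:

#!/bin/bash
# Minimal CGI: echo the query string back to the client
echo -e "Content-Type: text/html\r"
echo -e "\r"
echo "<html><body><h1>Hello</h1><p>QUERY_STRING=$QUERY_STRING</p></body></html>"

Requesting http://localhost/hello.cgi?name=world should then show name=world in the page, since the main script only sends the status line and leaves the rest of the headers to the executable.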

Of course this overly simplified web server cannot even be compared to an application of the size and complexity of Apache. It is just a hack!

 

 


Posted by Steve (82.41.xx.xx) on Mon 13 Mar 2006 at 08:25
[ View Steve's Scratchpad | View Weblogs ]

That is a pretty interesting script. (Although personally I'd use the Net::Server Perl module ;)

I see that you're using inetd so that your shell script doesn't need to make use of sockets. That means you need root access to install it.

If you're happy to serve upon a port higher than 1024 (port 80, like all ports below 1024, needs root privileges to bind to) then you might want to explore using netcat to handle the socket code instead, with something like this:

nc -l  -p 8000 -e /usr/local/bin/webserver-script.sh

This causes netcat to listen upon port 8000 and execute the named script for each incoming connection. Almost identical to your code using inetd, but without the need to become root.
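
One caveat: with most netcat variants, -l accepts a single connection and then exits, so if you want it to keep serving you may need a small wrapper loop, something like (untested):

while true; do nc -l -p 8000 -e /usr/local/bin/webserver-script.sh; done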

Steve

[ Parent | Reply to this comment ]

Posted by Anonymous (213.112.xx.xx) on Mon 13 Mar 2006 at 17:51
But! Let's not forget that it is considered good manners to use the designated HTTP alternate port, which is 8080.

http://grc.com/port_8080.htm

[ Parent | Reply to this comment ]

Posted by Anonymous (130.226.xx.xx) on Wed 15 Mar 2006 at 11:31
FYI: OS X + RH do not support the -e option

/Trakic

[ Parent | Reply to this comment ]

Posted by Anonymous (83.102.xx.xx) on Thu 13 Apr 2006 at 13:54
Or even we can try this :-)

socat TCP4-LISTEN:8080,fork,tcpwrap=coolwebserver EXEC:/bin/coolwebserver,chroot=/home/sandbox,su-d=sandbox,pty,stderr

Our cool web server will fork on each connection, be checked against the service "coolwebserver" in /etc/hosts.allow, chroot to /home/sandbox and execute /home/sandbox/bin/coolwebserver :-)

or just in order to fork:

socat TCP4-LISTEN:8080,fork EXEC:/usr/local/bin/webserver-script.sh

Cheers,
Matvey

http://matvey.org.ru

[ Parent | Reply to this comment ]

Posted by Anonymous (78.45.xx.xx) on Fri 17 Jun 2011 at 00:31
very nice, thanks :)

[ Parent | Reply to this comment ]

Posted by Steve (82.41.xx.xx) on Mon 13 Mar 2006 at 08:28
[ View Steve's Scratchpad | View Weblogs ]

ps. Don't forget to filter out .. from your incoming request - otherwise somebody could request:

../../etc/passwd
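
A minimal guard (my sketch, not part of Steve's comment) is to reject any URL containing .. right after Step 3 extracts it:

case "$url" in
  *..*)
    echo -e "HTTP/1.1 403 Forbidden\r"
    echo -e "\r"
    exit 0
    ;;
esac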

Steve

[ Parent | Reply to this comment ]

Posted by Anonymous (62.6.xx.xx) on Mon 13 Mar 2006 at 16:34
I remember being able to do this with a streaming mp3 server ;p. Seriously though, it would probably be better to run a better-tested server if you're opening ports to the outside world.

[ Parent | Reply to this comment ]

Posted by Steve (212.20.xx.xx) on Fri 17 Mar 2006 at 11:20
[ View Steve's Scratchpad | View Weblogs ]

True enough - but I've learnt my lesson now!

Steve

[ Parent | Reply to this comment ]

Posted by Anonymous (138.100.xx.xx) on Tue 4 Apr 2006 at 17:58
To solve this, the server script could run in a chroot jail...

[ Parent | Reply to this comment ]

Posted by dopehouse (84.131.xx.xx) on Mon 13 Mar 2006 at 16:44
[ View dopehouse's Scratchpad ]
Just insert the lines between the two comment lines marked with #:
url="${request#GET }"
url="${url% HTTP/*}"
filename="$base$url"

# This will open index.html if no filename is given after the last slash.
if [ -d "$filename" ]; then
        filename="${filename}index.html"
fi
# end of index.html extension ;)

if [ -f "$filename" ]; then
        echo -e "HTTP/1.1 200 OK\r"
If you call a URL like 'http://localhost/', you'll get the index.html in '/var/www/'.

[ Parent | Reply to this comment ]

Posted by mcortese (82.48.xx.xx) on Tue 14 Mar 2006 at 22:42
[ View Weblogs ]

Good suggestion. If you start adding features, though, you don't know where you can end up.

Personally, since this script is more a development tool than a production application, I find it more convenient to list the directory contents. Add the following lines just before the else statement:

elif [ -d "$filename" ]; then
	echo -e "HTTP/1.1 200 OK\r"
	echo -e "Content-Type: text/html\r"
	echo -e "\r"
	echo -e "<html><head><title>Listing of $url</title></head>\r"
	echo -e "<body><h1>Listing of $url</h1>\r"
	echo -e "<table><tr><th>Name</th><th>Details</th></tr>\r"
	( cd "$filename"
	for f in `ls`; do
		desc=`ls -ld "$f"`
		desc=${desc% $f}
		href="${url%/}/$f"
		echo -e "<tr><td><a href=\"$href\">$f</a></td><td>$desc</td></tr>\r"
	done )
	echo -e "</table>\r"
	echo -e "</body></html>\r"
	echo -e "\r"

But, I admit, this is plain bloatware!

[ Parent | Reply to this comment ]

Posted by dopehouse (84.131.xx.xx) on Wed 15 Mar 2006 at 00:56
[ View dopehouse's Scratchpad ]
I think this is a very good example of how easy it is to program services, and how powerful Linux, Bash and the GNU tools are.

If we fix the path bug from comment #2, then we have a more productive and secure server than M$-IIS xD

[ Parent | Reply to this comment ]

Posted by cswd (62.255.xx.xx) on Wed 15 Mar 2006 at 10:10
Oh c'mon give the poor thing a break - that bug was ironed out YEARS ago...

--
http://www.cswd.co.uk/

[ Parent | Reply to this comment ]

Posted by simoesp (213.63.xx.xx) on Fri 31 Mar 2006 at 11:26
Hahahah, very nice script! :D
I didn't know that Bash was so powerful that it could serve HTTP requests :D

very nice :D

[ Parent | Reply to this comment ]

Posted by Anonymous (4.236.xx.xx) on Sun 9 Apr 2006 at 08:03
Here is a combined version I put together:
Cheers,
Kev
PS. This would make a great basis for a talk about the simplicity and power of the Unix system.
--------------------------------------------
#!/bin/bash
#To make it work, you must add the following line to your /etc/inetd.conf file:
#www stream tcp nowait nobody /usr/local/bin/webd webd
#where webd is the name you gave to the script.
# a version by Kevin Mark
# http://debian.home.pipeline.com
# based upon Steve Kemp's version on debian-administration.org

dynamic_request () {
	export QUERY_STRING="$query"
	echo -e "HTTP/1.1 200 OK\r"
	"$filename"
	echo -e "\r"
	exit 0
}

static_request() {
	echo -e "HTTP/1.1 200 OK\r"
	echo -e "Content-Type: `/usr/bin/file -bi \"$filename\"`\r"
	echo -e "\r"
	cat "$filename"
	echo -e "\r"
}

dir_list() {
	echo -e "HTTP/1.1 200 OK\r"
	echo -e "Content-Type: text/html\r"
	echo -e "\r"
	echo -e "Listing of $url\r"
	echo -e "<br>"
	echo -e "Details Name\r"
	echo -e "<br>"
	( cd "$filename"
	for f in `ls`; do
		desc=`ls -ld "$f"`
		desc=${desc% $f}
		href="${url%/}/$f"
		echo -e "$desc
$f\r"
		echo -e "<br>"
	done
	)
	echo -e "\r"
	echo -e "\r"
	echo -e "\r"
}

404_page () {
	echo -e "HTTP/1.1 404 Not Found\r"
	echo -e "Content-Type: text/html\r"
	echo -e "\r"
	echo -e "<html><head><title>404 Not Found</title></head>\r"
	echo -e "<body><h1>Not Found</h1>
	The requested resource was not found</body></html>\r"
	echo -e "\r"
}

base=/var/www

read request

while /bin/true; do
	read header
	[ "$header" == $'\r' ] && break;
done

url="${request#GET }"
url="${url% HTTP/*}"
query="${url#*\?}"
url="${url%%\?*}"
filename="$base$url"

if [ "$query" != "$url" -a -x "$filename" ]; then
	dynamic_request;
fi

if [ -f "$filename" ]; then
	if [ -x "$filename" ]; then
		dynamic_request;
	fi
	static_request;
else
	if [ -d "$filename" ]; then
		if [ -f "$filename/index.html" ]; then
			filename="${filename}/index.html"
			if [ -x "$filename" ]; then
				dynamic_request;
			fi
			static_request;
		else
			dir_list;
		fi
	else
		404_page;
	fi
fi

[ Parent | Reply to this comment ]

Posted by Anonymous (70.29.xx.xx) on Sat 22 Jul 2006 at 00:29
This is a very useful script to show how we can use Bash and services on Linux. However, I was wondering if anyone has an idea about creating a request from a shell script? I mean, what if you don't want to use a browser in this case, or how about compiling a file on a different machine? For example, let's say I have a program that manipulates or compiles LaTeX documents, and I can't install all the macros on all the computers I have, therefore every time I create a new LaTeX document I have to go and compile it on that PC. Isn't there a way to write a small script that sends the file to the other PC, where it gets compiled?


I will really appreciate any help.

[ Parent | Reply to this comment ]

Posted by Anonymous (149.254.xx.xx) on Tue 19 Dec 2006 at 22:29
Excellent idea for a web server - I'm considering using it for control panel software, instead of cPanel, etc...

One thing that concerns me... What about processes? If you were to use this as a control panel management server, wouldn't the processes mount up?

I like the tutorial though, I'm seriously considering using it... Seems more stable than Perl or PHP for a web server, heh :D

[ Parent | Reply to this comment ]

Posted by mcortese (213.140.xx.xx) on Wed 2 Jan 2008 at 18:14
[ View Weblogs ]

I strongly recommend you don't use it for production. There are too many evil things one can do by just sending the right URL (Steve's message only suggested the simplest case).

[ Parent | Reply to this comment ]

Posted by chatur (202.51.xx.xx) on Sun 30 Dec 2007 at 07:31
Can that be done with a Perl script with multiple client connections?

[ Parent | Reply to this comment ]

Posted by funkysoul (80.218.xx.xx) on Sun 29 Nov 2009 at 14:29
Thank you for this very instructive howto. I first read it about a year ago but only recently found the time to write my own servers. I have written and published 13 versions of shell script web servers on my forum as preparation for my Arduino web server project. I have also covered the HTTP details in the thread "Understanding HTTP".
Happy networking

[ Parent | Reply to this comment ]
