
General web-related 'Frequently Asked Questions'

Listed below are an assortment of frequently, and not so frequently, asked questions relating to websites and webpages hosted by the School of Informatics.

If the question you wanted to ask isn't covered here, then please check the support FAQ as well.



How can I control access to my pages?
To control access, create a file called .htaccess (note the leading dot) in the directory which you want to restrict. The restrictions placed in this file will apply to this directory and to all levels of subsidiary directories within it.

It's usual to restrict access based either on who someone is or on where they are connecting from (their IP address).

To restrict users based on their IP address, use something like:

deny from all
allow from

This example would restrict access to most machines within the University's network. However, it would exclude the student accommodation network (ResNet). To allow ResNet machines to access the pages too, add the following line:

allow from
Remember that IP address-based restrictions are by no means 100% secure, especially as the number of machines you allow access to increases. Note that the move to IPv6 address space means that you will also see IPv6-style addresses, which can also be used in .htaccess files.
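As an illustration (using the reserved documentation ranges 192.0.2.0/24 and 2001:db8::/32 in place of the real network addresses, which you would substitute), a complete IP-based restriction in Apache 2.2 syntax might look like:

```apache
Order deny,allow
Deny from all
# an IPv4 prefix: matches 192.0.2.*
Allow from 192.0.2
# an IPv6 range in CIDR form
Allow from 2001:db8::/32
```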

To restrict access to a user (or group of users), they need to prove who they are. This is usually via a username/password mechanism. The Apache links below go into the gruesome detail, but the simplest methods are AuthType Basic (like the example below) or AuthType Cosign (see the Cosign page for more information).

Note that the above link is to version 2.2 of the Apache web server documentation; as we update web servers to SL7, we will also upgrade to version 2.4 of the Apache web server.

homepages.inf password restriction example

Here's a quick example of creating a password-restricted area on homepages.inf. Doing the following (replacing $USER with your UUN) will create a password-protected area; users will need to enter the username "test" and the password "foobar" to access it.
cd /public/homepages/$USER/web
mkdir private
chmod o-rwx private
cd private
# create an .htpasswd file containing the user "test"
htpasswd -bc .htpasswd test foobar
# create the matching .htaccess file
cat > .htaccess <<EOF
AuthType         Basic
AuthName         "Test passwd"
AuthUserFile     /public/homepages/$USER/web/private/.htpasswd
Require          valid-user
EOF
Note that this is not secure, so do not use your DICE username and password when creating the password file. However, it should be sufficient for simple, non-critical uses.

How do I stop my web page being indexed by search engines?
This solution is not guaranteed to work for all web robots that trawl the web looking for web pages, but it should work for the more well-known and well-behaved ones - Google, for example.


The web-master of a web service can create suitable entries in that service's /robots.txt file. If you are not in a position to maintain this file yourself, then you may be able to ask the web-master. However, for some of our services this cannot be done, as we don't have the effort to deal with every individual request. We will be looking at automating the process, but not yet.
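For illustration, a minimal /robots.txt asking all well-behaved robots to stay out of one area (the directory name here is a made-up example) looks like this:

```
User-agent: *
Disallow: /private/
```

Using Disallow: / instead would ask robots to skip the entire site.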

Meta tags

Newer robots pay attention to META tags in the HTML documents themselves. This is something you can do yourself, and it does not need action from the web-master. However, not all robots honour these tags.
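The META-tag approach amounts to one line in the <head> of each page you want excluded, for example:

```html
<meta name="robots" content="noindex, nofollow">
```

Here noindex asks the robot not to include the page in its index, and nofollow asks it not to follow the page's links.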

How do I stop email addresses being harvested?
It's a sad fact of life that certain individuals trawl the web looking for web pages with email addresses on them. These addresses are then harvested and sold on to spammers (people who send unsolicited junk email), who then clog up mailboxes with junk.

If a web page contains an email address (useful when telling someone how to get in contact), then it's likely that the address will start receiving junk mail.

When including an email address on a web page, try to disguise the fact that it's an email address. Humans should still be able to recognise it, but programs may not. For instance, rather than putting the address verbatim on your web page, write "send mail to SomeAddress (at) inf . ed . ac . uk". It looks a little clumsy, but people should know what you mean, and it should throw most automated harvesters off the scent.

You can go further by using convoluted HTML to obfuscate things. The visible link text can still read "send mail to ..." as normal, while the underlying HTML encodes the address itself as numerical character entities.

Using the mailto: URL in the <A> tag is usually an even more sure-fire way for harvesting programs to find real email addresses. If you must use mailto:, try to confuse things by encoding the address with numerical entities.
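As a sketch of the numerical-entity trick (someone@example.com is a made-up address standing in for a real one), this shell snippet encodes a mailto: link as HTML character entities:

```shell
# a hypothetical address, standing in for a real one
addr="mailto:someone@example.com"
# od prints each byte as a decimal number; sed wraps each one as &#N;
enc=$(printf '%s' "$addr" | od -An -tu1 | tr -s ' ' '\n' \
      | sed '/^$/d; s/.*/\&#&;/' | tr -d '\n')
# emit the obfuscated link, with the visible text disguised too
printf '<a href="%s">someone at example dot com</a>\n' "$enc"
```

A browser decodes the entities and shows a normal mailto: link, but a naive harvester scanning the raw HTML sees no recognisable address.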

None of these tricks will guarantee that the address won't be harvested, but they should help.

How to redirect from legacy personal pages to homepages.inf
If you've moved your old personal pages to homepages.inf, then you'll probably want anyone (or any search engine) that has bookmarked the old location of your web pages to be automatically forwarded to their new location. You can do this by creating an .htaccess file in the root of your legacy web space containing a line similar to the following:
# Redirect all requests to my new homepage
Redirect permanent /home/legacy_username
This assumes that you've moved all the files and kept their relative positions and names, so that someone visiting a page at the old location will automatically be redirected to the corresponding page at the new one.

Note that the second argument to Redirect (the location that you want to redirect away from) should match the portion of the URL after the machine address: for example, /home/legacy_username on a server using that layout, or /~legacy_username on one using the tilde style.
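For illustration only, with the made-up hostname www.example.org standing in for your new server, a complete redirect line would read:

```apache
# Redirect all requests to my new homepage
Redirect permanent /home/legacy_username http://www.example.org/legacy_username
```

Apache appends the rest of the requested path, so /home/legacy_username/foo/bar.html on the old server becomes http://www.example.org/legacy_username/foo/bar.html on the new one.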

How do I see who's been accessing my pages - logging?
Though technically there are ways you can log access to your web pages yourself, you must not do so without first obtaining the Head of School's permission, and then you must abide by the various legal requirements. See the Access to web logs topic for a bit more detail.

How does AFS affect web services?
To be able to serve pages to the outside world, a web server needs to be able to read the pages to be served. If the apache web server daemon can't read a file, it can't serve it. As the bulk of our file system is now based on AFS, and AFS ACLs restrict who (and what) can access files, the web servers need to be given access (via the ACLs) to read the files they then serve on the web.
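As a sketch of what granting that access involves (system:anyuser grants world-readable access, which is one common approach; the Informatics servers may instead use a dedicated AFS identity), the ACL on a served directory can be set with the fs command:

```
# grant read (r) and lookup (l) rights on the current directory
# to any AFS user, which includes the web server daemon
fs setacl -dir . -acl system:anyuser rl
```

Note that AFS ACLs apply per directory, not per file, so every subdirectory you want served needs a suitable ACL of its own.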

See the Serving Web pages from AFS page for more information.

How can I get my link to homepages to appear on my 'people' page?
The people pages are automatically generated from data held in Theon (the school database).

If you wish to have a 'Personal Page' link to your homepages appear beside your telephone and room number details on your contact page, then you can use the Self Service interface to add this (if you are either a member of staff or a PGR student). Changes made through this interface will take up to 24 hours to actually appear on your contact page.
