General web questions

General web-related 'Frequently Asked Questions'

Listed below is an assortment of frequently, and not so frequently, asked questions relating to websites and web pages hosted by the School of Informatics.

If the question you wanted to ask isn't covered here, then please check the support FAQ as well.


How can I control access to my pages?
To control access you need to create a file called .htaccess (note the leading dot) in the directory that you want to restrict. This imposes the restrictions on that directory and on all directories below it.

Generally you restrict access based on who someone is, or on where they are connecting from (their IP address).

To restrict users based on the IP address that they're connecting from, you need to use something like:

order deny,allow
deny from all
allow from 129.215.0.0/16

This example restricts access to machines within the University's network, ie within .ed.ac.uk. However, it excludes the student accommodation network (ResNet); to allow those machines to see the pages too, add the following line:

allow from 10.0.0.0/8

Remember that IP address-based restrictions are by no means 100% secure, especially as the number of machines you allow access to increases.
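
Putting the two allow lines together, a complete .htaccess for this kind of restriction might look like the following sketch:

# allow University machines (129.215.0.0/16) and ResNet (10.0.0.0/8) only
order deny,allow
deny from all
allow from 129.215.0.0/16
allow from 10.0.0.0/8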

To restrict access to a particular user (or group of users), they need to prove who they are, usually via a username/password mechanism. The Apache documentation goes into the gruesome detail, but the simplest methods are AuthType Basic (like the example below) or AuthType Cosign (see the Cosign page for more information).

homepages.inf password restriction example

As a quick example of creating a password-restricted area on homepages.inf.ed.ac.uk: if you do the following (replacing $USER with your UUN), you'll end up with a password-protected area at http://homepages.inf.ed.ac.uk/$USER/private/, where you'll need to enter the username "test" and the password "foobar" to access the area.
cd /public/homepages/$USER/web
mkdir private
chmod o-rwx private
cd private
htpasswd -bc .htpasswd test foobar
cat > .htaccess <<EOF
# require a valid user from the .htpasswd file created above
AuthType         Basic
AuthName         "Test passwd"
AuthUserFile     /public/homepages/$USER/web/private/.htpasswd
Require          valid-user
EOF

Note that Basic authentication is not secure, so do not use your DICE username and password when creating the password file; it should, however, be sufficient for simple, non-critical uses.
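
To add further users to the same area later, run htpasswd without the -c (create) flag so the existing file is kept; the username and password below are just placeholders:

htpasswd -b /public/homepages/$USER/web/private/.htpasswd anotheruser anotherpass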

How do I stop my web page being indexed by search engines?
You can't guarantee that this will work for all of the web robots that trawl the web looking for pages to index, but the better-known, well-behaved ones (Google, for example) do respect the mechanisms below.

robots.txt

The web-master of a web service can create suitable entries in the service's /robots.txt file. If you are not in a position to maintain this file yourself, then you may be able to ask the web-master. However, for services like homepages.inf.ed.ac.uk or groups.inf.ed.ac.uk this cannot be done, as we don't have the effort available to deal with every individual request. We will be looking at automating the process, but not yet.
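
For reference, if you do maintain a /robots.txt file yourself, an entry asking robots not to index a directory looks something like this (the path is just an illustrative placeholder):

# ask all robots not to index anything under /private/
User-agent: *
Disallow: /private/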

Meta tags

Newer robots pay attention to META tags in the HTML documents themselves. This is something you can do yourself, and it does not need action on the part of the web-master. However, not all robots support this.
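
As a sketch, the relevant tag goes in the <head> of each page you want kept out of search indexes:

<!-- ask robots not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">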

How do I stop email addresses being harvested?
It's a sad fact of life that certain individuals trawl the web looking for web pages with email addresses on them. These addresses are then harvested and sold on to spammers (people who send unsolicited junk email), who then clog mailboxes up with junk.

This means that if you have a web page containing an email address (useful when telling someone how to get in contact), then it's likely that the address will start receiving junk mail.

When including an email address on a web page, try to disguise the fact that it's an email address. Humans should still be able to recognise it, but programs that go around looking for addresses probably will not. So rather than putting "send mail to SomeAddress@inf.ed.ac.uk" on your web page, write "send mail to SomeAddress (at) inf . ed . ac . uk". It looks a little clumsy, but people should know what you mean, and it should throw most automated harvesters off the scent.

You can go further by using convoluted HTML to obfuscate things. For example, a link that still displays as "send mail to SomeAddress@inf.ed.ac.uk" in the browser can be written in the HTML source as:

send mail to <a
href="mailto:SomeAddress&#64;inf&#46;ed.ac.uk">SomeAddress&#64;inf&#46;ed.ac.uk</a>

This example also uses a mailto: URL in the <A> tag. This is usually an even more sure-fire way for harvesting programs to find real email addresses; however, if you must use it, you can still try to confuse things by using numerical character entities as in the example above.

None of these tricks will guarantee that the address won't be harvested, but they should help.

How to redirect from legacy personal pages to homepages.inf
If you've moved your old personal pages from the likes of www.dcs.ed.ac.uk/home/legacy_username/ to homepages.inf.ed.ac.uk/dice_username/, then you'll probably want anyone (or any search engine) that has bookmarked the old location of your web pages to be automatically forwarded to the new location. You can do this by creating an .htaccess file in the root of your legacy web space containing a line similar to the following:
# Redirect all requests to my new homepage
Redirect permanent /home/legacy_username http://homepages.inf.ed.ac.uk/dice_username
This assumes that you've moved all the files and kept their relative positions and names. So someone visiting www.dcs.ed.ac.uk/home/neilb/foo/bar.html will automatically be redirected to homepages.inf.ed.ac.uk/neilb/foo/bar.html.

Note that the second argument to Redirect (the location you want to redirect away from) should match the portion of the URL after the machine name. For example, the redirect-from part of www.dcs.ed.ac.uk/home/legacy_username would be /home/legacy_username, and that of www.cogsci.ed.ac.uk/~legacy_username would be /~legacy_username.
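
So, as a sketch, the equivalent line for the second case (again substituting your own usernames) would be:

# Redirect all requests from the old cogsci pages to my new homepage
Redirect permanent /~legacy_username http://homepages.inf.ed.ac.uk/dice_username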

How do I see who's been accessing my pages - logging?
Though technically there are ways you can log access to your web pages yourself, you must not do so without first obtaining the Head of School's permission, and then you must abide by the various legal requirements. See the Access to web logs topic for a bit more detail.

How does AFS affect web services?
To be able to serve pages to the outside world, a web server needs to be able to read the pages it serves: if the Apache web server daemon can't read a file, it can't serve it. As the bulk of our file system is now based on AFS, and AFS ACLs restrict who (and what) can access files, the web servers need to be given access (via those ACLs) to read the files they serve on the web.

See the Serving Web pages from AFS page for more information.
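
As an illustration only, granting read access on an AFS directory is done with fs setacl; the principal name below is a placeholder, since the actual identity used by the Informatics web servers is described on the Serving Web pages from AFS page:

# placeholder example: give the web server principal read (r) and lookup (l)
# rights on the directory to be served
fs setacl ~/web webserver-principal rl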

How can I get my link to homepages to appear on my 'people' page?
The people pages under http://www.inf.ed.ac.uk/people/ are automatically generated from data held in the school database.

If you wish to have a 'Personal Page' link to your homepages appear beside your telephone and room number details on your contact page, then please ask the Informatics HR office, IF-4.36 (infhr@inf.ed.ac.uk), if you are a member of staff, or Informatics Student Services if you are a PGR student.

Last reviewed: 13/11/2014
