Why Do So Many Websites Fail On The Most Basic SEO?

I added a link checker to EssexPortal.co.uk and it found loads of broken links caused by design changes on other sites, e.g. links to index.html pages that no longer exist. I was tempted to start contacting some of the owners, but then realised how many there were – it took me a whole day just to fix the links on my side.
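A checker doesn't have to be anything fancy, either. Here's a minimal sketch in Python using only the standard library (the start URL is just an example; a real checker would crawl the whole site and throttle its requests):

```python
import urllib.error
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkParser(HTMLParser):
    """Collects href values from <a> tags on one page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def check_links(page_url):
    html = urllib.request.urlopen(page_url, timeout=10).read().decode("utf-8", "replace")
    parser = LinkParser()
    parser.feed(html)
    for href in parser.links:
        url = urljoin(page_url, href)        # resolve relative links
        if not url.startswith("http"):
            continue                         # skip mailto:, javascript:, etc.
        try:
            status = urllib.request.urlopen(url, timeout=10).status
        except urllib.error.HTTPError as e:
            status = e.code                  # dead page, e.g. 404
        except urllib.error.URLError as e:
            status = e.reason                # DNS or connection failure
        print(status, url)

check_links("http://www.essexportal.co.uk/")  # example start page
```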

One example of shoddy SEO is the Chelmsford City Council site. At some point in the last year or so they switched to the www subdomain, and now every link pointing to their site without the www fails.

e.g. http://chelmsford.gov.uk/theatres now fails with error 105 according to Chrome (ERR_NAME_NOT_RESOLVED – the bare domain doesn't even resolve). http://www.chelmsford.gov.uk/theatres is alive and well, though.
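The fix is cheap, too. They'd first need a DNS record for the bare domain, since right now it doesn't resolve at all; after that, assuming the site runs on Apache with mod_rewrite (an assumption on my part – I don't know their stack), a couple of lines would 301 every non-www request over to the www version:

```apache
# Sketch: send bare-domain traffic to www with a permanent redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^chelmsford\.gov\.uk$ [NC]
RewriteRule ^(.*)$ http://www.chelmsford.gov.uk/$1 [R=301,L]
```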

Another example popped up today on my Dad's horse racing blog, Runnersandriders.co.uk – http://attheraces.com/index.asp now fails, but drop the index.asp and it loads fine (as attheraces.com/). I'd guess a new website was built and nobody thought to redirect the old pages.
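Again, one line of config would have saved every inbound link to the old page. If the site runs Apache (another assumption – mod_alias provides the Redirect directive), something like this would do it:

```apache
# Sketch: permanently redirect the retired index.asp to the new homepage
Redirect 301 /index.asp http://attheraces.com/
```

A 301 also tells search engines to pass the old page's ranking signals along to the new address, rather than just dropping them.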

What is shocking is that installing a link checker on an old website turns up so many of these problems. They are easily fixed and would help the sites rank better. Why don't the webmasters take the time to check these things? Tsk.