When it comes to SEO, everyone likes to talk about how ‘content is king’ and how important user experience is, but there’s a part of search optimisation that’s often overlooked – technical SEO.

Your technical SEO is the supporting framework for your creative SEO efforts. While your content should be designed around users and their search queries, your technical SEO elements are there to ensure search engines can ‘read’, evaluate and prioritise your website in their results (though they often overlap with usability and user experience as well).

Don’t think of technical SEO as trying to appeal to search engines in the same way you want to appeal to customers. It’s more like putting your site into a language and context that they can understand, so that they can evaluate you on your own merits.

Technical SEO often requires more knowledge of coding and web development, but there are lots of tools out there to help small and medium-sized businesses get to grips with their search optimisation. Here are a few tips for getting started.

Set up your XML sitemap

Before we go any further, let’s explain how search engines actually find your site and add it to their results. Little automated programs known as spiders or web crawlers explore the web by ‘clicking’ on all the links they find and recording information about all the pages they land on. This exploration is known as crawling, and its end result is indexing – the pages the crawlers discover are systematically added to the engine’s index of searchable pages.

To help get your site indexed, it’s best to set up an XML sitemap. This lays out the structure of your site’s pages and provides links to help Google’s crawler software (known as Googlebot) work its magic. This is especially important for larger sites with more than four or five pages, where the structure is less straightforward and it’s easier to miss newer pages during the indexing process.
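
To give you an idea of what you’re aiming for, here’s a minimal sketch of a sitemap for a two-page site – the URLs and dates are placeholders for your own pages:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per page: <loc> is the page address,
           <lastmod> is the optional last-modified date -->
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2018-05-01</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/about</loc>
        <lastmod>2018-04-20</lastmod>
      </url>
    </urlset>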

Don’t worry if you’re not sure how to create one, because there are plenty of free third-party XML sitemap generators out there on the web. Google themselves recommend XML-Sitemaps.com – all you need to do is put in your URL and fill in a brief form, and you’ll have a full sitemap. This can then be uploaded to your site and submitted to Google via your Search Console (aka Webmaster Tools) account.

Get your robots.txt file right

You don’t necessarily want Google to index every page on your site – some pages might be home to backend stuff that you don’t want users to see, and you definitely don’t want those pages appearing as search results (or dragging down your credibility as a search result).

This is where your robots.txt file comes in. This little text file tells web crawlers and other automated ‘bots’ which pages you don’t want them to access (and which ones they can access freely).
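
As a rough illustration, a simple robots.txt might look like this – the /admin/ path is a hypothetical example of a backend area, not a rule to copy verbatim:

    # These rules apply to all crawlers
    User-agent: *
    # Please don't crawl the backend area
    Disallow: /admin/
    # Anything not disallowed can be crawled as normal

    # While you're at it, point crawlers at your sitemap
    Sitemap: https://www.example.com/sitemap.xml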

Unfortunately, many SMEs make the mistake of either not including a robots.txt file on their site at all, or setting it up incorrectly so that their entire website ends up getting blocked from appearing in search results. If your site isn’t crawlable, it’s not indexable, which means absolutely no search traffic for you until you unblock it!

Fortunately, your Search Console will be able to tell you which pages are being blocked, and/or if you’ve made any syntax errors. Just go to your dashboard and open the robots.txt Tester tool under the Crawl section of the menu.

Don’t use your robots.txt to try to block private areas of your site. No bot is required to adhere to robots.txt rules, and while Googlebot may respect your privacy requests, some bots and search crawlers won’t. In addition, anyone can read the file to find out exactly which pages you’re trying to hide – and if someone else links to those pages, they can still be crawled and indexed anyway.

Instead, lock any private pages behind a password system. Googlebot can’t access these and neither can anyone else (unless they know the login details, of course).
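
How you do this depends on your hosting setup, but to give one example: on an Apache server, you could protect a directory with HTTP basic authentication via an .htaccess file. The path below is hypothetical, and you’d first need to create the password file with the htpasswd utility:

    AuthType Basic
    AuthName "Private area"
    # Full server path to the password file created with htpasswd
    AuthUserFile /home/yoursite/.htpasswd
    Require valid-user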

Repair broken links, images and features

At some point in your website’s lifetime, something is going to break. Perhaps a full site makeover will screw up your URL structure and render a bunch of your internal links useless, or a coding bug will mess up your design layout and content formatting. Encountering these problems is inevitable, but putting up with them is a big mistake.

Put simply, broken features and functionality make you look bad – you’re delivering a poor user experience to your site visitors and making your business appear unprofessional, which means Google will rank you lower than sites which do work as expected.

Dead internal links in particular come with their own set of problems – as well as cutting off users from where they want to go on your site, they can also prevent crawlers from accessing and indexing your site properly, and can throw away all the link equity you’ve built up over years of SEO work. (Keep in mind that any time you delete a page without redirecting it, anyone who follows a link to that page will hit a 404 error.)

Thankfully, finding and repairing broken links is fairly straightforward. Just use Xenu Link Sleuth to run a check across your site, or check your Search Console dashboard for crawl errors. Update each dead link with the correct, current URL – or, if the link’s destination no longer exists on your site, redirect the old URL to its closest counterpart.
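
If your site runs on an Apache server, for example, a permanent (301) redirect from a deleted page to its closest replacement can be a single line in your .htaccess file – the paths here are invented for illustration:

    # Permanently send visitors (and link equity) to the page's successor
    Redirect 301 /our-old-services-page https://www.example.com/services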

Sometimes features are ‘broken’ from the moment they’re introduced, simply because they haven’t been properly optimised for their purpose. For example, does your on-site search bar return the right pages? Maybe not, unless you’ve optimised it for the kinds of queries your users are searching for.

Make your content ‘readable’ for search engines

As we’re sure you’re aware, all content on your site should be there for your users, not for search engines. Chasing high rankings by pushing out lots of weak, spammy content simply won’t work (and even if it did, you wouldn’t get any conversions from it anyway).

However, you still need to let Google know what’s on your site – and depending on the format you’re using to present it, you may be preventing Google’s crawlers from accessing your content at all.

For example, Googlebot can’t ‘read’ and index any content which is nested in iframes, so it’s better to remove them. Videos and rich media formats such as Silverlight are also something of a struggle for Googlebot to decipher, so be sure to provide a text alternative or a text-based description which explains what the content is.
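
In practice, that can be as simple as pairing the embedded media with ordinary HTML text on the same page – something along these lines, with the file name and wording as placeholders:

    <video src="/media/product-demo.mp4" controls></video>
    <!-- A crawlable, text-based summary of what the video shows -->
    <p>In this two-minute demo, we walk through setting up your
    account and creating your first project.</p>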

When it comes to text within images, the situation’s a little less clear cut. It’s true that Google can extract text from images to a pretty good standard – however, it’s thought that this text probably doesn’t affect your rankings. In any case, it’s best to leave each format to do what it does best: image files for photos and graphics, and text for… well, text.

Avoid duplication

The web is full of examples of duplicate content – highly similar or identical content which appears across more than one page. Sometimes content is copied unintentionally or without malicious intent, while other times it’s there to exploit search rankings.

Search engines want to avoid duplicate pages as much as possible, as they harm the variety and quality of results. This is why Google ranks the original page above any copies (and may leave the copies out of the top results entirely).

Obviously, if your content strategy is well planned out (and you’re not just copying and pasting your posts each time) then you don’t need to worry about your content being penalised. Unfortunately, it’s easy to duplicate different elements across multiple pages by complete accident, which can make it appear to Google as if you’re just posting lots of copies of the same pages.

Make sure all your title and meta description tags are unique and descriptive. Don’t just put your company name, and certainly don’t leave your homepage title tag as simply ‘Homepage’!
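
For example, a product page’s head section might carry something like this – the business and wording are invented for illustration:

    <title>Handmade Oak Dining Tables | Example Furniture Co</title>
    <meta name="description" content="Browse our range of handmade
    oak dining tables, built to order and delivered nationwide.">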

Pay attention to your URLs too – some pages may end up with multiple addresses due to the parameters you’re using. You can tell Google Search Console how to handle your URL parameters, or set up canonical tags for the different URLs, pointing all your link juice to the main URL you specify.
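
A canonical tag is just one line in the page’s head, and every parameter-based variation of the page carries the same line pointing back to one preferred address – the URL here is a placeholder:

    <!-- Tells search engines which version of this page is the master copy -->
    <link rel="canonical" href="https://www.example.com/products/dining-tables">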

For more SEO advice, follow Advantec on Facebook, Twitter and LinkedIn for all the latest blog updates – or to discuss technical SEO for your next website, contact us to talk to our friendly, expert team.

 
