Search Engine Optimisation (SEO) can be baffling – and with talk of ‘Black Hat SEO’ and ‘White Hat SEO’ it’s no wonder it seems like a Dark Art only Wizards can master.
A quick Google search for information about SEO will return hundreds of in-depth articles and hour-long YouTube videos, making the task of understanding even the basics seem daunting.
To help demystify SEO and show how to apply it to your own website, we’ve pulled together this quick ‘SEO Audit’. It’s a list of questions to answer to identify whether your existing website is SEO-friendly from a technical perspective. It’s also a checklist you can use during the development of a new website.
If you can answer these questions with a ‘Yes’ then you’re #winning. If not, by the end of this post you will know what you need to do to begin optimising your site for search engines.
A key thing to remember about SEO is that optimising your website for search engines should also improve its user experience. Google’s main aim is to serve users the best content for their search phrase: it’s all about relevance, and about ensuring people quickly and easily get to the information they’re searching for. That’s a great aim for your own website too. If you help visitors find what they need quickly, and the information is relevant, they will visit your website again and again.
SEO Audit – Technical Set Up For SEO
Is your website mobile-friendly?
The key place to start is ensuring your website is mobile-friendly. How do you know if your site is mobile-friendly? That’s easy: take Google’s Mobile Friendly Test. Google imposes a search rankings penalty on sites that do not work well on mobile devices, so if there’s one place to invest your resources, it’s making your website responsive.
Read more: What is a responsive website?
Is your website quick to load?
Google PageSpeed Insights is a useful tool for measuring the performance of your site for both desktop and mobile browsers. PageSpeed Insights will identify whether your website is ‘Good’, ‘Needs Work’ or ‘Poor’ for users, based on the time it takes to load your web pages. It will score your site from 0 to 100 and provide a list of optimisations you can make to improve your score. A high score suggests your site loads quickly, which is good for users, but the score alone shouldn’t be used to judge the overall user experience.
Is your site server secured with an SSL certificate?
In its simplest form, SSL is the difference between URLs that begin with http:// (insecure site) and https:// (secure site).
SSL is essential for protecting your website, even if it doesn’t handle sensitive information like credit card details. It provides privacy, critical security and data integrity for both your website and your users’ personal information.
Google is now pushing websites to implement SSL certificates by penalising sites without one: browsers like Chrome warn users that the site they are about to visit is ‘not secure’ and discourage them from continuing. HTTPS is also a ranking signal, so without an SSL certificate your site won’t rank as well in Google Search results.
Luckily for Webstruxure clients, we have implemented an SSL certificate on all the websites we host, for free – so you can be assured that your site is secure and Google won’t penalise you.
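If you manage your own hosting, one common way to ensure visitors always land on the secure version of your site is a server-level redirect. As a sketch (assuming an Apache server with mod_rewrite enabled; nginx and other servers use different syntax):

```apache
# Hypothetical Apache .htaccess rules - assumes mod_rewrite is enabled.
RewriteEngine On
# If the request did not arrive over HTTPS...
RewriteCond %{HTTPS} off
# ...issue a permanent (301) redirect to the https:// version of the same URL.
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

The 301 (permanent) redirect also tells search engines that the https:// version is the one to index.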
Do you have more than one URL directing to what looks like the same web page content?
Individual web pages can sometimes be loaded from multiple URLs. For example, if your homepage can be loaded from both https://www.yourdomain.co.nz and https://yourdomain.co.nz, without one redirecting to the preferred domain, Google may treat these as two separate pages and potentially flag your content as duplicate. This can negatively impact your search ranking.
A solution to this issue is canonicalisation: implementing a rel=canonical tag indicates to search engines which page is the original (or ‘canonical’) version. The tag tells search engines which version to index, consolidating ranking signals onto the canonical page rather than splitting them across duplicates.
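In practice the canonical tag is a single line in the page’s head. A minimal sketch (yourdomain.co.nz is a placeholder for your own preferred URL):

```html
<!-- Placed in the <head> of every version of the page, each pointing
     at the one preferred (canonical) URL. -->
<link rel="canonical" href="https://www.yourdomain.co.nz/" />
```

Plugins such as Yoast SEO can add this tag for you automatically on WordPress sites.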
Can you identify crawl errors and set up 301 redirects?
Using Google Search Console, or free tools like Screaming Frog SEO Spider, you can find out whether there are any ‘crawl errors’ on your website – that is, pages that return a ‘404 Not Found’. Most websites contain some 404 errors, and by themselves they don’t harm a site’s rankings. However, if a web page has been deleted and there is an alternative page for that content, a 301 redirect should be set up, directing users (and search engine bots) to the new page. This ensures that any deleted web pages still showing in search engine results take users to a relevant page, rather than a 404 error.
There are plugins to aid with setting up 301 redirects yourself, such as Redirection for WordPress.
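If you prefer to manage redirects at the server level instead of via a plugin, a 301 redirect can be a one-line rule. A sketch, assuming an Apache server (both paths are placeholders):

```apache
# Hypothetical Apache .htaccess rule: permanently redirect a deleted
# page to its closest replacement with a 301 status.
Redirect 301 /old-services-page/ /services/
```

The ‘301’ status tells browsers and search engines the move is permanent, so ranking signals pass to the new URL.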
Where a page has been deleted and there is no alternative, a 404 error is inevitable. For these instances, your 404 page should be customised to contain links to key content on your site, plus some contact information so users can get in touch if they need further help.
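Pointing your server at that customised error page is usually a single directive. A sketch, again assuming Apache (/404.html is a placeholder for your own customised page):

```apache
# Hypothetical Apache directive: serve the customised error page
# whenever a request returns 404 Not Found.
ErrorDocument 404 /404.html
```

On WordPress, the theme’s 404.php template plays the same role without any server configuration.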
Have you implemented an XML Sitemap in Google Search Console?
XML sitemaps help search engines crawl your site by providing a full list of your web pages, including any that might otherwise be missed during crawling. Once created, an XML sitemap should be submitted to Google Search Console.
There are plugins, such as Yoast SEO for WordPress, that automagically create and update XML Sitemaps – making the task of submitting a sitemap much simpler.
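For reference, an XML sitemap is just a plain text file listing your URLs. A minimal sketch (the domain and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.yourdomain.co.nz/</loc>
    <lastmod>2019-01-01</lastmod>
  </url>
  <url>
    <loc>https://www.yourdomain.co.nz/services/</loc>
    <lastmod>2019-01-01</lastmod>
  </url>
</urlset>
```

The file is typically saved as sitemap.xml at the root of your site, and it’s that URL you submit in Google Search Console.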
Do you have unique pages with individual URLs?
If search engine bots can’t work out what the main topic of a page is, they won’t display the page in search results for your relevant keyword or keywords.
Having unique pages with individual URLs is a key starting point. Some websites are built as ‘one page’ sites, or have a ‘Services’ section that appears to contain in-depth information about multiple services when the information actually lives on a single page. The site has been developed so users can click on buttons that open up individual snippets of content, helping them read in bite-sized chunks. While the user stays on the same page, the buttons they click simply ‘show’ or ‘hide’ content.
The issue for search engine bots is that they read ALL the content the page contains at once, whether it’s shown or hidden for the user. Lots of content, about multiple topics, means the bots can’t categorise your content into one topic or keyword. And if the bots can’t decide what your content is about, then they won’t rank your site highly in search results.
One way to determine whether your site has unique pages for each piece of content is to click on a button and see if the URL in your browser stays the same or changes. If it stays the same, you may have a problem.
Can you add metadata to every page on your website?
In addition to having unique pages with individual URLs, you need to add metadata, or meta tags, to every page. Meta tags are snippets of text that describe a page’s content, helping to tell search engines what a page is about.
Search engines can use meta tags to display the title and description in search results pages, so metadata is important in helping users choose one search result over another. The more specific, unique and compelling your meta tags are, the better – which is why being able to create and update them yourself matters.
WordPress has plugins such as Yoast SEO and All In One SEO, which allow admin users to add and update meta across individual pages.
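Under the hood, the two meta tags that matter most are the page title and the meta description. A hypothetical sketch (the title and description text are placeholders; common guidance is to keep titles to roughly 50–60 characters and descriptions to around 155):

```html
<head>
  <!-- Shown as the clickable headline in search results -->
  <title>Web Design Services | Your Company Name</title>
  <!-- Often shown as the snippet beneath the headline -->
  <meta name="description"
        content="A one-sentence summary of this page, written to persuade
                 searchers that it answers their query." />
</head>
```

Plugins like Yoast SEO simply write these tags into each page for you, so you never need to edit the HTML directly.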
SEO Audit – On Page and Off Page
The points above are all about ensuring your website is technically set up to support an SEO strategy. If your website ticks those boxes, you have a tool that’s going to help rather than hinder your SEO efforts.
Our next posts, coming soon, will take you through an SEO Audit of On Page and Off Page Factors.