50 SEO Tips to better optimise your website
Optimise your website in 2020 with these 50 SEO tips!
Published by Alexis Pratsides
The role of an SEO agency is never complete. Whether it’s gathering new backlinks to stay relevant and competitive, making use of new features or plugins, or editing content to meet the latest digital marketing trends, the process is never-ending.
Website optimisation can seem daunting and it’s often difficult to know where to begin. So, here are 50 of our top SEO tips for website optimisation to help you move forward with your SEO strategy!
1 – Check for broken internal links and keep key links followable.
Make sure all broken internal links are removed or fixed. A broken internal link will show as a 404 error to those that land on the page. 404s can be harmful to SEO as they can lead to your audience exiting the site (thereby also affecting your user experience). A 404 left alone can also affect the rankings of the site, particularly if the number of 404s starts to increase. Updating internal links that point to 404s will mitigate this impact, as well as signal site maintenance to Google, which can help Google build trust in your website.
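Assuming you have already crawled your site and know which paths are live, a short sketch like the following can surface internal links that point at missing pages (the class name, the sample HTML and the `live_paths` set are all illustrative, not a specific tool):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects every href found in anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def find_broken_internal_links(html, live_paths):
    """Return internal hrefs that do not match a known live path."""
    parser = LinkExtractor()
    parser.feed(html)
    internal = [h for h in parser.links if h.startswith("/")]
    return [h for h in internal if h not in live_paths]
```

Anything the function returns is an internal link to update or remove.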
2 – Make sure your title tags describe the content with the keyword at the beginning.
Keywords are essential for search engines to show your website to those searching for what you are offering. Page Titles are one of the confirmed ranking factors for Google, so you want to get them right. Optimising your title tags is one of the quickest changes you can make to boost your organic rankings!
3 – Link your more authoritative pages to your key landing pages.
Authoritative pages pointing towards your key landing pages will boost rankings and traffic on your desired landing pages.
4 – Review your Google Search Console coverage errors.
Regularly check your website’s coverage report on Google Search Console (GSC) for any ‘errors’ or ‘issues’. Where errors are showing they should be addressed and resolved. Fixing coverage ‘errors’ is important as these pages won’t appear in Google so long as the error is in place, which can mean a loss of traffic to your site. Resolving coverage issues is also another signal to Google of site maintenance. If this seems like a daunting task then don’t worry, GSC will give you some indications on how to fix each error.
5 – Fix all of your 404 errors.
‘Error 404 not found’ will often lead to users leaving your website altogether as it weakens their trust. It can also be very frustrating and time-consuming for them to then navigate the site to find what they want.
404s occur when pages are deleted without being redirected. When a page returns a 404, a 301 redirect rule should be implemented, pointing to the most relevant page on the site. This will help preserve any rankings the deleted page had. Following that, all internal links that point to that page should be updated or removed.
6 – Fix all of your internal redirects, redirect chains and redirect loops.
Redirects add an extra ‘hop’ for both users and search engine crawlers. They can be harmful to your organic performance as they impact page load time and use up valuable crawl budget when search engines crawl and index your website. Where these are internal redirects, you can easily fix them so you don’t give bots extra work! Use Screaming Frog’s SEO Spider to audit your website’s redirects.
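To illustrate, here is a minimal sketch of how a crawler-style tool resolves a redirect map, counts the extra hops and spots loops (the `redirect_map` dictionary stands in for an export of your site’s redirect rules):

```python
def resolve_redirect(start, redirect_map, max_hops=10):
    """Follow a redirect map and report the final target,
    the number of hops taken, and whether a loop was detected."""
    seen = [start]
    current = start
    while current in redirect_map:
        current = redirect_map[current]
        if current in seen:
            # We have been here before: a redirect loop.
            return {"final": None, "hops": len(seen), "loop": True}
        seen.append(current)
        if len(seen) > max_hops:
            break
    return {"final": current, "hops": len(seen) - 1, "loop": False}
```

Any result with more than one hop is a chain worth collapsing: point the first URL straight at the final target.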
7 – Review your website indexation using site:domain.com
…and make sure all your content is indexed! If content is not indexed, it cannot appear in search results.
8 – Review your website speed with Google’s Lighthouse.
Google’s Lighthouse is a great tool for auditing your website’s “performance, accessibility, progressive web apps, SEO and more”.
9 – Write your content for users, not for bots.
Writing your content for bots will make it less enjoyable for users to read, leading to shorter sessions and a higher bounce rate, ultimately hurting your SEO.
10 – Include outbound links to authoritative websites.
Outbound links help search engines better understand the content on your page and can help increase your relevance, reputation and value scores while also encouraging backlinks.
11 – Avoid excessive 301 redirects on your website.
This will affect your website speed and consume crawl budget.
12 – Don’t underestimate meta descriptions.
Strong meta descriptions will improve your CTR (Click-Through Rate) as users are better informed about the contents of your page, making them more likely to click onto it.
13 – Remember that a 302 redirect does not send as much link equity as a 301.
If you have implemented 302 redirects, we recommend reviewing them to see if a 301 redirect would be better. 302s are ‘temporary’ redirects and are unable to pass on the full equity of any backlinks. 301 redirects are ‘permanent’ and are preferable. This is because they will pass on more link weighting from the original page to the new URL.
14 – Include LSI keywords in your content. Google will love it!
Latent Semantic Indexing Keywords are “conceptually related terms that search engines use to deeply understand the content on a webpage”.
Back in the day, keyword density was hugely important: if a word wasn’t used multiple times, search engines would struggle to understand that that’s what the page was about. Now, Google can also recognise semantically related words and better understand the page through them.
15 – Double check your website is mobile friendly and responsive.
It’s now essential to consider your web design on mobiles. In fact, how your website looks and works on mobile matters as much as, if not more than, how it does on a desktop or laptop. With people constantly on the go, we are living in a mobile-first era where people are using their phones more and more to view websites and buy online.
Here are six simple website simplification tips from Adam Enfroy via SEW:
- Reduce the number of pages on your site
- Add an improved search feature so users can still find what they need
- Increase white space to reduce the appearance of clutter
- Use clean lines and wide borders
- Use a simple font and make it larger
- Keep a maximum of two columns on mobile
16 – Update your WordPress plugins regularly to avoid security issues.
Over 80% of hacked websites are compromised because updates did not take place. Don’t put your website at risk! Make sure any custom WordPress plugins are up to date.
17 – Perform A/B testing using Google Optimize to improve the UX.
“A/B testing is a simple and cost-effective method for any aspect of your business. It provides you with insightful analysis and a profound look at the performance of your tested elements,” says Jade Nguyen.
The process is useful to all businesses, big or small, to help determine the right options for your website which drive the desired action the most. It essentially eliminates any guesswork and pre-conceptions on what you believe your audience will like and respond to the most, and instead puts it to the test.
18 – Contextual internal links have way more SEO value than navigational internal links.
A contextual link is one that is found within the body text of your page, with anchor text that reflects the idea of the link itself. This is different to a navigational internal link, which is found in the navigation menu at the top of the page. The navigation menu is usually dedicated to the primary landing pages on the site, limiting the value it can offer for SEO. Secondary pages such as blog articles wouldn’t often be found here. This makes contextual links more valuable, as they offer more opportunity for websites to link to pages found deeper in the navigation and keep them accessible.
Contextual links will use anchor text that gives users and search engines an insight into what the linked page is about. It is this context that makes them valuable for your organic performance.
19 – Avoid changing the URL structure of your website.
Changing your URL can lead to a lot of internal redirects, and all previously published posts will be pointing to the wrong URL. Google also doesn’t like URL changes, unless you’re updating the URL to be more ‘SEO friendly’ by including the page’s target keyword. Every time you change a URL there is a risk that your rankings will fluctuate whilst Google acknowledges the change; your rankings are not guaranteed to stay as they are. It is because of this that we recommend restricting the number of URL changes made.
20 – Review the keywords your competitors are ranking for and you are not.
Review where your competition is succeeding but you are not. Have a look at their rankings and identify keywords they rank for, but you do not. Maybe you’re missing a trick which can easily be pointed out by them!
21 – Remember to include the tag ‘sponsored’ to affiliate links.
The ‘sponsored’ tag can represent the involvement and support of another company, giving authority to your post. This is recognised by both bots and users. However, depending on the circumstances, be careful where you seek these out. For instance, many social media users distrust ‘sponsored’ posts from influencers, as they are aware the influencer is being paid to publish the post rather than posting purely from a pleasant experience of a product or service.
22 – Keep up to date with Google Webmaster Central Blog reviews.
Is there any other more reliable source to keep yourself up to date with the latest SEO news? As digital marketers you should keep a constant eye on this blog.
23 – Use a concise keyword strategy in your website mapping keywords to URLs.
Make sure you don’t run into keyword cannibalisation by mapping out keywords to URLs. In the end, what you need is unique content on each page, rather than the same set of keywords competing with each other in the SERP.
24 – Double check in GSC keywords with lots of impressions and few clicks.
If you identify keywords with lots of impressions and just a few clicks, you might want to look into the reason. Where you rank for keywords with high impression levels, check that your content meets the intent behind those keywords. Small changes you can try include updating the Title Tag or Meta Description to be better optimised towards these keywords, as well as offering more engaging text that encourages users to click on your result. Your CTR should increase, and so should your rankings.
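As a rough sketch, rows exported from GSC (query, clicks, impressions) can be filtered for high-impression, low-CTR keywords like this (the thresholds are illustrative, not official guidance):

```python
def low_ctr_queries(rows, min_impressions=1000, max_ctr=0.01):
    """Flag queries with lots of impressions but a poor click-through rate.
    Each row is a (query, clicks, impressions) tuple."""
    flagged = []
    for query, clicks, impressions in rows:
        ctr = clicks / impressions
        if impressions >= min_impressions and ctr < max_ctr:
            flagged.append((query, round(ctr, 4)))
    return flagged
```

The flagged queries are the ones whose titles and meta descriptions are worth rewriting first.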
25 – Make sure you don’t have duplicate content.
Having duplicate content on your site will confuse Google; it won’t know which version should rank. In these situations, often the result is none of the versions rank well. Unique content is the way forward, along with updating existing content to remain relevant and insightful.
Duplicate content can also be harmful to your brand’s reputation. If your audience picks up on your content’s similarity and duplication, they’ll perceive it as a low level of effort and time has gone into the website, decreasing their trust in the website and potentially overall perception of your brand. Reducing duplicate content will help with your brand management and present your website as a useful resource of information.
Use Siteliner.com to check your website for duplicate content.
26 – Review the search intent of the keyword you want to optimise before writing your article.
Not understanding why a keyword receives high searches may lead to you writing irrelevant content that users are uninterested in reading or will make you target the wrong keywords altogether.
27 – Use heatmaps to improve the website usability.
Website heatmaps aggregate data to visually show user behaviour on a webpage. They vary in colour to represent different levels of engagement and can help identify areas which you’d like to see high engagement but currently are not. Maybe you need to move it or highlight it in some way in order to increase engagement with the feature.
28 – Each webpage should have different Title Tags.
Duplicate title tags will lead to search engines only ranking one page of your website for that term. Essentially, avoid becoming your own competition. Crawl the site using Screaming Frog to find duplicate titles.
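A crawl export mapping URLs to title tags can be checked for duplicates in a few lines (the sample URLs and titles are made up):

```python
from collections import Counter

def duplicate_titles(pages):
    """Given a mapping of URL -> title tag, return each title used by
    more than one page, with the pages that share it."""
    counts = Counter(pages.values())
    return {title: [url for url, t in pages.items() if t == title]
            for title, n in counts.items() if n > 1}
```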
29 – Sync GSC, Analytics and Ahrefs with Screaming Frog to make better decisions.
We love data, don’t we?
30 – Find out the links your competitors have in common and try to get them.
Similarly to reviewing what your competitors are ranking for, you can also review their backlinks. This may reveal authoritative websites with a strong affiliation to your niche. Gaining these links will detract from the competitive advantage they might hold over you.
31 – Review unused plugins and delete them to save space.
Many plugins may actually slow down your site speed. Therefore, if they are not being used, delete them!
There are WordPress-specific tests available to check whether your plugins are slowing down your site.
32 – Set up website goals on Google Analytics using Google Tag Manager and identify underperforming landing pages.
Whether it’s converting to submit a form, watch a video or make a purchase, your web pages have a purpose and the desired action. Track them to better understand how users are navigating your website and which pages are generating less positive leads than others in order to improve them.
33 – Compress all images above 100kb to improve website load time.
Images can be compressed via websites such as TinyPNG.com. This is a very quick process that can improve your website loading speed dramatically!
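Before compressing, it helps to know which files exceed the threshold. Here is a small sketch that walks a folder and flags images above 100 KB (the extension list is an assumption):

```python
import os

def oversized_images(directory, limit_kb=100):
    """Walk a directory tree and list image files larger than limit_kb,
    with their size rounded to the nearest kilobyte."""
    image_exts = {".png", ".jpg", ".jpeg", ".gif", ".webp"}
    oversized = []
    for root, _dirs, files in os.walk(directory):
        for name in files:
            if os.path.splitext(name)[1].lower() in image_exts:
                path = os.path.join(root, name)
                size_kb = os.path.getsize(path) / 1024
                if size_kb > limit_kb:
                    oversized.append((path, round(size_kb)))
    return oversized
```

Run the compressed files back through the check to confirm they now fall under the limit.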
34 – Create a xml sitemap and upload it to Google Search Console and Bing Webmaster Tools to improve indexability.
Sitemaps are a powerful way to get your pages indexed on Google. This is exceptionally important for websites with a high number of pages, as it makes it easier for Googlebot to find your content and index your pages.
35 – Review your Robots.txt file and make sure you are not disallowing unnecessary pages.
You might want to prevent certain pages from being crawled by Google, e.g. thank-you pages, the WordPress admin area, parameter URLs, etc. However, when blocking a complete directory it is easy to block important pages or important resources by mistake. You or your website developers may also have temporarily blocked the whole domain and forgotten to enable crawling again.
To avoid this mistake, check the status of your robots.txt from time to time or check Google Search Console for Robots.txt errors.
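Python’s standard library ships a robots.txt parser, so you can sanity-check a rules file against the paths you care about before deploying it (the example rules and paths below are hypothetical):

```python
from urllib.robotparser import RobotFileParser

def check_robots(robots_txt, paths, agent="Googlebot"):
    """Parse a robots.txt body and report, per path, whether the
    given user agent is allowed to crawl it."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return {path: parser.can_fetch(agent, path) for path in paths}
```

If an important page comes back `False`, fix the Disallow rule before it costs you traffic.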
36 – Only build links from industry-related websites with a low spam score.
Search engines use links on your website to better understand its contents. Therefore, only link to those in a related field. Your website will be leveraged through strong affiliation and authoritative links.
37 – Include your sitemap URL in Robots.txt for Googlebot to find it more easily.
Help Google find your sitemap by including the URL at the end of your Robots.txt file.
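For example, a minimal robots.txt with a sitemap reference might look like this (`example.com` is a placeholder for your own domain):

```
User-agent: *
Disallow: /wp-admin/

Sitemap: https://www.example.com/sitemap.xml
```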
38 – Make sure the number of URLs match approximately with the indexed URLs in the SERP.
Use the site:domain.com operator to identify how many URLs have been indexed in your preferred search engine and compare it with the URLs your crawler tool has found. If the numbers differ from each other significantly, you probably have an indexation issue.
39 – Get a ‘https’ certificate if you haven’t already.
“84% of online shoppers say they abandon a purchase when they realise the website is insecure.” SSL gives your website that little padlock in the browser bar and can benefit your site dramatically.
40 – Include descriptive Alt Text in your images.
This helps Google understand what the image is about in order to serve it in search results.
41 – Try to get local citations from trusted directories with NAP (Name, Address, Phone number) consistency.
This is a great way to provide your key information to local individuals as well as helping you rank well locally.
42 – Ask your clients, colleagues and providers to leave you a positive review on Google, Yelp, Trustpilot and Glassdoor.
People value other people’s opinions and reviews. A statement is much more reliable if it hasn’t come directly from the brand. Not only that but individuals will often view brands/websites in order of their average review score. So ensuring you have plenty of positive reviews will help drive more traffic to your website.
43 – Don’t overuse exact anchor text.
Too many inbound links with the exact same anchor text can actually make it look suspicious and spammy so make sure it’s different each time.
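One way to spot over-optimisation is to compute the share of each anchor text across your backlink profile; a sketch (the sample anchors are invented, and any threshold you apply is a judgment call):

```python
from collections import Counter

def anchor_text_ratios(anchors):
    """Return the share of each anchor text across a backlink profile.
    A single exact-match phrase dominating the profile is a spam signal."""
    counts = Counter(anchors)
    total = len(anchors)
    return {text: round(n / total, 2) for text, n in counts.most_common()}
```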
44 – Review that all your webpages provide value to the user and have enough content.
Strong content will encourage higher levels of engagement which has become increasingly powerful in pushing pages up the ranks. To rank in organic search, your content needs to be helpful and meet the needs of the user. If it does not do this, it will start to drop down the rankings.
45 – Make sure you don’t have duplicate versions of your website (www, non-www, http and https).
A quick look using the site:domain.com operator can tell you if you have duplicate versions of the website. If that’s the case, choose one version and redirect the rest to the chosen one. Similar to duplicate pages, having the entire site duplicated can confuse Google as it won’t know which version to rank (especially if you don’t have canonical tags in place).
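The redirect rules themselves live on your server, but the mapping is simple to express: every variant should resolve to one preferred scheme and host. A sketch of that canonicalisation (the preferred host is a placeholder):

```python
from urllib.parse import urlsplit, urlunsplit

def preferred_version(url, scheme="https", host="www.example.com"):
    """Rewrite any variant of a URL (http, non-www, etc.) to the chosen
    canonical scheme and host; path, query and fragment are preserved."""
    parts = urlsplit(url)
    return urlunsplit((scheme, host, parts.path, parts.query, parts.fragment))
```

Each 301 rule on the server should send a variant URL to exactly this preferred form.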
46 – Create a Google Business Profile account and optimise it to improve your Local SEO.
Local SEO has advanced significantly over the last few years. When building out your local SEO strategy, optimising a Google Business Profile account will help Google understand what your business offers and therefore rank you more highly for related searches in your local area.
47 – Include real authors for your content with a link to a profile page with biography.
Real authors can help your content signal Expertise, Authority and Trust; all factors Google considers when evaluating and ranking content. Google and your audience are much more likely to engage with and trust a piece of writing if it’s written by an expert in the field. Creating an author page where you can shout out their achievements and expertise can help showcase this.
48 – Include client’s testimonials on your website to provide trust.
When people want to know if a service or product is good, they’ll often turn to others for an opinion. Word of mouth is still a strong way of passing and receiving recommendations, so you want to try and incorporate that on your website. Testimonials and case studies are great ways of showcasing the positive experience clients or customers have had with your website; put time into creating these and make them visible across your site.
49 – Include your most important keyword in the page title of the homepage.
Help bots and users understand what your website is about by including your most important keyword in your homepage title.
50 – Handle parameter URLs with canonical tags pointing to the primary source of content.
This trick will sort out your duplicate content issues and pass authority to your main content.
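To illustrate, a canonical tag for a parameter URL simply points back at the clean, parameter-free version of the page (the example URL is hypothetical):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_tag(url):
    """Strip query parameters and fragments from a parameter URL and emit
    the canonical link tag pointing at the primary version of the page."""
    parts = urlsplit(url)
    clean = urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))
    return f'<link rel="canonical" href="{clean}" />'
```

The emitted tag belongs in the `<head>` of every parameterised variant of the page.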