Checklist for a healthy website

When was the last time you checked the health of your domain? It might be time for a tune up if it's been a while. While your site doesn't have to be perfect, don't let too many problems pile up or Google will direct your visitors to your competitors.

Regrettably, not every website issue is black and white. Some may have an influence on your entire domain, while others may simply have an impact on certain pages. We created a simple website health checklist to make your life easier by addressing the most common issues that will harm your SEO.

Looking for something a little more advanced? Take a look at our comprehensive SEO checklist!

Points to Remember

  • Learn how the many aspects of website health affect your SEO performance.
  • Use sitemaps, standardised URLs, correct Rel Canonical tags, and internal links to make your site crawlable.
  • By employing HTTPS and disavowing spammy backlinks, you can reduce your chances of being tagged as spam.
  • Improve the user experience by optimising the site's speed, streamlining the HTML, and making it mobile friendly.

Checklist for a healthy website

If you want to improve your SEO, make a list of specific, concrete activities that you can follow on a daily basis. Many of these on-page SEO tactics have already been covered in our on-page SEO post, so head there for a more in-depth look. You can also use our checklist template, which you can see below.

1. Do not block critical web addresses.

Robots.txt files are the first thing Google search spiders will look for on your site. If you wish to prevent less important URLs from being crawled, these files are quite handy. If you employ them on critical pages, though, your website will become a ghost town. To see which pages are restricted from crawlers, use Google's robots.txt Tester.
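
As a minimal illustration, a robots.txt file at the root of your domain might look like this (the paths and domain are hypothetical examples):

```
# robots.txt — hypothetical example
# Let all crawlers in, but keep them away from low-value
# admin and internal search-results pages
User-agent: *
Disallow: /admin/
Disallow: /search/

# Point crawlers at your XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```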

2. Make use of XML sitemaps

Consider an XML sitemap to be your domain's table of contents. They provide spiders with a bird's eye perspective of your website, allowing them to see how everything works together. Search engines may only crawl a portion of your site if you don't have an XML sitemap. You can either create your own XML sitemap or use a third-party solution.
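
A bare-bones XML sitemap, using a placeholder domain and dates, looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2023-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/</loc>
    <lastmod>2023-01-10</lastmod>
  </url>
</urlset>
```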

3. Make URLs consistent

Another prevalent issue that even huge company websites confront is incorrect URL configuration. Let's say your company's homepage is https://www.example.com (a stand-in domain for illustration). That same page might appear under any of the following URL variations:

  • http://example.com
  • http://www.example.com
  • https://example.com
  • https://www.example.com/index.html

What impact does this have on your SEO?

Each of those URLs will be viewed as a different page by search engine crawlers, despite the fact that the content is identical. If websites link to numerous variations of a URL, it will not only confuse Google, but it will also dilute your backlink profile.
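
The usual fix is a server-level 301 redirect that collapses every variation into one preferred URL. Here's a sketch for Apache's .htaccess, assuming https://www.example.com is the preferred version:

```apache
# Force HTTPS and the www subdomain with a single permanent redirect
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]
```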

4. Make sure your meta descriptions are unique.

Meta descriptions are crucial to the health of your domain. Whether your goal is to optimise for traffic or keywords, each page on your site should have its own meta description. If you don't have one, Google will make one for you. And it almost never works out for you.
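
In the page's HTML, the meta description lives in the head; the wording below is just an example:

```html
<head>
  <!-- A unique, page-specific summary shown under the title in search results -->
  <meta name="description"
        content="A 20-point website health checklist covering crawlability, security, and site speed.">
</head>
```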

5. Make sure your page titles are optimised.

Because title tags are one of the most powerful on-page ranking signals, it's critical to use them correctly. As a best practice, put keywords near the start of the tag. However, you should also match the keyword's Google search intent and write something that people want to click on.
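
The title tag also sits in the page's head; here's an illustrative example with the keyword up front:

```html
<head>
  <!-- Keyword near the start, phrased as something worth clicking -->
  <title>Website Health Checklist: 20 Fixes That Protect Your SEO</title>
</head>
```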

6. Include alternative text for images.

Images are a great method to get people's attention, tell a story, or explain something complicated. However, don't forget to use descriptive alt text to tell Google what the images are about.

If the image is also a hyperlink, the alt text is utilised as anchor text to help the image appear in search results. In our free downloadable below, for example, we used the alt phrase "website health checklist."
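
In HTML, that looks like the following (file names are hypothetical):

```html
<!-- Descriptive alt text tells Google (and screen readers) what the image shows -->
<img src="checklist.png" alt="website health checklist">

<!-- When the image is also a link, the alt text doubles as anchor text -->
<a href="/downloads/website-health-checklist.pdf">
  <img src="checklist-thumb.png" alt="website health checklist">
</a>
```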

7. Rel canonical tags should be used.

Duplicate and thin content confuses web spiders and leads to keyword cannibalisation. As a result, Google may mark your content as spam and penalise it in the search results.

Canonical tags tell crawlers which version of each web page should be indexed as the primary source of content. If you have separate versions of your site for mobile devices and desktop computers, or multiple language versions, you should point each one to the preferred version with a canonical tag.
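
Canonical tags go in the head of each page; here's a sketch with placeholder URLs:

```html
<!-- On every duplicate or alternate version, point to the preferred URL -->
<link rel="canonical" href="https://www.example.com/page/">

<!-- On a desktop page with a separate mobile URL, also declare the alternate -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page/">
```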

8. Resolve 404 errors

A 404 error is displayed when a spider or user hits a URL that does not exist on your domain. Maybe you accidentally removed a page, or someone linked to the wrong URL. In any case, if you haven't checked your domain health in a while, you're likely to have some issues that need to be addressed.

Why? Because 404s aren't simply inconvenient for visitors; they can also have a significant influence on your SEO. As you surely know, backlinks are still vital for search engine optimisation (SEO). So, if you remove a page with dozens of great links but don't set up a 301 redirect, you'll lose all of that link value.
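
On Apache, a single line in .htaccess preserves that link value (the paths are hypothetical):

```apache
# Permanently redirect the deleted page to its closest replacement
Redirect 301 /old-page/ https://www.example.com/new-page/
```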

Worse, if you don't address the problem promptly, your competitors may be able to persuade those sites to link to their content instead.

9. Redirect chains must be fixed.

Redirects are necessary for SEO, but done incorrectly they can cause difficulties. Over time, the same URL may end up being redirected through several intermediate URLs. This creates a redirect chain, which slows down your site and hurts your SEO.

Keep in mind that Google favours fast-loading pages. So, shorten your redirect chains and update any links that still point at old, redirected URLs.
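
To see why chains matter, here's a short Python sketch (the URLs are made up) that collapses a redirect map so every old URL points straight at its final destination:

```python
# Sketch: collapse a redirect map so each source points at its final target.
def collapse_redirects(redirects):
    """redirects: dict mapping source URL -> target URL."""
    collapsed = {}
    for source in redirects:
        target = redirects[source]
        seen = {source}
        # Follow the chain until we hit a URL that redirects no further
        # (the `seen` set guards against redirect loops)
        while target in redirects and target not in seen:
            seen.add(target)
            target = redirects[target]
        collapsed[source] = target
    return collapsed

chain = {"/old-post": "/new-post", "/new-post": "/final-post"}
print(collapse_redirects(chain))
# {'/old-post': '/final-post', '/new-post': '/final-post'}
```

After collapsing, both old URLs resolve in a single hop instead of two.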

10. Reduce the depth of your crawl.

Does it take more than three clicks to reach some of your content? If so, search engine spiders may conclude that those pages are unimportant.

Overhauling your site architecture is beyond the scope of this post. However, adding internal links from relevant pages is a quick way to reduce crawl depth.

11. Include internal links.

Internal links are far more significant than you may believe. For starters, they help spiders (and humans) discover your content. This is especially true if you have orphaned pages.

Beyond usability, they also signal which pages are your most important, based on how often you link to them.

Finally, the only way to spread PageRank throughout your website is to use internal links. Have you written some fantastic blog pieces that have gotten you a lot of backlinks? Internal links to your services pages or other relevant blog entries can also help to increase their visibility.

The subject is enormous, and there are numerous misconceptions about it. So, I recommend reading our piece that describes what internal links are and provides some recommended practices to embrace.
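
Even a plain HTML link does the job, as long as the anchor text describes the target page (the path here is hypothetical):

```html
<!-- Descriptive anchor text tells crawlers and readers what the target is about -->
<p>Before you audit, review our <a href="/blog/on-page-seo/">on-page SEO guide</a>.</p>
```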

12. Use HTTPS instead of HTTP.

When Google warns you that a website isn't secure, you're probably going to hit the back button right away. You're not alone, which is bad for your bounce rate and SEO. That is why it is critical to have an HTTPS (Hyper Text Transfer Protocol Secure) site. Google not only flags HTTP sites as insecure, but it also prioritises HTTPS sites in search results.

Do you require proof? According to a study by Moz, secure websites dominate Google's page-one rankings, particularly for head terms.

Make sure you have an SSL certificate if you exchange any sensitive information on your site, such as credit cards, usernames, or passwords. Fortunately, implementing HTTPS on your site isn't complicated.

13. Make use of hreflang tags.

Have you ever visited a website with various language options and wondered how Google knew which version to give you? That webpage almost certainly employed hreflang tags. They're also quite significant for SEO.

Why? Search engines use hreflang tags to determine which country and language version of your site to show users. If Google believes a given IP address is in Germany, it'll show that user the version of your site labelled hreflang="de-de".
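
Each language or country version lists all of its alternates (including itself) in the head; the URLs below are placeholders:

```html
<link rel="alternate" hreflang="en-us" href="https://www.example.com/en/">
<link rel="alternate" hreflang="de-de" href="https://www.example.com/de/">
<!-- Fallback for users who match no listed language or region -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/">
```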

14. Backlinks that are spammy should be disavowed.

This is a place where no one will judge you. Perhaps you were desperate for backlinks and enlisted the help of a dubious black hat SEO firm. Perhaps they didn't employ the most effective off-page SEO techniques. Have you ever paid for backlinks, spammed blog comment areas, or targeted link directories with your former SEO partner?

Don't worry, we'll keep your secret safe.

Despite Google's assurances that those links will be ignored, it's recommended to disavow them just to be cautious. If you clean the skeletons out of your closet, you'll be able to sleep much better.
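
The file you upload to Google's disavow tool is plain text, one entry per line; the domains below are invented examples:

```
# Disavow every link from an entire spammy domain
domain:spammy-directory.example

# Disavow a single offending page
https://shady-blog.example/paid-links-page/
```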

15. Improve the site's performance.

The year 2018 was a watershed moment for SEO. Google revealed that page load time is a ranking factor for mobile searches and rolled out its mobile-first index. Even though Google states that the "Speed Update" will only affect the slowest pages, it's best not to risk it.

Every ranking factor matters in the competitive realm of enterprise SEO. Google recommends that web pages load in under 2 seconds. As a result, it's in your best interest to make your site faster and reduce your server response time.

Pay particular attention to elements that can affect site speed as you tackle items on your domain health checklist, such as:

  • Slow or shared servers
  • Large image files
  • High site traffic
  • No CDN
  • Too many HTTP requests
  • Unminified CSS and JavaScript
  • Bloated HTML
  • Too many WordPress plugins

16. Ensure that your software is up to date.

Outdated software puts your website at risk of being hacked and infected with malware. Malware can slow your site down, put your users at risk, and inject spammy links into your carefully crafted content.

If Google detects malware on your site, it may flag your organic search results with a malware warning label. Few things will hurt your SEO more than a Google "proceed at your own peril" warning.

Worse, Google may choose to blacklist your site, resulting in the loss of 95 percent of your organic traffic.

17. Streamline your HTML code

Your website isn't merely a collection of images and phrases that appear on a screen. Each element is made up of code lines. How much code is there? Consider this: a basic iPhone game contains over 100,000 lines of code, so think how much code your website has. (There are more than 60 million lines of code on Facebook.)

Dense code will slow down your site, so remove any unused WordPress plugins or redundant code that may be slowing things down.

18. Fix missing images and videos.

Consider what would happen if you went to a page to download an infographic, but when you clicked the link, nothing happened. The infographic is completely missing! What if a video appears on a page but does not play?

Broken links and missing parts are noticed by both Google and your users. This has a negative impact on your SEO, visitors, and conversions.

Would you like to learn more about video SEO? Take a look at our guide on video search engine optimization!

19. Mobile-friendliness

In 2015, mobile searches overtook desktop searches for the first time. Google launched mobile-first indexing three years later. Of course, if your site doesn't have a mobile-friendly design, it will still get indexed. Google's priorities, on the other hand, have clearly shifted.

The priorities of users have evolved as well. People now expect responsive, mobile-friendly web design. That means they won't waste time if your website isn't mobile-friendly. They'll leave, and your SEO will suffer as a result.

20. Downtime on the website

Domain downtime is, without a doubt, a horrible website user experience. Although Google does not impose penalties for site downtime, you will experience a brief drop in your ranking until your site is crawled again and the search engines are informed that you are back online.

If you're going to take your site down, use a 503 status code to inform Google that it's only temporary.
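
During planned maintenance, the response crawlers see should look roughly like this (the Retry-After value, in seconds, is just an example):

```http
HTTP/1.1 503 Service Unavailable
Retry-After: 3600
Content-Type: text/html
```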

The bottom line when it comes to domain health

Your website serves as a central hub for all of your digital marketing efforts, including content marketing, social media, and advertising campaigns. Because it's so important, your website requires regular maintenance to stay in top shape. So keep our list accessible and go over it again and again.

What if there are issues? You'll be well-prepared to deal with issues quickly so that your users (and Google) have the best possible experience on your site.