SEO Audit Quick Guide

Learn the best way to audit your website

An SEO specialist’s job is never finished—you could spend a lifetime tweaking your website, and you’d still find something to improve. But for those of us who don’t have a lifetime to spend split-testing every element, I’ve created this 15-minute guide for auditing your website.

Note: For this guide, I used the latest version of my tool, WebSite Auditor, to crawl my website, analyze site errors, fix broken links and redirects, review my title and meta tags, optimize on-page content and images, etc. I use this software instead of separate crawling, keyword research, and link analysis tools, but you can follow along using your favorite auditing tools instead.

1. Crawl and Audit Your Website

Start crawling your website by entering its URL into your favorite SEO crawling tool. Make sure that your crawling tool is capable of scanning every resource on your website (JavaScript, CSS, images, video, Flash, etc.).

This scan usually only takes a few minutes, but time will vary depending on the size of your website.
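
If you'd like to see what a crawler does under the hood, here's a minimal same-site crawl sketch in Python, assuming the third-party requests and beautifulsoup4 packages (the start URL is a placeholder):

    # Minimal same-site crawl sketch; example.com is a placeholder.
    from urllib.parse import urljoin, urlparse

    import requests
    from bs4 import BeautifulSoup

    def crawl(start_url, max_pages=50):
        host, seen, queue = urlparse(start_url).netloc, set(), [start_url]
        while queue and len(seen) < max_pages:
            url = queue.pop(0)
            if url in seen:
                continue
            seen.add(url)
            resp = requests.get(url, timeout=10)
            print(resp.status_code, url)
            if "text/html" not in resp.headers.get("Content-Type", ""):
                continue  # only parse HTML pages for further links
            for a in BeautifulSoup(resp.text, "html.parser").find_all("a", href=True):
                link = urljoin(url, a["href"]).split("#")[0]
                if urlparse(link).netloc == host:  # stay on our own site
                    queue.append(link)

    crawl("https://example.com/")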

Checklist:

  • Crawl your website.

2. Check for Crawlability/Indexing Issues

Once you’ve scanned your website, it’s time to review anything that might be affecting your website’s crawlability. Some of the most common problems include poorly maintained sitemaps, robots.txt errors, and HTTP response code errors.

  • Check your sitemap

If you’re using a tool or plugin to automatically manage your sitemap, you should be able to automatically create an XML sitemap or retrieve, view, and validate an existing sitemap. If you’re not using a sitemap tool, you’ll have to manually review your sitemap. Either way, you want to make sure that your sitemap is accurate and up to date.

Once you’ve updated your sitemap, check it for errors in Google Search Console. Simply log in to your account, go to Crawl > Sitemaps and then click the Add/Test Sitemaps button. Enter the URL of your sitemap and click Test to see if Google encounters any errors while parsing your sitemap.
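
If you'd like to sanity-check the sitemap yourself, here's a minimal sketch in Python, assuming the third-party requests package (the sitemap URL is a placeholder):

    # Fetch a sitemap and confirm every listed URL returns a 200.
    import xml.etree.ElementTree as ET

    import requests

    SITEMAP = "https://example.com/sitemap.xml"  # placeholder
    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    root = ET.fromstring(requests.get(SITEMAP, timeout=10).content)
    for loc in root.findall(".//sm:loc", NS):
        url = loc.text.strip()
        code = requests.head(url, timeout=10, allow_redirects=False).status_code
        if code != 200:
            print(code, url)  # flag anything that isn't a clean 200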

  • Check your robots.txt

A page might be blocked from indexing for a variety of reasons. For example, it could have a “noindex” tag in its <head> section or an X-Robots-Tag in the HTTP header, or your robots.txt file itself could be blocking the page. Using your auditing tool, check the instructions in your robots.txt file and see whether any important pages are being blocked. If they are, adjust the rules (for example, add an Allow directive) so those pages can be crawled and indexed.
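
If you want to double-check outside your auditing tool, Python's standard-library robotparser can run the same test (the URLs below are placeholders):

    # Confirm important pages aren't blocked for Googlebot.
    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser("https://example.com/robots.txt")  # placeholder
    rp.read()

    for url in ["https://example.com/", "https://example.com/products/"]:
        if not rp.can_fetch("Googlebot", url):
            print("Blocked for Googlebot:", url)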

If you’re struggling with CSS and JavaScript being disallowed in particular, Google’s Gary Illyes recommends you allow Google to crawl these resources by adding the following three lines of code to your robots.txt file:
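
    User-Agent: Googlebot
    Allow: .js
    Allow: .css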

  • Check your website for errors

HTTP response code errors are easy to detect and responsible for many indexing errors. You can check your website for crawl errors through your site auditing tool or by going to Crawl > Crawl Errors in your Google Search Console.

You can correct whatever 4XX errors you find by fixing your internal links or by setting up 301 redirects that point visitors to similar pages (especially useful on e-commerce sites). Also, make sure that your website has a custom 404 page in the first place, one that links visitors back to your website’s important pages.
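
For a quick spot check outside your tools, a few lines of Python (again assuming the requests package; the URLs are placeholders) will flag error responses:

    # Flag 4XX/5XX responses for a list of URLs.
    import requests

    for url in ["https://example.com/", "https://example.com/old-page"]:
        code = requests.head(url, timeout=10, allow_redirects=False).status_code
        if code >= 400:
            print(code, url, "-> fix the internal link or add a 301 redirect")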

  • Ask Google (and other search engines) to re-crawl your pages

Whenever you’ve finished making these essential on-site improvements, you can ask search engines to re-crawl your pages. This will let you know if you still have errors that need addressing or if you can submit your page for indexing.

To have Google re-crawl a page, log into your Google Search Console, go to Crawl > Fetch as Google, enter the URL of the updated page, and click Fetch.

Checklist:

  • Fix any errors on your sitemap.
  • Review your robots.txt file.
  • Ensure all of your important pages can be indexed.
  • Fix HTTP errors.
  • Ask search engines to re-crawl your site.

3. Avoid Common Redirect Mistakes

Redirects pose a potential problem for SEO when they don’t properly funnel visitors to the right page after a page has moved or its URL has changed. Here are some of the most common issues:

  • 302 redirects

302 redirects, a.k.a. “temporary” redirects, tell search engines that you’re temporarily redirecting visitors and that the missing page will return shortly. As such, a 302 redirect won’t pass on many of the qualities (e.g. PageRank, traffic value) that a 301 redirect would. This is fine if you’re doing something like testing a new page, but otherwise you should replace these redirects with 301s.
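
If you're unsure which kind of redirect a URL currently answers with, you can inspect the chain yourself; here's a sketch assuming the requests package (the URL is a placeholder):

    # Print each hop in a redirect chain with its status code.
    import requests

    resp = requests.get("https://example.com/moved-page", timeout=10)
    for hop in resp.history:  # one entry per redirect followed
        kind = "permanent" if hop.status_code == 301 else "temporary/other"
        print(hop.status_code, kind, hop.url, "->", hop.headers.get("Location"))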

  • Meta refresh

A “meta refresh” is code inserted into a page’s metadata that automatically redirects visitors. Spammers tend to abuse this technique by redirecting visitors to pages with unrelated content, and as a result most search engines penalize it.
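
Meta refreshes are easy to detect programmatically; here's a sketch assuming the requests and beautifulsoup4 packages (the URL is a placeholder):

    # Look for a <meta http-equiv="refresh"> tag in the page.
    import requests
    from bs4 import BeautifulSoup

    html = requests.get("https://example.com/", timeout=10).text
    tag = BeautifulSoup(html, "html.parser").find(
        "meta", attrs={"http-equiv": lambda v: v and v.lower() == "refresh"})
    if tag:
        print("Meta refresh found:", tag.get("content"))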

  • Rel=”canonical” issues

If you have duplicate content on multiple pages, you should indicate which page is the primary one by setting it up as your canonical URL. Each page should have no more than one canonical URL. Most site auditors will allow you to view your rel=”canonical” pages under a Redirects tab.
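
You can also verify this directly on a page; here's a sketch assuming requests and beautifulsoup4 (the URL is a placeholder):

    # A page should declare at most one canonical URL.
    import requests
    from bs4 import BeautifulSoup

    html = requests.get("https://example.com/page", timeout=10).text
    canonicals = [link.get("href") for link in
                  BeautifulSoup(html, "html.parser").find_all("link", rel="canonical")]
    if len(canonicals) > 1:
        print("Multiple canonical URLs declared:", canonicals)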

Checklist:

  • Replace temporary redirects (302) with permanent redirects (301).
  • Remove meta refreshes from your pages.
  • Check for rel=”canonical” issues.

4. Fix HTML and CSS Markup Errors

Coding errors slow down your website, interrupt user experience, and essentially shoot your SEO in the foot. Performing regular website audits can help you detect and correct these errors as they crop up. Some issues to look for include pages that are too big, pages that use frames, and pages that contain W3C errors and warnings.

Checklist:

  • Find and correct common coding errors

5. Fix Your Links and URLs

One of the fastest ways to regain lost link equity is to find and fix broken backlinks. Use your auditing tool to keep tabs on all of your broken links, and then fix them either by contacting the webmasters who link to your website or by creating 301 redirects that point to relevant, updated pages.

Here are some other best practices:

  • Limit outgoing links

When 100 or more outbound links originate from a single page, it can signal to search engines that the page is spam. Limit the number of links you include per page.
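
To count a page's outbound links yourself, here's a sketch assuming requests and beautifulsoup4 (the page URL is a placeholder):

    # Count links that point off-site.
    from urllib.parse import urlparse

    import requests
    from bs4 import BeautifulSoup

    PAGE = "https://example.com/resources"  # placeholder
    host = urlparse(PAGE).netloc
    soup = BeautifulSoup(requests.get(PAGE, timeout=10).text, "html.parser")
    outbound = [a["href"] for a in soup.find_all("a", href=True)
                if urlparse(a["href"]).netloc not in ("", host)]
    print(len(outbound), "outbound links on", PAGE)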

  • Limit redirects

Whenever possible, fix 301 and 302 redirects by changing links to lead visitors directly to their destination page. Every redirect hop a crawler has to follow eats into your search engine crawl budget.

  • Change dynamic URLs

Dynamically generated URLs are often difficult to read and don’t tell users where they’re heading. Try to fix dynamic URLs so that they’re more user-friendly.

  • Shorten long URLs

Recent evidence suggests that URLs with 35–40 characters dominate search listings. Of course, long URLs can rank—but practice brevity wherever possible.

Checklist:

  • Fix broken links
  • Limit outgoing links
  • Limit redirects
  • Fix unreadable URLs
  • Keep your URLs short and sweet

6. Review Your Title Tags, Meta Tags, and Header Tags

It doesn’t take much to fix a bad meta description or to turn a good title tag into a great one. The goal of all of your titles and meta descriptions is to inform visitors (and bots) what your page is about and, whenever possible, to include a choice keyphrase or two.

Here are some best practices:

  • Fix empty or lengthy titles and meta descriptions

Most SEO auditing tools will tell you when your titles and meta descriptions aren’t optimized. Make sure all of your titles and descriptions fall within the appropriate character counts (that means 50–60 characters for the title and 150–160 characters for the description).
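
Here's a sketch of that length check in Python, assuming requests and beautifulsoup4 (the URL is a placeholder):

    # Check title and description lengths against the guidelines above.
    import requests
    from bs4 import BeautifulSoup

    soup = BeautifulSoup(requests.get("https://example.com/", timeout=10).text,
                         "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    meta = soup.find("meta", attrs={"name": "description"})
    desc = meta.get("content", "").strip() if meta else ""

    if not 50 <= len(title) <= 60:
        print("Title is", len(title), "characters:", repr(title))
    if not 150 <= len(desc) <= 160:
        print("Description is", len(desc), "characters:", repr(desc))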

  • Avoid duplicate titles and meta descriptions

E-commerce sites, in particular, suffer from duplicate content and meta descriptions that read very similarly. Duplicate content can confuse search engines to the point where they rank none of your pages, so try to keep your content as unique as possible.
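
If you have a crawl export handy, a few lines of Python will surface duplicates (the page data below is hypothetical):

    # Group crawled pages by title to surface duplicates.
    from collections import defaultdict

    pages = {  # hypothetical crawl output: URL -> title
        "https://example.com/red-shoes": "Shoes | Example Store",
        "https://example.com/blue-shoes": "Shoes | Example Store",
    }

    by_title = defaultdict(list)
    for url, title in pages.items():
        by_title[title].append(url)

    for title, urls in by_title.items():
        if len(urls) > 1:
            print("Duplicate title", repr(title), "on:", ", ".join(urls))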

  • Target buying phrases

Header (H1) tags are important on-page ranking factors. Use them to target specific buying phrases that users are likely to search for. Also try to avoid generic subheaders (H2, H3, etc.); use them to target secondary keywords where appropriate.

Checklist:

  • Optimize titles for SEO
  • Optimize meta descriptions for SEO
  • Optimize header tags for SEO

7. Optimize Your Content

Since Panda’s launch, high-quality content has been the defining factor of a great website. In other words, you need to make sure that each of your pages has enough content and that the content is unique and valuable. You may want to invest in a grammar tool (e.g. Grammarly) and a plagiarism checker (e.g. Copyscape) to help you with this stage (both have free versions).

Here’s what I did:

  • Identify thin content

To make sure that every page had enough relevant content, I filtered my pages by word count. I did this by right-clicking on the header of any column and adding a new column called Word Count. Then I made sure that every page had 1–2 paragraphs of text (aiming for at least 250 words per page). I also double-checked formatting and readability while I was at it.
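
If your tool doesn't report word counts, here's a rough equivalent in Python, assuming requests and beautifulsoup4 (the URL is a placeholder):

    # Estimate a page's visible word count.
    import requests
    from bs4 import BeautifulSoup

    soup = BeautifulSoup(requests.get("https://example.com/post", timeout=10).text,
                         "html.parser")
    for tag in soup(["script", "style"]):  # strip non-visible text
        tag.decompose()
    words = len(soup.get_text(" ", strip=True).split())
    if words < 250:
        print("Thin content: only", words, "words")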

  • Correct duplicate content

It’s in your best interest to ensure that all of your titles and descriptions are unique. Duplicate titles and descriptions confuse search engines: they can’t tell which page should rank in response to a given query, and often they choose to rank neither.

  • Audit site images

Searching for broken images is all well and good, but you should also look for images without alt text, which represent an additional SEO opportunity.
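
Here's a sketch of that alt-text check, assuming requests and beautifulsoup4 (the URL is a placeholder):

    # List images that lack alt text.
    import requests
    from bs4 import BeautifulSoup

    soup = BeautifulSoup(requests.get("https://example.com/", timeout=10).text,
                         "html.parser")
    for img in soup.find_all("img"):
        if not (img.get("alt") or "").strip():
            print("Missing alt text:", img.get("src"))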

Checklist:

  • Find and flesh out thin content
  • Make content as unique as possible
  • Fix broken images and add missing alt text

Conclusion

You’re done! This is by no means the limit of the optimization you could perform on your site, but it’s a great start, and one that will keep your website healthy and relevant. Your page stats in Google Analytics will be a strong indicator of which pages are affected the most, and after auditing your old content, you may find that your content creation habits have improved and that some old content needs to go. The only way to properly audit a website is to leave no stone unturned and review every aspect of your content. Don’t forget to use tools and software to make your life easier.

Check back in a few days for the upcoming guide to auditing a website’s onsite SEO health.