Preparing for an SEO audit

13/12/2021

The best place to start your audit is to crawl the site, which you can do with Screaming Frog. The crawl lets you analyze the following points (a minimal script sketch of these checks follows the list):

  • Server response codes (3XX redirects, 200 OK, broken links (4XX), and pages with 5XX response codes);
  • Duplicate metadata (title, descriptions, H1 titles);
  • Missing metadata (title, descriptions, H1 titles);
  • Server response speed;
  • Which pages are blocked from indexing in robots.txt or using the meta tag;
  • Presence of ALT/TITLE attributes on images, etc.
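
If you want to spot-check a handful of URLs without a full crawl, the same basics can be covered with a short script. The sketch below is only an illustration of the checks above (placeholder URLs, requests + BeautifulSoup), not a replacement for Screaming Frog:

```python
# Minimal spot-check of the points listed above: status code, title, description,
# H1 tags, meta robots, and ALT attributes. URLs below are placeholders.
import requests
from bs4 import BeautifulSoup

URLS = [
    "https://example.com/",
    "https://example.com/catalog/",
]

for url in URLS:
    resp = requests.get(url, timeout=10, allow_redirects=False)
    print(f"{url} -> HTTP {resp.status_code}")  # 3XX / 4XX / 5XX show up here

    if resp.status_code != 200:
        continue

    soup = BeautifulSoup(resp.text, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else None
    description = soup.find("meta", attrs={"name": "description"})
    h1_tags = soup.find_all("h1")
    robots_meta = soup.find("meta", attrs={"name": "robots"})
    imgs_without_alt = [img for img in soup.find_all("img") if not img.get("alt")]

    print("  title:", title or "MISSING")
    print("  description:", "present" if description and description.get("content") else "MISSING")
    print("  H1 count:", len(h1_tags))  # 0 = missing, more than 1 = possible duplicates
    print("  meta robots:", robots_meta.get("content") if robots_meta else "not set")
    print("  images without ALT:", len(imgs_without_alt))
```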

You also need to check the robots.txt file right away (some important pages may be blocked from indexing, or the entire site may be closed from indexing) and the sitemap (sitemap.xml, which should specify priority, changefreq, lastmod, and loc), as well as review the errors reported in Bing Webmaster Tools and Google Search Console and export them.
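
For the robots.txt part, Python's built-in parser is enough for a quick sanity check that key pages are not blocked. The user agent and URLs below are placeholders:

```python
# Check whether important URLs are blocked by robots.txt (URLs are placeholders).
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

important_pages = [
    "https://example.com/",
    "https://example.com/catalog/",
    "https://example.com/contacts/",
]

for page in important_pages:
    allowed = rp.can_fetch("Googlebot", page)
    print(("OK      " if allowed else "BLOCKED ") + page)
```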

You also need to run the site through the following services:

  • PageSpeed Insights or GTmetrix (an API-based speed check is sketched after this list)
  • schema.org markup validator
  • the W3C markup validation service (w3.org)
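
PageSpeed Insights also exposes an API (v5), so the speed check can be scripted and repeated after fixes. The sketch below is a minimal example; the response field names follow the v5 format and should be double-checked against the current documentation:

```python
# Query the PageSpeed Insights v5 API for a mobile performance score.
# Response field names follow the v5 format; verify against the current docs.
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {
    "url": "https://example.com/",  # placeholder
    "strategy": "mobile",
    # "key": "YOUR_API_KEY",        # optional for low request volumes
}

data = requests.get(API, params=params, timeout=60).json()
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print("Mobile performance score:", round(score * 100))
```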

Based on the data collected, you can already put together a task for the programmer to fix the technical errors.

Check site functionality

The second stage of the audit is to analyze the site's current functionality and look for errors in the layout.

To identify these errors, you need to:

  1. Place a test order for a product or service (ideally, test from both a PC and a smartphone). If the site is an online store, test the entire order chain: select different product parameters, change the quantity and check whether the price updates, fill in the cart fields, verify that required fields are enforced, and so on. Service sites are simpler, since there is usually just a form to fill out. Also check right away whether the submitted request arrives in the admin panel or by email.
  2. Check the site's responsiveness on a smartphone (and optionally on a tablet), in both portrait and landscape orientation. Also watch for non-responsive tables, which cause horizontal scrolling.
  3. Check the other forms on the site.

After this analysis, you can create a task for the programmer to fix the errors in functionality and layout.

Metadata and content

The next step is to analyze the content and metadata on the site.

For a commercial site, the metadata should contain commercial queries. It is also worth using special symbols (emoji) in descriptions: this makes the snippet more attractive in the search results and can improve its CTR (click-through rate).
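
Presence and length of the title and description, and whether they include the main commercial query, can be checked with a few lines. The length ranges and the query below are rough assumptions, not official thresholds:

```python
# Rough metadata check: title/description length and presence of a commercial query.
# Length ranges are common rules of thumb, not official limits; URL and query are placeholders.
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/catalog/"
COMMERCIAL_QUERY = "buy"

soup = BeautifulSoup(requests.get(URL, timeout=10).text, "html.parser")
title = (soup.title.string or "").strip() if soup.title else ""
desc_tag = soup.find("meta", attrs={"name": "description"})
description = (desc_tag.get("content") or "").strip() if desc_tag else ""

print("Title length:", len(title), "(roughly 30-65 characters is typical)")
print("Description length:", len(description), "(roughly 70-160 characters is typical)")
print("Commercial query in title:", COMMERCIAL_QUERY in title.lower())
print("Commercial query in description:", COMMERCIAL_QUERY in description.lower())
```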

The content on the page should be analyzed for spam and other indicators (these are individual for each specialist, so check the text against your own requirements).
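
One simple indicator is keyword density. A crude counter like the one below (the input file is a placeholder for the page copy) can flag obviously over-optimized text, though acceptable thresholds are up to each specialist:

```python
# Crude keyword-density check on page copy saved to a text file (placeholder filename).
import re
from collections import Counter

text = open("page_text.txt", encoding="utf-8").read()
words = re.findall(r"[a-zа-яіїє']+", text.lower())
counts = Counter(words)
total = max(len(words), 1)

for word, count in counts.most_common(10):
    print(f"{word:<20} {count:>4}  {count / total:6.1%}")
```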

When analyzing the link profile, first of all look for sharp spikes in the number of links: competitors can buy a large volume of low-quality links to your site, which can lead to sanctions. A sharp drop in links, on the other hand, may indicate that temporary links were purchased earlier and are now being dropped.

You also need to pay attention to the metrics of the donor domains linking to your site. It is recommended not to use donor domains with a DR below 10.

It is important to strike the right balance between anchor and non-anchor links. To do this, analyze competitors, look at their anchor to non-anchor ratio, and build roughly the same strategy for your own site.
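
If you export backlinks for your site and a competitor to CSV (for example, from a backlink tool such as Ahrefs), the anchor vs. non-anchor split can be compared with a short script. The column name "Anchor", the file names, and the list of generic anchors are assumptions about the export format:

```python
# Compare the share of anchor (keyword) links vs. non-anchor (URL/branded/generic) links
# in backlink CSV exports. Column name "Anchor" and file names are assumptions.
import csv

GENERIC = {"here", "click here", "link", "site", "website", "тут", "сайт"}

def is_non_anchor(anchor: str) -> bool:
    a = anchor.strip().lower()
    return (not a) or a.startswith(("http://", "https://", "www.")) or a in GENERIC

def anchor_share(path: str) -> float:
    with open(path, newline="", encoding="utf-8") as f:
        anchors = [row.get("Anchor", "") for row in csv.DictReader(f)]
    if not anchors:
        return 0.0
    return sum(not is_non_anchor(a) for a in anchors) / len(anchors)

print(f"Our site:   {anchor_share('our_backlinks.csv'):.0%} anchor links")
print(f"Competitor: {anchor_share('competitor_backlinks.csv'):.0%} anchor links")
```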

Oleksandr Shmidko
SEO Specialist