Content of the article
- /01 Analysis of competitors
- /02 Site indexing work
- /03 Technical optimization
- /04 On-Page content optimization
- /05 Expanding the structure
- /06 Working with internal linking
- /07 Working with the external link profile
- /08 Increasing site conversion/work with usability
- /09 Working with micro-markup and snippet optimization
- /10 To summarize

A site SEO promotion strategy is a general plan of action for a specific site that covers a certain period of time and helps achieve the set goal.
An SEO strategy is needed for the following purposes:
- A clear view of all the stages that lead to the goal.
- Control over the progress of the process.
- Increased motivation and awareness across the entire team.
- Quick onboarding of new employees to the project.
In addition, you simply need a site promotion strategy if: the niche you work in is highly competitive; your resource contains more than 30 pages; you have a specific promotion goal; you want to “connect” with potential customers.
Advantages of SEO strategy for companies:
- Reducing the time and resources spent on achieving the goal. All work is carried out according to a prepared plan in a structured way, not chaotically and casually.
- Monitoring the performance of work and recording intermediate results, which helps to determine in time if something is going wrong.
- Increased conversion. Usually, during comprehensive promotion of a site, its technical side, interface, and functionality are improved, and all of this helps turn visitors into buyers.
- Errors and problems can be found and eliminated quickly, because thanks to the systematic approach, the SEO specialist constantly monitors the indicators and the state of the resource.
During SEO promotion, all work must be carried out comprehensively; it can be divided into the following interrelated areas:
- Analysis of competitors.
- Site indexing work.
- Technical optimization.
- On-Page content optimization.
- Optimization and expansion of the structure.
- Working with internal linking.
- Working with the external link profile.
- Increasing site conversion/work with usability.
- Work on improving snippets and implementing structured data.
These areas of work are formed on the basis of primary data: site analysis, a technical audit, and competitor analysis. Below, we consider each area of optimization, and the specific tasks for your resource, separately and in more detail.
During an SEO specialist's work, the list of tasks may change. After an in-depth study of the niche and additional analytical data, the strategy may need minor adjustments to get the best results from search engine promotion.
Analysis of competitors
Analysis is necessary to understand what promotion methods your competitors use, what they focus on, what their weaknesses and strengths are, and on which parameters they can be overtaken. By analyzing competitors, you can also understand what USP they use on their sites, what usability features they have integrated, and why. It is very important to analyze not the sites whose offline locations are near yours, but the sites that occupy the top positions in the search results for the main key queries.
Most competitor analysis is done within specific areas of work – external links, content, etc. The work is distributed and performed only where it is necessary and important; running analyses for their own sake is wasted work that you end up paying for later.
Competitor analysis is carried out in several directions at once:
- Analysis of the technical condition of competitors’ sites.
- Analysis of the structure expansion methods they use and how they apply them.
- Usability features and tricks.
- Content features: text size, technical characteristics, number of products on a page, names of products/services, etc.
- Link profile, niche averages – anchor/non-anchor ratio, follow/nofollow, crowd/directories/articles.
In most cases, comparison with competitors and adoption of their strengths continues throughout the entire work process. If you need to implement something, first turn to your competitors and see how they do it, whether you can adapt it to your site, or whether you can do it better. This is how the right solution is born – one that is optimal for your site.
Site indexing work
Correct indexing of the site is a guarantee that all useful pages are included in the search results, while “junk” and duplicate pages are not. When indexing a resource, search engines scan it and analyze its text, images, videos, meta tags, micro-markup, service data, and other content. In practice, the larger the site, the greater the indexing problems.
The process of improving site indexing begins with analysis and identifying common problems. Carefully analyze the data in Google Search Console and determine the ratio of indexed to non-indexed pages, then find out why the Google robot does not add certain pages to the search engine's index. Common problems that affect page indexing include: missing internal links to the page, redirects, the page being blocked from indexing in robots.txt or by meta robots, an incorrectly specified canonical tag, etc.
It is necessary to compare the number of pages in the index with the number of pages specified in the sitemap.xml file. This file helps the search engine index your site more accurately. With its help, you can indicate which sections have higher priority and pass additional information about the pages' language versions and the presence of images, video materials, and news. If the resource has a massive and complex structure, the sitemap.xml file also simplifies the search engine's work when crawling the site.
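As an illustration, here is a minimal sketch of generating such a file; the URLs, dates, and priorities are hypothetical placeholders, and on a real site this data would come from the CMS or database:

```python
import xml.etree.ElementTree as ET

# Hypothetical list of pages; in practice this comes from the CMS or database.
pages = [
    {"loc": "https://example.com/", "lastmod": "2024-05-01", "priority": "1.0"},
    {"loc": "https://example.com/category/bags/", "lastmod": "2024-04-20", "priority": "0.8"},
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page["loc"]
    ET.SubElement(url, "lastmod").text = page["lastmod"]
    ET.SubElement(url, "priority").text = page["priority"]

# Writes a standards-compliant sitemap.xml with an XML declaration.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```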
An additional area of work for accelerating indexing or re-indexing is submitting site pages for crawling directly. This is done when a new page needs to get into the search engine's index faster, or to notify the search engine about changes on a page – for example, after updating content or making technical changes. This can be done manually, adding one page at a time in Google Search Console, or through a specially configured Indexing API, which allows you to send all the necessary pages to the index as quickly as possible without wasting time adding one URL at a time.
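Below is a minimal sketch of batch submission through the Indexing API, assuming a Google Cloud service account with the API enabled (the service-account.json key path is a hypothetical placeholder); note that Google officially supports this API only for certain content types, such as job postings:

```python
from google.oauth2 import service_account
from google.auth.transport.requests import AuthorizedSession

SCOPES = ["https://www.googleapis.com/auth/indexing"]
ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

# Hypothetical path to a service-account key with the Indexing API enabled.
credentials = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES)
session = AuthorizedSession(credentials)

urls = ["https://example.com/new-page/", "https://example.com/updated-page/"]
for url in urls:
    # URL_UPDATED tells Google the page is new or has changed content.
    response = session.post(ENDPOINT, json={"url": url, "type": "URL_UPDATED"})
    print(url, response.status_code)
```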
This area includes the following list of tasks:
- Checking site indexing and finding issues based on Search Console data.
- Adjusting robots.txt directives for more accurate page indexing.
- Checking that all required pages are present in the sitemap.xml file and checking/adjusting its syntax.
- Setting the meta name="robots" tag to remove pages from the main/secondary index or to allow indexing of required pages.
- Checking that the main pages have correct canonical tags.
- Creating and configuring functionality for adding pages to the index through the Indexing API (as sketched above).
- Searching for “junk” and duplicate pages in the index and removing them.
- Monitoring and analyzing the indicators and report results in Search Console.
Technical optimization
Technical work on a site involves a long list of inspection and optimization points. This is because the site's functioning depends on the correct operation of a large number of files and data that are interrelated and affect each other. Considering only individual elements, rather than performing a comprehensive inspection, would therefore be the wrong approach to overall optimization. For the most part, technical edits are related to site indexing work.
Technical optimization is necessary so that the resource meets the requirements of search engines and is evaluated as a “quality” site. To fully satisfy these requirements, make sure the site meets the following criteria:
- the page code passes validation without errors or warnings and meets generally accepted standards;
- page loading speed is satisfactory and meets the requirements of the Interaction to Next Paint (INP) metric;
- the site is user-friendly, has a clear structure and intuitive navigation elements (the parameter is partially related to work on usability);
- the site is compatible and optimized for mobile devices;
- crawlers have access to all the main and priority pages of the site.
For example, optimizing a site's language versions does not mean simply creating duplicate pages with content in different languages. These pages must be properly linked to each other, be indexable, and have correct hreflang tags set up. Moreover, the methods of technically implementing multilingualism differ and depend on the ultimate promotion goal and the features of the resource's CMS (Content Management System).
The site must have correctly configured pagination pages so that all content can be indexed. Pagination pages used to be blocked from indexing with canonical tags, but Google's documentation now emphasizes that these pages should be accessible because they contain content useful to the user. Blocked pagination pages prevent part of the content from being indexed: for example, a search bot will not be able to crawl any of the product tiles on an online store's pages beyond the first one if they are closed with such a tag.
Missing or duplicated Title and Description meta tags and H1 headings indicate an incorrectly optimized structure. Google's algorithms interpret such problems as the presence of duplicate pages, or of pages that contain no useful content for the user. Therefore, it is important not only to integrate keywords into the metadata, but also to make sure that all of it is unique without exception – a direct indication to the search engine that there are no technical duplicates within the site. For sites with many pages, an effective tool for eliminating such problems is automatic metadata generation.
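As a quick way to spot such duplicates, here is a minimal sketch that reads a crawl export; the crawl.csv file with url and title columns is a hypothetical assumption (such exports can be produced by any desktop crawler):

```python
import csv
from collections import defaultdict

# Hypothetical crawl export with "url" and "title" columns.
titles = defaultdict(list)
with open("crawl.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        titles[row["title"].strip().lower()].append(row["url"])

# Any title shared by more than one URL signals a potential duplicate page.
for title, urls in titles.items():
    if len(urls) > 1:
        print(f"Duplicate title ({len(urls)} pages): {title}")
        for url in urls:
            print("  ", url)
```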
Most problems on a site can be detected by browsing it, fully crawling its pages with specialized software, and analyzing the problems and warnings in Google Search Console.
According to Google, one of the most important technical optimization parameters is site loading speed: the faster a resource loads, the more visitors will actually wait for it and reach it. A separate service, Google PageSpeed Insights, analyzes the loading process in great detail and points to the problem areas and factors that slow down or block page initialization.
The site must be not only fast and filled with valuable content, but also as easy to use as possible. This especially applies to mobile versions of the site. There is a separate service for analyzing this convenience indicator – Google's Mobile-Friendly Test. It checks this parameter, and data about page quality also reaches Google Search Console as the search bot crawls the site, so it is worth paying attention to and analyzing this data.
Optimizing pages for the basic Core Web Vitals requirements is also a very important area of work. First, analyze the site for common problems and errors on the five main types of pages: home page, category, subcategory, product/service, and service pages. If problems are found, you will most likely have to find a compromise between what absolutely must be fixed and what cannot be changed due to an upcoming redesign, platform version, etc.
Below are the main points of the technical optimization checklist – and this is only part of all the checks and corrections an SEO specialist carries out.
- Defining and configuring the main mirror of the site.
- Search and elimination of complete technical duplicates.
- Setting up redirection rules using 301 redirects.
- Configuring rules for triggering 404 server responses and error pages.
- Optimizing site performance and loading speed.
- HTML file validation and error correction.
- Validation of CSS style files and error correction.
- Finding and fixing internal redirecting links (3xx codes) and broken links (4xx codes) – see the sketch after this list.
- Site optimization work for Core Web Vitals requirements.
- Adjusting the correct operation of pagination pages.
- Checking and configuring the rel="canonical" tag.
- Checking the settings and generation of the sitemap.xml file.
- Analysis and editing of directives in the robots.txt file.
- Setting up navigation chains (breadcrumbs).
- Analysis and optimization/expansion of existing structured data on the site.
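A minimal sketch of the link-status check mentioned above, assuming a hypothetical list of internal URLs collected by a crawler; a HEAD request with redirects disabled exposes the status of each link:

```python
import requests

# Hypothetical list of internal URLs collected by a crawler.
urls = [
    "https://example.com/old-category/",
    "https://example.com/product/123/",
]

for url in urls:
    # allow_redirects=False exposes 3xx chains instead of following them silently.
    response = requests.head(url, allow_redirects=False, timeout=10)
    if 300 <= response.status_code < 400:
        print(f"{url} -> {response.status_code} redirect to {response.headers.get('Location')}")
    elif response.status_code >= 400:
        print(f"{url} -> broken ({response.status_code})")
```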
On-Page content optimization
On-Page optimization is a necessary stage of website improvement. Through this area of work, we indicate to the search engine which queries each page should rank for. This is achieved with the help of many search engine optimization factors and tools. The main initial stage here is collecting the semantic core: you need to understand which search queries potential customers use to find your site. Keywords can be collected using various services, chiefly Google Keyword Planner, Ahrefs, and Serpstat. For sites that already have a history and rankings, use Google Search Console to see which queries bring users to the site.
Having initially analyzed page-by-page optimization for search queries, the following errors can be detected:
- one or both meta tags are missing from pages;
- meta tags generated automatically on pages do not fully cover all key queries;
- the page structure formed by H1-H6 headings is missing or insufficiently clear and logical;
- priority pages lack text content that fully describes the topic;
- the texts are not optimized for all keywords, and there are issues with their formatting and compliance with search engine requirements;
- the text content on the pages is not updated regularly – consider refreshing it by automatically generating additional blocks;
- there is no graphic or video content in the texts. Integrating media into texts improves their perception by users, which improves behavioral factors and positively affects search ranking.
With this in mind, let us consider the stages of this work below.
Collection of semantics
Collecting semantics is a critical component of SEO promotion, as it is necessary to find not only the explicit semantics for the site, but also the synonyms people may use to search for your product/service – for example, “circuit breaker / automatic switch / fuse” and so on.
In the process of collecting semantics, you can conclude whether it is appropriate to expand the structure based on grouping queries into separate clusters. For example, if there are keywords for which a separate page can be created, it is better to do so – the query will then be more relevant to a specific page. Based on this principle, an SEO structure is created on the site: a structure that covers as many search queries in the niche as possible without creating duplicate pages and while avoiding internal cannibalization.
Most often, website owners focus on a basic set of keywords in the niche where they want to promote their business/project. The specialist's task is to perform the widest possible search, analysis, and collection of semantics for the site. This helps promote the site not only for high-frequency, highly competitive queries, but also for medium- and low-frequency keywords, through which you can achieve high positions and targeted traffic much faster.
Based on the collected semantics, the pages must be properly optimized for these queries. Here are all the potential points of influence and use of semantics, in order of priority:
- Forming and optimizing the Title meta tag (manually or via automatic generation).
- Optimizing the H1 heading (manually or via automatic generation).
- Optimizing the text content on pages (names of goods/services, SEO texts, etc.).
- Forming the Description meta tag (manually or via automatic generation).
- Forming the structure using H2-H6 headings (manually or via automatic generation).
- Setting up internal anchor links.
- Adding alt and title attributes for images (written manually or generated automatically).
Targeted work on individual pages is also necessary to expand their semantics and increase their relevance. This applies to pages that already have organic traffic and rankings. In addition, extra semantics should be selected not only from publicly available sources, but also from Google Search Console (it contains a lot of semantics that most experts overlook).
Also, do not forget to work on internal linking to your priority areas in order to strengthen pages and increase query relevance – that is, configure internal linking blocks to transfer weight to priority pages.
Content analysis of competitors
Before hastily adding keywords to texts and stuffing meta tags and images, you need to analyze the niche and determine the main parameters: the optimal average text size, keyword percentage, and other quality indicators. All of this is calculated based on an analysis of competitors' content.
Analyzing the content component is necessary in order to adopt competitors' experience and integrate it on your site. By relying on actual data rather than template work, you minimize the risk of incorrect On-Page optimization, which not only fails to deliver the expected result, but can, on the contrary, harm promotion and waste the money spent on creating ineffective content.
Writing meta tags
Writing metadata for the site based on the collected semantics is rather difficult work (although at first glance it may seem otherwise). Meta tags are formed from key queries and help the search engine better understand the topic and content of the site. Properly written metadata can also improve the click-through rate (CTR) of pages in search results.
Certain requirements should be met when creating the Title meta tag:
- use the correct word order;
- include synonyms that can expand niche coverage;
- integrate commercial modifiers (“buy”, “price”, “order”, etc.);
- use city names where appropriate.
For the Description meta tag, it is important to use a Unique Selling Proposition (USP) to increase the clickability of your snippets. Therefore, you should identify and describe your competitive advantages so that specialists can use them correctly when writing meta tags and texts.
Writing/updating texts
Writing new texts or updating old ones is necessary, first of all, in order to correctly integrate and organically distribute key queries.
Sometimes users do not even read the texts placed closer to the footer. This mostly applies to online stores, where the text on a category page is of little interest to us as potential buyers. The description on a product page, on the contrary, plays an important role when the user makes the final decision to order the product. But for the search robot both pages have value, and it determines that value while processing the content, including the SEO texts placed on them.
For information resources, texts are the main source of traffic, but it is not enough to simply write a long text on a popular topic and expect it to attract additional users. Texts are a very important ranking factor, so their writing should be treated very responsibly. Otherwise, the money spent on content may be wasted, and the site will remain stuck somewhere in the TOP-100 of search results.
You need to start with a niche analysis. The specialist studies the average indicators of competitors' texts: the amount of filler (“water”) and the keyword density. To determine the optimal structure and size of the text, the specialist also studies the texts of competitors in the TOP. Based on the data obtained, the required number of characters is calculated and the best structure for the future text is formed.
Content update
For pages that are a priority for promotion, the content should be updated at least once a year. It is not necessary to completely rewrite the text – it can be slightly updated, expanded, or supplemented. For the search engine, this is a signal that changes are taking place on the site and that its data is relevant and constantly refreshed. The query pool may also change from year to year, especially in a new niche or for a seasonal product, service, or technology. New queries can be obtained by analyzing Google Search Console data: it shows how the dynamics of the queries that bring the site organic traffic change, which keywords should be added to the content, and which have already lost their relevance.
For any type of resource, a good solution is to implement constantly updated blocks. This applies specifically to text content. Examples of such blocks include:
- lists of popular or new products for the week/month;
- new reviews on the site;
- blocks with a tabular layout;
- popular/related services;
- a block with short news items or announcements;
- promotions and special offers;
- mentions in the media.
This allows you to update content constantly, and search bots will crawl the site more frequently, which means the crawl budget will grow.
Auto-generation of meta tags or content
If there are many pages on the site, collecting semantics and manually writing meta tags for each of them is not an effective solution: it takes a lot of the specialist's time, disproportionate to the final result. This mostly applies to large online stores/marketplaces with tens or hundreds of thousands of product pages. Part of the metadata optimization work then has to be “closed” with auto-generated meta tags. This solves the problems of duplicated and missing basic metadata and provides minimal keyword optimization of the pages, buying time until those pages can be fully worked through manually.
It is important to understand that it does not always make sense to collect semantics for each individual page and optimize it manually with meta tags and unique copywritten texts: a group of pages may have few queries, and those may be low-frequency or absent altogether. In that case, it does not matter whether the metadata and text are auto-generated or written manually – the page will rank the same for ultra-low-frequency queries. Therefore, typical semantics are often collected for similar groups of pages and basic auto-generation of content is configured. This may apply, for example, to filter pages in online stores, such as a group of pages based on the product's “color” parameter with values: yellow, blue, red, black, white, etc.
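A minimal sketch of such template-based generation for color filter pages; the templates, category, shop name, prices, and counts are all hypothetical placeholders:

```python
# Template-based meta-tag generation for filter pages (hypothetical values).
TITLE_TEMPLATE = "Buy {color} {category} – price from {price} | ExampleShop"
DESCRIPTION_TEMPLATE = ("{color_cap} {category} in stock: {count} models. "
                        "Fast delivery, official warranty. Order online at ExampleShop.")

def generate_meta(category: str, color: str, count: int, price: int) -> dict:
    return {
        "title": TITLE_TEMPLATE.format(color=color, category=category, price=price),
        "description": DESCRIPTION_TEMPLATE.format(
            color_cap=color.capitalize(), category=category, count=count),
    }

# One unique Title/Description pair per filter value, with no manual writing.
for color in ["yellow", "blue", "red", "black", "white"]:
    meta = generate_meta("laptops", color, count=12, price=499)
    print(meta["title"])
```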
Expanding the structure
This area of work is aimed at ensuring that the site ranks for all possible search queries relevant to your niche and your list of goods or services. Expanding the structure is achieved by creating new pages for different query clusters and On-Page optimizing them for the necessary keywords.
Several methods of expanding the structure can be used on your site. All of them are used by your competitors and can potentially increase the number of ranking keywords and the reach of the target audience.
List of methods:
- Creation of a multilingual site and correct implementation of multilingualism.
- SEO filter based on objective parameters – product characteristics.
- SEO filter/tagging pages by subjective parameters.
- Creating a structure for queries with toponyms for large or priority cities.
- Splitting the structure for the necessary queries (the specialist decides on this during semantics collection).
- Splitting merged categories into separate pages.
- Blogging.
These directions help cover every opportunity to increase potential traffic and to rank for all relevant keywords in the niche. Below we consider each method in more detail.
Creation of a multilingual site
Ukrainian version of the site
The presence of a Ukrainian version of the site is no longer just a necessity but a legal requirement, with fines for site owners who lack one. According to statistics, over the past year the percentage of Ukrainian-language queries has almost doubled. This means the number of Russian-language queries is falling, and part of the traffic is shifting to Ukrainian keywords. Therefore, a Ukrainian version of the resource will not only save you from problems with the law, but will also increase the site's potential traffic and its visibility for all niche keywords.
Creation of a Russian version of the site
At the moment, Ukrainian search queries are gaining popularity, and in some niches their frequency is already higher than that of similar queries in Russian. Still, Russian-language queries continue to bring traffic and can generate orders: even if the total share of Russian-language keywords is 10-20 percent, these are still potential orders. Therefore, it is also worth making a Russian version of the site – but only on the condition that the Ukrainian version remains the main one.
Features of multilingual site
When creating a multilingual site, it is not enough to create a copy of the site and translate or write unique content (it is highly desirable that the content of each language version be based on its own search keywords). It is also important to implement multilingualism correctly technically: fully and correctly interlink the language versions of pages, and correctly show the search engine the language versions of each page using hreflang in the code or in the sitemap, etc.
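A minimal sketch of generating hreflang link tags for a page's language versions; the domain and the /uk/ and /ru/ URL scheme are hypothetical assumptions:

```python
# Hypothetical language versions of one page.
LANG_VERSIONS = {
    "uk": "https://example.com/uk/katehoriia/sumky/",
    "ru": "https://example.com/ru/kategoriya/sumki/",
}

def hreflang_tags(versions: dict, default_lang: str = "uk") -> str:
    tags = [
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in versions.items()
    ]
    # x-default points search engines to the version for unmatched languages.
    tags.append(
        f'<link rel="alternate" hreflang="x-default" href="{versions[default_lang]}" />'
    )
    return "\n".join(tags)

# The resulting tags go into the <head> of every language version of the page.
print(hreflang_tags(LANG_VERSIONS))
```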
Creating pages for relevant queries
When collecting semantics or analyzing the queries a site already ranks for, you can identify queries for which there are no separate relevant pages. This method is especially relevant for service sites, where one service may encompass a large number of smaller services. For example, “dental treatment” may include “caries treatment”, “periodontitis treatment”, etc. It is necessary to collect all the semantics of these nested services, create landing pages, and prepare content for them.
SEO filter
This method is also called “query pages”, because a static page with unique products, content, etc. is generated for any query, even a complex one, and answers the user's query as relevantly as possible. Why is it needed? It creates pages en masse for low-frequency (LF) queries and for a large number of low-frequency query clusters.
Essentially, the task of the SEO filter is to help the SEO specialist quickly set up high-quality promotion for low-frequency queries. Let us consider the advantages of using an SEO filter:
- An increase in the number of queries the site ranks for.
- Increased visibility of the site in search engines due to LF keywords entering the TOP-10.
- An increase in organic traffic due to fairly quick entry into the TOP for low-frequency queries.
- Growth in the positions of the main queries due to the promotion of LF queries.
- Improved behavioral factors thanks to more targeted queries.
- A reduced bounce rate thanks to more targeted queries.
- Increased conversion, because with narrower queries a person is already closer to the purchase stage than when searching for general queries.
- An increase in the number of pages, which matters when evaluating the likelihood that the site holds a good answer: the larger the site, the more likely a person will find an answer to their query.
SEO filter based on objective parameters
An advanced SEO filter by product parameters (objective parameters) means creating static pages with unique content (metadata, texts, product set) when one or more filter parameters are selected. The number of selected parameters at which a static, indexable page is generated is called the level of the SEO filter. How deep the filter should go is decided by a specialist based on an analysis of competitors, queries, etc. There may be situations where the filter works at the first level in one category but at the second or even third level in others.
Advanced SEO filter (additional subjective parameters)
This is a filter that uses not only objective parameters for generating pages, but subjective ones as well. For example, there is the concept of a “gaming laptop”, but no laptop can objectively be called a gaming one – it is a subjective notion. Another example is a “refrigerator for an apartment”: how does it differ from refrigerators for a country house or a home? There is no objective gradation to apply. Such pages are also called tagging pages.
Technical features
For these pages to work and function correctly, their operation must be implemented correctly from a technical point of view. Google does not give a precise description of how an SEO filter should function; it is partially described in various Help manuals.
The main rules that should be followed when creating such pages:
- page addresses must be understandable and static;
- there must be unique content on each page;
- addresses of all pages must be available for indexing;
- there should be links to these pages from the site + links from the site map.
Therefore, it is important not just to make an SEO filter, but to do it correctly so that Google indexes these pages and starts ranking them.
The correctness of the internal structure
Using such a filter can greatly complicate the work of filling the site, because everything must be standardized so that all products in one category have identical sets of attribute fields. For attributes to be as standardized as possible, the same attribute must always have exactly the same name. For example, all bags must have the same list of attributes: size, style, color, purpose, type, manufacturer.
The “color” attribute can have values such as brown, beige, red, etc. But under no circumstances should one bag have its color written as “brown” and another as “Brown” or a misspelled variant – that would generate two duplicate “brown bags” pages. This is a one-time job: everything is brought to a single standard once, and afterwards, when adding new products, you simply select the already defined attributes. This, by the way, is very important during the initial development of an online store website: by taking all the nuances into account at the development stage, you save yourself problems in the future.
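A minimal sketch of normalizing attribute values before they are saved, so that case or spelling variants cannot spawn duplicate filter pages; the synonym map is a hypothetical example:

```python
# Map raw attribute values to one canonical form (hypothetical synonyms).
CANONICAL_COLORS = {
    "brown": "brown",
    "braun": "brown",       # hypothetical misspelling/variant
    "dark brown": "brown",
    "beige": "beige",
    "red": "red",
}

def normalize_color(raw_value: str) -> str:
    key = raw_value.strip().lower()
    if key not in CANONICAL_COLORS:
        # Unknown values go to moderation instead of silently creating new pages.
        raise ValueError(f"Unknown color value: {raw_value!r}")
    return CANONICAL_COLORS[key]

print(normalize_color("  Brown "))  # -> "brown"
```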
Content filling
In most cases, the key queries for each filter page are not fully collected and measured in SE Ranking, because for large online stores there can be hundreds of thousands of such pages, and the budget of time spent on collection and measurement would be colossal. But you can, for example, select several clusters of similar queries to see their frequency and the site's current positions for them.
The main condition for the correct operation of such pages is correct technical implementation. They must not only be created correctly, but also made unique through On-Page optimization: each page needs unique meta tags, unique content, and a unique product list. However, it is not necessary to write the meta tags and texts manually – in most cases, auto-generated meta tags and/or texts are enough.
Queries with toponyms
This means creating pages relevant to “query + city” searches. This functionality covers all queries that include the name of the city where your product or service is being searched for. If you have regional stores or are verified on Google Maps and Google My Business, this functionality performs very well in regional SEO. Similar functionality can be implemented through filter pages, through a separate block with cities displayed on pages, or by creating subdomains.
Similar functionality works well for Rozetka, Epicentr, and Eldorado. So you should analyze them and understand how they achieved good indexing – and do it as soon as possible, before everyone else starts implementing it, so as not to end up at the back of the pack.
Blogging
Keeping an informational and/or news blog brings traffic to the site for informational queries: articles are written for those queries and then linked to the site's products or categories.
So, we solve several tasks at once:
- An increase in activity on the site is reflected positively in its ranking. Traffic to informational articles always has higher behavioral factors, which raises the site's average behavioral factors. Also, when articles get into Google's recommendations (Discover), we get a sharp increase in traffic to the site (though getting there takes a lot of effort and a good deal of luck).
- An internal anchor link tells the search bot which key queries the linked page is relevant to and what it should rank for.
- Updating content is also very important for good ranking. Very often, previews of new articles appear on a number of the site's pages and thus partially refresh their content, which the search bot perceives positively.
The blog can be filled with informational articles, news, and presentations – everything that can attract potentially commercial traffic. In addition, if you tag articles correctly, you can create subcategories, just as on the main site: for example, by marking articles with tags, you can form a topic-based structure in the blog. Blog articles, like the site's other texts, are written with semantics in mind.
Authority
Since Google introduced the E-E-A-T guidelines, it is very important that all articles be written by professional authors (or that such visibility be created for the search engine). To do this, indicate the authorship of each article on its page and create a separate page for each author with brief information about them.
Articles and the author's page should link to each other. In addition, it is desirable that the author's page link to the author's active social media profiles where they share information on the topic. You can also use any other means of confirming the author's identity and professionalism in the subject.
Working with internal linking
Internal linking is very important, because it is what the search bot relies on when crawling the site. Correct transfer of internal page weight with correct anchors gives better ranking for key queries. Correct links between goods/services and categories/subcategories/filter pages, along with correctly configured pagination, help the site get indexed better. In practice, it has been verified that the presence of a page in the sitemap does not mean it will be indexed. Therefore, it is necessary to interlink all pages of the site as fully as possible, so that there are no hanging nodes – pages that the bot never reaches.
Various methods are used for additional internal linking:
- Additional link blocks on site pages (previously it was called “tag repository”, but with slightly different functionality).
- Linking from product characteristics to filter pages.
- Linking from articles to categories or goods/services.
For linking to be correct, hanging nodes are also eliminated – pages with no links out to other pages – as are pages with a huge number of outgoing links. It is also very important to use anchors correctly, so that they are relevant to the pages they point to. This shows search engines high query relevance and helps with promotion.
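A minimal sketch of finding such hanging nodes, assuming a hypothetical crawl of internal links represented as an adjacency map and a hypothetical sitemap page list:

```python
from collections import deque

# Hypothetical internal-link graph: page -> pages it links to.
links = {
    "/": ["/catalog/", "/blog/"],
    "/catalog/": ["/catalog/bags/", "/"],
    "/catalog/bags/": ["/catalog/"],
    "/blog/": ["/"],
}
# Pages listed in sitemap.xml (hypothetical).
sitemap_pages = {"/", "/catalog/", "/catalog/bags/", "/blog/", "/catalog/shoes/"}

# Breadth-first traversal from the home page, following internal links only.
reachable, queue = {"/"}, deque(["/"])
while queue:
    page = queue.popleft()
    for target in links.get(page, []):
        if target not in reachable:
            reachable.add(target)
            queue.append(target)

# Pages in the sitemap that no internal link reaches are "hanging nodes".
print("Orphan pages:", sitemap_pages - reachable)  # -> {'/catalog/shoes/'}
```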
Working with the external link profile
External links are one of the most important areas of work on a site. The more authoritative the sites that link to your resource, the more authoritative your site is in the eyes of the search engine: Google decides it can be trusted and improves its positions in search results. In addition, anchor links show the search engine the relevance of the anchors to the page. But it is very important to build up the link profile correctly: you cannot buy only anchor links, while only non-anchor links will not give the effect needed for growth. All types of links must be combined in the correct proportions – only this ensures a good effect and avoids sanctions from search engines.
The main areas of work with the link profile are:
- Analyzing the current link profile and cleaning it of low-quality donors.
- Analyzing competitors and identifying how links are used for promotion in the niche.
- Creating a strategy based on the niche's characteristics and the site's current profile.
- Searching for sufficiently high-quality donors for anchor and non-anchor promotion.
- Consistently acquiring links according to the developed strategy.
Analysis of competitors' link profiles + forming a strategy and budget
To begin with, it is necessary to analyze the TOP competitors in the niche and calculate the average indicators in terms of quantity and ratios – anchor/non-anchor, article/crowd/submit, etc. This is very important: by skewing toward anchor article links we can get the site pessimized for using purchased links, while not using them at all will give weak position growth. Therefore, it is very important to calculate the niche averages correctly.
Next, you need to analyze your own site's link profile to understand the current situation – the same ratios and totals that we looked for among competitors. Based on the data from these two analyses, a link-mass budget and a link-building strategy are formed: how many links are needed overall, how many of each type, and so on. On this basis, proposals for the link-building budget are made.
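A minimal sketch of computing such niche averages from competitor data; the domains and all numbers below are hypothetical placeholders:

```python
# Hypothetical competitor link-profile data exported from a backlink tool.
competitors = [
    {"domain": "competitor1.com", "total": 420, "anchor": 120, "crowd": 150, "articles": 200, "submits": 70},
    {"domain": "competitor2.com", "total": 610, "anchor": 200, "crowd": 180, "articles": 310, "submits": 120},
    {"domain": "competitor3.com", "total": 350, "anchor": 90,  "crowd": 130, "articles": 160, "submits": 60},
]

def niche_average(metric: str) -> float:
    return sum(c[metric] for c in competitors) / len(competitors)

total = niche_average("total")
print(f"Average links per competitor: {total:.0f}")
print(f"Anchor share: {niche_average('anchor') / total:.0%}")
for link_type in ("crowd", "articles", "submits"):
    print(f"{link_type} share: {niche_average(link_type) / total:.0%}")
```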
Selection of donor domains
After the link strategy and budget are approved, the specialist forms a list of donor domains on which links will be purchased, based on the monthly budget and the chosen strategy. The list of sites is formed after an in-depth analysis of their quality and a price check. Each site is analyzed against more than 10 parameters and indicators; then, based on the indicator values and the cost, the specialist decides whether buying a link from this site for that money makes sense. Thus, from the initially selected sites, a final list is formed (for placing texts with links).
Exporting competitors' link profiles also yields a list of sites where they have been published and where you can place links as well. Here it is better to involve an outreach specialist who, directly or through exchanges, can look for opportunities to place links on these sites.
Increasing site conversion/work with usability
Work on increasing the site's conversion rate is a whole separate pool of tasks aimed at increasing the number of targeted actions on the site without increasing its traffic. For different resources, conversion can mean completely different actions: for information sites, it is more pages viewed; for a service site, it is submitted applications, calls, and callback requests; for online stores, it is an increase in the number of completed transactions (orders placed through the shopping cart or via quick order).
To properly improve website conversion, you need to know the so-called customer journey – the path the user takes from landing on the site to performing a conversion action. Improving the conversion rate requires constant work on the site's usability. This includes a number of tasks that make using the site convenient:
- All functionality on the site must work correctly, buttons/texts/blocks on the site must be as clear and easy to use as possible. The simpler and clearer, the better.
- All the necessary information should be in prominent places so that the user can easily find it and use it if necessary.
- All call-to-action blocks should be highlighted and clear.
- Navigation, page opening, moving around the site, and other actions should be clear and predictable.
To improve usability, you can start with the recommendations provided by Google's Lighthouse tool. This will solve the basic problems with the site's ease of use.
To increase the site's conversion, break down a person's behavior from landing to order into stages and understand at which stage – and, most importantly, why – users drop off. For an online store, the scheme is quite simple: the user opens a product page, reviews its attributes, adds the product to the cart, goes to the cart, starts checkout, and completes the order. Next, you need to find the reasons for drop-off at each step of this funnel.
Working with micro-markup and snippet optimization
Micro-markup is a special type of HTML tags in the site code that gives search bots an understanding of what information is presented on the site. For example, LocalBusiness markup conveys your company's address, opening hours, and name: when you mark up certain blocks of text, Google understands that this line of text is an address and that fragment is a company name. Google does not merely register the micro-markup present on the site – it can also display it in the snippet. In this way, you can expand and improve the appearance of your snippet in search results. That is, correct micro-markup lets you not only show the search engine important data, but also increase the CTR of your snippets. What does this give the site? Even without an improvement in positions, you can get more targeted actions.
For your site, you need to configure the following types of micro-markup (a Product example follows the list):
- Organization
- BreadcrumbList
- Local Business
- Product
- Review
- AggregateRating
- CollectionPage
- SearchAction
- FAQPage
- ImageObject
- VideoObject
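A minimal sketch of generating Product micro-markup in JSON-LD format for a product page; all product values, the shop domain, and the currency are hypothetical placeholders:

```python
import json

# Hypothetical product data; in practice this comes from the product database.
product_markup = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Leather bag Example-1",
    "image": "https://example.com/images/bag-1.jpg",
    "description": "Brown leather bag with a shoulder strap.",
    "offers": {
        "@type": "Offer",
        "price": "1499",
        "priceCurrency": "UAH",
        "availability": "https://schema.org/InStock",
    },
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.7",
        "reviewCount": "23",
    },
}

# The resulting block is embedded in the page's HTML.
print('<script type="application/ld+json">')
print(json.dumps(product_markup, indent=2, ensure_ascii=False))
print("</script>")
```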
To summarize
The development and implementation of an SEO strategy is the foundation of website promotion, without which a good result is impossible. In this article, we discussed the advantages of using an SEO strategy and considered each of its stages in detail. As you can see, doing all of this without specialized knowledge and a sufficient amount of time is not easy – so contact us, and we will be happy to help you achieve your goal!





