When a website loses its keywords, a drop in traffic from organic search results is inevitable. Search engines rank your site on many signals, so if keywords disappear for one reason or another, the affected content no longer answers users' queries. Rankings plummet, and potential visitors can no longer find you.
Site owners who invest in SEO usually maintain a semantic core (a structured keyword map). When a drop occurs, that core can be used to audit the site's priority pages and quickly find the cause. But what if there is no semantic core and no record of which keywords were lost?
Resource Analysis
First of all, the site needs to be diagnosed. It is important to determine which part of it is affected, since a traffic drop is often limited to specific traffic-generating pages rather than the whole site. The simplest and most effective option is to use the free analytics tools offered by the major search engines.
Google Analytics
If Google is your priority search engine, start by analyzing your pages in Google Analytics. Open the Engagement tab and select the Landing page report, then set the filters on the right (dimension, match type, and value), choose the date range in which the organic traffic drop began, and compare it with the previous period. This lets you compare the data from "before" and "after" the problem appeared. The report shows all the important page-level metrics:
- page views;
- unique page views;
- average time on page;
- entrances;
- bounce rate;
- page value.
By increasing the number of rows displayed to 50–100 (the default is 10), you can spot the pages with a sharp drop in traffic and deteriorating behavioral metrics. If the site has many pages, or you are only interested in priority categories and subpages, you can filter the data by URL. All of this data can be exported to a separate file for further use; the same comparison can also be pulled programmatically, as in the sketch below.
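If you prefer to pull the "before vs. after" comparison programmatically, the GA4 Data API can return landing-page metrics for two date ranges in a single report. Below is a minimal sketch, assuming the google-analytics-data Python package and service-account credentials are already set up; the property ID and dates are placeholders, and the metric names are GA4 API names rather than the report labels listed above.

```python
# Minimal sketch: compare landing-page traffic before and after a suspected
# drop using the GA4 Data API. The property ID, credentials, and date ranges
# are placeholders -- substitute your own values.
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange, Dimension, Metric, RunReportRequest,
)

PROPERTY_ID = "123456789"  # hypothetical GA4 property ID

client = BetaAnalyticsDataClient()  # uses GOOGLE_APPLICATION_CREDENTIALS

request = RunReportRequest(
    property=f"properties/{PROPERTY_ID}",
    dimensions=[Dimension(name="landingPage")],
    metrics=[
        Metric(name="sessions"),
        Metric(name="screenPageViews"),
        Metric(name="bounceRate"),
    ],
    # Two named date ranges give the "after" and "before" periods in one call.
    date_ranges=[
        DateRange(start_date="2024-03-01", end_date="2024-03-31", name="after_drop"),
        DateRange(start_date="2024-02-01", end_date="2024-02-29", name="before_drop"),
    ],
    limit=100,  # raise the row limit, as you would in the UI
)

response = client.run_report(request)
for row in response.rows:
    # Each row carries the landing page plus the date range it belongs to.
    print([v.value for v in row.dimension_values],
          [v.value for v in row.metric_values])
```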
Google Search Console
Once the pages losing traffic have been identified, move on to Google Search Console. In the Performance report, choose a preset period or manually specify a date range before the traffic drop. The Queries tab lists the popular queries for your site; they show exactly how users find you in search. Using the list of pages exported from Google Analytics, you can filter the data by URL to get the key queries for a specific page.
Google Search Console also lets you analyze popular pages, much as Google Analytics does, although with significantly fewer metrics. The same data can be retrieved through its API, as shown below.
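Pulling the Performance data through the Search Console API is convenient when you need queries for many URLs at once. The sketch below assumes the google-api-python-client package and a service account that has been granted access to the property; the site URL, page URL, key file name, and dates are placeholders.

```python
# Minimal sketch: list the queries that brought traffic to one affected page,
# via the Search Console Search Analytics API. All URLs, dates, and the key
# file name are placeholders -- substitute your own values.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES  # hypothetical key file
)
service = build("searchconsole", "v1", credentials=creds)

body = {
    "startDate": "2024-02-01",
    "endDate": "2024-03-31",
    "dimensions": ["query"],
    # Restrict the report to a single URL exported from Google Analytics.
    "dimensionFilterGroups": [{
        "filters": [{
            "dimension": "page",
            "operator": "equals",
            "expression": "https://example.com/affected-page/",
        }]
    }],
    "rowLimit": 250,
}

response = (
    service.searchanalytics()
    .query(siteUrl="https://example.com/", body=body)
    .execute()
)
for row in response.get("rows", []):
    print(row["keys"][0], row["clicks"], row["impressions"], row["position"])
```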
Main reasons
Using the tools described above, you can analyze the site and recover the lost keywords. The harder task is to correctly identify the source of the drop and make the necessary changes, since having the list of queries alone does not solve the problem.
Below are the most common problems; knowing them should save time and help bring the traffic back.
Technical problems
When technical changes are made to a site, errors can creep in, and their consequences are harmful. This often happens when moving to new hosting, redesigning the site, or changing its structure or template.
TOP 7 technical problems (a combined diagnostic sketch follows the list):
- The site is closed from indexing
After a new site is moved from a test server to production, it may remain unavailable for indexing because the development-time blocks were never removed: the <meta name="robots" content="noindex, nofollow" /> tag was left on the pages, or the entire site is blocked by a Disallow: / directive in robots.txt.
- Changed URLs
A technical change can lead to an unplanned change in the URLs of priority pages. In that case, restore the old URLs and set up a 301 redirect from the new URLs to the old ones. If more than 14 days have passed since the change, it is better to leave everything as is and instead set up a 301 redirect from the old pages to the new ones.
- Missing metadata
After a migration, the Title and Description meta tags sometimes disappear. They must be filled in on every page, using the collected keywords or automatic metadata generation.
- Sitemap.xml not updated
After a migration, sitemap.xml may be unavailable or out of date. This file helps search engine robots crawl all the pages of your site faster. If it is missing or contains incorrect addresses, crawling slows down significantly and the site loses traffic.
- Errors or lack of internal linking
If internal links to pages are incorrect or missing for some reason, then, as with sitemap.xml, search robots cannot crawl and analyze the site correctly.
- No text content
During technical work, some text blocks may not be carried over to the new pages, and that text contains your keywords. Restore the lost text as quickly as possible so that search engines can find your content again.
- Speed
Sometimes, after new features are added to a site, its loading speed drops. A site with such problems soon stops ranking as well as before, and traffic is lost.
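Many of the checks above can be automated with a short script. The sketch below assumes the requests library and uses placeholder URLs; it looks for the most common symptoms at once: redirect chains and status codes, a noindex robots meta tag, missing Title and Description tags, a site-wide Disallow in robots.txt, a missing sitemap.xml, and slow responses. Treat it as a rough diagnostic aid, not a substitute for a full crawl.

```python
# Rough diagnostic sketch for the technical issues listed above, assuming the
# `requests` library is available. URLs below are placeholders; adapt the
# checks to your own site and CMS.
import re
import time
import requests

SITE = "https://example.com"              # hypothetical site root
PAGES = [f"{SITE}/", f"{SITE}/catalog/"]  # priority pages exported earlier

def check_page(url: str) -> None:
    start = time.time()
    resp = requests.get(url, allow_redirects=True, timeout=15)
    elapsed = time.time() - start
    html = resp.text.lower()

    # 1. Status code, redirect chain, and response time.
    chain = " -> ".join(str(r.status_code) for r in resp.history) or "direct"
    print(f"{url}: HTTP {resp.status_code} ({chain}), {elapsed:.2f}s")

    # 2. Page blocked from indexing via the robots meta tag.
    if re.search(r'<meta[^>]+name=["\']robots["\'][^>]*noindex', html):
        print("  WARNING: noindex meta tag found")

    # 3. Missing Title / Description metadata.
    if not re.search(r"<title>\s*\S", html):
        print("  WARNING: empty or missing <title>")
    if not re.search(r'<meta[^>]+name=["\']description["\']', html):
        print("  WARNING: missing meta description")

def check_site_files() -> None:
    # 4. robots.txt blocking the whole site with "Disallow: /".
    robots = requests.get(f"{SITE}/robots.txt", timeout=15)
    if robots.ok and re.search(r"(?m)^disallow:\s*/\s*$", robots.text.lower()):
        print("WARNING: robots.txt contains a site-wide 'Disallow: /'")

    # 5. sitemap.xml missing or unreachable.
    sitemap = requests.get(f"{SITE}/sitemap.xml", timeout=15)
    if not sitemap.ok:
        print(f"WARNING: sitemap.xml returned HTTP {sitemap.status_code}")

if __name__ == "__main__":
    check_site_files()
    for page in PAGES:
        check_page(page)
```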
Content
Search engine algorithms are constantly improving, so the content on the site needs to be updated periodically. Otherwise, robots may judge it to be of little value or no longer relevant, which leads to lost positions and traffic. Content should not only contain search queries but also be useful, relevant, and interesting to the user.
Backlinks
A large number of acquired links can hurt your site if they sit on sites with a bad reputation: instead of growth, you get a drop in positions. Search engines also recommend that links appear naturally. If a search engine detects a sharp spike, your site may be penalized for link spam, and recovering organic traffic will be extremely difficult.
Behavioral factors
An important factor is how convenient it is for users to interact with your site. If visitors start leaving within the first seconds for some reason, you need to analyze and fix whatever is stopping them from engaging. Bounce rate and other metrics tell search engines about the quality of your site: the worse these signals are, the lower you appear in the search results.
Sanctions
One of the most dangerous causes is sanctions from search engines, which mean your site has grossly violated their rules and is now restricted. In this situation, identify the violation and fix it as quickly as possible. You can check for sanctions in your search engine's webmaster panel or in Google Search Console (the Manual actions report). Even after the problem is fixed, the site can take a very long time to recover from the drop.
External interference
In rare cases, the problem is external interference, which is most often seen in highly competitive niches: the site is attacked by bots, or its CMS is hacked. You can spot this from unnatural traffic data or sharp jumps in metrics; a quick look at the raw access logs, as in the sketch below, often makes it obvious.
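One quick way to spot bot activity is to count requests per hour and per user agent in the raw access log. The sketch below assumes a standard Apache/Nginx combined log format and a placeholder file path; unusual spikes in either counter are worth investigating.

```python
# Quick sketch for spotting unnatural traffic spikes in a raw access log.
# Assumes the common Apache/Nginx combined log format; the path is a placeholder.
import re
from collections import Counter

LOG_PATH = "access.log"  # hypothetical log file
HOUR_RE = re.compile(r"\[(\d{2}/\w{3}/\d{4}:\d{2})")  # e.g. 10/Oct/2023:13

hits_per_hour = Counter()
hits_per_agent = Counter()

with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        hour = HOUR_RE.search(line)
        if hour:
            hits_per_hour[hour.group(1)] += 1
        parts = line.rsplit('"', 2)  # the user agent is the last quoted field
        if len(parts) == 3:
            hits_per_agent[parts[1]] += 1

print("Busiest hours:", hits_per_hour.most_common(5))
print("Most active user agents:", hits_per_agent.most_common(5))
```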
To summarize
Key queries are vital for successful website promotion; losing them means losing organic traffic and conversions. The free tools described above are a reliable way to diagnose and restore the "health" of your site. Study them well and such problems will no longer scare you; moreover, you will be able to run this kind of segment analysis on your own and grow your business by attracting new users.




