Yandex sanctions: filters for low-value content, spam and excessive advertising

Hello, dear readers. In this series of articles, we will talk about the sanctions and filters that the Yandex search engine imposes on offending sites.

We will also consider the factors that can lead to a ban and describe the filters that are applied en masse.

True, within the framework of one article we cannot physically cover the entire body of information on filters, so this publication will have a continuation. Today we will learn, among other things, how to tell whether a site has fallen under the AGS filter or under the filters for content over-optimization and overspam. Naturally, the set of factors leading to the imposition of these sanctions will be described, and proven ways to get out from under them will be indicated. I hope the information proves useful and you put it to work for the benefit of your project.

Types of Yandex sanctions and reasons for the site being banned

Unfortunately, even if you follow all of the recommendations below, you will not be one hundred percent insured against pessimization, but the risk will decrease significantly. In any case, by understanding the mechanism behind a filter, you can work out for yourself what you were punished for and try to eliminate the shortcomings. Consultations with "Platon" (replies from Yandex technical support are usually signed by an impersonal "Platon Shchukin" - or at least they used to be) will also be easier to conduct once you understand the principles behind a ban.

So, all Yandex filters can be divided into:

  1. Pre-filters - the search engine applies these at the stage of getting sites into the index and into the basic search.
  2. Post-filters - applied after calculating the relevance of a document (a ranked web page).

Sanctions imposed by Yandex can also be divided into:

  1. Manual
  2. Automatic

And by the degree and type of damage, one can distinguish:

  1. A full ban (the site is banned), i.e. all of its pages are excluded from the search database. Naturally, traffic from this search engine stops completely.
  2. A filter in the form of ignoring outgoing links from the resource. This can matter, for example, to those who plan to buy links from it.
  3. Incoming links may also be discounted, which will seriously worsen the ranking.
  4. For aggressive advertising or adult content, the site may be restricted in search (in the second case, it will be shown only when the user explicitly opts in to adult results).
  5. The notorious AGS, which leads to the exclusion of some of the resource's pages (often most of them) from the search database.
  6. Punishment for a detected attempt to manipulate behavioral factors (more on this below).
  7. A number of filters and sanctions imposed after ranking.

What reasons can lead to the imposition of sanctions?

  1. Texts can easily lead to the imposition of filters (this happened to my own blog, as I describe below).
  2. Links can also be punished - for example, for inept and excessive work on building external link mass, or for over-optimization when organizing internal linking.
  3. Punishment may also follow for abuse of advertising placed on the pages of the resource.
  4. It is clear that attempts at cheating (manipulating behavioral factors, etc.) can lead to sad consequences.
  5. Yandex can identify your affiliated sites - sites created to promote the services or products of the same company in order to occupy as many positions in the search results as possible (for most search queries, only one result per resource is shown in the results).

Why can you get banned in Yandex?

Let's start with Yandex's most radical punishment for ordinary webmasters, namely the things that can lead to a website being banned (so-called black-hat SEO).

  1. Hidden text - white text on a white background, or any other method of placing content so that ordinary visitors do not see it. A ban can be obtained, for example, for using such techniques to hide keyword-stuffed text.
  2. Placing on a page a list of the keywords it is promoted for. You have probably seen (it used to be common) this list designed as a block like "People also found us by these queries". However, you can be punished (with a ban or a filter) even for ordinary internal linking, if you diligently link to other pages using keyword anchors (for example, copied from competitors). In this regard, it is rather dangerous to overdo it.
  3. Cloaking - when search robots (recognized by the user agent they present when requesting a page, or by their IP address) are served one kind of content (usually over-optimized and hard to read), while visitors are served completely different content (concise, without extra keywords). Cloaking was actively used at the dawn of search engines, when keyword spam and similar tricks were not yet punished.
  4. Links placed in an image so small that it is invisible on the site, although the search engine sees and counts the link. Again, this is a relic of the past that only a very backward webmaster would use now.
  5. Automatically generated content, for example from a doorway generator or a synonymizer. These things are still actively used, and doorway makers still manage to bypass the search engines' defenses. A doorway's life is not long, but no effort is spent on creating one, so their makers rely on quantity, study the search engines' changing algorithms in depth, and try to play on their vulnerabilities.
  6. The presence on the site of pages or sections for link exchange (created manually or automatically), or any other structures that can be classified as link dumps (hundreds or thousands of links to various resources). It does not matter whether the links have a text anchor or come from banners or images. All this is a direct path to a Yandex ban, or (if you are lucky) to the imposition of sanctions. You can still exchange links, but they should be thematically related to your resource and few in number.
  7. Placing web pages effectively outside the site: there are no internal links to them from your resource (only external ones). At the same time, they may use the site's design and carry all the same navigation elements.
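Incidentally, point 3 (cloaking) can in principle be checked from the outside: fetch the same URL twice, once with a normal browser user agent and once presenting yourself as a search robot, then compare the text you received. Below is a minimal sketch in Python (standard library only); the 0.5 similarity threshold and the sample strings are my own illustrative assumptions, not anything Yandex publishes.

```python
from difflib import SequenceMatcher

def similarity(text_for_users: str, text_for_bots: str) -> float:
    """Ratio from 0.0 (completely different) to 1.0 (identical)."""
    return SequenceMatcher(None, text_for_users, text_for_bots).ratio()

def looks_like_cloaking(text_for_users: str, text_for_bots: str,
                        threshold: float = 0.5) -> bool:
    """Flag a page if the version served to robots differs too much
    from the version served to people. The threshold is arbitrary."""
    return similarity(text_for_users, text_for_bots) < threshold

# Identical versions - nothing suspicious:
print(looks_like_cloaking("honest page text", "honest page text"))  # False
```

In a real check you would download both versions of the page (sending different User-Agent headers) and strip the HTML before comparing; a low similarity score is only a reason for a manual look, not proof of cloaking.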

If something from the list above is present on your resource (perhaps even created without malicious intent), it is worth removing it before sanctions follow.

From the ban, let's move on to the milder punishments used by Yandex, namely filters. The existence of many of them is officially confirmed by the search engine, while others are just optimizers' speculation - but that does not mean they are not applied (remember the joke from the film DMB: "Do you see the gopher? - No. - Neither do I. But it's there!").

So let's list the Yandex filters that no one would like to fall under.

AGS filter (17, 30 and 40) from Yandex

AGS - there are varieties numbered 17, 30 and 40. This filter should be feared primarily by those who churn out junk sites on an assembly line (sites created for search engines or for selling links, not for the benefit of visitors). This type of pessimization appeared in 2009, and its latest modification, AGS-40, dates back to the end of 2013. As the name suggests, the filter is designed to cut low-quality resources out of the results, and its newer versions brought increased selectivity.

After it is imposed, only a few pages (from one to ten) remain in the Yandex index. At the same time, the pages remaining in the index are not subjected to any additional pessimization (their positions in the search results remain unchanged). The AGS works, of course, in automatic mode.

Let's take a look at some of the criteria that increase the likelihood of falling under the Yandex AGS filter:

  1. A newly registered domain name (a young site).
  2. Hosting the resource on free hosting.
  3. Use of a widely distributed free engine or template (Joomla, WordPress, a cracked DLE, etc.).
  4. Low or almost zero traffic (a few people per day). From the available statistics, one can conclude that if traffic is below fifty people per day, the probability of the site falling under the Yandex AGS is very high. But if attendance crosses the threshold of three hundred unique visitors, then catching the AGS becomes a very difficult task.
  5. Absence from the major site directories. Once listed in both of them, you can virtually guarantee yourself immunity in your relationship with the AGS. This is evidenced by analysis of the resources that have fallen under this filter.
  6. The absence, or a small number, of external links leading to the resource. Fill this gap carefully, without going to excess.
  7. The presence of forums, blogs or message boards (on the same domain or a subdomain) to which anyone can add information without moderation. Usually the result is a monstrous spam dump, which greatly discredits the resource.
  8. Extensive duplication of internal content (for example, to get more pages from which to sell links). In WordPress, these can be tag archives, category archives, date archives, and the like.
  9. Non-unique texts or texts of bad quality (low-grade rewrites or auto-generated texts).
  10. Overly aggressive internal linking with large blocks of internal links. In this regard, many advise being careful with things like tag clouds.
  11. Signs of link selling (external links present on many pages of the site).
  12. Aggressive advertising such as pop-ups or pop-under windows. A resource that respects itself and its readers will, as a rule, not use such things.

How to get a site out of the AGS (17, 30, 40)?

However, if you have already fallen under this type of pessimization, you will primarily be interested in methods of getting out from under the AGS filter. These are mostly universal recommendations that should help once you have noticed what has happened to your site.

  1. Naturally, you need to start with a "letter to Platon" - in other words, try to contact Yandex support to find out the reason. Do not throw your weight around; simply explain the situation and ask for advice.
  2. If, after Platon's answer or from your own understanding, you conclude that the problem is the content (namely its low quality or over-optimization), then nothing remains but to rewrite the content. Of course, if your resource really is a junk site, you will not bother, due to inexpediency (it is easier to slap together a new one); but when your real brainchild, which you groom and nurture, falls under the AGS, it will be worth the strain.

    True, for me this resulted in five months of hard labor reworking four hundred articles - but, consequently, the game was worth the candle.

  3. If, on top of a free engine, you also use a free template that thousands of resources on the network use at the same time, it makes sense to try changing the design to something more unique.
  4. If the site you are building is positioned as a commercial one, be sure to add contact information to it (phone, office address, a location map), clearly indicating which organization it belongs to. A lot of junk sites are disguised as commercial resources, and it is precisely the lack of contacts that gives them away. For an information resource, add at least a contact email address.
  5. Try to add some kind of useful service. For example, a standard for many commercial resources is a calculator for estimating the cost of services, or something similar. You can borrow ideas for this from competitors.
  6. If there is a suspicion that the site's usability and navigation are lame, do not put off fixing them. In general, read up on usability and draw your own conclusions about whether your project meets the standards of modern SEO, so that the AGS does not come anywhere near.

In fact, all the above tips for getting out from under the AGS boil down to one thing: make websites for people - with a normal design, normally formatted content and easy navigation (usability). The advice is simple, but you will have to spend a lot of time (or money) implementing it.

Yandex sanctions for a site's text content

Text filters can be divided into two main types, listed and described below. The way out from under each is slightly different, but the common thread is to optimize content for current SEO requirements, not for what ruled a decade ago.

The "footcloth" filter (for text spam)

The filter for text spam, also known as the "footcloth" filter, was introduced in 2010 and has never been confirmed by Yandex in any way (but the gopher principle from DMB works here too). Its distinctive feature is that the page sinks for one particular query, while the other queries for which this web page ranks may keep their positions.

The methods of dealing with the footcloth filter are quite simple: dilute the exact-match occurrences of the sagging keyword (or phrase), reduce the volume of the text (again, this matters more for commercial than for informational sites), and format it properly (paragraphs, headings of different levels, lists, etc.). That is, a colorfully and conveniently designed "footcloth" (even tens of thousands of characters long) without a dominance of exact-match keywords can be perceived wonderfully by visitors and will drag the query upward.

Overspam sanctions are most often triggered by:

  1. Pages heavily crammed with keywords (in exact-match occurrences).
  2. Texts tens of thousands of characters long that no one reads. The key point is precisely that they are not read, due to their frightening size, lack of formatting, and a heap of highlighted keywords instead of meaningful semantic emphasis. A readable text of large volume, on the contrary, can be quite a good help for promotion (especially for an information site).
  3. By the way, this is precisely because five or seven years ago SEOs wrote huge texts for commercial sites (with little formatting, since they were written only for Yandex, and often hidden from visitors behind spoilers or in non-scrolling blocks), into which they could easily cram hundreds of occurrences of a key without exceeding the critical value of "nausea" (the percentage of keyword occurrences relative to the total number of words in the text). Over time, search engines figured out this dishonest optimization method and introduced the corresponding penalties for such cheating.
  4. Some people compare the texts of my articles to footcloths, but that is not accurate: they fall under that definition only by their large size. And for an information resource this is not so critical (see Wikipedia, Habrahabr and other mainstream informational resources). The main problem with long texts is keeping the user reading them, which is helped by formatting, styling and catchy subheadings. And the content itself, of course - where would we be without it.
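The "nausea" mentioned above - the share of keyword occurrences in the total word count - is easy to estimate yourself. A minimal sketch in Python; the exact formula search engines use is unknown, so this simple word-ratio version is only an approximation for self-checking.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Percentage of words in the text that belong to exact-match
    occurrences of the keyword phrase (a rough 'nausea' estimate)."""
    words = re.findall(r"\w+", text.lower())
    key_words = re.findall(r"\w+", keyword.lower())
    if not words or not key_words:
        return 0.0
    n = len(key_words)
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == key_words)
    return 100.0 * hits * n / len(words)

text = ("plastic windows are cheap here buy plastic windows today "
        "plastic windows fit any home")
print(round(keyword_density(text, "plastic windows"), 1))  # 42.9
```

A figure like that - over forty percent of the words belonging to one exact-match phrase - is exactly the kind of text this section warns about; diluting the occurrences brings the number down.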

Steps for getting out from under the footcloth filter (for text spam):

  1. We write to Yandex support (yes, here too) about the possible imposition of sanctions, trying to find out which ones.
  2. We clean the texts of keyword spam and remove the "water" along the way, reducing volume while preserving the essence. This improves the user's perception of the content, especially when the conditions of the next point are met.
  3. We add formatting if it was missing (breaking the text into paragraphs by meaning; adding subheadings, lists, semantic highlights, images, and tables or other elements where necessary). This increases the likelihood that the text gets read and improves behavioral factors. In other words, we make candy out of the footcloth.
  4. We check whether the page content fully answers the user's query. This condition must also be met to exit the overspam filter.

Content re-optimization filter

The Yandex re-optimization filter was imposed on my blog in the spring of 2013, and I felt all its charm on my own skin. It had appeared two years earlier, and its existence was officially confirmed by Yandex. The distinctive feature of this filter is that not just one query sinks, but all the queries for which the affected web page was ranked (promoted) - high-, mid-, low- and even ultra-low-frequency ones alike. In my case there were many such pages, and total traffic dropped three- to four-fold.

Well, firstly, you will need to completely rewrite, or at least significantly correct, the text of the pages that have fallen under this type of Yandex sanction. Personally, I rewrote everything that I had added during the first three years of the blog's existence. Along the way, it is also advisable to remove excessive use of accent tags (b, strong, em) and keyword abuse in subheadings.

I also deleted the titles and alts from the images, because they contained a sea of keywords (this was my first step, even before rewriting the content, with which I hoped to quickly stop the problem; it turned out not to be enough, and I no longer had the strength to restore at least the alts, given the scale of the work). Along the way I highlighted links to useful resources, updated outdated information and removed the "water".

In addition, I want to point out the difference between the two filters: the first (the "footcloth") is imposed only for keyword overspam. The second is imposed not only for keyword overspam, but also for an attempt to slip the search engine a text that is not entirely relevant to the query it is optimized for. Such a text does not fully answer the user's query, or does not answer it at all, even though it is saturated with the necessary keywords.

Often, meaningless content containing "water" on an oceanic scale also falls under the re-optimization filter. An example of failing to answer a user's query: optimizing for a commercial query about some product or service without specifying prices. Or trying to rank for a query containing the word "reviews" without providing those reviews on the page. Such pages can fall under the Yandex filter precisely because you deceive the expectations of the user (your potential client).

The symptoms of the re-optimization filter usually amount to a sharp drop, by tens of positions at once, of all queries promoted on the affected web page. In very advanced cases, a similar subsidence can hit not just individual pages but the whole site (for all promoted queries), without any change of the relevant pages. That did not happen to me, because for a number of queries the positions in the Yandex results did not change after the filter was applied. But the hit was still considerable (the total flow of visitors to the blog decreased three-fold).

Summarizing all of the above, here are several general tips for getting out from under Yandex's text filters:

  1. We write to Platon Shchukin (Yandex support) to find out or clarify the reasons for the drop in the flow of visitors coming to your site from this search engine. Again, remember that nothing costs us so little, or is valued so highly, as courtesy.
  2. We clean the content of keyword overspam (the process may require a great deal of effort in the case of a large amount of content that has to be revised from a new angle).
  3. We put ourselves in the place of a user typing into the search bar the query we optimized the page for, and try to understand whether the page gives an exhaustive, comprehensive answer to it. In the course of this, it may be necessary to update outdated information, remove "water" and place emphasis (highlights) not on keywords but on the important points that the user's eye should catch on. This will help the page earn good behavioral factors and get out from under the filter for over-optimization or content spam. For the examples discussed just above: prices will need to be added to the pages (either link to them or display them in a pop-up window) if the user expects that from their query, or reviews added if the query mentions them. In general, do not deceive the expectations of your potential customers, and they (and Yandex along with them) will be satisfied with you. Everything is just like in real life.

I think that's enough for today (otherwise this would turn into a "footcloth" rather than an article). We will continue in the very near future - stay tuned.

Good luck to you! See you soon on the pages of this blog.


Yandex strives to answer a user's query by providing information and links to it on the results page. We proceed from our understanding of what users need and what information is valuable.

Therefore, following the guidelines below will help your site get indexed and ranked better, while violating them may lead to a decrease in its rankings or its exclusion from search.

Basic principles

    Think about users, not search engines. Would you create the site, page or element if search engines did not exist? Do users come to your website or online store from anywhere other than search engines?

    Be honest. Attracting users with queries your site cannot adequately answer does not mean retaining them. Think about what those users actually want to find.

Examples of principles

If this section does not describe some technique that helps to artificially influence a site's ranking in Yandex, this does not mean that we welcome it. Use common sense and follow the principles above.

We try not to index or rank high:

    Sites that mislead visitors: when a file (audio, video, torrent file, etc.) is downloaded, a third-party program is loaded instead; or a third-party program is hosted under the guise of a popular application.

    Sites that copy or rewrite information from other resources and do not create original content.

    Sites that copy or rewrite information from other resources with low-quality automatic translation of the content into another language, and do not create original content.

    Pages and sites whose sole purpose is to redirect the visitor to another resource, either automatically (a "redirect") or voluntarily.

    Automatically generated (meaningless) text.

    Sites with directories (of articles, programs, enterprises, etc.) that are merely content aggregators, do not create texts and descriptions themselves, and do not provide any unique service.

    Sites that serve different content to visitors and to search engine robots ("cloaking").

    Websites offering products or information through affiliate programs but of no value to the visitor.

Today we will touch on the site errors in Yandex.Webmaster, which it has lately been generously handing out to site administrators. The topic is vast and endless as a subject of conversation, because there are no meaningful explanations from, shall we say, the lawmakers of the Yandex search engine - which means you can speculate endlessly.

Below we will consider examples of the tips in Webmaster about fatal errors: when, as they put it...

...according to the Yandex algorithm, our site threatens the safety of users... or there is an almost direct accusation that the site administration engages in spam, that the resource (its content) is useless to people searching the Internet, and so on - all of it indicated very vaguely by messages in the Yandex toolkit.

The curtain opens:

Yandex.Webmaster - site errors


(During the week I received several letters in which my novice colleagues asked me to clarify their particular situation in Yandex.Webmaster.)

I must report, to my - let's say - greatest "joy", that the described problem is observed on one of the sites under my supervision. The site is quite new, its administration still considers itself a beginner, and therefore I sometimes help...

So, in order not to repeat the same thing over and over, I decided to write this short post of reflections...

So we go into the holy of holies - Webmaster - and see the following picture (by the way, this warning belongs to what Yandex calls fatal errors):

If you click on the exclamation mark, you are taken to a page with a more detailed description of the problems, and so on.

Meanwhile, here are some nice turns of phrase formulated by Webmaster "to help" the punished site owner:

The site may threaten the user's safety, or violations of the search engine rules have been found on it. This problem negatively affects the site's position in search results.

Site positions in search results are lowered

The site does not comply with the basic principles (hint link: //yandex.ru/support/webmaster/yandex-indexing/webmaster-advice.xml) by which our algorithms evaluate its quality: it contains useless content, excessive advertising, search spam, etc.

Usually, restrictions are lifted within a month after the violation is eliminated.

//yandex.ru/support/webmaster-troubleshooting/lowqualitysite.xml

And one more explanation to the site owner, which says that everything (that is, the webmaster correcting the errors and the robot re-indexing the site) takes a month, until the next press of the button.

The next opportunity to send a correction signal will appear in 29 days

What does this mean? If, say, we have corrected our mistakes - which, by the way, remain unclear to many owners - we have the opportunity to send a request to re-check the site for errors.

Within a month, the search engine promises to review the problem and remove the warnings, provided the errors have been eliminated.

Here is what I think about this (purely my personal opinion, of course):

The picture is as follows: there is a website. The site contains gross errors (which Yandex declares fatal - in plain Russian the concept simply means "kirdyk"; and everyone knows what the word "kirdyk" means: the end!). Naturally, the novice admin worries bitterly. And, most importantly, he does not know how to solve these error problems - and here I have to agree: such explanations from the search engine are extremely poor.

I don't know who wrote these explanations. In my opinion, there should at least be more or less clear indications of the errors. Moving on...

This is especially amusing: suppose the site was originally created by someone as a junk site - to sell links, spam and other garbage. The authors of such a site understand this perfectly well, and they frankly could not care less about all these warnings in Webmaster.

But when such affectionate reminders of mistakes come to a respectable beginner (who may not yet understand everything about web development and therefore makes some mistakes - that is exactly why he is a beginner), such warnings sound somehow inhuman.

The site owner worries... gets nervous... and some (there are such people) end up quitting blogging and websites altogether...

I would like to ask the sacramental question: who will the search engine be left with?... although my question is, of course, so-so...

Okay…

Now, for my part, let me intercede for the search engines, because they are only machines; such formulations, careless of a neighbor's psyche, are made by people:

Or is Yandex.Webmaster right to point out the errors?

There are billions of sites on the Internet... a colossal number!

Let's ask ourselves this question: how, tell me, are machines (search robots) supposed to grade our sites, to divide them into "good" and "bad"?

Right - a very difficult question. Besides, we are, in fact, still at the origins of the Internet industry's development; not that much time has passed since the great beginning...

Machines, for all their "memory", do not yet possess intelligence. And we beginners, I think, should understand this.

And, however regrettable it is for a beginner to accept, sometimes (I would even say often) the site's author himself is to blame for such sanctions and warnings being imposed.

If search engine labels like "spam" or "danger to users" can be attributed to the category of ancient sophistry, the uselessness of content is a real thing!

Spam and, to some extent, danger are visible to the eye and require no special knowledge to identify... A diligent admin can easily fix all of this.

...while many site owners refuse to understand that copying material to their site from other people's resources (partially or completely) is regarded by search engines as theft.

In addition, many SEO specialists claim that you can steal even from yourself, within your own site, and be penalized for it: that is why they close all category and tag pages from indexing, thereby eliminating duplicate content.

Personally, I am rather cool toward these beliefs about duplicating yourself within a domain, because it is not polite to consider machines dumber than myself. But I fully admit that search engine programs contain an algorithm for detecting "theft" and copy-paste, including the repetition of texts within one domain.

I dare to assume that it is precisely this copy-paste that the search engines find fault with (how to deal with duplicate pages is described below, with example code).

And I remind you: the algorithms of search engines are written by people!

I have seen many sites from the inside, and seen how an administrator tries to develop his resource by simply copying material from someone else's site... I do not want to say that such desires are always driven by malicious motives... no... more often it is naivety, a craving for a freebie.

Don't forget this.

You should always remember that a search robot does not have even a drop of intelligence, and will sooner attribute innocent copying to theft than... well, you understand...

As an epilogue:

You can print anything on your website once it has gained some weight on the network among its own kind... In the meantime, while our resource is young, we should adhere to certain rules of the Internet as a whole - not always clear and understandable, but rules all the same, whatever one may say:

1 - if you copy known material to your site, then:

a) give it maximum uniqueness, or

b) close the page with such content from indexing.

...and if you include in your text someone else's lengthy quotation (or even one from your own earlier published article), then to be safe, wrap it in citation tags (for example, blockquote).

Close pages with non-unique content from indexing

Closing pages from indexing is easy; I have said this many times...

/** close a page or post from indexing **/
function my_meta_catss_robots() {
    if ( is_category() || is_page('77') || is_single('77') ) {
        echo '<meta name="robots" content="noindex, follow" />' . "\n";
    }
}
add_action('wp_head', 'my_meta_catss_robots');

In the code, I gave examples of "closing categories" is_category, site pages is_page and records is_single - where you simply (in brackets) specify the post ID, page - I have 77 ... and so on ...

Thus, after registering the above code on your server, for example, in the functions.php file of the active template, in the site header (in the meta), if you open the source code Ctrl + U, a meta tag will be automatically generated under the code concept robots, namely, this is the thing:

… Here noindex means that the search engine does not index the content (which is what we need), and follow allows the robot to pass through the document: the robot will see the links and move on to useful open documents, so the page will not be useless, unlike when you close it completely with noindex, nofollow.

And when you close a page in the way described above, by all means remove the corresponding restrictions in robots.txt, if any were registered there!
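To make sure the tag was actually generated, you can scan the page source for it instead of eyeballing Ctrl + U. A minimal sketch in Python, standard library only; the helper name and the simplified regex (it expects the name attribute before content, as WordPress emits it) are my own assumptions:

```python
import re


def robots_directives(html: str) -> set:
    """Extract the directives from a <meta name="robots"> tag, if present.

    Returns a set of lowercase directives like {"noindex", "follow"},
    or an empty set when no robots meta tag is found.
    """
    match = re.search(
        r'<meta\s+name=["\']robots["\']\s+content=["\']([^"\']+)["\']',
        html,
        flags=re.IGNORECASE,
    )
    if not match:
        return set()
    return {part.strip().lower() for part in match.group(1).split(",")}


# Example: a page closed from indexing but left open for link crawling.
page = '<head><meta name="robots" content="noindex, follow" /></head>'
print(robots_directives(page) == {"noindex", "follow"})  # True
```

If the function returns an empty set for a page you intended to close, the hook did not fire; check that the ID in is_page/is_single matches the real post.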

In short: in my opinion, these warnings from Yandex Webmaster and the like are often nonsense! Although, of course, they are a little upsetting. With a little work everything will be great!

Which is what I wish you, with great pleasure!

here's another thing:

… I think you already know this without me, and yet I consider it my duty to remind you: sometimes situations arise that you don't even suspect!!

Everything seems to be fine, the content is unique ... but the site does not want to climb to decent attendance ... and suddenly, bam!! A warning from Yandex Webmaster.

I strongly recommend that, even if you have a brand-new domain, you check external links for several months (whether spammers' resources link to your domain: porn and other harmful neighbors in this regard)). All this can turn into unexpected surprises that will very, very negatively affect the promotion of the site. At the very least, test here: //www.linkpad.ru/

Don't be lazy: check the domain history before starting the project! It often happens that all sorts of scammers dump their used-up domains, and we, suspecting nothing, acquire them ...

Be attentive to your beloved site address (domain name), because it often happens that, for example, Yandex Webmaster accuses your resource of spamming. In such cases it is very hard to prove your innocence!!

Take a closer interest in the life of your site on the network)); it is easier not to bring the site/blog to a state of critical errors than to solve them later!

... And I just have to bow ... and wish you useful work on the network

Share your thoughts in the comments ...

At this, the curtain falls on the show ...
... sad dust settles on the footlights ...

Good day. For any webmaster, the most important goal is high positions for his site in the search results. But not every webmaster remembers that the search engine was created not only to raise his site to high positions and increase its traffic, but also so that ordinary users receive high-quality results for their queries. The latter purpose is the priority for search engines, so every webmaster must be able to balance between these two goals in order not to fall under search engine filters and sanctions. Today we will talk about the filters of the Yandex search engine.

Yandex's main filter is the so-called ban. A ban is applied to the entire site: all of its pages are excluded from the search. The most common reason for a ban is search engine spam, which is when a website page is oversaturated with keywords or phrases. Many do this deliberately, believing that the higher the keyword saturation of a page, the higher its position in the search results. This judgment has been mistaken for several years, and for this reason many pages have fallen under search engine filters. Some webmasters saturate their pages with search phrases unintentionally and can also fall under the sanctions of the system.

Most common reasons for search engine sanctions

  • Oversaturation of site pages with search keywords and phrases.

    As mentioned above, many do this on purpose, believing that the higher the keyword density, the higher the site's position, but this is not the case. Search engines fight search spam and punish such sites with a ban, that is, by excluding the site from their search. To avoid the Yandex filter, place keywords in your text sensibly; after writing the text, be sure to re-read it and delete unnecessary keywords to keep the text readable for users. And always remember that a keyword density of 5% is not a guarantee of getting the site into the TOP of the search results, but a guarantee of getting it under the search engine filter.

  • Publishing a list of search queries on the site pages.
    In the recent past, it was very fashionable to publish on a page a list of the key phrases for which visitors arrive at it from search engines. But this technique has not worked for several years, and it can negatively affect Yandex's attitude toward your site, or even get the site banned. Many popular CMS engines make it easy to track the key phrases that brought visitors from search engines and to place them on the page. Such a function must be turned off immediately, because it is a direct road to a ban.

  • Publishing a list of key phrases on the pages of your site.
    Again, it is a popular misconception that putting a keyword list on a page will improve the site's ranking. Some place such lists at the bottom of the page, or even make them match the background color of the page itself. But you can only hide such a list from users: search robots do not care what color it is, they still see it and evaluate it accordingly.

  • A large number of keywords in meta tags.
    Tags such as h1-h6, meta keywords, and description. Spamming these tags can lead to a ban.

  • If the site contains pages that hold no useful information for users or were created specifically to promote the site for certain queries, you can fall under the Yandex filter and get banned.
  • The presence of text hidden from users on the pages of the site, no matter how it is hidden.
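The 5% density figure mentioned above is easy to measure yourself before publishing. A rough sketch in Python; the tokenizer is deliberately naive (single words only, no stemming), and the sample text and numbers are invented for illustration:

```python
import re


def keyword_density(text: str, keyword: str) -> float:
    """Share of the keyword among all words, in percent (naive tokenizer)."""
    words = re.findall(r"[\w']+", text.lower())
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)


text = ("Yandex filters punish spam. Spam pages stuffed with spam "
        "keywords rarely survive a spam check.")
print(round(keyword_density(text, "spam"), 1))  # 26.7 -- far above 5%
```

A score like this on a real article is exactly the oversaturation the filter targets; rewrite until the density drops well below the 5% mark the article warns about.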

Ban signs

  1. Complete exclusion of the site from Yandex search. There are, however, cases when an entire site or some of its pages fall out of the index without any ban: there may simply have been hosting problems, so the robot could not index the site when it crawled. If this happens, wait; after the search engine updates its results, your site will return to the index. If it has not returned after the updates, write to the support service and ask why the site dropped out of the search. Some violations are not obvious to webmasters, and correspondence with support lets you find out about them and fix them.
  2. Partial dropout of the site from the index, with only 1 to 10-30 pages remaining in the search. This Yandex filter is also called the AGS. In this case, correspondence with the support service will bring the answer that the search engine algorithms have decided not to include your site in the Yandex index. To get out of this filter, you simply need to develop your site: in narrow circles AGS stands for "AntiGovnoSite" (anti-junk-site), so draw your own conclusions.

Other reasons for falling under the Yandex filter

  1. If the site consists of stolen content whose uniqueness is zero.
  2. The content is completely uninteresting to users. This point is rather vague; it is hard to say how the search algorithm decides that content is not interesting to users!

  3. Sites with content containing spelling errors, written illiterately, can fall under the Yandex filter. Don't forget, you are creating websites for users!
  4. If your site consists only of videos and there is no content at all, then such a site may not only fall under the filter, but also not be indexed by a search robot.
  5. If the site is created solely for the sale of advertising.

Bring the appearance of your site up to a human standard, and it will not fall under the filter.

How to get out of the AGS?

It's pretty difficult. To get out of the AGS, you need to double-check your site for non-unique content and replace any you find. Delete all duplicates and fill your site with new content, that is, start updating it regularly. You can change the domain, and revise the site structure and change it if necessary.

Types of Yandex filters

Pessimization is a type of filter that disables the site's link ranking. When a site falls under such a filter, all links leading to it, both purchased and natural, stop working, and the site's performance drops sharply. The signs of falling under this filter are a sharp drop in positions by several hundred places while all pages of the site remain in the index. A fluctuation of a few dozen positions does not mean your site has fallen under the filter.

The reason for Yandex sanctions may be the placement on site pages of so-called "link dumps": various link catalogs and any other links not intended for ordinary users, placed only to manipulate the search results. For example, if you are going to register your site in directories, do not post backlinks or banners of the directory where you register, because there is a great chance of falling under the Yandex filter. You can also fall under the filter for selling links, so approach such links wisely and place no more than 5 of them on one page. Pessimization can also affect sites full of advertising banners and pop-ups that prevent users from browsing the site; in this case the site's TIC (thematic citation index) is reset to zero.
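The "no more than 5 outbound links per page" advice is easy to check mechanically rather than by hand. A small sketch using Python's standard HTML parser; the domain names and sample markup are made up:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse


class OutboundLinkCounter(HTMLParser):
    """Count <a href> links pointing outside the given site domain."""

    def __init__(self, site_domain: str):
        super().__init__()
        self.site_domain = site_domain
        self.outbound = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        host = urlparse(href).netloc
        # Count only absolute links whose host differs from our own domain;
        # relative links like /about stay internal and are ignored.
        if host and host != self.site_domain:
            self.outbound += 1


def count_outbound(html: str, site_domain: str) -> int:
    parser = OutboundLinkCounter(site_domain)
    parser.feed(html)
    return parser.outbound


html = ('<a href="/about">about</a>'
        '<a href="https://catalog-1.example">cat 1</a>'
        '<a href="https://catalog-2.example">cat 2</a>')
print(count_outbound(html, "mysite.example"))  # 2
```

Run this over your page templates; a page where the count is well above the handful the article recommends is a candidate for cleanup.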
The “pessimization” filter has been replaced by the “You are the last” filter.

You are the last - this is a Yandex filter that excludes a site from the search results while the site remains indexed. The site can still be found in search, but only by unique phrases present on it. Even when you enter the site address as a query, a site under the filter may not appear in first place in the results. You can fall under the filter because of non-unique content posted to make money; the owner of that content, that is, the original source, can also fall under the filter. To get out of the filter, you need to replace the non-unique content.

Filter "Affiliates" is a Yandex filter applied to sites on the same subject owned by the same company. Yandex believes there should be only one company site in the top of the search results, not several at once. Either a single affiliate site or all of the sites belonging to one company can drop out of the results.
Today it is impossible to pin down the signs of falling under this filter, since its algorithm is not known.

The "You are spam" filter. It was never officially announced, but it has been observed and can be classified. This filter covers not the entire site but only the page or pages that, for certain queries, are overflowing with the keywords of those queries. That is, a page is checked against a specific query, and if its content contains many of the query's words, it can fall under the filter. The filter was created to fight SEO texts oversaturated with keywords, written solely to promote a page for certain queries and providing no benefit to users. To check whether a page is under the filter, enter the target query in a modified form; if the checked page appears in the SERP for the modified query, it is under the filter.
To get out of the "You are spam" filter, you need to rewrite the text by removing spam keywords from it, reduce the size of the content, and adapt the page for users, not for search robots.

Filter "nepot". It consists in zeroing the weight of links from the site. This happens when there are too many outbound links from the site.
Filter "adult" - this is a Yandex filter for sites with "adult" themes. A quite decent site whose content contains nothing pornographic or erotic can also fall under this filter if erotic advertising appears on its pages. Therefore, be careful with erotic advertisements.

Conclusions

Let us summarize the above. There are not so many Yandex filters, yet it is very easy to fall under them. As said many times in this article, write content for users and create sites for people (SDL, a "site for people"); Yandex representatives always talk about this. Do not use black-hat methods of website promotion or abuse selling links and advertising. Write unique content that is interesting primarily to users. By observing these rules and always thinking about users, you will keep your site out of Yandex sanctions.

How I solved the problem with the Yandex filter for low-value content, spam, and excess advertising. Removing such a Yandex sanction is a rather complicated and lengthy process. I gave some recommendations in my last article, describing general methods for detecting search engine sanctions and the initial actions a user must perform at the first stage of fighting this unpleasant consequence.

It's time to combine all the actions in a sequence and describe the whole process of getting out of the search engine sanction.


Useless content, spam, excess advertising - we perform a full site audit

Many believe that the listed Yandex filters must be fought head-on. In fact, it is quite the opposite: the problems can run deeper than meets the eye. To identify obvious problems in page structure, as well as gross errors in the content, it is important to perform a high-quality audit of all site content. This will reveal the most serious errors, which need to be corrected first.

The "Saitreport" service (site report: diagnostics and audit) helps here. Say your site is young and contains a small number of pages; then you can audit 25 pages for free. To scan and diagnose more pages, you purchase an additional option. I used this service several times, paying only 190 rubles per site audit. This is not a lot of money compared with the effects of a search engine filter.

We will not dwell on the extensive functionality of the service; the following points must be taken into account:

  • The content component of the site: unique and non-canonical pages.
  • A large number of same-type headings, highlighted words and phrases, and errors in the text and HTML.
  • A huge number of internal links on the site, as well as significant nesting. An example from my audit: number of links: 7,545; depth: 6 levels of accessibility from the starting page.
  • Significant page redirects: circular links.

This is the minimum you should pay attention to in order to carry out further work. We have figured out the site audit; if you have any questions, write in the comments. Let's continue.
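The depth figure from the audit ("6 levels of accessibility from the starting page") is simply the breadth-first distance over the internal link graph, which you can compute yourself once the links are collected. A sketch over an in-memory link map; the page names are invented:

```python
from collections import deque


def page_depths(links: dict, start: str) -> dict:
    """BFS over an internal link graph: page -> clicks from the start page."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths


# Hypothetical site: home links to two sections, one section to articles.
site = {
    "/": ["/blog", "/about"],
    "/blog": ["/blog/post-1", "/blog/post-2"],
    "/blog/post-1": ["/blog/post-2"],
}
depths = page_depths(site, "/")
print(max(depths.values()))  # 2  (the deepest page is two clicks from home)
```

If the maximum depth on a real site approaches the 6 levels from the audit above, flattening the structure with internal links from shallower pages is the usual fix.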

Excess advertising on the site - more than three ad units

This problem is simple enough to solve. I purchased a paid theme template with an ad-insertion feature, so the ad unit code is not embedded in the article text.

Yandex text filters: Spam, Prespam, Over-optimization - how to solve this problem

It is difficult to cope with this sanction; rather, it is very long and tedious. The audit will help you understand some of the reasons behind this filter (sanction).

Solving the text filter problem:

  1. Check all articles for text spam. The spam score should stay between 49% and 55%, no more. This indicator is quite favorable, and less is even better.
  2. Watch the density of specific words and phrases in the text. Use a keyword or query no more than 3-4 times, and do not repeat any single word more than 10 times.
  3. Avoid heavy emphasis in the text. Variety helps here; set some significant words in italics instead.

Based personal experience, I advise you to adhere to the following criteria and actions:

  • Use LSI in conjunction with SEO. Bring in additional search queries combined with the keywords. Try to compose a cloud of on-topic text surrounding the main key. There is no sense in shoving queries everywhere; use them only where appropriate.
  • Influence visitors with images; visualize content elements (icons, pictograms, etc.).
  • Simplify the text: write only to the point, revealing the main issue of the topic.
  • Optimal article criteria: at least 5,000 characters, spam score 48%-53%, uniqueness 92%-100%, "water" (filler) 12%-16%.
  • Check the main page for spam as well; it has priority in indexing. If there is a problem, reduce the number of articles displayed on the site's home page.
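The limits listed above (at least 5,000 characters, no single word repeated over 10 times, a key phrase used 3-4 times at most) can be bundled into a quick self-check before publishing. A sketch; the thresholds simply mirror this article's numbers, not anything Yandex publishes, and the draft text is invented:

```python
import re
from collections import Counter


def article_check(text: str, key_phrase: str,
                  min_chars: int = 5000,
                  max_word_repeats: int = 10,
                  max_phrase_uses: int = 4) -> dict:
    """Pass/fail for the length, word-repeat, and key-phrase limits."""
    words = re.findall(r"[\w']+", text.lower())
    counts = Counter(words)
    most_common = counts.most_common(1)[0][1] if counts else 0
    phrase_uses = text.lower().count(key_phrase.lower())
    return {
        "long_enough": len(text) >= min_chars,
        "no_word_overused": most_common <= max_word_repeats,
        "phrase_within_limit": phrase_uses <= max_phrase_uses,
    }


draft = "yandex filter " * 6  # a tiny, spammy draft for illustration
print(article_check(draft, "yandex filter"))
# {'long_enough': False, 'no_word_overused': True, 'phrase_within_limit': False}
```

Any False in the result points at which of the criteria above the draft violates; it is a coarse screen, not a substitute for the full spam-score services mentioned earlier.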

These are the basic actions; implement them and you will get a favorable result.

Filter for low-value content or non-unique content on the site

Sanctions for low-value content are quite common and affect many sites. Not many webmasters know how to deal with unhelpful articles and images. Everything is very simple:

  • The most correct way is to check all articles for uniqueness, then correct the texts with bad scores.
  • Fill in all alt attributes, meta keywords, and descriptions; this also applies to images.
  • Most important for this kind of sanction: set up 301 redirects for all image attachment pages, then correct how images are embedded in the text, linking to the address of the media file itself, or using no link at all, instead of the image attachment page.

I have covered the most important actions for solving the problem with the Yandex filter for low-value content, spam, and excess advertising. If you follow all these recommendations, the Yandex search engine will remove the sanctions from the site in about a month.

In my case, the filter was lifted after 1 month and 14 days. All the site's positions for queries recovered; many are in the top 5, 10, 20, or 50, and individual articles occupy the first 3 positions in the search results. The total number of queries the site ranks for is more than 2,000 in Google and more than 2,500 in Yandex. I will describe the additional steps I performed in the next article.
