Written by Anastasia Kurmakaeva
The SEO landscape changes at a relentless pace. As professionals in the sector, we must always keep up with the constant changes in Google’s algorithms, as well as the news, trends and discoveries that other experts in digital marketing generously share.
To this day we still come across websites that rely on on-page and off-page SEO practices that have become completely obsolete, some only recently, others irrelevant for years now. In this article we compile the most important ones, so that our readers do not waste effort on optimizations that, at best, bring no benefit to their projects and, at worst, can actively harm them.
1. Use of meta keywords
We start with the meta keywords element, which by now belongs to the stone age of SEO. And yet we still find it in the code of plenty of websites. Why such attachment to this technique? Perhaps a lack of knowledge, or simply never getting around to removing it. Either way, let us be clear: we are doing ourselves no favor by keeping meta keywords on our site. The reasons are listed below, along with a quick way to check whether the tag is still lurking in our code.
- Google has not taken them into account in its calculation of the relevance of a website for years.
- We reveal the objectives of our SEO strategy to the competition.
- We have elements in our code that serve absolutely no purpose other than to take up space.
- And furthermore, if our website has a very high number of meta keywords in its code, it could even hurt our positioning, because we would be engaging in keyword stuffing, which we will look at later in the article.
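If we want to confirm whether the tag is still lurking in our templates, a check like the one below is enough. It is a minimal sketch using Python’s standard library; the sample markup is hypothetical, and in practice we would feed it the HTML of our own pages.

```python
# A minimal check for a leftover <meta name="keywords"> tag, using only the
# standard library. The sample markup below is hypothetical.
from html.parser import HTMLParser


class MetaKeywordsFinder(HTMLParser):
    """Collects the content of every <meta name="keywords"> tag it encounters."""

    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "keywords":
            self.keywords.append(attrs.get("content") or "")


sample_html = """
<html><head>
  <title>Example page</title>
  <meta name="keywords" content="seo, seo agency, cheap seo, best seo, seo alicante">
</head><body>...</body></html>
"""

finder = MetaKeywordsFinder()
finder.feed(sample_html)
for content in finder.keywords:
    print("Obsolete meta keywords tag found:", content)
```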
2. Anchor text of links over-optimized with keywords
This is a technique penalized by Google. It mainly affects inbound links or backlinks, although overloading the anchor texts of internal links with keywords will not bring positive results either. Good internal linking is of course necessary, but repeating the same keywords in the anchor text of a large number of links is read as an attempt at manipulation whose sole purpose is to rank well, and therefore as an unnatural practice, whether in inbound or internal links. Nor is it correct to use the same anchor text over and over again for the same link across a large number of pages.
We should always avoid this type of optimization on our site. Remember that the goal Google wants you to keep in mind with your content is, first and foremost, to provide value to users. Otherwise, it will consider that you do not deserve to appear among the top results.
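To get a feel for how concentrated our anchor texts are, a quick tally like the one sketched below can help. The list of internal links is invented (in a real audit it would come from a crawl of the site), and the 30% threshold is an arbitrary illustration, not an official limit.

```python
# Rough sketch: how concentrated are our internal anchor texts?
# The (anchor text, target URL) pairs are invented sample data.
from collections import Counter

internal_links = [
    ("digital marketing agency", "/services"),
    ("digital marketing agency", "/"),
    ("digital marketing agency", "/blog/some-post"),
    ("read our case studies", "/cases"),
    ("contact us", "/contact"),
]

anchor_counts = Counter(anchor.lower() for anchor, _ in internal_links)
total = sum(anchor_counts.values())

for anchor, count in anchor_counts.most_common():
    share = count / total
    flag = "  <-- looks over-optimized" if share > 0.30 else ""
    print(f"{anchor!r}: {count} links ({share:.0%}){flag}")
```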
3. Number of backlinks > quality of backlinks
This is a big mistake, and if we still think this way, we need to change our mindset as soon as possible.
Getting a higher number of inbound links does not, by itself, make our website more popular. Worse still, chasing volume is a technique heavily penalized by Google, with consequences that can be very negative for a website’s positioning.
While it is true that many years ago this SEO practice worked very well and was widespread on the web, in 2005 the Google Jagger update started to devalue low quality links, such as those coming from link farms and/or purchased links. In 2012 Google’s Penguin algorithm put a sharp end to it, given the excessive amount of spam links that were being generated in order to achieve better positions in the results, through directories, comments in forums, blog posts, reciprocal links…
When it comes to earning inbound links nowadays, what matters is that they are obtained as naturally as possible, are of good quality, grow at a reasonable rate and come from trustworthy sources related to our own content.
And if we believe we have links that could be harmful, it is advisable to deal with them as soon as possible using Google’s Disavow Links tool, or by contacting the owners of the linking sites directly to request their removal or correction.
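As a reminder, the disavow tool takes a plain-text file with one URL or domain: entry per line and # for comment lines. The sketch below assembles such a file; the domains and URLs are hypothetical placeholders, and any real list should be reviewed carefully before uploading it in Search Console.

```python
# Sketch of a disavow file in the plain-text format the tool accepts:
# one URL or "domain:" entry per line, "#" for comment lines.
# The domains and URLs below are hypothetical placeholders.
harmful_domains = ["spammy-link-farm.example", "paid-links.example"]
harmful_urls = ["http://blog.example/comment-spam-page.html"]

lines = ["# Links reviewed and considered harmful"]
lines += [f"domain:{d}" for d in harmful_domains]
lines += harmful_urls

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")

print("\n".join(lines))
```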
4. Keyword stuffing
This practice was discredited with the arrival of the Google Panda algorithm update in 2011, and rightly so. It consisted of using the same term (or a very similar one) over and over again in the content of a page to achieve a high keyword density in the text, whether it made sense or not. For example, something along the lines of: “Our digital marketing agency in Alicante is the digital marketing agency that offers the best digital marketing services, because no digital marketing agency does digital marketing like our digital marketing agency.”
Absurd, isn’t it?
Nowadays, Google favors content that reads naturally and fluidly, so we should pay more attention to correct, well-structured and coherent writing that invites reading, relegating keyword use to a secondary role.
In addition to using keywords more sparingly, it is advisable to take advantage of synonyms, polysemy, abbreviations and so on, in line with the trend of semantic SEO.
Keyword stuffing is also frowned upon in the title and in the meta description, although the latter has not been used in calculating a site’s relevance for quite some time.
In general, this practice does not project a good image and does not encourage users to click, which is our ultimate goal, so natural wording in every kind of text on our website is essential.
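For reference, keyword density is simply the share of a text taken up by occurrences of the target term. Below is a rough sketch of the calculation, applied to a stuffed sentence like the one above; the 3% warning level is purely illustrative, not a figure published by Google.

```python
# Rough calculation of keyword density: the share of the text taken up by
# occurrences of the target keyword. The 3% warning level is illustrative only.
import re


def keyword_density(text: str, keyword: str) -> float:
    words = re.findall(r"\w+", text.lower())
    kw = keyword.lower().split()
    n = len(kw)
    if not words or n == 0:
        return 0.0
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw)
    return hits * n / len(words)


stuffed = (
    "Our digital marketing agency in Alicante is the digital marketing agency "
    "that offers the best digital marketing services."
)
density = keyword_density(stuffed, "digital marketing")
print(f"Keyword density: {density:.1%}")
if density > 0.03:
    print("This reads like keyword stuffing; rewrite it more naturally.")
```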
5. Exact match domains (EMD)
This refers to domain names that exactly match a search term for which a website is to be optimized. For example: agenciademarketingonline.es, posicionamientoseoenalicante.com, desarrollo-web-alicante.net, etc.
This practice is now obsolete for developing a business. It was typically used by very low quality portals with poor content which, by the mere fact of containing those keywords in their address, could achieve good positions. Naturally, Google put a stop to it, giving preference to other SEO factors when calculating relevance, and has even penalized EMDs that offer no value to users, as we see in this article.
This type of domain is not usually positive for a business’s marketing strategy, for several reasons:
- If we want to develop as a brand, these domains do not inspire confidence in any kind of audience, whether our own target, sector media or others. They also seriously hurt our chances of being referenced on other websites, precisely because no association is created between a brand and a type of business, since there is no brand as such; in a way this makes us lose credibility and gives us a dubious reputation.
- These domains are difficult to internationalize, so we should not use them if we intend to expand beyond our borders at some point.
Although it is true that they can bring certain SEO benefits, these are very limited and in the long run are likely to do more harm than good, so their use is not recommended if we want to develop as a brand.
If we are not interested in developing our branding, they are still totally valid, as long as they are complemented with quality content of value to the user.
6. One page = one keyword
Another mistaken approach is to create and optimize a separate page for every keyword and keyword variant on a website. This is fine for two genuinely different terms (even if they share similarities), as long as each has enough substance of its own; but if, for example, we create pages for each of the following keywords:
- Digital marketing agency
- Online marketing agency
- Online marketing consultant
- Marketing consultant
Or terms with a geographic component:
- Refurbishments in Alicante
- Refurbishments in Orihuela
- Refurbishments in Elche
- Etc.
It doesn’t make much sense. It is best to avoid this technique at all costs, in favor of creating a single page per concept, with rich language, synonyms, and informative, quality content.
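One way to think about this consolidation is to group keyword variants that clearly point to the same concept and dedicate a single page to each group. The sketch below does this with a deliberately naive token-overlap measure; the keyword list and the 0.5 similarity threshold are illustrative assumptions, not a recommended methodology.

```python
# Naive sketch: group keyword variants that refer to the same concept, so that
# each group maps to a single page. Token overlap (Jaccard similarity) and the
# 0.5 threshold are illustrative assumptions.
def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b)


keywords = [
    "digital marketing agency",
    "online marketing agency",
    "online marketing consultant",
    "marketing consultant",
]

groups = []  # each group is a list of keyword variants for one page
for kw in keywords:
    tokens = set(kw.lower().split())
    for group in groups:
        if jaccard(tokens, set(group[0].lower().split())) >= 0.5:
            group.append(kw)
            break
    else:
        groups.append([kw])

for i, group in enumerate(groups, start=1):
    print(f"Page {i}: {group}")
```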
Obsessing over rankings [bonus]
We’ve all been there, and I’m sure we still sometimes give it more importance than it deserves. Keeping tight control and checking our rankings obsessively and repeatedly to see whether this or that keyword has risen or fallen from one day to the next, or even from one hour to the next, is counterproductive, as well as pointless, for the reasons described below:
- To begin with, we are investing time that we could be spending on other tasks to improve our website, or on monitoring and analyzing metrics that can give us valuable information about our online strategy.
- Google’s search results tend to fluctuate for certain terms, and this is normal. It does not mean we should panic the moment we drop from position 1 to 3, or a little further down. If we have not changed anything on our site, it is most likely a one-off movement and we will be back at the top in no time. The important thing is not to draw conclusions from one day to the next, but to look at the longer-term trend. If something smells fishy, investigate: for example, a very pronounced decline that is sustained over time, or competitors who have implemented changes that are working very well.
- The results we see are not the same as the results other people see. Search results are customized to fit different user profiles based on many factors, such as geographic location, device in use, personal settings and other personal information. For this reason, there is no need to be alarmed if we detect any change, as it does not necessarily affect everyone in the same way.
Although it makes sense to monitor our positioning with tools such as SEMrush or Sistrix (to name a couple of examples) to keep track of how our overall strategy is working, spending hours and hours on these checks only wastes time and generates unnecessary stress. As a result, we are likely to make impulsive decisions that can easily harm the site’s positioning.
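If we do track positions, it is the trend that matters, not the daily wobble. A small sketch with made-up data illustrates the idea: smooth the daily positions with a 7-day average and only worry when that average drifts toward worse positions for a sustained period.

```python
# Made-up daily positions for one keyword (most recent last). Instead of reacting
# to each daily jump, smooth them with a 7-day average and only investigate when
# that average drifts toward worse positions over time.
from statistics import mean

daily_positions = [1, 1, 3, 2, 1, 4, 1, 2, 1, 3, 1, 1, 2, 1]

window = 7
rolling = [
    round(mean(daily_positions[i:i + window]), 1)
    for i in range(len(daily_positions) - window + 1)
]

print("Daily positions:", daily_positions)
print("7-day averages: ", rolling)
```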