Written by Fernando Maciá
It seems that Google knows that during the month of August companies let their guard down and run "at half throttle", and takes advantage of it to catch us off guard with new changes that deeply affect all of us who work as SEO consultants. Last summer it was the rollout of the first version of Panda, together with a change in the way unique visitors and sessions were counted in Google Analytics, which for a few days blurred the impact that the new algorithm update might have had on many Web sites.
On this occasion the alarm was raised by Bill Slawski, of SEO by the Sea, who in an article published last August 17 warned about the recent granting to Google of a new patent, called "Ranking Documents", that could have a huge impact on the traditional working methods of search engine optimization professionals.
Temporary transition ranking
In essence, the text of the patent describes a system whereby Google, upon suspecting that the relevance factors of a piece of content have been artificially manipulated to obtain a better ranking, could temporarily rank the suspicious content or page in a position different from the one it would actually have earned. The word "different" can, in this context, take on many meanings: lower, higher, random, changing, etc. This makes it practically impossible to establish correlation or causation between improvements in content relevance factors (whether on-page or off-page) and the achievement of a certain position in the results pages.
The text of Google's patent states:
Implementations consistent with the principles of the invention may rank documents based on a rank transition function. The ranking based on the rank transition function may be used to identify documents that are subjected to rank-modifying spamming. The rank transition may provide confusing indications of the impact on rank in response to rank-modifying spamming activities. Implementations consistent with the principles of the invention may also observe spammers’ reactions to rank changes to identify documents that are actively being manipulated.
Traditionally, the work of optimizing a Web site for search engines has involved work in different areas:
- Indexability analysis: detecting and resolving all obstacles that prevent a Web site from being found, crawled and fully and correctly indexed by a search engine.
- Relevance optimization: improving the wording of the text in prominent areas of the page (those we know carry more weight in the relevance calculation, such as titles, image alt text, headings, bold text, etc.), as well as achieving an optimal keyword density.
- Server performance optimization (WPO): improving response codes and content download times by optimizing code, reducing image file size, parallelizing downloads, properly managing cached content, etc.
- Popularity generation: through the gradual acquisition of links from thematically related, highly popular Web sites, among other techniques.
- Social signal generation: through the dissemination of content on social networks, mentions, favorable votes, etc.
By working in these areas, SEO professionals have tried to move a given piece of content from an initial position in the search engine results to a desired position – traditionally the highest possible position – for a given search.
According to Google’s patent, the position of content in the results may vary over time legitimately or illegitimately:
The rank of a document may change over time due, for example, to changes in the document itself, the links pointing to the document, or documents with links to the document (sometimes referred to as “linking documents”). These changes may be the result of legitimate modifications or rank-modifying spamming. The rank of the document before the changes may be referred to as the “old rank” and the rank of the document after the changes may be referred to as the “target rank.” The rank transition function may generate a “transition rank” that is interposed between the old rank and the target rank. The transition rank may cause a time-based delay response, a negative response, a random response, and/or an unexpected response to occur during the transition from the old rank to the target rank.
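The behavior the quote describes can be sketched as a simple time-based transition function. The patent gives no concrete formula, so every name, parameter and threshold below is an assumption, chosen only to illustrate the delayed, negative and random responses the text mentions:

```python
import random

def transition_rank(old_rank, target_rank, t, delay=30, mode="delay", seed=None):
    """Illustrative sketch of a time-based rank transition (not Google's actual code).

    old_rank / target_rank: positions before and after the changes.
    t: days elapsed since the change was detected.
    mode: which "confusing" response to simulate; the patent mentions
    delayed, negative, random and unexpected responses.
    """
    rng = random.Random(seed)
    if t >= delay:                       # transition period over:
        return target_rank               # settle on the target rank
    if mode == "delay":                  # hold the old rank for a while
        return old_rank
    if mode == "negative":               # move opposite to the expected gain
        return old_rank + abs(target_rank - old_rank)
    if mode == "random":                 # bounce around between the two
        lo, hi = sorted((old_rank, target_rank))
        return rng.randint(lo, hi + 5)
    return target_rank

# A page optimized from position 32 toward position 7:
print(transition_rank(32, 7, t=10, mode="delay"))     # still 32
print(transition_rank(32, 7, t=10, mode="negative"))  # worse: 57
print(transition_rank(32, 7, t=45, mode="delay"))     # finally 7
```

The point of the sketch is the observer's side: during the transition window the measured position tells the optimizer nothing reliable about whether the changes worked.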
Google’s new patent states that, if “suspicious spamming activity” is detected in the optimization of a page’s relevance factors, it could apply a “temporary transition ranking” to that content.
What impact can I expect from Google’s new patent?
In practice, this means that, once optimized, a given piece of content could:
- Rank worse for the desired search.
- Change its ranking at random.
- Delay its potential ranking improvement for an indeterminate time.
- Rank in a totally unexpected way, for better or worse.
In any case, this “temporary transition ranking” would remain in effect for a variable period of time. What this means in practice is that we could no longer establish, as directly as before, correlation and causation between the improvements made to the content (“I do this, this and this”) and the improvements obtained in the results (“and I climb X positions”), with all the consequences that implies.
Traditionally, SEO experience has shown us that there is a certain time lag between implementing improvements (architecture, indexability, relevance, popularity, server performance, etc.) and seeing better results; that a well-executed search engine positioning campaign produces across-the-board ranking improvements for a Web site on multiple searches (not only the targeted ones); and that the best positions are reached gradually. This is what we usually reflected in campaign monitoring reports.
Now, this scenario could change radically. Between implementing the improvements and finally achieving a better ranking we might have to face, if we raise the search engine’s suspicions, a totally unexpected transition ranking that is, more seriously, difficult to explain and justify to the client who pays our fees: it will no longer be so easy to demonstrate that we did A, B and C and thereby moved from position 32 to position 7, for example.
In addition, the “reverse engineering” experiments that have been used to try to identify how Google’s algorithm works may have their days numbered, as it will be more difficult to establish the relationship between cause and effect.
Is there anything we can do to protect our Web site and further improve its search engine ranking?
According to the text of the patent, the objective is to prevent spammers from artificially altering the results:
By artificially inflating the rankings of certain (low quality or unrelated) documents, rank-modifying spamming degrades the quality of the search results. Systems and methods consistent with the principles of the invention may provide a rank transition function (e.g., time-based) to identify rank-modifying spammers. The rank transition function provides confusing indications of the impact on rank in response to rank-modifying spamming activities. The systems and methods may also observe spammers’ reactions to rank changes caused by the rank transition function to identify documents that are actively being manipulated. This assists in the identification of rank-modifying spammers.
Recommendations
Scrupulously observe the Google Webmaster Guidelines
A first concern, then, should be to delimit the signs or symptoms Google might use to detect that someone is trying to artificially manipulate the relevance of a piece of content, and therefore to apply this “temporary transition ranking” after its optimization. The factors that may look suspicious are well known and are listed in the famous Google Webmaster Guidelines. They include:
- Keyword abuse or excessive keyword density in the page content (keyword stuffing).
- Invisible or minimum-size text.
- Page redirects and cloaking techniques (showing the search engine content different from what is shown to the user).
- Keyword abuse in prominent areas (title, alt text, description, meta keywords, title attributes on links or images, headings, link anchor text, etc.).
- Popularity manipulation, with techniques such as link buying, inclusion of links in link farms, etc.
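To make the first point concrete: keyword density is simply the keyword's share of the total word count. This minimal sketch is an assumption of how such a check might look; Google publishes no threshold, and the function name and tokenization here are purely illustrative:

```python
import re

def keyword_density(text, keyword):
    """Return the keyword's share of the total word count (0.0 to 1.0)."""
    words = re.findall(r"\w+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

stuffed = "cheap flights to cheap hotels with cheap deals on cheap trips"
print(f"{keyword_density(stuffed, 'cheap'):.0%}")  # 4 of 11 words -> 36%
```

A density that high in body copy is exactly the kind of on-page signal the guidelines describe as keyword stuffing.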
These are the old, well-known Google guidelines, which keep being refined, so it is important to keep them constantly in mind, now more than ever.
Take more care than ever of your popularity profile and its evolution
Better a few links on quality sites than thousands of links on directories and blogs with no thematic connection to your site. Be careful because the rules of linkbuilding have changed radically. Improve your public relations and this will improve your popularity. Buying links has become a high-risk activity.
Optimize from the ground up or take advantage of major Web site redesigns
Don’t leave search engine optimization for later. If you are developing a Web site, optimize the information architecture, the page structure and the wording of the texts in the prominent areas from the start, before publishing.
If you must optimize an already published Web site, do not “touch up” only the texts of the prominent areas. Take advantage of major Web redesigns to force Google into a global recalculation of content relevance. Constantly modifying titles, alt text, links, etc. in the content can very easily be flagged as “suspicious activity” by Google.
Above all, be patient
If you optimize a Web site and do not get the expected results, do not despair and start “tinkering” with texts, links or architecture. Perhaps Google has applied a transition ranking to you, and the effect of the optimization may take some time to show. Keep in mind that if you respond by modifying the content relevance factors yet again, you will make it easier for Google to identify you as a spammer, and it may indefinitely prolong the transition ranking or penalize you more severely.
More references
Article by Peter da Vanzo on SEObook.
Article by Barry Adams on the difficult relationship between Google and SEOs.