None of us have a crystal ball when it comes to SEO, but certain things aren’t so much a matter of if, but when.
Let’s start with two things we know:
It’s in Google’s best interests if users find their search engine pleasant and useful.
Computer processing speed is expected to keep doubling roughly every three years after 2013, and to continue doing so for decades.
Putting those two things together, it stands to reason that what works today in link building won’t necessarily work going forward, especially when what we do today works against Google’s best interests.
With that in mind, here are three ways I expect link building to get more difficult in the future.
1. User data could take precedence
Google has a tremendous number of ways to measure user behavior, and as any conversion rate optimizer or user tester knows, user behavior tells you far more about what users actually like than what they're willing to tell you overtly.
Here are some of the ways that Google is already accumulating user data:
Google admits that it uses Google toolbar data to influence rankings.
Google has unfettered access to Google+ activity.
Google has gone so far as to install cookies on Safari browsers to track user data.
Google has even collected data from wi-fi hubs using Street View cars in the past.
While most of these privacy violations are being used for the obvious financial reason of creating targeted advertisements, it is extremely likely that Google has and will use this kind of information to adjust search results.
One of the more interesting conclusions from the recent SearchMetrics “ranking factors” study was the existence of a “brand factor.” According to the correlative data, the top three positions in many search results are “reserved for brands”: those spots tended to feature pages with weaker on-page ranking signals.
While off-page signals like links and social sharing seem to play a major part in this, there is reason to suspect that user data will start to play a larger role in the coming years.
We can expect Google to start optimizing its SERPs much the way a conversion rate optimizer would: split testing them for the most favorable user behavior. It has already done exactly that on YouTube, optimizing the suggested videos for time on site.
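To make the split-testing idea concrete, here is a toy simulation in the style of a conversion test: serve two result orderings, count the clicks each collects, and keep the ordering users respond to. Everything below (the function, the click-through numbers) is hypothetical and purely for illustration; it is not a description of Google's actual methods.

```python
import random

def split_test_serp(true_ctr_a, true_ctr_b, impressions=10000, seed=42):
    """Simulate an A/B test between two SERP orderings.

    The 'true' click-through rates are unknown to the tester; each
    variant is served to half the impressions, and the one with the
    higher observed CTR wins.
    """
    rng = random.Random(seed)
    half = impressions // 2
    clicks_a = sum(rng.random() < true_ctr_a for _ in range(half))
    clicks_b = sum(rng.random() < true_ctr_b for _ in range(half))
    winner = "A" if clicks_a >= clicks_b else "B"
    return winner, clicks_a / half, clicks_b / half

winner, observed_a, observed_b = split_test_serp(0.30, 0.35)
```

With enough impressions, even a modest difference in how users respond reliably separates the two orderings, which is exactly why user data is such an attractive signal.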
I’m not arguing that user data will replace links. (Ranking factors don’t “replace” one another. That’s not how the algorithm works.) I am arguing, however, that it may become impossible to rank for certain keywords without positive user data, and that the user data of pages that link to you could influence how powerful those links are.
In short, if your site has no positive user data, and none of your links do, don’t expect your rankings to last.
What’s the solution to this problem? Optimize for user interface and user experience before you ever optimize for search engines. Here’s how to do it.
This will help you in so many ways beyond the SEO value. You want user data on your side.
2. Links you built could be devalued
Google’s major breakthrough was to use external backlinks as a ranking factor. This gave the company such an advantage because the other search engines relied on on-page signals that webmasters could easily manipulate. External links were very difficult to manipulate at the time, so Google’s search results were far superior to anything that came before.
In that tradition, Google is most interested in ranking factors that you don’t have any control over. If you have control over your link profile, that makes it less valuable. If you built your entire link profile, Google may consider ignoring it entirely.
I’m not saying that Google is going to penalize you for links that you built yourself (except in extreme cases, such as private blog networks). I’m just saying that Google has no reason to pay much attention to links that you can easily manipulate.
Google, the company, doesn’t care about your PageRank. It only cares about what your link profile says about your influence and relevance on the web. If a link indicates that you are influential and relevant to a particular search term, then Google wants that link to help you rank for that search term. If it doesn’t, Google doesn’t really care.
To use guest posts as an example, a guest post link from a top industry blog does say something about your influence, even though you built the link yourself. Guest posts on article directories do not.
Guest posts leave patterns. Most of them have resource boxes. Most of them exist on sites that have published other guest posts. Most of them use the word “guest post” just below the title. There’s a lot of data to cross reference here. No matter how high quality your guest post is, if it exists on a site that links to sites with crappy user data and terrible links, that link has limited lifetime value.
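A crude sketch of the kind of cross-referencing described above might look like the following. Every field name here is a hypothetical feature invented for illustration; Google's real signals are unknown.

```python
def guest_post_footprint(page):
    """Count common guest-post footprints on a crawled page.

    `page` is a plain dict; all keys below are hypothetical
    feature names, not real Google signals.
    """
    score = 0
    if page.get("has_resource_box"):
        score += 1  # author bio/resource box at the bottom of the post
    if "guest post" in page.get("text_below_title", "").lower():
        score += 1  # the phrase "guest post" just below the title
    if page.get("site_guest_post_count", 0) > 10:
        score += 1  # the host site has published many other guest posts
    return score

score = guest_post_footprint({
    "has_resource_box": True,
    "text_below_title": "This is a guest post by Jane Doe.",
    "site_guest_post_count": 42,
})
```

The point is that none of these footprints is conclusive on its own, but a page matching all of them at once is easy to flag at scale.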
The only truly safe way to approach link building is to ask yourself about the non-SEO value of the link. Does it help improve your image, send referral traffic, help with conversions or make money outside of SEO? If not, it’s probably not worth investing your time.
3. Links will face machine learning
Most of you probably already know that Panda almost certainly had something to do with machine learning. Here are a few things you may not know:
Google has developed a machine learning algorithm capable of identifying human faces and cats with zero guidance from the programmers.
Google sells a “prediction API” service that performs cloud-based machine learning for its customers.
Google’s research portal lists more than 190 publications on the subject of machine learning.
Google will soon release a wave of apps powered by machine learning.
Machine learning algorithms mimic processes like those in the human brain, or in biological evolution, in order to solve problems that the programmers themselves have difficulty solving. These algorithms are so effective that they were able to rediscover Newton’s laws of motion in just a few minutes – something that took humans tens of thousands of years.
Don’t get me wrong. Machine learning algorithms are unpredictable. They tend to come up with a different solution every time the simulation is run, and sometimes they’ll keep running forever, never finding a solution. But as computer speeds increase, machine learning effectiveness will as well.
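To make “mimicking evolution” concrete, here is a toy genetic algorithm. It evolves random bitstrings toward all ones (the classic OneMax problem) using only selection and mutation; the solution emerges without the programmer ever spelling out the steps. This is a textbook illustration, not anything Google actually runs.

```python
import random

def evolve_onemax(length=20, population=30, generations=200, seed=0):
    """Minimal genetic algorithm: evolve bitstrings toward all ones.

    Each generation keeps the fittest half of the population
    (selection) and adds one mutated child per survivor (mutation).
    """
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)]
           for _ in range(population)]
    for _ in range(generations):
        pop.sort(key=sum, reverse=True)     # selection: fittest first
        parents = pop[: population // 2]
        children = []
        for parent in parents:
            child = parent[:]
            child[rng.randrange(length)] ^= 1   # mutation: flip one bit
            children.append(child)
        pop = parents + children
    return max(pop, key=sum)

best = evolve_onemax()
```

Note that the result depends on the random seed and the run length: the search usually finds a very good answer, but it follows no fixed, inspectable path – which is exactly the unpredictability described above.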
What will SEO look like when the search engine’s capabilities start resembling those of human quality raters? As time progresses, SEOs will need to act more and more like they are being watched by manual reviewers, as though their link profiles were being tallied up by human beings.
I’m not claiming that the algorithm is going to become “self-aware” or anything. I’m simply stating a much more mundane fact: it’s nearly impossible to reverse-engineer an algorithm produced by machine learning. Most of the algorithms that come out of these simulations can’t even be described in human language, or boiled down to anything less than a long and complicated set of steps.
Anybody who’s dealt with a penalty from Panda knows this all too well. There’s no simple tweak you can use to recover from low quality content. You can only delete it, noindex it, or rewrite it.
Discovering loopholes in link profile manipulation is going to be close to impossible.
Link building will change, but survive
Link building will get more difficult for spammers and entry-level SEOs as the years go by, but it will never completely lose its value. A well-executed link building campaign will always be defensible as pure marketing. Marketers who focus on link building campaigns that prove beneficial outside of SEO will continue to see sustained growth, and build businesses that will last for decades.