A lot has changed over the past ten years in the search ecosystem. It almost feels like we are trapped in a Google box of our own making…doesn’t it?
When Google came on the scene, their “don’t be evil” motto sounded so good and so pure that we all bought it and made Google what it is today.
Their business model was so simple – drive traffic to other people’s sites based on what the user typed into the search box. They did it better than anyone else at the time, and because of that they quickly gained huge market share and dominated the search business.
It was awesome! We were made to believe that all we had to do was create high quality content and Google would reward us for it with great search exposure. The funny thing is that it actually worked exactly that way for the first few years.
Google was making a ton of money from AdWords, and content creators were enjoying the free traffic Google sent our way. It was a great relationship…Google actually cared back then about small publishers and business owners.
But all good things come to an end 😉
Google slowly started to change, and their “don’t be evil” motto no longer carried the weight it used to back in the day. They realized that they could produce their own properties and actually compete with us in search.
Eliminate The Future Competition (Panda Update)
In 2010, after Google obtained a patent that would allow them to identify “inadequate content” and enter all sorts of new markets, they needed a way to get rid of the existing big publishers, also known as content farms.
At first people thought it was a great update, because now we didn’t have to compete against sites like EzineArticles, HubPages, Buzzle, eHow, etc. You can check out this post where I showed how some of those sites were impacted by Panda.
Somehow eHow survived, though. I’m not sure what kind of deal was made there, but they continue to dominate search results with their low-quality articles. In fact, their traffic has been growing steadily for the past two years.
Anyway, Google made us believe (again) that they wanted to reward high-quality content and that the Panda update was going to help us do that.
What we didn’t know was that Google was just getting ready to enter into many of these markets themselves using other people’s content. In other words, they would scrape the web to create their own properties.
It’s smart if you ask me – I mean, why not send people from OUR search engine to OUR content and make a lot more money?
Well, it sounds good for Google, but I think it’s a short-term play on their part, because if we can’t use Google to find other sites, what’s the point of using them at all?
Let’s Weed Out The Smaller Players
There was another obstacle for Google though – the small publishers writing quality content in smaller niche markets. How do we get rid of them?
Here is where the unnatural link penalties come into play 🙂
I mean, it’s brilliant: penalize smaller sites for things they really have no control over. In March 2012 Google wiped out thousands of smaller blogs and little stores that tried to compete in search against bigger, already-established brands.
If you were 100% dependent on Google traffic, you got screwed. If you had staff, you now had to let them go and hire link removers instead, or try to do the cleanup yourself.
I removed thousands of links since then and also spent quite a bit of money on speeding up the process.
A lot of publishers couldn’t afford to waste time removing links that Google saw as unnatural. Most of them didn’t even know what that meant.
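For what it’s worth, Google’s Disavow Links tool (launched in late 2012) takes a plain-text file listing the links you want ignored – a small mercy for those who never got a response to their removal requests. A minimal sketch, with made-up domains and URLs:

```text
# Lines starting with # are comments.
# Links we asked site owners to remove, with no response.

# Disavow a single page that links to us
http://spammy-directory.example.com/our-listing.html

# Disavow every link from an entire domain
domain:link-farm.example.net
domain:cheap-seo-links.example.org
```

You upload the file through Google Webmaster Tools; it doesn’t remove the links, it just asks Google to ignore them when assessing your site.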
We had been told for years that we needed to get other sites to link to us, and that inbound links couldn’t harm our websites, so we were happy to get any kind of link that might give us better exposure.
Google stated that it was for the better, because it was going to help weed spammers out of the SERPs, but I honestly don’t think that worked. Spammers are back doing what they do best and ranking well, while those that actually had real sites are still trying to save them without much success.
Spammers are actually laughing at us all. They can switch domain names quickly and get new ones without risking anything. We, on the other hand, can only try to fight the penalty or shut our websites down for good.
Give Big Brands A FREE Pass…For Now (Even if they violate our guidelines)
Eric Schmidt once said:
Brands are the solution, not the problem…Brands are how you sort out the cesspool.
According to Schmidt you need strong brand signals (whatever that means) in order for Google to trust your content.
I think this type of thinking is flawed, because it gives too much power to popular sites with big budgets. For example, they can easily test out markets with scraped content without providing much value, pushing smaller publishers down in the search results. It’s already happening on a massive scale.
Matt Cutts said once (in one of his videos) that topicality matters. He specifically pointed out that big news sites shouldn’t rank for some keywords simply because they published a random article with those phrases.
That might have been true two years ago, but now I constantly see big sites like FoxNews, CNN and others ranking for keywords they shouldn’t, simply because they posted some generic article targeting those phrases. Big brands know that Google is giving them a green light to fill the search results right now, so at this point they write about whatever will maximize their advertising revenue.
For example, if you type “make money online” into the Google search box, you will get links to FoxNews, ABCNews, About.com, Forbes, and YouTube. Other sites and blogs that are specifically about this topic are pushed down. Maybe this is a bad example, because this niche is full of crappy and scammy sites, but it is what it is.
Another interesting thing is how big brands get away with breaking the rules. Many of them have been caught violating Google’s webmaster guidelines, but none of them suffered any real consequences.
Usually they just get a slap on the wrist and are back in the index quickly. They also get better help identifying what they are doing wrong. For example, Forbes.com was once caught selling links, and when they posted in the Google Webmaster Forum about their penalty, they got a quick response from Matt Cutts pointing out exactly what was wrong.
I understand that Matt can’t provide that kind of help to everyone over there, but Google could be more specific about what exactly is hurting your site or mine, don’t you think?
Let’s Go Social (Google +)
No doubt Google+ is trying to compete with Facebook, but in my opinion they are going a bit too far.
It feels like we are being forced to use it, or our sites could suffer. Maybe it doesn’t impact search rankings in a big way yet, but it is definitely heading in that direction.
Recently, Eric Schmidt said the following:
Within search results, information tied to verified online profiles will be ranked higher than content without such verification, which will result in most users naturally clicking on the top (verified) results. The true cost of remaining anonymous, then, might be irrelevance. – source link
What this means is that if you don’t use authorship markup, you probably won’t rank as high as sites or blogs that do. Here is another interesting article about that topic.
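For reference, the authorship markup in question boils down to linking your content to a Google+ profile with `rel="author"`. A minimal sketch, with a made-up profile ID and author name (it also assumes the Google+ profile lists your site under “Contributor to”, which closes the verification loop):

```html
<!-- Option 1: site-wide, in the page's <head> (profile ID is made up) -->
<link rel="author" href="https://plus.google.com/1234567890"/>

<!-- Option 2: inline, in the article byline (name and ID are made up) -->
<p>Written by
  <a rel="author" href="https://plus.google.com/1234567890">John Doe</a>
</p>
```

Either form tells Google which profile authored the page; the author photo and byline may then appear next to the listing in search results.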
You see, the thing is that Facebook users weren’t forced to sign up. They did it because they wanted to. A lot of Google+ users, on the other hand, are there simply because they don’t have a choice. I think that’s a huge difference, and from my own experience, whenever you try to force something on people it usually backfires sooner or later.
Scraping The Web (Great Business Model)
Google has stated many times that they don’t see scraped duplicate content as a valuable source of information for their users, but for some reason this problem continues to grow.
Lately, sites that automatically scrape content often outrank the original source.
I was browsing the Google Webmaster Forum the other day and found threads that highlight this specific issue.
You would think that Google, with all their resources, could easily identify which document is the duplicate and which is the original, but for some reason they still have issues with that…or don’t want to fix the problem at all.
Big brands get away with scraping on a massive scale. For example, Aaron Wall made a post about FindTheBest.com here, where he pointed out how Google is biased toward big corporations, or companies they have connections with.
This behavior by Google, in my opinion, injects fear into the search ecosystem and is designed to squeeze the little players out.
When scraping happens among small publishers, they usually get penalized. If someone scrapes your content with a link back to your original piece, that link could be classified as unnatural. Now it’s your job to go out there and clean it up, even though you can’t control it.
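One small defensive step, for what it’s worth: put a self-referencing canonical tag in the head of every post, so that scraped copies which lift your markup wholesale at least declare your URL as the original. A sketch, with a made-up URL (it won’t stop a scraper who strips the tag, of course):

```html
<!-- In the <head> of the original article (URL is made up) -->
<link rel="canonical" href="http://example.com/blog/my-original-post/"/>
```

It costs nothing and gives Google one more signal about which copy came first.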
Moving Away From True Search To Providing Content And Answers
Google wants to move away from showing just 10 links on their search results pages. It’s more visible in some markets than others, but the writing is on the wall.
Have you seen Google Now? Basically, the point is to keep you on their platform so you become more dependent on them. Read this if you want to find out more. It all looks like great innovation, but where does it end?
This major shift started when Google went public. From that point on, they cared more about their investors and stockholders. Google needs to develop more revenue streams to keep growing, but at what cost?
We use Google to find information from publishers, not from Google – at least that was the idea. If Google becomes a one-stop shop, users might find them irrelevant at some point in the future, because the same data already exists on billions of other pages.
In the end, Google might become just another big website.