Google’s 2013 algorithm updates have been highly controversial.
The most important and hotly debated issue is attribution.
For example, if we run the search “King of the Hill Theme Song,” Google just tells us the answer. From a user perspective, this is beneficial, right? The moment we hit enter on that search, we get the answer, as opposed to seeing it in a title or meta description, or even clicking through to a site to find it.
On the other hand, from a webmaster/SEO/marketing perspective, Google is stealing traffic, like a super fisherman ninja…
Other examples include comparing ingredients, e.g., chocolate vs. vanilla:
Let’s say we were ranking #1 for “king of the hill theme song” and related terms. This would be netting us around 2,000 to 4,000 visits a month (based on current Google Keyword trends).
Then all of a sudden Google comes in and says, “Sorry bub, you’re ruining the user experience, we’re going to just show them the answer ourselves.”
So my question to you is, “Does this improve the user experience ENOUGH to warrant its existence?”
Personally, I think it doesn’t. As the internet population continues to grow, there should be more room for authoritative sites to rise. What if someone wants to be the go-to source for all theme songs? (In fact, TelevisionTunes already is.) Or ingredient combinations?
Some have pointed out that Google’s answers are “basic enough” to not interfere with the business of sites ranking for those terms. But this is obviously subjective.
Where Wikipedia, About, and Examiner Fall Short
Another thing I’m going to rant about is Wikipedia.
They land top-tier rankings for everything, but they’re NOT CREDITED AS A LEGITIMATE SOURCE.
- 99% of universities will shun any paper that lists Wikipedia as a source.
- Wikipedia entries are supposed to be ‘neutral’ to present an unbiased representation of ‘facts,’ even though thousands of entries are riddled with false information and bland insight.
Speaking of marketing, when we run a search for ‘marketing’ (or any niche study), why is Wikipedia #1? Just because you can find “credible” sources and conglomerate them into an entry doesn’t mean that entry should be the #1 ranking for a field they have no experience in. We have to get through five links before an actual credible source is listed. The AMA, BMA, and SBA should all outrank Wikipedia. Again, this is just my personal opinion.
However, Google’s In-Depth articles feature does seem to ‘balance’ weak Wikipedia links.
Shouldn’t students and professionals alike learn and source information from people who’ve devoted their lives to a practice, as opposed to over-zealous editors and a system that restricts valuable insights?
Here’s another way to think about it:
Imagine all your local shops: the community auto mechanic who’s carrying on a third-generation business, the flower shop run by a retired horticulture professor, and the town bakery that’s regularly featured in culinary news and has accumulated dozens of awards.
When we search for:
- auto repair
- gardening tips
- using butter vs. margarine
Who would you rather get information from?
- Your local experts who have devoted their lives to their trade
- Generalists who conglomerate dry information and disregard opinions and experience
Google sticks the generalists on top of the experts.
In addition to Wikipedia, generalists like About and Examiner follow the same pattern.
While About and Examiner at least feature experts, they’re still trying to conglomerate them.
Essentially, it’s a greedy SERP battle that ultimately HINDERS the user experience.
Monopolization and Greed in SERPs
These are massive sites that try to cover as many fields as they can. In doing so, they are knocking out thousands of niche sites with higher-quality information from people who have taken the time to create something out of their passion, not to meet a pageview quota. We have to dig through monotonous fluff to find the gems, because these sites have reached such a level of authority that they automatically override everything else when they hit publish.
I know there are pros and cons on both sides; this is just how I personally feel about the current state of SERPs.
Of course, the reasons for this imbalance are directly related to how algorithms work and the steps sites are taking to rank. A culinary god might not have a clue how to set up the most basic onsite SEO, let alone care about getting links from reputable sites. The glorious Matt Cutts would suggest they just keep on keeping on because they’re creating honest, quality content.
But the reality is that it’s a popularity contest, not a quality contest.
This culinary master will never get the exposure for his awesome recipes that a sub-par foodie might get by posting in the recipe section of Examiner.
I guess it all boils down to just being “okay” with the monopolization of SERPs, as our end goals are usually not that “dependent” to begin with. I just think there would be more opportunity for niche sites if the “Generals of Generalization” were devalued a bit…
Does Google deserve praise or criticism for these changes?
(Sorry, I know it’s kind of a semi-loaded question and there are too many factors to count)