
Issues Google Needs To Address in 2014


All is not well in the world of Google. In the post-Hummingbird SEO landscape, monitor any related message board and the sentiment towards Google is generally negative. Perhaps this isn’t surprising, given that SEO black-hatters, slow or reluctant to change their ways, have been hit hard by the latest changes. However, even among ethical SEOs who were already using sustainable, Google-pleasing techniques, there is still a lot of negativity cast in the search giant’s direction.


Claims of anti-competitive and unethical practices are on the rise, and webmasters are increasingly vocal in condemning Google’s often nonsensical policies. Take its stance on ‘bad’ links, for example. Google remains woolly, but generally describes bad links as those from low-quality directories or bookmark sites, text advertisements that pass PageRank, and advertorials. These guidelines aren’t as clear as they might initially seem. How do you define a ‘low-quality’ directory? More to the point, how does Google define a low-quality directory?

Read more about low-quality links and Link Schemes here.

Directories such as the long-standing, human-edited DMOZ are a pretty safe bet, but what about all the others? Currently, all webmasters can do is make an educated guess, looking at factors such as relevancy and domain authority, but it’s not always easy to know which directories and websites are on Google’s safe list and which are on its blacklist. It would, of course, be a huge help if Google gave clearer information, but it doesn’t.

Disavow Can Harm Your Site’s Performance

This has led to webmasters using the disavow tool on vast swathes of backlinks, in an attempt to limit damage already caused or to reduce their chances of picking up a future penalty. However, disavowing a large number of backlinks, some of which may be actively helping your website rank well, can itself be harmful; you may see your domain authority fall, for example.
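For those who haven’t used it, the disavow tool simply takes a plain text file, uploaded through Google Webmaster Tools, with one link or domain per line and comments marked with #. A minimal sketch (the domains below are placeholders, not links from any real site):

# Links we asked to have removed but received no response
http://spammy-directory.example/listing/my-site.html
# Disavow every link from an entire domain
domain:low-quality-bookmarks.example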

Google’s own Eric Kuan warns “The disavow backlinks tool should be used with caution, since it can potentially harm your site’s performance.” At the same time, Google’s Matt Cutts suggests that aggressive use of the disavow tool is the way to go in his recent video, How can a site recover from a period of spamming links?

Google has effectively created an environment where it warns against having too many low-quality backlinks and suggests using the disavow tool, whilst at the same time warning that using that same tool to clear them up can harm your site. In both cases, the overall result is that the average website is more likely to rank poorly, leading to decreased visibility. So why does Google choose to do this? Simple: a loss in organic rank makes it more necessary for website owners to turn to paid search, so Google benefits from an increase in AdWords revenue.

Tackling Webspam

Of course, Google could choose to simply ignore these bad links (as it historically did). Alternatively, it could let webmasters know which links to remove by providing a notification and a grace period in which action would need to be taken. Instead, without any warning, Google chooses to hand out often drastic penalties, leading in some cases to a devastating drop in rank or exclusion from search results altogether. Recovering from a penalty is often a long, drawn-out process (I’m talking several months here, not days), as webmasters flounder around, trying to manually identify and disavow ‘bad’ links and remove any over-optimisation that might historically have been done on the website.

Now, almost every business I’ve worked with has used a number of different SEO companies over the years. Businesses pay for the services of an SEO expert and put their trust in them, on the implicit belief that the techniques being used will have a positive, rather than a negative, effect.

Forcing Websites Down The Paid Search Route

The simple truth is that Google is now penalising thousands of businesses who have done nothing more than unwittingly employ a black-hat SEO company at some point in their past. I don’t disagree that Google needs to tackle webspam, and I believe the recent algorithm change has proved successful in doing this up to a point. However, the simple fact remains that thousands of website owners are suffering due to harsh penalties that put them at a huge disadvantage and, as such, are being forced down the paid search route.

Read more about Google’s quality guidelines here.

Sure, there are always going to be those who try to game the system, but I don’t believe for one second that every single website that’s picked up a penalty or suffered a drop in rank has an evil black-hatter at the helm, hell-bent on playing the system. I would say that such websites are in the minority.

Some of the Issues Google Needs to Address in 2014

1. More information and better support for webmasters

Better support is needed for webmasters struggling to recover from penalties. Whilst Google does report on manual actions within Google Webmaster Tools, penalties are often picked up with no such manual action being reported. This makes it extremely difficult to figure out what a website is being penalised for, making recovery lengthy and, ultimately, less likely. Webmasters need better, more detailed information to help them recover, and a clean-up grace period needs to be introduced.

2. Credit to original content sources

Google also needs to be better able to identify (and penalise) low-quality content aggregation websites that simply scrape other blogs for content. I’m fed up, and I know other bloggers are too, with seeing content from my blog published on sites I’ve not authorised to share my content, with no rel=”canonical” link element. A website like Yahoo Business has far more traffic than my own blog, as well as thousands more backlinks, better domain authority and so on. The chances of my blog being credited as the source are slim, making content-scraping sites more likely to obtain the benefits.
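For anyone unfamiliar with it, the canonical link element is a single tag in the head of the republished page, pointing back to the original article, so Google can credit the source. A sketch (the URL below is a placeholder, not my actual blog address):

<link rel="canonical" href="http://www.my-blog.example/original-article/" />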

3. Full implementation of publisher markup

I also hope that we see Google publisher markup working effectively and consistently, as in my experience this is still very hit-and-miss.
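For reference, the markup itself is straightforward: a single link element on the site’s home page pointing to its Google+ page, which Google is then supposed to use to connect the brand with its content in search results. A sketch (the Google+ page URL is a placeholder):

<link rel="publisher" href="https://plus.google.com/+YourBrandPage" />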

4. Alternatives to local search for internet-only small businesses

Finally, I’d like to see Google start supporting SMEs with more innovation in the area of local search. Worryingly, an increasing gap is opening up when it comes to search visibility. At one end of the scale you have the big brands with plenty of cash, who are able to maintain prominence in search via AdWords. At the other end you have local businesses, being served relatively well by Google Local listings. Somewhere in the middle, though, we have a plethora of small internet-only businesses who are struggling to maintain visibility in organic search, who can’t afford extensive PPC campaigns, and for whom local search is irrelevant.

How do you feel about the recent changes that Google has made? Have you been affected by a penalty, or perhaps you’ve recovered? What issues do you see facing Google over the coming year?

Resources

Google: The Link Disavow Tool Can Harm Your Sites Performance
http://www.seroundtable.com/google-disavow-tool-harm-17327.html

What is Duplicate Content
http://moz.com/learn/seo/duplicate-content

The Benefits of Rel=Publisher
http://www.advancessg.com/googles-relpublisher-tag-is-for-all-business-and-brand-websites-not-just-publishers

 
