Articles Tagged with Google

Black-Hat - Stormtrooper image

The Sad Truth: Black-Hat SEO is Working…

There have been some great articles recently on the topic of white-hat vs. grey-hat vs. black-hat SEO – what’s working and what Google’s doing about the situation – so I wanted to weigh in with my recent observations.

While the likes of “SEO evangelist” (and one of my SEO heroes) Rand Fishkin argue that white-hat really does work and should be the way to go, others have vented their frustrations, saying that Google isn’t making it easy to be a pure white-hat anymore; that you may have to turn a little ‘grey’ in order to succeed. In the meantime, black-hats are the ones who are ultimately winning – while we all wonder, worry and play by the rules, they’re doing fine and reaping all the benefits the top of the SERPs can bring.

As one self-confessed link spammer put it in the SEOmoz comment I’ve linked to above: “Don’t blame me for doing it, blame Google for not fixing it.” And that’s part of the problem: it’s all well and good being white-hat, honest and ethical, and for Google to ask us to do it this way, but they’re not exactly doing us any favours if they’re not weeding out the link spammers as well.

I’ve been doing link building full-time for over 3 years. In all that time, I’ve looked at a lot of backlink profiles, mainly those of my clients’ competitors. I have to say, though, that across all the clients and industries I’ve looked into, I’ve seen more dodgy backlink profiles in the last few months than in all the years previously. What makes it worse is that they’re the ones currently ranking well in (and often at the top of) Google for their industry head terms.

In this post, I’m going to share my experiences – the types of links I’ve seen that seem to be working for these sites – and what I think Google should be thinking of doing in order to counter it (which is certainly easy for me to say, and easier said than done, I’m sure).

Disclaimer: Before I continue, I just want to say that I won’t be naming and shaming. Apart from the fact that I don’t want to get into any legal or libel trouble (*coughcowardcough*), it’s simply not my style. What I am hoping is that fellow SEOs will read this, nod along and say “yep, I’m seeing this too, and I’m equally annoyed and gutted by it.” Besides, when Eppie Vojt wrote a similar, excellent post outing a website and its crappy links, the site being scrutinised had been spotted and banned by Google before his post had been published (while the sites I’m talking about are still ranking).

Is black-hat victorious? A backlinks case study

The first (and main) site I will be looking at is in a similar vertical to the one Eppie outed – a very competitive financial sector. Similar to Eppie’s example, which was (previously) ranking 2nd for “car insurance” in the US, my example was* ranking 2nd for its head term in the UK – it is an affiliate site, with a non-branded exact-match domain URL, fighting some of the biggest household-name brands in the UK (and – for the most part – beating them). I know this industry and its SERPs pretty well, as I used to work on a site that is currently one of the ones ranking near it (read: below it). I’ll call this site Example X (or EX).

* I say “was” because when I first started typing this post, it was 2nd. However, it’s now 8th. Typical, eh? Still, it’s on the 1st page, and therefore it hasn’t been entirely nuked by Google, unlike Eppie’s example. Perhaps it took a hit from the recent blog network devaluation, if it’d been guilty of that practice, too – a quick glance at the overall backlink profile suggests that it could have been.

The second cluster of sites I will be examining are in a very competitive industry relating to law and legal services. I have been working on a client’s site and while we’re trying everything we can that is white-hat and by-the-book, the people ranking at the top are pretty much as black-hat as can be… I’ll call this group of sites Cluster Y (or CY).

I have mainly used the link analysis tool Majestic SEO to analyse these sites’ ‘best’ backlinks, based on their ACRank score. I love using Open Site Explorer, but I find Majestic’s results more interesting in this instance. Besides, at a quick glance, many of Majestic’s top 100 links for Example X overlap with OSE’s top 100, which are ordered by Page Authority.

Example X’s strongest links

When Example X is thrown into Majestic SEO (root domain, Fresh index), its top 100 backlinks range from 7 down to 4 in ACRank. The majority of them reveal themselves to be blog comments (similar to Eppie’s example, where 43% of that site’s backlinks were comments), but there’s more to it than that. These blog comments have a lot in common:

  • The majority of the linking sites are not at all relevant to EX’s industry, including sites to do with technology, software, gardening, fashion, politics and even gambling;
  • The majority of the comments are not relevant to EX’s industry either – some of the comments are themselves dodgy, including gambling, adult and Viagra comment links;
  • Some are in English, but a lot of them are in foreign languages, including French, Russian and Japanese, and are therefore also on foreign domains (e.g. .fr and .jp);
  • Many of the pages have hundreds (if not thousands) of blog comments, which are not only dofollow but also seemingly unregulated (i.e. they go live instantly without needing approval). This also means that each of these pages links out to hundreds or even thousands of external sites;
  • As you can imagine, the comments themselves are pointless and contribute absolutely nothing to the posts; they’re your typical “great post” type comments in broken English (possibly produced by spinning software);
  • In some cases, you can actually see ‘broken’ HTML code, where either the commenter hasn’t configured the link properly, or the blog doesn’t allow – or lacks the capability to show – in-comment links (i.e. the link comes from the commenter’s ‘Name’ field instead).

For those blogs that still have a comment form – and so the capability to add to the almost endless list of spammy, useless comments – most have a CAPTCHA, which suggests one of two things: 1) the blog commenting process isn’t entirely automated – someone still has to manually enter the data into the fields and type in the correct CAPTCHA code each time – or 2) it is automated, and CAPTCHA-filling software is being used (kudos to @Andrew_Isidoro for pointing out that such software exists, as I – somewhat naïvely – had no idea). Call me a sceptic, but my money’s on the latter.
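The tell-tale signs above can be boiled down to a rough filter. Here’s a minimal sketch, assuming you’ve exported a page’s comments as plain text along with each link’s anchor text – the function name, the phrase list, the 15-word threshold and all field names are my own invention, not from Majestic or any real tool:

```python
import re

# Generic filler phrases typical of spun/automated comments (illustrative list)
SPAM_PHRASES = ("great post", "nice post", "thanks for sharing", "very informative")

def looks_spammy(comment_text, link_anchor, page_topic_keywords):
    """Rough heuristic: flag a blog comment link as likely spam.
    Thresholds and signals are guesses based on the patterns described above."""
    text = comment_text.lower()
    # 1) Short, typical "great post" style filler
    if any(p in text for p in SPAM_PHRASES) and len(text.split()) < 15:
        return True
    # 2) Visible broken HTML or BBCode in the comment body
    if re.search(r"</?a[\s>]|\[/?url\]", text):
        return True
    # 3) Anchor text shares no words with the page's topic at all
    anchor_words = set(link_anchor.lower().split())
    if not anchor_words & set(page_topic_keywords):
        return True
    return False

print(looks_spammy("Great post, thanks for sharing!",
                   "cheap blue widgets", ["gardening", "plants"]))  # True
```

A real classifier would obviously need far more signals (language detection, link velocity, commenter history), but even this crude version catches the “great post” comments with off-topic money anchors described above.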

Cluster Y’s general backlinks

I know what you’re thinking: for Example X, I’ve only really properly assessed 100 links. It might be what Majestic considers to be its 100 best links, but it is just 100. The site has more than 5,000 in total, so I’ve only really scrutinised 1-2% of its backlink profile.

Of course, I have trawled through the entire list of 2,500 backlinks that Majestic came back with (and, let’s be fair, the rest isn’t much better). But it could have been the case that the rest of its links were perfect, by-the-book and ticking all the boxes as far as Google is concerned. To be fair, the likes of Rand argue that Google has to allow for a bit of spam, as no one has a 100% perfect link profile: if your competitors get dodgy sites to link to you, it shouldn’t be considered your fault, but if the majority of your links are spammy, it’s more likely your own doing. That said, even that’s hotly debated at the moment (see Reason #1 of this post), but I digress…

Cluster Y is a different story, though. I’ve looked at 5 sites, all ranking on page 1 for the industry head terms. They have only a few hundred links each. While there are some non-relevant blog comments à la Example X, there are also a lot of footer and blogroll links going on. And guess what? They’re all coming from completely off-topic sites. Out of the hundreds of backlinks each site has, 4 out of 5 of them do not have what I would classify as even ONE relevant link. Ironically, the one that does ranks the lowest of all 5, but it still has a lot of non-relevant links as well. The site I’ve worked on ranks lower than all 5 of them, and I’ve tried to keep its links as relevant as possible.

So what’s the problem?

The problem is that Google has guidelines, and Google employees – like Matt Cutts in his Webmaster Help videos – like to recommend white-hat ways of doing SEO. For the most part, these sites are doing the absolute opposite and succeeding because of it. How so?

Relevance – Google says it themselves:

“Your site’s ranking in Google search results is partly based on analysis of those sites that link to you. The quantity, quality, and relevance of links count towards your rating. The sites that link to you can provide context about the subject matter of your site, and can indicate its quality and popularity.” (Emphasis added.)

Wil Reynolds has commented on this as well. As I’ve established above, there’s barely any relevance in either Example X or Cluster Y’s backlinks. The only relevance I can see is in the keyword anchor text: the site might be about blue widgets and the anchor text might be “blue widgets,” “blue widget,” “cheap blue widgets,” etc., but the sites these links are coming from are not about blue widgets – instead they’re blog posts on a different subject, with hundreds of blog comments linking out to completely different sites in completely different industries each time.

Large numbers of out-bound links – Back in 2009, Cutts stated that pages shouldn’t have more than 100 links on them – internal or external – if it can be helped. However, since then, it has been highlighted as more of a technical guideline than a quality guideline, and it’s particularly fine if “your page is sufficiently authoritative.” Now, going by Majestic’s ACRank, these pages are considered sufficiently authoritative, which is precisely the reason they’re being spammed to death (that, and the fact that they’re dofollow, etc.).

But c’mon, Goog… These pages have out-bound links in the 100s, if not 1,000s – one even has about 4,000 out-bound links on it, due to the excessive number of comments on the page. Maybe it should be a quality guideline? You could argue that it’d be unfair for the website to be devalued by Google and to suffer because of something outside of its control, but then again, it is in the site webmaster’s control: they could consider deleting the comments and/or disabling comments from appearing whatsoever. After all, it’d probably be better to have no comments at all than a plethora of nonsense adding absolutely squat to the topic of conversation. This also ties in with Google’s stance on ‘bad neighbourhoods’ and the fact that for these blogs, where you’re linking out to matters too, not just where you’re being linked from.

The white-hat’s dilemma

White-Hat - R2D2 image

What really sucks about all this is the fact that these mad (black-)hatters are getting away with it.

Yes, you can be white-hat and work hard and slave away on getting 100% genuine, honest, by-the-book links, but when your competitors can spend 5 minutes using a bit of automated software to get really powerful – but really irrelevant and dodgy – links and succeed, then what’s the incentive for the white-hat? Work hard, be good and still not rank?! Sod that! Clients won’t thank their agencies for it, and Marketing Directors won’t thank their in-house teams for it. And this is precisely the risk Google is running by not tidying up this crap: “if you can’t beat ’em, join ’em” may become the mantra for many honest, hard-working white-hats who just aren’t seeing the benefits of all their hard work.

But what’s a search engine giant to do?

Right, now we come to the part of the post where I venture into ridiculous self-righteousness…

It’s easy for me to say “this is what Google should do” – especially as I don’t think I could ever work for them, as their crazy interview questions leave me utterly dazed, confused and feeling stupid – but that said, I think the above certainly highlights that there is a problem and that it needs to be seen to and fixed. Easier said than done, yes…

The first thing they should be thinking of doing is turning up the ‘relevancometer’. It’s true that completely irrelevant, off-topic sites can and do link out – for example, someone on a gardening forum might ask fellow members for advice on car insurance, just because they trust the community and value its opinion – but when every link is irrelevant and off-topic, that seems a little more than coincidence…

I think they should also consider making excessive numbers of links a quality guideline, not just a technical one. I can’t see why a page’s links overall – not just the most recent ones – shouldn’t be given slightly less value as more links flood onto a page. Obviously it’s already part of Google’s PageRank algorithm that more links on a page means less ‘link juice’ allocated to each linked page, but I’m talking about a limit where that drops even further: maybe not in terms of PageRank, but in terms of looking at a page, and if it hits a certain limit, its value is affected slightly, and if it hits another limit, it’s affected even more, and so on.

In fact, if this isn’t a consideration already, what about working out a ratio between page size/copy length and the number of links on the page? For instance, I’ve linked out to a lot of sites within this post, but then it’s also 3,000 words long. 30 links for every 3,000 words is a bit different to having 100s in less than that. Naturally, having comments following a blog post – genuine or otherwise – is going to affect that ratio significantly, but I think it’s safe to say that natural, genuine blog comments are going to be longer and more substantial than your typical spammy “great post” style comments.
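As a back-of-the-envelope illustration of that ratio idea – the 10-links-per-1,000-words limit and the decay curve below are entirely made-up numbers, purely to show the shape of the thing:

```python
def link_density_penalty(num_links, word_count, limit_per_1000_words=10):
    """Toy model: scale a page's outgoing link value down once its
    link-to-copy ratio passes a (made-up) threshold."""
    if word_count == 0:
        return 0.0
    ratio = num_links / (word_count / 1000)   # links per 1,000 words
    if ratio <= limit_per_1000_words:
        return 1.0                            # full value
    # Value decays in proportion to how far past the limit the page is
    return limit_per_1000_words / ratio

# A 3,000-word post with 30 links keeps full value...
print(link_density_penalty(30, 3000))      # 1.0
# ...while a 500-word post buried under 4,000 comment links keeps almost none
print(link_density_penalty(4000, 500))
```

Whether Google weighs anything like this is pure speculation on my part; the point is simply that a ratio-based signal is trivially cheap to compute per page.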

Not only should they consider turning up the ‘relevancometer’ from an on-site copy or general domain point of view, but they should also take into account the relevancy of the other out-bound links on the page. If you’re the only link about “blue widgets” on the page and you’re featured among dozens of other links, each one pointing to a site on a completely different subject, then it really only spells one thing: link farm.

While I’d say the above should be serious considerations, we’ll call the next lot ‘maybes’ – I don’t even know if it’d be feasible (or even possible) to assess some of these criteria…

By language/location: Arguably, this depends on the industry. For example, a car insurance website is probably not going to get many international links because it’s specific to an individual country, although it might if it does something worthy on a global scale, e.g. a successful viral campaign that’s admired worldwide. Other companies in other industries might be global. However, the EX and CY examples above are UK-specific, yet have an irregular number of links from sites in European and Asian countries. It’d be great if Google could measure the authenticity of these links in a way that works out whether the site being linked to warrants the link, i.e. is it the subject of the discussion, or just another random link appearing on the page?

By blogging platform: Speaking of foreign-language sites, I noticed that some of the sites being spammed like crazy in the comments were French, and that most of them had a very similar blog structure and set-up, so I wonder if Google has the ability to discount or discredit certain blogging platforms – assuming this is a blogging platform targeting the French-speaking market. Again, this is a bit unfair on all those who are using their blogs properly and are strict when it comes to monitoring comments, but, as I’ve argued above, it could be the blogging platform developers’ fault for not configuring its features properly for its users, e.g. maybe the spam filter isn’t strong enough, or there isn’t even one in place. I don’t think this is too outrageous a consideration, especially given that some blogging platforms are seen to be more SEO-friendly than others.

By the extent of the visibility of ‘bad’ code: By “visibility,” I mean exactly what I say: not just bad code by the search engine spiders’ standards, but bad code that people browsing the site can actually see for themselves, too. For example, when these spammy blog comments have been entered irrespective of the blog’s guidelines and you can actually see the <> of broken HTML code (I even saw the [] of broken BBCode in a couple of instances), then this could – and maybe should – be an indication of improper on-site technical practices. It goes without saying that Google likes clean code (insofar as the site is easy to navigate for both search engines and humans), although I can’t say that I’ve come across any reference to visibly bad code – but maybe that’s because it’s so obvious that it’s not a good thing to have on your site.

Again, I’ll repeat: it’s easy for me to say all this – a Google engineer might read this and say “we KNOW, get off our case!” – and it’s clear that, even recently, they are working very hard to try and solve the problems associated with spam affecting their index. Even so, I’m hoping that some of my observations indicate the seriousness of the problem and help them in their quest for a search engine that’s swayed by spam as little as possible (if that can ever be achieved). Heh, I was even going to say that maybe they should consider hiring someone who’s worked in the SEO industry as part of their engineering team, although it looks like they’ve already done this for their Google Webmaster Trends Analyst team.

Whatever the case and whatever the future holds, here’s to hoping that black-hats shall rue the day (one day) and that white-hats will see their efforts pay off…

[Image credits: Black-Hat Stormtrooper by Jay Malone; White-Hat R2D2 (a.k.a. one of the coolest pics I’ve ever seen) by Brittney Le Blanc]

Google Places Beyond Local Searches – What It Means for Small, Local Businesses

Buy Local sign

Yesterday evening, Gareth at Liberty reported that Google had introduced a new look and feel for Google Places. More than just testing, it turns out that these changes have gone live fully, having been dubbed the ‘grey pinned results.’

When I looked into it a bit more this morning, I noticed a potentially bigger, more substantial change: Places results were a lot more present in the search results, especially for searches that did not specify a location. In other words, when I searched for something like “it recruitment,” I was seeing a map of Cardiff-based IT recruitment agencies (as I’m based in Cardiff, of course). Notice that I didn’t have to type “it recruitment cardiff” to reveal the map – “it recruitment” was enough.

Example of Google Places changes in the SERPs

Although many will argue that this is not a “new” change (as I’ll touch upon later on in this post), I’ve definitely noticed it for keywords – and industries – that have previously been unaffected.

Typically, a search like this would show national results, i.e. the top 10 results for “it recruitment” – typically nationwide agencies, or agencies in the big cities, such as London. In fact, these are still present amongst the Places results, but results local to the searcher are nestled amongst them. And if you consider the fact that recently carried-out testing has shown that Google Places listings are good at catching the searcher’s eye-line and attention, this could be very good news indeed for small, local businesses.

What this means for small businesses

Currently, I’m only seeing these types of results for a handful of keywords and industries. However, if these results become permanent and more widespread, then small, local businesses will have more chance of getting traffic and enquiries.

The biggest impact will be seen in industries where people may not particularly consider searching locally. For example, according to Google’s own data (via the AdWords Keyword Tool), “it recruitment cardiff” gets about 30 searches per month, while the broader, non-location-specific “it recruitment” gets 3,600 – more than 100 times as many.

Results for [it recruitment] in the Keyword Tool

Admittedly, the latter will include people across the UK, while the former will probably be people in and around Cardiff, but there may be people in Cardiff who are searching for “it recruitment” without actually including the word “cardiff.” If that happens to be 1% of all those “it recruitment” searchers (i.e. 1% of those UK-wide searchers are based in Cardiff), then that’s another 30+ people, and they will now be seeing a map of Cardiff instead of just UK-wide results.
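The arithmetic behind that estimate, for anyone who wants to plug in their own numbers (the 1% local share is a guess on my part, not a measured figure):

```python
def extra_local_searchers(broad_monthly_volume, local_share=0.01):
    """Estimate how many searchers of a broad, non-location-specific term
    are actually local. local_share (1% here) is a guessed proportion."""
    return round(broad_monthly_volume * local_share)

# "it recruitment" gets ~3,600 UK searches/month; if 1% are in Cardiff:
print(extra_local_searchers(3600))   # 36
```

Swap in your own keyword’s volume and whatever local share seems plausible for your area – the point is that even a tiny share of a broad term can dwarf the volume of the location-specific query.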

It might even lead to bigger enquiries for a small business. If a big company in Cardiff is looking for an IT recruitment agency, they might just type in “it recruitment.” After all, they might have the budget to afford a big, national or London-based agency – the likes of Hays, etc. A smaller company might type in “it recruitment cardiff” – they might want someone smaller, potentially cheaper and more local to them. With this shift, Cardiff results will have more of a fighting chance in getting noticed and receiving an enquiry from a bigger business.

A major change in search, with a focus on local?

It could even potentially change the way we search for products and services online and buy them. It was about a year ago that this “huge change” by Google was seemingly intended to “[favour] truly local businesses for queries that are likely to be local in nature.” As Eric Enge argued back then, Google was trying to steer the results based on what the person might want to see – if someone types in “pizza,” do they want something local, or something national and informational?

I remember Google testing this quite a bit back then. I remember typing in “car insurance” and seeing a map of Cardiff, which – when you think about it – is absolutely radical. Car insurance is an industry where people don’t search locally; they just type that in and go to one of the big insurers’ websites or a comparison site. Again, using the Keyword Tool, we can see the difference – except this time it’s 20-30 versus half a million!

Results for [car insurance] in the Keyword Tool

If it comes about again then imagine what it could do for the likes of small, locally-based insurance providers!

What should a small business do?

If you’re a small business and you haven’t claimed your Google Places listing, then do it now. If you can’t do it now, do it ASAP. Seriously, do it, get cracking! Prioritise it! Get it done!

It’s free to set up a Places listing and doesn’t take long, either. You might even have a listing already – it might just be a case of ‘claiming’ and updating it.

Haven’t got the time? SEO and marketing agencies often offer a Google Places Optimisation service, meaning that they will set up listing(s) for your business locations and optimise them on your behalf. (And yes, that may or may not have been a bit of a shameless plug for my employer!)

Just what is Google up to?

It’s an interesting change by Google. Again, it could just be a case of them experimenting again, or it could be longer-term.

It might even be an attempt to encourage more businesses to create and claim their listings, which this will no doubt do. As Mike Blumenthal put it, “if users won’t go to Places, bring Places to them.”

Even if it is a bit of a testing phase, or perhaps if a small business owner is reading this and their main industry keywords aren’t showing a map just yet, then it might still be an idea to sort out a Places listing, for future-proofing purposes.

Blimey… honestly – I should be on commission. What do you say, Google?

[Buy Local image credit: Ari Moore. Also, many thanks to Computer Recruiter for letting me use them as my example for this post. Can you guess what they do? You guessed it… IT recruitment in Cardiff!]

Is Link Building a Job “a Monkey Could Do?”

Monkey on a Mac image

When I found out that an acquaintance of mine – someone I looked up to and respected – had badmouthed and insulted my main area of expertise, I was pretty hurt. I had been informed that, when he was asked by someone whether he thought there was any value in link building as an SEO practice, he said that he didn’t think there was and that it’s a job “a monkey could do.” Charming!

Although his line of work isn’t primarily SEO, it is closely related, so I was quite shocked – but also somewhat relieved – to find out that his thoughts were based on the ignorant belief that there’s only one thing link builders do, and that’s simply to “submit links to general directories.” Although directories can still be included as part of an all-round link building strategy, gone are the days when SEOs could rely on link directories alone.

Of course, with Google’s on-going changes to its search engine’s algorithm, for an SEO to concentrate on just directories, they’d have to be absolutely bananas! (Sorry, couldn’t resist – there had to be a monkey-related pun in this post somewhere!) Some SEOs argue that link building (a major element of off-site SEO) is a stronger SEO signal than on-site SEO (although many will also argue that both should be considered for the best possible chance of success, as one element being weak can let the side down overall). So it’s crucial that an SEO strategy involves a well-rounded and varied link building strategy, particularly in competitive industries.

Then there’s the subject of how many factors come into Google’s search engine rankings, and how often the search engine algorithm is updated. The answer? They “use more than 200 signals… and [they] update these algorithms on a weekly basis” (source). 200… Two years ago, when it was first mentioned by Matt Cutts, folks over at WebmasterWorld made an attempt to list them, and the provisional list had at least a dozen variables affecting the finer details of an individual in-bound link, including the anchor text used, its position on the page and its relevancy to the content, as well as a whole bunch of other factors. It’s complicated stuff. And that was two years ago! There are probably more now, or if not, then at least they’ll have changed, with some variables gaining importance over time and others losing it.

Then there’s the type of links an SEO can go out and obtain. Sure, there are your bog-standard general directories, but there’s also article marketing, reciprocal linking, paid links (boo!), blog comments, blogrolls, guest blogging, forum signatures, and linkbait ideas including infographics, online tools, badges, etc. – some of which have become less important than others as time has passed, and I’m sure there’s a lot, lot more than what I’ve just covered. It’s enough to make someone go ape… (Last pun, I promise.)

But don’t just take my word for it. When I first heard about the monkey comment, I decided to get the feedback and opinions of a few fellow SEOs via Twitter.

Profile pictures of the three tweeters

Three got back to me, with the following:

Brighton-based SEO Yousaf (@ysekand) said: “Anyone can build links but it takes creativity & outside the box thinking to build quality/juicy links, that is the difference.” (Just realised that I misspelt your name in my original tweet to you – I apologise sir!)

Recent Liberty starter Andrew (@Andrew_Isidoro) has a similar opinion: “Most people could link build with training, but the best guys are web-savvy, intelligent & work with a high level of creativity.”

Tweet #3 by Mike (@Koozai_Mike), on the other hand, seemed more concerned with the monkeys’ supposed confusion over their occupation: “I thought monkeys were busy writing the works of Shakespeare? If they can’t finish that yet, then they’ve got no hope.” An unusual comment, but good on him for countering with a bit of humour, rather than writing an epic, probably-OTT blog post about it (we all have our ways of dealing with things, I suppose…)

However, the link building fun doesn’t stop there. It’s occurred to me while typing this that Yousaf, Andrew and Mike could share this post via their Twitter profiles and possibly link to it from their own sites, because they’re mentioned in it. In fact, there’s a chance that others in the SEO industry will be interested, due to the industry’s strong sense of community (you only have to take a look at the likes of SEOmoz’s thriving SEO community and the WebmasterWorld Forums to see how strong those communities can be, as well as the simple fact that most SEOs like to help each other out). Plus we like to defend ourselves against hurtful, attacking comments made towards SEOs (remember the guy who called us “SEO bastards,” and the SEOs who wrote blog posts in response?)

I wrote this post because I wanted to defend what we do from a silly, harmful comment. But if this blog post – and my website domain as a whole – garners links in the process then I’d consider it an added bonus. This post is potentially a link-earning bit of content – a link building consideration in itself.

But hey… We’re only monkeys, right?

[“Monkey on a Mac” image credit: thegarethwiscombe – cheers also to Yousaf, Andrew and Mike for their contributions as well (their images courtesy of their respective Twitter profiles)]

Dislike a company? Leave your review somewhere other than Google…

Evil laugh at the ready?

Have you been a naughty, naughty company? Have your actions resulted in bad reviews on Google? Well now, don’t shake your fist and curse the ‘good guys’ just yet. Get ready to twirl your evil moustache, pet your Bond villain-esque feline and prepare your best maniacal laugh, for you’re about to find out how you can make those reviews disappear!

Reviews… Some people think they’re a waste of time, while others relish sharing their experiences, good or bad. Although some might argue that we leave reviews for purely self-obsessed reasons, I’d actually argue that it’s probably the other way around…

  • The Positive – A company’s done you well and you want to return the favour by spreading the good word about them.
  • The Negative – Usually a result of being stung by a bad service or sale and therefore wanting to warn others.

My reviews tend to fit those categories. I love giving good reviews to companies that deserve them, especially if they’re hardworking people who mean well (e.g. small, local suppliers). Likewise, I hate shoddy service and I hate seeing people get duped, so if I’ve been stung, I’ll happily warn others about the company responsible.

My personal weapon of choice? Google. Leaving a review on a Google Places listing (a.k.a. Google Maps, Google Local – call it what you will!) can help with a listing’s ranking, so if you truly want to help a company to get more business then that’s a good way to do it. The same applies with some third-party sites, such as Qype and Yelp. (See? I do actually talk about SEO from time to time on this blog!)

But what does a company do when it receives lots of negative reviews? Well, y’know… They could just delete their own listing…!

That’s what happened to a company I (negatively) reviewed recently. Well, I don’t know for sure if they deleted it, but I can’t see any other reason. It makes sense that they’d want to delete it, as I’ll go on to explain…

They had received six (SIX!) 1-star reviews, with the stars showing even before you clicked through to the listing – a company needs 5 reviews for the stars to show; otherwise it’ll still say how many reviews there are, but won’t show the stars until you click onto the profile itself. This meant that people could see the company’s average rating of 1 out of 5 just by doing a Google search for its name. Worse still, if the company showed up amongst the Map listings with its competitors for a general search relating to its service, the stars would show there, too (and its competitors either didn’t have any or enough reviews, or had more positive ones).

So what’s a company to do? Hell, if they’re lousy enough at what they do to annoy that many people that badly, why not do the unethical thing and wipe the slate clean rather than accept the (potentially) deserved onslaught? Your Google Places listing disappears, along with your chances of ranking in the Map results amongst your competitors, but hey, maybe it’s better that people find you another way, rather than via Google, where they’d witness the negativity surrounding your brand. As businesses are able to create or claim listings, they’re able to un-list and delete them, too. Like I say, I don’t know for sure that that’s what happened, but I can’t see it being anything else. They knew the reviews existed, as they tried to contest them (they commented on mine and others’, apologising and asking us to call their customer services person), so maybe they decided to go one step further? Well, it’s one way to sort out your brand reputation woes, I guess!

The review still lives in my Google account. I can login and read it. The link’s still there, but if I click it, I see this page:

Screenshot of Google Places 404

“We currently do not support the location,” which I’m pretty sure is the Google Places equivalent of a blank page or a 404 error.

So the next time you think of leaving a negative review of a company online, it might be best to look beyond Google. Maybe use another site instead? Then again, with other review sites possibly providing the same option (how else could someone remove the listing of a business that’s no longer operating?) and the likes of Yelp supposedly letting people pay to remove bad reviews (which I’ve heard about but can’t verify), where else can you go…?

…Customary moan on Twitter and Facebook? Sounds like a plan.

[Green Goblin image credit: doug88888]

Want more traffic? Teach the experts something new

I won’t lie… I’m a search engine geek. Since discovering SEO 2-3 years ago, I’ve become increasingly passionate about the subject. And as anyone who’s passionate about a subject will attest, every industry has its experts and its heroes. I have a few, one of them being Rand Fishkin of SEOmoz.

So I was delighted when, after liaising with him on Twitter, I discovered something and taught him something he didn’t already know, leading him to share the discovery with his 30,000+ Twitter followers. Here’s what happened…

Teaching an expert

A few weeks ago, Rand tweeted that he’d seen a weird search result, linking to a screenshot and saying that he couldn’t figure out why some of the results were ranking. Looking into it, I responded saying that I thought the anchor text of the in-bound links was helping at least one of the results – a result that didn’t even have the words on the page whatsoever. After all, how else would Google know to show that page for that keyword?

@randfish & @steviephil tweets

Although Rand agreed with my theory, he still wasn’t convinced that “it would be enough for such a tough-to-rank SERP.” I replied asking if he thought that perhaps negative/removed keywords could affect the anchor text of in-bound links as well as the on-page text.

@randfish & @steviephil tweets

At this point, I was tempted to leave it be, but after thinking about it for a while, I decided to dig deeper. Before Rand had the chance to respond, I investigated further and drew a few conclusions. To my delight, Rand responded positively and enthusiastically.

@randfish & @steviephil tweets

The next day, I detailed my findings in a post for the Liberty Marketing blog. Although it might have been a bit cheeky on any other occasion, I notified Rand of the post’s existence, seeing as we’d discussed it the day before and I thought he’d be interested.

@steviephil tweet

The result? Rand didn’t retweet my notification, but tweeted about it in its own right, mentioning me in the process – which was probably better than a retweet (it was certainly more presentable than what I’d written to him).

@randfish tweet

Unlike his replies, this tweet didn’t start with “@steviephil,” meaning it wasn’t sent solely to me… Instead, it was addressed to his followers. All 30,000+ of them.

For someone who loves SEO, loves learning new things (especially something that no one’s ever documented or picked up on before) and who also looks up to Rand and what he’s achieved in the industry, this was a huge honour. I was ecstatic.

But the purpose of this blog post isn’t to brag about what happened. It’s to talk about the benefits of making the effort I did, and to suggest that others try to do the same if and when they can. When Rand tweeted the first time, it was Sunday evening (UK time) – I could have ignored it. Hell, I could have missed it altogether, so I was lucky to have caught it and that I wasn’t busy with something else at the time. I persevered, and the end result was certainly worth the effort…

An influx of traffic

Rand’s tweet saw the Liberty blog and the website as a whole get a ton more traffic than usual. Unfortunately I don’t have access to Liberty’s Google Analytics account as I type this, although you can picture the graph: a huge peak on the date of the post, with a drop in the days afterwards.

I may not have Analytics access, but I do have account access, and I can tell you that this particular blog post had 30 times the clickthroughs of the blog’s other recent posts. We couldn’t believe it!

Other benefits

Okay, so admittedly, although the volume of traffic was great, one could argue that it was probably made up primarily of other SEOs. Although that’s still cool from a relevancy point of view (e.g. they may go on to browse other news and advice posts we’ve written), they’re hardly our target market. We want business owners to check out the Liberty site – they’re the ones who enquire and hire us for our services, not our industry peers.

However, there are still some great benefits arising from the tweet and the rise in traffic that can help Liberty in other ways:

Links: The blog post has acquired more in-bound links than some of Liberty’s other blog posts, probably because more people saw it, increasing the chances that someone would link to it. Also, SEOs – many of whom own blogs and understand the value of linking – are probably more inclined to link to it than other readers. Not only that, but we might also have a legitimate and genuine Wikipedia link opportunity, what with it being an industry discovery backed by research.

Retweets: Old- and new-style retweets combined, Rand’s tweet was shared about 20 times. Although the sharers themselves might mostly have been industry peers, their followers may not be. It’s not impossible that one or more of the retweeters was an SEO agency or freelancer in the UK with followers who might benefit from Liberty’s services – the retweet, along with the link to the blog post, now putting Liberty on their radar.

New Followers: Both Liberty and I earned a few more followers as a result of Rand’s sharing, some of whom have hopefully continued to follow us for future tweets and updates, both business- and SEO-related.

Pride: In my excitement, rather than retweeting Rand’s tweet, I tweeted about the whole thing separately, giving me a chance to word it how I wanted (a bit like Rand not retweeting my notification but putting it in his own words instead). It gave me the opportunity to call it a “massive honour,” while linking to the Twitter profiles of Liberty, Rand and SEOmoz, all in one tweet. Liberty shared it, as did Liberty’s PR agency, making it more visible to our local contacts.

@steviephil tweet

Authority: Linked to the above point (especially in terms of Liberty sharing the tweet), it helped to strengthen Liberty’s authority and standing in the SEO industry. A discovery like this shows that we know what we’re doing, which should give clients – current and prospective – confidence in our abilities, skills and knowhow.

Recognition: Now that Rand has seen what I/we can do, it might be easier to do something like this again, with him sharing another discovery. It’s a foot in the door – it’s not impossible that he’ll remember and recognise me in the future, especially as I’ve started to comment on a number of SEOmoz blog posts in my own right (and as I currently use the same avatar on my SEOmoz profile as I do on Twitter).

Networking: I’m a member (and a big fan) of BNI. It’s given me another thing to talk about and to tell people – in my opinion, saying “we taught an expert in our industry something new” is as impressive as saying “we helped to get a website higher in Google.” Although very few people in my chapter will know who Rand is (and that’s fair enough), they can always look into it afterwards, plus some people in related industries may already know who he is (e.g. web developers and social media specialists – I may not be an expert in either area, but I’ve still heard of industry experts in both).

Things to be careful about

I can’t see this type of thing working for everyone. I do think I was extremely lucky, in noticing and responding to the tweet and in taking the time and initiative to investigate and then write about the issue.

A big risk is the expert taking credit for the discovery themselves. Given Rand’s standing in the industry and his views on sharing with others, I knew he wouldn’t do such a thing (“that’s definitely a discovery worth sharing” was almost his way of saying “you should tell people about it”), but that’s not to say everyone would follow his example.

Alternatively, they might simply not share it. Rand might not have bothered to pass it on, even with my nudge/notification. Or they might not share it properly – I was lucky that Rand @mentioned me in the tweet as well as linking to the blog post, but others might only do the latter.

Which brings me on to a big point – not everyone is familiar with Twitter, and not everyone uses it. It may differ from industry to industry: Rand, in SEO, is a regular Twitter user, while an expert in another industry might not touch it at all.

However, for those who do, there’s no harm in keeping an eye on what they say and jumping on an opportunity to help when they want feedback, advice or input – it certainly worked well for me.