
MSN Fail: Video Of Car Crash Injuring 10 People Is “Funny”

Ok, so this is a bit off-topic for SEOno, but I’m wondering how long it takes before MSN UK realise the stupidity of what they’ve done and make the relevant changes…

This morning, I logged out of Hotmail (yes, I still use Hotmail, and yes, it’s on my to-do list to do something about it) and saw this on my screen:

MSN homepage screenshot
(As an aside, I’m loving the juxtaposition of the Sky ad – “dim the lights – for full effect…”)

O-ho, a video! “Car crashes through front of supermarket.” Like William H. Macy’s character in Magnolia! Brilliant. I’m sure it’ll be… hang on…

MSN homepage close-up
Are those people in the way of the car, including a woman with a pram?!

If you are sadistic enough (like me, apparently) to go beyond the homepage and actually watch the video, you’ll see it happen at normal speed as well as at about 4 different rates of slow motion. Tasteful.

You might also notice this:

MSN video page screenshot
Can’t see that? Let’s zoom in…

MSN video page close-up
Category: “Funny Videos”?!

Facepalm kitteh
Oh, MSN… That’s pretty bad taste. Turns out 10 people were injured, including “3-month-old baby Tyshawn Davis, who was in the stroller visible in the video and escaped with minor injuries.”

Reminds me of the lyrics to the Faith No More song “Ricochet”:

It’s always funny until someone gets hurt…
And then it’s just hilarious!

On closer examination, it turns out that that “Funny Videos” link appears at the top of every video on the MSN UK website, regardless of the content. But still – I was fooled by it, so how many others would be? I’d say this is a big user experience issue: maybe MSN should consider not showing a link that says “Funny Videos” on a video that clearly isn’t funny, even if said link isn’t strictly associated with that particular video…

[Itty bitty kitty facepalm image credit: Robyn Anderson]

#SMsceptic: True Twitter Authority Is All About Follow Ratio

100,000 Cupcakes
Let’s begin with a (slightly rude/NSFW) quote:

“Having the most followers on Twitter is akin to having the most imaginary friends, the biggest Gamerscore, or the world’s longest e-penis. In other words, what does it mean in the real world? Precisely f*** all.”

A friend of mine wrote that on his Facebook profile a while back. He was annoyed because a friend of his was paying a lot of money to see a social media professional for social media training. This professional’s big, bold unique selling point was that he had a lot of followers, the most in his chosen field and area of expertise, apparently. So he must know what he’s talking about and be good at what he does if he’s that popular, right? And fair enough, he did have a lot of followers. I saw his profile and he had about 100,000 followers on Twitter. Nice!

The only problem? He was also following about 100,000 in return. His Follow Ratio was pretty much 1:1.

Why do I have a problem with this? A few reasons:

Quantity can be gamed: Auto-follow tools such as TweetAdder make it easy for someone to obtain a large number of followers. Set one up to automatically follow people based on various criteria (e.g. their location, keywords in their profile’s Bio, etc.). Eventually, as you’ve gone to the effort of following these people, some will follow you back – and you can even automatically unfollow those who don’t reciprocate after a certain amount of time. Rinse, repeat, and after a while, voilà: you’re “popular” (read: you look more popular).

Why do I say “look more popular” when they could be genuine followers? Well…

You could be preaching to – or rather, following – the choir: What if the 100k that you’re following – to get 100k people to follow you back – are doing exactly what you’re doing? Then it’s purely a numbers game – you’re not reading their tweets, and they’re not reading yours.

…And why do I say that? Well…

It’s impersonal: I think it’s pretty safe to say that if someone is following 100k people, they’re not actually reading the tweets in their Twitter feed. I follow 200+ people I genuinely care about as I type this, and I struggle to keep up! In fact, at an event I went to a while ago, one of the speakers – who gave a talk on Twitter – said that you should just follow lots of people from your business profile, and use a separate/personal profile or a Twitter List to follow the people you actually want to keep up-to-date with. Umm… no thanks, that’s not for me.

Scrambled Numbers
Quantity isn’t everything: Social media isn’t necessarily about having lots of (or the most) followers. As I’ve said before (point #12), I’d much rather have 10 followers who care about what I have to say than have 10,000 followers who don’t and who only follow me so that I follow them back and beef up their stats. As always, quality trumps quantity.

And at the end of the day…

It’s snake oil – it’s tricking potential customers/clients: I know all this, and I’m assuming most other online marketing professionals reading this know all this, but does your average Joe Bloggs – who wants to learn how to use Twitter for business – know to watch out for it? Probably not. My friend’s friend didn’t.

So why is Follow Ratio (FR) important? Well, compare the above gent’s ratio of 1:1 (followed by 100k, following 100k) to someone who truly is an authority. If someone is followed by 100,000 people but is only following 100 in return – their FR being 1:1,000 – then it seems far more likely that this individual is being followed because people genuinely care about what they have to say. The person doesn’t have to follow people back, and they will still follow him/her.
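To make the maths concrete, here’s a quick, hypothetical sketch of those two ratios – the numbers are the rough figures mentioned above, not real account data:

```python
def follow_ratio(followers, following):
    """Express a profile's follow ratio as 'following : followers',
    normalised so the smaller side is 1 (a rough illustration only)."""
    if followers == 0 or following == 0:
        return "n/a"
    if followers >= following:
        return f"1:{round(followers / following):,}"
    return f"{round(following / followers):,}:1"

# The social media trainer: ~100k followers, ~100k followed in return
print(follow_ratio(100_000, 100_000))  # -> 1:1

# A genuine authority: 100k followers, only following 100 back
print(follow_ratio(100_000, 100))      # -> 1:1,000
```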

Fortunately, contrary to what I’ve said above, I think people are gradually getting wise to this. SEO has had a similar problem: it seems logical to think that the people ranking at the top of Google for a keyword like “SEO agency” are the best at what they do, but what if they’ve gotten there via dodgy/spammy means, or it’s a keyword that looks good but doesn’t actually get much search volume? Meanwhile, Twitter does have Klout as a metric, but then it isn’t exactly accurate (and I believe Klout doesn’t currently take followers into account)…

To me, what’s important are things like reviews, testimonials and word-of-mouth. Fair enough if this social media trainer with a 1:1 FR is actually really good at giving social media training, but in my opinion, they shouldn’t use “I have lots of followers” as a USP when such a thing can be easily manipulated (and – judging by his profile – probably has been).

Funnily enough, as I was going to publish this post, someone on my Twitter feed complained about how people he knows are falling for follower numbers. Using Wakelet (formerly Storify), I’ve included the tweets and @mentions between me and two others: @NeilCocker and @tombeardshaw. (More people and tweets were involved in the discussion, but as some of the tweets went a bit off-topic and became quite negative – pinpointing a particular individual guilty of the practice – I’ve only included a few of them.)

[Image credits: 100,000 cupcakes by Adam Tinworth (because everyone loves cupcakes!); “Scrambled” by Nick Humphries]


My Top 10 Takeaways from #BrightonSEO April 2012

Brighton SEO
The only pic I took – terrible quality, I know!

Having been in SEO for over 3 years now, I was long overdue a conference. Thankfully, we won some tickets to Brighton SEO and so I headed there with Liberty colleagues Andrew (@Andrew_Isidoro / @SEOFoSho) and Ceri.

While Andrew tweeted like a madman (this tweet sums it up well!), I made a ton of notes, totalling 1,000 words – good fun on an iPad, let me tell you…!

Anyway, here are my top 10 takeaways from the event:

1. Bing: Social is a “strong signal” for content
Talk: Panel – Ask the Engines with Pierre Far, Dave Coplin, Martin McDonald, Rishi Lakhani & Tony Goldstone

Straight from the horse’s mouth – Bing’s Director of Search Dave Coplin explained that social is used as a ranking signal in Bing. He even specified that they definitely take Facebook and Twitter into account, and those whose efforts are “bloody good” will be rewarded with better rankings.

2. ISO DateTime gives search engines context to dates
Talk: Microformats & SEO – Glenn Jones, @glennjones

I’m still fairly new to the head-scratching-inducing world of schema.org and rich snippets, but I thought it was cool that “ISO DateTime” can give context to dates that search engines will understand. With so many ways to write a date (17th Apr 2012, 17/04/12, 2012-04-17, and so on), it can be used to clarify a date in one standard format. It can even be used when a date isn’t written explicitly but is still implied (e.g. “next Tuesday”).

Glenn’s slides can be found here. See slide 17 for more info.
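For anyone curious what that looks like in practice, here’s a quick sketch of my own (not from Glenn’s talk): Python’s datetime module produces ISO 8601 strings, which is the same unambiguous format you’d put in an HTML5 <time> element’s datetime attribute.

```python
from datetime import date, datetime

# "17th Apr 2012", "17/04/12" and "2012-04-17" all boil down to one ISO form:
print(date(2012, 4, 17).isoformat())             # -> 2012-04-17

# With a time component as well:
print(datetime(2012, 4, 17, 9, 30).isoformat())  # -> 2012-04-17T09:30:00

# In markup, the human-readable text can stay vague while the machine-readable
# attribute carries the ISO value, e.g.:
#   <time datetime="2012-04-17">next Tuesday</time>
```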

3. What info to include when reporting on online PR
Talk: How you can get BIG links from BIG media sites – Lexi Mills, @leximills

Lexi’s talk was by far my favourite at the event. In terms of reporting on online PR efforts, one should consider including:

  • Domain Authority of the site (not PageRank of the page: the article/content will be a brand new page on the site, so its PageRank will be low (n/a) to begin with, making DA the more sensible metric to use),
  • Whether the link is dofollow or nofollow,
  • Whether the link is an image or text,
  • The anchor text of the link.

I think the same easily applies to guest blogging as well.

4. Follow #prwin, #prfail and #journorequest for potential online PR opportunities
Talk: How you can get BIG links from BIG media sites – Lexi Mills, @leximills

Another gem from Lexi. Keep an eye on the above hashtags for an opportunity to strike.

My tip: Want to filter it by industry? Add a keyword after each one, e.g. #journorequest fashion. You could have one (or a few) per client/site.

5. Tell clients their month-average ranking as well as/instead of their current ranking
Talk: Maximizing your SEO Agencies – James Owen, @jamesoSEO

It’s happened to all of us… When we give our client their end-of-month report, they’ve performed consistently well all month, and then Sod’s Law strikes and on the 29th or 30th they drop a few places. We give them their current rank and they wonder if it’s been like that the whole time…

In those situations, it might also be worth including their average ranking over the month, so that you can say “yes, it is nth right now, but look at where it was before…!” Especially handy if it’s a temporary dip.

6. Say “Did I explain that clearly?” instead of “Did that make sense?” or “Did you get that?”
Talk: Sell the Sizzle, Not The Search: Tactics for Appeasing Marketing Directors – Chelsea Blacker, @ChelseaBlacker

This is very timely for me. I’ve been meaning to write a post about sales/networking tips for non-sales people, and although Chelsea’s talk was aimed at Marketing Directors and others within an organisation, I think it applies to any/all environments involving laymen.

After exploding someone’s head with overly-technical information, I’ll often say something like “do you know what I mean?”, which might leave the listener feeling a little silly (albeit unintentionally). However, “did I explain that clearly?” is a softer approach and – chances are – I probably didn’t explain it clearly, so it’s more accurate, too.

For me personally, this has been one of the most valuable takeaways of the event. Thank you Chelsea!

7. Use competitor downtime to your advantage…
Talk: Enterprise SEO Titties (was that a typo or the actual title of the talk in the end?!) – Tony King, @ToastedTeacake

All’s fair in love, war and search…

We all know that competitors bid on each other’s brand terms using PPC (especially big brands), in an attempt to cheekily pinch each other’s traffic before it reaches the site. But Tony made a very good point – if you notice that one of your main competitors is experiencing website downtime, increase your bids on those terms. That’s the time to strike, offering yourself as a (functioning) alternative to frustrated customers who could use you instead of waiting for their usual port of call to get itself sorted and fixed…

It’s cheeky as hell (although brilliant, mind you), but hey – they’d probably do it to you, too!

8. Shape your response to emotional highs (and use SEO and PPC accordingly)
Talk: SEO & PPC Working Together in Harmony – Tim, @JellyfishAgency

Use SEO and PPC together, but for different reasons. As PPC can be turned on and off very quickly and ads can be shown at certain times of the day, it can be used to drive people to a website at a time when they might be feeling an “emotional high,” as Tim put it. Don’t just rely on SEO when PPC could be used to draw in additional traffic that may be more inclined to read, react or buy than usual.

EDIT: Sorry, it was Tim who was speaking, not Craig! Cheers to @JellyfishAgency for clarifying!

9. Author Rank could be swayed by industry
Talk: I Believe Authors are the Future – James Carson, @mrjamescarson

James’ talk was interesting – it’s early days for the likes of Author Rank, rel=author, etc., but it’s clear that Google is becoming more and more fixated on this area as time goes on.

James has a theory that in the future, Author Rank could differ by industry. Rather than a well-respected, high-ranking author always ranking well no matter what they publish, Author Rank could be determined by how consistent their output is with the industry they’ve had previous successes in. For example, if a famous fashion blogger suddenly blogged about football, it may not necessarily rank well – even if their fashion posts usually do – because it is inconsistent with what they’re known and respected for.

10. Mascots can cause a reaction (but be a distraction)
Talk: I appear to have started a sweetshop (and advertising company) – Dom Hodgson, @Thehodge

Dom easily wins the award for the most entertaining talk of the day (as I’m sure fellow attendees reading this will agree…)

Dom originally used a mascot – a “f***ing squirrel,” as he so eloquently put it! – on the first design of his sweet shop website. Although they had a lot of social media mentions revolving around said mascot to begin with (“did that squirrel just f***ing wink at me?!”), showing promising initial signs that his(?) inclusion was a good move, they decided to “kill” the squirrel and eventually removed it from the site. Why? Because an eye-tracking test showed that visitors were distracted by the squirrel – in some cases so distracted that it might have been putting them off buying anything.

I found this fascinating. It just goes to show that even if people say something positive via social, it may not actually be a positive for the website or company.

11./Bonus: Advanced Search String Queries for SEO
Talk: Word from a Sponsor – Analytics SEO, @analyticsseo

Ok, so I lied – here’s an 11th takeaway. While writing this post, I remembered another gem from one of the sponsors, Analytics SEO, who used their ‘sponsor message’ section to share their list of advanced search string queries for SEO.
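I don’t have their exact list to hand, so to give a flavour of what “advanced search string queries” means, here are a few of my own examples built from standard Google operators (site:, intitle:, inurl:) – illustrative only, not Analytics SEO’s actual list:

```python
keyword = "blue widgets"  # placeholder topic

# A handful of common link-prospecting style queries built from standard
# Google search operators (illustrative only):
queries = [
    f'"{keyword}" intitle:"write for us"',  # find guest post opportunities
    f'"{keyword}" inurl:resources',         # find resource/links pages
    f'"{keyword}" intitle:"guest post"',    # pages already hosting guest posts
    f'site:example.com "{keyword}"',        # check a specific site's coverage
]

for q in queries:
    print(q)
```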

So that’s it! That’s some of the words from the 1,000-word tome that’s left me with aching fingertips and a low iPad battery…

I’d like to take the opportunity to thank a few people:

  • Analytics SEO, who ran the ticket competition and are therefore the whole reason I managed to go,
  • Kelvin (@kelvinnewman), the event’s organiser, for his help and patience with the infamous ‘ticket confusion’ on Thursday,
  • The Brighton SEO blog (I’m assuming Kelvin again?), as I used their list of last-minute Brighton hotels when booking accommodation – a great idea for appropriate, helpful content,
  • Emily (@ems_ob), for the catch-up,
  • The man who bought me a shot of sambuca because I apologised for accidentally queue-jumping him at the bar at the afterparty. Alcohol + poor memory (generally) = I’ve forgotten your name, but if you tweet me and remind me then I’ll edit this post and link to you as promised. (And before anyone tries pulling a fast one, I’ll know the name when I see it!),
  • The magician (@mcrmagic), for blowing my mind to smithereens.

Oh and for anyone reading this who enjoyed the karaoke at the afterparty, I’m the guy who sang the Foo Fighters song. I apologise for the high bits!

EDIT (03/05/12): I thought I’d share this awesome infographic as well…

Brighton SEO Infographic April 2012

Infographic Design by ShellShock uk


The Sad Truth: Black-Hat SEO is Working…

There have been some great articles recently on the topic of white-hat vs. grey-hat vs. black-hat SEO, regarding what’s working and what Google’s doing about the situation, and so I wanted to weigh in with my recent observations.

While the likes of “SEO evangelist” (and one of my SEO heroes) Rand Fishkin argue that white-hat really does work and should be the way to go, others have vented their frustrations, saying that Google isn’t making it easy to be a pure white-hat anymore; that you may have to turn a little ‘grey’ in order to succeed. In the meantime, black-hats are the ones who are ultimately winning – while we all wonder, worry and play by the rules, they’re doing fine and reaping all the benefits the top of the SERPs can bring.

As that self-confessed link spammer put it in the SEOmoz comment I’ve linked to above: “Don’t blame me for doing it, blame Google for not fixing it.” And that’s part of the problem: it’s all well and good being white-hat, honest and ethical, and for Google to ask us to do it this way, but they’re not exactly doing us any favours if they’re not weeding out the link spammers as well.

I’ve been doing link building full-time for over 3 years. In all that time, I’ve looked at a lot of backlink profiles, mainly those of my clients’ competitors. I have to say, though, that of all the clients and industries I’ve looked into, I’ve seen more dodgy backlink profiles in the last few months than in all the years previously. What makes it worse is that they belong to the sites currently ranking well in (and often at the top of) Google for their industry head terms.

In this post, I’m going to share my experiences – the types of links I’ve seen that seem to be working for these sites – and what I think Google should be thinking of doing in order to counter it (which is certainly easy for me to say, and easier said than done, I’m sure).

Disclaimer: Before I continue, I just want to say that I won’t be naming and shaming. Apart from the fact that I don’t want to get into any legal or libel trouble (*coughcowardcough*), it’s simply not my style. What I am hoping is that fellow SEOs will read this, nod along and say “yep, I’m seeing this too, and I’m equally annoyed and gutted by it.” Besides, when Eppie Vojt wrote a similar, excellent post outing a website and its crappy links, the site being scrutinised had been spotted and banned by Google before his post had been published (while the sites I’m talking about are still ranking).

Is black-hat victorious? A backlinks case study

Black-Hat - Stormtrooper image
The first (and main) site I will be looking at is in a similar vertical to the one Eppie outed – a very competitive financial sector. Similar to AutoInsuranceQuotesEasy.com (previously) ranking 2nd for “car insurance” in the US, my example was* ranking 2nd for its head term in the UK – it is an affiliate site, with a non-branded exact-match domain URL, fighting some of the biggest household-name brands in the UK (and – for the most part – beating them). I know this industry and its SERPs pretty well, as I used to work on a site that is currently one of the ones ranking near it (read: below it). I’ll call this site Example X (or EX).

* I say “was” because when I first started typing this post, it was 2nd. However, it’s now 8th. Typical, eh? Still, it’s on the 1st page, and therefore it hasn’t been entirely nuked by Google, unlike Eppie’s example. Perhaps it took a hit from the recent blog network devaluation, if it’d been guilty of that practice, too – a quick glance at its overall backlink profile suggests that it could have been.

The second cluster of sites I will be examining is in a very competitive industry relating to law and legal services. I have been working on a client’s site, and while we’re trying everything we can that is white-hat and by-the-book, the people ranking at the top are pretty much as black-hat as can be… I’ll call this group of sites Cluster Y (or CY).

I have mainly used link analysis tool Majestic SEO to analyse these sites’ ‘best’ backlinks, based on their ACRank score. I love using Open Site Explorer, but I find Majestic’s results more interesting in this instance. Besides, at a quick glance, many of Majestic’s top 100 links for Example X overlap with OSE’s top 100, which are ordered by Page Authority.
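For anyone wanting to repeat that “quick glance” properly, here’s a rough sketch of the kind of overlap check I mean – assuming you’ve exported each tool’s top 100 linking URLs to CSV (the file names and column name below are made up):

```python
import csv
from itertools import islice

def top_urls(csv_path, url_column="URL", limit=100):
    """Read the first `limit` linking URLs from an exported CSV report
    (the column name is hypothetical - adjust to whatever the export uses)."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        rows = islice(csv.DictReader(f), limit)
        return {row[url_column].strip().lower() for row in rows}

majestic = top_urls("majestic_top_backlinks.csv")  # made-up file name
ose = top_urls("ose_top_backlinks.csv")            # made-up file name

overlap = majestic & ose
print(f"{len(overlap)} of the top {len(majestic)} links appear in both reports")
```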

Example X‘s strongest links

When Example X is thrown into Majestic SEO (root domain, Fresh index), its top 100 backlinks range from 7 to 4 in ACRank. The majority of them reveal themselves to be blog comments (similar to Eppie’s example: 43% of AutoInsuranceQuotesEasy.com’s backlinks were comments), but there’s more to it than that. These blog comments have a lot in common:

  • The majority of them are not at all relevant to EX‘s industry, including sites to do with technology, software, gardening, fashion, politics and even gambling,
  • The majority of the comments are not relevant to EX’s industry either – some of them are dodgy in themselves, including gambling, adult and Viagra comment links,
  • Some are in English, but a lot of them are in foreign languages, including French, Russian and Japanese, and therefore also on foreign domains (e.g. .fr and .jp),
  • Many of them have hundreds (if not thousands) of blog comments, which are not only dofollow but are also seemingly unregulated (i.e. they go live instantly without needing approval). This also means that each of these pages links out to hundreds or even thousands of external sites,
  • As you can imagine, the comments themselves are pointless and contribute absolutely nothing to the posts; they’re your typical “great post” type comments in broken English (possibly produced by spinning software),
  • In some cases, you can actually see ‘broken’ HTML code, where either the commenter hasn’t configured the link properly, or the blog doesn’t allow – or doesn’t have the capability to show – in-comment links (i.e. the link is coming from the commenter’s ‘Name’ field instead).

Of the blogs that still have a comment form – and therefore the capability to add to the almost endless list of spammy, useless comments – most have a CAPTCHA, which suggests one of two things: 1) the blog commenting process isn’t entirely automated – someone still has to manually enter the data into the fields and type in the correct CAPTCHA code each time, or 2) it is automated, and CAPTCHA-filling software is being used (kudos to @Andrew_Isidoro for pointing out that such software exists, as I – somewhat naïvely – had no idea). Call me a sceptic, but my money’s on the latter.
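Purely to pull those observations together, here’s a rough, hypothetical sketch of how you might flag the kind of comment links described above – the fields and thresholds are invented for illustration, not anything Majestic (or Google) actually exposes:

```python
def looks_like_comment_spam(link, topic_keywords, max_external_links=100):
    """Crude check against the traits seen in Example X's backlink profile.
    `link` is a dict with hypothetical fields; thresholds are arbitrary."""
    page_text = link["linking_page_text"].lower()
    comment = link["comment_text"]

    off_topic = not any(k in page_text for k in topic_keywords)   # page isn't about the topic
    link_farm = link["external_link_count"] > max_external_links  # hundreds of outbound links
    throwaway = len(comment.split()) < 5                          # "great post" style comment
    broken_markup = "<a href" in comment or "[url=" in comment    # visibly broken markup

    return sum([off_topic, link_farm, throwaway, broken_markup]) >= 2

example_link = {
    "linking_page_text": "An article about growing tomatoes in pots...",
    "anchor_text": "cheap blue widgets",
    "comment_text": "great post thanks",
    "external_link_count": 4000,
}
print(looks_like_comment_spam(example_link, ["insurance", "finance"]))  # -> True
```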

Cluster Y‘s general backlinks

I know what you’re thinking: for Example X, I’ve only really properly assessed 100 links. It might be what Majestic considers to be its 100 best links, but it is just 100. The site has more than 5,000 in total, so I’ve only really scrutinised 1-2% of its backlink profile.

Of course, I have trawled through the entire list of 2,500 backlinks that Majestic came back with (and, let’s be fair, it’s not much better)! But even if the rest of its links had been perfect, by-the-book and ticking all the boxes as far as Google is concerned, the likes of Rand argue that Google has to allow for a bit of spam anyway, as no one has a 100% perfect link profile (after all, if your competitors get dodgy sites to link to you, it shouldn’t be considered your fault, but if the majority of your links are spammy, it’s more likely your own doing). That said, even that’s hotly debated at the moment (see Reason #1 of this post), but I digress…

Cluster Y is a different story, though. I’ve looked at 5 sites, all ranking for the industry head terms on page 1. They only have a few hundred links each. While there are some non-relevant blog comments à la Example X, there are also a lot of footer and blogroll links going on. And guess what? They’re all coming from completely off-topic sites. Out of the hundreds of backlinks each of the sites has, 4 out of 5 of them do not have what I would classify as even ONE relevant link. Ironically, the one that does ranks the lowest of all 5, but it still has a lot of non-relevant links as well. The site I’ve worked on ranks lower than all 5 of them, and I’ve tried to keep its links as relevant as possible.

So what’s the problem?

The problem is that Google has guidelines, and Google employees – like Matt Cutts and his Webmaster Help videos – like to recommend white-hat ways of doing SEO. For the most part, these sites are doing the absolute opposite and succeeding because of it. How so?

Relevance – Google say it themselves:

“Your site’s ranking in Google search results is partly based on analysis of those sites that link to you. The quantity, quality, and relevance of links count towards your rating. The sites that link to you can provide context about the subject matter of your site, and can indicate its quality and popularity.” (Emphasis added.)

Wil Reynolds has commented on this as well. As I’ve established above, there’s barely any instance of relevance in either Example X or Cluster Y‘s backlinks. The only relevance I can see is the keyword anchor text: the site might be about blue widgets and the anchor text might be “blue widgets,” “blue widget,” “cheap blue widgets,” etc., but the sites these links are coming from are not about blue widgets – instead they’re blog posts on different subjects, with hundreds of blog comments linking out to completely different sites in completely different industries each time.

Large numbers of out-bound links – Back in 2009, Cutts stated that pages shouldn’t have more than 100 links on them – internal or external – if it can be helped. However, since then, it has been highlighted as more of a technical guideline than a quality guideline, and it’s particularly fine if “your page is sufficiently authoritative.” Now, going by Majestic’s ACRank, these pages are considered sufficiently authoritative, which is precisely the reason they’re being spammed to death (that and the fact that they’re dofollow, etc.)

But c’mon, Goog… These pages have links in the 100s, if not 1,000s – one even has about 4,000 out-bound links on it, due to the excessive number of comments on the page. Maybe it should be a quality guideline? You could argue that it’d be unfair for the website to be devalued by Google and to suffer because of something outside of its control, but then again, it is in the webmaster’s control: they could delete the comments and/or disable commenting altogether. After all, it’d probably be better to have no comments at all than a plethora of nonsense adding absolutely squat to the topic of conversation. This also ties in with Google’s stance on ‘bad neighbourhoods’ and the fact that, for these blogs, where you’re linking out to is also important, not just where you’re being linked from.

The white-hat’s dilemma

White-Hat - R2D2 image
What really sucks about all this is the fact that these mad (black-)hatters are getting away with it.

Yes, you can be white-hat and work hard and slave away on getting 100% genuine, honest, by-the-book links, but when your competitors can spend 5 minutes using a bit of automated software to get really powerful – but really irrelevant and dodgy – links and succeed, then what’s the incentive for the white-hat? Work hard, be good and still not rank?! Sod that! Clients won’t thank their agencies for it, and Marketing Directors won’t thank their in-house team for it. And this is precisely the risk Google is running by not tidying up this crap: “if you can’t beat ’em, join ’em” may become the motto of many honest, hard-working white-hats who just aren’t seeing the benefits of all their hard work.

But what’s a search engine giant to do?

Right, now we enter the part of the post where I descend into ridiculous self-righteousness…

It’s easy for me to say “this is what Google should do” – especially as I don’t think I could ever work for them, as their crazy interview questions leave me utterly dazed, confused and feeling stupid – but that said, I think the above certainly highlights that there is a problem and that it needs to be seen to and fixed. Easier said than done, yes…

The first thing they should be thinking of doing is turning up the ‘relevancometer’. It’s true that completely irrelevant, off-topic sites can and do link out – for example, someone on a gardening forum might ask its members for advice on car insurance, just because they trust the community and value their opinion – but when every link is irrelevant and off-topic then that seems a little more than coincidence…

I think they should also consider making excessive numbers of links a quality guideline, not just a technical one. I can’t see why a page’s links overall – not just the most recent ones – shouldn’t be given slightly less value as more links flood onto the page. Obviously it’s already part of Google’s PageRank algorithm – more links on a page means less ‘link juice’ is allocated to each linked page – but I’m talking about a limit where that drops even further; maybe not in terms of PageRank, but in terms of looking at a page, and if it hits a certain limit, its value is affected slightly, and then if it hits another limit, it’s affected even more, and so on.
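To illustrate the arithmetic: below is the classic “passable value divided by number of links” simplification, with the tiered extra devaluation being purely my speculative reading of the idea above, not anything Google has confirmed:

```python
def value_passed_per_link(page_value, num_links, damping=0.85):
    """Simplified PageRank-style split: a page's passable value is shared
    equally between its outbound links."""
    base = damping * page_value / num_links
    # Speculative extra devaluation tiers for pages drowning in links:
    if num_links > 1000:
        return base * 0.25
    if num_links > 500:
        return base * 0.5
    return base

# A normal page vs. the kind of comment-flooded pages described above
for n in (30, 500, 4000):
    print(n, round(value_passed_per_link(1.0, n), 6))
```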

In fact, if this isn’t a consideration already, what about working out a ratio between page size/copy length and the number of links on the page? For instance, I’ve linked out to a lot of sites within this post, but then it’s also 3,000 words long. 30 links for every 3,000 words is a bit different to having hundreds of links within less copy than that. Naturally, having comments following a blog post – genuine or otherwise – is going to affect that ratio significantly, but I think it’s safe to say that natural, genuine blog comments are going to be longer and more substantial than your typical, spammy “great post” style comments.
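A minimal sketch of what that ratio might look like – the cut-off is plucked out of the air purely for illustration:

```python
SUSPICIOUS_LINKS_PER_1000_WORDS = 50  # arbitrary, illustrative threshold

def links_per_1000_words(num_links, word_count):
    """Outbound links relative to the amount of copy on the page."""
    return num_links / (word_count / 1000.0)

# This post: roughly 30 links in ~3,000 words
print(links_per_1000_words(30, 3000))    # -> 10.0 (fine)

# A comment-flooded page: ~4,000 outbound links under ~2,000 words
print(links_per_1000_words(4000, 2000))  # -> 2000.0 (way over the threshold)
```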

Not only should they consider turning up the ‘relevancometer’ from an on-site copy or general domain point of view, but also take into account the relevancy of other out-bound links. If you’re the only link about “blue widgets” on the page and you’re featured among dozens of other links, each one linking to a site on a completely different subject, then it really only spells one thing: link farm.

While I’d say the above should be serious considerations, we’ll call the next lot ‘maybes’ – I don’t even know if it’d be feasible (or even possible) to assess some of these criteria…

By language/location: Arguably, this depends on industry. For example, a car insurance website is probably not going to get many international links because it’s specific to an individual country, although it might if it does something worthy on a global scale, e.g. a successful viral campaign that’s admired worldwide. Other companies in other industries might be global. However, the EX and CY examples above are UK-specific, yet they have an irregular number of links from sites in other European and Asian countries. It’d be great if Google could measure the authenticity of these links in a way that works out whether the site being linked to warrants the link, i.e. is it the subject of the discussion, or just another random link appearing on the page?

By blogging platform: Speaking of foreign-language sites, I noticed that some of the sites being spammed like crazy in the comments were French, and that most of them had a very similar blog structure and set-up, so I’m wondering if Google has the ability to discount or discredit certain blogging platforms, assuming this is a blogging platform targeting the French-speaking market. Again, this is a bit unfair on all those who are using their blog properly and are strict when it comes to monitoring comments, but, as I’ve argued above, it could be the blogging platform developers’ fault for not configuring their features properly for their users, e.g. maybe the spam filter isn’t strong enough, or there isn’t even one in place. I don’t think this is too outrageous a consideration, especially given the fact that some blogging platforms are seen to be more SEO-friendly than others.

By the extent of the visibility of ‘bad’ code: By “visibility,” I mean exactly what I say: not just bad code by the search engine spiders’ standards, but bad code that people browsing the site can actually see for themselves, too. For example, when these spammy blog comments have been entered irrespective of the blog’s guidelines and you can actually see the <> of broken HTML code (I even saw the [] of broken BB code, in a couple of instances), then this could – and maybe should – be an indication of improper on-site technical practices. It goes without saying that Google likes clean code (insofar as the site is easy to navigate for both search engines and humans), although I can’t say that I’ve come across any reference to visibly bad code, but maybe that’s because it’s just so obvious that it’s probably not a good thing to have on your site.
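As a rough sketch of what spotting “visibly bad code” might look like (my own illustration – I’m not suggesting this is how any search engine actually does, or should, do it):

```python
import re

# Fragments of markup that should never appear as visible text on a rendered page
VISIBLE_MARKUP_PATTERNS = [
    re.compile(r"<a\s+href=", re.IGNORECASE),  # unrendered HTML anchor
    re.compile(r"\[url=", re.IGNORECASE),      # unrendered BBCode link
    re.compile(r"\[/url\]", re.IGNORECASE),
]

def has_visible_broken_markup(rendered_text):
    """True if the text a visitor actually sees contains raw link markup."""
    return any(p.search(rendered_text) for p in VISIBLE_MARKUP_PATTERNS)

comment = 'Great post! <a href="http://example.com/blue-widgets">blue widgets</a>'
print(has_visible_broken_markup(comment))  # -> True
```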

Again, I’ll repeat: it’s easy for me to say all this – a Google engineer might read this and say “we KNOW, get off our case!” – and it’s clear that, even recently, they are apparently working very hard to try and solve the problems associated with spam affecting their index. Even so, I’m hoping that some of my observations indicate the seriousness of the problem and help them in their quest for a search engine that’s swayed by spam as little as possible (if that can ever be achieved). Heh, I was even going to say that maybe they should consider hiring someone who’s worked in the SEO industry as part of their engineering team, although it looks like they’ve already done something like this for their Google Webmaster Trends Analyst team.

Whatever the case and whatever the future holds, here’s to hoping that black-hats shall rue the day (one day) and that white-hats will see their efforts pay off…

[Image credits: Black-Hat Stormtrooper by Jay Malone; White-Hat R2D2 (a.k.a. one of the coolest pics I’ve ever seen) by Brittney Le Blanc]

SEOno Returns

Just a quick bit of news…

In my last post, I said that SEOno would be on hiatus until June. However, it looks as though my circumstances have changed, with my CAM Diploma deadline moving from May to September, which should give me some time in between to blog on here again.

I also realise that it’s been a while (August) since I wrote a post listing guest blog posts and articles I’ve written for other sites, but I think there’s only been the one since then anyway, which appeared on Fresh Business Thinking in October: How to Find Opportunities and Mentions of Your Business Using Google Alerts and Twitter. Should be a few more in the pipeline though…

And that’s all there is to say for now really… I’m not used to typing posts that are this short!