Tactics for a Successful Public Vote Strategy – How I Became a UK Blog Awards Finalist

I’m excited to be a finalist in the UK Blog Awards for the second year running, this time in the Digital & Technology category. The first phase was a public vote, and although I put a fair bit of effort into it, I’m certainly no expert – proof of that is that I only made the finals in one of the two categories I entered, which suggests that the competition this year is a lot fiercer than in previous years…

I wanted to share my tactics on how I put the word out asking people to vote for me. This is by no means an exhaustive list, and some of them may be really obvious, but who knows… you might try different things next year and it might make all the difference.

Blog-related

Blog about it

First things first… Blog about it! I wrote a post about it (“Vote for SEOno in the UK Blog Awards 2016!”) containing the ‘vote now’ image, which linked to my dedicated entry page.

Add a site-wide ‘vote now’ button

You can take this further by adding a site-wide ‘vote now’ button. I put mine in my blog’s site-wide left-hand column. This is handy in case someone doesn’t see the dedicated blog post on the subject and instead visits another section (such as the homepage, the About page, the Contact page or a random post).

Social media-related

Twitter

Twitter is a no-brainer, and I reckon the biggest ‘pull’ of votes in my case.

I wouldn’t hesitate to tweet multiple times. I tweeted every 2-3 days during the voting period, varying the times and days. Use something like TweetDeck or Buffer to schedule your tweets (so you can get them all ready in bulk, instead of having to remember to send them manually), and something like Followerwonk to find out the best time(s) of day to tweet based on your followers’ activity.

Followerwonk example screenshot
Another way to vary your tweets on the subject: RT other people’s tweets about it. So if someone else tweets to say that you’ve entered (@mentioning you in the process), you could retweet that instead of sending a standalone tweet from your own account.

I also tended to vary whether or not it contained an image (either no image, or the screenshot from the entry page, or the one provided by UKBA themselves), and also varied the landing page (mostly the entry page itself, but sometimes I drove people to the blog post instead).

Oh and lastly… Consider pinning one of the tweets on your profile – ideally one with an image (such as the ‘vote now’ image that UKBA provided, in my case). For people who randomly stumble across your Twitter profile, they’ll see it – and even if they don’t end up voting, it still looks good to show off.


The 1-Star Sucker-punch – Dropping the Ball on Online Reviews

As SEOs we often have our focuses and our biases: our remit is to help improve clients’ visibility in search engines, after all.

However, when working with SMEs in particular, you might be their go-to guy/girl for all their online marketing questions – not just SEO. I always try to offer help and advice on other areas if I can – such as social media and UX – but ultimately some things slip through the cracks. This post is an example of how giving the client too narrow a focus can actually be a bad thing… They may perform one task really well, but then struggle to adjust strategy when it matters…

One of my clients has a big focus on Local SEO: boosting the Map listing. If you Google “[keyword] [location]” phrases then oftentimes a Google Map shows up, and a big factor in ranking there is getting positive Google reviews against the listing. We do pretty well all things considered, especially given that they’re not based in Cardiff city centre and instead operate on the edge of the city.

I did all the right stuff: I told them who was best to contact (happy clients) as well as the optimum time to contact them (just after a project had finished). I gave them an adaptable email template to use, containing info for the clients on how to leave a review and the appropriate links to the listing, etc. Over time, they hit the (ideal) minimum of five reviews and just kept going and going, eventually hitting more than ten 5-star reviews.


It’s Taking 34 Weeks (& Counting) To Edit A Yahoo! Local Listing

If you want to edit your Google My Business listing, you log in (or claim access), make a change, submit it, and then it could take up to 3 days for the change to happen – but usually it’s almost instantaneous, if not within an hour or so.

If you want to edit your Yahoo! Local listing, …haha. Haha. Hahaha. HaHaHaHa. HAHAHAHA. HAHAHAHAHAHAHA! HA HA HA HA HA HA HA. Yeah, good luck with that.

In the UK it has to be done via Infoserve, and the official response is that it takes 8 weeks for a change to go through (which you find out after you’ve applied to edit a listing). That in itself is an embarrassment, so it’s pretty humiliating that – despite multiple attempts and 8-week waits – I’m still waiting for a change to go through for Computer Recruiter, my parents’ business.

14th May 2015 – I put in a request for an amendment of the listing as the postcode was incorrect, it was showing the company’s old web address, and the phone number was showing up as the fax number. An Infoserve employee (who shall remain nameless) dutifully replied informing me that it’d take 8 weeks and that it’d therefore be ready by 9th July 2015. I asked why it took so long (“8 weeks?!”) and got some nonsense reply about it being their standard process or whatnot.


Vote for SEOno in the UK Blog Awards 2016!

UPDATE: Voting is now closed!

Just like last year, SEOno has been entered into the UK Blog Awards 2016 (#UKBA16).

Last year I made the shortlist of PR, Media, Marketing & Comms blogs in the Individual (non-Company) section. I didn’t win, losing out to industry peer Matthew Barby… who I’ve noticed is up for it again this year. Game on… 😉

As it says on my profile, I think the blog’s old design let the side down, plus my writing style – and the blog topics covered – have improved this year, in my opinion. So who knows… maybe I’ll do better this time around! I guess we’ll have to wait and see.

It takes less than 30 seconds to vote, so if you’re a fan of this blog or you’ve found some of its advice useful then I’d really appreciate a vote. Just go to my dedicated profile page (blogawardsuk.co.uk/ukba2016/my-entry/seono), scroll to the bottom, enter your name and email address, choose one or both* of the categories, and you’re done! You can even vote more than once (on different days) if you feel like it!

* *cough*both*cough*

The first phase (the public vote) ends on Monday 25th January, so if you’re reading this before that date then please vote now!

Thanks in advance – here’s to a good 2016!

Getting Bulk PA Data for 404s with URL Profiler

I’ve been using URL Profiler on-and-off for a few months now, mainly for full-on link analysis – especially when it comes to penalty removal and disavow work. However, as I’m sure other folks have discovered, there are a few other cheeky ways that the software can be put to good use. I found one, and after a chat with Patrick (one of URLP’s founders), I thought it’d be a good idea to throw it up as a quick blog post.

The challenge – 404orama!

I have a client who – despite only having a 1,000-page website – has over 5,000 404 (Page Not Found) errors associated with it. Over 5,000! (Pity it’s not over 9,000, otherwise I could use this. Anyhow…)

The number is so high due to a variety of reasons:

  • They’ve redesigned the site a few times in the past, which has included URL changes, but have never redirected the old URLs to the new ones,
  • A lot of random and/or duplicate URLs have been auto-generated due to a bug or two in their CMS,
  • Pages have been removed by the client’s internal teams (for archiving purposes) but not redirected.

When you’re dealing with such a high quantity of 404s, it’s difficult to know where to start. My plan was to get PA (Page Authority) data on every URL, so that I could at least work through the list bit-by-bit, starting with the URLs with the most SEO value – and therefore the most urgent to fix.

Enter URL Profiler. One of the many bits of data that it can grab is none other than PA. This gave me an idea…

The process

The process was dead simple. Instead of putting in a list of external URLs (as one might do when using it to conduct link analysis), I put in the whole list of 5k+ internal URLs, which was collated using a mix of Google Search Console data and a full-site Screaming Frog crawl.
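If you’re collating from two sources like that, you’ll end up with overlaps – here’s a rough sketch of the de-duplication step. The URLs and the trailing-slash normalisation are my own invented examples, not anything URL Profiler or Screaming Frog does for you:

```python
# Hypothetical sketch: merging 404 URL lists collected from a Google
# Search Console export and a Screaming Frog crawl, de-duplicating
# before pasting the combined list into URL Profiler.

def merge_url_lists(*sources):
    """Union several iterables of URLs, normalising trailing slashes,
    and return a sorted, de-duplicated list."""
    seen = set()
    for source in sources:
        for url in source:
            seen.add(url.strip().rstrip("/"))
    return sorted(seen)

# Made-up sample data standing in for the two exports.
gsc_urls = ["http://example.com/old-page", "http://example.com/gone/"]
crawl_urls = ["http://example.com/gone", "http://example.com/dead-link"]

merged = merge_url_lists(gsc_urls, crawl_urls)
# "/gone" and "/gone/" collapse into one entry, so three URLs remain.
```

Whether you treat trailing-slash variants as the same URL depends on how the client’s server handles them, so adjust that normalisation to taste.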

I asked URLP to find PA data on all of them, let it run, and boom: PA data on 5k+ URLs. Sort from highest PA to lowest and that’s your priority order sorted.
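The prioritisation itself is just a descending sort on the PA column – something like this, where the (URL, PA) pairs are invented for illustration rather than real URL Profiler output:

```python
# Hypothetical sketch of the prioritisation step: given (URL, PA)
# pairs such as URL Profiler exports, sort highest PA first so the
# most valuable 404s sit at the top of the to-do list.

results = [
    ("http://example.com/old-page", 18),
    ("http://example.com/dead-link", 41),
    ("http://example.com/gone", 27),
]

# Sort descending by the PA value (the second item in each pair).
priority_order = sorted(results, key=lambda pair: pair[1], reverse=True)
```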

URL Profiler results spreadsheet screenshot
The only problem? I now have the delightful task of figuring out where they should be redirected to. Hopefully chunks of them will follow patterns, so that I won’t need to run through all 5k+ individually(!), but either way – wish me luck…!
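If chunks of them do follow patterns, the mapping could be sketched with a handful of regex rules, falling back to manual work only for URLs no rule matches. This is purely illustrative – the patterns and destinations below are invented, not the client’s actual URL structure:

```python
# Hypothetical sketch: pattern-based redirect mapping for 404'd paths.
import re

# Each rule pairs a compiled old-URL pattern with a replacement
# template; backreferences carry slugs over to the new structure.
REDIRECT_RULES = [
    (re.compile(r"^/news/(\d{4})/(.+)$"), r"/blog/\2"),    # old news section
    (re.compile(r"^/products/(.+)\.html$"), r"/shop/\1"),  # pre-redesign pages
]

def map_redirect(path):
    """Return the new path for an old 404'd path, or None if no
    rule matches (i.e. it needs a manual decision)."""
    for pattern, template in REDIRECT_RULES:
        if pattern.match(path):
            return pattern.sub(template, path)
    return None
```

Anything that comes back as None goes onto the manual pile – which, with luck, is a lot smaller than 5k.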