In case you haven’t had time recently to wade through the extensive information that exists on Data Protection and Data Law – we’ve put together a summary of some of the most important points so that you can be sure your clinic complies with best business practices.
It’s important stuff that can sometimes get overlooked, so we hope you find this helpful. While our guide is based on Irish and EU law, the common sense practices are just as relevant to other markets. Click on this link to download the guide: Clinic Guide To Data Protection
Have you a topic you’d like one of our clinic marketing experts to cover in an e-book or guide? Let us know by leaving a comment on the blog, or by sending an email to firstname.lastname@example.org.
It’s no secret that Google likes web pages that load quickly. In fact they offer free page speed tools to help you optimise your site and make the bold claim that “Fast and optimized pages lead to higher visitor engagement, retention, and conversions.”
Bounce Rate Increased 15%
Recently I was forced to reset my Google Reader account thanks to Google’s recent account clean up. The upside of this was that I had to resubscribe to any RSS feed I still wanted to read regularly. A few essentials were added first: Google’s Webmaster Central, SEOMoz, Distilled and so on. Then a few other favourites like Fred Wilson’s AVC blog and Mark Suster’s Both Sides Of The Table were added.
What really struck me though was the number of subscriptions I didn’t want to keep up with any more. This was largely for three reasons:
- I was no longer that interested in the topic
- The content was too repetitive
- The content volume and quality had spiralled out of control
For the first reason there was nothing the publisher could have done to keep me as a subscriber. For the second reason, it was possible but unlikely. Some of the topics were quite niche and there wasn’t a lot new to say on a regular basis. However the third reason is completely within every publisher’s control.
Straying From The Original Plot
I’ll pick out Mashable as an example. It’s quite a regular occurrence that when I open my reader in the morning there are 30 or 40 stories in the Mashable folder. Buried in there somewhere are the one or two that I might still find interesting, but there is no way I’m going to wade through the rest to find them, especially considering that another more focused blog is bound to reblog it, or someone in my Twitter stream will tweet about it.
Mashable, along with a growing number of other web properties, seems to be obsessed with growing visitor numbers at the expense of focus and even quality control, and in doing so they publish so often and on so many topics that I’m no longer interested in what Mashable has to offer. The same can be said for a growing number of blogs that are looking to grow visitor numbers by growing the number of articles they publish per day.
A Pivot From Niche To Mainstream
Mashable is supposed to be about Social Media News and Web Tips according to its own homepage <title> tag. So why is it publishing a story today about the new HTC Sensation XE with Beats Audio? And what about its article on Expanding Your Startup To International Markets? Or even the new version of VMware Fusion? Quite simply, they know these articles will gather traffic because they can rank easily for them. But should they publish them in the first place?
I guess the question comes down to this. Do Mashable just want as much traffic as they can get their hands on, or do they want to be the go to source for social media news? Do they want to be mainstream or niche? The answer in this particular case seems pretty clear to me. Maybe they will succeed in becoming the next Wired, and if that’s what they want, good luck to them. But in the meantime, when I want some social media news, I’ll go to a social media specialist source.
My tip? Unless you’re trying to become a broad news aggregator, stick to what you know, and what your readers want, and make sure you have something to say. If that means publishing less often, so be it, but at least you know that the resulting relevant traffic will be because of the quality of your content and not just because of the volume of articles you publish.
[Here’s a bit of background about our previous cache setup. Skip ahead to “Using Varnish To Cache WhatClinic.com” if you want to jump straight into the Varnish section.]
Our main website is built on Microsoft’s IIS and we have been using its built-in page and component level caching to serve html pages for several years. This built-in caching is easy to set up and quite flexible, but it is very memory hungry.
The memory issue isn’t much of a problem on small static websites with only a couple of hundred pages. Unfortunately though, WhatClinic.com is a dynamic site with potentially millions of individual pages to serve. Typically we were getting only 12% of our pages served from the cache, and sometimes this was as low as 6%. It was almost pointless running the cache at all.
The biggest problem for us is the breadth of the website. On a typical day we have 30,000 unique visitors, but they land on 23,000 distinct URLs. Over the course of a month this balloons to 145,000 distinct landing pages. Worse still, they look at over half a million distinct pages on the site.
To try and improve the performance of the existing IIS cache we tried writing the page cache to disk. Under test conditions with relatively small numbers of pages this worked well, but to get even 50% of our pages from one month’s visits in the cache we had to write 250,000 pages to the disk. In the end the NT file system on our servers started grinding to a halt, not because of request volume but purely because of the number of individual files involved.
Using Varnish To Cache WhatClinic.com
We came up with some ways around the NT file system problem but decided in the end it would be better to move the cache off the main box altogether. At the same time we decided to look at Varnish as a solution, with a view to hosting it on AWS.
On the upside Varnish is lightweight and powerful, but it also introduced a number of new problems for us to overcome:
1. Varnish Caches Cookies
2. All Requests Go Through The Varnish Box
To determine a visitor’s location we look at their IP address, but since all requests were now coming through the Varnish server, our own server only ever saw the Varnish box’s IP address. We changed the code to pass the original client IP address along in a header so we could pick it up on the back end.
Problem solved, except now our default access logs don’t record the proper IP address of each visitor. We use Google Analytics and our own logs for the bulk of our reporting so this isn’t a big deal, but at some point we might have to look at writing our own access logs with the original client IP address, if only to give us the peace of mind that when something does go wrong we can track it in the raw log data.
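The back end’s side of this is simple: read the forwarded address out of the request headers and fall back to the socket peer when there is no proxy in front. Here’s an illustrative Python sketch (not our production code; we’re assuming the conventional X-Forwarded-For header name):

```python
def client_ip(headers: dict) -> str:
    """Return the originating client IP for a request that has passed
    through a proxy such as Varnish.

    X-Forwarded-For carries a comma-separated chain of addresses;
    the left-most entry is the original client, later entries are
    the proxies the request passed through.
    """
    forwarded = headers.get("X-Forwarded-For", "")
    if forwarded:
        return forwarded.split(",")[0].strip()
    # No proxy header: the socket peer really is the client.
    return headers.get("Remote-Addr", "")
```

So a request forwarded by the cache with `X-Forwarded-For: 203.0.113.7, 10.0.0.1` resolves to the visitor’s real address, 203.0.113.7.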
3. Altering Our Landing Pages
Depending on whether you have just landed on WhatClinic.com, or are browsing subsequent pages, we alter the layout of the page. The layout differences are quite extensive even though the data is all the same, so it isn’t efficient to make the changes on the client side. We need to cache two different versions of the same page.
The solution involved getting Varnish to pass along the referring URL and using something like (isReferringDomainWhatClinic.com) as part of the key for the cache as well as the requested URL itself. In the end this was pretty easy to do too, but it did double the number of pages in the cache. However, we were trying in particular to improve the speed of our landing pages so it is worth it to us.
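In Varnish itself this sort of thing is done in the hash logic, but the idea is easy to show as a language-neutral sketch (Python here, with a hypothetical helper name):

```python
def cache_key(url: str, referrer: str) -> str:
    """Build the cache key for a page: the requested URL plus a flag
    saying whether the visitor arrived from within WhatClinic.com.

    A landing view and an internal-navigation view of the same URL
    therefore cache as two separate entries.
    """
    is_internal = "whatclinic.com" in referrer.lower()
    return f"{url}|internal={is_internal}"
```

Two requests for `/dentists/dublin`, one referred by Google and one referred internally, now produce distinct keys, which is exactly the doubling of cache entries mentioned above.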
4. Time To Live
As we said in the intro, we have a very broad site. Our pages also change quite infrequently, so we wanted to have the maximum possible time to live for the cached pages, in the order of several months. However, some pages do change, and a change to any one of our customer’s data may have effects that ripple over hundreds of pages that their clinic might appear on.
The solution was to set our time to live to several months, and then remove pages from the cache only when they had been updated. Having implemented a means to remove pages from the cache, we then had to determine when a change to a clinic’s data had occurred and which pages were affected by the change, so we knew which pages to remove from the cache and update.
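The mapping step can be sketched like this. The URL patterns are hypothetical, purely for illustration: the point is that one clinic update fans out to its own profile page plus every treatment-by-location results page it can appear on.

```python
# The URL patterns below are hypothetical, for illustration only.
def pages_to_purge(clinic: dict) -> list:
    """List every cached page a clinic update could affect: the
    clinic's own profile page plus each search results page
    (treatment x location) it can appear on."""
    urls = ["/clinic/" + clinic["slug"]]
    for treatment in clinic["treatments"]:
        for location in clinic["locations"]:
            urls.append(f"/{treatment}/{location}")
    return urls
```

Each URL in the resulting list is then removed from the cache (for Varnish, typically by sending it a purge request), after which the next visitor triggers a fresh render.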
Working out exactly which pages were affected turned out to be a little problematic but we solved it eventually and we’re reasonably happy that we’ve covered all the cases. We also coded a big red “Remove All This Clinic’s Data From The Cache” button for use in case of emergencies.
Overall, it has been a big win. After about three weeks of operation we have a page hit rate of around 65%, which is a huge improvement from the 12% we used to get. Cached pages are returned now somewhere in the order of 100-200ms instead of 2000-5000ms, and the load on our server has dropped dramatically, improving performance for those pages which are never going to be in the cache too.
Performance improvements never end, do they?
[Image: The Nonsense Blog]
Identity, or rather the ability to create new identities, is a key facet of the internet. Some people troll away on internet forums under assumed names just to cause trouble, while others use made up identities to expose political scandal or even circumvent local laws. This ability to change identity at will is both a boon to the creative and a bane to the legislative, but in its own way it drives innovation and change.
Identity And Social Media
Eric Schmidt of Google spoke recently at the Edinburgh International Television Festival to announce the launch of Google Television in Europe, which should hit our shores in the new year. However, he also took some questions, including one from Andy Carvin about Google Plus (G+). He asked “how Google justifies the policy [of making people use their real name] given that real identities could put people at risk?”
Schmidt’s answer was that G+ was built primarily as an identity service, and that people were free not to use it if they felt they could be putting themselves at risk. I found the answer a little disappointing, especially given the tone of his actual speech, which took issue with the ongoing split between the sciences and the arts in the UK.
By forcing G+ users to use their real identities, Google are in effect silencing the weird and the creative along with the subversive and the disruptive, leaving them to create their “fake” identities on message boards and Twitter and Facebook instead. Google’s attitude appears to be driven by their desire to use G+ data as part of their search results algorithm as a way of reducing web spam, but this seems like a short-sighted method of guaranteeing the authenticity of a +1 click, for instance.
Authentication vs Identity
Rather than focus purely on identity I think Facebook and Twitter are getting it right by focusing on authentication. By verifying that you are the person who created a certain Facebook or Twitter account you can continue that internet persona uninterrupted on a myriad of different sites. You could potentially Like things more than once, or share them on multiple Twitter accounts, but does that really cause a problem when a real person does it once or twice?
Say you could have more than one G+ account then, how many people would go to the trouble of creating two accounts to game Google’s search results? A lot, unfortunately. There’s real money riding on it after all, and knowing the experience of Black Hat SEO practitioners and their “creative” ways of building links, they’ve probably already gotten around Google’s current protections anyway.
Google really are going to be fighting an uphill battle to keep it to one account per person. Twitter suffers a great deal from fake accounts being set up for spam purposes. Facebook apparently less so, even though Facebook Likes are now almost a web currency of their own. But companies with people as smart as the ones at Google, Facebook and Twitter should be able to decouple the ideas of multiple (valid) identities and spambots created purely to manipulate results.
Who Are You Right Now?
I think Fred Wilson’s take on Identity and Authentication is pretty spot on too. He comes at it from a slightly different angle, not so much about fake or hidden identities but rather about his real identity being split across different sites for different reasons. It was for exactly this reason that G+ made the leap forward it did with Circles, letting people split out who they talk to based on some common themes, but by tying it all to your real name it restricts itself unnecessarily.
Quinton O’Reilly also recently covered some of the problems that arise from having a publicly accessible profile with its own unique persona, especially when potential employers come looking to dig up some dirt on you. To me that’s all the more reason why people should be allowed to have different accounts, or identities if they want to. In fact, it’s almost exactly the reason that most people I know on LinkedIn are members of that site. They don’t want their Facebook profiles perused by employers, colleagues or customers!
Should Businesses Care?
Which brings me to the business end of things. Most people who use WhatClinic.com use their real names when they create an enquiry and they use real email addresses and real phone numbers. But do we know if they just created that email account, or just bought a prepaid mobile phone? No. Do we know if they’re using a pseudonym? No. Should we care? Not really. So long as the clinic can actually contact the user they’re free to call themselves whatever they want.
People change depending on the situation they’re in. They do it in the real world and they do it online, and I doubt even Google are going to be able to stop that.
Google Analytics was recently updated to change how it calculated when a visitor’s session ended. We were told this should have only a small effect, around 1% on average, on how our visitors were being counted. The change went live on Thursday August 11th. If you look at the graph above you can see on Friday the 12th our reported visits increased by over 40%. This trend continued for nearly a week.
Now a 40% increase in traffic would obviously be welcome, but our unique visitors report told a very different story. Nothing had changed much at all!
So, what was going on? It turns out there were some bugs in the Analytics update that created new sessions for users when they should have had only one. Full details are in the update to the announcement of the original change. It would seem that in our case visitors who clicked the back button in their browser to go back to the landing page they arrived on were being counted multiple times.
Google pushed a fix to this problem on Tuesday the 16th of August and everything seems to be back to normal now, but I’m sure we’re not alone in having spent some time trying to work out what was going on and what changes we’d need to make to be able to compare reports from before and after the change. Thankfully it looks like that’s not an issue anymore. Still, it seems like a pretty big bug to slip through the net for such an important product.
I switched to the new Google Analytics interface and almost immediately ran into that old problem of wanting to export more than 500 rows of data without having to resort to using API calls. The old “limit=50000” trick doesn’t work with the new format, but thankfully there is a workaround which I came across on the Convonix blog.
If you choose to show more than the standard 10 rows using the drop down at the bottom of the page, a new “rowcount” variable is added to your URL. For example, when I changed a page to display 25 rows the URL gained a parameter like this: &rowcount=25
By changing the 25 you can change how many rows get displayed and then export them, up to a 50,000 row limit apparently. I’d caution against relying on this as a long term solution though. The previous 50,000 row limit trick got reduced to 20,000 after so many people started using it, and I imagine the same will happen with this trick once its use catches on. In the meantime though, enjoy!
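If you’d rather not edit the URL by hand each time, the substitution is easy to script. Here’s a small Python sketch (assuming the parameter sits in the query string; depending on the GA interface it may actually live in the URL fragment, in which case just edit the number by hand):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def set_rowcount(url: str, rows: int) -> str:
    """Return the report URL with its rowcount parameter added or
    replaced, so up to `rows` rows are displayed for export."""
    parts = urlsplit(url)
    query = dict(parse_qsl(parts.query))
    query["rowcount"] = str(rows)
    return urlunsplit(parts._replace(query=urlencode(query)))
```

Paste the original report URL in, ask for 5,000 rows, and open the result in your browser before hitting export.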
One of the fun problems we have working at WhatClinic.com is trying to organise the millions of pages that result from listing tens of thousands of clinics in thousands of locations for thousands of treatments.
Our search results pages list up to 12 clinics at a time, and when they’re full they offer a great user experience. Lots of choice and lots of information is presented along with a simple way to contact whichever of the clinics takes your fancy.
However, not every combination of clinic type + location + treatment will have a full page of results. In fact with only a little knowledge you could probably guess the URL of a page with no results on it. The obvious solution to these empty pages is to return a 404 response code and not to link to the pages internally, minimising the chance that they’ll be found by users or search engines alike.
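As a minimal sketch of that rule (illustrative Python, not our actual code), the serving decision looks like this:

```python
def page_response(clinics: list) -> dict:
    """Serve a clinic type + location + treatment results page.

    No matching clinics -> 404, so the empty page is never indexed
    by search engines or linked to internally.
    Otherwise -> 200 with up to 12 clinics listed.
    """
    if not clinics:
        return {"status": 404, "clinics": []}
    return {"status": 200, "clinics": clinics[:12]}
```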
What’s Right For The User?
Add one clinic to the page though and we’re left with a quandary. Is this really a useful page for a user? Wouldn’t they like more choice? We know for instance that pages with more clinics on them have a better conversion rate, so would we be better off sending users to a “parent” location page instead, i.e. a location that contains the smaller location but should have more than one clinic on offer?
Another option available to us would be to fill the rest of the page with 11 of the nearest clinics to the location (which could be tens if not hundreds of miles away in some cases), but this would massively increase the duplication of data served across the pages on our site as clinics’ listings would appear in far more locations than they currently do.
Similar Pages – The Rel=”Canonical” Solution
We decided we’d like to see what effect the first option had, i.e. sending the users to a parent page, but we were uncomfortable with 301 redirecting every page that only had one clinic on it, so we decided to try a slightly softer approach.
Having read an article on SEOMoz about using the Rel=”Canonical” tag to get more than one keyword to rank for a given piece of content, we decided to try what we thought was quite a clever scheme that would serve the user and the search engines.
We would put a Rel=”Canonical” tag on our search results pages with only one clinic listed, and we’d hope to send people searching Google for Place A to the search results of Place B, which would contain the search results for Place A and more, giving the user a better choice.
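The scheme boils down to a one-line decision when rendering the page head. Sketched in Python with hypothetical parameter names:

```python
def canonical_tag(page_url: str, parent_url: str, result_count: int) -> str:
    """Emit the rel=canonical link for a search results page.

    Thin one-result pages point at the broader parent-location page;
    fuller pages canonicalise to themselves.
    """
    target = parent_url if result_count <= 1 else page_url
    return f'<link rel="canonical" href="{target}" />'
```

So the one-clinic page for a small town would carry a canonical pointing at its county page, while a full page of twelve clinics canonicalises to its own URL.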
Anchor Text Isn’t A Very Strong Ranking Signal For Pages With A Rel=”Canonical”
Unfortunately for us, the experiment hasn’t exactly gone to plan. We were cautious and only put the Rel=”Canonical” links on a subset of our one result pages, but even still we have enough data to see that for now at least none of the Place B pages are ranking for Place A keywords.
Of a sample set of 20 one result pages with a Rel=”Canonical” tag, 14 have been crawled and no longer appear in Google’s index, and searching using the “Place A” keyword for these pages doesn’t return the Place B search results page.
You might think, well Google have decided that the Place A and Place B pages aren’t sufficiently similar to be a valid use of the Rel=”Canonical” tag, and you might be right, but the fact that original Place A URLs are no longer appearing in the index seems to counter this supposition.
More likely it seems is that the anchor text of the links pointing at Place A pages isn’t a strong enough signal for the Place B pages to rank for keywords based on “Place A”.
Back To The Drawing Board
So it looks like we’re back to square one on this particular problem. I think the next thing to try is the option discussed above where we fill out the search results. It seems like a good thing to do for the user, but I am slightly worried about diluting our content by potentially overusing it. We’ll be sure to keep you posted about the results when we try it out.
Have you run any experiments with the Rel=”Canonical” tag yet? For what purpose, and what results did you see? Let us know in the comments.
We all like to keep an eye on the usual metrics when looking at our Google Analytics accounts. Visits, unique visitors, bounce rates, time on site, and conversion rates all get a look in. These are all great pieces of information for making sure that things are working the way you expect them to on your website, but what if you want to look a little deeper?
Unfortunately Analytics can’t answer every question you might have about your site, in which case it’s time to dust off your Excel For Dummies book and get stuck into manipulating the data yourself. For those of you looking for a good guide to some of the most useful Excel functions for SEO analysis I can recommend the Microsoft Excel for SEO guide from Distilled.net.
Digging deeper often requires large amounts of data to give meaningful answers, so you’re going to want to get familiar with adding the “&limit=50000” to your GA URLs, or better still start using the Google Analytics Data Export API or the Excellent Analytics Excel plug-in.
Keyword Lengths and Conversion Rates
I’m a firm believer that the more you know about your visitors and their behaviour the better you can tailor your product to suit their needs. So, from time to time we go and look at some metrics that are slightly off the beaten track. Have a look at the graph below for instance:
It charts traffic and email enquiry conversion rate over a recent two week period, broken down by the number of keywords in the search. The first thing that struck me was that the more keywords people use to find WhatClinic.com, the more likely they are to convert. The second was that just over 50% of our email conversions come from people who use 4 or 5 keywords to find the site.
All well and good you say, but what use is information like this? Well, for a website like ours with a long tail focus it shows us how long the keywords in the long tail are. We typically optimise pages for one or two keywords, usually two or three words in length. The data above suggests that maybe some pages should be optimised for slightly longer keywords, or perhaps even two longer keywords.
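The underlying calculation is straightforward once the keyword data is exported from GA. A sketch, assuming the export has been reduced to (keyword, visits, conversions) rows:

```python
from collections import defaultdict

def conversion_by_keyword_length(rows):
    """Aggregate (keyword, visits, conversions) rows exported from
    Google Analytics by the number of words in each keyword.

    Returns {word_count: (visits, conversions, conversion_rate)}.
    """
    totals = defaultdict(lambda: [0, 0])
    for keyword, visits, conversions in rows:
        length = len(keyword.split())
        totals[length][0] += visits
        totals[length][1] += conversions
    return {length: (v, c, c / v if v else 0.0)
            for length, (v, c) in totals.items()}
```

Plotting visits and conversion rate per word count from the returned dictionary gives you the chart discussed above.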
Thanks to other curious SEOs like SharkSEO we also know that you can write two completely different meta descriptions for the same page and the search engines will pick the description that best matches the keyword being searched for. This opens up some new possibilities about how to organise our data and our site structure. Using the keyword length and conversion data above we can make more informed decisions about how to optimise the resulting pages.
Are Keywords Getting Longer?
Just over a year ago I wrote about how people were using longer keywords to find WhatClinic.com. Seeing as we’re talking about keyword lengths again I thought I’d take a quick peek at some data from this year. I was in for a surprise.
If my data was to be believed keyword lengths were almost exactly the same as they were a year ago. The answer seemed too neat to me, so I decided to do a little segmentation. My suspicion was that by looking at our traffic as a whole I was missing some underlying trends, and it turns out I was right.
Traffic from Ireland accounts for around 19% of our total visits, but as you can see from the chart above it accounts for over 30% of our one and two word keyword traffic. Again the question is how is this information useful or actionable? The simple answer again is to do with the messaging – the page title and the meta description in particular.
In Google.ie we now rank quite well for certain one word keywords like “braces” or “dentist”. While this is great for us in terms of traffic, the pages are really optimised for people looking for our page about braces in Ireland, or dentists in Ireland. This means that as the keywords used to find these pages get more generic (more head / short tail), maybe we should look at changing the messaging on them to reflect more closely what the user is looking for. In the cases above I think the messaging might be OK, but we’ll test some alternatives and see how they affect CTR and conversion rates.
The Importance Of Segmentation
The Irish traffic really skewed the overall keyword length data. Seeing as our website deals with so many geographies and our keyword rankings vary quite a lot across them, any decisions about site structure and on-page optimisation should only be made once the overall site figures have been sliced enough to have confidence in them.
Excluding the Irish traffic, keywords have gotten slightly longer since 2010, but not massively so. It is the relative shortening of Irish keywords that is much more significant to us on this occasion.
We have previously observed similar big differences in user behaviour based on whether the landing takes place on a brochure / listing page or on one of our search results pages. We’ve even observed that the nearer the top of the tree structure a user lands the more likely they are to convert.
It’s often worth digging deeper than the reports or segments in Google Analytics can offer by themselves because the information that comes out can offer you a clearer picture of some of the bigger underlying trends affecting your site and give you the information you need to not only stay ahead of your competitors in the SERPs, but ultimately make your site better for your users.
Today we have a guest post from Ronan Perceval of Phorest.com.
According to a recent article on TechCrunch.com, 20% of all Groupon and CityDeal deals worldwide are for hair and beauty treatments. That is a lot of money: approximately $1bn a year if you count all the deal sites, and growing fast.
But of this 20% the majority are for beauty treatments rather than hair. This is because beauty customers are less loyal to one salon than hair customers. According to data collected from 1,000 salons using Phorest.com salon software, an average of 45% of customers who visit a hair salon in any one year will continue to visit that salon. For beauty salons the figure is only 30%.
This is because when people find a stylist who makes their hair look good, they are much more likely to want to return to that particular person than they are to the therapist who gives them a spray tan or massage, a treatment that any therapist in a particular salon can be expected to carry out to the same standard.
Loyalty To Groupon
Groupon likes selling beauty offers because customers go from deal to deal, from salon to salon. In this way they stay loyal to Groupon rather than the salon after getting a deal, and Groupon can keep selling those beauty customers offer after offer.
Groupon doesn’t like hair offers as much because customers are much more likely to stay with that salon after the deal and not use Groupon again for a hair offer. I was chatting to a hair and beauty salon owner yesterday who told me that they had run a beauty offer on Groupon CityDeal and wanted to run a hair offer next, but the Groupon salesperson was adamant that they run another beauty one.
We ran a survey of 1,000 salon customers last week asking them how they chose their current hair salon and their regular beauty salon. The results are interesting. 9% of people first experienced their current hair salon because of an internet deal, but only 4% had experienced their regular beauty salon for the first time this way. And this is despite the fact that there are 9 times as many internet deals for beauty as for hair.
Be Careful What You Offer
For the dental and cosmetic beauty clinics on WhatClinic.com the advice is clear: if you are considering running a group deal think carefully about the treatment or service you are offering. Are customers who use the deal likely to return to you for this treatment again, or when the time comes will they just use another deal to go to another clinic?
Try to think of a way to make the deal depend on return visits in order to get the best value from it – maybe offer a 10% discount on all treatments for a 12 month period? And make sure you get to demonstrate why they should come back – excellent customer service, skilled staff, modern equipment, etc.
About the author: Ronan Perceval is the CEO of Phorest.com, a leading provider of salon software to thousands of salons and spas in the UK and Ireland. Phorest also operate MyZanadoo.com, the UK and Ireland’s number 1 destination for booking salon and spa appointments online.