our views and our knowledge in analytics and other relevant topics

our blogs

The 5 keys to a long and happy career in analytics


So I realised recently that I had reached my fifteenth year of working, in some capacity or other, as an analyst.  This means that I am now qualified to share with you the secrets to being a super successful analytics guru, in listicle form. You lucky people you! Now, I can hear you whinging already; “But you already did one of these lazy list articles, it was pretty boring. Why are you doing it again and who decided you were an analytics guru?”

Well, yes, I did, but that was about the future and this is about the past, so they’re totally not the same. Also, I’m old. If I want to regurgitate the same ideas over and over again then that’s my prerogative. As for the guru part – alright, I’m not a guru, or a ninja, or any of the other silly things analytics bloggers call themselves, but if I just wrote “some advice about being an analyst, from some guy who’s been an analyst for a while” would you be more inclined to read it? Fine, I’ll give up on the BuzzFeed-style clickbaiting (is BuzzFeed even still a thing?). Here’s some advice about being an analyst, from some guy who’s been an analyst for a while:


1) Do analysis, don’t make reports.

Nobody in business needs a report; nobody ever has, and nobody ever will. Oh, they might think they do. Those very important people you do analysis for might say “I need to see the company sales every day”, but think about it: that’s a statistic, not a report, and if it drives no action it’s not worth spending time on.

What reports really are is a sop to management deciding they need to do something, which they would do with or without a dashboard of pointless, irrelevant numbers. If the report says everything is fine, are you all going to stop work? And if it says everything’s rubbish, aren’t you going to have to investigate the reasons why and what you should do about it anyway? Why not do that in the first place instead of wasting time on pretty but pointless dashboards?

Analyse your metrics. Actively present your findings instead of fire-and-forget reports. Talk to your management teams and get involved with what projects are going to move the business forwards (and if there aren’t any and your management team are just asking the same questions every week without doing anything about any of them, you might want to reconsider your place of employment).

If they’d meant for us to make reports all day then we’d be called Reporterysts. Are you a Reporteryst? No you’re not. Stop it.


2) Chasing 100% data accuracy will stop you achieving anything.

You know when people talk about Facebook or Amazon or whoever doing something super advanced or big-brothery (depending on your point of view) with their data, and everyone in the industry swoons and fawns over how much customer data they’re collecting? Yeah, they’re still not collecting all of it. So when they do this kind of stuff they are extrapolating based on a model and an expected margin of error – there will always be factors you cannot account for, no matter how well you set up your tracking tools, and you know what, that’s absolutely fine. You can draw conclusions from sampled data. You can draw conclusions from data that’s only 95% accurate. Heck, you can draw conclusions from data that’s 20% accurate if you know and understand which bits are missing (even if you can’t quantify them).
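To put a rough number on “sampled data is fine”, here’s a back-of-the-envelope sketch in Python. All the figures are made up for illustration – a hypothetical 3% true conversion rate and a 5% sample of sessions – but the point stands: even a modest sample pins a conversion rate down to within a fraction of a percent.

```python
import math
import random

def estimate_conversion(sample, z=1.96):
    """Estimate a conversion rate and its ~95% margin of error from a sample.

    `sample` is a list of 0/1 outcomes (1 = converted). The margin of error
    uses the normal approximation to the binomial, which is plenty for the
    sample sizes a typical analytics tool hands you.
    """
    n = len(sample)
    p = sum(sample) / n
    margin = z * math.sqrt(p * (1 - p) / n)
    return p, margin

# Made-up scenario: the "true" rate is 3%, but our tracking only
# captured every 20th session (a 5% sample).
random.seed(42)
population = [1 if random.random() < 0.03 else 0 for _ in range(100_000)]
sample = population[::20]

rate, margin = estimate_conversion(sample)
print(f"Estimated conversion: {rate:.1%} ± {margin:.1%}")
```

With 5,000 sampled sessions the margin of error comes out under half a percentage point – easily tight enough to act on, despite throwing away 95% of the data.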

In the end, the output of any analysis is going to be a human decision-based process in which you try to convince someone of something like, “I think A is happening and we should take action B because of C reasons.” If the accuracy of C is really high, you’ll have a convincing argument for B; the lower it is, the less convincing it’ll be, but that doesn’t make it wrong or useless. Unexplainables are ok. It’s ok not to know some things, and not knowing some things doesn’t invalidate the things you do know – though strangely the opposite belief seems to be a common attitude in business, especially around analytics. There is a misconception that data must be perfectly accurate in order to be useful. Not so. Interpretation of data is the skill, not the data itself.

So if you’re chasing 100% tracking accuracy and saying things like, “This report has 1% of values that I can’t explain therefore I cannot trust the rest of it,” then your data is redundant.  Keep that up and soon your analytics team will be too.


3) Analytics in business is not a science, nor should it be.

People like to call analytics “science”, and sure, it involves some aspects of scientific thinking when done well: someone has a hypothesis, the hypothesis is tested, and the resulting data determines how probable it is that the hypothesis is true. From there the results are exposed to peer scrutiny and the conclusions accepted or refuted based on the strength of that probability.

The key difference is one of speed. If you’re developing a new cancer treatment, then you need to be thorough and you need to take your time.  There need to be multiple rounds of tests, under multiple conditions, preferably with controls that remove all external influence and bias from the results, which then need to be reviewed and verified by other experts in both the subject and the methodology of scientific testing. Which is why clinical trials often take years – it’s that important to be thorough, and the pace of progress at the cutting edge of human thought is by necessity extremely slow.  If you are ascribing that much weight and process to what new features to add to your t-shirt selling website, then by the time you’ve successfully concluded that yes, it probably would be a good idea to make the buy button green, you’ve gone out of business because it’s 2035 and we’re all wearing spandex onesies with hologram projectors in the cuffs.

This ties in with number two above, in that people expect analytics to be like science, and they expect science to be something where results are certainties and any leeway for interpretation is a failing.  So maybe I should have said “analytics in business isn’t a science in the way people generally understand science in popular culture and real science isn’t that either,” but that was a bit less pithy and you probably wouldn’t have read it…because you young people these days have the attention span of gnats…I blame that social networking that they have these days…we didn’t have that in my day…everything was better…blah, blah, I’m really old.

Anyway, if you do some analytics and get a conclusion with a half decent argument that’ll let you take an action for the business, don’t fret about how scientific you were, just get some stuff done.
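For the green-button example, a “half decent argument” can be as little as a two-proportion z-test over a week or two of traffic. A sketch with invented numbers (10,000 sessions per variant, conversion counts made up for illustration):

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: does variant B convert differently from A?

    Returns the z statistic; |z| > 1.96 corresponds roughly to p < 0.05,
    which for most business decisions is argument enough.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical A/B test: old button converts 300/10,000 sessions,
# green button converts 370/10,000.
z = two_proportion_z(300, 10_000, 370, 10_000)
print(f"z = {z:.2f}")  # |z| > 1.96: good enough – ship the green button
```

No review board, no replication study; a defensible threshold, a decision, and on to the next thing.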


4) The new hotness will turn out not to be any better than the old hotness.

When I started in analytics, we counted website hits by having a siren attached to the site’s server. Every time it got a request for the site, it made a screeching noise, causing one of our team of trained monkeys to drop a pebble into a big glass jar. At the end of the day, we’d empty the pebbles onto the boss’s desk and count them out so she could gauge the day’s performance. Later we got Urchin and then Coremetrics, but the experience was roughly the same.

Nearly every day I get an email or a call from someone with some new tool telling me that this one does so many things better than the last one, and will definitely unleash the full power of my data to transformitate my marketabised moneterisation so all my clients will suddenly dominate their sectors so totally that all competition will be wiped out and everyone else will come crawling cap in hand to plead for their jobs and livelihoods.

You know what these tools do? They count things and show you how many things they counted. That’s it. All of them. The newer ones count more things or show you how many things they counted in shinier, more complicated ways. Maybe they could help my t-shirt selling client sell a few more t-shirts, but at the end of the day, the ceiling on how successful a t-shirt selling company can be is how many of their t-shirts people want, not how flashy their analytics tools are. You’ll always get more value from better analysts than you will from fancier tools.


5) Experience is overrated.

Just because someone’s been doing something a long time doesn’t mean they know what they’re talking about. Most people don’t know what they’re talking about, so it might just mean they’ve not known what they’re talking about for a really long time. Or maybe they did know what they were talking about, but that was then and things have changed. Just be careful who you take advice off, is all I’m saying, and you should take that advice, because I’ve been doing this a really long time.