
5 Reasons HR Analytics Efforts Stall and a New Way Forward

"Life is a struggle, and then you die"

So go make something of it. Work on something important, and watch out for these things.

The 5 Reasons Most HR Analytics Efforts Stall:

#1. Not being precise about the right problem to focus on and the questions you need to answer to solve that problem.

The typical failure: you spend a huge amount of time, money, and effort to get an HR reporting environment set up, but downstream users never use it.

People say the information is nice to have - they just don’t have time to go look at your reports. Sometimes the problem is that the information has little relationship to important decisions, or little bearing on the work that anyone is doing. 

Often the people you support will request an endless assortment of trivial changes in the desperate hope that each change will produce a better result.

Or, with no specific reason given, you and your solution go from hero to old hat overnight - and you are left to wonder what actually went wrong.

The problem is that you spent your resources and time working on the wrong problems and questions. More could have been accomplished with your time and effort had you achieved clarity at the outset.

#2. Not having all the right data you need to answer the questions you want to answer.

The worst possible outcome of analysis is a statistically significant finding that increases certainty in a wrong answer. This is a common result of the "missing variable problem" - the unknown unknowns that wreck most analyses. It hides in some portion of the 80% of the variance your model did not explain. You ran the analysis but did not include the right control data, so you get an answer - a wrong answer - and you have no way of knowing it is wrong. Sounds like a nightmare? This is not a nightmare; this is a real problem.
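To make the missing variable problem concrete, here is a toy simulation. All the variable names and numbers are invented for illustration: tenure secretly drives both training hours and performance, training has no direct effect at all, yet a naive regression of performance on training reports a large "effect". Controlling for the hidden variable makes it vanish.

```python
# Toy demo of omitted-variable bias (the "missing variable problem").
# Tenure drives both training and performance; training itself does nothing.
import random

random.seed(42)
n = 5000

tenure = [random.gauss(5, 2) for _ in range(n)]
training = [0.8 * t + random.gauss(0, 1) for t in tenure]     # no direct
performance = [1.5 * t + random.gauss(0, 1) for t in tenure]  # effect on perf

def slope(x, y):
    """OLS slope of y on x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    return cov / var

def residuals(x, c):
    """The part of x not explained by the control variable c."""
    b = slope(c, x)
    mx, mc = sum(x) / len(x), sum(c) / len(c)
    return [a - (mx + b * (cc - mc)) for a, cc in zip(x, c)]

naive = slope(training, performance)              # looks like a big effect
controlled = slope(residuals(training, tenure),   # effect disappears once
                   residuals(performance, tenure))  # tenure is controlled for

print(f"naive slope:      {naive:.2f}")
print(f"controlled slope: {controlled:.2f}")
```

Without the control variable, the analysis confidently reports an effect near 1.3; with it, the estimate collapses toward zero. Nothing in the naive output warns you that the first answer is wrong - that is the whole danger.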

The second worst possible outcome is when you do all the work and don’t achieve a statistically significant finding but could have produced a finding if you had included the right variables.

In either case, without a basic theory of which variables to include in the analysis, you either never achieve a satisfactory outcome from your effort, or you double or triple the hours needed to reach a conclusive answer because many passes are required.

These problems are why we pay university scientists the big bucks. Big bucks!? O.K., not really! University scientists try harder because they know their work will be peer reviewed by other really smart people who also know something about the topic. We don't have this check inside organizations - we have non-experts reviewing the work of experts. Major danger.

#3. Expecting technology to solve the whole problem (absent analysts).

You have made an important investment in supporting technology, but you may not get anything of lasting value out of it because you did not factor in the cost of acquiring (or developing) skilled operators of that technology. It is as if you have a wonderful piece of machinery sitting idle.

The success you have with analytics depends on the experience and preparation of the people working the analysis. You can get two different analysis outcomes from two different analysts using the same technology! The worst part: if you get it wrong, how would you even know?

Different analysts will make different choices about what data to include and which research method to use, and will differ in their skill at executing that method.

Clearly, you need to think about analysts, but you must also think about the rest of the organization. It does not help to have a group of "really smart data people" and nobody else who knows how to use their work. You need everyone on the same page about where you are trying to go, what everyone's role is, and how it all fits together.

#4. Expecting your analyst to solve the whole problem (absent the right technology & support).

Analysts are evaluated on results. Other HR roles can produce activities (implementing programs, policies, processes, and systems) and declare victory at the conclusion of the activity without respect to impact (which, conveniently, is never measured). Success is defined as completing the project on budget and on time, then on to the next. Analysts do not have this privilege! For analysts, the proof is in the pudding. If you tell the truth - "based on our data and the tools I have, I found nothing of lasting significance to you" - your reward is that you don't get invited back to the meeting.

Analysts either produce very little value and stick around (satisfied with activity for pay, as opposed to outcomes), or they leave for another opportunity to do better analysis. They either have a fire in their eye or they don't. You want the one who cares, or don't bother starting. You have made an important investment in a person, but you will get nothing of lasting value out of that person without providing the tools and support they need to complete their work.

Managing an environment like this is difficult, but not impossible - it requires skill and care.

#5. Expecting results without someone putting in hard work.

Your typical project management wisdom applies - of time, quality, and cost, you can choose only two.

Every new question you want to answer involves investment in new data collection, cleanup, transport, joining, reshaping, statistics, and figuring out how best to communicate the results. We inevitably want to automate the routine analytical workflow, but two constraints come first: what should be made routine, and how can it be made routine? We (technologists) will try our best to design around this, but the first pass is best handled by a human - that is hard to get around.
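A first manual pass through those steps can be surprisingly small. The sketch below walks through cleanup, joining, reshaping, and a summary statistic on a handful of invented records (the field names, departments, and scores are all hypothetical) - exactly the kind of run-through worth doing once before deciding what to automate.

```python
# A minimal manual pass: cleanup -> join -> reshape -> statistics.
# All records and field names are invented for illustration.
from collections import defaultdict
from statistics import mean

# "Collected" data, as it often arrives: inconsistent and incomplete.
roster = [
    {"emp_id": "001", "dept": "Sales"},
    {"emp_id": "002", "dept": "sales "},      # needs cleanup
    {"emp_id": "003", "dept": "Engineering"},
]
survey = [
    {"emp_id": "001", "engagement": "4"},
    {"emp_id": "002", "engagement": "5"},
    {"emp_id": "003", "engagement": None},    # missing response
]

# Cleanup: normalize department labels, drop unusable survey rows.
for row in roster:
    row["dept"] = row["dept"].strip().title()
scores = {s["emp_id"]: int(s["engagement"])
          for s in survey if s["engagement"] is not None}

# Join + reshape: engagement scores grouped by department.
by_dept = defaultdict(list)
for row in roster:
    if row["emp_id"] in scores:
        by_dept[row["dept"]].append(scores[row["emp_id"]])

# Statistics: a first answer, plus a record of what had to be dropped.
summary = {dept: mean(vals) for dept, vals in by_dept.items()}
print(summary)
print(f"dropped {len(survey) - len(scores)} incomplete survey rows")
```

Even this tiny pass surfaces the real questions automation must answer: how to normalize labels, what to do with missing responses, and whether the final summary is actually useful to anyone.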

It will fail if no one puts in the work. That doesn't necessarily mean you have to do all the work - or even the hard work - just that somebody does, and there is no way to escape this cost. You can bring in consultants to do the work, hire enough people in your organization to do the work, or buy packaged solutions that help with part of the work. In each case you will be making big trade-offs between time, quality of delivery, and cost. Beware - no silver bullet will kill this beast. You must commit to ongoing refinement of the analytical process, or you get an analytical process that does nothing for you.

If you get into the real day-to-day work of HR analytics, people analysts are dealing with data that was generated for some other purpose and does not conform to the basic needs and expectations of the current one. The best way to understand what must be done to automate an analytics workflow is to have someone work through the analysis once, to learn what data is there, what is not, what is wrong, and how what is there must be improved for a successful analysis.

Often we implement expensive reporting solutions in the hope that they will produce useful insights. Why invest in automation (repeatability) if you cannot achieve a useful finding through a manual effort? Hope? Hope is a great attitude to apply to all situations, but not a great strategy. Why not run through it manually once and figure out whether it is worth automating? The most important question is: when you got to the end of it all manually, did you end up with a report that is useful to the organization? If you did, great - now is the right time to make a decision about automation.

You can find more posts like this and other helpful People Analytics related resources on my Misc- People Analytics blog roll.


People Analytics is difficult, no doubt about that, but...

I’m putting together a series of live group webinars where I will be revealing a process for dramatically increasing probability of success of People Analytics - building on a career of success and failures (Merck, PetSmart, Google, Otsuka, Children's Medical Dallas and Jawbone) - and applying new ideas I have developed over the last few years applying ideas from Lean to People Analytics.

The goal of this webinar series is to engage a select group of qualified early adopters who have access to an organization, are willing to apply the process, report back how things are going, and work the bugs out together. This group will have the opportunity to share their findings with the broader People Analytics and HR community, if they choose.

If you have interest in participating in the webinar series, let me know here: http://www.misc-peopleanalytics.com/lean-series

And if you know anyone else who you think would be interested, please let them know too!


Who is Mike West?

Mike has 15 years of experience building People Analytics from the ground up as an employee at the founding of Merck HR Decision Support, PetSmart Talent Analytics, Google People Analytics, Children's Medical (Dallas) HR Analytics, and PeopleAnalyst - the first People Analytics design firm - working with Jawbone, Otsuka, and several People Analytics technology start-ups. Mike is currently the VP of Product Strategy for One Model - the first cloud data warehouse platform designed for People Analytics.

Mike's passion is figuring out how to create an analysis strategy for difficult HR problems. 
