Blog — April Bell Research Group


Using a Guide to Pass Through the Analytics Wilderness Safely


Let’s face it, “analytics” is a confusing term. Whenever I try to tell people what I do, they get confused. To simplify it, I like to use the analogy of cutting a trail through the wilderness. “Analytics,” after all, is more about the questions you ask than the data itself. So, let’s talk about what works.

Every wilderness adventure needs a good guide, and that’s exactly what an analytics partner is: a translator and guide, one who seeks to understand the nature of the business problem first, before jumping to what the technical solution could be.

A good partner is one that sits in the middle ground between business and technology. They understand technology and can develop solutions themselves if the tools are put in front of them. But they approach problems not by telling other people what to do, but by listening to the problem to help create the most effective approach.


There are all kinds of “analytics” techniques, and they all involve navigating a new path with data. It’s easy to get lost in the different ways to gather, organize, and analyze data because the methods are vast: machine learning, statistical analysis, data transformation, data visualization. Analytics is a blanket term that includes all of these; so, when people ask me “what is analytics?”, they often cite one of these methods and say, “is that what you mean?”

Yes, but not really.


Effective questions

An “analytics partner” is not someone you hire just for their technical expertise. The ability to do the job with precision and accuracy is a baseline expectation. You hire them for the way they pose effective questions, which saves you from going down the wrong trails.

Many times, consultants will suggest that you build a freeway through your data wilderness, or put an expensive solution in place that doesn’t really play well with your environment or culture. You don’t want to build a house in a location that doesn’t make sense.

 

Rapid approach


“Design Thinking” is a term that describes good analytics better than much of the analytics terminology does, because the active verbs in the process fit: “Empathize,” “Ideate,” and “Prototype.” Analytics, by contrast, speaks in more conceptual techniques that sound complicated, mysterious, and confusing. In an attempt to sound impressive, the language alienates non-experts from understanding it.

For example, in the Design Thinking process, “Ideation” and “Prototyping” are key steps that help create a workable solution quickly. The first idea is rarely the best one, so testing and rebuilding it a few times is more likely to produce results that move an organization forward. This is different from the traditional IT approach, where precise “requirements” are needed first to build a more static solution. That requirements-first mindset is often at the heart of why solutions take so long to build yet still miss the mark of what’s really needed.

 

Travel light


Analytics partners are hired because business owners don’t have the resources on their team, or their team is fully committed to other things. The last thing they want is a partner who comes in and requires their folks to work 25% harder. They need a partner with the experience to ask the right upfront questions (What are you trying to accomplish? Where is your data? What is the quality of your data?) and then get out of the way, not one who burdens the team at every step or bogs down because things aren’t clearly defined.

When all this comes together, the experience with an analytics partner can be transformative. They are more than a technical solution provider, because they serve as a translator and guide, carving a beautiful new path through the data wilderness for you. They don’t seek to tame the forest or build a highway through it, because the right next solution may not be in the part of the forest you think it is.


How to Build Data That's Useful


Analytics and Stroller Pushing

One of the best analytical lessons I ever learned was nowhere near my computer. My wife and I were gearing up to have our first child, and we were shopping for a baby stroller. If you have done this, you know the choices are paralyzing. There are at least 20 options, each rated on multiple qualities. After hours of debating what should have been a painless choice, we stopped ourselves and asked, “What is the most important feature here?” After thinking about it, my wife said, “I want to be able to reach down with one hand (because the other will be holding the baby) and pick it up so it collapses, then toss it in the back of the RAV4 in one motion.” Suddenly, 20 options went down to 2 or 3, and we made a decision a minute after that.
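If it helps to see that narrowing in code, here is a minimal sketch in Python. The stroller names, prices, and weights are hypothetical, invented purely to illustrate applying the one decisive criterion before weighing everything else.

# Hypothetical stroller options; names and numbers are made up for illustration.
strollers = [
    {"name": "Option A", "price": 450, "weight_lbs": 18, "one_hand_fold": True},
    {"name": "Option B", "price": 300, "weight_lbs": 25, "one_hand_fold": False},
    {"name": "Option C", "price": 520, "weight_lbs": 16, "one_hand_fold": True},
    {"name": "Option D", "price": 280, "weight_lbs": 22, "one_hand_fold": False},
    # ...imagine 16 more entries just like these...
]

# Step 1: apply the single most important criterion first.
candidates = [s for s in strollers if s["one_hand_fold"]]

# Step 2: only then compare the few survivors on the secondary qualities.
candidates.sort(key=lambda s: (s["weight_lbs"], s["price"]))

for s in candidates:
    print(f"{s['name']}: {s['weight_lbs']} lbs, ${s['price']}")

The point isn’t the code; it’s the order of operations. Filtering on the decisive feature first collapses the choice set before you ever start comparing every option on every quality.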


Good data insight development follows this approach. It is not an attempt to build the Encyclopedia Britannica; it’s an agreement on what piece of currently unavailable information would make the most difference to the people who actually run the business. Here is a fun little video of me talking about this.

Back in 2011, I took a leap of faith. I left the stability of PepsiCo to lead an analytics group at a much smaller energy company. At that time, I was introduced to a new piece of software called Tableau. It seemed pretty cool, and it was easy to learn if you were a strong Excel user. So off I went with my team to build reports from the database of company information we had put together.

One of the first and certainly most notorious reports we developed was for a “very eager” and attention-challenged marketing manager. The good news was that he loved data and believed in not making decisions without it. The bad news was that there was no end to the data he felt he needed to look at.

My team went on to develop the report exactly the way that he wanted it, with all the different possible views and filters he could think of.  With this one report, he would be able to see everything, and answer every question that his directors could pose.

This is an example of what it looked like. My team gave it a name: “Filters Gone Wild.” No one else in the company could stand to use this report for more than two minutes without needing a glass of scotch.

[Screenshot: the “Filters Gone Wild” report]

So why do people do this? Isn’t it a noble intention, after all, to want to see more data? The reason is that complexity creates its own burden. As it turns out, consuming data is a lot like purchasing jam: more isn’t always better. Not only is there a point of diminishing returns in how satisfied we are, but our ability to act is reduced significantly as well.

That was a really interesting role for me, and I’m glad I took it. Not only did I learn a lot of new, useful skills, but more importantly I got to see the gamut of “clients” and how they wanted data. The better ones understood this concept of simplification.

Around the same time, MIT released an article that put some science to what I was learning. They surveyed a few thousand people at multiple companies and determined that top performers were five times more likely to use analytics than lower performers. No surprise there, but what was more interesting was how the top companies approached data. It wasn’t about budgets or the sophistication of the software; the lower performers cited development process and managerial issues as major contributors to blocked progress. What, people are getting in the way?!

A recent client experience motivated me to write this blog. The team had purchased all the software it needed to bang out good reporting. They had a small army of internal folks and contractors who could wrangle and structure the data as well as anyone. But when the six-month check-in on a nine-month project came, they discovered that only rudimentary reporting had been developed, and that the internal clients were disappointed to the point of considering pulling the funding for the expensive software they had purchased.

Why? Because the IT developers who were in charge of it had treated it as a requirements fulfillment exercise.

One of the key points of the MIT article was a concept they called “start in the middle.” They saw a trend in the approach of effective teams: simplify the issue to discover the most relevant information, the piece that would move the needle the most, and then iterate against that until it is honed to a useful state.

It’s a conversation between business people that happens to use technology as a tool to make it come to life. There are no requirements to gather, because what’s needed is never really known completely until the discovery begins. It’s not a conversation with executives; it’s with the frontline managers and directors who make the business happen. Once they start becoming successful, peers take notice, and the path to a data-driven culture grows organically.
