Sunday, May 13, 2012

My Most Influential One Pixel Line


I thought I'd contribute one story to the "telling stories with data" genre, even if it's a silly one. It's silly 'cuz it features such a silly graph, which I shoved into an appendix of a presentation for a client a few years ago. Here's an anonymized version:

[Figure: an anonymized stacked bar chart, a tall red-and-orange tower with an animated arrow pointing at the skinny red line on top]

I put that animation with the arrow in there on purpose, because when I presented it, I had to point out the skinny line at the top. More graphs than you'd expect come with a "performance" component, and in some contexts I think that's just fine. Afterwards, one exec at the company often referred to it as "that chart with the one pixel line." (Okay, technically it was about two or three pixels. Not as punchy if you call it "that chart with the 3 pixel line" or "that chart with the thin red line.")
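
If you're curious how a segment ends up that skinny, here's a minimal sketch with invented numbers: when one category is a tiny fraction of the total, its slice of a stacked bar renders as a line only a pixel or two tall. Everything here, counts and labels included, is a hypothetical stand-in for the anonymized data.

    # A minimal sketch with made-up numbers: one category is ~0.1% of the
    # total, so its stacked-bar segment renders as a line a pixel or two tall.
    import matplotlib.pyplot as plt

    unanalyzed = 980_000  # the giant orange segment (hypothetical count)
    skinny = 1_200        # the "one pixel line" category (hypothetical count)

    fig, ax = plt.subplots(figsize=(2.5, 6))
    ax.bar(0, unanalyzed, color="orange")
    ax.bar(0, skinny, bottom=unanalyzed, color="red")
    ax.annotate("one-pixel line",
                xy=(0, unanalyzed + skinny),   # tip of the tower
                xytext=(0, unanalyzed * 0.6),  # hand-placed, like my arrow
                ha="center",
                arrowprops=dict(arrowstyle="->"))
    ax.set_xticks([])
    ax.set_ylabel("event count")
    plt.show()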

I'm sure there are other, better ways to present this red-and-orange tower. The point is: it was remembered. It had an impact. This graph led to more graphs being created! Roughly, we saw these steps:

  1. Acknowledged and admitted: The one-pixel red line was recognized as a problem (or rather, the un-analyzed orange bar was).
  2. More descriptive graphs were made: This is key; an influential graph or chart always leads to more data investigation, with more graphs. Describe the size of the problem, then delve further. The giant orange segment was tackled: How could it be made manageable? What patterns existed inside it?
  3. Sensemaking/interpretation: What could we do, and what couldn't we? What should we prioritize or safely ignore? What tools were needed? Who owned which parts of this orange bar?
  4. Data tools sprouted: A series of ad hoc and then longer-term tools were built: Excel reports driven by Perl/Python/VBA, then a Flex tool for intermediate data dives, then a Flex dashboard for tracking bigger-picture trends. (A sketch of the earliest, ad hoc flavor follows this list.)
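
Just to make step 2 concrete, here's the flavor of those earliest ad hoc reports, as a hypothetical Python sketch (the CSV file and column names are invented; the real ones stay anonymized): group the raw records hiding inside the orange bar and rank the sub-patterns by volume.

    # A sketch of a step-2 "describe the problem" report: break the giant
    # un-analyzed bucket into sub-patterns and rank them by volume.
    # File name and column names are hypothetical stand-ins.
    import csv
    from collections import Counter

    counts = Counter()
    with open("raw_events.csv", newline="") as f:
        for row in csv.DictReader(f):
            if row["category"] == "un-analyzed":
                counts[row["error_code"]] += 1

    total = sum(counts.values())
    for code, n in counts.most_common(10):
        print(f"{code:>12}  {n:>8}  {n / total:6.1%}")

Reports like this answered "what's actually in there?" long before anything fancier existed.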

Do It Well, and Do It "In-House"

It's an old analytics saw that you can't improve what you don't measure. Well, I think you won't improve what you don't measure meaningfully and then pay attention to. The client had collected the data, but then did nothing with it, because no one had made understanding it a priority. Data for data's sake is pointless and will be ignored. At the time of my one-pixel bar, an analytics cheerleader in the company described our primary data system as "buggy, opaque, brittle, esoteric, confusing." I'd add "understaffed," and as a result of all that, it was usually ignored, which is how the one-pixel red line came to be.

We took a brief detour in which we considered "outsourcing" the data problem to another company that would do the top-level reporting for us, but our (mostly my) investigations suggested we couldn't get the fine-grained, raw-to-dashboard (ETL) reporting and analysis we needed without owning the entire pipeline ourselves. In all these organizational, data-driven settings, the reasoning goes like this:

  • What's going on? Now, and as a result of previous behaviors/changes. Do we have the right data? Trends, alerts, important KPIs.
  • Why is that going on? Drill in. Question whether we have the right data and instruments to diagnose. A deep dive occurs, often all the way back to raw data. This is normal! And it is necessary.
  • How might we change the bad things? This is a complicated question, rarely simple and often not purely quantitative. This is where the profound thinking happens, when cross-disciplinary methods and teams pull together to interpret and chop up the data. Sensemaking and interpretation require lots of checks on data, reasoning, and context.

Cross-Disciplinary Success

Our ultimate data team was a cross-company, somewhat ad hoc group of people who cared about the same thing but didn't report together anywhere: Customer Support, UI development management, directors of development and the API team, and a couple of database gurus. Oh yeah, let's not forget those database gurus: I couldn't have even made that bar chart without badgering them for info on their tables so I could run some SQL against them.
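
For flavor, the chart-feeding query was essentially just a grouped count, something like this (table, column, and file names are invented; the real schema was the thing I had to badger people about):

    # The kind of query that fed the bar chart: one count per category,
    # which became the stacked segments. Schema names are hypothetical.
    import sqlite3  # stand-in engine; the real database wasn't SQLite

    SQL = """
    SELECT category, COUNT(*) AS n
    FROM events
    GROUP BY category
    ORDER BY n DESC;
    """

    conn = sqlite3.connect("warehouse.db")
    for category, n in conn.execute(SQL):
        print(f"{category:<20} {n:>10}")
    conn.close()

The hard part wasn't the SQL; it was learning that the tables existed and what their columns meant.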

In a year, we had achieved measurable, significant improvements via that cross-disciplinary team, and without outsourcing our important data in any way. The short-term tools paid off almost immediately, and I hope the long-term ones are still evolving. One of the team members won an award for the tool he developed for exploring important raw data (and I did contribute to the design). None of this was done under official reporting structures, but the organization was flexible enough to support the networking, collaboration, and skills needed.

I Did Other Stuff, Too...

Since that graph is so silly, here's a little montage of other exploratory data and design work I did while I was with that client. Lots of tools were involved, from R to Tableau to Flex to Python to Excel to Illustrator. Vive la toolset!
