Jason Parker

April 2, 2021

Metaphor is a great tool in data science implementation

Having worked in both education and health care, I've learned that influencing highly educated, highly competent people with data science predictive models is very, very difficult. There are a few common challenges that come up early and often in conversations:

  • "There's nothing this can tell me that I don't already know."
  • "There's no way that a machine can replace people in making this kind of decision."
  • "We already do a great job with this, there's no need to rock the boat."

All of these remarks convey the idea that many people in high-skill, high-education roles see data science models as intrusive and threatening to their work. That's a pretty hard starting place for a conversation about innovation or incremental change.

I often joke that my secret weapon in data science is that I trained in the humanities, specifically in literary studies. Where many of my friends and peers in data and analytics reach for statistics to drive a point home, my instinct is to look for a metaphor.

Metaphor is a powerful literary device because it bridges the gap between our personal experience and something unfamiliar (perhaps something as simple as someone else's personal experience) by drawing on images and concepts that are familiar.

The metaphor I like to use to talk about the role of data science predictions in most kinds of decision-making is GPS apps like Google Maps. I open by asking people if they use GPS on a regular basis to drive home after a day at work (this question hasn't been quite as effective during the pandemic, that's for sure). Most people say that they do. If I want to be extra provocative, I scoff and say, "Ha! You don't know how to drive home?!" This usually gets some laughs...usually.

The point is that nobody thinks using GPS and machine learning-generated directions to get home from the office means that the person driving doesn't actually know how to get home. We all understand that the purpose of getting directions from our phones is to help us find the fastest route. Machines are helping us make better choices, not telling us something we don't already know.

I think this means that most of our implementation efforts for data science predictions need to be reframed: this technology serves as support and extra input to make human decision-making even better. This is particularly true in health care, education, and other domains where the actions we take have immediate and long-lasting effects on human lives.

So the next time you're struggling to convince and energize people to adopt new practices that interact with and rely on machine learning or other predictive techniques, think about what metaphors you can use to help this technology seem a little more approachable and familiar.
