A People Nerds interview with Christian Madsbjerg
“Disruptive” may be the most overused word of the current moment, but it’s hard to find a better descriptor for the work of Christian Madsbjerg, founder of strategic consultancy ReD Associates and author of the new book, “Sensemaking: The Power of the Humanities in the Age of the Algorithm.” Madsbjerg, who’s worked with some of the world’s largest companies, including Ford, Adidas, Coke, and Chanel, argues that in order to really understand what’s going on around us, we have to place the same value on the humanities as we do on data and science. A strong grasp of culture, art, and history, he contends, will deepen our understanding of people and our ability to predict and make sense of their behavior.
His work is not rooted in the ever-nebulous field of “design thinking,” but in the philosophy of Martin Heidegger, whose most famous book, “Being and Time,” explores the phenomenon of shared experience and the social structures that arise from the inextricable worlds we inhabit. In that vein, ReD helps companies rethink some of their most fundamental ideas, crystallizing who they are, how their customers view them, and their role within the larger context of global culture and economy. They’ve helped Lego reimagine what it means to play, worked with Ford to change the experience of mobility, and guided Adidas to a new understanding of fitness.
dscout sat down with Madsbjerg to talk about the importance of theory, the danger of big data, and how an understanding of culture can help us all become “masters of sensemaking.”
I actually think I’m more of a “Culture Nerd” than a People Nerd. Individuals are not as interesting to me as groups of people. I’ve always been interested in how groups of people influence each other, how they figure out what is appropriate and not appropriate, beautiful and not beautiful, and so on. How do people in the same company end up dressing the same and using the same vocabulary? Why does everyone in Chicago or Shanghai do things in the same way? That’s been an interest of mine for as long as I can remember.
I think it’s a big lie that individuals have a lot of agency. If you look at how people are actually leading their lives and making decisions, mainly they’re living within a set of parameters defined by the place where they live. Social theory of the last few hundred years has been about exploring that. We’re moving away from the idea that humans have complete access to their preferences and feelings and make choices when it comes to what to consume, business, even religion. Society really defines those things for us.
Yes. At the beginning of my career I thought it was weird that the “people who were interested in people” weren’t more engaged with the places where the action was happening, like the corporate world. Sociologists were still writing about old topics. At the same time it bothered me that these very sophisticated organizations, who knew how to sell products around the world, had such an unsophisticated way of dealing with and understanding people. They talked about segments and personas and customer journeys and had very, very simplified ways of looking at people. They also thought you could just ask people about what they wanted and needed. And the last couple hundred years of social science have taught us that people aren’t very good at saying what they need. I didn’t understand why the world of humanities and academic interest in culture and music and books and stories wasn’t represented in the world of making things.
I think we’re undermining our innate ability to understand each other. There’s a concept I call The Third Knowledge. The first type of knowledge is subjective: this sound is too loud, or my foot is hurting. We all accept that at face value—I can’t tell you that it’s not true that your foot is hurting. Then there’s the kind of knowledge that we can measure. From New York to Chicago is “x” far. That’s also very accepted. And then there’s the third kind: it might be “the mood in the nation is terrible,” or “this party has a festive atmosphere.” It’s an intersubjective kind of knowledge. We know how far to stand from each other at a party. We know how loud to speak. We know when a joke is too much. It’s nebulous and it’s between us. You can’t measure it. And the business world rejects anything that isn’t accepted as science, or that you can’t know for sure.
But this kind of knowledge is critical. If you completely distance yourself from it, if you only rely on data sets to tell you about your users, you start getting rusty and making big mistakes. That worries me. It worries me that banks can’t understand what’s happening to their customers. They have no sensitivity toward it. They can’t understand why people can’t pay their loans or pay for the houses they bought. It’s something I’ve detected across the board: people are distancing themselves from their human ability to understand others, and from what that understanding tells them.
He has a corporate jet and three drivers.
Well, they all understand that they don’t understand. They all have a sinking feeling that they’re far away from people. I got Mark out in the world, meeting directly with customers, and one of his big insights was about being on a budget. He realized, “Oh, when people buy our vehicle, it relates to other financial aspects of their lives. If their car breaks down, they can’t go on vacation.” That was just not part of his worldview. He’s a very intelligent guy, so when he sees it, he’ll say, “Of course it’s like that.” Yet the way Ford services a car like the Fiesta or the Focus is not at all organized around that, and the way they finance them is not organized around a very basic understanding of how people deal with their cars.
One problem that we come up against a lot is that companies have been burned by the way consultants have tried to help them in the past. They’ve seen projects where people are kind of winging it when it comes to researching other people. At ReD we bring a rigor to it, and say “This is a serious matter. It’s as serious as anything else you’re doing. You have to do it properly. There’s a science to it.”
We do a lot of participatory observation. Following people around, observing and interpreting what they’re doing, comparing them to other people we’ve seen doing similar things. We use a lot of techniques from the social sciences of course, and at the heart of that is the work of philosopher Martin Heidegger. Heidegger looked at everything as sets of different worlds: the business world, the world of cars, the world of fly fishing and so on. He tried to figure out how each world worked by piecing them together. It’s sort of like what a historian would do to understand what it felt like to be a German general during World War II, or the Duke of Windsor leaving the Crown of England. You piece together different sets of data, demographics and pictures and observations and so on. Then you synthesize them, which is something that we humans, for some reason, can do. We can interpret what those things mean once we piece them together, particularly when we have several instances of people doing the same thing. And that’s what we do at ReD, we apply discourse analysis and ethnomethodology to human life and routine.
Absolutely. It’s very, very important to frame your theories for yourself. Without theory, research is meaningless. You might as well send anyone out into the field looking for information. You have to develop a theory that organizes data in a meaningful way and for me, Heidegger was the basis for my theory. I read a lot. I study people through reading, in a way, but I came to Heidegger later in life. I probably read his book “Being and Time” for the first time when I was thirty-five, and I was so frustrated that no one had shown it to me before. I felt like for the first fifteen years of my career I’d been studying the branches and the leaves of the tree, and suddenly here was the trunk and here were the roots. It was the organizing principle to everything I found intellectually attractive.
Without theory, research is meaningless.
But it didn’t just remain a theory. All of my work since has stemmed from it—it’s either a reaction against Heidegger’s theories or it’s a continuation of his philosophical tradition. You have to apply theory to your work. Theory for the sake of theory is boring, and observing the world without theory is unsophisticated. The humanities give us a practical use of theories. That’s the point of “Sensemaking.”
I have nothing against technology in and of itself and I have nothing against A.I. in and of itself. What I do have a problem with is the uncritical stampede into that world without any thought about unintended consequence. With technology, there’s always something just around the corner—right now it’s deep learning—that will solve everything. And then we will have a machine that can watch over us in its wisdom and make choices for us so we don’t have to think anymore. And that is a problem.
Right now the best example of deep learning is skin cancer. If you feed a machine running deep learning algorithms millions of pictures of people’s skin, it can somehow predict whether a mole will be benign or not. And then, based on that prediction and more data, the machine teaches itself when something is benign and when it isn’t. It can make predictions about your skin that are about 10% more accurate than a doctor’s. So the tech people say, “Isn’t that much better than what we have?” Yet if you think about it for a second, what’s the cost if doctors no longer have any interaction with our skin? If we always left it up to a machine to do the analysis and the diagnosis, how would we ever come up with anything new? The way you come up with new techniques and new treatments is by studying reality. If you cut doctors away from reality, you lose the innovation that comes out of them constantly engaging with and making sense of and understanding the skin of thousands or millions of people.
The problem is when we start to think that technology can take over for humans, rather than act as an extension of humans. I have a big problem with the idea that artificial intelligence is better than us or the same as us. I mean, the human brain is the most complex thing in the universe and when you put two of them together, it becomes infinitely more complex. And when you put two million of them together, it goes crazy in complexity. The computer world has nothing, nothing on the human brain, particularly in societies.
You can learn it anywhere, by engaging with culture, by being open to more than one set of data. Take Soros. He’s certainly open to the quantitative analysis of markets, but he’s also open to the political dynamic between Germany and England. And he’s open toward the mood of the market and the different exchange desks and so on. He’s betting on how they will react to each other once all hell breaks loose, right? As an event is happening, you can analyze it. But all the rest, all the fear and greed and chaos that happens after an event like a market crash—if you can predict the second or third step of the markets, that’s when you can really make money. That’s what Soros has done his whole life. He’s cultivated a sensitivity to historical context, experiences, newspaper articles, stories. Narrative data. That’s something we can all cultivate.
Every time. Every time. I have nothing against thin data. It’s just a different thing. A description of a human being gleaned from observing and spending time with that person is so different from looking at the statistical data—the clicks and swipes—that a person generates. It’s a rich reality view versus a very flat reality view, and I think the Googles and Facebooks of the world are kidding themselves when they think that they understand people based off of the ads they target them with. Human beings are amazing creatures, deeply interesting creatures. The study of people is so rich. It’s way more complicated than any other object of study in the known universe.