MINDWORKS

Mini: Do you believe in magic? (John Hollenbeck)

March 04, 2021 Daniel Serfaty

The Brady-Belichick Patriots, the Apollo 13 team, and the Allies: what do they all have in common? They are all legendary teams; one could say there is something “magical” about them. But is there actual “magic” at play? Join MINDWORKS host Daniel Serfaty as he speaks with Professor John Hollenbeck of Michigan State University about the “magic” of teams.

 

Listen to the entire interview in The Magic of Teams Part 5: Multiteam Systems with John Hollenbeck

Daniel Serfaty: There is a myth almost, teams are almost mythical, especially in America, with this notion of sports teams and the great teams that are more than the sum of their parts, et cetera. Does it introduce another level of complexity, this notion that there is some magic happening because human beings are designed to work with other human beings? Something at that level?

John Hollenbeck: I love the term magic, because I think we were talking about this before, we kind of used the term magic. I do think there's a magic there, because the chemistry... it's not what's happening at the individual positions, or the dyads, or the triads. It's all of those things working in parallel, so that in many cases things will happen, and it just looks like magic to you because, as with a good magic trick, you're looking at the right hand, not the left hand. You're looking at the individual, not the team, or you're looking at the team and not this particular dyad within the team.

And so the magic occurs because you're looking at the right hand, not where the action is. So breaking that apart is fun. But I've got to say, there are two things about the metaphor when you use magic: there's both good magic and bad magic. Again, I really believe that there are decisions so bad, so irrational, and so illogical that you can only get them out of people in a social context. An individual working alone would never make these mistakes, but I can get my MBA students, my executive development students, to make unbelievably stupid mistakes if I put them in a group context and set them up.

I'll give you one example. This would never happen to an individual. You're probably familiar with the cognitive research on framing: if you frame an issue in terms of the things that will be lost, you literally get into a very risk-seeking part of somebody's brain and they become risk-seeking. If you take the exact same data and just flip it around to talk about what you could gain... I mean, these are just inverse probabilities; it's the exact same thing, you've just framed it as a gain or a loss. If you frame it as a gain, people become extremely conservative. Now you take that process, which is an individual process, and put it into a group context. We'll take my MBAs. We do a lot of surveys, so I'll take four MBAs that I know are really risk-seeking in general, as a predisposition, and four of my MBAs that are really cautious, as a predisposition.

You can set it up: "Oh, you're making an irrational decision. Relative to the probabilities, you're being way too cautious, or you're being way too risky." But what happens in a group context is group polarization. That is, if you and I start out overconfident, you're 80% confident, I'm 80% confident, [Debra's 00:10:34] 80% confident, you put the five of us in a room together for 20 minutes, and then you come back and ask us how confident we are, it's like 99. You're literally polarized, because, "Wow, Daniel, I didn't think of that. You're right. That's even better than what I thought of." Nobody has any negative information. We just kind of feed off each other, and if you go to my cautious students, they go in just the other direction. They're afraid of everything. They won't get out of bed. "Oh, this is only going to be 20% successful." By the time they're done, there's no chance. It's .01.
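As an editorial aside, the jump Hollenbeck describes, from five people at 80% confidence to a group at 99%, has a simple quantitative illustration. If each member treats the others' confirming arguments as independent evidence, the group's pooled odds multiply. The pooling model below is a sketch of mine, not a model from the interview:

```python
# Group polarization sketch (illustrative model only, not Hollenbeck's):
# if each of n members treats the others' arguments as independent
# evidence, probability converts to odds, odds multiply, and the
# pooled confidence races toward the extremes.

def pooled_confidence(individual_p, n_members):
    # Convert probability to odds, raise to the number of "independent"
    # voices, convert back. One member reproduces the individual view.
    odds = (individual_p / (1 - individual_p)) ** n_members
    return odds / (1 + odds)

print(round(pooled_confidence(0.8, 1), 3))  # 0.8: one person alone
print(round(pooled_confidence(0.8, 5), 3))  # 0.999: five 80%-confident people
```

The same arithmetic runs the cautious group toward zero: five people at 20% confidence pool to under 1%, matching the "it's .01" direction in the story.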

Only in a group context can you take people that would be a little bit irrational individually and make them unbelievably irrational. And then I have a slide about it. My students are blown away that, number one, this happened, and number two, it was so predictable that the dude's got a slide about it, and the rest of the lecture is built on this error that we all just made and didn't even see coming. That's why I believe it's magic: I can make this happen over and over and over again, with every executive development group, every MBA group.

We do a thing called a $10 auction. I don't know if you're familiar with the $10 auction.

Daniel Serfaty: No, please.

John Hollenbeck: Okay [crosstalk 00:11:46]. You can only get this in a group context. Basically, you put a $10 bill in an envelope and say, "We're going to have a bidding war for this $10 bill." Now, in most situations, the key to an auction is figuring out what something's really worth, but you know this is exactly worth $10, and so this auction has an interesting set of rules.

If you pay the most, you get the $10. If your bid is the second highest, you pay that bid, but you don't get the $10. Third, fourth, fifth, you're out of it. So I put that $10 in there, and we start. Usually it just sits there for about 15 seconds, and Daniel, I've got execs that are COOs in organizations. Eventually some exec will say, "Well, it's a no-brainer. I'll pay $1 for a $10 bill." Then another guy says, "Two." "Three." "Four." "Five." "Six." They get up to seven, and then they start laughing, because they'll look at me and go, "Dr. Hollenbeck, you're a bad man. Seven plus six, you're going to make a profit on this $10. Ha, ha, ha." I get it up to nine, and, as always, one of the students finally says, "Wow, this is a really great lesson. Yes, I will pay $10 for a $10 bill."

That guy always says it like he thinks it's over. It ain't over, because you go to the person at nine and say, "I know this seems odd, but the decision confronting you now is: you either eat a $9 loss for sure, or take a chance." Notice how I framed that? You eat a $9 loss, for sure. I just framed it as a loss and put this person in risk-seeking mode. Or take the chance that if you say 11, this knucklehead's not going to say 12; it's a $10 bill. You know what that guy says, every time? He says, "11." And then you turn to the guy holding the 10 and say, "I know this seems odd, but here's the decision that confronts you now. You either eat a $10 loss, or you say 13 to prevent it." And that guy says 13, and then these two people will go up. If you can get them to 19, they'll often freeze at 19. They'll have a hard time getting to 20, but if you push them over 20, they'll go to 29.
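The escalation Hollenbeck describes is the classic dollar auction, and its trap can be sketched numerically. The bid amounts below are hypothetical stages, not figures from the interview; the point is that outbidding always looks locally better than quitting, even after both bids pass the prize's value:

```python
# Dollar-auction sketch: the top bid wins the $10, but the
# second-highest bidder also pays their bid and gets nothing.
# Hypothetical numbers, to show why escalation feels locally rational.

PRIZE = 10

def quit_now(my_bid):
    # Stop bidding while in second place: eat your bid as a sure loss.
    return -my_bid

def outbid(rival_bid):
    # Raise the rival by $1 and hope they fold: win the prize, pay the bid.
    return PRIZE - (rival_bid + 1)

# At every stage, outbidding (if the rival folds) beats quitting,
# even long after both bids have passed the $10 prize.
for my_bid, rival_bid in [(8, 9), (9, 10), (11, 12), (18, 19)]:
    print(f"my bid ${my_bid} vs rival ${rival_bid}: "
          f"quit {quit_now(my_bid):+d}, outbid {outbid(rival_bid):+d}")
```

Each step only risks one more dollar against a sure loss of many, which is exactly the loss framing that keeps both bidders in risk-seeking mode.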

The most I ever got, I was actually doing an executive development at Kellogg, with the nicest people you'll ever meet. I got this thing up to 19, and I just wanted to see if I could push it to 20. There was this woman, Sarah, and Sarah just wasn't willing to go to 20, and she was going against this guy, Frank. I remember their names. Her girlfriends said, "Sarah, we're not going to let Frank beat you," and they took out their purses and started giving Sarah money. Now every guy in the audience goes, "Well, that's just bullshit," and they take out their wallets. Daniel, I am watching this room full of people taking out tons of money, so it becomes a battle of the sexes where they fight over this $10 bill. Keith [inaudible 00:14:26], one of my heroes, at the University of Illinois, actually got $1,900 for a $20 bill one time. That's the record.

No individual working alone would ever do this, but if you put them in a group context, there's your magic. It's not good magic. It's bad magic, and in an MBA class or an executive development class, you have to teach people escalation of commitment and why you need to have circuit breakers on certain decision processes. You made the initial investment, but you don't make the re-investment decision, because you're invested. You're done. Somebody with a cold, hard, calculating heart will make the re-investment decision. There's a bunch of things you can do. [crosstalk 00:15:05]

But I'm not done yet, Dan. I'm going to give you one more, because once I get all this money from these execs, I don't want it, and so I want to give it back. So we play something called an ultimatum game. The ultimatum game goes like this. There are two people. You make an offer, and there's an ultimatum. We don't negotiate. I can either accept your offer or turn it down, okay? So it's an ultimatum. Now, usually at the end of the $10 auction, I have 30 bucks I want to give away. You make an offer, and the person who wins the money is the person who can take the most value out of it but still get the other person to say, "Yes, I accept it." I will tell you, I usually get it started.

You and I are playing. I go, "There's $30 in there, Daniel. I'll take 29 and leave you one." How are you going to react to that? I'll tell you, all my exec students are like, "No, that's unfair," and they say no. Then you go down to the next person. "How about 28 and two?" "No." "27 and three?" "No." I will have people literally say no to $12. And then you say to them, "Do you understand that you just violated every rational economic principle in the world? Your choice is between $12 and zero, and you took zero." And then the exec will go, "Hell yeah, because the other guy got..." It's like, "No." We call that counting other people's money, rather than making decisions about your own money. You're making decisions about other people's money. There are so many of these things that you know in advance and can set up.

I want to give you one more, and then I'll jump to your questions, because this is a really important one. In my business, have you ever tried to teach the Challenger space shuttle case? It's very difficult. Thank God somebody came up with the Carter Racing Team. Carter Racing is a case where you walk an MBA team or execs through a situation where they have to decide whether or not they want to race in a NASCAR-style event. It's the last Sunday of the year, here's your history, the car's been breaking down, the people who sponsor you are upset, and the case lays out all of the contingencies: if you race and win, yay, this happens. If you race and you're competitive, this happens. If you race and you lose, that's okay. But if you race and the car doesn't finish, because you've been unreliable, that's a disaster.

You walk all of these MBA teams through it, and in the end, I will tell you, Dan, every single one of these teams? They race. Then you have the greatest moment, when you flip the slide and go, "Congratulations, you just lost the Challenger." Then you show the Challenger blowing up. If you try to teach the Challenger case directly, everyone looks at it and goes, "Well, what a bunch of idiots. Didn't they see the O-ring data? Didn't they see what the temperature was? Didn't they see where the wind was coming from?" The Carter Racing Team is the exact same data, but now you have to detect it in advance instead of explaining it after the fact. Again, for most of my students, number one, we tell them, "You just lost the space shuttle," and number two, they know that I knew they were going to launch it. Now let's talk about decision-making errors in high-stress contexts in the face of previous failures.

All of a sudden, they're a little bit more open to listening, where if you try to teach the Challenger case as the Challenger case, it's like, "No, I would never do that. What a bunch of idiots. Those guys are stupid. What's wrong with the NASA people? Aren't they trained?" No, no, no, no. You would do it. That's kind of the bad magic: these things are completely predictable. You can only get certain bad things out of a group that you couldn't get out of an individual. That's a lot of the fun of it, too.

Daniel Serfaty: Well, thank you for sharing all these stories. They are basically cautionary tales about that magic. Maybe it's the black magic of teams at some point, because it re-emphasizes why it's really important to understand team dynamics and to try to put in place the right structures, the right processes, the right interactions, in order to prevent the kind of groupthink phenomena you described earlier: group polarization, or situations in which people no longer optimize their own utility functions but are basically trying to maximize some other function that has to do with the social hierarchy in the team. Who is the person who's going to win the auction, for example?

Now, in a lot of situations that I know our audience is going to find themselves in, such as work teams, those things may not happen to the extreme that you can orchestrate in your MBA classes with your MBA students, but they do happen all the time. We see very often in meetings that things deteriorate, and when you look at them in hindsight you say, "Well, the team forgot what they were trying to do. They got into another situation." We know from history that this notion of establishing consensus too fast, for the sake of consensus, is actually dangerous. The Cuban Missile Crisis is a classic example that people talk about, where all these advisors basically reinforced each other's mistaken beliefs.

John Hollenbeck: Yeah. I've got two things on that before we leave it, because the Cuban Missile Crisis is kind of an interesting example. I do feel that we team researchers do a lot of predicting after the fact, and we often blame teams for the kinds of things we're describing here, in many cases where you don't have the counterfactual evidence. For the things I've been talking about, we know what the rational decision was; we know what the counterfactual evidence is. But in so many team contexts, because you didn't go in a particular direction, you don't even know what would have happened had you gone in that direction. It really does kind of promote... and the Cuban Missile Crisis [inaudible 00:20:36] really got people used to this paradigm where some really smart person would go into the archives of some decision fiasco and then dissect it, just like the Carter Racing Team, and tell you all the things these idiots did wrong. We've got to really be careful.

That's why we need a science, a science where you have to make your predictions in advance. It's easy to predict the future after it's happened; as Yogi Berra put it, the future's hard to predict in advance. I think we've really got to check... That's why, scientifically, I kind of believe in quantitative science. I'm always a little leery of qualitative studies where people go in knowing what already happened, or go in with a particular angle. You've got to predict the future in advance, and so what I try to do with my classes is show that some of this science is so magical, I can predict it in advance. I can build a whole lesson plan around it. That's how irrational you're about to be. Again, that's the fun of it. That's the magic of it. I love to teach this stuff. I love to research this stuff. I love going to work every single day. I can't wait to find out what we're going to screw up next, and then fix it and move on. Yeah, I'm totally fascinated by all of that.

Daniel Serfaty: I stand corrected. I didn't mean that those historical examples... I know that they're taught as paradigms of decision-making mistakes, or of misunderstanding the situation, in teams or in groups. My point is that, and I remember we worked on some of those projects in the past, when you do the forensics of something that was disastrous, where lives were lost, as we have many examples of in the military or in industry, and you interview the folks who were in the middle of that decision-making process, who are now [inaudible 00:22:23] almost being accused of, characterized as, having made a mistake. But once you immerse them back in the same situation, they are all pretty adamant that, "Given what I knew at the time, with all the uncertainty at the time about the information, and the time I had to make a decision, I would do exactly the same thing today."