
Tuck Knowledge in Practice Podcast: Decision Biases Under Risk and Uncertainty in the NBA

NBA teams are awash in player data, but their personnel decisions come down to human judgment. Tuck professor Daniel Feiler explains how numerous biases can cause teams to make suboptimal decisions about whom to trade, recruit, and draft—and how those lessons apply to any organization.

In 2024, Tuck professor Daniel Feiler had a series of confidential conversations with executives of numerous NBA teams. As an expert in the psychology of judgment and decision making, Feiler was curious how these executives were using the proliferation of in-game data to make decisions about which players to trade, recruit, and draft. He found that, like managers in any organizational environment, NBA executives were prone to making biased decisions without even realizing it. They evaluated players using differing standards, or they didn’t fully account for the context of players’ performance, or they over-weighted how certain players performed against them.

In this podcast, Feiler discusses how he worked with NBA teams, and what he learned about decision biases that can apply to other corporate settings. 

Listen Now


Our Guest

Dan Feiler (pronounced like “filer”; he/him/his) is a behavioral scientist whose research explores the psychology of judgment and decision-making and its role in organizational behavior and management science. He is a senior editor at Organization Science and an associate editor at Management Science, and his work has been published in numerous journals, including Psychological Science, the Journal of Experimental Social Psychology, and Production and Operations Management. His work has also received popular press coverage in the Wall Street Journal, Forbes, Fast Company, and the Washington Post, among others. Feiler is the founding faculty director of the Impact Academy, a suite of custom executive education programs for the U.S. Olympic and Paralympic Committee. He has been awarded for his research at Academy of Management, Behavioral Decision Research in Management, and Max Planck Institute for Human Development conferences; he was also selected by the Tuck class of 2015 for the Excellence in Teaching Award and, in 2017, as one of the Top 40 Business School Professors Under 40 Years Old by Poets & Quants.

Transcript

[This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of the Tuck Knowledge in Practice Podcast is the audio record.]

Daniel Feiler: I think a super common issue in the NBA is what I'm calling the "against us" bias, which is: however a player performs against us massively affects how good we think they are in general. So I was asking them about this, like, what do you do? And they're all like, it's inevitable. What I found fascinating is they're all like, yeah, that happens for sure. We're terrible about it. I think everyone else is terrible about it too. So, like, what can you do?

[Podcast introduction and music]

Kirk Kardashian: Hey, this is Kirk Kardashian, and you're listening to Knowledge in Practice, a podcast from the Tuck School of Business at Dartmouth. In this podcast, we talk with Tuck professors about their research and teaching, and the story behind their curiosity. The NBA finals are happening, and to celebrate, we've got a special edition of the Knowledge in Practice podcast. My guest is Tuck professor Daniel Feiler, and we discuss his recent work with NBA teams on managing risk and uncertainty in their decisions about which players to trade, recruit, and draft. NBA teams these days have more in-game data at their fingertips than ever before. Never mind the usual statistics of shots, assists, rebounds, and blocks. Teams now have tracking systems that follow players on the court and can tell how far away they were from defenders when they launched a three-pointer. Some teams use this data strategically, while others are struggling to marshal it. But all teams reach a point where humans must use the data to make judgments about players. And that's when all sorts of biases can creep in. In this episode, I talked with Feiler about some of the decision-making biases NBA teams have been vulnerable to, and how those lessons apply in any organizational setting.

Kirk: Dan Feiler is a behavioral scientist whose research explores the psychology of judgment and decision making and its role in organizational behavior and management science. He is a senior editor at Organization Science and an associate editor at Management Science, and his work has been published in numerous journals, including Psychological Science, the Journal of Experimental Social Psychology, and Production and Operations Management. Feiler is the founding faculty director of the Impact Academy, a suite of custom executive education programs for the U.S. Olympic and Paralympic Committee. He has been awarded for his research at Academy of Management, Behavioral Decision Research in Management, and Max Planck Institute for Human Development conferences. He was also selected by the Tuck class of 2015 for the Excellence in Teaching Award, and in 2017 was named one of the Top 40 Business School Professors Under 40 Years Old by Poets & Quants. Dan Feiler, welcome to Knowledge in Practice. Thanks for being here.

Daniel: Hey. My pleasure. Thanks for having me. How are you doing, Kirk?

Kirk: I'm doing great. How are you?

Daniel: Fantastic.

Kirk: Ah, it's such an honor to have you here talking about this stuff. I think it's really interesting. Um, so we're talking today about a subject you've been researching and teaching for more than ten years, right?

Daniel: Yeah. Yeah, absolutely.

Kirk: Um, so it's decision biases. In particular, we're going to talk about your work with NBA teams in this area, which I think is fascinating, and what you learned from that experience that can translate to other corporate environments. Personally, I've found this area of social science intriguing ever since I read Thinking, Fast and Slow by Daniel Kahneman.

Daniel: It's a tome. You made it through?

Kirk: Oh, it's a tome. Okay. Yeah. Um, I mean, I think of it as sort of a handbook for life: how to make decisions, how to approach situations, how to interpret situations. I think about it all the time.

Daniel: Yeah. No. It's brilliant. Yeah.

Kirk: Um, and I think it's interesting that he's a psychologist who won the Nobel Prize in economics, which shows, I think, the kind of crossover of what you're doing too, because you're bringing psychology into the business field. So let's get into your work with the NBA, which is exciting. What were you doing, how did it develop, and what were your objectives?

Daniel: There were certain questions that started to feel very important to me as someone who studies decision making under uncertainty: how we think about forecasting, or navigating risk. And at the same time, we've got analytics blowing up, massive amounts of data availability, and we're able to rely on that for decision making more and more. And so what I think is interesting for a lot of organizations is they've got all this data, and it feels like we should be able to rely on analytics for our decision making, which should reduce the effect that human judgment might have on some of these things. And yet a lot of times the analytics all go, ultimately, to people sitting around a table making a decision informed by that. And there now become questions of when are the data misrepresentative, what is it that the analytics are able to measure and capture readily, and what are the things that analytics and data can still get fooled by? And so all of a sudden, I felt like there were all these interesting questions around, okay, we're so advanced in some ways, and yet are we smart about how we're bringing together analytics and human judgment? And maybe some of these biases that we've typically talked about in terms of cognition are now actually in the data and in the analytics themselves.

Daniel: That was sort of another flavor that I became really interested in. So, at a high level, these were things that felt on the frontier for me as I think about decision making under uncertainty in organizations. Then there's the NBA context, professional basketball. What I found really interesting about the NBA is that, as opposed to baseball, where everybody goes up one at a time and it's very parsable (it's one pitcher, one batter, and they go against each other), in the NBA, five players are on the court at a time. And in generating opportunities to bring value to their team, there's so much interdependence. The attributes of one player, their decision making, affect the value that another player is able to add. One example being, if one person's a really good shooter from far away, a good three-point shooter, that means the defense needs to stay closer to them. They can't give them as much room, which then gives room for the other players on your team to operate. So a three-point shooter, even without shooting the ball, even without receiving the ball.

Daniel: That skill now creates opportunities for others. But it's hard to understand that analytically, because all these things are so interdependent. So I really like that aspect of professional basketball: there's all this interdependence, and we're trying to measure these things, and there are a lot of statistics, but they're going to capture certain elements of this much more easily than other elements. And we know these teams are using all this human judgment; scouts are everywhere also. So, okay, how does this fit together? This feels like a very relatable challenge. It's also very well documented, in that the analytics are kind of readily visible from the outside in. And so I thought, man, this would be cool if I could get access. And through some connections, I was able to get access to some NBA executives, and we did a series of discussions. Basically, the idea was to share how we're thinking about these challenges with each other: me, from my research in my field, and them, from their organizations. These teams would have these conversations separately, anonymously, and they wouldn't know which other teams were also doing it. So it was an interesting experience.

Kirk: Yeah. Wow. That sounds really fun. I imagine you're an NBA fan.

Daniel: I am yes. Yeah.

Kirk: Yeah, yeah.

Daniel: Well, so I grew up in Pittsburgh, but there's no NBA team in Pittsburgh. I'm a University of Pittsburgh college basketball fan, but I had no NBA team. I always loved basketball; my high school was a state champion. Schenley High School, which is no longer with us. Rest in peace. But when I came up here, I had my first child, and there were lots of times where I was just holding the baby and couldn't really do anything. And I started just putting on the Celtics games every night. And so now I'm a Celtics fan, having watched them for, I guess, how long have I been here? Is it 14 years? Something like that. Yeah. So now I'm a Celtics fan.

Kirk: Wow. Cool. Well, there are definitely worse teams to.

Daniel: To root for.

Kirk: Oh, yeah. They're quite an impressive team these days. Um, so when you met with these NBA teams, did you have a goal as far as trying to help them make decisions about recruiting, about player scouting, and things like that? What was sort of the agreement between you two?

Daniel: Yeah. So I think, at a high level, what I was interested in, and what was the premise for the conversation, is how do they evaluate talent? So things like the draft: you have to choose which player to add to your team, from basically college players primarily, and you have some data, and you get an opportunity to add a player to your team through the draft. How should you figure out who is the right player to choose? Same with player evaluation to inform trades and free agent signings. So these are all basically the personnel decisions. And all of that comes down to: how good do I think these players are going to be in the future, based on what I know now? So there's an inherent forecasting challenge there, in addition to other things you can imagine. Like, you know, there are character issues, there are various things. But primarily that's the lens. So I'm not interested at all in, you know, should you foul the other team with 30 seconds left in the game or not. I'm not interested in in-game decisions. I'm interested in how the organization thinks about allocating its resources to basically maximize the value, the quality, of their team and their organization. We talked, you know, there's a get-to-know-you factor; we're all trying to feel each other out and understand each other's motivations. It's a funny business. And we can come back to some of the tensions I've learned about that linger in the background in that field in general.

Kirk: Well, I'm sure it's a little bit awkward to just start talking. Well, how do you pick your draft? You know, how do you decide who to trade?

Daniel: Yes. It was interesting because they were at first very cagey. And then by the end, we got to know each other pretty well; they were naming specific players they were right or wrong about in a particular situation. But I think what was cool about this was the whole premise was for us both to learn from each other. I'm bringing stuff to the table, and I'm going to try to get them to start thinking about things, based on my sense of the field. And of course, based on what they're wrestling with, it will resonate with them or not. So I would say, I think this is a natural thing we should probably start thinking about as I'm thinking about decision making under uncertainty. And then they would say, well, we have thought about that, or we haven't really thought about that, and here's how we've been going about it. It really became an exchange of ideas, and it was very different from a lot of the things that I typically do. So anyway, that was kind of the premise and the flow of it.

Kirk: Yeah, yeah. Um, so I imagine teams have loads of data they can look at when they're evaluating a player, right? There's in-game data: points, assists, rebounds, blocks, all these sort of objective statistics. And then there are maybe the less objective factors, like how much of a team player they are, or their ability to be a leader on the court, their ability to influence other players. How did you think about the universe of data that teams evaluate when they pick players?

Daniel: Yeah. I mean, I let the teams lead on that front. I think one thing that's interesting and important is that they've gotten way better in the last ten years in the types of data they look at. So now they've got tracking systems that will follow where players are on the court. And so now they can do things like say, when a given player took a shot, how far away were defenders from them? And so now you can look at how well they're shooting on wide-open shots versus how well they're shooting when contested, things like that. So they've advanced dramatically on the availability of this kind of data. Some teams are now very intentional about how they're using it and building strategies around it, and for other teams, it's like discovery in law: you've got the room full of files and you're just like, ah, that's a lot. I'll use this, but I'm not exactly sure where I'm going to start. And so they're still figuring out how to apply it in an organized and consistent way, I would say. But I let the teams sort of lead a little bit on that side.

Kirk: Yeah, yeah. Well, I mean, I imagine that could be a bit of a pitfall: if you have all this data, you think you know everything, but you really don't. Right? Because you have all this human judgment that has to go into it. Data is like garbage in, garbage out; you don't know if what it's telling you is really reliable. And that, I think, gets to what you're doing, which is decision making under uncertainty. So is part of what you're helping them understand, well, actually you don't know as much as you think you know?

Daniel: Um, that sounds a little too condescending for me. I would say I was challenging them to ask difficult questions. Let me give an example. A lot of times they would use analytics for a specific question. They would say, like, this player or that player? Or, let's make an assessment: does this player have the potential to be a star? What does the data say? And I think one problem with this, and I think this is rampant in the NBA, though some teams are better about it than others, is they go to this way of thinking once they're already thinking about specific cases. Like, let's say in any organization someone's up for promotion. And now you're like, okay, this person's up for promotion, they're up for partner. Should they get it or not? And now all of a sudden you're talking about their strengths and weaknesses, and all of a sudden you're like, okay, well, what do we value in this organization? What's our decision rule? But now we have the case in front of us, so everything we're talking about is really about that case. Rather than, without thinking about specific cases, stepping back and establishing your decision rule and thinking through, okay, what are our organization's trade-offs around these things? How much are we willing to give on this dimension to get on that dimension? We should be thinking about those and establishing those in advance, separate from individual cases.

Daniel: And then we get there and we can apply that. And what I think happens in a lot of organizations, and in many of the NBA teams I spoke to, is they get to the specific case and now they're like, okay, let's basically create a decision rule to evaluate it once you're already there. And what ends up happening is you become very inconsistent in how you apply your logic. Every decision is informed by analytics and is seemingly very reasoned and informed by data, but you're doing it in a different way for different players at different times of the year. And what ends up happening is they get massive motivated reasoning, like confirmation bias. They have a specific case, they have a conclusion they kind of want to come to, and because they didn't establish in advance how they're going to use all this data to inform decisions, they've got tons of degrees of freedom, and they use them. And so it feels super objective, right? You can point to the data.

Daniel: But the fact is, when it's a different player four months later, they're taking an objective approach but using different kinds of data to measure x, y, and z to justify it. So if you want to say, oh, so-and-so is actually a good defender, for example, what do you look at? Well, you can pick different ways of looking at it: what's the opponents' shooting percentage when this person's within three feet of them? Or what's the other team's rate of scoring in general when this person is on the court versus off the court? There are actually a lot of different ways of thinking about that. They're all objective. But if you haven't decided in advance which ones you're going to use, and how you think those are informative or not, those just become degrees of freedom to come to the conclusions you wanted in the first place. So that was one thing I was pushing on them, and I would say that was somewhat emergent. That wasn't my intention going into it, but I think I encouraged them to talk more about it. Some of these teams are very good about this already, but many, I think, struggle with it: think about this in advance of specific individual cases, so that when you get there, you can be consistent in how you apply your decision rules.
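The pre-commitment Feiler describes can be sketched in a few lines of code. Everything below is a hypothetical illustration: the metric names, weights, and numbers are invented for the example, not anything an actual NBA team uses.

```python
# A hypothetical pre-registered decision rule: the metrics, their weights,
# and the scoring logic are fixed in advance, so the same rule is applied
# to every player instead of being chosen case by case.
# (A real rule would also standardize each metric's scale first.)

PRE_REGISTERED_RULE = {
    "metrics": ["opp_fg_pct_within_3ft", "opp_pts_per_100_on_court"],
    "weights": {"opp_fg_pct_within_3ft": 0.6, "opp_pts_per_100_on_court": 0.4},
}

def defense_score(player_stats: dict) -> float:
    """Score defense using only the metrics fixed in advance.
    Lower raw values are better for both metrics, so each is negated."""
    return sum(
        -player_stats[m] * PRE_REGISTERED_RULE["weights"][m]
        for m in PRE_REGISTERED_RULE["metrics"]
    )

# The same rule evaluates every candidate: no swapping in flattering
# metrics for a player the front office has already fallen in love with.
candidates = {
    "Player A": {"opp_fg_pct_within_3ft": 0.52, "opp_pts_per_100_on_court": 110.0},
    "Player B": {"opp_fg_pct_within_3ft": 0.58, "opp_pts_per_100_on_court": 105.0},
}
ranked = sorted(candidates, key=lambda p: defense_score(candidates[p]), reverse=True)
print(ranked)  # better pre-registered defensive score first
```

The point is not these particular metrics; it is that whichever metrics an organization chooses, it chooses them before any specific case is on the table, removing the degrees of freedom that feed motivated reasoning.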

Kirk: Yeah. Yeah. That makes a lot of sense. It sounds like that's the scientific approach. Right. Having the same factors in play. Right. When you're evaluating something.

Daniel: Yeah. I think as a decision-making scholar, you want to be intentional about what your decisions are based on. With this other approach, you may be letting outside factors affect your decision because of that inconsistency, in a way that, if you zoomed out and understood it, you would not want to happen. Because you didn't take the time to establish your process in advance, now all of a sudden all these contextual factors creep in, to the point where it's like, oh, you're friends with this person's dad, and now you're looking at the data differently because of that. Or they went to your alma mater. That's a silly one, but it's actually kind of important, I think. You don't want your decision making and decision processes to be inconsistent across the same case; that would be a signal of, you could say, irrationality or something. And I think some of these organizations are blowing in the wind on some of these elements, much more than you'd maybe think or would want. Again, some of them are doing it really well, but I think there's room to grow.

Kirk: Yeah. Huh. Um, I'm curious to hear about some other decision biases that you, uh, encountered with this work.

Daniel: So one thing that I found really interesting is, in the world of psychology, there's an idea called the fundamental attribution error. When you observe behavior and you're trying to make a causal attribution, what led to that behavior or performance, there's a tendency to over-attribute it to the individual, like their personality, their individual ability, and to under-account for the whole context that contributed to it. One common consequence of this is, you're at a meeting and someone gets upset about something that someone else said, and you're like, man, that person's just, you know, a hothead. And what you might not realize is that you yourself, in the same position, if someone interrupted you when you were proposing your idea, that makes you feel disrespected. Once you actually think about being in that position, you realize, oh, you might have also kind of blown up in that situation, much more than you might think. So we tend to not realize how much we, in those positions, would have done the same thing. But it also causes us to, for example, designate this person a star because they did fantastic.

Daniel: When actually, they had the perfect setup, right? Everything was teed up to help them look like a star. That's a context that contributed to their success. So I like to refer to this as situation neglect: the idea of under-accounting for the situation. What I find fascinating is that in psychology we always talk about this in terms of how we interact and observe, and the causal conclusions we come to. But what I found really interesting is that now the situation neglect is in the data itself, right? You look at someone's statistics and you think, oh, wow, this person is really inefficient. They're shooting poorly, they turn the ball over a lot. That's not a good sign. At the same time, you say, okay, well, what's the context in which this player was acting? Oh, they had to be the lead ball handler on this team, because this team had no one else; they had some big guys, let's say, but they didn't have any point guards or something.

Daniel: So this player needed to be used in a role that's outside their ideal role. And so they're forced to do things that make them look bad statistically, but if you were to get them, you would use them in their ideal role. In some sense, the context matters a ton there, right? But if you look at the numbers, they're not going to show you that context. So I thought that was super interesting: as we look at data, we should think about what the contextual factors are that the data can't see. And to me, now, that's one of the first things I'm asking myself when I look at anything. I think a lot of people are like, well, the data is the objective answer, it's so smart. And the first thing I want to ask is, that's really interesting, great. Let's understand what it is picking up well, and what it is not picking up well. Because that's now where the arbitrage opportunities are for improving beyond that.

Kirk: Yeah. Wow. That's really important, I think, for all of us to remember. Right? I mean, that attribution bias...

Daniel: Well, the fundamental attribution bias, I like to refer to it as situation neglect. I think it's a little bit clearer.

Kirk: Yeah, yeah. Um, have you, this might be a shot in the dark here, but have you seen that at all in the NBA? Like, I mean, just your own personal observation.

Daniel: Yeah, absolutely. I mean, again, I think teams are getting smarter about this. One example would be accounting for how many of a player's shots come at the end of the shot clock. Each time a team gets the ball, they have a possession; they get a certain number of seconds. And sometimes you're not able to generate a good scoring opportunity, and now all of a sudden it's down to five seconds, four seconds, and someone gets the ball and just has to shoot it. Just throw it up there, because it has to hit the rim within that time frame. So there are some players who often take on that responsibility for their team: okay, give it to that person, they're just going to try to throw it up. And so you want to adjust for where in the shot clock these shots are coming from. I think that's a good example of what you need to account for: what was the context in which the shot was taken? And here's one thing that's fascinating. Let's say you want to look at college data, a college player's statistics, and you want to predict how good of a three-point shooter they're going to be in the NBA. It's super important these days.

Daniel: They finally figured out that three points is 50 percent more than two points. It only took them 40 years to figure out that math. So people are taking more three-pointers, and this is really important; it's a very important question. So now you're trying to compare these different prospects: how good are they going to be as a three-point shooter in the NBA? And you might think, what I should do is look at their three-point percentage in college. Seems like a natural thing to do. The issue is that the contexts in which people take three-pointers in college are wildly different, right? Some players are getting way more space. Some players are having to take three-pointers with the shot clock winding down; some are not. There are all these differences. And there's a fascinating thing, which is free throws. This is when someone gets fouled and you go to this particular line. It seems not like the rest of basketball, right? Everyone stands around and you get, say, usually two shots. And so you just stand there, bounce the ball three times, and shoot it. Unpressured, exactly. It's not like the rest of basketball, you'd think. What's fascinating is that they found that free throw percentage in college is just as, maybe more, predictive of your potential to be a good three-point shooter in the NBA as college three-point percentage.

Daniel: Let me say that again. You're trying to predict a player's NBA three-point percentage. You can look at their college three-point percentage, or you can look at their college free throw percentage. The natural thing feels like you should look at their three-point percentage in college, but it turns out free throw percentage is quite predictive. And the reason, and I find this fascinating, is that you want to compare across people. How do you do that? Well, the free throw is the great equalizer. Everyone stands at the same line, the same distance; it's the same context. So it's this nice, clean, situation-neutral measure of shooting touch. And it turns out that's super helpful for predicting three-point percentage, even though, in the spirit of the game itself, it feels completely different. So I love that, because I think we as managers, as decision makers, want to start thinking about: what are the opportunities to compare our options, or to evaluate talent, where the context is neutral or equivalent, or where we can find ways to account for that context? These rare opportunities where you get truly apples-to-apples comparisons with almost identical context, those are incredibly valuable. So I think that was an interesting one for thinking about situation neglect.
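Feiler's free-throw point can be illustrated with a toy calculation. The numbers below are fabricated for the sketch (they are not real player data); they are deliberately constructed so that the context-contaminated stat (college 3P%) tracks the outcome worse than the context-neutral one (college FT%).

```python
# Toy comparison of two predictors of NBA three-point percentage.
# All figures are invented to illustrate the situation-neglect idea.
import statistics

def pearson(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (statistics.pstdev(xs) * statistics.pstdev(ys) * len(xs))

# Each row: (college 3P%, college FT%, NBA 3P%) for a hypothetical prospect.
prospects = [
    (0.42, 0.88, 0.39),  # great college 3P% taken in easy contexts
    (0.31, 0.85, 0.37),  # poor college 3P% (tough shot diet), great touch
    (0.38, 0.70, 0.31),
    (0.35, 0.92, 0.40),
    (0.40, 0.74, 0.33),
]
college_3p = [p[0] for p in prospects]
college_ft = [p[1] for p in prospects]
nba_3p = [p[2] for p in prospects]

# The "great equalizer" (FT%) correlates with the outcome far better here,
# because it is the same context for every player.
print("college 3P% vs NBA 3P%:", round(pearson(college_3p, nba_3p), 2))
print("college FT% vs NBA 3P%:", round(pearson(college_ft, nba_3p), 2))
```

In real scouting the gap is of course not this stark; the sketch just shows why a context-neutral measure can beat a seemingly more "relevant" one once contexts differ wildly.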

Kirk: Well, let's talk about some other biases that you found. So we've got situation neglect. Okay, let's go to the next one.

Daniel: So one thing that I've alluded to a little bit is the extent to which teams can fall in love with a particular player, and then they start doing massive amounts of confirmation bias. Again, part of the premise is figuring out how they're using human judgment and analytics in conjunction. And a lot of times the human judgment was coming first, identifying players of interest, and then analytics was used to refine that. But it became using analytics to justify: I found this player, I almost kind of want to get credit for scouting this player, and now I want to back it up with whatever statistics I can find. So this idea of falling in love with particular players, and that becoming really dangerous, was one thing that emerged. But probably the funniest one I ran into, one I just found hilarious, involved all the teams I talked to. There's an idea called availability bias: we tend to overweight information that we can see easily, relative to information that's more difficult for us to see. It's more memorable to us, more vivid. One of the things I was asking them about is one implication of that: how a player performs against you.

Daniel: These executives are watching all of their own team's games. You've got an opponent, and you see how a player performed against you. Say the Washington Wizards beat my team, let's go with the Boston Celtics, because some random role player went five for six from the three-point line and had a big game. You're thinking, oh, maybe that person's really good now. You can then look at his aggregate statistics, which include that game and lots of other games, and you might see that in general he's not actually a good shooter. But because he had a very lucky shooting game against you, and it was vivid and available, and you experienced the pain of it as he maybe beat you, it registers to a massive extent. So I think a super common issue in the NBA is what I'm calling the "against us" bias: however a player performs against us massively affects how good we think he is in general. And when I asked them, well, what do you do about this? they all said it's inevitable.

Daniel: What I found fascinating is they all said, yeah, that happens for sure. We're terrible about it, everyone else is terrible about it too, so what can you do? And my mind is saying, okay, what I would be doing is tracking how each of my players performs against each other team. I would know what player X's statistics look like against each of the other teams, and therefore who is likely to have a biased view of each of my players. That tells me which teams are probably overvaluing which players on my roster and undervaluing which others, and that gives me a sense of what opportunities there might be, in trades, for example, for improving my team. What I didn't quite get, what surprised me, was how often the answer was just, yeah, it's a bias, we try to avoid it. I think a smart team should be saying: okay, let's assume we can avoid it, let's assume everyone else has this bias, what's the strategy we can create? Let's at least look at that. Instead it was just, yeah, it's this naughty thing we all do.
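The tracking strategy Feiler sketches, knowing which opponents saw your player at his hottest and so are most likely to overvalue him, could look something like this. The game log, team abbreviations, and the 10-percentage-point threshold are all invented for illustration:

```python
from collections import defaultdict

# Hypothetical game log for one of our players; all numbers invented.
# Each entry: (opponent, threes made, threes attempted)
game_log = [
    ("BOS", 5, 6), ("BOS", 4, 7),   # two hot shooting games vs Boston
    ("NYK", 1, 6), ("NYK", 2, 8),
    ("MIA", 2, 7), ("MIA", 1, 5),
    ("PHI", 2, 6), ("PHI", 3, 9),
]

made = defaultdict(int)
att = defaultdict(int)
for opp, m, a in game_log:
    made[opp] += m
    att[opp] += a

overall = sum(made.values()) / sum(att.values())

# Opponents who saw this player shoot well above his true rate are the
# ones most likely to overvalue him -- candidate trade partners.
likely_overvaluers = {
    opp: made[opp] / att[opp]
    for opp in att
    if made[opp] / att[opp] - overall > 0.10  # arbitrary threshold
}

print(f"overall 3P%: {overall:.3f}")
print("teams likely overvaluing him:", likely_overvaluers)
```

With this toy log, only Boston, who watched the two hot games, crosses the threshold, which is exactly the asymmetry a team could try to exploit in trade talks.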

Kirk: Yeah. Huh. I was wondering if your work with the NBA has caused you to watch the game and follow the game differently. Do you pay more attention to certain things now? Are you looking at certain trades and saying, yeah, this was an obvious case of "against us" bias or something?

Daniel: Yeah. Yeah.

Daniel: Absolutely. I mean, my view is very zoomed out, in the sense that I'm interested in the personnel decisions and the strategies employed. So I wouldn't say this work has put me on the edge of my seat for individual games, oh my God, they hit the shot or they didn't. It hasn't affected that. But it has affected my sense of how these teams are run. And honestly, I don't know how to say this without it sounding offensive, but I was a little surprised at the extent to which these organizations are still kind of the Wild West. They're incredibly advanced, there are really smart people in these organizations, and yet the way they ultimately make decisions is a little willy-nilly. There are often one or two people at the top, and if they're in a particular mood or change their mind about something, everyone else kind of has to fall in sync. People don't want to dissent. So yes, I've become more sensitive to some of those factors. One that gets talked about is when a new owner buys a team: they have this incredible urge that they cannot hold back to make changes, to stamp the team as theirs. Imagine a new boss coming in, a new manager in your organization, a new CEO, saying, all right, I'm going to put my stamp on this.

Daniel: Yeah. As opposed to taking the time to figure out what's working well and what's not, they want it to feel like theirs. There have been some examples in the last year of owners coming in and making massive changes. Short-termism is really the issue: they want to look good in the short term, so they basically kill their organization going forward to try to look good in year one. And you can just see it coming, like a train crash: oh my God, there's no way this ends well. Another thing I've started to notice quite a bit, and this is actually a good lesson about how we use data and analytics, is this: let's ask ourselves, are the data from the future or from the past? They're from the past, right? But we're trying to do well in the future, so we're using the data to predict the future. The problem is that all our impressions are based on the past, and we need to be really smart about what that information tells us about the future. Understanding the age curve for players is a good example: there's a natural trajectory of getting better up to a particular age, and then you get worse. I'm now on the worse side of the age curve myself, you know, unless...

Kirk: You're LeBron James.

Daniel: Maybe. Well, right. But he's still getting worse; he's just still better than 99.9 percent of other players. If you had to predict whether he'll be better or worse next year, you would predict worse, because he's past his prime years. He's amazing, so it's not a criticism, it's just a reality. One example you see, and I don't know if I'm allowed to say this, but if you look at the moves the Dallas Mavericks have made in recent years, you see them acquiring and betting a lot on players who are past 30. You've got them signing Klay Thompson, relying on Kyrie Irving, who's past 30, and trading their superstar, who I think is 25, at the good part of the age curve, for a player who's around 31, at the bad part of the age curve. All of these are examples of why you need to remember you're predicting where we're going, not where we've been. It's easy to say, oh, we've got X, Y, and Z all-stars. Well, they made an all-star team in the past; what you want to know is the chance they'll be good enough to make an all-star team in the future, and some of those players are not anymore. So I've been surprised to observe teams not accounting for age curve and trajectory, using analytics, which inherently look into the past, and drawing fixed conclusions from them instead of treating them as projections.
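The age-curve adjustment Feiler is pointing at can be sketched as a toy projection. The peak age and the growth and decline rates below are assumptions for illustration, not real NBA parameters:

```python
# Toy age-curve projection; PEAK_AGE and the rates are assumed values.
PEAK_AGE = 27
GROWTH_PER_YEAR = 0.02    # assumed improvement before the peak
DECLINE_PER_YEAR = 0.03   # assumed decline past the peak

def project_next_season(last_season_value: float, age: int) -> float:
    """Project next season's production from last season's, nudged by age."""
    if age < PEAK_AGE:
        return last_season_value * (1 + GROWTH_PER_YEAR)
    return last_season_value * (1 - DECLINE_PER_YEAR)

# A 25-year-old and a 31-year-old with identical stats project differently:
young = project_next_season(20.0, 25)   # still climbing the curve
older = project_next_season(20.0, 31)   # past the peak
print(f"age 25 projection: {young:.1f}, age 31 projection: {older:.1f}")
```

The point is not the particular curve, which real teams would estimate empirically, but that two players with identical past numbers get different forward projections, which is exactly what "analytics look backward, decisions look forward" means here.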

Kirk: Yeah. Okay. So just to review a little bit: if you wanted to summarize for alumni, or for managers out there, what are some highlights of what you learned from your work with the NBA that people can apply in their organizations and how they make decisions?

Daniel: One thing, again: we've got all these analytics, we've got all this data, and yet it's an input to human judgment. And by the way, these NBA teams have all this data, but they also have scouts all around the world and scouting reports on players. There are all these elements of a player's quality that they rely on human judgment to measure. So what we want to do is think about how we bring those pieces together in a systematic, consistent, organized way to inform our decision making. Let's assume for a second that the analytics are incomplete, that the data are missing something, or that the way we're analyzing the data is missing part of the picture. What is it that human judgment is picking up that the data can't yet? We should use human judgment to think hard about what it can capture that the data are doing poorly, and then use that to inform what we should be trying to measure. Okay, we should be trying to capture X, Y, and Z; once we're measuring those, they can be used as inputs to our analytics. Or maybe we can get the human judgment captured objectively, perhaps numerically: have three or four different people independently assess something, and that becomes data that can go into your analytics.
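That last step, turning independent human assessments into a numeric feature, could be sketched like this. The scout names, the trait, and the ratings are all hypothetical:

```python
from statistics import mean, stdev

# Hypothetical independent scout ratings (1-10 scale) of the same trait,
# say "defensive instincts"; names and numbers are invented.
ratings = {
    "scout_a": 7,
    "scout_b": 8,
    "scout_c": 6,
    "scout_d": 7,
}

score = mean(ratings.values())          # numeric feature for the model
disagreement = stdev(ratings.values())  # how much the scouts diverge

print(f"feature value: {score:.2f}, scout disagreement (sd): {disagreement:.2f}")
```

The averaged score is what feeds the analytics; the disagreement number is a bonus, flagging traits where the independent judges diverge enough that the humans should talk before the model trusts the input.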

Daniel: Right? And now the model is better informed and more accurate, and you go back to the humans and ask, okay, what next? What's the model missing now? What's the next step? It's a cyclical process: figure out what humans are adding that the models can't, then be intentional about what that means for what we should measure. I think all organizations should try to be a lot more intentional about that process. The other, last thing I learned is: holy crap, when people have job insecurity, it's incredibly cognitively handicapping. One massive issue in the NBA is that people get fired so much, and everybody is so stressed about their jobs. I'm sitting there saying, hey, let's talk about how to make these decisions as effectively as we can, and some of them are ready to go there with me. But for a lot of them, and they wouldn't say this explicitly, this is my perception, a huge share of their energy goes to making sure they won't get the axe if things go badly. A lot of it is networking: if I lose this job, can I get a job at another team? They want to stay in the bubble. And the worry about getting fired is so constant that the idea of establishing a philosophy and principles for doing the job well, and executing on that, gets crowded out.

Daniel: The uncertainty around losing your job, I think, is really hindering some of these teams and organizations. The second you start feeling the pressure, you might do something like: if we don't start winning right now, I might get fired anyway, so let me trade everything, all the assets that will be valuable in two or three years, and try to Hail Mary it this year; if it doesn't work out, I get fired anyway. Holy crap, that just kills an organization. So I think the way owners, I believe they're called governors now, put pressure on those running the team, they often don't realize how detrimental that is. And in general, I wish people in organizations were more focused on doing a good job. I know it sounds trivial, but it feels like everybody thinks all that matters now is networking. Sure, you want good outside options and you want to be connected. But it also helps to do your job really, really well. It turns out that gives you a lot of leverage.

Kirk: The fundamentals. Right.

Daniel: Yeah. So can we start there, maybe? I think people don't realize how damaging it can be when you put pressure on people to the point where they feel threatened and start worrying about covering their butts all the time, desperately wanting to stay in the NBA bubble and constantly making sure their network is strong enough that if they need another opportunity, they can have it. I think that's actually really damaging.

Kirk: Well, that probably speaks to decision making under stress, right? Which is usually faulty, people have found.

Daniel: Yeah. Empirically it's interesting, because there's a version of stress where the stakes are high and you become really focused. Incentives create arousal, as psychologists would put it, and that can be focusing; your effort increases. But what we're talking about here is stress of a different kind. So when you look at the literature, it looks mixed, because there are different...

Kirk: Kinds of stress.

Daniel: Different kinds of stress, yes, and different levels. But when it gets to survival, you get people acting much more individualistically and less for the benefit of the organization. So I think there are big-time organizational culture issues in a lot of these teams. I think they're kind of aware of it, but not quite sure what to do about it yet. Some teams are doing a better job than others, fostering opportunities for disagreement. But in some teams, all the way up to the president of basketball operations, job insecurity seems so high that they don't have the energy to focus on establishing a good culture for those lower in the organization. They're just thinking, how do I get through the next year or two and keep my job?

Kirk: Yeah.

Daniel: And I think that's too bad. What's interesting is that some of these conversations were really generative; some teams genuinely wanted to learn, engage, and wrestle with ideas. But those tended to be the teams that were already doing better, in my opinion: they were the ones most able to engage with the things I was challenging them on. Meanwhile, some of the teams that are currently behind, who I think would benefit most from the conversation, almost can't have it because they're so worried. They're in survival mode: we need to get X, Y, and Z in place before we're even ready to think about doing these things more effectively. I hadn't thought about it this way before, but organizations can get stuck in a kind of rich-get-richer dynamic: the ones that achieve a certain level of success and stability are then the most able to ask the next important questions, while the ones fighting for survival don't have the bandwidth to ask those questions at all, which puts them a step behind again. So that was really interesting. The people I met were amazing, but there can be a toxicity that comes with that amount of job insecurity, and I would worry about it for some of those folks. Really, really interesting, though.

Kirk: I'd like to thank my guest, Daniel Feiler. You've been listening to Knowledge in Practice, a podcast from the Tuck School of Business at Dartmouth. Please like and subscribe to the show. And if you enjoyed it, please write a review, as it helps people find the show. The show was recorded by me, Kirk Kardashian. It was produced and sound designed by Tom Whalley. See you next time.

Speaker4: Yo, T-Bone, did you produce this?

Speaker5: Sounds good. Right?