Developer Tea

Interview w/ Daniel Pink (Part 1)

Episode Summary

Today we are thrilled to have Daniel Pink on the show. Daniel has four New York Times bestsellers and a wide variety of experience, including serving as chief speechwriter for Al Gore in the 90s, and he has one of the 10 most watched TED Talks. In part 1 of today's interview, we're going to talk about Daniel's experiences and writings and dive right into what Daniel wishes more people would ask him about.

Episode Notes

Today we are thrilled to have Daniel Pink on the show. Daniel has four New York Times bestsellers and a wide variety of experience, including serving as chief speechwriter for Al Gore in the 90s, and he has one of the 10 most watched TED Talks.

In part 1 of today's interview, we're going to talk about Daniel's experiences and writings and dive right into what Daniel wishes more people would ask him about.

Get in touch

If you have questions about today's episode, want to start a conversation about today's topic, or just want to let us know you found this episode valuable, I encourage you to join the conversation or start your own on our community platform: Spectrum.chat/specfm/developer-tea

🧡 Leave a Review

If you're enjoying the show and want to support the content, head over to iTunes and leave a review! It helps other developers discover the show and keeps us focused on what matters to you.

Episode Transcription

Today, I'm thrilled to have on the show Daniel Pink. Daniel has four New York Times bestsellers. The most recent is called When: The Scientific Secrets of Perfect Timing. He also has To Sell Is Human: The Surprising Truth About Moving Others; Drive: The Surprising Truth About What Motivates Us, which is one we've talked about on the show before; and A Whole New Mind: Why Right-Brainers Will Rule the Future. In fact, two of those books were number one New York Times bestsellers. Daniel has a large variety of experiences. For example, he was the chief speechwriter for Al Gore in the 90s. But he's also been the co-host of a TV show that aired on National Geographic Channel, and he's been on pretty much every major news channel and a lot of the TV shows you can think of. And if that's not enough to lend him credibility, he has one of the 10 most watched TED Talks of all time, with more than 20 million views. Daniel has so much insight to offer. I encourage you to go follow him on Twitter. He tweets out some really excellent research; a lot of the things that he tweets are small excerpts or mini summaries of that research. You can follow him at @DanielPink on Twitter. Let's go ahead and get into the interview with Daniel Pink.

Daniel, thank you so much for joining me on today's episode.

Jonathan, thanks for having me. It's good to be with you.

So I know that developers who are listening to this right now have almost certainly heard of your books. If not from the general media landscape, then they've definitely heard of them from this show. We've talked about plenty of the ideas that you've discussed and researched, or at least compiled research on, in your various books. I want to take a moment, take a step back, because I know you do a lot of interviews, and I'd love to ask you a question that hopefully you haven't been asked in a recent interview.
And that is: what do you wish more people would ask you about?

Huh. What do I wish more people would ask me about? I don't know, actually. You know, it could be one of those situations where you don't know what you're looking for until someone offers it up. So it's not as if I sit here in this interview with you, Jonathan, saying, oh, man, I hope he asks me for my financial advice, or I hope he asks me what I think is the best strategy for the Washington Nationals, or anything like that. You know, I find that a lot of these conversations ultimately come back to some pretty fundamental things. Among them, essentially, one's views on human nature and one's views on values. And the values question especially is one that I think we don't talk about explicitly all that often, but it's a really rich topic, and in many ways it underpins all the other topics.

So what do you think we get wrong when we talk about values? Where do you feel like that conversation needs to go more often?

Well, I think that we often don't surface the values enough. And I think that at some level there's a values conflict in organizations, institutions, and even in cultures. So for instance — where do you live, Jonathan? Where are you based, San Fran?

I am actually based in Chattanooga, Tennessee.

Okay. Excellent example, then. And I'm showing my bias right there, because I figure, oh, you're talking about developers, you're probably in Northern California. So if you look at something like authority — the value of authority — I live on the East Coast, in Washington, DC, a big urban area.
There are a lot of people in big urban areas who think that everybody views authority as something to resist, that authority is a negative. And in fact, there is another competing moral value that says that authority is something to be respected: that I — me, Daniel — am doing something wrong when my 15-year-old son's friends come over and they call me Dan rather than Mr. Pink. There's a conflict of values there. And in organizations, too, there are certain fundamental conflicts. I mean, I don't hold authority that high as a moral value, but that doesn't mean that I'm right, or that there's a single notion of morality there. There are other people for whom authority is a much higher value, and it doesn't mean that they're right either, but it means that there's a conflict among values. So we can't assume that all of us share the same underlying set of values.

The other thing, and I think it's adjacent to this, is: what's your view of human nature? That guides a lot of our decisions about what we do in organizations. If your view of human nature is that people are generally lazy, shiftless, and need to be controlled, that leads you down one path if you're a manager. If your view of human nature is that, hey, most people want to do good work, are willing to try, are conscientious, that leads you down another path. And so a lot of the things that we talk about superficially —
I don't want to say superficial in the negative sense, but on the surface, when we talk about things like performance, how organizations run, work patterns and workflows and the structures of organizations — deep down inside of them are presumptions about human nature and about values.

Yeah. And this is really interesting, because I think if you take that values discussion to the edges of the map, then you find the places where we all generally agree. Maybe this is where human nature begins and where values end, where those lines are drawn a little bit harder. For example, we can all generally agree that killing each other is pretty bad; it's not really a discussion of values. So how can we find those lines? And is it perhaps that some people believe those lines are in different locations on the map?

I think that's part of it. So — this has become Philosopher Tea, not Developer Tea, here — but just think about killing, for instance. In the Obama administration in particular, there were a lot of military actions that were done by drones, on un—

Unmanned aircraft. Yeah.

I was trying to avoid the word "unmanned." Unhumanned...

Unoccupied, sure.

Yeah, unoccupied aircraft that would be deployed and would kill somebody. And there are some people who believe that any kind of killing, whether it's in a war or not, is unjustified and immoral. There are other people who would say, well, those are the bad guys.
Those people are trying to kill us, they've done incredible harm, we need to eradicate them. There are some people out there on something like the death penalty who say, well, the death penalty is an appropriate punishment for the most grievous crimes, while other people say no, under no circumstances should the state take the life of somebody. On things like abortion: is an abortion the killing of a human being? There are people who say, well, yeah, it is, because life begins at conception. There are other people who say, well, of course not, because that embryo is not a human being. And so even then, there are these divides at the moral level that we sometimes don't talk about. Now, those things about killing and death and abortion typically aren't germane to a lot of organizations, but some of these other underlying moral issues are — especially, as I said, when it comes to things like tolerance, ambiguity, authority, even things like order and cleanliness. Those are big deals. And organizations could be talking about these things in more useful ways. Would you agree with that?

Totally. So how can we have better conversations around values? Maybe that's a good question that we don't necessarily have an answer to.

You know, I think that in general, a lot of what happens in organizations — the interactions and certainly the conversations — in some sense are performed rather than lived. We're all playing certain roles — now it's Sociologist Tea. We're performing some of these roles, and there are sometimes not authentic conversations.
I think authentic conversations inevitably surface some of these deeper issues, and inauthentic conversations keep those things behind a locked door, which diminishes their usefulness. On a mundane example, you see something like the conversations that people have in performance reviews. Those are a completely performative encounter. Somebody's playing the role of boss, somebody's playing the role of employee. It's not the kind of conversation that people would have when they're out having a beer with someone they're comfortable with.

Right. It's true, especially in tech. If you were to go into a startup in San Francisco, you can identify the person who doesn't want you to talk to them: they're the ones wearing the headphones. It's certainly a stereotype, and a lot of developers play into it, because there's some kind of identity that is forming there. Perhaps it's a personal identity, but also this organizational identity — the idea of having a ping pong table. Hopefully we're slowly moving past that, and I think that's something that is changing in the tech industry. We're moving a little bit past the "oh, unlimited paid time off" — well, everybody knows that's actually not true, and if you were really to take unlimited time off, you wouldn't even work there anymore, right? So there are these identity things that we're building and maybe breaking, and there's kind of a post-startup culture that's emerging from that as well.

Yeah, probably. I mean, again, I think that you raise another good point about identity.
Identity matters much more than a lot of conventional business and political thinking takes account of.

Yeah, and I also believe that the values discussion that you brought up earlier plays heavily into that, right?

Yeah, yeah.

And so we're trying to form — and perhaps it's because we have difficulty exploring and expressing our own values. Something I've created this show to help with is evolving the listener's ability to express their own values. Because if you can't express your own values, then perhaps someone else will express them for you, right? We adopt the values of our neighbors, or we adopt the values of the company that we work for. And this is how we get these kind of homogeneous groups of people that cluster together — because they're geographically close, or some other kind of collection of people — and they start to adopt each other's values. And this is very normal, isn't it?

Yeah, totally. I mean, we are social creatures, we human beings. And so we are deeply influenced by who's around us when we decide how to behave. We look for cues from people in our midst. We are complicated creatures. And again, a lot of times I think that the way we think about business, the way we think about organizations, doesn't account for that kind of complexity, doesn't account for the multi-dimensionality of many human beings.

Yeah, absolutely. Of all human beings. And this drives, no pun intended, back to some of your original, or perhaps most well-known, work about autonomy, mastery, purpose. How tightly would you say those interact with values — with purpose, for example?

Yeah, well, I think if you look at the research on autonomy, mastery, and purpose, I actually think they're pretty deep and fairly universal.
That is, in some ways, I think they can cut across certain kinds of values. So if you look at something like autonomy: autonomy doesn't mean, you know, screw the man, I'm going to go off on my own like a cowboy in the American West. What it means, basically, is self-direction — having some sovereignty over yourself. And that can be in the service of something collective, as it would often be in East Asian places, or it can be in the service of something purely individualistic, as it often is here in the United States. But the idea that people want to be self-determined, that they want to have some sovereignty over themselves — I think that's universal. And it's perfectly compatible with a sense of collective desire or collective purpose. I think the same thing is true with something like mastery, which is our desire to get better at stuff; I think that cuts across values. And the same thing is true with purpose. That is, I think people want to know why they're doing something rather than just how to do it. Everybody's purpose isn't necessarily going to align, but I think everybody does have an animating sense of purpose that helps drive what they do. So if you think about the difference in values, I think these are attributes — this is actually more like human nature rather than values. That's why I think that autonomy, mastery, and purpose — our desire to get better at something that matters, to know why things are happening, to have some sovereignty over what we do and how we do it — I think that is human nature. And the reason I say that is not because I'm an optimist, necessarily, but in part because I'm a father, and because I've had kids and I've seen kids.
And I defy you to find me a two-year-old or a four-year-old who's not autonomous, self-directed, curious, wanting to learn and grow, and wanting to know why things are happening. I think that's our nature.

Absolutely.

I think that events and institutions conspire to snuff out that nature, but deep down, I think that is human nature. That's not all of human nature, but it is fundamentally part of human nature.

So what would be the motivation in snuffing that out? That's something that I think is kind of the implicit conversation here — the second-order effect. What is required for autonomy is perhaps a deep level of trust, right? So is it a lack of trust? Is that why we want to shut this down? Like a need for control?

That's part of it. Well, you got it — that's the key word you just mentioned: control. Human beings also have an impulse to control other people. We like to dial down uncertainty. We like to exercise our authority over people. And if you just look at human civilization, in different pockets of time and around the planet, it has always been these battles between control and autonomy. One group is trying to control another group; the other group says, you can't control me, I'm autonomous, so I'm going to fight back. And also, if you think about something like autonomy — and even mastery and purpose — they're not purely efficient. If you're just trying to make the trains run on time, who cares whether people are getting better at anything? You just want them to execute getting the trains running on time, at least in the short run. That's all you care about.
That's another reason why some of these contingent rewards — what I call if-then rewards: if you do this, then you get that — have an effect in the short run, and the fact that they're effective in the short run is one of the reasons why they endure. So there is this battle here between humans wanting to control other people. There was a time in the workforce when efficiency was by far the highest virtue, and in a sense you wanted to build organizations that went against the grain of human nature because it was more efficient. But now, I think, in a world where people are doing more creative, complex tasks, you want people to have a sense of self-direction, because they do those kinds of tasks better. You want people to learn and grow, because that's how they improve at those kinds of tasks. You want people to know why they're doing what they're doing, because they end up doing the task better. And so the interesting thing happening right now, if you look at 50 years of behavioral science about what really motivates people, is that in many cases our organizations were designed to go against the grain of human nature, because that was more efficient. So for instance, an assembly line, to my mind, is not consistent with what we know about human nature. It basically says: we're going to suppress human nature — perhaps understandably, perhaps properly — in the name of this other thing that we value, which is efficiency and production. Now, I think what's happening is that you can, ever more, run organizations that go with the grain of human nature — not necessarily for nicey-nice, touchy-feely reasons, but because that's a better way to run organizations.
Right. Yeah. Because it's actually more efficient for the creative types of tasks, and it's no longer a labor economy; it is more of a knowledge economy.

Right. I mean, again, the skills have gone from routine to non-routine. So for instance, take an example like stuffing envelopes. For stuffing envelopes, controlling mechanisms can be effective. If you want a lot of envelopes stuffed, pay people per envelope; you'll get a lot more envelopes stuffed than you will if you pay them a flat rate. There's no question about it, at least in the short term. Certainly in the short term, the same is true if you have monitoring cameras to make sure that everybody is stuffing their envelopes, or if you have people walking back and forth to make sure that everyone is filling their envelope-stuffing quota. But when it comes to things that require judgment, discernment, creativity, people don't do their best work under conditions of control. We're not creative when we're always being measured. We're not creative when somebody is monitoring us. We're less creative when we have to do it within a scaffolding of contingent punishment and contingent rewards.

Yeah, and this is definitely true for development. If you pay your developers per line of code, then your code is going to get long.

Yeah, exactly right. You won't get good code; you'll get long code.

Absolutely. And it's the same thing as the discussion on gamifying — you know, the rats' tails. I'm sure you've heard this. I believe it was in London; the experiment was done where London as a city, or some city — I'm probably getting the details wrong — paid per rat tail. And so what ended up happening is people
outside of the city started farming rats, and they would bring the tails in after they farmed the rats. So, yeah, the system will be gamified. And it's not necessarily because there are nefarious people — there aren't rat farmers all over the tech industry, necessarily — but because what you reward is what you're going to optimize for.

Right, exactly.

Thank you so much for listening to today's episode of Developer Tea, the first part of my interview with Daniel Pink. Daniel was so gracious to come on the show, and I hope that you enjoyed this first part. I hope you will subscribe in whatever podcasting app you're using now so you don't miss out on the second part of this interview. You may notice that we didn't have a sponsor for this episode, and this was on purpose. We actually have an ad-free version of Developer Tea, and if you want to listen to more episodes like this one that are uninterrupted by ads, then I encourage you to go and download Breaker. This is an app for iOS right now, and it's coming to Android soon. With Breaker, you can subscribe to the ad-free version of Developer Tea through Breaker's premium service, Upstream. This allows you to support the show directly with a $4.99-a-month subscription, and in return you get to listen to all of the episodes of Developer Tea without any ads. Thank you so much for listening to today's episode, and until next time, enjoy your tea.