Today we're talking about ways that we exacerbate our already existing difficulties with seeing things clearly. More specifically, we're determining how to uncover truth as it relates to our surroundings through inspection.
Today's episode is sponsored by Linode.
In 2018, Linode is joining forces with Developer Tea listeners by offering you $20 of credit, which is four months of free service on the 1GB tier. Head over to https://spec.fm/linode and use the code DEVELOPERTEA2018 at checkout.
We've established on this show that we're not very good intuitive statisticians. That's not a novel idea, and it's not something that we've provided primary research for; it's something we've learned from many other smart individuals who have demonstrated it in other fields. But today I want to talk about ways that we, unfortunately, as humans very often exacerbate our already existing issues with seeing things clearly and with accepting statistics as a truthful measure.

My name is Jonathan Cutrell, and you're listening to Developer Tea. My goal in this show is to help driven developers connect to their career purpose so they can have a positive influence on the people around them.

One of the most important things that you can be doing on your journey as a developer, and really as any kind of person, regardless of your career path, is to seek truth and to try to use that truth to become better. Now, truth in this particular scenario is measurable truth, demonstrable truth: something where I can use a method, you can use the same method, and given completely separate instances, you can arrive at the same findings that I have. That's more or less the working definition of truth we want to use. We aren't trying to answer all of life's big questions. We aren't trying to answer what the purpose of existence is, or anything quite that big. Instead we want to focus on truth as it relates to our surroundings and what's happening with the work that we're doing. That's the most actionable truth that we need to surface.

It's important that we find ways to uncover that truth, and one of the ways we can do that is inspection. All this means is observing something and taking down some kind of measurement. That's how we're going to define inspection for today's episode, and inspecting something allows you to identify some kind of boundaries.
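As a small illustrative sketch (the latency numbers and the choice of median here are invented for the example), an "agreed-upon measurement process" in code might look like a shared function that anyone can run against the same observations and reach the same finding:

```python
import statistics

def measure_latency_ms(samples):
    """The agreed-upon 'research method' for this example: given raw
    request timings in milliseconds, report the median. Because the
    protocol is fixed, anyone running it on the same observations
    reaches the same finding."""
    return statistics.median(samples)

# Two independent inspectors apply the same method to the same data.
observations = [12.1, 11.8, 35.0, 12.3, 12.0]
finding_a = measure_latency_ms(observations)
finding_b = measure_latency_ms(observations)

print(finding_a, finding_b)  # both inspectors report the same number
```

The point isn't the median itself; it's that once the method is written down and agreed upon, the finding is reproducible rather than a matter of opinion.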
Really, the key here goes back to that definition of truth for today's episode: that I can create a measuring process that you, me, and everyone else agrees on. We'll call that measuring protocol a research method. This is designing an experiment. It's designing that inspection method, that research method, designing the way that we're going to measure and therefore perceive together. Once we have an agreed-upon measuring system, and that's a really important part of this, we need to agree at that baseline. We need some way of agreeing on a particular measurement process. Once we have that, then we can start to uncover more accurate representations of truth.

I'd love to end the podcast there and say it's as simple as creating a shared measurement process and everyone relying on it, but unfortunately, that's just not the case. Even once we've created a shared measurement process, things can go wrong. Not only can they go wrong because we execute that measurement process incorrectly, not only can they go wrong because of randomness, but even after agreeing that we're going to respect these measurements and these methods, we have a very strong ability to skew the meaning of those measurements, and ultimately a very strong ability to go back on our word and distrust those measurements, even when we have no particularly good, and when I say good, I mean rational, reason to go back on those commitments.

We're going to talk about why that is in today's episode. We've talked about it before in previous episodes when we've discussed cognitive bias, and today we're going to talk about a few cognitive biases that could lead you away from trusting statistics, or away from trusting truth. First, I want to talk about today's incredible sponsor, Linode. If you've been listening to Developer Tea for very long at all, you know about Linode.
At least you know some things about Linode, but we're going to talk about a few new things that we haven't really covered very much on the show before. First of all, let's go ahead and get this out of the way: Linode is going to give you $20 worth of credit. Keep that in mind as we go through some of the things that Linode does, and think about how you might spend that money. They also have a seven-day money-back guarantee. Even after you use that $20 credit, if you spend more money and you're not happy after seven days, you can get that money back. That's the setup here.

Linode provides developers with SSD servers in the Linode Cloud. These servers have native SSD storage, a 40-gigabit internal network, and they all run on Intel E5 processors, basically state-of-the-art hardware. What you may not know about Linode is that they have block storage available as a beta. They also have a beta manager that you can look at at cloud.linode.com, and this beta manager is open source. This is so cool. So many of these products that are so powerful and so useful are going open source, like our very own Spectrum community platform, just a quick plug there. Linode's cloud beta manager is entirely open source. It's a single-page app, and it's backed by Linode's API. By the way, that API is public: you could go and build exactly what Linode has built here. You could build that for yourself if you wanted to.

These services are incredible and top notch, and they have 24/7 customer support. That $20 actually gets you something substantial with Linode. With their entry-level plan, which is $5 a month, that $20 is going to carry you for four months. Even after that, you have that seven-day money-back guarantee. Go and check it out at spec.fm/linode. Make sure you use the code DEVELOPERTEA2018 for that $20 worth of credit from Linode.
Thank you again to Linode for sponsoring today's episode of Developer Tea.

So we're getting back to a very common topic on Developer Tea, but it's one that I believe is worth talking about over and over, because it is not a simple topic to get past: the topic of cognitive bias. We have to keep reminding ourselves of various cognitive biases, and we're going to talk about a few of these today. This is by no means a comprehensive list of cognitive biases, and by no means are we going to go into depth about any one of them. Every cognitive bias that we have as humans is not a simple thing. It's a deeply rooted thing that we have developed over many thousands and even millions of years, throughout our entire existence as humans, as our brains have developed and as we've gone through experiences as a species. We've learned these cognitive biases for a reason; they serve a purpose. So the history and the depth of importance of these cannot be overstated. This isn't a new thing or a new concept. It's something we have dealt with for much longer than this podcast has been around, much longer than you and I have been around, further back than we can really remember or even have history to account for, because these things are so deeply ingrained in our species.

So I want to talk about a few of these. The first one, and this is probably the most commonly discussed one when it comes to statistics and finding truth, is confirmation bias. We're going to move very quickly through this one because you're probably familiar with it; but if you're not, it's super important that you understand it. Confirmation bias is why you can have two people who are so deeply convinced, even after doing their own research, that their side of the story is correct. Now, why is this?
Well, it's because very often, when we have a pre-existing theory, we seek out, and also lend more authority to, information that agrees with that theory. I'll give you an example. When you go to search for a local business, you might search by what you want rather than trying to get a picture of what's available. For example, I might search "best car dealership in Chattanooga." And if I search "best car dealership in Chattanooga," then I'm certainly going to find one that makes the claim of being the best, right? So a better search, or perhaps a less confirmation-bias-prone search, would be "car dealership reviews Chattanooga." Now, this is something that is pretty easy to overcome in that particular type of scenario. But let's say, for example, that you believe in a particular type of processor, or you believe that your programming language of choice has some kind of major advantage over another language. You're very likely to search "what are JavaScript's advantages over TypeScript," and of course, that is a totally made-up example. If you search for this particular type of information, you're already priming that search result to deliver what you were already looking for, what you already wanted to confirm, what you had already set yourself up to believe. So your search was less about finding the truth and more about confirming your own perceptions. Recognizing that reality is a key differentiator between making good decisions and making very poor ones.

So confirmation bias is a kind of basic bias that a lot of other biases grow out of. And it's not just Google searches, by the way. This is how we end up picking our friends and the people we hang out with. Sometimes this is even how we end up giving promotions to people who don't necessarily deserve them based on merit alone.

The second cognitive bias is the availability bias. This can also be called the visibility or focus bias.
This is often found in poorly done research. The availability bias is this: whatever comes to mind most quickly, whatever has been in your mind recently, is likely to have a strong effect on the outcome. We also do this with more formalized research in the form of a convenience sample. This means getting people to participate in research, for example, by recruiting the ones that are closest by. Very often the problem with this, and the same problem happens in our minds, is that whatever is close by is not necessarily a good representation of the whole.

So for example, you may have read a book very recently about diet and exercise. If someone were to ask you what you believe about diet and exercise, you may answer that you think you need to go on a very strict diet and exercise two hours per day. You may be pulling from that availability: it's a heuristic, something that is top of mind, and so you're going to inform a lot of your thinking with whatever is top of mind. But if you were to take a step back and look at a more balanced view of what you believe, a longer-term view of what you've learned in the past, and maybe other books that comment on the same subject of diet and exercise, you might temper that response with a totally different viewpoint, and you might have a different response even as soon as tomorrow.

So it's very important to understand how strongly the availability bias affects us and how blind we are to it. This is why it's so important that you define for yourself what your principles are. Don't allow principles to come in and out of your mind over and over; this happens very often. If we don't sit down and establish our own basic understanding of how we want to operate in the world, our own models for thinking and our own principles, then it's very easy to attach to a well-constructed argument from another person.
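To make the convenience-sample problem concrete, here's a small, hypothetical Python simulation (the population, the group sizes, and the "exercise minutes" numbers are all invented for illustration). Sampling only the people who are easiest to reach badly misestimates the population average, while a random sample of the same size lands close to the truth:

```python
import random

random.seed(42)

# Hypothetical population: daily exercise minutes for 10,000 people.
# The first 2,000 are, say, gym members who are easy to reach; the
# other 8,000 exercise far less. A convenience sample over-represents
# whoever happens to be nearby.
population = [random.gauss(60, 15) for _ in range(2000)] + \
             [random.gauss(20, 10) for _ in range(8000)]

true_mean = sum(population) / len(population)

# Convenience sample: just the first 200 people we could reach.
convenience = population[:200]
convenience_mean = sum(convenience) / len(convenience)

# Random sample: 200 people drawn uniformly from the whole population.
random_sample = random.sample(population, 200)
random_mean = sum(random_sample) / len(random_sample)

print(f"True mean:               {true_mean:.1f}")
print(f"Convenience-sample mean: {convenience_mean:.1f}")  # far too high
print(f"Random-sample mean:      {random_mean:.1f}")       # close to true
```

The convenience sample isn't wrong because the measurements are wrong; every individual number is accurate. It's wrong because the people measured aren't representative of the whole, which is exactly the trap availability lays for our intuitions.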
And then when a new one comes along, we detach from that old one and attach to the new one. The problem with this, obviously, is that you're going to be very unfocused, and ultimately a lot of the decisions you make will look more like impulsive decisions rather than calculated, well-thought-out, principled decisions.

One final bias that I want to discuss in today's episode is groupthink, and this one affects us when we're working on teams. When we don't have an adequate way of measuring outcomes and allowing good outcomes to drive our decision making, very often we end up coming to some kind of consensus, and sometimes that consensus is not as good as the best idea in the room. That best idea may come from a single voice, or perhaps two people who agree on the subject while the rest of the room doesn't. Because of groupthink, which is mostly driven by social conventions, you end up going with a democratic approach where every voice has the same weight, rather than weighting the voices toward the best outcome. So groupthink can be toxic, it can create a very frustrating culture within a company, and it can be very dangerous. You will end up not achieving as much as you could if you found ways to circumvent groupthink.

I want to keep talking about bias as it relates to statistical thinking and seeking truth, and I want to hear from the audience. I want to know how this is resonating with you, how important this stuff has been in your careers, especially those of you who have been in development for a long period of time. If you have a story about how bias creeps in, or ways that you've found to eliminate or circumvent bias in the organizations you work in, I'd love to hear from you. Please send me an email directly at developertea@gmail.com. Of course, that email is always open to questions as well.
You can also reach out on Twitter at @developertea. Send me those questions, stories, and thoughts that you have about bias, about finding truth, and about how you can apply statistics, even in the smallest of ways, to have better outcomes. I'd love to hear from you.

Thank you so much for listening to today's episode. Remember, this stuff is not easy. If you think you're listening to a podcast about how to be a better developer in a few simple steps, that's not really what this podcast is about. Sometimes the steps are simple, but sometimes things are much harder, and usually the things that you have to work hard for are the things that are most rewarding. So I hope that you are in for that challenge. I hope you're willing to accept that you very often screw things up, not necessarily because you mean to, but because you're human. We all have these faults. We all fall prey to cognitive biases, even the most experienced among us, even those who understand that they have these blind spots, these issues with the way their brains work, that they have to overcome.

Thank you so much for listening, and thank you again to Linode for sponsoring today's episode. Remember, you can get $20 worth of credit by heading over to spec.fm/linode and using the code DEVELOPERTEA2018. Thank you again for listening. Make sure you subscribe if you don't want to miss out on future episodes. Don't forget to send in those questions and stories, and until next time, enjoy your tea.