Do not punish learning in software engineering teams

Dr. Cat Hicks shares why development teams are often punished for learning.


We also talk about:
  • how she deviated from the traditional path of a researcher to start her company, Catharsis Consulting,
  • how to foster a learning culture within your engineering team
  • what learning debt is and
  • how learning debt hinders software engineering teams from reaching their full potential.
About Cat Hicks
Dr. Cat Hicks is a data scientist, a behavioral scientist, and a creative entrepreneur.
Today’s episode is sponsored by Codiga, a smart coding assistant and automated code review platform.


Read the whole episode "Do not punish learning in software engineering teams" (Transcript)

[If you want, you can help make the transcript better, and improve the podcast’s accessibility, via GitHub. I’m happy to lend a hand to help you get started with pull requests and open source work.]

Dr. McKayla 00:03 Hello, and welcome to the Software Engineering Unlocked Podcast. I'm your host, Dr. McKayla, and today I have the pleasure to talk to Dr. Cat Hicks. But before we start, let me tell you about an amazing startup that is sponsoring this episode: Codiga. Codiga is a code analysis platform that automates the boring parts of code reviews and lets you merge with confidence on GitHub, GitLab, and Bitbucket. I've worked with Codiga for around one year now, and I love how it guides me in discovering the, well, not-so-nice parts of my code base. But there is more: Codiga also has a coding assistant that helps you write better code faster. Find and share safe and reusable blocks of code within your favorite IDE, on demand, while you are coding. Codiga has a great free plan, so there is nothing that actually stops you from giving it a try. Learn more at Codiga.io. That is Codiga.io.

But now back to Cat. Cat, or Catherine Hicks, holds a PhD in experimental psychology and is a principal researcher and team lead at Catharsis Consulting. She has designed research at places like Google and Khan Academy, co-founded a startup that builds tools for software engineers, and led multi-institutional collaborations in online learning. So I'm super, super, super thrilled to have Cat here with me. Cat, welcome to the show.

Dr. Cat Hicks 01:30 Thank you so much. I'm really excited to be here.

Dr. McKayla 01:33 Yeah, me too. I've been following you for a long time now on Twitter. And I was very impressed, because you have a similar, not the same, obviously very different, but a similar background, like coming from academia and then going independent. And so yeah, it was very interesting to see how you built Catharsis Consulting. And you're the founder of Catharsis Consulting, right?

Dr. Cat Hicks 01:59 That's correct.

Dr. McKayla 02:00 How do you help people with empirical research? I'm really in the software engineering area, and you are now an empirical researcher coming from experimental psychology. How do you help companies?

Dr. Cat Hicks 02:14 That's right. That's right. So it's delightful to connect. I think there is a growing cohort of us out there, you know, in the world, who have made this journey, and there's not really a roadmap for us. So I always love to talk about it.

Dr. Cat Hicks 02:27 I like to call Catharsis an evidence-science consultancy. So this means that we help partners use evidence to inform their decision making and tell their stories. And in particular, we're very focused on meaningful measurement. So I describe it to people as not just data for the sake of data, but creating research methods that give us data that's fit for purpose. So I try to help partners who are trying to learn something real about the world they're working in, and how to move forward. And we have a couple of areas of competency and special focus. But we've led projects, and it's easier to give examples, right, than talk high level. So in some recent projects, I've asked things like: how can we find evidence that a product design change in a language-learning game actually increased the learning that was happening for children using the game? Another recent project is using surveys to help a small nonprofit tell the stories of how community members that they worked with were helping people in their own families and in their social networks learn about the COVID vaccine and make the decision to try to get that vaccine. So both of those projects, very different scale, very different types of data. But both of those projects connected to really immediate impact, whether it was on product design, or on an intervention and programming that helped doctors have better communication with their patients. So at Catharsis, you know, we try to bring a few core principles to all of our research projects. One of them is just that people deserve to understand their data, and to really use the data that they may already have special access to, and to try to bring the tools of empirical research to everyone, even small organizations that may not have invested in learning those skills before. So one thing that we bring in a lot of our partnerships is an emphasis on teaching those research methods, and taking it not just from, you know, findings on one project right now, but actually fitting any work that you do with data into a larger plan of moving forward.

Dr. McKayla 04:39 Yeah, and it sounds really, really exciting. And it reminds me, I'm more on the training side, right? So I'm helping a lot of software engineers actually get better at code reviews, all of that also based on empirical research that I did around code reviews. But recently, this year, actually half of the year, I spent on a research project with a startup and helped them come up with a framework on, you know, what makes the developer experience really great. And they created a product out of that. So I can totally relate to that. And it was really a wonderful experience. But between the experimental nature of research and the startup, somehow there was also a lot of tension, I would say, right? It's very different than in the academic world, where you have these long-running questions. Here it was, every day, the question: is this impactful? Is this impactful? Right? I don't know if you experienced that as well. And how do you handle that in your work?

Dr. Cat Hicks 05:42 I think we all experience that, right? You know, there's a tension between things that you're doing for the long term and needs that are in the short term. And then, you know, just to be really real about it, I think there's a lot of people who have agendas about what we're going to find. And yeah, I, you know, I try very hard to always work with partners who have told me: we are going to make changes based on what we find, even if the changes are uncomfortable to us, even if we learned something that conflicts with what we thought before. This is very important for social impact work, you know, it's very important for equity, when you're going to do anything that has to do with people's well-being, but it is a core tension. And I think that researchers, we tend to be people who love the truth, right? We're just all about finding out the truth. And that can ruffle feathers. I love to do exactly what you described, where you go from working closely with people who are living an experience, and then translate that, you know, to leaders and to organizational structures. And I think it's a beautiful role to be in, but it requires a lot of invisible work, right, of explaining both sides to each other.

Dr. McKayla 07:00 Yeah, and working with this tension, right, which, I think, well, for me, it was a very challenging time, and a time that I learned a lot. Because, you know, for me, the rigor of the methodology is the most important thing, right? And for them, it's more the time first, and then the rigor, I think, right? Like, yeah, obviously, you know, at least a little bit, that is the priority, or they are more stressed by timing than, you know, a researcher probably is, and so on. So you have to deal with these tensions. And I think it was a very, very interesting learning experience for me. But what I really loved is that I could see this research transformed into a product. And this was actually the reason why I left academia, because I was missing that. Yeah, getting real, right? I created a lot of prototypes in my research career, and I actually think some of them maybe would even have had some potential, even for open source, right? Maybe not making tons of money, but some open source software that people would have used. But it was never the time. Again, we are coming back to time, but in a different way, right? Once the paper was published, the time was up to work on that. And so I felt like I couldn't really translate it into what I would like to see, right? And that's why I left academia, for example. How is that for you? Why did you leave the traditional path of a researcher and start your own company and do your own thing, go independent, right?

Dr. Cat Hicks 08:40 Yeah, for sure. So, you know, I think that it's interesting, because I am a researcher who likes to study environments. So whenever you ask someone about their choice as an individual, I think you have to see it also as a choice about what was around them. So I'll be, uh, you know, I'll be real about that. I mean, academia is very hard to succeed in, not because of the quality of your work, but because of the opportunities that are around. But I think that there was a really core piece of what I loved. So I started out working in classrooms, asking about the beginning of how we learn to learn. Even in my academic work, I was very interested in being in real schools, talking to real children. That's where I started. I did a dissertation with 3 to 11 year olds, so you can imagine, oh yeah, asking young children about how they were thinking about mistakes and how they were thinking about learning. So from the very beginning. And, you know, it's amazing how much it pays off, right? Because we all start there, and even now I work with adults, you know, and yet all of the same questions come up all the time. So, you know, I found it beautiful and amazing that people are constantly scanning around them, asking whether it's okay to make mistakes and asking who they can talk to. And I just, you know, I saw a lot of exciting stuff out there in tech. I think the journey for me, too, there's a personal side. You know, that's kind of the problem space, but being an entrepreneur is also a way for me to carve out this role that I did not see existing. So I always felt a little bit like, I'm a social scientist and a data scientist. I'm a data scientist who cares, you know, about how we measure things. I like meaningful data more than big data. You know, it felt like with Catharsis, it was a way to make the job that I wanted to have, you know, to do these kinds of projects.

Dr. McKayla 10:47 Yeah, that's exactly what I did as well. I loved creating the job that I would like to do, one where I feel like I can thrive.

Dr. Cat Hicks 10:57 And it takes courage. Yeah, you have to say, I know this is valuable, which I think you do as a researcher. Just like you were talking about that startup, sometimes you have to be the person who's saying, I know that this will pay off if you do it. You know, you haven't measured it, so you can't see it yet, but I know it will, because I've been there working with people and I see their pain and frustration or whatever else. And then they build it into a product, right? And it does pay off.

Dr. McKayla 11:23 Yeah, exactly, right. Yeah. So I looked at your newest report, which was super interesting for me, because it is around software engineering teams. And there you shed light on the learning debt that we have, and how that can affect engineering teams. Can you tell us a little bit more about what this report is about, what the research actually investigated or looked at? And what is learning debt, and why do we have it as software engineers?

Dr. Cat Hicks 11:56 Yeah, great question. So as a part of Catharsis' work, I can occasionally invest in this sort of work basically for the field. So this is a report I did because I found it really interesting, and shared publicly, and it's called Coding in the Dark. I interviewed 25 software engineers, or developers, and I asked them to share about their active problem solving as they were ramping up on an unfamiliar codebase. So this was people talking about their real jobs right now. They shared about code review, they shared about how they asked for help, how they collaborated. And I've shared a lot about, you know, what we talked about. And essentially, you know, what I found was that even at these really big tech companies (the people I was talking to were all at big tech companies), people's experiences were really quite frustrating. So I called this report Coding in the Dark, because that was a quote from one of the people I interviewed, describing how they felt every day: like they were showing up, and the lights were all off, you know, and they were having to fumble their way through learning without any help from anybody. And there was this core tension that they experienced between feeling like it was so important to learn, to build their understanding, to experiment and iterate, but then when they showed up, you know, to code review, and to other moments where they were being evaluated, that learning was not being valued. So I described this cycle, you know, of needing to do this work, and then finding it devalued, and going back to being kind of heads-down at your desk. I describe that as learning debt. And learning debt is essentially the dynamic that happens when people know they need to put a lot of effort into learning, and they know that the kind of work they need to do requires these mistakes and requires this long-term understanding.
And there's kind of all of this stuff that you're doing that's sort of invisible, because it's not showing up in your productivity. And they also know that the environment around them is only measuring that short-term productivity. So in this kind of environment, where there's a lot of learning debt accumulating, essentially, you know, learning you have to do that you're not getting rewarded for, there's also a lot of performance pressure. And what's worse, you know, for things like documentation, writing code comments, trying to help other people, you can actually feel actively punished for doing that. Another quote from one of the interviews I led was that learning would be seen as a waste of time. And I think one of the engineers called documentation and code comments a red flag about your abilities as an engineer. So you can imagine how that feels. You're in an environment that's telling you to do all this complex work, but also telling you that it's a waste of time if you help anybody else learn from what you've learned. So, you know, a big conclusion that I have in this report is that this learning debt cycle can accumulate damage for a long time, because teams might look very productive on the surface, but you're building what's really an inefficient experience for learning. So I'll stop there, and you can ask more.

Dr. McKayla 15:09 So yeah, tons of questions now. The first one is really: what kind of persona did you interview? You were saying people that are new to a code base, but did you ask them when they were onboarding? Is that the onboarding experience for people, or is that somebody that's already on a team? Because the way you frame the problem, it sounds more like an onboarding experience, and a heavy onboarding experience.

Dr. Cat Hicks 15:37 Yeah, it was a mix, it was a mix. So I think that one thing that's interesting is that you might think, oh, this is somebody who's just new to a whole company, you know, they're experiencing that. But actually, I found this was a repeating cycle. So some people were fairly junior; you'll see there's a cross section of seniority. It is a qualitative project, so it's not intended to be a representative sample. I think, you know, follow-up surveys on this kind of thing would be really, really fun to work on. But in this cross section, we did have a good number of junior folks, but also senior folks, even a couple of people who are leading the engineering teams at their organization. I did ask them to bring an example, to think, before the interviews, about a recent problem they had of basically trying to understand someone else's code. For some people, this was a really brand new codebase, right, like the whole thing as they were joining a company, but for some people, it was just a piece that they hadn't really touched before. So yeah, it was happening really all over the place. Right? Yeah.

Dr. McKayla 16:48 The other question that I had, when I tried to envision this, is: what kind of learning? Because there are many things that, you know, we can learn as software engineers, and I feel that everything that has to do with technology is rewarded, and is seen as something, you know, that you get some credit for, at least, and it's also very internal; a lot of engineers like to learn new technology. But then if you're coming to a code base, the domain knowledge, right, all the work that you have to do to understand this piece of code for code review, and so on, right? I can maybe relate more to this. This is the kind of learning where people would see it as, you're supposed to already know it, right? So let's skip that step: you know it, and then you do your productive work. And this, you know, is somehow the invisible thing. Is that, and this is just my guess here, going in the right direction? Or what kind of learning did you investigate here?

Dr. Cat Hicks 17:48 Absolutely. And I love that you have called out the complexity of learning. You know, learning is a big word for a lot of different things, right? And, of course, you've had some really phenomenal thinkers on this podcast that I've really enjoyed, who've talked about, you know, productivity is not one thing, satisfaction's not one thing; the same could be said for learning. So, you know, I thought a useful contribution in this report would be to talk very broadly about the beliefs we have about learning. But the actual specific examples are a lot of different things. And I think it does map right onto exactly what you said. So developers feel like, oh, if I'm learning a new language, or a new piece of, you know, a new tool, something that's very explicit, right, that's easier to defend, and it's easier to justify. But the focus is always on the technology, right, and the production, and not so much on, oh, now I really understand how this other team has a mental model of, you know, this connection piece, or I really understand this dependency that happens, and I understand these trade-offs.

Dr. Cat Hicks 18:55 So, you know, there's actually a tremendous amount of content I got in these interviews that's not even in the report, because it was so much. I think it could be another report on the kind of active learning that they were doing. And a lot of it felt, you know, almost secretive. Like people were saying, oh, you know, I'm sure no one else has to do this, like I do, having to go back and remind myself, you know, but I don't want to talk about it, because I'm afraid I won't look like an engineer. But the reality was, to me, a lot of that stuff, like thinking about the trade-offs of different decisions you made, thinking about whether a design decision, you know, that we put on paper really was that way in the code, and even questions that are kind of like, is it worth the investment to fix this inefficient piece when I could instead be working on this other piece? You know, these are very abstract things for people to be thinking and learning about, but they're really, really critical. And I was reminded, too, there's a lot of myths around learning, right? And as a social scientist, I recognize some of these myths. So people will tend to think, once I learned something, it's just learned forever, right? It just goes in, like my brain is a bucket, and I just dumped something in there, and it's always gonna be there. But actually, learning is really a behavior over time. So the more an environment can not see it as shameful, but see it as beautiful and productive and great that sometimes we're asking each other for help, we're reminding ourselves how things work, you know. And you see that when developers talk about googling for answers, right, and asking on Stack Overflow, and all of these other kinds of things that people do. But it was interesting to me how much they hid that stuff from their environment.

Dr. McKayla 20:44 Yeah, because the real engineer knows all the keyboard shortcuts. And I think it's so true, what you say, right? So learning, what is learning? If we are making a decision around trade-offs, I think very often it's not framed as learning. And then it's also, you know, if it's not in a book, if it's a very specific instance of something, not a general thing that I can learn, what does this even mean, right? So we learn, for example, about object orientation, and you know, how to have objects. But then to really think about this piece here, this instance: should I create an object here? And how should the object look? And that I have to think about that is a little bit shameful, because, obviously, I learned object-oriented programming, and so it should be easily coming to me, you know, what methods I should put in here. Or naming, right? Naming a method. Yeah, it's also learning somehow, and we have to put the time into it, and then it's hard. And even though we make jokes about it, if somebody sits next to you and you have to think about a good name, and only stupid names come to your mind, it's horrible.

Dr. Cat Hicks 22:07 Yeah, and I think you're pointing out something that's actually really, really important here, which is, you know, there are good jokes and bad jokes, right? And we've all probably been around someone who has made a joke, you know, that has made us feel really bad about how we learned, or a mistake that we made. And this is something that came up in the report, too. You know, I think one of the quotes was from a junior code writer, who said, I'm always watching the senior members of my team, because I want to know what an engineer is supposed to sound like. And that can be really beneficial if the people around you are saying things like, we all make mistakes, we all forget something, you know, we all help each other. That's a good learning culture. But a negative learning culture, right, a bad culture, is a place where people are saying, oh, you know, don't waste your time, like, doing this documentation. Like, in order to get ahead, what you actually need to make sure you're doing is putting on this performance. You know, things are very multifaceted; all of these things are always happening at once. But I do think that in engineering culture, there's a lot of myths around what brilliance looks like. And this is where I've pulled from some research from people like Andrei Cimpian, who's done some work on, you know, when a field thinks that you don't make mistakes, that you have to just be born brilliant, then that is a story. That is not how it works, right? But we're all kind of upholding that myth.

Dr. McKayla 23:41 Because we all want to be the 10x engineer, right? And then we have to have the 10x engineer. Oh, my God. Yeah.

Dr. Cat Hicks 23:51 Yeah, something that, you know, just hurts my heart, honestly, too, is that people do do this work, right? People do mentor other people, they do support learning. And that actually is what creates 10x results. I mean, investing in learning is one of the most evidence-backed ways that we have to do good work together. And if we could see it as something that we are sharing, and that we're all working on outside of ourselves, you know. It's never about, you write bad code, I write bad code. All right, fine. Like, we work to make the code better. It's outside of us, and it does not tell me who you are as an engineer. In fact, a good engineer is someone who's written a lot of code. Yeah, I mean, we need to, you know, we need to improve things and give feedback, right? But I think we need to value the messages that that feedback sends.

Dr. McKayla 24:47 Yeah, and I want to come back to these different kinds of things that we learn. You know, writing good code, whatever that means, is also, I think, changing over time. What is good code, right? What is a good way to write code, what are good applications, how to structure them? That also evolves. But again, I would say this is textbook knowledge, right? And then, I think, what comes back to code reviews, and to the day-to-day work that we have to do, and to productivity a lot as well, is this constant learning, right? I cannot stop learning. It's not like, oh, now I work. You know, obviously, you get better at this code base, and more familiar with the terminology and with how your team works, and so on. Yes, right? But still, even if I'm on this team for three years, and have worked with this code base for X years, if I have a new change that somebody else wrote, then I have to look at this code, and the learning starts right there. You know, I cannot come in and have this full bucket of knowledge of how that works, and then, you know, be supposed to already point out what went wrong here. And maybe it has to do with how we measure time. A lot of people, I think, have real problems with time; I have a lot of problems with time, like, when should I leave the house to be on time, right? And I think very similarly, we estimate, for example, how long it will take to look at this code and give comments. And very often, people reduce that to the time to make the comments. But this learning part, which never stops, continues, and nobody wants to talk about it, and, you know, nobody actually wants to have it, and nobody has time for it. That somehow gets forgotten, right? Yeah.

Dr. Cat Hicks 26:42 I agree. I agree. And time came up a lot in these interviews. And it doesn't surprise me, because, you know, we all have felt this time pressure. And what I kept asking was, you know, if you're experiencing this time pressure, like, what is the first thing that gets cut? And it is, honestly, to me, some of the most valuable stuff, and that is really hard for people. So, you know, there is a sense in which, and I totally agree, doing this work, the learning, will never stop. And, you know, it can feel a little overwhelming. But I think that that's a reason to say, you know what, success is not you getting to the end of your learning. Like, that's not what success is. Success is having enough space to make a good decision instead of a bad decision about how we move forward. And I did see people go through that. And actually, you know, I agree that it's very difficult sometimes with this work to predict how much time it's going to take. And I experience that with my own work. People ask you to do a research project, and you say, okay, it all sounds good, but I need to get in there and see what the truth is. And we might learn it's way more complicated. So I think about things like, you know, can we have measurements of productivity that are dynamic, that we're able to come back to and change? And I think people get very, very frustrated, you know, when they are assigned a project, they dive into it, they do all this learning (actually mapping out how complicated something is, is a very valuable piece of learning), and they turn around and they want to share that with somebody, and there's no way to share that, you know, there's no way to kind of get credit for it.
So that's, you know, that's one thing I think about: if we can make some of that more visible, right, like allow you to use the learning and share it with collaborators, I think that people really enjoy that; they feel the productivity of it, even if the goals of the project change. Another thing is, you know, can we talk about where time pressure makes sense and where it doesn't make sense, right? So can we prioritize, and see the cost of putting everyone under a time crunch all the time, and where that is just creating these learning debt cycles? Yeah.

Dr. McKayla 29:06 What I want to understand a little bit more is: there were definitely some outcomes from this report. Tell us, how can we improve, right? How can we reduce this learning debt? How can we have this growth mindset? How can we, at least in our engineering team, celebrate learning and make it a bigger priority? What are some of those outcomes? What can you suggest to engineering teams that want to improve their learning experience, and the valuing of that?

Dr. Cat Hicks 29:40 I think there's a piece of this puzzle for every different role, right? So, you know, starting from leaders, from engineering leaders: these people can have a really outsized impact on the culture. And I think that, you know, a lot of places will put a poster on the wall that says everyone can learn, or maybe there's a bullet point in a slideshow about, like, we're a learning culture. But if you go to work and you see someone actually get rewarded for a complicated learning situation, like, hey, you know, we told you to go try to do this thing in the codebase, and it turned out the thing that we proposed was not possible to do, but you did all this learning, you figured out a better way forward, we're gonna celebrate that instead of, you know, coming down on somebody for it being not what we expected. Those kinds of moments matter. And I think leaders have the ability, you know, to notice that, to try to push themselves to amplify that; that can have an impact. Another thing I would suggest, you know, that I suggest in the report, is we honestly need to separate some of our development feedback from some of our performance feedback. So, okay, I don't know how many conversations you've had with engineering friends about perf cycles, but perf cycles are a huge source of stress. And this is a whole area of research, there are tons of people, you know, who look at this. But even though we have invested huge structures into it at tech companies, a thing that I keep seeing as a learning scientist is that we are rarely letting people have the psychological safety to talk about their learning. So I think that a very simple step that leaders could take is to make space to separate when you're talking about how you want to learn and grow and develop, and maybe explore areas of growth for you, and separate that from promotion, performance, reputation management, times that you are trying to defend yourself. It's very difficult, you know; you can't really do those two things at the same time. I have a number of other recommendations in the report. You know, I think that there are some simple steps, like: have we put any time in our calendar for documentation? Or are we just acting like that's gonna happen magically by itself? You know, so there are small and big steps to try to make yourself a learning culture. Does that all make sense?

Dr. McKayla 32:05 Yeah, totally. And documentation is maybe the last thing that I want to talk a little bit about, because there, again, we have these two different kinds of information sharing. You have the external documentation of how things work, right? And people agree that, you know, an API needs documentation. But then the nitty-gritty part becomes a little bit translucent, right? It's like, oh, this method, you should be able to understand it just by looking at the code, otherwise the code is not good, and don't put a comment there, that's really bad. Right? And sometimes I really can't understand the problem here. Because, well, there are different learning types, and for some people it's easier to look at the code and really get it, and skip the comment, right? And some people like the comment, and it gives them context, and you can really read it in your native language, in written language instead of code. But again, here comes the myth a little bit as well, right? We say, well, code shouldn't actually need documentation, you shouldn't need documentation to read it. And there is also some research around that, which showed that if there are comments in the code, people are slower at reading the code. But why? Because they are reading the comments, right? And would they read the comments if the comments were useless? No, they are reading the comments because the comments are actually helpful. Right?

Dr. Cat Hicks 33:45 That's such a good example. Yeah, that's such a good example of a measure that is taken to be a negative measure, but might actually be a positive measure. Yeah. You bring so much rich lived experience on this, and I love hearing it, because the reality is that these are going to be contextual decisions. Like, as you said, code that was "good," quote unquote, at one time may not be once the context has changed, and then you need to make a different decision.

And there were these interesting quotes, you know, when I interviewed people, about: who is this for? Is the documentation actually for me? Or is it for, you know, some idealized scenario where we're describing the technology? A point of view that I have in my consulting is that I like to focus on people as the heart, and on code writers as learners. If we take this approach where we center their learning, we can be a lot less afraid of things like the trade-off of having more comments. That doesn't work for all situations, but you have prevented some really deep losses in efficiency, invisible losses that are happening. So if someone is able to ramp up a lot more quickly, that's a huge gain. And I think something difficult about it is that sometimes that gain is really invisible. But it's not really possible to have a single way of describing code that's going to work for everyone who's ever learning. Yeah. And similar to measuring developer productivity, I think it's a question of: what is the best thing for us right now, and what's going to pay off the most? Even if it slows us down a little bit in this way, I think it will really pay off if, later, this person who we gave all the support to is able to become this champion contributor. And I use the learning debt metaphor to evoke tech debt, because we understand tech debt in this field, right? We understand that technologies with all these dependencies can start to break apart, even if it all made sense when we built it. And I think the same is true for collaboration.

Dr. McKayla 36:11 Yeah, yeah, there's so much goodness in that. And I really want to dig into productivity. So maybe what I want to do is invite you again for a whole episode just on productivity, if you're up for it. And then we can really dissect that, because I would love to hear your opinion also on, well, you hinted a little bit towards it: can we measure learning as part of our productivity? Right? And I had a podcast episode where it was just me talking about productivity. And there I was asking this question. I was saying that all these productivity measures that we have focus on activity, right, coming from the era of the industrial age, where it was the activity that mattered. You had to do only very mechanical, small tasks, and so you could count them, and so on. And all those measurements actually stem from there. And now we put them on knowledge workers, where probably the most productive thing is that I'm sitting here doing nothing, but I make a really good trade-off decision. Right?

Dr. Cat Hicks 37:21 That's right, that's right. Or you help someone else and they do something. I would love to have that conversation. And I do think there are ways we can measure learning. And you know, if anyone is listening to this: go to your team right now and ask, what are the things that we do that really make a difference, that are not being captured anywhere, that are not being rewarded? What is the stuff that you know is important to do to keep all of this running? And they will tell you.

Dr. McKayla 37:53 Yeah, yeah. And coming back to what you say about sharing, I think what I have seen work really well are small things like brown bags, right? Where we come together and somebody just explains what they have learned this week. Or, going back to code reviews, that's happening on GitHub every Friday, for example: they are sharing, and sometimes what they share is, what did I learn in this code review that was really excellent? You know, a comment where a person really took the time and gave me great feedback. Or I'm showing some code I haven't seen before, or some paradigm that I've seen. So we are sharing; we're making some of those very implicit, internal things explicit and sharing them. And I think this is a celebration, as you said. I think those are things teams can maybe do to celebrate what's also often referred to as glue work, right? Oh, this colleague helped me, or that person, you know, they didn't work on their ticket, which would have counted toward their promotion, but they actually went out of their way and did this and that. And so we are openly sharing this and making it explicit. And I think, especially in our remote world now, it's more important that we share that somehow.

Dr. Cat Hicks 39:16 Yeah, I think that's a beautiful point, and really, really important. And something, again, that people said in the interviews was: I want to see specific examples. I want to sit next to somebody and see them code. And I think people don't know how much that doesn't happen. They assume it's happening, or they say, oh, go get coffee with this person who wrote the code, go talk to them. But people often struggle, especially if you're remote or you're a new person; there are all kinds of reasons that people might not ask for help. And as I told you, I started out my career looking at three- and five-year-olds in classrooms and when they ask for help. Even when we are four and five years old, we're looking at the people around us, and we're asking: can I ask for help? Can I talk to you about my real learning? So that continues. And those small messages, those small social moments, can just have a huge impact.

Dr. McKayla 40:23 Yeah, yeah. And I think team culture and psychological safety and all of that is so important. And it's not something that you can just fix by doing three things today, right? It's something that you can start, but it's a continuous process. And I think this is one of those very rewarding things that pay off but are a little bit invisible, that you have to constantly work on, right? You have to raise the bar and say, we are actually allowed to have questions, to be wrong, you know, growth mindset. It's continuous work in a team, but the teams that manage to do it are so much better off than those that don't.

Dr. Cat Hicks 41:07 That's right. And that's a beautiful part of it, you know. I try to make these problems easier for myself and for other people by saying: who's already doing this, right? How do we give them a stage to do it? Who's the person that everybody always goes to for help? How do we make sure that instead of being burdened by this invisible work, they're actually rewarded for all the support they're giving? Yeah.

Dr. McKayla 41:34 Yeah, that's so true. Well, this actually brings us to the end of the show. As I said, I'm going to bring you back if you have time, and we can continue this discussion a little bit more. But is there anything that you want to tell my listeners, maybe something that wraps up the learnings that would be powerful for them and for their software engineering teams? How can they be in a better place? What is the one piece of advice that you would give them?

Dr. Cat Hicks 42:08 Yeah, great question. What a lovely question to be asked. You know, I end the report that I recently released by saying: learning matters. And I would like to leave with that. Learning matters, and measurement matters. Whenever we measure something, I ask: who is this measurement for? And is it bringing us closer to the culture that we want to have, where we feel free and happy, and like we're all learning together? Which is what we need in order to tackle these huge, complicated problems in the world. You know, we need to get past some of these myths about where brilliance comes from, and the myth that we all need to hide our learning from each other. But people will only be able to do that if we make the environment around them safe. So it kind of comes from both sides: from us building the environment as individuals in it, but also from people who are able to say, I'm going to do something to make this environment safer. So that's what I would say: learning matters, it pays off. Let's work for it.

Dr. McKayla 43:18 Yeah, that's beautiful. That's really great. So thank you so much, Cat, for being on my show. And I will definitely ping you again and ask you for more of your input. Thank you so much. Okay, bye-bye.

Dr. McKayla 43:35 This was another episode of the Software Engineering Unlocked podcast. If you enjoyed the episode, please help me spread the word about the podcast: send the episode to a friend via email, Twitter, LinkedIn, well, whatever messaging system you use, or give it a positive review on your favorite podcasting platform, such as Spotify or iTunes. This would really mean a lot to me. So thank you for listening. Don't forget to subscribe, and I will talk to you in two weeks. Bye.

Copyright 2022 Doctor McKayla