From ride-hailing services to warehouses to hiring platforms, algorithms are increasingly taking on the role of manager. What does this mean for worker autonomy and meaningful engagement with work?

On this episode of The Culture Kit podcast, hosts Jenny Chatman and Sameer Srivastava interview Lindsey Cameron, assistant professor of management at the University of Pennsylvania’s Wharton School, about the research insights she gained from getting behind the wheel as a ride-hailing driver. Cameron discusses the cultural aspects of gig work, the “good bad job” paradox, and strategies for fostering equity and worker dignity in an increasingly algorithm-driven world.

Main takeaway from Jenny & Sameer’s interview with Lindsey Cameron:

  1. Keep humans at the center. Rather than optimizing solely for efficiency, use human-centered design to consider worker well-being throughout their lifecycle with the company.

*The Culture Kit with Jenny & Sameer is a production of Haas School of Business and is produced by University FM.*

Show Links

  • By Lindsey D. Cameron, Administrative Science Quarterly, 2024.
  • By Lindsey D. Cameron, Harvard Business Review, 2024.
  • By Lindsey D. Cameron, Organization Science, 2021.
  • By Lindsey D. Cameron and Hatim Rahman, Organization Science, 2022.
  • By Lindsey D. Cameron, Curtis K. Chan, and Michel Anteby, Organizational Behavior and Human Decision Processes, 2022.
  • By Arne L. Kalleberg, 2011.
  • By Michael Burawoy, 1982.
  • By Aruna Ranganathan and Alan Benson, American Sociological Review, 2020.
  • By Tressie McMillan Cottom, Sociology of Race and Ethnicity, 2020.
  • By R. Trebor Scholz, Penguin Random House, 2023.
  • By Alexandrea J. Ravenelle, University of California Press, 2019.

Transcript

[00:00:00] Sameer Srivastava: Welcome to The Culture Kit with Jenny and Sameer, where we give you the tools to build a healthy and effective workplace culture. I’m Sameer Srivastava.

[00:00:14] Jennifer Chatman: And I’m Jenny Chatman. We’re professors at UC Berkeley’s Haas School of Business and co-founders of the Berkeley Center for Workplace Culture and Innovation.

Today, we’ll tackle the topic of algorithmic management with our guest, Lindsey Cameron, an assistant professor of management at the Wharton School, who’s embedded herself with Uber drivers to see firsthand how digital platforms are reshaping work, autonomy, and power.

[00:00:42] Lindsey Cameron: Imagine hiring, firing, evaluating, and disciplining workers all done by an algorithm. I worked part-time as an Uber driver for three years and never had a conversation with an Uber employee or a Lyft employee.

[00:00:57] Sameer Srivastava: Welcome to the show, Lindsey. It’s really great to see you.

[00:01:00] Lindsey Cameron: Oh, it’s great to be here, too. Thanks for the invite.

[00:01:03] Sameer Srivastava: Of course. Well, we always like to start the show with some definitions. And so, the first question for you is, what does algorithmic management mean? And how does it operate in the workplace?

[00:01:14] Lindsey Cameron: Algorithmic management is when you have an algorithm, as opposed to a human, as your boss. So, imagine hiring, firing, evaluating, and disciplining workers all done by an algorithm. I worked part-time as an Uber driver for three years and never had a conversation with an Uber employee or a Lyft employee. And the way you see it in the broader workplace is, you know, ride hailing and the gig economy is what we call an extreme case, but more and more jobs, from Amazon warehouses to bus drivers to people who are trading stocks, have a component of algorithmic management somewhere in the, sort of, human resource life cycle.

[00:01:52] Jennifer Chatman: So, you’ve pointed out that gig work isn’t new, but today’s platforms have radically transformed how it’s done. So, how should we define gig work now? And in what key ways does it differ from more traditional employment, like, especially when it comes to worker autonomy and the role of management?

[00:02:11] Lindsey Cameron: You know, this is such a good question, and it’s a tricky, big one. There’s an argument made that we’re actually going to return to the original way work was done, when it was seasonal or an on-the-spot labor market. So, that’s often the argument you hear: there’s, like, OG gig work, the original gig work, musicians, things like that.

It’s a nice analogy. It’s actually not the one that I really buy. I think this is a form of sponsored entrepreneurship. You think about franchises, you think of multilevel marketing, maybe your mom did Mary Kay or your dad did Amway. For people who are, sort of, shut out of traditional labor markets or are on the edges, there’s these sponsored business opportunities that bring people in and give them a sense of autonomy and meaning, while there is a focal parent organization that controls or quasi controls the work.

So, you know, to sum up your question about how I think it’s different, you know, people think it’s scheduled flexibility. I think that’s a red herring. There is a short-term contract. It’s management by algorithms. It’s a reputation system that monitors the quality of the work. And this is very much tied to a longer history of sponsored entrepreneurship.

[00:03:16] Sameer Srivastava: So, lots of questions to unpack there, but before we do, I wanted to actually ask about your own journey into this topic. So, what drew you into studying how technology and, specifically, algorithmic management is shaping work?

[00:03:29] Lindsey Cameron: You know, I have a prior career in the National Security Agency and the Central Intelligence Agency, so the NSA and the CIA. I worked as a hacker. I have multiple degrees in engineering. And I thought I was going to pivot in my Ph.D. program. My first published paper is on mindfulness interventions. But as I was thinking about dissertation topics, I really thought about my mother who lost her job during the height of the Great Recession. She managed a very large call center, really very successful. But it was the Great Recession, age discrimination is real. She had a hard time re-entering the traditional labor market. And I saw her do gig job after gig job, you know, selling purses at trademarks, selling food samples at grocery stores, working the overnight shift at a warehouse.

And I saw my mother, a very fashionable middle-class woman, really struggle to try to stay middle class even though she was earning less than my Ph.D. stipend. So, the question I really became interested in is, how do people stop downward social mobility? There’s an American fascination with the myth of American meritocracy and the American dream, and I wanted to see the opposite. Because, you know, real wages have been flat for the past 40 years.

So, as I was doing my research looking at how people cope with layoffs and things like that, my advisor, the gifted Jerry Davis, if you’ve had an opportunity to talk or speak with him, suggested I bound it by an organization. When I looked at my data, I saw a lot of people were in the ride hailing industry, they were in the gig economy. And that’s what led me to study algorithmic management technology and how it’s changing work.

[00:04:59] Jennifer Chatman: So, Lindsey, your work is not just fascinating for the conclusions that you draw, but for the methodology that you use. For one of your studies, you followed about 60 drivers and even got behind the wheel as a driver yourself. So, can you tell us a little bit about what it was like to immerse yourself so deeply in the gig economy experience, and what insights might you have missed using more traditional research methods alone?

[00:05:25] Lindsey Cameron: You know, I like to say, and this is from one of the founders of sociology, Robert Park, I like to get the seat of my pants dirty, you know, really immerse myself in the tradition of work. And that comes from a long history coming out of sociology. So, yes, I’ve actually followed these drivers now for seven or eight years and looked at them in eight different countries, so, different sets of drivers, and not just ride hailing, all across the gig economy.

But my actual experience driving, I hated it. I’m not a great driver, particularly around rush hour in Washington, D.C. You know, I speak Arabic, and there’s a phrase, huwa huwa (هو هو): you do the same thing, over and over. And so, at set periods of time, I’d be going up and down the exact same street, seeing the exact same things: at 7:00 a.m., North Capitol Street looks like this; at 7:00 p.m., Wisconsin Avenue looks like this. So, there was a monotony and a deadening I felt in the work. And when you read a lot of the media coverage or public scholarship about this topic, it’s very negative, it’s very critical, calling it “Taylorism in the head,” a “digital panopticon,” an “invisible cage,” which is the work of Hatim Rahman.

But my interviews led me to a surprise, because the drivers liked it. And I seriously wanted to understand why they liked it, not just say they’re wrong, that they have some sort of false consciousness. That tension, between the fact that I had an experience I didn’t enjoy and the individual experiences of a lot of people saying they like it, is the core tension I looked at in my research. And that’s the paradox. That’s the impetus behind some of my papers, the good bad job. This work holds tensions, it holds contradictions, and that’s what makes it so powerful and contributes to its growth.

[00:07:02] Sameer Srivastava: So, let’s talk a bit more about the paper you just referenced in which you use that term, “good bad job.” And you describe in that paper how some of this work can actually have appealing aspects, like flexibility, but also the precarity of low wage work. Can you help us understand this idea of good bad job a little bit better?

[00:07:21] Lindsey Cameron: So, when I use the phrase “bad job,” and that was a great question, Sameer, I don’t mean it in a normative sense. This is actually work that comes from Arne Kalleberg and others in the sociology of work, where it’s about work that’s low paid, economically precarious, with no meaningful path for advancement and no labor protections.

And so, it’s undeniable, this work is structurally bad, for all of these reasons, you know. You own your own car, you own your own means of production. You have to pay a large percent of your wages in commissions to the platform. But people enjoy it because it gives you a sense of autonomy in that you get to make very small choices about the work. I choose what time I’m going to drive. I get to decide, “Am I going to rate the person? Am I going to open the door when they come inside? Will I offer to take their luggage?”

So, all these very small choices give you an amount of autonomy. The amount of autonomy is very small, it’s very finite, but it’s real. And this can let individuals feel like, “Well, this is good. This is working for me.” And that’s really what that paper unpacks, is, why do people consent to this work that is structurally a bad job, but feel good about consenting and keeping on doing this work?

[00:08:30] Jennifer Chatman: I mean, what’s so interesting about studying drivers is to think about the cultural aspect of it, which is harder to discern. And usually, we think that every organization develops its own culture, whether leaders guide it intentionally or it develops or evolves on its own. So, what differences, if any, have you observed in the kinds of cultures that emerge in organizations such as Uber that rely so heavily on gig workers as you’ve described them?

[00:09:01] Lindsey Cameron: Now, that’s a really great question, and it’s tricky. I would say that there are elements of culture. If you spend a lot of time on the forums, like I do, or if you go to Rideshare Drivers United, which is an organizing guild in California that’s lobbied for a lot of rights, it does feel like there’s a culture. You know, particularly if you’re on these forums, they’re sharing tips: the algorithm just changed, these are the best places to get rides, I just got unicorn decals I put all over my car and I’m getting better ratings and you should try that, too. So, in that way, it can feel like an organization. And there’s a lot of research that has been published either on these organizing units or from these online communities.

The thing that I found, because of the way I sampled drivers, is that I wasn’t getting people who were online. I sampled drivers through getting rides. I would just, like, parachute into, say, Missoula, Montana, see who I could get a ride from, and see who would agree to do a one-hour interview. And I did this in many, many cities. I was getting regular Joe Schmoes, who were often the ones that were grinding. It’s Pareto’s principle: the majority of rides, say 80%, is done by a small number of drivers, say 20%. That was my sample. This sample is not on the forums. They’re not at the organizing units. They’re not feeling this culture. And for them, the way they understand the algorithmic management system is very loosely amalgamated, a lot of inferences. It’s an imaginary. So, I would say, across the entire platform, there’s certainly not a uniform culture for platform workers in the way you’d have in a traditional organization, which is one of the things that makes it hard to do this type of work or to stay engaged for a long time.

[00:10:37] Jennifer Chatman: So, Lindsey, earlier, you were describing that the drivers are making these small choices in their daily approach to the work. And you actually labeled these: those who play the relational game aim for high customer service ratings by providing extra services, and those who play the efficiency game try to maximize earnings by streamlining interactions. I can say that I can certainly relate to this difference from my own experience. Walk us through how each of these games works in practice and how they’re driven by the algorithm. Do they have different impacts on drivers’ decisions beyond just the decisions they’re making each day in their work?

[00:11:19] Lindsey Cameron: Fascinating. I’m actually going to answer the second question first: both games keep people on the platform driving, which goes against this normative idea that relational would be good and efficiency would be bad. It’s the fact that they’re playing a game. And here I’m drawing on the great work of Michael Burawoy, who, unfortunately, recently passed.

So, the relational game is, “I want to be your buddy,” you know. It’s the people who decorate their car with unicorn decals. Or you get in the car on Friday night, and there are strobe lights in the car, and there are tassels, and Top 40 music is playing. They are there to make sure you have a great customer experience: get your bag, greet you, have water, have summer sausages. These individuals are obsessed with their customer ratings. They’ll tell you their rating to the second decimal point. Mine was 4.86. And at the same time, they feel like the algorithm is there to help them. They don’t understand how it’s working, but they’ll be like, “It’s like God, it just knows how to give me the best rides. Even when I start the day late, I can make my money.” But they don’t know. Nobody really knows how these algorithmic management systems are working. They’re a black box.

So, the relational game is a sense of positive reinforcement. It’s a virtuous cycle because the technology matches with the orientation they have with the customer. Because the technology has the rating system. It has compliments. It has badges. Some people said, “I even check these compliments more than I drive because it just makes me feel good.”

Now, the efficiency game is, “You get in and out of my car.” You’re not a person, you’re a fare. So, if you’ve ever seen a driver that’s, like, not talking to you, not making eye contact: efficiency game. This is the game I’d say I played the most when I was a driver. I would have, like, these bags in the front seat so nobody would come and sit next to me. And, you know, I remember one driver telling me, like, “I’m not trying to be your best friend. I’m trying to make money. And I don’t get anybody’s bags, because if I drop the bag or the strap breaks, I’m going to be held responsible.”

So, you have this, sort of, transactional relationship with the customer, and these drivers keep extensive logs about how much money they’re making and how much they’re driving, because they ultimately believe the technology is cheating them. They’re not concerned about customer ratings. They’re concerned about the pricing algorithm and the matching algorithm, and there’s a lot of opaqueness there.

So, there are a lot more negative feelings about the algorithmic management system. But, as I mentioned earlier, they both keep driving because they’re playing a game and they think they can win. That’s the key point. There has to be an element of chance or luck in the system, but then they have to feel like they can influence that chance or luck.

And so, either of the games keeps workers engaged, even though one’s a happy worker and one’s not. And actually, I find the people who are not playing games are the ones that leave, because they don’t have a sense of engagement in the work.

[00:14:13] Sameer Srivastava: So, first of all, Lindsey, thank you for the shout out to our former colleague, Michael Burawoy, who just passed away. But I wanted to pick up on what you were just talking about with respect to these games. Would it be too much of a stretch to say that the algorithms are starting to drive the culture of these places? And where do human managers fit into that system?

[00:14:32] Lindsey Cameron: Now, I think that’s a bit broad in thinking about ride hailing, but algorithms play a role, particularly when they have managerial decision-making power. And what seems to create a greater sense of ownership, of people feeling good in the work, is, can there be a human to intervene, you know? So, one example, a classic one, you know, from Toyota and the Japanese style of management, is the chain workers can pull when they’re on the assembly line. It doesn’t matter what level you are. You’re at the lowest level, you see something unsafe or wrong, pull the chain. The same on an oil refinery. Safety is everyone’s responsibility.

And there aren’t really any measures like that in this type of gig work. You know, you can see a little bit more of it in Amazon warehouses, where there is a bit of managerial leniency at times. But it’s, can you have a human in the loop? Can you have a human stopgap? Can I meaningfully offer input into the algorithmic management system? And those things, for the most part, are really lacking in the types of labor platforms I study.

[00:15:32] Jennifer Chatman: Yeah, just to dig a little further on that issue, which is so interesting: we’ve been talking about the sense of autonomy that workers gain through the micro-choices that they make, whether to accept a ride request or pursue surge pricing. In thinking about how you can generalize your findings to other types of jobs, what can managers in other industries learn from incorporating opportunities for employees to make choices?

[00:16:01] Lindsey Cameron: You know, this is the tricky part where, I know Google doesn’t use this slogan anymore, but I’d say, “Don’t be evil.” Because the key point I make in this paper about these micro-choices is that the autonomy feels real. These small senses of autonomy that people have, maybe, in a warehouse, it’s setting your pick rate, or deciding whether or not you’re going to allow the AI assistant to transcribe your notes in a meeting.

The choices are small. One could argue they’re even superficial, but because they have psychological implications for well-being, I think they’re not fully superficial. But the larger issue is the system. And the system is controlling the choices. And so, that’s the key thing that I’m pointing at. Like, an organization can offer these small choices that feel good to the worker, but ultimately, you know, it’s the one that sets the terms of engagement. And so, it doesn’t really feel like a systemic solution.

[00:16:50] Sameer Srivastava: So, thinking about algorithmic management more generally, we’ve talked about how it can boost productivity but can also lead to reduced autonomy and more of a, sort of, sense of surveillance. To what extent do you think organizations outside the gig economy are likely to lean heavily on algorithmic management, going forward? And how can leaders strike a balance between a culture that is really focused on efficiency, but also respects workers’ independence?

[00:17:18] Lindsey Cameron: More and more companies are using algorithmic management. You know, I talk to my students, and when they go to sign up for a job, they’re like, pick your interview slot. There’s not a person on the other end. It’s HireVue. And it’s giving you a score at the end about, you know, how warm and personable this person was based on how their facial features moved.

So, I just think we’re going to be seeing more of this tracking and surveillance. There’s some jobs like security guards, I’ve been told, can often be hired and start the first day without talking to anybody in the company.

But I think there are ways that are less visible, whether it’s measuring what’s going into your email to gauge productivity or, like, the screen trackers on the Bloomberg terminals. Where is that information going, and how are we thinking about it?

I think we’re actually in the middle of a cultural question about how to use these tracking tools in a way that is humane and efficient.

You know, I think of some places where maybe I’ve seen it done better. And there’s a really nice paper written by Aruna Ranganathan, one of your colleagues, where she looks at garment workers in a factory in India. The mechanism she describes is “auto-gamification.” These workers have little RFID trackers that let them know, you know, how fast they’re working to sew a particular garment. And she finds it actually works. It motivates people because they’re playing a game against themselves.

Now, there is a way you can argue that this is a normative mechanism of control, that people are internalizing what the organization wants them to do with these RFID trackers. I believe that. But at the same time, I also believe there’s a way in which the auto-gamification is fun and people become enthralled in the work. And I feel like those are some of the better examples I’ve seen of using the technology in a way that at least creates an enjoyable experience for the workers, while at the same time meeting these baselines of fair pay and making sure you have a workplace with dignity, where people can raise concerns and get them addressed.

[00:19:14] Jennifer Chatman: So, you recommend in your work that we should keep a human in the loop somewhere, ultimately, to oversee algorithmic decisions, especially around things like hiring, firing, and discipline. We actually heard a similar point made in our recent podcast with IBM Senior Vice President and CHRO Nickle LaMoreaux. Even how the algorithm is created implies that there are criteria that humans have considered, right? So, you talked about candidates being scored on their facial movements. Someone had to decide that that’s an important criterion for the job. So, even at that level, I think the programming of the algorithm is going to be influenced by humans and, therefore, susceptible to all of the biases that humans have. If you think about our colleague Merrick Osborne’s work on how racial bias has even been embedded into the kinds of algorithms that we’re seeing in AI, I would presume that the same thing happens here.

[00:20:18] Lindsey Cameron: Yeah.

[00:20:19] Sameer Srivastava: The example I was thinking about is actually Waymo and how humans oversee the cars. And when the car is not sure what to do, it says, you know, “Here are the three choices I’m considering, and I’m leaning towards this one. Would you agree?” And that’s the way it, sort of, checks some of the ambiguous cases.

[00:20:38] Lindsey Cameron: So, it’s funny that you mention that, Sameer, because I actually interpret that more negatively, in the sense that it’s ghost work: we think these are technical systems doing it on their own, but there’s all this back-end labor in Arizona or the Philippines that does it, you know. I don’t frame it in the same way.

[00:20:56] Sameer Srivastava: Yes. Exactly. Right, which fuels a sense of magic in what’s going on.

[00:21:03] Lindsey Cameron: Right? It is techno utopianism.

[00:21:05] Sameer Srivastava: Exactly.

[00:21:05] Lindsey Cameron: But I think an example where I’ve seen it done really well is a company called Pymetrics. They do personality testing for hiring. Some of you may have heard of the balloon test. So, when you’re hired by a company that uses Pymetrics, they give you all these personality tests in the form of games. And so, the balloon test is, you press a button, doop, doop, doop, and you see how big you can blow up the balloon before it pops. And you don’t know what it wants you to do. And what it is is a measure of risk taking.

So, this is a story that Pymetrics has spoken about publicly, so I can share it: there was a big company, I think a consumer goods company, that used this entire battery of computer-assisted tests as part of their hiring process. The tests had been trained on the top workers in a particular category, so they knew how to hire for personality and skills through these tests, as opposed to using where somebody went to school as an indicator of quality.

And what this consumer goods company found when they brought in their new hiring class after using this test is that, previously, people would come to work and then submit their expenses to get reimbursed for the move. And now they had a class where people were like, “Well, we can’t do that. We don’t have enough money to float $10,000 for a move.” The test had created an incoming class of analysts with much greater socioeconomic diversity.

And so, I think that’s a really great example, because we are seeing more of these tools in hiring processes. It’s not just about, like, broadening the funnel, which I think LinkedIn does in terms of passive recruiting, but how they can be used in the hiring pipeline to make sure you’re bringing in more people that you might not have seen. And it’s not just bringing in more people; it’s bringing in more people that match the attributes of those who do well in that job. And how do you design these games and technologies in a way that they can’t easily be gamed or tricked? Because you have no idea if a job is looking for someone with high risk tolerance or low risk tolerance. So, it’s a way to get a more accurate view of one’s personality.

[00:22:59] Sameer Srivastava: Fascinating. So, I want to go back, Lindsey, to the gig workers that you studied in your research. And many of them, obviously, are from an especially fragile sector of the workforce, often juggling multiple jobs and dealing with health challenges, other challenges at home. Many of them come from marginalized communities. And these jobs have very high turnover. So, how can organizations design algorithms and policies that are also focused on equity, rather than just efficiency, and don’t just exacerbate those vulnerabilities?

[00:23:30] Lindsey Cameron: Thank you. That’s such a great question. I mean, you look at the majority of ride hailing drivers, they tend to be black and brown men, often not born in this country—so, first-generation immigrants. So, there is a way, as I was mentioning earlier, that this work brings in people who traditionally have been excluded from the labor market. Tressie McMillan Cottom calls it exclusion by inclusion; that’s the mechanism we’re talking about for how these workers get in.

But thinking about how you can ensure equity: if I put on, like, my straight sociology hat, you know, I’m thinking of Trebor Scholz’s work, where he basically argues that the ownership structures of these platforms, in trying to create pure market conditions, are just set up to have poor labor conditions. Labor is the one thing you can, sort of, squeeze to protect the bottom line; let’s have some driverless cars so we don’t have to pay for humans anymore.

And so, he points to companies like Stocksy that have a more cooperative ownership structure and allow gig workers to come in and earn money, and there are some interesting examples in Europe, like BlaBlaCar, of platforms with different, alternative structures.

So, that’s if you had the sociology hat on. You know, I’m in a business school. We can rethink capitalism, but I don’t think we’re remaking capitalism. But there’s a belief I hold that I think comes from Black feminist thought, though a lot of other people have thought about it: I am because we are, and we can save one another. So, this idea of mutual solidarity, I think, is really important. And it’s hard to encourage, because the work is so atomized and isolating. But I’ve done work in eight different countries in the global south, and the feel of this work, the community and the culture, is quite different. There are all these hyper-local WhatsApp groups, and people will come out and get you if you have a flat tire, or they put out pins on where they’re going to drop off somebody. And if the pin doesn’t move, the other people in the group know there’s probably a robbery that happened. And they all go out and get the person because they know they’ve been stuck.

It might be harder to implement some of these things in the States, you know, because the Facebook groups and Reddit are, sort of, an approximation in the United States that doesn’t have the same depth and richness as you have in the global south. But it can be as simple as having a worker fund that each of these companies contributes to, so that if a worker comes to, sort of, a hardship, it goes in front of a worker advisory board that is able to give resources to that person.

And then the last comment, you know, on creating more equity is to create mobility pathways. You know, one of the things about these types of jobs is that they’re supposed to be a stepping stone to X, but where is X? And there’s a fair amount of research, particularly, you know, I’m thinking of Alexandrea Ravenelle at UNC, that shows that people think it’s going to be a stepping stone, and 12 years later, they’re still doing this type of work.

And so, there could be a pathway for how people move into the organization, you know, having people come into training roles. Maybe there’s a way that a small number would enter the corporate enterprise. You know, that’s never going to be, like, a long-term solution, because there just won’t be the same ratio. But it could be offering grants to people to actually start their own entrepreneurial venture away from the platform, or giving more subsidies for schooling or coding boot camps. So, give people mobility pathways, because it’s hard to imagine: can your body really sustain being a ride-hailing driver for 35 years?

[00:26:42] Jennifer Chatman: Lindsey, I’ve learned so much from your work on algorithmic management. Can you think of a takeaway for prioritizing human dignity in an algorithm-driven world?

[00:26:54] Lindsey Cameron: The example of keeping humans at the center, the idea of human-centered design. So, there’s an example that Sameer and I have talked about, you know, with Waymo: when you have a self-driving car, there really is a set of watchers watching the screen and making sure things go right. And I mean, that is ghost labor. That is invisible labor behind the scenes. And that is one way to look at it. But at the same time, you can also see there’s a human element prioritizing safety, even as these technologies are coming into play.

And so, the ideas I’ve mentioned about having a workers council or having drivers funds or mobility pathways for people, it’s really about not looking at optimization as being the algorithmic match or even the reputation system. It’s a more holistic way of looking at workers across their entire lifespan at the firm and maybe even a step or two beyond when designing these technical systems.

[00:27:49] Sameer Srivastava: Terrific. Thank you so much, Lindsey. This has been really fantastic. We’ve learned so much.

[00:27:53] Jennifer Chatman: Thank you, Lindsey.

[00:27:54] Lindsey Cameron: Thank you, too.

[00:27:59] Jennifer Chatman: Thanks for listening to The Culture Kit with Jenny and Sameer. Do you have a question about work that you want us to answer? Go to haas.org/culture-kit to submit your fix-it ticket today.

[00:28:12] Sameer Srivastava: The Culture Kit Podcast is a production of the Berkeley Center for Workplace Culture and Innovation at the Haas School of Business, and it’s produced by University FM. If you enjoyed the show, be sure to hit that Subscribe button, leave us a review, and share this episode online, so others who have workplace culture questions can find us, too.

[00:28:32] Jennifer Chatman: I’m Jenny.

[00:28:33] Sameer Srivastava: And I’m Sameer.

[00:28:34] Jennifer Chatman: We’ll be back soon with more tools to help fix your work culture challenges.
