“This is a class of people who are amazingly good at pounding nails into the floorboard with their forehead.”
One difference might be that human beings can deal with ambiguity, and computers really can’t. If you’ve done any Python [coding], you make the tiniest mistake, and everything stops immediately. That’s what makes it different even from other forms of engineering.
When you are trying to fix a car, if you fail to tighten a bolt on one wheel as tight as it should be, the entire car doesn’t stop working. But with code, an entire app, an entire website can go down from the misplacement of a single bracket. I think that’s the one thing that sometimes scares writers away, because they are more accustomed to working with ambiguity.
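That fragility is easy to demonstrate. In the hypothetical Python snippet below (the program and its one missing bracket are invented for illustration), a single unclosed `[` makes the interpreter reject the entire program before running any of it:

```python
# A tiny "app" with one misplaced bracket: the list on the first line
# is never closed. Nothing about the rest of the program matters.
source = """
items = [1, 2, 3
total = sum(items)
print(total)
"""

try:
    # Python refuses to run ANY of the code, not just the broken line.
    compile(source, "<app>", "exec")
except SyntaxError as err:
    print(f"whole program rejected at line {err.lineno}")
```

A mechanic's half-tightened bolt degrades one wheel; here, one character stops everything.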
Ars: In the book, you talk about a personality type that is typical of programmers, but you also bemoan the common stereotypes, like the isolated, overweight, socially awkward hacker. Is there a particular kind of person that gravitates toward this field?
Thompson: There are so many people flooding into coding now that you really do get people from many more walks of life, but some traits seem pretty common. For instance, coders are good at thinking logically, breaking big problems down into little steps. That can carry over into their everyday life, because you spend so much time being so linear and having to be so precise. Also, everyone who thrives at coding is able to deal with mind-bending levels of frustration.
This is the dividing line between people who can code and people who can’t. There’s this Hollywood stereotype that coders sit there just pouring code out all day long. Really what they do is sit there staring at busted code that isn’t working, trying to figure out how to fix it. This is one of the most Sisyphean tasks you’re ever going to do in your life. It’s not going to get any better, because the better you get, the harder the challenges will be. But the pleasure that comes when you finally get things working is such a narcotic jolt. Coders will chase that thrill over and over again. It compensates for those brutal hours of frustration. So this is a class of people who are just amazingly good at pounding nails into the floorboard with their forehead.
Ars: You also cover some of the history. Female programmers were probably the first coders. Then the field became, as you put it in the book, very male-dominated. What happened?
Thompson: The reason why women were driven out of coding is a little complex. There’s no single thing that happened. There were three or four things that all reinforced each other. And that also means there’s no single solution. It’s like that joke: if you want to solve this hard problem, there’s no silver bullet. There’s just a lot of lead bullets.
The early days in coding were genuinely meritocratic, because no one knew how to do it. Companies hired people who could think logically and were meticulous, and just trained them. So you get people like Mary Allen Wilkes, who figured, “Well, I can’t be a lawyer because it’s too sexist in 1959 to be a lawyer, so I’ll just walk into MIT on the day of my graduation and say, ‘Do you guys need any coders?’ And they’ll say, ‘Yes!’” She went on to become a pioneer in creating the operating system for what you could argue was the first personal computer. In those days, software was not valuable. The manly thing was making hardware. That’s where all the guys went. The software was considered to be almost secretarial.
As corporations started having huge amounts of code that were crucial to the way that they operated, you started seeing guys taking a lot more interest. They started developing what Silicon Valley calls “culture fit.” As in, “We need to hire someone not just because he’s good at it, not just someone who has this set of skills, but someone who we feel is like us.” The final nail in the coffin was guys like me in the 1980s who started coding on personal computers in high school. A couple of years later, we started showing up on college campuses and enrolling in computer science, and it completely tilted the field. The professors reasoned, “This is who we should be teaching toward. We should change our curriculum so it almost requires you to be a teenage hacker.” And that’s when things all but collapsed for women in computer science.
Ars: There’s been a great deal of discussion of late over inherent bias in many of our algorithms, particularly in social media. This confuses many people who aren’t in the field. They think, “But how can algorithms be biased? It’s math.”
Thompson: First, often the bias is literally at the origins of the “problem” the algorithm is trying to solve. For instance, the social media recommendation algorithms are designed by their architects, by their coders, to try to “gauge the material.” The algorithm is constantly paying attention to what people are most clicking on, trying to identify what most fascinates and compels people. It’s all in the service of an ad-based market model. And people are mesmerized by the things that prompt extreme emotion—deep anger, rancor, horror. Anything that mashes on people’s psychological buttons, that’s what the algorithm is going to regard as engaging, and that’s what it’s going to find and promote.
Second, algorithms can also end up with bias if they’re trained on biased data. Here’s an example from my book: Henry Gan is a coder at Gfycat, an animated GIF hosting service. Henry and his team are using visual neural-net AI to recognize what’s inside pictures and to automatically tag them. They’re a small company, so they’re not starting from scratch. They’re using some of the open source neural net software that’s out there, developed and trained by companies like Google and Facebook.
But they discovered that it was actually terrible at recognizing Asian faces. This is a really big problem for Gfycat, because a large chunk of their most avid user base are K-pop fans. They love finding animated GIFs of major Asian K-pop stars. As Henry explained to me, the AI is trained on these data sets of pictures, and the pictures are mostly white folks, because they were collected by institutions in primarily white countries. So if you don’t train the algorithm on very many Asian faces, it’s not going to be very good at disambiguating Asian faces. The reverse is also true. There’s AI that’s trained in China on primarily Chinese faces that struggles to deal with white faces.
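Henry's point about training coverage can be sketched with a toy classifier. This is not Gfycat's system — just a 1-nearest-neighbor model over invented one-dimensional "face features," where the training data covers group A densely but group B with a single example:

```python
# Invented training data: many examples covering group A's feature range,
# only one example from group B.
train = [
    (0.0, "A"), (1.0, "A"), (2.0, "A"), (3.0, "A"),  # dense A coverage
    (10.0, "B"),                                      # sparse B coverage
]

def nearest_label(x):
    # Classify by the single closest training example (1-nearest-neighbor).
    return min(train, key=lambda example: abs(example[0] - x))[1]

# A genuine group-B input that falls outside B's thin coverage lands
# nearer an A example and gets mislabeled:
print(nearest_label(6.0))   # -> "A" (wrong: closest example is A at 3.0)
print(nearest_label(11.0))  # -> "B" (right: inside B's covered region)
```

The model isn't "biased math" in any mysterious sense; it simply has far less evidence about one group, so its errors concentrate there.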
Ars: Algorithms also have had an enormous impact on media, for better and worse.
Thompson: The media has been affected by ranking algorithms, even going back to something as simple as listing the 10 most-forwarded stories on a website. This is not even AI—this is literally a sorting algorithm. Give me a list of everything that was forwarded today, sort it by popularity, take the top 10: here is our ranking list. This is almost a reflexive instinct for a software engineer.
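The ranking he describes really is just a sort and a slice. A Python sketch, with invented story titles and forward counts:

```python
# Hypothetical forward counts for one day (titles and numbers invented).
forwards = {
    "Story A": 120,
    "Story B": 45,
    "Story C": 310,
    "Story D": 87,
    "Story E": 12,
}

# "Sort it by popularity, take the top 10" -- a sort and a slice.
top_10 = [title for title, _ in
          sorted(forwards.items(), key=lambda kv: kv[1], reverse=True)[:10]]
print(top_10)  # -> ['Story C', 'Story A', 'Story D', 'Story B', 'Story E']
```

With fewer than 10 stories the slice simply returns them all, most-forwarded first.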
There’s a website that finds YouTube videos with zero plays. I’ve often thought, we should do more of that. You could use software to do really interesting searches. What are the interesting stories that are being ignored? You can use code and algorithms to do really cool, subtle things that aren’t being done because everyone’s trying to go after this obvious low-hanging fruit. It’s a lucrative model, finding the things that are already popular. But it feels like an ultimately unambitious way to use the enormous flexibility of software.
Ars: What was the most surprising thing you learned while writing this book?
Thompson: One of the things that really leapt out is the almost aesthetic delight in efficiency and optimization that you find among software developers. They really like taking something that’s being done ponderously, or that’s repetitive, and optimizing it. Almost all engineering has focused on making things run more efficiently. Saving labor, consolidating steps, making something easier to do, amplifying human abilities. But it also can be almost impossible to turn off. Scott Hanselman talks about coding all day long and coming down to dinner. The rest of the family is cooking dinner and he immediately starts critiquing the inefficient ways they’re doing it: “I’ve moved into code review of dinner.”
It’s where many reflexive knee-jerk business models come from. “Hey! Let’s speed up everything in society and optimize it.” This is part of the story of the Facebook news feed. It accelerated our ability to pay attention to people. It was a massive optimization of how we learned about our friends and the world around us. There are great things that came from that, but it also creates problems in its wake. The blizzard of TMI makes it hard to focus on any one thing, because there are so many things coming at you.
This arcs through the whole book: a major gift of software engineers has been the relentless desire to optimize things. But that can sometimes turn into a monkey’s paw curse. I could see how that went from the individual person to larger society. Like Uber optimizing the way cars are hailed, which kind of wrecks the auto industry. Or Airbnb optimizing the ability to rent a house on the fly, which winds up screwing with the housing market. Over and over again, whenever you see a tech company locking horns with the civic interest, it’s usually the result of them optimizing something that was great for some people, but caused a lot of collateral damage for other people.