(Illustration: Adam Turnbull)
There may be no better way to understand the lot of the American worker today than through Twitch, the website where you can watch the world’s best players broadcast your favorite video games live. Leading players on Twitch have their own dedicated channels, and diehard fans can pay $4.99 a month for the privilege of messaging back and forth with them. The players receive half the subscription revenue they generate and a cut of any advertising dollars they bring in from touting the likes of Doritos and Mountain Dew. Last year, Amazon spent roughly $1 billion to acquire the site, one of the 15 most active in the world.
The founders of Twitch presumably set out to push the boundaries of the gaming industry, but where they’ve truly distinguished themselves is in wage inequality. One star, 23-year-old Jordan Maron, makes at least several hundred thousand dollars per year, and probably much more, as a video game personality on Twitch and YouTube. But, as a recent Washington Post Magazine story explained, the Jordan Marons of the world are exceedingly rare. Most Twitcherati don’t even make fast-food wages from the site, while many work bond-trader hours. They have no health insurance or retirement benefits. The subject of the Post piece was a gamer named Alex Ross, who spent a year building up his Twitch following, sometimes working in 20-hour stretches, before taking a single day off. He still struggles to pay rent.
It’s not a stretch to say that as goes Twitch, so goes the entire United States economy. In the corporate world, law, medicine, and engineering, the big checks go to a smattering of superstars, who earn orders of magnitude more than the merely talented, to say nothing of the hoi polloi. It’s a big reason why income inequality has spiked to its highest level since the 1920s.
Much of what’s happening is the culmination of forces that have been visible for centuries. Back in the late 1800s, the economist Alfred Marshall observed that communications technology was allowing a small group of businessmen to operate on a massive scale and reap spectacular rewards as a result. More recently, in 1995, the economists Robert Frank and Philip Cook published a book called The Winner-Take-All Society, which argued that the pay scales we associate with sports and entertainment were spreading to all corners of the economy. They spun out a dystopian future in which more and more workers would spend their days clawing for a shot at superstardom, knowing they would face a life of bleak prospects if they fell a few ticks short. One of the book’s quirkiest examples concerned dentistry, formerly a generous but not extravagantly rewarded trade that, in the 1980s, wound up spawning a wealthy elite of cosmetic specialists, even as ordinary dentists were left behind.
Frank and Cook intended to alarm readers with the idea that winner-take-all-ism could spread as far as dentistry, arguably the least sexy profession this side of accounting. But if anything, they were too optimistic. When they wrote their book, most people worked in firms rather than as independent operators, in the manner of the dentist (or the Hollywood actor). These firms tended to observe rigid pay scales and norms of restraint. A company’s superstars earned less than they might on the open market, while receptionists and other low-skill workers made more—a kind of internal income redistribution. A 1987 study by the economists Lawrence Katz and William Dickens affirmed this “rent-sharing” thesis, showing that whenever managers within an industry were paid well, the janitors were usually paid well too.
Those barriers to extreme disparities have since broken down. The rise of the Internet has created whole classes of people who can earn millions working almost entirely on their own. As with Twitch and YouTube, a handful of divas are succeeding wildly in this new world order, while everyone else scrapes by. For every Holly Ward, a self-published author who has made millions via Amazon, there are thousands who work for free.
Even those of us who are employed by companies, those erstwhile refuges from the open market, are experiencing something similar. When President Obama’s Council of Economic Advisers recently examined compensation for janitors and managers, they found that the correlation between the two had weakened: Higher pay for managers was less likely to correspond with higher pay for janitors in the 2000s than it was in the 1980s.
Two years ago, I spent time reporting on Mayer Brown, a massive Chicago-based law firm that had traditionally been so staid and nurturing it was nicknamed Mother Mayer. Like most white-shoe firms, Mayer Brown used to pay its senior lawyers fairly similar salaries, regardless of how much business each generated. (Some law firms even abided by a “lockstep” compensation system, in which every partner with the same seniority took home exactly the same amount.)
Then, in the 1980s, The American Lawyer started publishing a statistic called PPP, or profits per partner. Suddenly, firms that had thought of themselves as rough competitors realized they were earning vastly different sums of money. If you were a rainmaker at a firm with a low PPP, you saw that your pay was suffering as a result—that you were subsidizing your less productive colleagues. You also realized you could get rich by decamping to a more profitable rival. For a while, basic inertia and residual loyalty discouraged defections. By the early 2000s, though, rainmakers changed firms constantly. Their salaries ballooned. Other partners saw their pay stall out or even fall.
Mayer Brown felt these pressures as much as any other firm. In 2007, it stripped nearly 50 of its partners—some 10 percent—of their equity stake. Those who didn’t leave suffered a substantial loss of income. Around the same time, Mayer Brown began courting outsiders aggressively. It acquired the white-collar defense practice of a smaller firm called Crowell & Moring in 2005 and paid the head of the practice roughly $2 million per year. Even so, by 2010, he had taken his practice to yet another firm.
Such stories are playing out across the economy. In the same way that law firms now obsessively track the income generated by each client a partner lands, technology has made it easier to measure the productivity of other white-collar workers. Long-standing workplace norms—like promoting from within or limiting disparities in pay—have broken down. It’s not just the ethos of the baseball star in the era of free agency; that was the ’90s. It’s the ethos of baseball super-agent Scott Boras in the era of sabermetrics, in which players and their negotiators claim to precisely identify each player’s value, then jump from team to team to extract every last ounce.
And it’s only getting worse as technology draws more people into the free-agent market. Today, in a matter of seconds, you can hire an independent operator to shuttle you around town or assemble your Ikea furniture, thanks to apps like Uber and TaskRabbit. Within a few years, employers will be using similar apps to hire white-collar workers. Already there is an app called Wonolo that allows companies in need of temps—from e-commerce fulfillment-center workers to supermarket shelf-stockers—to conjure them up, Uber-style, within an hour or two. The founders of Wonolo hope to expand their reach to hospitals and law firms.
The only safe employees are those in occupations where it’s difficult to measure a worker’s individual contribution to the bottom line. For example, you might imagine pharmaceutical companies would pay their star researchers enormous sums, but the rainmaker dynamic isn’t yet pervasive in the life sciences. Bryan Roberts, a venture capitalist who specializes in health care, points out that scientific research doesn’t usually proceed in the manner of the “University of New Hampshire math professor who solves the theorem all by himself and says three words every 10 days.” Major breakthroughs typically require large teams of scientists working together, backed by an enormous infrastructure, making it hard to isolate the impact of any single one of them. As Peter Drucker famously lamented, it can be maddeningly difficult to break out the value of “knowledge workers” to their employers.
Yet even this measurement problem affords only so much protection. Any veteran lawyer will tell you that every major firm has dozens of partners who do essential work without ever landing a client. But it’s the lawyers who bring in the business who realize enormous paydays. Most companies are so intent on rewarding the workers they consider productive—and so afraid of losing stars to rivals—that they ignore the deeper questions of how to define or measure employee value. As Alan Krueger, President Obama’s former chief economist, noted in a recent speech, how else could oil company CEOs earn more when the price of oil rises, even though they have no control over this change?
Market fundamentalists will say that such irregularities are simply the price we pay for a fast-growing economy. The greater the incentive to produce, the harder people work. But reality isn’t so tidy. Research suggests that workers care more about how their wages stack up against colleagues’ wages than about their absolute take-home pay. One study found that when a member of a workplace duo feels short-changed relative to the other, the team’s overall productivity falls by more than it would if both workers had their wages cut. The implication is clear: If we keep letting winners take all, eventually there will be less and less for them to take.