Wednesday, November 22, 2017

Computational Limits of Empire

The tabulation of the 1880 US census took 8 years to complete. As preparation began for the 1890 census, it was estimated that tabulation would not be complete until after the 1900 census began! The computational load was declared to be too great; an alternative approach was needed.

The problem was solved by a mechanical computer based on punch cards. A company was founded specifically to build the contraption; that company would later become IBM.

I was thinking about this story, and I wondered: just how large was the US population in 1890? Did other nations reach that population level before? How did they handle the problem?

The 1890 US census counted 63M people in total. How large did the Roman empire grow? Well, the Roman empire seems to have reached its peak around… 60M people. At this point I really started to get suspicious, and looked up population statistics for the ancient Persian and Chinese empires: 50M people for the Achaemenid empire (Persian), while China had 30-85M under the Han dynasty, stabilized around 50M for a few centuries, then grew from 45M to 80M under the Tang dynasty.
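As a quick sanity check on the Census Bureau’s worry, now that the 1890 number is on the table: here is a minimal back-of-envelope sketch, assuming hand-tabulation time scales linearly with head count. Real cross-tabulations plausibly scale worse than linearly, so this is, if anything, optimistic. The population figures are the actual census counts, rounded.

```python
# Back-of-envelope: project the 1890 hand-tabulation time from the 1880 one,
# assuming tabulation time scales linearly with population (a simplification;
# cross-tabulations plausibly scale worse than linearly).
pop_1880 = 50.2e6    # 1880 census count, ~50M
pop_1890 = 63.0e6    # 1890 census count, ~63M
years_1880 = 8       # hand-tabulation time for the 1880 census

years_1890 = years_1880 * pop_1890 / pop_1880
print(f"Projected 1890 tabulation time: {years_1890:.1f} years")
# ~10 years: finishing right around the start of the 1900 census,
# matching the estimate that drove the switch to punch cards.
```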

Next, I pulled up Wikipedia’s list of largest empires and Business Insider’s list of top 10 greatest empires. I had to google around for population stats, many of which were not immediately available, but among the big ones, excluding empires from 1700 or later, peak populations clustered in a strikingly narrow band.
There were a number of smaller “empires”, mainly the predecessors and/or successors of the big ones. On the other end, only the Mongols managed to scrape together an empire of over 100M people, and that empire split within a generation (spinning off the 60M-person Yuan dynasty).

Yes, this is a far cry from systematic. Yes, there’s room to complain about selection effects. Nonetheless, there is at least a very noticeable tendency for pre-modern empires to max out in the 50-70M population range.

Is the empire population cap due to computational limits in governance? I’m not sure how to properly test that hypothesis, but it does seem awfully suspicious that the founding event of the modern computing industry was triggered specifically by the US passing that 60M population mark.

One interesting question to pursue next: how did other modern nations/empires handle passing the 60M population mark? India and China both achieved sustained growth and built stable nations of over 100M people during the early modern era. Presumably the British empire’s population was also beyond 100M during much of the 19th century. Did these states also face computational bottlenecks? What techniques did they introduce which might explain their ability to overcome the 60M-person cap?

Monday, November 13, 2017

The Open-Source Alternative to a College Degree

Five years ago, Massive Open Online Courses (MOOCs) were the hot new thing in higher education. Finally, the time was upon us! The internet was set to upend our outdated modes of education!

Today, that does not seem significantly closer to materializing.

MOOCs failed to usher in an era of cheap, large-scale higher education for exactly the same reason that OpenCourseWare failed to usher in an era of cheap, large-scale higher education ten years earlier: they solve the wrong problem. “Education” is not about learning things; it’s about signalling. People don’t study in school because they’ll need all that knowledge on the job. People study in school to show how smart and hardworking they are, so companies will hire them.

Similarly, the value of college isn’t in making students memorize factoids or formulas. The value of college is in filtering students. Employers hire graduates because colleges filter out weaker candidates, both in admissions and over the course of a four-year degree. College grads are much more likely to make strong employees.

But in shifting from a learning-view of education to a signalling-view, one thing stays the same: college seems like an awful lot of resources to burn. Surely the benefit could be captured without spending four years and two hundred thousand dollars? If anything, it seems like signalling intelligence and work ethic ought to be even less resource-intensive than learning things!

So, what might a viable alternative to college look like? If not MOOCs, then what?

Within software, one answer might be open-source contributions.

Already today, companies are eager to hire significant contributors to major open-source projects. And such qualifications seem much more relevant to software engineering than a degree: working on a large open-source project is nearly identical to working on a large project at a software company. A candidate who has contributed lots of code to a popular library or framework will almost certainly be successful writing similar code for a company.

On the flip side, open-source projects are constantly in need of more hands. Even the most popular libraries have long wishlists. There’s no Common Application to get started: just pick a software package, browse the open tickets, and go. The filtering comes from project owners, who will review any proposed changes or additions to the code. If your code doesn’t pass muster, redo it until it does - the project owner will likely explain exactly where it falls short. If the owners of a project are unpleasant to deal with, go contribute to a different project - though projects are unlikely to grow large in the first place with unpleasant management.

On the other hand, compare to college. There’s a lengthy admissions process of questionable granularity, followed by four years of professors who may or may not be interested in helping you. If you fail, it’s a permanent black mark, even if it’s in some stupid class unrelated to your career. At the end of the day, your incentives, employers’ incentives, and colleges’ incentives are not very well aligned.

So why do people still go to college, rather than taking some online programming classes and then working on open-source projects?

One answer, presumably, is that college is the default path. The open-source alternative is non-obvious, especially to people not yet in the software industry. It also lacks the flexibility of a college degree. These both seem like reasonable explanations, but neither is a serious roadblock to wider “adoption” of the open-source alternative. Of course, moronic HR departments are another issue, but that only matters at large companies.

Perhaps the most serious roadblock is simply that nobody is promoting the open-source alternative, so nobody knows it’s there. In that case, there is an obvious group that is incentivized to promote it: owners and managers of open-source projects. If Apache were to promote open-source work as an alternative to a degree, they might find a lot more helping hands.

Am I missing anything here? Is there some other reason why the open-source path would not work? Let me know.

Tuesday, November 7, 2017

Post-Scarcity

Background: This is part of a short series on high-level principles relevant to political/social issues. The previous post discussed depersonalization and scalability of interactions. This post can be read standalone.

If you want to understand the modern economy, as opposed to the economies of yore, the one source I recommend most strongly is a short story from the July 1958 issue of Astounding Science Fiction, titled “Business As Usual During Alterations”. It’s roughly a 15 minute read. I’m about to throw out major spoilers, so stop reading here if you want to enjoy the story first.

One morning, two devices mysteriously appear in front of city hall, along with directions on how to use them. Each has two pans and a button. Any object can be placed in one pan and, with a press of the button, a perfect duplicate will appear in the other pan. By placing one duplicator device in the pan of the other, the device itself may be duplicated as well.

Within a span of hours, material scarcity is removed as an economic constraint. What happens in such a world?

People tend to imagine the dawn of a new era, in which human beings can finally escape the economic rat-race of capitalism and consumerism. In the world of the duplicator, a pantry can provide all the food one needs to live. A single tank of gas can drive anywhere one wishes to go. Any good can be copied and shared with friends, for free. All material needs can be satisfied with the push of a button. Utopia, in a nutshell.

The main takeaway of the story is that this isn’t really what happens.

Towards the end, a grocer explains the new status quo eloquently:
“... not very many people will buy beans and chuck roast, when they can eat wild rice and smoked pheasant breast. So, you know what I've been thinking? I think what we'll have to have, instead of a supermarket, is a sort of super-delicatessen. Just one item each of every fancy food from all over the world, thousands and thousands, all different”
Sound familiar?


Of course, that’s just the tip of the iceberg. When it comes to digital goods, like music or videos, the world of the duplicator is exactly the world in which we now live. That’s the obvious parallel, but let’s not stop there.

Over time, the value of raw materials and manufacturing has steadily fallen as a fraction of economic output. Even when looking at material goods, efficiency has shifted the bulk of costs from materials and manufacturing to design and engineering. We are converging to the world of the duplicator, where marginal production costs hit zero, and in many areas we’re already most of the way there!

This hasn’t made economic activity disappear. Pulling from the story again:
“This morning, we had an economy of scarcity. Tonight, we have an economy of abundance. And yet, it doesn't seem to make much difference, it is still the same old rat race.”

I won’t spoil all of the remarkably prescient predictions of the story - do read it yourself.

Badge Value
Here’s one good you can’t just throw on a duplicator: a college degree.

A college degree is more than just words on paper. It’s a badge, a mark of achievement. You can duplicate the badge, but that won’t duplicate the achievement.

Rory Sutherland is another great source for understanding the modern economy. The main message of his classic TED talk is that much of the value in today’s economy is not “material” value, i.e. the actual cost of making a good, but “intangible” or “badge” value. A college degree is an extreme example, but the principle applies to varying degrees in many places.

The sticker price on an iPhone or a pair of Converse isn’t driven by their material cost. A pair of canvas high-top sneakers without a Converse logo is worth less than a pair of Converse, because Converse are a social symbol, a signal of one’s personal identity. Clothes, cars, computers and phones, furniture, music, even food - the things we buy all come with social signals as a large component of their value. That’s intangible value.

In the world of the duplicator, the world to which our economy is converging, badge value is the lion’s share of the value of most goods. That’s because, no matter how much production costs fall, no matter how low material costs drop, intangible value remains.
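To put the convergence claim in symbols (notation mine, not from the story or the talk): split a good’s price p into material/production cost m plus badge value b. However modest b is in absolute terms, the badge share of the price goes to one as productivity drives m toward zero:

```latex
p = m + b,
\qquad
\text{badge share} = \frac{b}{m + b} \;\longrightarrow\; 1
\quad \text{as } m \to 0 .
```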

In the past, we’ve sold material value because that was the scarce commodity. Now the shoe is on the other foot: we’ll sell intangible value.

Jobs & Employment
One particularly prescient line from the duplicator story:
“You know, when we first got the word about this thing, this duplicator, we immediately started thinking in terms of pretty drastic retrenchment. Then... it turned out we didn't have much fat to spare. Engineers, draftsmen, designers; we need about six times as many as we have. Nut-twirlers and button-pushers on assembly lines will go; but mechanics, craftsmen who can take a blueprint and turn out a piece to specified tolerance...”
Sound familiar?

We’re already well into the post-scarcity economy, and sure enough, nut-twirlers and button-pushers are disappearing rapidly. Yet every other week, news outlets are running stories about the shortage of STEM workers. The economy of the future, we’re told, needs thinking and creativity rather than repetition and basic labor.

The root cause of all this is the economic equivalent of the duplicator: steady growth of economic productivity, and the consequent reduction of materials and manufacturing as a share of cost.

The duplicator story gets one big thing wrong, however: it predicts that the shift in labor demands will be met by retraining. That’s an elusive dream still chased today, most recently by MOOC advocates. But at the end of the day, learning is not the main purpose of most education - after all, most of what people learn is never used on the job. Education is about signalling, through degrees and grades - badge value. That badge isn’t saying “I know Newton’s laws”; it’s saying “I have handled intellectually challenging problems”. Until we learn to create whatever cognitive capabilities a college degree filters for, retraining is unlikely to turn nut-twirlers into engineers.

And that’s the optimistic case. What if colleges don’t filter for a fixed skill level at all, but instead filter for a relative skill level? Oversimplifying a bit, what if colleges just give degrees to the smartest 20% of people they can find?

Keeping Up with the Joneses
The basic problem with badge value, and with signalling in general, is that a badge isn’t worth anything if everybody has it. In order for a badge to be worth something, there have to be people without the badge. It’s a zero-sum game.

Keeping up with the Joneses is a classic example: people buy things to signal their high status, but then all their neighbors buy the same thing. They’re all back to where they started in terms of status, but everyone has less money.
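A toy model makes the zero-sum structure explicit (an illustrative sketch of my own, with made-up numbers): treat status as pure rank by visible spending. A uniform spending increase leaves every rank unchanged, and since ranks merely get shuffled among people, total status is fixed - one person’s gain is necessarily another’s loss.

```python
# Toy "keeping up with the Joneses" model: status = rank by visible spending.
# Illustrative numbers only.

def status_ranks(spending):
    """Assign each person a status rank by spending (0 = lowest)."""
    order = sorted(range(len(spending)), key=lambda i: spending[i])
    ranks = [0] * len(spending)
    for rank, person in enumerate(order):
        ranks[person] = rank
    return ranks

before = [10, 20, 30]             # neighbors' visible spending
after = [s + 15 for s in before]  # everyone buys the same status good

print(status_ranks(before))  # [0, 1, 2]
print(status_ranks(after))   # [0, 1, 2] -- same status, everyone poorer
# The ranks always sum to 0 + 1 + ... + (n-1): net status cannot be produced.
```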

The prevalence of zero-sum signalling in today’s economy stems from the reduction of material scarcity. If you think about it, zero-sum games are inherent to a so-called post-scarcity society. A positive-sum game implies that net production of something is possible. That, in turn, implies that something was scarce to begin with. Without scarcity, what is there to produce?

To put it differently: there’s always going to be something scarce. Take away material scarcity, and you’re left with scarcity of status. If there’s no way to produce net status, you’re left with a zero-sum game. More generally, remove scarcity of whatever can be produced, and you’re left with scarcity of things which do not allow net production at all - zero-sum goods.

Today’s world has found a way to get around this problem somewhat: heterogeneous cultures. The baristas at Sightglass Coffee have very high status among hipsters, but hardly any status with bankers. Janet Yellen has very high status among bankers, but hardly any status with hipsters. Each culture has its own internal status standards, allowing people to have high status within some culture even if they have low status in others.

Cultural heterogeneity allows net status to be produced, by increasing the kinds of status which one can have. When “hipsters” became a thing, they brought along their own kind of status in addition to all the old kinds of status. Cultural granularization makes status signalling positive-sum. But from another perspective, it just kicks the zero-sum game up to the group level: hipsters as a group compete for status with bankers, in a zero-sum manner. Thus tribal conflict between groups.

Rent Seeking
With all this talk of zero-sum games, the last piece of the post-scarcity puzzle should come as no surprise: political rent-seeking.

Once we accept that economics does not disappear in the absence of material scarcity, that there will always be something scarce, we immediately need to worry about people creating artificial scarcity to claim more wealth. This is the domain of political rent-seeking, of trying to limit market entry via political channels.

One simple way to measure such activity is via lobbying expenditures, especially by businesses. Such spending actually seems to have flattened out in the last decade, but it’s still multiple orders of magnitude higher than it was thirty or forty years ago.

Conclusion
Remove material scarcity as an economic constraint, and what do you get? The same old rat race. Material value no longer scarce? Sell intangible value. Sell status signals. There will always be something scarce.

Between steady growth in industrial productivity and the advent of the digital era, today’s world looks much more like the world of the duplicator than like the world of 1958. Yet many people are still stuck in 1950s-era economic thinking. At the end of the day, economics studies scarcity. Even in the world of the duplicator, where any material good is arbitrarily abundant, scarcity still exists.

This is the world in which we live: as material and manufacturing costs fall, badge value constitutes a greater and greater fraction of overall value. On the employment side, falling marginal production costs mean less need for assembly line workers, and more need for engineers, designers, and high-skill trades. And politically, less material scarcity means more investment in creating artificial scarcity, through political barriers to market entry.

Welcome to the post-scarcity economy.