Peter Thiel and thinking for yourself

Peter Thiel is asked for the formula for starting great businesses at every talk he gives. His answer is the same every time: “There is no formula. You have to figure it out for yourself.”

In his interview with Thiel, Tyler offers a summary of Thiel’s thought. (Search for the paragraph that mentions Tocqueville.) I haven’t read enough Girard to follow the part about original sin, but Tyler describes Thiel as someone who is trying to get us to break free of socially-derived opinions and to see the world without distortions.

I was still in Europe when I read this, and it prompted me to think about the social environment there. It’s not at all hard to find people beating up on Europe as a bad place to start a tech company; you’ll find no lack of grievances about its regulatory attitude, its taxes, its anti-trust initiatives, its punitive bankruptcy codes, and so on. Still, I think that it’s underrated in one significant way. I submit that from a Thielian perspective we might expect great entrepreneurs to be better developed in Europe, especially in Germany, because it’s easier to be independent there.

My favorite review of Zero to One argues that despite appearances the book is not about how to found startups, but that instead it’s a book of ethics. Thiel thinks that we live in a society of deep conformism and constrained imagination. For him, the key to doing something great (of which starting a company is just one example) is to uncover insights hidden from popular opinion, or in other words to think for yourself.

Tyler has written that there’s an enormous sense of freedom in Sweden: “Autonomy reigns… Sweden is the land of the true individualist, sometimes verging on atomism.” I think it’s easier to be individualistic in Germany too. When I lived there I felt a freedom that’s unavailable in America, a social one not related to regulations or government expenditures. First you’re more free from pursuing status markers; second there are fewer pressures to conform. I’ll make this case focusing mostly on education.

Moving from Canada to an American suburban high school and then an American college was distressing in one particular way: it was hard to keep up. In college especially you feel irresistible pressures to seek and display prestige, most of which is earned by going through ever more grueling tournaments. When you enter college you join a big pool of students more or less like you, all trying to distinguish themselves in four years or so. That environment breeds the most intense mimetic pressures: the more that people want something (anything), the more desirable it becomes, and this works its way through until even those with only marginal interest get sucked in too.

I think that’s how you’re led to situations where something like 45% of the graduating Harvard and Princeton classes in 2007 entered finance. (That figure is 31% for the Harvard class of 2014.) Toss in consulting, tech, and medicine and you’ll probably account for a majority of the career aspirations of graduates from elite colleges. Now step back; isn’t that odd? For all of the talk about training people to think critically, somehow you find everybody trying to enter one of very few career paths.

Thiel has asked: “Is this a reason that we ended up sometimes underperforming because we are insecure about things, we want to get validated by winning various competitions?” Now I’m skeptical of the claim that all of us secretly dream of ditching finance to become marine biologists. But I think that these paths are so common because they offer not only prestige, but also assurance that others want this highly-desirable thing too.

Everybody in the world feels these pressures to some extent. I think though that in Germany this is less pronounced; there are fewer markers of social prestige, and it’s more normal to go on different career paths.

Start with schools. There’s no designation of an elite stratum of universities; no “Ivy League,” no “Oxbridge,” no “Grandes écoles,” no “zhongdian daxue.” While some schools are certainly better regarded, choosing a university better resembles a lifestyle choice. If you want to be in a big city, maybe you’ll go to the University of Munich or Humboldt in Berlin. If you want to be in a sunny area and be surrounded by hippies, maybe you’ll go to Freiburg or Heidelberg. Each of these has its specialties, of course, but they’re all ranked about the same, and they cost the same too (free except for a small administrative fee).

It’s not just postsecondary education. Germany is often praised for its system of apprenticeships. From fifth grade on, students are separated into academically-oriented schools (Gymnasien and Realschulen) and vocational schools (Hauptschulen). Students on the academic track are prepared for college work, while Hauptschule students are taught more work-related skills. After school they move on to apprenticeships in fields like construction and IT. It may be most desirable to enter a Gymnasium, but early on kids are aware that different paths are possible.

When I say that growing up in Germany helps bestow independent thinking, I’m not saying it’s because everyone is taught the Straussian art of close reading. Instead I’m arguing that society has suppressed the value of certain status indicators, and that encourages people to think for themselves. To put it another way, there are fewer tournaments for kids to go through, and the value of winning them is not so high. The Germans I’ve met are remarkably humble. Nobody feels the need to perpetrate an international hoax about how desirable they are. In addition, people aren’t all drawn to the same fields like finance and consulting. They take up professions like baking or manufacturing, and work with the earnestness that comes from knowing that their work is dignified; it’s easier for them to do the equivalent of moving to Dayton to study widget machines.

Let me end with one last speculation. Germans are taught about the crimes of the Nazi state from elementary school on. The Holocaust is covered in no fewer than three subjects: biology, history, and German language. People are taught that crowds can be wrong, and that it’s a duty to stand apart if you disagree. Maybe these frequent exhortations to avoid groupthink increase independent thinking on the margin.

Time to summarize. Thiel thinks that great businesses are built by people who discover secrets hidden by conventional opinion. I submit that you can become that sort of person more easily if you grow up in Europe, particularly in Germany. Put aside the question of taxes and regulations, and consider the social environment. America holds dear a lot of status symbols. Germany has fewer elite reference points and makes it common for people to pursue non-prestigious work; those in the services aren’t all trying to earn their master’s. Therefore we should expect more independent thinking to come from Germans.

Thiel himself thinks that Germany is too pessimistic and too comfortable. The best argument against everything I’ve said is to point out that, in fact, Germany has not produced any Facebooks or Airbnbs. Actually, the best-known German tech entrepreneurs may be the Samwer brothers, who are notorious for copying successful ideas from Silicon Valley to try to scale them in other markets. So much for originality.

So maybe taxes and regulations matter more after all; I also don’t want to pass over cultural norms that stigmatize failure. But if the limiting factor in great entrepreneurship is independent thinking combined with courage (as Thiel has said, courage is in shorter supply than capital or genius), then maybe it’s better to grow up away from America. After all, policies are easier to fix than the social environment, and original minds may grow up over there and start companies over here.

P.S. This column appeared in the Times just yesterday on why so few tech companies have emerged from Europe. At the end there’s this quote: “In Europe, stability is prized,” Professor Moser said. “Inequality is much less tolerated. There’s a culture of sharing. People aren’t so cutthroat.” I think that everything except the part about “stability” would be positives for Thiel.


Data on police officers killed since 1961

In 2012 I put together some data for Radley Balko on the purported rise of police killings. Last week I saw that Dara Lind prepared something similar at Vox. My data goes back a little further than Dara’s (hers goes to 1996, mine to 1961), and I thought to put up what I have here.

The FBI keeps track of two types of police deaths: accidental deaths and felonious killings, which involve the deliberate killing of law enforcement officers in the line of duty. I’ve collected three statistics related to the latter: first, the number of officers feloniously killed since 1961; second, the rate of felonious deaths per 100,000 officers since 1989; and finally, the average number of felonious deaths per five-year period since 1961. I present these statistics in chart form here, and at the end of the post I share my data file and talk about the process of obtaining these figures.

Everything is collected from the Uniform Crime Reports compiled by the FBI. I found one comment from the 1990 UCR report very interesting: “The 1990 total was the lowest since the FBI started collecting such data in the 1960s.” I was able to find older UCR reports online going back to 1961, which makes me somewhat confident that my data reaches the first years in which the FBI kept track of this number. I’d like to keep updating this as new data comes in so that it can be a complete and easily-searchable source for these numbers. Your help and feedback are appreciated.

Here’s the summary: In general, the job of policing has become much safer since 1961. Here are a few interesting points.

  • More officers were feloniously killed in the 11 years between 1970 and 1980 (1228 deaths) than in the 21 years between 1993 and 2013 (1182 deaths).
  • The rate of felonious killings per 100,000 officers has declined from about 18 in 1989 to about 5 in 2013. It was over 3 times safer to be a police officer in 2013 than 24 years earlier.
  • In the five years between 1971 and 1975, an average of 125 officers were feloniously killed per year. Most recently, between 2006 and 2010, the equivalent number is 50. That’s more remarkable given that the number of officers employed has increased considerably since the ‘70s.
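The arithmetic behind these bullets is easy to check. A short sketch in Python, using only the figures quoted in this post (no new data):

```python
# Checking the summary statistics above against the post's own numbers.

# Rate of felonious killings per 100,000 officers (quoted in the bullets).
rate_1989, rate_2013 = 18, 5
print(f"Safety ratio, 2013 vs 1989: {rate_1989 / rate_2013:.1f}x")  # 3.6x

# Deaths per year in the two periods from the first bullet.
deaths_70s, years_70s = 1228, 11  # 1970-1980 inclusive
deaths_90s, years_90s = 1182, 21  # 1993-2013 inclusive
print(f"{deaths_70s / years_70s:.0f} vs {deaths_90s / years_90s:.0f} deaths per year")
# roughly 112 vs 56: the annual toll was cut in half
```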

Now the data. Click on these pictures to zoom.

Number of officers feloniously killed since 1961

[chart: police-fatalities]

I’ve put in a trendline to better illustrate the decline. The peak year for deaths was 132 killings in 1972. The safest year recorded was the most recent: 27 deaths in 2013. That’s nearly an 80% drop. The number of deaths has steadily decreased since the ‘70s, with two spikes in 2001 and 2011.

Next, felonious killings per 100,000 officers since 1989

[chart: police-fatalities-rate-per-100,000]

You’ll see from the data source in the next section that the number of officers grew from about 400,000 in 1990 to about 530,000 in 2000. Still, this decline in the rate of killings isn’t driven just by an expanding denominator (the number of officers) but also by a declining numerator (the number of killings): the number of killings decreased even as the number of officers grew by over 25%.

Data on the number of officers serving is really difficult to find, which is why my cutoff is 1989, the earliest year for which I can get reliable figures. I’ll talk more about this in the next section.

Finally, five-year averages of felonious killings

[chart: average-police-fatalities-decade]

This is just an aggregation of the first chart, useful for seeing the decline of felonious killings in half-decade chunks.

Summary

Every time a police killing makes it to national headlines, voices pipe up warning of an ominous trend in the rise of police officer killings. (See Radley’s recent compilation of some of these articles.) This data indicates that policing is much safer than in the past.

2013 was the safest year recorded for felonious killings of police, and it’s hard to go down from 27 deaths. Consider that an increase of 9 felonious killings of police in 2014 would be a 33% rise from the year before; meanwhile, 9 more felonious deaths over the 1972 peak would have been only about a 7% increase.
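The point is that the same absolute increase reads very differently against a small base. A quick sketch, using the 2013 low and the 1972 peak cited earlier:

```python
# The same absolute increase, measured against a small base vs a large one.
increase = 9
low_base = 27    # 2013, safest year recorded
peak_base = 132  # 1972, peak year for deaths
print(f"Against 2013's total: +{increase / low_base:.0%}")   # +33%
print(f"Against 1972's peak:  +{increase / peak_base:.0%}")  # +7%
```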

The data

I’ve compiled everything I’ve found into a Google Doc that you can find here. The first sheet holds the data I’ve collected, along with the source of every year’s UCR report. The next three sheets hold each of the three charts above. You’re very welcome to use it as you like, but please link to this original post or mention @danwwang.

Now some remarks about how I got the data. It was a big challenge to find some of these data points because collections are so haphazard, so I especially welcome feedback and corrections if you catch any errors.


The strangeness of Berlin

Berlin is one of the three cities in Europe that really made me go “wow.” It’s the one that I find hardest to characterize, but here’s an attempt.

Let’s start with the history. Berlin was hopping in the ‘20s, one of the cultural capitals of the world. Soon the fanatics took over and made it the capital of the Third Reich. Next came the Allies’ bombs and the Soviet tanks. Then the city was divided, and a massive wall broke it in half. That wall endured for nearly thirty years before it was torn down. Now it’s a vibrant place of three and a half million.

You can’t go through all of that without being weird, and that’s putting it mildly. I went to Berlin before I visited London and Paris, and didn’t then appreciate how special it is. Now I’ve had the chance to reflect, and I think Berlin is simply far more interesting than the other two.

Walk around. Notice that Berlin has no organically-developed architecture. You won’t find the consistency of London and Paris. Everything clashes with everything else; there is not the white, neoclassical grace of Westminster, or the more striking grandness that grows along the Seine. Not every building agrees even with itself; witness the glass dome designed by Norman Foster placed on top of the Reichstag. 

After a while, you might alight on a thought. It’s an uncomfortable one, because you don’t really want to believe it, and maybe it’s because you’re just tired, so perhaps you shouldn’t entertain it at all—but you do. Berlin is sort of ugly.

There are no skyscrapers designed by brand-name architects, like in London. There’s no central, well-preserved “oldtown,” like in Strasbourg. The heart of the city isn’t dominated by a centuries-old cathedral, like in Cologne or Milan. If you want to see well-preserved cities on the eastern side of Europe, Berlin is not your best bet; go to Prague or Budapest instead. If you want to see “typical” German architecture, drive through the Black Forest, up to the Rhine valley, or through Bavaria. Berlin might be thought of as a northern Munich, with its old Baroque buildings mixed with contemporary work; only Munich is sunnier, richer, and a hundred times cleaner. To me it’s not obvious that Berlin exemplifies any aesthetic perfection. There’s always another city that does something better.

But I don’t take this lack of beauty to be a negative. Instead I think of it as quite marvelous.

When I reach for examples of German culture my references always go in one of two directions. It’s either the highly-polished works of Beethoven, Schiller, Brahms, or Fontane. Or it’s the really dark stuff: Berg’s gruesome opera Lulu; Kafka’s surrealist short stories; Brecht’s near-tragic Threepenny Opera; Schönberg’s atonal string quartets; Schiele’s crude, erotic paintings; Döblin’s Berlin Alexanderplatz; and on and on.

The first group feels like the product of a Munich or Cologne upbringing. The latter, with its seediness and edginess, belongs to the spirit of Berlin. (Yes, I know that many of the people here aren’t Germans, but their works are in German or they’re German-speaking, and that’s what’s relevant.)

So what’s attractive about Berlin is precisely what’s missing in the cities that are beautiful. It’s not perfect and it cares not to be. Walking through its streets and thinking about the place is unsettling; you don’t know if something strange and unfortunate is going to happen next. That gives it an incredible vibrancy, a freedom that comes from knowing that it doesn’t have to be gorgeous or be beholden to the aesthetic past. Consider that both east and west were equally vigorous in destroying old buildings. The east even managed to demolish the Berlin Palace (Berlin Schloss), the summer residence of the Hohenzollern kings.

Berlin will surprise you. One hears all the time about how Germans are so great at planning and engineering. And then you read of something like the construction of the new airport in Berlin, which has been so mismanaged that every year it needs to add two more years to its completion date, and needs to take out another billion in loans. It was supposed to start operating in 2011, and completion now looks like it’s going to be 2017. The story of its construction involves huge plot twists, and at this point you can’t help but laugh at headlines like “Berlin Airport: The five biggest mistakes,” and “An endless debacle at the BER airport.”

What fun to live in a place like that, in spite of knowing that the hilarity comes from the mismanagement of your taxes. My great complaint with living in southern Germany is that it’s far too comfortable. Things are beautiful and need no change. The occasions for surprise are always structured. Where are the plot twists, the vendors selling delicious goods without a license, the spontaneity that comes when you know that neighbors don’t judge? Everything in the south is polite. Berlin is not that.

The message of Berlin is that not everything is set, that it has room for you. The latter I mean quite literally: There’s plenty of housing available. Someone told me that his two-bedroom apartment in a nice area of the former West Berlin costs 200 euros a month. It’s a small place, but a good location. Is it possible to live anywhere close to SoHo or the Ninth Arrondissement for less than seven or eight times that amount? And it’s not just housing; the food options are diverse and cheap, and you hear sometimes of the amazing nightclubs set up in abandoned warehouses.

Berlin can’t stay weird and cheap forever. Plan a visit before it turns into Paris.

(Here’s some color footage of Berlin in July 1945.)

@danwwang


Space and military experiments in the sixties, and what we’ve lost

I’ve just finished Tom Wolfe’s The Right Stuff. It’s about Project Mercury, America’s first manned spaceflight program, which paved the way for Gemini and Apollo.

It focused more on the relationships among the astronauts than I would have liked, though I wouldn’t have enjoyed it more had it gone harder on the technical details: I wish I knew enough physics and engineering to appreciate its points about propulsion, but I don’t. Instead, I most enjoyed reading about the environment that produced the technology. (This is also how I felt when I read The Idea Factory, a book about the inventions that came out of Bell Labs.)

What’s most striking is how easy it was then to run experiments with unpredictable consequences. It’s almost unbelievable to read about everything that the American government was willing to try in order to beat the Soviets. Reciting anything like a precautionary principle to a scientist at that time would probably provoke incredulity and contempt. The sixties were a time when people won funding and permission for trying out really radical things, on a scale that’s hard to grasp today. Here are some examples of what I mean, by no means an exhaustive list of interesting projects of that time.

  1. The rockets that put the first American astronauts into orbit were modified intercontinental ballistic missiles. NASA made some tweaks to the Redstone and Atlas missiles, stuck astronauts on top of them, and shot them up. That was how Alan Shepard entered space in the first Mercury flight in 1961. Someone thought that you can send astronauts into space and bring them back to earth on missiles designed to deliver warheads, and they were right.
  2. The sound barrier was broken by an experimental rocket plane in 1947. The X-1 reached Mach 1.06 by being drop-launched from the bomb bay of a B-29 Superfortress. The X-planes managed to enter space, and the Air Force endeavored (unsuccessfully) to get them to be considered alternatives to NASA’s missions. Someone thought that you can break the sound barrier by launching a plane from the air rather than from the ground, and they were right.
  3. Then I learned that scramjets take this to a whole new extreme. Scramjets are drop-launched from about 50,000 feet, travel at around Mach 5 (~4,000 miles per hour), and can theoretically reach Mach 20. If I understand them correctly, scramjets don’t exactly have engines; instead they suck in a huge amount of oxygen and “ram” it into a combustor to produce thrust. They travel fast enough to get from New York to London in less than an hour. Experimental flights were conducted in the ‘50s and ‘60s, and a patent for the design was first filed in 1964. Alas, it doesn’t look like there has been much more research on scramjets over the last few decades.
  4. Project Orion, an investigation into nuclear propulsion, was started in the late ‘50s. Physicists thought that you could travel through space by continuously blowing up atomic bombs behind a spacecraft, which would be protected from the explosions by a copper- or lead-coated pusher plate. The project remained mostly theoretical, however, and no nuclear experiments were ever conducted.
  5. Speaking of nuclear explosions, perhaps the single best representative of the spirit of the times is the Starfish Prime test. In 1958, James Van Allen announced to the world his discovery of a layer of radiation trapped by the earth’s magnetic field. The military promptly decided to detonate a thermonuclear bomb inside it. Around 1960, the military had begun conducting nuclear tests at high altitudes and in outer space; Starfish Prime, carried out in 1962, involved detonating a 1.45-megaton bomb (about 100 times the size of the Hiroshima bomb) in what’s now known as the Van Allen radiation belt. The detonation temporarily altered the shape of the belt, knocked out a number of satellites, and created an artificial aurora borealis that could be seen from New Zealand to Hawaii. Only later did we learn more about the belt and discover, for example, that it plays a crucial role in shielding us from solar winds. Here’s James Fleming, a science historian, on Starfish Prime: “This is the first occasion I’ve ever discovered where someone discovered something and immediately decided to blow it up…” and no less with a hydrogen bomb.

It wasn’t just the government and the military that ran experiments; ordinary people felt the impact of the ‘60s innovations too. Microwave ovens were becoming commercially available, Norman Borlaug’s Green Revolution produced food for millions, and people debated the merits of massive civil engineering projects.

The precautionary principle is now being invoked to stop people from drilling a hole in the ground to force up natural gas. Imagine learning about these innovations with the attitude of today. “Bring into our homes a machine that heats food by means of electromagnetic radiation? We need decades to study the effects of this.”  “Break the sound barrier? Why do we need to break stuff?” “Engineer new types of crops? Let’s stick with what’s natural.”

We shouldn’t detonate thermonuclear bombs inside things we’ve only just discovered. Still, it feels like we’ve lost something important. In the ‘60s, people thought through how something should work in theory, designed experiments, and ran them to test their ideas. Their successes became the basis for new inventions, which ordinary people accepted for commercial use.

With so many leaps in technology, what a thrill it must have been to live in the sixties and look forward to the things to come. But something changed, and it feels like we’re no longer so eager to run experiments or to accept even not-so-radical inventions. Commercial flight hasn’t gotten faster in decades. Our kitchens haven’t changed much in 50 years. The most successful commercial applications of military technology, namely GPS and the internet, both came from the ‘60s. What a pity that the moon landings in 1969 marked not a new era for human ingenuity, but a capstone for the old one.

The Logic of Nuclear Exchange and the Refinement of American Deterrence Strategy

The most spectacular event of the past half century is one that did not occur. We have enjoyed sixty years without nuclear weapons exploded in anger.

What a stunning achievement—or, if not achievement, what stunning good fortune. In 1960 the British novelist C. P. Snow said on the front page of the New York Times that unless the nuclear powers drastically reduced their nuclear armaments thermonuclear warfare within the decade was a “mathematical certainty.” Nobody appeared to think Snow’s statement extravagant.

We now have that mathematical certainty compounded more than four times, and no nuclear war.

– Thomas Schelling, 2005 Nobel Prize in Economics Lecture

When Robert McNamara was named the Secretary of Defense in 1961, he brought to the Pentagon a group of aides who came to be known as the Whiz Kids. They were young, book-smart men eager to apply the latest in systems analysis, game theory, and operations research to military strategy.

It did not take them long to alienate senior officers. Once, to settle a particularly heated argument about nuclear plans, a 29-year-old Whiz Kid declared: “General, I have fought just as many nuclear wars as you have.”

The flip remark understates a fact that deserves great wonder: the world has now gone seven decades without nuclear destruction. The thermonuclear war that was once regarded with the greatest of fears, and as a mathematical certainty, has not come to pass.

It’s also a startling display of the role that a group of civilians played in defining U.S. nuclear strategy. After a first draft by the military, American strategic objectives were subject to continuous refinement. Many of these refinements came from civilian theorists, most of them at the RAND Corporation, few of whom had seen war. One of the earliest nuclear intellectuals from RAND started out as a naval strategist; when he produced his most important work on naval strategy, he had never seen the ocean, let alone set foot on a ship. In seminar rooms, these strategists pondered the novel challenges of the nuclear world and worked out ideas by discussing not the efficient application of force but rather the exploitation of potential force.

This essay is a short introduction to how nuclear weapons are created and deployed, and the ideas that strategists, policymakers, and the military implemented to reduce the risk of nuclear war.

(Published in prettier formatting on Medium.)

What Are Nuclear Weapons?

 

Thirty years after the detonation over Hiroshima, the world had produced enough nuclear weapons to yield the equivalent of about 3 tons of TNT for every man, woman, and child on earth. Here’s some context to put that figure into perspective.

Nuclear Explosions

What happens in a nuclear explosion? First, a huge blast drives air away, producing high winds and changes in air pressure that crush objects. Then comes radiation: direct radiation causes fatal illness within a few weeks, while thermal radiation causes first-degree burns a few miles away. Fires immediately follow; a strong blast can generate a firestorm, which destroys everything in a concentrated area, or a conflagration, which is not as strong but spreads along a front. Then there’s fallout: particles are scooped up from the ground, irradiated by the explosion, and spread depending on wind conditions. Finally, at a sufficiently high altitude, a blast may produce electrons that interact with the earth’s magnetic field, setting off an electromagnetic pulse that can destroy electronics and electrical equipment.

The world has set off over 2,400 nuclear explosions, nearly all of them by America or the Soviet Union, and most of them underground. America tested most of its weapons in the southwestern states of Nevada and New Mexico, or on islands in the Pacific; the Soviet Union conducted its tests mostly in Kazakhstan or on archipelagos in the Arctic Ocean.

Nuclear detonations have been set off underground, underwater, and in the atmosphere. Their effects on the earth have usually been minor, but sometimes permanent. As a dramatic example, America’s first hydrogen bomb, named “Ivy Mike,” completely obliterated the small Pacific island on which it was tested.

The effects of nuclear explosions have always provoked anxiety. Before the first nuclear test in New Mexico, Enrico Fermi rounded up his fellow scientists to place a grim bet. Some of them speculated that an atomic bomb would ignite the atmosphere, and Fermi offered wagers on whether the Trinity test might destroy the atmosphere of the planet, or merely that of New Mexico state. More recently, Carl Sagan wrote that instead of igniting the atmosphere, nuclear weapons may cool the world enough to produce a nuclear winter.

The effects of nuclear tests have not always been well controlled. Shortly after the Ivy Mike test, America detonated the most powerful thermonuclear bomb it would ever construct. “Castle Bravo” was expected to yield a blast of five or six megatons, but instead produced a blast of 15 megatons. The blast carried fallout to inhabitants on the Marshall Islands, some of whom ate the radioactive powder they believed to be snow. Hundreds were overexposed to radiation, and a nearby Japanese fishing ship crew suffered from radiation poisoning. Fallout from that blast eventually spread 7,000 miles, including to India, the United States, and Europe.

The Mechanics of Atomic and Hydrogen Bombs

There are two types of nuclear bombs. The atomic bomb creates temperatures equal to those on the surface of the sun; the much more powerful hydrogen bomb brings the equivalent of a small piece of the sun to earth.

The basic nuclear weapon is the atomic bomb, otherwise known as the fission bomb. Atomic bombs typically have yields measured in the thousands of tons of TNT, or kilotons. Their explosive force is generated by fission: when a neutron enters the nucleus of an atom of nuclear material (either enriched uranium or enriched plutonium), the nucleus splits, releasing a large amount of energy along with a few more neutrons. In the presence of a critical mass, those neutrons go on to sustain a chain reaction. There are two bomb designs for initiating fission: the gun assembly technique, which brings together two subcritical masses to form a critical mass, and the implosion technique, which compresses a single subcritical mass to a critical density.
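The difference between a subcritical and a supercritical mass can be pictured as simple exponential growth: if each fission triggers on average k further fissions, the count in generation g scales as k^g. A toy sketch (the k values are illustrative, not measured properties of any material):

```python
# Toy model of criticality: k is the average number of further fissions
# each fission triggers. With k < 1 the reaction dies out; with k > 1
# it grows exponentially, generation by generation.
def fissions_in_generation(k: float, generation: int) -> float:
    return k ** generation

for k in (0.9, 1.0, 2.0):
    print(f"k = {k}: {fissions_in_generation(k, 80):.2e} fissions in generation 80")
# At k = 2, generation 80 alone has about 1.2e24 fissions -- a
# macroscopic amount of material consumed in a very short time.
```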

On August 6, 1945, the U.S. Army Air Forces dropped the atomic bomb known as “Little Boy” on Hiroshima. Little Boy was a gun-type bomb with a core of 60 kilograms of uranium-235. About 700 grams of it fissioned (just over 1%), generating a blast of 12.5 kilotons; about 60,000 to 80,000 people were killed by the blast, while up to twice that number were killed by burns and radiation. Three days later, the U.S. dropped an atomic bomb on Nagasaki. The Nagasaki bomb, known as “Fat Man,” was an implosion-style bomb carrying 8 kilograms of plutonium-239. About 10% of that material fissioned, producing a yield of about 22 kilotons and instantly killing about 40,000 people. The complete detonation of its plutonium would have caused an explosion 10 times that size.
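As a back-of-the-envelope check on these figures: complete fission of a kilogram of material releases roughly 17.6 kilotons of TNT equivalent (a standard rule of thumb, not a figure from this essay). A sketch using Little Boy’s numbers:

```python
# Rough yield check, assuming ~17.6 kilotons of TNT equivalent per
# kilogram of fully fissioned material (a rule-of-thumb assumption).
KT_PER_KG = 17.6

core_kg = 60.0      # Little Boy's uranium-235 core
fissioned_kg = 0.7  # the amount that actually fissioned
print(f"Efficiency: {fissioned_kg / core_kg:.1%}")            # 1.2%
print(f"Estimated yield: {fissioned_kg * KT_PER_KG:.1f} kt")  # 12.3 kt
# ...close to the 12.5-kiloton blast cited above.
```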

The more sophisticated and far more destructive kind of nuclear weapon is the hydrogen bomb, otherwise known as the thermonuclear bomb, the fusion bomb, or the H-bomb. In a hydrogen bomb, heavy isotopes of hydrogen are fused together to form helium. That reaction releases a great deal of energy, far more than the chain reaction possible in fission bombs. Hydrogen bombs are far more difficult to construct than atomic bombs; nine countries possess nuclear weapons, but only five have definitely developed hydrogen bombs. A successful detonation requires the explosion of a fission bomb (the “primary”) to ignite the fusion reaction (the “secondary”). The difficulty is the risk that the primary might blow the weapon apart before the secondary ignites, an event referred to as a “fizzle.”

Hydrogen bombs are hundreds or thousands of times more powerful than atomic bombs. The first hydrogen device, which couldn’t be used as a weapon, was detonated by the United States in November of 1952. A true hydrogen weapon was not detonated by America until March, 1954. The bomb, Castle Bravo, was the most powerful nuclear explosion America would ever generate; at 15 megatons, it was nearly 700 times more powerful than the blast at Nagasaki. The Soviet Union would detonate its first hydrogen bomb in November, 1955. In 1961, it would go on to detonate the largest nuclear weapon ever: The Tsar Bomba had a yield of over 50 megatons, or over 2500 Nagasakis.
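
The multiples in this section follow directly from the quoted yields, as a quick sketch shows. (Tsar Bomba’s yield is often cited as roughly 57 megatons; that higher figure is what gives a multiple above 2,500 Nagasakis.)

```python
# Yields from this section, in kilotons of TNT equivalent.
NAGASAKI_KT = 22
CASTLE_BRAVO_KT = 15_000      # 15 megatons
TSAR_BOMBA_KT = 57_000        # often cited as ~57 megatons

print(round(CASTLE_BRAVO_KT / NAGASAKI_KT))  # ~680 Nagasakis
print(round(TSAR_BOMBA_KT / NAGASAKI_KT))    # ~2,600 Nagasakis
```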

The value of weapons of these sizes is not immediately obvious. A 1-megaton bomb would kill most people within a radius of several miles, and even the largest of cities would be destroyed by a bomb of 10 megatons.

How Are Nuclear Weapons Delivered?

There are two types of nuclear deployments. Strategic weapons are launched against homelands, while tactical weapons are used on battlefields.

Strategic nuclear weapons are typically delivered in one of three ways. First, they may be launched from bombers, either as free-fall gravity bombs or as air-launched cruise missiles (ALCMs). Second, they’re deployed on intercontinental ballistic missiles (ICBMs), which are launched from underground silos and are capable of reaching any target on earth. Finally, submarine-launched ballistic missiles (SLBMs) are deployed by submarines, which can lie at sea for months and surface only to launch. The majority of warheads are deployed on ballistic missiles, while a few hundred are located at bomber bases.

There has been a greater variety of tactical nuclear weapons, though they’re no longer deployed. They were once a regular part of arsenals, in forms including torpedoes, mines, artillery shells, and rockets. A young Colin Powell was an officer stationed in West Germany in 1958, tasked with guarding against a Soviet invasion; if the enemy came over, he was to fire 280 mm atomic cannons, whose artillery shells had yields of 15 kilotons (about the explosive force of Hiroshima). These tactical weapons have never actually been put to use.

Stockpiles

The global nuclear stockpile peaked at 70,000 weapons in 1986. Most have been owned either by the Americans or the Soviets.

Both countries have vastly reduced their arsenals. In the last 25 years, America has reduced its stockpile from about 23,000 weapons to approximately 7,000 today. Meanwhile, Russia has brought down its stockpile to around 8,000 weapons, from a peak of 30,000 inherited from the Soviet Union.

There are six other countries with confirmed nuclear weapons: France, China, the U.K., Pakistan, India, and North Korea. Israel is rumored but not officially confirmed to have nuclear weapons. Of all nine countries, five are confirmed to have hydrogen bombs: the U.S., Russia, France, China, and the U.K. India has claimed to have detonated a hydrogen bomb, but scientists debate whether it was a true two-stage thermonuclear device.

The vast majority of nuclear weapons are and have been operated by the U.S. and the Soviet Union; the stockpiles of other countries are minuscule in comparison. Currently France has the next largest stockpile, at around 300 weapons, while North Korea has fewer than 10. Motivations for acquiring the bomb have varied for every country. China, for example, sought not to depend too heavily on protection from the Soviet Union, just as Britain decided that it wanted warheads not controlled by America. Meanwhile, though France was motivated by a similar concern not to depend too much on the United States, it also developed nuclear weapons because it craved status. Charles de Gaulle believed that the bomb would “place France where she belonged, among the Great Powers.”

American Nuclear Strategy

As America demobilized after the Second World War, Eisenhower believed that nuclear weapons were a cheap substitute for maintaining a large army to deter Soviet aggression. With his Secretary of State John Foster Dulles, he defined a policy called “New Look” that relied on nuclear forces, as opposed to conventional forces, to deter aggression. The United States would be “willing and able to respond vigorously at places and with means of its own choosing.”

What did that mean in practice? At the discretion of the president, the entirety of the American nuclear stockpile would be delivered to enemy targets, both military and civilian. It was a first-strike policy: The enemy faced vast destruction if the United States determined that it crossed a line. Eisenhower and his staff considered it the ultimate deterrence.

It also attracted immediate skepticism from strategists, who considered the policy reckless and crude. First, it seemed practically an invitation for the Soviets to strike America first: before undertaking any major action, Soviet forces would have every incentive to eliminate the American means to respond. Second, Eisenhower drew no bright line for incurring nuclear attack. Was America ready to initiate a nuclear exchange, and guarantee the deaths of millions, in order to prevent a small country from turning Communist? What about Soviet meddling in the internal affairs of an allied country? In other words, the commitment to initiate an exchange was insufficiently credible.

Strategists who made it their living to think about nuclear exchange attempted to make improvements. Many of them were analysts at the RAND Corporation, a research institute set up by the Air Force to improve engineering and ponder novel scenarios for the modern world. These analysts tried to create options between official U.S. displeasure and full-scale thermonuclear exchange.

The rest of this essay is about certain ideas they developed to reduce the likelihood of mutual destruction. It gives a broad overview of the evolution of American strategic thinking, which began with massive deterrence, explored and ultimately rejected elaborate methods of defense, and ended up relying once again on a robust system of deterrence.

Counterforce

William Kaufmann was a RAND analyst and political science professor who tried to create opportunities to wage limited war with weapons of unlimited power. He developed and championed a strategy that came to be known as “counterforce.”

There are two types of targets: military, which includes airbases, command stations, barracks, etc.; and civilian, which means cities and industrial sites. Early nuclear plans made no distinction between them. When authorized by the president, the stockpile would be launched against every target deemed to be valuable.

Kaufmann developed a different strategy: In case of conflict, not every warhead would be launched, and those that were launched would strike only military targets. The goal was to wipe out the enemy’s military capabilities while warheads held in reserve would threaten enemy cities. In the ideal world, after suffering a (reduced) retaliatory strike, the United States would have eliminated Soviet military capabilities and would be able to use Soviet cities as hostages to bargain for surrender.

What were the virtues of counterforce, as opposed to the “countervalue” strategy that targeted cities as well?

First, civilians would be spared the brunt of the force. In a full-scale nuclear exchange, defense scenarios anticipated hundreds of millions of Soviet and American deaths, no matter who launched first; a successful counterforce attack, by contrast, was projected to save over 100 million lives. A counterforce strike also gave the retaliating side an incentive to target only military sites in return. Moreover, from a strategic standpoint, it created a chance for nuclear war to be limited: counterforce offered the enemy an opportunity to recognize defeat early and surrender with its cities intact.

The Pentagon warmed to counterforce. By 1962, Secretary McNamara publicly declared counterforce to be official U.S. policy, and encouraged the Soviets to adopt it as well.

It also had its skeptics, who argued that there were no guarantees it would work as planned. When the enemy detected ICBMs, SLBMs, and strategic bombers racing towards its territory, it had no way to determine that it was subject to a “mere” counterforce strike. It was not clear that counterforce could really stave off escalation; perhaps the simplicity of massive deterrence was the best strategy after all.

Curtis LeMay, head of the Strategic Air Command (SAC), thought it meant going soft on the enemy; Thomas Schelling, who worked at RAND and consulted for the Pentagon, never fully embraced it; and even McNamara ended up skeptical of its usefulness. As a result, American nuclear strategy see-sawed between counterforce and massive deterrence; counterforce would be integrated into nuclear plans, then quickly stripped away, only to be re-introduced years later.

Conventional War

In addition to counterforce, Kaufmann also advocated for another way to keep war limited: Building up conventional military forces.

This was precisely the strategy rejected by Eisenhower. The Soviets were far superior in troops and tanks, enough to overrun Europe. Instead of trying to match their forces, Eisenhower wanted to rely on the massively-destructive and easily-deployable nuclear bomb to stave off attack or deter aggression in the first place.

But massive deterrence was risky. The enemy would probe many gray areas to test which actions were permissible; in each instance the American president had to decide whether to permit the action and lose face, or to launch the warheads, risking national suicide while guaranteeing the deaths of millions.

Kaufmann thought it reckless to use nuclear weapons at all, save in the gravest of circumstances. He observed that America’s most successful foreign actions, such as the Berlin Airlift and the intervention in Korea, were carried out without the use of nuclear weapons, and he believed that their use could continue to be spared.

But that would require the United States to invest in different means of response. He suggested building up conventional forces, which included significant ground forces to beat back a Soviet invasion of Europe and smaller-scale teams that could be rapidly deployed to “hot spots.”

In the logic of deterrence, an investment in conventional warfare was a signal that nuclear arms were too dangerous to be used. Building up conventional forces was advocated not only by Kaufmann but also by important figures like Bernard Brodie and Herman Kahn, two of the earliest nuclear strategists. The growth of conventional forces in the Kennedy Administration was an acknowledgment to the Soviets that conflict could be met without compelling the use of nuclear arms.

Schelling, in his Nobel Prize lecture, considered conventional forces to be a form of arms control, as if both sides had signed a treaty not to engage in nuclear exchange: “The investment in restraints on the use of nuclear weapons was real as well as symbolic.” With more options available, going nuclear was moved even further back to be the path of last resort.

SIOP: Single Integrated Operational Plan

Until the end of the Eisenhower Administration, nuclear target planning was delegated to senior military commanders. No single group or person oversaw the selection of targets or organized the deployment of the nuclear force.

Take a second to imagine what that meant. The president had only the binary decision to strike or not to strike. If he decided to strike, it was up to the different services, each with its own stockpile, to deploy the weapons. The Air Force, the Navy, and the Army made their own war plans. Multiple, redundant warheads would be delivered to a target if it was selected by more than one branch. Due to the lack of coordination, an attacking force might be wiped out by the detonation caused by another American strike. Everyone launched at their own pace; the Navy was found to have been planning strikes fifteen days after the start of war.

This was the state of American nuclear plans for over a decade. The Joint Chiefs of Staff resisted the idea of letting a single branch, which would most likely be the Air Force, control all warheads. The Navy was loath to give up its prized nuclear-armed Polaris submarines, and regarded all moves to centralize as a plot by the Air Force to monopolize nuclear weapons.

It was only towards the end of the Eisenhower Administration that military objections were overruled. In 1960, Eisenhower authorized the creation of the Single Integrated Operational Plan (pronounced SEYE-OP) to coordinate a contingency plan of nuclear strikes. Only then did the United States integrate target selection into a national plan led by a single organization: the SAC.

SIOP went through different iterations. Some of them integrated the doctrine of counterforce, giving the president different options for launching strikes.

But still it didn’t eliminate concerns about overkill. SAC made extremely pessimistic assumptions about the probability of a successful strike. It planned to lay down four thermonuclear warheads, with a combined power of 7.8 megatons, on a Russian city the size of Hiroshima; the successful detonation of all of them would generate an explosive force 600 times more powerful than the 12.5-kiloton bomb that wiped out the Japanese city. The plan also did not consider the impact of fallout damage, because fallout has little military value. These assumptions gave SAC grounds to constantly demand more bombs and bombers.
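
The overkill arithmetic is easy to check with the figures quoted above:

```python
# Four warheads totaling 7.8 megatons assigned to a single city,
# versus the 12.5-kiloton bomb that destroyed Hiroshima.
planned_kt = 7_800
hiroshima_kt = 12.5

print(planned_kt / hiroshima_kt)  # 624.0, the "600 times" figure
```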

Still, most iterations of SIOP still emphasized the launch of nearly the entire stockpile. Plan 1-A would involve launching over 3000 nuclear weapons, projected to kill nearly 300 million people mostly in Russia and China. SIOP also targeted countries like Albania, for which the presence of a single large air-defense radar was enough to justify a strike by a megaton bomb; no consideration was given to the political fact that the country had been drifting away from the Soviet bloc.

SIOP was refined by different administrations and by different secretaries of defense, but it always suffered two flaws: massive overkill and relative inflexibility in the severity of response. Reading SIOP made presidents and generals feel “appalled” and “stunned”; it would be referred to by Henry Kissinger as a “horror strategy.”

Herman Kahn and the Bomb

Herman Kahn was the kind of eccentric you no longer see in public life. In his capacity as an analyst at the RAND Corporation, he made a vast effort to get the public to consider his ideas about what happens after a thermonuclear exchange.

The thought of nuclear war was mostly too grim to behold, and Kahn acknowledged that head-on by writing a book called Thinking About the Unthinkable. With morbid humor he challenged people to think about deterrence strategies, mineshaft shelters, and the hydrogen bomb. He loved debate, and he reached out to the public with a “twelve-hour lecture, split into three parts over two days, with no text but with plenty of charts and slides.”

Kahn was the main inspiration for Dr. Strangelove. He was said to have the highest I.Q. ever recorded, and he made real contributions in shaping U.S. nuclear strategy through his commentary and analysis. He was accused by his colleagues and by the public of treating the annihilation of millions with far more levity than the subject deserved. Some of the things he said were genuinely shocking, but it’s a shame that brilliant oddballs like him no longer appear much in public, where we could hear their ideas and debate them.

Here are a few interesting facts about him, from two sources. First, Fred Kaplan’s book Wizards of Armageddon, which features him for a chapter:

Brodie and Kaufmann approached the business of first-use and counterforce strikes uneasily, as acts of desperation among a terrible set of choices. Kahn, on the other hand, dived in eagerly.

At one point, Kahn had calculations on bomb designs plugged into all twelve high-speed computers then operating in the United States.

Calculations suggested that even with a purely countermilitary attack, two million people would die, a horrifyingly high number… (But) as Kahn phrased it, only two million people would die. Alluding almost casually to “only” two million dead was part of the image that Kahn was fashioning for himself, the living portrait of the ultimate defense intellectual, cool and fearless… Kahn’s specialty was to express the RAND conventional wisdom in the most provocative and outrageous fashion possible.

Along with an engineer at RAND, Kahn figured out on paper that such a Doomsday Machine was technologically feasible.

In the early-to-mid 1960s, Kahn would work out an elaborate theory of “escalation,” conceiving 44 “rungs of escalation” from “Ostensible Crisis” to “Spasm of Insensate War,” with the rungs in between including “Harassing Acts of Violence,” “Barely Nuclear War,” “Justifiable Counterforce Attacks” and “Slow-Motion Countercity War.”

Kahn felt that having a good civil-defense system made the act of going to the nuclear brink an altogether salutary thing to do on occasion.

More than 5,000 people heard (his lectures) before Kahn finally compiled them into a 652-page tome called On Thermonuclear War. It was a massive, sweeping, disorganized volume, presented as if a giant vacuum cleaner had swept through the corridors of RAND, sucking up every idea, concept, metaphor, and calculation that anyone in the strategic community had conjured up over the previous decade. The book’s title was an allusion to Clausewitz’s On War… Published in 1960 by the Princeton University Press, it sold an astonishing 30,000 copies in hardcover, quickly became known simply as OTW among defense intellectuals, and touched off fierce controversy among nearly everyone who bore through it.

Strangelove, the character and the film, struck insiders as a parody of Herman Kahn, some of the dialogue virtually lifted from the pages of On Thermonuclear War. But the film was also a satire of the whole language and intellectual culture of the strategic intellectual set. Kahn’s main purpose in writing OTW was “to create a vocabulary” so that strategic issues can be “comfortably and easily” discussed, a vocabulary that reduces the emotions surrounding nuclear war to the dispassionate cool of scientific thought. To the extent that many people today talk about nuclear war in such a nonchalant, would-be scientific manner, their language is rooted in the work of Herman Kahn.

And from Louis Menand, writing in the New Yorker:

In his day, Kahn was the subject of many magazine stories, and most of them found it important to mention his girth—he was built, one journalist recorded, “like a prize-winning pear”—and his volubility.

He became involved in the development of the hydrogen bomb, and commuted to the Livermore Laboratory, near Berkeley, where he worked with Edward Teller, John von Neumann, and Hans Bethe. He also entered the circle of Albert Wohlstetter, a mathematician who had produced an influential critique of nuclear preparedness, and who was the most mandarin of the RAND intellectuals. And he became obsessed with the riddles of deterrence.

For many readers, this has seemed pathologically insensitive. But these readers are missing Kahn’s point. His point is that unless Americans really do believe that nuclear war is survivable, and survivable under conditions that, although hardly desirable, are acceptable and manageable, then deterrence has no meaning. You can’t advertise your readiness to initiate a nuclear exchange if you are unwilling to accept the consequences. If the enemy believes that you will not tolerate the deaths of, say, twenty million of your own citizens, then he has called your bluff. It’s the difference between saying, “You get one scratch on that car and I’ll kill you,” and saying, “You get one scratch on that car and you’re grounded for a week.”

Kubrick was steeped in “On Thermonuclear War”; he made his producer read it when they were planning the movie. Kubrick and Kahn met several times to discuss nuclear strategy, and it was from “On Thermonuclear War” that Kubrick got the term “Doomsday Machine.”

Kubrick’s plan to make a comedy about nuclear war didn’t bother Kahn. He thought that humor was a good way to get people thinking about a subject too frightening to contemplate otherwise, and although his colleagues rebuked him for it—“Levity is never legitimate,” Brodie told him—he used jokes in his lectures.

Kahn died, of a massive stroke, in 1983. That was the year a group headed by Carl Sagan released a report warning that the dust and smoke generated by a thermonuclear war would create a “nuclear winter,” blocking light from the sun and wiping out most of life on the planet. Kahn’s friends were confident that he would have had a rebuttal.

Why Is Peter Thiel Pessimistic About Technological Innovation?

We’ve all heard this quote from Peter Thiel: “We wanted flying cars, instead we got 140 characters.” It’s the introduction to his VC firm’s manifesto, entitled “What Happened to the Future?”, and it neatly sums up his argument that we’re economically stagnant and no longer living in a technologically accelerating civilization.

Less well-known is a slightly longer quote from Thiel that also summarizes his views on the technological slowdown. This is from a debate with Marc Andreessen:

“You have as much computing power in your iPhone as was available at the time of the Apollo missions. But what is it being used for? It’s being used to throw angry birds at pigs; it’s being used to send pictures of your cat to people halfway around the world; it’s being used to check in as the virtual mayor of a virtual nowhere while you’re riding a subway from the nineteenth century.”

Why is Thiel pessimistic about the recent pace of technological innovation and economic growth? Here’s a selection of his evidence that we’re no longer technologically accelerating, collected from his writings and public talks.

(Remarks from talks are lightly edited for clarity.)

Energy

Look at the Forbes list of the 92 people who are worth ten billion dollars or more in 2012. Where do they make money? 11 of them made it in technology, and all 11 were in computers. You’ve heard of all of them: It’s Bill Gates, it’s Larry Ellison, Jeff Bezos, Mark Zuckerberg, on and on. There are 25 people who made it in mining natural resources. You probably haven’t heard their names. And these are basically cases of technological failure, because commodities are inelastic goods, and farmers make a fortune when there’s a famine. People will pay way more for food if there’s not enough. 25 people in the last 40 years made their fortunes because of the lack of innovation; 11 people made them because of innovation. (Source: 39:30)

Real oil prices today exceed those of the Carter catastrophe of 1979–80. Nixon’s 1974 call for full energy independence by 1980 has given way to Obama’s 2011 call for one-third oil independence by 2020. (Source)

“Clean tech” has become a euphemism for “energy too expensive to afford,” and in Silicon Valley it has also become an increasingly toxic term for near-certain ways to lose money. (Source)

One of the smartest investors in the world is considered to be Warren Buffett. His single biggest investment is in the railroad industry, which I think is a bet against technological progress, both in transportation and energy. Most of what gets transported on railroads is coal, and Buffett is essentially betting that the 21st century will look more like the 19th century than the 20th. We’ll go back to rail, and back to coal; we’re going to run out of oil, and clean-tech is going to fail. (Source: 10:00.)

There was a famous bet in 1980 between Julian Simon, an economist, and Paul Ehrlich about whether a basket of commodity prices would go down in price over the next decade. Simon famously won the bet, and this was taken as evidence that we have tremendous technological progress and things are steadily getting better. But if you re-run the Simon-Ehrlich bet on a rolling decade basis, Paul Ehrlich has been winning it every year since 1994, as the price of this basket of goods has been getting more expensive on a decade-by-decade basis. (Source: 8:30)

Transportation

Consider the most literal instance of non-acceleration: We are no longer moving faster. The centuries-long acceleration of travel speeds — from ever-faster sailing ships in the 16th through 18th centuries, to the advent of ever-faster railroads in the 19th century, and ever-faster cars and airplanes in the 20th century — reversed with the decommissioning of the Concorde in 2003, to say nothing of the nightmarish delays caused by strikingly low-tech post-9/11 airport-security systems. (Source)

Biotech

Today’s politicians would find it much harder to persuade a more skeptical public to start a comparably serious war on Alzheimer’s disease — even though nearly a third of America’s 85-year-olds suffer from some form of dementia. (Source)

The cruder measure of U.S. life expectancy continues to rise, but with some deceleration, from 67.1 years for men in 1970 to 71.8 years in 1990 to 75.6 years in 2010. (Source)

The FDA approves only one-third as many new drugs as it did 20 years ago. (Source: 7:35)

Space

The reason that all the rocket scientists went to Wall Street was not only because they got paid more on Wall Street, but also because they were not allowed to build rockets and supersonic planes and so on down the line. (Source: 45:50.)

Space has always been the iconic vision of the future. But a lot has gone wrong over the past couple of decades. Costs escalated rapidly. The Space Shuttle program was oddly Pareto inferior. It cost more, did less, and was more dangerous than a Saturn V rocket. Its recent decommissioning felt like the closing of a frontier. (Source)

Agriculture

The fading of the true Green Revolution — which increased grain yields by 126 percent from 1950 to 1980, but has improved them by only 47 percent in the years since, barely keeping pace with global population growth — has encouraged another, more highly publicized “green revolution” of a more political and less certain character. We may embellish the 2011 Arab Spring as the hopeful by-product of the information age, but we should not downplay the primary role of runaway food prices and of the many desperate people who became more hungry than scared. (Source)

Finance

Think about what happens when someone in Silicon Valley builds a successful company and sells it. What do the founders do with that money? Under indefinite optimism, it unfolds like this:

  • Founder doesn’t know what to do with the money. Gives it to large bank.
  • Bank doesn’t know what to do with the money. Gives it to portfolio of institutional investors in order to diversify.
  • Institutional investors don’t know what to do with money. Give it to portfolio of stocks in order to diversify.
  • Companies are told that they are evaluated on whether they generate money. So they try to generate free cash flows. If and when they do, the money goes back to the investors at the top. And so on.

What’s odd about this dynamic is that, at all stages, no one ever knows what to do with the money. (Source)

10-year bonds are yielding about 2%. The expected inflation over the next decade is 2.6%. So if you invest in bonds then in real terms you’re expecting to lose 0.6% a year for a decade. This shouldn’t be surprising, because there’s no one in the system who has any idea what to do with the money. (Source: 27:35)
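
The real-return arithmetic above can be made exact. The quoted −0.6% comes from simple subtraction; the exact Fisher relation gives nearly the same answer (using the 2% yield and 2.6% expected inflation as given):

```python
nominal = 0.020      # 10-year bond yield quoted above
inflation = 0.026    # expected inflation over the decade

approx_real = nominal - inflation                  # simple subtraction
exact_real = (1 + nominal) / (1 + inflation) - 1   # exact Fisher relation

print(f"approx {approx_real:.2%}, exact {exact_real:.2%}")
```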

Science and Engineering

We have 100 times as many scientists as we did in 1920. If there’s less rapid progress now than in 1920 then the productivity per scientist is perhaps less than 1% of what it was in 1920. (Source: 50:20)

The Empire State Building was built in 15 months in 1932. It’s taken 12 years and counting to rebuild the World Trade Center. (Source: 36:00)

The Golden Gate Bridge was built in three-and-a-half years in the 1930s. It’s taken seven years to build an access road that costs more than the original bridge in real dollars. (Source: 36:10)

When people say that we need more engineers in the U.S., you have to start by acknowledging the fact that almost everybody who went into engineering did very badly in the last few decades with the exception of computer engineers. When I went to Stanford in the 1980s, it was a very bad idea for people to enter into mechanical engineering, chemical engineering, bioengineering, to say nothing of nuclear engineering, petroleum engineering, civil engineering, and aero/astro engineering. (Source: 45:20)

Computers

Even if you look at the computer industry, there are some things that aren’t as healthy as you might think. On a number of measurements, you saw a deceleration in the last decade in the industry. If you look at labor employment: It went up 100% in the 1990s, and up 17% in the years since 2000. (If you ignore the recession, it’s gone up about 38% since 2003.) So it’s slower absolute growth, and much lower percentage growth. (Source: 8:40)

If you measure the market capitalizations of companies, Google and Amazon (the two big computer companies created in the late nineties) are worth perhaps two or three times as much as all the companies created since the year 2000 combined. Whether you look at it through labor or capital, there’s been some sort of strange deceleration. (Source: 9:10)

We have a large Computer Rust Belt that nobody likes to talk about: companies like Cisco, Dell, Hewlett-Packard, Oracle, and IBM. I think the pattern is that they will become commodity companies that no longer innovate. Many companies are on the cusp. Microsoft is probably close to the Computer Rust Belt. The company that is, shockingly, probably in the Computer Rust Belt is Apple. Is the iPhone 5, where you move the phone jack from the top of the phone to the bottom, really something that should make us scream Hallelujah? (Source: 9:40)

The Technologically-Accelerating Civilization

I sort-of date the end of rapid technological progress to the late-60s or early-70s. At that point something more or less broke in this country and in the western world more generally which has put us into a zone where there’s much slower technological progress. (Source: 39:30)

If you look at 40-year periods: From 1932 to 1972 we saw average incomes in the United States go up by 350% after inflation, so we were making four-and-a-half times as much. And this was comparable to the progress in the forty years before that and so on going back in time. 1972 to 2012: It’s gone up by 22%. (Source: 14:50)
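
To put the two 40-year periods above on the same footing, the cumulative gains can be converted into annualized growth rates. A small sketch, where the 4.5x and 1.22x multiples come straight from the quoted figures:

```python
def annualized(multiple, years):
    """Compound annual growth rate implied by a total growth multiple."""
    return multiple ** (1 / years) - 1

print(f"1932-1972: {annualized(4.5, 40):.2%} per year")   # ~3.8%
print(f"1972-2012: {annualized(1.22, 40):.2%} per year")  # ~0.5%
```

Roughly 3.8% real income growth per year in the first period against roughly 0.5% in the second, which is the deceleration Thiel is pointing at.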

During the last quarter century, the world has seen more asset booms or bubbles than in all previous times put together: Japan; Asia (ex-Japan and ex-China) pre- 1997; the internet; real estate; China since 1997; Web 2.0; emerging markets more generally; private equity; and hedge funds, to name a few. Moreover, the magnitudes of the highs and lows have become greater than ever before: The Asia and Russia crisis, along with the collapse of Long-Term Capital Management, provoked an unprecedented 20-standard-deviation move in financial derivatives in 1998. (Source)

People are starting to expect less progress. Nixon declared the War on Cancer in 1970 and said that we would defeat cancer in 1976 by the bicentennial. Today, 42 years later we are by definition 42 years closer to the goal, but most people think that we’re further than six years away. (Source: 12:10)

How big is the tech industry? Is it enough to save all Western Civilization? Enough to save the United States? Enough to save the State of California? I think that it’s large enough to bail out the government workers’ unions in the city of San Francisco. (Source: 29:00)

The Conclusion

The first step is to understand where we are. We’ve spent 40 years wandering in the desert, and we think that it’s an enchanted forest. If we’re to find a way out of this desert and into the future, the first step is to see that we’ve been in a desert. (Source)
