NYT: Venture Capitalists Return to Backing Science Start-ups #energy #venture

It’s good to see capital coming back into energy and industrials. This time around, the money is smarter—focusing on smaller pilot projects, leveraging software, and partnering strategics with different return expectations:

After years of shying away from science, engineering and clean-technology start-ups, investors are beginning to take an interest in them again, raising hopes among entrepreneurs in those areas that a long slump is finally over. But these start-ups face intense pressure to prove that their science can turn a profit more quickly than hot tech companies like Snapchat and Uber.

Overall, industrial and energy start-ups attracted $1.24 billion in venture capital financing in the first half of 2014, more than twice as much as in the period a year earlier, according to statistics from the National Venture Capital Association. Still, investment remains well below peaks reached in 2008, when industrial and energy start-ups attracted $4.64 billion.

The full article can be found here.

The Earth as an Egg

This cushion-for-error of humanity’s survival and growth up to now was apparently provided just as a bird inside of the egg is provided with liquid nutriment to develop it to a certain point. But then by design the nutriment is exhausted at just the time when the chick is large enough to be able to locomote on its own legs. And as the chick pecks at the shell seeking more nutriment it inadvertently breaks open the shell. Stepping forth from its initial sanctuary, the young bird must now forage on its own legs and wings to discover the next phase of its regenerative sustenance. —Buckminster Fuller, Operating Manual for Spaceship Earth

Good writers change your perspective through their choice of words. Buckminster Fuller did this by calling the planet “Spaceship Earth,” which immediately alters the imagery — now, thanks to his words, we’re flying through space. His analogy of the earth as an egg does this as well, and it’s beautiful and frightening at the same time.

It’s a very Goldilocks sort of view — everything is just right. We’ve been given all that we need to hatch humanity into perpetuity like the chick that consumes the nutrients until it has the strength to peck its way out of its shell.

And as we wrestle with renewable energy and increasing food demands, the analogy is even more fitting. Fossil fuels are the nutrients that have propelled us thus far. The biosphere — a thin, fragile ribbon around the earth — is our yolk. It encompasses all life as we know it and stores all of the energy we have. The atmosphere is the shell that traps all of the sunlight we need and protects us from harmful rays and errant asteroids. We are encapsulated and safe.

However, the egg analogy also suggests an end, that the clock is ticking — we have to make sure that we don’t consume all the nutrients before we have the strength to break through. But what does this mean, this breaking through? With a renewed interest in space travel, maybe it means leaving the planet, that we’ve been given enough nutrients here to build a machine to take flight and explore.

It’s a beautiful thought to me that the universe is full of millions of little blue eggs, incubated by distant suns, waiting to hatch their inhabitants into space. But the fact remains that someday our sun will be no more, and then what? Do we break through to a new world, a new existence, or do we wither away to nothing within the shell?


Natural Gas Overview — Why is Methane a Clean Fuel?

Introduction to Methane

What we call natural gas is mostly the chemical compound methane (typically 95% or more; the rest is mostly ethane and longer carbon chains). Methane, which comes out of the ground as a gas, is produced when microorganisms known as methanogens feed on organic matter in environments with little or no oxygen. It is abundant, seeping out of your garbage, landfills, and swamps. Also, everywhere you find oil, you find methane, usually in a pocket above the oil deposit. This methane emanated from the same organic material (dead plants and animals) that produced the oil. Methane can be captured where it naturally occurs or produced in a controlled environment like an anaerobic digester. After it is captured or produced, it is cleaned (carbon dioxide, water, and other impurities are removed); compressed to a higher pressure; odorized (it’s odorless in its natural state, so an odorant is added to make leaks detectable); and piped into our homes, power plants, and factories for heat and power.

Clean Combustion

Methane, like all fossil fuels, can be combusted (reacted with oxygen) to form carbon dioxide and water, releasing energy. In fact, a large and growing part of our electricity supply comes from methane. It is the simplest fossil fuel — a single carbon atom with four hydrogen atoms, or CH4. Compare that to diesel fuel, which is a soup of long-chained carbons with sulfur and other molecules attached.

The basic methane combustion reaction is:

CH4 (methane) + 2 O2 (oxygen) = CO2 (carbon dioxide) + 2 H2O (water) + energy

Because of its simplicity and lack of additional compounds, methane is the cleanest of the fossil fuels to combust. When we say cleanest, though, we often mean different things. In terms of the production of carbon dioxide (i.e., the major greenhouse gas), methane has the lowest carbon intensity, meaning we get more energy per unit of carbon dioxide than we do with other fuels. It releases 29% less carbon than oil, 43% less than coal, and 20-30% less lifecycle carbon than oil when used as a transportation fuel. In addition, unlike other fuels, methane combustion releases essentially no SOx (sulfur oxides) or particulate matter into the atmosphere, and far less NOx (nitrogen oxides). These pollutants are all dangerous to our health and regulated under the Clean Air Act.

Fossil fuels and their energy density:

natural gas (51.6 kJ/g) > petroleum (43.6 kJ/g) > coal (39.3 kJ/g) > ethanol (27.3 kJ/g) > wood (16.1 kJ/g)
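The combustion equation and density figures above lend themselves to a quick numerical sanity check. The sketch below uses standard molar masses and a roughly 890 kJ/mol gross heat of combustion (an assumption on my part; it is slightly higher than the 51.6 kJ/g net figure in the table) to verify the mass balance and derive methane’s carbon intensity:

```python
# Mass balance and carbon intensity of methane combustion:
# CH4 + 2 O2 -> CO2 + 2 H2O  (releasing ~890 kJ per mole of CH4)
M_CH4, M_O2, M_CO2, M_H2O = 16.04, 32.00, 44.01, 18.02  # molar masses, g/mol
HEAT_KJ_PER_MOL = 890.0  # gross (higher) heating value, approximate

# Conservation of mass: reactants and products must weigh the same
reactant_mass = M_CH4 + 2 * M_O2
product_mass = M_CO2 + 2 * M_H2O
assert abs(reactant_mass - product_mass) < 0.1

# Gravimetric energy density on a gross basis (~55 kJ/g; the table's
# 51.6 kJ/g is the lower, net figure)
energy_density = HEAT_KJ_PER_MOL / M_CH4

# Carbon intensity: grams of CO2 released per MJ of heat (~49 g CO2/MJ)
co2_per_mj = M_CO2 / (HEAT_KJ_PER_MOL / 1000)

print(f"{energy_density:.1f} kJ/g, {co2_per_mj:.1f} g CO2/MJ")
```

The ~49 g CO2/MJ result is what “lowest carbon intensity” means in practice: coal and oil emit meaningfully more CO2 for the same megajoule of heat.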

It should be noted that methane, by itself, when released into the atmosphere, is a potent greenhouse gas. It traps heat far more effectively than carbon dioxide and, by weight, is roughly 70 times as potent. This is why it’s so important to flare methane to ensure that it is completely combusted into carbon dioxide. It also means that it is critical that the infrastructure to transport methane — drilling sites, pipes, and tanks — minimizes any leakage into the air. Otherwise, the benefit of transitioning from coal to natural gas (in terms of greenhouse gases) would quickly be lost.
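A back-of-the-envelope calculation shows why flaring beats venting. Using the 70x potency figure quoted above (published multipliers vary with the time horizon considered), leaked methane is far worse than the CO2 produced by burning it:

```python
GWP_CH4 = 70.0  # CO2-equivalent warming per kg of methane (figure quoted above)
M_CH4, M_CO2 = 16.04, 44.01  # molar masses, g/mol

# Venting 1 kg of methane: counts as ~70 kg of CO2-equivalent warming
vented_co2e = 1.0 * GWP_CH4

# Flaring 1 kg of methane converts it to CO2 (CH4 -> CO2, ~2.74 kg)
flared_co2e = 1.0 * (M_CO2 / M_CH4)

print(f"Venting is ~{vented_co2e / flared_co2e:.0f}x worse than flaring")
```

The same ratio is why even a few percent of pipeline leakage can erode the greenhouse-gas advantage of switching from coal to gas.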

The Age of Methane

The increased use of methane in the US, predominantly replacing coal, has stabilized if not actually lowered our level of emissions, and the retirement of old coal power plants is eliminating one of the country’s largest polluters.

Energy transitions typically move from a lower-density fuel to a higher-density fuel. We moved from wood to coal to oil, and now methane is creeping up, passing coal to become the second-largest source of energy in the US. While denser by mass, it is also a gas, which presents logistical issues for transport and storage. Nonetheless, I think we will eventually make the transition from oil to methane, at least in the US, before renewables ultimately take over in the second half of this century. And when they do, it will be because technically they are a better fuel — i.e., more energy dense.


Global Dynamics

As discussed, methane, being a gas, presents a transportation challenge. The US has a vast network of pipelines, and Russia pipes natural gas into Europe and China from its vast reserves. However, to move natural gas by ship, as opposed to by pipeline, you need to cool it to a liquid (at -260 degrees Fahrenheit), at which point it becomes liquefied natural gas, or LNG (in its compressed form, it’s called compressed natural gas or, not surprisingly, CNG). Once liquefied, it can then be transported via ship.

Less than 10 years ago, the US built terminals to import LNG from abroad. More recently, however, with the advent of new drilling techniques (i.e., fracking), those same terminals have been re-configured as export terminals, as the US is now one of the world’s leading producers of natural gas, along with Russia and Qatar. However, the difficulty, and thus cost, of moving natural gas has created a huge pricing disparity around the world. For example, natural gas is routinely under $5/MMBTU in the US, while it can be as much as $20/MMBTU in Japan or China. This has put a huge amount of pressure on producers to export to Asia to satisfy growing demand, as well as on the Asian countries to produce more gas themselves through a combination of the gasification of coal and importing drilling technologies from the US. Just 10 years ago, the US was scared of running out of fuel, but now we find ourselves with an abundance; Asia, by importing fracking technology, could very well find itself in a similar situation.

Distributed Systems — Destruction of the Status Quo

One thing we can say for sure is that there will be a lot more start-ups. The monolithic, hierarchical companies of the mid-twentieth century are being replaced by networks of smaller companies. This process is not just something happening now in Silicon Valley. It started decades ago, and it’s happening as far afield as the car industry. It has a long way to run. —Paul Graham, Startup Investing Trends

This is one of my core beliefs, that the world is in the process of becoming much more distributed, that the traditional hierarchies (as articulated by Fred Wilson in his Le Web talk here) we’ve organized around are changing. It’s not just social or organizational hierarchies, though; enabled by technology, our infrastructure systems (energy, manufacturing, farming) are also becoming much more distributed. These newly formed distributed systems are having profound consequences for large legacy institutions and creating opportunities for smaller start-up companies. What follows is a discussion of old versus new in myriad industries to illustrate this shift.

Mainframes vs. Data Centers

We’ve seen computation go from large mainframes in the 1960s (think of this as Big Computation) to personal computing in the 80s, with a desktop in every home and office, and now to data centers. We seem to have settled on the right distributed size — distributed data centers with mostly laptops and mobile phones connected to them. And this pattern can now be seen in more traditional industries as well.

Power Plants vs. Distributed Generation

I think what happened to computers is a good analogy to understand what is happening in energy. In energy, we have Big Energy — giant nuclear, coal, and natural-gas plants that deliver our power over large transmission lines. These are the equivalent of the mainframe computers from the 1960s. Now we are seeing a large increase in demand for distributed generation, one form of which is rooftop solar — this is the personal computer of the 1980s. But it’s incredibly inefficient for each of us to build our own power plant. A similar thing happened to computers — rather than build your own data center, we are now moving to “the cloud.” The cloud is just a form of distributed computation — regional distributed data centers. My guess is that we’re going to end up with an integrated, multi-layered system of power generation that consists of regional distributed generation and small modules at the household level. This might mean that today’s rooftop solar companies will become the personal computer companies of tomorrow (whatever happened to Gateway?).

Factories vs. 3D Printing

Manufacturing is about to be disrupted by 3D printing and automation driven by increases in artificial intelligence (AI). To date, labor gains have offset transportation costs in manufacturing. This is why the world’s manufacturing routinely moves to the lowest-cost labor markets — to China and now to Southeast Asia — despite the fact that these locations are thousands of miles from where the product will ultimately be distributed. With 3D printing, the costs of both manufacturing and the necessary investments will collapse. In addition, much of the assembly is already being automated by robotics and AI. So Big Manufacturing (factories and assembly lines) is already being disrupted and is about to be displaced by distributed manufacturing (3D printers and robotics). We talk of local food, but we’re on the verge of local everything — why produce our goods halfway around the world? Local will be cheap. If your economy is driven by cheap labor, you’re in trouble. (This phenomenon could also collapse trade, leading to more insular economies, but this is a discussion for another day.)

Big Farming vs. Farmers’ Markets

Food is slowly following energy into a more distributed system as well. We’ve seen it with the abundance of farmers’ markets, the return of gardens, and the movement to grow food in cities. To this point, it’s been more small-scale hobby-sized projects. But like manufacturing and energy, there will come a time where the cost to produce is less than the labor and transportation costs. You’ll see distributed farming systems, not unlike data centers, that deliver local, fresh produce year round. This will become even more pronounced as protein-based alternatives chip away at traditional meat demand.

Universities vs. Online Education

Big Education is on the verge of a massive disruption. The value of a degree seems to be collapsing as all the information anyone might need is available free and online provided by sites like Khan Academy, among others. Universities are struggling with new models — put classes online for free, extend online degree programs, or ignore it all and hope it goes away. If we follow the other trends and apply them to education, it probably means fewer big universities and more small schools. Or perhaps we move even more closely to virtual education: one can easily imagine students in groups of 25-50 in towns around the world gathering together to take online classes from the very best professors. In fact, this is exactly what Acumen, a patient-capital fund, is doing. My prediction is that the best universities, by changing their business models, have a chance to take market share from the middle tier — the best and the boutiques will be okay — but the middle is in trouble. We’ll also see a surge in start-up colleges and universities as the barriers to entry are gone.

The Paper vs. Twitter

I grew up in a house where my parents got the paper in the morning. They physically walked to the end of the driveway, which seems so odd, so archaic, now. My paper today is Twitter and RSS feeds. The decline of the newspaper industry has been well documented — print journalism is being replaced with a global distributed network of millions of bloggers and tweeters around the world. If I’m interested in a news event, I simply find and “turn on” the hashtag on Twitter.

Federal- and State-Level Governments vs. Cities

There has been a lot written about cities and their rise on the global stage. In a sense, government has become more distributed. Here in the US, it seems that cities are more important than ever and are increasingly the only governable unit — state and federal government seems to matter less and to be less effective. As Richard Florida has observed, we will have a “spiky” world with concentrations of people, wealth, and ideas in the best cities. The effective unit size to govern — just like a data center — has become a city.

We’re moving from a large, hierarchical, centralized world to one that is small, networked, and distributed. It’s as if each industry is homing in on its own Dunbar number — the perfectly sized data center, solar installation, or global classroom. The amazing thing to me is the breadth of this phenomenon, from education to utilities to manufacturing. Over the next 10-20 years, many of the institutions that have been in place for the last 100 years will be fundamentally altered. These systems will be smarter, stronger, and faster than the previous ones.

Energy in 2020 — Transitions and Themes in 4 Graphs

A couple of weeks ago, I was a guest at Northwestern for an energy and entrepreneurship course. The main question that we discussed was where do we see energy markets in 2020? It’s dangerous to speculate in energy, but here are my thoughts in four graphs:

1. Energy Transitions

Energy is a big, physical problem — it’s about math and physics — you’re not going to “WhatsApp” the energy market. Thus, energy transitions happen over long periods of time — it’s hard to turn the ship. But there are long-term trends and transitions, and they typically are towards cheaper and denser fuels. The large-scale trends seemingly evident now are that solar and natural gas are on their way up, and coal, nuclear, and petroleum are on their way down. I think these trends will continue, at least in the US: we’re not going to build any more coal or nuclear power plants, and methane and renewables are growing fast.

2. Oil

To predict the future of energy, of course, it’s important to understand the current and past dominant fuel source — in our case, that’s petroleum. If you had to guess what the price of oil would be in 2020, what would you say? Above $80? Above $120? Or even higher? The majority of the class thought it would be over $120. The day I was there, oil for delivery in December 2020 settled at $77.72. This tells you what the market thinks. As Ahmed Zaki Yamani, the Saudi oil minister, said, “The Stone Age came to an end not for a lack of stones, and the oil age will end, but not for a lack of oil.” So it is a demand issue — fewer miles driven, fewer driver’s licenses, and greater fuel efficiency will mean less demand for oil. Oil may not be $77 in 2020, but it’s going to be lower than people think—let’s say under $100.

3. Methane

It’s important to remember that $5 methane (natural gas) is the equivalent of a $29 barrel of oil. Methane is cheap, seemingly abundant, and the least CO2-intensive of all fossil fuels. Methane combusts cleanly while diesel produces SOx, NOx, and particulate matter. Just looking at the price disparity and the change in reserves, I think there is a real chance that you see methane pass petroleum in that first graph by 2020.
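The $5-to-$29 equivalence follows from the energy content of a barrel of crude, roughly 5.8 MMBtu (a standard approximation). A minimal sketch:

```python
MMBTU_PER_BARREL = 5.8  # approximate energy content of one barrel of crude oil

def oil_equivalent_price(gas_price_per_mmbtu):
    """Price per barrel of oil carrying the same energy as the quoted gas."""
    return gas_price_per_mmbtu * MMBTU_PER_BARREL

print(f"${oil_equivalent_price(5.0):.0f}/barrel-equivalent")   # US gas price
print(f"${oil_equivalent_price(20.0):.0f}/barrel-equivalent")  # Asian LNG price
```

At US prices, gas delivers oil-grade energy for about $29 a barrel; even at Asian LNG prices it is competitive with $120 oil.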

4. Solar


Solar is growing exponentially, and when things grow exponentially, we tend to under-predict the consequences; I think this is true of solar. Gains in semiconductor technology continue to accelerate and have led to a Moore’s Law of solar as it relates to efficiency. It was originally assumed that band gaps would create a physical limit to efficiency. However, physicists have figured out how to stack cells with complementary band gaps — multi-junction cells — with theoretical efficiencies north of 40%. Photosynthesis, by comparison, has a theoretical efficiency of just 11%. Think about that — we’re 4x better than photosynthesis, a process that took millions of years to hone. Solar cells with 50% theoretical efficiency by 2020 seem realistic.

Energy transitions are large-scale and slow-moving, to be sure, but they are real. So again, although it is difficult to try to predict energy markets, I think the four trends above are accurate and here to stay, at least for the foreseeable future.


The Economics of EVs — 12 months with a Tesla Model S

I’m a Tesla evangelist. The car is phenomenal and one of the smartest consumer products I’ve ever owned. It’s sharp, drives well, the software integration is amazing, and it received the highest rating ever from Consumer Reports. I’ve had the car for a little over a year and wanted to take a look at the economics of fueling.

I’ve driven 4,499 miles over the last year. During that period, the car used 1,932 kWh of electricity or 429 Wh/mile. I drive mostly in the city, continuously starting and stopping, so my efficiency is likely lower than the average. I pay $0.05 per kWh of electricity in Chicago (one of the cheapest rates in the country). There is a loss of about 10-20% when you charge (according to Wikipedia, which references studies on lithium-ion lifetime-charge efficiencies). We’ll compare these stats to the Honda Civic, which gets 28 mpg in the city. Gasoline in the Midwest averaged about $3.35 over the past year (according to the EIA).

1,932 kWh/0.85 = 2,273 kWh purchased from the utility (adjusted for charge efficiency)

2,273 kWh x $0.05 = $113.65 in electricity for 4,500 miles

$113.65/4,500 miles = $0.025 per mile in fuel costs

$0.025 per mile * 28 mpg = $0.70 per gasoline gallon equivalent

4,500 miles/28 mpg = 161 gallons of gas x $3.35 = $538.40 in gasoline costs

$538.40 in gasoline vs. $113.65 in electricity or a $424.75 difference (nearly an 80% reduction)
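The arithmetic above is easy to script. A sketch with the same inputs (taking 85% charge efficiency as the midpoint of the quoted 10-20% loss):

```python
miles = 4500
kwh_used = 1932           # energy delivered to the battery over the year
charge_efficiency = 0.85  # 10-20% is lost while charging; using the midpoint
price_per_kwh = 0.05      # Chicago electricity rate, $/kWh
civic_mpg = 28            # Honda Civic city mileage
gas_price = 3.35          # Midwest average, $/gallon

kwh_purchased = kwh_used / charge_efficiency   # ~2,273 kWh from the utility
electric_cost = kwh_purchased * price_per_kwh  # ~$113.65
cost_per_mile = electric_cost / miles          # ~$0.025 per mile
gas_cost = miles / civic_mpg * gas_price       # ~$538 for the same miles
savings = gas_cost - electric_cost
reduction = savings / gas_cost                 # ~79% cheaper to fuel

print(f"${electric_cost:.2f} electric vs ${gas_cost:.2f} gas "
      f"({reduction:.0%} reduction)")
```

Running it reproduces the numbers above to within rounding.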

On a percentage basis, it’s clearly a large reduction. But I didn’t buy the car for the fuel savings, and I don’t drive enough miles for the economics to work. I’d have to own the car for a long time to get any real payback on the purchase price. However, this does give us some insight into mass-market adoption from an economic standpoint.

According to the EPA, the average number of miles driven per year in the US was 11,493 in 2010.

11,493 miles/28 mpg (Honda Civic) = 410 gallons of gasoline x $3.35 = $1,375 in fuel costs

$1,375 in fuel x 80% reduction = $1,100 in fuel savings per year

The average commuter then would save $1,100 a year in an EV versus a Civic.

Let’s assume that this consumer is going to spend $20,000 on a Honda Civic (which starts at $18,000). The question is how much more would you be willing to spend to save the $1,100 a year?

If you spent $25,000, an increase of $5,000, you’d get a 22.0% return ($1,100/$5,000).

If you spent $30,000, an increase of $10,000, you’d get a 11.0% return ($1,100/$10,000).

If you spent $35,000, an increase of $15,000, you’d get a 7.3% return ($1,100/$15,000).
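The same sketch extends to the average commuter and the price-premium returns listed above:

```python
avg_miles = 11493     # average annual miles driven in the US (2010)
civic_mpg = 28
gas_price = 3.35      # $/gallon, Midwest average
ev_reduction = 0.80   # fuel-cost reduction from the Tesla analysis above

annual_fuel = avg_miles / civic_mpg * gas_price  # ~$1,375 per year on gasoline
annual_savings = annual_fuel * ev_reduction      # ~$1,100 saved per year in an EV

# Simple return on the extra purchase price over a $20,000 Civic baseline
for premium in (5000, 10000, 15000):
    print(f"${premium:,} premium -> {annual_savings / premium:.1%} return")
```

The loop prints the 22.0%, 11.0%, and 7.3% figures above; raising `avg_miles` to 15,000-20,000 shows how quickly the returns improve for heavier drivers.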

This all sounds about right and squares with the pricing of the Prius. People are willing to pay more, maybe $5,000 to $12,000 more, for the Prius because it’s more efficient, cleaner, and has a certain status. A mass-market EV would have to be near that price point. And even at a $35,000 price point, it still would offer a 7% return in fuel costs to the average consumer versus a leading car like the Honda Civic. Certainly it gets better if you travel more miles. If you are going 15,000 or 20,000 miles per year, then the fuel savings quickly add up. It seems very rational to me then that if EV costs can get close to $30,000 to $35,000, you should see widespread adoption on economics alone.

The car would have to have adequate range, but the required range is reasonable. Let’s assume that the 11,493 miles (the number of miles the average person drives per year in the US) are all driven on weekdays. 11,493/260 weekdays = 44 miles a day. Let’s double that to 88 miles just to be safe. Rounding up, then, you’d want about 100 miles of range for a commuter car.

The whole analysis is sensitive to a number of inputs. For example, where you live will determine your fuel and electricity costs, but these tend to be linked. California has higher electricity costs but also higher gasoline costs.

The biggest risk to EV adoption may be improved fuel-efficiency standards. These new standards are going to have a big effect on fuel efficiency in vehicles in the coming years and reduce the demand for gas. That, along with cheap and seemingly abundant oil in the US, means gasoline prices are likely to go down. All that being said, it feels like we could get there this time.

Increasing Density — Corn, Cities, Fuels and Circuits

Whether it’s the number of transistors on a microchip or the number of bushels of corn per acre, there is an undeniable trend toward increasing density. This creates efficiency and thus leads to an increase in productivity. In fact, one of the key components of successful technology is its ability to be miniaturized. The rate of change, governed by different parameters, is different for each industry, but the trend is clearly up and to the right everywhere you look.

Agriculture: Bushels of Corn per Acre (USDA)


Farming productivity has steadily increased. For example, from 1950 to 2000, the average yields of America’s three most important crops (corn, soybeans, and wheat) rose 3.5x, 1.7x, and 2.5x, respectively (Smil/USDA, 2000). It was the continuous introduction of new technologies that enabled these gains, allowing us to meet the caloric needs of a rising population. The technology came first (before 1950, of course) in the form of draft animals and the use of manure for fertilization; then came synthetic fertilizer, pesticides, and combustion engines to drive harvesters and planters. From here, the transition to automated labor (think the Google car plus a combine) and more controlled environments like greenhouses and eventually vertical farms will inevitably lead to further gains.

Urbanization: From the Fields to the Cities (the Economist)


The number of Americans working on farms has steadily decreased, and by 2000, less than 5% of the US population were farmers. Gains in agricultural efficiency led to a mass migration of people from rural areas to cities, resulting in a large increase in the density of people per acre. This transition further led to gains in productivity as people lived closer together, shared resources, and collaborated more. These advances in productivity mean that places like New York City can have among the lowest energy emissions per capita. Cities are now cultural hotspots (see the rise of the Creative Class), not too different from biodiversity hotspots, and this urbanization will continue, mostly in Asia, as the rural become the urban around the world. Urbanization is in effect an increase in the density of people per unit of area, which leads to lower energy usage per capita and a host of other efficiencies.

Energy: US Major Fuel Transitions (EIA)


The US has undergone a number of energy transitions, from wood to coal to oil, throughout our history, and each of these was one of increasing density. Wood (16.2 MJ/kg) was replaced by coal (24 MJ/kg), a 1.5x increase in energy density. Coal was then replaced by oil, which was refined into gasoline (46 MJ/kg), leading to a 1.91x increase in density. Recently, methane (55.6 MJ/kg) or natural gas has passed coal. Methane is technically denser by mass than both coal and oil, but storing large amounts of gas in a confined space has its challenges (i.e., it requires extremely high pressures or cold temperatures).
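The step-ups in density are easy to tabulate from the figures above:

```python
# Gravimetric energy densities in MJ/kg, as cited above
fuels = [("wood", 16.2), ("coal", 24.0), ("gasoline", 46.0), ("methane", 55.6)]

# Each transition's density multiplier over the fuel it replaced
for (prev_name, prev_d), (name, d) in zip(fuels, fuels[1:]):
    print(f"{prev_name} -> {name}: {d / prev_d:.2f}x denser")
```

The output shows the ~1.5x and ~1.9x jumps for coal and gasoline, and a smaller ~1.2x step for methane by mass, which is why the storage caveat matters.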

Looking at trends this way can become a good filter. For example, ethanol at 25.65 MJ/liter compared to gasoline at 34.2 MJ/liter doesn’t look like such a great improvement. Hydrogen at 123 MJ/kg and uranium at 83,140,000 MJ/kg would be logical next steps, though. However, we are a long way from hydrogen-powered cars, and the development of nuclear power has been all but halted following the recent accident in Japan. Still, it’s interesting to note that each major transition over the last 200 years has been one to higher energy density.

Technology: Moore’s Law


Lastly, we come to the one everyone knows — Moore’s Law, which states that every two years, the number of transistors on an integrated circuit will double. This increase in density is what has given us the Internet, mobile phones, and even solar panels (as costs have dropped due to similar production techniques). What’s interesting about this trend is its magnitude — in the last 40 years, computers have become 500,000x more dense. There appears to be no end in sight: just when a physical limit seems to be reached, a new technology emerges. Ultimately, we may find ourselves with quantum or DNA computers, both of which could lead to further increases in density.
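A two-year doubling cadence compounds to roughly a million-fold over four decades, the same order of magnitude as the 500,000x figure above (which works out to a doubling roughly every 25 months). A quick sketch:

```python
def moore_factor(years, doubling_period=2.0):
    """Density multiplier after `years` of steady doubling."""
    return 2 ** (years / doubling_period)

# 40 years at one doubling every two years: 2^20
print(f"{moore_factor(40):,.0f}x")
```

Compounding is the whole story here: no physical trend like crop yield or fuel density can keep pace with an exponential for long.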

Observations and Questions

1. Trends: What’s amazing to me looking at these charts is how smooth they are. Those lines represent the culmination of technology over decades, and yet they are clear, consistently escalating trends. These are trends that you can depend on, that are investible, and that you should be aware of. If you’re starting a business, you need to think about where you’re going to be when you go to market, not just today.

2. Transitions: There are times in each of these trends when there is a major technological shift or leap. And in fact, I think we’re in the midst of one right now with farming as we move towards more controlled indoor environments. These are step changes where there is opportunity and where wealth gets created, but investing alongside incremental changes is a tough business — the solar industry has seen one company after another go out of business as they pursue small incremental changes in panel efficiency.

3. Normal vs. Log: While the lines may look similar, the technology chart is logarithmic. Each step up that chart is a 10x increase, as opposed to a linear step on the corn chart. This is an enormous difference: in agriculture, a gain of 2-3x over 50 years is huge, yes, but in technology, the gain may be 500,000x over the same period. The physical world behaves differently — has different constraints — than the world of software.

4. Next: You would think there have to be limits to these trends, and we may in fact eventually witness some such barriers, but the trends in yield, the trends in urbanization, the transition to methane, and the trends in technology (chips, solar, sequencing) all seem intact for the foreseeable future. These are all good things — we’ll produce more food with fewer resources, we’ll live on less land, we’ll use more efficient fuels, and we’ll have even more powerful computers in our pockets.