Nitrogen Fixing Fern that Grows in Water

I’ve been looking at hydro/aquaponics lately and ran across Azolla. The only real economic reason for the fish is nitrogen. Azolla would be a possible substitute for the fish—need to figure out conversion ratios and space. I would think growing plants has to be more effective than feeding fish, but it might take up more space.

Azolla’s significance comes from its partnership with several species of bacteria that can manage a trick no plant finds possible by itself: extracting nitrogen from the air and “fixing” it into chemicals such as ammonia, so that it is available to make proteins. Asian rice farmers have known of Azolla’s fertilising properties for at least 1,500 years, and in many places the fern is encouraged to grow alongside rice in paddies—a sort of aquatic version of alfalfa. Dr Pryer’s primary pitch, therefore, is that understanding the genomes of Azolla and its associated bacteria (which she proposes to sequence at the same time) might assist the improvement of this process, and maybe aid its transfer to other plants.

Would work well for cleaning up water on dairy farms as well—solves the surplus phosphorus issue.

“…grow at great speed – doubling its biomass every two to three days. The only known limiting factor on its growth is phosphorus, another essential mineral. An abundance of phosphorus, due for example to eutrophication or chemical runoff, often leads to Azolla blooms.”

Full Article
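For a sense of what a two-to-three-day doubling time implies, here’s a quick Python sketch. The function name and starting mass are my own, purely illustrative:

```python
# Exponential growth of an Azolla mat, using the doubling time quoted
# above (two to three days). Starting mass is illustrative, not from
# the article.

def biomass_after(days, start_kg=1.0, doubling_days=2.5):
    """Biomass in kg after `days`, given a fixed doubling time."""
    return start_kg * 2 ** (days / doubling_days)

# One kilogram becomes about four metric tons in a month:
print(biomass_after(30))  # -> 4096.0 kg
```

Unchecked exponential growth like this is exactly why phosphorus-rich runoff produces Azolla blooms.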

The Intersection of Big Ag and Big Data

Agriculture will be an interesting space to watch over the next couple of years — GPS-driven automated combines, fertilization by drones, custom seeds based on microclimate parameters, and real-time data from remote soil sensors. The real disruption will be figuring out how to move away from corn and beans.

From The Economist:

INNOVATION is a word that brings to mind small, nimble startups doing clever things with cutting-edge technology. But it is also vital in large, long-established industries—and they do not come much larger or older than agriculture. Farmers can be among the most hidebound of managers, so it is no surprise that they are nervous about a new idea called prescriptive planting, which is set to disrupt their business. In essence, it is a system that tells them with great precision which seeds to plant and how to cultivate them in each patch of land. It could be the biggest change to agriculture in rich countries since genetically modified crops. And it is proving nearly as controversial, since it raises profound questions about who owns the information on which the service is based. It also plunges stick-in-the-mud farmers into an unfamiliar world of “big data” and privacy battles.

Distributed Systems — Destruction of the Status Quo

One thing we can say for sure is that there will be a lot more start-ups. The monolithic, hierarchical companies of the mid-twentieth century are being replaced by networks of smaller companies. This process is not just something happening now in Silicon Valley. It started decades ago, and it’s happening as far afield as the car industry. It has a long way to run. —Paul Graham, Startup Investing Trends

This is one of my core beliefs, that the world is in the process of becoming much more distributed, that the traditional hierarchies (as articulated by Fred Wilson in his Le Web talk here) we’ve organized around are changing. It’s not just social or organizational hierarchies, though; enabled by technology, our infrastructure systems (energy, manufacturing, farming) are also becoming much more distributed. These newly formed distributed systems are having profound consequences for large legacy institutions and creating opportunities for smaller start-up companies. What follows is a discussion of old versus new in myriad industries to illustrate this shift.

Mainframes vs. Data Centers

We’ve seen computation go from large mainframes in the 1960s (think of this as Big Computation) to personal computing in the 80s, with a desktop in every home and office, and now to data centers. We seem to have settled in on the right distributed size — distributed data centers with mostly laptops and mobile phones connected to them. And this pattern can be seen now in more traditional industries as well.

Power Plants vs. Distributed Generation

I think what happened to computers is a good analogy to understand what is happening in energy. In energy, we have Big Energy — giant nuclear, coal, and natural-gas plants that deliver our power over large transmission lines. These are the equivalent of the mainframe computers from the 1960s. Now we are seeing a large increase in demand for distributed generation, one form of which is rooftop solar — this is the personal computer of the 1980s. But it’s incredibly inefficient for each of us to build our own power plant. A similar thing happened to computers — rather than build your own data center, we are now moving to “the cloud.” The cloud is just a form of distributed computation — regional distributed data centers. My guess is that we’re going to end up with an integrated, multi-layered system of power generation that consists of regional distributed generation and small modules at the household level. This might mean that today’s rooftop solar companies will become the personal computer companies of tomorrow (whatever happened to Gateway?).

Factories vs. 3D Printing

Manufacturing is about to be disrupted by 3D printing and automation driven by increases in artificial intelligence (AI). To date, labor gains have offset transportation costs in manufacturing. This is why the world’s manufacturing routinely moves to the lowest-cost labor markets — to China and now to Southeast Asia — despite the fact that these locations are thousands of miles from where the product will ultimately be distributed. With 3D printing, the costs of both manufacturing and the necessary investments will collapse. In addition, much of the assembly is already being automated by robotics and AI. So Big Manufacturing (factories and assembly lines) is already being disrupted and is about to be displaced by distributed manufacturing (3D printers and robotics). We talk of local food, but we’re on the verge of local everything — why produce our goods halfway around the world? Local will be cheap. If your economy is driven by cheap labor, you’re in trouble. (This phenomenon could also collapse trade, leading to more insular economies, but this is a discussion for another day.)

Big Farming vs. Farmers’ Markets

Food is slowly following energy into a more distributed system as well. We’ve seen it with the abundance of farmers’ markets, the return of gardens, and the movement to grow food in cities. To this point, it’s been mostly small-scale, hobby-sized projects. But as with manufacturing and energy, there will come a time when the cost to produce is less than the labor and transportation costs. You’ll see distributed farming systems, not unlike data centers, that deliver local, fresh produce year round. This will become even more pronounced as protein-based alternatives chip away at traditional meat demand.

Universities vs. Online Education

Big Education is on the verge of a massive disruption. The value of a degree seems to be collapsing as all the information anyone might need is available free online from sites like Khan Academy, among others. Universities are struggling with new models — put classes online for free, extend online degree programs, or ignore it all and hope it goes away. If we follow the other trends and apply them to education, it probably means fewer big universities and more small schools. Or perhaps we move even further toward virtual education: one can easily imagine students in groups of 25-50 in towns around the world gathering together to take online classes from the very best professors. In fact, this is exactly what Acumen, a patient-capital fund, is doing. My prediction is that the best universities, by changing their business models, have a chance to take market share from the middle tier — the best and the boutiques will be okay, but the middle is in trouble. We’ll also see a surge in start-up colleges and universities as the barriers to entry disappear.

The Paper vs. Twitter

I grew up in a house where my parents got the paper in the morning. They physically walked to the end of the driveway, which seems so odd, so archaic, now. My paper today is Twitter and RSS feeds. The decline of the newspaper industry has been well documented — print journalism is being replaced with a global distributed network of millions of bloggers and tweeters around the world. If I’m interested in a news event, I simply find and “turn on” the hashtag on Twitter.

Federal- and State-Level Governments vs. Cities

There has been a lot written about cities and their rise on the global stage. In a sense, government has become more distributed. Here in the US, it seems that cities are more important than ever and are increasingly the only governable unit — state and federal government seems to matter less and to be less effective. As Richard Florida has observed, we will have a “spiky” world with concentrations of people, wealth, and ideas in the best cities. The effective unit size to govern — just like a data center — has become a city.

We’re moving from a large, hierarchical, centralized world to one that is small, networked, and distributed. It’s as if each industry is homing in on its own Dunbar number — the perfectly sized data center, solar installation, or global classroom. The amazing thing to me is the breadth of this phenomenon, from education to utilities to manufacturing. Over the next 10-20 years, many of the institutions that have been in place for the last 100 years will be fundamentally altered. These systems will be smarter, stronger, and faster than the previous ones.

Increasing Density — Corn, Cities, Fuels and Circuits

Whether it’s the number of transistors on a microchip or the number of bushels of corn per acre, there is an undeniable trend toward increasing density. This creates efficiency and thus leads to an increase in productivity. In fact, one of the key components of successful technology is its ability to be miniaturized. The rate of change, governed by different parameters, is different for each industry, but the trend is clearly up and to the right everywhere you look.

Agriculture: Bushels of Corn per Acre (USDA)


Farming productivity has steadily increased. For example, from 1950 to 2000, the average yields of America’s three most important crops (corn, soybeans, and wheat) rose 3.5x, 1.7x, and 2.5x, respectively. (SMIL/USDA 2000) It was the continuous introduction of new technologies that enabled these gains, allowing us to meet the caloric needs of a rising population. The technology came first (before 1950, of course) in the form of draft animals and the use of manure for fertilization; then came synthetic fertilizer, pesticides, and combustion engines to drive harvesters and planters. From here, the transition to automated labor (think the Google car plus a combine) and more controlled environments like greenhouses and eventually vertical farms will inevitably lead to further gains.
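Those multiples translate into surprisingly modest annual rates. A back-of-envelope sketch (the helper function is my own, not from the source):

```python
# Compound annual growth rate implied by the 1950-2000 yield multiples
# quoted above (3.5x corn, 1.7x soybeans, 2.5x wheat over 50 years).

def cagr(multiple, years=50):
    """Annual growth rate that compounds to `multiple` over `years`."""
    return multiple ** (1 / years) - 1

for crop, mult in [("corn", 3.5), ("soybeans", 1.7), ("wheat", 2.5)]:
    print(f"{crop}: {cagr(mult):.1%} per year")
```

Corn’s 3.5x gain works out to only about 2.5% a year — small in any single season, but compounded steadily for half a century.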

Urbanization: From the Fields to the Cities (the Economist)


The number of Americans working on farms has steadily decreased, and by 2000, less than 5% of the US population were farmers. Gains in agricultural efficiency led to a mass migration of people from rural areas to cities, resulting in a large increase in the density of people per acre. This transition further led to gains in productivity as people lived closer, shared resources, and collaborated more. These advances in productivity mean that places like New York City can have among the lowest per-capita energy use and emissions in the country. Cities are now cultural hotspots (see the rise of the Creative Class), not too different from biodiversity hotspots, and this urbanization will continue, mostly in Asia, as the rural become the urban around the world. Urbanization is in effect an increase in the density of people per unit of area, which leads to lower energy usage per capita and a host of other efficiencies.

Energy: US Major Fuel Transitions (EIA)


The US has undergone a number of energy transitions, from wood to coal to oil, throughout our history, and each of these was one of increasing density. Wood (16.2 MJ/kg) was replaced by coal (24 MJ/kg), a roughly 1.5x increase in energy density. Coal was then replaced by oil, which was refined into gasoline (46 MJ/kg), a roughly 1.9x increase in density. Recently, methane (55.6 MJ/kg), i.e. natural gas, has passed coal. Methane is technically denser by mass than both coal and oil, but storing large amounts of gas in a confined space has its challenges (i.e., it requires extremely high pressures or cold temperatures).

Looking at trends this way can become a good filter. For example, ethanol at 25.65 MJ/liter compared to gasoline at 34.2 MJ/liter doesn’t look like such a great improvement. Hydrogen at 123 MJ/kg and uranium at 83,140,000 MJ/kg would be logical next steps, though. However, we are a long way from hydrogen-powered cars, and the development of nuclear power has been all but halted following the recent accident in Japan. Still, it’s interesting to note that each major transition over the last 200 years has been one to higher energy density.
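The transition ratios are easy to check from the gravimetric densities quoted above (the dictionary and helper are mine, for illustration):

```python
# Energy-transition ratios from the gravimetric energy densities (MJ/kg)
# quoted in the text. Hydrogen shows how far the "next step" would jump.

DENSITY_MJ_PER_KG = {
    "wood": 16.2, "coal": 24.0, "gasoline": 46.0,
    "methane": 55.6, "hydrogen": 123.0, "uranium": 83_140_000.0,
}

def ratio(new, old):
    """How many times denser the new fuel is than the old one."""
    return DENSITY_MJ_PER_KG[new] / DENSITY_MJ_PER_KG[old]

print(f"wood -> coal:         {ratio('coal', 'wood'):.2f}x")      # 1.48x
print(f"coal -> gasoline:     {ratio('gasoline', 'coal'):.2f}x")  # 1.92x
print(f"gasoline -> hydrogen: {ratio('hydrogen', 'gasoline'):.1f}x")
```

Note that these are per-kilogram figures; as the text points out, per-liter (volumetric) density is what makes gaseous fuels like methane and hydrogen hard to store.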

Technology: Moore’s Law


Lastly, we come to the one everyone knows — Moore’s Law, which states that every two years, the number of transistors on an integrated circuit will double. This increase in density is what has given us the Internet, mobile phones, and even solar panels (as costs have dropped due to similar production techniques). What’s interesting about this trend is its magnitude — in the last 40 years, computers have become 500,000x more dense. There appears to be no end in sight: just when a physical limit seems to be reached, a new technology emerges. Ultimately, we may find ourselves with quantum or DNA computers, both of which could lead to further increases in density.
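The 500,000x figure is consistent with the stated doubling period — a quick sanity check:

```python
# A doubling every two years for 40 years gives 2^(40/2) = 2^20,
# about a million-fold -- the same order of magnitude as the
# "500,000x more dense" figure in the text.

def moores_law_multiple(years, doubling_years=2):
    """Growth multiple after `years` of fixed-period doubling."""
    return 2 ** (years / doubling_years)

print(f"{moores_law_multiple(40):,.0f}x")  # 1,048,576x
```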

Observations and Questions

1. Trends: What’s amazing to me looking at these charts is how smooth they are. Those lines represent the culmination of technology over decades, and yet they are clear, consistently escalating trends. These are trends that you can depend on, that are investible, and that you should be aware of. If you’re starting a business, you need to think about where you’re going to be when you go to market, not just today.

2. Transitions: There are times in each of these trends when there is a major technological shift or leap. And in fact, I think we’re in the midst of one right now with farming as we move towards more controlled indoor environments. These are step changes where there is opportunity and where wealth gets created, but investing alongside incremental changes is a tough business — the solar industry has seen one company after another go out of business as they pursue small incremental changes in panel efficiency.

3. Normal vs. Log: While the lines may look similar, the technology chart is logarithmic. Each step up the axis is a 10x increase, as opposed to a fixed additive step on the corn chart. This is an enormous difference: in agriculture, a gain of 2-3x over 50 years is huge, yes, but in technology, the gain may be 500,000x over the same period. The physical world behaves differently — has different constraints — than the world of software.

4. Next: You would think there have to be limits to these trends, and we may in fact eventually witness some such barriers, but the trends in yield, the trends in urbanization, the transition to methane, and the trends in technology (chips, solar, sequencing) all seem intact for the foreseeable future. These are all good things — we’ll produce more food with fewer resources, we’ll live on less land, we’ll use more efficient fuels, and we’ll have even more powerful computers in our pockets.
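The normal-vs-log gap from observation 3 is easiest to see in annual terms (a rough sketch with my own helper):

```python
# A 3x gain over 50 years versus a 500,000x gain over 40 years,
# expressed as compound annual growth rates.

def annual_rate(multiple, years):
    """Annual rate that compounds to `multiple` over `years`."""
    return multiple ** (1 / years) - 1

print(f"corn  (3x over 50 yr):       {annual_rate(3, 50):.1%}/yr")
print(f"chips (500,000x over 40 yr): {annual_rate(500_000, 40):.0%}/yr")
```

Roughly 2% a year versus roughly 39% a year — that gap, compounded for decades, is the whole difference between the two charts.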

Information vs. Real Assets—Linear vs Exponential Growth

I’ve been thinking about the difference between investing in information assets (computers, information) and investing in real assets (land, oil). There seems to be a fundamental difference between the two, and the effects are starting to manifest themselves in real ways.

The easiest way to understand the difference is to think about two investments. If I invested $2500 in a computer, I could get a really nice machine for that much today. I would have bought the ability to compute and share information. Fast forward 24 months, and that computer, according to Moore’s Law, would be worth half of what I paid for it (i.e., twice as powerful computers would be available). In essence, I bought a deflationary asset—that same $2500 would now buy me twice as much computing power. Compare this to what would happen if I bought $2500 of land, which is about an acre of pasture land in the US. At the end of 24 months, if history is any indicator, I would have modest appreciation (land has appreciated roughly 4% annually in the US). Thus investing in technology (as a store of value) is deflationary, and investing in real assets is inflationary (as a store of value). This is why Buffett won’t buy technology stocks—it’s a bad store of wealth over the long term.
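The two $2,500 investments can be tallied directly — a sketch under the text’s own assumptions (computing power doubles every 24 months; land appreciates about 4% a year):

```python
# Two $2,500 investments after 24 months, under the assumptions in the
# text: a fixed machine is "worth" half in replacement terms once
# twice-as-powerful machines sell for the same price; land compounds
# at roughly 4% annually.

computer_cost = 2500.0
land_cost = 2500.0

computer_value = computer_cost / 2        # replacement value halves
land_value = land_cost * 1.04 ** 2        # two years at ~4%/yr

print(f"computer: ${computer_value:,.0f}")   # $1,250
print(f"land:     ${land_value:,.2f}")       # $2,704.00
```

One asset deflates as a store of value while the other quietly compounds — the whole argument in four lines of arithmetic.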

What’s interesting is that the VC industry appears to be breaking along these two lines. Broadly speaking, looking at energy and technology, the venture industry is starting to break into two camps, as Paul Kedrosky recently showed. (Land in some ways is a good proxy for energy—it represents the ability to convert sunlight into calories—i.e., energy). What’s happened on the software side is that the cost of starting a software company has deflated so much that it’s virtually free, and thus the need for large capital investments in software has collapsed. In fact, if you need $10 million to start a software company right now, something is wrong—your scope is too big, or your architecture is bad. When the cost of starting a company is the same as a car, you don’t need venture capital, you need a couple of friends (preferably smart, strategic ones). So what we see is the emergence of “super angels,” or micro-VCs, and incubators that add a lot more value than capital. You take the investment from them because of their focus, their network, and their strategic value. Thus the future of the VC industry seems to be two camps—the Y-Combinators (boot camp and network) or IA Ventures (Big Data sector focus) versus the traditional large ($10-100 million investment) hard science investments.

So why this line between software and energy? And why can’t we take what we’ve learned and apply it to energy (information vs. real assets)? As Bill Gates has said, we’ve been fooled by the rapid success in IT:

But, as Gates put it last week, we’ve been fooled by the rapid success of IT, and “there are things that just don’t move forward.” The pace of chips and IT innovation “is rare,” said Gates. Unfortunately, some of those “things that don’t move forward” are fundamental platforms for the energy industry. For example, as Gates pointed out: batteries. “Batteries have not improved hardly at all. There are deep physical limits,” to this technology, he said.

There seem to be two reasons for this: miniaturization (potentially solvable) and physics (not so much). As Kurzweil outlined in The Law of Accelerating Returns, one of the prerequisites for acceleration is the ability to miniaturize the technology. As both Vaclav Smil and Gregor Macdonald have written, all of our energy transitions to date have been to sources of increasing energy density—wood to coal to oil were all movements to denser fuels. None of the current transitions and technologies is a movement to a denser energy source. Maybe through better nuclear, or sparked by open-source biology, we’ll have thousands of hackers attacking these problems, but any way you cut it, rapid miniaturization seems unlikely. From Gregor Macdonald:

And here we find the largest hurdle of all. For, in humanity’s last two transitions, from wood to coal and then coal to oil, the trajectory each time was to a higher power density energy source. Energy transition is disruptive enough, but much less so when you are gaining energy density. And how do you suppose transition will be this time, going in the opposite direction, to lower density sources?

The second reason comes from the first law of thermodynamics—energy cannot be created or destroyed, only transformed. We can produce ever more information, but we can only transform existing energy sources (we do have a nice stream from the sun each day, though). From an interview with Vaclav Smil in the FT:

I have named this delusion Moore’s curse because (unlike the crowding of transistors on a microchip) it is fundamentally (that is thermodynamically) impossible for the machines and processes that now constitute the complex infrastructure of global energy extraction, conversion, transportation and transmission to double their capacity or performance, microchip-like, every 18-24 months. It’s a zero sum game… (can not be created or destroyed unlike information) – In other words, you can’t create energy, you simply move it around (fossil fuels, for example, simply release energy that has been stored and concentrated over millions of years); you can’t avoid wasting some energy when you move it around; and you can’t stop using energy altogether.

So let’s look at two technologies that are often talked about: the smart grid and algae. In the case of the smart grid, we’re talking about moving energy around more efficiently—there will be gains in robustness and availability, but it doesn’t create any energy. What’s more applicable is Metcalfe’s Law (i.e., the value of a network is proportional to the square of the number of connected nodes), so we’ll have a better network and may save energy, but it won’t lead to magnitudes more energy.
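A minimal sketch of the Metcalfe’s Law point — network value scales with the square of the node count, while the energy the grid carries does not grow at all:

```python
# Metcalfe's law: the value of a network scales with the number of
# possible pairwise connections, n*(n-1)/2, i.e. O(n^2). A smarter
# grid gains this network value without creating any new energy.

def metcalfe_value(nodes):
    """Number of possible pairwise connections among `nodes`."""
    return nodes * (nodes - 1) // 2

print(metcalfe_value(10))   # 45 possible connections
print(metcalfe_value(100))  # 4950 -- 10x the nodes, ~100x the links
```

That superlinear scaling is why networked information systems compound in value while the underlying energy flows stay a zero-sum transformation.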

Algae gets a bit more interesting because we can apply information technology to the engineering of the cells now through biotech. So we can leverage information technology to sequence, test, and even write the DNA for new cells that can produce fuel. The issue will be one of scale—when you cross the threshold from a cell to scaling it in any size, you are constrained by all the messy real world laws of thermodynamics. It seems cellulosic technologies, algae, and various other technologies all break down when it comes to scale because of this. The challenge for all these technologies seems to be crossing from an informational asset to a real asset.

Investing in real assets—land and energy projects—then is fundamentally different than investing in software. One seems to inflate while the other deflates, one is constrained by physics while the other seems to be unbounded but full of outliers. This isn’t to say one is a better investment than the other, just that they are fundamentally different, and it appears that the venture industry is breaking along these lines. Technology certainly isn’t a bad investment, but when you make such an investment, you better run and run fast because it deflates. The corollary is don’t expect a Google in energy anytime soon—it’s not going to scale like information technology. To put it another way, technology investments have fat tails, but it’s unlikely that energy will.

This isn’t bad at all—as a consequence of the deflation in information technology (or flattening), we’re seeing a shift in focus. On the software side, networks have basically deflated to the physical floor of the speed of light, and each of us has more computation power than we’ll probably ever need just on our desktops, and thus start-ups are attacking the problems of visualizing and processing this massive data set. If the venture industry turns back to more traditional research-based hard science—biotech and energy—this seems like a good thing. This is where the big challenges and opportunities are, but they are fundamentally different problems.

Maybe a more accurate way, then, to describe the world is that it is informationally flat and physically lumpy.