In 2007, Google unveiled RE<C, an initiative to make renewable energy cost-competitive with coal. At the time, it represented a major breakthrough for the industry.

The tech giant said it was prepared to invest tens of millions of dollars to boost emerging solar, wind and geothermal technologies until they could rival the economics of coal. The initiative was unprecedented for a company of its type and put Google in the same league as GE, which years earlier had launched ecomagination, its own ambitious multibillion-dollar effort to commercialize emerging clean energy technologies.

Then, in 2011, Google cut its R&D effort short. The company appeared more bullish on deploying renewables than on spending heavily on R&D. In the years since, Google has invested more than $1 billion directly in solar and wind projects.

"You’d think the thrill might wear off this whole renewable energy investing thing after a while. Nope -- we’re still as into it as ever," rejoiced the company in a blog post last fall. 

The company has now procured enough renewable energy and efficiency to offset its carbon emissions. Meanwhile, the levelized cost of renewables has come down to rival the cost of building new coal plants. 

So did Google just see the trend line early and pull the plug on unnecessary investments?

Actually, it was the opposite.

Two Google engineers who worked on the RE<C initiative have finally opened up about why the team halted their efforts. And it wasn't because they thought existing renewables were enough to decarbonize the global economy.

"Trying to combat climate change exclusively with today’s renewable energy technologies simply won’t work; we need a fundamentally different approach," wrote Google's Ross Koningstein and David Fork in a piece published yesterday in IEEE's Spectrum.

It's a striking admission from a company that has relentlessly supported the growth of renewable energy.

When Google first set out on its mission, the RE<C team was convinced that existing renewables (or those close to commercialization) could reduce emissions enough to avoid the worst climate change scenarios. But by 2011, when engineers realized that their investments were not playing out as expected, they ditched the program and set out to rethink its goals.

"As we reflected on the project, we came to the conclusion that even if Google and others had led the way toward a wholesale adoption of renewable energy, that switch would not have resulted in significant reductions of carbon dioxide emissions," wrote Koningstein and Fork.

The team came to that conclusion after examining different scenarios for renewable energy penetration using a low-carbon modeling tool from the consulting firm McKinsey. They compared those scenarios to former NASA scientist James Hansen's famous 2008 analysis, which concluded that atmospheric CO2 would need to be brought back down to 350 parts per million to stabilize the climate.

They didn't find promising results:

We decided to combine our energy innovation study’s best-case scenario results with Hansen’s climate model to see whether a 55 percent emission cut by 2050 would bring the world back below that 350-ppm threshold. Our calculations revealed otherwise. Even if every renewable energy technology advanced as quickly as imagined and they were all applied globally, atmospheric CO2 levels wouldn’t just remain above 350 ppm; they would continue to rise exponentially due to continued fossil fuel use. So our best-case scenario, which was based on our most optimistic forecasts for renewable energy, would still result in severe climate change, with all its dire consequences: shifting climatic zones, freshwater shortages, eroding coasts, and ocean acidification, among others. Our reckoning showed that reversing the trend would require...radical technological advances in cheap zero-carbon energy, as well as a method of extracting CO2 from the atmosphere and sequestering the carbon.

Those calculations cast our work at Google’s RE<C program in a sobering new light. Suppose for a moment that it had achieved the most extraordinary success possible, and that we had found cheap renewable energy technologies that could gradually replace all the world’s coal plants -- a situation roughly equivalent to the energy innovation study’s best-case scenario. Even if that dream had come to pass, it still wouldn’t have solved climate change. This realization was frankly shocking: Not only had RE<C failed to reach its goal of creating energy cheaper than coal, but that goal had not been ambitious enough to reverse climate change.
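
The arithmetic behind that conclusion is straightforward to sketch. The toy calculation below is a rough Python illustration, not the McKinsey tool or Hansen's climate model; it uses assumed round numbers for current emissions, the airborne fraction and the gigatonnes-to-ppm conversion. It simply shows the point made in the quote: as long as net emissions stay positive, CO2 concentrations keep climbing, so even a 55 percent cut by 2050 never brings the atmosphere back toward 350 ppm.

    # Toy illustration (not the McKinsey tool or Hansen's model): atmospheric CO2
    # keeps climbing as long as net emissions stay positive, even under deep cuts.
    # Round-number assumptions: ~10 GtC/year of fossil emissions today, ~45 percent
    # of emitted CO2 staying airborne, and ~2.13 GtC per ppm of atmospheric CO2.

    EMISSIONS_2011 = 10.0      # GtC per year, rough figure for fossil fuels plus cement
    AIRBORNE_FRACTION = 0.45   # share of emissions that remains in the atmosphere
    GTC_PER_PPM = 2.13         # gigatonnes of carbon per ppm of atmospheric CO2

    def concentration_path(start_ppm, cut_by_2050, start_year=2011, end_year=2100):
        """Ramp emissions down linearly to the 2050 cut, hold them flat afterward,
        and integrate the resulting atmospheric CO2 concentration year by year."""
        ppm, path = start_ppm, {}
        for year in range(start_year, end_year + 1):
            progress = min(max((year - 2011) / (2050 - 2011), 0.0), 1.0)
            emissions = EMISSIONS_2011 * (1.0 - cut_by_2050 * progress)
            ppm += AIRBORNE_FRACTION * emissions / GTC_PER_PPM
            path[year] = ppm
        return path

    # Best-case-style scenario from the quote: a 55 percent emissions cut by 2050.
    best_case = concentration_path(start_ppm=390.0, cut_by_2050=0.55)
    print(f"2050: {best_case[2050]:.0f} ppm; 2100: {best_case[2100]:.0f} ppm")
    # Concentrations never head back toward 350 ppm, because net emissions stay positive.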

So what does that mean for Google's strategy? 

Koningstein and Fork hint at one possible focus: technologies like power electronics that can efficiently control the grid and enable higher penetrations of distributed generation. In July, Google unveiled a $1 million challenge to build an inverter one-tenth the size of existing devices. 

Unfortunately, most of today’s clean generation sources can’t provide power that is both distributed and dispatchable. Solar panels, for example, can be put on every rooftop, but can’t provide power if the sun isn’t shining. Yet if we invented a distributed, dispatchable power technology, it could transform the energy marketplace and the roles played by utilities and their customers. Smaller players could generate not only electricity but also profit, buying and selling energy locally from one another at real-time prices. Small operators, with far less infrastructure than a utility company and far more derring-do, might experiment more freely and come up with valuable innovations more quickly.
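
To make that scenario concrete, here is a minimal hypothetical sketch of how a small operator with storage might respond to real-time local prices, discharging when the price beats its costs and charging when power is cheap. All names, prices and figures are invented for illustration; the authors propose no specific market design.

    # Hypothetical sketch of a "distributed and dispatchable" resource responding to
    # real-time local prices. All names and numbers are invented for illustration.

    from dataclasses import dataclass

    @dataclass
    class Battery:
        capacity_kwh: float
        charge_kwh: float
        marginal_cost: float   # $/kWh the operator attributes to cycling the battery

    def dispatch(battery: Battery, price: float, step_kwh: float = 1.0) -> str:
        """Sell when the local real-time price beats the operator's cost;
        buy when power is cheaper than that cost and there is room to store it."""
        if price > battery.marginal_cost and battery.charge_kwh >= step_kwh:
            battery.charge_kwh -= step_kwh
            return f"sell {step_kwh:.0f} kWh at ${price:.2f}"
        if price < battery.marginal_cost and battery.charge_kwh + step_kwh <= battery.capacity_kwh:
            battery.charge_kwh += step_kwh
            return f"buy {step_kwh:.0f} kWh at ${price:.2f}"
        return "hold"

    unit = Battery(capacity_kwh=10.0, charge_kwh=5.0, marginal_cost=0.12)
    for price in [0.08, 0.10, 0.25, 0.30]:   # four real-time price intervals
        print(dispatch(unit, price))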

The engineers stop short of advocating for specific technology investments such as advanced nuclear. Instead, they call for a "70-20-10" approach to technology development, similar to the one Google itself follows.

Incremental improvements to existing technologies aren’t enough; we need something truly disruptive to reverse climate change. What, then, is the energy technology that can meet the challenging cost targets? How will we remove CO2 from the air? We don’t have the answers. Those technologies haven’t been invented yet. However, we have a suggestion for how to foster innovation in the energy sector and allow for those breakthrough inventions.

Consider Google’s approach to innovation, which is summed up in the 70-20-10 rule espoused by executive chairman Eric Schmidt. The approach suggests that 70 percent of employee time be spent working on core business tasks, 20 percent on side projects related to core business, and the final 10 percent on strange new ideas that have the potential to be truly disruptive.

Wouldn’t it be great if governments and energy companies adopted a similar approach in their technology R&D investments? The result could be energy innovation at Google speed. Adopting the 70-20-10 rubric could lead to a portfolio of projects. The bulk of R&D resources could go to existing energy technologies that industry knows how to build and profitably deploy. These technologies probably won’t save us, but they can reduce the scale of the problem that needs fixing. The next 20 percent could be dedicated to cutting-edge technologies that are on the path to economic viability. Most crucially, the final 10 percent could be dedicated to ideas that may seem crazy but might have huge impact. Our society needs to fund scientists and engineers to propose and test new ideas, fail quickly, and share what they learn. Today, the energy innovation cycle is measured in decades, in large part because so little money is spent on critical types of R&D.
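
As a back-of-the-envelope illustration of the split they describe (the $100 million figure below is an assumption, not from the piece), the allocation works out like this:

    # Minimal sketch of the 70-20-10 split applied to an R&D budget.
    # The total and the category labels are illustrative assumptions.

    def split_rd_budget(total):
        """Apportion an R&D budget per the 70-20-10 rule described above."""
        return {
            "core technologies industry can deploy today": 0.70 * total,
            "cutting-edge technologies nearing viability": 0.20 * total,
            "seemingly crazy, potentially disruptive ideas": 0.10 * total,
        }

    for category, amount in split_rd_budget(100e6).items():   # e.g. a $100 million program
        print(f"{category}: ${amount:,.0f}")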

The piece does not say whether Google intends to follow its own advice and develop a similar approach to investing in next-generation energy technologies. Indeed, in the wake of RE<C's failure, the company has itself taken a somewhat incremental, deployment-heavy strategy, focusing on driverless cars, conventional renewable energy procurement and home automation through its acquisition of Nest.

But Koningstein and Fork had a blunt message about their experience: "With 20/20 hindsight, we see that it didn’t go far enough."