CeBIT 2015: Co-founder Chris Kemp on OpenStack's beginnings and the future of Cloud

The birth of OpenStack, and the future of services and the death of software

Former NASA CTO and OpenStack co-founder, Chris Kemp

Former NASA CTO and co-founder of OpenStack, Chris Kemp, took the stage at CeBIT Australia 2015 to challenge proprietary Cloud models and extol the virtues of an open-standards Cloud environment.

OpenStack has already been declared the fastest growing open source project in history, with companies such as Cisco and HP pouring in billions to support the project. But it had a far more humble beginning, as a collaborative tool to help NASA filter and organise the petabytes of data streaming from its various space telescopes, back in the days when storage cost $30-40 per gigabyte.

On its own, Australia's Square Kilometre Array would generate five times as much data as the internet as a whole, Kemp said, laying out the scale of the problem engineers faced in 2010.

"At these prices, not only would we exceed NASA's budget, just for one of these projects, but we'd nearly exceed the federal budget of the entire US government if we were paying that much for storage. We are literally talking about hundreds of billions, even trillions of dollars," he said.

"We had to think completely differently about how we were going to build this infrastructure."

Kemp's key project at NASA was to combine the data from the Mars Opportunity rover exploring Victoria crater with the data from the Mars Reconnaissance Orbiter, which was taking high-resolution images of the planet from orbit. He formed a partnership with both Google and Microsoft to take all the data from the NASA Jet Propulsion Laboratory, translate it into usable formats, and render it into the 3D image of Mars that became part of Google Earth.

Kemp moved operations to MAE-West, the 1960s birthplace of ARPAnet (the precursor to the Internet), in the heart of Silicon Valley, with cheap land and cheap power - but it was hardly a conventional setup.

"What I was able to do was build in these shipping containers, using the same ideas as Google and Facebook, petabytes of storage, 1000s of cores of compute, and cost about a million dollars, versus the tens of millions of dollars it would've cost to put a traditional datacentre in place."

The project worked a treat, and anyone can now see all the visual information NASA has on Mars via Google Earth: they can swoop through craters, do flybys and use it in classrooms, and it doesn't cost a cent - a key consideration Kemp wanted to preserve.

"This was a way for NASA to take a whole bunch of information that was probably once only seen by a few hundred people, and we've now made it accessible to literally hundreds of millions of people in the last five years."

The project drew a lot of attention, which eventually led to a call from the White House's newly appointed CIO, Vivek Kundra.

Kundra had been following the project, and Kemp was invited to help launch the US government's Cloud strategy in 2009. President Obama's administration was looking to Cloud to help save the government tens of millions of dollars in IT costs.

"The Government in the United States is the largest consumer of IT in the world; they were spending $90 billion of the US budget every year on IT. The opportunity to save money was significant, because they had so many datacentres, and so many redundant and failed projects. The idea was that Cloud would mean that they wouldn't have to build this stuff themselves, and save billions of dollars."

Kemp used the infrastructure platform he had developed, NASA Nebula, to produce the government's USAspending.gov - a website that showed the public how it was spending "every penny" of the taxpayers' money - some US$4 trillion across state, federal and private sectors.

The website needed to suck up all the data from every government agency, and present it in an intuitive way via simple visualisations, such as graphs and flowcharts. It was powered by the very first "alpha" version of OpenStack.

This effectively made NASA a service provider to the White House, funded by the General Services Administration (GSA).

Being a government project, OpenStack couldn't be privatised, and there were no plans to on-sell or profit from the technology. This also meant the budget was limited, and attracting high-level talent was tough - especially since, Kemp says, top-shelf talent hates working on closed standards and in secret government labs.

"We knew if we could make this technology widely available, and get people to embrace it, it could become a new standard. This would benefit NASA because it would mean that we could have other parties providing computing resources back to us," he said.

"So we open sourced the entire code. Philosophically, it also made a lot of sense to us. We didn't think that taxpayers should be paying for code that was proprietary. The code, the intellectual property, the knowledge created in that project should be open source."

Within a few weeks, Kemp was approached by Rackspace Hosting, which was also in the process of open sourcing its technology. Rackspace's storage technology was combined with NASA's compute, and OpenStack, as we know it now, was born.

"It was an overnight success. We got a number of companies involved very quickly, that also shared our vision for an open source Cloud platform. An AWS, or Google, or Azure that you could just run in your own datacentre, Infrastructure-as-a-Service. It's a way you can treat an entire room full of computers as one system," he said.

"OpenStack was to a datacentre what Linux was to a workstation in the '80s and '90s."

Large-scale successes already include the CERN particle physics laboratory, which boasts 150,000 compute cores, and Wal-Mart, which just announced that OpenStack will be powering its entire ecommerce infrastructure.

"So this is a project that didn't exist five years ago, that a small bunch of hackers at NASA put together, and because we decided to open source the technology, we got every major IT company throwing their weight behind it. Now NASA doesn't need to build it anymore, it can buy it from any number of different vendors that are selling products competitively in the free market.

"What I learned from all this is that it's not about the code, it's about the value that the code provides. There is so much more value in the code that we created becoming available to everyone, becoming a standard in the industry, that far exceeded the value of it staying a proprietary piece of software inside NASA. It's about freedom and innovation."

Much like Cloud as a whole, Kemp believes we need to start seeing software as a means to an end, rather than an asset to be paid for. It's about the value added on top of that service. The economics have fundamentally changed, which is why NASA never locked the system down and asked for licence fees.

"It flies in the face of modern economics. In terms of scarcity and demand, it's effectively about negative scarcity.

"We're not buying software; we're benefiting from software, we're using software, and the applications live in the Cloud. Facebook, Uber, AWS, Salesforce.com, Office 365 - they're all powered by massive hyperscale computing infrastructure.

"These applications are now pervasive in our lives, they're on our PCs, our tablets, on our TVs, our cars. In that sense, there's no real value in the technology that Facebook uses to build their servers, or their analytics platform. That's why now everyone open sources their technology, if you can get more people using your technology, it provides more value."

Effectively, the whole world is becoming a digital service.

"If we start to think about the future, and the world of tomorrow, we're really starting to see companies become service providers for the thing that they do. If you're Amazon, you're trying to optimise your supply chain, how to optimise the placement of warehouses, how to optimise pricing, margin profiles.

"Most banks are largely huge IT shops, they are just trying to understand the implications of various factors in the market, playing out complex algorithms.

"Companies will focus on a small handful of things that make them who they are, and they will consume everything else as a service from other companies who make it their business to create those goods. It no longer makes sense to host your own email, your own accounting system, your own ERP or HR systems."

Allan Swann is the Editor of ARN, published by IDG Communications Australia. Follow Allan on Twitter @allanswann, and at Google+.
