Inside a Rackspace data center. (Photo: Rackspace)

OpenStack’s Search for True Self

While the name OpenStack appears everywhere you turn in the IT world, there isn’t really an agreed-upon set of technologies the trademark describes. The number of technologies in the OpenStack ecosystem is growing, and there is a push underway within the OpenStack Foundation to define what is and what isn’t OpenStack.

The goal of this push is to ensure interoperability between OpenStack clouds, said Mark Collier, chief operating officer of the OpenStack Foundation. “One of the things people envision is interoperation,” he said. “If you want to use the same tools, you want to use [them] for any OpenStack cloud. Now that we’ve gotten to this critical mass, we want to live up to that vision.”

The community operates on a six-month release cycle with frequent development milestones. Given that pace, the definitions trail the releases. The foundation’s board is close to completing definition work for Havana, the eighth release of the open source cloud architecture, which came out about a year ago. The board will then move on to Icehouse, the most recent release, even as the next one, Juno, is slated to arrive in a few weeks.

The board is trying to reach consensus across the ecosystem on a minimum set of requirements a cloud needs to meet to rightfully use the OpenStack trademark. Board member and DreamHost CEO Simon Anderson said components of the baseline set – Nova for compute and Swift for object storage – are fundamental building blocks, but the definition is “soft.”

DreamHost, a hosting company, runs an OpenStack cloud but uses Ceph instead of Swift, which would disqualify it had the definition been a hard one.

“Smart end users are seeing it as ‘what do I need to use in this broad set of software?’ The trademark issue will definitely help, specifically around API compatibility,” he said.

Same tests for everyone

In addition to better defining itself, the OpenStack Foundation is working on a set of tests to qualify interoperability between different companies’ technologies in the OpenStack ecosystem. “We’re just confirming that a cloud operates the way you’d expect, that it has common behaviors,” Collier said.

“What we’re basically doing as a community is we’re taking the same kinds of tests. As we release new versions of OpenStack, we make sure it doesn’t break downstream.”

Behind the testing effort is Tempest, an open source project that contains many different types of integration tests.

“We’re getting to the point where the tests can be run by different companies that use the [OpenStack] trademark,” Collier said. “There will be a grace period to get used to the idea of passing all of these tests. Right now we’re socializing it, getting [it in] the hands of users and seeing if there are any red flags. We absolutely didn’t want to dictate that day one you have to pass all of these tests.”
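Tempest is a large Python test suite that exercises a live cloud through its APIs; the article doesn’t reproduce any of its tests, but a much-simplified sketch can convey the kind of “common behavior” check involved. The `list_servers` function and the fields it returns below are hypothetical stand-ins for a real API call, not actual Tempest code:

```python
import unittest


def list_servers():
    """Hypothetical stand-in for a cloud's 'list servers' API response.

    A real Tempest test would call a live OpenStack endpoint here;
    this canned dict exists only to make the sketch self-contained.
    """
    return {
        "servers": [
            {"id": "a1b2", "name": "web-01", "status": "ACTIVE"},
            {"id": "c3d4", "name": "web-02", "status": "BUILD"},
        ]
    }


class CommonBehaviorTest(unittest.TestCase):
    """Checks that a response has the shape interoperable tooling relies on."""

    def test_servers_key_present(self):
        # Tooling written against one OpenStack cloud expects the same
        # top-level structure from any other cloud bearing the trademark.
        body = list_servers()
        self.assertIn("servers", body)

    def test_each_server_has_expected_fields(self):
        # Every server entry should expose the same baseline attributes.
        for server in list_servers()["servers"]:
            for field in ("id", "name", "status"):
                self.assertIn(field, server)


if __name__ == "__main__":
    unittest.main()
```

The point of tests like these is behavioral, not implementation-level: a cloud could back its APIs with different storage or networking internals and still pass, which is exactly the flexibility the “soft” definition is meant to preserve.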

Why now?

OpenStack is now four years old, while the OpenStack Foundation is two. Collier explained that standards around definitions and testing for interoperability have become a focus now because OpenStack has reached a critical mass of real-world installations and users.

“There wasn’t a framework for how to talk about it. Now there is,” Anderson said. “My sense, being an insider, is that there’s a lot of will to get things done, a consensus from a large number of companies.”

The foundation is made up of groups with separate but overlapping responsibilities. “During the last couple of board meetings, we’ve done joint meetings with the technical community,” he said. “Combining commercial governance together with the technical community has brought a lot of discussion of the strategy around OpenStack.”

Jonathan Bryce, executive director of the foundation, said there were two aspects to the definition work: one about the expected capabilities and the other about specific lines and modules of code under the hood. “That gets into a lot of detail,” Bryce said about the latter. “That’s where there’s the most discussion.”

User-driven decisions

Just as users have been instrumental to the evolution of the technology behind OpenStack, they have played an important role in the documentation, testing and development processes, as well as the interoperability programs, Bryce said.

“Here’s what I always come back to: users are who ultimately should have the strongest voice,” Collier said. “Four years ago there were not a lot of users; it was prototyping and a lot of companies experimenting. The users are someone to go to as a tie-breaker. It’s much different than a traditional proprietary software model. We strive to involve the users in the process. We also have some analytics, a user survey, all that data to leverage. That has been a godsend in tie-breakers during our discussions.”

About the Author

Jason Verge is an Editor/Industry Analyst on the Data Center Knowledge team with a strong background in the data center and Web hosting industries. In the past he’s covered all things Internet Infrastructure, including cloud (IaaS, PaaS and SaaS), mass market hosting, managed hosting, enterprise IT spending trends and M&A. He writes about a range of topics at DCK, with an emphasis on cloud hosting.