New UK IBM Cloud DCs ‘Infused with Intelligence,’ But Maybe Not for Its Own Ops
If cognitive intelligence algorithms are so great, why don't the service providers offering them to customers take advantage of them themselves? These AI tools could potentially improve their own automation, their responsiveness to traffic bottlenecks, and conceivably their power consumption.
When IBM announced this week its plans to expand its UK-based operations for IBM Cloud with four new data centers across the country, in addition to its existing two, it characterized those new facilities as being “infused with cognitive intelligence.” In an earlier decade, that might have sounded too much like science fiction; today, it sounds more like a miracle ingredient for cleaning kitchen countertops. If these new facilities really are so “infused,” will IBM be able to leverage that infusion for the facilities’ own benefit?
In responses sent to Data Center Knowledge from London early Friday morning, the company told us that cognitive intelligence is something that clients, specifically, are asking for. Furthermore, IBM said that clients, and only clients, will be leveraging these features, at least for now.
“IBM Cloud has a single platform that brings together IaaS with more than 150 APIs and services, that clients are asking for as we shift to Cloud as a platform for innovation,” stated the IBM spokesperson. “Clients want help to manage their data resources through our data centers, but we also offer options to scale their applications and build new ones infused with cognitive intelligence, blockchain services, IoT, data-as-a-service, and more.”
This is as far as IBM will go, for now, with respect to its own efforts to do the sort of thing software engineers call “eating their own dog food.” It’s not alone in that respect: Google has recently bragged about the cognition-like capabilities of its DeepMind project, and how it is expanding the possibilities for real-world applications of neural networking. Last June, Google went so far as to boast that it had applied DeepMind to its own data center power consumption models, with the result that the company reduced its cooling bill (by its own tally) by about 40 percent.
But when pressed further about its application of DeepMind, for example in determining better patterns for automating the dissemination of the many microservices that comprise its global computing network, all the company was able to do at the time was sit quietly and take notes. Its people may get back to us at some point.
It’s not as though Google, or IBM, or anyone else, is uninterested in the idea. In 2014, Google data center research engineer Jim Gao published a research paper [PDF] demonstrating that neural networks were particularly effective at predicting a facility’s power usage effectiveness (PUE), the ratio of total facility energy to the energy consumed by IT equipment, over extended periods of time, given historical data on prior power consumption.
Gao stated that the variables behind these predictions could be adjusted for future conditions, for example to determine whether using drycoolers to exchange heat with outdoor air during the winter months could be more effective at certain periods, and whether rising wet bulb temperatures have a negative impact on fan speeds at certain points in time.
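To make the idea concrete, here is a minimal sketch of that kind of model, not Gao's actual implementation: a small feed-forward regressor trained on hypothetical historical telemetry (IT load, outdoor wet bulb temperature, a cooling setpoint) to predict hourly PUE. The feature names and synthetic data are assumptions made for illustration.

```python
# Hypothetical sketch: train a small neural network on historical operating
# telemetry to predict hourly PUE. Features and data are illustrative
# assumptions, not Google's actual inputs or model.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 5_000  # hourly samples of (synthetic) historical data

it_load_mw = rng.uniform(5, 25, n)           # IT equipment load
wet_bulb_c = rng.uniform(2, 22, n)           # outdoor wet bulb temperature
chiller_setpoint_c = rng.uniform(16, 22, n)  # cooling setpoint
X = np.column_stack([it_load_mw, wet_bulb_c, chiller_setpoint_c])

# Synthetic PUE: overhead grows with wet bulb temperature and with lower
# chiller setpoints, and shrinks as IT load rises.
pue = (1.08 + 0.004 * wet_bulb_c + 0.3 / it_load_mw
       + 0.002 * (22 - chiller_setpoint_c) + rng.normal(0, 0.005, n))

X_train, X_test, y_train, y_test = train_test_split(X, pue, random_state=0)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(50, 50), max_iter=2000, random_state=0),
)
model.fit(X_train, y_train)
print("held-out R^2:", round(model.score(X_test, y_test), 3))

# Once trained, the model can be queried under altered conditions, e.g. the
# expected PUE at a lower setpoint on a warm afternoon.
print("predicted PUE:", model.predict([[18.0, 20.0, 17.0]]).round(3))
```

Once a model like this has learned from enough history, it can be queried under altered operating conditions, which is exactly the kind of what-if analysis Gao describes for drycooler use and wet bulb temperature.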
But Gao also embedded a small complaint at the front of his paper: budget outlays at Google and elsewhere for studying how to drive down PUE were shrinking. The reason: budget managers had concluded that efforts to lower PUE further had already reached the point of diminishing returns.
When we asked IBM what formula it used to conclude that it needed to triple IBM Cloud’s data center capacity in the UK, the only answer it provided came not from internal research but from analysis firm IDC.
“Presently, client demand for cloud is soaring. Overall, the industry is moving into the new wave of cloud adoption – transitioning from costs savings to a platform for innovation,” reads IBM’s response to Data Center Knowledge.
“The need for cloud data centers in Europe continues to grow; IDC forecasts worldwide revenues from public cloud services will reach more than $195 billion in 2020. UK Cloud adoption rates have increased to 84 percent over the course of the last five years, according to Cloud Industry Forum. IBM's new facilities will give users access to a broad array of server options, including bare metal servers, virtual servers, storage, and networking capabilities.”
If AI truly is becoming a competitive market, any cloud service provider would do itself a world of good by using its own data centers as its own test case.
Few other details are known about IBM’s plans at this point, except that the first of its four new UK data centers will go online next month in Fareham, a western suburb of Portsmouth. The other three facilities are scheduled to go online during 2017, though their final locations may have yet to be determined. Ark Data Centres, a joint venture partner with the UK Government’s own Crown Hosting, will be the leaseholder for the second new facility.