Next Generation Computing: The Cloud as a Knowledge Engine

At HP’s Industry Analyst Summit earlier this month, I reviewed the generational leaps in the IT industry that brought us to where we are today – what we here at HP call the New Style of IT – and what we’re doing within HP to prepare for the future. 
 
While we continue to release innovative HP products and services, we are simultaneously thinking about what comes next – making sure that today’s work is grounded in the context of the future. The cloud, infrastructure, solutions and software that we provide today are not developed as isolated, time-bound capabilities; they are developed with an eye to what’s next.
 
So, what is next? 
 
The cloud has brought powerful new capabilities in terms of business models, cost and deployment agility, but we must continue to shape where it goes next. We do this by identifying the unmet challenges and new problems presented by today’s ever-changing world. (That’s the flipside of innovation: generational leaps invariably introduce a new set of issues that require pioneering solutions.)
 
For example, today’s society has an insatiable appetite for compute. We face an unbelievable onslaught of data, yet our limited power infrastructure struggles to provide enough electricity for the data centers needed to process it all efficiently. With this in mind, here are some of the changes we’re planning for:
 
  • Security – Recent headlines have brought security to the forefront of customers’ minds. They increasingly want to know where their data is, who owns it and how it is encrypted. We’re looking at ways to ensure that security is built in at the very base of the infrastructure and that users’ data never lives in an unencrypted state.
  • Big Data – When looking to the future of Big Data, we actually begin our thought process at the exabyte range – a billion gigabytes. It’s clear that data growth will continue to explode and that we’ll reach the point where the data is too big to move (even with photonics). You can’t work with the data if you don’t know what you have, so we’re identifying ways to send an index of stored data to the cloud, helping customers extract real-time insight from enormous volumes of data (a rough sketch of this idea follows this list).
  • Insufficient Resources – Resource allocation is a tremendous issue, and scaling out infrastructure is quickly becoming untenable. We already hear from customers who can’t draw enough power from their cities or grids to run their data centers at full capacity. Securing the power and cooling they need will only get harder as competition and data demands grow. Given this reality, we’ve shifted much of our research from managing energy more efficiently to avoiding energy use in the first place.
  • End of Cheap Hardware – The Moore’s Law era of ever more powerful, ever cheaper hardware is coming to an end. As the processing and storage that customers need for performance keep growing, their hardware costs will inevitably rise. Meeting those needs will require a radically new compute paradigm built on task-specific processing nodes and vast pools of non-volatile storage communicating over photonic links.
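To make the indexing idea in the Big Data point a bit more concrete, here is a minimal Python sketch, assuming each node summarizes the files it holds locally; the directory path, node name and record fields are illustrative placeholders rather than any actual HP interface.

```python
import hashlib
import json
from pathlib import Path

def fingerprint(path: Path) -> str:
    """Hash a file in chunks so even very large files never load into memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def build_index(data_dir: str, node: str) -> str:
    """Produce a compact JSON index of local data: names, sizes, fingerprints.

    Only this index travels to the cloud; the underlying data stays on the node.
    """
    entries = [
        {"name": p.name, "bytes": p.stat().st_size, "sha256": fingerprint(p)}
        for p in Path(data_dir).glob("*")
        if p.is_file()
    ]
    return json.dumps({"node": node, "entries": entries})

if __name__ == "__main__":
    # "/var/local/data" and "edge-01" are placeholders; the resulting index is
    # typically kilobytes, which is what gets shipped for cloud-side analysis.
    print(build_index("/var/local/data", "edge-01"))
```

The point of the sketch is the asymmetry: the index is small enough to move and query centrally, while the raw data never leaves the node.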

Part of what we’re doing within HP Labs is developing a next-generation cloud with a cognitive computing learning engine at its core. It will allow us to take “knowledge” – not just data – from end points, use cognitive computing to turn that knowledge into “learning,” and then redistribute those learnings back to the end points.

Consider cell towers, which must cope with an enormous volume of fast-changing data. Today, that data is far too big to move, so the vast majority is simply deleted. Now, imagine those cell towers as end points of the next-generation cloud, transforming all that data locally into intelligence that is then sent to a centralized learning engine. Using cognitive computing, that engine could identify key learnings and feed the conclusions back to all of the other cell towers, increasing efficiency and business value.
 
This type of local data processing aims to reduce latency, save energy, cut costs and decrease the need to physically reconfigure cell tower hardware. A cloud-based knowledge engine would also provide full network virtualization, enabling lightning-fast rollout of new capabilities.
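To illustrate the knowledge-to-learning loop described above, here is a toy Python sketch of the pattern: each tower reduces its raw measurements to a small summary, a central engine combines those summaries into a shared conclusion, and every tower acts on the result. The tower names, traffic numbers and thresholds are invented for illustration; this is a sketch of the pattern, not the engine HP Labs is building.

```python
from statistics import mean

def summarize(raw_readings: list[float]) -> dict:
    """Local step: raw data never leaves the tower, only this summary does."""
    return {"count": len(raw_readings), "mean_load": mean(raw_readings)}

def learn(summaries: list[dict]) -> dict:
    """Central step: combine per-tower summaries into one global learning."""
    total = sum(s["count"] for s in summaries)
    weighted = sum(s["mean_load"] * s["count"] for s in summaries) / total
    return {"global_mean_load": weighted}

def apply_learning(summary: dict, learning: dict) -> str:
    """Redistribution step: each tower acts on the shared conclusion."""
    if summary["mean_load"] > 1.5 * learning["global_mean_load"]:
        return "shift traffic to neighboring cells"
    return "no change"

if __name__ == "__main__":
    towers = {
        "tower-a": [0.4, 0.5, 0.6],
        "tower-b": [0.5, 0.4, 0.5],
        "tower-c": [1.2, 1.4, 1.3],   # unusually heavy load
    }
    summaries = {name: summarize(r) for name, r in towers.items()}
    learning = learn(list(summaries.values()))
    for name, s in summaries.items():
        print(name, apply_learning(s, learning))
```

The key design choice mirrors the cell tower example: the raw readings never cross the network, only the small summaries and the shared learning do.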
 
I hope this gives you an idea of the goals and targets that we have for the future, as well as the performance levels and kinds of workloads that we are anticipating. HP has all of the right assets – from Converged Infrastructure to Information Optimization to Security – and the technology needed to build devices for this next generation of computing.