Cloud computing has become the all-purpose buzzword of business computing -- it can mean pretty much whatever you want it to mean, but every product better have some cloud in it. Networking giant Cisco has totally bought into the concept, but a couple of top execs also described what they see as limits on how far pure cloud computing will spread.
In a drab conference room out by Oakland Airport (the company's planned Zeppelin excursion to highlight its cloud product launch was scrubbed by bad weather), Cisco's Murali Sitaram (VP/GM, Cloud Collaboration Applications) and Lew Tucker (VP and CTO, Cloud Computing) explained the company's approach as they introduced additions to Cisco's Cloudverse family.
The Plumbing Behind The Cloud
Instead of just offering its own products on a hosted basis, Cisco's approach is to work with telecom carriers, large enterprises and resellers to help them offer collaboration-and-communication-as-a-service.
The idea, Sitaram said, is to leverage Cisco's partners to provide services without having to become a carrier itself -- which is a daunting, heavily regulated proposition in many parts of the world. "We don't want to be in the carrier business, but we do want to provide services through partners."
Those services include expanding Cisco’s Hosted Collaboration Solution to include TelePresence, Customer Collaboration (contact centers), unified communications and mobility. It also means letting large customers install the company's WebEx online Web conferencing solution in their own data centers.
That may not jibe with most people's definition of cloud computing, but according to Sitaram, many customers still demand more control over their services, either because they're in a highly sensitive industry like the military, health care or financial services, or because they're in emerging markets with restrictive regulations and unreliable public infrastructure.
"It's not easy to deliver cloud-based services" to countries like China, India, Russia and South America, Sitaram said, "especially from the United States." Besides, "the cloud isn't just Facebook and Salesforce," Sitaram added. "If you peel the onion, there are just so many nuances."
When Is The Cloud Not The Cloud?
Nuances or not, earlier this month, I noted that Oracle's Larry Ellison Has Some Strange Ideas About Cloud Computing. Cisco's use of "cloud computing" in this context reminded me of Ellison's Oracle Private Cloud oxymoron, but Sitaram said the Private Cloud version of WebEx retained the service's "quasi-multi-tenant" cloud-based architecture and still offered end users a subscription-based experience. Well, if he says so, but putting your stuff in the customer's data center still ain't what I call "cloud."
Ironically, that may be the point. "Some countries and businesses, they will never go to [Software-as-a-Service]-based clouds," Sitaram said.
So how far will cloud computing go? "There's going to be a world of many clouds," Tucker predicted. Most things will go into the cloud, but many may not. "Many companies want to be their own cloud providers to their workers -- but using a cloud model with self-service and pay-as-you-go pricing… the consumer could be an employee."
When I tried to pin down Tucker on exactly how far he thought cloud adoption would go, he guessed 60% cloud, 40% on premises. That seems low to me. After all, once the utilities started providing cheap, reliable power, how many customers still wanted to generate their own electricity?
Photos of Sitaram and Tucker by Fredric Paul.
The Open Source Cloud is Ready for Hadoop, Projects Say
Two major trends in enterprise computing this year show increasing overlap: big data processing and open source cloud adoption.
To Hortonworks, the software company behind open source Apache Hadoop, the connection makes sense. Enterprise customers want the ability to spin up a cluster on demand and quickly process massive amounts of data, said Jim Walker, director of product marketing at Hortonworks, in an interview at OSCON in July. The cloud provides this kind of access through its ability to scale and handle computing tasks elastically.
The open source cloud offers the additional benefits of low-cost deployment and extra pluggability you won’t get with a proprietary cloud infrastructure.
All three major open source IaaS platforms -- OpenStack, CloudStack and Eucalyptus -- have made much progress this year in testing Hadoop deployments on their stacks. And Eucalyptus is working on full integration with the platform.
Although no formal relationship exists between Hadoop and the open source IaaS platforms now, Hortonworks does see potential for collaboration given the general nature of cloud computing, Walker said.
“(Hadoop) could be a piece of any open cloud platform today,” he said.
Here’s what each of the three major platforms had to say recently about their progress with Hadoop on the open cloud.
OpenStack
In the past, deploying Hadoop in a cloud datacenter proved too challenging for business-critical applications, said Somik Behera, a founding core developer of the OpenStack Quantum project at Nicira, which has since been acquired by VMware. Big data applications require guaranteed bandwidth, which was difficult to provide, Behera said.
OpenStack’s Quantum networking project, which was recently integrated into the new Folsom release, offers an Open vSwitch networking plugin to help ensure performance for Hadoop deployments, Behera said. His Quora post on the topic explains it best:
(See Somik Behera's Quora answer to "Has anyone tried to deploy an Apache Hadoop cluster on OpenStack?")
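To make the networking side a little more concrete, here is a minimal sketch of how a tenant might carve out an isolated Quantum network for a Hadoop cluster and boot worker nodes onto it. It assumes the Folsom-era python-quantumclient and python-novaclient libraries; the credentials, network name, CIDR, image and flavor IDs, and node count are all placeholders, and any bandwidth guarantees would come from the Open vSwitch plugin configuration underneath, not from these API calls.

```python
from quantumclient.v2_0 import client as quantum_client
from novaclient.v1_1 import client as nova_client

AUTH = dict(username='demo', password='secret',            # placeholder credentials
            tenant_name='hadoop',
            auth_url='http://keystone.example.com:5000/v2.0/')

quantum = quantum_client.Client(**AUTH)
nova = nova_client.Client(AUTH['username'], AUTH['password'],
                          AUTH['tenant_name'], AUTH['auth_url'])

# Create an isolated tenant network and subnet for the Hadoop cluster.
net = quantum.create_network({'network': {'name': 'hadoop-net'}})['network']
quantum.create_subnet({'subnet': {'network_id': net['id'],
                                  'ip_version': 4,
                                  'cidr': '10.0.10.0/24'}})

# Boot the worker nodes directly onto that network.
for i in range(4):
    nova.servers.create('hadoop-worker-%d' % i,
                        image='<hadoop-image-id>',          # placeholder image
                        flavor='<large-flavor-id>',         # placeholder flavor
                        nics=[{'net-id': net['id']}])
```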
CloudStack
The biggest challenge for deploying Hadoop on CloudStack has been allocation of resources, said Caleb Call, manager of website systems at Overstock.com and a CloudStack contributor, via email.
“In order to crunch the data we need to in our Hadoop cluster, we currently have many bare metal boxes,” Call said. “Reproducing this same model in the cloud, even being a private cloud, has proven to be tough.”
Though CloudStack is not currently working on a Hadoop integration, the team has built its cloud environment to guarantee performance for Hadoop workloads through a dedicated resource pool, said Call, who oversees a team of engineers on the CloudStack project’s “Move to the Cloud” initiative.
“We've also built and tuned our compute templates around Hadoop for this cluster so we don't have to throw large amounts of computing power at the problems,” Call said. “Same as you would do for a bare metal system, but now the saved resources are still left in our compute resource pool available to be used by other Hadoop processes.”
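Call didn't share configuration details, but as a rough illustration of the "dedicated resource pool" idea, the sketch below defines a Hadoop-tuned compute offering pinned to tagged hosts through the CloudStack API. The endpoint, API keys, host tag, and sizing values are all hypothetical, and the request signing shown is a simplified form of CloudStack's standard HMAC-SHA1 scheme.

```python
import base64
import hashlib
import hmac
import urllib
import urllib2

ENDPOINT = 'http://cloudstack.example.com:8080/client/api'   # placeholder endpoint
API_KEY, SECRET_KEY = '<api-key>', '<secret-key>'             # placeholder keys

def cloudstack_call(command, **params):
    """Sign and issue a CloudStack API call (simplified HMAC-SHA1 signing)."""
    params.update(command=command, apikey=API_KEY, response='json')
    query = '&'.join('%s=%s' % (k, urllib.quote_plus(str(params[k])))
                     for k in sorted(params))
    # CloudStack signs the lower-cased, sorted query string with the secret key.
    sig = base64.b64encode(hmac.new(SECRET_KEY, query.lower(),
                                    hashlib.sha1).digest())
    url = '%s?%s&signature=%s' % (ENDPOINT, query, urllib.quote_plus(sig))
    return urllib2.urlopen(url).read()

# A compute offering sized for Hadoop workers, restricted to hosts tagged
# for the dedicated Hadoop pool (the tag and sizes are illustrative).
cloudstack_call('createServiceOffering',
                name='hadoop-worker',
                displaytext='Hadoop-tuned worker node',
                cpunumber=8, cpuspeed=2000, memory=32768,
                hosttags='hadoop-pool')
```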
Eucalyptus
At Eucalyptus, performance challenges with Hadoop in the cloud have been largely overcome in the past year, said Andy Knosp, VP of Product at the company.
“There’s been some good research that’s shown near-native performance of Hadoop workloads in a virtualized environment,” Knosp said. This has made Hadoop “a perfect use case” for the open cloud.
Amazon Web Services currently offers the Elastic MapReduce (EMR) service, a hosted Hadoop framework that runs on EC2 and S3. Through the company’s partnership with AWS, Eucalyptus is developing a similar offering that will provide simplified deployment of Hadoop on Eucalyptus.
Customers can run Hadoop on the Eucalyptus private cloud platform as-is -- no plugins required, Knosp said. But the company also has a team working on integrating Hadoop with the platform for simplified deployment.
“We want to make it as simple as possible for our community and partners to deploy,” Knosp said. “It improves time to market for Hadoop applications.”
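Because Eucalyptus exposes AWS-compatible APIs, "running Hadoop as-is" can mean pointing ordinary EC2 tooling at a private endpoint and provisioning nodes the same way you would on AWS. Here is a minimal sketch using boto's Eucalyptus helper; the endpoint, credentials, image ID, key pair, and instance sizing are placeholders, and installing Hadoop on the resulting instances is left to whatever deployment tooling you already use.

```python
import boto

# Connect to a private Eucalyptus cloud through its EC2-compatible endpoint.
conn = boto.connect_euca(host='eucalyptus.example.com',         # placeholder endpoint
                         aws_access_key_id='<access-key>',      # placeholder
                         aws_secret_access_key='<secret-key>')  # placeholder

# Launch a small pool of instances to serve as Hadoop worker nodes.
reservation = conn.run_instances('emi-12345678',                # placeholder image ID
                                 min_count=4, max_count=4,
                                 instance_type='m1.xlarge',
                                 key_name='hadoop-key')

for instance in reservation.instances:
    print instance.id, instance.private_ip_address
```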
MIT/Stanford Venture Lab (VLAB): The Revolution of Software Defined Networks
Earlier tonight (October 16th, 2012), I had the honor of moderating the panel: The Revolution of Software Defined Networks hosted by The MIT/Stanford Venture Lab (VLAB) with a fantastic set of panelists including:
- Michael Beesley, Chief Technology Officer, Platform Systems Division at Juniper Networks
- Kelly Herrell, Chief Executive Officer at Vyatta
- Awais Nemat, Chief Executive Officer at PLUMgrid
- Jake Flomenberg, Partner at Accel Partners
With their permission, SDNCentral is making the slides from the event available below.
Event Description:
On July 23, 2012, VMware bought Nicira for $1.26B, validating this revolution in networking. The following week, Oracle acquired Xsigo, and just recently Cisco acquired vCider. Upstarts claim they will commoditize the expensive networking gear sold by the incumbents, with standards like Software Defined Networking (SDN) and OpenFlow (OF). Already, Google and Facebook deploy their own network hardware and software -- not the proprietary offerings of incumbent networking players. Many entrepreneurs are betting on SDN and OpenFlow. VLAB engages a robust discussion on SDN.
- Are we ready for chasm crossing?
- Is Cloud Computing driving SDN?
- Who else is using SDN and why?
- Is SDN a tectonic technology shift, or just a niche?
- Will incumbents co-opt SDN with closed proprietary implementations?
- Will there be a win-win between users and vendors?
- Where are the opportunities?