Sunday, December 2, 2012

The Northbound API is the key to OpenFlow’s Success

David Lenrow says:

The value of the SDN architectural approach, together with controllers and their associated northbound interfaces (NBIs), is that it completely abstracts away which southbound API is used to talk to the network devices. (And an architectural approach is what SDN is. It isn't a network solution and doesn't do anything in and of itself; rather, it lends itself to building solutions with a global network view and APIs more abstract than the device or flow-table model.) A controller built on the SDN architectural approach may or may not speak OpenFlow, and the answer to that question is a solid "don't care" from the orchestration and cloud-OS layers talking to the NBI in the cloud stack. The power of SDN is that a controller can expose a network abstraction while the details of the device-level implementation stay completely hidden.

I completely agree that developing, sharing, and eventually standardizing the NBI is important and has the potential to be a game changer, but this is completely orthogonal to whether OpenFlow is the only, or even a good, southbound protocol for controlling some or all of the forwarding behavior in a network. The ONF made the horrible mistake at launch of positioning SDN as the tail and OpenFlow as the dog. Now that the interesting conversation in the industry is about the NBI, the ONF risks becoming even more irrelevant in the future, because it doesn't appear to understand that the NBI is the key to integrating virtual networking with the on-its-way-to-ubiquity cloud movement.

The most innovative and important data center SDN solutions are being built without the not-yet-ready-to-control-anything-but-the-forwarding-table OpenFlow protocol, and the ONF needs to take jurisdiction over the decisions that matter to the industry or fade into irrelevance as a flow-table wire-protocol foundation. The NBI is really important, but that has almost nothing to do with OpenFlow, or with whether OpenFlow will ever be a comprehensive protocol for controlling network devices.
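Lenrow's point about southbound abstraction can be made concrete with a small sketch (plain Python, with invented names; real controllers expose far richer NBIs than this). The orchestration layer codes only against the northbound interface, and the southbound protocol behind it is interchangeable:

```python
from abc import ABC, abstractmethod

class SouthboundDriver(ABC):
    """Device-facing protocol; the NBI consumer never sees this."""
    @abstractmethod
    def install_rule(self, device, match, action): ...

class OpenFlowDriver(SouthboundDriver):
    def install_rule(self, device, match, action):
        return f"flow_mod to {device}: {match} -> {action}"

class NetconfDriver(SouthboundDriver):
    def install_rule(self, device, match, action):
        return f"<edit-config> on {device}: {match} -> {action}"

class Controller:
    """Exposes a network-level abstraction (the NBI)."""
    def __init__(self, driver, topology):
        self.driver = driver
        self.topology = topology  # maps (src, dst) -> devices on the path

    def connect(self, src, dst):
        """NBI call: 'connect these endpoints'; the 'how' is hidden."""
        return [self.driver.install_rule(dev, f"{src}->{dst}", "forward")
                for dev in self.topology[(src, dst)]]

# The orchestration layer's "don't care": same NBI call, either driver.
topo = {("vm1", "vm2"): ["sw1", "sw2"]}
for driver in (OpenFlowDriver(), NetconfDriver()):
    print(Controller(driver, topo).connect("vm1", "vm2"))
```

The orchestration code above never imports or mentions OpenFlow; swapping the driver changes nothing at the NBI, which is exactly the abstraction Lenrow describes.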

I have always believed this to be the core value of SDN as a whole: the ability to configure complex protocols across multiple devices. OpenFlow as it is today provides the transport, but a lot more implementation will be needed at the controller level to make this attractive long term.

Posted via email from Larkland Morley's posterous

Breaking News: SDN Consolidation Continues, Cisco to Acquire Cariden for $141M



This morning Cisco announced plans to acquire Cariden to enhance its service provider software-defined networking (SDN) solutions.  Surprisingly, the deal hasn't been positioned as an SDN play, though from our direct experience with the company, Cariden fits the definition of SDN and has real customers and revenue to prove it.

Equally impressive, Cariden built the company through blood, sweat and tears, bypassing traditional venture capital financing, which should give budding entrepreneurs and folks considering joining an SDN startup inspiration to think differently.

I worked with Shailesh Shukla, the executive responsible for the Cariden acquisition, back at Juniper Networks. He's a smart executive, and he made a great purchase.

Cariden is an example of a network application that can drive adoption of SDN technologies.  For example, as Big Switch announced during their product launch, Cariden is integrated with Floodlight.

We believe this is the start of Cisco acquiring network applications that can eventually be integrated with Cisco ONE.  Expect to see more networking application acquisitions from Cisco in the near future.  We also expect a continued shift toward Cisco and others increasingly acquiring software companies that have bypassed traditional venture capital financing.

Congrats Arman and team!


Cisco Press Release Below.



Cisco Announces Intent to Acquire Cariden
Acquisition Further Strengthens Cisco’s Ability to Lead the Evolution in Service Provider Networking
SAN JOSE, Calif. – Nov. 29, 2012 – Cisco today announced its intent to acquire privately held Cariden Technologies, Inc., a Sunnyvale, Calif.-based supplier of network planning, design and traffic management solutions for telecommunications service providers. With global service providers converging their Internet Protocol (IP) and optical networks to address exploding Internet and mobile traffic growth and complex traffic patterns, Cisco’s acquisition of Cariden will allow providers to enhance the visibility, programmability and efficiency of their converged networks, while improving service velocity.

Cariden’s industry-leading capacity planning and management tools for IP/MPLS (Multi-Protocol Label Switching) networks, which have been deployed by many of the world’s leading fixed and mobile network operators, will be integrated into Cisco’s Service Provider Networking Group to enable multilayer modeling and optimization of optical transport and IP/MPLS networks. Cariden’s products and technology will advance Cisco’s nLight technology for IP and optical convergence. The acquisition also supports the company’s Open Network Environment (ONE) strategy by providing sophisticated wide area networking (WAN) orchestration capabilities. These capabilities will allow service providers to improve both the programmability of their networks and the utilization of existing network assets across the IP and optical transport layers.

“The Cariden acquisition reinforces Cisco’s commitment to offering service providers the technologies they need to optimize and monetize their networks, and ultimately grow their businesses,” said Surya Panditi, senior vice president and general manager, Cisco’s Service Provider Networking Group. “Given the widespread convergence of IP and optical networks, Cariden’s technology will help carriers more efficiently manage bandwidth, network traffic and intelligence. This acquisition signals the next phase in Cisco’s packet and optical convergence strategy and further strengthens our ability to lead this market transition in networking.”

The acquisition of Cariden exemplifies Cisco’s build, buy, and partner innovation framework and is aligned to Cisco’s strategic goals to develop and deliver innovative networking technologies and provide best-in-class solutions for customers, all while attracting and cultivating top talent.

Upon the close of the acquisition, Cariden employees will be integrated into Cisco’s Service Provider Networking Group, reporting to Shailesh Shukla, vice president and general manager of the company’s Software and Applications Group. Under the terms of the agreement, Cisco will pay approximately $141 million in cash and retention-based incentives in exchange for all shares of Cariden. The acquisition is subject to various standard closing conditions and is expected to be completed in the second quarter of Cisco’s fiscal year 2013.

About Cisco

Cisco (NASDAQ: CSCO) is the worldwide leader in networking that transforms how people connect, communicate and collaborate. Information about Cisco can be found at http://www.cisco.com. For ongoing news, please go to http://newsroom.cisco.com.

About the Author


Matt has 20+ years of software-defined networking (SDN), cloud computing, SaaS, & computer networking…

Another twist to the SDN story.

Posted via email from Larkland Morley's posterous

Cloud Computing and Big Data Intersect at NIST, January 15-17

Two major new technologies come together for the Cloud Computing and Big Data Workshop, hosted by the National Institute of Standards and Technology (NIST) at its Gaithersburg, Md., campus Jan. 15-17, 2013.

Combining cloud computing and big data could hasten valuable scientific discoveries in many areas, including astronomy. (NASA image of nebula N76 in a bright, star-forming region of the Small Magellanic Cloud. Credit: NASA)

Cloud computing* offers on-demand access to a shared pool of configurable resources; big data explores large and complex pools of information and requires novel approaches to meet the associated computing and storage requirements. The workshop will focus on the intersection of the two. The meeting is part of the traditional semi-annual cloud computing forum and workshop series, with the additional dimension of big data and its relation to and influence on cloud platforms and cloud computing.

"Cloud computing and big data are each powerful trends. Together they can be even more powerful and that's why we're hosting this workshop," said Chuck Romine, director of the NIST Information Technology Laboratory. "The cloud can make big data accessible to those who can't take advantage today. In turn, big data opens doors to discovery, innovation, and entrepreneurship that are inaccessible at conventional data scales."

The January conference will bring together leaders and innovators from industry, academia and government in an interactive format that combines keynote presentations, panel discussions, interactive breakout sessions and open discussion. Patrick Gallagher, Under Secretary of Commerce for Standards and Technology and NIST director, and Steven VanRoekel, the Chief Information Officer of the United States, will open the conference.

The first day's morning panels examine the convergence of cloud and big data, progress on the U.S. Government Cloud Computing Roadmap and international cloud computing standards.

Two afternoon sessions focus on progress made on the Priority Action Plans (PAPs) associated with the 10 requirements described in the first release of the USG Cloud Computing Technology Roadmap, Volume I (NIST SP 500-293).** Each requirement has associated PAPs related to interoperability, portability and security. The meetings will showcase the voluntary, independent, cloud-related efforts on diverse PAPs underway by industry, academia and standards-developing organizations.

The second day of the workshop explores the unprecedented challenges posed by big data on storage, integration, analysis and visualization—demands that many cloud innovators are working to meet today. The workshop will explore possibilities for harmonizing cloud and big data measurement, benchmarking and standards in ways that bring the power of these two approaches together to facilitate innovation. Day three offers workshops on exploring the formation of new working groups at the intersection of cloud and big data, kicking off a Big Data Research Roadmap, discussing international cloud computing standards progress, and hearing the status of the USG Cloud Computing Technology Roadmap Volume III. Special topic briefings will be offered during lunch times.

For more information on the meeting or to register, go to www.nist.gov/itl/cloud/cloudbdworkshop.cfm.

* For the NIST definition of cloud computing, see http://csrc.nist.gov/publications/nistpubs/800-145/SP800-145.pdf
** USG Cloud Computing Technology Roadmap, Volume I (NIST SP 500-293) is available at www.nist.gov/itl/cloud/upload/SP_500_293_volumeI-2.pdf

Interesting read about cloud computing and big data. They are two of the most interesting fields in computing right now.

Posted via email from Larkland Morley's posterous

Saturday, November 10, 2012

Gartner: Mobile Development, Social Media and Cloud Computing Disrupting IT


At a conference in Orlando, Florida, Gartner Inc. said that the central forces in IT, consisting of social media innovations, mobile devices, web information, and cloud computing, are set to disrupt the whole IT environment. Addressing at least 10,000 participants, Gartner Vice President David Cearley said that, at the rate things are going, the mobile experience is overshadowing the desktop experience. Cloud computing, together with mobile devices, is set to alter the modern corporation's primary computing architecture. Instead of focusing on client-server, IT shops must now set their sights on cloud-client architecture.

This new type of architecture is also likely to alter significantly the skill sets needed for enterprise software development. Front-end interfaces will need better designs, and development teams must pursue HTML5 web browser opportunities in addition to the usual mobile device operating systems. Cearley also noted that consumers have fresh expectations; application developers and architects must acquire new design skills to meet them.

According to Miko Matsumura, senior VP for platform marketing and developer relations at Kii Inc., mobile development has caused traditional architecture to evolve, and a new breed of developers has turned their mobile perspectives to the cloud. In his view, the client cloud is not something separate from a programming platform, programming language, or programming model. Meanwhile, Gartner research VP Jim Duggan said that the changes in application lifecycles and development are signs that, by 2015, mobile application deployments will exceed static deployments by 400%. This means the focus should be on developer training as well as outsourcing.

According to Gartner analysts, there will come a time when every corporate budget is an IT budget and businesses will have a Chief Digital Officer on their payroll. Gartner further predicts that by 2015, around 25% of businesses will have Chief Digital Officers.

Cloud security is also expected to triple in size, primarily because of regulatory compliance. According to Gartner analysts, IT leaders must plan for upcoming government regulations and interventions. Toward the end of 2015, Gartner expects bigger service providers to acquire cloud-based identity and access management solutions. The analyst group also believes that administrative error or user management will account for about 80% of cloud security incidents in 2013. Businesses that require only basic security environments can rely on the security provided by the public cloud service or infrastructure. Gartner also expects that 60% of large firms will restrict network connectivity for personally owned mobile devices used by their staff.

Posted via email from Larkland Morley's posterous

Cloud Computing Gains Continue

The US cloud computing market has quickly grown toward widespread usage. Up from 58% last year, about 78% of medium and large enterprises already use or are testing a cloud solution. On average, cloud as a percentage of enterprise IT stands at 4.4%. Use of cloud in enterprise IT is a mile wide, an inch deep, and growing fast.

New research from WaveLength Market Analytics and Winn Technology Group, The Continuing Enterprise Cloud Computing Evolution, shows that 2012 saw the emergence of a new segment: multicloud users, called Cloud Pros, who make up about 19% of the market. The other segments are Cloud Pioneers (59%), who actively use or pilot a cloud; Cloud Planners (12%), who have cloud plans; and Stragglers (10%), who have no plans.

"The enterprise cloud market and segments have quickly evolved; today's meaningful question is no longer if cloud is used but rather how much," said Natalie Robb, of WaveLength Market Analytics. "Last year, cloud users said they expected 28% of IT to be cloud-based by 2015 and now they expect around 35%. Knowing what sets Cloud Pros and Pioneers apart is crucial for technology and telecom firms to advance technologies and reach buyers."

Other key findings from WaveLength/Winn's report include:

Pros and Pioneers use multiple data centers, but nearly all Pros use AWS, while Pioneers are more likely to use IBM, Verizon, and Rackspace.

To prepare for cloud, Pros invested in network performance improvements while Pioneers invested in storage and security.

Used by 48% of all cloud users, human resources apps surpassed CRM and email as the most common enterprise application in the cloud.

The biggest gains in enterprise and infrastructure cloud service usage came in desktop apps, which grew from 6% last year to nearly 26%, and backup and disaster recovery, which surged from 17% to 38%.

The Continuing Enterprise Cloud Computing Evolution discusses broad trends in the changing cloud computing market. It examines penetration of different service deployment models, projects to prepare for deployment, and cloud enterprise application adoption.

The Continuing Enterprise Cloud Computing Evolution is a joint effort: Winn Technology Group collected the data and WaveLength conducted the analysis. Two more reports on the enterprise cloud market segments will be released in the coming weeks.

This is very interesting data to review for anyone considering investment in cloud computing.

Posted via email from Larkland Morley's posterous

Wednesday, October 17, 2012

Cisco Execs Plumb The Limits Of Cloud Computing

Cloud computing has become the all-purpose buzzword of business computing -- it can mean pretty much whatever you want it to mean, but every product better have some cloud in it. Networking giant Cisco has totally bought in to the concept, but a couple of top execs also described what they see as limits on how far pure cloud computing will spread.

In a drab conference room out by Oakland Airport (the company's planned Zeppelin excursion to highlight its cloud product launch was scrubbed by bad weather) Cisco's Murali Sitaram (VP/GM Cloud Collaboration Applications) and Lew Tucker (VP, CTO Cloud Computing) explained the company's approach as they introduced additions to Cisco's Cloudverse family.

The Plumbing Behind The Cloud

Instead of just offering its own products on a hosted basis, Cisco's approach is to work with telecom carriers, large enterprises and resellers to help them offer collaboration-and-communication-as-a-service.

The idea, Sitaram said, is to leverage Cisco's partners to provide services without having to become a carrier itself -- which is a daunting, heavily regulated proposition in many parts of the world. "We don't want to be in the carrier business, but we do want to provide services through partners."

Those services include expanding Cisco’s Hosted Collaboration Solution to include TelePresence, Customer Collaboration (contact centers), unified communications and mobility. It also means letting large customers install the company's WebEx online Web conferencing solution in their own data centers.

That may not gibe with most people's definition of cloud computing, but according to Sitaram, many customers still demand more control over their services, either because they're in a highly sensitive industry like the military, health care or financial services, or because they're in emerging markets with restrictive regulations and unreliable public infrastructure.

"It's not easy to deliver cloud-based services" to countries like China, India, Russia and South America, Sitaram said, "especially from the United States." Besides, "the cloud isn't just Facebook and Salesforce," Sitaram added. "If you peel the onion, there are just so many nuances."

When Is The Cloud Not The Cloud?

Nuances or not, earlier this month, I noted that Oracle's Larry Ellison Has Some Strange Ideas About Cloud Computing. Cisco's use of "cloud computing" in this context reminded me of Ellison's Oracle Private Cloud oxymoron, but Sitaram said the Private Cloud version of WebEx retained the service's "quasi-multi-tenant" cloud-based architecture and still offered end-users a subscription based experience. Well, if he says so, but putting your stuff in the customer's data center still ain't what I call "cloud."

Ironically, that may be the point. "Some countries and businesses, they will never go to [Software-as-a-Service]-based clouds," Sitaram said.

So how far will cloud computing go? "There's going to be a world of many clouds," Tucker predicted. Most things will go into the cloud, but many may not. "Many companies want to be their own cloud providers to their workers - but using a cloud model with self-service and pay-as-you-go pricing… the consumer could be an employee."

When I tried to pin down Tucker on exactly how far he thought cloud adoption would go, he guessed 60% cloud, 40% on premises. That seems low to me. After all, once the utilities started providing cheap, reliable power, how many customers still wanted to generate their own electricity?

 

Photos of Sitaram and Tucker by Fredric Paul.

Posted via email from Larkland Morley's posterous

The Open Source Cloud is Ready for Hadoop, Projects Say

Two major trends in enterprise computing this year show increasing overlap: big data processing and open source cloud adoption. 

To Hortonworks, the software company behind open source Apache Hadoop, the connection makes sense. Enterprise customers want the ability to spin up a cluster on demand and quickly process massive amounts of data, said Jim Walker, director of product marketing at Hortonworks, in an interview at OSCON in July. The cloud provides this kind of access by its ability to scale and handle computing tasks elastically.

The open source cloud offers the additional benefits of low-cost deployment and extra pluggability you won't get with a proprietary cloud infrastructure.

All three major open source IaaS platforms -- OpenStack, CloudStack and Eucalyptus -- have made much progress this year in testing Hadoop deployments on their stacks. And Eucalyptus is working on full integration with the platform.

Although no formal relationship exists between Hadoop and the open source IaaS platforms now, Hortonworks does see potential for collaboration, given the nature of cloud computing in general, Walker said.

“(Hadoop) could be a piece of any open cloud platform today,” he said.

Here’s what each of the three major platforms had to say recently about their progress with Hadoop on the open cloud.

OpenStack

In the past, deploying Hadoop in a cloud datacenter proved too challenging for business-critical applications, said Somik Behera, a founding core developer of the OpenStack Quantum project at Nicira, which has since been acquired by VMware. Big data applications require guaranteed bandwidth, which was difficult to provide, Behera said.

OpenStack’s Quantum networking project, which was recently integrated into the new Folsom release, offers an Open vSwitch pluggable networking patch to help ensure performance on Hadoop deployments, Behera said. He has elaborated on deploying an Apache Hadoop cluster on OpenStack in an answer on Quora.
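The pluggable-backend idea at the heart of Quantum can be illustrated with a toy sketch (plain Python; the class and method names here are invented for illustration, and Quantum's real plugin API is considerably larger). A plugin exposes a uniform network/port API, and a backend such as Open vSwitch can attach the per-port guarantees a bandwidth-hungry Hadoop cluster needs:

```python
class NetworkPlugin:
    """Minimal stand-in for a Quantum-style pluggable backend."""
    def create_network(self, name): raise NotImplementedError
    def create_port(self, net_id, min_mbps=0): raise NotImplementedError

class FakeOVSPlugin(NetworkPlugin):
    """Toy Open vSwitch backend that records per-port bandwidth floors."""
    def __init__(self):
        self.networks, self.ports = {}, []

    def create_network(self, name):
        net_id = len(self.networks)
        self.networks[net_id] = name
        return net_id

    def create_port(self, net_id, min_mbps=0):
        # A real backend would program switch QoS queues here; this
        # sketch only tracks the requested guarantee.
        port = {"net": net_id, "min_mbps": min_mbps}
        self.ports.append(port)
        return port

# A Hadoop tenant asks for ports with a guaranteed bandwidth floor.
plugin = FakeOVSPlugin()
hadoop_net = plugin.create_network("hadoop-cluster")
datanodes = [plugin.create_port(hadoop_net, min_mbps=500) for _ in range(3)]
print(sum(p["min_mbps"] for p in datanodes))  # aggregate floor reserved
```

The point of the plugin split is that the tenant-facing API stays fixed while the backend (and its ability to honor guarantees) can be swapped out.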

CloudStack

The biggest challenge for deploying Hadoop on CloudStack has been allocation of resources, said Caleb Call, manager of website systems at Overstock.com and a CloudStack contributor, via email.

“In order to crunch the data we need to in our Hadoop cluster, we currently have many bare metal boxes,” Call said.  “Reproducing this same model in the cloud, even being a private cloud, has proven to be tough.”

Though CloudStack is not currently working on a Hadoop integration, the team has built its cloud environment to guarantee performance for Hadoop workloads by building a dedicated resource pool, said Call, who oversees a team of engineers on the CloudStack project’s “Move to the Cloud” initiative.

“We've also built and tuned our compute templates around Hadoop for this cluster so we don't have to throw large amounts of computing power at the problems,” Call said. “Same as you would do for a bare metal system, but now the saved resources are still left in our compute resource pool available to be used by other Hadoop processes.”

Eucalyptus

At Eucalyptus, performance challenges with Hadoop in the cloud have been largely overcome in the past year, said Andy Knosp, VP of Product at the company.

“There’s been some good research that’s shown near-native performance of Hadoop workloads in a virtualized environment,” Knosp said. This has made Hadoop “a perfect use case” for the open cloud.

Amazon Web Services currently offers the Elastic MapReduce (EMR) service, a hosted Hadoop framework that runs on EC2 and S3. Through the company’s partnership with AWS, Eucalyptus is developing a similar offering that will provide simplified deployment of Hadoop on Eucalyptus.
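To make concrete what an EMR-style Hadoop job actually computes, here is the canonical MapReduce word count reduced to an in-process sketch (plain Python, no Hadoop involved; a real job distributes the map and reduce phases across a cluster and shuffles data over the network):

```python
from collections import defaultdict
from itertools import chain

def map_phase(record):
    # Emit (key, 1) pairs, as a streaming mapper would.
    return [(word, 1) for word in record.split()]

def shuffle(pairs):
    # Group values by key; Hadoop does this between map and reduce.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    return key, sum(values)

def word_count(records):
    pairs = chain.from_iterable(map_phase(r) for r in records)
    return dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())

print(word_count(["big data on the cloud", "big cloud"]))
# -> {'big': 2, 'data': 1, 'on': 1, 'the': 1, 'cloud': 2}
```

Services like EMR take care of provisioning the cluster, distributing the input, and running many mappers and reducers in parallel, but the programming model is this simple map/shuffle/reduce pipeline.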

Customers can run Hadoop on the Eucalyptus private cloud platform as-is – no plugins required, Knosp said. But the company also has a team working on integrating Hadoop with the platform for simplified deployment.

“We want to make it as simple as possible for our community and partners to deploy,” Knosp said. “It improves time to market for Hadoop applications.”

 

Posted via email from Larkland Morley's posterous