Saturday, December 15, 2012

Is Cloud Computing Killing Open Source Software?

The best thing about open source software has always been that it is freely available: any programmer or company can take it and develop their own version. For a long time it has been the solution of choice for people willing to go outside the box to get the best results in their IT departments. Of course, these projects have never been without revenue, which has traditionally come from two sources, and both are now becoming obsolete because of the emergence of cloud computing and the affordability of its components.

So far, open source projects have made money by selling license agreements. A company could take a system like MySQL, incorporate it into its own product, and then either adopt the open source license or buy a commercial license from the vendor, MySQL in this case.

However, because the cloud does not actually sell software but only time on that software, companies like Amazon, which built Amazon RDS on top of MySQL, do not have to pay any license fee. End users get exactly what they need and are willing to pay for it, while cloud service providers like Amazon pay nothing in licensing.
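The client's view makes the point concrete: whether MySQL runs on a server you host (and license) yourself or inside Amazon RDS, the application code is identical, so the provider can sell hours on the software without ever redistributing it. Below is a minimal sketch using the PyMySQL driver; the hostnames, credentials and database names are placeholders invented for illustration.

```python
import pymysql

# Connecting to a self-hosted MySQL server...
self_hosted = pymysql.connect(
    host="db.example.internal",   # hypothetical on-premises server
    user="app", password="secret", database="shop",
)

# ...is indistinguishable from connecting to Amazon RDS, where the
# same MySQL engine is rented by the hour instead of licensed.
rds = pymysql.connect(
    host="shop.abc123.us-east-1.rds.amazonaws.com",  # hypothetical RDS endpoint
    user="app", password="secret", database="shop",
)

for conn in (self_hosted, rds):
    with conn.cursor() as cur:
        cur.execute("SELECT VERSION()")  # identical SQL against either backend
        print(cur.fetchone())
    conn.close()
```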

The second stream of income for big open source companies comes from services: because their software is modified and resold, the process creates a need for specialists, and those specialists are supplied by the original company, such as MySQL or Red Hat. But if the company using the software generates enough revenue from it, it can afford to hire its own specialists. And since its product is not resold as such but only accessed by third parties on its own servers, nobody else is left who would need those services.

However, the world of open source software does not end with MySQL, and even MySQL has alternative sources of funding. For one, the specialists hired by Amazon still need to be trained, tested and certified by a recognized authority, which ultimately routes back to Oracle, the current owner of MySQL. The same is true for any open source project.

Also, the entire Linux platform is what currently underpins Android, and as long as that exists there is little chance of the open source concept itself going out of date. Android is itself an open source software system that projects like CyanogenMod have taken up and developed further.

So ultimately the cloud cannot kill the open source concept, because the cloud is itself built on open source platforms. The game has gotten tougher for many open source companies, but they are already fighting back by putting in place new licensing schemes like the Affero GPL.

By Luchi Gabriel Manescu

(Disclaimer: CloudTweaks publishes news and opinion articles from different contributors. All views and opinions in these articles belong entirely to our contributors. They do not reflect or represent in any way the personal or professional opinions of CloudTweaks.com or those of its staff.)

This is a very interesting discussion: if you can get all the services in the cloud, why bother with open source software? You can just leverage what is already in the cloud without worrying about maintenance and development work.

Posted via email from Larkland Morley's posterous

Sunday, December 2, 2012

The Northbound API Is the Key to OpenFlow’s Success

David Lenrow says:

The value of the SDN architectural approach (which is what SDN is; it isn’t a network solution and doesn’t do anything in and of itself, but rather lends itself to building solutions with a global network view and more abstracted APIs than the device or flow-table model) and of controllers with their associated NBI is that it completely abstracts the details of which southbound API is used to talk to the network devices. A controller based on the SDN architectural approach may or may not speak OpenFlow, and the answer to that question is a solid “don’t care” from the Orchestration and Cloud OS layer talking to the NBI in the cloud stack. The power of SDN is that a controller can expose a network abstraction while the details of the device-level implementation are completely hidden.

I completely agree that developing, sharing, and eventually standardizing the NBI is important and has the ability to be a game changer, but this is completely orthogonal to whether OpenFlow is the only, or even a good, southbound protocol to control some or all of the forwarding behaviors in a network.

The ONF initially made the horrible mistake of positioning SDN as the tail and OpenFlow as the dog when they launched. Now that the interesting conversation in the industry is about the NBI, the ONF is at risk of becoming even more irrelevant in the future, because they don’t appear to understand that the NBI is the key to integrating virtual networking with the on-its-way-to-ubiquity cloud movement. The most innovative and important data center SDN solutions are being built without the not-yet-ready-to-control-anything-but-the-forwarding-table OpenFlow protocol, and the ONF needs to have jurisdiction over the interesting decisions for the industry or become super-irrelevant as the flow-table-wire-protocol foundation. The NBI is really important, but that has almost nothing to do with OpenFlow and whether it will ever be a comprehensive protocol for controlling network devices.

I have always believed this to be the only value of SDN as a whole, meaning the ability to configure complex protocols across multiple devices. OpenFlow as it is today provides the transport, but you will need a lot more implementation at the controller level to make this attractive long term.
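To make the abstraction argument concrete, here is a minimal, purely hypothetical sketch of the idea behind a northbound interface: the application asks a controller for a path between two hosts, and whether the controller speaks OpenFlow or some other southbound protocol to the switches is invisible to the caller. Every class and method name below is invented for illustration and is not taken from any real controller.

```python
from abc import ABC, abstractmethod


class SouthboundDriver(ABC):
    """Hypothetical device-facing driver; the NBI never exposes this."""

    @abstractmethod
    def install_flow(self, switch_id: str, match: dict, out_port: int) -> None: ...


class OpenFlowDriver(SouthboundDriver):
    def install_flow(self, switch_id: str, match: dict, out_port: int) -> None:
        # In a real controller this would emit an OpenFlow FLOW_MOD message.
        print(f"OF flow_mod on {switch_id}: {match} -> port {out_port}")


class NetconfDriver(SouthboundDriver):
    def install_flow(self, switch_id: str, match: dict, out_port: int) -> None:
        # Same intent, entirely different wire protocol.
        print(f"NETCONF edit-config on {switch_id}: {match} -> port {out_port}")


class Controller:
    """The northbound interface: callers think in paths, not flow tables."""

    def __init__(self, driver: SouthboundDriver):
        self._driver = driver  # swappable; the NBI contract does not change

    def create_path(self, hops: list[str], src_ip: str, dst_ip: str) -> None:
        # Compile the abstract request down to per-device flow entries.
        for switch in hops:
            self._driver.install_flow(switch, {"ipv4_dst": dst_ip}, out_port=1)


# An orchestration layer calls the same NBI regardless of the southbound choice.
for driver in (OpenFlowDriver(), NetconfDriver()):
    Controller(driver).create_path(["s1", "s2"], "10.0.0.1", "10.0.0.2")
```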

Posted via email from Larkland Morley's posterous

Breaking News: SDN Consolidation Continues, Cisco to Acquire Cariden for $141M

Cisco to acquire Cariden

This morning Cisco announced plans to acquire Cariden to enhance its Service Provider software-defined networking (SDN) solutions.  Surprisingly, this hasn’t been positioned as an SDN play, though from our direct experience with the company, they fit the definition of SDN and have real customers and revenue to prove it.

Equally impressive, the Cariden team built the company through blood, sweat and tears, bypassing traditional venture capital financing, giving budding entrepreneurs and folks considering joining an SDN startup inspiration to think differently.

I worked with Shailesh Shukla, the executive responsible for the Cariden acquisition, back at Juniper Networks. He’s a smart executive, and he made a great purchase.

Cariden is an example of a network application that can drive adoption of SDN technologies.  For example, as Big Switch announced during their product launch, Cariden is integrated with Floodlight.
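For a sense of what integrating with Floodlight looks like in practice: its northbound interface is exposed over REST, and applications push flow entries to it as JSON over HTTP. The sketch below is illustrative only; the module path shown (the static flow pusher at /wm/staticflowpusher/json) and its field names vary between Floodlight releases, and the controller address and switch DPID are placeholders.

```python
import requests

# Hypothetical Floodlight controller address; the REST port defaults to 8080.
CONTROLLER = "http://127.0.0.1:8080"

# A static flow entry expressed as JSON, the lingua franca of Floodlight's NBI.
# Field names follow the static flow pusher module and differ across releases.
flow = {
    "switch": "00:00:00:00:00:00:00:01",  # placeholder switch DPID
    "name": "demo-flow-1",
    "priority": "32768",
    "in_port": "1",
    "active": "true",
    "actions": "output=2",
}

resp = requests.post(f"{CONTROLLER}/wm/staticflowpusher/json", json=flow)
print(resp.status_code, resp.text)
```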

We believe this is the start of Cisco acquiring network applications that can eventually be integrated with Cisco ONE.  Expect to see more networking application acquisitions from Cisco in the near future.  We also expect a continued shift toward Cisco and others increasingly acquiring software companies that have bypassed traditional venture capital financing.

Congrats Arman and team!

Cisco Press Release Below.

Cisco Announces Intent to Acquire Cariden
Acquisition Further Strengthens Cisco’s Ability to Lead the Evolution in Service Provider Networking
SAN JOSE, Calif. – Nov. 29, 2012 – Cisco today announced its intent to acquire privately held Cariden Technologies, Inc., a Sunnyvale, Calif.-based supplier of network planning, design and traffic management solutions for telecommunications service providers. With global service providers converging their Internet Protocol (IP) and optical networks to address exploding Internet and mobile traffic growth and complex traffic patterns, Cisco’s acquisition of Cariden will allow providers to enhance the visibility, programmability and efficiency of their converged networks, while improving service velocity.

Cariden’s industry-leading capacity planning and management tools for IP/MPLS (Multi-Protocol Label Switching) networks, which have been deployed by many of the world’s leading fixed and mobile network operators, will be integrated into Cisco’s Service Provider Networking Group to enable multilayer modeling and optimization of optical transport and IP/MPLS networks. Cariden’s products and technology will advance Cisco’s nLight technology for IP and optical convergence. The acquisition also supports the company’s Open Network Environment (ONE) strategy by providing sophisticated wide area networking (WAN) orchestration capabilities. These capabilities will allow service providers to improve both the programmability of their networks and the utilization of existing network assets across the IP and optical transport layers.

“The Cariden acquisition reinforces Cisco’s commitment to offering service providers the technologies they need to optimize and monetize their networks, and ultimately grow their businesses,” said Surya Panditi, senior vice president and general manager, Cisco’s Service Provider Networking Group. “Given the widespread convergence of IP and optical networks, Cariden’s technology will help carriers more efficiently manage bandwidth, network traffic and intelligence. This acquisition signals the next phase in Cisco’s packet and optical convergence strategy and further strengthens our ability to lead this market transition in networking.”

The acquisition of Cariden exemplifies Cisco’s build, buy, and partner innovation framework and is aligned to Cisco’s strategic goals to develop and deliver innovative networking technologies and provide best-in-class solutions for customers, all while attracting and cultivating top talent.

Upon the close of the acquisition, Cariden employees will be integrated into Cisco’s Service Provider Networking Group, reporting to Shailesh Shukla, vice president and general manager of the company’s Software and Applications Group. Under the terms of the agreement, Cisco will pay approximately $141 million in cash and retention-based incentives in exchange for all shares of Cariden. The acquisition is subject to various standard closing conditions and is expected to be completed in the second quarter of Cisco’s fiscal year 2013.

About Cisco

Cisco (NASDAQ: CSCO) is the worldwide leader in networking that transforms how people connect, communicate and collaborate. Information about Cisco can be found at http://www.cisco.com. For ongoing news, please go to http://newsroom.cisco.com.

About the Author

Matt has 20+ years of software-defined networking (SDN), cloud computing, SaaS, & computer networking…

Another twist in the SDN story...

Posted via email from Larkland Morley's posterous

Cloud Computing and Big Data Intersect at NIST, January 15-17

Two major new technologies come together for the Cloud Computing and Big Data Workshop, hosted by the National Institute of Standards and Technology (NIST) at its Gaithersburg, Md., campus Jan. 15-17, 2013.

nebula N76
Combining cloud computing and big data could hasten valuable scientific discoveries in many areas including astronomy. (NASA image of nebula N76 in a bright, star-forming region of the Small Magellanic Cloud.)
Credit: NASA

Cloud computing* offers on-demand access to a shared pool of configurable resources; big data explores large and complex pools of information and requires novel approaches to meet the associated computing and storage requirements. The workshop will focus on the intersection of the two: the meeting is part of the traditional semi-annual cloud computing forum and workshop series, with the additional dimension of big data and its relation to and influence on cloud platforms and cloud computing.

"Cloud computing and big data are each powerful trends. Together they can be even more powerful and that's why we're hosting this workshop," said Chuck Romine, director of the NIST Information Technology Laboratory. "The cloud can make big data accessible to those who can't take advantage today. In turn, big data opens doors to discovery, innovation, and entrepreneurship that are inaccessible at conventional data scales."

The January conference will bring together leaders and innovators from industry, academia and government in an interactive format that combines keynote presentations, panel discussions, interactive breakout sessions and open discussion. Patrick Gallagher, Under Secretary of Commerce for Standards and Technology and NIST director, and Steven VanRoekel, the Chief Information Officer of the United States, will open the conference.

The first day's morning panels examine the convergence of cloud and big data, progress on the U.S. Government Cloud Computing Roadmap and international cloud computing standards.

Two afternoon sessions focus on progress made on the Priority Action Plans (PAPs) associated with the 10 requirements described in the first release of the USG Cloud Computing Technology Roadmap, Volume I (NIST SP 500-293).** Each requirement has associated PAPs related to interoperability, portability and security. The meetings will showcase the voluntary, independent, cloud-related efforts on diverse PAPs underway by industry, academia and standards-developing organizations.

The second day of the workshop explores the unprecedented challenges posed by big data on storage, integration, analysis and visualization—demands that many cloud innovators are working to meet today. The workshop will explore possibilities for harmonizing cloud and big data measurement, benchmarking and standards in ways that bring the power of these two approaches together to facilitate innovation. Day three offers workshops on exploring the formation of new working groups at the intersection of cloud and big data, kicking off a Big Data Research Roadmap, discussing international cloud computing standards progress, and hearing the status of the USG Cloud Computing Technology Roadmap Volume III. Special topic briefings will be offered during lunch times.

For more information on the meeting or to register, go to www.nist.gov/itl/cloud/cloudbdworkshop.cfm.

* For the NIST definition of cloud computing, see http://csrc.nist.gov/publications/nistpubs/800-145/SP800-145.pdf
** USG Cloud Computing Technology Roadmap, Volume I (NIST SP 500-293) is available at www.nist.gov/itl/cloud/upload/SP_500_293_volumeI-2.pdf

Interesting read about cloud computing and big data. They are two of the most interesting fields in computing right now.

Posted via email from Larkland Morley's posterous