Monday, May 27, 2013

Enterprise Startups SDN Market Reaches $4 Billion


Software-defined networking (SDN) changes the way companies build their IT networks. Instead of buying expensive routers and switches with a lot of fancy features from the likes of Cisco, companies can buy simpler, cheaper hardware, and less of it, and those fancy features are handled by a new layer of software.
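That split between cheap hardware and a smart software layer can be sketched with a toy model (the class and method names here are illustrative, not a real OpenFlow API): a centralized controller pushes simple match/action rules down to "dumb" switches, which do nothing but table lookups.

```python
# Toy model of SDN's split between control plane and data plane.
# All names are illustrative; this is not a real OpenFlow API.

class Switch:
    """A 'dumb' switch: it only looks up match/action rules."""
    def __init__(self):
        self.flow_table = {}  # destination address -> output port

    def install_rule(self, dst, out_port):
        self.flow_table[dst] = out_port

    def forward(self, dst):
        # Packets with no matching rule would normally be punted
        # up to the controller for a decision.
        return self.flow_table.get(dst, "send-to-controller")

class Controller:
    """The 'smart' layer: centralized logic that programs every switch."""
    def __init__(self, switches):
        self.switches = switches

    def push_route(self, dst, out_port):
        for sw in self.switches:
            sw.install_rule(dst, out_port)

sw1, sw2 = Switch(), Switch()
ctrl = Controller([sw1, sw2])
ctrl.push_route("10.0.0.5", out_port=3)

print(sw1.forward("10.0.0.5"))   # 3
print(sw2.forward("10.0.0.99"))  # send-to-controller
```

The point of the sketch is that all the "fancy features" live in the Controller class; the Switch stays simple and cheap.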

SDN technology is still in its infancy, but it’s already created some billion-dollar success stories, capturing the imagination of Silicon Valley’s savviest investors and inventors. More than two years after the establishment of the Open Networking Foundation (ONF), backed by big names like Google and Facebook and responsible for the OpenFlow protocol, SDN is about to reach the mainstream. Most of the new switches from networking vendors like IBM, HP, Arista, Dell, Brocade, Huawei, Juniper, NEC, Extreme and Pronto are OpenFlow compatible.

OpenDaylight is hosted by The Linux Foundation, the nonprofit organization dedicated to open source development and technology. The project has broad support from major players in the networking market such as Cisco, Brocade, Ericsson, IBM, Juniper Networks, Microsoft, Citrix, NEC, Big Switch Networks, Red Hat and VMware, each of which is contributing software and engineering resources to the open source framework. For example, Cisco is delivering controller technology, IBM an open source version of its Distributed Overlay Virtual Ethernet technology, while Juniper is contributing XMPP client and server protocol code.

A few vendors, including Cisco, Juniper Networks, Avaya, Dell, HP, IBM and VMware, have already surfaced as major players. Acquisitions such as VMware’s $1.26 billion purchase of Nicira last summer and Juniper Networks’ $176 million purchase of Contrail Systems have grabbed much of the attention in this emerging field, and a whole crop of startups is now ready to take on market leader Cisco for a share of an estimated $4 billion SDN market. Startup Glue Networks, for instance, is targeting Cisco’s installed base of WAN routers as a sweet spot for its SDN WAN offerings.

Through its acquisition of vCider, Cisco takes SDN to the next level, one that sees players placing programmable cloud networks at the heart of multi-tenant, large-scale cloud environments. With vCider, Cisco intends to accelerate both its network virtualization and cloud strategies.

F5 Networks recently acquired LineRate Systems, an SDN software specialist. LineRate’s model separates the network’s physical layer from a software-managed control layer, with a layer of abstraction for controlling the network.

The acquisition of Vyatta complements Brocade’s R&D investments in SDN and allows it to open up markets for data center virtualization, public and private cloud, and managed services. Vyatta develops an on-demand network OS providing routing, security and VPN features for virtual and physical networks and cloud environments.

HP has also announced the launch next year of its own OpenFlow software controller, called the Virtual Application Networks SDN Controller (VAN SDN Controller). HP is not the only major manufacturer interested in OpenFlow: IBM has likewise announced its own OpenFlow controller, called the IBM Programmable Network Controller.

To strengthen its grip on cloud computing, Oracle recently announced that it has signed an agreement to buy Xsigo, a vendor of a monitoring solution for heterogeneous networks. Xsigo’s solutions are based on the concept of SDN; its technology can create virtual pools of network capacity, allowing resources to be delivered dynamically based on computing needs.

For most other enterprises, the initial focus of SDN will be in their datacenter to assess the impact on network reliability. It will take a few years for SDN applications and best practices to become significant IT resources.

Interesting level of investment in the SDN market space

Posted via email from Larkland Morley's posterous

Sunday, April 14, 2013

Cisco to Acquire Ubiquisys for $310M

Cisco (NASDAQ: CSCO) is to acquire venture-backed Ubiquisys, a Swindon, UK-based provider of intelligent 3G and long-term evolution (LTE) small-cell technologies that provide connectivity across mobile heterogeneous networks for service providers, for approximately $310m.

The acquisition will allow Cisco to add Ubiquisys’ indoor small-cell expertise and focus on intelligent software for licensed 3G and LTE spectrum to its mobility portfolio and Wi-Fi expertise and provide a comprehensive small-cell solution for service providers that supports the transition to next-generation radio access networks. The deal also complements Cisco’s mobility strategy along with the recent purchases of BroadHop and Intucell, reinforcing in-house R&D such as service provider Wi-Fi and licensed radio.

The acquisition is expected to close in the fourth quarter of Cisco’s fiscal year 2013, subject to customary closing conditions. Upon close, Ubiquisys employees will be integrated into the Cisco Mobility Business Group, reporting to Partho Mishra, vice president and general manager, Service Provider Small Cell Technology Group.
In addition to the cash consideration, Cisco will also pay retention-based incentives to acquire the entire business and operations of Ubiquisys.

FinSMEs

07/04/2013

Related News
22/09/2010: Ubiquisys Raises $5M from Three Asian Investors
15/08/2012: Ubiquisys Raises $19M in Funding


Tagged as: Cisco, Ubiquisys


Cloud Computing And Organizational Inertia


Having spent this last week at the Cloud Connect event in Silicon Valley, I have had a number of interesting discussions with people involved with various aspects of cloud computing. While industry analysts such as Gartner and IDC are projecting that 80% of all servers running on native hardware will be virtualized by 2020, many service provider and media representatives I’ve spoken with remain a little skeptical of these projections.

From a technical standpoint, the development effort by industry bellwethers like Cisco, IBM and HP has significantly accelerated the technological development in this space. Based on what I have seen at Cloud Connect and in the course of our daily work, I am confident that the underlying solutions will be ready to hit the mainstream by the end of 2013. Having said that, to meet the growth projections made by the analyst community, finalizing the facilitating technology stacks is not enough.

Over the last couple of months, our team conducted a vendor survey with 102 service providers around the world. Although there is still some work to do as far as technologies are concerned, I think perhaps the most interesting finding was that the biggest obstacle holding service providers back in their cloud and data center automation efforts is organizational inertia, not technology. In retrospect, this is of course logical, because large shifts like this also require changes on the organizational level.

Retail banking is another industry that has gone through a similar change. By now, most of us are used to doing the majority of our banking online, without the need to visit our local branch or to contact a designated representative to complete daily transactions. For banks this has required significant organizational changes involving cut-backs at the individual branches and increases in the size of the workforce managing the technology platforms that make online banking possible. Ultimately, this has made banking a lot more convenient for the average user, while providing the banking institutions with operational efficiencies.

Amidst all the hype, it is easy to forget that cloud computing is really not about technology. Rather, what the cloud promises us all is a new way to consume applications without having to give any thought to the underlying technologies. That translates to cost efficiencies, ease of use and self-service IT, based on process automation. When one thinks about all the companies using IT as an integral part of their operations, it becomes obvious that the impact of cloud computing will affect pretty much all industries, ultimately helping consumers to get more for less.

With an upside of this magnitude, it is no wonder that service providers and IT departments around the world are thinking about ways to benefit from this opportunity. Before embarking on this journey, however, most of these organizations would be well-advised to take a good look at their organizational structures. In traditional computing, various specialist teams have been working in silos, focusing on different areas of computing such as applications, databases, servers and networking. To enable end-to-end automation that spans across all these functions, organizational changes will likely be necessary.

To decide whether or not your own organization is ready for the change, here is a quick check-list:

1. Has the cloud computing initiative been made a strategic priority in the organization?

As people in IT departments always have a number of items on their to-do list, making cloud computing a priority across the organization is key to success. Unless everyone in the organization appreciates the strategic importance, coordinating efforts between different teams will become difficult and the momentum will be lost.

2. Have you named a sponsor who is senior enough to push the initiative through?

Resistance to change is natural. To make sure that the strategic cloud computing initiative is not derailed by politics, the initiative must be headed by a respected senior member of the organization whose judgement other people will trust. The ability to build trust between cross-functional teams is also a great asset.

3. Have you thoroughly assessed the make or buy aspects of cloud computing?

Since cloud computing is all about end-to-end automation and efficiency, a natural reaction within an IT department is to start securing bases by driving the cloud initiative towards home-grown or highly customized technical solutions provided by systems integrators. While this approach can be justified in select service provisioning environments, for most enterprises, spending time and/or resources on developing IT systems in-house, or paying someone else to do that on a customized basis, is a waste of time and money. There are ample cloud computing and automation solutions available that can help your organization meet its requirements, making job security the only real driver for home-grown tweaks.

By Juha Holkkola

Juha is managing director of Nixu Software Oy Ltd, the cloud application deployment company, an affiliate of Nixu. He joined Nixu in early 2000 and has since held various business and sales management positions. Before Nixu, Juha worked for Nokia Networks and financial services company Sampo Group in various marketing and treasury positions.


(Disclaimer: CloudTweaks publishes news and opinion articles from different contributors. All views and opinions in these articles belong entirely to our contributors. They do not reflect or represent in any way the personal or professional opinions of CloudTweaks.com or those of its staff.)

Tagged as: analyst community, application deployment, Automation Solution, automation solutions, Cloud Application, Cloud Computing, enterprise, growth projections, IDC, IT department, job security, organizational inertia, Platform, Security, technical standpoint, technology platform, technology platforms

Interesting discussion


Top Cloud Computing Deployment Models

Many people are becoming curious, with the increasing popularity of cloud topology, as to what cloud computing deployment models exist, and which ones are popular. While cloud is a big buzzword right now, a lot of people are kind of mystified in regards to what it really is. That’s ok, that’s what I’m here for.

So, today, I’m going to clarify once more, for those new here, what cloud computing is, then go over some cloud computing deployment models which are popular, and maybe talk a little bit about how they work.

First off, cloud just means that it exists off location, and is being stored, processed and/or served by an outside machine or machines. This is usually utilized so that software or processes can be controlled from devices too weak to actually perform them, but which clusters of servers, or one really strong server, can execute easily. It is an effective way of putting supercomputing into Joe Everyguy’s hands without a lot of fuss.

It’s also pursued as a safe and secure backup and storage system, as well as used for communications and cooperative software use and work performance online. So it’s pretty much anything web-sourced or web-powered that’s not on the local network.

So, what are the popular deployment models for cloud computing? In general, there are three models in standard use, so let’s take a look at what they are.

The simplest model is just a typical webhost scenario, where you use a single dedicated server or host space on a server, and use limited enhanced computing and cloud services from this single source. It is the most affordable and easy to work with.

The second is the grid, which utilizes a series of servers, usually in the same datacenter, linked together to form a great, herculean computing force for all users logged into it. It’s the most expensive model, due to its specificity, but technically it is not the most powerful one, which is …

Cluster computing. Cluster computing maps a global range of available servers regardless of location. It links them together to form nodes of shared supercomputing power, often based on distance from the user. If any node fails, the next nearest takes its place in an almost cellular or neural configuration. This system is prone to some faults, but is the most dynamic despite being only middle of the road for pricing.

Algorithms for this model need to be refined and protocols adjusted, but what system doesn’t need that starting out?
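The nearest-node-with-failover behavior described above can be sketched in a few lines. This is a deliberate simplification with hypothetical node names and distances, not any real cluster framework: each request goes to the nearest healthy node, and when that node fails, the next nearest takes its place.

```python
# Illustrative sketch of nearest-node selection with failover.
# Node names and distances are hypothetical.

def pick_node(nodes, healthy):
    """Return the nearest node whose name is in the healthy set, or None."""
    candidates = [n for n in nodes if n["name"] in healthy]
    if not candidates:
        return None
    return min(candidates, key=lambda n: n["distance"])

nodes = [
    {"name": "node-a", "distance": 10},
    {"name": "node-b", "distance": 25},
    {"name": "node-c", "distance": 40},
]

healthy = {"node-a", "node-b", "node-c"}
print(pick_node(nodes, healthy)["name"])  # node-a

healthy.discard("node-a")                 # node-a fails...
print(pick_node(nodes, healthy)["name"])  # node-b takes its place
```

A real system would of course add health checks, replication and consistency handling on top of this, but the core selection logic really is this simple.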

If you’re interested in cloud computing deployment models, there exist very long PDFs that go into the deep, scientific details of the different architectures, including ones less commonly used than those I mention here. They also have, gasp, diagrams, which I cannot provide for you.

Cloud computing has some challenges ahead of it in being accepted and made completely practical for its intended range of purposes, but one day, it will take the place of traditional computing in many arenas, so we may as well embrace it now, and understand it to the fullest. Tomorrow waits for nobody, but nobody said we can’t get there early, right?

Published at DZone with permission of Omri Erel, author and DZone MVB. (source)

(Note: Opinions expressed in this article and its replies are the opinions of their respective authors and not those of DZone, Inc.)

Great discussion on cloud computing Models for deployment


Thursday, February 7, 2013

5 Cloud Computing Advantages For The Healthcare Industry


Cloud computing technologies are on the rise in the healthcare industry. Although their adoption is held back by regulatory initiatives and security concerns, the cloud computing market in healthcare is expected to grow to $5.4 billion by 2017. This is the conclusion of the “Healthcare Cloud Computing (Clinical, EMR, SaaS, Private, Public, Hybrid) Market – Global Trends, Challenges, Opportunities & Forecasts (2012 – 2017)” report published by research firm MarketsandMarkets.

The health care environment is changing faster than ever before due to the demand to deliver higher quality medical services for less money and increased competition between health care service providers. Hospitals, research clinics, private health care institutions and doctors are looking for solutions to increase the efficiency of their daily activities and decrease their spending.

Cloud computing technologies, if implemented and used appropriately, can answer all these requirements. Cloud computing gives the health care environment the opportunity to improve services for patients, to easily share information, to improve operational efficiency, and to streamline costs.

What advantages does cloud computing offer for the health care industry?

1.    Collaboration. In many cases specific information may be needed in two places, by different health services providers at the same time. Through cloud technologies, the information is synchronized and shared in real time.

2.    Speed. Cloud-based tools can upgrade and improve their features faster, less expensively and with minimal or no service interruption. Plus, cloud services enable faster access to important information for health services providers and their patients.

3.    Mobility. Each mobile app is backed up by a cloud infrastructure. By storing data and computing power in the cloud, health care services providers enable their staff to have access to information anywhere and anytime.

4.    Security and privacy. Cloud services providers are required to comply with many privacy standards such as HIPAA (Health Insurance Portability and Accountability Act).  Today there are several managed cloud providers offering HIPAA compliance.

5.    Decreased costs. There is no need for the health care institution and doctors to invest in hardware infrastructure and maintenance because these concerns are already taken care of by the cloud computing providers.

Although cloud computing offers significant advantages to the health care industry, it is still perceived by some as unsafe. The most common concerns are those that make any other business, from any industry, reluctant to adopt cloud technologies: security and confidentiality of patient information, interoperability, and compliance with government regulations. These can all be overcome by doing your due diligence and selecting the right partners.

By Rick Blaisdell / RicksCloud


Tagged as: government regulations, health care industry, health care institutions, health insurance portability, health insurance portability and accountability, Health Insurance Portability and Accountability Act, medical services, patient information, research clinics


IBM Simplifies Big Data, Cloud Computing

Making it easier for organizations to quickly adopt and deploy big data and cloud computing solutions, IBM (NYSE: IBM) today announced major advances to its PureSystems family of expert integrated systems.

Now, organizations challenged by limited IT skills and resources can quickly comb through massive volumes of data and uncover critical trends that can dramatically impact their business. The new PureSystems models also help to remove the complexity of developing cloud-based services by making it easier to provision, deploy and manage a secure cloud environment. Together, these moves by IBM further extend its leadership in big data and next generation computing environments such as cloud computing, while opening up new opportunities within growth markets and with organizations such as managed service providers (MSPs).

Across all industries and geographies, organizations of various sizes are being challenged to find simpler and faster ways to analyze massive amounts of data and better meet client needs. According to IDC, the market for big data technology and services will reach $16.9 billion by 2015, up from $3.2 billion in 2010. At the same time, an IBM study found that almost three-fourths of leaders surveyed indicated their companies had piloted, adopted or substantially implemented cloud in their organizations -- and 90 percent expect to have done so in three years. While the demand is high, many organizations do not have the resources or skills to embrace it.

Today's news includes PureData System for Analytics to capitalize on big data opportunities; a smaller PureApplication System to accelerate cloud deployments for a broader range of organizations; PureApplication System on POWER7+ to ease management of transaction and analytics applications in the cloud; additional options for MSPs across the PureSystems family including flexible financing options and specific MSP Editions to support new services models; and SmartCloud Desktop Infrastructure to ease management of virtual desktop solutions.

New Systems Tuned for Big Data

The new IBM PureData System for Analytics, powered by Netezza technology, features 50 percent greater data capacity per rack and is able to crunch data 3x faster, making this system a top performer, while also addressing the challenges of big data. The IBM PureData System for Analytics is designed to assist organizations with managing more data while maintaining efficiency in the data center -- a major concern for clients of all sizes.

With IBM PureData System for Analytics, physicians can analyze patient information faster and retailers can better gain insight into customer behavior. The New York Stock Exchange (NYSE) relies on PureData System for Analytics to handle an enormous volume of data in its trading systems and identify and investigate trading anomalies faster and easier.

"NYSE needs to store and analyze seven years of historical data and be able to search through approximately one terabyte of data per day, which amounts to hundreds of terabytes in total," said Emile Werr, head of product development, NYSE Big Data Group and global head of Enterprise Data Architecture and Identity Access Management for NYSE Euronext. "The PureData System for Analytics powered by Netezza system provides the scalability, simplicity and performance critical in being able to analyze our big data to deliver results eight hours faster than on the previous solution, which in our world is a game changer when you look at the impact on businesses every second that passes."

The Nielsen Company, a leading global information and measurement company, provides clients with a comprehensive understanding of consumers and their behavior, leveraging Netezza technology to deliver complex analytic capabilities.

"Recently, Nielsen tested two major competitors with their latest products to tackle our highly complex analytic workload," said John Naduvathusseril, chief data architect, the Nielsen Company. "Both vendors did not match up on consistent performance, simplicity, data refresh speed and overall performance of our reporting needs. Other vendors require customization, which we cannot sustain and they still did not deliver the kind of performance as the PureData System for Analytics."

The IBM PureData System for Analytics is powered by Netezza technology. It is a strategic part of the IBM Big Data Platform, an integrated architecture that is intended to help organizations achieve Smarter Analytics by leveraging workload optimized systems that work together to tackle advanced analytics.

IBM Makes Cloud Simpler

By simplifying and accelerating cloud deployment platforms, organizations of all sizes and across geographies can increase business agility, minimize business risk and speed time-to-revenue.

One segment of the market in particular that will benefit from today's announcement is MSPs. MSPs are helping midmarket companies solve complex challenges. However, to grow their businesses, MSPs need to expand their IT infrastructure and service delivery capabilities while minimizing the disruption and risk that often comes with growth. Today, IBM brings to market new offerings designed specifically for MSPs including new MSP Editions, flexible "pay as you grow" financing options, and enhanced marketing and sales support. These offerings are all designed to help MSPs deliver a robust cloud infrastructure that will enable them to drive new revenue streams.

Overall, IBM is introducing new cloud options tailored for the data center that allow businesses of all sizes to free up time and money to focus on innovation. These offerings include:

PureSystems is part of IBM SmartCloud offerings, a portfolio of enterprise-class cloud computing technologies and services built on open standards that provides flexible deployment options including PureSystems and IBM SmartCloud services. IBM helps clients build private clouds with IBM SmartCloud Foundation, which provides a common cloud operating environment across the different deployment options. IBM SmartCloud helps clients quickly build and scale private clouds and hybrid clouds for cloud capabilities such as Infrastructure-as-a-Service and Platform-as-a-Service.

PureApplication System also continues to gain momentum with the independent software vendor (ISV) community. IBM works with 275 ISVs to offer more than 325 applications across 21 industries that are validated "Ready for PureSystems."

"IBM PureApplication System with the POWER7+ architecture offers a greater level of stability and flexibility for our shared customers. Most importantly, we expect to offer an even lower cost of ownership to our customers by optimizing the Manhattan Supply Chain Process Platform with the IBM PureApplication System," said David Landau, vice president product management, Manhattan Associates.


Saturday, December 15, 2012

Is Cloud Computing Killing Open Source Software


The best thing about open source software systems has always been the fact that they are freely available and any programmer or company can use them to develop its own version of that software. For the longest time they have been the best solution for people willing to go outside the box in order to get the best results in their respective IT departments. Of course, these systems have never been without profit, and it came from two sources that are now becoming obsolete because of the emergence of cloud computing and the affordability of most of its components.

The way open source software systems have worked so far has been through selling license agreements. Any company could take a software system like MySQL, incorporate it in their own product, and then choose between getting an open source license or buying a commercial license from MySQL, in this case.

However, because the cloud does not actually sell software systems but only time on those systems, companies like Amazon, which has developed its Amazon RDS based on MySQL, do not have to pay any license fee. The end users get exactly what they need and are willing to pay for it, and cloud service providers like Amazon do not need to pay any licensing fees.

There is also a second stream of income for big open source companies, because it is their software that is modified and sold further on. The process creates a need for specialists, and those specialists are delivered by the original company, like MySQL or Red Hat. But if the company that has used the software generates enough revenue from it, it can afford to hire its own specialists. And since its products are not sold on as such but are only accessed by third parties on its own servers, there is nobody else left who would need those services.

However, the world of open source software does not end with MySQL, and even they have alternative sources of funding. For one, even the specialists hired by Amazon need to be trained, tested and licensed by a valid authority, which will always route back to Oracle, the current owner of MySQL. And the same is true for any open source software.

Also, the entire Linux platform is what currently supports Android, and as long as that exists there will be little chance of the actual concept of open source software going out of date. Even the Android system itself is open source software that many projects like CyanogenMod have taken to using and further developing.

So ultimately the cloud cannot take out the open source concept, because it is itself built on open source platforms. The game has gotten tougher for many open source companies, but they are already fighting back by putting in place new licensing systems like the Affero GPL license.

By Luchi Gabriel Manescu


Tagged as: Amazon, android, cloud service providers, IT department, license agreements, Open Source, open source companies, open source license, Programmer, Software, software system, SQL, the cloud

This is a very interesting discussion: if you can get all the services in a cloud, why bother with open source software? You can just leverage what is already in the cloud without worrying about maintenance and development work.
