Red Hat OpenShift Enterprise 2.0

Red Hat has released version 2.0 of OpenShift, its on-premises (private) PaaS. OpenShift appears to build on actual customer experience to address a range of issues that come up in real deployments, providing an out-of-the-box solution likely to appeal to enterprises seeking to offer a consistent development/deployment option to reduce complexity and …

Utopian Disaster Recovery

Recently at Dell World, I was part of a conversation about what utopian disaster recovery would look like and where the industry stands today. But where we are today is transforming, under a new name that encompasses many technologies. We are now using the term “data protection” (DP) to mean much …

Get Off the Hypervisor and Get Into the Cloud

Get off the hypervisor and get into the cloud: in my last couple of posts, I shared my thoughts about the future of cloud computing. In the first post, I described what appears to be a bright outlook for people working in the cloud space, with soaring demand for skilled engineers and not enough quality people to fill those roles. In my second post, I presented a couple of key skill areas that currently seem to be in the most demand. Now I want to share my thoughts, or more to the point, my concern, that this “gap” of skilled engineers is only going to widen unless we can help guide people off the hypervisor and into the cloud.

Cloud Foundry: Life Is Too Short

Pivotal’s public cloud version of Cloud Foundry really struggles with the loose integration of third-party services. To appeal to ISVs and others with real-world complexity in their applications, Pivotal needs to define a coherent product and concentrate on delivering something that works. I tried assiduously to use it and ultimately failed. In case you think …

Network Virtualization: Not Just for the Service Provider

Many network virtualization products appear to be aimed at the top 10,000 customers worldwide, which accounts for both their pricing and their published product direction. While this is a limited and myopic view, many claim it is for the best, reasoning that network virtualization is only really needed by very large networks. The more I think about this approach, the more I believe it is incorrect. Let us be frank here. Most networking today, in organizations of many different sizes, is a hodgepodge of technologies designed to solve the same problem(s) over and over: how to get data quickly from point A to point B with minimal disruption to service.