Archive for category Virtualization

Leslie Muller Chats with Mike Laverick of RTFM Education


Leslie Muller, Founder & CTO of DynamicOps, sat down with Mike Laverick of RTFM Education for a chat about cloud computing, cultural challenges, technical needs and how DynamicOps is transforming the datacenter.

Sit back, relax and join us for some Vendorwag –  http://bit.ly/g28xGz



Part 2: How to Share and Play Well with Others in a Private Cloud


by: Richard Bourdeau, VP Product Marketing, DynamicOps

The common infrastructure. What a blessing. What a curse.

Here is a familiar scenario for you… A well-mannered IT administrator goes to provision resources for a mission-critical application, only to find that said resources have already been consumed by someone in a different group. To make matters worse, the other, less important function is over-provisioned. A handy automated self-service product would have helped this guy out, you say. Not necessarily. Many of today’s typical automation tools simply treat your shared infrastructure as a single pool of resources, with little or no control over who can consume them. And don’t assume that manual approvals in the provisioning process solve this problem: in a large environment, it’s too easy to lose track of who can consume which resources.
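To make that concrete, here is a minimal sketch (not any vendor's actual implementation) of a provisioning check against per-group reservations on a shared pool; the group names and capacity figures are hypothetical.

```python
# Minimal sketch: enforce per-group reservations on a shared pool at
# provisioning time, so one group cannot silently consume capacity set
# aside for another. Group names and numbers are hypothetical.

class GroupReservation:
    def __init__(self, name, cpu_limit, mem_gb_limit):
        self.name = name
        self.cpu_limit = cpu_limit          # vCPUs reserved for this group
        self.mem_gb_limit = mem_gb_limit    # memory (GB) reserved for this group
        self.cpu_used = 0
        self.mem_gb_used = 0

    def can_provision(self, cpu, mem_gb):
        """True only if the request fits inside this group's reservation."""
        return (self.cpu_used + cpu <= self.cpu_limit and
                self.mem_gb_used + mem_gb <= self.mem_gb_limit)

    def provision(self, cpu, mem_gb):
        if not self.can_provision(cpu, mem_gb):
            raise RuntimeError(f"{self.name}: request exceeds group reservation")
        self.cpu_used += cpu
        self.mem_gb_used += mem_gb


# Hypothetical split of one shared cluster between two groups.
mission_critical = GroupReservation("mission-critical-apps", cpu_limit=64, mem_gb_limit=256)
dev_test = GroupReservation("dev-test", cpu_limit=32, mem_gb_limit=128)

dev_test.provision(cpu=8, mem_gb=32)           # fits within dev/test's slice
mission_critical.provision(cpu=16, mem_gb=64)  # unaffected by dev/test usage
```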

It’s this daily occurrence that makes the ability to deliver secure multi-tenancy one of the most important aspects of cloud computing, if not the most important. By allowing multiple groups, or tenants, to share a common physical infrastructure, companies can achieve better resource utilization and improved business agility. By dynamically reallocating resources between groups to address shifting workloads, companies can make more effective use of their limited IT resources.

The challenge is to share in such a way that one group does not have access, or even visibility, to the resources that have been allocated to others. Without a secure method of ensuring multi-tenancy, a cloud computing strategy cannot succeed.
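As one hedged illustration of that visibility requirement, the sketch below filters the inventory a user can see down to their own group's resources; the users, groups, and resource records are all hypothetical.

```python
# Sketch: scope what each user can see to their own group's resources,
# so tenants sharing the same infrastructure never see each other's machines.
# All users, groups, and inventory records are hypothetical.

INVENTORY = [
    {"id": "vm-101", "owner_group": "finance"},
    {"id": "vm-102", "owner_group": "engineering"},
    {"id": "vm-103", "owner_group": "finance"},
]

USER_GROUPS = {"alice": "finance", "bob": "engineering"}

def visible_resources(user):
    """Return only the resources that belong to the user's group."""
    group = USER_GROUPS[user]
    return [r for r in INVENTORY if r["owner_group"] == group]

assert [r["id"] for r in visible_resources("alice")] == ["vm-101", "vm-103"]
assert [r["id"] for r in visible_resources("bob")] == ["vm-102"]
```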

Secure multi-tenancy is one of those buzzwords thrown about by most cloud automation vendors. Sure, many of them can do it. But at what scale? With what level of control, and over how much capacity? Before selecting a vendor, make sure its ability to securely share a common IT infrastructure meets both your current and future needs.

Multiple Grouping Levels
Make sure that your cloud management tool has enough levels of grouping to support both your organizational structure and the service tiers you want to provide to those business groups going forward.

For example: you don’t have to be a large company with multiple divisions, each with many departments, to need multiple levels of grouping. Maybe your company is not that big, but you want to separate desktop operations from server operations from development and test. You may also want to sub-divide the resources allocated to a group into several service tiers (e.g., Tier 1, Tier 2, and Tier 3). Most companies will need a minimum of two to three levels of resource grouping.
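As a rough sketch of what multi-level grouping can look like (illustrative only, with hypothetical names and capacities), the snippet below carves service tiers out of an operational group's allocation, with each level constrained by its parent.

```python
# Sketch of multi-level resource grouping: an operational group
# (e.g. server operations) subdivided into service tiers, where the
# children can never be allocated more than the parent holds.
# All names and capacity figures are hypothetical.

from dataclasses import dataclass, field

@dataclass
class ResourceGroup:
    name: str
    cpu_capacity: int                          # vCPUs allocated to this node
    children: list = field(default_factory=list)

    def add_child(self, child):
        allocated = sum(c.cpu_capacity for c in self.children)
        if allocated + child.cpu_capacity > self.cpu_capacity:
            raise ValueError(f"{self.name}: children would exceed parent capacity")
        self.children.append(child)
        return child


# Level 1: an operational group carved out of the shared infrastructure.
server_ops = ResourceGroup("server-operations", cpu_capacity=128)

# Level 2: service tiers within that group.
server_ops.add_child(ResourceGroup("tier-1", cpu_capacity=64))
server_ops.add_child(ResourceGroup("tier-2", cpu_capacity=40))
server_ops.add_child(ResourceGroup("tier-3", cpu_capacity=24))
```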

Think Strategically, Act Tactically
Most companies start their private cloud deployments with a single group or in a lab. This is certainly a viable strategy for gaining experience with new technologies and processes before expanding deployment to multiple groups. The mistake many companies make is selecting a cloud automation platform that only supports the requirements of that initial group. One of our customers has been so successful with their initial deployment that they not only expanded it to other groups within the company, but are in the process of extending it to other divisions, creating a community cloud across multiple businesses of this large multinational company. And the process is going smoothly for them because they knew to anticipate future needs and maximize their technology investment.

As you look to implement a cloud infrastructure, remember the story of our well-mannered IT administrator: it can happen in the cloud too. The trick is knowing how to avoid it.

Go in knowing these things about your business:

  • What do we need now?
  • What will we need in the future?
  • Can the technology support the transition in scale?
  • What provisions are made to protect allocated resources in shared pools?
  • Ask, and ask again: will it scale?

Now on to governance and control – who can have what, and how much. It can be easier and more effective than you think. Stay tuned!

In the meantime tell us how you maintain secure multi-tenancy. How do you do it?



VMBlog: 2011 Virtualization and Cloud Predictions


by Leslie Muller, CTO & Founder, DynamicOps

Recently published to VMBlog for their 2011 Virtualization and Cloud Predictions Series.

In the past few years we have had a unique position in the market that has allowed us to see the future datacenter from different angles. The march toward that vision continues, and we all know adoption of cloud technologies will keep accelerating. However, as clouds get bigger and users look for the most efficient and beneficial route to deployment, the key to success lies in the details: integration, automation, and managing scale and complexity while delivering a consumer experience.

1. Virtual Desktop “pilots” will start scaling into large production deployments – management automation is the key enabler
I predict we will see more mega-scale VDI deployments, with sizes in the hundreds of thousands of VMs and beyond. Having said that, processes that worked fine with a few hundred machines quickly break down as companies scale deployments to thousands or tens of thousands of machines. These implementations go smoothly during the early phases, when you can standardize on a single desktop deployment and have a limited catalogue to provision, reconfigure, and decommission. But as varying desktop types, provisioning methodologies and solution components are added, the struggle to keep up with management without blowing the operational budget will stall many projects. Management automation will bubble to the top as a hot button as processes are evaluated and re-addressed to meet the increased demands of scale and real-world complexity.
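To illustrate one small piece of that automation (purely as a sketch, with a hypothetical catalogue and lease lengths), the snippet below shows lease-based decommissioning, the kind of policy that keeps a large VDI estate from having to be cleaned up by hand.

```python
# Sketch: automated desktop lifecycle with expiry-driven decommissioning.
# Catalogue entries, lease lengths, and users are hypothetical.

from datetime import datetime, timedelta, timezone

CATALOGUE = {
    "task-worker": {"cpu": 1, "mem_gb": 2, "lease_days": 90},
    "developer":   {"cpu": 4, "mem_gb": 8, "lease_days": 30},
}

desktops = []  # in-memory stand-in for an inventory database

def provision(user, desktop_type):
    spec = CATALOGUE[desktop_type]
    desktops.append({
        "user": user,
        "type": desktop_type,
        "expires": datetime.now(timezone.utc) + timedelta(days=spec["lease_days"]),
    })

def reclaim_expired(now=None):
    """Decommission desktops whose lease has lapsed; return what was reclaimed."""
    now = now or datetime.now(timezone.utc)
    expired = [d for d in desktops if d["expires"] <= now]
    for d in expired:
        desktops.remove(d)      # in practice: power off, archive, delete the VM
    return expired

provision("jsmith", "developer")       # hypothetical user
print(len(reclaim_expired()))          # 0: nothing has expired yet
```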

2. As virtualization deployment accelerates, the challenge will move from server consolidation to management efficiency
Currently the IT industry is only about 30% virtualized. I see massive pressure in the next year to push that number to 50% or beyond. The primary business challenge will shift from server consolidation (CapEx savings) to improved service delivery times and operational efficiency (OpEx savings). This will put the focus on managing growth, complexity and security as a means of establishing governance and controls while reducing operational costs.

3. Inflexibility of management tools will stall many initial private cloud deployments
The persistent trade-off between “change your company and process to match the automation tool” and “wait for months and pay huge sums to create a customized tool” will quickly become unacceptable. Customers will demand rapid custom solution delivery. They can’t afford to change their process, and they can’t afford to wait months and pay huge sums for professional services.

4. Early private cloud deployments will expand to community clouds and service multiple business units
Many of the companies we have worked with are looking to extend the operational efficiencies of their initial private cloud deployments to other businesses or divisions within their companies. These groups are acting as service providers, setting up community clouds for other groups. The improvements in service delivery time, coupled with the lower operational and capital costs of this deployment model, will help accelerate expansion into additional groups within a single business or multiple businesses within a large enterprise.

5. Hardware becomes more virtualized, blurring the lines between virtual and physical management
Virtualization is impacting all compute components, not just the partition that the operating system runs in. System, storage, and networking vendors will continue to virtualize more and more components within their offerings, providing IT departments with more flexibility in how they utilize their resources. Increasingly, we are seeing companies treat their physical resources as a pool that can be dynamically reconfigured and reallocated, much like their virtual infrastructure.

6. On-demand computing is not just for virtual infrastructures or private clouds
Most companies have, or will have, a combination of virtual and physical systems. More companies will want a single solution that provides automated self-service for all their assets, not just the virtual ones. Even if a company is 100% virtualized, it will still need to provision and manage the physical hosts that contain its virtual machines. As customers start to dabble with moving systems to a public cloud service like Amazon EC2, they will want the same operational governance and control they have implemented for their private cloud services.
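As a loose illustration of that single-solution idea (not a description of any vendor's actual API, and with all class and method names hypothetical), the sketch below puts virtual, physical, and public-cloud provisioning behind one interface so the same self-service and governance path applies to each.

```python
# Sketch: one self-service provisioning interface with pluggable back ends,
# so virtual machines, physical hosts, and public-cloud instances are all
# requested (and governed) the same way. All names are hypothetical.

from abc import ABC, abstractmethod

class ProvisioningBackend(ABC):
    @abstractmethod
    def provision(self, request: dict) -> str:
        """Create the resource and return its identifier."""

class VirtualBackend(ProvisioningBackend):
    def provision(self, request):
        return f"vm-{request['name']}"      # would call the hypervisor API here

class PhysicalBackend(ProvisioningBackend):
    def provision(self, request):
        return f"host-{request['name']}"    # would trigger bare-metal deployment

class PublicCloudBackend(ProvisioningBackend):
    def provision(self, request):
        return f"ec2-{request['name']}"     # would call a public-cloud API (e.g. EC2)

BACKENDS = {
    "virtual": VirtualBackend(),
    "physical": PhysicalBackend(),
    "public": PublicCloudBackend(),
}

def self_service_request(kind: str, request: dict) -> str:
    # The same governance checks (quotas, approvals) would run here for every kind.
    return BACKENDS[kind].provision(request)

print(self_service_request("public", {"name": "analytics-01"}))
```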

7. Public cloud adoption will create additional governance and control challenges
Public cloud adoption will accelerate, primarily in the area of Software as a Service (SaaS) and Platform as a Service (PaaS) offerings like Microsoft Azure, Salesforce and Google Apps. We are excited about the potential these platforms provide. However, moving your applications to a public cloud does not obviate the need for governance and control of those deployments. Unified cloud management for hybrid cloud environments will become increasingly important.

