Archive for category Private Clouds

Leslie Muller Chats with Mike Laverick of RTFM Education


Leslie Muller, Founder & CTO of DynamicOps, sat down with Mike Laverick of RTFM Education for a chat about cloud computing, cultural challenges, technical needs and how DynamicOps is transforming the datacenter.

Sit back, relax and join us for some Vendorwag –  http://bit.ly/g28xGz


Migrate to a Private Cloud, Not a Virtual Datacenter


“A private cloud can be a very attractive solution, but a bad implementation can lead to ugly results”

That’s what Brian Proffit of Internet.com’s Enterprise Networking Planet has to say in his latest piece – Migrate to a Private Cloud, Not a Virtual Datacenter. A great piece – and not just because it references our own words of wisdom here on DynamicTalks. 

Take a look here and then let us know your own thoughts.


Must-Have Series Part 4: Private Cloud Deployment Simplicity


by: Richard Bourdeau, VP Marketing, DynamicOps

Time to value is a critical success factor in getting management buy-in and deploying a private cloud infrastructure. If the cost to deploy and maintain a private cloud manager is too high, it will not only push out the breakeven point, it can dramatically reduce the overall return on your private cloud investment. Sadly, it could also mean that you never get your cloud off the ground.

Agents add complexity

Know this: initial installation and deployment of the standard configuration should take no more than a few hours if the prerequisites are in place and the deployment is planned correctly. If the solution requires deploying agents on physical hosts or virtual machines, that complicates not only the initial rollout but also the ongoing maintenance of your deployment.

Making pieces & parts work together

Another factor that contributes to complexity is the number of products required to deploy the solution. Many of today’s offerings are a collection of loosely coupled products acquired over time, with different architectures and databases. You know the story: big company acquires little company, and their niche solution becomes a part of the big solution you just cannot deploy without. Making them work together smoothly can be like putting together a child’s toy on Christmas Eve. Some assembly required? More like an all-nighter with the wrong size allen wrench! I’m not saying that you will find one tool that provides all your private cloud management needs. No one tool can do everything. However, the tool you use to automate private cloud service management should be based on a common foundation that allows for the rapid integration of other components, regardless of whether they are offered by your management vendor or not.

Working with what you already have

Deployment simplicity is important for companies that want standard off-the-shelf capabilities, as well as for companies that want custom integration with their existing ecosystem of tools and processes. If armies of people are required to deliver a custom solution, that not only significantly increases the cost, it adds risk and delays time to value.

In upcoming entries in this series we will discuss how multi-vendor integration and rapid ecosystem integration are essential to letting you continue to work with your prior investments while keeping deployment simple.

For more information on this topic and others in the must-have series, download the white paper from DynamicOps – bit.ly/eS2HJe. But be sure to check back here as we take deeper dives and open it all up for discussion.


Part 2: How to Share and Play Well with Others in a Private Cloud


by: Richard Bourdeau, VP Product Marketing, DynamicOps

The common infrastructure. What a blessing. What a curse.

Here is a familiar scenario for you: a well-mannered IT administrator goes to provision resources for a mission-critical application only to find that said resources have already been consumed by someone in a different group. To make matters worse, the other, less important function is over-provisioned. A handy automated self-service product would have helped this guy out, you say. Not necessarily. Many of today’s typical automation tools just treat your shared infrastructure as a single pool of resources with little or no control over who can consume them. And don’t assume that manual approvals in the provisioning process solve this problem. In a large environment, it’s too easy to lose track of who can consume which resources.

It’s this daily occurrence that makes the ability to deliver secure multi-tenancy one of the most important aspects of cloud computing, if not the most important. By allowing multiple groups or tenants to share a common physical infrastructure, companies can achieve better resource utilization and improved business agility. By dynamically reallocating resources between groups to address shifting workloads, companies can more effectively utilize their limited IT resources.

The challenge is to share in such a way that one group does not have access, or even visibility, to the resources that have been allocated to others. Without a secure method of ensuring multi-tenancy, a cloud computing strategy cannot succeed.

Secure multi-tenancy is one of those buzzwords thrown about by most cloud automation vendors. Sure, many of them can do it. But at what scale? With what level of control and capacity? Before selecting a vendor, make sure their capabilities for securely sharing a common IT infrastructure meet both your current and future needs.

Multiple Grouping Levels
Make sure that your cloud management tool has enough levels of grouping to support both your organizational constructs and the service tiers you want to offer those businesses going forward.

For example: you don’t have to be a large company with multiple divisions, each with many departments, to need multiple levels of grouping. Maybe your company is not that big, but you want to separate desktop operations from server operations from development and test. In addition, you may also want to subdivide the resources allocated to a group into several service tiers (e.g., Tier 1, Tier 2, and Tier 3). Most companies will need a minimum of 2-3 levels of resource grouping.
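To make the idea concrete, here is a minimal sketch, in Python, of a two-level grouping hierarchy with per-group service tiers. All class names, group names, and capacity numbers are hypothetical illustrations, not a representation of any particular product’s object model.

# Minimal sketch of a multi-level resource grouping model with service tiers.
# All names and numbers are hypothetical, for illustration only.
from dataclasses import dataclass, field

@dataclass
class ResourcePool:
    tier: str          # e.g. "Tier 1", "Tier 2", "Tier 3"
    cpu_ghz: int       # capacity reserved for this tier
    memory_gb: int
    storage_gb: int

@dataclass
class Group:
    name: str                                       # e.g. "Server Operations"
    pools: list[ResourcePool] = field(default_factory=list)
    subgroups: list["Group"] = field(default_factory=list)

# Two levels of grouping plus per-group service tiers:
company = Group("Acme IT", subgroups=[
    Group("Server Operations", pools=[
        ResourcePool("Tier 1", cpu_ghz=200, memory_gb=1024, storage_gb=20_000),
        ResourcePool("Tier 2", cpu_ghz=100, memory_gb=512,  storage_gb=10_000),
    ]),
    Group("Development & Test", pools=[
        ResourcePool("Tier 3", cpu_ghz=50, memory_gb=256, storage_gb=5_000),
    ]),
])

print(company.subgroups[0].pools[0])

The point of the extra level is that capacity allocated to a group can be carved into tiers without the tiers of one group ever being visible to another.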

Think Strategically, Act Tactically
Most companies start their private cloud deployments with a single group or in a lab. This is certainly a viable strategy for gaining experience with new technologies and processes before expanding deployment to multiple groups. The mistake many companies make is selecting a cloud automation platform that supports only the requirements of that initial group. One of our customers has been so successful with their initial deployment that they have not only expanded it to other groups within the company, but are in the process of expanding it to other divisions, creating a community cloud across multiple businesses of this large multi-national company. And the process is going smoothly for them because they knew to anticipate future needs to maximize their technology investment.

As you look to implement a cloud infrastructure, remember the story of our well-mannered IT administrator, and remember: it can happen in the cloud too. The trick is to know how to avoid it.

Go in knowing these things about your business:

  • What do we need now?
  • What will we need in the future?
  • Can the tech support the transition in scale?
  • What kind of provisions are made to protect allocated resources in shared pools?
  • Ask and ask again, will it scale?

Now onto governance control – who can have what and how much. It can be easier and more effective than you think. Stay tuned!

In the meantime tell us how you maintain secure multi-tenancy. How do you do it?


Part 1: Automating Self Service for the Private Cloud


by: Richard Bourdeau, VP Product Marketing, DynamicOps

As promised, so begins our series on the must-haves for your private cloud deployment and what to look for when choosing your technology providers and partners. You will be in it for the long haul with whomever you choose, so it is crucial that they can do what they promise and that you know what to do.

There are many vendors that offer automated self-service for cloud deployment. However, when you start to look at what automated self-service means, the implementations vary greatly. Your definition of automation may not be the vendor’s definition, and you will soon see gaps between where your automation needs begin and where theirs end. At DynamicOps, our deployment experience has shown that most vendors provide one-size-fits-all automation that does not fully automate the entire process, or that cannot be modified to accommodate differences in the types of services being delivered or the methodologies used by different groups. Other vendors provide more flexible workflow automation but do little to actually automate the tasks that need to be performed. It’s a frustrating experience. You think you have done your homework, your strategy is in place, your vendor is selected, and before you know it production is stalled as you slog through the oh-so-manual task of implementing an effective automation solution.

Before you select automation software to help deploy your private cloud, make sure that it has the functionality to help you with these most common automation challenges.

1. Automate the entire process
Automated delivery needs to incorporate both the configuration of the IT resources and any pre- or post-configuration steps needed either to make the compute resource usable for the requestor or to complete the “paperwork” required to monitor and track the resource throughout its life. Many consider the entire process too much to take on, so most private cloud management solutions address only part of it, focusing on configuring the machine rather than on the end-to-end process.

Partial automation, though better than completely manual processing, will still not let companies achieve the service-level response times and efficiencies they are after. The best way to avoid this trap is to map out your process, soup to nuts. Note where compromises cannot be made on automation and understand how the new zero-touch approach will affect your processes as a whole. The right vendor will address your needs and bring additional suggestions and functionality to the table.
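As an illustration, here is a minimal, self-contained Python sketch of what an end-to-end provisioning pipeline covers, with the pre-configuration, configuration, and post-configuration “paperwork” steps stubbed out. All function and field names are hypothetical assumptions, not those of any particular product.

# A minimal, self-contained sketch of an end-to-end provisioning pipeline.
# Every step here is a stand-in for a call into a real orchestration layer.
def provision_service(request: dict) -> dict:
    record = dict(request, state="requested")

    # Pre-configuration: approval and naming that must exist before the build
    record["approved_by"] = "it-ops"              # stub for an approval workflow
    record["hostname"] = f"{request['group']}-{request['name']}"

    # Core configuration: the machine itself (stubbed here)
    record["machine_id"] = f"vm-{abs(hash(record['hostname'])) % 10_000:04d}"
    record["state"] = "provisioned"

    # Post-configuration "paperwork": the tracking steps that are easy to forget
    record["cmdb_registered"] = True              # stub for CMDB registration
    record["monitoring_enrolled"] = True          # stub for monitoring enrollment
    record["lease_expires_in_days"] = request.get("lease_days", 90)
    return record

print(provision_service({"name": "web01", "group": "devtest", "owner": "alice"}))

Mapping your own process into a list of steps like this makes it obvious where a candidate tool stops automating and hands work back to a person.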

2. Automate the task, not just the process
It seems so obvious, doesn’t it? But sadly, many service desk solutions that claim to automate the entire process really only automate the workflow that links together a bunch of manual configuration steps. In order to deliver compute resources to their consumers efficiently and reduce service delivery times, automation needs to orchestrate the configuration of both virtual and physical CPU, memory, storage, and network resources. Ask yourself: does the solution allow for pre-configured permissions so that resources are allocated with little to no manual intervention?

3. Different processes for different services, groups, or users
Every IT administrator dreams of the day when there is one process that addresses every business group and there is a clear view from Point A to Point B. You and I both know that the chances of this happening are even less likely than pigs flying. It is very common for different groups within the same company to use different processes to manage their IT resources. Production systems, for example, typically require more approvals and follow different best practices than systems created for development and testing. Then, to make life even more interesting, within the same group different IT services can have different components, which can necessitate different deployment processes. And we are not done yet! Every user within that group can have different access needs, which limit both the services they can request and the management functions they can perform against those compute resources.

I am exhausted just thinking about it. Bottom line: automation tools that take a one-size-fits-all approach will not provide enough flexibility as implementations grow beyond typical lab deployments. A small sketch of what per-group policies might look like follows below.
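For illustration only, here is a minimal Python sketch of a per-group policy table, where the same catalogue item can follow a different approval chain and lease rule depending on the requesting group. The policy fields and values are hypothetical.

# Hypothetical per-group provisioning policies: same catalogue, different process.
POLICIES = {
    "production": {
        "approvals": ["manager", "change-board"],   # more gates for production
        "allowed_services": ["web-server", "database"],
        "lease_days": None,                          # production machines do not expire
    },
    "devtest": {
        "approvals": [],                             # zero-touch for dev/test
        "allowed_services": ["web-server", "database", "sandbox-vm"],
        "lease_days": 30,
    },
}

def resolve_policy(group: str, service: str) -> dict:
    # Each request is governed by its group's policy, not a single global process.
    policy = POLICIES[group]
    if service not in policy["allowed_services"]:
        raise PermissionError(f"{group} is not entitled to request {service}")
    return policy

print(resolve_policy("devtest", "sandbox-vm"))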

4. Delegated Self-Service
Even with the appropriate governance and controls in place, some companies don’t feel comfortable jumping to a full self-service model where end users directly provision and manage their own IT resources. Instead, these companies prefer a delegated self-service model, where an administrator provisions on behalf of the user. For this to work, the software needs to be able to track the actual owner, not just the person who provisioned the machine. Ownership tracking is key to successful lifecycle management. Look at it this way: it’s no use knowing who made the car when you just want to know who put 100k miles on it.

So be sure to look for automation tools that support an administrator-initiated provisioning model that still tracks the owner/user, as in the sketch below. You will thank me later.
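As a simple illustration of the point, here is a hypothetical Python sketch of a machine record that keeps the actual owner separate from the administrator who provisioned it, so lifecycle operations can key off ownership. The field names are assumptions, not any vendor’s schema.

# Minimal sketch of ownership tracking under delegated self-service.
from dataclasses import dataclass

@dataclass
class MachineRecord:
    machine_id: str
    owner: str            # the person who uses the machine and drives lifecycle decisions
    provisioned_by: str   # the administrator who submitted the request on their behalf
    group: str

inventory = [
    MachineRecord("vm-0001", owner="alice", provisioned_by="it-admin", group="devtest"),
    MachineRecord("vm-0002", owner="bob",   provisioned_by="it-admin", group="devtest"),
]

def machines_owned_by(user: str) -> list:
    # Reclamation, lease renewal, and chargeback key off the owner,
    # not off whoever happened to click "provision".
    return [m for m in inventory if m.owner == user]

print(machines_owned_by("alice"))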

I have only scratched the surface of some of the significant differences you should consider when initiating automated self-service. Hopefully I have given you a sense of what to look for.

But don’t think that automation alone makes a private cloud. On the contrary, it is just one part of a successful cloud strategy. But fear not, we will be reviewing more. Next we will look at some of the challenges of sharing a common physical infrastructure and what a secure multi-tenant environment will mean to you.


VMBlog: 2011 Virtualization and Cloud Predictions


by Leslie Muller, CTO & Founder, DynamicOps

Recently published to VMBlog for their 2011 Virtualization and Cloud Predictions Series.

In the past few years we have had a unique position in the market that has allowed us to see different angles of the future datacenter. The march toward that vision continues, and we all know that adoption of cloud technologies will continue to accelerate and grow exponentially. However, as clouds get bigger and users look for the most efficient and beneficial route to deployment, the key to success lies in the details: integration, automation, and managing scale and complexity while delivering a consumer experience.

 1. Virtual Desktop “pilots” will start scaling into large production deployments – Management automation key enabler  
I predict we will see more mega-scale VDI deployments, with sizes in the hundreds of thousands of VMs and bigger. Having said that, processes that worked fine with a few hundred machines quickly break down as companies scale deployments to thousands or tens of thousands of machines. These implementations go smoothly during the early phases, when you can standardize on a single desktop deployment and have a limited catalogue to provision, reconfigure, and decommission. But as varying desktop types, provisioning methodologies and solution components are added, the ability to keep up with the management without blowing the operational budget will stall many projects. Management automation will bubble to the top as a hot button as processes are evaluated and re-addressed to meet the increased demands of scale and real-world complexity.

2. As virtualization deployment accelerates the challenge will move from server consolidation to management efficiency
Currently the IT industry is only about 30% virtualized. I see massive pressure in the next year to get that number to 50% or beyond. The primary business challenge will shift from server consolidation (CapEx savings) to improved service delivery times and operational efficiency (OpEx savings). This will put the focus on managing growth, complexity and security as a means of establishing governance and controls while reducing operational costs.

3. Inflexibility of management tools will stall many initial private cloud deployments
The persistent trade-off of “change your company and process to match the automation tool” or “wait for months and pay huge sums to create a customized tool” will quickly become unacceptable. Customers will demand rapid custom solution delivery. They can’t afford to change their process, and they can’t afford to wait months and pay huge sums for professional services.

4. Early private cloud deployments will expand to community clouds and service multiple business units.
Many of the companies we have worked with are looking to extend the operational efficiencies of their initial private cloud deployments to other businesses or divisions within their companies. These groups are acting as service providers, setting up community clouds for other groups. The improvements in service delivery time, coupled with the lower operational and capital costs of this deployment model, will help accelerate expansion into additional groups within a single business or multiple businesses within a large enterprise.

5. Hardware becomes more virtualized, blurring the lines between virtual and physical management
Virtualization is impacting all compute components, not just the partition that the operating system runs in. System, storage, and networking vendors will continue to virtualize more and more components within their offerings, providing IT departments with more flexibility in how they utilize their resources. Increasingly, we are seeing companies treat their physical resources as a pool that can be dynamically reconfigured and reallocated, much like their virtual infrastructure.

6. On-demand computing is not just for virtual infrastructures or private clouds
Most companies have, or will have, a combination of virtual and physical systems. More companies will want a single solution that provides automated self-service for all of their assets, not just the virtual ones. Even if a company is 100% virtualized, it will still need to provision and manage the physical hosts that contain its virtual machines. As customers start to dabble with moving systems to a public cloud service like Amazon EC2, they will want the same operational governance and control that they have implemented for their private cloud services.

7. Public Cloud adoption will create additional governance and control challenges
Public cloud adoption will accelerate primarily in the areas of Software as a Service (SaaS) and Platform as a Service (PaaS), with platforms like Microsoft Azure, Salesforce and Google Apps. We are excited about the potential of these platforms. However, moving your applications to a public cloud does not obviate the need for governance and control over those deployments. Unified cloud management for hybrid cloud environments will become increasingly important.


The Must Have’s and Must Know’s for a Private Cloud Deployment – A Series


by: Rich Bourdeau, VP Product Management & Marketing, DynamicOps

Is IT ready for automated service?

Over 10 years ago, back in my EMC days, an attendee at a Customer Council told me that we needed to make management much simpler. The number of things that he was managing was growing, technologies were becoming more complex, and his administrators had to know about more technologies. There was no way for his staff to specialize and become an expert in any specific area. He wanted more automation. He wanted the process to be simple so that his clients, system administrators, and DBAs could self-manage. But what he was saying was heresy to his peers. They chided him – Why would you want to do that? It will be anarchy; you will lose control! Looking back I see this guy for what he really was – a visionary.

The world has changed

In all aspects of our lives, we increasingly interact with automated systems that provide instant access to services that once required manual processing and hours or days to complete. For example, banking and travel were specialized services in which we relied upon other individuals to grant us access and control. Today, we book flights, hotels and rental cars online without ever talking to an agent or even handling a ticket. Banking pushed our control even further: ATMs and online banking give us instant access to our assets, enabling us to make real-time decisions.

Today, IT is far more willing to provide automated self-service of IT resources than it was just 2-3 years ago. Large service providers like Amazon, with its Elastic Compute Cloud (EC2) infrastructure service, have demonstrated the cost-effectiveness and near-instant access of on-demand IT services. IT consumers are demanding quicker access to desktops, servers and applications. If they don’t get them fast enough from their IT department, they have shown in the past that they will use alternative options. This is where IT really loses control.

Is Automated Self-Service ready for IT?

In order to improve the IT service delivery experience, IT is embracing automated self-service. According to Gartner, IDC and others, the growth in private cloud management software will outpace the growth in core virtualization software over the next 5 years. In order to meet this expected demand, there are probably over 50 vendors that profess to have automated self-service management of virtual or cloud computing. These include offerings from the leading virtualization vendors (VMware, Microsoft, Citrix, Red Hat and others), the Big 4 Management vendors (BMC, CA, HP, and IBM) and emerging vendors like DynamicOps.

Automated cloud management software accelerates service delivery times while reducing operational costs and optimizing capital investment through more effective use of a shared physical infrastructure. This is an attractive proposition for any enterprise. However, without efficient and effective management tools, companies may not be able to achieve the savings they originally envisioned.

Since being spun out of Credit Suisse in 2008, DynamicOps has spent the last three years helping enterprise companies deploy on-demand IT services, or private clouds. Over the coming weeks I will share some of our operational expertise and real-world deployment experience. By presenting some of the challenges you will likely face and discussing the product capabilities you should look for, I will help you accelerate the time to value of your private cloud deployment.

So stay tuned. Your cloud will never be the same!
