Tag Archives: Hyper-V

The free download of Hyper-V is now available. Microsoft® Hyper-V™ Server 2008 is a stand-alone product that provides a simplified, reliable, cost-effective and optimized virtualization solution, enabling organizations to improve server utilization and reduce costs. It allows organizations to consolidate workloads onto a single physical server and is a good fit for organizations that want a basic, simplified virtualization solution for consolidating servers.
Download it Free!!

Microsoft has placed its stamp on the virtualization phenomenon with the launch of Hyper-V, a new technology that adds to the power of the Windows Server 2008 platform. Hyper-V represents the next era of virtualization, allowing administrators and hosting customers to get the most out of their server investments and usage. What sets it apart from other virtualization platforms is the ability to run a number of different operating systems in parallel on a single physical server. Whether you prefer Windows, Linux or Unix, Hyper-V allows you to take full advantage of 64-bit computing.

Server Consolidation

Today’s businesses are under a lot of pressure to simplify management processes while reducing costs and maintaining competitive benefits such as flexibility, scalability, reliability and security. Hyper-V enables organizations to meet these goals by consolidating multiple servers within a single system while maintaining isolation. Aside from saving money on individual servers, this type of virtualization also lowers power, cooling and management costs. Businesses of all sizes benefit both from a reduced hardware footprint and from the ability to balance workloads effectively across multiple resources. The result is greater flexibility, including the ability to run 32-bit and 64-bit workloads side by side in the same environment.
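To make the consolidation picture concrete, here is a minimal sketch of enumerating the guests consolidated onto a single Hyper-V host. It assumes the v1 Hyper-V WMI provider (the root\virtualization namespace that shipped with Windows Server 2008) and the third-party Python wmi package; treat it as an illustration rather than a management tool.

    # Minimal sketch: list the guests consolidated onto one Hyper-V host.
    # Assumes the third-party "wmi" package (pip install wmi) and the v1
    # Hyper-V WMI provider at root\virtualization (Windows Server 2008).
    import wmi

    conn = wmi.WMI(namespace=r"root\virtualization")

    # The host itself also appears as an Msvm_ComputerSystem; guests
    # carry the caption "Virtual Machine".
    for vm in conn.Msvm_ComputerSystem(Caption="Virtual Machine"):
        print(vm.ElementName, vm.EnabledState)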

Disaster Recovery and Business Continuity

Business continuity refers to the ability to reduce both scheduled and unscheduled downtime. This includes time lost to routine operations such as backup and maintenance, as well as unplanned events caused by failures and outages. Hyper-V offers extensive business continuity features, such as live backup and seamless migration capabilities, that enable businesses to meet demanding uptime and response metrics.

Disaster recovery is an essential element of business continuity. Emergencies, natural disasters, malicious physical attacks and even simple configuration issues such as software conflicts all have the power to cripple an organization’s vital data before administrators have a chance to back it up. Leveraging the clustering features integrated into Windows Server 2008, Hyper-V supports disaster recovery across diverse IT environments and data centers. Fast, reliable disaster recovery and business continuity tools help to ensure minimal data loss and enhanced remote management capabilities.

Improved Testing and Development

Testing and development are usually the first functions businesses take advantage of with virtualization technology. The use of virtual machines allows development staff to create and test a wide range of scenarios in a secure, self-contained environment that closely approximates the operation of physical clients and servers. Hyper-V enables the maximum utilization of test hardware, helping to reduce costs while improving test coverage and life-cycle management. With support for numerous operating systems and checkpoint features, this virtualization solution makes an excellent platform for test and development environments.
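As a rough illustration of the checkpoint feature mentioned above: in the v1 WMI API, snapshots are requested through Msvm_VirtualSystemManagementService. The sketch below again uses the third-party Python wmi package; the VM name "testvm" is a placeholder, and the call is asynchronous, so the exact shape of the returned out-parameters is glossed over here.

    # Rough sketch: request a checkpoint (snapshot) of one VM through the
    # v1 Hyper-V WMI API. "testvm" is a placeholder name; error handling
    # and job polling are omitted for brevity.
    import wmi

    conn = wmi.WMI(namespace=r"root\virtualization")
    vm = conn.Msvm_ComputerSystem(Caption="Virtual Machine",
                                  ElementName="testvm")[0]
    svc = conn.Msvm_VirtualSystemManagementService()[0]

    # CreateVirtualSystemSnapshot is asynchronous: it returns out-
    # parameters (a job reference and a return code) rather than blocking.
    # The VM's WMI path is passed as the reference parameter.
    result = svc.CreateVirtualSystemSnapshot(SourceSystem=vm.path_())
    print("snapshot requested:", result)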

Server virtualization is a hot commodity in the IT world because of the benefits it promises on both the cost and performance sides of the scale. Already a pioneer in web technology, Microsoft can now be credited with enhancing the virtualization concept and enabling businesses all over the world to enjoy a more dynamic IT infrastructure.

Cloud computing’s biggest losers

Roman Stanek, during his opening keynote at the Cloud Computing Conference & Expo Europe in Prague today, said “Big server vendors such as HP, Sun, Dell, Microsoft, as well as monolithic app providers will be among the losers of the Cloud Computing revolution, while innovation, SMBs, and the little guys will be the winners of the Cloud.”
VIEW STANEK’S KEYNOTE HERE

In his presentation, titled “Building Great Companies in the Cloud,” Stanek – a technology visionary who has spent the past fifteen years building world-class technology companies – talked about what it means to be ‘born on the cloud.’ Specifically, he shared with delegates his thoughts on how to use cloud computing as a technical design center; how to take advantage of the economics of cloud computing in building and operating cloud services; how to dramatically change customer adoption; and how to plan for the technical and operational scale that cloud computing makes possible.

– Read more

The impact of cloud computing is most often analyzed through its expected disruption of IT vendors and the media, or as an economic balm for developers and Web 2.0 start-ups.

Yet cloud computing is much more than just a newcomer on the Internet hype curve. The heritage of what cloud computing represents dates back to the dawn of information technology (IT), to the very beginnings of how government agencies and large commercial enterprises first accessed powerful computers to solve complex problems.

We’ve certainly heard a lot about the latest vision for cloud computing and what it can do for the delivery of applications, services and infrastructure, and for application development and deployment efficiencies. So how does cloud computing fit into the whole journey of the last 35 years of IT? What is the context of cloud computing in the real-world enterprise? How do we take the vision and apply it to today’s enterprise concerns and requirements?

To answer these questions, we need to look at the more mundane IT requirements of security, reliability, management, and the need for integration across multiple instances of cloud services. To help understand the difference between the reality and the vision for cloud computing, I recently interviewed Frank Gillett, vice president and principal analyst for general cloud computing topics and issues at Forrester Research.

Gardner: You know, Frank, the whole notion of cloud computing isn’t terribly new. I think it’s more of a progression.

Gillett: When I talk to folks in the industry, the old timers look at me and say, “Oh, time-sharing!” For some folks this idea, just like virtualization, harkens back to the dawn of the computer industry and things they’ve seen before. … We didn’t think of them as cloud, per se, because cloud was just this funny sketch on a white board that people used to say, “Well, things go into the network, magic happens, and something cool comes from somewhere.”

So broadly speaking, software as a service (SaaS) is a finished service that end users take in. Platform as a service (PaaS) is not for end users, but for developers. … Some developers want more control at a lower level, right? They do want to get into the operating system. They want to understand the relationship among the different operating systems instances and some of the storage architecture.

At that layer, you’re talking about infrastructure as a service (IaaS), where I’m dealing with virtual servers, virtualized storage, and virtual networks. I’m still sharing infrastructure, but at a lower level in the infrastructure. But I’m still not nailed to this specific hardware the way you are in, say, a hosting or outsourcing setup.
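To make the IaaS layer Gillett describes concrete: from a developer’s point of view, a virtual server at this layer is one API call away. Here is a small sketch using the classic boto library for Amazon EC2, roughly contemporary with this interview; the AMI ID and instance type are placeholders, and credentials are assumed to be configured in the environment.

    # Illustration of infrastructure as a service: provisioning a virtual
    # server by API. "ami-12345678" and "m1.small" are placeholders, and
    # AWS credentials are read from the environment.
    import boto.ec2

    conn = boto.ec2.connect_to_region("us-east-1")
    reservation = conn.run_instances("ami-12345678", instance_type="m1.small")
    print("launched virtual server:", reservation.instances[0].id)

That self-service quality (no ticket, no waiting on the infrastructure team) is exactly the developer behavior Gillett describes next.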

Gardner: We’re in the opening innings of cloud computing?

Gillett: A lot of the noisy early adopters are start-ups that are very present on the Web, social media, blogs, and stuff like that. Interestingly, the bigger the company the more likely they are to be doing it, despite the hype that the small companies will go first.

… It doesn’t necessarily mean that your typical enterprise is doing it, and, if they are, it’s probably the developers, and it’s probably Web-oriented stuff. … In the infrastructure layer, it’s really workloads like test and development, special computation, and things like that, where people are experimenting with it. But, you have to look at your developers, because often it’s not the infrastructure guys who are doing this. It’s the developers.

It’s the people writing code that say, “It takes too long to get infrastructure guys to set up a server, configure the network, apportion the storage, and all that stuff. I’ll just go do it over here at the service provider.”

… There is no one thing called “cloud,” and therefore, there is no one owner in the enterprise. What we find is that, if you are talking about SaaS, business owners are the ones who are often spec’ing this.

Gardner: Who is the “one throat to choke” if something goes wrong?

Gillett: Bottom line, there isn’t one, because there is no one thing. … They are on their own within the company. They have to manage the service providers, but there is this thing called the network that’s between them and the service providers.

– read more

VMware’s vSphere 4.0, released just over a month ago, is aimed at helping VMware stay at least two steps ahead of Microsoft, now its biggest competitor in the virtualisation market; VMware’s previous move in this perpetual dance was to make its core hypervisor ESX a free download.

So I went to talk to VMware to find out more, and to get a hands-on demonstration of how it works. The company describes vSphere as an operating system for the internal cloud — a term you might as well view as synonymous with datacentre, as a VMware exec was ready to admit. The top-level story is that vSphere is all about improving the efficiency of the four key datacentre resources: computing, networking, storage, and memory.

VMware, the company that has led the server virtualisation market since its inception, is aware of its customers’ need for fast return on investment: the last enterprise IT manager I spoke to said that no spending that didn’t offer more or less instant ROI was making it through today’s tight budget filters. But as VMware’s Fredrik Sjostedt pointed out at the demo, there are lots of projects that companies are implementing for that reason, and many of those are making use of virtualisation.

Caveats
A few caveats before I get into the demo highlights. The company talks about addressing the internal cloud, but applications for vSphere may not lie within a single datacentre; they might be spread across multiple datacentres. An example might be an organisation that has grown through acquisition, has multiple sites, and wants to consolidate to achieve efficiencies.

There’s also an underlying assumption in a datacentre operating system that there’s always adequate hardware waiting to be converted into a virtual resource. This may be the case in your setup…  read more

Have you ever wondered why Microsoft cares so much about server virtualization? After all, it’s only a software representation of a physical machine.

Microsoft has been very content over the last nearly 30 years letting the likes of Dell, Hewlett-Packard and IBM build physical servers with nary a care. When VMware introduced commodity server virtualization back in 1999, Microsoft hardly batted an eye. So what’s happened to make Microsoft not only care, but care enough to invest millions of dollars into its own server virtualization solution?

It’s all about control.

Today, Microsoft pretty much owns the x86 data center above the hardware. Sure, Linux has established a beachhead and Apple is kicking up some dust, but by and large, if it’s x86, it’s running Windows.

How did Microsoft get into this position? By making it easy for developers to build applications on top of the Windows operating system. Look at Novell NetWare — arguably a much better network OS than Windows NT, but a really difficult development platform for ISVs. You had your choice of development languages, as long as it was Watcom C. You also had your choice of user interface, as long as it was exposed across the network.

Microsoft gave developers the freedom to choose the development language they liked to work in and to build a rich user interface. The rest is history. Sure, there are still people who choose Novell for its technology, but Microsoft has usurped Novell’s customer base, and Novell has been relegated to second-tier vendor status.

OK, so what in the world does all that have to do with Microsoft caring about virtualization? Well, think of today’s developers. Do they develop applications for Windows? Mostly, no. They develop against an application framework. Be it .Net, Struts, Ruby on Rails or something else, it’s the framework that’s important, not the operating system.

You can run many (most?) .Net applications on Linux with a simple recompile under Mono. The other frameworks really don’t care what OS is underneath. As for the user interface, Ajax provides a near-fat-client user experience through an industry-standard framework. Microsoft’s response to Ajax is Silverlight, an attempt to keep control over the user experience.

Microsoft is nervous because … read more