
Tag Archives: cloud computing

Open Cirrus, the collaborative cloud computing research effort formed by Yahoo, Intel and HP last year, has added three new international research organizations to its open source test bed.

The new research organizations joining Open Cirrus are the Russian Academy of Sciences; South Korea’s Electronics and Telecommunications Research Institute; and MIMOS, a strategic research and development organization under Malaysia’s Ministry of Science, Technology and Innovation.

Launched in July 2008, Open Cirrus was created by the three tech giants to promote open collaboration among industry, academia and governments by removing the financial and logistical barriers to research in data-intensive, Internet-scale computing. The test bed, which has more than 50 research projects currently underway, simulates a real-life, global, Internet-scale environment and lets researchers test applications and measure the performance of applications and services that are built to run on large-scale cloud systems.

As we wrote last year, Open Cirrus is a way for Yahoo to get its foot in the cloud computing space and a channel for Intel and HP to further their positions as leaders in cloud computing technology. — Read more


Earlier today at the Clift Hotel in San Francisco, we convened a group of journalists, partners and customers for a discussion on Google Apps in the enterprise. We’re pleased to report that the “state of the cloud” is strong, and we’ve taken a number of steps to make it stronger.
At the event we discussed the growth of our business, introduced some new customers, and announced a feature that makes switching to Apps even easier. The Clift was a particularly appropriate venue because it’s a member of the Morgans Hotel Group, which is deploying Google Apps to its 1,750 employees. JohnsonDiversey, a global provider of commercial cleaning and hygiene products and solutions, has also gone Google. Choosing Apps helped …read more

Here’s a first look at Google Wave. Take a look to see what all the fuss is about and to get a sense of what the final product will be like: check it out

Described as a “personal communication and collaboration tool,” Google Wave allows users to chat and share documents including audio files, videos and photos in real-time.
Google Wave: What is it and how does it work?

Google Wave was created as an open-source platform, and Vic Gundotra, VP of engineering at Google, explained why: “It’s open sourced for many reasons. Not only do we want to contribute to the Internet but frankly we need developers to help us complete this product and we need your support.”

What makes Google Wave particularly revolutionary is its real-time aspect. Most social media tools involve an element of waiting around, with a message such as “Person X is typing” appearing in the corner while one person waits for another to respond; Wave lets users see what’s being typed the instant it appears on the typist’s screen. For sensitive information, an opt-out button allows users to conceal what they are writing until they hit the “send” button.

Google Wave is the brainchild of Lars and Jens Rasmussen, the brothers who brought us Google Maps. They explain that they wanted to “rethink what a single communications platform might look like if we started from scratch.” Lars asked the 4,000-strong audience: “What would email look like if it were invented today?” The idea, they say, was to start with a clean slate rather than let today’s Internet reality determine what could be possible in the future, or even the present. This approach let them look at online communication in fresh and innovative ways.

The project has been ongoing for two years, and the team has since expanded, adding seemingly simple functions that risk rendering current communication platforms “backward” by comparison. For instance, a document uploaded to Wave can be edited by multiple users at once, with changes appearing instantaneously, adding a new spin to the term “teamwork.” — Read more

Red Hat, the world’s leading provider of open source solutions, today announced that its leading operating platform, Red Hat Enterprise Linux, is driving further enterprise adoption of cloud computing.
Building on the strong, long-term technology collaboration between Red Hat and Verizon Business, Red Hat Enterprise Linux is now offered as one of the first two operating system platforms available for Verizon Business’ new Computing as a Service (CaaS) solution.
Verizon CaaS is a new on-demand solution that enables businesses and government agencies to take advantage of cloud computing to control IT costs and improve operational flexibility.
CaaS enables customers to manage IT resources, including server, network and storage, efficiently and securely, meeting today’s business demands.
Red Hat Enterprise Linux provides a reliable, high-performance, secure open source platform on which to base cloud deployments.

“Verizon Business is now delivering to customers what we believe to be the most comprehensive and secure cloud-based computing solution on the market,” said Michael Marcellin, vice president of global managed solutions, Verizon.
“Verizon CaaS was engineered to meet the challenging security needs and performance requirements faced by enterprises, and the Red Hat Enterprise Linux operating system is playing a big part as we bring this unique offering to customers around the world.”

– read more

Verizon Communications Inc. (VZ) unveiled on Wednesday a service that allows large companies to outsource their technology infrastructure to the telecommunications provider.

The New York company is tapping into the “cloud computing” trend, where corporate IT services are managed over the Web. Large companies looking to cut costs can hand over their IT equipment, which includes storage, server and network gear, for Verizon to manage through its own network and data centers.

“Clearly, there’s a huge savings opportunity for companies to deploy cloud services,” said Michael Marcellin, vice president of global product marketing at Verizon.

Rather than buy their own data centers and servers, companies can use Verizon’s infrastructure and pay only for what they use, a more efficient model than buying banks of data centers and servers sized to cover current needs plus enough capacity for future growth.

“It really changes how enterprises and government agencies purchase resources and manage them,” Marcellin said. “You can scale up for a busy season, or scale down for dead times.”

Customers pay a subscription fee of $250 a month and are then billed on daily usage. Because pricing varies with usage, Marcellin declined to give more specific costs. He added, however, that companies can expect to save between 30% and 60% in IT costs through Verizon’s offering.
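To make that pricing model concrete, here is a back-of-the-envelope sketch. Only the $250 monthly subscription and the 30%–60% savings range come from the article; the per-day usage rate and the number of usage days below are hypothetical figures chosen purely for illustration:

```python
# Back-of-the-envelope look at the CaaS pricing model described above.
# Only the $250/month subscription and the 30-60% savings range come from
# the article; the daily rate and usage days are made-up numbers.

SUBSCRIPTION = 250.0   # fixed monthly fee (from the article)
DAILY_RATE = 40.0      # hypothetical per-day usage charge
DAYS_USED = 30         # hypothetical days of usage in the month

caas_monthly = SUBSCRIPTION + DAILY_RATE * DAYS_USED  # 250 + 1200 = 1450

# A 30-60% savings claim implies the same workload run in-house would
# cost somewhere in this range:
on_prem_low = caas_monthly / (1 - 0.30)   # 30% savings case
on_prem_high = caas_monthly / (1 - 0.60)  # 60% savings case

print(f"CaaS monthly cost:     ${caas_monthly:,.0f}")
print(f"Implied on-prem range: ${on_prem_low:,.0f} - ${on_prem_high:,.0f}")
```

The point of the exercise: because the base fee is small relative to usage charges, the bill tracks actual consumption, which is exactly the scale-up/scale-down behavior Marcellin describes.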

Verizon, however, enters a crowded field. International Business Machines Corp. (IBM) and Hewlett-Packard Co. (HPQ), often partners with the telco, offer traditional data center and computing services. Cisco Systems Inc. (CSCO) recently unveiled a virtualized data center for businesses. The increased competition has each company tightening its grip on its customers. AT&T Inc. (T), meanwhile, has a cloud storage service and plans to offer its own computing service as well.

In addition, large companies that have… Read more

Virtualization in plain English. We look forward to bringing you insightful information on virtualization, cloud computing and a few cool gadgets that make our day-to-day a bit easier (the GeeekQ way!). What do you think? Do you love it or hate it?

Microsoft has placed its stamp on the virtualization phenomenon with the launch of Hyper-V, a new technology that adds to the power of the Windows Server 2008 platform.  Hyper-V represents the next era of virtualization, allowing administrators and hosting customers to get the most out of their server investments and usage.  What sets it apart from other virtualization platforms is the ability to run a number of different operating systems in parallel on a single physical server.  Whether you prefer Windows, Linux or Unix, Hyper-V allows you to take full advantage of 64-bit computing.

Server Consolidation

Today’s businesses are under pressure to simplify management processes while reducing costs and maintaining flexibility, scalability, reliability and security.  Hyper-V enables organizations to meet these goals by consolidating multiple servers onto a single system while keeping workloads isolated.  Aside from saving money on individual servers, this type of virtualization also lowers power, cooling and management costs.  Businesses of all sizes benefit from an asset standpoint as well as from the ability to balance workloads effectively across multiple resources, and the improved flexibility allows 32-bit and 64-bit workloads to run in the same environment.
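The consolidation math behind that claim can be sketched with some hypothetical numbers (none of the figures below appear in the article; they only illustrate why running many underutilized servers as virtual machines on fewer hosts cuts costs):

```python
# Hypothetical server-consolidation estimate. All numbers are illustrative
# assumptions, not figures from the article.
import math

physical_servers = 20      # underutilized boxes in the data center
avg_utilization = 0.10     # each averages ~10% CPU use
target_utilization = 0.60  # safe load for a consolidated virtualization host

# Total real work: 20 servers * 10% = 2.0 "server-equivalents" of load.
required_capacity = physical_servers * avg_utilization

# Hosts needed if each consolidated host runs at 60% utilization.
hosts_needed = math.ceil(required_capacity / target_utilization)

consolidation_ratio = physical_servers / hosts_needed
print(f"{physical_servers} physical servers -> {hosts_needed} hosts "
      f"({consolidation_ratio:.0f}:1 consolidation)")
```

With these assumed numbers, twenty lightly loaded machines collapse onto four virtualization hosts, and the power, cooling and management savings scale roughly with the sixteen boxes retired.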

Disaster Recovery and Business Continuity

Business continuity refers to the ability to reduce both scheduled and unscheduled downtime.  This includes time lost to routine operations such as backup and maintenance as well as unplanned events caused by failures and outages.  Hyper-V offers extensive business continuity features, such as live backup and seamless migration capabilities, that enable businesses to meet demanding uptime and response metrics.

Disaster recovery is an essential element of business continuity.  Emergencies, natural disasters, malicious physical attacks and even simple configuration issues such as software conflict all have the power to cripple an organization’s vital data before administrators have a chance to back it up.  Leveraging the clustering features integrated into Windows Server 2008, Hyper-V supports disaster recovery across diverse IT environments and data centers.  Fast, reliable disaster recovery and business continuity tools help to ensure minimal data loss and enhanced remote management capabilities.

Improved Testing and Development

Testing and development are usually the first functions businesses take advantage of with virtualization technology.  The use of virtual machines allows development staff to create and test a wide range of scenarios in a secure, self-contained environment that precisely approximates the operation of physical clients and servers.  Hyper-V enables the maximum utilization of test hardware, helping to reduce costs, improve test coverage and life cycle management.  With support for numerous operating systems and checkpoint features, this virtualization solution makes an excellent platform for test and development environments.

Server virtualization is a hot commodity in the IT world because of the benefits it offers on both the cost and performance sides of the scale.  Already a pioneer in web technology, Microsoft can now be credited with enhancing the virtualization concept and enabling businesses all over the world to enjoy a more dynamic IT infrastructure.

Cloud computing’s biggest losers

Roman Stanek, during his opening keynote at the Cloud Computing Conference & Expo Europe in Prague today, said “Big server vendors such as HP, Sun, Dell, Microsoft, as well as monolithic app providers will be among the losers of the Cloud Computing revolution, while innovation, SMBs, and the little guys will be the winners of the Cloud.”

In his presentation, titled: “Building Great Companies in the Cloud,” Stanek – a technology visionary who has spent the past fifteen years building world-class technology companies – talked about what it means to be ‘born on the cloud.’ Specifically he shared with delegates his thoughts on how to use cloud computing as a technical design center; how to take advantage of the economics of cloud computing in building and operating cloud services; how to dramatically change customer adoption; and how to plan for the technical and operational scale that cloud computing makes possible.

– Read more

Red Hat CEO Jim Whitehurst anticipates the market splitting roughly along the following lines: small and medium-sized companies subscribing to public clouds, and large companies building their own. He says large companies (say, those with 1,000-plus servers) get close to the same economies of scale as public cloud service providers when it comes to purchasing gear, so the cost benefits of moving to the public cloud aren’t as compelling. “There’s going to be some difference in costs, but not much because the margins aren’t that big,” he said in an interview following his keynote.

The other rationale for do-it-yourself clouds versus services like Amazon’s Elastic Compute Cloud is the lower risk, or perceived lower risk, around data security. A few Red Hat customers have begun to investigate the feasibility of creating semi-private clouds where they would share IT infrastructure for cloud bursting or other demand peaks with partners that have been vetted in advance. Red Hat is helping its customers identify potential partners for these semi-private clouds.

One issue to be resolved with semi-private clouds is whether partners should be in the industry, say financial services, or different industries. On the one hand, financial firms might find it easier to set up a cloud with companies they already know and do business with. On the other hand, their IT usage patterns might be too similar to warrant a shared environment, with the risk of simultaneous demand spikes. “This is one of the big debates,” Whitehurst says.

Red Hat has no plans to offer cloud services itself. “We’re not getting in the cloud business. We’re not competing with our customers,” says Whitehurst.