PHD Virtual Provides Easier to Manage and More Efficient Backup Solutions for Enterprise Environments

PHD Virtual Technologies, a pioneer in virtual machine backup and recovery and an innovator in virtualization monitoring solutions, announced today that its solutions are well suited for large enterprise environments as well as for small businesses with growing data storage, backup and recovery needs.

Get PHD Virtual Backup v6.0 free for 15 days.

Tweet This: @PHDVirtual Provides Enterprises with Virtual Machine Backup and Recovery Solutions Fit to Scale http://bit.ly/HyPclv

PHD Virtual Backup 6.0 gives customers of all sizes the scalability and flexibility they need. For large enterprises in particular, it makes backup processes easier to manage and more efficient while keeping data fully recoverable at a moment’s notice. It also consumes the least overhead and capital when compared with other products on the market and provides the encrypted data protection that larger enterprises require.

"Customers continue to embrace server virtualization and are increasingly deploying multiple hypervisors," said Robert Amatruda, Research Director for Data Protection and Recovery at IDC. "PHD Virtual's Backup 6.0 solution provides customers a cost-effective and easy-to-deploy solution that supports multiple hypervisors and will scale with their virtual environment.”

“In addition to the growth we have experienced within the SMB market, we’ve seen a major uptick in our penetration in the larger enterprise environments as well,” commented Jim Legg, CEO, PHD Virtual. “The simplicity and cost-savings are definitely not exclusive to the smaller organizations – the ease of use and data movement offsite provides a powerful combination for any size corporation.”

PHD Virtual benefits for large enterprise environments include the following:

  • Complete or partial restorations - provides administrators with the ability to restore a complete server from scratch, by simply selecting a restore point and target. No agents, no operating system install, just restore a complete duplicate of the VM.
  • TrueDedupe technology – true source-side deduplication: data is deduplicated and compressed at the source before it is sent across the WAN/LAN and before it is written to disk. This kind of efficient deduplication is critical for enterprises that need the disk space to house backups of massive amounts of data, and it also scales better and pares down the overall backup window. PHD performs deduplication by comparing against the real data already on the backup target and eliminates duplicate copies across all VMs stored on that target, which makes the process more robust and avoids unnecessary job management to achieve storage efficiencies (a minimal illustration of this idea appears after this list).
  • Parallel processing model – multiple data streams can be used for backing up, restoring, and replicating, so multiple jobs can run concurrently. In addition, users can throttle parallel processing, increasing or decreasing the resources used in order to balance the workload and the timing of the backup window in the data center (see the second sketch after this list).
  • Fault-tolerant scaling – a 100% virtualized footprint for backups, running on a Linux-based application that scales up and out simply by deploying more Virtual Backup Appliances (VBAs). This creates fault tolerance and load balancing and delivers the required performance without the extra cost of more physical infrastructure or additional licensing.
  • Replication – by backing up VMs once and storing them to disk, PHD Virtual Backup eliminates the need for unnecessary snapshots on production VMs while keeping the extra layer of protection of replicated VMs stored offsite.
  • Disaster recovery planning – PHD Virtual provides:
    • TrueRestore: a verification and self-healing process in which the blocks being backed up are inspected both during the backup and restore functions. By doing this, PHD ensures that the data being backed up is indeed the same data that is going to be restored.
    • Test-Mode: provides the ability to run replicated VMs in a test mode located in a standby environment. This gives peace of mind that the standby VM has been verified, is completely operational and can be properly failed over.
  • Data recovery – PHD Virtual’s Instant Restore lets administrators power on backup VMs immediately and begin a restore process at the same time, giving immediate access to servers and applications. It also leverages concurrent data streams to implement “mass restore,” which creates and configures a single restore job that processes multiple VMs at once, again reducing complexity and the company’s RTO. Because granular restore is more common than a complete data center restore, PHD also supports restoring a single file, a virtual disk within a VM, or a single application object, such as an email, mailbox, datastore, database or table. This restores only what is needed, when it is needed, without setting up virtual labs or sandboxes, speeding recovery and preventing unnecessary data loss.
  • Backing up a constantly evolving data center – PHD Virtual allows administrators to plug a backup appliance, or VBA, into virtually any environment, including the cloud, a software-defined data center (SDDC), or a remote or branch office, ensuring data is safe and recoverable.
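
To make the two most technical claims above concrete, here are two minimal Python sketches. They illustrate the general techniques only, not PHD Virtual's actual implementation; the block size, hash function, sample file names, VM names and stream count are assumptions chosen for the example.

First, source-side deduplication: each block of a VM image is hashed at the source, and only blocks the backup target has never stored are transferred, with a single block index shared across all VMs on that target.

    import hashlib

    BLOCK_SIZE = 4096  # assumed chunk size for illustration; the product's real chunking is not published

    def backup_vm(vm_image_path, seen_blocks, target_store):
        # Source-side dedup sketch: hash each block locally and ship only blocks
        # the backup target has never stored, shared across all VMs on that target.
        with open(vm_image_path, "rb") as disk:
            while True:
                block = disk.read(BLOCK_SIZE)
                if not block:
                    break
                digest = hashlib.sha256(block).hexdigest()
                if digest not in seen_blocks:
                    # unique block: this append stands in for the WAN/LAN transfer and disk write
                    target_store.append((digest, block))
                    seen_blocks.add(digest)
                # duplicate blocks are skipped, so nothing crosses the wire for them

    # usage: two tiny sample "VM images" with overlapping content
    for name, payload in [("vm1.img", b"A" * 8192), ("vm2.img", b"A" * 4096 + b"B" * 4096)]:
        with open(name, "wb") as f:
            f.write(payload)

    seen_blocks, target_store = set(), []
    for vm_image in ["vm1.img", "vm2.img"]:
        backup_vm(vm_image, seen_blocks, target_store)

    print(len(target_store))  # 2 unique blocks stored, although 4 blocks were read

Second, a throttled parallel processing model: a fixed number of worker streams processes many backup jobs concurrently, and adjusting that number trades data-center load against the length of the backup window.

    from concurrent.futures import ThreadPoolExecutor

    MAX_STREAMS = 4  # the throttle: raise or lower to trade data-center load against backup-window length

    def backup_job(vm_name):
        # placeholder for a real per-VM backup, restore, or replication task
        return vm_name + " backed up"

    vms = ["web01", "db01", "app01", "app02", "mail01"]  # hypothetical VM names
    with ThreadPoolExecutor(max_workers=MAX_STREAMS) as pool:
        for result in pool.map(backup_job, vms):
            print(result)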

“PHD Virtual does VMware backup better than anyone else, especially for enterprises like us,” said Barry Quiel, SunGard Public Sector, California. “For our large environment, we needed someone that specialized in moving lots of virtual machine data, while storing as little as possible of it. We also need plenty of options to handle how we move data off-site and to tape. PHD Virtual gives us all of the options we needed to make sure our VMware environment meets our enterprise data protection requirements.”

Supporting Resources

PHD Virtual Technologies: http://phdvirtual.com/

More PHD Virtual News: http://phdvirtual.com/newsandevents

Twitter: https://twitter.com/PHDVirtual

Facebook: http://www.facebook.com/PHDVirtualTechnologies

LinkedIn: http://www.linkedin.com/groups?gid=1992663&mostPopular=&trk=tyah

RSS Feeds: PHD Virtual news releases: http://www.phdvirtual.com/rss/news-and-events.xml

About PHD Virtual Technologies

PHD Virtual provides the absolute best value in virtual backup and monitoring for VMware and Citrix platforms. More than 4,500 customers worldwide rely on our products because they are effective, easier to use and far more affordable than competitive alternatives. Delivering the highest-performance, most scalable cross-platform backup and monitoring solutions on the market, and a pioneer of Virtual Backup Appliances (VBAs), PHD Virtual Technologies has been transforming data protection for virtual IT environments since 2006. Its PHD Virtual Monitor provides a complete, end-to-end solution for monitoring virtual, physical and application infrastructures in VMware and Citrix environments. For more information, please visit: http://www.phdvirtual.com/
