PHD Virtual Provides Easier-to-Manage, More Efficient Backup Solutions for Enterprise Environments

PHD Virtual Technologies, a pioneer in virtual machine backup and recovery and an innovator in virtualization monitoring, announced today that its solutions are well suited for large enterprise environments as well as for small businesses with growing data storage, backup, and recovery needs.

Tweet This: @PHDVirtual Provides Enterprises with Virtual Machine Backup and Recovery Solutions Fit to Scale http://bit.ly/HyPclv

PHD Virtual Backup 6.0 gives customers of all sizes the scalability and flexibility they need, but for large enterprises in particular it makes backup processes easier to manage and more efficient while keeping data fully recoverable at a moment’s notice. It also consumes less overhead and capital than comparable products on the market and provides the encrypted data protection that larger enterprises require.

"Customers continue to embrace server virtualization and are increasingly deploying multiple hypervisors," said Robert Amatruda, Research Director for Data Protection and Recovery at IDC. "PHD Virtual's Backup 6.0 solution provides customers a cost-effective and easy-to-deploy solution that supports multiple hypervisors and will scale with their virtual environment.”

“In addition to the growth we have experienced within the SMB market, we’ve seen a major uptick in our penetration in the larger enterprise environments as well,” commented Jim Legg, CEO, PHD Virtual. “The simplicity and cost-savings are definitely not exclusive to the smaller organizations – the ease of use and data movement offsite provides a powerful combination for any size corporation.”

PHD Virtual benefits for large enterprise environments include the following:

  • Complete or partial restorations – gives administrators the ability to restore a complete server from scratch by simply selecting a restore point and a target. No agents, no operating system installation; just restore a complete duplicate of the VM.
  • TrueDedupe technology – true source-side deduplication: data is deduplicated and compressed at the source before it is sent across the WAN/LAN and before it is written to disk (a conceptual sketch follows this list). That efficiency is critical for enterprises that need disk space to house backups of massive amounts of data, and it makes the product more scalable while paring down the overall backup window. PHD deduplicates by comparing against the real data already on the backup target and eliminates duplicate copies across all VMs stored on that target, making the process more robust and avoiding the extra job management otherwise needed to achieve storage efficiency.
  • Parallel processing model – multiple data streams can be used for backing up, restoring, and replicating, so multiple jobs run concurrently. In addition, parallel processing lets the user throttle, increasing or decreasing the resources used for processing to balance the workload against the timing of the backup window in the data center (see the throttling sketch after this list).
  • Fault-tolerant scaling – a 100% virtualized footprint for your backups, running on a Linux-based application that scales up and out simply by deploying more Virtual Backup Appliances (VBAs). This provides fault tolerance and load balancing, and delivers the required performance without the extra cost of additional physical infrastructure or licensing.
  • Replication – by backing up VMs once and storing them to disk, PHD Virtual Backup eliminates unnecessary snapshots on your production VMs while adding the extra layer of protection of replicated VMs stored offsite.
  • Disaster recovery planning – PHD Virtual provides:
    • TrueRestore: a verification and self-healing process in which the blocks being backed up are inspected during both the backup and restore functions. By doing this, PHD ensures that the data being backed up is indeed the same data that will be restored (a simplified verification sketch follows this list).
    • Test-Mode: provides the ability to run replicated VMs in a test mode located in a standby environment. This gives peace of mind that the standby VM has been verified, is completely operational and can be properly failed over.
  • Data recovery – PHD Virtual’s Instant Restore lets administrators power on backup VMs immediately while a restore process runs simultaneously, giving immediate access to servers and applications. It also leverages concurrent data streams to implement “mass restore,” which creates and configures a single restore job that processes multiple VMs at the same time, again reducing complexity and the company’s RTO. Because granular restores are more common than complete data center restores, PHD’s products can restore a single file, a virtual disk within a VM, or a single application object, such as an email, mailbox, datastore, database, or table. This restores only what you need, when you need it, without setting up virtual labs or sandboxes, speeding recovery and preventing unnecessary data loss.
  • Backing up a constantly evolving data center – PHD Virtual lets administrators plug a backup appliance (VBA) into virtually any environment, including the cloud, a software-defined data center (SDDC), or a remote or branch office, ensuring data is safe and recoverable.
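
To make the source-side deduplication idea above concrete, here is a minimal conceptual sketch in Python. It is illustrative only, not PHD Virtual’s actual implementation; the block size, the SHA-256 hashing, and the send_block/target_index names are all assumptions for the example. Each block is hashed at the source, and only blocks the target has never seen, across all VMs, are compressed and sent.

    # Hypothetical sketch of source-side deduplication; not PHD Virtual's code.
    import hashlib
    import zlib

    BLOCK_SIZE = 4096  # assumed fixed block size for illustration

    def backup(vm_disk_path, target_index, send_block):
        """Ship only blocks the backup target has never stored.

        target_index: set of SHA-256 digests already on the target, shared
                      across all VMs so duplicates are removed globally.
        send_block:   callable that transmits (digest, payload) over the WAN/LAN.
        """
        sent = skipped = 0
        with open(vm_disk_path, "rb") as disk:
            while block := disk.read(BLOCK_SIZE):
                digest = hashlib.sha256(block).hexdigest()
                if digest in target_index:    # duplicate: nothing crosses the wire
                    skipped += 1
                else:                         # new block: compress, then send once
                    send_block(digest, zlib.compress(block))
                    target_index.add(digest)
                    sent += 1
        return sent, skipped

Because only new, compressed blocks travel over the network, the same mechanism that saves disk on the target also shrinks WAN traffic, which is why the deduplication happens before transmission rather than after.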
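
The parallel processing model described above amounts to a throttled pool of concurrent job streams. The sketch below, again hypothetical rather than PHD’s implementation, shows the shape of that design: max_streams is the operator’s throttle, trading resource consumption in the data center against the length of the backup window.

    # Hypothetical sketch of throttled parallel job streams; names are illustrative.
    from concurrent.futures import ThreadPoolExecutor

    def backup_vm(vm_name):
        # Placeholder for a real per-VM backup, restore, or replication job.
        return f"{vm_name}: done"

    def run_jobs(vm_names, max_streams=4):
        # max_streams is the throttle: raising it shortens the backup window
        # but consumes more resources; lowering it lightens the load.
        with ThreadPoolExecutor(max_workers=max_streams) as pool:
            return list(pool.map(backup_vm, vm_names))

    print(run_jobs(["web01", "db01", "mail01"], max_streams=2))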
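
Finally, the TrueRestore item describes verifying blocks at both backup and restore time. A simplified, hypothetical sketch of that self-healing pattern: record a checksum when each block is read, re-check it before restoring, and re-fetch any block that fails, for example from a replica.

    # Simplified sketch of backup/restore verification with self-healing;
    # hypothetical code, not TrueRestore itself.
    import hashlib

    def checksum(block):
        return hashlib.sha256(block).hexdigest()

    def backup_with_checksums(blocks):
        # Record each block alongside the digest computed when it was read.
        return [(checksum(b), b) for b in blocks]

    def restore_verified(stored, refetch):
        # refetch(i) re-reads block i from another copy (e.g., a replica)
        # when verification fails, healing corruption at rest.
        restored = []
        for i, (digest, block) in enumerate(stored):
            if checksum(block) != digest:
                block = refetch(i)
            restored.append(block)
        return restored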

“PHD Virtual does VMware backup better than anyone else, especially for enterprises like us,” said Barry Quiel, SunGard Public Sector, California. “For our large environment, we needed someone who specialized in moving lots of virtual machine data while storing as little of it as possible. We also needed plenty of options to handle how we move data off-site and to tape. PHD Virtual gives us all of the options we needed to make sure our VMware environment meets our enterprise data protection requirements.”

Supporting Resources

PHD Virtual Technologies: http://phdvirtual.com/

More PHD Virtual News: http://phdvirtual.com/newsandevents

Twitter: https://twitter.com/PHDVirtual

Facebook: http://www.facebook.com/PHDVirtualTechnologies

LinkedIn: http://www.linkedin.com/groups?gid=1992663&mostPopular=&trk=tyah

RSS Feeds: PHD Virtual news releases: http://www.phdvirtual.com/rss/news-and-events.xml

About PHD Virtual Technologies

PHD Virtual provides the absolute best value in virtual backup and monitoring for VMware and Citrix platforms. More than 4,500 customers worldwide rely on our products because they are effective, easier to use, and far more affordable than competitive alternatives. Delivering the highest-performance, most scalable cross-platform backup and monitoring solutions on the market, and a pioneer of Virtual Backup Appliances (VBAs), PHD Virtual Technologies has been transforming data protection for virtual IT environments since 2006. Its PHD Virtual Monitor provides a complete, end-to-end solution for monitoring virtual, physical, and application infrastructures in VMware and Citrix environments. For more information, please visit: http://www.phdvirtual.com/

Copyright © 2009 Business Wire. All rights reserved. Republication or redistribution of Business Wire content is expressly prohibited without the prior written consent of Business Wire. Business Wire shall not be liable for any errors or delays in the content, or for any actions taken in reliance thereon.
