By Rich Collier
March 1, 2014 10:15 AM EST
Application Performance Management (APM) grew out of the movement to better align IT with real business concerns. Instead of monitoring a lot of disparate components, such as servers and switches, APM would provide improved visibility into mission-critical application performance and the user experience. Today, APM solutions help IT track end-to-end application response time and troubleshoot coding errors across application components that have an impact on performance.
APM has a rightful place in the arsenal of monitoring tools that IT uses to keep its applications and systems up and running. However, today's APM solutions have some serious gaps and challenges when it comes to providing IT with the entire application performance picture.
Most APM solutions provide minimal information about the hardware and network components underlying application performance, other than showing which components are involved in each part of the transaction. Those that do a better job usually require users to shift to another screen or monitoring system to get more hardware visibility. As with the blind men touching different parts of an elephant, this approach makes it difficult to correlate hardware performance with all the other components driving the application.
The Virtual, Distributed Environment
Most of today's APM solutions were created before virtualization, the cloud, and complex, composite applications took off in the IT environment. With virtual machines migrating back and forth among physical servers at different times of the day or week, and applications dependent on scores of components and cloud services, APM vendors are hard-pressed to provide visibility into the entire scope of a single application.
As 24x7x365 uptime becomes increasingly critical to business success, enterprises need to be able to predict and address issues BEFORE they affect the business, rather than after. APM has had mixed success in this area. A recent survey by TRAC Research found that among organizations deploying APM solutions, 60 percent succeed less than half the time in identifying performance issues before they have an impact on end users.
Enter Predictive Analytics for IT
This is where Big Data and predictive analytics for IT come in: by filling these APM gaps, they can play a significant, highly beneficial role in IT's efforts to maintain application performance. Today, when IT encounters performance issues, it typically has to gather its server, storage, network, and APM folks into a war room to search through mountains of hardware and APM logs, and correlate information manually to isolate the root cause. This resource-intensive process can frequently take hours or even days.
IT has lots of alerts and thresholds to analyze, but those are only as good as the knowledge, experience, and insight of the IT folks who configured them. Just because a server surpassed its CPU utilization threshold doesn't mean that event had anything to do with the root cause of an application issue. Often the real issue is hidden deep in all the delicate interactions among multiple hardware and software components, and may not be reflected in individual thresholds. The same TRAC Research study shows an average of 46.2 hours spent by IT each month in these war rooms searching for root cause. Even more depressing, the root cause is often not found, so IT just reboots everything in the hope that it all works until the same problem rears its ugly head again.
Predictive analytics take over where APM leaves off, harnessing third-generation machine learning and Big Data analysis techniques to efficiently plow through mountains of log data. They discover the behavior patterns and interrelationships among the IT software and hardware components driving today's mission-critical applications. Over several hours or days, the best solutions baseline the normal behavior of all those components, relationships, and events and use complex algorithms to detect the anomalies that are the early warning signs of developing performance issues. Better yet, because the analytics understand the chain of events involved in a developing anomaly, IT support staff are immediately provided with not only the alert that something is going wrong, but also the behavior of every component involved. This information can shave hours or even days off those war room scenarios. For example, thanks to a predictive analytics for IT solution, a major retailer was able to trace periodic gift card application outages to a misconfigured VLAN. Similarly, a predictive analytics solution cut the time it took to diagnose a financial content management performance issue from six hours in the war room to ten minutes.
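As a rough illustration of the baselining approach described above (the metric values, window sizes, and thresholds here are all hypothetical, not taken from any particular product), a solution might learn a rolling mean and standard deviation for each metric and flag readings that deviate sharply from that learned norm:

```python
from collections import deque
import statistics

class MetricBaseline:
    """Learns the normal range of a single metric from a sliding
    window of observations and flags statistical outliers."""

    def __init__(self, window=100, z_threshold=3.0):
        self.window = deque(maxlen=window)
        self.z_threshold = z_threshold

    def observe(self, value):
        """Return True if the reading is anomalous vs. the learned baseline."""
        anomaly = False
        if len(self.window) >= 30:  # need enough history before judging
            mean = statistics.mean(self.window)
            stdev = statistics.stdev(self.window) or 1e-9
            z = abs(value - mean) / stdev
            anomaly = z > self.z_threshold
        self.window.append(value)
        return anomaly

# Steady response times around 50 ms, then a sudden spike to 250 ms.
baseline = MetricBaseline()
readings = [50 + (i % 5) for i in range(60)] + [250]
flags = [baseline.observe(r) for r in readings]
print(flags[-1])  # only the spike stands out against the learned baseline
```

Real solutions baseline many metrics at once and, crucially, model the relationships between them, but the core idea of self-learned normal behavior is the same.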
Another advantage of predictive analytics solutions is that because they self-learn the normal behavior patterns of underlying components, they drastically reduce the educated guessing that usually goes along with IT staff identifying and setting thresholds against key performance metrics. The inflexibility of those static thresholds results in large numbers of false-positive alerts. But with predictive analytics, highly sophisticated algorithms compute the probability of certain behaviors and can therefore generate much more accurate alerts. Some users of predictive analytics solutions have called them the Donald Rumsfelds of IT management tools because they point IT to infrastructure issues they never even knew existed and never looked for. Rumsfeld called these the "unknown unknowns."
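To sketch the difference between a static threshold and a probability-based alert (again, the baseline figures below are invented for illustration, and a rough Gaussian model stands in for whatever distribution a real product would fit), the analytics can rank how unusual each reading is rather than firing on every crossing of a fixed line:

```python
import math

def gaussian_tail_probability(value, mean, stdev):
    """Probability of seeing a reading at least this far from the mean,
    assuming the metric's normal behavior is roughly Gaussian."""
    z = abs(value - mean) / stdev
    # Two-sided tail probability via the complementary error function.
    return math.erfc(z / math.sqrt(2))

# Hypothetical learned baseline: CPU utilization averaging 40% +/- 8%.
mean, stdev = 40.0, 8.0

# A static threshold at 70% fires identically on a routine 72% burst
# and a genuine 95% runaway; probabilities separate the two cases.
for reading in (45.0, 72.0, 95.0):
    p = gaussian_tail_probability(reading, mean, stdev)
    print(f"{reading:5.1f}% -> tail probability {p:.2e}")
```

Alerting only when the probability drops below a small cutoff is what lets these tools suppress the false positives that rigid thresholds generate.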
However, it is in their ability to be "predictive" that these advanced analytics solutions really shine. By detecting small anomalies early in the game, predictive analytics can alert IT to performance issues and provide enough information to address their root cause before IT or application users even notice them. This can have a dramatic effect on application uptime and performance and a direct impact on user satisfaction and even enterprise revenue. In the case of the financial content management application, predictive analytics discovered a developing performance issue, and its root cause, the night before it would have affected users placing the application under load on Monday morning.
APM tools have their place in the enterprise, but predictive analytics solutions for IT can kick the effectiveness of those and other IT monitoring tools up a notch by detecting, tracing, and predicting performance issues and their root cause long before any IT war room can.
- TRAC Research, March 4, 2013: "2013 Application Performance Management Spectrum" report.