The Fourth Digital Wave: The Age of Application Intelligence

This is the age of multi-device mobility, the cloud, seamless computing from one device to another

This post originally appeared on APM Digest

Welcome to the fourth era of digital.

The first three periods or ages or phases — call them what you like — were each defined clearly by transformative events.

First, the dawn of the personal computer age in April 1977 with the debut of the Apple II (and validated in August 1981 with the introduction of the IBM PC).

Next, the beginning of the Internet age when the Netscape browser was released in 1994, which redefined forever the way we connect.

Then, on June 29, 2007, came the mobility era, ushered in once again by Steve Jobs and Apple with the unveiling of the first iPhone, which brought a “Mobile First” mindset to the masses.

And now we’re in the fourth era. This time there’s been no single, monumental event or technology to mark its beginning, though mobility and the cloud are the primary enabling technologies. Instead, a number of technologies are coalescing and achieving, even as we speak, a critical mass that will make this age as transformative as, or more so than, any of the previous three.

This is the age of multi-device mobility, the cloud, seamless computing from one device to another, a growing ecosystem of connected devices (watches, cars, thermostats), instant and ubiquitous communication, and the blurring of the lines and hours between work and not-work. It’s a transformation that may have started with the smartphone, but it has now engulfed the way we use technology for, well, everything.

Organizations that master the ability to collect, understand, and act upon knowledge derived from user experiences, application behaviors, and infrastructure use across this connected ecosystem will outcompete those that don’t, and win in this fourth era of digital: the Age of Application Intelligence.

A Tectonic Technological Shift
There’s really no precedent for the speed of what has become a tectonic technological shift. In her much-anticipated Internet Trends 2014 report, KPCB’s Mary Meeker characterizes a tech market that saw 2013 growth of 20 percent for smartphones, 52 percent for tablets, and 82 percent for mobile data. She predicts 10x growth in mobile Internet units this decade, from the one billion-plus units/users of the desktop Internet to more than 10 billion for the mobile Internet.

Seemingly overnight, we have new models for hardware and software development, new models of behavior, and unforgiving expectations from consumers — for more apps, more functionality, more entertainment, more speed — driven by mobility, but extending to all online experiences regardless of interaction preference.

This is good, and it’s a great time to be in the thick of the enabling technology platforms — if you’re functioning with a model designed for this fourth era of digital.

On the other hand, it’s a pretty challenging time if you’re dealing with technology that matured in the early 2000s. Think huge, monolithic apps and sprawling private data centers with a proprietary console for every piece of infrastructure, all supported by “engagements” (a very loaded term) in which a literal or figurative truckload of consultants, engineers, and programmers would descend on an enterprise, spend several months and multiple person-years engrossed in a single project, and emerge at the end with a big, bloated, largely rigid “deliverable.”

And if the applications themselves were large and unwieldy and slow to adapt, the Application Performance Management systems were (and legacy systems still are) similarly complex, difficult to adapt, and slow to process the limited amount of data they collected. The notion of “real time” was not even a consideration.

It wasn’t that long ago, but it’s hard to imagine trying to do business like that today. In fact, you really can’t. Some of the legacy APM platforms are trying to make the transition, but it’s a difficult maneuver that requires the kind of wholesale reinvention few entrenched enterprises are willing to attempt, and fewer still can pull off.

The recent challenge faced by OpTier is a case in point. It’s always a bit alarming to see a player leave the arena, even a competitor. But it’s not likely to be the last such story we’ll hear.

Whether you’re building the applications themselves or the platforms to optimize their performance and business value, today everything is about speed, agility, and creating exceptional end-user experiences.

If you’re providing the applications, that means you have to be able to iterate quickly — often multiple times per day — and deliver the features and functionality your customers want, whether they’re outside or inside your enterprise. And of course, your apps have to be continuously available and meet your customers’ expectations for speed and performance, whatever OS or device they’re using. And you have to do this in an environment that is distributed, heterogeneous, complex, and ever-changing.

To pull this off requires a level of application intelligence designed specifically to succeed with these challenges in these environments.

Delivering Real APM Value

Specifically, for an APM platform to deliver real value for the application and the enterprise, it has to satisfy a number of key requirements, including:

  • Fast setup: Minutes or hours vs. days or weeks, with no need for a professional services “engagement.”
  • Self-learning, auto-configuring: Your apps and infrastructure change frequently; your APM platform needs to automatically detect and learn those changes and configure itself in real time, without manual intervention; there’s simply no time or resources for that.
  • Detect, diagnose, and respond: If there’s a problem, a slowdown, an outage, your APM platform should be the first to know about it, and whenever possible, should fix it before you know about it; or if it requires a bigger intervention, give you the data you need to solve it quickly.
  • Deliver actionable intelligence in real time: In the old days, APM was about speed and availability and not much else. In today’s software-enabled enterprises, the APM platform not only has to measure, monitor, and manage system health; it has to tell you, in real time, what impact performance is having on the business. The focus extends far beyond availability and throughput to the business transaction as the end user experiences it.
  • Provide end-to-end transaction visibility: Your applications may be running on your premises, in the cloud, or both; you need to be able to see what’s happening everywhere, through one pane of glass, because you can only manage, fix, and optimize performance that you can see.
  • Be insanely fast: When you do a release, you need to know immediately what’s working, what’s not, and how to fix things in a hurry, live, in production.

And it has to be stingy with overhead, scale itself and your applications up or down in response to changing demand, make the most of your resources and infrastructure, and more.
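To make the “self-learning” and “detect, diagnose, respond” requirements above concrete, here is a minimal sketch of that loop. This is a toy illustration, not any particular APM product’s API: the `ToyMonitor` class, its thresholds, and its `respond` hook are all hypothetical. It keeps a rolling baseline of response times, flags a transaction that deviates sharply from that baseline, and hands it to a remediation hook.

```python
from collections import deque
from statistics import mean, stdev

class ToyMonitor:
    """Toy 'detect, diagnose, respond' loop (illustrative only).

    Keeps a rolling baseline of response times per the self-learning
    requirement, and flags samples that deviate sharply from it.
    """

    def __init__(self, window=50, threshold_sigmas=3.0):
        self.samples = deque(maxlen=window)   # rolling baseline window
        self.threshold_sigmas = threshold_sigmas
        self.alerts = []                      # (transaction, ms) pairs flagged

    def record(self, transaction, response_ms):
        # Detect: compare the new sample against the learned baseline,
        # once enough samples exist to form one.
        if len(self.samples) >= 10:
            baseline, spread = mean(self.samples), stdev(self.samples)
            if response_ms > baseline + self.threshold_sigmas * max(spread, 1.0):
                self.respond(transaction, response_ms, baseline)
        # Learn: the baseline reconfigures itself as traffic changes.
        self.samples.append(response_ms)

    def respond(self, transaction, response_ms, baseline):
        # Respond: here we only record an alert; a real platform would
        # page someone, roll back a release, or scale out capacity.
        self.alerts.append((transaction, response_ms))
        print(f"SLOW: {transaction} took {response_ms:.0f}ms "
              f"(baseline ~{baseline:.0f}ms)")

monitor = ToyMonitor()
for _ in range(30):
    monitor.record("/checkout", 100.0)   # healthy, steady traffic
monitor.record("/checkout", 900.0)       # sudden slowdown triggers an alert
```

A production platform would of course learn baselines per transaction, per time of day, and per deployment, but the shape of the loop (learn continuously, detect deviation, respond automatically) is the same.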

That’s a far cry from the big, heavy, slow systems and processes of a few short years ago. And characteristics like these don’t just apply to APM — it’s the way of all technology development today, from VR gaming headset hardware to massive e-commerce systems. Fail fast and recover (smarter next time). Design from the outside-in. Iterate quickly. Respond in real time. Innovate faster than the competition, in technology and marketing. Create user experiences that drive success.

In Internet Trends, Mary Meeker says that “New companies — with new data from new device types — [are] doing things in new ways and growing super fast.” And she describes the rapid growth of “uploadable/sharable/findable real-time data.” These are ideas that describe much of what is driving this new, fourth era of digital.

The old adage is truer now than ever: change is the one constant you can count on. The organizations that can adapt continuously are the ones that will thrive and win.

The post The Fourth Digital Wave: The Age of Application Intelligence appeared first on the Application Performance Monitoring Blog from AppDynamics.


