DevOps, Automation, and Mid-Market Companies

When you think about the largest and most dynamic networks in the world, topics like automation are a no-brainer

The overall networking landscape has been going through a fairly deliberate shift over the past couple of years. Where we used to talk CapEx, we are now talking OpEx. Where we used to talk features, we are now talking about workflows. This change in industry dialogue mirrors the rise of trends like SDN and DevOps. I have been a huge fan of automation in general and DevOps in particular for many years now. But, as an industry, are we leaving people behind unintentionally?

When you think about the largest and most dynamic networks in the world (typically characterized as either service providers or web-scale companies), topics like automation are a no-brainer. The sheer number of devices in the networks that these companies manage demands something more than keying in changes manually. And for these types of companies, the network is not just an enabler – it is a central part of their business. Without the network, there is no business. It’s not terribly surprising that these companies hire small armies of capable engineers and developers to make everything function smoothly.

In these environments, automation is not a nice-to-have. It’s closer to food and water than it is to sports and entertainment. Accordingly, their interest in technologies that support automation is high. Their capability in putting automation tools to use is high. And if their abilities do not match their requirements, they open up their wallets to make sure they get there (think: OSS/BSS).

In networking, there is a prevailing belief that what is good for these complex environments will eventually make its way into smaller, less complex networks. It might take time, but the technologies and best practices that the most advanced companies employ will eventually trickle down to everyone else. It’s sort of the networking equivalent of Reaganomics.

But is this necessarily true?

First, let me reiterate that I am a huge advocate for automation and DevOps. But these capabilities might not be universally required. Automation is most important in environments where either the volume or the rate of change is high enough to justify the effort. If the network is relatively static, changing primarily to swap out old gear for new, functionally equivalent gear, it might not be necessary to automate much at all. And if network changes are tied only to incremental growth, it might not make sense to automate very much either.
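To make “high enough to justify the effort” concrete, here is a minimal back-of-envelope sketch of the break-even math. Every number in it (changes per month, minutes per manual change, hours to build and maintain the tooling) is a hypothetical assumption, not a figure from this post; plug in your own.

```python
# Break-even sketch: does automating a workflow pay for itself?
# All inputs are hypothetical assumptions -- substitute your own numbers.

changes_per_month = 8        # how often the workflow actually runs (assumed)
minutes_per_manual_run = 20  # hands-on time per manual change (assumed)
hours_to_automate = 40       # one-time effort to script the workflow (assumed)
upkeep_hours_per_month = 2   # ongoing care and feeding of the automation (assumed)

manual_hours_per_month = changes_per_month * minutes_per_manual_run / 60
net_saved_per_month = manual_hours_per_month - upkeep_hours_per_month

if net_saved_per_month <= 0:
    print("At this rate of change, the automation never pays back.")
else:
    payback_months = hours_to_automate / net_saved_per_month
    print(f"Manual effort: {manual_hours_per_month:.1f} h/month; "
          f"payback in roughly {payback_months:.0f} months.")
```

With those assumed numbers the payback works out to roughly five years, which is exactly the low-rate-of-change scenario where the juice may not be worth the squeeze.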

Automation enthusiasts (myself included) will likely react somewhat viscerally to the idea that automation isn’t necessary. “But even in these cases, automation is useful!” Certainly, it is useful. But what if your IT team lacks the expertise to automate all the things? What then? Sure, you can change the team up, but is it worth the effort?

And even if it is worth the effort, how far along the automation path will most companies need to go? It could be that simple shell scripts are more than enough to manage the rate of change for some companies. Full-blown DevOps would be like bringing a cruise missile to a water gun fight.
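As an illustration of how modest “simple scripting” can be, here is a minimal sketch that runs one command against a short, hard-coded list of switches over SSH. The device names and the command are hypothetical placeholders, and key-based SSH access is assumed to already be in place; a real version would use your vendor’s CLI syntax.

```python
#!/usr/bin/env python3
"""Minimal check/push sketch for a small, slowly changing network.

Hypothetical example: device names and the command are placeholders,
and passwordless (key-based) SSH access is assumed.
"""
import subprocess

DEVICES = [
    "switch-01.example.net",
    "switch-02.example.net",
    "switch-03.example.net",
]
COMMAND = "show version"  # stand-in for whatever change or audit you actually run


def run_on_device(host: str, command: str) -> bool:
    """Run a single command on one device over SSH and report success."""
    result = subprocess.run(
        ["ssh", "-o", "BatchMode=yes", host, command],
        capture_output=True,
        text=True,
        timeout=30,
    )
    print(f"{host}: {'ok' if result.returncode == 0 else 'failed'}")
    return result.returncode == 0


if __name__ == "__main__":
    for device in DEVICES:
        run_on_device(device, COMMAND)
```

For a network with a handful of devices and a change every few weeks, something of this shape, kept in version control, is often the entire “automation platform.” The full CI-driven DevOps toolchain only starts to earn its keep as device counts and change frequency climb.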

In saying this, I am not trying to suggest that automation or DevOps are not important. Rather, the tools we associate with them are just that: tools. They need to be applied thoughtfully and where it makes sense. Vendors that build these tools and then try to push them too far down into the market will find that the demand for cruise missiles drops off pretty precipitously after the top-tier companies.

Even smaller-scale infrastructure does require workflow though. The trick is in packaging the tools so that they are right-sized for the problems they are addressing.

This obviously starts with discarding the notion that workflows are common across all sizes of networks. That is simply not true. The reason there is pushback when people say the future of network engineering is programming is that, for many people, it is not yet a foregone conclusion that full-blown automation is worth the effort.

For these people, the juice isn’t worth the squeeze.

The conclusion to draw here is not that automation is not a good thing. It’s that automation packaged as a complex DIY project isn’t always the right fit. Not everyone wants to do it themselves. At home, it turns out I am capable of repainting a room, but it just isn’t worth my time, so I hire a professional. In a network, people might be fully capable of automating policy provisioning and still find that it isn’t worth doing because policy for them just isn’t that complex.

What vendors ought to be doing is packaging their workflow optimizations in a way that is far easier to consume. Rather than building scaffolding around the network to handle management, it might make sense to make the management itself much more intuitive and more a core part of the way devices are architected.

This might sound like a brain-dead statement, but consider that most networking devices are designed by people who do not run networks. And even worse, the workflows that dictate how things are used are frequently the last thing designed. If the mid-market and below are to get the advantages of the automation capabilities that the big guys are driving, vendors will need to design workflows explicitly for broad adoption.

If we really want to make the juice worth the squeeze, we need to make the squeeze a lot less painful. We need to move beyond automated networking and closer to intuitive networking.

[Today’s fun fact: Lake Nicaragua boasts the only fresh water sharks in the entire world. I would be very motivated not to fall down while water skiing.]

The post DevOps, automation, and mid-market companies appeared first on Plexxi.

More Stories By Michael Bushong

The best marketing efforts pair deep technology understanding with a highly approachable means of communicating. Plexxi's Vice President of Marketing Michael Bushong acquired these skills during 12 years at Juniper Networks, where he led product management, product strategy and product marketing organizations for Juniper's flagship operating system, Junos. Michael spent his last several years at Juniper leading its SDN efforts across both service provider and enterprise markets. Prior to Juniper, Michael spent time at database supplier Sybase and at ASIC design tool companies Synopsys and Magma Design Automation. Michael's undergraduate work at the University of California, Berkeley in advanced fluid mechanics and heat transfer lends new meaning to the marketing phrase "This isn't rocket science."
