Timing Technological Change: Creation vs. Consumption

Most people nod their heads when there is talk about programmability, APIs, and massive automation frameworks

In any space, there is a very small vocal minority. Most people lack the time, interest, or even confidence to say what they think in public. So we are left with a vocal few who drive the conversation. In networking, the vocal minority consists mainly of the vanguard of change. For these people, the network is more than just some connective tissue inside a nebulous infrastructure. It is their life. They live and breathe it. Accordingly, they have strong opinions about how things work and, more importantly, how they ought to work.

What is happening now, though, is that we run some risk of the luminaries creating an impassable distance between their vision and the on-the-ground reality in many IT shops today.

Most people nod their heads when there is talk about programmability, APIs, and massive automation frameworks. But head-nodding should not be mistaken for agreement and support. It simply shows an understanding of the argument. It can mean anything from “Oh my god! Let’s start right now. Where do I sign the PO?” to “Your logic is sound. You are really smart. Where did I put my Cat6k?”

When you look at the large, web-scale properties, the need for customization and the desire for a DIY environment is huge. These guys simply cannot operate in an environment that is as costly or difficult to manage as what most companies have today. This is why companies like Google have embraced SDN and white box switching from the outset. They stand up in front of conference crowds and talk about what they have done.

And everyone nods their heads.

But let’s be clear: not everyone is Google. Even if their needs were the same, most companies aren’t printing enough money to make hiring a whole new team feasible. They live in a different reality.

The truth is that there is no inherent nobility in what we build. Whether your environment is massive and complex, or small and simple, all any of us is really trying to do is support our business. If our business requires that we build something sophisticated, we will go and build it. But there are precious few jobs where sophistication itself is the goal. So barring the work done in professional labs or academia, the vast majority of our market consists of people whose objective is really straightforward: make my business work.

This creates an interesting dynamic. The industry dialogue is dominated by the newest and most sophisticated technologies while the industry buying motions are dominated by the same legacy solutions that have been popular for several decades. I don’t mean to suggest that legacy players can rest on their laurels, but the transition from an aging environment to a newfangled one is not so easy (or even a priority) for many companies. And the further out ahead of demand that vendors go, the more difficult it is to bring customers along.

Make no mistake about it: there will be vendors who overshoot. They will reach too far, convinced that the change is imminent. They will be right. But they will be early. The question for these vendors is whether they have the funds to wait for the market to catch up.

This means that vendors in this space have to be worried about more than just getting the technology right – they must also get the timing right. Show up too late, and you are obsolete before you hit the market. Show up too early, and you end up going out of business before the market adopts the technology.

Ultimately, for companies to be successful, they need to be a part of important technology trends while not creating too much distance between themselves and the market. But how do you thread that needle?

The key is creating a solid technological foundation (you cannot risk obsolescence) and then applying an intuitive approach on top of it (make it easily adoptable). There is just as much skill in making technology consumable as there is in willing it into existence. Innovation is more than just exposing a bunch of capabilities through APIs and configuration knobs. You need to make those capabilities relevant and easy to use in context.
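
To make this concrete, here is a hypothetical sketch, in Python, of the same capability exposed two ways: once as raw configuration knobs, and once as an intent stated in the vocabulary of the business. The Device stub, the prioritize() helper, and the QoS numbers are all illustrative assumptions, not anything from Plexxi or any real product.

```python
# Hypothetical sketch: the "creation" view vs. the "consumption" view of one capability.
# Every name and value here is made up for illustration.

class Device:
    """Stub that just records low-level configuration calls."""
    def __init__(self):
        self.config = []

    def set(self, path, **knobs):
        self.config.append((path, knobs))


def configure_qos_raw(device, dscp, queue_id, rate_kbps, burst_bytes):
    # The "creation" view: every knob exposed, and every knob required.
    device.set("class-map web", match_dscp=dscp)
    device.set("policy-map edge", queue=queue_id, rate=rate_kbps, burst=burst_bytes)
    device.set("interface uplink", service_policy="edge")


def prioritize(device, application):
    # The "consumption" view: context supplies the arcane bits and pieces,
    # and the operator states intent in business terms.
    profiles = {"web": dict(dscp=46, queue_id=1, rate_kbps=50000, burst_bytes=8192)}
    configure_qos_raw(device, **profiles[application])


if __name__ == "__main__":
    d = Device()
    prioritize(d, "web")   # one line of intent...
    print(d.config)        # ...expands into the knobs underneath
```

The knobs do not go away; they are simply supplied by context, and that is what makes the capability consumable.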

This is partly why OpenFlow has an adoption problem. Tons of capability, but very difficult to use. And it’s not just OpenFlow. If you look at a lot of the edge policy features popular in service provider environments, or complex traffic engineering setups, or even elaborate QoS schemes… they all suffer from a general lack of intuitiveness. Unless the network is your business, you don’t have the time or desire to sift through it all and figure out the arcane bits and pieces.
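
To give a feel for what "tons of capability, but very difficult to use" means in practice, here is a minimal sketch using the open-source Ryu controller framework to express a single, simple intent (send web traffic out port 2) as a raw OpenFlow 1.3 flow rule. The app name, port numbers, and priority are illustrative assumptions, not anything from the original post.

```python
# Minimal Ryu app (illustrative): install one OpenFlow 1.3 rule that sends
# TCP port 80 traffic out switch port 2. Names and numbers are assumptions.
from ryu.base import app_manager
from ryu.controller import ofp_event
from ryu.controller.handler import CONFIG_DISPATCHER, set_ev_cls
from ryu.ofproto import ofproto_v1_3


class WebPolicyApp(app_manager.RyuApp):
    OFP_VERSIONS = [ofproto_v1_3.OFP_VERSION]

    @set_ev_cls(ofp_event.EventOFPSwitchFeatures, CONFIG_DISPATCHER)
    def switch_features_handler(self, ev):
        datapath = ev.msg.datapath
        ofproto = datapath.ofproto
        parser = datapath.ofproto_parser

        # The intent is one sentence; the encoding is EtherTypes, protocol
        # numbers, instructions, and priorities the operator must get right.
        match = parser.OFPMatch(eth_type=0x0800, ip_proto=6, tcp_dst=80)
        actions = [parser.OFPActionOutput(2)]
        inst = [parser.OFPInstructionActions(ofproto.OFPIT_APPLY_ACTIONS, actions)]
        mod = parser.OFPFlowMod(datapath=datapath, priority=100,
                                match=match, instructions=inst)
        datapath.send_msg(mod)
```

Even this toy rule asks the operator to juggle match prerequisites, instruction types, and priorities; multiply that across tables, failover paths, and vendor quirks, and the intuitiveness gap becomes obvious.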

This should be instructive. As new companies attack the network, they need to emphasize more than just functionality. Those that address both the creation and consumption of technology will be uniquely positioned to take advantage of the vocal minority while servicing the buying majority.

[Today’s fun fact: An office chair with wheels travels 8 miles a year.]

More Stories By Michael Bushong

The best marketing efforts combine deep technical understanding with a highly approachable means of communicating. Plexxi's Vice President of Marketing Michael Bushong acquired these skills over 12 years at Juniper Networks, where he led the product management, product strategy, and product marketing organizations for Juniper's flagship operating system, Junos. Michael spent his last several years at Juniper leading its SDN efforts across both service provider and enterprise markets. Prior to Juniper, Michael spent time at database supplier Sybase and at ASIC design tool companies Synopsys and Magma Design Automation. Michael's undergraduate work at the University of California, Berkeley in advanced fluid mechanics and heat transfer lends new meaning to the marketing phrase "This isn't rocket science."
