
An End-to-End Solution

Can one company fulfill all of the requirements of an organization?

I was asked by Mr. Peter Hastings (NH DoIT Commissioner) about my understanding and knowledge of "End to End Solutions". I have witnessed these solutions before, but I wanted to find a good definition. So, I began my research and found this one: one supplier or one company that can provide all of your hardware and software needs in a suite of tools that meets all of the customer's requirements, so that no other suppliers need to be involved. I think this is a good definition and it makes sense to me.

I'm sure many other definitions exist and this one is simplistic in nature, but I see it as a baseline. One could certainly branch off of it to encompass a variety of explanations, but let's stick with this one for this article.

This idea has been around for some time now, and over the years it seems to fade away and pop back up, again and again. I have been watching for a long time as well, witnessing the process and implementation of these suites over the years, and I have developed my own opinions about all-encompassing tools. So, this has come up before in my career.

Commissioner Hastings wanted to understand, at a high level, whether there was any benefit for our development teams across the enterprise of State government to integrate an entire suite of tools to achieve a more seamless Software Development Life Cycle (SDLC). If this were possible, there could be savings realized through shorter development life cycles: rather than an application taking three years to complete, trim a year and the State saves money.

Theoretically, this is a great idea and makes good sense. In reality, though, the only shops that can take on an idea of this magnitude are shops with a lot of money, a lot of staff and a lot of expertise in house, because inherent in the words "End to End Solution" I hear big, large, enormous, all-encompassing; there is a lot of overhead in this idea right out of the gate.

I explained to the Commissioner in advance, prefacing my remarks, that I would be happy to write about this subject but that not all would agree with my analysis and conclusions. I explained that this would only be my perspective and that there are varying experiences that would be a counterpoint to my thoughts on this subject. The Commissioner is quite open to varying opinions and really just wanted to understand the subject at a high level more than anything else. Now, I can only tell you what I have seen and witnessed over the years and share my thoughts.

The premise is that one company can fulfill all of the requirements of an organization. Can this be true? Can one supplier/company provide this much and help an organization realize a cost savings? My opinion is that I have not seen this be true to date. It's a big white unicorn, although you have to believe in a unicorn once in a while. Every company has its specialty: those things it does well, and those things it tries to do well and fails at.

We are all microcosms of the wider world, so we are all "End-to-End Solutions" in some way. Do we as humans do it all well? I think not. We all have those things we do well and those things we don't. The same goes for companies. So, the premise that a company can deliver it all is, to me, not real but fantasy. But let's take it one step further and play devil's advocate for a moment: suppose a company could deliver an entire "End to End Solution," a suite of tools that delivered it all.

These are the next questions to ask: Can the organization it's being delivered to actually manage it all? Does it have the staff to support the toolsets in the suite? Does it have the expertise required to implement them? Does it have the time it takes to learn and support them? Can it really manage an entire suite of toolsets successfully?

Consider even one tool in the suite; let's take the SCM tool, since I know that tool well. You have the software configuration administration piece of the tool, you have the release management piece performing compiles, you have a defect tracking tool to manage deficiencies, modifications and enhancements against the application, and then you have code reviews. Theoretically, this could be three to four people with these skills, and that's just one toolset in the suite. Do you have all of this expertise in one person in house already? SCM is not an easy skill to come by to begin with.
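
To make those moving parts concrete, here is a minimal sketch in Python, with hypothetical names rather than any particular vendor's API: a defect drives a change, the change is versioned and reviewed, and release management compiles only what has passed those gates.

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Defect:
        defect_id: str            # lives in the defect tracking tool
        status: str = "open"      # open, fixed, verified, ...

    @dataclass
    class ChangeSet:
        revision: int             # assigned by the configuration administration piece
        files: List[str]
        defect: Defect            # every change traces back to a tracked defect
        reviewed: bool = False    # flipped by the code review step

    def promote_to_build(changes: List[ChangeSet]) -> List[int]:
        # Release management: compile only changes that passed review and
        # whose defect has actually been marked fixed.
        return [c.revision for c in changes
                if c.reviewed and c.defect.status == "fixed"]

    # Usage: one reviewed fix gets promoted, an unreviewed one does not.
    fixed = Defect("D-101", status="fixed")
    ready = ChangeSet(42, ["billing.py"], fixed, reviewed=True)
    not_ready = ChangeSet(43, ["report.py"], Defect("D-102"))
    assert promote_to_build([ready, not_ready]) == [42]

Even this toy version implies three or four distinct roles, which is the point being made above.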

What tools will your organization do well at, and which ones do you have the knowledge in already? Several toolsets in the suite may work because there is expertise in house on those toolsets but not on others, and you may have to hire or train staff on the rest, providing you have the staff to train. In any case, it usually turns out that you don't have all that it takes, and as such the implementation fails. Once the implementation fails, the buy-in from management fades, because no one wants to be associated with a failing project, and the grassroots folks who are really working the products on the ground are too stressed from the implementation. The experiment is over, and a lot of money is wasted that could have been targeted in another, more productive manner.

The problem, really, is that when you get a lot of folks in a room, the sell is on, the atmosphere is charged with enthusiasm, and a lot of folks see potential savings not yet realized, they become swept away in a sell. When this happens, it usually ends up costing an organization a lot of unintended money, and at some point the question is always asked: what did we do?

I agreed to write on this subject ultimately because I'm bugged about "End to End Solutions". I'm bugged about the sell. I'm bugged about the hype and the promise that I don't see how anyone could deliver. For example, I have recording software that is more automated than anything I have seen before, but I still have to know how to run the tool, record with it, mix, add effects, edit the recording, and master to a stereo track, even though the tool is very automated.

The same would be the case for the "End to End Solution": there is the Testing tool, the QA tool, the Project Management tool, the Requirements Gathering tool, the SCM tool, the Defect Tracking tool, and whatever other tool or plug-in you add to the suite. I'm sure the suites can be customized as well to include only what you need for your organization, which makes more sense.

The bottom line is, that's a lot of expertise, staff, time, money and resources. The tools have to be learned; you might have to hire additional folks to support them; others will have to be cross-trained in the event that your expertise goes away. What organization can really afford this level of overhead other than large organizations with a chunk of capital?

I think one of the big mistakes organizations make is that when these sells are occurring, there is oftentimes no expertise from the organization in the room to ask the tough questions. What training is the supplier offering? What is the technical support like? Are you using a proprietary database? If the database goes down, who supports it? And so on.

So, it's typically managers who are in the room, who are not even going to do the work on the ground or support these behemoths. I'm bugged by the suites themselves, which typically do a good job in one area and fall down in others, yet now you have an entire suite of tools to make work. For example, I have seen some companies make a great SCM tool but a terrible Defect Tracking tool, or a great Project Management tool but a terrible SCM tool.

The idea that one supplier can do it all is, I feel, flawed. It's just like when I'm recording music, since that's the analogy I understand best: no matter how automated the mix, whether the tool says click here for rock, jazz, easy listening, etc., the user still has to do something to get to that place, and if that is not the right place, the user must understand how to get to the right place. So, human intervention is required. Truly automated tools are a myth in my opinion.

Now, a lot of folks are going to disagree with me, because they are selling these monsters to people who can't support them, implement them, maintain them, provide technical support on them or mentor others on them. It really requires an army, depending on how big the organization is, and that brings up my first example.

Every organization wants to save money, and oftentimes they find themselves getting sold on this pot of gold: if everything could be integrated across the enterprise of our business, however big or small, we could save so much money. And here's the thing, as I said earlier.

A large organization can take the impact of a completely integrated software environment like this because it has the staff. If it has the staff, then it's possible it has the expertise, or at least staff that can be trained by the seller initially. However, when the seller leaves, there must be competent staff who can truly maintain the tools and champion them to create the standard that can actually produce the desired result and cost savings.

For most organizations, the additional staff, training and expertise would in itself more than likely be a great unintended expense. Good for you and your organization if you have expertise in the suite's toolsets in house, but typically that is not the case.

Oftentimes you have to go outside your space to find some help: maybe someone who knows QA, which is hard to find these days, or SCM, which is even harder, or testing, etc. If you have a suite of five toolsets, you must have a champion for each one of those toolsets in order to introduce them successfully in an organization.

Now, if you try this with State government, you're even more likely to fail, because you never have the staff, expertise, time or money. I have seen this attempted at least two times in my tenure.

Also, something organizations don't take into consideration when taking on suites like this is that it usually requires a culture that accepts change, having its cheese moved, and understanding the chain of command. It typically requires a culture shift, and most folks don't like change, including me. However, it was once said to me many years ago that change is stability; it's the one constant. So, I try to believe in that.

So, consider for one moment an organization taking on a tool suite of this nature, the "End to End Solution". Once you have hired staff, have your trained expertise in house, and your staffing is solidly in place, you then need your leader to convincingly direct the staff to implement and accept the new methodology. This industry is always in flux, moves fast, and is never static, but change is still difficult to implement.

When everyone gets on the bus and you have institutionalized a new culture with new toolsets, and they are embraced, then savings can be realized, because anyone who then draws outside the lines changes the integrity of the outcome, and a new outcome emerges.

I've told you what you need, and that larger organizations are more successful at this than smaller ones because they have staffing, expertise, money and time on their side.

Let me just provide one example of what I have seen, and the way I think this can be done successfully even in smaller organizations. Many attempts fail, and the reason is simple: it's too big to eat. I mean, these implementations are monstrous. For starters, they take multiple administrators, one for every toolset in the suite. One person should never run the whole suite; each tool should have its own administrator, because otherwise it's a security risk. You never want one person holding all the keys to the kingdom anyway.

There are multiple layers of understanding and expertise, and it takes a lot of time to get up and running. Now, I'll have plenty of folks who disagree with me, and that's fine. They are the ones selling these suites. They are going to have a different opinion, and that's fine.

As I have learned in life anyway, one-third of the people are going to like you, one-third of the people are going to hate you, and one-third of the people don't care.

So, you purchase the suite anyway, and the first toolset you'll tackle is the testing suite. Now you need a body who understands the testing tool: how to write the scripts that get populated into the tool to run the tests, scripts that are constantly changing based on new functionality, modifications, deficiencies and enhancements against the applications under development. If you don't have a good scriptwriter for testing, you are already in trouble.
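
As a rough illustration, here is a minimal sketch of the kind of test script a scriptwriter maintains, assuming a pytest-style Python tool; calculate_fee and the expedited option are hypothetical stand-ins for application functionality, not anything from a real suite.

    import pytest

    def calculate_fee(amount: float, expedited: bool = False) -> float:
        # Stand-in for the application code under test.
        fee = amount * 0.02
        return fee + 5.00 if expedited else fee

    def test_standard_fee():
        assert calculate_fee(100.0) == pytest.approx(2.0)

    def test_expedited_fee_adds_surcharge():
        # Added when the (hypothetical) expedited option shipped: the kind of
        # constant script change driven by new functionality described above.
        assert calculate_fee(100.0, expedited=True) == pytest.approx(7.0)

Every deficiency fix, modification and enhancement spawns or changes tests like these, which is exactly why a dedicated scriptwriter matters.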

Next you are going to tackle the Project Management tool. Who really knows this skill? I've never met anyone who actually used a tool like this, understood it, and implemented it well. That's just my experience; I'm sure they are out there, but to me this is a rare expertise. Then you move on to the QA tool, the SCM tool, the Defect Tracking tool, etc. There could be any combination of toolsets. My point is that you have to eat one tool at a time. It's extremely hard to begin with to standardize any tool across any organization, never mind a suite of toolsets. However, if you take a small bite, digest that, then take the next one, and so on, you'll be more successful.

I feel scalability is the answer and actually makes more sense. If you can customize the suite to include only what you have expertise in, and build upon that knowledge slowly, I feel you can be more successful. For example, start with the expertise you have in house; let's say you have SCM and Defect Tracking knowledge.

Introduce those tools and let users in your organization get comfortable with that standardization. Then, once that is stable, introduce Project Management, then Testing, then Requirements Gathering, etc. Clearly, it doesn't have to be in this order; it can be in whatever order you want. But if you go slowly, you won't break the bank, and it will evolve. I always tell folks this is an evolution, not a revolution.

Just look at world events: when there is a revolution and the old government is out of office, typically no one knows how to introduce a new government successfully. Even when they do, they try to do the whole thing at once, and it fails, because it usually took years to get into the situation you are in, and it's going to take time to get out of it and change it. So, an evolution is always a better approach in my opinion: slow progress over time, gathering steam as one tool is introduced after another until it's all in place.

Should an organization even take on this kind of exercise, you have to put competent people on it, because competent people can mentor others, and this makes the implementation more successful at the end of the day. Every person has their strengths and weaknesses, so you're going to want to put folks on the job who have a proven track record of pushing change through. Not an easy thing to do.

You want folks running this who have experience with rejection, and a leadership that is 100% unfaltering in driving the process. One exception to the process and the entire suite will fail.

If you were smart, while you were implementing the toolset you have strength in, you would have been learning the next toolset in the suite, ready to move forward with that implementation the moment the first tool was completely in place. This way there is no pause before the next learning curve.

There is a lot of work that goes into these suites, and it's not just staffing a tool like this and finding the expertise to implement the various pieces. It includes upgrading the tools, applying bug fixes, maintaining the OS, and supporting backups. The list really does go on.

One thing to make sure of when examining the SCM toolset in these suites is that it has a generic repository perspective. This really is the answer for the SCM piece of the puzzle. You need repositories, which is a glorified name for folders that will house any type of asset in the organization. You can have an "End to End Solution" for one development tool and one project, but most development environments contain multiple tools (Java, COBOL, PowerBuilder, etc.). Otherwise you'll have to introduce multiple toolsets, with twice the overhead and no way to measure across an enterprise. That's a waste of money in the final analysis.

You need to make sure these repositories can support any platform of development tool that your organization has in house or across the enterprise, depending on how your organization is structured. SCM tools in a suite of this nature that support only one platform or another are limited in scope and are not flexible enough to be called standards, and standardizing and centralizing is how you really save money.
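
Here is a minimal sketch, assuming nothing about any specific SCM product, of what a generic repository perspective means in practice: the repository stores and versions raw bytes, so it does not care whether an asset is Java, COBOL, PowerBuilder or a Word document. The class and paths are hypothetical.

    import hashlib
    from pathlib import Path

    class GenericRepository:
        # Stores any asset byte-for-byte; the content hash is the version ID,
        # so a COBOL copybook and a .NET project file are handled identically.
        def __init__(self, root: str):
            self.root = Path(root)
            self.root.mkdir(parents=True, exist_ok=True)

        def check_in(self, asset_path: str) -> str:
            data = Path(asset_path).read_bytes()       # bytes, so any asset type works
            digest = hashlib.sha256(data).hexdigest()
            (self.root / digest).write_bytes(data)
            return digest

        def check_out(self, digest: str) -> bytes:
            return (self.root / digest).read_bytes()

    # Usage (file names are hypothetical):
    #   repo = GenericRepository("/var/scm/generic_repo")
    #   v1 = repo.check_in("PAYROLL.CBL")      # COBOL source
    #   v2 = repo.check_in("Invoice.csproj")   # .NET project file

The design choice to test for is exactly this: nothing in the check-in path should depend on the language or tool that produced the asset.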

I don't know of too many organizations that have only one application language they are developing in and nothing else, and if you're serious about SCM, you have to take into account every type of software asset or intellectual property you have in the enterprise.

I'm not saying there are no exclusive Java, Visual Studio or PowerBuilder shops, because they exist. What I am saying is that large organizations, States for example, typically have a lot of legacy applications developed in many types of languages, and an SCM tool in your "End to End Solution" suite that covers only one language or another could never be a standard. Where would you house all the rest of your source data assets, intellectual property or mission-critical assets: in a second SCM tool, or a third?

If that is the case, there goes the whole benefit of reporting, managing, requirements gathering, and testing on all of your assets in the enterprise, because you won't have everything in one repository but in multiple repositories. Then the "End to End Solution" idea is moot; the benefit is lost. Now you will need more expertise, more money, more staff, more time, more resources, and you're wasting more of everything while losing the value of the "End to End Solution" anyway. There is no value in supporting just one type of language; the metrics will be quite limited, and remember, the purpose of these suites to begin with is to standardize, centralize and realize cost savings across an entire organization, not just one project.

Listen to how this makes no sense: we spent all of the money and bought an "End to End Solution" just for the Java team, or the Visual Studio team, or the PowerBuilder team. What about all the other types of development you are performing? Don't you want to save money in those areas of application development as well? For the cost of these suites, I would hope so; anything else would be a waste of time, money and resources for any organization.

There is also the disaster recovery aspect: would you secure one set of assets and not another? For example, would I care only about my PowerBuilder source assets and not my Visual Studio assets? If I have a repository perspective that can handle only one type of data asset, then after a catastrophic event I can recover one set of assets but not another. So, when looking at these suites and considering disaster recovery, make sure the SCM toolset has a generic repository that can secure and control any type of source data asset in your organization. It doesn't have to be great at it, but it does have to work.
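
As a minimal sketch of what language-agnostic disaster recovery implies, the routine below (paths and names are hypothetical) archives every repository the same way, so a catastrophic event never leaves one class of assets unrecoverable.

    import tarfile
    from datetime import date
    from pathlib import Path

    def backup_all_repositories(repo_root: str, backup_dir: str) -> list:
        # Archive every repository identically, regardless of what kind of
        # source assets it holds, so no asset type is left out of recovery.
        out = Path(backup_dir)
        out.mkdir(parents=True, exist_ok=True)
        archives = []
        for repo in sorted(Path(repo_root).iterdir()):
            if not repo.is_dir():
                continue
            archive = out / f"{repo.name}-{date.today()}.tar.gz"
            with tarfile.open(archive, "w:gz") as tar:
                tar.add(repo, arcname=repo.name)   # bytes in, bytes out: any language
            archives.append(str(archive))
        return archives

    # Usage (paths are hypothetical):
    #   backup_all_repositories("/var/scm/repositories", "/mnt/offsite/backups")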

There are a lot of pitfalls to these "End to End Solutions", so make sure you have competent folks in the room who can ask the hard questions. If the managers are embarrassed by the hard questions at the dog and pony show, they are going to be even more embarrassed when they buy the suites and begin to lose money for the organization, the implementation fails, they lose their jobs or have to move on to distance themselves from the failure, and the organization has to absorb the cost.

You have to make sure you have the staff, the time, the money and the expertise, and that you choose suites that can support every type of source data asset in your enterprise. Make sure leadership convincingly communicates the change. Take small bites, digest those, and make the cultural shifts your organization needs before moving on to the next toolsets in the suite. It's an evolution, not a revolution, so think it through and don't get caught up in the sell. It will only cost you more money later when you buy on impulse.

My background is that I have been working in Software Configuration Management (SCM) for a few years now, and I am presently the administrator of SCM at the State of NH: a process-based tool for managing application source code, intellectual property and mission-critical assets. In this capacity I also secure these assets for disaster recovery purposes. I manage 400-plus applications housed in SCM and support 400 users of the product. The development tools we currently use are PowerBuilder PBV8, PBV11 and PBV12; Visual Studio 2003, 2005, 2008, 2010 and 2012; Eclipse (Juno), RAD, Mule, ColdFusion, Java, COBOL and VB.

As the Software Configuration Manager, I provide the administration of the source code management tool. This includes the entire infrastructure of the development environment: developing life cycles; providing best practices, procedures, processes and documentation; defect tracking; disaster recovery; maintaining build machines; and training all the developers on proper source code management using the development tools in our environment.

Risks of End to End Solutions

  • Not enough staff to support the toolsets
  • Not enough expertise to understand the toolsets with great clarity (Testing, SCM, Defect Tracking, QA, Requirements Gathering, Project Management)
  • Loss of expertise to other employment opportunities
  • No cross-training of expertise or transfer of knowledge
  • No solid implementation team
  • The experts required to analyze these tools are typically not invited to the dog and pony show
  • Not enough capital when problems arise

More Stories By Al Soucy

Al Soucy is software configuration manager at the State of New Hampshire's Department of Information Technology (DoIT). In that role Al manages software configuration for dozens of PowerBuilder applications as well as applications written in Java, .NET, and COBOL (yes, COBOL). Al plays bass guitar, acoustic guitar, electric rhythm/lead guitar, drums, mandolin, and keyboard; he sings lead and backup vocals; and he has released 8 CDs.

