
An End-to-End Solution

Can one company fulfill all of the requirements of an organization?

I was asked by Mr. Peter Hastings (NH DoIT Commissioner) about my understanding and knowledge of "End-to-End Solutions." I have witnessed these solutions before, but I wanted to find a good definition, so I did some research and found this one: a single supplier or company that can provide all of your hardware and software needs in a suite of tools that meets all of the customer's requirements, with no other suppliers involved. I think this is a good definition and it makes sense to me.

I'm sure many other definitions exist and this one is simplistic, but I see it as a baseline. One could certainly branch off of it to encompass a variety of explanations, but let's stick with this one for this article.

This idea has been around for some time now; over the years it seems to fade away and pop back up, again and again. I have been watching for a long time, have witnessed the process and implementation of these suites over the years, and have developed my own opinions about all-encompassing tools. So this has come up before in my career.

Commissioner Hastings wanted to understand, at a high level, whether there was any benefit for our development teams across the enterprise of State government in integrating an entire suite of tools to achieve a more seamless Software Development Life Cycle (SDLC). If this were possible, savings could be realized through shorter development life cycles: rather than an application taking three years to complete, trim a year and the State could save money.

Theoretically, this is a great idea and makes good sense, but in reality the only shops that can take on an idea of this magnitude are those with a lot of money, a lot of staff, and a lot of in-house expertise. Inherent in the words "End-to-End Solution" I hear big, large, enormous, all-encompassing, and there is a lot of overhead in this idea right out of the gate.

I prefaced my dialog with the Commissioner by explaining that I would be happy to write about this subject, but that not everyone would agree with my analysis and conclusions. I explained that this would only be my perspective and that there are varying experiences that would serve as counterpoints to my thoughts. The Commissioner is quite open to differing opinions and really just wanted to understand the subject at a high level more than anything else. So I can only tell you what I have seen and witnessed over the years and share my thoughts.

The premise is that one company can fulfill all of the requirements of an organization. Can this be true? Can one supplier provide this much and help an organization realize cost savings? In my experience, I have not seen it to date. It's a big white unicorn in my opinion, although you have to believe in a unicorn once in a while. Every company has its specialty: the things it does well, and the things it tries to do well and fails at.

We are all microcosms of the wider world, so we are all "End-to-End Solutions" in some way. Do we as humans do it all well? I think not. We all have things we do well and things we don't, and the same goes for companies. So the premise that a company can deliver it all is, to me, not reality but fantasy. But let's take it one step further and play devil's advocate for a moment: suppose a company could deliver an entire "End-to-End Solution," a suite of tools that delivered it all.

These are the next questions to ask: Can the organization it's being delivered to really manage it all? Does it have the staff to support the toolsets in the suite? Does it have the expertise required to implement them? Does it have the time it takes to learn and support them? And can it really manage an entire suite of toolsets successfully?

Consider just one tool in the suite; let's take the SCM tool, since I know that one well. You have the software configuration administration piece, the release management piece performing compiles, the defect tracking tool to manage deficiencies, modifications, and enhancements against the application, and then you have code reviews. Theoretically, that could require three to four people with these skills, and that's just one toolset in the suite. Do you already have all of this expertise in one person in house? SCM is not an easy skill to come by to begin with.

Which tools will your organization do well with, and which ones do you already have the knowledge for? Several toolsets in the suite may work because there is in-house expertise on them but not on others, and you may have to hire or train staff on the rest, provided you have the staff to train. In any case, it usually turns out that you don't have all it takes, and the implementation fails. Once the implementation fails, buy-in from management fades because no one wants to be associated with a failing project, the grass-roots folks actually working the products on the ground are too stressed from the implementation, the experiment is over, and a lot of money is wasted that could have been spent in a more productive way.

The problem is that when you get a lot of folks in a room, the sell is on, the atmosphere is charged with enthusiasm, and many people see potential savings not yet realized, they get swept away in the sell. When this happens, it usually ends up costing the organization a lot of unintended money, and at some point the question is always asked: what did we do?

I ultimately agreed to write on this subject because I'm bugged about "End-to-End Solutions." I'm bugged about the sell. I'm bugged about the hype and the promise that I don't see how anyone could deliver. For example, I have recording software that is more automated than anything I have seen before, but I still have to know how to run the tool, record with it, mix, add effects, edit the recording, and master to a stereo track, even though the tool is very automated.

The same would be the case for the "End-to-End Solution": there is the Testing tool, the QA tool, the Project Management tool, the Requirements Gathering tool, the SCM tool, the Defect Tracking tool, and whatever other tool or plug-in you add to the suite. I'm sure the suites can also be customized to include only what your organization needs, which makes more sense.

The bottom line is, that's a lot of expertise, staff, time, money, and resources. The tools have to be learned; you might have to hire additional folks to support them, and others will have to be cross-trained in the event that your expertise goes away. What organization can really afford this level of overhead other than a large organization with a chunk of capital?

I think one of the big mistakes organizations make is that when these sells are occurring, there is often no expertise from the organization in the room to ask the tough questions: What training is the supplier offering? What is the technical support like? Are you using a proprietary database? If the database goes down, who supports it? And so on.

So it's typically managers in the room, people who are not even going to do the work on the ground or support these behemoths. I'm also bugged by the suites themselves, which typically do a good job in one area and fall down in others, yet you now have an entire suite of tools to make work. For example, I have seen companies make a great SCM tool but a terrible Defect Tracking tool, or a great Project Management tool but a terrible SCM tool.

The idea that one supplier can do it all is, in my opinion, flawed. It's just like when I'm recording music, since that's the analogy I understand best. As automated as any mix may be, whether you click here for rock, jazz, easy listening, and so on, the user still has to do something to get to that place, and if it's not the right place, the user must understand how to get to the right one. Human intervention is required. Truly automated tools are a myth in my opinion.

Now, a lot of folks are going to disagree with me, because they are selling these monsters to people who can't support them, implement them, maintain them, provide technical support for them, or mentor others on them. It really requires an army, depending on how big the organization is, and that brings up my first example.

Every organization wants to save money, and they often find themselves getting sold on this pot of gold: if everything could be integrated across the enterprise, however big or small, we could save so much money. But here's the thing, as I said earlier.

A large organization can absorb the impact of a completely integrated software environment like this because it has the staff. If it has the staff, then it may have the expertise, or it at least has people the seller can train initially. However, when the seller leaves, there must be competent staff who can truly maintain the tools and champion them to create the standard that actually produces the desired result and cost savings.

For most organizations, the additional staff, training, and expertise would more than likely be a great unintended expense. Good for you and your organization if you have the expertise for the suite's toolsets in house, but typically that is not the case.

Often you have to go outside your space to find help: maybe someone who knows QA, which is hard to find these days, or SCM, which is even harder, or testers, and so on. If you have a suite of five toolsets, you must have a champion for each one in order to introduce them successfully in an organization.

Now, if you try this with State government, you're even more likely to fail, because you never have the staff, expertise, time, or money. I have seen this attempted at least twice in my tenure.

Also, something organizations don't take into consideration when taking on suites like this is that it usually requires a culture that accepts change, having its cheese moved, and understanding the chain of command. It typically requires a culture shift, and most folks don't like change, including me. However, someone said to me many years ago that change is stability; it's the one constant. So I try to believe that.

So consider for a moment an organization taking on a tool suite of this nature, the "End-to-End Solution." Once you have hired staff, have trained expertise in house, and your staffing is solidly in place, you need your leadership to convincingly direct the staff to implement and accept the new methodology. This industry is always in flux, moves fast, and is never static, but change is still difficult to implement.

When everyone gets on the bus, you have institutionalized a new culture with new toolsets, and they are embraced, then savings can be realized, because anyone who draws outside the lines changes the integrity of the outcome and a new outcome emerges.

I've told you what you need and that larger organizations are more successful at this than smaller ones because they have the staffing, expertise, money and time on their side.

Let me provide one example of what I have seen and the way I think this can be done successfully even in smaller organizations. Many attempts fail, and the reason is simple: it's too big to eat. These implementations are monstrous. For starters, they take multiple administrators, with each toolset in the suite having its own administrator. One person should never be running the whole suite; that would be a security risk, and you never want one person holding all the keys to the kingdom anyway.

There are multiple layers of understanding and expertise, and it takes a lot of time to get up and running. Now, plenty of folks will disagree with me, and that's fine; they are the ones selling these suites, so they are going to have a different opinion.

As I have learned in life anyway, one-third of the people are going to like you, one-third are going to hate you, and one-third don't care.

So you purchase the suite anyway, and the first toolset you'll tackle is the testing tool. Now you need a body who understands it: how to write the scripts that get populated into the tool to run the tests, scripts that are constantly changing based on new functionality, modifications, deficiencies, and enhancements against the applications under development. If you don't have a good scriptwriter for testing, you are already in trouble.

Now you are going to tackle the Project Management tool. Who really knows this skill? I've never met anyone who actually used a tool like this, understood it, and implemented it well. That's just my experience; I'm sure they are out there, but to me this is a rare expertise. Then you move on to the QA tool, the SCM tool, the Defect Tracking tool, and so on. There could be any combination of toolsets. My point is that you have to eat one tool at a time. It's extremely hard to standardize any single tool across an organization, never mind a suite of toolsets. However, if you take a small bite, digest it, and then take the next one and so on, you'll be more successful.

I feel scalability is the answer and actually makes more sense. If you can customize the suite to include only what you have expertise in and build upon that knowledge slowly, I feel you can be more successful. For example, start with the expertise you have in house; let's say you have SCM and defect tracking knowledge.

Introduce those tools and let users in your organization get comfortable with that standardization. Then, once that is stable, introduce project management, then testing, then requirements gathering, and so on. It doesn't have to be in this order, clearly; it can be in whatever order you want, but if you go slowly you won't break the bank and it will evolve. I always tell folks this is an evolution, not a revolution.

Just look at world events: when there is a revolution and the old government is out of office, typically no one knows how to introduce a new government successfully. Even when they do, they try to do the whole thing at once and it fails, because it usually takes years to get into the situation you are in, and it's going to take time to get out of it and change it. So an evolution is always a better approach in my opinion: slow progress over time, gathering steam as one tool is introduced after another until it's all in place.

Should an organization even take on this kind of exercise, you have to put competent people on it, because competent people can mentor others, and that makes the implementation more successful at the end of the day. Every person has their strengths and weaknesses, so you're going to want to put folks on the job who have a proven track record of pushing change through. Not an easy thing to do.

You want folks running this who have experience with rejection, and you need leadership that is 100% unfalteringly driving the process. One exception to the process and the entire suite will fail.

If you were smart, while you were implementing the toolset you have strength in, you would have been learning the next toolset in the suite, ready to move forward with that implementation the moment the first tool was completely in place. This way there is no pause before the next learning curve.

There is a lot of work that goes into these suites, and it's not just staffing a tool like this and finding the expertise to implement the various pieces. It includes upgrading the tool, applying bug fixes, maintaining the OS, and supporting backups. The list really does go on.

One thing to look for when examining the SCM toolset in these suites is a generic repository perspective. This really is the answer for the SCM piece of the puzzle. You need repositories, which is a glorified name for folders that will house any type of asset in the organization. You can have an "End-to-End Solution" for one development tool and one project, but most development environments contain multiple tools (Java, COBOL, PowerBuilder, etc.). Otherwise you'll have to introduce multiple toolsets, with twice the overhead and no way to measure across the enterprise. That's a waste of money in the final analysis.

You need to make sure these repositories can support any development platform the organization has in house or across the enterprise, depending on how your organization is structured. SCM tools in an "End-to-End Solution" suite that support only one platform or another are limited in scope and not flexible enough to be called standards, and standardizing and centralizing is how you really save money.
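To make the generic repository idea concrete, here is a minimal sketch in Python. Everything in it is illustrative: the repository names, platform list, and mapping are assumptions for the sake of the example, not any vendor's product or API. The only point it demonstrates is the coverage question you should ask at the dog-and-pony show: does every development platform in the enterprise have a home in the one SCM tool?

```python
# Hypothetical coverage check: does one SCM repository model cover every
# development platform in the enterprise? All names below are examples only.

# Platforms actually in use across the enterprise (illustrative list).
enterprise_platforms = {"PowerBuilder", "Visual Studio", "Java", "COBOL", "ColdFusion"}

# Repositories defined in the candidate SCM tool, mapped to the asset
# types (platforms) each one is meant to house.
scm_repositories = {
    "repo-legacy-cobol": {"COBOL"},
    "repo-powerbuilder": {"PowerBuilder"},
    "repo-dotnet":       {"Visual Studio"},
    "repo-java-web":     {"Java", "ColdFusion"},
}

def uncovered_platforms(platforms: set, repositories: dict) -> set:
    """Return the platforms that no repository in the SCM tool can house."""
    covered = set()
    for asset_types in repositories.values():
        covered |= asset_types
    return platforms - covered

missing = uncovered_platforms(enterprise_platforms, scm_repositories)
if missing:
    print("Not a true standard; these platforms have no home:", sorted(missing))
else:
    print("Every platform has a repository; the tool could serve as the standard.")
```

If any platform comes back uncovered, the tool cannot be the single standard, and the multi-toolset overhead described above follows.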

I don't know of too many organizations that develop in only one application language and nothing else, and if you're serious about SCM, you have to take into account every type of software asset and intellectual property you have in the enterprise.

I'm not saying there are no exclusive Java, Visual Studio, or PowerBuilder shops; they exist. What I am saying is that large organizations, states for example, typically have a lot of legacy applications developed in many languages, and an SCM tool in your "End-to-End Solution" suite that covers only one language or another could never be a standard. Where would you house the rest of your source assets, intellectual property, and mission-critical assets: in a second SCM tool, or a third?

If that is the case, there goes the whole benefit of reporting, managing, requirements gathering, and testing across all of your assets in the enterprise, because you won't have everything in one repository but in multiple repositories. Then the "End-to-End Solution" idea is moot; the benefit is lost. Now you need more expertise, more money, more staff, more time, and more resources; you're wasting more of everything and you lose the value of the "End-to-End Solution" anyway. There is no value in supporting just one language, the metrics will be quite limited, and remember that the purpose of these suites to begin with is to standardize, centralize, and realize cost savings across an entire organization, not just one project.

Listen to how this makes no sense: we spent all that money and bought an "End-to-End Solution" just for the Java team, or the Visual Studio team, or the PowerBuilder team. What about all the other types of development you are performing? Don't you want to save money in those areas of application development as well? For the cost of these suites I would hope so; anything else would be a waste of time, money, and resources for any organization.

There is also the disaster recovery aspect: would you secure one set of assets and not another? For example, would I care only about my PowerBuilder source assets and not my Visual Studio assets? If I have a repository perspective that can handle only one type of asset and there is a catastrophic event, I can recover one set of assets but not the others. So, when looking at these suites and considering disaster recovery, make sure the SCM toolset has a generic repository that can secure and control any type of source asset in your organization. It doesn't have to be great at it, but it does have to work.
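As a minimal sketch of what "it just has to work" looks like for disaster recovery, assume for illustration that each repository is simply a directory on disk; the paths and names below are hypothetical and not taken from any particular SCM product. The backup job should treat every repository identically, whatever language its assets are written in:

```python
# Minimal disaster-recovery sketch: archive every repository the same way,
# regardless of the asset type it houses. Paths and names are hypothetical.
import shutil
from datetime import date
from pathlib import Path

REPO_ROOT = Path("/srv/scm/repositories")       # hypothetical repository root
BACKUP_ROOT = Path("/mnt/offsite/scm-backups")  # hypothetical offsite target

def backup_all_repositories(repo_root: Path, backup_root: Path) -> None:
    """Create one dated archive per repository, whatever assets it contains."""
    target_dir = backup_root / date.today().isoformat()
    target_dir.mkdir(parents=True, exist_ok=True)
    for repo in sorted(p for p in repo_root.iterdir() if p.is_dir()):
        # PowerBuilder, Java, COBOL, .NET -- the archive step is identical.
        archive = shutil.make_archive(str(target_dir / repo.name), "zip", str(repo))
        print(f"archived {repo.name} -> {archive}")

if __name__ == "__main__":
    backup_all_repositories(REPO_ROOT, BACKUP_ROOT)
```

The design point is that the archive step never branches on asset type; if it did, the assets outside that branch are exactly the ones you lose in a catastrophic event.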

There are a lot of pitfalls to these "End-to-End Solutions." Make sure you have competent folks in the room who can ask the hard questions. If the managers are embarrassed by the hard questions at the dog-and-pony show, they are going to be even more embarrassed when they buy the suite and begin to lose money for the organization, the implementation fails, and they lose their jobs or have to move on to distance themselves from the failure while the organization absorbs the cost.

You have to make sure you have the staff, the time, the money, and the expertise, and that you choose suites that can support every type of source asset in your enterprise. Make sure leadership convincingly communicates the change. Take small bites, digest them, and make the cultural shifts your organization needs before moving on to the next toolsets in the suite. It's an evolution, not a revolution, so think it through and don't get caught up in the sell. Buying on impulse will only cost you more money later.

My background: I have been working in Software Configuration Management (SCM) for a few years now, and I am presently the SCM administrator at the State of NH. The tool I administer is a process-based SCM tool for managing application source code, intellectual property, and mission-critical assets. In this capacity I also secure these assets for disaster recovery purposes. I manage 400-plus applications housed in the tool and support 400 users of the product. The development tools we currently use are PowerBuilder PBV8, PBV11, and PBV12; Visual Studio 2003, 2005, 2008, 2010, and 2012; Eclipse, Juno, RAD, Mule, ColdFusion, Java, COBOL, and VB.

As the Software Configuration Manager, I provide the administration of the source code management tool. This includes the entire infrastructure of the development environment: developing life cycles; providing best practices, procedures, processes, and documentation; defect tracking; disaster recovery; maintaining build machines; and training all of the developers on proper source code management using the development tools in our environment.

Risks of End-to-End Solutions

  • Not enough staff to support the toolsets
  • Not enough expertise to understand the toolsets with great clarity (Testing, SCM, Defect Tracking, QA, Requirements Gathering, Project Management)
  • Loss of expertise to other employment opportunities
  • No cross-training of expertise or transfer of knowledge
  • No solid implementation team
  • The expertise required to analyze these tools is typically not invited to the dog-and-pony show
  • Not enough capital when problems arise

More Stories By Al Soucy

Al Soucy is software configuration manager at the State of New Hampshire's Department of Information Technology (DoIT). In that role Al manages software configuration for dozens of PowerBuilder applications as well as applications written in Java, .NET, and COBOL (yes, COBOL). Al plays bass guitar, acoustic guitar, electric rhythm/lead guitar, drums, mandolin, and keyboard; he sings lead and backup vocals and he has released 8 CDs.

