By Al Soucy
June 27, 2014 11:00 AM EDT
I was asked by Mr. Peter Hastings, the NH DoIT Commissioner, to document the benefits and activities behind the Standardization and Centralization Initiative (SACI). The initiative standardizes the software configuration management tool, the defect tracking tool, and production compiles across the enterprise of state government. It also covers the storage of assets for disaster recovery purposes and the securing and control of state source data assets, intellectual property, and mission-critical data assets across the enterprise.
This article covers how the State of NH is working to save money through efficiencies gained by standardizing and centralizing state data assets. Every state wants to save money for its citizens and stakeholders, but saving money in state government can be a very challenging task. The fact that every agency in state government is essentially its own separate entity makes this even more difficult. Initiatives of this size and scope are hard to implement when agencies don't understand the value, don't feel the need to cooperate or collaborate, and don't see that the savings are in their best interest. And in today's economy, no one wants to spend money up front to achieve savings, because there is little room for capital investment.
Attempting to standardize any process, tool, or initiative across the enterprise of state government is no easy task, and the Standardization and Centralization Initiative (SACI) is no exception. The project standardizes the Software Configuration Management (SCM) tool across the entire enterprise, and it secures, controls, and centralizes all state source data assets, intellectual property, and mission-critical assets for state government and for disaster recovery purposes, meeting the Federal COOP (Continuity of Operations) requirements set forth by Homeland Security.
Introducing a directive of this magnitude speaks to the courage of Commissioner Peter Hastings in taking on such a challenging initiative. It requires a leader who can communicate effectively to agency heads that the initiative has value and will ultimately produce savings across the enterprise. The State of NH will receive many benefits from this initiative, and there have been many hurdles along the way. It reflects Commissioner Hastings' determination to address enterprise-wide inconsistencies that cost the State of NH money and to realize a cost savings instead.
This initiative establishes a centralized, virtualized repository environment for all state software assets. It standardizes the Software Configuration Management (SCM) tool and the defect tracking tool, and it formalizes production compiles across the enterprise. Set in motion by Commissioner Hastings, it is transforming software development activities across state government, streamlining processes and saving resources, time, and money for the citizens of NH and the stakeholders of state government software development. I don't see an initiative like this being tackled anywhere else in the country at the state level, and over the long term it could potentially save the State of NH millions of dollars.
Business Problem and Solution
Prior to the implementation of SACI (the Standardization and Centralization Initiative), state source data assets were spread across the enterprise of state government, anywhere and everywhere. No one really knew where all the assets were located. State developers would leave for other opportunities, and when managers hired replacements they could not point the new developer to the latest application source assets for a given project. Some developers stored assets on their local drives, others on shared drives, and no one tracked where they were placed. Still others stored these valuable assets on CDs and DVDs, which could be lost, or on local drives, which can crash, so assets were misplaced and lost. The result was a lot of state money expended to re-create those assets.
Let's just say a lot of state resources, time, and money can be wasted this way: lost code, overwritten code, deleted files, no source code control, repeated restores of assets, and many flavors of source code control tools siloed across the enterprise. This can apply to almost any organization.
Commissioner Hastings realized quickly, once becoming Commissioner, that standardizing and centralizing all of the state's assets - the source data assets of application source code, the intellectual property related to applications supported inside and outside of state government, and the state's mission-critical assets - in a secured, controlled, centralized repository would save the state a lot of money.
Commissioner Hastings issued a directive to implement the Standardization and Centralization Initiative (SACI): save money on activities related to software development and transform state government with better status accounting of software development activities across the enterprise. There is now visibility into software development activities that did not exist before this initiative.
The direction was to consolidate all of the physical servers housing software assets and software developers across the enterprise into a virtual environment on one server, running one standard tool, to centralize the state's source data assets, intellectual property, and mission-critical assets.
This alone yields a cost savings for the state: fewer software configuration management (SCM) servers means less technical support and less electricity, along with a myriad of other savings I'll speak to later. The initiative centralizes and standardizes the users, licenses, assets, and support for software development activities across the enterprise.
Improvement to Government
Intellectual property is a valuable commodity, and it needs to be secured and controlled. The Federal government realizes this as well, given some of the problems the NSA has experienced securing the metadata gathered from its phone call and email collection programs. Standardizing tools and centralizing assets to reduce cost and streamline software development activities makes sense on many levels:
- reduced support, with fewer physical SCM servers to maintain
- less electricity
- less software to install and fewer operating systems to maintain
- fewer servers to back up
- easier maintenance of SCM licenses
- a smaller learning curve for developers moving from one state agency to another
- more formalized, disciplined development
- status accounting now available on software development activities
- a common code library that can be shared to reuse code
- easier future upgrades of the clients
- a standard, centralized defect tracking tool
All these savings add up and help state government manage the cost of software development activities.
In addition, when all users are in a centralized virtual environment, the cloud concept can be applied from within the standardized Software Configuration Management (SCM) tool to reuse common functionality code across the enterprise. A common code library for functionality shared between agencies can be set up, and the code can be shared, reused, and understood, all under the secure umbrella of SCM.
Standardizing the Software Configuration Management tool allows groups to be created with defined levels of access, all the way down to an individual process or file. It is easy to allow one team or many teams to browse the library of shared common code. It also educates teams on how other teams develop common functionality, and it saves the state money by not reinventing the wheel on functionality already written by another development team.
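As a rough illustration of the group-based, down-to-the-file access model described above, here is a minimal sketch in Python. All names here (groups, repository paths, permission levels) are hypothetical and do not reflect the configuration of any actual SCM product; the point is only the longest-prefix idea: every team can browse the common library, but only its owners can modify it.

```python
# Hypothetical sketch of group-based access control with per-path rules.
# Group names, paths, and permission levels are illustrative only.
from dataclasses import dataclass, field


@dataclass
class Group:
    name: str
    # Maps a repository path prefix to an access level:
    # "none", "browse", or "modify".
    rules: dict = field(default_factory=dict)

    def access_for(self, path: str) -> str:
        """Return the most specific (longest-prefix) rule matching the path."""
        best, best_len = "none", -1
        for prefix, level in self.rules.items():
            if path.startswith(prefix) and len(prefix) > best_len:
                best, best_len = level, len(prefix)
        return best


# One agency's developers: may browse the shared common-code library,
# may modify only their own project tree.
agency_team = Group("agency-developers", {
    "/common/": "browse",
    "/agency-app/": "modify",
})

# The team that owns the common library has modify rights to it.
library_owners = Group("common-library-owners", {
    "/common/": "modify",
})
```

The longest-prefix lookup mirrors how a real tool can grant one team browse access to a shared library while reserving write access for its owners, without granting any access outside the listed trees.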
This initiative saves the state resources, time and money on development activities:
- All state source data assets are centralized - diminishes the overhead of physical support.
- All processes are standardized through the use of one product that supports multiple platforms. Because the tool has a generic repository that can house any type of data asset, any development platform can be secured and controlled. Large organizations often have legacy platforms that must be sustained for years, and a generic repository supports that effort.
- Allows movement of a developer from one project to another with more flexibility.
- Common functionality libraries can be set up for all development platforms (Visual Studio, PowerBuilder, Java, etc.) so that all developers can have access to reusable code under a SCM secure umbrella.
- Provides transparency to all managers, with a view into a life cycle showing what developers are working on in any SCM state across the enterprise of state government.
Many large organizations have too many varying methods, tools, and processes, which creates silos of development that are counterproductive to the whole. Silos make it harder to know what each team is doing, what they are developing in, and what they have access to, and harder to interchange skill sets between development teams.
When all developers use the same tool, they can easily be granted access to another project via the secure cloud of the standardized SCM tool, work on any project, and face a much smaller learning curve: they never leave their development environment, they simply see a new project in their view in the SCM tool.
When all users and developers are on one server that holds all the licenses for the tool, the assets are centralized, and the SCM tool is standardized, the state can save money in various ways on software development activities and transform state government with better status accounting and visibility into software development, not only for state developers but for contractors as well.
The following are the benefits of the Standardization and Centralization Initiative (SACI) to state government:
- Standardizes the state's Software Configuration Management (SCM) tool so that one product is used by all agencies, saving the state time, money, and resources by having only one tool to support across the enterprise.
- Provides a smaller learning curve from one software development team to another across the enterprise; developers save time by not learning a different tool from agency to agency, allowing more focus on application development.
- Allows cloud software development under the secure umbrella of SCM and enables reuse of source code, saving the state time, money, and ultimately resources on software development activities. It prevents reinventing the wheel on common functionality.
- Allows software developers to be interchanged between agencies more easily, without the need to learn a new SCM tool for software configuration management activities.
- Prevents the loss of source code (source data assets, intellectual property, mission-critical assets) from agency to agency, which saves the state time, money, and resources over the long term.
- Saves the state money on electricity, because all software developers will be on one virtual server instead of three or four physical servers. Power consumption is a big deal with servers; it takes a great deal of energy to keep them cool and running.
- Saves the state manpower as well, with only one virtual server to maintain and back up, and one operating system to maintain and support rather than three or four.
- Saves the state money with all state data assets, intellectual property and mission critical assets centrally located rather than spread out all over the place across the enterprise.
- This methodology allows for better control of disaster recovery to meet Federal COOP requirements across the enterprise when it concerns State source data assets and intellectual property. These requirements are set up by Homeland Security to maintain business continuity.
- This allows state managers more visibility into software development activities and allows for better time management of these resources. It removes the silo effect.
- Standardizes defect tracking for all state agencies so that one tool is used across the enterprise for all reporting against application development. This centralizes reporting.
- Standardizes production compilations so that compilation and debugging issues are minimized across the enterprise.
- When contractors use our standardized tool, we have greater insight into their software development activities, and the assets are at least secured and controlled within state government.
- When contractors use our standardized tool and house the state's assets within our infrastructure, state software developers can shadow the contractors and parallel their work, so that when the contract is complete, state staff can sustain the application. This saves the state money because state staff can do the work at lower cost than contractors.
- Harvest SCM integrates with all the software development platforms we use at the State today, even legacy applications. One tool, rather than a myriad of tools, performs the task of SCM, which saves the state time and money with only one tool to support across the enterprise - and it takes only two people to support the entire infrastructure.
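The electricity and manpower claims above can be made concrete with a back-of-envelope calculation. Every figure below (wattage, electricity rate, admin hours, hourly rate) is a hypothetical placeholder, not an actual State of NH number; the sketch only shows the shape of the arithmetic when four physical servers are consolidated onto one virtual host.

```python
# Back-of-envelope consolidation savings. All figures are hypothetical
# placeholders, not actual State of NH numbers.

def annual_server_cost(count, power_watts=500, cost_per_kwh=0.15,
                       admin_hours_per_year=100, hourly_rate=50.0):
    """Rough yearly cost of running `count` physical servers:
    round-the-clock electricity plus administration labor."""
    kwh = count * (power_watts / 1000) * 24 * 365   # kWh per year
    electricity = kwh * cost_per_kwh
    labor = count * admin_hours_per_year * hourly_rate
    return electricity + labor


# Four physical SCM servers consolidated onto one virtual host.
before = annual_server_cost(4)   # 22628.0 under these assumptions
after = annual_server_cost(1)    # 5657.0 under these assumptions
savings = before - after         # 16971.0 per year
```

Even with modest assumed numbers, the recurring savings compound year over year, which is the article's core argument; cooling costs (noted above as significant) would widen the gap further.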
The final result is one virtual software development server housing intellectual property of various types (application code in Visual Studio, PowerBuilder, Java, documents, Oracle code, SQL code, Access, etc.) and mission-critical assets.
What will have been achieved across the enterprise:
- Centralized State Data Assets across the Enterprise of State Government
- Centralized Data Asset Repositories
- Standardized Software Configuration Management (SCM) Tool
- Standardized Application Compilers - OpenMake
- Standardized Defect Tracking System
- Created Common Code Library for Common Functionalities
- Provided Status Accounting and Auditing of SCM Projects
- Provided IT Managers and Commissioners Visibility into Software Development Activities across the Enterprise of State Government
It is difficult to measure these savings in real dollars (ROI) today, but they can be measured in the streamlining of software development activities across the enterprise. Increased efficiency, improved quality, and a reduction in errors and duplicative effort are the real payback of this initiative.
By way of background, I have been working in Software Configuration Management (SCM) for a number of years and am presently the SCM administrator at the State of NH. Harvest SCM, our standardized tool, is a process-based product for managing application source code, intellectual property, and mission-critical assets. In this capacity I also secure these assets for disaster recovery purposes. I manage more than 400 applications housed in SCM and support 400 users of the product. The development tools we currently use are PowerBuilder PBV8, PBV11, and 12; Visual Studio 2003, 2005, 2008, 2010, and 2012; Eclipse, Juno, RAD, Mule, Cold Fusion, Java, COBOL, and VB.
As the Software Configuration Manager, I administer the source code management tool. This covers the entire infrastructure of the development environment: developing life cycles; providing best practices, procedures, processes, and documentation; defect tracking; disaster recovery; maintaining build machines; and training all developers on proper source code management with the development tools in our environment.