An End-to-End Solution

Can one company fulfill all of the requirements of an organization?

I was asked by Mr. Peter Hastings, the NH DoIT Commissioner, about my understanding of "End to End Solutions." I have witnessed these solutions before, but I wanted to find a good definition, so I did some research and settled on this one: a single supplier or company that can provide all of your hardware and software needs in one suite of tools, meeting all of the customer's requirements, with no other suppliers involved. I think this is a good definition and it makes sense to me.

I'm sure many other definitions exist and this one is simplistic, but I see it as a baseline. One could certainly branch off of it to encompass a variety of explanations, but let's stick with this one for this article.

This idea has been around for some time now; slowly but surely, as the years go on, it seems to fade away and then pop back up, over and over. I have been watching for a long time, have witnessed the implementation of these suites over the years, and have developed my own opinions about all-encompassing tools. So, this has come up before in my career.

Commissioner Hastings wanted to understand, at a high level, whether there was any benefit for our development teams across the enterprise of State government in integrating an entire suite of tools to achieve a more seamless Software Development Life Cycle (SDLC). If that were possible, savings could be realized through shorter development life cycles: rather than an application taking three years to complete, trim a year and the State saves money.

Theoretically, this is a great idea and makes good sense, but in reality the only shops that can take on an idea of this magnitude are those with a lot of money, a lot of staff, and a lot of in-house expertise. Inherent in the words "End to End Solution" I hear big, enormous, all-encompassing, and there is a lot of overhead in this idea right out of the gate.

I prefaced my conversation with the Commissioner by saying that I would be happy to write about this subject, but that not everyone would agree with my analysis and conclusions. This is only my perspective, and there are varying experiences that would serve as counterpoints to my thoughts. The Commissioner is quite open to differing opinions and really just wanted to understand the subject at a high level. So I can only tell you what I have seen and witnessed over the years and share my thoughts.

The premise is that one company can fulfill all of the requirements of an organization. Can this be true? Can one supplier provide this much and help an organization realize cost savings? I have not seen it happen to date. It's a big white unicorn in my opinion, although you have to believe in a unicorn once in a while. Every company has its specialty: the things it does well, and the things it tries to do well and fails at.

We are all microcosms of the wider world, so each of us is an "End-to-End Solution" in some way. Do we as humans do it all well? I think not. We all have things we do well and things we don't, and the same goes for companies. So, to me, the premise that one company can deliver it all is not real but fantasy. But let's take it one step further and play devil's advocate for a moment: suppose a company really could deliver an entire "End to End Solution," a suite of tools that did it all.

These are the next questions to ask: Can the organization it's being delivered to really manage it all? Does it have the staff to support the toolsets in the suite? The expertise required to implement them? The time it takes to learn and support them? Can it really manage an entire suite of toolsets successfully?

Consider just one tool in the suite; let's take the SCM tool, since I know that one well. There is the software configuration administration piece, the release management piece performing compiles, the defect tracking piece to manage deficiencies, modifications and enhancements against the application, and then there are code reviews. Theoretically this could take three to four people with these skills, and that's just one toolset in the suite. Do you already have all of that expertise in house, let alone in one person? SCM is not an easy skill to come by to begin with.
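
To make that concrete, here is a rough sketch, in Python, of the kind of glue an SCM administrator ends up owning: a release step that builds a labeled baseline and records which defects the build addresses. This is a minimal sketch under stated assumptions; the baseline labels, defect IDs, file names and the make command are hypothetical placeholders, not any particular vendor's tool or workflow.

    import subprocess
    from dataclasses import dataclass, field

    # Hypothetical sketch only: the labels, IDs and build command below are
    # placeholders, not the interface of any real SCM product.

    @dataclass
    class ReleaseRequest:
        baseline_label: str                              # configuration administration: which baseline to build
        defect_ids: list = field(default_factory=list)   # defect tracking: tickets resolved in this build

    def build_release(req: ReleaseRequest) -> None:
        # Release management: compile the sources for the labeled baseline
        # (assumes a Makefile exists in the working directory).
        subprocess.run(["make", "clean", "all"], check=True)
        # Record which deficiencies, modifications and enhancements the build contains.
        with open("release_notes.txt", "w") as notes:
            notes.write(f"Baseline: {req.baseline_label}\n")
            notes.write("Defects included: " + ", ".join(req.defect_ids) + "\n")

    if __name__ == "__main__":
        build_release(ReleaseRequest("REL_2014_03", ["D-101", "D-145"]))

Even this toy script touches three of the roles described above, which is why one person rarely covers the whole toolset.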

Which tools will your organization do well with, and which ones do you already have knowledge of? Several toolsets in the suite may get worked because there is in-house expertise on them but not on others, so you may have to hire or train staff on the rest, provided you have staff to train. In any case, it usually turns out that you don't have everything it takes, and the implementation fails. Once the implementation fails, buy-in from management fades because no one wants to be associated with a failing project, the grass-roots folks who are really working the products on the ground are too stressed from the implementation, and the experiment is over. Lots of money is wasted that could have been spent in a more productive way.

The real problem is that when you get a lot of folks in a room, the sell is on, the atmosphere is charged with enthusiasm, and people see potential savings not yet realized, they get swept away in the sell. When this happens it usually ends up costing the organization a lot of unintended money, and at some point the question is always asked: what did we do?

Ultimately, I agreed to write on this subject because I'm bugged about "End to End Solutions." I'm bugged about the sell. I'm bugged about the hype and the promises that I don't see how anyone could deliver. For example, I have recording software that is more automated than anything I have seen before, but I still have to know how to run the tool, record with it, mix, add effects, edit the recording, and master to a stereo track, even though the tool is very automated.

The same would be true for the "End to End Solution": there is the Testing tool, the QA tool, the Project Management tool, the Requirements Gathering tool, the SCM tool, the Defect Tracking tool, and whatever other tool or plug-in you add to the suite. I'm sure the suites can also be customized to include only what your organization needs, which makes more sense.

The bottom line is, that's a lot of expertise, staff, time, money and resources. The tools have to be learned; you might have to hire additional folks to support them, and others will have to be cross-trained in case your expertise walks out the door. What organization can really afford this level of overhead other than a large one with a chunk of capital?

I think one of the big mistakes organizations make is that when these sells are occurring, there is often no expertise from the organization in the room to ask the tough questions: What training is the supplier offering? What is the technical support like? Are you using a proprietary database? If the database goes down, who supports it?

Typically it's managers in the room, who are not even going to do the work on the ground or support these behemoths. I'm also bugged by the suites themselves, which typically do a good job in one area and fall down in others, yet now you have an entire suite of tools to make work. For example, I have seen companies make a great SCM tool but a terrible Defect Tracking tool, or a great Project Management tool but a terrible SCM tool.

In my opinion, the idea that one supplier can do it all is flawed. It's just like recording music, the analogy I understand best: no matter how automated the mix, whether the button says click here for rock, jazz, or easy listening, the user still has to do something to get to that place, and if it's not the right place, the user must understand how to get to the right one. Human intervention is required. Truly automated tools are a myth in my opinion.

Now, a lot of folks are going to disagree with me, because they are selling these monsters to people who can't support them, implement them, maintain them, provide technical support on them, or mentor others on them. It really requires an army, depending on how big the organization is, and that brings up my first example.

Every organization wants to save money, and they often find themselves getting sold on this pot of gold: if everything could be integrated across the enterprise, however big or small the business, we could save so much money. But here's the thing, as I said earlier.

A large organization can absorb the impact of a completely integrated software environment like this because it has the staff. If it has the staff, it may also have the expertise, or at least staff the seller can train initially to get your people up to speed. However, when the seller leaves, there must be competent staff who can truly maintain and champion the tools and create the standard that actually produces the desired result and cost savings.

For most organizations, the additional staff, training and expertise would more than likely be a great unintended expense. Good for you and your organization if you have in-house expertise in the suite's toolsets, but typically that is not the case.

Often you have to go outside your space to find help: someone who knows QA, which is hard to find these days, or SCM, which is even harder, or testing, and so on. If you have a suite of five toolsets, you must have a champion for each one of them in order to introduce them successfully in an organization.

Now, if you try this in State government you're even more likely to fail, because you never have the staff, expertise, time or money. I have seen it attempted at least twice in my tenure.

Also, something organizations don't take into consideration when taking on suites like this is that it usually requires a culture that accepts change, tolerates having its cheese moved, and understands the chain of command. It typically requires a culture shift, and most folks don't like change, including me. However, someone said to me many years ago that change is stability; it's the one constant. So, I try to believe in that.

So, consider for a moment an organization taking on a tool suite of this nature, the "End to End Solution." Once you have hired staff, have your trained expertise in house, and your staffing is solidly in place, you need your leadership to convincingly direct the staff to implement and accept the new methodology. This industry is always in flux, moves fast and is never static, but change is still difficult to implement.

When everyone gets on the bus, you have institutionalized a new culture with new toolsets, and they are embraced, then savings can be realized; anyone who draws outside the lines changes the integrity of the outcome, and a different outcome emerges.

I've told you what you need and that larger organizations are more successful at this than smaller ones because they have the staffing, expertise, money and time on their side.

Let me provide one example of what I have seen, and the way I think this can be done successfully even in smaller organizations. Many attempts fail, and the reason is simple: it's too big to eat. These implementations are monstrous. For starters, they take multiple administrators, one for each toolset in the suite. One person should never run the whole suite; each tool should have its own administrator, because anything else is a security risk. You never want one person holding all the keys to the kingdom anyway.

There are multiple layers of understanding and expertise, and it takes a lot of time to get up and running. Plenty of folks will disagree with me, and that's fine; they are the ones selling these suites, so of course they have a different opinion.

As I have learned in life, one-third of the people are going to like you, one-third are going to hate you, and one-third don't care.

So, you purchase the suite anyway, and the first toolset you tackle is the testing tool. Now you need a body who understands it: how to write the scripts that get loaded into the tool and run as tests, scripts that are constantly changing based on new functionality, modifications, deficiencies and enhancements against the applications under development. If you don't have a good test script writer, you are already in trouble.
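
For readers who have never seen one, here is a minimal sketch of the kind of test script that gets fed into a testing tool. The function under test and the expected values are invented for illustration; a real script would drive the actual application instead.

    import unittest

    def calculate_discount(total, is_member):
        """Hypothetical stand-in for the application logic under test."""
        return total * 0.9 if is_member else total

    class DiscountTests(unittest.TestCase):
        def test_member_gets_ten_percent_off(self):
            self.assertAlmostEqual(calculate_discount(100.0, True), 90.0)

        def test_non_member_pays_full_price(self):
            self.assertAlmostEqual(calculate_discount(100.0, False), 100.0)

    if __name__ == "__main__":
        unittest.main()

Every new enhancement or defect fix means scripts like this get added or rewritten, which is exactly the churn that demands a dedicated script writer.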

Next you tackle the Project Management tool. Who really has this skill? I've never met anyone who actually used a tool like this, understood it, and implemented it well. That's just my experience; I'm sure they are out there, but to me this is a rare expertise. Then you move on to the QA tool, the SCM tool, the Defect Tracking tool, and so on; there could be any combination of toolsets. My point is that you have to eat the tools one at a time. It's extremely hard to standardize any single tool across an organization, never mind a suite of them. However, if you take a small bite, digest it, then take the next one and so on, you'll be more successful.

I feel scalability is the answer, and it actually makes more sense. If you can customize the suite to include only what you have expertise in, and build upon that knowledge slowly, you can be more successful. For example, start with the expertise you have in house; let's say you have SCM and Defect Tracking knowledge.

Introduce those tools and let users in your organization get comfortable with that standardization. Once that is stable, introduce Project Management, then Testing, then Requirements Gathering, and so on. It doesn't have to be in this order; it can be whatever order you want, but if you go slowly you won't break the bank and it will evolve. I always tell folks this is an evolution, not a revolution.

Just look at world events: when there is a revolution and the old government is out of office, typically no one knows how to introduce a new government successfully. Even when they try, they attempt the whole thing at once and it fails, because it usually took years to get into the situation you are in and it's going to take time to get out of it and change it. An evolution is always a better approach in my opinion: slow progress over time, gathering steam as one tool is introduced after another until it's all in place.

Should an organization take on this kind of exercise, you have to put competent people on it, because competent people can mentor others, and that makes the implementation more successful at the end of the day. Every person has strengths and weaknesses, so you're going to want to put folks on the job who have a proven track record of pushing change through. Not an easy thing to do.

You want folks running this who have experience with rejection, and you need leadership that drives the process 100 percent, unfalteringly. One exception to the process and the entire suite will fail.

If you were smart, while you were implementing the toolset you have strength in you would also have been learning the next toolset in the suite, ready to move forward with that implementation the moment the first tool was completely in place. That way there is no pause before the next learning curve.

There is a lot of work that goes into these suites, and it's not just staffing a tool like this and finding the expertise to implement the various pieces. There is also upgrading the tools, applying bug fixes, maintaining the OS, and supporting backups. The list really does go on.

One thing to look for when examining the SCM toolset in these suites is a generic repository perspective. This really is the answer for the SCM piece of the puzzle. You need repositories, which is a glorified name for folders that will house any type of asset in the organization. You can have an "End to End Solution" for one development tool and one project, but most development environments contain multiple tools (Java, COBOL, PowerBuilder, etc.). Otherwise you'll have to introduce multiple toolsets, with twice the overhead and no way to measure across the enterprise. That's a waste of money in the final analysis.

You need to make sure these repositories can support any development platform the organization has in house or across the enterprise, depending on how your organization is structured. SCM tools in an "End to End Solution" suite that support only one platform or another are limited in scope and not flexible enough to be called standards, and standardizing and centralizing is how you really save money.
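
As a rough illustration of what a generic repository buys you, here is a small Python sketch that walks one repository root and counts source assets by platform. The root path and the extension-to-platform mapping are assumptions made for the example, not the layout of any real product.

    import os
    from collections import Counter

    REPO_ROOT = "/scm/repositories"   # hypothetical root holding one folder per application

    # Hypothetical mapping of file extensions to development platforms.
    ASSET_TYPES = {
        ".java": "Java", ".pbl": "PowerBuilder", ".cbl": "COBOL",
        ".cs": "Visual Studio", ".cfm": "ColdFusion",
    }

    def asset_counts(root):
        """Count source assets by platform across every application in one repository."""
        counts = Counter()
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                ext = os.path.splitext(name)[1].lower()
                counts[ASSET_TYPES.get(ext, "Other")] += 1
        return counts

    if __name__ == "__main__":
        for platform, count in sorted(asset_counts(REPO_ROOT).items()):
            print(f"{platform}: {count} assets")

Because everything lives under one root, one report covers the whole enterprise; split the assets across per-language tools and that single view disappears.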

I don't know of too many organizations that develop in only one application language and nothing else, and if you're serious about SCM you have to take into account every type of software asset and intellectual property you have in the enterprise.

I'm not saying there are no exclusive Java, Visual Studio or PowerBuilder shops; they exist. What I am saying is that large organizations, States for example, typically have a lot of legacy applications developed in many languages, and an SCM tool in your "End to End Solution" suite that covers only one language or another could never be a standard. Where would you house the rest of your source data assets, intellectual property and mission-critical assets: in a second SCM tool, or a third?

If that is the case, there goes the whole benefit of reporting, managing, requirements gathering and testing across all of your assets in the enterprise, because everything isn't in one repository but in several. Then the "End to End Solution" idea is a moot point; the benefit is lost. Now you need more expertise, more money, more staff, more time and more resources, you're wasting more of everything, and you lose the value of the "End to End Solution" anyway. There is little value in supporting just one language; the metrics will be quite limited, and remember, the purpose of these suites in the first place is to standardize, centralize and realize cost savings across an entire organization, not just one project.

Listen to how little sense this makes: we spent all of the money and bought an "End to End Solution" just for the Java team, or the Visual Studio team, or the PowerBuilder team. What about all the other types of development you are performing? Don't you want to save money in those areas of application development as well? For the cost of these suites I would hope so; anything else would be a waste of time, money and resources for any organization.

There is also the disaster recovery aspect: would you secure one set of assets and not another? For example, would I care only about my PowerBuilder source assets and not my Visual Studio assets? If I have a repository that can handle only one type of data asset and there is a catastrophic event, I can recover one set of assets but not the others. So, when looking at these suites and considering disaster recovery, make sure the SCM toolset has a generic repository that can secure and control any type of source data asset in your organization. It doesn't have to be great at it, but it does have to work.
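
Here is a minimal sketch of that disaster recovery point, again with hypothetical paths: because the repository is generic, a single backup job covers every asset type at once.

    import datetime
    import shutil

    REPO_ROOT = "/scm/repositories"   # hypothetical repository root
    BACKUP_DIR = "/backups/scm"       # hypothetical backup target (assumed to exist)

    def backup_repository():
        """Archive the entire repository so any asset type can be recovered after a disaster."""
        stamp = datetime.date.today().isoformat()
        # One archive, regardless of whether the assets are PowerBuilder, Visual Studio, COBOL, etc.
        return shutil.make_archive(f"{BACKUP_DIR}/repo-{stamp}", "gztar", REPO_ROOT)

    if __name__ == "__main__":
        print("Wrote", backup_repository())

If the repository can hold only one platform's assets, you end up with a backup routine per tool, and one of them is always the one nobody remembered to run.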

Summary
There are a lot of pitfalls to these "End to End Solutions." Make sure you have competent folks in the room who can ask the hard questions. If the managers are embarrassed by hard questions at the dog and pony show, they are going to be even more embarrassed when they buy the suites and begin to lose money for the organization, the implementation fails, and they lose their jobs or have to move on to distance themselves from the failure while the organization absorbs the cost.

You have to make sure you have the staff, the time, the money and the expertise, and that you choose suites that can support every type of source data asset in your enterprise. Make sure leadership convincingly communicates the change. Take small bites, digest them, and make the cultural shifts your organization needs before moving on to the next toolsets in the suite. It's an evolution, not a revolution, so think it through and don't get caught up in the sell. Buying on impulse will only cost you more money later.

By way of background, I have been working in Software Configuration Management (SCM) for a few years now and am presently the SCM administrator at the State of NH. The tool I administer is a process-based SCM tool for managing application source code, intellectual property and mission-critical assets. In this capacity I also secure these assets for disaster recovery purposes. I manage 400-plus applications housed in the tool and support 400 users of the product. The development tools we currently use are PowerBuilder PBV8, PBV11 and 12; Visual Studio 2003, 2005, 2008, 2010 and 2012; Eclipse, Juno, RAD, Mule, ColdFusion, Java, COBOL and VB.

As the Software Configuration Manager, I administer the source code management tool. This includes the entire infrastructure of the development environment: developing life cycles; providing best practices, procedures, processes and documentation; defect tracking; disaster recovery; maintaining build machines; and training all the developers on proper source code management using the development tools in our environment.

Risks of End to End Solutions

  • Not enough staff to support the toolsets
  • Not enough expertise to understand the toolsets with great clarity (Testing, SCM, Defect Tracking, QA, Requirements Gathering, Project Management)
  • Loss of expertise to other employment opportunities
  • No cross-training or transfer of knowledge
  • No solid implementation team
  • The people with the expertise to analyze these tools are typically not invited to the dog and pony show
  • Not enough capital when problems arise

More Stories By Al Soucy

Al Soucy is software configuration manager at the State of New Hampshire's Department of Information Technology (DoIT). In that role Al manages software configuration for dozens of PowerBuilder applications as well as applications written in Java, .NET, and COBOL (yes, COBOL). Al plays bass guitar, acoustic guitar, electric rhythm/lead guitar, drums, mandolin and keyboard; he sings lead and backup vocals, and he has released 8 CDs.
