In-Memory Computing: In Plain English

Explaining in-memory computing and defining what in-memory computing is really about

After five days (and eleven meetings) with new customers in Europe, Russia, and the Middle East, I think the time is right for another refinement of in-memory computing's definition. It is clear to me that our industry is lagging when it comes to explaining in-memory computing to potential customers and defining what it is really about. We struggle to come up with a simple, understandable definition of what in-memory computing is, what problems it solves, and which uses are a good fit for the technology.

In-Memory Computing: What Is It?
In-memory computing means using a type of middleware software that allows you to store data in RAM, across a cluster of computers, and process it in parallel. Consider the operational datasets typically stored in a centralized database, which you can now store in "connected" RAM across multiple computers. RAM is, roughly, 5,000 times faster than a traditional spinning disk. Add to the mix native support for parallel processing, and things get very fast. Really, really fast.
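To make this concrete, here is a minimal sketch of what that looks like to an application developer. It is written against the open-source Apache Ignite API (the project that grew out of GridGain's codebase) purely for illustration; the cache name and values are hypothetical, and a real deployment would supply its own cluster configuration:

    import org.apache.ignite.Ignite;
    import org.apache.ignite.IgniteCache;
    import org.apache.ignite.Ignition;

    public class InMemoryHello {
        public static void main(String[] args) {
            // Start (or join) a cluster node inside this JVM.
            try (Ignite ignite = Ignition.start()) {
                // A distributed cache: entries live in RAM, partitioned across all nodes.
                IgniteCache<String, Double> positions = ignite.getOrCreateCache("positions");
                positions.put("trade-42", 1_250_000.0);        // stored on whichever node owns this key
                System.out.println(positions.get("trade-42")); // read back from cluster RAM
            }
        }
    }

The application simply puts and gets; the middleware decides which machine's RAM holds each entry.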

RAM storage and parallel distributed processing are two fundamental pillars of in-memory computing. While in-memory data storage is expected of in-memory technology, the parallelization and distribution of data processing, which is an integral part of in-memory computing, calls for an explanation.

Parallel distributed processing capabilities of in-memory computing are... a technical necessity. Consider this: a single modern computer can hardly have enough RAM to hold a significant dataset. In fact, a typical x86 server today (mid-2014) has somewhere between 32GB and 256GB of RAM. Although that can be a significant amount of memory for a single computer, it is not enough to store many of today's operational datasets, which easily measure in terabytes. A 2TB working set, for example, requires at least eight servers with 256GB of RAM each, before accounting for any operating system or application overhead.

To overcome this problem, in-memory computing software is designed from the ground up to store data in a distributed fashion: the entire dataset is divided across the memory of individual computers, each storing only a portion of the overall dataset. Once the data is partitioned this way, parallel distributed processing becomes a technical necessity, simply because that is how the data is stored.
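A toy model in plain Java illustrates the partitioning idea. It is only a sketch: the "nodes" here are just maps inside one JVM, and keys are routed by simple hashing, whereas real in-memory platforms partition across physical machines and rebalance as nodes join and leave:

    import java.util.ArrayList;
    import java.util.List;
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    public class PartitionedStore {
        // One in-memory map per simulated node; together they hold the whole dataset.
        private final List<Map<String, Double>> nodes = new ArrayList<>();

        public PartitionedStore(int nodeCount) {
            for (int i = 0; i < nodeCount; i++) {
                nodes.add(new ConcurrentHashMap<>());
            }
        }

        // Route each key to one node by hashing; this is how a dataset larger than
        // any single machine's RAM gets spread across a cluster.
        private Map<String, Double> nodeFor(String key) {
            return nodes.get(Math.floorMod(key.hashCode(), nodes.size()));
        }

        public void put(String key, double value) { nodeFor(key).put(key, value); }
        public Double get(String key) { return nodeFor(key).get(key); }
    }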

And while this makes the development of in-memory computing software challenging (literally fewer than 10 companies in the world have mastered this type of software development), end users of in-memory computing, who seek dramatic performance and scalability increases, benefit greatly from this technology.
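And here is why processing follows the data. In this minimal scatter-gather sketch (reusing the same List-of-maps shape as the toy store above), each task aggregates only one node's local slice, and the partial sums are then merged. In a real cluster the computation itself is shipped over the network to the machine holding each partition; the thread pool below is only a stand-in for that:

    import java.util.ArrayList;
    import java.util.List;
    import java.util.Map;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    public class ParallelSum {
        public static double totalExposure(List<Map<String, Double>> nodes) throws Exception {
            ExecutorService pool = Executors.newFixedThreadPool(nodes.size());
            try {
                List<Future<Double>> partials = new ArrayList<>();
                for (Map<String, Double> node : nodes) {
                    // Scatter: the computation goes to the data, not the other way around.
                    partials.add(pool.submit(
                        () -> node.values().stream().mapToDouble(Double::doubleValue).sum()));
                }
                double total = 0;
                for (Future<Double> f : partials) {
                    total += f.get(); // gather: merge the partial results
                }
                return total;
            } finally {
                pool.shutdown();
            }
        }
    }

Each slice is scanned concurrently, so the aggregation time stays roughly flat as you add nodes and data together.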

In-Memory Computing: What Is It Good For?
Let's get this out of the way first: if you want a 2-3x performance or scalability improvement, flash storage (SSD, Flash on PCI-E, Memory Channel Storage, etc.) can do the job. It is relatively cheap and can provide that kind of modest performance boost.

To see, however, what a difference in-memory computing can make, consider this real-life example...

Last year GridGain won an open tender for one of the largest banks in the world. The tender was for a risk analytics system to provide real-time risk analysis for the bank's trading desk (a common use case for in-memory computing in the financial industry). In this tender GridGain's software demonstrated one billion (!) business transactions per second on 10 commodity servers with a total of 1TB of RAM. The total cost of those 10 commodity servers? Less than $25K.

Now, read the previous paragraph again: one billion financial transactions per second on $25K worth of hardware. That is the in-memory computing difference - not just 2-3x faster, but more than 100x faster than is theoretically possible even with the most expensive flash-based storage on today's market (forget about spinning disks). And 1TB of flash-based storage alone would cost 10x the entire hardware setup mentioned above.

Importantly, that performance translates directly into clear business value:

  • you can use less hardware to meet your required performance and throughput SLAs, achieve better data center consolidation, and significantly reduce capital costs as well as operational and infrastructure overhead; and
  • you can significantly extend the lifetime of your existing hardware and software, improving ROI by making what you already have last longer and run faster.

And that's what makes in-memory computing such a hot topic these days: the demand to process ever-growing datasets in real time can now be met with the extraordinary performance and scale of in-memory computing, with economics so compelling that the business case becomes clear and obvious.

In-Memory Computing: What Are the Best Use Cases?
I can only speak for GridGain here, but our user base is big enough to be statistically significant. GridGain has production customers in a wide variety of industries:

  • Investment banking
  • Insurance claim processing & modeling
  • Real-time ad platforms
  • Real-time sentiment analysis
  • Merchant platform for online games
  • Hyper-local advertising
  • Geospatial/GIS processing
  • Medical imaging processing
  • Natural language processing & cognitive computing
  • Real-time machine learning
  • Complex event processing of streaming sensor data

And we're also seeing our solutions deployed for more mundane use cases, like speeding up the response time of a student registration system from 45 seconds to under half a second.

Looking at this list, it becomes pretty obvious that the best use cases are defined not by a specific industry but by an underlying technical need: the need for the best possible, uncompromised performance and scalability for a given task.

In many of these real-life deployments, in-memory computing was an enabling technology - the technology that made these particular systems possible to consider and ultimately possible to implement.

The bottom line is that in-memory computing is beginning to unleash a wave of innovation that's not built on Big Data per se, but on Big Ideas, ideas that are suddenly attainable. It's blowing up the costly economics of traditional computing that frankly can't keep up with either the growth of information or the scale of demand.

As the Internet expands from connecting people to connecting things, devices like refrigerators, thermostats, light bulbs, jet engines and even heart rate monitors are producing streams of information that will not just inform us, but also protect us, make us healthier and help us live richer lives. We'll begin to enjoy conveniences and experiences that only existed in science fiction novels. The technology to support this transformation exists today - and it's called in-memory computing.

More Stories By Nikita Ivanov

Nikita Ivanov is founder and CEO of GridGain Systems, started in 2007 and funded by RTP Ventures and Almaz Capital. Nikita has led GridGain to develop advanced distributed in-memory data processing technologies - the top Java in-memory computing platform, a new instance of which is started every 10 seconds around the world today.

Nikita has over 20 years of experience in software application development, building HPC and middleware platforms, contributing to the efforts of other startups and notable companies including Adaptec, Visa and BEA Systems. Nikita was one of the pioneers in using Java technology for server side middleware development while working for one of Europe’s largest system integrators in 1996.

He is an active member of the Java middleware community, a contributor to the Java specification, and holds a Master's degree in Electro Mechanics from Baltic State Technical University, Saint Petersburg, Russia.
