Take Control of Your Schemalessness with Dynamic Schemas

Addressing the inflexibility of structured data by enabling schemaless data to be dynamically and logically structured

Static data structures have been at the heart of data processing tools since the dawn of computing, but they have always limited the flexibility of the organization leveraging the data. Recently, the rise of flexible formats like JSON has led to schemaless data as an attempt to increase agility. However, schemaless data have proven difficult to work with because of hidden rigid structure in the form of implied schemas.

EnterpriseWeb addresses both the inflexibility of structured data and the impracticality of schemaless data by enabling schemaless data to be dynamically and logically structured.

From the fixed-length fields of the 1950s, to the relational structures of modern database management systems, to semi-structured data formats like XML and JSON, the structure of our data has always informed code about how it should be processed. Data are defined by their relationships, and we used to hard-code those relationships into rigid structures. That approach allows only one static view, which is difficult to work with and even more difficult to change. Nevertheless, such rigid data structures - and the models that represent them - are an integral part of enterprise information management.

Traditional relational database management systems (RDBMSs) exemplify this point with their static entity-relationship models (ERMs) and tightly interconnected data structures. XML improves this situation slightly by allowing semi-structured information, but schemas still constrain flexibility and performance. With both approaches, fixed definitions, views, and reports limit the ability of businesses to freely transform information into insight, and become obstacles to systemwide change.

The Rise of Schemalessness
This challenge of inflexible data structures has given rise to schemaless data. With JSON in particular, we can create whatever data structure we like when we author data. We don't have to shoehorn data into rigid data structures, thus allowing every record to have its own structure.

But there is a problem with schemaless data. Consider this simple task: how do you create a query for all the addresses in a particular ZIP code if every record has a different name or format for the ZIP code field? Schemalessness, after all, isn't magic - even schemaless data require some kind of metadata so the code will know how to process such information, what software development guru Martin Fowler calls an implied schema.

Implied schemas represent the structure inherent in any data record. If each address record has its own format, then that format provides the implied schema for that record. Dealing with implied schemas thus falls to the developer, who must figure out how to code software to process these implied schemas, which are different for each and every record.
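
To make that burden concrete, here is a minimal sketch in Python, using hypothetical field names: each record carries its ZIP code under a different key and in a different format, so the query code must enumerate and normalize every variant the developer happens to know about.

import json

# Three "schemaless" address records, each with its own implied schema:
# the ZIP code hides under a different key and format in every record.
records = [
    json.loads('{"street": "1 Main St", "zip": "10001"}'),
    json.loads('{"addr": "2 Oak Ave", "zip_code": 10001}'),
    json.loads('{"line1": "3 Elm Rd", "postalCode": "10001-1234"}'),
]

# The burden falls on the developer: the query must know every field name
# and normalize every format it might encounter.
ZIP_ALIASES = ("zip", "zip_code", "postalCode")

def in_zip(record, target):
    for key in ZIP_ALIASES:
        if key in record:
            # Normalize integers and ZIP+4 strings down to five digits.
            return str(record[key])[:5] == target
    return False  # no recognizable ZIP field at all

print([r for r in records if in_zip(r, "10001")])  # all three records match

Every new data source with yet another spelling of "ZIP code" means another edit to that lookup table - the implied schema has simply moved into the code.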

In Fowler's tutorial on schemalessness, he explains the pros and cons of implied schemas. Despite acknowledging the power of schemalessness to support more flexible and responsive user experiences, he recommends avoiding it - and the implied schemas that come with it - for the sake of developer convenience. That is good advice with respect to traditional software, but the world of data is changing. Today we live in an increasingly schemaless world, where more often than not, the structure of our data is fluid or nonexistent.

Raising the Discussion to Dynamic Schemas
Fowler makes it clear that in the past it has been impractical, from the developer's perspective, to work systematically with schemaless data, because implied schemas are difficult to deal with. After all, structure is itself useful and isn't the problem per se. Rather, the real challenge is how to avoid the limitations of static structure without falling into the trap of unmanageable schemaless data.

EnterpriseWeb's unique approach to modeling solves this critically important challenge by leveraging dynamic schemas that have flexible, metadata-driven relationships with underlying information. Using metadata this way separates concerns, letting people consider relationships from multiple perspectives, rather than from a single static point of view. In addition, it's now possible to change and extend metadata to meet diverse business needs without disruption.

Instead of settling for complex ERMs with their inflexible, tightly coupled data structures or dealing with the coding complexities of implied schemas, developers can project dynamic schemas from the metadata simply by writing different transformations. As a result, dynamic schemas are developer friendly and dynamic - a welcome change from the difficult problem of schemalessness.
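
EnterpriseWeb's modeling machinery is proprietary, so the sketch below (continuing the hypothetical address example) only illustrates the general idea: metadata records how each source's native keys relate to logical fields, and a small transformation projects a consistent view on demand. A second view is just another transformation over the same metadata, not another database schema.

# Hypothetical metadata relating each source's native keys to logical fields.
METADATA = {
    "crm":     {"street": "street", "zip": "zip"},
    "billing": {"street": "addr",   "zip": "zip_code"},
    "legacy":  {"street": "line1",  "zip": "postalCode"},
}

def project(record, source, fields=("street", "zip")):
    """Project a logical, consistently named view of one record."""
    mapping = METADATA[source]
    return {field: record.get(mapping[field]) for field in fields}

print(project({"addr": "2 Oak Ave", "zip_code": "10001"}, "billing"))
# -> {'street': '2 Oak Ave', 'zip': '10001'}

Changing or extending the metadata - adding a source or renaming a field - requires no change to the underlying records or to the code that consumes the projected view.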

Add an Agent for Performance
So far so good, but how do we build software to process all such data in a general way, freeing ourselves from custom coding for implied schemas? The solution is an intelligent agent.

EnterpriseWeb's intelligent agent, SmartAlex™, is a distributable transaction manager that resolves dynamic schemas for each interaction. Every human or system client interaction is a request for SmartAlex to interpret dynamic schemas (as well as other models and additional metadata) and translate them to a context-specific set of resources in order to construct a custom response.

This Agent-Oriented approach maximizes performance for such dynamic computing. In the background, SmartAlex handles all run-time connection and transformation details, sparing programmers from manually integrating resources for varied and unanticipated uses, which greatly improves IT productivity while enabling business agility.

SmartAlex logs all system events, indexes all new and updated resources, and tags all changes in relationships, providing a detailed and navigable audit history. This practice creates a feedback loop, as SmartAlex leverages the same indexed logs to guide its execution. Data, code, and user interface components, as well as connectors for federated services, systems, databases, and devices, can be updated or replaced without breaking related apps and processes - as SmartAlex is 'aware' of the changes. In this way EnterpriseWeb supports real-time exception and change management for resilient solutions that can evolve naturally.
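
SmartAlex's actual interfaces are not public, so the following is only a generic, hypothetical illustration of the agent pattern described above: each interaction is interpreted against metadata at run time, the response is assembled for that specific context, and the event is logged to build the audit trail.

# Generic sketch of the agent pattern (hypothetical names, not SmartAlex's API).
class SchemaAgent:
    def __init__(self, metadata):
        self.metadata = metadata   # dynamic schemas: source-to-field mappings
        self.events = []           # indexed log of every interaction

    def handle(self, source, record, fields):
        # Resolve the dynamic schema for this interaction and build a
        # context-specific response rather than relying on a fixed schema.
        mapping = self.metadata[source]
        response = {f: record.get(mapping[f]) for f in fields}
        self.events.append({"source": source, "fields": list(fields)})
        return response

agent = SchemaAgent({"legacy": {"street": "line1", "zip": "postalCode"}})
print(agent.handle("legacy",
                   {"line1": "3 Elm Rd", "postalCode": "10001"},
                   ["street", "zip"]))
# -> {'street': '3 Elm Rd', 'zip': '10001'}

Because the mappings live in metadata the agent consults on every request, a connector or data source can be swapped out by updating the metadata, without touching the callers.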

The EnterpriseWeb Take
Schemalessness was a reaction to the limitations of structured data. People struggled with the constraints of static structure, and figured that if they simply got rid of structure, then the problem would go away. But this move was merely a shell game, as the limitations of fixed schemas shifted to implied schemas, now without the benefits of structure to inform the code responsible for their processing.

The solution is to raise the level of abstraction and, instead of arguing over fixed vs. implied schemas, to work at the dynamic schema level. Such an approach is model-driven, allowing application designers to build models that capture their data structures, and allowing an intelligent agent to use the metadata each model represents to meet the specific needs of each interaction. The real lesson here is that resolving the challenge of schemalessness requires both dynamic schemas and the action of the agent. Stay tuned to my next newsletter for more information.

More Stories By Jason Bloomberg

Jason Bloomberg is the leading expert on architecting agility for the enterprise. As president of Intellyx, Mr. Bloomberg brings his years of thought leadership in the areas of Cloud Computing, Enterprise Architecture, and Service-Oriented Architecture to a global clientele of business executives, architects, software vendors, and Cloud service providers looking to achieve technology-enabled business agility across their organizations and for their customers. His latest book, The Agile Architecture Revolution (John Wiley & Sons, 2013), sets the stage for Mr. Bloomberg’s groundbreaking Agile Architecture vision.

Mr. Bloomberg is perhaps best known for his twelve years at ZapThink, where he created and delivered the Licensed ZapThink Architect (LZA) SOA course and associated credential, certifying over 1,700 professionals worldwide. He is one of the original Managing Partners of ZapThink LLC, the leading SOA advisory and analysis firm, which was acquired by Dovel Technologies in 2011. He now runs the successor to the LZA program, the Bloomberg Agile Architecture Course, around the world.

Mr. Bloomberg is a frequent conference speaker and prolific writer. He has published over 500 articles, spoken at over 300 conferences, Webinars, and other events, and has been quoted in the press over 1,400 times as the leading expert on agile approaches to architecture in the enterprise.

Mr. Bloomberg’s previous book, Service Orient or Be Doomed! How Service Orientation Will Change Your Business (John Wiley & Sons, 2006, coauthored with Ron Schmelzer), is recognized as the leading business book on Service Orientation. He also co-authored the books XML and Web Services Unleashed (SAMS Publishing, 2002), and Web Page Scripting Techniques (Hayden Books, 1996).

Prior to ZapThink, Mr. Bloomberg built a diverse background in eBusiness technology management and industry analysis, including serving as a senior analyst in IDC’s eBusiness Advisory group, as well as holding eBusiness management positions at USWeb/CKS (later marchFIRST) and WaveBend Solutions (now Hitachi Consulting).
