By Mark R. Hinkle
March 10, 2006 02:00 PM EST
As time passes, Linux's user base continues to grow in the data center, on the desktop, and even in embedded devices. Industry analysts report that Linux server shipments have shown double-digit growth every quarter for over two years. In many cases these servers are being used for expansion or new projects, but inevitably they will be put into service to replace systems that once ran Unix or Windows.
In these cases the migration is usually driven by an event: a hardware upgrade due to obsolescence or capacity concerns, a software maintenance renewal, or some similar occasion. Rarely do we see a complete replacement of all legacy systems; migration is commonly done piecemeal, with one part of the infrastructure moving to Linux instead of renewing the investment in the old system. Migration to Linux also often means adoption of open standards, which are a prime consideration for the open source community: open source without open standards offers far less advantage than a fully open system.
Linux migration is usually a matter of expansion, adding Linux to an increasingly diverse environment. Desktop PCs may remain predominantly Windows, while file servers and application servers once hosted on Windows or even Novell NetWare might soon be hosted on Linux. In this situation there are two hurdles to overcome, especially if the systems need to communicate with one another. The first is the obvious hurdle of moving from one system to another. This is a short-term problem: it's usually very disruptive and requires considerable planning and staging, but it's not especially unique, since you face many of the same problems moving from one version of Windows to another or from one brand of Unix to another.

The second problem is longer term and involves interoperability with existing systems. To lessen the burden here, you should be planning well before a migration. The consideration I believe is most important is whether your systems lock you in, making it difficult to change vendors should you become unhappy with one and want to investigate others. This applies to both your data and your network services. For example, could data stored in DB/2, Sybase, or Oracle be stored in MySQL, or vice versa? Does one system have features you can't live without? Could documents originally authored in Microsoft Office be read in OpenOffice.org? If a new version of Windows becomes available, will it still let you access your Samba file shares hosted on Linux? These are questions you should be asking regardless of platform. Using Apache on a FreeBSD server is a fine choice, but if you decide that Red Hat offers better value, can you move from one platform to the other? The answer is likely yes. Does the same hold true for Web applications developed on Microsoft's Internet Information Server?
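The database question above comes down to a simple habit: always keep an export path to a vendor-neutral format. The sketch below uses SQLite purely as a stand-in for any source/target database pair, and the `customers` table and its rows are hypothetical; the point is that data dumped to plain CSV can be loaded by MySQL, Oracle, or anything else.

```python
import csv
import io
import sqlite3

# Stand-in "source" database; in practice this could be DB/2, Sybase, or Oracle.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
src.executemany("INSERT INTO customers VALUES (?, ?)",
                [(1, "Acme"), (2, "Bettis")])

# Export to CSV, a portable format any target database (MySQL included) can load.
dump = io.StringIO()
writer = csv.writer(dump)
writer.writerow(["id", "name"])
writer.writerows(src.execute("SELECT id, name FROM customers ORDER BY id"))

# Import into a stand-in "target" database from the portable dump.
dst = sqlite3.connect(":memory:")
dst.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
dump.seek(0)
rows = [(int(r["id"]), r["name"]) for r in csv.DictReader(dump)]
dst.executemany("INSERT INTO customers VALUES (?, ?)", rows)
print(dst.execute("SELECT COUNT(*) FROM customers").fetchone()[0])  # → 2
```

If a vendor's tools can't produce something like that CSV step, that absence is itself an answer to the lock-in question.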
My advice is to adhere to open standards and portable file formats that can be more easily migrated later on. Even if you decide to stay with one vendor, it's much better for you to select which solutions to use than for your vendor to make that choice for you. I suggest looking at what is going on today in your enterprise. On the desktop you are likely using Microsoft Office. Its next-generation file formats are XML-based and should, in principle, allow for easier collaboration between Microsoft and OpenOffice.org users. However, watch closely to make sure they really do facilitate the sharing of files and that the hype is not the result of a clever PR campaign.

Another thing to be wary of is a feature planned for Microsoft's yet-to-be-released Vista operating system: encrypted file systems. Since the product has not shipped, it's hard to know how this will affect cross-platform enterprises. My understanding is that this feature would encrypt the data on the hard drive so that, if your laptop were stolen, it could not be booted under another operating system and the data would not be accessible. At first glance this sounds like a valuable feature. The question, however, is whether it would also prevent you from legitimately accessing that data from another operating system (as I do on my dual-boot Windows/Linux laptop). In fact, I often help Windows users migrate their data from an out-of-service PC using a bootable Linux CD and a network connection. Will I still have this option with future products? If this new technology prevents the bad guys from getting my data, does it also prevent me from accessing my data in the way I choose? Is the encryption technology open source, and does it allow me to authenticate to my data from systems other than Windows? Does it make it possible for me to share files between systems that aren't licensed to use this new cryptographic technology? I'm unsure of the answer.
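The appeal of XML-based document formats is easy to demonstrate: both OpenDocument and Microsoft's XML formats are, at bottom, ZIP archives containing XML parts, readable with nothing but a standard library. The sketch below builds a toy, ODF-like archive (the file names mimic OpenDocument but the structure is simplified for illustration) and reads it back with no office suite involved.

```python
import io
import zipfile
import xml.etree.ElementTree as ET

# Build a toy, ODF-like document: a ZIP archive holding XML parts.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as doc:
    doc.writestr("mimetype", "application/vnd.oasis.opendocument.text")
    doc.writestr("content.xml", "<doc><p>Hello from an open format</p></doc>")

# Any program with ZIP and XML libraries can recover the content --
# no office suite, and no single vendor, required.
with zipfile.ZipFile(buf) as doc:
    tree = ET.fromstring(doc.read("content.xml"))
    print(tree.find("p").text)  # → Hello from an open format
```

That is the property to verify before trusting any "XML" format: can a third party, on any operating system, get the content back out this easily?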
I also worry about the inclusion of Digital Rights Management (DRM) technology in hardware. For example, the DVI connections present on many modern graphics cards are closely related to the HDMI (High-Definition Multimedia Interface) connections now used to carry audio and video over a single cable to home entertainment equipment, and eventually computers will use this same interface. Here's a bit of trivia: the HDMI standard includes an element called HDCP (High-bandwidth Digital Content Protection, developed by Intel) that does little to add value to my personal experience (I'm sure the recording and movie industries will offer some statistic about how reduced piracy keeps prices lower for me). It could mean, however, that if I choose a "standard" graphics card with a standard PCI-E interface, I must also make sure it adheres to less obvious standards buried within my hardware if I want to watch a DVD or an HDTV broadcast. Does HDCP add value to me personally? Does it help me get more enjoyment out of my system? Should I be concerned about what's going on within the widgetry of my system? I would think so.
My point in mentioning these things is not to cause you undue worry or to preach doom and gloom. My hope is to make you aware that as you adopt mainstream technologies, you may also unknowingly be adopting features that lock you into a product's technology. Some of these features will have benefits that are useful to you; just make sure you are getting what you bargained for. Also consider what happens if these technologies add a new point of failure. For example, the reason I know so much about HDMI is that I recently bought a plasma TV, and while running cables from my HD receiver to my A/V receiver to my new TV, I found that either the receiver or the set-top box didn't properly implement the standard. The result was that the copy-protection scheme inhibited my ability to legally use my equipment. HDCP never came up in the sales process, nor were the installers of my system aware of the potential problem.

Take the same situation in a different context: What happens if data stored on your Windows server becomes unavailable to your Linux servers because of some obscure DRM scheme? Does it shut down your operation? Does it add unnecessary complexity and inconvenience? These are the questions I would be asking before making my next investment in new technologies. The freedom to migrate is one I believe to be more important than the actual act of migrating. I have made my decisions and continue to make them, but I do so on my terms, not on those of vendors conspiring to lock me in. I don't know which conventions might be widely adopted in the future that would prevent me from using legally purchased products in a reasonable way (of course, the consumer's version of "reasonable" and the vendor's are bound to differ). What I do know is that I need to be vigilant, watch for these gotchas, and take steps to avoid them. I would advise you to do the same.