By Mark R. Hinkle
March 10, 2006 02:00 PM EST
As time passes, the base of Linux users is growing in the data center, on the desktop, and even in embedded electronic devices. Industry analysts report that Linux server shipments have shown double-digit growth every quarter for over two years. In many cases these servers are being used for expansion or new projects. Inevitably they will be put into service to replace systems that once ran Unix or Windows.
In these cases there is usually an event that drives the migration: a hardware upgrade due to obsolescence or capacity concerns, a software maintenance renewal, or some similar occasion. Rarely do we see a complete replacement of all legacy systems; migration is commonly done piecemeal, with one part of the infrastructure migrated instead of renewing the investment in it. This means migration to Linux. It also often means adoption of open standards, which are a prime consideration for the open source community. Open source without open standards offers less advantage than a fully open system.
Linux migration is usually a matter of expansion, adding Linux into an increasingly diverse environment. Desktop PCs may be predominantly Windows, while file servers and application servers once hosted on Windows or even Novell NetWare might soon be hosted on Linux. In this case there are two hurdles to overcome, especially if the systems need to communicate with one another.

The first is the obvious hurdle of moving from one system to another. This is a short-term problem. It's usually very disruptive and requires a considerable amount of planning and staging, though it's not especially unique: you face many of the same problems moving from one version of Windows to another or from one brand of Unix to another.

The second problem is longer term and involves interoperability with existing systems. To lessen the burden here, you should be planning well before a migration. The consideration I believe is most important is whether your systems lock you in, making it unlikely you can change vendors should you become unhappy with one and want to investigate others. This applies to both your data and your network services. For example, could data stored in DB2, Sybase, or Oracle be stored in MySQL, or vice versa? Does one system have features you can't live without? Could documents originally authored in Microsoft Office be read in OpenOffice.org? If a new version of Windows becomes available, will it still let you access your Samba file shares hosted on Linux? These are questions you should be asking regardless of platform. For example, running Apache on a FreeBSD server is a fine choice, but if you decide that Red Hat offers good value, can you move from one platform to the other? The answer is likely yes. Does the same hold true for Web applications developed on Microsoft's Internet Information Services (IIS)?
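To make the portability question concrete, here is a minimal sketch of moving table data through a neutral format rather than a vendor-specific dump. The table name, columns, and connection details are hypothetical; the point is that the data travels as plain CSV, a format no single vendor controls, so the same code serves whichever databases sit on either end.

```python
# A minimal sketch of moving data through a vendor-neutral format.
# Table and connection details are hypothetical; any DB-API 2.0 driver
# (MySQLdb, cx_Oracle, psycopg2, sqlite3, ...) exposes the same cursor
# interface, so only the connect() call is vendor-specific.
import csv
import sqlite3  # stand-in here for any DB-API 2.0 driver

def export_table(conn, table, path):
    """Dump a table to plain CSV."""
    cur = conn.cursor()
    cur.execute("SELECT * FROM " + table)  # identifiers can't be parameterized
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([col[0] for col in cur.description])  # header row
        writer.writerows(cur.fetchall())

def import_table(conn, table, path):
    """Load the same CSV into another database."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        columns = next(reader)  # consume the header row
        # Note: the placeholder style ("?" vs "%s") varies by driver.
        placeholders = ", ".join("?" for _ in columns)
        conn.cursor().executemany(
            "INSERT INTO %s (%s) VALUES (%s)"
            % (table, ", ".join(columns), placeholders),
            reader,
        )
    conn.commit()
```

The design point is the same one made above about platforms: by coding to an open interface (here, the DB-API specification) instead of a vendor's native client library, the switching cost shrinks to a single connection call.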
My advice is to adhere to open standards and portable file formats that can be more easily migrated later on. Even if you decide to stay with one vendor, it's much better for you to select which solutions to use than for your vendor to make that choice for you. I suggest looking at what is going on in your enterprise today. On the desktop you are likely using Microsoft Office. Its next-generation file formats are XML-based and should, intuitively, allow for easier collaboration between Microsoft and OpenOffice.org users. However, watch closely to make sure they really do facilitate the sharing of files and that the promise is not just the result of a clever PR campaign.

Another thing to be wary of is a potential new feature in Microsoft's yet-to-be-released Vista operating system: encrypted file systems. Since the product has not shipped, it's hard to know how this will affect cross-platform enterprises. My understanding is that this feature would encrypt the data on the hard drive (your laptop's drive, for example), the idea being that if your laptop were stolen, it could not be booted under another operating system and the data would not be accessible. At first glance this sounds like a valuable feature. The question, however, is whether it would prevent you from legitimately accessing that data from another operating system (as I do on my dual-boot Windows/Linux laptop). In fact, I often help Microsoft users migrate their data from an out-of-service Windows PC using a bootable Linux CD and a network. Will I still have this option with future products? If this new technology prevents the bad guys from getting my data, does it also prevent me from accessing my data in the way I choose? Is the encryption technology open source, and does it allow me to authenticate and access my data from systems other than Windows? Does it make it possible for me to share files between systems that aren't licensed to use this new cryptographic technology? I'm unsure of the answer.
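One practical way to judge whether an "XML" office format is genuinely open is to see whether you can read it with nothing but standard tools. OpenDocument files (and, reportedly, Microsoft's forthcoming XML formats) are ZIP archives of XML parts, so a few lines of Python can confirm your documents remain readable without any vendor's software. This is a minimal sketch, assuming a hypothetical report.odt and the OpenDocument convention of storing body text in content.xml:

```python
# A minimal sketch: look inside an XML-based office document using only
# the standard library. "report.odt" is a hypothetical file name;
# OpenDocument archives keep the document body in content.xml.
import zipfile
import xml.etree.ElementTree as ET

with zipfile.ZipFile("report.odt") as doc:
    print(doc.namelist())  # the XML parts that make up the document
    root = ET.fromstring(doc.read("content.xml"))
    # Walk every element and print any text it carries, without needing
    # to know the vendor's namespace-qualified tag names.
    for elem in root.iter():
        if elem.text and elem.text.strip():
            print(elem.text.strip())
```

If a few lines like these stop working on a future format revision, say, because the contents are encrypted or the schema is undocumented, that is exactly the quiet lock-in to watch for.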
I also worry about the inclusion of Digital Rights Management (DRM) technology in hardware. For example, the DVI connections present on many modern graphics cards are closely related to the HDMI (High-Definition Multimedia Interface) connections used to combine audio and video into one cable for home entertainment equipment. Eventually, computers will use this same interface. Now here's a bit of trivia: the HDMI standard includes an element called HDCP (High-bandwidth Digital Content Protection, developed by Intel) that does little to add value to my personal experience (I'm sure the recording and movie industries will offer some statistic about how reduced piracy keeps prices lower for me). It could mean, however, that if I choose a "standard" graphics card with a standard PCI-E interface, I must also make sure it adheres to less obvious standards buried within my hardware should I want to watch a DVD or an HDTV broadcast. Does HDCP add value to me personally? Does it help me get more enjoyment out of my system? Should I be concerned about what's going on within the widgetry of my system? I would think so.
My point in mentioning these things is not to cause you undue worry or to preach doom and gloom. My hope is to make you aware that as you continue to adopt mainstream technologies, you may also unknowingly be adopting features that lock you into a product's technology. Some of these features will have benefits that are useful to you; just make sure you are getting what you bargained for. Also consider what happens if these technologies add an additional point of failure. For example, the reason I know so much about HDMI is that I recently bought a plasma TV, and while running cables from my HD receiver to my A/V receiver to my new TV, I found out that either the receiver or the set-top box didn't properly implement the standard. The result was that the copy protection scheme inhibited my ability to legally use my equipment. HDCP never came up in the sales process, nor were the installers of my system aware of the potential problem.

Take the same situation in a different context: What happens if data stored on your Windows server becomes unavailable to your Linux servers because of some obscure DRM scheme? Does it shut down your operation? Does it add unnecessary complexity and inconvenience? These are the questions I would be asking before making my next investment in new technologies. The freedom to migrate is one I believe to be more important than the actual act of migrating. I have made my decisions and continue to make them, but I do so on my terms, not those of vendors conspiring to lock me in. I don't know which conventions might be widely adopted in the future that would prevent me from using legally purchased products in a reasonable way (of course, the consumer's version of "reasonable" and the vendor's are bound to differ). What I do know is that I need to be vigilant, watch for these gotchas, and take steps to avoid them. I would advise you to do the same.