
Migrating to Open Source Databases Running on Linux

Databases Like MySQL, Ingres r3, PostgreSQL, and Firebird Have Aroused a Lot of Interest

Open source databases running on Linux like MySQL, Ingres r3, PostgreSQL, and Firebird have aroused a lot of interest.

Database developers and corporate users are heralding the anticipated release of MySQL 5.0, which includes enterprise-level features such as stored procedures, triggers, and views.

Last August Computer Associates made Ingres r3 available under the CA Trusted Open Source License and followed up with a Million-Dollar Challenge, an unprecedented offer to the open source community to develop migration toolkits for the system.

Tony Gaughan, senior VP at Computer Associates, says, "The relational database world is evolving. Enterprise customers are demanding rich, functional products that scale, while lowering the total cost of ownership. Ingres has the pedigree of one of the most seasoned and functional products in the market that can be flexible enough to meet the demands of even the largest organizations."

The publicly traded data infrastructure software company Pervasive Software now services and supports PostgreSQL. Marten Mickos, CEO of MySQL AB, quoted in SearchEnterpriseLinux.com, is optimistic about open source databases this year, betting that "We will see increased growth, faster growth than before, in the adoption of open source in the enterprise, not just for MySQL, but across the board."

Though surely not breaking news, there are many compelling business and technical reasons for migrating to an open source database running on Linux. IT organizations are becoming more focused on business value and are asking questions like "Are we overspending on software? Are there more cost-effective alternatives that meet our specific needs and don't lock us into a long-term relationship with a vendor?"

Besides, unlike the desktop, a database isn't a user-facing technology. A change or modification to a database is less likely to "stir a hornet's nest" among a company's transactional or knowledge workers.

From a technical perspective, security and flexibility are being given serious consideration. For example, since Microsoft SQL Server is tightly integrated into the Windows platform, it's exposed to Windows virus attacks and, as a result, is vulnerable. There have been numerous documented virus attacks on Microsoft Windows, with some of the worst targeting SQL Server. The Sapphire/Slammer worm (www.cs.berkeley.edu/~nweaver/sapphire/) exploited a buffer overflow vulnerability in Microsoft SQL Server. It infected at least 75,000 hosts and caused network outages and unforeseen consequences such as canceled airline flights, interference with elections, and ATM failures.

Companies are valuing more and more the need to assume greater control of their development processes. With open source, not only can a company view and modify the source code to fix bugs and add needed features, they can control the code's future development. After a consultant or vendor has developed a specific open source application for a customer, that customer is free to use a different consultant or vendor for future development, maintenance, and enhancements if it likes.

Since it's unlikely that Microsoft will open source SQL Server anytime soon, I thought LinuxWorld readers might find an overview of how to migrate the data structure and data from SQL Server to open source databases running on Linux valuable. A word of caution - tread carefully! A manual migration is extremely tedious. Each step takes many man-hours to complete. And, due to space limitations, I won't address the manual migration of stored procedures, views, and triggers that, admittedly, are important components of the enterprise-level database.

That being said, if you're considering a migration from Microsoft SQL Server to MySQL, I would recommend waiting for MySQL 5.0, since the current versions don't have stored procedures, views, and triggers. PostgreSQL, Ingres r3, and Firebird support stored procedures, views, and triggers, so those databases are ready for migration and implementation today. For additional instructions on migrating stored procedures, views, and triggers, please see www.versora.com/__files/documentation/database_migrationsec.pdf.

Migrating Data Structure
First, you'll need to export the table structure using SQL Server Enterprise Manager:

  • At the SQL server, launch Enterprise Manager and connect to the database you intend to migrate.
  • Select all the tables that are being migrated, right-click, and choose Generate SQL Scripts.
  • In the dialog that appears, switch to the 'Formatting' tab. Uncheck the Generate the DROP <object> Command for Each Object box. Check Generate Scripts for All Dependent Objects.
  • To make things more manageable, you'll probably want to choose Create One File Per Object. Click OK and indicate where to save the script files. This procedure will create a data structure that works only with SQL Server.
Tweaking will be required for the new database. Consider removing the brackets around names and types, and changing the data types to the corresponding types in the target database. Remove the permissions and index statements from the end of each of these files and store them in a temporary "holding" file to be applied after the data is loaded (for speed reasons). If the statements aren't revised as indicated, migrating the data will be significantly slower.
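The bracket stripping and type mapping described above are mechanical enough to script. Below is a minimal Python sketch for a PostgreSQL target; the `TYPE_MAP` entries are illustrative assumptions rather than a complete SQL Server-to-PostgreSQL type catalog, so extend the map for the types your schema actually uses:

```python
import re

# Illustrative (incomplete) mapping of SQL Server types to PostgreSQL
# equivalents; extend for the types your schema actually uses.
TYPE_MAP = {
    "datetime": "timestamp",
    "nvarchar": "varchar",
    "ntext": "text",
    "tinyint": "smallint",
    "bit": "boolean",
    "uniqueidentifier": "uuid",
}

def tweak_script(sql: str) -> str:
    """Remove SQL Server bracket quoting and map types for PostgreSQL."""
    # Strip [brackets] around identifiers, e.g. [dbo].[Orders] -> dbo.Orders
    sql = re.sub(r"\[([^\]]+)\]", r"\1", sql)
    # Replace type names on word boundaries, case-insensitively
    for mssql_type, pg_type in TYPE_MAP.items():
        sql = re.sub(rf"\b{mssql_type}\b", pg_type, sql, flags=re.IGNORECASE)
    return sql

print(tweak_script("CREATE TABLE [dbo].[Orders] ([Placed] datetime, [Qty] tinyint)"))
# -> CREATE TABLE dbo.Orders (Placed timestamp, Qty smallint)
```

Run each generated .sql file through a script like this before applying it; the index and permission statements you held back can be tweaked the same way.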

When completed, copy these files to the new machine (via file sharing, by burning a CD, or any other way you want), and apply them to the new database. Each database has its own way of running SQL script files, though most will let you execute scripts via command-line redirection. For example, PostgreSQL has a command-line tool called psql used to import SQL script files. An example command line for PostgreSQL might look something like this:

psql <dbname> -U <username> < sqlscript.sql

The total time needed to move the data structure manually varies depending on which database you're migrating to, how complicated existing tables are, and how many tables there are. Though this isn't a difficult phase, it can be tedious.
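With one script file per object, ordering matters: the held-back index and permission statements must run after the data is loaded. Below is a small Python sketch that builds the psql command lines in the right order; the directory layout and the holding-file name (`indexes_and_grants.sql`) are hypothetical assumptions, not something the tools generate for you:

```python
from pathlib import Path

def build_import_commands(script_dir: str, dbname: str, user: str) -> list[str]:
    """Build psql command lines for one-file-per-object scripts.

    Assumed layout: object scripts as *.sql in script_dir, with the
    held-back index/permission statements collected in
    indexes_and_grants.sql, which must run last.
    """
    scripts = sorted(Path(script_dir).glob("*.sql"))
    held_back = "indexes_and_grants.sql"  # the temporary "holding" file
    ordered = [p for p in scripts if p.name != held_back] + \
              [p for p in scripts if p.name == held_back]
    return [f"psql {dbname} -U {user} < {p}" for p in ordered]
```

You could then run the resulting commands through a shell or Python's subprocess module; the point is simply that table definitions are applied first and the index/grant file last.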

More Stories By Jon Walker

Jon Walker serves as CTO of Versora, an ISV providing Microsoft to Linux migration software. Mr. Walker recently co-authored two whitepapers with Novell titled Migrating from IIS Web Servers to Apache SUSE LINUX Enterprise Server 9.0 and Migrating File and Print Servers from Windows to SUSE LINUX Enterprise Server 9. Prior to Versora, Mr. Walker was CTO/VP of Engineering for Miramar Systems. Software developed under his direction at Miramar has been deployed to over 20 million computers worldwide. Mr. Walker has also served as senior technologist for Nortel and Xing Technology (now Real Networks).

