Migrating data from PostgreSQL to MySQL can raise compatibility issues, and finding the right solution for each one quickly becomes tedious. I went through a lot of hassle migrating an application's data from a local PostgreSQL server to a remote, cloud-based MySQL server. This step-by-step guide walks you through a practical solution for when such a situation arises.
This guide assumes MySQL is installed on the remote machine and that you have command-line access to the local server.
1. Access your local machine/local server running the PostgreSQL database from the terminal using the following command:
ssh -X firstname.lastname@example.org -p
2. Navigate to the Rails application folder:
3. There are two options to export your data:
a) Using the pg_dump command from the terminal:
pg_dump -h localhost -p 5432 -U postgres -F p -b -a -f "/path/to/file/for/dumping/examplefile.sql" databasename
-h = --host, e.g. localhost
-p = --port (usually 5432 for PostgreSQL)
-U = --username, the database username (default: postgres)
-F p = --format=format. Selects the format of the output; format can be one of the following:
p plain : Output a plain-text SQL script file (the default).
c custom : Output a custom-format archive suitable for input into pg_restore.
d directory : Output a directory-format archive suitable for input into pg_restore.
-b = --blobs. Include large objects in the dump. This is the default behavior except when --schema, --table, or --schema-only is specified, so the -b switch is only useful to add large objects to selective dumps.
-a = --data-only. Dump only the data, not the schema (data definitions). Table data, large objects, and sequence values are dumped.
-f = --file, the file name with the path where the data will be dumped. (Create examplefile.sql beforehand at that path.)
For more information on pg_dump options, visit: http://www.postgresql.org/docs/9.3/static/app-pgdump.html
b) Using the pgAdmin GUI:
– Open a terminal and run:
– When the GUI opens, connect to the database server using the login credentials for your account
– Once connected, press the + sign on the hierarchical menu and go to the public schema
– Right-click and choose Backup from the bottom of the context menu
– A new window appears; enter the file name and choose options from the different tabs, including plain format, blobs, and excluding privileges, owner names, etc.
– Make sure to check the data-only option in the second tab to extract only the data from the tables, and enable column inserts under the insert options
– Select all/required tables whose data needs to be exported
– When you have chosen the desired options, press the OK button and the data will be written to the external file
4. Now that the PostgreSQL dump file has been created, the data needs to be converted to a MySQL-compatible format:
– Download pg2mysql-1.9.tar.bz2 from http://www.lightbox.ca/pg2mysql.php (you may copy-paste the contents of the dump file into the online version of the converter, but for large data files copy-pasting is not desirable, so it is better to use a local copy of the tool)
– Extract the files to some directory using command | tar xjvf pg2mysql-1.9.tar.bz2 (in my case the directory was /tmp, and the dump file was also written to the /tmp folder)
– Go to the pg2mysql folder using command | cd /tmp/pg2mysql-1.9
– To convert the dump file to a MySQL-compatible format, run the following command from the pg2mysql directory:
php pg2mysql_cli.php /tmp/myexamplefile.sql /tmp/myexamplefile_mysql.sql innodb
The command format is: php pg2mysql_cli.php /[path-file-to-convert] /[output-file-path-with-name] [engine]
In case you get an error running pg2mysql_cli.php (in my case I did), do the following:
-emacs pg2mysql.inc.php [you may use any other editor, like vi]
-Press Ctrl+S and search for &$; change it to $ only, since call-time pass-by-reference with & is deprecated in newer versions of PHP
-Press Ctrl+X Ctrl+S to save your changes and then Ctrl+X Ctrl+C to exit emacs
-Re-run the php command from the last step and wait until your file is converted
-You may view the converted file using command | less /filepath/filename.extension
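If you prefer a non-interactive fix, the same edit can be scripted with sed. This sketch demonstrates the substitution on a sample line (the function name here is only illustrative, not taken from the actual file); to apply it to the real file you would run sed -i.bak 's/&\$/$/g' pg2mysql.inc.php, which keeps a backup as pg2mysql.inc.php.bak:

```shell
# Demonstrate the edit: call-time pass-by-reference arguments like
# `&$input` become plain `$input`. The sample line is illustrative;
# apply the same sed expression to pg2mysql.inc.php itself with -i.
echo 'function pg2mysql_convert(&$input) {' | sed 's/&\$/$/g'
# prints: function pg2mysql_convert($input) {
```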
5. Now that the file has been converted on your local server, copy it from there to the cloud project directory using the scp or rsync command from the local server:
scp /tmp/myexamplefile_mysql.sql email@example.com:/tmp
6. Now log in to the cloud server from the terminal using your login credentials (in our example we used Google Cloud).
7. Copy the file from the temporary folder to the phpMyAdmin upload folder (the default directory for phpMyAdmin import files):
cp /tmp/myexamplefile_mysql.sql /var/lib/phpMyAdmin/upload/
8. Assuming you have the mysql2 gem installed in your cloud application and the config/database.yml file of the Rails app filled with the correct database user credentials, go to your application directory and run the database migrations for your environment to create the database tables automatically. If the application uses plugins, make sure to run the plugin migrations separately as well:
rake db:migrate RAILS_ENV=production
9. Migrations may leave some dummy data in the tables, so it is good practice to truncate any default data before uploading the converted MySQL data:
mysql -uUSERNAME -pPASSWORD -Nse 'show tables' DATABASENAME | while read table; do mysql DATABASENAME -uUSERNAME -pPASSWORD -e "SET FOREIGN_KEY_CHECKS=0;truncate table $table;SET FOREIGN_KEY_CHECKS=1;"; done
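For review purposes, the one-liner above can be unrolled into a loop that only prints the statements it would run, so you can inspect them before executing anything. The table names below are placeholders; in practice you would feed the loop from mysql -Nse 'show tables' instead:

```shell
# Sketch: emit the TRUNCATE statements the loop above executes, with
# foreign key checks disabled around each one. The table names are
# placeholders; replace the list with the output of `show tables`.
for table in users posts comments; do
  printf 'SET FOREIGN_KEY_CHECKS=0; TRUNCATE TABLE `%s`; SET FOREIGN_KEY_CHECKS=1;\n' "$table"
done
```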
10. Now that the tables are truncated, you may import the MySQL-compatible myexamplefile_mysql.sql data file. The file may be chosen from the upload directory using the phpMyAdmin UI, but for large files (>8 MB) uploading directly may cause problems, so it is recommended to use the command-line interface:
mysql -uUSERNAME -pPASSWORD DATABASENAME < /var/lib/phpMyAdmin/upload/myexamplefile_mysql.sql
This takes the username, password, and database name, reads the file from the given path, and uploads the data to the tables no matter how large the file is.
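Before running the import, a quick sanity check is to count the INSERT statements in the converted file. The sketch below creates a tiny two-row stand-in file so it is runnable anywhere; in practice you would point grep at your real converted dump in the upload directory:

```shell
# Sanity check before importing: count INSERT statements in the file.
# A two-row stand-in file is created here for illustration; use the
# path to your real converted dump in practice.
printf 'INSERT INTO `users` VALUES (1);\nINSERT INTO `users` VALUES (2);\n' > /tmp/sample_mysql.sql
grep -c '^INSERT INTO' /tmp/sample_mysql.sql   # prints: 2
```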
11. Now that the data is imported, restart the MySQL service on the cloud instance and you are ready to use the database with the application:
sudo /etc/init.d/mysqld restart
This guide should save you a lot of the time and hassle of migrating a database from PostgreSQL to MySQL. It also covers the basics of working with the terminal to upload large import files to MySQL/phpMyAdmin, manually truncating databases, and copying files between servers and systems.
When transferring files between servers, make sure you have the right permissions (e.g. your SSH key added to the list of authorized keys). Feel free to leave comments or suggest improvements to the steps if you feel they are important.