mysqlpdump

Description

MySQL Parallel Dump

A multi-threaded mysqldump is no longer a utopia. mysqlpdump can dump all your tables and databases in parallel, so it can be much faster on systems with multiple CPUs.

By default it stores each table in a separate file. It can also write the dump to stdout, although this is not recommended because it can use all the memory in your system if your tables are large.
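
The sketch below is only an illustration of the idea, not the actual mysqlpdump code: a pool of worker threads, each running its own mysqldump process and writing one table to its own file. The credentials, table list and worker count are placeholders.

import subprocess
from concurrent.futures import ThreadPoolExecutor

USER, PASSWORD = "root", "password"                 # placeholder credentials
TABLES = [("mydb", "orders"), ("mydb", "users")]    # hypothetical (database, table) pairs
WORKERS = 4                                         # roughly the number of CPUs

def dump_table(db, table):
    # Run one mysqldump process and store the table in its own file.
    outfile = "%s_%s.sql" % (db, table)
    with open(outfile, "w") as out:
        subprocess.check_call(
            ["mysqldump", "-u", USER, "-p" + PASSWORD, db, table],
            stdout=out)

with ThreadPoolExecutor(max_workers=WORKERS) as pool:
    jobs = [pool.submit(dump_table, db, table) for db, table in TABLES]
for job in jobs:
    job.result()   # re-raise any mysqldump failure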

History

I saw an interesting post on the MySQL Performance Blog with some suggestions for improving mysqldump.

Here is my effort to implement some of those suggestions.

Download

Requirements

Usage

Simplest usage (will save a file for each table):

mysqlpdump.py -u root -p password

Save gzip-compressed files to /tmp/dumps and pass “--skip-opt” to mysqldump:

mysqlpdump.py -u root -p password -d /tmp/dumps/ -g -P "--skip-opt"
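
For illustration only, here is a minimal sketch of how a single table dump could be gzip-compressed while passing extra options straight through to mysqldump. This is not the code mysqlpdump uses; the database, table and output path are made up.

import gzip
import subprocess

USER, PASSWORD = "root", "password"      # placeholder credentials
EXTRA = ["--skip-opt"]                   # options handed straight to mysqldump

proc = subprocess.Popen(
    ["mysqldump", "-u", USER, "-p" + PASSWORD] + EXTRA + ["mydb", "orders"],
    stdout=subprocess.PIPE)
with gzip.open("/tmp/dumps/mydb_orders.sql.gz", "wb") as out:
    for chunk in iter(lambda: proc.stdout.read(65536), b""):
        out.write(chunk)
proc.wait()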

Output to stdout and use 20 threads:

mysqlpdump.py -u root -p password -stdout -t 20

Be more “verbose”:

mysqlpdump.py -u root -p password -v

Exclude the “mysql” and “test” databases from the dump:

mysqlpdump.py -u root -p password -e mysql -e test

Only dump the “mysql” database:

mysqlpdump.py -u root -p password -i mysql

Links

Changelog

  • 0.5
    • Compress the 00_master_data.sql file when compression is requested
    • Bugfix: when called without a terminal or a logged-in user, it falls back to “nobody”.
    • Bugfix: the destination directory now applies to 00_master_data.sql as well
  • 0.4
    • Made it compatible with Python 2.4
    • Can include and exclude specified databases.
  • 0.3
    • Fixed a bug that prevented tables from being dumped because of a lock
    • Added the --master-data option to write the “CHANGE MASTER TO” statement (see the sketch after this list)
  • 0.2
    • Store dumps directly to files instead of to stdout
    • Can compress files
    • Dump each table in its own file
    • Can pass parameters directly to mysqldump
  • 0.1
    • First version
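
As a rough sketch of the --master-data idea mentioned in 0.3 (not taken from mysqlpdump itself): read the binary log coordinates with SHOW MASTER STATUS and write the matching CHANGE MASTER TO statement to 00_master_data.sql. The credentials are placeholders, and the mysql client is assumed to be on the PATH with binary logging enabled.

import subprocess

USER, PASSWORD = "root", "password"      # placeholder credentials

# SHOW MASTER STATUS reports the current binary log file and position.
out = subprocess.check_output(
    ["mysql", "-u", USER, "-p" + PASSWORD, "--batch", "-e", "SHOW MASTER STATUS"])
lines = out.decode().splitlines()
if len(lines) > 1:                        # first line is the column header
    log_file, log_pos = lines[1].split("\t")[:2]
    with open("00_master_data.sql", "w") as f:
        f.write("CHANGE MASTER TO MASTER_LOG_FILE='%s', MASTER_LOG_POS=%s;\n"
                % (log_file, log_pos))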

License

mysqlpdump is released under the GNU GPL license.