This utility imports metadata (object definitions), data, or both for one or more databases from one or more files.
If an object on the destination server has the same name as an imported object, the existing object is dropped before the new object is imported.
To skip objects by type, use the
--skip option with a comma-separated list of
the object types to skip. This enables you to import a particular set
of objects, say, for importing only events (by excluding all other
types). Similarly, to skip creation of
UPDATE statements for
BLOB data, specify the
--skip-blobs option.
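For instance, object filtering could be used to import only event definitions. The following sketch assumes a local server reachable as root@localhost and an export file named util_test_export.sql produced by mysqldbexport; both are illustrative:

```shell
# Import only events by skipping every other object type
# (hypothetical file name and connection details).
mysqldbimport --server=root@localhost --import=definitions \
    --skip=CREATE_DB,DATA,FUNCTIONS,GRANTS,PROCEDURES,TABLES,TRIGGERS,VIEWS \
    util_test_export.sql
```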
Input consists of SQL statements. For definitions, this
consists of the appropriate
CREATE and GRANT statements. For data,
this is an INSERT statement
(or bulk insert if the
--bulk-insert option is specified).
Input is formatted in grid or table format like that of the mysql client command-line tool.
Input is formatted in comma-separated values format.
Input is a simple CSV file containing uniform rows with values
separated by commas. The file can contain a header (the
first row) that lists the table columns. The
--table option is required to
use this format.
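As a sketch, a raw_csv input file might look like the following. The table name util_test.employees, the column names, and the connection details are all illustrative, and the import command itself must be run against a live server:

```shell
# Create a hypothetical raw_csv input file: uniform rows, with an
# optional header row that lists the table columns.
cat > employees.csv <<'EOF'
id,name,dept
1,Alice,Sales
2,Bob,Engineering
EOF

# raw_csv input carries no table name, so --table is required
# (run against a live server):
#   mysqldbimport --server=root@localhost --format=raw_csv \
#       --table=util_test.employees employees.csv
```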
Input is formatted in tab-separated format.
Input is formatted in single-column (vertical) format like that of the
\G command output for the mysql
client command-line tool.
To indicate that input in csv
or tab format does not contain
column headers, specify the
--no-headers option.
To turn off all feedback information, specify the
--quiet option.
By default, the utility creates each table on the destination
server using the same storage engine as the original table. To
override this and specify the storage engine to use for all tables
created on the destination server, use the
--new-storage-engine option.
If the destination server supports the new engine, all tables use
that engine.
To specify the storage engine to use for tables for which the
destination server does not support the original storage engine on
the source server, use the
--default-storage-engine option. If this
option is given and the destination server does not support the
specified storage engine, a warning is issued and the server's
default storage engine setting is used instead.
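The two storage-engine options can be combined. In this sketch, the connection details and file name are illustrative:

```shell
# --new-storage-engine asks for InnoDB for all imported tables;
# --default-storage-engine names the fallback to use when the
# destination does not support a table's original engine.
mysqldbimport --server=root@localhost --import=both \
    --new-storage-engine=InnoDB --default-storage-engine=MyISAM \
    data.sql
```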
You must provide connection parameters (user, host, password, and so forth) for an account that has the appropriate privileges to access all objects in the operation. For details, see NOTES.
If you attempt to import databases on a server with GTIDs enabled (GTID_MODE = ON), a warning is generated if the import file does not include the GTID statements generated by mysqldbexport.
The utility will also generate a warning if you import databases
on a server without GTIDs enabled and there are GTID statements
present in the file. Use the
--skip-gtid option to ignore
the GTID statements.
To make the best use of GTIDs with export and import, export
all of the databases on the server with the
--all option. This
generates an export file with all of the databases and the GTIDs
executed to that point. Importing this file on another server
ensures that the server has all of the data as well as all of the GTIDs
recorded correctly in its logs.
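A minimal sketch of that workflow, assuming hosts named source-host and dest-host and default ports (all names are illustrative):

```shell
# On the source: export every database plus the GTIDs executed so far.
mysqldbexport --server=root@source-host:3306 --export=both --all \
    > full_export.sql

# On the destination: import the file, replaying its GTID statements.
mysqldbimport --server=root@dest-host:3306 --import=both full_export.sql
```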
mysqldbimport accepts the following command-line options:
Display a help message and exit.
Display license information and exit.
Enable autocommit for data import. By default, autocommit is off and data changes are only committed once at the end of each imported file.
Use bulk insert statements for data.
Sets the client character set. The default is retrieved from
the server variable character_set_client.
The engine to use for tables if the destination server does not support the original storage engine on the source server.
Drop each database to be imported if it exists, before importing anything into it.
Import the files and generate the statements but do not execute them. This is useful for testing input file validity.
Specify the input format. Permitted format values are sql (default), grid, tab, csv, raw_csv, and vertical.
Specify the import format. Permitted format values are:
Table 5.1. mysqldbimport Import Types
|definitions|Only import the definitions (metadata) for the objects in the database list|
|data|Only import the table data for the tables in the database list|
|both|Import both the definitions (metadata) and data|
If you attempt to import objects into an existing database,
the result depends on the import format. If the format is
definitions or both, an error occurs
unless the --drop-first option is
given. If the format is
data, imported table data
is added to existing table data.
Specify the maximum number of INSERT statements to bulk together, by
default 30000. This option is only used with
--bulk-insert.
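For example, bulk INSERT statements could be capped at 10,000 rows each; the file name and connection details below are illustrative:

```shell
# --max-bulk-insert has no effect unless --bulk-insert is also given.
mysqldbimport --server=root@localhost --import=data \
    --bulk-insert --max-bulk-insert=10000 data.sql
```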
Specify the number of processes used to concurrently import the specified files. Special values: 0 (number of processes equal to the number of detected CPUs) and 1 (default, no concurrency). Multiprocessing is applied at the file level on all operating systems.
The engine to use for all tables created on the destination MySQL server.
Input does not contain column headers. This option applies only to the csv and tab formats.
Turn off all messages for quiet execution.
Connection information for the server in the format: <user>[:<passwd>]@<host>[:<port>][:<socket>] or <login-path>[:<port>][:<socket>].
Specify objects to skip in the operation as a comma-separated list (no spaces). Permitted values for this list are: CREATE_DB, DATA, EVENTS, FUNCTIONS, GRANTS, PROCEDURES, TABLES, TRIGGERS, and VIEWS.
Do not import BLOB data.
Skip execution of GTID statements.
Do not execute replication commands.
Specify the table for importing. This option is required while
using the raw_csv format.
Specify how much information to display. Use this option
multiple times to increase the amount of information. For
example, -v = verbose,
-vv = more verbose,
-vvv = debug.
Display version information and exit.
The login user must have the appropriate permissions to create new
objects, access (read) the
mysql database, and
grant privileges. If a database to be imported already exists, the
user must have read permission for it, which is needed to check
the existence of objects in the database.
Actual privileges needed may differ from installation to installation, depending on the security privileges present, whether the database contains certain objects such as views or events, and whether binary logging is enabled.
Some combinations of the options may result in errors during the operation. For example, excluding tables but not views may result in an error when a view is imported.
For the --format and
--import options, the
permitted values are not case-sensitive. In addition, values may
be specified as any unambiguous prefix of a valid value. For
example, --format=g
specifies the grid format. An error occurs if a prefix matches
more than one valid value.
When importing data and including the GTID commands, you may encounter an error similar to "GTID_PURGED can only be set when GTID_EXECUTED is empty". This occurs because the destination server is not in a clean replication state. To solve this problem, you can issue a "RESET MASTER" command on the destination prior to executing the import.
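A sketch of that recovery sequence; the destination host name dest-host and the file name are illustrative, and both commands require a live server:

```shell
# Clear GTID_EXECUTED on the destination so GTID_PURGED can be set...
mysql --user=root --host=dest-host -e "RESET MASTER"

# ...then rerun the import that includes the GTID statements.
mysqldbimport --server=root@dest-host:3306 --import=both full_export.sql
```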
The path to the MySQL client tools should be included in the
PATH environment variable in order to use the
authentication mechanism with login-paths. This allows the
utility to use the my_print_defaults tool,
which is required to read the login-path values from the login
configuration file (.mylogin.cnf).
Keep in mind that you can only take advantage of multiprocessing if your system has multiple CPUs available for concurrent execution. Also note that multiprocessing is applied at the file level for the mysqldbimport utility, which means that only different files can be concurrently imported.
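For example, on a machine with four or more CPUs, four export files could be imported concurrently; the file names and connection details are illustrative:

```shell
# One worker per file, up to 4 concurrent imports; extra files queue.
mysqldbimport --server=root@localhost --import=both --multiprocess=4 \
    db1.sql db2.sql db3.sql db4.sql
```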
To import the metadata from the
util_test database to the server on the local host using a file in CSV
format, use this command:
mysqldbimport --server=root@localhost --import=definitions \
    --format=csv data.csv
# Source on localhost: ... connected.
# Importing definitions from data.csv.
#...done.
Similarly, to import the data from the
util_test database to the server on the local
host, importing the data using bulk insert statements, use this
command:
mysqldbimport --server=root@localhost --import=data \
    --bulk-insert --format=csv data.csv
# Source on localhost: ... connected.
# Importing data from data.csv.
#...done.
To import both data and definitions from the
util_test database, importing the data using
bulk insert statements from a file that contains SQL statements,
use this command:
mysqldbimport --server=root@localhost --import=both --bulk-insert \
    --format=sql data.sql
# Source on localhost: ... connected.
# Importing definitions and data from data.sql.
#...done.
You also need permissions to create the new data directory and write data to it.