We have a 30 GB database and we want to take a backup of it. The question is: we want to take the backup in batches of 10k records each. How can we take a batch-of-records backup using mysqldump?
Here is something you may find intriguing
mysqldump already has an option called --extended-insert (enabled by default). It groups a series of rows so they can be loaded with a single INSERT command. While mysqldump itself gives you no control over the number of rows batched together, you can parse the mysqldump output yourself. Of course, this means you will have to script it in the language of your choice (Python, Perl, C++, etc.).
For any given table in a mysqldump of a single database, you will see the following paradigm:
```
CREATE TABLE ...
LOCK TABLES ... WRITE;
INSERT INTO ...
INSERT INTO ...
INSERT INTO ...
...
INSERT INTO ...
UNLOCK TABLES;
```
You could parse the mysqldump as follows:

1. (Step01) Read lines until you reach the first INSERT INTO for a table.
2. Collect the rows from each INSERT you read, writing them out in batches of 10,000.
3. When you read UNLOCK TABLES;, flush whatever rows remain in the current batch, then goto Step01 for the next table.
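As a rough illustration, that loop might look like this in Python. This is a sketch, not production code: it assumes the dump is a plain .sql file and that each INSERT statement fits on one line.

```python
def split_dump_inserts(lines):
    """Walk a mysqldump file line by line and yield the INSERT
    statements for each table, one list per table.

    `lines` is any iterable of strings (e.g. an open file object).
    """
    batch = []
    in_table = False
    for line in lines:
        stripped = line.strip()
        if stripped.startswith("INSERT INTO"):
            in_table = True
            batch.append(stripped)
        elif stripped.startswith("UNLOCK TABLES") and in_table:
            # End of this table's data: hand the batch back,
            # then restart at Step01 for the next table
            yield batch
            batch = []
            in_table = False
    if batch:  # dump ended without UNLOCK TABLES (e.g. --skip-lock-tables)
        yield batch
```

Each yielded list can then be written to its own file, or split further into 10k-row chunks.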
Each INSERT command you read has a certain number of rows. You cannot control this number unless you script the parsing to regroup the rows into batches of the size you choose. Even so, I would advise against that, because mysqldump decided the best number of rows for that particular batch given the current configuration settings (perhaps the max_allowed_packet).
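If you just want to know how many rows a given extended INSERT carries, a crude count is possible by counting the value tuples. This sketch is deliberately naive: string data that itself contains `),(` will inflate the count, so a real parser must track quoting.

```python
def count_rows_in_extended_insert(stmt):
    """Crude row count for an extended INSERT: counts the
    `),(` separators between value tuples and adds one.
    WARNING: naive -- string data containing '),(' will
    inflate the count; a real parser must respect quotes.
    """
    values = stmt.split(" VALUES ", 1)[1]
    return values.count("),(") + 1
```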
Alternatively, use --skip-extended-insert and batch them your way.
When you deactivate --extended-insert, each INSERT command will contain exactly one row. You can then collect the rows with a script, grouping as many rows per batch as you like (for example, 10,000 at a time).
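A minimal sketch of that approach in Python, assuming the dump was produced with --skip-extended-insert so every data line is a complete single-row INSERT. The output file naming is invented for the example.

```python
def batch_inserts(lines, batch_size=10000):
    """Group single-row INSERT statements (one per line, as produced
    by mysqldump --skip-extended-insert) into lists of `batch_size`.
    """
    batch = []
    for line in lines:
        if line.lstrip().startswith("INSERT INTO"):
            batch.append(line.rstrip("\n"))
            if len(batch) == batch_size:
                yield batch
                batch = []
    if batch:  # leftover rows smaller than one full batch
        yield batch

def write_batches(lines, prefix="batch", batch_size=10000):
    """Write each batch to its own file: batch_00001.sql, batch_00002.sql, ..."""
    for i, batch in enumerate(batch_inserts(lines, batch_size), start=1):
        with open(f"{prefix}_{i:05d}.sql", "w") as out:
            out.write("\n".join(batch) + "\n")
```

Each resulting file can be restored independently with `mysql dbname < batch_00001.sql`, which gives you the per-10k-row granularity the question asks about.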