{"id":201,"date":"2020-03-25T05:47:00","date_gmt":"2020-03-25T05:47:00","guid":{"rendered":"https:\/\/molecularsciences.org\/content\/?p=201"},"modified":"2020-12-05T05:52:00","modified_gmt":"2020-12-05T05:52:00","slug":"copying-a-large-database-from-one-server-to-another","status":"publish","type":"post","link":"https:\/\/molecularsciences.org\/content\/copying-a-large-database-from-one-server-to-another\/","title":{"rendered":"Copying a large database from one server to another"},"content":{"rendered":"\n<p>Recently, I had to copy a large database from one server to another. I had 100G free on the source server. Unfortunately, the server would run out of memory before the mysqldump would complete. So I had to find a way to make the mysqldumps smaller. This can be done by creating separate dumps for each table. Then I had to compress the files to conserve memory. Following is the script I wrote to accomplish this. Type entire code on one line.<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>for I in $(mysql -u root -p node2 -e 'show tables' -s --skip-column-names); do mysqldump -u root -pMyPass node2 $I | gzip > \"$I.sql.gz\"; done<\/code><\/pre>\n\n\n\n<p>The mysql command in parenthesis will get a list of tables. The for loop will loop through all the tables. mysqldump will create sql dumps for each table. I included my my password in the command because I didn&#8217;t want to type my password 400 times, once for each table. gzip compresses the file.<\/p>\n\n\n\n<p>For your work, if you don&#8217;t want to compress the sql dumps, try the following code (type entire code on one line):<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>for I in $(mysql -u root -p node2 -e 'show tables' -s --skip-column-names); do mysqldump -u root -pMyPass node2 $I > $I.sql; done<\/code><\/pre>\n\n\n\n<p>When the job is complete, remove the line from history because it has your MySQL password<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>history -d &lt;linenumber><\/code><\/pre>\n\n\n\n<p>or change your MySQL password<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Recently, I had to copy a large database from one server to another. I had 100G free on the source server. Unfortunately, the server would run out of memory before the mysqldump would complete. So I had to find a way to make the mysqldumps smaller. 
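The commands above only cover the source side; to finish the copy, the files still have to be moved to the destination server and loaded. Here is one way to do it. Treat it as a sketch: the host name destination, the /tmp/node2-dumps path, and the credentials are placeholders, and it assumes the per-table .sql.gz files created by the first loop.

# copy the compressed dumps to the destination server
ssh root@destination 'mkdir -p /tmp/node2-dumps'
scp *.sql.gz root@destination:/tmp/node2-dumps/

# then, on the destination server: create the database and load each table dump
mysql -u root -p -e 'CREATE DATABASE IF NOT EXISTS node2'
for F in /tmp/node2-dumps/*.sql.gz; do gunzip -c "$F" | mysql -u root -pMyPass node2; done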