digitalmars.D.learn - Convert a huge SQL file to CSV
- biocyberman (8/8) Jun 01 2018 I need to convert a compressed 17GB SQL dump to CSV. A workable
- Martin Tschierschke (17/25) Jun 01 2018 You don't need python:
- biocyberman (7/33) Jun 02 2018 Ah yes, thank you Martin. I forgot that we can do a "batch" SQL
- Steven Schveighoffer (6/45) Jun 04 2018 Well, it could be done quick-and-dirty by simply ignoring most of SQL
I need to convert a compressed 17GB SQL dump to CSV. A workable solution is to create a temporary mysql database, import the dump, query it with Python, and export. But I wonder if there is some way in D to parse the SQL file directly, query it, and export the data. I imagine this will involve both parsing and querying, because the data is stored in several tables. I am in the process of downloading the dump now, so I can't give an excerpt of the data.
Jun 01 2018
On Friday, 1 June 2018 at 09:49:23 UTC, biocyberman wrote:
> I need to convert a compressed 17GB SQL dump to CSV. A workable solution is to create a temporary mysql database, import the dump, query it with Python, and export. But I wonder if there is some way in D to parse the SQL file directly, query it, and export the data. I imagine this will involve both parsing and querying, because the data is stored in several tables. I am in the process of downloading the dump now, so I can't give an excerpt of the data.

You don't need python:

https://michaelrigart.be/export-directly-mysql-csv/

SELECT field1, field2
FROM table1
INTO OUTFILE '/path/to/file.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
ESCAPED BY '\\'
LINES TERMINATED BY '\n';

Most important is INTO OUTFILE: here you state the path where you want MySQL to store the CSV file. Keep in mind that the path needs to be writeable for the MySQL user.

You can write a parser for SQL in D, but even if the import into mysql takes some time, it is only compute time and not yours.

Regards mt.
Jun 01 2018
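As an aside, here is a minimal D sketch (not code from the thread) of driving such an export from a program by handing the INTO OUTFILE statement to the mysql command-line client through std.process. The database name "mydb", the output path, and the assumption that the client picks up its credentials from ~/.my.cnf are all placeholders:

import std.process : execute;
import std.stdio : stderr, writeln;

void main()
{
    // Hypothetical statement; table, columns and output path are placeholders.
    enum sql = "SELECT field1, field2 FROM table1 " ~
               "INTO OUTFILE '/var/lib/mysql-files/table1.csv' " ~
               "FIELDS TERMINATED BY ',' ENCLOSED BY '\"' ESCAPED BY '\\\\' " ~
               "LINES TERMINATED BY '\\n'";

    // -e runs the statement in batch mode and exits; credentials are assumed
    // to come from ~/.my.cnf rather than the command line.
    auto result = execute(["mysql", "mydb", "-e", sql]);
    if (result.status != 0)
        stderr.writeln("export failed:\n", result.output);
    else
        writeln("CSV written on the server side.");
}

Note that INTO OUTFILE writes the file on the server host, so with a remote server you would either fetch the file afterwards or run a plain SELECT through mysql --batch and redirect its (tab-separated) output instead.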
On Friday, 1 June 2018 at 10:15:11 UTC, Martin Tschierschke wrote:
> On Friday, 1 June 2018 at 09:49:23 UTC, biocyberman wrote:
>> I need to convert a compressed 17GB SQL dump to CSV. [...]
>
> You don't need python:
>
> https://michaelrigart.be/export-directly-mysql-csv/
>
> SELECT field1, field2
> FROM table1
> INTO OUTFILE '/path/to/file.csv'
> FIELDS TERMINATED BY ','
> ENCLOSED BY '"'
> ESCAPED BY '\\'
> LINES TERMINATED BY '\n';
>
> Most important is INTO OUTFILE: here you state the path where you want MySQL to store the CSV file. Keep in mind that the path needs to be writeable for the MySQL user.
>
> You can write a parser for SQL in D, but even if the import into mysql takes some time, it is only compute time and not yours.
>
> Regards mt.

Ah yes, thank you Martin. I forgot that we can do a "batch" SQL query where the mysql server parses and runs the query commands, so there is no need for Python. But I am still waiting for the import of the mysql dump to finish. It has taken 18 hours and is still counting! The whole mysql database is 68GB at the moment. Can we avoid the import and query the database dump directly?
Jun 02 2018
On 6/3/18 2:04 AM, biocyberman wrote:
> On Friday, 1 June 2018 at 10:15:11 UTC, Martin Tschierschke wrote:
>> [...]
>>
>> You can write a parser for SQL in D, but even if the import into mysql takes some time, it is only compute time and not yours.
>
> Ah yes, thank you Martin. I forgot that we can do a "batch" SQL query where the mysql server parses and runs the query commands, so there is no need for Python. But I am still waiting for the import of the mysql dump to finish. It has taken 18 hours and is still counting! The whole mysql database is 68GB at the moment. Can we avoid the import and query the database dump directly?

Well, it could be done quick-and-dirty by simply ignoring most of the SQL syntax and focusing on the actual constructs used in the dump. mysqldump is pretty consistent with how it outputs data. You might even be able to do it with an awk script :)

-Steve
Jun 04 2018
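For the quick-and-dirty route Steve describes, here is a minimal D sketch of the idea (not code from the thread): scan the dump for one table's INSERT statements and turn each value tuple into a CSV row. The table name "table1", the file names, and the assumption that mysqldump put each INSERT on a single line are all hypothetical:

import std.stdio;
import std.algorithm : canFind, startsWith;
import std.array : join, replace;

// Quote a CSV field only when it needs it (comma, quote or newline inside).
string csvField(string s)
{
    if (s.canFind(',') || s.canFind('"') || s.canFind('\n'))
        return `"` ~ s.replace(`"`, `""`) ~ `"`;
    return s;
}

void main()
{
    enum prefix = "INSERT INTO `table1` VALUES ";
    auto outFile = File("table1.csv", "w");

    foreach (line; File("dump.sql", "r").byLine)
    {
        if (!line.startsWith(prefix))
            continue;   // skip CREATE TABLE, LOCK TABLES, comments, ...

        string[] row;
        string field;
        bool inTuple, inString;

        for (size_t i = prefix.length; i < line.length; ++i)
        {
            immutable char c = line[i];
            if (inString)
            {
                if (c == '\\' && i + 1 < line.length)
                    field ~= line[++i];      // keep the escaped character
                else if (c == '\'')
                    inString = false;        // closing quote of a string value
                else
                    field ~= c;
            }
            else if (!inTuple)
            {
                if (c == '(')
                    inTuple = true;          // start of one row's value tuple
            }
            else if (c == '\'')
                inString = true;             // start of a quoted string value
            else if (c == ',')
            {
                row ~= field;                // unquoted value (number, NULL)
                field = null;
            }
            else if (c == ')')
            {
                row ~= field;                // tuple finished: emit one CSV row
                string[] cooked;
                foreach (f; row)
                    cooked ~= csvField(f);
                outFile.writeln(cooked.join(","));
                row = null;
                field = null;
                inTuple = false;
            }
            else
                field ~= c;
        }
    }
}

The per-character walk is only there so that commas inside quoted string values do not split a field; everything else in the dump is simply skipped, which is exactly the "ignore most of SQL syntax" shortcut. And since it streams the dump line by line, memory use stays flat regardless of the dump size.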