Replies: 2 comments 3 replies
-
Please see
Interested in your feedback on that. The most important thing you can probably do is work on a way to sanitize and slim down the database for development; that saves far more time on imports than the effort it takes to build the utility. Certainly open to any other techniques you may find.
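A minimal sketch of that slimming approach, assuming a MySQL/MariaDB source and only standard mysqldump flags. The database name and table list are hypothetical placeholders; the point is to dump the full schema but skip row data for tables that are large and not needed in development:

```bash
#!/usr/bin/env bash
# Hedged sketch: build a slimmed development dump by keeping every table's
# schema but skipping row data for bulky tables nobody needs locally.
# DB name and table list are placeholders; adjust for your own site.
set -euo pipefail

DB=dbname
SKIP_DATA=(cache cache_render watchdog sessions)

# 1) Schema for every table, no rows.
mysqldump --single-transaction --no-data "$DB" > slim.sql

# 2) Row data for everything except the bulky tables.
IGNORE=()
for t in "${SKIP_DATA[@]}"; do IGNORE+=("--ignore-table=${DB}.${t}"); done
mysqldump --single-transaction --no-create-info "${IGNORE[@]}" "$DB" >> slim.sql

# 3) Compress for distribution; ddev import-db accepts .sql.gz directly.
gzip -f slim.sql
```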
-
I struggled with this for years. I have one project database that is 7.1G of compressed SQL. The best solution I found was what @rfay mentioned above: I finally wrote a shell script that just processes the raw sql.gz and rewrites it, removing cruft. We can't run this on our production database, since that historical data is needed and we couldn't lock the production database for the time it would take. But running it on the exports that developers import locally has brought load times from 4 hours down to 5 minutes. My script is highly specific to our data, so it's not sharable.
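For anyone wanting a starting point, here is a hedged sketch of that kind of stream post-processing. It assumes a standard mysqldump-format dump where each table's data appears as `INSERT INTO \`table\` VALUES ...` lines; the table list is a hypothetical placeholder, and the real value comes from tailoring it (plus any sanitization rewrites) to your own schema:

```bash
#!/usr/bin/env bash
# Hedged sketch: stream-filter a mysqldump .sql.gz, dropping INSERT rows for
# tables whose historical data developers don't need locally, while keeping
# the CREATE TABLE statements so the tables still exist (empty) after import.
# Table names are hypothetical placeholders.
set -euo pipefail

IN="${1:?usage: slim-export.sh in.sql.gz out.sql.gz}"
OUT="${2:?usage: slim-export.sh in.sql.gz out.sql.gz}"

# Extended-regex alternation of tables whose data should be stripped.
SKIP='cache|cache_[a-z_]+|watchdog|sessions|search_index'

gzip -dc "$IN" \
  | grep -Ev "^INSERT INTO \`(${SKIP})\` VALUES" \
  | gzip > "$OUT"
```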
-
I'm finding myself dealing with large databases at the moment, and a 1.1GB `.sql.gz` file (19GB uncompressed) is taking over 22 minutes to import into DDEV, starting at 15MiB/s and dropping to 6-8MiB/s by the end (presumably as rows are inserted).

I'm dumping the source database with `mysqldump --single-transaction dbname > dbname.sql` and then gzipping. It uses extended inserts by default, so it is sending data in efficiently.

I'd welcome any tips, tricks or ideas to speed this up - or pointers for how to profile it to see where the time is going?
Current setup is:
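For context on the kind of tuning being asked about, here is a hedged sketch of one import-side tweak that often helps large InnoDB loads: relaxing per-row checks for the duration of the import session. It assumes `ddev mysql` will read SQL from stdin (verify against your DDEV version) and reuses the `dbname.sql.gz` name from above; `pv` (pipe viewer) is an optional extra for rough throughput profiling:

```bash
# Hedged sketch: relax per-row checks for the length of the import session,
# which often speeds up bulk InnoDB loads. Assumes `ddev mysql` reads SQL
# from stdin; settings apply only to this session.
{
  echo "SET unique_checks=0;"
  echo "SET foreign_key_checks=0;"
  gzip -dc dbname.sql.gz
} | ddev mysql

# For rough profiling of where the time goes, pv (if installed on the host)
# reports throughput as the stream is consumed:
gzip -dc dbname.sql.gz | pv | ddev mysql
```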