How can I import a large text file (51 GB) into a MySQL database? I will try it on localhost first. How can I then transfer such a large database to the server?
3 answers
The easiest approach, once the database exists on your local host, is to simply copy the entire database folder and move it to the server as an archive or by any other means. But this only works if the database uses MyISAM.
I migrated TecDoc this way: mysqldump ran for about 15 minutes and loading the resulting SQL took more than an hour, whereas copying the raw database files took 2 minutes. All you need to do is stop mysql, copy the database folder to the server via samba or sftp, and then restart mysql on the server.
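A rough sketch of that sequence (the service name, the default datadir /var/lib/mysql, and the database name mydb are assumptions; adjust them for your setup):

sudo service mysql stop                                            # stop MySQL so the table files are consistent
rsync -av /var/lib/mysql/mydb/ user@server:/var/lib/mysql/mydb/    # copy the database folder (or tar it and push via sftp/samba)
ssh user@server 'sudo chown -R mysql:mysql /var/lib/mysql/mydb && sudo service mysql restart'
sudo service mysql start                                           # bring the local server back up

Keep in mind that this file-level copy is only safe for MyISAM tables, as noted above.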
You can try Navicat for the import, but it most likely won't handle a file that size. I once wrote a small C# program for this task. PHP won't cope with it either, although if you dig into the settings you can raise the limits and it might manage.
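If you do go the PHP route, the limits mentioned above can be raised for a single CLI run; a small sketch, with import.php standing in for a hypothetical import script:

php -d memory_limit=-1 -d max_execution_time=0 import.php    # lift the memory and execution-time limits for this invocation only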
You did not give any information about the nature of the data in this file, but I would guess it is some kind of delimited file (CSV). Such a format is loaded most efficiently by SQL itself:
LOAD DATA INFILE
Try it locally; if it succeeds, repeat the same thing on the hosting.
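A minimal sketch of such a load (the file path, table name, and delimiters are assumptions; adjust them to match your data):

LOAD DATA INFILE '/path/to/bigfile.csv'
INTO TABLE mytable
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES;

If the file sits on the client machine rather than on the database server, LOAD DATA LOCAL INFILE reads it over the connection instead.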
To import a SQL dump (for example, when restoring it on the hosting):
mysql -u myuser -p < dump.sql
The MySQL user's password is requested:
Enter password:
Type it in and wait until the import finishes and the command interpreter's prompt appears again.
myuser is your MySQL username
dump.sql is the dump of your database
If the database name needs to be specified:
mysql -u myuser -p MYDB < dump.sql
MYDB is the name of your database
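The dump itself can be created locally and copied over first; a rough sketch using the same placeholder names (myuser, MYDB), with gzip to keep a 51 GB transfer manageable:

mysqldump -u myuser -p MYDB | gzip > dump.sql.gz    # create a compressed dump locally
scp dump.sql.gz user@server:~/                      # copy it to the server
# then, on the server (create MYDB first if it does not exist):
gunzip < dump.sql.gz | mysql -u myuser -p MYDB      # decompress and import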