It's not a common problem, but sometimes you'll be working on a very large website and need to transfer the database up to your host, and your DBMS panel (phpMyAdmin, for example) just doesn't cut it.
It's only happened to me on a couple of occasions. The last time was on a website with more than 90,000 articles, so the database was enormous. Even though it was pure text, it still reached about 40MB in size.
Obviously you can't use phpMyAdmin to import such a huge DB. Even if your host allowed HTTP uploads that large, you would hit timeouts or memory limits. phpMyAdmin can easily export DBs of this size, but not import them. The next obvious choice is a direct connection with HeidiSQL, Navicat or a similar tool, but many hosts don't allow that either. Argh, what to do?
That's where BigDump comes in handy. BigDump is a single-file "Staggered MySQL Dump Importer". Put simply, it takes a single large phpMyAdmin DB dump and imports it into the target DB bit by bit. Obviously there's a lot more to it than that, but we don't need to know the complexities behind the script to use it.
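To give a rough idea of what a staggered importer does (this is just a minimal sketch of the concept, not BigDump's actual code, and the dump filename and DB credentials below are made up), each request processes only a slice of the dump file, remembers where it stopped, and then reloads itself until the whole file has been run:

<?php
// Minimal sketch of a staggered import: each request imports one slice
// of the dump, then reloads with the new file offset. The filename and
// credentials are placeholders, not real values.
$dumpFile    = 'dump.sql';   // hypothetical dump filename
$linesPerRun = 3000;         // how many lines to process per request
$offset      = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;

$db = new mysqli('localhost', 'db_user', 'db_pass', 'db_name'); // placeholder credentials
if ($db->connect_error) {
    die('Connection failed: ' . $db->connect_error);
}

$handle = fopen($dumpFile, 'r');
fseek($handle, $offset);     // resume where the previous request stopped

$query = '';
for ($i = 0; $i < $linesPerRun && !feof($handle); $i++) {
    $line = fgets($handle);
    if ($line === false || trim($line) === '' || strpos($line, '--') === 0) {
        continue;            // skip blank lines and SQL comments
    }
    $query .= $line;
    if (substr(rtrim($line), -1) === ';') {   // statement is complete, run it
        $db->query($query);
        $query = '';
    }
}

$newOffset = ftell($handle);
$finished  = feof($handle);
fclose($handle);

if ($finished) {
    echo 'Import complete.';
} else {
    // Reload with the new offset so the next slice runs in a fresh
    // request, dodging PHP's execution time and memory limits.
    header('Location: ?offset=' . $newOffset);
}

Because each slice runs in its own request, no single request ever hits the server's time or memory limits. BigDump adds a lot of polish and error handling on top of this (gzipped dumps, progress output and so on), but that's the gist of it.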
Process
- Use phpMyAdmin to export your database (either raw SQL or compressed sql.gz). Be sure to use the Drop Tables option if your target DB isn't empty.
- Upload your database dump to the remote server.
- Download the BigDump script from here.
- Upload the script to the remote server. Be sure to put it in the same folder as your database dump.
- Open the script and update the DB connection settings so it can log into your target DB, i.e. change the following:
$db_server = '';
$db_name = '';
$db_username = '';
$db_password = '';
- Change the filename setting to the name of your uploaded DB dump file (see the filled-in example after this list).
- Run the script by opening it in your browser.
- Follow the on-screen instructions. For a large DB it will take a few minutes.
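For reference, once edited, the settings at the top of the BigDump script end up looking something like this (the values here are placeholders, substitute your own host, database, credentials and dump filename; on most shared hosts the server is simply localhost):

$db_server   = 'localhost';
$db_name     = 'mysite_db';
$db_username = 'mysite_user';
$db_password = 'secret';
$filename    = 'mysite_dump.sql.gz';  // the dump you uploaded alongside the script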
Your database should now be on the remote server. Let me know if you would like screenshots.