How to Create a MySQL Database, Set User Permissions and Import SQLdump File

I’m sure you’ve been in that frustrating situation where you need to set up a database for your website, but can’t quite remember all those MySQL commands. Maybe you’re migrating a WordPress site to a new server, or perhaps you’re setting up a fresh environment and need to import that massive SQLdump file. If you’ve searched around the interwebs for help, you’ll know there are plenty of overly complex guides that make this seem like rocket science.


Most tutorials either assume you’re a complete beginner (and waste your time with unnecessary explanations) or they’re so advanced that you need a computer science degree to follow along. I just got fed up with this situation when I needed to quickly create a database, set up a user with proper permissions, and import a 2GB SQLdump file without the whole server timing out on me.

So, in this guide, I’ll show you exactly how to create a MySQL database, set up users with passwords, restart MySQL when needed (because sometimes it just stops cooperating), and import those pesky SQLdump files – even the massive ones that make most import tools crash. No fluff, no unnecessary details, just the commands you need to get the job done.

To keep things practical, we’ll use a typical scenario: setting up a database for a WordPress site, creating a dedicated user with appropriate permissions, and importing a backup file. By the end of this guide, you’ll be able to handle these MySQL tasks quickly and efficiently. By the way, the commands are exactly the same for MySQL and MariaDB, whether you’re importing a SQLdump file, creating a database, or managing users.

Accessing the MySQL Command Line

Before creating a database, you’ll need to access the MySQL command line. Open your terminal and log in using the following command:

mysql -u root -p

You’ll be prompted to enter your MySQL root password. If you’ve just installed MySQL and haven’t set a root password yet, you might be able to log in without one.
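
On a fresh Ubuntu or Debian install, the root account is often configured with socket authentication rather than a password, so the command above may be rejected. In that case, logging in through sudo usually does the trick:

sudo mysql -u root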

Creating a New MySQL Database

Once logged in, create a database with:

CREATE DATABASE mydatabase;

Replace mydatabase with your preferred name.

Verify creation with:

SHOW DATABASES;
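
For the WordPress scenario from the intro, it’s also worth pinning the character set when you create the database. This is just an illustrative example with a hypothetical database name; adjust it to your own naming:

CREATE DATABASE wordpress_db CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;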

Setting Up Users and Permissions

Create a dedicated user:

CREATE USER 'newuser'@'localhost' IDENTIFIED BY 'password';

Grant permissions:

GRANT ALL PRIVILEGES ON mydatabase.* TO 'newuser'@'localhost';

Apply changes:

FLUSH PRIVILEGES;
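
Sticking with the WordPress example, here’s roughly what the full sequence looks like with a dedicated user scoped to a single database. The names wp_user and wordpress_db are placeholders; pick your own, and use a strong password:

CREATE USER 'wp_user'@'localhost' IDENTIFIED BY 'use-a-strong-password';
GRANT SELECT, INSERT, UPDATE, DELETE, CREATE, DROP, ALTER, INDEX ON wordpress_db.* TO 'wp_user'@'localhost';
FLUSH PRIVILEGES;

Granting only what the application needs on its own database is generally safer than GRANT ALL ON *.*, especially on a shared server.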

Modifying User Passwords

Change a user’s password:

ALTER USER 'username'@'localhost' IDENTIFIED BY 'newpassword';
FLUSH PRIVILEGES;
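
Note that ALTER USER is only available on reasonably recent versions (MySQL 5.7.6+ and MariaDB 10.2+). On older installations the same thing is done with the legacy SET PASSWORD syntax, roughly:

SET PASSWORD FOR 'username'@'localhost' = PASSWORD('newpassword');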

Reloading MySQL Service

Restart MySQL after configuration changes:

Ubuntu/Debian:

sudo systemctl restart mysql

CentOS/RHEL:

sudo systemctl restart mysqld

macOS:

sudo launchctl unload -F /Library/LaunchDaemons/com.oracle.oss.mysql.mysqld.plist
sudo launchctl load -F /Library/LaunchDaemons/com.oracle.oss.mysql.mysqld.plist

Importing SQLdump Files

Let’s get to the most critical part – importing those SQLdump files. This is often where things get tricky, especially with larger dumps. Here’s how to handle it properly:

From MySQL Command Line

For smaller files, you can import directly through the MySQL console:

USE mydatabase;
SOURCE /path/to/your/dumpfile.sql;

From Terminal (Better for Large Files)

For larger SQLdump files, this method is much more efficient:

mysql -u username -p mydatabase < /path/to/your/dumpfile.sql

This streams the SQLdump straight into the mysql client instead of running it through the interactive console, which is noticeably faster for big files and avoids those annoying timeout issues.
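
The mysql client doesn’t print progress, which can be unnerving on long imports. If you have the pv utility installed (a separate tool, not part of MySQL), you can pipe the dump through it to get a progress bar:

pv /path/to/your/dumpfile.sql | mysql -u username -p mydatabase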

For Very Large SQLdump Files

If you’re dealing with massive SQLdump files (like those 2GB+ monsters), you’ll need to adjust MySQL’s timeout settings before importing:

SET GLOBAL net_read_timeout=3600;
SET GLOBAL max_allowed_packet=1073741824;

These settings give MySQL enough time and memory to process large files without timing out. The max_allowed_packet setting is particularly important – it determines the maximum size of a packet or a generated/intermediate string, and the default is usually too small for large SQLdump imports.
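
Both statements are run from the MySQL console by a privileged user, and SET GLOBAL changes only last until the server restarts; put the values in my.cnf if you want them to stick. You can check what your server is currently using with:

SHOW VARIABLES LIKE 'net_read_timeout';
SHOW VARIABLES LIKE 'max_allowed_packet';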

Importing Compressed SQLdump Files

If your dump is gzip-compressed (or you’d rather keep it compressed to save disk space), you can pipe it straight into MySQL:

gunzip < dumpfile.sql.gz | mysql -u username -p mydatabase

This decompresses the file on the fly while importing, so you never have to store the uncompressed dump, and if you copied the backup over the network the smaller file saves transfer time too.
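
If you’re the one taking the backup, you can produce the compressed dump the same way; a quick sketch with mysqldump piped through gzip:

mysqldump -u username -p mydatabase | gzip > dumpfile.sql.gz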

Troubleshooting SQLdump Imports

Let’s face it – SQLdump imports often fail the first time around. Here’s how to solve the most common issues:

Import Errors

If your SQLdump import fails with syntax errors, it’s likely due to version incompatibility. Check if your SQLdump was created on a newer MySQL version than the one you’re importing to. You can verify the MySQL version with:

SELECT VERSION();
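
One concrete example: dumps taken from MySQL 8.0 often fail on 5.7 or MariaDB because of the utf8mb4_0900_ai_ci collation. A common workaround (work on a copy of the dump, and treat this as a rough fix rather than a guaranteed one; GNU sed syntax shown) is to rewrite the collation before importing:

sed -i 's/utf8mb4_0900_ai_ci/utf8mb4_unicode_ci/g' dumpfile.sql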

Permission Denied

There’s nothing more frustrating than permission errors during a SQLdump import. Verify your user has all the necessary privileges:

SHOW GRANTS FOR 'username'@'localhost';

Make sure you see ALL PRIVILEGES or at least SELECT, INSERT, UPDATE, DELETE, CREATE, DROP permissions.
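
If something is missing, grant what’s needed explicitly and scope it to the one database, for example:

GRANT SELECT, INSERT, UPDATE, DELETE, CREATE, DROP ON mydatabase.* TO 'username'@'localhost';
FLUSH PRIVILEGES;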

Import Taking Forever

For those massive SQLdump files that seem to import endlessly, you have a few options:

  1. Split the SQLdump file into smaller chunks using the split command
  2. Import only the structure first, then the data (see the sketch after this list)
  3. Disable foreign key checks temporarily during import:
SET FOREIGN_KEY_CHECKS=0;
-- Import your SQLdump file here
SET FOREIGN_KEY_CHECKS=1;
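
For option 2, one approach is to take two separate dumps from the source server, one for the schema and one for the data, then import them in that order. This assumes you still have access to the source database; the file names below are just placeholders:

mysqldump -u username -p --no-data mydatabase > structure.sql
mysqldump -u username -p --no-create-info mydatabase > data.sql
mysql -u username -p mydatabase < structure.sql
mysql -u username -p mydatabase < data.sql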

Connection Issues

If MySQL keeps disconnecting during import, check if the service is running properly:

sudo systemctl status mysql

You might also need to increase MySQL’s max_connections setting in my.cnf.
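
As a rough example, the setting goes under the [mysqld] section of my.cnf (commonly /etc/mysql/my.cnf or /etc/my.cnf, depending on your distribution) and needs a restart to take effect; the value here is only a starting point to tune for your own workload:

[mysqld]
max_connections = 250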

Conclusion

You’ve now got all the essential commands for creating MySQL databases and importing SQLdump files – even those massive ones that make most systems choke. The key takeaways here are: create your database first, set up a user with proper permissions, and use the right import technique based on your SQLdump file size.

Remember that importing large SQLdump files isn’t just about running a command – it’s about preparing your MySQL environment with the right settings to handle the load. The max_allowed_packet and net_read_timeout settings are absolute lifesavers when dealing with hefty SQLdump files.

Always back up your databases before attempting any major changes or imports. Nothing worse than a half-imported SQLdump file corrupting your existing data. With these commands and techniques in your toolkit, you can confidently handle the core MySQL operations and SQLdump imports required for most website management tasks.

Leave your solution or comment to help others.
