Overview
Regular backups of MySQL databases are crucial for disaster recovery and data protection. Automating backups and storing them in Amazon S3 provides durable, secure storage and easy recovery. This guide provides a step-by-step approach to setting up automated MySQL backups on an Amazon EC2 instance and uploading them to Amazon S3.
Introduction
Data loss can have severe consequences for businesses, and manually managing backups is time-consuming and prone to human error. Automating MySQL backups ensures that data is consistently backed up and securely stored in Amazon S3. This guide explains how to create an Amazon S3 bucket, configure permissions, write a backup script, and automate the process with cron jobs.
Step-by-Step Guide
Step 1: Create an Amazon S3 Bucket
Amazon S3 is a reliable and scalable cloud storage service that is ideal for storing database backups. To create an Amazon S3 bucket from the console (a CLI alternative follows the list), take the following actions:
- Log in to your AWS Management Console.
- Navigate to the Amazon S3 Service.
- Click Create Bucket.
- Provide a unique bucket name (e.g., mysql-backups-bucket).
- Choose a region and configure permissions as required.
- Click Create Bucket to finalize the process.
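Equivalently, the bucket can be created from the AWS CLI; the bucket name and region below are just the examples used in this guide:
aws s3 mb s3://mysql-backups-bucket --region us-east-1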
Step 2: Create an AWS IAM Role with Amazon S3 permissions
To allow your Amazon EC2 instance to upload backups to Amazon S3, create an AWS IAM role with the necessary permissions.
- Navigate to the AWS IAM Service in AWS.
- Click on Roles and then select Create Role.
- Choose Amazon EC2 as the trusted entity.
- Attach the AmazonS3FullAccess policy, or create a custom policy to restrict access to only the necessary bucket (a sketch follows this list).
- Name the role (e.g., MySQLBackupRole) and create it.
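If you opt for the custom policy, a minimal sketch might look like the following; the bucket name matches the example from Step 1, and you may need to add actions such as s3:GetObject for restores:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::mysql-backups-bucket",
        "arn:aws:s3:::mysql-backups-bucket/*"
      ]
    }
  ]
}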
Step 3: Connect the Amazon EC2 instance to the AWS IAM role
Once the role has been created, attach it to your Amazon EC2 instance (console steps below; a CLI alternative follows the list):
- Navigate to Amazon EC2 Service in AWS.
- Select the instance where the script will run.
- Click Actions > Security > Modify IAM role.
- Attach the MySQLBackupRole to the instance.
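Alternatively, the attachment can be done from the AWS CLI; the instance ID below is a placeholder, and this assumes an instance profile named after the role (the console creates one automatically):
aws ec2 associate-iam-instance-profile \
  --instance-id i-0123456789abcdef0 \
  --iam-instance-profile Name=MySQLBackupRole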
Step 4: Configure AWS CLI
Ensure your Amazon EC2 instance has the AWS CLI installed and configured to interact with AWS services.
Run the following command:
aws configure
You will be prompted to enter:
- AWS Access Key ID
- AWS Secret Access Key
- Default region name
- Default output format
If you attached the AWS IAM role in Step 3, the instance receives temporary credentials automatically, so you can leave the key fields blank and set only the default region. Otherwise, supply credentials that have the required Amazon S3 access permissions.
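To confirm the instance can reach AWS with the expected identity and can see the bucket, two quick checks (bucket name as in Step 1):
aws sts get-caller-identity
aws s3 ls s3://mysql-backups-bucket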
Step 5: Create the Backup Script
Now, create a shell script to automate MySQL database backup and upload it to Amazon S3.
1. Connect to your Amazon EC2 instance via SSH.
2. Create a new script file:
nano /home/ubuntu/mysql_backup.sh
3. Add the following script:
#!/bin/bash

# Variables
DB_NAME="your_database_name"
DB_USER="your_db_user"
DB_PASSWORD="your_db_password"
BACKUP_PATH="/home/ubuntu/backups"
TIMESTAMP=$(date +"%Y%m%d_%H%M%S")
BACKUP_FILE="$BACKUP_PATH/${DB_NAME}_${TIMESTAMP}.sql"
S3_BUCKET="s3://mysql-backups-bucket"

# Create backup directory if it doesn't exist
mkdir -p "$BACKUP_PATH"

# Dump MySQL database
mysqldump -u "$DB_USER" -p"$DB_PASSWORD" "$DB_NAME" > "$BACKUP_FILE"

# Upload backup to S3
aws s3 cp "$BACKUP_FILE" "$S3_BUCKET/"

# Remove local backup file after upload
rm -f "$BACKUP_FILE"

# Logging
echo "Backup completed for $DB_NAME at $(date)" >> /var/log/mysql_backup.log
4. To save and close the file, use CTRL + X, followed by Y, and then Enter.
5. Make the script executable:
chmod +x /home/ubuntu/mysql_backup.sh
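Optionally, you can avoid passing the password on the mysqldump command line, where it is visible to other users in the process list. A minimal sketch, assuming the same user and password as in the script: store them in a client options file on the instance.
# /home/ubuntu/.my.cnf -- restrict it with: chmod 600 /home/ubuntu/.my.cnf
[mysqldump]
user=your_db_user
password=your_db_password
With this file in place, the dump line in the script can shrink to mysqldump "$DB_NAME" > "$BACKUP_FILE".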
Step 6: Schedule the Script with Cron
To automate the backup process, schedule the script to run weekly using cron.
1. Open the crontab editor:
crontab -e
2. Add the following line at the end of the file to run the script every Sunday at midnight:
0 0 * * 0 /home/ubuntu/mysql_backup.sh
3. Save and exit.
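Cron runs with a minimal environment, so silent failures are common. A variant of the same entry that captures the script's output for troubleshooting; the log path is an assumption, chosen so the ubuntu user can write to it:
0 0 * * 0 /home/ubuntu/mysql_backup.sh >> /home/ubuntu/mysql_backup_cron.log 2>&1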
Step 7: Test the Backup Script
Before relying on automation, manually test the script to ensure it works correctly:
/home/ubuntu/mysql_backup.sh
Verify the following:
- The backup file is created in the specified directory.
- The backup file is successfully uploaded to Amazon S3.
- The local backup file is deleted after uploading.
- Log entries are recorded in /var/log/mysql_backup.log.
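The following commands cover the Amazon S3 and log checks, using the bucket and paths from this guide; to inspect the local file before it is deleted, temporarily comment out the rm -f line in the script:
# Confirm the object landed in the bucket
aws s3 ls s3://mysql-backups-bucket/
# Confirm the log recorded the run
tail -n 5 /var/log/mysql_backup.log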
Step 8: Monitor Backups
Regular monitoring ensures backups are running as expected.
1. Check the backup directory to confirm backups are created before deletion.
2. Verify the Amazon S3 bucket to confirm backups are successfully uploaded.
3. Review logs using:
tail -f /var/log/mysql_backup.log
4. Set up Amazon CloudWatch monitoring for failed uploads or missing backups.
5. Configure Amazon SNS alerts to receive notifications if backups fail (a sketch follows this list).
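A minimal sketch of the Amazon SNS alert; the topic name, account ID, and email address are placeholders, and the if block would replace the plain upload line in mysql_backup.sh:
# One-time setup: create a topic and subscribe an email address
aws sns create-topic --name mysql-backup-alerts
aws sns subscribe --topic-arn arn:aws:sns:us-east-1:123456789012:mysql-backup-alerts \
  --protocol email --notification-endpoint you@example.com
# In mysql_backup.sh: publish an alert if the upload fails
if ! aws s3 cp "$BACKUP_FILE" "$S3_BUCKET/"; then
  aws sns publish \
    --topic-arn arn:aws:sns:us-east-1:123456789012:mysql-backup-alerts \
    --message "MySQL backup upload failed for $DB_NAME at $(date)"
fi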
Step 9: Security Best Practices
Protect your database and backup files by following security best practices:
1. Use AWS IAM policies to restrict Amazon S3 access to only the backup bucket.
2. Encrypt backups using AWS KMS before uploading to Amazon S3.
3. Use MFA Delete on Amazon S3 buckets to prevent accidental deletions.
4. Limit access to the backup script by setting proper file permissions:
chmod 700 /home/ubuntu/mysql_backup.sh
5. Rotate credentials regularly and avoid hardcoding them in scripts; use AWS Secrets Manager instead (sketches of points 2 and 5 follow this list).
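Two hedged sketches for points 2 and 5: one way to apply AWS KMS encryption is server-side at upload time, and the database password can be fetched from AWS Secrets Manager at run time. The key alias and secret name below are placeholders:
# Upload with server-side KMS encryption (key alias is a placeholder)
aws s3 cp "$BACKUP_FILE" "$S3_BUCKET/" --sse aws:kms --sse-kms-key-id alias/mysql-backup-key
# Fetch the password from Secrets Manager (assumes the secret stores a plain string)
DB_PASSWORD=$(aws secretsmanager get-secret-value \
  --secret-id mysql-backup-password --query SecretString --output text)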
Conclusion
By following these steps, you have automated MySQL backups and stored them securely in Amazon S3. This approach ensures:
- Automated and scheduled backups with cron.
- Secure and scalable storage in Amazon S3.
- Monitoring and logging to detect any failures.
- Easy recovery in case of data loss or corruption.
Automating your MySQL backups with AWS reduces manual work, enhances security, and ensures data availability in case of unexpected failures.
Drop a query if you have any questions regarding MySQL backups, and we will get back to you quickly.
FAQs
1. How do I verify if my backup script is working?
ANS: – You can manually run the script using:
/home/ubuntu/mysql_backup.sh
Then confirm that the backup appears in the Amazon S3 bucket and that a new entry is written to /var/log/mysql_backup.log.
2. How can I encrypt my MySQL backups before uploading to Amazon S3?
ANS: – You can use gpg to encrypt the backup file:
gpg --symmetric --cipher-algo AES256 -o $BACKUP_FILE.gpg $BACKUP_FILE
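To restore it later, decrypt with the same passphrase before importing:
gpg --decrypt -o restored_backup.sql $BACKUP_FILE.gpg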

WRITTEN BY Shaikh Mohammed Fariyaj Najam
Mohammed Fariyaj Shaikh works as a Research Associate at CloudThat. He has strong analytical thinking and problem-solving skills, knowledge of AWS Cloud Services, migration, infrastructure setup, and security, as well as the ability to adopt new technology and learn quickly.