How to create a backup file of your database and upload it to an AWS S3 bucket using Laravel

Data is an essential ingredient for any business, no matter how large or small. Losing data can be very costly, so it is important to take backups and keep the data secure. With that in mind, let us walk through how to create a database backup file and securely upload it to a cloud platform like AWS using Laravel.

Artisan command for creating the DB backup and Uploading it to AWS

We will write an artisan command that handles the generation of the backup file and then uploads it to AWS, scheduled via cron. It uses the mysqldump tool that ships with a MySQL installation; this tool dumps the database into a .sql file. So let us start by creating the artisan command. In the console, run:

php artisan make:command DBFileUploadAws

This will create a class with the name given in the command (here DBFileUploadAws.php) in App\Console\Commands.

    namespace App\Console\Commands;

    use Illuminate\Console\Command;
    use Symfony\Component\Process\Exception\ProcessFailedException;
    use Symfony\Component\Process\Process;

    class DBFileUploadAws extends Command
    {
        /**
         * The name and signature of the console command.
         *
         * @var string
         */
        protected $signature = 'file:DBFileUploadAws';

        /**
         * The console command description.
         *
         * @var string
         */
        protected $description = 'Copy DB backup file to Amazon S3 daily';

        /**
         * Create a new command instance.
         *
         * @return void
         */
        public function __construct()
        {
            parent::__construct();
        }

        /**
         * Execute the console command.
         *
         * @return mixed
         */
        public function handle()
        {
            //
        }
    }



Now we add all the code that we want to execute when the cron job runs inside the handle function. To generate a database backup file and upload it to AWS, add code to the handle function (in the cron job class, here DBFileUploadAws.php) following the steps given below.

Create a Database Backup File

  1. To dump the database and save it to an SQL file, use the following command:
    mysqldump -v -u[user] -h[host] -p[password] [dbName] > [file_path]
    For example,
    $mysql_dump_command = "mysqldump -v -u{$dbUser} -h{$dbHost} -p{$dbPassword} {$dbName} > $dbBackupFile";
    where the variables $dbUser, $dbHost, $dbPassword and $dbName are config values (read from the env file via the config files) and $dbBackupFile is the path of the (uncompressed) file to be created.

  2. Now, to compress the file, we have the command gzip -9 -f [filename]. Therefore,
    $gzip_command = "gzip -9 -f $dbBackupFile";
  3. Next, to run these commands from PHP, include the Process classes and add the following code to the handle function:
    //include these at the top
    use Symfony\Component\Process\Process;
    use Symfony\Component\Process\Exception\ProcessFailedException;

    //add this code in the handle function
    $mysql_dump_process = Process::fromShellCommandline($mysql_dump_command);
    $mysql_dump_process->mustRun();
    $gzip_process = Process::fromShellCommandline($gzip_command);
    $gzip_process->mustRun();
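The shell side of steps 1 and 2 can be tried in isolation. The sketch below skips mysqldump (which needs a live database) and uses a stand-in .sql file just to show what the gzip step produces; the /tmp paths are only for illustration:

```shell
# Stand-in for the dump file (in real use, mysqldump writes it).
echo "-- sample dump" > /tmp/dbbackup.sql

# Compress at the maximum level; -f overwrites any existing archive.
gzip -9 -f /tmp/dbbackup.sql

# gzip removes the original and leaves only /tmp/dbbackup.sql.gz
ls /tmp/dbbackup.sql.gz
```

Note that gzip deletes the uncompressed file on success, which is why the code later in this article only needs to upload the .gz file.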

Upload The File to AWS S3

Amazon s3 setup

  1. To create a new AWS account, use the link: https://portal.aws.amazon.com/billing/signup or, if you already have an account, sign in to the AWS Management Console.

  2. After successfully logging in, create a bucket using steps as below:
    • In the navigation bar, select Services → S3 (or go directly to the link: https://s3.console.aws.amazon.com/s3/home).
    • Click on ‘Create Bucket’, enter the bucket name and region, make the required settings and create the bucket.
    • Now, go to the bucket; to generate a bucket policy, go to Permissions → Bucket Policy and open the policy editor.
    • Click on ‘Policy Generator’ and add the details. For uploading files to the bucket, select the policy type ‘S3 Bucket Policy’ and the service ‘Amazon S3’, then click on ‘Add Statement’.
    • Click on ‘Generate Policy’; on successful policy generation, a pop-up will display the policy details.
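For reference, a policy generated for this use case typically looks like the JSON below. The bucket name, Sid and the IAM user ARN here are placeholders; use the values produced by your own ‘Add Statement’ step:

```json
{
  "Id": "Policy1",
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowBackupUpload",
      "Effect": "Allow",
      "Action": ["s3:PutObject"],
      "Resource": "arn:aws:s3:::example-backup-bucket/*",
      "Principal": {
        "AWS": ["arn:aws:iam::123456789012:user/backup-user"]
      }
    }
  ]
}
```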

Steps for copying the file to s3 using Laravel

  1. Install the S3 package via composer, using the command: composer require league/flysystem-aws-s3-v3
  2. Now add the access key and secret key to the .env file of the project. Their values can be found under ‘My Security Credentials’ in the navigation bar of the AWS Management Console.
    AWS_ACCESS_KEY_ID=xxxx
    AWS_SECRET_ACCESS_KEY=xxxx
    AWS_DEFAULT_REGION=region
    AWS_BUCKET=bucket name
  3. To upload the file from local storage to the Amazon S3 bucket, add the following code to the handle function inside the cron job:
    $copyFile = Storage::disk('s3')->put($s3Path.'/'.$fileName, file_get_contents(public_path($localFile)));

    where $s3Path is the folder or path name in the bucket, $fileName is the name to save the file as in the cloud, and $localFile is the local storage file.
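Storage::disk('s3') reads its credentials from config/filesystems.php, which in a default Laravel install already maps the s3 disk to the .env keys above. The keys local_db_backup_folder and s3_backup_folder used by the command later in this article are not part of the default config, so they have to be added; a sketch (the folder names are assumptions, pick your own):

```php
// config/filesystems.php (excerpt)
return [
    // custom keys used by the backup command (assumed names)
    'local_db_backup_folder' => 'db_backups',
    's3_backup_folder' => 'db_backups',

    'disks' => [
        // Laravel's default s3 disk, driven by the .env values above
        's3' => [
            'driver' => 's3',
            'key' => env('AWS_ACCESS_KEY_ID'),
            'secret' => env('AWS_SECRET_ACCESS_KEY'),
            'region' => env('AWS_DEFAULT_REGION'),
            'bucket' => env('AWS_BUCKET'),
        ],
    ],
];
```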


So, what does the code look like?

The code in the cron job class, DBFileUploadAws.php, will look like the following. We have the command name, the command description and the handle function containing the code to be executed by the cron job.

    namespace App\Console\Commands;

    use Illuminate\Console\Command;
    use Illuminate\Support\Facades\Storage;
    use Symfony\Component\Process\Exception\ProcessFailedException;
    use Symfony\Component\Process\Process;

    class DBFileUploadAws extends Command
    {
        /**
         * The name and signature of the console command.
         *
         * @var string
         */
        protected $signature = 'file:dailyBackupOnS3';

        /**
         * The console command description.
         *
         * @var string
         */
        protected $description = 'Copy DB backup file to Amazon S3 daily';

        /**
         * Create a new command instance.
         *
         * @return void
         */
        public function __construct()
        {
            parent::__construct();
        }

        /**
         * Execute the console command.
         *
         * @return mixed
         */
        public function handle()
        {
            $dbUser = config('database.connections.mysql.username');
            $dbPassword = config('database.connections.mysql.password');
            $dbHost = config('database.connections.mysql.host');
            $dbName = config('database.connections.mysql.database');

            //local folder for the backup file
            $dbBackupFolder = storage_path('app/'.config('filesystems.local_db_backup_folder'));

            $dbBackupFile = $dbBackupFolder.'/dbbackup.sql';              //uncompressed file
            $dbBackupFileCompressed = $dbBackupFolder.'/dbbackup.sql.gz'; //compressed file

            $s3Folder = config('filesystems.s3_backup_folder');  //folder or path for the s3 file

            //system commands to back up the database and compress the dump
            $mysql_command = "mysqldump -v -u{$dbUser} -h{$dbHost} -p{$dbPassword} {$dbName} > $dbBackupFile";
            $gzip_command = "gzip -9 -f $dbBackupFile";

            try {
                $process_mysql = Process::fromShellCommandline($mysql_command);
                $process_mysql->mustRun();

                $process_gzip = Process::fromShellCommandline($gzip_command);
                $process_gzip->mustRun();
            } catch (ProcessFailedException $exception) {
                return $this->error($exception->getMessage());
            }

            //check if the compressed file exists locally, then upload it to s3
            if (file_exists($dbBackupFileCompressed)) {
                $copyTos3 = Storage::disk('s3')->put($s3Folder.'/dbbackup.sql.gz', file_get_contents($dbBackupFileCompressed));
            }
        }
    }
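To actually run this command daily, register it in Laravel's scheduler. A minimal sketch (the 02:00 run time is an arbitrary choice):

```php
// app/Console/Kernel.php
protected function schedule(Schedule $schedule)
{
    // Run the backup-and-upload command once a day at a quiet hour.
    $schedule->command('file:dailyBackupOnS3')->dailyAt('02:00');
}
```

The scheduler itself is driven by a single server cron entry, as per the Laravel documentation: `* * * * * cd /path-to-your-project && php artisan schedule:run >> /dev/null 2>&1`.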

Thus, Laravel makes it simple to create database backup files and wrap the process in an artisan command, and its flysystem-aws-s3-v3 package makes uploading files to S3 effortless. Backing up the most important asset of a business, its data, and saving it securely to a cloud like AWS means business owners can sit back and relax.

About the Author

Madhavi Gajjar

Gladly, working as a PHP/Laravel Developer at Regur Technology Solutions. Strives to explore, learn and grow technically and analytically. Loves crafting in free time.