Importing large CSV files in MySQL using Laravel


In this post we will give you information about importing large CSV files into MySQL using Laravel, along with a demo you can follow.

This article will show you how to import a large number of records from a CSV file into a Laravel application using a database seeder.

When we have a CSV file with thousands or millions of records to store in our database, we frequently use cron or queue jobs. If we only want to import the CSV file once, we can use a database seeder instead.

Because the file is large, importing it through a normal request often hits PHP's maximum execution time. Below is example code for inserting a large CSV file into the database from a Laravel application in a matter of minutes or seconds.
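If a CLI import still runs out of memory or time, you can relax PHP's limits at the top of the seeder's `run()` method. This is a minimal sketch; the `memory_limit` value is an assumption you should tune to your environment:

```php
<?php

// Optional: relax PHP limits for a long-running CLI import.
// '512M' is illustrative; adjust it to your server's resources.
ini_set('memory_limit', '512M');

// 0 means "no execution time limit" (the CLI SAPI usually
// has no limit anyway, but this makes it explicit).
set_time_limit(0);
```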


Step 1: Install Laravel App

First, install a fresh Laravel application. Head over to the terminal and run the following command to create a new Laravel app:

composer create-project --prefer-dist laravel/laravel large-csv-laravel

or, if you have installed the Laravel Installer as a global composer dependency:

laravel new large-csv-laravel

Step 2: Configure Database Detail

Open the .env file and enter your database credentials: the database name, username, and password:

DB_CONNECTION=mysql
DB_HOST=127.0.0.1
DB_PORT=3306
DB_DATABASE=<DATABASE NAME>
DB_USERNAME=<DATABASE USERNAME>
DB_PASSWORD=<DATABASE PASSWORD>

Step 3: Create Migration and Model

The next step is to use the terminal to run the following command to create the model and migration file:

php artisan make:model Product -m

Add the following code to the generated migration file:

public function up()
{
    Schema::create('products', function (Blueprint $table) {
        $table->bigIncrements('id');
        $table->string('name');
        $table->string('description');
        $table->timestamps();
    });
}

Open the command prompt and type the following command to create the table in the database:

php artisan migrate

Add the following code into app/Models/Product.php:

<?php

namespace App\Models;

use Illuminate\Database\Eloquent\Factories\HasFactory;
use Illuminate\Database\Eloquent\Model;

class Product extends Model
{
    use HasFactory;

    protected $fillable = ['name', 'description'];
}

Step 4: Create a Database Seeder

To create a seeder file for quickly uploading a massive CSV file into the database, run the following command first:

php artisan make:seeder ProductSeeder

Next, update the seeder so that long CSV files can be imported with ease by your Laravel app. Add the following code to your database/seeders/ProductSeeder.php file:

<?php

namespace Database\Seeders;

use Illuminate\Database\Seeder;
use Illuminate\Support\Facades\DB;
use Illuminate\Support\LazyCollection;

class ProductSeeder extends Seeder
{
    /**
     * Run the database seeds.
     *
     * @return void
     */
    public function run()
    {
        LazyCollection::make(function () {
            $handle = fopen(public_path('products.csv'), 'r');

            // Read the semicolon-delimited file one row at a time,
            // so the whole file is never loaded into memory.
            while (($row = fgetcsv($handle, 4096, ';')) !== false) {
                yield $row;
            }

            fclose($handle);
        })
        ->skip(1)      // skip the header row
        ->chunk(1000)  // insert 1,000 rows per query
        ->each(function (LazyCollection $chunk) {
            $records = $chunk->map(function ($row) {
                return [
                    'name'        => $row[0],
                    'description' => $row[1],
                ];
            })->toArray();

            DB::table('products')->insert($records);
        });
    }
}
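Note that DB::table()->insert() goes through the query builder rather than Eloquent, so created_at and updated_at are left null. If you want timestamps on the imported rows, you can fill them in the map step. A small sketch, assuming the same $chunk and row layout as the seeder above:

```php
<?php

use Illuminate\Support\Carbon;

$now = Carbon::now();

$records = $chunk->map(function ($row) use ($now) {
    return [
        'name'        => $row[0],
        'description' => $row[1],
        'created_at'  => $now, // fill timestamps manually, since
        'updated_at'  => $now, // the query builder does not set them
    ];
})->toArray();
```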

Step 5: Run Database Seeder

Run the database seeder with the following command:

php artisan db:seed --class=ProductSeeder

Importing large CSV files in MySQL using Laravel and maatwebsite/excel (Laravel Excel)

If you are using Laravel Excel 2.x you may use its chunking feature. The following code imports the file into the database in chunks of 250 rows via chunk reading:

Excel::filter('chunk')->load(database_path('seeds/csv/products.csv'))->chunk(250, function ($results) {
    foreach ($results as $row) {
        Product::create([
            'name' => $row->name,
            'description' => $row->description,
            // other fields
        ]);
    }
});

If you are using Laravel Excel 3.x, you can instead define an import class and use the following concerns to import data to the database in chunks:

<?php

namespace App\Imports;

use App\Models\Product;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithChunkReading;

class ProductImport implements ToModel, WithChunkReading
{
    public function model(array $row)
    {
        return new Product([
            'name' => $row[0],
            'description' => $row[1],
        ]);
    }

    public function chunkSize(): int
    {
        return 1000;
    }
}
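To actually run the import, pass an instance of the class to the Excel facade, for example from a seeder, a controller, or a tinker session. The file path here is an assumption; point it wherever your CSV lives:

```php
<?php

use App\Imports\ProductImport;
use Maatwebsite\Excel\Facades\Excel;

// Reads public/products.csv in chunks of 1000 rows and
// saves each row as a Product model via the import class.
Excel::import(new ProductImport, public_path('products.csv'));
```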

Thank you for reading this article.

Also see: Laravel 10 Import Export Excel & CSV File Example


We hope this code and post helped you implement importing large CSV files in MySQL using Laravel. If you need any help, or have feedback or ideas about this post, please leave them in the comment section; your comments help us help you more and improve. For more interesting posts and code, keep reading our blog.

For more info see: Laravel and GitHub.
