How to Insert Bulk Data into MySQL Using Java

  • Post last modified: February 19, 2024


  • As developers, we interact with databases in various capacities; most of our work involves select, insert, and update operations.
  • As data grows, we often find bottlenecks in our application code and work on improving performance.
  • In this article, we will discuss how to insert data into a MySQL database in bulk, and the benefits of doing so compared to inserting one record at a time.

Use Case

  • We receive CSV files of customers from marketing, and our task is to send them emails for a new campaign.
  • As developers, we have built the flow: we read each customer record from the file, enrich it with marketing info, insert the records into a database table, and enqueue a job.
  • The enqueued jobs then read the records and send an email to each customer.

Business Logic

Database Driver

  • First, we need to add a database driver to the pom.xml so that we can connect to our database.
  • Since I am using a MySQL database, I am adding the MySQL driver to the pom.xml.
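For reference, the Maven dependency for the MySQL driver looks roughly like this (the version shown is illustrative; check Maven Central for the latest Connector/J release):

```xml
<dependency>
    <groupId>com.mysql</groupId>
    <artifactId>mysql-connector-j</artifactId>
    <version>8.3.0</version>
</dependency>
```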

Database Connection

  • Our Connection instance requires a connection string, a username, and a password for the database.
  • A MySQL connection string has the form jdbc:mysql://<host>:<port>/<database>, for example jdbc:mysql://localhost:3306/customers.
public class JDBCConnection {

    // Demo only: never hard-code credentials; read them from a config file or Vault/KMS
    private static final String CONNECTION_STRING = "jdbc:mysql://localhost:3306/customers";
    private static final String USER = "user";
    private static final String PASSWORD = "password";

    public static Connection connect() throws SQLException {
        Connection connection = DriverManager.getConnection(CONNECTION_STRING, USER, PASSWORD);
        System.out.println("Connected to database as " + USER);
        return connection;
    }
}
Inserting Records as Batch/Bulk

  • The function below takes insertQuery and the records to be inserted as a batch as arguments.
  • We first get a connection to the database and set auto-commit to false, because we want to control when the transaction is committed.
  • A PreparedStatement is a pre-compiled query, which is very efficient because the database does not need to re-compile it for every record; with bulk records, reusing a precompiled query with parameterized arguments significantly improves database performance.
  • We iterate over each input record and add it to the PreparedStatement as a batch.
  • Then we execute the batch against our database and commit.
    public void insertAsBatch(String insertQuery, List<Record> records) throws SQLException {
        Connection connection = JDBCConnection.connect();
        connection.setAutoCommit(false);
        PreparedStatement preparedStatement = connection.prepareStatement(insertQuery);
        for (Record record : records) {
            try {
                setPreparedStatement(record, connection, preparedStatement);
                preparedStatement.addBatch();
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
        int[] counts = preparedStatement.executeBatch();
        connection.commit();
        System.out.println("Number of records to be inserted : " + records.size());
        System.out.println("Number of records actually inserted : " + counts.length);
    }
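For very large input files, it is common to flush the batch every few thousand rows so that memory use stays bounded instead of adding the entire file to one batch. A minimal, self-contained sketch of that chunking logic (class and method names here are hypothetical, not from the original code):

```java
import java.util.ArrayList;
import java.util.List;

public class BatchChunker {
    // Split records into sub-lists of at most batchSize elements,
    // so each executeBatch() call handles a bounded number of rows.
    static <T> List<List<T>> chunk(List<T> records, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < records.size(); i += batchSize) {
            batches.add(records.subList(i, Math.min(i + batchSize, records.size())));
        }
        return batches;
    }

    public static void main(String[] args) {
        List<Integer> records = new ArrayList<>();
        for (int i = 0; i < 2500; i++) records.add(i);
        List<List<Integer>> batches = chunk(records, 1000);
        System.out.println(batches.size());        // 3
        System.out.println(batches.get(2).size()); // 500
    }
}
```

Each sub-list would then be added to the PreparedStatement and flushed with executeBatch() before moving to the next chunk.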

Set Records

  • We need to add each record's fields to the prepared statement; each parameter index corresponds to a placeholder in the insert query.
    private static void setPreparedStatement(Record record, Connection connection, PreparedStatement preparedStatement) throws SQLException, ParseException {
        preparedStatement.setString(1, record.getProp1());
        preparedStatement.setString(2, record.getProp2());
        preparedStatement.setString(3, record.getProp3());
        preparedStatement.setString(4, record.getProp4());
    }
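The insert query matching the four parameters above would be a parameterized statement along these lines (the table and column names are illustrative; the original article does not show them):

```java
public class Queries {
    // Four '?' placeholders, matching parameter indexes 1-4 in setPreparedStatement
    public static final String INSERT_QUERY =
        "INSERT INTO customers (prop1, prop2, prop3, prop4) VALUES (?, ?, ?, ?)";

    public static void main(String[] args) {
        long placeholders = INSERT_QUERY.chars().filter(c -> c == '?').count();
        System.out.println(placeholders); // prints 4
    }
}
```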

Client Code

    public static void main(String[] args) throws SQLException {
        // read input files and set csvRecords
        recordDao.insertAsBatch(INSERT_QUERY, csvRecords);
    }


  • In this article, we used PreparedStatement to add input records to a batch and insert them into the database as a single atomic operation.
  • If any record fails to insert, the transaction is rolled back, so the bulk operation is atomic.
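One MySQL-specific performance tweak worth knowing: Connector/J can rewrite a batch of single-row INSERTs into one multi-row INSERT when rewriteBatchedStatements=true is set on the connection URL, which often speeds up bulk inserts considerably (host and database names below are illustrative):

```
jdbc:mysql://localhost:3306/customers?rewriteBatchedStatements=true
```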

