Chapter 13. Batch processing

A naive approach to inserting 100 000 rows into the database using Hibernate might look like this:

Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();
for ( int i=0; i<100000; i++ ) {
    Customer customer = new Customer(.....);
    session.save(customer);
}
tx.commit();
session.close();

This would fall over with an OutOfMemoryError somewhere around the 50 000th row. That's because Hibernate caches all the newly inserted Customer instances in the session-level cache.

In this chapter we'll show you how to avoid this problem. First, however, note that enabling JDBC batching is absolutely critical if you intend to achieve reasonable batch-processing performance. Set the JDBC batch size to a reasonable number (say, 10 to 50):

hibernate.jdbc.batch_size 20
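
If you configure Hibernate through hibernate.cfg.xml rather than a properties file, the equivalent setting is:

<property name="hibernate.jdbc.batch_size">20</property>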

Note that Hibernate transparently disables insert batching at the JDBC level if you use an identity identifier generator.
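
If batching matters for your workload and your database supports sequences, a sequence-based generator keeps JDBC batching available. A minimal mapping sketch, where the id property and the customer_seq sequence name are illustrative assumptions, not part of the examples above:

<id name="id">
    <!-- customer_seq is an assumed, illustrative sequence name -->
    <generator class="sequence">
        <param name="sequence">customer_seq</param>
    </generator>
</id>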

You might also want to do this kind of work in a process where interaction with the second-level cache is completely disabled:

hibernate.cache.use_second_level_cache false

However, this is not absolutely necessary, since we can explicitly set the CacheMode to disable interaction with the second-level cache.
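
For example, a session used only for batch work can ignore the second-level cache entirely; a minimal sketch:

Session session = sessionFactory.openSession();
// CacheMode.IGNORE: this session will neither read from nor
// write to the second-level cache during the batch run
session.setCacheMode(CacheMode.IGNORE);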

13.1. Batch inserts

When making new objects persistent, you must flush() and then clear() the session regularly, to control the size of the first-level cache.

Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();
   
for ( int i=0; i<100000; i++ ) {
    Customer customer = new Customer(.....);
    session.save(customer);
    if ( (i + 1) % 20 == 0 ) { //flush every 20 rows, same as the JDBC batch size
        //flush a batch of inserts and release memory:
        session.flush();
        session.clear();
    }
}
   
tx.commit();
session.close();