Shrikant Dande - 5 months ago
Java Question

How to get the record count of deleted rows using the deleteInBatch method of JpaRepository

@Scheduled(cron = "${trackingdetailspurge.cron.expression}")
void oldRecordPurgeDailyCron() {
    List<TrackingEvent> listTrackingEvent = trackingEventDao.findByEventDate(450, "AUDIT_EVENT");
    logger.info("--Before Tracking_event purge execution | no of records :" + listTrackingEvent.size());
    trackingEventDao.deleteInBatch(listTrackingEvent);
    logger.info("Tracking event purge completed | no of records :" + listTrackingEvent.size() + " | " + listTrackingEvent.get(0).getId());
    List<Tracking> listTracking = trackingDao.findByUpdateDate(450);
}

This is what I am doing so far.

It is giving a StackOverflowError. I am trying to delete about 1 million records.

Answer Source

In my opinion, if you are trying to delete that many records, I would suggest splitting the list into evenly sized groups and deleting those groups iteratively, each in its own separate transaction.

That way you can work out the largest batch size that does not cause the StackOverflowError and is processed the fastest. You also would not block other transactions for too long, which might be an important point to consider.

1) Create a service method which would always be run in a new, isolated transaction:

@Transactional(propagation = Propagation.REQUIRES_NEW)
public void deleteEventsInBatch(List<TrackingEvent> events) {
    trackingEventDao.deleteInBatch(events);
}

2) Change your scheduled method to split the original list and invoke the aforementioned service method in a loop.

Now you can keep a counter and update it after each successful service call. If one of the calls fails, the counter will still hold the number of rows deleted so far.
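
A minimal sketch of what the scheduled method could look like under those assumptions. The BATCH_SIZE value of 10,000 and the trackingEventService field are illustrative, and the service from step 1 is assumed to live in a separate bean so that the REQUIRES_NEW propagation is applied through the Spring proxy:

private static final int BATCH_SIZE = 10_000; // illustrative; tune to the largest size that stays stable and fast

@Scheduled(cron = "${trackingdetailspurge.cron.expression}")
void oldRecordPurgeDailyCron() {
    List<TrackingEvent> events = trackingEventDao.findByEventDate(450, "AUDIT_EVENT");
    int deletedCount = 0;
    for (int from = 0; from < events.size(); from += BATCH_SIZE) {
        int to = Math.min(from + BATCH_SIZE, events.size());
        // each slice is removed in its own REQUIRES_NEW transaction
        trackingEventService.deleteEventsInBatch(events.subList(from, to));
        deletedCount += to - from; // rows deleted so far; earlier batches are already committed
    }
    logger.info("Tracking event purge completed | no of records deleted: " + deletedCount);
}

Since deleteInBatch itself returns void, this loop-side counter is what gives you the number of deleted rows.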