
Spring Boot Batch Complete Guide: Job, Step, Chunk Processing and Production Patterns


1. Spring Batch Architecture

Spring Batch is a lightweight batch processing framework designed for handling large volumes of data. It is widely used for ETL (Extract, Transform, Load), data migration, report generation, settlement processing, and more.

Core Architecture Components

Job
├── Step 1
│   └── Chunk (chunk size: 100)
│       ├── ItemReader    (read)
│       ├── ItemProcessor (transform/filter)
│       └── ItemWriter    (write)
└── Step 2
    └── Tasklet (single operation)

Key components:

  • Job: Top-level unit of a batch process. Composed of one or more Steps
  • Step: Execution unit within a Job — either Chunk-based or Tasklet-based
  • Chunk: A fixed-size unit of read, process, and write operations
  • ItemReader: Interface that reads items one at a time from a data source
  • ItemProcessor: Transforms or filters read items
  • ItemWriter: Persists processed items to a target store

JobRepository, JobLauncher, JobExplorer

  • JobLauncher: receives a Job plus JobParameters and launches the execution
  • JobRepository: stores and retrieves execution metadata (the meta tables below)
  • JobExplorer: read-only access to that execution metadata

Spring Batch Meta Tables

-- Key metadata tables
BATCH_JOB_INSTANCE    -- Job instance information
BATCH_JOB_EXECUTION   -- Job execution info (status, start/end time)
BATCH_JOB_EXECUTION_PARAMS -- Job parameters
BATCH_STEP_EXECUTION  -- Step execution info
BATCH_STEP_EXECUTION_CONTEXT -- Step context data
BATCH_JOB_EXECUTION_CONTEXT  -- Job context data

2. Dependencies and Configuration

Maven Dependencies

<!-- Spring Boot Batch -->
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-batch</artifactId>
</dependency>

<!-- Database for batch meta tables (e.g., PostgreSQL) -->
<dependency>
    <groupId>org.postgresql</groupId>
    <artifactId>postgresql</artifactId>
</dependency>

<!-- Testing -->
<dependency>
    <groupId>org.springframework.batch</groupId>
    <artifactId>spring-batch-test</artifactId>
    <scope>test</scope>
</dependency>

application.yml Configuration

spring:
  batch:
    job:
      enabled: false # prevent auto-run on startup
    jdbc:
      initialize-schema: always # auto-create meta tables (use "never" with managed migrations in production)
  datasource:
    url: jdbc:postgresql://localhost:5432/batchdb
    username: batchuser
    password: batchpass

logging:
  level:
    org.springframework.batch: DEBUG

3. Basic Job Configuration

UserMigrationJobConfig - Full Configuration Example

@Configuration
// Note: on Spring Boot 3+, adding @EnableBatchProcessing switches off Boot's
// batch auto-configuration, including the spring.batch.* properties above.
// Omit it unless you configure the batch infrastructure yourself.
public class UserMigrationJobConfig {

    // Job definition
    @Bean
    public Job userMigrationJob(JobRepository jobRepository,
                                 Step migrationStep) {
        return new JobBuilder("userMigrationJob", jobRepository)
                .start(migrationStep)
                .listener(jobExecutionListener())
                .build();
    }

    // Step definition (Chunk-based)
    @Bean
    public Step migrationStep(JobRepository jobRepository,
                               PlatformTransactionManager txManager,
                               ItemReader<User> userItemReader,
                               ItemProcessor<User, UserDto> userItemProcessor,
                               ItemWriter<UserDto> userItemWriter) {
        return new StepBuilder("migrationStep", jobRepository)
                .<User, UserDto>chunk(100, txManager)
                .reader(userItemReader)
                .processor(userItemProcessor)
                .writer(userItemWriter)
                .faultTolerant()
                .skipLimit(10)
                .skip(DataIntegrityViolationException.class)
                .retryLimit(3)
                .retry(TransientDataAccessException.class)
                .listener(stepExecutionListener())
                .build();
    }

    @Bean
    public JobExecutionListener jobExecutionListener() {
        return new JobExecutionListener() {
            @Override
            public void beforeJob(JobExecution jobExecution) {
                System.out.println("Job started: "
                    + jobExecution.getJobInstance().getJobName());
            }

            @Override
            public void afterJob(JobExecution jobExecution) {
                System.out.printf("Job finished: %s, status: %s%n",
                    jobExecution.getJobInstance().getJobName(),
                    jobExecution.getStatus());
            }
        };
    }

    @Bean
    public StepExecutionListener stepExecutionListener() {
        return new StepExecutionListener() {
            @Override
            public void beforeStep(StepExecution stepExecution) {
                System.out.println("Step started: " + stepExecution.getStepName());
            }

            @Override
            public ExitStatus afterStep(StepExecution stepExecution) {
                System.out.printf("Step finished: read=%d, skipped=%d, written=%d%n",
                    stepExecution.getReadCount(),
                    stepExecution.getProcessSkipCount(),
                    stepExecution.getWriteCount());
                return stepExecution.getExitStatus();
            }
        };
    }
}

Tasklet-Based Step

@Bean
public Step cleanupStep(JobRepository jobRepository,
                         PlatformTransactionManager txManager) {
    return new StepBuilder("cleanupStep", jobRepository)
            .tasklet((contribution, chunkContext) -> {
                // Suitable for simple one-off operations
                System.out.println("Cleaning up temporary files...");
                // file deletion logic here
                return RepeatStatus.FINISHED;
            }, txManager)
            .build();
}

4. ItemReader Implementations

JdbcCursorItemReader - High-Volume DB Reading

@Bean
@StepScope
public JdbcCursorItemReader<User> userCursorReader(DataSource dataSource) {
    return new JdbcCursorItemReaderBuilder<User>()
            .name("userCursorReader")
            .dataSource(dataSource)
            .sql("SELECT id, username, email, status FROM users WHERE status = 'ACTIVE' ORDER BY id")
            .rowMapper(new BeanPropertyRowMapper<>(User.class))
            .fetchSize(1000)
            .build();
}

JdbcPagingItemReader - Page-Based Reading

@Bean
@StepScope
public JdbcPagingItemReader<User> userPagingReader(DataSource dataSource) {
    Map<String, Order> sortKeys = new HashMap<>();
    sortKeys.put("id", Order.ASCENDING);

    PostgresPagingQueryProvider queryProvider = new PostgresPagingQueryProvider();
    queryProvider.setSelectClause("SELECT id, username, email, status");
    queryProvider.setFromClause("FROM users");
    queryProvider.setWhereClause("WHERE status = 'ACTIVE'");
    queryProvider.setSortKeys(sortKeys);

    return new JdbcPagingItemReaderBuilder<User>()
            .name("userPagingReader")
            .dataSource(dataSource)
            .queryProvider(queryProvider)
            .pageSize(100)
            .rowMapper(new BeanPropertyRowMapper<>(User.class))
            .build();
}

FlatFileItemReader - CSV File Reading

@Bean
@StepScope
public FlatFileItemReader<UserCsvDto> csvUserReader(
        @Value("#{jobParameters['inputFile']}") String inputFile) {

    return new FlatFileItemReaderBuilder<UserCsvDto>()
            .name("csvUserReader")
            .resource(new FileSystemResource(inputFile))
            .linesToSkip(1)  // skip header row
            .delimited()
            .delimiter(",")
            .names("id", "username", "email", "createdAt")
            .targetType(UserCsvDto.class)  // replaces the double-brace BeanWrapperFieldSetMapper idiom
            .build();
}

Custom ItemReader Implementation

@Component
@StepScope
public class ApiCallItemReader implements ItemReader<UserData> {

    private final UserApiClient apiClient;
    private int page = 0;
    private List<UserData> currentPageData = new ArrayList<>();
    private int currentIndex = 0;
    private boolean exhausted = false;

    public ApiCallItemReader(UserApiClient apiClient) {
        this.apiClient = apiClient;
    }

    @Override
    public UserData read() throws Exception {
        if (exhausted) return null;

        if (currentIndex >= currentPageData.size()) {
            currentPageData = apiClient.fetchUsers(page++, 100);
            currentIndex = 0;

            if (currentPageData.isEmpty()) {
                exhausted = true;
                return null;
            }
        }

        return currentPageData.get(currentIndex++);
    }
}
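The page-buffered `read()` pattern above can be exercised without Spring by swapping the API client for a lambda. Everything below is an illustrative sketch; only the `read()` contract (next item, or null when the source is exhausted) mirrors `ItemReader`:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.BiFunction;

public class PagedReaderSketch {

    private final BiFunction<Integer, Integer, List<String>> fetchPage; // (page, size) -> items
    private final int pageSize;
    private int page = 0;
    private List<String> buffer = new ArrayList<>();
    private int index = 0;
    private boolean exhausted = false;

    public PagedReaderSketch(BiFunction<Integer, Integer, List<String>> fetchPage,
                             int pageSize) {
        this.fetchPage = fetchPage;
        this.pageSize = pageSize;
    }

    /** Same contract as ItemReader#read(): next item, or null when exhausted. */
    public String read() {
        if (exhausted) return null;
        if (index >= buffer.size()) {          // buffer drained: fetch next page
            buffer = fetchPage.apply(page++, pageSize);
            index = 0;
            if (buffer.isEmpty()) {            // empty page signals end of data
                exhausted = true;
                return null;
            }
        }
        return buffer.get(index++);
    }

    public static void main(String[] args) {
        List<String> data = List.of("a", "b", "c", "d", "e");
        PagedReaderSketch reader = new PagedReaderSketch(
            (p, size) -> data.subList(Math.min(p * size, data.size()),
                                      Math.min((p + 1) * size, data.size())),
            2);
        List<String> out = new ArrayList<>();
        String item;
        while ((item = reader.read()) != null) out.add(item);
        System.out.println(out); // all five items, fetched two per page
    }
}
```

The terminating null is what tells the chunk loop the step's input is exhausted, so a custom reader must eventually return it.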

5. ItemProcessor Implementations

Data Transformation and Filtering

@Component
@StepScope
public class UserMigrationProcessor implements ItemProcessor<User, UserDto> {

    private static final Logger log = LoggerFactory.getLogger(UserMigrationProcessor.class);

    @Override
    public UserDto process(User user) throws Exception {
        // Returning null skips this item — it will not be passed to the writer
        if (!isEligibleForMigration(user)) {
            log.debug("Skipping user: {}", user.getUsername());
            return null;
        }

        UserDto dto = new UserDto();
        dto.setId(user.getId());
        dto.setUsername(user.getUsername().toLowerCase().trim());
        dto.setEmail(user.getEmail().toLowerCase());
        dto.setDisplayName(formatDisplayName(user.getFirstName(), user.getLastName()));
        dto.setMigratedAt(LocalDateTime.now());

        return dto;
    }

    private boolean isEligibleForMigration(User user) {
        return user.getStatus() != null
            && "ACTIVE".equals(user.getStatus())
            && user.getEmail() != null
            && user.getEmail().contains("@");
    }

    private String formatDisplayName(String firstName, String lastName) {
        return Stream.of(firstName, lastName)
                .filter(s -> s != null && !s.isBlank())
                .collect(Collectors.joining(" "));
    }
}

CompositeItemProcessor - Chaining Multiple Processors

@Bean
public CompositeItemProcessor<User, UserDto> compositeProcessor(
        ExternalService externalService) { // illustrative collaborator, injected by Spring
    List<ItemProcessor<?, ?>> processors = new ArrayList<>();
    processors.add(new ValidationProcessor());
    processors.add(new EnrichmentProcessor(externalService));
    processors.add(new TransformationProcessor());

    CompositeItemProcessor<User, UserDto> composite = new CompositeItemProcessor<>();
    composite.setDelegates(processors);
    return composite;
}
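The delegation contract can be sketched in plain Java: each delegate feeds its output into the next, and a null return short-circuits the chain, matching CompositeItemProcessor's filtering behavior. The helper below is an illustrative model, not a Spring type:

```java
import java.util.List;
import java.util.function.Function;

public class ProcessorChainSketch {

    /**
     * Applies delegates in order, feeding each output into the next.
     * If any delegate returns null, the chain stops and the item is
     * filtered, just as a null from any CompositeItemProcessor delegate
     * prevents the item from reaching the writer.
     */
    public static Object process(Object item, List<Function<Object, Object>> delegates) {
        Object current = item;
        for (Function<Object, Object> delegate : delegates) {
            current = delegate.apply(current);
            if (current == null) return null; // filtered: later delegates never run
        }
        return current;
    }

    public static void main(String[] args) {
        List<Function<Object, Object>> delegates = List.of(
            s -> ((String) s).trim(),               // normalization step
            s -> ((String) s).isEmpty() ? null : s, // filter blanks
            s -> ((String) s).toUpperCase()         // transformation step
        );
        System.out.println(process("  alice ", delegates)); // ALICE
        System.out.println(process("   ", delegates));      // null (filtered)
    }
}
```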

6. ItemWriter Implementations

JdbcBatchItemWriter - Bulk INSERT/UPDATE

@Bean
public JdbcBatchItemWriter<UserDto> userDtoWriter(DataSource dataSource) {
    return new JdbcBatchItemWriterBuilder<UserDto>()
            .dataSource(dataSource)
            .sql("""
                INSERT INTO users_new (id, username, email, display_name, migrated_at)
                VALUES (:id, :username, :email, :displayName, :migratedAt)
                ON CONFLICT (id) DO UPDATE
                SET username = EXCLUDED.username,
                    email = EXCLUDED.email,
                    migrated_at = EXCLUDED.migrated_at
                """)
            .beanMapped()
            .build();
}

JpaItemWriter

@Bean
public JpaItemWriter<UserDto> jpaUserWriter(EntityManagerFactory entityManagerFactory) {
    JpaItemWriter<UserDto> writer = new JpaItemWriter<>();
    writer.setEntityManagerFactory(entityManagerFactory);
    return writer;
}

FlatFileItemWriter - Result File Output

@Bean
@StepScope
public FlatFileItemWriter<UserDto> csvResultWriter(
        @Value("#{jobParameters['outputFile']}") String outputFile) {

    return new FlatFileItemWriterBuilder<UserDto>()
            .name("csvResultWriter")
            .resource(new FileSystemResource(outputFile))
            .headerCallback(writer -> writer.write("id,username,email,migrated_at"))
            .delimited()
            .delimiter(",")
            .names("id", "username", "email", "migratedAt")
            .build();
}

CompositeItemWriter - Write to Multiple Targets

@Bean
public CompositeItemWriter<UserDto> compositeWriter(
        JdbcBatchItemWriter<UserDto> dbWriter,
        FlatFileItemWriter<UserDto> fileWriter) {

    CompositeItemWriter<UserDto> writer = new CompositeItemWriter<>();
    writer.setDelegates(Arrays.asList(dbWriter, fileWriter));
    return writer;
}

7. Advanced Features

Partitioning - Parallel Processing for Large Datasets

@Configuration
public class PartitionedJobConfig {

    @Bean
    public Step masterStep(JobRepository jobRepository,
                            Partitioner partitioner,
                            Step workerStep) {
        return new StepBuilder("masterStep", jobRepository)
                .partitioner("workerStep", partitioner)
                .step(workerStep)
                .gridSize(4)  // number of partitions (threads)
                .taskExecutor(taskExecutor())
                .build();
    }

    @Bean
    public Partitioner columnRangePartitioner(DataSource dataSource) {
        return gridSize -> {
            Map<String, ExecutionContext> result = new HashMap<>();
            int totalCount = getTotalCount(dataSource); // e.g. SELECT count(*) via JdbcTemplate (helper not shown)
            int rangeSize = totalCount / gridSize;

            for (int i = 0; i < gridSize; i++) {
                ExecutionContext context = new ExecutionContext();
                context.putLong("minId", (long) i * rangeSize + 1);
                context.putLong("maxId",
                    i == gridSize - 1 ? totalCount : (long)(i + 1) * rangeSize);
                result.put("partition" + i, context);
            }
            return result;
        };
    }

    @Bean
    @StepScope
    public JdbcPagingItemReader<User> partitionedReader(
            DataSource dataSource,
            @Value("#{stepExecutionContext['minId']}") Long minId,
            @Value("#{stepExecutionContext['maxId']}") Long maxId) {

        Map<String, Object> parameterValues = new HashMap<>();
        parameterValues.put("minId", minId);
        parameterValues.put("maxId", maxId);

        return new JdbcPagingItemReaderBuilder<User>()
                .name("partitionedReader")
                .dataSource(dataSource)
                .parameterValues(parameterValues)
                // ... configure query
                .build();
    }

    @Bean
    public TaskExecutor taskExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(4);
        executor.setMaxPoolSize(8);
        executor.setQueueCapacity(25);
        executor.setThreadNamePrefix("batch-partition-");
        executor.initialize();
        return executor;
    }
}
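The boundary arithmetic in columnRangePartitioner is easy to verify in isolation. This plain-Java sketch (no Spring types) reproduces the same split, with the last partition absorbing the remainder when the row count is not evenly divisible:

```java
import java.util.ArrayList;
import java.util.List;

public class PartitionRangeSketch {

    public record Range(long minId, long maxId) {}

    /** Same split as the Partitioner above: contiguous ID ranges, last one takes the remainder. */
    public static List<Range> split(long totalCount, int gridSize) {
        long rangeSize = totalCount / gridSize;
        List<Range> ranges = new ArrayList<>();
        for (int i = 0; i < gridSize; i++) {
            long min = i * rangeSize + 1;
            long max = (i == gridSize - 1) ? totalCount : (i + 1) * rangeSize;
            ranges.add(new Range(min, max));
        }
        return ranges;
    }

    public static void main(String[] args) {
        // 10 rows over 4 partitions: ranges 1-2, 3-4, 5-6, 7-10
        System.out.println(split(10, 4));
    }
}
```

Checking the split this way before wiring it into a Partitioner catches off-by-one gaps that would otherwise silently drop rows between partitions.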

Multi-Threaded Step

@Bean
public Step multiThreadedStep(JobRepository jobRepository,
                               PlatformTransactionManager txManager,
                               SynchronizedItemStreamReader<User> reader,
                               ItemWriter<UserDto> writer) {
    return new StepBuilder("multiThreadedStep", jobRepository)
            .<User, UserDto>chunk(100, txManager)
            .reader(reader)   // must be thread-safe
            .writer(writer)
            .taskExecutor(new SimpleAsyncTaskExecutor())
            .throttleLimit(4)  // deprecated in Spring Batch 5; prefer a bounded ThreadPoolTaskExecutor instead
            .build();
}

// Wrapping a reader for thread safety
@Bean
public SynchronizedItemStreamReader<User> synchronizedReader(
        JdbcCursorItemReader<User> reader) {
    SynchronizedItemStreamReader<User> synchronizedReader = new SynchronizedItemStreamReader<>();
    synchronizedReader.setDelegate(reader);
    return synchronizedReader;
}

AsyncItemProcessor / AsyncItemWriter
These wrappers ship in the separate spring-batch-integration module, which must be added as an extra dependency.

@Bean
public AsyncItemProcessor<User, UserDto> asyncProcessor(
        UserMigrationProcessor delegateProcessor) {

    AsyncItemProcessor<User, UserDto> asyncProcessor = new AsyncItemProcessor<>();
    asyncProcessor.setDelegate(delegateProcessor);
    asyncProcessor.setTaskExecutor(new SimpleAsyncTaskExecutor());
    return asyncProcessor;
}

@Bean
public AsyncItemWriter<UserDto> asyncWriter(
        JdbcBatchItemWriter<UserDto> delegateWriter) {

    AsyncItemWriter<UserDto> asyncWriter = new AsyncItemWriter<>();
    asyncWriter.setDelegate(delegateWriter);
    return asyncWriter;
}

Dynamic Configuration with JobParameters

@Bean
@StepScope
public JdbcCursorItemReader<User> dynamicReader(
        DataSource dataSource,
        @Value("#{jobParameters['startDate']}") String startDate,
        @Value("#{jobParameters['endDate']}") String endDate) {

    return new JdbcCursorItemReaderBuilder<User>()
            .name("dynamicReader")
            .dataSource(dataSource)
            .sql("SELECT * FROM users WHERE created_at BETWEEN ? AND ?")
            .preparedStatementSetter(ps -> {
                ps.setString(1, startDate);
                ps.setString(2, endDate);
            })
            .rowMapper(new BeanPropertyRowMapper<>(User.class))
            .build();
}

8. Restart and Retry Strategies

Skip/Retry Policy

@Bean
public Step robustStep(JobRepository jobRepository,
                        PlatformTransactionManager txManager) {
    return new StepBuilder("robustStep", jobRepository)
            .<User, UserDto>chunk(100, txManager)
            .reader(reader())
            .processor(processor())
            .writer(writer())
            .faultTolerant()
            .skipLimit(10)
            .skip(ValidationException.class)
            .skip(DataIntegrityViolationException.class)
            .retryLimit(3)
            .retry(TransientDataAccessException.class)
            .retry(DeadlockLoserDataAccessException.class)
            .noSkip(FatalBatchException.class)
            .build();
}
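The interplay of retryLimit and skipLimit can be modeled in a few lines of plain Java. This is a simplified sketch of the accounting only, not Spring Batch's actual implementation (which also rolls back the transaction and reprocesses the chunk item by item):

```java
import java.util.function.Supplier;

public class RetrySkipSketch {

    /**
     * Simplified model of faultTolerant() accounting: retry a failing
     * operation up to retryLimit attempts; if it still fails, count a
     * skip, unless the skip limit is already exhausted, in which case
     * the failure propagates and the step fails.
     */
    public static String attempt(Supplier<String> op,
                                 int retryLimit, int[] skipCount, int skipLimit) {
        RuntimeException last = null;
        for (int attempt = 1; attempt <= retryLimit; attempt++) {
            try {
                return op.get();               // success: no retry, no skip
            } catch (RuntimeException e) {
                last = e;                      // retryable failure: try again
            }
        }
        if (skipCount[0] >= skipLimit) {
            throw last;                        // skip limit exceeded: step fails
        }
        skipCount[0]++;
        return null;                           // item skipped
    }

    public static void main(String[] args) {
        int[] failures = {2};                  // transient error: succeeds on 3rd try
        String result = attempt(() -> {
            if (failures[0]-- > 0) throw new RuntimeException("transient");
            return "ok";
        }, 3, new int[]{0}, 10);
        System.out.println(result); // ok
    }
}
```

The key takeaway mirrors the real framework: retries happen first, and only an item that exhausts its retries consumes skip budget.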

SkipListener Implementation

@Component
public class UserSkipListener implements SkipListener<User, UserDto> {

    private static final Logger log = LoggerFactory.getLogger(UserSkipListener.class);

    @Override
    public void onSkipInRead(Throwable t) {
        log.error("Skipped during read: {}", t.getMessage());
    }

    @Override
    public void onSkipInProcess(User user, Throwable t) {
        log.warn("Skipped during process - userId: {}, error: {}",
            user.getId(), t.getMessage());
    }

    @Override
    public void onSkipInWrite(UserDto dto, Throwable t) {
        log.error("Skipped during write - userId: {}, error: {}",
            dto.getId(), t.getMessage());
    }
}

Controlling Job Restartability

@Bean
public Job nonRestartableJob(JobRepository jobRepository, Step step1) {
    return new JobBuilder("nonRestartableJob", jobRepository)
            .start(step1)
            .preventRestart()  // disallow restart after failure
            .build();
}

9. Scheduler Integration

@Scheduled + JobLauncher

@Configuration
@EnableScheduling
public class BatchSchedulerConfig {

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private Job userMigrationJob;

    @Scheduled(cron = "0 0 2 * * *")  // every day at 2:00 AM
    public void runDailyBatch() {
        try {
            JobParameters params = new JobParametersBuilder()
                    .addString("date", LocalDate.now().toString())
                    .addLong("timestamp", System.currentTimeMillis())
                    .toJobParameters();

            JobExecution execution = jobLauncher.run(userMigrationJob, params);
            System.out.println("Batch execution status: " + execution.getStatus());
        } catch (Exception e) {
            System.err.println("Batch execution failed: " + e.getMessage());
        }
    }
}

Quartz Scheduler Integration

@Configuration
public class QuartzBatchConfig {

    @Bean
    public JobDetail batchJobDetail() {
        return JobBuilder.newJob(BatchQuartzJob.class)
                .withIdentity("batchJob")
                .storeDurably()
                .build();
    }

    @Bean
    public Trigger batchJobTrigger() {
        return TriggerBuilder.newTrigger()
                .forJob(batchJobDetail())
                .withIdentity("batchTrigger")
                .withSchedule(CronScheduleBuilder.cronSchedule("0 0 2 * * ?"))
                .build();
    }
}

@Component
public class BatchQuartzJob implements org.quartz.Job {

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private Job userMigrationJob;

    @Override
    public void execute(JobExecutionContext context) throws JobExecutionException {
        try {
            JobParameters params = new JobParametersBuilder()
                    .addLong("time", System.currentTimeMillis())
                    .toJobParameters();
            jobLauncher.run(userMigrationJob, params);
        } catch (Exception e) {
            throw new JobExecutionException(e);
        }
    }
}

REST API for Manual Job Triggering

@RestController
@RequestMapping("/api/batch")
public class BatchController {

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private Job userMigrationJob;

    @PostMapping("/run")
    public ResponseEntity<String> runJob(
            @RequestParam String startDate,
            @RequestParam String endDate) {
        try {
            JobParameters params = new JobParametersBuilder()
                    .addString("startDate", startDate)
                    .addString("endDate", endDate)
                    .addLong("timestamp", System.currentTimeMillis())
                    .toJobParameters();

            JobExecution execution = jobLauncher.run(userMigrationJob, params);
            return ResponseEntity.ok("Job execution ID: " + execution.getId()
                + ", status: " + execution.getStatus());
        } catch (Exception e) {
            return ResponseEntity.internalServerError()
                .body("Job execution failed: " + e.getMessage());
        }
    }
}

10. Testing

@SpringBatchTest Setup

@SpringBatchTest
@SpringBootTest(classes = {UserMigrationJobConfig.class, TestBatchConfig.class})
@ActiveProfiles("test")
class UserMigrationJobTest {

    @Autowired
    private JobLauncherTestUtils jobLauncherTestUtils;

    @Autowired
    private JobRepositoryTestUtils jobRepositoryTestUtils;

    @BeforeEach
    void clearMetadata() {
        jobRepositoryTestUtils.removeJobExecutions();
    }

    @Test
    void testCompleteJob() throws Exception {
        JobExecution jobExecution = jobLauncherTestUtils.launchJob(
            new JobParametersBuilder()
                .addString("date", "2026-03-17")
                .toJobParameters()
        );

        assertThat(jobExecution.getStatus()).isEqualTo(BatchStatus.COMPLETED);
        assertThat(jobExecution.getExitStatus().getExitCode()).isEqualTo("COMPLETED");
    }

    @Test
    void testSingleStep() throws Exception {
        JobExecution jobExecution = jobLauncherTestUtils.launchStep("migrationStep");

        StepExecution stepExecution = jobExecution.getStepExecutions().iterator().next();
        assertThat(stepExecution.getStatus()).isEqualTo(BatchStatus.COMPLETED);
        assertThat(stepExecution.getWriteCount()).isGreaterThan(0);
    }
}

StepScopeTestExecutionListener

@ExtendWith(SpringExtension.class)
@SpringBootTest
@TestExecutionListeners({
    DependencyInjectionTestExecutionListener.class,
    StepScopeTestExecutionListener.class
})
class ItemReaderTest {

    @Autowired
    private FlatFileItemReader<UserCsvDto> csvUserReader;

    // Invoked by StepScopeTestExecutionListener to supply the step-scope context
    public StepExecution getStepExecution() {
        JobParameters params = new JobParametersBuilder()
                .addString("inputFile", "src/test/resources/test-users.csv")
                .toJobParameters();
        return MetaDataInstanceFactory.createStepExecution(params);
    }

    @Test
    void testReader() throws Exception {
        csvUserReader.open(new ExecutionContext());
        List<UserCsvDto> users = new ArrayList<>();
        UserCsvDto user;
        while ((user = csvUserReader.read()) != null) {
            users.add(user);
        }
        csvUserReader.close();
        assertThat(users).isNotEmpty();
    }
}

Integration Test Example

@SpringBatchTest
@SpringBootTest
@Testcontainers
class BatchIntegrationTest {

    @Container
    static PostgreSQLContainer<?> postgres = new PostgreSQLContainer<>("postgres:15");

    @DynamicPropertySource
    static void setProps(DynamicPropertyRegistry registry) {
        registry.add("spring.datasource.url", postgres::getJdbcUrl);
        registry.add("spring.datasource.username", postgres::getUsername);
        registry.add("spring.datasource.password", postgres::getPassword);
    }

    @Autowired
    private JobLauncherTestUtils jobLauncherTestUtils;

    @Autowired
    private UserRepository userRepository;

    @Test
    void testFullMigrationPipeline() throws Exception {
        // Given: insert test data
        insertTestUsers(100);

        // When: run the batch job
        JobExecution execution = jobLauncherTestUtils.launchJob();

        // Then: verify results
        assertThat(execution.getStatus()).isEqualTo(BatchStatus.COMPLETED);
        assertThat(userRepository.countByMigratedTrue()).isEqualTo(100);
    }
}

11. Monitoring

Actuator Metrics Exposure

Spring Boot does not ship a dedicated batch actuator endpoint; batch metrics are published through Micrometer and surfaced by the standard metrics endpoints:

management:
  endpoints:
    web:
      exposure:
        include: health,info,metrics,prometheus

Once a job has run, its timers appear under /actuator/metrics/spring.batch.job.

Micrometer + Prometheus Metrics

Since Spring Batch 4.2, metrics are registered with Micrometer automatically whenever a MeterRegistry is available, so no custom configuration bean is required. Adding the registry dependency is enough for Prometheus scraping:

<dependency>
    <groupId>io.micrometer</groupId>
    <artifactId>micrometer-registry-prometheus</artifactId>
</dependency>

Key metrics to monitor:

  • spring.batch.job: job duration, tagged with job name and status
  • spring.batch.step: step duration per step
  • spring.batch.item.read / spring.batch.item.process / spring.batch.chunk.write: per-item and per-chunk processing latency

Quiz: Test Your Spring Batch Knowledge

Q1. What happens when an ItemProcessor returns null in a Chunk-based Step?

Answer: The item is filtered out and not passed to the ItemWriter.

Explanation: When an ItemProcessor returns null for a given item, Spring Batch silently filters that item and does not forward it to the writer. This is the cleanest way to implement conditional filtering, and filtered items do not count toward the skip limit (they are tracked separately as the StepExecution's filterCount). The behavior is distinct from throwing an exception to trigger a skip, which does increment the skip count.

Q2. What is the key difference between JdbcCursorItemReader and JdbcPagingItemReader, and when is each appropriate?

Answer: JdbcCursorItemReader streams data through a single DB cursor held open on one connection; JdbcPagingItemReader fetches data page by page, issuing an independent query per page keyed on the sort column.

Explanation: JdbcCursorItemReader holds a cursor open on a single connection and streams rows with low memory overhead, but it is not thread-safe (unsuitable for multi-threaded Steps) and ties up that connection for the entire step. JdbcPagingItemReader issues a fresh query per page (paging on the sort key rather than a plain OFFSET, so later pages remain fast), works well with connection pools, and is safe in multi-threaded environments. Use the cursor reader for single-threaded high-volume processing; use the paging reader when parallel processing or restartability matters.

Q3. How does Spring Batch resume processing from the point of failure when a Job is restarted?

Answer: Running the Job again with the same JobParameters causes Spring Batch to automatically resume from the last failed Step.

Explanation: Spring Batch stores execution history in the JobRepository. When the same JobParameters are used to re-run a Job, it looks up the most recent failed JobExecution in the BATCH_JOB_EXECUTION table and resumes processing from that Step. Calling preventRestart() disables this behavior. In Chunk processing, chunks that were already successfully committed are not reprocessed — only the failed chunk onward is retried.

Q4. Why is Partitioning used in Spring Batch, and what does the gridSize parameter control?

Answer: Partitioning divides a large dataset into sub-partitions for parallel execution, and gridSize specifies how many partitions (concurrent execution units) to create.

Explanation: A Partitioning Step logically divides a large dataset into segments and processes each in a separate thread or process concurrently. The gridSize determines the partition count and is typically set based on available CPU cores or DB connection pool size. By implementing the Partitioner interface, you can define partitions by ID range, date range, file list, or any other criterion.

Q5. Why is spring.batch.job.enabled: false recommended in application.yml?

Answer: To prevent all registered Jobs from automatically running when the application starts up.

Explanation: By default, Spring Batch runs all registered Jobs when the application context loads. Setting this to false is essential when embedding batch processing in a web application or when you want Jobs to be triggered explicitly via a REST API or scheduler rather than at every startup. It also prevents accidental batch runs during development when restarting the server. Jobs are then launched explicitly through JobLauncher.