Question (score 53)

I am trying to configure a couple of datasources within Spring Batch. On startup, Spring Batch is throwing the following exception:

To use the default BatchConfigurer the context must contain no more than one DataSource, found 2

Snippet from the batch configuration:

@Configuration
@EnableBatchProcessing 
public class BatchJobConfiguration {

    @Primary
    @Bean(name = "baseDatasource")
    public DataSource dataSource() {
         // first datasource definition here
    }
    @Bean(name = "secondaryDataSource")
    public DataSource dataSource2() {
         // second datasource definition here
    }
    ...
}

I am not sure why I am seeing this exception, because I have seen XML-based Spring Batch configurations that declare multiple datasources. I am using Spring Batch core version 3.0.1.RELEASE with Spring Boot version 1.1.5.RELEASE. Any help would be greatly appreciated.

  • With the XML configuration you have to be explicit about which datasource Spring Batch uses. With Java-based configuration, if you don't declare it explicitly, Spring Batch tries to detect the datasource itself, which only works when a single datasource is present. You could try annotating the one to use for Batch with @Primary. Alternatively, you could construct a DefaultBatchConfigurer, which takes a datasource as a constructor argument, and pass it the one to use.
    – M. Deinum
    Aug 28, 2014 at 5:08
  • I have tried with @Primary and it doesn't work; I will try with DefaultBatchConfigurer.
    Aug 28, 2014 at 16:23
  • This approach is somewhat helpful: stackoverflow.com/a/25811665/701368
    – wmarbut
    Oct 17, 2014 at 23:10
  • Beans are injected "by type" when the @Autowired annotation is used; wire beans "by name" if there is a conflict between objects of the same type.
    – Braj
    Mar 23, 2015 at 11:37

6 Answers

Answer 1 (score 39)

You must provide your own BatchConfigurer; Spring does not want to make that decision for you:

@Configuration
@EnableBatchProcessing
public class BatchConfig {

    @Bean
    BatchConfigurer configurer(@Qualifier("batchDataSource") DataSource dataSource) {
        return new DefaultBatchConfigurer(dataSource);
    }

    // ...
}
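With the metadata datasource pinned this way, the other DataSource bean remains free for the job's own readers and writers. The snippet below is only a rough sketch of that idea; the bean name appDataSource, the person table, and the query are my assumptions, not part of this answer:

import java.util.Map;

import javax.sql.DataSource;

import org.springframework.batch.item.database.JdbcCursorItemReader;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.core.ColumnMapRowMapper;

@Configuration
public class AppDataConfig {

    // Hypothetical reader that talks to the second (non-batch) datasource.
    @Bean
    public JdbcCursorItemReader<Map<String, Object>> personReader(
            @Qualifier("appDataSource") DataSource appDataSource) {
        JdbcCursorItemReader<Map<String, Object>> reader = new JdbcCursorItemReader<>();
        reader.setDataSource(appDataSource);          // business data, not batch metadata
        reader.setSql("SELECT id, name FROM person"); // placeholder query
        reader.setRowMapper(new ColumnMapRowMapper());
        return reader;
    }
}

Injecting by qualifier keeps the business components independent of which datasource happens to be marked @Primary.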
  • Simple and effective. Thanks!
    Jan 25, 2018 at 16:37
  • Do we need to define the JobRepository and JobLauncher now? If I do so I am getting many errors.
    – Jeff Cook
    Aug 24, 2018 at 20:29
  • 4.1.1 - As near as I can tell, in order to use the Spring Batch configuration annotations with multiple datasources, you need to rewrite the entire configuration stack so you can tell it which datasource to use.
    – pojo-guy
    Apr 3, 2019 at 17:38
Answer 2 (score 29)

AbstractBatchConfiguration tries to look up a BatchConfigurer in the container first; if none is found, it then tries to create one itself. This is where the IllegalStateException is thrown when there is more than one DataSource bean in the container.

The approach to solving the problem is to prevent AbstractBatchConfiguration from creating the DefaultBatchConfigurer bean itself. To do that, we hint to the Spring container to create the DefaultBatchConfigurer for us, using the @Component annotation.

We annotate the configuration class that carries @EnableBatchProcessing with @ComponentScan, which scans the package containing the empty class derived from DefaultBatchConfigurer:

package batch_config;
...
@EnableBatchProcessing
@ComponentScan(basePackageClasses = MyBatchConfigurer.class)
public class MyBatchConfig {
    ...
}

The full code of that empty derived class is below:

package batch_config.components;
import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.stereotype.Component;
@Component
public class MyBatchConfigurer extends DefaultBatchConfigurer {
}

In this configuration, the @Primary annotation works for the DataSource bean, as in the example below:

@Configuration
public class BatchTestDatabaseConfig {
    @Bean
    @Primary
    public DataSource dataSource()
    {
        return .........;
    }
}

This works for Spring Batch version 3.0.3.RELEASE.

The simplest way to make the @Primary annotation on the DataSource work might be to add @ComponentScan(basePackageClasses = DefaultBatchConfigurer.class) alongside the @EnableBatchProcessing annotation:

@Configuration
@EnableBatchProcessing
@ComponentScan(basePackageClasses = DefaultBatchConfigurer.class)
public class MyBatchConfig {
    ...
}
  • The 2nd approach worked like a charm. I wasted 4 hours on this. Thanks a lot.
    May 14, 2019 at 13:41
  • Using @Primary on a given DataSource is a bad idea: whenever Spring injects a DataSource it will pick the primary one. I just ran into a problem with Spring Batch and Spring JPA/Hibernate: Hibernate was using DataSource2 and Spring JPA was performing the final commit on the primary datasource (DataSource1). One way to avoid this is sketched just below these comments.
    – pmartin8
    Jan 23, 2020 at 21:07
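Following up on that comment: one way to avoid depending on @Primary for the application's persistence is to bind JPA/Hibernate explicitly to its own datasource. This is only a hedged sketch of that idea; the appDataSource qualifier and the com.example.domain package are placeholders, not from this thread:

import javax.persistence.EntityManagerFactory;
import javax.sql.DataSource;

import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;
import org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter;
import org.springframework.transaction.PlatformTransactionManager;

@Configuration
public class JpaConfig {

    // Bind the EntityManagerFactory to the application datasource explicitly,
    // so it does not silently fall back to whichever bean is marked @Primary.
    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory(
            @Qualifier("appDataSource") DataSource appDataSource) {
        LocalContainerEntityManagerFactoryBean emf = new LocalContainerEntityManagerFactoryBean();
        emf.setDataSource(appDataSource);
        emf.setPackagesToScan("com.example.domain"); // placeholder package
        emf.setJpaVendorAdapter(new HibernateJpaVendorAdapter());
        return emf;
    }

    // A JPA transaction manager tied to that same EntityManagerFactory,
    // so commits run against the application datasource, not the batch one.
    @Bean
    public PlatformTransactionManager jpaTransactionManager(EntityManagerFactory emf) {
        return new JpaTransactionManager(emf);
    }
}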
Answer 3 (score 4)

I would like to provide a solution here which is very similar to the one answered by @vanarchi, but I managed to put all the necessary configuration into one class.

For the sake of completeness, the solution here assumes that the primary datasource is HSQL.

@Configuration
@EnableBatchProcessing
public class BatchConfiguration extends DefaultBatchConfigurer {

    @Bean
    @Primary
    public DataSource batchDataSource() {
        // No need to shut down manually; EmbeddedDatabaseFactoryBean takes care of this.
        EmbeddedDatabaseBuilder builder = new EmbeddedDatabaseBuilder();
        EmbeddedDatabase embeddedDatabase = builder
                .addScript("classpath:org/springframework/batch/core/schema-drop-hsqldb.sql")
                .addScript("classpath:org/springframework/batch/core/schema-hsqldb.sql")
                .setType(EmbeddedDatabaseType.HSQL) // .H2 or .DERBY
                .build();
        return embeddedDatabase;
    }

    @Override
    protected JobRepository createJobRepository() throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(batchDataSource());
        factory.setTransactionManager(transactionManager());
        factory.afterPropertiesSet();
        return (JobRepository) factory.getObject();
    }

    private ResourcelessTransactionManager transactionManager() {
        return new ResourcelessTransactionManager();
    }

    // NOTE: the code below just gives the developer an easy way to access the in-memory
    // HSQL datasource, since we configured it as the primary datasource to store the
    // batch-job-related data. Default username: sa, password: ''.
    @PostConstruct
    public void getDbManager() {
        DatabaseManagerSwing.main(
                new String[] { "--url", "jdbc:hsqldb:mem:testdb", "--user", "sa", "--password", "" });
    }
}

THREE key points in this solution:

  1. This class is annotated with @EnableBatchProcessing and @Configuration, and extends DefaultBatchConfigurer. By doing this, we instruct Spring Batch to use our customized batch configurer when AbstractBatchConfiguration tries to look up a BatchConfigurer.
  2. The batchDataSource bean is annotated with @Primary, which instructs Spring Batch to use this datasource for its nine job-related metadata tables.
  3. We override the protected JobRepository createJobRepository() throws Exception method, which makes the jobRepository bean use the primary datasource, as well as a transactionManager instance separate from the one(s) used by the other datasource(s). A sketch of how the other datasource can be wired into a step follows this list.
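To illustrate point 3, here is a rough sketch of a chunk-oriented step that writes business data through the other datasource using its own DataSourceTransactionManager. The appDataSource qualifier, the Person type, the PERSON table, and the personReader bean are all assumptions made for the example, not part of the original answer:

import javax.sql.DataSource;

import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.database.BeanPropertyItemSqlParameterSourceProvider;
import org.springframework.batch.item.database.JdbcBatchItemWriter;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.datasource.DataSourceTransactionManager;

@Configuration
public class AppStepConfig {

    // Writer bound to the business datasource, not the batch metadata one.
    @Bean
    public JdbcBatchItemWriter<Person> personWriter(
            @Qualifier("appDataSource") DataSource appDataSource) {
        JdbcBatchItemWriter<Person> writer = new JdbcBatchItemWriter<>();
        writer.setDataSource(appDataSource);
        writer.setSql("INSERT INTO PERSON (ID, NAME) VALUES (:id, :name)"); // placeholder table
        writer.setItemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<>());
        return writer;
    }

    // The step runs its chunk transactions against the business datasource,
    // while the JobRepository keeps using the primary (batch) datasource.
    @Bean
    public Step personStep(StepBuilderFactory steps,
                           ItemReader<Person> personReader, // assumed to be defined elsewhere
                           JdbcBatchItemWriter<Person> personWriter,
                           @Qualifier("appDataSource") DataSource appDataSource) {
        return steps.get("personStep")
                .<Person, Person>chunk(100)
                .reader(personReader)
                .writer(personWriter)
                .transactionManager(new DataSourceTransactionManager(appDataSource))
                .build();
    }

    // Minimal placeholder domain type for the sketch.
    public static class Person {
        private Long id;
        private String name;
        public Long getId() { return id; }
        public void setId(Long id) { this.id = id; }
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
    }
}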
Answer 4 (score 2)

The simplest solution is to extend the DefaultBatchConfigurer and autowire your datasource via a qualifier:

@Component
public class MyBatchConfigurer extends DefaultBatchConfigurer {

    /**
     * Initialize the BatchConfigurer to use the datasource of your choosing
     * @param firstDataSource
     */
    @Autowired
    public MyBatchConfigurer(@Qualifier("firstDataSource") DataSource firstDataSource) {
        super(firstDataSource);
    }
}

Side note (as this also deals with the use of multiple data sources): if you use auto-configuration to run data initialization scripts, you may notice that they are not run against the datasource you'd expect. For that issue, take a look at this: https://github.com/spring-projects/spring-boot/issues/9528. A sketch of one possible workaround follows.
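One possible workaround, sketched here only as an illustration (the script names and the appDataSource qualifier are my assumptions, not taken from the linked issue), is to run the initialization scripts yourself against the datasource you actually intend:

import javax.sql.DataSource;

import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.ClassPathResource;
import org.springframework.jdbc.datasource.init.DataSourceInitializer;
import org.springframework.jdbc.datasource.init.ResourceDatabasePopulator;

@Configuration
public class AppDataInitConfig {

    // Explicitly run schema/data scripts on the intended datasource instead of
    // relying on the auto-configured initialization, which may not target the
    // datasource you expect when several are defined.
    @Bean
    public DataSourceInitializer appDataSourceInitializer(
            @Qualifier("appDataSource") DataSource appDataSource) {
        ResourceDatabasePopulator populator = new ResourceDatabasePopulator();
        populator.addScript(new ClassPathResource("schema-app.sql")); // placeholder script names
        populator.addScript(new ClassPathResource("data-app.sql"));

        DataSourceInitializer initializer = new DataSourceInitializer();
        initializer.setDataSource(appDataSource);
        initializer.setDatabasePopulator(populator);
        return initializer;
    }
}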

  • Getting this exception while running the batch: org.hsqldb.HsqlException: user lacks privilege or object not found: BATCH_JOB_INSTANCE
    Mar 20, 2019 at 9:32
  • Spring Batch 4.1.1 - this does not work because the postprocessor does not know about the populated attribute, and still attempts to set the autowired datasource but finds a list of candidates instead.
    – pojo-guy
    Apr 3, 2019 at 17:27
Answer 5 (score 1)

You can define the beans below; just make sure your application.properties file has the entries they need:

@Configuration
@PropertySource("classpath:application.properties")
public class DataSourceConfig {

    @Primary
    @Bean(name = "abcDataSource")
    @ConfigurationProperties(prefix = "abc.datasource")
    public DataSource dataSource() {
        return DataSourceBuilder.create().type(HikariDataSource.class).build();
    }


    @Bean(name = "xyzDataSource")
    @ConfigurationProperties(prefix = "xyz.datasource")
    public DataSource xyzDataSource() {
        return DataSourceBuilder.create().type(HikariDataSource.class).build();
    }
}

application.properties

abc.datasource.jdbc-url=XXXXX
abc.datasource.username=XXXXX
abc.datasource.password=xxxxx
abc.datasource.driver-class-name=org.postgresql.Driver

xyz.datasource.jdbc-url=XXXXX
xyz.datasource.username=XXXXX
xyz.datasource.password=xxxxx
xyz.datasource.driver-class-name=XXXXX

For reference, see: Spring Boot Configure and Use Two DataSources.
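To show how these two named beans might then be consumed, here is a small sketch; the JdbcTemplate bean names are mine, not part of this answer:

import javax.sql.DataSource;

import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.core.JdbcTemplate;

@Configuration
public class JdbcTemplateConfig {

    // One JdbcTemplate per named datasource; @Qualifier picks the exact bean.
    @Bean
    public JdbcTemplate abcJdbcTemplate(@Qualifier("abcDataSource") DataSource abcDataSource) {
        return new JdbcTemplate(abcDataSource);
    }

    @Bean
    public JdbcTemplate xyzJdbcTemplate(@Qualifier("xyzDataSource") DataSource xyzDataSource) {
        return new JdbcTemplate(xyzDataSource);
    }
}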

Answer 6 (score 0)

First, create a custom BatchConfigurer:

@Configuration
@Component
public class TwoDataSourcesBatchConfigurer implements BatchConfigurer {

    @Autowired
    @Qualifier("dataSource1")
    DataSource dataSource;

    @Override
    public JobExplorer getJobExplorer() throws Exception {
        ...
    }

    @Override
    public JobLauncher getJobLauncher() throws Exception {
        ...
    }

    @Override
    public JobRepository getJobRepository() throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        // use the autowired data source
        factory.setDataSource(dataSource);
        factory.setTransactionManager(getTransactionManager());
        factory.afterPropertiesSet();
        return factory.getObject();
    }

    @Override
    public PlatformTransactionManager getTransactionManager() throws Exception {
        ...
    }

}
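For completeness, the elided methods could be filled in roughly as follows. This is only a sketch of one common setup (JobExplorerFactoryBean, SimpleJobLauncher, DataSourceTransactionManager), not necessarily what the author had in mind:

// Possible bodies for the methods elided above (drop into TwoDataSourcesBatchConfigurer).
// Additional imports needed:
// import org.springframework.batch.core.explore.JobExplorer;
// import org.springframework.batch.core.explore.support.JobExplorerFactoryBean;
// import org.springframework.batch.core.launch.JobLauncher;
// import org.springframework.batch.core.launch.support.SimpleJobLauncher;
// import org.springframework.jdbc.datasource.DataSourceTransactionManager;

@Override
public JobExplorer getJobExplorer() throws Exception {
    JobExplorerFactoryBean factory = new JobExplorerFactoryBean();
    factory.setDataSource(dataSource); // same autowired "dataSource1"
    factory.afterPropertiesSet();
    return factory.getObject();
}

@Override
public JobLauncher getJobLauncher() throws Exception {
    SimpleJobLauncher launcher = new SimpleJobLauncher();
    launcher.setJobRepository(getJobRepository());
    launcher.afterPropertiesSet();
    return launcher;
}

@Override
public PlatformTransactionManager getTransactionManager() throws Exception {
    // Transactions for the batch metadata run against the same datasource.
    return new DataSourceTransactionManager(dataSource);
}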

Then,

@Configuration
@EnableBatchProcessing
@ComponentScan("package")
public class JobConfig {
    // define job, step, ...
}
