
I have my POJO like this:

@Data
@NoArgsConstructor
@AllArgsConstructor

public class FileInfo {
    
    private String filepath;
    private String ignorestr1;
    private String firstname;
    private String lastname;
    private String employeeid;
    private String applicantid;
    private String createdate;
    private String startretdate;
    private String retlength;
    private String emporapplicant;
    
}

And my ItemReader is like this:

@Bean
@StepScope
@Qualifier("FileInfoItemReader")
@DependsOn("partitioner")
public FlatFileItemReader<FileInfo> FileInfoItemReader(@Value("#{stepExecutionContext['fileName']}") String filename)
        throws MalformedURLException {
    return new FlatFileItemReaderBuilder<FileInfo>().name("FileInfoItemReader").delimited().delimiter("|")
            .names(new String[] { "filepath", "ignorestr1", "firstname", "lastname", "employeeid", "applicantid", "createdate", "startretdate", "retlength", "emporapplicant" })
            .fieldSetMapper(new BeanWrapperFieldSetMapper<FileInfo>() {
                {
                    setTargetType(FileInfo.class);
                }
            }).resource(new UrlResource(filename)).build();
}

Update:

My complete BatchConfig:

@Configuration
@EnableBatchProcessing
public class BatchConfiguration {

    private static final Logger log = LoggerFactory.getLogger(BatchConfiguration.class);
    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Autowired
    private FlatFileItemReader<FileInfo> FileInfoItemReader;

    @Bean("partitioner")
    @StepScope
    public Partitioner partitioner() {
        log.info("In Partitioner");

        MultiResourcePartitioner partitioner = new MultiResourcePartitioner();
        ResourcePatternResolver resolver = new PathMatchingResourcePatternResolver();
        Resource[] resources = null;
        try {
            resources = resolver.getResources("*.csv");
        } catch (IOException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
        partitioner.setResources(resources);
        partitioner.partition(10);
        return partitioner;
    }

    @Bean
    public FileInfoItemProcessor processor() {
        return new FileInfoItemProcessor();
    }


    @Bean
    public FileInfoWriter<FileInfo> writer() {
        return new FileInfoWriter<FileInfo>();
    }

    @Bean
    public Job importUserJob(JobCompletionNotificationListener listener, Step step1) {
        return jobBuilderFactory.get("importUserJob").incrementer(new RunIdIncrementer()).listener(listener)
                .flow(masterStep()).end().build();
    }

    @Bean
    public Step step1() {
        return stepBuilderFactory.get("step1").<FileInfo, FileInfo>chunk(10).reader(FileInfoItemReader).processor(processor()).writer(writer())
                .build();
    }

    @Bean
    public ThreadPoolTaskExecutor taskExecutor() {
        ThreadPoolTaskExecutor taskExecutor = new ThreadPoolTaskExecutor();
        taskExecutor.setMaxPoolSize(25);
        taskExecutor.setCorePoolSize(25);
        taskExecutor.setQueueCapacity(25);
        taskExecutor.afterPropertiesSet();
        return taskExecutor;
    }

    @Bean
    @Qualifier("masterStep")
    public Step masterStep() {
        return stepBuilderFactory.get("masterStep").partitioner("step1", partitioner()).step(step1())
                .taskExecutor(taskExecutor()).build();
    }

    @Bean
    @StepScope
    @Qualifier("FileInfoItemReader")
    @DependsOn("partitioner")
    public FlatFileItemReader<FileInfo> FileInfoItemReader(@Value("#{stepExecutionContext['fileName']}") String filename)
            throws MalformedURLException {
        return new FlatFileItemReaderBuilder<FileInfo>().name("FileInfoItemReader").delimited().delimiter("|")
                .names(new String[] { "I", "can", "put", "literally", "anything", "here", "and", "it", "works", "just_fine" })
                .fieldSetMapper(new BeanWrapperFieldSetMapper<FileInfo>() {
                    {
                        setTargetType(FileInfo.class);
                    }
                }).resource(new UrlResource(filename)).build();
    }
}

Doubt/Question: My mapping follows the strict sequence of fields in my FileInfo. If I switch the position of any of the private String .... fields in my POJO, the mappings of the CSV rows' elements get messed up. Is that the expected behavior? If not, what am I missing here? What is the correct way to make the mapping independent of the POJO field order?

1 Answer

The BeanWrapperFieldSetMapper uses reflection to map fields, so their declaration order in your class should not matter.

The order of the names you declare in the array passed to .names() corresponds to the order of the columns in the input file, not to the declaration order of the fields in the POJO.
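
To illustrate, here is a minimal sketch (mine, not part of the original answer) of the two pieces the builder wires together, using the FileInfo class from the question and a made-up sample record: a DelimitedLineTokenizer assigns the names to the columns by position, and BeanWrapperFieldSetMapper then binds each named token to the POJO property with the same name via its setter, so the field declaration order inside FileInfo is irrelevant.

import org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper;
import org.springframework.batch.item.file.transform.DelimitedLineTokenizer;
import org.springframework.batch.item.file.transform.FieldSet;

public class NamesMappingSketch {

    public static void main(String[] args) throws Exception {
        // Names are assigned to columns by position: the 3rd column is "firstname".
        DelimitedLineTokenizer tokenizer = new DelimitedLineTokenizer("|");
        tokenizer.setNames("filepath", "ignorestr1", "firstname", "lastname", "employeeid",
                "applicantid", "createdate", "startretdate", "retlength", "emporapplicant");

        // The mapper binds tokens to properties by name, using the POJO's setters,
        // so where the fields are declared inside FileInfo does not matter.
        BeanWrapperFieldSetMapper<FileInfo> mapper = new BeanWrapperFieldSetMapper<>();
        mapper.setTargetType(FileInfo.class);

        // Made-up sample record with 10 pipe-delimited columns.
        FieldSet fieldSet = tokenizer.tokenize("/tmp/in.csv|x|John|Doe|1001|A7|2021-01-01|2021-06-30|12|E");
        FileInfo info = mapper.mapFieldSet(fieldSet);
        System.out.println(info.getFirstname()); // prints "John"
    }
}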

EDIT: Added a sample.

persons.csv

1,foo
2,bar

SO69224405.java

import javax.sql.DataSource;

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.item.ItemWriter;
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.builder.FlatFileItemReaderBuilder;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.FileSystemResource;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseBuilder;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseType;

@Configuration
@EnableBatchProcessing
public class SO69224405 {

    @Bean
    public FlatFileItemReader<Person> itemReader() {
        return new FlatFileItemReaderBuilder<Person>()
                .name("personItemReader")
                .resource(new FileSystemResource("persons.csv"))
                .delimited()
                .names("id", "name") // with names("name", "id") the example fails
                .targetType(Person.class)
                .build();
    }

    @Bean
    public ItemWriter<Person> itemWriter() {
        return items -> items.forEach(System.out::println);
    }

    @Bean
    public Job job(JobBuilderFactory jobs, StepBuilderFactory steps) {
        return jobs.get("job")
                .start(steps.get("step")
                        .<Person, Person>chunk(5)
                        .reader(itemReader())
                        .writer(itemWriter())
                        .build())
                .build();
    }

    public static void main(String[] args) throws Exception {
        ApplicationContext context = new AnnotationConfigApplicationContext(SO69224405.class);
        JobLauncher jobLauncher = context.getBean(JobLauncher.class);
        Job job = context.getBean(Job.class);
        jobLauncher.run(job, new JobParameters());
    }

    @Bean
    public DataSource dataSource() {
        return new EmbeddedDatabaseBuilder()
                .setType(EmbeddedDatabaseType.H2)
                .addScript("/org/springframework/batch/core/schema-h2.sql")
                .build();
    }

    public static class Person {
        // the declaration order of fields should not matter
        private String name;
        private int id;

        public Person() {
        }

        public int getId() {
            return id;
        }

        public void setId(int id) {
            this.id = id;
        }

        public String getName() {
            return name;
        }

        public void setName(String name) {
            this.name = name;
        }

        public String toString() {
            return "Person{id=" + id + ", name='" + name + '\'' + '}';
        }
    }

}

pom.xml

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.example</groupId>
    <artifactId>so69224405</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>so69224405</name>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <java.version>1.8</java.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.springframework.batch</groupId>
            <artifactId>spring-batch-core</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-jdbc</artifactId>
        </dependency>
        <dependency>
            <groupId>com.h2database</groupId>
            <artifactId>h2</artifactId>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-simple</artifactId>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <configuration>
                    <source>${java.version}</source>
                    <target>${java.version}</target>
                </configuration>
            </plugin>
        </plugins>
    </build>

    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-dependencies</artifactId>
                <version>2.5.4</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
        </dependencies>
    </dependencyManagement>

</project>

6 Comments

The .names() array has no effect at all, no matter what Strings I put there. Am I missing something here? I am using Spring Boot 2.5.4.
It should; this is the order of the columns as they appear in the file. I added a sample that shows that the order of fields in the POJO does not matter, while it does matter in the array of names passed to the mapper.
Hi Mahmoud, thanks a lot for helping me out. Please take a look at my complete BatchConfig, added as an update to my initial post, and see the .names: it just works fine with any values. What is going wrong here? Looks like an interesting and serious issue. Please let me know if you need any other details.
That's curious. Please provide a minimal, complete example that I can download and run so I can debug the case. That said, you should not call partitioner.partition(10); Spring Batch calls it for you when the partitioned step runs (see the sketch after these comments).
Hmmm... I found something interesting. Initially I had Lombok. Then I removed the Lombok annotations and used old-fashioned constructors, getters/setters and toString as per your sample, and it started working perfectly fine, as expected and as you mentioned. And now, even though I switched back to Lombok, it is still working. Weird stuff. Needless to mention, I did a number of clean/install rebuilds in between. Not too sure if this issue is related to Lombok. Anyway, I am back on track, Mahmoud. You are awesome. Accepting your answer for all your hard work and effort. Thanks a ton.
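
Following up on the comment about partitioner.partition(10), here is a minimal sketch of how the partitioner bean from the question could look without the manual call. This is only a sketch, not code from the answer, and it assumes the input files live under a hypothetical directory /data/input; adjust the resource pattern to your setup.

@Bean("partitioner")
@StepScope
public Partitioner partitioner() {
    MultiResourcePartitioner partitioner = new MultiResourcePartitioner();
    ResourcePatternResolver resolver = new PathMatchingResourcePatternResolver();
    try {
        // Hypothetical location; point the pattern at wherever your CSV files actually live.
        Resource[] resources = resolver.getResources("file:/data/input/*.csv");
        partitioner.setResources(resources);
    } catch (IOException e) {
        throw new IllegalStateException("Could not resolve input CSV files", e);
    }
    // No partitioner.partition(...) call here: Spring Batch invokes it with the
    // configured grid size when the partitioned master step runs.
    return partitioner;
}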
