
I have a requirement to get the data from a database and write that data to files based on the filename given in the database.

This is how data is defined in the database:

FILE_NAME  | REC_ID | NAME
-----------|--------|-----
file_1.csv | 1      | ABC
file_1.csv | 2      | BCD
file_1.csv | 3      | DEF
file_2.csv | 4      | FGH
file_2.csv | 5      | DEF
file_3.csv | 6      | FGH
file_3.csv | 7      | DEF
file_4.csv | 8      | FGH

As you can see, the file names are stored alongside the data in the database. Spring Batch should read this data and write each record to the file named in its FILE_NAME column: file_1.csv should contain only records 1, 2, and 3; file_2.csv only records 4 and 5; and so on.

Is it possible to use MultiResourceItemWriter for this requirement? (Please note that the entire file name is dynamic and needs to be retrieved from the database.)

2 Answers


I'm not sure, but I don't think there is an easy way to achieve this out of the box. You could build your own ItemWriter like this:

public class DynamicItemWriter  implements ItemStream, ItemWriter<YourEntry> {

    private Map<String, FlatFileItemWriter<YourEntry>> writers = new HashMap<>();

    private LineAggregator<YourEntry> lineAggregator;

    private ExecutionContext executionContext;

    @Override
    public void open(ExecutionContext executionContext) throws ItemStreamException {
        this.executionContext = executionContext;
    }

    @Override
    public void update(ExecutionContext executionContext) throws ItemStreamException {
    }

    @Override
    public void close() throws ItemStreamException {
        for (FlatFileItemWriter<YourEntry> f : writers.values()) {
            f.close();
        }
    }

    @Override
    public void write(List<? extends YourEntry> items) throws Exception {
        for (YourEntry item : items) {
            FlatFileItemWriter<YourEntry> ffiw = getFlatFileItemWriter(item);
            ffiw.write(Arrays.asList(item));
        }
    }

    public LineAggregator<YourEntry> getLineAggregator() {
        return lineAggregator;
    }

    public void setLineAggregator(LineAggregator<YourEntry> lineAggregator) {
        this.lineAggregator = lineAggregator;
    }


    public FlatFileItemWriter<YourEntry> getFlatFileItemWriter(YourEntry item) {
        String key = item.getFileName();
        FlatFileItemWriter<YourEntry> rr = writers.get(key);
        if (rr == null) {
            rr = new FlatFileItemWriter<>();
            rr.setLineAggregator(lineAggregator);
            try {
                UrlResource resource = new UrlResource("file:" + key);
                rr.setResource(resource);
                rr.open(executionContext);
            } catch (MalformedURLException e) {
                throw new ItemStreamException("Invalid file name: " + key, e);
            }
            writers.put(key, rr);
        }
        return rr;
    }
}

and configure it as a writer:

<bean id="csvWriter" class="com....DynamicItemWriter">
    <property name="lineAggregator">
        <bean class="org.springframework.batch.item.file.transform.DelimitedLineAggregator">
            <property name="delimiter" value=","/>
            <property name="fieldExtractor" ref="csvFieldExtractor"/>
        </bean>
    </property>
</bean>
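Outside of Spring, the routing idea behind this custom writer can be sketched in plain Java (all class and method names below are hypothetical and for illustration only): group each record under its target file name, then write each group to its own file.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class FileRouter {

    // A record as it comes back from the database: target file plus payload.
    static final class Row {
        final String fileName;
        final int recId;
        final String name;
        Row(String fileName, int recId, String name) {
            this.fileName = fileName;
            this.recId = recId;
            this.name = name;
        }
    }

    // Group rows by file name, then write each group to its own CSV file.
    static void route(List<Row> rows, Path dir) throws IOException {
        Map<String, List<String>> byFile = new LinkedHashMap<>();
        for (Row r : rows) {
            byFile.computeIfAbsent(r.fileName, k -> new ArrayList<>())
                  .add(r.recId + "," + r.name);
        }
        for (Map.Entry<String, List<String>> e : byFile.entrySet()) {
            Files.write(dir.resolve(e.getKey()), e.getValue());
        }
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("router");
        route(List.of(
                new Row("file_1.csv", 1, "ABC"),
                new Row("file_1.csv", 2, "BCD"),
                new Row("file_2.csv", 4, "FGH")), dir);
        System.out.println(Files.readAllLines(dir.resolve("file_1.csv")));
    }
}
```

The DynamicItemWriter above does the same thing incrementally: instead of collecting all rows first, it opens a delegate writer per file name on first use and keeps it in a map for reuse.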

3 Comments

Thanks Gilles for your answer and time. Because of the complexity of this requirement, I gave up on customizing the framework and instead handled the file writing in a simple tasklet (of course, restarting the batch job then becomes challenging).
@forumuser1 How did you do it, by the way? I am having the same issue, and dynamically creating ItemWriters gives many errors...
Thanks for this useful answer. In my case, I had a "File not writable" issue on the second writer.open, because 'restarted' was true after the first run, so I had to add @StepScope to the writer. Related issue: stackoverflow.com/a/26729596/2641426

In spring-batch, you can do this using ClassifierCompositeItemWriter.

Since ClassifierCompositeItemWriter gives you access to your object during write, you can write custom logic to instruct spring to write to different files.

Take a look at the sample below. ClassifierCompositeItemWriter needs an implementation of the Classifier interface. Here I use a lambda to implement the classify() method of the Classifier interface; classify() is where you create your ItemWriter. In the example below, a FlatFileItemWriter takes the name of the file from the item itself and creates a resource for it.

@Bean
public ClassifierCompositeItemWriter<YourDataObject> yourDataObjectItemWriter(
    Classifier<YourDataObject, ItemWriter<? super YourDataObject>> itemWriterClassifier
) {
  ClassifierCompositeItemWriter<YourDataObject> compositeItemWriter = new ClassifierCompositeItemWriter<>();
  compositeItemWriter.setClassifier(itemWriterClassifier);
  return compositeItemWriter;
}

@Bean
public Classifier<YourDataObject, ItemWriter<? super YourDataObject>> itemWriterClassifier() {
  return yourDataObject -> {
    String fileName = yourDataObject.getFileName();

    BeanWrapperFieldExtractor<YourDataObject> fieldExtractor = new BeanWrapperFieldExtractor<>();
    fieldExtractor.setNames(new String[]{"recId", "name"});
    DelimitedLineAggregator<YourDataObject> lineAggregator = new DelimitedLineAggregator<>();
    lineAggregator.setFieldExtractor(fieldExtractor);

    FlatFileItemWriter<YourDataObject> itemWriter = new FlatFileItemWriter<>();
    itemWriter.setResource(new FileSystemResource(fileName));
    itemWriter.setAppendAllowed(true);
    itemWriter.setLineAggregator(lineAggregator);
    itemWriter.setHeaderCallback(writer -> writer.write("REC_ID,NAME"));

    itemWriter.open(new ExecutionContext());
    return itemWriter;
  };
}

Finally, you can attach your ClassifierCompositeItemWriter in your batch step like you normally attach your ItemWriter.

@Bean
public Step myCustomStep(
    StepBuilderFactory stepBuilderFactory
) {
  return stepBuilderFactory.get("myCustomStep")
      .<?, ?>chunk(1000)
      .reader(myCustomReader())
      .writer(yourDataObjectItemWriter(itemWriterClassifier(null)))
      .build();
}

NOTE: As pointed out in comments by @Ping, a new writer will be created for each chunk, which is usually a bad practice and not an optimal solution. A better solution would be to maintain a hashmap of filename and writer so that you can reuse the writer.
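That caching idea can be sketched without Spring (class and method names here are illustrative, not part of any library): keep one open writer per file name in a map, create it lazily with computeIfAbsent, reuse it across chunks, and close them all at the end.

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.util.HashMap;
import java.util.Map;

public class CachingWriterPool implements AutoCloseable {

    private final Path dir;
    // One open writer per file name, created lazily and reused across chunks.
    private final Map<String, BufferedWriter> writers = new HashMap<>();

    public CachingWriterPool(Path dir) {
        this.dir = dir;
    }

    public void write(String fileName, String line) throws IOException {
        BufferedWriter w = writers.computeIfAbsent(fileName, name -> {
            try {
                return Files.newBufferedWriter(dir.resolve(name),
                        StandardOpenOption.CREATE, StandardOpenOption.APPEND);
            } catch (IOException e) {
                throw new UncheckedIOException(e);
            }
        });
        w.write(line);
        w.newLine();
    }

    // How many distinct files have been opened so far.
    public int openWriterCount() {
        return writers.size();
    }

    @Override
    public void close() throws IOException {
        for (BufferedWriter w : writers.values()) {
            w.close();
        }
        writers.clear();
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("pool");
        try (CachingWriterPool pool = new CachingWriterPool(dir)) {
            pool.write("file_1.csv", "1,ABC");
            pool.write("file_1.csv", "2,BCD");
            pool.write("file_2.csv", "4,FGH");
        }
    }
}
```

In a Spring Batch setting, the classifier would consult such a cache instead of constructing a fresh FlatFileItemWriter per item, and a step-level callback would close the cached writers when the step finishes.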

5 Comments

In the above example, will a new writer be created for each chunk? If not, how does the ClassifierCompositeItemWriter know that it needs to reuse a ItemWriter based on some attribute in YourObject?
In the above example, a new writer will be created for each chunk, because itemWriterClassifier is called for every object it wants to write and creates a new writer on the spot. That is obviously bad and was written only for demonstration purposes. But you can easily save a reference to the writer and reuse it. stackoverflow.com/questions/53501152/…
The example that you linked to doesn't consider dynamic file names. There are a fixed set of writer beans to chose from which is known at compile time itself rather than runtime. In the current example from the question, if the OP gets file5 in the database FILE_NAME column in the future, there is no writer available to handle it. Writers should be instantiated at runtime and passed the file name to write to. Writers should also be cached so they can be reused if writing to the same file again.
You can simply create a hashmap or some key-value pair to dynamically create a writer if it doesn't exist, and then use the same one if it does exist.
Exactly. I believe this needs to be added to the answer?
