
MultiResourceItemWriter - Job restart failing (File is not writable) if last file first chunk failed [BATCH-2759] #845

Open · spring-issuemaster opened this issue Sep 20, 2018 · 2 comments


spring-issuemaster (Collaborator) commented Sep 20, 2018

Aitor Perez opened BATCH-2759 and commented

Hi,

I'm developing a Spring Batch application that writes database rows to CSV files. The database is huge, so I need to generate multiple partitioned output CSV files (outputfile.csv.1, outputfile.csv.2, outputfile.csv.3, ...).
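For reference, here is a minimal sketch of the setup being described, using plain setters (Spring Batch 4 APIs; `Row` and all bean/field names are placeholders, not taken from the actual application):

```java
import org.springframework.batch.item.file.FlatFileItemWriter;
import org.springframework.batch.item.file.MultiResourceItemWriter;
import org.springframework.batch.item.file.transform.PassThroughLineAggregator;
import org.springframework.core.io.FileSystemResource;

public class WriterConfig {

    // Row is a hypothetical domain type standing in for one database row.
    public MultiResourceItemWriter<Row> multiResourceWriter() {
        // Delegate CSV writer; its resource is assigned per partition by
        // the MultiResourceItemWriter, so none is set here.
        FlatFileItemWriter<Row> delegate = new FlatFileItemWriter<>();
        delegate.setName("csvWriter");
        delegate.setLineAggregator(new PassThroughLineAggregator<>());

        // Rolls over to outputfile.csv.1, .csv.2, ... every 100 items.
        MultiResourceItemWriter<Row> writer = new MultiResourceItemWriter<>();
        writer.setName("multiCsvWriter");
        writer.setResource(new FileSystemResource("outputfile.csv"));
        writer.setDelegate(delegate);
        writer.setItemCountLimitPerResource(100);
        writer.setSaveState(true); // state in the execution context drives restart
        return writer;
    }
}
```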

While testing different data cases, I found a problem when restarting a failed job in the following situation: when the execution fails in the first chunk of a new partitioned file, that new file is never created (I tried shouldDeleteIfEmpty = false, but the file is still not created).

When I restart the job, it seems to expect this last (missing) partitioned file.

Example:
ChunkSize = 50
ItemCountLimitPerResource = 100
Forced exception at item 325
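
A step wired for those numbers might look roughly like this (an illustrative sketch, not the reporter's actual configuration; `Row` is again a placeholder type):

```java
@Bean
public Step exportStep(StepBuilderFactory steps,
                       JdbcCursorItemReader<Row> reader,
                       ItemProcessor<Row, Row> processor,     // forced failure at item 325
                       MultiResourceItemWriter<Row> writer) { // rolls over every 100 items
    return steps.get("exportStep")
            .<Row, Row>chunk(50) // ChunkSize = 50
            .reader(reader)
            .processor(processor)
            .writer(writer)
            .build();
}
```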

The execution fails, and in the output I can see the generated files (outputfile.csv.1, outputfile.csv.2, outputfile.csv.3) containing the first 300 rows (6 chunks and 3 completed files).

When I restart the job, I'm getting the exception:

org.springframework.batch.item.ItemStreamException: File is not writable: [XXXX...\outputfile.csv.4]

It seems the restart expects this file, but it was never created because of the error in its first chunk (again, shouldDeleteIfEmpty = false did not help).

Reviewing the job repository in the database, I can inspect the step execution context: MultiResourceItemWriter.resource.index is already 4 (while resource.item.count is 0):

{"map":[{"entry":[{"string":"FlatFileItemWriter.current.count","long":700},{"string":"JdbcCursorItemReader.read.count","int":300},{"string":"MultiResourceItemWriter.resource.index","int":4},{"string":"FlatFileItemWriter.written","long":50},{"string":["batch.taskletType","org.springframework.batch.core.step.item.ChunkOrientedTasklet"]},{"string":"MultiResourceItemWriter.resource.item.count","int":0},{"string":["batch.stepType","org.springframework.batch.core.step.tasklet.TaskletStep"]}]}]}

I tried configuring the FlatFileItemWriter flags with different values (appendAllowed, shouldDeleteIfEmpty, shouldDeleteIfExists), but the same exception is thrown.
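
For example (a hypothetical sketch of what was tried; these three setters exist on FlatFileItemWriter, but the exact combinations used are not shown in the report):

```java
// Applied to the delegate FlatFileItemWriter; none of these combinations
// avoids the "File is not writable" exception on restart.
delegate.setAppendAllowed(true);
delegate.setShouldDeleteIfEmpty(false);
delegate.setShouldDeleteIfExists(true);
```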

Thanks


Affects: 4.0.1


spring-issuemaster (Collaborator, Author) commented Sep 20, 2018

Aitor Perez commented

In the example above, the forced exception (at item 325) is thrown by the ItemProcessor, so this chunk may never reach the writer at all.
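
For illustration, a counter-based processor along these lines would reproduce that failure mode (a sketch only; the real trigger condition is not shown in the report, and a plain in-memory counter like this is not itself restart-safe):

```java
import org.springframework.batch.item.ItemProcessor;

// Throws while processing the 325th item, i.e. inside the chunk covering
// items 301-350, before the writer ever opens outputfile.csv.4.
public class FailAt325Processor implements ItemProcessor<Row, Row> {

    private int count = 0;

    @Override
    public Row process(Row item) {
        if (++count == 325) {
            throw new IllegalStateException("Forced failure at item 325");
        }
        return item;
    }
}
```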


jeffreysmooth commented May 5, 2020

When can we expect a fix for this? The file is supposed to be created when we restart the job.
