Shouldn't the ItemStream open() and close() methods be called within a transaction? JSR 352 also describes this in paragraph "11.6
Regular Chunk Processing", although it is indicated as being optional.
<->ItemReader.open // thread A
<->ItemWriter.open // thread A
My use case is a writer that wraps another writer delegate and transparently keeps track (in a database) of some business-specific statistics of the file that is being written to. Some of this metadata is already persisted in the open() method. I would like it to be rolled back when the delegate's open() throws an exception.
The use case you are describing isn't the intent of the ItemReader#open method, nor the reason it's transactional in JSR-352. If you have business-specific metadata that should be persisted at the beginning of a step (around the same time the open method is called), StepExecutionListener#beforeStep would be a better place for that type of processing. That listener would also allow for compensating logic in StepExecutionListener#afterStep if there was an actual error.
In the JSR-352 scenario, the reason ItemReader#open and similar methods are transactional was the expert group's understanding that in an EE environment, you can't enlist a transactional resource with a global transaction without being in a transaction. This would prevent a JMS reader or a JDBC reader that uses a new connection from participating in the global transaction if ItemReader#open weren't wrapped in a transaction.
In either case, there is nothing stopping you from using a TransactionTemplate in the StepExecutionListener#beforeStep or ItemReader#open.
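A minimal sketch of that TransactionTemplate approach, applied to the writer's open() method. The Spring types are stubbed with small stand-ins so the example is self-contained; in a real project you would use org.springframework.transaction.support.TransactionTemplate (e.g. its executeWithoutResult method) and implement org.springframework.batch.item.ItemStream, and the metadata "table" and delegate are hypothetical placeholders:

```java
import java.util.ArrayList;
import java.util.List;

// Stand-in for TransactionTemplate#executeWithoutResult(...).
interface TxTemplate { void executeWithoutResult(Runnable work); }

// Toy "transaction": snapshot the metadata table and restore it on failure,
// simulating what a real transaction manager's rollback would do.
class ToyTx {
    static TxTemplate rollbackOnError(List<String> table) {
        return work -> {
            List<String> snapshot = new ArrayList<>(table);
            try {
                work.run();
            } catch (RuntimeException e) {
                table.clear();
                table.addAll(snapshot); // simulated rollback
                throw e;
            }
        };
    }
}

class EnrichedFileWriterSketch {
    private final TxTemplate tx;
    private final List<String> metadataTable; // stands in for the statistics table
    private final Runnable delegateOpen;      // stands in for delegate.open(executionContext)

    EnrichedFileWriterSketch(TxTemplate tx, List<String> metadataTable, Runnable delegateOpen) {
        this.tx = tx;
        this.metadataTable = metadataTable;
        this.delegateOpen = delegateOpen;
    }

    void open() {
        // Persist the metadata and open the delegate in one transaction, so a
        // failing delegate open() also rolls the metadata row back.
        tx.executeWithoutResult(() -> {
            metadataTable.add("file-stats-row");
            delegateOpen.run();
        });
    }
}
```

The point is only the shape: the metadata write and the delegate open share one transactional boundary, so either both commit or neither does.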
The EnrichedFileWriter in my specific use case is part of a batch component library that is shared across many batch projects. I can't really use the StepExecutionListener#beforeStep approach because:
- I don't want users to need to register the writer as a listener.
- The writer is often configured as a dynamic delegate for e.g. a MultiResourceItemWriter or a ClassifierCompositeItemWriter, so beforeStep isn't really the right time in the lifecycle because we don't know yet how many writers will be created.
Wrapping only the EnrichedFileWriter#open in a TransactionTemplate would help, but:
- it requires an additional TransactionManager dependency
- if there are multiple streams, a failure in the open() of one stream won't roll back the other streams
Do you see any drawbacks to making the opening and closing of the streams transactional? Isn't the JSR-352 scenario you are describing also applicable to using spring-batch in a JEE environment?
requires an additional TransactionManager dependency - You already have that dependency: Spring Batch itself relies on a TransactionManager, so it's already available in any batch application.
if there are multiple streams, a failure in the open() of one stream won't roll back the other streams - True. You'd need to wrap all streams in a composite stream that handles the transactionality.
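A sketch of what such a composite could look like. ItemStream is stubbed with a minimal stand-in here so the example is self-contained; a real implementation would implement org.springframework.batch.item.ItemStream (whose methods take an ExecutionContext) and the transaction boundary would be a TransactionTemplate. The idea is simply that all delegate opens run inside one transaction, so a failure in any of them rolls back the work done by the others:

```java
import java.util.List;
import java.util.function.Consumer;

// Minimal stand-in for org.springframework.batch.item.ItemStream.
interface StreamDelegate { void open(); void close(); }

// Composite that opens and closes all delegate streams inside a single
// transaction boundary (e.g. TransactionTemplate#executeWithoutResult in a
// real Spring application).
class TransactionalCompositeStream {
    private final List<StreamDelegate> delegates;
    private final Consumer<Runnable> txBoundary;

    TransactionalCompositeStream(List<StreamDelegate> delegates, Consumer<Runnable> txBoundary) {
        this.delegates = delegates;
        this.txBoundary = txBoundary;
    }

    void open() {
        // One transaction around all opens: either every stream's metadata
        // is committed, or none is.
        txBoundary.accept(() -> delegates.forEach(StreamDelegate::open));
    }

    void close() {
        txBoundary.accept(() -> delegates.forEach(StreamDelegate::close));
    }
}
```

An exception from any delegate's open() propagates out of the transaction boundary, which is what triggers the rollback of the already-opened delegates' database writes.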
Another option that I didn't mention before is the use of a ChunkListener. While you don't seem to like the idea of registering a listener, that particular listener is transactional and would roll back if an error occurred. You would have to add some minor logic to determine if you're at the beginning of the step (first chunk) or not, but it should be straightforward.
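A sketch of that first-chunk detection. Spring Batch invokes ChunkListener#beforeChunk(ChunkContext) inside the chunk's transaction, so anything persisted there rolls back if the chunk fails. The interface is stubbed here to keep the example self-contained, the metadata write is a hypothetical placeholder, and note that the in-memory flag would need to live in the ExecutionContext to survive a restart:

```java
// Sketch of the ChunkListener approach: persist the metadata on the first
// chunk only, inside the chunk's transaction.
class FirstChunkMetadataListener {
    private boolean metadataPersisted = false;
    private final Runnable persistMetadata; // hypothetical metadata write

    FirstChunkMetadataListener(Runnable persistMetadata) {
        this.persistMetadata = persistMetadata;
    }

    // Corresponds to ChunkListener#beforeChunk(ChunkContext), which Spring
    // Batch calls before each chunk, within the chunk's transaction.
    void beforeChunk() {
        if (!metadataPersisted) {
            metadataPersisted = true;
            persistMetadata.run(); // rolls back with the chunk on failure
        }
    }
}
```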
The drawbacks I see of making these methods transactional are:
- It encourages incorrect use of these methods. We already have lifecycle methods for this use case, as mentioned before; this would muddle the line of when to use which method.
- It feels like a heavy-handed approach for an outlying condition. The number of use cases where these methods being transactional is useful is very limited, IMHO.
With regards to the JSR-352 technical limitation and its application to Spring Batch, it does apply. But as I mentioned above, it's a scenario that doesn't happen very often (in my experience), so handling it with a TransactionTemplate is typically an acceptable approach.