
Call ItemStream open() and close() within a transaction [BATCH-2240] #1361

Open
spring-projects-issues opened this issue May 29, 2014 · 3 comments

@spring-projects-issues commented May 29, 2014

Jimmy Praet opened BATCH-2240 and commented

Shouldn't the ItemStream open() and close() methods be called within a transaction? JSR-352 also describes this in paragraph "11.6 Regular Chunk Processing", although it is indicated as being optional.

```
1. [begin transaction]
2. <->ItemReader.open  // thread A
3. <->ItemWriter.open  // thread A
4. [commit transaction]
```

My use case is a writer that wraps another writer delegate and transparently keeps track (in a database) of some business-specific statistics about the file that is being written to. Some of this metadata is already persisted in the open() method. I would like these inserts to be rolled back when the open() of the delegate throws an exception.
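A minimal sketch of the problem being described. The types and names here are hypothetical stand-ins (the real org.springframework.batch.item.ItemStream#open also takes an ExecutionContext), and a plain list stands in for the database table:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-in for org.springframework.batch.item.ItemStream;
// the real interface also takes an ExecutionContext in open().
interface Stream {
    void open();
    void close();
}

// Sketch of the wrapper described above. It records business statistics
// about the file in a "database" (here just a list) during open(), then
// delegates. All names are illustrative.
class StatisticsTrackingWriter implements Stream {
    private final Stream delegate;
    private final List<String> statsTable;

    StatisticsTrackingWriter(Stream delegate, List<String> statsTable) {
        this.delegate = delegate;
        this.statsTable = statsTable;
    }

    @Override
    public void open() {
        statsTable.add("file-statistics-row"); // metadata persisted first
        delegate.open();                       // if this throws, the row above
                                               // remains: open() runs outside
                                               // any transaction
    }

    @Override
    public void close() {
        delegate.close();
    }
}
```

If the delegate's open() fails, the metadata row is left behind, which is exactly the behavior the reporter wants a transaction to prevent.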


Reference URL: http://stackoverflow.com/questions/26234720/jsr-352-spring-batch-transaction-management

2 votes, 4 watchers

@spring-projects-issues commented Nov 3, 2014

Michael Minella commented

What you are describing as a use case isn't the intent of the ItemReader#open method, nor the reason it's transactional in JSR-352. If you have business-specific metadata that should be persisted at the beginning of a step (around the same time the open method is called), StepExecutionListener#beforeStep would be a better place for that type of processing. That listener would also allow for compensating logic in the StepExecutionListener#afterStep method if there was an actual error.
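The listener approach being suggested might be sketched like this, using simplified stand-ins for Spring Batch's types (the real StepExecutionListener#beforeStep and #afterStep take a StepExecution, and afterStep returns an ExitStatus):

```java
import java.util.List;

// Simplified stand-in for Spring Batch's step exit status.
enum StepStatus { COMPLETED, FAILED }

// Simplified stand-in for a StepExecutionListener; names are illustrative.
class MetadataStepListener {
    private final List<String> statsTable; // stands in for a database table

    MetadataStepListener(List<String> statsTable) {
        this.statsTable = statsTable;
    }

    // Analogous to StepExecutionListener#beforeStep: persist the metadata
    // once, at the start of the step.
    void beforeStep() {
        statsTable.add("file-statistics-row");
    }

    // Analogous to StepExecutionListener#afterStep: compensating logic
    // undoes the metadata if the step failed.
    void afterStep(StepStatus status) {
        if (status == StepStatus.FAILED) {
            statsTable.remove("file-statistics-row");
        }
    }
}
```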

In the JSR-352 scenario, the ItemReader#open and similar methods are transactional because of the expert group's understanding that in an EE environment, you can't enlist a transactional resource in a global transaction without being in a transaction. This would prevent a JMS reader, or a JDBC reader that uses a new connection, from participating in the global transaction if ItemReader#open weren't wrapped in a transaction.

In either case, there is nothing stopping you from using a TransactionTemplate in the StepExecutionListener#beforeStep or ItemReader#open.
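To illustrate the "wrap open() in a TransactionTemplate" suggestion without pulling in the real org.springframework.transaction.support.TransactionTemplate (which needs a PlatformTransactionManager), this sketch substitutes a toy template that snapshots an in-memory table and restores it on failure; all names are illustrative:

```java
import java.util.ArrayList;
import java.util.List;

// Toy substitute for Spring's TransactionTemplate: runs a callback and
// restores the table to its previous state if the callback throws.
class ToyTransactionTemplate {
    private final List<String> table;

    ToyTransactionTemplate(List<String> table) { this.table = table; }

    void execute(Runnable work) {
        List<String> snapshot = new ArrayList<>(table); // "begin transaction"
        try {
            work.run();                                 // "commit": keep changes
        } catch (RuntimeException e) {
            table.clear();                              // "rollback"
            table.addAll(snapshot);
            throw e;
        }
    }
}

// open() wraps metadata persistence and the delegate call in one unit of work.
class TransactionalOpenWriter {
    private final ToyTransactionTemplate tx;
    private final List<String> statsTable;
    private final Runnable delegateOpen;

    TransactionalOpenWriter(ToyTransactionTemplate tx, List<String> statsTable,
                            Runnable delegateOpen) {
        this.tx = tx;
        this.statsTable = statsTable;
        this.delegateOpen = delegateOpen;
    }

    void open() {
        tx.execute(() -> {
            statsTable.add("file-statistics-row");
            delegateOpen.run(); // a failure here now rolls the row back
        });
    }
}
```

With the real classes, the same shape would use TransactionTemplate#execute around the metadata insert and the delegate's open() call.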

@spring-projects-issues commented Nov 5, 2014

Jimmy Praet commented

The EnrichedFileWriter in my specific use case is part of a batch component library that is shared across many batch projects. I can't really use the StepExecutionListener#beforeStep approach because:

  • I don't want the users to need to register the writer as a listener
  • Often the writer is configured as a dynamically created delegate of e.g. a MultiResourceItemWriter or a ClassifierCompositeItemWriter, so beforeStep isn't the right point in the lifecycle because we don't yet know how many writers will be created

Wrapping only the EnrichedFileWriter#open in a TransactionTemplate would help, but:

  • requires an additional TransactionManager dependency
  • if there are multiple streams, a failure in the open() of one stream won't roll back the other streams

Do you see any drawbacks to making opening and closing the streams transactional? Isn't the JSR-352 scenario you are describing also applicable to using Spring Batch in a Java EE environment?

@spring-projects-issues commented Nov 5, 2014

Michael Minella commented

With regards to your specific points:

  • requires an additional TransactionManager dependency - You already have that dependency: Spring Batch itself relies on a transaction manager, so one is already available in any batch application.
  • if there are multiple streams, a failure in the open() of one stream won't roll back the other streams - True. You'd need to wrap all of the streams in a composite stream that handles the transactionality.
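The composite-stream idea from the second bullet might look like the sketch below: a single ItemStream-like wrapper that opens all delegates as one unit of work, closing any already-opened delegates if a later open() fails. The types are hypothetical stand-ins, not the real Spring Batch CompositeItemStream, and rolling back database work done in open() would additionally need a transaction around this, as discussed above:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-in for org.springframework.batch.item.ItemStream.
interface Stream {
    void open();
    void close();
}

// Composite that treats open() across all delegates as one unit of work:
// if any delegate fails to open, the ones already opened are closed again
// before the exception propagates.
class AtomicOpenCompositeStream implements Stream {
    private final List<Stream> delegates;

    AtomicOpenCompositeStream(List<Stream> delegates) {
        this.delegates = delegates;
    }

    @Override
    public void open() {
        List<Stream> opened = new ArrayList<>();
        try {
            for (Stream s : delegates) {
                s.open();
                opened.add(s);
            }
        } catch (RuntimeException e) {
            for (Stream s : opened) {
                try { s.close(); } catch (RuntimeException ignored) { }
            }
            throw e;
        }
    }

    @Override
    public void close() {
        for (Stream s : delegates) {
            s.close();
        }
    }
}
```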

Another option that I didn't mention before is the use of a ChunkListener. While you don't seem to like the idea of registering a listener, that particular listener is transactional and would roll back if an error occurred. You would have to add some minor logic to determine whether you're at the beginning of the step (the first chunk) or not, but it should be straightforward.
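The ChunkListener idea might be sketched as follows, with a stand-in type (the real org.springframework.batch.core.ChunkListener#beforeChunk receives a ChunkContext); here a simple flag supplies the first-chunk check:

```java
// Stand-in for org.springframework.batch.core.ChunkListener; the real
// beforeChunk takes a ChunkContext. Because beforeChunk runs inside the
// chunk's transaction, work done here is rolled back if the chunk fails.
// The first-chunk flag is the "minor logic" mentioned above.
class FirstChunkInitListener {
    private boolean firstChunkSeen = false;
    private final Runnable initWork; // e.g. persist the business metadata

    FirstChunkInitListener(Runnable initWork) {
        this.initWork = initWork;
    }

    void beforeChunk() {
        if (!firstChunkSeen) {
            firstChunkSeen = true;
            initWork.run(); // runs once, inside the first chunk's transaction
        }
    }
}
```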

The drawbacks of making these methods transactional I see are:

  1. It encourages the incorrect use of these methods - we already have lifecycle methods for this use case, as mentioned before, and this would muddle the line of when to use which method.
  2. It feels like a heavy-handed approach for an outlying condition. The number of use cases where these methods being transactional is useful is very limited, IMHO.

With regards to the JSR-352 technical limitation and its application to Spring Batch, it does apply. But as I mentioned above, it's a scenario that doesn't happen very often (in my experience), so handling it with a TransactionTemplate is typically an acceptable approach.
