
Doc: JMS CachingConnectionFactory incompatible with DefaultMessageListenerContainer in some circumstances [SPR-10581] #15210

Closed
spring-issuemaster opened this issue May 23, 2013 · 1 comment

@spring-issuemaster commented May 23, 2013

Daniel Blezek opened SPR-10581 and commented

Using a CachingConnectionFactory in conjunction with a DefaultMessageListenerContainer that implements dynamic scaling via maxMessagesPerTask can result in JMS messages being delivered to cached consumers that are no longer attached to the DefaultMessageListenerContainer. This problem is documented and explained in detail in this forum thread: http://forum.springsource.org/showthread.php?133467-DMLC-maxMessagesPerTask-causes-inability-to-scale-down
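
A minimal sketch of the configuration combination described above (not taken from the original report): the broker ConnectionFactory, queue name, and listener object are illustrative assumptions.

```java
import javax.jms.ConnectionFactory;
import org.springframework.jms.connection.CachingConnectionFactory;
import org.springframework.jms.listener.DefaultMessageListenerContainer;

public class ProblematicJmsSetup {

    // The underlying broker ConnectionFactory is assumed to be supplied by the caller.
    public static DefaultMessageListenerContainer buildContainer(
            ConnectionFactory brokerConnectionFactory, Object messageListener) {

        // Consumer caching is enabled here (it is also the default on CachingConnectionFactory).
        CachingConnectionFactory cachingConnectionFactory =
                new CachingConnectionFactory(brokerConnectionFactory);
        cachingConnectionFactory.setCacheConsumers(true);

        DefaultMessageListenerContainer container = new DefaultMessageListenerContainer();
        container.setConnectionFactory(cachingConnectionFactory);
        container.setDestinationName("exampleQueue"); // hypothetical destination name
        container.setMessageListener(messageListener);

        // Dynamic scaling: between 1 and 10 concurrent consumers, with each task
        // re-evaluated after at most 10 messages.
        container.setConcurrency("1-10");
        container.setMaxMessagesPerTask(10);

        // Issue described above: when the container scales down, the consumers it
        // releases stay cached in the CachingConnectionFactory and may continue to
        // receive messages that no active listener task will process.
        return container;
    }
}
```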

Suggested Fix:

Put a note in the documentation that consumer caching is not compatible with dynamic scaling in DefaultMessageListenerContainer, or issue a warning when a developer attempts to use the two together.


Affects: 3.2 GA

Reference URL: http://forum.springsource.org/showthread.php?133467-DMLC-maxMessagesPerTask-causes-inability-to-scale-down

@spring-issuemaster commented Aug 5, 2013

Juergen Hoeller commented

I've added a warning to DMLC's javadoc for the time being.

We could try to detect a mismatch in configuration at runtime but I'd rather not go that far at this point, since there is always the risk of overreacting to configuration that is perfectly valid within its custom context...

Juergen
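
As a follow-up sketch (not part of the original thread), one way to avoid the mismatch is to keep consumer caching out of the CachingConnectionFactory so that scaling down actually closes consumers; sessions and producers are still cached. The broker ConnectionFactory and destination name below are assumptions for illustration.

```java
import javax.jms.ConnectionFactory;
import org.springframework.jms.connection.CachingConnectionFactory;
import org.springframework.jms.listener.DefaultMessageListenerContainer;

public class ScalingFriendlyJmsSetup {

    public static DefaultMessageListenerContainer buildContainer(
            ConnectionFactory brokerConnectionFactory, Object messageListener) {

        CachingConnectionFactory cachingConnectionFactory =
                new CachingConnectionFactory(brokerConnectionFactory);
        // Disable consumer caching so the listener container fully controls its consumers.
        cachingConnectionFactory.setCacheConsumers(false);

        DefaultMessageListenerContainer container = new DefaultMessageListenerContainer();
        container.setConnectionFactory(cachingConnectionFactory);
        container.setDestinationName("exampleQueue"); // hypothetical destination name
        container.setMessageListener(messageListener);
        container.setConcurrency("1-10");
        container.setMaxMessagesPerTask(10);
        return container;
    }
}
```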
