Handling of duplicate keys in xsl:map #169

The draft specifications include proposals to enhance the handling of duplicate key values in xsl:map. This issue seeks WG endorsement of these changes. See XSLT section 21.2.1.

map:merge currently provides a set of fixed "policy" options for handling of duplicates; xsl:map always raises an error.

Rather than adopt the fixed set of policies in map:merge, I propose that xsl:map should accept a callback function to process duplicates. For example, forming a sequence-concatenation of the values can be achieved using on-duplicates="op(',')". This approach allows all the options provided by map:merge, and more; for example, the duplicate entries can be combined using string-join() if so desired.
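To make the proposal concrete, here is a minimal sketch of how it might look in a stylesheet. The on-duplicates attribute and the op() function are the proposed features described above; the employee/department data is invented purely for illustration.

```xml
<!-- Sketch, assuming the proposed on-duplicates attribute: build a map from
     department to all employee names in that department. In XSLT 3.0 the
     second entry for a given department would raise a duplicate-key error;
     with on-duplicates="op(',')" the duplicate values are appended into a
     sequence under that key. -->
<xsl:variable name="names-by-dept" as="map(*)">
  <xsl:map on-duplicates="op(',')">
    <xsl:for-each select="//employee">
      <xsl:map-entry key="string(@dept)" select="string(@name)"/>
    </xsl:for-each>
  </xsl:map>
</xsl:variable>
```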
Comments
https://www.saxonica.com/papers/xmlprague-2020mhk.pdf says "the required …". Would … work? Would … ? (I have no idea why I'd want to do the second one, but it leaps to mind: I have fairly often wanted to group all the examples of something under one key.)
The idea is to support a single-pass operation where, when you encounter a duplicate, you decide what to do with it there and then. The two arguments are (a) the current value (which may result from previous processing of duplicates), and (b) the new value, and you're expected to combine them into a new current value. If you want to build a sequence containing all the values, you can use op(',').
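A hedged sketch of that two-argument contract, assuming the proposed syntax; the inline function and the numeric data are illustrative only and are not taken from the proposal:

```xml
<!-- Hypothetical illustration: on each duplicate key the callback receives the
     current value and the new value and returns the value to keep. Here the
     larger of the two is retained, so $m('a') ends up as 5. -->
<xsl:variable name="m" as="map(*)">
  <xsl:map on-duplicates="function($current, $new) { max(($current, $new)) }">
    <xsl:map-entry key="'a'" select="1"/>
    <xsl:map-entry key="'a'" select="5"/>
    <xsl:map-entry key="'b'" select="3"/>
  </xsl:map>
</xsl:variable>
```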
I think I'm thinking of the equivalent of … And I'm not sure how I'd do that with a current value and a new value when there are multiple duplicates. It is entirely reasonable to tell me not to expect to sneak grouping into how xsl:map handles duplicates.
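For comparison, the grouping described in that comment can already be expressed with the standard XPath 3.1 map:merge "combine" policy; this sketch reuses the hypothetical employee data from the earlier example, and the proposed on-duplicates="op(',')" would produce the same map in a single pass:

```xml
<!-- Standard XPath 3.1: merge one single-entry map per employee, collecting
     the values for each duplicate department key into a sequence. -->
<xsl:variable name="names-by-dept" as="map(*)"
    xmlns:map="http://www.w3.org/2005/xpath-functions/map"
    select="map:merge(
              //employee ! map { string(@dept) : string(@name) },
              map { 'duplicates' : 'combine' })"/>
```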
The effect of …
The group accepted this as the status quo at meeting 056.