If I am the author of a generic library—e.g. the Swift Standard Library—and I have an algorithm for one of my protocols (e.g. Collection) that I know can have a lower complexity bound for a refined protocol (e.g. RandomAccessCollection), the prescription is to make the algorithm a requirement of the base protocol Collection and implement it in extensions of both Collection and RandomAccessCollection. I can then document the complexity of the algorithm as dependent on the conformance of the model to RandomAccessCollection, right?
Wrong! I have to say the complexity depends on the conformance to RandomAccessCollection unless the model is a generic type whose most general form conforms only to Collection, in which case the complexity depends on whether the conformance to RandomAccessCollection is statically known at the point where the algorithm is invoked. In other words, there's nothing I can say about it that's both intelligible and accurate.
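A minimal sketch of the dispatch behavior I mean, using toy protocols in place of Collection and RandomAccessCollection (all names here are illustrative, and the commented results reflect witness selection as I understand it):

```swift
protocol Base {
    func cost() -> String   // the algorithm, declared as a requirement on the base
}
extension Base {
    func cost() -> String { "O(n)" }   // general implementation for all models
}
protocol Refined: Base {}
extension Refined {
    func cost() -> String { "O(1)" }   // faster implementation for refined models
}

struct Concrete: Refined {}
struct Generic<T>: Base {}
extension Generic: Refined where T: Equatable {}   // conditional refinement

func measure<M: Base>(_ m: M) -> String { m.cost() }

// Unconditional conformance: the Refined witness is chosen.
print(measure(Concrete()))       // "O(1)"

// Conditional conformance: the most general form of Generic conforms
// only to Base, so the witness recorded for Base.cost() is the O(n)
// one, even though Generic<Int> happens to satisfy Refined.
print(measure(Generic<Int>()))   // "O(n)"
```

The generic caller `measure` has no way to recover the O(1) implementation for `Generic<Int>`, which is exactly the documentation problem: the complexity depends on where and how the call is made, not just on what the model conforms to.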
Furthermore, in this case, as far as I can tell, there's no way at all to access the specialized version of the algorithm from generic code, nor any way for the author of the conditionally-conforming type to explicitly request the specialized implementation.
Because the result of `joined()` is a type that only conditionally conforms to `Collection`, `Array.init` will not find the `underestimatedCount` implementation for collections, and instead of preallocating storage for all elements as it's supposed to, it will make log(N) allocations, growing the array's buffer exponentially.
Oh, wow, it's worse than a trip-up: there's no way to fix this! Even if `FlattenSequence` implemented `underestimatedCount` in its conditional conformance to `Collection`, it wouldn't get used.
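To make the scenario concrete, here's a sketch of the situation being described (the comments state the behavior as reported in this thread; exact results may vary by compiler and standard-library version):

```swift
let ranges = [0..<1_000, 1_000..<2_000]

// `joined()` returns a FlattenSequence, whose conformance to
// `Collection` is conditional on the base's element type.
let flat = ranges.joined()

// Array.init(_:) is generic over Sequence and consults
// `underestimatedCount` to preallocate storage. Because the
// `Collection` conformance is conditional, the initializer
// reportedly gets the Sequence default rather than the Collection
// implementation, so the buffer grows by repeated reallocation
// instead of a single up-front allocation.
let array = Array(flat)
```

And per the point above, even a hand-written `underestimatedCount` inside the conditional `Collection` conformance wouldn't be reached through the generic `Sequence`-constrained initializer, because the witness was already chosen when the unconditional conformance was established.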