[codex] Preserve specialized TypeVar bounds #3291

Open

cneuralnetwork wants to merge 1 commit into facebook:main
Bounded TypeVars that appear as class type parameters can carry specialized generic bounds such as `P[Any]`. The recursive `targs` truncation was dropping those arguments whenever the bound class had restricted type parameters, which made protocol classmethod lookup leak the bound class's own TypeVar instead of the specialized `Any` argument. Make truncation detect an actual recursive path back to the same class before erasing class arguments. Add a protocol regression test covering facebook#3285.
Fixes #3285.
Summary
This fixes a false-positive `bad-argument-type` error when looking up a classmethod through a `type[T]` where `T` is a protocol-bounded TypeVar whose bound is itself specialized, for example `T: SQLExpr[Any]`.

The PR changes recursive type-argument truncation so that it only erases generic arguments when there is an actual recursive path back to the same class. Non-recursive specialized bounds such as `P[Any]` are now preserved.
Root Cause

`TParams::truncate_recursive_targs` was intentionally preventing fixpoint growth for recursive TypeVar bounds. The old predicate was too broad: it stripped type arguments from any `ClassType` whose formal type parameters had non-trivial restrictions.

That meant a bound like the one sketched below:
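A minimal sketch of that shape, with stand-in details: `P` and `P_co` come from the description, while the inner type parameter `T_in` and its `int` bound are placeholders.

```python
from typing import Any, Protocol, TypeVar

# P's own type parameter carries a bound, i.e. a non-trivial restriction;
# the concrete int bound here is an arbitrary stand-in.
T_in = TypeVar("T_in", bound=int)

class P(Protocol[T_in]):
    ...

# The specialized bound that the old predicate truncated down to bare P:
P_co = TypeVar("P_co", bound=P[Any])
```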
Such a bound could be truncated from `P[Any]` to bare `P` because `P`'s own type parameter had a bound. Later, attribute lookup on `type[P_co]` instantiated classmethod parameters from bare `P`, so the method leaked `P`'s class TypeVar instead of using the specialized `Any` from the bound.

In the original issue this made the classmethod's signature leak the bound class's own TypeVar instead of the expected `Any`-specialized form; a reconstruction of the before and after shapes follows.
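This is a hedged sketch, not the issue's literal code: it continues the `P`/`P_co` names from above, the classmethod `make` is hypothetical, the leaked rendering is inferred from the description, and the expected form is the one quoted under Validation.

```python
from collections.abc import Sequence
from typing import Any, Protocol, TypeVar

T_in = TypeVar("T_in", bound=int)

class P(Protocol[T_in]):
    @classmethod
    def make(cls, items: Sequence[T_in]) -> T_in: ...

P_co = TypeVar("P_co", bound=P[Any])

def lookup(cls_: type[P_co]) -> None:
    # Before the fix, the bound P[Any] was truncated to bare P, so this
    # lookup rendered roughly as
    #     (Sequence[T_in@P]) -> T_in@P    # leaked class TypeVar
    # After the fix, the Any argument of the bound survives:
    #     (Sequence[Any]) -> Any          # expected specialization
    cls_.make([])
```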
Fix
The truncation logic now walks the restriction graph and only strips `ClassType` arguments when the class's embedded type parameters can reach back to the same class through their own bounds or constraints.

That keeps the recursion guard for shapes like `Bar[Any]` where `Bar` participates in its own bound cycle (see the sketch below), while preserving ordinary specialized bounds used for protocol classmethod lookup.
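A sketch of the kind of cycle that should still be truncated; `Bar` comes from the description, the definitions are assumptions:

```python
from typing import Any, Protocol, TypeVar

# Bar participates in its own bound cycle: its type parameter is bounded
# by Bar[Any], so expanding the bound never reaches a fixpoint without
# truncation.
B = TypeVar("B", bound="Bar[Any]")

class Bar(Protocol[B]):
    ...
```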
Tests

Added a regression test covering a reduced protocol form of the issue; a hedged sketch of its Python shape follows.
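The identifiers here (`Expr`, `of`, `build`) are stand-ins for the reduced form, not the test's literal contents:

```python
from collections.abc import Sequence
from typing import Any, Protocol, Self, TypeVar

T = TypeVar("T")

class Expr(Protocol[T]):
    @classmethod
    def of(cls, items: Sequence[T]) -> Self: ...

E = TypeVar("E", bound=Expr[Any])

def build(expr_cls: type[E], items: Sequence[Any]) -> E:
    # Previously rejected with a false bad-argument-type: the parameter
    # was instantiated from bare Expr as Sequence[T@Expr] instead of
    # Sequence[Any].
    return expr_cls.of(items)
```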
Validation was run locally. I also reran a `reveal_type` probe for the original reproducer and confirmed the classmethod parameter is now specialized as `(Sequence[Any]) -> Any`.
AI Disclosure

This PR was prepared with help from Codex.