
fix: Fix tokenizer hitting tail-recursion limit #125

Merged
merged 3 commits into main on Mar 8, 2024

Conversation

kitten
Member

@kitten kitten commented Mar 8, 2024

Summary

TypeScript also has a tail-recursion limit, and the changes in #111, while very effective at reducing parsing overhead, caused us to hit the limit of 1,000 iterations when too many ignored tokens were being tokenized.

This PR addresses this by splitting the recursion of ignored tokens back out, reducing the iteration count of tokens for each chain of ignored tokens back to 1.

That does mean we still have a limit of 1,000 tokens, including chains of ignored tokens, but preliminary testing suggests this is quite reasonable and hard to hit in an app with composed fragments.
I actually like that this imposes a hard limit on which queries can reasonably be parsed with gql.tada; when the limit is hit, there's a high chance a fragment boundary has been missed.
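The split described above can be sketched at the type level. This is a hedged, simplified illustration (not the actual gql.tada tokenizer): the names `skipIgnored` and `takeTokens` and the character sets are illustrative. The idea is that a run of ignored characters (whitespace, commas, line breaks) is consumed by a dedicated tail-recursive helper, so the outer loop spends only one of its permitted iterations per meaningful token, no matter how long the ignored run between tokens is.

```typescript
// Tail-recursive helper: strips any leading run of ignored characters.
// Its recursion depth scales with the ignored run, but it no longer
// counts against the outer tokenizer's iteration budget.
type skipIgnored<In extends string> =
  In extends `${' ' | ',' | '\t' | '\n'}${infer Rest}`
    ? skipIgnored<Rest>
    : In;

// Outer loop: one iteration per kept character, with each chain of
// ignored characters collapsed into a single skipIgnored step.
type takeTokens<In extends string, Out extends string[] = []> =
  skipIgnored<In> extends `${infer Char}${infer Rest}`
    ? takeTokens<Rest, [...Out, Char]>
    : Out;

// Long ignored runs don't inflate the outer iteration count.
type Result = takeTokens<'a ,,  b'>;

const result: Result = ['a', 'b'];
console.log(result);
```

Without the split, every ignored character would consume one iteration of the single recursive loop, which is how long chains of whitespace and commas could exhaust the 1,000-iteration budget.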

Set of changes

  • Re-add the skipIgnored type alias to the tokenizer
  • Restore the old structure of the takeSelectionRec type
    • Note: this is unrelated; while the new structure looked better, it actually wasn't

@kitten kitten requested a review from JoviDeCroock March 8, 2024 23:27

changeset-bot bot commented Mar 8, 2024

🦋 Changeset detected

Latest commit: 3a757fd

The changes in this PR will be included in the next version bump.


@kitten
Member Author

kitten commented Mar 8, 2024

Merging as a hotfix, so this isn't blocking anyone.

@kitten kitten merged commit 7995be6 into main Mar 8, 2024
1 check passed
@kitten kitten deleted the fix/tokenizer-size-limit branch March 8, 2024 23:30
@github-actions github-actions bot mentioned this pull request Mar 8, 2024
@kitten kitten added this to the Are We Fast Yet? milestone Mar 22, 2024