
Bugfix/flash fill max token #309

Merged: 2 commits merged into main from bugfix/flash-fill-max-token on Mar 18, 2023
Conversation

ad12 (Collaborator) commented Mar 18, 2023

max token must be initialized after init
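
For context, here is a minimal sketch of the kind of initialization-order fix the description suggests, assuming the problem was that the max token value was assigned before the component's `__init__` finished and was reset to a default. The class and attribute names below (`Component`, `FlashFill`, `max_tokens`) are illustrative, not necessarily Meerkat's actual API:

```python
# Hypothetical sketch of the initialization-order bug; names are illustrative,
# not the library's actual API.


class Component:
    def __init__(self, **kwargs):
        # In this sketch the base initializer resets component state to
        # defaults, so attributes assigned on `self` before it runs are
        # overwritten.
        self.max_tokens = 1  # illustrative default
        for key, value in kwargs.items():
            setattr(self, key, value)


class FlashFill(Component):
    def __init__(self, df, target_column: str, max_tokens: int = 1):
        # Buggy order (sketch): assigning self.max_tokens here, before calling
        # super().__init__(), lets the base class reset it to the default.
        super().__init__(df=df, target_column=target_column)
        # Order matching the PR description (as read here): initialize the
        # max token value only after the base __init__ has run.
        self.max_tokens = max_tokens
```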

codecov-commenter commented Mar 18, 2023

Codecov Report

Merging #309 (bb6de15) into main (995fdb4) will increase coverage by 0.35%.
The diff coverage is 0.00%.

@@            Coverage Diff             @@
##             main     #309      +/-   ##
==========================================
+ Coverage   67.03%   67.38%   +0.35%     
==========================================
  Files         211      211              
  Lines       11926    11943      +17     
  Branches     1781     1787       +6     
==========================================
+ Hits         7994     8048      +54     
+ Misses       3508     3469      -39     
- Partials      424      426       +2     
Flag Coverage Δ
unittests 67.38% <0.00%> (+0.35%) ⬆️

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
...p/src/lib/component/contrib/flash_fill/__init__.py 12.22% <0.00%> (ø)

... and 8 files with indirect coverage changes


seyuboglu merged commit a0564c6 into main on Mar 18, 2023
seyuboglu deleted the bugfix/flash-fill-max-token branch on March 18, 2023 at 18:53