Conversation

jamesbraza
Collaborator

The fh-llm-client dependency added in #757 had an unpinned minimum version, which turned out to cause problems in downstream CI:

Installed 137 packages in 61ms
...
 + fh-llm-client==0.0.1
...

    from paperqa.agents.task import GradablePaperQAEnvironment, LitQAv2TaskDataset
.venv/lib/python3.11/site-packages/paperqa/__init__.py:3: in <module>
    from llmclient import (
E   ImportError: cannot import name 'EmbeddingModel' from 'llmclient' 
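The failure mode here is that an unconstrained dependency specifier lets the resolver pick any release, including an old one (`fh-llm-client==0.0.1`) that predates the `EmbeddingModel` export. A lower bound excludes those releases. A minimal sketch using the `packaging` library, with `0.0.2` as a hypothetical minimum version (the actual bound pinned by this PR may differ):

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# No constraint: every release satisfies the specifier,
# so the resolver is free to select the stale 0.0.1.
unpinned = SpecifierSet("")
print(Version("0.0.1") in unpinned)   # True

# Hypothetical lower bound: 0.0.1 no longer satisfies the
# specifier, so the broken release can't be installed.
pinned = SpecifierSet(">=0.0.2")
print(Version("0.0.1") in pinned)     # False
```

In `pyproject.toml` terms, this corresponds to changing a bare `"fh-llm-client"` entry to one with a floor such as `"fh-llm-client>=0.0.2"`.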

@jamesbraza jamesbraza added the bug Something isn't working label Jan 6, 2025
@jamesbraza jamesbraza requested a review from a team January 6, 2025 19:13
@jamesbraza jamesbraza self-assigned this Jan 6, 2025
@dosubot dosubot bot added the size:XS This PR changes 0-9 lines, ignoring generated files. label Jan 6, 2025
@dosubot dosubot bot added the lgtm This PR has been approved by a maintainer label Jan 6, 2025
@jamesbraza jamesbraza merged commit c062fb5 into main Jan 6, 2025
5 checks passed
@jamesbraza jamesbraza deleted the pinning-fh-llm-client branch January 6, 2025 19:27
