Chat addition #17
Conversation
@@ -16,6 +17,22 @@

log = getLogger(__name__)

class LLMVChatPair(db.Model):
    __tablename__ = 'llmv_chat_pair'
What's pair mean in this context? Prompt generation pair?
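If I read the model right, a "pair" here seems to be one user prompt plus the model's reply, so a multi-turn history is an ordered list of such pairs. A stdlib-only sketch of that reading (the `ChatPair` name and `json()` method mirror the PR's model but are assumptions, not its actual code):

```python
from dataclasses import dataclass

@dataclass
class ChatPair:
    """One turn of a chat: the user's prompt and the model's reply."""
    prompt: str
    generation: str

    def json(self):
        return {'prompt': self.prompt, 'generation': self.generation}

# A multi-turn history is then just an ordered list of pairs, matching
# the `[h.json() for h in llmv_generation.history]` pattern in the diff.
history = [ChatPair('Hi', 'Hello!'), ChatPair('And 2+2?', '4')]
serialized = [h.json() for h in history]
```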
history = llmv_generation.history
history = [h.json() for h in llmv_generation.history]
log.debug(f'Found history "{history}" ')
else:
This is the block that handles multi model setups?
@llm_verifications.route('/chat_limit/<challenge_id>', methods=['GET'])
@authed_only
def chat_limit(challenge_id):
    """Define a route for for showing users their answer submissions."""
"""Define a route for for showing users their answer submissions.""" | |
"""Define a route for showing users their answer submissions.""" |
@@ -37,8 +38,6 @@ def upgrade(op=None):
    sa.Column("model_id", sa.Integer(), nullable=False),
    sa.Column("user_id", sa.Integer(), nullable=True),
    sa.Column("team_id", sa.Integer(), nullable=True),
    sa.Column("text", sa.Text(), nullable=False),
Moved to the table that supports multi-turn above, if I understand correctly?
@@ -51,16 +88,10 @@ def generate_for_challenge():
    # Combine the pre-prompt and user-provided prompt with a space between them.
    prompt = request.json["prompt"]
    log.debug(f'pre-prompt {preprompt} and user-provided-prompt: "{prompt}"')
    idempotency_uuid = str(uuid4())
Just curious, why is this needed?
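One common reason for an idempotency key (my guess at the intent here): if the HTTP call times out but the backend actually ran the generation, a retry carrying the same UUID lets the server return the already-computed result instead of paying for a second model call. A minimal sketch with an in-memory cache (all names hypothetical, not the plugin's API):

```python
from uuid import uuid4

# Hypothetical server-side view: results are cached per idempotency key,
# so a retried request is answered from the cache rather than re-run.
_results = {}

def generate(idempotency_uuid, prompt):
    if idempotency_uuid in _results:       # duplicate of an earlier call
        return _results[idempotency_uuid]
    result = f'generation for {prompt!r}'  # stand-in for the model call
    _results[idempotency_uuid] = result
    return result

key = str(uuid4())
first = generate(key, 'hello')
retried = generate(key, 'hello')  # e.g. after a client-side timeout
```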
llmv_models.py (Outdated)
id = db.Column(db.Integer, primary_key=True)
generation_id = db.Column(db.Integer, db.ForeignKey('llmv_generation.id', ondelete='CASCADE'))
prompt = db.Column(db.Text)
generation = db.Column(db.Text)
Consider making these nullable in case the model fails for some reason and a generation is not returned
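To illustrate the failure mode the comment points at, with plain sqlite3: a NOT NULL `generation` column makes it impossible to persist the row when the model returns nothing, while a nullable column at least records the attempt (table names here are illustrative only):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute("CREATE TABLE pair_strict (prompt TEXT, generation TEXT NOT NULL)")
conn.execute("CREATE TABLE pair_lax (prompt TEXT, generation TEXT)")

# Simulate a failed model call: we have a prompt but no generation.
try:
    conn.execute("INSERT INTO pair_strict (prompt) VALUES ('hi')")
    failed = False
except sqlite3.IntegrityError:   # NOT NULL constraint rejects the row
    failed = True

conn.execute("INSERT INTO pair_lax (prompt) VALUES ('hi')")  # accepted
```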
@@ -16,6 +17,22 @@

log = getLogger(__name__)

class LLMVChatPair(db.Model):
    __tablename__ = 'llmv_chat_pair'
    __table_args__ = {'extend_existing': True}
Is this table defined elsewhere?
https://docs.sqlalchemy.org/en/13/core/metadata.html#sqlalchemy.schema.Table.params.extend_existing
llmv_routes.py (Outdated)
if 'generation_id' in request.json:
    log.info("Found old generation id, using that")
    llmv_generation = LLMVGeneration.query.filter_by(id=request.json['generation_id']).first_or_404()
    if llmv_generation.status != "pending":
Style nit: all these conditions return the same response. Consider collapsing them unless the plan is to change the returned JSON:

if (llmv_generation.status != "pending" or
        llmv_generation.account_id != get_current_user().id or
        llmv_generation.challenge_id != challenge.id):
    response = {'success': False, 'data': {'text': "This challenge is complete.", "id": -1}}
    return jsonify(response)
It should have custom error messages.
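A sketch of how the collapsed check could still return a distinct message per condition (the function name, messages, and stand-in object are hypothetical, not the PR's code):

```python
from types import SimpleNamespace

def reject_reason(generation, user_id, challenge_id):
    """Return an error message for the first failing condition, else None."""
    if generation.status != 'pending':
        return 'This challenge is complete.'
    if generation.account_id != user_id:
        return 'This generation belongs to a different account.'
    if generation.challenge_id != challenge_id:
        return 'This generation is for a different challenge.'
    return None

# Stand-in for an LLMVGeneration row, just for demonstration.
gen = SimpleNamespace(status='pending', account_id=1, challenge_id=7)
```

The route can then return one JSON shape with `reason` filled in, keeping the single early-return while satisfying the custom-message request.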
@@ -49,6 +48,16 @@ def upgrade(op=None):
    sa.ForeignKeyConstraint(["user_id"], ["users.id"]),
    sa.PrimaryKeyConstraint("id"),
)
op.create_table(
It's good to see migrations. Will a contest ever be migrated, though? It feels like a new DB would be started each time an event is run, meaning no migration would need to take place.
Can't hurt to have it, just asking.
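For what it's worth, the upgrade step is cheap even if events usually start from a fresh DB; conceptually it just adds the new multi-turn table next to the existing one. A sqlite3 sketch of that (column names inferred from the diff, so treat them as assumptions):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
# Pre-existing table, as an already-deployed event might have it.
conn.execute("""CREATE TABLE llmv_generation (
    id INTEGER PRIMARY KEY, model_id INTEGER NOT NULL,
    user_id INTEGER, team_id INTEGER)""")

def upgrade(conn):
    """What the migration does, conceptually: add the multi-turn table."""
    conn.execute("""CREATE TABLE llmv_chat_pair (
        id INTEGER PRIMARY KEY,
        generation_id INTEGER REFERENCES llmv_generation(id),
        prompt TEXT, generation TEXT)""")

upgrade(conn)
tables = {r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")}
```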
This is where the tables are created; the llm_model file just reads the established tables.
Co-authored-by: Ravin Kumar <7213793+canyon289@users.noreply.github.com>
@comath I fixed the conflict
… into chat-addition
const result = await response.json();
this.chat_limit = result.data.chat_limit;
if (this.chat_limit > 1) {
    this.is_single_turn = false;
Why two bools? Can't single-turn vs. multi-turn be differentiated with just one?
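Agreed that the second flag looks derivable: since `chat_limit` already encodes it, `is_single_turn` could be computed rather than tracked as separate state. Sketch in Python for brevity (the PR's code is JS; class and attribute names are assumptions):

```python
class ChatConfig:
    """Derive single-turn vs. multi-turn from chat_limit alone."""
    def __init__(self, chat_limit):
        self.chat_limit = chat_limit

    @property
    def is_single_turn(self):
        # No second bool to keep in sync: one chat allowed means single turn.
        return self.chat_limit <= 1
```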
json={"uuid": idempotency_uuid, 'prompt': prompt, "system": preprompt, "model": model, "history": history})
except (requests.Timeout, requests.ConnectionError) as error:
    log.debug(error)
    # try again, with idempotency key
Should something happen here? Connection error is just handled twice?
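One option, sketched generically: loop over a bounded number of attempts, reusing the same payload (and thus the same idempotency UUID), and re-raise if every attempt fails, rather than catching the error twice with no action. The transport is stubbed out here and every name is hypothetical:

```python
class Timeout(Exception):
    """Stand-in for requests.Timeout so the sketch needs no network."""

def post_with_retry(send, payload, attempts=2):
    """Call `send(payload)` up to `attempts` times; the payload, including
    its idempotency uuid, is reused unchanged on every retry."""
    last_error = None
    for _ in range(attempts):
        try:
            return send(payload)
        except Timeout as error:
            last_error = error
    raise last_error  # all attempts failed; surface the error to the caller

calls = []
def flaky(payload):
    """Fake transport: times out once, then succeeds."""
    calls.append(payload['uuid'])
    if len(calls) == 1:
        raise Timeout('first attempt timed out')
    return {'text': 'ok'}

result = post_with_retry(flaky, {'uuid': 'abc', 'prompt': 'hi'})
```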
From a static read, LGTM. Merge it and fixups can happen on main.
Merging so I can continue fixing things up on main
Todo