fix: train edit nlu sample #905
Conversation
Walkthrough
The changes introduce a loading state variable for mutation operations in the NLU sample editing and creation flows.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant NlpSampleForm
    participant useUpdate
    participant NlpDatasetSample

    User->>NlpSampleForm: Edit and submit sample
    NlpSampleForm->>useUpdate: Trigger updateSample mutation
    useUpdate-->>NlpSampleForm: Return isUpdatingSample (loading state)
    NlpSampleForm->>NlpDatasetSample: Pass isMutationLoading prop
    NlpDatasetSample->>User: Reflect loading state in UI (e.g., disable button)
```
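To make the flow above concrete, here is a minimal sketch of how the loading state might be threaded through. `useUpdate`, `EntityType`, and `NlpDatasetSample` are taken from the walkthrough; the import paths, prop shapes, and handler body are assumptions rather than the project's actual code.

```tsx
// Hypothetical sketch only — assumes useUpdate returns a react-query style
// { mutate, isLoading } pair and that NlpDatasetSample accepts isMutationLoading.
import { NlpDatasetSample } from "./components/NlpTrainForm"; // assumed path
import { useUpdate } from "@/hooks/crud/useUpdate";           // assumed path
import { EntityType } from "@/services/types";                // assumed path

export const NlpSampleFormSketch = ({ sampleId }: { sampleId: string }) => {
  // isUpdatingSample is true while the update mutation is in flight.
  const { mutate: updateSample, isLoading: isUpdatingSample } = useUpdate(
    EntityType.NLP_SAMPLE,
  );

  return (
    <NlpDatasetSample
      submitForm={(params) => updateSample({ id: sampleId, params })}
      // The child uses this to disable its "Validate" button during the mutation.
      isMutationLoading={isUpdatingSample}
    />
  );
};
```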
Actionable comments posted: 4
🔭 Outside diff range comments (1)
api/src/nlp/services/nlp-sample.service.spec.ts (1)

56-363: 🛠️ Refactor suggestion — Missing test cases for new functionality
While the necessary dependencies for the new `automate_inference` functionality have been added, there are no test cases that verify this functionality, which leaves it untested. Add test cases for the new functionality that uses the `automate_inference` setting. I can help you generate appropriate test cases if needed.
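Purely as an illustration of the kind of coverage meant here, a spec along these lines could exercise the setting-driven branch; the mock wiring, variable names, and even the handler name follow the review discussion rather than the actual spec file, so treat them as assumptions.

```ts
// Hypothetical Jest sketch — assumes handleInference exists on the service and that
// settingService/helperService are available in the test module, as in the review.
it("should skip automated inference when automate_inference is disabled", async () => {
  jest.spyOn(settingService, "getSettings").mockResolvedValue({
    chatbot_settings: { automate_inference: false },
  } as any);
  const getHelperSpy = jest.spyOn(helperService, "getDefaultHelper");

  await nlpSampleService.handleInference({
    message: { text: "Hello" },
  } as any);

  // When the setting is off, no NLU helper should be resolved or invoked.
  expect(getHelperSpy).not.toHaveBeenCalled();
});
```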
🧹 Nitpick comments (3)
api/src/nlp/services/nlp-sample.service.ts (3)

282-282: Add null check for settings.chatbot_settings.
There's a potential issue if the settings object doesn't have the expected structure.

```diff
- if (!settings.chatbot_settings.automate_inference) {
+ if (!settings.chatbot_settings?.automate_inference) {
```

290-290: Fix typo in log message.

```diff
- this.logger.debug(`${helper.getName()} infered these entites`, entities);
+ this.logger.debug(`${helper.getName()} inferred these entities`, entities);
```

305-306: Add success log after sample update.
Adding a log message after successfully updating a sample would improve traceability and debugging.

```diff
  await this.repository.updateOne(foundSample.id, {
    type: 'train',
  });
+ this.logger.debug(`Sample ${foundSample.id} updated from inbox to train`);
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (8)

- api/src/nlp/services/nlp-sample.service.spec.ts (3 hunks)
- api/src/nlp/services/nlp-sample.service.ts (3 hunks)
- api/src/setting/seeds/setting.seed-model.ts (1 hunks)
- frontend/public/locales/en/chatbot_settings.json (1 hunks)
- frontend/public/locales/fr/chatbot_settings.json (1 hunks)
- frontend/src/components/nlp/components/NlpSampleForm.tsx (2 hunks)
- frontend/src/components/nlp/components/NlpTrainForm.tsx (4 hunks)
- frontend/src/components/nlp/index.tsx (2 hunks)
⏰ Context from checks skipped due to timeout of 90000ms (2)
- GitHub Check: Frontend Tests
- GitHub Check: API-Tests
🔇 Additional comments (15)
frontend/src/components/nlp/components/NlpSampleForm.tsx (2)

32-32: Good enhancement to track loading state
The addition of `isLoading: isUpdatingSample` properly captures the mutation loading state from the `useUpdate` hook, which is essential for providing user feedback during async operations.

66-70: Correctly passing loading state to child component
The loading state is now properly passed to the `NlpDatasetSample` component, enabling it to reflect the mutation state appropriately in the UI.

frontend/src/components/nlp/index.tsx (2)

65-65: Good enhancement to track creation loading state
The addition of `isLoading` destructured from the `useCreate` hook properly captures the mutation loading state, consistent with the approach in NlpSampleForm.

94-97: Correctly passing loading state to child component
The loading state from the create mutation is now properly passed to the `NlpDatasetSample` component, ensuring consistent UX during data submission.

frontend/src/components/nlp/components/NlpTrainForm.tsx (5)
52-52: Good enhancement to NlpDatasetSampleProps type
Adding the `isMutationLoading` prop to the component's props type ensures proper type checking and documents the new requirement.

58-58: Correctly destructuring the new prop
The `isMutationLoading` prop is properly destructured from the component props.

101-101: Added monitoring of language field
Good addition of the language field to the watched form values, which is now used in validation logic.

171-180: Improved validation logic with clear variable names
The code now uses descriptive variable names to check for empty values in various form fields, which enhances readability and maintainability. The `shouldDisableValidateButton` aggregates all validation checks including the mutation loading state.
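As a rough sketch of the aggregation pattern being praised here (field names are assumptions; the real component may watch different values):

```ts
// Illustrative only — combines per-field emptiness checks with the mutation loading state.
function computeShouldDisableValidateButton(args: {
  text?: string;
  language?: string;
  traitEntities: unknown[];
  isMutationLoading: boolean;
}): boolean {
  const isTextEmpty = !args.text?.trim();
  const isLanguageUnset = !args.language;
  const hasNoTraitEntities = args.traitEntities.length === 0;

  return (
    isTextEmpty || isLanguageUnset || hasNoTraitEntities || args.isMutationLoading
  );
}
```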
458-458: Simplified button disabled state logic
Using the comprehensive `shouldDisableValidateButton` variable instead of inline checks makes the code more readable and maintainable.

api/src/setting/seeds/setting.seed-model.ts (1)
88-94: LGTM! New setting properly structured
The addition of the `automate_inference` setting is well-structured and follows the existing pattern. It's correctly categorized under 'chatbot_settings' with an appropriate type and default value.
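For orientation only, the new seed entry presumably looks something like the following; the field names mirror the pattern described above and are not copied from the file, so treat them as assumptions.

```ts
// Hypothetical shape of the automate_inference seed entry (not the actual file content).
const automateInferenceSetting = {
  group: "chatbot_settings",
  label: "automate_inference",
  value: true, // enabled by default, per the PR description
  type: "checkbox" as const, // the real enum/type value may differ
};
```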
api/src/nlp/services/nlp-sample.service.spec.ts (3)

13-20: New imports added correctly
The imports for `HelperService`, `SettingRepository`, `SettingModel`, `SettingSeeder`, and `SettingService` have been properly added, which are necessary for the enhanced functionality.

77-78: SettingModel properly added to Mongoose feature imports
The `SettingModel` has been correctly added to the Mongoose module feature imports.

92-95: Added required services to test providers
The necessary services (`SettingService`, `SettingRepository`, `SettingSeeder`, `HelperService`) have been properly added to the providers array.

api/src/nlp/services/nlp-sample.service.ts (2)
18-19: New imports are clearly structured and well-organized.
The additions of HelperService, HelperType, and SettingService are appropriate for the new functionality being introduced.
Also applies to: 22-22

52-53: Constructor dependencies are properly injected.
The addition of helperService and settingService as private readonly dependencies follows the established pattern in this class and maintains consistency with dependency injection principles.
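A minimal sketch of the injection pattern these comments refer to; the service's other constructor parameters are omitted, and the import paths are assumptions.

```ts
// Sketch of the NestJS constructor injection described above (existing dependencies omitted).
import { Injectable } from "@nestjs/common";
import { HelperService } from "@/helper/helper.service";             // assumed path
import { SettingService } from "@/setting/services/setting.service"; // assumed path

@Injectable()
export class NlpSampleServiceSketch {
  constructor(
    private readonly helperService: HelperService,
    private readonly settingService: SettingService,
  ) {}
}
```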
"default_storage_helper": "Utilitaire de stockage par défaut", | ||
"automate_inference": "Inférence Automatisée" |
🛠️ Refactor suggestion
Missing help text for new setting
Unlike other settings in this file, there is no corresponding help text added in the "help" section for the new "automate_inference" setting. Users may not understand the purpose of this setting.
Add help text by including the following in the "help" section:

```diff
   "default_llm_helper": "Utilitaire responsable de l'intelligence artificielle générative avancée pour effectuer des tâches telles que la génération de texte, la complétion de chat et les réponses à des requêtes complexes.",
   "default_storage_helper": "Utilitaire de stockage définit l'emplacement où stocker les fichiers joints. Par défaut, le stockage local les conserve localement, mais vous pouvez choisir d'utiliser Minio ou toute autre solution de stockage."
+  "automate_inference": "L'inférence automatisée permet au système de traiter et de comprendre automatiquement les entrées des utilisateurs sans intervention manuelle, améliorant ainsi le temps de réponse et la précision."
 }
```
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

```json
{
  "help": {
    "default_llm_helper": "Utilitaire responsable de l'intelligence artificielle générative avancée pour effectuer des tâches telles que la génération de texte, la complétion de chat et les réponses à des requêtes complexes.",
    "default_storage_helper": "Utilitaire de stockage définit l'emplacement où stocker les fichiers joints. Par défaut, le stockage local les conserve localement, mais vous pouvez choisir d'utiliser Minio ou toute autre solution de stockage.",
    "automate_inference": "L'inférence automatisée permet au système de traiter et de comprendre automatiquement les entrées des utilisateurs sans intervention manuelle, améliorant ainsi le temps de réponse et la précision."
  }
}
```
Force-pushed from 142bd28 to a8002e5
Actionable comments posted: 0
♻️ Duplicate comments (2)
api/src/nlp/services/nlp-sample.service.ts (2)
313-341: ⚠️ Potential issue — Add error handling to the handleInference method.
The method needs error handling around external service calls and database operations, similar to other methods in this class (like `handleNewMessage`). Without proper error handling, issues could propagate and cause unexpected behavior.
Apply this diff to implement error handling:

```diff
   @OnEvent('hook:message:postCreate')
   async handleInference(doc: THydratedDocument<Message>) {
-    const settings = await this.settingService.getSettings();
-    if (!settings.chatbot_settings.automate_inference) {
-      this.logger.log('Automated inference is disabled');
-      return;
-    }
-    this.logger.log('Automated inference running');
-    if ('text' in doc.message) {
-      const helper = await this.helperService.getDefaultHelper(HelperType.NLU);
-      const { entities = [] } = await helper.predict(doc.message.text);
-      this.logger.debug(`${helper.getName()} infered these entites`, entities);
-
-      const foundSample = await this.repository.findOne({
-        text: doc.message.text,
-        type: 'inbox',
-      });
-      if (!foundSample) {
-        return;
-      }
-      await this.nlpSampleEntityService.storeSampleEntities(
-        foundSample,
-        entities,
-      );
-      await this.repository.updateOne(foundSample.id, {
-        type: 'train',
-      });
-    }
+    try {
+      const settings = await this.settingService.getSettings();
+      if (!settings.chatbot_settings?.automate_inference) {
+        this.logger.log('Automated inference is disabled');
+        return;
+      }
+      this.logger.log('Automated inference running');
+      if ('text' in doc.message) {
+        const helper = await this.helperService.getDefaultHelper(HelperType.NLU);
+        const { entities = [] } = await helper.predict(doc.message.text);
+        this.logger.debug(`${helper.getName()} infered these entites`, entities);
+
+        const foundSample = await this.repository.findOne({
+          text: doc.message.text,
+          type: 'inbox',
+        });
+        if (!foundSample) {
+          return;
+        }
+        await this.nlpSampleEntityService.storeSampleEntities(
+          foundSample,
+          entities,
+        );
+        await this.repository.updateOne(foundSample.id, {
+          type: 'train',
+        });
+        this.logger.debug(`Sample ${foundSample.id} updated from inbox to train`);
+      }
+    } catch (error) {
+      this.logger.error('Error during automated inference', error);
+    }
   }
```
316-317: ⚠️ Potential issue — Add null check for chatbot_settings.
Add a null/undefined check for the settings.chatbot_settings object to prevent potential errors if the settings structure changes.

```diff
- if (!settings.chatbot_settings.automate_inference) {
+ if (!settings.chatbot_settings?.automate_inference) {
```
🧹 Nitpick comments (3)
api/src/nlp/services/nlp-sample.service.ts (3)
313-314: Add documentation to explain the handleInference method.
Following the pattern in the rest of the codebase, consider adding JSDoc comments to explain the purpose and functionality of this method.

```diff
+  /**
+   * Handles automated inference for new messages
+   *
+   * When a new message is created and the automate_inference setting is enabled,
+   * this method automatically predicts entities for the message text,
+   * finds a matching sample in the inbox, and updates it to be a training sample.
+   *
+   * @param doc - The message document that was created
+   */
   @OnEvent('hook:message:postCreate')
   async handleInference(doc: THydratedDocument<Message>) {
```

324-325: Fix typo in debug message.
There's a typo in the debug message: "infered" should be "inferred".

```diff
- this.logger.debug(`${helper.getName()} infered these entites`, entities);
+ this.logger.debug(`${helper.getName()} inferred these entities`, entities);
```

337-339: Add debug logging for successful sample update.
Consider adding a debug log entry to record when a sample is successfully updated from inbox to train, similar to other logging patterns in this class.

```diff
  await this.repository.updateOne(foundSample.id, {
    type: 'train',
  });
+ this.logger.debug(`Sample ${foundSample.id} updated from inbox to train`);
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (8)

- api/src/nlp/services/nlp-sample.service.spec.ts (3 hunks)
- api/src/nlp/services/nlp-sample.service.ts (3 hunks)
- api/src/setting/seeds/setting.seed-model.ts (1 hunks)
- frontend/public/locales/en/chatbot_settings.json (1 hunks)
- frontend/public/locales/fr/chatbot_settings.json (1 hunks)
- frontend/src/components/nlp/components/NlpSampleForm.tsx (2 hunks)
- frontend/src/components/nlp/components/NlpTrainForm.tsx (4 hunks)
- frontend/src/components/nlp/index.tsx (2 hunks)
🚧 Files skipped from review as they are similar to previous changes (7)
- frontend/public/locales/en/chatbot_settings.json
- frontend/public/locales/fr/chatbot_settings.json
- api/src/setting/seeds/setting.seed-model.ts
- frontend/src/components/nlp/components/NlpSampleForm.tsx
- frontend/src/components/nlp/index.tsx
- frontend/src/components/nlp/components/NlpTrainForm.tsx
- api/src/nlp/services/nlp-sample.service.spec.ts
🧰 Additional context used
🧬 Code Graph Analysis (1)
api/src/nlp/services/nlp-sample.service.ts (2)
- api/src/utils/types/filter.types.ts (1)
  - THydratedDocument (139-139)
- api/src/extensions/channels/web/types.ts (1)
  - Message (221-221)
⏰ Context from checks skipped due to timeout of 90000ms (1)
- GitHub Check: API-Tests
🔇 Additional comments (2)
api/src/nlp/services/nlp-sample.service.ts (2)
19-20: Appropriately imported the necessary dependencies.
The additions of HelperService, HelperType, and SettingService are appropriate for the new functionality being introduced.
Also applies to: 23-23

54-55: Dependencies properly injected into the constructor.
The constructor has been correctly updated to include the necessary services for the new automated inference functionality.
```ts
        foundSample,
        entities,
      );
      await this.repository.updateOne(foundSample.id, {
```
The language should also be updated, because it's currently taking the default value; it should be fetched from the `entities` variable.
```ts
    this.logger.log('Automated inference running');
    if ('text' in doc.message) {
      const helper = await this.helperService.getDefaultHelper(HelperType.NLU);
      const { entities = [] } = await helper.predict(doc.message.text);
```
The `language` entity should be filtered out from `entities`.
We need to handle the migration of newly added settings. Currently it's not working, as far as I tested.
@MohamedAliBouhaouala Thanks for the review, feedback applied!
Actionable comments posted: 2
♻️ Duplicate comments (1)
api/src/nlp/services/nlp-sample.service.ts (1)
313-320: ⚠️ Potential issue — Add error handling and null safety check
While the event handler implementation looks good, it's missing error handling and a null check for `chatbot_settings`.

```diff
 @OnEvent('hook:message:postCreate')
 async handleInference(doc: THydratedDocument<Message>) {
-  const settings = await this.settingService.getSettings();
-  if (!settings.chatbot_settings.automate_inference) {
-    this.logger.log('Automated inference is disabled');
-    return;
-  }
-  this.logger.log('Automated inference running');
+  try {
+    const settings = await this.settingService.getSettings();
+    if (!settings.chatbot_settings?.automate_inference) {
+      this.logger.log('Automated inference is disabled');
+      return;
+    }
+    this.logger.log('Automated inference running');
```
🧹 Nitpick comments (2)
api/src/nlp/services/nlp-sample.service.ts (2)
321-331: Fix typo in debug log message
There's a typo in the debug log message.

```diff
 const helper = await this.helperService.getDefaultHelper(HelperType.NLU);
 let { entities = [] } = await helper.predict(doc.message.text);
 this.logger.debug(
-  `${helper.getName()} inferred these entities`,
+  `${helper.getName()} inferred these entities`,
   entities,
 );
```

Also, consider renaming variables for better clarity when filtering out language entity:

```diff
- const inferredLanguage = entities.find((e) => e.entity === 'language');
- entities = entities.filter((e) => e.entity !== 'language');
+ const inferredLanguage = entities.find((e) => e.entity === 'language');
+ const filteredEntities = entities.filter((e) => e.entity !== 'language');
```
313-360: Add JSDoc documentation to the method
All other methods in this class have JSDoc documentation, but this new method doesn't. Add documentation to maintain consistency.

```diff
+/**
+ * Handles automated inference for new messages.
+ * When a message is created and automate_inference is enabled,
+ * this method predicts entities in the message text, sets them in the
+ * corresponding NLP sample, and changes the sample type from 'inbox' to 'train'.
+ *
+ * @param doc - The message document that was created
+ */
 @OnEvent('hook:message:postCreate')
 async handleInference(doc: THydratedDocument<Message>) {
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
- api/src/nlp/services/nlp-sample.service.ts (3 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (1)
api/src/nlp/services/nlp-sample.service.ts (2)
- api/src/utils/types/filter.types.ts (1)
  - THydratedDocument (139-139)
- api/src/extensions/channels/web/types.ts (1)
  - Message (221-221)
⏰ Context from checks skipped due to timeout of 90000ms (3)
- GitHub Check: Analyze (javascript-typescript)
- GitHub Check: Frontend Tests
- GitHub Check: API-Tests
🔇 Additional comments (3)
api/src/nlp/services/nlp-sample.service.ts (3)
19-20: LGTM: Appropriate imports added for new functionality
The imports added for `HelperService`, `HelperType`, and `SettingService` align properly with the new functionality being implemented.
Also applies to: 23-23

54-55: LGTM: Constructor dependencies properly injected
The new dependencies for `helperService` and `settingService` are correctly injected via the constructor.
340-343: ✅ Verification successful
Verify entity filtering implementation
The code correctly filters out the language entity before storing entities in the sample, addressing the comment by MohamedAliBouhaouala.
Run this script to verify that language filtering is working properly in other NLU-related functions:

🏁 Script executed:

```bash
#!/bin/bash
# Check if language entity filtering is consistent across the codebase
# Look for other instances where entities are processed
rg -A 5 -B 5 "entities\s*=\s*.*filter" --type ts --glob "api/src/nlp/**"
```

Length of output: 898
Entity filtering implementation verified
The shell script output confirms that the language entity is correctly filtered out in api/src/nlp/services/nlp-sample.service.ts before storing the entities. The filtering operation:

- Locates the language entity with: `const inferredLanguage = entities.find((e) => e.entity === 'language');`
- Filters it out with: `entities = entities.filter((e) => e.entity !== 'language');`

This behavior matches the intention stated in the review comment by MohamedAliBouhaouala. No further changes are required.
```ts
      const language = await this.languageService.findOne(
        { code: inferredLanguage?.value },
        undefined,
        { _id: 1 },
      );

      if (!language) {
        this.logger.warn('Unable to find inferred language', inferredLanguage);
      }

      await this.repository.updateOne(foundSample.id, {
        type: 'train',
        ...(language && { language: language.id }),
      });
```
💡 Verification agent
🧩 Analysis chain
Check language update logic
The logic for updating the language based on inference looks correct, but there's a potential edge case if `inferredLanguage` is undefined.

🏁 Script executed:

```bash
#!/bin/bash
# Check how language detection is handled elsewhere in the codebase
# Look for language detection patterns
rg -A 3 -B 3 "entity === 'language'" --type ts
rg -A 3 -B 3 "e.entity === 'language'" --type ts
# Check for similar language update patterns
rg -A 5 "language: language" --type ts
```

Length of output: 9245
Action: Enhance Guard for Inferred Language
The language update logic generally works as expected. However, note that if `inferredLanguage` is undefined, the call to `this.languageService.findOne({ code: inferredLanguage?.value }, …)` will pass `undefined` for the language code. This could lead to unnecessary database lookups and a vague warning message. Consider adding an explicit check for the existence of `inferredLanguage` before performing the lookup and update.

- File & Lines: api/src/nlp/services/nlp-sample.service.ts (Lines 345-358)
- Issue: `findOne` is called with `{ code: undefined }` when `inferredLanguage` is not set.
- Suggestion: Guard against an undefined `inferredLanguage` (e.g., log and skip the lookup) to improve clarity and efficiency, as sketched below.
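One possible shape for that guard, sketched against the snippet quoted below; the variable names follow the quoted code, and the extra debug log is an assumption rather than an existing call site.

```ts
// Sketch: skip the lookup entirely when no language entity was inferred.
let language: { id: string } | null = null;
if (inferredLanguage?.value) {
  language = await this.languageService.findOne(
    { code: inferredLanguage.value },
    undefined,
    { _id: 1 },
  );
  if (!language) {
    this.logger.warn('Unable to find inferred language', inferredLanguage);
  }
} else {
  // Assumed log call — keeps the sample's existing language untouched.
  this.logger.debug('No language entity inferred; skipping language update');
}
```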
```ts
      const foundSample = await this.repository.findOne({
        text: doc.message.text,
        type: 'inbox',
      });
      if (!foundSample) {
        return;
      }

      await this.nlpSampleEntityService.storeSampleEntities(
        foundSample,
        entities,
      );

      const language = await this.languageService.findOne(
        { code: inferredLanguage?.value },
        undefined,
        { _id: 1 },
      );

      if (!language) {
        this.logger.warn('Unable to find inferred language', inferredLanguage);
      }

      await this.repository.updateOne(foundSample.id, {
        type: 'train',
        ...(language && { language: language.id }),
      });
    }
```
🛠️ Refactor suggestion
Complete error handling and add success logging
The code needs to complete the error handling wrapper and add a success log when a sample is updated.
```diff
       const foundSample = await this.repository.findOne({
         text: doc.message.text,
         type: 'inbox',
       });
       if (!foundSample) {
         return;
       }
       await this.nlpSampleEntityService.storeSampleEntities(
         foundSample,
-        entities,
+        filteredEntities,
       );
       const language = await this.languageService.findOne(
         { code: inferredLanguage?.value },
         undefined,
         { _id: 1 },
       );
       if (!language) {
         this.logger.warn('Unable to find inferred language', inferredLanguage);
       }
       await this.repository.updateOne(foundSample.id, {
         type: 'train',
         ...(language && { language: language.id }),
       });
+      this.logger.debug(`Sample ${foundSample.id} updated from inbox to train`);
+    } catch (error) {
+      this.logger.error('Error during automated inference', error);
+    }
```
Committable suggestion skipped: line range outside the PR's diff.
```ts
      await this.nlpSampleEntityService.storeSampleEntities(
        foundSample,
        entities,
      );

      const language = await this.languageService.findOne(
```
Just minor enhancement to avoid extra DB calls:
```ts
let languageId: Language | null = null;
if (languageEntity?.value) {
  languageId = await this.languageService.findOne(
    { code: languageEntity.value },
    undefined,
    { _id: 1 }
  );
}
```
This PR has been put on hold as other development tasks have taken higher priority.
We should emit 'hook:chatbot:received' after the NLU prediction, that way we avoid doing 2 requests.
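For illustration, the reordering might look roughly like this inside the message handler; `event`, `msg`, and the injected services follow the diffs elsewhere in this thread, while the emitter property name is an assumption.

```ts
// Sketch: consume the NLU prediction attached to the event first, then emit,
// so subscribers of 'hook:chatbot:received' don't trigger a second prediction.
const nlpEntities = event.getNLP();
if (nlpEntities?.entities) {
  await this.nlpSampleService.upgradeSampleWithEntities(nlpEntities.entities, msg);
}
// Emit only once the sample has been upgraded with the inferred entities.
this.eventEmitter.emit("hook:chatbot:received", event);
```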
Force-pushed from 3270ec5 to 5840841
Actionable comments posted: 1
🧹 Nitpick comments (1)
api/src/chat/services/chat.service.ts (1)
147-147: Fix typo in variable name.
There's a typo in the variable name "nlpEntites" that should be corrected.

```diff
- const nlpEntites = event.getNLP();
+ const nlpEntities = event.getNLP();
```

Make sure to update the usage of this variable in the condition below as well.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (4)
- api/src/chat/chat.module.ts (2 hunks)
- api/src/chat/services/chat.service.ts (4 hunks)
- api/src/nlp/services/nlp-sample.service.spec.ts (0 hunks)
- api/src/nlp/services/nlp-sample.service.ts (2 hunks)
💤 Files with no reviewable changes (1)
- api/src/nlp/services/nlp-sample.service.spec.ts
🚧 Files skipped from review as they are similar to previous changes (1)
- api/src/nlp/services/nlp-sample.service.ts
🧰 Additional context used
🧬 Code Graph Analysis (1)
api/src/chat/chat.module.ts (1)
- api/src/nlp/schemas/nlp-sample.schema.ts (1)
  - NlpSampleModel (79-82)
⏰ Context from checks skipped due to timeout of 90000ms (3)
- GitHub Check: Analyze (javascript-typescript)
- GitHub Check: API-Tests
- GitHub Check: Frontend Tests
🔇 Additional comments (7)
api/src/chat/chat.module.ts (3)
19-20: Nice integration of NLP module.
This change adds the required imports for the NLP module and schema, allowing the chat module to use NLP functionality.

67-67: Good addition of NlpSampleModel to Mongoose features.
This correctly registers the NLP sample model with Mongoose, allowing it to be used within the chat module.

74-74: Proper module registration.
The NlpModule is correctly imported into the ChatModule, enabling injection of NLP-related services in the ChatService.
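For orientation, a rough sketch of the module wiring these comments describe; only the pieces named in the review are shown, the import paths are assumptions, and the remaining imports/providers of the real ChatModule are omitted.

```ts
// Sketch of the ChatModule wiring (heavily trimmed; not the actual module file).
import { Module } from "@nestjs/common";
import { MongooseModule } from "@nestjs/mongoose";

import { NlpSampleModel } from "@/nlp/schemas/nlp-sample.schema"; // assumed alias path
import { NlpModule } from "@/nlp/nlp.module";                     // assumed alias path
import { ChatService } from "./services/chat.service";

@Module({
  imports: [
    MongooseModule.forFeature([NlpSampleModel]), // registers the NLP sample model
    NlpModule,                                   // exposes NlpSampleService for injection
    // ...other imports omitted
  ],
  providers: [ChatService /* , ...other providers */],
})
export class ChatModule {}
```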
api/src/chat/services/chat.service.ts (4)

32-32: Correct import of NlpSampleService.
The import path follows the project's conventions and gives access to the NLP sample service functionality.

50-50: Proper dependency injection of NlpSampleService.
The service is correctly injected into the constructor, following NestJS best practices.

369-371: Event emission reordering aligns with PR objectives.
Moving the event emission ('hook:chatbot:received') to this position ensures NLP data is processed before notifying other components, which aligns with the PR objective of automating the inference process. This change matches marrouchi's suggestion to emit the event after NLU prediction is performed, avoiding duplicate requests.
358-368: ❓ Verification inconclusive
Consider enhancing NLP extraction with proposed settings.
The PR mentions adding an "automate inference" setting that defaults to true. However, this code doesn't appear to check that setting before running the NLP prediction. The current implementation always attempts to run the helper prediction if text exists without NLP data.
Could we verify if the "automate inference" setting mentioned in the PR should be checked here before running the prediction? This might look like:

🏁 Script executed:

```bash
#!/bin/bash
# Check if there's code in the codebase that checks the "automate inference" setting
rg -A 5 "automate.?inference" --no-ignore
```

Length of output: 43

Verify existence of `automateInference` flag before NLP execution
I searched the codebase for any reference to "automate inference" (and common variants) and found none. Please confirm that:

- The `automateInference` (or correctly named) setting is defined (with a default of `true`) in your configuration.
- This block in api/src/chat/services/chat.service.ts (around lines 358–368) checks that flag before calling `helper.predict`.

Without such a guard, NLP will always run on incoming text, regardless of your new setting.
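If such a flag is added on the API side, the guard could look roughly like this; the settings accessor mirrors the one used elsewhere in this PR, and the surrounding handler variables (`text`, the injected services) are assumed to be in scope.

```ts
// Sketch: gate the NLU prediction on the automate_inference setting before predicting.
const settings = await this.settingService.getSettings();
if (!settings.chatbot_settings?.automate_inference) {
  this.logger.debug("Automated inference is disabled; skipping NLU prediction");
} else {
  const helper = await this.helperService.getDefaultHelper(HelperType.NLU);
  const { entities = [] } = await helper.predict(text);
  // ...attach the prediction to the event/sample as the rest of the handler does
}
```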
```ts
    const nlpEntites = event.getNLP();

    if (nlpEntites?.entities) {
      await this.nlpSampleService.upgradeSampleWithEntities(
        nlpEntites.entities,
        msg,
      );
    }
```
🛠️ Refactor suggestion
Consider adding error handling for NLP sample upgrades.
While the code correctly extracts NLP entities and processes them, errors in the `upgradeSampleWithEntities` method aren't caught here and could potentially crash the `handleReceivedMessage` method.

```diff
   const msg = await this.messageService.create(received);
   const nlpEntites = event.getNLP();
   if (nlpEntites?.entities) {
+    try {
       await this.nlpSampleService.upgradeSampleWithEntities(
         nlpEntites.entities,
         msg,
       );
+    } catch (err) {
+      this.logger.error('Failed to upgrade NLP sample with entities', err);
+    }
   }
```
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```ts
    const nlpEntites = event.getNLP();

    if (nlpEntites?.entities) {
      try {
        await this.nlpSampleService.upgradeSampleWithEntities(
          nlpEntites.entities,
          msg,
        );
      } catch (err) {
        this.logger.error('Failed to upgrade NLP sample with entities', err);
      }
    }
```
Force-pushed from d42ab6c to 9f0f5b0
Motivation
This PR fixes editing an NLU sample by disabling the `validate` button when the required fields are not selected.
Fixes #786