
Basic LLM Chain doesn't handle On Error correctly #8921

Closed
dkindlund opened this issue Mar 19, 2024 · 2 comments
Labels
in linear (Issue or PR has been created in Linear for internal review), Released

Comments

@dkindlund

Bug Description

I have a workflow using a Basic LLM Chain with the On Error parameter set to Continue (using error output), like this:
[screenshot of the workflow configuration]

When I feed the LLM an input that trips Azure OpenAI's content filtering policy, the node throws an error like this:

NodeOperationError: 400 AzureException - Error code: 400 - {'error': {'message': "The response was filtered due to the prompt triggering Azure OpenAI's content management policy. Please modify your prompt and retry. To learn more about our content filtering policies please read our documentation: https://go.microsoft.com/fwlink/?linkid=2198766", 'type': None, 'param': 'prompt', 'code': 'content_filter', 'status': 400, 'innererror': {'code': 'ResponsibleAIPolicyViolation', 'content_filter_result': {'hate': {'filtered': False, 'severity': 'safe'}, 'self_harm': {'filtered': False, 'severity': 'safe'}, 'sexual': {'filtered': True, 'severity': 'medium'}, 'violence': {'filtered': False, 'severity': 'safe'}}}}}
    at ChatOpenAI.callMethodAsync (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/utils/logWrapper.js:34:23)
    at processTicksAndRejections (node:internal/process/task_queues:95:5)
    at Proxy.connectionType (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/utils/logWrapper.js:156:47)
    at async Promise.allSettled (index 0)
    at Proxy._generateUncached (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/node_modules/langchain/node_modules/@langchain/core/dist/language_models/chat_models.cjs:114:25)
    at LLMChain._call (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/node_modules/langchain/dist/chains/llm_chain.cjs:157:37)
    at LLMChain.call (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/node_modules/langchain/dist/chains/base.cjs:120:28)
    at createSimpleLLMChain (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/nodes/chains/ChainLLM/ChainLlm.node.js:84:23)
    at getChain (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/nodes/chains/ChainLLM/ChainLlm.node.js:93:16)
    at Object.execute (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/nodes/chains/ChainLLM/ChainLlm.node.js:407:31)

But what's really strange is that the SUCCESS path of the node is executed instead of the ERROR path:
[screenshot of the execution taking the success path]

And when I inspect the data passed to the downstream Airtable nodes, it is completely EMPTY:
[screenshot of the empty output data]

To Reproduce

  1. Create a workflow using the Basic LLM Chain
  2. Configure the node to use On Error set to Continue (using error output)
  3. Feed in a prompt that triggers OpenAI's content filtering policy
  4. Watch as the error is caught but sent down the SUCCESS path instead of the ERROR path
  5. Note that no error data is provided to the downstream nodes either, which is a separate bug
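The routing behavior described above can be sketched as follows. This is an illustrative sketch only, not n8n's actual source: `runChainWithErrorOutput`, the `Item` shape, and the two-output convention are assumptions standing in for how a "Continue (using error output)" setting is documented to behave (failed items go to a second output, carrying the error details).

```typescript
// Hypothetical sketch: route each item to a success or error output,
// instead of letting one failure abort the run or fall through empty.
type Item = { json: Record<string, unknown> };

async function runChainWithErrorOutput(
  items: Item[],
  callChain: (item: Item) => Promise<string>,
): Promise<[Item[], Item[]]> {
  const success: Item[] = [];
  const failed: Item[] = [];
  for (const item of items) {
    try {
      const text = await callChain(item);
      success.push({ json: { ...item.json, text } });
    } catch (err) {
      // The error output should receive the error details,
      // not an empty item (the second bug reported above).
      failed.push({ json: { ...item.json, error: (err as Error).message } });
    }
  }
  return [success, failed]; // output 0 = success path, output 1 = error path
}
```

With this shape, a `content_filter` rejection from Azure OpenAI would land on output 1 with its message attached, and downstream nodes on the error path would receive non-empty data.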

Expected behavior

I'd expect this configuration to catch the error, evaluate the ERROR path, and provide details of the error to the downstream node.
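For illustration, an error-output item could look something like the following. The field names here are assumptions for the sake of example, not n8n's documented output shape:

```json
{
  "error": {
    "message": "400 AzureException - Error code: 400 ...",
    "code": "content_filter",
    "node": "Basic LLM Chain"
  }
}
```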

Operating System

Google Cloud Run

n8n Version

1.31.2

Node.js Version

18.10

Database

PostgreSQL

Execution mode

main (default)

Joffcom (Member) commented Mar 21, 2024

Thanks for the report, I have created NODE-1254 as the internal dev ticket to get this added.

@Joffcom added the `in linear` label Mar 21, 2024
janober (Member) commented Apr 10, 2024

The fix was released with n8n@1.37.0.
