
🔦 Feature: How to chat with Obsidian and Ollama (local language model)? #8374

@doriszhang2020

Description


Please confirm if feature request does NOT exist already?

  • I confirm there is no existing issue for this

Describe the use case for the feature

Ollama is a convenient way to run a local language model (such as Llama 3 8B), and an AI can turn a large amount of unstructured data into structured records quickly. For example, if we copy a lot of information about one person, we can simply paste it into the AI and ask it to fill in a table we have already set up; it writes each value into the correct cell instead of us entering the values one by one. Chatting with the AI also makes it easy to draw charts from the data. Obsidian is Markdown software in which it is easy to write a lot of notes; if we write in Obsidian and upload the data to NocoDB, we could then chat with that information automatically.
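A minimal sketch of the flow described above, assuming a local Ollama server on its default port and a NocoDB v2 REST endpoint. The base URL, table ID, API token, and column names (`Name`, `Email`, `Phone`) are placeholders chosen for illustration, not values confirmed in this issue, and the exact NocoDB endpoint may differ by version:

```python
import json
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # default local Ollama endpoint
NOCODB_URL = "https://nocodb.example.com/api/v2/tables/<table_id>/records"  # placeholder
NOCODB_TOKEN = "<xc-token>"  # placeholder NocoDB API token


def extract_record(raw_text: str) -> dict:
    """Ask the local model to turn pasted free-form text into table columns."""
    prompt = (
        "Extract the fields Name, Email and Phone from the text below "
        "and return them as a JSON object with exactly those keys.\n\n" + raw_text
    )
    resp = requests.post(
        OLLAMA_URL,
        json={"model": "llama3:8b", "prompt": prompt, "stream": False, "format": "json"},
        timeout=120,
    )
    resp.raise_for_status()
    # Ollama returns the generated text in the "response" field; with format=json
    # it should be a parseable JSON object.
    return json.loads(resp.json()["response"])


def insert_record(record: dict) -> None:
    """Write the extracted fields into a NocoDB table via its REST API.

    The keys of `record` must match the table's column names (assumed here).
    """
    resp = requests.post(
        NOCODB_URL,
        headers={"xc-token": NOCODB_TOKEN},
        json=record,
    )
    resp.raise_for_status()


if __name__ == "__main__":
    pasted = "Jane Doe, reachable at jane@example.com or +1 555 0100."
    insert_record(extract_record(pasted))
```

The same extraction step could, in principle, run over exported Obsidian Markdown notes before pushing them into NocoDB, so that the chat step works against data that is already structured.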

Suggested Solution

Additional Context

Metadata

Assignees

No one assigned

    Labels

    No labels

    Type

    No type

    Projects

    Status

    ✅ Closed

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests
