Conversation

@Emt-lin (Collaborator) commented Mar 14, 2025

No description provided.

@logancyang (Owner) commented

The init problem still exists
[screenshot: SCR-20250316-otzt]

@logancyang (Owner) commented

Since we will rely on context caching, it's essential to restrict which models are allowed in Project mode, or users will face huge costs. In the early iterations we should allow only models from the Google Gemini provider, since those are the ones we specifically implement context caching for on the client side (a sketch of such an allowlist is at the end of this comment).

Just putting it here, not really a requirement for this PR but it's relevant for "Projects" mode overall.

BTW we should name it "Plus Projects (alpha)" mode.
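
For illustration, a minimal sketch of what the restriction could look like in TypeScript. The model keys, the provider string, and the helper names here are assumptions for this comment, not existing code in the repo.

```typescript
// Hypothetical allowlist: only Gemini models with 1M context + context caching
// are eligible in Project mode. Model keys below are assumptions, not repo constants.
const PROJECT_MODE_ALLOWED_MODELS = new Set<string>([
  "gemini-1.5-flash-001",
  "gemini-1.5-pro-001",
]);

interface ModelOption {
  name: string;
  provider: string; // e.g. "google", "openai"
}

// Only offer Project mode for Google Gemini models on the allowlist, so users
// never pay full-context prices with a provider we don't cache for.
function isProjectModeAllowed(model: ModelOption): boolean {
  return model.provider === "google" && PROJECT_MODE_ALLOWED_MODELS.has(model.name);
}

// Example: filter the model dropdown when the Project chain is active.
function projectModeModels(models: ModelOption[]): ModelOption[] {
  return models.filter(isProjectModeAllowed);
}
```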

@logancyang (Owner) commented Mar 24, 2025

Thanks for the PR!

Here are some observations:

  1. The text inputs in Project don't allow spaces?
  2. Prompt should be "Project System Prompt" and it should be optional.
  3. After updating the project system prompt, it is not actually updated during chat.
  4. The instruction for <ProjectContext> should be outside of the tag (see the layout sketch after this list).
  5. The models should be hardcoded to gemini-1.5-flash-001 and gemini-1.5-pro-001 in this iteration since they are the only ones with 1M context and context caching.
  6. When there is no inclusion/exclusion but only a YouTube URL, the Project "save" button is disabled, but it should not be.
  7. When there's an inclusion folder and a YouTube URL, the final prompt only has the YouTube transcript but not the included notes. (I like the "project context processing" spinner!)
  8. Project context cache is not working.
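
Regarding item 4, a rough sketch of the layout I mean, with the instruction sitting outside the <ProjectContext> tag rather than inside it. The function and variable names are only illustrative, not the PR's actual code.

```typescript
// Illustrative only: assemble the final prompt so the instruction precedes the
// <ProjectContext> tag instead of being wrapped inside it.
function buildProjectPrompt(
  projectContext: string,
  projectSystemPrompt: string | undefined, // optional "Project System Prompt"
  userMessage: string
): string {
  const parts: string[] = [];
  if (projectSystemPrompt) {
    parts.push(projectSystemPrompt);
  }
  // The instruction goes BEFORE the tag, not inside it.
  parts.push(
    "The <ProjectContext> block below contains notes and transcripts from the " +
      "user's project. Use it to answer the question that follows."
  );
  parts.push(`<ProjectContext>\n${projectContext}\n</ProjectContext>`);
  parts.push(userMessage);
  return parts.join("\n\n");
}
```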

I'm going to attempt to fix some of these myself and aim for a pre-release tomorrow.

@logancyang (Owner) commented Mar 25, 2025

  • The text inputs in Project don't allow spaces?
  • Prompt should be "Project System Prompt" and it should be optional.
  • After updating the project system prompt, it is not actually updated during chat.
  • The instruction for <ProjectContext> should be outside of the tag.
  • The models should be hardcoded to gemini-1.5-flash-001 and gemini-1.5-pro-001 in this iteration since they are the only ones with 1M context and context caching.
  • When there is no inclusion/exclusion but only a YouTube URL, the Project "save" button is disabled, but it should not be.
  • When there's an inclusion folder and a YouTube URL, the final prompt only has the YouTube transcript but not the included notes.
    • This is because getMatchingPatterns falls back to the global inclusion/exclusion settings; if there's a conflict, the Project filter UI does not reflect it.
    • FIX: Do not fall back to global filters inside Project! (see the sketch after this list)
  • Project context cache is not working
  • Project model not updated in chat after a model change
  • Chat UI should not be displayed when switched to Project chain
  • Use single click for project rather than double click
  • Implement DELETE project
  • Hardcode gpt-4o-mini and gemini-1.5-flash/pro-001 as Project-enabled models.
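
A sketch of the fallback fix mentioned above. The parameter names and return shape are assumptions; the repo's actual getMatchingPatterns signature may differ.

```typescript
// Sketch of the "do not fall back to global filters inside Project" fix.
interface FilterPatterns {
  inclusions: string[];
  exclusions: string[];
}

function getMatchingPatterns(args: {
  projectPatterns?: FilterPatterns; // patterns configured on the Project itself
  globalPatterns: FilterPatterns; // global Copilot inclusion/exclusion settings
  isProjectMode: boolean;
}): FilterPatterns {
  const { projectPatterns, globalPatterns, isProjectMode } = args;
  if (isProjectMode) {
    // Inside a Project, use only what the Project defines -- even when empty --
    // so the built context always matches what the Project filter UI shows.
    return projectPatterns ?? { inclusions: [], exclusions: [] };
  }
  return globalPatterns;
}
```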

Context caching with the Gemini langchainjs client is not working; langchainjs might have a bug (a direct-SDK sketch is at the end of this comment).

Next for Project mode, we need to:

  • Implement context caching on the server side for copilot-plus-flash.
  • Add PDF context
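
In case the langchainjs path stays blocked, a hedged sketch of client-side context caching through Google's own @google/generative-ai SDK instead of the langchainjs wrapper. The GoogleAICacheManager and getGenerativeModelFromCachedContent calls reflect my understanding of that SDK and should be verified against the installed version; the model name and TTL are placeholders.

```typescript
// Hedged sketch: Gemini context caching via @google/generative-ai directly,
// bypassing langchainjs. Verify API names against the installed SDK version.
import { GoogleGenerativeAI } from "@google/generative-ai";
import { GoogleAICacheManager } from "@google/generative-ai/server";

async function askWithCachedProjectContext(
  apiKey: string,
  projectContext: string,
  question: string
): Promise<string> {
  // 1. Upload the assembled project context once; Gemini stores it server-side
  //    and bills cached tokens at a reduced rate until the TTL expires.
  const cacheManager = new GoogleAICacheManager(apiKey);
  const cache = await cacheManager.create({
    model: "models/gemini-1.5-flash-001",
    contents: [{ role: "user", parts: [{ text: projectContext }] }],
    ttlSeconds: 300,
  });

  // 2. Later turns reference the cached content instead of resending the context.
  const genAI = new GoogleGenerativeAI(apiKey);
  const model = genAI.getGenerativeModelFromCachedContent(cache);
  const result = await model.generateContent(question);
  return result.response.text();
}
```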

@logancyang (Owner) commented

This PR has been merged into https://github.com/logancyang/obsidian-copilot/tree/2.9.0-preview

This PR is now for review only. Further changes should be made to a NEW PR against that branch.

logancyang closed this Apr 14, 2025