30 changes: 30 additions & 0 deletions .github/workflows/push.yml
Original file line number Diff line number Diff line change
@@ -25,6 +25,36 @@ jobs:
uses: microsoft/setup-msbuild@v1.1
- name: Clear NuGet cache
run: dotnet nuget locals all --clear
- name: Cache telegram-bot-api build
id: cache-tg-bot-api
uses: actions/cache@v4
with:
path: telegram-bot-api-bin
key: tg-bot-api-win-x64-${{ hashFiles('.github/workflows/push.yml') }}
restore-keys: |
tg-bot-api-win-x64-
- name: Build telegram-bot-api from source
if: steps.cache-tg-bot-api.outputs.cache-hit != 'true'
shell: pwsh
run: |
git clone --recursive https://github.com/tdlib/telegram-bot-api.git
C:\vcpkg\vcpkg.exe install openssl:x64-windows-static zlib:x64-windows-static --no-print-usage
Push-Location telegram-bot-api
New-Item -ItemType Directory -Force build | Out-Null
Push-Location build
cmake -A x64 -DCMAKE_BUILD_TYPE=Release `
-DCMAKE_TOOLCHAIN_FILE="C:/vcpkg/scripts/buildsystems/vcpkg.cmake" `
-DVCPKG_TARGET_TRIPLET="x64-windows-static" `
..
cmake --build . --config Release --target telegram-bot-api
Pop-Location
Pop-Location
New-Item -ItemType Directory -Force telegram-bot-api-bin | Out-Null
Copy-Item "telegram-bot-api\build\Release\telegram-bot-api.exe" "telegram-bot-api-bin\telegram-bot-api.exe"
- name: Copy telegram-bot-api binary to project
shell: pwsh
run: |
Copy-Item "telegram-bot-api-bin\telegram-bot-api.exe" "TelegramSearchBot\telegram-bot-api.exe"
Comment on lines +54 to +57
⚠️ Potential issue | 🟡 Minor

The copy step may fail when the cache misses and the build also fails.

The copy step runs unconditionally, but telegram-bot-api-bin\telegram-bot-api.exe only exists if the cache was hit or the build succeeded. If the build step fails (e.g., a network error while cloning, or a CMake failure), this step fails the whole workflow.

Consider adding an `if` condition, or combining this step with the build step.

🔧 Proposed fix
     - name: Copy telegram-bot-api binary to project
+      if: success()
       shell: pwsh
       run: |
+        if (Test-Path "telegram-bot-api-bin\telegram-bot-api.exe") {
           Copy-Item "telegram-bot-api-bin\telegram-bot-api.exe" "TelegramSearchBot\telegram-bot-api.exe"
+        } else {
+          Write-Warning "telegram-bot-api.exe not found, skipping copy"
+        }
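As an alternative to guarding with Test-Path inside the script, the copy step could be gated on the cache or build outcome directly. A sketch under one assumption: the build step would need an `id` (e.g. `build-tg-bot-api`), which it does not currently have — only the cache step's `cache-tg-bot-api` id exists in the workflow today.

```yaml
      - name: Build telegram-bot-api from source
        id: build-tg-bot-api   # hypothetical id, not present in the current workflow
        if: steps.cache-tg-bot-api.outputs.cache-hit != 'true'
        shell: pwsh
        run: |
          # ... existing build script unchanged ...

      - name: Copy telegram-bot-api binary to project
        # Run only when the binary can exist: the cache was hit, or the build ran and succeeded.
        # A skipped build step has outcome 'skipped', so the first clause covers the cache-hit case.
        if: steps.cache-tg-bot-api.outputs.cache-hit == 'true' || steps.build-tg-bot-api.outcome == 'success'
        shell: pwsh
        run: |
          Copy-Item "telegram-bot-api-bin\telegram-bot-api.exe" "TelegramSearchBot\telegram-bot-api.exe"
```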

- name: Restore dependencies
run: dotnet restore --force --no-cache /p:BuildWithNetFrameworkHostedCompiler=true
- name: Build
3 changes: 1 addition & 2 deletions TelegramSearchBot.Common/Attributes/McpAttributes.cs
@@ -1,7 +1,6 @@
using System;

namespace TelegramSearchBot.Attributes
{
namespace TelegramSearchBot.Attributes {
/// <summary>
/// Marks a method as a tool that can be called by the LLM.
/// Deprecated: Use <see cref="BuiltInToolAttribute"/> instead for built-in tools.
Expand Down
21 changes: 19 additions & 2 deletions TelegramSearchBot.Common/Env.cs
@@ -13,8 +13,17 @@ static Env() {
}
try {
var config = JsonConvert.DeserializeObject<Config>(File.ReadAllText(Path.Combine(WorkDir, "Config.json")));
BaseUrl = config.BaseUrl;
IsLocalAPI = config.IsLocalAPI;
EnableLocalBotAPI = config.EnableLocalBotAPI;
TelegramBotApiId = config.TelegramBotApiId;
TelegramBotApiHash = config.TelegramBotApiHash;
LocalBotApiPort = config.LocalBotApiPort;
if (config.EnableLocalBotAPI) {
BaseUrl = $"http://127.0.0.1:{config.LocalBotApiPort}";
IsLocalAPI = true;
} else {
BaseUrl = config.BaseUrl;
IsLocalAPI = config.IsLocalAPI;
}
BotToken = config.BotToken;
AdminId = config.AdminId;
EnableAutoOCR = config.EnableAutoOCR;
@@ -44,6 +53,10 @@ static Env() {
public static readonly long AdminId;
public static readonly bool EnableAutoOCR;
public static readonly bool EnableAutoASR;
public static readonly bool EnableLocalBotAPI;
public static readonly string TelegramBotApiId;
public static readonly string TelegramBotApiHash;
public static readonly int LocalBotApiPort;
public static readonly string WorkDir;
public static readonly int TaskDelayTimeout;
public static readonly bool SameServer;
@@ -70,6 +83,10 @@ public class Config {
public bool EnableAutoASR { get; set; } = false;
//public string WorkDir { get; set; } = "/data/TelegramSearchBot";
public bool IsLocalAPI { get; set; } = false;
public bool EnableLocalBotAPI { get; set; } = false;
public string TelegramBotApiId { get; set; }
public string TelegramBotApiHash { get; set; }
public int LocalBotApiPort { get; set; } = 8081;
Comment on lines +86 to +89
⚠️ Potential issue | 🟠 Major

Repository: ModerRAS/TelegramSearchBot

Update documentation in Docs/ for new configuration fields.

New persistent configuration fields (EnableLocalBotAPI, TelegramBotApiId, TelegramBotApiHash, LocalBotApiPort) have been added to the Config class but are not documented. Per coding guidelines, user documentation must be updated when modifying persistent configuration.

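For reference, a minimal Config.json fragment exercising the new fields might look like the following. All values are illustrative placeholders; per the code, EnableLocalBotAPI defaults to false and LocalBotApiPort to 8081, and TelegramBotApiId/TelegramBotApiHash are the Telegram application credentials (typically obtained from my.telegram.org).

```json
{
  "BotToken": "123456:ABC-placeholder-token",
  "AdminId": 123456789,
  "EnableLocalBotAPI": true,
  "TelegramBotApiId": "1234567",
  "TelegramBotApiHash": "0123456789abcdef0123456789abcdef",
  "LocalBotApiPort": 8081
}
```

Note that when EnableLocalBotAPI is true, the static constructor overrides BaseUrl with `http://127.0.0.1:{LocalBotApiPort}` and forces IsLocalAPI to true, so any BaseUrl/IsLocalAPI values in the file are ignored in that mode — worth stating in the docs.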

public bool SameServer { get; set; } = false;
public int TaskDelayTimeout { get; set; } = 1000;
public string OllamaModelName { get; set; } = "qwen2.5:72b-instruct-q2_K";
@@ -22,7 +22,7 @@ public static bool IsIterationLimitMessage(string content) {
/// 在累积内容末尾追加标记
/// </summary>
public static string AppendMarker(string accumulatedContent) {
return (accumulatedContent ?? string.Empty) + Marker;
return ( accumulatedContent ?? string.Empty ) + Marker;
}

/// <summary>
@@ -1,15 +1,12 @@
using Microsoft.EntityFrameworkCore.Migrations;
using Microsoft.EntityFrameworkCore.Migrations;

#nullable disable

namespace TelegramSearchBot.Migrations
{
namespace TelegramSearchBot.Migrations {
/// <inheritdoc />
public partial class AddUserWithGroupUniqueIndex : Migration
{
public partial class AddUserWithGroupUniqueIndex : Migration {
/// <inheritdoc />
protected override void Up(MigrationBuilder migrationBuilder)
{
protected override void Up(MigrationBuilder migrationBuilder) {
migrationBuilder.CreateIndex(
name: "IX_UsersWithGroup_UserId_GroupId",
table: "UsersWithGroup",
@@ -18,8 +15,7 @@ protected override void Up(MigrationBuilder migrationBuilder)
}

/// <inheritdoc />
protected override void Down(MigrationBuilder migrationBuilder)
{
protected override void Down(MigrationBuilder migrationBuilder) {
migrationBuilder.DropIndex(
name: "IX_UsersWithGroup_UserId_GroupId",
table: "UsersWithGroup");
@@ -1,15 +1,12 @@
using Microsoft.EntityFrameworkCore.Migrations;
using Microsoft.EntityFrameworkCore.Migrations;

#nullable disable

namespace TelegramSearchBot.Migrations
{
namespace TelegramSearchBot.Migrations {
/// <inheritdoc />
public partial class AddChannelWithModelIsDeleted : Migration
{
public partial class AddChannelWithModelIsDeleted : Migration {
/// <inheritdoc />
protected override void Up(MigrationBuilder migrationBuilder)
{
protected override void Up(MigrationBuilder migrationBuilder) {
migrationBuilder.AddColumn<bool>(
name: "IsDeleted",
table: "ChannelsWithModel",
@@ -19,8 +16,7 @@ protected override void Up(MigrationBuilder migrationBuilder)
}

/// <inheritdoc />
protected override void Down(MigrationBuilder migrationBuilder)
{
protected override void Down(MigrationBuilder migrationBuilder) {
migrationBuilder.DropColumn(
name: "IsDeleted",
table: "ChannelsWithModel");
@@ -102,12 +102,20 @@ public async Task GetChannelsAsync_NoModels_ReturnsEmpty() {
public async Task GetChannelsAsync_WithModel_ReturnsOrderedChannels() {
// Arrange
var channel1 = new LLMChannel {
Name = "ch1", Gateway = "gw1", ApiKey = "key1",
Provider = LLMProvider.OpenAI, Parallel = 2, Priority = 1
Name = "ch1",
Gateway = "gw1",
ApiKey = "key1",
Provider = LLMProvider.OpenAI,
Parallel = 2,
Priority = 1
};
var channel2 = new LLMChannel {
Name = "ch2", Gateway = "gw2", ApiKey = "key2",
Provider = LLMProvider.OpenAI, Parallel = 3, Priority = 10
Name = "ch2",
Gateway = "gw2",
ApiKey = "key2",
Provider = LLMProvider.OpenAI,
Parallel = 3,
Priority = 10
};
_dbContext.LLMChannels.AddRange(channel1, channel2);
await _dbContext.SaveChangesAsync();
@@ -130,7 +138,10 @@ public async Task GetChannelsAsync_WithModel_ReturnsOrderedChannels() {
public async Task ExecAsync_NoModelConfigured_YieldsNoResults() {
// Arrange - no group settings configured
var message = new TelegramSearchBot.Model.Data.Message {
Content = "test", GroupId = 123, MessageId = 1, FromUserId = 1
Content = "test",
GroupId = 123,
MessageId = 1,
FromUserId = 1
};

// Act
@@ -153,8 +164,12 @@ public async Task GetAvailableCapacityAsync_NoChannels_ReturnsZero() {
public async Task GetAvailableCapacityAsync_WithChannels_ReturnsCapacity() {
// Arrange
var channel = new LLMChannel {
Name = "ch1", Gateway = "gw1", ApiKey = "key1",
Provider = LLMProvider.OpenAI, Parallel = 5, Priority = 1
Name = "ch1",
Gateway = "gw1",
ApiKey = "key1",
Provider = LLMProvider.OpenAI,
Parallel = 5,
Priority = 1
};
_dbContext.LLMChannels.Add(channel);
await _dbContext.SaveChangesAsync();
@@ -67,8 +67,12 @@ public async Task GetModelCapabilities_NotFound_ReturnsNull() {
public async Task GetModelCapabilities_WithCapabilities_ReturnsCorrectModel() {
// Arrange
var channel = new LLMChannel {
Name = "test", Gateway = "gw", ApiKey = "key",
Provider = LLMProvider.OpenAI, Parallel = 1, Priority = 1
Name = "test",
Gateway = "gw",
ApiKey = "key",
Provider = LLMProvider.OpenAI,
Parallel = 1,
Priority = 1
};
_dbContext.LLMChannels.Add(channel);
await _dbContext.SaveChangesAsync();
@@ -106,8 +110,12 @@ public async Task GetModelCapabilities_WithCapabilities_ReturnsCorrectModel() {
public async Task GetToolCallingSupportedModels_ReturnsCorrectModels() {
// Arrange
var channel = new LLMChannel {
Name = "test", Gateway = "gw", ApiKey = "key",
Provider = LLMProvider.OpenAI, Parallel = 1, Priority = 1
Name = "test",
Gateway = "gw",
ApiKey = "key",
Provider = LLMProvider.OpenAI,
Parallel = 1,
Priority = 1
};
_dbContext.LLMChannels.Add(channel);
await _dbContext.SaveChangesAsync();
@@ -138,7 +146,7 @@ public async Task GetToolCallingSupportedModels_ReturnsCorrectModels() {
await _dbContext.SaveChangesAsync();

// Act
var result = (await _service.GetToolCallingSupportedModels()).ToList();
var result = ( await _service.GetToolCallingSupportedModels() ).ToList();

// Assert
Assert.Single(result);
@@ -149,8 +157,12 @@ public async Task GetToolCallingSupportedModels_ReturnsCorrectModels() {
public async Task GetVisionSupportedModels_ReturnsCorrectModels() {
// Arrange
var channel = new LLMChannel {
Name = "test", Gateway = "gw", ApiKey = "key",
Provider = LLMProvider.OpenAI, Parallel = 1, Priority = 1
Name = "test",
Gateway = "gw",
ApiKey = "key",
Provider = LLMProvider.OpenAI,
Parallel = 1,
Priority = 1
};
_dbContext.LLMChannels.Add(channel);
await _dbContext.SaveChangesAsync();
@@ -170,7 +182,7 @@ public async Task GetVisionSupportedModels_ReturnsCorrectModels() {
await _dbContext.SaveChangesAsync();

// Act
var result = (await _service.GetVisionSupportedModels()).ToList();
var result = ( await _service.GetVisionSupportedModels() ).ToList();

// Assert
Assert.Single(result);
Expand All @@ -181,8 +193,12 @@ public async Task GetVisionSupportedModels_ReturnsCorrectModels() {
public async Task GetEmbeddingModels_ReturnsCorrectModels() {
// Arrange
var channel = new LLMChannel {
Name = "test", Gateway = "gw", ApiKey = "key",
Provider = LLMProvider.OpenAI, Parallel = 1, Priority = 1
Name = "test",
Gateway = "gw",
ApiKey = "key",
Provider = LLMProvider.OpenAI,
Parallel = 1,
Priority = 1
};
_dbContext.LLMChannels.Add(channel);
await _dbContext.SaveChangesAsync();
@@ -202,7 +218,7 @@ public async Task GetEmbeddingModels_ReturnsCorrectModels() {
await _dbContext.SaveChangesAsync();

// Act
var result = (await _service.GetEmbeddingModels()).ToList();
var result = ( await _service.GetEmbeddingModels() ).ToList();

// Assert
Assert.Single(result);
Expand All @@ -213,8 +229,12 @@ public async Task GetEmbeddingModels_ReturnsCorrectModels() {
public async Task CleanupOldCapabilities_RemovesOldEntries() {
// Arrange
var channel = new LLMChannel {
Name = "test", Gateway = "gw", ApiKey = "key",
Provider = LLMProvider.OpenAI, Parallel = 1, Priority = 1
Name = "test",
Gateway = "gw",
ApiKey = "key",
Provider = LLMProvider.OpenAI,
Parallel = 1,
Priority = 1
};
_dbContext.LLMChannels.Add(channel);
await _dbContext.SaveChangesAsync();
@@ -79,7 +79,7 @@ public void SerializeProviderHistory_WithToolCallHistory_PreservesContent() {
};

var serialized = OpenAIService.SerializeProviderHistory(history);

Assert.Equal(5, serialized.Count);
Assert.Contains("tool_call", serialized[2].Content);
Assert.Contains("bash", serialized[3].Content);
2 changes: 1 addition & 1 deletion TelegramSearchBot.LLM/Service/AI/LLM/GeminiService.cs
@@ -341,7 +341,7 @@ public async IAsyncEnumerable<string> ExecAsync(
executionContext.IterationLimitReached = true;
executionContext.SnapshotData = new LlmContinuationSnapshot {
ChatId = ChatId,
OriginalMessageId = (int)message.MessageId,
OriginalMessageId = ( int ) message.MessageId,
UserId = message.FromUserId,
ModelName = modelName,
Provider = "Gemini",
9 changes: 4 additions & 5 deletions TelegramSearchBot.LLM/Service/AI/LLM/McpToolHelper.cs
@@ -170,7 +170,7 @@ private static string RegisterToolsAndGetPromptString(List<Assembly> assemblies)
var builtInParamAttr = param.GetCustomAttribute<BuiltInParameterAttribute>();
var mcpParamAttr = param.GetCustomAttribute<McpParameterAttribute>();
var paramDescription = builtInParamAttr?.Description ?? mcpParamAttr?.Description ?? $"Parameter '{param.Name}'";
var paramIsRequired = builtInParamAttr?.IsRequired ?? mcpParamAttr?.IsRequired ?? (!param.IsOptional && !param.HasDefaultValue);
var paramIsRequired = builtInParamAttr?.IsRequired ?? mcpParamAttr?.IsRequired ?? ( !param.IsOptional && !param.HasDefaultValue );
var paramType = MapToJsonSchemaType(param.ParameterType);

properties[param.Name] = new Dictionary<string, object> {
@@ -511,7 +511,7 @@ private static (string toolName, Dictionary<string, string> arguments) ParseTool
}
}

if (toolName == null || (!ToolRegistry.ContainsKey(toolName) && !ExternalToolRegistry.ContainsKey(toolName))) {
if (toolName == null || ( !ToolRegistry.ContainsKey(toolName) && !ExternalToolRegistry.ContainsKey(toolName) )) {
_sLogger?.LogWarning($"ParseToolElement: Unregistered tool '{element.Name.LocalName}'");
return (null, null);
}
@@ -703,8 +703,7 @@ public static async Task<object> ExecuteRegisteredToolAsync(string toolName, Dic

if (result is Task taskResult) {
await taskResult;
if (taskResult.GetType().IsGenericType)
{
if (taskResult.GetType().IsGenericType) {
return ( ( dynamic ) taskResult ).Result;
}
return null;
@@ -900,7 +899,7 @@ public static void RegisterExternalMcpTools(Interface.Mcp.IMcpServerManager mcpS
RegisterExternalTools(
toolInfos,
async (serverName, toolName, arguments) => {
var objectArgs = arguments.ToDictionary(kvp => kvp.Key, kvp => (object)kvp.Value);
var objectArgs = arguments.ToDictionary(kvp => kvp.Key, kvp => ( object ) kvp.Value);
var result = await mcpServerManager.CallToolAsync(serverName, toolName, objectArgs);
if (result.IsError) {
return $"Error: {string.Join("\n", result.Content?.Select(c => c.Text ?? "") ?? Enumerable.Empty<string>())}";
2 changes: 1 addition & 1 deletion TelegramSearchBot.LLM/Service/AI/LLM/OllamaService.cs
@@ -202,7 +202,7 @@ public async IAsyncEnumerable<string> ExecAsync(Model.Data.Message message, long
executionContext.IterationLimitReached = true;
executionContext.SnapshotData = new LlmContinuationSnapshot {
ChatId = ChatId,
OriginalMessageId = (int)message.MessageId,
OriginalMessageId = ( int ) message.MessageId,
UserId = message.FromUserId,
ModelName = modelName,
Provider = "Ollama",