
init 0.2.9, pretty minor changes but mostly just bug fixes and publishing to pypi for easier installation (#77)

Merged
theshadow76 merged 7 commits into ChipaDevTeam:master from sixtysixx:master
Mar 9, 2026

Conversation

@sixtysixx (Collaborator) commented Mar 9, 2026

Pull Request

Overview

init 0.2.9: mostly minor changes and bug fixes, plus publishing to PyPI for easier installation

Changes

  • Decreased CI build time by using a cache
  • Small bug fixes and improved error catching

Type of Change

  • Bug fix
  • New feature
  • Breaking change
  • Documentation / Examples
  • Performance / Refactoring
  • CI/CD / Build System

Validation

Describe how the changes were tested.

  • Unit tests
  • Integration tests
  • Manual verification

Environment

  • OS: Windows 11 (WSL, Debian)
  • Python Version: 3.11, 3.12, 3.13, 3.14
  • Rust Version: 1.94.0

Checklist

  • Code follows project conventions and style guidelines.
  • Documentation and examples updated if necessary.
  • All tests pass locally.
  • No new warnings introduced.

Screenshots (Optional)

Add relevant visuals if applicable.

Summary by CodeRabbit

Release Notes - Version 0.2.9

  • Bug Fixes

    • Fixed balance calculation potentially returning -1
    • Resolved unsafe operations
    • Improved SSID parsing to prevent double-encoded JSON messages
  • Changes

    • Extended Python support to versions 3.8–3.15
    • Updated Python dependencies
  • Documentation

    • Minor documentation improvements
  • Tests

    • Added validation exception safety test

@coderabbitai bot (Contributor) commented Mar 9, 2026

📝 Walkthrough

Walkthrough

This PR coordinates a version bump to 0.2.9 with infrastructure improvements: sccache-based CI caching, UV-based MkDocs installation, expanded Tokio features across the dependency tree, multi-stage Docker builds, Python import robustness enhancements, and build profile optimizations.

Changes

  • CI Workflow & Build Infrastructure (.github/workflows/CI.yml, docker/linux/Dockerfile, Cargo.toml): Integrated sccache caching and UV-based docs installation, added the --strip flag to maturin builds; converted the Docker build to a multi-stage cargo-chef pattern; adjusted the release profile (lto: "thin", codegen-units: 16) and added dev-profile defaults.
  • Dependency Version Updates (BinaryOptionsToolsUni/Cargo.toml, BinaryOptionsToolsV2/rust/Cargo.toml, crates/binary_options_tools/Cargo.toml, crates/core-pre/Cargo.toml, crates/core/Cargo.toml): Unified Tokio features across the workspace (["full"] → ["rt-multi-thread", "macros", "net", "time", "sync"]); bumped thiserror 2.0.12+ → 2.0.18, tokio-tungstenite 0.21/0.27 → 0.28.0, rand → 0.9.2, and uniffi → 0.31.0.
  • Python Import Robustness (BinaryOptionsToolsV2/python/BinaryOptionsToolsV2/config.py, BinaryOptionsToolsV2/python/BinaryOptionsToolsV2/tracing.py, BinaryOptionsToolsV2/python/BinaryOptionsToolsV2/validator.py): Replaced hard-coded fallback imports with centralized helpers (_get_pyconfig, _get_rust_attr); improved the Validator.custom type hint to Callable[[str], bool]; added an exception-safety test for custom validators.
  • Version & Metadata Bumps (CHANGELOG.md, CITATION.cff, README.md): Updated version references to 0.2.9; expanded the Python support range to 3.8–3.15; added CHANGELOG entries for the balance fix, unwrap safety, SSID parsing, and Python support improvements.
  • Repository Structure (.gitignore, .serena/.gitignore, .serena/project.yml, docs/examples): Reinstated the debug/ ignore and added a .serena/ ignore; removed the /cache ignore from .serena/.gitignore; deleted the .serena/project.yml config file; removed the docs/examples symlink.
  • Examples & Tests (examples/python/async/get_candles.py, tests/python/core/test_validator.py): Commented out pandas DataFrame usage in the candles example; added test_validator_custom_exception_safety for exception-handling validation.

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~45 minutes

Suggested reviewers

  • Rick-29
  • theshadow76

Poem

🐰 A hop through CI's cache with sccache so keen,
Import paths now robust—no more fallback scene!
Tokio's features trimmed, Docker stages refined,
Version 0.2.9 leaves the old bugs behind! 🎉

🚥 Pre-merge checks: ✅ 3 passed

  • Description Check: Passed (check skipped; CodeRabbit's high-level summary is enabled).
  • Title Check: Passed. The title clearly describes the main purpose of the pull request: a version 0.2.9 release with bug fixes and PyPI publishing for easier installation.
  • Docstring Coverage: Passed. Docstring coverage is 81.25%, which meets the required threshold of 80.00%.


@gemini-code-assist bot (Contributor) commented
Summary of Changes

Hello, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers get up to speed quickly!

This pull request delivers version 0.2.9 of the project, primarily focusing on stability, performance, and ease of installation. It incorporates crucial bug fixes, enhances the Python bindings for better robustness, and optimizes the build process to reduce CI times. The update also prepares the project for broader Python compatibility and simplifies the installation experience by publishing to PyPI.

Highlights

  • Version Update: The project version has been updated to 0.2.9 across various configuration files and documentation.
  • Dependency Management: Key Rust and Python dependencies, including uniffi, thiserror, tokio, tokio-tungstenite, and rand, have been updated to newer versions, and tokio features were explicitly defined.
  • Python Bindings Improvements: Refactored Python config.py, tracing.py, and validator.py to improve import flexibility and error handling, particularly for custom validators.
  • Build System Enhancements: The Dockerfile for Linux builds was refactored to utilize cargo-chef for efficient dependency caching and a multi-stage build process, significantly decreasing CI build times.
  • Bug Fixes: Addressed several bugs including authentication failures related to SSID parsing, potential balance reporting issues, and resolved unsafe unwraps in the Rust codebase.
  • Documentation & Examples: Updated supported Python versions in the README.md to include Python 3.15 and revised installation links. A docs/examples file was removed, and pandas usage was commented out in an example script.


Changelog
  • .gitignore
    • Added the .serena/ directory to the ignore list.
  • .serena/.gitignore
    • Removed the .serena/.gitignore file.
  • .serena/project.yml
    • Removed the Serena project configuration file.
  • BinaryOptionsToolsUni/Cargo.toml
    • Updated uniffi dependency from 0.30.0 to 0.31.0.
    • Updated thiserror dependency from 2.0.14 to 2.0.18.
    • Specified tokio features to rt-multi-thread, macros, net, time, sync.
  • BinaryOptionsToolsV2/python/BinaryOptionsToolsV2/config.py
    • Refactored _get_pyconfig to improve import flexibility for PyConfig by trying package-relative and direct imports.
  • BinaryOptionsToolsV2/python/BinaryOptionsToolsV2/tracing.py
    • Refactored Logger and LogBuilder initialization to use a common helper function _get_rust_attr for Rust attribute retrieval.
    • Removed direct debug, info, warn, error methods from the Python Logger class, relying on the underlying RustLogger.
  • BinaryOptionsToolsV2/python/BinaryOptionsToolsV2/validator.py
    • Refactored _get_raw_validator for improved import flexibility.
    • Updated the docstring for the custom validator to clarify exception handling and usage notes.
    • Changed type hint for func parameter in custom from callable to Callable[[str], bool].
  • BinaryOptionsToolsV2/rust/Cargo.toml
    • Updated package version from 0.2.8 to 0.2.9.
    • Updated thiserror dependency from 2.0.17 to 2.0.18.
    • Specified tokio features to rt-multi-thread, macros, net, time, sync.
  • CHANGELOG.md
    • Updated changelog for version 0.2.9, detailing updated Python support, improved SSID parsing, fixes for balance returning -1, and resolved unsafe unwraps.
  • CITATION.cff
    • Updated project version from 0.2.8 to 0.2.9.
  • Cargo.toml
    • Adjusted lto from true to thin and codegen-units from 1 to 16 for the release profile.
    • Introduced a dev profile with opt-level = 0, debug = true, lto = false, and codegen-units = 256.
  • README.md
    • Updated supported Python versions from 3.8 - 3.13 to 3.8 - 3.15.
    • Revised installation wheel links to version 0.2.9 for Windows, Linux, and macOS.
  • bun.lock
    • Updated lint-staged development dependency from 16.2.7 to 16.3.2.
  • crates/binary_options_tools/Cargo.toml
    • Updated tokio-tungstenite dependency from 0.21.0 to 0.28.0.
    • Updated rand dependency from 0.8.5 to 0.9.2.
    • Updated thiserror dependency from 1.0.63 to 2.0.18.
    • Specified tokio features to rt-multi-thread, macros, net, time, sync.
  • crates/core-pre/Cargo.toml
    • Updated thiserror dependency from 2.0.12 to 2.0.18.
    • Updated rand dependency from 0.9 to 0.9.2.
    • Specified tokio features to rt-multi-thread, macros, net, time, sync.
  • crates/core/Cargo.toml
    • Updated rand dependency from 0.9.1 to 0.9.2.
    • Updated thiserror dependency from 2.0.12 to 2.0.18.
    • Updated tokio-tungstenite dependency from 0.27.0 to 0.28.0.
  • docker/linux/Dockerfile
    • Refactored Dockerfile to use cargo-chef for dependency caching and implemented a multi-stage build process.
    • Updated maturin build command to include --strip for reduced binary size.
  • docs/examples
    • Removed the docs/examples file.
  • examples/python/async/get_candles.py
    • Commented out the pandas import and related usage for candles_pd.
  • tests/python/core/test_validator.py
    • Added a new test case test_validator_custom_exception_safety to verify graceful exception handling in custom validators.
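Written out as the TOML it describes, the Cargo.toml profile change listed above (values taken directly from this changelog) is:

```toml
[profile.release]
lto = "thin"        # thin LTO keeps most of full LTO's benefit at a fraction of the link time
codegen-units = 16  # more parallel codegen units: faster builds, slightly less optimization

[profile.dev]
opt-level = 0
debug = true
lto = false
codegen-units = 256
```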
Ignored Files
  • Ignored by pattern: .github/workflows/** (1)
    • .github/workflows/CI.yml
Activity
  • No specific human activity (comments, reviews, progress updates) has been recorded for this pull request yet.

@gemini-code-assist bot left a comment
Code Review

This pull request introduces version 0.2.9, focusing on bug fixes, dependency updates, and build process improvements. The changes to use cargo-chef in the Dockerfile and optimize Rust build profiles are great for improving CI and development build times. The refactoring of Python import logic and making custom validators safer are also solid improvements. However, I found a critical issue in the tracing.py module where a refactoring seems to have accidentally removed public methods from the Logger and LogBuilder classes, breaking the logging functionality. The rest of the changes look good.

Comment on lines +45 to +68
class Logger:
    """
    A logger class wrapping the RustLogger functionality.

    Attributes:
        logger (RustLogger): The underlying RustLogger instance.
    """

    def __init__(self):
        RustLogger = _get_rust_attr("Logger")
        self.logger = RustLogger()


class LogBuilder:
    """
    A builder class for configuring the logs, create log layers and iterators.

    Attributes:
        builder (RustLogBuilder): The underlying RustLogBuilder instance.
    """

    def __init__(self):
        RustLogBuilder = _get_rust_attr("LogBuilder")
        self.builder = RustLogBuilder()
critical

The refactoring to use _get_rust_attr is a good improvement for centralizing import logic. However, it appears that the public methods for the Logger and LogBuilder classes were accidentally removed in the process. This breaks their functionality as they no longer expose any logging or building capabilities. The wrapper classes should delegate calls to the underlying Rust objects.

class Logger:
    """
    A logger class wrapping the RustLogger functionality.

    Attributes:
        logger (RustLogger): The underlying RustLogger instance.
    """

    def __init__(self):
        RustLogger = _get_rust_attr("Logger")
        self.logger = RustLogger()

    def debug(self, message):
        """
        Log a debug message.

        Args:
            message (str): The message to log.
        """
        self.logger.debug(str(message))

    def info(self, message):
        """
        Log an informational message.

        Args:
            message (str): The message to log.
        """
        self.logger.info(str(message))

    def warn(self, message):
        """
        Log a warning message.

        Args:
            message (str): The message to log.
        """
        self.logger.warn(str(message))

    def error(self, message):
        """
        Log an error message.

        Args:
            message (str): The message to log.
        """
        self.logger.error(str(message))


class LogBuilder:
    """
    A builder class for configuring the logs, create log layers and iterators.

    Attributes:
        builder (RustLogBuilder): The underlying RustLogBuilder instance.
    """

    def __init__(self):
        RustLogBuilder = _get_rust_attr("LogBuilder")
        self.builder = RustLogBuilder()

    def create_logs_iterator(self, level: str = "DEBUG", timeout: Optional[timedelta] = None) -> LogSubscription:
        """
        Create a new logs iterator with the specified level and timeout.

        Args:
            level (str): The logging level (default is "DEBUG").
            timeout (Optional[timedelta]): Optional timeout for the iterator.

        Returns:
            LogSubscription: A new LogSubscription instance that supports both asynchronous and synchronous iterators.
        """
        return LogSubscription(self.builder.create_logs_iterator(level, timeout))

    def log_file(self, path: str = "logs.log", level: str = "DEBUG") -> "LogBuilder":
        """
        Configure logging to a file.

        Args:
            path (str): The path where logs will be stored (default is "logs.log").
            level (str): The minimum log level for this file handler.
        """
        self.builder.log_file(path, level)
        return self

    def terminal(self, level: str = "DEBUG") -> "LogBuilder":
        """
        Configure logging to the terminal.

        Args:
            level (str): The minimum log level for this terminal handler.
        """
        self.builder.terminal(level)
        return self

    def build(self):
        """
        Build and initialize the logging configuration. This function should be called only once per execution.
        """
        self.builder.build()

@coderabbitai bot left a comment

Actionable comments posted: 11

🧹 Nitpick comments (3)
docker/linux/Dockerfile (1)

33-33: Consider adding --out dist for consistency with CI workflow.

The CI workflow (in .github/workflows/CI.yml) uses --out dist to specify the output directory. While the default target/wheels/ works, aligning with CI conventions improves maintainability.

-RUN maturin build --release --strip --interpreter python3
+RUN maturin build --release --strip --interpreter python3 --out dist

Note: If you apply this, also update line 38 to COPY --from=builder /app/dist/*.whl /app/.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@docker/linux/Dockerfile` at line 33, Update the maturin build invocation in
the Dockerfile: modify the RUN command that calls "maturin build --release
--strip --interpreter python3" to include "--out dist" so the build artifacts
land in /app/dist; then adjust the subsequent COPY step that references the
wheel (the COPY --from=builder ... *.whl line) to copy from /app/dist/*.whl
instead of the current target directory.
README.md (1)

116-155: Document the PyPI install path as the primary option.

This release is explicitly about easier installation via PyPI, but Option A still points users to platform-specific GitHub wheel URLs. I’d add pip install binaryoptionstoolsv2 first and keep the direct release URLs as a fallback.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@README.md` around lines 116 - 155, Update the installation section to show
the PyPI package as the primary, e.g., add a new "Option A: PyPI (Recommended)"
with the command `pip install binaryoptionstoolsv2` (or replace current Option A
heading with this), then demote the GitHub wheel URLs to a new "Option B:
Prebuilt Wheels (Fallback)" and keep the existing platform-specific wheel
examples and the existing "Option B/C: Build from Source" instructions (rename
as needed) so the wheel URLs remain as fallbacks; modify the README headings and
example commands accordingly to ensure users see `pip install
binaryoptionstoolsv2` first while preserving the current direct wheel and build
instructions.
CHANGELOG.md (1)

16-24: Make the 0.2.9 release notes more specific.

Entries like Updated python support, Balance returning -1 (possibly), and Unsafe unwraps are too vague for a changelog. Please name the actual supported version range, affected API/failure mode, and the PyPI publishing change; double-encoded JSON messages also reads cleaner on Line 17.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@CHANGELOG.md` around lines 16 - 24, Update the 0.2.9 CHANGELOG entries to be
specific: change "Updated python support" to name the supported Python versions
and PyPI publishing change (e.g., "Python 3.8–3.11 support; publish wheel to
PyPI"), change "Improved SSID parsing to prevent double encoded JSON msgs" to
"Improved SSID parsing to prevent double-encoded JSON messages", replace
"Balance returning -1 (possibly)" with the affected API and failure mode (e.g.,
"Fix: account balance API returned -1 on malformed balance responses in
BalanceService"), and replace "Unsafe unwraps" with the files/functions where
unwraps were fixed (reference specific symbols or modules, e.g., "Fixed unsafe
unwraps in Ssid::Display and auth_handshake::parse_credentials"). Also keep the
existing Ssid::Display note but ensure wording clarifies it now returns the raw
auth message (`42[\"auth\",{...}]`) sent during the WebSocket handshake.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In @.github/workflows/CI.yml:
- Around line 30-33: Replace the unsafe curl|sh installer step named "Install
uv" with the official action astral-sh/setup-uv pinned to a stable version
(e.g., astral-sh/setup-uv@v7) and keep the subsequent "Install dependencies with
uv" step as-is so uv is on PATH for the next step; alternatively, if you must
keep the shell installer, explicitly export the install directory to
$GITHUB_PATH after the installer step so the "Install dependencies with uv" step
can find uv. Use the job step names "Install uv" and "Install dependencies with
uv" to locate and update the workflow.

In `@BinaryOptionsToolsV2/python/BinaryOptionsToolsV2/config.py`:
- Around line 71-84: The _update_pyconfig method and the to_dict method are
dropping public fields terminal_logging, log_level, and extra_duration; update
_update_pyconfig to assign self._pyconfig.terminal_logging,
self._pyconfig.log_level, and self._pyconfig.extra_duration from the Config
instance (same pattern used for max_allowed_loops, sleep_interval, etc.), and
update to_dict to include "terminal_logging", "log_level", and "extra_duration"
keys so JSON/FFI round-trips preserve those values; refer to methods
_update_pyconfig, to_dict and attributes terminal_logging, log_level,
extra_duration to locate the changes.
- Around line 149-151: The update() method currently uses hasattr(self, key)
which permits overwriting methods/properties/private state (e.g., to_json,
pyconfig, _locked, _pyconfig); change update() to only accept keys that are
actual dataclass fields (same filter used by from_dict()), e.g., iterate over
the dataclass field names (self.__dataclass_fields__ or
dataclasses.fields(self)) and set attributes only when key is in that set so
methods/properties/private attrs cannot be overwritten.
- Around line 13-22: The fallback uses an invalid level-2 relative import ("from
..BinaryOptionsToolsV2 import PyConfig") which raises "attempted relative import
beyond top-level package"; update the fallback inside _get_pyconfig to use a
level-1 relative import (e.g., "from .BinaryOptionsToolsV2 import PyConfig" or
"from . import PyConfig" if PyConfig is exported in __init__.py) so that
importing PyConfig from the BinaryOptionsToolsV2 package succeeds; ensure the
import path references the same symbol names (PyConfig, BinaryOptionsToolsV2) as
in the current diff.
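The update() fix suggested above can be sketched as follows; Config, max_allowed_loops, and sleep_interval are names from the review, but the class here is a cut-down stand-in, not the real implementation.

```python
# Sketch: restrict update() to genuine dataclass fields so methods,
# properties, and private state (e.g. to_json, _locked) cannot be overwritten.
from dataclasses import dataclass, fields


@dataclass
class Config:
    max_allowed_loops: int = 100
    sleep_interval: float = 0.1

    def update(self, **kwargs) -> None:
        # Same filter from_dict() uses: only real dataclass fields are assignable.
        allowed = {f.name for f in fields(self)}
        for key, value in kwargs.items():
            if key in allowed:
                setattr(self, key, value)


cfg = Config()
cfg.update(sleep_interval=0.5, to_json="evil", _locked=True)  # non-fields are ignored
```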

In `@BinaryOptionsToolsV2/python/BinaryOptionsToolsV2/tracing.py`:
- Around line 92-96: The current try/except in start_logs swallows all
exceptions (from os.makedirs or start_tracing) by printing and returning,
contradicting the docstring; change start_logs so startup failures propagate:
either remove the broad try/except or catch only expected exceptions and
re-raise after logging; if you want a log message, use the module logger (not
print) and then raise the original exception (or a wrapped one) so callers see
the failure; ensure references to os.makedirs, start_tracing, and the start_logs
function are updated accordingly.
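A sketch of the start_logs fix described above: log the failure via the module logger, then re-raise so callers see startup errors instead of a silently broken logger. start_tracing here is a stand-in for the real Rust-backed call, and the signature is an assumption.

```python
# Sketch: propagate startup failures from start_logs instead of swallowing them.
import logging
import os

logger = logging.getLogger(__name__)


def start_tracing(path: str, level: str) -> None:
    """Placeholder for the real Rust-backed tracing initializer."""


def start_logs(path: str, level: str = "DEBUG") -> None:
    try:
        os.makedirs(path, exist_ok=True)
        start_tracing(path, level)
    except Exception:
        logger.exception("Failed to start logging in %r", path)
        raise  # re-raise so the caller sees the failure
```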

In `@BinaryOptionsToolsV2/python/BinaryOptionsToolsV2/validator.py`:
- Around line 11-20: The fallback relative import in
BinaryOptionsToolsV2.validator is using a level-2 relative import which fails;
update the fallback import in the except block to use a level-1 relative import
so the sibling compiled module can be resolved: change the statement that
imports RawValidator (currently using from ..BinaryOptionsToolsV2 import
RawValidator) to a single-dot relative import (from .BinaryOptionsToolsV2 import
RawValidator) so RawValidator can be returned correctly; keep the subsequent
ImportError fallback to import BinaryOptionsToolsV2 and
getattr(BinaryOptionsToolsV2, "RawValidator") as-is.

In `@docker/linux/Dockerfile`:
- Around line 35-41: The final runner stage runs as root; create a non-root
user/group in the runner stage, chown the /app directory (and copied wheel files
from the COPY step) to that user, and set USER to that account before CMD so the
container no longer runs as root. Locate the runner stage (FROM
debian:bookworm-slim AS runner) where COPY --from=builder
/app/target/wheels/*.whl /app/ occurs, add steps to create a minimal user/group
(system or non-login), change ownership of /app and its contents, and then set
USER to that new user so subsequent commands (CMD ["ls", "-l", "/app"]) run
unprivileged.
- Line 24: The RUN instruction currently uses an unpinned package install ("pip
install maturin") which can lead to unreproducible builds; change that to pin a
specific stable maturin version (e.g., replace "pip install maturin" with "pip
install maturin==<stable_version>") or introduce a build ARG like
MATURIN_VERSION and use it in the RUN line to install
"maturin==${MATURIN_VERSION}" so builds remain reproducible and easy to update.

In `@examples/python/async/get_candles.py`:
- Around line 20-23: The example is hard-coding a 60s frame in the call to
get_candles, causing duplicate output across the time_frames loop; update the
call to use the loop variable (e.g., replace the literal 60 with the
frame/time_frame variable used in the loop) or remove the surrounding loop so
you only fetch once; ensure you modify the get_candles(...) invocation (the
symbol to change) to pass the intended frame variable instead of the constant.

In `@README.md`:
- Around line 118-135: The README and packaging metadata claim Python 3.8
compatibility but the wheels and Rust pyo3 binding use ABI3 for CPython 3.9
(`cp39-abi3`, `pyo3` `abi3-py39`), so update docs and pyproject to require
Python >=3.9: change the README text from "3.8 - 3.15" to "3.9+", remove the
Python 3.8 badge/mention, and in BinaryOptionsToolsV2/pyproject.toml update
requires-python to ">=3.9" and remove any Python 3.8 classifiers so the package
metadata matches the built wheels and pyo3 ABI configuration.

In `@tests/python/core/test_validator.py`:
- Around line 79-84: The test should avoid E712 and ARG001: rename the unused
parameter in crashing_func from msg to _msg (e.g., def crashing_func(_msg: str)
-> bool) and change the assertion to use identity comparison: assert
v.check("any message") is False; this keeps the intended behavior
(Validator.custom(crashing_func) returns False on exception) while satisfying
lint rules.
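Putting those lint fixes together, the test could look roughly like this. Validator.custom and check are the API names used in this PR, but the Validator class below is a minimal stand-in written only to illustrate the expected contract (a raising custom validator fails the check rather than crashing the caller):

```python
# Minimal stand-in illustrating the exception-safety contract under test.
from typing import Callable


class Validator:
    def __init__(self, func: Callable[[str], bool]):
        self._func = func

    @classmethod
    def custom(cls, func: Callable[[str], bool]) -> "Validator":
        return cls(func)

    def check(self, message: str) -> bool:
        try:
            return bool(self._func(message))
        except Exception:
            return False  # a crashing validator simply fails the check


def test_validator_custom_exception_safety():
    def crashing_func(_msg: str) -> bool:  # _msg unused, avoids ARG001
        raise RuntimeError("validator blew up")

    v = Validator.custom(crashing_func)
    assert v.check("any message") is False  # identity comparison, avoids E712
```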

---

Nitpick comments:
In `@CHANGELOG.md`:
- Around line 16-24: Update the 0.2.9 CHANGELOG entries to be specific: change
"Updated python support" to name the supported Python versions and PyPI
publishing change (e.g., "Python 3.8–3.11 support; publish wheel to PyPI"),
change "Improved SSID parsing to prevent double encoded JSON msgs" to "Improved
SSID parsing to prevent double-encoded JSON messages", replace "Balance
returning -1 (possibly)" with the affected API and failure mode (e.g., "Fix:
account balance API returned -1 on malformed balance responses in
BalanceService"), and replace "Unsafe unwraps" with the files/functions where
unwraps were fixed (reference specific symbols or modules, e.g., "Fixed unsafe
unwraps in Ssid::Display and auth_handshake::parse_credentials"). Also keep the
existing Ssid::Display note but ensure wording clarifies it now returns the raw
auth message (`42[\"auth\",{...}]`) sent during the WebSocket handshake.

In `@docker/linux/Dockerfile`:
- Line 33: Update the maturin build invocation in the Dockerfile: modify the RUN
command that calls "maturin build --release --strip --interpreter python3" to
include "--out dist" so the build artifacts land in /app/dist; then adjust the
subsequent COPY step that references the wheel (the COPY --from=builder ...
*.whl line) to copy from /app/dist/*.whl instead of the current target
directory.

In `@README.md`:
- Around line 116-155: Update the installation section to show the PyPI package
as the primary, e.g., add a new "Option A: PyPI (Recommended)" with the command
`pip install binaryoptionstoolsv2` (or replace current Option A heading with
this), then demote the GitHub wheel URLs to a new "Option B: Prebuilt Wheels
(Fallback)" and keep the existing platform-specific wheel examples and the
existing "Option B/C: Build from Source" instructions (rename as needed) so the
wheel URLs remain as fallbacks; modify the README headings and example commands
accordingly to ensure users see `pip install binaryoptionstoolsv2` first while
preserving the current direct wheel and build instructions.

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: 9a0314dc-48cc-406b-9f0d-694b72ddac2d

📥 Commits

Reviewing files that changed from the base of the PR and between 4b9b22b and 4419fc8.

⛔ Files ignored due to path filters (1)
  • bun.lock is excluded by !**/*.lock
📒 Files selected for processing (20)
  • .github/workflows/CI.yml
  • .gitignore
  • .serena/.gitignore
  • .serena/project.yml
  • BinaryOptionsToolsUni/Cargo.toml
  • BinaryOptionsToolsV2/python/BinaryOptionsToolsV2/config.py
  • BinaryOptionsToolsV2/python/BinaryOptionsToolsV2/tracing.py
  • BinaryOptionsToolsV2/python/BinaryOptionsToolsV2/validator.py
  • BinaryOptionsToolsV2/rust/Cargo.toml
  • CHANGELOG.md
  • CITATION.cff
  • Cargo.toml
  • README.md
  • crates/binary_options_tools/Cargo.toml
  • crates/core-pre/Cargo.toml
  • crates/core/Cargo.toml
  • docker/linux/Dockerfile
  • docs/examples
  • examples/python/async/get_candles.py
  • tests/python/core/test_validator.py
💤 Files with no reviewable changes (3)
  • .serena/.gitignore
  • docs/examples
  • .serena/project.yml

Comment thread .github/workflows/CI.yml
Comment on lines +30 to +33
- name: Install uv
  run: curl -LsSf https://astral.sh/uv/install.sh | sh
- name: Install dependencies with uv
  run: uv pip install --system mkdocs-material "mkdocstrings[python]"

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🌐 Web query:

What is Astral's recommended way to install uv in GitHub Actions? Does the shell installer persist uv on PATH across later steps, or should a pinned astral-sh/setup-uv action / $GITHUB_PATH be used instead?

💡 Result:

Astral’s docs recommend installing uv in GitHub Actions via the official astral-sh/setup-uv action (it installs uv, adds it to PATH for later steps, and can persist/cache uv’s cache). (docs.astral.sh)

If you use the shell installer (curl … | sh) in Actions, don’t rely on uv staying on PATH across steps: the installer adds uv to PATH primarily by modifying shell profiles, which later GHA steps won’t necessarily source. In CI, Astral instead points to an unmanaged install into a known directory, after which you must put that directory on PATH yourself (e.g., via $GITHUB_PATH). (docs.astral.sh)

So: prefer a pinned astral-sh/setup-uv (e.g., @v7 or a commit SHA), or if you use the installer, add the install dir to $GITHUB_PATH explicitly. (docs.astral.sh)

Use astral-sh/setup-uv action instead of curl | sh.

This job has contents: write permission, making the piped installer a supply-chain risk. Additionally, shell-profile edits don't persist on PATH across Actions steps, so the next step may fail. Use the official astral-sh/setup-uv action (pinned to a version like @v7), or if using the shell installer, explicitly add the install directory to $GITHUB_PATH.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/workflows/CI.yml around lines 30 - 33, Replace the unsafe curl|sh
installer step named "Install uv" with the official action astral-sh/setup-uv
pinned to a stable version (e.g., astral-sh/setup-uv@v7) and keep the subsequent
"Install dependencies with uv" step as-is so uv is on PATH for the next step;
alternatively, if you must keep the shell installer, explicitly export the
install directory to $GITHUB_PATH after the installer step so the "Install
dependencies with uv" step can find uv. Use the job step names "Install uv" and
"Install dependencies with uv" to locate and update the workflow.

Comment on lines +13 to +22
except (ImportError, AttributeError):
# Fallback to direct import
try:
from ..BinaryOptionsToolsV2 import PyConfig

return PyConfig
except ImportError:
import BinaryOptionsToolsV2

return getattr(BinaryOptionsToolsV2, "PyConfig")

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
set -euo pipefail

python - <<'PY'
from importlib._bootstrap import _resolve_name

for level in (1, 2):
    try:
        resolved = _resolve_name("BinaryOptionsToolsV2", "BinaryOptionsToolsV2", level)
        print(f"level={level}: {resolved}")
    except Exception as exc:
        print(f"level={level}: {type(exc).__name__}: {exc}")
PY

echo
echo "Package exports:"
fd -p '__init__.py' BinaryOptionsToolsV2/python/BinaryOptionsToolsV2 -x sh -c 'echo "### $1"; sed -n "1,200p" "$1"' sh {}

Repository: ChipaDevTeam/BinaryOptionsTools-v2

Length of output: 2642


The second fallback import has an invalid relative import level.

Line 16 uses from ..BinaryOptionsToolsV2 import PyConfig, which attempts a level-2 relative import from within the BinaryOptionsToolsV2 package. However, since there is no parent package above BinaryOptionsToolsV2, this relative import fails with ImportError: attempted relative import beyond top-level package. This means _get_pyconfig() still fails in the exact scenario this fallback is meant to handle.

Change the import to level-1: from .BinaryOptionsToolsV2 import PyConfig (or alternatively, from . import PyConfig if PyConfig is already exported from the package init.py).

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@BinaryOptionsToolsV2/python/BinaryOptionsToolsV2/config.py` around lines 13 -
22, The fallback uses an invalid level-2 relative import ("from
..BinaryOptionsToolsV2 import PyConfig") which raises "attempted relative import
beyond top-level package"; update the fallback inside _get_pyconfig to use a
level-1 relative import (e.g., "from .BinaryOptionsToolsV2 import PyConfig" or
"from . import PyConfig" if PyConfig is exported in __init__.py) so that
importing PyConfig from the BinaryOptionsToolsV2 package succeeds; ensure the
import path references the same symbol names (PyConfig, BinaryOptionsToolsV2) as
in the current diff.
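The import-level distinction the review describes can be checked without installing the package, using `importlib.util.resolve_name` to resolve names the same way the import machinery would: a level-1 import (`.name`) resolves inside the package, while a level-2 import (`..name`) climbs to a parent that a top-level package does not have.

```python
import importlib.util

# Level-1 ('.name') resolves within the package itself.
resolved = importlib.util.resolve_name(
    ".BinaryOptionsToolsV2", "BinaryOptionsToolsV2"
)

# Level-2 ('..name') needs a parent package; BinaryOptionsToolsV2 is
# top-level, so resolution fails exactly as described above.
error = None
try:
    importlib.util.resolve_name("..BinaryOptionsToolsV2", "BinaryOptionsToolsV2")
except ImportError as exc:
    error = str(exc)
```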

Comment on lines +71 to +84
def _update_pyconfig(self):
"""Updates the internal PyConfig with current values"""
if self._locked:
raise RuntimeError("Configuration is locked and cannot be modified after being used")

if self._pyconfig is None:
self._pyconfig = _get_pyconfig()()

self._pyconfig.max_allowed_loops = self.max_allowed_loops
self._pyconfig.sleep_interval = self.sleep_interval
self._pyconfig.reconnect_time = self.reconnect_time
self._pyconfig.connection_initialization_timeout_secs = self.connection_initialization_timeout_secs
self._pyconfig.timeout_secs = self.timeout_secs
self._pyconfig.urls = self.urls

⚠️ Potential issue | 🟠 Major

These public config fields are currently dropped.

terminal_logging, log_level, and extra_duration are part of the public Config state, but Lines 79-84 never copy them into self._pyconfig. to_dict() on Lines 119-128 also omits extra_duration, so JSON round-trips and FFI callers can see a different config than the Python object.

Also applies to: 119-128

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@BinaryOptionsToolsV2/python/BinaryOptionsToolsV2/config.py` around lines 71 -
84, The _update_pyconfig method and the to_dict method are dropping public
fields terminal_logging, log_level, and extra_duration; update _update_pyconfig
to assign self._pyconfig.terminal_logging, self._pyconfig.log_level, and
self._pyconfig.extra_duration from the Config instance (same pattern used for
max_allowed_loops, sleep_interval, etc.), and update to_dict to include
"terminal_logging", "log_level", and "extra_duration" keys so JSON/FFI
round-trips preserve those values; refer to methods _update_pyconfig, to_dict
and attributes terminal_logging, log_level, extra_duration to locate the
changes.

Comment on lines +149 to +151
for key, value in config_dict.items():
if hasattr(self, key):
setattr(self, key, value)

⚠️ Potential issue | 🟠 Major

Restrict update() to declared config fields.

Using hasattr(self, key) here lets callers overwrite methods (to_json), properties (pyconfig), and private state (_locked, _pyconfig). from_dict() already filters to dataclass fields; update() should do the same.

🔒 Suggested fix
-        for key, value in config_dict.items():
-            if hasattr(self, key):
-                setattr(self, key, value)
+        allowed = set(type(self).__dataclass_fields__)
+        for key, value in config_dict.items():
+            if key in allowed:
+                setattr(self, key, value)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@BinaryOptionsToolsV2/python/BinaryOptionsToolsV2/config.py` around lines 149
- 151, The update() method currently uses hasattr(self, key) which permits
overwriting methods/properties/private state (e.g., to_json, pyconfig, _locked,
_pyconfig); change update() to only accept keys that are actual dataclass fields
(same filter used by from_dict()), e.g., iterate over the dataclass field names
(self.__dataclass_fields__ or dataclasses.fields(self)) and set attributes only
when key is in that set so methods/properties/private attrs cannot be
overwritten.
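A minimal sketch of that filtered `update()`, with illustrative field names rather than the real Config class:

```python
from dataclasses import dataclass

# Sketch of the review's fix: filter update() through __dataclass_fields__ so
# only declared config fields are writable. Field names are illustrative.
@dataclass
class SketchConfig:
    timeout_secs: float = 30.0
    sleep_interval: float = 0.5

    def __post_init__(self):
        self._locked = False  # private state, not a dataclass field

    def update(self, config_dict: dict) -> None:
        allowed = set(type(self).__dataclass_fields__)
        for key, value in config_dict.items():
            if key in allowed:
                setattr(self, key, value)

cfg = SketchConfig()
# Hostile keys are silently ignored; only timeout_secs changes.
cfg.update({"timeout_secs": 60.0, "_locked": True, "update": "clobbered"})
```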

Comment on lines +92 to +96
try:
os.makedirs(path, exist_ok=True)
start_tracing(path, level, terminal, layers)
except Exception as e:
print(f"Error starting logs: {e}")

⚠️ Potential issue | 🟠 Major

Don't swallow logging startup failures.

The docstring says start_logs() raises on error, but Lines 95-96 turn every exception into a print() and return successfully. That hides broken log configuration from callers and makes startup failures look like success.

🛠️ Suggested fix
-    except Exception as e:
-        print(f"Error starting logs: {e}")
+    except Exception as exc:
+        raise RuntimeError(f"Error starting logs: {exc}") from exc
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@BinaryOptionsToolsV2/python/BinaryOptionsToolsV2/tracing.py` around lines 92
- 96, The current try/except in start_logs swallows all exceptions (from
os.makedirs or start_tracing) by printing and returning, contradicting the
docstring; change start_logs so startup failures propagate: either remove the
broad try/except or catch only expected exceptions and re-raise after logging;
if you want a log message, use the module logger (not print) and then raise the
original exception (or a wrapped one) so callers see the failure; ensure
references to os.makedirs, start_tracing, and the start_logs function are
updated accordingly.
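The re-raise pattern suggested above can be sketched as follows; `start_logs_sketch` is a hypothetical stand-in, not the real function. The key property is that `raise … from exc` chains the low-level error via `__cause__`, so callers see both the wrapper and the root cause.

```python
# Sketch of the suggested re-raise: wrap the low-level failure but chain it
# with `from exc` so callers can still reach the root cause via __cause__.
def start_logs_sketch(path: str) -> None:
    try:
        if not path:  # stand-in for os.makedirs / start_tracing failing
            raise OSError("invalid log path")
    except Exception as exc:
        raise RuntimeError(f"Error starting logs: {exc}") from exc

caught = None
try:
    start_logs_sketch("")
except RuntimeError as exc:
    caught = exc
```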

Comment thread docker/linux/Dockerfile
@@ -14,7 +23,19 @@ ENV PATH="$VENV_PATH/bin:$PATH"

RUN pip install --upgrade pip && pip install maturin

⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🌐 Web query:

What is the latest stable version of maturin for Python packaging?

💡 Result:

The latest stable release of maturin on PyPI is 1.12.6 (released March 1, 2026). [1]

Sources:
[1] (pypi.org)

Pin maturin version for reproducible builds.

The pip install maturin command lacks version pinning, which may cause inconsistent builds over time. Use a specific stable version instead.

📦 Suggested fix
-RUN pip install --upgrade pip && pip install maturin
+RUN pip install --upgrade pip && pip install maturin==1.12.6
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@docker/linux/Dockerfile` at line 24, The RUN instruction currently uses an
unpinned package install ("pip install maturin") which can lead to
unreproducible builds; change that to pin a specific stable maturin version
(e.g., replace "pip install maturin" with "pip install
maturin==<stable_version>") or introduce a build ARG like MATURIN_VERSION and
use it in the RUN line to install "maturin==${MATURIN_VERSION}" so builds remain
reproducible and easy to update.

Comment thread docker/linux/Dockerfile
Comment on lines +35 to +41
# Stage 3: Runner (Minimal image)
FROM debian:bookworm-slim AS runner
WORKDIR /app
COPY --from=builder /app/target/wheels/*.whl /app/
# The runner stage just holds the artifacts in this case as it's a build container.
# If it were an app, we would install the wheel here.
CMD ["ls", "-l", "/app"]

⚠️ Potential issue | 🟡 Minor

Add a non-root user to the final stage for security hardening.

The container runs as root, which is flagged by static analysis (DS-0002). Even for a build artifact container, using a non-root user is a security best practice that prevents privilege escalation in case of container escape.

🔒 Suggested fix
 # Stage 3: Runner (Minimal image)
 FROM debian:bookworm-slim AS runner
+RUN useradd --create-home --shell /bin/bash appuser
 WORKDIR /app
 COPY --from=builder /app/target/wheels/*.whl /app/
+RUN chown -R appuser:appuser /app
+USER appuser
 # The runner stage just holds the artifacts in this case as it's a build container.
 # If it were an app, we would install the wheel here.
 CMD ["ls", "-l", "/app"]
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@docker/linux/Dockerfile` around lines 35 - 41, The final runner stage runs as
root; create a non-root user/group in the runner stage, chown the /app directory
(and copied wheel files from the COPY step) to that user, and set USER to that
account before CMD so the container no longer runs as root. Locate the runner
stage (FROM debian:bookworm-slim AS runner) where COPY --from=builder
/app/target/wheels/*.whl /app/ occurs, add steps to create a minimal user/group
(system or non-login), change ownership of /app and its contents, and then set
USER to that new user so subsequent commands (CMD ["ls", "-l", "/app"]) run
unprivileged.

Comment on lines 20 to +23
candles = await api.get_candles("EURUSD_otc", 60, time)
# print(f"Raw Candles: {candles}")
candles_pd = pd.DataFrame.from_dict(candles)
print(f"Candles: {candles_pd}")
print(f"Raw Candles: {candles}")
# candles_pd = pd.DataFrame.from_dict(candles)
# print(f"Candles: {candles_pd}")

⚠️ Potential issue | 🟡 Minor

Use frame in the request or remove the inner loop.

Line 20 still hard-codes 60, so this example prints the same candle batch once for every entry in time_frames. Now that the script outputs raw payloads, that duplication is user-visible.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@examples/python/async/get_candles.py` around lines 20 - 23, The example is
hard-coding a 60s frame in the call to get_candles, causing duplicate output
across the time_frames loop; update the call to use the loop variable (e.g.,
replace the literal 60 with the frame/time_frame variable used in the loop) or
remove the surrounding loop so you only fetch once; ensure you modify the
get_candles(...) invocation (the symbol to change) to pass the intended frame
variable instead of the constant.
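A sketch of the corrected loop, with `get_candles_stub` as a hypothetical stand-in for the real `api.get_candles` so the snippet runs offline:

```python
import asyncio

# Hypothetical stand-in for api.get_candles; the only point demonstrated is
# that the loop variable `frame` is forwarded instead of a hard-coded 60.
async def get_candles_stub(asset: str, frame: int, period: int) -> dict:
    return {"asset": asset, "frame": frame, "period": period}

async def main() -> list:
    time_frames = [60, 300, 900]
    results = []
    for frame in time_frames:
        # Pass `frame`, not the literal 60, so each iteration fetches
        # a different candle batch.
        results.append(await get_candles_stub("EURUSD_otc", frame, 3600))
    return results

batches = asyncio.run(main())
```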

Comment thread README.md
Comment on lines +118 to +135
Install directly from our GitHub releases. Supports **Python 3.8 - 3.15**.

**Windows**

```bash
pip install "https://github.com/ChipaDevTeam/BinaryOptionsTools-v2/releases/download/v0.2.8/binaryoptionstoolsv2-0.2.8-cp39-abi3-win_amd64.whl"
pip install "https://github.com/ChipaDevTeam/BinaryOptionsTools-v2/releases/download/v0.2.9/binaryoptionstoolsv2-0.2.9-cp39-abi3-win_amd64.whl"
```

**Linux**

```bash
pip install "https://github.com/ChipaDevTeam/BinaryOptionsTools-v2/releases/download/v0.2.8/binaryoptionstoolsv2-0.2.8-cp39-abi3-manylinux_2_28_x86_64.whl"
pip install "https://github.com/ChipaDevTeam/BinaryOptionsTools-v2/releases/download/v0.2.9/binaryoptionstoolsv2-0.2.9-cp39-abi3-manylinux_2_28_x86_64.whl"
```

**macOS (Apple Silicon)**

```bash
pip install "https://github.com/ChipaDevTeam/BinaryOptionsTools-v2/releases/download/v0.2.8/binaryoptionstoolsv2-0.2.8-cp39-abi3-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl"
pip install "https://github.com/ChipaDevTeam/BinaryOptionsTools-v2/releases/download/v0.2.9/binaryoptionstoolsv2-0.2.9-cp39-abi3-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl"

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
set -e

echo "== ABI floor from Rust binding =="
rg -n 'abi3-py39' BinaryOptionsToolsV2/rust/Cargo.toml

echo
echo "== README support claims and wheel tags =="
rg -n '3\.8|3\.15|cp39-abi3' README.md

echo
echo "== Python package metadata, if present =="
fd '^pyproject\.toml$' . -x sh -c '
  file="$1"
  echo "-- $file --"
  rg -n "requires-python|Programming Language :: Python" "$file" || true
' sh {}

Repository: ChipaDevTeam/BinaryOptionsTools-v2

Length of output: 1619


Update Python version floor to 3.9+ across all documentation and package metadata.

The published wheels are tagged cp39-abi3 and the Rust binding is configured with pyo3 abi3-py39, which requires Python 3.9+. However, the README (Lines 4 and 118) and pyproject.toml (Line 14) incorrectly claim Python 3.8 support. Python 3.8 users will not be able to install this package as documented. Before publishing, update:

  • README Line 118: Change 3.8 - 3.15 to 3.9+
  • README Line 4: Remove Python 3.8 from the badge
  • BinaryOptionsToolsV2/pyproject.toml Line 14: Change requires-python = ">=3.8" to >=3.9
  • Remove Python 3.8 from classifiers in pyproject.toml (Line 23)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@README.md` around lines 118 - 135, The README and packaging metadata claim
Python 3.8 compatibility but the wheels and Rust pyo3 binding use ABI3 for
CPython 3.9 (`cp39-abi3`, `pyo3` `abi3-py39`), so update docs and pyproject to
require Python >=3.9: change the README text from "3.8 - 3.15" to "3.9+", remove
the Python 3.8 badge/mention, and in BinaryOptionsToolsV2/pyproject.toml update
requires-python to ">=3.9" and remove any Python 3.8 classifiers so the package
metadata matches the built wheels and pyo3 ABI configuration.
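The wheel-tag reasoning can be demonstrated directly: PEP 427 filenames encode the Python and ABI tags, so the supported floor is recoverable from the artifact name itself, regardless of what the README or pyproject.toml claim.

```python
# PEP 427 wheel filenames encode compatibility tags:
#   {dist}-{version}-{python tag}-{abi tag}-{platform tag}.whl
# so a cp39-abi3 wheel requires CPython >= 3.9.
def wheel_tags(filename: str) -> dict:
    stem = filename.removesuffix(".whl")
    dist, version, py_tag, abi_tag, platform_tag = stem.split("-", 4)
    return {"dist": dist, "version": version, "python": py_tag,
            "abi": abi_tag, "platform": platform_tag}

tags = wheel_tags("binaryoptionstoolsv2-0.2.9-cp39-abi3-win_amd64.whl")
min_minor = int(tags["python"].removeprefix("cp3"))  # 9 -> Python 3.9 floor
```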

Comment on lines +79 to +84
def crashing_func(msg: str) -> bool:
raise ValueError("Simulated crash")

v = Validator.custom(crashing_func)
# This should return False instead of crashing the process
assert v.check("any message") == False

⚠️ Potential issue | 🟡 Minor

Make the new test use a lint-safe boolean assertion.

Line 84 is flagged as E712. Renaming the unused argument on Line 79 also clears ARG001 in the same block.

🧹 Suggested fix
-    def crashing_func(msg: str) -> bool:
+    def crashing_func(_msg: str) -> bool:
         raise ValueError("Simulated crash")
@@
-    assert v.check("any message") == False
+    assert v.check("any message") is False
🧰 Tools
🪛 Ruff (0.15.4)

[warning] 79-79: Unused function argument: msg

(ARG001)


[warning] 80-80: Avoid specifying long messages outside the exception class

(TRY003)


[error] 84-84: Avoid equality comparisons to False; use not v.check("any message"): for false checks

Replace with not v.check("any message")

(E712)

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@tests/python/core/test_validator.py` around lines 79 - 84, The test should
avoid E712 and ARG001: rename the unused parameter in crashing_func from msg to
_msg (e.g., def crashing_func(_msg: str) -> bool) and change the assertion to
use identity comparison: assert v.check("any message") is False; this keeps the
intended behavior (Validator.custom(crashing_func) returns False on exception)
while satisfying lint rules.
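The E712 distinction can be seen with plain values: `0` compares equal to `False` but is not the `False` singleton, so `is False` asserts strictly that the function returned the boolean.

```python
# Why E712 matters: `== False` also matches values that merely compare equal
# to False (like 0), while `is False` checks for the False singleton itself.
def is_strictly_false(value) -> bool:
    return value is False

zero = 0
loose = (zero == False)           # noqa: E712 - shown on purpose
strict = is_strictly_false(zero)  # 0 is not the False object
```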

@theshadow76 theshadow76 merged commit 32e79c2 into ChipaDevTeam:master Mar 9, 2026
2 checks passed