ScryNeuro is a high-performance bridge between Scryer Prolog and Python, designed for neuro-symbolic AI research. It enables Scryer Prolog programs to seamlessly call Python neural components — LLMs, deep neural networks, reinforcement learning agents, NumPy, PyTorch — while preserving Prolog's logical reasoning capabilities.
Inspired by Jurassic.pl (SWI-Prolog ↔ Julia bridge).
[ Scryer Prolog (Logic) ] <-> [ Rust cdylib (FFI Bridge) ] <-> [ PyO3 (Glue) ] <-> [ Python Runtime (Neural/Perception) ]
- Scryer Prolog — logical reasoning and top-level control flow.
- Rust cdylib (`libscryneuro.so`/`.dylib`) — FFI bridge with handle-based object registry.
- PyO3 — embeds Python within Rust; manages the GIL and type conversions.
- Python — executes neural predicates, data processing, library calls (PyTorch, NumPy, OpenAI, etc.).
NN, LLM, and RL functionality is provided by opt-in plugins — separate modules loaded via `use_module`. The core (`prolog/scryer_py.pl`) provides only the `py_*` predicates and the `:=` operator.
| Plugin | Module file | Predicates |
|---|---|---|
| Neural Networks | `prolog/scryer_nn.pl` | `nn_load/3,4`, `nn_predict/3,4` |
| Large Language Models | `prolog/scryer_llm.pl` | `llm_load/3,4`, `llm_generate/3,4` |
| Reinforcement Learning | `prolog/scryer_rl.pl` | `rl_create/4`, `rl_load/3,4`, `rl_save/2`, `rl_action/3,4`, `rl_train/2,3`, `rl_evaluate/3`, `rl_info/2` |
Each plugin has a matching Python runtime module (python/scryer_*_runtime.py) that is loaded lazily on first use.
The agent subsystem reads profile configuration from JSON:
- Default file: `python/config/agent_profiles.json`
- Override with env: `SCRYNEURO_AGENT_CONFIG=/abs/path/to/agent_profiles.json`
- Optional local override file: `<config>.local.json` (example: `python/config/agent_profiles.local.json`)
When a local override exists, it is deep-merged into the base config (nested objects are merged; scalar values are overridden).
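The merge rule above can be sketched in Python (an illustration of the stated behaviour, not the actual runtime code; the profile fragments are made up):

```python
def deep_merge(base, override):
    """Recursively merge `override` into `base`: nested dicts are merged,
    scalar (and list) values from `override` replace the base values."""
    merged = dict(base)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = deep_merge(merged[key], value)
        else:
            merged[key] = value
    return merged

# Hypothetical profile fragments for illustration:
base = {"provider": "openai", "openai": {"model": "auto", "temperature": 0.2}}
local = {"openai": {"model": "gpt-4o-mini"}}
print(deep_merge(base, local))
# → {'provider': 'openai', 'openai': {'model': 'gpt-4o-mini', 'temperature': 0.2}}
```

Only the nested `model` key is overridden; sibling keys in the base config survive the merge.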
Runtime precedence for effective settings is:
- Explicit options passed at agent creation (highest priority)
- Profile fields from config file(s)
- Environment / `.env` fallback (for OpenAI: `OPENAI_API_KEY`, `OPENAI_BASE_URL`, `OPENAI_MODEL` when model is `auto`)
- Hard defaults (`provider=openai`, `model=auto`, then `OPENAI_MODEL` fallback to `gpt-4o-mini`)
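That chain can be sketched as follows (a minimal illustration of the precedence order; the function and parameter names are assumptions, not the runtime's actual API):

```python
import os

def effective_model(explicit=None, profile=None):
    """Resolve the effective model following the precedence above:
    explicit option > profile field > OPENAI_MODEL env var > hard default."""
    if explicit is not None:            # highest priority: creation-time option
        return explicit
    if profile not in (None, "auto"):   # profile field, unless it defers via "auto"
        return profile
    return os.environ.get("OPENAI_MODEL", "gpt-4o-mini")  # env fallback, then default

print(effective_model(explicit="gpt-4o"))   # explicit option wins
print(effective_model(profile="gpt-4.1"))   # profile field wins over env/default
```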
| Component | Version | Notes |
|---|---|---|
| Rust | stable ≥ 1.70 | rustup recommended |
| Python | 3.10 – 3.13 | with shared library (libpython3.x.so / .dylib) |
| Scryer Prolog | latest git | must support library(ffi) |
Note on Python 3.14+: current `pyo3 = 0.23.x` may reject Python 3.14 by default. Our build scripts now auto-enable `PYO3_USE_ABI3_FORWARD_COMPATIBILITY=1` when Python >= 3.14 is detected.
The compatibility flag is only used for the build process in the script and is unset afterward, so it does not leak into normal runtime commands.
# Install rustup (if not installed)
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
source ~/.cargo/env
# Verify
rustc --version
cargo --version

# Build from source (requires Rust)
git clone https://github.com/mthom/scryer-prolog.git
cd scryer-prolog
cargo install --path .
# Verify
scryer-prolog --version

ScryNeuro links against whatever `python3` is active at build time, and loads `libpython3.x.so` at runtime. Both must match.
# Create a dedicated environment
conda create -n scryneuro python=3.12 numpy -y
conda activate scryneuro
# Install ML libraries as needed
conda install pytorch torchvision torchaudio pytorch-cuda=12.4 -c pytorch -c nvidia # GPU
# OR
conda install pytorch torchvision torchaudio cpuonly -c pytorch # CPU only
# Verify shared library exists
python3 -c "import sysconfig; print(sysconfig.get_config_var('LIBDIR'))"
# Should print something like: /home/user/miniconda3/envs/scryneuro/lib
ls $(python3 -c "import sysconfig; print(sysconfig.get_config_var('LIBDIR'))")/libpython3*.so*

# Install uv if needed
curl -LsSf https://astral.sh/uv/install.sh | sh
# Create project venv
uv venv .venv --python 3.12
source .venv/bin/activate
# Install dependencies
uv pip install numpy torch
# Verify
python3 -c "import sysconfig; print(sysconfig.get_config_var('LIBDIR'))"

# Debian/Ubuntu
sudo apt install python3-dev python3-numpy
# Fedora
sudo dnf install python3-devel python3-numpy
# macOS (Homebrew)
brew install python@3.12 numpy

Critical: Python must be built with shared library support. Conda and system packages include this by default. If using `pyenv`, build with: `PYTHON_CONFIGURE_OPTS="--enable-shared" pyenv install 3.12`
git clone <repo-url> ScryNeuro
cd ScryNeuro
# Activate your Python environment first!
conda activate scryneuro # or: source .venv/bin/activate

Use the provided build script for your platform — it handles `cargo build --release`, copying the library to the project root, and exporting `PYLIB`, `LD_LIBRARY_PATH` / `DYLD_LIBRARY_PATH`, and `SCRYNEURO_HOME`:
# Linux
source build_linux.sh
# macOS
source build_macos.sh

If your shell/environment still invokes `cargo build --release` directly with Python 3.14+, set:

PYO3_USE_ABI3_FORWARD_COMPATIBILITY=1 cargo build --release

Why `source`? Running with `source` (or `. ./build_linux.sh`) makes the exported variables persist in your current shell. Running as `./build_linux.sh` sets them only in a subshell that immediately exits.
Alternatively, build manually:
cargo build --release
cp target/release/libscryneuro.so ./ # Linux
# cp target/release/libscryneuro.dylib ./ # macOS

The build output should show `Building with Python 3.12.x` (matching your active environment).
ScryNeuro requires two shared libraries at runtime:
- `libscryneuro.so` — the Prolog↔Python bridge (in the project root after `cp`)
- `libpython3.x.so` — the Python shared library (location varies by system/environment)
The `LD_LIBRARY_PATH` (Linux) or `DYLD_LIBRARY_PATH` (macOS) variable tells the OS dynamic linker where to find these `.so` files.
# Linux — robust command that works on all distros and environments:
PYLIB=$(python3 -c "import sysconfig; print(sysconfig.get_config_var('LIBDIR'))")
LD_LIBRARY_PATH=".:$PYLIB:$LD_LIBRARY_PATH" scryer-prolog examples/basic.pl
# macOS:
PYLIB=$(python3 -c "import sysconfig; print(sysconfig.get_config_var('LIBDIR'))")
DYLD_LIBRARY_PATH=".:$PYLIB:$DYLD_LIBRARY_PATH" scryer-prolog examples/basic.pl

Shortcut: On some systems (e.g., Arch Linux with the `python` system package), `libpython3.x.so` is already in `/usr/lib/` (a default linker search path), so `LD_LIBRARY_PATH=.` alone suffices. However, the robust command above works everywhere — including Debian/Ubuntu, Fedora, and conda/venv environments where `libpython` lives in a non-default path. See Understanding `LD_LIBRARY_PATH` for details.
Expected output:
=== Arithmetic ===
2^10 = 1024
sum(0..99) = 4950
...
=== All basic examples complete ===
ScryNeuro can be called from Prolog projects located outside the ScryNeuro directory. There are two approaches:
Set SCRYNEURO_HOME to point to your ScryNeuro installation. py_init/0 will automatically discover the shared library and configure Python's sys.path.
export SCRYNEURO_HOME=/path/to/ScryNeuro
# Ensure both libscryneuro.so and libpython3.x.so are discoverable
PYLIB=$(python3 -c "import sysconfig; print(sysconfig.get_config_var('LIBDIR'))")
export LD_LIBRARY_PATH="$SCRYNEURO_HOME:$PYLIB:$LD_LIBRARY_PATH" # Linux
# export DYLD_LIBRARY_PATH="$SCRYNEURO_HOME:$PYLIB:$DYLD_LIBRARY_PATH" # macOS
# Run your project from anywhere
scryer-prolog /path/to/your_project/main.pl

In your Prolog code, use an absolute path to load the module:
:- op(700, xfx, :=).
:- use_module('/path/to/ScryNeuro/prolog/scryer_py').
main :-
py_init, %% auto-discovers library via SCRYNEURO_HOME
X := py_eval("1 + 2"),
py_to_int(X, V),
format("Result: ~d~n", [V]),
py_free(X),
py_finalize.

Use `py_init_home/1` to specify the ScryNeuro root directory directly, without setting any environment variables:
:- use_module('/path/to/ScryNeuro/prolog/scryer_py').
main :-
py_init_home("/path/to/ScryNeuro"),
%% ... your code ...
py_finalize.

You still need `LD_LIBRARY_PATH` (or `DYLD_LIBRARY_PATH` on macOS) to include the ScryNeuro directory so the OS can find `libscryneuro.so`.
Create a run.sh in your project:
#!/bin/bash
# Run a Prolog script with ScryNeuro from any directory
SCRYNEURO=/path/to/ScryNeuro
eval "$(conda shell.bash hook)"
conda activate scryneuro
export SCRYNEURO_HOME=$SCRYNEURO
PYLIB=$(python3 -c "import sysconfig; print(sysconfig.get_config_var('LIBDIR'))")
export LD_LIBRARY_PATH="$SCRYNEURO:$PYLIB:$LD_LIBRARY_PATH"
scryer-prolog "$@"

Then run: `./run.sh main.pl`
ScryNeuro loads two shared libraries at runtime:
| Library | What it is | Where it lives |
|---|---|---|
| `libscryneuro.so` | The Prolog↔Python bridge | ScryNeuro project root (after `cp`) |
| `libpython3.x.so` | Python's shared library | Varies by system and environment |
The Linux dynamic linker (`ld.so`) searches for `.so` files in this order:
1. `LD_LIBRARY_PATH` (user-set)
2. `/etc/ld.so.cache` (configured via `/etc/ld.so.conf` and `ldconfig`)
3. `/lib`, `/usr/lib` (built-in defaults)
If both libraries are in those default paths, you don’t need LD_LIBRARY_PATH at all. In practice, libscryneuro.so is never in a default path (it’s in your project), so you always need at least . in LD_LIBRARY_PATH.
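Conceptually, the lookup can be modelled like this (a deliberate simplification of what `ld.so` actually does, ignoring rpaths, caching, and architecture checks):

```python
import os

def find_shared_lib(name, ld_library_path=(), cache_dirs=(), defaults=("/lib", "/usr/lib")):
    """Simplified model of the ld.so search order: LD_LIBRARY_PATH first,
    then the ld.so.cache directories, then the built-in defaults."""
    for directory in (*ld_library_path, *cache_dirs, *defaults):
        candidate = os.path.join(directory, name)
        if os.path.exists(candidate):
            return candidate
    return None  # the linker would report "cannot open shared object file"
```

Because `libscryneuro.so` sits in your project directory, only the `LD_LIBRARY_PATH` stage can ever find it — which is why `.` must be on that path.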
Whether you also need to add libpython’s path depends on your system:
| Environment | libpython location | `LD_LIBRARY_PATH=.` sufficient? | Why |
|---|---|---|---|
| Arch Linux + system `python` package | `/usr/lib/` | ✅ Yes | `/usr/lib/` is a default search path |
| Arch Linux + conda (but system `python` pkg installed) | `/usr/lib/` (system pkg provides it) | ✅ Yes | Even with conda active, the system `libpython.so` in `/usr/lib/` is found by the linker |
| Debian/Ubuntu + `libpython3-dev` | `/usr/lib/x86_64-linux-gnu/` | ✅ Yes | Multiarch path is in `ld.so.conf` |
| Debian/Ubuntu + conda | `~/miniconda3/envs/.../lib/` | ❌ No | Conda's lib dir is not in any default search path |
| Fedora + conda | `~/miniconda3/envs/.../lib/` | ❌ No | Same reason |
| Any distro + pyenv (with `--enable-shared`) | `~/.pyenv/versions/.../lib/` | ❌ No | Not in default path |
| Using `run.sh` wrapper | Any | ✅ (handled by script) | Script sets `LD_LIBRARY_PATH` automatically |
Key insight: There are two separate dependency layers:
- OS dynamic linker — finds `.so` files (this is what `LD_LIBRARY_PATH` controls)
- Python package system — finds Python packages/modules (this is what conda/pip manages)
Conda provides Python packages, but libpython.so itself may come from either the conda environment or the system. When a system package (e.g., Arch’s python pacman package) provides libpython.so in /usr/lib/, the linker finds it there regardless of which conda environment is active.
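To see which `libpython` your active interpreter was built against (handy when diagnosing which of the two layers is failing), you can query `sysconfig` directly; on most CPython builds `LDLIBRARY` holds the library file name:

```python
import sysconfig

# Directory that should contain libpython for the active interpreter:
libdir = sysconfig.get_config_var("LIBDIR")
# File name of the library itself, e.g. "libpython3.12.so" (build-dependent;
# it may name a static .a archive if Python was built without --enable-shared):
ldlib = sysconfig.get_config_var("LDLIBRARY")
print("libpython directory:", libdir)
print("libpython file name:", ldlib)
```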
One-liner (always works):
PYLIB=$(python3 -c "import sysconfig; print(sysconfig.get_config_var('LIBDIR'))")
LD_LIBRARY_PATH=".:$PYLIB:$LD_LIBRARY_PATH" scryer-prolog examples/basic.pl

If you know `.` alone is enough for your system:

LD_LIBRARY_PATH=. scryer-prolog examples/basic.pl

The RTLD_GLOBAL mechanism in `spy_init()` automatically re-opens `libpython3.x.so` with global symbol visibility. This is required for C extensions like NumPy and PyTorch to resolve symbols correctly. You don't need to do anything special — just ensure libpython is discoverable via `LD_LIBRARY_PATH` as described above.

If RTLD_GLOBAL auto-detection fails (rare), fall back to `LD_PRELOAD`:

export LD_PRELOAD=$(python3 -c "import sysconfig, os; print(os.path.join(sysconfig.get_config_var('LIBDIR'), 'libpython3.12.so'))")

To avoid setting `LD_LIBRARY_PATH` every session, add this to `~/.bashrc` or `~/.zshrc`:
# === ScryNeuro Configuration ===
# Set SCRYNEURO_HOME to your ScryNeuro installation directory.
# This enables cross-project usage (calling ScryNeuro from any directory).
export SCRYNEURO_HOME="$HOME/path/to/ScryNeuro"
# Ensure the dynamic linker can find libscryneuro.so and libpython3.x.so.
# This queries the active Python environment for libpython's location.
export LD_LIBRARY_PATH="$SCRYNEURO_HOME:$(python3 -c 'import sysconfig; print(sysconfig.get_config_var("LIBDIR"))'):$LD_LIBRARY_PATH"

Note: If you use conda, the `python3` in the snippet above refers to whichever Python is active when the shell starts. If you activate a different conda environment later, `$LD_LIBRARY_PATH` will still point to the original environment's `libpython`. This is usually fine because `libpython3.12.so` is ABI-compatible across conda envs of the same Python version. If you switch Python versions, rebuild ScryNeuro (`cargo clean && cargo build --release`).

Alternative: If you prefer a per-invocation approach (no persistent config), use the `run.sh` wrapper described in Cross-Project Usage or the convenience wrapper below.
#!/bin/bash
# Activate conda env and run a Prolog script with ScryNeuro
eval "$(conda shell.bash hook)"
conda activate scryneuro
PYLIB=$(python3 -c "import sysconfig; print(sysconfig.get_config_var('LIBDIR'))")
LD_LIBRARY_PATH=".:$PYLIB:$LD_LIBRARY_PATH" scryer-prolog "$@"

On macOS, the shared library extension is `.dylib`, and the environment variable is `DYLD_LIBRARY_PATH`:
# Build
cargo build --release
cp target/release/libscryneuro.dylib ./
# Run (robust — works with conda, uv, Homebrew, system Python):
PYLIB=$(python3 -c "import sysconfig; print(sysconfig.get_config_var('LIBDIR'))")
DYLD_LIBRARY_PATH=".:$PYLIB:$DYLD_LIBRARY_PATH" scryer-prolog examples/basic.pl
# Run (shortcut — if libpython is already in a default search path):
DYLD_LIBRARY_PATH=. scryer-prolog examples/basic.pl

Note: macOS SIP (System Integrity Protection) strips `DYLD_LIBRARY_PATH` from child processes in some contexts (e.g., from GUI apps, or when the binary is in `/usr/bin`). If you encounter issues:
- Use `install_name_tool` to embed the rpath: `install_name_tool -add_rpath @loader_path/. target/release/libscryneuro.dylib`
- Or place `libscryneuro.dylib` in a standard search path like `/usr/local/lib`.
macOS does not need the RTLD_GLOBAL workaround — Python C extensions resolve symbols differently on Darwin. The `spy_init()` code handles this via `#[cfg(target_os = "linux")]`.
If you switch conda environments (or Python versions), you must rebuild:
conda activate other_env
cargo clean # Remove old build artifacts linked to previous Python
cargo build --release
cp target/release/libscryneuro.so ./ # or .dylib on macOS

The build system detects Python via `python3` in your PATH. You can override this with:

PYO3_PYTHON=/path/to/specific/python3 cargo build --release

:- op(700, xfx, :=).
:- use_module('prolog/scryer_py').
main :-
% Initialize the Python interpreter and load the bridge library.
py_init,
% Evaluate a Python expression and get a handle to the result.
X := py_eval("1 + 2"),
% Convert the Python integer handle to a Prolog integer.
py_to_int(X, Val),
% Print the result.
format("Result: ~d~n", [Val]),
% Release the Python object handle to prevent memory leaks.
py_free(X),
% Shut down the Python interpreter.
py_finalize.

This example demonstrates the fundamental workflow: initialize the bridge, evaluate Python code to get a handle, convert the result to a Prolog-native value, print it, free the handle, and shut down. Every ScryNeuro program follows this pattern.
If you do not want to build a frontend app, use the built-in Python Gradio page for quick testing.
pip install gradio
python python/web_ui/app_gradio.py

Then open: http://127.0.0.1:7860
- Select profile/provider/model context
- Create/replace agent from profile
- Enable builtin tools
- Load skills
- Multi-turn conversation with persistent context (same agent)
- Run a task (`agent_run`) and render chat history
- Refresh conversation view from runtime state
- Reset conversation while keeping the same profile/tools/skills config
- Inspect trace (`agent_trace`)
- Close agent (`agent_unload`)
- `python/web_ui/app_gradio.py`: UI only
- `python/web_ui/agent_adapter.py`: thin adapter layer calling existing runtime APIs
- Core logic remains in `python/scryer_agent_runtime.py` and related modules
Python objects are stored in a Rust-side HashMap registry. From the Prolog perspective, these objects are represented as opaque integer IDs called handles.
- Python objects live in a Rust-side HashMap. Prolog sees them as opaque integer IDs (handles).
- Handle `0` is the error sentinel and never represents a valid Python object. Valid handles start at 1, monotonically increasing.
- When you pass a handle back to a `py_*` function, the Rust layer looks up the actual Python object in the registry.
- Think of handles like file descriptors — opaque numbers that reference real resources.
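The handle semantics can be sketched as a toy model in Python (the real registry is a `Mutex`-protected `HashMap` in Rust; this only illustrates the behaviour described above):

```python
class HandleRegistry:
    """Toy model of the handle registry: opaque integer IDs map to live objects."""

    def __init__(self):
        self._objects = {}  # handle -> object
        self._next = 1      # 0 is reserved as the error sentinel

    def register(self, obj):
        handle = self._next
        self._next += 1     # handles are monotonically increasing, never reused
        self._objects[handle] = obj
        return handle

    def lookup(self, handle):
        if handle not in self._objects:
            raise KeyError(f"invalid or freed handle: {handle}")
        return self._objects[handle]

    def free(self, handle):
        self.lookup(handle)       # freeing an invalid handle is an error
        del self._objects[handle]

    def count(self):
        return len(self._objects)

reg = HandleRegistry()
h = reg.register([1, 2, 3])
print(h, reg.lookup(h), reg.count())  # 1 [1, 2, 3] 1
reg.free(h)
print(reg.count())  # 0
```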
Every FFI call to the Python runtime acquires the GIL automatically. Python's GIL ensures that only one thread executes Python bytecode at a time. While this is handled transparently, it means that all Python calls are serialized even from multi-threaded Prolog.
The registry, managed by src/registry.rs, is a thread-safe (Mutex-protected) HashMap that tracks all live Python objects being used by Prolog.
- Every `py_eval`, `py_import`, `py_from_*`, or similar function creates a new entry in the registry and increments the Python object's reference count.
- Calling `py_free/1` removes the entry from the registry and decrements the Python object's reference count.
- Once freed, a handle becomes invalid. Using a freed handle will result in an error.
At the Rust FFI level, three primary patterns are used:
- Handle functions: return `0` on error.
- Status functions: return `-1` on error, `0` on success.
- String functions: return an empty string `""` on error.
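Sketched in Python for illustration (the real functions are Rust FFI symbols; the names here are made up), the three sentinel conventions and the wrapper-side checks look like:

```python
class PythonError(Exception):
    """Raised by the wrapper layer when an FFI sentinel signals an error."""

def check_handle(handle):
    # Handle functions: 0 is the error sentinel; valid handles start at 1.
    if handle == 0:
        raise PythonError("handle function failed")
    return handle

def check_status(status):
    # Status functions: -1 means error, 0 means success.
    if status == -1:
        raise PythonError("status function failed")

def check_string(result):
    # String functions: an empty string signals an error.
    if result == "":
        raise PythonError("string function failed")
    return result

assert check_handle(7) == 7
check_status(0)                  # success: no exception raised
assert check_string("ok") == "ok"
```

This is exactly the mapping the Prolog layer performs when it turns sentinel values into `python_error` exceptions.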
The Prolog layer now enforces strict error handling for all high-level conversion predicates: it clears stale thread-local errors before calls and throws error(python_error(Msg), Context) if any conversion/inspection call leaves an error.
In practice, this means py_to_int/2, py_to_float/2, py_to_str/2, py_to_repr/2, py_to_json/2, py_list_len/2, py_handle_count/1, and py_free/1 now throw on failure instead of exposing ambiguous sentinel values.
Use catch/3 to handle these errors gracefully:
catch(
(X := py_eval("1/0"), py_to_int(X, V)),
error(python_error(Msg), _),
format("Caught: ~s~n", [Msg])
).

Every handle represents a resource in the Rust/Python layers. You must free handles when they are no longer needed to prevent memory leaks.
- `py_free/1`: manual cleanup of a specific handle.
- `with_py(Handle, Goal)`: RAII-style cleanup. Executes the goal and then frees the handle regardless of whether the goal succeeded, failed, or threw an exception.
- `py_handle_count/1`: diagnostic tool that returns the number of currently active handles.
Scryer Prolog represents double-quoted strings like "hello" as lists of characters (char lists). Atoms like hello are symbolic constants, not strings. This distinction is critical for the := operator's dispatch mechanism. Note that Scryer Prolog does not support \n escapes in double-quoted strings, which is why py_exec_lines/1 is provided for multi-line Python code.
String-returning FFI functions, such as py_to_str or py_to_json, write their results into a thread-local storage (TLS) buffer on the Rust side. The Prolog layer immediately copies the contents of this buffer into a Prolog char list. This management is transparent to the user.
These predicates manage the state of the embedded Python interpreter.
Initialize the Python interpreter with the default library path ./libscryneuro.so. This call is idempotent, meaning it does nothing if the interpreter is already initialized. On Linux, it handles RTLD_GLOBAL for C extensions and adds the current directory . to sys.path. If the SCRYNEURO_HOME environment variable is set, the library is loaded from that directory and $SCRYNEURO_HOME/python is added to sys.path, enabling cross-project usage.
Initialize the interpreter with a custom path to the shared library. This is also idempotent.
| Parameter | Type | Description |
|---|---|---|
| Path | String | Path to the shared library file |
Initialize the interpreter using an explicit ScryNeuro root directory. The shared library is loaded from Home/libscryneuro.so (or .dylib on macOS), and Home/python is added to sys.path. Use this as an alternative to the SCRYNEURO_HOME environment variable.
| Parameter | Type | Description |
|---|---|---|
| Home | String | Absolute path to the ScryNeuro root directory |
Shuts down the Python interpreter, clears the handle registry, and retracts the initialization flag. It is safe to call even if the interpreter wasn't initialized.
Example:
:- use_module('prolog/scryer_py').
main :-
py_init, % Initialize with default path
% ... your code ...
py_finalize. % Clean shutdown
main_custom :-
py_init("/opt/lib/libscryneuro.so"), % Custom path
% ... your code ...
py_finalize.
main_cross_project :-
py_init_home("/path/to/ScryNeuro"), % Explicit home directory
% ... your code ...
py_finalize.

Execute Python code directly from Prolog.
Evaluates a Python expression and returns a handle to the result. An expression must produce a value (e.g., 1 + 1, len([1,2,3])).
| Parameter | Type | Description |
|---|---|---|
| Code | String | Python expression to evaluate |
| Handle | Integer | Handle to the resulting object |
Executes Python statements. Use this for code that does not return a value, such as imports, variable assignments, or class definitions.
| Parameter | Type | Description |
|---|---|---|
| Code | String | Python statements to execute |
Takes a list of strings and joins them with newlines before passing the result to py_exec. This is the preferred way to execute multi-line Python code because Scryer Prolog doesn't support \n in strings.
| Parameter | Type | Description |
|---|---|---|
| Lines | List of Strings | Lines of Python code |
Pitfall: py_eval is for expressions that return a value. py_exec is for statements. Using py_eval on a statement like import math will result in an error.
Example:
:- use_module('prolog/scryer_py').
eval_exec_demo :-
py_init,
%% py_eval: evaluate an expression
py_eval("2 ** 10", H),
py_to_int(H, Val),
format("2^10 = ~d~n", [Val]),
py_free(H),
%% py_exec: execute a statement
py_exec("import math"),
%% py_eval using an imported module
py_eval("math.pi", PiH),
py_to_float(PiH, Pi),
format("Pi = ~f~n", [Pi]),
py_free(PiH),
%% py_exec_lines: multi-line Python code
py_exec_lines([
"class Greeter:",
" def __init__(self, name):",
" self.name = name",
" def greet(self):",
" return f'Hello, {self.name}!'"
]),
%% Use the class we just defined
py_eval("Greeter('World')", G),
py_call(G, "greet", Greeting),
py_to_str(Greeting, Str),
format("~s~n", [Str]),
py_free(Greeting),
py_free(G).

Import Python modules to access their functionality.
Imports a Python module by name and returns a handle to the module object.
| Parameter | Type | Description |
|---|---|---|
| ModuleName | String | Module name (e.g., "math", "numpy") |
| Handle | Integer | Handle to the module object |
Example:
:- op(700, xfx, :=).
:- use_module('prolog/scryer_py').
module_demo :-
py_init,
Math := py_import("math"),
py_getattr(Math, "pi", PiH),
py_to_float(PiH, Pi),
format("math.pi = ~f~n", [Pi]),
py_free(PiH),
py_free(Math).

Read and write attributes of Python objects.
Gets the value of an attribute from a Python object.
| Parameter | Type | Description |
|---|---|---|
| Obj | Handle | The Python object |
| AttrName | String | Attribute name |
| Value | Handle | Handle to the attribute value |
Sets the value of an attribute on a Python object.
| Parameter | Type | Description |
|---|---|---|
| Obj | Handle | The Python object |
| AttrName | String | Attribute name |
| Value | Handle | Handle to the new value |
Example:
:- use_module('prolog/scryer_py').
attr_demo :-
py_init,
py_exec_lines([
"class Point:",
" def __init__(self, x, y):",
" self.x = x",
" self.y = y"
]),
py_eval("Point(3, 4)", P),
%% Get attributes
py_getattr(P, "x", XH),
py_to_int(XH, X),
format("x = ~d~n", [X]),
%% Set an attribute
py_from_int(10, NewX),
py_setattr(P, "x", NewX),
%% Verify the change
py_getattr(P, "x", XH2),
py_to_int(XH2, X2),
format("x after set = ~d~n", [X2]),
maplist(py_free, [P, XH, NewX, XH2]).

Invoke methods on Python objects.
Calls a method on an object with 0 to 3 arguments. The last argument is always the output handle.
| Parameter | Type | Description |
|---|---|---|
| Obj | Handle | The Python object |
| Method | String | Method name (must be a string/char list, NOT an atom) |
| ArgX | Handle | Argument handles |
| Result | Handle | Handle to the return value |
Calls a method with N arguments. Args can be either a Prolog list of handles [H1, H2, ...] or a handle to an existing Python list or sequence.
Pitfall: The method name must be a string. py_call(Obj, "upper", R) works, but py_call(Obj, upper, R) will fail because upper is an atom.
Example:
:- use_module('prolog/scryer_py').
method_demo :-
py_init,
%% 0-arg method call
py_from_str("hello world", S),
py_call(S, "upper", Upper),
py_to_str(Upper, UpperStr),
format("upper: ~s~n", [UpperStr]),
%% 2-arg method call: "hello world".replace("world", "prolog")
py_from_str("world", Old),
py_from_str("prolog", New),
py_call(S, "replace", Old, New, Replaced),
py_to_str(Replaced, ReplacedStr),
format("replaced: ~s~n", [ReplacedStr]),
%% N-arg method call with py_calln
%% Equivalent to "hello world".split("o", 1)
py_from_str("o", Sep),
py_from_int(1, MaxSplit),
py_calln(S, "split", [Sep, MaxSplit], SplitResult),
py_to_str(SplitResult, SplitStr),
format("split result: ~s~n", [SplitStr]),
maplist(py_free, [S, Upper, Old, New, Replaced, Sep, MaxSplit, SplitResult]).

Call functions, lambdas, or class constructors directly.
Invokes a callable object with 0 to N arguments. Same rules apply for Args in py_invoken as in py_calln.
| Parameter | Type | Description |
|---|---|---|
| Callable | Handle | Any Python callable (function, class, lambda) |
| ArgX | Handle | Argument handles |
| Result | Handle | Handle to the return value |
Key Difference: py_call calls a method on an object (obj.method(args)), whereas py_invoke calls the object itself (callable(args)). Use py_invoke for functions, lambdas, and constructors.
Example:
:- use_module('prolog/scryer_py').
invoke_demo :-
py_init,
py_import("math", Math),
%% Get a reference to the sqrt function
py_getattr(Math, "sqrt", SqrtFn),
%% Call it directly with py_invoke (not py_call!)
py_from_float(144.0, Arg),
py_invoke(SqrtFn, Arg, Result),
py_to_float(Result, Val),
format("sqrt(144) = ~f~n", [Val]),
%% Call a lambda
py_eval("lambda x, y: x * y + 1", Fn),
py_from_int(3, A),
py_from_int(4, B),
py_invoke(Fn, A, B, R2),
py_to_int(R2, V2),
format("lambda(3,4) = ~d~n", [V2]),
maplist(py_free, [Math, SqrtFn, Arg, Result, Fn, A, B, R2]).

Convert data between Prolog and Python types.
| Predicate | Direction | Prolog Type | Python Type |
|---|---|---|---|
| `py_to_str/2` | Py → Pl | String (char list) | `str(obj)` |
| `py_to_repr/2` | Py → Pl | String (char list) | `repr(obj)` |
| `py_to_int/2` | Py → Pl | Integer | `int` |
| `py_to_float/2` | Py → Pl | Float | `float` |
| `py_to_bool/2` | Py → Pl | Atom (`true`/`false`) | `bool` |
| `py_from_int/2` | Pl → Py | Integer | `int` |
| `py_from_float/2` | Pl → Py | Float | `float` |
| `py_from_bool/2` | Pl → Py | Atom (`true`/`false`) | `bool` |
| `py_from_str/2` | Pl → Py | String (char list) | `str` |
Converts a Python object to its string representation using Python's str() function.
| Parameter | Type | Description |
|---|---|---|
| Handle | Handle | Python object to convert |
| String | String (char list) | The string representation |
Converts a Python object to its repr string using Python's repr() function. Useful for debugging, as it shows the object's type and value in an unambiguous format (e.g., strings are shown with quotes: 'hello').
| Parameter | Type | Description |
|---|---|---|
| Handle | Handle | Python object to convert |
| String | String (char list) | The repr representation |
Extracts the integer value from a Python int object.
| Parameter | Type | Description |
|---|---|---|
| Handle | Handle | Python int object |
| Value | Integer | The Prolog integer value |
Error behavior: Throws error(python_error(Msg), py_to_int/2) on conversion failure.
Extracts the float value from a Python float object.
| Parameter | Type | Description |
|---|---|---|
| Handle | Handle | Python float object |
| Value | Float | The Prolog float value |
Error behavior: Throws error(python_error(Msg), py_to_float/2) on conversion failure.
Extracts a boolean value from a Python bool object.
| Parameter | Type | Description |
|---|---|---|
| Handle | Handle | Python bool object |
| Value | Atom | true or false |
Note: Returns Prolog atoms true/false, NOT integers 1/0. Internally, the FFI returns 1 (true), 0 (false), or -1 (error), and the Prolog layer converts these.
Creates a Python int object from a Prolog integer.
| Parameter | Type | Description |
|---|---|---|
| Value | Integer | Prolog integer to convert |
| Handle | Handle | Handle to the new Python int |
Creates a Python float object from a Prolog float.
| Parameter | Type | Description |
|---|---|---|
| Value | Float | Prolog float to convert |
| Handle | Handle | Handle to the new Python float |
Creates a Python bool object from a Prolog atom.
| Parameter | Type | Description |
|---|---|---|
| Value | Atom | true or false |
| Handle | Handle | Handle to the new Python bool |
Note: Only accepts true or false atoms. Any other atom or non-atom will cause an error.
Creates a Python str object from a Prolog string (char list).
| Parameter | Type | Description |
|---|---|---|
| Value | String (char list) | The Prolog string to convert |
| Handle | Handle | Handle to the new Python str |
Pitfalls:
- `py_to_int` and `py_to_float` now throw on conversion errors; catch exceptions with `catch/3`.
- `py_to_bool` returns Prolog atoms `true` and `false`, not integers.
- `py_from_bool` expects atoms `true` or `false`.
- `py_to_repr` is useful for debugging as it returns the output of Python's `repr()` function.
Example:
:- use_module('prolog/scryer_py').
conversion_demo :-
py_init,
%% Prolog -> Python -> Prolog round-trip
py_from_int(42, H1),
py_to_int(H1, V1),
format("int round-trip: ~d~n", [V1]),
py_from_float(3.14, H2),
py_to_float(H2, V2),
format("float round-trip: ~f~n", [V2]),
py_from_bool(true, H3),
py_to_bool(H3, V3),
format("bool round-trip: ~w~n", [V3]),
py_from_str("hello", H4),
py_to_str(H4, V4),
format("str round-trip: ~s~n", [V4]),
%% repr for debugging
py_to_repr(H4, Repr),
format("repr: ~s~n", [Repr]),
maplist(py_free, [H1, H2, H3, H4]).

Handle Python's None singleton.
Returns a handle to the Python None object.
Succeeds if the handle points to None and fails otherwise. Invalid handles now raise an exception.
Example:
:- use_module('prolog/scryer_py').
none_demo :-
py_init,
py_none(N),
( py_is_none(N) ->
format("It is None~n", [])
; format("It is not None~n", [])
),
py_free(N),
%% A non-None value
py_from_int(42, H),
( py_is_none(H) ->
format("42 is None~n", [])
; format("42 is not None~n", [])
),
py_free(H).

A robust way to exchange structured data between Prolog and Python.
Serializes a Python object to a JSON string using json.dumps.
Deserializes a JSON string into a Python object using json.loads.
Pitfall: This bridge only works for JSON-serializable objects (dicts, lists, strings, numbers, booleans, and None). Custom classes or tensors will fail.
Tip: The JSON bridge is often the simplest method for transferring complex data structures.
Example:
:- use_module('prolog/scryer_py').
json_demo :-
py_init,
%% Python -> JSON -> Prolog string
py_eval("{'name': 'Alice', 'age': 30}", DictH),
py_to_json(DictH, Json),
format("JSON: ~s~n", [Json]),
py_free(DictH),
%% Prolog string -> JSON -> Python
py_from_json("[1, 2, 3]", ListH),
py_to_str(ListH, Str),
format("Parsed: ~s~n", [Str]),
py_free(ListH).

Manipulate Python's native list and dictionary types.
Creates a new empty Python list [].
Appends an item handle to a Python list. This operation mutates the list in-place.
Retrieves the item at the specified 0-based index.
Returns the length of the list. Throws an exception on error.
Converts a Prolog list of handles into a Python list object.
Creates a new empty Python dictionary {}.
Sets a key-value pair in a dictionary. Key must be a string, and Value must be a handle.
Retrieves a value from a dictionary using a string key. This predicate throws an exception if the key is not found.
Example:
:- use_module('prolog/scryer_py').
collection_demo :-
py_init,
%% Build a Python list
py_list_new(List),
py_from_int(10, A),
py_from_int(20, B),
py_from_int(30, C),
py_list_append(List, A),
py_list_append(List, B),
py_list_append(List, C),
py_list_len(List, Len),
format("List length: ~d~n", [Len]),
%% Access by index
py_list_get(List, 1, Item),
py_to_int(Item, ItemVal),
format("List[1] = ~d~n", [ItemVal]),
%% Build a Python dict
py_dict_new(Dict),
py_from_str("Alice", Name),
py_from_int(30, Age),
py_dict_set(Dict, "name", Name),
py_dict_set(Dict, "age", Age),
%% Read back
py_dict_get(Dict, "name", NameBack),
py_to_str(NameBack, NameStr),
format("Dict['name'] = ~s~n", [NameStr]),
%% Serialize entire dict to JSON
py_to_json(Dict, Json),
format("JSON: ~s~n", [Json]),
%% py_list_from_handles: batch convert
py_from_int(1, H1),
py_from_int(2, H2),
py_from_int(3, H3),
py_list_from_handles([H1, H2, H3], PyList),
py_to_str(PyList, ListStr),
format("From handles: ~s~n", [ListStr]),
%% Clean up
maplist(py_free, [List, A, B, C, Item, Dict, Name, Age, NameBack,
H1, H2, H3, PyList]).

Tools for managing handle lifecycles and diagnosing leaks.
Releases a handle, removing it from the registry and decrementing the Python reference count. Throws on invalid handle.
RAII-style wrapper. Executes Goal and ensures Handle is freed regardless of the outcome.
Returns the number of active handles in the registry. Useful for leak detection.
Returns the last Python error message as a string. Returns an empty list if no error occurred.
Example (with_py):
:- use_module('prolog/scryer_py').
raii_demo :-
py_init,
py_handle_count(Before),
format("Handles before: ~d~n", [Before]),
py_eval("[1, 2, 3, 4, 5]", ListH),
with_py(ListH, (
py_to_json(ListH, Json),
format("List as JSON: ~s~n", [Json])
)),
%% ListH is automatically freed here
py_handle_count(After),
format("Handles after: ~d~n", [After]).

Example (error checking):
check_error_demo :-
py_init,
catch(
(
py_eval("int('bad')", H),
py_to_int(H, _),
py_free(H)
),
error(python_error(Msg), _),
format("Conversion error: ~s~n", [Msg])
).

Requires plugin:
:- use_module('prolog/scryer_nn').
Predicates for managing and running deep learning models.
Loads a model from a file and registers it under a symbolic name.
| Parameter | Type | Description |
|---|---|---|
| Name | Atom | Symbolic identifier for the model |
| Path | String | Path to the model file |
| Options | List | Key=Value pairs (e.g., model_type=pytorch) |
Common Options for nn_load:
| Option | Example | Description |
|---|---|---|
| `model_type` | `model_type=pytorch` | Framework: `pytorch`, `tensorflow`, `onnx` |
| `device` | `device=cuda` | Compute device: `cpu`, `cuda`, `cuda:0` |
| `weights_only` | `weights_only=true` | PyTorch: load weights only (safer) |
Runs inference using a registered model.
| Parameter | Type | Description |
|---|---|---|
| Name | Atom | Identifier matching a loaded model |
| Input | Handle | Input data handle (tensor or array) |
| Output | Handle | Handle to the inference result |
| Options | List | Key=Value pairs for inference |
Common Options for nn_predict:
| Option | Example | Description |
|---|---|---|
| `batch_size` | `batch_size=32` | Batch size for inference |
| `no_grad` | `no_grad=true` | Disable gradient computation |
Options are formatted as [key1=value1, key2=value2, ...] where keys are atoms. Values can be numbers or atoms (atoms are converted to strings).
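The conversion rule ("numbers stay numbers, atoms become strings") can be sketched in Python. The helper name `options_to_kwargs` below is purely illustrative, not the runtime's actual API:

```python
def options_to_kwargs(pairs):
    """Illustrative only: mimic the documented option rule.

    Numeric values pass through unchanged; anything else
    (e.g. an atom such as pytorch) is converted to a string.
    """
    kwargs = {}
    for key, value in pairs:
        if isinstance(value, (int, float)) and not isinstance(value, bool):
            kwargs[key] = value
        else:
            kwargs[key] = str(value)
    return kwargs

# Corresponds to [model_type=pytorch, batch_size=32] on the Prolog side.
kwargs = options_to_kwargs([("model_type", "pytorch"), ("batch_size", 32)])
```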
Example:
:- op(700, xfx, :=).
:- use_module('prolog/scryer_py').
:- use_module('prolog/scryer_nn').
neural_demo :-
py_init,
%% Load a PyTorch model
nn_load(my_model, "models/classifier.pt",
[model_type=pytorch, device=cpu, weights_only=true]),
%% Create input tensor (via Python)
Input := py_eval("__import__('torch').randn(1, 784)"),
%% Run inference
nn_predict(my_model, Input, Output),
py_to_str(Output, OutputStr),
format("Prediction: ~s~n", [OutputStr]),
py_free(Input),
py_free(Output),
py_finalize.

Requires plugin:
:- use_module('prolog/scryer_llm').
Predicates for interacting with Large Language Model providers.
Configures an LLM provider and model.
| Parameter | Type | Description |
|---|---|---|
| Name | Atom | Symbolic identifier |
| ModelId | String | Model ID (e.g., "gpt-4") |
| Options | List | Configuration (e.g., provider=openai) |
Common Options for llm_load:
| Option | Example | Description |
|---|---|---|
| `provider` | `provider=openai` | LLM provider |
| `api_key` | `api_key="sk-..."` | API key (string) |
| `temperature` | `temperature=0.7` | Sampling temperature |
| `max_tokens` | `max_tokens=1024` | Maximum response tokens |
| `base_url` | `base_url="http://..."` | Custom endpoint URL |
Supported providers include openai, anthropic, huggingface, ollama, and custom.
Generates text based on a prompt.
| Parameter | Type | Description |
|---|---|---|
| Name | Atom | Identifier matching a loaded LLM |
| Prompt | String | Input text prompt |
| Response | String | Generated text response |
| Options | List | Parameters for generation |
Common Options for llm_generate:
| Option | Example | Description |
|---|---|---|
| `temperature` | `temperature=0.5` | Override temperature |
| `max_tokens` | `max_tokens=256` | Override max tokens |
| `stop` | `stop="\n"` | Stop sequence |
Example:
:- use_module('prolog/scryer_py').
:- use_module('prolog/scryer_llm').
llm_demo :-
py_init,
catch(
(
llm_load(gpt, "gpt-4", [provider=openai]),
llm_generate(gpt, "What is 2+2? Reply with just the number.", Response),
format("LLM says: ~s~n", [Response])
),
_Error,
format("LLM not available (no API key or network)~n", [])
).

Requires plugin:
:- use_module('prolog/scryer_rl').
Predicates for training, evaluating, and using reinforcement learning agents via Tianshou v2.0.
Creates and registers a new RL agent.
| Parameter | Type | Description |
|---|---|---|
| Name | Atom | Symbolic identifier for the agent |
| EnvId | String | Gymnasium environment ID (e.g., "CartPole-v1") |
| Algorithm | Atom | RL algorithm: dqn, ppo, a2c, sac, td3, ddpg, pg, discrete_sac |
| Options | List | Key=Value pairs |
Common Options for rl_create:
| Option | Example | Description |
|---|---|---|
| `lr` | `lr=0.001` | Learning rate |
| `gamma` | `gamma=0.99` | Discount factor |
| `hidden_sizes` | `hidden_sizes=[64,64]` | MLP hidden layer sizes |
| `n_train_envs` | `n_train_envs=4` | Number of parallel training environments |
| `buffer_size` | `buffer_size=20000` | Replay buffer capacity |
| `eps_training` | `eps_training=0.1` | Epsilon for training (DQN) |
Loads a saved RL agent checkpoint.
| Parameter | Type | Description |
|---|---|---|
| Name | Atom | Symbolic identifier |
| Path | String | Path to the checkpoint file |
| Options | List | Required: env_id (string) and algorithm (atom) |
Saves the current agent policy to a checkpoint file.
| Parameter | Type | Description |
|---|---|---|
| Name | Atom | Identifier of a registered agent |
| Path | String | Output path for the checkpoint |
Queries the agent policy for an action given an observation.
| Parameter | Type | Description |
|---|---|---|
| Name | Atom | Identifier of a registered agent |
| State | Handle | Handle to the observation tensor |
| Action | Handle | Handle to the selected action |
| Options | List | e.g., [deterministic=true] |
Runs the training loop for the specified agent.
| Parameter | Type | Description |
|---|---|---|
| Name | Atom | Identifier of a registered agent |
| Options | List | Training configuration |
| Metrics | Handle | Handle to a dict of training metrics |
Common Options for rl_train:
| Option | Example | Description |
|---|---|---|
| `max_epochs` | `max_epochs=10` | Number of training epochs |
| `epoch_num_steps` | `epoch_num_steps=5000` | Steps per epoch |
| `batch_size` | `batch_size=64` | Mini-batch size for updates |
| `test_step_num_episodes` | `test_step_num_episodes=5` | Episodes per test phase |
Evaluates the agent over a fixed number of episodes.
| Parameter | Type | Description |
|---|---|---|
| Name | Atom | Identifier of a registered agent |
| NumEpisodes | Integer | Number of evaluation episodes |
| Metrics | Handle | Handle to evaluation metrics dict |
Returns metadata about a registered agent.
| Parameter | Type | Description |
|---|---|---|
| Name | Atom | Identifier of a registered agent |
| Info | Handle | Handle to an info dict |
Example:
:- use_module('prolog/scryer_py').
:- use_module('prolog/scryer_rl').
rl_demo :-
py_init,
%% Create a DQN agent for CartPole
rl_create(agent, "CartPole-v1", dqn,
[lr=0.001, hidden_sizes=[64,64]]),
%% Train for 5 epochs
rl_train(agent, [max_epochs=5, epoch_num_steps=2000], Metrics),
py_to_str(Metrics, MetricsStr),
format("Training metrics: ~s~n", [MetricsStr]),
%% Evaluate
rl_evaluate(agent, 10, EvalMetrics),
py_to_str(EvalMetrics, EvalStr),
format("Eval metrics: ~s~n", [EvalStr]),
%% Save checkpoint
rl_save(agent, "checkpoints/cartpole_dqn.pt"),
py_free(Metrics),
py_free(EvalMetrics),
py_finalize.

The `:=` operator enables a more concise syntax for common operations. It uses a 3-way dispatch mechanism to distinguish between term types.
- `"hello"` is a string (a list of characters, also known as a char list).
- `hello` is an atom (a symbolic constant).
- `hello(X)` is a compound term (an atom followed by arguments).
- `Var := Obj:"attrname"`: when the right side of the colon is a string, it performs attribute access.
  - Translates to: `py_getattr(Obj, "attrname", Var)`
  - Example: `Pi := Math:"pi"` retrieves `math.pi`.
- `Var := Obj:methodname`: when the right side of the colon is an atom, it performs a no-argument method call.
  - Translates to: `py_call(Obj, "methodname", Var)`
  - Example: `U := S:upper` calls `s.upper()`.
- `Var := Obj:method(Arg1, Arg2, ...)`: when the right side is a compound term, it performs a method call with arguments.
  - 1-3 arguments: dispatches to `py_call/4..6`.
  - 4 or more arguments: dispatches to `py_calln/4`.
  - Example: `R := S:replace(Old, New)` calls `s.replace(old, new)`.
Shortcut forms:

- `X := py_eval("expr")` → `py_eval("expr", X)`
- `M := py_import("mod")` → `py_import("mod", M)`
- `H := py_from_int(42)` → `py_from_int(42, H)`
- `H := py_from_float(3.14)` → `py_from_float(3.14, H)`
- `H := py_from_str("text")` → `py_from_str("text", H)`
- `H := py_from_json("[1,2]")` → `py_from_json("[1,2]", H)`
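The three dispatch cases above can be mimicked in plain Python to make them concrete. This is an illustration of the dispatch rule only, not how the bridge is implemented:

```python
import math

def dispatch(obj, spec):
    """Illustrative 3-way dispatch mirroring the := rules."""
    if isinstance(spec, str):
        # String on the right: attribute access, like Obj:"attr".
        return getattr(obj, spec)
    # (name, args) tuple: method call, like Obj:method or Obj:method(Args...).
    name, args = spec
    return getattr(obj, name)(*args)

pi = dispatch(math, "pi")                          # Pi := Math:"pi"
up = dispatch("hello", ("upper", ()))              # U := S:upper
rep = dispatch("hello", ("replace", ("l", "L")))   # R := S:replace(Old, New)
```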
Complete Example:
:- op(700, xfx, :=).
:- use_module('prolog/scryer_py').
sugar_demo :-
py_init,
%% Built-in shortcut: py_import
Math := py_import("math"),
%% String dispatch: attribute access
Pi := Math:"pi",
py_to_float(Pi, PiVal),
format("math.pi = ~f~n", [PiVal]),
%% Compound dispatch: method call with args
py_from_float(2.0, Two),
Sqrt := Math:sqrt(Two),
py_to_float(Sqrt, SqrtVal),
format("sqrt(2) = ~f~n", [SqrtVal]),
%% Atom dispatch: no-arg method call
S := py_from_str("hello"),
Upper := S:upper,
py_to_str(Upper, UpperStr),
format("upper = ~s~n", [UpperStr]),
%% Compound with 2 args
Old := py_from_str("hello"),
New := py_from_str("HI"),
Replaced := S:replace(Old, New),
py_to_str(Replaced, RStr),
format("replaced = ~s~n", [RStr]),
maplist(py_free, [Math, Pi, Two, Sqrt, S, Upper, Old, New, Replaced]).

The `:=` operator does NOT support every operation. These patterns require explicit predicate calls:
%% WRONG — := cannot wrap py_call directly
X := py_call(Obj, "method", Result). % SYNTAX ERROR
%% CORRECT — use py_call directly
py_call(Obj, "method", Result).
%% WRONG — := cannot wrap py_invoke
R := py_invoke(Fn, Arg). % SYNTAX ERROR
%% CORRECT — use py_invoke directly
py_invoke(Fn, Arg, R).
%% WRONG — := cannot set attributes
:= py_setattr(Obj, "name", Val). % SYNTAX ERROR
%% CORRECT — use py_setattr directly
py_setattr(Obj, "name", Val).

The `:=` operator only supports:
- `py_eval`, `py_import`, `py_from_*`, `py_from_json` (shortcut pattern)
- `Obj:"attr"` (attribute access)
- `Obj:method` or `Obj:method(Args...)` (method calls)
For all other operations, use the explicit predicate form.
Pattern 1: Batch Data Processing
:- op(700, xfx, :=).
:- use_module('prolog/scryer_py').
%% Process a list of Prolog values through a Python function
batch_process(PrologList, Results) :-
py_init,
%% Define a Python function
py_exec_lines([
"def process_batch(items):",
" return [x ** 2 + 1 for x in items]"
]),
%% Build a Python list from Prolog values
py_list_new(PyList),
maplist(add_to_list(PyList), PrologList),
%% Call the function
py_eval("process_batch", Fn),
py_invoke(Fn, PyList, ResultH),
%% Convert back
py_to_json(ResultH, Json),
format("Results: ~s~n", [Json]),
maplist(py_free, [PyList, Fn, ResultH]).
add_to_list(PyList, Val) :-
py_from_int(Val, H),
py_list_append(PyList, H),
py_free(H).

Pattern 2: Error-Resilient Pipeline
:- use_module('prolog/scryer_py').
%% A pipeline that handles errors gracefully at each stage
safe_pipeline :-
py_init,
catch(
pipeline_body,
error(python_error(Msg), _),
format("Pipeline failed: ~s~n", [Msg])
),
py_finalize.
pipeline_body :-
py_exec("import json"),
%% Stage 1: Load data
py_eval("json.loads('{\"values\": [1, 2, 3]}')", Data),
with_py(Data, (
%% Stage 2: Extract values
py_getattr(Data, "__class__", _), % verify it's valid
py_to_json(Data, Json),
format("Data: ~s~n", [Json])
)).

Pattern 3: Working with NumPy (if installed)
:- op(700, xfx, :=).
:- use_module('prolog/scryer_py').
numpy_demo :-
py_init,
NP := py_import("numpy"),
%% Create a numpy array via py_eval
Arr := py_eval("__import__('numpy').array([1.0, 2.0, 3.0, 4.0, 5.0])"),
%% Call numpy functions on it
Mean := NP:mean(Arr),
py_to_float(Mean, MeanVal),
format("Mean: ~f~n", [MeanVal]),
Std := NP:std(Arr),
py_to_float(Std, StdVal),
format("Std: ~f~n", [StdVal]),
%% Dot product
Arr2 := py_eval("__import__('numpy').array([2.0, 0.0, 1.0, 0.0, 3.0])"),
Dot := NP:dot(Arr, Arr2),
py_to_float(Dot, DotVal),
format("Dot product: ~f~n", [DotVal]),
maplist(py_free, [NP, Arr, Mean, Std, Arr2, Dot]).

| File | Description |
|---|---|
| `examples/basic.pl` | Arithmetic, modules, collections, error handling, RAII cleanup |
| `examples/neural.pl` | MNIST classification, neuro-symbolic addition, LLM, RL agents |
| `examples/numpy_torch.pl` | NumPy vectors/matrices, PyTorch tensors, linear regression, CUDA GPU matmul |
| `examples/mnist_cnn.pl` | CNN training on MNIST from scratch: model definition, training loop, evaluation, neuro-symbolic inference |
| `examples/mnist_cnn_v2.pl` | Module pattern (recommended): same CNN training, but Python code in a separate `.py` file |
| `examples/rl_demo.pl` | DQN agent on CartPole-v1: create, train, evaluate, save, load |
# Run all examples (robust — works on all systems)
PYLIB=$(python3 -c "import sysconfig; print(sysconfig.get_config_var('LIBDIR'))")
LD_LIBRARY_PATH=".:$PYLIB:$LD_LIBRARY_PATH" scryer-prolog examples/basic.pl
LD_LIBRARY_PATH=".:$PYLIB:$LD_LIBRARY_PATH" scryer-prolog examples/neural.pl
LD_LIBRARY_PATH=".:$PYLIB:$LD_LIBRARY_PATH" scryer-prolog examples/numpy_torch.pl
LD_LIBRARY_PATH=".:$PYLIB:$LD_LIBRARY_PATH" scryer-prolog examples/mnist_cnn.pl
LD_LIBRARY_PATH=".:$PYLIB:$LD_LIBRARY_PATH" scryer-prolog examples/mnist_cnn_v2.pl
LD_LIBRARY_PATH=".:$PYLIB:$LD_LIBRARY_PATH" scryer-prolog examples/rl_demo.pl
# Or use run.sh (see Platform-Specific Notes > Linux > Convenience Wrapper)
# ./run.sh examples/basic.pl

Your Scryer Prolog build doesn't include FFI support. Rebuild from the latest `main` branch.
libpython not loaded with RTLD_GLOBAL. Ensure LD_LIBRARY_PATH includes the Python lib/ directory, or use LD_PRELOAD.
This is a Scryer Prolog quirk — use_foreign_module/2 is a runtime goal, not a directive. The scryer_py.pl module handles this correctly via :- initialization(...).
Python shared library not found. Check:
python3 -c "import sysconfig; print(sysconfig.get_config_var('LIBDIR'))"
ls $(python3 -c "import sysconfig; print(sysconfig.get_config_var('LIBDIR'))")/libpython3*

If empty, your Python was built without --enable-shared. Use conda or rebuild Python.
Build-time and runtime Python versions must match. If you switch environments, cargo clean && cargo build --release.
ScryNeuro/
├── Cargo.toml # Rust config: pyo3 = "0.23", libc
├── build.rs # Python detection + linker config
├── src/
│ ├── lib.rs # Crate entry point
│ ├── ffi.rs # 40 exported extern "C" spy_* functions
│ ├── registry.rs # Thread-safe handle registry (Mutex<HashMap>)
│ ├── convert.rs # Type conversion + TLS string buffer
│ └── error.rs # TLS error storage (spy_last_error)
├── prolog/
│ ├── scryer_py.pl # Core: py_* predicates + := operator
│ ├── scryer_nn.pl # Plugin: nn_load, nn_predict
│ ├── scryer_llm.pl # Plugin: llm_load, llm_generate
│ └── scryer_rl.pl # Plugin: rl_create, rl_train, rl_action, ...
├── python/
│ ├── scryer_py_runtime.py # Core runtime: device management, TensorUtils
│ ├── scryer_nn_runtime.py # NN runtime: model loading + inference
│ ├── scryer_llm_runtime.py # LLM runtime: provider abstraction
│ └── scryer_rl_runtime.py # RL runtime: Tianshou v2.0 agent wrappers
├── examples/
│ ├── basic.pl # Basic interop demos
│ ├── neural.pl # Neuro-symbolic patterns (NN + LLM + RL)
│ ├── numpy_torch.pl # NumPy + PyTorch + CUDA demos
│ ├── mnist_cnn.pl # CNN MNIST training pipeline (inline Python)
│ ├── mnist_cnn_v2.pl # CNN MNIST training pipeline (module pattern)
│ ├── mnist_cnn_module.py # Python module for mnist_cnn_v2.pl
│ └── rl_demo.pl # RL demo: DQN on CartPole-v1
├── test/
│ ├── test_comprehensive.pl # 24 low-level FFI tests
│ ├── test_prolog_api.pl # 19 high-level API tests
│ ├── test_minimal_api.pl # 3 core smoke tests
│ ├── test_rl.pl # 17 RL plugin tests (scryer_rl.pl)
│ ├── test_rl.py # 15 Python RL runtime tests (scryer_rl_runtime.py)
│ ├── test_smoke.pl # low-level smoke test
│ └── test_pi.pl # quick pi/import sanity test
└── docs/
└── technical_report.md # Detailed Chinese technical report
MIT