
feat(api): add porcelain api with connect#1779

Merged
paul-nechifor merged 3 commits into dev from paul/feat/porcelain-api-with-connect
Apr 18, 2026
Conversation

@paul-nechifor
Contributor

@paul-nechifor paul-nechifor commented Apr 12, 2026

Problem

Closes DIM-730

Solution

See the docs for how it works.

Breaking Changes

How to Test

Included in the docs.

Contributor License Agreement

  • I have read and approved the CLA.

@greptile-apps
Contributor

greptile-apps Bot commented Apr 12, 2026

Greptile Summary

This PR introduces a Dimos porcelain API class and a Dimos.connect() classmethod that lets callers attach to an already-running DimOS instance via RPyC, plus a SkillsProxy for ergonomic skill discovery and dispatch. It also adds a coordinator-side RPyC discovery service (RpycServer/_CoordinatorService), wires rpyc_port into the run registry, and starts the service automatically in the CLI run command.
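The "ergonomic skill discovery and dispatch" that SkillsProxy provides can be sketched with attribute-based dispatch; all names and the internal structure here are hypothetical stand-ins, not the PR's actual implementation:

```python
class SkillsProxy:
    """Sketch of ergonomic skill dispatch: attribute access resolves a
    skill by name across the deployed modules and returns the bound method."""

    def __init__(self, modules):
        # modules: mapping of module name -> object exposing skill methods
        self._modules = modules

    def __getattr__(self, skill_name):
        # Only called for attributes not found normally, so plain
        # attribute access like proxy.ping triggers skill lookup.
        for module in self._modules.values():
            fn = getattr(module, skill_name, None)
            if callable(fn):
                return fn
        raise AttributeError(f"no module exposes skill {skill_name!r}")


class StressTestModule:
    def ping(self):
        return "pong"


skills = SkillsProxy({"StressTestModule": StressTestModule()})
print(skills.ping())  # dispatches to StressTestModule.ping -> "pong"
```

In the real API the proxy is rebuilt on each `app.skills` access (per the file table below, a deliberate tradeoff), so its cache never outlives a single lookup session.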

  • P1: Dimos.connect(host="robot.local") (with no port) silently connects to localhost via the registry instead of raising an error; the host argument is swallowed because the if host is not None and port is not None guard falls through to the else branch.
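The reported control flow can be reduced to a small sketch (function names, the registry fallback, and the suggested fix are illustrative, not the PR's actual code):

```python
def connect_endpoint(host=None, port=None, registry_port=12345):
    """Buggy shape: a host-only call fails the combined guard and
    falls through to the registry branch, silently ignoring host."""
    if host is not None and port is not None:
        return (host, port)
    else:
        # fallback: localhost + most recent port from the run registry
        return ("localhost", registry_port)


def connect_endpoint_fixed(host=None, port=None, registry_port=12345):
    """One possible fix: reject host-without-port instead of ignoring it."""
    if host is not None and port is None:
        raise ValueError("host given without port")
    if host is not None:
        return (host, port)
    return ("localhost", registry_port)


# The bug: an explicit remote host is silently redirected to localhost.
print(connect_endpoint(host="robot.local"))  # ('localhost', 12345)
```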

Confidence Score: 4/5

Safe to merge after fixing the silent-host-ignore bug in Dimos.connect(); remaining findings are P2 style.

One P1 defect: Dimos.connect(host="x") silently ignores the host and connects to localhost instead, which could cause confusing misdirected connections in production. The rest are P2 style/quality concerns (private API access in LocalModuleSource, lock held during network I/O).

dimos/porcelain/dimos.py (P1 logic bug in connect()), dimos/porcelain/local_module_source.py (P2 style)

Important Files Changed

Filename Overview
dimos/porcelain/dimos.py Main porcelain API class; contains a P1 logic bug where host is silently ignored when port is omitted in Dimos.connect().
dimos/porcelain/local_module_source.py Local module source wrapper; bypasses the newly-added public coordinator methods and holds a lock during network I/O (P2 style issues).
dimos/porcelain/remote_module_source.py Remote module source backed by coordinator RPyC endpoint; correctly uses public API and handles connection caching.
dimos/porcelain/skills_proxy.py Skills discovery and dispatch proxy; cache is per-instance (a new SkillsProxy is created on each app.skills access), which is a reasonable design tradeoff.
dimos/core/coordination/rpyc_server.py New coordinator-side RPyC discovery service; correctly binds a port-0 socket in ThreadedServer.__init__ before returning the port.
dimos/test_porcelain.py Comprehensive test suite covering local and remote Dimos scenarios; well structured with slow-marked integration tests.
dimos/porcelain/module_source.py Protocol definition for module sources; clean and minimal.
dimos/core/coordination/module_coordinator.py Added public list_module_names(), get_module_endpoint(), and start_rpyc_service() methods; correctly locks before accessing deployed modules.
dimos/core/run_registry.py Added rpyc_port field to RunEntry and get_most_recent_rpyc_port() helper; looks correct.
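The run-registry addition amounts to recording the discovery port per run and picking the newest one back out. A minimal sketch, assuming a dataclass-style entry with a start timestamp (field names beyond rpyc_port are guesses, not the PR's actual schema):

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class RunEntry:
    run_id: str
    started_at: float
    rpyc_port: Optional[int] = None  # new field: coordinator discovery port


def get_most_recent_rpyc_port(entries):
    """Return the rpyc_port of the newest run that recorded one, else None."""
    for entry in sorted(entries, key=lambda e: e.started_at, reverse=True):
        if entry.rpyc_port is not None:
            return entry.rpyc_port
    return None


entries = [
    RunEntry("run-a", started_at=1.0, rpyc_port=18861),
    RunEntry("run-b", started_at=2.0, rpyc_port=None),  # no service started
]
print(get_most_recent_rpyc_port(entries))  # 18861: skips the newer portless run
```

Skipping entries without a port (rather than returning None for the newest run) is one plausible reading of "most recent rpyc port"; the actual semantics live in run_registry.py.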

Sequence Diagram

```mermaid
sequenceDiagram
    participant User
    participant Dimos
    participant LocalModuleSource
    participant RemoteModuleSource
    participant ModuleCoordinator
    participant RpycServer
    participant Worker as PythonWorker (subprocess)

    Note over User,Worker: Local path (app.run)
    User->>Dimos: run(target)
    Dimos->>ModuleCoordinator: build(blueprint)
    ModuleCoordinator->>Worker: deploy / start modules
    ModuleCoordinator->>RpycServer: start() - coordinator port saved
    Dimos->>LocalModuleSource: created with coordinator ref

    User->>Dimos: app.skills.ping()
    Dimos->>LocalModuleSource: get_rpyc_module("StressTestModule")
    LocalModuleSource->>Worker: actor.start_rpyc() - worker port
    LocalModuleSource->>Worker: rpyc.connect(localhost, port)
    Worker-->>LocalModuleSource: module proxy
    LocalModuleSource-->>Dimos: module proxy
    Dimos-->>User: skill result

    Note over User,Worker: Remote path (Dimos.connect)
    User->>Dimos: connect(host, port)
    Dimos->>RemoteModuleSource: connect to coordinator RPyC port
    RemoteModuleSource->>RpycServer: list_modules() / get_module_endpoint(name)
    RpycServer->>ModuleCoordinator: delegate
    RemoteModuleSource->>Worker: rpyc.connect(host, worker_port)
    Worker-->>RemoteModuleSource: module proxy
    RemoteModuleSource-->>User: skill result
```
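The remote path is a two-hop lookup: one well-known coordinator endpoint answers "what modules exist and where", and the client then connects directly to the worker. That discovery half can be sketched with a stand-in class (names mirror the table above, but the structure here is illustrative, not the PR's code):

```python
class CoordinatorService:
    """Sketch of the coordinator-side discovery service: maps deployed
    module names to the (host, port) of the worker that serves them."""

    def __init__(self, endpoints):
        self._endpoints = endpoints  # module name -> (host, port)

    def list_module_names(self):
        return sorted(self._endpoints)

    def get_module_endpoint(self, name):
        return self._endpoints[name]


coord = CoordinatorService({"StressTestModule": ("robot.local", 50052)})

# First hop: discover which worker hosts the module...
host, port = coord.get_module_endpoint("StressTestModule")
# ...second hop (not runnable here): rpyc.connect(host, port) to reach
# the module proxy itself, as in the diagram's RemoteModuleSource step.
print(coord.list_module_names(), host, port)
```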

Reviews (1): Last reviewed commit: "feat(api): add porcelain api with connec..."

Comment thread dimos/porcelain/dimos.py
Comment thread dimos/porcelain/local_module_source.py Outdated
Comment thread dimos/porcelain/local_module_source.py
@paul-nechifor paul-nechifor mentioned this pull request Apr 12, 2026
1 task
@paul-nechifor paul-nechifor force-pushed the paul/feat/porcelain-api-with-connect branch 2 times, most recently from 889de47 to cb78739 on April 12, 2026 06:15
Comment thread dimos/core/coordination/python_worker.py
Comment thread dimos/core/coordination/module_coordinator.py
leshy previously approved these changes Apr 17, 2026
Contributor

@leshy leshy left a comment


big comment on the actual protocol to talk to workers, but I'm approving as we iterate. protocol replaceable later also.

Another

run() and restart() are only available in local mode. On a connected instance they raise NotImplementedError.

IMO we especially want those in remote mode:

  • you start dimos, then claude iterates on the code of an individual module and wants to restart that module
  • openclaw controls dimos, wants to deploy a new module

I'm not totally sure why those features are not "basically for free" and require extra work

@paul-nechifor paul-nechifor force-pushed the paul/feat/porcelain-api-with-connect branch from cb78739 to 77113a6 on April 18, 2026 00:14
@paul-nechifor
Contributor Author

> big comment on the actual protocol to talk to workers, but I'm approving as we iterate. protocol replaceable later also.
>
> Another
>
> run() and restart() are only available in local mode. On a connected instance they raise NotImplementedError.
>
> IMO we especially want those in remote mode:
>
> * you start dimos, then claude iterates on the code of an individual module and wants to restart that module
> * openclaw controls dimos, wants to deploy a new module
>
> I'm not totally sure why those features are not "basically for free" and require extra work

Implemented this. Required pickling classes.
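"Required pickling classes" points at the core difficulty of remote run()/restart(): stdlib pickle serializes classes by reference (module + qualified name), not by value, so a class body defined on the client cannot be shipped to the robot as-is (by-value serializers such as cloudpickle are one common workaround; whether this PR uses that is an assumption, not stated here). A minimal illustration:

```python
import pickle


def make_module_class():
    class DynamicModule:  # hypothetical user-defined module class
        def ping(self):
            return "pong"
    return DynamicModule


# pickle stores only a reference to the class, so a class whose definition
# is not importable on the other side (here: a function-local class)
# cannot be serialized at all.
try:
    pickle.dumps(make_module_class())
    shipped = True
except (pickle.PicklingError, AttributeError):
    shipped = False

print("plain pickle could ship the class:", shipped)
```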

@paul-nechifor paul-nechifor enabled auto-merge (squash) April 18, 2026 00:19
Contributor

@leshy leshy left a comment


So generally yes, but it feels like we might want a refactor:

  • deploying multiples of modules/blueprints,
  • potentially using RPC and not RPyC for everything.

so we can sketch all this out later, but let's kick this off :D

@paul-nechifor paul-nechifor merged commit c645d0d into dev Apr 18, 2026
4 of 5 checks passed
@paul-nechifor paul-nechifor deleted the paul/feat/porcelain-api-with-connect branch April 18, 2026 05:52