Expose "Processor" on PS() Call #490

@mattyMattMatt

Description

I am trying to get information on where the model is running in the system. On the CLI this is retrieved with ollama ps, which displays:

PS C:\Users\m> ollama ps
NAME               ID              SIZE      PROCESSOR    UNTIL
llama3.2:latest    a80c4f17acd5    4.0 GB    100% GPU     2 minutes from now

It looks like "Processor" is not exposed in the API call, which is why running ollama.ps() from Python returns:
model='llama3.2:latest' name='llama3.2:latest' digest='a80c4f17acd55265feec403c7aef86be0c25983ab279d83f3bcd3abbcb5b8b72' expires_at=datetime.datetime(2025, 3, 31, 16, 13, 25, 974861, tzinfo=TzInfo(-04:00)) size=3972362240 size_vram=3972362240 details=ModelDetails(parent_model='', format='gguf', family='llama', families=['llama'], parameter_size='3.2B', quantization_level='Q4_K_M')

I have trawled through the documentation without finding where this is exposed, so I assume it is not? I don't really want to shell out to the ollama CLI in a subprocess, but this information is quite useful for testing aspects of integration performance between systems. Any help would be appreciated.
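As a possible workaround, the PROCESSOR column appears to be derived from fields the API already returns: size (total model size) and size_vram (how much of it sits in GPU memory). The helper below is a hypothetical sketch of that derivation, not part of the ollama library, and the exact output strings are an assumption based on what the CLI prints:

```python
def processor_from_ps(size: int, size_vram: int) -> str:
    """Approximate the `ollama ps` PROCESSOR column from the API's
    size/size_vram fields. Hypothetical helper; string formats are
    assumptions, not an official ollama API."""
    if size_vram == 0:
        # Nothing resident in VRAM: model runs entirely on CPU.
        return "100% CPU"
    if size_vram >= size:
        # Whole model fits in VRAM, as in the ps() output above.
        return "100% GPU"
    # Partial offload: report the CPU/GPU split as percentages.
    cpu_pct = round(100 * (size - size_vram) / size)
    return f"{cpu_pct}%/{100 - cpu_pct}% CPU/GPU"

# Using the size/size_vram values from the ps() output above:
print(processor_from_ps(3972362240, 3972362240))  # 100% GPU
```

Since size and size_vram are both equal here (3972362240), this reproduces the "100% GPU" shown by the CLI for llama3.2:latest.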
