
OpenAI Codex CLI (Open Responses Fork)

A fork of openai/codex integrated with Julep's Open Responses API


Overview

This repository is a fork of the original openai/codex CLI tool. The primary change in this fork is replacing the direct OpenAI API dependency (which the upstream README references as deprecated around March 2025) with Julep's Open Responses API. This allows the CLI to interact with a wider range of language models from various providers.

For the original documentation, features, and goals of the Codex CLI project, please refer to the upstream repository. This README focuses specifically on the setup and usage changes related to the Open Responses integration.

Getting Started

Prerequisites

  • Node.js: Version 22 or newer (as required by the original codex-cli).
  • API Keys: You'll need API keys for the language model providers you intend to use (e.g., OpenAI, Anthropic, Groq).
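To confirm the Node.js requirement before building, a quick check with standard Node tooling (nothing specific to this fork):

    node --version   # should report v22.x or newer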

Setup

  1. Install Open Responses CLI: Follow the instructions at the Open Responses Quickstart to install the CLI tool. Typically:

    npx open-responses
  2. Configure API Keys: Set the necessary API keys as environment variables for the providers you want to use. The Open Responses server will pick these up. Refer to the Open Responses documentation for the specific environment variable names (e.g., OPENAI_API_KEY, ANTHROPIC_API_KEY); see the example after this list.

    npx open-responses setup
  3. Run the Open Responses Server: Start the local Open Responses server. It acts as a proxy to the different model providers.

    npx open-responses start

    Keep this server running in a separate terminal.

  4. Clone and Build this Fork: Clone this repository and build the codex-cli package:

    git clone https://github.com/lolrazh/codex.git
    cd codex/codex-cli
    npm install
    npm run build
  5. Run the CLI: Execute the CLI with node, specifying the model via the -m flag in the <provider>/<model> format recognized by your Open Responses server.

    # Example using a Groq model via Open Responses
    node dist/cli.js -m groq/gemma2-9b-it
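
For step 2, exporting the keys in the shell that runs the Open Responses server is usually enough. Only OPENAI_API_KEY and ANTHROPIC_API_KEY are named above; the GROQ_API_KEY name below is an assumption based on the usual convention, so verify the exact variable names in the Open Responses documentation:

    # Set only the keys for the providers you plan to route through Open Responses
    export OPENAI_API_KEY="<your OpenAI key>"
    export ANTHROPIC_API_KEY="<your Anthropic key>"
    export GROQ_API_KEY="<your Groq key>"   # assumed variable name; check the Open Responses docs

As in the upstream CLI, the command from step 5 can also take a prompt as a positional argument, for example:

    # One-shot prompt; the provider/model pair depends on what your Open Responses server exposes
    node dist/cli.js -m groq/gemma2-9b-it "explain this codebase to me"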

Contributing

Contributions to the core functionality should ideally be directed to the upstream openai/codex repository. Issues or pull requests specific to the Open Responses integration in this fork can be opened here.


License

This fork retains the original Apache-2.0 License from the upstream repository.
