This repository was archived by the owner on Jul 4, 2025. It is now read-only.

Commit e143638

committed
Update the readme and the server URL for the API documentation
1 parent d482c69 commit e143638


3 files changed: +32 −77 lines changed


README.md

Lines changed: 4 additions & 1 deletion
@@ -90,6 +90,9 @@ See [CLI Reference Docs](https://cortex.jan.ai/docs/cli) for more information.
   models start     Start a specified model.
   models stop      Stop a specified model.
   models update    Update the configuration of a specified model.
+  benchmark        Benchmark and analyze the performance of a specific AI model using your system.
+  presets          Show all the available model presets within Cortex.
+  telemetry        Retrieve telemetry logs for monitoring and analysis.
 ```
 
 
@@ -98,7 +101,7 @@ Run the following command to uninstall Cortex globally on your machine:
 
 ```
 # Uninstall globally using NPM
-npm uninstall -g @janhq/cortex
+npm uninstall -g @janhq/cortexso
 ```
 
 ## Contact Support

cortex-js/README.md

Lines changed: 27 additions & 75 deletions
@@ -1,4 +1,4 @@
-# Cortex - CLI
+# Cortex
 <p align="center">
   <img alt="cortex-cpplogo" src="https://raw.githubusercontent.com/janhq/cortex/dev/assets/cortex-banner.png">
 </p>
@@ -11,91 +11,39 @@
 > ⚠️ **Cortex is currently in Development**: Expect breaking changes and bugs!
 
 ## About
-Cortex is an openAI-compatible local AI server that developers can use to build LLM apps. It is packaged with a Docker-inspired command-line interface and a Typescript client library. It can be used as a standalone server, or imported as a library.
+Cortex is an OpenAI-compatible AI engine that developers can use to build LLM apps. It is packaged with a Docker-inspired command-line interface and client libraries. It can be used as a standalone server or imported as a library.
 
-Cortex currently supports two inference engines:
+Cortex currently supports 3 inference engines:
 
 - Llama.cpp
+- ONNX Runtime
 - TensorRT-LLM
 
-> Read more about Cortex at https://jan.ai/cortex
-
 ## Quicklinks
-Cortex
-- [Website](https://jan.ai/)
-- [GitHub](https://github.com/janhq/cortex)
-- [User Guides](https://jan.ai/cortex)
-- [API reference](https://jan.ai/api-reference)
-
-## Prerequisites
-
-### **Dependencies**
-
-Before installation, ensure that you have installed the following:
-- **Node.js**: version 18 and above is required to run the installation.
-- **NPM**: Needed to manage packages.
-- **CPU Instruction Sets**: Available for download from the [Cortex GitHub Releases](https://github.com/janhq/cortex/releases) page.
-
->💡 The **CPU instruction sets** are not required for the initial installation of Cortex. This dependency will be automatically installed during the Cortex initialization if they are not already on your system.
-
-### **Hardware**
-
-Ensure that your system meets the following requirements to run Cortex:
-
-- **OS**:
-  - MacOSX 13.6 or higher.
-  - Windows 10 or higher.
-  - Ubuntu 22.04 and later.
-- **RAM (CPU Mode):**
-  - 8GB for running up to 3B models.
-  - 16GB for running up to 7B models.
-  - 32GB for running up to 13B models.
-- **VRAM (GPU Mode):**
-  - 6GB can load the 3B model (int4) with `ngl` at 120 ~ full speed on CPU/ GPU.
-  - 8GB can load the 7B model (int4) with `ngl` at 120 ~ full speed on CPU/ GPU.
-  - 12GB can load the 13B model (int4) with `ngl` at 120 ~ full speed on CPU/ GPU.
-- **Disk**: At least 10GB for app and model download.
+- [Homepage](https://cortex.jan.ai/)
+- [Docs](https://cortex.jan.ai/docs/)
 
 ## Quickstart
-To install Cortex CLI, follow the steps below:
-1. Install the Cortex NPM package globally:
-``` bash
-npm i -g cortexso
-```
-> Cortex automatically detects your CPU and GPU, downloading the appropriate CPU instruction sets and required dependencies to optimize GPU performance.
 
-2. Download a GGUF model from Hugging Face:
-``` bash
-# Pull a model most compatible with your hardware
-cortex pull llama3
-
-# Pull a specific variant with `repo_name:branch`
-cortex pull llama3:7b
+Visit [Quickstart](https://cortex.jan.ai/docs/quickstart) to get started.
 
-# Pull a model with the HuggingFace `model_id`
-cortex pull microsoft/Phi-3-mini-4k-instruct-gguf
-```
-3. Load the model:
 ``` bash
-cortex models start llama3:7b
+npm i -g @janhq/cortex
+cortex run llama3
 ```
-
-4. Start chatting with the model:
-``` bash
-cortex chat tell me a joke
-```
-
-
-## Run as an API server
 To run Cortex as an API server:
 ```bash
 cortex serve
+
+# Output
+# Started server at http://localhost:1337
+# Swagger UI available at http://localhost:1337/api
 ```
 
+You can now access the Cortex API server at `http://localhost:1337`,
+and the Swagger UI at `http://localhost:1337/api`.
+
 ## Build from Source
 
 To install Cortex from the source, follow the steps below:
@@ -120,9 +68,10 @@ chmod +x '[path-to]/cortex/cortex-js/dist/src/command.js'
 npm link
 ```
 
-## Cortex CLI Command
-The following CLI commands are currently available:
-> ⚠️ **Cortex is currently in Development**: More commands will be added soon!
+## Cortex CLI Commands
+
+The following CLI commands are currently available.
+See [CLI Reference Docs](https://cortex.jan.ai/docs/cli) for more information.
 
 ```bash
 
@@ -141,18 +90,21 @@ The following CLI commands are currently available:
   models start     Start a specified model.
   models stop      Stop a specified model.
   models update    Update the configuration of a specified model.
-  engines          Execute a specified command related to engines.
-  engines list     List all available engines.
+  benchmark        Benchmark and analyze the performance of a specific AI model using your system.
+  presets          Show all the available model presets within Cortex.
+  telemetry        Retrieve telemetry logs for monitoring and analysis.
 ```
+
 ## Uninstall Cortex
 
 Run the following command to uninstall Cortex globally on your machine:
 
 ```
 # Uninstall globally using NPM
-npm uninstall -g cortexso
+npm uninstall -g @janhq/cortexso
 ```
+
 ## Contact Support
 - For support, please file a GitHub ticket.
 - For questions, join our Discord [here](https://discord.gg/FTk2MvZwJH).
-- For long-form inquiries, please email [hello@jan.ai](mailto:hello@jan.ai).
+- For long-form inquiries, please email [hello@jan.ai](mailto:hello@jan.ai).
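The README now tells users the server lives at `http://localhost:1337` with Swagger UI at `/api`. A minimal TypeScript sketch of checking whether a local Cortex server is up, assuming Node 18+ for the global `fetch` and an OpenAI-style `/v1/models` route (the route name is an assumption; only the base URL and `/api` come from the README):

```typescript
// Hypothetical helper (not part of Cortex): probe the two URLs the README
// mentions and report which ones respond with a 2xx status.
async function checkCortex(base: string = "http://localhost:1337"): Promise<string[]> {
  const reachable: string[] = [];
  for (const path of ["/api", "/v1/models"]) {
    try {
      const res = await fetch(base + path);
      if (res.ok) reachable.push(path);
    } catch {
      // Server not running or port closed — skip this path.
    }
  }
  return reachable;
}
```

With `cortex serve` running, `await checkCortex()` should list both paths; against a closed port it returns an empty array.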

cortex-js/src/app.ts

Lines changed: 1 addition & 1 deletion
@@ -62,7 +62,7 @@ export const getApp = async () => {
       'Events',
       'Endpoints for observing Cortex statuses through event notifications.',
     )
-    .addServer('http://localhost:1337')
+    .addServer('http://localhost:1337/api')
     .addServer('http://localhost:1337/v1')
     .build();
   const document = SwaggerModule.createDocument(app, config);
