

Overview

Codex CLI is OpenAI’s official command-line programming assistant, designed for terminal workflows. Integrating it with API易 takes exactly one change:
Replace OpenAI’s endpoint with API易
API易 is an OpenAI-compatible interface (a transparent proxy): at its core, you just swap the Base URL and the Key. No custom model_provider and no custom env_key are required. If it works with the OpenAI SDK, it works with Codex CLI.

🚀 One-Step Setup: two environment variables, no complex config files
⚡ Latest Models: supports gpt-5.5 / gpt-5.4 / gpt-5.3-codex
💰 Same Billing Model: aligned with OpenAI’s billing, at better prices
🔧 Cross-Platform: Mac / Linux / Windows (PowerShell) all supported

1. Prerequisites

Install Codex CLI

Globally install the official CLI (requires Node.js 18+):
npm install -g @openai/codex
Verify:
codex --version
Mac users hitting permission errors can prefix with sudo; using nvm / fnm to manage Node is recommended to avoid global-install permission issues.
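If you want to verify the Node.js 18+ requirement before installing, a small sketch (the node_ok helper is ours, not part of Codex CLI or npm):

```shell
# node_ok: check whether a Node version string (e.g. "v18.19.0") meets the 18+ requirement.
node_ok() {
  v=${1#v}                              # strip the leading "v" if present
  [ "${v%%.*}" -ge 18 ] && echo yes || echo no
}

# Compare against the locally installed Node (falls back to v0.0.0 if node is absent):
node_ok "$(node -v 2>/dev/null || echo v0.0.0)"
```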

2. Core Config: Two Environment Variables

API易 is an OpenAI-compatible interface, so the entire config is two lines:
OPENAI_BASE_URL=https://api.apiyi.com/v1
OPENAI_API_KEY=sk-your-APIYI-key
Variable names are fixed: they must be OPENAI_API_KEY and OPENAI_BASE_URL. These names are hardcoded in Codex CLI; do not rename them. The Base URL must include /v1: write it as https://api.apiyi.com/v1. A missing /v1 will return 404.
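Since a missing /v1 and a wrong key prefix are the two most common mistakes, a quick sanity check before launching can save a round of debugging. A minimal sketch (check_codex_env is a hypothetical helper, not part of Codex CLI):

```shell
# check_codex_env: flag the two classic misconfigurations before launching codex.
check_codex_env() {
  case "$OPENAI_BASE_URL" in
    */v1) echo "base url ok" ;;
    *)    echo "base url missing /v1 (will 404)" ;;
  esac
  case "$OPENAI_API_KEY" in
    sk-*) echo "key format ok" ;;
    *)    echo "key missing or not sk-prefixed" ;;
  esac
}

OPENAI_BASE_URL="https://api.apiyi.com/v1" OPENAI_API_KEY="sk-demo" check_codex_env
```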

Mac / Linux Setup

Edit your shell config:
vim ~/.zshrc
# or ~/.bashrc
Add:
export OPENAI_BASE_URL="https://api.apiyi.com/v1"
export OPENAI_API_KEY="sk-your-APIYI-key"
Apply:
source ~/.zshrc
# or source ~/.bashrc

Windows Setup

In PowerShell, set persistent user-level variables (setx values apply to new terminal windows only):
setx OPENAI_BASE_URL "https://api.apiyi.com/v1"
setx OPENAI_API_KEY "sk-your-APIYI-key"
For the current session only:
$env:OPENAI_BASE_URL = "https://api.apiyi.com/v1"
$env:OPENAI_API_KEY = "sk-your-APIYI-key"

3. Launch & Basic Usage

Enter your project and launch:
cd /your/project
codex
Or run a one-shot task:
codex "write a Python HTTP server for me"
Non-interactive (silent) mode:
codex -q "fix the build errors in this project"

4. Models (API易 Recommendations)

Common models available on API易 for Codex CLI:
| Model | Strengths | Best For |
| --- | --- | --- |
| gpt-5.5 | Latest flagship | Complex coding, engineering analysis, agent workflows |
| gpt-5.4 | Stable workhorse | Day-to-day coding, debugging, refactoring |
| gpt-5.4-mini | Cheap 5.4 variant | Mid-scale tasks, batch processing |
| gpt-5.4-nano | Ultra-cheap | Simple completions, lightweight tasks |
| gpt-5.3-codex | Codex-tuned | Complex software engineering tasks |
| gpt-4.1 | Classic & cheap | Budget-sensitive daily use |

How to pick: daily work → gpt-5.4; heavy work / agents → gpt-5.5; pure code engineering → gpt-5.3-codex; cost-saving → gpt-5.4-mini / gpt-4.1.
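That guidance can be folded into a tiny wrapper for non-interactive use; a sketch (pick_model and codex_run are our names, not Codex CLI features):

```shell
# pick_model: map a task category to the recommended model from the table above.
pick_model() {
  case "$1" in
    heavy) echo "gpt-5.5" ;;        # heavy work / agent workflows
    code)  echo "gpt-5.3-codex" ;;  # pure code engineering
    cheap) echo "gpt-5.4-mini" ;;   # cost-saving
    *)     echo "gpt-5.4" ;;        # day-to-day default
  esac
}

# codex_run: run a one-shot task with the chosen model,
# e.g.  codex_run heavy "review this repo structure"
codex_run() { codex -q -m "$(pick_model "$1")" "$2"; }

pick_model heavy
```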

1. Specify model at launch

Use -m or --model:
codex -m gpt-5.5
codex --model gpt-5.4
codex -m gpt-5.3-codex
You can also pass a task directly:
codex -m gpt-5.5 "review this project structure and point out areas to optimize"

2. Specify model in non-interactive mode

codex -q -m gpt-5.4 "fix the build errors in this project"

3. Switch models inside a session

After entering Codex CLI, in the active session:
/model
Then choose or type the model to switch to.

4. Configure default model

To make a model the default, edit ~/.codex/config.toml:
model = "gpt-5.5"
Codex’s current official config file is config.toml (not the older config.json). The user-level config lives at ~/.codex/config.toml; project-level .codex/config.toml is also supported.

5. Advanced Configuration

Custom System Prompts

Edit:
~/.codex/instructions.md
You can define coding style, output language, project conventions, e.g.:
- Write code comments in English
- Follow the project's ESLint config
- Provide detailed explanations

Project-Level AGENTS.md

On first entry into a project, run the /init command inside a Codex session to generate AGENTS.md, which records the project structure and conventions. To make Codex respond in a specific language by default, add:
Always respond to the user in English for this project.

Common Flags

codex -h          # full help
codex -m <model>  # specify model
codex -q          # non-interactive / silent mode
codex --full-auto # auto-execute (use cautiously)

6. Notes

1. Key & Variable Names

  • Get your API易 Key from the console at api.apiyi.com/token
  • Variable name must be OPENAI_API_KEY — hardcoded in the CLI, do not rename

2. Base URL

  • Must include /v1: https://api.apiyi.com/v1
  • Other gateway domains (e.g. b.apiyi.com/v1, vip.apiyi.com/v1) work the same way

3. Network Issues

If you hit timeouts / connection failures, check in order:
  • Local HTTP/HTTPS proxy settings
  • DNS resolution
  • Whether you’re accidentally using a Cloudflare relay domain (not recommended)
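The proxy part of that checklist is easy to script; a small sketch (net_diag is our helper — it only inspects the environment and makes no network calls):

```shell
# net_diag: list any proxy variables that could be intercepting Codex CLI traffic.
net_diag() {
  found=0
  for v in HTTP_PROXY HTTPS_PROXY ALL_PROXY http_proxy https_proxy all_proxy; do
    eval "val=\"\${$v:-}\""             # read the variable named in $v
    if [ -n "$val" ]; then
      echo "proxy set: $v=$val"
      found=1
    fi
  done
  if [ "$found" -eq 0 ]; then echo "no proxy variables set"; fi
}

net_diag
```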

4. Model Differences

  • Codex CLI is biased toward coding tasks; multimodal capabilities (image/audio) are better called via dedicated endpoints
  • Different gpt-5.x models vary in tool calling, long-context handling, and reasoning depth — pick by task complexity

7. FAQ

Why does changing only the Base URL work?
Because API易 is fully compatible with the OpenAI API protocol: https://api.apiyi.com/v1 and https://api.openai.com/v1 are interchangeable in request/response format. Replacing the Base URL is enough.
codex: command not found?
Confirm the install:
npm install -g @openai/codex
codex --version
If it still fails, check whether npm’s global bin directory is on your PATH (npm bin -g on older npm; $(npm prefix -g)/bin on npm 9+).
Authentication failed / invalid API key?
  1. Make sure you’re using an API易 Key (starts with sk-), not an OpenAI key
  2. Verify the env var is actually exported:
    echo $OPENAI_API_KEY        # Mac/Linux
    echo %OPENAI_API_KEY%       # Windows CMD
    echo $env:OPENAI_API_KEY    # Windows PowerShell
  3. After changing env vars, restart the terminal or run source ~/.zshrc
Timeouts or 404 errors?
Most common cause: the Base URL is missing /v1. Correct: https://api.apiyi.com/v1. After that, check your local proxy and DNS.
How do I switch models?
Three ways:
  • One-off: codex -m gpt-5.5
  • In session: /model
  • Default: edit ~/.codex/config.toml and set model = "gpt-5.5"
Which models are supported?
  • OpenAI series: ✅ fully supported (recommended: gpt-5.5 / gpt-5.4 / gpt-5.3-codex)
  • Other vendors’ aggregated models: supported by API易, but Codex CLI itself is built around the OpenAI protocol; non-OpenAI models may behave differently around tool use / function calling
  • For Claude- or Gemini-based coding, use their native CLIs (e.g. Claude Code)
Should I use the CLI or call the API directly?
  • CLI: best for dev-time productivity
  • Production: prefer direct API calls (more control, monitoring, gradual rollout)
How do I uninstall or revert?
Uninstall the CLI:
npm uninstall -g @openai/codex
Remove the API易 config (env vars and config file):
# Mac/Linux: edit ~/.zshrc or ~/.bashrc and remove the OPENAI_* lines
rm ~/.codex/config.toml

# Windows: remove OPENAI_BASE_URL / OPENAI_API_KEY from System Environment Variables

8. Summary

The whole integration is one sentence:
Replace OpenAI’s endpoint with API易
The two essential lines:
export OPENAI_BASE_URL="https://api.apiyi.com/v1"
export OPENAI_API_KEY="sk-your-APIYI-key"
Everything else — model picking, prompts, instructions.md, AGENTS.md — is just polish on top.

API易 Console

Manage API keys and view usage

Claude Code Integration

Use Claude models for CLI coding

Model Comparison

All available models and pricing

API Manual

General calling conventions