OpenAI Compatible

OpenAI Provider

Generic OpenAI-compatible provider — works with any API that implements the OpenAI chat completions format. One C bridge, configurable base URL, Bearer auth, thinking and tool use support.

📄 scorpiox/libsxnet/sx_provider_openai.c

Overview

The OpenAI provider in scorpiox is a generic bridge that speaks the OpenAI chat completions protocol. Unlike the dedicated Anthropic or Gemini providers, it has no hardcoded base URL: point OPENAI_BASE_URL at any compatible endpoint and it just works.

The C implementation lives in sx_provider_openai.c inside libsxnet. It handles request construction, Bearer token auth, response parsing, thinking blocks, tool-use function calls, and optional traffic logging — all in pure C with zero dependencies.
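As a sketch of how such a bridge might derive its endpoint from the configured base URL, assuming a hypothetical sx_join_url helper (not the actual sx_provider_openai.c API):

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical helper: join the configured base URL with the chat
 * completions path, tolerating a trailing slash on the base.  The
 * real sx_provider_openai.c may do this differently. */
static void sx_join_url(char *out, size_t cap, const char *base)
{
    size_t n = strlen(base);
    if (n > 0 && base[n - 1] == '/')
        n--;                        /* ".../v1/" and ".../v1" behave the same */
    snprintf(out, cap, "%.*s/chat/completions", (int)n, base);
}
```

Normalizing the trailing slash up front means users can paste a base URL in either form without producing a double-slash path.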

How It Works

When PROVIDER=openai (or API_BACKEND=openai), the runtime loads the OpenAI provider bridge. Here's the request flow:

scorpiox-env.txt → sx_config_load() → sx_provider_openai → POST $OPENAI_BASE_URL/chat/completions
# The provider constructs a standard OpenAI-format request:

POST $OPENAI_BASE_URL/chat/completions
Authorization: Bearer $OPENAI_API_KEY
Content-Type: application/json

{
  "model": "$OPENAI_MODEL",
  "messages": [
    {"role": "system", "content": "..."},
    {"role": "user", "content": "..."}
  ],
  "tools": [...],          # included when tool use is enabled
  "stream": false
}
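A minimal C sketch of assembling that body with snprintf (illustrative function name; a real implementation must JSON-escape the message strings and append the "tools" array when tool use is enabled):

```c
#include <stdio.h>

/* Hypothetical sketch of the request body shown above.  Assumes the
 * inputs contain no characters that need JSON escaping. */
static int sx_build_body(char *out, size_t cap, const char *model,
                         const char *system_msg, const char *user_msg)
{
    int n = snprintf(out, cap,
        "{\"model\":\"%s\","
        "\"messages\":[{\"role\":\"system\",\"content\":\"%s\"},"
        "{\"role\":\"user\",\"content\":\"%s\"}],"
        "\"stream\":false}",
        model, system_msg, user_msg);
    return (n < 0 || (size_t)n >= cap) ? -1 : 0;   /* -1 on truncation */
}
```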

Configuration Reference

All keys are set in scorpiox-env.txt or as environment variables. The provider reads them at startup via sx_config.

Key                  Type     Default    Description
PROVIDER             string   scorpiox   Set to openai to activate this provider.
API_BACKEND          string   claude     Alternative provider selector; set to openai.
OPENAI_API_KEY       string   (none)     API key sent as the Authorization: Bearer header. Required for authenticated endpoints.
OPENAI_BASE_URL      path     (none)     Base URL for the API; must be set. Example: https://api.openai.com/v1
OPENAI_MODEL         string   (none)     Model identifier passed in the request body. Example: gpt-4o, gpt-4-turbo
OPENAI_TRAFFIC_DIR   path     (empty)    Directory for API traffic logs (request/response JSON). Empty to disable.
OPENAI_TRAFFIC_SEQ   integer  0          Sequence counter for traffic log file naming. Auto-incremented per request.
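A sketch of startup key resolution, assuming plain environment variables (the real provider goes through sx_config, which also reads scorpiox-env.txt; the helper name is hypothetical):

```c
#include <stdlib.h>

/* Hypothetical lookup: environment value if set and non-empty,
 * otherwise the caller-supplied default. */
static const char *sx_cfg(const char *key, const char *dflt)
{
    const char *v = getenv(key);
    return (v && v[0] != '\0') ? v : dflt;
}
```

At startup the bridge would then resolve, for example, sx_cfg("OPENAI_TRAFFIC_SEQ", "0") and fail fast when sx_cfg("OPENAI_BASE_URL", NULL) returns NULL, since that key has no default.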

Authentication

The OpenAI provider uses standard Bearer token authentication. The C bridge reads OPENAI_API_KEY from config and injects the header on every request.

# Auth header constructed by sx_provider_openai.c:

Authorization: Bearer sk-proj-abc123...

# For local/self-hosted endpoints that don't require auth,
# set OPENAI_API_KEY to any non-empty value (e.g. "xxx")
OPENAI_API_KEY=xxx
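Header construction might look like the following sketch (hypothetical function name; returns -1 when the key is missing or the buffer is too small):

```c
#include <stdio.h>

/* Hypothetical sketch of Bearer header construction; the real code
 * lives in sx_provider_openai.c. */
static int sx_auth_header(char *out, size_t cap, const char *api_key)
{
    if (!api_key || api_key[0] == '\0')
        return -1;                              /* key must be non-empty */
    int n = snprintf(out, cap, "Authorization: Bearer %s", api_key);
    return (n < 0 || (size_t)n >= cap) ? -1 : 0;
}
```

Rejecting an empty key here matches the note above: even no-auth local endpoints need a non-empty placeholder value.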

Supported Features

Feature support for the OpenAI-compatible provider, as defined in the C source:

🧠 Thinking: Extended reasoning / chain-of-thought blocks parsed from the response.

🔧 Tool Use: Function calling via the OpenAI tools/functions protocol.

📡 Streaming: Not currently enabled; requests use non-streaming mode.

🔄 Token Refresh: Not applicable; uses a static API key, no OAuth flow.

Traffic Logging

The OpenAI provider includes built-in traffic logging — useful for debugging, auditing, or replaying API interactions. Set OPENAI_TRAFFIC_DIR to enable.

# Enable traffic logging in scorpiox-env.txt:
OPENAI_TRAFFIC_DIR=/tmp/sx-openai-traffic
OPENAI_TRAFFIC_SEQ=0

# Each request/response pair is saved as:
#   /tmp/sx-openai-traffic/000001_request.json
#   /tmp/sx-openai-traffic/000001_response.json
# Sequence auto-increments per request.
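The naming scheme above can be sketched with a small path builder (illustrative helper, not the actual logging code):

```c
#include <stdio.h>

/* Hypothetical path builder matching the 000001_request.json style:
 * zero-padded six-digit sequence, then the record kind. */
static void sx_traffic_path(char *out, size_t cap, const char *dir,
                            unsigned seq, const char *kind)
{
    snprintf(out, cap, "%s/%06u_%s.json", dir, seq, kind);
}
```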

Examples

OpenAI Direct

# scorpiox-env.txt — Use OpenAI directly
PROVIDER=openai
OPENAI_API_KEY=sk-proj-abc123...
OPENAI_BASE_URL=https://api.openai.com/v1
OPENAI_MODEL=gpt-4o

Azure OpenAI

# scorpiox-env.txt — Azure OpenAI endpoint
PROVIDER=openai
OPENAI_API_KEY=your-azure-api-key
# Azure's OpenAI-compatible v1 endpoint; the model field carries the deployment name
OPENAI_BASE_URL=https://your-resource.openai.azure.com/openai/v1
OPENAI_MODEL=gpt-4o

Local LLM (Ollama)

# scorpiox-env.txt — Local Ollama instance
PROVIDER=openai
OPENAI_API_KEY=xxx
OPENAI_BASE_URL=http://localhost:11434/v1
OPENAI_MODEL=llama3.1

Together AI

# scorpiox-env.txt — Together AI hosted models
PROVIDER=openai
OPENAI_API_KEY=your-together-key
OPENAI_BASE_URL=https://api.together.xyz/v1
OPENAI_MODEL=meta-llama/Llama-3-70b-chat-hf

With Traffic Logging

# scorpiox-env.txt — Debug with full traffic capture
PROVIDER=openai
OPENAI_API_KEY=sk-proj-abc123...
OPENAI_BASE_URL=https://api.openai.com/v1
OPENAI_MODEL=gpt-4o
OPENAI_TRAFFIC_DIR=/tmp/sx-traffic
OPENAI_TRAFFIC_SEQ=0

Compatible Services

Any service that implements the OpenAI /v1/chat/completions endpoint works with this provider.

OpenAI: api.openai.com/v1
Azure OpenAI: *.openai.azure.com
Ollama: localhost:11434/v1
Together AI: api.together.xyz/v1
Groq: api.groq.com/openai/v1
OpenRouter: openrouter.ai/api/v1
LM Studio: localhost:1234/v1
vLLM: localhost:8000/v1