openclaw-oci/README.md
2026-02-26 16:38:41 -03:00


Integrating OpenClaw with Oracle Cloud Generative AI (OCI)

Enterprise AI Power, Open Ecosystem, Zero Compromise

The rapid evolution of AI orchestration tools has reshaped how companies build intelligent systems. Among these tools, OpenClaw has emerged as a powerful open-source platform designed to simplify the creation of AI agents, conversational workflows, and multi-channel integrations.

OpenClaw is not just another wrapper around LLM APIs. It is:

  • Modular
  • Plugin-driven
  • Open-source
  • OpenAI-compatible
  • Community-powered

Its OpenAI-compatible design makes it instantly interoperable with the entire AI tooling ecosystem — SDKs, automation frameworks, browser clients, bots, and custom agent pipelines.

And because it is open source, innovation happens in public.

There is an active and growing community contributing:

  • New plugins
  • Messaging integrations (WhatsApp, web, etc.)
  • Tool execution engines
  • Agent frameworks
  • Workflow automation patterns
  • Performance optimizations

This means OpenClaw evolves continuously — without vendor lock-in.

But while agility and innovation are essential, enterprises require something more:

  • Security
  • Governance
  • Compliance
  • Regional data sovereignty
  • Observability
  • Controlled network exposure
  • Predictable scalability

This is where Oracle Cloud Infrastructure (OCI) Generative AI becomes the strategic enterprise choice.

The Power of Ecosystem + Enterprise Security

OpenClaw: Open Ecosystem Advantage

Because OpenClaw is:

  • Open-source
  • Community-driven
  • Plugin-extensible
  • OpenAI-protocol compatible

You benefit from:

  • Rapid innovation
  • Transparent architecture
  • Community-tested integrations
  • Zero dependency on a single SaaS provider
  • Full customization capability

You are not locked into one AI vendor. You control your orchestration layer.

This flexibility is critical in a world where models evolve rapidly and enterprises need adaptability.

OCI Generative AI: Enterprise Trust Layer

Oracle Cloud Infrastructure adds what large organizations require:

  • Fine-grained IAM control
  • Signed API requests (no exposed API keys)
  • Dedicated compartments
  • Private VCN networking
  • Sovereign cloud regions
  • Enterprise SLAs
  • Monitoring & logging integration
  • Production-ready inference endpoints

OCI Generative AI supports powerful production-grade models such as:

  • Cohere Command
  • LLaMA family
  • Embedding models
  • Custom enterprise deployments
  • OpenAI-compatible models via mapping

This creates a secure AI backbone inside your own tenancy.

Why This Combination Is Strategically Powerful

By implementing a local OpenAI-compatible gateway backed by OCI, OpenClaw continues to behave exactly as designed — while inference happens securely inside Oracle Cloud.

You gain:

  • Full OpenAI protocol compatibility
  • Enterprise security boundaries
  • Cloud tenancy governance
  • Scalable AI inference
  • Ecosystem extensibility
  • Open-source flexibility

Without rewriting your agents. Without breaking plugins. Without sacrificing innovation.

About the tutorial

This tutorial explains how to integrate OpenClaw with Oracle Cloud Infrastructure (OCI) Generative AI by building an OpenAI-compatible API gateway using FastAPI.

Instead of modifying OpenClaw's core, we expose an OpenAI-compatible endpoint (/v1/chat/completions) that internally routes requests to OCI Generative AI.

This approach provides:

  • Full OpenClaw compatibility
  • Control over OCI model mapping
  • Support for streaming responses
  • Enterprise-grade OCI infrastructure
  • Secure request signing via OCI SDK

Why Use OCI Generative AI?

Oracle Cloud Infrastructure provides:

  • Enterprise security (IAM, compartments, VCN)
  • Flexible model serving (ON_DEMAND, Dedicated)
  • High scalability
  • Cost control
  • Regional deployment control
  • Native integration with Oracle ecosystem

By building an OpenAI-compatible proxy, we combine:

OpenClaw flexibility + OCI enterprise power


Architecture

OpenClaw
   ↓
OpenAI-Compatible Gateway (FastAPI)
   ↓
OCI Generative AI REST API (20231130)
   ↓
OCI-Hosted LLM


Project Structure

project/
 ├── oci_openai_proxy.py
 ├── README.md

Key Code Sections Explained

1. Configuration Section

import os

OCI_CONFIG_FILE = os.getenv("OCI_CONFIG_FILE", os.path.expanduser("~/.oci/config"))
OCI_PROFILE = os.getenv("OCI_PROFILE", "DEFAULT")
OCI_COMPARTMENT_ID = os.getenv("OCI_COMPARTMENT_ID", "...")
OCI_GENAI_ENDPOINT = os.getenv(
    "OCI_GENAI_ENDPOINT",
    "https://inference.generativeai.us-chicago-1.oci.oraclecloud.com"
)

What it does:

  • Reads OCI authentication config
  • Defines target compartment
  • Defines the OCI inference endpoint

2. Model Mapping

MODEL_MAP = {
    "gpt-5": "openai.gpt-4.1",
    "openai/gpt-5": "openai.gpt-4.1",
    "openai-compatible/gpt-5": "openai.gpt-4.1",
}

Why this is important:

OpenClaw expects OpenAI model names.
OCI uses different model IDs.

This dictionary translates between them.
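A small lookup helper (hypothetical — the name resolve_model is not from the tutorial's code) makes the failure mode explicit when OpenClaw sends a model name that has no OCI mapping:

```python
MODEL_MAP = {
    "gpt-5": "openai.gpt-4.1",
    "openai/gpt-5": "openai.gpt-4.1",
    "openai-compatible/gpt-5": "openai.gpt-4.1",
}

def resolve_model(openai_name: str) -> str:
    """Translate an OpenAI-style model name into an OCI model ID."""
    oci_id = MODEL_MAP.get(openai_name)
    if oci_id is None:
        raise ValueError(f"No OCI mapping for model {openai_name!r}")
    return oci_id
```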


3. Pydantic OpenAI-Compatible Request Model

class Message(BaseModel):
    role: str
    content: str

class ChatRequest(BaseModel):
    model: str
    messages: List[Message]
    temperature: Optional[float] = None
    max_tokens: Optional[int] = None
    stream: Optional[bool] = False

Purpose:

Defines a request format fully compatible with OpenAI's API.


4. OCI Signer

import oci

def get_signer():
    config = oci.config.from_file(OCI_CONFIG_FILE, OCI_PROFILE)
    signer = oci.signer.Signer(
        tenancy=config["tenancy"],
        user=config["user"],
        fingerprint=config["fingerprint"],
        private_key_file_location=config["key_file"],
        pass_phrase=config.get("pass_phrase"),
    )
    return signer

Purpose:

Creates a signed request for OCI REST calls.

Without this, OCI rejects the request.


5. Message Conversion (OpenAI → OCI Format)

def openai_to_oci_messages(messages: list, model_id: str) -> list:

OCI expects:

{
  "role": "USER",
  "content": [
    {"type": "TEXT", "text": "..."}
  ]
}

OpenAI sends:

{ "role": "user", "content": "..." }

This function converts formats.
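Based on the two formats above, the conversion can be sketched as below. This ignores any per-model special cases the model_id parameter may exist for in the real proxy:

```python
def openai_to_oci_messages(messages: list, model_id: str) -> list:
    """Convert OpenAI-style chat messages to the OCI GENERIC format."""
    oci_messages = []
    for msg in messages:
        oci_messages.append(
            {
                # OCI uses upper-case roles: "user" -> "USER"
                "role": msg["role"].upper(),
                # OCI wraps text in a typed content list
                "content": [{"type": "TEXT", "text": msg["content"]}],
            }
        )
    return oci_messages
```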


6. OCI REST Call

url = f"{OCI_GENAI_ENDPOINT}/20231130/actions/chat"

We use OCI's REST endpoint:

POST /20231130/actions/chat

Payload structure:

{
  "compartmentId": "...",
  "servingMode": {
    "servingType": "ON_DEMAND",
    "modelId": "gpt-5"
  },
  "chatRequest": {
    "apiFormat": "GENERIC",
    "messages": [...],
    "maxTokens": 512
  }
}
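The call can be sketched as a pure payload builder plus a signed POST. Function names here are illustrative, not from the tutorial's code; the key point is that the OCI Signer from section 4 doubles as a `requests` auth handler, so the request is signed transparently:

```python
import requests

def build_chat_payload(compartment_id: str, oci_model_id: str,
                       messages: list, max_tokens: int = 512) -> dict:
    """Assemble the GENERIC chat payload shown above."""
    return {
        "compartmentId": compartment_id,
        "servingMode": {"servingType": "ON_DEMAND", "modelId": oci_model_id},
        "chatRequest": {
            "apiFormat": "GENERIC",
            "messages": messages,
            "maxTokens": max_tokens,
        },
    }

def post_chat(endpoint: str, signer, payload: dict) -> dict:
    """POST the payload to OCI; the Signer signs the request for us."""
    url = f"{endpoint}/20231130/actions/chat"
    resp = requests.post(url, json=payload, auth=signer)
    resp.raise_for_status()
    return resp.json()
```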

7. Streaming Implementation

def fake_stream(text: str, model: str):

Since OCI's GENERIC mode returns the full response at once (not a token stream), we simulate OpenAI streaming by splitting the response into chunks.

This keeps OpenClaw fully compatible.
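A possible shape for fake_stream — the tutorial doesn't show its body, so the chunk size and exact field layout here are assumptions — yielding OpenAI-style server-sent-event chunks:

```python
import json
import time
import uuid

def fake_stream(text: str, model: str, chunk_size: int = 20):
    """Yield OpenAI-style SSE chunks for an already-complete response."""
    completion_id = f"chatcmpl-{uuid.uuid4().hex}"
    created = int(time.time())
    # Emit the text in fixed-size slices as delta chunks
    for i in range(0, len(text), chunk_size):
        chunk = {
            "id": completion_id,
            "object": "chat.completion.chunk",
            "created": created,
            "model": model,
            "choices": [{
                "index": 0,
                "delta": {"content": text[i:i + chunk_size]},
                "finish_reason": None,
            }],
        }
        yield f"data: {json.dumps(chunk)}\n\n"
    # Final chunk carries finish_reason, then the OpenAI end sentinel
    done = {
        "id": completion_id,
        "object": "chat.completion.chunk",
        "created": created,
        "model": model,
        "choices": [{"index": 0, "delta": {}, "finish_reason": "stop"}],
    }
    yield f"data: {json.dumps(done)}\n\n"
    yield "data: [DONE]\n\n"
```

In FastAPI, the generator would typically be wrapped in a StreamingResponse with media type text/event-stream.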


8. OpenAI-Compatible Response Builder

def build_openai_response(model: str, text: str):

Formats the OCI response to match OpenAI's schema:

{
  "id": "...",
  "object": "chat.completion",
  "choices": [...]
}
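A sketch of the builder is below. The zeroed usage block is an assumption for illustration, since OCI's GENERIC response may not map token counts one-to-one onto OpenAI's fields:

```python
import time
import uuid

def build_openai_response(model: str, text: str) -> dict:
    """Wrap plain text from OCI in an OpenAI chat.completion envelope."""
    return {
        "id": f"chatcmpl-{uuid.uuid4().hex}",
        "object": "chat.completion",
        "created": int(time.time()),
        "model": model,
        "choices": [{
            "index": 0,
            "message": {"role": "assistant", "content": text},
            "finish_reason": "stop",
        }],
        # Placeholder usage: real token counts would come from OCI, if exposed
        "usage": {"prompt_tokens": 0, "completion_tokens": 0, "total_tokens": 0},
    }
```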

Running the Server

Install dependencies:

pip install fastapi uvicorn requests oci pydantic

Run:

uvicorn oci_openai_proxy:app --host 0.0.0.0 --port 8050

Testing with curl

curl http://127.0.0.1:8050/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-5",
    "messages": [
      {"role": "user", "content": "Hello"}
    ]
  }'

OpenClaw Configuration (openclaw.json)

Edit your openclaw.json configuration file (normally located at ~/.openclaw/openclaw.json) and replace the models and agents definitions with:

{
   "models":{
      "providers":{
         "openai-compatible":{
            "baseUrl":"http://127.0.0.1:8050/v1",
            "apiKey":"sk-test",
            "api":"openai-completions",
            "models":[
              {
                "id": "gpt-5",
                "name": "gpt-5" ,
                "reasoning": false,
                "input": ["text"],
                "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
                "contextWindow": 200000,
                "maxTokens": 8192
              }
            ]
         }
      }
   },
   "agents":{
      "defaults":{
         "model":{
           "primary": "openai-compatible/gpt-5"
         }
      }
   },
   "gateway":{
      "port":18789,
      "mode":"local",
      "bind":"loopback"
   }
}

Important Fields

  Field           Purpose
  --------------- --------------------------------
  baseUrl         Points OpenClaw to our gateway
  api             Must be openai-completions
  model id        Must match MODEL_MAP key
  contextWindow   Model context size
  maxTokens       Max response tokens

Final Notes

You now have:

✔ OpenClaw fully integrated
✔ OCI Generative AI backend
✔ Streaming compatibility
✔ Enterprise-ready architecture



Acknowledgments

  • Author - Cristiano Hoshikawa (Oracle LAD A-Team Solution Engineer)