Introduction
The OCI Generative AI team has introduced the OCI OpenAI package, a Python library that enables developers to invoke models hosted on the OCI Generative AI Service using familiar OpenAI SDK interfaces. The package automatically manages OCI authentication, providing a secure and streamlined connection to OCI-hosted models. By supporting a subset of the OpenAI SDK, it allows teams already experienced with the OpenAI API to integrate their existing workflows with OCI Generative AI more efficiently and reduce onboarding time. Additional documentation is available in the OCI Generative AI user guide.
In this blog, we build a simple weather agent (the “Hello World” of agentic applications) using the OCI OpenAI package across multiple agentic frameworks: the OpenAI SDK, the OpenAI Agents SDK, LangChain 1.0, LangGraph 1.0, and the Microsoft Agent Framework. The oci-openai package supports multiple authentication methods for connecting to OCI Generative AI; further details can be found in the project repository. In the examples below, the weather agent uses the OciUserPrincipalAuth authentication method to connect to the OCI Generative AI LLM.
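For reference, OciUserPrincipalAuth with profile_name="DEFAULT" reads credentials from the standard OCI CLI configuration file (by default ~/.oci/config). A profile looks roughly like the following sketch; every value below is a placeholder that you would replace with your own tenancy details:

```ini
[DEFAULT]
user=ocid1.user.oc1..<unique_id>
fingerprint=<api_key_fingerprint>
key_file=~/.oci/oci_api_key.pem
tenancy=ocid1.tenancy.oc1..<unique_id>
region=us-chicago-1
```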
OpenAI SDK
The OpenAI SDK is the official client library providing a streamlined interface to OpenAI’s core AI capabilities, including text generation and chat completions. It serves as the foundational layer for interacting with OpenAI models in a reliable, consistent, and developer-friendly manner. The OCI OpenAI package extends this SDK by adding native OCI authentication support. In this implementation, the weather agent is built using the OpenAI Chat Completions API configured to use the xai.grok-3-mini model hosted on OCI Generative AI. The agent constructs tools and their schemas, then iteratively invokes the Chat Completions API until it produces a final answer for the user query.
import httpx
from openai import OpenAI
from oci_openai import OciUserPrincipalAuth
import inspect
from pydantic import BaseModel
import json

COMPARTMENT_ID = "AddCompartmentId"
model = "xai.grok-3-mini"

client = OpenAI(
    api_key="OCI",
    base_url="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com/20231130/actions/v1",
    http_client=httpx.Client(
        auth=OciUserPrincipalAuth(profile_name="DEFAULT"),
        headers={"CompartmentId": COMPARTMENT_ID},
    ),
)
class Agent(BaseModel):
    name: str = "Agent"
    model: str = "xai.grok-3-mini"
    instructions: str = "You are a helpful Agent"
    tools: list = []
def function_to_schema(func) -> dict:
    type_map = {
        str: "string",
        int: "integer",
        float: "number",
        bool: "boolean",
        list: "array",
        dict: "object",
        type(None): "null",
    }

    try:
        signature = inspect.signature(func)
    except ValueError as e:
        raise ValueError(
            f"Failed to get signature for function {func.__name__}: {str(e)}"
        )

    parameters = {}
    for param in signature.parameters.values():
        # map the annotation to a JSON Schema type, defaulting to "string"
        param_type = type_map.get(param.annotation, "string")
        parameters[param.name] = {"type": param_type}

    required = [
        param.name
        for param in signature.parameters.values()
        if param.default == inspect._empty
    ]

    return {
        "type": "function",
        "function": {
            "name": func.__name__,
            "description": (func.__doc__ or "").strip(),
            "parameters": {
                "type": "object",
                "properties": parameters,
                "required": required,
            },
        },
    }
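To make the mapping concrete, the following standalone sketch re-derives, in condensed form, the schema that function_to_schema produces for the get_weather tool used later in this post (the logic mirrors the function above; it is an illustration, not part of the package):

```python
# Condensed illustration of the tool-schema derivation for get_weather.
import inspect

def get_weather(city: str) -> str:
    """Get current temperature for a given city."""
    return f"The weather in {city} is sunny."

sig = inspect.signature(get_weather)
# every parameter of this tool maps to the JSON Schema "string" type
properties = {p.name: {"type": "string"} for p in sig.parameters.values()}
# parameters without defaults are required
required = [p.name for p in sig.parameters.values() if p.default is inspect.Parameter.empty]

schema = {
    "type": "function",
    "function": {
        "name": get_weather.__name__,
        "description": (get_weather.__doc__ or "").strip(),
        "parameters": {"type": "object", "properties": properties, "required": required},
    },
}
print(schema["function"]["parameters"])
# {'type': 'object', 'properties': {'city': {'type': 'string'}}, 'required': ['city']}
```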
def run_agent(agent, messages):
    num_init_messages = len(messages)
    messages = messages.copy()

    # turn python functions into tools and save a reverse map
    tool_schemas = [function_to_schema(tool) for tool in agent.tools]
    tools_map = {tool.__name__: tool for tool in agent.tools}

    while True:
        # === 1. get openai completion ===
        response = client.chat.completions.create(
            model=agent.model,
            messages=[{"role": "system", "content": agent.instructions}] + messages,
            tools=tool_schemas or None,
        )
        message = response.choices[0].message
        messages.append(message)

        if message.content:  # print assistant response
            print("Assistant:", message.content)

        if not message.tool_calls:  # if finished handling tool calls, break
            break

        # === 2. handle tool calls ===
        for tool_call in message.tool_calls:
            result = execute_tool_call(tool_call, tools_map)
            result_message = {
                "role": "tool",
                "tool_call_id": tool_call.id,
                "content": result,
            }
            messages.append(result_message)

    # === 3. return new messages ===
    return messages[num_init_messages:]
def execute_tool_call(tool_call, tools_map):
    name = tool_call.function.name
    args = json.loads(tool_call.function.arguments)
    print(f"Assistant: {name}({args})")
    # call corresponding function with provided arguments
    return tools_map[name](**args)
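The dispatch step above is worth pausing on: the model does not call Python directly; it returns the tool name plus the arguments as a JSON string, which the agent decodes and splats into the matching function. A minimal standalone sketch of that step:

```python
# Minimal sketch of the execute_tool_call dispatch mechanism.
import json

def get_weather(city: str) -> str:
    """Get current temperature for a given city."""
    return f"The weather in {city} is sunny."

# reverse map from tool name to Python function
tools_map = {"get_weather": get_weather}

# arguments arrive as a JSON string, as in tool_call.function.arguments
raw_arguments = '{"city": "Bangalore"}'
result = tools_map["get_weather"](**json.loads(raw_arguments))
print(result)  # The weather in Bangalore is sunny.
```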
def get_weather(city: str) -> str:
    """Get current temperature for a given city."""
    return f"The weather in {city} is sunny."

weather_agent = Agent(
    name="Weather Assistant",
    instructions="You are a helpful weather assistant.",
    tools=[get_weather],
)

messages = []
user_query = "What is the weather like in Bangalore today?"
print("User:", user_query)
messages.append({"role": "user", "content": user_query})
response = run_agent(weather_agent, messages)
print("Final Response: ", response[-1].content)
OpenAI Agents SDK
The OpenAI Agents SDK provides a higher-level framework for building agentic applications on top of OpenAI models. It allows developers to define agents capable of reasoning through tasks, invoking tools, and orchestrating multi-step workflows. The SDK supports structured tool calling, function execution, file handling, and state management, enabling agents to autonomously determine next actions based on context. With the Agents SDK, building the weather agent is simpler: an OCI Generative AI client is created with the AsyncOpenAI API and configured through OpenAIChatCompletionsModel to use the xai.grok-3-mini model. The configured model, along with the agent’s instructions and tools, is then passed to the Agent.
import asyncio
from agents import Agent, Runner, function_tool, OpenAIChatCompletionsModel
from openai import AsyncOpenAI
import httpx
from oci_openai import OciUserPrincipalAuth

COMPARTMENT_ID = "AddCompartmentId"

# OCI Generative AI endpoint
client = AsyncOpenAI(
    api_key="OCI",
    base_url="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com/20231130/actions/v1",
    http_client=httpx.AsyncClient(
        auth=OciUserPrincipalAuth(profile_name="DEFAULT"),
        headers={"CompartmentId": COMPARTMENT_ID},
    ),
)
model = OpenAIChatCompletionsModel(model="xai.grok-3-mini", openai_client=client)

@function_tool
def get_weather(city: str) -> str:
    """Get current temperature for a given city."""
    return f"The weather in {city} is sunny."

weather_agent = Agent(
    name="Weather Assistant",
    instructions="You are a helpful weather assistant.",
    tools=[get_weather],
    model=model,
)

async def main():
    result = await Runner.run(weather_agent, input="What's the weather in Bangalore?")
    print(result.final_output)

if __name__ == "__main__":
    asyncio.run(main())
LangChain
LangChain is a framework for building applications powered by large language models that simplifies prompt management, model interaction, memory handling, data retrieval, and tool integration. LangChain 1.0 represents a significant evolution of the framework, focusing on simplifying and streamlining the development of LLM-powered agents. The weather agent is implemented using LangChain’s create_agent API, which is built on top of LangGraph. The OCI xai.grok-3-mini model is configured through the ChatOpenAI API and passed into the create_agent function to construct the agent.
from langchain.agents import create_agent
from langchain_openai import ChatOpenAI
import httpx
from oci_openai import OciUserPrincipalAuth

COMPARTMENT_ID = "AddCompartmentId"

llm = ChatOpenAI(
    model="xai.grok-3-mini",  # for example "xai.grok-4-fast-reasoning"
    api_key="OCI",
    base_url="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com/20231130/actions/v1",
    http_client=httpx.Client(
        auth=OciUserPrincipalAuth(profile_name="DEFAULT"),
        headers={"CompartmentId": COMPARTMENT_ID},
    ),
)

def get_weather(city: str) -> str:
    """Get current temperature for a given city."""
    return f"The weather in {city} is sunny."

weather_agent = create_agent(
    model=llm,
    tools=[get_weather],
    system_prompt="You are a helpful weather assistant.",
)
response = weather_agent.invoke(
    {"messages": [{"role": "user", "content": "What is the weather like in Bangalore today?"}]}
)
print(response["messages"][-1].content)
LangGraph
LangGraph is a graph-based orchestration framework designed for building stateful, multi-step agentic workflows on top of LangChain. It allows developers to define agents and tools as nodes in a directed graph, enabling precise control over execution order, branching logic, and iterative reasoning loops. It is particularly useful for applications that require deterministic control flow and long-running processes. In this example, the weather agent is constructed as a graph, where the OCI xai.grok-3-mini model is configured via ChatOpenAI and invoked from an LLM node.
from langgraph.prebuilt import ToolNode, tools_condition
from langchain_core.tools import tool
from langgraph.graph import MessagesState, StateGraph, START, END
from langchain.messages import SystemMessage, HumanMessage
from langchain_openai import ChatOpenAI
import httpx
from oci_openai import OciUserPrincipalAuth

COMPARTMENT_ID = "AddCompartmentId"

@tool
def get_weather(city: str) -> str:
    """Get current temperature for a given city."""
    return f"The weather in {city} is sunny."
llm_with_tools = ChatOpenAI(
    model="xai.grok-3-mini",
    api_key="OCI",
    base_url="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com/20231130/actions/v1",
    http_client=httpx.Client(
        auth=OciUserPrincipalAuth(profile_name="DEFAULT"),
        headers={"CompartmentId": COMPARTMENT_ID},
    ),
).bind_tools([get_weather])

def call_model(state: MessagesState):
    system_prompt = [SystemMessage(content="You are a helpful weather assistant.")]
    messages = state["messages"]
    response = llm_with_tools.invoke(system_prompt + messages)
    return {"messages": [response]}

agent_builder = StateGraph(MessagesState)
agent_builder.add_node("llm", call_model)
agent_builder.add_node("tools", ToolNode([get_weather]))
agent_builder.add_edge(START, "llm")
agent_builder.add_conditional_edges("llm", tools_condition, ["tools", END])
agent_builder.add_edge("tools", "llm")
weather_agent = agent_builder.compile()

response = weather_agent.invoke({"messages": [HumanMessage("What is the weather in Bangalore")]})
print(response["messages"][-1].content)
Microsoft Agent Framework
The Microsoft Agent Framework, recently introduced by Microsoft, builds on Semantic Kernel and AutoGen to support the creation of intelligent, task-oriented agents. It provides abstractions for tool invocation, multi-step reasoning, and integration with external systems. In this blog, the weather agent is built by creating an OCI Generative AI client using the AsyncOpenAI API and passing it to an OpenAIChatClient configured with the xai.grok-3-mini model. The configured model, along with the agent’s instructions and tools, is then supplied to the Agent.
import asyncio
from agent_framework import ChatAgent
from agent_framework.openai import OpenAIChatClient
import httpx
from oci_openai import OciUserPrincipalAuth
from openai import AsyncOpenAI

def get_weather(city: str) -> str:
    """Get current temperature for a given city."""
    return f"The weather in {city} is sunny."

async def main() -> None:
    COMPARTMENT_ID = "AddCompartmentId"

    # Create the OCI Generative AI client
    oci_client = AsyncOpenAI(
        api_key="OCI",
        base_url="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com/20231130/actions/v1",
        http_client=httpx.AsyncClient(
            auth=OciUserPrincipalAuth(profile_name="DEFAULT"),
            headers={"CompartmentId": COMPARTMENT_ID},
        ),
    )

    chat_client = OpenAIChatClient(
        async_client=oci_client,
        model_id="xai.grok-3-mini",
    )

    # Create the agent
    weather_agent = ChatAgent(
        name="WeatherAgent",
        chat_client=chat_client,
        instructions="You are a helpful weather assistant.",
        tools=[get_weather],
    )

    query = "What's the weather like in Bangalore?"
    print(f"User: {query}")
    result = await weather_agent.run(query)
    print(f"Agent: {result}\n")

if __name__ == "__main__":
    asyncio.run(main())
Conclusion
This blog demonstrated how the OCI OpenAI package makes it easy to build agentic applications on OCI Generative AI using multiple frameworks. By implementing a simple weather agent across the OpenAI SDK, OpenAI Agents SDK, LangChain, LangGraph, and the Microsoft Agent Framework, we highlighted how consistently the OCI OpenAI package integrates with different development approaches. This flexibility allows teams to adopt the framework that best fits their needs while benefiting from OCI’s secure, scalable AI infrastructure.
I would like to acknowledge the support provided by Lyudmil Pelov in the development of this blog.
