first commit

2025-03-07 10:29:41 -03:00
parent d61d795ecb
commit ab04e3f594
9 changed files with 148 additions and 104 deletions


@@ -1,4 +1,4 @@
-# Develop an Agent AI with Oracle Cloud Generative AI
+# Develop a simple AI Agent Tool using Oracle Cloud Generative AI and REST APIs
 ## Introduction
@@ -24,7 +24,9 @@ Throughout the document, common scenarios will be presented where the applicatio
 You can find and test the code here: [agent_ocigenai.py](./source/agent_ocigenai.py)
-The code is divided in 4 modules:
+The code is divided into 5 modules:
+**Simple Database Persistence Services**: The code creates a simple database used to insert, delete, query, and summarize the order.
 **Service Definition**: The code defines several services, such as insert_order, delete_order, search_order, order_cost, and delivery_address. These services are decorated with the @tool decorator, which indicates that they can be called by the conversational agent.
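The @tool mechanism described above can be illustrated with a standard-library sketch. The decorator below is a hypothetical stand-in for `langchain_core.tools.tool` (which additionally builds an argument schema): each decorated service is registered together with its docstring, and the agent later uses those docstrings as natural-language context to pick a service.

```python
# Hypothetical stand-in for langchain_core.tools.tool (sketch only).
TOOL_REGISTRY = {}

def tool(fn):
    """Register a business service with its docstring as its description."""
    TOOL_REGISTRY[fn.__name__] = {"fn": fn, "description": fn.__doc__}
    return fn

@tool
def insert_order(item):
    """Create an order with items."""
    return {"message": "Items added to order"}

# The registry now maps service names to callables and their contexts.
print(sorted(TOOL_REGISTRY))
print(TOOL_REGISTRY["insert_order"]["description"])
```

The real library does more (it derives a parameter schema from the signature and validates inputs), but the registry-plus-docstring pairing is the core idea.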
@@ -34,43 +36,47 @@ The code is divided in 4 modules:
 **Conversational Loop**: The code enters an infinite loop, where it waits for user input and processes the responses using the conversational agent.
-### REST SERVICES
+### Database Services
+Just to store the data in a simple database, this demo creates a SQLite3 database for persistence. These services will be used in the Business Services.
+![img.png](images/img_22.png)
+### REST Services
 Here are the services defined for the REST calls. For the example of fetching the address from the zip code, a call is made to the OCI API Gateway, which exposes an integration built in Oracle Integration to get the address from a microservice on Oracle Kubernetes Engine (OKE).
 ![img_1.png](images/img_1.png)
-### BUSINESS SERVICES
-When implementing business services, it is possible to expose these services so that Generative AI can better explore each of them. This is possible through a library called langchain.tools, which is capable of interpreting a given context in natural language and associating it with a specific business service. When declaring the services that will be part of the business logic, it is possible to declare "aliases" in the docstrings of each of them to help contextualize them.
-![img.png](images/img_7.png)
+### Business Services
+When implementing business services, it is possible to expose these services so that Generative AI can better explore each of them. This is possible through a library called langchain_core.tools, which is capable of interpreting a given context in natural language and associating it with a specific business service. When declaring the services that will be part of the business logic, it is possible to declare "aliases" in the docstrings of each of them to help contextualize them.
+![img.png](images/img_17.png)
 The context declaration is also necessary in the prompt used by the AI model.
-![img.png](images/img_9.png)
+![img_1.png](images/img_18.png)
 Note that in each service definition, it is possible to determine a specific context so that, when sending a request in natural language, the library can interpret what was requested and determine which appropriate service should be executed.
-The langchain.tools library understands the scope of work by associating the contexts and services available for use. This is done by the following declaration:
+The langchain_core.tools library understands the scope of work by associating the contexts and services available for use. This is done by the following declaration:
 ![img.png](images/img_6.png)
-Another interesting point about the langchain.tools library is that the service signature attributes are also interpreted: the library itself determines how to forward the natural-language request and define the attributes of the parameters of the service in question. This is impressive in itself, as it greatly reduces the implementation burden on integrations. In the traditional integration model, time must be spent defining the FROM-TO mapping between the source and destination of these integrations, which is a considerable effort. In the Agent AI model, the attributes are passed through the context: the library can determine what each parameter is and pass it to the service in the correct way.
+Another interesting point about the langchain_core.tools library is that the service signature attributes are also interpreted: the library itself determines how to forward the natural-language request and define the attributes of the parameters of the service in question. This is impressive in itself, as it greatly reduces the implementation burden on integrations. In the traditional integration model, time must be spent defining the FROM-TO mapping between the source and destination of these integrations, which is a considerable effort. In the Agent AI model, the attributes are passed through the context: the library can determine what each parameter is and pass it to the service in the correct way.
+![img.png](img.png)
 ### Test the Code
 You can test and adjust the code for your purposes. The service named "delivery_address" was implemented calling a REST API. In this example, you can test the code by changing the real REST request to a fake request. To do this, comment out the real code:
-![img_1.png](images/img_11.png)
+![img_2.png](images/img_19.png)
 To comment out the code, just put a "#" at the start of these lines:
-![img_2.png](images/img_12.png)
+![img_3.png](images/img_20.png)
 And uncomment this code:
@@ -107,8 +113,7 @@ You can run the code executing this command on your terminal:
 python agent_ocigenai.py
-![img.png](images/img_8.png)
+![img_4.png](images/img_21.png)
 ## Scenarios for Agent AI
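The signature interpretation described in the Business Services section can be approximated with standard-library introspection. This is a hedged sketch of the mechanism only — `describe_tool` is a hypothetical helper, and the real schema building inside langchain_core.tools is considerably richer:

```python
import inspect

def describe_tool(fn):
    """Derive a simple parameter schema from a function signature,
    mimicking how a tool library can infer argument names, types,
    and which parameters are required (sketch only)."""
    sig = inspect.signature(fn)
    return {
        name: {
            "type": p.annotation.__name__
            if p.annotation is not inspect.Parameter.empty else "any",
            "required": p.default is inspect.Parameter.empty,
        }
        for name, p in sig.parameters.items()
    }

# Same signature as the demo's delivery_address service.
def delivery_address(postalCode: str, number: str = "", complement: str = "") -> str:
    """Find the complete address of a postal code."""
    return "Paulista Avenue, 1000 - 01310-000 - Sao Paulo - SP"

schema = describe_tool(delivery_address)
print(schema)
```

From such a schema plus the docstring, the agent can map a natural-language request onto the right service and its parameters without any hand-written FROM-TO mapping.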

BIN  images/img_17.png  (new file, 154 KiB)
BIN  images/img_18.png  (new file, 107 KiB)
BIN  images/img_19.png  (new file, 81 KiB)
BIN  images/img_20.png  (new file, 68 KiB)
BIN  images/img_21.png  (new file, 36 KiB)
BIN  images/img_22.png  (new file, 104 KiB)

source/agent_ocigenai.py

@@ -1,16 +1,61 @@
-# Adapter for OCI Generative AI Agent
+import sqlite3
 from langchain_core.prompts import PromptTemplate, ChatPromptTemplate
-from langchain.agents import AgentExecutor, create_tool_calling_agent
-from langchain.tools import BaseTool, StructuredTool, Tool, tool
+from langchain_core.messages import HumanMessage, SystemMessage
+from langgraph.prebuilt.chat_agent_executor import AgentState
+from langgraph.checkpoint.memory import MemorySaver
+from langgraph.prebuilt import create_react_agent
+from langchain_core.tools import tool
 from langchain_community.chat_models.oci_generative_ai import ChatOCIGenAI
 import requests
 from requests.auth import HTTPBasicAuth
-user_name = "#######################"
-password = "#######################"
-order_list = [] # Persistence for Order
+user_name = "YOUR_USER_NAME"
+password = "YOUR_PASSWORD"
+#--------------------------------------------------------------------------
+# DB PERSISTENCE
+# Create the connection with a database: SQLite
+def connect_db():
+    return sqlite3.connect('orders.db')
+# Create the table orders if it does not exist
+def create_orders_table():
+    conn = connect_db()
+    cursor = conn.cursor()
+    cursor.execute("""
+        CREATE TABLE IF NOT EXISTS orders (
+            id INTEGER PRIMARY KEY AUTOINCREMENT,
+            item TEXT
+        )
+    """)
+    conn.commit()
+    conn.close()
+# Add items to an order
+def insert_item_to_db(item):
+    conn = connect_db()
+    cursor = conn.cursor()
+    cursor.execute("INSERT INTO orders (item) VALUES (?)", (item,))
+    conn.commit()
+    conn.close()
+# Delete an item from an order
+def delete_item_from_db(item):
+    conn = connect_db()
+    cursor = conn.cursor()
+    cursor.execute("DELETE FROM orders WHERE item = ?", (item,))
+    conn.commit()
+    conn.close()
+# Search for items in the order
+def get_all_items_from_db():
+    conn = connect_db()
+    cursor = conn.cursor()
+    cursor.execute("SELECT item FROM orders")
+    items = cursor.fetchall()
+    conn.close()
+    return [item[0] for item in items]
 #--------------------------------------------------------------------------
 # REST SERVICES
@@ -44,44 +89,42 @@ def post_request(url, data, headers=None):
 # BUSINESS SERVICES
 @tool
-def insert_order(items):
-    """Create an order with items. The customer can ask for items from a restaurant. The customer wants to include an item."""
-    global order_list
-    order_list.extend(items) # Adds new items to the order
-    print("Item(s) added:", items)
-    return {"message": "Items added to order", "current_order": order_list}
+def insert_order(item):
+    """Create an order with items."""
+    insert_item_to_db(item)
+    print("Item(s):", item)
+    return {"message": "Items added to order"}
 @tool
 def delete_order(item):
-    """Delete an item from the order. The customer may change their mind about one or more items.
-    The customer can request to delete an item from a restaurant order.
-    The customer may ask to 'Remove the item', 'I don't want the item anymore', or 'Delete the item'."""
-    global order_list
+    """Delete an item in the order."""
+    delete_item_from_db(item)
     print("Trying to remove:", item)
-    for global_item in order_list:
-        if global_item in item:
-            order_list.remove(global_item)
-            print("Item(s) removed:", global_item)
-    return {"message": "Item removed from order", "current_order": order_list}
+    return {"message": "Item excluded from order"}
 @tool
 def search_order():
     """Search an order with items."""
-    global order_list
-    print("Current items in order:", order_list)
-    return {"message": "Current order details", "current_order": order_list}
+    print("Item(s):", get_all_items_from_db())
+    return {"message": "That's it!"}
 @tool
 def order_cost():
-    """This service provides the total cost of the order, summarizing the items.
-    If the customer asks 'give me the bill', 'summarize the order', 'what is the total', or 'how much is it'."""
-    global order_list
+    """This service gives the total of the order."""
+    order_list = get_all_items_from_db()
     if not order_list:
         return {"message": "No items in the order"}
     total = len(order_list) * 10 # Assuming each item costs 10
     print("Total: $", total)
-    return {"total_cost": total, "order_items": order_list}
+    return {"message": total}
+@tool
+def delivery_address(postalCode: str, number: str = "", complement: str = "") -> str:
+    """Find the complete address of a postal code."""
+    full_address = "Paulista Avenue, 1000 - 01310-000 - Sao Paulo - SP"
+    print(full_address)
+    return str(full_address)
 # @tool
 # def delivery_address(postalCode: str, number: str = "", complement: str = "") -> str:
# @tool # @tool
# def delivery_address(postalCode: str, number: str = "", complement: str = "") -> str: # def delivery_address(postalCode: str, number: str = "", complement: str = "") -> str:
@@ -90,7 +133,7 @@ def order_cost():
 # number is the number of the building and complement is the apartment or other complement for the address. always confirm the address
 # and the total cost of order."""
 #
-# url = f"https://xxxxxxxxxxxxxxxxxx.apigateway.us-ashburn-1.oci.customer-oci.com/cep/cep?cep={postalCode}"
+# url = f"https://xxxxxxxxxxxxxxxxxxxxx.apigateway.us-ashburn-1.oci.customer-oci.com/cep/cep?cep={postalCode}"
 # response = get_rest_service_auth(url)
 #
 # address = response["frase"]
@@ -98,34 +141,23 @@ def order_cost():
 # print(full_address)
 # return str(full_address)
-@tool
-def delivery_address(postalCode: str, number: str = "", complement: str = "") -> str:
-    """Find the complete address of a postal code to delivery, along with the building number and complement.
-    The customer can ask for 'delivery to' or 'my address is'. postalCode normally is the postal code or CEP,
-    number is the number of the building and complement is the apartment or other complement for the address. always confirm the address
-    and the total cost of order."""
-    full_address = "Paulista Avenue, 1000 - 01310-000 - Sao Paulo - SP"
-    print(full_address)
-    return str(full_address)
 #--------------------------------------------------------------------------
 tools = [insert_order, order_cost, search_order, delivery_address, delete_order]
 # PROMPT AND CONTEXT
 prompt = ChatPromptTemplate.from_messages(
     [
         ("system", """You are an assistant that helps customers place orders at a restaurant.
-After a customer adds an item to the order, always inform them of the total.
-If the customer provides a postal code (ZIP), use the find_address tool to get the complete address.
-The customer can check their order at any time. They may request delivery by saying 'deliver to' or 'my address is' followed by the postal code, ZIP code, or street name."""),
-        ("placeholder", "{chat_history}"),
-        ("human", "{input}"),
-        ("placeholder", "{agent_scratchpad}"),
+The customer can add an item into the order using the 'insert_order' service. The item is understood as a request input by the customer.
+Every time an item is added using 'insert_order', immediately call 'order_cost' to show the total.
+Every time the customer asks for delivery or gives the postal code (ZIP), always use the 'delivery_address' service to search for the postal code and
+give the complete address.
+Every time the customer asks to check or view their order details, always call the 'search_order' service.
+Every time the customer asks to check their order price (cost order) like 'how much is it?' or 'what is the cost?', always call the 'order_cost' service.
+Every time the customer asks to delete an item, always call the 'delete_order' service.
+"""),
+        ("placeholder", "{messages}"),
     ]
 )
@@ -135,29 +167,35 @@ prompt = ChatPromptTemplate.from_messages(
 llm = ChatOCIGenAI(
     model_id="cohere.command-r-08-2024",
     service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
-    compartment_id="ocid1.compartment.oc1..aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
+    compartment_id="ocid1.compartment.oc1..aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
     auth_profile="DEFAULT", # replace with your profile name
     model_kwargs={"temperature": 0.1, "top_p": 0.75, "max_tokens": 2000}
 )
-agent = create_tool_calling_agent(llm, tools, prompt)
-agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=False)
+memory = MemorySaver()
+langgraph_agent_executor = create_react_agent(
+    model=llm, tools=tools, prompt=prompt, checkpointer=memory
+)
+config = {"configurable": {"thread_id": "test-thread"}}
 #--------------------------------------------------------------------------
 # CHAT
 print("READY")
+create_orders_table() # Create the orders table
 while (True):
     try:
         query = input()
         if query == "quit":
             break
-        response = agent_executor.invoke({
-            "input": query
-        })
+        if query == "":
+            continue
+        messages = langgraph_agent_executor.invoke(
+            {"messages": [("human", query)]}, config
+        )["messages"]
+        response = messages[-1].content
         print(response)
-    except:
-        print("Invalid Command")
+    except Exception as ex:
+        None
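The persistence layer introduced by this commit can be exercised without the agent. The sketch below runs the same SQL statements against an in-memory database (`:memory:` substituted for the commit's `orders.db`, so nothing touches disk) and applies the same flat pricing rule as order_cost:

```python
import sqlite3

# Same schema and statements as connect_db/create_orders_table and friends,
# but against an in-memory database so nothing is written to disk.
conn = sqlite3.connect(":memory:")
cursor = conn.cursor()
cursor.execute("""
    CREATE TABLE IF NOT EXISTS orders (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        item TEXT
    )
""")
cursor.execute("INSERT INTO orders (item) VALUES (?)", ("pizza",))
cursor.execute("INSERT INTO orders (item) VALUES (?)", ("soda",))
cursor.execute("DELETE FROM orders WHERE item = ?", ("soda",))
items = [row[0] for row in cursor.execute("SELECT item FROM orders").fetchall()]
total = len(items) * 10  # same pricing rule as order_cost
print(items, total)  # ['pizza'] 10
conn.close()
```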


@@ -2,3 +2,4 @@ langchain
 langchain_community
 langchain-core
 oci-cli
+langgraph