Use Edgee with LangChain to build AI applications with chains, agents, and RAG.
Edgee’s OpenAI-compatible API works seamlessly with LangChain, allowing you to leverage LangChain’s powerful features like chains, agents, memory, and RAG while maintaining control over your LLM infrastructure.
```python
from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage, SystemMessage
import os

# Initialize the LLM with the Edgee endpoint
llm = ChatOpenAI(
    base_url="https://api.edgee.ai/v1",
    api_key=os.getenv("API_KEY"),
    model="mistral-small",  # or any model available through Edgee
    timeout=30.0,
)

# Simple chat
messages = [
    SystemMessage(content="You are a helpful assistant."),
    HumanMessage(content="What is LangChain?"),
]
response = llm.invoke(messages)
print(response.content)
```
```bash
# Basic usage (uses mistral-small by default)
uv run langchain_script.py --message "Tell me a joke"

# With a custom model
uv run langchain_script.py --model "gpt-4" --message "Explain quantum computing"

# With a custom system prompt
uv run langchain_script.py \
  --model "mistral-small" \
  --message "Write a haiku" \
  --system "You are a creative poet"
```
```python
from langchain_openai import ChatOpenAI
import os

llm = ChatOpenAI(
    base_url="https://api.edgee.ai/v1",
    api_key=os.getenv("API_KEY"),
    model="mistral-small",
    streaming=True,
)

# Tokens are printed as they arrive instead of waiting for the full response
for chunk in llm.stream("Tell me a long story"):
    print(chunk.content, end="", flush=True)
```
You can add tags to your requests for analytics and filtering using the default_headers parameter:
```python
from langchain_openai import ChatOpenAI
import os

llm = ChatOpenAI(
    base_url="https://api.edgee.ai/v1",
    api_key=os.getenv("API_KEY"),
    model="mistral-small",
    default_headers={
        "x-edgee-tags": "production,user-123,langchain",
    },
)

# All requests from this client will include these tags
response = llm.invoke("What is LangChain?")
```
Tags are comma-separated strings in the header. They help you categorize and filter requests in Edgee’s analytics dashboard.