## Get Your API Key

1. Log in to Bud AI Foundry.
2. Navigate to **API Keys & Security**.
3. Click **+ Create API Key**.
4. Name it "Quickstart" and click **Create**.
5. Copy the key (it starts with `bud_`).

Save your API key securely: it won't be shown again after creation.
## Python SDK

### Install

```bash
pip install git+https://github.com/BudEcosystem/BudAIFoundry-SDK
```
### Your First Request

```python
from budai import BudClient

# Initialize the client
client = BudClient(api_key="bud_your_key_here")

# Make an inference request
response = client.chat.completions.create(
    model="llama-3.2-1b",
    messages=[
        {"role": "user", "content": "Explain quantum computing in one sentence"}
    ],
)

print(response.choices[0].message.content)
```
Output:

```text
Quantum computing uses quantum mechanical phenomena like superposition and
entanglement to perform calculations exponentially faster than classical computers
for certain problems.
```
### Run a Pipeline

```python
# Create a deployment pipeline
pipeline = client.pipelines.create(
    name="Deploy Llama",
    definition={
        "steps": [
            {
                "id": "add_model",
                "action": "model_add",
                "params": {
                    "model_uri": "meta-llama/Llama-3.2-1B-Instruct",
                    "model_source": "hugging_face",
                },
            },
            {
                "id": "deploy",
                "action": "deployment_create",
                "params": {
                    "model_id": "{{ steps.add_model.output.model_id }}",
                    "cluster_id": "cluster_prod",
                },
                "depends_on": ["add_model"],
            },
        ]
    },
)

# Execute the pipeline
execution = client.executions.create(pipeline_id=pipeline.id)
print(f"Deployment created: {execution.outputs['deployment_id']}")
```
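The `{{ steps.add_model.output.model_id }}` placeholder is resolved at execution time from the outputs of earlier steps. The actual substitution happens server-side, but a minimal sketch of how this style of templating typically works (the `resolve_templates` helper and the `mdl_8f2a` output value are illustrative, not part of the SDK):

```python
import re

def resolve_templates(value, context):
    """Replace {{ steps.<id>.output.<key> }} placeholders using step outputs."""
    pattern = re.compile(r"\{\{\s*steps\.(\w+)\.output\.(\w+)\s*\}\}")

    def substitute(match):
        step_id, key = match.group(1), match.group(2)
        return str(context[step_id][key])

    return pattern.sub(substitute, value)

# Simulated output from the completed add_model step
step_outputs = {"add_model": {"model_id": "mdl_8f2a"}}

params = {
    "model_id": "{{ steps.add_model.output.model_id }}",
    "cluster_id": "cluster_prod",
}
resolved = {key: resolve_templates(val, step_outputs) for key, val in params.items()}
print(resolved["model_id"])  # mdl_8f2a
```

This is also why `deploy` declares `"depends_on": ["add_model"]`: the placeholder can only resolve after the upstream step has produced its output.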
## Gateway API (cURL)

### Direct API Call

```bash
curl https://gateway.bud.studio/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer bud_your_key_here" \
  -d '{
    "model": "llama-3.2-1b",
    "messages": [
      {"role": "user", "content": "What is machine learning?"}
    ]
  }'
```
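If you prefer not to shell out to cURL, the same request can be built with Python's standard library. This sketch constructs the request but stops short of sending it, so it stays offline (substitute your real key before calling `urlopen`):

```python
import json
import urllib.request

payload = {
    "model": "llama-3.2-1b",
    "messages": [{"role": "user", "content": "What is machine learning?"}],
}

req = urllib.request.Request(
    "https://gateway.bud.studio/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer bud_your_key_here",
    },
    method="POST",
)

# To actually send it: urllib.request.urlopen(req)
print(req.get_method())  # POST
```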
### Response

```json
{
  "id": "chatcmpl-abc123",
  "object": "chat.completion",
  "created": 1706554800,
  "model": "llama-3.2-1b",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "Machine learning is a subset of artificial intelligence..."
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 12,
    "completion_tokens": 45,
    "total_tokens": 57
  }
}
```
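Whichever client you use, the response body is plain JSON, so extracting the message text and token usage needs only the standard library. Using the example response above:

```python
import json

# The example response body from the Gateway API
raw = """
{
  "id": "chatcmpl-abc123",
  "object": "chat.completion",
  "created": 1706554800,
  "model": "llama-3.2-1b",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "Machine learning is a subset of artificial intelligence..."
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {"prompt_tokens": 12, "completion_tokens": 45, "total_tokens": 57}
}
"""

data = json.loads(raw)
content = data["choices"][0]["message"]["content"]
print(content)
print(f"tokens used: {data['usage']['total_tokens']}")  # tokens used: 57
```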
## Environment Setup

### Environment Variables

```bash
# Set your API key
export BUD_API_KEY="bud_your_key_here"

# Set a custom endpoint (optional)
export BUD_BASE_URL="https://gateway.bud.studio/v1"
```
### Configuration File

Create `~/.budai/config.yaml`:

```yaml
api_key: bud_your_key_here
base_url: https://gateway.bud.studio/v1
timeout: 30
```

The SDK loads this configuration automatically.
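The exact precedence is SDK-defined, but clients in this style typically resolve credentials as explicit argument, then environment variable, then config file. A stdlib-only sketch of that lookup order (the `resolve_api_key` helper is illustrative; the tiny parser handles only flat `key: value` lines, not full YAML):

```python
import os

def parse_flat_yaml(text):
    """Minimal parser for flat 'key: value' lines (not full YAML)."""
    config = {}
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#") and ":" in line:
            key, _, value = line.partition(":")
            config[key.strip()] = value.strip()
    return config

def resolve_api_key(explicit=None, env=None, config_text=""):
    """Explicit argument wins, then BUD_API_KEY, then the config file."""
    env = os.environ if env is None else env
    if explicit:
        return explicit
    if env.get("BUD_API_KEY"):
        return env["BUD_API_KEY"]
    return parse_flat_yaml(config_text).get("api_key")

config = "api_key: bud_from_file\ntimeout: 30"
print(resolve_api_key(env={}, config_text=config))                    # bud_from_file
print(resolve_api_key(env={"BUD_API_KEY": "bud_from_env"}))           # bud_from_env
print(resolve_api_key(explicit="bud_arg", env={"BUD_API_KEY": "x"}))  # bud_arg
```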
## Check Available Models

```python
# List all available models
models = client.models.list()
for model in models:
    print(f"{model.id}: {model.description}")
```
Output:

```text
llama-3.2-1b: Llama 3.2 1B Instruct
llama-3.2-3b: Llama 3.2 3B Instruct
gpt-4o: OpenAI GPT-4 Omni (cloud)
claude-3-5-sonnet: Anthropic Claude 3.5 Sonnet (cloud)
```
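The listing mixes locally served and cloud-proxied models. If you only want one group, filtering on the description is straightforward; this sketch uses the example output above as plain data rather than a live `client.models.list()` call:

```python
# (id, description) pairs matching the example listing
models = [
    ("llama-3.2-1b", "Llama 3.2 1B Instruct"),
    ("llama-3.2-3b", "Llama 3.2 3B Instruct"),
    ("gpt-4o", "OpenAI GPT-4 Omni (cloud)"),
    ("claude-3-5-sonnet", "Anthropic Claude 3.5 Sonnet (cloud)"),
]

# Keep only models without the "(cloud)" marker
local = [model_id for model_id, desc in models if "(cloud)" not in desc]
print(local)  # ['llama-3.2-1b', 'llama-3.2-3b']
```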
## Error Handling

```python
from budai.exceptions import BudAPIError

try:
    response = client.chat.completions.create(
        model="nonexistent-model",
        messages=[{"role": "user", "content": "Hello"}],
    )
except BudAPIError as e:
    print(f"Error: {e.status_code} - {e.message}")
```
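Transient failures such as rate limits or network blips are often worth retrying with backoff. A generic sketch of that pattern; it uses a stand-in `TransientError` and a local `with_retries` helper because the SDK's own retry behavior is not documented here:

```python
import time

class TransientError(Exception):
    """Stand-in for a retryable API error (e.g. HTTP 429 or 503)."""

def with_retries(call, attempts=3, base_delay=0.01):
    """Retry `call` with exponential backoff; re-raise after the last attempt."""
    for attempt in range(attempts):
        try:
            return call()
        except TransientError:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

# Simulated endpoint that fails twice, then succeeds
state = {"calls": 0}

def flaky():
    state["calls"] += 1
    if state["calls"] < 3:
        raise TransientError("try again")
    return "ok"

print(with_retries(flaky))  # ok
print(state["calls"])       # 3
```

In real code, the `except` clause would match whichever `BudAPIError` status codes you consider retryable rather than a blanket exception type.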
## Next Steps

- **Python SDK Guide**: deep dive into Python SDK features
- **Gateway API**: REST API reference and examples
- **Authentication**: secure your API access
- **CLI Tools**: command-line interface guide