Getting Started

This guide walks through the current production path: Bedrock provider + ToolLoopAgent.

Installation

```sh
go get github.com/iamanishx/go-ai/agent
go get github.com/iamanishx/go-ai/provider
go get github.com/iamanishx/go-ai/provider/bedrock
```

Optional MCP module:

```sh
go get github.com/iamanishx/go-ai/mcp
```

1) Create a Bedrock Provider

```go
import "github.com/iamanishx/go-ai/provider/bedrock"

bedrockProvider := bedrock.Create(bedrock.BedrockProviderSettings{
	Region:  "us-east-1",
	Profile: "myprofile",
})
```

You can also authenticate with environment variables:

```sh
export AWS_REGION=us-east-1
export AWS_ACCESS_KEY_ID=your-key
export AWS_SECRET_ACCESS_KEY=your-secret
```

```go
bedrockProvider := bedrock.Create(bedrock.BedrockProviderSettings{
	Region: "us-east-1",
})
```

2) Define a Tool

```go
import (
	"fmt"

	"github.com/iamanishx/go-ai/provider"
)

weatherTool := provider.Tool{
	Name:        "get_weather",
	Description: "Get weather for a location",
	Parameters: map[string]interface{}{
		"type": "object",
		"properties": map[string]interface{}{
			"location": map[string]interface{}{"type": "string"},
		},
		"required": []string{"location"},
	},
	Execute: func(input map[string]interface{}) (string, error) {
		location, _ := input["location"].(string)
		return fmt.Sprintf("Weather in %s: Sunny", location), nil
	},
}
```
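The Parameters map is a JSON Schema object describing the tool's input. As a sanity check, you can marshal it with the standard library to see exactly what the model will be given (a minimal sketch using only `encoding/json`; no go-ai types are involved):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// weatherSchema returns the same schema used in weatherTool.Parameters above.
func weatherSchema() map[string]interface{} {
	return map[string]interface{}{
		"type": "object",
		"properties": map[string]interface{}{
			"location": map[string]interface{}{"type": "string"},
		},
		"required": []string{"location"},
	}
}

func main() {
	// Pretty-print the schema to verify its shape before handing it to a model.
	out, err := json.MarshalIndent(weatherSchema(), "", "  ")
	if err != nil {
		panic(err)
	}
	fmt.Println(string(out))
}
```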

3) Create a ToolLoopAgent

```go
import "github.com/iamanishx/go-ai/agent"

toolAgent := agent.CreateToolLoopAgent(agent.ToolLoopAgentSettings{
	Model:        bedrockProvider.Chat("anthropic.claude-3-sonnet-20240229-v1:0"),
	Tools:        []provider.Tool{weatherTool},
	ExecuteTools: true,
	MaxSteps:     10,
})
```
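Conceptually, a tool-loop agent alternates between model calls and tool execution until the model stops requesting tools or MaxSteps is reached. The control flow can be sketched in a self-contained way (the `step` type and scripted "model" below are illustrative stand-ins, not go-ai APIs):

```go
package main

import "fmt"

// step is a stand-in for one model response: either a tool request or final text.
type step struct {
	toolName string // non-empty means "call this tool"
	text     string // final answer when toolName is empty
}

// runLoop drives a scripted "model" until it returns text or maxSteps is hit.
func runLoop(script []step, tools map[string]func() string, maxSteps int) string {
	for i := 0; i < maxSteps && i < len(script); i++ {
		s := script[i]
		if s.toolName == "" {
			return s.text // the model produced a final answer
		}
		// Execute the requested tool; its result would be fed back to the model.
		_ = tools[s.toolName]()
	}
	return "" // maxSteps exhausted without a final answer
}

func main() {
	tools := map[string]func() string{
		"get_weather": func() string { return "Sunny" },
	}
	script := []step{
		{toolName: "get_weather"},
		{text: "It's sunny in San Francisco."},
	}
	fmt.Println(runLoop(script, tools, 10)) // prints "It's sunny in San Francisco."
}
```

With ExecuteTools set to false, the loop would instead surface tool calls to your code rather than running them itself.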

4) Generate Text

```go
import (
	"context"
	"fmt"
	"log"
)

ctx := context.Background()
result, err := toolAgent.Generate(ctx, agent.AgentCallOptions{
	Prompt: "What's the weather in San Francisco?",
	System: "You are a concise assistant.",
})
if err != nil {
	log.Fatal(err)
}
fmt.Println(result.Text)
```

5) Stream Responses

```go
stream, err := toolAgent.Stream(ctx, agent.AgentCallOptions{
	Prompt: "What's the weather in San Francisco?",
})
if err != nil {
	log.Fatal(err)
}
defer stream.Close()

for part := range stream.Part() {
	switch part.Type {
	case "text-delta":
		fmt.Print(part.Text)
	case "tool-call":
		fmt.Printf("Calling tool: %s\n", part.ToolName)
	case "error":
		fmt.Printf("Stream error: %v\n", part.Error)
	case "finish":
		fmt.Printf("Done: %s\n", part.FinishReason)
	}
}
```
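The consumer above is just a channel range over stream parts. To experiment with handling logic before wiring up a real model, you can drive the same switch from a locally built channel (the `part` struct here is a hypothetical stand-in mirroring only the fields the switch uses, not the go-ai type):

```go
package main

import (
	"fmt"
	"strings"
)

// part is an illustrative stand-in for a stream part, not the library's type.
type part struct {
	Type         string
	Text         string
	ToolName     string
	FinishReason string
}

// collectText accumulates text-delta parts and returns the assembled text.
func collectText(parts <-chan part) string {
	var b strings.Builder
	for p := range parts {
		switch p.Type {
		case "text-delta":
			b.WriteString(p.Text)
		case "tool-call":
			fmt.Printf("Calling tool: %s\n", p.ToolName)
		case "finish":
			fmt.Printf("Done: %s\n", p.FinishReason)
		}
	}
	return b.String()
}

func main() {
	ch := make(chan part, 4)
	ch <- part{Type: "tool-call", ToolName: "get_weather"}
	ch <- part{Type: "text-delta", Text: "Sunny in "}
	ch <- part{Type: "text-delta", Text: "San Francisco"}
	ch <- part{Type: "finish", FinishReason: "stop"}
	close(ch)
	fmt.Println(collectText(ch)) // prints "Sunny in San Francisco"
}
```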

Configuration

Agent Options

| Option | Type | Description |
| --- | --- | --- |
| `Model` | `provider.ChatModel` | The language model to use |
| `Tools` | `[]provider.Tool` | Available tools |
| `ExecuteTools` | `bool` | Whether to execute tools |
| `MaxSteps` | `int` | Maximum steps (default: 20) |
| `OnStart` | `func` | Called when the agent starts |
| `OnStepFinish` | `func` | Called after each step |
| `OnFinish` | `func` | Called when the agent finishes |

Provider Options

| Option | Type | Description |
| --- | --- | --- |
| `Region` | `string` | AWS region |
| `Profile` | `string` | AWS profile name |
| `AccessKeyID` | `string` | AWS access key |
| `SecretAccessKey` | `string` | AWS secret key |
| `SessionToken` | `string` | AWS session token |
| `APIKey` | `string` | Bearer token |
| `BaseURL` | `string` | Custom endpoint |

Next Steps