A minimalist workflow template for building LLM applications with Flyt, a Go-based workflow framework with zero dependencies.
This template provides a starting point for building LLM-powered applications using Flyt's graph-based workflow system. It includes:
- 📊 Flow-based Architecture: Model your LLM workflows as directed graphs
- 🔄 Reusable Nodes: Build modular components that handle specific tasks
- 🛡️ Error Handling: Built-in retry logic and fallback mechanisms
- 🚀 Zero Dependencies: Pure Go implementation for maximum portability
flyt-project-template/
├── README.md # This file
├── flow.go # Flow definition and connections
├── main.go # Application entry point
├── nodes.go # Node implementations
├── go.mod # Go module definition
├── docs/
│ └── design.md # Design documentation
└── utils/
├── llm.go # LLM integration utilities
└── helpers.go # General helper functions
- Go 1.21 or later
- OpenAI API key (or other LLM provider)
- Clone this template:
git clone <your-repo-url>
cd flyt-project-template
- Install dependencies:
go mod tidy
- Set your API key:
export OPENAI_API_KEY="your-api-key-here"
- Run the example:
go run .
Nodes are the building blocks of your workflow. Each node has three phases:
- Prep - Read from shared store and prepare data
- Exec - Execute main logic (can be retried)
- Post - Process results and decide next action
node := flyt.NewNode(
flyt.WithPrepFunc(func(ctx context.Context, shared *flyt.SharedStore) (any, error) {
// Prepare data
return data, nil
}),
flyt.WithExecFunc(func(ctx context.Context, prepResult any) (any, error) {
// Execute logic
return result, nil
}),
flyt.WithPostFunc(func(ctx context.Context, shared *flyt.SharedStore, prepResult, execResult any) (flyt.Action, error) {
// Store results and return next action
return flyt.DefaultAction, nil
}),
)
Flows connect nodes to create workflows:
flow := flyt.NewFlow(startNode)
flow.Connect(startNode, "success", processNode)
flow.Connect(startNode, "error", errorNode)
flow.Connect(processNode, flyt.DefaultAction, endNode)
Thread-safe data sharing between nodes:
shared := flyt.NewSharedStore()
shared.Set("input", "Hello, Flyt!")
value, ok := shared.Get("input")
// Create nodes
questionNode := CreateQuestionNode()
answerNode := CreateAnswerNode(apiKey)
// Connect nodes
flow := flyt.NewFlow(questionNode)
flow.Connect(questionNode, flyt.DefaultAction, answerNode)
// Run flow
shared := flyt.NewSharedStore()
err := flow.Run(context.Background(), shared)
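`CreateQuestionNode` and `CreateAnswerNode` are defined in `nodes.go` and are yours to change. As a rough sketch (not the template's actual code), the question node could read a line from stdin in Exec and publish it under a `question` key in Post:
func CreateQuestionNode() flyt.Node {
    return flyt.NewNode(
        flyt.WithExecFunc(func(ctx context.Context, prepResult any) (any, error) {
            // Prompt the user and read a single line from stdin.
            fmt.Print("Question: ")
            line, err := bufio.NewReader(os.Stdin).ReadString('\n')
            if err != nil {
                return nil, err
            }
            return strings.TrimSpace(line), nil
        }),
        flyt.WithPostFunc(func(ctx context.Context, shared *flyt.SharedStore, prepResult, execResult any) (flyt.Action, error) {
            // Make the question available to the answer node.
            shared.Set("question", execResult)
            return flyt.DefaultAction, nil
        }),
    )
}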
// Create nodes with conditional routing
decideNode := CreateDecisionNode()
searchNode := CreateSearchNode()
answerNode := CreateAnswerNode()
// Build flow with branching
flow := flyt.NewFlow(decideNode)
flow.Connect(decideNode, "search", searchNode)
flow.Connect(decideNode, "answer", answerNode)
flow.Connect(searchNode, "decide", decideNode) // Loop back
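Which branch runs is decided by the Action returned from the deciding node's Post phase. A minimal sketch, assuming `flyt.Action` is string-like (as the string labels passed to `Connect` suggest) and using a hypothetical `needsMoreContext` helper:
func CreateDecisionNode() flyt.Node {
    return flyt.NewNode(
        flyt.WithExecFunc(func(ctx context.Context, prepResult any) (any, error) {
            // needsMoreContext is a hypothetical helper that asks the LLM
            // (or applies a heuristic) whether a web search is needed first.
            return needsMoreContext(ctx, prepResult), nil
        }),
        flyt.WithPostFunc(func(ctx context.Context, shared *flyt.SharedStore, prepResult, execResult any) (flyt.Action, error) {
            // The action returned here selects which Connect edge is followed.
            if needsSearch, _ := execResult.(bool); needsSearch {
                return "search", nil
            }
            return "answer", nil
        }),
    )
}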
Process multiple items concurrently:
processFunc := func(ctx context.Context, item any) (any, error) {
// Process each item
return processItem(item), nil
}
batchNode := flyt.NewBatchNode(processFunc, true) // true for concurrent
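For example, each item could be a document to summarize; `CallLLM` here is a hypothetical helper wrapping your LLM client:
summarize := func(ctx context.Context, item any) (any, error) {
    doc, _ := item.(string)
    // CallLLM is a hypothetical helper wrapping your LLM client.
    return CallLLM(ctx, "Summarize in one sentence:\n\n"+doc)
}
batchNode := flyt.NewBatchNode(summarize, true) // true: process items concurrently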
Add retry logic to handle transient failures:
node := flyt.NewNode(
flyt.WithExecFunc(func(ctx context.Context, prepResult any) (any, error) {
return callFlakyAPI()
}),
flyt.WithMaxRetries(3),
flyt.WithWait(time.Second),
)
- Create a new node in `nodes.go` (a fuller example follows these two steps):
func CreateMyCustomNode() flyt.Node {
return flyt.NewNode(
// Your implementation
)
}
- Add it to your flow in `flow.go`:
customNode := CreateMyCustomNode()
flow.Connect(previousNode, "custom", customNode)
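Putting both steps together, a complete custom node might look like the following word-count node; the logic and the `input`/`word_count` keys are purely illustrative:
func CreateWordCountNode() flyt.Node {
    return flyt.NewNode(
        flyt.WithPrepFunc(func(ctx context.Context, shared *flyt.SharedStore) (any, error) {
            // Pull the text to analyze out of the shared store.
            text, _ := shared.Get("input")
            return text, nil
        }),
        flyt.WithExecFunc(func(ctx context.Context, prepResult any) (any, error) {
            // Count whitespace-separated words.
            s, _ := prepResult.(string)
            return len(strings.Fields(s)), nil
        }),
        flyt.WithPostFunc(func(ctx context.Context, shared *flyt.SharedStore, prepResult, execResult any) (flyt.Action, error) {
            // Store the count for downstream nodes and continue.
            shared.Set("word_count", execResult)
            return flyt.DefaultAction, nil
        }),
    )
}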
Modify `utils/llm.go` to support your preferred LLM provider (a minimal sketch follows the list):
- OpenAI
- Anthropic Claude
- Google Gemini
- Local models (Ollama, etc.)
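Since the template is pure Go with zero dependencies, one option is a small standard-library HTTP client. The sketch below assumes an OpenAI-compatible chat completions endpoint (the model name is just an example); switching providers is largely a matter of changing the URL, headers, and model:
package utils

import (
    "bytes"
    "context"
    "encoding/json"
    "fmt"
    "net/http"
    "os"
)

// CallLLM sends a single-turn prompt to an OpenAI-compatible
// chat completions endpoint and returns the reply text.
func CallLLM(ctx context.Context, prompt string) (string, error) {
    body, err := json.Marshal(map[string]any{
        "model": "gpt-4o-mini",
        "messages": []map[string]string{
            {"role": "user", "content": prompt},
        },
    })
    if err != nil {
        return "", err
    }

    req, err := http.NewRequestWithContext(ctx, http.MethodPost,
        "https://api.openai.com/v1/chat/completions", bytes.NewReader(body))
    if err != nil {
        return "", err
    }
    req.Header.Set("Authorization", "Bearer "+os.Getenv("OPENAI_API_KEY"))
    req.Header.Set("Content-Type", "application/json")

    resp, err := http.DefaultClient.Do(req)
    if err != nil {
        return "", err
    }
    defer resp.Body.Close()

    var out struct {
        Choices []struct {
            Message struct {
                Content string `json:"content"`
            } `json:"message"`
        } `json:"choices"`
    }
    if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
        return "", err
    }
    if len(out.Choices) == 0 {
        return "", fmt.Errorf("empty response from LLM")
    }
    return out.Choices[0].Message.Content, nil
}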
- Single Responsibility: Each node should do one thing well
- Idempotency: Nodes should be idempotent when possible
- Error Handling: Always handle errors appropriately
- Context Awareness: Respect context cancellation (see the sketch after this list)
- Logging: Add appropriate logging for debugging
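For context awareness in particular, long-running Exec functions can check for cancellation before doing expensive work; `callSlowAPI` below is a hypothetical call that honors the context:
flyt.WithExecFunc(func(ctx context.Context, prepResult any) (any, error) {
    select {
    case <-ctx.Done():
        // The flow was cancelled or timed out; give up early.
        return nil, ctx.Err()
    default:
    }
    // callSlowAPI is a hypothetical long-running call that accepts ctx.
    return callSlowAPI(ctx, prepResult)
})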
Check out the Flyt cookbook for more examples:
- Agent - AI agent with web search
- Chat - Interactive chat application
- MCP - Model Context Protocol integration
- Summarize - Text summarization with retries
- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This template is MIT licensed. See LICENSE file for details.