Quick Start
Get up and running with GenieBuilder in minutes.
1. Complete Onboarding
When you first launch GenieBuilder, the setup wizard will guide you through:
- Selecting a project workspace
- Configuring cloud AI providers (OpenAI, Anthropic, Gemini)
- Setting up local AI (Ollama) if desired
- Configuring MCP tools
You can skip any step and configure it later in Settings.
2. Open a Workspace
Select a project folder using the file explorer. GenieBuilder will:
- Show a .gitignore-aware file tree
- Allow you to open files in the built-in editor
- Track tasks in the right panel (if Backlog is configured)
3. Configure an AI Provider
Choose from cloud or local providers:
Cloud Providers:
- OpenAI — GPT-5.4 and other GPT models
- Anthropic — Claude 4.6 Opus or Sonnet
- Google — Gemini 3.1 Pro and Flash
Local Providers:
- Ollama — Run models locally for privacy
- LM Studio — GUI-based local model management
- llama.cpp — Direct integration with custom builds
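If you use a local provider, it helps to confirm the server is actually reachable before pointing GenieBuilder at it. The sketch below is a minimal reachability check, assuming Ollama's default HTTP port of 11434; the helper name `local_provider_running` is hypothetical, not part of GenieBuilder.

```python
import urllib.request
import urllib.error

def local_provider_running(url: str, timeout: float = 2.0) -> bool:
    """Return True if an HTTP server responds at the given URL."""
    try:
        with urllib.request.urlopen(url, timeout=timeout):
            return True
    except (urllib.error.URLError, OSError):
        # Connection refused, timeout, or DNS failure: treat as not running.
        return False

# Ollama listens on port 11434 by default.
if local_provider_running("http://127.0.0.1:11434"):
    print("Ollama is up")
else:
    print("Ollama is not running — try `ollama serve`")
```

The same check works for LM Studio or a llama.cpp server; just substitute the port those tools are configured to listen on.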
Add your API keys in Settings → Providers. Keys are stored securely using macOS Keychain.
4. Start a Chat
- Open the chat panel (bottom of screen)
- Click New Tab to start a conversation
- Select your provider and model
- Type your prompt and press ⌘+Enter to send
Each chat tab can use a different provider, letting you compare responses or use specialized models for different tasks.
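One way to picture the per-tab setup: each tab carries its own provider/model pair, independent of the others. This is an illustrative sketch only — the class and field names are hypothetical and do not reflect GenieBuilder's internals.

```python
from dataclasses import dataclass

@dataclass
class ChatTab:
    """One chat tab with its own provider/model pair (names illustrative)."""
    title: str
    provider: str
    model: str

# Two tabs pointed at different providers, e.g. to compare answers.
tabs = [
    ChatTab("Refactor help", provider="anthropic", model="claude-sonnet"),
    ChatTab("Quick lookup", provider="ollama", model="llama3"),
]

for tab in tabs:
    print(f"{tab.title}: {tab.provider}/{tab.model}")
```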
5. Try a Workflow
- Click the Workflow button in the header
- Select an example from the sidebar (try "One Prompt" first)
- Click Run to execute
- Watch the workflow progress in real time
Workflows can automate tasks like code review, bug fixes, or documentation generation.
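GenieBuilder's own workflow format isn't shown here, but conceptually a workflow is an ordered list of named steps, each reporting progress as it completes. The toy runner below is a hypothetical sketch of that idea — `run_workflow` and the step names are illustrative, not GenieBuilder APIs.

```python
from typing import Callable

def run_workflow(steps: list[tuple[str, Callable[[], str]]]) -> list[str]:
    """Run named steps in order, printing progress as each one starts."""
    results = []
    for name, step in steps:
        print(f"running: {name}")
        results.append(step())
    return results

# A toy two-step workflow standing in for e.g. "review, then summarize".
results = run_workflow([
    ("review", lambda: "no issues found"),
    ("summarize", lambda: "1 file checked"),
])
print(results)
```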
Next Steps
- Explore Workflow Examples to see what's possible
- Configure CLI Runners for specialized AI agents
- Learn about Task Management with Backlog integration