Quick Start

Get up and running with GenieBuilder in minutes.

1. Complete Onboarding

When you first launch GenieBuilder, the setup wizard will guide you through:

  • Selecting a project workspace
  • Configuring cloud AI providers (OpenAI, Anthropic, Gemini)
  • Setting up local AI (Ollama) if desired
  • Configuring MCP tools

You can skip any step and configure it later in Settings.

2. Open a Workspace

Select a project folder using the file explorer. GenieBuilder will:

  • Show a .gitignore-aware file tree
  • Allow you to open files in the built-in editor
  • Track tasks in the right panel (if Backlog is configured)
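The ".gitignore-aware" part means anything matched by your ignore rules stays out of the file tree. This is not GenieBuilder's actual implementation, just a rough sketch of the idea using only Python's standard library (it handles simple patterns only — no negation or directory-only rules):

```python
import fnmatch
import os

def load_ignore_patterns(root):
    """Read simple .gitignore-style patterns (no negation or dir-only rules)."""
    path = os.path.join(root, ".gitignore")
    if not os.path.exists(path):
        return []
    with open(path) as f:
        return [ln.strip() for ln in f
                if ln.strip() and not ln.startswith("#")]

def visible_files(root):
    """Yield paths under root relative to it, skipping ignored entries."""
    patterns = load_ignore_patterns(root)
    for dirpath, dirnames, filenames in os.walk(root):
        rel = os.path.relpath(dirpath, root)
        def ignored(name):
            p = name if rel == "." else os.path.join(rel, name)
            return any(fnmatch.fnmatch(p, pat) or fnmatch.fnmatch(name, pat)
                       for pat in patterns)
        # Pruning dirnames in place stops os.walk from descending into them.
        dirnames[:] = [d for d in dirnames if not ignored(d)]
        for f in filenames:
            if not ignored(f):
                yield f if rel == "." else os.path.join(rel, f)
```

With a `.gitignore` containing `*.log` and `node_modules`, `visible_files(".")` skips every log file and the whole `node_modules` tree.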

3. Configure an AI Provider

Choose from cloud or local providers:

Cloud Providers:

  • OpenAI — GPT-5.4 and other GPT models
  • Anthropic — Claude 4.6 Opus or Sonnet
  • Google — Gemini 3.1 Pro and Flash

Local Providers:

  • Ollama — Run models locally for privacy
  • LM Studio — GUI-based local model management
  • llama.cpp — Direct integration with custom builds
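Local providers expose an HTTP API that apps like GenieBuilder talk to. As an illustration (not GenieBuilder's internals), here is a minimal standard-library call against Ollama's default endpoint; the model name is a placeholder for one you have pulled:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint

def build_request(model, prompt):
    """Construct the JSON body for a non-streaming /api/generate call."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(model, prompt):
    """Send a prompt to a locally running Ollama server; return the reply text."""
    body = json.dumps(build_request(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]
```

With `ollama serve` running and a model available (e.g. after `ollama pull llama3`), `ask_ollama("llama3", "Say hello")` returns the model's reply as a string.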

Add your API keys in Settings → Providers. Keys are stored securely in the macOS Keychain.
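GenieBuilder handles Keychain storage for you, but if you're curious what that amounts to, macOS exposes the same store through the `security` command-line tool. A sketch of the equivalent commands (the service and account names here are illustrative, not GenieBuilder's actual naming):

```python
def keychain_store_cmd(service, account, secret):
    """Build the macOS `security` command that stores a generic password.
    The -U flag updates the item if it already exists."""
    return ["security", "add-generic-password",
            "-s", service, "-a", account, "-w", secret, "-U"]

def keychain_read_cmd(service, account):
    """Build the command that prints the stored secret to stdout."""
    return ["security", "find-generic-password",
            "-s", service, "-a", account, "-w"]
```

On macOS, running the first command (e.g. via `subprocess.run(..., check=True)`) writes the item and the second reads it back; the point is simply that secrets live in the Keychain rather than in a plaintext config file.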

4. Start a Chat

  1. Open the chat panel (bottom of screen)
  2. Click New Tab to start a conversation
  3. Select your provider and model
  4. Type your prompt and press ⌘+Enter to send

Each chat tab can use a different provider, letting you compare responses or use specialized models for different tasks.

5. Try a Workflow

  1. Click the Workflow button in the header
  2. Select an example from the sidebar (try "One Prompt" first)
  3. Click Run to execute
  4. Watch the workflow progress in real time

Workflows can automate tasks like code review, bug fixes, or documentation generation.
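GenieBuilder's workflow format isn't covered here, but conceptually a workflow like the ones above is an ordered list of prompt steps, each fed the previous step's output. A toy sketch of that idea (the step names and the `ask` callable are hypothetical, not GenieBuilder APIs):

```python
def run_workflow(steps, ask):
    """Run prompt steps in order, feeding each step the previous output.
    `ask` is any callable that sends a prompt to a model and returns text."""
    output = ""
    for name, template in steps:
        prompt = template.format(previous=output)
        output = ask(prompt)
        print(f"[{name}] done")  # progress, akin to the real-time view
    return output

# Toy two-step chain in the spirit of a code-review workflow:
steps = [
    ("review", "Review this code for bugs:\n{previous}"),
    ("summarize", "Summarize the findings above:\n{previous}"),
]
```

Calling `run_workflow(steps, ask=my_model_call)` runs the chain end to end and returns the final step's output.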

Next Steps