What are the system requirements?
Char requires macOS 12.0 or later with an Apple Silicon (M1/M2/M3) or Intel processor. We recommend at least 8GB of RAM for optimal performance with local AI features.
How much storage space does Char need?
The app itself is about 200MB. Recording storage depends on your usage: a 1-hour meeting uses approximately 50-100MB. We recommend keeping at least 5GB of free space.
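As a rough back-of-the-envelope check, the figures above translate into recording capacity like this (the helper below is purely illustrative; the 100MB/hour figure is the upper bound quoted above):

```python
def recording_hours_available(free_space_gb: float, mb_per_hour: float = 100.0) -> float:
    """Estimate how many hours of recordings fit in the given free space.

    Uses the upper bound of ~100MB per recorded hour, so the result
    is a conservative estimate.
    """
    return (free_space_gb * 1000) / mb_per_hour

# With the recommended 5GB free, that is roughly 50 hours of meetings:
print(round(recording_hours_available(5.0)))  # → 50
```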
Does Char work offline?
Yes! Since Char uses local AI, it works completely offline. You don't need an internet connection to record, transcribe, or generate summaries. See Local Models for available local STT models and Local LLM Setup for configuring local AI.
How does Char route cloud LLM requests?
When using cloud AI (Char Pro), LLM requests are routed through OpenRouter. OpenRouter acts as a unified gateway to multiple model providers (OpenAI, Anthropic, Google, and others), so Char can switch between models without requiring separate API keys for each provider. A single OPENROUTER_API_KEY is all that's needed for cloud LLM access.
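To illustrate why a single key suffices, here is a minimal sketch of a request against OpenRouter's OpenAI-compatible chat completions endpoint. The model slug and prompt are placeholders, not Char's actual values, and the network call itself is left out:

```python
import json
import os
import urllib.request

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a request for OpenRouter's OpenAI-compatible endpoint.

    One OPENROUTER_API_KEY covers every upstream provider; switching
    providers only means changing the `model` slug (e.g. from an
    "openai/..." model to an "anthropic/..." one).
    """
    payload = {
        "model": model,  # provider/model slug, e.g. "anthropic/claude-3.5-sonnet"
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        OPENROUTER_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {os.environ.get('OPENROUTER_API_KEY', '')}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("anthropic/claude-3.5-sonnet", "Summarize this meeting.")
# urllib.request.urlopen(req) would send it; omitted here to avoid a live call.
```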
How does in-app search work?
Char uses Tantivy, a Rust full-text search engine, to index and search your notes locally. Documents are indexed in the background as you create and edit sessions. The search supports filtering by type (meeting notes, people, organizations) and date range, with results ranked by relevance.
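Tantivy's actual API is Rust; the toy Python sketch below only illustrates the indexing-and-filtering model described above (terms are indexed per document, and queries combine term matching with type and date filters). Field names are hypothetical, and relevance ranking is omitted:

```python
from datetime import date

class ToyIndex:
    """Toy inverted index: not Tantivy, just the same basic shape."""

    def __init__(self):
        self.docs = {}      # doc_id -> {"type", "date", "text"}
        self.postings = {}  # term -> set of doc_ids containing it

    def add(self, doc_id, doc_type, doc_date, text):
        self.docs[doc_id] = {"type": doc_type, "date": doc_date, "text": text}
        for term in text.lower().split():
            self.postings.setdefault(term, set()).add(doc_id)

    def search(self, query, doc_type=None, after=None):
        terms = query.lower().split()
        # Documents matching every query term...
        hits = set.intersection(*(self.postings.get(t, set()) for t in terms))
        # ...narrowed by the optional type and date-range filters.
        return sorted(
            d for d in hits
            if (doc_type is None or self.docs[d]["type"] == doc_type)
            and (after is None or self.docs[d]["date"] >= after)
        )

idx = ToyIndex()
idx.add("s1", "meeting", date(2024, 5, 1), "quarterly roadmap review")
idx.add("s2", "person", date(2024, 6, 2), "roadmap discussion with Dana")
print(idx.search("roadmap", doc_type="meeting"))  # → ['s1']
```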
What is the support chat powered by?
The in-app support chat uses the Model Context Protocol (MCP) to connect to Char's support server. The MCP server provides tools for GitHub issue management (search_issues, create_issue, add_comment) and Stripe billing operations (list_subscriptions, create_billing_portal_session). It also serves MCP prompts that guide the chat's behavior. Destructive actions like creating issues require user confirmation before executing.
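The confirmation gate described above can be sketched as a small dispatcher: tools are tagged as destructive or read-only, and destructive ones only run after the user confirms. The tool names match those listed in the answer, but the classification and dispatcher are illustrative, not Char's implementation:

```python
# Hypothetical classification: which support-chat tools mutate state.
DESTRUCTIVE_TOOLS = {"create_issue", "add_comment"}
READ_ONLY_TOOLS = {"search_issues", "list_subscriptions", "create_billing_portal_session"}

def call_tool(name, args, confirm):
    """Run a support-chat tool, asking for confirmation when destructive.

    `confirm` is a callable returning True/False (e.g. a UI prompt).
    """
    if name in DESTRUCTIVE_TOOLS and not confirm(name, args):
        return {"status": "cancelled", "tool": name}
    # ...dispatch the call to the MCP server here...
    return {"status": "executed", "tool": name}

# A declined confirmation stops the action before it reaches the server:
print(call_tool("create_issue", {"title": "Crash on launch"}, confirm=lambda n, a: False))
# → {'status': 'cancelled', 'tool': 'create_issue'}
```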