# Configuration

micasa has minimal configuration – it’s designed to work out of the box.

## CLI

Every command, flag, and subcommand is documented on the CLI Reference page. That page is generated directly from the cobra command tree, so it never drifts from the binary.

`config get` accepts an optional `jq` filter expression. With no filter (or the identity filter `.`), the entire resolved configuration is printed as JSON. API keys are stripped from the output to avoid accidentally leaking secrets.

```shell
micasa config get                      # full config (identity)
micasa config get .chat.llm.model      # current chat model name
micasa config get .extraction.llm      # extraction section
micasa config get '.chat.llm | keys'   # list keys in a section
```

To persist demo data for later:

```shell
micasa demo /tmp/my-demo.db   # creates and populates
micasa /tmp/my-demo.db        # reopens with the demo data
```

`backup` creates a consistent snapshot using SQLite's Online Backup API, safe to run while the TUI is open:

```shell
micasa backup ~/backups/micasa-$(date +%F).db
micasa backup --source /path/to/micasa.db ~/backups/snapshot.db
```

## Platform data directory

micasa uses platform-aware data directories (via `adrg/xdg`). When no path is specified (via argument or `MICASA_DB_PATH`), the database is stored at:

| Platform | Default path |
|---|---|
| Linux | `$XDG_DATA_HOME/micasa/micasa.db` (default `~/.local/share/micasa/micasa.db`) |
| macOS | `~/Library/Application Support/micasa/micasa.db` |
| Windows | `%LOCALAPPDATA%\micasa\micasa.db` |

On Linux, `XDG_DATA_HOME` is respected per the XDG Base Directory Specification.

## Database path resolution order

The database path is resolved in this order:

  1. Positional CLI argument, if provided
  2. `MICASA_DB_PATH` environment variable, if set
  3. Platform data directory (see table above)

The `demo` subcommand uses an in-memory database (`:memory:`) when no path argument is given.
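The resolution order above can be sketched as a small function. This is illustrative only — `resolveDBPath` and its arguments are hypothetical names, not micasa's actual API, and `dataDir` stands in for the platform data directory obtained from `adrg/xdg`:

```go
package main

import (
	"fmt"
	"path/filepath"
)

// resolveDBPath mirrors the documented resolution order.
// All names here are illustrative, not micasa's real API.
func resolveDBPath(cliArg, envVar, dataDir string) string {
	if cliArg != "" {
		return cliArg // 1. positional CLI argument
	}
	if envVar != "" {
		return envVar // 2. MICASA_DB_PATH
	}
	return filepath.Join(dataDir, "micasa.db") // 3. platform default
}

func main() {
	// A positional argument wins over everything else.
	fmt.Println(resolveDBPath("/tmp/my.db", "/env.db", "/home/u/.local/share/micasa"))
	// With nothing set, the platform data directory is used.
	fmt.Println(resolveDBPath("", "", "/home/u/.local/share/micasa"))
}
```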

## Config file

micasa reads a TOML config file from your platform's config directory:

| Platform | Default path |
|---|---|
| Linux | `$XDG_CONFIG_HOME/micasa/config.toml` (default `~/.config/micasa/config.toml`) |
| macOS | `~/Library/Application Support/micasa/config.toml` |
| Windows | `%APPDATA%\micasa\config.toml` |

The config file is optional. If it doesn't exist, all settings use their defaults. Unset fields fall back to defaults – you only need to specify the values you want to change.
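Per-field fallback can be pictured as a defaulting pass after the TOML is decoded. The sketch below is hypothetical (the struct covers only two `[chat.llm]` fields, and `withDefaults` is not micasa's real code); the defaults themselves match the documented ones:

```go
package main

import "fmt"

// chatLLM holds a subset of [chat.llm] fields for illustration.
type chatLLM struct {
	BaseURL string
	Model   string
}

// withDefaults is a hypothetical helper: any field left unset in
// config.toml keeps its built-in default.
func withDefaults(c chatLLM) chatLLM {
	if c.BaseURL == "" {
		c.BaseURL = "http://localhost:11434"
	}
	if c.Model == "" {
		c.Model = "qwen3"
	}
	return c
}

func main() {
	// A config that sets only the model: base_url falls back to its default.
	fmt.Println(withDefaults(chatLLM{Model: "llama3.2"}))
}
```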

### Example config

```toml
# micasa configuration
# Each section is self-contained. No section's values affect another section.

[chat]
# Set to false to hide the chat feature from the UI.
# enable = true

[chat.llm]
# LLM connection settings for the chat (NL-to-SQL) pipeline.
# provider = "ollama"
base_url = "http://localhost:11434"
model = "qwen3"
# api_key = ""
# timeout = "5m"
# effort = "medium"
# extra_context = "My house is a 1920s craftsman in Portland, OR."

[extraction]
# max_pages = 0

[extraction.llm]
# LLM connection settings for document extraction.
# enable = true
# provider = "ollama"
model = "qwen3"
# timeout = "5m"
# effort = "low"

[extraction.ocr]
# enable = true

[extraction.ocr.tsv]
# enable = true
# confidence_threshold = 70

[documents]
# max_file_size = "50 MiB"
# cache_ttl = "30d"

[locale]
# currency = "USD"
```

### `[chat]` section

Controls the chat (NL-to-SQL) feature and its LLM settings.

| Key | Env var | Type | Default | Description |
|---|---|---|---|---|
| `enable` | `MICASA_CHAT_ENABLE` | bool | `true` | Set to `false` to hide the chat feature from the UI. |

### `[chat.llm]` section

LLM connection settings for the chat pipeline. Each field has its own default; no values are inherited from other config sections.

| Key | Env var | Type | Default | Description |
|---|---|---|---|---|
| `provider` | `MICASA_CHAT_LLM_PROVIDER` | string | `ollama` | LLM provider. Supported: `ollama`, `anthropic`, `openai`, `openrouter`, `deepseek`, `gemini`, `groq`, `mistral`, `llamacpp`, `llamafile`. Auto-detected from `base_url` and `api_key` when not set. |
| `base_url` | `MICASA_CHAT_LLM_BASE_URL` | string | `http://localhost:11434` | Root URL of the provider's API. No `/v1` suffix needed. |
| `model` | `MICASA_CHAT_LLM_MODEL` | string | `qwen3` | Model identifier sent in chat requests. |
| `api_key` | `MICASA_CHAT_LLM_API_KEY` | string | (empty) | Authentication credential. Required for cloud providers. Leave empty for local servers. |
| `timeout` | `MICASA_CHAT_LLM_TIMEOUT` | string | `"5m"` | Inference timeout for chat responses (including streaming). Go duration syntax, e.g. `"10m"`. |
| `effort` | `MICASA_CHAT_LLM_EFFORT` | string | (unset) | Model reasoning effort level. Supported: `none`, `low`, `medium`, `high`, `auto`. Empty = server default. Replaces `thinking` (`MICASA_CHAT_LLM_THINKING`). |
| `extra_context` | `MICASA_CHAT_LLM_EXTRA_CONTEXT` | string | (empty) | Custom text appended to chat system prompts. Useful for domain-specific details about your house. Currency is handled automatically via `[locale]`. |

### `[extraction.llm]` section

LLM connection settings for the document extraction pipeline. Fully independent from `[chat.llm]` – each pipeline has its own provider, model, and credentials.

| Key | Env var | Type | Default | Description |
|---|---|---|---|---|
| `enable` | `MICASA_EXTRACTION_LLM_ENABLE` | bool | `true` | Set to `false` to disable LLM-powered structured extraction. OCR and pdftotext still run. |
| `provider` | `MICASA_EXTRACTION_LLM_PROVIDER` | string | `ollama` | LLM provider for extraction. Same options as `[chat.llm]`. |
| `base_url` | `MICASA_EXTRACTION_LLM_BASE_URL` | string | `http://localhost:11434` | API base URL for extraction. |
| `model` | `MICASA_EXTRACTION_LLM_MODEL` | string | `qwen3` | Model for extraction. Extraction works well with small, fast models optimized for structured JSON output. |
| `api_key` | `MICASA_EXTRACTION_LLM_API_KEY` | string | (empty) | Authentication credential for extraction. |
| `timeout` | `MICASA_EXTRACTION_LLM_TIMEOUT` | string | `"5m"` | Extraction inference timeout. |
| `effort` | `MICASA_EXTRACTION_LLM_EFFORT` | string | (unset) | Reasoning effort level for extraction. Replaces `thinking` (`MICASA_EXTRACTION_LLM_THINKING`). |

### `[documents]` section

Document attachment limits and caching.

| Key | Env var | Type | Default | Description |
|---|---|---|---|---|
| `max_file_size` | `MICASA_DOCUMENTS_MAX_FILE_SIZE` | string or integer | `"50 MiB"` | Maximum file size for document imports. Accepts unitized strings (`"50 MiB"`, `"1.5 GiB"`) or bare integers (bytes). Must be positive. |
| `cache_ttl` | `MICASA_DOCUMENTS_CACHE_TTL` | string or integer | `"30d"` | Cache lifetime for extracted documents. Accepts `"30d"`, `"720h"`, or bare integers (seconds). Set to `"0s"` to disable eviction. Replaces `cache_ttl_days` (`MICASA_DOCUMENTS_CACHE_TTL_DAYS`); integer days become duration strings, e.g. `30` becomes `"30d"`. |
| `file_picker_dir` | `MICASA_DOCUMENTS_FILE_PICKER_DIR` | string | (Downloads) | Starting directory for the file picker. Defaults to the platform's Downloads directory. |
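The accepted `max_file_size` forms can be illustrated with a small parser. This is a sketch of the documented behavior, not micasa's actual implementation:

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// parseFileSize accepts unitized strings ("50 MiB", "1.5 GiB") or bare
// integers meaning bytes, mirroring the documented max_file_size forms.
// Illustrative only; micasa's real parser may differ.
func parseFileSize(s string) (int64, error) {
	// Longest suffix first, so "MiB" is not mistaken for "B".
	units := []struct {
		suffix string
		mult   float64
	}{
		{"GiB", 1 << 30}, {"MiB", 1 << 20}, {"KiB", 1 << 10}, {"B", 1},
	}
	s = strings.TrimSpace(s)
	for _, u := range units {
		if strings.HasSuffix(s, u.suffix) {
			num := strings.TrimSpace(strings.TrimSuffix(s, u.suffix))
			f, err := strconv.ParseFloat(num, 64)
			if err != nil || f <= 0 {
				return 0, fmt.Errorf("invalid size %q", s)
			}
			return int64(f * u.mult), nil
		}
	}
	return strconv.ParseInt(s, 10, 64) // bare integer: bytes
}

func main() {
	for _, s := range []string{"50 MiB", "1.5 GiB", "1024"} {
		n, _ := parseFileSize(s)
		fmt.Println(s, "=", n) // e.g. "50 MiB = 52428800"
	}
}
```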

### `[extraction]` section

Document extraction pipeline settings.

| Key | Env var | Type | Default | Description |
|---|---|---|---|---|
| `max_pages` | `MICASA_EXTRACTION_MAX_PAGES` | int | `0` | Maximum pages to OCR per scanned document. `0` means no limit. |

### `[extraction.ocr]` section

OCR sub-pipeline settings. Requires `tesseract` and `pdftocairo`.

| Key | Env var | Type | Default | Description |
|---|---|---|---|---|
| `enable` | `MICASA_EXTRACTION_OCR_ENABLE` | bool | `true` | Set to `false` to disable OCR on documents. When disabled, scanned pages and images produce no text. |

### `[extraction.ocr.tsv]` section

Spatial layout annotations (line-level bounding boxes) from `tesseract` OCR. Improves extraction accuracy for invoices and forms with tabular data, at ~2x token overhead.

| Key | Env var | Type | Default | Description |
|---|---|---|---|---|
| `enable` | `MICASA_EXTRACTION_OCR_TSV_ENABLE` | bool | `true` | Set to `false` to disable spatial annotations sent to the LLM. |
| `confidence_threshold` | `MICASA_EXTRACTION_OCR_TSV_CONFIDENCE_THRESHOLD` | int | `70` | Confidence threshold (0-100). Lines with OCR confidence below this value include a confidence score; lines above omit it to save tokens. Set to `0` to never show confidence. |
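The threshold's effect can be pictured with a purely illustrative sketch. The annotation format below is invented for the example; the real text micasa sends to the LLM may differ:

```go
package main

import "fmt"

// annotateLine is illustrative: lines below the confidence threshold
// carry their OCR confidence score, lines at or above it omit the
// score to save tokens, and a threshold of 0 never shows a score.
func annotateLine(text string, conf, threshold int) string {
	if threshold > 0 && conf < threshold {
		return fmt.Sprintf("%s [confidence: %d]", text, conf)
	}
	return text
}

func main() {
	fmt.Println(annotateLine("Total: $1,234.56", 91, 70)) // confident line: no score
	fmt.Println(annotateLine("Tota1: S1.Z34.S6", 38, 70)) // low confidence: score shown
}
```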

### `[locale]` section

Locale and currency settings. Controls currency formatting across all money fields in the application.

| Key | Env var | Type | Default | Description |
|---|---|---|---|---|
| `currency` | `MICASA_LOCALE_CURRENCY` | string | (auto-detect) | ISO 4217 currency code (e.g. `USD`, `EUR`, `GBP`, `JPY`). Auto-detected from `LC_MONETARY`/`LANG` if not set, falling back to `USD`. Persisted to the database on first run – after that the DB value is authoritative. |

Currency resolution order (highest to lowest):

  1. Database value (authoritative once set – makes the DB file portable)
  2. `MICASA_LOCALE_CURRENCY` environment variable
  3. `[locale]` `currency` config value
  4. Auto-detect from `LC_MONETARY` or `LANG` locale
  5. `USD` fallback
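The same first-non-empty precedence chain as a sketch. `resolveCurrency` and its arguments are illustrative names; `localeGuess` stands in for whatever was auto-detected from `LC_MONETARY`/`LANG`:

```go
package main

import "fmt"

// resolveCurrency mirrors the documented precedence: the first
// non-empty source wins, with USD as the final fallback.
// All names are illustrative, not micasa's real API.
func resolveCurrency(dbValue, envVar, configValue, localeGuess string) string {
	for _, v := range []string{dbValue, envVar, configValue, localeGuess} {
		if v != "" {
			return v
		}
	}
	return "USD" // final fallback
}

func main() {
	fmt.Println(resolveCurrency("", "", "EUR", "USD")) // config value wins over locale
	fmt.Println(resolveCurrency("GBP", "EUR", "", "")) // DB value is authoritative
}
```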

Formatting is locale-correct: EUR uses comma decimals and period grouping (1.234,56), GBP uses the pound sign (£750.00), JPY uses yen with no decimal places, etc.

## Supported LLM backends

micasa talks to any server that implements the OpenAI chat completions API with streaming (SSE). All providers – including Ollama – communicate via OpenAI-compatible endpoints; there is no native SDK dependency on any provider.

### Local backends

Ollama is the primary tested backend:

| Backend | Default URL | Notes |
|---|---|---|
| Ollama | `http://localhost:11434/v1` | Default and tested. Models are pulled automatically if not present. |
| llama.cpp server | `http://localhost:8080/v1` | Should work (untested). Pass `--host` and `--port` when starting the server. |
| llamafile | `http://localhost:8080/v1` | Single-file executable with built-in server. |
| LM Studio | `http://localhost:1234/v1` | Should work (untested). Enable the local server in LM Studio settings. |

### Cloud providers

micasa also supports cloud LLM providers. Set `provider`, `base_url`, and `api_key` in `[chat.llm]` and/or `[extraction.llm]`. Cloud providers use their own default base URLs when none is configured.

| Provider | Notes |
|---|---|
| OpenAI | GPT-4o, o1, etc. |
| Anthropic | Claude models. Does not support model listing. |
| DeepSeek | DeepSeek-R1, DeepSeek-V3, etc. |
| Google Gemini | Gemini models. |
| Groq | Fast inference for open models. |
| Mistral | Mistral and Mixtral models. |
| OpenRouter | Multi-provider gateway. Uses the OpenAI protocol. |

## Override precedence

Environment variables override config file values. The full precedence order (highest to lowest):

  1. Environment variables
  2. Config file values
  3. Built-in defaults

Each config key has a corresponding environment variable: `MICASA_` plus the uppercase config path with dots replaced by underscores (e.g. `chat.llm.model` becomes `MICASA_CHAT_LLM_MODEL`). `MICASA_DB_PATH` is an exception – it controls the database path and has no config file equivalent.
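The mapping is mechanical; a sketch (`envVarFor` is an illustrative helper name):

```go
package main

import (
	"fmt"
	"strings"
)

// envVarFor derives the environment variable name for a config key
// path: MICASA_ + uppercase path with dots replaced by underscores.
func envVarFor(path string) string {
	return "MICASA_" + strings.ToUpper(strings.ReplaceAll(path, ".", "_"))
}

func main() {
	fmt.Println(envVarFor("chat.llm.model"))      // MICASA_CHAT_LLM_MODEL
	fmt.Println(envVarFor("documents.cache_ttl")) // MICASA_DOCUMENTS_CACHE_TTL
}
```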

## `extra_context` examples

The `extra_context` field in `[chat.llm]` is injected into every system prompt sent to the chat LLM, giving it persistent knowledge about your situation:

```toml
[chat.llm]
extra_context = """
My house is a 1920s craftsman bungalow in Portland, OR.
Property tax is assessed annually in November.
The HVAC system is a heat pump (Mitsubishi hyper-heat) -- no gas furnace.
"""
```

This helps the model give more relevant answers without you repeating context in every question. Currency is configured separately via `[locale]` `currency` and is automatically available to the LLM – no need to mention it in `extra_context`.

## Persistent preferences

Some preferences are stored in the SQLite database and persist across restarts. These are controlled through the UI rather than config files:

| Preference | Default | How to change |
|---|---|---|
| Dashboard on startup | Shown | Press `D` to toggle; your choice is remembered |
| LLM model | From config | Changed automatically when you switch models in the chat interface |
| Currency | `USD` | Set via `[locale]` `currency` in config, `MICASA_LOCALE_CURRENCY` env var, or auto-detected from system locale. Persisted to the database on first use |