AI Configuration — Models, Prompts, and Credits
Configure AI analysis used during scraping. Choose a model, enable prompts, and control usage.
Key fields
- aiConfig.enabled: toggle AI processing
- aiConfig.selected_model: [displayName, apiId]
- aiConfig.thread_count: model concurrency
- aiConfig.max_credits_per_post: usage limit per post
- aiConfig.prompts.prompt_{id}: prompt selection and bounds
The UI persists prompts as references under aiConfig.prompts.prompt_{id} with selection booleans and optional bounds.
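The fields above can be sketched as a single document shape. This is an illustrative TypeScript model only; the field names come from this page, but the exact types and the bound fields (`min`, `max`, `allowed`) are assumptions.

```typescript
// Hypothetical shape of the persisted aiConfig document.
// Field names follow the docs; types and bound fields are assumptions.
interface PromptRef {
  selected: boolean;    // include this prompt in the workflow
  min?: number;         // assumed numeric lower bound
  max?: number;         // assumed numeric upper bound
  allowed?: string[];   // assumed string bounds
}

interface AiConfig {
  enabled: boolean;                   // toggle AI processing
  selected_model: [string, string];   // [displayName, apiId]
  thread_count: number;               // model concurrency
  max_credits_per_post: number;       // usage limit per post
  prompts: Record<string, PromptRef>; // keyed as prompt_{id}
}

const example: AiConfig = {
  enabled: true,
  selected_model: ["GPT-4o", "gpt-4o"], // example values, not a default
  thread_count: 2,
  max_credits_per_post: 10,
  prompts: {
    prompt_sentiment: { selected: true, min: 0, max: 1 },
  },
};
```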
Model selection
- Choose the provider/model in the UI; display name and API ID are stored together for clarity
- Related UI elements update across the page to reflect the active model
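Because the display name and API ID are stored as one pair, UI code can derive both the user-facing label and the request payload from a single field. A minimal sketch (the helper name is hypothetical):

```typescript
// Illustrative helper: selected_model stores [displayName, apiId] together,
// so one field yields both the label shown in the UI and the ID sent to the API.
function modelLabel(selected: [string, string]): string {
  const [displayName, apiId] = selected;
  return `${displayName} (${apiId})`;
}
```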
Prompts
- Create prompts under users/{uid}/prompts (subcollection)
- Add prompt references into the workflow at aiConfig.prompts.*
- Toggle prompt inclusion per workflow; supports numeric and string-bounded prompts
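The steps above can be sketched as building one `prompt_{id}` entry for the workflow. The `promptRef` helper and its bound fields are assumptions for illustration, not the actual persistence code.

```typescript
// Sketch (assumed names): turn a prompt from users/{uid}/prompts into a
// workflow reference stored under aiConfig.prompts.prompt_{id}.
type Bounds = { min: number; max: number } | { allowed: string[] };

function promptRef(
  id: string,
  selected: boolean,
  bounds?: Bounds
): Record<string, object> {
  // Key matches the documented prompt_{id} naming scheme.
  return { [`prompt_${id}`]: { selected, ...(bounds ?? {}) } };
}

const ref = promptRef("sentiment", true, { min: 0, max: 1 });
```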
Credits
- User credits display updates in real time from users/{uid}
- Use max_credits_per_post to bound per-post spend
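The per-post bound amounts to a simple budget check before each additional prompt runs. A minimal sketch, assuming credits are tracked as numbers (the function name is hypothetical):

```typescript
// Sketch: allow more AI work on a post only while the running spend plus the
// next prompt's estimated cost stays within max_credits_per_post.
function withinBudget(
  spentOnPost: number,
  nextCost: number,
  maxCreditsPerPost: number
): boolean {
  return spentOnPost + nextCost <= maxCreditsPerPost;
}
```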
Best practices
- Start with a narrow prompt set; expand after validating signal quality
- Use bounded numeric/string prompts where possible for predictable outputs
- Keep thread counts modest initially; scale with monitoring