Added in: 5.14.0
Prowler Lighthouse AI supports multiple Large Language Model (LLM) providers, offering flexibility to choose the provider that best fits infrastructure, compliance requirements, and cost considerations. This guide explains how to configure and use different LLM providers with Lighthouse AI.
Supported Providers
Lighthouse AI supports the following LLM providers:

- OpenAI: Provides access to GPT models (GPT-4o, GPT-4, etc.)
- Amazon Bedrock: Offers AWS-hosted access to Claude, Llama, Titan, and other models
- OpenAI Compatible: Supports custom endpoints like OpenRouter, Ollama, or any OpenAI-compatible service
How Default Providers Work
All three providers can be configured for a tenant, but only one can be set as the default provider. The first configured provider automatically becomes the default. When visiting Lighthouse AI chat, the default provider’s default model loads automatically. Users can switch to any available LLM model (including those from non-default providers) using the dropdown in chat.
Configuring Providers
Navigate to Configuration → Lighthouse AI to see all three provider options with a Connect button under each.
Connecting a Provider
To connect a provider:

- Click Connect under the desired provider
- Enter the required credentials
- Select a default model for that provider
- Click Connect to save
OpenAI
Required Information
- API Key: OpenAI API key (starts with sk- or sk-proj-)
To generate an OpenAI API key, visit https://platform.openai.com/api-keys
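Before saving the key, a quick local format check can catch copy-paste mistakes. The helper below is an illustrative sketch, not part of Lighthouse AI; it only verifies the documented prefix and does not prove the key is valid.

```python
def looks_like_openai_key(key: str) -> bool:
    """Loose sanity check on an OpenAI API key's format.

    Both standard "sk-..." keys and project-scoped "sk-proj-..."
    keys start with the "sk-" prefix. This checks the prefix only;
    it does not verify the key against the OpenAI API.
    """
    return key.startswith("sk-") and len(key) > len("sk-")
```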
Amazon Bedrock
Required Information
- AWS Access Key ID: AWS access key ID
- AWS Secret Access Key: AWS secret access key
- AWS Region: Region where Bedrock is available (e.g., us-east-1, us-west-2)
Required Permissions
The AWS user must have the AmazonBedrockLimitedAccess managed policy attached.
Currently, only AWS access key and secret key authentication is supported. Amazon Bedrock API key support will be available soon.
Available models depend on AWS region and account entitlements. Lighthouse AI displays only accessible models.
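A malformed region string is a common cause of connection failures. The sketch below is illustrative only (not part of Lighthouse AI): it loosely validates the region format and composes the Bedrock control-plane endpoint, assuming the standard bedrock.&lt;region&gt;.amazonaws.com naming scheme.

```python
import re

# Loose pattern for AWS region names such as us-east-1 or ap-southeast-2.
# This is a format check only -- it does not confirm that the region
# exists or that Bedrock is offered there.
REGION_RE = re.compile(r"^[a-z]{2}(-[a-z]+)+-\d$")


def bedrock_endpoint(region: str) -> str:
    """Compose the Bedrock control-plane endpoint for a region string."""
    if not REGION_RE.match(region):
        raise ValueError(f"{region!r} does not look like an AWS region name")
    return f"https://bedrock.{region}.amazonaws.com"
```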
OpenAI Compatible
Use this option to connect to any LLM provider that exposes an OpenAI-compatible API endpoint (OpenRouter, Ollama, etc.).
Required Information
- API Key: API key from the compatible service
- Base URL: API endpoint URL including the API version (e.g., https://openrouter.ai/api/v1)
Example: OpenRouter
- Create an account at OpenRouter
- Generate an API key from the OpenRouter dashboard
- Configure in Lighthouse AI:
- API Key: OpenRouter API key
- Base URL: https://openrouter.ai/api/v1
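One way to confirm that a base URL and API key work together is to query the service's /models endpoint, which OpenAI-compatible APIs expose. The sketch below uses only the standard library; the helper names are illustrative and not part of Lighthouse AI.

```python
import json
import urllib.request


def models_url(base_url: str) -> str:
    """Derive the /models URL from a configured base URL.

    The base URL must already include the API version segment,
    e.g. https://openrouter.ai/api/v1, as entered in Lighthouse AI.
    """
    return base_url.rstrip("/") + "/models"


def list_models(base_url: str, api_key: str) -> list:
    """Fetch the model IDs an OpenAI-compatible endpoint advertises.

    Requires network access and a valid API key.
    """
    req = urllib.request.Request(
        models_url(base_url),
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        payload = json.load(resp)
    # OpenAI-compatible APIs return {"data": [{"id": ...}, ...]}.
    return [model["id"] for model in payload.get("data", [])]
```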
Changing the Default Provider
To set a different provider as default:

- Navigate to Configuration → Lighthouse AI
- Click Configure under the provider you want as default
- Click Set as Default

Updating Provider Credentials
To update credentials for a connected provider:

- Navigate to Configuration → Lighthouse AI
- Click Configure under the provider
- Enter the new credentials
- Click Update
Deleting a Provider
To remove a configured provider:

- Navigate to Configuration → Lighthouse AI
- Click Configure under the provider
- Click Delete
Model Recommendations
For best results with Lighthouse AI, the recommended model is gpt-5 from OpenAI.
Models from other providers such as Amazon Bedrock and OpenAI Compatible endpoints can be connected and used, but performance is not guaranteed.