Added in: 5.14.0

Prowler Lighthouse AI supports multiple Large Language Model (LLM) providers, offering flexibility to choose the provider that best fits infrastructure, compliance requirements, and cost considerations. This guide explains how to configure and use different LLM providers with Lighthouse AI.

Supported Providers

Lighthouse AI supports the following LLM providers:
  • OpenAI: Provides access to GPT models (GPT-4o, GPT-4, etc.)
  • Amazon Bedrock: Offers AWS-hosted access to Claude, Llama, Titan, and other models
  • OpenAI Compatible: Supports custom endpoints like OpenRouter, Ollama, or any OpenAI-compatible service

How Default Providers Work

All three providers can be configured for a tenant, but only one can be set as the default provider. The first provider configured automatically becomes the default. When the Lighthouse AI chat is opened, the default provider's default model loads automatically; users can switch to any available model, including those from non-default providers, using the model dropdown in the chat.

Configuring Providers

Navigate to Configuration > Lighthouse AI to see all three provider options, each with a Connect button under it.

Connecting a Provider

To connect a provider:
  1. Click Connect under the desired provider
  2. Enter the required credentials
  3. Select a default model for that provider
  4. Click Connect to save

OpenAI

Required Information

  • API Key: OpenAI API key (starts with sk- or sk-proj-)
To generate an OpenAI API key, visit https://platform.openai.com/api-keys
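Before pasting a key into Lighthouse AI, it can be worth sanity-checking it. The sketch below is illustrative only (the helper names are hypothetical, not part of Lighthouse AI): a cheap prefix check, plus an optional live check that lists models against the standard OpenAI API.

```python
import urllib.error
import urllib.request


def looks_like_openai_key(key: str) -> bool:
    """Cheap format check: OpenAI user keys start with "sk-",
    and project-scoped keys with "sk-proj-" (which also begins with "sk-")."""
    return key.startswith("sk-")


def key_is_live(key: str) -> bool:
    """Confirm the key actually authenticates by listing available models."""
    req = urllib.request.Request(
        "https://api.openai.com/v1/models",
        headers={"Authorization": f"Bearer {key}"},
    )
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status == 200
    except urllib.error.HTTPError:
        return False
```

A key that passes the format check but fails the live check typically means it was revoked or copied incompletely.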

Amazon Bedrock

Required Information

  • AWS Access Key ID: AWS access key ID
  • AWS Secret Access Key: AWS secret access key
  • AWS Region: Region where Bedrock is available (e.g., us-east-1, us-west-2)

Required Permissions

The AWS user must have the AmazonBedrockLimitedAccess managed policy attached:
arn:aws:iam::aws:policy/AmazonBedrockLimitedAccess
Currently, only AWS access key and secret key authentication is supported. Amazon Bedrock API key support will be available soon.
Available models depend on AWS region and account entitlements. Lighthouse AI displays only accessible models.
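As a sketch of the setup above (assuming the boto3 SDK is installed and admin-level AWS credentials are configured; the IAM user name and region below are placeholders), the policy can be attached and the accessible models previewed programmatically:

```python
import boto3  # third-party AWS SDK

# Attach the required managed policy to the IAM user whose access keys
# will be entered in Lighthouse AI ("lighthouse-bedrock-user" is a placeholder).
iam = boto3.client("iam")
iam.attach_user_policy(
    UserName="lighthouse-bedrock-user",
    PolicyArn="arn:aws:iam::aws:policy/AmazonBedrockLimitedAccess",
)

# Preview which foundation models the account can see in a given region;
# Lighthouse AI will similarly show only models accessible to the account.
bedrock = boto3.client("bedrock", region_name="us-east-1")
for model in bedrock.list_foundation_models()["modelSummaries"]:
    print(model["modelId"])
```

The same policy attachment can be done from the IAM console if CLI or SDK access is not available.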

OpenAI Compatible

Use this option to connect to any LLM provider that exposes an OpenAI-compatible API endpoint (OpenRouter, Ollama, etc.).

Required Information

  • API Key: API key from the compatible service
  • Base URL: API endpoint URL including the API version (e.g., https://openrouter.ai/api/v1)

Example: OpenRouter

  1. Create an account at OpenRouter
  2. Generate an API key from the OpenRouter dashboard
  3. Configure in Lighthouse AI:
    • API Key: OpenRouter API key
    • Base URL: https://openrouter.ai/api/v1
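Because the Base URL must already include the API version, a quick way to verify a configuration like the one above is to hit the endpoint's model list. This is a minimal sketch (the helper names are hypothetical), which works for OpenRouter and other OpenAI-compatible services:

```python
import json
import urllib.request


def models_endpoint(base_url: str) -> str:
    """Join the configured base URL (which must already include the API
    version, e.g. https://openrouter.ai/api/v1) with the /models path."""
    return base_url.rstrip("/") + "/models"


def list_models(base_url: str, api_key: str) -> list[str]:
    """Return the model IDs exposed by an OpenAI-compatible endpoint."""
    req = urllib.request.Request(
        models_endpoint(base_url),
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        payload = json.load(resp)
    return [m["id"] for m in payload.get("data", [])]
```

If this call fails with a 404, the base URL is usually missing the version segment (e.g. `/v1`).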

Changing the Default Provider

To set a different provider as default:
  1. Navigate to Configuration > Lighthouse AI
  2. Click Configure under the provider you want as default
  3. Click Set as Default

Updating Provider Credentials

To update credentials for a connected provider:
  1. Navigate to Configuration > Lighthouse AI
  2. Click Configure under the provider
  3. Enter the new credentials
  4. Click Update

Deleting a Provider

To remove a configured provider:
  1. Navigate to Configuration > Lighthouse AI
  2. Click Configure under the provider
  3. Click Delete

Model Recommendations

For best results with Lighthouse AI, the recommended model is gpt-5 from OpenAI. Models from other providers, including Amazon Bedrock and OpenAI Compatible endpoints, can be connected and used, but their performance is not guaranteed.

Getting Help

For issues or suggestions, reach out through our Slack channel.