Self-Hosting Analytics
The Supabase Analytics server is a self-hostable Logflare instance that manages the ingestion and query pipelines for searching and aggregating structured analytics events.
When self-hosting the Analytics server, the full logging experience matching that of the Supabase Platform is available in the Studio instance, allowing for an integrated and enhanced development experience. However, certain differences arise due to the platform's infrastructure; see the Differences section below.
Logflare Technical Docs
All Logflare technical docs are available at https://docs.logflare.app
Getting Started#
Prerequisites#
Logflare currently requires BigQuery. You will need to create a Google Cloud project with billing enabled.
After creating the project, you will need the following (see the gcloud sketch after this list):
- Project ID
- Project number
- A service account key
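As a reference, the sketch below creates the project and prints these values using the gcloud CLI. The project ID and billing account ID are placeholders; substitute your own.

```bash
# Create the project (the project ID is a placeholder).
gcloud projects create my-logflare-project

# Link a billing account; BigQuery streaming inserts require billing.
gcloud billing projects link my-logflare-project \
  --billing-account 000000-AAAAAA-BBBBBB

# Print the project ID and project number for later use.
gcloud projects describe my-logflare-project \
  --format="value(projectId,projectNumber)"
```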
Setting up BigQuery Service Account
To give the Analytics server sufficient permissions to insert into BigQuery, create a service account with either:
- BigQuery Admin role; or
- The following permissions:
- bigquery.datasets.create
- bigquery.datasets.get
- bigquery.datasets.getIamPolicy
- bigquery.datasets.update
- bigquery.jobs.create
- bigquery.routines.create
- bigquery.routines.update
- bigquery.tables.create
- bigquery.tables.delete
- bigquery.tables.get
- bigquery.tables.getData
- bigquery.tables.update
- bigquery.tables.updateData
You can create the service account via the web console or gcloud, as per the Google Cloud documentation. In the web console, you can create the key by navigating to IAM > Service Accounts > Actions (dropdown) > Manage Keys.
We recommend setting the BigQuery Admin role, as it simplifies permissions setup.
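As a sketch, the equivalent setup with gcloud is shown below; the project ID (my-project) and service account name (logflare) are placeholders.

```bash
# Create the service account (names are placeholders).
gcloud iam service-accounts create logflare \
  --project my-project \
  --display-name "Logflare Analytics"

# Grant the recommended BigQuery Admin role on the project.
gcloud projects add-iam-policy-binding my-project \
  --member "serviceAccount:logflare@my-project.iam.gserviceaccount.com" \
  --role "roles/bigquery.admin"
```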
Downloading the Service Account key
After the service account is created, you will need to create a key for it. This key signs the JWTs for the API requests that the Analytics server makes to BigQuery.
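With gcloud, you can create and download the key in one step; the service account email below is a placeholder.

```bash
# Create a JSON key and save it as gcloud.json for later use.
gcloud iam service-accounts keys create gcloud.json \
  --iam-account logflare@my-project.iam.gserviceaccount.com
```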
Docker Compose#
Using the example self-hosting stack based on docker-compose, you can include the logging-related services with the commands below.
You will first need to update the .env.example file with the necessary environment variables:
- GOOGLE_PROJECT_ID
- GOOGLE_PROJECT_NUMBER
You will need to place your Service Account key in your present working directory with the filename gcloud.json.
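For example, assuming you start from the repository's docker directory (the downloaded key filename is a placeholder):

```bash
# Copy the example env file, then fill in GOOGLE_PROJECT_ID and
# GOOGLE_PROJECT_NUMBER with your project's values.
cp .env.example .env

# Place the Service Account key next to the compose files.
cp ~/Downloads/my-project-abc123.json ./gcloud.json
```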
Thereafter, you can run the docker-compose commands and include the logging-specific compose file.
```bash
# assuming you clone the supabase/supabase repo.
cd docker
docker compose -f docker-compose.yml -f docker-compose-logging.yml up
```
This adds two additional Docker services to your compose stack: Logflare and Vector.
Self-hosting
Read more about self-hosting Logflare as your analytics server.
Vector Usage
In the Docker Compose example, we use Vector to coordinate the logging pipeline between the services. If you inspect the Vector configuration file, you will see that Vector sends logs to the Analytics ingestion endpoint.
If you need to customize the logging pipeline for your own needs, you must ensure that the payloads match the expected event schema structure. Without the correct structure, the Studio Logs UI features will break.
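As a rough sketch, an ingestion request looks like the following; the hostname, port, API key, and source name are assumptions, so check your own compose file and Vector configuration for the actual values.

```bash
# Send one structured event to the Analytics ingestion endpoint.
# All values below are placeholders.
curl -X POST "http://localhost:4000/api/logs?source_name=my.source" \
  -H "Content-Type: application/json" \
  -H "X-API-KEY: your-logflare-api-key" \
  -d '{"batch": [{"message": "hello", "metadata": {"level": "info"}}]}'
```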
Standalone Docker Container#
If desired, you can run the Analytics server as a standalone Docker container. Please refer to the docker-compose file for the required Docker configuration.
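A minimal sketch of a standalone invocation follows, assuming the environment variables match those used in docker-compose-logging.yml; verify the names, values, and key mount path against that file before use.

```bash
# Run the Analytics server standalone (all values are placeholders).
# The in-container key path is an assumption; match it to the
# GOOGLE_APPLICATION_CREDENTIALS value used in your compose file.
docker run -p 4000:4000 \
  -e DB_HOSTNAME=your-postgres-host \
  -e DB_PORT=5432 \
  -e DB_USERNAME=postgres \
  -e DB_PASSWORD=your-password \
  -e DB_DATABASE=postgres \
  -e GOOGLE_PROJECT_ID=my-project \
  -e GOOGLE_PROJECT_NUMBER=123456789012 \
  -e GOOGLE_APPLICATION_CREDENTIALS=/gcloud.json \
  -v "$(pwd)/gcloud.json:/gcloud.json" \
  supabase/logflare
```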
Additional technical documentation on self-hosting and using the supabase/logflare image for a full Logflare experience is available in the official Logflare documentation.
Differences#
API logs rely on Kong instead of the Supabase Cloud API Gateway. Logs from Kong are not enriched with platform-only data.
Within the self-hosted setup, all logs are routed to Logflare via Vector. As Kong routes API requests to PostgREST, self-hosted or local deployments produce Kong request logs instead, so the log event metadata differs between self-hosted API requests and Supabase Platform requests.
BigQuery#
All log event data is stored in and queried from BigQuery. To use the Analytics server with Supabase you'll need a Google Cloud Platform account for access to BigQuery.
Make sure to set the GOOGLE_DATASET_ID_APPEND, GOOGLE_PROJECT_ID, and GOOGLE_PROJECT_NUMBER environment variables.
Download your Service Account key and store it as gcloud.json in your working directory.
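For example, with placeholder values:

```bash
# Placeholder values. GOOGLE_DATASET_ID_APPEND is appended to the
# BigQuery dataset name (useful for separating environments).
export GOOGLE_PROJECT_ID=my-project
export GOOGLE_PROJECT_NUMBER=123456789012
export GOOGLE_DATASET_ID_APPEND=_prod
```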
Note: you must also enable billing on your Google Cloud project, as the streaming inserts feature is required.
Production Recommendations#
To self-host in a production setting, we recommend performing the following for a better experience.
Ensure that Logflare is behind a firewall and restrict all network access to it other than trusted traffic.#
Self-hosted Logflare has UI authentication disabled and is not intended for exposure to the internet. We recommend restricting access to the dashboard, which is accessible at the /dashboard path.
If dashboard access is required for managing sources, we recommend placing it behind an additional access-control layer, such as a VPN.
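As one illustrative approach, a host firewall can limit the Analytics port to an internal network; the port and subnet below are assumptions, so adjust them to your deployment.

```bash
# Deny inbound traffic by default, then allow only an internal subnet
# to reach the Analytics port (4000 and 10.0.0.0/24 are placeholders).
sudo ufw default deny incoming
sudo ufw allow from 10.0.0.0/24 to any port 4000 proto tcp
sudo ufw enable
```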
Use a different Postgres Database to store Logflare data.#
Logflare requires a Postgres database to function. However, if there is an issue with your self-hosted Postgres service, you would not be able to debug it, as the same issue would bring Logflare down with it.
The self-hosted example is intended only as a minimal demonstration of running the entire stack; it is not recommended to use the same database server for both production and observability.
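For instance, you can point Logflare at a dedicated Postgres instance. The sketch below assumes the DB_* environment variables used in the compose example, with placeholder values.

```bash
# Use a dedicated Postgres instance for Logflare rather than the one
# serving production traffic (hostname and credentials are placeholders).
export DB_HOSTNAME=observability-db.internal
export DB_PORT=5432
export DB_DATABASE=logflare
export DB_USERNAME=logflare
export DB_PASSWORD=your-password
```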
Client libraries#
- JavaScript - Pino Transport
- Elixir
- Elixir - Logger Backend
- Erlang
- Erlang - Lager Backend
- Cloudflare Worker