When I build a small SaaS that calls paid model APIs, I do not want to start by wiring together multiple backend platforms.

My take: I would not start with Firebase or Supabase.

I would start with Postgres. Self-host it if you can. If not, use hosted Postgres from someone else, but keep it as normal Postgres that you can back up, move, and query directly.

I Want to Own the Database

For a solo SaaS, especially before the first 100 paying users, the backend should be boring.

Boring does not mean weak. It means I know where the data is, where auth is checked, where billing limits are enforced, where paid provider usage is counted, and how I get the whole thing out if I need to move.

That is why I lean toward plain Postgres.

The app will change. I might switch frontend frameworks, AI providers, or payment providers. The database is the part I want to keep stable.

I do not want my core product data modeled around a BaaS SDK.

The Problem With BaaS Defaults

Firebase and Supabase are useful products. I have used Firebase for prototypes before. Supabase gets you Postgres plus auth quickly.

But they both want to be more than a database.

Firebase gives you Auth, Firestore, Storage, Functions, security rules, and client SDKs.

Supabase gives you Auth, Postgres, Storage, Edge Functions, row-level security, realtime, and client SDKs.

That is convenient until the product depends on the platform more than on your own backend code.

Now the decisions are not just:

  • what tables do I need?
  • what queries do I need?
  • what does the backend enforce?

They become:

  • which client SDK owns this flow?
  • are permissions in application code, security rules, or row-level security?
  • does the frontend write directly to the database?
  • how do I run the same setup locally?
  • how do I back this up?
  • how do I move this if pricing, limits, or product direction changes?

For a small app, that is too many platform decisions before the product has customers.

Put Usage and Billing in SQL

For an app that charges for paid provider usage, I want Postgres.

The product can look simple at first. Then you need:

  • saved configurations
  • versions
  • tags
  • owners
  • teams
  • permissions
  • model settings
  • variables
  • test inputs
  • generated outputs
  • usage counts
  • billing limits
  • audit logs

That maps well to relational data. Some parts are structured. Some parts are flexible. Postgres handles both.

Example:

create table ai_workflows (
  id uuid primary key default gen_random_uuid(),
  user_id uuid not null,
  title text not null,
  instructions text not null,
  model text not null,
  metadata jsonb not null default '{}',
  created_at timestamptz not null default now(),
  updated_at timestamptz not null default now()
);

create table ai_workflow_versions (
  id uuid primary key default gen_random_uuid(),
  workflow_id uuid not null references ai_workflows(id),
  instructions text not null,
  metadata jsonb not null default '{}',
  created_at timestamptz not null default now()
);

create table ai_runs (
  id uuid primary key default gen_random_uuid(),
  workflow_id uuid not null references ai_workflows(id),
  user_id uuid not null,
  model text not null,
  input jsonb not null default '{}',
  output text,
  tokens_input integer not null default 0,
  tokens_output integer not null default 0,
  created_at timestamptz not null default now()
);

That is enough to start.
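The usage and billing queries will hit ai_runs constantly, so a couple of indexes are worth creating alongside the schema. A minimal sketch; the exact column choices assume the per-user, per-month reporting pattern this kind of app needs:

```sql
-- per-user usage queries filter on user_id and a time window
create index ai_runs_user_created_idx
  on ai_runs (user_id, created_at);

-- per-workflow cost breakdowns join runs back to workflows
create index ai_runs_workflow_idx
  on ai_runs (workflow_id);

-- listing a workflow's version history
create index ai_workflow_versions_workflow_idx
  on ai_workflow_versions (workflow_id);
```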

You can add team_id later. You can add sharing later. You can add full-text search later. You can add pgvector later if semantic search becomes useful.
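Each of those later additions is an ordinary migration, not a platform decision. A sketch of what two of them might look like; the column names and the 1536-dimension embedding size are assumptions, not requirements:

```sql
-- teams later: one nullable column, backfilled when teams ship
alter table ai_workflows
  add column team_id uuid;

-- semantic search later: the pgvector extension plus an embedding column
create extension if not exists vector;

alter table ai_runs
  add column embedding vector(1536);
```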

The important thing is that the core data is in one place, in a format I can query directly.

Usage Has to Be Measurable

Calls to OpenAI, Anthropic, Gemini, or any other paid provider cost money. That changes the backend design.

If your app lets users run paid provider calls, the database needs to answer basic operational questions:

select
  user_id,
  count(*) as runs,
  sum(tokens_input + tokens_output) as total_tokens
from ai_runs
where created_at >= date_trunc('month', now())
group by user_id
order by total_tokens desc;

I expect to write queries like this in any app that charges for paid provider usage.

Who used the most tokens this month? Which workflow is expensive? Which free account is using too much? Did this customer exceed their plan? How many failed runs came from one model?

These are database questions.
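Most of them fall out of the ai_runs table directly. For example, "which workflow is expensive this month" is one aggregate (the failed-runs question would additionally need a status column on ai_runs, which the sketch above does not have yet):

```sql
-- most expensive workflows this month, by total tokens
select
  w.id,
  w.title,
  count(*) as runs,
  sum(r.tokens_input + r.tokens_output) as total_tokens
from ai_runs r
join ai_workflows w on w.id = r.workflow_id
where r.created_at >= date_trunc('month', now())
group by w.id, w.title
order by total_tokens desc
limit 10;
```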

I could model this in Firebase. I just would not pick a document database for an app where I already know usage, billing, joins, and reporting matter.

Self-Host Postgres If You Can

My preference:

  • self-host Postgres
  • if not, hosted Postgres
  • if not, a BaaS with Postgres underneath
  • Firebase only if the app genuinely fits Firebase

Self-hosting does not mean building a data center. For my current projects, one Hetzner server with Kamal gets me a lot of mileage.

Postgres backups are straightforward. Restores are understandable. Logs are mine. Migrations are mine. The app connects over a normal database URL. No special integration needed.

Hosted Postgres is fine too. I care less about the brand than the basics:

  • can I connect with standard Postgres tools?
  • can I run migrations?
  • can I dump and restore?
  • can I move away without rewriting the app?
  • can my backend own the product rules?

If yes, that is enough.

Keep the Backend in Charge

Once there is a backend, I would keep the security model simple:

  • the frontend calls your backend
  • the backend checks the user
  • the backend reads and writes Postgres
  • the backend calls the model provider
  • the backend handles payment webhooks
  • the backend enforces free tier limits and paid plan limits

Do not let the frontend become the place where important decisions happen.

The frontend can show the editor. It can show “3 runs left.” It can disable a button. But the backend has to enforce the limit.
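Enforcing that limit server-side can be as small as one guarded insert. A sketch, assuming a hypothetical free tier of 25 runs per month; in a real backend the limit would come from the user's plan, and under concurrency you would also serialize per user (for example with an advisory lock) so two simultaneous requests cannot both pass the check:

```sql
-- insert a run only if the user is still under this month's limit;
-- zero rows inserted means the backend rejects the request
insert into ai_runs (workflow_id, user_id, model)
select $1, $2, $3
where (
  select count(*)
  from ai_runs
  where user_id = $2
    and created_at >= date_trunc('month', now())
) < 25;
```

Because the check and the insert are one statement, the frontend's "3 runs left" counter can be wrong without the limit ever being wrong.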

I also would not put model provider calls in the frontend. API keys stay server-side. Usage gets recorded server-side. Billing checks happen server-side.

What I Would Build First

For a solo MVP, I would start with this:

  • frontend
  • one backend API
  • Postgres
  • one auth system
  • payment webhooks handled by the backend
  • model provider API called only from the backend
  • an ai_runs table for usage tracking from day one

The first version can be plain:

frontend -> backend -> Postgres
                |
                +-> model provider
                |
                +-> payment webhook handling

The backend owns the product rules.

That means:

  • who can read a workflow
  • who can edit a workflow
  • how many free runs are left
  • whether the account is paid
  • which model can be used
  • how much usage was recorded

You can still use a hosted auth provider if you want. You can still use a managed Postgres provider if you do not want to run the database yourself.

The line I would avoid crossing early: making the app depend on a BaaS when a normal backend plus Postgres is enough.

The Stack Should Match the Stage

Before 100 paying users, the stack has one job: help you learn what people will pay for.

You need enough structure that the app can grow. You do not need to organize the backend around a BaaS before you know what the product needs.

For this product, I would use Postgres and keep moving. Self-hosted if practical. Hosted if that saves time. Either way, I want standard Postgres, normal migrations, normal backups, and backend code that owns the business rules.

Firebase and Supabase can both work. I just would not start there by default.