Next.js Supabase V3 (#463)

Version 3 of the kit:
- Radix UI replaced with Base UI (using the Shadcn UI patterns)
- next-intl replaces react-i18next
- enhanceAction deprecated; usage moved to next-safe-action
- main layout now wrapped with [locale] path segment
- Teams only mode
- Layout updates
- Zod v4
- Next.js 16.2
- TypeScript 6
- All other dependencies updated
- Removed deprecated Edge CSRF
- Dynamic GitHub Actions runner
Commit 7ebff31475 (parent 4912e402a3) by Giancarlo Buomprisco, 2026-03-24 13:40:38 +08:00, committed via GitHub.
840 changed files with 71395 additions and 20095 deletions.

---
status: "published"
label: "Adding a Turborepo App"
title: "Add a New Application to Your Makerkit Monorepo"
description: "Create additional applications in your Turborepo monorepo using git subtree to maintain updates from Makerkit while building separate products."
order: 13
---
Add new applications to your Makerkit monorepo using `git subtree` to clone the `apps/web` template while maintaining the ability to pull updates from the Makerkit repository. This is useful for building multiple products (e.g., a main app and an admin dashboard) that share the same packages and infrastructure.
{% alert type="warning" title="Advanced Topic" %}
This guide is for advanced use cases where you need multiple applications in a single monorepo. For most projects, a single `apps/web` application is sufficient. Creating a separate repository may be simpler if you don't need to share code between applications.
{% /alert %}
{% sequence title="Add a Turborepo Application" description="Create a new application from the web template" %}
[Create the subtree branch](#step-1-create-the-subtree-branch)
[Add the new application](#step-2-add-the-new-application)
[Configure the application](#step-3-configure-the-new-application)
[Keep it updated](#step-4-pulling-updates)
{% /sequence %}
## When to Add a New Application
Add a new Turborepo application when:
- **Multiple products**: You're building separate products that share authentication, billing, or UI components
- **Admin dashboard**: You need a separate admin interface with different routing and permissions
- **API server**: You want a dedicated API application separate from your main web app
- **Mobile companion**: You're building a React Native or Expo app that shares business logic
Keep a single application when:
- You only need one web application
- Different features can live under different routes in `apps/web`
- Separation isn't worth the complexity
## Step 1: Create the Subtree Branch
First, create a branch that contains only the `apps/web` folder. This branch serves as the template for new applications.
```bash
git subtree split --prefix=apps/web --branch web-branch
```
This command:
1. Extracts the history of `apps/web` into a new branch
2. Creates `web-branch` containing only the `apps/web` contents
3. Preserves commit history for that folder
## Step 2: Add the New Application
Create your new application by pulling from the subtree branch.
For example, to create a `pdf-chat` application:
```bash
git subtree add --prefix=apps/pdf-chat origin web-branch --squash
```
This command:
1. Creates `apps/pdf-chat` with the same structure as `apps/web`
2. Squashes the history into a single commit (cleaner git log)
3. Sets up tracking for future updates
### Verify the Application
```bash
ls apps/pdf-chat
```
You should see the same structure as `apps/web`:
```
apps/pdf-chat/
├── app/
├── components/
├── config/
├── lib/
├── supabase/
├── next.config.mjs
├── package.json
└── ...
```
## Step 3: Configure the New Application
### Update package.json
Change the package name and any app-specific settings:
```json {% title="apps/pdf-chat/package.json" %}
{
"name": "pdf-chat",
"version": "0.0.1",
"scripts": {
"dev": "next dev --port 3001",
"build": "next build",
"start": "next start --port 3001"
}
}
```
### Update Environment Variables
Create a separate `.env.local` for the new application:
```bash {% title="apps/pdf-chat/.env.local" %}
NEXT_PUBLIC_SITE_URL=http://localhost:3001
NEXT_PUBLIC_APP_NAME="PDF Chat"
# ... other environment variables
```
### Update Supabase Configuration (if separate)
If the application needs its own database, update `apps/pdf-chat/supabase/config.toml` with unique ports and project settings.
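As a sketch, the relevant parts of that file might look like the following. The `project_id` and port numbers here are illustrative assumptions; pick any values that don't collide with the `apps/web` Supabase instance:

```toml {% title="apps/pdf-chat/supabase/config.toml" %}
# Illustrative values only: use a unique project_id and ports that
# don't clash with the apps/web Supabase instance (54321-54324).
project_id = "pdf-chat"

[api]
port = 64321

[db]
port = 64322

[studio]
port = 64323
```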
### Add Turbo Configuration
Update the root `turbo.json` to include your new application:
```json {% title="turbo.json" %}
{
"tasks": {
"build": {
"dependsOn": ["^build"],
"outputs": [".next/**", "!.next/cache/**"]
},
"pdf-chat#dev": {
"dependsOn": ["^build"],
"persistent": true
}
}
}
```
### Run the New Application
```bash
# Run just the new app
pnpm --filter pdf-chat dev
# Run all apps in parallel
pnpm dev
```
## Step 4: Pulling Updates
When Makerkit releases updates, follow these steps to sync them to your new application.
### Pull Upstream Changes
First, pull the latest changes from Makerkit:
```bash
git pull upstream main
```
### Update the Subtree Branch
Re-extract the `apps/web` folder into the subtree branch:
```bash
git subtree split --prefix=apps/web --branch web-branch
```
### Push the Branch
Push the updated branch to your repository:
```bash
git push origin web-branch
```
### Pull Into Your Application
Finally, pull the updates into your new application:
```bash
git subtree pull --prefix=apps/pdf-chat origin web-branch --squash
```
### Resolve Conflicts
If you've modified files that were also changed upstream, you'll need to resolve conflicts:
```bash
# After conflicts appear
git status # See conflicted files
# Edit files to resolve conflicts
git add .
git commit -m "Merge upstream changes into pdf-chat"
```
## Update Workflow Summary
```bash
# 1. Get latest from Makerkit
git pull upstream main
# 2. Update the template branch
git subtree split --prefix=apps/web --branch web-branch
git push origin web-branch
# 3. Pull into each additional app
git subtree pull --prefix=apps/pdf-chat origin web-branch --squash
git subtree pull --prefix=apps/admin origin web-branch --squash
```
## Troubleshooting
**"fatal: refusing to merge unrelated histories"**
Add the `--squash` flag to ignore history differences:
```bash
git subtree pull --prefix=apps/pdf-chat origin web-branch --squash
```
**Subtree branch doesn't exist on remote**
Push it first:
```bash
git push origin web-branch
```
**Application won't start (port conflict)**
Update the port in `package.json`:
```json
{
"scripts": {
"dev": "next dev --port 3001"
}
}
```
**Shared packages not resolving**
Ensure the new app's `package.json` includes the workspace dependencies:
```json
{
"dependencies": {
"@kit/ui": "workspace:*",
"@kit/supabase": "workspace:*"
}
}
```
Then run `pnpm install` from the repository root.
## Related Resources
- [Adding Turborepo Packages](/docs/next-supabase-turbo/development/adding-turborepo-package) for creating shared packages
- [Technical Details](/docs/next-supabase-turbo/installation/technical-details) for monorepo structure
- [Clone Repository](/docs/next-supabase-turbo/installation/clone-repository) for initial setup

---
status: "published"
label: "Adding a Turborepo Package"
title: "Add a Shared Package to Your Makerkit Monorepo"
description: "Create reusable packages for shared business logic, utilities, or components across your Turborepo monorepo applications."
order: 14
---
Create shared packages in your Makerkit monorepo using `turbo gen` to scaffold a new package at `packages/@kit/<name>`. Shared packages let you reuse business logic, utilities, or components across multiple applications while maintaining a single source of truth.
{% alert type="default" title="When to Create a Package" %}
Create a package when you have code that needs to be shared across multiple applications or when you want to enforce clear boundaries between different parts of your codebase. For code used only in `apps/web`, a folder within the app is simpler.
{% /alert %}
{% sequence title="Create a Shared Package" description="Add a new package to your monorepo" %}
[Generate the package](#step-1-generate-the-package)
[Configure exports](#step-2-configure-exports)
[Add to Next.js config](#step-3-add-to-nextjs-config)
[Use in your application](#step-4-use-the-package)
{% /sequence %}
## When to Create a Package
Create a shared package when:
- **Multiple applications**: Code needs to be used across `apps/web` and other applications
- **Clear boundaries**: You want to enforce separation between different domains
- **Reusable utilities**: Generic utilities that could be used in any application
- **Shared types**: TypeScript types shared across the codebase
Keep code in `apps/web` when:
- It's only used in one application
- It's tightly coupled to specific routes or pages
- Creating a package adds complexity without benefit
## Step 1: Generate the Package
Use the Turborepo generator to scaffold a new package:
```bash
turbo gen
```
Follow the prompts:
1. Select **"Create a new package"**
2. Enter the package name (e.g., `analytics`)
3. Optionally add dependencies
The generator creates a package at `packages/@kit/analytics` with this structure:
```
packages/@kit/analytics/
├── src/
│ └── index.ts
├── package.json
└── tsconfig.json
```
### Package.json Structure
```json {% title="packages/@kit/analytics/package.json" %}
{
"name": "@kit/analytics",
"version": "0.0.1",
"main": "./src/index.ts",
"types": "./src/index.ts",
"exports": {
".": "./src/index.ts"
},
"scripts": {
"typecheck": "tsc --noEmit"
},
"devDependencies": {
"typescript": "^5.9.0"
}
}
```
## Step 2: Configure Exports
### Single Export (Simple)
For packages with a single entry point, export everything from `index.ts`:
```typescript {% title="packages/@kit/analytics/src/index.ts" %}
export { trackEvent, trackPageView } from './tracking';
export { AnalyticsProvider } from './provider';
export type { AnalyticsEvent, AnalyticsConfig } from './types';
```
Import in your application:
```typescript
import { trackEvent, AnalyticsProvider } from '@kit/analytics';
```
### Multiple Exports (Tree-Shaking)
For packages with client and server code, use multiple exports for better tree-shaking:
```json {% title="packages/@kit/analytics/package.json" %}
{
"name": "@kit/analytics",
"exports": {
".": "./src/index.ts",
"./client": "./src/client.ts",
"./server": "./src/server.ts"
}
}
```
Create separate entry points:
```typescript {% title="packages/@kit/analytics/src/client.ts" %}
// Client-side analytics (runs in browser)
export { useAnalytics } from './hooks/use-analytics';
export { AnalyticsProvider } from './components/provider';
```
```typescript {% title="packages/@kit/analytics/src/server.ts" %}
// Server-side analytics (runs on server only)
export { trackServerEvent } from './server/tracking';
export { getAnalyticsClient } from './server/client';
```
Import the specific export:
```typescript
// In a Client Component
import { useAnalytics } from '@kit/analytics/client';
// In a Server Component or Server Action
import { trackServerEvent } from '@kit/analytics/server';
```
### When to Use Multiple Exports
Use multiple exports when:
- **Client/server separation**: Code that should only run in one environment
- **Large packages**: Reduce bundle size by allowing apps to import only what they need
- **Optional features**: Features that not all consumers need
## Step 3: Add to Next.js Config
For hot module replacement (HMR) to work during development, add your package to the `INTERNAL_PACKAGES` array in `apps/web/next.config.mjs`:
```javascript {% title="apps/web/next.config.mjs" %}
const INTERNAL_PACKAGES = [
'@kit/ui',
'@kit/auth',
'@kit/supabase',
// ... existing packages
'@kit/analytics', // Add your new package
];
```
This tells Next.js to:
1. Transpile the package (since it's TypeScript)
2. Watch for changes and trigger HMR
3. Include it in the build optimization
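Internally, a list like this is typically passed to Next.js's `transpilePackages` option. A rough sketch of that wiring (the exact shape of Makerkit's `next.config.mjs` may differ):

```javascript {% title="apps/web/next.config.mjs" %}
// Sketch: hand the internal workspace packages to Next.js so their
// TypeScript sources are transpiled and watched for HMR.
const INTERNAL_PACKAGES = [
  '@kit/ui',
  '@kit/auth',
  '@kit/supabase',
  '@kit/analytics',
];

/** @type {import('next').NextConfig} */
const config = {
  transpilePackages: INTERNAL_PACKAGES,
};

export default config;
```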
## Step 4: Use the Package
### Add as Dependency
Add the package to your application's dependencies:
```json {% title="apps/web/package.json" %}
{
"dependencies": {
"@kit/analytics": "workspace:*"
}
}
```
Run `pnpm install` to link the workspace package.
### Import and Use
```typescript {% title="apps/web/app/layout.tsx" %}
import { AnalyticsProvider } from '@kit/analytics';
export default function RootLayout({ children }) {
return (
<html>
<body>
<AnalyticsProvider>
{children}
</AnalyticsProvider>
</body>
</html>
);
}
```
```typescript {% title="apps/web/app/home/page.tsx" %}
import { trackPageView } from '@kit/analytics';
export default function HomePage() {
trackPageView({ page: 'home' });
return <div>Welcome</div>;
}
```
## Package Development Patterns
### Adding Dependencies
Add dependencies to your package:
```bash
pnpm --filter @kit/analytics add zod
```
### Using Other Workspace Packages
Reference other workspace packages:
```json {% title="packages/@kit/analytics/package.json" %}
{
"dependencies": {
"@kit/shared": "workspace:*"
}
}
```
### TypeScript Configuration
The package's `tsconfig.json` should extend the root configuration:
```json {% title="packages/@kit/analytics/tsconfig.json" %}
{
"extends": "../../../tsconfig.json",
"compilerOptions": {
"outDir": "./dist",
"rootDir": "./src"
},
"include": ["src/**/*"]
}
```
### Testing Packages
Add tests alongside your package code:
```
packages/@kit/analytics/
├── src/
│ ├── index.ts
│ └── tracking.ts
└── tests/
└── tracking.test.ts
```
Run tests:
```bash
pnpm --filter @kit/analytics test
```
## Example: Creating a Feature Package
Here's a complete example of creating a `notifications` package:
### 1. Generate
```bash
turbo gen
# Name: notifications
```
### 2. Structure
```
packages/@kit/notifications/
├── src/
│ ├── index.ts
│ ├── client.ts
│ ├── server.ts
│ ├── components/
│ │ └── notification-bell.tsx
│ ├── hooks/
│ │ └── use-notifications.ts
│ └── server/
│ └── send-notification.ts
└── package.json
```
### 3. Exports
```json {% title="packages/@kit/notifications/package.json" %}
{
"name": "@kit/notifications",
"exports": {
".": "./src/index.ts",
"./client": "./src/client.ts",
"./server": "./src/server.ts"
},
"dependencies": {
"@kit/supabase": "workspace:*"
}
}
```
### 4. Implementation
```typescript {% title="packages/@kit/notifications/src/client.ts" %}
export { NotificationBell } from './components/notification-bell';
export { useNotifications } from './hooks/use-notifications';
```
```typescript {% title="packages/@kit/notifications/src/server.ts" %}
export { sendNotification } from './server/send-notification';
```
### 5. Use
```typescript
// Client Component
import { NotificationBell } from '@kit/notifications/client';
// Server Action
import { sendNotification } from '@kit/notifications/server';
```
## Troubleshooting
**Module not found**
1. Ensure the package is in `INTERNAL_PACKAGES` in `next.config.mjs`
2. Run `pnpm install` to link workspace packages
3. Check the export path matches your import
**Types not resolving**
Ensure `tsconfig.json` includes the package paths:
```json
{
"compilerOptions": {
"paths": {
"@kit/*": ["./packages/@kit/*/src"]
}
}
}
```
**HMR not working**
Verify the package is listed in `INTERNAL_PACKAGES` and restart the dev server.
## Related Resources
- [Adding Turborepo Apps](/docs/next-supabase-turbo/development/adding-turborepo-app) for creating new applications
- [Technical Details](/docs/next-supabase-turbo/installation/technical-details) for monorepo architecture

---
status: "published"
label: "Application tests (E2E)"
title: "Writing Application Tests (E2E) with Playwright"
description: "Learn how to write Application Tests (E2E) with Playwright to test your application and ensure it works as expected"
order: 11
---
End-to-end (E2E) tests are crucial for ensuring your application works correctly from the user's perspective. This guide covers best practices for writing reliable, maintainable E2E tests using Playwright in your Makerkit application.
## Core Testing Principles
### 1. Test Structure and Organization
Your E2E tests are organized in the `apps/e2e/tests/` directory with the following structure:
```
apps/e2e/tests/
├── authentication/ # Auth-related tests
│ ├── auth.spec.ts # Test specifications
│ └── auth.po.ts # Page Object Model
├── team-accounts/ # Team functionality tests
├── invitations/ # Invitation flow tests
├── utils/ # Shared utilities
│ ├── mailbox.ts # Email testing utilities
│ ├── otp.po.ts # OTP verification utilities
│ └── billing.po.ts # Billing test utilities
└── playwright.config.ts # Playwright configuration
```
**Key Principles:**
- Each feature has its own directory with `.spec.ts` and `.po.ts` files
- Shared utilities are in the `utils/` directory
- Page Object Model (POM) pattern is used consistently
### 2. Page Object Model Pattern
The Page Object Model encapsulates page interactions and makes tests more maintainable. Here's how it's implemented:
```typescript
// auth.po.ts
export class AuthPageObject {
private readonly page: Page;
private readonly mailbox: Mailbox;
constructor(page: Page) {
this.page = page;
this.mailbox = new Mailbox(page);
}
async signIn(params: { email: string; password: string }) {
await this.page.fill('input[name="email"]', params.email);
await this.page.fill('input[name="password"]', params.password);
await this.page.click('button[type="submit"]');
}
async signOut() {
await this.page.click('[data-test="account-dropdown-trigger"]');
await this.page.click('[data-test="account-dropdown-sign-out"]');
}
}
```
**Best Practices:**
- Group related functionality in Page Objects
- Use descriptive method names that reflect user actions
- Encapsulate complex workflows in single methods
- Return promises or use async/await consistently
The test file would look like this:
```typescript
import { expect, test } from '@playwright/test';
import { AuthPageObject } from './auth.po';
test.describe('Auth flow', () => {
test.describe.configure({ mode: 'serial' });
let email: string;
let auth: AuthPageObject;
test.beforeEach(async ({ page }) => {
auth = new AuthPageObject(page);
});
test('will sign-up and redirect to the home page', async ({ page }) => {
await auth.goToSignUp();
email = auth.createRandomEmail();
console.log(`Signing up with email ${email} ...`);
await auth.signUp({
email,
password: 'password',
repeatPassword: 'password',
});
await auth.visitConfirmEmailLink(email);
await page.waitForURL('**/home');
});
});
```
1. The test file instantiates the `AuthPageObject` before each test
2. The Page Object wraps the logic for the auth flow so that we can reuse it in the tests
## Data-Test Attributes
Use `data-test` attributes to create stable, semantic selectors that won't break when UI changes.
### ✅ Good: Using data-test attributes
```typescript
// In your React component
<button data-test="submit-button" onClick={handleSubmit}>
Submit
</button>
// In your test
await this.page.click('[data-test="submit-button"]');
```
### ❌ Bad: Using fragile selectors
```typescript
// Fragile - breaks if class names or text changes
await this.page.click('.btn-primary');
await this.page.click('button:has-text("Submit")');
```
### Common Data-Test Patterns
```typescript
// Form elements
<input data-test="email-input" name="email" />
<input data-test="password-input" name="password" />
<button data-test="submit-button" type="submit">Submit</button>
// Navigation
<button data-test="account-dropdown-trigger">Account</button>
<a data-test="settings-link" href="/settings">Settings</a>
// Lists and rows
<div data-test="team-member-row" data-user-id={user.id}>
<span data-test="member-role-badge">{role}</span>
</div>
// Forms with specific purposes
<form data-test="create-team-form">
<input data-test="team-name-input" />
<button data-test="create-team-button">Create</button>
</form>
```
## Retry-ability with expect().toPass()
Use `expect().toPass()` to wrap operations that might be flaky due to timing issues or async operations.
### ✅ Good: Using expect().toPass()
```typescript
async visitConfirmEmailLink(email: string) {
return expect(async () => {
const res = await this.mailbox.visitMailbox(email, { deleteAfter: true });
expect(res).not.toBeNull();
}).toPass();
}
async openAccountsSelector() {
return expect(async () => {
await this.page.click('[data-test="account-selector-trigger"]');
return expect(
this.page.locator('[data-test="account-selector-content"]'),
).toBeVisible();
}).toPass();
}
```
### ❌ Bad: Not using retry mechanisms
```typescript
// This might fail due to timing issues
async openAccountsSelector() {
await this.page.click('[data-test="account-selector-trigger"]');
await expect(
this.page.locator('[data-test="account-selector-content"]'),
).toBeVisible();
}
```
### When to Use expect().toPass()
- **Email operations**: Waiting for emails to arrive
- **Navigation**: Waiting for URL changes after actions
- **Async UI updates**: Operations that trigger network requests
- **External dependencies**: Interactions with third-party services
## Test Isolation and Deterministic Results
Test isolation is crucial for reliable test suites:
1. Make sure each test sets up its own context and data
2. Never rely on data from other tests
3. For maximum isolation, create fresh data for each test. This can be time-consuming, so factor the cost into how you structure your tests
### 1. Independent Test Data
```typescript
// Generate unique test data for each test
createRandomEmail() {
const value = Math.random() * 10000000000000;
return `${value.toFixed(0)}@makerkit.dev`;
}
createTeamName() {
const id = Math.random().toString(36).substring(2, 8);
return {
teamName: `Test Team ${id}`,
slug: `test-team-${id}`,
};
}
```
## Email Testing with Mailbox
The `Mailbox` utility helps test email-dependent flows using Mailpit.
### 1. Basic Email Operations
```typescript
export class Mailbox {
static URL = 'http://127.0.0.1:54324';
async visitMailbox(email: string, params: { deleteAfter: boolean; subject?: string }) {
const json = await this.getEmail(email, params);
if (email !== json.To[0]!.Address) {
throw new Error(`Email address mismatch. Expected ${email}, got ${json.To[0]!.Address}`);
}
const el = parse(json.HTML);
const linkHref = el.querySelector('a')?.getAttribute('href');
return this.page.goto(linkHref);
}
}
```
## Race conditions
Race conditions are common in E2E tests. Testing UIs is inherently asynchronous, and you need to be careful about the order of operations.
In many cases, your application will execute async operations. In such cases, you want to use Playwright's utilities to wait for the operation to complete.
Below is a common pattern for handling async operations in E2E tests:
1. Click the button
2. Wait for the async operation to complete
3. Proceed with the test (expectations, assertions, etc.)
```typescript
const button = page.locator('[data-test="submit-button"]');
const response = page.waitForResponse((resp) => {
return resp.url().includes(`/your-api-endpoint`);
});
await Promise.all([button.click(), response]);
// proceed with the test
```
The pattern above ensures that the test will only proceed once the async operation has completed.
### Handling race conditions using timeouts
Timeouts are generally discouraged in E2E tests. However, in some cases you may want to use them to avoid flaky tests when every other approach has failed.
```tsx
await page.waitForTimeout(1000);
```
In general, during development most operations resolve within 50-100ms, so a timeout in that range is an appropriate starting point if you hit overly flaky tests.
## Testing Checklist
When writing E2E tests, ensure you:
- [ ] Use `data-test` attributes for element selection
- [ ] Implement Page Object Model pattern
- [ ] Wrap flaky operations in `expect().toPass()`
- [ ] Generate unique test data for each test run
- [ ] Clean up state between tests
- [ ] Handle async operations properly
- [ ] Test both happy path and error scenarios
- [ ] Include proper assertions and validations
- [ ] Follow naming conventions for test files and methods
- [ ] Document complex test scenarios
By following these best practices, you'll create robust, maintainable E2E tests that provide reliable feedback about your application's functionality.

---
status: "published"
label: "Getting Started with Development"
order: 0
title: "Local Development Guide for the Next.js Supabase Starter Kit"
description: "Set up your development environment, understand Makerkit's architecture patterns, and navigate the development guides."
---
Start local development by running `pnpm dev` to launch the Next.js app and Supabase services. Makerkit uses a security-first, account-centric architecture where all business data belongs to accounts (personal or team), protected by Row Level Security (RLS) policies enforced at the database level.
{% sequence title="Development Setup" description="Get started with local development" %}
[Start development services](#development-environment)
[Understand the architecture](#development-philosophy)
[Navigate the guides](#development-guides-overview)
[Follow common patterns](#common-development-patterns)
{% /sequence %}
## Development Environment
### Starting Services
```bash
# Start all services (Next.js app + Supabase)
pnpm dev
# Or start individually
pnpm --filter web dev # Next.js app (port 3000)
pnpm run supabase:web:start # Local Supabase
```
### Key URLs
| Service | URL | Purpose |
|---------|-----|---------|
| Main app | http://localhost:3000 | Your application |
| Supabase Studio | http://localhost:54323 | Database admin UI |
| Mailpit (email) | http://localhost:54324 | Local email testing |
### Common Commands
```bash
# Database
pnpm run supabase:web:reset # Reset database to clean state
pnpm --filter web supabase:typegen # Regenerate TypeScript types
# Development
pnpm typecheck # Type check all packages
pnpm lint:fix # Fix linting issues
pnpm format:fix # Format code
```
## Development Philosophy
Makerkit is built around three core principles that guide all development decisions:
### Security by Default
Every feature leverages Row Level Security (RLS) and the permission system. Access controls are built into the database layer, not application code. When you add a new table, you also add RLS policies that enforce who can read, write, and delete data.
### Multi-Tenant from Day One
All business data belongs to accounts (personal or team). This design enables both B2C and B2B use cases while ensuring proper data isolation. Every table that holds user-generated data includes an `account_id` foreign key.
### Type-Safe Development
TypeScript types are auto-generated from your database schema. When you modify the database, run `pnpm --filter web supabase:typegen` to update types. This ensures end-to-end type safety from database to UI.
## Development Guides Overview
### Database & Data Layer
Start here to understand the foundation:
| Guide | Description |
|-------|-------------|
| [Database Architecture](/docs/next-supabase-turbo/development/database-architecture) | Multi-tenant data model, security patterns, core tables |
| [Database Schema](/docs/next-supabase-turbo/development/database-schema) | Add tables, RLS policies, triggers, and relationships |
| [Migrations](/docs/next-supabase-turbo/development/migrations) | Create and apply schema changes |
| [Database Functions](/docs/next-supabase-turbo/development/database-functions) | Built-in functions for permissions, roles, subscriptions |
| [Database Tests](/docs/next-supabase-turbo/development/database-tests) | Test RLS policies with pgTAP |
| [Database Webhooks](/docs/next-supabase-turbo/development/database-webhooks) | React to database changes |
### Application Development
| Guide | Description |
|-------|-------------|
| [Loading Data](/docs/next-supabase-turbo/development/loading-data-from-database) | Fetch data in Server Components and Client Components |
| [Writing Data](/docs/next-supabase-turbo/development/writing-data-to-database) | Server Actions, forms, and mutations |
| [Permissions and Roles](/docs/next-supabase-turbo/development/permissions-and-roles) | RBAC implementation and permission checks |
### Frontend & Marketing
| Guide | Description |
|-------|-------------|
| [Marketing Pages](/docs/next-supabase-turbo/development/marketing-pages) | Landing pages, pricing, FAQ |
| [Legal Pages](/docs/next-supabase-turbo/development/legal-pages) | Privacy policy, terms of service |
| [SEO](/docs/next-supabase-turbo/development/seo) | Metadata, sitemap, structured data |
| [External Marketing Website](/docs/next-supabase-turbo/development/external-marketing-website) | Redirect to Framer, Webflow, etc. |
### Architecture & Testing
| Guide | Description |
|-------|-------------|
| [Application Tests (E2E)](/docs/next-supabase-turbo/development/application-tests) | Playwright E2E testing patterns |
| [Adding Turborepo Apps](/docs/next-supabase-turbo/development/adding-turborepo-app) | Add new applications to the monorepo |
| [Adding Turborepo Packages](/docs/next-supabase-turbo/development/adding-turborepo-package) | Create shared packages |
## Common Development Patterns
### The Account-Centric Pattern
Every business entity references an `account_id`:
```sql
create table public.projects (
id uuid primary key default gen_random_uuid(),
account_id uuid not null references public.accounts(id) on delete cascade,
name text not null,
created_at timestamptz not null default now()
);
```
### The Security-First Pattern
Every table has RLS enabled with explicit policies:
```sql
alter table public.projects enable row level security;
create policy "Members can view their projects"
on public.projects
for select
to authenticated
using (public.has_role_on_account(account_id));
create policy "Users with write permission can create projects"
on public.projects
for insert
to authenticated
with check (public.has_permission(auth.uid(), account_id, 'projects.write'::app_permissions));
```
### The Type-Safe Pattern
Database types are auto-generated:
```typescript
import type { Database } from '@kit/supabase/database';
type Project = Database['public']['Tables']['projects']['Row'];
type NewProject = Database['public']['Tables']['projects']['Insert'];
```
### The Server Action Pattern
Use `authActionClient` for validated, authenticated server actions:
```typescript
import { authActionClient } from '@kit/next/safe-action';
import * as z from 'zod';
const schema = z.object({
name: z.string().min(1),
accountId: z.string().uuid(),
});
export const createProject = authActionClient
.inputSchema(schema)
.action(async ({ parsedInput: data, ctx: { user } }) => {
// data is validated, user is authenticated
const supabase = getSupabaseServerClient();
const { data: project } = await supabase
.from('projects')
.insert({ name: data.name, account_id: data.accountId })
.select()
.single();
return project;
});
```
## Recommended Learning Path
### 1. Foundation (Start Here)
1. [Database Architecture](/docs/next-supabase-turbo/development/database-architecture) - Understand the multi-tenant model
2. [Permissions and Roles](/docs/next-supabase-turbo/development/permissions-and-roles) - Learn RBAC implementation
3. [Database Schema](/docs/next-supabase-turbo/development/database-schema) - Build your first feature
### 2. Core Development
4. [Loading Data](/docs/next-supabase-turbo/development/loading-data-from-database) - Data fetching patterns
5. [Writing Data](/docs/next-supabase-turbo/development/writing-data-to-database) - Forms and mutations
6. [Migrations](/docs/next-supabase-turbo/development/migrations) - Schema change workflow
### 3. Advanced (As Needed)
- [Database Functions](/docs/next-supabase-turbo/development/database-functions) - Custom database logic
- [Database Webhooks](/docs/next-supabase-turbo/development/database-webhooks) - Event-driven features
- [Database Tests](/docs/next-supabase-turbo/development/database-tests) - Test RLS policies
## Next Steps
1. **Read [Database Architecture](/docs/next-supabase-turbo/development/database-architecture)** to understand the foundation
2. **Plan your first feature** - define entities, relationships, and access rules
3. **Implement step-by-step** following the [Database Schema](/docs/next-supabase-turbo/development/database-schema) guide
4. **Test your RLS policies** using [Database Tests](/docs/next-supabase-turbo/development/database-tests)
The guides are designed to be practical and production-ready. Each builds on knowledge from previous ones, developing your expertise with Makerkit's architecture and patterns.

---
title: "Database Architecture in Makerkit"
label: "Database Architecture"
description: "Deep dive into Makerkit's database schema, security model, and best practices for building secure multi-tenant SaaS applications"
---
Makerkit implements a sophisticated, security-first database architecture designed for multi-tenant SaaS applications.
This guide provides a comprehensive overview of the database schema, security patterns, and best practices you should follow when extending the system.
{% sequence title="Database Architecture" description="Deep dive into Makerkit's database schema, security model, and best practices for building secure multi-tenant SaaS applications" %}
[Multi-Tenant Design](#multi-tenant-design)
[Core Tables](#core-tables)
[Authentication & Security](#authentication-security)
[Billing & Commerce](#billing-commerce)
[Features & Functionality](#features-functionality)
[Database Functions & Views](#database-functions-views)
[Database Schema Relationships](#database-schema-relationships)
[Understanding the Database Tables](#understanding-the-database-tables)
[Extending the Database: Decision Trees and Patterns](#extending-the-database-decision-trees-and-patterns)
[Security Model](#security-model)
[Summary](#summary)
{% /sequence %}
### Multi-Tenant Design
Makerkit supports two types of accounts, providing flexibility for both B2C and B2B use cases:
#### Personal Accounts
Individual user accounts where the user ID equals the account ID. Perfect for B2C applications or personal workspaces.
```sql
-- Personal account characteristics
- id = auth.uid() (user's ID)
- is_personal_account = true
- slug = NULL (no public URL needed)
- Automatically created on user signup
```
#### Team Accounts
Shared workspaces with multiple members, roles, and permissions. Ideal for B2B applications or collaborative features.
```sql
-- Team account characteristics
- id = UUID (unique account ID)
- is_personal_account = false
- slug = unique string (for public URLs)
- Members managed through accounts_memberships
```
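In application code, the same distinction can be expressed as a type guard. A minimal sketch, assuming a hypothetical `Account` shape that mirrors the relevant `accounts` columns:

```typescript
// Hypothetical shape mirroring the relevant accounts columns
interface Account {
  id: string;
  is_personal_account: boolean;
  slug: string | null;
}

// Team accounts are non-personal and must carry a slug for public URLs;
// personal accounts have no slug because they need no public URL
function isTeamAccount(account: Account): boolean {
  return !account.is_personal_account && account.slug !== null;
}
```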
### Complete Database Schema
Makerkit's database consists of 17 core tables organized across several functional areas:
#### Core Tables
| Table | Purpose | Key Relationships |
|-------|---------|------------------|
| `accounts` | Multi-tenant accounts (personal/team) | References `auth.users` as owner |
| `accounts_memberships` | Team membership with roles | Links `auth.users` to `accounts` |
| `roles` | Role definitions with hierarchy | Referenced by memberships |
| `role_permissions` | Permissions per role | Links roles to app permissions |
#### Authentication & Security
| Table | Purpose | Key Features |
|-------|---------|--------------|
| `nonces` | OTP for sensitive operations | Purpose-based, auto-expiring |
| `invitations` | Team invitation system | Token-based with role assignment |
#### Billing & Commerce
| Table | Purpose | Provider Support |
|-------|---------|-----------------|
| `billing_customers` | Customer records per provider | Stripe, LemonSqueezy, Paddle |
| `subscriptions` | Active subscriptions | Multiple billing providers |
| `subscription_items` | Subscription line items | Flat, per-seat, metered pricing |
| `orders` | One-time purchases | Product sales, licenses |
| `order_items` | Order line items | Detailed purchase records |
#### Features & Functionality
| Table | Purpose | Key Features |
|-------|---------|--------------|
| `notifications` | Multi-channel notifications | In-app, email, real-time |
#### Database Functions & Views
| Type | Purpose | Security Model |
|------|---------|----------------|
| Views | Data access abstractions | Security invoker for RLS |
| Functions | Business logic & helpers | Security definer with validation |
| Triggers | Data consistency | Automatic field updates |
### Database Schema Relationships
{% img src="/images/database-architecture.webp" width="1000" height="1000" alt="Database Architecture" /%}
## Understanding the Database Tables
This section provides detailed explanations of each table group, their relationships, and practical guidance on how to work with them effectively.
### Core Multi-Tenancy Tables
The foundation of Makerkit's architecture rests on a sophisticated multi-tenant design that seamlessly handles both individual users and collaborative teams.
#### The `accounts` Table: Your Tenancy Foundation
The `accounts` table serves as the cornerstone of Makerkit's multi-tenant architecture. Every piece of data in your application ultimately belongs to an account, making this table critical for data isolation and security.
**When to use personal accounts**: Personal accounts are automatically created when users sign up and are perfect for B2C applications, personal productivity tools, or individual workspaces. The account ID directly matches the user's authentication ID, creating a simple 1:1 relationship that's easy to reason about.
**When to use team accounts**: Team accounts enable collaborative features essential for B2B SaaS applications. They support multiple members with different permission levels, shared resources, and centralized billing. Each team account gets a unique slug for branded URLs like `yourapp.com/acme-corp`.
```sql
-- Example: Creating a team account for collaboration
INSERT INTO accounts (name, is_personal_account, slug)
VALUES ('Acme Corporation', false, 'acme-corp');
```
**Key architectural decisions**: The conditional constraint system ensures data integrity - personal accounts cannot have slugs (they don't need public URLs), while team accounts must have them. This prevents common mistakes and enforces the intended usage patterns.
#### The `accounts_memberships` Table: Team Collaboration Hub
This junction table manages the many-to-many relationship between users and team accounts. It's where team collaboration comes to life through role-based access control.
**Understanding membership lifecycle**: When a team account is created, the creator automatically becomes a member with the highest role. Additional members join through invitations or direct assignment. The composite primary key (user_id, account_id) ensures users can't have duplicate memberships in the same account.
**Role hierarchy in action**: The system uses a numerical hierarchy where lower numbers indicate higher privileges. An owner (hierarchy level 1) can manage all aspects of the account, while members (hierarchy level 2) have limited permissions. Because the level is an integer, inserting a role between two adjacent levels means renumbering existing roles (or leaving numeric gaps from the start).
```sql
-- Example: Adding a member to a team
INSERT INTO accounts_memberships (user_id, account_id, account_role)
VALUES ('user-uuid', 'team-account-uuid', 'member');
```
**Best practices for membership management**: Always validate role hierarchy when promoting or demoting members. The system prevents removing the primary owner's membership to maintain account ownership integrity.
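The hierarchy comparison itself can be sketched as a pure function. This is an illustrative helper, not a Makerkit API; it assumes the lower-number-wins convention described above:

```typescript
// Lower hierarchy_level = more privileges (owner = 1, member = 2)
function canActOnMember(actorLevel: number, targetLevel: number): boolean {
  // An actor may manage members whose role sits strictly below their own
  return actorLevel < targetLevel;
}
```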
#### The `roles` and `role_permissions` Tables: Granular Access Control
These tables work together to provide a flexible, hierarchical permission system that can adapt to complex organizational structures.
**Designing permission systems**: The `roles` table defines named roles with hierarchy levels, while `role_permissions` maps specific permissions to each role. This separation allows you to easily modify what each role can do without restructuring your entire permission system.
**Permission naming conventions**: Permissions follow a `resource.action` pattern (e.g., `billing.manage`, `members.invite`). This makes them self-documenting and easy to understand. When adding new features, follow this pattern to maintain consistency.
```sql
-- Example: creating a 'manager' role between owner (1) and member (2).
-- hierarchy_level is an integer, so make room by renumbering first:
UPDATE roles SET hierarchy_level = 3 WHERE name = 'member';
INSERT INTO roles (name, hierarchy_level) VALUES ('manager', 2);
INSERT INTO role_permissions (role, permission) VALUES
('manager', 'members.manage'),
('manager', 'settings.manage');
```
### Security and Access Control Tables
Makerkit implements multiple layers of security through specialized tables that handle authentication, authorization, and administrative access.
#### The `nonces` Table: Secure Operations Gateway
One-time tokens provide an additional security layer for sensitive operations that go beyond regular authentication. This table manages short-lived, purpose-specific codes that verify user intent for critical actions.
**Understanding token purposes**: Each token has a specific purpose (email verification, password reset, account deletion) and cannot be reused for other operations. This prevents token reuse attacks and ensures proper authorization flows.
**Implementation strategies**: Tokens automatically expire and are limited to specific scopes. When a user requests a new token for the same purpose, previous tokens are invalidated. This prevents accumulation of valid tokens and reduces security risks.
**Security considerations**: Always validate the IP address and user agent when possible. The table tracks these for audit purposes and can help detect suspicious activity.
Please refer to the [One-Time Tokens](../api/otp-api) documentation for more details.
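The validity rules above can be sketched in application code. The `Nonce` shape and field names here are hypothetical, for illustration only:

```typescript
// Hypothetical nonce record; real column names may differ
interface Nonce {
  purpose: string;
  expires_at: string; // ISO timestamp
  used_at: string | null;
}

// A nonce is valid only for its declared purpose, unused, and unexpired
function isNonceValid(nonce: Nonce, purpose: string, now: Date = new Date()): boolean {
  return (
    nonce.purpose === purpose &&
    nonce.used_at === null &&
    new Date(nonce.expires_at) > now
  );
}
```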
#### The `invitations` Table: Secure Team Building
The invitation system enables secure team expansion while maintaining strict access controls. It bridges the gap between open team joining and secure access management.
**Invitation workflow design**: Invitations are token-based with automatic expiration. The inviter's permissions are validated at creation time, ensuring only authorized users can extend invitations. Role assignment happens at invitation time, not acceptance, providing clear expectations.
**Managing invitation security**: Each invitation includes a cryptographically secure token that cannot be guessed. Expired invitations are automatically invalid, and the system tracks who sent each invitation for audit purposes.
```sql
-- Example: Creating a secure invitation
INSERT INTO invitations (email, account_id, role, invite_token, expires_at, invited_by)
VALUES ('new-member@company.com', 'team-uuid', 'member', 'secure-random-token', now() + interval '7 days', 'inviter-uuid');
```
**Best practices for invitations**: Set reasonable expiration times (typically 7 days), validate email addresses before sending, and provide clear role descriptions in invitation emails.
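The token and expiry guidance above can be sketched as two small helpers (an illustration, not Makerkit's internal implementation):

```typescript
import { randomBytes } from 'node:crypto';

// Generate a cryptographically secure invitation token (64 hex chars)
function createInviteToken(): string {
  return randomBytes(32).toString('hex');
}

// Compute an expiry timestamp, defaulting to the recommended 7 days
function inviteExpiry(days = 7, from: Date = new Date()): Date {
  return new Date(from.getTime() + days * 24 * 60 * 60 * 1000);
}
```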
#### The `super_admins` Table: Platform Administration
This table manages platform-level administrators who can perform system-wide operations that transcend individual accounts. It's designed with the highest security standards.
**Admin privilege model**: Super admin status requires multi-factor authentication and is separate from regular account permissions. This creates a clear separation between application users and platform administrators.
**Security enforcement**: All super admin operations require MFA verification through the `is_aal2()` function. This ensures that even if an admin's password is compromised, sensitive operations remain protected.
### Billing and Commerce Infrastructure
Makerkit's billing system is designed to handle complex pricing models across multiple payment providers while maintaining clean data architecture.
#### The `billing_customers` Table: Payment Provider Bridge
This table creates the essential link between your application's accounts and external payment provider customer records. It's the foundation that enables multi-provider billing support.
**Provider abstraction benefits**: By storing customer IDs for each provider separately, you can migrate between billing providers, support multiple providers simultaneously, or offer region-specific payment options without data loss.
**Customer lifecycle management**: When an account first needs billing capabilities, a customer record is created with their chosen provider. This lazy creation approach prevents unnecessary external API calls and keeps your billing clean.
```sql
-- Example: Linking an account to Stripe
INSERT INTO billing_customers (account_id, customer_id, provider)
VALUES ('account-uuid', 'cus_stripe_customer_id', 'stripe');
```
**Multi-provider strategies**: Some applications use different providers for different markets (Stripe for US/EU, local providers for other regions). The table structure supports this with provider-specific customer records.
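Region-based provider routing can be sketched as a simple lookup. The region list here is purely illustrative:

```typescript
type Provider = 'stripe' | 'lemonsqueezy' | 'paddle';

// Illustrative routing table: Stripe for select US/EU markets, Paddle elsewhere
const STRIPE_REGIONS = new Set(['US', 'GB', 'DE', 'FR', 'IT', 'ES']);

function providerForRegion(countryCode: string): Provider {
  return STRIPE_REGIONS.has(countryCode) ? 'stripe' : 'paddle';
}
```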
#### The `subscriptions` and `subscription_items` Tables: Flexible Pricing Models
These tables work together to support sophisticated pricing models including flat-rate, per-seat, and usage-based billing across multiple products and features.
**Subscription architecture**: The parent `subscriptions` table tracks overall subscription status, billing periods, and provider information. Child `subscription_items` handle individual components, enabling complex pricing like "basic plan + extra seats + API usage."
**Pricing model flexibility**: The `type` field in subscription items enables different billing models:
- **Flat**: Fixed monthly/yearly pricing
- **Per-seat**: Automatically adjusted based on team size
- **Metered**: Based on usage (API calls, storage, etc.)
```sql
-- Example: Complex subscription with multiple items
-- Base plan + per-seat pricing + metered API usage
INSERT INTO subscription_items (subscription_id, price_id, quantity, type) VALUES
('sub-uuid', 'price_base_plan', 1, 'flat'),
('sub-uuid', 'price_per_seat', 5, 'per_seat'),
('sub-uuid', 'price_api_calls', 0, 'metered');
```
**Automatic seat management**: The per-seat billing service automatically adjusts quantities when team members are added or removed. This eliminates manual billing adjustments and ensures accurate charges.
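The seat-adjustment idea can be sketched as a pure function over subscription items (illustrative only; Makerkit's per-seat billing service handles this internally):

```typescript
interface SubscriptionItem {
  type: 'flat' | 'per_seat' | 'metered';
  quantity: number;
}

// Sync per-seat items to the current member count, leaving others untouched
function adjustSeats(items: SubscriptionItem[], memberCount: number): SubscriptionItem[] {
  return items.map((item) =>
    item.type === 'per_seat' ? { ...item, quantity: memberCount } : item,
  );
}
```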
#### The `orders` and `order_items` Tables: One-Time Purchases
These tables handle non-recurring transactions like product purchases, one-time fees, or license sales that complement subscription revenue.
**Order vs subscription distinction**: Orders represent completed transactions for specific products or services, while subscriptions handle recurring billing. This separation enables hybrid business models with both recurring and one-time revenue streams.
**Order fulfillment tracking**: Orders include status tracking and detailed line items for complex transactions. This supports scenarios like software licenses, premium features, or physical products.
### Application Feature Tables
#### The `notifications` Table: Multi-Channel Communication
This table powers Makerkit's notification system, supporting both in-app notifications and email delivery with sophisticated targeting and lifecycle management.
**Channel strategy**: Notifications can target specific channels (in-app, email) or both. This enables rich notification experiences where users see immediate in-app alerts backed by email records for important updates.
**Lifecycle management**: Notifications include dismissal tracking and automatic expiration. This prevents notification bloat while ensuring important messages reach users. The metadata JSONB field stores channel-specific data like email templates or push notification payloads.
```sql
-- Example: Creating a billing notification
INSERT INTO notifications (account_id, type, channel, metadata, expires_at)
VALUES ('account-uuid', 'billing_issue', 'in_app', '{"severity": "high", "action_url": "/billing"}', now() + interval '30 days');
```
**Performance considerations**: Index notifications by account_id and dismissed status for fast user queries. Consider archiving old notifications to maintain performance as your application scales.
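A sketch of the suggested index, assuming the `dismissed` boolean column shown later in this guide; a partial index keeps the common "undismissed notifications" query fast:

```sql
-- Partial index covering the hot path: undismissed notifications per account
CREATE INDEX idx_notifications_account_undismissed
  ON public.notifications (account_id)
  WHERE dismissed = false;
```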
## Extending the Database: Decision Trees and Patterns
Understanding when and how to extend Makerkit's database requires careful consideration of data ownership, security, and scalability. This section provides practical guidance for common scenarios.
### Adding New Feature Tables
When building new features, you'll need to decide how they integrate with the existing multi-tenant architecture. Here's a decision framework:
#### Step 1: Determine Data Ownership
**Question**: Who owns this data - individual users or accounts?
**User-owned data**: Data like user preferences, personal settings, or individual activity logs should reference `auth.users` directly. This data follows the user across all their account memberships.
```sql
-- Example: User preferences that follow the user everywhere
CREATE TABLE user_preferences (
user_id uuid REFERENCES auth.users(id) ON DELETE CASCADE,
theme varchar(20) DEFAULT 'light',
language varchar(10) DEFAULT 'en',
email_notifications boolean DEFAULT true
);
```
**Account-owned data**: Business data, shared resources, and collaborative content should reference `accounts`. This ensures proper multi-tenant isolation and enables team collaboration.
```sql
-- Example: Account-owned documents with proper tenancy
CREATE TABLE documents (
id uuid PRIMARY KEY DEFAULT gen_random_uuid(),
account_id uuid REFERENCES accounts(id) ON DELETE CASCADE,
title text NOT NULL,
content text,
created_by uuid REFERENCES auth.users(id),
-- Always include account_id for multi-tenancy
CONSTRAINT documents_account_ownership CHECK (account_id IS NOT NULL)
);
```
#### Step 2: Define Access Patterns
**Public data within account**: Use standard RLS patterns that allow all account members to read but restrict writes based on permissions.
**Private data within account**: Add a `created_by` field and restrict access to the creator plus users with specific permissions.
**Hierarchical data**: Consider department-level or project-level access within accounts for complex organizations.
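A sketch of the "private data within account" pattern, using a hypothetical `private_notes` table and the helper functions covered later in this guide:

```sql
-- Hypothetical table: notes visible only to their creator
-- or to members holding an explicit management permission
CREATE POLICY "private_notes_read" ON public.private_notes
  FOR SELECT TO authenticated
  USING (
    created_by = (select auth.uid())
    OR public.has_permission(auth.uid(), account_id, 'notes.manage')
  );
```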
### Common Table Patterns
#### Pattern 1: Simple Account-Owned Resources
Most feature tables follow this pattern. They belong to an account and have basic RLS policies.
```sql
-- Template for account-owned resources
CREATE TABLE your_feature (
id uuid PRIMARY KEY DEFAULT gen_random_uuid(),
account_id uuid REFERENCES accounts(id) ON DELETE CASCADE NOT NULL,
name text NOT NULL,
description text,
created_at timestamptz DEFAULT now(),
updated_at timestamptz DEFAULT now(),
created_by uuid REFERENCES auth.users(id),
updated_by uuid REFERENCES auth.users(id)
);
-- Standard RLS policy
CREATE POLICY "feature_account_access" ON your_feature
FOR ALL TO authenticated
USING (public.has_role_on_account(account_id))
WITH CHECK (public.has_permission(auth.uid(), account_id, 'feature.manage'));
```
#### Pattern 2: Hierarchical Resources
For features that need sub-categories or nested structures within accounts.
```sql
-- Example: Project categories with hierarchy
CREATE TABLE project_categories (
  id uuid PRIMARY KEY DEFAULT gen_random_uuid(),
  account_id uuid REFERENCES accounts(id) ON DELETE CASCADE NOT NULL,
  parent_id uuid,
  name text NOT NULL,
  path ltree, -- PostgreSQL ltree for efficient tree operations
  -- Composite key lets child rows reference the parent together with its account
  UNIQUE (id, account_id),
  -- Ensure hierarchy stays within the same account (CHECK constraints
  -- cannot contain subqueries in PostgreSQL, so use a composite FK instead)
  FOREIGN KEY (parent_id, account_id)
    REFERENCES project_categories (id, account_id) ON DELETE CASCADE
);
```
#### Pattern 3: Permission-Gated Features
For sensitive features that require specific permissions beyond basic account membership.
```sql
-- Example: Financial reports requiring special permissions
CREATE TABLE financial_reports (
id uuid PRIMARY KEY DEFAULT gen_random_uuid(),
account_id uuid REFERENCES accounts(id) ON DELETE CASCADE NOT NULL,
report_data jsonb NOT NULL,
period_start date NOT NULL,
period_end date NOT NULL,
created_by uuid REFERENCES auth.users(id)
);
-- Restrictive RLS requiring specific permission
CREATE POLICY "financial_reports_access" ON financial_reports
FOR ALL TO authenticated
USING (public.has_permission(auth.uid(), account_id, 'reports.financial'))
WITH CHECK (public.has_permission(auth.uid(), account_id, 'reports.financial'));
```
### Integration with Billing
When adding features that affect billing, consider these patterns:
#### Feature Access Control
For subscription-gated features, create lookup tables that determine feature availability.
```sql
-- Example: Feature access based on subscription
CREATE TABLE subscription_features (
subscription_id uuid REFERENCES subscriptions(id) ON DELETE CASCADE,
feature_name text NOT NULL,
enabled boolean DEFAULT true,
usage_limit integer, -- NULL means unlimited
PRIMARY KEY (subscription_id, feature_name)
);
-- Helper function to check feature access
CREATE OR REPLACE FUNCTION has_feature_access(
target_account_id uuid,
feature_name text
) RETURNS boolean AS $$
DECLARE
has_access boolean := false;
BEGIN
SELECT sf.enabled INTO has_access
FROM subscriptions s
JOIN subscription_features sf ON s.id = sf.subscription_id
WHERE s.account_id = target_account_id
AND sf.feature_name = has_feature_access.feature_name
AND s.active = true;
RETURN COALESCE(has_access, false);
END;
$$ LANGUAGE plpgsql;
```
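On the application side, a usage-limit check over the same data can be sketched as a pure helper (illustrative shape, not a Makerkit API):

```typescript
interface FeatureEntry {
  enabled: boolean;
  usage_limit: number | null; // null means unlimited
}

// A feature is usable when enabled and under its usage limit (if any)
function withinLimit(feature: FeatureEntry, currentUsage: number): boolean {
  if (!feature.enabled) return false;
  return feature.usage_limit === null || currentUsage < feature.usage_limit;
}
```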
### Security Best Practices for Extensions
#### Always Enable RLS
Never create a table without enabling Row Level Security. This should be your default approach.
```sql
-- ALWAYS do this for new tables
CREATE TABLE your_new_table (...);
ALTER TABLE your_new_table ENABLE ROW LEVEL SECURITY;
```
#### Validate Cross-Account References
When tables reference multiple accounts, ensure data integrity through constraints.
```sql
-- Example: Collaboration requests between accounts
CREATE TABLE collaboration_requests (
id uuid PRIMARY KEY DEFAULT gen_random_uuid(),
from_account_id uuid REFERENCES accounts(id) ON DELETE CASCADE,
to_account_id uuid REFERENCES accounts(id) ON DELETE CASCADE,
status text CHECK (status IN ('pending', 'accepted', 'rejected')),
-- Prevent self-collaboration
CONSTRAINT no_self_collaboration CHECK (from_account_id != to_account_id)
);
```
### Key Design Principles
1. **Account-Centric**: All data associates with accounts via foreign keys for proper multi-tenancy
2. **Security by Default**: RLS enabled on all tables with explicit permission checks
3. **Provider Agnostic**: Billing supports multiple payment providers (Stripe, LemonSqueezy, Paddle)
4. **Audit Ready**: Comprehensive tracking with created_by, updated_by, timestamps
5. **Scalable**: Proper indexing and cascade relationships for performance
## Security Model
### Row Level Security (RLS)
> **⚠️ CRITICAL WARNING**: Always enable RLS on new tables. This is your first line of defense against unauthorized access.
Makerkit enforces RLS on all tables with carefully crafted policies:
```sql
-- Example: Notes table with proper RLS
CREATE TABLE if not exists public.notes (
id uuid primary key default gen_random_uuid(),
account_id uuid references public.accounts(id) on delete cascade,
content text,
created_by uuid references auth.users(id)
);
-- Enable RLS (NEVER SKIP THIS!)
ALTER TABLE public.notes ENABLE ROW LEVEL SECURITY;
-- Read policy: Owner or team member can read
CREATE POLICY "notes_read" ON public.notes FOR SELECT
TO authenticated USING (
account_id = (select auth.uid()) -- Personal account
OR
public.has_role_on_account(account_id) -- Team member
);
-- Write policy: Specific permission required
CREATE POLICY "notes_manage" ON public.notes FOR ALL
TO authenticated USING (
public.has_permission(auth.uid(), account_id, 'notes.manage'::app_permissions)
);
```
### Security Helper Functions
Makerkit provides battle-tested security functions. **Always use these instead of creating your own**:
#### Account Access Functions
```sql
-- Check if user owns the account
public.is_account_owner(account_id)
-- Check if user is a team member
public.has_role_on_account(account_id, role?)
-- Check specific permission
public.has_permission(user_id, account_id, permission)
-- Check if user can manage another member
public.can_action_account_member(account_id, target_user_id)
```
#### Security Check Functions
```sql
-- Verify user is super admin
public.is_super_admin()
-- Check MFA compliance
public.is_aal2()
public.is_mfa_compliant()
-- Check feature flags
public.is_set(field_name)
```
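These checks compose naturally inside RLS policies. A sketch guarding a hypothetical admin-only table with both super-admin status and MFA:

```sql
-- Hypothetical table restricted to MFA-verified super admins
CREATE POLICY "admin_audit_read" ON public.admin_audit_log
  FOR SELECT TO authenticated
  USING (public.is_super_admin() AND public.is_aal2());
```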
### SECURITY DEFINER Functions
> **🚨 DANGER**: SECURITY DEFINER functions bypass RLS. Only use when absolutely necessary and ALWAYS validate permissions first.
#### ❌ Bad Pattern - Never Do This
```sql
CREATE FUNCTION dangerous_delete_all()
RETURNS void
SECURITY DEFINER AS $$
BEGIN
-- This bypasses ALL security!
DELETE FROM sensitive_table;
END;
$$ LANGUAGE plpgsql;
```
#### ✅ Good Pattern - Always Validate First
```sql
CREATE FUNCTION safe_admin_operation(target_account_id uuid)
RETURNS void
SECURITY DEFINER
SET search_path = '' AS $$
BEGIN
-- MUST validate permissions FIRST
IF NOT public.is_account_owner(target_account_id) THEN
RAISE EXCEPTION 'Access denied: insufficient permissions';
END IF;
-- Now safe to proceed
-- Your operation here
END;
$$ LANGUAGE plpgsql;
```
## Core Tables Explained
### Accounts Table
The heart of the multi-tenant system:
```sql
public.accounts (
id -- UUID: Account identifier
primary_owner_user_id -- UUID: Account owner (ref auth.users)
name -- String: Display name
slug -- String: URL slug (NULL for personal)
email -- String: Contact email
is_personal_account -- Boolean: Account type
picture_url -- String: Avatar URL
public_data -- JSONB: Public metadata
)
```
**Key Features**:
- Automatic slug generation for team accounts
- Conditional constraints based on account type
- Protected fields preventing unauthorized updates
- Cascade deletion for data cleanup
### Memberships Table
Links users to team accounts with roles:
```sql
public.accounts_memberships (
user_id -- UUID: Member's user ID
account_id -- UUID: Team account ID
account_role -- String: Role name (owner/member)
PRIMARY KEY (user_id, account_id)
)
```
**Key Features**:
- Composite primary key prevents duplicates
- Role-based access control
- Automatic owner membership on account creation
- Prevention of owner removal
### Roles and Permissions
Hierarchical permission system:
```sql
public.roles (
name -- String: Role identifier
hierarchy_level -- Integer: Permission level (lower = more access)
)
public.role_permissions (
role -- String: Role name
permission -- Enum: Specific permission
)
```
**Available Permissions**:
- `roles.manage` - Manage team roles
- `billing.manage` - Handle billing
- `settings.manage` - Update settings
- `members.manage` - Manage members
- `invites.manage` - Send invitations
## Billing Architecture
### Subscription Model
```sql
billing_customers (
account_id -- Account reference
customer_id -- Provider's customer ID
provider -- stripe/lemonsqueezy/paddle
)
subscriptions (
customer_id -- Billing customer
status -- active/canceled/past_due
period_starts_at -- Current period start
period_ends_at -- Current period end
)
subscription_items (
subscription_id -- Parent subscription
price_id -- Provider's price ID
quantity -- Seats or usage
type -- flat/per_seat/metered
)
```
## Advanced Features
### Invitation System
Secure, token-based invitations:
```sql
public.invitations (
email -- Invitee's email
account_id -- Target team
invite_token -- Secure random token
expires_at -- Expiration timestamp
role -- Assigned role
)
```
**Security Features**:
- Unique tokens per invitation
- Automatic expiration
- Role hierarchy validation
- Batch invitation support
Generally speaking, you don't need to use this table directly unless you are customizing the invitation system.
### One-Time Tokens
The `nonces` table (described earlier) backs short-lived, purpose-specific tokens.
**Use Cases**:
- Email verification
- Password reset
- Sensitive operations
- Account deletion
### Notifications
Multi-channel notification system:
```sql
public.notifications (
account_id -- Target account
channel -- in_app/email
type -- Notification category
dismissed -- Read status
expires_at -- Auto-cleanup
metadata -- Additional data
)
```
### Creating New Tables
```sql
-- 1. Create table with proper structure
CREATE TABLE if not exists public.your_table (
id uuid PRIMARY KEY DEFAULT gen_random_uuid(),
account_id uuid REFERENCES accounts(id) ON DELETE CASCADE NOT NULL,
created_at timestamptz DEFAULT now() NOT NULL,
updated_at timestamptz DEFAULT now() NOT NULL,
created_by uuid REFERENCES auth.users(id),
-- your fields here
);
-- 2. Add comments for documentation
COMMENT ON TABLE public.your_table IS 'Description of your table';
COMMENT ON COLUMN public.your_table.account_id IS 'Account ownership';
-- 3. Create indexes for performance
CREATE INDEX idx_your_table_account_id ON public.your_table(account_id);
CREATE INDEX idx_your_table_created_at ON public.your_table(created_at DESC);
-- 4. Enable RLS (NEVER SKIP!)
ALTER TABLE public.your_table ENABLE ROW LEVEL SECURITY;
-- 5. Grant appropriate access
REVOKE ALL ON public.your_table FROM authenticated, service_role;
GRANT SELECT, INSERT, UPDATE, DELETE ON public.your_table TO authenticated;
-- 6. Create RLS policies
CREATE POLICY "your_table_select" ON public.your_table
FOR SELECT TO authenticated
USING (
account_id = (select auth.uid())
OR public.has_role_on_account(account_id)
);
CREATE POLICY "your_table_insert" ON public.your_table
FOR INSERT TO authenticated
WITH CHECK (
account_id = (select auth.uid())
OR public.has_permission(auth.uid(), account_id, 'your_feature.create')
);
```
### Creating Views
```sql
-- Always use security invoker for views
CREATE VIEW public.your_view
WITH (security_invoker = true) AS
SELECT
t.*,
a.name as account_name
FROM your_table t
JOIN accounts a ON a.id = t.account_id;
-- Grant access
GRANT SELECT ON public.your_view TO authenticated;
```
{% alert type="warning" title="Security Invoker for Views" %}
Always use security invoker set to true for views.
{% /alert %}
### Writing Triggers
```sql
-- Update timestamp trigger
CREATE TRIGGER update_your_table_updated_at
BEFORE UPDATE ON public.your_table
FOR EACH ROW
EXECUTE FUNCTION kit.trigger_set_timestamps();
-- Audit trigger
CREATE TRIGGER track_your_table_changes
BEFORE INSERT OR UPDATE ON public.your_table
FOR EACH ROW
EXECUTE FUNCTION kit.trigger_set_user_tracking();
```
### Storage Security
When implementing file storage:
```sql
-- Create bucket with proper RLS
INSERT INTO storage.buckets (id, name, public)
VALUES ('your_files', 'your_files', false);
-- RLS policy validating account ownership
CREATE POLICY "your_files_policy" ON storage.objects
FOR ALL USING (
bucket_id = 'your_files'
AND public.has_role_on_account(
(storage.foldername(name))[1]::uuid
)
);
```
**Note:** The policy above assumes the first folder segment of the object path — `(storage.foldername(name))[1]::uuid` — is the account ID.
Scope each account's files under a folder named after the account ID so this RLS policy can isolate them from other accounts' files.
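On the client side, building paths that match the `(storage.foldername(name))[1]` convention can be sketched as (hypothetical helper):

```typescript
// Build an object path whose first folder segment is the account id,
// so the storage RLS policy can derive ownership from the path
function accountStoragePath(accountId: string, fileName: string): string {
  return `${accountId}/${fileName}`;
}
```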
## Summary
Makerkit's database architecture provides:
- ✅ **Secure multi-tenancy** with RLS and permission checks
- ✅ **Flexible account types** for B2C and B2B use cases
- ✅ **Comprehensive billing** support for multiple providers
- ✅ **Built-in security** patterns and helper functions
- ✅ **Scalable design** with proper indexes and constraints
By following these patterns and best practices, you can confidently extend Makerkit's database while maintaining security, performance, and data integrity.
Remember: when in doubt, always err on the side of security and use the provided helper functions rather than creating custom solutions.

---
status: "published"
label: "Database Functions"
order: 3
title: "PostgreSQL Database Functions for Multi-Tenant SaaS"
description: "Use built-in database functions for permissions, roles, subscriptions, and MFA checks. Includes has_permission, is_account_owner, has_active_subscription, and more."
---
Makerkit includes built-in PostgreSQL functions for common multi-tenant operations like permission checks, role verification, and subscription status. Use these functions in RLS policies and application code to enforce consistent security rules across your database.
{% sequence title="Database Functions Reference" description="Built-in functions for multi-tenant operations" %}
[Call functions from SQL and RPC](#calling-database-functions)
[Account ownership and membership](#account-functions)
[Permission checks](#permission-functions)
[Subscription and billing](#subscription-functions)
[MFA and authentication](#authentication-functions)
{% /sequence %}
## Calling Database Functions
### From SQL (RLS Policies)
Use functions directly in SQL schemas and RLS policies:
```sql
-- In an RLS policy
create policy "Users can view their projects"
on public.projects
for select
using (
public.has_role_on_account(account_id)
);
```
### From Application Code (RPC)
Call functions via Supabase RPC:
```tsx
const { data: isOwner, error } = await supabase.rpc('is_account_owner', {
account_id: accountId,
});
if (isOwner) {
// User owns this account
}
```
## Account Functions
### is_account_owner
Check if the current user owns an account. Returns `true` if the account is the user's personal account or if they created a team account.
```sql
public.is_account_owner(account_id uuid) returns boolean
```
**Use cases:**
- Restrict account deletion to owners
- Gate billing management
- Control team settings access
**Example RLS:**
```sql
create policy "Only owners can delete accounts"
on public.accounts
for delete
using (public.is_account_owner(id));
```
### has_role_on_account
Check if the current user has membership on an account, optionally with a specific role.
```sql
public.has_role_on_account(
account_id uuid,
account_role varchar(50) default null
) returns boolean
```
**Parameters:**
- `account_id`: The account to check
- `account_role`: Optional role name (e.g., `'owner'`, `'member'`). If omitted, returns `true` for any membership.
**Example RLS:**
```sql
-- Any member can view
create policy "Members can view projects"
on public.projects
for select
using (public.has_role_on_account(account_id));
-- Only owners can update
create policy "Owners can update projects"
on public.projects
for update
using (public.has_role_on_account(account_id, 'owner'));
```
### is_team_member
Check if a specific user is a member of a team account.
```sql
public.is_team_member(
account_id uuid,
user_id uuid
) returns boolean
```
**Use case:** Verify team membership when the current user context isn't available.
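For example, a hypothetical RLS policy (table and column names are illustrative) could use it to hide rows whose target user is no longer on the team:

```sql
-- Illustrative: assignments are visible to members only while
-- the assignee is still a member of the account
create policy "Assignments visible while assignee is a member"
  on public.task_assignments
  for select
  using (
    public.has_role_on_account(account_id)
    and public.is_team_member(account_id, assigned_user_id)
  );
```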
### can_action_account_member
Check if the current user can perform actions on another team member (remove, change role, etc.).
```sql
public.can_action_account_member(
target_team_account_id uuid,
target_user_id uuid
) returns boolean
```
**Logic:**
1. If current user is account owner: `true`
2. If target user is account owner: `false`
3. Otherwise: Compare role hierarchy levels
**Example:**
```tsx
const { data: canRemove } = await supabase.rpc('can_action_account_member', {
target_team_account_id: teamId,
target_user_id: memberId,
});
if (!canRemove) {
throw new Error('Cannot remove a user with equal or higher role');
}
```
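Step 3 of the logic above compares the two users' role hierarchy levels. A minimal TypeScript sketch of that comparison (role names and levels are illustrative — the real function reads them from the roles table):

```typescript
// Illustrative hierarchy levels; in Makerkit these live in the roles table
// (lower number = more privileged)
const roleHierarchy: Record<string, number> = {
  owner: 1,
  admin: 2,
  member: 3,
};

// Mirrors step 3: a non-owner actor can only act on members whose
// role is strictly less privileged than their own
function canActionMember(actorRole: string, targetRole: string): boolean {
  return roleHierarchy[actorRole] < roleHierarchy[targetRole];
}
```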
## Permission Functions
### has_permission
Check if a user has a specific permission on an account. This is the primary function for granular access control.
```sql
public.has_permission(
user_id uuid,
account_id uuid,
permission_name app_permissions
) returns boolean
```
**Parameters:**
- `user_id`: The user to check (use `auth.uid()` for current user)
- `account_id`: The account context
- `permission_name`: A value from the `app_permissions` enum
**Default permissions:**
```sql
create type public.app_permissions as enum(
'roles.manage',
'billing.manage',
'settings.manage',
'members.manage',
'invites.manage'
);
```
**Example RLS:**
```sql
create policy "Users with tasks.write can insert tasks"
on public.tasks
for insert
with check (
public.has_permission(auth.uid(), account_id, 'tasks.write'::app_permissions)
);
```
**Example RPC:**
```tsx
async function checkTaskWritePermission(accountId: string) {
  const {
    data: { user },
  } = await supabase.auth.getUser();

  const { data: hasPermission } = await supabase.rpc('has_permission', {
    user_id: user?.id,
    account_id: accountId,
    permission_name: 'tasks.write',
  });

  return hasPermission;
}
```
See [Permissions and Roles](/docs/next-supabase-turbo/development/permissions-and-roles) for adding custom permissions.
## Subscription Functions
### has_active_subscription
Check if an account has an active or trialing subscription.
```sql
public.has_active_subscription(account_id uuid) returns boolean
```
**Returns `true` when:**
- Subscription status is `active`
- Subscription status is `trialing`
**Returns `false` when:**
- No subscription exists
- Status is `canceled`, `past_due`, `unpaid`, `incomplete`, etc.
**Example RLS:**
```sql
create policy "Only paid accounts can create projects"
on public.projects
for insert
with check (
public.has_active_subscription(account_id)
);
```
**Example application code:**
```tsx
const { data: isPaid } = await supabase.rpc('has_active_subscription', {
account_id: accountId,
});
if (!isPaid) {
redirect('/pricing');
}
```
## Authentication Functions
### is_super_admin
Check if the current user is a super admin. Requires:
- User is authenticated
- User has `super_admin` role
- User is currently signed in with MFA (AAL2)
```sql
public.is_super_admin() returns boolean
```
**Example RLS:**
```sql
create policy "Super admins can view all accounts"
on public.accounts
for select
using (public.is_super_admin());
```
### is_mfa_compliant
Check if the current user meets MFA requirements. Returns `true` when:
- User enabled MFA and is signed in with MFA (AAL2)
- User has not enabled MFA (AAL1 is sufficient)
```sql
public.is_mfa_compliant() returns boolean
```
**Use case:** Allow users without MFA to continue normally while enforcing MFA for users who enabled it.
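For instance, a hypothetical restrictive policy (table name illustrative) can enforce this across sensitive reads:

```sql
-- Illustrative: users who enabled MFA must be signed in at AAL2
-- to read stored API keys; users without MFA are unaffected
create policy "MFA compliance required for API keys"
  on public.api_keys
  as restrictive
  for select
  using (public.is_mfa_compliant());
```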
### is_aal2
Check if the current user is signed in with MFA (AAL2 authentication level).
```sql
public.is_aal2() returns boolean
```
**Use case:** Require MFA for sensitive operations regardless of user settings.
**Example:**
```sql
-- Require MFA for billing operations
create policy "MFA required for billing changes"
on public.billing_settings
for all
using (public.is_aal2());
```
## Configuration Functions
### is_set
Check if a configuration value is set in the `public.config` table.
```sql
public.is_set(field_name text) returns boolean
```
**Example:**
```sql
-- Check if a feature flag is enabled
select public.is_set('enable_new_dashboard');
```
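Because it returns a plain boolean, it can also gate access in an RLS policy — a sketch, with an illustrative table name:

```sql
-- Illustrative: hide beta content unless the flag is set
create policy "Beta dashboard only when enabled"
  on public.beta_dashboards
  for select
  using (public.is_set('enable_new_dashboard'));
```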
## Function Reference Table
| Function | Purpose | Common Use |
|----------|---------|------------|
| `is_account_owner(account_id)` | Check account ownership | Delete, billing access |
| `has_role_on_account(account_id, role?)` | Check membership/role | View, edit access |
| `is_team_member(account_id, user_id)` | Check specific user membership | Team operations |
| `can_action_account_member(account_id, user_id)` | Check member management rights | Remove, role change |
| `has_permission(user_id, account_id, permission)` | Check granular permission | Feature access |
| `has_active_subscription(account_id)` | Check billing status | Paid features |
| `is_super_admin()` | Check admin status | Admin operations |
| `is_mfa_compliant()` | Check MFA compliance | Security policies |
| `is_aal2()` | Check MFA authentication | Sensitive operations |
## Related Resources
- [Permissions and Roles](/docs/next-supabase-turbo/development/permissions-and-roles) for adding custom permissions
- [Database Schema](/docs/next-supabase-turbo/development/database-schema) for extending your schema
- [Row Level Security](/docs/next-supabase-turbo/security/row-level-security) for RLS patterns
- [Database Tests](/docs/next-supabase-turbo/development/database-tests) for testing database functions

---
status: "published"
label: "Extending the DB Schema"
order: 2
title: "Extending the Database Schema in Next.js Supabase"
description: "Learn how to create new migrations and update the database schema in your Next.js Supabase application"
---
{% sequence title="Steps to create a new migration" description="Learn how to create new migrations and update the database schema in your Next.js Supabase application" %}
[Planning Your Schema Extension](#planning-your-schema-extension)
[Creating Schema Files](#creating-schema-files)
[Permissions and Access Control](#permissions-and-access-control)
[Building Tables with RLS](#building-tables-with-rls)
[Advanced Patterns](#advanced-patterns)
[Testing and Deployment](#testing-and-deployment)
{% /sequence %}
This guide walks you through extending Makerkit's database schema with new tables and features. We'll use a comprehensive example that demonstrates best practices, security patterns, and integration with Makerkit's multi-tenant architecture.
## Planning Your Schema Extension
Before writing any SQL, it's crucial to understand how your new features fit into Makerkit's multi-tenant architecture.
### Decision Framework
**Step 1: Determine Data Ownership**
Ask yourself: "Who owns this data - individual users or accounts?"
- **User-owned data**: Personal preferences, activity logs, user settings
- **Account-owned data**: Business content, shared resources, collaborative features
**Step 2: Define Access Patterns**
- **Public within account**: All team members can access
- **Private within account**: Only creator + specific permissions
- **Admin-only**: Requires special permissions or super admin access
**Step 3: Consider Integration Points**
- Does this feature affect billing? (usage tracking, feature gates)
- Does it need notifications? (in-app alerts, email triggers)
- Should it have audit trails? (compliance, change tracking)
## Creating Schema Files
Makerkit organizes database schema in numbered files for proper ordering. Follow this workflow:
### 1. Create Your Schema File
```bash
# Create a new schema file with the next number
touch apps/web/supabase/schemas/18-notes-feature.sql
```
### 2. Apply Development Workflow
```bash
# Start Supabase
pnpm supabase:web:start
# Create migration from your schema file
pnpm --filter web run supabase:db:diff -f notes-feature
# Restart with new schema
pnpm supabase:web:reset
# Generate TypeScript types
pnpm supabase:web:typegen
```
## Permissions and Access Control
### Adding New Permissions
Makerkit defines permissions in the `public.app_permissions` enum. Add feature-specific permissions:
```sql
-- Add new permissions for your feature
ALTER TYPE public.app_permissions ADD VALUE 'notes.create';
ALTER TYPE public.app_permissions ADD VALUE 'notes.manage';
ALTER TYPE public.app_permissions ADD VALUE 'notes.delete';
ALTER TYPE public.app_permissions ADD VALUE 'notes.share';
COMMIT;
```
**Note:** The Supabase diff function does not support adding new permissions to enum types. Please add the new permissions manually instead of using the diff function.
**Permission Naming Convention**: Use the pattern `resource.action` for consistency:
- `notes.create` - Create new notes
- `notes.manage` - Edit existing notes
- `notes.delete` - Delete notes
- `notes.share` - Share with external users
### Role Assignment
Consider which roles should have which permissions by default:
```sql
-- Grant permissions to roles
INSERT INTO public.role_permissions (role, permission) VALUES
('owner', 'notes.create'),
('owner', 'notes.manage'),
('owner', 'notes.delete'),
('owner', 'notes.share'),
('member', 'notes.create'),
('member', 'notes.manage');
```
## Building Tables with RLS
Let's create a comprehensive notes feature that demonstrates various patterns and best practices.
### Core Notes Table
```sql
-- Create the main notes table with all standard fields
CREATE TABLE IF NOT EXISTS public.notes (
id uuid PRIMARY KEY DEFAULT gen_random_uuid(),
account_id uuid NOT NULL REFERENCES public.accounts(id) ON DELETE CASCADE,
title varchar(500) NOT NULL,
content text,
is_published boolean NOT NULL DEFAULT false,
tags text[] DEFAULT '{}',
metadata jsonb DEFAULT '{}',
-- Audit fields (always include these)
created_at timestamptz NOT NULL DEFAULT now(),
updated_at timestamptz NOT NULL DEFAULT now(),
created_by uuid REFERENCES auth.users(id),
updated_by uuid REFERENCES auth.users(id),
-- Data integrity constraints
CONSTRAINT notes_title_length CHECK (length(title) >= 1),
CONSTRAINT notes_account_required CHECK (account_id IS NOT NULL)
);
-- Add helpful comments for documentation
COMMENT ON TABLE public.notes IS 'User-generated notes with sharing capabilities';
COMMENT ON COLUMN public.notes.account_id IS 'Account that owns this note (multi-tenant isolation)';
COMMENT ON COLUMN public.notes.is_published IS 'Whether note is visible to all account members';
COMMENT ON COLUMN public.notes.tags IS 'Searchable tags for categorization';
COMMENT ON COLUMN public.notes.metadata IS 'Flexible metadata (view preferences, etc.)';
```
### Performance Indexes
Consider creating indexes for your query patterns if you are scaling to a large number of records.
```sql
-- Essential indexes for performance
CREATE INDEX idx_notes_account_id ON public.notes(account_id);
CREATE INDEX idx_notes_created_at ON public.notes(created_at DESC);
CREATE INDEX idx_notes_account_created ON public.notes(account_id, created_at DESC);
CREATE INDEX idx_notes_published ON public.notes(account_id, is_published) WHERE is_published = true;
CREATE INDEX idx_notes_tags ON public.notes USING gin(tags);
```
### Security Setup
```sql
-- Always enable RLS (NEVER skip this!)
ALTER TABLE public.notes ENABLE ROW LEVEL SECURITY;
-- Revoke default permissions and grant explicitly
REVOKE ALL ON public.notes FROM authenticated, service_role;
GRANT SELECT, INSERT, UPDATE, DELETE ON public.notes TO authenticated, service_role;
```
### RLS Policies
Create comprehensive policies that handle both personal and team accounts:
```sql
-- SELECT policy: Read published notes or own private notes
CREATE POLICY "notes_select" ON public.notes
FOR SELECT TO authenticated
USING (
-- Personal account: direct ownership
account_id = (SELECT auth.uid())
OR
-- Team account: member can read published notes
(public.has_role_on_account(account_id) AND is_published = true)
OR
-- Team account: creator can read their own drafts
(public.has_role_on_account(account_id) AND created_by = auth.uid())
OR
-- Team account: users with manage permission can read all
public.has_permission(auth.uid(), account_id, 'notes.manage')
);
-- INSERT policy: Must have create permission
CREATE POLICY "notes_insert" ON public.notes
FOR INSERT TO authenticated
WITH CHECK (
-- Personal account: direct ownership
account_id = (SELECT auth.uid())
OR
-- Team account: must have create permission
public.has_permission(auth.uid(), account_id, 'notes.create')
);
-- UPDATE policy: Owner or manager can edit
CREATE POLICY "notes_update" ON public.notes
FOR UPDATE TO authenticated
USING (
-- Personal account: direct ownership
account_id = (SELECT auth.uid())
OR
-- Team account: creator can edit their own
(public.has_role_on_account(account_id) AND created_by = auth.uid())
OR
-- Team account: users with manage permission
public.has_permission(auth.uid(), account_id, 'notes.manage')
)
WITH CHECK (
-- Same conditions for updates
account_id = (SELECT auth.uid())
OR
(public.has_role_on_account(account_id) AND created_by = auth.uid())
OR
public.has_permission(auth.uid(), account_id, 'notes.manage')
);
-- DELETE policy: Stricter permissions required
CREATE POLICY "notes_delete" ON public.notes
FOR DELETE TO authenticated
USING (
-- Personal account: direct ownership
account_id = (SELECT auth.uid())
OR
-- Team account: creator can delete own notes
(public.has_role_on_account(account_id) AND created_by = auth.uid())
OR
-- Team account: users with delete permission
public.has_permission(auth.uid(), account_id, 'notes.delete')
);
```
### Automatic Triggers
Add triggers for common patterns:
```sql
-- Automatically update timestamps
CREATE TRIGGER notes_updated_at
BEFORE UPDATE ON public.notes
FOR EACH ROW
EXECUTE FUNCTION kit.trigger_set_timestamps();
-- Track who made changes
CREATE TRIGGER notes_track_changes
BEFORE INSERT OR UPDATE ON public.notes
FOR EACH ROW
EXECUTE FUNCTION kit.trigger_set_user_tracking();
```
## Advanced Patterns
### 1. Hierarchical Notes (Categories)
```sql
-- Note categories with hierarchy
CREATE TABLE IF NOT EXISTS public.note_categories (
id uuid PRIMARY KEY DEFAULT gen_random_uuid(),
account_id uuid NOT NULL REFERENCES public.accounts(id) ON DELETE CASCADE,
parent_id uuid REFERENCES public.note_categories(id) ON DELETE CASCADE,
name varchar(255) NOT NULL,
color varchar(7), -- hex color codes
path ltree, -- efficient tree operations
created_at timestamptz NOT NULL DEFAULT now(),
created_by uuid REFERENCES auth.users(id),
-- Prevent direct circular references
CONSTRAINT categories_no_self_parent CHECK (id != parent_id)
);
-- PostgreSQL does not allow subqueries in CHECK constraints, so enforcing
-- that a category's parent belongs to the same account requires a trigger
CREATE OR REPLACE FUNCTION kit.check_category_account()
RETURNS trigger AS $$
BEGIN
IF NEW.parent_id IS NOT NULL AND NEW.account_id != (
SELECT account_id FROM public.note_categories WHERE id = NEW.parent_id
) THEN
RAISE EXCEPTION 'Parent category must belong to the same account';
END IF;
RETURN NEW;
END;
$$ LANGUAGE plpgsql;
CREATE TRIGGER note_categories_same_account
BEFORE INSERT OR UPDATE ON public.note_categories
FOR EACH ROW
EXECUTE FUNCTION kit.check_category_account();
-- Link notes to categories
ALTER TABLE public.notes ADD COLUMN category_id uuid REFERENCES public.note_categories(id) ON DELETE SET NULL;
-- Index for tree operations
CREATE INDEX idx_note_categories_path ON public.note_categories USING gist(path);
CREATE INDEX idx_note_categories_account ON public.note_categories(account_id, parent_id);
```
### 2. Note Sharing and Collaboration
```sql
-- External sharing tokens
CREATE TABLE IF NOT EXISTS public.note_shares (
id uuid PRIMARY KEY DEFAULT gen_random_uuid(),
note_id uuid NOT NULL REFERENCES public.notes(id) ON DELETE CASCADE,
share_token varchar(64) NOT NULL UNIQUE,
expires_at timestamptz,
password_hash varchar(255), -- optional password protection
view_count integer DEFAULT 0,
max_views integer, -- optional view limit
created_at timestamptz NOT NULL DEFAULT now(),
created_by uuid REFERENCES auth.users(id),
-- Ensure token uniqueness
CONSTRAINT share_token_format CHECK (share_token ~ '^[a-zA-Z0-9_-]{32,64}$')
);
-- Function to generate secure share tokens (requires the pgcrypto extension)
CREATE OR REPLACE FUNCTION generate_note_share_token()
RETURNS varchar(64) AS $$
BEGIN
-- encode() has no 'base64url' format; convert base64 to URL-safe characters
RETURN translate(encode(gen_random_bytes(32), 'base64'), '+/=', '-_');
END;
$$ LANGUAGE plpgsql;
```
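The `share_token_format` CHECK above accepts 32-64 URL-safe characters. If you validate tokens before hitting the database, the same rule can be mirrored in TypeScript (a sketch, not part of the kit):

```typescript
// Mirrors the share_token_format CHECK constraint:
// 32-64 characters drawn from [a-zA-Z0-9_-]
const SHARE_TOKEN_PATTERN = /^[a-zA-Z0-9_-]{32,64}$/;

function isValidShareToken(token: string): boolean {
  return SHARE_TOKEN_PATTERN.test(token);
}
```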
### 3. Usage Tracking for Billing
```sql
-- Track note creation for usage-based billing
CREATE TABLE IF NOT EXISTS public.note_usage_logs (
id uuid PRIMARY KEY DEFAULT gen_random_uuid(),
account_id uuid NOT NULL REFERENCES public.accounts(id) ON DELETE CASCADE,
action varchar(50) NOT NULL, -- 'create', 'share', 'export'
note_count integer DEFAULT 1,
date date DEFAULT CURRENT_DATE,
-- Daily aggregation
UNIQUE(account_id, action, date)
);
-- Function to track note usage
CREATE OR REPLACE FUNCTION track_note_usage(
target_account_id uuid,
usage_action varchar(50)
) RETURNS void AS $$
BEGIN
INSERT INTO public.note_usage_logs (account_id, action, note_count)
VALUES (target_account_id, usage_action, 1)
ON CONFLICT (account_id, action, date)
DO UPDATE SET note_count = note_usage_logs.note_count + 1;
END;
$$ LANGUAGE plpgsql;
-- Trigger to track note creation
CREATE OR REPLACE FUNCTION trigger_track_note_creation()
RETURNS trigger AS $$
BEGIN
PERFORM track_note_usage(NEW.account_id, 'create');
RETURN NEW;
END;
$$ LANGUAGE plpgsql;
CREATE TRIGGER notes_track_creation
AFTER INSERT ON public.notes
FOR EACH ROW
EXECUTE FUNCTION trigger_track_note_creation();
```
### 4. Feature Access Control
```sql
-- Check if account has access to advanced note features
CREATE OR REPLACE FUNCTION has_advanced_notes_access(target_account_id uuid)
RETURNS boolean AS $$
DECLARE
has_access boolean := false;
BEGIN
-- Check active subscription with advanced features
SELECT EXISTS(
SELECT 1
FROM public.subscriptions s
JOIN public.subscription_items si ON s.id = si.subscription_id
WHERE s.account_id = target_account_id
AND s.status = 'active'
AND si.price_id IN ('price_pro_plan', 'price_enterprise_plan')
) INTO has_access;
RETURN has_access;
END;
$$ LANGUAGE plpgsql;
-- Restrictive policy for advanced features
CREATE POLICY "notes_advanced_features" ON public.notes
AS RESTRICTIVE
FOR ALL TO authenticated
USING (
-- Basic features always allowed
is_published = true
OR category_id IS NULL
OR tags = '{}'
OR
-- Advanced features require subscription
has_advanced_notes_access(account_id)
);
```
## Security Enhancements
### MFA Compliance
For sensitive note operations, enforce MFA:
```sql
-- Require MFA for note deletion
CREATE POLICY "notes_delete_mfa" ON public.notes
AS RESTRICTIVE
FOR DELETE TO authenticated
USING (public.is_mfa_compliant());
```
### Super Admin Access
Allow super admins to access all notes for support purposes:
```sql
-- Super admin read access (for support)
CREATE POLICY "notes_super_admin_access" ON public.notes
FOR SELECT TO authenticated
USING (public.is_super_admin());
```
### Rate Limiting
Implement basic rate limiting for note creation:
```sql
-- Rate limiting: max 100 notes per day per account
CREATE OR REPLACE FUNCTION check_note_creation_limit(target_account_id uuid)
RETURNS boolean AS $$
DECLARE
daily_count integer;
BEGIN
SELECT COALESCE(note_count, 0) INTO daily_count
FROM public.note_usage_logs
WHERE account_id = target_account_id
AND action = 'create'
AND date = CURRENT_DATE;
RETURN daily_count < 100; -- Adjust limit as needed
END;
$$ LANGUAGE plpgsql;
-- Policy to enforce rate limiting
CREATE POLICY "notes_rate_limit" ON public.notes
AS RESTRICTIVE
FOR INSERT TO authenticated
WITH CHECK (check_note_creation_limit(account_id));
```
### Type Generation
After schema changes, always update TypeScript types:
```bash
# reset the database
pnpm supabase:web:reset
# Generate new types
pnpm supabase:web:typegen
# Verify types work in your application
pnpm typecheck
```
## Example Usage in Application
With your schema complete, here's how to use it in your application:
```typescript
// Server component - automatically inherits RLS protection
import { getSupabaseServerClient } from '@kit/supabase/server-client';
async function NotesPage({ params }: { params: Promise<{ account: string }> }) {
const { account } = await params;
const client = getSupabaseServerClient();
// RLS automatically filters to accessible notes
const { data: notes } = await client
.from('notes')
.select(`
*,
category:note_categories(name, color),
creator:created_by(name, avatar_url)
`)
.eq('account_id', account)
.order('created_at', { ascending: false });
return <NotesList notes={notes} />;
}
```
From the client component, you can use the `useQuery` hook to fetch the notes.
```typescript
// Client component fetching notes with React Query
'use client';

import { useQuery } from '@tanstack/react-query';

import { useSupabase } from '@kit/supabase/hooks/use-supabase';
function useNotes(accountId: string) {
const supabase = useSupabase();
return useQuery({
queryKey: ['notes', accountId],
queryFn: async () => {
const { data } = await supabase
.from('notes')
.select('*, category:note_categories(name)')
.eq('account_id', accountId);
return data;
}
});
}
```
## Summary
You've now created a comprehensive notes feature that demonstrates:
✅ **Proper multi-tenancy** with account-based data isolation
✅ **Granular permissions** using Makerkit's role system
✅ **Advanced features** like categories, sharing, and usage tracking
✅ **Security best practices** with comprehensive RLS policies
✅ **Performance optimization** with proper indexing
✅ **Integration patterns** with billing and feature gates
This pattern can be adapted for any feature in your SaaS application. Remember to always:
- Start with proper planning and data ownership decisions
- Enable RLS and create comprehensive policies
- Add appropriate indexes for your query patterns
- Test thoroughly before deploying
- Update TypeScript types after schema changes
Your database schema is now production-ready and follows Makerkit's security and architecture best practices!

---
status: "published"
label: "Database tests"
title: "Database Testing with pgTAP"
description: "Learn how to write comprehensive database tests using pgTAP to secure your application against common vulnerabilities"
order: 12
---
Database testing is critical for ensuring your application's security and data integrity. This guide covers how to write comprehensive database tests using **pgTAP** and **Basejump utilities** to protect against common vulnerabilities.
## Why Database Testing Matters
Database tests verify that your **Row Level Security (RLS)** policies work correctly and protect against:
- **Unauthorized data access** - Users reading data they shouldn't see
- **Data modification attacks** - Users updating/deleting records they don't own
- **Privilege escalation** - Users gaining higher permissions than intended
- **Cross-account data leaks** - Team members accessing other teams' data
- **Storage security bypasses** - Unauthorized file access
## Test Infrastructure
### Required Extensions
Makerkit uses these extensions for database testing:
```sql
-- Install Basejump test helpers
create extension "basejump-supabase_test_helpers" version '0.0.6';
-- The extension provides authentication simulation
-- and user management utilities
```
### Makerkit Test Helpers
The `/apps/web/supabase/tests/database/00000-makerkit-helpers.sql` file provides essential utilities:
```sql
-- Authenticate as a test user
select makerkit.authenticate_as('user_identifier');
-- Get account by slug
select makerkit.get_account_by_slug('team-slug');
-- Get account ID by slug
select makerkit.get_account_id_by_slug('team-slug');
-- Set user as super admin
select makerkit.set_super_admin();
-- Set MFA level
select makerkit.set_session_aal('aal1'); -- or 'aal2'
```
## Test Structure
Every pgTAP test follows this structure:
```sql
begin;
create extension "basejump-supabase_test_helpers" version '0.0.6';
select no_plan(); -- or select plan(N) for exact test count
-- Test setup
select makerkit.set_identifier('user1', 'user1@example.com');
select makerkit.set_identifier('user2', 'user2@example.com');
-- Your tests here
select is(
actual_result,
expected_result,
'Test description'
);
select * from finish();
rollback;
```
## Bypassing RLS in Tests
When you need to set up test data or verify data exists independently of RLS policies, use role switching:
### Role Types for Testing
```sql
-- postgres: Full superuser access, bypasses all RLS
set local role postgres;
-- service_role: Service-level access, bypasses RLS
set local role service_role;
-- authenticated: Normal user with RLS enforced (default for makerkit.authenticate_as)
-- No need to set this explicitly - makerkit.authenticate_as() handles it
```
### Common Patterns for Role Switching
#### Pattern 1: Setup Test Data
```sql
-- Use postgres role to insert test data that bypasses RLS
set local role postgres;
insert into accounts_memberships (account_id, user_id, account_role)
values (team_id, user_id, 'member');
-- Test as normal user (RLS enforced)
select makerkit.authenticate_as('member');
select isnt_empty($$ select * from team_data $$, 'Member can see team data');
```
#### Pattern 2: Verify Data Exists
```sql
-- Test that unauthorized user cannot see data
select makerkit.authenticate_as('unauthorized_user');
select is_empty($$ select * from private_data $$, 'Unauthorized user sees nothing');
-- Use postgres role to verify data actually exists
set local role postgres;
select isnt_empty($$ select * from private_data $$, 'Data exists (confirms RLS filtering)');
```
#### Pattern 3: Grant Permissions for Testing
```sql
-- Use postgres role to grant permissions
set local role postgres;
insert into role_permissions (role, permission)
values ('custom-role', 'invites.manage');
-- Test as user with the role
select makerkit.authenticate_as('custom_role_user');
select lives_ok($$ select create_invitation(...) $$, 'User with permission can invite');
```
### When to Use Each Role
#### Use `postgres` role when:
- Setting up complex test data with foreign key relationships
- Inserting data that would normally be restricted by RLS
- Verifying data exists independently of user permissions
- Modifying system tables (roles, permissions, etc.)
#### Use `service_role` when:
- You need RLS bypass but want to stay closer to application-level permissions
- Testing service-level operations
- Working with data that should be accessible to services but not users
#### Use `makerkit.authenticate_as()` when:
- Testing normal user operations (automatically sets `authenticated` role)
- Verifying RLS policies work correctly
- Testing user-specific access patterns
### Complete Test Example
```sql
begin;
create extension "basejump-supabase_test_helpers" version '0.0.6';
select no_plan();
-- Setup test users
select makerkit.set_identifier('owner', 'owner@example.com');
select makerkit.set_identifier('member', 'member@example.com');
select makerkit.set_identifier('stranger', 'stranger@example.com');
-- Create team (as owner)
select makerkit.authenticate_as('owner');
select public.create_team_account('TestTeam');
-- Add member using postgres role (bypasses RLS)
set local role postgres;
insert into accounts_memberships (account_id, user_id, account_role)
values (
(select id from accounts where slug = 'testteam'),
tests.get_supabase_uid('member'),
'member'
);
-- Test member access (RLS enforced)
select makerkit.authenticate_as('member');
select isnt_empty(
$$ select * from accounts where slug = 'testteam' $$,
'Member can see their team'
);
-- Test stranger cannot see team (RLS enforced)
select makerkit.authenticate_as('stranger');
select is_empty(
$$ select * from accounts where slug = 'testteam' $$,
'Stranger cannot see team due to RLS'
);
-- Verify team actually exists (bypass RLS)
set local role postgres;
select isnt_empty(
$$ select * from accounts where slug = 'testteam' $$,
'Team exists in database (confirms RLS is working, not missing data)'
);
select * from finish();
rollback;
```
### Key Principles
1. **Use `postgres` role for test setup**, then switch back to test actual user permissions
2. **Always verify data exists** using `postgres` role when testing that users cannot see data
3. **Never test application logic as `postgres`** - it bypasses all security
4. **Use role switching to confirm RLS is filtering**, not that data is missing
## Basic Security Testing Patterns
### 1. Testing Data Isolation
Verify users can only access their own data:
```sql
-- Create test users
select makerkit.set_identifier('owner', 'owner@example.com');
select tests.create_supabase_user('stranger', 'stranger@example.com');
-- Owner creates a record
select makerkit.authenticate_as('owner');
insert into notes (title, content, user_id)
values ('Secret Note', 'Private content', auth.uid());
-- Stranger cannot see the record
select makerkit.authenticate_as('stranger');
select is_empty(
$$ select * from notes where title = 'Secret Note' $$,
'Strangers cannot see other users notes'
);
```
### 2. Testing Write Protection
Ensure users cannot modify others' data:
```sql
-- Owner creates a record
select makerkit.authenticate_as('owner');
insert into posts (title, user_id)
values ('My Post', auth.uid()) returning id;
-- Stranger cannot update the record
-- (psql meta-commands like \set are unavailable in pgTAP tests, so look up the id inline)
select makerkit.authenticate_as('stranger');
select throws_ok(
$$ update posts set title = 'Hacked!'
where id = (select id from posts where title = 'My Post') $$,
'update or delete on table "posts" violates row-level security policy',
'Strangers cannot update other users posts'
);
```
### 3. Testing Permission Systems
Verify role-based access control:
```sql
-- Test that only users with 'posts.manage' permission can create posts
select makerkit.authenticate_as('member');
select throws_ok(
$$ insert into admin_posts (title, content) values ('Test', 'Content') $$,
'new row violates row-level security policy',
'Members without permission cannot create admin posts'
);
-- Grant permission and test again
set local role postgres;
insert into user_permissions (user_id, permission)
values (tests.get_supabase_uid('member'), 'posts.manage');
select makerkit.authenticate_as('member');
select lives_ok(
$$ insert into admin_posts (title, content) values ('Test', 'Content') $$,
'Members with permission can create admin posts'
);
```
## Team Account Security Testing
### Testing Team Membership Access
```sql
-- Setup team and members
select makerkit.authenticate_as('owner');
select public.create_team_account('TestTeam');
-- Add member to team
set local role postgres;
insert into accounts_memberships (account_id, user_id, account_role)
values (
makerkit.get_account_id_by_slug('testteam'),
tests.get_supabase_uid('member'),
'member'
);
-- Test member can see team data
select makerkit.authenticate_as('member');
select isnt_empty(
$$ select * from team_posts where account_id = makerkit.get_account_id_by_slug('testteam') $$,
'Team members can see team posts'
);
-- Test non-members cannot see team data
select makerkit.authenticate_as('stranger');
select is_empty(
$$ select * from team_posts where account_id = makerkit.get_account_id_by_slug('testteam') $$,
'Non-members cannot see team posts'
);
```
### Testing Role Hierarchy
```sql
-- Test that members cannot promote themselves
select makerkit.authenticate_as('member');
select throws_ok(
$$ update accounts_memberships
set account_role = 'owner'
where user_id = auth.uid() $$,
'Only the account_role can be updated',
'Members cannot promote themselves to owner'
);
-- Test that members cannot remove the owner
select throws_ok(
$$ delete from accounts_memberships
where user_id = tests.get_supabase_uid('owner')
and account_id = makerkit.get_account_id_by_slug('testteam') $$,
'The primary account owner cannot be removed from the account membership list',
'Members cannot remove the account owner'
);
```
## Storage Security Testing
```sql
-- Test file access control
select makerkit.authenticate_as('user1');
-- User can upload to their own folder
select lives_ok(
$$ insert into storage.objects (bucket_id, name, owner, owner_id)
values ('avatars', auth.uid()::text, auth.uid(), auth.uid()) $$,
'Users can upload files with their own UUID as filename'
);
-- User cannot upload using another user's UUID as filename
select makerkit.authenticate_as('user2');
select throws_ok(
$$ insert into storage.objects (bucket_id, name, owner, owner_id)
values ('avatars', tests.get_supabase_uid('user1')::text, auth.uid(), auth.uid()) $$,
'new row violates row-level security policy',
'Users cannot upload files with other users UUIDs as filename'
);
```
## Common Testing Patterns
### 1. Cross-Account Data Isolation
```sql
-- Verify team A members cannot access team B data
select makerkit.authenticate_as('team_a_member');
insert into documents (title, team_id)
values ('Secret Doc', makerkit.get_account_id_by_slug('team-a'));
select makerkit.authenticate_as('team_b_member');
select is_empty(
$$ select * from documents where title = 'Secret Doc' $$,
'Team B members cannot see Team A documents'
);
```
### 2. Function Security Testing
```sql
-- Test that protected functions check permissions
select makerkit.authenticate_as('regular_user');
select throws_ok(
$$ select admin_delete_all_posts() $$,
'permission denied for function admin_delete_all_posts',
'Regular users cannot call admin functions'
);
-- Test with proper permissions
select makerkit.set_super_admin();
select lives_ok(
$$ select admin_delete_all_posts() $$,
'Super admins can call admin functions'
);
```
### 3. Invitation Security Testing
```sql
-- Test invitation creation permissions
select makerkit.authenticate_as('member');
-- Members can invite to same or lower roles
select lives_ok(
$$ insert into invitations (email, account_id, role, invite_token)
values ('new@example.com', makerkit.get_account_id_by_slug('team'), 'member', gen_random_uuid()) $$,
'Members can invite other members'
);
-- Members cannot invite to higher roles
select throws_ok(
$$ insert into invitations (email, account_id, role, invite_token)
values ('admin@example.com', makerkit.get_account_id_by_slug('team'), 'owner', gen_random_uuid()) $$,
'new row violates row-level security policy',
'Members cannot invite owners'
);
```
## Advanced Testing Techniques
### 1. Testing Edge Cases
```sql
-- Test NULL handling in RLS policies
select lives_ok(
$$ select * from posts where user_id IS NULL $$,
'Queries with NULL filters should not crash'
);
-- Test empty result sets
select is_empty(
$$ select * from posts where user_id = '00000000-0000-0000-0000-000000000000'::uuid $$,
'Invalid UUIDs should return empty results'
);
```
### 2. Performance Testing
```sql
-- Test that RLS policies don't create N+1 queries
select makerkit.authenticate_as('team_owner');
-- This should be efficient even with many team members
select isnt_empty(
$$ select p.*, u.name from posts p join users u on p.user_id = u.id
where p.team_id = makerkit.get_account_id_by_slug('large-team') $$,
'Joined queries with RLS should perform well'
);
```
### 3. Testing Trigger Security
```sql
-- Test that triggers properly validate permissions
select makerkit.authenticate_as('regular_user');
select throws_ok(
$$ update sensitive_settings set admin_only_field = 'hacked' $$,
'You do not have permission to update this field',
'Triggers should prevent unauthorized field updates'
);
```
## Best Practices
### 1. Always Test Both Positive and Negative Cases
- Verify authorized users CAN access data
- Verify unauthorized users CANNOT access data
### 2. Test All CRUD Operations
- CREATE: Can users insert the records they should?
- READ: Can users only see their authorized data?
- UPDATE: Can users only modify records they own?
- DELETE: Can users only remove their own records?
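The four checks above can be sketched in a single pgTAP flow. The `notes` table and the `owner`/`stranger` test users are hypothetical, mirroring the earlier examples:

```sql
-- Hypothetical sketch: exercise all four CRUD operations on a `notes` table
select makerkit.authenticate_as('owner');

-- CREATE: owners can insert their own records
select lives_ok(
  $$ insert into notes (title, user_id) values ('CRUD Note', auth.uid()) $$,
  'Users can create their own notes'
);

-- READ: owners can see their own records
select isnt_empty(
  $$ select * from notes where title = 'CRUD Note' $$,
  'Users can read their own notes'
);

-- UPDATE/DELETE: strangers are silently filtered out by RLS
select makerkit.authenticate_as('stranger');
update notes set title = 'Hacked' where title = 'CRUD Note';
delete from notes where title = 'CRUD Note';

-- Verify nothing changed from the owner's perspective
select makerkit.authenticate_as('owner');
select isnt_empty(
  $$ select * from notes where title = 'CRUD Note' $$,
  'Strangers can neither update nor delete other users notes'
);
```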
### 3. Use Descriptive Test Names
```sql
select is(
actual_result,
expected_result,
'Team members should be able to read team posts but not modify other teams data'
);
```
### 4. Test Permission Boundaries
- Test the minimum permission level that grants access
- Test that one level below is denied
- Test that users with higher permissions can also access
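A sketch of boundary testing, assuming a hypothetical `posts.manage` permission guarding an `admin_posts` table as in the permission-system example above:

```sql
-- Below the boundary: a viewer without the permission is denied
select makerkit.authenticate_as('viewer');
select throws_ok(
  $$ insert into admin_posts (title) values ('Boundary Test') $$,
  'new row violates row-level security policy',
  'Viewers without posts.manage cannot create admin posts'
);

-- Exactly at the boundary: a user holding posts.manage succeeds
select makerkit.authenticate_as('editor_with_permission');
select lives_ok(
  $$ insert into admin_posts (title) values ('Boundary Test') $$,
  'Users with posts.manage can create admin posts'
);

-- Above the boundary: super admins can also create admin posts
select makerkit.set_super_admin();
select lives_ok(
  $$ insert into admin_posts (title) values ('Admin Boundary Test') $$,
  'Super admins can create admin posts'
);
```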
### 5. Clean Up After Tests
Always use transactions that rollback:
```sql
begin;
-- Your tests here
rollback; -- This cleans up all test data
```
## Common Anti-Patterns to Avoid
❌ **Don't test only happy paths**
```sql
-- Bad: Only testing that authorized access works
select isnt_empty($$ select * from posts $$, 'User can see posts');
```
✅ **Test both authorized and unauthorized access**
```sql
-- Good: Test both positive and negative cases
select makerkit.authenticate_as('owner');
select isnt_empty($$ select * from posts where user_id = auth.uid() $$, 'Owner can see own posts');
select makerkit.authenticate_as('stranger');
select is_empty($$ select * from posts where user_id != auth.uid() $$, 'Stranger cannot see others posts');
```
❌ **Don't forget to test cross-account scenarios**
```sql
-- Bad: Only testing within same account
select lives_ok($$ insert into team_docs (title) values ('Doc') $$, 'Can create doc');
```
✅ **Test cross-account isolation**
```sql
-- Good: Test that team A cannot access team B data
select makerkit.authenticate_as('team_a_member');
insert into team_docs (title, team_id) values ('Secret', makerkit.get_account_id_by_slug('team-a'));
select makerkit.authenticate_as('team_b_member');
select is_empty($$ select * from team_docs where title = 'Secret' $$, 'Team B cannot see Team A docs');
```
## Testing Silent RLS Failures
**Critical Understanding**: RLS policies often fail **silently**. They don't throw errors - they just filter out data or prevent operations. This makes testing RLS policies tricky because you need to verify what **didn't** happen, not just what did.
### Why RLS Failures Are Silent
```sql
-- How RLS can fail silently:
-- 1. SELECT: rows the policy rejects are filtered out (no error)
-- 2. UPDATE/DELETE: rows hidden by the policy's USING clause are skipped (no error)
-- 3. INSERT/UPDATE: only a failing WITH CHECK clause raises an error;
--    everything else fails without one
```
### Testing Silent SELECT Filtering
When RLS policies prevent users from seeing data, queries return empty results instead of errors:
```sql
-- Setup: Create posts for different users
select makerkit.authenticate_as('user_a');
insert into posts (title, content, user_id)
values ('User A Post', 'Content A', auth.uid());
select makerkit.authenticate_as('user_b');
insert into posts (title, content, user_id)
values ('User B Post', 'Content B', auth.uid());
-- Test: User A cannot see User B's posts (silent filtering)
select makerkit.authenticate_as('user_a');
select is_empty(
$$ select * from posts where title = 'User B Post' $$,
'User A cannot see User B posts due to RLS filtering'
);
-- Test: User A can still see their own posts
select isnt_empty(
$$ select * from posts where title = 'User A Post' $$,
'User A can see their own posts'
);
-- Critical: Verify the post actually exists by switching context
select makerkit.authenticate_as('user_b');
select isnt_empty(
$$ select * from posts where title = 'User B Post' $$,
'User B post actually exists (not a test data issue)'
);
```
### Testing Silent UPDATE/DELETE Prevention
RLS policies can silently prevent modifications without throwing errors:
```sql
-- Setup: User A creates a post
select makerkit.authenticate_as('user_a');
insert into posts (title, content, user_id)
values ('Original Title', 'Original Content', auth.uid())
returning id;
-- Store the post ID for testing (\gset assigns the query result to :post_id)
select id as post_id from posts where title = 'Original Title' \gset
-- Test: User B attempts to modify User A's post (silently fails)
select makerkit.authenticate_as('user_b');
update posts set title = 'Hacked Title' where id = :'post_id';
-- Verify the update was silently ignored
select makerkit.authenticate_as('user_a');
select is(
  (select title from posts where id = :'post_id'),
  'Original Title',
  'User B update attempt was silently ignored by RLS'
);
-- Test: User B attempts to delete User A's post (silently fails)
select makerkit.authenticate_as('user_b');
delete from posts where id = :'post_id';
-- Verify the delete was silently ignored
select makerkit.authenticate_as('user_a');
select isnt_empty(
  $$ select * from posts where title = 'Original Title' $$,
  'User B delete attempt was silently ignored by RLS'
);
```
### Testing Silent INSERT Prevention
INSERT operations can also fail silently with restrictive RLS policies:
```sql
-- Test: Non-admin tries to insert into admin_settings table
select makerkit.authenticate_as('regular_user');
-- Attempt to insert; depending on the policy this may raise an error
-- (wrap it in throws_ok) or succeed but be filtered out on read
insert into admin_settings (key, value) values ('test_key', 'test_value');
-- Critical: Don't just check for errors - verify the data isn't there
select is_empty(
$$ select * from admin_settings where key = 'test_key' $$,
'Regular user cannot insert admin settings (silent prevention)'
);
-- Verify an admin can actually insert this data
set local role postgres;
insert into admin_settings (key, value) values ('admin_key', 'admin_value');
select makerkit.set_super_admin();
select isnt_empty(
$$ select * from admin_settings where key = 'admin_key' $$,
'Admins can insert admin settings (confirms table works)'
);
```
### Testing Row-Level Filtering with Counts
Use count comparisons to detect silent filtering:
```sql
-- Setup: Create team data
select makerkit.authenticate_as('team_owner');
insert into team_documents (title, team_id) values
('Doc 1', (select id from accounts where slug = 'team-a')),
('Doc 2', (select id from accounts where slug = 'team-a')),
('Doc 3', (select id from accounts where slug = 'team-a'));
-- Test: Team member sees all team docs
select makerkit.authenticate_as('team_member_a');
select is(
(select count(*) from team_documents where team_id = (select id from accounts where slug = 'team-a')),
3::bigint,
'Team member can see all team documents'
);
-- Test: Non-member sees no team docs (silent filtering)
select makerkit.authenticate_as('external_user');
select is(
(select count(*) from team_documents where team_id = (select id from accounts where slug = 'team-a')),
0::bigint,
'External user cannot see any team documents due to RLS filtering'
);
```
### Testing Partial Data Exposure
RLS filters whole rows, so per-column masking is typically implemented with views or column-level privileges on top of it. When your schema exposes some fields but not others, test both sides:
```sql
-- Test: Public can see user profiles but not sensitive data
select tests.create_supabase_user('public_user', 'public@example.com');
-- Create user profile with sensitive data
select makerkit.authenticate_as('profile_owner');
insert into user_profiles (user_id, name, email, phone, ssn) values
(auth.uid(), 'John Doe', 'john@example.com', '555-1234', '123-45-6789');
-- Test: Public can see basic info but not sensitive fields
select makerkit.authenticate_as('public_user');
select is(
(select name from user_profiles where user_id = tests.get_supabase_uid('profile_owner')),
'John Doe',
'Public can see user name'
);
-- Critical: Test that sensitive fields are silently filtered
select is(
(select ssn from user_profiles where user_id = tests.get_supabase_uid('profile_owner')),
null,
'Public cannot see SSN (silently filtered by RLS)'
);
select is(
(select phone from user_profiles where user_id = tests.get_supabase_uid('profile_owner')),
null,
'Public cannot see phone number (silently filtered by RLS)'
);
```
### Testing Cross-Account Data Isolation
Verify users cannot access other accounts' data:
```sql
-- Setup: Create data for multiple teams
select makerkit.authenticate_as('team_a_owner');
insert into billing_info (team_id, subscription_id) values
((select id from accounts where slug = 'team-a'), 'sub_123');
select makerkit.authenticate_as('team_b_owner');
insert into billing_info (team_id, subscription_id) values
((select id from accounts where slug = 'team-b'), 'sub_456');
-- Test: Team A members cannot see Team B billing (silent filtering)
select makerkit.authenticate_as('team_a_member');
select is_empty(
$$ select * from billing_info where subscription_id = 'sub_456' $$,
'Team A members cannot see Team B billing info'
);
-- Test: Team A members can see their own billing
select isnt_empty(
$$ select * from billing_info where subscription_id = 'sub_123' $$,
'Team A members can see their own billing info'
);
-- Verify both billing records actually exist
set local role postgres;
select is(
(select count(*) from billing_info),
2::bigint,
'Both billing records exist in database (not a test data issue)'
);
```
### Testing Permission Boundary Edge Cases
Test the exact boundaries where permissions change:
```sql
-- Setup: assumes admin_user, editor_user, and viewer_user already exist
-- with their respective permission levels
-- Test: Admins can see all data
select makerkit.authenticate_as('admin_user');
select isnt_empty(
$$ select * from sensitive_documents $$,
'Admins can see sensitive documents'
);
-- Test: Editors cannot see sensitive docs (silent filtering)
select makerkit.authenticate_as('editor_user');
select is_empty(
$$ select * from sensitive_documents $$,
'Editors cannot see sensitive documents due to RLS'
);
-- Test: Viewers cannot see sensitive docs (silent filtering)
select makerkit.authenticate_as('viewer_user');
select is_empty(
$$ select * from sensitive_documents $$,
'Viewers cannot see sensitive documents due to RLS'
);
```
### Testing Multi-Condition RLS Policies
When RLS policies have multiple conditions, test each condition:
```sql
-- Policy example: Users can only see posts if they are:
-- 1. The author, OR
-- 2. A team member of the author's team, AND
-- 3. The post is published
-- Test condition 1: Author can see unpublished posts
select makerkit.authenticate_as('author');
insert into posts (title, published, user_id) values
('Draft Post', false, auth.uid());
select isnt_empty(
$$ select * from posts where title = 'Draft Post' $$,
'Authors can see their own unpublished posts'
);
-- Test condition 2: Team members cannot see unpublished posts (silent filtering)
select makerkit.authenticate_as('team_member');
select is_empty(
$$ select * from posts where title = 'Draft Post' $$,
'Team members cannot see unpublished posts from teammates'
);
-- Test condition 3: Team members can see published posts
select makerkit.authenticate_as('author');
update posts set published = true where title = 'Draft Post';
select makerkit.authenticate_as('team_member');
select isnt_empty(
$$ select * from posts where title = 'Draft Post' $$,
'Team members can see published posts from teammates'
);
-- Test condition boundary: Non-team members cannot see any posts
select makerkit.authenticate_as('external_user');
select is_empty(
$$ select * from posts where title = 'Draft Post' $$,
'External users cannot see any posts (even published ones)'
);
```
### Common Silent Failure Patterns to Test
#### 1. The "Empty Result" Pattern
```sql
-- Always test that restricted queries return empty results, not errors
select is_empty(
$$ select * from restricted_table where condition = true $$,
'Unauthorized users see empty results, not errors'
);
```
#### 2. The "No-Effect" Pattern
```sql
-- Test that unauthorized modifications have no effect
update restricted_table set field = 'hacked' where id = target_id;
select is(
(select field from restricted_table where id = target_id),
'original_value',
'Unauthorized updates are silently ignored'
);
```
#### 3. The "Partial Visibility" Pattern
```sql
-- Test that only authorized fields are visible
select is(
(select public_field from mixed_table where id = target_id),
'visible_value',
'Public fields are visible'
);
select is(
(select private_field from mixed_table where id = target_id),
null,
'Private fields are silently filtered out'
);
```
#### 4. The "Context Switch" Verification Pattern
```sql
-- Always verify data exists by switching to authorized context
select makerkit.authenticate_as('unauthorized_user');
select is_empty(
$$ select * from protected_data $$,
'Unauthorized user sees no data'
);
-- Switch to authorized user to prove data exists
select makerkit.authenticate_as('authorized_user');
select isnt_empty(
$$ select * from protected_data $$,
'Data actually exists (confirms RLS filtering, not missing data)'
);
```
### Best Practices for Silent Failure Testing
#### ✅ Do: Test Both Positive and Negative Cases
```sql
-- Test that authorized users CAN access data
select makerkit.authenticate_as('authorized_user');
select isnt_empty($$ select * from protected_data $$, 'Authorized access works');
-- Test that unauthorized users CANNOT access data (silent filtering)
select makerkit.authenticate_as('unauthorized_user');
select is_empty($$ select * from protected_data $$, 'Unauthorized access silently filtered');
```
#### ✅ Do: Verify Data Exists in Different Context
```sql
-- Don't just test that unauthorized users see nothing
-- Verify the data actually exists by checking as an authorized user
select makerkit.authenticate_as('data_owner');
select isnt_empty($$ select * from my_data $$, 'Data exists');
select makerkit.authenticate_as('unauthorized_user');
select is_empty($$ select * from my_data $$, 'But unauthorized user cannot see it');
```
#### ✅ Do: Test Modification Boundaries
```sql
-- Test that unauthorized modifications are ignored
update sensitive_table set value = 'hacked';
select is(
(select value from sensitive_table),
'original_value',
'Unauthorized updates silently ignored'
);
```
#### ❌ Don't: Expect Errors from RLS Violations
```sql
-- Bad: RLS violations usually don't throw errors
select throws_ok(
$$ select * from protected_data $$,
'permission denied'
);
-- Good: RLS violations return empty results
select is_empty(
$$ select * from protected_data $$,
'Unauthorized users see no data due to RLS filtering'
);
```
#### ❌ Don't: Test Only Happy Paths
```sql
-- Bad: Only testing authorized access
select isnt_empty($$ select * from my_data $$, 'I can see my data');
-- Good: Test both authorized and unauthorized access
select makerkit.authenticate_as('owner');
select isnt_empty($$ select * from my_data $$, 'Owner can see data');
select makerkit.authenticate_as('stranger');
select is_empty($$ select * from my_data $$, 'Stranger cannot see data');
```
Remember: **RLS is designed to be invisible to attackers**. Your tests must verify this invisibility by checking for empty results and unchanged data, not for error messages.
## Running Tests
To run your database tests:
```bash
# Start Supabase locally
pnpm supabase:web:start
# Run all database tests
pnpm supabase:web:test
# Run specific test file
pnpm supabase test ./tests/database/your-test.test.sql
```
Your tests will help ensure your application is secure against common database vulnerabilities and that your RLS policies work as expected.


@@ -0,0 +1,263 @@
---
status: "published"
label: "Database Webhooks"
order: 6
title: "Database Webhooks in the Next.js Supabase Starter Kit"
description: "Handle database change events with webhooks to send notifications, sync external services, and trigger custom logic when data changes."
---
Database webhooks let you execute custom code when rows are inserted, updated, or deleted in your Supabase tables. Makerkit provides a typed webhook handler at `@kit/database-webhooks` that processes these events in a Next.js API route.
{% sequence title="Database Webhooks Setup" description="Configure and handle database change events" %}
[Understand the webhook system](#how-database-webhooks-work)
[Add custom handlers](#adding-custom-webhook-handlers)
[Configure webhook triggers](#configuring-webhook-triggers)
[Test webhooks locally](#testing-webhooks-locally)
{% /sequence %}
## How Database Webhooks Work
Supabase database webhooks fire HTTP requests to your application when specified database events occur. The flow is:
1. A row is inserted, updated, or deleted in a table
2. Supabase sends a POST request to your webhook endpoint
3. Your handler processes the event and executes custom logic
4. The handler returns a success response
Makerkit includes built-in handlers for:
- **User deletion**: Cleans up related subscriptions and data
- **User signup**: Sends welcome emails
- **Invitation creation**: Sends invitation emails
You can extend this with your own handlers.
## Adding Custom Webhook Handlers
The webhook endpoint is at `apps/web/app/api/db/webhook/route.ts`. Add your handlers to the `handleEvent` callback:
```tsx {% title="apps/web/app/api/db/webhook/route.ts" %}
import { getDatabaseWebhookHandlerService } from '@kit/database-webhooks';
import { enhanceRouteHandler } from '@kit/next/routes';
export const POST = enhanceRouteHandler(
async ({ request }) => {
const service = getDatabaseWebhookHandlerService();
try {
const signature = request.headers.get('X-Supabase-Event-Signature');
if (!signature) {
return new Response('Missing signature', { status: 400 });
}
const body = await request.clone().json();
await service.handleWebhook({
body,
signature,
async handleEvent(change) {
// Handle new project creation
if (change.type === 'INSERT' && change.table === 'projects') {
await notifyTeamOfNewProject(change.record);
}
// Handle subscription cancellation
if (change.type === 'UPDATE' && change.table === 'subscriptions') {
if (change.record.status === 'canceled') {
await sendCancellationSurvey(change.record);
}
}
// Handle user deletion
if (change.type === 'DELETE' && change.table === 'accounts') {
await cleanupExternalServices(change.old_record);
}
},
});
return new Response(null, { status: 200 });
} catch (error) {
console.error('Webhook error:', error);
return new Response(null, { status: 500 });
}
},
{ auth: false },
);
```
### RecordChange Type
The `change` object is typed to your database schema:
```tsx
import type { Database } from '@kit/supabase/database';
type Tables = Database['public']['Tables'];
type TableChangeType = 'INSERT' | 'UPDATE' | 'DELETE';
interface RecordChange<
Table extends keyof Tables,
Row = Tables[Table]['Row'],
> {
type: TableChangeType;
table: Table;
record: Row; // Current row data (null for DELETE)
schema: 'public';
old_record: Row | null; // Previous row data (null for INSERT)
}
```
### Type-Safe Handlers
Cast to specific table types for better type safety:
```tsx
import type { RecordChange } from '@kit/database-webhooks';
type ProjectChange = RecordChange<'projects'>;
type SubscriptionChange = RecordChange<'subscriptions'>;
async function handleEvent(change: RecordChange<keyof Tables>) {
if (change.table === 'projects') {
const projectChange = change as ProjectChange;
// projectChange.record is now typed to the projects table
console.log(projectChange.record.name);
}
}
```
### Async Handlers
For long-running operations, consider using background jobs:
```tsx
async handleEvent(change) {
if (change.type === 'INSERT' && change.table === 'orders') {
// Queue for background processing instead of blocking
await queueOrderProcessing(change.record.id);
}
}
```
## Configuring Webhook Triggers
Webhooks are configured in Supabase. You can set them up via SQL or the Dashboard.
### SQL Configuration
Add a trigger in your schema file at `apps/web/supabase/schemas/`:
```sql {% title="apps/web/supabase/schemas/webhooks.sql" %}
-- Create the webhook trigger for the projects table
create trigger projects_webhook
after insert or update or delete on public.projects
for each row execute function supabase_functions.http_request(
'https://your-app.com/api/db/webhook',
'POST',
'{"Content-Type":"application/json"}',
'{}',
'5000'
);
```
### Dashboard Configuration
1. Open your Supabase project dashboard
2. Navigate to **Database** > **Webhooks**
3. Click **Create a new hook**
4. Configure:
- **Name**: `projects_webhook`
- **Table**: `projects`
- **Events**: INSERT, UPDATE, DELETE
- **Type**: HTTP Request
- **URL**: `https://your-app.com/api/db/webhook`
- **Method**: POST
### Webhook Security
Supabase automatically signs webhook payloads using the `X-Supabase-Event-Signature` header. The `@kit/database-webhooks` package verifies this signature against your `SUPABASE_DB_WEBHOOK_SECRET` environment variable.
Configure the webhook secret:
```bash {% title=".env.local" %}
SUPABASE_DB_WEBHOOK_SECRET=your-webhook-secret
```
Set the same secret in your Supabase webhook configuration. The handler validates signatures automatically, rejecting requests with missing or invalid signatures.
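For illustration only: the `@kit/database-webhooks` package already performs this verification. A minimal sketch of the idea, assuming the signature header carries the shared secret verbatim and comparing it in constant time:

```typescript
import { timingSafeEqual } from 'node:crypto';

// Sketch: reject the request unless the signature header matches the secret.
// The real package implementation may differ in detail.
function isValidSignature(header: string | null, secret: string): boolean {
  if (!header) return false;

  const received = Buffer.from(header);
  const expected = Buffer.from(secret);

  // timingSafeEqual throws on length mismatch, so compare lengths first
  return (
    received.length === expected.length && timingSafeEqual(received, expected)
  );
}
```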
## Testing Webhooks Locally
### Local Development Setup
When running Supabase locally, webhooks need to reach your Next.js server:
1. Start your development server on a known port:
```bash
pnpm run dev
```
2. Configure the webhook URL in your local Supabase to point to `http://host.docker.internal:3000/api/db/webhook` (Docker) or `http://localhost:3000/api/db/webhook`.
### Manual Testing
Test your webhook handler by sending a mock request:
```bash
curl -X POST http://localhost:3000/api/db/webhook \
-H "Content-Type: application/json" \
-H "X-Supabase-Event-Signature: your-secret-key" \
-d '{
"type": "INSERT",
"table": "projects",
"schema": "public",
"record": {
"id": "test-id",
"name": "Test Project",
"account_id": "account-id"
},
"old_record": null
}'
```
Expected response: `200 OK`
### Debugging Tips
**Webhook not firing**: Check that the trigger exists in Supabase and the URL is correct.
**Handler not executing**: Add logging to trace the event flow:
```tsx
async handleEvent(change) {
console.log('Received webhook:', {
type: change.type,
table: change.table,
recordId: change.record?.id,
});
}
```
**Timeout errors**: Move long operations to background jobs. Webhooks should respond quickly.
## Common Use Cases
| Use Case | Trigger | Action |
|----------|---------|--------|
| Welcome email | INSERT on `users` | Send onboarding email |
| Invitation email | INSERT on `invitations` | Send invite link |
| Subscription change | UPDATE on `subscriptions` | Sync with CRM |
| User deletion | DELETE on `accounts` | Clean up external services |
| Audit logging | INSERT/UPDATE/DELETE | Write to audit table |
| Search indexing | INSERT/UPDATE | Update search index |
## Related Resources
- [Database Schema](/docs/next-supabase-turbo/development/database-schema) for extending your schema
- [Database Functions](/docs/next-supabase-turbo/development/database-functions) for built-in SQL functions
- [Email Configuration](/docs/next-supabase-turbo/emails/email-configuration) for sending emails from webhooks



@@ -0,0 +1,210 @@
---
status: "published"
label: "External Marketing Website"
title: "External Marketing Website in the Next.js Supabase Turbo Starter Kit"
description: "Configure Makerkit to redirect marketing pages to an external website built with Framer, Webflow, or WordPress."
order: 9
---
Redirect Makerkit's marketing pages to an external website by configuring the `proxy.ts` middleware. This lets you use Framer, Webflow, or WordPress for your marketing site while keeping Makerkit for your SaaS application.
{% sequence title="External Marketing Website Setup" description="Configure redirects to your external marketing site" %}
[Understand the architecture](#when-to-use-an-external-marketing-website)
[Configure the middleware](#configuring-the-middleware)
[Handle subpaths and assets](#handling-subpaths-and-assets)
[Verify the redirects](#verify-the-redirects)
{% /sequence %}
## When to Use an External Marketing Website
Use an external marketing website when:
- **Marketing team independence**: Your marketing team needs to update content without developer involvement
- **Design flexibility**: You want visual builders like Framer or Webflow for landing pages
- **Content management**: WordPress or a headless CMS better fits your content workflow
- **A/B testing**: Your marketing tools integrate better with external platforms
Keep marketing pages in Makerkit when:
- You want a unified codebase and deployment
- Your team is comfortable with React and Tailwind
- You need tight integration between marketing and app features
## Configuring the Middleware
{% alert type="default" title="Next.js 16+" %}
In Next.js 16+, Makerkit uses `proxy.ts` for middleware. Prior versions used `middleware.ts`.
{% /alert %}
Edit `apps/web/proxy.ts` to redirect marketing pages:
```typescript {% title="apps/web/proxy.ts" %}
import type { NextRequest } from 'next/server';
import { NextResponse } from 'next/server';
const EXTERNAL_MARKETING_URL = 'https://your-marketing-site.com';
const MARKETING_PAGES = [
'/',
'/pricing',
'/faq',
'/contact',
'/about',
'/blog',
'/privacy-policy',
'/terms-of-service',
'/cookie-policy',
];
export function proxy(req: NextRequest) {
if (isMarketingPage(req)) {
const redirectUrl = new URL(
req.nextUrl.pathname,
EXTERNAL_MARKETING_URL
);
// Preserve query parameters
redirectUrl.search = req.nextUrl.search;
return NextResponse.redirect(redirectUrl, { status: 301 });
}
// Continue with existing middleware logic
return NextResponse.next();
}
function isMarketingPage(req: NextRequest): boolean {
const pathname = req.nextUrl.pathname;
return MARKETING_PAGES.some((page) => {
if (page === '/') {
return pathname === '/';
}
return pathname === page || pathname.startsWith(`${page}/`);
});
}
```
### Configuration Options
| Option | Description |
|--------|-------------|
| `EXTERNAL_MARKETING_URL` | Your external marketing site's base URL |
| `MARKETING_PAGES` | Array of paths to redirect |
| Status code `301` | Permanent redirect (SEO-friendly) |
| Status code `302` | Temporary redirect (for testing) |
## Handling Subpaths and Assets
### Blog Posts with Dynamic Paths
If your blog uses dynamic paths like `/blog/post-slug`, handle them separately:
```typescript
const MARKETING_PAGES = [
// ... other pages
];
const MARKETING_PREFIXES = [
'/blog',
'/resources',
'/case-studies',
];
function isMarketingPage(req: NextRequest): boolean {
const pathname = req.nextUrl.pathname;
// Check exact matches
if (MARKETING_PAGES.includes(pathname)) {
return true;
}
// Check prefix matches
return MARKETING_PREFIXES.some((prefix) =>
pathname.startsWith(prefix)
);
}
```
### Excluding Application Routes
Keep certain routes in Makerkit even if they share a marketing prefix:
```typescript
const APP_ROUTES = [
'/blog/admin', // Blog admin panel stays in Makerkit
'/pricing/checkout', // Checkout flow stays in Makerkit
];
function isMarketingPage(req: NextRequest): boolean {
const pathname = req.nextUrl.pathname;
// Never redirect app routes
if (APP_ROUTES.some((route) => pathname.startsWith(route))) {
return false;
}
// ... rest of the logic
}
```
## Verify the Redirects
After configuring, verify redirects work correctly:
```bash
# Start the development server
pnpm run dev
# Test a redirect (should return 301)
curl -I http://localhost:3000/pricing
```
Expected output:
```
HTTP/1.1 301 Moved Permanently
Location: https://your-marketing-site.com/pricing
```
### Common Issues
**Redirect loops**: Ensure your external site doesn't redirect back to Makerkit.
**Missing query parameters**: The example code preserves query params. Verify UTM parameters pass through correctly.
**Asset requests**: Don't redirect asset paths like `/images/` or `/_next/`. The middleware should only match page routes.
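To be safe, you can guard the matcher against asset paths explicitly. A self-contained sketch — the prefix list is an assumption, adjust it to your app:

```typescript
// Sketch: never treat asset or framework paths as marketing pages.
const ASSET_PREFIXES = ['/_next/', '/images/', '/fonts/', '/api/'];

function isAssetPath(pathname: string): boolean {
  return ASSET_PREFIXES.some((prefix) => pathname.startsWith(prefix));
}

function shouldRedirect(pathname: string, marketingPages: string[]): boolean {
  if (isAssetPath(pathname)) return false;

  return marketingPages.some((page) =>
    page === '/'
      ? pathname === '/'
      : pathname === page || pathname.startsWith(`${page}/`),
  );
}

console.log(shouldRedirect('/pricing', ['/pricing'])); // true
console.log(shouldRedirect('/_next/static/chunk.js', ['/pricing'])); // false
```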
## Environment-Based Configuration
Use environment variables for different environments:
```typescript {% title="apps/web/proxy.ts" %}
const EXTERNAL_MARKETING_URL = process.env.EXTERNAL_MARKETING_URL;
export function proxy(req: NextRequest) {
// Only redirect if external URL is configured
if (!EXTERNAL_MARKETING_URL) {
return NextResponse.next();
}
// ... redirect logic
}
```
```bash {% title=".env.production" %}
EXTERNAL_MARKETING_URL=https://your-marketing-site.com
```
This lets you keep marketing pages in Makerkit during development while redirecting in production.
## Related Resources
- [Marketing Pages](/docs/next-supabase-turbo/development/marketing-pages) for customizing built-in marketing pages
- [SEO Configuration](/docs/next-supabase-turbo/development/seo) for sitemap and meta tag setup
- [Legal Pages](/docs/next-supabase-turbo/development/legal-pages) for privacy policy and terms pages
---
status: "published"
label: "Legal Pages"
title: "Legal Pages in the Next.js Supabase Turbo Starter Kit"
description: "Create and customize legal pages including Terms of Service, Privacy Policy, and Cookie Policy in your Makerkit application."
order: 8
---
Legal pages in Makerkit live at `apps/web/app/[locale]/(marketing)/(legal)/`. The kit includes placeholder pages for Terms of Service, Privacy Policy, and Cookie Policy that you must customize with your own content.
{% sequence title="Legal Pages Setup" description="Configure your SaaS application's legal pages" %}
[Understand the included pages](#included-legal-pages)
[Customize the content](#customizing-legal-pages)
[Add new legal pages](#adding-new-legal-pages)
[Use a CMS for legal content](#using-a-cms-for-legal-pages)
{% /sequence %}
## Included Legal Pages
Makerkit includes three legal page templates:
| Page | File Location | URL |
|------|---------------|-----|
| Terms of Service | `apps/web/app/[locale]/(marketing)/(legal)/terms-of-service/page.tsx` | `/terms-of-service` |
| Privacy Policy | `apps/web/app/[locale]/(marketing)/(legal)/privacy-policy/page.tsx` | `/privacy-policy` |
| Cookie Policy | `apps/web/app/[locale]/(marketing)/(legal)/cookie-policy/page.tsx` | `/cookie-policy` |
{% alert type="error" title="Required: Add Your Own Content" %}
The included legal pages contain placeholder text only. You must replace this content with legally compliant policies for your jurisdiction and business model. Consult a lawyer for proper legal documentation.
{% /alert %}
## Customizing Legal Pages
### Basic MDX Structure
Content-heavy legal pages are easiest to maintain as MDX with frontmatter; replace the default `page.tsx` with a `page.mdx`:
```mdx {% title="apps/web/app/[locale]/(marketing)/(legal)/privacy-policy/page.mdx" %}
---
title: "Privacy Policy"
description: "How we collect, use, and protect your personal information"
---
# Privacy Policy
**Last updated: January 2026**
## Information We Collect
We collect information you provide directly...
## How We Use Your Information
We use the information we collect to...
## Contact Us
If you have questions about this Privacy Policy, contact us at...
```
### Adding Last Updated Dates
Include a visible "Last updated" date in your legal pages. This helps with compliance and user trust:
```mdx
**Last updated: January 15, 2026**
*This policy is effective as of the date above and replaces any prior versions.*
```
### Structuring Long Documents
For complex legal documents, use clear heading hierarchy:
```mdx
# Privacy Policy
## 1. Information Collection
### 1.1 Information You Provide
### 1.2 Information Collected Automatically
### 1.3 Information from Third Parties
## 2. Use of Information
### 2.1 Service Delivery
### 2.2 Communications
### 2.3 Analytics and Improvements
## 3. Data Sharing
...
```
## Adding New Legal Pages
Create additional legal pages in the `(legal)` directory:
```bash
# Create a new legal page
mkdir -p apps/web/app/\[locale\]/\(marketing\)/\(legal\)/acceptable-use
touch apps/web/app/\[locale\]/\(marketing\)/\(legal\)/acceptable-use/page.mdx
```
Add the content:
```mdx {% title="apps/web/app/[locale]/(marketing)/(legal)/acceptable-use/page.mdx" %}
---
title: "Acceptable Use Policy"
description: "Guidelines for using our service responsibly"
---
# Acceptable Use Policy
**Last updated: January 2026**
This Acceptable Use Policy outlines prohibited activities...
```
### Update Navigation
Add links to new legal pages in your footer or navigation. The footer typically lives in:
```
apps/web/app/[locale]/(marketing)/_components/site-footer.tsx
```
### Update Sitemap
Add new legal pages to your sitemap in `apps/web/app/sitemap.xml/route.ts`:
```typescript
function getPaths() {
const paths = [
// ... existing paths
'/acceptable-use', // Add new legal page
];
return paths.map((path) => ({
loc: new URL(path, appConfig.url).href,
lastmod: new Date().toISOString(),
}));
}
```
## Using a CMS for Legal Pages
For organizations that need non-developers to update legal content, use the CMS integration:
```tsx {% title="apps/web/app/[locale]/(marketing)/(legal)/privacy-policy/page.tsx" %}
import { createCmsClient } from '@kit/cms';
export default async function PrivacyPolicyPage() {
const cms = await createCmsClient();
const { title, content } = await cms.getContentBySlug({
slug: 'privacy-policy',
collection: 'pages',
});
return (
<article className="prose prose-gray max-w-3xl mx-auto py-12">
<h1>{title}</h1>
<div dangerouslySetInnerHTML={{ __html: content }} />
</article>
);
}
```
### CMS Setup for Legal Pages
1. Create a `pages` collection in your CMS (Keystatic, WordPress, or custom)
2. Add entries for each legal page with slugs matching the URL paths
3. Use the CMS admin interface to edit content without code changes
See the [CMS documentation](/docs/next-supabase-turbo/content/cms) for detailed setup instructions.
## Legal Page Best Practices
### What to Include
**Privacy Policy** should cover:
- What data you collect (personal info, usage data, cookies)
- How you use the data
- Third-party services (analytics, payment processors)
- User rights (access, deletion, portability)
- Contact information
**Terms of Service** should cover:
- Service description and limitations
- User responsibilities and prohibited uses
- Payment terms (if applicable)
- Intellectual property rights
- Limitation of liability
- Termination conditions
**Cookie Policy** should cover:
- Types of cookies used (essential, analytics, marketing)
- Purpose of each cookie type
- How to manage cookie preferences
- Third-party cookies
### Compliance Considerations
| Regulation | Requirements |
|------------|--------------|
| GDPR (EU) | Privacy policy, cookie consent, data subject rights |
| CCPA (California) | Privacy policy with specific disclosures, opt-out rights |
| LGPD (Brazil) | Privacy policy, consent mechanisms, data protection officer |
{% alert type="warning" title="Not Legal Advice" %}
This documentation provides technical guidance only. Consult qualified legal counsel to ensure your policies comply with applicable laws and regulations.
{% /alert %}
## Related Resources
- [CMS Configuration](/docs/next-supabase-turbo/content/cms) for managing legal content through a CMS
- [Marketing Pages](/docs/next-supabase-turbo/development/marketing-pages) for customizing other marketing content
- [SEO Configuration](/docs/next-supabase-turbo/development/seo) for proper indexing of legal pages
---
status: "published"
label: "Loading data from the DB"
order: 4
title: "Learn how to load data from the Supabase database"
description: "In this page we learn how to load data from the Supabase database and display it in our Next.js application."
---
Now that our database supports the data we need, we can start loading it into our application. We will use the `@makerkit/data-loader-supabase-nextjs` package to load data from the Supabase database.
Please check the [documentation](https://github.com/makerkit/makerkit/tree/main/packages/data-loader/supabase/nextjs) for the `@makerkit/data-loader-supabase-nextjs` package to learn more about how to use it.
This nifty package allows us to load data from the Supabase database and display it in our server components with support for pagination.
In the snippet below, we will:
1. Load the user's workspace data from the database. This allows us to get the user's account ID without further round-trips because the workspace is loaded by the user layout.
2. Load the user's tasks from the database.
3. Display the tasks in a table.
4. Use a search input to filter the tasks by title.
Let's take a look at the code:
```tsx
import { ServerDataLoader } from '@makerkit/data-loader-supabase-nextjs';
import { getSupabaseServerClient } from '@kit/supabase/server-client';
import { Button } from '@kit/ui/button';
import { Heading } from '@kit/ui/heading';
import { If } from '@kit/ui/if';
import { Input } from '@kit/ui/input';
import { PageBody } from '@kit/ui/page';
import { Trans } from '@kit/ui/trans';
import { getTranslations } from 'next-intl/server';
import { TasksTable } from './_components/tasks-table';
import { UserAccountHeader } from './_components/user-account-header';
import { loadUserWorkspace } from './_lib/server/load-user-workspace';
interface SearchParams {
page?: string;
query?: string;
}
export const generateMetadata = async () => {
const t = await getTranslations('account');
const title = t('homePage');
return {
title,
};
};
async function UserHomePage(props: { searchParams: Promise<SearchParams> }) {
const client = getSupabaseServerClient();
const { user } = await loadUserWorkspace();
const searchParams = await props.searchParams;
const page = parseInt(searchParams.page ?? '1', 10);
const query = searchParams.query ?? '';
return (
<>
<UserAccountHeader
title={<Trans i18nKey={'common.homeTabLabel'} />}
description={<Trans i18nKey={'common.homeTabDescription'} />}
/>
<PageBody className={'space-y-4'}>
<div className={'flex items-center justify-between'}>
<div>
<Heading level={4}>
<Trans i18nKey={'tasks.tasksTabLabel'} defaults={'Tasks'} />
</Heading>
</div>
<div className={'flex items-center space-x-2'}>
<form className={'w-full'}>
<Input
name={'query'}
defaultValue={query}
className={'w-full lg:w-[18rem]'}
placeholder={'Search tasks'}
/>
</form>
</div>
</div>
<ServerDataLoader
client={client}
table={'tasks'}
page={page}
where={{
account_id: {
eq: user.id,
},
title: {
textSearch: query ? `%${query}%` : undefined,
},
}}
>
{(props) => {
return (
<div className={'flex flex-col space-y-8'}>
<If condition={props.count === 0 && query}>
<div className={'flex flex-col space-y-2.5'}>
<p>
<Trans
i18nKey={'tasks.noTasksFound'}
values={{ query }}
/>
</p>
<form>
<input type="hidden" name={'query'} defaultValue={''} />
<Button variant={'outline'} size={'sm'}>
<Trans i18nKey={'tasks.clearSearch'} />
</Button>
</form>
</div>
</If>
<TasksTable {...props} />
</div>
);
}}
</ServerDataLoader>
</PageBody>
</>
);
}
export default UserHomePage;
```
Let's break this down a bit:
1. We import the necessary components and functions.
2. We define the `SearchParams` interface to type the search parameters.
3. We define the `generateMetadata` function to generate the page metadata.
4. We define the `UserHomePage` component that loads the user's workspace and tasks from the database.
5. We define the `ServerDataLoader` component that loads the tasks from the database.
6. We render the tasks in a table and provide a search input to filter the tasks by title.
7. We export the `UserHomePage` component.
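Note that `parseInt(searchParams.page ?? '1', 10)` yields `NaN` for a malformed query string and accepts negative numbers. A small defensive helper (not part of the kit, just a sketch) clamps the value:

```typescript
// Sketch: defensively parse the ?page= search param.
function parsePage(raw: string | undefined): number {
  const n = parseInt(raw ?? '1', 10);

  // Fall back to page 1 on NaN, zero, or negative values.
  return Number.isFinite(n) && n >= 1 ? n : 1;
}

console.log(parsePage('3'));   // 3
console.log(parsePage('abc')); // 1
console.log(parsePage('-2'));  // 1
```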
### Displaying the tasks in a table
Now, let's show the tasks table component:
```tsx
'use client';
import Link from 'next/link';
import { ColumnDef } from '@tanstack/react-table';
import { Pencil } from 'lucide-react';
import { useTranslations } from 'next-intl';
import { Button } from '@kit/ui/button';
import { DataTable } from '@kit/ui/enhanced-data-table';
import { Database } from '~/lib/database.types';
type Task = Database['public']['Tables']['tasks']['Row'];
export function TasksTable(props: {
data: Task[];
page: number;
pageSize: number;
pageCount: number;
}) {
const columns = useGetColumns();
return (
<div>
<DataTable {...props} columns={columns} />
</div>
);
}
function useGetColumns(): ColumnDef<Task>[] {
const t = useTranslations('tasks');
return [
{
header: t('task'),
cell: ({ row }) => (
<Link
className={'hover:underline'}
href={`/home/tasks/${row.original.id}`}
>
{row.original.title}
</Link>
),
},
{
header: t('createdAt'),
accessorKey: 'created_at',
},
{
header: t('updatedAt'),
accessorKey: 'updated_at',
},
{
header: '',
id: 'actions',
cell: ({ row }) => {
const id = row.original.id;
return (
<div className={'flex justify-end space-x-2'}>
<Link href={`/home/tasks/${id}`}>
<Button variant={'ghost'} size={'icon'}>
<Pencil className={'h-4'} />
</Button>
</Link>
</div>
);
},
},
];
}
```
In this snippet, we define the `TasksTable` component that renders the tasks in a table. We use the `DataTable` component from the `@kit/ui/enhanced-data-table` package to render the table.
We also define the `useGetColumns` hook that returns the columns for the table. We use the `useTranslations` hook from `next-intl` to translate the column headers.
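As written, the `created_at` and `updated_at` columns render raw ISO timestamps. If you want friendlier dates, a small formatting helper works well as a custom `cell`. A sketch — the locale and time zone are pinned only to keep output deterministic:

```typescript
// Sketch: format an ISO timestamp for a table cell.
function formatDateCell(iso: string): string {
  return new Intl.DateTimeFormat('en-US', {
    dateStyle: 'medium',
    timeZone: 'UTC',
  }).format(new Date(iso));
}

console.log(formatDateCell('2026-01-15T10:30:00Z')); // Jan 15, 2026
```

Wire it into a column with `cell: ({ row }) => formatDateCell(row.original.created_at)` instead of the plain `accessorKey`.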
---
status: "published"
label: "Marketing Pages"
title: "Customize Marketing Pages in the Next.js Supabase Turbo Starter Kit"
description: "Build and customize landing pages, pricing pages, FAQ, and other marketing content using Next.js App Router and Tailwind CSS."
order: 7
---
Marketing pages in Makerkit live at `apps/web/app/[locale]/(marketing)/` and include landing pages, pricing, FAQ, blog, documentation, and contact forms. These pages use Next.js App Router with React Server Components for fast initial loads and SEO optimization.
{% sequence title="Marketing Pages Development" description="Customize and extend your marketing pages" %}
[Understand the structure](#marketing-pages-structure)
[Customize existing pages](#customizing-existing-pages)
[Create new marketing pages](#creating-new-marketing-pages)
[Configure navigation and footer](#navigation-and-footer)
{% /sequence %}
## Marketing Pages Structure
The marketing pages follow Next.js App Router conventions with a route group:
```
apps/web/app/[locale]/(marketing)/
├── layout.tsx # Shared layout with header/footer
├── page.tsx # Home page (/)
├── (legal)/ # Legal pages group
│ ├── cookie-policy/
│ ├── privacy-policy/
│ └── terms-of-service/
├── blog/ # Blog listing and posts
├── changelog/ # Product changelog
├── contact/ # Contact form
├── docs/ # Documentation
├── faq/ # FAQ page
├── pricing/ # Pricing page
└── _components/ # Shared marketing components
├── header.tsx
├── footer.tsx
└── site-navigation.tsx
```
### Route Groups Explained
The `(marketing)` folder is a route group that shares a layout without affecting the URL structure. Pages inside render at the root level:
| File Path | URL |
|-----------|-----|
| `app/[locale]/(marketing)/page.tsx` | `/` |
| `app/[locale]/(marketing)/pricing/page.tsx` | `/pricing` |
| `app/[locale]/(marketing)/blog/page.tsx` | `/blog` |
## Customizing Existing Pages
### Home Page
The home page at `apps/web/app/[locale]/(marketing)/page.tsx` typically includes:
```tsx {% title="apps/web/app/[locale]/(marketing)/page.tsx" %}
import { Hero } from './_components/hero';
import { Features } from './_components/features';
import { Testimonials } from './_components/testimonials';
import { Pricing } from './_components/pricing-section';
import { CallToAction } from './_components/call-to-action';
export default function HomePage() {
return (
<>
<Hero />
<Features />
<Testimonials />
<Pricing />
<CallToAction />
</>
);
}
```
Each section is a separate component in `_components/` for easy customization.
### Pricing Page
The pricing page displays your billing plans. It reads configuration from `apps/web/config/billing.config.ts`:
```tsx {% title="apps/web/app/[locale]/(marketing)/pricing/page.tsx" %}
import { PricingTable } from '@kit/billing-gateway/marketing';
import billingConfig from '~/config/billing.config';
export default function PricingPage() {
return (
<div className="container py-16">
<h1 className="text-4xl font-bold text-center mb-4">
Simple, Transparent Pricing
</h1>
<p className="text-muted-foreground text-center mb-12">
Choose the plan that fits your needs
</p>
<PricingTable config={billingConfig} />
</div>
);
}
```
See [Billing Configuration](/docs/next-supabase-turbo/billing/overview) for customizing plans and pricing.
### FAQ Page
The FAQ page uses an accordion component with content from a configuration file or CMS:
```tsx {% title="apps/web/app/[locale]/(marketing)/faq/page.tsx" %}
import {
Accordion,
AccordionContent,
AccordionItem,
AccordionTrigger,
} from '@kit/ui/accordion';
const faqs = [
{
question: 'How do I get started?',
answer: 'Sign up for a free account and follow our getting started guide.',
},
{
question: 'Can I cancel anytime?',
answer: 'Yes, you can cancel your subscription at any time with no penalties.',
},
// ... more FAQs
];
export default function FAQPage() {
return (
<div className="container max-w-3xl py-16">
<h1 className="text-4xl font-bold text-center mb-12">
Frequently Asked Questions
</h1>
<Accordion type="single" collapsible>
{faqs.map((faq, index) => (
<AccordionItem key={index} value={`item-${index}`}>
<AccordionTrigger>{faq.question}</AccordionTrigger>
<AccordionContent>{faq.answer}</AccordionContent>
</AccordionItem>
))}
</Accordion>
</div>
);
}
```
### Contact Page
The contact page includes a form that sends emails via your configured mailer:
```tsx {% title="apps/web/app/[locale]/(marketing)/contact/page.tsx" %}
import { ContactForm } from './_components/contact-form';
export default function ContactPage() {
return (
<div className="container max-w-xl py-16">
<h1 className="text-4xl font-bold text-center mb-4">
Contact Us
</h1>
<p className="text-muted-foreground text-center mb-8">
Have a question? We'd love to hear from you.
</p>
<ContactForm />
</div>
);
}
```
#### Contact Form Configuration
Configure the recipient email address in your environment:
```bash {% title=".env.local" %}
CONTACT_EMAIL=support@yourdomain.com
```
The form submission uses your [email configuration](/docs/next-supabase-turbo/emails/email-configuration). Ensure your mailer is configured before the contact form will work.
## Creating New Marketing Pages
### Basic Page Structure
Create a new page with proper metadata:
```tsx {% title="apps/web/app/[locale]/(marketing)/about/page.tsx" %}
import type { Metadata } from 'next';
export const metadata: Metadata = {
title: 'About Us | Your SaaS Name',
description: 'Learn about our mission, team, and the story behind our product.',
};
export default function AboutPage() {
return (
<div className="container py-16">
<h1 className="text-4xl font-bold mb-8">About Us</h1>
<div className="prose prose-gray max-w-none">
<p>Your company story goes here...</p>
</div>
</div>
);
}
```
### MDX Pages for Content-Heavy Pages
For content-heavy pages, use MDX:
```bash
# Create an MDX page
mkdir -p apps/web/app/\[locale\]/\(marketing\)/about
touch apps/web/app/\[locale\]/\(marketing\)/about/page.mdx
```
```mdx {% title="apps/web/app/[locale]/(marketing)/about/page.mdx" %}
---
title: "About Us"
description: "Learn about our mission and team"
---
# About Us
We started this company because...
## Our Mission
To help developers ship faster...
## The Team
Meet the people behind the product...
```
### Dynamic Pages with Data
For pages that need dynamic data, combine Server Components with data fetching:
```tsx {% title="apps/web/app/[locale]/(marketing)/customers/page.tsx" %}
import { createCmsClient } from '@kit/cms';
export default async function CustomersPage() {
const cms = await createCmsClient();
const caseStudies = await cms.getContentItems({
collection: 'case-studies',
limit: 10,
});
return (
<div className="container py-16">
<h1 className="text-4xl font-bold mb-12">Customer Stories</h1>
<div className="grid md:grid-cols-2 gap-8">
{caseStudies.map((study) => (
<CaseStudyCard key={study.slug} {...study} />
))}
</div>
</div>
);
}
```
## Navigation and Footer
### Header Navigation
Configure navigation links in the header component:
```tsx {% title="apps/web/app/[locale]/(marketing)/_components/site-navigation.tsx" %}
const navigationItems = [
{ label: 'Features', href: '/#features' },
{ label: 'Pricing', href: '/pricing' },
{ label: 'Blog', href: '/blog' },
{ label: 'Docs', href: '/docs' },
{ label: 'Contact', href: '/contact' },
];
```
### Footer Links
The footer typically includes multiple link sections:
```tsx {% title="apps/web/app/[locale]/(marketing)/_components/footer.tsx" %}
const footerSections = [
{
title: 'Product',
links: [
{ label: 'Features', href: '/#features' },
{ label: 'Pricing', href: '/pricing' },
{ label: 'Changelog', href: '/changelog' },
],
},
{
title: 'Resources',
links: [
{ label: 'Documentation', href: '/docs' },
{ label: 'Blog', href: '/blog' },
{ label: 'FAQ', href: '/faq' },
],
},
{
title: 'Legal',
links: [
{ label: 'Privacy Policy', href: '/privacy-policy' },
{ label: 'Terms of Service', href: '/terms-of-service' },
{ label: 'Cookie Policy', href: '/cookie-policy' },
],
},
];
```
### Customizing the Layout
All marketing pages inherit from `apps/web/app/[locale]/(marketing)/layout.tsx`. This layout includes:
- Header with navigation
- Footer with links
- Common metadata
- Analytics scripts
Edit this file to change the shared structure across all marketing pages.
## SEO for Marketing Pages
### Metadata API
Use Next.js Metadata API for SEO:
```tsx {% title="apps/web/app/[locale]/(marketing)/pricing/page.tsx" %}
import type { Metadata } from 'next';
export const metadata: Metadata = {
title: 'Pricing | Your SaaS Name',
description: 'Choose from flexible pricing plans. Start free, upgrade when ready.',
openGraph: {
title: 'Pricing | Your SaaS Name',
description: 'Choose from flexible pricing plans.',
images: ['/images/og/pricing.png'],
},
};
```
### Structured Data
Add JSON-LD structured data for rich search results. See the [Next.js JSON-LD guide](https://nextjs.org/docs/app/guides/json-ld) for more details:
```tsx
// JSON-LD structured data using Next.js metadata
export default function PricingPage() {
return (
<>
<script
type="application/ld+json"
dangerouslySetInnerHTML={{
__html: JSON.stringify({
'@context': 'https://schema.org',
'@type': 'Product',
name: 'Your SaaS Name',
offers: {
'@type': 'AggregateOffer',
lowPrice: '0',
highPrice: '99',
priceCurrency: 'USD',
},
}),
}}
/>
{/* Page content */}
</>
);
}
```
### Sitemap
Add new marketing pages to your sitemap at `apps/web/app/sitemap.xml/route.ts`:
```typescript
function getPaths() {
return [
'/',
'/pricing',
'/faq',
'/blog',
'/docs',
'/contact',
'/about', // Add new pages here
];
}
```
## Related Resources
- [SEO Configuration](/docs/next-supabase-turbo/development/seo) for detailed SEO setup
- [Legal Pages](/docs/next-supabase-turbo/development/legal-pages) for privacy policy and terms
- [External Marketing Website](/docs/next-supabase-turbo/development/external-marketing-website) for using Framer or Webflow
- [CMS Setup](/docs/next-supabase-turbo/content/cms) for blog configuration
- [Email Configuration](/docs/next-supabase-turbo/emails/email-configuration) for contact form setup
---
status: "published"
label: "Migrations"
order: 1
title: "Database Migrations in the Next.js Supabase Starter Kit"
description: "Create and manage database migrations using Supabase's declarative schema and diffing tools to evolve your PostgreSQL schema safely."
---
Database migrations in Makerkit use Supabase's declarative schema approach. Define your schema in SQL files at `apps/web/supabase/schemas/`, then generate migration files that track changes over time. This keeps your schema version-controlled and deployable across environments.
{% sequence title="Database Migration Workflow" description="Create and apply schema changes safely" %}
[Edit the declarative schema](#editing-the-declarative-schema)
[Generate a migration file](#generating-a-migration-file)
[Test locally](#testing-locally)
[Push to production](#pushing-to-production)
{% /sequence %}
## Why Declarative Schema?
Makerkit uses declarative schema files instead of incremental migrations for several reasons:
- **Readable**: See your entire schema in one place
- **Mergeable**: Schema changes are easier to review in PRs
- **Recoverable**: Always know the intended state of your database
- **Automated**: Supabase generates migration diffs for you
{% alert type="warning" title="Avoid Supabase Studio for Schema Changes" %}
Don't use the hosted Supabase Studio to modify your schema. Changes made there won't be tracked in your codebase. Use your local Supabase instance and generate migrations from schema files.
{% /alert %}
## Schema File Organization
Schema files live in `apps/web/supabase/schemas/`:
```
apps/web/supabase/
├── config.toml # Supabase configuration
├── seed.sql # Seed data for development
├── schemas/ # Declarative schema files
│ ├── 00-extensions.sql
│ ├── 01-enums.sql
│ ├── 02-accounts.sql
│ ├── 03-roles.sql
│ ├── 04-memberships.sql
│ ├── 05-subscriptions.sql
│ └── your-feature.sql # Your custom schema
└── migrations/ # Generated migration files
├── 20240101000000_initial.sql
└── 20240115000000_add_projects.sql
```
Files are loaded alphabetically, so prefix with numbers to control order.
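You can verify the load order locally — it is plain lexicographic filename order, which is why the numeric prefixes matter. The filenames below are hypothetical:

```shell
# Sketch: schema files apply in lexicographic order,
# so numeric prefixes control sequencing.
dir=$(mktemp -d)
touch "$dir/02-accounts.sql" "$dir/00-extensions.sql" "$dir/10-projects.sql"
ls "$dir"
# 00-extensions.sql
# 02-accounts.sql
# 10-projects.sql
```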
## Editing the Declarative Schema
### Adding a New Table
Create a schema file for your feature:
```sql {% title="apps/web/supabase/schemas/20-projects.sql" %}
-- Projects table for team workspaces
create table if not exists public.projects (
id uuid primary key default gen_random_uuid(),
account_id uuid not null references public.accounts(id) on delete cascade,
name text not null,
description text,
status text not null default 'active' check (status in ('active', 'archived')),
created_at timestamptz not null default now(),
updated_at timestamptz not null default now()
);
-- Enable RLS
alter table public.projects enable row level security;
-- RLS policies
create policy "Users can view their account's projects"
on public.projects
for select
using (
account_id in (
select account_id from public.accounts_memberships
where user_id = auth.uid()
)
);
create policy "Users with write permission can insert projects"
on public.projects
for insert
with check (
public.has_permission(auth.uid(), account_id, 'projects.write'::app_permissions)
);
-- Updated at trigger
create trigger set_projects_updated_at
before update on public.projects
for each row execute function public.set_updated_at();
```
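The trigger above calls `public.set_updated_at()`. Makerkit ships this helper; if your project does not define it, a minimal version looks like this (a sketch, not the kit's exact implementation):

```sql
-- Keep updated_at current on every row update
create or replace function public.set_updated_at()
returns trigger
language plpgsql
as $$
begin
  new.updated_at = now();
  return new;
end;
$$;
```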
### Modifying an Existing Table
Edit the schema file directly. For example, to add a column:
```sql {% title="apps/web/supabase/schemas/20-projects.sql" %}
create table if not exists public.projects (
id uuid primary key default gen_random_uuid(),
account_id uuid not null references public.accounts(id) on delete cascade,
name text not null,
description text,
status text not null default 'active' check (status in ('active', 'archived')),
priority integer not null default 0, -- New column
created_at timestamptz not null default now(),
updated_at timestamptz not null default now()
);
```
### Adding Indexes
Add indexes for frequently queried columns:
```sql
-- Add to your schema file
create index if not exists projects_account_id_idx
on public.projects(account_id);
create index if not exists projects_status_idx
on public.projects(status)
where status = 'active';
```
## Generating a Migration File
After editing schema files, generate a migration that captures the diff:
```bash
# Generate migration from schema changes
pnpm --filter web supabase:db:diff -f add_projects
```
This creates a timestamped migration file in `apps/web/supabase/migrations/`:
```sql {% title="apps/web/supabase/migrations/20260119000000_add_projects.sql" %}
-- Generated by Supabase CLI
create table public.projects (
id uuid primary key default gen_random_uuid(),
account_id uuid not null references public.accounts(id) on delete cascade,
name text not null,
description text,
status text not null default 'active' check (status in ('active', 'archived')),
created_at timestamptz not null default now(),
updated_at timestamptz not null default now()
);
alter table public.projects enable row level security;
-- ... policies and triggers
```
{% alert type="error" title="Always Review Generated Migrations" %}
The diffing tool has [known caveats](https://supabase.com/docs/guides/local-development/declarative-database-schemas#known-caveats). Always review generated migrations before applying them. Check for:
- Destructive operations (DROP statements)
- Missing or incorrect constraints
- Order of operations issues
{% /alert %}
## Testing Locally
Apply and test your migration locally before pushing to production:
```bash
# Stop Supabase if running
pnpm run supabase:web:stop
# Start with fresh database
pnpm run supabase:web:start
# Or reset to apply all migrations
pnpm run supabase:web:reset
```
### Verify the Schema
Check that your changes applied correctly:
```bash
# Open local Supabase Studio
open http://localhost:54323
```
Navigate to **Table Editor** and verify your table exists with the correct columns.
### Run Database Tests
If you have pgTAP tests, run them to verify RLS policies:
```bash
pnpm --filter web supabase:test
```
See [Database Tests](/docs/next-supabase-turbo/development/database-tests) for writing tests.
## Pushing to Production
After testing locally, push migrations to your remote Supabase instance:
```bash
# Link to your Supabase project (first time only)
pnpm --filter web supabase link --project-ref your-project-ref
# Push migrations
pnpm --filter web supabase db push
```
### Migration Commands Reference
| Command | Description |
|---------|-------------|
| `pnpm run supabase:web:start` | Start local Supabase |
| `pnpm run supabase:web:stop` | Stop local Supabase |
| `pnpm run supabase:web:reset` | Reset and apply all migrations |
| `pnpm --filter web supabase:db:diff -f <name>` | Generate migration from schema diff |
| `pnpm --filter web supabase db push` | Push migrations to remote |
| `pnpm --filter web supabase:typegen` | Regenerate TypeScript types |
## Regenerating TypeScript Types
After schema changes, regenerate the TypeScript types:
```bash
pnpm --filter web supabase:typegen
```
This updates `packages/supabase/src/database.types.ts` with your new tables and columns. Import types in your code:
```tsx
import type { Database } from '@kit/supabase/database';
type Project = Database['public']['Tables']['projects']['Row'];
type NewProject = Database['public']['Tables']['projects']['Insert'];
```
## Common Patterns
### Adding a Lookup Table
```sql
-- Status enum as lookup table
create table if not exists public.project_statuses (
id text primary key,
label text not null,
sort_order integer not null default 0
);
insert into public.project_statuses (id, label, sort_order) values
('active', 'Active', 1),
('archived', 'Archived', 2),
('deleted', 'Deleted', 3)
on conflict (id) do nothing;
```
### Adding a Junction Table
```sql
-- Many-to-many relationship
create table if not exists public.project_members (
project_id uuid not null references public.projects(id) on delete cascade,
user_id uuid not null references auth.users(id) on delete cascade,
role text not null default 'member',
created_at timestamptz not null default now(),
primary key (project_id, user_id)
);
```
### Data Migration
For data transformations, use a separate migration:
```sql {% title="apps/web/supabase/migrations/20260120000000_backfill_priority.sql" %}
-- Backfill priority based on status
update public.projects
set priority = case
when status = 'active' then 1
when status = 'archived' then 0
else 0
end
where priority is null;
```
## Troubleshooting
**Diff shows no changes**: Ensure your schema file is being loaded. Check file naming (alphabetical order matters).
**Migration fails on production**: The diff tool may generate invalid SQL. Review and manually fix the migration file.
**Type mismatch after migration**: Regenerate types with `pnpm --filter web supabase:typegen`.
**RLS policy errors**: Check that your policies reference valid columns and functions. Test with [database tests](/docs/next-supabase-turbo/development/database-tests).
## Related Resources
- [Database Schema](/docs/next-supabase-turbo/development/database-schema) for detailed schema patterns
- [Database Architecture](/docs/next-supabase-turbo/development/database-architecture) for understanding the data model
- [Database Functions](/docs/next-supabase-turbo/development/database-functions) for built-in SQL functions
- [Database Tests](/docs/next-supabase-turbo/development/database-tests) for testing migrations

---
status: "published"
label: 'RBAC: Roles and Permissions'
title: 'Role-Based Access Control (RBAC) in Next.js Supabase'
description: 'Implement granular permissions with roles, hierarchy levels, and the app_permissions enum. Use has_permission in RLS policies and application code.'
order: 6
---
Makerkit implements RBAC through three components: the `roles` table (defines role names and hierarchy), the `role_permissions` table (maps roles to permissions), and the `app_permissions` enum (lists all available permissions). Use the `has_permission` function in RLS policies and application code for granular access control.
{% sequence title="RBAC Implementation" description="Set up and use roles and permissions" %}
[Understand the data model](#rbac-data-model)
[Add custom permissions](#adding-custom-permissions)
[Enforce in RLS policies](#using-permissions-in-rls)
[Check permissions in code](#checking-permissions-in-application-code)
[Show/hide UI elements](#client-side-permission-checks)
{% /sequence %}
## RBAC Data Model
### The roles Table
Defines available roles and their hierarchy:
```sql
create table public.roles (
name varchar(50) primary key,
hierarchy_level integer not null default 0
);
-- Default roles
insert into public.roles (name, hierarchy_level) values
('owner', 1),
('member', 2);
```
**Hierarchy levels** determine which roles can manage others. Lower numbers indicate higher privilege. Owners (level 1) can manage members (level 2), but members cannot manage owners.
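The comparison itself is a plain numeric check. Here is a minimal TypeScript sketch of the semantics (helper names are hypothetical, not part of the kit):

```typescript
// Hypothetical model of the roles table: lower level = more privilege.
interface Role {
  name: string;
  hierarchyLevel: number;
}

// An actor can manage a target role only when the actor sits strictly
// higher in the hierarchy, i.e. has a lower level number.
function canManage(actor: Role, target: Role): boolean {
  return actor.hierarchyLevel < target.hierarchyLevel;
}

const owner: Role = { name: 'owner', hierarchyLevel: 1 };
const member: Role = { name: 'member', hierarchyLevel: 2 };

canManage(owner, member); // true — owners can manage members
canManage(member, owner); // false — members cannot manage owners
```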
### The role_permissions Table
Maps roles to their permissions:
```sql
create table public.role_permissions (
id serial primary key,
role varchar(50) references public.roles(name) on delete cascade,
permission app_permissions not null,
unique (role, permission)
);
```
### The app_permissions Enum
Lists all available permissions:
```sql
create type public.app_permissions as enum(
'roles.manage',
'billing.manage',
'settings.manage',
'members.manage',
'invites.manage'
);
```
### Default Permission Assignments
| Role | Permissions |
|------|-------------|
| `owner` | All permissions |
| `member` | `settings.manage`, `invites.manage` |
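Conceptually, the two tables form a role-to-permission map. The default assignments can be sketched in standalone TypeScript (illustrative only — in production the check runs in Postgres via `has_permission`):

```typescript
// Illustrative mirror of the default role_permissions rows.
const rolePermissions: Record<string, readonly string[]> = {
  owner: [
    'roles.manage',
    'billing.manage',
    'settings.manage',
    'members.manage',
    'invites.manage',
  ],
  member: ['settings.manage', 'invites.manage'],
};

// In-memory equivalent of looking up a role's permission.
function roleHasPermission(role: string, permission: string): boolean {
  return rolePermissions[role]?.includes(permission) ?? false;
}

roleHasPermission('owner', 'billing.manage'); // true
roleHasPermission('member', 'billing.manage'); // false
```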
## Adding Custom Permissions
### Step 1: Add to the Enum
Create a migration to add new permissions:
```sql {% title="apps/web/supabase/migrations/add_task_permissions.sql" %}
-- Add new permissions to the enum
alter type public.app_permissions add value 'tasks.read';
alter type public.app_permissions add value 'tasks.write';
alter type public.app_permissions add value 'tasks.delete';
commit;
```
{% alert type="warning" title="Enum Values Cannot Be Removed" %}
PostgreSQL enum values cannot be removed once added. Plan your permission names carefully. Use a consistent naming pattern like `resource.action`.
{% /alert %}
### Step 2: Assign to Roles
```sql
-- Owners get all task permissions
insert into public.role_permissions (role, permission) values
('owner', 'tasks.read'),
('owner', 'tasks.write'),
('owner', 'tasks.delete');
-- Members can read and write but not delete
insert into public.role_permissions (role, permission) values
('member', 'tasks.read'),
('member', 'tasks.write');
```
### Step 3: Add Custom Roles (Optional)
```sql
-- Make room between owner (1) and member (2)
update public.roles set hierarchy_level = 3 where name = 'member';
-- Add a new role
insert into public.roles (name, hierarchy_level) values
('admin', 2); -- Between owner (1) and member (now 3)
-- Assign permissions to the new role
insert into public.role_permissions (role, permission) values
('admin', 'tasks.read'),
('admin', 'tasks.write'),
('admin', 'tasks.delete'),
('admin', 'members.manage'),
('admin', 'invites.manage');
```
## Using Permissions in RLS
The `has_permission` function checks if a user has a specific permission on an account.
### Function Signature
```sql
public.has_permission(
user_id uuid,
account_id uuid,
permission_name app_permissions
) returns boolean
```
### Read Access Policy
```sql
create policy "Users with tasks.read can view tasks"
on public.tasks
for select
to authenticated
using (
public.has_permission(auth.uid(), account_id, 'tasks.read'::app_permissions)
);
```
### Write Access Policy
```sql
create policy "Users with tasks.write can create tasks"
on public.tasks
for insert
to authenticated
with check (
public.has_permission(auth.uid(), account_id, 'tasks.write'::app_permissions)
);
```
### Update Policy
```sql
create policy "Users with tasks.write can update tasks"
on public.tasks
for update
to authenticated
using (
public.has_permission(auth.uid(), account_id, 'tasks.write'::app_permissions)
)
with check (
public.has_permission(auth.uid(), account_id, 'tasks.write'::app_permissions)
);
```
### Delete Policy
```sql
create policy "Users with tasks.delete can delete tasks"
on public.tasks
for delete
to authenticated
using (
public.has_permission(auth.uid(), account_id, 'tasks.delete'::app_permissions)
);
```
### Complete Example
Here's a full schema with RLS:
```sql {% title="apps/web/supabase/schemas/20-tasks.sql" %}
-- Tasks table
create table if not exists public.tasks (
id uuid primary key default gen_random_uuid(),
account_id uuid not null references public.accounts(id) on delete cascade,
title text not null,
description text,
status text not null default 'pending',
created_at timestamptz not null default now(),
updated_at timestamptz not null default now()
);
-- Enable RLS
alter table public.tasks enable row level security;
-- RLS policies
create policy "tasks_select" on public.tasks
for select to authenticated
using (public.has_permission(auth.uid(), account_id, 'tasks.read'::app_permissions));
create policy "tasks_insert" on public.tasks
for insert to authenticated
with check (public.has_permission(auth.uid(), account_id, 'tasks.write'::app_permissions));
create policy "tasks_update" on public.tasks
for update to authenticated
using (public.has_permission(auth.uid(), account_id, 'tasks.write'::app_permissions))
with check (public.has_permission(auth.uid(), account_id, 'tasks.write'::app_permissions));
create policy "tasks_delete" on public.tasks
for delete to authenticated
using (public.has_permission(auth.uid(), account_id, 'tasks.delete'::app_permissions));
```
## Checking Permissions in Application Code
### Server-Side Check (Server Actions)
```tsx {% title="apps/web/lib/server/tasks/create-task.action.ts" %}
'use server';
import { getSupabaseServerClient } from '@kit/supabase/server-client';
import * as z from 'zod';
const schema = z.object({
accountId: z.string().uuid(),
title: z.string().min(1),
});
export async function createTask(data: z.infer<typeof schema>) {
const supabase = getSupabaseServerClient();
// Get current user
const { data: { user } } = await supabase.auth.getUser();
if (!user) {
throw new Error('Not authenticated');
}
// Check permission via RPC
const { data: hasPermission } = await supabase.rpc('has_permission', {
user_id: user.id,
account_id: data.accountId,
permission_name: 'tasks.write',
});
if (!hasPermission) {
throw new Error('You do not have permission to create tasks');
}
// Create the task (RLS will also enforce this)
const { data: task, error } = await supabase
.from('tasks')
.insert({
account_id: data.accountId,
title: data.title,
})
.select()
.single();
if (error) {
throw error;
}
return task;
}
```
### Permission Check Helper
Create a reusable helper:
```tsx {% title="apps/web/lib/server/permissions.ts" %}
import { getSupabaseServerClient } from '@kit/supabase/server-client';
export async function checkPermission(
accountId: string,
permission: string,
): Promise<boolean> {
const supabase = getSupabaseServerClient();
const { data: { user } } = await supabase.auth.getUser();
if (!user) {
return false;
}
const { data: hasPermission } = await supabase.rpc('has_permission', {
user_id: user.id,
account_id: accountId,
permission_name: permission,
});
return hasPermission ?? false;
}
export async function requirePermission(
accountId: string,
permission: string,
): Promise<void> {
const hasPermission = await checkPermission(accountId, permission);
if (!hasPermission) {
throw new Error(`Permission denied: ${permission}`);
}
}
```
Usage:
```tsx
import { requirePermission } from '~/lib/server/permissions';
export async function deleteTask(taskId: string, accountId: string) {
await requirePermission(accountId, 'tasks.delete');
// Proceed with deletion
}
```
## Client-Side Permission Checks
The Team Account Workspace loader provides permissions for UI rendering.
### Loading Permissions
```tsx {% title="apps/web/app/[locale]/home/[account]/tasks/page.tsx" %}
import { loadTeamWorkspace } from '~/home/[account]/_lib/server/team-account-workspace.loader';
interface Props {
params: Promise<{ account: string }>;
}
export default async function TasksPage({ params }: Props) {
const { account } = await params;
const workspace = await loadTeamWorkspace(account);
const permissions = workspace.account.permissions;
// permissions is string[] of permission names the user has
return (
<TasksPageClient permissions={permissions} />
);
}
```
### Conditional UI Rendering
```tsx {% title="apps/web/app/[locale]/home/[account]/tasks/_components/tasks-page-client.tsx" %}
'use client';
interface TasksPageClientProps {
permissions: string[];
}
export function TasksPageClient({ permissions }: TasksPageClientProps) {
const canWrite = permissions.includes('tasks.write');
const canDelete = permissions.includes('tasks.delete');
return (
<div>
<h1>Tasks</h1>
{canWrite && (
<Button onClick={openCreateDialog}>
Create Task
</Button>
)}
<TaskList
onDelete={canDelete ? handleDelete : undefined}
/>
</div>
);
}
```
### Permission Gate Component
Create a reusable component:
```tsx {% title="apps/web/components/permission-gate.tsx" %}
'use client';
interface PermissionGateProps {
permissions: string[];
required: string | string[];
children: React.ReactNode;
fallback?: React.ReactNode;
}
export function PermissionGate({
permissions,
required,
children,
fallback = null,
}: PermissionGateProps) {
const requiredArray = Array.isArray(required) ? required : [required];
const hasPermission = requiredArray.every((p) => permissions.includes(p));
if (!hasPermission) {
return fallback;
}
return children;
}
```
Usage:
```tsx
<PermissionGate permissions={permissions} required="tasks.delete">
<DeleteButton onClick={handleDelete} />
</PermissionGate>
<PermissionGate
permissions={permissions}
required={['tasks.write', 'tasks.delete']}
fallback={<span>Read-only access</span>}
>
<EditControls />
</PermissionGate>
```
### Page-Level Access Control
```tsx {% title="apps/web/app/[locale]/home/[account]/admin/page.tsx" %}
import { redirect } from 'next/navigation';
import { loadTeamWorkspace } from '~/home/[account]/_lib/server/team-account-workspace.loader';
interface Props {
params: Promise<{ account: string }>;
}
export default async function AdminPage({ params }: Props) {
const { account } = await params;
const workspace = await loadTeamWorkspace(account);
const permissions = workspace.account.permissions;
if (!permissions.includes('settings.manage')) {
redirect('/home');
}
return <AdminDashboard />;
}
```
## Permission Naming Conventions
Use a consistent `resource.action` pattern:
| Pattern | Examples |
|---------|----------|
| `resource.read` | `tasks.read`, `reports.read` |
| `resource.write` | `tasks.write`, `settings.write` |
| `resource.delete` | `tasks.delete`, `members.delete` |
| `resource.manage` | `billing.manage`, `roles.manage` |
The `.manage` suffix typically implies all actions on that resource.
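If you also want `.manage` to imply the finer-grained actions in application code, a small helper can normalize the check. This is a hypothetical convention, not something the kit ships — in SQL you still grant each permission explicitly:

```typescript
// Hypothetical check that treats `resource.manage` as implying
// `resource.read`, `resource.write`, and `resource.delete`.
function hasPermission(granted: string[], required: string): boolean {
  if (granted.includes(required)) {
    return true;
  }

  const [resource] = required.split('.');

  return granted.includes(`${resource}.manage`);
}

hasPermission(['tasks.manage'], 'tasks.delete'); // true
hasPermission(['tasks.read'], 'tasks.delete'); // false
```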
## Testing Permissions
Test RLS policies with pgTAP:
```sql {% title="apps/web/supabase/tests/tasks-permissions.test.sql" %}
begin;
select plan(2);
-- Create test user and account
select tests.create_supabase_user('test-user');
select tests.authenticate_as('test-user');
-- Get the user's personal account
select set_config('test.account_id',
(select id::text from accounts where primary_owner_user_id = tests.get_supabase_uid('test-user')),
true
);
-- Test: User with tasks.write can insert
select lives_ok(
$$
insert into tasks (account_id, title)
values (current_setting('test.account_id')::uuid, 'Test Task')
$$,
'User with tasks.write permission can create tasks'
);
-- Test: User without tasks.delete cannot delete.
-- RLS does not raise an error on delete: the policy silently filters
-- out rows, so assert that zero rows are affected instead.
select results_eq(
  $$
  with deleted as (
    delete from tasks
    where account_id = current_setting('test.account_id')::uuid
    returning 1
  )
  select count(*)::int from deleted
  $$,
  $$ values (0) $$,
  'User without tasks.delete permission cannot delete tasks'
);
select * from finish();
rollback;
```
See [Database Tests](/docs/next-supabase-turbo/development/database-tests) for more testing patterns.
## Related Resources
- [Database Functions](/docs/next-supabase-turbo/development/database-functions) for the `has_permission` function
- [Database Schema](/docs/next-supabase-turbo/development/database-schema) for creating tables with RLS
- [Database Tests](/docs/next-supabase-turbo/development/database-tests) for testing permissions
- [Row Level Security](/docs/next-supabase-turbo/security/row-level-security) for RLS patterns

docs/development/seo.mdoc
---
status: "published"
label: "SEO"
title: "SEO Configuration for the Next.js Supabase Starter Kit"
description: "Configure sitemaps, metadata, structured data, and search engine optimization for your Makerkit SaaS application."
order: 10
---
SEO in Makerkit starts with the Next.js Metadata API for page-level optimization, an auto-generated sitemap at `/sitemap.xml`, and a proper robots.txt configuration. The kit handles technical SEO out of the box, so you can focus on content quality and backlink strategy.
{% sequence title="SEO Configuration" description="Set up search engine optimization for your SaaS" %}
[Configure page metadata](#page-metadata)
[Customize the sitemap](#sitemap-configuration)
[Add structured data](#structured-data)
[Submit to Google Search Console](#google-search-console)
{% /sequence %}
## Page Metadata
### Next.js Metadata API
Use the Next.js Metadata API to set page-level SEO:
```tsx {% title="apps/web/app/[locale]/(marketing)/pricing/page.tsx" %}
import type { Metadata } from 'next';
export const metadata: Metadata = {
title: 'Pricing | Your SaaS Name',
description: 'Simple, transparent pricing. Start free, upgrade when you need more.',
openGraph: {
title: 'Pricing | Your SaaS Name',
description: 'Simple, transparent pricing for teams of all sizes.',
images: ['/images/og/pricing.png'],
type: 'website',
},
twitter: {
card: 'summary_large_image',
title: 'Pricing | Your SaaS Name',
description: 'Simple, transparent pricing for teams of all sizes.',
images: ['/images/og/pricing.png'],
},
};
export default function PricingPage() {
// ...
}
```
### Dynamic Metadata
For pages with dynamic content, use `generateMetadata`:
```tsx {% title="apps/web/app/[locale]/(marketing)/blog/[slug]/page.tsx" %}
import type { Metadata } from 'next';
import { createCmsClient } from '@kit/cms';
interface Props {
params: Promise<{ slug: string }>;
}
export async function generateMetadata({ params }: Props): Promise<Metadata> {
const { slug } = await params;
const cms = await createCmsClient();
const post = await cms.getContentBySlug({ slug, collection: 'posts' });
return {
title: `${post.title} | Your SaaS Blog`,
description: post.description,
openGraph: {
title: post.title,
description: post.description,
images: [post.image],
type: 'article',
publishedTime: post.publishedAt,
},
};
}
```
### Global Metadata
Set default metadata in your root layout at `apps/web/app/layout.tsx`:
```tsx {% title="apps/web/app/layout.tsx" %}
import type { Metadata } from 'next';
import appConfig from '~/config/app.config';
export const metadata: Metadata = {
title: {
default: appConfig.name,
template: `%s | ${appConfig.name}`,
},
description: appConfig.description,
metadataBase: new URL(appConfig.url),
openGraph: {
type: 'website',
locale: 'en_US',
siteName: appConfig.name,
},
robots: {
index: true,
follow: true,
},
};
```
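With `title.template`, child pages only set their own title segment and Next.js substitutes it into `%s`, falling back to `title.default` when a page sets no title. Conceptually:

```typescript
// Conceptual sketch of how Next.js resolves the title template.
// (Next.js does this internally; this only illustrates the behavior.)
function resolveTitle(
  template: string,
  defaultTitle: string,
  pageTitle?: string,
): string {
  return pageTitle ? template.replace('%s', pageTitle) : defaultTitle;
}

resolveTitle('%s | Makerkit', 'Makerkit', 'Pricing'); // 'Pricing | Makerkit'
resolveTitle('%s | Makerkit', 'Makerkit'); // 'Makerkit'
```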
## Sitemap Configuration
Makerkit auto-generates a sitemap at `/sitemap.xml`. The configuration lives in `apps/web/app/sitemap.xml/route.ts`.
### Adding Static Pages
Add new pages to the `getPaths` function:
```tsx {% title="apps/web/app/sitemap.xml/route.ts" %}
import appConfig from '~/config/app.config';
function getPaths() {
const paths = [
'/',
'/pricing',
'/faq',
'/blog',
'/docs',
'/contact',
'/about', // Add new pages
'/features',
'/privacy-policy',
'/terms-of-service',
'/cookie-policy',
];
return paths.map((path) => ({
loc: new URL(path, appConfig.url).href,
lastmod: new Date().toISOString(),
}));
}
```
### Dynamic Content
Blog posts and documentation pages are automatically added to the sitemap. The CMS integration handles this:
```tsx
// Blog posts are added automatically
const posts = await cms.getContentItems({ collection: 'posts' });
posts.forEach((post) => {
sitemap.push({
loc: new URL(`/blog/${post.slug}`, appConfig.url).href,
lastmod: post.updatedAt || post.publishedAt,
});
});
```
### Excluding Pages
Exclude pages from the sitemap by not including them in `getPaths()`. For pages that should not be indexed at all, use the `robots` metadata:
```tsx
export const metadata: Metadata = {
robots: {
index: false,
follow: false,
},
};
```
## Structured Data
Add JSON-LD structured data for rich search results. See the [Next.js JSON-LD guide](https://nextjs.org/docs/app/guides/json-ld) for the recommended approach.
### Organization Schema
Add to your home page or layout:
```tsx {% title="apps/web/app/[locale]/(marketing)/page.tsx" %}
// JSON-LD structured data using a script tag
export default function HomePage() {
return (
<>
<script
type="application/ld+json"
dangerouslySetInnerHTML={{
__html: JSON.stringify({
'@context': 'https://schema.org',
'@type': 'Organization',
name: 'Your SaaS Name',
url: 'https://yoursaas.com',
logo: 'https://yoursaas.com/logo.png',
sameAs: [
'https://twitter.com/yoursaas',
'https://github.com/yoursaas',
],
}),
}}
/>
{/* Page content */}
</>
);
}
```
### Product Schema
Add to your pricing page:
```tsx {% title="apps/web/app/[locale]/(marketing)/pricing/page.tsx" %}
<script
type="application/ld+json"
dangerouslySetInnerHTML={{
__html: JSON.stringify({
'@context': 'https://schema.org',
'@type': 'SoftwareApplication',
name: 'Your SaaS Name',
applicationCategory: 'BusinessApplication',
offers: {
'@type': 'AggregateOffer',
lowPrice: '0',
highPrice: '99',
priceCurrency: 'USD',
offerCount: 3,
},
}),
}}
/>
```
### FAQ Schema
Use the Markdoc FAQ node for automatic FAQ schema:
```markdown
{% faq
title="Frequently Asked Questions"
items=[
{"question": "How do I get started?", "answer": "Sign up for a free account..."},
{"question": "Can I cancel anytime?", "answer": "Yes, you can cancel..."}
]
/%}
```
### Article Schema
Add to blog posts:
```tsx
<script
type="application/ld+json"
dangerouslySetInnerHTML={{
__html: JSON.stringify({
'@context': 'https://schema.org',
'@type': 'Article',
headline: post.title,
description: post.description,
image: post.image,
datePublished: post.publishedAt,
dateModified: post.updatedAt,
author: {
'@type': 'Person',
name: post.author,
},
}),
}}
/>
```
## Robots.txt
The robots.txt is generated dynamically at `apps/web/app/robots.ts`:
```typescript {% title="apps/web/app/robots.ts" %}
import type { MetadataRoute } from 'next';
export default function robots(): MetadataRoute.Robots {
return {
rules: {
userAgent: '*',
allow: '/',
disallow: ['/home/', '/admin/', '/api/'],
},
sitemap: 'https://yoursaas.com/sitemap.xml',
};
}
```
Update the sitemap URL to your production domain.
## Google Search Console
### Verification
1. Go to [Google Search Console](https://search.google.com/search-console)
2. Add your property (URL prefix method)
3. Choose verification method:
- **HTML tag**: Add to your root layout's metadata
- **HTML file**: Upload to `public/`
```tsx
// HTML tag verification
export const metadata: Metadata = {
verification: {
google: 'your-verification-code',
},
};
```
### Submit Sitemap
After verification:
1. Navigate to **Sitemaps** in Search Console
2. Enter `sitemap.xml` in the input field
3. Click **Submit**
Google will crawl and index your sitemap within a few days.
### Monitor Indexing
Check Search Console regularly for:
- **Coverage**: Pages indexed vs. excluded
- **Enhancements**: Structured data validation
- **Core Web Vitals**: Performance metrics
- **Mobile Usability**: Mobile-friendly issues
## SEO Best Practices
### Content Quality
Content quality matters more than technical SEO. Focus on:
- **Helpful content**: Solve problems your customers search for
- **Unique value**: Offer insights competitors don't have
- **Regular updates**: Keep content fresh and accurate
- **Comprehensive coverage**: Answer related questions
### Keyword Strategy
| Element | Recommendation |
|---------|----------------|
| Title | Primary keyword near the beginning |
| Description | Include keyword naturally, focus on click-through |
| H1 | One per page, include primary keyword |
| URL | Short, descriptive, include keyword |
| Content | Use variations naturally, don't stuff |
### Image Optimization
```tsx
import Image from 'next/image';
<Image
src="/images/feature-screenshot.webp"
alt="Dashboard showing project analytics with team activity"
width={1200}
height={630}
priority={isAboveFold}
/>
```
- Use WebP format for better compression
- Include descriptive alt text with keywords
- Use descriptive filenames (`project-dashboard.webp` not `img1.webp`)
- Size images appropriately for their display size
### Internal Linking
Link between related content:
```tsx
// In your blog post about authentication
<p>
Learn more about{' '}
<Link href="/docs/authentication/setup">
setting up authentication
</Link>{' '}
in our documentation.
</p>
```
### Page Speed
Makerkit is optimized for performance out of the box:
- Next.js automatic code splitting
- Image optimization with `next/image`
- Font optimization with `next/font`
- Static generation for marketing pages
Check your scores with [PageSpeed Insights](https://pagespeed.web.dev/).
## Backlinks
Backlinks remain the strongest ranking factor. Strategies that work:
| Strategy | Effort | Impact |
|----------|--------|--------|
| Create linkable content (guides, tools, research) | High | High |
| Guest posting on relevant blogs | Medium | Medium |
| Product directories (Product Hunt, etc.) | Low | Medium |
| Open source contributions | Medium | Medium |
| Podcast appearances | Medium | Medium |
Focus on quality over quantity. One link from a high-authority site beats dozens of low-quality links.
## Timeline Expectations
SEO takes time. Typical timelines:
| Milestone | Timeline |
|-----------|----------|
| Initial indexing | 1-2 weeks |
| Rankings for low-competition terms | 1-3 months |
| Rankings for medium-competition terms | 3-6 months |
| Rankings for high-competition terms | 6-12+ months |
Keep creating content and building backlinks. Results compound over time.
## Related Resources
- [Marketing Pages](/docs/next-supabase-turbo/development/marketing-pages) for building optimized landing pages
- [CMS Setup](/docs/next-supabase-turbo/content/cms) for content marketing
- [App Configuration](/docs/next-supabase-turbo/configuration/application-configuration) for base URL and metadata settings

---
status: "published"
label: "Writing data to Database"
order: 5
title: "Learn how to write data to the Supabase database in your Next.js app"
description: "In this page we learn how to write data to the Supabase database in your Next.js app"
---
In this page, we will learn how to write data to the Supabase database in your Next.js app.
{% sequence title="How to write data to the Supabase database" description="In this page we learn how to write data to the Supabase database in your Next.js app" %}
[Writing a Server Action to Add a Task](#writing-a-server-action-to-add-a-task)
[Defining a Schema for the Task](#defining-a-schema-for-the-task)
[Writing the Server Action to Add a Task](#writing-the-server-action-to-add-a-task)
[Creating a Form to Add a Task](#creating-a-form-to-add-a-task)
[Using a Dialog component to display the form](#using-a-dialog-component-to-display-the-form)
{% /sequence %}
## Writing a Server Action to Add a Task
Server Actions are defined by adding the `'use server'` directive at the top of a function or file. When a function is defined as a Server Action, it is always executed on the server.
This is useful for various reasons:
1. By using Server Actions, we can revalidate data fetched through Server Components
2. We can execute server-side code just by calling the function from the client side
In this example, we will write a Server Action to add a task to the database.
### Defining a Schema for the Task
We use Zod to validate the data that is passed to the Server Action. This ensures that the data is in the correct format before it is written to the database.
The convention in Makerkit is to define the schema in a separate file and import it where needed. We use the convention `file.schema.ts` to define the schema.
```tsx
import * as z from 'zod';
export const WriteTaskSchema = z.object({
title: z.string().min(1),
description: z.string().nullable(),
});
```
### Writing the Server Action to Add a Task
In this example, we write a Server Action to add a task to the database. We use the `revalidatePath` function to revalidate the `/home` page after the task is added.
```tsx
'use server';
import { revalidatePath } from 'next/cache';
import { getLogger } from '@kit/shared/logger';
import { getSupabaseServerClient } from '@kit/supabase/server-client';
import { authActionClient } from '@kit/next/safe-action';
import { WriteTaskSchema } from '~/home/(user)/_lib/schema/write-task.schema';
export const addTaskAction = authActionClient
.inputSchema(WriteTaskSchema)
.action(async ({ parsedInput: task, ctx: { user } }) => {
const logger = await getLogger();
const client = getSupabaseServerClient();
logger.info(task, `Adding task...`);
const { data, error } = await client
.from('tasks')
.insert({ ...task, account_id: user.id });
if (error) {
logger.error(error, `Failed to add task`);
throw new Error(`Failed to add task`);
}
logger.info(data, `Task added successfully`);
revalidatePath('/home', 'page');
});
```
Let's focus on this bit for a second:
```tsx
const { data, error } = await client
.from('tasks')
.insert({ ...task, account_id: user.id });
```
Do you see the `account_id` field? This is a foreign key that links the task to the user who created it. This is a common pattern in database design.
Now that we have written the Server Action to add a task, we can call this function from the client side. But we need a form, which we define in the next section.
### Creating a Form to Add a Task
We create a form to add a task. The form is a React component that accepts a `SubmitButton` prop and an `onSubmit` prop.
```tsx
import { zodResolver } from '@hookform/resolvers/zod';
import { useForm } from 'react-hook-form';
import * as z from 'zod';
import {
Form,
FormControl,
FormDescription,
FormField,
FormItem,
FormLabel,
FormMessage,
} from '@kit/ui/form';
import { Input } from '@kit/ui/input';
import { Textarea } from '@kit/ui/textarea';
import { Trans } from '@kit/ui/trans';
import { WriteTaskSchema } from '../_lib/schema/write-task.schema';
export function TaskForm(props: {
task?: z.infer<typeof WriteTaskSchema>;
onSubmit: (task: z.infer<typeof WriteTaskSchema>) => void;
SubmitButton: React.ComponentType;
}) {
const form = useForm({
resolver: zodResolver(WriteTaskSchema),
defaultValues: props.task,
});
return (
<Form {...form}>
<form
className={'flex flex-col space-y-4'}
onSubmit={form.handleSubmit(props.onSubmit)}
>
<FormField
render={(item) => {
return (
<FormItem>
<FormLabel>
<Trans i18nKey={'tasks:taskTitle'} />
</FormLabel>
<FormControl>
<Input required {...item.field} />
</FormControl>
<FormDescription>
<Trans i18nKey={'tasks:taskTitleDescription'} />
</FormDescription>
<FormMessage />
</FormItem>
);
}}
name={'title'}
/>
<FormField
render={(item) => {
return (
<FormItem>
<FormLabel>
<Trans i18nKey={'tasks:taskDescription'} />
</FormLabel>
<FormControl>
<Textarea {...item.field} />
</FormControl>
<FormDescription>
<Trans i18nKey={'tasks:taskDescriptionDescription'} />
</FormDescription>
<FormMessage />
</FormItem>
);
}}
name={'description'}
/>
<props.SubmitButton />
</form>
</Form>
);
}
```
### Using a Dialog component to display the form
We use the Dialog component from the `@kit/ui/dialog` package to display the form in a dialog. The dialog is opened when the user clicks on a button.
```tsx
'use client';
import { useState, useTransition } from 'react';
import { PlusCircle } from 'lucide-react';
import { Button } from '@kit/ui/button';
import {
Dialog,
DialogContent,
DialogDescription,
DialogHeader,
DialogTitle,
DialogTrigger,
} from '@kit/ui/dialog';
import { Trans } from '@kit/ui/trans';
import { TaskForm } from '../_components/task-form';
import { addTaskAction } from '../_lib/server/server-actions';
export function NewTaskDialog() {
const [pending, startTransition] = useTransition();
const [isOpen, setIsOpen] = useState(false);
return (
<Dialog open={isOpen} onOpenChange={setIsOpen}>
<DialogTrigger asChild>
<Button>
<PlusCircle className={'mr-1 h-4'} />
<span>
<Trans i18nKey={'tasks:addNewTask'} />
</span>
</Button>
</DialogTrigger>
<DialogContent>
<DialogHeader>
<DialogTitle>
<Trans i18nKey={'tasks:addNewTask'} />
</DialogTitle>
<DialogDescription>
<Trans i18nKey={'tasks:addNewTaskDescription'} />
</DialogDescription>
</DialogHeader>
<TaskForm
SubmitButton={() => (
<Button>
{pending ? (
<Trans i18nKey={'tasks:addingTask'} />
) : (
<Trans i18nKey={'tasks:addTask'} />
)}
</Button>
)}
onSubmit={(data) => {
startTransition(async () => {
await addTaskAction(data);
setIsOpen(false);
});
}}
/>
</DialogContent>
</Dialog>
);
}
```
We can now import `NewTaskDialog` in the `/home` page and display the dialog when the user clicks on a button.
Let's go back to the home page and add the component right next to the filter input:
```tsx {18}
<div className={'flex items-center justify-between'}>
<div>
<Heading level={4}>
<Trans i18nKey={'tasks:tasksTabLabel'} defaults={'Tasks'} />
</Heading>
</div>
<div className={'flex items-center space-x-2'}>
<form className={'w-full'}>
<Input
name={'query'}
defaultValue={query}
className={'w-full lg:w-[18rem]'}
placeholder={'Search tasks'}
/>
</form>
<NewTaskDialog />
</div>
</div>
```