MCP/Rules Improvements + MCP Prompts (#357)

- Use ESM for building the MCP Server
- Added a dedicated Postgres dependency to the MCP Server for querying tables and other entities
- Vastly improved AI Agent rules
- Added MCP Prompts for reviewing code and planning features
- Minor refactoring

Committed via GitHub · parent f85035bd01 · commit 9712e2354b

AGENTS.md (118 changed lines)
@@ -1,4 +1,4 @@
This file provides guidance to AI Agents when working with code in this repository.

## Core Technologies

@@ -12,6 +12,7 @@ This file provides guidance to AI Agents when working with code in this repository.

## Monorepo Structure

- `apps/web` - Main Next.js SaaS application
- `apps/web/supabase` - Supabase folder (migrations, schemas, tests)
- `apps/e2e` - Playwright end-to-end tests
- `packages/features/*` - Feature packages
- `packages/` - Shared packages and utilities

@@ -37,11 +38,30 @@ pnpm --filter web dev # Main app (port 3000)

```bash
pnpm supabase:web:start                  # Start Supabase locally
pnpm --filter web supabase migration up  # Apply new migrations
pnpm supabase:web:reset                  # Reset with latest schema (clean rebuild)
pnpm supabase:web:typegen                # Generate TypeScript types
pnpm --filter web supabase:db:diff       # Create migration
```

The typegen command must be run after applying migrations or resetting the database.

## Database Workflow - CRITICAL SEQUENCE ⚠️

When adding new database features, ALWAYS follow this exact order:

1. **Create/modify schema file** in `apps/web/supabase/schemas/XX-feature.sql`
2. **Generate migration**: `pnpm --filter web supabase:db:diff -f <migration_name>`
3. **Apply changes**: `pnpm --filter web supabase migration up` (or `pnpm supabase:web:reset` for a clean rebuild)
4. **Generate types**: `pnpm supabase:web:typegen`
5. **Verify types exist** before using them in code

⚠️ **NEVER skip step 2** - schema files alone don't create tables! The migration step is required to apply changes to the database.

**Migration vs Reset**:

- Use `migration up` for normal development (applies only new migrations)
- Use `reset` when you need a clean database state or have schema conflicts

### Code Quality

@@ -50,7 +70,7 @@
```bash
pnpm lint:fix
pnpm typecheck
```

- Run the typecheck command regularly to ensure your code is type-safe.
- Run the linter and the formatter when your task is complete.

## TypeScript

@@ -61,12 +81,98 @@ pnpm typecheck
- Always use implicit type inference, unless impossible
- You must avoid using `any`
- Handle errors gracefully using try/catch and appropriate error types
- Use the service pattern for server-side APIs
- Add `server-only` to code that is exclusively server-side
- Never mix client and server imports from a file or a package
- Extract self-contained classes/utilities (e.g. algorithmic code) from classes that cross the network boundary
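The service-pattern and error-handling bullets above can be sketched as a self-contained example. Everything here is illustrative, not an actual repo API: `BillingService`, `createBillingService`, and `ServiceError` are hypothetical names, and the network call is stubbed out.

```typescript
// Hypothetical error type: wrap low-level failures in a typed, descriptive error.
class ServiceError extends Error {
  constructor(
    message: string,
    readonly cause?: unknown,
  ) {
    super(message);
    this.name = 'ServiceError';
  }
}

// A service class: callers depend on the factory below, not the class itself.
class BillingService {
  // Pure, algorithmic logic is kept in small methods that never cross the
  // network boundary, so it is trivial to unit-test in isolation.
  calculateTotal(lineItems: Array<{ price: number; quantity: number }>) {
    return lineItems.reduce((sum, item) => sum + item.price * item.quantity, 0);
  }

  async chargeCustomer(customerId: string, amount: number) {
    try {
      // a real implementation would make a network call here
      return { customerId, amount, status: 'charged' as const };
    } catch (error) {
      throw new ServiceError(`Failed to charge customer ${customerId}`, error);
    }
  }
}

// The service pattern: expose a factory, keep the class an implementation detail.
export function createBillingService() {
  return new BillingService();
}
```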
## React

- Use functional components
- Encapsulate repeated blocks of code into reusable local components
- Write small, composable, explicit, well-named components
- Always use `react-hook-form` and `@kit/ui/form` for writing forms
- Always use the 'use client' directive for client components
- Add `data-test` attributes for E2E tests where appropriate
- `useEffect` is a code smell and must be justified - avoid it if possible
- Do not write many (such as 4-5) separate `useState` calls; prefer a single state object (unless required)
- Prefer server-side data fetching using RSC
- Display loading indicators (e.g. with the `LoadingSpinner` component) where appropriate
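The single-state-object preference can be illustrated without React: group related fields into one object and patch it immutably, rather than tracking each field in its own `useState`. The `FilterState` shape is a hypothetical example.

```typescript
// Hypothetical example: one state object instead of 4-5 separate useState calls.
interface FilterState {
  query: string;
  page: number;
  sortBy: 'name' | 'date';
  showArchived: boolean;
}

const initialFilters: FilterState = {
  query: '',
  page: 1,
  sortBy: 'name',
  showArchived: false,
};

// With useState<FilterState>, a component would update the single object as:
//   setFilters((prev) => updateFilters(prev, { query: 'abc', page: 1 }));
function updateFilters(prev: FilterState, patch: Partial<FilterState>): FilterState {
  // Spread keeps untouched fields and replaces only the patched ones.
  return { ...prev, ...patch };
}
```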
## Next.js

- Use `enhanceAction` for Server Actions
- Use `enhanceRouteHandler` for API Routes
- Export page components using the `withI18n` utility
- Add well-written page metadata to pages
- Redirect using `redirect` following a server action instead of using the client-side router
- Since `redirect` throws an error, handle the `catch` block using `isRedirectError` from `next/dist/client/components/redirect-error`
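The two redirect bullets can be sketched self-contained. The `redirect` and `isRedirectError` below are local stand-ins that mimic the Next.js helpers (which signal redirects via a digest on the thrown error), so the control flow runs without a Next.js runtime; in app code you would import the real helpers instead.

```typescript
// Stand-ins for Next.js internals, so the pattern is self-contained.
const REDIRECT_DIGEST = 'NEXT_REDIRECT';

function redirect(path: string): never {
  const error = new Error(`redirect to ${path}`) as Error & { digest: string };
  error.digest = REDIRECT_DIGEST;
  throw error;
}

function isRedirectError(error: unknown): boolean {
  return (
    error instanceof Error &&
    (error as { digest?: string }).digest === REDIRECT_DIGEST
  );
}

// The pattern: rethrow redirects so the framework can process them,
// and handle only genuine failures yourself.
async function submitAction(): Promise<string> {
  try {
    // ... perform the mutation, then:
    redirect('/home');
  } catch (error) {
    if (isRedirectError(error)) {
      throw error; // let the framework perform the redirect
    }

    return 'Something went wrong'; // genuine failure path
  }
}
```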
## UI Components

- UI Components are placed at `packages/ui`. Call the MCP tool to list components to verify they exist.

## Form Architecture

Always organize schemas for reusability between server actions and client forms:

```
_lib/
├── schemas/
│   └── feature.schema.ts    # Shared Zod schemas
├── server/
│   └── server-actions.ts    # Server actions import schemas
└── client/
    └── forms.tsx            # Forms import same schemas
```

**Example implementation:**

```typescript
// _lib/schemas/project.schema.ts
export const CreateProjectSchema = z.object({
  name: z.string().min(1).max(255),
  description: z.string().optional(),
});

// _lib/server/project.mutations.ts
import { CreateProjectSchema } from '../schemas/project.schema';

export const createProjectAction = enhanceAction(
  async (data) => { /* implementation */ },
  { schema: CreateProjectSchema }
);

// _components/create-project-form.tsx
import { CreateProjectSchema } from '../_lib/schemas/project.schema';

const form = useForm({
  resolver: zodResolver(CreateProjectSchema)
});
```

## Import Guidelines - ALWAYS Check These

**UI Components**: Always check `@kit/ui` first before external packages:

- Toast notifications: `import { toast } from '@kit/ui/sonner'`
- Forms: `import { Form, FormField, ... } from '@kit/ui/form'`
- All UI components: Use the MCP tool to verify: `mcp__makerkit__get_components`

**React Hook Form Pattern**:

```typescript
// ❌ WRONG - Redundant generic with resolver
const form = useForm<FormData>({
  resolver: zodResolver(Schema)
});

// ✅ CORRECT - Type inference from resolver
const form = useForm({
  resolver: zodResolver(Schema)
});
```

## Verification Steps

After implementation:

1. **Run `pnpm typecheck`** - Must pass without errors
2. **Run `pnpm lint:fix`** - Auto-fix issues
3. **Run `pnpm format:fix`** - Format code
CLAUDE.md (114 changed lines)
@@ -12,6 +12,7 @@ This file provides guidance to Claude Code when working with code in this repository.

## Monorepo Structure

- `apps/web` - Main Next.js SaaS application
- `apps/web/supabase` - Supabase folder (migrations, schemas, tests)
- `apps/e2e` - Playwright end-to-end tests
- `packages/features/*` - Feature packages
- `packages/` - Shared packages and utilities

@@ -37,11 +38,30 @@ pnpm --filter web dev # Main app (port 3000)

```bash
pnpm supabase:web:start                  # Start Supabase locally
pnpm --filter web supabase migration up  # Apply new migrations
pnpm supabase:web:reset                  # Reset with latest schema (clean rebuild)
pnpm supabase:web:typegen                # Generate TypeScript types
pnpm --filter web supabase:db:diff       # Create migration
```

The typegen command must be run after applying migrations or resetting the database.

## Database Workflow - CRITICAL SEQUENCE ⚠️

When adding new database features, ALWAYS follow this exact order:

1. **Create/modify schema file** in `apps/web/supabase/schemas/XX-feature.sql`
2. **Generate migration**: `pnpm --filter web supabase:db:diff -f <migration_name>`
3. **Apply changes**: `pnpm --filter web supabase migration up` (or `pnpm supabase:web:reset` for a clean rebuild)
4. **Generate types**: `pnpm supabase:web:typegen`
5. **Verify types exist** before using them in code

⚠️ **NEVER skip step 2** - schema files alone don't create tables! The migration step is required to apply changes to the database.

**Migration vs Reset**:

- Use `migration up` for normal development (applies only new migrations)
- Use `reset` when you need a clean database state or have schema conflicts

### Code Quality

```bash
pnpm lint:fix
pnpm typecheck
```

## TypeScript

@@ -61,12 +81,98 @@ pnpm typecheck
- Always use implicit type inference, unless impossible
- You must avoid using `any`
- Handle errors gracefully using try/catch and appropriate error types
- Use the service pattern for server-side APIs
- Add `server-only` to code that is exclusively server-side
- Never mix client and server imports from a file or a package
- Extract self-contained classes/utilities (e.g. algorithmic code) from classes that cross the network boundary

## React

- Use functional components
- Encapsulate repeated blocks of code into reusable local components
- Write small, composable, explicit, well-named components
- Always use `react-hook-form` and `@kit/ui/form` for writing forms
- Always use the 'use client' directive for client components
- Add `data-test` attributes for E2E tests where appropriate
- `useEffect` is a code smell and must be justified - avoid it if possible
- Do not write many (such as 4-5) separate `useState` calls; prefer a single state object (unless required)
- Prefer server-side data fetching using RSC
- Display loading indicators (e.g. with the `LoadingSpinner` component) where appropriate

## Next.js

- Use `enhanceAction` for Server Actions
- Use `enhanceRouteHandler` for API Routes
- Export page components using the `withI18n` utility
- Add well-written page metadata to pages
- Redirect using `redirect` following a server action instead of using the client-side router
- Since `redirect` throws an error, handle the `catch` block using `isRedirectError` from `next/dist/client/components/redirect-error`

## UI Components

- UI Components are placed at `packages/ui`. Call the MCP tool to list components to verify they exist.

## Form Architecture

Always organize schemas for reusability between server actions and client forms:

```
_lib/
├── schemas/
│   └── feature.schema.ts    # Shared Zod schemas
├── server/
│   └── server-actions.ts    # Server actions import schemas
└── client/
    └── forms.tsx            # Forms import same schemas
```

**Example implementation:**

```typescript
// _lib/schemas/project.schema.ts
export const CreateProjectSchema = z.object({
  name: z.string().min(1).max(255),
  description: z.string().optional(),
});

// _lib/server/project.mutations.ts
import { CreateProjectSchema } from '../schemas/project.schema';

export const createProjectAction = enhanceAction(
  async (data) => { /* implementation */ },
  { schema: CreateProjectSchema }
);

// _components/create-project-form.tsx
import { CreateProjectSchema } from '../_lib/schemas/project.schema';

const form = useForm({
  resolver: zodResolver(CreateProjectSchema)
});
```

## Import Guidelines - ALWAYS Check These

**UI Components**: Always check `@kit/ui` first before external packages:

- Toast notifications: `import { toast } from '@kit/ui/sonner'`
- Forms: `import { Form, FormField, ... } from '@kit/ui/form'`
- All UI components: Use the MCP tool to verify: `mcp__makerkit__get_components`

**React Hook Form Pattern**:

```typescript
// ❌ WRONG - Redundant generic with resolver
const form = useForm<FormData>({
  resolver: zodResolver(Schema)
});

// ✅ CORRECT - Type inference from resolver
const form = useForm({
  resolver: zodResolver(Schema)
});
```

## Verification Steps

After implementation:

1. **Run `pnpm typecheck`** - Must pass without errors
2. **Run `pnpm lint:fix`** - Auto-fix issues
3. **Run `pnpm format:fix`** - Format code
@@ -36,6 +36,29 @@ Example:
- Team server utils: `app/home/[account]/_lib/server/`
- Marketing components: `app/(marketing)/_components/`

The `[account]` parameter is the `accounts.slug` property, not the ID.
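A minimal sketch of how the `[account]` segment maps to the slug. This `createPath` is a simplified local stand-in, not the kit's actual helper; it only shows the substitution the routing relies on.

```typescript
// Simplified stand-in: substitute the [account] segment with the account slug.
function createPath(template: string, accountSlug: string): string {
  return template.replace('[account]', accountSlug);
}

const path = createPath('/home/[account]/settings', 'acme-inc');
// → '/home/acme-inc/settings' (the slug, never the account ID)
```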
## React Server Components - Async Pattern

**CRITICAL**: In Next.js 15, always await `params` directly in async server components:

```typescript
// ❌ WRONG - Don't use React.use() in async functions
async function Page({ params }: Props) {
  const { account } = use(params);
}

// ✅ CORRECT - await params directly in Next.js 15
async function Page({ params }: Props) {
  const { account } = await params; // ✅ Server component pattern
}

// ✅ CORRECT - "use" in non-async functions in Next.js 15
function Page({ params }: Props) {
  const { account } = use(params); // ✅ Server component pattern
}
```

## Data Fetching Strategy

**Quick Decision Framework:**

@@ -182,7 +205,10 @@ import { Trans } from '@kit/ui/trans';
2. Create translation files in `public/locales/[new-language]/`
3. Copy structure from English files

### Adding new namespaces

1. Translation files: `public/locales/<locale>/<namespace>.json`
2. Add the namespace to `defaultI18nNamespaces` in `apps/web/lib/i18n/i18n.settings.ts`
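Step 2 can be sketched as follows. This is a hedged model of `apps/web/lib/i18n/i18n.settings.ts`: the existing namespace names and the `getI18nSettings` signature are illustrative assumptions, so check the real file before editing.

```typescript
// Sketch of i18n.settings.ts; the pre-existing namespace names are illustrative.
const defaultI18nNamespaces = [
  'common',
  'auth',
  'account',
  // Add your new namespace so public/locales/<locale>/billing.json is loaded:
  'billing',
];

// Illustrative helper: settings carry the language plus the namespaces to load.
export function getI18nSettings(
  language: string,
  namespaces: string[] = defaultI18nNamespaces,
) {
  return { language, namespaces };
}
```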
## Workspace Contexts 🏢

@@ -238,6 +264,55 @@ export const POST = enhanceRouteHandler(

```typescript
);
```

## Navigation Menu Configuration 🗺️

### Adding Sidebar Menu Items

**Config Files:**

- Personal: `config/personal-account-navigation.config.tsx`
- Team: `config/team-account-navigation.config.tsx`

**Add to Personal Navigation:**

```typescript
{
  label: 'common:routes.yourFeature',
  path: pathsConfig.app.yourFeaturePath,
  Icon: <YourIcon className="w-4" />,
  end: true,
},
```

**Add to Team Navigation:**

```typescript
{
  label: 'common:routes.yourTeamFeature',
  path: createPath(pathsConfig.app.yourTeamFeaturePath, account),
  Icon: <YourIcon className="w-4" />,
},
```

**Add Paths:**

```typescript
// config/paths.config.ts
app: {
  yourFeaturePath: '/home/your-feature',
  yourTeamFeaturePath: '/home/[account]/your-feature',
}
```

**Add Translations:**

```json
// public/locales/en/common.json
"routes": {
  "yourFeature": "Your Feature"
}
```

## Security Guidelines 🛡️

### Authentication & Authorization

@@ -252,13 +327,3 @@ export const POST = enhanceRouteHandler(
- **Never pass sensitive data** to Client Components
- **Never expose server environment variables** to the client (unless prefixed with NEXT_PUBLIC)
- Always validate user input

### Super Admin Protection

For admin routes, use `AdminGuard` from `@packages/features/admin/src/components/admin-guard.tsx`:

```tsx
import { AdminGuard } from '@kit/admin/components/admin-guard';

export default AdminGuard(AdminPageComponent);
```
@@ -36,6 +36,29 @@ Example:
- Team server utils: `app/home/[account]/_lib/server/`
- Marketing components: `app/(marketing)/_components/`

The `[account]` parameter is the `accounts.slug` property, not the ID.

## React Server Components - Async Pattern

**CRITICAL**: In Next.js 15, always await `params` directly in async server components:

```typescript
// ❌ WRONG - Don't use React.use() in async functions
async function Page({ params }: Props) {
  const { account } = use(params);
}

// ✅ CORRECT - await params directly in Next.js 15
async function Page({ params }: Props) {
  const { account } = await params; // ✅ Server component pattern
}

// ✅ CORRECT - "use" in non-async functions in Next.js 15
function Page({ params }: Props) {
  const { account } = use(params); // ✅ Server component pattern
}
```

## Data Fetching Strategy

**Quick Decision Framework:**

@@ -182,7 +205,10 @@ import { Trans } from '@kit/ui/trans';
2. Create translation files in `public/locales/[new-language]/`
3. Copy structure from English files

### Adding new namespaces

1. Translation files: `public/locales/<locale>/<namespace>.json`
2. Add the namespace to `defaultI18nNamespaces` in `apps/web/lib/i18n/i18n.settings.ts`

## Workspace Contexts 🏢

@@ -238,6 +264,55 @@ export const POST = enhanceRouteHandler(

```typescript
);
```

## Navigation Menu Configuration 🗺️

### Adding Sidebar Menu Items

**Config Files:**

- Personal: `config/personal-account-navigation.config.tsx`
- Team: `config/team-account-navigation.config.tsx`

**Add to Personal Navigation:**

```typescript
{
  label: 'common:routes.yourFeature',
  path: pathsConfig.app.yourFeaturePath,
  Icon: <YourIcon className="w-4" />,
  end: true,
},
```

**Add to Team Navigation:**

```typescript
{
  label: 'common:routes.yourTeamFeature',
  path: createPath(pathsConfig.app.yourTeamFeaturePath, account),
  Icon: <YourIcon className="w-4" />,
},
```

**Add Paths:**

```typescript
// config/paths.config.ts
app: {
  yourFeaturePath: '/home/your-feature',
  yourTeamFeaturePath: '/home/[account]/your-feature',
}
```

**Add Translations:**

```json
// public/locales/en/common.json
"routes": {
  "yourFeature": "Your Feature"
}
```

## Security Guidelines 🛡️

### Authentication & Authorization

@@ -252,13 +327,3 @@ export const POST = enhanceRouteHandler(
- **Never pass sensitive data** to Client Components
- **Never expose server environment variables** to the client (unless prefixed with NEXT_PUBLIC)
- Always validate user input

### Super Admin Protection

For admin routes, use `AdminGuard` from `@packages/features/admin/src/components/admin-guard.tsx`:

```tsx
import { AdminGuard } from '@kit/admin/components/admin-guard';

export default AdminGuard(AdminPageComponent);
```
apps/web/app/admin/AGENTS.md (new file, 119 lines)
@@ -0,0 +1,119 @@
# Super Admin

This file provides specific guidance for AI agents working in the super admin section of the application.

## Core Admin Principles

### Security-First Development

- **ALWAYS** use `AdminGuard` to protect admin pages
- **NEVER** bypass authentication or authorization checks
- **CRITICAL**: Use the admin Supabase client with manual authorization validation
- Validate permissions for every admin operation

### Admin Client Usage Pattern

```typescript
import { isSuperAdmin } from '@kit/admin';
import { getSupabaseServerAdminClient } from '@kit/supabase/server-admin-client';

async function adminOperation() {
  const adminClient = getSupabaseServerAdminClient();

  // CRITICAL: Always validate admin status first
  const currentUser = await getCurrentUser();

  if (!(await isSuperAdmin(currentUser))) {
    throw new Error('Unauthorized: Admin access required');
  }

  // Now safe to proceed with admin privileges
  const { data } = await adminClient.from('accounts').select('*');

  return data;
}
```

## Page Structure Patterns

### Standard Admin Page Template

```typescript
import { AdminGuard } from '@kit/admin/components/admin-guard';
import { PageBody, PageHeader } from '@kit/ui/page';
import { AppBreadcrumbs } from '@kit/ui/app-breadcrumbs';

async function AdminPageComponent() {
  return (
    <>
      <PageHeader description={<AppBreadcrumbs />}>
        {/* Page actions go here */}
      </PageHeader>

      <PageBody>
        {/* Main content */}
      </PageBody>
    </>
  );
}

// ALWAYS wrap with AdminGuard
export default AdminGuard(AdminPageComponent);
```

### Async Server Component Pattern

```typescript
// ✅ CORRECT - Next.js 15 pattern
async function AdminPage({ params }: { params: Promise<{ id: string }> }) {
  const { id } = await params; // ✅ await params directly

  // Fetch admin data
  const data = await loadAdminData(id);

  return <AdminContent data={data} />;
}
```

## Security Guidelines

### Critical Security Rules

1. **NEVER** expose admin functionality to non-admin users
2. **ALWAYS** validate admin status before operations
3. **NEVER** trust client-side admin checks alone
4. **ALWAYS** use server-side validation for admin actions
5. **NEVER** log sensitive admin data
6. **ALWAYS** audit admin operations

### Admin Action Auditing

```typescript
// `currentUser` is assumed to be resolved earlier in the request scope.
async function auditedAdminAction(action: string, data: { id: string }) {
  const logger = await getLogger();

  await logger.info(
    {
      name: 'admin-audit',
      action,
      adminId: currentUser.id,
      timestamp: new Date().toISOString(),
      data: {
        // Log only non-sensitive fields
        operation: action,
        targetId: data.id,
      },
    },
    'Admin action performed',
  );
}
```

## Common Patterns to Follow

1. **Always wrap admin pages with `AdminGuard`**
2. **Use the admin client only when an RLS bypass is required**
3. **Implement proper error boundaries for admin components**
4. **Add comprehensive logging for admin operations**
5. **Use TypeScript strictly for admin interfaces**
6. **Follow the established admin component naming conventions**
7. **Implement proper loading states for admin operations**
8. **Add proper metadata to admin pages**
apps/web/app/admin/CLAUDE.md (new file, 119 lines)
@@ -0,0 +1,119 @@
# Super Admin

This file provides specific guidance for AI agents working in the super admin section of the application.

## Core Admin Principles

### Security-First Development

- **ALWAYS** use `AdminGuard` to protect admin pages
- **NEVER** bypass authentication or authorization checks
- **CRITICAL**: Use the admin Supabase client with manual authorization validation
- Validate permissions for every admin operation

### Admin Client Usage Pattern

```typescript
import { isSuperAdmin } from '@kit/admin';
import { getSupabaseServerAdminClient } from '@kit/supabase/server-admin-client';

async function adminOperation() {
  const adminClient = getSupabaseServerAdminClient();

  // CRITICAL: Always validate admin status first
  const currentUser = await getCurrentUser();

  if (!(await isSuperAdmin(currentUser))) {
    throw new Error('Unauthorized: Admin access required');
  }

  // Now safe to proceed with admin privileges
  const { data } = await adminClient.from('accounts').select('*');

  return data;
}
```

## Page Structure Patterns

### Standard Admin Page Template

```typescript
import { AdminGuard } from '@kit/admin/components/admin-guard';
import { PageBody, PageHeader } from '@kit/ui/page';
import { AppBreadcrumbs } from '@kit/ui/app-breadcrumbs';

async function AdminPageComponent() {
  return (
    <>
      <PageHeader description={<AppBreadcrumbs />}>
        {/* Page actions go here */}
      </PageHeader>

      <PageBody>
        {/* Main content */}
      </PageBody>
    </>
  );
}

// ALWAYS wrap with AdminGuard
export default AdminGuard(AdminPageComponent);
```

### Async Server Component Pattern

```typescript
// ✅ CORRECT - Next.js 15 pattern
async function AdminPage({ params }: { params: Promise<{ id: string }> }) {
  const { id } = await params; // ✅ await params directly

  // Fetch admin data
  const data = await loadAdminData(id);

  return <AdminContent data={data} />;
}
```

## Security Guidelines

### Critical Security Rules

1. **NEVER** expose admin functionality to non-admin users
2. **ALWAYS** validate admin status before operations
3. **NEVER** trust client-side admin checks alone
4. **ALWAYS** use server-side validation for admin actions
5. **NEVER** log sensitive admin data
6. **ALWAYS** audit admin operations

### Admin Action Auditing

```typescript
// `currentUser` is assumed to be resolved earlier in the request scope.
async function auditedAdminAction(action: string, data: { id: string }) {
  const logger = await getLogger();

  await logger.info(
    {
      name: 'admin-audit',
      action,
      adminId: currentUser.id,
      timestamp: new Date().toISOString(),
      data: {
        // Log only non-sensitive fields
        operation: action,
        targetId: data.id,
      },
    },
    'Admin action performed',
  );
}
```

## Common Patterns to Follow

1. **Always wrap admin pages with `AdminGuard`**
2. **Use the admin client only when an RLS bypass is required**
3. **Implement proper error boundaries for admin components**
4. **Add comprehensive logging for admin operations**
5. **Use TypeScript strictly for admin interfaces**
6. **Follow the established admin component naming conventions**
7. **Implement proper loading states for admin operations**
8. **Add proper metadata to admin pages**
@@ -6,13 +6,15 @@ This file contains guidance for working with database schemas, migrations, and S

Schemas are organized in numbered files in the `schemas/` directory. Numbers are used to sort dependencies.
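For illustration, a numbered `schemas/` directory might look like this; the filenames below are hypothetical, and only the numeric prefixes (which make the dependency order explicit) reflect the convention described above:

```
schemas/
├── 01-extensions.sql
├── 02-accounts.sql
├── 03-roles.sql
└── 15-my-new-feature.sql
```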
## Schema Development Workflow

Migrations are generated from schemas. If creating a new schema, the migration can be created using the exact same content.

If modifying an existing schema, use the `diff` command:

### 1. Creating New Schema Files

```bash
# Create new schema file
touch apps/web/supabase/schemas/15-my-new-feature.sql

# Apply changes and create migration
pnpm --filter web run supabase:db:diff -f my-new-feature
@@ -24,6 +26,8 @@ pnpm supabase:web:reset
pnpm supabase:web:typegen
```

Verify the diff command generated the same content as the schema; if not, take steps to fix the migration.
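One way to spot-check that a generated migration matches its schema file is a plain textual `diff`. The paths below are illustrative stand-ins (a real migration filename is timestamped under `supabase/migrations/`), so this only sketches the comparison step:

```shell
# Two illustrative SQL files standing in for a schema file and its
# generated migration.
printf 'create table notes (id bigint primary key);\n' > /tmp/schema.sql
printf 'create table notes (id bigint primary key);\n' > /tmp/migration.sql

# diff exits 0 when the files match; a non-empty diff means the migration
# drifted from the schema and needs fixing.
if diff -u /tmp/schema.sql /tmp/migration.sql > /dev/null; then
  echo "migration matches schema"
else
  echo "migration drifted from schema"
fi
```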
### 2. Modifying Existing Schemas

```bash
@@ -35,6 +39,8 @@ pnpm --filter web run supabase:db:diff -f update-accounts

# Apply and test
pnpm supabase:web:reset

# After resetting
pnpm supabase:web:typegen
```

@@ -223,47 +229,6 @@ pnpm supabase:web:reset

```bash
pnpm run supabase:web:test
```

## Type Generation

### After Schema Changes

```bash
# Generate types after any schema changes
pnpm supabase:web:typegen
# Types are generated to src/lib/supabase/database.types.ts

# Reset DB
pnpm supabase:web:reset
```

### Using Generated Types

```typescript
import { Enums, Tables, TablesInsert, TablesUpdate } from '@kit/supabase/database';

// Table (row) types
type Account = Tables<'accounts'>;
type Note = Tables<'notes'>;

// Enum types
type AppPermission = Enums<'app_permissions'>;

// Insert/Update types use the dedicated helpers
// (Tables<'accounts'> is already the Row type, so it has no 'Insert' key)
type AccountInsert = TablesInsert<'accounts'>;
type AccountUpdate = TablesUpdate<'accounts'>;

// Use in functions
async function createNote(data: TablesInsert<'notes'>) {
  const { data: note, error } = await supabase
    .from('notes')
    .insert(data)
    .select()
    .single();

  return note;
}
```

## Common Schema Patterns

### Audit Trail
@@ -6,13 +6,15 @@ This file contains guidance for working with database schemas, migrations, and S

Schemas are organized in numbered files in the `schemas/` directory. Numbers are used to sort dependencies.

## Schema Development Workflow

Migrations are generated from schemas. If creating a new schema, the migration can be created using the exact same content.

If modifying an existing schema, use the `diff` command:

### 1. Creating New Schema Files

```bash
# Create new schema file
touch apps/web/supabase/schemas/15-my-new-feature.sql

# Apply changes and create migration
pnpm --filter web run supabase:db:diff -f my-new-feature
@@ -24,6 +26,8 @@ pnpm supabase:web:reset
pnpm supabase:web:typegen
```

Verify the diff command generated the same content as the schema; if not, take steps to fix the migration.
### 2. Modifying Existing Schemas

```bash
@@ -35,6 +39,8 @@ pnpm --filter web run supabase:db:diff -f update-accounts

# Apply and test
pnpm supabase:web:reset

# After resetting
pnpm supabase:web:typegen
```

@@ -223,47 +229,6 @@ pnpm supabase:web:reset

```bash
pnpm run supabase:web:test
```

## Type Generation

### After Schema Changes

```bash
# Generate types after any schema changes
pnpm supabase:web:typegen
# Types are generated to src/lib/supabase/database.types.ts

# Reset DB
pnpm supabase:web:reset
```

### Using Generated Types

```typescript
import { Enums, Tables, TablesInsert, TablesUpdate } from '@kit/supabase/database';

// Table (row) types
type Account = Tables<'accounts'>;
type Note = Tables<'notes'>;

// Enum types
type AppPermission = Enums<'app_permissions'>;

// Insert/Update types use the dedicated helpers
// (Tables<'accounts'> is already the Row type, so it has no 'Insert' key)
type AccountInsert = TablesInsert<'accounts'>;
type AccountUpdate = TablesUpdate<'accounts'>;

// Use in functions
async function createNote(data: TablesInsert<'notes'>) {
  const { data: note, error } = await supabase
    .from('notes')
    .insert(data)
    .select()
    .single();

  return note;
}
```

## Common Schema Patterns

### Audit Trail
@@ -9,7 +9,7 @@ import { EllipsisVertical } from 'lucide-react';
import { useForm } from 'react-hook-form';
import { z } from 'zod';

import { Database } from '@kit/supabase/database';
import { Tables } from '@kit/supabase/database';
import { Button } from '@kit/ui/button';
import {
  DropdownMenu,
@@ -38,7 +38,7 @@ import { AdminDeleteUserDialog } from './admin-delete-user-dialog';
import { AdminImpersonateUserDialog } from './admin-impersonate-user-dialog';
import { AdminResetPasswordDialog } from './admin-reset-password-dialog';

type Account = Database['public']['Tables']['accounts']['Row'];
type Account = Tables<'accounts'>;

const FiltersSchema = z.object({
  type: z.enum(['all', 'team', 'personal']),

@@ -5,7 +5,6 @@ import { useCallback, useEffect, useState } from 'react';
import { Bell, CircleAlert, Info, TriangleAlert, XIcon } from 'lucide-react';
import { useTranslation } from 'react-i18next';

import { Database } from '@kit/supabase/database';
import { Button } from '@kit/ui/button';
import { If } from '@kit/ui/if';
import { Popover, PopoverContent, PopoverTrigger } from '@kit/ui/popover';
@@ -13,38 +12,29 @@ import { Separator } from '@kit/ui/separator';
import { cn } from '@kit/ui/utils';

import { useDismissNotification, useFetchNotifications } from '../hooks';

type Notification = Database['public']['Tables']['notifications']['Row'];

type PartialNotification = Pick<
  Notification,
  'id' | 'body' | 'dismissed' | 'type' | 'created_at' | 'link'
>;
import { Notification } from '../types';

export function NotificationsPopover(params: {
  realtime: boolean;
  accountIds: string[];
  onClick?: (notification: PartialNotification) => void;
  onClick?: (notification: Notification) => void;
}) {
  const { i18n, t } = useTranslation();

  const [open, setOpen] = useState(false);
  const [notifications, setNotifications] = useState<PartialNotification[]>([]);
  const [notifications, setNotifications] = useState<Notification[]>([]);

  const onNotifications = useCallback(
    (notifications: PartialNotification[]) => {
      setNotifications((existing) => {
        const unique = new Set(existing.map((notification) => notification.id));
  const onNotifications = useCallback((notifications: Notification[]) => {
    setNotifications((existing) => {
      const unique = new Set(existing.map((notification) => notification.id));

        const notificationsFiltered = notifications.filter(
          (notification) => !unique.has(notification.id),
        );
      const notificationsFiltered = notifications.filter(
        (notification) => !unique.has(notification.id),
      );

        return [...notificationsFiltered, ...existing];
      });
    },
    [],
  );
      return [...notificationsFiltered, ...existing];
    });
  }, []);

  const dismissNotification = useDismissNotification();

@@ -4,17 +4,9 @@ import { useQuery } from '@tanstack/react-query';

import { useSupabase } from '@kit/supabase/hooks/use-supabase';

import { Notification } from '../types';
import { useNotificationsStream } from './use-notifications-stream';

type Notification = {
  id: number;
  body: string;
  dismissed: boolean;
  type: 'info' | 'warning' | 'error';
  created_at: string;
  link: string | null;
};

export function useFetchNotifications({
  onNotifications,
  accountIds,

@@ -2,14 +2,7 @@ import { useEffect } from 'react';

import { useSupabase } from '@kit/supabase/hooks/use-supabase';

type Notification = {
  id: number;
  body: string;
  dismissed: boolean;
  type: 'info' | 'warning' | 'error';
  created_at: string;
  link: string | null;
};
import { Notification } from '../types';

export function useNotificationsStream({
  onNotifications,

packages/features/notifications/src/types.ts (new file, 6 lines)
@@ -0,0 +1,6 @@
import { Tables } from '@kit/supabase/database';

export type Notification = Pick<
  Tables<'notifications'>,
  'id' | 'body' | 'dismissed' | 'type' | 'created_at' | 'link'
>;
@@ -3,9 +3,9 @@ import { SupabaseClient } from '@supabase/supabase-js';
import { z } from 'zod';

import { getLogger } from '@kit/shared/logger';
import { Database } from '@kit/supabase/database';
import { Database, Tables } from '@kit/supabase/database';

type Invitation = Database['public']['Tables']['invitations']['Row'];
type Invitation = Tables<'invitations'>;

const invitePath = '/join';

@@ -1,9 +1,9 @@
import { z } from 'zod';

import { getLogger } from '@kit/shared/logger';
import { Database } from '@kit/supabase/database';
import { Tables } from '@kit/supabase/database';

type Account = Database['public']['Tables']['accounts']['Row'];
type Account = Tables<'accounts'>;

export function createAccountWebhooksService() {
  return new AccountWebhooksService();

@@ -3,6 +3,7 @@
  "private": true,
  "version": "0.1.0",
  "main": "./build/index.js",
  "module": true,
  "bin": {
    "makerkit-mcp-server": "./build/index.js"
  },
@@ -17,6 +18,7 @@
    "clean": "rm -rf .turbo node_modules",
    "format": "prettier --check \"**/*.{mjs,ts,md,json}\"",
    "build": "tsc && chmod 755 build/index.js",
    "build:watch": "tsc --watch",
    "mcp": "node build/index.js"
  },
  "devDependencies": {
@@ -25,6 +27,7 @@
    "@kit/tsconfig": "workspace:*",
    "@modelcontextprotocol/sdk": "1.18.0",
    "@types/node": "^24.5.0",
    "postgres": "3.4.7",
    "zod": "^3.25.74"
  },
  "prettier": "@kit/prettier-config"

@@ -2,24 +2,30 @@ import { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';
import { StdioServerTransport } from '@modelcontextprotocol/sdk/server/stdio.js';

import { registerComponentsTools } from './tools/components';
import { registerDatabaseTools } from './tools/database';
import {
  registerDatabaseResources,
  registerDatabaseTools,
} from './tools/database';
import { registerGetMigrationsTools } from './tools/migrations';
import { registerPromptsSystem } from './tools/prompts';
import { registerScriptsTools } from './tools/scripts';

// Create server instance
const server = new McpServer({
  name: 'makerkit',
  version: '1.0.0',
  capabilities: {},
});

registerGetMigrationsTools(server);
registerDatabaseTools(server);
registerComponentsTools(server);
registerScriptsTools(server);

async function main() {
  // Create server instance
  const server = new McpServer({
    name: 'makerkit',
    version: '1.0.0',
  });

  const transport = new StdioServerTransport();

  registerGetMigrationsTools(server);
  registerDatabaseTools(server);
  registerDatabaseResources(server);
  registerComponentsTools(server);
  registerScriptsTools(server);
  registerPromptsSystem(server);

  await server.connect(transport);

  console.error('Makerkit MCP Server running on stdio');

@@ -1,8 +1,17 @@
import type { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';
import { type McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';
import { readFile, readdir, stat } from 'node:fs/promises';
import { join } from 'node:path';
import postgres from 'postgres';
import { z } from 'zod';

const DATABASE_URL =
  process.env.DATABASE_URL ||
  'postgresql://postgres:postgres@127.0.0.1:54322/postgres';

const sql = postgres(DATABASE_URL, {
  prepare: false,
});

interface DatabaseFunction {
  name: string;
  parameters: Array<{
@@ -30,6 +39,58 @@ interface SchemaFile {
  topic: string;
}

interface ProjectTable {
  name: string;
  schema: string;
  sourceFile: string;
  topic: string;
}

interface TableColumn {
  name: string;
  type: string;
  nullable: boolean;
  defaultValue?: string;
  isPrimaryKey: boolean;
  isForeignKey: boolean;
  referencedTable?: string;
  referencedColumn?: string;
}

interface TableIndex {
  name: string;
  columns: string[];
  unique: boolean;
  type: string;
  definition: string;
}

interface TableForeignKey {
  name: string;
  columns: string[];
  referencedTable: string;
  referencedColumns: string[];
  onDelete?: string;
  onUpdate?: string;
}

interface TableInfo {
  name: string;
  schema: string;
  sourceFile: string;
  topic: string;
  columns: TableColumn[];
  foreignKeys: TableForeignKey[];
  indexes: TableIndex[];
  createStatement?: string;
}

interface EnumInfo {
  name: string;
  values: string[];
  sourceFile: string;
}

export class DatabaseTool {
  static async getSchemaFiles(): Promise<SchemaFile[]> {
    const schemasPath = join(
@@ -95,7 +156,21 @@ export class DatabaseTool {
    functionName: string,
  ): Promise<DatabaseFunction> {
    const functions = await this.getFunctions();
    const func = functions.find((f) => f.name === functionName);

    // Extract just the function name if schema prefix is provided (e.g., "public.has_permission" -> "has_permission")
    const nameParts = functionName.split('.');
    const cleanFunctionName = nameParts[nameParts.length - 1];
    const providedSchema = nameParts.length > 1 ? nameParts[0] : 'public';

    // Try to find by exact name first, then by cleaned name and schema
    let func = functions.find((f) => f.name === functionName);

    if (!func) {
      // Match by function name and schema (defaulting to public if no schema provided)
      func = functions.find(
        (f) => f.name === cleanFunctionName && f.schema === providedSchema,
      );
    }

    if (!func) {
      throw new Error(`Function "${functionName}" not found`);
@@ -108,12 +183,43 @@ export class DatabaseTool {
    const allFunctions = await this.getFunctions();
    const searchTerm = query.toLowerCase();

    // Extract schema and function name from search query if provided
    const nameParts = query.split('.');
    const cleanSearchTerm = nameParts[nameParts.length - 1].toLowerCase();

    const searchSchema =
      nameParts.length > 1 ? nameParts[0].toLowerCase() : null;

    return allFunctions.filter((func) => {
      const matchesName = func.name.toLowerCase().includes(cleanSearchTerm);
      const matchesFullName = func.name.toLowerCase().includes(searchTerm);

      const matchesSchema = searchSchema
        ? func.schema.toLowerCase() === searchSchema
        : true;

      const matchesDescription = func.description
        .toLowerCase()
        .includes(searchTerm);

      const matchesPurpose = func.purpose.toLowerCase().includes(searchTerm);

      const matchesReturnType = func.returnType
        .toLowerCase()
        .includes(searchTerm);

      // If schema is specified in query, must match both name and schema
      if (searchSchema) {
        return (matchesName || matchesFullName) && matchesSchema;
      }

      // Otherwise, match on any field
      return (
        func.name.toLowerCase().includes(searchTerm) ||
        func.description.toLowerCase().includes(searchTerm) ||
        func.purpose.toLowerCase().includes(searchTerm) ||
        func.returnType.toLowerCase().includes(searchTerm)
        matchesName ||
        matchesFullName ||
        matchesDescription ||
        matchesPurpose ||
        matchesReturnType
      );
    });
  }
@@ -158,6 +264,262 @@ export class DatabaseTool {
    );
  }

  static async getAllProjectTables(): Promise<ProjectTable[]> {
    const schemaFiles = await this.getSchemaFiles();
    const tables: ProjectTable[] = [];

    for (const file of schemaFiles) {
      const content = await readFile(file.path, 'utf8');
      const extractedTables = this.extractTablesWithSchema(content);

      for (const table of extractedTables) {
        tables.push({
          name: table.name,
          schema: table.schema || 'public',
          sourceFile: file.name,
          topic: file.topic,
        });
      }
    }

    return tables;
  }

  static async getAllEnums(): Promise<Record<string, EnumInfo>> {
    try {
      // Try to get live enums from database first
      const liveEnums = await this.getEnumsFromDB();
      if (Object.keys(liveEnums).length > 0) {
        return liveEnums;
      }

      // Fallback to schema files
      const enumContent = await this.getSchemaContent('01-enums.sql');
      return this.parseEnums(enumContent);
    } catch (error) {
      return {};
    }
  }

  static async getTableInfo(
    schema: string,
    tableName: string,
  ): Promise<TableInfo> {
    const schemaFiles = await this.getSchemaFiles();

    for (const file of schemaFiles) {
      const content = await readFile(file.path, 'utf8');
      const tableDefinition = this.extractTableDefinition(
        content,
        schema,
        tableName,
      );

      if (tableDefinition) {
        // Enhance with live database info
        const liveColumns = await this.getTableColumnsFromDB(schema, tableName);
        const liveForeignKeys = await this.getTableForeignKeysFromDB(
          schema,
          tableName,
        );
        const liveIndexes = await this.getTableIndexesFromDB(schema, tableName);

        return {
          name: tableName,
          schema: schema,
          sourceFile: file.name,
          topic: file.topic,
          columns:
            liveColumns.length > 0
              ? liveColumns
              : this.parseColumns(tableDefinition),
          foreignKeys:
            liveForeignKeys.length > 0
              ? liveForeignKeys
              : this.parseForeignKeys(tableDefinition),
          indexes:
            liveIndexes.length > 0
              ? liveIndexes
              : this.parseIndexes(content, tableName),
          createStatement: tableDefinition,
        };
      }
    }

    throw new Error(`Table ${schema}.${tableName} not found in schema files`);
  }

  static async getTableColumnsFromDB(
    schema: string,
    tableName: string,
  ): Promise<TableColumn[]> {
    try {
      const columns = await sql`
        SELECT
          c.column_name,
          c.data_type,
          c.is_nullable,
          c.column_default,
          CASE WHEN pk.column_name IS NOT NULL THEN true ELSE false END as is_primary_key,
          CASE WHEN fk.column_name IS NOT NULL THEN true ELSE false END as is_foreign_key,
          fk.foreign_table_name as referenced_table,
          fk.foreign_column_name as referenced_column
        FROM information_schema.columns c
        LEFT JOIN (
          SELECT ku.table_name, ku.column_name
          FROM information_schema.table_constraints tc
          JOIN information_schema.key_column_usage ku
            ON tc.constraint_name = ku.constraint_name
            AND tc.table_schema = ku.table_schema
          WHERE tc.constraint_type = 'PRIMARY KEY'
            AND tc.table_schema = ${schema}
        ) pk ON c.table_name = pk.table_name AND c.column_name = pk.column_name
        LEFT JOIN (
          SELECT
            ku.table_name,
            ku.column_name,
            ccu.table_name AS foreign_table_name,
            ccu.column_name AS foreign_column_name
          FROM information_schema.table_constraints tc
          JOIN information_schema.key_column_usage ku
            ON tc.constraint_name = ku.constraint_name
            AND tc.table_schema = ku.table_schema
          JOIN information_schema.constraint_column_usage ccu
            ON ccu.constraint_name = tc.constraint_name
            AND ccu.table_schema = tc.table_schema
          WHERE tc.constraint_type = 'FOREIGN KEY'
            AND tc.table_schema = ${schema}
        ) fk ON c.table_name = fk.table_name AND c.column_name = fk.column_name
        WHERE c.table_schema = ${schema}
          AND c.table_name = ${tableName}
        ORDER BY c.ordinal_position
      `;

      return columns.map((col) => ({
        name: col.column_name,
        type: col.data_type,
        nullable: col.is_nullable === 'YES',
        defaultValue: col.column_default,
        isPrimaryKey: col.is_primary_key,
        isForeignKey: col.is_foreign_key,
        referencedTable: col.referenced_table,
        referencedColumn: col.referenced_column,
      }));
    } catch (error) {
      console.error(error);
      return [];
    }
  }

  static async getTableForeignKeysFromDB(
    schema: string,
    tableName: string,
  ): Promise<TableForeignKey[]> {
    try {
      const foreignKeys = await sql`
        SELECT
          tc.constraint_name,
          string_agg(kcu.column_name, ',' ORDER BY kcu.ordinal_position) as columns,
          ccu.table_name AS foreign_table_name,
          string_agg(ccu.column_name, ',' ORDER BY kcu.ordinal_position) as foreign_columns,
          rc.delete_rule,
          rc.update_rule
        FROM information_schema.table_constraints tc
        JOIN information_schema.key_column_usage kcu
          ON tc.constraint_name = kcu.constraint_name
          AND tc.table_schema = kcu.table_schema
        JOIN information_schema.constraint_column_usage ccu
          ON ccu.constraint_name = tc.constraint_name
          AND ccu.table_schema = tc.table_schema
        JOIN information_schema.referential_constraints rc
          ON tc.constraint_name = rc.constraint_name
          AND tc.table_schema = rc.constraint_schema
        WHERE tc.constraint_type = 'FOREIGN KEY'
          AND tc.table_schema = ${schema}
          AND tc.table_name = ${tableName}
        GROUP BY tc.constraint_name, ccu.table_name, rc.delete_rule, rc.update_rule
      `;

      return foreignKeys.map((fk: any) => ({
        name: fk.constraint_name,
        columns: fk.columns.split(','),
        referencedTable: fk.foreign_table_name,
        referencedColumns: fk.foreign_columns.split(','),
        onDelete: fk.delete_rule,
        onUpdate: fk.update_rule,
      }));
    } catch (error) {
      return [];
    }
  }

  static async getTableIndexesFromDB(
    schema: string,
    tableName: string,
  ): Promise<TableIndex[]> {
    try {
      const indexes = await sql`
        SELECT
          i.indexname,
          i.indexdef,
          ix.indisunique as is_unique,
          string_agg(a.attname, ',' ORDER BY a.attnum) as columns
        FROM pg_indexes i
        JOIN pg_class c ON c.relname = i.tablename
        JOIN pg_namespace n ON n.oid = c.relnamespace
        JOIN pg_index ix ON ix.indexrelid = (
          SELECT oid FROM pg_class WHERE relname = i.indexname
        )
        JOIN pg_attribute a ON a.attrelid = c.oid
          AND a.attnum = ANY(ix.indkey)
        WHERE n.nspname = ${schema}
          AND i.tablename = ${tableName}
          AND i.indexname NOT LIKE '%_pkey'
        GROUP BY i.indexname, i.indexdef, ix.indisunique
        ORDER BY i.indexname
      `;

      return indexes.map((idx) => ({
        name: idx.indexname,
        columns: idx.columns.split(','),
        unique: idx.is_unique,
        type: 'btree', // Default, could be enhanced
        definition: idx.indexdef,
      }));
    } catch (error) {
      console.error(error);
      return [];
    }
  }

  static async getEnumsFromDB(): Promise<Record<string, EnumInfo>> {
    try {
      const enums = await sql`
        SELECT
          t.typname as enum_name,
          array_agg(e.enumlabel ORDER BY e.enumsortorder) as enum_values
        FROM pg_type t
        JOIN pg_enum e ON t.oid = e.enumtypid
        JOIN pg_namespace n ON n.oid = t.typnamespace
        WHERE n.nspname = 'public'
        GROUP BY t.typname
        ORDER BY t.typname
      `;

      const result: Record<string, EnumInfo> = {};
      for (const enumData of enums) {
        result[enumData.enum_name] = {
          name: enumData.enum_name,
          values: enumData.enum_values,
          sourceFile: 'database', // Live from DB
        };
      }
      return result;
    } catch (error) {
      return {};
    }
  }

  private static extractFunctionsFromContent(
    content: string,
    sourceFile: string,
@@ -328,6 +690,32 @@ export class DatabaseTool {
    return [...new Set(tables)]; // Remove duplicates
  }

  private static extractTablesWithSchema(content: string): Array<{
    name: string;
    schema: string;
  }> {
    const tables: Array<{ name: string; schema: string }> = [];
    const tableRegex =
      /create\s+table\s+(?:if\s+not\s+exists\s+)?(?:([a-zA-Z_][a-zA-Z0-9_]*)\.)?([a-zA-Z_][a-zA-Z0-9_]*)/gi;
    let match;

    while ((match = tableRegex.exec(content)) !== null) {
      if (match[2]) {
        tables.push({
          schema: match[1] || 'public',
          name: match[2],
        });
      }
    }

    return tables.filter(
      (table, index, arr) =>
        arr.findIndex(
          (t) => t.name === table.name && t.schema === table.schema,
        ) === index,
    );
  }

  private static extractFunctionNames(content: string): string[] {
    const functions: string[] = [];
    const functionRegex =
@@ -361,6 +749,176 @@ export class DatabaseTool {
    return [...new Set(dependencies)]; // Remove duplicates
  }

  private static extractTableDefinition(
    content: string,
    schema: string,
    tableName: string,
  ): string | null {
    const tableRegex = new RegExp(
      `create\\s+table\\s+(?:if\\s+not\\s+exists\\s+)?(?:${schema}\\.)?${tableName}\\s*\\([^;]*?\\);`,
      'gis',
    );
    const match = content.match(tableRegex);
    return match ? match[0] : null;
  }

  private static parseColumns(tableDefinition: string): TableColumn[] {
    const columns: TableColumn[] = [];

    // Extract the content between parentheses
    const contentMatch = tableDefinition.match(/\(([\s\S]*)\)/);
    if (!contentMatch) return columns;

    const content = contentMatch[1];

    // Split by commas, but be careful of nested structures
    const lines = content
      .split('\n')
      .map((line) => line.trim())
      .filter((line) => line);

    for (const line of lines) {
      if (
        line.startsWith('constraint') ||
        line.startsWith('primary key') ||
        line.startsWith('foreign key')
      ) {
        continue; // Skip constraint definitions
      }

      // Parse column definition: name type [constraints]
      const columnMatch = line.match(
        /^([a-zA-Z_][a-zA-Z0-9_]*)\s+([^,\s]+)(?:\s+(.*))?/,
      );
      if (columnMatch) {
        const [, name, type, constraints = ''] = columnMatch;

        columns.push({
          name,
          type: type.replace(/,$/, ''), // Remove trailing comma
          nullable: !constraints.includes('not null'),
          defaultValue: this.extractDefault(constraints),
          isPrimaryKey: constraints.includes('primary key'),
          isForeignKey: constraints.includes('references'),
          referencedTable: this.extractReferencedTable(constraints),
          referencedColumn: this.extractReferencedColumn(constraints),
        });
      }
    }

    return columns;
  }

  private static extractDefault(constraints: string): string | undefined {
    const defaultMatch = constraints.match(/default\s+([^,\s]+)/i);
    return defaultMatch ? defaultMatch[1] : undefined;
  }

  private static extractReferencedTable(
    constraints: string,
  ): string | undefined {
    const refMatch = constraints.match(
      /references\s+([a-zA-Z_][a-zA-Z0-9_]*)/i,
    );
    return refMatch ? refMatch[1] : undefined;
  }

  private static extractReferencedColumn(
    constraints: string,
  ): string | undefined {
    const refMatch = constraints.match(
      /references\s+[a-zA-Z_][a-zA-Z0-9_]*\s*\(([^)]+)\)/i,
    );
    return refMatch ? refMatch[1].trim() : undefined;
  }

  private static parseForeignKeys(tableDefinition: string): TableForeignKey[] {
    const foreignKeys: TableForeignKey[] = [];

    // Match foreign key constraints
    const fkRegex =
      /foreign\s+key\s*\(([^)]+)\)\s*references\s+([a-zA-Z_][a-zA-Z0-9_]*)\s*\(([^)]+)\)(?:\s+on\s+delete\s+([a-z\s]+))?(?:\s+on\s+update\s+([a-z\s]+))?/gi;

    let match;
    while ((match = fkRegex.exec(tableDefinition)) !== null) {
      const [
        ,
        columns,
        referencedTable,
        referencedColumns,
        onDelete,
        onUpdate,
      ] = match;

      foreignKeys.push({
        name: `fk_${referencedTable}_${columns.replace(/\s/g, '')}`,
        columns: columns.split(',').map((col) => col.trim()),
        referencedTable,
        referencedColumns: referencedColumns
          .split(',')
          .map((col) => col.trim()),
        onDelete: onDelete?.trim(),
        onUpdate: onUpdate?.trim(),
      });
    }

    return foreignKeys;
  }

  private static parseIndexes(
    content: string,
    tableName: string,
  ): TableIndex[] {
    const indexes: TableIndex[] = [];

    // Match CREATE INDEX statements
    const indexRegex = new RegExp(
      `create\\s+(?:unique\\s+)?index\\s+([a-zA-Z_][a-zA-Z0-9_]*)\\s+on\\s+(?:public\\.)?${tableName}\\s*\\(([^)]+)\\)`,
      'gi',
    );

    let match;
    while ((match = indexRegex.exec(content)) !== null) {
      const [fullMatch, indexName, columns] = match;

      indexes.push({
        name: indexName,
        columns: columns.split(',').map((col) => col.trim()),
        unique: fullMatch.toLowerCase().includes('unique'),
        type: 'btree', // Default type
        definition: fullMatch,
      });
    }

    return indexes;
  }

  private static parseEnums(content: string): Record<string, EnumInfo> {
    const enums: Record<string, EnumInfo> = {};

    // Match CREATE TYPE ... AS ENUM
    const enumRegex =
      /create\s+type\s+([a-zA-Z_][a-zA-Z0-9_]*)\s+as\s+enum\s*\(([^)]+)\)/gi;

    let match;
    while ((match = enumRegex.exec(content)) !== null) {
      const [, enumName, values] = match;

      const enumValues = values
        .split(',')
        .map((value) => value.trim().replace(/['"]/g, ''))
        .filter((value) => value);

      enums[enumName] = {
        name: enumName,
        values: enumValues,
        sourceFile: '01-enums.sql',
      };
    }

    return enums;
  }

  private static determineTopic(fileName: string, content: string): string {
    // Map file names to topics
    const fileTopicMap: Record<string, string> = {
@@ -426,6 +984,14 @@ export function registerDatabaseTools(server: McpServer) {
  createSearchFunctionsTool(server);
}

export function registerDatabaseResources(server: McpServer) {
  createDatabaseSummaryTool(server);
  createDatabaseTablesListTool(server);
  createGetTableInfoTool(server);
  createGetEnumInfoTool(server);
  createGetAllEnumsTool(server);
}

function createGetSchemaFilesTool(server: McpServer) {
  return server.tool(
    'get_schema_files',
@@ -704,3 +1270,192 @@ function createGetSchemaBySectionTool(server: McpServer) {
    },
  );
}

function createDatabaseSummaryTool(server: McpServer) {
  return server.tool(
    'get_database_summary',
    '📊 Get comprehensive database overview with tables, enums, and functions',
    async () => {
      const tables = await DatabaseTool.getAllProjectTables();
      const enums = await DatabaseTool.getAllEnums();
      const functions = await DatabaseTool.getFunctions();

      const summary = {
        overview: {
          totalTables: tables.length,
          totalEnums: Object.keys(enums).length,
          totalFunctions: functions.length,
        },
        tables: tables.map((t) => ({
          name: t.name,
          schema: t.schema,
          topic: t.topic,
          sourceFile: t.sourceFile,
        })),
        enums: Object.entries(enums).map(([name, info]) => ({
          name,
          values: info.values,
          sourceFile: info.sourceFile,
        })),
        functions: functions.map((f) => ({
          name: f.name,
          schema: f.schema,
          purpose: f.purpose,
          sourceFile: f.sourceFile,
        })),
        tablesByTopic: tables.reduce(
          (acc, table) => {
            if (!acc[table.topic]) acc[table.topic] = [];
            acc[table.topic].push(table.name);
            return acc;
          },
          {} as Record<string, string[]>,
        ),
      };

      return {
        content: [
          {
            type: 'text',
            text: `📊 DATABASE OVERVIEW\n\n${JSON.stringify(summary, null, 2)}`,
          },
        ],
      };
    },
  );
}

function createDatabaseTablesListTool(server: McpServer) {
  return server.tool(
    'get_database_tables',
    '📋 Get list of all project-defined database tables',
    async () => {
      const tables = await DatabaseTool.getAllProjectTables();

      return {
        content: [
          {
            type: 'text',
            text: `📋 PROJECT TABLES\n\n${JSON.stringify(tables, null, 2)}`,
          },
        ],
      };
    },
  );
}

function createGetTableInfoTool(server: McpServer) {
  return server.tool(
    'get_table_info',
    '🗂️ Get detailed table schema with columns, foreign keys, and indexes',
    {
      state: z.object({
        schema: z.string().default('public'),
        tableName: z.string(),
      }),
    },
    async ({ state }) => {
      try {
        const tableInfo = await DatabaseTool.getTableInfo(
          state.schema,
          state.tableName,
        );

        return {
          content: [
            {
              type: 'text',
              text: `🗂️ TABLE: ${state.schema}.${state.tableName}\n\n${JSON.stringify(tableInfo, null, 2)}`,
            },
          ],
        };
      } catch (error) {
        return {
          content: [
            {
              type: 'text',
              text: `❌ Error: ${error instanceof Error ? error.message : String(error)}`,
            },
          ],
        };
      }
    },
  );
}

function createGetEnumInfoTool(server: McpServer) {
  return server.tool(
    'get_enum_info',
    '🏷️ Get enum type definition with all possible values',
    {
      state: z.object({
        enumName: z.string(),
      }),
    },
    async ({ state }) => {
      try {
        const enums = await DatabaseTool.getAllEnums();
        const enumInfo = enums[state.enumName];

        if (!enumInfo) {
          return {
            content: [
              {
                type: 'text',
                text: `❌ Enum "${state.enumName}" not found. Available enums: ${Object.keys(enums).join(', ')}`,
              },
            ],
          };
        }

        return {
          content: [
            {
              type: 'text',
              text: `🏷️ ENUM: ${state.enumName}\n\n${JSON.stringify(enumInfo, null, 2)}`,
            },
          ],
        };
      } catch (error) {
        return {
          content: [
            {
              type: 'text',
              text: `❌ Error: ${error instanceof Error ? error.message : String(error)}`,
            },
          ],
        };
      }
    },
  );
||||
}
|
||||
|
||||
function createGetAllEnumsTool(server: McpServer) {
|
||||
return server.tool(
|
||||
'get_all_enums',
|
||||
'🏷️ Get all enum types and their values',
|
||||
async () => {
|
||||
try {
|
||||
const enums = await DatabaseTool.getAllEnums();
|
||||
|
||||
return {
|
||||
content: [
|
||||
{
|
||||
type: 'text',
|
||||
text: `🏷️ ALL ENUMS\n\n${JSON.stringify(enums, null, 2)}`,
|
||||
},
|
||||
],
|
||||
};
|
||||
} catch (error) {
|
||||
return {
|
||||
content: [
|
||||
{
|
||||
type: 'text',
|
||||
text: `❌ Error: ${error instanceof Error ? error.message : String(error)}`,
|
||||
},
|
||||
],
|
||||
};
|
||||
}
|
||||
},
|
||||
);
|
||||
}
|
||||
|
||||
packages/mcp-server/src/tools/prompts.ts — new file, 551 lines
@@ -0,0 +1,551 @@
import type { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';
import { z } from 'zod';

interface PromptTemplate {
  name: string;
  title: string;
  description: string;
  category:
    | 'code-review'
    | 'development'
    | 'database'
    | 'testing'
    | 'architecture'
    | 'debugging';
  arguments: Array<{
    name: string;
    description: string;
    required: boolean;
    type: 'string' | 'text' | 'enum';
    options?: string[];
  }>;
  template: string;
  examples?: string[];
}

export class PromptsManager {
  private static prompts: PromptTemplate[] = [
    {
      name: 'code_review',
      title: 'Comprehensive Code Review',
      description:
        'Analyze code for quality, security, performance, and best practices',
      category: 'code-review',
      arguments: [
        {
          name: 'code',
          description: 'The code to review',
          required: true,
          type: 'text',
        },
        {
          name: 'focus_area',
          description: 'Specific area to focus the review on',
          required: false,
          type: 'enum',
          options: [
            'security',
            'performance',
            'maintainability',
            'typescript',
            'react',
            'all',
          ],
        },
        {
          name: 'severity_level',
          description: 'Minimum severity level for issues to report',
          required: false,
          type: 'enum',
          options: ['low', 'medium', 'high', 'critical'],
        },
      ],
      template: `Please review the following code with a focus on {{focus_area || 'all aspects'}}.

**Code to Review:**
\`\`\`
{{code}}
\`\`\`

**Makerkit Standards Review Criteria:**

**TypeScript Excellence:**
- Strict TypeScript with no 'any' types - use explicit types always
- Implicit type inference preferred unless impossible
- Proper error handling with try/catch and typed error objects
- Clean, clear, well-designed code without obvious comments

**React & Next.js 15 Best Practices:**
- Functional components only, with the 'use client' directive for client components
- Encapsulate repeated blocks of code into reusable local components
- Avoid useEffect (code smell) - justify if absolutely necessary
- Single state objects over multiple useState calls
- Prefer server-side data fetching using React Server Components
- Display loading indicators with the LoadingSpinner component where appropriate
- Add data-test attributes for E2E testing where appropriate
- Server actions that redirect should handle the error using "isRedirectError" from 'next/dist/client/components/redirect-error'

**Makerkit Architecture Patterns:**
- Multi-tenant architecture with proper account-based access control
- Use account_id foreign keys for data association
- Personal vs Team accounts pattern implementation
- Proper use of Row Level Security (RLS) policies
- Supabase integration best practices

**Database Best Practices:**
- Use existing database functions instead of writing your own
- RLS is applied to all tables unless explicitly instructed otherwise
- RLS prevents data leakage between accounts
- Users are prevented from updating fields they are not allowed to update (via column-level permissions)
- Triggers for tracking timestamps and user tracking are used if required
- Schema is thorough and covers all data integrity and business rules, but is not unnecessarily complex or over-engineered
- Schema uses constraints/triggers where required for data integrity and business rules
- Schema prevents invalid data from being inserted or updated

**Code Quality Standards:**
- No unnecessary complexity or overly abstract code
- Consistent file structure following monorepo patterns
- Proper package organization in Turborepo structure
- Use of @kit/ui components and established patterns

{{#if severity_level}}
**Severity Filter:** Only report issues of {{severity_level}} severity or higher.
{{/if}}

**Please provide:**
1. **Overview:** Brief summary of code quality
2. **Issues Found:** List specific problems with severity levels
3. **Suggestions:** Concrete improvement recommendations
4. **Best Practices:** Relevant patterns from the Makerkit codebase
5. **Security Review:** Any security concerns or improvements`,
      examples: [
        'Review a React component for best practices',
        'Security-focused review of authentication code',
        'Performance analysis of database queries',
      ],
    },
    {
      name: 'component_implementation',
      title: 'Component Implementation Guide',
      description:
        'Generate implementation guidance for creating new UI components',
      category: 'development',
      arguments: [
        {
          name: 'component_description',
          description: 'Description of the component to implement',
          required: true,
          type: 'text',
        },
        {
          name: 'component_type',
          description: 'Type of component to create',
          required: true,
          type: 'enum',
          options: ['shadcn', 'makerkit', 'page', 'form', 'table', 'modal'],
        },
        {
          name: 'features',
          description: 'Specific features or functionality needed',
          required: false,
          type: 'text',
        },
      ],
      template: `Help me implement a {{component_type}} component: {{component_description}}

{{#if features}}
**Required Features:**
{{features}}
{{/if}}

**Please provide:**
1. **Component Design:** Architecture and structure recommendations
2. **Code Implementation:** Full TypeScript/React code with proper typing
3. **Styling Approach:** Tailwind CSS classes and variants (use CVA if applicable)
4. **Props Interface:** Complete TypeScript interface definition
5. **Usage Examples:** How to use the component in different scenarios
6. **Testing Strategy:** Unit tests and accessibility considerations
7. **Makerkit Integration:** How this fits with existing patterns

**Makerkit Implementation Requirements:**

**TypeScript Standards:**
- Strict TypeScript with no 'any' types
- Use implicit type inference unless impossible
- Proper error handling with typed errors
- Clean code without unnecessary comments

**Component Architecture:**
- Functional components with the proper 'use client' directive
- Use existing @kit/ui components (shadcn + makerkit customs)
- Follow established patterns: enhanced-data-table, if, trans, page
- Implement proper conditional rendering with the <If> component
- Display loading indicators with the LoadingSpinner component where appropriate
- Encapsulate repeated blocks of code into reusable local components

**Styling & UI Standards:**
- Tailwind CSS 4 with CVA (Class Variance Authority) for variants
- Responsive design with a mobile-first approach
- Proper accessibility with ARIA attributes and data-test for E2E
- Use shadcn components as a base, extend with makerkit patterns

**State & Data Management:**
- Single state objects over multiple useState
- Server-side data fetching with RSC preferred
- Supabase client integration with proper error handling
- Account-based data access with proper RLS policies

**File Structure:**
- Follow the monorepo structure: packages/features/* for feature packages
- Use established naming conventions and folder organization
- Import from @kit/* packages appropriately`,
      examples: [
        'Create a data table component with sorting and filtering',
        'Build a multi-step form component',
        'Design a notification center component',
      ],
    },
    {
      name: 'architecture_guidance',
      title: 'Architecture Guidance',
      description: 'Provide architectural recommendations for complex features',
      category: 'architecture',
      arguments: [
        {
          name: 'feature_scope',
          description: 'Description of the feature or system to architect',
          required: true,
          type: 'text',
        },
        {
          name: 'scale_requirements',
          description: 'Expected scale and performance requirements',
          required: false,
          type: 'text',
        },
        {
          name: 'constraints',
          description: 'Technical constraints or requirements',
          required: false,
          type: 'text',
        },
      ],
      template: `Provide architectural guidance for: {{feature_scope}}

{{#if scale_requirements}}
**Scale Requirements:** {{scale_requirements}}
{{/if}}

{{#if constraints}}
**Constraints:** {{constraints}}
{{/if}}

**Please provide:**
1. **Architecture Overview:** High-level system design and components
2. **Data Architecture:** Database design and data flow patterns
3. **API Design:** RESTful endpoints and GraphQL considerations
4. **State Management:** Client-side state architecture
5. **Security Architecture:** Authentication, authorization, and data protection
6. **Performance Strategy:** Caching, optimization, and scaling approaches
7. **Integration Patterns:** How this fits with the existing Makerkit architecture

**Makerkit Architecture Standards:**

**Multi-Tenant Patterns:**
- Account-based data isolation with proper foreign key relationships
- Personal vs Team account architecture (auth.users.id = accounts.id for personal)
- Role-based access control with roles, memberships, and permissions tables
- RLS policies that enforce account boundaries at the database level

**Technology Stack Integration:**
- Next.js 15 App Router with React Server Components
- Supabase for database, auth, storage, and real-time features
- TypeScript strict mode with no 'any' types
- Tailwind CSS 4 with shadcn/ui and custom Makerkit components
- Turborepo monorepo with proper package organization

**Performance & Security:**
- Server-side data fetching preferred over client-side
- Proper error boundaries and graceful error handling
- Account-level data access patterns with efficient queries
- Use of existing database functions for complex operations

**Code Organization:**
- For simplicity, place the feature directly in the application (apps/web) unless you're asked to create a separate package for it
- Shared utilities in packages/* (ui, auth, billing, etc.)
- Consistent naming conventions and file structure
- Proper import patterns from @kit/* packages`,
      examples: [
        'Design a real-time notification system',
        'Architect a file upload and processing system',
        'Design a reporting and analytics feature',
      ],
    },
    {
      name: 'makerkit_feature_implementation',
      title: 'Makerkit Feature Implementation Guide',
      description:
        'Complete guide for implementing new features following Makerkit patterns',
      category: 'development',
      arguments: [
        {
          name: 'feature_name',
          description: 'Name of the feature to implement',
          required: true,
          type: 'string',
        },
        {
          name: 'feature_type',
          description: 'Type of feature being implemented',
          required: true,
          type: 'enum',
          options: [
            'billing',
            'auth',
            'team-management',
            'data-management',
            'api',
            'ui-component',
          ],
        },
        {
          name: 'user_stories',
          description: 'User stories or requirements for the feature',
          required: false,
          type: 'text',
        },
      ],
      template: `Implement a {{feature_type}} feature: {{feature_name}}

{{#if user_stories}}
**User Requirements:**
{{user_stories}}
{{/if}}

**Please provide a complete Makerkit implementation including:**

**1. Database Design:**
- Schema changes following multi-tenant patterns
- RLS policies for account-based access control
- Database functions if needed (SECURITY DEFINER/INVOKER)
- Proper foreign key relationships with account_id
- Schema uses constraints/triggers where required for data integrity and business rules
- Schema prevents invalid data from being inserted or updated

**2. Backend Implementation:**
- Server Actions or API routes following Next.js 15 patterns
- Proper error handling with typed responses
- Integration with existing Supabase auth and database
- Account-level data access patterns
- Redirect using Server Actions/API Routes instead of client-side navigation

**3. Frontend Components:**
- React Server Components where possible
- Use of @kit/ui components (shadcn + makerkit)
- Small, composable, explicit, reusable, well-named components
- Proper TypeScript interfaces and types
- Single state objects over multiple useState
- Conditional rendering with <If> component

**4. Package Organization:**
- If reusable, create feature package in packages/features/{{feature_name}}
- Proper exports and package.json configuration
- Integration with existing packages (@kit/auth, @kit/ui, etc.)

**5. Code Quality:**
- TypeScript strict mode with no 'any' types
- Proper error boundaries and handling
- Follow established file structure and naming conventions

**Makerkit Standards:**
- Multi-tenant architecture with account-based access
- Use existing database functions where applicable
- Follow monorepo patterns and package organization
- Implement proper security and performance best practices`,
      examples: [
        'Implement team collaboration features',
        'Build a subscription management system',
        'Create a file sharing feature with permissions',
      ],
    },
    {
      name: 'supabase_rls_policy_design',
      title: 'Supabase RLS Policy Design',
      description:
        'Design Row Level Security policies for Makerkit multi-tenant architecture',
      category: 'database',
      arguments: [
        {
          name: 'table_name',
          description: 'Table that needs RLS policies',
          required: true,
          type: 'string',
        },
        {
          name: 'access_patterns',
          description: 'Who should access this data and how',
          required: true,
          type: 'text',
        },
        {
          name: 'data_sensitivity',
          description: 'Sensitivity level of the data',
          required: true,
          type: 'enum',
          options: [
            'public',
            'account-restricted',
            'role-restricted',
            'owner-only',
          ],
        },
      ],
      template: `Design RLS policies for table: {{table_name}}

**Access Requirements:** {{access_patterns}}
**Data Sensitivity:** {{data_sensitivity}}

**Please provide:**

**1. Policy Design:**
- Complete RLS policy definitions (SELECT, INSERT, UPDATE, DELETE)
- Use of existing Makerkit functions: has_role_on_account, has_permission
- Account-based access control following multi-tenant patterns

**2. Security Analysis:**
- How policies enforce account boundaries
- Role-based access control integration
- Prevention of data leakage between accounts

**3. Performance Considerations:**
- Index requirements for efficient policy execution
- Query optimization with RLS overhead
- Use of SECURITY DEFINER functions where needed

**4. Policy SQL:**
\`\`\`sql
-- Enable RLS
ALTER TABLE {{table_name}} ENABLE ROW LEVEL SECURITY;

-- Your policies here
\`\`\`

**5. Testing Strategy:**
- Test cases for different user roles and permissions
- Verification of account isolation
- Performance testing with large datasets

**Makerkit RLS Standards:**
- All user data must respect account boundaries
- Use existing permission functions for consistency
- Personal accounts: auth.users.id = accounts.id
- Team accounts: check via accounts_memberships table
- Leverage roles and role_permissions for granular access`,
      examples: [
        'Design RLS for a documents table',
        'Create policies for team collaboration data',
        'Set up RLS for billing and subscription data',
      ],
    },
  ];

  static getAllPrompts(): PromptTemplate[] {
    return this.prompts;
  }

  static getPromptsByCategory(category: string): PromptTemplate[] {
    return this.prompts.filter((prompt) => prompt.category === category);
  }

  static getPrompt(name: string): PromptTemplate | null {
    return this.prompts.find((prompt) => prompt.name === name) || null;
  }

  static searchPrompts(query: string): PromptTemplate[] {
    const searchTerm = query.toLowerCase();

    return this.prompts.filter(
      (prompt) =>
        prompt.name.toLowerCase().includes(searchTerm) ||
        prompt.title.toLowerCase().includes(searchTerm) ||
        prompt.description.toLowerCase().includes(searchTerm) ||
        prompt.category.toLowerCase().includes(searchTerm),
    );
  }

  static renderPrompt(name: string, args: Record<string, string>): string {
    const prompt = this.getPrompt(name);
    if (!prompt) {
      throw new Error(`Prompt "${name}" not found`);
    }

    // Simple template rendering with Handlebars-like syntax
    let rendered = prompt.template;

    // Replace {{variable}} placeholders
    rendered = rendered.replace(/\{\{(\w+)\}\}/g, (match, varName) => {
      return args[varName] || '';
    });

    // Replace {{variable || default}} placeholders
    rendered = rendered.replace(
      /\{\{(\w+)\s*\|\|\s*'([^']*)'\}\}/g,
      (match, varName, defaultValue) => {
        return args[varName] || defaultValue;
      },
    );

    // Handle conditional blocks {{#if variable}}...{{/if}}
    rendered = rendered.replace(
      /\{\{#if\s+(\w+)\}\}([\s\S]*?)\{\{\/if\}\}/g,
      (match, varName, content) => {
        return args[varName] ? content : '';
      },
    );

    return rendered.trim();
  }
}

export function registerPromptsSystem(server: McpServer) {
  // Register all prompts using the SDK's prompt API
  const allPrompts = PromptsManager.getAllPrompts();

  for (const promptTemplate of allPrompts) {
    // Convert arguments to proper Zod schema format
    const argsSchema = promptTemplate.arguments.reduce(
      (acc, arg) => {
        if (arg.required) {
          acc[arg.name] = z.string().describe(arg.description);
        } else {
          acc[arg.name] = z.string().optional().describe(arg.description);
        }
        return acc;
      },
      {} as Record<string, z.ZodString | z.ZodOptional<z.ZodString>>,
    );

    server.prompt(
      promptTemplate.name,
      promptTemplate.description,
      argsSchema,
      async (args: Record<string, string>) => {
        const renderedPrompt = PromptsManager.renderPrompt(
          promptTemplate.name,
          args,
        );

        return {
          messages: [
            {
              role: 'user',
              content: {
                type: 'text',
                text: renderedPrompt,
              },
            },
          ],
        };
      },
    );
  }
}
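The three regex passes in `PromptsManager.renderPrompt` above can be sketched as a standalone helper. This is a minimal illustration only (the `render` function below is hypothetical, not part of the package's API); note that the plain `{{variable}}` pattern uses `\w+` followed immediately by `}}`, so it cannot consume the `{{variable || 'default'}}` form, which is why the two substitution passes do not conflict.

```typescript
// Standalone sketch of the Handlebars-like rendering used above (illustrative only).
function render(template: string, args: Record<string, string>): string {
  let out = template;

  // Pass 1: {{variable}} — \w+ cannot match across the space/'||', so the
  // {{variable || 'default'}} form survives this pass untouched.
  out = out.replace(/\{\{(\w+)\}\}/g, (_m, name: string) => args[name] || '');

  // Pass 2: {{variable || 'default'}} — falls back to the quoted default.
  out = out.replace(
    /\{\{(\w+)\s*\|\|\s*'([^']*)'\}\}/g,
    (_m, name: string, def: string) => args[name] || def,
  );

  // Pass 3: {{#if variable}}...{{/if}} — body kept only when the arg is truthy.
  out = out.replace(
    /\{\{#if\s+(\w+)\}\}([\s\S]*?)\{\{\/if\}\}/g,
    (_m, name: string, content: string) => (args[name] ? content : ''),
  );

  return out.trim();
}

console.log(
  render("Review {{code}} with focus on {{focus || 'all aspects'}}.", {
    code: 'foo()',
  }),
);
// → Review foo() with focus on all aspects.
```

One consequence of this pass ordering: a missing optional argument renders as an empty string in the plain form, but as its declared default in the `|| 'default'` form, which is exactly how `focus_area` behaves in the `code_review` template.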
@@ -1,391 +0,0 @@
import { ComponentsTool } from './src/tools/components';
import { DatabaseTool } from './src/tools/database';
import { MigrationsTool } from './src/tools/migrations';
import { ScriptsTool } from './src/tools/scripts';

console.log('=== Testing MigrationsTool ===');
console.log(await MigrationsTool.GetMigrations());

console.log(
  await MigrationsTool.getMigrationContent('20240319163440_roles-seed.sql'),
);

console.log('\n=== Testing ComponentsTool ===');

console.log('\n--- Getting all components ---');
const components = await ComponentsTool.getComponents();
console.log(`Found ${components.length} components:`);
components.slice(0, 5).forEach((component) => {
  console.log(
    `- ${component.name} (${component.category}): ${component.description}`,
  );
});
console.log('...');

console.log('\n--- Testing component content retrieval ---');
try {
  const buttonContent = await ComponentsTool.getComponentContent('button');
  console.log('Button component content length:', buttonContent.length);
  console.log('First 200 characters:', buttonContent.substring(0, 200));
} catch (error) {
  console.error('Error getting button component:', error);
}

console.log('\n--- Testing component filtering by category ---');
const shadcnComponents = components.filter((c) => c.category === 'shadcn');
const makerkitComponents = components.filter((c) => c.category === 'makerkit');
const utilsComponents = components.filter((c) => c.category === 'utils');

console.log(`Shadcn components: ${shadcnComponents.length}`);
console.log(`Makerkit components: ${makerkitComponents.length}`);
console.log(`Utils components: ${utilsComponents.length}`);

console.log('\n--- Sample components by category ---');
console.log(
  'Shadcn:',
  shadcnComponents
    .slice(0, 3)
    .map((c) => c.name)
    .join(', '),
);
console.log(
  'Makerkit:',
  makerkitComponents
    .slice(0, 3)
    .map((c) => c.name)
    .join(', '),
);
console.log('Utils:', utilsComponents.map((c) => c.name).join(', '));

console.log('\n--- Testing error handling ---');
try {
  await ComponentsTool.getComponentContent('non-existent-component');
} catch (error) {
  console.log(
    'Expected error for non-existent component:',
    error instanceof Error ? error.message : String(error),
  );
}

console.log('\n=== Testing ScriptsTool ===');

console.log('\n--- Getting all scripts ---');
const scripts = await ScriptsTool.getScripts();
console.log(`Found ${scripts.length} scripts:`);

console.log('\n--- Critical and High importance scripts ---');
const importantScripts = scripts.filter(
  (s) => s.importance === 'critical' || s.importance === 'high',
);
importantScripts.forEach((script) => {
  const healthcheck = script.healthcheck ? ' [HEALTHCHECK]' : '';
  console.log(
    `- ${script.name} (${script.importance})${healthcheck}: ${script.description}`,
  );
});

console.log('\n--- Healthcheck scripts (code quality) ---');
const healthcheckScripts = scripts.filter((s) => s.healthcheck);
console.log('Scripts that should be run after writing code:');
healthcheckScripts.forEach((script) => {
  console.log(`- pnpm ${script.name}: ${script.usage}`);
});

console.log('\n--- Scripts by category ---');
const categories = [...new Set(scripts.map((s) => s.category))];
categories.forEach((category) => {
  const categoryScripts = scripts.filter((s) => s.category === category);
  console.log(`${category}: ${categoryScripts.map((s) => s.name).join(', ')}`);
});

console.log('\n--- Testing script details ---');
try {
  const typecheckDetails = await ScriptsTool.getScriptDetails('typecheck');
  console.log('Typecheck script details:');
  console.log(`  Command: ${typecheckDetails.command}`);
  console.log(`  Importance: ${typecheckDetails.importance}`);
  console.log(`  Healthcheck: ${typecheckDetails.healthcheck}`);
  console.log(`  Usage: ${typecheckDetails.usage}`);
} catch (error) {
  console.error('Error getting typecheck details:', error);
}

console.log('\n--- Testing error handling for scripts ---');
try {
  await ScriptsTool.getScriptDetails('non-existent-script');
} catch (error) {
  console.log(
    'Expected error for non-existent script:',
    error instanceof Error ? error.message : String(error),
  );
}
console.log('\n=== Testing New ComponentsTool Features ===');
|
||||
|
||||
console.log('\n--- Testing component search ---');
|
||||
const buttonSearchResults = await ComponentsTool.searchComponents('button');
|
||||
console.log(`Search for "button": ${buttonSearchResults.length} results`);
|
||||
buttonSearchResults.forEach((component) => {
|
||||
console.log(` - ${component.name}: ${component.description}`);
|
||||
});
|
||||
|
||||
console.log('\n--- Testing search by category ---');
|
||||
const shadcnSearchResults = await ComponentsTool.searchComponents('shadcn');
|
||||
console.log(
|
||||
`Search for "shadcn": ${shadcnSearchResults.length} results (showing first 3)`,
|
||||
);
|
||||
shadcnSearchResults.slice(0, 3).forEach((component) => {
|
||||
console.log(` - ${component.name}`);
|
||||
});
|
||||
|
||||
console.log('\n--- Testing search by description keyword ---');
|
||||
const formSearchResults = await ComponentsTool.searchComponents('form');
|
||||
console.log(`Search for "form": ${formSearchResults.length} results`);
|
||||
formSearchResults.forEach((component) => {
|
||||
console.log(` - ${component.name}: ${component.description}`);
|
||||
});
|
||||
|
||||
console.log('\n--- Testing component props extraction ---');
|
||||
try {
|
||||
console.log('\n--- Button component props ---');
|
||||
const buttonProps = await ComponentsTool.getComponentProps('button');
|
||||
console.log(`Component: ${buttonProps.componentName}`);
|
||||
console.log(`Interfaces: ${buttonProps.interfaces.join(', ')}`);
|
||||
console.log(`Props (${buttonProps.props.length}):`);
|
||||
buttonProps.props.forEach((prop) => {
|
||||
const optional = prop.optional ? '?' : '';
|
||||
console.log(` - ${prop.name}${optional}: ${prop.type}`);
|
||||
});
|
||||
if (buttonProps.variants) {
|
||||
console.log('Variants:');
|
||||
Object.entries(buttonProps.variants).forEach(([variantName, options]) => {
|
||||
console.log(` - ${variantName}: ${options.join(' | ')}`);
|
||||
});
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Error getting button props:', error);
|
||||
}
|
||||
|
||||
console.log('\n--- Testing simpler component props ---');
|
||||
try {
|
||||
const ifProps = await ComponentsTool.getComponentProps('if');
|
||||
console.log(`Component: ${ifProps.componentName}`);
|
||||
console.log(`Interfaces: ${ifProps.interfaces.join(', ')}`);
|
||||
console.log(`Props count: ${ifProps.props.length}`);
|
||||
if (ifProps.props.length > 0) {
|
||||
ifProps.props.forEach((prop) => {
|
||||
const optional = prop.optional ? '?' : '';
|
||||
console.log(` - ${prop.name}${optional}: ${prop.type}`);
|
||||
});
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Error getting if component props:', error);
|
||||
}
|
||||
|
||||
console.log('\n--- Testing search with no results ---');
|
||||
const noResults = await ComponentsTool.searchComponents('xyz123nonexistent');
|
||||
console.log(`Search for non-existent: ${noResults.length} results`);
|
||||
|
||||
console.log('\n--- Testing props extraction error handling ---');
|
||||
try {
|
||||
await ComponentsTool.getComponentProps('non-existent-component');
|
||||
} catch (error) {
|
||||
console.log(
|
||||
'Expected error for non-existent component props:',
|
||||
error instanceof Error ? error.message : String(error),
|
||||
);
|
||||
}
|
||||
|
||||
console.log('\n=== Testing DatabaseTool ===');

console.log('\n--- Getting schema files ---');
const schemaFiles = await DatabaseTool.getSchemaFiles();
console.log(`Found ${schemaFiles.length} schema files:`);
schemaFiles.slice(0, 5).forEach((file) => {
  console.log(` - ${file.name}: ${file.section}`);
});

console.log('\n--- Getting database functions ---');
const dbFunctions = await DatabaseTool.getFunctions();
console.log(`Found ${dbFunctions.length} database functions:`);
dbFunctions.forEach((func) => {
  const security = func.securityLevel === 'definer' ? ' [DEFINER]' : '';
  console.log(` - ${func.name}${security}: ${func.purpose}`);
});

console.log('\n--- Testing function search ---');
const authFunctions = await DatabaseTool.searchFunctions('auth');
console.log(`Functions related to "auth": ${authFunctions.length}`);
authFunctions.forEach((func) => {
  console.log(` - ${func.name}: ${func.purpose}`);
});

console.log('\n--- Testing function search by security ---');
const definerFunctions = await DatabaseTool.searchFunctions('definer');
console.log(`Functions with security definer: ${definerFunctions.length}`);
definerFunctions.forEach((func) => {
  console.log(` - ${func.name}: ${func.purpose}`);
});

console.log('\n--- Testing function details ---');
if (dbFunctions.length > 0) {
  try {
    const firstFunction = dbFunctions[0];
    if (firstFunction) {
      const functionDetails = await DatabaseTool.getFunctionDetails(
        firstFunction.name,
      );
      console.log(`Details for ${functionDetails.name}:`);
      console.log(` Purpose: ${functionDetails.purpose}`);
      console.log(` Return Type: ${functionDetails.returnType}`);
      console.log(` Security: ${functionDetails.securityLevel}`);
      console.log(` Parameters: ${functionDetails.parameters.length}`);
      functionDetails.parameters.forEach((param) => {
        const defaultVal = param.defaultValue
          ? ` (default: ${param.defaultValue})`
          : '';
        console.log(` - ${param.name}: ${param.type}${defaultVal}`);
      });
    }
  } catch (error) {
    console.error('Error getting function details:', error);
  }
}

console.log('\n--- Testing function search with no results ---');
const noFunctionResults =
  await DatabaseTool.searchFunctions('xyz123nonexistent');
console.log(
  `Search for non-existent function: ${noFunctionResults.length} results`,
);

console.log('\n--- Testing function details error handling ---');
try {
  await DatabaseTool.getFunctionDetails('non-existent-function');
} catch (error) {
  console.log(
    'Expected error for non-existent function:',
    error instanceof Error ? error.message : String(error),
  );
}

console.log('\n=== Testing Enhanced DatabaseTool Features ===');

console.log('\n--- Testing direct schema content access ---');
try {
  const accountsSchemaContent =
    await DatabaseTool.getSchemaContent('03-accounts.sql');
  console.log('Accounts schema content length:', accountsSchemaContent.length);
  console.log('First 200 characters:', accountsSchemaContent.substring(0, 200));
} catch (error) {
  console.error(
    'Error getting accounts schema content:',
    error instanceof Error ? error.message : String(error),
  );
}

console.log('\n--- Testing schema search by topic ---');
const authSchemas = await DatabaseTool.getSchemasByTopic('auth');
console.log(`Schemas related to "auth": ${authSchemas.length}`);
authSchemas.forEach((schema) => {
  console.log(` - ${schema.name} (${schema.topic}): ${schema.section}`);
  if (schema.functions.length > 0) {
    console.log(` Functions: ${schema.functions.join(', ')}`);
  }
});

console.log('\n--- Testing schema search by topic - billing ---');
const billingSchemas = await DatabaseTool.getSchemasByTopic('billing');
console.log(`Schemas related to "billing": ${billingSchemas.length}`);
billingSchemas.forEach((schema) => {
  console.log(` - ${schema.name}: ${schema.description}`);
  if (schema.tables.length > 0) {
    console.log(` Tables: ${schema.tables.join(', ')}`);
  }
});

console.log('\n--- Testing schema search by topic - accounts ---');
const accountSchemas = await DatabaseTool.getSchemasByTopic('accounts');
console.log(`Schemas related to "accounts": ${accountSchemas.length}`);
accountSchemas.forEach((schema) => {
  console.log(` - ${schema.name}: ${schema.description}`);
  if (schema.dependencies.length > 0) {
    console.log(` Dependencies: ${schema.dependencies.join(', ')}`);
  }
});

console.log('\n--- Testing schema by section lookup ---');
try {
  const accountsSection = await DatabaseTool.getSchemaBySection('Accounts');
  if (accountsSection) {
    console.log(`Found section: ${accountsSection.section}`);
    console.log(`File: ${accountsSection.name}`);
    console.log(`Topic: ${accountsSection.topic}`);
    console.log(`Tables: ${accountsSection.tables.join(', ')}`);
    console.log(`Last modified: ${accountsSection.lastModified.toISOString()}`);
  }
} catch (error) {
  console.error('Error getting accounts section:', error);
}

console.log('\n--- Testing enhanced schema metadata ---');
const enhancedSchemas = await DatabaseTool.getSchemaFiles();
console.log(`Total schemas with metadata: ${enhancedSchemas.length}`);

// Show schemas with the most tables
const schemasWithTables = enhancedSchemas.filter((s) => s.tables.length > 0);
console.log(`Schemas with tables: ${schemasWithTables.length}`);
schemasWithTables.slice(0, 3).forEach((schema) => {
  console.log(
    ` - ${schema.name}: ${schema.tables.length} tables (${schema.tables.join(', ')})`,
  );
});

// Show schemas with functions
const schemasWithFunctions = enhancedSchemas.filter(
  (s) => s.functions.length > 0,
);
console.log(`Schemas with functions: ${schemasWithFunctions.length}`);
schemasWithFunctions.slice(0, 3).forEach((schema) => {
  console.log(
    ` - ${schema.name}: ${schema.functions.length} functions (${schema.functions.join(', ')})`,
  );
});

// Show topic distribution
const topicCounts = enhancedSchemas.reduce(
  (acc, schema) => {
    acc[schema.topic] = (acc[schema.topic] || 0) + 1;
    return acc;
  },
  {} as Record<string, number>,
);

console.log('\n--- Topic distribution ---');
Object.entries(topicCounts).forEach(([topic, count]) => {
  console.log(` - ${topic}: ${count} files`);
});

console.log('\n--- Testing error handling for enhanced features ---');
try {
  await DatabaseTool.getSchemaContent('non-existent-schema.sql');
} catch (error) {
  console.log(
    'Expected error for non-existent schema:',
    error instanceof Error ? error.message : String(error),
  );
}

try {
  const nonExistentSection =
    await DatabaseTool.getSchemaBySection('NonExistentSection');
  console.log('Non-existent section result:', nonExistentSection);
} catch (error) {
  console.error('Unexpected error for non-existent section:', error);
}

const emptyTopicResults =
  await DatabaseTool.getSchemasByTopic('xyz123nonexistent');
console.log(
  `Search for non-existent topic: ${emptyTopicResults.length} results`,
);

@@ -6,8 +6,8 @@
     "noEmit": false,
     "strict": false,
     "target": "ES2022",
-    "module": "commonjs",
-    "moduleResolution": "node"
+    "module": "nodenext",
+    "moduleResolution": "nodenext"
   },
   "files": ["src/index.ts"],
   "exclude": ["node_modules"]

@@ -59,10 +59,9 @@ export const createNoteAction = enhanceAction(

 ```typescript
 export const myAction = enhanceAction(
-  async function (data, user, requestData) {
+  async function (data, user) {
     // data: validated input data
     // user: authenticated user (if auth: true)
-    // requestData: additional request information

     return { success: true };
   },
@@ -167,6 +166,11 @@ export const POST = enhanceRouteHandler(
 );
 ```

+## Revalidation
+
+- Use `revalidatePath` for revalidating data after a mutation.
+- Avoid calling `router.refresh()` or `router.push()` following a Server Action. Use `revalidatePath` and `redirect` from the server action instead.
+
 ## Error Handling Patterns

 ### Server Actions with Error Handling
@@ -201,8 +205,10 @@ export const createNoteAction = enhanceAction(

       return { success: true, note };
     } catch (error) {
-      logger.error({ ...ctx, error }, 'Create note action failed');
-      throw error;
+      if (!isRedirectError(error)) {
+        logger.error({ ...ctx, error }, 'Create note action failed');
+        throw error;
+      }
     }
   },
   {
@@ -212,6 +218,26 @@ export const createNoteAction = enhanceAction(
 );
 ```

+### Server Action Redirects - Client Handling
+
+When server actions call `redirect()`, it throws a special error that should NOT be treated as a failure:
+
+```typescript
+import { isRedirectError } from 'next/dist/client/components/redirect-error';
+
+async function handleSubmit(formData: FormData) {
+  try {
+    await myServerAction(formData);
+  } catch (error) {
+    // Don't treat redirects as errors
+    if (!isRedirectError(error)) {
+      // Handle actual errors
+      toast.error('Something went wrong');
+    }
+  }
+}
+```
+
 ### Route Handler with Error Handling

 ```typescript
@@ -307,6 +333,8 @@ function CreateNoteForm() {
 }
 ```

+NB: When using `redirect`, we must handle it using `isRedirectError`, otherwise we display an error after the server action succeeds.
+
 ### Using Route Handlers with Fetch

 ```typescript
@@ -420,16 +448,4 @@ export const deleteAccountAction = enhanceAction(
     schema: DeleteAccountSchema,
   },
 );
 ```
-
-## Middleware Integration
-
-The `enhanceAction` and `enhanceRouteHandler` utilities integrate with the application middleware for:
-
-- CSRF protection
-- Authentication verification
-- Request logging
-- Error handling
-- Input validation
-
-This ensures consistent security and monitoring across all server actions and API routes.
@@ -2,6 +2,23 @@

 This file contains instructions for working with Supabase, database security, and authentication.

+## Schemas and Migrations ⚠️
+
+**Critical Understanding**: Schema files are NOT automatically applied to the database!
+
+- **Schemas** (`supabase/schemas/`) represent the desired database state (source of truth)
+- **Migrations** (`supabase/migrations/`) are the actual SQL commands that modify the database
+
+### The Required Workflow
+
+1. **Edit schema file** (e.g., `supabase/schemas/18-projects.sql`)
+2. **Generate migration**: `pnpm --filter web supabase:db:diff -f migration_name`
+   - This compares your schema against the current database and creates a migration
+3. **Apply migration**: `pnpm --filter web supabase migration up`
+   - This actually executes the SQL changes in the database
+
+**⚠️ CRITICAL**: Editing a schema file alone does NOTHING to your database. You MUST generate and apply a migration for changes to take effect. Schema files are templates - migrations are the actual database operations.
+
 ## Database Security Guidelines ⚠️

 **Critical Security Guidelines - Read Carefully!**
@@ -98,22 +115,8 @@ CREATE POLICY "notes_manage" ON public.notes FOR ALL
 );
 ```

-## Schema Management Workflow
-
-1. Create schemas in `apps/web/supabase/schemas/` as `<number>-<name>.sql`
-2. After changes: `pnpm supabase:web:stop`
-3. Run: `pnpm --filter web run supabase:db:diff -f <filename>`
-4. Restart: `pnpm supabase:web:start` and `pnpm supabase:web:reset`
-5. Generate types: `pnpm supabase:web:typegen`
-
-- **Never modify database.types.ts**: Instead, use the Supabase CLI using our package.json scripts to re-generate the types after resetting the DB

-### Key Schema Files
-
-- Accounts: `apps/web/supabase/schemas/03-accounts.sql`
-- Memberships: `apps/web/supabase/schemas/05-memberships.sql`
-- Permissions: `apps/web/supabase/schemas/06-roles-permissions.sql`

 ## Type Generation

 ```typescript
@@ -296,7 +299,7 @@ async function databaseOperation() {
 ## Migration Best Practices

 1. Always test migrations locally first
-2. Use transactions for complex migrations
+2. Use transactions for complex operations
 3. Add proper indexes for new columns
 4. Update RLS policies when adding new tables
 5. Generate TypeScript types after schema changes
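The migration best practices above (transactions, indexes on new columns, RLS on new tables) can be illustrated with a small SQL sketch. The `documents` table, the `archived_at` column, and the policy name below are hypothetical, not part of this codebase:

```sql
-- Hypothetical migration sketch: one transaction covering a new column,
-- its index, and an RLS policy for the (illustrative) documents table.
begin;

alter table public.documents
  add column if not exists archived_at timestamptz;

create index if not exists idx_documents_archived_at
  on public.documents (archived_at);

alter table public.documents enable row level security;

create policy "documents_read" on public.documents
  for select
  to authenticated
  using (account_id = (select auth.uid()));

commit;
```

Running the whole change inside `begin`/`commit` means a failure in any statement rolls back all of them, which is what practice 2 is guarding against.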
9 pnpm-lock.yaml generated
@@ -1144,6 +1144,9 @@ importers:
       '@types/node':
         specifier: ^24.5.0
         version: 24.5.0
+      postgres:
+        specifier: 3.4.7
+        version: 3.4.7
       zod:
         specifier: ^3.25.74
         version: 3.25.76
@@ -7618,6 +7621,10 @@ packages:
     resolution: {integrity: sha512-9ZhXKM/rw350N1ovuWHbGxnGh/SNJ4cnxHiM0rxE4VN41wsg8P8zWn9hv/buK00RP4WvlOyr/RBDiptyxVbkZQ==}
     engines: {node: '>=0.10.0'}

+  postgres@3.4.7:
+    resolution: {integrity: sha512-Jtc2612XINuBjIl/QTWsV5UvE8UHuNblcO3vVADSrKsrc6RqGX6lOW1cEo3CM2v0XG4Nat8nI+YM7/f26VxXLw==}
+    engines: {node: '>=12'}
+
   prelude-ls@1.2.1:
     resolution: {integrity: sha512-vkcDPrRZo1QZLbn5RLGPpg/WmIQ65qoWWhcGKf/b5eplkkarX0m9z8ppCat4mlOqUsWpyNuYgO3VRyrYHSzX5g==}
     engines: {node: '>= 0.8.0'}
@@ -16574,6 +16581,8 @@ snapshots:
     dependencies:
       xtend: 4.0.2

+  postgres@3.4.7: {}
+
   prelude-ls@1.2.1: {}

   prettier-plugin-tailwindcss@0.6.14(@trivago/prettier-plugin-sort-imports@5.2.2(prettier@3.6.2))(prettier@3.6.2):