Refactor file upload and toast notifications (#472)

* Refactor file upload and toast notifications

Replaces react-hot-toast with a custom toast system built on @radix-ui/react-toast, updating all usages and adding new UI components for toasts and dialogs. Refactors file upload into a two-step process: first generating an S3 upload URL, then adding the file to the database. Adds file deletion support with a confirmation dialog and S3 cleanup. Updates the Prisma schema, removes unused fields, and cleans up the navigation and the admin settings page.
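A minimal client-side sketch of the two-step upload flow described above; the operation names (`getUploadUrl`, `addFileToDb`) and payload shapes are illustrative assumptions, not the template's actual API:

```ts
// Hypothetical operation names and shapes -- for illustration only.
type UploadUrlResponse = { uploadUrl: string; s3Key: string };
declare function getUploadUrl(args: { fileName: string; fileType: string }): Promise<UploadUrlResponse>;
declare function addFileToDb(args: { s3Key: string; name: string; type: string }): Promise<void>;

export async function uploadFile(file: File): Promise<void> {
  // Step 1: ask the server for a presigned S3 upload URL.
  const { uploadUrl, s3Key } = await getUploadUrl({ fileName: file.name, fileType: file.type });

  // Upload the file bytes directly to S3 using the presigned URL.
  const res = await fetch(uploadUrl, {
    method: 'PUT',
    headers: { 'Content-Type': file.type },
    body: file,
  });
  if (!res.ok) {
    throw new Error(`S3 upload failed with status ${res.status}`);
  }

  // Step 2: only once the upload succeeded, record the file in the app's database.
  await addFileToDb({ s3Key, name: file.name, type: file.type });
}
```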

* Enforce file upload limit and update dependencies on opensaas-sh

Added a check to restrict users to 2 file uploads in the demo, with a new helper function and error handling. Updated navigation items, improved landing page content, and removed unused dependencies (react-hot-toast). Added @radix-ui/react-toast, updated testimonials, and made minor content and code improvements.
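A sketch of what such a limit check could look like in a Wasp-style operation, assuming the `File` entity has a `userId` field; the constant, helper name, and error details are assumptions rather than the exact opensaas-sh code:

```ts
import { HttpError } from 'wasp/server';

// Illustrative limit for the demo; the real value lives in the demo app's code.
const DEMO_MAX_FILE_UPLOADS = 2;

// Minimal shape of the Prisma File delegate this helper relies on.
type FileCountDelegate = { count: (args: { where: { userId: string } }) => Promise<number> };

export async function assertBelowDemoUploadLimit(userId: string, File: FileCountDelegate): Promise<void> {
  const numUploads = await File.count({ where: { userId } });
  if (numUploads >= DEMO_MAX_FILE_UPLOADS) {
    throw new HttpError(403, `Demo accounts are limited to ${DEMO_MAX_FILE_UPLOADS} file uploads.`);
  }
}
```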

* update tests

* Improve file deletion error handling and cleanup

Refactors file deletion to delete the database record before attempting S3 deletion, ensuring the file is removed from the app even if S3 deletion fails. Adds error logging for failed S3 deletions to aid in manual cleanup. Also simplifies error handling in the file upload page and removes unused imports in the demo app page.

* Add credit check and S3 file existence validation

Added logic to the AI demo app that decrements user credits or throws an error when the user is out of credits. Updated file upload operations to validate that a file exists in S3 before adding it to the database, and implemented an S3 file-existence check utility. Also includes minor UI and code improvements.
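A sketch of how an S3 existence check can be done with the AWS SDK v3 `HeadObjectCommand`; the bucket env var matches the one used in the docs snippet further down, while the region env var and function shape are assumptions:

```ts
import { S3Client, HeadObjectCommand } from '@aws-sdk/client-s3';

// Region/bucket env var names are assumptions for this sketch.
const s3Client = new S3Client({ region: process.env.AWS_S3_REGION });

export async function doesFileExistInS3(s3Key: string): Promise<boolean> {
  try {
    await s3Client.send(
      new HeadObjectCommand({
        Bucket: process.env.AWS_S3_FILES_BUCKET,
        Key: s3Key,
      })
    );
    return true;
  } catch (error: unknown) {
    // HeadObject throws a "NotFound" error (HTTP 404) when the object does not exist.
    const err = error as { name?: string; $metadata?: { httpStatusCode?: number } };
    if (err.name === 'NotFound' || err.$metadata?.httpStatusCode === 404) {
      return false;
    }
    throw error; // Surface unexpected failures (permissions, networking, etc.).
  }
}
```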

* Update s3Utils.ts

* update app_diff

* fix diff

* Update deletions

* Improve toast UI, error handling, and credit messaging

Updated toast action hover style and icon spacing for better UI consistency. Enhanced error handling in file deletion to display specific error messages. Refined credit/subscription error message in GPT response operation for clarity and removed redundant credit decrement logic.
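As an illustration of surfacing a specific deletion error through a toast, here's a sketch assuming a shadcn-style `useToast` hook; the hook's API and the `deleteFile` action signature are assumptions, not the actual custom toast system:

```ts
// Stand-ins for the real toast hook and Wasp action -- illustrative only.
declare function useToast(): {
  toast: (opts: { title: string; description?: string; variant?: 'default' | 'destructive' }) => void;
};
declare function deleteFile(args: { fileId: string }): Promise<void>;

export function useDeleteFileWithToast() {
  const { toast } = useToast();

  return async (fileId: string) => {
    try {
      await deleteFile({ fileId });
      toast({ title: 'File deleted' });
    } catch (error: unknown) {
      // Show the specific error message instead of a generic failure notice.
      const description = error instanceof Error ? error.message : 'Something went wrong.';
      toast({ title: 'Failed to delete file', description, variant: 'destructive' });
    }
  };
}
```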

* Refactor file upload validation and error handling

Replaces error state management with toast notifications for file upload errors and success. Refactors file type validation to use a new ALLOWED_FILE_TYPES_CONST and type AllowedFileTypes. Updates validation logic to throw errors instead of returning error objects, and simplifies type handling across file upload modules.
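A sketch of the `as const` pattern described above; the exact MIME type list and error wording are assumptions:

```ts
// Illustrative list -- the template's actual allowed types may differ.
export const ALLOWED_FILE_TYPES_CONST = ['image/jpeg', 'image/png', 'application/pdf'] as const;

export type AllowedFileTypes = (typeof ALLOWED_FILE_TYPES_CONST)[number];

// Throw on invalid input instead of returning an error object.
export function validateFileType(fileType: string): AllowedFileTypes {
  if (!ALLOWED_FILE_TYPES_CONST.includes(fileType as AllowedFileTypes)) {
    throw new Error(`File type '${fileType}' is not supported.`);
  }
  return fileType as AllowedFileTypes;
}
```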

* Refactor file upload to use s3Key and add cleanup job

Replaces the 'key' field with 's3Key' for file storage references throughout the codebase and database schema. Updates all related logic, types, and API contracts to use 's3Key'. Adds a scheduled job to delete old files from S3 and the database. Cleans up file type validation constants and improves consistency in file upload and download operations.

* add orphaned file clean up

* remove s3 cleanup job from template

Removed it from the template, but added a suggestion for it to the docs.

* Update SettingsPage.tsx

* prettier format

* Update UI, remove unused files

Updated README with deployment and demo details. Removed unused App.tsx and package-lock.json files. Modified Main.css, NavBar constants, file uploading logic, file upload operations, and landing page content sections for improved UI and functionality.

* remove pricing page from isMarketingPage
vincanger
2025-10-15 12:01:08 +02:00
committed by GitHub
parent 57ae0bf5ba
commit 7d36c8f0b1
35 changed files with 2586 additions and 1351 deletions


@@ -143,6 +143,116 @@ To begin customizing file uploads, it's important to know where everything lives i
- The `getAllFilesByUser` query fetches the information for all files uploaded by the user. Note that the files themselves are not stored in the app's database; only each file's metadata is, such as its name and its `key`, which is used to fetch the file from S3.
- The `getDownloadFileSignedURL` query fetches a presigned URL for downloading a file from S3, using the file's `key` stored in the app's database (see the usage sketch below).
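For reference, here's a hypothetical client-side sketch of calling these two queries directly; the argument shape of `getDownloadFileSignedURL` and the returned fields are assumptions for illustration, not the template's exact API:
```ts
import { getAllFilesByUser, getDownloadFileSignedURL } from 'wasp/client/operations';

// Fetch the user's files and get a temporary download URL for the first one.
export async function getFirstFileDownloadUrl(): Promise<string | undefined> {
  const files = await getAllFilesByUser();
  if (files.length === 0) {
    return undefined;
  }
  // Argument shape assumed for this sketch.
  return getDownloadFileSignedURL({ key: files[0].key });
}
```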
### Cleaning up "orphaned" files in S3
In the current logic, a file is first deleted from the app's database before we attempt to delete it from S3. If the S3 deletion were to fail for some reason, the file would remain in S3 without a corresponding database record, i.e. it would be orphaned:
```ts
// src/file-upload/operations.ts
export const deleteFile: DeleteFile<DeleteFileInput, File> = async (args, context) => {
  // Delete the DB record first so the file disappears from the app even if S3 cleanup fails.
  const deletedFile = await context.entities.File.delete({ where: { id: args.fileId } });
  try {
    await deleteFileFromS3({ s3Key: deletedFile.s3Key });
  } catch (error) {
    console.error(`S3 deletion failed. Orphaned file s3Key: ${deletedFile.s3Key}`, error);
  }
  return deletedFile;
};
```
To clean up these orphaned files, you could add a cleanup job that runs at an interval of your choosing to:
1. Fetch all file keys from S3
2. Fetch all file keys from the app's database
3. Compare the two lists and delete any files from S3 that are not in the database
Here's an example of how you could implement this:
```ts
// .wasp config file
job cleanUpOrphanedFilesS3Job {
  executor: PgBoss,
  perform: {
    fn: import { cleanUpOrphanedFilesS3 } from "@src/file-upload/workers"
  },
  schedule: {
    cron: "0 5 * * 0" // every week on Sunday at 5am
  },
  entities: [File]
}
```
```ts
// src/file-upload/workers.ts
import type { CleanUpOrphanedFilesS3Job } from 'wasp/server/jobs';
import { s3Client, deleteFileFromS3 } from './s3Utils';
import { ListObjectsV2Command, ListObjectsV2CommandOutput } from '@aws-sdk/client-s3';

export const cleanUpOrphanedFilesS3: CleanUpOrphanedFilesS3Job<never, void> = async (
  _args,
  context
) => {
  const allFileKeysFromS3 = await fetchAllFileKeysFromS3();
  const allFileKeysFromDb = await context.entities.File.findMany({
    select: { s3Key: true },
  });
  await findAndDeleteOrphanedFilesInS3(allFileKeysFromS3, allFileKeysFromDb);
};

const fetchAllFileKeysFromS3 = async () => {
  const allS3Keys: string[] = [];
  let continuationToken: string | undefined = undefined;
  do {
    const command = new ListObjectsV2Command({
      Bucket: process.env.AWS_S3_FILES_BUCKET,
      ContinuationToken: continuationToken,
    });
    const response: ListObjectsV2CommandOutput = await s3Client.send(command);
    if (response.Contents) {
      const keys = response.Contents.reduce((acc: string[], object) => {
        if (object.Key) {
          acc.push(object.Key);
        }
        return acc;
      }, []);
      allS3Keys.push(...keys);
    }
    continuationToken = response.NextContinuationToken;
  } while (continuationToken);
  console.log(`Found ${allS3Keys.length} total files in S3`);
  return allS3Keys;
};

const findAndDeleteOrphanedFilesInS3 = async (
  allFileKeysFromS3: string[],
  allFileKeysFromDb: { s3Key: string }[]
) => {
  const s3KeysNotFoundInDb = allFileKeysFromS3.filter(
    (s3Key) => !allFileKeysFromDb.some((file) => file.s3Key === s3Key)
  );
  // Delete files from S3 that are not in the database.
  // If any file deletion fails, the job can continue and pick it up next run.
  const s3DeletionResults = await Promise.allSettled(
    s3KeysNotFoundInDb.map((s3Key) => deleteFileFromS3({ s3Key }))
  );
  const successfulDeletions = s3DeletionResults.filter((result) => result.status === 'fulfilled');
  console.log(
    `Successfully deleted ${successfulDeletions.length} out of ${s3KeysNotFoundInDb.length} orphaned files from S3`
  );
};
```
## Using Multer to upload files to your server
If you're looking to upload files to the app server, you can use the Multer middleware to handle file uploads. This will allow you to store files on your server and is a good option if you need a quick and dirty, free solution for simple file uploads.
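As a rough illustration (not the template's actual setup), a minimal Express + Multer handler might look like this; the route path and storage destination are assumptions:
```ts
import express from 'express';
import multer from 'multer';

const app = express();
// Store uploaded files on the server's local disk under ./uploads.
const upload = multer({ dest: 'uploads/' });

app.post('/api/upload', upload.single('file'), (req, res) => {
  if (!req.file) {
    res.status(400).json({ error: 'No file was uploaded.' });
    return;
  }
  // Multer stores the file on disk and puts its metadata on req.file.
  res.json({ originalName: req.file.originalname, storedAt: req.file.path });
});

app.listen(3001);
```
In a Wasp app, you'd typically attach this middleware to a custom `api` route rather than running a separate Express server.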


@@ -32,8 +32,8 @@ The template comes with:
We've also created a bunch of LLM-friendly documentation:
- [Open SaaS Docs - LLMs.txt](https://docs.opensaas.sh/llms.txt) - Links to the raw text docs.
- **[Open SaaS Docs - LLMs-full.txt](https://docs.opensaas.sh/llms-full.txt) - Complete docs as one text file.** ✅😎
- Coming Soon! ~~[Wasp Docs - LLMs.txt](https://wasp.sh/llms.txt)~~ - Links to the raw text docs.
- Coming Soon! ~~[Wasp Docs - LLMs-full.txt](https://wasp.sh/llms-full.txt)~~ - Complete docs as one text file.
- [Wasp Docs - LLMs.txt](https://wasp.sh/llms.txt) - Links to the raw text docs.
- **[Wasp Docs - LLMs-full.txt](https://wasp.sh/llms-full.txt) - Complete docs as one text file.**
Add these to your AI-assisted IDE settings so you can easily reference them in your chat sessions with the LLM.
**In most cases, you'll want to pass the `llms-full.txt` to the LLM and ask it to help you with a specific task.**