datopian / r2-bucket-uploader

Cloudflare R2 bucket File Uploader

Nextjs 14 issue with multipart upload?

28development opened this issue · comments

Not sure if it is an issue with Next.js 14, but I am struggling with the multipart upload procedure.
The completeMultipartUpload callback never runs.

Not sure if someone has already tested it with Next.js 14, or if it is just a mistake I am making (the single file upload works fine, though).

This is the component:

import React, { useMemo } from "react";
import Uppy, { type UploadResult } from "@uppy/core";
import { Dashboard } from "@uppy/react";
import { sha256 } from "crypto-hash";
import AwsS3Multipart from "@uppy/aws-s3-multipart";

// Uppy styles
import "@uppy/core/dist/style.min.css";
import "@uppy/dashboard/dist/style.min.css";

const fetchUploadApiEndpoint = async (endpoint: string, data: any) => {
  const res = await fetch(`/api/multipart-upload/${endpoint}`, {
    method: "POST",
    body: JSON.stringify(data),
    headers: {
      accept: "application/json",
      "Content-Type": "application/json",
    },
  });

  return res.json();
};

export function MultipartFileUploader({
  onUploadSuccess,
}: {
  onUploadSuccess: (result: UploadResult) => void;
}) {
  const uppy = useMemo(() => {
    const uppy = new Uppy({
      autoProceed: true,
    }).use(AwsS3Multipart, {
      createMultipartUpload: async (file) => {
        const arrayBuffer = await new Response(file.data).arrayBuffer();
        const fileHash = await sha256(arrayBuffer);
        const contentType = file.type;
        console.log("createMultipartUpload", file, fileHash, contentType);

        return fetchUploadApiEndpoint("create-multipart-upload", {
          file,
          fileHash,
          contentType,
        });
      },
      prepareUploadParts: async (file, partData) => {
        const response = await fetchUploadApiEndpoint("prepare-upload-parts", {
          file,
          partData,
        });

        return {
          presignedUrls: response.presignedUrls,
        };
      },
      completeMultipartUpload: async (file, props) => {
        const response = await fetchUploadApiEndpoint(
          "complete-multipart-upload",
          {
            file,
            ...props,
          }
        );

        console.log("completeMultipartUpload", response);

        return response;
      },
      listParts: async (file, props) => {
        console.log("inside listParts", file, props);
        const response = await fetchUploadApiEndpoint("list-parts", {
          file,
          ...props,
        });

        return response;
      },
      abortMultipartUpload: async (file, props) => {
        console.log("inside abortMultipartUpload", file, props);
        const response = await fetchUploadApiEndpoint(
          "abort-multipart-upload",
          {
            file,
            ...props,
          }
        );

        return response;
      },
    });
    return uppy;
  }, []);

  uppy.on("complete", (result) => {
    onUploadSuccess(result);
  });

  uppy.on("upload-success", (file, response) => {
    console.log("upload-success location", response.body.Location);
    uppy.setFileState(file.id, {
      progress: uppy.getState().files[file.id].progress,
      uploadURL: response.body.Location,
      response: response,
      isPaused: false,
    });
  });

  return <Dashboard uppy={uppy} showLinkToFileUploadResult={true} />;
}

This is what my API routes look like:
(screenshot of the API route folder structure)

This is the prepareUploadParts handler, which runs before the complete step:

import { UploadPartCommand } from "@aws-sdk/client-s3";
import { type NextRequest } from "next/server";
import { R2, R2_BUCKET_NAME } from "@/lib/s3/util";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

export async function POST(req: NextRequest) {
    const { partData } = await req.json();
    const parts = partData.parts;

    const response: { presignedUrls: { [key: string]: string } } = {
        presignedUrls: {},
    };

    for (const part of parts) {
        try {
            const params = {
                Bucket: R2_BUCKET_NAME,
                Key: partData.key,
                PartNumber: part.number,
                UploadId: partData.uploadId,
            };
            const command = new UploadPartCommand({ ...params });
            const url = await getSignedUrl(R2, command, { expiresIn: 3600 });

            response.presignedUrls[part.number] = url;
        } catch (err) {
            console.error("Error", err);
            return Response.json({ err }, { status: 500 });
        }
    }

    return Response.json({
        status: 200,
        response
    });
}
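Since the thread is about completeMultipartUpload never running, it may help to sketch the shape the complete route eventually has to hand to the SDK. This is only a sketch: the helper name toCompleteParams is hypothetical, and it assumes @uppy/aws-s3-multipart's convention of sending parts as { PartNumber, ETag } objects. The result would go into new CompleteMultipartUploadCommand(...) and be sent with the same R2 client as above.

```typescript
// Shape of one uploaded part, as collected from each UploadPart response.
interface UploadedPart {
  PartNumber: number;
  ETag: string;
}

// Build the input for CompleteMultipartUploadCommand. S3/R2 requires the
// parts list in ascending PartNumber order, so sort defensively first.
function toCompleteParams(
  bucket: string,
  key: string,
  uploadId: string,
  parts: UploadedPart[]
) {
  return {
    Bucket: bucket,
    Key: key,
    UploadId: uploadId,
    MultipartUpload: {
      Parts: [...parts].sort((a, b) => a.PartNumber - b.PartNumber),
    },
  };
}
```

Passing this object to new CompleteMultipartUploadCommand(params) and awaiting R2.send(command) is what finalizes the upload on the R2 side.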

Ok, never mind, it turns out this caused it:

return Response.json({
      status: 200,
      response
});

This fixed it:

  return Response.json(response);

It returned undefined because I expected response in the frontend to contain the data, but my route wrapped it in another object, nesting the payload one level deeper. So yeah, it was my fault; the repo still works great 🚀
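The mismatch is easy to reproduce in isolation; a minimal sketch with illustrative values:

```typescript
// What the route originally returned: the payload nested one level
// deeper than the caller expects.
const wrapped = { status: 200, response: { presignedUrls: { 1: "https://..." } } };
const nested = (wrapped as Record<string, unknown>).presignedUrls; // undefined

// Returning the payload directly puts the data where res.json() callers look.
const flat = { presignedUrls: { 1: "https://..." } };
const direct = flat.presignedUrls; // the expected object
```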

@28development does this solution increase bandwidth costs even with a multipart upload setup?

I'm concerned that if we decide to use this solution and upload a 1 TB file, we will be charged for that.

Vercel has a client-upload scenario where it works via tokens and the upload is done by the user in the browser, not by the server.
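For what it's worth, the presigned-URL flow shown above is already a client-upload scenario: the server only signs URLs, while the part bytes are PUT from the browser straight to R2, so the server never proxies the payload. A sketch of what one part upload amounts to on the client (the helper name is hypothetical; @uppy/aws-s3-multipart does the equivalent internally):

```typescript
// Upload one part's bytes directly from the browser to R2 via the presigned
// URL. The server that signed the URL is not involved in this transfer.
async function uploadPart(presignedUrl: string, body: Blob): Promise<string> {
  const res = await fetch(presignedUrl, { method: "PUT", body });
  if (!res.ok) throw new Error(`part upload failed: ${res.status}`);
  // R2 returns the part's ETag in a response header; it must be collected
  // and sent back later for CompleteMultipartUpload.
  return res.headers.get("ETag") ?? "";
}
```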