How to Easily Upload Large Files in Parts with Multiple Params
Uploading large files efficiently is essential in web applications. This guide details an approach using JavaScript and a pre-upload hook to split large files into manageable chunks, send them one by one, and then merge them server-side. The method provides support for resumable uploads, progress tracking, and error handling.
Main Steps
Step 1: File Slicing
First, slice the large file into smaller chunks using Blob.prototype.slice.
```javascript
createFileChunk = (dataSource, size = 5 * 1024 * 1024) => {
  const fileChunkList = [];
  let cur = 0, index = 0;

  // Walk the file in fixed-size steps; the last chunk may be smaller
  while (cur < dataSource.size) {
    const chunk = dataSource.slice(cur, cur + size);
    fileChunkList.push({
      hash: `${dataSource.name}_${index++}`,
      file: chunk,
    });
    cur += size;
  }
  return fileChunkList;
};
```
Each part has a unique identifier (a hash based on the file name and index) to ensure proper order during the merge process.
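For example, this could be wired to a file-input change handler (onFileChange and fileChunkList are illustrative names, not part of the original snippets):

```javascript
// Illustrative handler for an <input type="file" /> element
onFileChange = (event) => {
  const file = event.target.files[0];
  if (!file) return;
  // With the 5 MB default, a 100 MB file yields 20 chunks
  this.fileChunkList = this.createFileChunk(file);
};
```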
Step 2: Creating Fetch Requests for Each Chunk
For each chunk, create a function that issues a fetch() request to upload it. This keeps execution flexible and asynchronous.
```javascript
createHttp = (data) => {
  const { hash, file } = data;
  const formData = new FormData();
  formData.append('chunk', file);
  formData.append('hash', hash);

  // Return a function so the upload can be started (and retried) on demand
  return async () => {
    try {
      const response = await fetch(this.props.action, {
        method: 'POST',
        body: formData,
        headers: {
          'Authorization': this.props.token
        }
      });
      if (!response.ok) throw new Error(`Upload failed: ${response.statusText}`);
      return await response.json();
    } catch (error) {
      console.error("Chunk upload failed", error);
      throw error;
    }
  };
};
```
This createHttp function wraps the chunk in FormData, sets the request headers, and returns a function that initiates the upload when called, so each chunk can be managed individually.
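With these two pieces, turning a chunk list into an array of upload functions is a one-liner (a sketch; requests is an illustrative name):

```javascript
// Each element is an async function that uploads one chunk when invoked
const requests = this.createFileChunk(file).map(this.createHttp);
```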
Step 3: Uploading Chunks with Progress Tracking
fetch() has no native upload-progress events, so progress must be handled separately; here we track it at chunk granularity, advancing the overall progress as each chunk completes:
```javascript
uploadWithProgress = async (fileChunkList) => {
  for (const upload of fileChunkList) {
    try {
      const res = await upload();
      console.log("Chunk upload complete", res);
    } catch (err) {
      // Keep failed chunks so they can be retried later
      this.failedChunks.push(upload);
    }
  }
};
```
The uploadWithProgress function iterates over the chunk upload functions and runs them sequentially. If a chunk fails, its upload function is stored in this.failedChunks for a later retry.
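If you need a percentage to drive a progress bar, one option is to count completed chunks. A minimal sketch, assuming a hypothetical onProgress callback prop:

```javascript
uploadWithProgress = async (fileChunkList) => {
  const total = fileChunkList.length;
  let done = 0;
  for (const upload of fileChunkList) {
    try {
      await upload();
      done += 1;
      // onProgress is a hypothetical callback prop; chunk-level
      // granularity is usually enough for a progress bar.
      this.props.onProgress?.(Math.round((done / total) * 100));
    } catch (err) {
      this.failedChunks.push(upload);
    }
  }
};
```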
Step 4: Pausing and Resuming Uploads
With fetch(), an in-flight request cannot simply be suspended, so pausing means stopping the loop between chunks; resuming retries whatever chunks remain.
```javascript
pauseUpload = () => {
  this.isPaused = true;
};

resumeUpload = async () => {
  this.isPaused = false;
  // Snapshot the failed chunks so new failures during the retry
  // can be re-queued without mutating the list being iterated
  const pending = this.failedChunks;
  this.failedChunks = [];
  await this.uploadWithProgress(pending);
};
```
Setting a flag (this.isPaused) lets us control the upload flow: while it is set, uploading stops until resumeUpload is called.
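For the flag to take effect, the loop from Step 3 has to consult it between chunks. A minimal sketch of a pause-aware loop, taking the same list of upload functions:

```javascript
uploadWithProgress = async (fileChunkList) => {
  for (const upload of fileChunkList) {
    // While paused, queue the remaining chunks so that
    // resumeUpload can pick them up from failedChunks
    if (this.isPaused) {
      this.failedChunks.push(upload);
      continue;
    }
    try {
      await upload();
    } catch (err) {
      this.failedChunks.push(upload);
    }
  }
};
```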
Step 5: Merging Chunks on the Server
When all chunks are successfully uploaded, a merge request is sent to the server.
```javascript
mergeChunks = async () => {
  try {
    const response = await fetch(`${this.props.action}/merge`, {
      method: 'POST',
      headers: {
        'Authorization': this.props.token,
        // Declare the JSON body explicitly
        'Content-Type': 'application/json'
      },
      body: JSON.stringify({ fileName: this.fileName })
    });
    if (!response.ok) throw new Error("Merge request failed");
    console.log("File merged successfully");
  } catch (error) {
    console.error("Error merging file", error);
  }
};
```
The server's /merge endpoint combines all uploaded chunks into a single file; the mergeChunks function sends the request and handles any errors.
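Putting the steps together, a top-level handler might look like this (a sketch; handleUpload and its file argument are illustrative, not part of the snippets above):

```javascript
handleUpload = async (file) => {
  this.fileName = file.name;
  this.failedChunks = [];
  this.isPaused = false;

  // Step 1: slice the file; Step 2: wrap each chunk in a request function
  const requests = this.createFileChunk(file).map(this.createHttp);

  // Steps 3-4: upload sequentially, then retry anything that failed
  await this.uploadWithProgress(requests);
  if (this.failedChunks.length > 0) await this.resumeUpload();

  // Step 5: ask the server to assemble the chunks
  if (this.failedChunks.length === 0) await this.mergeChunks();
};
```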
Final Notes
This method demonstrates a general approach to uploading a large file in chunks with JavaScript. The server needs to accept chunk uploads and expose an endpoint for merging the parts. The hash used here simply combines the file name and chunk index; hashing the file contents with a library such as spark-md5 is more robust, since the identifier then stays consistent even if the file is renamed.
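For reference, a minimal sketch of content-based hashing with spark-md5, using its incremental ArrayBuffer API (computeFileHash is an illustrative helper):

```javascript
import SparkMD5 from 'spark-md5';

// Feed each chunk's bytes into spark-md5 to get a hash that
// depends only on the file's contents, not its name
const computeFileHash = async (fileChunkList) => {
  const spark = new SparkMD5.ArrayBuffer();
  for (const { file } of fileChunkList) {
    spark.append(await file.arrayBuffer());
  }
  return spark.end(); // 32-character hex digest
};
```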
This approach extends easily to multi-file uploads, with progress tracking per file or even per chunk.