I was working on a SvelteKit-based local cloud project. I know there are already solutions out there that work out of the box, but I wanted to try building my own version from scratch. One of the key features is file uploading, and I had a tough time collecting all the information piece by piece. So, I hope you find my solution helpful.
Our topics for today:

- Problems with FormData approach
- Upload file on client side
- Receive file as stream
- Store data memory efficiently
Problems with FormData approach
The usual way to upload files is to use a FormData object as the body and send it to the server. SvelteKit offers a simple way to parse and access the data. But there is a downside: the entire payload is loaded into memory before it is processed.
```ts
export const POST = (async ({ params, request }) => {
	const formData = await request.formData();
	// ...
}) satisfies RequestHandler;
```
This is fine if you handle text values like emails, passwords, and other strings. Even small files pose no problems. But once a user uploads a 1GB file, the server loads the entire 1GB into memory. Also, while we wait for the transfer to complete, we do nothing with the chunks we have already received. This is unacceptable.
A better way to handle large file uploads in SvelteKit is to use streams. This will significantly improve memory efficiency and reduce upload time by processing data chunks as soon as they are ready. So, we are going to use a raw file object in the request body of the client.
**Side note:** It is possible to wrap the file in a FormData object and still process it as a stream, but then you need to extract the real file data from the multipart encoding while receiving only chunks of it. Basically, you have to look for certain boundary sequences in the data chunks to know where the real file begins and ends. If you feel daring, you can check out this thread to learn more about the concept.
Upload file on client side
First, we need to implement the uploading logic on the client side. For this we use the HTML input element. I like to customize the style and behaviour by creating a custom click handler:
```svelte
<button on:click={onSelectFiles}>Upload</button>
```

```ts
function onSelectFiles() {
	// Create an input element and simulate clicking on it
	const input = document.createElement('input');
	input.type = 'file';
	// input.accept = 'image/*';
	// input.multiple = false;

	// Will be called once you click 'open' in the file-selector window
	input.onchange = async () => {
		const files = input.files;
		if (!files || files.length === 0) return;
		// Upload - Continue reading ^^
		uploadFile(files[0]);
	};

	// Register the handler before triggering the click
	input.click();

	// Cleanup
	input.remove();
}
```
With this, the user can click a button and select a file. To allow only certain file types, set `input.accept`. If you want to upload multiple files, set `input.multiple = true` and loop over the selected files, calling `uploadFile(files[i])` for each one.
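Such a loop can be sketched like this. It is a minimal, generic sketch: the `upload` callback stands in for the `uploadFile` helper, and the files are uploaded sequentially so we don't open one request per file all at once:

```typescript
// Minimal sketch: upload items one after another. The upload callback
// stands in for the uploadFile() helper from this article.
async function uploadAll<T>(
	files: readonly T[],
	upload: (file: T) => Promise<void>
): Promise<number> {
	let uploaded = 0;
	for (const file of files) {
		await upload(file); // wait for each upload before starting the next
		uploaded++;
	}
	return uploaded;
}

const count = await uploadAll(['a.txt', 'b.txt'], async () => {});
console.log(count); // 2
```

Sequential uploads keep memory and connection usage predictable; if you need more throughput you could batch a few in parallel with `Promise.all`, at the cost of more simultaneous server-side streams.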
Now we can implement the actual upload method. We do this by using `fetch` with the POST method:
```ts
async function uploadFile(file: File) {
	const res = await fetch(`/api/upload/${file.name}`, {
		method: 'POST',
		body: file
	});

	if (res.ok) console.log('File uploaded successfully');
	else console.error('Failed to upload file');
}
```
I like to use SvelteKit's rest parameter routing, because you can easily recreate the path where the file is stored on disk from the URL path. Alternatively, you can send the filename in a custom header.
If you want to display a progress bar to your users, you can use an `XMLHttpRequest` with a progress event. Unfortunately, I find it to be slower than `fetch`. Another idea is to use a real stream as the request body, but that requires HTTP/2, and SvelteKit does not support it out of the box. With a custom server you could enable HTTP/2, although it is a lot of extra work.
Receive file as stream
Let’s look at the server side. Your route should look something like `src/routes/api/upload/[...path]/+server.ts`. You can change the path to your liking, but `[...path]` is a rest parameter that we use to send the filename. You can also send a relative path with subfolders for your storage system with it. Be aware that you need to verify the legitimacy of the path.
In the `+server.ts` file we define a POST handler:
```ts
import type { RequestHandler } from '@sveltejs/kit';

// Create a new file on the server
export const POST = (async ({ params, request }) => {
	const stream = request.body;
	// Use split('/') if the file is inside a folder
	const filename = params.path;

	// Data validation
	if (!filename) return new Response('No path received', { status: 400 });
	if (!stream) return new Response('No body received', { status: 400 });

	// Stream logic - Continue reading ^^
	const success = ...

	return new Response(null, { status: success ? 200 : 500 });
}) satisfies RequestHandler;
```
The body is a `ReadableStream`, and we can use it to work efficiently with incoming data chunks. The filename is stored in `params.path`. If you want to use subfolders, you need to split the parameter and join the segments again with `/` or `\` depending on the platform, ideally via the `path` module. (Windows, why backslashes…)
Then we validate our data. You should definitely do more here for security reasons, especially against path traversal.
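As an illustration of such a check, here is a minimal sketch of a path traversal guard. The `STORAGE_ROOT` constant and the `resolveSafe` helper are hypothetical names for this article, not part of SvelteKit:

```typescript
import path from 'node:path';

// Hypothetical helper (not part of SvelteKit): resolve the requested
// relative path against a storage root and reject anything that escapes
// it, e.g. "../../etc/passwd".
const STORAGE_ROOT = path.resolve('./storage'); // assumed storage directory

function resolveSafe(requested: string): string | null {
	const target = path.resolve(STORAGE_ROOT, requested);
	// path.relative() starts with '..' when the target lies outside the root
	const rel = path.relative(STORAGE_ROOT, target);
	if (rel.startsWith('..') || path.isAbsolute(rel)) return null;
	return target;
}

console.log(resolveSafe('photos/cat.png') !== null); // true: stays inside the root
console.log(resolveSafe('../secret.txt')); // null: traversal rejected
```

In the POST handler you would call `resolveSafe(filename)` and return a 400 response when it yields `null`.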
Now comes the important part: reading the body stream.
```ts
const writeableStream = new WritableStream<Uint8Array>({
	start() {
		console.log('Stream started!');
	},
	write(chunk: Uint8Array) {
		// Handle chunk - Continue reading ^^
	},
	close() {
		console.log('Stream closed');
	},
	abort() {
		console.log('Stream aborted');
	}
});

// Promisify and wait for the stream to finish
const success = await new Promise<boolean>((resolve) =>
	stream
		.pipeTo(writeableStream) // Pipe it!
		.then(() => resolve(true))
		.catch(() => resolve(false))
);
```
We use a `WritableStream` and pipe the body into it. The benefit is that we only need to define the event callbacks and everything else is handled under the hood. You can also call `stream.getReader().read()` in a loop, but I find the pipe mechanism more reliable.
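To see the mechanics in isolation, here is a self-contained sketch that pipes a synthetic `ReadableStream` (standing in for `request.body`) into a `WritableStream` and counts the bytes as they arrive:

```typescript
// Synthetic ReadableStream standing in for request.body
const encoder = new TextEncoder();
const source = new ReadableStream<Uint8Array>({
	start(controller) {
		controller.enqueue(encoder.encode('hello '));
		controller.enqueue(encoder.encode('world'));
		controller.close();
	}
});

let received = 0;
const sink = new WritableStream<Uint8Array>({
	write(chunk) {
		received += chunk.byteLength; // process the chunk, then forget it
	}
});

const success = await source
	.pipeTo(sink)
	.then(() => true)
	.catch(() => false);

console.log(success, received); // true 11
```

The `write` callback sees each chunk exactly once; nothing is accumulated unless you accumulate it yourself.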
Also, we promisify the pipe and wait until the stream has finished. It is important to `catch` errors, because what can go wrong while a user uploads a 1GB file over their internet connection? Literally everything.
Important: You probably need to adjust the `BODY_SIZE_LIMIT` environment variable for SvelteKit (and `client_max_body_size` for NGINX).
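With adapter-node, for example, the limit is read from the environment when you start the built server; the 1GB value below is just an example:

```shell
# adapter-node reads BODY_SIZE_LIMIT (in bytes) at startup
BODY_SIZE_LIMIT=1073741824 node build
```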
If you followed along up to this point, check whether `console.log(chunk)` inside `write` prints data.
Store data memory efficiently
The benefit of streams is that we can receive a data chunk, process it, and then simply forget about it. We don't keep it in memory; we pass it on.
You can use these chunks however you want. For my use case I want to store them on a hard drive within the same system. We will use `fs.createWriteStream` from Node's `fs` module:
```ts
import fs from 'node:fs';

// Create a new file at ./filename - you will need to change the path
const diskStream = fs.createWriteStream(filename);
diskStream.on('error', (err) => {
	console.log('Stream error:', err);
});

const writeableStream = new WritableStream<Uint8Array>({
	start() {
		console.log('Stream started');
	},
	write(chunk: Uint8Array) {
		diskStream.write(chunk);
	},
	close() {
		console.log('Stream closed');
		diskStream.end();
	},
	abort() {
		console.log('Stream aborted');
		diskStream.end();
		// You might want to remove the unfinished file
	}
});
```
Fortunately, upon receiving an event (`write`, `close`, `abort`) we only need to pass it on by calling the corresponding method on the disk stream. Also, it is best to register an `on('error')` handler, because if one upload throws an error it is good practice not to crash everything.
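One refinement worth considering: `diskStream.write()` returns `false` when its internal buffer is full, and the sketch above ignores that. Here is a hedged variant (the `diskSink` helper is my own name, not from any library) that honors backpressure by returning a promise from `write` until the `'drain'` event fires:

```typescript
import fs from 'node:fs';
import os from 'node:os';
import path from 'node:path';

// Sketch: a WritableStream that respects the Node stream's backpressure.
// When diskStream.write() returns false, we wait for 'drain' before
// accepting the next chunk, so fast uploads cannot flood the write buffer.
function diskSink(filePath: string): WritableStream<Uint8Array> {
	const diskStream = fs.createWriteStream(filePath);
	return new WritableStream<Uint8Array>({
		write(chunk) {
			if (diskStream.write(chunk)) return;
			return new Promise<void>((resolve, reject) => {
				diskStream.once('drain', () => resolve());
				diskStream.once('error', reject);
			});
		},
		close() {
			return new Promise<void>((resolve, reject) => {
				diskStream.once('error', reject);
				diskStream.end(() => resolve());
			});
		},
		abort() {
			diskStream.destroy(); // you might also want to delete the partial file
		}
	});
}

// Usage with a synthetic stream standing in for request.body
const tmpFile = path.join(os.tmpdir(), 'upload-demo.bin');
const data = new ReadableStream<Uint8Array>({
	start(controller) {
		controller.enqueue(new TextEncoder().encode('chunked upload'));
		controller.close();
	}
});
await data.pipeTo(diskSink(tmpFile));
```

Returning a promise from `write` makes `pipeTo` pause the source until the disk catches up, which is exactly the memory behaviour we wanted from streams in the first place.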
Side note: It is not possible to call `body.pipe(diskStream)`, because the web streams API is not compatible with Node's `WriteStream`. (Node's `stream.Writable.toWeb()` can bridge the two if you need it.)
Conclusion
To sum it up, with streams we can efficiently handle large file uploads without exhausting our memory. I should mention that accepting file uploads from users is fairly dangerous, and you need to make sure it is safe, because everything that can go wrong will go wrong. But it is a great learning experience to work it out yourself without relying on external packages.
I hope you found my ideas helpful ^^
Thanks for reading!