
Advanced Uploading

This section covers how to use the JS SDK's advanced chunking uploader: a fault-tolerant, resumable, stream-based signer/uploader.


First, create a Bundlr client instance:

const bundlr = new Bundlr("http://node1.bundlr.network", "arweave", wallet);

More extensive client creation examples can be found here.
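
For reference, a minimal NodeJS setup might look like the following sketch. The package name @bundlr-network/client is the SDK's published package; the wallet path is a placeholder assumption:

import Bundlr from "@bundlr-network/client";
import fs from "fs";

// assumption: an Arweave JWK wallet stored at ./wallet.json
const wallet = JSON.parse(fs.readFileSync("./wallet.json", "utf-8"));
const bundlr = new Bundlr("http://node1.bundlr.network", "arweave", wallet);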


This uploader is usable in both the NodeJS and browser (web) versions of the client.

Next, get a new chunked uploader instance:

const uploader = bundlr.uploader.chunkedUploader;

Key terminology

Batch size - the maximum number of chunks to upload at once. Defaults to 5.

Chunk size - the maximum size of a single chunk. Defaults to 25MB.

For those with slower/unstable connections, reducing both should lead to improved reliability. For those with faster connections, increasing both will lead to higher throughput, at the cost of more memory (and CPU).


By default, the uploader has a batch size of 5 and a chunk size of 25MB.

Change the batch size
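
For example (a sketch, assuming the uploader's setBatchSize setter; the value shown is illustrative):

uploader.setBatchSize(2); // upload at most 2 chunks concurrently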


Change the chunk size
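
For example (a sketch, assuming the uploader's setChunkSize setter, which takes a size in bytes):

uploader.setChunkSize(10 * 1024 * 1024); // use 10MB chunks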


Running an upload

The uploader has two modes of operation: data mode and transaction mode.

Data Mode

In data mode, the uploader expects a Readable (stream) or Buffer containing just the data you want to upload.


Do not pass data mode an existing transaction; the uploader will create and sign one for you.

const transactionOptions = {tags: [{name: "Content-Type", value: "text/plain" }]};
const data = Buffer.from("Hello, world!");

const result = await uploader.uploadData(data, transactionOptions);
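
Assuming the upload succeeds, the returned receipt contains the transaction ID (here assumed to be at result.data.id), which can be used to retrieve the data through an Arweave gateway:

console.log(`Data uploaded ==> https://arweave.net/${result.data.id}`);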

Upload from a file (read stream)

import fs from "fs";

const transactionOptions = { tags: [{ name: "Content-Type", value: "text/plain" }] };
const data = fs.createReadStream("./data.txt");

const result = await uploader.uploadData(data, transactionOptions);

Transaction Mode

In transaction mode, the uploader expects a transaction created through the client, rather than raw data:

const transaction = bundlr.createTransaction("Hello, world!");

const result = await uploader.uploadTransaction(transaction);
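
Tags can be attached when the transaction is created; a sketch, assuming createTransaction accepts the same options object as uploadData:

const taggedTransaction = bundlr.createTransaction("Hello, world!", {
  tags: [{ name: "Content-Type", value: "text/plain" }],
});

const taggedResult = await uploader.uploadTransaction(taggedTransaction);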

Controlling the upload

The uploader can be paused and resumed, even from a new uploader instance. To control the upload with code that runs after calling uploader.uploadTransaction() or uploader.uploadData(), omit await as shown:

const upload = uploader.uploadTransaction(transaction);

uploader.pause(); // pauses the upload

uploader.resume(); // resumes the upload

const resumeData = uploader.getResumeData(); // get the data required to resume the upload with a new instance

const result = await upload;
// or
const result = await uploader.completionPromise();

To resume an upload from a new uploader instance, you must:

  • Use the same currency
  • Use the same Bundlr node
  • Use the same input data
  • Use the same configured chunk size

Additionally, in-progress uploads expire after a period of inactivity; if you try to resume an expired upload, you will receive an error.

const newUploader = bundlr.uploader.chunkedUploader; // get a new uploader instance

newUploader.setResumeData(resumeData); // set resume data

await newUploader.uploadTransaction(transaction); // upload as normal

Uploader Events

The uploader emits events as actions occur, which you can subscribe to.

Upload progress

The chunkUpload event is emitted whenever a chunk is uploaded.

uploader.on("chunkUpload", (chunkInfo) => {
console.log(`Uploaded Chunk number ${}, offset of ${chunkInfo.offset}, size ${chunkInfo.size} Bytes, with a total of ${chunkInfo.totalUploaded} bytes uploaded.`);
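
As a sketch of how this event might be used, you could derive a rough progress percentage, assuming data is the Buffer from the earlier example:

const totalBytes = data.length;

uploader.on("chunkUpload", (chunkInfo) => {
  const pct = ((chunkInfo.totalUploaded / totalBytes) * 100).toFixed(1);
  console.log(`Upload progress: ${pct}%`);
});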

Upload error

The chunkError event is emitted whenever a chunk upload fails. Due to internal retry logic, these errors can most likely be ignored, as long as the upload doesn't fail overall.

uploader.on("chunkError", (e) => {
console.error(`Error uploading chunk number ${} - ${e.res.statusText}`);

Upload completion

The done event is emitted when the upload completes.

uploader.on("done", (finishRes) => {
console.log(`Upload completed with ID ${}`);
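
Putting it together, a minimal end-to-end sketch (reusing the assumed names from the examples above) might look like:

const uploader = bundlr.uploader.chunkedUploader;
const data = Buffer.from("Hello, world!");

uploader.on("chunkUpload", (chunkInfo) => console.log(`Uploaded ${chunkInfo.totalUploaded} bytes so far`));
uploader.on("done", (finishRes) => console.log(`Upload completed with ID ${finishRes.id}`));

const result = await uploader.uploadData(data, { tags: [{ name: "Content-Type", value: "text/plain" }] });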