Large file download aborted using axios (Node.js) with Confluence Cloud

Hello

■Background

When downloading files from Atlassian Confluence with axios in Node.js on Google Cloud Run, the download fails whenever the file is larger than 100 MB.

■What I did

I read the response data in chunks and found that a "request aborted" exception occurs near the end, so the response data is never read completely. For what it's worth, the same code downloads files larger than 100 MB from Zoom without any problem.

■Code

↓source code↓

const fs = require("fs");
const http = require("http");
const https = require("https");
const Axios = require("axios");

const writer = fs.createWriteStream(filepath);

// Request the file as a stream.
const response = await Axios({
   url: fileUrl,
   method: "GET",
   responseType: "stream",
   validateStatus: null,
   httpAgent: new http.Agent({ keepAlive: true }),
   httpsAgent: new https.Agent({ keepAlive: true })
}).catch(err => {
   console.log("axios error: " + err);
});

const contentLength = response.headers["content-length"];
console.log("contentLength: " + contentLength);

// Write each chunk to the file and log the running total of bytes received.
let chunklength = 0;
response.data.on("data", (chunk) => {
   chunklength = chunklength + chunk.length;
   console.log("chunk-length-sum: " + chunklength);
   writer.write(chunk);
});

// Handlers added to observe which stream events actually fire (see below).
response.data.on("aborted", () => { });
response.data.on("close", () => { });
response.data.on("end", () => { });
writer.on("finish", async () => { });

■Code Error Explanation

After the log reports the error, the response stream goes to the 'aborted' event and then directly to the 'close' event; the writer's 'finish' event is skipped, so the file download (and the upload that follows it) is interrupted.

↓Error log↓

Error: aborted
    at connResetException (node:internal/errors:704:14)
    at TLSSocket.socketCloseListener (node:_http_client:441:19)
    at TLSSocket.emit (node:events:525:35)
    at TLSSocket.emit (node:domain:489:12)
    at node:net:757:14
    at TCP.done (node:_tls_wrap:583:7)
Emitted 'error' event on Request instance at:
    at Request.onerror (node:internal/streams/legacy:62:12)
    at Request.emit (node:events:513:28)
    at Request.emit (node:domain:489:12)
    at IncomingMessage.<anonymous> (/usr/src/app/node_modules/request/request.js:1079:12)
    at IncomingMessage.emit (node:events:513:28)
    at IncomingMessage.emit (node:domain:489:12)
    at emitErrorNT (node:internal/streams/destroy:151:8)
    at emitErrorCloseNT (node:internal/streams/destroy:116:3)
    at process.processTicksAndRejections (node:internal/process/task_queues:82:21) {
  code: 'ECONNRESET'
}
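
For reference, a minimal sketch of the same download wired through Node's stream.pipeline (from node:stream/promises), assuming the same fileUrl and filepath variables as in the code above; the downloadToFile helper name is only for illustration. With this wiring, an aborted response surfaces as a rejected promise instead of a silently skipped 'finish' event:

const fs = require("fs");
const { pipeline } = require("node:stream/promises");
const Axios = require("axios");

async function downloadToFile(fileUrl, filepath) {
   // Request the file as a stream, as in the code above.
   const response = await Axios({
      url: fileUrl,
      method: "GET",
      responseType: "stream"
   });
   // pipeline() handles backpressure and rejects if the response stream
   // errors or is aborted, so the ECONNRESET is not swallowed.
   await pipeline(response.data, fs.createWriteStream(filepath));
}

This does not by itself prevent the connection reset, but it makes the failure visible to the calling code instead of leaving an incomplete file behind silently.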

■What I expect

To be able to download large files with axios in Node.js, or to learn of another method that can download large files.

■Environment

Node.js 18.7

axios 0.26.2

■Question

1) Does Confluence Cloud impose any limitations (for example, on file download size)?

2) What could be the cause of this error (connResetException / ECONNRESET), and how can it be resolved?
