Execute code only after a JavaScript forEach/map - javascript

I have a forEach loop that iterates over an array of files, uploads each file to Firebase Storage, and then, after the upload, stores the image reference link in an array defined beforehand.
Process of each iteration:
Take file object and convert to Blob
Upload Blob onto Firebase
After Blob uploads, receive a callback which has the image reference of the upload
Store the image reference in a list
After the forEach, I use the list of image references for further processing. However, when I try to reference the list, it is usually empty because this code executes before any of the images have uploaded (and before the references are received). How can I use async/await with a JavaScript forEach so that my code waits until the image reference list is populated? I've already tried to find a solution on Stack Overflow, but nothing has helped so far.
submitPost = async event => {
  event.preventDefault();
  const img_names = [];
  this.state.files.forEach((ref_, index) => {
    ref_.current.getCroppedCanvas().toBlob(blob => {
      var uploadTask = this.storageRef
        .child("images/" + String(index))
        .put(blob);
      uploadTask.on(
        "state_changed",
        snapshot => {
          var progress =
            (snapshot.bytesTransferred / snapshot.totalBytes) * 100;
          console.log("Upload is " + progress + "% done");
        },
        error => {
          console.log(error);
        },
        () => {
          img_names.push(uploadTask.snapshot.ref.toString());
        }
      );
    });
  });
  console.log(img_names); // empty here
};

You can achieve that functionality by using map instead of forEach. Convert your array into an Array of Promises. Then use that array inside a Promise.all() and await it.
const promisifyUpload = (storageRef, ref_, index) => {
  return new Promise((resolve, reject) => {
    ref_.current.getCroppedCanvas().toBlob(blob => {
      var uploadTask = storageRef
        .child("images/" + String(index))
        .put(blob);
      uploadTask.on(
        "state_changed",
        snapshot => {
          var progress = (snapshot.bytesTransferred / snapshot.totalBytes) * 100;
          console.log("Upload is " + progress + "% done");
        },
        error => console.log(error),
        () => resolve(uploadTask.snapshot.ref.toString())
      );
    });
  });
};
const submitPost = async event => {
  event.preventDefault();
  const img_names = await Promise.all(
    this.state.files.map((ref_, index) => promisifyUpload(this.storageRef, ref_, index))
  );
  console.log(img_names); // Should now be filled
};
Edit: I split it up into two functions to isolate the "promisify" logic.
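One caveat with promisifyUpload as written: the error callback only logs, so a failed upload leaves its promise pending and the awaited Promise.all would never settle. If you want it to fail fast (an assumption about the behavior you want), reject in the error handler instead:
uploadTask.on(
  "state_changed",
  snapshot => {
    var progress = (snapshot.bytesTransferred / snapshot.totalBytes) * 100;
    console.log("Upload is " + progress + "% done");
  },
  error => reject(error), // reject so Promise.all surfaces the failure
  () => resolve(uploadTask.snapshot.ref.toString())
);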

Related

Promise.all but in interval, how to achieve it?

I want to upload 20 photos to imgur at once, but I want a certain time gap between each upload. I was using Promise.all([...listof20promises]). Now I want to make the API calls at an interval of 1 second, and I should still be able to get the responses in the form of .then((responseArray) => ...) like we get with Promise.all.
How can I achieve it?
Use async/await and a normal loop with a delay between iterations, for example:
const delay = t => new Promise(resolve => setTimeout(resolve, t));
const uploadAll = async files => {
  const results = [];
  for (const file of files) {
    if (results.length) {
      await delay(1000); // 1 second
    }
    const response = await uploadFile(file);
    results.push({ file, response });
  }
  return results;
};
And use it as
uploadAll([ file1, file2, file3, ... ]).then(results => {
// results[0] = { file, response } where file === file1
// results[1] = { file, response } where file === file2
// ...
});
You can do as follows:
async function upload(files) {
while (files.length) {
await Promise.all(files.splice(0, 20));
await new Promise(resolve => setTimeout(resolve, 1000));
}
}
upload([...]);
Remember that splice will mutate your array. If you don't want that, you can use slice like below:
async function upload(files) {
let i = 0;
while (i < files.length) {
await Promise.all(files.slice(i, i += 20));
await new Promise(resolve => setTimeout(resolve, 1000));
}
}
upload([...]);
If you want the delay to be from one upload start to the next upload start, you can map the files, use a delay promise (taken from Yanick Rochon's answer) with the current file's index * 1000 as the timeout, and then call the upload:
const delay = t => new Promise(resolve => setTimeout(resolve, t));
const uploadAll = files => Promise.all(
files.map((file, i) =>
delay(i * 1000).then(() => uploadFile(file))
)
);
The main difference between Yanick Rochon's answer and mine, is that in his answer the next upload would start 1 second after the previous call ended, while this code would dispatch the calls within 1 second of each other, even if the previous call is still pending.

Upload Progress with Fetch in Vanilla JS [duplicate]

I'm struggling to find documentation or examples of implementing an upload progress indicator using fetch.
This is the only reference I've found so far, which states:
Progress events are a high level feature that won't arrive in fetch for now. You can create your own by looking at the Content-Length header and using a pass-through stream to monitor the bytes received.
This means you can explicitly handle responses without a Content-Length differently. And of course, even if Content-Length is there it can be a lie. With streams you can handle these lies however you want.
How would I write "a pass-through stream to monitor the bytes" sent? If it makes any sort of difference, I'm trying to do this to power image uploads from the browser to Cloudinary.
NOTE: I am not interested in the Cloudinary JS library, as it depends on jQuery and my app does not. I'm only interested in the stream processing necessary to do this with native javascript and Github's fetch polyfill.
https://fetch.spec.whatwg.org/#fetch-api
Streams are starting to land in the web platform (https://jakearchibald.com/2016/streams-ftw/) but it's still early days.
Soon you'll be able to provide a stream as the body of a request, but the open question is whether the consumption of that stream relates to bytes uploaded.
Particular redirects can result in data being retransmitted to the new location, but streams cannot "restart". We can fix this by turning the body into a callback which can be called multiple times, but we need to be sure that exposing the number of redirects isn't a security leak, since it'd be the first time on the platform JS could detect that.
Some are questioning whether it even makes sense to link stream consumption to bytes uploaded.
Long story short: this isn't possible yet, but in future this will be handled either by streams, or some kind of higher-level callback passed into fetch().
My solution is to use axios, which supports this pretty well:
axios.request({
method: "post",
url: "/aaa",
data: myData,
onUploadProgress: (p) => {
console.log(p);
//this.setState({
//fileprogress: p.loaded / p.total
//})
}
}).then (data => {
//this.setState({
//fileprogress: 1.0,
//})
})
I have an example of using this in React on GitHub.
fetch: not possible yet
It sounds like upload progress will eventually be possible with fetch once it supports a ReadableStream as the body. This is currently not implemented, but it's in progress. I think the code will look something like this:
warning: this code does not work yet, still waiting on browsers to support it
async function main() {
const blob = new Blob([new Uint8Array(10 * 1024 * 1024)]); // any Blob, including a File
const progressBar = document.getElementById("progress");
const totalBytes = blob.size;
let bytesUploaded = 0;
const blobReader = blob.stream().getReader();
const progressTrackingStream = new ReadableStream({
async pull(controller) {
const result = await blobReader.read();
if (result.done) {
console.log("completed stream");
controller.close();
return;
}
controller.enqueue(result.value);
bytesUploaded += result.value.byteLength;
console.log("upload progress:", bytesUploaded / totalBytes);
progressBar.value = bytesUploaded / totalBytes;
},
});
const response = await fetch("https://httpbin.org/put", {
method: "PUT",
headers: {
"Content-Type": "application/octet-stream"
},
body: progressTrackingStream,
});
console.log("success:", response.ok);
}
main().catch(console.error);
upload: <progress id="progress" />
workaround: good ol' XMLHttpRequest
Instead of fetch(), it's possible to use XMLHttpRequest to track upload progress — the xhr.upload object emits a progress event.
async function main() {
const blob = new Blob([new Uint8Array(10 * 1024 * 1024)]); // any Blob, including a File
const uploadProgress = document.getElementById("upload-progress");
const downloadProgress = document.getElementById("download-progress");
const xhr = new XMLHttpRequest();
const success = await new Promise((resolve) => {
xhr.upload.addEventListener("progress", (event) => {
if (event.lengthComputable) {
console.log("upload progress:", event.loaded / event.total);
uploadProgress.value = event.loaded / event.total;
}
});
xhr.addEventListener("progress", (event) => {
if (event.lengthComputable) {
console.log("download progress:", event.loaded / event.total);
downloadProgress.value = event.loaded / event.total;
}
});
xhr.addEventListener("loadend", () => {
resolve(xhr.readyState === 4 && xhr.status === 200);
});
xhr.open("PUT", "https://httpbin.org/put", true);
xhr.setRequestHeader("Content-Type", "application/octet-stream");
xhr.send(blob);
});
console.log("success:", success);
}
main().catch(console.error);
upload: <progress id="upload-progress"></progress><br/>
download: <progress id="download-progress"></progress>
Update: as the accepted answer says, it's not possible right now, but the code below handled our problem for some time. I should add that eventually we had to switch to a library based on XMLHttpRequest.
const response = await fetch(url);
const total = Number(response.headers.get('content-length'));
const reader = response.body.getReader();
let bytesReceived = 0;
while (true) {
const result = await reader.read();
if (result.done) {
console.log('Fetch complete');
break;
}
bytesReceived += result.value.length;
console.log('Received', bytesReceived, 'bytes of data so far');
}
thanks to this link: https://jakearchibald.com/2016/streams-ftw/
As already explained in the other answers, it is not possible with fetch, but with XHR. Here is my a-little-more-compact XHR solution:
const uploadFiles = (url, files, onProgress) =>
new Promise((resolve, reject) => {
const xhr = new XMLHttpRequest();
xhr.upload.addEventListener('progress', e => onProgress(e.loaded / e.total));
xhr.addEventListener('load', () => resolve({ status: xhr.status, body: xhr.responseText }));
xhr.addEventListener('error', () => reject(new Error('File upload failed')));
xhr.addEventListener('abort', () => reject(new Error('File upload aborted')));
xhr.open('POST', url, true);
const formData = new FormData();
Array.from(files).forEach((file, index) => formData.append(index.toString(), file));
xhr.send(formData);
});
Works with one or multiple files.
If you have a file input element like this:
<input type="file" multiple id="fileUpload" />
Call the function like this:
document.getElementById('fileUpload').addEventListener('change', async e => {
const onProgress = progress => console.log('Progress:', `${Math.round(progress * 100)}%`);
const response = await uploadFiles('/api/upload', e.currentTarget.files, onProgress);
if (response.status >= 400) {
throw new Error(`File upload failed - Status code: ${response.status}`);
}
console.log('Response:', response.body);
});
Also works with the e.dataTransfer.files you get from a drop event when building a file drop zone.
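For example, a minimal drop-zone wiring might look like this (the element id and endpoint here are assumptions):
const dropZone = document.getElementById('drop-zone'); // hypothetical element
dropZone.addEventListener('dragover', e => e.preventDefault()); // allow dropping
dropZone.addEventListener('drop', async e => {
  e.preventDefault();
  const onProgress = progress => console.log('Progress:', `${Math.round(progress * 100)}%`);
  const response = await uploadFiles('/api/upload', e.dataTransfer.files, onProgress);
  console.log('Response:', response.body);
});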
I don't think it's possible. The draft states:
it is currently lacking [in comparison to XHR] when it comes to request progression
(old answer):
The first example in the Fetch API chapter gives some insight on how to do this:
If you want to receive the body data progressively:
function consume(reader) {
var total = 0
return new Promise((resolve, reject) => {
function pump() {
reader.read().then(({done, value}) => {
if (done) {
resolve()
return
}
total += value.byteLength
log(`received ${value.byteLength} bytes (${total} bytes in total)`)
pump()
}).catch(reject)
}
pump()
})
}
fetch("/music/pk/altes-kamuffel.flac")
.then(res => consume(res.body.getReader()))
.then(() => log("consumed the entire body without keeping the whole thing in memory!"))
.catch(e => log("something went wrong: " + e))
Apart from their use of the Promise constructor antipattern, you can see that response.body is a Stream from which you can read chunk by chunk using a Reader, and you can fire an event or do whatever you like (e.g. log the progress) for every one of them.
However, the Streams spec doesn't appear to be quite finished, and I have no idea whether this already works in any fetch implementation.
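For what it's worth, here is a sketch of the same consumption loop without the Promise constructor antipattern, using async/await (it assumes the same log helper as the spec example):
async function consume(reader) {
  let total = 0;
  while (true) {
    const { done, value } = await reader.read();
    if (done) return;
    total += value.byteLength;
    log(`received ${value.byteLength} bytes (${total} bytes in total)`);
  }
}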
with fetch: now possible with Chrome >= 105 🎉
How to:
https://developer.chrome.com/articles/fetch-streaming-requests/
Currently not supported by other browsers (that may have changed by the time you read this; please edit this answer accordingly).
Feature detection (source)
const supportsRequestStreams = (() => {
let duplexAccessed = false;
const hasContentType = new Request('', {
body: new ReadableStream(),
method: 'POST',
get duplex() {
duplexAccessed = true;
return 'half';
},
}).headers.has('Content-Type');
return duplexAccessed && !hasContentType;
})();
HTTP >= 2 required
The fetch will be rejected if the connection is HTTP/1.x.
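Putting those pieces together, a rough sketch of a streaming upload with progress under those constraints might look like this (the /upload endpoint is an assumption; it requires the feature detection above to pass and an HTTP/2+ connection):
async function uploadWithProgress(file, onProgress) {
  let sent = 0;
  const body = file.stream().pipeThrough(
    new TransformStream({
      transform(chunk, controller) {
        sent += chunk.byteLength;
        onProgress(sent / file.size); // fraction of bytes handed off to the network layer
        controller.enqueue(chunk);
      },
    })
  );
  return fetch('/upload', {
    method: 'POST',
    headers: { 'Content-Type': 'application/octet-stream' },
    body,
    duplex: 'half', // required whenever the request body is a stream
  });
}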
Since none of the answers solve the problem, here is a workaround for implementation's sake: you can measure the upload speed with a small initial chunk of known size, and then estimate the total upload time as content-length / upload-speed. You can use this time as an estimate while the real upload runs.
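A rough sketch of that estimation idea, assuming a hypothetical /upload endpoint and an arbitrary probe size:
async function estimateUploadSeconds(file, probeBytes = 256 * 1024) {
  // Upload a small slice of known size and time it.
  const probe = file.slice(0, Math.min(probeBytes, file.size));
  const start = performance.now();
  await fetch('/upload', { method: 'POST', body: probe }); // hypothetical endpoint
  const seconds = (performance.now() - start) / 1000;
  const bytesPerSecond = probe.size / seconds;
  // Extrapolate to the full file; use this as an estimated progress timer.
  return file.size / bytesPerSecond;
}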
A possible workaround would be to utilize the new Request() constructor, then check the Request.bodyUsed Boolean attribute
The bodyUsed attribute’s getter must return true if disturbed, and
false otherwise.
to determine if the stream is disturbed
An object implementing the Body mixin is said to be disturbed if
body is non-null and its stream is disturbed.
Return the fetch() Promise from within .then() chained to a recursive .read() call of a ReadableStream, once Request.bodyUsed is equal to true.
Note, the approach does not read the bytes of the Request.body as the bytes are streamed to the endpoint. Also, the upload could complete well before any response is returned in full to the browser.
const [input, progress, label] = [
document.querySelector("input")
, document.querySelector("progress")
, document.querySelector("label")
];
const url = "/path/to/server/";
input.onmousedown = () => {
label.innerHTML = "";
progress.value = "0"
};
input.onchange = (event) => {
const file = event.target.files[0];
const filename = file.name;
progress.max = file.size;
const request = new Request(url, {
method: "POST",
body: file,
cache: "no-store"
});
const upload = settings => fetch(settings);
const uploadProgress = new ReadableStream({
start(controller) {
console.log("starting upload, request.bodyUsed:", request.bodyUsed);
controller.enqueue(request.bodyUsed);
},
pull(controller) {
if (request.bodyUsed) {
controller.close();
}
controller.enqueue(request.bodyUsed);
console.log("pull, request.bodyUsed:", request.bodyUsed);
},
cancel(reason) {
console.log(reason);
}
});
const [fileUpload, reader] = [
upload(request)
.catch(e => {
reader.cancel();
throw e
})
, uploadProgress.getReader()
];
const processUploadRequest = ({value, done}) => {
if (value || done) {
console.log("upload complete, request.bodyUsed:", request.bodyUsed);
// set `progress.value` to `progress.max` here
// if not awaiting server response
// progress.value = progress.max;
return reader.closed.then(() => fileUpload);
}
console.log("upload progress:", value);
progress.value = +progress.value + 1;
return reader.read().then(result => processUploadRequest(result));
};
reader.read().then(({value, done}) => processUploadRequest({value,done}))
.then(response => response.text())
.then(text => {
console.log("response:", text);
progress.value = progress.max;
input.value = "";
})
.catch(err => console.log("upload error:", err));
}
I fished around for some time on this, so for everyone who may come across this issue too, here is my solution:
const form = document.querySelector('form');
const status = document.querySelector('#status');
// When the form gets submitted.
form.addEventListener('submit', async function (event) {
// cancel default behavior (form submit)
event.preventDefault();
// Inform user that the upload has begun
status.innerText = 'Uploading..';
// Create FormData from form
const formData = new FormData(form);
// Open request to origin
const request = await fetch('https://httpbin.org/post', { method: 'POST', body: formData });
// Get amount of bytes we're about to transmit
const bytesToUpload = request.headers.get('content-length');
// Create a reader from the request body
const reader = request.body.getReader();
// Cache how much data we have already sent
let bytesUploaded = 0;
// Get first chunk of the request reader
let chunk = await reader.read();
// While we have more chunks to go
while (!chunk.done) {
// Increase amount of bytes transmitted.
bytesUploaded += chunk.value.length;
// Inform user how far we are
status.innerText = 'Uploading (' + (bytesUploaded / bytesToUpload * 100).toFixed(2) + ')...';
// Read next chunk
chunk = await reader.read();
}
});
const req = await fetch('./foo.json');
const total = Number(req.headers.get('content-length'));
let loaded = 0;
// Iterate the body stream itself (requires a browser where ReadableStream is async iterable).
for await (const chunk of req.body) {
  loaded += chunk.length;
  const progress = ((loaded / total) * 100).toFixed(2); // toFixed(2) means two digits after the decimal point
  console.log(`${progress}%`); // or yourDiv.textContent = `${progress}%`;
}
The key part is the ReadableStream of the response body (obj_response.body).
Sample:
let parse = result => {
  console.log(result);
  // ...
  return result.value ? true : false; // continue?
};
fetch('').then(response => {
  const reader = response.body.getReader();
  const pump = () => reader.read().then(parse).then(cont => (cont ? pump() : undefined));
  return pump();
});
You can test it by running it on a huge page, e.g. https://html.spec.whatwg.org/ or https://html.spec.whatwg.org/print.pdf. Open the console with Ctrl+Shift+J and paste the code in.
(Tested on Chrome.)


How to upload images to firebase storage and wait for URLs to return before continuing

I'm trying to create a "Create Post" form where the user can input text, as well as input images.
uploadImages() {
const files = document.querySelector('#imagesInput').files;
files.forEach(image => {
console.log("uploading", image.name)
const name = (+new Date()) + '-' + image.name;
const task = fb.storage.child(name).put(image, {contentType: image.type});
task.then(snapshot => {
snapshot.ref.getDownloadURL().then(url => {
console.log(url);
})
})
});
}
The image upload process needs to happen first, and I need to return all the URLs (as an array) before the post is created in the database.
imageURLs = this.uploadImages(); // <-- Wait for this to complete and return the image URIs
this.createPost("Hello, world!", imageURLs) // <-- then do this
Since you call the asynchronous getDownloadURL() method a variable number of times (i.e. unknown at the time of coding), you need to use Promise.all() as follows:
uploadImages() {
const files = document.querySelector('#imagesInput').files;
const promises = [];
files.forEach(image => {
console.log("uploading", image.name)
const name = (+new Date()) + '-' + image.name;
const task = fb.storage.child(name).put(image, { contentType: image.type });
promises.push(task);
});
return Promise.all(promises)
.then(snapshotsArray => {
// snapshotsArray is an array with a variable number of elements
// You again need to use Promise.all
const promises = [];
snapshotsArray.forEach(snapshot => {
promises.push(snapshot.ref.getDownloadURL());
})
return Promise.all(promises);
});
}
The uploadImages() function is asynchronous and returns a Promise (since Promise.all() and then() return Promises): you therefore need to call it by using then(), as follows:
this.uploadImages()
.then(imageURLs => {
this.createPost("Hello, world!", imageURLs)
})
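Or equivalently, from within an async function:
const imageURLs = await this.uploadImages();
this.createPost("Hello, world!", imageURLs);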
Add an .addOnSuccessListener to the line where you upload the images and put the next part of the code inside it.
Like this
imageURLs = this.uploadImages().addOnSuccessListener(new OnSuccessListener()
({
this.createPost("Hello, world!", imageURLs)
});
I am not sure if the code is entirely correct; it will depend on what exactly you are trying to do, but it should give you a general idea of what you need to do.

How to upload multiple files to Firebase?

Is there a way to upload multiple files to Firebase Storage? It can upload a single file within a single attempt as follows:
fileButton.addEventListener('change', function(e){
//Get file
var file = e.target.files[0];
//Create storage reference
var storageRef = firebase.storage().ref(DirectryPath+"/"+file.name);
//Upload file
var task = storageRef.put(file);
//Update progress bar
task.on('state_changed',
function progress(snapshot){
var percentage = snapshot.bytesTransferred / snapshot.totalBytes * 100;
uploader.value = percentage;
},
function error(err){
},
function complete(){
var downloadURL = task.snapshot.downloadURL;
}
);
});
How can I upload multiple files to Firebase Storage?
I found the solution to my question above and I'd like to put it here because it can be useful for others.
//Listen for file selection
fileButton.addEventListener('change', function(e){
//Get files
for (var i = 0; i < e.target.files.length; i++) {
var imageFile = e.target.files[i];
uploadImageAsPromise(imageFile);
}
});
//Handle waiting to upload each file using promise
function uploadImageAsPromise (imageFile) {
return new Promise(function (resolve, reject) {
var storageRef = firebase.storage().ref(fullDirectory+"/"+imageFile.name);
//Upload file
var task = storageRef.put(imageFile);
//Update progress bar
task.on('state_changed',
function progress(snapshot){
var percentage = snapshot.bytesTransferred / snapshot.totalBytes * 100;
uploader.value = percentage;
},
function error(err){
},
function complete(){
var downloadURL = task.snapshot.downloadURL;
}
);
});
}
Firebase Storage uses Promises, so you can use Promises to achieve this.
Here's the firebase blog article that covers this subject:
Keeping our Promises (and Callbacks)
Give Promise.all() an "Array of Promises"
Promise.all(
// Array of "Promises"
myItems.map(item => putStorageItem(item))
)
.then((url) => {
console.log(`All success`)
})
.catch((error) => {
console.log(`Some failed: `, error.message)
});
Upload each file and return a Promise
putStorageItem(item) {
// the return value will be a Promise
return firebase.storage().ref("YourPath").put("YourFile")
.then((snapshot) => {
console.log('One success:', item)
}).catch((error) => {
console.log('One failed:', item, error.message)
});
}
YourPath and YourFile can be carried in the myItems array (hence the item object).
I omitted them here just for readability, but you get the concept; a hypothetical shape is sketched below.
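For illustration only, the myItems entries might look something like this (the field names are assumptions, not part of the original answer):
// Hypothetical shape for myItems; adjust field names to your own data.
const myItems = [
  { path: "images/1.jpg", file: file1 },
  { path: "images/2.jpg", file: file2 },
];
// Inside putStorageItem, "YourPath" and "YourFile" would then become:
// firebase.storage().ref(item.path).put(item.file)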
I believe there's a simpler solution:
// set it up
firebase.storage().ref().constructor.prototype.putFiles = function(files) {
var ref = this;
return Promise.all(files.map(function(file) {
return ref.child(file.name).put(file);
}));
}
// use it!
firebase.storage().ref().putFiles(files).then(function(metadatas) {
// Get an array of file metadata
}).catch(function(error) {
// If any task fails, handle this
});
let ad_images=["file:///data/user/0/..../IMG-20181216-WA00001.jpg",
"file:///data/user/0/..../IMG-20181216-WA00002.jpg",
"file:///data/user/0/..../IMG-20181216-WA00003.jpg"];
let firebase_images=[];
const ref = firebase.firestore().collection('ads').doc(newRecord.id);
putStorageItem = (url,index,ext) => {
return firebase.storage().ref('YOURFOLDER/'+ index +'.'+ext ).putFile(url)
.then((snapshot) => {
console.log(snapshot)
firebase_images[index] = snapshot.downloadURL;
//OR
//firebase_images.push(snapshot.downloadURL);
}).catch((error) => {
console.log('One failed:', error.message)
});
}
Promise.all(
ad_images.map( async (item,index) => {
let ext = item.split('/').pop().split(".").pop();
console.log(newRecord.id, item, index, ext);
await putStorageItem(item, index, ext);
})
)
.then((url) => {
console.log(`All success`);
console.log(firebase_images);
})
.catch((error) => {
console.log(`Some failed: `, error.message)
});
This is a modification of the marked answer for those looking to wait for each upload to complete before the other starts.
As the marked answer stands, the promise is never resolved or rejected, so when the upload begins from the loop everything just starts at once: the 1st file, the 2nd, and so on.
Think of 3 uploads of 20 MB each. The loop will call the upload function at almost the same time, making them run almost concurrently.
This answer solves that by using async/await to handle the promises.
fileButton.addEventListener('change', async function(e){
//Get files
for (var i = 0; i < e.target.files.length; i++) {
var imageFile = e.target.files[i];
await uploadImageAsPromise(imageFile).then((res)=>{
console.log(res);
});
}
});
//Handle waiting to upload each file using promise
async function uploadImageAsPromise (imageFile) {
return new Promise(function (resolve, reject) {
var storageRef = firebase.storage().ref(fullDirectory+"/"+imageFile.name);
var task = storageRef.put(imageFile);
//Update progress bar
task.on('state_changed',
function progress(snapshot){
var percentage = snapshot.bytesTransferred / snapshot.totalBytes *
100;
},
function error(err){
console.log(err);
reject(err);
},
function complete(){
var downloadURL = task.snapshot.downloadURL;
resolve(downloadURL);
}
);
});
}
#isuru, who asked the question, provided a great solution above. However, some of the Firebase functions have since been updated, so here is the solution updated for the newer Firebase API.
//Firebase Storage Reference
const storageRef = firebase.storage().ref();
//Upload Image Function returns a promise
async function uploadImageAsPromise(imageFile) {
return new Promise(function (resolve, reject) {
const task = storageRef.child(imageFile.name).put(imageFile);
task.on(
"state_changed",
function progress(snapshot) {
const percentage = (snapshot.bytesTransferred / snapshot.totalBytes) * 100;
},
function error(err) {
reject(err);
},
async function complete() {
//The getDownloadURL returns a promise and it is resolved to get the image url.
const imageURL = await task.snapshot.ref.getDownloadURL();
resolve(imageURL);
}
);
});
}
//Handling the files
fileButton.addEventListener('change', function(e){
const promises = [];
for(const file of e.target.files){//Instead of e.target.files, you could also have your files variable
promises.push(uploadImageAsPromise(file))
}
//Promise.all() waits until all of the promises are resolved.
Promise.all(promises).then((fileURLS)=>{
//Once all the promises are resolved, you will get the URLs in an array.
console.log(fileURLS)
})
});
Upload a file & get download URL
export const handleFileUploadOnFirebaseStorage = async (bucketName, file) => {
// 1. If no file, return
if (file === "") return "";
// 2. Put the file into bucketName
const uploadTask = await storage.ref(`/${bucketName}/${file.name}`).put(file);
// 3. Get the download URL and return it
return uploadTask.ref.getDownloadURL().then((fileURL) => fileURL);
};
Upload multiple files & get download URL
export const handleFilesUploadOnFirebaseStorage = async (bucketName, files) => {
// 1. If no file, return
if (files.length === 0) return [];
// 2. Create an array to store all download URLs
let fileUrls = [];
// 3. Loop over all the files
for (var i = 0; i < files.length; i++) {
// 3A. Get a file to upload
const file = files[i];
// 3B. handleFileUploadOnFirebaseStorage function is in above section
const downloadFileResponse = await handleFileUploadOnFirebaseStorage(bucketName, file);
// 3C. Push the download url to URLs array
fileUrls.push(downloadFileResponse);
}
return fileUrls;
};
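The loop above runs the uploads one after another. If you would rather run them in parallel, the same helper can be combined with Promise.all (a sketch under the same assumptions as above):
export const handleFilesUploadOnFirebaseStorage = async (bucketName, files) => {
  // Start every upload at once and wait until all download URLs are available.
  return Promise.all(
    [...files].map((file) => handleFileUploadOnFirebaseStorage(bucketName, file))
  );
};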
All the promises get messy pretty quickly, so why not use async and await instead?
Here, I have a function that keeps track of all the images selected from the file input control to be uploaded:
let images =[];
let imagePaths=[];
const trackFiles =(e)=>{
images =[];
imagePaths =[];
for (var i = 0; i < e.target.files.length; i++) {
images.push(e.target.files[i]);
}
}
And I have another function that will be triggered by a button that the user will click on when ready to do the actual upload:
const uploadFiles =()=>{
const storageRef = storage.ref();
images.map(async img =>{
let fileRef = storageRef.child(img.name);
await fileRef.put(img);
const singleImgPath = await fileRef.getDownloadURL();
imagePaths.push(singleImgPath);
if(imagePaths.length == images.length){
console.log("got all paths here now: ", imagePaths);
}
})
}
We basically loop through each image and perform the upload, pushing each image path into a separate imagePaths array as each upload finishes at its own pace. I then grab all the paths once we know they are all done by comparing the length of the images array with the length of the paths array.
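A minimal variation of the same idea that avoids comparing lengths, assuming the same storage reference and images array, is to await Promise.all over the mapped uploads:
const uploadFiles = async () => {
  const storageRef = storage.ref();
  // Each map callback resolves to a download URL; Promise.all gathers them in order.
  imagePaths = await Promise.all(
    images.map(async img => {
      const fileRef = storageRef.child(img.name);
      await fileRef.put(img);
      return fileRef.getDownloadURL();
    })
  );
  console.log("got all paths here now: ", imagePaths);
};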
We can combine multiple Promises like this:
Promise.all([promise1, promise2, promise3]).then(function(values) {
console.log(values);
});
And we can chain Promises like this:
return myFirstPromise.then( (returnFromFirst) => {
  //Do something
  return secondPromise();
}).then( (returnFromSecond) => {
  //Do something
  return thirdPromise();
}).then( (returnFromThird) => {
  //All Done
}).catch( (e) => {
  console.error("SOMETHING WENT WRONG!!!");
});
The idea is to combine the upload-file promises with Promise.all and chain them together to get the download URLs after each upload:
Promise.all(
  //Array.map creates a new array with the results
  // of calling a function for every array element.
  //In this case an Array of "Promises"
  this.state.filesToUpload.map(item =>
    this.uploadFileAsPromise(item))
)
  .then(url => {
    console.log(`All success`);
    //Handle Success all image upload
  })
  .catch(error => {
    console.log(`Some failed: `, error.message);
    //Handle Failure some/all image upload failed
  });

//return a promise which uploads the file & gets the download URL
uploadFileAsPromise(imageFile) {
  // the return value will be a Promise
  return storageRef
    .child("images/users/" + imageFile.name)
    .put(imageFile.file)
    .then(snapshot => {
      console.log("Uploaded File:", imageFile.name);
      return snapshot.ref.getDownloadURL().then(downloadURL => {
        //promise inside promise to get the downloadable URL
        console.log("File available at", downloadURL);
        return downloadURL; // pass the URL along so Promise.all receives it
      });
    })
    .catch(error => {
      console.log("Upload failed:", imageFile.name, error.message);
    });
}
This was a breeze to implement with RxJS's switchMap and combineLatest for AngularFire.
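For reference, a minimal sketch of that RxJS approach, assuming AngularFire's AngularFireStorage service (this.storage) and a plain array of File objects; treat the API details as assumptions:
import { combineLatest, from } from 'rxjs';
import { switchMap } from 'rxjs/operators';

uploadAll(files) {
  const urls$ = files.map(file => {
    const path = `images/${file.name}`;
    const ref = this.storage.ref(path);
    // upload() returns a thenable task, so from() wraps it; then switch to the download URL.
    return from(this.storage.upload(path, file)).pipe(
      switchMap(() => ref.getDownloadURL())
    );
  });
  // Emits an array of download URLs once every upload has completed.
  return combineLatest(urls$);
}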
