web worker - data undefined when passed back to main thread - javascript

I'm trying to use a web worker to fetch some data and then pass it back to the main thread. I tried the following code, but it doesn't work as expected:
onmessage = (e) => {
  console.log(e);
  if( e.data[0] === 'fetchData' ){
    fetch('https://example.com/platform/api/v1/endpoint')
      .then( (res) => res.json() )
      .then( async (data) => {
        const imagesData = await Promise.all(
          data.map( async (item) => {
            let res = await fetch(item.src);
            let img = await res.blob();
            let reader = new FileReader();
            reader.readAsDataURL(img);
            reader.onloadend = () => {
              return {
                title: item.title,
                link: item.link,
                src: reader.result
              }
            }
          })
        )
        postMessage(imagesData);
      })
  }
}
After a console.log, imagesData contains an array of seven undefined elements. How can I fix this?
UPDATE
I've changed the code in the Vue front end and in the worker, and now I'm able to get the data, but sometimes the worker doesn't work at all and I get no data, or I only get two entries instead of the expected seven items for the front end. Here is how I've modified the code; maybe I need to terminate the worker before using another one?
NB: I'm building a tab-override Chrome extension.
Vue front-end code
<script>
const worker = new Worker('services.js');

export default {
  name: 'App',
  beforeCreate() {
    worker.postMessage(['fetchData']);
  },
  created() {
    this.init();
    this.clock();
  },
  data() {
    return {
      mostVisited: [],
      imagesData: [],
      isLoading: true
    }
  },
  methods: {
    init() {
      worker.onmessage = (e) => {
        console.log(e);
        this.imagesData = e.data;
        this.isLoading = false;
      }
      browser.topSites.get().then( (sites) => this.mostVisited = sites );
    } //end init
  }
}
</script>
web worker code
onmessage = (e) => {
  console.log(e);
  if( e.data[0] === 'fetchData' ){
    fetch('https://example.com/platform/api/v1/endpoint')
      .then( (res) => res.json() )
      .then( async (data) => {
        let imagesData = [];
        await Promise.all(
          data.map( async (item) => {
            let res = await fetch(item.src);
            let img = await res.blob();
            let reader = new FileReader();
            reader.readAsDataURL(img);
            reader.onloadend = () => {
              imagesData.push({ title: item.title, link: item.link, src: reader.result });
            }
          })
        )
        postMessage(imagesData);
      }); // end then(data)
  }
}

You are not waiting for the asynchronous FileReader to finish reading your files before resolving the outer Promises, so the Promise returned by Promise.all resolves right after let img = await res.blob(); and before onloadend ever fires.
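One way to fix that (a sketch, not the original answer's code) is to wrap the FileReader in a Promise so that each mapped item only resolves once onloadend has fired:

.then( async (data) => {
  const imagesData = await Promise.all(
    data.map(async (item) => {
      const res = await fetch(item.src);
      const img = await res.blob();
      // Wrap the callback-based FileReader in a Promise we can await
      const src = await new Promise((resolve, reject) => {
        const reader = new FileReader();
        reader.onloadend = () => resolve(reader.result);
        reader.onerror = () => reject(reader.error);
        reader.readAsDataURL(img);
      });
      return { title: item.title, link: item.link, src };
    })
  );
  postMessage(imagesData);
});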
Alternatively, since you are in a Worker context, you can use the synchronous FileReaderSync API, which would give something like
.then( async (data) => {
  let imagesData = [];
  await Promise.all(
    data.map( async (item) => {
      let res = await fetch(item.src);
      let img = await res.blob();
      let reader = new FileReaderSync();
      const result = reader.readAsDataURL(img);
      imagesData.push({ title: item.title, link: item.link, src: result });
    })
  )
  postMessage(imagesData);
});
But I'm 99% confident that you don't even need that data: URL at all, and that it will do more harm than good.
Remember that the data: URL from a FileReader always encodes to base64, which produces a DOMString about 134% the size of the original data, which you then multiply by 2 since DOMStrings are stored as UTF-16.
Then, to pass that data to the main context, the browser has to serialize it using the structured clone algorithm, and for DOMStrings that means a plain copy in memory. Each image now takes up roughly 5 times its real size in memory (for example, a 1 MB image becomes about 1.34 MB of base64 characters, about 2.7 MB as a UTF-16 DOMString, plus another ~2.7 MB for the copy handed to the main thread), and that's before the main context even starts parsing it back to binary data so it can build the pixel data...
Instead, simply pass the Blob you get as img.
A Blob's inner data is passed by reference by the structured clone algorithm, so all you copy is the small JS wrapper around it.
Then, when you need to display these images in the main context, use URL.createObjectURL(img), which returns a blob: URL pointing directly at that same Blob's data, still stored only once in memory.
.then( async (data) => {
  let imagesData = [];
  await Promise.all(
    data.map( async (item) => {
      let res = await fetch(item.src);
      let img = await res.blob();
      const url = URL.createObjectURL(img);
      imagesData.push({ title: item.title, link: item.link, src: url, file: img });
    })
  )
  postMessage(imagesData);
});
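On the main-thread (Vue) side, a minimal sketch of how the onmessage handler in init() could consume those entries, creating the blob: URL from the transferred Blob on the main thread as the answer suggests (this assumes the worker keeps the file property from the snippet above and the template binds each item's src to an img element):

worker.onmessage = (e) => {
  // e.data is the imagesData array; each entry carries its Blob in .file
  this.imagesData = e.data.map((item) => ({
    title: item.title,
    link: item.link,
    // Create the blob: URL here, in the main context
    src: URL.createObjectURL(item.file)
  }));
  this.isLoading = false;
}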

Related

How to stream x-ndjson content using Express and parse the streamed data?

I have a TS library using Node v19.1.0. The library has a function that observes streamed server events.
The server provides a /events route streaming 'application/x-ndjson' content, which might be an event, a ping, etc. (sending a ping every x seconds is important to keep the connection alive).
My observe function parses the streamed data and inspects it. If it is a valid event it will pass it to a callback function. The caller also receives an abort function to abort the streaming on demand.
Whenever I run tests locally or via CI I get the following error
Warning: Test "observes events." generated asynchronous activity after the test ended. This activity created the error "AbortError: The operation was aborted." and would have caused the test to fail, but instead triggered an unhandledRejection event.
I tried to minimize the example code using plain JavaScript
const assert = require('assert/strict');
const express = require('express');
const { it } = require('node:test');

it('observes events.', async () => {
  const expectedEvent = { type: 'event', payload: { metadata: { type: 'entity-created', commandId: 'commandId' } } };
  const api = express();
  const server = api
    .use(express.json())
    .post('/events', (request, response) => {
      response.writeHead(200, {
        'content-type': 'application/x-ndjson',
      });
      const line = JSON.stringify(expectedEvent) + '\n';
      response.write(line);
    })
    .listen(3000);
  let stopObserving = () => {
    throw new Error('should never happen');
  };
  const actualEventPayload = await new Promise(async resolve => {
    stopObserving = await observeEvents(async newEvent => {
      resolve(newEvent);
    });
  });
  stopObserving();
  server.closeAllConnections();
  server.close();
  assert.deepEqual(actualEventPayload, expectedEvent.payload);
});
const observeEvents = async function (onReceivedFn) {
  const abortController = new AbortController();
  const response = await fetch('http://localhost:3000/events', {
    method: 'POST',
    headers: { 'content-type': 'application/json' },
    signal: abortController.signal,
  });
  if (!response.ok) {
    throw new Error('error handling goes here - request failed');
  }
  Promise.resolve().then(async () => {
    if (!response.body) {
      throw new Error('error handling goes here - missing response body');
    }
    for await (const item of parseStream(response.body, abortController)) {
      switch (item.type) {
        case 'event': {
          await onReceivedFn(item.payload);
          break;
        }
        case 'ping':
          // Intentionally left blank
          break;
        case 'error':
          throw new Error('error handling goes here - stream failed');
        default:
          throw new Error('error handling goes here - should never happen');
      }
    }
  });
  return () => { abortController.abort(); };
};

const parseLine = function () {
  return new TransformStream({
    transform(chunk, controller) {
      try {
        const data = JSON.parse(chunk);
        // ... check if this is a valid line...
        controller.enqueue(data);
      } catch (error) {
        controller.error(error);
      }
    },
  });
};

const splitLines = function () {
  let buffer = '';
  return new TransformStream({
    transform(chunk, controller) {
      buffer += chunk;
      const lines = buffer.split('\n');
      for (let i = 0; i < lines.length - 1; i++) {
        controller.enqueue(lines[i]);
      }
      buffer = lines.at(-1) ?? '';
    },
    flush(controller) {
      if (buffer.length > 0) {
        controller.enqueue(buffer);
      }
    },
  });
};

const parseStream = async function* (stream, abortController) {
  let streamReader;
  try {
    const pipedStream = stream
      .pipeThrough(new TextDecoderStream())
      .pipeThrough(splitLines())
      .pipeThrough(parseLine());
    streamReader = pipedStream.getReader();
    while (true) {
      const item = await streamReader.read();
      if (item.done) {
        break;
      }
      yield item.value;
    }
  } finally {
    await streamReader?.cancel();
    abortController.abort();
  }
};
Unfortunately, when running node --test, the test does not finish. I have to cancel it manually.
The test breaks with these lines
const actualEventPayload = await new Promise(async resolve => {
  stopObserving = await observeEvents(async newEvent => {
    resolve(newEvent);
  });
});
and I think that's because the Promise never resolves. I thought the stream parsing might have a bug but if you remove all the stream parsing stuff and replace
Promise.resolve().then(async () => {
  /* ... */
});
with
Promise.resolve().then(async () => {
  await onReceivedFn({ metadata: { type: 'entity-created', commandId: 'commandId' }});
});
it still doesn't work either. Does anyone know what's wrong or missing?
The problem here has nothing to do with your promise not resolving since you never even get to that point.
The problem here is that observeEvents is not yet initialized when the test is being run and thus throws a ReferenceError: Cannot access 'observeEvents' before initialization error.
To see that for yourself you can add a simple const it = (name, fn) => fn(); stub to the top of the file and run it without the --test.
There are multiple ways to fix this and the simplest one is to move the test function to the bottom of the file.
If you don't want to do that you can also define the observeEvents function like this: async function observeEvents(onReceivedFn) {...}. This way it will be available immediately.
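A minimal sketch of the difference (hypothetical names, just to illustrate the hoisting behaviour):

// Function declarations are hoisted, so calling one before the line
// where it is defined works:
hoisted(); // logs "ok"
async function hoisted() { console.log('ok'); }

// const/let bindings stay in the temporal dead zone until their
// declaration is evaluated, so this throws:
notHoisted(); // ReferenceError: Cannot access 'notHoisted' before initialization
const notHoisted = async function () { console.log('never reached'); };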

initialization in async reduce [duplicate]

This question already has answers here:
JavaScript array .reduce with async/await
(11 answers)
Closed 8 months ago.
const handleFileChange = async (e) => {
  const target = e?.target?.files;
  const attachments = await Array.from(target).reduce(async (acum, file) => {
    file.id = uniqid();
    // const format = file.name.split('.').pop();
    // if (IMAGE_FORMATS.includes(format)) {
    setIsLoading(true);
    if (file.type.startsWith('image/')) {
      const response = await channel.sendImage(file);
      file.src = response.file;
      acum.images.push(file);
    } else {
      const response = await channel.sendFile(file);
      file.src = response.file;
      acum.files.push(file);
    }
    setIsLoading(false);
    return acum;
  }, Promise.resolve({ files: [], images: [] }));
  setFilesList(prev => {
    console.log('files', [...prev, ...attachments.files]);
    return [...prev, ...attachments.files];
  });
  setImagesList(prev => {
    console.log('images', [...prev, ...attachments.images]);
    return [...prev, ...attachments.images];
  });
};
In the above code I got an error. It looks like it's caused by my initialization of the array, but how should I address it?
An async function returns a Promise, which makes it awkward to use with .reduce(), as you would need to await the accumulator on every iteration to get your data. As an alternative, you can create an array of Promises using the mapper function of Array.from() (which you can think of as a .map() applied directly after Array.from()). The idea is that the map triggers an asynchronous call for each file via sendImage/sendFile, and these calls run in parallel in the background. The value we return from the mapping function is a Promise that notifies us when the asynchronous call has completed (once it resolves). Moreover, the mapping function defines what the Promise resolves with, in our case the new object with the src property:
const isImage = file => file.type.startsWith('image/');

const filePromises = Array.from(target, async file => {
  const response = await (isImage(file) ? channel.sendImage(file) : channel.sendFile(file));
  return {...file, type: file.type, src: response.file};
});
Above filePromises is an array of Promises (as the async mapper function returns a Promise implicitly). We can use Promise.all() to wait for all of our Promises to resolve. This is faster than performing each asynchronous call one by one and only moving to the next once we've waited for the previous to complete:
setIsLoading(true); // set loading to `true` before we start waiting for our asynchronous work to complete
const fileObjects = await Promise.all(filePromises);
setIsLoading(false); // complete asynchronous loading/waiting
Lastly, fileObjects is an array that contains all objects, both files and images. We can do one iteration to partition this array into separate arrays, one for images and one for files:
const attachments = {files: [], images: []};
for (const fileObj of fileObjects) {
  if (isImage(fileObj))
    attachments.images.push(fileObj);
  else
    attachments.files.push(fileObj);
}
The reduce is not really necessary at this point; putting the pieces together gives something like the sketch below.
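A combined version of the handler, assuming the same channel API and the setIsLoading/setFilesList/setImagesList setters from the question (a sketch, not the answer's verbatim code):

const isImage = file => file.type.startsWith('image/');

const handleFileChange = async (e) => {
  const target = e?.target?.files ?? [];

  setIsLoading(true);
  // One Promise per file; the uploads run in parallel.
  const fileObjects = await Promise.all(
    Array.from(target, async file => {
      const response = await (isImage(file) ? channel.sendImage(file) : channel.sendFile(file));
      return { ...file, type: file.type, src: response.file };
    })
  );
  setIsLoading(false);

  // Partition the results into images and other files.
  const attachments = { files: [], images: [] };
  for (const fileObj of fileObjects) {
    if (isImage(fileObj)) attachments.images.push(fileObj);
    else attachments.files.push(fileObj);
  }

  setFilesList(prev => [...prev, ...attachments.files]);
  setImagesList(prev => [...prev, ...attachments.images]);
};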
Here is a solution with a map to transform the elements into promises and then Promise.all to wait for their execution:
const channel = {
  sendImage: async (file) => { return { file } },
  sendFile: async (file) => { return { file } }
}

const uniqid = () => Math.floor(Math.random() * 100);

const input = {
  target: {
    files: [{
      src: 'src',
      type: 'image/123'
    },
    {
      src: 'src',
      type: 'image/321'
    },
    {
      src: 'src',
      type: '123'
    },
    {
      src: 'src',
      type: '321'
    }]
  }
}

const setIsLoading = () => null;

const handleFileChange = async (e) => {
  const target = e?.target?.files;
  setIsLoading(true);
  const attachments = {
    images: [],
    files: [],
  }
  await Promise.all(Array.from(target).map((file) => {
    file.id = uniqid();
    return new Promise(async (resolve) => {
      if (file.type.startsWith('image/')) {
        const response = await channel.sendImage(file);
        file.src = response.file;
        attachments.images.push(file);
      } else {
        const response = await channel.sendFile(file);
        file.src = response.file;
        attachments.files.push(file);
      }
      resolve();
    });
  }));
  setIsLoading(false)
  return attachments;
};

handleFileChange(input).then(res => console.log(res))

I need help using double Promises in JavaScript

Here is the code that I tried.
// To get base64 code of file
const toBase64 = file => new Promise((resolve, reject) => {
  const reader = new FileReader();
  reader.readAsDataURL(file);
  reader.onloaded = () => resolve(reader.result.replace(/^data:.+;base64,/, ''));
  reader.onerror = error => reject(error);
})

// To make an array of files
const getAttachments = async files => {
  let documents;
  try {
    documents = files.map(async file => {
      let base64 = await toBase64(file)
      return {
        doc: base64,
        documentName: file.name,
        documentType: file.type
      }
    })
  } catch {
    console.error('Failed to get files as base64')
  }
  return Promise.resolve(documents)
}
I just tried to get an object array as a result by using the above two functions, like the following:
getAttachments(Array.from(event.target.files)).then(documents => {
  console.info(documents)
})
But the result logged in the console is an array of pending Promises, not the objects I want.
I'd love to know how I can get what I want.
Thanks.
Instead of returning an array of Promises, try returning an array of resolved values using the await keyword.
Try this:
const getAttachments = async files => {
  let documents;
  try {
    documents = files.map(async file => {
      let base64 = await toBase64(file)
      return {
        doc: base64,
        documentName: file.name,
        documentType: file.type
      }
    })
    return await Promise.all(documents);
  } catch {
    console.error('Failed to get files as base64')
  }
}
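Not part of the original answer, but worth noting: the toBase64 helper as posted assigns reader.onloaded, which is not a standard FileReader event (the events are load and loadend), so its Promise may never resolve. A sketch of the helper with that assumption fixed:

const toBase64 = file => new Promise((resolve, reject) => {
  const reader = new FileReader();
  // 'load' fires once the file has been read successfully
  reader.onload = () => resolve(reader.result.replace(/^data:.+;base64,/, ''));
  reader.onerror = () => reject(reader.error);
  reader.readAsDataURL(file);
})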

Write blocking JS code to write a CSV file (using csv-parser) after scraping data from the web

In the writeFile function, the records returned by makeCsv are empty (I used await, but they are still empty). How can I make all of the code inside makeCsv blocking, so that I get all entries in the records array when I call makeCsv?
Expected code flow:
makeCsv -> reads a CSV from local storage and calls getPubkey
getPubkey -> fetches a key for the account name passed by makeCsv by making a request to a URL, and returns that key
makeCsv -> appends a key property to each data object, pushes it to the results array and returns the array
writeFile -> calls makeCsv and takes the array to write a new CSV with the keys included
Issue: as soon as the calls to getPubkey are made, the end event is triggered, resolving the promise with an empty array. writeFile therefore receives an empty array because it runs before makeCsv has finished making all the requests and adding the keys to the data objects.
Code
const csv = require('csv-parser')
const createCsvWriter = require('csv-writer').createObjectCsvWriter;
const fetch = require('node-fetch');
const fs = require('fs');

let nkas = []

async function makeCsv(results) {
  const readable = fs.createReadStream('/home/vector/Desktop/test.csv')
  return new Promise((resolve, reject) => {
    readable.pipe(csv(['acc', 'balance']))
      .on('data', async (data) => {
        data.key = await getPubkey(data.acc)
        results.push(data)
      })
      .on('end', () => {
        return resolve(results);
      });
  })
}

async function getPubkey(account) {
  const body = { account_name: account }
  let key;
  await fetch('https://eos.greymass.com/v1/chain/get_account', {
    method: 'post',
    body: JSON.stringify(body),
    headers: { 'Content-Type': 'application/json' },
  })
    .then(res => res.json())
    .then(json => key = json.permissions[1].required_auth.keys[0].key)
    .catch(err => console.log(err))
  if (key && key.length > 0)
    return key
  else
    nkas.push(account);
  console.log(nkas);
}

async function writeFile() {
  const csvWriter = createCsvWriter({
    path: 'out.csv',
    header: [
      { id: 'acc', title: 'Account' },
      { id: 'balance', title: 'Balance' },
      { id: 'key', title: 'PubKey' }
    ]
  });
  let records = []
  records = await makeCsv(records)
  console.log(records)
  csvWriter.writeRecords(records) // returns a promise
    .then(() => {
      console.log('...Done');
    });
}

writeFile();
You need to wait for the getPubkey() promises to resolve before resolving the makeCsv() promise:
async function makeCsv() {
  const readable = fs.createReadStream('/home/vector/Desktop/test.csv')
  const pending = [];
  return new Promise((resolve, reject) => {
    readable.pipe(csv(['acc', 'balance']))
      .on('data', (data) => {
        const getKey = async (data) => {
          data.key = await getPubkey(data.acc)
          return data
        }
        pending.push(getKey(data))
      })
      .on('end', () => {
        Promise.all(pending).then(results => resolve(results));
      });
  })
}
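With this version makeCsv builds its own result array, so the results parameter is no longer needed; writeFile can then simply do (a small adjustment to the question's code):

const records = await makeCsv()
console.log(records)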

Why is my object's value a function and not a string?

I have an update function for an event. The user may or may not have added a teaser video. If they have, I want to upload it and save the event object with the download URL.
I use a different function to upload the video and only call it if the user attached a video.
But when I try to update, it tells me that I'm trying to write an invalid object, because the data in teaser is a function where I want it to be a string (either the download URL or just an empty string).
What am I doing wrong?
This is how I call the function:
updateEvent(values, videoAsFile, () => {
  setIsEventEdited(false);
  setCurrentEvent(values);
})
Then these are the functions:
const uploadTeaserVideo = (event, video) => async () => {
  const ref = storage.ref(`/videos/events/${event.id}/`);
  const upload = await ref.put(video);
  if (!upload) return "";
  const downloadUrl = await ref.getDownloadURL();
  if (!downloadUrl) return "";
  return downloadUrl;
};

export const updateEvent = (values, teaserVideo, cb) => async () => {
  if (teaserVideo) {
    const teaser = await uploadTeaserVideo(values, teaserVideo);
    db.collection("events")
      .doc(values.id)
      .set({ ...values, teaser })
      .then(() => {
        cb();
      });
  } else {
    db.collection("events")
      .doc(values.id)
      .set(values)
      .then(() => {
        cb();
      });
  }
};
I've checked, and teaserVideo is a valid video file, or null if a video wasn't chosen.
uploadTeaserVideo is defined as a function that returns a function:
// vv--------------- Start of the uploadTeaserVideo function body
const uploadTeaserVideo = (event, video) => async () => {
  //                                        ^^--- Start of the body of the function it returns
  const ref = storage.ref(`/videos/events/${event.id}/`);
  const upload = await ref.put(video);
  if (!upload) return "";
  const downloadUrl = await ref.getDownloadURL();
  if (!downloadUrl) return "";
  return downloadUrl;
};
I suspect you meant it to be an async function that returns (a promise of) downloadUrl:
const uploadTeaserVideo = async (event, video) => {
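  // (body unchanged from the code above; only the extra () => wrapper is removed)
  const ref = storage.ref(`/videos/events/${event.id}/`);
  const upload = await ref.put(video);
  if (!upload) return "";
  const downloadUrl = await ref.getDownloadURL();
  if (!downloadUrl) return "";
  return downloadUrl;
};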
