I'm trying to overcome a "Maximum call stack size exceeded" error, but with no luck.
The goal is to re-run the GET request until I get 'music' in the type field.
// tech: node.js + mongoose
// import components
const https = require('https');
const options = new URL('https://www.boredapi.com/api/activity');

// obtain data using GET
https.get(options, (response) => {
  // console.log('statusCode:', response.statusCode);
  // console.log('headers:', response.headers);
  response.on('data', (data) => {
    // process.stdout.write(data);
    apiResult = JSON.parse(data);
    apiResultType = apiResult.type;
    returnDataOutside(data);
  });
})
  .on('error', (error) => {
    console.error(error);
  });

function returnDataOutside(data) {
  apiResultType;
  if (apiResultType == 'music') {
    console.log(apiResult);
  } else {
    returnDataOutside(data);
    console.log(apiResult); // Maximum call stack size exceeded
  }
}
Your function returnDataOutside() is calling itself recursively. If it doesn't get an apiResultType of 'music' the first time, it just keeps calling itself deeper and deeper until the stack overflows, with no chance of ever getting the music type because you're calling it with the same data over and over.
It appears that you want to rerun the GET request when you don't have the music type, but your code is not doing that - it's just calling your response-handling function over and over. Instead, you need to put the code that makes the GET request into a function and, when the apiResultType isn't what you want, call that function so it actually makes a fresh GET request.
In addition, you shouldn't code something like this to keep going forever, hammering some server. You should have either a maximum number of attempts, a back-off timer, or both.
And, you can't just assume that a single 'data' event contains a perfectly formed piece of JSON. If the response is anything but very small, the data may arrive in arbitrarily sized chunks, and it may take multiple data events to deliver your entire payload. Code like this may work on fast networks but fail on slow networks or through some proxies. Instead, you have to accumulate the data from the entire response (all the data events concatenated together) and then process that final result on the 'end' event.
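Putting that together, here's a minimal sketch (not your exact code) that wraps the GET in a function, accumulates the full response body, and retries with a capped, growing delay:

const https = require('https');

// Sketch: issue a fresh GET per attempt, accumulate the full body,
// and stop after a bounded number of attempts with a growing delay.
function fetchUntilMusic(attempt = 1, maxAttempts = 10) {
  https.get('https://www.boredapi.com/api/activity', (response) => {
    let body = '';
    response.on('data', (chunk) => { body += chunk; });  // accumulate chunks
    response.on('end', () => {
      const apiResult = JSON.parse(body);                // parse once, on 'end'
      if (apiResult.type === 'music') {
        console.log(apiResult);
      } else if (attempt < maxAttempts) {
        // back off instead of hammering the server
        setTimeout(() => fetchUntilMusic(attempt + 1, maxAttempts), attempt * 500);
      } else {
        console.error('No music activity after ' + maxAttempts + ' attempts');
      }
    });
  }).on('error', (error) => console.error(error));
}

fetchUntilMusic();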
While you can code plain https.get() to collect all the results for you (there's an example of that right in the doc here), it's a lot easier to use a higher-level library that brings support for a bunch of useful things.
My favorite library in this regard is got(), but there's a list of alternatives here and you can find the one you like. Not only do these libraries accumulate the entire response for you without you writing any extra code, but they are promise-based, which makes the asynchronous coding easier, and they also automatically check status codes, follow redirects, etc. - many things you would want an HTTP request library to "just handle" for you.
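With got, for example, the same loop collapses to a few lines (a sketch using got v11's CommonJS API and its .json() helper; newer versions of got are ESM-only):

const got = require('got');

async function fetchMusic(maxAttempts = 10) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    // got accumulates the body, checks the status code, and parses the JSON
    const apiResult = await got('https://www.boredapi.com/api/activity').json();
    if (apiResult.type === 'music') return apiResult;
    await new Promise((resolve) => setTimeout(resolve, attempt * 500)); // back off
  }
  throw new Error('No music activity after ' + maxAttempts + ' attempts');
}

fetchMusic().then(console.log).catch(console.error);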
I am having some issues trying to connect to a Matrix server using the matrix-js-sdk in a React app.
I have provided a simple code example below, and made sure that the credentials are valid (login works) and that the environment variable containing the URL for the matrix client is set. I have signed into Element in a browser and created two rooms for testing purposes, and was expecting these two rooms to be returned from matrixClient.getRooms(). However, this simply returns an empty array. With some further testing, it seems like only the asynchronous functions provided for fetching room, member, and group IDs work as expected.
According to https://matrix.org/docs/guides/usage-of-the-matrix-js-sd these should be valid steps for setting up the matrix-js-sdk; however, the sync never executes either.
const matrixClient = sdk.createClient(
  process.env.REACT_APP_MATRIX_CLIENT_URL!
);

await matrixClient.login("m.login.password", credentials);

matrixClient.once('sync', () => {
  debugger; // Never hit
});

for (const room of matrixClient.getRooms()) {
  debugger; // Never hit
}
I did manage to use the roomIds returned from await matrixClient.roomInitialSync(roomId, limit, callback), but this led me to another issue where I can't figure out how to decrypt messages, as the events containing the messages sent in the room seem to be of type 'm.room.encrypted' instead of 'm.room.message'.
Does anyone have any good examples of working implementations of the matrix-js-sdk, or any other good resources for properly understanding how to put this all together? I need to be able to load rooms, people, messages, etc. and display these respectively in a ReactJS application.
It turns out I simply forgot to run startClient on the matrix client, resulting in it not fetching any data.
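For reference, a minimal sketch of the working sequence (same setup as above, following the pattern in the matrix-js-sdk guide):

await matrixClient.login("m.login.password", credentials);

// startClient begins the sync loop; without it, no rooms or events are ever fetched
await matrixClient.startClient();

matrixClient.once('sync', (state) => {
  if (state === 'PREPARED') {
    console.log(matrixClient.getRooms()); // now populated
  }
});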
I have two API URLs to hit. One is known to be fast (~50-100ms), the other known to be slow (~1s). I use the results of these to display product choices to the user. Currently I await-download one, then do the second. That's effectively sequential, and it adds the fast request's 50-100ms to the already-slow second hit.
I would like to:
Send both requests at once
Start processing data as soon as one request comes back
Wait for both requests before moving on from there.
I've seen the example Axios gives...
axios.all([getUserAccount(), getUserPermissions()])
  .then(axios.spread(function (acct, perms) {
    // Both requests are now complete
  }));
But this appears to wait for both URLs to complete. This would still be marginally faster, but I want the data from my 50ms API hit to start showing as soon as it's ready.
For sure you can chain additional .thens to the promises returned by axios:
Promise.all([
  getUserAccount().then(processAccount),
  getUserPermissions().then(processPermissions)
]).then(([userAccount, permissions]) => {
  // ...
});
where processAccount and processPermissions are functions that take the axios response object as an argument and return the wanted results.
For sure you can also add multiple .thens to the same promise:
const account = getUserAccount();
const permissions = getUserPermissions();

// Show permissions when ready
permissions.then(processPermissions);

Promise.all([account, permissions])
  .then(([account, permissions]) => {
    // Do stuff when both are ready
  });
I replaced axios.all with Promise.all - I don't know why axios provides that helper, as JS has a native implementation for it. I tried consulting the docs ... but they don't even document that API.
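The same idea reads a bit more naturally with async/await (a sketch to run inside an async function; showPermissions is a placeholder for whatever rendering you do with the early result):

// Start both requests immediately; nothing is awaited yet,
// so the slow request doesn't wait on the fast one.
const accountPromise = getUserAccount().then(processAccount);
const permissionsPromise = getUserPermissions().then(processPermissions);

// Render the fast result as soon as it resolves...
permissionsPromise.then(showPermissions);

// ...but still wait for both before moving on.
const [account, permissions] = await Promise.all([accountPromise, permissionsPromise]);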
I have a web worker, but because I can't post an object with methods (functions) to it, I don't know how to stop this code from blocking the UI:
if (data != 'null') {
  obj['backupData'] = obj.tbl.data().toArray();
  obj['backupAllData'] = data[0];
}

obj.tbl.clear();
obj.tbl.rows.add(obj['backupAllData']);

var ext = config.extension.substring(1);
$.fn.dataTable.ext.buttons[ext + 'Html5'].action(e, dt, button, config);

obj.tbl.clear();
obj.tbl.rows.add(obj['backupData']);
This code exports records from an HTML table. data is an array returned from a web worker and can sometimes have 50k or more objects.
As obj and all the methods it contains are not transferable to the web worker, when the data length is 30k, 40k, 50k, or even more, the UI blocks.
Which is the best way to do this?
Thanks in advance.
You could try wrapping the heavy work in an asynchronous callback, like a timeout, to allow the engine to queue the whole job and run it as soon as it has time:
setTimeout(function () {
  if (data != 'null') {
    obj['backupData'] = obj.tbl.data().toArray();
    obj['backupAllData'] = data[0];
  }
  // heavy stuff
}, 0);
Or, if the operation is extremely long, you can work out a strategy to split your code into chunks of work and execute each chunk in a separate asynchronous callback (timeout); see the sketch below.
Best way to iterate over an array without blocking the UI
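A sketch of that chunking strategy (the chunk size and the DataTables calls are placeholders; tune them to your table):

// Process a large array in slices, yielding to the event loop between
// slices so the UI can repaint and stay responsive.
function processInChunks(rows, chunkSize, handleChunk, done) {
  var index = 0;
  (function next() {
    handleChunk(rows.slice(index, index + chunkSize));
    index += chunkSize;
    if (index < rows.length) {
      setTimeout(next, 0); // yield before the next slice
    } else {
      done();
    }
  })();
}

// e.g. re-add 50k backed-up rows 1000 at a time
processInChunks(obj['backupData'], 1000, function (chunk) {
  obj.tbl.rows.add(chunk);
}, function () {
  obj.tbl.draw();
});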
Update:
Sadly, ImmutableJS doesn't work across web workers at the moment. You should be able to transfer an ArrayBuffer so you don't need to parse it back into an array. Also read this article. If your workload is that heavy, it would be best to actually send back one item at a time from the worker.
Previously:
The code is converting all the data into an array, which is immediately costly. Try returning an immutable data structure from the web worker if possible. This will guarantee that it doesn't change when the references change, and you can continue iterating over it slowly in batches.
The next thing you can do is to use requestIdleCallback to schedule small batches of items to be processed.
This way you should be able to make the UI breathe a bit.
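A sketch of that requestIdleCallback scheduling (handleItem is a placeholder for your per-row work; the API is available in Chrome, so check support elsewhere):

// Drain a queue of items only while the browser reports idle time,
// rescheduling the remainder for the next idle period.
function processQueue(queue, handleItem) {
  requestIdleCallback(function (deadline) {
    while (deadline.timeRemaining() > 0 && queue.length > 0) {
      handleItem(queue.shift());
    }
    if (queue.length > 0) {
      processQueue(queue, handleItem); // more work left; wait for the next idle slot
    }
  });
}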
I'm using the spawn-child npm package to spawn a shell where I run a binary file originally built in C++. I provide stdin input to the binary, and the binary then emits stdout output constantly, every second. On the Node side, once I start receiving the stdout data, I have an on listener that looks something like stdout.on('data', function (data) {}), where I send the data to the SSE channel.
Everything is working fine, but the major concern is the constant memory growth of the Node process that I see every time I hit the binary with stdin input. I have outlined how my code looks below; is there an elegant way to control this memory growth? If so, please share.
var sseChannel = require('sse-channel'),
    spawnCommand = require('spawn-command'),
    cmd = 'path to the binary file',
    globalArray = [],
    uuid = require('uuid');

module.exports = function (app) {
  var child = spawnCommand(cmd),
      privateChannel = new sseChannel({
        historySize: 0,
        cors: {
          origins: ['*']
        },
        pingInterval: 15 * 1000,
        jsonEncode: false
      });

  srvc = {
    get: function (req, res) {
      globalArray[uuid.v4()] = res;
      child.stdin.write('a json object in a format that is expected by binary' + '\n'); // req.query.<queryVal>
      child.stdout.on('data', function (data) {
        privateChannel.send(JSON.stringify(data));
      });
    },
    delete: function (sessionID) {
      var response = globalArray[sessionID];
      privateChannel.removeClient(response);
      response.end();
      delete globalArray[sessionID];
    }
  };
};
This code is just meant to show how it looks in the app; hitting the "Run code snippet" button would not work in this case.
I collected heap dumps at two different intervals, and the statistics show a tremendous increase in the Typed Array value. What could be done to contain or suppress the growth of the Typed Array?
The problem is that you're spawning a process once and then adding a new data event handler for every request to your http server that never gets removed. So this would explain why the memory usage never drops even after gc.
Another (unrelated) problem is that if you are using your single child process to process multiple incoming requests, you can run into the problem of mixing responses for different requests (you cannot assume that one data event will contain only the data for a particular request). If the child process is node.js-based, you could set up an ipc channel with it and then just pass regular JavaScript values back and forth instead of setting up stdout handling/parsing. If the child isn't node.js-based or you want an alternative (no-ipc) solution, you could set up a queue that all requests get pushed onto and then have a function that processes the queue and responds to each request serially (only moving onto the next request once you have somehow determined you have received all output from the child process for the current request).
If you instead meant for the child process to only be used for a single request, you will need to tweak your code to spawn once per request instead (moving spawn() inside get()).
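Here's a sketch of the first fix, assuming the single shared child process is intentional: register the stdout handler once, outside get(), so each request no longer adds another listener:

module.exports = function (app) {
  var child = spawnCommand(cmd),
      privateChannel = new sseChannel({ historySize: 0, pingInterval: 15 * 1000 });

  // one handler for the lifetime of the child, registered once
  child.stdout.on('data', function (data) {
    privateChannel.send(JSON.stringify(data));
  });

  srvc = {
    get: function (req, res) {
      globalArray[uuid.v4()] = res;
      child.stdin.write('a json object in a format that is expected by binary' + '\n');
      // no child.stdout.on(...) here
    }
  };
};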
I am working on a project aimed at the Chrome browser. The goal we would like to accomplish is to get a one-million-record array into the browser to work with the data. When I generated a test file containing a million records, it was a bit more than one gigabyte.
For reasons I will explain, I believe we can accomplish the goal if we can get the browser to collect the garbage when necessary. I believe the browser holds on to the text of the AJAX responses when it doesn't need to, and crashes for that reason.
Now, I can generate a million records within the browser and manipulate it as I need to. However, I have trouble sending the AJAX to the browser without crashing it.
Since sending one million crashes it, I tried sending batches of one hundred thousand. I can get two such batches across and parse the JSON. If I do not have an onreadystatechange on my AJAX call, I can make the call a number of times. Also, if I receive a hundred thousand records, I can go over them ten times and build the full array.
Because I seem to be able to actually hold one million records, I believe that, as I said, holding the response texts is overwhelming the browsers.
In order to get better memory management, I have pushed the AJAX requests and parsing into a web worker. When the web worker gets the AJAX response and builds the hundred-thousand-record array, it pushes it to the DOM thread. When the DOM thread has taken the data, it has the web worker do another AJAX request.
However, it still crashes.
I am open to using websockets or something else, if that would help somehow.
Here is the code in the DOM thread:
var iterations = 3;
var url = 'hunthou.json';
var worker = new Worker('src/worker.js');
var count = 0;
var bigArr = []; // accumulates one chunk per worker message

worker.addEventListener('message', function (e) {
  alert('count: ' + count);
  //bigArr = bigArr.concat(e.data);
  console.log('e.data length: ' + e.data.length);
  bigArr[count] = e.data;
  console.log('bigArr length: ' + bigArr.length);
  if (count < (iterations - 1)) {
    worker.postMessage(url);
  } else {
    alert('done');
    console.log('done');
    worker.terminate();
    console.log('bye');
  }
  count++;
});

worker.postMessage(url);
Here is the web worker:
var arr = [];
var request = new XMLHttpRequest();

request.onreadystatechange = function () {
  var DONE = this.DONE || 4;
  if (this.readyState === DONE) {
    arr = JSON.parse(request.responseText);
    self.postMessage(arr);
    arr.length = 0;
    request.responseText.length = 0; // note: responseText is read-only, so this has no effect
    console.log('okay');
  }
};

self.addEventListener('message', function (e) {
  var url = e.data;
  console.log('url: ' + url);
  request.open("GET", '../' + url, true);
  request.send(null);
}, false);
Instead of requesting all the data at once, you can create multiple requests that each retrieve a chunk of the data. Retrieving it in chunks rather than in one huge response will prevent your browser from crashing.
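For example (a sketch that assumes the server can page results via offset/limit query parameters, which your hunthou.json endpoint may not support; adjust to your API):

var CHUNK = 100000;   // records per request
var TOTAL = 1000000;  // expected total
var chunks = [];      // keep an array of chunks rather than one giant array

function fetchChunk(offset) {
  var request = new XMLHttpRequest();
  request.onreadystatechange = function () {
    if (this.readyState === 4) {
      chunks.push(JSON.parse(request.responseText));
      if (offset + CHUNK < TOTAL) {
        fetchChunk(offset + CHUNK); // request the next page only after this one is parsed
      } else {
        console.log('all chunks loaded');
      }
    }
  };
  request.open('GET', 'records?offset=' + offset + '&limit=' + CHUNK, true);
  request.send(null);
}

fetchChunk(0);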