Stringify error with parallel.js - javascript

I am trying to parallelize a JavaScript project using parallel.js, but I am running into what looks like a parsing problem.
The goal of the project is to have 9 ticket-sellers process customers in parallel. We have working code right now, but the processing of the customers is still only asynchronous, and we are trying to achieve the parallelization part using parallel.js.
My original code in JS before parallelizing is:
var ticketers = ["H", "M1", "M2", ...]; // There are 9 ticket-sellers
for (var i = 0; i < ticketers.length; i++) {
    ticketerBehavior(i);
}
Where ticketers are the 9 ticket-sellers, and ticketerBehavior() is the function that takes care of what happens when a ticket-seller receives a customer.
Using the parallel.js documentation and examples, this is what we tried out:
var ticketBehav = function(ticketers) {
    for (var i = 0; i < ticketers.length; i++)
        ticketerBehavior(i);
};

var p = new Parallel(100);
p.spawn(ticketBehav(ticketers)).then(console.log(ticketBehav(ticketers)));
However, when we run the program, it gives us this error:
/home/user/node_modules/paralleljs/lib/parallel.js:106
return preStr + 'process.on("message", function(e) {process.send(JSON.stringify((' + cb.toString() + ')(JSON.parse(e).data)))})';
^
TypeError: Cannot read property 'toString' of undefined
at Parallel.getWorkerSource (/home/user/node_modules/paralleljs/lib/parallel.js:106:91)
Googling the error has not turned up any results so far. I am a little confused about how to troubleshoot it, since our data is technically serializable as JSON (it is only strings). Could a version incompatibility also cause issues when running parallel.js?
Could anyone please advise me on where to start troubleshooting? Any tips/advice would be appreciated. Thank you!

The error happens because p.spawn() expects a function, but ticketBehav(ticketers) calls the function immediately, so spawn receives its return value (undefined) and then fails at cb.toString().
Try this:
Pass your ticketers as the data for the Parallel instance
For each ticketer, execute your function inside the spawned worker
Then return your data
var p = new Parallel(ticketers);
p.spawn(function (ticketers) { return ticketBehav(ticketers); }).then(function (data) { console.log(data); });
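For reference, a minimal self-contained sketch of the same idea. One caveat worth knowing: parallel.js serializes the spawned function with toString() (the very line shown in the error), so the worker function cannot call a ticketerBehavior defined outside of it; the per-ticketer work has to be inlined (shown here only as a placeholder).
var Parallel = require('paralleljs');

var ticketers = ["H", "M1", "M2"]; // shortened list for the example

var p = new Parallel(ticketers);
p.spawn(function (ticketers) {
    // everything used here must be defined inside this function,
    // because it is stringified and re-evaluated in the worker
    var processed = [];
    for (var i = 0; i < ticketers.length; i++) {
        processed.push(ticketers[i] + ": done"); // placeholder for the real ticketerBehavior logic
    }
    return processed;
}).then(function (data) {
    console.log(data); // ["H: done", "M1: done", "M2: done"]
});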

Related

Cannot get ExecuteScriptAsync() to work as expected

I'm trying to set a HTML input to read-only using ExecuteScriptAsync. I can make it work, but it's not an ideal scenario, so I'm wondering if anyone knows why it doesn't work the way I would expect it to.
I'm using Cef3, version 63.
I tried to see if it's a timing issue, but it doesn't appear to be.
I tried invalidating the view of the browser but that doesn't seem to help.
The code I currently have, which works:
public void SetReadOnly()
{
    var script = @"
        (function(){
            var labelTags = document.getElementsByTagName('label');
            var searchingText = 'Notification Initiator';
            var found;
            for (var i = 0; i < labelTags.length; i++)
            {
                if (labelTags[i].textContent == searchingText)
                {
                    found = labelTags[i];
                    break;
                }
            }
            if (found)
            {
                found.innerHTML = 'Notification Initiator (Automatic)';
                var input = found.nextElementSibling;
                if (input)
                {
                    input.setAttribute('readonly', 'readonly');
                }
            }
        })()
    ";

    _viewer.Browser.ExecuteScriptAsync(script);
    _viewer.Browser.ExecuteScriptAsync(script);
}
Now, if I remove
found.innerHTML='Notification Initiator (Automatic)';
the input is no longer shown as read-only. The HTML source of the loaded webpage does show it as read-only, but it seems like the frame doesn't get re-rendered once that property is set.
Another issue is that I'm executing the script twice. If I run it only once I don't get the desired result. I'm thinking this could be a problem with the V8 context that is required for the script to run. Apparently running a script creates the context, so that could be why running it twice works.
I have been trying to figure this out for hours, haven't found anything that would explain this weird behaviour. Does anyone have a clue?
Thanks!

Zapier: Task timed out after 1.00 seconds

I am using the Zapier Code app with JavaScript. I am making a request to an API, but on almost every attempt to execute the script I get the error message: "We had trouble sending your test through. Please try again. Error:
2018-03-09T14:32:54.748Z c0958e0a-23a6-11e8-9be1-a515bc24f853 Task timed out after 1.00 seconds". Sometimes the script executes successfully, but most of the time it gives this error.
The code that calls the API is this:
var promises = [];
var retornoDaChamada;
promises.push(fetch(urls));

Promise.all(promises).then(function (res) {
    var blobPromises = [];
    for (var i = res.length - 1; i >= 0; i--) {
        blobPromises.push(res[i].text());
    }
    return Promise.all(blobPromises);
}).then(function (body) {
    retornoDaChamada = JSON.parse(body);
    var titulosDaApi = [retornoDaChamada.length];
    var duracao = [retornoDaChamada.length];
    var ids = [retornoDaChamada.length];
    for (var i = 0; i < retornoDaChamada.length; i++) {
        titulosDaApi[i] = retornoDaChamada[i].title;
        duracao[i] = milissegundosParaHorasMinutosSegundos(retornoDaChamada[i].files[0].fileInfo.duration);
        ids[i] = retornoDaChamada[i].id;
    }
    var output = { titulosDaApi, duracao, ids };
    callback(null, output);
}).catch(callback);
I read the documentation for the Code app, and from what I understood a free user only gets up to 1 second for API calls. Is there any way to get around this problem while staying on the free plan?
David here, from the Zapier Platform team.
It's a little tough to understand the context of your code, but it looks like you're doing multiple HTTP requests. By their nature those are slow operations. If you're doing more than one (maybe two, if the external resource responds really quickly), you're unlikely to fit everything into 1 second.
Sorry for the bad news!
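That said, if the step really only needs the single request shown in the question, you can at least drop the Promise.all wrappers around that one fetch so the handler does as little extra work as possible inside the 1-second budget. A minimal sketch, assuming urls is a single endpoint URL and callback is the callback Zapier provides to the code step; it will not help if the request itself takes longer than the limit:
fetch(urls)
    .then(function (res) { return res.text(); })
    .then(function (body) {
        var retornoDaChamada = JSON.parse(body);
        var titulosDaApi = [];
        var duracao = [];
        var ids = [];
        for (var i = 0; i < retornoDaChamada.length; i++) {
            titulosDaApi.push(retornoDaChamada[i].title);
            duracao.push(milissegundosParaHorasMinutosSegundos(retornoDaChamada[i].files[0].fileInfo.duration));
            ids.push(retornoDaChamada[i].id);
        }
        callback(null, { titulosDaApi: titulosDaApi, duracao: duracao, ids: ids });
    })
    .catch(callback);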

Saving To MongoDB In A Loop

I am having trouble saving new records to MongoDB. I am pretty sure there is something in my code that I don't fully understand, and I was hoping someone might be able to help.
I am trying to save a new record to MongoDB for each of the cats. This code is for Node.js:
for (var x = 0; x < (cats.length - 1); x++) {
    if (!blocked) {
        console.log("x = " + x);
        var memberMessage = new Message();
        memberMessage.message = message.message;
        memberMessage.recipient = room[x].userId;
        memberMessage.save(function (err) {
            if (err) console.log(err);
            console.log(memberMessage + " saved for " + cats[x].name);
        });
    }
}
I log the value of cats before the loop and I do get all the names I expect, so I would think that looping through the array would store a new record on each iteration.
What seems to happen is that when I look at the database, it seems to have saved only the last record of the loop. I don't know how/why it would be doing that.
Any help on this is appreciated because I'm new to Node.js and MongoDB.
Thanks.
That's because save is an I/O operation, which is async, while the for loop itself is sync.
Think of it this way: the JS engine executes each line it sees serially. When it reaches save, it hands the work off to be completed later (it is I/O and would take time) and carries on with the rest of the loop. The save callbacks only get to run after the loop, and everything else already queued, has finished. By that time the loop variable x (and the function-scoped memberMessage) holds its final value, so every callback reports the same, last record, which is why it looks like only the last value was saved.
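To see the same effect in isolation, a quick sketch with no Mongo involved:
for (var x = 0; x < 3; x++) {
    // each callback runs only after the loop has finished,
    // so all of them see the final value of x
    setTimeout(function () {
        console.log(x); // prints 3, 3, 3
    }, 0);
}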
To fight this tragedy, you can use multiple methods:
Closures - Read More
You can create a closure per iteration, for example with cats.forEach() instead of the for loop (see the sketch after this list)
Promises - Read More. There is a sweet library which promisifies the mongo driver to make it easier to work with.
Generators, etc. - Read More. Not ready for primetime yet.
Note about #2 - I'm not a contributor to the project, but I do work with the author. I've been using the library for well over a year now, and it's fast and awesome!
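A minimal sketch of the closure/forEach approach (option #1), keeping the field assignments from the question and assuming blocked, room, message and Message are defined as they are there:
cats.forEach(function (cat, x) {
    if (!blocked) {
        var memberMessage = new Message();
        memberMessage.message = message.message;
        memberMessage.recipient = room[x].userId;
        memberMessage.save(function (err) {
            if (err) return console.log(err);
            // cat and memberMessage are captured per iteration,
            // so the log refers to the record that was actually saved
            console.log(memberMessage + " saved for " + cat.name);
        });
    }
});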
You can use the batch create feature from Mongoose (note the new document gets its own variable name here, so it doesn't shadow the outer message object):
var messages = [];
for (var x = 0; x < (cats.length - 1); x++) {
    if (!blocked) {
        var memberMessage = new Message();
        memberMessage.message = message.message;
        memberMessage.recipient = room[x].userId;
        messages.push(memberMessage);
    }
}
Message.create(messages, function (err) {
    if (err) // ...
});

Error handling "Out of Memory Exception" in browser?

I am debugging a JavaScript/HTML5 web app that uses a lot of memory. Occasionally I get an error message in the console window saying
"uncaught exception: out of memory".
Is there a way for me to gracefully handle this error inside the app?
Ultimately I need to re-write parts of this to prevent this from happening in the first place.
This usually means window.localStorage is full. You should calculate the size of your localStorage; as a check, try to add something to it and see whether the size changes:
var localStorageSpace = function () {
    var allStrings = '';
    for (var key in window.localStorage) {
        if (window.localStorage.hasOwnProperty(key)) {
            allStrings += window.localStorage[key];
        }
    }
    return allStrings ? 3 + ((allStrings.length * 16) / (8 * 1024)) + ' KB' : 'Empty (0 KB)';
};

var storageIsFull = function () {
    var size = localStorageSpace(); // old size
    // try to add data
    try {
        window.localStorage.setItem("test-size", "1");
    } catch (er) {}
    // check if the data was added
    var isFull = (size === localStorageSpace());
    window.localStorage.removeItem("test-size");
    return isFull;
};
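A small usage sketch, assuming the out-of-memory errors really are triggered by localStorage writes (appState here is a hypothetical object your app wants to persist):
if (storageIsFull()) {
    // stop writing and free something up before trying again
    console.warn('localStorage is full; skipping write');
} else {
    window.localStorage.setItem('app-state', JSON.stringify(appState));
}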
I also got the same error message recently while working on a project with lots of JS and JSON being sent, and the solution I found was to change input type="submit" to input type="button". I know there are limitations to using input type="button", and the solution looks weird, but if your application uses Ajax with JS/JSON data, you can give it a try. Thanks.
I faced the same problem in Firefox. Later I realised I was trying to reload an HTML page before some data had been set into local storage inside an if block. So you need to take care of that, and also check whether an ID is repeated somewhere.
The same thing was working fine in Chrome, though. Maybe Chrome is just more forgiving.

How to structure my code to return a callback?

So I've been stuck on this for quite a while. I asked a similar question here: How exactly does done() work and how can I loop executions inside done()?
but I guess my problem has changed a bit.
So the thing is, I'm loading a lot of streams and it's taking a while to process them all. To make up for that, I want to at least load the streams that have already been processed onto my webpage, and continue processing the stream of tweets at the same time.
loadTweets: function (username) {
    $.ajax({
        url: '/api/1.0/tweetsForUsername.php?username=' + username
    }).done(function (data) {
        var json = jQuery.parseJSON(data);
        var jsonTweets = json['tweets'];
        $.Mustache.load('/mustaches.php', function () {
            for (var i = 0; i < jsonTweets.length; i++) {
                var tweet = jsonTweets[i];
                var optional_id = '_user_tweets';
                $('#all-user-tweets').mustache('tweets_tweet', { tweet: tweet, optional_id: optional_id });
                configureTweetSentiment(tweet);
                configureTweetView(tweet);
            }
        });
    });
}};
}
This is pretty much the structure of my code right now. I guess the problem is the for loop, because nothing is displayed until the for loop is done. So I have two questions.
How can I get the stream of tweets to display on my website as they're processed?
How can I make sure the Mustache.load() is only executed once while doing this?
The problem is that UI manipulation and JS operations all run on the same thread. To solve this, you can use a timer (setTimeout/setInterval) so that each chunk of JS work is queued behind any pending UI updates. You can also pass a parameter for the time interval (around 4 ms) so that browsers with a slower JS engine can still perform smoothly.
...
var i = 0;
var timer = setInterval(function () {
    var tweet = jsonTweets[i++];
    var optional_id = '_user_tweets';
    $('#all-user-tweets').mustache('tweets_tweet', {
        tweet: tweet,
        optional_id: optional_id
    });
    configureTweetSentiment(tweet);
    configureTweetView(tweet);
    if (i === jsonTweets.length) {
        clearInterval(timer);
    }
}, 4); // interval between loading tweets
...
NOTE
The solution is based on the following assumptions:
You are manipulating the DOM in the configureTweetSentiment and configureTweetView methods.
Ideally the solution provided above would not be the best one. Instead you should build all the HTML in JavaScript first and append the final HTML string to a div once, at the end; you would see a drastic change in performance (seriously!). A sketch of this follows below.
You don't want to use web workers because they are not supported in old browsers. If that's not the case and you are not manipulating the DOM in the configure methods, then web workers are the way to go for data-intensive operations.
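A minimal sketch of that build-once approach, assuming the jQuery-Mustache plugin's $.Mustache.render(templateName, data) is available for rendering a template to a string, and that the configure* calls are safe to run after the markup is already in the DOM:
$.Mustache.load('/mustaches.php', function () {
    var html = '';
    for (var i = 0; i < jsonTweets.length; i++) {
        // build one big HTML string instead of appending tweet by tweet
        html += $.Mustache.render('tweets_tweet', {
            tweet: jsonTweets[i],
            optional_id: '_user_tweets'
        });
    }
    // a single DOM append instead of one per tweet
    $('#all-user-tweets').append(html);
    for (var j = 0; j < jsonTweets.length; j++) {
        configureTweetSentiment(jsonTweets[j]);
        configureTweetView(jsonTweets[j]);
    }
});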
