I have a blob to construct: I received almost 100 parts (of ~500k each) that I need to decrypt and assemble into a blob file.
It actually works fine, but the decryption is CPU-intensive and freezes my page.
I have tried different approaches, with jQuery deferreds and timeouts, but I always hit the same problem.
Is there a way to avoid freezing the UI thread?
var parts = blobs.sort(function (a, b) {
    return a.part - b.part;
});
// our final byte arrays
var byteArrays = [];
for (var i = 0; i < blobs.length; i++) {
    // this job is intensive and takes time
    byteArrays.push(that.decryptBlob(parts[i].blob.b64, fileType));
}
// create a new blob with all the data
var blob = new Blob(byteArrays, { type: fileType });
The body inside the for(...) loop is synchronous, so the entire decryption process is synchronous; in simple words, decryption happens chunk after chunk. How about making it asynchronous, decrypting multiple chunks in parallel? In JavaScript terminology we can use Web Workers. These workers can run in parallel, so if you spawn 5 workers, for example, the total time is reduced to roughly T / 5 (T = total time in synchronous mode).
Read more about worker threads here :
https://blog.logrocket.com/node-js-multithreading-what-are-worker-threads-and-why-do-they-matter-48ab102f8b10/
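A minimal sketch of the idea (the worker file name decrypt-worker.js and the decryptChunk routine are placeholders, not names from the question):

// main.js: hand each encrypted part to a worker instead of decrypting inline
var worker = new Worker('decrypt-worker.js');
var byteArrays = [];
var received = 0;
worker.onmessage = function (e) {
    byteArrays[e.data.part] = e.data.bytes;
    if (++received === parts.length) {
        // all parts decrypted; the UI thread was never blocked
        var blob = new Blob(byteArrays, { type: fileType });
    }
};
parts.forEach(function (p, i) {
    worker.postMessage({ part: i, b64: p.blob.b64 });
});

// decrypt-worker.js: runs off the main thread
onmessage = function (e) {
    postMessage({ part: e.data.part, bytes: decryptChunk(e.data.b64) });
};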
Thanks to Sebastian Simon,
I took the worker route, and it's working fine.
var chunks = [];
var decryptedChunkFnc = function (args) {
    // my blob-building job here
};
// cap the number of workers to spawn
var maxWorker = 5;
if (totalParts < maxWorker) {
    maxWorker = totalParts;
}
var workers = [];
for (var iw = 0; iw < maxWorker; iw++) {
    var wo = new Worker("decryptfile.min.js");
    workers.push(wo);
    var item = blobs.pop();
    wo.postMessage(MyObjectPassToTheFile);
    wo.onmessage = decryptedChunkFnc;
}
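One refinement worth noting (a sketch, not part of the original answer): the loop above hands each worker a single part, so with more parts than workers you also need to feed each worker the next pending part when it finishes. In the main-thread onmessage handler, e.target is the worker that just replied:

wo.onmessage = function (e) {
    decryptedChunkFnc(e);
    // keep this worker busy while parts remain
    if (blobs.length > 0) {
        e.target.postMessage(/* next part, built like MyObjectPassToTheFile */ blobs.pop());
    }
};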
Related
I have a cordova app for iOS in which I'm using indexedDB to store significant amounts of data in separate stores in one database.
I want to inform the user of the amount of space being used by the app, partly because the limit for indexedDB seems to be unclear/different on different devices and I'd like to see where the usage is at the point of failure, and partly as a way to warn the user that they need to manage the data they're storing offline before it becomes a problem (although I know I can capture this in the transaction abort event; I just have no idea what the limit is!).
In development I've been using the function below in the browser (I have the browser platform added, just for development) which has worked well:
function showIndexedDbSize(db_name) {
"use strict";
var this_db;
var storesizes = [];
function openDatabase() {
return new Promise(function(resolve, reject) {
var request = window.indexedDB.open(db_name);
request.onsuccess = function (event) {
this_db = event.target.result;
resolve(this_db.objectStoreNames);
};
});
}
function getObjectStoreData(storename) {
return new Promise(function(resolve, reject) {
var trans = this_db.transaction(storename, 'readonly');
var store = trans.objectStore(storename);
var items = [];
trans.oncomplete = function(evt) {
var szBytes = toSize(items);
var szMBytes = (szBytes / 1024 / 1024).toFixed(2);
storesizes.push({'Store Name': storename, 'Items': items.length, 'Size': szMBytes + 'MB (' + szBytes + ' bytes)'});
resolve();
};
var cursorRequest = store.openCursor();
cursorRequest.onerror = function(error) {
reject(error);
};
cursorRequest.onsuccess = function(evt) {
var cursor = evt.target.result;
if (cursor) {
items.push(cursor.value);
cursor.continue();
}
}
});
}
function toSize(items) {
var size = 0;
for (var i = 0; i < items.length; i++) {
var objectSize = JSON.stringify(items[i]).length;
size += objectSize * 2;
}
return size;
}
openDatabase().then(function(stores) {
var PromiseArray = [];
for (var i=0; i < stores.length; i++) {
PromiseArray.push(getObjectStoreData(stores[i]));
}
Promise.all(PromiseArray).then(function() {
this_db.close();
console.table(storesizes);
});
});
}
It works well on the device too when the stores total less than about 150MB (there isn't a clear threshold), but it uses JSON.stringify to serialize the objects in order to count the bytes, and as the database grows larger on the device this process forces the app to restart. I'm watching the memory usage in Xcode and it doesn't peak at all; it hovers between 25 and 30MB whatever you do (not just during this), which seems fine to me. The CPU is also under 5%. The energy usage is high, but I'm not sure this would affect the app negatively, just drain the battery faster (unless I've misunderstood something). So I'm not sure why it forces an ugly restart. In my endless googling I've learnt that JSON.parse and JSON.stringify are very hungry processes, which is why I switched to indexedDB in the first place, as it allows the storage of objects and avoids these processes entirely.
My questions are as follows:
Is there a way to amend the function to slow it down (it doesn't need to be fast, just reliable!) to prevent the restart?
Why would the app restart if there is no discernible pressure on the memory in Xcode? Or is this not a very good way of detecting this sort of thing? Is there some hidden garbage-collection problem in the function? (I'm a noob when it comes to GC generally, but there don't seem to be any leaks in the app.)
Is there a better way to show the usage of the database that would avoid this problem? Everything I find relies on these JSON processes, and the navigator.storage Web API doesn't appear to be supported on the Cordova iOS platform (which is a real shame, as it works amazingly in the browser! Gah!)
Any suggestions/thoughts massively appreciated!
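For reference, the navigator.storage API mentioned above reports usage without serializing anything, in environments that support it (which, per the question, excludes the Cordova iOS webview):

// StorageManager API: asks the browser for usage/quota estimates in bytes
if (navigator.storage && navigator.storage.estimate) {
    navigator.storage.estimate().then(function (estimate) {
        console.log('Using ' + estimate.usage + ' of ' + estimate.quota + ' bytes');
    });
}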
I have a CPU-intensive task in JavaScript that blocks the VM while executing within a Promise:
An example could be the following (try it out in the browser):
function task() {
return new Promise((r,s) => {
for(var x=0; x < 1000000*1000000; x++) {
var Y = Math.sqrt(x/2)
}
return r(true)
})
}
I would like to avoid blocking the VM's main thread, so I tried to detach the work using a setTimeout inside the Promise, passing resolve and reject along as arguments:
function task() {
return new Promise((r,s) => {
var self=this;
setTimeout( function(r,s) {
for(var x=0; x < 1000000*1000000; x++) {
var Y = Math.sqrt( Math.sin (x/2) + Math.cos(x/2))
}
return r(true);
},500,r,s);
})
}
but with no success. Any idea how to keep the main thread from getting stuck?
In addition to using web workers or similar, you can also (depending on the task) break up the work into smaller chunks that are worked on and scheduled using setImmediate. This example is a little silly, but you get the idea.
function brokenUpTask() {
let x = 0; // Keep track of progress here, not in the loop.
const limit = 1000000;
const chunk = 100000;
return new Promise((resolve) => {
function tick() { // Work a single chunk.
let chunkLimit = Math.min(x + chunk, limit);
for (; x < chunkLimit; x++) { // resume from where the previous chunk stopped
var Y = Math.sqrt(x/2);
}
if(x === limit) { // All done?
resolve(true);
return;
}
setImmediate(tick); // Still work to do.
}
tick(); // Start work.
});
}
brokenUpTask().then(() => console.log('ok'));
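Note that setImmediate is Node-specific (plus old IE); since the question mentions trying this in the browser, a small scheduling shim is a reasonable substitute:

// fall back to setTimeout(fn, 0) where setImmediate is unavailable
var schedule = typeof setImmediate === 'function'
    ? setImmediate
    : function (fn) { setTimeout(fn, 0); };
// then use schedule(tick) instead of setImmediate(tick)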
You can use "Web Workers". That way, you stay conformant with the standard. A worker is basically a background thread (comparable to BackgroundWorker in older C#).
MDN Web Workers
Using Web Workers to Speed-Up Your JavaScript Applications
I am not sure whether current Node.js supports this natively, so you can use the npm web workers package.
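A minimal sketch of the pattern (the file name worker.js is an assumption):

// main.js: spawn the worker and hand off the heavy loop
var worker = new Worker('worker.js');
worker.onmessage = function (e) {
    console.log('result:', e.data); // the main thread stayed responsive
};
worker.postMessage({ limit: 1000000 });

// worker.js: runs on a background thread
onmessage = function (e) {
    var y = 0;
    for (var x = 0; x < e.data.limit; x++) {
        y = Math.sqrt(x / 2);
    }
    postMessage(y);
};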
So I'm basically doing a lot of operations on objects in an array, so I decided to use web workers to process them in parallel. However, if I input an array with 10 objects, only 9 workers return a value. I created this simple mockup that reproduces the problem:
var numbers = [12, 2, 6, 5, 5, 2, 9, 8, 1, 4];
var create = function(number) {
var source = 'onmessage = function(e) { postMessage(e.data * 3) }';
var blob = new Blob([source]);
var url = window.URL.createObjectURL(blob);
return new Worker(url)
};
var newnumbers = [];
for (var i = 0; i < numbers.length; i++) {
var worker = create();
worker.onmessage = function(e) {
newnumbers.push(e.data);
worker.terminate();
}
worker.postMessage(numbers[i]);
}
So basically, each number in the array gets multiplied by 3 and added to a new array, newnumbers. However, numbers.length is 10 and newnumbers.length is 9. I have debugged this for quite a while and verified that 10 workers were created.
I feel like I'm doing something stupidly wrong; could someone explain?
Run it here on JSFiddle
You call terminate on the last worker before it processes the message, so the last worker never outputs anything.
This happens because the worker variable is effectively a global variable rather than a local one. You can replace var worker with let worker to make it block-scoped. If you are worried about browser compatibility for let, use an array to store the workers, or simply create a function scope (see the sketch after the demo below).
terminate is called on the last worker because the var worker variable points at the last worker by the time the loop ends. Note that the loop finishes executing before any worker starts processing (the loop is synchronous code).
So in your original code, instead of calling terminate() once on each worker, you called terminate() 10 times on the last worker.
var newnumbers = [];
for (var i = 0; i < numbers.length; i++) {
let worker = create();
worker.onmessage = function(e) {
newnumbers.push(e.data);
worker.terminate(); // "worker" refers to the unique variable created each iteration
}
worker.postMessage(numbers[i]);
}
Demo: https://jsfiddle.net/bqf5e9o1/2/
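The function-scope alternative mentioned above could look like this (a sketch for environments without let):

for (var i = 0; i < numbers.length; i++) {
    (function () {
        // each IIFE invocation gets its own "worker" binding
        var worker = create();
        worker.onmessage = function (e) {
            newnumbers.push(e.data);
            worker.terminate(); // terminates this iteration's worker
        };
        worker.postMessage(numbers[i]);
    })();
}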
Main purpose: I'm trying to scrape data off of around 10,000 different pages using Node.js.
Problem: It scrapes through the first 500-1000 pages very fast and then slows to a crawl (exactly where it slows down varies), and beyond that it eventually just seems stuck forever.
I'm using the request module in Node.js to make the requests, and I then use cheerio to do the scraping.
This code replicates my problem:
var request = require('request');
var requestsCalledCounter = 0;
var requestsCompletedCounter = 0;
var MAX_REQUESTS = 500;
var start = function () {
while (requestsCalledCounter < MAX_REQUESTS) {
request("http://www.google.com", function (error, response, html) {
requestsCompletedCounter++;
});
requestsCalledCounter++;
}
};
start();
Output:
Test 1:
447/500
89.4%
Timed out: No requests completed after 5 seconds
447 Completed
Test 2:
427/500
85.39999999999999%
Timed out: No requests completed after 5 seconds
427
Extra details that might help:
I have an array of URLs that I am going to scrape, so I loop through it, making a request to every URL in the array. It has about 10,000 URLs.
I agree with @cviejo in the comments: you should use an existing project. However, to aid understanding, here is an implementation that will only have 10 requests outstanding at a time.
var request = require('request');
var requestsCalledCounter = 0;
var requestsCompletedCounter = 0;
var pending = 0;
var MAX_PENDING = 10;
var MAX_REQUESTS = 500;
var doreq = function () {
request("http://www.google.com", function (error, response, html) {
requestsCompletedCounter++;
pending--;
});
pending++;
requestsCalledCounter++;
}
var start = function () {
while (pending < MAX_PENDING && requestsCalledCounter < MAX_REQUESTS) {
doreq();
}
if (requestsCalledCounter < MAX_REQUESTS) {
setTimeout(start, 1);
}
};
start();
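For the existing-project route, a concurrency-limiting helper such as async.eachLimit does the same bookkeeping (a sketch, assuming the async package and a urls array):

var async = require('async');
var request = require('request');

// process at most 10 URLs concurrently
async.eachLimit(urls, 10, function (url, done) {
    request(url, function (error, response, html) {
        // ... scrape with cheerio here ...
        done(error);
    });
}, function (err) {
    if (err) console.error(err);
    else console.log('all requests completed');
});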
I'm writing a TCP game server in Node.js and am having issues with splitting the TCP stream into messages. Since I want to read numbers and floats from the buffer, I cannot find a suitable module to outsource to, as all the ones I've found deal with simple strings ending with a newline delimiter. I decided to go with prefixing each message with its length in bytes.

I did this and wrote a simple program to spam the server with random messages (well formed, with a UInt16LE prefix giving the length of the message). I noticed that the longer I leave the programs running, the more memory my actual server keeps using. I tried using a debugging tool to trace the memory allocation, with no success, so I figured I'd post my code here and hope for a reply. Any tips or pointers as to where I'm going wrong, or what I can do differently or more efficiently, would be amazing!
Thanks.
server.on("connection", function(socket) {
var session = new sessionCS(socket);
console.log("Connection from " + session.address);
// data buffering variables
var currentBuffer = new Buffer(args.bufSize);
var bufWrite = 0;
var bufRead = 0;
var mSize = null;
var i = 0;
socket.on("data", function(dataBuffer) {
// check if buffer risk of overflow
if (bufWrite + dataBuffer.length > args.bufSize-1) {
var newBufWrite = 0;
var newBuffer = new Buffer(args.bufSize);
while(bufRead < bufWrite) {
newBuffer[newBufWrite] = currentBuffer[bufRead];
newBufWrite++;
bufRead++;
}
currentBuffer = newBuffer;
bufWrite = newBufWrite;
bufRead = 0;
newBufWrite = null;
}
// appending buffer
for (i=0; i<dataBuffer.length; i++) {
currentBuffer[bufWrite] = dataBuffer[i];
bufWrite ++;
}
// if beginning of message not acknowleged
if (mSize === null && (bufWrite - bufRead) >= 2) {
mSize = currentBuffer.readUInt16LE(bufRead);
}
// if difference between read and write is greater or equal to message mSize + 2
// +2 for the integer holding the message size
// this means that a full message is in the buffer and needs to be extracted
while ((bufWrite - bufRead) >= mSize+2) {
bufRead += 2;
var messageBuffer = new Buffer(mSize);
for(i=0; i<messageBuffer.length; i++) {
messageBuffer[i] = currentBuffer[bufRead];
bufRead++;
}
// this is where the message buffer would be passed to the router
router(session, messageBuffer);
messageBuffer = null;
// seeing if another message-length indicator is in the buffer
if ((bufWrite - bufRead) >= 2) {
mSize = currentBuffer.readUInt16LE(bufRead);
}
else {
mSize = null;
}
}
});
});
Buffer Frame Serialization Protocol (BUFSP): https://github.com/teambition/bufsp
It may be what you want: it encodes messages into buffers, writes them to TCP, then receives and splits the TCP stream buffers back into messages.
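For reference, a minimal hand-rolled version of the same length-prefix framing, using Buffer.concat instead of a fixed-size buffer with read/write offsets (a sketch; router and session are the question's own names):

var pending = Buffer.alloc(0); // bytes received but not yet framed

socket.on('data', function (chunk) {
    pending = Buffer.concat([pending, chunk]);
    // extract every complete [UInt16LE length][payload] frame
    while (pending.length >= 2) {
        var mSize = pending.readUInt16LE(0);
        if (pending.length < mSize + 2) break; // wait for the rest of the frame
        var message = pending.slice(2, mSize + 2);
        pending = pending.slice(mSize + 2);
        router(session, message);
    }
});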