I wanted to try IndexedDB, to see if it is fit for my purpose.
Doing some testing, I noticed that its growth rate seems to be exponential with every insert.
(Only tested in Google Chrome version 31.0.1650.63 (Official Build 238485) m on Windows so far.)
My Code in full: http://pastebin.com/15WK96FY
Basically, I save a string of 2.6 million characters.
Checking window.webkitStorageInfo.queryUsageAndQuota, I see that it consumes ~7.8 MB, i.e. ~3 bytes per character.
If I save the string 10 times, however, I get a usage of ~167 MB, i.e. ~6.4 bytes per character.
By saving it 50 times I'm high up in the gigabytes and my computer starts to freeze.
Am I doing something wrong or is there a way around this behaviour?
Your test is wrong: the field test2 should not be indexed.
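Since the pastebin code isn't reproduced here, the database and store names below are hypothetical; this is just a minimal sketch of what that fix looks like: keep the 2.6-million-character string as a plain, non-indexed property and index only small key fields.
var bigString = new Array(2600001).join('x'); // stand-in for the real 2.6 M character payload

var request = indexedDB.open('testdb', 1);    // hypothetical database name

request.onupgradeneeded = function (event) {
  var db = event.target.result;
  var store = db.createObjectStore('entries', { keyPath: 'id' }); // hypothetical store name
  // Deliberately NOT calling store.createIndex(...) on test2: indexing a value
  // means IndexedDB also writes it into the index, which can multiply the space
  // a huge string takes.
};

request.onsuccess = function (event) {
  var db = event.target.result;
  var tx = db.transaction('entries', 'readwrite');
  tx.objectStore('entries').put({ id: 1, test2: bigString });
};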
A section of my Node.js application involves receiving a string as input from the user and storing it in a JSON file. JSON itself obviously has no limit on this, but is there any upper bound on the amount of text that Node can process into JSON?
Note that I am not using MongoDB or any other technology for the actual insertion - this is native stringification and saving to a .json file using fs.
V8 (the JavaScript engine node is built upon) until very recently had a hard limit on heap size of about 1.9 GB.
Node v0.10 is stuck on an older version of V8 (3.14) due to breaking V8 API changes around native addons. Node 0.12 will update to the newest V8 (3.26), which will break many native modules, but opens the door for the 1.9 GB heap limit to be raised.
So as it stands, a single node process can keep no more than 1.9 GB of JavaScript code, objects, strings, etc combined. That means the maximum length of a string is under 1.9 GB.
You can get around this by using Buffers, which store data outside of the V8 heap (but still in your process's heap). A 64-bit build of node can pretty much fill all your RAM as long as you never have more than 1.9 GB of data in JavaScript variables.
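As a rough illustration of that idea (the file name is a placeholder): calling fs.readFileSync without an encoding argument returns a Buffer, so the bulk of the data never becomes a V8 string.
var fs = require('fs');

// The bytes live in a Buffer outside the V8 heap; only the small preview
// below is materialized as a JavaScript string.
var big = fs.readFileSync('big-input.txt');       // 'big-input.txt' is a placeholder
console.log('bytes held outside the V8 heap:', big.length);

var preview = big.slice(0, 100).toString('utf8');
console.log(preview);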
All that said, you should never come anywhere near this limit. When dealing with this much data, you must deal with it as a stream. You should never have more than a few megabytes (at most) in memory at one time. The good news is node is especially well-suited to dealing with streaming data.
You should ask yourself some questions:
What kind of data are you actually receiving from the user?
Why do you want to store it in JSON format?
Is it really a good idea to stuff gigabytes into JSON? (The answer is no.)
What will happen with the data later, after it is stored? Will your code read it? Something else?
The question you've posted is actually quite vague in regard to what you're actually trying to accomplish. For more specific advice, update your question with more information.
If you expect the data to never be all that big, just put a reasonable limit of, say, 10 MB on the input, buffer it all, and use JSON.stringify.
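A minimal sketch of that approach, assuming the text arrives as the body of an HTTP request (the port, file name, and 10 MB cap are arbitrary choices, not anything from the question):
var fs = require('fs');
var http = require('http');

var MAX_BYTES = 10 * 1024 * 1024; // 10 MB cap on the incoming body

http.createServer(function (req, res) {
  var chunks = [];
  var received = 0;
  var tooLarge = false;

  req.on('data', function (chunk) {
    if (tooLarge) return;               // already rejected, ignore the rest
    received += chunk.length;
    if (received > MAX_BYTES) {
      tooLarge = true;
      chunks = [];                      // drop whatever was buffered
      res.writeHead(413);
      res.end('payload too large');
      return;
    }
    chunks.push(chunk);
  });

  req.on('end', function () {
    if (tooLarge) return;
    var text = Buffer.concat(chunks).toString('utf8');
    // Small enough to hold in memory, so a plain JSON.stringify is fine here.
    fs.writeFile('input.json', JSON.stringify({ text: text }), function (err) {
      res.writeHead(err ? 500 : 200);
      res.end(err ? 'error' : 'saved');
    });
  });
}).listen(3000);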
If you expect to deal with anything larger, you need to stream the input straight to disk. Look into transform streams if you need to process/modify the data before it goes to disk. For example, there are modules that deal with streaming JSON.
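And a minimal streaming sketch under the same assumptions: the body is piped straight to disk and never accumulates in memory. (Wrapping a stream in valid JSON needs a streaming-JSON module or manual framing; here the raw text is saved as-is.)
var fs = require('fs');
var http = require('http');

http.createServer(function (req, res) {
  var out = fs.createWriteStream('input.txt');  // file name is arbitrary
  req.pipe(out);                                // backpressure handled by pipe
  out.on('finish', function () {
    res.end('saved');
  });
}).listen(3001);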
The maximum string size in "vanilla" nodeJS (v0.10.28) is in the ballpark of 1GB.
If you are in a hurry, you can test the maximum supported string size with a self-doubling string. The system tested has 8 GB of RAM, mostly unused.
x = 'x';
while (1) {
  x = '' + x + x; // string context; doubles the length each pass
  console.log(x.length);
}
2
4
8
16
32
64
128
256
512
1024
2048
4096
8192
16384
32768
65536
131072
262144
524288
1048576
2097152
4194304
8388608
16777216
33554432
67108864
134217728
268435456
536870912
FATAL ERROR: JS Allocation failed - process out of memory
Aborted (core dumped)
In another test, I got to 1,000,000,000 characters with a for loop that appended one character at a time.
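A rough reconstruction of what that second test might look like (not the original code):
// Grow the string one character per iteration, logging every 100 million chars.
var x = '';
for (var i = 1; i <= 1e9; i++) {
  x += 'x';
  if (i % 1e8 === 0) console.log(x.length);
}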
Now a critic might say, "Wait, what about JSON? The question is about JSON!" To which I would shout: THERE ARE NO JSON OBJECTS IN JAVASCRIPT. The JS types are Object, Array, String, Number, etc., and since JSON is a string representation, this question boils down to: what is the longest allowed string? But just to double-check, let's add a JSON.stringify call to address the JSON conversion.
Code
x = 'x';
while (1) {
  x = '' + x + x; // string context
  console.log(JSON.stringify({a: x}).length);
}
Expectations: the size of the JSON string will start at more than 2, because the first object stringifies to '{"a":"xx"}', which is 10 characters. It won't start to double until the string in property a gets bigger. It will probably fail around 256 M, since stringification probably makes a second copy of the data. Recall that the stringified result is a separate string, independent of the original object.
Result:
10
12
16
24
40
72
136
264
520
1032
2056
4104
8200
16392
32776
65544
131080
262152
524296
1048584
2097160
4194312
8388616
16777224
33554440
67108872
134217736
268435464
Pretty much as expected....
Now these limits are probably related to the C/C++ code that implements JS in the nodeJS project, which at this time I believe is the same V8 code used in Chrome browsers.
There is evidence from blog posts of people recompiling nodeJS to get around memory limits in older versions. There are also a number of node command-line switches that affect memory limits. I have not tested the effect of any of these.
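For what it's worth, the switch most often mentioned for this is --max-old-space-size (value in megabytes, so 4096 means roughly 4 GB of heap; app.js is a placeholder for your entry point). On newer V8 releases it can raise the heap ceiling, though it does not raise the maximum length of a single string.
node --max-old-space-size=4096 app.js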
The maximum length of a string in Node.js is defined by the underlying JavaScript engine, V8. In V8 the maximum length is independent of the heap size; it is constrained by the limits imposed by the optimized object layout. See https://chromium-review.googlesource.com/c/v8/v8/+/2030916, a recent (Feb 2020) change to the maximum string length in V8. The commit message explains the different limits over time: roughly 256 MB, then 1 GB, then back to 512 MB (on 64-bit V8 platforms).
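A quick, hedged way to see the limit on whichever engine you happen to be running: keep doubling a repeat count until the engine refuses, which on current V8 throws a RangeError ("Invalid string length"); on a memory-constrained machine you may hit an out-of-memory error first.
// Doubling probe: the last successful value is a lower bound on the maximum
// string length (the true limit lies between n/2 and n when the error is thrown).
var n = 1;
try {
  while (true) {
    'x'.repeat(n);
    n *= 2;
  }
} catch (e) {
  console.log('last successful length:', n / 2, '(' + e.message + ')');
}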
This is a good question, but I think the upper limit you need to be worried about doesn't involve the max JSON string size.
In my opinion the limit you need to worry about is how long do you wish to block the request thread while it's processing the user's request.
Any string over 1 MB will take the user a few seconds to upload, and tens of megabytes could take minutes. After receiving the request, the server will then take anywhere from a few hundred milliseconds to several seconds to parse it into a data structure, leading to a very poor user experience (parsing JSON is very expensive).
The bandwidth and server processing times will overshadow any limit JSON may have on string size.
I'm comparing performance between a few frameworks (namely ReactJS and AngularJS) versus a "vanilla HTML + JS". During this I came across absolutely abysmal performance with Internet Explorer (I've tested in IE9 and IE11 and they both exhibit performance issues but differently).
The original code is an HTML file but I've moved it to JSFiddle for the sake of sharing it here. If you'd like, I can post it as a GitHub Gist, instead.
Anyways, goal is to render a table with 5,000 items in it (representing files and folders). On my test machine, IE11 takes around 30 seconds for the initial rendering while Chrome/Safari/Firefox are in the 1.5–3 second range. If I look at just how long it takes to generate the HTML string (so not even DOM manipulation), that alone is about 15 seconds on IE11 plus another 15 for actual rendering.
Any thoughts as to what I'm doing wrong? Make sure you change the sampleSize from 100 to 5,000 once you want to see the actual results:
var sampleSize = 100;
to
var sampleSize = 5000;
Note: here's what I've already done to improve performance:
Changed the string concatenation of each row to building an array and calling .join('') at the end, since repeated string concatenation is a known performance issue in IE (see the sketch after this list)
Only a single DOM update with $(tblBody).html(nodes.join('')); rather than appending one row at a time
The above two enhancements brought the initial rendering from 36s down to 30s.
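For reference, a minimal sketch of those two changes (the placeholder data and row fields are mine, not the fiddle's; tblBody, sampleSize, and jQuery's $ are from the question's own code):
// Placeholder data standing in for the fiddle's 5,000 file/folder objects.
var items = [];
for (var i = 0; i < sampleSize; i++) {
  items.push({ name: 'file-' + i, size: i });
}

// Build each row as a string in an array, join once, and touch the DOM once.
var nodes = [];
for (var j = 0; j < items.length; j++) {
  nodes.push('<tr><td>' + items[j].name + '</td><td>' + items[j].size + '</td></tr>');
}
$(tblBody).html(nodes.join('')); // one DOM write instead of one append per row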
Note 2: the code isn't that f*ed up, since it's still faster than either my ReactJS- or AngularJS-based solutions. So the main question is: what in the world is IE doing?
I'm writing a Google Chrome extension. I know that Chrome currently sets the limit of 5MB on the maximum allowed size of localStorage. But I'm curious if there's any way to get this from the Chrome itself, anything like a JS constant/global variable?
PS. I just hate to hard-code this value in case they change it in the future.
We assume that one character equals one byte, but that is not a safe assumption: JavaScript strings are stored as UTF-16, so each character takes two bytes. This means that although many browsers have a 5 MB limit, you can only store about 2.5 million characters.
It is quite difficult to predict how much is left for the domain, even if it is set to 5 MB.
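There is no standard API that reports the remaining quota, but you can at least estimate current usage from the two-bytes-per-character rule. A rough sketch (the 5 MB figure is the commonly observed default, not something the browser exposes):
// Rough localStorage usage estimate: UTF-16 stores each character in 2 bytes.
function localStorageUsedBytes() {
  var bytes = 0;
  for (var i = 0; i < localStorage.length; i++) {
    var key = localStorage.key(i);
    bytes += (key.length + localStorage.getItem(key).length) * 2;
  }
  return bytes;
}

console.log('used:', localStorageUsedBytes(), 'of an assumed', 5 * 1024 * 1024, 'bytes');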
After reading about HTML5 storage, it looks like there is an unlimited-storage option.
https://developers.google.com/chrome/whitepapers/storage#unlimited
In that documentation, the relevant manifest entry is the unlimitedStorage permission:
"permissions": [
  "storage",
  "unlimitedStorage"
],
I have not tested this myself, but it is worth a try. If it works, please let me know as well.
Updated 2021:
https://developer.chrome.com/docs/extensions/reference/storage/#property-local
Use chrome.storage.local.QUOTA_BYTES.
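For example (inside an extension with the storage permission; getBytesInUse gives the current usage rather than the limit):
// Read the limit instead of hard-coding 5 MB.
console.log('quota:', chrome.storage.local.QUOTA_BYTES, 'bytes');

// Optionally compare it with what the extension is currently using.
chrome.storage.local.getBytesInUse(null, function (bytesInUse) {
  console.log('in use:', bytesInUse, 'of', chrome.storage.local.QUOTA_BYTES);
});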
I just ran into a small issue with WebGL today while doing a project on point-set visualisation. I understand there is an index limit in drawElements, due to the indices being 16-bit integers. According to this post, however, there isn't one for drawArrays, which I confirmed by being able to send some 400k points to the GPU.
The thing is, once I had tried 400k, I wanted to explore the possibilities of WebGL and tried a 3M-vertex model. Bang! Nothing gets displayed, and the WebGL inspector shows no drawArrays call.
Are you aware of some kind of limit for direct drawArrays calls?
It looks like the same question is already discussed/answered here: Is there a limit of vertices in WebGL?. In that thread, the post by brainjam says that he discovered that drawArrays was not limited to 65k.
It sounds like you've got an outdated driver. The definition of drawArrays():
void drawArrays(enum mode, int first, long count)
The count parameter is a long integer, so the API itself allows far more than 65k vertices; any lower ceiling comes from the driver, not the specification.
Remember that, contrary to what one might presume, both Chrome/Chromium and Firefox use Direct3D as the underlying technology for WebGL on Windows.
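If the bottleneck really is a driver or translation-layer limit on a single call, one hedged workaround (a common pattern, not something from the answers above) is to split the vertex buffer into several smaller drawArrays calls:
// Assumes gl is a WebGL context with a bound, filled vertex buffer and an
// active shader program. The first/count parameters address offsets within
// the same buffer, so no data needs to be re-uploaded between calls.
function drawPointsInChunks(gl, totalVertices, chunkSize) {
  for (var first = 0; first < totalVertices; first += chunkSize) {
    var count = Math.min(chunkSize, totalVertices - first);
    gl.drawArrays(gl.POINTS, first, count);
  }
}

// e.g. 3,000,000 points in batches of 500,000:
// drawPointsInChunks(gl, 3000000, 500000);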