I want to know the size occupied by a JavaScript object.
Take the following function:
function Marks(){
this.maxMarks = 100;
}
function Student(){
this.firstName = "firstName";
this.lastName = "lastName";
this.marks = new Marks();
}
Now I instantiate the student:
var stud = new Student();
so that I can do stuff like
stud.firstName = "new Firstname";
alert(stud.firstName);
stud.marks.maxMarks = 200;
etc.
Now, the stud object will occupy some size in memory. It has some data and more objects.
How do I find out how much memory the stud object occupies? Something like a sizeof() in JavaScript? It would be really awesome if I could find it out in a single function call like sizeof(stud).
I’ve been searching the Internet for months—couldn’t find it (asked in a couple of forums—no replies).
I have refactored the code from my original answer: I removed the recursion and dropped the assumed existence overhead.
function roughSizeOfObject( object ) {
    var objectList = [];
    var stack = [ object ];
    var bytes = 0;

    while ( stack.length ) {
        var value = stack.pop();

        if ( typeof value === 'boolean' ) {
            bytes += 4;
        }
        else if ( typeof value === 'string' ) {
            bytes += value.length * 2;
        }
        else if ( typeof value === 'number' ) {
            bytes += 8;
        }
        else if ( typeof value === 'object' && objectList.indexOf( value ) === -1 ) {
            objectList.push( value );

            for ( var i in value ) {
                stack.push( value[ i ] );
            }
        }
    }
    return bytes;
}
The Google Chrome Heap Profiler allows you to inspect object memory use.
You need to be able to locate the object in the trace, which can be tricky. If you pin the object to the window global, it is pretty easy to find from the "Containment" listing mode.
In the attached screenshot, I created an object called "testObj" on the window. I then located it in the profiler (after making a recording), and it shows the full size of the object and everything it contains under "retained size".
More details on the memory breakdowns.
In the above screenshot, the object shows a retained size of 60. I believe the unit is bytes here.
Sometimes I use this to flag really big objects that might be going to the client from the server. It doesn't represent the in-memory footprint; it just gives you approximately what it would cost to send it, or store it.
Also note that it's slow and for dev only, but for getting a ballpark answer with one line of code it has been useful for me.
roughObjSize = JSON.stringify(bigObject).length;
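If you want the estimate in bytes rather than in UTF-16 code units, a small variation of the same one-liner (a sketch assuming a browser environment where Blob is available) is:
// Rough serialized size in bytes (UTF-8); browser-only sketch
roughObjSizeBytes = new Blob([JSON.stringify(bigObject)]).size;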
I just wrote this to solve a similar(ish) problem. It doesn't do exactly what you may be looking for, i.e. it doesn't take into account how the interpreter stores the object.
But if you are using V8, it should give you a fairly OK approximation, as the prototyping and hidden classes absorb most of the overhead.
function roughSizeOfObject( object ) {
    var objectList = [];

    var recurse = function( value ) {
        var bytes = 0;

        if ( typeof value === 'boolean' ) {
            bytes = 4;
        }
        else if ( typeof value === 'string' ) {
            bytes = value.length * 2;
        }
        else if ( typeof value === 'number' ) {
            bytes = 8;
        }
        else if ( typeof value === 'object' && objectList.indexOf( value ) === -1 ) {
            objectList[ objectList.length ] = value;

            for ( var i in value ) {
                bytes += 8; // an assumed existence overhead
                bytes += recurse( value[ i ] );
            }
        }

        return bytes;
    };

    return recurse( object );
}
Here's a slightly more compact solution to the problem:
const typeSizes = {
"undefined": () => 0,
"boolean": () => 4,
"number": () => 8,
"string": item => 2 * item.length,
"object": item => !item ? 0 : Object
.keys(item)
.reduce((total, key) => sizeOf(key) + sizeOf(item[key]) + total, 0)
};
const sizeOf = value => typeSizes[typeof value](value);
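For example, with the Student object from the question (a hedged sketch; note that this version throws for any type it does not list, such as functions or symbols):
// Keys and values of nested objects are both counted
const stud = new Student();
console.log(sizeOf(stud)); // 102 with the default constructor values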
There is an NPM module to get an object's sizeof; you can install it with npm install object-sizeof
var sizeof = require('object-sizeof');
// 2B per character, 6 chars total => 12B
console.log(sizeof({abc: 'def'}));
// 8B for Number => 8B
console.log(sizeof(12345));
var param = {
'a': 1,
'b': 2,
'c': {
'd': 4
}
};
// 4 single-character keys (2 B each) and 3 numbers (8 B each) => 32 B
console.log(sizeof(param));
This is a hacky method, but I tried it twice with different numbers and it seems to be consistent.
What you can do is allocate a huge number of objects, say one or two million of the kind you want, and measure how much the Firefox process's memory grows. Put the objects in an array to prevent the garbage collector from releasing them (this adds a slight memory overhead for the array itself, but it shouldn't matter much, and if you are going to worry about objects staying in memory, you will be storing them somewhere anyway). Add an alert before and after the allocation, and at each alert check how much memory the Firefox process is using. Before you open the test page, make sure you have a fresh Firefox instance. Open the page, note the memory usage when the "before" alert is shown, close the alert, and wait for the memory to be allocated. Then subtract the first reading from the second and divide it by the number of allocations. Example:
function Marks()
{
this.maxMarks = 100;
}
function Student()
{
this.firstName = "firstName";
this.lastName = "lastName";
this.marks = new Marks();
}
var manyObjects = new Array();
alert('before');
for (var i=0; i<2000000; i++)
manyObjects[i] = new Student();
alert('after');
I tried this on my computer: the process had 48352K of memory when the "before" alert was shown. After the allocation, Firefox had 440236K of memory. For 2 million allocations, this is about 200 bytes per object.
I tried it again with 1 million allocations and the result was similar: 196 bytes per object (I suppose the extra data in the 2 million run was used by the Array).
So, here is a hacky method that might help you. JavaScript doesn't provide a "sizeof" method for a reason: each JavaScript implementation is different. In Google Chrome, for example, the same page uses about 66 bytes per object (judging from the task manager, at least).
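If you happen to be testing in Chrome, the same before/after measurement can be scripted with the non-standard performance.memory API (Chrome-only and only a coarse approximation; this is a sketch, not a precise tool):
// Chrome-only sketch: measure heap growth around a bulk allocation
var before = performance.memory.usedJSHeapSize;
var manyObjects = [];
for (var i = 0; i < 1000000; i++)
    manyObjects[i] = new Student();
var after = performance.memory.usedJSHeapSize;
console.log('approx. bytes per object: ' + (after - before) / manyObjects.length);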
I had the same problem, searched on Google, and want to share this solution with the Stack Overflow community.
Important:
I used the function shared by Yan Qing on GitHub:
https://gist.github.com/zensh/4975495
function memorySizeOf(obj) {
var bytes = 0;
function sizeOf(obj) {
if(obj !== null && obj !== undefined) {
switch(typeof obj) {
case 'number':
bytes += 8;
break;
case 'string':
bytes += obj.length * 2;
break;
case 'boolean':
bytes += 4;
break;
case 'object':
var objClass = Object.prototype.toString.call(obj).slice(8, -1);
if(objClass === 'Object' || objClass === 'Array') {
for(var key in obj) {
if(!obj.hasOwnProperty(key)) continue;
sizeOf(obj[key]);
}
} else bytes += obj.toString().length * 2;
break;
}
}
return bytes;
};
function formatByteSize(bytes) {
if(bytes < 1024) return bytes + " bytes";
else if(bytes < 1048576) return(bytes / 1024).toFixed(3) + " KiB";
else if(bytes < 1073741824) return(bytes / 1048576).toFixed(3) + " MiB";
else return(bytes / 1073741824).toFixed(3) + " GiB";
};
return formatByteSize(sizeOf(obj));
};
var sizeOfStudentObject = memorySizeOf({Student: {firstName: 'firstName', lastName: 'lastName', marks: 10}});
console.log(sizeOfStudentObject);
What do you think about it?
Sorry, I could not comment, so I am just continuing the work from tomwrong.
This enhanced version will not count an object more than once, so there is no infinite loop.
Plus, I reckon an object's keys should also be counted, roughly.
function roughSizeOfObject( value, level ) {
if(level == undefined) level = 0;
var bytes = 0;
if ( typeof value === 'boolean' ) {
bytes = 4;
}
else if ( typeof value === 'string' ) {
bytes = value.length * 2;
}
else if ( typeof value === 'number' ) {
bytes = 8;
}
else if ( typeof value === 'object' ) {
if(value['__visited__']) return 0;
value['__visited__'] = 1;
for( var i in value ) {
bytes += i.length * 2;
bytes+= 8; // an assumed existence overhead
bytes+= roughSizeOfObject( value[i], 1 )
}
}
if(level == 0){
clear__visited__(value);
}
return bytes;
}
function clear__visited__(value){
if(typeof value == 'object'){
delete value['__visited__'];
for(var i in value){
clear__visited__(value[i]);
}
}
}
roughSizeOfObject(a);
"I want to know if my memory reduction efforts actually help in reducing memory"
Following up on this comment, here's what you should do:
Try to produce a memory problem: write code that creates all these objects and gradually increase the upper limit until you run into a problem (browser crash, browser freeze, or an out-of-memory error). Ideally you should repeat this experiment with different browsers and different operating systems.
Now there are two options:
Option 1 - You didn't succeed in producing the memory problem. Hence, you are worrying about nothing: you don't have a memory issue and your program is fine.
Option 2 - You did get a memory problem. Now ask yourself whether the limit at which the problem occurred is reasonable (in other words: is it likely that this number of objects will be created during normal use of your code). If the answer is 'No', then you're fine. Otherwise, you now know how many objects your code can create; rework the algorithm so that it does not breach this limit.
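A minimal sketch of such a stress test, assuming the Student constructor from the question (raise the limit between runs until the browser starts to struggle):
// Keep references in an array so the garbage collector cannot reclaim the objects
var limit = 100000; // increase gradually between runs
var retained = [];
for (var i = 0; i < limit; i++)
    retained.push(new Student());
console.log('Created ' + retained.length + ' objects without a problem');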
This JavaScript library, sizeof.js, does the same thing.
Include it like this:
<script type="text/javascript" src="sizeof.js"></script>
The sizeof function takes an object as a parameter and returns its approximate size in bytes. For example:
// define an object
var object =
{
'boolean' : true,
'number' : 1,
'string' : 'a',
'array' : [1, 2, 3]
};
// determine the size of the object
var size = sizeof(object);
The sizeof function can handle objects that contain multiple references to other objects and recursive references.
Originally published here.
If your main concern is the memory usage of your Firefox extension, I suggest checking with Mozilla developers.
Mozilla provides on its wiki a list of tools to analyze memory leaks.
Chrome Developer Tools has this functionality. I found this article very helpful, and it does exactly what you want:
https://developers.google.com/chrome-developer-tools/docs/heap-profiling
Many thanks to everyone that has been working on code for this!
I just wanted to add that I've been looking for exactly the same thing, but in my case it's for managing a cache of processed objects, to avoid re-parsing and re-processing objects from AJAX calls that may or may not have been cached by the browser. This is especially useful for objects that require a lot of processing, usually anything that isn't in JSON format, but keeping these things cached can get very costly in a large project or in an app/extension that is left running for a long time.
Anyway, I use it for something like:
var myCache = {
cache: {},
order: [],
size: 0,
maxSize: 2 * 1024 * 1024, // 2mb
add: function(key, object) {
// Work out the size of the new object before storing it
var size = this.getObjectSize(object);
if (size > this.maxSize) return; // Can't store this object
var total = this.size + size;
// Check for existing entry, as replacing it will free up space
if (typeof(this.cache[key]) !== 'undefined') {
for (var i = 0; i < this.order.length; ++i) {
var entry = this.order[i];
if (entry.key === key) {
total -= entry.size;
this.order.splice(i, 1);
break;
}
}
}
while (total > this.maxSize) {
var entry = this.order.shift();
delete this.cache[entry.key];
total -= entry.size;
}
this.cache[key] = object;
this.order.push({ size: size, key: key });
this.size = total;
},
get: function(key) {
var value = this.cache[key];
if (typeof(value) !== 'undefined') { // Return this key for longer
for (var i = 0; i < this.order.length; ++i) {
var entry = this.order[i];
if (entry.key === key) {
this.order.splice(i, 1);
this.order.push(entry);
break;
}
}
}
return value;
},
getObjectSize: function(object) {
// Code from above estimating functions
},
};
It's a simplistic example and may have some errors, but it gives the idea: you can use it to hold onto static objects (whose contents won't change) with some degree of intelligence. This can significantly cut down on the expensive processing that was needed to produce the object in the first place.
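A hypothetical usage of the cache above might look like this (the key name is just an example, and it assumes getObjectSize has been filled in with one of the estimators from this thread):
// Store a processed result under a key, then read it back later
myCache.add('students/42', { firstName: 'Ada', lastName: 'Lovelace' });
var student = myCache.get('students/42'); // undefined if it has been evicted since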
The accepted answer does not work with Map, Set, WeakMap and other iterable objects. (The package object-sizeof, mentioned in another answer, has the same problem.)
Here's my fix:
export function roughSizeOfObject(object) {
const objectList = [];
const stack = [object];
const bytes = [0];
while (stack.length) {
const value = stack.pop();
if (value == null) bytes[0] += 4;
else if (typeof value === 'boolean') bytes[0] += 4;
else if (typeof value === 'string') bytes[0] += value.length * 2;
else if (typeof value === 'number') bytes[0] += 8;
else if (typeof value === 'object' && objectList.indexOf(value) === -1) {
objectList.push(value);
if (typeof value.byteLength === 'number') bytes[0] += value.byteLength;
else if (value[Symbol.iterator]) {
// eslint-disable-next-line no-restricted-syntax
for (const v of value) stack.push(v);
} else {
Object.keys(value).forEach(k => {
bytes[0] += k.length * 2; stack.push(value[k]);
});
}
}
}
return bytes[0];
}
It also includes some other minor improvements: counts keys storage and works with ArrayBuffer.
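A quick sanity check of that behaviour (the expected values follow directly from the per-type estimates in the function above):
// Iterables are walked entry by entry; ArrayBuffers are counted via byteLength
console.log(roughSizeOfObject(new Map([['a', 1]]))); // 10: key 'a' (2) + value 1 (8)
console.log(roughSizeOfObject(new Set(['x', 'y']))); // 4: two one-character strings
console.log(roughSizeOfObject(new ArrayBuffer(16))); // 16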
function sizeOf(parent_data)
{
    let size = 0;
    for (let prop in parent_data)
    {
        let value = parent_data[prop];
        if (typeof value === 'boolean')
        {
            size += 4;
        }
        else if (typeof value === 'string')
        {
            size += value.length * 2;
        }
        else if (typeof value === 'number')
        {
            size += 8;
        }
        else if (typeof value === 'object' && value !== null)
        {
            // recurse into nested objects instead of the old size-delta arithmetic
            size += sizeOf(value);
        }
    }
    return size;
}
function roughSizeOfObject(object)
{
    // the non-standard "for each...in" loop is replaced by a plain recursive walk
    return sizeOf(object);
}
I use the Chrome dev tools' Timeline tab: I instantiate increasingly large numbers of objects and get good estimates that way. You can use HTML like the boilerplate below and modify it to better simulate the characteristics of your objects (number and types of properties, etc.). You may want to click the trash-bin icon at the bottom of that dev tools tab before and after a run.
<html>
<script>
var size = 1000*100
window.onload = function() {
document.getElementById("quantifier").value = size
}
function scaffold()
{
console.log("processing Scaffold...");
a = new Array
}
function start()
{
size = document.getElementById("quantifier").value
console.log("Starting... quantifier is " + size);
console.log("starting test")
for (i=0; i<size; i++){
a[i]={"some" : "thing"}
}
console.log("done...")
}
function tearDown()
{
console.log("processing teardown");
a.length=0
}
</script>
<body>
<span style="color:green;">Quantifier:</span>
<input id="quantifier" style="color:green;" type="text"></input>
<button onclick="scaffold()">Scaffold</button>
<button onclick="start()">Start</button>
<button onclick="tearDown()">Clean</button>
<br/>
</body>
</html>
Instantiating 2 million objects of just one property each (as in this code above) leads to a rough calculation of 50 bytes per object, on my Chromium, right now. Changing the code to create a random string per object adds some 30 bytes per object, etc.
Hope this helps.
If you need to programmatically check the approximate size of objects, you can also check the library http://code.stephenmorley.org/javascript/finding-the-memory-usage-of-objects/ which I have been able to use for object sizes.
Otherwise I suggest to use the Chrome/Firefox Heap Profiler.
I had problems with the above answer when working with an ArrayBuffer.
After checking the documentation, I found that ArrayBuffer has a byteLength property which tells me exactly what I need, hence:
function sizeOf(data)
{
if (typeof(data) === 'object')
{
if (data instanceof ArrayBuffer)
{
return data.byteLength;
}
// other objects go here
}
// non-object cases go here
}
console.log(sizeOf(new ArrayBuffer(15))); // 15
Reference:
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/ArrayBuffer/byteLength
Building upon the already compact solution from @Dan, here's a self-contained function version of it. Variable names are reduced to single letters for those who just want it as compact as possible, at the expense of context.
const ns = {};
ns.sizeof = function(v) {
let f = ns.sizeof, //this needs to match the name of the function itself, since arguments.callee.name is defunct
o = {
"undefined": () => 0,
"boolean": () => 4,
"number": () => 8,
"string": i => 2 * i.length,
"object": i => !i ? 0 : Object
.keys(i)
.reduce((t, k) => f(k) + f(i[k]) + t, 0)
};
return o[typeof v](v);
};
ns.undef; // never assigned, so its typeof is 'undefined'
ns.bool = true;
ns.num = 1;
ns.string = "Hello";
ns.obj = {
first_name: 'John',
last_name: 'Doe',
born: new Date(1980, 1, 1),
favorite_foods: ['Pizza', 'Salad', 'Indian', 'Sushi'],
can_juggle: true
};
console.log(ns.sizeof(ns.undef));
console.log(ns.sizeof(ns.bool));
console.log(ns.sizeof(ns.num));
console.log(ns.sizeof(ns.string));
console.log(ns.sizeof(ns.obj));
console.log(ns.sizeof(ns.obj.favorite_foods));
I believe you forgot to include 'array'.
typeOf : function(value) {
var s = typeof value;
if (s === 'object')
{
if (value)
{
if (typeof value.length === 'number' && !(value.propertyIsEnumerable('length')) && typeof value.splice === 'function')
{
s = 'array';
}
}
else
{
s = 'null';
}
}
return s;
},
estimateSizeOfObject: function(value, level)
{
if(undefined === level)
level = 0;
var bytes = 0;
if ('boolean' === this.typeOf(value))
bytes = 4;
else if ('string' === this.typeOf(value))
bytes = value.length * 2;
else if ('number' === this.typeOf(value))
bytes = 8;
else if ('object' === this.typeOf(value) || 'array' === this.typeOf(value))
{
for (var i in value)
{
bytes += i.length * 2;
bytes += 8; // an assumed existence overhead
bytes += this.estimateSizeOfObject(value[i], 1);
}
}
return bytes;
},
formatByteSize : function(bytes)
{
if (bytes < 1024)
return bytes + " bytes";
else
{
var floatNum = bytes/1024;
return floatNum.toFixed(2) + " kb";
}
},
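These functions are written as object-literal members, so they need a home object to be callable. Assuming they are attached to an object called sizeUtil (an assumed name, not part of the original snippet), usage would look roughly like:
// Prints "134 bytes" for the default Student object from the question
console.log(sizeUtil.formatByteSize(sizeUtil.estimateSizeOfObject(new Student(), 0)));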
I know this is absolutely not the right way to do it, yet it has helped me a few times in the past to get an approximate object size:
Write your object/response to the console or a new tab, copy the result into a new Notepad file, save it, and check the file size. The Notepad file itself is just a few bytes, so you'll get a fairly accurate object size.
Related
I am trying to solve the Hackerrank problem Jesse and Cookies:
Jesse loves cookies and wants the sweetness of some cookies to be greater than value 𝑘. To do this, two cookies with the least sweetness are repeatedly mixed. This creates a special combined cookie with:
sweetness = (1 × Least sweet cookie + 2 × 2nd least sweet cookie).
This occurs until all the cookies have a sweetness ≥ 𝑘.
Given the sweetness of a number of cookies, determine the minimum number of operations required. If it is not possible, return −1.
Example
k = 9
A = [2,7,3,6,4,6]
The smallest values are 2, 3.
Remove them then return 2 + 2 × 3 = 8 to the array. Now A = [8,7,6,4,6].
Remove 4, 6 and return 4 + 2 × 6 = 16 to the array. Now A = [16,8,7,6].
Remove 6, 7, return 6 + 2 × 7 = 20 and A = [20,16,8,7].
Finally, remove 8, 7 and return 7 + 2 × 8 = 23 to A. Now A = [23,20,16].
All values are ≥ 𝑘 = 9 so the process stops after 4 iterations. Return 4.
I couldn't find a JavaScript solution or a hint for this problem. My code seems to be working, except that it times out for a large array (input size > 1 million).
Is there a way to make my code more efficient? I think the time complexity is between linear and O(n log n).
My Code:
function cookies(k, A) {
A.sort((a,b)=>a-b)
let ops = 0;
while (A[0] < k && A.length > 1) {
ops++;
let calc = (A[0] * 1) + (A[1] * 2);
A.splice(0, 2);
let inserted = false
if (A.length === 0) { // when the array is empty after splice
A.push(calc);
} else {
for (var i = 0; i < A.length && !inserted; i++) {
if (A[A.length - 1] < calc) {
A.push(calc)
inserted = true
} else if (A[i] >= calc) {
A.splice(i, 0, calc);
inserted = true
}
}
}
}
if (A[0] < k) {
ops = -1;
}
return ops;
}
It is indeed a problem that can be solved efficiently with a heap. As JavaScript has no native heap, just implement your own.
You should also cope with inputs that are huge but where most values are already at least k. Those values should not have to be part of the heap; they would just make heap operations unnecessarily slower. Also, when two cookies are combined, the result only needs to go back into the heap when it is not yet sweet enough.
Special care needs to be taken when the heap ends up with just one value (less than k). In that case it needs to be checked whether any good cookies were created earlier (and thus did not end up in the heap). If so, then one more operation yields the solution. But if not, there is no solution and -1 should be returned.
Here is an implementation in JavaScript:
/* MinHeap implementation without payload. */
const MinHeap = {
/* siftDown:
* The node at the given index of the given heap is sifted down in its subtree
* until it does not have a child with a lesser value.
*/
siftDown(arr, i=0, value=arr[i]) {
if (i >= arr.length) return;
while (true) {
// Choose the child with the least value
let j = i*2+1;
if (j+1 < arr.length && arr[j] > arr[j+1]) j++;
// If no child has lesser value, then we've found the spot!
if (j >= arr.length || value <= arr[j]) break;
// Move the selected child value one level up...
arr[i] = arr[j];
// ...and consider the child slot for putting our sifted value
i = j;
}
arr[i] = value; // Place the sifted value at the found spot
},
/* heapify:
* The given array is reordered in-place so that it becomes a valid heap.
* Elements are compared directly by value (plain numbers in this use case).
* It also returns the heap.
*/
heapify(arr) {
// Establish heap with an incremental, bottom-up process
for (let i = arr.length>>1; i--; ) this.siftDown(arr, i);
return arr;
},
/* pop:
* Extracts the root value of the given heap, and returns it.
* Returns undefined if the heap is empty.
*/
pop(arr) {
// Pop the last leaf from the given heap, and exchange it with its root
return this.exchange(arr, arr.pop());
},
/* exchange:
* Replaces the root node of the given heap with the given node, and returns the previous root.
* Returns the given node if the heap is empty.
* This is similar to a call of pop and push, but is more efficient.
*/
exchange(arr, value) {
if (!arr.length) return value;
// Get the root node, so to return it later
let oldValue = arr[0];
// Inject the replacing node using the sift-down process
this.siftDown(arr, 0, value);
return oldValue;
},
/* push:
* Inserts the given node into the given heap. It returns the heap.
*/
push(arr, value) {
// First assume the insertion spot is at the very end (as a leaf)
let i = arr.length;
let j;
// Then follow the path to the root, moving values down for as long as they
// are greater than the value to be inserted
while ((j = (i-1)>>1) >= 0 && value < arr[j]) {
arr[i] = arr[j];
i = j;
}
// Found the insertion spot
arr[i] = value;
return arr;
}
};
function cookies(k, arr) {
// Remove values that are already OK so to keep heap size minimal
const heap = arr.filter(val => val < k);
let greaterPresent = heap.length < arr.length; // Mark whether there is a good cookie
MinHeap.heapify(heap);
let result = 0;
while (heap.length > 1) {
const newValue = MinHeap.pop(heap) + MinHeap.pop(heap) * 2;
// Only push result back to heap if it still is not great enough
if (newValue < k) MinHeap.push(heap, newValue);
else greaterPresent = true; // Otherwise just mark that we have a good cookie
result++;
}
// If no good cookies were created, then return -1
// Otherwise, if there is still 1 element in the heap, add 1
return greaterPresent ? result + heap.length : -1;
}
// Example run
console.log(cookies(9, [2,7,3,6,4,6])); // 4
I solved it using Java; you may adapt it to JavaScript.
This code does not require a heap; it just works on the same array that is passed in. It passed all tests for me.
static int cookies(int k, int[] arr) {
/*
* Write your code here.
*/
Arrays.sort(arr);
int i = 0,
c = arr.length,
i0 = 0,
c0 = 0,
op = 0;
while( (arr[i]<k || arr[i0]<k) && (c0-i0 + c-i)>1 ) {
int s1 = i0==c0 || arr[i]<=arr[i0] ? arr[i++] : arr[i0++],
s2 = i0==c0 || (i!=c && arr[i]<=arr[i0]) ? arr[i++] : arr[i0++];
arr[c0++] = s1 + 2*s2;
op++;
if( i==c ) {
i = i0;
c = c0;
c0 = i0;
}
}
return c-i>1 || arr[i]>=k ? op : -1;
}
First of all, sort the array.
Newly calculated values are stored in the arr[i0..c0] range; this range does not need sorting, because the new values are produced in increasing order.
When the arr[i..c] range is exhausted (i == c), forget it and continue working on arr[i0..c0].
I've been trying to figure this problem out for a couple of days and can't seem to optimize the solution enough to make it tenable.
I have a tree that's 100 levels deep, so brute force (2^100 possible combinations?) is obviously not going to work.
Here is the code I have so far:
// Node constructor | structure of nodes
var Node = function(val) {
this.val = Number(val);
this.left = null;
this.right = null;
}
function maxPathUtil(top, store) {
if (top === null || top === undefined) return 0;
if (top.left === null && top.right === null) {
// console.log("hello");
return top.val;
}
var leftSub = maxPathUtil(top.left, store); // declared locally so recursive calls don't clobber them
var rightSub = maxPathUtil(top.right, store);
store[0] = Math.max(store[0], leftSub+rightSub+top.val);
return Math.max(leftSub, rightSub) + top.val;
}
function maxPathSum(top) {
var store = [];
store[0] = 0;
maxPathUtil(top, store)
return store[0];
}
var top = nodify(levels);
console.log(maxPathSum(top));
Is there a way to memoize this solution / otherwise improve the big O or is that as efficient as it gets?
So, each node should have an additional property; let's call it "currLength".
When you parse the text file and add the nodes to the tree, you need to set this value to the parent node's currLength plus the current node's value.
After this you only need to check the bottom nodes of the tree to get the length of each path.
If you want to optimize this further, you can add a currMaxLength variable to the tree, set it to -Infinity initially, and whenever you add a node with a higher currLength, set currMaxLength to that currLength.
Keep in mind that this doesn't make anything faster; it just moves the complexity to the init part. It works like insertion sort.
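A minimal sketch of the idea, assuming the Node constructor from the question; insertChild and the tree wrapper object are hypothetical helpers, not part of the original code:
// Attach a running root-to-node total (currLength) while building the tree,
// and track the best total seen so far on the tree object itself.
function insertChild(parent, side, val, tree) {
    var node = new Node(val);
    node.currLength = (parent ? parent.currLength : 0) + node.val;
    if (parent) parent[side] = node;
    if (node.currLength > tree.currMaxLength) tree.currMaxLength = node.currLength;
    return node;
}
var tree = { root: null, currMaxLength: -Infinity };
tree.root = insertChild(null, null, 5, tree);
var left = insertChild(tree.root, 'left', 3, tree);
insertChild(left, 'right', 7, tree); // tree.currMaxLength is now 15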
Here's how I've solved this problem in JavaScript; hope it helps!
var hasPathSum = function(root, sum) {
if (root == null) {
return false
}
var result = checkTree(root,root.val,sum)
console.log(result)
return result
};
function checkTree(node,currentSum,sum) {
if (currentSum == sum && node.left == null && node.right == null) {
return true
}
var result = false
if (node.left != null) {
result = result || checkTree(node.left,currentSum+node.left.val,sum)
}
if (node.right != null) {
result = result || checkTree(node.right,currentSum+node.right.val,sum)
}
return result
}
Also on my GitHub.
function pathSum(root, sum) {
// Base case for the recursion
if (root === null) return false;
// Another base case: a leaf whose value completes the required sum
if (root.val === sum && root.left === null && root.right === null) return true;
// Recurse into both subtrees with the remaining sum; (||) short-circuits,
// so the right subtree is only tried if the left recursion returns false
return pathSum(root.left, sum - root.val) || pathSum(root.right, sum - root.val);
}
Is there a way to test whether two JavaScript ArrayBuffers are equal? I would like to write a test for a message-composing method. The only way I found is to convert the ArrayBuffer to a string and then compare. Did I miss something?
The following code prints false, even though I think it should print true:
(function() {
'use strict';
/* Fill buffer with data of Verse header and user_auth
* command */
var buf_pos = 0;
var name_len = 6
var message_len = 4 + 1 + 1 + 1 + name_len + 1;
var buf = new ArrayBuffer(message_len);
var view = new DataView(buf);
/* Verse header starts with version */
view.setUint8(buf_pos, 1 << 4); /* First 4 bits are reserved for version of protocol */
buf_pos += 2;
/* The length of the message */
view.setUint16(buf_pos, message_len);
buf_pos += 2;
buf_pos = 0;
var buf2 = new ArrayBuffer(message_len);
var view2 = new DataView(buf2);
/* Verse header starts with version */
view2.setUint8(buf_pos, 1 << 4); /* First 4 bits are reserved for version of protocol */
buf_pos += 2;
/* The length of the message */
view2.setUint16(buf_pos, message_len);
buf_pos += 2;
if(buf == buf2){
console.log('true');
}
else{
console.log('false');
}
}());
If I try to compare view and view2 it's false again.
You cannot compare two objects directly in JavaScript using == or ===.
These operators only check the equality of references (i.e. whether both expressions reference the same object).
You can, however, use DataView or typed-array views to retrieve values of specific parts of the ArrayBuffer objects and check them.
If you want to check headers:
if ( view1.getUint8 (0) == view2.getUint8 (0)
&& view1.getUint16(2) == view2.getUint16(2)) ...
Or if you want to check the globality of your buffers:
function equal (buf1, buf2)
{
if (buf1.byteLength != buf2.byteLength) return false;
var dv1 = new Int8Array(buf1);
var dv2 = new Int8Array(buf2);
for (var i = 0 ; i != buf1.byteLength ; i++)
{
if (dv1[i] != dv2[i]) return false;
}
return true;
}
If you want to implement a complex data structure based on ArrayBuffer, I suggest creating your own class; otherwise you will have to resort to cumbersome raw DataView / typed-array accesses every time you want to move a single value in or out of the structure.
In general JavaScript, you currently have to compare two ArrayBuffer objects by wrapping each in a TypedArray, then manually iterating over the elements and doing element-wise equality checks.
If the underlying buffer is 2- or 4-byte memory-aligned, you can make a significant optimization by employing Uint16 or Uint32 typed arrays for the comparison.
/**
* compare two binary arrays for equality
* @param {(ArrayBuffer|ArrayBufferView)} a
* @param {(ArrayBuffer|ArrayBufferView)} b
*/
function equal(a, b) {
if (a instanceof ArrayBuffer) a = new Uint8Array(a, 0);
if (b instanceof ArrayBuffer) b = new Uint8Array(b, 0);
if (a.byteLength != b.byteLength) return false;
if (aligned32(a) && aligned32(b))
return equal32(a, b);
if (aligned16(a) && aligned16(b))
return equal16(a, b);
return equal8(a, b);
}
function equal8(a, b) {
const ua = new Uint8Array(a.buffer, a.byteOffset, a.byteLength);
const ub = new Uint8Array(b.buffer, b.byteOffset, b.byteLength);
return compare(ua, ub);
}
function equal16(a, b) {
const ua = new Uint16Array(a.buffer, a.byteOffset, a.byteLength / 2);
const ub = new Uint16Array(b.buffer, b.byteOffset, b.byteLength / 2);
return compare(ua, ub);
}
function equal32(a, b) {
const ua = new Uint32Array(a.buffer, a.byteOffset, a.byteLength / 4);
const ub = new Uint32Array(b.buffer, b.byteOffset, b.byteLength / 4);
return compare(ua, ub);
}
function compare(a, b) {
for (let i = a.length; -1 < i; i -= 1) {
if ((a[i] !== b[i])) return false;
}
return true;
}
function aligned16(a) {
return (a.byteOffset % 2 === 0) && (a.byteLength % 2 === 0);
}
function aligned32(a) {
return (a.byteOffset % 4 === 0) && (a.byteLength % 4 === 0);
}
and called via:
equal(buf1, buf2)
Here are the performance tests for 1-, 2- and 4-byte aligned memory.
Alternatives:
You may also get more performance with WASM, but it's possible that the cost of transferring the data to its heap would negate the comparison benefit.
Within Node.js you may get more performance with Buffer, as it is backed by native code: Buffer.from(buf1, 0).equals(Buffer.from(buf2, 0))
In today's V8, DataView should now be "usable for performance-critical real-world applications" — https://v8.dev/blog/dataview
The functions below test equality based on the objects you already have instantiated. If you already have TypedArray objects, you could compare them directly without creating additional DataView objects for them (someone is welcome to measure performance for both options).
// compare ArrayBuffers
function arrayBuffersAreEqual(a, b) {
return dataViewsAreEqual(new DataView(a), new DataView(b));
}
// compare DataViews
function dataViewsAreEqual(a, b) {
if (a.byteLength !== b.byteLength) return false;
for (let i=0; i < a.byteLength; i++) {
if (a.getUint8(i) !== b.getUint8(i)) return false;
}
return true;
}
// compare TypedArrays
function typedArraysAreEqual(a, b) {
if (a.byteLength !== b.byteLength) return false;
return a.every((val, i) => val === b[i]);
}
To test for equality between two TypedArrays, consider using the every method, which exits as soon as an inconsistency is found:
const a = Uint8Array.from([0,1,2,3]);
const b = Uint8Array.from([0,1,2,3]);
const c = Uint8Array.from([0,1,2,3,4]);
const areEqual = (first, second) =>
first.length === second.length && first.every((value, index) => value === second[index]);
console.log(areEqual(a, b));
console.log(areEqual(a, c));
This is less expensive than alternatives (like toString() comparisons) which iterate over the remaining array even after a difference is found.
I wrote these functions to compare the most common data types. They work with ArrayBuffer, TypedArray, DataView, Node.js Buffer and any normal Array containing byte data (0-255).
// It will not copy any underlying buffers, instead it will create a view into them.
function dataToUint8Array(data) {
let uint8array
if (data instanceof ArrayBuffer || Array.isArray(data)) {
uint8array = new Uint8Array(data)
} else if (data instanceof Buffer) { // Node.js Buffer
uint8array = new Uint8Array(data.buffer, data.byteOffset, data.length)
} else if (ArrayBuffer.isView(data)) { // DataView, TypedArray or Node.js Buffer
uint8array = new Uint8Array(data.buffer, data.byteOffset, data.byteLength)
} else {
throw Error('Data is not an ArrayBuffer, TypedArray, DataView or a Node.js Buffer.')
}
return uint8array
}
function compareData(a, b) {
a = dataToUint8Array(a); b = dataToUint8Array(b)
if (a.byteLength != b.byteLength) return false
return a.every((val, i) => val == b[i])
}
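For example (Node.js assumed here, since the Buffer branch only exists there):
// A Node.js Buffer and a plain Uint8Array holding the same bytes compare as equal
const nodeBuf = Buffer.from([1, 2, 3]);
const typed = Uint8Array.from([1, 2, 3]);
console.log(compareData(nodeBuf, typed)); // true
console.log(compareData(nodeBuf, new ArrayBuffer(3))); // false (a new ArrayBuffer is zero-filled)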
You can always convert the arrays into strings and compare them. E.g.
let a = new Uint8Array([1, 2, 3, 4]);
let b = new Uint8Array([1, 2, 3, 4]);
if (a.toString() == b.toString()) {
console.log("Yes");
} else {
console.log("No");
}
I want to know the size occupied by a JavaScript object.
Take the following function:
function Marks(){
this.maxMarks = 100;
}
function Student(){
this.firstName = "firstName";
this.lastName = "lastName";
this.marks = new Marks();
}
Now I instantiate the student:
var stud = new Student();
so that I can do stuff like
stud.firstName = "new Firstname";
alert(stud.firstName);
stud.marks.maxMarks = 200;
etc.
Now, the stud object will occupy some size in memory. It has some data and more objects.
How do I find out how much memory the stud object occupies? Something like a sizeof() in JavaScript? It would be really awesome if I could find it out in a single function call like sizeof(stud).
I’ve been searching the Internet for months—couldn’t find it (asked in a couple of forums—no replies).
I have re-factored the code in my original answer. I have removed the recursion and removed the assumed existence overhead.
function roughSizeOfObject( object ) {
var objectList = [];
var stack = [ object ];
var bytes = 0;
while ( stack.length ) {
var value = stack.pop();
if ( typeof value === 'boolean' ) {
bytes += 4;
}
else if ( typeof value === 'string' ) {
bytes += value.length * 2;
}
else if ( typeof value === 'number' ) {
bytes += 8;
}
else if
(
typeof value === 'object'
&& objectList.indexOf( value ) === -1
)
{
objectList.push( value );
for( var i in value ) {
stack.push( value[ i ] );
}
}
}
return bytes;
}
The Google Chrome Heap Profiler allows you to inspect object memory use.
You need to be able to locate the object in the trace which can be tricky. If you pin the object to the Window global, it is pretty easy to find from the "Containment" listing mode.
In the attached screenshot, I created an object called "testObj" on the window. I then located in the profiler (after making a recording) and it shows the full size of the object and everything in it under "retained size".
More details on the memory breakdowns.
In the above screenshot, the object shows a retained size of 60. I believe the unit is bytes here.
Sometimes I use this to flag really big objects that might be going to the client from the server. It doesn't represent the in memory footprint. It just gets you approximately what it'd cost to send it, or store it.
Also note, it's slow, dev only. But for getting an ballpark answer with one line of code it's been useful for me.
roughObjSize = JSON.stringify(bigObject).length;
I just wrote this to solve a similar (ish) problem. It doesn't exactly do what you may be looking for, ie it doesn't take into account how the interpreter stores the object.
But, if you are using V8, it should give you a fairly ok approximation as the awesome prototyping and hidden classes lick up most of the overhead.
function roughSizeOfObject( object ) {
var objectList = [];
var recurse = function( value )
{
var bytes = 0;
if ( typeof value === 'boolean' ) {
bytes = 4;
}
else if ( typeof value === 'string' ) {
bytes = value.length * 2;
}
else if ( typeof value === 'number' ) {
bytes = 8;
}
else if
(
typeof value === 'object'
&& objectList.indexOf( value ) === -1
)
{
objectList[ objectList.length ] = value;
for( i in value ) {
bytes+= 8; // an assumed existence overhead
bytes+= recurse( value[i] )
}
}
return bytes;
}
return recurse( object );
}
Here's a slightly more compact solution to the problem:
const typeSizes = {
"undefined": () => 0,
"boolean": () => 4,
"number": () => 8,
"string": item => 2 * item.length,
"object": item => !item ? 0 : Object
.keys(item)
.reduce((total, key) => sizeOf(key) + sizeOf(item[key]) + total, 0)
};
const sizeOf = value => typeSizes[typeof value](value);
There is a NPM module to get object sizeof, you can install it with npm install object-sizeof
var sizeof = require('object-sizeof');
// 2B per character, 6 chars total => 12B
console.log(sizeof({abc: 'def'}));
// 8B for Number => 8B
console.log(sizeof(12345));
var param = {
'a': 1,
'b': 2,
'c': {
'd': 4
}
};
// 4 one two-bytes char strings and 3 eighth-bytes numbers => 32B
console.log(sizeof(param));
This is a hacky method, but i tried it twice with different numbers and it seems to be consistent.
What you can do is to try and allocate a huge number of objects, like one or two million objects of the kind you want. Put the objects in an array to prevent the garbage collector from releasing them (note that this will add a slight memory overhead because of the array, but i hope this shouldn't matter and besides if you are going to worry about objects being in memory, you store them somewhere). Add an alert before and after the allocation and in each alert check how much memory the Firefox process is taking. Before you open the page with the test, make sure you have a fresh Firefox instance. Open the page, note the memory usage after the "before" alert is shown. Close the alert, wait for the memory to be allocated. Subtract the new memory from the older and divide it by the amount of allocations. Example:
function Marks()
{
this.maxMarks = 100;
}
function Student()
{
this.firstName = "firstName";
this.lastName = "lastName";
this.marks = new Marks();
}
var manyObjects = new Array();
alert('before');
for (var i=0; i<2000000; i++)
manyObjects[i] = new Student();
alert('after');
I tried this on my computer and the process had 48352K of memory when the "before" alert was shown. After the allocation, Firefox had 440236K of memory. For 2 million allocations, this is about 200 bytes per object.
I tried it again with 1 million allocations and the result was similar: 196 bytes per object (I suppose the extra data in the 2 million run was used for the Array).
So, here is a hacky method that might help you. JavaScript doesn't provide a sizeof method for a reason: each JavaScript implementation is different. In Google Chrome, for example, the same page uses about 66 bytes per object (judging from the task manager, at least).
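If you would rather read the number from script instead of from the task manager, a similar measurement can be sketched with the non-standard performance.memory API that Chromium-based browsers expose (the function name and sample count here are my own, and the result is still only a ballpark because garbage collection can run at any time):
function approxBytesPerObject(factory, count) {
    var before = performance.memory.usedJSHeapSize;
    var keep = [];                        // hold references so nothing gets collected
    for (var i = 0; i < count; i++) {
        keep.push(factory());
    }
    var after = performance.memory.usedJSHeapSize;
    return (after - before) / count;      // rough per-object estimate
}
// e.g. approxBytesPerObject(function () { return new Student(); }, 1000000);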
I had the same problem, searched on Google, and want to share this solution with the Stack Overflow community.
Important:
I used the function shared by Yan Qing on github
https://gist.github.com/zensh/4975495
function memorySizeOf(obj) {
var bytes = 0;
function sizeOf(obj) {
if(obj !== null && obj !== undefined) {
switch(typeof obj) {
case 'number':
bytes += 8;
break;
case 'string':
bytes += obj.length * 2;
break;
case 'boolean':
bytes += 4;
break;
case 'object':
var objClass = Object.prototype.toString.call(obj).slice(8, -1);
if(objClass === 'Object' || objClass === 'Array') {
for(var key in obj) {
if(!obj.hasOwnProperty(key)) continue;
sizeOf(obj[key]);
}
} else bytes += obj.toString().length * 2;
break;
}
}
return bytes;
};
function formatByteSize(bytes) {
if(bytes < 1024) return bytes + " bytes";
else if(bytes < 1048576) return(bytes / 1024).toFixed(3) + " KiB";
else if(bytes < 1073741824) return(bytes / 1048576).toFixed(3) + " MiB";
else return(bytes / 1073741824).toFixed(3) + " GiB";
};
return formatByteSize(sizeOf(obj));
};
var sizeOfStudentObject = memorySizeOf({Student: {firstName: 'firstName', lastName: 'lastName', marks: 10}});
console.log(sizeOfStudentObject); // prints "42 bytes": (9 + 8) characters at 2 bytes each plus 8 for the number; property names are not counted
What do you think about it?
Sorry I could not comment, so I am just continuing the work from tomwrong.
This enhanced version will not count an object more than once, thus no infinite loop.
Plus, I reckon the keys of an object should also be counted, roughly.
function roughSizeOfObject( value, level ) {
if(level == undefined) level = 0;
var bytes = 0;
if ( typeof value === 'boolean' ) {
bytes = 4;
}
else if ( typeof value === 'string' ) {
bytes = value.length * 2;
}
else if ( typeof value === 'number' ) {
bytes = 8;
}
else if ( typeof value === 'object' && value !== null ) {
if(value['__visited__']) return 0;
value['__visited__'] = 1;
for( var i in value ) {
bytes += i.length * 2;
bytes += 8; // an assumed existence overhead
bytes += roughSizeOfObject( value[i], 1 );
}
}
if(level == 0){
clear__visited__(value);
}
return bytes;
}
function clear__visited__(value){
if(typeof value == 'object' && value !== null){
delete value['__visited__'];
for(var i in value){
clear__visited__(value[i]);
}
}
}
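// For example, with a self-referencing object (my own test data; the circular
// link shows that the __visited__ marker stops what would otherwise recurse forever):
var a = { name: "stud", marks: { maxMarks: 200 } };
a.self = a; // circular reference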
roughSizeOfObject(a);
"I want to know if my memory reduction efforts actually help in reducing memory."
Following up on this comment, here's what you should do:
Try to produce a memory problem - write code that creates all these objects and gradually increase the upper limit until you run into a problem (browser crash, browser freeze, or an out-of-memory error); a minimal sketch of such a loop follows the two options below. Ideally you should repeat this experiment with different browsers and different operating systems.
Now there are two options:
Option 1 - You didn't succeed in producing the memory problem. Hence, you are worrying about nothing. You don't have a memory issue and your program is fine.
Option 2 - You did get a memory problem. Now ask yourself whether the limit at which the problem occurred is reasonable (in other words: is it likely that this number of objects will be created during normal use of your code). If the answer is 'No', then you're fine. Otherwise you now know how many objects your code can create. Rework the algorithm so that it does not breach this limit.
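Here is that minimal stress-test sketch (the batch size is arbitrary and Student is the constructor from the question; watch the browser's task manager while it runs):
var held = [];            // keep references so the GC cannot reclaim anything
var batch = 100000;
function grow() {
    for (var i = 0; i < batch; i++) {
        held.push(new Student());
    }
    console.log('now holding ' + held.length + ' objects');
    setTimeout(grow, 0);  // keep going until the browser complains
}
grow();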
This JavaScript library, sizeof.js, does the same thing.
Include it like this:
<script type="text/javascript" src="sizeof.js"></script>
The sizeof function takes an object as a parameter and returns its approximate size in bytes. For example:
// define an object
var object =
{
'boolean' : true,
'number' : 1,
'string' : 'a',
'array' : [1, 2, 3]
};
// determine the size of the object
var size = sizeof(object);
The sizeof function can handle objects that contain multiple references to other objects and recursive references.
Originally published here.
If your main concern is the memory usage of your Firefox extension, I suggest checking with Mozilla developers.
Mozilla provides on its wiki a list of tools to analyze memory leaks.
Chrome Developer Tools has this functionality. I found this article very helpful, and it does exactly what you want:
https://developers.google.com/chrome-developer-tools/docs/heap-profiling
Many thanks to everyone that has been working on code for this!
I just wanted to add that I've been looking for exactly the same thing, but in my case it's for managing a cache of processed objects, to avoid re-parsing and re-processing objects from AJAX calls that may or may not have been cached by the browser. This is especially useful for objects that require a lot of processing, usually anything that isn't in JSON format, but keeping these things cached can get very costly in a large project or an app/extension that is left running for a long time.
Anyway, I use it for something like this:
var myCache = {
cache: {},
order: [],
size: 0,
maxSize: 2 * 1024 * 1024, // 2mb
add: function(key, object) {
// Work out the size of the new object first
var size = this.getObjectSize(object);
if (size > this.maxSize) return; // Can't store this object
var total = this.size + size;
// Check for existing entry, as replacing it will free up space
if (typeof(this.cache[key]) !== 'undefined') {
for (var i = 0; i < this.order.length; ++i) {
var entry = this.order[i];
if (entry.key === key) {
total -= entry.size;
this.order.splice(i, 1);
break;
}
}
}
while (total > this.maxSize) {
var entry = this.order.shift();
delete this.cache[entry.key];
total -= entry.size;
}
this.cache[key] = object;
this.order.push({ size: size, key: key });
this.size = total;
},
get: function(key) {
var value = this.cache[key];
if (typeof(value) !== 'undefined') { // Return this key for longer
for (var i = 0; i < this.order.length; ++i) {
var entry = this.order[i];
if (entry.key === key) {
this.order.splice(i, 1);
this.order.push(entry);
break;
}
}
}
return value;
},
getObjectSize: function(object) {
// Code from above estimating functions
},
};
It's a simplistic example and may have some errors, but it gives the idea: you can use it to hold onto static objects (whose contents won't change) with some degree of intelligence. This can significantly cut down on the expensive processing required to produce those objects in the first place.
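A quick usage sketch (the key name is invented, and roughSizeOfObject stands in for whichever estimator from this page you wire into getObjectSize):
myCache.getObjectSize = roughSizeOfObject;  // e.g. one of the estimators above
myCache.add('student:42', { firstName: 'firstName', marks: { maxMarks: 100 } });
var cached = myCache.get('student:42');     // undefined once it has been evicted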
The accepted answer does not work with Map, Set, WeakMap and other iterable objects. (The package object-sizeof, mentioned in another answer, has the same problem.)
Here's my fix
export function roughSizeOfObject(object) {
const objectList = [];
const stack = [object];
const bytes = [0];
while (stack.length) {
const value = stack.pop();
if (value == null) bytes[0] += 4;
else if (typeof value === 'boolean') bytes[0] += 4;
else if (typeof value === 'string') bytes[0] += value.length * 2;
else if (typeof value === 'number') bytes[0] += 8;
else if (typeof value === 'object' && objectList.indexOf(value) === -1) {
objectList.push(value);
if (typeof value.byteLength === 'number') bytes[0] += value.byteLength;
else if (value[Symbol.iterator]) {
// eslint-disable-next-line no-restricted-syntax
for (const v of value) stack.push(v);
} else {
Object.keys(value).forEach(k => {
bytes[0] += k.length * 2; stack.push(value[k]);
});
}
}
}
return bytes[0];
}
It also includes some other minor improvements: it counts the storage for keys and works with ArrayBuffer.
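For instance (my own quick check of the two cases the original missed):
console.log(roughSizeOfObject(new Map([['a', 1], ['b', 2]]))); // 20: two 1-char keys + two numbers
console.log(roughSizeOfObject(new ArrayBuffer(16)));           // 16, via byteLength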
function sizeOf(parent_data, size)
{
for (var prop in parent_data)
{
let value = parent_data[prop];
if (typeof value === 'boolean')
{
size += 4;
}
else if (typeof value === 'string')
{
size += value.length * 2;
}
else if (typeof value === 'number')
{
size += 8;
}
else
{
// assume it's an object; sizeOf() continues from the running total it is given
size = sizeOf(value, size);
}
}
return size;
}
function roughSizeOfObject(object)
{
// `for each...in` is non-standard and has been removed from JavaScript;
// sizeOf() above already walks every property of the object it is given.
return sizeOf(object, 0);
}
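For example, with the Student constructor from the question this returns 42 (the strings "firstName" and "lastName" at 2 bytes per character, plus 8 bytes for the nested number):
console.log(roughSizeOfObject(new Student())); // 42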
I use the Timeline tab in Chrome dev tools, instantiate increasingly large numbers of objects, and get good estimates that way. You can use HTML like the one below as boilerplate and modify it to better simulate the characteristics of your objects (number and types of properties, etc.). You may want to click the trash-bin icon at the bottom of that dev tools tab before and after a run.
<html>
<script>
var size = 1000*100
window.onload = function() {
document.getElementById("quantifier").value = size
}
function scaffold()
{
console.log("processing Scaffold...");
a = new Array
}
function start()
{
size = document.getElementById("quantifier").value
console.log("Starting... quantifier is " + size);
console.log("starting test")
for (var i=0; i<size; i++){
a[i]={"some" : "thing"}
}
console.log("done...")
}
function tearDown()
{
console.log("processing teardown");
a.length=0
}
</script>
<body>
<span style="color:green;">Quantifier:</span>
<input id="quantifier" style="color:green;" type="text"/>
<button onclick="scaffold()">Scaffold</button>
<button onclick="start()">Start</button>
<button onclick="tearDown()">Clean</button>
<br/>
</body>
</html>
Instantiating 2 million objects of just one property each (as in the code above) gives a rough figure of about 50 bytes per object on my Chromium right now. Changing the code to create a random string per object adds some 30 bytes per object, etc.
Hope this helps.
If you need to programmatically check for the approximate size of objects, you can also check this library http://code.stephenmorley.org/javascript/finding-the-memory-usage-of-objects/ which I have been able to use for object sizes.
Otherwise I suggest to use the Chrome/Firefox Heap Profiler.
I had problems with the above answer when passing it an ArrayBuffer.
After checking the documentation, I found that ArrayBuffer has a byteLength property which tells me exactly what I need, hence:
function sizeOf(data)
{
if (typeof(data) === 'object')
{
if (data instanceof ArrayBuffer)
{
return data.byteLength;
}
// other objects go here
}
// non-object cases go here
}
console.log(sizeOf(new ArrayBuffer(15))); // 15
Reference:
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/ArrayBuffer/byteLength
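One way to combine this with the general-purpose estimators earlier on the page (the wrapper name and the fallback to roughSizeOfObject are my own suggestion, not part of the snippet above):
function sizeOfWithBuffers(data)
{
    if (data instanceof ArrayBuffer) return data.byteLength;
    if (ArrayBuffer.isView(data)) return data.byteLength; // typed arrays and DataView
    return roughSizeOfObject(data); // fall back to any of the estimators above
}
console.log(sizeOfWithBuffers(new Float64Array(4))); // 32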
Building upon the already compact solution from #Dan, here's a self-contained function version of it. Variable names are reduced to single letters for those who just want it to be as compact as possible at the expense of context.
const ns = {};
ns.sizeof = function(v) {
let f = ns.sizeof, //this needs to match the name of the function itself, since arguments.callee.name is defunct
o = {
"undefined": () => 0,
"boolean": () => 4,
"number": () => 8,
"string": i => 2 * i.length,
"object": i => !i ? 0 : Object
.keys(i)
.reduce((t, k) => f(k) + f(i[k]) + t, 0)
};
return o[typeof v](v);
};
ns.undef; // never assigned, so ns.undef is undefined
ns.bool = true;
ns.num = 1;
ns.string = "Hello";
ns.obj = {
first_name: 'John',
last_name: 'Doe',
born: new Date(1980, 1, 1),
favorite_foods: ['Pizza', 'Salad', 'Indian', 'Sushi'],
can_juggle: true
};
console.log(ns.sizeof(ns.undef));              // 0
console.log(ns.sizeof(ns.bool));               // 4
console.log(ns.sizeof(ns.num));                // 8
console.log(ns.sizeof(ns.string));             // 10
console.log(ns.sizeof(ns.obj));                // 162 (the Date contributes 0: it has no enumerable own keys)
console.log(ns.sizeof(ns.obj.favorite_foods)); // 50
I believe you forgot to include 'array'.
typeOf : function(value) {
var s = typeof value;
if (s === 'object')
{
if (value)
{
if (typeof value.length === 'number' && !(value.propertyIsEnumerable('length')) && typeof value.splice === 'function')
{
s = 'array';
}
}
else
{
s = 'null';
}
}
return s;
},
estimateSizeOfObject: function(value, level)
{
if(undefined === level)
level = 0;
var bytes = 0;
if ('boolean' === this.typeOf(value))
bytes = 4;
else if ('string' === this.typeOf(value))
bytes = value.length * 2;
else if ('number' === this.typeOf(value))
bytes = 8;
else if ('object' === this.typeOf(value) || 'array' === this.typeOf(value))
{
for(var i in value)
{
bytes += i.length * 2;
bytes += 8; // an assumed existence overhead
bytes += this.estimateSizeOfObject(value[i], 1);
}
}
return bytes;
},
formatByteSize : function(bytes)
{
if (bytes < 1024)
return bytes + " bytes";
else
{
var floatNum = bytes/1024;
return floatNum.toFixed(2) + " kb";
}
},
I know this is absolutely not the right way to do it, yet it has helped me a few times in the past to get an approximate object size:
Write your object/response to the console or a new tab, copy the result into a new Notepad file, save it, and check the file size. The Notepad file itself is just a few bytes, so you'll get a fairly accurate object size.