Array with fewer values than length - null overhead in JSON - javascript

I'm writing a simple game. It's based on a field of 32x32 squares. Each square can contain a game element. When an element appears/moves/changes, the server sends a JSON object containing a 2d array that represents the whole game field with all its elements via websockets to all clients.
That is quite a lot of data when sent to multiple clients, multiple times per second.
So I was wondering how I could reduce that.
I thought it would help to remove everything from the big array that the client already received anyways. But if you have something like this:
var gameField = [];
gameField[6] = [];
gameField[6][10] = "player1";
console.log(JSON.stringify(gameField));
it converts to:
[null,null,null,null,null,null,[null,null,null,null,null,null,null,null,null,null,"player1"]]
That's really unnecessary overhead.
Couldn't it generate something like this instead?
{"6": {"10":"player1"}}
I'm using node.js for the game server btw.
So how do I reduce the data? Is there maybe a special JSON.stringify method?

You can use an object instead of an array.
var gameField = {};
gameField[6] = gameField[6] || {};
gameField[6][10] = "player1";
console.log(JSON.stringify(gameField)); // {"6":{"10":"player1"}}
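If you end up writing the row-initialization line in many places, a small helper keeps it in one spot (setCell is a hypothetical name, just a sketch of the pattern):

```javascript
// Hypothetical helper: set a value in a sparse "grid" object,
// creating the row object on first use.
function setCell(field, row, col, value) {
    field[row] = field[row] || {};
    field[row][col] = value;
}

var gameField = {};
setCell(gameField, 6, 10, "player1");
console.log(JSON.stringify(gameField)); // {"6":{"10":"player1"}}
```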

You can customize the JSON serialization by using a replacer function. Parsing it back with a reviver then depends on how the inner objects are recognized. For example, if the inner values are strings, as in the example:
var gameField = [];
gameField[6] = [];
gameField[6][10] = "player1";

function replacer(key, val) {
    if (val instanceof Array) {
        // convert sparse arrays to objects, dropping empty slots
        return val.reduce((o, el, i) => {
            if (el) o[i] = el;
            return o;
        }, {});
    }
    return val;
}

var json = JSON.stringify(gameField, replacer);
console.log("JSON: ", json); // JSON: {"6":{"10":"player1"}}

function reviver(key, val) {
    if (typeof val === 'string')
        return val;
    // convert objects back into (sparse) arrays
    return Object.keys(val).reduce((arr, k) => {
        arr[k] = val[k];
        return arr;
    }, []);
}

gameField = JSON.parse(json, reviver);
console.log("Parsed: ", gameField); // sparse array with gameField[6][10] === "player1"

If you want to remove the null values from the array, you can use the underscore library.
npm install underscore --save
In the file where the related code lives:
var _ = require('underscore');
To remove the null values, just pass:
result = _.filter(result, function(item) {
    return item != null;
});
result will be an array without any nulls. (Note that this drops the empty slots, so the remaining elements shift down to new indices.)

Related

Reading N-API object into C++ primitive

I've created a simple N-API module starting from the ObjectWrap boilerplate of generator-napi-module, and successfully passed data (an array containing objects with string, number and boolean properties) to JS. However, I'm unable to parse the properties of one of the same objects passed back to the native code; specifically, creating a uint32_t value from a property (a number) of the passed object.
Suppose an array of objects is created and passed to JS:
Napi::Value ObjectWrapAddon::GetSomeList(const Napi::CallbackInfo& info) {
    Napi::Env env = info.Env();
    native_struct_one *data = NULL;
    native_struct_two opts = { TRUE, FALSE, FALSE };
    int retVal = native_lib_method(&data, &opts);
    if (retVal != OK) {
        return Napi::Array::New(env); // return empty array
    }
    Napi::Array arr = Napi::Array::New(env);
    uint32_t i = 0;
    do {
        Napi::Object tempObj = Napi::Object::New(env);
        tempObj.Set("someProp", data->someVal);
        arr[i] = tempObj;
        i++;
        data = data->next;
    } while (data);
    return arr;
}
Then one of those objects is passed to a native function call:
Napi::Value ObjectWrapAddon::OtherMethod(const Napi::CallbackInfo& info) {
    Napi::Env env = info.Env();
    Napi::Object obj = info[0].As<Napi::Object>();
    uint32_t temp = obj.Get("someProp").As<Napi::Number>();
    return Napi::Number::New(env, temp);
}
This builds fine, but the above OtherMethod() gives an "A number was expected" error at uint32_t temp = obj.Get("someProp").As<Napi::Number>().
How would I create a native (C++) value from a JS object property value?
I missed two things, which allow this to work:
I was inconsistent with the strings used in Get/Set. The key passed to Napi::Object::Get must match the one used with Napi::Object::Set exactly; mixing quote styles between the two calls means looking up a different key.
The Uint32Value() method needs to be used, per the docs (I must have removed this in my tinkering), giving: uint32_t temp = obj.Get("someProp").As<Napi::Number>().Uint32Value();
Fixing these issues provides the expected behavior and output.

Reduce javascript object to unique identifier

I have an object that I'm storing page settings in that looks something like this:
var filters = {
    "brands": ["brand1", "brand2", "brand3"],
    "family": "reds",
    "palettes": ["palette1", "palette2", "palette3"],
    "color": "a1b2"
};
This object is constantly being changed as the user browses the page. I'm looking for a fast way in the code (maybe using a built-in jQuery or JavaScript function) to reduce the current settings object to a unique identifier I can reference, without using a lot of loops. Maybe something like this:
"brandsbrand1brand2brand3familyredspalettespalette1palette2palette3colora1b2"
Doesn't have to necessarily convert the object to a long string like that, as long as it is something that will be unique to a particular group of settings. And I won't need to convert this identifier back into the object later.
EDITS:
I need to give some more information.
I'm looking to store the results of the filtering I'm doing inside a variable named after the unique ID. So var uniqueID1 comes from the settings object that has brand1 and brand2, and contains ["filteredObject1_1","filteredObject1_2"...,"filteredObject1_500"], while var uniqueID2 comes from the settings object that has brand3 and brand4, and contains ["filteredObject2_1","filteredObject2_2"...,"filteredObject2_500"]. What I'm trying to do is avoid running really slow filtering code more than once on a bunch of items by storing the results of the filtering in uniquely named variables.
So:
Convert settings to a unique id and see if that variable exists.
If variable exists, just get that variable that has the already filtered items.
If variable doesn't exist, do the really slow filtering on hundreds of items and store these items in unique id variable.
Hopefully I just didn't make this more confusing. I feel like I probably made it more confusing.
You can use JSON, which is a method of stringifying objects that was designed for JavaScript.
var filters = {
    "brands": ["brand1", "brand2", "brand3"],
    "family": "reds",
    "palettes": ["palette1", "palette2", "palette3"],
    "color": "a1b2"
};
var uniqueId = JSON.stringify(filters);
uniqueId equals the following string:
{"brands":["brand1","brand2","brand3"],"family":"reds","palettes":["palette1","palette2","palette3"],"color":"a1b2"}
This has the added benefit of being able to be turned back into an object with JSON.parse(uniqueId).
Note that with JSON.stringify, two objects with exactly the same values, with keys in the same order, will be converted into the same unique id.
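One caveat worth keeping in mind: JSON.stringify preserves key insertion order, so two objects with the same values but keys added in a different order produce different ids:

```javascript
var a = { family: "reds", color: "a1b2" };
var b = { color: "a1b2", family: "reds" };

console.log(JSON.stringify(a)); // {"family":"reds","color":"a1b2"}
console.log(JSON.stringify(b)); // {"color":"a1b2","family":"reds"}
console.log(JSON.stringify(a) === JSON.stringify(b)); // false
```

If your settings object is always built in the same order this is not a problem; otherwise, sort the keys before stringifying.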
EDIT:
Please let me know if I interpreted your edit correctly, but I think this is what you want to do:
// object that will act as a cache
var cached_filters = {};
// this assumes the existence of a get_filter function that processes the filters object
function get_cached_filter(filters) {
    let uniqueId = JSON.stringify(filters);
    // use already cached filters
    if (cached_filters[uniqueId]) {
        return cached_filters[uniqueId];
    // create filter and cache it
    } else {
        cached_filters[uniqueId] = get_filter(filters);
        return cached_filters[uniqueId];
    }
}
This stores the result under a key derived from the filter object each time you call get_cached_filter. If get_cached_filter has already been called with the exact same filters, the result is served from the cache instead of being recreated; otherwise, it is created and saved in the cache.
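For illustration, here is the cache in action with a stand-in get_filter (the stub body and the run counter are assumptions, only there to show that the slow path runs once):

```javascript
// Stand-in for the really slow filtering step
var filterRuns = 0;
function get_filter(filters) {
    filterRuns++;
    return ["filteredObject1", "filteredObject2"]; // pretend this took a while
}

var cached_filters = {};
function get_cached_filter(filters) {
    var uniqueId = JSON.stringify(filters);
    if (!cached_filters[uniqueId]) {
        cached_filters[uniqueId] = get_filter(filters);
    }
    return cached_filters[uniqueId];
}

var settings = { brands: ["brand1", "brand2"], family: "reds" };
var first = get_cached_filter(settings);
var second = get_cached_filter(settings); // served from the cache
console.log(filterRuns);        // 1
console.log(first === second);  // true -- the exact same cached array
```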
You could iterate the filter object and filter the data with Array#filter.
data.filter(function (o) {
    return Object.keys(filters).every(function (k) {
        return Array.isArray(filters[k])
            ? filters[k].some(function (f) { return o[k] === f; })
            : o[k] === filters[k];
    });
});
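For example, with a small hypothetical data set (the field values are made up; this version assumes each data item stores a single value per key):

```javascript
var filters = { family: "reds", brands: ["brand1", "brand2"] };

var data = [
    { family: "reds",  brands: "brand1", color: "a1b2" },
    { family: "blues", brands: "brand1", color: "c3d4" },
    { family: "reds",  brands: "brand9", color: "e5f6" }
];

// keep items matching every filter key; array filters mean "any of these values"
var result = data.filter(function (o) {
    return Object.keys(filters).every(function (k) {
        return Array.isArray(filters[k])
            ? filters[k].some(function (f) { return o[k] === f; })
            : o[k] === filters[k];
    });
});

console.log(result); // only the first item matches both filters
```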
If you won't need to convert this identifier back into the object later, you can use this simple hashing function:
function UniqueHashCode(obj) {
    var str = JSON.stringify(obj);
    var hash = 0;
    if (str.length == 0) return hash;
    for (var i = 0; i < str.length; i++) {
        var char = str.charCodeAt(i);
        hash = ((hash << 5) - hash) + char;
        hash = hash & hash; // Convert to 32bit integer
    }
    return hash;
}
var filters = {
    "brands": ["brand1", "brand2", "brand3"],
    "family": "reds",
    "palettes": ["palette1", "palette2", "palette3"],
    "color": "a1b2"
};
alert(UniqueHashCode(filters));
This function creates a simple and very short integer (for example 661801383) from a given object.
I hope this is helpful for you :)

Merging 2 arrays in local storage in javascript

I am trying to append an array of objects (new) to local storage, which already has an array of objects (previous) in it. Specifically, I want to merge these two arrays (previous and new) in local storage.
I have tried the code below:
function appendToStorage(name, data) {
    var old = localStorage.getItem(name);
    if (old === null)
        old = "";
    localStorage.setItem(name, old.concat(data));
}
appendToStorage('ObjAry', JSON.stringify(objectIdArray));
appendToStorage('ObjAry', JSON.stringify(objectIdArray));
And this is the output that I am getting :
["IrGszUBa0F","l366vn6mPa","2qn7JUoRwg","s2fZa0mXnb","WIaXLwmXRa"]["ZKHtnHoHgH","rtbI1sDfPm","U1eVDi9bNM","tUGNCl6hNl","lkq6tswVsZ"]
All I want is that, the second array should append to the first array so the output becomes :
["IrGszUBa0F","l366vn6mPa","2qn7JUoRwg","s2fZa0mXnb","WIaXLwmXRa","ZKHtnHoHgH","rtbI1sDfPm","U1eVDi9bNM","tUGNCl6hNl","lkq6tswVsZ"]
Can anyone guide me on what I am doing wrong ?
You are pretty close, there are just three small mistakes:
You are stringifying the array before concatenating it (so you are attaching a string to an array).
The default value for old is a string, but it should probably be an array.
In order to use the array from localstorage, you need to parse it again using JSON.parse.
The resulting code would then be:
function appendToStorage(name, data) {
    var old = localStorage.getItem(name);
    if (old == null) {
        old = [];
    } else {
        old = JSON.parse(old);
    }
    localStorage.setItem(name, JSON.stringify(old.concat(data)));
}
appendToStorage('ObjAry', objectIdArray);
If your local storage entry could contain other values as well, you could add a try ... catch block to your code to make sure that JSON.parse doesn't blow up if it fails to parse the value:
function appendToStorage(name, data) {
    var old = localStorage.getItem(name);
    try {
        // JSON.parse(null) yields null without throwing, so fall back to []
        old = JSON.parse(old) || [];
    } catch (e) {
        old = [];
    }
    localStorage.setItem(name, JSON.stringify(old.concat(data)));
}
appendToStorage('ObjAry', objectIdArray);
You're concatenating the whole string returned from
JSON.stringify(objectIdArray)
Try
appendToStorage('ObjAry', objectIdArray);
localStorage stores string values by string key. To store arrays/objects as strings, we serialize them into JSON. So you need to parse the JSON after getItem, merge the parsed value with the new portion of data, convert the merged value back to JSON, and pass it to setItem.
function appendToStorage(name, data) {
    var old = localStorage.getItem(name) || '[]';
    var oldObject = JSON.parse(old) || [];
    var merged = oldObject.concat(data);
    localStorage.setItem(name, JSON.stringify(merged));
}
appendToStorage('ObjAry', objectIdArray);
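Outside the browser, the same merge logic can be exercised with a minimal in-memory stand-in for localStorage (the stub below is only for illustration; in a page you would use the real window.localStorage):

```javascript
// Minimal in-memory stand-in for window.localStorage
var localStorage = {
    _data: {},
    getItem: function (key) {
        return this._data.hasOwnProperty(key) ? this._data[key] : null;
    },
    setItem: function (key, value) {
        this._data[key] = String(value);
    }
};

function appendToStorage(name, data) {
    // JSON.parse(null) yields null, so fall back to an empty array
    var old = JSON.parse(localStorage.getItem(name)) || [];
    localStorage.setItem(name, JSON.stringify(old.concat(data)));
}

appendToStorage('ObjAry', ["IrGszUBa0F", "l366vn6mPa"]);
appendToStorage('ObjAry', ["ZKHtnHoHgH", "rtbI1sDfPm"]);
console.log(localStorage.getItem('ObjAry'));
// ["IrGszUBa0F","l366vn6mPa","ZKHtnHoHgH","rtbI1sDfPm"]
```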

JSON.stringify() only allows one value parameter. How do I add more parameters to be stringified under one brace?

http.get(options, function (res) {
    fs.appendFile('log.txt', JSON.stringify(res.headers, null, 4));
});
I have a question regarding the JSON.stringify() function.
I've learned that simply using the res.headers does not in fact output to JSON format.
At the moment, I am restricted to only being able to use one res.xxxxx method within JSON.stringify(). The piece of code in question is pasted above. How am I able to use more than one value? At the moment, I can only put in res.headers into the value parameter. I would also like to use res.statusCode and my own objects, all stringified under one brace {}.
The signature of JSON.stringify is as follows: JSON.stringify(value[, replacer[, space]]);
You need to create a new js object and put res.headers into it.
var obj = {};
obj.headers = res.headers;
obj.somethingelse = somethingelse;
var string = JSON.stringify(obj);
JSON is always a single value, so the output of JSON.stringify can only ever be a single value. It makes sense for the input to be a single value too. This is like asking why a function can't return two things: you can make it return a composite value, but you're still returning a single (composite) value. The solution here is the same: compose your input.
var reply = {
    code: res.statusCode,
    headers: parse_http_headers(res.headers),
    etc: /* etc */
};
log(JSON.stringify(reply));
Note that you must write parse_http_headers yourself.
You could always add the extra things you want to the headers object...
res.headers.statusCode = res.statusCode;
JSON.stringify(res.headers, null, 4);
I don't know if there are any bad side effects if you mutate the res object in node. You might want to consider creating a shallow copy of the headers object if you are worried about that.
You could also stringify more than just the headers part of your object:
JSON.stringify(res, …)
If you want to stringify only certain parts of your object, you can
filter them with the replacer function,
delete everything else before,
or build a new object to be stringified:
JSON.stringify({
    heads: res.headers,
    …
}, …)
If you'd like to flatten several objects, you can use this function.
function merge() {
    var out = {};
    var insert = function (e) {
        for (var i in e) {
            if (Object.prototype.hasOwnProperty.call(e, i))
                out[i] = e[i];
        }
    };
    for (var i = 0; i < arguments.length; i++) {
        insert(arguments[i]);
    }
    return out;
}
var a = { 'a': 'aa' };
var b = { 'b': 'bb', 'bbb': 'bbbb' };
var c = { 'c': 'cc' };
var combined = merge(a, b, c); // {"a":"aa","b":"bb","bbb":"bbbb","c":"cc"}
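In modern JavaScript the same flattening is available as the built-in Object.assign (later objects win on duplicate keys):

```javascript
var a = { a: 'aa' };
var b = { b: 'bb', bbb: 'bbbb' };
var c = { c: 'cc' };

// Copy all own enumerable properties of a, b and c into a fresh object
var combined = Object.assign({}, a, b, c);
console.log(JSON.stringify(combined)); // {"a":"aa","b":"bb","bbb":"bbbb","c":"cc"}
```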

Checking for duplicate Javascript objects

TL;DR version: I want to avoid adding duplicate Javascript objects to an array of similar objects, some of which might be really big. What's the best approach?
I have an application where I'm loading large amounts of JSON data into a Javascript data structure. While it's a bit more complex than this, assume that I'm loading JSON into an array of Javascript objects from a server through a series of AJAX requests, something like:
var myObjects = [];
function processObject(o) {
    myObjects.push(o);
}
for (var x = 0; x < 1000; x++) {
    $.getJSON('/new_object.json', processObject);
}
To complicate matters, the JSON:
is in an unknown schema
is of arbitrary length (probably not enormous, but could be in the 100-200 kb range)
might contain duplicates across different requests
My initial thought is to have an additional object to store a hash of each object (via JSON.stringify?) and check against it on each load, like this:
var myHashMap = {};
function processObject(o) {
    var hash = JSON.stringify(o);
    // is it in the hashmap?
    if (!(myHashMap[hash])) {
        myObjects.push(o);
        // set the hashmap key for future checks
        myHashMap[hash] = true;
    }
    // else ignore this object
}
but I'm worried about having property names in myHashMap that might be 200 kb in length. So my questions are:
Is there a better approach for this problem than the hashmap idea?
If not, is there a better way to make a hash function for a JSON object of arbitrary length and schema than JSON.stringify?
What are the possible issues with super-long property names in an object?
I'd suggest you create an MD5 hash of the JSON.stringify(o) and store that in your hashmap with a reference to your stored object as the data for the hash. And to make sure that there are no object key order differences in the JSON.stringify(), you have to create a copy of the object that orders the keys.
Then, when each new object comes in, you check it against the hash map. If you find a match in the hash map, then you compare the incoming object with the actual object that you've stored to see if they are truly duplicates (since there can be MD5 hash collisions). That way, you have a manageable hash table (with only MD5 hashes in it).
Here's code to create a canonical string representation of an object (including nested objects or objects within arrays) that handles object keys that might be in a different order if you just called JSON.stringify().
// Code to do a canonical JSON.stringify() that puts object properties
// in a consistent order
// Does not allow circular references (child containing reference to parent)
JSON.stringifyCanonical = function(obj) {
    // compatible with either browser or node.js
    var Set = typeof window === "object" ? window.Set : global.Set;
    // poor man's Set polyfill
    if (typeof Set !== "function") {
        Set = function(s) {
            if (s) {
                this.data = s.data.slice();
            } else {
                this.data = [];
            }
        };
        Set.prototype = {
            add: function(item) {
                this.data.push(item);
            },
            has: function(item) {
                return this.data.indexOf(item) !== -1;
            }
        };
    }

    function orderKeys(obj, parents) {
        if (typeof obj !== "object") {
            throw new Error("orderKeys() expects object type");
        }
        var set = new Set(parents);
        if (set.has(obj)) {
            throw new Error("circular object in stringifyCanonical()");
        }
        set.add(obj);
        var tempObj, item, i;
        if (Array.isArray(obj)) {
            // no need to re-order an array
            // but need to check it for embedded objects that need to be ordered
            tempObj = [];
            for (i = 0; i < obj.length; i++) {
                item = obj[i];
                if (typeof item === "object") {
                    tempObj[i] = orderKeys(item, set);
                } else {
                    tempObj[i] = item;
                }
            }
        } else {
            tempObj = {};
            // get keys, sort them and build new object
            Object.keys(obj).sort().forEach(function(item) {
                if (typeof obj[item] === "object") {
                    tempObj[item] = orderKeys(obj[item], set);
                } else {
                    tempObj[item] = obj[item];
                }
            });
        }
        return tempObj;
    }

    return JSON.stringify(orderKeys(obj));
};
And the algorithm:
var myHashMap = {};
function processObject(o) {
    var stringifiedCandidate = JSON.stringifyCanonical(o);
    var hash = CreateMD5(stringifiedCandidate);
    var list = [], found = false;
    // is it in the hashmap?
    if (!myHashMap[hash]) {
        // not in the hash table, so it's a unique object
        myObjects.push(o);
        list.push(myObjects.length - 1); // put a reference to the object with this hash value in the list
        myHashMap[hash] = list; // store the list in the hash table for future comparisons
    } else {
        // the hash does exist in the hash table, check for an exact object match to see if it's really a duplicate
        list = myHashMap[hash]; // get the list of other object indexes with this hash value
        // loop through the list
        for (var i = 0; i < list.length; i++) {
            if (stringifiedCandidate === JSON.stringifyCanonical(myObjects[list[i]])) {
                found = true; // found an exact object match
                break;
            }
        }
        // if not found, it's not an exact duplicate, even though there was a hash match
        if (!found) {
            myObjects.push(o);
            myHashMap[hash].push(myObjects.length - 1);
        }
    }
}
Test case for JSON.stringifyCanonical() is here: https://jsfiddle.net/jfriend00/zfrtpqcL/
Maybe. For example, if you know what kinds of objects come in, you could write a better indexing and searching system than JS object keys. But you could only do that in JavaScript, while object keys are implemented in C...
Must your hashing be lossless or not? If it can be lossy, try a lossy hash such as MD5. I'd guess you will lose some speed and gain some memory. By the way, does JSON.stringify(o) guarantee the same key ordering? Because {foo: 1, bar: 2} and {bar: 2, foo: 1} are equal as objects, but not as strings.
Super-long property names mostly cost memory.
One possible optimization:
Instead of using getJSON, use $.get and pass "text" as the dataType param. Then you can use the result as your hash and convert it to an object afterwards.
Actually, while writing that last sentence I thought of another solution:
Collect all results with $.get into an array
Sort it with the built-in (C speed) Array.sort
Now you can easily spot and remove duplicates with a single for loop
Again: different JSON strings can produce the same JavaScript object.
