Mutable arrays as values in ES6 Map - javascript

Does it make sense to use an array (which may be frequently push'ed and splice'd) as the value in an ES6 Map, or should I stick to plain objects used as associative arrays?
E.g. something like the following Node server code, which would track which users have already downloaded a specific file:
// initialisation
var map = new Map();
map.set("foo.png", []);

// push a new user id onto the array associated with a filename
function userHasFile(filename, userId) {
    // retrieve the array of all userIds that have downloaded the file
    var users = map.get(filename);
    // delete the map entry since we are going to update it
    map.delete(filename);
    // push the new userId onto the array of users that have downloaded the file
    users.push(userId);
    // re-add the key with the updated array
    map.set(filename, users);
}

// example usage
userHasFile("foo.png", "user1");
userHasFile("foo.png", "user2");
I'm new to ES6 and the simplicity of the Map API appeals to me, so I'm wondering if there are any benefits/downsides (such as performance, memory, GC, etc.) to this approach over using standard associative arrays.

Absolutely.
Having a key-list store is a useful enough pattern that it's one of the data types in Redis.
Keeping an array in a Map is a totally legitimate and useful thing to do. It's a great way to store a list of objects that match some criterion, and it's especially efficient in JavaScript (and other languages that use references): the map doesn't hold a copy of the array, just a reference to it, so you can modify the array from outside the map and the entry reflects the change without any re-insertion or resizing.
However, you should be careful to use more functional patterns where they make sense. If you have incoming items, you may want to stream them into a map/reduce (using reduce to group them by key) rather than keeping a map and mutating the underlying arrays, since that approach relies on references and side effects.
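For instance, a minimal sketch of that grouping, assuming an incoming array of {key, value} items (both the array and its shape are made up for illustration):
// group incoming items into a Map of arrays with a single reduce pass
const grouped = incoming.reduce((acc, item) => {
    const list = acc.get(item.key) || [];
    return acc.set(item.key, list.concat(item.value)); // Map#set returns the map
}, new Map());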
If you want to prevent the arrays from being updated at some point, you can replace them with immutable arrays, using a library like immutable.js.
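As a rough sketch of that idea (assuming you freeze the list once updates should stop):
const { List } = require('immutable');

// List() makes an immutable copy of the plain array
const frozen = List(map.get('foo.png'));
// push() on an Immutable.List returns a new List; frozen itself never changes
const extended = frozen.push('user3');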
If you're working on a server, say, with incoming requests that you need to record but whose arrival times you don't know (the kind of cache-and-flush that statsd does), a map makes sense:
let responses = new Map();

server.on('request', function (url, responseTime) {
    if (!responses.has(url)) {
        responses.set(url, []);
    }
    const times = responses.get(url);
    times.push(responseTime);
});

const flushTimer = setInterval(function () {
    stats.write(responses);
    responses = new Map();
}, 10 * 1000);

Related

Avoiding duplication of key/data

I have a design annoyance with some existing code in JS. The code is working, so I'm in no desperate hurry to change it, but the duplication shown below does annoy me. What is the usual/recommended/official way of avoiding this situation?
The actual system is a large/complex financial system, so I have simplified it to the most basic example which demonstrates the problem:
var colours = {
    red:   {id: "red",   vals: [1, 0, 0]},
    green: {id: "green", vals: [0, 1, 0]},
    grey:  {id: "grey",  vals: [0.5, 0.5, 0.5]}
    // ...etc
};
// id needs to be known internally within the object - thus it is defined as a property.
// e.g:
colour.prototype.identify(console.log(this.id));
// id also needs to be used externally to find an object quickly.
// e.g:
function getcolour(s){return colours[s];}
// Although this works, it does mean duplicating data, with the theoretical possibility of a mismatch:
var colours = {
    // ...
    blue: {id: "green", // oh dear...
How would this normally be handled by the experts?
This question is somewhat subjective.
When creating my applications I typically try to do the following:
- never define the same data in multiple places; the source should always be unambiguous
- if I need to create any indices for faster/easier access, I use utility methods to do it; those methods should be properly unit-tested, so that I can be confident they do the right thing
- use third-party libraries as much as possible (such as the already-suggested lodash or underscore) to minimize the amount of code to be written/maintained
If your algorithms and utilities are properly unit-tested, you should not worry (too much) about getting the data into an inconsistent state. However, if these are critically important systems/interfaces, you may add some validation on output. And it is generally good practice to have data validation and marshalling on input.
An explanation of the utility methods: if you have a data array, say
var data = [{"id": "i_1", ...}, {"id": "i_2", ...}, {"id": "i_3", ...}];
and you then have to create an index out of it, or create more data sets based on the original array, you create yourself a library of utility methods that do the modification on the array, create derivative data sets, or iterate over the array and create a resulting item on the fly. For example:
var createIndex = function (arr) {
    // convert the data array (with the expected structure) to an object
    // keyed by id:
    // {
    //   i_1: {"id": "i_1", ...},
    //   i_2: {"id": "i_2", ...},
    //   i_3: {"id": "i_3", ...}
    // }
    var newObj = {};
    arr.forEach(function (item) {
        newObj[item.id] = item;
    });
    return newObj;
};
This method creates a hash map for accessing your data, which is faster than iterating over the original array every time. And this method is easy to unit-test, so you can be sure that when you use it on the source data to get your intended dataset, there will be no inconsistency.
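For instance, a minimal sanity check, as a sketch using Node's built-in assert module (the sample data here is made up):
var assert = require('assert');

var data = [{id: 'i_1', vals: [1]}, {id: 'i_2', vals: [2]}];
var index = createIndex(data);

// the index should be keyed by id and preserve each item
assert.deepEqual(Object.keys(index), ['i_1', 'i_2']);
assert.strictEqual(index.i_1.vals[0], 1);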
I wouldn't replace the direct colours[key] access with another method just to avoid the duplication.
Any other approach will involve extra processing, and you have mentioned that you have a large amount of data.
I assume the duplication you want to avoid is in the incoming data, where it is a waste.
One example of trading processing for network traffic would be to skip sending the id and instead walk the map object, setting each id dynamically from its key:
colours[key].id = key
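A hedged sketch of that pass over the whole object (the loop shape is my assumption):
// stamp each entry's id from its key after the data arrives
Object.keys(colours).forEach(function (key) {
    colours[key].id = key;
});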
You can filter your object by converting it to an array of objects and then filtering out duplicate values. Converting it to an array also lets you perform a lot of operations more quickly and easily.
So you can map your object to an array (note that plain objects have no .map method, so we go through Object.keys):
var coloursArray = Object.keys(colours).map(function (key) {
    return colours[key];
});
Remove duplicates:
function removeDuplicates() {
    return coloursArray.filter((obj, pos, arr) => {
        return arr.map(mapObj => mapObj.id).indexOf(obj.id) === pos;
    });
}
You can also remove duplicates from an array using, for example, underscore.js and its .uniq method:
var uniqueColoursArray = _.uniq(coloursArray, function (c) { return c.id; });
Moreover, this function is pretty useless because you can access the element directly:
function getcolour(s) { return colours[s]; }
Calling colours[s] is also shorter than getcolour(s). Your function would only make sense if you also passed in the colours object, for cases where it is not accessible from some other scope.
Finally, I can't understand why you pass a console.log call as a parameter here:
colour.prototype.identify(console.log(this.id));
Maybe you meant to pass just this.id.
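Presumably a method definition was intended rather than a call; a hedged guess at the intent (assuming colour is a constructor function):
// define identify as a method that logs the instance's own id
colour.prototype.identify = function () {
    console.log(this.id);
};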

Avoid sorting using immutable.js

To me Immutable.js removes a lot of headaches and it's a great library, but now I'm facing a problem: my original object comes from the server, but when I use any of its functions like fromJS({myObj}), it works but saves a copy sorted "a-z". I'm building something that needs the original structure, to keep the components in the order they come from the server. Does anyone have an idea?
fromJS translates your objects into Lists and Maps by default. The former is ordered but not keyed, while the latter is keyed but not ordered, so neither fits your use case.
What you're looking for is an OrderedMap, which is a Map with an additional insertion-order guarantee:
import { OrderedMap } from 'immutable';
const orderedMap = OrderedMap({key: "value"});
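A quick sketch of that guarantee (the key names here are made up):
import { OrderedMap } from 'immutable';

const om = OrderedMap({zebra: 1, apple: 2});
console.log(om.keySeq().toArray()); // ["zebra", "apple"]: insertion order, not alphabetical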
You can achieve this while still using fromJS: it has a second parameter called reviver, which can also be used to produce OrderedMaps instead of standard Maps:
import Immutable from 'immutable';

const reviver = (key, value) =>
    Immutable.Iterable.isKeyed(value) ? value.toOrderedMap() : value.toList();

const data = Immutable.fromJS(js, reviver);
JavaScript core objects explicitly provide no guarantees about key order, and Immutable.Map (the expected result of your fromJS() call) simply follows suit.
If you want order, you should either specify the order as another property on each item, or, more conventionally, create an Immutable.List from an Array.
In other words, this sounds like a square peg/round hole problem. Make sure you're using the right data structure for your task.

Updating objects in List in ImmutableJS

I am a little confused by the functionality of ImmutableJS when working with an array of objects. The following example shows that even though the List x is immutable, I can still modify properties of objects inside the list both with and without using Immutable List's update() function.
My question is, why would I use Immutable if I can still modify the contents of my objects? I expected this module to protect me from that. I realize that I will not be able to add or remove entire objects to/from the list, but that doesn't fully protect me from modifying the list, which when working with a list in React state, I do not want to be able to do.
The other interesting thing I noticed is that when I directly modify the name after first performing the update, x.get(0).name and y.get(0).name are both changed. I thought that the resulting list from update() would not contain references to the same objects in the list.
How and why is ImmutableJS really helping me in this case?
var x = Immutable.List.of({name: 'foo'});
console.log(x.get(0).name);

var y = x.update(0, (element) => {
    element.name = 'bar';
    return element;
});

console.log(x.get(0).name);
console.log(y.get(0).name);

x.get(0).name = 'baz';
console.log(x.get(0).name);
console.log(y.get(0).name);
Output:
foo
bar
bar
baz
baz
https://jsfiddle.net/shotolab/rwh116uw/1/
Example of @SpiderPig's suggestion of using a Map:
var x = Immutable.List.of(new Immutable.Map({name: 'foo'}));
console.log(x.get(0).get('name'));

var y = x.update(0, (element) => {
    return element.set('name', 'bar');
});

console.log(x.get(0).get('name'));
console.log(y.get(0).get('name'));
Output:
foo
foo
bar
While the last example shows what I was trying to accomplish, ultimately I don't know if I will end up using Map or List, or even ImmutableJS at all. What I don't like is the alternate API (especially for a mapped object). I am afraid that when I hand my project off to another developer, or as others join the team, using these immutable objects and lists correctly will completely fall apart without proper governance.
Maybe this is more of a commentary on React, but if React intends for state to be immutable yet doesn't enforce it, it seems to me that this will end up a mess in a project that is moving quickly with multiple developers. I was trying my best not to mutate the state, but forgetting that modifying an object inside a list/array counts as mutation is a very easy mistake to make.
Immutable.js does not provide true immutability in the sense of preventing you from modifying objects directly; it just provides an API that helps you maintain immutable state.
The update function should return a completely new version of the indexed object:
var y = x.update(0, (element) => {
    return {name: "bar"};
});
But doing something like this is a big no-no: x.get(0).name = 'baz';
Here is a much better explanation of the whole thing than I could ever write:
https://github.com/facebook/immutable-js/issues/481
The point of immutable.js is to allow re-use of the objects that are not modified, which consumes less memory and gives good practical performance.
There is also the library "seamless-immutable", which freezes objects so that they cannot be modified, though this comes with some performance penalty in JavaScript: https://github.com/rtfeldman/seamless-immutable
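For comparison, plain Object.freeze gives a similar, shallow guarantee without any library; a minimal sketch (this is not seamless-immutable's API):
var item = Object.freeze({name: 'foo'});
item.name = 'bar'; // silently ignored (throws a TypeError in strict mode)
console.log(item.name); // "foo"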

Finding all keys that map to a value in JavaScript, and efficient alternatives

I'm working on an application in javascript where each user is in a room. No two users share the same name and no two rooms share the same name. Currently I have it set up like this:
var userroommap = {
    username:  "room",
    username2: "room",
    username3: "room2"
};
getting the room a user is in is as simple as
userroommap["user"]
but in order to get all users which are present in a room I would have to iterate over the entire userroommap like so:
for (var x in userroommap) {
    if (userroommap[x] == "room") {
        // user x is present in room
    }
}
In my application I must know which users are in which rooms very often so I am considering using another object to hold all users in a room, something like:
var roomusermap = {
    room:  ["username", "username2"],
    room2: ["username3"]
};
Adding users to a room is trivial because all you have to do is append to an array; however, removing a username from a room requires iterating over the array, which is a more costly operation. This is already a decent solution to my problem, but I became curious whether there is a better one. So: is there a better way to (i) store the roomusermap, perhaps without arrays, or, alternatively, (ii) find all users in a room?
The data structure described in the previous answer is called a BiMap.
A BiMap ideally provides the same performance for value-to-keys lookups as for key-to-values lookups. It is typically implemented by internally managing two separate maps (one with the forward mapping {key: values} and one with the reverse mapping {value: keys}).
Here's an existing implementation to use if you're not rolling your own: https://www.npmjs.com/package/bimap
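If you do roll your own, a minimal sketch for this one-room-per-user case could look like the following (the names and shape are my assumptions, not the bimap package's API):
// two internal maps kept in sync: user -> room and room -> set of users
function RoomBiMap() {
    this.roomByUser = {};
    this.usersByRoom = {};
}

RoomBiMap.prototype.add = function (user, room) {
    this.remove(user); // a user can only be in one room
    this.roomByUser[user] = room;
    (this.usersByRoom[room] = this.usersByRoom[room] || {})[user] = true;
};

RoomBiMap.prototype.remove = function (user) {
    var room = this.roomByUser[user];
    if (room !== undefined) {
        delete this.roomByUser[user];
        delete this.usersByRoom[room][user];
    }
};

RoomBiMap.prototype.usersIn = function (room) {
    return Object.keys(this.usersByRoom[room] || {});
};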
Unless you've identified a genuine, real-world performance problem, I'd stick with the simple solution.
That said, a few thoughts for you:
All modern JavaScript engines give you the Object.keys function, which returns an array of an object's own enumerable properties. This may be more efficient than your for-in loop for two reasons:
- it happens within the engine's code, which lets the engine optimize
- for-in also looks for enumerable properties in prototype objects, whereas Object.keys knows it's only supposed to look at that specific object
Your roomusermap can contain maps per room; it doesn't need to use arrays.
var roomusermap = {
    room: {
        username:  user,
        username2: user2
    },
    room2: {
        username3: user3
    }
};
Adding a user to a room becomes:
userroommap[username] = roomname;
roomusermap[roomname][username] = user;
Removing a user is:
delete userroommap[username];
delete roomusermap[roomname][username];
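And listing all users in a room becomes a simple key listing; as a sketch:
var usersInRoom = Object.keys(roomusermap[roomname]);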
If you're seeing performance problems with those map objects, something to keep in mind is that removing a property from an object (delete) puts the object into "dictionary mode" on several JavaScript engines (having previously been in a more optimized state), significantly impacting the time required to look up properties on that object.
So in the very hypothetical case where the property lookup performance starts to be an issue, you could consider storing undefined rather than deleting the property. E.g., instead of:
delete userroommap[username];
delete roomusermap[roomname][username];
you'd do
userroommap[username] = undefined;
roomusermap[roomname][username] = undefined;
However, you'd have to adjust your checks for whether a user is in a room, and you couldn't use Object.keys (on its own) to get the list anymore, since you'd have to weed out the properties with the value undefined. You could use Object.keys with filter:
var map = roomusermap[roomname];
var users = Object.keys(map).filter(function (username) {
    return map[username] !== undefined;
});
So you'd really want to do that only if you've identified a genuine problem caused by objects going into dictionary mode.

Javascript: how to un-jsonify efficiently/correctly

To be honest, I'm not quite sure where to start with this question.
I'll describe the situation: I am in the process of making a level editor for an HTML5 game. The level editor is already functional - now I would like to save/load levels created with this editor.
Since this is all being done in Javascript (the level editor as well as the game), I was thinking of having the save simply convert the level to a JSON and the load, well... un-jsonify it.
The problem is that the level contains several types of objects (several different types of entities, several types of animation objects, etc.). Right now, every time I want to add an object to the game, I have to write an unjsonify method specifically for that object and then modify the level object's unjsonify method so it can handle the newly defined type of object.
I can't simply use JSON.parse because that just returns an object with the same keys and values as the original had, but not actually an object of the original class/prototype. My question, then, is: is there a correct way to do this that doesn't require continuously modifying the code every time I add a new type of object to the game?
I would create serialise/deserialise methods on each of your objects to put their state into JSON objects and recover it from them. Compound objects would recursively serialise/deserialise their children. To give an example:
function Player() {
    this.weapon = new Weapon();
}

Player.prototype.serialise = function () {
    return {type: 'Player', weapon: this.weapon.serialise()};
};

Player.deserialise = function (json_object) {
    var player = new Player();
    player.weapon = Weapon.deserialise(json_object.weapon);
    return player;
};
Obviously, in real code you would have checks to make sure you were getting the types of objects that you expect. Arrays and plain hash objects can simply be copied during serialisation/deserialisation, though their children will often need to be recursed over.
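To avoid editing a central unjsonify method every time a class is added, one option is a type registry that each class registers itself in; a hedged sketch (the names registry, registerType, and deserialise are made up for illustration):
// map 'type' strings to constructors so dispatch is data-driven
var registry = {};

function registerType(name, ctor) {
    registry[name] = ctor;
}

function deserialise(json_object) {
    var ctor = registry[json_object.type];
    if (!ctor) throw new Error('Unknown type: ' + json_object.type);
    return ctor.deserialise(json_object);
}

// each new class only has to register itself once
registerType('Player', Player);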

Categories

Resources