I'm dealing with an array of "events" where the key of the array is the Unix timestamp of the event. In other words, let's assume we have the following array of event objects in JS:
var MyEventsArray=[];
MyEventsArray[1513957775]={lat:40.671978333333, lng:14.778661666667, eventcode:46};
MyEventsArray[1513957845]={lat:40.674568332333, lng:14.568661645667, eventcode:23};
MyEventsArray[1513957932]={lat:41.674568332333, lng:13.568661645667, eventcode:133};
and so on, for thousands of rows...
The data are sent along with an Ajax call, encoded in JSON, and processed with JS. When the data set is received, I have another Unix timestamp, let's say 1513957845, coming from another source, and I want to find the event that happened at that time. That's quite easy: I just take the element of the array with the given index (the second in the list above).
Now the question: imagine that the given index is not found (say we are looking for UXTimestamp=1513957855), because that index never existed in the array, and I want to take the closest index instead (in the example above I would take the element MyEventsArray[1513957845], as its index 1513957845 is the closest to 1513957855). What can I do to obtain this result?
My difficulty is in handling the array indexes: when I receive the array, I don't know where the indexes begin.
How will the machine handle a situation like that?
Will it allocate (and waste) memory for dummy/empty elements between the rows, or does the engine have some way to build its own index and optimize the space? In other words: is it safe to play with indexes as we're doing, or is it better to allocate the array as:
var MyEventsArray=[];
MyEventsArray['1513957775']={lat:40.671978333333, lng:14.778661666667, eventcode:46};
MyEventsArray['1513957845']={lat:40.674568332333, lng:14.568661645667, eventcode:23};
MyEventsArray['1513957932']={lat:41.674568332333, lng:13.568661645667, eventcode:133};
and so on, for thousands of rows...
In this case the key and the index are clearly different, so here it's possible to get the first element with MyArray[0] even though we don't know the key value. Is this approach more expensive in terms of memory (here we must store both index and key), or are the effects the same for the engine?
There is no difference between MyEventsArray[1513957775] and MyEventsArray['1513957775']. Deep down, array indexes are just property names, and property names are strings.
Regarding the question of whether these sparse indexes will lead to millions of empty cells being allocated: no, that won't happen. Sparse arrays store only what you put in them, not the empty space in between.
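To illustrate both points (a minimal sketch, not from the question itself):
var a = [];
a[1513957775] = "x";
console.log(a[1513957775] === a['1513957775']); // true: the number is converted to the same string key
console.log(Object.keys(a)); // ["1513957775"]: only the key you actually set is stored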
If you want to find a key quickly, you can obtain an array of the keys, sort them, and then find the one you want:
var MyEventsArray=[];
MyEventsArray[1513957775]={lat:40.671978333333, lng:14.778661666667, eventcode:46};
MyEventsArray[1513957845]={lat:40.674568332333, lng:14.568661645667, eventcode:23};
MyEventsArray[1513957932]={lat:41.674568332333, lng:13.568661645667, eventcode:133};
var target = 1513957855;
var closest = Object.keys(MyEventsArray)
    .map(k => ({ k, delta: Math.abs(target - k) })) // distance of each key from the target
    .sort((a, b) => a.delta - b.delta)[0].k;        // smallest delta first
console.log(closest);
You could use Array#some, which allows you to exit the iteration as soon as the delta starts getting greater than the previous delta (the keys of a sparse array are visited in ascending numeric order, so the deltas shrink until the closest key and grow afterwards).
var array = [];
array[1513957775] = { lat: 40.671978333333, lng: 14.778661666667, eventcode: 46 };
array[1513957845] = { lat: 40.674568332333, lng: 14.568661645667, eventcode: 23 };
array[1513957932] = { lat: 41.674568332333, lng: 13.568661645667, eventcode: 133 };
var key = 0,
    search = 1513957855;

Object.keys(array).some(function (k) {
    if (Math.abs(k - search) > Math.abs(key - search)) {
        return true; // the delta started growing again, so `key` already holds the closest match
    }
    key = k;
});
console.log(key);
You can use Object.keys(MyEventsArray) to get an array of the keys (which are, perhaps surprisingly, expressed as strings); you can then iterate through that and find the closest match.
var MyEventsArray=[];
MyEventsArray[1513957775]={lat:40.671978333333, lng:14.778661666667, eventcode:46};
MyEventsArray[1513957845]={lat:40.674568332333, lng:14.568661645667, eventcode:23};
MyEventsArray[1513957932]={lat:41.674568332333, lng:13.568661645667, eventcode:133};
Object.keys(MyEventsArray)
// ["1513957775", "1513957845", "1513957932"]
Reference: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array
I have a project that uses arrays of objects that I'm thinking of moving to ES6 Sets or Maps.
I need to quickly get a random item from them (obviously trivial for my current arrays). How would I do this?
Maps and Sets are not well suited for random access. They are ordered and their size is known, but they are not indexed for access by position. To get the Nth item in a Map or Set, you have to iterate through it until you reach that item.
The simple way to get a random item from a Set or Map would be to get the entire list of keys/items and then select a random one.
// get a random item from a Set
function getRandomItem(set) {
    let items = Array.from(set);
    return items[Math.floor(Math.random() * items.length)];
}
You could make a version that would work with both a Set and a Map like this:
// returns a random key from a Set or Map
function getRandomKey(collection) {
    let keys = Array.from(collection.keys());
    return keys[Math.floor(Math.random() * keys.length)];
}
This is obviously not something that would perform well with a large Set or Map since it has to iterate all the keys and build a temporary array in order to select a random one.
Since both a Map and a Set have a known size, you could also select the random index based purely on the .size property and then iterate through the Map or Set until you reach the desired Nth item. For large collections that avoids creating the temporary array of keys, at the expense of a little more code; on average, though, it still visits size/2 items of the collection.
// returns a random key from a Set or Map
function getRandomKey(collection) {
    let index = Math.floor(Math.random() * collection.size);
    let cntr = 0;
    for (let key of collection.keys()) {
        if (cntr++ === index) {
            return key;
        }
    }
}
There's a short, neat ES6+ version of the answer above for Maps:
const getRandomItem = iterable => iterable.get([...iterable.keys()][Math.floor(Math.random() * iterable.size)])
Note that this one works only for Maps: a Set has no get() method. For Sets (where keys() is an alias for the values() method), use the version below.
This is the short answer for Sets:
const getRandomItem = set => [...set][Math.floor(Math.random()*set.size)]
To be able to use setValues() instead of setValue() on a high number of rows, I would like to know how I can convert all my arrays into arrays of the same length.
During a map function, I create one giant array that looks like this:
const myArray = [['Monday', 'Tuesday'], ['Monday', 'Tuesday', 'Wednesday'], ['Friday'], ['Monday', 'Friday'], ['Tuesday', 'Wednesday', 'Friday']] // And so on.
At the moment I use setValue() for each item of the array. The next step would simply be to use setValues() and append a whole array, but the problem is that the inner arrays are all of different lengths.
result.forEach(function (_, index) {
    currentSheet.getRange(1 + index, 5).setValue(result[index]);
});
That is going to run for 600 lines, and I will do it several times with other functions. I can live with it, but it seems like a waste. Is there a way to make the arrays homogeneous (all arrays would have a length of 3, with one or two elements empty, for example) and use one single setValues() instead?
EDIT: the original map function that creates the first array was requested. Here it is:
(Basically, it maps over a first array, looks at the first element, finds all the rows in the source that have the same key, and returns the elements at indexes 9 and 10 of those rows.)
const result = UniquesDatas.map(uniqueRow =>
SourceDatas
.filter(row => row[0] === uniqueRow[0])
.map(row => [row[9]+" "+row[10]])
);
Thank you in advance,
Cedric
You can concat an array of empty elements of the remaining length.
const myArray = [['Monday', 'Tuesday'],['Monday','Tuesday','Wednesday'],['Friday'],['Monday','Friday'],['Tuesday','Wednesday','Friday']]
const res = myArray.map(x => x.concat(Array(3 - x.length)));
console.log(res);
As suggested by Kinglish, to get the length of the longest inner array, use Math.max(...myArray.map(x => x.length)), which will work for the more general case.
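Putting both together, here is a hedged sketch of a single setValues() call (currentSheet and the target column are taken from the question's snippet; padding uses empty strings because setValues() expects every cell to hold a value):
const maxLen = Math.max(...myArray.map(x => x.length));
// pad every inner array to the same length with empty strings
const padded = myArray.map(x => x.concat(Array(maxLen - x.length).fill('')));
// one call instead of one setValue() per row
currentSheet.getRange(1, 5, padded.length, maxLen).setValues(padded);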
I have a map of id => value that I want to sort by value, but no matter what I do, it always gets sorted by id.
Basically, I have a sorted map on the server side that I send to JavaScript via JSON.
{"3":"Apple","2":"Banana","1":"Orange"}
After de-serialization I get
{
    1: "Orange",
    2: "Banana",
    3: "Apple"
}
And no matter what I try, it seems to stay in this order. Is it possible in JavaScript to force a non-ascending sort order with integer keys?
var json = '{"3":"Apple", "2":"Banana", "1":"Orange"}';
var data = $.parseJSON(json);
for (var ix in data) {
    console.log(ix + ": " + data[ix]);
}
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
You should not rely on object key order: integer-like keys are always iterated in ascending numeric order, regardless of insertion order.
I personally would recommend you to either use a Map, or to build an Array instead.
Below is an example that builds an array from your source; for simplicity, I've added a key property to make the sorting easier.
Note: I'm using Array.from to build the array, taking the length from the number of keys of the parsed object and using the callback to initialize each object inline.
var json = '{"3":"Apple", "2":"Banana", "1":"Orange"}';
// Parse the json string.
const parsed = JSON.parse(json);
// Acquire the keys length
const length = Object.keys(parsed).length;
// Build an array of objects ordered in the same way it came.
const result = Array.from({length}, (_, i) => ({key: length - i, [length - i]: parsed[length - i]}));
// Log a copy of the result.
console.log(JSON.parse(JSON.stringify(result)));
// Sort ascending:
result.sort((a,b) => a.key - b.key);
// Log a copy of the sorted result.
console.log(JSON.parse(JSON.stringify(result)));
// Sort descending:
result.sort((a,b) => b.key - a.key);
// log the sorted array
console.log(result);
If you really want to rely on key order, you can (of course), but using an array is slightly cleverer and leaves no chance of ending up with something that is not ordered as expected, unless the sorting algorithm is wrong or fails for some reason (for example, if a key is undefined, null, or not numeric in the above case).
As a final note, I'm aware that the question is about sorting an object, but for the reasons above I think the correct answer is simply: don't use an object at all in this specific scenario.
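Since a Map was also suggested above, here is a minimal sketch (my own illustration) showing that a Map preserves insertion order even with integer keys:
const fruits = new Map([[3, 'Apple'], [2, 'Banana'], [1, 'Orange']]);
for (const [id, name] of fruits) {
    console.log(id, name); // logs 3 Apple, 2 Banana, 1 Orange: insertion order is kept
}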
Recently I had to make an array with values at large indexes (due to plugin constraints), e.g.:
var names = [];
names[100000] = "a";
names[150000] = "b";
// ... and so on, up to 5 large indexes
and in between all values are undefined: names[100001] // undefined
Now my doubt: the array has only 5 elements, but names.length reports 300001.
It's a large array, yet I am not iterating over it nor running any loop through it; I will get the values directly through their defined indexes.
So will this structure make any significant performance difference in the browser, or is it all right to use, as long as the number of values in the array is small, irrespective of the indexes, and no iteration is involved?
Thanks
The only thing that differentiates an array from a plain object is its length property and how it behaves (and a few array-specific methods, of course). The length value simply increases with certain operations, like setting a numeric property or pushing a new element. That's it, in a nutshell. The array doesn't actually contain 100000 elements when you set the property 100000 to a value; all that happens is that you set one property, and the value of length is adjusted accordingly.
So, no, it won't have a lot of impact on performance, unless somebody actually iterates through the array using for (let i = 0; i < arr.length; i++).
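A quick way to convince yourself (a minimal illustration using the indexes from the question):
var names = [];
names[100000] = "a";
names[150000] = "b";
console.log(names.length); // 150001: length reflects the highest index plus one
console.log(Object.keys(names)); // ["100000", "150000"]: only two properties are actually stored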
You can create an array with the length given by your plugin and work locally with an object, to limit the iterations. After all your processing has been applied, you copy the values to the array and send it to the plugin's function.
Keep an array with the desired length as a buffer:
var buffer = new Array(20000);
Internally, work with an object:
var data = {};
Assign values to the object:
data[10001] = "foo";
Once the transformations or data processing have been applied to the object, copy the data to the buffer:
for (var key in data) {
    buffer[key] = data[key];
}
Send the buffer to the plugin, and clear data if desired.
By doing so, you will not iterate over more than the actual data you processed.
I have a million objects, each with a unique numeric ID.
Each object (to keep it simple) contains a name.
The objects are added to an array, and I both add to and remove from this array.
In order to remove an object I have its ID; I then need to find its index in the array and splice it out.
In my case I have a lot of objects and can have a lot of remove operations, so if I have 1000 removes, and the IDs of all those objects are stored at the end of the array, I will run over all 1 million objects until I find them.
Storing the index in the object after adding is no good either, because on every remove I would need to update the indexes of all objects stored after the removed one.
For example, removing the first 1000 would require updating the indexes of the remaining 1M-1000 items.
My question is, what is the best solution for my problem?
-- UPDATE --
for example: my flat array looks like this after adding 1M objects:
[ obj1, obj2, obj3, .... obj1000000 ]
I now want to remove the object obj1000000. To find which index this object was inserted at, I need to run over the whole array (or until I find the item), comparing the current item's ID with obj1000000's ID, and break out of the loop when found. Then I remove the item by its index.
If I stored the index of each object in the object itself after adding it to the array, I would have to update the indexes of the rest of the objects after removing one.
For example: obj1 would contain the property index=0, obj2 would have index=1, and so on. To remove obj5 I just get its index property, which is 4, and remove it. But now obj6, which has index=5, is incorrect and should be 4, obj7 should be 5, and so on, so an update must be done.
My SmartArray holds a TypedArray created at some initial size, and I expand it if needed. When push is called, I simply set the value in the last slot, this._array[this.length++] = value; (checking, of course, whether the array needs expanding).
SmartArray.prototype.getArray = function () {
    return this._array.subarray(0, this.length);
};

SmartArray.prototype.splice = function (index, removeCount) {
    if (index + removeCount < this.length) {
        // shift the tail left, over the removed range
        var sub = this._array.subarray(index + removeCount, this.length);
        this._array.set(sub, index);
    } else {
        removeCount = this.length - index;
    }
    this.length -= removeCount;
};
It works very fast: subarray doesn't create a new array (it returns a view on the same buffer), and set is very fast as well.
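For completeness, here is a sketch of the push/expand step described above; the element type (Float64Array) and the doubling growth factor are my assumptions, not part of the original code:
SmartArray.prototype.push = function (value) {
    if (this.length === this._array.length) {
        // out of room: allocate a larger backing store and copy the old data over
        var bigger = new Float64Array(Math.max(1, this._array.length * 2));
        bigger.set(this._array);
        this._array = bigger;
    }
    this._array[this.length++] = value;
};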
The standard solutions for this problem are
balanced (binary) trees,
hash tables.
They take, respectively, O(log N) and O(1) operations per search/insertion/deletion on average.
Both can be implemented on top of an array. You will find numerous versions of them via a web search.
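In modern JavaScript, the built-in Map is a ready-made hash table for exactly this case; a minimal sketch (my own illustration, not part of the answer above):
// keep objects keyed by their unique ID for O(1) average insertion and removal
const objectsById = new Map();

function addObject(obj) {
    objectsById.set(obj.id, obj);
}

function removeById(id) {
    return objectsById.delete(id); // no index shifting needed; O(1) on average
}

addObject({ id: 1000000, name: "obj1000000" });
console.log(removeById(1000000)); // true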