I want to get the average of my Firebase data - javascript

I want to average the related values when the data in Firebase is updated.
I am using Firebase Functions and cannot load the data.
I can change the data I want when the event occurs, but I cannot calculate the average of the data.
exports.taverage = functions.database.ref('/User/tsetUser/monthQuit/{pushId}')
    .onCreate((snapshot, context) => {
      const promiseRoomUserList = admin.database().ref('/User/tsetUser/monthQuit/{pushId}').once('value');
      var sum = 0;
      const arrayTime = [];
      snapshot.forEach(snapshot => {
        arrayTime.push('/User/tsetUser/monthQuit/{pushId}'.val());
      })
      for (let i = 0; i < arrayTime.length; i++) {
        sum += arrayTime[i];
      }
      return admin.database().ref('/User/tsetUser/inform/standardQuit').set(sum);
    });
// I want the 'standardQuit' value to be set to the average.

I'm not sure why you can't calculate the average, but a simpler version of your code would be:
exports.taverage = functions.database.ref('/User/tsetUser/monthQuit/{pushId}')
    .onCreate((snapshot, context) => {
      // Read all children under monthQuit, not just the newly created one
      return admin.database().ref('/User/tsetUser/monthQuit').once('value')
        .then(function(snapshot) {
          let sum = 0;
          snapshot.forEach(child => {
            sum = sum + child.val();
          });
          let avg = sum / snapshot.numChildren();
          return admin.database().ref('/User/tsetUser/inform/standardQuit').set(avg);
        });
    });
The biggest differences:
This code returns promises from both the top level and the nested then(). This is needed so Cloud Functions knows when your code is done, and it can thus stop billing you (and potentially shut down the container).
We simply add the value of each child to the sum, since you weren't using the array in any other way. Note that child.val() depends on your data structure, which you didn't share, so if it fails there, you'll need to update how you get the exact value (or share your data structure with us).
The code actually calculates the average by dividing the sum by the number of child nodes.
Consider using a moving average
One thing to keep in mind is that you're now reading all nodes every time one node gets added. This operation will get more and more expensive as nodes are added. Consider whether you can use a moving average, which doesn't require reading all child nodes, but merely the current average and the new child node. The value will be an approximate average where more recent values typically have more weight, and it is much cheaper to calculate:
exports.taverage = functions.database.ref('/User/tsetUser/monthQuit/{pushId}')
    .onCreate((snapshot, context) => {
      return admin.database().ref('/User/tsetUser/inform/standardQuit').transaction(function(avg) {
        if (!avg) avg = 0;
        return (15.0 * avg + snapshot.val()) / 16.0;
      });
    });
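To get a feel for that weighting, here is a small standalone sketch (plain JavaScript, no Firebase, made-up sample values) that applies the same (15 * avg + value) / 16 update repeatedly:
// Illustration only: the exponential-moving-average update from the function above,
// applied to made-up sample values.
function updateAverage(avg, value) {
  if (!avg) avg = 0;
  return (15.0 * avg + value) / 16.0;
}

let avg = 0;
for (const value of [10, 12, 8, 11, 9, 10]) {
  avg = updateAverage(avg, value);
}
console.log(avg); // each new value carries 1/16 of the weight; older values decay geometrically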

Related

Iterate through collection items and sum a property of each document while being subscribed to a collection

I have a collection counters, and each counter document has a subcollection actions. Each action has a step property which is a Number.
Here's how I get (and subscribe to) all counters:
firestoreRef.onSnapshot(
  (countersSnapshot) => {
    const arrayOfCounters = [];
    if (!countersSnapshot.empty) {
      countersSnapshot.forEach((counterDocument) =>
        arrayOfCounters.push(counterDocument),
      );
    }
    setCounters(arrayOfCounters); // <- this is React.useState()[1]
  },
  (e) => console.error(e),
);
I used to have actions as an array in each counter, and I would get "currentValue" (which is the sum of all step values in actions) like this:
const getCurrentValue = (initialValue, actions, step) => {
  if (!actions.length) {
    return initialValue;
  }
  return actions.reduce((a, b) => {
    return b.type === TYPE_INCREMENT ? a + step : a - step;
  }, initialValue);
};
However, I decided to transform actions into a Firestore subcollection. How should I get all counters and their current value, while also being subscribed so that I get real-time updates?
Posting as a Community Wiki answer, based on the comments.
The examples here should give a good understanding and some ideas for solving the issue. For example, subscribing to each counter.ref.collection('actions') might be a solution; however, take into account that a separate query will be made for each document. This link should also help, as it covers a similar case and provides multiple alternatives, with code samples included, as solutions.
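As a rough sketch of that first idea (subscribing to each counter's actions subcollection), something along these lines could work. It assumes the v8-style Firestore API used in the question and a hypothetical setCounterValues state setter; it only shows the shape of the nested subscriptions, and a real implementation would also need to unsubscribe and avoid re-registering listeners when the outer snapshot fires again:
// Sketch only: for every counter, subscribe to its 'actions' subcollection
// and keep a running sum of the step values.
const unsubscribers = [];

firestoreRef.onSnapshot((countersSnapshot) => {
  countersSnapshot.forEach((counterDocument) => {
    const unsubscribe = counterDocument.ref
      .collection('actions')
      .onSnapshot((actionsSnapshot) => {
        let currentValue = 0;
        actionsSnapshot.forEach((actionDocument) => {
          currentValue += actionDocument.data().step;
        });
        // setCounterValues is a hypothetical useState setter keyed by counter id
        setCounterValues((prev) => ({ ...prev, [counterDocument.id]: currentValue }));
      });
    unsubscribers.push(unsubscribe); // call these on unmount / before re-subscribing
  });
});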

Vue computed property overwriting global state without vuex

I have a list of people who have scores. In state I have them listed in an array; one of the properties of each item in the array is 'scoreHistory', which is an array of objects containing their scores at different points in time. I want to filter this set for different time periods (e.g. the last 5 days or the last 30 days), so instead of just seeing the overall score I can see the scores as if everyone started at 0, say, 30 days ago.
I have it (kind of) working. See my code below:
filteredScores () {
  if(!this.people) {
    return
  }
  // Here I was trying to ensure there was a copy of the array created in order to not change the original array. I thought that might have been the problem.
  let allPeople = this.people.slice(0) // this.people comes from another computed property with a simple getter. Returns an array.
  let timeWindow = 30 //days
  const windowStart = moment().subtract(timeWindow, 'days').toDate()
  for (const p of allPeople) {
    let filteredScores = p.scoreHistory.filter(score => moment(score.date.toDate()).isSameOrAfter(windowStart, 'day'))
    p.scoreHistory = filteredScores
    //calculate new score
    p.score = p.scoreHistory.reduce(function(sum, item) {
      return sum + item.voteScore
    }, 0)
  }
  return allPeople
}
I expected it to return a new array where each person's score is summed up over the designated time period. It seems to do that OK. The problem is that it is altering the state that this.people reads from, which is the overall data set. So once it filters, all that data is gone. I don't know how I am altering global state without using Vuex.
Any help is greatly appreciated.
Your problem isn't that you're modifying the array, but that you're modifying the objects within the array. You change the scoreHistory and score property of each item in the array. What you want to do instead is create a new array (I recommend using map) where each item is a copy of the existing item plus a new score property.
filteredScores () {
  if(!this.people) {
    return
  }
  let timeWindow = 30 //days
  const windowStart = moment().subtract(timeWindow, 'days').toDate()
  return this.people.map(p => {
    let filteredScores = p.scoreHistory.filter(score => moment(score.date.toDate()).isSameOrAfter(windowStart, 'day'))
    //calculate new score
    let score = filteredScores.reduce(function(sum, item) {
      return sum + item.voteScore
    }, 0)
    // Create a new object containing all the properties of p, plus the new score
    return {
      ...p,
      score
    }
  })
}

sorting two associative arrays/stacks

I am implementing an algorithm I designed and am exploring different approaches.
This isn't a homework problem, but I am going to explain it like one: let's say a merchant has bought inventory of apples on different days, and also sold some on different days. I want the weighted average timestamp of their current purchases.
I am storing this data as an object whose keys are timestamp strings in epoch time and whose values are quantities of apples. My dataset actually has the purchases and the sales in separate data sets, like so:
//buys
var incomingArray = {
  "1518744389": 10,
  "1318744389": 30
};
//sells
var outgoingArray = {
  "1518744480": 3,
  "1418744389": 5,
  "1408744389": 8
};
and I would like the outcome to show only the remaining incomingArray timestamp/purchase pairs:
var incomingArrayRemaining = {
  "1518744389": 7,
  "1318744389": 17
};
Here you can see there was one outgoing transaction of 3 apples at a later timestamp, which therefore subtracts from the 10. And 13 apples (5 + 8) went out before the buy of 10 but after the purchase of 30, so they subtract only from the 30.
Note that if more than 10 had gone out after the buy of 10, the excess would subtract from both the 10 and the 30. The number of apples can never be less than 0.
First, to accomplish my goal it seems that I need to know how many apples are actually still owned from the lot they were purchased in.
Instead of doing stack subtraction in LIFO fashion, it seems like this has to be more like tax lot accounting, where the lots themselves have to be treated independently.
Therefore I would have to take the timestamp of the first sell in the outgoing array and find the nearest older buy timestamp in the incoming array.
Here is what I tried:
for (var ink in incomingArray) {
  var inInt = parseInt(ink);
  for (var outk in outgoingArray) {
    if (inInt >= 0) {
      var outInt = parseInt(outk);
      if (outInt >= inInt) {
        inInt = inInt - outInt;
        if (inInt < 0) {
          outInt = inInt * -1; //remainder
          inInt = 0;
        } //end if
      } //end if
    } //end if
  } //end inner for
} //end outer for
It is incomplete, and the nested for loop solution already has poor computational time.
That function merely tries to sort the transactions so that only the remaining balance remains, by subtracting an outgoing from the nearest incoming balance and carrying the remainder over to the next incoming balance.
I feel like a recursive solution would be better, or maybe something more elegant that I hadn't thought of (a nested object forEach accessor in JavaScript).
After I get them sorted, I need to actually do the weighted average, for which I already have some ideas.
First sorting, then the weighted average of the remaining quantities.
Anyway, I know the JavaScript community on Stack Overflow is particularly harsh about asking for help, but I'm at an impasse because I want not just a solution but a computationally efficient one, so I will probably throw a bounty on it.
You could convert the objects into an array of timestamp/value pairs, where outgoing ones are negative. Then you can easily sort them by timestamp and accumulate them however you like:
const purchases = Object.entries(incomingArray).concat(Object.entries(outgoingArray).map(([ts, val]) => ([ts, -val])));
purchases.sort(([ts1], [ts2]) => ts1 - ts2);
Now you can iterate over the pairs and store the accumulated delta in a new array whenever the value increases (a new ingoing):
const result = [];
let delta = 0, lastIngoing = null;
for (const [time, value] of purchases) {
  if (value > 0) {
    // A new ingoing: store the outgoing delta accumulated for the previous one
    if (lastIngoing !== null) result.push([lastIngoing, delta]);
    // Set up the new group
    delta = 0;
    lastIngoing = time;
  } else {
    // Outgoing values are negative, so delta ends up as the total sold against this lot
    delta += value;
  }
}
if (lastIngoing !== null) result.push([lastIngoing, delta]); // flush the last group
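For the final step the question mentions (the weighted average timestamp of what remains), a minimal sketch could look like this, assuming the remaining lots have already been reduced to the incomingArrayRemaining shape shown in the question:
// Sketch only: average of the timestamps, weighted by the remaining quantity per lot.
function weightedAverageTimestamp(remaining) {
  let totalQty = 0;
  let weightedSum = 0;
  for (const [ts, qty] of Object.entries(remaining)) {
    totalQty += qty;
    weightedSum += parseInt(ts, 10) * qty;
  }
  return totalQty > 0 ? weightedSum / totalQty : null;
}

// e.g. weightedAverageTimestamp({ "1518744389": 7, "1318744389": 17 })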

Not truly async?

I have an array of about 18,000 elements. I'm creating a map application where I want to add the elements when the user zooms in to a certain level.
So when the user zooms in below level 9, I loop through the array looking for elements that are in the view.
However, it does take some time to loop through the elements, causing the map application to lag each time the user zooms in and out across level 9, even when there are no elements to add, so I guess the bottleneck is the looping.
I've tried to solve it by making it async, like this:
function SearchElements(elementArr) {
  var ret = new Promise(resolve => {
    var arr = [];
    for (var i in elementArr) {
      var distanceFromCenter = getDistanceFromLatLonInKm(view.center.latitude, view.center.longitude, dynamicsEntities[i].pss_latitude, dynamicsEntities[i].pss_longitude);
      var viewWidthInKm = getSceneWidthInKm(view, true);
      if (distanceFromCenter > viewWidthInKm) continue;
      arr.push(elementArr[i]);
    }
    resolve(arr);
  });
  return ret;
}

SearchElements(myElementsArray).then(arr => {
  // ...
});
But it's still not async; this method hangs while the for loop runs.
Because you still have a tight loop that goes through all the elements in one pass, you'll always have these responsiveness issues.
One way to tackle the issue is to work on chunks of the data.
Note: I'm assuming elementArr is a JavaScript Array.
function SearchElements(elementArr) {
  var sliceLength = 100; // how many elements to work on at a time
  var totalLength = elementArr.length;
  var slices = ((totalLength + sliceLength - 1) / sliceLength) | 0; // integer division, rounded up
  return Array.from({length: slices})
    .reduce((promise, unused, outerIndex) =>
      promise.then(results =>
        Promise.resolve(elementArr.slice(outerIndex * sliceLength, (outerIndex + 1) * sliceLength).map((item, innerIndex) => {
          const i = outerIndex * sliceLength + innerIndex;
          const distanceFromCenter = getDistanceFromLatLonInKm(view.center.latitude, view.center.longitude, dynamicsEntities[i].pss_latitude, dynamicsEntities[i].pss_longitude);
          const viewWidthInKm = getSceneWidthInKm(view, true);
          if (distanceFromCenter <= viewWidthInKm) {
            return item; // this is like your `push`
          }
          // if distanceFromCenter > viewWidthInKm, the return value will be `undefined` and filtered out later - this is like your `continue`
        })).then(batch => results.concat(batch)) // concatenate this batch to the results
      ), Promise.resolve([]))
    .then(results => results.filter(v => v !== undefined)); // filter out the `undefined` entries
}
use:
SearchElements(yourDataArray).then(results => {
  // all results available here
});
My other suggestion in the comments was Web Workers (I originally called them worker threads; not sure where I got that term from). I'm not familiar enough with Web Workers to offer a solution, but https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/Using_web_workers and https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API should get you going.
To be honest, I think this sort of heavy task would be better suited to Web Workers.
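For completeness, a minimal sketch of the Web Worker approach might look like the following. It assumes each element carries its own pss_latitude / pss_longitude, that the array can be structured-cloned to the worker, and that getDistanceFromLatLonInKm is defined (or imported) inside the worker script; it is only meant to show the message-passing shape, not a drop-in solution.
// main.js - hand the filtering off to a worker so the UI thread stays responsive
const worker = new Worker('searchElements.worker.js'); // hypothetical file name

worker.onmessage = (event) => {
  const results = event.data; // the filtered elements
  // ... add them to the map here
};

worker.postMessage({
  elements: myElementsArray,
  center: { latitude: view.center.latitude, longitude: view.center.longitude },
  viewWidthInKm: getSceneWidthInKm(view, true)
});

// searchElements.worker.js - assumes getDistanceFromLatLonInKm is available here
self.onmessage = (event) => {
  const { elements, center, viewWidthInKm } = event.data;
  const results = elements.filter(el =>
    getDistanceFromLatLonInKm(center.latitude, center.longitude, el.pss_latitude, el.pss_longitude) <= viewWidthInKm
  );
  self.postMessage(results);
};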

immutable.js filter and mutate (remove) found entries

I have two loops: one over each day of the month, the other over all events for this month. Let's say I have 100,000 events.
I'm looking for a way to remove events from the main events List once they have been "consumed".
The code is something like:
const calendarRange = [{initialDate}, {initialDate}, {initialDate}, {initialDate}, ...] // say we have 30 dates, one for each day
const events = fromJS([{initialDate}, {initialDate}, {initialDate}, ...]) // let's say we have 100 000
calendarRange.map((day) => {
  const dayEvents = events.filter((event) => day.get('initialDate').isSame(event.get('initialDate'), 'day')) // we get all events for each day
  doSomeThingWithDays(dayEvents)
  // how could I subtract `dayEvents` from `events` in a way that
  // the next calendarRange iteration has fewer events to filter?
  // the order of the first loop must be preserved (because it goes from day 1 to day 30/31)
})
With lodash I could just do something like:
calendarRange.map((day) => {
  const dayEvents = events.filter((event) => day.get('initialDate').isSame(event.get('initialDate'), 'day')) // we get all events for each day
  doSomeThingWithDays(dayEvents)
  pullAllWith(events, dayEvents, (a, b) => a === b)
})
How can I accomplish the same optimization with Immutable.js? I'm not really expecting a solution for my way of iterating the list, but rather a smart way of reducing the events List so that it gets smaller and smaller.
You can try a Map with the events split into bins - based on your example, you would bin by date - so you can look up a bin, process it as a batch, and remove it in O(1). Immutable maps are fairly inexpensive and fare much better than iterating over lists. You incur the cost of a one-time binning, but amortize it over the O(1) lookups.
Something like this perhaps:
// OrderedMap comes from Immutable.js; bin the events once, keyed by day
const eventBins = OrderedMap(events.groupBy(evt => evt.get('initialDate').dayOfYear() /* or whatever selector */))

function iter(list, bins) {
  if (list.isEmpty())
    return
  const day = list.first()
  const key = day.get('initialDate').dayOfYear()
  const dayEvents = bins.get(key)
  doSomeThingWithDays(dayEvents)
  iter(list.shift(), bins.delete(key))
}

iter(rangeOfDays, eventBins)
By removing already processed elements you are not going to make anything faster. The cost of all the filter operations will be halved on average, but constructing a new list in every iteration will cost you some CPU cycles, so it is not going to be significantly faster (in a big-O sense). Instead, you could build an index, for example an immutable map keyed by the initialDate values, making all the filter calls unnecessary.
const calendarRange = Immutable.Range(0, 10, 2).map(i => Immutable.fromJS({initialDate: i}));
const events = Immutable.Range(0, 20).map(i => Immutable.fromJS({initialDate: i % 10, i: i}));
const index = events.groupBy(event => event.get('initialDate'));

calendarRange.forEach(day => {
  const dayEvents = index.get(day.get('initialDate'));
  doSomeThingWithDays(dayEvents);
});

function doSomeThingWithDays(data) {
  console.log(data);
}

<script src="https://cdnjs.cloudflare.com/ajax/libs/immutable/3.8.1/immutable.js"></script>
