Firebase OrderByKey with startAt and endAt giving wrong results - javascript

I have 3 objects whose keys start with a date in YYYYMMDD format, followed by a push ID (they appear in the output below). I am trying to get one month's worth of data, but I am not getting the desired output.
When I query it like this:
var ref = db.child("-KPXECP6a1pXaM4gEYe0");
ref.orderByKey().startAt("20160901").once("value", function (snapshot) {
    console.log("objects: " + snapshot.numChildren());
    snapshot.forEach(function (childSnapshot) {
        console.log(childSnapshot.key);
    });
});
I get the following output:
objects: 3
20160822-KPl446bbdlaiQx6BOPL
20160901-KPl48ID2FuT3tAVf4DW
20160902-KPl4Fr4O28VpsIkB70Z
When I query this along with endAt:
ref.orderByKey().startAt("20160901").endAt("20160932").once("value", function (snapshot) {
    console.log("objects: " + snapshot.numChildren());
    snapshot.forEach(function (childSnapshot) {
        console.log(childSnapshot.key);
    });
});
I get this:
objects: 0
If I use a ~ sign at the end,
ref.orderByKey().startAt("20160901").endAt("20160932~").once("value", function (snapshot) {
    console.log("objects: " + snapshot.numChildren());
    snapshot.forEach(function (childSnapshot) {
        console.log(childSnapshot.key);
    });
});
I get the output:
objects: 3
20160822-KPl446bbdlaiQx6BOPL
20160901-KPl48ID2FuT3tAVf4DW
20160902-KPl4Fr4O28VpsIkB70Z
Is there anything I am missing here?

Wow... this took some time to dig up. Thanks for the jsfiddle, that helped a lot.
TL;DR: ensure that you always have a non-numeric character in your search criteria, e.g. ref.orderByKey().startAt("20160901-").endAt("20160931~").
Longer explanation
In Firebase all keys are stored as strings. But we make it possible for developers to store arrays in the database. In order to allow that we store the array indices as string properties. So ref.set(["First", "Second", "Third"]) is actually stored as:
"0": "First"
"1": "Second"
"2": "Third"
When you get the data back from Firebase, it'll convert this into an array again. But it is important for your current use-case to understand that it is stored as key-value pairs with string keys.
When you execute a query, Firebase tries to detect whether you're querying a numeric range. When it thinks that is your intent, it converts the arguments into numbers and queries against the numeric conversion of the keys on the server.
In your case since you are querying on only a numeric value, it will switch to this numeric query mode. But since your keys are actually all strings, nothing will match.
For this reason I'd recommend that you prefix keys with a constant string. Any valid character will do, I used a - in my tests. This will fool our "is it an array?" check and everything will work the way you want it.
The quicker fix is to ensure that your conditions are not convertible to a number. In the TL;DR snippet above I did this by appending a low-range ASCII character (-) to the startAt() value and a high-range ASCII character (~) to the endAt() value.
Both of these are workarounds for the way Firebase deals with arrays. Unfortunately the API doesn't have a simple way to handle it and requires such a workaround.
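For example, applied to the monthly query from the question, a minimal sketch (the - and ~ suffixes are the ones from the TL;DR above; any non-numeric characters with the right sort order would do):
var ref = db.child("-KPXECP6a1pXaM4gEYe0");
// "-" sorts below the push ID characters and "~" sorts above them;
// both keep the bounds from being treated as a numeric range
ref.orderByKey().startAt("20160901-").endAt("20160931~").once("value", function (snapshot) {
    snapshot.forEach(function (childSnapshot) {
        console.log(childSnapshot.key); // only the 201609xx keys
    });
});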


How to make JSON.parse() to treat all the Numbers as BigInt?

I have some numbers in JSON that overflow the Number type, so I want them to be bigint, but how?
{"foo":[[0],[64],[89],[97]],"bar":[[2323866757078990912,144636906343245838,441695983932742154,163402272522524744],[2477006750808014916,78818525534420994],[18577623609266200],[9008333127155712]]}
TLDR;
You may employ the JSON.parse() reviver parameter.
Detailed Solution
To control JSON.parse() behavior that way, you can make use of the second parameter of JSON.parse (reviver) - the function that pre-processes key-value pairs (and may potentially pass desired values to BigInt()).
Yet, the values recognized as numbers will still be coerced (the credit for pinpointing this issue goes to @YohanesGultom).
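To see the coercion in action, here is a minimal sketch (the naive reviver is mine, for illustration):
// Naive attempt: convert unsafe integers to BigInt in the reviver.
// By the time the reviver sees the value, JSON.parse has already
// turned it into a lossy Number, so the damage is done.
const naive = JSON.parse(
    '{"big": 2323866757078990912}',
    (key, value) =>
        typeof value === 'number' && !Number.isSafeInteger(value)
            ? BigInt(value) // too late: value was already rounded
            : value
)
console.log(naive.big) // a bigint, but not the original digits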
To get around this, you may enquote your big numbers (to turn them into strings) in your source JSON string, so that their values are preserved upon converting to bigint.
As long as you wish to convert only certain numbers to bigint, you will need to pick appropriate criteria (e.g. checking whether the value exceeds Number.MAX_SAFE_INTEGER with Number.isSafeInteger(), as @PeterSeliger has suggested).
Thus, your problem may be solved with something like this:
// source JSON string
const input = `{"foo":[[0],[64],[89],[97]],"bar":[[2323866757078990912,144636906343245838,441695983932742154,163402272522524744],[2477006750808014916,78818525534420994],[18577623609266200],[9008333127155712]]}`

// function that implements the desired criteria
// to separate *big numbers* from *small* ones
//
// (works for input parameter num of type number/string)
const isBigNumber = num => !Number.isSafeInteger(+num)

// function that enquotes *big numbers* matching
// the desired criteria into double quotes inside
// the JSON string
//
// (the function checking for *big numbers* may be
// passed as a second parameter for flexibility)
const enquoteBigNumber = (jsonString, bigNumChecker) =>
    jsonString.replaceAll(
        /([:\s\[,]*)(\d+)([\s,\]]*)/g,
        (matchingSubstr, prefix, bigNum, suffix) =>
            bigNumChecker(bigNum)
                ? `${prefix}"${bigNum}"${suffix}`
                : matchingSubstr
    )

// parser that turns matching *big numbers* in the
// source JSON string into bigint
const parseWithBigInt = (jsonString, bigNumChecker) =>
    JSON.parse(
        enquoteBigNumber(jsonString, bigNumChecker),
        (key, value) =>
            !isNaN(value) && bigNumChecker(value)
                ? BigInt(value)
                : value
    )

// resulting output
const output = parseWithBigInt(input, isBigNumber)
console.log("output.foo[1][0]:\n", output.foo[1][0], `(type: ${typeof output.foo[1][0]})`)
console.log("output.bar[0][0]:\n", output.bar[0][0].toString(), `(type: ${typeof output.bar[0][0]})`)
Note: you may find the RegExp pattern used to match strings of digits among JSON values not quite robust, so feel free to come up with your own (mine was just the quickest I could pick off the top of my head for demo purposes).
Note: you may still opt for some library, as suggested by @YohanesGultom, yet adding 10k to your client bundle or 37k to your server-side dependencies (possibly to docker image size) for that sole purpose may not be quite reasonable.

What's the limit of arrays you can create?

What's the limit of arrays you can create? Is there a browser cap or limitation?
The idea is that I want to dynamically create an array for each employee's information (name, Active Directory name, employee number, etc.).
Loop through employee list and create array:
window["arr_" + employee number]
Then when the user needs it, my code will call the array name based on their employee number. The button has the empn as an attribute so I can pass it:
console.log( window["arr_" + empn] )
I'm worried that I'll have more employees than the number of arrays I'm allowed to create.
To answer the question you asked, your problem is not the theoretical limit of how many things you can store in an array, it's the practical limit at which point you'll start running out of memory or the site will be too slow. 100,000 is likely a good rule of thumb.
But there are a lot of other problems here too. First of all,
window["anything"]
is not an array, it's a property on the window object. And the window object is definitely the wrong place to store a list of employees. It's hard to tell with so little info, but what you probably want to do is create an object:
var employees = {};
and then populate it with key:value pairs, where the key is something like "employee_"+num and the value is the object you got from whatever you're looping over:
employees["employee_"+number] = data
The result of that will be an object like:
{
    "employee_0" : { "name" : "John", "number": 0 },
    "employee_1" : { "name" : "Joe", "number": 1 }
}
and you would reference them by key, like so:
console.log("Employee 0's name is " + employees["employee_0"].name);
Hope that helps. If you need more help, you'll need to be clearer about how you're looping over the employee data and what it looks like.
The maximum length until "it gets slow" is totally dependent on your target machine and your actual code, so you'll need to test on that (those) platform(s) to see what is acceptable.
However, the maximum length of an array according to the ECMAScript specification is bound by an unsigned 32-bit integer (due to the ToUint32 abstract operation), so the longest possible array could have 2^32 − 1 = 4,294,967,295 ≈ 4.29 billion elements.
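A quick sketch of where that limit bites:
var ok = new Array(4294967295);         // 2^32 - 1: the largest legal length
console.log(ok.length);                 // 4294967295
try {
    var tooBig = new Array(4294967296); // 2^32: one past the limit
} catch (e) {
    console.log(e.name);                // "RangeError" (invalid array length)
}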

Javascript object with arrays to search param style query string

Looking for a clean way to convert a JavaScript object containing arrays as values to a search-param-compatible query string, serializing one element from each array before moving to the next index.
Using libraries such as querystring or qs converts the object just fine, but handles each array independently. Passing the resulting string to the server (which I cannot change) causes an error in handling of the items, as each previous value is overwritten by the next. Using any kind of array notation in the query string is not supported. The only option I have not tried is a custom sort function, but that seems like it would be worse than writing a custom function to serialize the object. Any revision to the object that would generate the expected result is welcome as well.
var qs = require("qs")
var jsobj = {
    origString: ['abc', '123'],
    newString: ['abcd', '1234'],
    action: 'compare'
}
qs.stringify(jsobj, {encode: false})
qs.stringify(jsobj, {encode: false, indices: false})
qs.stringify(jsobj, {encode: false, indices: false, arrayFormat: 'repeat'})
Result returned is
"origString=abc&origString=123&newString=abcd&newString=1234&action=compare"
Result desired would be
"origString=abc&newString=abcd&origString=123&newString=1234&action=compare"
I tried reordering your JSON:
> var jsobj = [{origString: 'abc', newString: 'abcd'}, {origString: '123', newString: '1234'}, {action: 'compare'}]
> qs.stringify(jsobj,{encode:false})
'0[origString]=abc&0[newString]=abcd&1[origString]=123&1[newString]=1234&2[action]=compare'
But I don't know if this is a good alternative for your problem.
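For completeness, here is a minimal sketch of a custom serializer that emits the values in the interleaved order the question asks for (the zip-style walk and the function name are my assumptions; note it does no URL-encoding, matching the encode:false option above):
// interleave array values: emit index 0 of every array-valued key,
// then index 1, and so on; plain values go at the end
function interleaveStringify(obj) {
    var keys = Object.keys(obj);
    var arrayKeys = keys.filter(function (k) { return Array.isArray(obj[k]); });
    var plainKeys = keys.filter(function (k) { return !Array.isArray(obj[k]); });
    var maxLen = Math.max.apply(null, arrayKeys.map(function (k) { return obj[k].length; }));
    var parts = [];
    for (var i = 0; i < maxLen; i++) {
        arrayKeys.forEach(function (k) {
            if (i < obj[k].length) parts.push(k + "=" + obj[k][i]);
        });
    }
    plainKeys.forEach(function (k) { parts.push(k + "=" + obj[k]); });
    return parts.join("&");
}
console.log(interleaveStringify(jsobj));
// "origString=abc&newString=abcd&origString=123&newString=1234&action=compare"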
Chalk this up to a misunderstanding of the application. After spending some more time with the API I realized my mistake and, as others posted above, order does not matter. Not sure why my first several attempts failed, but the question is 'answered'.

parse.com search for partial string in array

I'm trying to search a Parse.com field which is an array for a partial string.
When the field is in String format I can do the following:
// Update the filtered array based on the search text and scope.
// Remove all objects from the filtered search array
[self.searchResults removeAllObjects];
// Filter the array using NSPredicate
NSPredicate *predicate = [NSPredicate predicateWithFormat:@"SELF.busnumber contains[c] %@", searchText];
self.searchResults = [NSMutableArray arrayWithArray:[self.objects filteredArrayUsingPredicate:predicate]];
This works, however the new field I want to search in is an Array.
It works when I change it to the following:
PFQuery *query = [PFQuery queryWithClassName:@"Bus"];
[query whereKey:@"route" equalTo:[NSString stringWithFormat:@"%@", searchText]];
[query findObjectsInBackgroundWithBlock:^(NSArray *objects, NSError *error) {
    NSLog(@"Objects: %@", objects);
    if (error)
    {
        NSLog(@"ERROR: %@", error.localizedDescription);
    }
    else
    {
        [self.searchResults removeAllObjects];
        [self.searchResults addObjectsFromArray:objects];
        [self.searchDisplayController.searchResultsTableView reloadData];
    }
}];
However, this requires the exact string. I want to be able to search for parts of a string, but when I change it to:
[query whereKey:@"route" containsString:[NSString stringWithFormat:@"%@", searchText]];
I get:
[Error]: $regex only works on string fields (Code: 102, Version: 1.7.4)
Any ideas? Thanks :)
What you've attempted is rational, but the string qualifiers on PFQuery work only on strings.
I've seen this theme frequently on SO: PFQuery provides only basic comparisons for simple attributes. To do anything more, one must query for a superset and do app level computation to reduce the superset to the desired set. Doing so is expensive for two reasons: app-level compute speed/space, and network transmission of the superset.
The first expense is mitigated and the second expense is eliminated by using a cloud function to do the app level reduction of the superset. Unless you need the superset records on the client anyway, consider moving this query to the cloud.
Specific to this question, here's what I think the cloud function would resemble:
// very handy to have underscore available
var _ = require('underscore');
// return Bus objects whose route array contains strings which contain
// the passed routeSubstring (request.params.routeSubstring)
Parse.Cloud.define("busWithRouteContaining", function(request, response) {
    // for now, don't deal with counts > 1k
    // (there's a simple adjustment, applying skip recursively, to get > 1k results)
    var query = new Parse.Query("Bus");
    query.find().then(function(buses) {
        var routeSubstring = request.params.routeSubstring;
        var matching = _.select(buses, function(bus) {
            var route = bus.get("route");
            // _.some (not _.contains, which tests equality) lets us
            // run a substring predicate against each route string
            return _.some(route, function(routeString) {
                return routeString.indexOf(routeSubstring) !== -1;
            });
        });
        response.success(matching);
    }, function(error) {
        response.error(error);
    });
});
If you do decide to perform the reduction on the client and need help with the code, I can edit that in. It will be a pretty simple switch to predicateWithBlock: with a block that iterates the array attribute and checks rangeOfString: on each.

optimize search through large js string array?

If I have a large JavaScript string array with over 10,000 elements, how do I quickly search through it?
Right now I have a JavaScript string array that stores the description of a job, and I'm allowing the user to dynamically filter the returned list as they type into an input box.
So say I have a string array like so:
var descArr = ["flipping burgers", "pumping gas", "delivering mail"];
and the user wants to search for: "p"
How would I be able to search a string array that has 10000+ descriptions in it quickly?
Obviously I can't sort the description array, since they're descriptions, so binary search is out. And since the user can search by "p" or "pi" or any combination of letters, this partial search means that I can't use associative arrays (i.e. searchDescArray["pumping gas"]) to speed up the search.
Any ideas anyone?
As the regular expression engines in current browsers are extremely fast, how about doing it that way? Instead of an array, pass one gigantic string and separate the words with a delimiter.
Example:
String "flipping burgers""pumping gas""delivering mail"
Regex: "([^"]*ping[^"]*)"
With the switch /g for global you get all the matches. Make sure the user does not search for your string separator.
You can even add an id into the string with something like:
String "11 flipping burgers""12 pumping gas""13 delivering mail"
Regex: "(\d+) ([^"]*ping[^"]*)"
Example: http://jsfiddle.net/RnabN/4/ (30000 strings, limit results to 100)
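Here is a minimal runnable sketch of the same idea (the metacharacter escaping is my addition on top of the original approach):
// concatenate the descriptions into one big quoted string
var descArr = ["flipping burgers", "pumping gas", "delivering mail"];
var haystack = '"' + descArr.join('""') + '"';
function search(term) {
    // escape regex metacharacters in the user's input
    var safe = term.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
    var re = new RegExp('"([^"]*' + safe + '[^"]*)"', "g");
    var matches = [], m;
    while ((m = re.exec(haystack)) !== null) {
        matches.push(m[1]);
    }
    return matches;
}
console.log(search("p"));    // ["flipping burgers", "pumping gas"]
console.log(search("ping")); // ["flipping burgers", "pumping gas"]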
There's no way to speed up an initial array lookup without making some changes. You can speed up consecutive lookups by caching results and mapping them to patterns dynamically.
1.) Adjust your data format. This makes initial lookups somewhat speedier. Basically, you precache.
var data = {
    a: ['Ant farm', 'Ant massage parlor'],
    b: ['Bat farm', 'Bat massage parlor']
    // etc
}
2.) Set up the cache mechanics.
var searchFor = function (str, list, caseSensitive, reduce) {
    str = str.replace(/(?:^\s*|\s*$)/g, ''); // trim whitespace
    var found = [];
    // no 'g' flag: a global regex keeps lastIndex state between
    // test() calls and would silently skip matches
    var reg = new RegExp('^\\s?' + str, caseSensitive ? '' : 'i');
    var i = list.length;
    while (i--) {
        if (reg.test(list[i])) {
            found.push(list[i]);
            // when reduce is set, remove matches from the source
            // list so later passes have less to scan
            if (reduce) list.splice(i, 1);
        }
    }
    return found;
}
var lookUp = function (str, caseSensitive) {
    str = str.replace(/(?:^\s*|\s*$)/g, ''); // trim whitespace
    if (data[str]) return data[str]; // cache hit
    var firstChar = caseSensitive ? str[0] : str[0].toLowerCase();
    var list = data[firstChar];
    if (!list) return (data[str] = []);
    // we cache on data since it's already a caching object.
    return (data[str] = searchFor(str, list, caseSensitive));
}
3.) Use the following script to create a precache object. I suggest you run this once and use JSON.stringify to create a static cache object. (or do this on the backend)
// this uses the searchFor function from above; it might take a while
var preCache = function (arr) {
    var chars = "abcdefghijklmnopqrstuvwxyz".split('');
    var cache = {};
    var i = chars.length;
    while (i--) {
        // reduce is true, so we're destroying the original list here.
        cache[chars[i]] = searchFor(chars[i], arr, false, true);
    }
    return cache;
}
Probably a bit more code than you expected, but optimisation and performance don't come for free.
This may not be an answer for you, as I'm making some assumptions about your setup, but if you have server side code and a database, you'd be far better off making an AJAX call back to get the cut down list of results, and using a database to do the filtering (as they're very good at this sort of thing).
As well as the database benefit, you'd also benefit from not outputting this much data (10000 variables) to a web based front end - if you only return those you require, then you'll save a fair bit of bandwidth.
I can't reproduce the problem. I created a naive implementation, and most browsers do the search across 10,000 15-character strings in a single-digit number of milliseconds. I can't test in IE6, but I wouldn't expect it to be more than 100 times slower than the fastest browsers, which would still be virtually instant.
Try it yourself: http://ebusiness.hopto.org/test/stacktest8.htm (Note that the creation time is not relevant to the issue, that is just there to get some data to work on.)
One thing you could do wrong is trying to render all results, that would be quite a huge job when the user has only entered a single letter, or a common letter combination.
I suggest trying a ready made JS function, for example the autocomplete from jQuery. It's fast and it has many options to configure.
Check out the jQuery autocomplete demo
Using a Set for large datasets (1M+) is around 3500 times faster than Array .includes()
You must use a Set if you want speed.
I just wrote a node script that needs to look up a string in a 1.3M array.
Using Array's .includes for 10K lookups:
39.27 seconds
Using Set .has for 10K lookups:
0.01084 seconds
Use a Set.
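A minimal sketch of that comparison (the sizes and probe keys are my assumptions; exact timings will vary by machine):
// build a large dataset, then time 10K lookups each way
const N = 1300000;
const arr = Array.from({ length: N }, (_, i) => "item" + i);
const set = new Set(arr);
const probes = Array.from({ length: 10000 }, (_, i) => "item" + (i * 130));

console.time("Array.includes x10K");
probes.forEach(p => arr.includes(p));
console.timeEnd("Array.includes x10K");

console.time("Set.has x10K");
probes.forEach(p => set.has(p));
console.timeEnd("Set.has x10K");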
