Compound query with the JS SDK on Parse.com - javascript

I have one class, Messages, with three principal fields:
id, FromUser, ToUser
I need a query where ToUser equals a given value and the FromUser values are not repeated. In other words: get all the distinct FromUser values that sent me a message.
Any idea?
Thanks!

As #Fosco says, "group by" or "select distinct" are not supported yet in Parse.com.
Also keep in mind the restrictions on the selection limit (max 1000 results per query) and on the request timeout (3 seconds in beforeSave events, 7-10 seconds in custom Cloud Code functions). For "count" queries, the limiting factor is the request timeout.
I'm working on Parse.com too, and I've changed the structure of my DB model a lot, often adding some denormalized columns to several classes and keeping them carefully updated for each query that needs them.
For cases like yours, I suggest writing a custom Cloud Code function that takes two input parameters (say, "myLimit" and "myOffset") for lazy loading, selects the data in slices, and programmatically filters the resulting array of items (with a simple for..loop, or with some utility from UnderscoreJS). Start with small slices (e.g. 200-300 records per selection at most) and keep going until a selection returns zero results (end reached). You could count all the items before starting, but the timeout limitation could cause you problems. If this doesn't work as expected, try doing the same thing client side.
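A minimal sketch of such a Cloud Code function might look like the following (the function name distinctSenders and its parameters are made up; the Messages/FromUser/ToUser names come from the question, and FromUser is assumed here to be stored as a plain string - if it is a pointer, compare .id instead):

Parse.Cloud.define("distinctSenders", function(request, response) {
  var myLimit = request.params.myLimit;   // e.g. 200-300 per slice
  var myOffset = request.params.myOffset; // 0, 300, 600, ...

  var query = new Parse.Query("Messages");
  query.equalTo("ToUser", request.params.toUser);
  query.limit(myLimit);
  query.skip(myOffset);

  query.find({
    success: function(results) {
      // Programmatically filter the slice down to distinct FromUser values
      var seen = {};
      var senders = [];
      for (var i = 0; i < results.length; i++) {
        var from = results[i].get("FromUser");
        if (!seen[from]) {
          seen[from] = true;
          senders.push(from);
        }
      }
      response.success(senders);
    },
    error: function(error) {
      response.error(error);
    }
  });
});

The caller keeps calling this with an increasing myOffset and merges the returned arrays (again removing duplicates) until an empty result comes back.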
You could also take a different approach: create another table/class and, for each new message, add the FromUser there only if it doesn't already exist for that specific ToUser.
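A rough sketch of that second approach, using a Cloud Code afterSave hook (the "Senders" class name and its fields are invented for illustration; they are not part of your model):

// Keep a "Senders" class in sync: one row per (ToUser, FromUser) pair,
// created only if it does not already exist.
Parse.Cloud.afterSave("Messages", function(request) {
  var fromUser = request.object.get("FromUser");
  var toUser = request.object.get("ToUser");

  var query = new Parse.Query("Senders");
  query.equalTo("ToUser", toUser);
  query.equalTo("FromUser", fromUser);

  query.first().then(function(existing) {
    if (!existing) {
      var Senders = Parse.Object.extend("Senders");
      var row = new Senders();
      row.set("ToUser", toUser);
      row.set("FromUser", fromUser);
      return row.save();
    }
  });
});

Getting all the users who wrote to someone is then a single, cheap query on Senders filtered by ToUser.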
Hope it helps

Related

How to load data onScroll in ReactNative with Firebase Realtime Database

I want to load data from my Realtime Database, but only 15 entries at a time, because the database is huge. The database stores the name of each vocable and information about it, like translations and stats. I want to sort it alphabetically by the value "wordENG", but there is a problem when I use orderByChild like this:
database()
  .ref(`vocables/${UID}`)
  .orderByChild("wordENG")
  .startAt(requestCount)
  .limitToFirst(15)
  .once("value")
  .then(snap => {
    console.log(snap.val());
  });
When I try to use startAt to get the data on scrolling, I run into the problem that startAt needs a string, i.e. a word from the database list. I don't want to store that word every time and search onward from it, but currently I cannot see another way. Is there a way to fetch the data alphabetically on scrolling using a plain counter, or do I need to solve it by saving the last word and searching from there?
Pagination with Firebase queries works based on knowing the anchor item, not on offsets. So you will indeed need to know the wordENG value of the node to start at (or start after, with the relatively new startAfter method), and possibly also its key (in case there are multiple child nodes with the same wordENG value).
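A rough sketch of what that could look like with the query from the question (loadNextPage, lastWordENG and lastKey are invented names; this assumes your SDK version already supports startAfter, otherwise use startAt and skip the first item of each page):

// Anchor-based pagination: remember the last item of the previous page
// and start the next query after it (startAfter is exclusive).
let lastWordENG, lastKey;

function loadNextPage() {
  let query = database()
    .ref(`vocables/${UID}`)
    .orderByChild("wordENG")
    .limitToFirst(15);

  if (lastWordENG !== undefined) {
    // The optional second argument (the key) disambiguates children
    // that share the same wordENG value.
    query = query.startAfter(lastWordENG, lastKey);
  }

  return query.once("value").then(snap => {
    const page = [];
    snap.forEach(child => {
      page.push({ key: child.key, ...child.val() });
    });
    if (page.length > 0) {
      lastWordENG = page[page.length - 1].wordENG;
      lastKey = page[page.length - 1].key;
    }
    return page;
  });
}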
If you're new to Firebase, I recommend also reading some of the previous questions about pagination, as this comes up regularly.

How to know how many items a Firestore query will return while implementing pagination

Firestore has this guide on how to paginate a query:
Firestore - Paginate data with query cursors
They show the following example:
Paginate a query
Paginate queries by combining query cursors with the limit() method. For example, use the last document in a batch as the start of a cursor for the next batch.
var first = db.collection("cities")
  .orderBy("population")
  .limit(25);

return first.get().then(function (documentSnapshots) {
  // Get the last visible document
  var lastVisible = documentSnapshots.docs[documentSnapshots.docs.length - 1];
  console.log("last", lastVisible);

  // Construct a new query starting at this document,
  // get the next 25 cities.
  var next = db.collection("cities")
    .orderBy("population")
    .startAfter(lastVisible)
    .limit(25);
});
QUESTION
I get the example, but how can I know how many items (in total, without the limit restriction) that query will return? I'll need that to calculate the number of pages and control the pagination component, won't I?
I can't simply display next and back buttons without knowing the limit.
How is it supposed to be done? Am I missing something?
You can't know the size of the result set in advance. You have to page through all the results to get the total size. This is similar to not being able to know the size of a collection without also recording that yourself somewhere else - it's just not scalable to provide this information, in the way that Cloud Firestore needs to scale.
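If you really do need the total once, the only option is to walk the whole result set in batches, which costs one read per counted document and therefore does not scale. A sketch of that (countAll is a made-up helper, written against the same v8-style API as the example above):

// Count by paging through every batch; each counted document is a billed read.
function countAll(query, batchSize, startAfterDoc, total) {
  total = total || 0;
  var batch = startAfterDoc ? query.startAfter(startAfterDoc) : query;
  return batch.limit(batchSize).get().then(function (snap) {
    total += snap.docs.length;
    if (snap.docs.length < batchSize) {
      return total; // last page reached
    }
    return countAll(query, batchSize, snap.docs[snap.docs.length - 1], total);
  });
}

countAll(db.collection("cities").orderBy("population"), 25)
  .then(function (total) { console.log("total cities:", total); });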
This is not possible; the iterator cannot know how many documents it contains, as they are fetched via a gRPC stream.
But there is a workaround... you have to do a few things:
1) Write a counter to a Firestore document, incrementing or decrementing it every time you make a transaction (i.e. add or remove an entry).
2) Store the current count in a field of each new entry, e.g. a position field.
Then you create an index on that field (position DESC).
This way you can do a skip+limit with a where("position", "<", N).orderBy("position", "desc").
It's complex, but it does the trick.
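A rough sketch of that workaround with the v8-style API used above (the meta/citiesCounter document, the position field and the helper names are all made up for illustration):

var counterRef = db.collection("meta").doc("citiesCounter");

// Creating a document: bump the counter and stamp the new entry with its position.
function addCity(data) {
  return db.runTransaction(function (tx) {
    return tx.get(counterRef).then(function (counterSnap) {
      var count = (counterSnap.exists ? counterSnap.data().count : 0) + 1;
      tx.set(counterRef, { count: count });
      var newRef = db.collection("cities").doc();
      tx.set(newRef, Object.assign({}, data, { position: count }));
      return count;
    });
  });
}

// "Skip + limit" style page, newest first, starting below a known position:
function getPage(lastPositionOfPreviousPage, pageSize) {
  return db.collection("cities")
    .where("position", "<", lastPositionOfPreviousPage)
    .orderBy("position", "desc")
    .limit(pageSize)
    .get();
}

The counter document also gives you the total count you need to compute the number of pages.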

ArrayCollection (Collection of forms) index collision in Symfony 2

I am using Symfony2 to build up my page.
When I try to update a collection of forms (as described in the cookbook entry "How to Embed a Collection of Forms"), I get a collision between the indexes used in the frontend and the indexes of the ArrayCollection in the backend.
I've got the relation User <-> Address (OneToMany). A user wants to create/update/delete his addresses, so in the frontend he can add/remove address elements with the help of the JavaScript part. He does the following:
(1) Adds new address (has index: 0)
(2) Adds new address (has index: 1) and instantly removes this address again
(3) Adds new address (has index: 2).
When he clicks the save button, the following code saves/updates the user (and his addresses):
$this->em->persist($user);
$this->em->flush();
New addresses for example are then correctly persisted to the database.
Now the user wants to update the address e.g. with index 0.
When he now clicks the save button, it updates the address with "index 0", but at the same time it adds the address with "index 2" to the database (object) again.
To better understand the problem, I've drawn a small illustration (handmade, sorry for my bad art skills):
Now I've got the address with "index 1" twice within my object/database.
I know why this happens: the first "index 1" address gets mapped to ArrayCollection element "number 1", and the second gets mapped to "number 2" (because of the frontend name "index 2").
You could say: "it just fills up the addresses until it reaches the frontend index in the backend".
But how can I fix this behaviour?
Side note:
This behaviour occurs when using AJAX requests, because if you reloaded the page after clicking the save button, the frontend indexes would be correctly reindexed to match the backend ones.
My suggestion to handle the situation:
reindex the frontend indexes with the server-side indexes after clicking save. Is this a clean solution / the only solution for my problem?
Yes, this is a problem of the Symfony form collection and it has no easy solution, imho. But I have to ask: why don't you do exactly the same thing a page refresh does? You can refresh only the HTML snippet containing the collection, and the HTML code for that snippet can come from the server side. Back to your question - yes, reindexing is a good solution, unless you want to try writing a custom collection type on your own.
symfony/symfony/issues/7828
There is a similar problem with validation in collections - symfony/symfony/issues/7468.
Well, I think the default collection type and the tutorial in the Symfony docs have some drawbacks. Hope that helps.
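A minimal sketch of that "refresh only the snippet" idea on the client side (the URL, the selectors and the collectionHtml response field are made up; the server is assumed to return the re-rendered collection partial after saving):

$('#address-form').on('submit', function (e) {
    e.preventDefault();
    $.post('/profile/addresses/save', $(this).serialize(), function (response) {
        // Swap only the collection markup, so the frontend prototype indexes
        // match the server-side ArrayCollection again.
        $('#address-collection').html(response.collectionHtml);
    });
});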
I have worked around this issue on the client side by modifying the JavaScript/jQuery code given in the Symfony documentation.
Instead of numbering the new elements by counting the sub-elements, I am looking at the last element's id and extracting its index with a regular expression.
When adding an element, I am incrementing the last index by 1. That way, I never use the same index.
Here is my code:
// Initializing the default index at 0
var index = 0;
// Looking for collection fields in the form
var $findinput = $container.find(':input');
// If fields were found, look for the last existing index
if ($findinput.length > 0) {
    // Reading the id of the last field
    var myString = $findinput.last().attr('id');
    // Regular expression to extract the number from an id containing letters, hyphens and underscores
    var myRegex = /^[-_A-Za-z]+([0-9]+)[-_A-Za-z]*$/;
    // Executing the regular expression on the last collection field id
    var test = myRegex.exec(myString);
    // Extracting the last index and incrementing it by 1
    // (exec() returns null when there is no match, so check that first)
    if (test !== null) index = parseInt(test[1], 10) + 1;
}
I ran into this problem a couple of times during the past two years. Usually, following the Symfony tutorial How to Embed a Collection of Forms does the job just fine. You need to do a little bit of JavaScript coding to add the "edit/update" functionality, but other than that you should be just fine using this approach.
If, on the other hand, you have a really complex form which uses AJAX for validation/saving/calculations/business logic/etc., I've found it's usually better to store the final data in a plain array in the session. After submitting the form, inside the if ($form->isValid()) {...} block, you would have
$collection = new ArrayCollection($mySessionPlainArray);
$user->setAddress($collection);
I would like to warn you to be careful with the serialization of your data - you might get some awkward exceptions or misbehavior if you're using entities (see my question).
I'm sorry I can't provide more code, but the solution to this problem sometimes is quite complex.

Javascript function taking too long to complete?

Below is a snippet of code that I am having trouble with. The purpose is to check for duplicate entries in the database and return "h" with a boolean, true or false. For testing purposes I am returning a true boolean for "h", but by the time the alert(duplicate_count); line gets executed, duplicate_count is still 0, even though the alert for the +1 does get executed.
To me it seems like the function updateUserFields is taking longer to execute, so it hasn't finished before the code reaches the alert.
Any ideas or suggestions? Thanks!
var duplicate_count = 0;
for (var i = 0; i < skill_id.length; i++) {
    function updateUserFields(h) {
        if (h) {
            duplicate_count++;
            alert("count +1");
        } else {
            alert("none found");
        }
    }
    var g = new cfc_mentoring_find_mentor();
    g.setCallbackHandler(updateUserFields);
    g.is_relationship_duplicate(resource_id, mentee_id, section_id[i], skill_id[i], active_ind, table);
}
alert(duplicate_count);
There is no reason whatsoever to use client-side JavaScript/jQuery to remove duplicates from your database. Security concerns aside (and there are a lot of those), there is a much easier way to make sure the entries in your database are unique: use SQL.
SQL is capable of expressing the requirement that there be no duplicates in a table column, and the database engine will enforce that for you, never letting you insert a duplicate entry in the first place. The syntax varies very slightly by database engine, but whenever you create the table you can specify that a column must be unique.
Let's use SQLite as our example database engine. The relevant part of your problem is right now probably expressed with tables something like this:
CREATE TABLE Person(
    id INTEGER PRIMARY KEY ASC
    -- Other fields here
);
CREATE TABLE MentorRelationship(
    id INTEGER PRIMARY KEY ASC,
    mentorID INTEGER,
    menteeID INTEGER,
    FOREIGN KEY (mentorID) REFERENCES Person(id),
    FOREIGN KEY (menteeID) REFERENCES Person(id)
);
However, you can enforce uniqueness, i.e. require that every (mentorID, menteeID) pair is unique, by making the pair (mentorID, menteeID) the primary key. This works because only one row per primary key value is allowed. The MentorRelationship table then becomes
CREATE TABLE MentorRelationship(
    mentorID INTEGER,
    menteeID INTEGER,
    PRIMARY KEY (mentorID, menteeID),
    FOREIGN KEY (mentorID) REFERENCES Person(id),
    FOREIGN KEY (menteeID) REFERENCES Person(id)
);
EDIT: As per the comment, alerting the user to duplicates but not actually removing them
This is still much better with SQL than with JavaScript. When you do this in JavaScript, you read one database row at a time, send it over the network, wait for it to come to your page, process it, throw it away, and then request the next one. With SQL, all the hard work is done by the database engine, and you don't lose time by transferring unnecessary data over the network. Using the first set of table definitions above, you could write
SELECT mentorID, menteeID
FROM MentorRelationship
GROUP BY mentorID, menteeID
HAVING COUNT(*) > 1;
which will return all the (mentorID, menteeID) pairs that occur more than once.
Once you have a query like this working on the server (and are also pulling out all the information you want to show to the user, which is presumably more than just a pair of IDs), you need to send this over the network to the user's web browser. Essentially, on the server side you map a URL to return this information in some convenient form (JSON, XML, etc.), and on the client side you read this information by contacting that URL with an AJAX call (see jQuery's website for some code examples), and then display that information to the user. No need to write in JavaScript what a database engine will execute orders of magnitude faster.
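For example, if the server exposes the duplicate query above as a JSON endpoint, the client side can be as small as this (the URL and field names are assumptions, not part of your existing code):

$.getJSON('/api/duplicate-relationships', function (duplicates) {
    if (duplicates.length === 0) {
        alert("No duplicate mentor/mentee pairs found.");
        return;
    }
    duplicates.forEach(function (pair) {
        // Display the duplicates however you like; this just logs them.
        console.log("Duplicate:", pair.mentorID, pair.menteeID);
    });
});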
EDIT 2: As per the second comment, checking whether an item is already in the database
Almost everything I said in the first edit applies, except for two changes: the schema and the query. The schema should become the second of the two schemas I posted, since you don't want the database engine to allow duplicates. Also, the query should be simply
SELECT COUNT(*) > 0
FROM MentorRelationship
WHERE mentorID = #mentorID AND menteeID = #menteeID;
where #mentorID and #menteeID are the items that the user selected, and are inserted into the query by a query builder library and not by string concatenation. Then, the server will get a true value if the item is already in the database, and a false value otherwise. The server can send that back to the client via AJAX as before, and the client (that's your JavaScript page) can alert the user if the item is already in the database.

Live updating total number of results while the user fills in the form

I am currently working on a project and the client is asking for a feature that is going to require the help of JavaScript - which I'm no expert with; I can do the basics, but I don't really know where to start with this. On the site there is a form (like an advanced search) and on the right it should show the total number of results, but it needs to keep updating as the form is filled in, so that as the user goes through filling in the form the total number of results is updated to reflect this...
I thought maybe it could be done with AJAX: passing along the contents/value of the input, performing a query, then passing back the total number of results. But how would that work for every input? Isn't it just going to be overkill? I tried to tell the client it would be a strain on the server (which surely it would be), but they seem dead-set on having it...
Any help or techniques would be very useful, or if you have come across something like this before please do let me know.
Thanks
Using jQuery you could attach a function to the blur() event of each input control that is used in the 'advanced search' to perform the ajax call to the server and get the current number of results.
This way the ajax server call will only fire each time an input field is completed and the focus moves elsewhere.
Of course if you have many fields this will result in a call each time a field is completed or amended and the focus is moved. It would also be wise to ensure that the field value has changed before making the ajax call. Something along the lines of:
var tempVal = "";
// Each field that is used in the advanced search will need to have the
// advancedSearchInput class
$('.advancedSearchInput').focus(function() {
tempVal = $(this).val();
});
$('.advancedSearchInput').blur(function() {
if($(this).val() == tempVal) {
// Get advanced search values and make ajax call
}
});
I tend to avoid this sort of thing because, as you say, it can be a strain on the server. Here are a few things I do to help reduce that risk:
Only perform the search/count when the user has entered at least 3 characters
Cache search terms and their resultant count in a memory store like Redis or memcached
Use MySQL full-text searching if you are running a MySQL DB
Another option is to present fuzzy numbers: get the total number of records when the page loads, and then as the user types, randomly take a chunk off the total. Once the user's term is more specific, or they actually click the submit button, you execute a real AJAX request to get the actual count.
So the user would see something like:
Search: ""
Response: There are about 3097 records relevant to your search
Search: "Ap"
Response: There are about 1567 records relevant to your search
Search: "Apple"
issues ajax request
Response: There are 542 records relevant to your search
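A small sketch tying those suggestions together: only query once at least 3 characters have been typed, and debounce the call so it fires when the user pauses rather than on every keystroke (the /search/count endpoint and its response shape are assumptions):

var debounceTimer = null;
$('.advancedSearchInput').on('input', function () {
    var term = $(this).val();
    clearTimeout(debounceTimer);
    if (term.length < 3) {
        return; // too short, skip the server round-trip
    }
    debounceTimer = setTimeout(function () {
        $.getJSON('/search/count', { q: term }, function (data) {
            $('#result-count').text('There are about ' + data.count + ' records relevant to your search');
        });
    }, 300);
});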
