ArrayCollection (Collection of forms) index collision in Symfony 2 - javascript

I am using Symfony2 to build my page.
When I try to update a collection of forms (as described in the cookbook entry "How to Embed a Collection of Forms"), I get a collision between the indexes in the frontend and the indexes of the ArrayCollection in the backend.
I have the relation User <-> Address (OneToMany). A user wants to create/update/delete his addresses, so with the help of the JavaScript part he can add and remove address elements in the frontend. He does the following:
(1) Adds new address (has index: 0)
(2) Adds new address (has index: 1) and instantly removes this address again
(3) Adds new address (has index: 2).
When he clicks on save button, the following code saves/updates the user (and its addresses):
$this->em->persist($user);
$this->em->flush();
New addresses for example are then correctly persisted to the database.
Now the user wants to update the address with, e.g., index 0.
When he clicks on the save button again, it updates the address with "index 0", but at the same time it adds the address with "index 2" to the database (object) a second time.
To better understand the problem, I've drawn a small illustration (handmade, sorry for my bad art skills).
Now I've got the same address twice within my object / database.
I know why this happens: the first time, that address was mapped to ArrayCollection element "number 1", but the second time it gets mapped to "number 2" (because of its frontend name "index 2").
You could say: "it just fills up the addresses until the backend reaches the frontend index".
But how can I fix this behaviour?
Side note:
This behaviour occurs with AJAX requests, because if you reloaded the page after clicking the save button, the frontend indexes would be rendered again and would correctly match the indexes in the backend.
My suggestion for handling the situation:
Reindex the frontend indexes with the server-side indexes after clicking save. Is this a clean solution / the only solution for my problem?

Yes, this is a problem with Symfony's form collection and it has no easy solution, imho. But I have to ask: why don't you do exactly the same thing a page refresh does? You can refresh only the HTML snippet containing the collection; the HTML for that snippet can come from the server side. Back to your question - yes, reindexing is a good solution unless you want to try writing a custom collection type on your own.
symfony/symfony/issues/7828
There is a similar problem with validation in collections - symfony/symfony/issues/7468.
Well, I think the default collection type and the tutorial in the Symfony docs have some drawbacks. Hope that helps.
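If you go the reindexing route, a minimal jQuery sketch might look like this (the .address-item wrapper class and the user[addresses][N] field-name pattern are assumptions about your form markup):
// After a successful AJAX save, renumber the address sub-forms so that the
// frontend indexes match the server-side ArrayCollection again.
function reindexAddresses($container) {
    $container.find('.address-item').each(function (newIndex) {
        $(this).find(':input').each(function () {
            var name = $(this).attr('name');
            if (name) {
                $(this).attr('name', name.replace(/\[addresses\]\[\d+\]/, '[addresses][' + newIndex + ']'));
            }
            var id = $(this).attr('id');
            if (id) {
                $(this).attr('id', id.replace(/_addresses_\d+_/, '_addresses_' + newIndex + '_'));
            }
        });
    });
}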

I have worked around this issue on the client side by modifying the JavaScript/jQuery code given in the Symfony documentation.
Instead of numbering the new elements by counting the sub-elements, I look at the last element's id and extract its index with a regular expression.
When adding an element, I increment that last index by 1. That way, I never reuse an index.
Here is my code:
// Initialize the default index at 0
var index = 0;
// Look for existing collection fields in the form
var $findinput = $container.find(':input');
// If fields were found, look for the last existing index
if ($findinput.length > 0) {
    // Read the id of the last field
    var myString = $findinput.last().attr('id');
    // Regular expression to extract the number from an id containing letters, hyphens and underscores
    var myRegex = /^[-_A-Za-z]+([0-9]+)[-_A-Za-z]*$/;
    // Execute the regular expression on the last collection field id
    // (exec returns null when nothing matches)
    var test = myRegex.exec(myString);
    // Extract the last index and increment it by 1
    if (test) {
        index = parseInt(test[1], 10) + 1;
    }
}
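Once the index has been computed this way, it is used just like in the cookbook snippet when adding a new element; a rough sketch (assuming $container carries the usual data-prototype attribute with the __name__ placeholder):
// Build the new sub-form from the collection's data-prototype,
// substituting the freshly computed index for the __name__ placeholder.
var prototype = $container.data('prototype');
var newForm = prototype.replace(/__name__/g, index);
// Append the new address sub-form to the collection container.
$container.append(newForm);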

I ran into this problem a couple of times during the past two years. Usually, following the Symfony tutorial How to Embed a Collection of Forms does the job just fine. You need to do a little bit of JavaScript coding to add the "edit/update" functionality, but other than that you should be fine with this approach.
If, on the other hand, you have a really complex form which uses AJAX for validation/saving/calculations/business logic/etc., I've found it's usually better to store the final data in an array in the session. After submitting the form, inside the if($form->isValid()){...} block, you would have
$collection = new ArrayCollection($mySessionPlainArray);
$user->setAddress($collection);
I would like to warn you to be careful with the serialization of your data - you might get some awkward exceptions or misbehavior if you're using entities (see my question).
I'm sorry I can't provide more code, but the solution to this problem sometimes is quite complex.

Related

Record answer randomization in Qualtrics using javascript

I am making a survey in Qualtrics. The survey has a repeating question with six answer choices. The six choices are randomized (in the standard way, no JavaScript). The question is repeated using Loop & Merge, which works great because it's the same question structure over and over (36 times), and I can use the field function to adjust the question and answers for every iteration.
However, one problem I am running into is that Qualtrics does not, by default, support recording the randomization data in the results, i.e. how it has randomized the six answer choices in each iteration. When I use the 'Export Randomized Viewing Order data' option when downloading results, it only shows the answer order of the last time the question was asked. So it seems this value gets overwritten after each iteration.
So now I'm looking to record the answer order for each iteration through JavaScript. However, I haven't found a function that gives the answer order after randomization. I have consulted the Qualtrics JavaScript API and found some functions that seem promising, such as getChoices(), but when I try it, all I get back is the order of the answers without randomization (i.e. just 1,2,3,4,5,6).
Does anyone know a way to record the randomized choice order for each iteration, using javascript or otherwise?
I found a different way to record the loop and merge randomization order.
Create an embedded data field in survey flow. Here we will record the randomization order. I will call the field rand_order.
Add a loop and merge field with a unique number to identify each loop (e.g. 1, 2, 3, 4, 5, ..., n).
Then add the following JavaScript to any page of the looped block.
/* Place your JavaScript below this line */
// "${lm://Field/1}" evaluates to whatever is in Field 1 of the current Loop & Merge iteration.
// You can do the same with embedded data, as seen below.
var questionText = "${lm://Field/1}";
// Read the value of the embedded data field "rand_order", concatenate the current
// loop's identifier to it, and store the result in a variable.
var order = "${e://Field/rand_order}" + "|" + questionText;
// Update the embedded data field "rand_order" with the order variable, which now has the current
// loop's identifier attached, effectively building a string of numbers representing the order.
Qualtrics.SurveyEngine.setEmbeddedData('rand_order', order);
You will get a column named "rand_order" filled with values like "1|5|23|2...|n". You can change the separator to make it more compatible with whatever script you are using to manipulate the data.
Qualtrics already records this information for you. It's just a matter of explicitly asking for it when you download your data. Number 5 on this page has more info, but I'll recount the important bits:
In the “Data & Analysis” tab, click “Export & Import” and then “Export Data”.
In the “Download Data Table” window click “More Options”.
Check the box for “Export viewing order data for randomized surveys”.
I think the thing here is to look at the order of choices in the DOM. Qualtrics provides the getChoiceContainer() method to get the div containing the choices. Here's a snippet I wrote and minimally tested:
//get the div containing the choices, then get all input child elements of that div
var choices = this.getChoiceContainer().getElementsByTagName("input");
//initialize an array for the IDs of the choices
var choiceIDs = [];
//add the ID of each choice to the array
for (var i = 0; i < choices.length; i++) {
    choiceIDs.push(choices[i].id);
}
//get the current choice order from embedded data and add this loop to it.
//Add a | to distinguish between loops.
var choiceOrder = "${e://field/choiceorder}" + choiceIDs.toString() + "|";
//set the embedded data with the new value
Qualtrics.SurveyEngine.setEmbeddedData("choiceorder", choiceOrder);
A couple of notes/caveats:
I only tested this on a basic multiple choice question with radio buttons. It may need to be adjusted for different question types.
I also just got the IDs of the question choices. You could probably modify it pretty easily to get other information, like the label of the choice, or the numeric value it corresponds to.
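For example, a rough variation of the loop above that records the visible choice labels instead of the input IDs (this assumes the standard rendering, where each choice has a label element inside the choice container):
// Get all label elements inside the choice container and record their text
// in the order they appear in the DOM (i.e. the randomized viewing order).
var labels = this.getChoiceContainer().getElementsByTagName("label");
var labelTexts = [];
for (var i = 0; i < labels.length; i++) {
    labelTexts.push(labels[i].textContent || labels[i].innerText);
}
// Append this loop's label order to the embedded data field, separated by "|".
var labelOrder = "${e://field/choiceorder}" + labelTexts.toString() + "|";
Qualtrics.SurveyEngine.setEmbeddedData("choiceorder", labelOrder);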

Compound Query JS SDK paRse.com

I have one class, Messages, with 3 principal fields:
id, FromUser, ToUser
I want a query where ToUser equals a given value and FromUser is not repeated. In other words: get all distinct FromUser values that sent me a message.
Any Idea?
Thanks!
As @Fosco says, "group by" and "select distinct" are not supported yet in Parse.com.
Moreover, keep in mind the restriction on the selection limit (a maximum of 1000 results per query) and the request timeouts (3 seconds in beforeSave events, 7-10 seconds in custom Cloud functions). For a "count" selection, the timeout is the relevant restriction.
I'm working on Parse.com too, and I have changed the structure of my db model a lot, often adding some denormalized (redundant) columns to several classes and keeping them carefully updated for each necessary query.
For cases like yours, I suggest making a custom Cloud function that takes two parameters (say, "myLimit" and "myOffset") for lazy loading, selects the data in slices, and programmatically filters the resulting array of items (with a simple for loop, or with a utility from UnderscoreJS). Start with small slices (e.g. 200-300 records per selection at most) and keep going until a selection returns zero results (end reached). You could count all the items before starting all of this, but the timeout limitation could cause you problems. If this doesn't work as expected, try doing the same thing client side.
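A rough client-side sketch of that lazy-loading idea with the Parse JS SDK (class and field names come from the question; the 200-record slice size, the senders object, and the assumption that FromUser/ToUser are pointers to _User are illustrative):
// Collect the distinct FromUser values of everyone who sent a message to the current user,
// pulling the Messages class down in small slices to stay within the query limit.
var senders = {};   // keyed by FromUser objectId -> FromUser object
var sliceSize = 200;
function fetchSlice(offset) {
    var query = new Parse.Query("Messages");
    query.equalTo("ToUser", Parse.User.current()); // assumes ToUser is a pointer to _User
    query.limit(sliceSize);
    query.skip(offset);
    return query.find().then(function (results) {
        for (var i = 0; i < results.length; i++) {
            var from = results[i].get("FromUser");
            senders[from.id] = from; // duplicates simply overwrite themselves
        }
        if (results.length === sliceSize) {
            return fetchSlice(offset + sliceSize); // keep going until a short or empty slice
        }
        return senders;
    });
}
fetchSlice(0).then(function (distinctSenders) {
    // distinctSenders now holds one entry per distinct FromUser
});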
You could also take a different approach: create another table/class and, for each new message, add the FromUser to that table ONLY if it doesn't already exist for that specific ToUser.
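A hedged Cloud Code sketch of that second approach (the "Senders" class name is hypothetical; the hook fires after each message is saved):
// After every saved message, record the sender in a "Senders" class,
// but only if there is no existing row for that (FromUser, ToUser) pair.
Parse.Cloud.afterSave("Messages", function (request) {
    var message = request.object;
    var query = new Parse.Query("Senders");
    query.equalTo("FromUser", message.get("FromUser"));
    query.equalTo("ToUser", message.get("ToUser"));
    query.first().then(function (existing) {
        if (!existing) {
            var Senders = Parse.Object.extend("Senders");
            var entry = new Senders();
            entry.set("FromUser", message.get("FromUser"));
            entry.set("ToUser", message.get("ToUser"));
            return entry.save();
        }
    });
});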
Hope it helps

Javascript function taking too long to complete?

Below is a snippet of code that I am having trouble with. The purpose is to check for duplicate entries in the database and return a boolean, true or false, in "h". For testing purposes I am returning a true boolean for "h", but by the time the alert(duplicate_count); line gets executed, duplicate_count is still 0, even though the alert for a +1 gets executed.
To me it seems like the function updateUserFields takes longer to execute, so it hasn't finished before execution reaches the alert.
Any ideas or suggestions? Thanks!
var duplicate_count = 0;
for (var i = 0; i < skill_id.length; i++) {
    function updateUserFields(h) {
        if (h) {
            duplicate_count++;
            alert("count +1");
        } else {
            alert("none found");
        }
    }
    var g = new cfc_mentoring_find_mentor();
    g.setCallbackHandler(updateUserFields);
    g.is_relationship_duplicate(resource_id, mentee_id, section_id[i], skill_id[i], active_ind, table);
};
alert(duplicate_count);
There is no reason whatsoever to use client-side JavaScript/jQuery to remove duplicates from your database. Security concerns aside (and there are a lot of those), there is a much easier way to make sure the entries in your database are unique: use SQL.
SQL is capable of expressing the requirement that there be no duplicates in a table column, and the database engine will enforce that for you, never letting you insert a duplicate entry in the first place. The syntax varies very slightly by database engine, but whenever you create the table you can specify that a column must be unique.
Let's use SQLite as our example database engine. The relevant part of your problem is right now probably expressed with tables something like this:
CREATE TABLE Person(
id INTEGER PRIMARY KEY ASC,
-- Other fields here
);
CREATE TABLE MentorRelationship(
id INTEGER PRIMARY KEY ASC,
mentorID INTEGER,
menteeID INTEGER,
FOREIGN KEY (mentorID) REFERENCES Person(id),
FOREIGN KEY (menteeID) REFERENCES Person(id)
);
However, you can enforce uniqueness, i.e. require that any (mentorID, menteeID) pair is unique, by making the pair (mentorID, menteeID) the primary key. This works because you are only allowed one copy of each primary key. The MentorRelationship table then becomes
CREATE TABLE MentorRelationship(
mentorID INTEGER,
menteeID INTEGER,
PRIMARY KEY (mentorID, menteeID),
FOREIGN KEY (mentorID) REFERENCES Person(id),
FOREIGN KEY (menteeID) REFERENCES Person(id)
);
EDIT: As per the comment, alerting the user to duplicates but not actually removing them
This is still much better with SQL than with JavaScript. When you do this in JavaScript, you read one database row at a time, send it over the network, wait for it to come to your page, process it, throw it away, and then request the next one. With SQL, all the hard work is done by the database engine, and you don't lose time by transferring unnecessary data over the network. Using the first set of table definitions above, you could write
SELECT mentorID, menteeID
FROM MentorRelationship
GROUP BY mentorID, menteeID
HAVING COUNT(*) > 1;
which will return all the (mentorID, menteeID) pairs that occur more than once.
Once you have a query like this working on the server (and are also pulling out all the information you want to show to the user, which is presumably more than just a pair of IDs), you need to send this over the network to the user's web browser. Essentially, on the server side you map a URL to return this information in some convenient form (JSON, XML, etc.), and on the client side you read this information by contacting that URL with an AJAX call (see jQuery's website for some code examples), and then display that information to the user. No need to write in JavaScript what a database engine will execute orders of magnitude faster.
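On the client side, a minimal jQuery sketch of that AJAX call (the /duplicates URL and the JSON shape are assumptions about how you expose the query above):
// Ask the server for the list of duplicate (mentorID, menteeID) pairs and show them to the user.
$.getJSON('/duplicates', function (duplicates) {
    // duplicates is assumed to be an array of {mentorID: ..., menteeID: ...} objects
    if (duplicates.length > 0) {
        alert('Found ' + duplicates.length + ' duplicate mentor/mentee pairs.');
    }
});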
EDIT 2: As per the second comment, checking whether an item is already in the database
Almost everything I said in the first edit applies, except for two changes: the schema and the query. The schema should become the second of the two schemas I posted, since you don't want the database engine to allow duplicates. Also, the query should be simply
SELECT COUNT(*) > 0
FROM MentorRelationship
WHERE mentorID = #mentorID AND menteeID = #menteeID;
where #mentorID and #menteeID are the items that the user selected, and are inserted into the query by a query builder library and not by string concatenation. Then, the server will get a true value if the item is already in the database, and a false value otherwise. The server can send that back to the client via AJAX as before, and the client (that's your JavaScript page) can alert the user if the item is already in the database.
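Again purely as a sketch, the client-side check could look like this (the /is-duplicate URL is hypothetical, and mentorID/menteeID are whatever the user selected):
// Ask the server whether this (mentorID, menteeID) pair already exists before adding it again.
$.getJSON('/is-duplicate', { mentorID: mentorID, menteeID: menteeID }, function (isDuplicate) {
    if (isDuplicate) {
        alert('This mentor/mentee relationship already exists.');
    }
});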

live updating total num of results while user fills in the form

I am currently working on a project and the client is asking for a feature which is going to require the help of JavaScript, which I'm no expert with; I can do the basics, but I don't really know where to start with this. On the site there is a form (like an advanced search), and on the right it should show the total number of results. It needs to keep updating as the form is filled in, so as the user goes through the form the total number of results is updated to reflect their input.
I thought maybe it could be done with AJAX: passing along the contents/values of the inputs, performing a query, then passing back the total number of results. But how would it work for every input? Isn't that just going to be overkill? I tried to tell the client it would be a strain on the server (which surely it would be), but they seem dead-set on having it.
Any help or techniques would be very useful, or if you have come across something like this before please do let me know.
Thanks
Using jQuery you could attach a function to the blur() event of each input control that is used in the 'advanced search' to perform the ajax call to the server and get the current number of results.
This way the ajax server call will only fire each time an input field is completed and the focus moves elsewhere.
Of course if you have many fields this will result in a call each time a field is completed or amended and the focus is moved. It would also be wise to ensure that the field value has changed before making the ajax call. Something along the lines of:
var tempVal = "";
// Each field that is used in the advanced search will need to have the
// advancedSearchInput class
$('.advancedSearchInput').focus(function() {
    tempVal = $(this).val();
});
$('.advancedSearchInput').blur(function() {
    if ($(this).val() !== tempVal) {
        // The value has changed: get the advanced search values and make the ajax call
    }
});
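Inside that branch you would then serialize the search fields and ask the server for the current count; a minimal sketch (the form selector, the /search/count URL and the JSON field name are assumptions):
// Send the current state of the advanced search form to the server
// and update the result counter with whatever comes back.
$.getJSON('/search/count', $('#advancedSearchForm').serialize(), function (response) {
    $('#resultCount').text(response.count + ' results');
});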
I tend to avoid this sort of thing because, as you say, it can be a strain on the server. Here are a few things I do to help reduce that risk:
Only perform the search/count when the user has entered at least 3 characters
Cache search terms and their resultant count in a memory store like Redis or memcached
Use MySQL full-text searching if you are running a MySQL DB
Another option is to present fuzzy numbers. You get the total number of records when the page loads, and then as the user types you randomly take a chunk off the total. Once the user's term is more specific, or they actually click the submit button, you issue a real AJAX request to get the actual count.
So the user would see something like:
Search: ""
Response: There are about 3097 records relevant to your search
Search: "Ap"
Response: There are about 1567 records relevant to your search
Search: "Apple"
issues ajax request
Response: There are 542 records relevant to your search
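A rough sketch of the fuzzy-count idea described above (the selectors, the /search/count URL and the 3-character threshold are all placeholders):
// Total number of records, rendered server-side or fetched once when the page loads.
var totalRecords = 3097;
$('#searchBox').on('keyup', function () {
    var term = $(this).val();
    if (term.length < 3) {
        // Not enough input yet: show a fuzzy estimate instead of hitting the server.
        var fuzzy = Math.max(0, totalRecords - Math.floor(Math.random() * totalRecords * term.length / 3));
        $('#resultCount').text('There are about ' + fuzzy + ' records relevant to your search');
    } else {
        // Specific enough: ask the server for the real count.
        $.getJSON('/search/count', { q: term }, function (response) {
            $('#resultCount').text('There are ' + response.count + ' records relevant to your search');
        });
    }
});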

How to create efficient map system?

I have a map system (grid) for my website. I have defined 40000 'fields' on a grid. Each field has an XY value (with x in 1-200 and y in 1-200) and a unique identifier: fieldid (1-40000).
I have a viewable area of 16x9 fields. When the user visits website.com/fieldid/422, it displays 16x9 fields starting with fieldid 422 in the upper-left corner. This follows the XY system, which means the field in the second row, right below #422, is #622.
The user should be able to navigate Up, Down, Left and Right (meaning increment/decrement the X or Y value accordingly). I have a function which converts XY values to fieldids and vice-versa.
Everything good so far, I can:
Reload the entire page when a user clicks a navigate button (got this)
Send an ajax-request and get a jsonstring with the new 16x9 fields (got this)
But I want to build in some sort of caching system so that the data sent back from the server is minimized after the first load. This would probably mean only sending new 'rows' or 'columns' of fields and storing them in some sort of JavaScript multidimensional array bigger than the 16x9 used for displaying. But I can't figure it out. Can somebody assist?
I see two possible solutions.
1 If you use ajax to get new tiles and do not reload the entire page very often, you may just use an object that holds the contents of each tile, using unique tile ids as keys, like:
var mapCache = {
    '1' : "tile 1 data",
    '2' : "tile 2 data"
    //etc.
};
When the user requests new tiles, first check whether you already have them in your object (you know which tiles are needed for a given area), then download only what you need and add the new key/value pairs to the cache. Obviously all cached data will disappear as soon as the page is reloaded by the user.
2 If you reload the page for each request, you might split your tiles into separate JavaScript "files". It doesn't really matter how this is implemented on the server - static files like tile1.js, tile2.js etc., or a dynamic script (probably with some server-side cache) like tile.php?id=1, tile.php?id=2 etc. What's important is that the server sends proper HTTP headers and makes it possible for the browser to cache these requests. So when a page containing some 144 tiles is requested, you have 144 <script /> elements, each one containing data for one tile, and each one will be stored in the browser's cache. This solution makes sense only if there's a lot of data for each tile and the data doesn't change on the server very often, and/or there's a significant cost to tile generation/transfer.
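Expanding on option 1 above, a minimal sketch of the cache-first lookup (the loadTiles helper, its /tiles URL and the response shape are hypothetical):
var mapCache = {}; // tile id -> tile data, filled lazily
// Given the tile ids needed for the current 16x9 viewport, serve what is already cached
// and fetch only the missing ones from the server.
function loadTiles(neededIds, onReady) {
    var missing = [];
    for (var i = 0; i < neededIds.length; i++) {
        if (!(neededIds[i] in mapCache)) {
            missing.push(neededIds[i]);
        }
    }
    if (missing.length === 0) {
        onReady(mapCache);
        return;
    }
    // Hypothetical endpoint that returns an object mapping tile id -> tile data.
    $.getJSON('/tiles', { ids: missing.join(',') }, function (tiles) {
        for (var id in tiles) {
            mapCache[id] = tiles[id];
        }
        onReady(mapCache);
    });
}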
You could just have an array of 40,000 references. Basically, empty array elements don't take up a lot of room until you actually put something in them (it's one of the advantages of a dynamically typed language). JavaScript doesn't know if you are going to put an int or an object into an array element, so it doesn't allocate the elements until you put something in them. So to summarize, just put them in an array - it's that simple!
Alternatively, if you don't want the interpreter to allocate 40,000 nulls at the start, you could use a dictionary approach, with the keys being the 1-in-40,000 array indices. Then the unused elements don't even get allocated. Though if you are eventually going to fill a substantial portion of the map, the dictionary approach is much less efficient.
Have a single associative array, which initially starts out empty.
If the user visits, say, grid 32x41y, you set a value for the array like this:
var visitedGrids = {};
if (!visitedGrids['32']) {
    visitedGrids['32'] = {};
}
visitedGrids['32']['41'] = data;
(This is pseudo-code; I haven't checked the syntax.)
Then you can check to see if the user has visited the appropriate grid coordinates by seeing if there is a value in the associative array.
