Ext.data.Store, JavaScript Arrays and Ext.grid.ColumnModel - javascript

I am using Ext.data.Store to call a PHP script which returns a JSON response with some metadata about fields that will be used in a query (unique name, table, field, and user-friendly title). I then loop through each of the Ext.data.Record objects, place the data I need into an array (this_column), push that array onto the end of another array (columns), and eventually pass the result to an Ext.grid.ColumnModel object.
The problem I am having is this: no matter which query I test against (I have a number of them, varying in size and complexity), the columns array always works as expected up to columns[15]. At columns[16], that index and all previous ones are filled with the value of columns[15]. This behavior continues until the loop reaches the end of the Ext.data.Store, at which point the entire array consists of the same value.
Here's some code:
columns = [];
this_column = [];
var MetaData = Ext.data.Record.create([
    {name: 'id'},
    {name: 'table'},
    {name: 'field'},
    {name: 'title'}
]);
// Query the server for metadata for the query we're about to run
metaDataStore = new Ext.data.Store({
    autoLoad: true,
    reader: new Ext.data.JsonReader({
        totalProperty: 'results',
        root: 'fields',
        id: 'id'
    }, MetaData),
    proxy: new Ext.data.HttpProxy({
        url: 'index.php/' + type + '/' + slug
    }),
    listeners: {
        'load': function () {
            metaDataStore.each(function(r) {
                this_column['id'] = r.data['id'];
                this_column['header'] = r.data['title'];
                this_column['sortable'] = true;
                this_column['dataIndex'] = r.data['table'] + '.' + r.data['field'];
                // This displays valid information through the entire process
                console.info(this_column['id'] + ' : ' + this_column['header'] + ' : ' + this_column['sortable'] + ' : ' + this_column['dataIndex']);
                columns.push(this_column);
            });
            // This goes nuts at columns[15]
            console.info(columns);
            gridColModel = new Ext.grid.ColumnModel({
                columns: columns
            });
        }
    }
});

Okay, since the this_column array was coming out correctly on each pass but the columns array was not, I figured it must be an issue with the push().
After a bit more toying with it, I altered the code to reset the this_column array on each iteration of the loop, and that seems to have fixed the issue...
metaDataStore.each(function(r) {
    this_column = [];
    this_column['id'] = r.data['id'];
    this_column['header'] = r.data['title'];
    this_column['sortable'] = true;
    this_column['dataIndex'] = r.data['table'] + '.' + r.data['field'];
    columns.push(this_column);
});

I see you've already found something that works, but just to offer some advice for the future: this is much easier if you use a JSON store and column model directly, instead of performing the intermediate steps by hand.
I'm not sure if you're using a grid or a dataview, but the concept is pretty much the same for both of them. If you have to do a bit of data customization, instead of doing it by hand here you can do it in a prepareData callback function.
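For example, here is a minimal sketch of that idea against the metadata store from the question, assuming an Ext 3.x DataView (the template, the itemSelector, and the derived fullField name are made up for illustration):
var view = new Ext.DataView({
    store: metaDataStore,
    itemSelector: 'div.col',
    tpl: new Ext.XTemplate('<tpl for="."><div class="col">{fullField}: {title}</div></tpl>'),
    // prepareData lets you derive per-record display values
    // without hand-building intermediate arrays
    prepareData: function (data) {
        data.fullField = data.table + '.' + data.field; // hypothetical derived field
        return data;
    }
});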

Because you first use the variable this_column in the global context (at the top of your example), it becomes a global variable, and every push(this_column) pushes a reference to that same object, so each pass of the loop overwrites what every element points at. You should instead instantiate each column definition as an object literal (split into multiple lines for readability).
metaDataStore.each(function(r) {
    columns.push({
        id: r.data['id'],
        header: r.data['title'],
        sortable: true,
        dataIndex: r.data['table'] + '.' + r.data['field']
    });
});
Or if you really wanted to use a variable, you could just do this to make sure it's a local variable
metaDataStore.each(function(r) {
    var this_column = {};
    ...
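To see why the original loop misbehaved, here is a minimal standalone sketch (not from the original post) of the same aliasing problem; pushing one shared object means every array element points at it:
var shared = {};
var list = [];
for (var i = 0; i < 3; i++) {
    shared.value = i;   // mutates the one shared object
    list.push(shared);  // pushes a reference, not a copy
}
// every element shows the last value written:
console.info(list[0].value, list[1].value, list[2].value); // 2 2 2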

Related

How to get an array and use in datasource

So I have an ajax call that gathers an array:
function test(){
    $.ajax({
        url: '/whatever/here',
        data: data,
    }).done(function(newData){
        var getArray = newData.SubData;
        newResults.push(getArray);
    });
}
var newResults = [];
My issue is that I have to save the array that's in the ajax call and use it outside of the function. By pushing it into a new array, it creates another level of objects. So when I do a datasource call:
function standardCategoryDropDownEditor(container, options) {
    $("<input data-bind='value:" + options.field + "'/>")
        .appendTo(container)
        .kendoDropDownList({
            dataSource: newResults,
            dataTextField: "Value",
            dataValueField: "Key"
        });
}
this produces nothing, as there isn't anything on the first level, since it's now an object that has more objects in it. So how do I either go down a level to get the data, or get it to be on the first level initially?
Hopefully I'm not misunderstanding, but if newData.SubData is an array whose elements you want at the top level of newResults, then pushing it adds the whole array as a single element, creating an array of arrays. Use concat instead of push to append its elements instead.
So
var getArray = newData.SubData;
newResults.push(getArray);
becomes
var getArray = newData.SubData;
newResults = newResults.concat(getArray);
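A quick illustration of the difference, with made-up data:
var nested = [1, 2];
nested.push([3, 4]);              // nested is now [1, 2, [3, 4]]
var flat = [1, 2].concat([3, 4]); // flat is [1, 2, 3, 4]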
I figured it out. It's as simple as giving it an index:
datasource = newResults[0];
This selects the first object, which gives you the rest of the objects.

vuejs - Caching search results of a table based on multiple filters

I'm using vuejs for this project, but this problem is not necessarily Vue-specific; if there is a Vue way, though, I would prefer that.
I'm building a table that enables the user to have per-column filters (in this case simple inputs). The columns are dynamic, and so is the amount of data (thousands of rows, but fewer than 100,000 entries).
// example data
var columns = ['id', 'title', 'date', 'colour']
var data = [{ id: 1, title: 'Testentry 1', date: '2017-02-21T07:10:55.124Z', colour: 'green'}]
Here is the problem: I'm iterating over the columns, checking if a search input exists, and if so, filtering the data based on the search query. In the case of the ID, the time complexity is O(n). If I now search for a title additionally, I can reuse the result of the first search query, dramatically reducing the amount of data that has to be looked at.
The search queries are stored in an object, search, and the filtered data is a computed property that gets updated whenever search changes. The way that works, though, means that if I change the search query for title, it re-evaluates the query even for the ID, although that query didn't change.
This would require some kind of caching of the data filtered per column, so that only the subsequent columns need to be queried again.
Edit: added code for the filtering:
filteredRows () {
    var rows = this.data
    for (var i = 0; i < this.columns.length; i++) {
        var column = this.columns[i].name
        var search = this.tSearch[column]
        // skip columns without an active search query
        // (|| rather than &&, so an undefined query doesn't throw)
        if (!search || search.length === 0) continue
        console.log(column + ': ' + ' (' + search + ') -> ' + rows.length)
        rows = _.filter(rows, (row) => {
            var value = '' + row[column]
            return value.search(search) > -1
        })
    }
    return rows
}
Just a suggestion, but did you try using a watcher to get the old and new values of the input?
data: function() {
    return {
        propertyToWatch: 'something'
    }
},
computed: {
    // ...
},
watch: {
    'propertyToWatch': function (val, oldVal) {
        console.log(oldVal); // logs old value
        console.log(val); // logs current value
        // here you can call a function, pass both of these args and detect the diff
    }
},
// ...
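Building on that, a rough, untested sketch of the per-column caching the question describes, assuming the data/columns/tSearch structure from the question (the cachedRows name is hypothetical and would need to be declared in data): when the query for column k changes, the watcher calls filterFrom(k), which starts from the rows cached for column k - 1 and recomputes only from there.
methods: {
    // recompute the filters starting at the column whose query changed
    filterFrom (changedIndex) {
        var rows = changedIndex === 0
            ? this.data
            : this.cachedRows[changedIndex - 1]
        for (var i = changedIndex; i < this.columns.length; i++) {
            var column = this.columns[i].name
            var search = this.tSearch[column]
            if (search && search.length > 0) {
                rows = rows.filter(function (row) {
                    return ('' + row[column]).search(search) > -1
                })
            }
            this.cachedRows[i] = rows // cache for the columns after this one
        }
        return rows
    }
}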

Pg-promise performance boost : Multiple inserts with multiple update parameters

I am implementing Vitaly's pg-promise performance patterns, as advised here and there.
Here is my code:
for (var i = 0; i < chunkedData.length; i++) {
    var insertData = chunkedData[i].map(function (d) {
        return {
            application_id: d.application_id,
            country_id: d.country_id,
            collection_id: collectionId
        };
    });
    // Would need to make a loop here, thus turning the result into an array
    // (j stands for the index such a loop would provide)
    var updateData = {
        application_id: chunkedData[i][j].application_id,
        country_id: chunkedData[i][j].country_id,
        collection_id: collectionId
    };
    var query = h.insert(insertData, cs) +
        " ON CONFLICT ON CONSTRAINT application_average_ranking_application_id_country_id_colle_key DO UPDATE SET " +
        h.sets(updateData, cs);
    db.none(query)
        .then(data => {
            console.log('success');
        })
        .catch(error => {
            console.log('insert error : ' + error);
        });
}
My problem is that insertData is an array of objects, and the library's insert helper builds an insert request using that array, as specified in the pg-promise API, whereas updateData must be a single object.
I would like that when :
ON CONFLICT ON CONSTRAINT constraintName DO UPDATE
is triggered, the update values match the corresponding object in the insertData array.
How can I work around that problem?
I've tried to put everything in a loop, but it leaks memory like crazy, and well, I lose the benefits of the pattern...
EDIT:
I want my query to be the equivalent of:
var inserts = data.map(entry => {
return t.none(" INSERT INTO application_average_ranking (application_id,country_id,collection_id) VALUES ($1,$2,$3)" +
" ON CONFLICT ON CONSTRAINT application_average_ranking_application_id_country_id_colle_key" +
" DO UPDATE SET country_id=$2,collection_id=$3",
[entry.application_id,entry.country_id,collectionId]
);
});
In that case when Update is called, the parameters refer to values originally proposed for insertion.
Your task requires static SQL to implement that kind of logic, using EXCLUDED as the special table reference for the rows proposed for insertion:
var sqlConflict = " ON CONFLICT ON CONSTRAINT" +
" application_average_ranking_application_id_country_id_colle_key" +
" DO UPDATE SET application_id = excluded.application_id" +
" country_id = excluded.country_id, collection_id = excluded.collection_id";
var insertData = chunkedData.map(function (d) {
    return {
        application_id: d.application_id,
        country_id: d.country_id,
        collection_id: collectionId
    };
});
var query = h.insert(insertData, cs) + sqlConflict;
db.none(query)
    .then(data => {
        console.log('success');
    })
    .catch(error => {
        console.log('insert error : ' + error);
    });
UPDATE
And in case your static list of excluded fields is too long and you want to simplify it, you can always rely on the flexibility of the helpers methods:
// or pull them from an object using `Object.keys(obj)`:
var cols = ['application_id', 'country_id', 'collection_id'];
var sets = pgp.helpers.sets({}, cols.map(c=> ({
name: c, mod: '^', def: 'excluded.' + pgp.as.name(c)
})));
console.log(sets);
//=> "application_id"=excluded."application_id","country_id"=excluded."country_id",
// "collection_id"=excluded."collection_id"
// or its simple JavaScript equivalent:
var sets = cols.map(c=> {
var name = pgp.as.name(c);
return name + '=excluded.' + name;
}).join();
UPDATE
With version 7.3.0 of the library and later, you should use the assignColumns method to generate all of the excluded sets, like this:
cs.assignColumns({from: 'EXCLUDED'})
//=> "application_id"=EXCLUDED."application_id","country_id"=EXCLUDED."country_id","collection_id"=EXCLUDED."collection_id"
or, if you want to skip application_id, then you can do:
cs.assignColumns({from: 'EXCLUDED', skip: 'application_id'})
//=> "country_id"=EXCLUDED."country_id","collection_id"=EXCLUDED."collection_id"
See ColumnSet.assignColumns
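Put together, a sketch of the final query using that method (assuming cs is the same ColumnSet used for the insert):
var query = pgp.helpers.insert(insertData, cs) +
    ' ON CONFLICT ON CONSTRAINT application_average_ranking_application_id_country_id_colle_key' +
    ' DO UPDATE SET ' + cs.assignColumns({from: 'EXCLUDED', skip: 'application_id'});
db.none(query)
    .then(() => console.log('success'))
    .catch(error => console.log('insert error : ' + error));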
Don't use h.sets(). Just write the conflict_action yourself. The handbook says:
The SET and WHERE clauses in ON CONFLICT DO UPDATE have access to the existing row using the table's name (or an alias), and to rows proposed for insertion using the special excluded table.
Source: Postgres INSERT documentation.
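For instance, a minimal hand-written conflict_action along those lines, reusing the constraint and column names from the question (a sketch, not the poster's exact code):
var sqlConflict =
    " ON CONFLICT ON CONSTRAINT application_average_ranking_application_id_country_id_colle_key" +
    " DO UPDATE SET country_id = EXCLUDED.country_id, collection_id = EXCLUDED.collection_id";
db.none(h.insert(insertData, cs) + sqlConflict);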

IndexedDB: Can you use an array element as a key or an index?

Consider the following object store, with the domain key set as the keyPath
var tags = [
//codes: 0 - markdown wrap tag
// 1 - HTML wrap tag
// 2 - single tag
{ domain: "youtube",
bold:["*",0],
strikethrough:["-",0],
italic:["_",0]
},
{ domain: "stackoverflow",
bold:["<strong>",1],
italic:["<em>",1],
strikethrough:["<del>",1],
superscript:["<sup>",1],
subscript:["<sub>",1],
heading1:["<h1>",1],
heading2:["<h2>",1],
heading3:["<h3>",1],
blockquote:["<blockquote>",1],
code:["<code>",1],
newline:["<br>",2],
horizontal:["<hr>",2]
}
];
The above code works fine and lets me do look-ups easily and efficiently. However, there are many cases where two objects in the store are completely identical except for their domain attribute.
For example, I want to add objects for all of the Stack Exchange sites to the store, and all of those objects would be equal to the one for StackOverflow.
So, rather than create many separate objects, I want to do something like this:
var tags = [
//codes: 0 - markdown wrap tag
// 1 - HTML wrap tag
// 2 - single tag
{ domain: ["youtube"],
bold:["*",0],
strikethrough:["-",0],
italic:["_",0]
},
{ domain: ["stackoverflow","stackexchange",...],
bold:["<strong>",1],
italic:["<em>",1],
strikethrough:["<del>",1],
superscript:["<sup>",1],
subscript:["<sub>",1],
heading1:["<h1>",1],
heading2:["<h2>",1],
heading3:["<h3>",1],
blockquote:["<blockquote>",1],
code:["<code>",1],
newline:["<br>",2],
horizontal:["<hr>",2]
}
];
Would it be possible to use a KeyGen rather than a keyPath and set up some kind of index that took a value and searched for it in the arrays pointed to by the domain key?
Or would I have to use a cursor each time I want to do a look up?
Some potentially helpful references are:
https://developer.mozilla.org/en-US/docs/Web/API/IndexedDB_API/Basic_Concepts_Behind_IndexedDB
http://www.w3.org/TR/IndexedDB/#key-path-construct
https://developer.mozilla.org/en-US/docs/Web/API/IndexedDB_API/Using_IndexedDB
The solution is to use an index with the multiEntry key property set to true.
See this link (thanks @kyaw Tun):
Each index also has a multiEntry flag. This flag affects how the index behaves when the result of evaluating the index's key path yields an Array. If the multiEntry flag is false, then a single record whose key is an Array is added to the index. If the multiEntry flag is true, then the one record is added to the index for each item in the Array. The key for each record is the value of respective item in the Array.
Armed with this index, a specific keyPath is no longer necessary, so you can just use a keyGen for simplicity.
So, to create the database:
request.onupgradeneeded = function(event)
{
    var db = event.target.result;
    var objectStore = db.createObjectStore("domains", { autoIncrement: true });
    objectStore.createIndex("domain", "domain", { unique: true, multiEntry: true });
    for (var i in tags)
    {
        objectStore.add(tags[i]);
        console.log("added " + tags[i]["domain"] + " to the DB");
    }
};
and an example of using a domain to query for an object:
var objectStore = db.transaction("domains").objectStore("domains");
var query = objectStore.index("domain").get(queryURL);
query.onsuccess = function(event){...};
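With the array-valued domain from the second example, each item in the array becomes its own index key, so any of the listed domains finds the same record. A small sketch, assuming the same db handle:
var index = db.transaction("domains").objectStore("domains").index("domain");
// both lookups resolve to the one record whose domain array contains the key
index.get("stackoverflow").onsuccess = function(event) {
    console.log(event.target.result); // the shared tag object
};
index.get("stackexchange").onsuccess = function(event) {
    console.log(event.target.result); // the same object
};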

JavaScript Reformatting JSON arrays

I am relatively new to JSON notation, and have run into an issue when attempting to reformat some data. The current format contained in the database needs to be modified to a new format for import into a project timeline graph.
Here is the current JSON format:
[
{
"name":"5-HP-N/A-N/A-F8",
"node":{
"name":"5",
"id":14
},
"timeline":{
"epc":null,
"m1":null,
"m2":null,
"m3":1554087600000,
"m4":1593572400000,
"m5":1625108400000,
"m6":1641006000000,
"m7":1656644400000
},
"fab":{
"name":"F8",
"id":1
}
},
However, in order to display in the graph, I need the following format:
{
    'start': new Date(value from epc, or first non-null milestone),
    'end': new Date(value from m1 or first non-null milestone), // end is optional
    'content': 'label from start Date milestone',
    'group': 'value from name field above, e.g. 5-HP',
    'classname': 'value from start Date milestone'
}
I am trying to write a function to accomplish this. Only epc, m1, or m2 may be null, but that condition must be checked for, to determine whether an event range should be created and where it should end. What would be the best way to reformat this JSON data (preferably from an external JSON file)?
Edit: Thanks for all the help, I see how this is working now! I believe I didn't explain very well the first time, though: I actually need multiple class items per "group".
The end result is that these will display inline on a timeline graph 'group' line, so I am trying to figure out how to create multiple new objects per array element shown above.
So technically, the first one will have start date = m3 and end date = m4. Then the next object would have the same group as the first (5-HP...), with start date = m4 and end date = m5, etc. This continues until m7 (always an end date but never a start date) is reached.
This is why the loop is not so simple, as there are many conditions to check.
See a working fiddle here: http://jsfiddle.net/K37Fa/
Your input data seems to be an array, so I built a loop around that. If not, just see this fiddle, where the input data is a simple object: http://jsfiddle.net/K37Fa/1/
var i
    , result = []
    , current
    , propCounter
    , firstMileStone
    , content = [ { "name":"5-HP-N/A-N/A-F8", "node":{ "name":"5", "id":14 }, "timeline":{ "epc":null, "m1":null, "m2":null, "m3":1554087600000, "m4":1593572400000, "m5":1625108400000, "m6":1641006000000, "m7":1656644400000 }, "fab":{ "name":"F8", "id":1 } }]
    // get the first non-null milestone in a function
    , getMileStone = function(obj) {
        for (propCounter = 1; propCounter <= 7; propCounter++) {
            // if m1, m2 and so on exists, return that value
            if (obj.timeline["m" + propCounter]) {
                return {key: "m" + propCounter, value: obj.timeline["m" + propCounter]};
            }
        }
    };
// loop over the content array (seems like you have an array of objects)
for (i = 0; i < content.length; i++) {
    current = content[i];
    firstMileStone = getMileStone(current);
    result.push({
        // note: epc and m1 live on the timeline object, not on the row itself
        'start': new Date(current.timeline.epc || firstMileStone.value),
        'end': new Date(current.timeline.m1 || firstMileStone.value),
        'content': firstMileStone.key,
        'group': current.name,
        'classname': firstMileStone.value
    });
}
EDIT:
getMileStone is just a helper function, so you can call it with whatever you want, e.g. content[i + 1]:
secondMileStone = getMileStone(content[i + 1])
You should just check that you are not already at the last element of your array; if you are, content[i + 1] is undefined and the helper function should return undefined.
You could then use the firstMileStone as a fallback:
secondMileStone = getMileStone(content[i + 1]) || firstMileStone;
See the updated fiddle (including the check in the getMileStone helper function): http://jsfiddle.net/K37Fa/6/
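Following the edit to the question (one timeline entry per consecutive milestone pair, m3 to m4, m4 to m5, and so on, with m7 always an end date and never a start date), here is a sketch along the same lines, reusing the content and result variables from above:
for (i = 0; i < content.length; i++) {
    current = content[i];
    // collect the non-null milestones in order
    var stones = [];
    for (var m = 1; m <= 7; m++) {
        if (current.timeline['m' + m]) {
            stones.push({ key: 'm' + m, value: current.timeline['m' + m] });
        }
    }
    // one entry per consecutive pair: m3 -> m4, m4 -> m5, ...
    for (var s = 0; s < stones.length - 1; s++) {
        result.push({
            'start': new Date(stones[s].value),
            'end': new Date(stones[s + 1].value),
            'content': stones[s].key,
            'group': current.name,
            'classname': stones[s].key
        });
    }
}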
