I am using dhtmlxGrid and calling the function
mygrid.load(url, "json").
The url points to a file whose returned data has the following format:
data = {
rows: [{
id: 1001,
data: ["200", "The Rainmaker", "John Grisham", "101", "0" ]
}, {
id: 1002,
data: ["1002", "A Time to Kill", "John", "90", "110" ]
}]
}
This mygrid.load is called repeatedly, in a loop, every few seconds.
The first time, the data is loaded into the grid correctly. On subsequent calls, changes to existing rows are also updated in the grid correctly.
The problem is that new records and deletions are not reflected in the grid. Any idea how this can be done with this method?
I am trying to find the equivalent of mygrid.updateFromXML, but for JSON, with inserts/deletes:
http://docs.dhtmlx.com/doku.php?id=dhtmlxgrid:api_method_dhtmlxgridobject_updatefromxml (but for JSON)
To reload the data you need to use something like the following:
grid.clearAll(); //clear old data first
grid.load(url, "json");
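If the grid must track the server, the same pair of calls can simply run on a timer. A minimal sketch, assuming the url and "every few seconds" interval from the question:

setInterval(function () {
    mygrid.clearAll();        // drop all rows so deletions disappear
    mygrid.load(url, "json"); // reload the full dataset, picking up inserts and updates
}, 5000);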
I'm using select2's remote data loading to render results (50 at a time) from an API. The API's response might contain duplicate values on any page.
I have tried formatting the response, but unfortunately that method only has access to the current page's data.
Below is my code,
jQuery('#items').select2({
minimumInputLength : 2,
placeholder : '-- Select Items --',
ajax : {
url : '/api/v1/items',
quietMillis : 200,
dataType : 'json',
data : function (term, page) {
return {
term : term,
page : page,
page_limit : 50
};
},
results : function(data, page) {
// Here I'm getting only the current page's data. How can I get previous pages' data to check for duplicate values?
}
}
});
So, how can I filter the response and eliminate duplicate values by checking against the data fetched so far?
Any help would be appreciated.
It would be better if you posted an example of your code. Let's say you have some data with duplicated entries:
var rawData = [
{
id: 'AL',
name: 'Alaska'
},
{
id: 'GE',
name: 'Georgia'
},
{
id: 'WY',
name: 'Wyoming'
},
{
id: 'GE',
name: 'Georgia'
}
];
function clearDuplicates(data) {
    var temp = {};
    // key each record by its id; a later duplicate simply overwrites the earlier one
    for (var i = 0; i < data.length; i++) {
        temp[data[i]['id']] = data[i];
    }
    // the object's values are now one record per unique id
    return Object.values(temp);
}
var clearData = clearDuplicates(rawData);
console.log(clearData);
See the output: the duplicated 'Georgia' entry is now a single record. There are many ways to eliminate duplicates; this is just one simple example.
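For instance, on newer runtimes the same de-duplication fits in one expression, building a Map keyed by id (later duplicates overwrite earlier ones) and taking its values:

var clearData = Array.from(new Map(rawData.map(item => [item.id, item])).values());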
UPDATE:
If you use pagination (infinite scroll) in Select2, every page request is sent separately and you have to process the result data and eliminate duplicates manually. It can be done in the processResults parameter. (See example)
In that case, the easiest way would be:
Handle every page request in processResults
Store all results in a global variable
Eliminate duplicates as described in the example above
Return desired result
Return:
return {
results: <YOUR_FILTERED_DATA>,
pagination: {
//pagination params
}
}
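Putting those steps together, a minimal sketch might look like the following. It assumes Select2 4.x (where processResults replaces the old results callback and delay replaces quietMillis); the seenIds cache and the response fields results/more are assumptions about your API, not part of the original code:

var seenIds = {}; // ids returned so far, across all pages

jQuery('#items').select2({
    minimumInputLength: 2,
    placeholder: '-- Select Items --',
    ajax: {
        url: '/api/v1/items',
        delay: 200,
        dataType: 'json',
        data: function (params) {
            var page = params.page || 1;
            if (page === 1) seenIds = {}; // new search term: reset the cache
            return { term: params.term, page: page, page_limit: 50 };
        },
        processResults: function (data, params) {
            // keep only items whose id was not seen on a previous page
            var fresh = data.results.filter(function (item) {
                if (seenIds[item.id]) return false;
                seenIds[item.id] = true;
                return true;
            });
            return {
                results: fresh,
                pagination: { more: data.more } // assumed "has more pages" flag
            };
        }
    }
});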
There is a static array which should be dynamically updated based upon a click event.
I am able to get the dynamic value in array format using Ajax, but I am having difficulty assigning it to the existing variable.
The Ajax call returns the array below:
echo json_encode($arry);
// ["2","1","1","0","1","0"]
$.post( "ajaxcall.php", { ids: id })
.done(function( returnedArray){
//returnedArray looks like ["2","1","1","0","1","0"]
datasets: [{
//data: [10,12,33,50,12,34]
data: returnedArray
}]
},
But after the click event, returnedArray is not interpreted, so the value is never set.
Basically, the result should look like this:
data: ["2","1","1","0","1","0"]
Use the assignment operator (=) instead of a colon:
datasets = [{
//data: [10,12,33,50,12,34]
data: returnedArray
}]
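In context, inside the .done callback that might look like the sketch below. The datasets shape suggests a Chart.js config, so myChart here is a hypothetical Chart.js instance; the .map(Number) conversion is likewise an assumption, turning the returned strings into numbers:

$.post("ajaxcall.php", { ids: id })
    .done(function (returnedArray) {
        // assign to the existing config instead of writing an object-literal fragment
        myChart.data.datasets[0].data = returnedArray.map(Number); // "2" -> 2
        myChart.update(); // redraw with the new values
    });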
I have a situation where a user can upload a CSV file. This CSV file contains a lot of data, but I am only interested in 2 columns (ID and Date). At the moment, I am parsing the CSV using Papaparse:
Papa.parse(ev.data, {
delimiter: "",
newline: "",
quoteChar: '"',
header: true,
error: function(err, file, inputElem, reason) { },
complete: function (results) {
this.parsed_csv = results.data;
}
});
When this is run, this.parsed_csv holds an array of objects keyed by field name. So if I JSON.stringify it, the output is something like this:
[
{
"ID": 123456,
"Date": "2012-01-01",
"Irrelevant_Column_1": 123,
"Irrelevant_Column_2": 234,
"Irrelevant_Column_3": 345,
"Irrelevant_Column_4": 456
},
...
]
So my main question is: how can I get rid of the columns I don't need, and produce a new CSV containing just the ID and Date columns?
Thanks
One thing I realised: is there a way to use dynamic variables? For instance, I am letting users select the columns I want to map. Now I need to do something like this:
let ID = this.selectedIdCol;
this.parsed_csv = results.data.map(element => ({ID: element.ID, Date: element.Date}));
It is saying that ID is unused, however. Thanks
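For what it's worth, ID: in an object literal is a literal key, not a reference to the variable, which is why the linter reports ID as unused. A computed property name picks the key up from the variable instead. A sketch, where selectedDateCol is a hypothetical second user selection:

let idCol = this.selectedIdCol;     // e.g. "ID", chosen by the user
let dateCol = this.selectedDateCol; // hypothetical date-column choice
this.parsed_csv = results.data.map(element => ({
    [idCol]: element[idCol],        // [expr] computes the key at runtime
    [dateCol]: element[dateCol]
}));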
let data = [
{
"ID": 123456,
"Date": "2012-01-01",
"Irrelevant_Column_1": 123,
"Irrelevant_Column_2": 234,
"Irrelevant_Column_3": 345,
"Irrelevant_Column_4": 456
},
...
]
Just produce the result by using the following code:
data = data.map(element => ({ID: element.ID, Date: element.Date}))
Now you have the desired columns; please generate a new CSV from these columns.
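For that last step, Papa Parse's own unparse() can rebuild CSV text from the reduced objects. A minimal sketch:

// data now holds only { ID, Date } objects; unparse() emits CSV with a header row
let csv = Papa.unparse(data);
// e.g. "ID,Date\r\n123456,2012-01-01\r\n..."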
As Serrurier pointed out above, you should use the step/chunk callback to alter the data rather than mapping it after the parse, since by then the whole file's data is already in memory.
Papa.parse(file, {
    skipEmptyLines: true,
    header: true,
    step: (results, parser) => {
        // keep only the wanted columns (uses lodash's _.pick)
        results.data = _.pick(results.data, ['column1', 'column2']);
        return results;
    }
});
Note that if you are loading a huge file, you will have the whole file in memory right after parsing. Moreover, it may freeze the browser due to the heavy workload. You can avoid that by reading and discarding columns:
row by row
chunk by chunk.
You should read Papaparse's FAQ before implementing this. To sum up, you will keep only the required columns by extracting them inside the step or chunk callbacks.
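A sketch of that accumulation using the chunk callback (the ID and Date field names are taken from the question; the rest is an assumption):

var kept = [];
Papa.parse(file, {
    header: true,
    skipEmptyLines: true,
    chunk: function (results) {
        // keep only the two wanted fields from each row in this chunk,
        // then let the rest of the chunk be garbage-collected
        results.data.forEach(function (row) {
            kept.push({ ID: row.ID, Date: row.Date });
        });
    },
    complete: function () {
        console.log(kept.length + ' rows kept');
    }
});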
Is there a way to change the data before it is displayed in a dropdown AS you click to edit the field?
As a simpler approach I tried dataBinding, which is supposed to fire before the data is bound (and is available on the dropDownList as well), but it doesn't appear to change the data before it is rendered. Or at least it doesn't update it.
The deep dive into the problem: I want to run a for loop over the array of data and perform some operations on it, based on the rest of the data in the row.
The simpler idea is that I want to edit the data before it is displayed, on click, in the Kendo grid.
https://dojo.telerik.com/UROwaWoN
var mydata = [
{ name: "Jane Doe", age: 30 },
{ name: "John Doe", age: 33 }];
$("#grid").kendoGrid({
columns: [
{ field: "name" },
{ field: "age" }
],
dataSource: mydata,
dataBinding: function(e) {
mydata.push({ name: "Kane Madison", age: 24 });
console.log("dataBinding");
}
});
The answer was that I have to set the dataSource again. I did this by setting the dataSource within the "edit" function in the options (before the dropdownlist's select event). It looks like this:
edit: function(event) {
    // .....
    if (selectedFieldName == "thefieldIwant") {
        let newDataSource = remapDataBased // build the remapped data here
        let dropdownList = ..... // get the DropDownList widget from the Kendo grid
        dropdownList.setDataSource(newDataSource);
    }
}
I think it's possible there are other solutions to this, but this has been up a few days and nobody has found them yet.
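For completeness, a sketch of how that can be wired into the grid's edit event. The column setup, the buildOptionsForRow helper, and locating the editor via its data-role are assumptions, not the original code:

$("#grid").kendoGrid({
    columns: [ /* ... */ ],
    dataSource: mydata,
    editable: true,
    edit: function (e) {
        // only act when the edited cell hosts a DropDownList editor
        var dropdownList = e.container
            .find("[data-role=dropdownlist]")
            .data("kendoDropDownList");
        if (dropdownList) {
            // build new options from the rest of the row (hypothetical helper)
            var remapped = buildOptionsForRow(e.model);
            dropdownList.setDataSource(new kendo.data.DataSource({ data: remapped }));
        }
    }
});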
I'm new to Firebase. I would like to create an app (using Angular and the AngularFire library) which shows the current price of some wares. I have a list of all available wares in the Firebase Realtime Database in the following format:
"warehouse": {
"wares": {
"id1": {
"id": "id1",
"name": "name1",
"price": "0.99"
},
"id2": {
"id": "id2",
"name": "name2",
"price": "15.00"
},
... //much more stuff
}
}
I'm using ngrx in my app, so I think I can load all the wares into the store as an object rather than a list, for the sake of a normalized state tree. I wanted to load the wares into the store this way:
this.db.object('warehouse/wares').valueChanges();
The problem is that the wares' prices will be refreshed every 5 minutes. The number of wares is huge (about 3000 items), so a single response weighs about 700 kB. I know that this way I will exceed the data-download limit in a short time.
I want to limit the loaded data to what is interesting for the user, so every user will be able to choose wares. I will store these choices in the following way:
"users": {
"user1": {
"id": "user1",
"wares": {
"id1": {
"order": 1
},
"id27": {
"order": 2
},
"id533": {
"order": 3
}
},
"waresIds": ["id1", "id27", "id533"]
}
}
And my question is:
Is there a way to get wares based on the current user's waresIds? I mean, is there a way to get only the wares whose ids are in a given array? E.g. getting:
"wares": {
"id1": {
"id": "id1",
"name": "name1",
"price": "0.99"
},
"id27": {
"id": "id27",
"name": "name27",
"price": "0.19"
},
"id533": {
"id": "id533",
"name": "name533",
"price": "1.19"
}
}
for a query like:
this.db.object('warehouse/wares').contains(["id1", "id27", "id533"]).valueChanges();
I saw query filters in AngularFire like equalTo etc., but every one of them is for lists. I'm totally confused. Is there anyone who can help me? Maybe I'm making mistakes in the design of the app's structure; if so, I am asking for clarification.
Because you are saving the ids under the user, try it this way:
wares: Observable<any[]>;

// inside ngOnInit or a function
// (needs the rxjs map/switchMap operators and Observable.combineLatest imported)
this.wares = this.db.list('users/currentUserId/wares').snapshotChanges()
    // the keys under users/currentUserId/wares are the ware ids
    .map(changes => changes.map(c => c.payload.key))
    // query each ware by id; combineLatest merges the inner observables instead of
    // subscribing inside the loop, which would return the array before the data arrived
    .switchMap(ids => Observable.combineLatest(
        ids.map(id =>
            this.db.list('warehouse/wares', ref => ref.orderByChild('id').equalTo(id)).valueChanges()
        )
    ))
    // each inner query emits a one-element array; flatten into a plain list of wares
    .map(resultArrays => [].concat(...resultArrays));
There are two things you can do. I don't believe Firebase allows you to query for multiple equal values at once. You can, however, loop over the array of "ids" and query for each one directly.
I am assuming you have already queried for "waresIds" and stored those IDs in an array named idArray:
for (const id of idArray) {
    database.ref('warehouse/wares').orderByChild('id').equalTo(id).once('value').then((snapshot) => {
        console.log(snapshot.val());
    });
}
In order to use the above query efficiently you'll have to index your data on id.
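That index is declared in the database's security rules; given the structure from the question, it would look roughly like this:

{
  "rules": {
    "warehouse": {
      "wares": {
        ".indexOn": ["id"]
      }
    }
  }
}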
Your second option would be to listen for child_changed events to get only the updated data after your initial fetch. This should drastically cut down the amount of data you need to download.
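A sketch of that second approach with the Web SDK; the store update itself is left as a comment:

// after the initial fetch, only changed wares are downloaded
database.ref('warehouse/wares').on('child_changed', (snapshot) => {
    const ware = snapshot.val(); // the single ware whose price changed
    // merge `ware` into the local store / ngrx state here
});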
Yes, you can get exactly the data that you want in Firebase.
See the official Firebase documentation about filtering data.
You need to fetch each waresID:
var waresID = // logic to get waresID
var ref = firebase.database().ref("warehouse/wares").child(waresID); // path per the structure above
ref.once("value")
    .then(function(snapshot) {
        console.log(snapshot.val());
    });
This will return only the data related to that waresID.
Note: this is JavaScript code; I hope this will work for you.