return from JS function - javascript

Basic JS question, please go easy on me, I'm a newb :)
I pass 2 variables to the findRelatedRecords function, which queries other related tables and assembles an Array of Objects called data. Since findRelatedRecords has so many inner functions, I'm having a hard time getting the data Array out of the function.
As it currently is, I call showWin inside findRelatedRecords, but I'd like to change it so that I can get the data Array directly out of findRelatedRecords without jumping to showWin.
function findRelatedRecords(features, evtObj) {
    //first relationship query to find related branches
    var selFeat = features;
    var featObjId = selFeat[0].attributes.OBJECTID_1;
    var relatedBranch = new esri.tasks.RelationshipQuery();
    relatedBranch.outFields = ["*"];
    relatedBranch.relationshipId = 1; //fac -to- Branch
    relatedBranch.objectIds = [featObjId];
    facSel.queryRelatedFeatures(relatedBranch, function(relatedBranches) {
        var branchFound = false;
        if (relatedBranches.hasOwnProperty(featObjId) == true) {
            branchFound = true;
            var branchSet = relatedBranches[featObjId];
            var cmdBranch = dojo.map(branchSet.features, function(feature) {
                return feature.attributes;
            });
        }
        //regardless of whether a branch is found or not, we have to run the cmdMain relationship query
        //the parent is still fac, no advantage of the parent being branch since the cmdMain query has to be run regardless
        //fac - branch - cmdMain - cmdSub <-- sometimes
        //fac - cmdMain - cmdSub <-- sometimes
        //second relationship query to find related cmdMains
        var relatedQuery = new esri.tasks.RelationshipQuery();
        relatedQuery.outFields = ["*"];
        relatedQuery.relationshipId = 0; //fac -to- cmdMain
        relatedQuery.objectIds = [featObjId];
        //rather than listen for "OnSelectionComplete" we are using the queryRelatedFeatures callback function
        facSel.queryRelatedFeatures(relatedQuery, function(relatedRecords) {
            var data = [];
            //if any cmdMain records were found, the relatedRecords object will have a property equal to the OBJECTID of the clicked feature
            //i.e. if cmdMain records are found, true will be returned; continue with finding cmdSub records
            if (relatedRecords.hasOwnProperty(featObjId) == true) {
                var fset = relatedRecords[featObjId];
                var cmdMain = dojo.map(fset.features, function(feature) {
                    return feature.attributes;
                });
                //we need to fill an array with the objectids of the returned cmdMain records
                //the length of this list == total number of cmdMain records returned for the clicked facility
                objs = [];
                for (var k in cmdMain) {
                    var o = cmdMain[k];
                    objs.push(o.OBJECTID);
                }
                //third relationship query to find records related to cmdMain (cmdSub)
                var subQuery = new esri.tasks.RelationshipQuery();
                subQuery.outFields = ["*"];
                subQuery.relationshipId = 2;
                subQuery.objectIds = [objs];
                subTbl.queryRelatedFeatures(subQuery, function(subRecords) {
                    //subRecords is an object where each property is the objectid of a cmdMain record
                    //if a cmdRecord objectid is present as a subRecords property, cmdMain has sub records
                    //we no longer need these objectids, so we'll remove them and put the array into cmdSub
                    var cmdSub = [];
                    for (id in subRecords) {
                        dojo.forEach(subRecords[id].features, function(rec) {
                            cmdSub.push(rec.attributes);
                        });
                    }
                    var j = cmdSub.length;
                    var p;
                    var sub_key;
                    var obj;
                    if (branchFound == true) {
                        var p1 = "branch";
                        obj1 = {};
                        obj1[p1] = [cmdBranch[0].Branches];
                        data.push(obj1);
                    }
                    for (var i = 0, iLen = cmdMain.length; i < iLen; i++) {
                        p = cmdMain[i].ASGMT_Name;
                        obj = {};
                        obj[p] = [];
                        sub_key = cmdMain[i].sub_key;
                        for (var j = 0, jLen = cmdSub.length; j < jLen; j++) {
                            if (cmdSub[j].sub_key == sub_key) {
                                obj[p].push(cmdSub[j].Long_Name);
                            }
                        }
                        data.push(obj);
                    }
                    showWin(data, evtObj); // <-- this would go away
                });
            }
            //no returned cmdRecords; cmdData not available
            else {
                p = "No Data Available";
                obj = {};
                obj[p] = [];
                data.push(obj);
            }
            showWin(data, evtObj); // <-- this would go away
        });
    });
}
I'd like to have access to the data array simply by calling:
function findRelatedRecords(feature, evt) {
    //code pasted above
}
function newfunct() {
    var newData = findRelatedRecords(feature, evt);
    console.log(newData);
}
is this possible?
thanks!
Edit
A little more explanation...
I'm connecting an Object event listener to a function like so:
function b(input) {
    dojo.connect(obj, "onQueryRelatedFeaturesComplete", getData);
    obj.queryRelatedFeatures(input);
    console.log(arr); //<---- this doesn't work
}
function getData(relatedFeatData) {
    var arr = [];
    //populate arr
    return arr;
}
So when obj.queryRelatedFeatures() is complete, getData fires; this part works fine, but how do I access arr from function b?

Post Edit Update:
Due to the way this event is being hooked up, you can't simply return data from it. Returning will just let Dojo move on to the next method that is hooked up to onSelectionComplete.
When init runs, it is long before findRelatedRecords will ever be executed/fired by the onSelectionComplete event of the well, which is why you were seeing undefined/null values. The only way to work with this sort of system is to either 1) call off to a method/callback like you're already doing (a minimal sketch of this is shown below) or 2) fire off a custom event/message (technically that's still just calling off to a method).
If you want to make this method easier to work with, you should refactor/extract snippets of it so that it becomes a smaller function composed of many functions. Also, changing it to have only one exit point at the end of the findRelatedRecords method will help. The function defined inside of subTbl.queryRelatedFeatures() would be a great place to start.
Sorry, you're kind of limited by what Dojo gives you in this case.
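To make option 1 concrete, here is a minimal sketch (not your exact code, just the shape of it): instead of hard-coding showWin, you pass your own handler into findRelatedRecords, and it gets called with data once the nested queries finish.

function findRelatedRecords(features, evtObj, onDone) {
    // ...all of the relationship-query logic above...
    // wherever showWin(data, evtObj) is called today:
    onDone(data, evtObj);
}

findRelatedRecords(feature, evt, function (data, evtObj) {
    console.log(data); // data only exists here, once the async queries have completed
});

The caller still never gets a return value; it just supplies the function that receives the result.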
Pre Edit Answer:
Just return your data out of it. Wherever there is a showWin call, use this return instead:
return {
    data: data,
    evtObj: evtObj
};
Then your newfunct would look like this.
function newfunct() {
    var newData = findRelatedRecords(feature, evt);
    console.log(newData);
    console.log(newData.data);
    console.log(newData.evtObj);
}
If you only need that "data" object, then change your return to just return data;.
Also, start using semicolons to terminate statements.

Related

[javascript] Object's value updates strangely

The value of the object is updated very strangely.
The current overall system structure is as follows:
There is a server that collects the status of each system.
The server sends the collected data to the web server through a websocket.
When the web server receives the websocket message, a callback function is called.
In the callback function, the object is updated with the received data.
The problem occurs when updating objects.
Here is the code for that part.
var systemDatas = {};
...
fn_callback = function(data) {
    fn_set_metric(data);
    ...
};
...
function fn_set_metric(data) {
    Object.entries(data).forEach(([apps, appArr]) => {
        for (let i = 0; i < appArr.length; i++) {
            var app = {};
            if (appArr[i].name === "GW") {
                if (systemDatas.hasOwnProperty("GW")) {
                    var gwDatas = systemDatas["GW"];
                    Object.keys(gwDatas).map(function(key) {
                        try {
                            var keyIdx = 0;
                            for (let j = 0; j < (appArr[i].nodes).length; j++) {
                                if (appArr[i].nodes[j].name === key) {
                                    keyIdx = j;
                                    break;
                                }
                            }
                            if (appArr[i].nodes[keyIdx].health === "on") {
                                gwDatas[key].process.cpuSystem = appArr[i].nodes[keyIdx].metrics[0].measurements[0].value;
                                gwDatas[key].process.cpuProcess = appArr[i].nodes[keyIdx].metrics[1].measurements[0].value;
                                gwDatas[key].memory.memUsed = appArr[i].nodes[keyIdx].metrics[2].measurements[0].value;
                                gwDatas[key].memory.heapUsed = appArr[i].nodes[keyIdx].metrics[4].measurements[0].value;
                                gwDatas[key].thread.threadDeamon = appArr[i].nodes[keyIdx].metrics[6].measurements[0].value;
                                gwDatas[key].thread.threadLive = appArr[i].nodes[keyIdx].metrics[7].measurements[0].value;
                                gwDatas[key].memory.memMax = appArr[i].nodes[keyIdx].metrics[3].measurements[0].value;
                                gwDatas[key].memory.heapMax = appArr[i].nodes[keyIdx].metrics[5].measurements[0].value;
                                gwDatas[key].thread.threadPeak = appArr[i].nodes[keyIdx].metrics[8].measurements[0].value;
                                gwDatas[key].process.uptime = appArr[i].nodes[keyIdx].metrics[9].measurements[0].value;
                                gwDatas[key].process.cpuCount = appArr[i].nodes[keyIdx].metrics[10].measurements[0].value;
                                console.log(key);
                                console.log(systemDatas["GW"][key].process.uptime);
                                console.log(systemDatas["GW"][key].process);
                                console.log(systemDatas["GW"][key]);
                                console.log(systemDatas["GW"]);
                            }
                        }
                        catch(e) {
                            console.error(e);
                        }
                    });
                }
                ...
            }
And here is the result of executing the function:
(screenshot of the console.log output)
As you can see in the area marked in yellow in the result image, the value differs depending on the scope of the object.
My expectation is that systemDatas["GW"]["GW_1"] is updated, then systemDatas["GW"]["GW_2"] is updated, sequentially.
But it is behaving in an incomprehensible way.
Apart from the callback function, there is no other code that updates systemDatas.
Can you explain why it works this way?
Your code complexity (nesting) is too high - it is not helping you solve the problem.
Fixes
Break the function up into 2-3 separate functions, e.g. const parseMetricsData, parseGWData; // etc.
Look over the recently added Array methods; some of the new ones, like [].find, will make the code easier to read (see the MDN Array docs).
More tips follow the code example.
Example:
const systemDatas = {};
// ...
const fn_callback = function (data) {
    fn_set_metric(data);
    // ...
};
// ...
const parseGWData = (app, gwDatas) => {
    for (const key of Object.keys(gwDatas)) {
        const gwData = gwDatas[key],
            foundNode = !app.nodes ? null : app.nodes.find(n => n.name === key);
        if (!foundNode || foundNode.health !== 'on') continue;
        gwData.process.cpuSystem = foundNode.metrics[0].measurements[0].value;
        gwData.process.cpuProcess = foundNode.metrics[1].measurements[0].value;
        gwData.process.uptime = foundNode.metrics[9].measurements[0].value;
        gwData.process.cpuCount = foundNode.metrics[10].measurements[0].value;
        gwData.memory.memUsed = foundNode.metrics[2].measurements[0].value;
        gwData.memory.heapUsed = foundNode.metrics[4].measurements[0].value;
        gwData.memory.memMax = foundNode.metrics[3].measurements[0].value;
        gwData.memory.heapMax = foundNode.metrics[5].measurements[0].value;
        gwData.thread.threadDeamon = foundNode.metrics[6].measurements[0].value;
        gwData.thread.threadLive = foundNode.metrics[7].measurements[0].value;
        gwData.thread.threadPeak = foundNode.metrics[8].measurements[0].value;
        console.log(key);
        console.table(systemDatas.GW[key]);
    }
};
function fn_set_metric(data) {
    for (const [apps, appArr] of Object.entries(data)) {
        for (const app of appArr) {
            if (app.name !== 'GW' ||
                !Object.prototype.hasOwnProperty.call(systemDatas, 'GW')) continue;
            parseGWData(app, systemDatas.GW);
        }
    }
}
Other code tips:
Put long property chains into variables, either via built-ins (app.nodes.find(app => app.name === key)) or directly.
Use built-ins (Array.prototype.find, for...of loops, etc.); use whatever your platform/platform version supports (see MDN's Array docs, etc., for more).
Use negative if checks (instead of nesting the main part of the code in if statements, you can check the opposite condition to avoid creating deeply nested code).
~~Consider not mutating static structures until loops/manipulations are complete; e.g., perform manipulations on pure, new objects and then merge the results into the static structure(s) - this will help you pinpoint issues~~ Consider that appArr may have duplicate app entries which may be overriding each other's values (a quick check for this is sketched below).
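A quick, hypothetical way to test the duplicate theory is to count how often each app name appears in an incoming payload before touching systemDatas (the helper name here is made up, not part of your code):

const logDuplicateApps = (data) => {
    const counts = {};
    Object.values(data).forEach(appArr => {
        appArr.forEach(app => {
            counts[app.name] = (counts[app.name] || 0) + 1;
        });
    });
    console.log(counts); // any count > 1 means a later entry can overwrite an earlier one
};

Calling logDuplicateApps(data) at the top of fn_callback would show whether the same node data arrives more than once per message.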

JavaScript - Issues recovering a map in an object after being saved in localStorage

I've been dealing with this for some time. I have a list of sections in which the user checks some checkboxes, and that is sent to the server via AJAX. However, since the user can return to previous sections, I'm using some objects of mine to store some things the user has done (whether he/she already finished working in that section, which checkboxes were checked, etc.). I'm doing this to not overload the database and only send new requests to store information if the user actually changes a previous checkbox, not if he just starts clicking "Save" randomly. I'm using objects to track the sections of the page, and storing the previous state of the checkboxes in a Map. Here's my "supervisor":
function Supervisor(id) {
    this.id = id;
    this.verif = null;
    this.selections = new Map();
    var children = $("#ContentPlaceHolder1_checkboxes_div_" + id).children().length;
    for (var i = 0; i < children; i++) {
        if (i % 2 == 0) {
            var checkbox = $("#ContentPlaceHolder1_checkboxes_div_" + id).children()[i];
            var idCheck = checkbox.id.split("_")[2];
            this.selections.set(idCheck, false);
        }
    }
    console.log("Length " + this.selections.size);
    this.change = false;
}
The console.log gives me the expected output, so I assume my Map is created and initialized correctly. Since the session of the user can expire before he finishes his work, or he can close his browser by accident, I'm storing this object using local storage, so I can change the page accordingly to what he has done should anything happen. Here are my functions:
function setObj(id, supervisor) {
    localStorage.setItem(id, JSON.stringify(supervisor));
}
function getObj(key) {
    var supervisor = JSON.parse(localStorage.getItem(key));
    return supervisor;
}
So, I'm trying to add to the record whenever a user clicks on a checkbox. And this is where the problem happens. Here's the function:
function checkboxClicked(idCbx) {
    var idSection = $("#ContentPlaceHolder1_hdnActualField").val();
    var supervisor = getObj(idSection);
    console.log(typeof (supervisor)); //Returns object, everything's fine
    console.log(typeof (supervisor.change)); //Returns boolean
    supervisor.change = true;
    var idCheck = idCbx.split("_")[2]; //I just want a part of the name
    console.log(typeof (supervisor.selections)); //Prints object
    console.log("Length " + supervisor.selections.size); //Undefined!
    supervisor.selections.set(idCheck, true); //Error! Note: the true is just for testing purposes
    setObj(idSection, supervisor);
}
What am I doing wrong? Thanks!
Please look at this example; I removed the jQuery id discovery for clarity. You'll need to adapt this to meet your needs, but it should get you mostly there.
const mapToJSON = (map) => [...map];
const mapFromJSON = (json) => new Map(json);
function Supervisor(id) {
    this.id = id;
    this.verif = null;
    this.selections = new Map();
    this.change = false;
    this.selections.set('blah', 'hello');
}
Supervisor.from = function (data) {
    const id = data.id;
    const supervisor = new Supervisor(id);
    supervisor.verif = data.verif;
    supervisor.selections = new Map(data.selections);
    return supervisor;
};
Supervisor.prototype.toJSON = function () {
    return {
        id: this.id,
        verif: this.verif,
        selections: mapToJSON(this.selections)
    };
};
const expected = new Supervisor(1);
console.log(expected);
const json = JSON.stringify(expected);
const actual = Supervisor.from(JSON.parse(json));
console.log(actual);
If you can't use the spread operator in mapToJSON you could loop and push.
const mapToJSON = (map) => {
    const result = [];
    for (let entry of map.entries()) {
        result.push(entry);
    }
    return result;
}
Really the only thing I'd change is to have the constructor do less: just accept values and assign them with minimal fiddling, and have a factory query the DOM and populate the constructor with values, maybe something like fromDOM(). This will make Supervisor more flexible and easier to test.
function Supervisor(options) {
    this.id = options.id;
    this.verif = null;
    this.selections = options.selections || new Map();
    this.change = false;
}
Supervisor.fromDOM = function (id) {
    const selections = new Map();
    const children = $("#ContentPlaceHolder1_checkboxes_div_" + id).children();
    for (var i = 0; i < children.length; i++) {
        if (i % 2 == 0) {
            var checkbox = children[i];
            var idCheck = checkbox.id.split("_")[2];
            selections.set(idCheck, false);
        }
    }
    return new Supervisor({ id: id, selections: selections });
};
console.log(Supervisor.fromDOM(2));
You can keep going and have another method that tries to parse a Supervisor from localStorage and defaults to the DOM-based factory if the localStorage one returns null.
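A minimal sketch of that fallback might look like this (the fromStorage name and the assumption that entries are stored under the raw id, as in setObj above, are mine, not part of your code):

// Hypothetical factory: try localStorage first, fall back to the DOM-based factory.
Supervisor.fromStorage = function (id) {
    const raw = localStorage.getItem(id);
    if (raw === null) {
        return Supervisor.fromDOM(id);
    }
    return Supervisor.from(JSON.parse(raw));
};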

Pass array values as parameter to function and create json data

I have a scenario where I am passing an array of objects to a function in Node.js, but it is failing with an undefined error.
Here is what I have tried :
var object = issues.issues //json data
var outarr = [];
for (var key in object) {
    outarr.push(object[key].key)
}
console.log(outarr) // array is formed like this : ['a','b','c','d','e']
for (var i = 0; i < outarr.length; i++) {
    jira.findIssue(outarr[i]) //here I am trying to pass the array objects into the loop one by one
        .then(function (issue) {
            var issue_number = issue.key
            var ape = issue.fields.customfield_11442[0].value
            var description = issue.fields.summary
            var ice = issue.fields.customfield_15890[0].value
            var vice = issue.fields.customfield_15891.value
            var sor = issue.fields.labels
            if (sor.indexOf("testcng") > -1) {
                var val = 'yes'
            } else {
                var val = 'yes'
            }
            var obj = {};
            obj['ape_n'] = ape;
            obj['description_n'] = description;
            obj['ice_n'] = ice;
            obj['vice_n'] = vice;
            obj['sor_n'] = val;
            var out = {}
            var key = item;
            out[key] = [];
            out[key].push(obj);
            console.log(out)
        } })
        .catch(function (err) {
            console.error(err);
        });
});
What I am trying to achieve: I want to pass the array values one by one as the parameter required by jira.findIssue (basically passing the issue number), which should then fetch the values and give a combined JSON output.
How can I pass these array values one by one to this function and also run jira.findIssue in a loop?
Any help will be great !! :-)
I have taken a look at the code in your question.
To be honest the code you wrote is messy and contains some simple syntax errors.
A good tip is to use a linter to avoid those mistakes.
More info about linters here: https://www.codereadability.com/what-are-javascript-linters/
To output all results in one array you have to define the array outside the scope of the loop.
I cleaned the code up a bit and used some ES6 features. I don't know the context of the code, but this is what I can make of it:
//map every value's key into outarr
let outarr = issues.issues.map(elm => elm.key);
//Output defined outside the scope of the loop
let output = [];
//looping over outarr
outarr.forEach(el => {
    jira.findIssue(el).then(issue => {
        //creating the issue object
        let obj = {
            ape_n: issue.fields.customfield_11442[0].value,
            description_n: issue.fields.summary,
            ice_n: issue.fields.customfield_15890[0].value,
            vice_n: issue.fields.customfield_15891.value,
            sor_n: issue.fields.labels.indexOf("testcng") > -1 ? "yes" : "yes",
        };
        //pushing to the output
        output[issue.key] = obj;
    }).catch(err => {
        console.log(err);
    });
});
//outputting the output
console.log(output);
Some more info about es6 features: https://webapplog.com/es6/
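One caveat with the snippet above: the final console.log(output) runs before the asynchronous findIssue calls have resolved, so it will log an empty array. A rough sketch that waits for all of the requests (assuming jira.findIssue returns a promise, as it is used above) could look like this:

// Build one promise per issue key and log the combined result once they all resolve.
const requests = outarr.map(el =>
    jira.findIssue(el).then(issue => ({
        key: issue.key,
        ape_n: issue.fields.customfield_11442[0].value,
        description_n: issue.fields.summary
        // ...remaining fields as in the object above
    }))
);
Promise.all(requests)
    .then(results => console.log(results)) // the combined JSON output
    .catch(err => console.log(err));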

Clearing JS response

I am making a call to an API. The API returns a list of results. When it does so, the response is fed into an object which I then iterate through to display the results.
Here is the function which does that:
var getAvailability = () => {
    if (chosenData.hotel == "") {
        showError("Please select a location before booking.");
        $timeout(() => LTBNavService.setTab('location'), 50);
        return;
    }
    searchResponse = {};
    console.log(searchResponse);
    WebAPI.getHotelAvailability(genSearchObject()).then((data) => {
        searchResponse = data;
        $timeout(() => $('[data-tab-content] .search-btn').first().focus(), 50);
        generateRoomTypeObject(searchResponse);
    }, (data) => searchResponse.error = data.data.errors[0].error);
};
The Problem:
The old results are still displayed until the new set of results is available. This causes a flicker and a delay, which is a bad user experience.
The solution (which I need help with):
What is the best possible way of handling this problem? Ideally, I would like to reset/clear the search response. As in, the new results are delivered and the old ones are cleared. Is this possible from within the getAvailability function?
What would be the best way to achieve this?
The Solution:
Thanks to Daniel Beck for his suggestion to call the generateRoomTypeObject function and feed it an empty object - I +1'd his comment.
This triggered an undefined error in my generateRoomTypeObject function where I was running a few length checks (which makes sense, because the object was empty, so there was nothing to run length checks on).
I handled it by catching the undefined case and setting searchResponse to an empty object:
var generateRoomTypeObject = (searchResponse) => {
    var ratePlans = searchResponse.ratePlans,
        errors = searchResponse.error,
        roomTypes = [],
        ignoreBiggerRooms = false;
    rawRoomsObjs = [];
    if (angular.isUndefined(errors)) {
        // Iterate over the rate plans
        if (ratePlans === undefined) {
            //generateRoomTypeObject -- handle undefined by creating a new object
            searchResponse = {}
        } else {
            for (var i = 0; i < ratePlans.length; i++) {
                var ratePlan = ratePlans[i],
                    rooms = ratePlan.rooms;
                // Iterate over the rooms and add rooms to the room object. Also keep a list of room types.
                for (var j = 0; j < rooms.length; j++) {
                    //Stuff here
                }
            }
        }
    }
};
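For completeness, a minimal sketch of the clearing step itself (this is my reading of the suggestion above, not the exact production code): feed the empty object into generateRoomTypeObject before firing the new request, then feed it the real data when the response arrives.

var getAvailability = () => {
    // ...validation as above...
    searchResponse = {};
    generateRoomTypeObject(searchResponse); // clears the old results immediately
    WebAPI.getHotelAvailability(genSearchObject()).then((data) => {
        searchResponse = data;
        generateRoomTypeObject(searchResponse); // renders the fresh results
    }, (data) => searchResponse.error = data.data.errors[0].error);
};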

Should I bother cleaning array in node.js?

In one of my scripts, I make extensive use of arrays to temporarily store data. The problem I'm facing is that I have a lot of code handling the array just so I make economical use of the space.
Should I even bother, since Node.js arrays are associative arrays?
My current solution is:
//Get the minimum empty id in the array
function get_id(callback) {
    var i = 0;
    while (array[i] != null) {
        i = i + 1;
    }
    array[i] = 0;
    callback(i);
}
get_id(function (i) {
    array[i] = {large object};
    //...
    array[i] = null;
});
But I feel it is wrong and bug prone.
Can I just do:
array[i] = {large object};
i = i + 1;
//...
array[i] = null;
Or would it lead to large consumption of memory?
array is a global variable of the module using it.
Cut-down code (I've removed all computing not linked to the array player.active_mission):
var player = {},
    missions = [{time: 1000}];
function end_mission(mission, squad, mission_log, callback) {
    //Make all the computing of the mission to know if the player won...
    callback(mission_log);
}
function get_ami(callback) {
    var i = 0;
    while (player.active_mission[i] != null) {
        i = i + 1;
    }
    player.active_mission[i] = 0;
    callback(i);
}
function wait_mission(mission, squad, mission_log, i, time, callback) {
    setTimeout(function () {
        console.log('End of mission');
        player.active_mission[i] = null;
        end_mission(mission, squad, mission_log, callback);
    }, time);
}
function start_mission(mission, squad, callback) {
    var mission_log = {mission: mission, time_start: new Date(), completed: false, read: false};
    //Verify if the player can start the mission...
    console.log('start_mission');
    get_ami(function (i) {
        player.active_mission[i] = {mission: mission, squad: squad, mission_log: mission_log}
        wait_mission(mission, squad, mission_log, i, missions[mission].time, callback);
    });
}
player.active_mission = [];
//This part is inside get request, after sanitizing all input
start_mission(0, [0, 1], function (r) {
    //r.id = req.session.player_id;
    if (r.error) {
        console.log('start: error: ' + r.error);
    } else {
        console.log('start: Success: ' + r.result);
    }
});
player.active_mission holds all uncompleted requests of the player, and needs to be saved if the player quits before completion. My question is just whether I should try to keep the ids small, or just go on with .push() and get the id with .length?
In short: if an array has nothing but null for the first 1000 ids, and only starts having data at array[1000], am I wasting memory?
Can I just do:
i = i + 1;
array[i] = null;
Or would it lead to large consumption of memory?
Yes, considering that array is a global variable and won't get garbage-collected itself, filling it constantly with values (even if only null ones) will eventually make you run out of memory.
Your get_id approach that recycles unused ids does work, but it is horribly inefficient - it requires linear time to find a new id. So it'll work for a few users with few concurrent missions, but it won't scale.
Instead, you'll want to use an object and delete keys from it; then you don't get into problems when just counting up:
var count = 0;
var missions = {};
function somethingThatNeedsTheStore() {
    var id = count++;
    missions[id] = …;
    // later
    delete missions[id];
}
// repeatedly call somethingThatNeedsTheStore()
Or actually, on recent node versions, you should consider using a Map instead:
var count = 0;
var missions = new Map;
function somethingThatNeedsTheStore() {
    var id = count++;
    missions.set(id, …);
    // later
    missions.delete(id);
}
// repeatedly call somethingThatNeedsTheStore()
Node.js has a garbage collector that destroys unreachable objects/arrays/variables.
So when you do array[i] = {large object};, the large object will be in memory and it will stay there. When you do array[i] = null;, the garbage collector will free the large object (only if there is no other reference to it, of course).
So yes, it is always good to remove references to useless objects so the garbage collector can clean them up.
The memory impact of an array of 1000 nulls (or undefined values) will not be very big.
If you want to save memory, you should use an object instead of an array. You can use it with this syntax:
var obj = {};
obj[id] = {large object};
// Free the id
delete obj[id];
