Can't store result in session from async waterfall - javascript

I have a series of methods I run using async.waterfall which then return a result. I store that result into a request.session variable because I use Ajax and work with the result later on. However, I find that I can set the value of the session variable, but later on I cannot retrieve it.
Here's the code from the waterfall.
async.waterfall([
  // Some functions
], function(err, result) {
  request.session.cases = result;
  response.render('template.jade', {
    items: result
  });
});
I then perform a jQuery load to another URL, which then filters cases in request.session.cases.
But I find that request.session.cases is undefined!
app.get('/case/:bugid', function(request, response) {
  console.log(request.session.cases); // undefined
});
The weird part is that I can set request.session.cases to anything other than result (I can even set it to result[0]) and it works fine. I've also printed out the value of result before the Ajax call, and it is indeed populated with an array of objects.

cookieSession can only hold a very small amount of data, since the entire session is serialized into the cookie itself. The data I was trying to store was too big for a cookieSession, so I switched to a server-side session for that kind of data and kept a normal cookie for the user login.
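The size constraint is easy to see: a cookie value tops out around 4 KB, and a serialized result set blows past that quickly. A minimal sketch (the 200-item array is made up for illustration):

```javascript
// Why a large result can't live in a cookie-based session: the whole session
// is serialized into the cookie, and a cookie value tops out around 4 KB.
const cases = Array.from({ length: 200 }, (_, i) => ({
  id: i,
  title: 'Bug report #' + i,
  description: 'x'.repeat(100),
}));

const serialized = JSON.stringify(cases);
const COOKIE_LIMIT = 4096; // typical per-cookie byte budget

console.log(serialized.length > COOKIE_LIMIT); // far over the limit
```

With a server-side store (e.g. express-session backed by a memory or database store), only the session ID travels in the cookie, so the size of the result no longer matters.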

Related

Parse Server Upsert Data in loop

I'm trying to query an API to get some data then I want to upsert all of it into my table.
But for some reason I'm not having any luck.
What's the best way to go about this?
I don't think my method of doing a query in a loop is best.
var coin = new Parse.Object.extend("Coins");
axios.get('https://api.coinmarketcap.com/v1/ticker/')
  .then(response => {
    let data = response.data;
    // Put into database
    data.map(entry => {
      let q = new Parse.Query(Model.Coins.className());
      q.equalTo('symbol', entry.symbol);
      q.first()
        .then(record => {
          record.set('symbol', entry.symbol);
          record.set('price_usd', entry.price_usd);
          return record.save(null, {useMasterKey: true});
        });
    });
    res.success(response);
  });
You should avoid fetching and updating objects in a loop. To improve it you need to use two things:
In your query, instead of using equalTo and first, use containedIn to query all the records in one call. Then iterate over the query results; for each record in the loop, do the following:
record.set('symbol', entry.symbol);
record.set('price_usd', entry.price_usd);
Finally, use saveAll to save all the objects in one call (note that saveAll is a static function of Parse.Object and you should pass an array into it; please review the docs before using it).
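The two-call pattern can be sketched without the Parse SDK. Below, fetchBySymbols and saveAll are illustrative stubs standing in for a containedIn query plus find(), and for Parse.Object.saveAll; the point is that there is exactly one query and one save no matter how many entries arrive:

```javascript
// Sketch of the batched upsert: one query for all symbols, one save for all
// records, instead of a query + save per entry. The stubs count their calls.
let queryCount = 0;
let saveCount = 0;

const existing = [
  { symbol: 'BTC', price_usd: 0 },
  { symbol: 'ETH', price_usd: 0 },
];

function fetchBySymbols(symbols) { // ~ query.containedIn('symbol', symbols).find()
  queryCount++;
  return Promise.resolve(existing.filter(r => symbols.includes(r.symbol)));
}

function saveAll(records) {        // ~ Parse.Object.saveAll(records, {useMasterKey: true})
  saveCount++;
  return Promise.resolve(records);
}

async function upsertPrices(entries) {
  const records = await fetchBySymbols(entries.map(e => e.symbol)); // 1 round trip
  const bySymbol = new Map(records.map(r => [r.symbol, r]));
  for (const entry of entries) {
    const record = bySymbol.get(entry.symbol);
    if (record) record.price_usd = entry.price_usd; // skip unknown symbols
  }
  return saveAll(records);                          // 1 round trip
}

const done = upsertPrices([
  { symbol: 'BTC', price_usd: 40000 },
  { symbol: 'ETH', price_usd: 2500 },
]);
```

Guarding with `if (record)` also avoids the uncaught error mentioned below when a symbol has no matching row.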
Check your data. You may find you have unexpectedly updated records.
Assuming that's the entire body of a cloud function, your function kicks off an asynchronous server call for each entry, then immediately tells the requester that the operation was successful before any of those saves have finished. parse-server will still run the asynchronous code, but the response you send no longer reflects its outcome.
The solution is to put the res.success call inside another .then chain, so the function won't return a response until after the server call + save finishes.
You're also going to get an uncaught error if the symbol doesn't exist in your table, though: you never check that the query actually returned a record from the first() call before calling set on it.
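To make the ordering concrete without parse-server, here is a plain-promise sketch; res.success and saveRecord are stubs, and events just records the order in which things happen:

```javascript
// Sketch: collect the save promises and only report success once they have
// all settled. In a real cloud function, res.success would be the actual
// response object and saveRecord the query + record.save() chain.
const events = [];
const res = { success: msg => events.push('success:' + msg) };

function saveRecord(entry) { // stand-in for the per-entry save
  return Promise.resolve().then(() => events.push('saved:' + entry.symbol));
}

const data = [{ symbol: 'BTC' }, { symbol: 'ETH' }];

const done = Promise.all(data.map(saveRecord)) // wait for every save...
  .then(() => res.success('ok'));              // ...then respond
```

Every 'saved:' entry lands in events before 'success:ok', which is exactly the guarantee the original code lacked.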

Trying to understand Flux stores - so if the state is held in the store, is this also where I do database calls?

I'm trying to build a contacts list app to teach myself reactjs, and I am learning fluxible now.
1) A new contact is entered. Upon submit, a newContact object is created that holds:
firstName
lastName
email
phone1 (can add up to 3 phones)
image (right now it's just a text field, you can add a URL..)
2) This newContact object is sent as a payload to my createNewContactAction, and dispatcher is "alerted" that a new contact has been made.
3) At this point, ContactStore comes into play.. This is where I am stuck.
I have gotten my object to this point. If I want to save this object to my database, is this where I would do that?
I'm a bit confused as to what to do next. My end goal would be to show all the contacts in a list, so I need to add each new contact somewhere so I can pull all of them.
Can someone point me in the right direction?
I would make a request to the server to save the newContact object before calling the createNewContactAction function. If the save is successful, then you can call the createNewContactAction to store the newContact object in the ContactStore. If it isn't successful, then you can do some error handling.
To understand why I think this pattern is preferable in most cases, imagine that you saved the contact in the store and then tried to save it in the database, but then the attempt to save in the database was unsuccessful for some reason. Now the store and database are out of sync, and you have to undo all of your changes to the store to get them back in sync. Making sure the database save is successful first makes it much easier to keep the store and database in sync.
There are cases where you might want to stash your data in the store before the database, but a user submitting a form with data you want to save in the database likely isn't one of those cases.
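A minimal sketch of that order of operations, with stand-ins for the transport and the dispatcher (apiSaveContact and dispatch are made-up names for illustration):

```javascript
// Save-first flow: persist the contact, and only dispatch the action that
// updates the store once the server has confirmed the save.
const store = { contacts: [], error: null };

function dispatch(action) { // stand-in for the flux dispatcher
  if (action.type === 'CREATE_CONTACT_SUCCESS') store.contacts.push(action.contact);
  if (action.type === 'CREATE_CONTACT_ERROR') store.error = action.error;
}

function apiSaveContact(contact) { // stand-in for the xhttp call
  return contact.email ? Promise.resolve(contact)
                       : Promise.reject(new Error('email required'));
}

function createContact(newContact) {
  return apiSaveContact(newContact)
    .then(saved => dispatch({ type: 'CREATE_CONTACT_SUCCESS', contact: saved }))
    .catch(error => dispatch({ type: 'CREATE_CONTACT_ERROR', error }));
}

const done = createContact({ firstName: 'Ada', email: 'ada@example.com' });
```

Because the store is only touched after the server responds, the two can never drift out of sync on a failed save.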
I like to create an additional file to handle my API calls; having all of your xhttp calls in your store can clutter things very quickly. I usually name it after my store, so in this case something like "contacts-api.js". In the api file I export an object with all of the api methods I need, e.g. using superagent for xhttp requests:
var request = require('superagent');

module.exports = {
  createNewContact: function(data, callback) {
    request
      .post('/url')
      .send(data)
      .end(function(err, res) { // superagent callbacks are (err, res)
        if (callback && typeof callback === 'function') {
          callback(res, err);
        }
      });
  }
};
I usually end up creating 3 actions per request. First one is to trigger the initial request with data, next is a success with the results and last is one for errors.
Your store methods for each action might end up looking something like this:
onCreateNewContactRequest: function(data) {
  api.createNewContact(data, function(res, err) {
    if (err) {
      ContactsActions.createNewContactError(err);
    } else {
      ContactsActions.createNewContactSuccess(res);
    }
  });
},
onCreateNewContactSuccess: function(res) {
  // save data to store
  this.newContact = res;
},
onCreateNewContactError: function(err) {
  // save error to store
  this.error = err;
}
DB calls should ideally be made by action creators. Stores should only contain data.

jQuery Deferred returns only last value in loop

So I'm trying to go through one Firebase database to find entries in the database matching a criteria. Therefore I'm using the deferred object of jQuery to handle the database calls.
Once I get a return value from this first database, I want to get the user info from a second database for each of those values in the first db. Then the results are added to a JSON array.
So it's:
<search for value, find one>
<<<search other db for other info>>>
<continue search for outer value>
But this only returns one value, although everything else is running fine (and the console logs all the info correctly).
Here's the code:
function find(searchLocation, profileID) {
  var requestUserData = {
    data: []
  };
  var def = $.Deferred();
  // This will be executed as long as there are elements in the database that
  // match the criteria and that haven't been loaded yet (so it's a simple loop)
  Ref.orderByChild("location").equalTo(searchLocation).on("child_added", function(snapshot) {
    def.resolve(snapshot.val().ID);
  });
  return def.promise();
};
I hope you guys have any ideas on what to do or how I could solve this. Thanks in advance!
Edit: upon further testing I discovered that this problem already exists in the outer loop, so only the first value is being returned. I think this is related to the position of the resolve() method, but I didn't find a possibility on how to change this behaviour.
Firebase is a real-time database. The events stream as changes occur at the server. You're attempting to take this real-time model and force it into a CRUD strategy and do a GET operation on the data. A better solution would be to simply update the values in real-time as they are modified.
See AngularFire, ReactFire, or BackboneFire for an example of how you can do this with your favorite bindings framework.
To directly answer the question, if you want to retrieve a static snapshot of the data, you want to use once() callback with a value event, not a real-time stream from child_added:
Ref.orderByChild("location").equalTo(searchLocation).once("value", function(snapshot) {
  def.resolve(snapshot.val());
});

NodeJS: How to handle a variable number of callbacks run in parallel and map their responses to requests?

As an exercise to teach myself more about Node.js, I started making a basic CRUD REST server for SimpleDB (sdb) using the aws-sdk.
Everything was running smoothly until I got to a function for reading the domains. The aws-sdk has two functions for this purpose: listDomains and domainMetadata. listDomains returns an array of sdb domain names. domainMetadata will return additional statistics about a domain, but will only return them for one domain at a time. It does not include the domain name in the results.
My script is running listDomains and returning an array in the JSON response just fine. I would like to make my API's readDomains function more ambitious, though, and have it return the metadata for all of the domains in the same single API call. After all, running a handful of domainMetadata calls at the same time is where Node's async I/O should shine.
The problem is I can't figure out how to run a variable number of calls, use the same callback for all of them, match the results of each domainMetadata call to its domainName (since it's async and they're not guaranteed to return in the order they were requested), and tell when all of the metadata requests have finished so that I can send my final response. Put into code, my problem areas are:
domain.receiveDomainList = function(err, data) {
  var domainList = [];
  for (var i = 0; i < data.DomainNames.length; i++) {
    sdb.domainMetaData({"DomainName": data.DomainNames[i]}, domain.receiveMetadata);
    // alternatively: domainList.push({"DomainName": data.DomainNames[i]});
  }
  // alternatively:
  // async.map(domainList, sdb.domainMetadata, domain.receiveMetadata)
  console.log(domainList);
}
domain.receiveMetadata = function(err, data) {
  // I figure I can stash the results one at a time in an array in the
  // parent scope but...
  // How can I tell when all of the results have been received?
  // Since the domain name used for the original call is not returned with
  // the results, how do I tell what result matches what request?
}
Based on my reading of async's readme, the map function should at least match the metadata responses with the requests through some black magic, but it causes node to bomb out in the aws-sdk library with a "has no method 'makeRequest'" error.
Is there any way to have it all: requests run in parallel, requests matched with responses and knowing when I've received everything?
Using .bind() you can set the context (the this value) as well as provide leading default arguments to the bound function.
The sample code below is purely to show how you might use .bind() to add additional context to your response callbacks.
In the code below, .bind is used to:
set a domainResults object as the context for the receiveMetaData callback
pass the current domain name as an argument to the callback
The domainResults object is used to:
keep track of the number of names received in the first request
keep track of the completeCount (incremented on each callback from the metadata request)
keep track of both error and success responses in list
provide a complete callback
Completely untested code for illustrative purposes only:
domain.receiveDomainList = function(err, data) {
  // Assuming err is falsey
  var domainResults = {
    nameCount: data.DomainNames.length,
    completeCount: 0,
    list: {},
    complete: function() {
      console.log(this.list);
    }
  };
  for (var i = 0; i < data.DomainNames.length; i++) {
    sdb.domainMetaData({ "DomainName": data.DomainNames[i] },
      domain.receiveMetadata.bind(domainResults, data.DomainNames[i]));
  }
}

domain.receiveMetadata = function(domainName, err, data) {
  // Because of .bind, this === domainResults
  this.completeCount++;
  this.list[domainName] = data || {error: err};
  if (this.completeCount === this.nameCount) {
    this.complete();
  }
}
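A promise-based variant solves the same request-to-response matching problem, since Promise.all preserves input order; getMetadata below is a made-up stand-in for a promisified sdb.domainMetadata:

```javascript
// Pair each domain name with its metadata result. Promise.all resolves with
// results in the same order as the input, so no bookkeeping is needed to
// match responses back to requests.
function getMetadata(domainName) { // stand-in for a promisified domainMetadata
  return Promise.resolve({ itemCount: domainName.length }); // fake stats
}

function readDomains(domainNames) {
  return Promise.all(domainNames.map(name =>
    getMetadata(name).then(meta => [name, meta]) // keep the name with its result
  )).then(pairs => Object.fromEntries(pairs));   // { name: metadata, ... }
}

const done = readDomains(['users', 'orders']);
```

The requests still run in parallel; the ordering guarantee comes entirely from Promise.all, not from when the individual calls happen to finish.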

Node.js | loop over an array , make post calls and accumulate result in an array

I wish to make a call in Node.js something like this (I'm using CoffeeScript for Node.js):
test = [] # initially an empty array
list = [] # an array with 10 JSON objects
for li in list
  get_data url, li, (err, data) -> test.push data
my get_data method look like
get_data: (url, json_data, callback) ->
  throw "JSON obj is required" unless _.isObject(json_data)
  post_callback = (error, response) ->
    if error
      callback(error)
    else
      callback(undefined, response)
    return
  request.post {url: url, json: json_data}, post_callback
  return
The problem is I am not able to collect the result from request.post into the 'test' array.
I know I am doing something wrong in the for loop but I'm not sure what.
You don't appear to have any way of knowing when all of the requests have returned. You should really consider using a good async library, but here's how you can do it:
test = [] # initially an empty array
list = [] # an array with 10 JSON objects
on_complete = ->
  # here, test should be full
  console.log test
  return
remaining = list.length
for li in list
  get_data url, li, (err, data) ->
    remaining--
    test.push data
    if remaining == 0
      on_complete()
In just looking at your code (not trying it out), the problem seems to be a matter of "when you'll get the response" rather than "if you'll get the response". After your for loop runs, all you have done is queue a bunch of requests. You either need to design it so the second request doesn't occur until the first has responded, OR (better) you need a way to accumulate the responses, know when all of them have come back (or timed out), and then use a different callback to return control to the main part of your program.
BTW, here is code for a multi-file loader that I created for ActionScript. Since I/O is asynchronous in ActionScript also, it implements the accumulating approach I describe above. It uses events rather than callbacks but it might give you some ideas on how to implement this for CoffeeScript.
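The same accumulate-then-finish idea can also be written with promises in plain JavaScript; get_data here is a stub standing in for the request.post wrapper:

```javascript
// Wrap each post in a promise and let Promise.all invoke one callback when
// every response has arrived, in request order.
function get_data(url, json_data) { // stub for the request.post wrapper
  return Promise.resolve({ echoed: json_data }); // fake response
}

const list = [{ id: 1 }, { id: 2 }, { id: 3 }];

const done = Promise.all(list.map(li => get_data('/url', li)));

done.then(test => {
  // test holds one response per request, matching the order of `list`
  console.log(test.length); // 3
});
```

This replaces both the manual `remaining` counter and the on_complete callback with the single resolution of the combined promise.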
