Using Angular Http.Get() synchronously - javascript

I have a very simple requirement:
I'm writing an authorization service for an Angular site (Angular 7, if it makes a difference).
The service needs a "hasPermission" function which takes a user and a resource name. It makes an Http.Get() call to a backend to determine whether or not the user is authorized to access that resource. When, and ONLY WHEN, the data comes back from the Get() call, it returns true or false.
I have been searching the web for about a week trying to figure out how to do it. My problem is that Http.Get() returns an observable AND THEN CONTINUES, which means the function returns before it receives the data on which to make a decision. The same thing happens if I use Http.Get().toPromise() - the function continues as soon as the promise is created.
It seems like every "solution" I read is some variant of "return a promise" or "return an observable". But then it's analogous to "It's turtles all the way down" - in this case, it's promises (or observables) all the way up. At some point there needs to be a method that waits for and returns the data, not a promise of the data or an observable of the data.
I need some way to add a "waitForDone" after creation of the observable or promise and before the function returns its value, but from everything I can find, you can't do that in JavaScript because it's single-threaded.
Note, I can't, as some solutions have suggested, "just put the code after the http.get(...) in a separate function and call it from the success callback", because the code to be executed needs to return from this function.
And async/await doesn't do it, because async turns the whole function into a promise, so, even though the await may wait for the Get() to return the data, the function will still have gone on and returned to the caller before it had the data it needed.
It doesn't seem to me that this is an unusual requirement. While it's nice to be able to issue a request then go do something else while you wait for the data to come back, there have to be times when you MUST HAVE the data before you can do anything else.
Any help is appreciated.

You can use the async-await syntax: instead of
get(someArgs).subscribe(doStuff);
you could put async in the function declaration and then go as follows
const someVal = await get(someArgs).toPromise();
doStuff(someVal); // this line won't execute until get() has resolved the promise
But, fundamentally, it is turtles all the way down. It's syntactic sugar atop the Promise recipe, and that's what it gets transpiled to (if you transpile it).
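For the original question, here is a minimal sketch of how the service method could look with that approach. It assumes Angular's HttpClient is injected as this.http, that /api/permissions is a hypothetical endpoint, and that the response carries an allowed field - none of those names come from the question, so adjust them to your actual API.
// Inside the authorization service class (sketch only; endpoint and fields are assumed).
async hasPermission(user, resource) {
    // Execution of this method pauses here until the HTTP response arrives.
    const result = await this.http
        .get(`/api/permissions?user=${user}&resource=${resource}`)
        .toPromise();
    return result.allowed === true; // `allowed` is an assumed response field
}
The caller still has to await hasPermission() (or call .then() on it), because the method itself returns a Promise - that is the unavoidable "turtles all the way up" part.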

Related

problems with request and cheerio in dialogflow

What I'm trying to do is use request-promise-native and cheerio in my dialogflow webhook to scrape some articles from a website. I've tried several ways but have never been able to work it out.
My last attempt was doing as suggested in this post, but I could not make it work.
If you want to have a look at my code, here it is the code i wrote, with a bit of explanation: https://github.com/Vaelthur/webscraping-with-dialogflow-incomplete
The problem is in the function registered to the scrpwb intent.
You are calling prova_promise(), which returns a Promise (which is correct!), but your handler does not itself return that Promise. So the handler returns nothing, and the library treats it as complete immediately rather than waiting for the Promise to finish.
The solution is simple - make sure the handler returns the Promise, which you can do with something like
return prova_promise().then((message) => {
with the rest staying exactly the same.
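As a hedged sketch of what that looks like in a dialogflow-fulfillment style intent handler (the agent object and prova_promise come from the linked repo and are assumed here, not shown in full):
function scrpwbHandler(agent) {
    // Returning the Promise tells the fulfillment library to wait for it to
    // settle before sending the response.
    return prova_promise().then((message) => {
        agent.add(message);
    });
}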

Advice on creating asynchronous calls that depend on each other

I am attempting to create a library to make API calls to a web application (JIRA, if you care to know). I have my API calls working no problem, but I am looking to make the code a bit more readable and usable. I have tried searching for my needs, but it turns out I am not sure what I need to be searching for.
I am having an issue with asynchronous calls that depend on each other. I understand that I have to wait until the callback has run before running my next item, but I am not sure of the best way to design this.
I really would like to make chaining a feature of my API, which I would hope to look like this:
createProject(jsonProjectStuff)
    .setLeadUser("myusername")
    .createBoard("boardName")
    .setBoardPermissions(boardPermissionJSONvar)
    .addRole(personRoleJSONvar);
With this example, everything would have to wait on createProject, as it will return the project. createBoard doesn't rely on the project normally, but used in this context it should be "assigned" to the project just made; setting the board permissions only relies on createBoard working; addRole is specific to the project again.
The questions I have are:
Is it possible to switch context like this and keep data in between the calls without having to hard-code running the next function from the response?
If this is possible, is it a good idea? If not, I am open to other schemes.
I can think of a couple of ways to make it work, including registering the function calls with a dependency tree and then fulfilling promises as we go, although that is mostly conceptual for me at this point as I am trying to decide on the best approach.
Edit 2/19/2016
So I have looked into this more and I have decided on a selective "then", used only when creating a new item doesn't relate directly to the parent.
// Numbers are ID, string is Name
copyProject(IDorName)
    .setRoles(JSONItem)
    .setOwner("Project.Owner")
    .setDefaultEmail("noreply#fake.com")
    .then(
        copyBoard(IDorName)
            .setName("Blah blah Name {project.key}"),
        saveFilterAs(IDorName, "Board {project.key}",
            "project = {project.key} ORDER BY Rank ASC")
            .setFilterPermissions({shareValuesJSON})
    )
I like this solution a lot; the only thing I am unsure of is how to handle the string "variables". I suppose it could be "Blah blah Name " + this.project.key, but either way I am unsure how to give copyBoard or saveFilterAs access to it via the "then" function.
Any thoughts?
I've been using Nightmare (a headless browser) lately.
It has a fluent API that uses a nice design pattern.
Calling the API doesn't directly execute the actions; it only queues them, and when you are ready to execute you must call the end function, which returns a promise. The promise is resolved when the queue has completed its async execution.
For example, in your situation
createProject(jsonProjectStuff)
    .setLeadUser("myusername")
    .createBoard("boardName")
    .setBoardPermissions(boardPermissionJSONvar)
    .addRole(personRoleJSONvar)
    .end() // Execute the queue of operations.
    .then(() => {
        // All operations completed.
    })
    .catch(err => {
        // An error occurred.
    });
I feel like this pattern is quite elegant. It allows you to have a fluent API to build a sequence of actions; then, when you are ready to execute those operations, you call end (or whatever you name it). The sequence of operations is then completed asynchronously, and you use the promise to handle completion and errors.
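A minimal sketch of that queue-then-execute idea, assuming a hypothetical api object whose methods (createProject, setLeadUser, ...) return promises - the class and method names here are illustrative, not Nightmare's or JIRA's actual API:
// Queue operations as functions of a shared context; run them only when end() is called.
class JiraBuilder {
    constructor(api) {
        this.api = api;
        this.queue = []; // each entry is a function (ctx) => Promise
    }
    createProject(projectJson) {
        this.queue.push(ctx =>
            this.api.createProject(projectJson).then(project => { ctx.project = project; }));
        return this; // returning `this` is what keeps the API fluent
    }
    setLeadUser(username) {
        this.queue.push(ctx => this.api.setLeadUser(ctx.project, username));
        return this;
    }
    end() {
        // Run the queued operations one after another, threading a shared context through.
        const ctx = {};
        return this.queue
            .reduce((chain, op) => chain.then(() => op(ctx)), Promise.resolve())
            .then(() => ctx);
    }
}
Used as new JiraBuilder(api).createProject(json).setLeadUser("myusername").end().then(ctx => ...), which mirrors the Nightmare-style chain above.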

Javascript, Node, Promises, and recursion

I'm having trouble controlling execution flow. This is a follow-on to node.js, bluebird, poor control of execution path and node.js table search fails with promises in use. Judging by console.log print-outs, my recursive routine works great, except that the first call to resolve() (a signal to the nth recursive call) gives the green light to follow-on code that shouldn't get that green light until the first call to the recursive routine calls resolve(). It turns out the first call to the recursive routine delivers the answer I want reported, but by the time it reports it, the follow-on code is no longer listening for it and is running blissfully along with an "undefined" answer. Bad.
My code is much too long to share here. I tried to write a small model of the problem, but haven't found the combination of factors to replicate the behavior.
Sound familiar? How do you keep proper control over Promises releasing follow-on code on time?
I thought maybe the first call to the routine could start an array passed into a Promise.all and later calls would add another entry to that array. I haven't tried it. Crazy?
Without seeing your actual code, we can't answer specifically.
Sound familiar? How do you keep proper control over Promises releasing follow-on code on time?
The answer is always to not resolve the first promise in the chain until you're ready for things to execute and to structure your promise chain so that dependent things don't get executed until the things they are waiting on have been properly resolved. If something is executing too soon, then you're either calling something too soon or your promise structure is not correct. Without seeing your actual code, we cannot know for sure.
A common mistake is this:
someAsyncOperation().then(someOtherAsync()).then(...)
which should be:
someAsyncOperation().then(someOtherAsync).then(...)
where you should pass a reference to the next async function rather than calling it immediately and passing its return value.
I thought maybe the first call to the routine could start an array passed into a Promise.all and later calls would add another entry to that array. I haven't tried it. Crazy?
You cannot pass an array to Promise.all() and then add things to the array later - that is not a supported capability of Promise.all(). You can chain subsequent things onto the results of Promise.all() or do another Promise.all() that includes the promise from the previous Promise.all() and some more promises.
var someArrayOfPromises = [...];
var pAll = Promise.all(someArrayOfPromises);
var someMorePromises = [...];
someMorePromises.push(pAll);
Promise.all(someMorePromises).then(...);
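For a concrete, runnable illustration of chaining onto a previous Promise.all() (the delays and values here are just placeholders):
const delay = (ms, value) =>
    new Promise(resolve => setTimeout(() => resolve(value), ms));

const firstBatch = Promise.all([delay(100, 'a'), delay(200, 'b')]);

// The already-created Promise.all can be included in a later batch as one more promise.
const secondBatch = Promise.all([firstBatch, delay(50, 'c')]);

secondBatch.then(([firstResults, c]) => {
    console.log(firstResults); // ['a', 'b']
    console.log(c);            // 'c'
});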

How to access JSON Object.$$state.value?

Warning: I'm going to sound like I have no idea what I'm talking about here, because I kind of don't. I'm in the process of self-learning Javascript and AngularJS through a lot of trial and error coding.
I have some javascript code (hesitant to copy it here because it's a mess) that returns an Object with the following structure:
[screenshot in the original post: the returned Object has a $$state property whose value holds the data]
What I want to save to a variable is the object corresponding to Object.$$state.value in the picture. This object has username, hash and salt, which are what I care about. I don't know what all the other stuff like $$state is or how it got there.
However, if I do this (let's call the main Object "whatIHave"):
var whatIWant = whatIHave.$$state.value;
this does not work. whatIWant is null.
Does anyone recognize what's going on here? What is $$state, how did it get there, and how can I extract the value I want?
So that is a promise. You need to do something like:
whatIHave.then(function(whatIWant) {
    // Work here
});
I highly recommend you research what a promise is.
If you're curious about what that $$state is and what that value is, I'll explain a bit:
Promises have a $$state, and there Angular saves all the callback functions you want to call in a pending array (all those functions you registered with .then, as explained above).
It also has the status: resolved (1) and rejected (2).
Finally, when you resolve or reject the promise, the value you pass when doing so is saved in value.
You're trying to cheat here, because when you try to access that value directly, it may not be there yet (that is what async is all about).
So the idea is to learn the basics of promises, learn how to work with them, and then use your whatIHave correctly.
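Putting that together for this case, a hedged sketch (assuming whatIHave is the promise returned by whatever AngularJS call produced it, and that you want to keep the result on the controller's $scope):
whatIHave.then(function (whatIWant) {
    // The resolved value only exists inside this callback.
    console.log(whatIWant.username, whatIWant.hash, whatIWant.salt);
    $scope.user = whatIWant; // hypothetical place to keep it for later use
});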

How to call an asynchronous JavaScript function and block the original caller

I have an interesting situation that my usually clever mind hasn't been able to come up with a solution for :) Here's the situation...
I have a class that has a get() method... this method is called to get stored user preferences... what it does is calls on some underlying provider to actually get the data... as written now, it's calling on a provider that talks cookies... so, get() calls providerGet() let's say, providerGet() returns a value and get() passes it along to the caller. The caller expects a response before it continues its work obviously.
Here's the tricky part... I am now trying to implement a provider that is asynchronous in nature (using local storage in this case)... so, providerGet() would return right away, having dispatched a call to local storage that will, some time later, call a callback function that was passed to it... but, since providerGet() already returned, and so did get() by extension to the original caller, it obviously hasn't returned the actual retrieved data.
So, the question is simply: is there a way to essentially "block" the return from providerGet() until the asynchronous call returns? Note that for my purposes I'm not concerned with the performance implications this might have, I'm just trying to figure out how to make it work.
I don't think there's a way, certainly I know I haven't been able to come up with it... so I wanted to toss it out and see what other people can come up with :)
edit: I'm just learning now that the core of the problem, the fact that the web sql API is asynchronous, may have a solution... turns out there's a synchronous version of the API as well, something I didn't realize... I'm reading through docs now to see how to use it, but that would solve the problem nicely since the only reason providerGet() was written asynchronously at all was to allow for that provider... the code that get() is a part of is my own abstraction layer above various storage providers (cookies, web sql, localStorage, etc) so the lowest common denominator has to win, which means if one is asynchronous they ALL have to be asynchronous... the only one that was is web sql... so if there's a way to do that synchronously my point becomes moot (still an interesting question generically I think though)
edit2: Ah well, no help there it seems... seems like the synchronous version of the API isn't implemented in any browser, and even if it was, it's specified that it can only be used from worker threads, so this doesn't seem like it'd help anyway. Although, reading some other things, it sounds like there's a way to pull off this trick using recursion... I'm throwing together some test code now, I'll post it if/when I get it working; seems like a very interesting way to get around any such situation generically.
edit3: As per my comments below, there's really no way to do exactly what I wanted. The solution I'm going with to solve my immediate problem is to simply not allow usage of web SQL for data storage. It's not the ideal solution, but as that spec is in flux and not widely implemented anyway it's not the end of the world... hopefully when it's properly supported the synchronous version will be available and I can plug in a new provider for it and be good to go. Generically though, there doesn't appear to be any way to pull off this miracle... confirms what I expected was the case, but wish I was wrong this one time :)
Spawn a webworker thread to do the async operation for you.
Pass it the info it needs to do the task, plus a unique id.
The trick is to have it send the result to a webserver when it finishes.
Meanwhile... the function which spawned the webworker sends an ajax request to the same webserver.
Use the synchronous flag of the xmlhttprequest object (yes, it has a synchronous option). Since it will block until the http request is complete, you can just have your webserver script poll the database for updates or whatever until the result has been sent to it.
Ugly, I know. But it would block without hogging CPU :D
Basically:
function get(...) {
    spawnWebworker(...);
    var xhr = sendSynchronousXHR(...);
    return xhr.responseText;
}
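For reference, a hedged sketch of what the sendSynchronousXHR helper above could look like (the URL is whatever endpoint your webserver exposes for polling; error handling omitted):
function sendSynchronousXHR(url) {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', url, false); // third argument false = synchronous; this call blocks
    xhr.send(null);
    return xhr;
}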
No, you can't block until the asynch call finishes. It's that simple.
It sounds like you may already know this, but if you want to use asynchronous ajax calls, then you have to restructure the way your code is used. You cannot just have a .get() method that makes an asynchronous ajax call, blocks until it's complete and returns the result. The design pattern most commonly used in these cases (look at all of Google's javascript APIs that do networking, for example) is to have the caller pass you a completion function. The call to .get() will start the asynchronous operation and then return immediately. When the operation completes, the completion function will be called. The caller must structure their code accordingly.
You simply cannot write straight, sequential procedural javascript code when using asynchronous networking like:
var result = abc.get()
document.write(result);
The most common design pattern is like this:
abc.get(function(result) {
    document.write(result);
});
If your problem is several calling layers deep, then callbacks can be passed along to different levels and invoked when needed.
FYI, newer browsers support the concept of promises which can then be used with async and await to write code that might look like this:
async function someFunc() {
    let result = await abc.get();
    document.write(result);
}
This is still asynchronous. It is still non-blocking. abc.get() must return a promise that resolves to the value result. This code must be inside a function that is declared async and other code outside this function will continue to run (that's what makes this non-blocking). But, you get to write code that "looks" more like blocking code when local to the specific function it's contained within.
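A hedged sketch of how abc.get() could return such a promise by wrapping the callback-based providerGet() from the question (the names follow the question; error handling is omitted):
abc.get = function () {
    return new Promise(function (resolve) {
        // providerGet() invokes its callback when the async storage lookup completes.
        providerGet(function (data) {
            resolve(data);
        });
    });
};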
Why not just have the original caller pass in a callback of its own to get()? This callback would contain the code that relies on the response.
The get() method will forward the callback to providerGet(), which would then invoke it when it invokes its own callback.
The result of the fetch would be passed to the original caller's callback.
function get( arg1, arg2, fn ) {
    // whatever code

    // call providerGet, passing along the callback
    providerGet( fn );
}

function providerGet( fn ) {
    // do async activity

    // in the callback to the async, invoke the callback and pass it the data
    // ...
    fn( received_data );
    // ...
}

get( 'some_arg', 'another_arg', function( data ) {
    alert( data );
});
When your async method starts, I would open some sort of modal dialog (that the user cannot close) telling them that the request is in process. When the request finishes, close the modal in your callback.
One possible way to do this is with jqModal, but that would require you to load jQuery into your project. I'm not sure if that's an option for you or not.
This is ugly, but anyway I think the question is kind of implying an ugly solution is desired...
In your get function, serialize your query into a string.
Open an iframe, passing (A) this serialized query and (B) a random number in querystring to this iframe
Your iframe has some javascript code that reads the SQL query and number from its own querystring
Your iframe asynchronously begins running the query.
When your iframe query is asynchronously finished, it sends it, along with the random number to a server of yours, say to /write.php?rand=###&reslt="blahblahblah"
Write.php saves this info somewhere
Back in your main script, after creating and loading the iframe, you create a synchronous AJAX request to your server, say to /read.php?rand=####
/read.php blocks until the written info is available, then returns it to your main page
Alternately, to avoid sending the data over the network, you could instead have your iframe encode the result into a canvas-generated image that the browser caches (similar to the approach that Zombie cookie reportedly used). Then your blocking script would try to continually load this image over and over again (with some small network-generated delay on each request) until the cached version is available, which you could recognize via some flag that you've set to indicate it's done.
