Java has a join method; what is the Node.js equivalent? [closed] - javascript

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 4 years ago.
As we know, Java has a join method for multi-threading, and I want to achieve the same in Node.js. Is there any equivalent to Java's join method in Node.js?
Below is my requirement:
I have three methods (m1(), m2(), m3()) that each calculate a large amount of data and return a number. I also have a sum() method that adds the three numbers returned by those methods. Since each of the three methods takes about 10 minutes to run, my sum() method has to wait roughly 30 minutes before it can start. So I want to execute m1(), m2(), and m3() in parallel, with the condition that sum() must start executing only after all three methods have finished, adding the data they return.

Node.js does not have the concept of multi-threading, since JavaScript by default runs single-threaded. Hence you don't have something such as join. Instead, you need to get to know concepts such as the event loop, callbacks, promises & co.
(To be fair, Node.js does now have worker threads, which enable multi-threading, but they were still experimental at the time of writing.)

All Node.js applications use a single-threaded, event-loop architecture to handle multiple concurrent clients: Node.js operates asynchronously, with the event loop dispatching work as it completes.
Java's join method lets one thread wait until another thread has finished executing. In JavaScript, the same effect is achieved with promises or async/await.
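In promise terms, the closest analogue to starting three threads and join-ing them all is Promise.all, which waits until every promise in a list has resolved. A minimal sketch, where m1, m2, and m3 stand in for the asker's long-running calculations (simulated here with short timers):

```javascript
// m1, m2, m3 are stand-ins for the asker's calculations; each returns a
// Promise that resolves with a number (here after a simulated delay).
function m1() { return new Promise(resolve => setTimeout(() => resolve(10), 50)); }
function m2() { return new Promise(resolve => setTimeout(() => resolve(20), 50)); }
function m3() { return new Promise(resolve => setTimeout(() => resolve(30), 50)); }

async function sum() {
  // Promise.all starts all three concurrently and, like join, waits
  // until every one of them has completed before continuing.
  const [a, b, c] = await Promise.all([m1(), m2(), m3()]);
  return a + b + c;
}

sum().then(total => console.log(total)); // 60
```

One caveat: Promise.all gives you concurrency for asynchronous work (I/O, network), not parallelism for CPU-bound JavaScript. If the three calculations are pure computation, each would still need to run in its own worker thread to use multiple cores.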

Related

Should a Node.js library support both promises and callbacks? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 3 years ago.
I'm working on a Node.js library project that exposes asynchronous APIs to the user. I haven't worked in JavaScript for long, but it seems that several years ago callbacks were the standard way to handle asynchronous results and errors. Then promises were introduced in ES6, and async/await in ES8 (which is essentially a more convenient way to use promises).
My question is what would a typical user of the API currently expect? I noticed an idiom in several projects where both promises and callbacks are supported, in the way that a function takes a callback as a last argument and if it is not provided, returns a Promise.
I'm wondering whether this idiom should be followed in current projects, or whether callbacks should be removed altogether. Would anybody still use callbacks?
I noticed that async-await is becoming the dominant style but Node.js API itself is still callback-based (in fact the APIs usually return an event emitter, not a promise, so that user can register events on the return value).
So I'm just looking for general input from Node.js community about which direction to take.
Thanks.
Edit:
Thank you for the answers. I was asked to clarify a couple of things about the project in case there are more recommendations:
(1) The API is intended to be used to access a specific product
(2) All of the calls involve network communication, basically request/response.
We definitely intend to support promises (each request gives the user a promise that can be fulfilled or rejected); the only question is whether callbacks should still be supported as well.
Perhaps I can rephrase the questions as to be less opinion based:
1) For Node.js libraries released recently (after async/await became supported in the language) and having requirements similar to the above, what is the predominant style? Could you please point me to any examples?
2) Is there any scenario in which an application would prefer handling the asynchronous result/error via a callback instead of a promise?
Thanks.
A Range Of Possibilities
If your library requires a tightly sequenced relationship with the client program (e.g. programs that are I/O-dependent), you might want to offer callbacks. On the other hand, if your API has long and inconsistent response times, you definitely want to support promises so you are not bottlenecking applications that can't afford to wait on every request (routing web traffic, etc.).
Conclusion
In my personal opinion, what makes Node.js great is the asynchronous nature of promises that let you write programs that are flexible. At the end of the day, people are going to be using your library for a diverse range of purposes and the more options you can give your users the better. Good luck!
How big is your intended user base?
If it’s small - i.e. contained within an easily defined group with a modern tech stack - play the dictator and choose just one style.
If it’s large and has to run on many, possibly legacy platforms, consider just offering callbacks, or offer multiple flavours.
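The callback-or-promise idiom the question describes can be sketched like this (fetchData is a hypothetical API function, with the network call simulated; the pattern, not the function, is the point):

```javascript
// Dual-style idiom: the function takes an optional Node-style callback as
// its last argument; when the callback is omitted it returns a Promise.
function fetchData(id, callback) {
  const promise = new Promise((resolve, reject) => {
    // Simulated async request/response; a real library would do I/O here.
    setImmediate(() => {
      if (typeof id !== 'number') reject(new Error('bad id'));
      else resolve({ id, value: 'data for ' + id });
    });
  });

  if (typeof callback === 'function') {
    // Callback mode: route both fulfilment and rejection through (err, result).
    promise.then(result => callback(null, result), err => callback(err));
    return undefined;
  }
  return promise; // Promise mode
}

// Promise style:
fetchData(1).then(r => console.log(r.value));
// Callback style:
fetchData(2, (err, r) => { if (!err) console.log(r.value); });
```

Note that Node's util.promisify goes in the other direction, wrapping an existing callback-style function so that it returns a promise, which is one reason some libraries still expose callbacks as the base API.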

Returning vs. not returning a promise in a Firebase cloud function [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 5 years ago.
I'm creating a Firebase cloud function that routes events to another system by making http calls for every event.
I noticed that if I don't return the promise, the HTTP call still goes through most of the time (I can't be 100% sure). I don't care about the response.
Execution time decreases substantially if I don't return it. (155ms vs 13ms)
Does anyone know if a non returned promise is guaranteed to execute?
If your function does not return a Promise, it may be killed prematurely by Cloud Functions.
Also you might come across something like this in your console:
Function execution took 60023 ms, finished with status: 'timeout'
This happens when a function does not return a Promise to Cloud Functions.
All types of functions except HTTPS type functions require that you return a promise that becomes resolved when asynchronous work is complete. If you don't do this, there is no guarantee that your work will complete, because the Cloud Functions runtime could clean up your function before the work is done.
It doesn't matter if you care about the response or result of the work, you should still be waiting until it's complete before allowing your function to terminate.
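As a sketch of why returning matters: runtime below is a hypothetical stand-in for the Cloud Functions host, not the real API, but it models the same behaviour - the platform only waits for work that your handler returns as a promise.

```javascript
// Mock of the hosting runtime: it awaits whatever the handler returns,
// then considers the invocation finished (and may tear the instance down).
async function runtime(handler) {
  await handler();
}

let completed = false;

// Stand-in for the asker's outbound HTTP call.
function httpCall() {
  return new Promise(resolve => setTimeout(() => { completed = true; resolve(); }, 20));
}

// GOOD: the promise is returned, so the runtime waits for the work.
const good = () => httpCall();

// BAD: fire-and-forget; the runtime sees the handler finish immediately
// and may kill the instance before httpCall() ever completes.
const bad = () => { httpCall(); };

runtime(good).then(() => console.log('completed =', completed)); // prints: completed = true
```

With bad, whether the call finishes depends on whether the instance happens to stay alive, which matches the "works most of the time" behaviour the asker observed.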

Why are modern browser JS engines multi-threaded? [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 6 years ago.
I understand modern browsers' JS engines (like V8, SpiderMonkey, Chakra, etc.) use thread pools internally, even though only a single thread (running the event loop) is exposed to a JS programmer.
Obviously, the (rarely used) Web Workers require multiple threads (or multiple processes) - otherwise they couldn't utilize multiple CPU cores. My question is, apart from Web Workers, what is the benefit of implementing JS engine with multiple threads?
Why couldn't JS engine remain always single-threaded by internally relying on the same event-loop that the JS programmers use, using non-blocking OS calls whenever it needs to do any IO?
To clarify: JS engine uses a thread pool even if the user opened just one window with just one tab.
There are many parts of a script engine that benefit from parallelisation, as they can run concurrently for different parts of the script or in relation to each other:
parsing
compilation
JIT, optimisation
debugging/logging/profiling
garbage collection
graphics
And that doesn't even involve sharing between multiple instances of the engine for different usage environments (worker scripts, browsing contexts).

Delay a jQuery function without a timer [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 6 years ago.
I have seen many Stack Overflow and Google answers where the suggestion for delaying a function is a timer. But is there a way to delay a function without using a timer? Alternatively, is there a way to detect when an event (e.g. onclick or onblur) has finished, and then call a function lazily, without using a timer?
The only way to delay execution of a function is to do so as a callback in response to something. I believe this pretty much always means you need to have a trigger in the native implementation, as JavaScript is single threaded and will run until it runs out of code to execute.
Typical things that trigger a function call from native code are
Timers firing
Events firing
Async operations completing (e.g. HTTP requests)
Out of these the timer is the only practical one to execute a function later without interactions being required by the user.
So no, not really, is the short answer.
You can read more about the JavaScript Event Loop at the Mozilla Docs

Old codebase in CodeIgniter 2.1.4 blocking sessions & AJAX [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 6 years ago.
My sessions are working fine; I have a complex custom user-permissions system using TankAuth. I started this project as a small one, and it has turned into something monstrous. Most of the data is fetched asynchronously via a JavaScript frontend. It's basically an API now, with some exceptions.
My question is in regard to sessions. I understand that CI 3 (or some later version) changed the session library to allow multiple AJAX calls not to block each other. I have noticed that while running multiple AJAX requests, my application fetches all the results together after a delay. I'm convinced this is due to session blocking, but I am wary of attempting a fix due to security concerns.
How do I stop AJAX calls from blocking each other's sessions without risking security?
First off, I believe you're using the word "block" for two different things ...
Here:
I understand that CI 3 (or later version somewhere) changed the session library to allow multiple ajax calls not to block each other.
And here:
I have noticed that whilst running multiple ajax requests my application fetches all together after a delayed time. I'm convinced this is due to blocking of sessions
CodeIgniter 3 didn't just change the Session library - it replaced it with a completely new one, and one of the reasons why was for multiple requests not to interfere with each other (the first quote).
However, the way to achieve that is to use locking (or what you call "blocking" in the second quote and in your question). And you can't avoid this.
What you can do is call session_write_close() in your requests as soon as they no longer need to modify the $_SESSION array - that frees the lock and closes the session for the current request, while still preserving the $_SESSION contents for reading.
