How to know when batch job in Apex has finished in JavaScript? - javascript

I have run into a bit of an issue in my Apex controller. I loop through events between a start and end date (the maximum range is one month), which can mean thousands of events. I use a SOQL for loop so the events are chunked into batches of 200, then loop through each event in the chunk to build custom event objects to return to my Visualforce page.
However, inside that second loop I need more loops to find invitees and so on, so the runtime is roughly O(n^3). In practice the first two loops act as one (get a chunk of 200, loop through it, get the next chunk), so it is closer to O(n^2), but in a test org with 2,777 events in one month I still hit the CPU governor limit.
I want to move this into a batch job, as I think that is the only way to handle this many events (I cannot reduce my loops any further).
I also want the Lightning loading spinner to run until the batch job finishes. However, I am not sure how to communicate between the batch job's finish() method and the JavaScript in my Visualforce page. The plan is to call a controller method via a remote action and show the spinner; that method starts the batch job; when the batch job ends, the spinner stops and the page refreshes with the data.
In short, I don't know how to connect finish() to my JavaScript so it can detect when the batch job has finished.

So you have a VF page that you want to react to a batch job that could take an arbitrary amount of time. If you want that page to update when the job completes, I would recommend looking into the Streaming API, though I'm not sure even that solves your situation. Batch jobs are asynchronous by nature, so I don't think your requirements are realistic as stated. The bigger question is what you are actually trying to solve for; if the requirement is a page that updates synchronously off an async job, that isn't realistic.

You can check a Salesforce batch status by providing the job ID.
Below is a REST example:
curl https://instance.salesforce.com/services/async/39.0/job/jobId/batch/batchId -H "X-SFDC-Session: sessionId"
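From the Visualforce side, one common pattern (a sketch, not a confirmed solution for your org) is to start the job via one remote action and then poll a second remote action that reads AsyncApexJob.Status until it reaches a terminal state, at which point you hide the spinner and refresh. The generic polling logic might look like this, where checkStatus is a hypothetical wrapper around that status-checking remote action:

```javascript
// Hypothetical sketch: poll a status check until the batch job reports a
// terminal state, then fire a completion callback (e.g. hide the spinner
// and refresh the data). `checkStatus` stands in for a @RemoteAction that
// queries AsyncApexJob.Status by job ID; the names are assumptions.
function pollBatchStatus(checkStatus, onDone, intervalMs) {
  return new Promise(function (resolve, reject) {
    function tick() {
      checkStatus().then(function (status) {
        if (status === 'Completed' || status === 'Failed' || status === 'Aborted') {
          onDone(status);               // e.g. hide spinner, refresh page data
          resolve(status);
        } else {
          setTimeout(tick, intervalMs); // still running: poll again later
        }
      }, reject);
    }
    tick();
  });
}
```

Note that finish() never talks to the browser directly; polling like this (or the Streaming API mentioned above) is what closes the loop between the async job and your page.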

Related

WebSocket Queuing regular messages vs callbacks to complete request processing before continuing

I have an odd scenario, or at least one that has challenged my JavaScript (or any language, for that matter) more than usual. The pseudocode/logic is what's tripping me up.
I want to run a queue with a callback bypass over a WebSocket: when a regular message comes in, I process messages one by one, but if it's a callback I let it run immediately every time.
The issue is that I store the messages in a queue and am unsure how to write my nextMessage logic so that it runs queued items, clears a callback message straight away, and doesn't lose ordering.
My code is being modified as I go, so posting it is tough.
But say I wanted to let regular messages that can have an instant response run, while giving callback messages maximum priority, without breaking the order.
I'm dealing with the issue that if too many messages route at once, I may dispatch two or more messages and bring things out of sync. So if anyone who writes games or does stateless operations has a solid queuing method for WebSockets, I'm willing to hear it out.
I'll keep working off a MessageHandle/NextMessage function, but I can see I can't just call next every time and may need to rethink my setup a bit.
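For what it's worth, the "callback bypass" described above could be sketched roughly like this: regular messages drain one at a time in FIFO order, while messages flagged as callbacks are handled immediately. All names here (MessageQueue, isCallback, the handle function) are assumptions, not the actual code:

```javascript
// Sketch of a FIFO queue with a callback bypass. Regular messages are
// processed one at a time; messages flagged `isCallback` skip the queue
// and run immediately, so queued ordering is never disturbed.
class MessageQueue {
  constructor(handle) {
    this.handle = handle; // processes a single message
    this.queue = [];
    this.busy = false;
  }
  push(msg) {
    if (msg.isCallback) {
      this.handle(msg);   // callbacks bypass the queue entirely
      return;
    }
    this.queue.push(msg);
    this.drain();
  }
  drain() {
    if (this.busy || this.queue.length === 0) return;
    this.busy = true;
    const msg = this.queue.shift();
    // handle may be sync or return a promise; either way, only start the
    // next queued message once this one has finished
    Promise.resolve(this.handle(msg)).then(() => {
      this.busy = false;
      this.drain();
    });
  }
}
```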

React JS - Best way to have continuous results for every keystroke using REST calls to the server?

In short, we have a massive database and need to provide results as the user types in the search box. It is impossible for us to preload queries and attempt to match that way.
Currently we send a request to the server with the new query string every 2 characters or every 5 seconds. This is a bit of a mess, however, so I'm looking to see if there is a faster/better way of doing this.
Previous solutions I've seen require pre-fetching, which in our case is not possible and, considering the size of the results, too costly.
I would recommend using debounce for this. It makes the function wait a certain amount of time after being called before running; additional calls reset the timer. That way, the function will not run until the user has finished (or paused) typing.
This will prevent unnecessary load on your database while still providing a good user experience (as long as you choose a reasonable debounce time). Examples of how to debounce in React can be found here
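A minimal debounce can be written in a few lines; this sketch assumes a plain function wrapper (libraries such as Lodash provide a production-ready _.debounce), and fetchResults is a placeholder for the real search call:

```javascript
// Minimal debounce sketch: `fn` runs only after `waitMs` ms have passed with
// no further calls; each new call resets the timer, so a burst of keystrokes
// produces a single request for the final query string.
function debounce(fn, waitMs) {
  let timer = null;
  return function () {
    const args = arguments;
    clearTimeout(timer);
    timer = setTimeout(function () { fn.apply(null, args); }, waitMs);
  };
}

// Hypothetical usage in a search box (fetchResults is an assumption):
// const onType = debounce(function (q) { fetchResults(q); }, 300);
// input.addEventListener('input', function (e) { onType(e.target.value); });
```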

Best Server API and Client Side Javascript Interaction Methods?

Currently, I'm using setTimeout() to pause a for loop over a huge list so that I can add some styling to the page. For instance:
Eg: http://imdbnator.com/process?id=wtf&redirect=false
What I use setTimeout() for:
I use setTimeout() to add images, text and a CSS progress bar ("Why doesn't the progress bar change dynamically, unlike the text?").
Clearly, as you can see, it is quite painful for a user to just browse the page and hover over a few images; it gets extremely laggy. Is there any workaround for this?
My for loop:
Each iteration of the for loop makes an AJAX request in the background to a PHP API. That certainly costs me some efficiency, but how do other websites pull this off so elegantly? I've seen websites show a nice loading image, with no interference to the user, while making an API request; whenever I try something similar, I have to set a timeout every time.
Is it that they use better server-client interaction technologies, like the Node.js I've heard about?
Also, I've thought of a few alternatives but run into other complications. I would greatly appreciate help with each of these possible alternatives.
Method 1:
Instead of making an AJAX call to my PHP API through jQuery, I could do everything in a server-side script. But then the problem is that I cannot build a good client-side page (like my current one) that updates the progress bar and adds images dynamically as each item in the list is processed. Or is this possible?
Method 2: (edited)
As one of the useful answers below points out, I think the biggest problem is the interaction between the server API and the client. WebSockets, as suggested there, look promising to me. Will they necessarily be a better fix than setTimeout? Is there a significant time difference if, say, I replace my current 1000 AJAX requests with a WebSocket?
Also, I would appreciate hearing about anything other than WebSockets that is better than an AJAX call.
How do professional websites achieve such fluid server and client side interaction?
Edit 1: Please explain how professional websites (such as http://www.cleartrip.com when you request flight details) provide a smooth client side while processing on the server side.
Edit 2: As @Syd suggested, that is something I'm looking for. I think there is a lot of delay in my current client-server interaction, and WebSockets seem to be a fix for that. What are the other/best ways of improving server-client interaction apart from standard AJAX?
Your first link doesn't work for me, but I'll try to explain a couple of things that might help, if I understand your overall problem.
First of all, it is bad to make synchronous calls involving large amounts of data that require processing in your main UI thread, because the user experience may suffer a lot. For reference you might want to take a look at "Is it feasible to do an AJAX request from a Web Worker?"
If I understand correctly, you want to load some data on demand based on an event.
Here you might want to sit back and think about what the best event for your need is; it makes a real difference whether you fire an AJAX request on every event, especially when you have a lot of traffic. Also, check whether your previous request has completed before you initialize the next one (this might not be needed in some cases, though). Have a look at async.js if you want to chain asynchronous code without facing the JavaScript "pyramid of doom" effect and messy code.
Moreover, you might want to validate (and possibly halt) the event before making the actual request. For example, suppose a user triggers a "mouseenter": you should not just fire an AJAX call. Hold your breath, use setTimeout, and check that the user didn't fire another "mouseenter" for the next 250 ms; this allows your server to breathe. The same applies to implementations that load content on scroll: you should not fire an event if the user scrolls like a maniac. So validate the events.
As for loops and iterations: we all know that if a loop is too long and does heavy lifting, you may experience unwanted results. To overcome this you might want to look into timed loops (take a look at the snippet below), i.e. loops that break after x amount of time and continue after a while. Here are some references that helped me with a three.js project: "optimizing-three-dot-js-performance-simulating-tens-of-thousands-of-independent-moving-objects" and "Timed array processing in JavaScript".
//Copyright 2009 Nicholas C. Zakas. All rights reserved.
//MIT Licensed
function timedChunk(items, process, context, callback){
    var todo = items.concat(); //create a clone of the original

    setTimeout(function chunk(){
        var start = +new Date();

        // process items until roughly 50 ms have elapsed, then yield
        do {
            process.call(context, todo.shift());
        } while (todo.length > 0 && (+new Date() - start < 50));

        if (todo.length > 0){
            // resume after 25 ms; the named function expression replaces
            // the deprecated arguments.callee from the original snippet
            setTimeout(chunk, 25);
        } else {
            callback(items);
        }
    }, 25);
}
cleartrip.com probably uses some of these techniques. From what I've seen, it fetches a chunk of data when you visit the page and then fetches further chunks as you scroll. The trick is to fire the request a little before the user reaches the bottom of the page, in order to provide a smooth experience. Regarding the filters on the left side, they only filter data that is already in the browser; no further requests are made. So you fetch once and keep something like a cache (in other scenarios, though, caching might be unwanted, e.g. for live data feeds).
Finally, if you are interested in further reading and smaller overhead in data transactions, you might want to take a look at "WebSockets".
You must use async AJAX calls. Right now, user interaction is blocked while the HTTP AJAX request is in flight.
Q: "how professional websites (such as cleartrip.com) provide a smooth client side while processing the server side."
A: By using async AJAX calls

Long load times on website when running R script

I'm attempting to query a MySQL database from a webpage. Within my R script I have 4 different "query" functions, along with multiple calculations that display statistical graphs on my webpage, all dependent on an "N" variable. I'm using PHP (via shell_exec) to call R and pass "N", and the RMySQL and ggplot2 libraries in R.
Running my R script with just 1 basic query function (including dbConnect(), dbGetQuery() and on.exit(dbDisconnect())), then using png(), plot() and dev.off(), takes ~15 seconds to display the graph on my website.
With 2 functions and 2 plots, I haven't even had the patience to wait it out to see if it works, the load time is so long. The queries themselves are rather lengthy (they could probably be made simpler through looping), but I've tested that they work in MySQL, and I'm not sure how to avoid loop errors with SQL.
Could the long loading time be due to having dbConnect/dbDisconnect in each individual function? Should I only do this once in the script (i.e. create a single "connect" function and call the other functions from there)?
Is it the fact that I'm running multiple lengthy query requests? If so, would it be better to split each "query function" into an individual R script, shell_exec each one, and let the user select which graphs to display (i.e. checkboxes in HTML/PHP that trigger each desired script/graph)?
Through testing, I know that my logic is there, but I might be totally missing something. I would like to speed up the process so the website user doesn't have to stare at a loading screen forever and I can actually get some tangible results.
Sorry for the lengthy request, I appreciate any help you can give! If you'd like to see the webpage or any of my code to get a better idea, I can upload that and share.
Thanks!
EDIT: It should also be noted that I'm using a while loop (x < 100) for some of the calculations; I know loops in R are typically expensive, but the whole vectorization thing is over my head.
Your requests are probably very demanding and cannot be executed synchronously. You could instead use a queue system: when a request is made, it is sent to a queue, and the results are produced asynchronously when the server is ready. In the meantime, you can redirect the user to another page and notify them when the results are available.
Here are some suggestions:
PHP Native
PHP Framework
GitHub

JavaScript custom event handler strategy advice

I am in the middle of designing/developing a web store and am thinking through the best way to handle a transparent load of a couple of megabytes of product items. It seems the "asynchronous" bit of AJAX doesn't mean parallel, so I have to be a little bit creative here.
Rather than just pull down one large lump of data, I was thinking of breaking it into pages of, say, 50-100 items and allowing the browser some time to process any internal messages.
The loader would pull down a page of data, then fire a custom event to itself to get the next page. The theory is that if the browser has other messages to process, this event would queue up behind them, allowing the browser to do anything else it has to do. A loss of a bit of speed, but a smoother user experience.
Rinse and repeat.
Add in some smoke-and-mirrors engineering (a loading icon or some such) to keep the user from noticing any delays, and I should be right.
Before I dive into what is starting to sound like a fun bit of code, can anyone think of a better way to pull down a large lump of data as smoothly and politely as possible? I am an ancient old programmer, but JavaScript is a bit new to me.
Am I reinventing the wheel? Does AJAX already do all this and I just don't know about it?
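The page-by-page loader described above could be sketched with promises rather than custom events; fetchPage and render below are placeholders for whatever the real store code uses, not an existing API:

```javascript
// A sketch of the paged-loading idea: fetch one page of items, render it,
// then yield to the browser via a zero-ms timeout before asking for the
// next page. `fetchPage` and `render` are assumptions standing in for the
// real application code.
async function loadAllPages(fetchPage, render, pageSize) {
  let page = 0;
  for (;;) {
    const items = await fetchPage(page, pageSize);
    if (items.length === 0) break; // no more data
    render(items);
    // yield so queued UI messages (clicks, repaints) can run between pages
    await new Promise(function (resolve) { setTimeout(resolve, 0); });
    page += 1;
  }
  return page; // number of pages loaded
}
```

The zero-ms timeout plays the same role as the self-addressed custom event: it pushes the next fetch to the back of the browser's message queue instead of hogging the thread.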
There are two ways to improve the situation:
a) Reduce the data coming from the database. If some information is never used, you don't need to load it. Also, if there is data that never changes, you can cache it and request it only once at the beginning.
b) Load only the information you need to show. That's the approach you're thinking of, except that you want to trigger loading of new data automatically, or at least that's what I understood. I'd suggest keeping the AJAX requests to as few as possible and making a new one only when the user needs more data. For example, if the user stays on page 1 of 20, you don't need to load pages 3 and 4. It may be a good idea to preload page 2, though, so the user can switch quickly.
