I am trying to create a test automation results dashboard using a JS library called Chart.js. I would like to display a bar graph showing the total number of tests passed and failed over time.
To do this, I have done the following things:
Created a dash_proj.html file in which I include Chart.js and in this file I will actually be drawing the graph onto the canvas.
Created a .php file in which I use a PDO connection to query a local database copy on my machine (testing locally for now through localhost).
In that same .php file, I display the results as text in the browser to ensure I have grabbed the appropriate data.
Now, I am getting confused as to the proper flow of things from here. From what I have read, the next step should be to use JavaScript to make an AJAX call and tell it which PHP file to look at (the one running the MySQL query), and that will return the data in real time (no screen refresh). Within the HTML file, I should wait for the JavaScript to return that info, and the final step will be actually drawing the graph.
Where does jQuery come into play? And can I just put my AJAX calls inside my php file which makes the query to the database?
I was thinking of testing to make sure the AJAX call is working by first inserting dummy data into the database and checking to see if the results appear in real time on my PHP page through localhost. The next step would be to store all of the data from my $query->fetch() into two different arrays (one for tests passed and one for tests failed), then somehow access those arrays from my HTML file, which calls Chart.js, and pass that data into the bar-graph drawing function?
You don't have to use jQuery. This JavaScript library contains a number of functions to simplify making AJAX calls and accessing the DOM, though some would argue that the convergence of browser APIs makes it less necessary these days. Nevertheless, it remains popular.
Your first task is probably to fire off an AJAX operation upon page load. You can start off by adding this JavaScript directly to the page, though you'll probably want to add it as a minified asset once you have your logic working.
function ajax() {
    // #todo Add your ajax logic in here
}

// Load the AJAX data into the chart as soon as the DOM is ready.
// (jQuery 3 removed support for the $(document).on('ready', ...) form,
// so use the shorthand ready handler instead.)
$(function () {
    ajax();
});
It is common to do read operations using a GET request returning JSON, for which jQuery's getJSON would work fine. Add this logic in place of the #todo comment above.
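As an illustration only (not your exact code), here is a minimal sketch of that getJSON logic feeding Chart.js. The results.php endpoint, the JSON shape, and the canvas ID are all assumptions you would adapt to your own files; it also assumes the Chart.js 2+ API:

// Minimal sketch -- results.php and the JSON shape below are hypothetical.
// Expected response: { "labels": [...], "passed": [...], "failed": [...] }
var chart = null;

function ajax() {
    $.getJSON('results.php', function (data) {
        if (chart) {
            chart.destroy(); // throw away the previous render on refresh
        }
        // Assumes a <canvas id="chart"></canvas> element in dash_proj.html
        chart = new Chart(document.getElementById('chart'), {
            type: 'bar',
            data: {
                labels: data.labels,
                datasets: [
                    { label: 'Passed', data: data.passed },
                    { label: 'Failed', data: data.failed }
                ]
            }
        });
    });
}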
After that, you'll probably want to do a periodic refresh of your data, say every 60 seconds. You can do that like this (note that setInterval takes the callback first and the delay second):
setInterval(ajax, 60 * 1000);
Note the interval timer works on milliseconds, hence the need to multiply by 1000.
One downside of the above is that if you expect a large number of users, or wish to reduce the interval to a very small value, your web server will be processing a lot of redundant requests (since most calls will result in no screen change). Polling over AJAX is therefore not very scalable.
A better approach is to configure the server to push updates to browsers using Web Sockets. However, this requires a separate kind of web server, and so I probably would not recommend it for you just yet.
I currently face the following issue:
After a user has uploaded his images, all images are processed through a script that optimizes every image (compresses it and removes EXIF data).
I've got everything working; the only problem is that the process takes quite some time. I want to notify the user of the job status, e.g. a percentage of the processed images.
Currently, the user has to wait without knowing what's up in the back-end. What is the best way to accomplish this? I've thought about AJAX-calls, but I honestly have no idea where to start with implementing this, also because it looks like I need multiple calls (kinda like a heartbeat call on the processing job).
The application I am developing in is a Laravel application, I've made an API controller which handles incoming files via AJAX calls.
Any help is appreciated, thanks.
Laravel has Broadcasting for this. It uses WebSockets, Redis, or Pusher to send events to the client.
This way you can send the client a message when the processing is done, without them having to refresh the page all the time.
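As a rough client-side sketch (the channel name, event name, and progress property here are hypothetical; they must match whatever your broadcast event actually defines server-side):

// Sketch using Laravel Echo -- 'image-optimization' and 'ImagesProcessed'
// are hypothetical names; use whatever your broadcast event defines.
Echo.channel('image-optimization')
    .listen('ImagesProcessed', function (e) {
        // e carries the event's public properties, e.g. a progress percentage
        document.getElementById('status').textContent = e.percentage + '% done';
    });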
You'd be better off reading about the principle of how it's done, for example: Progress bar AJAX and PHP
Essentially the way it's done is that the job (processing images in your case) happens on the server through PHP. Your script will need to produce some sort of output to show how far through it is, e.g. echo some value for the percentage progress. The PHP script itself is responsible for producing this output, so you must work out how to calculate it and then code that in. It could be that it takes the number of images to be processed into account, and when each one is successfully processed, it adds 1 to a counter. When the counter equals the number of images, 100% done, or possibly some error messages if something went wrong.
On the frontend you could have an AJAX script which reads the output from the PHP script. This in turn could update a progress bar, or a div with some sort of percentage message - the value coming from your PHP script.
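For example, a minimal polling sketch using jQuery (progress.php and its response shape are assumptions; your endpoint and markup may differ):

// Poll a hypothetical progress.php endpoint until the job reports completion.
// Assumes the endpoint echoes JSON like { "done": 42, "total": 120 }.
function pollProgress() {
    $.getJSON('progress.php', function (status) {
        var percent = Math.round(status.done / status.total * 100);
        $('#progress-bar').css('width', percent + '%').text(percent + '%');
        if (status.done < status.total) {
            setTimeout(pollProgress, 2000); // check again in 2 seconds
        }
    });
}
pollProgress();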
Laravel - and other frameworks - have built-in methods to help. But you'd be better off understanding the principles of how it works, such as on the link I posted.
I've developed a web application using the concept of a Single Page Application, but without any of the modern techs and frameworks.
So I have a jQuery page that dynamically requests data from localhost - a Laravel instance that compiles the entries in the DB (within a given time interval).
So the client wants to see all the entries for last week, and the app works fine. But if he wants to see the results for the whole last month... well, there are so many that the default PHP execution time (30 seconds) isn't enough to process all the data. I can easily override this, of course, but then the jQuery client will loop through these arrays of objects and do stuff with them (sort, find, sum...). So I'm not even sure jQuery can handle that much data.
So my question can be broken in two:
Can Laravel's ->paginate() be used so jQuery's AJAX requests can also fetch the data in chunks? How does this work (hopefully in a way that doesn't force me to rewrite all the code)?
How could I store large amounts of information on the client? It's only temporary, but the users will hang around for a considerable amount of time on my webpage, and I don't want them to wait 5 minutes every time they press a button.
Thanks.
If you want to provide an interface to a large amount of data stored in a backend, you should paginate the data. This is a standard approach, so I'm sure your client will be OK with that.
Using pagination is pretty simple - see the docs for Laravel 5.0 here: http://laravel.com/docs/5.0/pagination
In order to paginate results in the backend, you need to call paginate($perPage) on your query instead of get() in your controller, like this:
$users = User::whereIsActive(true)->paginate(15);
This will return a paginated result with 15 records per page. The page number will be taken from the page parameter of the request. In order to get the 3rd page of users, you'll need your frontend jQuery app to send a request to a URL like:
/users?page=3
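On the jQuery side, fetching and rendering a page might look like the sketch below. Laravel's paginator serializes to JSON with the records under data plus metadata such as current_page and last_page; the renderRows helper and the #load-more button are hypothetical stand-ins for your own UI:

// Sketch: request one page at a time from the paginated endpoint.
function loadPage(page) {
    $.getJSON('/users', { page: page }, function (result) {
        renderRows(result.data); // hypothetical rendering helper
        if (result.current_page < result.last_page) {
            $('#load-more').show(); // more pages remain to be fetched
        }
    });
}
loadPage(3); // fetches /users?page=3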
I don't recommend caching data in the frontend application. The data can be changed by some other user and you won't even know about it. And with pagination, your requests should be lightweight enough that you can stop worrying about the request sent to fetch each page of results.
Not sure if you're subscribed to Laracasts, but Jeffrey Way is amazing at explaining features of Laravel and I highly recommend his videos.
In short, you can paginate the results, then in the view, when you call the foreach on your items, you can array_chunk() the results to display them how you need to. But the paginated results are going to be fetched using a query string in the URL, and I'm not sure that is what you want if you're already using a lot of jQuery to keep everything on the same page.
https://laracasts.com/lessons/crazy-simple-pagination
But assuming you're already paginating the results with whatever jQuery you've already written for the JSON data...
You could also use query scopes to fetch only the data you need for a given time interval, creating a simple API to use with AJAX. I think that's probably what you're looking for.
So here's what I would do, assuming you're already doing some pagination manually with your JavaScript:
- Create a few query scopes to filter the data for different lengths of time
- Create simple routes to fetch results from a URI using the query scopes
- Get the JSON data from those routes by performing AJAX requests to the URIs created (sketched below)
More information on Query Scopes: http://laravel.com/docs/5.1/eloquent#query-scopes
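Client side, consuming such a route could be as simple as the sketch below (the /entries/{range} route, the data-range attribute, and drawTable are all hypothetical):

// Sketch: each button asks the server for a pre-filtered result set
// instead of pulling everything down and filtering it in jQuery.
$('.range-button').on('click', function () {
    var range = $(this).data('range'); // e.g. "week" or "month"
    $.getJSON('/entries/' + range, function (entries) {
        drawTable(entries); // hypothetical rendering function
    });
});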
I have a routing file. When a user goes to site.com/page, my route makes a call to an SQL DB and then parses the results and returns them as JSON. Then I use
res.render('route/to/view', {data: result, moredata: resultTwo})
which sends the data to the view. The problem is that my data is relatively large and takes forever to send to the view. I am 100% sure this is what is making my page run slowly. When I cut the dataset in half, it displays much quicker. I am also aware that the actual showing of this data is also a factor, but I am strictly concerned with the speed at which it is passed from the routing to the view.
Is it any more efficient to pass a bunch of small chunks rather than one large chunk? Is the only way around this to do gradual passing of small chunks?
Passing the data to the view isn't very slow -- what's likely happening is that the template engine is just taking a while to render the data you've provided. The data itself never leaves memory, so there's no 'copy' operation going on.
The best way to speed stuff like this up is to use something like AJAX or websockets.
Here's a typical flow:
1) Make a DB request to grab a small number of the total items (let's say 10).
2) Pass those to your view and render it immediately to the user.
3) Have some AJAX code running in the view that then sends a GET request to your server asynchronously, requesting the rest of the items.
4) Update the DOM with the newly added data as it arrives.
This is usually the best way to handle the display of large amounts of data, as the user perceives things to be really quick, even when they're quite slow -- the page loads fast, data is shown quickly, etc.
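In Express terms, the shape of it might be something like this (the route paths and the fetchItems helper are hypothetical stand-ins for your own routing and data layer):

// Sketch (Express). fetchItems is a hypothetical data-access helper.
const express = require('express');
const app = express();

// Initial page load: query only the first 10 items so the render is fast.
app.get('/page', function (req, res) {
    fetchItems({ limit: 10 }, function (err, firstItems) {
        if (err) return res.sendStatus(500);
        res.render('route/to/view', { data: firstItems });
    });
});

// AJAX endpoint: the view requests the remaining items after it has loaded.
app.get('/api/items', function (req, res) {
    fetchItems({ offset: 10 }, function (err, rest) {
        if (err) return res.sendStatus(500);
        res.json(rest); // client-side code appends these to the DOM
    });
});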
Background:
I am using node.js and a module called scrap (with jQuery) to screen scrape a website and display some of its information on my own website
To do this, I am making a JSON of data available at a certain path on my website so that the client can retrieve the information when they load the client-side JavaScript
I am exporting a variable in the JSON called isLoaded that will be true once the server has finished loading all of the data for the JSON
Problem:
Since scrap and jQuery make asynchronous calls and load the results into the data variable that is sent with the JSON, all of the information might not be included in the JSON just yet
This is fine most of the time, but take, for instance, the case where I access the page that loads that data while the server is still populating the data that is exported with the JSON object
This essentially requires the client to refresh the page until all of the data is loaded.
Question:
Is there a way to continuously call $http.get(...path...) inside the client-side javascript until the variable 'isLoaded' returns true?
Note:
I have already tried a loop but for some reason the loop can't get new data once it's running. Also, I have tried to the best of my ability to find an answer on Google with no luck. If anyone could point me in the right direction, I would appreciate it.
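For what it's worth, the pattern being described would look roughly like this in AngularJS (which the $http call suggests), inside a controller with $http and $interval injected; the path and render function are hypothetical:

// Sketch (AngularJS): poll the endpoint until the payload reports isLoaded,
// then stop. '/scraped-data' and render() are hypothetical stand-ins.
var stop = $interval(function () {
    $http.get('/scraped-data').then(function (response) {
        if (response.data.isLoaded) {
            $interval.cancel(stop); // stop polling
            render(response.data);  // hypothetical rendering function
        }
    });
}, 2000); // check every 2 seconds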
OK, the title seems a little confusing, so I'll try to explain more thoroughly...
The process the page does currently follows the following sequence:
- User clicks a button
- server-side code retrieves data from the DB and exposes it to the client by populating, let's say, hidden fields.
- client-side code uses this data to fire up an ActiveX component which performs a few tasks with the data provided.
And this works fine; however, we need to optimize the process because the ActiveX component is not fit to handle high volumes of data. We need to send data to the component in "blocks", rather than sending all the data at once as is done today.
However, I just hit a roadblock here: how can I make the page go back and forth between server and client code multiple times? Like... "user clicks a button, server retrieves first block of data, sends to client, client executes ActiveX for the first block, client requests next block, server retrieves second block, sends to client, client executes ActiveX for the second block, client requests third block... and so on"? I can't get past the first request, since I can't register a client script block twice and expect AJAX to handle those multiple sequential callbacks...
Or is there a way?
This sounds more like an architectural issue than anything else.
What you should be doing here is:
1) User clicks a button. This is NOT a regular submit button. Just a plain old button that executes some local javascript.
2) Local javascript makes an AJAX request to determine how many records are available.
3) That javascript then does a loop based on the number of available records divided by the amount you want to pull per chunk.
3.a) Execute AJAX request for a chunk
3.b) Throw the data into your ActiveX control - which, btw, I really would suggest you guys think about getting rid of. There are so many issues with ActiveX that it's not even funny.
4) Repeat 3.a and 3.b until completion.
You'll notice that at no point was a full post back performed. You'll also notice that you shouldn't have to register any client script blocks.
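Roughly, steps 2 through 4 might look like the sketch below (using jQuery for brevity; the endpoint URLs and the feedActiveX function are hypothetical stand-ins for your own server handlers and control):

// Sketch: fetch the record count, then pull fixed-size chunks one after
// another, feeding each into the ActiveX control.
var CHUNK_SIZE = 500;

$.getJSON('recordcount.ashx', function (info) {
    var totalChunks = Math.ceil(info.count / CHUNK_SIZE);

    function fetchChunk(index) {
        if (index >= totalChunks) {
            return; // all chunks processed
        }
        $.getJSON('chunk.ashx', { page: index, size: CHUNK_SIZE }, function (rows) {
            feedActiveX(rows);     // step 3.b -- hand the block to the control
            fetchChunk(index + 1); // request the next block only when this one is done
        });
    }

    fetchChunk(0);
});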
Now the drawback here is purely in the ActiveX control. Can it be instantiated from JavaScript multiple times in a page, or are you forced to only use a single instance?
If it's limited to a single instance, then you'll need a different approach entirely.