repeat asynchronous $http get until condition holds true - javascript

Background:
I am using Node.js and a module called scrap (with jQuery) to screen-scrape a website and display some of its information on my own website.
To do this, I am making a JSON of the data available at a certain path on my website so that the client can retrieve the information when it loads the client-side JavaScript.
I am exporting a variable in the JSON called isLoaded that will be true once the server has finished loading all of the data for the JSON.
Problem:
Since scrap and jQuery make asynchronous calls and load the results into the data variable that is sent with the JSON, all of the information might not be included in the JSON just yet.
This is fine most of the time, but consider the case where I access the page that loads the data while the server is still populating the data that is exported with the JSON object.
This essentially requires the client to refresh the page until all of the data is loaded.
Question:
Is there a way to repeatedly call $http.get(...path...) in the client-side JavaScript until the variable 'isLoaded' comes back true?
Note:
I have already tried a loop but for some reason the loop can't get new data once it's running. Also, I have tried to the best of my ability to find an answer on Google with no luck. If anyone could point me in the right direction, I would appreciate it.
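One way to express that repeated call is a simple poll: request the JSON, check isLoaded, and schedule another request if it is still false. A minimal sketch, assuming Angular's $http and $timeout services (the path, the onReady callback and the 1-second delay are placeholders, not part of the original code):

function pollUntilLoaded($http, $timeout, onReady) {
    // Placeholder path to the JSON exported by the Node.js server
    $http.get('/path/to/data.json').then(function (response) {
        if (response.data.isLoaded) {
            onReady(response.data);            // server finished scraping, use the data
        } else {
            // Not ready yet; try again in a second instead of forcing a page refresh
            $timeout(function () {
                pollUntilLoaded($http, $timeout, onReady);
            }, 1000);
        }
    });
}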

Related

can AJAX do anything else than load a JSON file?

I want to (or think I need to) use AJAX to accomplish what I intend.
When clicking on a specific link in a list of links, I want to fill the HTML markup below with content of specific subpages. The data is naturally somewhere in the database and actually easily accessible with the CMS's API (I'm using Processwire).
I'm quite new to coding, and especially to AJAX, and all the documentation I find online only mentions it in combination with a JSON file that is loaded via AJAX.
However, I don't have a JSON file on the server, which means, according to my understanding, I would need to
store the data I need in a multidimensional PHP array,
use json_encode to create that JSON file and then save it on the server,
load that file via AJAX and process it with more JS.
On top of that I would have to keep that JSON file updated (or create a new one and delete the old one?) since new content will arrive periodically. It seems unnecessarily complicated to me, but what do I know.
There's got to be a better way…
Any help is appreciated.
AJAX is simply a way to make a request to the web server for information.
When you make an AJAX request you ask for a response from a file on a server. So, you can send an AJAX request to a PHP script for-instance.
The PHP script could return anything: JSON is a common and very widely used response format, but XML is another one you may have encountered.
So, your request for information is made using AJAX, and the response you get back is JSON.
You don't need to store a JSON file on your server. You just need to make an AJAX request that returns current data in JSON format.
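As a minimal sketch of such a request, assuming jQuery is loaded and a hypothetical PHP endpoint subpage-data.php that echoes json_encode()d data for the clicked link (the endpoint name, the id parameter and the #subpage-content container are placeholders):

// Ask the server for the content of a specific subpage as JSON
$.getJSON('subpage-data.php', { id: 123 }, function (data) {
    // Fill the markup below the link list with the returned content
    $('#subpage-content').html(data.body);
});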
AJAX allows you to do asynchronous HTTP requests.
You can of course ask for a json file, but you can also (for example) call an API.
I suggest you start by reading the Getting Started guide for AJAX on MDN:
https://developer.mozilla.org/en-US/docs/Web/Guide/AJAX/Getting_Started

Re-run PHP rss feed

I couldn't really find anything online for what I was looking for.
Currently, I have some PHP code that grabs news feeds, and every time the loop runs through it stores the result in an array slot {0, 1, 2}, etc. The tricky part is that I don't know how to re-run the PHP RSS-grabbing function without refreshing the page.
Essentially I have index.php with the code inside, and I'd like to re-run the PHP script inside those <?php ?> tags through JavaScript.
I know that in JavaScript you can give scripts names and call them from HTML; that's essentially what I want to do, but for PHP. Is it possible?
Currently your RSS fetching logic is intertwined with your presentation logic. You need to separate them so the RSS logic can be called independently.
Consider a structure like this:
rss.php - script with logic to fetch RSS feed and package it up as PHP array
index.php - require(s) rss.php and wraps the results in HTML
api.php - another script which require(s) rss.php, but responds with JSON data that can be consumed by Javascript
Now you can present the RSS data when the page loads and then periodically call api.php via JavaScript to update the results.
You could also forgo calling rss.php from index.php: have index.php present just an HTML skeleton plus the JavaScript, and when the page loads, have the JavaScript call api.php right away to build the initial list. That way you don't have the presentation logic in two places.
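A rough client-side sketch of that periodic update, assuming jQuery and that api.php responds with a JSON array of feed items, each with a title field (the #feed container and the 60-second interval are placeholders):

// Rebuild the feed list from api.php
function refreshFeed() {
    $.getJSON('api.php', function (items) {
        var rows = items.map(function (item) {
            return '<li>' + item.title + '</li>';   // assumes each item has a title
        });
        $('#feed').html(rows.join(''));
    });
}
refreshFeed();                        // build the initial list on page load
setInterval(refreshFeed, 60 * 1000);  // then poll for new items every minute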
PHP is a server-side language, meaning your script will be run by the server before it is returned to your client (the browser).
What you can do if you want to continuously get new data from your script is call your server periodically with JavaScript using an AJAX call and display that to your user.
See this and this.

Understanding the data flow when fetching chart data using AJAX

I am trying to create a test automation results dashboard using a JS library called Chart.JS. I would like to display a bar graph over time showing the total number of tests passed & failed.
To do this, I have done the following things:
Created a dash_proj.html file in which I include Chart.js and in this file I will actually be drawing the graph onto the canvas.
Created a .php file in which I use a PDO connection to query a local database copy on my machine (testing locally for now through localhost).
In that same .php file, I display, in text, the results onto the browser to ensure I have grabbed the appropriate data.
Now, I am getting confused as to the proper flow of things from here. From what I have read, the next step should be to use JavaScript to make an AJAX call and tell it which PHP file to look at (the one running the MySQL query), and that will return the data in real time (no screen refresh). Within the HTML file, I should wait for the JavaScript to return that info, and the final step will be the actual drawing of the graph.
Where does jQuery come into play? And can I just put my AJAX calls inside my php file which makes the query to the database?
To make sure the AJAX call is working, I was thinking of first inserting dummy data into the database and checking whether the results appear in real time on my PHP page through localhost. The next step would then be to store the data from my $query->fetch() in two different arrays (one for tests passed and one for tests failed), then somehow access those arrays from my HTML file, which calls Chart.JS, and feed that data into the bar-graph drawing function?
You don't have to use jQuery. This JavaScript library contains a number of functions to simplify making AJAX calls and accessing the DOM, though some would argue that the convergence of browser APIs makes it less necessary these days. Nevertheless, it remains popular.
Your first task is probably to fire off an AJAX operation upon page load. You can start off by adding this JavaScript directly to the page, though you'll probably want to add it as a minified asset once you have your logic working.
function ajax() {
    // #todo Add your AJAX logic in here
}

// Load the AJAX data into the chart as soon as the DOM is ready
$(document).ready(function () {
    ajax();
});
It is common to do read operations with a GET request that returns JSON, for which getJSON would work fine. Add this logic in place of the #todo comment above.
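One possible shape for that logic, assuming a Chart.js (v2-style) bar chart instance named barChart with one dataset for passes and one for failures, and a hypothetical results.php endpoint that returns {labels, passed, failed} as JSON:

function ajax() {
    // 'results.php' is a placeholder for the PHP script that runs the MySQL query
    $.getJSON('results.php', function (data) {
        barChart.data.labels = data.labels;            // e.g. run dates
        barChart.data.datasets[0].data = data.passed;  // tests passed per run
        barChart.data.datasets[1].data = data.failed;  // tests failed per run
        barChart.update();                             // redraw the bar graph
    });
}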
After that, you'll probably want to do a periodic refresh of your data, say every 60 seconds. You can do this thus:
setInterval(ajax, 60 * 1000);
Note the interval timer works on milliseconds, hence the need to multiply by 1000.
One downside of the above is that if you expect a large number of users, or wish to reduce the interval to a very small value, your web server will be processing a lot of redundant requests (since most calls will result in no screen change). Using AJAX here is therefore not very scalable.
A better approach is to configure the server to push updates to browsers using Web Sockets. However, this requires a separate kind of web server, and so I probably would not recommend it for you just yet.

Live updating from JSON

I have a JSON file which is dynamically generated and contains match info, including a unique id. The JSON is divided into three arrays: live, upcoming and recent. Since I'm quite new to JavaScript, I'm wondering what would be the best way to go about making this livescore script. I need it to update without refreshing the browser. What are my options? Maybe someone has a snippet?
The JSON is automatically updated by another script which is connected to a cron job, so this script does not need to do anything regarding the JSON itself, only retrieve and show the data.
I'm using DreamHost, which gives me shell access, so WebSockets and so on are an option.
You'll need a jQuery user to give you a snippet for this one, but in vanilla ECMAScript 6 you use an XMLHttpRequest object to get the JSON from your server. This object can request data from the server asynchronously and is triggered by the client/browser, so you can update the live match info whenever and as often as you like. You would just have to write a function that replaces the data on the web page with the new info when it is updated.
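A bare-bones vanilla sketch of that, assuming the cron-updated JSON sits at a placeholder path /scores.json with live, upcoming and recent arrays, and that render() stands in for whatever function updates your markup:

function fetchScores() {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/scores.json');                 // placeholder path to the JSON file
    xhr.onload = function () {
        if (xhr.status === 200) {
            var scores = JSON.parse(xhr.responseText);
            render(scores.live, scores.upcoming, scores.recent);  // your own DOM-update function
        }
    };
    xhr.send();
}
fetchScores();                     // initial load
setInterval(fetchScores, 10000);   // refresh every 10 seconds without reloading the page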

Is fetching remote data server-side and processing it on server faster than passing data to client to handle?

I am developing a web app which functions in a similar way to a search engine (except it's very specific and on a much smaller scale). When the user gives a query, I parse that query, and depending on what it is, proceed to carry out one of the following:
Grab data from an XML file located on another domain (ie: from www.example.com/rss/) which is essentially an RSS feed
Grab the HTML from an external web page, and proceed to parse it to locate text found in a certain div on that page
All the data is plain text, save for a couple of specific queries which will return images. This data will be displayed without requiring a page refresh/redirect.
I understand that the same-origin policy prevents me from using JavaScript/AJAX to grab this data directly. An option is to use PHP to do this, but my main concern is the server load.
So my concerns are:
Are there any workarounds to obtain this data client-side instead of server-side?
If there are none, is the optimum solution in my case to: obtain the data via my server, pass it on to the client for parsing (with Javascript/Ajax) and then proceed to display it in the appropriate form?
If the above is my solution, all my server is doing with PHP is obtaining the data from the external domains. In the worst (best?) case scenario, with say a thousand or so requests being executed per minute, is it efficient for my web server to be handling all of those requests?
Once I have a clear idea of the flow of events it's much easier to begin.
Thanks.
I just finished a project with the same requirement as yours.
My suggestion is:
Use two files: [1] the frontend, which makes the AJAX call and sends the URL back to the server; [2] the backend, which receives the AJAX call, gets the file content from that URL, and then parses the XML/HTML.
That way you avoid your PHP script hanging in some situations.
For the PHP side, look into the DOMDocument class for parsing XML/HTML; you will also need DOMXPath.
Please read: http://www.php.net/manual/en/class.domdocument.php
No matter what you do, I suggest you always archive the data on your local server.
So the process becomes: search your local copy first; if it does not exist, grab the data from the remote source and archive it for 24 hours.
By the way, I do suggest your client-side parsing idea. jQuery can handle both HTML and XML; for HTML you just need to filter out all the JS code before parsing it.
So the idea becomes:
AJAX call to the local service
local PHP grabs the XML/HTML (but does no parsing)
archive it locally
send the filtered HTML/XML to the frontend and let jQuery parse it (see the sketch below)
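A hedged sketch of that last step, assuming a hypothetical local proxy script fetch.php that takes a url parameter and returns the filtered HTML as text, and placeholder element ids for the source div and the output container:

// Ask the local PHP proxy for the remote page, then parse it client-side
$.get('fetch.php', { url: 'http://www.example.com/page' }, function (html) {
    // $.parseHTML turns the string into DOM nodes without executing scripts
    var fragment = $('<div>').append($.parseHTML(html));
    var text = fragment.find('#target-div').text();  // the div we want from the remote page
    $('#results').text(text);                        // display it without a page refresh
});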
HTML is similar to XML. I would suggest grabbing the page as HTML and then traversing it with an XML reader, treating it as XML.
