What is the best way to handle streaming JS content in browser? - javascript

Imagine we have a server-side application that generates streaming content full of JavaScript commands. The easiest way for me to show an example application is with Python/Flask, but you can do the same in any language simply by flushing the output after each iteration. So, for a sample server-side application:
from time import sleep
from flask import Flask, Response

app = Flask(__name__)

@app.route('/stream', methods=['POST'])
def stream():
    def generate():
        for i in range(10):
            sleep(1)
            yield 'console.log("Iteration: %d");\n' % i
    return Response(generate(), mimetype='application/javascript')
which returns (over 10 seconds, with 1-second pauses) this kind of output:
console.log("Iteration: 0");
console.log("Iteration: 1");
console.log("Iteration: 2");
...
console.log("Iteration: 9");
I need to create a "parent" HTML/JavaScript page which handles and executes these commands on the fly, i.e. without waiting until all 10 iterations are loaded. It should also be able to make POST requests to the server-side application mentioned above.
Here are the options which I have tried.
I tested the jQuery Ajax method with different options, but it still needs the full generated output before it executes all the commands at once.
The other idea was to use an iframe. It works fine, but in order to use it I need to rephrase my output from console.log("Iteration: 0"); to <script language="JavaScript">console.log("Iteration: 0");</script> with content type text/html, and also simulate a POST form submission to the target iframe.
I have read about WebSockets. However, since this technology is not universally supported at the moment, and my application needs to work with on-the-fly content right now, I decided not to pursue it.
Another very important thing: the output should be a stream, since the server-side application works with a long-lasting process; so polling with setTimeout(function() { $.ajax(...); }, 1000); is not the solution.
To summarize, I have tried several options, but a simple iframe is the only solution that really works at the moment. Otherwise, I am most probably missing something. Any thoughts and constructive ideas are very much appreciated.
Thank you in advance!

Long polling and Comet are options, but these are hacks. The iframe method you mentioned isn't terrible, but it has some state issues if you need to recover the connection.
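If you do stay with plain XHR, one pattern that can work is reading responseText incrementally while the request is still open. Here is a minimal sketch (my own assumption, not a drop-in solution; it relies on the browser exposing partial responses during readyState 3, which older IE does not):

// Read a streamed response incrementally and execute each complete
// line of JavaScript as it arrives, instead of waiting for the end.
function streamCommands(url, body) {
    var xhr = new XMLHttpRequest();
    var processed = 0; // number of characters already executed

    xhr.open('POST', url, true);
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 3 || xhr.readyState === 4) {
            var chunk = xhr.responseText.substring(processed);
            // Only run up to the last complete line, in case a command
            // is split across two network chunks.
            var lastNewline = chunk.lastIndexOf('\n');
            if (lastNewline !== -1) {
                var complete = chunk.substring(0, lastNewline + 1);
                processed += complete.length;
                eval(complete); // e.g. console.log("Iteration: 0");
            }
        }
    };
    xhr.send(body); // e.g. streamCommands('/stream', 'foo=bar');
}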
I would encourage you to reconsider WebSockets. There is a lovely shim available on GitHub which uses Flash (which has had socket support for some time now) as a fallback. You can write client-side code as if WebSockets exist, and the shim adds support to browsers that don't have it. Great!
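For illustration, once the shim is in place the client side could look roughly like this (the URL and the one-command-per-message format are assumptions on my part):

var socket = new WebSocket('ws://example.com/stream');
socket.onmessage = function (event) {
    // Each message carries one command, e.g. console.log("Iteration: 3");
    eval(event.data);
};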

Related

Page Load alternative in a Pure HTML AJAX Website

I am working on a pure HTML website, all pages are HTML with no relation to any server side code.
Basically every request to the server is made using AJAX, I send data from forms, I process this data in Handlers, then I return a JSON string that will be processed back on the client side.
Let's say the page is loaded with parameters in the URL, something like question.html?id=1. Earlier, I used to read this query string in the Page Load method, then read data from the database, and so on...
Now, since these are pure HTML pages, I'm trying to think of an approach that will allow me to do the same. I have an idea, but it's 99% a bad idea.
The idea is to read the URL parameters using JS (after the page has loaded), then make an AJAX request, fetch the data, and show it on the page. I know that instead of having one request to the server (Web Forms), we now have two requests: the first request gets the page, and the second is the AJAX request. And of course this adds delay, since the page will initially load without the actual data I need inside it.
Is my goal impossible, or is there a mature approach out there?
Is my goal impossible, or is there a mature approach out there?
Lately there are a good handful of JavaScript frameworks designed around this very concept ("single page app") of having a page load up without any data pre-loaded in it, and accessing all of the data over AJAX. Some examples of such frameworks are AngularJS, Backbone.js, Ember.js, and Knockout. So no, this is not at all impossible. I recommend learning about these frameworks and others to find one that seems right for the site you are making.
The idea is to read the URL parameters using JS (after the page has loaded), then make an AJAX request, fetch the data, and show it on the page.
This sounds like a fine idea.
Here is an example of how you can use JavaScript to extract the query parameters from the current page's URL.
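One possible implementation (a sketch of my own, since the original linked example is not reproduced here):

function getQueryParams() {
    var params = {};
    var query = window.location.search.substring(1); // drop the leading "?"
    if (!query) return params;
    var pairs = query.split('&');
    for (var i = 0; i < pairs.length; i++) {
        var parts = pairs[i].split('=');
        params[decodeURIComponent(parts[0])] =
            decodeURIComponent((parts[1] || '').replace(/\+/g, ' '));
    }
    return params;
}

// e.g. on question.html?id=1: getQueryParams().id === "1"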
I know that instead of having one request to the server (Web Forms), we now have two requests: the first request gets the page, and the second is the AJAX request. And of course this adds delay, since the page will initially load without the actual data I need inside it.
Here is why you should not worry about this:
A user's browser will generally cache the HTML file and associated JavaScript files, so the second time they visit your site, the browser will send requests to check whether the files have been modified. If not, the server will send back a short message simply saying that they have not been modified and the files will not need to be transmitted again.
The AJAX response will only contain the data that the page needs and none of the markup. So retrieving a page generated on the server would involve more data transfer than an approach that combines a cacheable .html file and an AJAX request.
So the total load time should be less, even if you make two requests instead of one. If you are concerned that the user will see a page with no content while the AJAX data is loading, you can (a) have the page be completely blank while the data is loading (as long as it's not too slow, this should not be a problem), or (b) throw up a splash screen to tell the user that the page is loading. Again, users should generally not have a problem with a small amount of load time at the beginning if the page is speedy after that.
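To make the caching point concrete, the revalidation exchange looks roughly like this (illustrative headers, not captured from a real server):

GET /js/app.js HTTP/1.1
If-Modified-Since: Tue, 15 Jan 2013 08:00:00 GMT

HTTP/1.1 304 Not Modified

No body accompanies the 304, so a repeat visit transfers almost nothing for the static files.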
I think you are overthinking it. I'd bet that the combined two calls you are worried about will run in roughly the same amount of time as the single WebForms Page_Load would if you coded it otherwise; the only difference now is that the initial page load is going to be really fast (because you are only loading a lightweight HTML/CSS/images page, with no slowdown from running any server code).
A common solution is to have a 'spinner' of some sort (an animated GIF) that gives the user a visual indication that the page isn't done loading while your AJAX calls wait to complete.
Watch a typical page load from almost any major website in any language: you will see many, many requests making up a single page, whether it's pulling CSS/images from a CDN, JS from a CDN, loading Google Analytics, advertisements from ad networks, etc. Trying to get 100% of your page to load in a single call is not really a goal you should be worried about.
I don't think the two requests are a "bad idea" at all. In fact there is no other solution if you want to use only static HTML + AJAX (that is the modern approach to web development, since it allows you to reuse the AJAX requests for other non-HTML clients like Android or iOS native apps). Also, performance is very relative. If your client can cache the first static HTML, it will be much faster compared to a server-generated approach, even if two requests are needed. Just use a network profiler to convince yourself.
What you can do, if you don't want the user to notice any lag in the GUI, is use a generic script that shows a popup hiding/blocking the full window (maybe with a "please wait") until the second, AJAX request is received and a "data-received" (or similar) event is triggered in the AJAX callback.
EDIT:
I think that probably what you need is to convert your website into a web app, using a manifest to list the "cacheable" static content. Then query your server only for dynamic (AJAX) data:
http://diveintohtml5.info/offline.html
(IE 10+ also supports web app manifests.)
Modern browsers will read the manifest to know whether they need to reload static content or not. Using a web app manifest will also allow you to integrate your website with the OS. For example, on Android it will be listed in the recent-tasks list (otherwise only your browser, not your app, is shown) and the user can add a shortcut to the desktop.
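A minimal sketch of such a manifest (the file names are assumptions); the page references it via <html manifest="site.appcache">:

CACHE MANIFEST
# v1 - static files the browser caches and refetches only when this manifest changes
index.html
css/site.css
js/app.js

NETWORK:
# everything else (e.g. the AJAX endpoints) always goes to the network
*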
So, you have static HTML and use server-side code only in handlers? Why can't you have one ASP.NET page (generated on the server side) to load the initial data, and have all other data processed via AJAX requests?
If it's possible to use any backend logic to determine what to load on the server side, that makes it easy to get the data.
Say, for example, you want to load the JSON a in the page cc.php by calling cc.php?json=a: you can decide in the PHP code to put that JSON into the page itself and use it as an object in your HTML page.
If you are using the query string to determine what to load, you have to make two calls.
The primary thing you appear to want is what is known as a router.
Since you seem to want to keep things fairly bare-metal, the traditional answer would be Backbone.js. If you want even faster and leaner, then the optimized Backbone fork ExoSkeleton might be just the ticket, but it doesn't have the following that Backbone proper has. Certainly better than cooking up your own thing.
There are some fine frameworks around, like Ember and Angular which have large user bases. I've been using Ember recently for a fairly complex application as it has a very sophisticated router, but based on my experiences I'm more aligned with the architecture available today in React/Flux (not just React but the architectural pattern of Flux).
React/Flux with one of the add-on router components will take you very far (Facebook/Instagram), and in my view offers a superior architecture for web applications than traditional MVC; it is currently the fastest framework for updating the DOM and also allows isomorphic applications (running on both client and server). This represents the so-called "holy grail" of web apps, as it sends the initial rendered page from the server and avoids any delays due to framework loading; subsequent interactions then use AJAX.
Above all, check out some of the frameworks and find what works best for you. You may find some value comparing framework implementations over at TodoMVC, but in my view the Todo app is far too simple and contrived to really show how the different frameworks shine.
My own evolution has been jQuery -> Backbone -> Backbone + Marionette -> Ember -> React/Flux so don't expect to get a good handle on what matters most to you until you have used a few frameworks in anger.
The main issue is from a UX / UI point of view.
Once you get your data from the server (via Ajax) after the page has been loaded, you'll get a "flickering" behavior as the data is injected into the page.
You can solve this by presenting the page only after the data has arrived, OR by using a pre-loader of some kind to let the user know that the page is still getting its data; but then you'll have the performance issue you already mentioned.
The ideal solution in this case is to get the "basic" data that the page needs (in the first request to the server) and manipulate it via the client, thus easing the "flickering" behavior.
It's a trade-off between performance and the "flickering" / pre-loading indication.
The most popular library for this kind of SPA (Single Page Application) page is AngularJS.
If I understand your inquiry correctly, you might want to look into:
1) window.location.hash
Instead of using the "?", you can make use of the "#" to manipulate your page based on the query string.
Reference: How to change the querystring on the same page without postback
2) hashchange event
This event fires whenever there's a change in the fragment/hash ("#") of the URL. You might also want to track the hash so you can compare the previous hash value with the current one.
e.g.
var prevHash = location.hash; // for tracking the previous hash

$(window).on('hashchange', function () {
    // your manipulation for the query string goes here...
    prevHash = location.hash;
});
Reference: On - window.location.hash - Change?
3) For a client-side entry point, similar to the server-side PageLoad, you may make use of this:
e.g.
/* Appends a method to be called after the page (from the server) has loaded. */
function addLoadEvent(func) {
    var oldonload = window.onload;
    if (typeof window.onload != 'function') {
        window.onload = func;
    } else {
        window.onload = function () {
            if (oldonload) {
                oldonload();
            }
            func();
        };
    }
}

function YourPage_PageLoad() {
    // your code goes here...
}

// Client entry point
addLoadEvent(YourPage_PageLoad);
Since you're doing pure AJAX, the benefit of this technique is that you can easily handle the browser's previous/next button clicks and present the proper data/page to the user.
I would prefer AngularJS. It is a good technology, and you can do pagination with a single HTML page, so I think it will be a good framework for you since you're using static content.
AngularJS is built around the MVC concept; please read the AngularJS tutorial. The framework will be worth it for your new project. Happy coding!

javascript/ajax – ajax requests very slow - time out issue?

I programmed an experiment in JavaScript/jQuery, extensively using AJAX to communicate with the server / execute PHP scripts on the server.
Now, a friend of mine wants to use the scripts in Japan and is running into problems that I haven't encountered before.
For example, he has to press submit buttons several times instead of just being redirected to the next page, or he sees placeholder symbols instead of the data that is stored in the MySQL database and should have been fetched (replacing the placeholders).
In the last case, the script is quite straightforward: a $.post request is executed in the document-ready function, which then executes a function in which the JavaScript replacements take place.
Now we are kind of lost as to what could be the cause. It seems to work fine on my side.
The server is located in Europe, and he has also had some issues with proxy servers. Could that be the problem?
Could it be that the $.post commands are just very slow or even time out?
Is there anything that can be done about this to make the scripts run reliably on his side?
I am happy to provide more specific information if needed, however currently I am a little bit lost and don't know what to look for specifically.
Thanks for any help!
edit:
Here is the jquery code for one specific example where we encountered this problem:
function updateContent() {
    $.post("php/earnings.php",
        {
            type: "contribution",
            grp: group,
            pbnr: PbNr,
            multiplier: multiplier
        }, processAndShow);

    function processAndShow(data) {
        // here are just jQuery commands updating HTML elements
    }
}

$(document).ready(function () {
    updateContent();
});
The earnings.php file contains roughly 150 lines of code with 15 mysql_query requests.
I also had a look at the connection diagnostics in Firefox for the earnings.php $.post request:
Connection: "Keep-Alive";
Keep-Alive: "timeout=1, max=96"
Does it possibly have something to do with the timeout/max settings of the Apache server?

Long Polling: How do I calm it down?

I'm working on a simple chat implementation: a function makes an ajax call which, on success, invokes a setTimeout to call itself again. This runs every 30 seconds or so. It works fine, but I'd like a more immediate notification when a message has arrived. I'm seeing a lot of examples of long polling with jQuery code that looks something like this:
function poll() {
    $.ajax({
        data: {"foo": "bar"},
        url: "webservice.do",
        success: function (msg) {
            doSomething(msg);
        },
        complete: poll
    });
}
I understand how this works, but it will just keep repeatedly sending requests to the server immediately. Seems to me there needs to be some logic on the server that holds off until something has changed; otherwise a response is sent back immediately, even if there is nothing new to report. Is this handled purely in JavaScript, or am I missing something that must be implemented server-side? If it is handled on the server, is pausing server execution really a good idea? In all of your experience, what is a better way of handling this? Is my setTimeout() method sufficient, maybe with just a smaller timeout?
I know about websockets, but as they are not widely supported yet, I'd like to stick to current-gen techniques.
Do not pause the server execution... it will drain server resources if a lot of people try to chat...
Use the client side to manage the pause time, as you did with setTimeout, but with a lower delay.
You missed the "long" part of "long polling". It is incumbent on the server not to return unless there's something interesting to say. See this article for more discussion.
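To make that concrete, here is a minimal sketch of the server half in Node.js (the platform, endpoints, and message format are all my assumptions, not from the linked article). The /poll response is parked until someone hits /send, so the client's complete:poll loop spins only as fast as messages actually arrive:

var http = require('http');

var waiting = []; // parked responses from clients with nothing to read yet

http.createServer(function (req, res) {
    if (req.url === '/poll') {
        // Long poll: hold the response open instead of replying now.
        waiting.push(res);
    } else if (req.url.indexOf('/send?msg=') === 0) {
        var msg = decodeURIComponent(req.url.split('=')[1]);
        // Wake every parked client with the new message.
        waiting.forEach(function (pending) {
            pending.writeHead(200, {'Content-Type': 'application/json'});
            pending.end(JSON.stringify({message: msg}));
        });
        waiting = [];
        res.end('ok');
    } else {
        res.end();
    }
}).listen(8080);

In practice you would also time out parked responses (say, after 30 seconds) so intermediaries don't kill the idle connection first.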
You've identified the trade-off: open connections to the web server, consuming HTTP connections (i.e. the response must block server-side), versus frequent "is there anything new?" requests, consuming bandwidth. WebSockets may be an option if your browser base can support them (most 'modern' browsers: http://caniuse.com/websockets).
There is no proper way to handle this on the JavaScript side through traditional ajax polling, as you will always have a lag at one end or the other if you are looking to throttle the number of requests being made. Take a look at a Node.js-based solution, or perhaps the Ajax Push Engine, www.ape-project.org, which is PHP-based.

Why is my Ajax request significantly slower than a normal browser request?

I have created a website for a friend. Because he wished to have a music player continue playing music through page loads, I decided to load content into the page via ajax (facilitated by jQuery). It works fine, it falls back nicely when there is no JavaScript, and the back/forward buttons work great, but it's dreadfully slow on the server.
A couple points:
The initial page load is fairly quick. The Chrome developer console tells me that "index.php" loads in about 2.5 seconds. I have it set up so that query string params dictate which page is loaded, and this time frame is approximately accurate for them all. For the homepage, there is 8.4KB of data loaded.
When I load the content in via an ajax request, no matter the size of the data downloaded, it takes approximately 20 seconds. The smallest amount of data that is loaded in this way is about 500 bytes. There is obviously a mismatch here.
So Chrome tells me that the vast majority of the time spent is "waiting" which I take to mean that the server is dealing with the request. So, that can only mean, I guess, that either my code is taking a long time, or something screwy is going on with the server. I don't think it's my code, because it's fairly minimal:
$file = "";
if (isset($_GET['page'])) {
$file = $_GET['page'];
} else if (isset($_POST['page'])) {
$file = $_POST['page'];
} else {
$file = "home";
}
$file = 'content/' . $file . '.php';
if (file_exists($file)) {
include_once($file);
} else {
include_once('content/404.php');
}
This is in a content_loader.php file which my javascript (in this scenario) sends a GET request to along with a "page" parameter. HTML markup is returned and put into a DIV on the page.
I'm using the jQuery .get() shorthand function, so I don't imagine I could be messing anything up there, and I'm confident it's not a Javascript problem because the delay is in waiting for the data from the server. And again, even when the data is very small, it takes about 20 seconds.
I currently believe it's a problem with the server, but I don't understand why a request made through javascript would be so much slower than a request made the traditional way through the browser. As an additional note, some of the content pages do connect to a MySQL database, but some do not. It doesn't seem to matter what the page requires for processing or how much data it consists of, it takes right around 20 seconds.
I'm at a loss... does anyone know of anything that might explain this? Also, I apologize if this is not the correct place for such a question, none of the other venues seemed particularly well suited for the question either.
As I mentioned in my comment, a definite possibility could be reverse DNS lookups. I've had this problem before and I bet it's the source of your slow requests. There are certain Apache config directives you need to watch out for in both regular apache and vhost configs as well as .htaccess. Here are some links that should hopefully help:
http://www.tablix.org/~avian/blog/archives/2011/04/reverse_dns_lookups_and_apache/
http://betabug.ch/blogs/ch-athens/933
To find more resources just Google something like "apache slow reverse dns".
A brief explanation
In a reverse DNS lookup an attempt is made to resolve an IP address to a hostname. Most of the time with services like Apache, SSH and MySQL this is unnecessary and it's a bad idea as it only serves to slow down requests/connections. It's good to look for configuration settings for your different services and disable reverse DNS lookups if they aren't needed.
In Apache there are certain configuration settings that cause a reverse lookup to occur. Things like HostnameLookups and allow/deny rules specifying domains instead of IP addresses. See the links above for more info.
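For example, the relevant directives look something like this (a sketch of an assumed configuration, not your actual one):

# Make sure hostname lookups are off (the default in modern Apache):
HostnameLookups Off

# Prefer IP-based access rules; a domain-based rule like
# "Allow from example.com" forces a reverse DNS lookup on every request.
<Directory "/var/www/html">
    Order allow,deny
    Allow from 192.0.2.0/24
</Directory>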
As you suggested in your comment, the PHP script is executing quickly once it finally runs. The time is spent waiting on Apache - most likely to do a reverse DNS lookup, and failing. You know the problem isn't with your code, it's with the other services involved in the request.
Hope this helps!

Requesting remote XML data with javascript

OK, here's my problem. I'm working on this little site called 10winstreak, and I'm trying to detect with JavaScript whether a stream is live or not, because the server we run the site on can't handle processing every single request with PHP. The basis of detecting whether a stream is live is that you go to its XML file and, if it's live, one of the tags will say something along the lines of true; often the XML file on their site will be empty if a particular stream isn't live. For example, for the Gamespot twitch.tv stream you go to http://api.justin.tv/api/stream/list.xml?channel=gamespot ; if it's got stuff in it, then it's live, and if not, then it's not.
so basically my code looks like this:
function check(URL, term) {
    $.get(URL, function (data) {
        console.log(data);
        // data is whatever the server returns from the request;
        // do whatever is needed with it to show who is live.
        var number = data.search(term);
        if (number > -1) {
            document.write("Live");
        } else {
            document.write("Offline");
        }
    });
}
URL is a url that gets passed in, and term is the term to search for in the XML file (usually "true" or "True"). But before anything happens, I end up with: "XMLHttpRequest cannot load http://api.own3d.tv/liveCheck.php?live_id=6815. Origin (my server's URL) is not allowed by Access-Control-Allow-Origin."
I've looked into it all over the net, and I don't seem to be able to find anything I can use. There's a lot of theory but not enough actual code, and I don't understand the theory well enough to start typing code out. From what I've seen, you have two ways to go: use JSONP, or add a line somewhere on your server to allow cross-domain access. I don't fully understand either one, nor do I know what to do. It would be a lot of help for someone to show me what needs to be done to get rid of this error. Of course, if you can explain it to a non-coder like me, that would be even more awesome, but at this point, as long as the code works, for all I care it might as well be magic, lol.
You can solve it :)
Take a look at xReader
<script src="http://kincrew.github.com/xReader/xReader.full.js"></script>
<script type="text/javascript">
xReader("http://api.own3d.tv/liveCheck.php?live_id=6815", function(data) {
alert(data.content);
})
</script>
I think you need a cache-busting option, but you can be banned from YQL.
I think it's because the path is not relative; you may be calling this from a different domain/subdomain. You can potentially allow other origins access, which may open up a security hole, or you can create a proxy locally.
Creating a proxy in PHP is easy: http://blog.proxybonanza.com/programming/php-curl-with-proxy/
Now, instead of directing your request straight to that URL, send the request from jQuery to your own local URL and have it access the remote one on the server side.
Another option would be to use YQL: http://www.parrisstudios.com/?p=333 (I wrote an article about this a while ago). That way you can turn the response into JSON, which can be accessed cross-domain (as can JavaScript).
You could ask for the API responses to all be returned using a JSONP server and in JSON.
You aren't going to be able to do this via client-side javascript unless they've enabled some way to retrieve their data cross-domain (CORS, JSONP, some flash widgety thing getting read permissions from crossdomain.xml file(s) located on their server...)
Short answer: unless 10winstreak offers a JSONP service, you'll have to do things on the server-side.
Slightly longer answer:
For security reasons, browsers won't let you make AJAX requests from www.example.com to www.example2.com (or any other domain except www.example.com). There isn't much you can do about this except use JSONP (and you can only do that if the remote web service offers it).
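For reference, if the service did offer JSONP, the jQuery side would look roughly like this (the URL here is hypothetical); jQuery loads the response as a script tag, which the same-origin policy does not block:

$.ajax({
    url: 'http://api.example.com/stream/list.json?channel=gamespot',
    dataType: 'jsonp', // jQuery appends a callback=? parameter automatically
    success: function (data) {
        console.log(data); // data is already parsed JSON
    }
});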
Therefore, what you end up needing to do is ask your server "hey, what's on that other server?", and (since it's not limited the way a browser is) it can go get the XML from that other server. There are various ways of doing this, either with code or with Apache config; I'm not sure what's right for you, but hopefully now you understand the general principle.
P.S. If you are curious why browsers do this, see this question: Wouldn't it have been simpler to just discard cookies for cross-domain XHR?
* EDIT *
I just checked out JustinTV's site, and it appears that they already have a PHP library for you to use:
https://github.com/jtvapi/jtv_php_api
This is very likely your best bet (if you want to keep using PHP that is; if not they have libraries for other languages: http://www.justin.tv/p/api).
