Good or bad idea: load database as a separate .js file - javascript

I have a web page where you can customize your game character. To speed up browsing of gems, I load the entire gems database (600 entries, 247 KB) as a separate .js file, so it can be cached and doesn't need to be downloaded on every visit.
I don't notice a delay. Is it still a bad idea?
Should I ajax-get the necessary records on the fly instead?
FYI: I use ASP.NET MVC 2.0. Here is how the script is loaded:
<script type="text/javascript" src='./Data.aspx/Gems'></script>
And here is the action:
[OutputCache(Duration = 14400, VaryByParam = null)]
public ActionResult Gems() {...}
EDIT: My main concern is not load time, but memory usage. Is having an extra 250 KB of JavaScript loaded and parsed by the browser going to have a noticeable impact?

I find it a pretty good idea. Plus, if you ever need to "upgrade" the gems database you can just load the script with a version tag like
<script type="text/javascript" src='./Data.aspx/Gems?v=1232'></script>
where v=1232 will force the user to download the new version if required.

I assume the page won't function until the script is fully loaded anyway, but to make the page feel faster you should load the JavaScript at the bottom of the page.

Embedding the data as a script will cause the browser to halt page loading until the script file has been downloaded and parsed.
If you fetch a static script or data file using ajax, the browser should cache it just as it would a script include, so there isn't any downside to using ajax, and you don't have to worry about slowing the page load.
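A minimal sketch of that ajax approach, assuming the Gems action were changed to return JSON and that jQuery is available (neither is shown in the original):
$.ajax({
    url: "./Data.aspx/Gems",
    dataType: "json",
    cache: true   // let the browser reuse its cached copy instead of cache-busting
}).then(function (gems) {
    // the data arrives after the initial render, so nothing blocks the page
    window.gemsDatabase = gems;
});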

Related

Embedded appendChild JS files not re-fetched on browser hard refresh

I have a web page with multiple embedded JS files, almost all of which are inserted using the normal <script src="/blah.js?v=20200414"> tag. They all have a cache-busting query string, based on the date they were created.
There is one script which is loaded using the following:
<script>
    window.setTimeout(loadScriptElm, 2000);

    function loadScriptElm() {
        var scriptElm = document.createElement("script");
        scriptElm.src = "/special-script.js?v=20200414";
        document.getElementsByTagName("body")[0].appendChild(scriptElm);
    }
</script>
All the scripts have a Cache-Control max-age of 900 seconds. The code in all of them runs just fine.
When I initially load the page, all the scripts are retrieved from the server. When I do a refresh (using F5), all the scripts are loaded from browser cache (assuming the cache TTL hasn't expired).
When I do a 'hard refresh' (using Ctrl+F5), all the 'regularly-embedded' scripts are re-fetched from the server (as I would expect), but the special-script.js file is not - it's still retrieved from browser cache. Why doesn't a hard refresh also re-fetch this file?
I can reliably recreate this on Chrome/Brave, but I haven't tried other browsers. It seems like a bug, but maybe it's not... Is this 'expected behavior'? If so, is it documented anywhere?

Best practice for script caching with Javascript

Using:
C# MVC5 and jQuery
I have a filter screen that potentially uses multiple different filters. Based on what the user selects, I make a call to the server and load a partial view into a Bootstrap modal as follows:
$.ajax({
    url: filterUrl,
    contentType: 'application/html',
    success: function (filterContent) {
        $("#divReportFilterModalBody").html(filterContent);
        LoadFilterScript(SCOPESTRINGS[currentReport.Scope]);
    },
    ...
The next step is to load the necessary JavaScript for that filter page, because you can't have scripts in a partial view. For this I also request the script from the server as follows:
$.getScript(scopeString + "FilterJavaScript",
    function () {
The MVC controller:
[OutputCache(NoStore = true, Duration = 0, VaryByParam = "*")]
public ActionResult ScopeFilterJavaScript()
{
    return File(
        System.IO.File.ReadAllBytes(Server.MapPath("~/Scripts/.../filterPartial.js")),
        "text/javascript");
}
Because the user can only use one filter at a time, and may or may not use multiple filters, my questions are:
The scripts aren't big. Is it better practice to load them all upfront rather than fetch them as required? The reason I load them as required is that they might never get called, and I didn't want to load a bunch of scripts that won't get used.
Is not caching them a good idea? The user can use the same filter multiple times, and in my current setup the script gets loaded each time. OR should I rather cache the script and figure out a way not to load it again?
I'm also not 100% clear on script caching. What happens to the script in this case after it has loaded? If I make another call to the server I can see that it gets loaded again; was the previous script removed? When I look at the script tab in Firebug, they are all still listed there. Will this cause conflicts on the page?
What would best practice be in this scenario?
Thanks
Edit: I've been researching the topic a bit further and found an article covering it (old, but still very relevant in my opinion).
It's always a good idea to only load stuff if you actually need it. When the files aren't that huge, maybe you can combine them and include them up front.
OR should I rather cache the script and figure out a way not to load it again?
Yup.
When you load a script (without any query string), the browser caches it. But that has nothing to do with what happens when you load the script again: either the server delivers it again or the browser uses the cached copy. Either way, the script then executes again. Even if you remove it from the DOM, once loaded, scripts are just there.
Maybe you can wrap your scripts like so:
if (!window.foobarLoaded) {
    // your script content
    window.foobarLoaded = true;
}
Then you can load the script as many times as you like - it only "executes" once.
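Building on that, a minimal sketch of a load-once helper, assuming jQuery; the loadScriptOnce name and the loadedScripts map are illustrative, not from the original code:
var loadedScripts = {};

function loadScriptOnce(url) {
    if (!loadedScripts[url]) {
        // dataType "script" executes the response once it arrives;
        // cache: true stops jQuery from appending a cache-busting query string
        loadedScripts[url] = $.ajax({ url: url, dataType: "script", cache: true });
    }
    return loadedScripts[url];
}

// usage: safe to call on every filter click - the script is fetched and
// executed at most once, and later calls reuse the stored promise
loadScriptOnce(scopeString + "FilterJavaScript").then(function () {
    // filter script is loaded; initialize the filter UI here
});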

Are JavaScript files downloaded with the HTML-body

I have a Java web application, and I'm wondering if the JavaScript files are downloaded with the HTML body, or if the HTML body is loaded first and the browser then requests all the JavaScript files.
The reason for this question is that I want to know if importing files with jQuery.getScript() would result in poorer performance. I want to import all files using that jQuery function to avoid duplication of JavaScript imports.
The body of the HTML document is retrieved first. After it's been downloaded, the browser checks what resources need to be retrieved and gets those.
You can actually see this happen if you open the Chrome dev console, go to the Network tab (make sure caching is disabled and logs are preserved) and refresh a page. In the resulting waterfall, the first bar is the page itself loading, and the chunk that follows is the scripts, a stylesheet, and some image resources.
The HTML document is downloaded first, and only when the browser has finished downloading the HTML document can it find out which scripts to fetch.
That said, heavy scripts that don't influence the appearance of the HTML body directly should be loaded at the end of the body and not in the head, so that they do not block rendering unless necessary.
I'm wondering if the javascript are downloaded with the html body during a request
If it's part of that body then yes. If it's in a separate resource then no.
For example, suppose your HTML file has this:
<script type="text/javascript">
    $(function () {
        // some code here
    });
</script>
That code, being part of the HTML file, is included in the HTML resource. The web server doesn't care what kind of code is in the file; it just serves the response regardless of what's there.
On the other hand, if you have this:
<script type="text/javascript" src="someFile.js"></script>
In that case the code isn't in the same file. The HTML is just referencing a separate resource (someFile.js) which contains the code, so the browser makes a separate request for that resource, resulting in two requests in total.
The HTML document is downloaded first, or at least it starts to download first. While it is parsed, any script includes that the browser finds are downloaded as well, which means that some scripts may finish loading before the document is completely loaded.
While the document is being downloaded, the browser parses it and displays as much as it can. When the parsing reaches a script include, parsing stops and the browser suspends it until the script has been loaded and executed; then the parsing continues. That means the scripts block the display of the rest of the page while they load.
If you put a call to getScript instead of a script include, the behaviour will change. The method makes an asynchronous request, so the browser will continue parsing the rest of the page while the script loads.
This has some important effects:
The parsing of the page will be completed earlier.
Scripts will no longer run in a specific order; they run in the order in which loading completes.
If one script depends on another, you have to check yourself that the first script has actually loaded before using it in the other script.
You can use a combination of script includes and getScript calls to get the best effect. You can use regular scripts includes for scripts that other scripts depend on, and getScript for scripts that are not affected by the effects of that method.
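For instance, a sketch of that mix; the file names are hypothetical, with /lib.js standing in for a script the others rely on:
<script src="/lib.js"></script>
<script>
    // lib.js was loaded synchronously above, so it is available here;
    // these two features don't depend on each other, so their
    // completion order doesn't matter
    $.getScript("/feature-a.js");
    $.getScript("/feature-b.js", function () {
        // feature-b.js has loaded and executed; use what it defines here
    });
</script>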

Caching Javascript inlined in HTML

Instead of having an external .js file, we can inline JavaScript directly in the HTML:
Externalized version
<html>
  <body>
    <script type="text/javascript" src="/app.js"></script>
  </body>
</html>
Inlined version
<html>
  <body>
    <script type="text/javascript">
      // app.js inlined
    </script>
  </body>
</html>
However, it's not recommended:
https://developer.yahoo.com/performance/rules.html#external
Put javascript and css inline in a single minified html file to improve performance?
The main reason is caching and pre-compiling: in the externalized version, the browser can download, pre-compile and store the file once for multiple pages, while it cannot do the same for the inlined version.
However, is it possible to do something along these lines:
Inlined keyed version
<html>
  <body>
    <script type="text/javascript" hash="abc">
      // app.js inlined
    </script>
  </body>
</html>
That is, do this:
In the first invocation, send the whole script and somehow tell the browser that the script hash is abc
Later, when the browser loads that page or other pages containing the same script, it will send this key as a cookie. The server will only render the contents of the script if the key has not been received.
That is, if the browser already knows about the script, the server will render just this:
Inlined keyed version, subsequent fetches (of the same or other pages)
<html>
  <body>
    <script type="text/javascript" hash="abc">
    </script>
  </body>
</html>
where notably the script contents are empty.
This would allow for shorter script fetching with a natural fallback.
Is the above possible? If not, is some other alternative to the above possible?
I don't know of a way to do what you asked, so I'll provide an alternative that might still suit your needs.
If you're really after a low-latency first page load, you could inline the script and then, after the page loads, request the script by its URL so that it's in the browser cache for future requests. Set a cookie once you've loaded the script by direct URL, so that your server can determine whether to inline the script or emit the external script URL.
first page load
<script>
    // inlined my-script.js goes here.
</script>
<script>
    $(function () {
        // load it again, so it's in the browser cache.
        // request it as plain text so jQuery doesn't guess dataType "script"
        // and execute it a second time - we only want to warm the cache.
        $.ajax("my-script.js", { dataType: "text" }).then(function () {
            // set a cookie marking this script as cached
        });
    });
</script>
second page load
<script src="my-script.js"></script>
Obviously, this has the drawback that it loads the script twice. It also adds additional complexity for you to handle when you update your script with new code: you need to make sure you account for the cookie referring to an old version.
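One way to handle that, as a sketch: tie the cookie to a version you bump on each deploy (SCRIPT_VERSION and the cookie name are illustrative, not from the original):
var SCRIPT_VERSION = "2";

// warm the cache with the current version, requesting it as text
// so it isn't executed a second time, then record what we cached
$.ajax("my-script.js?v=" + SCRIPT_VERSION, { dataType: "text" }).then(function () {
    document.cookie = "scriptCached=" + SCRIPT_VERSION + "; path=/";
});

// server side: inline the script unless the scriptCached cookie equals SCRIPT_VERSION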
I wouldn't bother with all this unless you really feel the need to optimize the first page. It might be worth it in your case.
The Concept
Here's an interesting approach (after being bugged by notifications :P)
You could have the server render your script this way. Notice the weird type attribute. That's to prevent the script from executing. We'll get to that in a second.
<script type="text/cacheable" data-hash="9182n30912830192c83012983xm019283x">
//inline script
</script>
Then create a library that looks for these scripts with the weird type, gets their innerHTML, and executes them in the global context (via eval or new Function) as if they were normal scripts. Here's a demo:
<script type="text/cacheable" data-hash="9182n30912830192c83012983xm019283x">
alert(a);
</script>
<script type="text/cacheable" data-hash="9182n30912830192c83012983xm019283x">
alert(b);
</script>
<script>
// Let's say we have a global
var a = "foo";
var b = "bar"
// Getting the source
var scripts = Array.prototype.slice.call(
document.querySelectorAll('script[type="text/cacheable"]')
);
scripts.forEach(function(script){
// Grabbing
var source = script.innerHTML;
// Create a function (mind security on this one)
var fn = new Function(source);
// Execute in the global scope
fn.call(window);
});
</script>
However...
Since you have the script source (the innerHTML), you can cache it somewhere locally (like in localStorage), using the hash as the identifier. Then you can store the same hash in a cookie, so future page requests can tell the server: "Hey, I have the script with [hash] cached. Don't print the script on the page anymore." Then you'll get this in future requests:
<script type="text/cacheable" data-hash="9182n30912830192c83012983xm019283x"></script>
That covers the first half. The second phase is when your library sees an empty script: it should look up the script with that hash in your local storage, get its source, and execute it just like in the first place.
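A minimal sketch of such a library covering both phases; the storage key prefix and the scriptHashes cookie name are illustrative, and the cookie handling is simplified to a single hash:
var CACHE_PREFIX = "script-cache:";

Array.prototype.slice.call(
    document.querySelectorAll('script[type="text/cacheable"]')
).forEach(function (script) {
    var hash = script.getAttribute("data-hash");
    var source = script.innerHTML;

    if (source) {
        // first visit: cache the source under its hash and tell the
        // server (via cookie) not to print this script next time
        localStorage.setItem(CACHE_PREFIX + hash, source);
        document.cookie = "scriptHashes=" + hash + "; path=/";
    } else {
        // empty script: the server assumed we have it; pull it from the cache
        source = localStorage.getItem(CACHE_PREFIX + hash);
    }

    if (source) {
        // execute in the global scope, as in the demo above
        new Function(source).call(window);
    }
});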
The Catch
Now there's always a trade-off in everything, and I'll highlight what I can think of here:
Pros
You only need one request for everything. The initial page load contains the scripts; subsequent pages become lighter because the code they omit is already cached by then.
Instant cache busting. Assuming the hash and code are 1:1, changing the content changes the hash.
Cons
This assumes that pages are dynamic and are never cached. If you create a new script with a new hash, but the client has cached the page, it will still be using the old hashes and thus the old scripts.
Initial page load will be heavy due to the inlined scripts. This can be mitigated by minifying the source on the server; the overhead of minification can in turn be reduced by caching the minified results server-side.
Security. You'll be using eval or new Function. This poses a big threat if unauthorized code manages to sneak in, and the threat is persistent because of the caching.
Out-of-sync pages. What happens if you get an empty script whose hash is not in the cache, say because the user cleared local storage? You'll have to issue a request to the server for it, and since you want the source, you'll have to use AJAX.
Scripts are not "normal". Your library script is best put at the end of the page so that all the inline scripts will have been parsed by then. This means your scripts execute late, never at the moment the browser parses them.
Storage limits. localStorage has a size limit of 5-10 MB depending on the browser, and cookies are generally limited to 4 KB.
Request size. Cookies are sent up to the server on every request and down to the browser on every response. That additional payload might be more of a hassle than it's worth.
Added server-side logic. Because the server needs to know what to omit, you have to program it accordingly. This makes the client-side implementation dependent on the server; switching server stacks (say from PHP to Python) wouldn't be as easy, since you'd need to port the implementation.
If your <script> has a type attribute the browser doesn't recognize as JavaScript, it will simply not be executed.
So you could have many tags like theses:
<script type="text/hashedjavascript" hash="abc">...</script>
<script type="text/hashedjavascript" hash="efg">...</script>
Then when the DOM is loaded, pick one and evaluate it.
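Something along these lines, as a rough sketch (the wanted hash would come from a cookie in practice; "abc" is just the example hash from above):
document.addEventListener("DOMContentLoaded", function () {
    var wanted = "abc"; // in practice, parsed from a cookie
    var el = document.querySelector(
        'script[type="text/hashedjavascript"][hash="' + wanted + '"]'
    );
    if (el) {
        // evaluate the chosen script in the global scope
        new Function(el.innerHTML).call(window);
    }
});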
I made an example here: http://codepen.io/anon/pen/RNGQEM
But it smells, real bad. It's definitely better to fetch two different files.
Actually, what you should do is have a single file, my-scripts.js, that contains the code for each of your scripts, each wrapped in a function:
// file: my-scripts.js
function script_abc() {
    // what script abc is supposed to do
}

function script_efg() {
    // what script efg is supposed to do
}
Then execute whatever your cookie tells you to. This is how AMD builders concatenate multiple files into one.
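For example, a sketch of that dispatch; the registry object and the hard-coded "abc" stand in for whatever your cookie parsing yields:
// map script names to the wrapped functions from my-scripts.js
var registry = {
    abc: script_abc,
    efg: script_efg
};

var wanted = "abc"; // in practice, read from the cookie
if (registry[wanted]) {
    registry[wanted]();
}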
Also look into an AMD library such as RequireJS.
Edit: I misunderstood your question, removed the irrelevant part.

Send head before body to load CSS and JS asap

I wonder if anyone has found a way to send the head tag mid-render, so CSS and JavaScript are loaded before the page render has finished. Our page takes about 523 ms to be rendered, and resources aren't loaded until the whole page is received. I've done a lot of PHP, where it is possible to flush the buffer before the end of the script. I've tried adding a Response.Flush() at the end of the master page's Page_Load, but the page layout is horribly broken afterward. I've seen a lot of people use an UpdatePanel to send the content via AJAX afterward, but I don't quite know what impact that would have on SEO.
If I don't find a solution, I guess I'd have to go the reverse-proxy route and find a way to invalidate the proxy cache when the page content changes.
Do not place the Flush in code-behind but in your HTML page, like this:
</head>
<%Response.Flush();%>
<body>
This can cause something like a flickering effect on the page, so you can try moving the flush a little lower in the page.
See also "Flush the Buffer Early" on the Yahoo performance rules page:
http://developer.yahoo.com/performance/rules.html
Cache static content
Additionally, you can add client-side caching for static content like the CSS and JavaScript. This page covers the ways to do it for all IIS versions:
http://www.iis.net/ConfigReference/system.webServer/staticContent/clientCache
Follow up
One more thing I suggest, after seeing your pages, is to combine all the CSS into one file and all the JavaScript into another, and to minify both.
I use this minifier, with very good results and real-time minification: http://www.asp.net/ajaxlibrary/Download.ashx
Consider using a content delivery network (CDN) to host your images, CSS and JS files. Browsers have a connection limit per domain (typically four or eight), so once you use those up the browser has to wait for connections to be freed up.
By hosting some files on the CDN you get another set of connections to use concurrently, allowing everything to load faster.
Also consider enabling GZIP on your server if you haven't already. This compresses files on the fly, resulting in smaller transfers.
You could use jQuery to execute your JS as soon as the document is ready:
$.fn.ready(function () {
    // Your code here
});
Or you could just use a standalone ready function, a $(document).ready equivalent without jQuery.
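A minimal sketch of such a standalone ready function:
function ready(fn) {
    if (document.readyState !== "loading") {
        // the DOM is already parsed; run immediately
        fn();
    } else {
        // otherwise wait for the parser to finish
        document.addEventListener("DOMContentLoaded", fn);
    }
}

ready(function () {
    // Your code here
});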
You could do a fade-in or show once the document has loaded. Just set the body to display:none; initially.
