JavaScript single page application becoming very difficult to work with

I'm part of a development team working on a single page application that's becoming a nightmare to develop. At the moment our build minifies all the JS files into one file, so once it's deployed it's fast and easy to cache, but when developing we have over 2000 individual JS files that need to be downloaded every time you refresh.
Has anyone come across this issue themselves and found a workaround?
One solution I was thinking of: cache every file, but every time you save a file, bust the cache just for that file.
A tactic people are using at the moment is to slim down their workspaces, but that takes a huge amount of time as there are so many dependencies.
Any help on this would be very much appreciated.

The problem is that when you hit F5 or Ctrl+R (or refresh with the browser's dev tools open), the browser sends no-cache request headers, so every file is re-downloaded and all the JS re-interpreted, assuming you're working on a browser project.
A possible easy-to-set-up solution would be a small dedicated HTTP server that looks at the modification date of the file to serve and returns only a 304 Not Modified header when the browser sends an If-Modified-Since header (which means the requested file is cached by the browser) and the file hasn't changed since.
Scenario:
The client asks for JS files (for the second time) with an If-Modified-Since: Wed, 09 Dec 2015 12:18:00 GMT header to your localhost:8080
If the file on disk is no more recent, the file server sends a 304 Not Modified header
If the file is newer, the file server streams the file, specifying Date: Wed, 09 Dec 2015 12:20:00 GMT and Cache-Control: public, max-age=3600, must-revalidate
Edit: so this is a tiny Node.js file server. Install Node.js (which ships with npm), create this file as index.js, set the wwwpath variable, run npm install node-static in the same folder as index.js (node-static is a dependency of the file server below), and execute the program with nodejs index.js (or node index.js on Windows).
var static = require('node-static');
var fs = require('fs');

var wwwpath = './www';
var fileServer = new static.Server(wwwpath);

require('http').createServer(function (request, response) {
    request.addListener('end', function () {
        var filepath = wwwpath + request.url;
        // browser is revalidating its cached copy
        if (request.headers && request.headers['if-modified-since'] && fs.existsSync(filepath)) {
            var ifModifiedSince = new Date(request.headers['if-modified-since']);
            var stats = fs.statSync(filepath);
            // +1000 because the Date header is not millisecond-accurate
            if (stats.mtime.getTime() <= ifModifiedSince.getTime() + 1000) {
                // file has not been edited since: let the browser use its cache
                response.writeHead(304, 'Not Modified');
                response.end();
                return;
            }
        }
        // first request, or file modified since: stream it
        fileServer.serve(request, response, function (err, result) {
            // file not found / ...
            if (err) {
                response.writeHead(err.status, err.headers);
                response.end();
            }
        });
    }).resume();
}).listen(8080);
What does it do? If you have a file myscript.js in the www folder, request it as http://localhost:8080/myscript.js. When you refresh your page, the server checks the modification time of the file before streaming it.
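To verify the behaviour, here is a minimal client-side sketch, assuming the server above is running and www/myscript.js exists:
var http = require('http');

// Ask for the file, claiming we cached it just now; expect a 304 back
http.get({
    host: 'localhost',
    port: 8080,
    path: '/myscript.js',
    headers: { 'if-modified-since': new Date().toUTCString() }
}, function (res) {
    console.log('Status:', res.statusCode); // 304 if the file is unchanged
    res.resume();
});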

Related

NodeJs: How do I access the functions of my JavaScript backend from my HTML frontend?

Here is my HTML code in index.html.
<!DOCTYPE html>
<html>
  <body>
    <button type="button" onclick="stuff()">Click</button>
    <script>
      async function stuff() {
        await connectToServer();
      }
      async function connectToServer() {
        const xhttp = new XMLHttpRequest();
        xhttp.onload = function() {
          alert(this.responseText);
        };
        xhttp.open('GET', 'C:/Users/myName/myFolder/index.js', true);
        xhttp.send();
        return;
      }
    </script>
  </body>
</html>
Then, here is my backend code in index.js.
const express = require('express');
const axios = require('axios');

const port = process.env.PORT || 8080;
const app = express();

app.get('/', (req, res) => {
    res.sendFile('C:/Users/myName/myFolder/views/index.html');
});

app.listen(port, () => console.log(`Listening on port ${port}`));
I can type node index.js on the command line to run this program and go to http://localhost:8080/. When I do this, the HTML page shows up as intended. However, when I click the button to make a GET request to the server side, I get a console error saying Not allowed to load local resource: file:///C:/Users/myName/myFolder/index.js. I'm using Google Chrome, by the way.
I know that it is a security thing, and that you are supposed to make requests to files that are on a web server (they begin with http or https). I suppose then, my question is:
How do I make it so that my server file index.js can be viewed as being on a server so that I can call functions on the backend from my frontend?
You have to make an HTTP request to a URL provided by the server.
The only URL your server provides is http://localhost:8080/ (because you are running an HTTP server on localhost, have configured it to run on port 8080, and have app.get('/', ...) providing the only path).
If you want to support other URLs, then register them in a similar way and write a route to handle them.
The express documentation will probably be useful.
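For instance, a minimal sketch of registering an extra route (the /api/hello path is made up for illustration):
// In index.js, alongside the existing app.get('/', ...) route
app.get('/api/hello', (req, res) => {
    res.json({ message: 'Hello from the backend' });
});
The frontend can then request http://localhost:8080/api/hello with XMLHttpRequest or fetch, and the handler runs on the server.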
You should not need to load your server-side code into the browser. It's server-side code. It runs on the server. It isn't client-side code. It doesn't run in the browser. The browser does not need access to it.
If you want to load some actual client-side JS from the server, then use <script src="url/to/js"></script> (and not Ajax) and configure express' static middleware.
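That could look something like this sketch, assuming the client-side files live in a public/ folder (the folder and file names are assumptions):
const path = require('path');

// Serve everything under ./public at the site root,
// e.g. ./public/client.js becomes http://localhost:8080/client.js
app.use(express.static(path.join(__dirname, 'public')));
Then <script src="/client.js"></script> in index.html loads the client-side code over HTTP.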
Let's improve your current flow by separating your backend API process from the frontend hosting process. While the backend can serve static HTML files, it's not good at it (especially for local development purposes).
Run your backend as usual: node index.js. (As soon as this command becomes more complicated, you will probably want to use npm scripts and just run npm start.)
Run a separate server process for the frontend. Check out parcel, snowpack, or DevServer. It can be as easy as npx parcel index.html, but this command is likely to change as your understanding of the tool's features grows.
To call the backend, just add an API endpoint to the express app (just like you already did for serving static content) and call it, using the backend process's URL.
Usually you will see your app on http://localhost/ and it should make requests to http://localhost:8080/.
If for some strange reason you want to dynamically download a JS file from your server to execute it, you just need to serve this file from your frontend hosting process. Different development servers have different techniques for this, but usually you just specify the file extensions and paths you want to be available.
After editing frontend files, you will see hot reload in the browser. You can achieve the same for the node process with various tools (start by googling nodemon).
If you find this way of operating not ideal, try to improve it, and check what people have already done in this direction. For example, you can run two processes in parallel with concurrently, as in the sketch below.
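The npm scripts for that setup might look like this in package.json (the script names and commands are assumptions, not from the question):
"scripts": {
    "start:api": "node index.js",
    "start:web": "parcel index.html",
    "start": "concurrently \"npm run start:api\" \"npm run start:web\""
}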

How do you ensure a service worker caches a consistent set of files?

I have a progressive web app (PWA) consisting of several files including index.html, manifest.json, bundle.js and serviceWorker.js. I update my app by uploading all these files to my host. In case it matters, I am using Firebase so I use firebase deploy to upload the files.
Usually everything works correctly: When an existing user opens the app they still see the old version but in the background the new service worker installs any changed files to the cache. Then when the user next opens the app it activates and they see the new version.
But I have a problem when a user opens the app a short time after I deploy it. What seems to happen is: the host delivers the new serviceWorker.js but the old bundle.js, and so the install puts the old bundle.js in the new cache. The user gets the old functionality or, worse, might get an app made up of an inconsistent mixture of new and old files.
I guess it could be argued that it is the host's fault for not updating atomically, but I have no control over Firebase. And it does not sound possible anyway because the browser is sending a series of independent fetches and there can be no guarantee that they will all return a consistent version.
In case it helps, here is my serviceWorker.js. The cacheName strings such as "app1-a0f43550e414" are generated by my build pipeline. Here a0f43550e414 is the hash of the latest bundle.js so that the cache is only updated if the content of bundle.js has changed.
"use strict";
const appName = "app1";
const cacheLookup = {
"app1-aefa820f62d2": "/",
"app1-a0f43550e414": "bundle.js",
"app1-23d94a4a7388": "manifest.json"
};
self.addEventListener("install", function (event) {
event.waitUntil(
Promise.all(Object.keys(cacheLookup).map(cacheName =>
caches.open(cacheName)
.then(cache => cache.add(cacheLookup[cacheName]))
))
);
});
self.addEventListener("activate", event => {
event.waitUntil(
caches.keys().then(cacheNames =>
Promise.all(cacheNames.map(cacheName => {
if (cacheLookup[cacheName]) {
// cacheName holds a file still needed by this version
} else if (!cacheName.startsWith(appName + "-")) {
// Do not delete the cache of other apps at same scope
} else {
console.log("Deleting out of date cache:", cacheName);
return caches.delete(cacheName);
}
}))
)
);
});
const handleCacheMiss = request =>
new Promise((_, reject) => {
reject(Error("Not in service worker cacheLookup: " + request.url));
});
self.addEventListener("fetch", event => {
const request = event.request;
event.respondWith(
caches.match(request).then(cachedResponse =>
cachedResponse || handleCacheMiss(request)
)
);
});
I have considered bundling all my HTML, CSS and JavaScript into a giant file so it cannot be inconsistent. But a PWA need several supporting files that cannot be bundled including the service worker, manifest and icons. If I bundle all that I can, the user can still get stuck with an old version of the bundle and still have inconsistent supporting files. And anyway, in the future I would like to increase granularity by doing less bundling and having more files so on a typical update only a few small files would need to be fetched.
I have also considered uploading bundle.js and the other files with a different filename for each version. The service worker's fetch could hide the name change so other files like index.html can still refer to it as bundle.js. But I don't see how this works the first time a browser loads the app. And I don't think you can rename index.html or manifest.json.
It sounds like your request for bundle.js inside your install handler might be fulfilled by the HTTP cache instead of via the network.
You can try changing this current snippet:
cache.add(cacheLookup[cacheName])
to explicitly create a Request object that has its cache mode set to reload to ensure that the response isn't provided by the HTTP cache:
cache.add(new Request(cacheLookup[cacheName], {cache: 'reload'}))
Alternatively, if you're concerned about non-atomic global deployments and you can go through the effort to generate sha256 or better hashes as part of your build process, you can make use of subresource integrity to ensure that you're getting the correct response bytes from the network that your new service worker expects.
Adapting your code would look something like the following, where you'd have to generate the correct sha256 hashes for each file you care about at build time:
const cacheLookup = {
    "sha256-[hash]": "/",
    "sha256-[hash]": "bundle.js",
    "sha256-[hash]": "manifest.json"
};

// Later...
cache.add(new Request(cacheLookup[cacheName], {
    cache: 'reload',
    integrity: cacheName,
}));
If there's an SRI mismatch, then the request for a given resource will fail, and that will cause the cache.add() to reject, which will in turn cause the overall service worker installation to fail. Service worker installation will be retried the next time an update check happens, at which point (hopefully!) the deployment will be finished and the SRI will be valid.
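For the build-time step, here is a minimal Node.js sketch of generating an SRI value for one file (the build/ path is an assumption for illustration):
const crypto = require('crypto');
const fs = require('fs');

// SRI values are 'sha256-' followed by the base64-encoded digest of the file
function sriHash(filePath) {
    const digest = crypto.createHash('sha256')
        .update(fs.readFileSync(filePath))
        .digest('base64');
    return 'sha256-' + digest;
}

console.log(sriHash('build/bundle.js'));
Running this for each deployed file would give the keys for cacheLookup.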

Every JS file throws error until it's opened and saved on Azure

I'm deploying a node app to Azure, using local git deployment. I have it working without issues in the staging environment (also on Azure) and I'm now setting up a production environment.
Every single file involved in the app throws an error - but as soon as I open that file in my FTP client and just save it, without making changes, the error for that particular file goes away - and the next file used throws an error.
So the steps I took are something like this:
Run deployment, refresh browser.
Get error like Unexpected token ILLEGAL on server.js line 1
Save server.js in FTP client, without making changes.
Restart app, refresh browser.
server.js now has no issues, but the first file it requires, express, gives an error cannot find module ./lib/express on node_modules/express/index.js:11 (./lib/express is definitely there)
Save node_modules/express/index.js:11 in FTP client, without making changes.
Restart app, refresh browser.
Now, node_modules/express/index.js has no issues, but the first file it requires, ./lib/express will then give an error: cannot find module merge-descriptors on node_modules/lib/express.js:16
I'll stop there, but in real life I continued and the behaviour is consistently ongoing - each file errors on the first thing it tries to require, until it's been saved in the FTP client.
To top it all off, I left the app unchanged for 20 minutes, came back, and I was back to the beginning - with an Unexpected token ILLEGAL on server.js line 1 despite NO changes to my code. I tried saving each file and basically just repeated the steps above, getting the same results.
I'm completely stuck and have no idea what to do next short of saving every single file in the codebase. Any idea what could be going on here, or how I could move forwards with debugging the issue?
You most likely have a Byte Order Mark (BOM) at the beginning of your files.
There is a simple gist by Domenic Denicola showing how you can detect and remove it for all files in the current directory:
var fs = require("fs");
var path = require("path");
var wrench = require("wrench");
var BOM_CHAR_CODE = 65279;
var BOM = String.fromCharCode(BOM_CHAR_CODE);
var fileNames = wrench.readdirSyncRecursive(process.cwd()).filter(function (fileName) {
return path.extname(fileName) === ".js";
});
fileNames.forEach(function (fileName) {
fs.readFile(fileName, "utf8", function (err, fileContents) {
if (err) { throw err; }
if (fileContents.charCodeAt(0) !== BOM_CHAR_CODE) {
fs.writeFile(fileName, BOM + fileContents, "utf8", function (err) {
if (err) { throw err; }
});
}
});
});
After being in touch with Azure support, it turns out the issue was due to WEBSITE_DYNAMIC_CACHE being set to 1, not 0. The setting isn't visible in the Azure portal and is "in development and is currently in private preview." The issue I came across is a known bug with WEBSITE_DYNAMIC_CACHE and node apps.
We're still not sure how / why it got set to 1 in the first place, but it's fixed, so we don't care for now.
What a fun day!

Node.js not closing files created by fs.createReadStream()

On my server, every time a user uses our service we have to grab a JSON file for them from the server. I do this by using fs.createReadStream() inside of my own function.
function getJSONFromServer(filepath, callback) {
    var data = fs.createReadStream(filepath);
    data.on('error', function (error) {
        console.log("Caught", error);
        callback(undefined, error);
    });
    var jsonFile = "";
    data.on('data', function (chunk) {
        jsonFile += chunk;
    });
    data.on('end', function () {
        var jsonData = JSON.parse(jsonFile);
        callback(jsonData);
        data.destroy();
        data.close();
    });
}
This does the job, but it does not close the connection to the file. So after reading 1024 files (the limit on my server), Node.js produces the error EMFILE, too many open files. Then I have to kill our Node.js server and start it again, and that clears the open files.
I check the number of open files with lsof -i -n -P | grep nodejs. It displays something like this:
nodejs 13707 node 10u IPv4 1163695 0t0 TCP 127.0.0.1:55643->127.0.0.1:27017 (ESTABLISHED)
nodejs 13707 node 11u IPv4 1163697 0t0 TCP 127.0.0.1:55644->127.0.0.1:27017 (ESTABLISHED)
for as many files that are open.
I've tried using graceful-fs. I've tried calling stream.destroy() and stream.close(), but I still get the same issue. My server is essentially a ticking time bomb because we get a heavy, steady flow of users and after so many users have connected it will just stop working.
Also, ulimit -n [open file amount] does not work, and even if it did, this is not a long term solution because I'd like my file connections to close and not sit open for no reason.
I'm using Node.js version v0.10.25, Ubuntu 15.04 (GNU/Linux 3.19.0-42-generic x86_64) and the latest version of graceful-fs if that helps!
Thanks for any help you can provide.
This has got to be the stupidest mistake I've ever made. Regardless, here's the answer. I hope I can save someone from dealing with this error and almost ripping their hair out.
I was running my app with nodejs and not node. It turns out that if you do nodejs --version, it will likely return a very old version, which was v0.10.25 for me; node --version, however, was v5.6.0. Obviously this massive jump in versions fixed some stuff, so I ran the app with node app.js instead of nodejs app.js and I haven't had the issue since. There are now only 6 open files, whereas before we had over 1000 over time.
Damn it feels good to have this off my chest.
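As an aside, for grabbing a whole JSON file in one go, fs.readFile is simpler than assembling chunks from a stream, and there is no stream left to leak. A minimal sketch, keeping the question's data-first callback convention:
var fs = require('fs');

function getJSONFromServer(filepath, callback) {
    fs.readFile(filepath, 'utf8', function (err, contents) {
        if (err) { return callback(undefined, err); }
        // fs.readFile opens and closes the file descriptor itself
        callback(JSON.parse(contents));
    });
}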

Design.io can't hot-push code

Trying to use design.io with Node.js Express to get CSS/JavaScript hot-push into browsers, I cloned the example https://github.com/viatropos/design.io-example but couldn't get it to hot deploy.
Following the instructions in https://github.com/viatropos/design.io-example/README.md:
shell-1-project-dir> design.io --watch ./src
error: unknown option `--watch'
Following the instructions in https://github.com/viatropos/design.io/README.md:
shell-1-project-dir> design.io start
shell-2-project-dir> design.io watch
[Sun, 06 May 2012 03:52:04 GMT] INFO updated views/.index.jade.swp
[Sun, 06 May 2012 03:52:04 GMT] INFO updated views/index.jade
[Sun, 06 May 2012 04:03:11 GMT] INFO updated views/.index.jade.swp
[Sun, 06 May 2012 04:03:11 GMT] INFO updated views/index.jade
Doing this, I can't access http://localhost:4181/ because Node.js isn't running; I have to start Node.js myself.
shell-1-project-dir> node server.js
However this doesn't hot-push the changed index.jade file.
Seems like the example is outdated?
How do I hot deploy?
Env:
OSX-LION
node 0.6.15
First, you can have your node server running in a different terminal window (you don't have to choose between the design.io server and the node server; you can run both at the same time as long as they are on different ports).
Second, I don't think this does what you want it to. Design.io seems to be for injecting changes made to static client files like stylesheets and javascript files. You change a .css file, design.io sees the change and broadcasts it to the browser, design.io in the browser forces a stylesheet reload, completing the hot push of the change.
Jade files are a different story: they need to be processed by an interpreter before being sent to the browser (browsers don't understand Jade files). Design.io will see the change but can't do anything about it, since Node.js needs to process the new file and send an updated response to the browser. The only way it does this is if you refresh the browser page (thereby sending a new request), which is not really a hot swap.
