I am facing a weird problem with push notifications on Chrome and Firefox. I have created a push notification service in a similar fashion to the recent tutorials on this topic (such as https://developers.google.com/web/ilt/pwa/introduction-to-push-notifications), using an index.js JavaScript file to register the service worker, which for the sake of argument is named mypushservice.js. Both are located in the same folder on my web server.
The registration works fine in my development environment, and when I deploy the files on my (SSL) production server environment and call the index.js from a different test location, the registration seems to work fine as well.
However, upon integration of the application with the product that needs to utilise the push notification service, the
navigator.serviceWorker.register('./mypushservice.js')
fails with a 404 Not Found HTTP error. The JavaScript file is available (and can be registered from my test location), but somehow the registration fails when called from another environment.
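For reference, this is roughly how the registration is wired up, with error handling added to surface the 404 (a simplified sketch, not my exact code):

// Register the push service worker and log the reason if the script cannot be fetched
if ('serviceWorker' in navigator) {
  navigator.serviceWorker.register('./mypushservice.js')
    .then(function (registration) {
      console.log('Service worker registered with scope:', registration.scope);
    })
    .catch(function (err) {
      console.error('Service worker registration failed:', err); // this is where the 404 shows up
    });
}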
Has anyone had similar issues and found a solution for this problem?
Thanks
I managed to find the solution to this problem. Apparently, index.js and mypushservice.js both need to be served from the same origin in order to work. The call to index.js was made from another (client) server, which created a security problem. We managed to create a workaround by adding an HTML page to the push server that calls index.js on loading, and embedding this HTML file in an iframe on the client server. I am still looking for a neater solution, but for now this works fine.
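A rough sketch of the workaround (the host name and file names are placeholders): the client server's page embeds a small HTML page hosted on the push server, and that page loads index.js, so the registration happens from the push server's own origin.

// On the client server's page: embed the push server's register.html in a hidden iframe.
// register.html does nothing but load index.js, which calls navigator.serviceWorker.register().
var frame = document.createElement('iframe');
frame.src = 'https://push.example.com/register.html'; // placeholder URL for the push server
frame.style.display = 'none';
document.body.appendChild(frame);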
I'm trying to upgrade an old mobile application written in vanilla JavaScript + React, hosted on Cordova. This application leverages a simple API on the server side:
when the app requires a page, it sends a request to the server
the server processes the request, fetches the resources, and then replies with complete HTML+JS. The JavaScript is a ReactJS view compiled with Gulp/Browserify
the app takes the reply and stores it in a local SQLite DB, then mounts the received code and the view becomes reactive.
if the user requests a view but has no connectivity, the app searches the SQLite DB for a cached view and uses it instead of requesting a fresh copy from the server.
When developing, the React JSX code is compiled to vanilla JS straight away, so in production the API only needs to merge the vanilla JS with the HTML template. Plus, adding new features and fixing bugs is quite easy, because each user essentially downloads the updated view every time they enter it.
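For clarity, a rough sketch of the current fetch-and-cache flow (heavily simplified; saveView/loadView are placeholders wrapping the SQLite plugin, not the real implementation):

// Fetch a view from the API, cache it, and fall back to the cached copy when offline.
async function getView(name) {
  try {
    const response = await fetch('https://api.example.com/views/' + name); // placeholder endpoint
    const html = await response.text();
    await saveView(name, html);   // store the fresh copy in the SQLite DB
    return html;
  } catch (err) {
    return loadView(name);        // no connectivity: use the cached copy, if any
  }
}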
The problems with this approach are:
developing is painful because of the continuous compilation
a "base" part of the application resides on Cordova assets (basically the utilities to fetch from server, caching etc) and fixing this parts needs a new app release or ugly override patches
the caching feature often causes problems to the Sqlite database (which is used also for other stuff on the app); as a result, the DB sometimes corrupts and the user must clear the app data.
we would like to get rid of React
We have already used NuxtJS for generating static sites, and it's great, but in this case I cannot leverage the SSG because the app should be almost completely served from the API, so we can keep the easy feature-and-fix workflow.
I have never used NuxtJS SSR and wonder whether it could be suitable for my use case. For example, could I spin up a NuxtJS instance on the server side which generates the HTML output and hydrates a nearly empty JavaScript client on Cordova? Is there a better way to accomplish this task? Should I use plain Vue instead?
Thanks
I'm new to SSR, so I'm not sure whether this problem and my solution are standard practice, but I can't imagine they are.
My goal is to have a dynamic page that allows users to add/remove items on the page. I originally programmed this component with the intention of it only being a client-side React project, but now I want to put it on a server. While translating my code to the new project, I've run into a couple of errors that have to do with my backend running code that is only supposed to run on the client side.
For instance, I ran into this problem earlier (React Redux bundle.js being thrown into request), and I was able to solve that issue with a janky solution where I check that the code is running on the client side and stop execution when it's coming from the backend. Now I've had to refactor my code to not use the fetch() function, because that's not a function the Node backend recognizes, and it's now complaining about my use of the document object, because that's not a thing in Node either.
I can keep importing new modules to fix the errors and keep my website from crashing, but I feel like I'm on a small boat patching up new holes with duct tape, waiting to find the next one.
Here's an image of my config if that's necessary; I also have additional images in my previous Stack Overflow question (link above).
For the bundle.js issue, I am not sure I understand why it happens.
For the fetch issue, I think this is a common problem with SSR, and if you implement it yourself you need to handle conditions in different places of your app:
if (typeof window !== 'undefined') {
  // do client-side stuff like accessing
  // window.document
}
Basically, the most common usage of SSR is to handle the first execution of your app on the server side. This includes:
Route resolution
Fetching data (using the Node.js http module)
Hydrating stores (if you use Redux or another data library)
Rendering the UI
Once the execution is done, your server returns the bundled JS app, with the hydrated store and rendered UI, to the client. Subsequent requests or route updates will be executed on the client side, so you can directly use fetch or react-router.
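To make that flow concrete, here is a minimal sketch of the first server-side execution using Express and ReactDOMServer (App and fetchDataFor are placeholders, not your actual code):

// Resolve the route, fetch data, render the UI to a string, and embed the
// initial state so the client-side bundle can hydrate it.
const express = require('express');
const React = require('react');
const { renderToString } = require('react-dom/server');
const App = require('./App');                      // placeholder root component

const app = express();

app.get('*', async (req, res) => {
  const initialData = await fetchDataFor(req.url); // placeholder data fetching
  const html = renderToString(React.createElement(App, { data: initialData }));
  res.send(
    '<!DOCTYPE html><html><body>' +
    '<div id="root">' + html + '</div>' +
    '<script>window.__INITIAL_STATE__ = ' + JSON.stringify(initialData) + '</script>' +
    '<script src="/bundle.js"></script>' +
    '</body></html>'
  );
});

app.listen(3000);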
The pros of doing SSR are:
Great First Contentful Paint
Great for SEO
The client-side machine does less work
There are a lot of libraries that can help you with SSR, as well as frameworks like Next.js and use-http.
I am trying to execute a local PHP file from the client with an axios GET request. From what I found out, this is most likely not achievable, except that this post has an answer that helped; unfortunately it is too vague for me to understand what was really meant by "environment hasn't got PHP installed", but it gave me hope. The OP had pretty much an identical issue to mine.
The functionality of my PHP code is this: after taking the params from the GET URL, it downloads a file into the same directory as the PHP file. I'm doing this so I can have local access to the downloaded file in my client. I am also doing it like that because I don't have access to the server side of the project.
If this is not possible after all, I found out that running my project through the XAMPP Apache server could work for me, but I'm not sure if it is going to be ideal. I already confirmed that the PHP code works by running it through Apache and using an axios GET request to execute it, but I need the downloaded file in my project's local dir.
Well, as of this moment there is no way to execute a PHP file on the client without a server. But building and deploying my project on the XAMPP Apache server did the trick better than I expected. It helped me simulate the path of the downloaded file so I can access it.
As @ceejayoz mentioned in his comment, "they'll have to put the file in the right place, and you'll need to know the FQDN or IP address of their local server", so that the connection between the client and the file on the server has no issues.
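For illustration, the client-side call ends up looking roughly like this (the host, path, and parameter names are placeholders, not my real ones):

// Ask the locally served PHP script to download the file; the params arrive in $_GET on the PHP side.
axios.get('http://localhost/myproject/download.php', {
    params: { fileUrl: 'https://example.com/data.csv' } // placeholder parameter
  })
  .then(function (response) { console.log('PHP script responded:', response.data); })
  .catch(function (err) { console.error('Request failed:', err); });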
The question is: say I have written a backend REST service using Python, and then some other guy wrote some frontend code in AngularJS. Is the typical workflow to put them into two separate folders and run them separately? So the process would look like this:
python manage.py runserver
in the Python folder, and then probably
# in the angular2 folder
npm start
Or should I place all the JS code into some assets folder so that, when I start my server, all the JS code is served automatically along with it? If so, how should I do it?
A related question is: when is all the JS code sent to users' browsers? If the app is a client-side rendering app, is the code sent to the browser on the first request to the server? How does the server know it should package your JS code and ship it?
Q) The question is: say I have written a backend REST service using Python, and then some other guy wrote some frontend code in AngularJS. Is the typical workflow to put them into two separate folders and run them separately?
Both the Angular and Python parts can be run separately as well as together.
You could choose to place the Angular files (which, for all practical purposes, qualify as public files) in the public (or equivalent) folder, depending on which framework you're using: Django, Flask, Web2py, and so on.
You can also choose to run them independently as standalone apps.
It depends on your requirements. Honestly, there are so many ways to answer this question.
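For example, when the two run as separate apps during development, the Angular code simply calls the Python service over HTTP (the port and endpoint here are assumptions, and you may need to enable CORS on the backend):

// AngularJS controller fetching data from the separately running Python REST service
// (e.g. Django's runserver on port 8000).
angular.module('app', [])
  .controller('ItemsCtrl', function ($http) {
    var vm = this;
    $http.get('http://localhost:8000/api/items/')  // placeholder endpoint
      .then(function (response) { vm.items = response.data; })
      .catch(function (err) { console.error('API call failed:', err); });
  });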
Q) Or should I place all the JS code into some assets folder so that, when I start my server, all the JS code is served automatically along with it? If so, how should I do it?
If you place everything in your assets folder, then as soon as the home route or any other route is requested from a browser, the contents of the public folder are passed on to the browser.
Although I'm not sure whether that is the case with server-side rendering.
If you're using Angular 1, I do not think it fits server-side rendering, although there are some libraries out there you could use.
Q) A related question is: when is all the JS code sent to users' browsers? If the app is a client-side rendering app, is the code sent to the browser on the first request to the server? How does the server know it should package your JS code and ship it?
All the files in the PUBLIC folder on the server are accessible by a browser.
All of your questions seem to essentially ask the same question.
There are many approaches to this problem.
If infrastructure management is difficult for you, then maybe it's easier to place them on the same server. You can create another directory there and place the HTML that is served to the browser in it, together with your JavaScript code.
If you have a good pipeline (which I think pays for itself), then having another server that serves your application is better. You can do more things without affecting your service, like caching, etc., and the server that runs your service won't be busy serving assets as well.
I have a case where we run a Node server which serves the HTML and JavaScript to our users, because that way we can do server-side rendering.
The flow of code execution will be this: once your user's browser hits the server that is serving the HTML and assets, it will download them locally and start JavaScript execution (parsing, compiling and executing). Your JavaScript app might make some API calls upon initialization, or it might not, but your service will only be executed if the frontend makes a request.
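As a rough sketch of that setup (the folder name and API route are just examples), a single Node process can serve the static assets and expose the service:

// Everything in ./public (index.html, bundle.js, ...) is sent to the browser on request;
// the service itself is only executed when the frontend calls the API.
const express = require('express');
const app = express();

app.use(express.static('public'));

app.get('/api/items', (req, res) => {
  res.json([{ id: 1, name: 'example' }]); // placeholder data
});

app.listen(3000, () => console.log('Listening on http://localhost:3000'));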
As for CORS, it is irrelevant how you serve them, because you can always keep them on the same domain.
Maybe I'm misunderstanding how Node.js works, but I would like to use it just as a server backend for a web app, without it running as a service / listening on a port.
I'm willing to hear ideas to better solve the issue, this app will only be available on our intranet.
Example of what I'm thinking:
backend server.js:
function connectDb(usr, pwrd){
  // Some npm package code to connect to a db
  return console.log("Successfully connected")
}
frontend javascript.js:
require("server.js")
$(".connect.button").on("click", function(e){
  connectDb($(".connect.user").text(), $(".connect.pwrd").text())
})
There are two different aspects of your question and code example that you could work on to get a better understanding of the ecosystem.
Client / Server
When a client wants to get some resource from a server, it connects to a specific port on that server, on which the back-end application is "listening". That means that, to be able to serve resources coming from a database, you must have a Node process listening on a port, fetching the requested resources from the database, and returning them. The perfect format for that kind of data exchange is JSON.
To get a better understanding of this process, you may want to try and write a simple Node app that sends a piece of JSON over the network when it receives a request, and try to load it with an XHR in client code (for example with jQuery's AJAX method). Then, try and serve a dynamic piece of JSON coming from a database, with a query based on the request's content.
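A minimal sketch of that exercise (the port, URL and payload are just examples):

// server.js: a bare Node HTTP server that answers every request with JSON.
const http = require('http');

http.createServer(function (req, res) {
  const payload = { message: 'hello from the server', requestedUrl: req.url };
  res.writeHead(200, { 'Content-Type': 'application/json' });
  res.end(JSON.stringify(payload));
}).listen(3000);

// In the client code (loaded via a <script> tag, with jQuery available):
// $.getJSON('http://localhost:3000/items', function (data) {
//   console.log(data.message);
// });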
Module loading
require("server.js") only works in Node, and can't be used in JavaScript that is running in a client's browser (Well, at least for now. Maybe some kind of module loading could be normalised for browsers, but that's another debate.).
To use a script in a client browser, you have to include it in the loaded page with a <script> tag.
In Node, you can load a script file with require. However, said script must declare what functions or variables are exposed to the scripts that require it. To achieve this, you must export these variables or functions by setting module.exports.
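For example, applied to the server.js sketched in the question (a sketch only, without a real database connection):

// server.js (Node side): expose connectDb to other Node scripts.
function connectDb(usr, pwrd) {
  // Some npm package code to connect to a db would go here.
  console.log("Successfully connected");
}

module.exports = { connectDb };

// Another Node script can then do:
// const { connectDb } = require('./server.js');
// connectDb('user', 'password');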
See this article to get some basic understanding, and this part of Node docs to master all the details of module loading. This is quite important, as this will help you structure your app and avoid strange bugs.
For one thing, node itself isn't a web server: it's a JS interpreter, which (among other things) can be used to write a web server. But node itself isn't a web server any more than Java is.
But if you want things to be able to connect to your node program, in order to do things like access a database, or serve a webpage, then, yeah, your program needs to be listening on some port on the machine it's running on.
Simply having your Node program listening on a specific port on your machine doesn't mean that anyone else can access it; but that's really a networking question, not a programming question.
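As an illustration of that last point (the port and addresses are just examples), the address you bind to decides who can reach the process: the loopback interface keeps it local to the machine, while 0.0.0.0 exposes it to the intranet.

// Only processes on this machine can reach the server when it binds to 127.0.0.1.
const http = require('http');

http.createServer(function (req, res) { res.end('intranet backend'); })
    .listen(3000, '127.0.0.1'); // change to '0.0.0.0' to accept intranet clients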