Use React or Vue over server-side template engines - javascript

For web development, React or Vue seem to be must-have frameworks. So far I only have some experience with Node.js and Handlebars.
I know Handlebars does server-side rendering, while Vue and React do client-side rendering. Using Vue or React makes reusable components possible; using a server-side template engine requires a base layout.
Let's say I have a website / web application with some routes. Why should I use Vue components for the HTML instead of Handlebars HTML?
What I have learned so far is that whenever I can:
improve my SQL statement, I should do it before manipulating the data in code;
do something on the client that can also be done by the server, I should do it on the server because it has more power.
Everyone is using Vue or React now, so why should I leave the old structure and start moving code to the client?

I believe your question is more tied to the concept of server-side rendering and client-side rendering.
Server-side rendering has a few important points that you should consider when evaluating a problem:
It is faster to render the page on the server than on the client.
It is much better for SEO (Search Engine Optimisation), since crawlers can crawl the entire page. Because some crawlers do not evaluate/run JavaScript, an SPA (Single Page App) will probably result in an empty page. Even though Google has improved quite a lot with SPA SEO, server-side rendering is still the best option.
Client-side rendering, using SPAs, has different advantages:
It is much easier to manipulate and maintain user state on the client side, since you can have your webpage broken down into components.
Interactions/page changes are faster after the first load, since, in most cases, the entire app is loaded at once with the first request.
So with that in mind, you have to consider what you want to do. If you are building a website with a more page-like structure, like a news site or blog, server-side rendering is probably the best way to go.
On the other hand, if you are building a full-blown application that has loads of user interactions and state management, client-side rendering (or SPA) could be the best option.
There is no reason to outsource your code to the client-side without evaluating your problem first. That really depends on the problem you are trying to solve.
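To make the contrast between the two models concrete, here is a minimal sketch (the route, component, and data below are invented, not taken from the question): first a Handlebars template rendered per request on an Express server, then a small Vue 3 component rendered in the browser (this assumes a Vue build that includes the runtime template compiler).

```javascript
// Server-side rendering: the browser receives finished HTML on every request.
const express = require('express');
const Handlebars = require('handlebars');

const app = express();
const template = Handlebars.compile('<h1>Hello, {{name}}!</h1>');

app.get('/hello/:name', (req, res) => {
  // HTML is produced on the server, per request
  res.send(template({ name: req.params.name }));
});

app.listen(3000);
```

```javascript
// Client-side rendering: the server only ships an empty <div id="app"></div>
// plus this script; the HTML is produced (and kept reactive) in the browser.
import { createApp } from 'vue';

createApp({
  data() {
    return { name: 'World' };
  },
  template: '<h1>Hello, {{ name }}!</h1><input v-model="name">',
}).mount('#app');
```

The Handlebars version is finished once the response is sent; the Vue version keeps running in the browser and can react to user input without another round trip.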

May I refer you to this article; as you can see, it's not all black and white. From the cited article:
Server-side pros:
Search engines can crawl the site for better SEO.
The initial page load is faster.
Great for static sites.
Server-side cons:
Frequent server requests.
An overall slow page rendering.
Full page reloads.
Non-rich site interactions.
Client-side pros:
Rich site interactions.
Fast website rendering after the initial load.
Great for web applications.
Robust selection of JavaScript libraries.
Client-side cons:
Low SEO if not implemented correctly.
Initial load might require more time.
In most cases, requires an external library.
However, there are frameworks that do universal rendering, such as Next.js and Nuxt.js (built around React and Vue, respectively), which can leverage the powers of both the client and the server, so you can have the best of both worlds. This usually involves sending the fully rendered application in the initial payload from the server to the browser and then loading the application code browser-side afterwards.
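As a rough sketch of what that looks like in Next.js (the page path, API URL, and fields below are invented): the page component runs on the server for the initial request and is then hydrated and reused in the browser for subsequent navigation.

```javascript
// pages/article/[id].js -- a hypothetical Next.js page.
// getServerSideProps runs on the server for the first request;
// the same component is hydrated on the client afterwards.
export async function getServerSideProps({ params }) {
  const res = await fetch(`https://api.example.com/articles/${params.id}`);
  const article = await res.json();
  return { props: { article } };
}

export default function Article({ article }) {
  return (
    <article>
      <h1>{article.title}</h1>
      <p>{article.body}</p>
    </article>
  );
}
```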

Related

Server-side rendering with JavaScript frameworks: What are the benefits over established backend-technologies?

In recent times, JavaScript frameworks for server-side rendering have become popular, for example Next.js and Nuxt.js.
I wonder: do those frameworks have any special benefit over established, pure backend technologies (Python Django, Ruby on Rails, PHP Laravel, ...)?
Or is it just about using the technology you already know from the frontend on the backend too, so that you don't have to switch between different languages?
The primary advantage of frameworks like Nuxt.js is that they implement server-side rendering of client-side framework code. (In the case of Nuxt.js it's Vue.js, but there are equivalents for React and Angular; in fact, there is probably more than one for each of them.)
You write your code using the client-side framework, and you can deploy it:
client-side, or
server-side for a static site, or
a combination of both
...all with the same technologies (including JavaScript rather than [say] PHP, but also the same framework tech).
Server-side rendering lets you present something meaningful to bots, or pre-bake common page views rather than rendering them on the client, or load a static version quickly via edge-cached resources then "hydrate" it to make it dynamic, ...
The purpose of server-side rendering in the React world is to solve the problem of getting content to the user as quickly as possible, not so much because there are backend technologies in other programming languages that we don't want to bother with. After all, I have put together both server-side-rendered and non-server-side-rendered React applications with an Express server, so it can all be done with just JavaScript.
Saying that it's about getting content to the user as quickly as possible may not mean much unless you picture the loading sequence: the browser requests a page...and then we wait...the browser requests a JS file...and then we wait...the React app boots and requests JSON from the backend...and then we wait...and finally the content is visible.
Now, this is not happening in hours or minutes but in seconds to milliseconds, yet that can make a huge difference to the success of a business. Perhaps this article from Fast Company makes the point:
https://www.fastcompany.com/1825005/how-one-second-could-cost-amazon-16-billion-sales
These big retailers have proven that loading a page as quickly as possible leads to improved conversion rates and increased user satisfaction, so it's in our best interest as engineers to condense this loading process as much as we can, and thus we have server-side rendering.
We want to get content visible to the user as quickly as possible. We want one request and...boom! the user can start enjoying the application.
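A minimal sketch of that idea (file and component names are made up, and this uses the pre-React-18 hydrate API): the server sends fully rendered HTML in the very first response, and the client script then attaches event handlers to the existing markup instead of rendering from scratch.

```javascript
// server.js -- Express + ReactDOMServer (a sketch, not production code)
const express = require('express');
const React = require('react');
const ReactDOMServer = require('react-dom/server');
const App = require('./App'); // the same component the browser bundle uses

const app = express();

app.get('/', (req, res) => {
  const html = ReactDOMServer.renderToString(React.createElement(App));
  res.send(`<!doctype html>
    <html>
      <body>
        <div id="root">${html}</div>
        <script src="/bundle.js"></script>
      </body>
    </html>`);
});

app.listen(3000);
```

```javascript
// client.js -- bundled into /bundle.js; attaches event handlers
// to the markup that is already on the page.
import React from 'react';
import { hydrate } from 'react-dom';
import App from './App';

hydrate(<App />, document.getElementById('root'));
```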

Laravel + ReactJS loose components in blade views. A good practice?

The following is unclear to me. So far, I have only seen loose ReactJS components embedded in Single Page Applications, not in views.
I was wondering if one could use ReactJS in a Laravel application in combination with the Blade template engine? I have a Laravel project and I like the way ReactJS binds to the DOM, but I do not need an entire JS SPA.
So is it possible AND good practice to use several ReactJS components loosely in Blade views? For example, a React table component and a header message component that are also able to communicate with each other.
There is a package that does just that.
However, I would argue it is better practice to separate the front end and back end completely, so you can use Laravel to serve the data through REST API endpoints and then display everything using a React app.
This way they will be loosely coupled, and if you ever want to swap one out for the other (e.g. swap React for Angular), it will be a lot easier, as your API endpoints wouldn't have to change at all.
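For the decoupled approach, the React side talks to Laravel purely over JSON. A small sketch (the /api/messages endpoint and field names are hypothetical):

```javascript
import React, { useEffect, useState } from 'react';

// A hypothetical component that consumes a Laravel REST endpoint.
// Laravel only serves JSON; React owns all of the rendering.
function Messages() {
  const [messages, setMessages] = useState([]);

  useEffect(() => {
    fetch('/api/messages')           // an endpoint you would define in routes/api.php
      .then((res) => res.json())
      .then(setMessages);
  }, []);

  return (
    <ul>
      {messages.map((m) => (
        <li key={m.id}>{m.body}</li>
      ))}
    </ul>
  );
}

export default Messages;
```

On the Laravel side this would just be a route returning JSON; React never touches Blade.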
EDIT: Some benefits of a headless CMS (taken from pantheon.io):
By shifting responsibility for the user experience completely into the browser, the headless model provides a number of benefits:
Value: future-proof your website implementation, lets you redesign the site without re-implementing the CMS.
Sets frontend developers free from the conventions and structures of the backend. Headless development not only eliminates "div-itis", it gives frontend specialists full control over the user experience using their native tools.
Speeds up the site by shifting display logic to the client-side and streamlining the backend. An application focused on delivering content can be much more responsive than one that assembles completely formatted responses based on complex rules.
Builds true interactive experiences for users by using your website to power fully functional in-browser applications. The backend becomes the system of record and "state machine", but back-and-forth interaction happens real-time in the browser.
Let's say you also want to build a mobile app using React Native that would use the same underlying code base as your React web app. If you have a decoupled CMS, where your backend only serves data, then you can make calls to that same backend from both your web app and your mobile app without worrying about content types, response formats, etc. In fact, with React Native you can use the same React codebase for the mobile app along with the web app and only have to change some views around.
It's a great way to reuse your code with better modularity and separation of concerns.
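For example, a tiny shared API client (the module name and endpoint are hypothetical) could be imported unchanged by both the React web app and the React Native app, since fetch is available in both environments:

```javascript
// api.js -- shared between the web app and the React Native app
const BASE_URL = 'https://example.com/api';

export async function getPosts() {
  const res = await fetch(`${BASE_URL}/posts`);
  if (!res.ok) throw new Error(`Request failed with status ${res.status}`);
  return res.json();
}
```

Only the views differ: the web app might render the posts as a list of `<li>` elements, while the native app renders a FlatList, but both call getPosts().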

Using React (with Redux) as a component in a website

I have a large, globalised web site (not a web app), with 50k+ pages of content which is rendered on a cluster of servers using quite straightforward NodeJS + Nunjucks to generate HTML. For 90% of the site, this is perfectly acceptable, and necessary to achieve SEO visibility, particularly in non-Google search engines which don't index JS well (Yandex, Baidu, etc)
The site has become a bit clunky as complexity has increased over time, and I'd like to re-architect some of the functional components that are currently built mostly with progressively enhanced jQuery. I've been looking at React for this, with the Redux implementation of the Flux pattern.
Now my question is simply this: nearly 100% of the tutorials assume I'm building some sort of SPA, which I'm not. I just want to build a set of containerised, reusable components that I can plug in to replace the jQuery components. Oh, and they have to be WCAG AA/508 accessible as well.
Does React play well with being retrofitted into websites and are there any specific considerations around SEO, bootstrapping, accessibility? Examples of implementations or tutorials would be appreciated.
You can mount a React component to any DOM node on your page, which makes it easy to insert components into statically generated content.
Most search engines, like Google, will wait for JS files to load before they index the page, so a page with a React component will be indexed perfectly fine. However, if you want to be 100% sure that your page is rendered correctly by all crawling bots, you have to take a look at React server-side rendering. If you already use Node.js for the backend, it should not be a big problem.
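As a small sketch (element ids and component names are invented, using the pre-React-18 ReactDOM.render API), each server-rendered page can ship a few placeholder elements plus one script that mounts a component into each placeholder it finds:

```javascript
// widgets.js -- included on the statically generated page.
// Each React component takes over one placeholder element;
// the rest of the server-rendered HTML is left untouched.
import React from 'react';
import ReactDOM from 'react-dom';
import DataTable from './DataTable';
import HeaderMessage from './HeaderMessage';

const tableEl = document.getElementById('data-table-root');
if (tableEl) {
  ReactDOM.render(<DataTable endpoint={tableEl.dataset.endpoint} />, tableEl);
}

const headerEl = document.getElementById('header-message-root');
if (headerEl) {
  ReactDOM.render(<HeaderMessage />, headerEl);
}
```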
I have never encountered that kind of problem, but my best guess would be to use ReactDOMServer.renderToString to render the component on the server and then replace a node in your static HTML layout. The implementation would depend on the template language you use. You could use something like Handlebars to dynamically create helpers from React components, so that in your static HTML page you could use them as {{my-component}}. But these are only my speculations on the subject; maybe there is a more elegant solution.
Here is the article that could help.
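To make that speculation concrete, here is a rough sketch of such a helper (the helper and component names are invented; a real setup would also need a build step so the component itself can be written in JSX):

```javascript
// Register a Handlebars helper that renders a React component to an HTML string.
const Handlebars = require('handlebars');
const React = require('react');
const ReactDOMServer = require('react-dom/server');
const MyComponent = require('./MyComponent');

Handlebars.registerHelper('my-component', function (options) {
  const html = ReactDOMServer.renderToString(
    React.createElement(MyComponent, options.hash)
  );
  // SafeString so Handlebars doesn't escape the generated markup
  return new Handlebars.SafeString(html);
});

// Usage in a template: {{my-component title="Hello"}}
const page = Handlebars.compile('<div class="widget">{{my-component title="Hello"}}</div>');
console.log(page({}));
```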
You'll be happy to know that this is all possible through something called isomorphic JavaScript. Basically, you just use React and JSX to render HTML on the server, which is then sent to the browser as a fully built web page. This does not assume your app is an SPA; rather, you'll have multiple endpoints for rendering different pages, much like you probably already have.
The benefit here is that you can use the React/Redux architecture but still allow your site to be indexed by crawlers, as requests to your app will yield static pages, not a single page with lots of JS to make it work. You're also free to refactor gradually by converting your Nunjucks-rendered endpoints to React one at a time, instead of making one big jump to SPA land.
Here's a good tutorial I found on making isomorphic React apps with node:
https://strongloop.com/strongblog/node-js-react-isomorphic-javascript-why-it-matters/
EDIT: I may have misread your actual desire, which is to inject React components into your existing web pages. This is also possible: you'll probably want to use ReactDOMServer to render your components to static markup, and then you can inject that markup string into your Nunjucks templates.
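A rough sketch of that injection (the component name and markup are invented), using ReactDOMServer.renderToStaticMarkup and Nunjucks' safe filter so the string is not escaped:

```javascript
// Render a React component to plain markup (no React-specific attributes)
// and hand the resulting string to a Nunjucks template.
const React = require('react');
const ReactDOMServer = require('react-dom/server');
const nunjucks = require('nunjucks');
const PriceTable = require('./PriceTable');

const widgetHtml = ReactDOMServer.renderToStaticMarkup(
  React.createElement(PriceTable, { currency: 'EUR' })
);

const page = nunjucks.renderString(
  '<section class="prices">{{ widget | safe }}</section>',
  { widget: widgetHtml }
);

console.log(page);
```

renderToStaticMarkup fits here because the injected widget is not hydrated afterwards; if you later want React event handlers on the client, renderToString plus hydration is the better fit.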

React server-side render or static index.html?

As you can see in the React manual (ReactDOMServer):
If you call ReactDOM.render() on a node that already has this
server-rendered markup, React will preserve it and only attach event
handlers, allowing you to have a very performant first-load
experience.
So does that mean that if I use a static index.html in which I just include my React app's JS file, I don't have to use server-side rendering?
By the way, which React app architecture is better for SEO?
Thanks for your answers!
In theory, it's true that you can use a static index.html. React will render the page on the client side and update your HTML. This has become much easier to do with React 15, as you no longer need to maintain data-reactid attributes.
Nonetheless, I'd recommend using SSR (server side rendering) because it makes life easier. Granted, it takes effort to set up but it's beneficial. You also get to make use of server side routing, critical path css, and more.
If you want SEO, universal apps are the way to go. Two excellent architectures are:
React redux universal hot example
React starter kit
Good luck!
So does that mean that if I use a static index.html in which I just include my React app's JS file, I don't have to use server-side rendering?
Of course. You can certainly use React purely on the client without any server rendering. However, server-side rendering can be beneficial for graceful degradation. It also helps from a usability perspective, as your user won't have to wait for JavaScript to be downloaded and executed before any content can be shown.
By the way, which React app architecture is better for SEO?
Search engines have now matured significantly in their ability to crawl dynamic pages. However, support for JavaScript-generated content is still a work in progress in most engines. As the Google Webmasters blog explains:
Sometimes the JavaScript may be too complex or arcane for us to execute, in which case we can’t render the page fully and accurately.
Some JavaScript removes content from the page rather than adding, which prevents us from indexing the content.
So from an SEO perspective, it is still better to opt for server-side rendering.

Using node.js to serve content from a Backbone.js app to search crawlers for SEO

Either my google-fu has failed me or there really aren't too many people doing this yet. As you know, Backbone.js has an Achilles heel: it cannot serve the HTML it renders to page crawlers such as Googlebot, because they do not run JavaScript (although given that it's Google, with their resources, the V8 engine, and the sobering fact that JavaScript applications are on the rise, I expect this to happen someday). I'm aware that Google has a hashbang workaround policy, but it's simply a bad idea. Plus, I'm using pushState. This is an extremely important issue for me, and I would expect it to be for others as well; SEO cannot be ignored for the many applications out there that require or depend on it.
Enter Node.js. I'm only just starting to get into this craze, but it seems possible to have the same Backbone.js app that exists on the client also run on the server, hand in hand with Node.js. Node.js would then be able to serve HTML rendered from the Backbone.js app to page crawlers. It seems feasible, but I'm looking for someone who is more experienced with Node.js, or even better, someone who has actually done this, to advise me on it.
What steps do I need to take to allow me to use node.js to serve my Backbone.js app to web crawlers? Also, my Backbone app consumes an API that is written in Rails which I think would make this less of a headache.
EDIT: I failed to mention that I already have a production app written in Backbone.js. I'm looking to apply this technique to that app.
First of all, let me add a disclaimer that I think this use of node.js is a bad idea. Second disclaimer: I've done similar hacks, but just for the purpose of automated testing, not crawlers.
With that out of the way, let's go. If you intend to run your client-side app on server, you'll need to recreate the browser environment on your server:
Most obviously, you're missing the DOM (Document Object Model), basically the AST on top of your parsed HTML document. The Node.js solution for this is jsdom.
That, however, will not suffice. Your browser also exposes the BOM (Browser Object Model): access to browser features like, for example, history.pushState. This is where it gets tricky. There are two options. You can try to bend PhantomJS or CasperJS to run your app and then scrape the HTML off it; that's fragile, since you're running a huge full WebKit browser with the UI parts sawn off.
The other option is Zombie, which is a lightweight re-implementation of browser features in JavaScript. According to its page it supports pushState, but my experience is that the browser emulation is far from complete; still, give it a try and see how far you get.
I'm going to leave it to you to decide whether pushing your rendering engine to the server side is a sound decision.
Because Node.js is built on V8 (Chrome's engine), it will run JavaScript, including Backbone.js. Creating your models and so forth would be done in exactly the same way.
The Node.js environment of course lacks a DOM, so this is the part you need to recreate. I believe the most popular module is:
https://github.com/tmpvar/jsdom
Once you have an accessible DOM API in Node.js, you simply build its nodes as you would for a typical browser client (maybe using jQuery) and respond to server requests with the rendered HTML (via $("myDOM").html() or similar).
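A small sketch of that approach (routes and markup invented, and using jsdom's current JSDOM API, which is newer than what existed when this question was asked):

```javascript
// Build a DOM on the server with jsdom, manipulate it with jQuery,
// and send the serialized HTML back to the crawler.
const express = require('express');
const { JSDOM } = require('jsdom');
const jqueryFactory = require('jquery');

const app = express();

app.get('/items', (req, res) => {
  const dom = new JSDOM('<!doctype html><html><body><ul id="items"></ul></body></html>');
  const $ = jqueryFactory(dom.window);

  // In the real app this data would come from the Rails API.
  ['one', 'two', 'three'].forEach((name) => {
    $('#items').append(`<li>${name}</li>`);
  });

  res.send(dom.serialize());
});

app.listen(3000);
```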
I believe you can take a fallback-strategy type of approach. Consider what would happen when a link is clicked with JavaScript turned off versus with JS on. Anything on your page that should be crawlable should have a reasonable fallback when JavaScript is turned off. Your links should always have the server URL as their href, and the default action should be prevented with JavaScript.
I wouldn't say this is necessarily Backbone's responsibility. The only things Backbone can help you with here are modifying your URL when the page changes and making your models/collections usable on both the client and the server. The views and routers, I believe, would be strictly client side.
What you can do, though, is make your Jade pages and partials renderable from the client side or the server side, with or without content injected. This way the same page can be rendered either way: if you replace a big chunk of your page and change the URL, the HTML you are grabbing can come from the same template as if someone had gone directly to that page.
When your server receives a request, it should take the user directly to that page rather than go through the main entry point, load Backbone, and have it manipulate the page into the state the URL implies.
I think you should be able to achieve this just by rearranging things in your app a bit; no real rewriting, just a good amount of moving things around. You may need to write a controller that serves HTML files with or without content injected. This gives your Backbone app the HTML it needs to couple with the data from the models. As I said, those same templates can be used when you hit those links directly through the routes defined in Express/Node.js.
This is on my todo list of things to do with our app: have Node.js parse the Backbone routes (stored in memory when the app starts) and, at the very least, serve the main page templates as straight HTML. Anything more would probably be too much overhead/processing for the backend when you consider thousands of users hitting your site.
I believe Backbone apps like Airbnb do it this way as well, but only for robots like the Google crawler. You also need this for things like Facebook likes, as Facebook sends out a crawler to read your og: tags.
A working solution is to use Backbone everywhere:
https://github.com/Morriz/backbone-everywhere but it forces you to use Node as your backend.
Another alternative is to use the same templates on the server and the front end.
The front end loads Mustache templates using the require.js text plugin, and the server renders the page using the same Mustache templates.
A further addition is to render bootstrapped module data in a script tag as JSON, to be used immediately by Backbone to populate models and collections.
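A sketch of that bootstrapped-data idea (variable and endpoint names are invented): the server embeds the initial JSON in a script tag, and Backbone consumes it on page load instead of making an extra fetch.

```javascript
// In the server-rendered page, the initial data is embedded as JSON:
//   <script>window.BOOTSTRAP_DATA = { "posts": [ /* ... */ ] };</script>

// On the client, Backbone populates its collection immediately:
var PostCollection = Backbone.Collection.extend({
  url: '/api/posts'
});

var posts = new PostCollection(window.BOOTSTRAP_DATA.posts);

// Later interactions still sync against the same URL,
// e.g. posts.fetch() or posts.create({ ... }).
```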
Basically you need to decide what it is that you're serving: is it a true app (i.e. something that could stand in as a replacement for a dedicated desktop application), or is it a presentation of content (i.e. classical "web page")? If you're concerned about SEO, it's likely that it's actually the latter ("content site") and in that case the "single-page app" model isn't appropriate; you really want the "progressively enhanced website" model instead (look up such phrases as "unobtrusive JavaScript", "progressive enhancement" and "adaptive Web design").
To amplify a little, "server sends only serialized data and client does all rendering" is only appropriate in the "true app" scenario. For the "content site" scenario, the appropriate model is "server does main rendering, client makes it look better and does some small-scale rendering to avoid disruptive page transitions when possible".
And, by the way, the objection that progressive enhancement means "making sure that a sighted user doesn't get anything better than a blind user who uses text-to-speech" is an expression of political resentment, not reality. Progressively enhanced sites can be as fancy as you want them to be from the perspective of a user with a high-end rendering system.
