I am building a full-stack web application with Next.js that lets users create and manage letters (applications) based on pre-defined templates. When a user successfully creates an application, it is saved to a Postgres database hosted on Supabase. On the home page, the applications created by the user are fetched and displayed as a list. When the user chooses to preview an application, dynamic routing takes over, with the application ID as the dynamic parameter: getStaticPaths() reads the route parameters from the database, and getStaticProps() fetches the page data for that application ID at build time. This works seamlessly on localhost but not on Vercel. The interesting part is that dynamic routing on Vercel works for past applications on every deployment, that is, the user can preview their past applications without any problem. But when they create a new application and then try to preview it, they get a 404 error. If I trigger a redeployment, either manually or by a commit to the main branch of my repository, the error is fixed for the particular application that was giving the error.
export const getStaticPaths = async () => {
  // Pick the API endpoint based on the environment
  let APIendpoint;
  if (process.env.NODE_ENV === 'development') {
    APIendpoint = 'http://localhost:3000/api/fetchApplication';
  } else if (process.env.NODE_ENV === 'production') {
    APIendpoint = 'https://templaterepo.vercel.app/api/fetchApplication';
  }

  // Fetch every application ID known at build time
  const data = await getPaths(APIendpoint);
  const paths = data.map((application) => {
    return {
      params: { id: application.appid.toString() },
    };
  });

  return {
    paths,
    fallback: 'blocking',
  };
};

export async function getStaticProps(context) {
  const appID = context.params.id;

  let APIendpoint;
  if (process.env.NODE_ENV === 'development') {
    APIendpoint = 'http://localhost:3000/api/fetchApplicationwithID';
  } else if (process.env.NODE_ENV === 'production') {
    APIendpoint = 'https://templaterepo.vercel.app/api/fetchApplicationwithID';
  }

  // Fetch the data for this particular application at build time
  const data = await getPageData(APIendpoint, appID);
  return {
    props: { data },
  };
}
Here is the code for the dynamic [id].js page, where I first get the paths based on the application IDs and then, in getStaticProps(), fetch the data for the page corresponding to that application ID at build time. It works as expected on localhost, but on the Vercel deployment I get a 404 error before these functions even execute.
Note: the Vercel Framework Preset is set to Next.js.
I tried a variety of solutions, including adding the to and as parameters to the Link component. I also changed my vercel.json file to the configuration below:
{
  "rewrites": [{ "source": "/(.*)", "destination": "/index.html" }]
}
But nothing seems to work.
When they create an application and then try to preview it, they are prompted with the 404 error. But if I trigger a redeployment either manually or by a commit to the main branch of my repository, the error is fixed for the particular application which was giving the error.
This is expected: the data needed to build each dynamic page is fetched ONLY at build time. Since you are using getStaticProps, you can implement ISR (Incremental Static Regeneration) by adding a revalidate prop to the getStaticProps return value. That way, when a page (like a newly created application) was not generated at build time, Next.js will server-render it on the first request and then cache it for subsequent requests.
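As a minimal sketch, this is what adding revalidate to your existing getStaticProps could look like (the 60-second interval is just an example value; getPageData and the endpoints are your own helpers from the snippet above):

export async function getStaticProps(context) {
  const appID = context.params.id;

  const APIendpoint =
    process.env.NODE_ENV === 'development'
      ? 'http://localhost:3000/api/fetchApplicationwithID'
      : 'https://templaterepo.vercel.app/api/fetchApplicationwithID';

  const data = await getPageData(APIendpoint, appID);

  return {
    props: { data },
    // Re-generate this page in the background at most once every 60 seconds
    revalidate: 60,
  };
}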
In development mode, both getStaticProps and getStaticPaths run on every request (much like getServerSideProps), which is why you don't see this issue in the dev environment. This behavior is described in the docs.
If you implement ISR and want to display a loading UI while the page is being server-rendered, set fallback: true in getStaticPaths and, at the component level, use the router.isFallback flag to render the loading UI accordingly. Otherwise, leave it as you already have it, with fallback: 'blocking'.
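For example, a rough sketch of the fallback UI at the component level (the component name and markup are placeholders):

import { useRouter } from 'next/router';

export default function ApplicationPreview({ data }) {
  const router = useRouter();

  // Rendered while Next.js is generating the page on the server (fallback: true)
  if (router.isFallback) {
    return <div>Loading application…</div>;
  }

  return <div>{/* render the application preview from `data` here */}</div>;
}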
Also, make sure you write the server-side code directly in getStaticProps and getStaticPaths instead of calling your own API routes from these functions, as the docs recommend.
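For instance, a hedged sketch of getStaticPaths querying Supabase directly instead of hitting your own /api/fetchApplication route (the client setup, the env variable names, and the applications table name are assumptions; appid comes from your snippet):

import { createClient } from '@supabase/supabase-js';

// Hypothetical client setup — adjust the env variable names to your project
const supabase = createClient(
  process.env.SUPABASE_URL,
  process.env.SUPABASE_SERVICE_ROLE_KEY
);

export const getStaticPaths = async () => {
  // Query the table directly at build time instead of calling your own API route
  const { data } = await supabase.from('applications').select('appid');

  return {
    paths: (data ?? []).map((application) => ({
      params: { id: application.appid.toString() },
    })),
    fallback: 'blocking',
  };
};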
Related
I have a project with React and Next.js. I am developing a dynamic page with getStaticPaths and getStaticProps, so I fetch most of the data in getStaticProps to have the page rendered on the server side.
But there is some data I can't fetch on the server side, because it requires a token that is stored in local storage.
The question is: if I use the useEffect hook to fetch that data on the client side, does this whole process still give any advantage for SEO?
Or do I have to change the structure and store the token in cookies, so that all the data can be fetched on the server side?
Update:
I want to check if the user is logged in and, based on the result, show the page with different styles. But no user-related data is going to be rendered.
Right now, my code looks like this:
export default function BookDetail(props) {
  const [isLoggedIn, setIsLoggedIn] = React.useState(false);

  React.useEffect(() => {
    // It captures the token from cookies
    const token = getCookie("token");
    // Then I need to confirm with the backend that the token is valid
    if (token) {
      setIsLoggedIn(true);
    }
  }, []);

  return (
    <div>
      {!isLoggedIn ? (
        <>
          {props.res.data.title}
          <br />
          {props.res.data.description}
        </>
      ) : (
        <>
          {props.res.data.title}
          <br />
          <button
            type="button"
            onClick={() => { window.location.href = "http://example.com"; }}
          />
        </>
      )}
    </div>
  );
}
If you need a token to fetch said data, that data is probably related to the user? In that case it doesn't need to be, and shouldn't be, considered for SEO.
If the data is not specific to the user, consider making it accessible without a token.
Edit based on the comments here:
Fetching data inside useEffect will absolutely affect SEO. You want to display part of a book (text) to users that are not logged in, and you check whether users are logged in with a request from useEffect; this is fine and standard.
If you want Google's crawlers to be able to read your book text, you cannot fetch it in useEffect. I suggest the following:
In your getStaticProps, fetch the data (the book text) and pass it to your page. Display this information by default.
Then, in your useEffect, check whether the user is logged in. If they are, remove the text and render a button instead.
This way, Google will read the page as you intend, while logged-in users will only see a button.
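For example, the server-side half of that could look roughly like this (the endpoint URL, the id param, and the response shape are assumptions matching the props.res.data usage in your snippet):

export async function getStaticProps(context) {
  // Public book data — fetched without a token so crawlers see it in the HTML
  const response = await fetch(`https://api.example.com/books/${context.params.id}`);
  const res = await response.json();

  return {
    props: { res },
  };
}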
You can check on the server side whether a user is logged in only when you use getServerSideProps. getStaticProps runs at build time, so there is no communication with whatever the user inputs into the UI, simply because that's a different time frame: the app is built on the server, and only once it is built can the user interact with it.
getServerSideProps, on the other hand, is not executed at build time, yet it still runs on the server side, and since useEffect is a frontend API it won't work there. So there are two ways:
If you use NextAuth, you can use getServerSideProps: the context object has a req property, and passing it to the getSession function (which you need to import) will tell you whether the user has a session or not. Here is an example code snippet:
import { getSession } from "next-auth/react";
// some code here like your frontend component
export const getServerSideProps = async (context) => {
const { req, res } = context;
const session = await getSession({ req: req });
if (!session) {
return {
redirect: { destination: "/", permanent: false },
};
}
const email = session.user.email;
return {
props: { email: email, session },
};
};
Here is more on the subject in the official Next.js docs:
https://nextjs.org/docs/authentication
If you don't use NextAuth, I am sure you can attach your own token to the request and read it from the context object in getServerSideProps, like in the example above, just without getSession, since that is a NextAuth API. I haven't done it myself, though.
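As a hedged sketch of the non-NextAuth route (the cookie name token is an assumption, and the validation against your backend is only indicated by a comment):

export const getServerSideProps = async (context) => {
  // Next.js parses cookies for you: context.req.cookies is a plain object
  const token = context.req.cookies.token;

  if (!token) {
    return {
      redirect: { destination: "/", permanent: false },
    };
  }

  // Here you would verify the token against your backend before trusting it
  return {
    props: { isLoggedIn: true },
  };
};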
I'm new to Next.js and I'm developing a Next.js website. I'm stuck on handling authentication for different routes and roles. How can I handle this in Next.js?
Frontend: Next.js
Backend: Node.js with JWT (JSON Web Token).
Please guide me on what I should use for authentication.
Thanks in advance.
I am assuming you have done a couple of things for the answer below to work:
- you are setting an HTTP-only authentication cookie (signing it, setting an expiry, etc.)
- on API requests, you are validating this cookie
You can create a middleware.ts / .js file at the root of your project, something like the following (note I was using TypeScript; you can just remove the types if using JavaScript):
// middleware.ts
import { NextResponse } from "next/server";
import type { NextRequest } from "next/server";

const protectedPages = ["/something", "/foo", "/profile"];

export function middleware(request: NextRequest) {
  if (protectedPages.includes(request.nextUrl.pathname)) {
    const token = request.cookies.get("YOUR_TOKEN_NAME");
    if (!token) {
      // No cookie: send the user back to the sign-in / home page or wherever
      const url = request.nextUrl.clone();
      url.pathname = "/home";
      return NextResponse.redirect(url);
    }
  }
}
You do not need to import this anywhere; just add the file, read the cookie, and you are done. No cookie with the name you gave it? Show them the door.
Take a read of the docs if you want to know more, they explain things better than me :) https://nextjs.org/docs/advanced-features/middleware
You can also combine this with the getServerSideProps suggestion above to pass data as a prop to the component.
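For example, a minimal sketch of that combination on a protected page (the cookie name matches the middleware snippet above; what you derive from the token and pass down is up to you):

// pages/profile.js (or any page the middleware protects)
export const getServerSideProps = async ({ req }) => {
  // The middleware has already redirected users without the cookie,
  // so here we just read it and pass something down as a prop
  const token = req.cookies["YOUR_TOKEN_NAME"];

  return {
    props: { hasToken: Boolean(token) },
  };
};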
I've got a really simple JSON flat-file db setup that works when running locally but doesn't work once it's hosted on Netlify. I don't get any error info besides a 500 error from the server. I get the error even if all I do is import the clusterDB object, so something is happening with the lowdb object. I've also tried another JSON db library called StormDB and I get the same issue.
Returning a static import of the JSON file from my API route (no db libraries) also works fine.
I'm new to Next.js and this seems related to the SSR side of things, since API routes run only on the server? Do I need to structure my files differently? Are these libraries not compatible? lowdb says it works with Node, and everything works locally for me.
Here is my db init file (root/db/db.js)
import { LowSync, JSONFileSync } from 'lowdb'

// Cluster DB setup — JSONFileSync is a synchronous adapter, so use LowSync
// to match (the async Low class would leave data unset until read() resolves)
const adapter = new JSONFileSync('cluster-db.json')
const clusterDB = new LowSync(adapter)

// Initialize if empty
clusterDB.read()
clusterDB.data ||= { clusters: [] }
clusterDB.write()

export { clusterDB }
And my only API route (root/pages/api/clusters.js)
import {clusterDB} from '../../db/db'
export default function handler(req, res) {
  // Re-read the file on every request so we serve fresh data
  clusterDB.read()

  switch (req.method) {
    case 'POST': {
      const newCluster = {
        severity: req.query.severity,
        comments: req.query.comments,
        date: req.query.date,
      }
      clusterDB.data.clusters.push(newCluster)
      clusterDB.write()
      res.status(200).json({ status: 'Success', cluster: newCluster })
      break
    }
    case 'GET':
      if (clusterDB.data.clusters) {
        res.status(200).json(clusterDB.data.clusters)
      } else {
        res.status(404).json({ status: '404' })
      }
      break
    default:
      // Only respond here for unhandled methods, otherwise we would send
      // a second response after the switch has already replied
      res.status(200).json({ test: 'yay' })
  }
}
I have a web app written in Nuxt that uses Firebase Hosting, Firestore, Authentication and Storage.
It's a simple blog layout with all the usual CRUD functions for its blog posts. It is loosely based on Quick Nuxt.js SSR prototyping with Firebase Cloud Functions and Nuxt.js Firebase Auth.
In the development environment it runs perfectly, but once deployed, Firestore specifically behaves unexpectedly.
After the project has been deployed I can CRUD documents and the changes show up as expected in the Firebase console's Firestore viewer, but when my site reads the data again it loads the old data. In other words, if I delete a document it disappears in the Firestore viewer, but when I refresh my Nuxt website it loads that document again even though it's no longer present in the Firebase console. I get the same result on different computers/devices, so it's not a local caching issue.
I noticed that changes in the Firestore viewer only show up on my website after I redeploy the project; nothing I change appears after a refresh, even though it has changed permanently in the Firestore viewer.
In development it works perfectly: I can manipulate the database, refresh, and the site loads exactly what is reflected in the Firestore viewer.
Sorry for repeating it so much but I’m having an existential crisis here, lol.
Below is a sample of my Nuxt store's index.js file, where all the data for the app is stored. It works perfectly at manipulating the data in Firestore, but once in production the website gets served the same data over and over.
import { firestore, storage } from '~/plugins/fireinit.js' // the part where `firebase.initializeApp` happens; `storage` is needed by deletePost() below
Declare my array state: posts.
export const state = () => ({
posts: []
})
Mutations for manipulating the posts array.
export const mutations = {
addP (state, payload) { // Runs once for each document in the collection on first load.
state.posts.push(payload);
},
delP (state, payload) { // Deletes a post from the posts state.
state.posts = state.posts.filter(p => {
return p.id != payload.id;
});
},
}
nuxtServerInit() runs on the server so that the data is server-side rendered when the website first loads.
export const actions = {
async nuxtServerInit({commit}, context) {
await firestore.collection('posts').get().then((querySnapshot) => {
querySnapshot.forEach(function(doc) {
var obj = doc.data()
obj.id = doc.id;
commit('posts/addP', obj)
})
})
},
The deletePost() action deletes a file in Firebase Storage, then deletes the document in Firestore, and finally removes the item from the posts state.
deletePost ({commit}, payload) {
storage.ref().child(payload.fullPath).delete().then(function() {
firestore.collection('posts').doc(payload.id).delete().then(()=>{
commit('delP', payload);
})
.catch((error)=>{
console.error(error);
});
})
}
}
This is what my Firestore Rules look like
rules_version = '2';
service cloud.firestore {
match /databases/{database}/documents {
match /{document=**} {
allow read;
allow write: if request.auth != null;
}
}
}
What am I doing wrong :/
So after losing some hair I finally figured it out!
In Nuxt you have two options for deploying your project: nuxt build or nuxt generate.
The generate option reads the database at build time and builds your static files from that Firestore snapshot, which is then deployed. This is why, when I reloaded my page, it still had all the old entries from the DB.
After switching to the build option and deploying that instead, it all works perfectly.
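For reference, the two deploy paths look roughly like this as package.json scripts (the names are just the defaults): with generate, Firestore is read once at build time and frozen into static HTML; with build plus start, nuxtServerInit runs on every request, so Firestore is read on each page load.

{
  "scripts": {
    "build": "nuxt build",
    "start": "nuxt start",
    "generate": "nuxt generate"
  }
}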
I'm using workbox-webpack-plugin to register a service worker.
My frontend app is a React-Redux app configured with webpack. If you visit the app URL, you always see the login view first.
My plugin config inside webpack.config.js:
new InjectManifest({
swSrc: path.join('src', 'service-worker.js')
})
Service worker:
workbox.skipWaiting();
workbox.clientsClaim();
workbox.precaching.precacheAndRoute(self.__precacheManifest);
My service worker caches all my code-split routes. But that doesn't matter: even if they are all cached, a user without a connection who visits my app cannot log in. That's why I need a way to check if the user is in offline mode and, instead of returning the login view, return an 'offline.html' page.
I found out that my env.config.js file (which contains API URLs and is requested on the login page) is not cached, so I thought it would be easy to catch the error when this file fails to load. So I added the following to my service worker:
// Assuming a network-first strategy instance defined elsewhere, e.g.:
// const networkFirstHandler = workbox.strategies.networkFirst();
workbox.routing.registerRoute(
  new RegExp('/env.config.js'),
  ({ event }) => {
    return networkFirstHandler.handle({ event })
      .catch(() => caches.match('/offline.html'));
  }
);
But it doesn't show offline.html in the browser; it seems like the contents of 'offline.html' are simply returned in place of the 'env.config.js' file.
How can I accomplish this? I'm new to Workbox and it would be great to see some suggestions.
importScripts("/precache-manifest.81b400bbc7dc89de30f4854961b64d1d.js", "https://storage.googleapis.com/workbox-cdn/releases/3.4.1/workbox-sw.js");
workbox.skipWaiting();
workbox.clientsClaim();
const STATIC_FILES = [
'/env.config.js',
];
self.__precacheManifest = STATIC_FILES.concat(self.__precacheManifest || []);
workbox.precaching.precacheAndRoute(self.__precacheManifest);
Update: since I decided to cache the env.config.js file, I now only get an API error when using the app offline. Maybe this API call (which fails because there is no connection) is a good trigger to display the offline page? I think it is, but I still don't know how.
When I try something like this:
workbox.routing.registerRoute(
new RegExp(API_REGEX_GOES_HERE),
({event}) => {
return networkFirstHandler.handle({event})
.catch(() => caches.match('/offline.html'));
}
);
The contents of "offline.html" are returned in place of the API response, so it will not be displayed as a page...
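One hedged way around that, sticking with the Workbox 3 routing API from the snippets above: only fall back to offline.html for navigation requests (full page loads), so failed API or JS requests just fail normally while the page itself gets swapped. A rough sketch:

workbox.routing.registerRoute(
  // Match full page navigations only, not JS/API sub-requests
  ({ event }) => event.request.mode === 'navigate',
  ({ event }) =>
    // Try the network first; if it fails, serve the precached offline page
    fetch(event.request).catch(() => caches.match('/offline.html'))
);

// Note: '/offline.html' must be in the precache manifest (or another cache)
// for caches.match to find it.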