I want to define a global array that I can set in getServerSideProps and use throughout the project. Is this possible in Next.js?
I'm going to use this array as a cache
You can use a global variable on the server (getServerSideProps), and you can also use a global variable on the client (i.e. the browser), but you can not use the same global variable on the server and the client.
Server only
If you only ever want to access that variable inside getServerSideProps, then that is theoretically possible, but it will likely cause all sorts of problems. E.g. consider a load balancer in front of 3 different server instances: you would end up with 3 different caches.
It would be better to use some established caching technology, like a Redis database.
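For instance, a minimal sketch of such a shared cache using the ioredis client (the key names and the 60-second TTL are assumptions, not part of the original answer):
import Redis from 'ioredis';

// one shared Redis connection; all server instances behind the
// load balancer talk to the same cache
const redis = new Redis(); // defaults to localhost:6379

export async function getCached(key) {
  const value = await redis.get(key);
  return value ? JSON.parse(value) : null;
}

export async function setCached(key, data) {
  // 'EX', 60 expires the entry after 60 seconds
  await redis.set(key, JSON.stringify(data), 'EX', 60);
}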
Shared values
As said, you can not share a variable between server and client.
You might (as a mental model) consider getServerSideProps to be executed in a different country, on some secured server which you don't have access to, while the components (at least partly) are executed on your computer in your browser.
So if you want to share some state between client and server, you need to create an API on the server, and communicate between client and server through this API.
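For example, a minimal sketch of such an API as a Next.js API route (the file name, the serverCache variable, and the payload shape are all assumptions for illustration):
// pages/api/cache.js -- a hypothetical API route exposing server-side state
const serverCache = { trace: [] }; // lives only on the server

export default function handler(req, res) {
  if (req.method === 'POST') {
    // the client sends a value; the server stores it in its own variable
    serverCache.trace.push(req.body.entry);
  }
  // the client reads the server-side value through this response
  res.status(200).json({ trace: serverCache.trace });
}
The client can then call fetch('/api/cache') to read the server's copy, or POST to it to change it; the data crosses the boundary only through this API.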
If you just define a global array, that array will be created and can be used, but it will be created independently on the server and on the client, i.e. there will be two completely different variables.
my-app/global.js:
export const globalVariable = {
trace: [],
};
Then you access this variable inside index.jsx:
my-app/pages/index.jsx:
import { useEffect } from 'react';
import { globalVariable } from '../global';

const Home = (props) => {
  console.log('Client: globalVariable', globalVariable);
  console.log('Client: pageProps:', props);
  useEffect(() => {
    globalVariable.trace.push('from MyApp');
  }, []);
  return null;
};

export default Home;

export async function getServerSideProps() {
  globalVariable.trace.push('from getServerSideProps');
  return {
    props: {
      serverVariable: globalVariable,
    },
  };
}
Then you will have one globalVariable on the client, and a separate globalVariable on the server.
You will never see "from getServerSideProps" on the client, you will never see "from MyApp" on the server.
You can pass globalVariable from the server as props, like I did with serverVariable: globalVariable,
and that value will be available on the client, but it will be a third new variable on the client side.
You can not expect props.serverVariable.trace.push('pushed from client to server') to reach the server; that will only push to the new client-side variable.
Related
I am working on a simple todo app with Node.js and Express, and I want to manipulate some resource in memory instead of connecting to a database.
I have a local JSON file todo.json with a predefined data set, and I want to use that as a starting point, with the CRUD operations built on top of it.
So I have a function initializeTodos and a function getTodos:
import { readFile } from 'fs/promises'
const initializeTodos = async () =>
JSON.parse(
await readFile(process.cwd() + '/src/resources/todo/todo.json', 'utf-8')
)
export const getTodos = async () => {
return initializeTodos()
}
Then in each route handler I call getTodos to get the todo list and perform CRUD operations on it. But now the issue is: every time I call getTodos, it in turn calls initializeTodos, which re-reads the static JSON file. That means any changes I make after getTodos are not kept in memory; they get reset every time I call getTodos.
I guess I could write back to disk for each CRUD operation, but I really wanted to keep it simple here and just do it in memory. Is there a way I can achieve that?
But now the issue is, every time I call getTodos it in turn calls initializeTodos
Then don't call initializeTodos
You should load the file once at the start of your app and assign the data to a global variable that is shared throughout your application. That will be your 'database', all in memory.
The updates and reads will then go to the same place, so you will see updated results every time, i.e. the routes will read from and write to that global variable (see the sketch below).
Once you have this working, refactor the global variable out into its own class, call it ToDoInMemoryDb, and hide all access behind it to keep things clean; global shared vars can lead you to learn bad habits.
On app shutdown you can persist the latest value of the variable back to disk, so the next time you start you still have all the edits.
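A minimal sketch of that idea (the module and function names are assumptions, not from the original question):
// todoStore.js -- load the file once, keep the data in memory
import { readFile, writeFile } from 'fs/promises'

const FILE = process.cwd() + '/src/resources/todo/todo.json'

let todos = null // the in-memory 'database'

export const initializeTodos = async () => {
  todos = JSON.parse(await readFile(FILE, 'utf-8'))
}

export const getTodos = () => todos // always returns the same shared array

export const persistTodos = async () => {
  // call this on shutdown to write the edits back to disk
  await writeFile(FILE, JSON.stringify(todos, null, 2))
}
Call initializeTodos() once at startup, before registering the routes; every route handler that calls getTodos() then sees the same shared array.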
How do you access a session in an endpoint in SvelteKit? I've tried this, but no luck:
import { get } from 'svelte/store';
import { getStores } from "$app/stores";

function getUser() { // <- call this at component initialization
  const { session } = getStores();
  return {
    current: () => get(session).user
  };
}
The session store is populated in src/hooks.js; the normal flow to do so is:
in handle, add some data to event.locals.
in getSession, use event.locals to create a session object.
This session object is available in the client as the session store, and during SSR if you use the load functions, but it is not available in endpoints.
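That flow might look like this in src/hooks.js (the cookie parsing and the user field are assumptions for illustration; parseUserFromCookie is a hypothetical helper):
// src/hooks.js -- sketch of the handle/getSession flow described above
export async function handle({ event, resolve }) {
  // hypothetical: derive a user from a cookie and stash it in locals
  const cookie = event.request.headers.get('cookie');
  event.locals.user = parseUserFromCookie(cookie);
  return resolve(event);
}

export function getSession(event) {
  // only what is returned here becomes the client-side session store
  return { user: event.locals.user };
}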
What is available in the endpoint, though, is the locals variable, which is originally passed to the getSession function, so you can read that one.
export async function get({ locals }) {
  // locals contains whatever handle() put there (e.g. locals.user)
  // code goes here
}
Just be aware that this means there is no synchronization between locals and the client-side session: if you add something to the session store on the client, it will not be available in the endpoint. To handle this you would have to, for example, write the new data to a cookie and parse it in the handle function.
The session store only works inside Svelte components (it uses context under the hood); this provides isolation between users.
You can import getSession from src/hooks.js and pass it the event, to reuse the logic that extracts session data from the request.
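A sketch of that reuse inside an endpoint (the file path and the user field are assumptions):
// src/routes/whoami.js -- hypothetical endpoint reusing the session logic
import { getSession } from '../hooks';

export async function get(event) {
  const session = getSession(event);
  return { body: { user: session.user } };
}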
I have just stumbled upon Immutable JS, and it looks like a very interesting library for reducing the possibility of bugs due to programmer error/accidental mutation, as well as for the performance optimisations it offers. However, I am struggling to understand how I can keep track of state within a module.
For example, if I had a socket.io server running that supported multiple streams, I would usually have two variables in the global context of that module to track connected clients and current available streams:
var clients = []
var streams = []
If a user was to connect, I could simply use .push inside socket.io's io.on("connection") event listener, and I could rest assured that my client state would now contain the newly joined socket.
In Immutable JS, I have an object global to the module which now looks like:
var state = Immutable.Map({
  clients: Immutable.List.of(),
  streams: Immutable.List.of()
})
Inside socket.io's connection handler, how can I update the global state? I believe Immutable JS works like the following, so maintaining application state doesn't even seem possible (because of the way I am currently thinking about it):
// Define the Immutable array, this remains constant throughout the application
var state = Immutable.Map({
  clients: Immutable.List.of(),
  streams: Immutable.List.of()
})

io.on("connection", (socket) => {
  console.log(state.clients)
  // I would like to update the state of clients here, but I believe that
  // I am only able to make a local copy within the context of the current
  // scope; I would then lose access to this on the next socket joining?
  var clientsArray = state.clients
  clientsArray.push(socket)
  state.set("clients", clientsArray)
  console.log(state.clients)
})
From my understanding, I believe that the console.log statements, on two clients connecting, would result in the following output:
// First client connects
[]
[ { socket object } ]
// Second client connects
[]
[ { socket object } ]
Is it possible for me to update the object so that I would get
[ { socket object }, { socket object } ]
Or am I going to need to stick to using global mutable state? The only reason I ask is that when I have used React in the past, you are able to update component state in a method and then use that new state elsewhere in the component.
Your code is missing a simple assignment. As you are using Immutable, any update operation, like set, results in a brand new object being created. In your case, state.set("clients", clientsArray) doesn't change the global state, but returns a new instance with the modified clients List. (Also note that with an Immutable.Map you read a value with state.get("clients"), not state.clients.)
To fix this, you need to simply update the global state with the result of the call, like so -
state = state.set("clients", clientsArray);
Or you could rewrite this all in one shot -
state = state.set("clients", state.get("clients").push(socket));
Hope this helps!
As a rule of thumb, remember that you always need an assignment whenever you invoke a method that changes/mutates an immutable object.
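Applied to the connection handler from the question, a minimal sketch of the fix could look like this (using update to read and push in one step):
io.on("connection", (socket) => {
  console.log(state.get("clients"))
  // reassign: update() returns a brand new Map containing the new List
  state = state.update("clients", clients => clients.push(socket))
  console.log(state.get("clients"))
})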
When I call a remote function in Electron, the results are always littered with getters and setters. I think I understand why this is, but I'd like to be able to get simple objects.
My current solution is this:
import { remote } from 'electron'

// 'bridge' is a little script I created for talking to a Python process
// over stdin/stdout, i.e. pipes.
const bridge = remote.require('bridge')

bridge.on('fileTreeUpdate', (data) => {
  myDataStore.update(JSON.parse(JSON.stringify(data.tree)))
})
Is there a more elegant way to do this?
As mentioned in the comments, I would suggest requiring 'bridge' directly in the Renderer; this way you don't need to pass its data between the Main and Renderer processes at all.
Having said that, if another requirement made it necessary to load the 'bridge' module in the Main process only, while still accessing its data from the Renderer, I'd suggest using Electron's ipc modules.
For instance, in your main process you could do something like this (assuming the variable win is a reference to the window of the Renderer process you wish to communicate with):
import bridge from 'bridge'

bridge.on('fileTreeUpdate', data => {
  // assuming win {BrowserWindow} has already been initialized
  win.webContents.send('bridge:fileTreeUpdate', data)
})
And in your Renderer processes (associated with win):
import { ipcRenderer } from 'electron'

ipcRenderer.on('bridge:fileTreeUpdate', (event, data) => {
  myDataStore.update(data.tree)
})
I just started to learn Node.js with Postgres and found the pg-promise package.
I read the docs and examples, but I don't understand where I should put the initialization code. I'm using Express and I have many routes.
Do I have to put the whole initialization (including the pg-monitor init) into every single file where I want to query the db, or do I need to include and initialize/configure it only in server.js?
If I initialize it only in server.js, what should I include in the other files where I need a db query?
In other words, it's not clear to me whether pg-promise and pg-monitor configuration/initialization is a global or a local action.
It's also unclear whether I need to create a db variable and end pgp for every single query:
var db = pgp(connection);
db.query(...).then(...).catch(...).finally(pgp.end);
You need to initialize the database connection only once. If it is to be shared between modules, then put it into its own module file, like this:
const initOptions = {
  // initialization options;
};

const pgp = require('pg-promise')(initOptions);

const cn = 'postgres://username:password@host:port/database';
const db = pgp(cn);

module.exports = {
  pgp, db
};
See supported Initialization Options.
UPDATE-1
And if you try creating more than one database object with the same connection details, the library will output a warning into the console:
WARNING: Creating a duplicate database object for the same connection.
at Object.<anonymous> (D:\NodeJS\tests\test2.js:14:6)
This points out that your database usage pattern is bad, i.e. you should share the database object, as shown above, not re-create it all over again. And since version 6.x it became critical, with each database object maintaining its own connection pool, so duplicating those will additionally result in poor connection usage.
Also, it is not necessary to export pgp (the initialized library instance). Instead, you can just do:
module.exports = db;
And if in some module you need to use the library's root, you can access it via property $config:
const db = require('../db'); // your db module
const pgp = db.$config.pgp; // the library's root after initialization
UPDATE-2
Some developers have been reporting (issue #175) that certain frameworks, like NextJS, manage to load modules in a way that breaks the singleton pattern, which results in the database module being loaded more than once and producing the duplicate-database warning, even though from the NodeJS point of view it should just work.
Below is a work-around for such integration issues, forcing the singleton into the global scope by using a Symbol. Let's create a reusable helper for creating singletons...
// generic singleton creator:
export function createSingleton<T>(name: string, create: () => T): T {
  const s = Symbol.for(name);
  let scope = (global as any)[s];
  if (!scope) {
    scope = {...create()};
    (global as any)[s] = scope;
  }
  return scope;
}
Using the helper above, you can modify your TypeScript database file into this:
import * as pgLib from 'pg-promise';

const pgp = pgLib(/* initialization options */);

interface IDatabaseScope {
  db: pgLib.IDatabase<any>;
  pgp: pgLib.IMain;
}

export function getDB(): IDatabaseScope {
  return createSingleton<IDatabaseScope>('my-app-db-space', () => {
    return {
      db: pgp('my-connect-string'),
      pgp
    };
  });
}
Then, in the beginning of any file that uses the database you can do this:
import {getDB} from './db';
const {db, pgp} = getDB();
This will ensure a persistent singleton pattern.
A "connection" in pgp is actually an auto-managed pool of multiple connections. Each time you make a request, a connection will be grabbed from the pool, opened up, used, then closed and returned to the pool. That's a big part of why vitaly-t makes such a big deal about only creating one instance of pgp for your whole app. The only reason to end your connection is if you are definitely done using the database, i.e. you are gracefully shutting down your app.