How to add a UUID to the logger in a better way - JavaScript

// ----------- program1.ts
consumerGroup.on('message', function (msgc) {
  const id = uuid();
  program2.func2(id, msgc);
});

// ----------- program2.ts
function func2(uuid, msgc) {
  logger.log(uuid, msgc);
  program3.func3(uuid, msgc);
}

// ----------- program3.ts
function func3(uuid, msgc) {
  logger.log(uuid, msgc);
}
I have to add a UUID to the logger output for each message read from the consumer in an existing project. I came up with an approach similar to the code above, but there are lots of sub-functions and I would have to pass the UUID through every function and logger call.
Is there a better way to do this?

If you use Node.js 14 or above, you can use the built-in AsyncLocalStorage. Logging with per-message IDs is a typical use case for this API. Here's an example:
const { AsyncLocalStorage } = require('async_hooks');
const { v4: uuid } = require('uuid');

const asyncLocalStorage = new AsyncLocalStorage();

function logWithId(msg) {
  const id = asyncLocalStorage.getStore();
  logger.log(id, msg);
}

// ------ program1.ts
consumerGroup.on('message', function (msgc) {
  // Run the handler inside a new store so each message gets its own UUID.
  // The id is then visible to everything called from here, with no need to pass it around.
  asyncLocalStorage.run(uuid(), () => {
    program2.func2(msgc);
  });
});

// ------ program2.ts
function func2(msgc) {
  logWithId(msgc);
  program3.func3(msgc);
}

// ------ program3.ts
function func3(msgc) {
  logWithId(msgc);
}
Read the docs to understand more about it.
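For reference, here is a minimal self-contained sketch of the same idea (standalone names, no Kafka consumer; crypto.randomUUID needs Node.js 14.17+), showing that the store set by run() stays visible across nested async calls:

const { AsyncLocalStorage } = require('async_hooks');
const crypto = require('crypto');

const als = new AsyncLocalStorage();

function logWithId(msg) {
  // the store holds whatever value was passed to run() for this message
  console.log(als.getStore(), msg);
}

async function inner() {
  await new Promise(resolve => setTimeout(resolve, 10)); // cross an async boundary
  logWithId('still the same id after await');
}

function handleMessage(msg) {
  als.run(crypto.randomUUID(), () => {
    logWithId(`handling ${msg}`);
    inner();
  });
}

handleMessage('first');
handleMessage('second'); // logs a different id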

node.js using nested functions from different files

I want to write understandable code in Node.js, so I put some frequently used functions into separate files and access them from there.
So I have a function that calls a function from another file, and in that other file another function is called as well.
Important to know: if I put everything in one file, the code works, so it should be an issue with module exports and using functions from another file.
I have one file, getting quotes from a decentralised exchange. Looking like this (quoter_uni_v2.js):
module.exports = function quotes_uni_v2(tokenIn, tokenOut, amountIn, router) {
  const quotedAmountOut = router.getAmountsOut(amountIn.toString(), [
    tokenIn,
    tokenOut,
  ]);
  return quotedAmountOut;
};
And I am importing this function in my second helper file (quotes_5.js). (It is split into two files because in the second one I have to call the function multiple times):
var quotes_uni_v2 = require("./quotes_uni_v2");
module.exports = async function (router1, router2, route, amount_wei) {
  console.log(route);
  var amount_Out = await quotes_uni_v2.quotes_uni_v2(
    route[1],
    route[2],
    amount_wei,
    router1
  );
  ...
  return (
    Math.round(ethers.utils.formatEther(amount_Out[1].toString()) * 100) / 100
  );
};
After that I try to call the function in my main.js:
const quotes_uni_v2 = require("./quotes_uni_v2");
const quotes_5 = require("./quotes_5");

async function calc(route) {
  amountOut = await new quotes_5(
    quickswap_router,
    sushiswap_router,
    route,
    amount_wei
  );
  return amountOut;
}
But calling the quotes function does not work... The error is:
TypeError: quotes_5 is not a constructor...
Can someone help me?
Thanks!
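As a side note, here is a minimal sketch of how the imports and calls would likely need to look, assuming the goal is simply to call the exported functions: quotes_uni_v2 exports the function itself (so it is not a property of the module), and quotes_5 exports a plain async function (so it cannot be called with new). Variable names are reused from the question:

// quotes_5.js (sketch) - the module exports the function itself,
// so call quotes_uni_v2(...) directly, not quotes_uni_v2.quotes_uni_v2(...)
var quotes_uni_v2 = require("./quotes_uni_v2");
module.exports = async function (router1, router2, route, amount_wei) {
  var amount_Out = await quotes_uni_v2(route[1], route[2], amount_wei, router1);
  return Math.round(ethers.utils.formatEther(amount_Out[1].toString()) * 100) / 100;
};

// main.js (sketch) - quotes_5 exports an async function, not a class,
// so it is called without `new`
const quotes_5 = require("./quotes_5");
async function calc(route) {
  const amountOut = await quotes_5(quickswap_router, sushiswap_router, route, amount_wei);
  return amountOut;
}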

Having issues creating associations for create sequelize queries

I am having issues creating associations in my code on Sequelize 6. I will try to be as clear as I can, but since it's a very large project for my company I can't share all the code. Basically, here is what is happening.
I am trying to learn the proper way to associate my models so that when I need to use them in other files, the associations are already set up.
As it stands right now, this is a small snippet of some reduced example code:
const createA = (sequelize, payload) => {
  sequelize.models.A.create(payload, {
    include: [
      sequelize.models.B,
      // ....
    ],
  });
};
Model B returns the error "Cannot read properties of undefined (reading 'name')". If I remove the include, it works fine (although the associated data is not created). Get queries are OK; create/save are the issue.
So I looked into the different ways of saving associations and found multiple approaches all over Stack Overflow spanning the course of about 5 years (a lot of these solutions are deprecated or no longer relevant).
One way is storing everything on a db.sequelize export, so the models get their own object instead of living under the models property.
Another way is saving each model returned from the define method into an object, so imagine this:
const a = A.hasOne(B, { foreignKey: 'bId' });
a.b = B.belongsTo(A);
but this is an issue with very large and complex associations (I tried this and it gets hairy; it works, but it is very unstable).
I even tried this example with no luck => https://github.com/sequelize/express-example/ (I will show this code below).
I also tried giving each model class an associate static function and associating everything during initialization (I can't find the example at the moment but I have seen it around).
Now our code is not exactly the same as this and I can't post all of it, but the format is basically the same. Here is what we have in a file:
const factory = () => {
  let instance = null;
  try {
    instance = createInstance(); // <=========== this sets up the instance and returns it via new Sequelize();
    defineModelsAndAssociations(instance); // <==== here is my entry point for all the different ways I tried defining and associating models
    return instance;
  } catch (error) { /* ... */ }
};
module.exports = factory();
Here is how I tried the express-example approach:
const getModels = () => {
  return [
    require('../../models/definitions/A'),
    require('../../models/definitions/B'),
    // Add more models here...
  ];
};

const setUpModelsWithInstance = (instance, modelDefiners) => {
  for (const modelDefiner of modelDefiners) {
    modelDefiner(instance);
  }
};

const associateModels = instance => {
  const {
    A,
    B,
  } = instance.models;
  A.hasOne(B, { foreignKey: 'bId' });
  B.belongsTo(A);
};

const defineModelsAndAssociations = instance => {
  const modelDefiners = getModels();
  setUpModelsWithInstance(instance, modelDefiners);
  associateModels(instance);
};

module.exports = {
  defineModelsAndAssociations,
};
But even with this and every other attempt, when I try the createA method/query I listed above, it still fails the same way.
Would anyone have any idea? Thanks!
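For what it's worth, here is a minimal sketch of the "associate static function" approach mentioned above, assuming each definition file exports a definer that also attaches an associate method (model and field names are illustrative, not taken from the project):

// models/definitions/A.js (sketch)
const { DataTypes } = require('sequelize');

module.exports = sequelize => {
  const A = sequelize.define('A', {
    name: DataTypes.STRING,
  });

  // called once, after every model has been defined on the same instance
  A.associate = models => {
    A.hasOne(models.B, { foreignKey: 'bId' });
  };

  return A;
};

// during initialization, after setUpModelsWithInstance(instance, ...):
Object.values(instance.models)
  .filter(model => typeof model.associate === 'function')
  .forEach(model => model.associate(instance.models));

// with the association registered, a nested create can reference it:
// instance.models.A.create(payload, { include: [{ model: instance.models.B }] });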

SharedWorker in React

I am using web workers in my React application with a helper class that looks like this:
export class WebWorker {
  constructor(worker) {
    const code = worker.toString();
    const blob = new Blob(["(" + code + ")()"]);
    return new Worker(URL.createObjectURL(blob));
  }
}
The worker looks something like:
export default () => {
  self.addEventListener('message', function (e) {
    switch (e.data.event) {
      case 'start':
        // Start background task
        break;
      case 'stop':
        // Stop background task
        break;
    }
  }, false);
};
Then I am able to create the worker using:
let sessionWorker = new WebWorker(SessionWorker);
sessionWorker.postMessage({ event: "start" });
This works fine, however I now need to use a SharedWorker and I am having trouble getting it to work. All of the resources I've found show regular web workers. There is this SO Question Using Shared Worker in React but it doesn't work for me. It actually looks identical to my regular WebWorker code, but this doesn't work because SharedWorkers require that you implement an onconnect function, and I don't understand how to do that. Non-react examples that I've found show the worker script as:
onconnect = function (e) {
  var port = e.ports[0];
  port.onmessage = function (e) {
    var workerResult = 'Result: ' + (e.data[0] * e.data[1]);
    port.postMessage(workerResult);
  };
};
but if I put that in my worker js file and follow the same pattern using the WebWorker helper, the worker.toString() just returns [object Window] and it never gets executed. I don't understand how to create an 'onconnect' function that will get called. I've tried variations like this:
import React from 'react';

self.onconnect = (e) => {
  var port = e.ports[0];
  console.log("test log");
  port.onmessage = function (e) {
    console.log("Message received ", e);
    port.postMessage("test");
  };
};

export default self;
Ultimately nothing works for me. Clearly I do not understand exports at all in JavaScript. If I just export a function called onconnect it never gets called; clearly onconnect has to belong to some sort of class-like context (like Window or Self), but I don't understand what is needed.
Thank you,
Troy.
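One possible direction, sketched under the assumption that the same blob-based helper is kept: export a function (so worker.toString() captures its body, instead of exporting self), and have that function assign onconnect on the shared worker's global scope when the blob runs. SharedWorkerHelper is an illustrative name; messages then go through the worker's port rather than the worker itself:

// sessionWorker.js (sketch) - exported as a function, like the existing WebWorker helper expects
export default () => {
  // inside a shared worker, `self` is the SharedWorkerGlobalScope
  self.onconnect = (e) => {
    const port = e.ports[0];
    port.onmessage = (event) => {
      switch (event.data.event) {
        case 'start':
          // start background task
          break;
        case 'stop':
          // stop background task
          break;
      }
    };
    port.start();
  };
};

// SharedWorkerHelper.js (sketch) - same idea as WebWorker, but builds a SharedWorker
export class SharedWorkerHelper {
  constructor(worker) {
    const code = worker.toString();
    const blob = new Blob(["(" + code + ")()"], { type: 'application/javascript' });
    return new SharedWorker(URL.createObjectURL(blob));
  }
}

// usage: communicate through the port
// const sessionWorker = new SharedWorkerHelper(SessionWorker);
// sessionWorker.port.start();
// sessionWorker.port.postMessage({ event: 'start' });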

React / ES6 - Efficiently update property of an object in an array

I am looking for the most efficient way to update a property of an object in an array using modern JavaScript. I am currently doing the following, but it is way too slow, so I'm looking for an approach that will speed things up. Also, to put this in context, this code is used in a Redux Saga in a React app and is called on every keystroke* a user makes when writing code in an editor.
*OK, not EVERY keystroke. I do have debounce and throttling implemented; I just wanted to focus on the update, but I appreciate everyone catching this :)
function* updateCode({ payload: { code, selectedFile } }) {
  try {
    const tempFiles = stateFiles.filter(file => file.id !== selectedFile.id);
    const updatedFile = {
      ...selectedFile,
      content: code,
    };
    const newFiles = [...tempFiles, updatedFile];
  } catch (error) {}
}
The above works but is too slow.
I have also tried using splice, but I get an "Invariant Violation: A state mutation" error:
const index = stateFiles.findIndex(file => file.id === selectedFile.id);
const newFiles = Array.from(stateFiles.splice(index, 1, { ...selectedFile, content: code }));
You can use Array.prototype.map in order to construct your new array:
const newFiles = stateFiles.map(file => {
  if (file.id !== selectedFile.id) {
    return file;
  }
  return {
    ...selectedFile,
    content: code,
  };
});
Also, please consider using debouncing in order not to run your code on every keystroke.
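If mapping over the whole array is still a concern, a non-mutating variant of the findIndex approach from the question copies the array once and replaces only the matching entry (a sketch; it assumes the file is always found):

const index = stateFiles.findIndex(file => file.id === selectedFile.id);

// copy the array instead of splicing it in place, so the Redux state is not mutated
const newFiles = [...stateFiles];
newFiles[index] = { ...selectedFile, content: code };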

Using a cursor with pg-promise

I'm struggling to find an example of using a cursor with pg-promise. node-postgres supports its pg-cursor extension. Is there a way to use that extension with pg-promise? I'm attempting to implement an asynchronous generator (to support for-await-of). pg-query-stream doesn't seem to be appropriate for this use case (I need "pull", rather than "push").
As an example, I use SQLite for my unit tests and my (abridged) generator looks something like this...
async function* () {
  const stmt = await db.prepare(...);
  try {
    while (true) {
      const record = await stmt.get();
      if (isUndefined(record)) {
        break;
      }
      yield record;
    }
  } finally {
    stmt.finalize();
  }
}
Using pg-cursor, the assignment to stmt would become something like client.query(new Cursor(...)), stmt.get would become stmt.read(1) and stmt.finalize would become stmt.close.
Thanks
Following the original examples, we can modify them for use with pg-promise:
const pgp = require('pg-promise')(/* initialization options */);
const db = pgp(/* connection details */);
const Cursor = require('pg-cursor');

const c = await db.connect(); // manually managed connection

const text = 'SELECT * FROM my_large_table WHERE something > $1';
const values = [10];
const cursor = c.client.query(new Cursor(text, values));

cursor.read(100, (err, rows) => {
  cursor.close(() => {
    c.done(); // releasing connection
  });
  // or you can just do: cursor.close(c.done);
});
Since pg-promise doesn't support pg-cursor explicitly, one has to manually acquire the connection object and use it directly, as shown in the example above.
pg-query-stream doesn't seem to be appropriate for this use case (I need pull, rather than push).
Actually, in the context of these libraries, both streams and cursors are only for pulling data. So it would be ok for you to use streaming also.
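If you want to keep the pull-based async-generator shape from the question, the same manually managed connection and cursor can be wrapped like this (a sketch; queryCursor and fetchSize are illustrative names, and the callback-style read() is promisified by hand):

const pgp = require('pg-promise')(/* initialization options */);
const Cursor = require('pg-cursor');

async function* queryCursor(db, text, values, fetchSize = 100) {
  const c = await db.connect(); // manually managed connection
  const cursor = c.client.query(new Cursor(text, values));
  try {
    while (true) {
      // wrap the callback-style read() so it can be awaited
      const rows = await new Promise((resolve, reject) =>
        cursor.read(fetchSize, (err, r) => (err ? reject(err) : resolve(r)))
      );
      if (rows.length === 0) break; // cursor exhausted
      yield* rows;
    }
  } finally {
    await new Promise(resolve => cursor.close(resolve));
    c.done(); // release the connection
  }
}

// usage:
// for await (const row of queryCursor(db, 'SELECT * FROM my_large_table WHERE something > $1', [10])) { ... }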
UPDATE
For reading data in a simple and safe way, check out pg-iterator.
