Insert 2 records into a JSON file - JavaScript

I'm using writeFile to write the account after it is created:
async writeFile(path, content) {
  fs.writeFile(path, content, (err) => {
    if (err) {
      console.log(err);
    } else {
      console.log(fs.readFileSync(path, "utf8"));
    }
  });
}
export async function insertAccountToFile(path, emailAcc, orgName) {
  let data = `{
    "LoginSuccessfully": {
      "emailAddress": "${emailAcc}",
      "password": "${globalConstants.password}",
      "org": "${orgName}",
      "LoginStatus": "Successfully"
    }
  }`;
  fileHelper.writeFile(path, data);
}
I can insert into the file normally, but when I insert again, the old account is overwritten by the new one. I'm not sure what I need to change so that both the old and the new account remain in the file.

You need to use appendFile instead of writeFile. Also, you need to promisify this method to use async/await syntax, or use appendFileSync. Otherwise, there will be desynchronization issues.
const fs = require("fs");
const { promisify } = require("util");

const appendFilePromisified = promisify(fs.appendFile);

async function writeFile(path, content) {
  await appendFilePromisified(path, content);
}
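On recent Node versions you can also skip the manual promisify step and use the built-in promise-based fs API. A minimal sketch, where the newline separator and the account argument are assumptions for illustration:

const fs = require("fs/promises");

// Appends one JSON snippet per call; the caller is expected to await it.
async function appendAccount(path, account) {
  await fs.appendFile(path, JSON.stringify(account) + "\n");
}

Note that appending JSON snippets one after another does not produce a single valid JSON document. If the file must stay parseable as JSON, a common alternative is to read the file, push the new account into an array, and write the whole array back.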

Related

Having problems with `fs.writeFile`: it doesn't create files

I'm trying to write a script that creates a model file in JSON using fs.writeFile. The problem is when I run the script using node file.js: it is supposed to create a new file face-expression-model.json in the /models directory, but it doesn't create anything and doesn't show any errors.
I tried to use another library, fs-extra, which didn't work either. I tried to make the script create the model directory with fs.WriteDir, which also didn't work. I tried adding process.cwd() to bypass any authorization issues when creating the file, but that didn't work. I also tried adding a try/catch block to catch all errors, but it doesn't show any errors; at first it looked like the file had been created, but unfortunately it wasn't.
Here is the code I'm using.
const axios = require("axios");
const faceapi = require("face-api.js");
const { FaceExpressions } = faceapi.nets;
const fs = require("fs");

async function trainModel(imageUrls) {
  try {
    await FaceExpressions.loadFromUri(process.cwd() + "/models");
    const imageTensors = [];
    for (let i = 0; i < imageUrls.length; i++) {
      const response = await axios.get(imageUrls[i], {
        responseType: "arraybuffer"
      });
      const image = new faceapi.Image();
      image.constructor.loadFromBytes(new Uint8Array(response.data));
      const imageTensor = faceapi.resizeImageToBase64Tensor(image);
      imageTensors.push(imageTensor);
    }
    const model = await faceapi.trainFaceExpressions(imageTensors);
    fs.writeFileSync("./models/face-expression-model.json", JSON.stringify(model), (err) => {
      if (err) throw err;
      console.log("The file has been saved!");
    });
  } catch (error) {
    console.error(error);
  }
}

const imageUrls = [
  // array of image URLs here
];

trainModel(imageUrls);
I don't know exactly why but I had the same problem a while ago. Try using the "fs.writeFile" method. It worked for me.
fs.writeFile("models/face-expression-model.json", JSON.stringify(model), {}, (err) => {
if (err) throw err;
console.log("The file has been saved!");
});
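One other thing worth checking: fs.writeFile does not create missing directories, so if ./models does not exist the write fails with ENOENT. A minimal sketch that makes sure the directory exists first (model comes from the surrounding code, as in the snippet above):

const fs = require("fs");

// Create the models directory (and any missing parents) if it doesn't exist yet.
fs.mkdirSync("./models", { recursive: true });

fs.writeFile("./models/face-expression-model.json", JSON.stringify(model), (err) => {
  if (err) throw err;
  console.log("The file has been saved!");
});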
Good luck with that!

Downloading and sending pdf document in Node through API

I am new to Node. I want to download a PDF document from another URL when someone hits a POST request on the back end, change the name of the file, and send the file back to the original client, where the PDF will be downloaded.
NOTE: the file should not be saved on the server.
First, there is a controller file which contains the following code:
try {
  const get_request: any = req.body;
  const result = await printLabels(get_request, res);
  res.contentType("application/pdf");
  res.status(200).send(result);
} catch (error) {
  const ret_data: errorResponse = await respondError(
    error, "Something Went Wrong.",
  );
  res.status(200).json(ret_data);
}
Then the function printLabels is defined as:
export const printLabels = async (request: any, response: any) => {
  try {
    const item_id = request.item_id;
    let doc = await fs.createReadStream(`some url with ${item_id}`);
    doc.pipe(fs.createWriteStream("Invoice_" + item_id + "_Labels.pdf"));
    return doc;
  } catch (error) {
    throw error;
  }
};
Using the above code, I am getting an error saying no such file found. Also, I don't have access to the front end, so is it possible to test the API for the PDF with Postman the way I am doing, or is my approach incorrect?
The following solution works for Express, but I'm not sure whether you're using an Express-like framework. If not, please specify which framework you're using.
First, you need to use sendFile instead of send:
try {
  const get_request: any = req.body;
  const result = await printLabels(get_request, res);
  res.contentType("application/pdf");
  res.status(200).sendFile(result);
} catch (error) {
  const ret_data: errorResponse = await respondError(
    error, "Something Went Wrong.",
  );
  res.status(200).json(ret_data);
}
Then, note that you are returning a read stream instead of a path to a file. sendFile requires an absolute path, so return that instead:
const printLabels = async () => {
  try {
    let doc = await fs.createReadStream(path.join(__dirname, 'test.pdf'));
    doc.pipe(fs.createWriteStream("Invoice_test_Labels.pdf"));
    return path.join(__dirname, 'Invoice_test_Labels.pdf');
  } catch (error) {
    throw error;
  }
};
As for Postman, you can of course preview the PDF response there or save it to a file.
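Since the question notes the file should not be saved on the server, a different approach worth sketching is to stream the remote PDF straight through to the client. This is not the answer's method, just a minimal Express-style sketch under assumptions (axios is available; the URL, route, and filename are placeholders):

const axios = require("axios");

app.post("/labels", async (req, res) => {
  try {
    // Fetch the remote PDF as a stream instead of buffering it to disk.
    const remote = await axios.get("https://example.com/some.pdf", {
      responseType: "stream"
    });
    res.setHeader("Content-Type", "application/pdf");
    // Content-Disposition sets the filename the client downloads.
    res.setHeader("Content-Disposition", 'attachment; filename="Invoice_Labels.pdf"');
    remote.data.pipe(res);
  } catch (error) {
    res.status(500).json({ message: "Something Went Wrong." });
  }
});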

Unable to export db properties from nodejs module

I am trying to export database properties stored in a properties file from a JavaScript module. By the time the database properties file is read, the JavaScript module has already been exported, and the data properties appear undefined wherever I use them in other modules.
const Pool = require('pg').Pool;
const fs = require('fs');
const path = require('path');

class DbConfig {
  constructor(dbData) {
    this.pool = new Pool({
      user: dbData['user'],
      host: dbData['host'],
      database: dbData['database'],
      password: dbData['password'],
      max: 20,
      port: 5432
    });
  }
}

function getdbconf() {
  const dbData = {};
  fs.readFile("../../db_properties.txt"), 'utf8', (err, data) => {
    if (err) {
      console.error(err)
      return
    }
    // dbData = {"user":"postgres", "password": "1234"...};
    return dbData;
  });
}
let db = new DbConfig(getdbconf());
let dbPool = db.pool;
console.log("dbpool : -> : ",dbPool); // username and password appear undefined
module.exports = { dbPool };
Is there a way to read data before exporting data from Javascript module?
Usually database config or any other sensitive info is read from a .env file using dotenv.
Or
you could also provide env vars from the command line itself, like:
DB_HOST=127.0.0.1 node index.js
Inside your index.js:
console.log(process.env.DB_HOST)
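For completeness, a minimal dotenv sketch; the variable names here are illustrative assumptions, not ones from the question:

// .env (never commit this file)
// DB_USER=postgres
// DB_PASSWORD=secret

require('dotenv').config(); // loads .env into process.env

const { Pool } = require('pg');

const pool = new Pool({
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
  port: 5432,
});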
Please create a new file (connection-pool.js) and paste this code:
const { Pool } = require('pg');

const poolConnection = new Pool({
  user: 'postgresUserName',
  host: 'yourHost',
  database: 'someNameDataBase',
  password: 'postgresUserPassword',
  port: 5432,
});

console.log('connectionOptions', poolConnection.options);

module.exports = poolConnection;
To use it, create a new file (demo-connection.js) and paste this code:
const pool = require('./connection-pool');

pool.query('SELECT NOW();', (err, res) => {
  if (err) {
    // throw err;
    console.log('connection error');
    return;
  }
  if (res) {
    console.log(res.rows);
    pool.end();
  }
});
This is an alternative option 🙂
Exporting the result of async calls
To export values which have been obtained asynchronously, export a Promise.
const fs = require('fs/promises'); // `/promises` means no callbacks, a Promise is returned

const dbDataPromise = fs.readFile('fileToRead'); // `readFile` returns a Promise now

module.exports = dbDataPromise;
Importing
When you need to use the value,
const dbDataPromise = require('./dbdata');

async function init() {
  const dbData = await dbDataPromise;
}

// or without async, using Promise callbacks
function init() {
  dbDataPromise
    .then(dbData => { /* the rest of your code that depends on dbData goes here */ });
}
Current code broken
Please note that your current code, as pasted above, is broken:
function getdbconf() {
  const dbData = {};
  fs.readFile("../../db_properties.txt"), 'utf8', (err, data) => {
    //[...] snipped for brevity
    return dbData;
  });
}
fs.readFile "returns" dbData, but there is nothing to return to, since you are in a callback which you did not call yourself. Function getdbconf returns nothing.
The line that says let db = new DbConfig(getdbconf()); will NOT work. It needs to be inside the callback.
The only way to avoid putting all of your code inside the callback (and "flatten" it) is to use await, as sketched below, or to use readFileSync.
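A minimal sketch of the await route, using the promise-based fs API; the assumption that the properties file contains JSON is mine, based on the commented-out line in the question:

const fs = require('fs/promises');

async function getdbconf() {
  const data = await fs.readFile('../../db_properties.txt', 'utf8');
  return JSON.parse(data); // assumes the file holds JSON like {"user":"postgres", ...}
}

// Callers must await it too:
// const db = new DbConfig(await getdbconf());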
Avoiding the issue
Using environment variables
Suhas Nama's suggestion is a good one, and is common practice. Try putting the values you need in environment variables.
Using synchronous readFile
While using synchronous calls does block the event loop, it's ok to do during initialization, before your app is up and running.
This avoids the problem of having everything in a callback or having to export Promises, and is often the best solution.
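Here is a minimal readFileSync version of getdbconf, under the same assumption that the properties file contains JSON:

const fs = require('fs');

function getdbconf() {
  // Blocking read is fine here: it runs once, at startup.
  const data = fs.readFileSync('../../db_properties.txt', 'utf8');
  return JSON.parse(data);
}

let db = new DbConfig(getdbconf()); // now works, since getdbconf returns real data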

MongoDB reusable custom javascript module

I would like to create a local JavaScript module I can "require" in other files to handle all MongoDB CRUD operations.
I wrote something as:
-- dbConn.js file --
require('dotenv').config()
const MongoClient = require('mongodb').MongoClient
const ObjectID = require('mongodb').ObjectID

let _connection

const connectDB = async () => {
  try {
    const client = await MongoClient.connect(process.env.MONGO_DB_URI, {
      useNewUrlParser: true,
      useUnifiedTopology: true
    })
    console.log('Connected to MongoDB')
    return client
  } catch (err) {
    console.log(err)
  }
}

exports.findOne = async () => {
  let client = await connectDB()
  if (!client) {
    return;
  }
  try {
    const db = client.db("Test_DB");
    const collection = db.collection('IoT_data_Coll');
    const query = {}
    let res = await collection.findOne(query);
    return res;
  } catch (err) {
    console.log(err);
  } finally {
    client.close();
  }
}

exports.findAll = async () => {
  let client = await connectDB()
  if (!client) {
    return;
  }
  try {
    const db = client.db("Test_DB");
    const collection = db.collection('IoT_data_Coll');
    const query = {}
    let res = await collection.find(query).toArray();
    return res;
  } catch (err) {
    console.log(err);
  } finally {
    client.close();
  }
}
Then in another file (not necessary inside Express app), say
-- app.js ---
const findAll = require('./dbConn').findAll
const findOne = require('./dbConn').findOne
findAll().then(res => JSON.stringify(console.log(res)))
findOne().then(res => JSON.stringify(console.log(res)))
I wonder if it is correct?
I have to close the connection after each method/CRUD operation?
I was trying to use an IIFE instead of ".then", as:
(async () => {
console.log(await findOne())
})()
But I receive a weird error saying that findAll is not a function.
What's wrong with it?
Thanks.
It really depends on your use case, which isn't clear: whether you are using Express or a standalone script, and how frequently you plan to run app.js.
Either way, your code is expensive: each time you call into dbConn.js, you open a new connection to the database.
So you can fix app.js by connecting once in dbConn.js and reusing that connection, as sketched below.
The best practice is, of course, to use connection pooling: https://www.compose.com/articles/connection-pooling-with-mongodb/
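A minimal sketch of reusing one client instead of reconnecting per call; the database and collection names are the ones from the question, and the caching logic is an assumption, not the linked article's code:

const { MongoClient } = require('mongodb')

let client = null

// Connect once and cache the client; subsequent calls reuse it.
const getClient = async () => {
  if (!client) {
    client = await MongoClient.connect(process.env.MONGO_DB_URI, {
      useNewUrlParser: true,
      useUnifiedTopology: true
    })
  }
  return client
}

exports.findAll = async () => {
  const db = (await getClient()).db("Test_DB")
  return db.collection('IoT_data_Coll').find({}).toArray()
  // note: no client.close() here; the app closes the client once, on shutdown
}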

Async await sqlite in javascript

I'm looking at this tutorial, which uses a library called aa-sqlite to replace Promise syntax with async/await.
I'm not seeing aa-sqlite on npm. Is there another, updated syntax for async await sqlite?
Here is what I'm trying with the standard sqlite library:
const sqlite3 = require('sqlite3').verbose();

let db = new sqlite3.Database("tmp.db");

async function myfunc(db) {
  let sql = "SELECT id id FROM TABLE LIMIT 2";
  let res1 = await db.run(sql);
  console.log(res1);
  for (const row of res1) {
    console.log(row);
  }
}
But this yields
TypeError: res1 is not iterable
I am not expecting res1 to be an object, but instead an iterator of results. How can I async/await the results of a db.run query in ES7/ES8?
I sort of tried the sqlite npm package, which implements async/await over sqlite3, but it is not that easy to use.
A simple way is to create a little module and promisify the main sqlite3 functions.
Here is my simple module I created for a Discord chatbot database:
const sqlite3 = require('sqlite3');
const util = require('util');

let db = new sqlite3.Database('./db/chatbot.sqlite3', sqlite3.OPEN_READWRITE, (err) => {
  if (err) {
    console.error(err.message);
  }
  console.log('Connected to the chatbot database.');
});

db.run = util.promisify(db.run);
db.get = util.promisify(db.get);
db.all = util.promisify(db.all);

// empty all data from db
db.clean_db = async function() {
  await db.run("delete from users");
  await db.run("delete from members");
  await db.run("delete from guilds");
  db.run("vacuum");
}

// any kind of other function ...

// and then export your module
module.exports = db;
How to use it: you can now use the module like this in your code:
const db = require('./db');

// get one user
const myUser = await db.get("select * from users where id = ?", [id]);
if (!myUser)
  return console.log("User with id", id, "not found");

// get all users
const users = await db.all("select * from users");
users.map((user) => { console.log(user.id, "/", user.name); });

// ... etc ...
For me the simplest solution would be to encapsulate the operation in a Promise like so:
const res = await new Promise((resolve, reject) => {
  db.each('SELECT id FROM table', [], (err, row) => {
    if (err)
      reject(err);
    resolve(row);
  });
});

console.log(res);
With this you'll have the row result in res outside the callback. Note that db.each invokes its callback once per row, and a Promise settles only once, so this resolves with the first row only; see the sketch below for fetching all rows.
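A sketch of the same wrapping idea for multiple rows, using db.all, which passes the whole result array to its callback; the table and column names are the ones from the question:

const rows = await new Promise((resolve, reject) => {
  db.all('SELECT id FROM table', [], (err, rows) => {
    if (err) {
      reject(err);
    } else {
      resolve(rows); // all rows at once
    }
  });
});

rows.forEach((row) => console.log(row.id));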
Try the sqlite package, rather than the sqlite3 used in the demo. It has better support for async/await.
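For reference, a minimal sketch of the sqlite package's promise API (the filename is a placeholder; this assumes the sqlite@4-style open signature):

const sqlite3 = require('sqlite3');
const { open } = require('sqlite');

async function main() {
  // open returns a Promise for a db handle whose methods return Promises
  const db = await open({ filename: 'tmp.db', driver: sqlite3.Database });
  const rows = await db.all('SELECT id FROM table LIMIT 2');
  rows.forEach((row) => console.log(row.id));
  await db.close();
}

main();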
You are not seeing the aa-sqlite package because it's not an npm package.
The author of the tutorial you are referring to simply showed how he created this small aa-sqlite package; it's all written inside the tutorial, but it has not been published on npm.
