ReferenceError: process is not defined - javascript

I am running a k6 test and using environment variables in some of the code. I keep getting this darn error:
ReferenceError: process is not defined
I have tried the alternative that some people suggested in a similar issue, using __ENV.yadda, but that did not work either.
import http from "k6/http";

let formData = {
  client_id: "LoadTesting",
  grant_type: "blah_blah",
  scope: "Scope",
};

const messageHeaders = {
  'Content-Type': 'Client Type',
};

let user = null;

export function authorizeUser() {
  // This line throws "ReferenceError: process is not defined" in k6
  formData.client_secret = process.env.REACT_APP_CLIENT_SECRET;
  if (!user) {
    let res = http.post(`https://localhost:44341/connect/token`, formData, { headers: messageHeaders });
    if (res.status != 200) {
      console.log('res: ', res.status);
      throw new Error("couldn't load user");
    }
    user = JSON.parse(res.body).access_token;
    return user;
  }
}
I just want my env variables to work!

Environment variables work differently in k6 than in Node.js. Try the approach described here:
https://k6.io/docs/using-k6/environment-variables
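For reference, here is a minimal sketch of the documented __ENV approach (CLIENT_SECRET is just an example name; pass it on the command line with k6 run -e CLIENT_SECRET=... script.js):
import http from "k6/http";

export default function () {
  // k6 exposes environment variables on the global __ENV object;
  // there is no Node.js-style process.env inside a k6 script.
  const formData = {
    client_id: "LoadTesting",
    client_secret: __ENV.CLIENT_SECRET,
  };
  http.post("https://localhost:44341/connect/token", formData);
}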

You can use the k6/x/dotenv extension library to do this. Please see: https://github.com/szkiba/xk6-dotenv for more information
At the start of your file:
import dotenv from "k6/x/dotenv";
const env = dotenv.parse(open('/path/to/your/.env'))
Then get your environment variables like so:
const yourEnvVar = env.YOUR_ENV_VAR

Related

Unable to export db properties from nodejs module

I am trying to export database properties stored in a properties file from a JavaScript module. By the time the properties file has been read, the module has already been exported, so the properties appear undefined wherever I use them in other modules.
const Pool = require('pg').Pool;
const fs = require('fs');
const path = require('path');

class DbConfig {
  constructor(dbData) {
    this.pool = new Pool({
      user: dbData['user'],
      host: dbData['host'],
      database: dbData['database'],
      password: dbData['password'],
      max: 20,
      port: 5432
    });
  }
}

function getdbconf() {
  const dbData = {};
  fs.readFile("../../db_properties.txt", 'utf8', (err, data) => {
    if (err) {
      console.error(err);
      return;
    }
    // dbData = {"user":"postgres", "password": "1234"...};
    return dbData;
  });
}

let db = new DbConfig(getdbconf());
let dbPool = db.pool;
console.log("dbpool : -> : ", dbPool); // username and password appear undefined
module.exports = { dbPool };
Is there a way to read data before exporting data from Javascript module?
Usually database config or any other sensitive info is read from a .env file using dotenv.
Or you could provide the environment variable from the command line itself, like:
DB_HOST=127.0.0.1 node index.js
Then inside your index.js:
console.log(process.env.DB_HOST)
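For the dotenv route, a minimal sketch (the variable names below are only examples, not taken from the question):
// .env -- keep this file out of source control
// DB_USER=postgres
// DB_PASSWORD=1234
// DB_HOST=127.0.0.1
// DB_NAME=mydb

require('dotenv').config(); // loads .env into process.env
const { Pool } = require('pg');

const dbPool = new Pool({
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
  host: process.env.DB_HOST,
  database: process.env.DB_NAME,
  max: 20,
  port: 5432
});

module.exports = { dbPool };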
Please create a new file (connection-pool.js) and paste this code:
const { Pool } = require('pg');

const poolConnection = new Pool({
  user: 'postgresUserName',
  host: 'yourHost',
  database: 'someNameDataBase',
  password: 'postgresUserPassword',
  port: 5432,
});

console.log('connectionOptions', poolConnection.options);
module.exports = poolConnection;
To use it, create a new file (demo-connection.js) and paste this code:
const pool = require('./connection-pool');

pool.query('SELECT NOW();', (err, res) => {
  if (err) {
    // throw err;
    console.log('connection error');
    return;
  }
  if (res) {
    console.log(res.rows);
    pool.end();
  }
});
This is an alternative option 🙂
Exporting the result of async calls
To export values which have been obtained asynchronously, export a Promise.
const fs = require('fs/promises'); // `fs/promises` means no callbacks, a Promise is returned
const dbDataPromise = fs.readFile('fileToRead'); // `readFile` returns a Promise now
module.exports = dbDataPromise;
Importing
When you need to use the value,
const dbDataPromise = require('./dbdata');
async function init() {
  const dbData = await dbDataPromise;
  // ... the rest of your code that depends on dbData here
}

// or without async, using Promise callbacks
function init() {
  dbDataPromise
    .then(dbData => {
      // the rest of your code that depends on dbData here
    });
}
Current code broken
Please note that your current code, as pasted above, is broken:
function getdbconf() {
  const dbData = {};
  fs.readFile("../../db_properties.txt", 'utf8', (err, data) => {
    //[...] snipped for brevity
    return dbData;
  });
}
fs.readFile "returns" dbData, but there is nothing to return to, since you are in a callback which you did not call yourself. Function getdbconf returns nothing.
The line that says let db = new DbConfig(getdbconf()); will NOT work. It needs to be inside the callback.
The only way to avoid putting all of your code inside the callback (and to "flatten" it) is to use await, or to use readFileSync.
Avoiding the issue
Using environment variables
Suhas Nama's suggestion is a good one, and is common practice. Try putting the values you need in environment variables.
Using synchronous readFile
While using synchronous calls does block the event loop, it's ok to do during initialization, before your app is up and running.
This avoids the problem of having everything in a callback or having to export Promises, and is often the best solution.
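A minimal sketch of that approach, assuming db_properties.txt contains JSON (the question doesn't show its exact format):
const fs = require('fs');
const { Pool } = require('pg');

// A blocking read is acceptable here: it runs once, at module load time,
// before the app starts serving requests.
const dbData = JSON.parse(fs.readFileSync('../../db_properties.txt', 'utf8'));

const dbPool = new Pool({
  user: dbData.user,
  password: dbData.password,
  host: dbData.host,
  database: dbData.database,
  max: 20,
  port: 5432
});

module.exports = { dbPool };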

Browserify doesn't send data to write function

I am using this code to take user input, process it into a usable format, and then post it to a Node Express server. When the write function is called, the data is undefined.
When I open this code as a file in the browser it works fine, except for the fetch POST, because I am using node-fetch.
When I serve the page and hard-code the data, the fetch works fine too.
I am also using Browserify/watchify to bundle node-fetch with my code, and this seems to be the problem. When I moved the fetch into the same function that processes the input, it worked fine.
For some reason Browserify isn't passing the data to the write function.
I'd really like to keep the server communication separate from client-side data processing.
Any suggestions?
function add_cat() {
  let name = document.getElementById("name").value;
  const nodePros = document.querySelectorAll('input.pro');
  const nodeCons = document.querySelectorAll('input.con');
  let stringPros = [];
  let stringCons = [];

  nodePros.forEach(currentValue => {
    let pro = currentValue.value.toString();
    if (pro.length > 0) {
      stringPros.push(pro);
    }
  });

  nodeCons.forEach(curValue => {
    let con = curValue.value.toString();
    if (con.length > 0) {
      stringCons.push(con);
    }
  });

  write_cat(name, stringPros, stringCons);
}
module.exports = add_cat;

function write_cat(name, stringPros, stringCons) {
  const fetch = require('node-fetch');
  (async () => {
    const rawResponse = await fetch('http://localhost:8080/api/categories?', {
      method: 'POST',
      headers: {
        'Accept': 'application/json',
        'Content-Type': 'application/json'
      },
      body: JSON.stringify({
        name: name,
        pros: stringPros,
        cons: stringCons
      })
    });
    const content = await rawResponse.json();
    console.log(content);
  })();
}
module.exports = write_cat;

How to fix firebase database initialised multiple times due to React SSR initialised database and cloud function firebase initialised database?

I have updated the question, as I found the root cause of the issue.
I have hosted my React SSR app, which uses the Firebase database on the client, and it is served by a cloud function named app. It throws the error Error: FIREBASE FATAL ERROR: Database initialized multiple times. Please make sure the format of the database URL matches with each database() call. When I comment the functions out one by one and deploy, everything works perfectly, but when I deploy them together it doesn't. How do I separate these two while keeping both in the same repo?
ORIGINAL question: Why is the Firebase cloud function throwing the error 'The default Firebase app does not exist.'?
So I am trying out Firebase functions for the first time. admin.messaging() is throwing the following error. Help me figure out why.
If I look at the console, I get results up to console.log('deviceToken', deviceToken);
So what's wrong with const messageDone = await admin.messaging().sendToDevice(deviceToken, payload);?
const functions = require('firebase-functions');
const admin = require('firebase-admin');

exports.updateUnreadCount = functions.database.ref('/chats/{chatId}/{messageId}')
  .onCreate(async (snap, context) => {
    const appOptions = JSON.parse(process.env.FIREBASE_CONFIG);
    appOptions.databaseAuthVariableOverride = context.auth;
    const adminApp = admin.initializeApp(appOptions, 'app');
    const { message, senderId, receiverUid } = snap.val();
    console.log(message, senderId, receiverUid);
    console.log('------------------------');
    const deleteApp = () => adminApp.delete().catch(() => null);
    try {
      const db = adminApp.database();
      const reciverUserRef = await db.ref(`users/${receiverUid}/contacts/${senderId}/`);
      console.log('reciverUserRef', reciverUserRef);
      const deviceTokenSnapshot = await reciverUserRef.child('deviceToken').once('value');
      const deviceToken = await deviceTokenSnapshot.val();
      console.log('deviceToken', deviceToken);
      const payload = {
        notification: {
          title: 'Test Notification Title',
          body: message,
          sound: 'default',
          badge: '1'
        }
      };
      const messageDone = await admin.messaging().sendToDevice(deviceToken, payload);
      console.log('Successfully sent message: ', JSON.stringify(messageDone));
      return deleteApp().then(() => res);
    } catch (err) {
      console.log('error', err);
      return deleteApp().then(() => Promise.reject(err));
    }
  });
Update 1: According to https://firebase.google.com/docs/cloud-messaging/send-message#send_to_a_topic, the admin.messaging().sendToDevice(deviceToken, payload) API is only available in the Admin Node.js SDK? So I switched to:
const payload = {
  data: {
    title: 'Test Notification Title',
    body: message,
    sound: 'default',
    badge: '1'
  },
  token: deviceToken
};
const messageDone = await admin.messaging().send(payload);
That is not working either; I am getting the error Error: The default Firebase app does not exist. Make sure you call initializeApp() before using any of the Firebase services. Any lead would be helpful.
EDIT: Finally got the function working.
My index.js exports the following functions:
exports.app = functions.https.onRequest(app); //React SSR
exports.updateChat = functions.database.ref('/chats/{chatId}/{messageId}').onCreate(updateChat);
exports.app is a React SSR function which I am using to host my site. It uses the database too, and it throws the multiple-database-instance error.
When I comment them out one by one and deploy, everything works perfectly, but when I deploy them together it doesn't. How do I separate these two while keeping both in the same repo? Any suggestions, please?
You can initialise the db outside the exported function.
const admin = require('firebase-admin');
const adminApp = admin.initializeApp(appOptions, 'app')
//continue code
Update:
const admin = require('firebase-admin');
const adminApp = admin.initializeApp(options);

async function initialize(options, apps = 'app') {
  try {
    const defaultApp = adminApp.name;
    if (defaultApp) {
      const adminApp1 = admin.initializeApp(apps);
    } else {
      const adminApp1 = admin.initializeApp(options, apps);
    }
  } catch (err) {
    console.error(err);
  }
}
Modify this snippet as per your needs and try it out.
It abstracts the initialization of the app into another function. Just call that function at the appropriate place in your code.
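Another common pattern (a sketch, not taken from the answer above, and assuming the default app is sufficient for both functions) is to guard initialization by checking admin.apps, so deploying several functions from the same codebase never initializes the app twice:
const admin = require('firebase-admin');

// Reuse the default app if it has already been initialized
// (for example by the SSR function bundled in the same deployment).
const adminApp = admin.apps.length ? admin.app() : admin.initializeApp();

const db = adminApp.database();
const messaging = adminApp.messaging();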

Electron: How to securely inject global variable into BrowserWindow / BrowserView?

I want to load an external webpage in Electron using BrowserView. It has pretty much the same API as BrowserWindow.
const currentWindow = remote.getCurrentWindow();
const view = new remote.BrowserView({
  webPreferences: {
    // contextIsolation: true,
    partition: 'my-view-partition',
    enableRemoteModule: false,
    nodeIntegration: false,
    preload: `${__dirname}/preload.js`,
    sandbox: true,
  },
});
view.setAutoResize({ width: true, height: true });
view.webContents.loadURL('http://localhost:3000');
In my preload.js file, I simply attach a variable to the global object.
process.once('loaded', () => {
  global.baz = 'qux';
});
The app running on localhost:3000 is a React app which references the value like this:
const sharedString = global.baz || 'Not found';
The problem is I have to comment out the setting contextIsolation: true when creating the BrowserView. This exposes a security vulnerability.
Is it possible to (one way - from Electron to the webpage) inject variables into a BrowserView (or BrowserWindow) while still using contextIsolation to make the Electron environment isolated from any changes made to the global environment by the loaded content?
Update:
One possible approach could be intercepting the network protocol, but I'm not sure about this 🤔
app.on('ready', () => {
  const { protocol } = session.fromPartition('my-partition')
  protocol.interceptBufferProtocol('https', (req, callback) => {
    if (req.uploadData) {
      // How to handle file uploads?
      callback()
      return
    }
    // This is electron.net, docs: https://electronjs.org/docs/api/net
    net
      .request(req)
      .on('response', (res) => {
        const chunks = []
        res.on('data', (chunk) => {
          chunks.push(Buffer.from(chunk))
        })
        res.on('end', () => {
          const blob = Buffer.concat(chunks)
          const type = res.headers['content-type'] || []
          if (type.includes('text/html') && blob.includes('<head>')) {
            // FIXME?
            const pos = blob.indexOf('<head>')
            // inject contains the Buffer with the injected HTML script
            callback(Buffer.concat([blob.slice(0, pos), inject, blob.slice(pos)]))
          } else {
            callback(blob)
          }
        })
      })
      .on('error', (err) => {
        console.error('error', err)
        callback()
      })
      .end()
  })
})
After doing some digging, I found a few pull requests for Electron that detail the issue you are having. The first describes a reproducible example very similar to the problem you are describing.
Expected Behavior
https://electronjs.org/docs/tutorial/security#3-enable-context-isolation-for-remote-content
A preload script should be able to attach anything to the window or document with contextIsolation: true.
Actual behavior
Anything attached to the window in the preload.js just disappears in the renderer.
It seems the final comment explains why the expected behavior no longer works:
It was actually possible until recently, a PR with Isolated Worlds has changed the behavior.
In the second PR, a user suggests what they found to be their solution:
After many days of research and fiddling with the IPC, I've concluded that the best way is to go the protocol route.
I looked at the docs for BrowserWindow and BrowserView as well as an example that shows the behavior that you desire, but these PRs suggest this is no longer possible (along this route).
Possible Solution
Looking into the documentation, the webContents object you get from view.webContents has the function executeJavaScript, so you could try the following to set the global variable.
...
view.setAutoResize({ width: true, height: true });
view.webContents.loadURL('http://localhost:3000');
view.webContents.executeJavaScript("global.baz = 'qux';");
...
Other answers are outdated; use contextBridge. Be sure to use sendToHost() instead of send().
// Preload (Isolated World)
const { contextBridge, ipcRenderer } = require('electron')

contextBridge.exposeInMainWorld(
  'electron',
  {
    doThing: () => ipcRenderer.sendToHost('do-a-thing')
  }
)

// Renderer (Main World)
window.electron.doThing()
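For completeness, a sketch of the receiving side, assuming the guest page is embedded in a <webview> tag (sendToHost delivers the message to the embedding page rather than to the main process):
// Host page that embeds <webview src="http://localhost:3000" preload="preload.js">
const webview = document.querySelector('webview')
webview.addEventListener('ipc-message', (event) => {
  if (event.channel === 'do-a-thing') {
    console.log('guest page asked the host to do a thing')
  }
})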
So, executeJavaScript as suggested by Zapparatus ended up being part of the solution.
This is what's going on in renderer.js.
view.webContents.executeJavaScript(`
  window.communicator = {
    request: function(data) {
      const url = 'prefix://?data=' + encodeURIComponent(JSON.stringify(data))
      const req = new XMLHttpRequest()
      req.open('GET', url)
      req.send();
    },
    receive: function(data) {
      alert('got: ' + JSON.stringify(data))
    }
  };
`)

const setContent = data => view.webContents.executeJavaScript(
  `window.communicator.receive(${JSON.stringify(data)})`
)

ipcRenderer.on('communicator', (event, message) => {
  setContent(`Hello, ${message}!`)
})
We ended up setting up a custom protocol, similar to how it's been done here. In your main.js file, set up the following:
const { app, session, protocol } = require('electron')
const { appWindows } = require('./main/app-run')
const { URL } = require('url')

protocol.registerSchemesAsPrivileged([
  {
    scheme: 'prefix',
    privileges: {
      bypassCSP: true, // ignore CSP, we won't need to patch CSP
      secure: true // allow requests from https context
    }
  }
])

app.on('ready', () => {
  const sess = session.fromPartition('my-view-partition')
  // https://electronjs.org/docs/tutorial/security#4-handle-session-permission-requests-from-remote-content
  sess.setPermissionRequestHandler((webContents, permission, callback) => {
    // Denies the permissions request
    const decision = false
    return callback(decision)
  })
  sess.protocol.registerStringProtocol('prefix', (req, callback) => {
    const url = new URL(req.url)
    try {
      const data = JSON.parse(url.searchParams.get('data'))
      appWindows.main.webContents.send('prefix', data)
    } catch (e) {
      console.error('Could not parse prefix request!')
    }
    const response = {
      mimeType: 'text/plain',
      data: 'ok'
    }
    callback(response)
  })
})
No preload.js or postMessage needed.

Summon dependency injection and calling a method/function inside a module

Thank you in advance for your help.
I am unsure how to do the following: I have a module that sends emails, and the config is injected into it using summon.js dependency injection, but I need to call the sendMail method and pass it the mailOptions parameter. Here is the code example:
'use strict';
const nodemailer = require('nodemailer');
const ejs = require('ejs');
const fs = require('fs');

module.exports = function(Configs) {
  // create reusable transporter object using the default SMTP transport
  let transporter = nodemailer.createTransport({
    host: Configs.email.host,
    port: Configs.email.port,
    auth: {
      user: Configs.email.user,
      pass: Configs.email.pass
    }
  });

  this.sendMail = function(mailOptions) {
    mailOptions.to = Configs.mockEmail || mailOptions.to
    mailOptions.from = Configs.email.user
    return new Promise((resolve, reject) => {
      if (mailOptions.template) {
        ejs.renderFile('/../templates/' + mailOptions.template +
          '.ejs', mailOptions.data, null, (err, html) => {
            if (err) {
              return reject(err)
            }
            resolve(html)
          })
        return
      }
      resolve()
    }).then(html => {
      mailOptions.html = html || mailOptions.html
      return new Promise((resolve, reject) => {
        // send mail with defined transport object
        transporter.sendMail(mailOptions, (error, info) => {
          if (error) {
            return reject(error)
          }
          resolve(info)
        })
      })
    })
  }

  return this
}
Then, I want to make use of this module:
const EmailUtil = require('email')

async function foo() {
  // Do something async with await.
  const mailOptions = {...}
  EmailUtils.sendMail(mailOption);
}
However, it gives me the error:
TypeError: EmailUtils.sendMail is not a function
Note: I could remove the module.exports = function(Configs) wrapper, but that would not be good, since I would need to hard-code the path of my config file, and I have multiple configuration files, one per environment. I want to keep the summon.js dependency injection while calling sendMail from another module.
Any ideas? Thank you!
Since you're exporting a function you need to actually call it after requiring the module:
const EmailUtil = require('email')

async function foo() {
  // Do something async with await.
  const mailOptions = {...}
  EmailUtil().sendMail(mailOption);
}
The use of this suggests that the function is supposed to be used as a constructor. this will only refer to the desired object if the function is called with new or bound to some context.
There is a convention in JavaScript to use Pascal-cased names for constructors, so they can be identified unambiguously in the code.
For the given EmailUtil, it should be:
const EmailUtil = require('email');
const emailUtil = new EmailUtil(config);
...
emailUtil.sendMail(mailOption);
I solved this by including EmailUtils in the depend.json file, which takes care of defining the dependencies for summon.js. In this way, I was able to pass the Configs to EmailUtils and call sendMail like so:
EmailUtils.sendMail(mailOptions);
There was no need to use the new keyword, although that is a great answer too. I was not aware a module could be instantiated in this way.
