I want to ask how to use Nodemailer with a dynamic email and password loaded from a database.
export const mailTransporter = nodemailer.createTransport({
  service: 'gmail',
  auth: {
    user: email_user,
    pass: email_pass,
  },
});
email_user and email_pass currently come from the .env file. I want to load email_user and email_pass from the database instead, so I'm thinking of creating a function that gets the email and password values from the database, saves them into variables, and uses them in mailTransporter. Any suggestions or opinions on this?
Wrap your nodemailer.createTransport call in a function before you export it; inside that function, fetch the credentials from the DB before constructing the transporter.
module.exports = { createTransportWithCredential };

function createTransportWithCredential() {
  return new Promise((resolve, reject) => {
    // get the credentials from the DB; you may need promise-then to handle the async call
    // example:
    getCredentialFromDB()
      .then(credentials => {
        let transporter = nodemailer.createTransport({
          service: 'gmail',
          auth: {
            user: credentials.user,
            pass: credentials.password,
          },
        });
        resolve(transporter);
      })
      .catch(reject);
  });
}
From another JS file you can do:
const mailer = require("./nodemailer")
mailer.createTransportWithCredential().then(transporter => {
  // use the transporter
})
It depends on what type of database you have. If you're using MySQL, you can use the mysql2 package to make queries. It looks like this.
I recommend creating a simple package outside of your main project, but this is not strictly necessary.
npm init
npm install dotenv mysql2
require('dotenv').config({ path: "/home/ubuntu/.env" });
const mysql = require("mysql2/promise");

const connection = mysql.createPool({
  host: process.env.DB_HOST,
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
  database: "main",
  flags: "-FOUND_ROWS",
  charset: "utf8mb4_0900_ai_ci",
  multipleStatements: true,
  connectionLimit: 10,
  queueLimit: 0
});

module.exports = connection;
Edit the options as you like. This one creates a pool of 10 connections and connects to a database named main. It also allows for multiple statements within a single query; that might not be desirable, so turn it off if you'd like. Finally, I'm requiring an environment file, but I am specifying a specific location rather than picking one up automatically from within the project folder (since this is its own package that will be imported into the main project).
Next we import the database package into our main project.
Just follow this page to install a local package: Installing a local module using npm?
It should be something like this while inside your main project directory.
npm install ../database
This works if your database package is located next to your main project folder. Just replace ../database with whatever the path is to your separate database package.
Now, inside our main project, it would look something like this. (I'm assuming you labelled your new package database, but if not, just replace it with whatever name you used.)
require('dotenv').config({ path: "/home/ubuntu/.env" });
const connection = require("database");

const userID = "81cae194-52bd-42d3-9554-66385030c35b";

connection.query(`
  SELECT
    EmailUsername,
    EmailPassword
  FROM
    UserEmail
  WHERE
    UserID = ?
`, [userID])
  .then(([results, fields]) => {
    let transporter = nodemailer.createTransport({
      host: process.env.EMAIL_HOST,
      port: process.env.EMAIL_PORT,
      secure: true,
      auth: {
        user: results[0].EmailUsername,
        pass: results[0].EmailPassword
      },
    });
    // Put transporter.sendMail() here
  })
  .catch(error => console.log(error));
This is just a sample of what you could do and how you could do it. You need to use your own critical thinking to fit it into your own project and style.
There is no one way to do it and some of the choices I've made are personal decisions. You should be able to merge this with Jerry's answer. He goes more into how to create the nodemailer module. I am only showing you how to connect database data with nodemailer.
Please read up on https://www.npmjs.com/package/mysql2, especially the promise wrapper section. This solution also uses dotenv (https://www.npmjs.com/package/dotenv) and Nodemailer (https://nodemailer.com/about/).
I am attempting to send an email using Nodemailer and Twilio SendGrid, following the tutorial here. As far as I can tell I am following the instructions in the tutorial, as well as the Nodemailer and SendGrid documentation. Every time this method is called, the code in the catch block executes, and I get the error Error: Missing credentials for "PLAIN".
My question was closed due to association with the question here; however, my problem is different and none of the solutions on that thread apply. I am using my own domain to send, not gmail.com. I want to solve the problem without using OAuth2, which from what I understand I should not need, given that I am using an email domain I control. Also, I am already using 'pass' rather than 'password' for my authorization data (the top solution on the associated answer).
I've been stuck on this for a few days now, and I'd appreciate any insight anyone can offer!
Here is my code:
async function sendEmail(email, code) {
  try {
    const smtpEndpoint = "smtp.sendgrid.net";
    const port = 465;
    const senderAddress = 'Name "contact@mydomain.com"';
    const toAddress = email;
    const smtpUsername = "apikey";
    const smtpPassword = process.env.SG_APIKEY;
    const subject = "Verify your email";
    var body_html = `<!DOCTYPE html>
      <html>
      <body>
        <p>Your authentication code is : </p> <b>${code}</b>
      </body>
      </html>`;

    let transporter = nodemailer.createTransport({
      host: smtpEndpoint,
      port: port,
      secure: true,
      auth: {
        user: smtpUsername,
        pass: smtpPassword,
      },
      logger: true,
      debug: true,
    });

    let mailOptions = {
      from: senderAddress,
      to: toAddress,
      subject: subject,
      html: body_html,
    };

    let info = await transporter.sendMail(mailOptions);
    return { error: false };
  } catch (error) {
    console.error("send-email-error", error);
    return {
      error: true,
      message: "Cannot send email",
    };
  }
}
And here is the log:
Thanks!
You have already identified the issue of the API key not being passed into the Nodemailer transport. While hardcoding the key is a possible solution, it's not good practice. Usually secrets and keys are managed via environment variables so they are, for example, not accidentally committed to a repository and can be configured externally without changing the code.
In the tutorial you linked, working with the environment variable is addressed, but I see there is a mistake with the .env file. So let me try to recap how to properly get SG_APIKEY from an environment variable and a .env file.
In your project directory create the .env file with the following contents:
SG_APIKEY=<your_sendgrid_api_key>
(obviously replace <your_sendgrid_api_key> with your actual API key)
Make sure the dotenv package is installed: npm i dotenv
At the beginning of the file where you use Nodemailer, add the following line:
require("dotenv").config();
This will ensure the SG_APIKEY is loaded from the .env file.
You can check whether the env variable is set correctly with console.log(process.env.SG_APIKEY).
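For reference, here is a minimal sketch of the whole flow (assuming SG_APIKEY is set in .env and using the SendGrid SMTP settings from your question):
// minimal sketch - assumes SG_APIKEY is defined in .env
require("dotenv").config();
const nodemailer = require("nodemailer");

if (!process.env.SG_APIKEY) {
  throw new Error("SG_APIKEY is not set - check your .env file");
}

const transporter = nodemailer.createTransport({
  host: "smtp.sendgrid.net",
  port: 465,
  secure: true,
  auth: {
    user: "apikey", // literal string "apikey" for SendGrid SMTP
    pass: process.env.SG_APIKEY,
  },
});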
A comment on the (closed) previous version of this thread solved the problem for me:
I'm trying to send myself an email when a new user account is made in my web app. Here is the current code I'm deploying to Firebase Functions:
const functions = require("firebase-functions");
const admin = require("firebase-admin");
const nodemailer = require("nodemailer");

admin.initializeApp();
require("dotenv").config();

const {
  SENDER_EMAIL,
  SENDER_PASSWORD
} = process.env;

exports.sendEmailNotification = functions.firestore.document("users/{userId}").onCreate(async (snapshot, context) => {
  const data = snapshot.data();

  // create reusable transporter object using the default SMTP transport
  let transporter = nodemailer.createTransport({
    host: "smtp.gmail.com",
    port: 465,
    secure: true,
    auth: {
      user: SENDER_EMAIL,
      pass: SENDER_PASSWORD,
    },
  });

  // send mail with defined transport object
  let info = await transporter.sendMail({
    from: `"MY_APP_NAME" <${SENDER_EMAIL}>`,
    to: "MY_PERSONAL_GMAIL_ACCOUNT",
    subject: `A New User Has Joined MY_APP_NAME!`,
    text: `A new user has joined MY_APP_NAME. Name: ${data.name}, email: ${data.email}`
  });
})
Running a similar version on my machine using the node index.js command works with no problem. The problem seems to happen when it runs on Firebase Functions.
According to your current code, it seems to be written correctly; as you said, it works fine on your machine using Node.js.
Probably the guide to follow to get the same result is Nodemailer as a module for Node.js.
Now, if you would like to do it using Cloud Functions for Firebase, I can highly recommend you follow the Send Email Using Firebase Functions & Nodemailer guide.
There is an additional guide for Send Email with Firebase functions and Nodemailer.
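One detail worth double-checking (a hedged aside; the guides above cover it in more depth) is whether SENDER_EMAIL and SENDER_PASSWORD are actually set in the deployed environment, not just in your local shell. A common older pattern is the Firebase runtime config; a minimal sketch, assuming you set the values with the Firebase CLI, looks like this:
// Hypothetical sketch using Firebase runtime config instead of local env variables.
// Set the values first with the CLI, e.g.:
//   firebase functions:config:set sender.email="you@example.com" sender.password="app-password"
const functions = require("firebase-functions");
const nodemailer = require("nodemailer");

const transporter = nodemailer.createTransport({
  host: "smtp.gmail.com",
  port: 465,
  secure: true,
  auth: {
    user: functions.config().sender.email,
    pass: functions.config().sender.password,
  },
});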
For me, it was failing because it wasn't importing nodemailer, as I had not installed it in the actual functions folder.
If you are installing nodemailer or any package, please make sure that you are installing it in the actual functions folder.
You can confirm that it was installed by checking package.json and seeing that the dependencies include nodemailer.
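For example, assuming the default Firebase layout where your functions code lives in a functions/ directory:
cd functions
npm install nodemailer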
I've read documentation from several pages on SO about this issue, but I haven't been able to fix my issue with this particular error.
throw new Error('SASL: SCRAM-SERVER-FIRST-MESSAGE: client password must be a string')
^
Error: SASL: SCRAM-SERVER-FIRST-MESSAGE: client password must be a string
at Object.continueSession (C:\Users\CNFis\Desktop\WulfDevelopments\ThePantry\node_modules\pg\lib\sasl.js:24:11)
at Client._handleAuthSASLContinue (C:\Users\CNFis\Desktop\WulfDevelopments\ThePantry\node_modules\pg\lib\client.js:257:10)
at Connection.emit (events.js:400:28)
at C:\Users\CNFis\Desktop\WulfDevelopments\ThePantry\node_modules\pg\lib\connection.js:114:12
at Parser.parse (C:\Users\CNFis\Desktop\WulfDevelopments\ThePantry\node_modules\pg-protocol\dist\parser.js:40:17)
at Socket.<anonymous> (C:\Users\CNFis\Desktop\WulfDevelopments\ThePantry\node_modules\pg-protocol\dist\index.js:11:42)
at Socket.emit (events.js:400:28)
at addChunk (internal/streams/readable.js:290:12)
at readableAddChunk (internal/streams/readable.js:265:9)
at Socket.Readable.push (internal/streams/readable.js:204:10)
It's as if my connectDB() function isn't recognizing the password to the database. I am trying to run a seeder.js script to seed the database with useful information for testing purposes. If I run npm run server, which is a script that just starts a nodemon server, it connects to the DB just fine, but when I try to run my script to seed data, I get this error.
import { Sequelize } from "sequelize";
import colors from "colors";
import dotenv from "dotenv";

dotenv.config();

const user = "postgres";
const host = "localhost";
const database = "thePantry";
const port = "5432";

const connectDB = async () => {
  const sequelize = new Sequelize(database, user, process.env.DBPASS, {
    host,
    port,
    dialect: "postgres",
    logging: false,
  });
  try {
    await sequelize.authenticate();
    console.log("Connection has been established successfully.".bgGreen.black);
  } catch (error) {
    console.error("Unable to connect to the database:".bgRed.black, error);
  }
};

export default connectDB;
Above is my connectDB() file, and again, it works when I run the server normally, but I receive this error only when trying to seed the database. I'll post my seeder script below:
import dotenv from "dotenv";
import colors from "colors";
import users from "./data/users.js";
import User from "./models/userModel.js";
import connectDB from "./config/db.js";

dotenv.config();
console.log(process.env.DBPASS);

connectDB();

const importData = async () => {
  try {
    await User.drop();
    await User.sync();
    await User.bulkCreate(users);
    console.log("Data Imported".green.inverse);
    process.exit();
  } catch (e) {
    console.error(`${e}`.red.inverse);
    process.exit(1);
  }
};

const destroyData = async () => {
  try {
    await User.bulkDestroy();
    console.log("Data Destroyed".red.inverse);
    process.exit();
  } catch (e) {
    console.error(`${e}`.red.inverse);
    process.exit(1);
  }
};

if (process.argv[2] === "-d") {
  destroyData();
} else {
  importData();
}
Add your .env file to your project; I think your .env file is missing from your project folder.
Add it like this:
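For example, something like the following (a hypothetical sketch; the variable name matches the DBPASS that connectDB reads):
DBPASS=your_database_password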
So, I may have figured this out by playing around with Sequelize in another project. As it turns out, the initial connection to the database in my server.js file honestly means nothing. Unlike Mongoose, where the connection is available across the whole app, with Sequelize the connection it creates is only apparent in certain places. For example, I was trying the same process in my other project as I am here, except I was trying to read data from my DB using the model I built with Sequelize, and I was receiving the same type error. I went into the file where I defined the model, made a Sequelize connection there, and was then able to read from the database using that object model.
Long story short, to fix the error in this app I have to place a connection to the database in the seeder.js file, or I have to place a connection in the User model (this is ideal since I'll be using the model in various places), to be able to seed information or read information from the database.
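A rough sketch of that second option (hypothetical column definitions; the connection values are the ones from the question) might look like this:
// models/userModel.js - hypothetical sketch: the model owns its own Sequelize connection,
// so seeder.js works without relying on the connection created in server.js
import { Sequelize, DataTypes } from "sequelize";
import dotenv from "dotenv";

dotenv.config();

const sequelize = new Sequelize("thePantry", "postgres", process.env.DBPASS, {
  host: "localhost",
  port: "5432",
  dialect: "postgres",
  logging: false,
});

// column definitions are placeholders - adjust them to your real schema
const User = sequelize.define("User", {
  name: DataTypes.STRING,
  email: DataTypes.STRING,
});

export default User;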
Today I had the same problem. If you use a relational database, you must define the password for the database.
const user = "postgres";
const host = "localhost";
const database = "thePantry";
const password = "yourdatabasepassword"; // if there is no password: const password = "";
const port = "5432";
But if you use a non-relational database, as long as the attributes are the same, you can immediately run the program as you defined it.
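In other words (a minimal sketch tying the variables above to the constructor from the question), the third argument to Sequelize must be a string, even an empty one:
// minimal sketch - the password argument must be a string, not undefined
import { Sequelize } from "sequelize";

const password = "yourdatabasepassword"; // or "" if the database has no password

const sequelize = new Sequelize("thePantry", "postgres", password, {
  host: "localhost",
  port: "5432",
  dialect: "postgres",
  logging: false,
});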
I also faced this issue, and a solution different from the accepted one here solved it for me, so I wanted to explain it to this lovely community too.
Firstly, when I faced the issue, I ran my project in debug mode and reached the code below.
let sequelize;
if (config.use_env_variable) {
  sequelize = new Sequelize(process.env[config.use_env_variable], config);
} else {
  sequelize = new Sequelize(config.database, config.username, config.password, config);
}
The problem here was actually obvious once I saw it: there is a problem with the .env file, as mentioned in the solutions above. My process.env defines a line like DATABASE_URL=postgres://username:password@IP_address:port/db_name, and my config.js file is in the following format:
module.exports = {
  "development": {
    "url": "postgres://username:password@IP_address:port/db_name",
    "dialect": "postgres",
  }, ...
}
So as a solution, I came up with the following fix for the parameters passed to Sequelize(...). The solution below totally worked for me, and I hope it works for you too.
let sequelize;
if (config.use_env_variable) {
  sequelize = new Sequelize(process.env[config.use_env_variable], config);
} else {
  sequelize = new Sequelize(config.url, config);
}
Finally, the point is that you need to be careful about what you have written in the config file. That's the most important thing in this case.
Farewell y'all.
Here is my case. I had the PostgreSQL connection URL in my environment, like:
POSTGRES=postgres://postgres:test@localhost:5432/default
But my config was reading it like:
POSTGRES_DB_HOST=localhost
POSTGRES_DB_PORT=5432
...rest of configs
Once I fixed that mismatch, it was resolved.
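A minimal sketch of the two options (assuming Sequelize, which accepts either a single connection URI or separate options; POSTGRES_DB_PASSWORD is a hypothetical variable name here):
import { Sequelize } from "sequelize";

// Option 1: a single connection URI in the environment
// POSTGRES=postgres://postgres:test@localhost:5432/default
const sequelizeFromUrl = new Sequelize(process.env.POSTGRES);

// Option 2: separate variables in the environment
// POSTGRES_DB_HOST=localhost, POSTGRES_DB_PORT=5432, ...
const sequelizeFromParts = new Sequelize("default", "postgres", process.env.POSTGRES_DB_PASSWORD, {
  host: process.env.POSTGRES_DB_HOST,
  port: process.env.POSTGRES_DB_PORT,
  dialect: "postgres",
});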
I faced this issue because I was trying to execute nodemon from a parent folder. Once I changed my pwd, the error was resolved.
For your seeder script, I'm doing something similar but not using Sequelize, just the node-postgres package in an Express app.
To give context (so you know if this applies to your situation):
I run a separate script for testing, which uses database credentials to test batched emailing. So I need to access my database (eventually I will migrate it to an AWS Lambda function).
I need to access my database and run sequential actions, and since I'm not spinning up my server, all the 'under the hood' processing that would normally start your connection pool is probably not running. That's my guess (I know it's an old post, but this may help others).
Try passing your hardcoded password credentials first in your seeder.js file. (I'm sure you've tried this already.)
Try creating a new Pool within your seeder script and pass it your credentials (try hardcoding them first to see if it works).
Pool in node-postgres takes a client config with the following properties (I use this to get mine to work):
const { Pool } = require("pg");

const pool = new Pool({
  user: '****',
  database: '****',
  password: '****',
  port: 5432,
  host: '****',
  max: 5,
  idleTimeoutMillis: 30000,
  connectionTimeoutMillis: 5000,
})
I imagine Sequelize will have a similar configuration, so try playing around with that.
Then I just connect to the pool and do everything I'd normally do.
Hope this helps with a bit of the troubleshooting. I had the EXACT same error message earlier. Ultimately I had to restructure my code to 'boot up' the Client/connection Pool for the database. It sounds like you're not properly 'booting up' your connection, so try doing it manually within your seeder script (don't pass process.env.DB_PASSWORD at first).
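For example, a typical usage sketch once the pool above is configured (the query text is just a placeholder):
// hypothetical usage of the pool from a standalone script
async function seed() {
  const { rows } = await pool.query("SELECT NOW()"); // placeholder query
  console.log(rows);
  await pool.end(); // close the pool so the script can exit cleanly
}

seed().catch(err => {
  console.error(err);
  process.exit(1);
});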
I saw this error when running an npx sequelize-cli db:... command
and my postgres server wasn't running or able to accept connections.
To fix it, I had to be running: postgres -D /usr/local/var/postgres in the background.
I am working on integration testing for my Node.js/TypeScript application using a MongoDB database. I am using Jest as the testing framework. How can I replace the real DB configuration with an in-memory database (MongoDB) that I can use for testing? Can anyone help me with the configuration?
config.ts
/**
 * @file Configuration file - Testing Configuration.
 */
export default {
  jwtPrivateKey: '11234.xsdfcswfe.23rcscdsfg',
  // Testing Database configuration
  MongoDB: {
    dbConfig: {
      user: 'xxxx',
      password: 'xxxx',
      host: '11.222.333.444',
      port: '27017',
      authMechanism: 'SCRAM-SHA-1',
      authSource: 'permissionlevel',
      dbName: 'sample_db'
    }
  }
};
You can set up a real testing database before running the tests and just drop it after running the tests. In this example (using Mongoose), the database is cleaned even before running the tests (in case something went wrong with the last run).
// e.g. inside a Jest beforeAll(done => { ... }) hook, so `done` is available
const mongoose = require('mongoose')

mongoose.connect('mongodb://localhost/testing_db')
const db = mongoose.connection

db.on('error', err => {
  console.error(err.toString())
  done(err)
})

db.once('open', () => {
  // drop any leftovers from a previous run, then signal that the hook is finished
  db.db.dropDatabase(() => {
    done()
  })
})
This drops the testing_db database before the tests run.
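A matching cleanup step (a sketch assuming Jest's afterAll hook) can drop the database again and close the connection when the tests finish:
// hypothetical cleanup after the test run
afterAll(async () => {
  await mongoose.connection.db.dropDatabase();
  await mongoose.disconnect();
});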
I have started using
@shelf/jest-mongodb and so far it is working great.
The documentation on their site is great and the repo has decent examples.
It is also the library recommended by Jest on their site (Using with MongoDB), so I'd suggest starting with this if you haven't already.
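For reference, wiring it up is roughly this (a minimal sketch, assuming the package is installed as a dev dependency):
// jest.config.js - minimal sketch using the @shelf/jest-mongodb preset
module.exports = {
  preset: '@shelf/jest-mongodb',
};
The preset exposes the in-memory server's connection string, which is what the __MONGO_URI__ global used in the config below comes from.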
After working on it for some hours, I configured config.ts in a way that works fine for me.
/**
 * @file Configuration file - Testing Configuration.
 */

// configuring in-memory MongoDB
const globalAny: any = global;
const inMemoryUri = globalAny.__MONGO_URI__;
let uri = inMemoryUri.split('/');
let hostPort = uri[2].split(':');

export default {
  jwtPrivateKey: '121231231fbuyfg.hfvufuewfr3452',
  // Testing Database configuration
  MongoDB: {
    dbConfig: {
      user: '',
      host: hostPort[0],
      port: '27017',
      authMechanism: 'SCRAM-SHA-1',
      authSource: 'permissionlevel',
      dbName: 'jest'
    }
  }
};
I have searched a lot for a solution to my problem but didn't find one.
If anyone has experience with such situations, please help me.
I have created an application server in Node/Express with a MySQL database,
and successfully created REST API endpoints which work fine.
But our project has scaled up: a new client approached us, so we need to serve those clients too.
Those clients may have 1k users, but the database schema is the same.
Solution 1: create a separate server and database for each client, on a different port number.
But I don't think this is a good solution, because if we have 100 clients we can't maintain the code base.
Solution 2: create a separate database for each client and switch the database connection at run time.
But I don't understand how to implement solution 2. Any suggestion is highly appreciated.
If more than one client is requesting the same server, how do we know which database needs to be connected based on the endpoint URL? Is there any alternate way to tackle this situation?
My solution: create a middleware to find out which database is required and return the connection string. Is it a good idea?
Middleware: in the example below I use a JWT token which contains the database name.
const dbHelper = new db();

class DbChooser {
  constructor() {
    this.db = {
      wesa: {
        host: "xxx",
        user: "xxxx",
        password: "xxxxx",
        database: "hdgh",
        connectionLimit: 10,
        connectTimeout: 30000,
        multipleStatements: true,
        charset: "utf8mb4"
      },
      svn: {
        host: "x.x.x.x",
        user: "xxxx",
        password: "xxx",
        database: "xxx",
        connectionLimit: 10,
        connectTimeout: 30000,
        multipleStatements: true,
        charset: "utf8mb4"
      }
    };
  }

  async getConnectiontring(req, res, next) {
    //console.log(req.decoded);
    let d = new DbChooser();
    let con = d.db[req.decoded.userId];
    console.log(mysql.createPool(con));
    next();
  }
}

module.exports = DbChooser;
You can create a config JSON. On every request, the request header should have a client_id; based on that client_id we can get the instance of the database connection.
Your db config JSON:
var dbconfig = {
  'client1': {
    databasename: '',
    host: '',
    password: '',
    username: ''
  },
  'client2': {
    databasename: '',
    host: '',
    password: '',
    username: ''
  }
}
You should declare a global object to maintain the singleton db instances for every client.
global.dbinstances = {};
On every request, you check whether the instance is already available in your global object. If it's available, you can continue to the next process; otherwise, create a new instance.
app.use('*', function(req, res, next) {
  let client_id = req.headers.client_id;
  if (global.dbinstances[client_id]) {
    next();
  } else {
    const config = dbconfig[client_id];
    connectoDb(config, client_id);
  }
});

function connectoDb(config, client_id) {
  //.. once it is connected
  global.dbinstances[client_id] = con; // con refers to the db connection instance.
}
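A more complete sketch of that pattern (hypothetical; it assumes an existing Express app and mysql2 pools like elsewhere in this thread, so adjust it to whatever driver you actually use):
// hypothetical middleware sketch: one mysql2 pool per client_id, cached on a global object
const mysql = require("mysql2/promise");

global.dbinstances = {};

app.use(function (req, res, next) {
  const client_id = req.headers.client_id;
  const config = dbconfig[client_id];

  if (!config) {
    return res.status(400).json({ error: "Unknown client_id" });
  }

  if (!global.dbinstances[client_id]) {
    // create the pool once and cache it for later requests
    global.dbinstances[client_id] = mysql.createPool({
      host: config.host,
      user: config.username,
      password: config.password,
      database: config.databasename,
      connectionLimit: 10,
    });
  }

  // expose the right pool to downstream handlers
  req.db = global.dbinstances[client_id];
  next();
});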