I have a CSV that I have to read in and upload to a MySQL database using Prisma. Writing all the students to the database works, but the whole 'if a group exists, use that group; if it doesn't, create it' part doesn't: it keeps adding duplicate groups to the database. I have tried so many options by now and none seem to work, and I'm getting desperate...
router.post('/', async (req, res) => {
    fs.createReadStream("./Upload/StudentenEnGroepen001.csv")
        .pipe(parse({ delimiter: ",", from_line: 2 }))
        .on("data", async (row) => {
            let group = null;
            let inschrijving = row[6];
            if (inschrijving === "Student") {
                let parts = row[8].split(",");
                let groep = parts[1];
                let groepNaam = groep.split(" - ")[0].trim();
                const student = await prisma.student.create({
                    data: {
                        Code: row[0],
                        Gebruikersnaam: row[1],
                        Familienaam: row[2],
                        Voornaam: row[3],
                        Sorteernaam: row[4],
                        Email: row[5],
                    },
                })
                const groupCount = await prisma.groep.count({
                    where: {
                        Naam: { equals: groepNaam },
                    }
                });
                if (groupCount > 0) {
                    const existingGroup = await prisma.groep.findFirst({
                        where: {
                            Naam: { equals: groepNaam },
                        }
                    });
                    group = existingGroup;
                } else {
                    group = await prisma.groep.create({
                        data: {
                            Naam: groepNaam,
                        }
                    });
                }
                await prisma.groepstudent.create({
                    data: {
                        GroepID: group.ID,
                        StudentID: student.ID,
                    },
                });
            }
        })
    res.json({message: "Studenten en groepen zijn toegevoegd."});
});
My latest attempt was this, but it also doesn't work.
const groupCount = await prisma.groep.count({
    where: {
        Naam: { equals: groepNaam },
    }
});
if (groupCount > 0) {
    const existingGroup = await prisma.groep.findFirst({
        where: {
            Naam: { equals: groepNaam },
        }
    });
    group = existingGroup;
} else {
    group = await prisma.groep.create({
        data: {
            Naam: groepNaam,
        }
    });
}
I tried to just use 'findFirst' but that also didn't work...
group = await prisma.groep.findFirst({
    where: {
        Naam: { equals: groepNaam },
    }
});
if (group) {
    // then use this group
} else {
    // create group..
    ...
}
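For what it's worth, the duplicates come from the .on("data", async ...) handlers overlapping: several rows can pass the count check before any of them has created the group. If Naam is marked @unique in the Prisma schema (an assumption, the schema isn't shown in the question), the find-or-create can be collapsed into a single atomic upsert; a minimal sketch:

// Sketch: atomic find-or-create, assuming Naam is @unique in schema.prisma
const group = await prisma.groep.upsert({
    where: { Naam: groepNaam },   // requires a unique constraint on Naam
    update: {},                   // change nothing if the group already exists
    create: { Naam: groepNaam },
});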
Would you please include the errors you're getting?
Adding to what Pointy said, try going back to the point where your app was actually working. Then, start adding in small pieces at a time. Run your app each time to see when it bombs out on you. Check your debugger for clues as to when exactly this is happening. These errors should point you in the right direction. Good luck!
Related
I am working with MongoDB and Node.js. I have an array of customers, and I have to create each one in the database.
const promises2 = customers.map(async customer => {
    if (!customer.customerId) {
        const counter = await Counter.findOne({ type: "Customer" });
        console.log({counter});
        const payload = {
            customerId: counter.sequence_value,
        };
        await Customer.create(payload);
        await Counter.findOneAndUpdate({ type: "Customer" }, { $inc: { sequence_value: 1 } });
    }
});
await Promise.all([...promises2]);
The issue is that the counter is not increasing every time; I am getting the same counter in all the created customers. What is the issue here?
The problem seems to be something like this, but I don't have an answer.
The problem is that all the calls overlap. Since the first thing they each do is get the current counter, they all get the same counter, then try to use it. Fundamentally, you don't want to do this:
const counter = await Counter.findOne({ type: "Customer" });
// ...
await Counter.findOneAndUpdate({ type: "Customer" }, { $inc: { sequence_value: 1 } });
...because it creates a race condition: overlapping asynchronous operations can both get the same sequence value and then both issue an update to it.
You want an atomic operation for incrementing and retrieving a new ID. I don't use MongoDB, but I think the findOneAndUpdate operation can do that for you if you add the returnNewDocument option. If so, the minimal change would be to swap over to using that:
const promises2 = customers.map(async customer => {
    if (!customer.customerId) {
        const counter = await Counter.findOneAndUpdate(
            { type: "Customer" },
            { $inc: { sequence_value: 1 } },
            { returnNewDocument: true }
        );
        console.log({counter});
        const payload = {
            customerId: counter.sequence_value,
        };
        await Customer.create(payload);
    }
});
await Promise.all([...promises2]);
...but there's no reason to create an array and then immediately copy it, just use it directly:
await Promise.all(customers.map(async customer => {
    if (!customer.customerId) {
        const counter = await Counter.findOneAndUpdate(
            { type: "Customer" },
            { $inc: { sequence_value: 1 } },
            { returnNewDocument: true }
        );
        console.log({counter});
        const payload = {
            customerId: counter.sequence_value,
        };
        await Customer.create(payload);
    }
}));
The overall operation will fail if anything fails, and only the first failure is reported back to your code (the other operations then continue and succeed or fail as the case may be). If you want to know everything that happened (which is probably useful in this case), you can use allSettled instead of all:
// Gets an array of {status, value/reason} objects
const results = await Promise.allSettled(customers.map(async customer => {
    if (!customer.customerId) {
        const counter = await Counter.findOneAndUpdate(
            { type: "Customer" },
            { $inc: { sequence_value: 1 } },
            { returnNewDocument: true }
        );
        console.log({counter});
        const payload = {
            customerId: counter.sequence_value,
        };
        await Customer.create(payload);
    }
}));
const errors = results.filter(({status}) => status === "rejected").map(({reason}) => reason);
if (errors.length) {
    // Handle/report errors here
}
Promise.allSettled is new in ES2021, but easily polyfilled if needed.
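For instance, a minimal polyfill sketch (assuming a standard Promise implementation is available):

// Minimal Promise.allSettled polyfill sketch
if (!Promise.allSettled) {
    Promise.allSettled = (promises) =>
        Promise.all(Array.from(promises, (p) =>
            Promise.resolve(p).then(
                (value) => ({ status: "fulfilled", value }),
                (reason) => ({ status: "rejected", reason })
            )
        ));
}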
If I'm mistaken about the above use of findOneAndUpdate in some way, I'm sure MongoDB gives you a way to get those IDs without a race condition. But in the worst case, you can pre-allocate the IDs instead, something like this:
// Allocate IDs (in series), only for the customers that need one
const needingIds = customers.filter(customer => !customer.customerId);
const ids = [];
for (const customer of needingIds) {
    const counter = await Counter.findOne({ type: "Customer" });
    await Counter.findOneAndUpdate({ type: "Customer" }, { $inc: { sequence_value: 1 } });
    ids.push(counter.sequence_value);
}
// Create customers (in parallel); iterating the same filtered array keeps
// the ids array aligned with the customers it was allocated for
const results = await Promise.allSettled(needingIds.map(async (customer, index) => {
    const customerId = ids[index];
    try {
        await Customer.create({
            customerId
        });
    } catch (e) {
        // Failed, remove the counter, but without allowing any error doing so to
        // shadow the error we're already handling
        try {
            await Counter.someDeleteMethodHere(/*...customerId...*/);
        } catch (e2) {
            // ...perhaps report `e2` here, but don't shadow `e`
        }
        throw e;
    }
}));
// Get just the errors
const errors = results.filter(({status}) => status === "rejected").map(({reason}) => reason);
if (errors.length) {
    // Handle/report errors here
}
Your map function is not returning a promise.
Try this:
const promises2 = customers.map((customer) => {
    return new Promise(async (resolve) => {
        if (!customer.customerId) {
            const counter = await Counter.findOne({ type: 'Customer' });
            console.log({ counter });
            const payload = {
                customerId: counter.sequence_value,
            };
            await Customer.create(payload);
            await Counter.findOneAndUpdate({ type: 'Customer' }, { $inc: { sequence_value: 1 } });
        }
        resolve();
    });
});
await Promise.all(promises2);
I am new to Node.js and MongoDB. Using promises in a loop is really confusing for a new developer. I need the final array or object, so that then() gives me the final result. Please correct this.
I have a controller function described below.
let League = require('../../model/league.model');
let Leaguetype = require('../../model/leagueType.model');
let Leaguecategories = require('../../model/leagueCategories.model');

let fetchLeague = async function (req, res, next) {
    let body = req.body;
    await mongo.findFromCollection(Leaguetype)
        .then(function(types) {
            return Promise.all(types.map(function(type) {
                return mongo.findFromCollection(Leaguecategories, {"league_type_id": type._id})
                    .then(function(categories) {
                        return Promise.all(categories.map(function(category) {
                            return mongo.findFromCollection(League, {"league_category_id": category._id})
                                .then(function(leagues) {
                                    return Promise.all(leagues.map(function(league) {
                                        return league;
                                    }))
                                    .then(function(league) {
                                        console.log(league);
                                    })
                                })
                        }))
                    });
            }))
        })
        .then(function(final) {
            console.log(final);
        })
        .catch(error => {
            console.log('no', error);
        })
}
The mongo.findFromCollection function looks like this:
findFromCollection = (model_name, query_obj = {}) => {
    return new Promise((resolve, reject) => {
        if (model_name !== undefined && model_name !== '') {
            model_name.find(query_obj, function (e, result) {
                if (!e) {
                    resolve(result)
                } else {
                    reject(e);
                }
            })
        } else {
            reject({ status: 104, message: `Invalid search.` });
        }
    })
}
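As an aside, Mongoose's Model.find already returns a thenable query, so a wrapper like this can usually be reduced to a direct call. A minimal sketch, keeping the original rejection shape (an approximation, not a drop-in):

// Sketch: Mongoose queries are thenable, so the manual Promise wrapper is optional
async function findFromCollection(model, queryObj = {}) {
    if (!model) {
        throw { status: 104, message: 'Invalid search.' }; // same rejection shape as above
    }
    return model.find(queryObj);
}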
And here is my model file:
var mongoose = require('mongoose');

const league_categories = new mongoose.Schema({
    name: {
        type: String,
        required: true
    },
    active: {
        type: String,
        required: true
    },
    create_date: {
        type: Date,
        required: true,
        default: Date.now
    },
    league_type_id: {
        type: String,
        required: 'league_type',
        required: true
    }
})

module.exports = mongoose.model('Leaguecategories', league_categories)
First, I recommend you stop using callbacks wherever you can; they're a bit dated, and the code is much harder to read and maintain.
I rewrote your code a little to look closer to what I'm used to. This doesn't mean this style is better; I just personally think it's easier to understand what's going on.
async function fetchLeague(req, res, next) {
    try {
        // get types
        let types = await Leaguetype.find({});
        // iterate over all types
        let results = await Promise.all(types.map(async (type) => {
            let categories = await Leaguecategories.find({"league_type_id": type._id});
            return Promise.all(categories.map(async (category) => {
                return League.find({"league_category_id": category._id})
            }))
        }));
        // results is in the form of [ [ [ list of leagues ] * per category ] * per type ]
        // if a certain category or type did not have matches, it will be an empty array
        return results;
    } catch (error) {
        console.log('no', error);
        return []
    }
}
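A hypothetical usage sketch (assuming Node 11+ for Array.prototype.flat; the depth of 2 matches the nesting described in the comment above):

// Hypothetical usage: collapse the per-type / per-category nesting into one list
fetchLeague(req, res, next).then((results) => {
    const allLeagues = results.flat(2); // two nesting levels: type, then category
    console.log(`found ${allLeagues.length} leagues`);
});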
I have a code segment like the one below running in Node.js, and I find it always goes to the else condition, even though masterData is not null.
getOperationDetails(req, res) {
    let sql = 'select a.*, b.*';
    sql += ` from ${paymentSheet} a left join ${paymentHisSheet} b on a.id= b.source_id `;
    sql += ' where a.id=? ';
    func.connPool(sql, id, (err, rows, field) => {
        if (err) {
            res.json({ code: 400, message: err })
        } else {
            let masterData = [];
            let details = rows.map((row, idx) => {
                if (idx === 0) {
                    masterData.push({
                        id: row.id,
                        name: row.name
                    });
                }
                return {
                    operator: row.operator_info,
                    comments: row.cmt,
                    status: row.sta
                }
            })
            if (masterData.length > 0) {
                masterData[0].details = details;
            } else {
                console.log(sql);
                console.log(id);
                console.log('=======================');
                console.log(masterData);
            }
            res.json({ code: 200, message: 'ok', data: masterData })
        }
    })
}
For example, the console shows the output below. Obviously masterData has a value, which means the 'if' condition ran before map(). Do I have to use async to wait for map() to finish handling the data?
allConnections:2
select a.*, b.* from payment a left join history b on a.id= b.source_id where a.id=?
83e588cd-9b4b-4592-ac7f-529bfaa9b231
=======================
allConnections:2
allConnections:2
[
  {
    id: '83e588cd-9b4b-4592-ac7f-529bfaa9b231',
    name: 'Jeff'
  }
]
My analysis:
The rows from the database should look like below:
83e588cd-9b4b-4592-ac7f-529bfaa9b231', 'Jeff', 'Operator Peter', 'OK', 0
83e588cd-9b4b-4592-ac7f-529bfaa9b231', 'Jeff', 'Operator Mary', 'NO', 1
83e588cd-9b4b-4592-ac7f-529bfaa9b231', 'Jeff', 'Operator Jet', 'OK', 2
or like below, meaning there are no details:
83e588cd-9b4b-4592-ac7f-529bfaa9b231', 'Jeff', null, null, null
That is why I use masterData to separate them. I think push() should not be taken out of the map(), because rows may return nothing. Could it be that map() is over while push() is still running?
==== P.S. func.connPool ====
let mysql = require('mysql');
let db = require('../configs/db');
let pool = mysql.createPool(db);

module.exports = {
    connPool(sql, val, cb) {
        pool.getConnection((err, conn) => {
            if (err) {
                console.log('Connection Error:' + err);
                cb(err, null, null);
            } else {
                console.log('allConnections:' + pool._allConnections.length);
                let q = conn.query(sql, val, (err, rows, fields) => {
                    pool.releaseConnection(conn);
                    if (err) {
                        console.log('Query:' + sql + ' error:' + err);
                    }
                    cb(err, rows, fields);
                });
            }
        });
    },
};
What I suspect is that the push operation is somehow delayed because of some code that is not shown here (I am not certain yet).
I ran the following code many times and still could not reproduce your problem.
var rows = [
    {
        id: "123",
        name: "test",
    },
    {
        id: "123",
        name: "test",
    },
    {
        id: "123",
        name: "test",
    },
]
let masterData = [];
let details = rows.map((row, idx) => {
    if (idx === 0) {
        masterData.push({
            id: row.id,
            name: row.name
        });
    }
    return {
        id: row.id,
        name: row.name,
    }
})
if (masterData.length > 0) {
    console.log("in");
} else {
    console.log(masterData);
    console.log('=======================');
}
Could you check whether it goes to the else branch for this code?
From this piece of code, you are pushing only the first row.id and row.name to masterData (that is what the if conditional with idx === 0 specifies).
If that's the case, you don't need this push inside the map. You can take it out of the map and leave the iterator to create only the details array.
You can go with:
let details = rows.map(row => ({
    operator: row.operator_info,
    comments: row.cmt,
    status: row.sta
}));

let masterData = [{ id: rows[0].id, name: rows[0].name, details }]
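One caveat: if the query can return zero rows, reading rows[0].id would throw, so a guard along these lines may be needed:

// Guard against an empty result set before reading rows[0]
let masterData = rows.length > 0
    ? [{ id: rows[0].id, name: rows[0].name, details }]
    : [];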
I have some mock data for the 2 URLs below:
1. Get the list of users from 'https://myapp.com/authors'.
2. Get the list of Books from 'https://myapp.com/books'.
Now my task is to sort the Books by name and write the sorted list to the file mysortedbooks.json as JSON
Then I have to create an array of authors with books property that has all the books of that author.
If the author has no books then this array should be empty. Sorting is not needed for this case, and data should be stored in file authorBooks.json as JSON.
Now I have to return a promise that resolves when the above steps are complete. For example, I should return the final saveToFile call in the code below.
const fs = require('fs');

function getFromURL(url) {
    switch (url) {
        case 'https://myapp.com/authors':
            return Promise.resolve([
                { name: "Chinua Achebe", id: "1" },
                { name: "Hans Christian Andersen", id: "2" },
                { name: "Dante Alighieri", id: "3" },
            ]);
        case 'https://myapp.com/books':
            return Promise.resolve([
                { name: "Things Fall Apart", authorId: "1" },
                { name: "The Epic Of Gilgamesh", authorId: "1" },
                { name: "Fairy tales", authorId: "2" },
                { name: "The Divine Comedy", authorId: "2" },
                { name: "One Thousand and One Nights", authorId: "1" },
                { name: "Pride and Prejudice", authorId: "2" },
            ]);
    }
}

const outFile = fs.createWriteStream('...out-put-path...');

function saveToFile(fileName, data) {
    outFile.write(`${fileName}: ${data}\n`);
    return Promise.resolve();
}

function processData() {
    const authors = getFromURL('https://myapp.com/authors').then(author => {
        return authors;
    });
    const books = getFromURL('https://myapp.com/authors').then(books => {
        return books.sort();
    });
    return saveToFile('mysortedbooks.json', JSON.stringify(books)).then(() => {
        const authorAndBooks = authors.map(author => {
            var jsonData = {};
            jsonData['name'] = author.name;
            jsonData['books'] = [];
            for (var i = 0; i < books.length; i++) {
                if (authod.id == books[i].authorId) {
                    jsonData['books'].push(books[i].name);
                }
            }
        });
        saveToFile('authorBooks.json', authorAndBooks);
    });
}

processData().then(() => outFile.end());
The main logic I have to implement is in the processData method.
I tried adding code to solve the requirement, but I got stuck on how to return a promise after all the operations, and also on how to build my authorAndBooks JSON content.
Please help me with this.
const authors = getFromURL('https://myapp.com/authors').then(author => {
    return author;
});
const books = getFromURL('https://myapp.com/books').then(books => {
    return books.sort();
});
// authors and books are both promises here, so await them
return Promise.all([authors, books]).then(function(results) {
    const authorList = results[0];
    const bookList = results[1];
    return saveToFile(...);
});
Alternatively, declare your function async and do:
const authors = await getFromURL('https://myapp.com/authors').then(author => {
    return author;
});
const books = await getFromURL('https://myapp.com/books').then(books => {
    return books.sort();
});
return await saveToFile(...);
Refactored code with promise chaining, creating multiple file streams:
const fs = require('fs');

function getFromURL(url) {
    switch (url) {
        case 'https://myapp.com/authors':
            return Promise.resolve([
                { name: "Chinua Achebe", id: "1" },
                { name: "Hans Christian Andersen", id: "2" },
                { name: "Dante Alighieri", id: "3" },
            ]);
        case 'https://myapp.com/books':
            return Promise.resolve([
                { name: "Things Fall Apart", authorId: "1" },
                { name: "The Epic Of Gilgamesh", authorId: "1" },
                { name: "Fairy tales", authorId: "2" },
                { name: "The Divine Comedy", authorId: "2" },
                { name: "One Thousand and One Nights", authorId: "1" },
                { name: "Pride and Prejudice", authorId: "2" },
            ]);
    }
}

function saveToFile(fileName, data) {
    const outFile = fs.createWriteStream(`/var/${fileName}`);
    outFile.write(data);
    return Promise.resolve(outFile);
}
function authorBookMapping(data) {
    let [authors, books] = data;
    // build one entry per author and collect them; a single shared jsonData
    // object would be overwritten on every iteration
    let authorAndBooks = authors.map(author => {
        let jsonData = {};
        jsonData['name'] = author.name;
        jsonData['books'] = [];
        for (var i = 0; i < books.length; i++) {
            if (author.id == books[i].authorId) {
                jsonData['books'].push(books[i].name);
            }
        }
        return jsonData;
    });
    return {
        books: books,
        authorAndBooks: authorAndBooks
    };
}
function writeFile(data) {
    if (data) {
        const {books, authorAndBooks} = data;
        const book = saveToFile('mysortedbooks.json', JSON.stringify(books));
        const author = saveToFile('authorBooks.json', JSON.stringify(authorAndBooks));
        return Promise.all([book, author]);
    }
}
function processData() {
    const authors = getFromURL('https://myapp.com/authors');
    const books = getFromURL('https://myapp.com/books');
    return Promise.all([authors, books])
        .then(authorBookMapping)
        .then(writeFile)
}
processData().then((stream) => {
    for (let s in stream) {
        stream[s].close();
    }
}).catch((err) => {
    console.log("Err :", err);
});
There are a lot of mistakes in your code. I will try to explain them one by one; read the comments between the code blocks. I would recommend you read some basics of file operations and promises. The problem is in your saveToFile method and in how you are chaining promises in the processData method.
Change your saveToFile function as follows. You can also use promise-supporting fs libraries like fs-extra, but I'm not sure you want to use an external library.
const path = require('path');
const basePath = '.'; // whatever the base path of your directories is

function saveToFile(fileName, data) {
    // fs.writeFile uses a callback; there are many ways to convert a callback
    // method to support promises, and this is a simple one that doesn't
    // require importing any libraries
    return new Promise((resolve, reject) => {
        let filePath = path.join(basePath, fileName);
        return fs.writeFile(filePath, data, (err, data) => {
            if (err) reject(err);
            else resolve();
        });
    })
}
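As a side note, if your Node version ships the promise-based fs API (Node 10+), the manual wrapper is not needed at all; a hedged alternative sketch:

// Alternative sketch using Node's built-in promise-based fs API (Node 10+)
const fsp = require('fs').promises;

function saveToFile(fileName, data) {
    return fsp.writeFile(path.join(basePath, fileName), data);
}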
Now change your processData function to use Promise.all and to sort the books the right way:
function processData() {
    let books, authors;
    // Promise.all can be used when operations are not interdependent;
    // it fetches results faster since the requests run in parallel
    return Promise.all([
        getFromURL('https://myapp.com/books'),
        getFromURL('https://myapp.com/authors')
    ]).then(data => {
        books = data[0];
        authors = data[1];
        let authorAndBooks = authors.map(author => {
            let jsonData = {};
            jsonData['name'] = author.name;
            jsonData['books'] = [];
            for (var i = 0; i < books.length; i++) {
                if (author.id == books[i].authorId) {
                    jsonData['books'].push(books[i].name);
                }
            }
            return jsonData;
        });
        // you have to use a comparator to sort objects; the one below
        // sorts the books based on their names
        books.sort((first, second) => { return first.name > second.name ? 1 : -1 });
        return Promise.all([
            saveToFile("mysortedbooks.json", JSON.stringify(books)),
            saveToFile("authorBooks.json", JSON.stringify(authorAndBooks))
        ]);
    }).then(data => {
        console.log('All operations complete');
    });
}
processData();
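A small aside on the comparator inside processData: for locale-aware ordering of the book names, localeCompare is a common alternative:

// Locale-aware alternative to the greater-than comparator used above
books.sort((a, b) => a.name.localeCompare(b.name));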
Have you considered looking at this in a different way? If this is going to be the case for other APIs, I'd think about aggregating those APIs in an aggregator service, or in the API itself if you can.
It is always better to receive all the data you need at once rather than in multiple calls; otherwise you incur latency and complexity.
I have a simple collection in MongoDB, and I use Mongoose.
I have a users model with one field of type object, and I want to change this object dynamically.
But this code doesn't work; I have tried findByIdAndUpdate(), findById(), findOne(), and findOneAndUpdate().
const UsersSchema = mongoose.Schema({
    likes: {}
},
{ collection: 'users' });

const Users = mongoose.model('Users', UsersSchema);

const id = "5b4c540f14f353a4b9875af4";
const thems = ['foo', 'bar'];

Users.findById(id, (err, res) => {
    thems.map(item => {
        if (res.like[item]) {
            res.like[item] = res.like[item] + 1;
        } else {
            res.like[item] = 1;
        }
    });
    res.save();
});
I believe that, to solve this problem, you need to add more fields to your schema.
I created an example with this data:
const UsersSchema = new mongoose.Schema({
    likes: [
        {
            thema: {
                type: String
            },
            likes_amount: {
                type: Number
            },
            _id: false
        }
    ]
});

module.exports = mongoose.model('Users', UsersSchema);
I added one user:
var newUser = new UserModel({
    likes: [{
        thema: 'foo',
        likes_amount: 1
    }]
});
newUser.save();
Here is the code that increments the likes per thema:
const thems = ['foo', 'bar'];
const userId = "5b4d0b1a1ce6ac3153850b6a";

UserModel.findOne({_id: userId})
    .then((result) => {
        var userThemas = result.likes.map(item => {
            return item.thema;
        });
        for (var i = 0; i < thems.length; i++) {
            // if the thema exists, increment it by 1 like
            if (userThemas.includes(thems[i])) {
                UserModel.update({_id: result._id, "likes.thema": thems[i]}, {$inc: {"likes.$.likes_amount": 1}})
                    .then((result) => {
                        console.log(result);
                    }).catch((err) => {
                        console.log(err)
                    });
            } else {
                // if it doesn't exist, create the thema with 1 like
                UserModel.update({_id: result._id},
                    {
                        $addToSet: {
                            likes: {
                                $each: [{thema: thems[i], likes_amount: 1}]
                            }
                        }
                    })
                    .then((result) => {
                        console.log(result);
                    }).catch((err) => {
                        console.log(err)
                    });
            }
        }
    }).catch((err) => {
        console.log(err)
    });
Database result of this increment:
I hope that it can help you.
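As a final aside on the original Mixed-type schema (likes: {}): Mongoose does not change-track paths inside Mixed fields, so in-place mutations are invisible to save() unless the path is flagged. A hedged sketch, assuming the field really is named likes (the question's code reads res.like):

// Sketch: with a Mixed-type field, flag the path as modified before saving
Users.findById(id).then((user) => {
    thems.forEach((item) => {
        user.likes[item] = (user.likes[item] || 0) + 1;
    });
    user.markModified('likes'); // Mixed paths are not tracked automatically
    return user.save();
});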