TypeError: usert.addItem is not a function - javascript

I'm trying to make a Discord bot using discord.js, with sequelize and sqlite to create a database for storing data. A custom function doesn't seem to work: the terminal reports it is not a function even though it is defined. There could be a really obvious solution to this, but I'm very much an amateur; I get errors often and usually fix them, but this time I can't even determine the root of the problem.
This problem also applies to other custom functions.
The most confusing bit is that, in another folder for another bot entirely, with very similar code and essentially the same custom functions, it works! But for some reason, it doesn't work here.
// Defining these
const { Users, ItemDB } = require('./dbObjects');

// The command that uses the function. It is worth noting that it finds the item and user
// successfully, proving that the problem is in usert.addItem
const item = await ItemDB.findByPk(1);
const usert = Users.findByPk(message.author.id);
usert.addItem(item);

// The addItem function defined, in the dbObjects file
Users.prototype.addItem = async function(item) {
  const useritem = await UserItems.findOne({
    where: { user_id: this.user_id, item_id: item.id },
  });
  if (useritem) {
    useritem.amount += 1;
    return useritem.save();
  }
  return UserItems.create({ user_id: this.user_id, item_id: item.id, amount: 1 });
};
The expected result is successfully adding to the database, but instead the terminal returns:
(node:21400) UnhandledPromiseRejectionWarning: TypeError: usert.addItem is not a function
Adding await before Users.findByPk returns as random.

You need to await Users.findByPk(message.author.id);
const { Users, ItemDB } = require('./dbObjects');

// The command that uses the function
const item = await ItemDB.findByPk(1);
const usert = await Users.findByPk(message.author.id);
usert.addItem(item);

// The addItem function defined, in the dbObjects file
Users.prototype.addItem = async function(item) {
  const useritem = await UserItems.findOne({
    where: { user_id: this.user_id, item_id: item.id },
  });
  if (useritem) {
    useritem.amount += 1;
    return useritem.save();
  }
  return UserItems.create({ user_id: this.user_id, item_id: item.id, amount: 1 });
};
Since Users.findByPk(message.author.id) returns a promise, execution moves on to the next statement before the query has resolved, so the variable usert holds a pending Promise rather than a User instance, which is why usert.addItem() is not a function.
You need to change const usert = Users.findByPk(message.author.id) to the following so that usert is fully initialized; the addItem() function will then be available:
const usert = await Users.findByPk(message.author.id);
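To see the difference concretely, here is a minimal sketch (Users, message, and item come from the question; the logged types assume Sequelize's promise-returning findByPk):
// Without await, the variable holds a pending Promise, which has no addItem method:
const pending = Users.findByPk(message.author.id);
console.log(typeof pending.addItem); // "undefined"

// With await, the variable holds the resolved model instance, so prototype methods work:
const resolved = await Users.findByPk(message.author.id);
console.log(typeof resolved.addItem); // "function"
await resolved.addItem(item);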


How to get a certain field value out of an object in an array in JavaScript

I'm learning Firebase Cloud Functions with JavaScript.
I get a QuerySnapshot back for a collection of documents; each document holds an ID field and a message field.
Where I'm stuck now is that every time I loop through the collection, I want to be able to take just the ID field out of each object and save it.
I've tried all the ways I can think of that come up on Google and Stack Overflow, and none are working for me, so I'm obviously doing something wrong.
I'm totally new to JavaScript, so this may be an easy fix if anyone has any information.
This is the code in Visual Studio that I'm using, which, as far as I can see, works fine to get to the collection I need:
// onDelete is my trigger which would then go and fetch the collection that I want
exports.onDelet = functions.firestore
  .document('recentMsg/currentuid/touid/{documentId}')
  .onDelete(async (snap, context) => {
    const data = snap.data();
    const contex = context.params;
    // once I get the information from onDelete the following code starts
    await admin.firestore().collection(`messages/currentuid/${contex.documentId}`)
      .get()
      .then((snapshot) => {
        // this array will hold all documents from the collection
        const results = [];
        const data = snapshot.docs.map((doc) => ({
          id: doc.id,
          ...doc.data(),
        }));
        results.push(data);
        console.log(results);
        // This is one of the ways I've tried
        // results.forEach((doc) => {
        //   console.log(doc.id)
        //   this is a print out in the terminal
        //   > undefined
        // });
      });
  });
Below is a printout I get in the terminal, which shows all the information it holds, which is great.
But really all I want is an array that holds every id.
If there were just one value in the array, I know this would not be a problem, but because there is an object with multiple values, that's the issue.
i functions: Beginning execution of "onDelet"
> [
> [
> { id: '28ZyROMQkzEBBDwTm6yV', msg: 'sam 2' },
> { id: 'ixqgYqmwlZJfb5D9h8WV', msg: 'sam 3' },
> { id: 'lLyNDLLIHKc8hCnV0Cgc', msg: 'sam 1' }
> ]
> ]
i functions: Finished "onDelet" in ~1s
Once again, apologies if this is a dumb question; I'm a newbie.
With await admin.firestore().collection(`messages/currentuid/${contex.documentId}`).get() you get a QuerySnapshot, which supports forEach but not map. Just write your code like this:
// onDelete is my trigger which would then go and fetch the collection that I want
exports.onDelet = functions.firestore
  .document('recentMsg/currentuid/touid/{documentId}')
  .onDelete(async (snap, context) => {
    const data = snap.data();
    const contex = context.params;
    // once I get the information from onDelete the following code starts
    await admin.firestore().collection(`messages/currentuid/${contex.documentId}`)
      .get()
      .then((snapshot) => {
        // this array will hold all documents from the collection
        const results = [];
        snapshot.forEach((doc) => {
          results.push({ id: doc.id, ...doc.data() });
        });
        console.log(results);
      });
  });
I assume that messages/currentuid/${contex.documentId} is a collection and not a single document.
You can read more about it here.
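If the goal is really just an array of every id, a shorter variant is possible (a sketch only; it relies on QuerySnapshot.forEach and on snapshot.docs being a plain array):
const ids = [];
snapshot.forEach((doc) => ids.push(doc.id));
// or, equivalently, since snapshot.docs is an ordinary array and does support map:
const idsViaDocs = snapshot.docs.map((doc) => doc.id);
console.log(ids); // e.g. ['28ZyROMQkzEBBDwTm6yV', 'ixqgYqmwlZJfb5D9h8WV', 'lLyNDLLIHKc8hCnV0Cgc']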

Loop from multiple airtable bases in a Next JS page

I think this is more of a general async/await loop question, but I'm trying to do it within the bounds of an Airtable API request and within getStaticProps of Next.js, so I thought that was important to share.
What I want to do is create an array of base IDs like ["appBaseId01", "appBaseId02", "appBaseId03"] and output the contents of a page. I have it working with one base, but am failing at getting it to work for multiple.
Below is the code for one static base; I'd appreciate help with how to loop over these. My gut says I need to await each one individually and then pop them into an array, but I'm not sure.
const records = await airtable
  .base("appBaseId01")("Case Overview Information")
  .select()
  .firstPage();

const details = records.map((detail) => {
  return {
    city: detail.get("City") || null,
    name: detail.get("Name") || null,
    state: detail.get("State") || null,
  };
});

return {
  props: {
    details,
  },
};
EDIT
I've gotten closer to emulating it, but haven't figured out how to loop the initial requests yet.
This yields me an array of arrays that I can at least work with, but it's janky and unsustainable.
export async function getStaticProps() {
  const caseOneRecords = await setOverviewBase("appBaseId01")
    .select({})
    .firstPage();
  const caseTwoRecords = await setOverviewBase("appBaseId02")
    .select({})
    .firstPage();

  const cases = [];
  cases.push(minifyOverviewRecords(caseOneRecords));
  cases.push(minifyOverviewRecords(caseTwoRecords));

  return {
    props: {
      cases,
    },
  };
}
setOverviewBase is a helper that establishes the Airtable connection and sets the table name.
const setOverviewBase = (baseId) =>
  base.base(baseId)("Case Overview Information");
You can map the array of base IDs and await with Promise.all. Assuming you have getFirstPage and minifyOverviewRecords defined as below, you could do the following:
const getFirstPage = (baseId) =>
  airtable
    .base(baseId)("Case Overview Information")
    .select({})
    .firstPage();

const minifyOverviewRecords = (records) =>
  records.map((detail) => {
    return {
      city: detail.get("City") || null,
      name: detail.get("Name") || null,
      state: detail.get("State") || null,
    };
  });

export async function getStaticProps() {
  const cases = await Promise.all(
    ["appBaseId01", "appBaseId02", "appBaseId03"].map(async (baseId) => {
      const firstPage = await getFirstPage(baseId);
      return minifyOverviewRecords(firstPage);
    })
  );
  return {
    props: {
      cases
    }
  };
}
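For completeness, the page component receiving those props could render the nested arrays roughly like this (a sketch; the component name and markup are illustrative, and only the city/name/state fields come from the question):
export default function CasesPage({ cases }) {
  return (
    <div>
      {cases.map((details, i) => (
        <ul key={i}>
          {details.map((detail, j) => (
            <li key={j}>
              {detail.name} ({detail.city}, {detail.state})
            </li>
          ))}
        </ul>
      ))}
    </div>
  );
}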

How to use dataloader?

I'm trying to figure this out.
I want to get all my users from my database, cache them, and then when making a new request I want to get both the ones I've cached and any new ones that have been created.
So far:
const batchUsers = async ({ user }) => {
  const users = await user.findAll({});
  return users;
};

const apolloServer = new ApolloServer({
  schema,
  playground: true,
  context: {
    userLoader: new DataLoader(() => batchUsers(db)), // not sending keys since I'm after all users
  },
});
my resolver:
users: async (obj, args, context, info) => {
  return context.userLoader.load();
}
The load method requires a parameter, but in this case I don't want a specific user, I want all of them.
I don't understand how to implement this; can someone please explain?
If you're trying to just load all records, then there's not much of a point in utilizing DataLoader to begin with. The purpose behind DataLoader is to batch multiple calls like load(7) and load(22) into a single call that's then executed against your data source. If you need to get all users, then you should just call user.findAll directly.
Also, if you do end up using DataLoader, make sure you pass in a function, not an object, as your context. The function will be run on each request, which ensures you're using a fresh instance of DataLoader instead of one with a stale cache.
context: () => ({
  userLoader: new DataLoader(async (ids) => {
    const users = await User.findAll({
      where: { id: ids }
    })
    // Note that we need to map over the original ids instead of
    // just returning the results of User.findAll because the
    // length of the returned array needs to match the length of the ids
    return ids.map(id => users.find(user => user.id === id) || null)
  }),
}),
Note that you could also return an instance of an error instead of null inside the array if you want load to reject.
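For example (a sketch; the error message is illustrative):
context: () => ({
  userLoader: new DataLoader(async (ids) => {
    const users = await User.findAll({ where: { id: ids } });
    // A per-key Error makes load(id) reject for that key only, while the other keys still resolve.
    return ids.map(id => users.find(user => user.id === id) || new Error(`No user with id ${id}`));
  }),
}),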
Took me a while but I got this working:
const batchUsers = async (keys, { user }) => {
  const users = await user.findAll({
    raw: true,
    where: {
      Id: {
        // @ts-ignore
        // eslint-disable-next-line no-undef
        [op.in]: keys,
      },
    },
  });
  const gs = _.groupBy(users, 'Id');
  return keys.map(k => gs[k] || []);
};

const apolloServer = new ApolloServer({
  schema,
  playground: true,
  context: () => ({
    userLoader: new DataLoader(keys => batchUsers(keys, db)),
  }),
});
resolver:
user: {
  myUsers: ({ Id }, args, { userLoader }) => {
    return userLoader.load(Id);
  },
},
playground:
{
  users {
    Id
    myUsers {
      Id
    }
  }
}
playground explained:
users basically fetches all users, and then myUsers does the same thing by inheriting the Id from the first call.
I think I chose a horrible example here, since I did not see any gains in performance from this. I did see, however, that the query turned into:
SELECT ... FROM User WHERE Id IN (...)

How to append to an array in IndexedDB filtered by ID?

Init code:
let dbPromise = null;
const OBJECT_STORE_NAME = 'pages';
const DB_NAME = 'tracking-log';
To initiate an ObjectStore:
dbPromise = idb.open(DB_NAME, 3, upgradeDB => {
  upgradeDB.createObjectStore(OBJECT_STORE_NAME, {
    autoIncrement: true,
    keypath: 'id'
  });
});
This is how I generate a blank record in the IndexedDB:
const tx = db.transaction(OBJECT_STORE_NAME, 'readwrite');
tx.objectStore(OBJECT_STORE_NAME).put(
{ id: newBucketID, data: [] });
Now, at a later point, I have some elements that I want to append to the data array for a particular id.
This is how I tried doing it:
const tx = db.transaction(OBJECT_STORE_NAME, 'readwrite');
tx.objectStore(OBJECT_STORE_NAME).put(
{ id: localStorage.getItem("currentBucket"), data: item }
);
Schema
{
data: Array
}
Every item has a unique key generated and provided by me.
However, this doesn't work and returns an error: "Key already exists in the object store."
So, how can I append a value to a field inside an IDB object?
Not sure about the error, but regardless of that, the basic way of adding an item would be something like this:
function addItem(db, bucketId, item) {
  return new Promise(addItemExecutor.bind(null, db, bucketId, item));
}

function addItemExecutor(db, bucketId, item, resolve, reject) {
  // Start a single writable transaction that we will use for two requests. One to
  // find the corresponding bucket, and one to update it.
  const tx = db.transaction(OBJECT_STORE_NAME, 'readwrite');
  // If all requests completed without error, we are done
  tx.oncomplete = resolve;
  // If any request fails, the operation fails
  tx.onerror = event => reject(event.target.error);
  const store = tx.objectStore(OBJECT_STORE_NAME);
  // Go find the corresponding bucket object to update
  const findRequest = store.get(bucketId);
  findRequest.onsuccess = findRequestOnsuccess.bind(findRequest, bucketId, item, reject);
}

// React to the resolution of the get request
function findRequestOnsuccess(bucketId, item, reject, event) {
  const bucket = event.target.result;
  // If no bucket exists for that id then fail
  if (!bucket) {
    const error = new Error('No bucket found for id ' + bucketId);
    reject(error);
    return;
  }
  // Lazily init the data array property
  if (!bucket.data) {
    bucket.data = [];
  }
  // Add our item to the data array
  bucket.data.push(item);
  // Save the bucket object back into the bucket object store, completely replacing
  // the bucket that was there before.
  const bucketStore = event.target.source;
  bucketStore.put(bucket);
}

async function someCallingCodeExampleAvoidingTopLevelAwait() {
  const bucketId = localStorage.currentBucket;
  const item = { foo: bar };
  const db = evilUnreliableGlobalDbVariableFromSomewhereMagicalForeverOpenAssumeInitialized;
  try {
    await addItem(db, bucketId, item);
  } catch (error) {
    console.debug(error);
  }
  // Leave the database connection open for page lifetime
}
Without a reduced example it's difficult to figure out what's going on. The best way to get help is to create a reduced example of the problem, as in, the smallest amount of code needed to recreate the issue you're seeing, then put it on something like jsbin.com or glitch.com so folks only have to click a link to see the error you're seeing.
I wasn't able to recreate the error you're seeing. You have keypath when it should be keyPath, but I don't think that creates the error you're seeing.
Anyway, here's how to modify a record in IDB:
async function main() {
  // Set up the database.
  const OBJECT_STORE_NAME = 'pages';
  const DB_NAME = 'tracking-log';
  const db = await idb.open(DB_NAME, 1, upgradeDB => {
    upgradeDB.createObjectStore(OBJECT_STORE_NAME, {
      autoIncrement: true,
      keyPath: 'id'
    });
  });

  // The OP didn't make it clear what this value was, so I'll guess.
  const newBucketID = 1;

  {
    // Create the record.
    const tx = db.transaction(OBJECT_STORE_NAME, 'readwrite');
    tx.objectStore(OBJECT_STORE_NAME).put({ id: newBucketID, data: ['first value'] });
  }
  {
    const tx = db.transaction(OBJECT_STORE_NAME, 'readwrite');
    // Get the record.
    const record = await tx.objectStore(OBJECT_STORE_NAME).get(newBucketID);
    // Modify it.
    record.data.push('second value');
    // Put the modified record back.
    tx.objectStore(OBJECT_STORE_NAME).put(record);
  }
  {
    // Read the value to confirm everything worked.
    const tx = db.transaction(OBJECT_STORE_NAME);
    const value = await tx.objectStore(OBJECT_STORE_NAME).get(newBucketID);
    console.log(value);
  }
}
main();
And here's that example running: https://jsbin.com/dineguq/edit?js,console

Batch update in knex

I'd like to perform a batch update using Knex.js
For example:
'UPDATE foo SET [theValues] WHERE idFoo = 1'
'UPDATE foo SET [theValues] WHERE idFoo = 2'
with values:
{ name: "FooName1", checked: true } // to `idFoo = 1`
{ name: "FooName2", checked: false } // to `idFoo = 2`
I was previously using node-mysql, which allowed multiple statements. With that, I simply built a multiple-statement query string and sent it over the wire in a single run.
I'm not sure how to achieve the same with Knex. I can see batchInsert as an API method I can use, but nothing as far as batchUpdate is concerned.
Note:
I can do an async iteration and update each row separately. That's bad because it means lots of roundtrips between the server and the DB.
I can use Knex's raw() and probably do something similar to what I did with node-mysql. However, that defeats the whole purpose of Knex being a DB abstraction layer (it introduces strong DB coupling).
So I'd like to do this using something "knex-y".
Any ideas welcome.
I needed to perform a batch update inside a transaction (I didn't want partial updates in case something went wrong).
I resolved it the following way:
// I wrap knex as 'connection'
return connection.transaction(trx => {
  const queries = [];
  users.forEach(user => {
    const query = connection('users')
      .where('id', user.id)
      .update({
        lastActivity: user.lastActivity,
        points: user.points,
      })
      .transacting(trx); // This makes every update be in the same transaction
    queries.push(query);
  });

  Promise.all(queries) // Once every query is written
    .then(trx.commit) // We try to execute all of them
    .catch(trx.rollback); // And rollback in case any of them goes wrong
});
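Calling code would look something along these lines (a sketch; updateUsers is a hypothetical wrapper around the transaction above, and users is assumed to hold objects with id, lastActivity, and points, matching the fields updated above):
const users = [
  { id: 1, lastActivity: new Date(), points: 42 },
  { id: 2, lastActivity: new Date(), points: 17 },
];
updateUsers(users) // hypothetical wrapper around the connection.transaction block above
  .then(() => console.log('all rows updated'))
  .catch(err => console.error('batch failed and was rolled back', err));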
Assuming you have a collection of valid keys/values for the given table:
// abstract transactional batch update
function batchUpdate(table, collection) {
  return knex.transaction(trx => {
    const queries = collection.map(tuple =>
      knex(table)
        .where('id', tuple.id)
        .update(tuple)
        .transacting(trx)
    );
    return Promise.all(queries)
      .then(trx.commit)
      .catch(trx.rollback);
  });
}
To call it
batchUpdate('user', [...]);
Are you unfortunately subject to non-conventional column names? No worries, I got you fam:
function batchUpdate(options, collection) {
  return knex.transaction(trx => {
    const queries = collection.map(tuple =>
      knex(options.table)
        .where(options.column, tuple[options.column])
        .update(tuple)
        .transacting(trx)
    );
    return Promise.all(queries)
      .then(trx.commit)
      .catch(trx.rollback);
  });
}
To call it
batchUpdate({ table: 'user', column: 'user_id' }, [...]);
Modern Syntax Version:
const batchUpdate = async (options, collection) => {
  const { table, column } = options;
  const trx = await knex.transaction();
  try {
    await Promise.all(collection.map(tuple =>
      knex(table)
        .where(column, tuple[column])
        .update(tuple)
        .transacting(trx)
    ));
    await trx.commit();
  } catch (error) {
    await trx.rollback();
  }
};
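Usage would then be along these lines (a sketch; the table and column names are only examples, reusing the values from the question):
await batchUpdate(
  { table: 'user', column: 'user_id' },
  [
    { user_id: 1, name: 'FooName1', checked: true },
    { user_id: 2, name: 'FooName2', checked: false },
  ]
);
Note that this version swallows the error after rolling back; rethrowing inside the catch block would let the caller see the failure.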
You have a good idea of the pros and cons of each approach. I would recommend a raw query that bulk updates over several async updates. Yes, you can run them in parallel, but your bottleneck becomes the time it takes for the db to run each update. Details can be found here.
Below is an example of a batch upsert using knex.raw. Assume that records is an array of objects (one object for each row we want to update) whose property names line up with the columns in the database table you want to update:
var knex = require('knex'),
    _ = require('underscore');

function bulkUpdate(records) {
  var updateQuery = [
        'INSERT INTO mytable (primaryKeyCol, col2, colN) VALUES',
        _.map(records, () => '(?)').join(','),
        'ON DUPLICATE KEY UPDATE',
        'col2 = VALUES(col2),',
        'colN = VALUES(colN)'
      ].join(' '),
      vals = [];

  _(records).map(record => {
    vals.push(_(record).values());
  });

  return knex.raw(updateQuery, vals);
}
This answer does a great job explaining the runtime relationship between the two approaches.
Edit:
It was requested that I show what records would look like in this example.
var records = [
  { primaryKeyCol: 123, col2: 'foo', colN: 'bar' },
  { /* some other record, same props */ }
];
Please note that if your record has additional properties beyond the ones you specified in the query, you cannot do:
_(records).map(record => {
  vals.push(_(record).values());
});
because you will hand too many values to the query per record, and knex will fail to match the property values of each record with the ? characters in the query. Instead, you will need to explicitly push the values on each record that you want to insert into an array, like so:
// assume a record has an additional property `type` that you don't want to
// insert into the database
// example: { primaryKeyCol: 123, col2: 'foo', colN: 'bar', type: 'baz' }
_(records).map(record => {
  vals.push(record.primaryKeyCol);
  vals.push(record.col2);
  vals.push(record.colN);
});
There are less repetitive ways of doing the above explicit references, but this is just an example. Hope this helps!
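For the two sample records above, the statement that knex.raw ends up executing looks roughly like this (a sketch; values are shown inline only for clarity since knex binds them, the second tuple is left as a placeholder, and ON DUPLICATE KEY UPDATE assumes MySQL):
INSERT INTO mytable (primaryKeyCol, col2, colN)
VALUES (123, 'foo', 'bar'), (...)
ON DUPLICATE KEY UPDATE col2 = VALUES(col2), colN = VALUES(colN)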
The solution works great for me! I just include an ID parameter to make it dynamic across tables with custom ID tags. Chenhai, here's my snippet including a way to return a single array of ID values for the transaction:
function batchUpdate(table, id, collection) {
  return knex.transaction((trx) => {
    const queries = collection.map(async (tuple) => {
      const [tupleId] = await knex(table)
        .where(`${id}`, tuple[id])
        .update(tuple)
        .transacting(trx)
        .returning(id);
      return tupleId;
    });
    return Promise.all(queries).then(trx.commit).catch(trx.rollback);
  });
}
You can use
response = await batchUpdate("table_name", "custom_table_id", [array of rows to update])
to get the returned array of IDs.
The update can be done in batches, i.e. 1000 rows per batch.
As long as it is done in batches, bluebird's map can be used to control concurrency.
For more information on bluebird map: http://bluebirdjs.com/docs/api/promise.map.html
const Promise = require('bluebird'); // Promise.map gives us { concurrency } control

const limit = 1000;
const totalRows = 50000;
const seq = count => [...Array(Math.ceil(count / limit)).keys()];

const updateTable = async (dbTable, page) => {
  const offset = limit * page;
  return knex(dbTable).pluck('id').limit(limit).offset(offset).then(ids => {
    return knex(dbTable)
      .whereIn('id', ids)
      .update({ date: new Date() })
      .then((rows) => {
        console.log(`${page} - Updated rows of the table ${dbTable} from ${offset} to ${offset + limit}: `, rows);
      })
      .catch((err) => {
        console.log({ err });
      });
  })
  .catch((err) => {
    console.log({ err });
  });
};

Promise.map(seq(totalRows), page => updateTable(dbTable, page), { concurrency: 1 });
Here pluck() is used to get the ids in array form.
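For reference, pluck('id') returns a flat array of values rather than row objects, which is exactly the shape whereIn expects (a quick comparison, with illustrative output):
const ids = await knex(dbTable).pluck('id');    // [1, 2, 3]
const rows = await knex(dbTable).select('id');  // [{ id: 1 }, { id: 2 }, { id: 3 }]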
