Parse Scheduled Background Jobs - javascript

I am trying to make an app with daily-quote logic: it should pick a random object from my Parse class and show it to the user. Once a user has seen today's object, they should not be able to see a different random object on the same day.
I built this algorithm in Swift, but I think Cloud Code with a Background Job is the cleaner and more correct way to do it. I researched background job tutorials and guides, but I couldn't make it work because I don't have enough JavaScript knowledge. In any case, I created a Background Job on my Parse Server like this:
Parse.Cloud.define('todaysMentor', async (request) => {
  var Mentor = Parse.Object.extend('Mentor');
  var countQuery = new Parse.Query(Mentor);
  const count = await countQuery.count();
  const query = new Parse.Query('Mentor');
  const randomInt = Math.floor(Math.random() * count);
  query.equalTo('position', randomInt);
  query.limit(1); // limit to a single result
  const results = await query.find();
  const Today = Parse.Object.extend('Today');
  const today = new Today();
  today.set('mentor', results[0]);
  today.save()
    .then((today) => {
      // Execute any logic that should take place after the object is saved.
    }, (error) => {
    });
  return results;
});
Parse.Cloud.job('pickTodaysMentor', async function(request) {
  const { params, headers, log, message } = request;
  Parse.Cloud.run('todaysMentor', (request) => {
    if (!passesValidation(request.object)) {
      throw 'Ooops something went wrong';
    }
  });
});
I want to get a random Mentor object from my Mentor class and add it to the Today class; that way my mobile apps can fetch the Today object. The first function works well when I call it from Swift.
My server logs look like this:
May 13, 2019, 22:22:45 +03:00- ERROR
(node:41) UnhandledPromiseRejectionWarning: TypeError: Parse.Cloud.run is not a function
at Parse.Cloud.job (/opt/app-root/src/repo/cloud/functions.js:28:19)
at Object.agenda.define [as fn] (/opt/app-root/src/build/agenda/agenda.js:74:25)
at process._tickCallback (internal/process/next_tick.js:68:7)
May 13, 2019, 22:22:45 +03:00- ERROR
(node:41) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 4)
I googled this error and learned that Parse 3.0 changed some of the function syntax. How can I fix this? Or do you have any suggestions for implementing this algorithm?
Thank you!

I'd suggest going with something like this:
async function todaysMentor() {
  const Mentor = Parse.Object.extend('Mentor');
  const countQuery = new Parse.Query(Mentor);
  const count = await countQuery.count();
  const query = new Parse.Query('Mentor');
  const randomInt = Math.floor(Math.random() * count);
  query.equalTo('position', randomInt);
  query.limit(1); // limit to a single result
  const results = await query.find();
  const Today = Parse.Object.extend('Today');
  const today = new Today();
  today.set('mentor', results[0]);
  await today.save();
  return results;
}

Parse.Cloud.define('todaysMentor', async (request) => {
  return await todaysMentor();
});

Parse.Cloud.job('pickTodaysMentor', async function(request) {
  return await todaysMentor();
});
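The shared-helper pattern used above can be sanity-checked outside Parse entirely. This sketch replaces the Parse classes with a hypothetical in-memory store (all names here are illustrative stand-ins, not Parse APIs): both entry points stay thin wrappers over one async function, which is the shape that avoids calling one cloud endpoint from another.

```javascript
// Hypothetical in-memory stand-ins for the Mentor and Today classes.
const mentors = [{ position: 0, name: 'Ada' }, { position: 1, name: 'Alan' }];
const todayStore = [];

// Shared helper: both the cloud function and the job call this.
async function todaysMentor() {
  const randomInt = Math.floor(Math.random() * mentors.length);
  const mentor = mentors.find(m => m.position === randomInt);
  todayStore.push({ mentor }); // record today's pick
  return mentor;
}

// Two thin entry points over the same logic,
// mirroring Parse.Cloud.define and Parse.Cloud.job.
const cloudFunction = async () => todaysMentor();
const scheduledJob = async () => todaysMentor();
```

Because neither entry point calls the other, the Parse 3.0 syntax differences between `Parse.Cloud.define` and `Parse.Cloud.job` never touch the shared logic.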

Related

Postgres query does not return large records using Async / Await

I have a query that runs perfectly for a small number of records. However, if I run a query with a large number of records, it returns no output. I suspect this is because I am not using async/await properly.
Here is the code for my class, with the exception of the actual connection string:
sql.js
class SQL {
  get connectionString() { return 'postgres://user:pass@server:port/db'; }

  async queryFieldValue(query) {
    const pgs = require('pg');
    const R = require('rambda');
    const client = new pgs.Client(this.connectionString);
    await client.connect();
    await client.query(query).then(res => {
      const result = R.head(R.values(R.head(res.rows)));
      console.log("The Result is: " + result);
    }).finally(() => client.end());
  }
}
export default new SQL();
Any help is appreciated =)
Well, your usage of async/await is incorrect, but I don't think that's why you're getting results from small queries and not from large ones. When using promises, stick to either async/await or chained promise resolution methods; don't mix them together.
const pgs = require('pg');
const R = require('rambda');

class SQL {
  get connectionString() { return 'postgres://user:pass@server:port/db'; }

  async queryFieldValue(query) {
    // Create one client per call and reuse it for connect, query, and end.
    const client = new pgs.Client(this.connectionString);
    try {
      await client.connect();
      const { rows } = await client.query(query);
      const result = R.head(R.values(R.head(rows)));
      console.log("The Result is: " + result);
    } catch (e) {
      console.log('Some error: ', e);
    } finally {
      await client.end();
    }
  }
}
export default new SQL();
Preferences on code style aside, the above is a cleaner usage of async/await without blending in chained resolvers.
As for the actual problem, based on your code you're only logging the first column value of the first row returned, so maybe just slap a LIMIT on there? I imagine you're trying to do something more involved with the resultant rows than just logging that one value; additional information would help. I also think you might be swallowing an error by using that .finally with no .catch, but that's a guess.
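As a quick sanity check of that control flow, the try/catch/finally shape can be exercised without a live database. This sketch swaps pg.Client for a stubbed client (all names here are hypothetical) and shows that end() runs on both the success and failure paths:

```javascript
// Stub standing in for pg.Client; names are illustrative only.
function makeStubClient({ failQuery = false } = {}) {
  return {
    ended: false,
    async connect() {},
    async query(_sql) {
      if (failQuery) throw new Error('query failed');
      return { rows: [{ value: 42 }] };
    },
    async end() { this.ended = true; },
  };
}

async function queryFieldValue(client, sql) {
  try {
    await client.connect();
    const { rows } = await client.query(sql);
    return rows[0].value;
  } catch (e) {
    return null; // error path: report failure instead of crashing
  } finally {
    await client.end(); // runs on both the success and error paths
  }
}
```

Because the finally block awaits end(), the connection is released whether the query resolves or rejects, with no chained resolvers mixed in.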

How to use promises with IndexedDB without transactions auto-committing?

Is there any way to use IndexedDB with promises and async/await without the transactions auto-committing? I understand that you can't do stuff like fetch network data in the middle of a transaction, but everything I was able to find online on the subject indicates that IndexedDB should still work if you simply wrap it in promises.
However, in my testing (Firefox 73), I found that simply wrapping the request's onsuccess method in a Promise is enough to cause the transaction to auto-commit before the promise executes, while the same code works when using the raw IndexedDB API. What can I do?
Here is a simplified minimal example of my code.
const {log, error, trace, assert} = console;

const VERSION = 1;
const OBJ_STORE_NAME = 'test';
const DATA_KEY = 'data';
const META_KEY = 'last-updated';

function open_db(name, version) {
  return new Promise((resolve, reject) => {
    const req = indexedDB.open(name, version);
    req.onerror = reject;
    req.onupgradeneeded = e => {
      const db = e.target.result;
      for (const name of db.objectStoreNames) { db.deleteObjectStore(name); }
      db.createObjectStore(OBJ_STORE_NAME);
    };
    req.onsuccess = e => resolve(e.target.result);
  });
}

function idbreq(objs, method, ...args) {
  return new Promise((resolve, reject) => {
    const req = objs[method](...args);
    req.onsuccess = e => resolve(req.result);
    req.onerror = e => reject(req.error);
  });
}

async function update_db(db) {
  const new_time = (new Date).toISOString();
  const new_data = 42; // simplified for sake of example

  const [old_data, last_time] = await (() => {
    const t = db.transaction([OBJ_STORE_NAME], 'readonly');
    t.onabort = e => error('trans1 abort', e);
    t.onerror = e => error('trans1 error', e);
    t.oncomplete = e => log('trans1 complete', e);
    const obj_store = t.objectStore(OBJ_STORE_NAME);
    return Promise.all([
      idbreq(obj_store, 'get', DATA_KEY),
      idbreq(obj_store, 'get', META_KEY),
    ]);
  })();
  log('fetched data from db');

  // do stuff with data before writing it back
  (async () => {
    log('beginning write callback');
    const t = db.transaction([OBJ_STORE_NAME], 'readwrite');
    t.onabort = e => error('trans2 abort', e);
    t.onerror = e => error('trans2 error', e);
    t.oncomplete = e => log('trans2 complete', e);
    const obj_store = t.objectStore(OBJ_STORE_NAME);

    // This line works when using onsuccess directly, but simply wrapping it in a Promise causes the
    // transaction to autocommit before the rest of the code executes, resulting in an error.
    obj_store.get(META_KEY).onsuccess = ({target: {result: last_time2}}) => {
      log('last_time', last_time, 'last_time2', last_time2, 'new_time', new_time);
      // Check if some other transaction updated db in the mean time so we don't overwrite newer data
      if (!last_time2 || last_time2 < new_time) {
        obj_store.put(new_time, META_KEY);
        obj_store.put(new_data, DATA_KEY);
      }
      log('finished write callback');
    };

    // This version of the above code using a Promise wrapper results in an error
    // idbreq(obj_store, 'get', META_KEY).then(last_time2 => {
    //   log('last_time', last_time, 'last_time2', last_time2, 'new_time', new_time);
    //   if (!last_time2 || last_time2 < new_time) {
    //     obj_store.put(new_time, META_KEY);
    //     obj_store.put(new_data, DATA_KEY);
    //   }
    //   log('finished write callback');
    // });

    // Ideally, I'd be able to use await like a civilized person, but the above example
    // shows that IndexedDB breaks when simply using promises, even without await.
    // const last_time2 = await idbreq(obj_store, 'get', META_KEY);
    // log('last_time', last_time, 'last_time2', last_time2, 'new_time', new_time);
    // if (!last_time2 || last_time2 < new_time) {
    //   obj_store.put(new_time, META_KEY);
    //   obj_store.put(new_data, DATA_KEY);
    // }
    // log('finished write callback');
  })();

  return [last_time, new_time];
}

open_db('test').then(update_db).then(([prev, new_]) => log(`updated db timestamp from ${prev} to ${new_}`));
Orchestrate promises around transactions, not individual requests.
If that causes problems with your design and you still want to use IndexedDB, then design around it. Reevaluate whether you need transactional safety, or whether you actually need to reuse one transaction for several requests instead of creating several transactions with only a couple of requests each.
There is little to no overhead in spawning a large number of transactions with a small number of requests each, compared to spawning a small number of transactions with a large number of requests. The only real concern is consistency.
Any await is a yield in disguise. IndexedDB transactions auto-commit when no requests are pending, and a yield opens a gap in time during which no request is pending, so the transaction commits.
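The "await is a yield" point is easy to demonstrate in plain JavaScript with no IndexedDB at all: the code after an await always runs later than the synchronous code surrounding the call, and that later slot is exactly the gap in which an idle transaction can commit. A minimal sketch:

```javascript
// Tracks the interleaving of an async function with the code that calls it.
const order = [];

async function work() {
  order.push('start');
  await null;             // suspends here: control returns to the caller
  order.push('resume');   // runs later, as a microtask
}

const p = work();
order.push('sync after call'); // executes before the awaited continuation
```

By the time 'resume' runs, the caller's synchronous code has long since finished, which is why an IndexedDB transaction with no pending requests has already committed by then.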
It turns out that the problem was in a completely different part of my code.
At the end of my top level code, I had
.catch(e => {
error('caught error', e);
alert(e);
});
I'm not sure about the details, but showing an alert appears to cause all the transactions to auto-commit while the promises are still pending, leading to the errors I saw once the user clicks "OK" on the alert popup and the pending promises continue. Removing the alert call from my global error handler fixed the issue.

Using returned values in async await in another function

I am trying to access thermostatRow and the consts defined within it in another function, and I am testing this by logging the const inside that function, so I know I can break it into its parts and push those parts into arrays as needed.
I tried three things; two of them are in the code examples below, and the third returned Promise { <pending> }, but I can't find that example.
const worksheet = workbook.getWorksheet("sheet1");
// worksheet is defined in another function earlier in the file that is called asynchronously before this one.

async function dataFormat(worksheet) {
  csvWorkbook = workbook.csv.readFile("./uploads/uploadedFile.csv");
  await csvWorkbook.then(async function() {
    let restarts = 0;
    let nullCounts = true;
    let thermostatRows = [];
    // don't create arrays of current temp here ! do it in a diff function
    // const [serial,date,startDate,endDate,thing3,thing4,thing5] = firstTemps
    worksheet.eachRow({ includeEmpty: true }, function(row, rowNumber) {
      if (rowNumber > 6) {
        thermostatRows.push(row.values);
      }
    });
    thermostatRows.map(thermostatRow => {
      [, date, time, SystemSetting, systemMode, calendarEvent, ProgramMode, coolSetTemp, HeatSetTemp, currentTemp, currentHumidity, outdoorTemp, windSpeed, coolStage1, HeatStage1, Fan, DMOffset, thermostatTemperature, thermostatHumidity, thermostatMotion] = thermostatRow;
      return thermostatRow; // Push current temps, heat set and cool set into separate arrays and use those arrays to run the overheat() function
    });
  });
}

const dataResult = async (worksheet) => {
  let result = await dataFormat(worksheet);
  console.log(result);
}
This logs nothing.
So I also tried:
const dataResult = async () => {
  let result = await dataFormat();
  console.log(result);
}
dataResult()
And got this error:
(node:11059) UnhandledPromiseRejectionWarning: TypeError: Cannot read property 'eachRow' of undefined
at /Users/jasonharder/coding/projects/hvacdoctor/controllers/hvacdoctor.js:69:15
at <anonymous>
at process._tickCallback (internal/process/next_tick.js:188:7)
What am I missing here?
Note: I have refactored the code a little bit - I am returning undefined now.
Note: Appended where worksheet is defined (it is defined in the file)
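One thing worth isolating here: an async function with no return statement resolves to undefined, regardless of any work it does along the way, which matches the undefined result the question reports. A minimal sketch with illustrative names (not the exceljs API):

```javascript
// dataFormat-style function that does work but never returns it.
async function noReturn() {
  const rows = [1, 2, 3].map(x => x * 2); // work happens here...
  // ...but with no `return rows;`, the promise resolves to undefined.
}

// Same work, but the result is returned, so awaiting it yields the array.
async function withReturn() {
  return [1, 2, 3].map(x => x * 2);
}
```

So `await dataFormat(...)` can only log something meaningful once dataFormat actually returns the rows it builds.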

How to connect to mssql server synchronously in node.js

All of the examples for the mssql client package/tedious driver use async/callbacks/promises, but I'm only developing a microservice that will see limited use, and my understanding of asynchronous functions is still a bit fuzzy.
Here's what I have when trying to use async/await:
Report generation class:
const mssql = require('mssql');
const events = require('events');

class reporter {
  constructor(searcher, logger) {
    // Pass in search type and value or log the error of none defined
    this.lg = logger;
    if (searcher.type && searcher.content) {
      this.lg.lg("reporter created", 3);
      this.srchType = searcher.type;
      this.srchContent = searcher.content;
    } else {
      this.lg.lg("!MISSING SEARCH PARAMETERS", 0);
      this.err = "!MISSING SEARCH PARAMETERS";
    }
  }

  proc() {
    //DB Connect async
    async () => {
      try {
        await mssql.connect('mssql://username:password@localhost/database');
        this.result = await mssql.query`select * from mytable where id = ${this.searcher}`;
      } catch (err) {
        // ... error checks
      }
    }
    return this.result;
  }
}
Then called:
//Pass to reporter for resolution
var report1 = new reporter(searcher, logs);
report1.proc();
I'm sure this is probably a pretty bad way to accomplish this, so I'm also open to any input on good ways to accomplish the end goal, but I'd still like to know if it's possible to accomplish synchronously.
You can't do it synchronously. Figuring out this async stuff is definitely worth your time and effort.
async/await and promises let you more-or-less fake doing it synchronously:
const report1 = new reporter(searcher, logs);
report1.proc()
  .then(result => {
    /* in this function, "result" is what your async function returned */
    /* do res.send() here if you're in express */
  })
  .catch(error => {
    /* your lookup failed */
    /* inform the client of your web service about the failure
     * in an appropriate way. */
  });
And, unwrap the async function in your proc function, like so:
async proc() {
  try {
    await mssql.connect('mssql://username:password@localhost/database');
    this.result = await mssql.query`select * from mytable where id = ${this.searcher}`;
  } catch (err) {
    // ... error checks
  }
  return this.result;
}
await and .then are analogous.
Kind of an updated answer that continues from O. Jones' answer.
The current version of Node.js (v15+) supports top-level await, meaning you can run it all sequentially:
import mssql from 'mssql';

await mssql.connect('mssql://username:password@localhost/database');
const result = await mssql.query`select * from mytable where id = ${this.searcher}`;
But this should still be avoided, since you want to catch errors instead of letting the process crash. In current versions of Node.js, if an awaited promise rejects and the rejection isn't caught (with try/catch or .catch()), the unhandled rejection terminates your application.
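To make the catching concrete, here is a self-contained sketch with stubbed connect/query functions standing in for the mssql calls (the names are illustrative); the same try/catch shape works verbatim under top-level await in an ES module:

```javascript
// Stubs simulating the mssql calls; they reject on a bad connection string.
async function connect(url) {
  if (!url.startsWith('mssql://')) throw new Error('bad connection string');
}
async function query(_sql) {
  return [{ id: 1 }];
}

async function main(url) {
  try {
    await connect(url);
    return await query('select * from mytable');
  } catch (err) {
    // Caught here: the process keeps running instead of dying
    // with an unhandled rejection.
    return [];
  }
}
```

Whether the rejection comes from connect or from query, the catch block sees it, so nothing escapes to crash the process.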

NodeJS Wait till a function has completed

I have an application with a Client class that needs to be executed with a different configuration for each user whenever I receive a message on Discord (it's a chat application with its own API calls for checking for new messages, etc.). I construct the object with that user's configuration whenever I receive a new message, but when I receive multiple messages at the same time, only the latest user's configuration is used for all the Client objects created. Here is sample code from the application:
Wait for message code:
const app = require("./Client");
const app2 = require("./MyObject");

bot.on('message', async (message) => {
  let msg = message.content.toUpperCase(), pg;
  let client = await new app2.MyObject();
  //set all such configurations
  config.a = "valuea";
  // initialize my
  pg = await new app.Client(config, client);
  let result = await pg.Main(bot, message, params).then((result) => {
    // Do stuff with result here
  });
});
Client class:
class Client {
  constructor(config, client) {
    this.config = config;
    this.client = client;
  }

  async Main(bot, message, params) {
    let result = {};
    this.client.setProperty1("prop");
    await this.client.doSOmething();
    result = await this.doSOmethingMore(message);
    this.client.doCleanUp();
    return result;
  }
}
I also tried initializing the object in the Client class constructor, but even that fails for some reason.
Any suggestions on how I can correct my code?
You don't need to use .then and await at the same time.
bot.on('message', async (message) => {
  let msg = message.content.toUpperCase();
  let client = await new MyObject();
  //set all such configurations
  config.a = "valuea";
  // initialize my
  pg = new app.Client(config, client);
  let result = await pg.Main(bot, message, params);
  // Do stuff with result here
  console.log(result);
});
(You don't need await on the constructor call, because a constructor is not an async method.)
Note: async/await works on Node.js 7.6 and higher.
If you want to use .then:
bot.on('message', (message) => {
  let msg = message.content.toUpperCase();
  new MyObject().then(client => {
    //set all such configurations
    config.a = "valuea";
    // initialize my
    pg = new app.Client(config, client);
    pg.Main(bot, message, params).then(result => {
      // Do stuff with result here
      console.log(result);
    });
  });
});
It's difficult to be certain, since you didn't include the MyObject code, but generally it's a bad practice to return a promise from a constructor.
Further, if you don't do it right, whatever that promise resolves to might not in fact be the Client your calling code expects.
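One likely culprit in the original code is that a single config object is mutated for every message, so every Client constructed from it ends up seeing the last writer's values. This sketch (illustrative names, not the Discord API) contrasts sharing the object with copying it per message:

```javascript
// Minimal Client that just holds the config it was given.
class Client {
  constructor(config) { this.config = config; }
}

const sharedConfig = {};
const clientsSharing = [];
const clientsCopying = [];

for (const user of ['alice', 'bob']) {
  sharedConfig.a = user;                      // mutates the one shared object...
  clientsSharing.push(new Client(sharedConfig));        // ...which every Client references
  clientsCopying.push(new Client({ ...sharedConfig })); // a per-message copy snapshots it
}
```

The clients holding the shared object all report the last user written, while the clients given a spread copy each keep their own user's value, which is the behavior the question is after.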
