I am building my own trading terminal with various functions for a school project where speed is important. I have two functions that I would like to optimize, but I don't know how to proceed from here.
Function 1:
const calcPromise = async () => {
  const markPriceData = await restClient.getTickers({
    symbol: selectedSymbol,
  });
  let cash = 100;
  const lastPrice = markPriceData.result[0].last_price;
  const quantity = Math.round((cash / lastPrice * 1.03).toFixed(3));
  return { lastPrice, quantity };
};
It makes one request for the last price of the given symbol and calculates the quantity I can buy for a certain cash amount.
Function 2:
export default async function placeOrder() {
  try {
    const orderData = await restClient.placeActiveOrder({
      symbol: selectedSymbol,
      order_type: 'Limit',
      side: 'Buy',
      qty: quantity,
      price: lastPrice,
      time_in_force: 'GoodTillCancel',
      reduce_only: false,
      close_on_trigger: false,
      position_idx: LinearPositionIdx.OneWayMode
    });
    const endTime2 = performance.now();
    const executionTime2 = endTime2 - startTime2;
    console.log(`Buy order ${executionTime2}ms`);
    console.log(orderData);
  } catch (err) {
    console.log(err);
  }
}
Function 2 takes the values from function 1, lastPrice and quantity, makes another request, and executes a buy order.
This process is taking too long; I'm clocking it at 2-5 seconds on average.
As it is now, function 2 must wait for the first request to finish and the calculations to run before it can execute.
My idea is to get function 1 to run in the background somehow, so that the price is already available and function 2 does not need to wait for function 1 to be ready.
I've tried implementing this with async, but it resulted in errors, because function 2 always finishes faster than function 1 when they run at the same time.
Is there a smart way I can solve this?
I would combine the two functions into just one. Try this:
async function placeOrder() {
  try {
    const markPriceData = await restClient.getTickers({
      symbol: selectedSymbol,
    });
    let cash = 100;
    const lastPrice = markPriceData.result[0].last_price;
    const quantity = Math.round((cash / lastPrice * 1.03).toFixed(3));
    const startTime1 = performance.now();
    const orderData = await restClient.placeActiveOrder({
      symbol: selectedSymbol,
      order_type: "Limit",
      side: "Buy",
      qty: quantity,
      price: lastPrice,
      time_in_force: "GoodTillCancel",
      reduce_only: false,
      close_on_trigger: false,
      position_idx: LinearPositionIdx.OneWayMode,
    });
    const endTime1 = performance.now();
    const executionTime = endTime1 - startTime1;
    console.log(`Calculations, buy ${executionTime}ms`);
    return orderData;
  } catch (error) {
    throw error;
  }
}
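If you still want the price available before you click buy, as the question suggests, another option is to keep a small cache that refreshes in the background, so placing the order never waits on the ticker request. This is a minimal sketch of that idea, assuming the same restClient, selectedSymbol and sizing as above; the priceCache variable and the one-second interval are illustrative, and on a real exchange a WebSocket ticker stream would be the more usual way to keep such a cache fresh:

// Hypothetical background cache: the ticker is refreshed on a timer,
// so the order path only reads an already-fetched price.
let priceCache = null;

async function refreshPrice() {
  const markPriceData = await restClient.getTickers({ symbol: selectedSymbol });
  priceCache = markPriceData.result[0].last_price;
}

refreshPrice().catch(console.error);                          // warm the cache once
setInterval(() => refreshPrice().catch(console.error), 1000); // then keep it fresh

async function placeOrderFast() {
  if (priceCache === null) throw new Error('Price not cached yet');
  const cash = 100;
  const quantity = Math.round(cash / priceCache * 1.03); // same sizing as the question
  return restClient.placeActiveOrder({
    symbol: selectedSymbol,
    order_type: 'Limit',
    side: 'Buy',
    qty: quantity,
    price: priceCache,
    time_in_force: 'GoodTillCancel',
    reduce_only: false,
    close_on_trigger: false,
    position_idx: LinearPositionIdx.OneWayMode,
  });
}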
Related
I'm trying to make this.getData(item) calls in parallel, two at a time.
However, with my approach, even if I do 1 at a time instead of 2 concurrently, it still gobbles up my API usage limit.
I think I have a bug, but I'm not sure where.
async makeAsyncCallsInParallel(items) {
  // Define the maximum number of parallel requests
  const maxParallelRequests = 1;
  // Use the map method to create an array of promises for each call to GetData
  const promises = items.map(item => this.getData(item));
  // Use a loop to process the promises in batches of up to the maximum number of parallel requests
  const results = [];
  for (let i = 0; i < promises.length; i += maxParallelRequests) {
    const batch = promises.slice(i, i + maxParallelRequests);
    const batchResults = await Promise.all(batch);
    results.push(...batchResults);
  }
  // Return the final results
  return results;
}
Here is my getData function; I think the problem is in here too:
async getData(item) {
  const me = await this.me();
  const {
    link,
    asin,
    starts_at,
    ends_at,
    title,
    image,
    deal_price,
    list_price,
    merchant_name,
    free_shipping,
    description,
    category,
    // tags
  } = item;
  const discountPercent = deal_price?.value && list_price?.value ? parseInt((((deal_price?.value - list_price?.value) / list_price?.value) * 100).toFixed(0)) : null;
  const { aiTitle, aiDescription, aiPrompt, aiChoices, aiTags } = await this.getAIDescription(item);
  console.log('title: ', title, 'aiTitle:', aiTitle, 'description: ', description, 'aiDescription: ', aiDescription);
  const deal = {
    link,
    productId: asin,
    startsAt: starts_at,
    endsAt: ends_at,
    imageUrl: image,
    title: aiTitle || title,
    originalTitle: title,
    dealPrice: deal_price,
    listPrice: list_price,
    discountPercent,
    merchant: merchant_name,
    isFreeShipping: free_shipping,
    description: aiDescription || description,
    originalDescription: description,
    category,
    createdBy: me.id,
    tags: aiTags,
    aiPrompt,
    aiChoices,
  };
  return deal;
}
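For what it's worth, the batching bug is most likely in makeAsyncCallsInParallel rather than in getData: items.map(item => this.getData(item)) invokes getData on every item up front, so all the API calls start immediately, and slicing the already-running promises into batches only changes when they are awaited, not when they run. A sketch of true batching under that reading, where each call starts only when its batch is processed:

async makeAsyncCallsInParallel(items) {
  const maxParallelRequests = 2;
  const results = [];
  // Slice the items themselves, not pre-started promises, so at most
  // one batch of getData calls is in flight at any moment.
  for (let i = 0; i < items.length; i += maxParallelRequests) {
    const batch = items.slice(i, i + maxParallelRequests);
    const batchResults = await Promise.all(batch.map(item => this.getData(item)));
    results.push(...batchResults);
  }
  return results;
}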
I am developing an app to order food online. As the backend service I am using Firestore to store the data and files. Users can order dishes, and stocks are limited, so every time a user orders a dish and creates a basket I update the stock of the corresponding ordered dishes. I am using a Firebase Function to perform this action; to be honest, it is the first Firebase Function I am creating.
In the Basket object there is a list of ordered dishes with the corresponding database DishID. When the basket is created, I go through the DishID list and update the quantity in the Firestore database. On my local emulator it works perfectly and very fast, but online it takes minutes to perform the first update. I can deal with some seconds; even if it takes a few seconds (like for a cold start) it's okay. But sometimes it can take 3 minutes, and someone else can order a dish during this time.
Here is my code:
//Update the dishes quantities when a basket is created
exports.updateDishesQuantity = functions.firestore.document('/Baskets/{documentId}').onCreate(async (snap, context) => {
  try {
    //Get the created basket
    const originalBasket = snap.data();
    originalBasket.OrderedDishes.forEach(async dish => {
      const doc = await db.collection('Dishes').doc(dish.DishID);
      console.log('Doc created');
      return docRef = doc.get()
        .then((result) => {
          console.log('DocRef created');
          if (result.exists) {
            console.log('Result exists');
            const dishAvailableOnDataBase = result.data().Available;
            console.log('Data created');
            const newQuantity = { Available: Math.max(dishAvailableOnDataBase - dish.Quantity, 0) };
            console.log('Online doc updated');
            return result.ref.set(newQuantity, { merge: true });
          } else {
            console.log("doc doesnt exist");
          }
        })
        .catch(error => {
          console.log(error);
          return null;
        });
    });
  } catch (error) {
    console.log(error);
  }
});
I have a couple of log outputs to debug on the server. It's the doc.get() call that takes 2 minutes to execute, as you can see in the logger below:
[Firebase logger screenshot]
Thanks for your help,
Thanks for your help. I just edited your code a little bit to make it work; I'm posting my edited code below. Thanks a lot, it now takes just 4 seconds to update the quantities.
Kind regards
//Update the dishes quantities when a basket is created
exports.updateDishesQuantity = functions.firestore.document('/Baskets/{documentId}').onCreate(async (snap, context) => {
  try {
    //Get the created basket
    const originalBasket = snap.data();
    const promises = [];
    const quantities = [];
    originalBasket.OrderedDishes.forEach(dish => {
      promises.push(db.collection('Dishes').doc(dish.DishID).get());
      quantities.push(dish.Quantity);
    });
    const docSnapshotsArray = await Promise.all(promises);
    console.log("Promises", promises);
    const promises1 = [];
    var i = 0;
    docSnapshotsArray.forEach(result => {
      if (result.exists) {
        const dishAvailableOnDataBase = result.data().Available;
        const newQuantity = { Available: Math.max(dishAvailableOnDataBase - quantities[i], 0) };
        promises1.push(result.ref.set(newQuantity, { merge: true }));
      }
      i++;
    })
    return Promise.all(promises1)
  } catch (error) {
    console.log(error);
    return null;
  }
});
You should not use async/await within a forEach() loop, see "JavaScript: async/await with forEach()" and "Using async/await with a forEach loop".
And since your code executes, in parallel, a variable number of calls to the asynchronous Firebase get() and set() methods, you should use Promise.all().
You should refactor your Cloud Function along the following lines:
//Update the dishes quantities when a basket is created
exports.updateDishesQuantity = functions.firestore.document('/Baskets/{documentId}').onCreate(async (snap, context) => {
try {
//Get the created basket
const originalBasket = snap.data();
const promises = [];
originalBasket.OrderedDishes.forEach(dish => {
promises.push(db.collection('Dishes').doc(dish.DishID).get());
});
const docSnapshotsArray = await Promise.all(promises);
const promises1 = [];
docSnapshotsArray.forEach(snap => {
if (result.exists) {
const dishAvailableOnDataBase = result.data().Available;
const newQuantity = { Available: Math.max(dishAvailableOnDataBase - dish.Quantity, 0) };
promises1.push(result.ref.set(newQuantity, { merge: true }));
}
})
return Promise.all(promises1)
} catch (error) {
console.log(error);
return null;
}
});
Note that instead of looping and calling push() you could use the map() method for much more concise code. However, for SO answers, I like the clarity brought by creating an empty array, populating it with a forEach() loop and passing it to Promise.all()...
Also note that since you are updating quantities in a basket, you may need to use a Transaction, as sketched below.
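A minimal sketch of that transactional variant, assuming the same Dishes collection and basket shape; the exact structure here is illustrative, not part of the original answer:

//Hypothetical transactional version: the reads and conditional writes
//are retried atomically, so two simultaneous baskets cannot oversell a dish.
exports.updateDishesQuantity = functions.firestore.document('/Baskets/{documentId}').onCreate((snap, context) => {
  const originalBasket = snap.data();
  return db.runTransaction(async (transaction) => {
    //In a Firestore transaction, all reads must happen before any writes.
    const refs = originalBasket.OrderedDishes.map(dish => db.collection('Dishes').doc(dish.DishID));
    const snapshots = await transaction.getAll(...refs);
    snapshots.forEach((result, i) => {
      if (result.exists) {
        const available = result.data().Available;
        const quantity = originalBasket.OrderedDishes[i].Quantity;
        transaction.set(result.ref, { Available: Math.max(available - quantity, 0) }, { merge: true });
      }
    });
  });
});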
The question is pretty self-explanatory, but I want to know how to deal with scheduled tasks, whether it is cron, setTimeout, setInterval, etc. Let's say I have multiple variables initialized inside one. If the task keeps repeating, is it going to keep filling memory, or will it clean itself up after some time? This might sound simple, but I'm a beginner. I'll throw my code here, where I send one API request to check the value of one variable and another one to increase it by one. I can probably safely take some of the variables outside just fine, but the question remains the same.
const { GraphQLClient, gql } = require('graphql-request')
const Cron = require('croner')

const job = Cron('0 14 * * *', () => {
  async function main() {
    console.log("start")
    const endpoint = 'https://graphql.anilist.co/'
    const graphQLClient = new GraphQLClient(endpoint, {
      headers: {
        authorization: 'Token hidden for obvious reasons.',
      },
    })
    const query_check = gql`
      mutation ($mediaId: Int, $status: MediaListStatus) {
        SaveMediaListEntry (mediaId: $mediaId, status: $status) {
          id
          status
          progress
        }
      }
    `
    const variables = {
      mediaId: "1337"
    }
    const check = await graphQLClient.request(query_check, variables)
    console.log(`before ${check.SaveMediaListEntry.progress}`)
    const query_new = gql`
      mutation ($mediaId: Int, $status: MediaListStatus, $progress: Int) {
        SaveMediaListEntry (mediaId: $mediaId, status: $status, progress: $progress) {
          id
          status
          progress
        }
      }
    `
    const new_variables = {
      mediaId: `1337`,
      progress: `${check.SaveMediaListEntry.progress + 1}`
    }
    const new_query = await graphQLClient.request(query_new, new_variables)
    console.log(`after ${new_query.SaveMediaListEntry.progress}`)
  }
  main().catch((error) => console.error(error))
});
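As for the memory question: everything declared inside the callback is scoped to a single run, so once main() finishes and nothing references those variables any more, they become eligible for garbage collection; repeated runs don't keep filling memory. What can be hoisted out are the invariants (the client and the query strings), so they are built once instead of being reallocated on every tick. A sketch along those lines, assuming the same endpoint and mutations as above; note it also passes mediaId and progress as numbers to match the Int-typed GraphQL variables:

const { GraphQLClient, gql } = require('graphql-request')
const Cron = require('croner')

// Built once at startup and reused by every run, instead of being
// reallocated (and later garbage-collected) on each tick.
const graphQLClient = new GraphQLClient('https://graphql.anilist.co/', {
  headers: { authorization: 'Token hidden for obvious reasons.' },
})
const query_check = gql`
  mutation ($mediaId: Int, $status: MediaListStatus) {
    SaveMediaListEntry (mediaId: $mediaId, status: $status) { id status progress }
  }
`
const query_new = gql`
  mutation ($mediaId: Int, $status: MediaListStatus, $progress: Int) {
    SaveMediaListEntry (mediaId: $mediaId, status: $status, progress: $progress) { id status progress }
  }
`

// croner accepts an async callback directly.
const job = Cron('0 14 * * *', async () => {
  try {
    // Everything declared in here lives only for this run and becomes
    // eligible for garbage collection once the run settles.
    const check = await graphQLClient.request(query_check, { mediaId: 1337 })
    console.log(`before ${check.SaveMediaListEntry.progress}`)
    const updated = await graphQLClient.request(query_new, {
      mediaId: 1337,
      progress: check.SaveMediaListEntry.progress + 1,
    })
    console.log(`after ${updated.SaveMediaListEntry.progress}`)
  } catch (error) {
    console.error(error)
  }
})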
I wrote the following Lambda to move messages from queueA to queueB:
async function reprocess_messages(fromQueue, toQueue) {
  try {
    const response1 = await sqs.send(new GetQueueUrlCommand({ QueueName: fromQueue }));
    const response2 = await sqs.send(new GetQueueUrlCommand({ QueueName: toQueue }));
    const fromQueueUrl = response1.QueueUrl;
    const toQueueUrl = response2.QueueUrl;
    let completed = false;
    while (!completed) {
      completed = await moveMessage(toQueueUrl, fromQueueUrl);
      // console.log(status);
    }
    // console.log(completed);
    return completed;
  } catch (err) {
    console.error(err);
  }
}
async function moveMessage(toQueueUrl, fromQueueUrl) {
  try {
    const receiveMessageParams = {
      MaxNumberOfMessages: 10,
      MessageAttributeNames: ["Messsages"],
      QueueUrl: fromQueueUrl,
      VisibilityTimeout: 2,
      WaitTimeSeconds: 0,
    };
    const receiveData = await sqs.send(new ReceiveMessageCommand(receiveMessageParams));
    // console.log(receiveData);
    if (!receiveData.Messages) {
      console.log("finished");
      return true;
    }
    const messages = [];
    receiveData.Messages.forEach(msg => {
      messages.push({ body: msg["Body"], receiptHandle: msg["ReceiptHandle"] });
    });
    const sendMsg = async ({ body, receiptHandle }) => {
      const sendMessageParams = {
        MessageBody: body,
        QueueUrl: toQueueUrl
      };
      await sqs.send(new SendMessageCommand(sendMessageParams));
      // console.log("Success, message sent. MessageID: ", sentData.MessageId);
      return "Success";
    };
    const deleteMsg = async ({ body, receiptHandle }) => {
      const deleteMessageParams = {
        QueueUrl: fromQueueUrl,
        ReceiptHandle: receiptHandle
      };
      await sqs.send(new DeleteMessageCommand(deleteMessageParams));
      // console.log("Message deleted", deleteData);
      return "Deleted";
    };
    const sent = await Promise.all(messages.map(sendMsg));
    // console.log(sent);
    await Promise.all(messages.map(deleteMsg));
    // console.log(deleted);
    console.log(sent.length);
    return false;
  } catch (err) {
    console.log(err);
  }
}
export const handler = async function (event, context) {
  console.log("Invoking lambda");
  const response = await reprocess_messages("queueA", "queueB");
  console.log(response);
}
With a Lambda config of 256 MB it takes 19691 ms, and with 512 MB it takes 10171 ms, to move 1000 messages from queueA to queueB. However, on my local system, running reprocess_messages(qA, qB) takes around 2 minutes.
Does this mean that if I increase the memory limit to 1024 MB it will take only around 5000 ms? And how can I find the optimal memory limit?
It will most likely always be the case that running code on your local machine to interact with AWS services will be slower than if you were to run this code on an AWS service like Lambda. When your code is running on Lambda and interacting with AWS services in the same region, the latency is often only from the AWS network which can be drastically lower than the latency between your network and the AWS region you're working with.
In terms of finding the optimal performance, this is trial and error: you are trying to find the sweet spot between price and performance. There are tools like Compute Optimizer that can assist you with this.
A useful note to bear in mind here is that as you increase Lambda memory you gain access to further vCPU cores: roughly every additional 1,769 MB gives you another vCPU core. So while your current performance increase is tied to just a memory increase, increasing to a number that grants an additional vCPU core may have a greater effect. Unfortunately it can be difficult to theorize about the results, and it is best just to trial and error the different scenarios.
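Independent of memory, wall-clock time can also be cut by draining the queue with several concurrent receive loops, since each ReceiveMessage call returns distinct in-flight messages while their visibility timeout holds. This is only a sketch reusing moveMessage from the question; the worker count is illustrative, and the 2-second VisibilityTimeout above would need raising so messages are not redelivered to another worker mid-copy:

// Hypothetical parallel drain: N independent workers, each looping
// receive -> send -> delete until the source queue reports empty.
async function reprocessMessagesParallel(fromQueueUrl, toQueueUrl, workerCount = 5) {
  const worker = async () => {
    let completed = false;
    while (!completed) {
      completed = await moveMessage(toQueueUrl, fromQueueUrl);
    }
  };
  // Start every worker at once and wait for all of them to finish.
  await Promise.all(Array.from({ length: workerCount }, worker));
}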
So I have a function that checks whether an order is 24 hours old and, if that's the case, sends a notification to the user. But it seems like it does not complete the execution for all the users; it notifies some and not others. I think I have a problem returning the promise. I'm not an expert in JavaScript and I don't really understand what is happening; sometimes, instead of processing all the documents, it just finishes when one document has an empty deviceToken and does not continue with the other user documents.
exports.rememberToFinishOrder = functions.pubsub.schedule('every 3 minutes').onRun(async (context) => {
  var db = admin.firestore();
  const tsToMillis = admin.firestore.Timestamp.now().toMillis()
  const compareDate = new Date(tsToMillis - (24 * 60 * 60 * 1000)) //24 hours
  let snap = await db.collection('orders').where("timestamp", "<", new Date(compareDate)).where("status", "in", [1, 2, 4, 5, 6]).get()
  if (snap.size > 0) {
    snap.forEach(async (doc) => {
      const userId = doc.data().uid
      let userSnap = await db.collection('user').doc(userId).get()
      const deviceToken = userSnap.data().deviceToken
      const payload = {
        notification: {
          title: "¿ Did you received your order ?",
          body: "We need to know if you have received your order",
          clickAction: "AppMainActivity"
        },
        data: {
          ORDER_REMINDER: "ORDER_REMINDER"
        }
      }
      console.log("User: " + doc.data().uid)
      return admin.messaging().sendToDevice(deviceToken, payload)
    });
  }
});
Sometimes, when some users' deviceToken is empty, the function finishes executing instead of continuing to the next user. It also does not finish for all the users in my orders collection; it processes some documents and not others, and this should be an atomic operation that changes everything in that collection, not just some documents.
What is happening?
As andresmijares says, you are not handling the promises correctly.
When you are making several asynchronous calls, I'd suggest using the Promise.all() function, which will wait for all the promises to settle before it continues.
exports.rememberToFinishOrder = functions.pubsub.schedule('every 3 minutes').onRun(async (context) => {
  const db = admin.firestore();
  const messaging = admin.messaging();
  const tsToMillis = admin.firestore.Timestamp.now().toMillis()
  const compareDate = new Date(tsToMillis - (24 * 60 * 60 * 1000)) //24 hours
  const snap = await db.collection('orders').where("timestamp", "<", new Date(compareDate)).where("status", "in", [1, 2, 4, 5, 6]).get()
  let allPromises = [];
  if (snap.size > 0) {
    snap.forEach((doc) => {
      const userId = doc.data().uid;
      allPromises.push(db.collection('user').doc(userId).get().then(userSnapshot => {
        const userData = userSnapshot.data();
        // Check userData exists before reading its properties.
        if (userData && userData.deviceToken) {
          const payload = {
            notification: {
              title: "¿ Did you received your order ?",
              body: "We need to know if you have received your order",
              clickAction: "AppMainActivity"
            },
            data: {
              ORDER_REMINDER: "ORDER_REMINDER"
            }
          }
          console.log("User: " + doc.data().uid)
          return messaging.sendToDevice(userData.deviceToken, payload)
        } else {
          return;
        }
      }));
    });
  }
  return Promise.all(allPromises);
});
EDIT:
I added a check to see if the deviceToken is present on the userData before sending the notification.
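One further note, not in the original answer: Promise.all() rejects as soon as any single promise rejects, so one failed lookup or send can still hide the rest of the batch's results. If that matters, Promise.allSettled() collects every outcome instead. A sketch replacing the final return Promise.all(allPromises) inside the handler:

// Hypothetical variant of the last line: report failures per user
// without letting one rejection abort the whole batch.
const outcomes = await Promise.allSettled(allPromises);
outcomes.forEach((outcome, i) => {
  if (outcome.status === 'rejected') {
    console.log(`Notification ${i} failed:`, outcome.reason);
  }
});
return outcomes;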