I found this sample code on a couple of websites (GitHub and freeCodeCamp) to help determine the connection status between the client side and the internet.
const checkOnlineStatus = async () => {
  try {
    const online = await fetch("/1pixel.png");
    return online.status >= 200 && online.status < 300; // either true or false
  } catch (err) {
    return false; // definitely offline
  }
};
setInterval(async () => {
  const result = await checkOnlineStatus();
  const statusDisplay = document.getElementById("status");
  statusDisplay.textContent = result ? "Online" : "OFFline";
}, 3000); // probably too often, try 30000 for every 30 seconds
// forgot to include async load event listener in the video!
window.addEventListener("load", async (event) => {
  const statusDisplay = document.getElementById("status");
  statusDisplay.textContent = (await checkOnlineStatus())
    ? "Online"
    : "OFFline";
});
However, this produces an error in the console when I run it. The error is:
script.js:6 GET http://127.0.0.1:5500/1pixel.png 404 (Not Found)
Does anyone else get this same error or know how to get around this?
Edit: I got the solution by changing the fetch API call to this:
const online = await fetch("https://jsonplaceholder.typicode.com/todos/1");
The error was caused by the previous fetch requesting /1pixel.png, a file that doesn't exist on my server, hence the 404.
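Alternatively, if you want to keep the check on your own origin instead of hitting a third-party URL, you could request a small file that actually exists there and skip the HTTP cache. A rough sketch, assuming the server serves a /favicon.ico (swap in any small asset you actually host):

const checkOnlineStatus = async () => {
  try {
    // HEAD keeps the response tiny; "no-store" makes sure the request really hits the network
    const response = await fetch("/favicon.ico", { method: "HEAD", cache: "no-store" });
    return response.ok; // true for any 2xx status
  } catch (err) {
    return false; // the request itself failed, so we're offline
  }
};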
I'm using the Google Gmail API to get sent emails.
I'm using 2 APIs for this -
list (https://developers.google.com/gmail/api/reference/rest/v1/users.messages/list)
get (https://developers.google.com/gmail/api/reference/rest/v1/users.messages/get)
The list API gives a list of messages IDs which I use to get specific data from the get API.
Here's the code for this -
await Promise.all(
  messages?.map(async (message) => {
    const messageData = await contacts.getSentGmailData(
      accessToken,
      message.id
    );
    return messageData;
  })
);
getSentGmailData is the get API here.
The problem is that, while mapping over the messages and firing requests to this API in quick succession, I get a 429 (rateLimitExceeded) error.
What I tried is adding a buffer between each request like this -
function delay(ms) {
  return new Promise((resolve) => {
    setTimeout(resolve, ms);
  });
}

const messageData = await contacts.getSentGmailData(accessToken, message.id);
await delay(200);
But this doesn't seem to work.
How can I work around this?
You can use a solution like the code below to add more buffer time whenever you get a 429 (too many requests) from the Google API.
Basically, this code stops hammering the API and retries with increasing delays once you exceed the rate limit.
Note: this doesn't mean you can bypass the Google API rate limiter.
async function getSentGmailDataWithBackoff(accessToken, messageId) {
  const MAX_RETRIES = 5;
  let retries = 0;
  let delayMs = 200; // named so it doesn't shadow the delay() helper below
  while (true) {
    try {
      const messageData = await contacts.getSentGmailData(accessToken, messageId);
      return messageData;
    } catch (error) {
      if (error.response && error.response.status === 429 && retries < MAX_RETRIES) {
        retries++;
        console.log(`Rate limit exceeded. Retrying in ${delayMs}ms.`);
        await delay(delayMs);
        delayMs *= 2; // exponential backoff
      } else {
        throw error;
      }
    }
  }
}
async function getSentGmailDataWithBackoffBatch(accessToken, messageIds) {
  return Promise.all(
    messageIds.map(async (messageId) => {
      const messageData = await getSentGmailDataWithBackoff(accessToken, messageId);
      return messageData;
    })
  );
}
function delay(ms) {
  return new Promise((resolve) => {
    setTimeout(resolve, ms);
  });
}
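For completeness, calling the batched helper with the messages array from the question might look like this (the mapping from messages to IDs is an assumption about their shape):

// Hypothetical call site, reusing the question's `messages` array of objects with an `id` field
const sentMessages = await getSentGmailDataWithBackoffBatch(
  accessToken,
  messages?.map((message) => message.id) ?? []
);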
The reason the delay is not working is that map() does not wait for the returned Promise to resolve before moving on to the next element, so all the requests still start at once. The same reasoning applies to forEach, filter, reduce, etc. You can get some idea here: https://gist.github.com/joeytwiddle/37d2085425c049629b80956d3c618971
If you had used a for-of loop or another for-loop for this purpose, it would have worked.
for (let message of messages) {
  const messageData = await contacts.getSentGmailData(accessToken, message.id);
  await delay(200);
}
You could also write your own rate-limiting function (also commonly called throttling function) or use one provided by libraries like Lodash: https://lodash.com/docs#throttle
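For illustration, here is a minimal hand-rolled limiter that simply spaces the calls out sequentially; the mapWithDelay name and the 200 ms gap are my own choices, not anything required by the Gmail API:

// Run the requests one at a time with a fixed gap between them
async function mapWithDelay(items, fn, gapMs = 200) {
  const results = [];
  for (const item of items) {
    results.push(await fn(item)); // finish each call before starting the next
    await new Promise((resolve) => setTimeout(resolve, gapMs)); // fixed pause between calls
  }
  return results;
}

// e.g. const data = await mapWithDelay(messages, (m) => contacts.getSentGmailData(accessToken, m.id));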
I am trying to make a discord bot. I want it to fetch an API every X seconds and check for new topics created.
The API JSON contains just one object, the latest post on the site. I wanted the bot to fetch every X seconds and, if the latest post found has a bigger uid, send a message in a channel; if not, do nothing. The uid is a number.
I tried to use the setInterval function but could not get it to work, as it gave errors saying await needs to be inside an async function.
I also hardcoded the latest post uid as latest_post. The way I expected it to work is: if the post_id from the API is higher than the hardcoded one, then latest_post receives the new value. However, latest_post's value remains unchanged, because the function is executed again and latest_post is hardcoded every time. I tried declaring it outside the function, but it came out as undefined.
This is my current, almost-working code:
client.on('messageCreate', async (message) => {
  if (!message.content.startsWith(prefix) || message.author.bot) return;
  const args = message.content.slice(prefix.length).split(/ +/);
  const command = args.shift().toLowerCase();
  if (command === 'new') {
    var latest_post = 12345; // hardcoded latest post uid
    try {
      const response = await fetch('API HERE').then(r => r.text());
      const body = JSON.parse(response);
      const obj = body.data;
      const objId = obj[0].post_id;
      const objAuthor = obj[0].author;
      const objTitle = obj[0].title;
      if (objId > latest_post) {
        client.channels.cache.get("CHANNEL ID").send(`Found a new post by ${objAuthor}`);
        latest_post = objId; // if it fetches a post with a higher uid then latest_post receives that higher uid
      } else {
        var d = new Date();
        var n = d.toLocaleTimeString();
        console.log(`No new posts yet at ${n}`); // logs "No new posts yet at 1:30 PM"
      }
    } catch (error) {
      message.channel.send('Oops, there was an error fetching the API');
      console.log(error);
    }
  }
});
Can anyone guide me on how to transform this into a function that runs automatically every X seconds?
Thanks!
After some iterations I managed to make it work the way I wanted. I'll leave the code below, hope it helps!
let refreshint = 10 * 1000; // 10 seconds
var API = "API_HERE";

client.on('ready', async () => {
  console.log('Bot is connected...');
  var latest_post = 12345; // hardcoded latest post uid
  setInterval(async function () {
    try {
      const response = await fetch(API).then(r => r.text());
      const body = JSON.parse(response);
      const obj = body.data;
      const objId = obj[0].post_id;
      const objAuthor = obj[0].author;
      const objTitle = obj[0].title;
      if (objId > latest_post) {
        client.channels.cache.get("CHANNEL ID").send(`Found a new post by ${objAuthor}`);
        console.log(`Found a new post by ${objAuthor}`);
        latest_post = objId; // remember the newest uid we have seen
      } else {
        var d = new Date();
        var n = d.toLocaleTimeString();
        console.log(`No new thread yet at ${n}`);
      }
    } catch (error) {
      client.channels.cache.get('channel ID here').send('Oops, there was an error fetching the API');
      console.log(error);
    }
  }, refreshint);
});
I'm writing the JavaScript below.
The entire system consists of HTML, JavaScript, and Python; they send, receive, and show some data.
When generateBtn() is pressed, the form data in the HTML is sent to the JavaScript and Python.
The JavaScript waits for Python to finish processing the data (waitUpdate()). After that, the JavaScript shows the data Python sent (generateProfile()).
So far, waitUpdate() doesn't work.
When "const waitUpdate = async function(user_uid) {" starts, it never gets into "db.collection("users")...where("generate_flag", "==", true).onSnapshot((querySnapshot) => {". It only reaches the console.log('Flag : ' + flag); (which prints "Flag : true") right before sleep(1000);, so it gets stuck in an infinite loop printing "Flag : true".
When I check generate_flag in the Cloud Firestore console, it shows true, so I thought the code had gotten into ".onSnapshot((querySnapshot) => {". But, as mentioned above, it actually hasn't.
Does anyone know how to solve this issue?
I'd appreciate it if you could tell me how to fix it.
Thank you.
function generateBtn() {
  var user_uid = firebase.auth().currentUser.uid
  //console.log("In createCloze() : user_uid : " + user_uid);

  const waitUpdate = async function(user_uid) {
    let flag
    flag = true
    while (flag) {
      //console.log("while() in waitUpdate() : user_uid = " + user_uid); //Okay.
      db.collection("users").doc(user_uid).collection("logdata").where("generate_flag", "==", true)
        .onSnapshot((querySnapshot) => {
          console.log("In .onSnapshot((querySnapshot) ="); // It's never been through here so far.
          querySnapshot.forEach((doc) => {
            console.log("In querySnapshot.forEach((doc) =>");
            if (doc.data().mypythonfeedback_flag == false) {
              flag = false;
              //console.log('Flag : ' + flag + ' in if(doc.data().mypythonfeedback_flag == false)');
            }
          }); // querySnapshot.forEach((doc) => {
        }); // .onSnapshot((querySnapshot) => {
      console.log('Flag : ' + flag);
      sleep(1000);
    } // while(flag){
  }

  // Omitted. This shows the data.
  const generateProfile = async function(user_uid) {
    console.log('In generateProfile()');
  } // const generateProfile = async function(user_uid) {

  const processAll = async function(user_uid) {
    await waitUpdate(user_uid)
    await generateProfile(user_uid)
  }

  processAll(user_uid)
}
Data is loaded from Firebase asynchronously, because it may need to come from the network. While the data is being loaded, your other code continues to execute. And then when the data is loaded, your callback is called with that data.
What this means in practice is that your console.log('Flag : ' + flag) runs before the flag = false is ever executed, so it always shows the default value. You can most easily verify this by setting breakpoints on these lines and running in a debugger, or by checking that Flag : true shows before In querySnapshot.forEach((doc) => in your output.
The simplest solution for this is to put the code that needs the flag into the callback that is called when the data is loaded, so like this:
db.collection("users").doc(user_uid).collection("logdata").where("generate_flag", "==", true)
.onSnapshot((querySnapshot) => {
querySnapshot.forEach((doc) => {
if(doc.data().mypythonfeedback_flag == false){
flag = false;
}
}); //querySnapshot.forEach((doc) => {
console.log('Flag : ' + flag); // 👈
});
To allow the caller to await until the flag becomes false, you can return a custom promise:
const waitUpdate = async function(user_uid) {
  return new Promise((resolve, reject) => {
    let flag = true
    db.collection("users").doc(user_uid).collection("logdata").where("generate_flag", "==", true)
      .onSnapshot((querySnapshot) => {
        querySnapshot.forEach((doc) => {
          if (doc.data().mypythonfeedback_flag == false) {
            flag = false;
            resolve();
          }
        });
      });
  })
}
As you'll see, we've removed the loop here - and instead call resolve() (which then triggers the await) from within the callback that listens for updates from the database.
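One refinement worth noting, as a sketch only: onSnapshot returns an unsubscribe function, so you can also stop listening (and avoid resolving more than once) as soon as the flag flips:

const waitUpdate = function(user_uid) {
  return new Promise((resolve) => {
    const unsubscribe = db.collection("users").doc(user_uid).collection("logdata")
      .where("generate_flag", "==", true)
      .onSnapshot((querySnapshot) => {
        querySnapshot.forEach((doc) => {
          if (doc.data().mypythonfeedback_flag == false) {
            unsubscribe(); // stop listening once the Python side has finished
            resolve();
          }
        });
      });
  });
};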
I'm new to ServiceWorkers, but I want to keep my homepage as fresh as possible by using the cache-then-network pattern as outlined in Google's offline cookbook.
However, I am using static HTML instead of data streams, and the source code linked from Google's documentation does not cover that case. So I came up with a pattern that works in testing, but I'm concerned it introduces a race condition.
The pattern I am trying to achieve is:
1. Request
2. ServiceWorker returns cached response
3. Page posts message to ServiceWorker, telling it to use the network on the next load
4. Page makes fetch request
5. ServiceWorker updates cache and returns network response
6. Page updates with the network response
However, there may be a race condition between 3 and 4.
The issue is that 3 and 4 happen in the client, while 5 happens in the ServiceWorker.
I am unsure of the potential delay caused by postMessage(). If it is substantial, 4 will fetch the page before the ServiceWorker is aware it should be fetching from the network, and a repeat cached response will be returned.
Is this the case? Or will the ServiceWorker always have time to update the flag from the postMessage() before the next fetch request is made?
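For reference, here is a sketch of how I could make step 3 awaitable before step 4 by routing the message through a MessageChannel and waiting for the ServiceWorker to reply before fetching (the 'ack' reply value is just a convention I made up, not part of any API):

// Page side: send the message with a MessageChannel port and wait for a reply
function tellServiceWorker(message) {
  return new Promise((resolve) => {
    const channel = new MessageChannel();
    channel.port1.onmessage = (event) => resolve(event.data);
    navigator.serviceWorker.controller.postMessage(message, [channel.port2]);
  });
}

// ServiceWorker side: reply on the transferred port after updating the flag
// self.addEventListener('message', (event) => {
//   if (event.data === 'updateHomepageCache') {
//     updateHomepageCache = true;
//     if (event.ports[0]) event.ports[0].postMessage('ack');
//   }
// });

// Then the page only fetches after the flag is confirmed:
// await tellServiceWorker('updateHomepageCache');
// fetch(location.href) ...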
Source
Page
let isControlled = navigator.serviceWorker.controller;
let isHomepage = location.href === location.origin + '/';
let homepageCached = getCookie('homecached') === 'true';

if (isControlled && !homepageCached) {
  // If the cookie is expired or unset, push an update to the service worker and update cookie
  navigator.serviceWorker.controller.postMessage('updateHomepageCache');
  // `1` means store the cookie for 1 day
  setCookie('homecached', 'true', 1);

  if (isHomepage) {
    // This works easily with location.reload(), but it's sloppy and causes a flicker
    let existingMain = document.querySelector('.main');
    fetch(location.href)
      .then(function(response) {
        return response.text();
      }).then(function(text) {
        try {
          // Convert response text to a new document
          let parser = new DOMParser();
          let doc = parser.parseFromString(text, 'text/html');
          // Get `main` element of response and update the current document with it
          let fetchedMain = doc.querySelector('.main');
          let parent = existingMain.parentNode;
          parent.replaceChild(fetchedMain, existingMain);
        } catch (err) {
          console.error(err);
        }
      });
  }
}
Service Worker
// Set flag to false so update only happens when cache is invalid
let updateHomepageCache = false;

self.addEventListener('message', (event) => {
  // Update flag if cache is invalid
  // Note, this comes after fetch in the ServiceWorker's lifecycle
  if (event.data === 'updateHomepageCache') {
    updateHomepageCache = true;
  }
});

self.addEventListener('fetch', (event) => {
  const normalizedUrl = new URL(event.request.url);
  normalizedUrl.search = '';

  const isNavigation = event.request.mode === 'navigate';
  const isFromOrigin = normalizedUrl.origin === location.origin;
  const isHomepage = normalizedUrl.href === location.origin + '/';

  // Respond with network and update cache for homepage (if it needs to be updated)
  if (isHomepage && isNavigation && updateHomepageCache) {
    event.respondWith(
      caches.open(cacheName).then(function(cache) {
        return fetch(normalizedUrl).then(function(networkResponse) {
          cache.put(normalizedUrl, networkResponse.clone());
          updateHomepageCache = false;
          return networkResponse;
        }).catch(function() {
          return cache.match(normalizedUrl);
        });
      })
    );
  // Cache, then update the cache with the network response for other pages ("stale-while-revalidate")
  } else if (isFromOrigin && isNavigation) {
    event.respondWith(
      caches.open(cacheName).then(function(cache) {
        return cache.match(normalizedUrl).then(function(response) {
          let fetchPromise = fetch(normalizedUrl).then(function(networkResponse) {
            cache.put(normalizedUrl, networkResponse.clone());
            return networkResponse;
          });
          return response || fetchPromise;
        });
      })
    );
  }
});
Example code:
Hub.listen('auth', event => {
  const { event: type, data } = event.payload;
  if (type === 'signIn') {
    const session = data.signInUserSession;
    console.log('SESSION', data.signInUserSession);
    setTimeout(() => {
      console.log('SESSION', data.signInUserSession);
    }, 100);
  }
});
When using OAuth, after the provider redirects to my app, the Hub fires a signIn event. However, the signInUserSession property is null when the event is fired, but gets a value some time later (within 100 ms). This does not seem to occur when using Auth.signIn(email, password) directly; signInUserSession is populated when the event is fired.
What is happening here, and how can I get around it? Currently, I have an explicit delay in the code, which is a terrible hack.
Perhaps the old JavaScript approach of polling until a value is populated is useful here, to ensure the code does not fail even if populating the value takes longer than expected.
Here is sample code that I normally use when no other options are available:
waitForValue() {
  if (myVar != null && typeof myVar !== "undefined") {
    // value exists, do what you want
    console.log(myVar);
  } else {
    setTimeout(() => { this.waitForValue() }, 100);
  }
}
You can refactor this sample code as per your need.
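For instance, adapted to the question's Hub event, a hedged version of that polling idea might look like this (the 100 ms interval and the retry cap are arbitrary choices of mine, not Amplify requirements):

// Hypothetical adaptation: poll the data object from the 'signIn' payload until the session appears
function waitForSession(data, attemptsLeft = 50) {
  if (data.signInUserSession != null) {
    console.log('SESSION', data.signInUserSession); // value exists, use it
  } else if (attemptsLeft > 0) {
    setTimeout(() => waitForSession(data, attemptsLeft - 1), 100); // try again shortly
  } else {
    console.warn('signInUserSession was never populated');
  }
}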
Alternatively, AWS Amplify also has other ways to get the current logged-in user's session, e.g. Auth.currentAuthenticatedUser() and Auth.currentSession(), which return promises. They can be used like this:
private async getUser() {
  let user = null;
  try {
    user = await Auth.currentAuthenticatedUser();
    //console.log(user);
  } catch (err) {
    //console.log(err);
  }
  return user;
}
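They could also be combined with the Hub listener from the question; a sketch, assuming the same Auth module that Auth.signIn comes from is imported:

Hub.listen('auth', async ({ payload: { event: type } }) => {
  if (type === 'signIn') {
    try {
      const session = await Auth.currentSession(); // resolves once tokens are available
      console.log('SESSION', session);
    } catch (err) {
      console.log(err);
    }
  }
});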
I'm not that familiar with AWS Amplify (I've only read some of the GitHub source), so we'd need more info about your userPool implementation; I'd guess it's some weird callback issue.
But as a workaround, you can proxy the reference:
const event = { type: "signIn", data: { signInProperty: "null" } };
setTimeout(() => event.data.signInProperty = "{Stack: Overflow}", 1000);

// mock events
function emit(type, args) {
  console.log(type, args);
}

// initialize
let watchedValue = event.data.signInProperty;
document.getElementById("app").innerHTML = event.data.signInProperty;

// protect reference
Object.defineProperty(event.data, "signInProperty", {
  set(newValue) {
    watchedValue = newValue;
    document.getElementById("app").innerHTML = newValue;
    emit("event:signInCompleted", event.data);
  },
  get() {
    return watchedValue;
  }
});
<div id="app"></div>