Chained Fetch getting first results immediately - javascript

I am sending chained fetch requests: first I retrieve data from a database, then I request a picture related to every title I got back.
The HTML won't be inserted into the results div until all of the image requests have been sent, so it takes a long time for the articles to appear. How can I make the text load before the image requests start going out?
async function getArticles(params) {
    const url = 'http://127.0.0.1:8000/articles/api/?'
    const url2 = 'https://api.unsplash.com/search/photos?client_id=XXX&content_filter=high&orientation=landscape&per_page=1&query='
    const article = await fetch(url + params).then(response => response.json());
    const cards = await Promise.all(article.results.map(async result => {
        try {
            const image = await fetch(url2 + result.title).then(response => response.json())
            const card = // Creating HTML code by using database info and Unsplash images
            return card
        } catch {
            const card = // Creating HTML code by using info and fallback images from database
            return card
        }
    }))
    document.getElementById('results').innerHTML = cards.join("")
}
I have tried running them separately, but then I just got a Promise object.

If you don't want to wait for all the fetches, use an ordinary for loop and await each one sequentially.
async function getArticles(params) {
    const url = 'http://127.0.0.1:8000/articles/api/?'
    const url2 = 'https://api.unsplash.com/search/photos?client_id=XXX&content_filter=high&orientation=landscape&per_page=1&query='
    const article = await fetch(url + params).then(response => response.json());
    for (let i = 0; i < article.results.length; i++) {
        const result = article.results[i];
        let card;
        try {
            const image = await fetch(url2 + result.title).then(response => response.json())
            card = // Creating HTML code by using database info and Unsplash images
        } catch {
            card = // Creating HTML code by using info and fallback images from database
        }
        document.getElementById('results').innerHTML += card;
    }
}
However, this will be slower because it won't start each fetch until the previous one completes.
It's harder to run all the fetches concurrently but display the results in the order the requests were sent, rather than the order the responses were received. You could do it by creating a container div for each result before sending its request, then filling in the appropriate div when its response arrives.
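A minimal sketch of that approach, reusing the endpoints from the question (the buildCard helper is hypothetical and stands in for the elided card-building code):

async function getArticles(params) {
    const url = 'http://127.0.0.1:8000/articles/api/?'
    const url2 = 'https://api.unsplash.com/search/photos?client_id=XXX&content_filter=high&orientation=landscape&per_page=1&query='
    const article = await fetch(url + params).then(response => response.json());
    const results = document.getElementById('results');
    article.results.forEach(result => {
        // Create a container immediately, so the text shows up before any image arrives
        const container = document.createElement('div');
        container.textContent = result.title;
        results.appendChild(container);
        // Fill in the container whenever its own image response comes back
        fetch(url2 + result.title)
            .then(response => response.json())
            .then(image => { container.innerHTML = buildCard(result, image); })   // hypothetical helper
            .catch(() => { container.innerHTML = buildCard(result, null); });    // fallback image from the database
    });
}

Because each container is appended before any fetch resolves, the cards stay in the order the requests were sent, no matter which responses arrive first.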

Related

I want to receive data from an external API in my server (Node.js Express) and send it to my front end

I have been stuck on this code for more than a week and still cannot make it work as it should.
I want to receive data from an external API, but I have two issues that I am not sure how to handle:
1. The URL of my API call varies per user, so I need to change the URL based on the user's ID and make one API call per user. I want to push the data received from the external API into an array; if the user's ID is already in the array I just want to update that entry, and if it is not, insert the data.
2. This API call needs to be repeated every 20 seconds to receive updated information.
Here is the code I am trying to fix:
This is the code in my backend/server. I am using Node.js Express.
let userData = [];

async function callApi() {
    for (let i = 80; i < 82; i++) {
        const url = "https://myurl.com/user/" + i;
        // Making the API call
        const response = await fetch(url);
        const information = await response.json();
        const check = userData.find(obj => obj.data.userId === information.data.userId);
        if (check == undefined) {
            userData.push(information);
        } else {
            const index = userData.findIndex(x => x.data.userId === information.data.userId);
            userData[index] = information;
        }
    }
}

callApi();

// Repeat the API call every 20 seconds
setInterval(function() {
    callApi();
}, 20000);

// free endpoint
app.get("/free-endpoint", (req, res) => {
    res.json(userData);
});
In my frontend:
I want to make an HTTPS request to my server's url/free-endpoint and get the updated data from userData every 20 seconds.
I hope you can help me!
Thank you
I have tried setInterval inside my route
app.get("/free-endpoint", (req, res) => {
    res.json(userData);
});
but I always get an error that I cannot send the headers more than once.
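That error appears because each request can only be answered once: a setInterval inside the route handler ends up calling res.json repeatedly on the same response object. A minimal sketch of the usual pattern instead, with the server refreshing userData on its own timer (as in the code above) and the frontend polling the endpoint every 20 seconds (the endpoint path is from the question; renderUserData is a hypothetical function):

// Frontend: poll the server every 20 seconds instead of looping inside the route
async function pollUserData() {
    const response = await fetch("/free-endpoint"); // same-origin path; adjust to your server's URL
    const userData = await response.json();
    renderUserData(userData); // hypothetical function that updates the page
}

pollUserData();
setInterval(pollUserData, 20000);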

Cannot get querySelectorAll to work with puppeteer (returns undefined)

I'm trying to practice some web scraping of prices from a supermarket, using Node.js and Puppeteer. I can navigate through the website at the beginning, accepting cookies and clicking a "load more" button. But when I then try to read the divs containing the products with querySelectorAll, I get stuck: it returns undefined even though I wait for a specific div to be present. What am I missing?
The problem is at the end of the code block.
const { product } = require("puppeteer");

const scraperObjectAll = {
    url: 'https://www.bilkatogo.dk/s/?query=',
    async scraper(browser) {
        let page = await browser.newPage();
        console.log(`Navigating to ${this.url}`);
        await page.goto(this.url);
        // Accept cookies
        await page.evaluate(_ => {
            CookieInformation.submitAllCategories();
        });
        var productsRead = 0;
        var productsTotal = Number.MAX_VALUE;
        while (productsRead < 100) {
            // Wait for the required DOM to be rendered
            await page.waitForSelector('button.btn.btn-dark.border-radius.my-3');
            // Click the button to load more products
            await page.evaluate(_ => {
                document.querySelector("button.btn.btn-dark.border-radius.my-3").click()
            });
            // Wait for the new products to load
            await page.waitForSelector('div.col-10.col-sm-4.col-lg-2.text-center.mt-4.text-secondary');
            // Get the number of products read and the total
            const loadProducts = await page.evaluate(_ => {
                let p = document.querySelector("div.col-10.col-sm-4.col-lg-2").innerText.replace("INDLÆS FLERE", "").replace("Du har set ", "").replace(" ", "").replace(/(\r\n|\n|\r)/gm, "").split("af ");
                return p;
            });
            console.log("Products (read/total): " + loadProducts);
            productsRead = loadProducts[0];
            productsTotal = loadProducts[1];
            // Now waiting for a div element
            await page.waitForSelector('div[data-productid]');
            const getProducts = await page.evaluate(_ => {
                return document.querySelectorAll('div');
            });
            // PROBLEM HERE!
            // Cannot convert undefined or null to object
            console.log("LENGTH: " + Array.from(getProducts).length);
        }
    }
}
The callback passed to page.evaluate runs in the emulated page context, not in the scope of the Node script. Values can't be passed between the page and the Node script without careful consideration: most importantly, if something isn't serializable (convertible to plain JSON), it can't be transferred.
querySelectorAll returns a NodeList, and NodeLists only exist on the front end, not in Node. Similarly, NodeLists contain HTMLElements, which also only exist on the front end.
Put all the logic that needs this front-end-only data inside the .evaluate callback, for example:
const numberOfDivs = await page.evaluate(_ => {
return document.querySelectorAll('div').length;
});
or
const firstDivText = await page.evaluate(_ => {
return document.querySelector('div').textContent;
});
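If you need data from every product rather than just a count, map the NodeList to plain, serializable objects inside the callback before returning it. A sketch reusing the div[data-productid] selector from the question (the returned property names are my own):

const products = await page.evaluate(_ => {
    return Array.from(document.querySelectorAll('div[data-productid]')).map(div => ({
        id: div.getAttribute('data-productid'), // plain string, so it serializes back to Node
        text: div.innerText.trim(),
    }));
});
console.log("LENGTH: " + products.length);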

Images don't render on the first page

I am trying to make a simple app that calls an API and then renders 10 images per page.
The first page loads but does not show images, yet the second page does.
What am I doing wrong?
let imageData = [];

fetch({api})
    .then(res => res.json())
    .then((data) => {
        imageData.push(...data.results)
    })

fetch({api})
    .then(res => res.json())
    .then((data) => {
        imageData.push(...data.results)
    })

let currentPage = 1;
let imagesPerPage = 10;

const changePage = (page) => {
    let nextBttn = document.getElementById("nextBttn");
    let prevBttn = document.getElementById("prevBttn");
    let root = document.getElementById("root");
    let pageCount = document.getElementById("page");

    if (page < 1) page = 1;
    if (page > numPages()) page = numPages();

    root.innerHTML = "";
    for (var i = (page - 1) * imagesPerPage; i < (page * imagesPerPage) && i < imageData.length; i++) {
        const createImage = document.createElement('img')
        createImage.src = imageData[i].urls.thumb
        createImage.setAttribute('id', imageData[i].id)
        root.appendChild(createImage)
    }
    pageCount.innerHTML = page + "/" + numPages();
}

window.onload = () => {
    changePage(1);
};
There are two fetches because each call returns 30 images and I need 60.
The problem is that you have two (three, technically) asynchronous tasks running that depend on one another, but no code to synchronize them back up.
Here's the possible order of events:
1. You initiate a fetch of the first 30 images.
2. You initiate a fetch of the second 30 images. (No matter how fast these requests are, their callbacks won't fire until the rest of this code has been parsed and executed.)
3. You set a callback for page load.
Here's where things can get wonky.
Scenario A (unlikely; you wouldn't have an error): The server is fast as heck (or the response is cached) and already has a response waiting for you. In theory, I believe it's possible for the fetch callback to fire before the page load event (though I could be wrong here). In this unlikely scenario, the response data is loaded into imageData. Then the page load event fires and calls changePage, which displays the images from imageData.
Scenario B (most likely): The server takes some milliseconds to respond, but the page elements are all created, so the onload callback fires first. It attempts to display the imageData (but there isn't any yet). The server finally responds with the 60 images, but no code is executed that tells the page to draw this new image data.
As you can see, because your code assumes the image data is already available when it tries to display images on page load (not data load), it fails whenever the image data takes a while to return and, upon returning, does not notify the page to display it.
Here's how you can modify it:
let response1 = fetch({api})
    .then(res => res.json())
    .then((data) => {
        imageData.push(...data.results)
    });

let response2 = fetch({api})
    .then(res => res.json())
    .then((data) => {
        imageData.push(...data.results)
    });

Promise.all([response1, response2])
    .then(() => changePage(1));

// Remove the onload callback, because we don't really care when the page loads; we care when the data loads.

How do I loop through multiple pages in an API?

I am using the Star Wars API https://swapi.co/. I need to pull in starship information; the results for starships span 4 pages, but a GET call returns only 10 results per page. How can I iterate over multiple pages and get the info that I need?
I have used the fetch API to GET the first page of starships and pushed this array of 10 onto my totalResults array, then created a while loop that checks whether next !== null (next is the next-page property in the data; if we were viewing the last page, i.e. page 4, then next would be null: "next" = null). So as long as next is not null, my while loop should fetch the data and add it to my totalResults array. I reassign next at the end, but it seems to loop forever and crash.
function getData() {
    let totalResults = [];
    fetch('https://swapi.co/api/starships/')
        .then(res => res.json())
        .then(function (json) {
            let starships = json;
            totalResults.push(starships.results);
            let next = starships.next;
            while (next !== null) {
                fetch(next)
                    .then(res => res.json())
                    .then(function (nextData) {
                        totalResults.push(nextData.results);
                        next = nextData.next;
                    })
            }
        });
}
The code keeps looping, meaning my 'next = nextData.next;' reassignment does not seem to be working.
You have to await the response in a while loop; otherwise the loop runs synchronously while the results arrive asynchronously. In other words, the while loop runs forever:
async function getData() {
    const results = [];
    let url = 'https://swapi.co/api/starships/';
    do {
        const res = await fetch(url);
        const data = await res.json();
        url = data.next;
        results.push(...data.results);
    } while (url);
    return results;
}
You can do it more easily with an async/await function:
async function fetchAllPages(url) {
    const data = [];
    do {
        const response = await fetch(url);
        const page = await response.json();
        url = page.next;
        data.push(...page.results);
    } while (url);
    return data;
}
This way you can reuse the function for other API calls.
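For example, to gather every starship from the question's endpoint (the call site below is just an illustration):

fetchAllPages('https://swapi.co/api/starships/')
    .then(starships => console.log(starships.length)); // all 4 pages flattened into one array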

Read a stream with for-await using JavaScript in the browser

I am using the following code to download something large upon user gesture in the browser using fetch with progress indication:
const url = 'https://source.unsplash.com/random';
const response = await fetch(url);
const total = Number(response.headers.get('content-length'));
let loaded = 0;
const reader = response.body.getReader();
let result;
while (!(result = await reader.read()).done) {
    loaded += result.value.length;
    // Display loaded/total in the UI
}
I saw a snippet in a related question which led me to believe this could be simplified into:
const url = 'https://source.unsplash.com/random';
const response = await fetch(url);
const total = Number(response.headers.get('content-length'));
let loaded = 0;
for await (const result of response.body.getReader()) {
    loaded += result.value.length;
    // Display loaded/total in the UI
}
getReader returns a ReadableStreamDefaultReader, which comes from the Streams API. That API is both a web API and a Node API, which makes finding web-only information really hard.
With the snippet above, the code fails with response.body.getReader(...) is not a function or its return value is not async iterable. I checked the object's prototype, and indeed I don't think it has Symbol.asyncIterator on it, so no wonder the browser failed to iterate over it.
So the code in that question must have been wrong. But now I wonder: is there a way to take a stream like this and iterate over it using for-await? I suppose you need to wrap it in an async generator and yield the chunks, similar to the first snippet, right?
Is there an existing or planned API which makes this more streamlined, something closer to the second snippet, but actually working?
The ReadableStream itself implements the async iterable protocol (in browsers that support it), so you can iterate over response.body directly; each iteration yields a chunk, not a {value, done} wrapper:
for await (const chunk of response.body) {
    loaded += chunk.length;
    // Display loaded/total in the UI
}
The reader returned by response.body.getReader() resolves each read() to {value: ..., done: Boolean}, which is exactly the shape an async iterator's next() must return. So, before ReadableStream itself got an asyncIterator implementation, you can easily polyfill it:
if (!response.body[Symbol.asyncIterator]) {
    response.body[Symbol.asyncIterator] = () => {
        const reader = response.body.getReader();
        return {
            next: () => reader.read(),
        };
    };
}

for await (const result of response.body) {
    loaded += result.length;
    console.log(((loaded / total) * 100).toFixed(2), '%');
}
See https://jsfiddle.net/6ostwkr2/
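If you'd rather not patch the stream object, you can also wrap the reader in an async generator, as the question suggests. A minimal sketch (the streamChunks name is my own):

async function* streamChunks(body) {
    const reader = body.getReader();
    try {
        while (true) {
            const { done, value } = await reader.read();
            if (done) return;
            yield value; // each value is a Uint8Array chunk
        }
    } finally {
        reader.releaseLock(); // release the reader even if iteration stops early
    }
}

for await (const chunk of streamChunks(response.body)) {
    loaded += chunk.length;
    // Display loaded/total in the UI
}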
