Update an element automatically in a website using Node.js - javascript

I am still in the early stages of my programming journey, and I was thinking about creating something that makes API calls every x seconds and updates my website with new content.
My initial goal is to populate a table with the content obtained from the API using a FOR loop (.ejs page).
Now, I want to update just those rows and columns (created from the FOR loop) of my webpage every x seconds instead of refreshing the entire page.
How can I achieve this (updating only those rows and columns)?
Consider a stock market website, where just the stock price is updated instead of the entire page.
Thanks for your help in advance.

The most popular way to solve this problem is to store the data you obtain from the other API in your database, then create an endpoint that serves the most recent version of that data as JSON.
Then, on the client side, you make periodic requests to that endpoint to fetch the content and update a part of the page to show the newest data available.
This could be as simple as:
// server side (if you use express)
app.get("/data", (req, res) => {
  database.getMostRecentEntry()
    .then(data => res.json(data))
    .catch(err => res.status(500).end());
});

// client side
function fetchMostRecentData() {
  fetch("/data")
    .then(response => response.json())
    .then(data => updateView(data))
    .catch(err => showError(err));
}

function updateView(data) {
  let container = document.getElementById("app");
  container.innerHTML = `
    <tr>
      <td>${data.name}</td>
      <td>${data.value}</td>
    </tr>
  `;
}

function showError(err) {
  console.error(err);
  alert("Something went wrong");
}

// call fetchMostRecentData once every 10s
setInterval(fetchMostRecentData, 10000);
Now, this isn't a very robust solution and there are some fairly serious security problems, but it's a starting point.
From here, you should look into using a frontend framework (rather than updating innerHTML yourself). You could also look at using WebSockets, rather than serving the data through an HTTP endpoint.
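If you switch to WebSockets later, the shape of the solution stays similar: the server pushes the newest entry whenever it changes, and the client updates the table on every message instead of polling. Below is a minimal sketch using the socket.io library; the "data-update" event name and the reuse of database.getMostRecentEntry() are my assumptions, not part of the original answer.
// server side (socket.io attached to an existing Node HTTP server)
const { Server } = require("socket.io");
const io = new Server(httpServer);

io.on("connection", socket => {
  // send the latest entry to each client as soon as it connects
  database.getMostRecentEntry().then(data => socket.emit("data-update", data));
});

// call this whenever new data is saved, to push it to all connected clients
function broadcastNewData(data) {
  io.emit("data-update", data);
}

// client side (requires the socket.io client script on the page)
const socket = io();
socket.on("data-update", data => updateView(data));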

I'd look into using Express.js in combination with Node.js to build your website, then use AJAX inside your HTML to make the REST API calls that update the page.

Related

Data VS Async Data in Nuxt

I'm using Vue.js with Nuxt.js, and I'm still confused about when to use data vs asyncData. Why would I need to use asyncData when I just have data that is simply displayed on the page?
I have a data object of FAQs and just want to display the data without doing anything with it. What are the benefits of using asyncData? What are the cases or best uses for it?
Should I treat list data such as this as async by default when using it inside my component?
Data
data: () => ({
  faqs: [
    { "title": "faq1" },
    { "title": "faq2" },
    { "title": "faq3" },
  ]
}),
asyncData
asyncData(context) {
  return new Promise((resolve, reject) => {
    resolve({
      colocationFaqs: [
        { "title": "faq1" },
        { "title": "faq2" },
        { "title": "faq3" },
      ]
    });
  })
    .then(data => {
      return data;
    })
    .catch(e => {
      context.error(e);
    });
},
asyncData happens on the server side. You can't access browser things like localStorage or fetch(), for example, but on the other hand you can access server-side things.
So why should you use asyncData instead of Vue lifecycle hooks like created?
The benefit of using asyncData is SEO and speed. There is a special context argument. It contains things like your store via context.store. It's special because asyncData happens on the server side, but the store normally lives on the client side. That means you can fetch some data and then populate your store with it, and display it somewhere else. The benefit of this is that it all happens server-side, which improves your SEO; for example, the Google crawler doesn't see a blank page.
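As a rough sketch of that pattern, assuming the @nuxtjs/axios module, an example API URL and a setFaqs mutation (none of which are part of the original answer):
// pages/faq.vue - asyncData runs on the server during server-side rendering
export default {
  async asyncData(context) {
    // fetch the FAQs on the server via the injected axios instance
    const faqs = await context.$axios.$get("https://example.com/api/faqs");

    // populate the (normally client-side) store while still on the server
    context.store.commit("setFaqs", faqs);

    // whatever is returned here gets merged into the component's data
    return { faqs };
  }
};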
why would I need to pre render it when it is going to be displayed anyway
Yes, for us it doesn't matter whether I send one file to the client and it renders all the data like an SPA does, or whether it's pre-rendered. But it does matter for the Google crawler. If you use SPA mode, the crawler just sees a blank page. You can see this for yourself: go to any SPA website, right-click and view the page source, and you will see that there is only one div tag and a few <script> tags. (Don't press F12 and inspect the rendered DOM, that's not what I mean.)

Display frames generated from opencv Django React

I want to show a video generated from OpenCV. I am getting each frame from OpenCV and, with the help of Django, I send it to React.
So what happens is: I send a request from React to the Django API to get a frame from OpenCV, and I then show it in React. I am calling this API in a loop to get multiple frames per second and display them in React (it's so fast that it looks like a video).
But I found out that this is the wrong way; I should use sockets instead of sending so many requests at a time.
Can someone show me how I can get the same functionality through WebSockets? I am short on time, so I need a small and quick solution. I have googled a lot but didn't find anything.
Here's my current approach of sending multiple requests:
const interval = setInterval(() => {
  axios
    .get("http://127.0.0.1:8000/MyApp/get_logs/")
    .then(res => {
      set_show(res.data);
    })
    .catch(err => {
      console.log(err);
    });
}, 500);
return () => clearInterval(interval);
The above function is called every 0.5 seconds. I get a frame in base64 and show it in an image element, and this happens repeatedly, which makes it look like a video. How can I achieve this through sockets using Django and React?
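For comparison, the React side of a socket-based version could look roughly like the sketch below, placed inside the same component that defines set_show. It assumes a WebSocket endpoint such as ws://127.0.0.1:8000/ws/frames/ exposed on the Django side (for example via Django Channels) and that each message carries one base64-encoded frame; the path and message format are assumptions, not something from the question.
useEffect(() => {
  // hypothetical endpoint; the Django side would need something like Django Channels
  const socket = new WebSocket("ws://127.0.0.1:8000/ws/frames/");

  // each message is assumed to carry one base64-encoded frame
  socket.onmessage = event => {
    set_show(event.data);
  };

  socket.onerror = err => console.log(err);

  // close the connection when the component unmounts
  return () => socket.close();
}, []);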

How is this fetch optimised on react gatsby website

On the front page of a website served with a Gatsby/React setup, I have a NavbarExtra component that shows some data obtained via an API. The data coming from that endpoint changes several times a day.
The idea now is to optimize the fetch so it happens as seldom as possible, so the API is used as rarely as possible (it has a request limit tied to different paid plans).
The scenario should be that once the user has entered the site, the fetch only happens once. The user can then move around the site, maybe even close the component that does the fetch, and later go back to the front page where the fetch is made.
The component is only rendered in the nav menu when the user is on the front page:
{isLandingPage && <NavBarData/>} and in that component there is this fetch:
useEffect(() => {
  fetch(
    'https://endpoint',
    {
      method: 'GET',
      headers: {},
    }
  )
    .then(response => {
      if (response.status >= 200 && response.status <= 299) {
        return response.json();
      }
      throw Error(response.statusText);
    })
    .then(data => {
      const { result } = data.quoteResponse;
      setNumbers(result);
    })
    .catch(error => {
      console.log(error);
    });
}, []);
Firstly, I would like to ask how this fetch should be done so the API is used as rarely as possible, so the user gets the recent data and only fetches it again when, for example, reloading the page?
Secondly, I understand some concepts about single-page apps and static site generators such as Gatsby. Have I understood correctly that if I wanted to use the fetched data on different pages (even pages other than the isLandingPage), I could just use it in one component that is rendered on several pages, and it would not refetch on each page navigation?
You could create a parent component that fetches the data and passes it down to its children, so you control the fetch in only one component, only once per session, or wherever you need it. Depending on your architecture, you could use simple state or the Context API to reuse this data in several nested components.
Another solution could involve localStorage: you could store the fetched data in localStorage and reuse it in any component, and you would only have to update the stored data when you need to.
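A minimal sketch of the localStorage idea; the cache key, the getNumbers helper and the bare endpoint URL are made up for illustration:
// returns cached data if present, otherwise fetches it once and caches it
function getNumbers() {
  const cached = localStorage.getItem("navbar-data");
  if (cached) {
    return Promise.resolve(JSON.parse(cached));
  }

  return fetch("https://endpoint")
    .then(response => response.json())
    .then(data => {
      localStorage.setItem("navbar-data", JSON.stringify(data));
      return data;
    });
}

// inside the component: read from the cache-aware helper instead of fetching directly
useEffect(() => {
  getNumbers().then(data => setNumbers(data)).catch(console.log);
}, []);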

Handling large data sets on client side

I'm trying to build an application that uses Server-Sent Events (SSE) in order to fetch and show some tweets (the latest 50-100) in the UI.
Url for SSE:
https://tweet-service.herokuapp.com/stream
Problem(s):
My UI is becoming unresponsive because a huge amount of data is coming in!
How do I make sure my UI stays responsive? What strategies should I usually adopt for handling the data?
Current setup (for better clarity on what I'm trying to achieve):
Currently I have a max-heap with a custom comparator to show the latest 50 tweets.
Every time there's a change, I re-render the page with the new max-heap data.
We should not keep the EventSource open, since this will block the main thread if too many messages are sent in a short amount of time. Instead, we should only keep the event source open for as long as it takes to get 50-100 tweets. For example:
function getLatestTweets(limit) {
  return new Promise((resolve, reject) => {
    let items = [];
    let source = new EventSource('https://tweet-service.herokuapp.com/stream');
    source.onmessage = ({ data }) => {
      if (limit-- > 0) {
        items.push(JSON.parse(data));
      } else {
        // resolve this promise once we have reached the specified limit
        resolve(items);
        source.close();
      }
    };
  });
}
getLatestTweets(100).then(e => console.log(e))
You can then compare these tweets to previously fetched tweets to figure out which ones are new, and then update the UI accordingly. You can use setInterval to call this function periodically to fetch the latest tweets.
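For example, a rough way to wire that up, keeping the previous batch in a variable so new tweets can be detected (the actual comparison is left as a placeholder):
let previousTweets = [];

setInterval(() => {
  getLatestTweets(100).then(tweets => {
    // compare tweets against previousTweets here to work out which ones are new,
    // then update only the affected parts of the UI
    previousTweets = tweets;
  });
}, 10000); // poll every 10 seconds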

Is there a way to get dynamically generated content from a web page through a JavaScript GET request?

Essentially, I'm trying to retrieve the contents of a website I created to display time-based one-time passcodes (TOTP) as part of a project I'm doing for fun. The site in question is here. As you can see, all it does is display a TOTP. However, I'm having trouble actually getting the data from the site.
I've created a small script that is meant to get the TOTP from the web page, but I run a fetch request like this (from another server):
const getResponseFromTOTP = () =>
  new Promise((resolve, reject) => {
    try {
      fetch("https://jpegzilla.com/totp/?secret=secrettextgoeshere&length=6")
        .then(res => res.text())
        .then(html => resolve(html));
    } catch (e) {
      reject(e);
    }
  });
From this request, I get the entirety of the document at that URL, but without the content that would be rendered there if it were viewed in a browser.
The idea is to somehow get JavaScript to render the content of the webpage as it would be displayed in a browser, and then extract the TOTP from the document. The site hosting the TOTP is completely static; how might this be achieved using only JavaScript and HTML?
