Display frames generated from OpenCV (Django + React) - javascript

I want to show a video generated from OpenCV. I grab each frame with OpenCV and send it to React through Django.
So what happens is: React sends a request to the Django API to get a frame from OpenCV, and I then show that frame in React. I call this API in a loop to get multiple frames per second (it is fast enough that the frames appear as a video).
But I found out that this is the wrong way; sending so many requests at a time should be done over sockets instead.
Can someone show me how to get the same functionality through WebSockets? I'm short on time, so I need a small, quick solution. I have googled a lot but didn't find anything.
Here's my current approach of sending multiple requests:
const interval = setInterval(() => {
  axios
    .get("http://127.0.0.1:8000/MyApp/get_logs/")
    .then(res => {
      set_show(res.data);
    })
    .catch(err => {
      console.log(err);
    });
}, 500);
return () => clearInterval(interval);
The function above is called every 0.5 seconds; I receive a frame as base64 and show it in an image element, and because it repeats, it looks like a video. How can I achieve the same thing through sockets using Django and React?
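For what it's worth, here is a minimal client-side sketch of the WebSocket approach. It assumes the Django side exposes a Channels consumer at a hypothetical ws://127.0.0.1:8000/ws/frames/ URL that pushes each base64-encoded frame as a text message; the URL, message shape, hook name, and image MIME type are illustrative, not taken from the question.
import { useEffect, useState } from "react";

// Hypothetical hook: opens a WebSocket and keeps the latest base64 frame in state.
function useFrameStream(url) {
  const [frame, setFrame] = useState(null);

  useEffect(() => {
    const socket = new WebSocket(url);
    // Assumes the server (e.g. a Django Channels consumer) sends one base64 frame per message.
    socket.onmessage = event => setFrame(event.data);
    socket.onerror = err => console.log(err);
    return () => socket.close();
  }, [url]);

  return frame;
}

function VideoFeed() {
  const frame = useFrameStream("ws://127.0.0.1:8000/ws/frames/");
  return frame
    ? <img src={"data:image/jpeg;base64," + frame} />
    : <h1>Waiting for stream...</h1>;
}
On the Django side you would need Channels with a consumer that reads frames from OpenCV and sends each one over the socket; pushing frames as they are produced avoids the request-per-frame overhead of the polling loop above.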

Related

How is this fetch optimised on a React Gatsby website

On the front page of a website served with a Gatsby/React setup I have a NavbarExtra component that shows some data via an API. The data coming from that endpoint changes several times a day.
The idea now is to optimize the fetch so it is made as seldom as possible and the API is used as rarely as possible, since it has a usage limit (different paid plans).
The desired scenario is that once the user has entered the site, the fetch only happens once. The user can then move around the site, maybe even close the component that does the fetch, and come back to the front page where the fetch is made, without triggering it again.
Right now the component is only rendered in the nav menu when the user is on the front page:
{isLandingPage && <NavBarData/>} and in that component there is this fetch:
useEffect(() => {
  fetch(
    'https://endpoint',
    {
      method: 'GET',
      headers: {
      },
    }
  )
    .then(response => {
      if (response.status >= 200 && response.status <= 299) {
        return response.json();
      }
      throw Error(response.statusText);
    })
    .then(data => {
      const { result } = data.quoteResponse;
      setNumbers(result);
    })
    .catch(error => {
      console.log(error);
    });
}, []);
Firstly, how should this fetch be done so that the API is used as rarely as possible, so the user gets recent data and only fetches it again when, for example, reloading the page?
Secondly, I understand some concepts about single-page apps and static site generators such as Gatsby, which is used here. Have I understood correctly that if I wanted to use the fetched data on different pages (even pages other than isLandingPage), I could just use it in one component that is rendered on those pages, and it would not refetch on every page navigation?
You could create a parent component that fetches the data and passes it down to its children, so you control the fetch in only one component, once per session, or wherever you need it. Depending on your architecture you could use simple state or the Context API to reuse this data across several nested components.
Another solution could involve localStorage: store the fetched data in localStorage and reuse it in any component, updating it only when you need to.
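A rough sketch of the Context idea combined with localStorage (the QuoteProvider name, the "quoteData" cache key, and the endpoint placeholder are made up for illustration; adjust to your own setup):
import React, { createContext, useContext, useEffect, useState } from "react";

const QuoteContext = createContext(null);

export function QuoteProvider({ children }) {
  // Start from the cached copy, if any, so returning visitors skip the fetch.
  const [numbers, setNumbers] = useState(() => {
    // Guard for Gatsby's build-time rendering, where window/localStorage do not exist.
    if (typeof window === "undefined") return null;
    const cached = localStorage.getItem("quoteData");
    return cached ? JSON.parse(cached) : null;
  });

  useEffect(() => {
    if (numbers) return; // already fetched (or restored from cache) this session
    fetch("https://endpoint")
      .then(response => response.json())
      .then(data => {
        const { result } = data.quoteResponse;
        localStorage.setItem("quoteData", JSON.stringify(result));
        setNumbers(result);
      })
      .catch(error => console.log(error));
  }, [numbers]);

  return (
    <QuoteContext.Provider value={numbers}>
      {children}
    </QuoteContext.Provider>
  );
}

// Any component (NavBarData or otherwise) can then read the shared data:
export const useQuote = () => useContext(QuoteContext);
Wrapping your layout in QuoteProvider (for example from Gatsby's wrapRootElement) means the data is fetched at most once per session and survives navigation between pages.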

How to speed up displaying images in React coming from the backend (Django Rest)

I have an API in Django and I am calling it from a React front end. The API has to be called indefinitely, and every request from React returns a new image in the response. The responses come back quickly, but React displays the images really slowly. I am displaying the images without reloading the whole component, and I want to keep it that way.
runInfinite = () => {
  axios.post('http://127.0.0.1:8000/faceapp/process_image/')
    .then(res => {
      this.setState({baseimage: res.data}, () => {
        console.log(this.state.baseimage)
      })
    });
};

render() {
  if (this.state.flag){
    setInterval(() => { this.runInfinite(); }, 1000);
  }
  return (
    <div>
      {this.state.baseimage ? <img src={"data:image/png;base64," + this.state.baseimage}/> : <h1>Hello</h1>}
    </div>
  );
}
}
What can I do so that React displays the images at the same speed as the responses arrive?
There is probably nothing that React itself can do in this case. You'll have to optimise the images and their dimensions as required on the page itself.
You can start by checking the resolution you actually need.
E.g. there is no need to load a 4000x3000 image into a 400x300 placeholder.
I can hardly imagine that React is too slow to render an image. Please verify where your actual delay is coming from.
Other than that, you have a potential DoS issue.
Combining setInterval (never use setInterval) with an IO request (especially one that processes data) is a recipe for error. What happens when your server takes more than 1000ms to respond? Answer: multiple requests start stacking up and slow the program to a halt. Might that be what's actually happening?
Avoid this behavior by:
Initiating a request for a Promise or other callback.
When it completes, running your logic based on the output.
Then rescheduling the next request only after all of this has completed, as sketched below.
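A rough sketch of that pattern based on the component from the question (the 1000ms delay is kept from the original; the timeoutId field and lifecycle wiring are assumptions about how you might hook it up):
// Schedule the next request only after the current one has finished,
// so slow responses can never stack up.
runOnce = () => {
  axios.post('http://127.0.0.1:8000/faceapp/process_image/')
    .then(res => {
      this.setState({ baseimage: res.data });
    })
    .catch(err => console.log(err))
    .finally(() => {
      // Reschedule only once this request has fully completed.
      this.timeoutId = setTimeout(this.runOnce, 1000);
    });
};

componentDidMount() {
  this.runOnce(); // kick off the loop once, outside of render()
}

componentWillUnmount() {
  clearTimeout(this.timeoutId); // stop the loop when the component goes away
}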

Handling large data sets on client side

I'm trying to build an application that uses Server-Sent Events to fetch and show some tweets (the latest 50-100) in the UI.
URL for SSE:
https://tweet-service.herokuapp.com/stream
Problem(s):
My UI is becoming unresponsive because a huge amount of data is coming in!
How do I make sure my UI stays responsive? What strategies should I adopt for handling the data?
Current setup (for better clarity on what I'm trying to achieve):
Currently I have a max-heap with a custom comparator to show the latest 50 tweets.
Every time there's a change, I re-render the page with the new max-heap data.
You should not keep the EventSource open permanently, since this will block the main thread if too many messages arrive in a short amount of time. Instead, only keep the event source open for as long as it takes to collect 50-100 tweets. For example:
function getLatestTweets(limit) {
  return new Promise((resolve, reject) => {
    let items = [];
    let source = new EventSource('https://tweet-service.herokuapp.com/stream');
    source.onmessage = ({data}) => {
      if (limit-- > 0) {
        items.push(JSON.parse(data));
      } else {
        // resolve this promise once we have reached the specified limit
        resolve(items);
        source.close();
      }
    }
  });
}
getLatestTweets(100).then(e => console.log(e))
You can then compare these tweets to previously fetched ones to figure out which are new, and update the UI accordingly. You can call this function periodically with setInterval to fetch the latest tweets, as in the sketch below.
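For instance, a sketch of that periodic call plus a simple de-duplication step (it assumes each tweet has an id field and that you have some renderTweets function for updating the UI; both are placeholders):
// Remember which tweets have already been shown so only new ones are added.
const knownIds = new Set();
let latestTweets = [];

function refreshTweets() {
  getLatestTweets(100).then(tweets => {
    const fresh = tweets.filter(t => !knownIds.has(t.id)); // assumes each tweet has an id
    fresh.forEach(t => knownIds.add(t.id));
    latestTweets = fresh.concat(latestTweets).slice(0, 100); // keep only the newest 100
    renderTweets(latestTweets); // placeholder for your own UI update
  });
}

// Fetch a fresh batch every 10 seconds instead of keeping the stream open permanently.
setInterval(refreshTweets, 10000);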

Updating the DOM for everyone when the database changes - MongoDB, Ajax, Express, React

I'm making an application where, when a user submits a form, I want the DOM to be updated for everyone on that page, in realtime, without refreshing the page.
I have tried doing that with Socket.IO and it kind of works, but it only updates people who are already on the page, and that's not enough: when a user submits a form, the view should be updated not only for existing connections but also for someone who loads the page for the first time after the submissions were already made.
So I decided to store everything in a database and poll it for changes, and that works as expected.
Basically the workflow of the app is this:
user submits form => a fetch function that checks the database for changes fires => it finds the new database entry => updates the React state => the change is sent to the view
But the problem is that if I update the DOM this way, I'm afraid I may be overloading the server unnecessarily. I checked, and every open instance of
"http://localhost:3000/seek" polls to see if the database has changed, so if I had 1000 users on my web app that would be 1000 requests every second :o
Maybe I should combine both Socket.IO and the database and use that approach for updating the DOM in realtime?
Seek.js (Server Side)
router.post('/', (req, res) => {
  // Processes form
  // Saving to database
  // Sending response
});
router.get('/:fetch', (req, res, next) => {
  if (req.params.fetch === 'fetch') {
    Seek.find(function(err, games) {
      if (err) console.log(err);
      console.log('FETCHED')
      res.status(200).send({games});
    })
  } else {
    next();
  }
});
seekDiv.jsx
class MyComponent extends React.Component {
  constructor(props) {
    super(props);
    this.state = { games: [] };
  }
  componentWillMount() {
    this.fetchGames()
  }
  fetchGames() {
    fetch('http://www.localhost:3000/seek/fetch')
      .then(res => res.json())
      .then(data => { this.setState({games: data.games}) })
      .catch(err => console.log(err))
  }
  componentDidMount() {
    setInterval(() => this.fetchGames(), 1000)
  }
  render() {
    var games = this.state.games;
    let zero;
    if (games.length === 0) {
      zero = ''
    } else {
      zero = games.map(x => <div>{x.userAlias}</div>)
    }
    return (
      <div>
        {zero}
      </div>
    );
  }
}
I hope I've presented my problem clearly enough, but in case I haven't, this is the functionality I want:
user submits form => the DOM is updated for EVERY user, without a refresh, containing that form data, and it stays there until it is manually removed.
Any help on how to proceed is greatly appreciated.
But the problem is that if I update the DOM this way, I'm afraid I may be overloading the server unnecessarily. I checked, and every open instance of "http://localhost:3000/seek" polls to see if the database has changed, so if I had 1000 users on my web app that would be 1000 requests every second :o
Yeah - that is a problem. Conceptually, you need to make this a "push" system, not a "pull" system.
Rather than having every client constantly ask the server whether there are updates, you simply leave a socket connection open to every page (very low resource use), and on your server, after receiving a new form/post, you push the update to every connected client.
The Socket.IO docs have a good example of how to do this in the "broadcast" section. It's for chat messages, but it works the same way for your forms.
You'll want to minimize the data you send to each client to the bare minimum needed. So if you record any additional data (say, a timestamp of when the new post was added) that you don't display or use on the front end, don't send it to all of the listening clients.
Your front end should listen for incoming updates and, when one arrives, use React to update the DOM accordingly.
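A minimal sketch of that push flow with Socket.IO, adapted to the routes and component from the question (the 'newGame' event name, the payload shape, and the way io is made available to the router are assumptions):
// Server side (Seek.js): after saving the form, broadcast the new entry to all clients.
router.post('/', (req, res) => {
  // ... process the form and save it to the database as before ...
  const newGame = { userAlias: req.body.userAlias }; // assumed payload shape
  io.emit('newGame', newGame); // io = the Socket.IO server instance attached to your Express server
  res.status(200).send(newGame);
});

// Client side (seekDiv.jsx): keep the initial fetch on mount,
// but replace the polling interval with a socket listener.
// import io from 'socket.io-client';
componentDidMount() {
  this.socket = io('http://localhost:3000');
  this.socket.on('newGame', game => {
    this.setState({ games: [...this.state.games, game] });
  });
}

componentWillUnmount() {
  this.socket.disconnect();
}
New visitors still get the full list from the existing GET /seek/fetch route on mount, while everyone already on the page receives new submissions over the socket, so the one-second database polling loop can go away.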

Update an element automatically in a website using NodeJs

I am still in the early stages of programming and I was thinking about creating something which makes API calls every x seconds and updates my website with new content.
My initial goal is to populate a table with the content obtained from the API using a FOR loop (.ejs page).
Now, I want to update just those rows and columns (created by the FOR loop) of my webpage every x seconds instead of refreshing the entire page.
How can I achieve this (updating those rows and columns)?
Consider a stock market website, which updates just the stock prices instead of the entire page.
Thanks for your help in advance.
The most popular way to solve this problem is to store the data you obtain from the other API in your database, then create an endpoint that serves the most recent version of that data as JSON.
On the client side, you then make periodic requests to the server to fetch the content and update a part of the page to show the newest data available.
This could be as simple as:
// server side (if you use express)
app.get("/data", (req, res) => {
  database.getMostRecentEntry()
    .then(data => res.json(data))
    .catch(err => res.status(500).end());
});

// client side
function fetchMostRecentData() {
  fetch("/data")
    .then(response => response.json())
    .then(data => updateView(data))
    .catch(err => showError(err));
}

function updateView(data) {
  let container = document.getElementById("app");
  container.innerHTML = `
    <tr>
      <td>${data.name}</td>
      <td>${data.value}</td>
    </tr>
  `;
}

function showError(err) {
  console.error(err);
  alert("Something went wrong");
}

// call fetchMostRecentData once every 10s
setInterval(fetchMostRecentData, 10000);
Now, this isn't a very robust solution and there are some fairly serious security concerns, but it's a starting point.
From here, you should look into using a frontend framework (rather than updating innerHTML yourself). You could also look at using WebSockets rather than serving the data through an HTTP endpoint.
I'd look into using Express in combination with Node.js to build your website, then use Ajax inside your HTML to handle the REST API call updates.
