I am building a simple IoT project in which a microcontroller publishes sensor data to an MQTT broker every second on a particular topic. I am using Node.js to build an app that uses the mqtt library to subscribe to this topic and push the sensor data to a Thingspeak channel through their API via a POST request (using the request library).
I have managed to build something that is almost working, but I am facing a slight problem: I am not able to update all the sensor data, since the POST request takes some time and other sensor data is published while it is in flight.
I am thinking of a solution in which I queue the sensor data that needs to be sent via the POST request.
Here is my current code:
var mqtt = require("mqtt")
var request = require("request")
var client = mqtt.connect('hostname')
client.on('connect', function () {
client.subscribe('topic')
})
var postReq = function(data) {
request.post('https://api.thingspeak.com/update', {form: {sensor_data: data}}, function(err,response,body){
console.log(body)
})
}
client.on('message', function (topic, data) {
// message is Buffer
postReq(data)
})
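Roughly, what I have in mind for the queue looks like this (just a sketch reusing the mqtt and request setup above; the queue array, busy flag and drain() helper are names I made up, not in my current code):
var queue = []
var busy = false

// Post the oldest queued reading; when the POST finishes, post the next one.
function drain() {
  if (busy || queue.length === 0) return
  busy = true
  var data = queue.shift()
  request.post('https://api.thingspeak.com/update', {form: {sensor_data: data}}, function (err, response, body) {
    console.log(body)
    busy = false
    drain()
  })
}

client.on('message', function (topic, data) {
  queue.push(data)  // buffer every reading
  drain()           // start posting if no request is in flight
})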
Related
My project works as intended, except that I have to refresh the browser to display anything every time my keyword list sends something to it. I assume it's my inexperience with Express.js and not creating the route correctly within my WebSocket setup? Any help would be appreciated.
Browser
let socket = new WebSocket("ws://localhost:3000");
socket.addEventListener('open', function (event) {
console.log('Connected to WS server')
socket.send('Hello Server!');
});
socket.addEventListener('message', function (e) {
const keywordsList = JSON.parse(e.data);
console.log("Received: '" + e.data + "'");
document.getElementById("keywordsList").innerHTML = e.data;
});
socket.onclose = function(event) {
console.log(event.code, event.reason, 'disconnected');
}
socket.onerror = error => {
console.error('failed to connect', error);
};
Server
const ws = require('ws');
const express = require('express');
const keywordsList = require('./app');
const app = express();
const port = 3000;
const wsServer = new ws.Server({ noServer: true });
wsServer.on('connection', function connection(socket) {
socket.send(JSON.stringify(keywordsList));
socket.on('message', message => console.log(message));
});
// `server` is a vanilla Node.js HTTP server, so use
// the same ws upgrade process described here:
// https://www.npmjs.com/package/ws#multiple-servers-sharing-a-single-https-server
const server = app.listen(port);
server.on('upgrade', (request, socket, head) => {
wsServer.handleUpgrade(request, socket, head, socket => {
wsServer.emit('connection', socket, request);
});
});
In answer to "How to Send and/or Stream array data that is being continually updated to a client" as arrived at in comment.
A possible solution using WebSockets may be to
Create an interface on the server for array updates (if you haven't already) that isolates the array object from arbitrary outside modification and supports a callback when updates are made.
Determine the latency allowed for multiple updates to occur without being pushed. The latency should allow reasonable time for previous network traffic to complete without overloading bandwidth unnecessarily.
When an array update occurs, start a timer for the latency period if one is not already running.
On timer expiry, JSON.stringify the array (to take a snapshot), clear the timer-running status, and message the client with the JSON text.
A slightly more complicated method that avoids delaying every push would be to push single updates immediately unless they occur within a guard period after the most recent push; a timer could then push any modifications made during the guard period once it ends.
Broadcasting
The WebSockets API does not directly support broadcasting the same data to multiple clients. Refer to Server Broadcast in ws documentation for an example of sending data to all connected clients using a forEach loop.
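Putting the latency timer and the broadcast loop together, a minimal sketch could look like the following. It assumes the wsServer instance from the server code above; the latency value, the keywords array, and the helper names are illustrative, not part of the original:
const PUSH_LATENCY_MS = 250;  // assumed latency window
let pushTimer = null;
const keywords = [];          // the array being updated

// Update interface: outside code modifies the array only through this.
function addKeyword(word) {
  keywords.push(word);
  schedulePush();
}

function schedulePush() {
  if (pushTimer) return;      // a push is already pending
  pushTimer = setTimeout(() => {
    pushTimer = null;
    const snapshot = JSON.stringify(keywords);  // snapshot of the array
    wsServer.clients.forEach((client) => {
      if (client.readyState === ws.OPEN) {
        client.send(snapshot);
      }
    });
  }, PUSH_LATENCY_MS);
}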
Client side listener
In the client-side message listener
document.getElementById("keywordsList").innerHTML = e.data;
would be better as
document.getElementById("keywordsList").textContent = keywordList;
to both present keywords after decoding from JSON and prevent them ever being treated as HTML.
So I finally figured out what I wanted to accomplish. It sounds straightforward now that I have learned enough and thought about how to structure the back end of my project.
If you have two WebSockets running and one needs information from the other, you cannot run them side by side. You need to have one encapsulate the other and then call the WebSocket inside the other WebSocket. This can easily cause problems down the road for other projects, since you now have one WebSocket that won't fire until the other has run, but for my project it makes perfect sense, since it is run locally and needs all the parts working 100 percent in order to be effective. It took me a long time to understand how to structure the code this way.
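A rough sketch of that structure (the URLs and handler bodies here are just placeholders): the inner WebSocket is only opened from inside the outer one's handler.
const WebSocket = require('ws');

const outer = new WebSocket('ws://localhost:3000');  // first connection

outer.on('message', (outerData) => {
  // Only once the outer socket has delivered data do we open the
  // inner socket that depends on it.
  const inner = new WebSocket('ws://localhost:4000');

  inner.on('open', () => {
    inner.send(outerData);  // forward what the outer socket produced
  });
});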
BACK STORY:
Let me start with my problem: I need to update a Firebase database from an Arduino, so I used the firebase-arduino library, but for some reason it will not compile for the NodeMCU. So my next approach is a bit complicated: I created a JavaScript page that updates Firebase. I only need to add 1 to a value in the database, so I don't need to upload sensor values or anything; loading the web page updates the value. I thought it would be triggered by an HTTP request from the Arduino, but I was wrong; it does not work like that.
QUESTION: How do I run the JavaScript in a web page with a web request from the Arduino?
Assuming you have Node.js installed, you can have something like this (source):
const https = require('https');
https.get('your_url_here', (resp) => {
let data = '';
// A chunk of data has been received.
resp.on('data', (chunk) => {
data += chunk;
});
// The whole response has been received. Print out the result.
resp.on('end', () => {
console.log(JSON.parse(data).explanation);
});
}).on("error", (err) => {
console.log("Error: " + err.message);
});
But if you don't have Node.js installed, you could make the HTTP request from shell commands like curl. This can be useful since you can run it as a daemon (running in the background every X minutes).
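If you do have Node.js available and just want the same run-every-X-minutes behaviour without a separate daemon, a rough sketch could be the following (the five-minute interval and the URL are placeholders):
const https = require('https');

// Fire the update request every five minutes (placeholder interval).
setInterval(() => {
  https.get('your_url_here', (resp) => {
    resp.on('data', () => {});  // drain the response body
    resp.on('end', () => console.log('update triggered'));
  }).on('error', (err) => console.log('Error: ' + err.message));
}, 5 * 60 * 1000);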
Let me know if you managed something, and good luck.
Is there any way to send the correct response to a specific client when another client has made a different request to the same server at the same time?
This is a code snippet from an exchange server. The function in question comes from a library named "ccxt"; "exchange.fetchMarkets()" makes an API request to a third-party exchange server such as 'bitfinex', 'crex24', 'binance', etc. The issue I am facing is that when one client requests an exchange like 'crex24' at the same time as another client requests a different exchange like 'binance', they both get the same response, as the function is called for the most recently requested exchange.
I want it to give responses according to each client's request, independent of the others.
This is the controller function:
const ccxt = require("ccxt");
// The handler must be async because exchange.fetchMarkets() is awaited.
exports.fetchMarkets = async function (req, res) {
  let API = req.params.exchangeId;
  let exchange = new ccxt[API]();
  if (exchange.has["fetchMarkets"]) {
    try {
      var markets = await exchange.fetchMarkets();
      res.send(markets);
    } catch (err) {
      let error = String(err);
      res.send({ failed: error });
    }
  } else {
    res.send({ loadMarkets: "not available" });
  }
}
This is end point for the server request:
app.route('/markets/:exchangeId')
.get(exchange.fetchMarkets)
Here you can find the ccxt library: https://github.com/ccxt/ccxt/wiki/Manual. It can be added to the project with "npm install ccxt".
I don't see why the code you mentioned wouldn't work the way you are expecting it to. I created a small app and it is working as expected. You can check it here:
https://repl.it/repls/IllfatedStrangeRepo
I am hitting four different requests with different IDs and I am getting different responses.
Hope it clears up the doubts.
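For reference, a test along those lines could look roughly like this (the localhost:3000 address and the exchange ids are assumptions, not taken from the repl):
const http = require('http');

// Helper: GET a path from the local server and parse the JSON body.
function getJSON(path, callback) {
  http.get('http://localhost:3000' + path, (resp) => {
    let body = '';
    resp.on('data', (chunk) => { body += chunk; });
    resp.on('end', () => callback(JSON.parse(body)));
  });
}

// Two overlapping requests for different exchanges; each response should
// match its own exchange id, because the controller builds a fresh ccxt
// instance per request.
getJSON('/markets/binance', (markets) => console.log('binance:', markets.length));
getJSON('/markets/crex24', (markets) => console.log('crex24:', markets.length));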
Is it possible to send a message (for example using alert) to all users when an admin changes something in the database?
Situation: users are browsing car offers and, while they do, an admin changes the price of a few offers --> the users get notifications.
Just couple the event of the database update to an emit like this:
Backend
io.on('connection', (socket) => {
  console.log('A user connected');
  // handling event from the front-end:
  socket.on('clientEvent', function(data) {
    // Database update happens before this
    // io.emit sends the event to every connected client, not just this socket
    io.emit('databaseUpdate', { description: 'Database is updated' });
  });
});
This way, every time a database update happens, a new event will be emitted to all connected users on the frontend. Your frontend can now listen for it as follows (every connected frontend listens for the databaseUpdate events emitted from the backend):
Frontend
var socket = io();
// now we just log the updated data but in this callback you provide your own implementation.
socket.on('databaseUpdate', (data) => console.log(data.description));
Hopefully you find this answer useful; more info here:
Source 1
Source 2
You can use socket.blast() at the end of each db operation.
So, if any user is listening for the blasted message, you can make the API call so that it fetches the new record.
http://node-machine.org/machinepack-sockets/blast
I want to have the following architecture:
A JSON REST API where real time statistic data is pushed to and stored in a Redis server.
A JSON REST API call where any number of clients (native or web) can receive this data after it has been stored - i.e. in real time.
The first client will just be a web app and I may build a native app later.
I'm wondering if my only option is for the clients to poll the REST API for changes? Ideally, I'd like the server to push updates as they arrive so I don't need to manage this polling.
Is my architecture suitable for what I want to achieve, or am I missing something?
A more efficient way than polling is to use WebSockets, via a library such as Faye or Socket.IO. You can place an emit under the data-store event so that data is sent to clients immediately after it has been stored.
With Socket.IO, you'd do that like this:
var io = require('socket.io').listen(80);
//note that you can listen on HTTP servers
//can also be used with Express applications, etc
//when data is stored, run this
io.sockets.emit('event', {
object: 'that is sent to client'
});
You could then use this to tell the client that there is new data, or you could directly send the newly stored data. Custom events can be defined, such as
io.sockets.emit('data_receive', { /* the newly stored data */ });
and would be received client side like so:
var socket = io.connect('http://socket.location');
socket.on('data_receive', function (data) {
//data is whatever sent from server
});
In Faye you'd do something like this:
var http = require('http');
var faye = require('faye');
var bayeux = new faye.NodeAdapter({
mount: '/faye',
timeout: 45
});
bayeux.listen(8000);
Then when data is stored, you'd run:
client.publish('/path', {
text: 'Hello world'
});
Any client that has created a subscription like so:
var client = new Faye.Client('http://socket:port/path');
client.subscribe('/path', function(data) {
alert('Received data: ' + data.text);
});
will receive the data.
You have the option of Node.js and WebSockets for push and pull in real time.
To avoid managing the queuing yourself, you also have the option of a message queue (MQ).