Express app POST response to HTML page - javascript

I'm currently working on an IoT project where I'm trying to POST a temperature reading from a DHT11 temperature sensor to a Node server on my local machine. The POST from the Arduino works just fine and it hits my server correctly, and in my server.js I get the sensor reading as well. But how do I print this sensor reading to my HTML page? I'm using a Jade template.
Here is my POST code in server.js
app.post('/temp', function(req, res){
    var sensorReading = req.body.sensorInput;
    console.log("Temp: " + sensorReading); // this prints the input correctly to the console
    res.render('index', {temp: sensorReading}); // but this does nothing
});
And in the Jade template I have
extends layout

block content
    h2 #{pageTitle}
    div
        h1 #{temp}
Can someone help me to figure this out?

You will need to set up a WebSocket connection from your browser page(s) to your server, or have them poll the server to get the most recent reading. The code above simply renders your page back to the Arduino that made the POST, not to your web browser. Also, you will have to keep the sensor reading somewhere more permanent, because right now it lives in a local variable that disappears as soon as the POST handler returns. A sketch of one way to do this follows.
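For example, one common pattern (a sketch only, assuming socket.io has been wired to the same HTTP server; none of this is in the original project) is to keep the latest reading in a module-level variable and push it to connected browsers when the Arduino posts:

var latestTemp = null; // survives between requests, unlike the local variable in the handler

app.post('/temp', function (req, res) {
    latestTemp = req.body.sensorInput;
    io.emit('temp', latestTemp);   // io comes from socket.io attached to the same HTTP server
    res.sendStatus(200);           // a plain acknowledgement is enough for the Arduino
});

app.get('/', function (req, res) {
    // a browser loading the page gets the last known value straight away
    res.render('index', { pageTitle: 'Temperature', temp: latestTemp });
});

On the page, a small client script listening for the 'temp' event can then update the h1 without reloading.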

Related

Using Node.js to access data from a running executable program

I have a large program written in C# using Visual Studio that runs as a standalone executable, complete with an extensive GUI. We want to add a web interface to the program in order to expose most, if not all, of its functionality.
The first step I took was to add a web server to the C# executable. I linked in some web pages (HTML and CSS) and was able to present some data as a test case, and this worked.
We decided that it would be better to treat the standalone executable like a database, where we send GET requests to fetch data and POST requests to store data.
I'm using the Node.js framework with Express as my web server. I have a number of web pages designed, and I'm using JavaScript and some embedded JavaScript to render data on some of the pages.
The issue I'm having is: when a route is selected on one of my web pages, how do I make the .js file handling that route reach out to the executable program (which is running on the same computer at that moment) to both get and set data in it, and then present the result back on the web page?
HTML code (test_data_02.ejs)
<input class="LSNButton1" type="button" style="color:black; background:green;" value="Test Button B" onclick="window.location.href = '/get_test_02b'" />
Javascript (Express) (main.js)
const { json } = require("express");
const express = require("express"),
    app = express(),
    fs = require("fs"),
    homeController = require("./controllers/homeController"),
    errorController = require("./controllers/errorController"),
    data_entry_03 = require("./public/js/data_entry_03"),
    layouts = require("express-ejs-layouts");
app.set("view engine", "ejs");
app.use(layouts);
app.use(express.static("public")); //tell the application to use the corresponding public folder to serve static files
app.use(express.urlencoded({ //assists in reading body contents, parse incoming requests in url format
    extended: false
}));
app.use(express.json()); //assists in reading body contents, parse incoming requests in json format
app.set("port", process.env.PORT || 3000);
console.log("MAIN MAIN MAIN"); //this only happens once on start up - this file is only executed on startup, everything is configured on startup
//MIDDLEWARE CODE
app.use((req, res, next) => { //generic middleware function that will be called before any route is processed
    homeController.middleWareFunction(req, res); //process this function before moving on to the route that was selected
    //console.log(req.query); //from the browser: http://localhost:3000?cart=3&jack=5 results in a request to /?cart=3&jack=5 - how do I make a POST for this?
    console.log(`request made to: ${req.url}`);
    console.log("");
    next();
});
//app.use("/general_data", homeController.readDataFile); //run this function for every request made to the path beginning with this route - it does not imply it is a middleware that will run on this route before the get
//GET ROUTE CODE
app.get("/", homeController.showHome); //a specific route has been identified and the callback function assoicated with the particular route is specified
app.get("/specific_data", homeController.showSpecificData);
app.get("/general_data", homeController.showGeneralData);
app.get("/get_test_02b", homeController.getTest02B);
More Javascript (homeController.js)
I was hoping somewhere in 'getTest02B' or in 'getTestData02B' I would be able to send or request data from the executable program somehow. Maybe using a URL. I would obviously need to add code to my executable to process the requests to either send or receive data. I would like to be able to process the data as JSON, XML, or text data in both directions. I just can't see the mechanism to perform this operation.
exports.getTest02B = (req, res) => {
    console.log("in exports.getTest02B");
    displayInformation(req.url, "URL");
    displayInformation(req.method, "METHOD");
    displayEmptyLine();
    getTestData02B(function (err, data) {
        if (err)
            return res.send(err);
        res.send(data);
    });
};
One comment on this issue was that I needed to make an HTTP request to the C# program. I understand that Node.js allows several connections per server for making HTTP requests.
I'm using the code below and it is not working.
const req = http.request('http://192.168.1.222/ListAlarms.html', (res) => {
console.log('STATUS:${res.statusCode}');
});
req.end;
When I run this code there appears to be no response from the C# program.
From the editing environment where I wrote the code above, if I select the URL and Ctrl+click it, I end up launching another browser with the data displayed, because the C# program is currently running. This implies to me that at least the URL in the http.request statement is good. Just to confirm the URL was working, I ran the C# program in the debugger and was able to break on receiving the URL. So the URL is good, but I just can't seem to access the C# program from the Node.js environment using the same URL. Obviously I'm doing something wrong.
I assume that once I'm able to generate an http.request from the Node.js environment, I will be able to control the flow of data to and from the C# program.
Any input would be appreciated.
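One possible direction, sketched rather than tested: the snippet above never actually sends the request, because req.end is referenced but not called, and the status string uses ordinary quotes instead of backticks, so the template literal is never interpolated; http is also never required. A version that collects the whole response body from the C# program and relays it to the browser (the URL and export name are taken from the question; everything else is an assumption) could look like this:

const http = require("http");

exports.getTest02B = (req, res) => {
    // ask the running C# executable for data over HTTP and relay the answer
    http.get("http://192.168.1.222/ListAlarms.html", (cSharpRes) => {
        let body = "";
        cSharpRes.on("data", (chunk) => { body += chunk; });  // accumulate the response
        cSharpRes.on("end", () => { res.send(body); });       // hand it back to the web page
    }).on("error", (err) => {
        res.status(500).send(err.message);
    });
};

The same idea works for sending data: switch to http.request with method: "POST", write the JSON body, and remember to call req.end().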

Where to put socket.io server in WebStorm Express template

I am trying to add socket.io to an Express server generated by WebStorm. Where am I supposed to set up the server and handle the socket.on events? Should I just put them all inside /bin/www, or is that messy and I am supposed to make a controller for them, like the index and users pages have?
PS: I also have a second quick question. Is it a dumb idea to have the Express web server on the same port as the Socket.IO websocket server? I see that websites use a subdomain to connect to socket.io, so they must be using a different port.
There isn't a single answer to this. But to get some ideas of nice ways to do it, you could download some trusted examples. For example, MEAN.JS uses socket.io, and it is very structured. You may not need everything in the stack, but it's great for getting inspiration on organization. Best of luck!
I'm bringing this post back to life because I was trying to do the same thing.
I tried something and it worked!
I just followed the Socket.io docs and adapted them to this template:
https://socket.io/get-started/chat
Here's what I wrote in the www.js file from the template (I didn't change anything else in this file).
/**
 * Create HTTP server. (This part was in by default)
 */
let server = http.createServer(app);

/**
 * Try Socket.io integration (This is what I've done)
 */
let io = require('socket.io')(server);

io.on('connection', function (socket) {
    console.log("yeet");
});
and here's my layout.pug
doctype html
html
    head
        title= title
        link(rel='stylesheet', href='/stylesheets/style.css')
        script(src='/socket.io/socket.io.js')
    body
        block content
        script.
            let socket = io();
Now when I get my page, I get the log.
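As a small follow-up (a sketch only; the event names here are made up), the same connection can carry data both ways, for example:

// in www.js, server side
io.on('connection', function (socket) {
    socket.emit('news', { status: 'connected' });   // push something to the new client
    socket.on('reply', function (msg) {             // and listen for messages coming back
        console.log('client says:', msg);
    });
});

// in the layout.pug "script." block, browser side:
//     socket.on('news', function (data) { console.log(data); });
//     socket.emit('reply', 'hello');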
Hope it can help other people.

Communication between an express node server and its displaying html

Really fast, this question may have already been asked, but I am not totally clear on the terminology and have had no success when searching for a solution.
I am attempting to create a simple web application that can receive get and post requests. Then, I would like to take the information I receive from these requests and use them in a javascript file embedded in html that is displayed. I think it may be more clear with code. This is my app.js:
var express = require('express');
var app = express();
var assert = require('assert');

app.use(express.static('javascript'));

app.get('/', function(req, res) {
    res.render('index.jade');
});

app.listen(3000);
I would then like to be able to take information received from a get or post request, specifically in JSON format, and then be able to use it in javascript that I have linked to index.jade. To make matters slightly more confusing in index.jade the only line is:
include map.html
So ideally I would be able to get the information to javascript in that html file.
I want to be able to pretty continuously update map.html using javascript based on frequent get and post commands.
This could be a very simple solution, I am pretty new with web application programming in general, and I have been struggling with this problem. Thanks in advance, and ask if you need any clarification.
I think you need to understand what you are doing. You are creating a web server using Node.js; the Express framework simplifies that for you. The official documentation of Express is pretty great. You should refer to it, especially this link.
Now, in your application, you need to create endpoints for GET and POST requests. You have already created one endpoint, "/" (the root location), that loads your home page, which is the contents of the file index.jade. A small example is:
app.get('/json', function (req, res) {
    res.setHeader('Content-Type', 'application/json');
    res.send(JSON.stringify({ a: 1 }));
});
Jade is a templating engine which resolves to HTML. You should refer to its documentation as well. Since, as you said, you are not very familiar with these, you could simply use HTML instead.
After this the tasks are quite simple. In your JavaScript, using ajax calls you can access the GET or POST endpoints of your server and do the desired actions, as in the sketch below.
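For instance, a minimal browser-side call to the /json endpoint above might look like this (plain XMLHttpRequest; fetch works just as well):

// browser-side sketch: ask the server's /json endpoint and use the result
var xhr = new XMLHttpRequest();
xhr.open('GET', '/json');
xhr.onload = function () {
    if (xhr.status === 200) {
        var data = JSON.parse(xhr.responseText); // { a: 1 } from the example route above
        console.log(data.a);                     // update the page/map here instead
    }
};
xhr.send();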
Hope it helps!
The way I understand it, you want to be able to call an endpoint (let's assume /awesome), pass some info, and then write a JavaScript file that you can then reference in your HTML.
Something like the below would write a main.js file in your public folder. You can make it write whatever you want.
var fs = require('fs');

fs.writeFile("/public/main.js", "console.log('Hello!')", function(err) {
    if (err) {
        return console.log(err);
    }
    console.log("The file was saved!");
});
You can then reference /public/main.js in your html.
Check that: Writing files in Node.js
If I misunderstood your intentions, apologies. :)
So you want to continuously receive data from the server; maybe web sockets are an option for you: http://socket.io/docs/
This way you open a persistent connection from the client to the server. This is a great approach if you want to get updates made by other clients.
If you want the changes to be triggered by the client, then ajax calls (http://www.w3schools.com/ajax/), as hkasera mentioned, are the way to go.

Get data from Raspberry PI over Websocket into Web Page?

I am new to web development and I want to build a web page for remotely controlling my Raspberry Pi. The Raspberry Pi has a few sensors attached, and I can get data by sending a request to 192.168.1.100:9997; the code on that side is written in Python. Everything works if I fetch the data with PuTTY, for example. Now I want to establish a TCP connection for reading the data from my web page. I searched for a few days and found that this is possible with WebSockets. There are many tools; the best documented one I found is Node.js. As I understand it, with Node.js it is possible to create WebSockets, and it can also serve the web page (instead of Apache, for example)?
For example, I am running this WebSocket server just for reading data from the RPi in "server.js". Now, how can I get this data from "server.js" into my .html? I didn't find any very basic examples. I can get the data via a database, but that is not what I want. I also want to send a request from my web page to the RPi and then read the answer.
I hope you understand my problem. If you can point me to some good examples or tell me how it should be done, I will be very glad. I want to do this with JavaScript if possible.
Thank you in advance.
EDIT: I now have a working example with Node.js, but I don't know how to hook it into my web page so that the user can trigger this code from the .html and the returned data is shown in the .html page. I hope this helps.
var net = require('net');

var client = new net.Socket();

client.connect(9997, '192.168.1.100', function() {
    console.log('Connected');
    // sending the request
    // THIS SHOULD BE TRIGGERED FROM HTML, onclick for example
    client.write('$DATA');
});

client.on('data', function(data) {
    console.log('Received: ' + data);
    // THIS DATA SHOULD BE SHOWN IN HTML, for example
    //client.destroy(); // kill client after server's response
});

client.on('close', function() {
    console.log('Connection closed');
});
For getting data off a Pi and into a Web page, take a look at some examples doing this using WAMP (an open protocol which runs on top of WebSocket) and Crossbar.io (open source router for WAMP) - http://crossbar.io/iotcookbook/Raspberry-Pi/
Full disclosure: I'm working on these projects - but they are open source, and a great fit for what the OP wants to do.
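An alternative without WAMP, as a rough and untested sketch (it assumes the ws npm package and that the Python service replies on the same TCP socket): run a WebSocket server next to the TCP client, forward the browser's command to the Pi, and relay the reply back:

var net = require('net');
var WebSocketServer = require('ws').Server;

var wss = new WebSocketServer({ port: 8080 });

wss.on('connection', function (browser) {
    // simplest possible layout: one TCP connection to the Pi per browser client
    var pi = new net.Socket();
    pi.connect(9997, '192.168.1.100');

    browser.on('message', function (cmd) {
        pi.write(cmd.toString());         // e.g. '$DATA', triggered by an onclick in the page
    });
    pi.on('data', function (data) {
        browser.send(data.toString());    // show this in the page
    });
    browser.on('close', function () {
        pi.destroy();
    });
});

In the page, new WebSocket('ws://<server>:8080') plus an onmessage handler is enough to display the data.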

Proper way to monitor/control a server remotely over http in realtime

On my client (a phone with a browser) I want to see the server's CPU, RAM & HDD stats and gather info from various logs.
I'm using ajax polling.
On the client every 5 sec (setInterval) I call a PHP file:
scan a folder containing N logs
read the last line of each log
convert that to JSON
Problems:
Open new connection every 5 sec.
Multiple AJAX calls.
Request headers (they are also data and so consume bandwidth)
Response headers (same as above)
Use PHP to read files every 5 sec. even if nothing changed.
The final JSON data is less than 5 KB, but I send it every 5 sec, and there are the headers and a new connection every time, so basically every 5 sec I have to send 5-10 KB to get 5 KB, which makes 10-20 KB.
That is 60 sec / 5 sec = 12 new connections per minute and about 15 MB per hour of traffic if I leave the app open.
Let's say I have 100 users that I let monitor/control my server; that would be around 1.5 GB of outgoing traffic in one hour.
Not to mention that the PHP server is reading multiple files 100 times every 5 sec.
I need something that on the server reads the last lines of those logs every 5 sec and maybe writes them to a file, then I want to push this data to the client only if it's changed.
SSE (server sent events) with PHP
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');

while (true) {
    echo "id: ".time()."\ndata: ".ReadTheLogs()."\n\n";
    ob_flush();
    flush();
    sleep(1);
}
In this case, after the connection is established with the first user, the connection stays open (PHP is not made for that) and so I save some bandwidth (request headers, response headers). This works on my server, but most servers don't allow keeping the connection open for a long time.
Also, with multiple users I read the log multiple times (slowing down my old server).
And I can't control the server... I would need to use ajax to send a command...
I need WebSockets!!!
node.js and websockets
Using node.js, from what I understand, I can do all this without consuming a lot of resources and bandwidth. The connection stays open so there are no unnecessary headers, I can receive and send data, and it handles multiple users very well.
And this is where I need your help:
the node.js server should, in the background, update and store the log data every 5 sec if the files have been modified - or should the operating system do that (iwatch, dnotify...)?
the data should be pushed only if it changed.
the reading of the logs should happen only once every 5 sec, not triggered by each user.
This is the first example I found... and modified:
var ws = require("nodejs-websocket");

var server = ws.createServer(function (conn) {
    var data = read(whereToStoreTheLogs);
    conn.sendText(data); // send the logs data to the user on first connection

    setTimeout(checkLogs, 5000);
    /*
    Here I need to continuously check if the logs have changed,
    but if I use setInterval(checkLogs, 5000) or setTimeout
    every user starts a new timer, and so I end up with lots of timers on the server.
    Can I do that in the background?
    */

    conn.on("text", function (str) {
        doStuff(str); // various commands to control the server
    });

    conn.on("close", function (code, reason) {
        console.log("Connection closed");
    });
}).listen(8001);

var checkLogs = function () {
    var data = read(whereToStoreTheLogs);
    if (data != oldData) {
        conn.sendText(data);
    }
    setTimeout(checkLogs, 5000);
};
The above script would be the notification server, but I also need a way to store the info from those multiple logs somewhere, and to do that in the background every time something changes.
How would you keep the bandwidth low, but also the server resources?
How would you do it?
EDIT
Btw, is there a way to stream this data simultaneously to all the clients?
EDIT
About the logs: I also want to be able to scale the time between updates... I mean, if I read the logs of ffmpeg I need an update every second if possible, but when no conversion is active I only need the basic machine info every 5 min maybe... and so on...
GOALS:
1. A performant way to read & store the log data somewhere (only if clients are connected) [mysql, a file, or is it possible to keep this info in RAM with node.js?].
2. A performant way to stream the data to the various clients (simultaneously).
3. Be able to send commands to the server (bidirectional).
4. Using web languages (js, php...) and Linux commands (something that is easy to implement on multiple machines), free software if needed.
The best approach would be: read the logs, based on current activity, into system memory and stream them simultaneously and continuously, over an already open connection, to the various clients with WebSockets.
I don't know anything that could be faster.
UPDATE
The node.js server is up and running, using the http://einaros.github.io/ws/ WebSocket server implementation, as it appears to be the fastest one.
With the help of @HeadCode I wrote the following code to handle the client situation properly and to keep the load as low as possible, checking various things inside the broadcast loop. Now the pushing & the client handling is at a good point.
var
    wss = new (require('ws').Server)({ port: 8080 }),
    isBusy,
    logs,
    clients,
    i,
    checkLogs = function () {
        if (wss.clients && (clients = wss.clients.length)) {
            isBusy || (logs = readLogs() /*, isBusy = true */);
            if (logs) {
                i = 0;
                while (i < clients) {
                    wss.clients[i++].send(logs);
                }
            }
        }
    };

setInterval(checkLogs, 2000);
But atm I'm using a really bad way to parse the logs (nodejs -> httpRequest -> php), lol. After some googling I found out that I could stream the output of Linux software directly into the nodejs app... I haven't checked it yet, but maybe that would be the best way to do it (a rough sketch of this follows below). node.js also has a filesystem API where I could read the logs, and Linux has its own filesystem API.
The readLogs() function (it can be async) is still something I'm not happy with. Options:
- the nodejs filesystem API?
- piping Linux software output straight into nodejs
- the Linux filesystem API
Keep in mind that I need to scan various folders for logs and then somehow parse the output, and do this every 2 seconds.
ps.: I added isBusy to the server variables in case the log-reading system is async.
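For what it's worth, here is a rough sketch of the "pipe Linux output straight into nodejs" option (the command and log path are placeholders; parsing is left out):

var spawn = require('child_process').spawn;

// hypothetical: follow one log file; stdout arrives in chunks that still need parsing
var tailProc = spawn('tail', ['-F', '/var/log/ffmpeg.log']);

tailProc.stdout.on('data', function (chunk) {
    logs = chunk.toString();   // feed the same `logs` variable the broadcast loop already reads
});
tailProc.stderr.on('data', function (err) {
    console.error(err.toString());
});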
EDIT
The answer is not complete.
Missing:
- A performant way to read, parse and store the logs somewhere (the Linux filesystem API, or the nodejs API, so that I store them directly in system memory).
- An explanation of whether it's possible to stream data directly to multiple users; apparently nodejs loops through the clients and so (I think) sends the data multiple times.
Btw, is it possible/worth it to close the node server when there are no clients and restart it on new connections from the Apache side (e.g. if I connect to the Apache-hosted html file, a script launches the nodejs server again)? Doing so would further reduce the memory leaking... right?
EDIT
After some experimenting with websockets (some videos are in the comments) I learned some new stuff. The Raspberry Pi can use some CPU DMA channels to do high-frequency stuff like PWM... I need to understand how that works somehow.
When using sensors and the like, should I store everything in RAM? Does nodejs already do that (in a variable inside the script)?
WebSocket remains the best choice, as it's now easily accessible from basically any device, simply using a browser.
I haven't used nodejs-websocket, but it looks like it will accept an http connection and do the upgrade as well as creating the server. If all you care about receiving is text/json then I suppose that would be fine, but it seems to me you might want to serve a web page along with it.
Here is a way to use express and socket.io to achieve what you're asking about:
var express = require('express');
var app = express();
var http = require('http').Server(app);
var io = require('socket.io')(http);

app.use(express.static(__dirname + '/'));

app.get('/', function(req, res){
    res.sendfile('index.html');
});

io.on('connection', function(socket){
    // This is where we should emit the cached values of everything
    // that has been collected so far so this user doesn't have to
    // wait for a changed value on the monitored host to see
    // what is going on.
    // This code is based on something I wrote for myself so it's not
    // going to do anything for you as is. You'll have to implement
    // your own caching mechanism.
    for (var stat in cache) {
        if (cache.hasOwnProperty(stat)) {
            socket.emit('data', JSON.stringify(cache[stat]));
        }
    }
});

http.listen(3000, function(){
    console.log('listening on *:3000');
});

var oldData;
(function checkLogs(){
    var data = read(whereToStoreTheLogs);
    if (data != oldData) {
        io.emit('data', data); // the event name matches the client's socket.on('data', ...)
        oldData = data;
    }
    setTimeout(checkLogs, 5000);
})();
Of course, the checkLogs function has to be fleshed out by you. I have only cut and pasted it in here for context. The call to the emit function of the io object will send the message out to all connected users but the checkLogs function will only fire once (and then keep calling itself), not every time someone connects.
In your index.html page you can have something like this. It should be included in the html page at the bottom, just before the closing body tag.
<script src="/path/to/socket.io.js"></script>
<script>
// Set up the websocket for receiving updates from the server
var socket = io();
socket.on('data', function(msg){
// Do something with your message here, such as using javascript
// to display it in an appropriate spot on the page.
document.getElementById("content").innerHTML = msg;
});
</script>
By the way, check out the Nodejs documentation for a variety of built-in methods for checking system resources (https://nodejs.org/api/os.html).
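For example, a few of those built-ins already cover the CPU/RAM side of this question (a small sketch):

var os = require('os');

// basic machine stats, all synchronous and cheap enough to call inside checkLogs()
console.log('load average:', os.loadavg());          // 1, 5 and 15 minute averages
console.log('free memory :', os.freemem(), 'bytes');
console.log('total memory:', os.totalmem(), 'bytes');
console.log('uptime      :', os.uptime(), 'seconds');
console.log('cpu cores   :', os.cpus().length);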
Here's also a solution more in keeping with what it appears you want. Use this for your html page:
<!DOCTYPE HTML>
<html>
<head>
    <meta charset="utf-8">
    <title>WS example</title>
</head>
<body>
<script>
    var connection;
    window.addEventListener("load", function () {
        connection = new WebSocket("ws://" + window.location.hostname + ":8001");
        connection.onopen = function () {
            console.log("Connection opened");
        };
        connection.onclose = function () {
            console.log("Connection closed");
        };
        connection.onerror = function () {
            console.error("Connection error");
        };
        connection.onmessage = function (event) {
            var div = document.createElement("div");
            div.textContent = event.data;
            document.body.appendChild(div);
        };
    });
</script>
</body>
</html>
And use this as your web socket server code, recently tweaked to use the 'tail' module (as found in this post: How to do `tail -f logfile.txt`-like processing in node.js?), which you will have to install using npm (Note: tail makes use of fs.watch, which is not guaranteed to work the same everywhere):
var ws = require("nodejs-websocket")
var os = require('os');
Tail = require('tail').Tail;
tail = new Tail('./testlog.txt');
var server = ws.createServer(function (conn) {
conn.on("text", function (str) {
console.log("Received " + str);
});
conn.on("close", function (code, reason) {
console.log("Connection closed");
});
}).listen(8001);
setInterval(function(){ checkLoad(); }, 5000);
function broadcast(mesg) {
server.connections.forEach(function (conn) {
conn.sendText(mesg)
})
}
var load = '';
function checkLoad(){
var new_load = os.loadavg().toString();
if (new_load === 'load'){
return;
}
load = new_load;
broadcast(load);
}
tail.on("line", function(data) {
broadcast(data);
});
Obviously this is very basic and you will have to change it for your needs.
I made a similar implementation recently using Munin. Munin is a wonderful server monitoring tool, open source too, which also provides a REST API. There are several plugins available for your needs: monitoring CPU, HDD and RAM usage of your server.
You need to build a push notification server. All clients who are listening will then get a push notification when new data is available. See this answer for more information: PHP - Push Notifications
As to how you would update the data, I'd suggest using OS-based tools to trigger a PHP script (command line) that will generate and "push" the json file out to any client currently listening. Any new client logging on to "listen" will get served the current json available, until it's updated.
This way you're not subject to 100 users using 100 connections and however much bandwidth to poll your server every 5 seconds; they only get an update when there actually is one.
How about a service that reads all the log info (via IPMI, Nagios or whatever) and creates the output files on some schedule? Then anyone who wants to connect can just read this output rather than hammering the server logs. Essentially you have one hit on the server logs and everyone else just reads a web page.
This could be implemented pretty easily.
BTW: Nagios has a very nice free edition.
Answering just these bits of your question:
- a performant way to stream the data to the various clients (simultaneously)
- be able to send commands to the server (bidirectional)
- using web languages (js, php...) and Linux commands (something that is easy to implement on multiple machines), free software if needed
I'll recommend the Bayeux protocol as made simple by the CometD project. There are implementations in a variety of languages and it's really easy to use in its simplest form.
Meteor is broadly similar. It's an application development framework rather than a family of libraries, but it solves the same problems.
Some suggestions:
Munin for charts
NetSNMP (used by Munin, but you can also use Bash and Cron to build traps that send SMS texts on alerts)
Pingdom for remote alerts about how well the server is responding to ping and HTTP checks. It can SMS text you or call a phone, as well as have call escalation rules.
