Node JS live result from MySQL - javascript

I'm trying to show the results from my DB in a table. I want users to see every change made to the table (insert, update or delete) live, without refreshing the page.
My first question is how to fetch my DB data into a table like this. I would also like actions taken by one user (edit, delete, create) to be visible live to the other users.
This is my server.js
var io = require('socket.io'), mysql = require('mysql');
var express = require('express');

var connection = mysql.createConnection({
  host     : 'localhost',
  user     : 'root',
  password : '',
  database : 'lolast'
});

connection.connect(function(err) {
  if (err) {
    console.log('error when connecting to db:', err);
    setTimeout(handleDisconnect, 2000);
  }
});

var ws = io.listen(3000);
var socketCount = 0;

ws.on('connection', function(socket) {
  socketCount++;
  // Let all sockets know how many are connected
  ws.emit('users connected', socketCount);

  socket.on('disconnect', function() {
    // Decrease the socket count on a disconnect, emit
    socketCount--;
    ws.emit('users connected', socketCount);
  });
});
And this is my client side:
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js"></script>
<script src="http://127.0.0.1:3000/socket.io/socket.io.js"></script>
<script>
$(document).ready(function(){
  // Connect to our node/websockets server
  var socket = io.connect('http://localhost:3000');

  // New socket connected, display new count on page
  socket.on('users connected', function(data){
    $('#usersConnected').html('Users connected: ' + data);
  });
});
</script>
<div id="usersConnected"></div>
My user counter works fine and updates live, but I'm having trouble with my SQL data.
I will be very grateful if someone helps me.

The answer is MySQL's replication log (binlog). Have a look at this repo: https://github.com/numtel/mysql-live-select
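As a rough, untested sketch of how that package could feed the socket.io server from the question: the connection settings, the serverId/minInterval options and the table name here are assumptions, so check the repo's README for the exact option names (binary logging also has to be enabled on the MySQL server).
var LiveMysql = require('mysql-live-select');
var io = require('socket.io').listen(3000);

var liveDb = new LiveMysql({
  host     : 'localhost',
  user     : 'root',
  password : '',
  database : 'lolast',
  serverId : 100,    // assumed: unique replication slave id, see the README
  minInterval : 200  // assumed: throttle updates to once per 200 ms
});

// Re-run the query whenever the watched table changes and push the fresh
// rows to every connected client
liveDb.select('SELECT * FROM my_table', [{ table: 'my_table' }])
  .on('update', function(diff, rows) {
    io.sockets.emit('table data', rows);
  });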

This is an incomplete answer but can at least get you started on the problem -
MySQL doesn't have any sort of out-of-the-box support for this. Whatever you do will need to be based on polling, which is not impossible to scale, but requires very smart use of MySQL to scale well.
There are better storage products for this. I think Firebase specializes in this (bought by Google?), and I'm positive there is an AWS product that does this.
Even NoSQL stores like Redis have push support; I think MongoDB does too.
There are probably other middleware solutions. There might be frameworks that will create and maintain the tables themselves. Try searching for language-agnostic frameworks or frameworks in other languages and see if anything points you in the right direction.
The roll-your-own solution looks something like this: every change that should be visible to other users needs to be written to (or tracked in) tables with timestamps, so that clients can query for anything newer than the data they last received. This is hard to scale without serious MySQL chops (but reasonable to scale if you know MySQL well). (Just making these claims from my experience at a MySQL startup.)
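As a rough illustration of that roll-your-own approach (table, column and event names here are made up): each row carries an updated_at timestamp, the server polls for rows newer than the last one it pushed, and forwards them over socket.io.
var mysql = require('mysql');
var connection = mysql.createConnection({ /* same settings as in the question */ });

var lastSeen = new Date(0);

// Poll every two seconds for rows that changed since the last push
setInterval(function() {
  connection.query(
    'SELECT * FROM my_table WHERE updated_at > ? ORDER BY updated_at',
    [lastSeen],
    function(err, rows) {
      if (err || rows.length === 0) return;
      lastSeen = rows[rows.length - 1].updated_at;
      ws.emit('table changes', rows);   // `ws` is the socket.io server from the question
    }
  );
}, 2000);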

I am using Firebase to get live events when a value changes in the database.
For the view, I have hooked Firebase up to Bootstrap's DataTable.
You can see a small example of this in the guide below. Firebase is one of the best solutions for this.
https://www.firebase.com/docs/web/libraries/angular/guide/synchronized-arrays.html
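A minimal sketch with the Firebase web client of that era (the URL, data path and rendering function are placeholders, not part of the original answer): every change under the watched path is pushed to the callback, so the table can be redrawn without polling.
var ref = new Firebase('https://your-app.firebaseio.com/scores');
ref.on('value', function(snapshot) {
  var rows = snapshot.val();   // current contents as a plain object
  redrawTable(rows);           // placeholder: your own rendering function
});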

Related

Node.js pg-promise Azure function to write to Postgres (timescaleDB)

Azure is driving me mad again. What I try to achieve is that the data that comes in through an Event Hub needs to be written to the database. What I got working thus far is that the data arrives at the Event Hub and that the Azure function is able to post data to the database. I would prefer to do this with Node.JS as the integration seems kind of nice in Azure. The script I use to send some bogus data to the database is as follows:
module.exports = async function (context, eventHubMessages) {
  const initOptions = {
    query(e) { context.log(e.query) },
    capSQL: true // capitalize all generated SQL
  };
  const pgp = require('pg-promise')(initOptions);
  const db = pgp({
    host: '####',
    user: '####',
    password: '####',
    database: 'iotdemo',
    port: 5432,
    ssl: true
  });

  // our set of columns, to be created only once (statically), and then reused,
  // to let it cache up its formatting templates for high performance:
  const cs = new pgp.helpers.ColumnSet(['customer', 'tag', 'value', 'period'], {table: 'testtable'});

  // generating a multi-row insert query:
  const query = pgp.helpers.insert(JSON.parse(eventHubMessages), cs);
  //=> INSERT INTO "tmp"("col_a","col_b") VALUES('a1','b1'),('a2','b2')

  // executing the query:
  db.none(query);
};
And yes, this is a snippet from somewhere else. The 'eventHubMessages' argument should contain the payload. A couple of issues I have had thus far:
I can send a payload defined within the script, or send a test payload, but I can't send the payload of the actual message.
pg-promise returns a 202 regardless of whether it fails or not, so debugging is 'blind' at the moment. Any tips on how to get proper logging would be much appreciated.
I used 'capture events' on the Event Hub instance to capture the actual messages. These were stored in blob storage. I noticed that the format is Avro. Do I need to peel away at that object to get to the actual array?
The input should look something like this:
[{"customer": duderino, "tag": nice_rug, "value": 10, "period": 163249839}]
I think I have two issues:
I don't know how to get meaningful logging out of the Azure Function using Node.js.
Something is off about how my payload is coming in.
A deeper question is: how do I know whether the Azure Function is getting the data that it should? I know that the Event Grid gets the data, but there is no throughput. Namespaces are consistent, and the Azure Function should be triggered by that namespace and get the input as a string.
I am seriously lost and out of my depth. Apart from the solution, I would also appreciate feedback on my request. I am not a pro on Stack Overflow and don't want to waste your time.
Regards
OK, so after some digging I found a few things that resolved the issue. First of all, I was receiving the payload as a string, meaning I needed to parse it first before I could use it as an object. In code this is simple and part of Node.js itself:
var parsed_payload = JSON.parse(payload_that_is_a_string);
Lastly, to get meaningful logging, I found that the pg-promise module has great support for that, and that it can be configured when loading the module. I was particularly interested in errors, so I enabled that option like so:
const initOptions = {
  query(e) { console.log(e.query) },
  capSQL: true, // capitalize all generated SQL
  error: function (error, e) {
    if (e.cn) {
      // A connection-related error;
      // console.log("DC:", e.cn);
      // console.log("EVENT:", error.message);
    }
  }
};
That then can be used as a settings object for loading PG-Promise:
const pgp = require('pg-promise')(initOptions);
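Putting the two fixes together, the handler ends up looking roughly like this. This is only a sketch: the connection details are placeholders, the column and table names come from the question, and the extra error/logging lines are my own additions.
const initOptions = {
  query(e) { console.log(e.query); },              // log every generated query
  error(err, e) {
    if (e.cn) console.error('connection error:', err.message);
    else      console.error('query error:', err.message, 'in', e.query);
  },
  capSQL: true
};
const pgp = require('pg-promise')(initOptions);
const db = pgp({ host: '####', user: '####', password: '####', database: 'iotdemo', port: 5432, ssl: true });
const cs = new pgp.helpers.ColumnSet(['customer', 'tag', 'value', 'period'], { table: 'testtable' });

module.exports = async function (context, eventHubMessages) {
  const rows = JSON.parse(eventHubMessages);       // the payload arrives as a string
  const query = pgp.helpers.insert(rows, cs);
  await db.none(query);                            // await so failures show up in the function logs
  context.log('inserted', rows.length, 'row(s)');
};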
Thanks for considering my request for help. I hope this proves useful for anyone out there!
Regards, Pieter

Connecting my React App to a MySQL Database on a PHP Server

I am new to working with databases. I have a React app that I would like to connect to a MySQL database. The server is not mine but provided by a web host (Strato), using PHP.
Is it generally possible to GET and PUT data from React (meaning JavaScript) to this database, the way I normally work with Ajax and a RESTful API?
I really hope this question provides enough information for an answer :)
Here's a code example from https://github.com/numtel/reactive-mysql-example, file index.js. Have a look at that file and settings.js to see how to set up the connection.
var results = liveDb.select('select * from players order by score desc', [ {
  table: 'players'
} ]).on('update', function(diff, data){
  var msg = JSON.stringify({
    type: 'diff',
    data: diff
  });
  // Send change to all clients
  connected.forEach(function(conn){
    conn.write(msg);
  });
});
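On the client side, consuming those messages might look roughly like the sketch below. The endpoint path, transport (SockJS, as suggested by conn.write in the example) and the applyDiff helper are all assumptions; see the repo's index.js for the actual wiring.
var sock = new SockJS('/updates');   // placeholder path; check the example repo
sock.onmessage = function(e) {
  var msg = JSON.parse(e.data);
  if (msg.type === 'diff') {
    applyDiff(msg.data);             // placeholder: merge the diff into local state and re-render
  }
};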

Streaming data from the Server to Client with Meteor:

I'm developing a text-based adventure game with Meteor and I'm running into an issue with how to handle certain elements. Namely, how to emit data from the server to the client without any input from the client.
The idea is that when a player is engaged in combat with a monster, the combat damage and the updating of the Player and Monster objects will occur in a loop on the server. When the player takes damage, the client UI should update accordingly. Is something like this possible with Publish / Subscribe?
I basically want something that sits and listens for events from the server to update the combat log accordingly.
In pseudo-code, this is something along the lines of what I'm looking for:
// Client Side:
Meteor.call("runCommand", "attack monster");

// Server Side
Meteor.methods({
  runCommand: function(input) {
    // Take input, run the loop to begin combat,
    // whenever the user takes damage update the
    // client UI and output a line saying how much
    // damage the player just received and by who
  }
});
I understand that you can publish a collection to the client, but that's not as specific a function as I'm looking for. I don't want to publish the entire Player object to the client; I just want to tell the client to write a line to a textbox saying something like "You were hit for 12 damage by a monster!".
I was hoping there was a function similar to Socket.IO where I could just emit an event to the client telling it to update the UI. I think I can use Socket.IO for this if I need to, but people seemed adamant that something like this was doable entirely with Meteor, without Socket.IO; I just don't really understand how.
The only outs I see are: writing all of the game logic client-side, which feels like a bad idea; writing all of the combat logs to a collection, which seems extremely excessive (but maybe it's not?); or using some sort of Socket.IO-type tool to just emit messages to the client telling it to write a new line to the text box.
Using Meteor, creating a combat log collection seems to be the simplest option you have.
You can listen just for the added event and then clear the collection when the combat is over.
It should look something like this:
var cursor = Combat_Log.find();
var handleCombatLog = cursor.observe({
  added: function (tmp)
  {
    // do your stuff
  }
});
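On the server, whenever damage is dealt, you would insert a line into that collection and the observer above picks it up on every subscribed client. A hedged sketch (the field names are made up):
Combat_Log.insert({
  userId: this.userId,                                  // inside a Meteor method
  text: 'You were hit for 12 damage by a monster!',
  createdAt: new Date()
});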
I asked a similar question here; hope this will help ^^
Here's how I did it without a collection. I think you are right to be concerned about creating one; that would not be a good idea. First, install Streamy.
https://atmospherejs.com/yuukan/streamy
Then on the server
//find active sockets for the user by id
var sockets = Streamy.socketsForUsers(USER_ID_HERE)._sockets;
if (!Array.isArray(sockets) || !sockets.length) {
  //user is not logged in
} else {
  //send event to all open browser windows for the user
  sockets.forEach((socket) => {
    Streamy.emit('ddpEvent', { yourKey: yourValue }, socket);
  });
}
Then in the client, respond to it like this:
Streamy.on('ddpEvent', function(data) {
  console.log("data is ", data); //prints out {yourKey:yourValue}
});

Node.js and mongodb access mongodb

I'm trying to set up MongoDB on Windows 8 using Node.js. Does anyone know why I'm getting this error: C:\users\phill\node_modules\mongodb\lib\mongodb\mongo_client.js:359 ... it also says, at collection = db.collection, "can't call method 'collection' of null". I'm having a hard time setting it up. My goal is to be able to add to MongoDB and then see what I added, or pull up what I added, but adding something is good enough for me for now. I'm trying everything I can find, even straight from the website; I've tried everything I've seen on here as well.
I think it may be the way I have things set up. My Node.js is saved on my C: drive; there is a folder called "program files(86x)", and in there I have node_modules, npm and such. The path ends up being computer > windows (C:) > program files(86x) > nodejs. My MongoDB is saved right on my C: drive; the path ends up being windows (C:) > mongodb-win32-x86_64-2008plus-2.4.8. On C: I also created a folder "data" and inside it another folder "db".
I have been told I should just use Mongoose. I'm just learning, so I'm open to any advice, links, or anything else that will help. One last question: I learned PHP and then found out about SQL injections and the like; I'm not seeing anything about security here at all, so should I expect the same concerns? For this I get "text not defined", but I have been getting errors with everything I have done; the best I managed was getting stuck on a "right concern" screen.
var MongoClient = require('mongodb').MongoClient;
MongoClient.connect("mongodb://localhost:27017/integration_test", function(err, db) {
  test.equal(null, err);
  test.ok(db != null);

  db.collection("replicaset_mongo_client_collection").update({a:1},
    {b:1}, {upsert:true}, function(err, result) {
      test.equal(null, err);
      test.equal(1, result);

      db.close();
      test.done();
  });
});
I tried this as well and got an error: C:\users\phill\node_modules\mongodb\lib\mongodb\mongo_client.js:359 ... at collection = db.collection, "can't call method 'collection' of null". I'm running it from the command prompt with node filename.js and saving it where my Node.js files are; I have pulled up files from there before and created a server.
var Db = require('mongodb').Db,
    MongoClient = require('mongodb').MongoClient,
    Server = require('mongodb').Server,
    ReplSetServers = require('mongodb').ReplSetServers,
    ObjectID = require('mongodb').ObjectID,
    Binary = require('mongodb').Binary,
    GridStore = require('mongodb').GridStore,
    Grid = require('mongodb').Grid,
    Code = require('mongodb').Code,
    BSON = require('mongodb').pure().BSON,
    assert = require('assert');

var db = new Db('test', new Server('localhost', 27017));
// Fetch a collection to insert document into
db.open(function(err, db) {
  var collection = db.collection("simple_document_insert_collection_no_safe");
  // Insert a single document
  collection.insert({hello:'world_no_safe'});

  // Wait for a second before finishing up, to ensure we have written the item to disk
  setTimeout(function() {
    // Fetch the document
    collection.findOne({hello:'world_no_safe'}, function(err, item) {
      assert.equal(null, err);
      assert.equal('world_no_safe', item.hello);
      db.close();
    });
  }, 100);
});
In your first code example, you said:
For this I get "text not defined"
I assume you meant "test not defined"? Your script only requires the mongodb library, and I don't believe test is a core Node.js function, so that would explain the error.
Referring to the driver documentation for db.collection(): an assert library is used there as well, but it is properly imported (as you did in your second example).
Within your callback to db.open(), you don't check if an error occurred. That might shed some light on why db is null in that function.
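For instance, a small sketch of that check (purely illustrative):
db.open(function(err, db) {
  if (err) {
    console.error('failed to open connection:', err);
    return;                 // db is null here, so bail out before using it
  }
  var collection = db.collection("simple_document_insert_collection_no_safe");
  // ... continue as before
});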
Regarding your question about the equivalent of SQL injection with MongoDB, the main areas of concern are places where you might pass untrusted input into evaluated JavaScript, or using such input to construct free-form query objects (not simply using a string, but more like dropping an object into your BSON query). Both of these links should provide more information on the subject:
What type of attacks can be used vs MongoDB?
How does MongoDB address SQL or Query injection?
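To make the query-object concern above concrete, here is the classic illustration (the request fields are purely illustrative): if a value taken straight from the client can be an object such as { "$gt": "" }, a login query matches every user, whereas coercing the input to a string closes that hole.
// Dangerous: req.body.password could be an object such as { "$gt": "" }
collection.findOne({ username: req.body.username, password: req.body.password }, callback);

// Safer: force both inputs to plain strings (or validate them explicitly)
collection.findOne({
  username: String(req.body.username),
  password: String(req.body.password)
}, callback);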

Node.JS Socket.IO sending packets to specific connection IDs

I have been searching for this particular problem for the past week, and since I couldn't find any information on the subject (that wasn't outdated), I just decided to work on other things. But now I am at the point where I need to be able to send data (that I constructed) to specific clients, by their ID, who are connected to my server using Node.js and socket.io. I already store the ID in an object for each new connection. What I need to know is a way to send it only to a connection ID I choose.
Something like: function send(data, client.id) {};
I am using an http server, not TCP.
Is this possible?
edit:
server = http_socket.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/html'});
  res.end(respcont);
  client_ip_address = req.header('x-forwarded-for');
});

socket = io.listen(1337); // listen

//=============================================
// Socket event loop
//=============================================
socket.on ('connection', function (client_connect) {
  var client_info = new client_connection_info(); // this just holds information
  client_info.addNode(client_connect.id, client_connect.remoteAddress, 1); // 1 = trying to connect

  var a = client_info.getNode(client_connect.id,null,null).socket_id; // an object holding the information. this function just gets the socket_id
  client_connect.socket(a).emit('message', 'hi');

  client_connect.on('message', function (data) {
  });

  client_connect.on ('disconnect', function () {
  });
});
Solution: I figured it out by just experimenting... What you have to do is make sure you store the SOCKET, not the socket.id (like I was doing), and use that instead.
client_info.addNode(client_connect.id, client_connect.remoteAddress, client_connect, 1)
var a = client_info.getNode(client_connect.id,null,null,null).socket;
a.emit('message', 'hi');
If you need to do this, the easiest thing to do is to build and maintain a global associative array that maps ids to connections: you can then look up the appropriate connection whenever you need and just use the regular send functions. You'll need some logic to remove connections from the array, but that shouldn't be too painful.
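A hedged sketch of that approach (all names are illustrative):
var clients = {};                          // map socket.id -> socket

io.sockets.on('connection', function(socket) {
  clients[socket.id] = socket;
  socket.on('disconnect', function() {
    delete clients[socket.id];             // drop the entry when the client leaves
  });
});

// Later, send to one specific client by id
function sendTo(id, event, data) {
  var target = clients[id];
  if (target) target.emit(event, data);
}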
Yes, it is possible.
io.sockets.socket(id).emit('message', 'data');
Your solution has one major drawback: scaling. What will you do when your app needs more than one machine? Using internal IDs can also be difficult. I advise using external IDs (like usernames).
Similarly to this post, I advise using rooms (together with Redis to allow scaling). Add a private room for every new socket (based on the user's name, for example). The code may look like this:
socket.join('priv/' + name);
io.sockets.in('priv/' + name).emit('message', { msg: 'hello world!' });
This solution allows multiple machines to emit events to any user. Also it is quite simple and elegant in my opinion.
