Connecting my React App to a MySQL Database on a PHP Server - javascript

I am new to working with databases. I have a React app that I would like to connect to a MySQL database. The server is not mine but provided by a web hosting company (Strato) and runs PHP.
Is it generally possible to GET and PUT data from React (i.e. JavaScript) to this database the way I normally would with Ajax and a RESTful API?
I really hope this question provides enough information for an answer :)

Here's a code example from https://github.com/numtel/reactive-mysql-example, file index.js. Have a look at that file and settings.js to see how to set up the connection.
var results = liveDb.select('select * from players order by score desc', [ {
  table: 'players'
} ]).on('update', function(diff, data){
  var msg = JSON.stringify({
    type: 'diff',
    data: diff
  });
  // Send change to all clients
  connected.forEach(function(conn){
    conn.write(msg);
  });
});
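As for the original question about plain GET/PUT from React: the React code never talks to MySQL directly. Instead, one or more PHP scripts on the Strato server query the database and return JSON, and the React app calls them like any other REST endpoint. A minimal sketch (the api/players.php endpoint and the payload shape are invented placeholders):
// GET: ask a PHP script (e.g. api/players.php) for rows, returned as JSON
fetch('/api/players.php')
  .then(function (response) { return response.json(); })
  .then(function (players) {
    console.log('players from MySQL via PHP:', players);
  });

// PUT: send data for the PHP script to write into MySQL
fetch('/api/players.php', {
  method: 'PUT',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ name: 'Alice', score: 42 })
}).then(function (response) { return response.json(); });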

Related

websockets and database updates (push on change)

I started to learn websockets today, since I want an architecture with which I can get real-time updates.
I created my first Node.js and Socket.IO websocket app, so I can communicate between a client and a server via JavaScript. That works fine.
But I need something that can communicate with MySQL, so that for every change on a certain table the clients are told that there is a change.
So I was thinking that the Node.js server communicates with a PHP script that observes the database. But then I would also need long polling to request the changes, so I could just do it with Ajax anyway, which would make it pointless.
So my question: how can I get real-time data changes from a table of the database, or from a certain query, which will send the update to all connected clients in real time, without long polling?
Thanks!
How can I get real-time data changes from a table of the database, or from a certain query...
There is a module to watch for MySQL events: mysql-events
Official example:
var MySQLEvents = require('mysql-events');
var dsn = {
  host:     _dbhostname_,
  user:     _dbusername_,
  password: _dbpassword_,
};
var mysqlEventWatcher = MySQLEvents(dsn);
var watcher = mysqlEventWatcher.add(
  'myDB.table.field.value',
  function (oldRow, newRow, event) {
    // row inserted
    if (oldRow === null) {
      // insert code goes here
    }
    // row deleted
    if (newRow === null) {
      // delete code goes here
    }
    // row updated
    if (oldRow !== null && newRow !== null) {
      // update code goes here
    }
    // detailed event information
    //console.log(event)
  },
  'match this string or regex'
);
... which will send the update to all connected clients in real time, without long polling?
You can use socket.io and prevent the initial http polling entirely by doing this on the client:
var socket = io({transports: ['websocket'], upgrade: false});
To prevent clients from falling back to polling, add this line on the server:
io.set('transports', ['websocket']);
And to send the events from mysql-events to all connected Socket.IO clients, use the following:
io.sockets.emit('db-event', eventDataObj);
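Putting the pieces together, the emit can happen inside the mysql-events watcher callback. A rough sketch, assuming the same MySQLEvents API as the official example above (the credentials, the myDB watch string, and the event payload shape are placeholders):
var MySQLEvents = require('mysql-events');

// Socket.IO server restricted to the websocket transport
// (same intent as the io.set('transports', ...) line above)
var io = require('socket.io')(3000, { transports: ['websocket'] });

var mysqlEventWatcher = MySQLEvents({
  host: 'localhost',     // placeholder credentials
  user: 'dbuser',
  password: 'dbpassword'
});

// Watch the database and forward every change to all connected clients
mysqlEventWatcher.add('myDB', function (oldRow, newRow, event) {
  io.sockets.emit('db-event', { oldRow: oldRow, newRow: newRow });
});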

How to pass data between Django module/app functions without using database in asynchronous web service

I've got a web service under development that uses Django and Django Channels to send data across websockets to a remote application. The arrangement is asynchronous, and I pass information between the two by sending JSON-formatted commands across the websocket and then receiving replies back on the same websocket.
The problem I'm having is figuring out how to get the replies back to the JavaScript call in a Django template that invokes the Python function which initiates the JSON websocket question. Since the command question and the data reply happen in different Django areas, and the originating JavaScript/Python function call has no blocking statement, the question and answer are basically disconnected and I can't figure out how to get the results back to the browser.
Right now, my idea is to use Django global variables or to store the results in the Django models. I can get either to work, but I believe the Django global variables would not scale beyond multiple workers from runserver, or if the system was eventually spread across multiple servers.
But since the reply data is for different purposes (for example, a list of users waiting in a remote lobby, current debugging levels in the remote system, etc.), the database option seems unworkable because the reply data has a varying structure. That, plus the replies are temporal and don't need to be permanently stored in the database.
Here's some code showing the flow. I'm open to different implementation recommendations or a direct answer to the question of how to share information between the 2 Django functions.
In the template, for testing, I just have a button defined like this:
<button id="request_lobby">Request Lobby</button>
and a JavaScript function. The function is incomplete, as I've yet to do anything with the response (because I can't figure out how to connect it):
$("#request_lobby").click(function(){
$.ajax({
type: "POST",
url: "{% url 'test_panel_function' %}",
data: { csrfmiddlewaretoken: '{{ csrf_token }}', button:"request_lobby" },
success: function(response){
}
});
});
This is the Django/Python function in views.py. The return channel for the remote application is pre-stored in the database as srv.server_channel when the websocket is initially connected (not shown):
@login_required
def test_panel_function(request):
    button = request.POST.get('button', '')
    if button == "request_lobby":
        srv = Server.objects.get(server_key="1234567890")
        json_res = []
        json_res.append({"COMMAND": "REQUESTLOBBY"})
        message = ({
            "text": json.dumps(json_res)
        })
        Channel(srv.server_channel).send(message)
    return HttpResponse(button)
Later, the remote application sends the reply back on the websocket and it's received by a Django Channels demultiplexer in routing.py:
class RemoteDemultiplexer(WebsocketDemultiplexer):
    mapping = {
        "gLOBBY": "gLOBBY.receive",
    }
    http_user = True
    slight_ordering = True

channel_routing = [
    route_class(RemoteDemultiplexer, path=r"^/server/(?P<server_key>[a-zA-Z0-9]+)$"),
    route("gLOBBY.receive", command_LOBBY),
]
And the consumer.py:
@channel_session
def command_LOBBY(message):
    skey = message.channel_session["server_key"]
    for x in range(int(message.content['LOBBY'])):
        logger.info("USERNAME: " + message.content[str(x)]["USERNAME"])
        logger.info("LOBBY_ID: " + message.content[str(x)]["LOBBY_ID"])
        logger.info("OWNER_ID: " + message.content[str(x)]["IPADDRESS"])
        logger.info("DATETIME: " + message.content[str(x)]["DATETIME"])
So I need to figure out how to get the reply data in command_LOBBY back to the JavaScript/Python function call in test_panel_function.
Current ideas, both of which seem bad, which is why I think I need to ask this question on SO:
1) Use Django global variables:
Define in globals.py:
global_async_result = {}
And include in all relevant Django modules:
from test.globals import global_async_result
In order to make this work, when I originate the initial command in test_panel_function to send to the remote application (the REQUESTLOBBY), I'll include a randomized key in the JSON message, which would be round-tripped back to command_LOBBY, and the global_async_result dictionary would then be indexed by that randomized key.
In test_panel_function, I would wait in a loop, checking a flag for the results to be ready in global_async_result, then retrieve them via the randomized key and delete the entry in global_async_result.
Then the reply can be given back to the JavaScript in the Django template.
That all makes sense to me, but it uses global variables (bad), and it seems it wouldn't scale once the web service is spread across multiple servers.
2) Store replies in a Django MySQL models.py table
I could create a table in models.py to hold the replies temporarily. Since Django doesn't allow dynamic or temporary table creation on the fly, this would have to be a pre-defined table.
Also, because the websocket replies would be in different formats for different questions, I could not know in advance all the fields ever needed, and even if I did, most fields would not be used for any given reply.
My workable idea here is to create the reply table with a field for the randomized key (which is still round-tripped through the websocket) and another large field to just store the JSON reply in its entirety.
Then test_panel_function, which is blocking in a loop waiting for the results, would pull the JSON from the table, delete the row, and decode it. The reply can then be given back to the JavaScript in the Django template.
3) Use Django signals
Django has a signals capability, but the receiver function doesn't seem to be able to be embedded (e.g. inside test_panel_function), and there seems to be no wait() function available for an arbitrary function to simply wait for the signal. If that were available, it would be very helpful.
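A different way to close the loop, sketched here only as an idea: since the randomized key is round-tripped anyway, the browser itself could open a websocket to Django Channels and receive the reply directly, so test_panel_function never has to block at all; command_LOBBY would forward the payload, tagged with that key, to the browser's connection. The client side might look roughly like this (the /panel/ path, the request_id field, and the message shape are all invented for illustration):
// Open a websocket back to Django Channels for this page
var panelSocket = new WebSocket('ws://' + window.location.host + '/panel/');
var pendingRequests = {};  // request_id -> callback

panelSocket.onmessage = function (e) {
  var msg = JSON.parse(e.data);
  var callback = pendingRequests[msg.request_id];
  if (callback) {
    delete pendingRequests[msg.request_id];
    callback(msg.data);  // e.g. the lobby list forwarded by command_LOBBY
  }
};

$("#request_lobby").click(function () {
  var requestId = Math.random().toString(36).slice(2);  // the round-tripped random key
  pendingRequests[requestId] = function (lobby) {
    console.log('lobby reply:', lobby);
  };
  $.post("{% url 'test_panel_function' %}", {
    csrfmiddlewaretoken: '{{ csrf_token }}',
    button: "request_lobby",
    request_id: requestId
  });
});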

Node JS live result from MySQL

I'm trying to show the results from my DB in a table. I want the users to see every change made in the table (insert, update or delete) live, without refreshing the page.
My first question is how to fetch my DB data into a table like this. I would also like actions taken by one user (like edit, delete, create) to be visible live to the other users.
This is my server.js:
var io = require('socket.io'),
    mysql = require('mysql');
var express = require('express');

var connection = mysql.createConnection({
  host     : 'localhost',
  user     : 'root',
  password : '',
  database : 'lolast'
});

connection.connect(function(err) {
  if (err) {
    console.log('error when connecting to db:', err);
    setTimeout(handleDisconnect, 2000);
  }
});

var ws = io.listen(3000);
var socketCount = 0;

ws.on('connection', function(socket) {
  socketCount++;
  // Let all sockets know how many are connected
  ws.emit('users connected', socketCount);
  socket.on('disconnect', function() {
    // Decrease the socket count on a disconnect, emit
    socketCount--;
    ws.emit('users connected', socketCount);
  });
});
And this is my client side:
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js"></script>
<script src="http://127.0.0.1:3000/socket.io/socket.io.js"></script>
<script>
  $(document).ready(function(){
    // Connect to our node/websockets server
    var socket = io.connect('http://localhost:3000');
    // New socket connected, display new count on page
    socket.on('users connected', function(data){
      $('#usersConnected').html('Users connected: ' + data);
    });
  });
</script>
<div id="usersConnected"></div>
My user counter is working fine and shows the result live, but I'm having trouble with my SQL data.
I will be very grateful if someone helps me.
The answer is MySQL's replication log (the binlog). Have a look at this repo: https://github.com/numtel/mysql-live-select
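A rough sketch of how that library could feed the Socket.IO setup above. It follows the same select(...).on('update', ...) pattern shown in the very first snippet on this page; the table name and the extra binlog settings (serverId, minInterval) are illustrative, and the MySQL server needs row-based binary logging enabled for this to work:
var LiveMysql = require('mysql-live-select');

// Same credentials as the mysql connection above, plus binlog settings
var liveDb = new LiveMysql({
  host: 'localhost',
  user: 'root',
  password: '',
  database: 'lolast',
  serverId: 34,
  minInterval: 200
});

// Re-run the query whenever the underlying table changes and
// push the fresh rows to every connected socket
liveDb.select('SELECT * FROM mytable', [ { table: 'mytable' } ])
  .on('update', function (diff, data) {
    ws.emit('table-update', data);
  });
On the client, a matching socket.on('table-update', ...) handler would rebuild the HTML table from the received rows.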
This is an incomplete answer, but it can at least get you started on the problem.
MySQL doesn't have any sort of out-of-the-box support for this. Whatever you do will need to be based on polling, which is not impossible to scale, but it requires very smart use of MySQL to scale well.
There are better storage products for this. I think Firebase specializes in it (bought by Google?) and I'm positive there is an AWS product that does this.
Even NoSQL stores like Redis have push support; I think MongoDB does too.
There are probably other middleware solutions. There might be frameworks that will create and maintain the tables themselves. Try searching for language-agnostic frameworks, or frameworks in other languages, and see if anything points you in the right direction.
The roll-your-own solution will look something like this: all of your queries that should be visible to other users will need to go into tables, or be tracked in tables, with timestamps, so that your clients can poll for anything changed since they last received data. This is hard to scale without serious MySQL chops, but reasonable to scale if you know MySQL well (just making these claims from my experience at a MySQL startup).
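As a hedged sketch of that roll-your-own idea (the tracked_changes table and its columns are invented; connection is the mysql connection from the question's server.js): the server keeps a timestamped change table, and each client polls with the newest timestamp it has seen, so only new changes come back.
// Poll for rows changed since the timestamp the client last reported
function fetchChangesSince(lastSeen, callback) {
  connection.query(
    'SELECT id, payload, updated_at FROM tracked_changes WHERE updated_at > ? ORDER BY updated_at',
    [lastSeen],
    function (err, rows) {
      if (err) return callback(err);
      callback(null, rows);  // client stores the max updated_at for its next poll
    }
  );
}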
I am using Firebase for getting live events when a value changes in the database.
From the view perspective, I have hooked Firebase up to Bootstrap's DataTable.
You can see a small example of that here. Firebase is one of the best solutions for this.
https://www.firebase.com/docs/web/libraries/angular/guide/synchronized-arrays.html

Make sure SQLite db is open before querying

I am currently working on a desktop app with Adobe AIR, JS, and a SQLite DB. I have an issue where sometimes the DB opens too slowly and the first query is then not successful.
I am using an async connection, and I made a few functions that handle each type of query, like:
function CoreExecuteSelect(sql, selectStmt, callbackFunction) {
  openConnection();
  air.trace(sql);
  selectStmt.sqlConnection = sqlConnection;
  selectStmt.text = sql;
  selectStmt.addEventListener(air.SQLEvent.RESULT, callbackFunction);
  selectStmt.addEventListener(air.SQLErrorEvent.ERROR, onDatabaseError1);
  selectStmt.execute();
}
So I believe that sometimes openConnection() has not yet finished when selectStmt.execute() runs, which then causes trouble because I don't get data back. What would be the best way for me to wait for the DB to be open before sending the first queries?
Thanks for your help.
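One possible approach, sketched under the assumption that openConnection() calls sqlConnection.openAsync() and that onDatabaseError1 is the existing error handler: prepare the statement, then only call execute() once the connection is actually open, either immediately (if it already is) or from a one-shot listener for the connection's OPEN event.
function CoreExecuteSelect(sql, selectStmt, callbackFunction) {
  // Prepare the statement first
  air.trace(sql);
  selectStmt.sqlConnection = sqlConnection;
  selectStmt.text = sql;
  selectStmt.addEventListener(air.SQLEvent.RESULT, callbackFunction);
  selectStmt.addEventListener(air.SQLErrorEvent.ERROR, onDatabaseError1);

  if (sqlConnection.connected) {
    // Connection is already open: run the query right away
    selectStmt.execute();
  } else {
    // Wait for the asynchronous open to finish before executing
    sqlConnection.addEventListener(air.SQLEvent.OPEN, function onOpen(event) {
      sqlConnection.removeEventListener(air.SQLEvent.OPEN, onOpen);
      selectStmt.execute();
    });
    openConnection(); // assumed to call sqlConnection.openAsync(...)
  }
}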

How to send data to specific user using Socket.io?

I use node.js and socket.io.
I want to send data from the server to the client and display it to the user with ID = 1.
How to send data to specific user?
Haven't used Node in a while, so this may not be 100%. Try this on the server:
var id = '<session_id_of_specific_user>',
    io = require('/var/websites/lib/socket.io'),
    user = io.clients[id];
Then, to send, do:
var data = { some: 'data' };
user.send(data);
Hope that's useful. If you tell me more accurately what you need the code to do I could maybe be more helpful.
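The io.clients[id] lookup above comes from an old Socket.IO API. On more recent versions a common pattern (sketched here; how the user id reaches the server, e.g. via the handshake query, is an assumption) is to join each socket to a per-user room and emit to that room:
// Server: put each socket in a room named after its user id
io.on('connection', function (socket) {
  var userId = socket.handshake.query.userId; // assumed to be sent by the client
  if (userId) {
    socket.join('user:' + userId);
  }
});

// Later, to send data only to the user with ID = 1:
io.to('user:1').emit('private-data', { some: 'data' });
On the client, the page would send its user id when connecting (for example in the connection query string) and listen with a socket.on('private-data', ...) handler.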

Categories

Resources