Sails.js: fetch data from two different MySQL databases - javascript

I'm building a REST API using Sails.js v1.x.
I need to connect to two MySQL databases, so I have defined them in the config/datastores.js file like this:
module.exports.datastores = {
  default: {
    adapter: require('sails-mysql'),
    url: 'mysql://root:12345@192.168.0.5:3306/test',
  },
  mysqldb: {
    adapter: require('sails-mysql'),
    url: 'mysql://root:12345@192.168.0.5:3306/test2',
  },
};
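As an aside, with two datastores defined like this, Sails 1.x also lets an individual model point at the non-default connection through its datastore setting. A minimal sketch (the Customer model name is made up purely for illustration):
// api/models/Customer.js (hypothetical model, shown only to illustrate the setting)
module.exports = {
  datastore: 'mysqldb', // use the second connection from config/datastores.js
  attributes: {
    name: { type: 'string' },
  },
};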
In my controller, I have this function, which needs to fetch data by joining tables from the test database as well as test2:
module.exports = {
  index: function (req, res) {
    User.getDatastore().sendNativeQuery("SELECT * FROM test.users u INNER JOIN test2.users t ON u.id = t.id LIMIT 10", function (err, rawResult) {
      res.send(rawResult);
    });
  },
};
But this gives me an error:
{
"code": "ER_NO_SUCH_TABLE",
"errno": 1146,
"sqlMessage": "Table 'test.users' doesn't exist",
}
Also, I have a blank User model, and raw SQL queries execute perfectly when the query is something like SELECT * FROM users (it uses the default database, i.e. test).
How do I achieve this kind of query by connecting to more than one MySQL database in Sails.js?

I was able to solve the issue this way.
My config/datastores.js file:
module.exports.datastores = {
  default: {
    adapter: require('sails-mysql'),
    url: 'mysql://root:12345@192.168.0.5:3306/',
  },
};
My controller:
module.exports = {
  index: function (req, res) {
    User.getDatastore().sendNativeQuery("SELECT * FROM test.users u INNER JOIN test2.users t ON u.id = t.id LIMIT 10", function (err, rawResult) {
      res.send(rawResult);
    });
  },
};
I was having trouble figuring this out at first, as there is no example showing a join across tables from different databases when using raw MySQL queries.
The only change I made was removing the database name from the default connection URL. Now I'm able to access all the databases on this particular server and can also join across multiple databases.
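If you prefer the promise style that Sails 1.x also supports, the same controller can be written with async/await, since sendNativeQuery() returns an awaitable deferred. A minimal sketch of that variant (error handling kept deliberately simple):
module.exports = {
  index: async function (req, res) {
    try {
      // sendNativeQuery() can be awaited in Sails 1.x
      const rawResult = await User.getDatastore().sendNativeQuery(
        "SELECT * FROM test.users u INNER JOIN test2.users t ON u.id = t.id LIMIT 10"
      );
      return res.send(rawResult);
    } catch (err) {
      return res.serverError(err);
    }
  },
};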

Related

Query MariaDB from Node.js without MariaDB caching

I'm trying to query MariaDB from my Node.js app without using the MariaDB caching. I use Sequelize to query the database.
Everything works fine, but I noticed that there is definitely some issue with caching. The SELECT query I use keeps reporting for a long time that a new user does not exist, even though another MariaDB INSERT created that record earlier.
To check for the user I use:
isUserExists: async function (inUser, inChat, inLevel) {
  const iskunde = await kunden.findAll({
    where: {
      kunde: inUser,
      chat: inChat,
      level: inLevel,
      created_at: {
        [Op.gte]: Sequelize.literal("DATE_SUB(NOW(), INTERVAL 15 MINUTE)"),
      },
    },
    raw: true,
  });
  //var currentdate = new Date();
  //console.log(currentdate);
  //console.log(iskunde);
  if (Array.isArray(iskunde) && iskunde.length) {
    return true;
  } else {
    return false;
  }
},
This is becoming a big problem for my app, so I'm trying to find a way to get "real-time" data from the database. From what I've read, a SELECT can be issued with SQL_NO_CACHE, but I can't find any information on how to do this with Sequelize. Any idea?
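Sequelize's query builder has no dedicated option for SQL_NO_CACHE, so one possibility, sketched here only as an assumption about your setup (a sequelize instance in scope and an underlying table actually named kunden), is to drop down to a raw query for this particular check inside isUserExists:
const { QueryTypes } = require('sequelize');

// Raw query that asks MariaDB to bypass its query cache via SQL_NO_CACHE
const rows = await sequelize.query(
  `SELECT SQL_NO_CACHE * FROM kunden
   WHERE kunde = :kunde AND chat = :chat AND level = :level
     AND created_at >= DATE_SUB(NOW(), INTERVAL 15 MINUTE)`,
  {
    replacements: { kunde: inUser, chat: inChat, level: inLevel },
    type: QueryTypes.SELECT,
  }
);
return rows.length > 0;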

Using JOINS with Supabase when no FK is present

I am running into issues with my querying when using Supabase. I have this query, which I can run successfully in DataGrip:
SELECT
  sja.audience_id,
  sja.segment,
  relation,
  sjac.constraint_id,
  sjac.constraint_value,
  sjac.targeting
FROM signal_journey_audience_constraint_relations
JOIN signal_journey_audiences sja ON signal_journey_audience_constraint_relations.audience_id = sja.audience_id
JOIN signal_journey_audience_constraints sjac ON signal_journey_audience_constraint_relations.constraint_id = sjac.constraint_id
But when using Supabase I get an error.
async function getTableData() {
  const { data, error } = await supabase
    .from('signal_journey_audience_constraint_relations')
    .select(`
      audience_id:signal_journey_audiences(audience_id),
      segment:signal_journey_audiences(segment),
      relation,
      constraint_id:signal_journey_audience_constraints(constraint_id),
      constraint_value:signal_journey_audience_constraints(constraint_value),
      targeting:signal_journey_audience_constraints(targeting)
    `);
  if (data) {
    console.log(data);
    setTableData(data);
  } else {
    console.log(error);
  }
}
The error is:
hint: Verify that 'signal_journey_audience_constraint_re…ship was created, try reloading the schema cache.
message: Could not find a relationship between 'signal_journey_audience_constraint_relations' and 'signal_journey_audience_constraints' in the schema cache
I'm confused as to why I can run the query in DataGrip but not in Supabase. I'm 90% sure I just have some syntax issue, but I can't figure it out.
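For context, Supabase's embedded select syntax relies on foreign-key relationships that PostgREST has discovered in its schema cache, so without an FK it cannot resolve the join by itself. One common workaround, shown here only as a sketch under the assumption that you create a Postgres function (the name get_audience_constraints is hypothetical) wrapping the SELECT that works in DataGrip, is to call it through rpc():
async function getTableData() {
  // rpc() calls a Postgres function, so the join logic lives in SQL
  // and PostgREST does not need a foreign-key relationship for it
  const { data, error } = await supabase.rpc('get_audience_constraints');
  if (error) {
    console.log(error);
    return;
  }
  setTableData(data);
}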

Need some help regarding the Elasticsearch engine and React.js

I hope you are doing well. I need some help regarding the Elasticsearch engine. I am trying to create a search engine, and I have successfully posted my data through Kibana to Elasticsearch, but how can I add an Elasticsearch search component to my React app? I have about 4 million records in the index, and when I search from React (through a Node.js API) it takes a long time to display the records in my front-end app. Below is the Node.js code, but the problem with it is that it only gives me 10 records.
router.get('/tweets', (req, res) => {
  let query = {
    index: 'tweets',
    // size: 10000
  };
  if (req.query.tweets) query.q = `*${req.query.tweets}*`;
  client.search(query)
    .then(resp => {
      return res.status(200).json({
        tweets: resp.body.hits.hits
      });
    })
    .catch(err => {
      console.log(err);
      return res.status(500).json({
        err
      });
    });
});
Is there any way to implement an Elasticsearch component directly in my React app, e.g. hitting localhost:9200/index.. directly via the Elasticsearch API?
Your request to Elasticsearch looks a bit strange to me; have you tried searching with a request body as shown in the documentation? This line:
if (req.query.tweets) query.q = `*${req.query.tweets}*`;
doesn't seem like the correct way to write a query. Which field do you want to search?
I saw that you tried to use the size field, which should be correct. You can also try the following:
client.search({
  index: 'tweets',
  body: {
    size: 1000, // you can set the size here to get more than 10 results
    query: {
      wildcard: { yourfield: `*${req.query.tweets}*` }
    }
  }
}, (err, result) => {
  if (err) console.log(err)
})
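If it helps, here is roughly how that search could slot into the Express route from the question, assuming the 7.x JavaScript client (where the response lives under result.body) and with yourfield still a placeholder for the field you actually want to match:
router.get('/tweets', async (req, res) => {
  try {
    const result = await client.search({
      index: 'tweets',
      body: {
        size: 1000,
        query: {
          wildcard: { yourfield: `*${req.query.tweets}*` } // placeholder field name
        }
      }
    });
    // With the 7.x client the hits are wrapped in `body`
    return res.status(200).json({ tweets: result.body.hits.hits });
  } catch (err) {
    console.log(err);
    return res.status(500).json({ err });
  }
});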
You could use SearchKit to query Elasticsearch directly from your React app, but be aware that exposing database services outside of your own infrastructure is bad practice.
You can use the components like this:
import {
  SearchkitManager,
  SearchkitProvider,
  SearchBox,
  Hits
} from 'searchkit'

// host = the URL of your Elasticsearch endpoint
const searchkit = new SearchkitManager(host)

function Table() {
  return (
    <SearchkitProvider searchkit={searchkit}>
      <div>
        <SearchBox searchOnChange={true} />
        <Hits hitsPerPage={50} />
      </div>
    </SearchkitProvider>
  )
}

Getting PUT routes to work in Angular

I'm seeking some wisdom from the Angular community. I am working on a simple project using the MEAN stack. I have set up my back-end API and everything is working as expected. Using Postman, I observe the expected behavior for both the GET and PUT routes that retrieve/update a single value - a high score - which is saved in its own document in its own collection in MongoDB. So far so good.
Where things go off track is when trying to access the PUT API endpoint from within Angular. Accessing the GET endpoint is no problem, and retrieving and displaying data works smoothly. However, after considerable reading and searching, I am still unable to properly access the PUT endpoint and update the high score when that event is triggered by gameplay. Below are the snippets of code that I believe to be relevant, for reference.
BACK-END CODE:
SCHEMA:
const _scoreSchema = {
  name: { type: String, required: true },
  value: { type: Number, "default": 0 }
};
ROUTES:
router
  .route('/api/score/:highScore')
  .put(scoreController.setHighScore);
CONTROLLER:
static setHighScore(req, res) {
  scoreDAO
    .setHighScore(req.params.highScore)
    .then(highScore => res.status(200).json(highScore))
    .catch(error => res.status(400).json(error));
}
DAO:
scoreSchema.statics.setHighScore = (value) => {
  return new Promise((resolve, reject) => {
    score
      .findOneAndUpdate(
        { "name": "highScore" },
        { $set: { "value": value } }
      )
      .exec(function (err, response) {
        err ? reject(err)
            : resolve(response);
      });
  });
};
ANGULAR CODE:
CONTROLLER:
private _updateHighScore(newHighScore): void {
  console.log('score to be updated to:', newHighScore);
  this._gameService
    .updateHighScore(newHighScore);
}
SERVICE:
updateHighScore(newHighScore: Number): Observable<any> {
  console.log(newHighScore);
  let url = '/api/score/' + newHighScore;
  let _scoreStringified = JSON.stringify({ value: newHighScore });
  let headers = new Headers();
  headers.append("Content-Type", "application/json");
  return this._http
    .put(url, _scoreStringified, { headers })
    .map((r) => r.json());
}
Note that the console.log(newHighScore) in the last block of code above correctly prints the value of the new high score to be updated; it's just not being written to the database.
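One thing worth checking, offered only as a guess since the back end clearly works from Postman: with Angular's Http, the Observable returned by put() is cold, so the request is not actually sent until something subscribes to it. A minimal sketch of what subscribing could look like in the component, reusing the method and service names from the snippets above:
private _updateHighScore(newHighScore): void {
  console.log('score to be updated to:', newHighScore);
  this._gameService
    .updateHighScore(newHighScore)
    // subscribing is what actually fires the HTTP request
    .subscribe(
      updatedScore => console.log('high score saved:', updatedScore),
      error => console.error('failed to save high score:', error)
    );
}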
The conceptual question with PUT routes in Angular is this: if the API is already set up such that it receives all the information it needs to successfully update the database (via the route param), why is it required to supply all of this information again in the Angular .put() function? It seems like reinventing the wheel and not really utilizing the robust API endpoint that was already created. Said differently, before digging into the docs, I naively expected something like .put(url) to be all that was required to call the API, so what is the missing link in my logic?
Thanks!

Sequelize get request data in hooks?

I'm trying to store some log data for my models on create, update, and delete calls. I want to store some data from the request, along with some user data that is also in the request (using Express.js).
In the hooks I have some modules for logging.
hooks: {
  afterCreate: function (order, options, done) {
    // How to get user data stored in express request.
    return app.log.set('event', [{ message: 'created', data: order, userId: 1 }, done]);
  }
}
...
The module just makes a record in a table. However, it's the userId part I'm having trouble with. I'm using the Passport module, and the user is stored in the request, so how can I get the user object (or any external object, for that matter) into the model hooks?
I would like to avoid doing it in a controller or anywhere else, as there could be scripts or other commands that also insert data.
I also ran into a similar problem, which I resolved as follows.
First, I declared a global (universal) hook:
module.exports = sequelize.addHook('beforeCreate',
  function (model, options, done) { // hook 2
    // handle what you want
    // return app.log.set('event', [{message: 'created', data: order, userId: 1}, done]);
  });
Then, before calling the model, register a hook (beforeCreate, beforeBulkUpdate, ...) and assign the request as a parameter:
module.exports = {
  CreateUser: function (req, res) {
    User.beforeCreate(function (model, options, done) { // hook 1
      model.request = req;
    });
    User.create({
      id: 1,
      username: 'thanh9999',
      password: '31231233123'
      // ex.....
    })
      .then(function (success) {
        // response success
      }, function (err) {
        // response error
      });
  }
};
The order in which the hooks are called is: hook declared in the model → hook 1 → hook 2.
In addition, you also have to declare hooks for each model.
See the Sequelize documentation on hooks for more information.
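A related pattern, added here only as a sketch and not part of the original answer: Sequelize forwards the options object passed to create() straight into the hooks, so the request (or just the user) can travel that way instead of being attached to the model:
// In the controller: pass the current user along with the create options
User.create(
  { username: 'thanh9999', password: '31231233123' },
  { user: req.user } // custom option, forwarded to hooks untouched
);

// In a hook (global or per model): read it back from options
sequelize.addHook('afterCreate', function (instance, options) {
  const userId = options.user ? options.user.id : null;
  return app.log.set('event', [{ message: 'created', data: instance, userId: userId }]);
});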
