node mysql2 "Incorrect arguments" prepared statement error - javascript

This is my function. If I remove the ? placeholders and enter the values manually it executes, so I assume this is how you pass in parameters. Is this correct? If I console.log the params they all look right, so I am assuming the problem is in the way the params are being passed down.
async function getMultiple(page = 1) {
  const offset = helper.getOffset(page, config.listPerPage);
  const rows = await db.query(
    'SELECT id, quote, author FROM quote LIMIT ?,?',
    [offset, config.listPerPage]
  );
  const data = helper.emptyOrRows(rows);
  const meta = { page };
  return {
    data,
    meta
  };
}

module.exports = {
  getMultiple
};

So it turns out that the MySQL version I had installed (8.0.23) has a problem with prepared statements (or handles them differently). I had to downgrade to a version below that and it worked as expected. I downgraded to 5.7.
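For anyone who would rather not downgrade: a workaround widely reported for mysql2 against MySQL 8.0.22+ is to pass the LIMIT/OFFSET arguments as strings rather than numbers. A minimal sketch (the `limitParams` helper is hypothetical; the `db` pool and `config` object are the ones from the question):

```javascript
// Hypothetical helper: coerce LIMIT/OFFSET arguments to strings, a commonly
// reported workaround for "Incorrect arguments" on MySQL 8.0.22+ with mysql2.
function limitParams(offset, count) {
  return [String(offset), String(count)];
}

// Usage sketch, assuming the db pool from the question:
// const rows = await db.query(
//   'SELECT id, quote, author FROM quote LIMIT ?,?',
//   limitParams(offset, config.listPerPage)
// );
```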

Trying to retrieve localUri from assets in Expo MediaLibrary, why does this return null?

I have two questions here, an Expo MediaLibrary question, and a linked Javascript/scoping question.
I have a React Native Expo app that loads recent images from a user's media library. For this I need to use the MediaLibrary.getAssetsAsync() method.
Annoyingly, .getAssetsAsync() does not return the localUri needed to display the image. It returns a uri property which looks like this: "uri": "ph://8A1262C3-23F7-4BD3-8943-C01128DCEEB1"
Unless I'm mistaken, I can't use an asset file uri like this to display images in React; we need localUri. So for each photo I am calling the MediaLibrary.getAssetInfoAsync() method, which returns localUri. Here's how I am accomplishing that:
const selectAlbum = async (albumName) => {
  const foundAlbum = await MediaLibrary.getAssetsAsync({ album: albumName, mediaType: 'photo', first: 20 })
  const assets = foundAlbum['assets'] // array of photos or 'assets'
  const assetsWithInfo = []
  // for each selected photo
  for (let i = 0; i < assets.length; i++) {
    const assetWithInfo = getAssetInfo(assets[i].id)
    // console.log(assetWithInfo) // doesn't work, null etc - WHY?!
    assetsWithInfo.push(assetWithInfo)
  }
}

const getAssetInfo = async (id) => {
  let res = await MediaLibrary.getAssetInfoAsync(id)
  let asset = {
    creationTime: res['creationTime'],
    isFavorite: res['isFavorite'],
    localUri: res['localUri'],
  }
  // console.log(asset) // works, object correctly displays
  return asset
}
My questions are:
Is there a more efficient way to do this i.e. display images from MediaLibrary without having to call so many functions. This feels messy and more complicated than it should be.
In the getAssetInfo async, the asset object is correctly displayed (using the console.log) with all properties. But when I return this object to the selectAlbum function and console.log the result, I get null etc. Why?
For #2, the issue is that you're not awaiting the result of the call: getAssetInfo is async, so it returns a Promise, and you're pushing that pending Promise rather than the resolved asset.
const assetWithInfo = await getAssetInfo(assets[i].id); should fix it.
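For #1, one way to tidy this up (a sketch, not tested against Expo; it assumes the same MediaLibrary import and the getAssetInfo helper from the question) is to map the assets to promises and await them all at once:

```javascript
// Sketch: fetch the info for all assets in parallel with Promise.all,
// instead of awaiting them one at a time in a loop.
// Assumes MediaLibrary from 'expo-media-library' and the getAssetInfo
// helper from the question are in scope.
const selectAlbum = async (albumName) => {
  const foundAlbum = await MediaLibrary.getAssetsAsync({ album: albumName, mediaType: 'photo', first: 20 });
  // Promise.all resolves once every getAssetInfo call has resolved
  const assetsWithInfo = await Promise.all(
    foundAlbum.assets.map((asset) => getAssetInfo(asset.id))
  );
  return assetsWithInfo;
};
```

This also surfaces errors from any single getAssetInfoAsync call as a single rejected promise, which is easier to handle with one try/catch.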

Error: Only replay-protected (EIP-155) transactions allowed over RPC

When I try to send funds from a token address to another I encounter this error: only replay-protected (EIP-155) transactions allowed over RPC
My Code:
const contract = new web3.eth.Contract(ABI, token_contract_address);
const data = contract.methods.transfer(to, req.body.value).encodeABI();
const rawTransaction = {
  'from': from,
  'nonce': web3.utils.toHex(web3.eth.getTransactionCount(from)),
  'gasPrice': web3.utils.toHex(web3.eth.gasPrice),
  'gasLimit': web3.utils.toHex(21000),
  'to': token_contract_address,
  'value': 0,
  'data': data,
  'chainId': web3.utils.toHex(chainid)
};
const privateKey = new Buffer.from(req.body.PrivateKey, 'hex');
const tx = new Tx(rawTransaction);
tx.sign(privateKey);
const serializedTx = tx.serialize();
web3.eth.sendSignedTransaction(('0x' + serializedTx.toString('hex')), req.body.PrivateKey)
  .then(function (result) {
    res.statusCode = 200;
    res.setHeader('Content-Type', 'application/json');
    res.json(result * decimals);
    console.log(result);
  })
  .catch((err) => next(err));
Notice that I have already added chainId.
Here is an excellent example that looks like yours and should definitely work. What I think is happening is that there are already some pending transactions, so your current nonce is causing this error message. Some small steps that might solve your issue and improve your code:
Change value from 0 to 0x00; it should also be hex encoded. Or you could leave it out, since you are also sending data.
Alternatively, you could set value to something like web3.utils.toHex(web3.utils.toWei('123', 'wei')).
Create the nonce with web3.eth.getTransactionCount(account.address, 'pending');. This also includes pending transactions in the count, so that a nonce is not used twice.
And what is the reason for adding req.body.PrivateKey to sendSignedTransaction(..)? Passing just the serialized transaction that you signed before is enough: '0x' + serializedTx.toString('hex').
It is not clear what your current chainId is. Try running your code on another chain to see if that works.
Ropsten = 3
Rinkeby = 4
Finally, update all your dependencies / the tools you are using. Changes in packages may sometimes cause issues like this.
You'll find another excellent example here.
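Putting those suggestions together, the transaction-building part might look something like the sketch below. This is hedged, not a drop-in fix: it assumes the same web3 and Tx (ethereumjs-tx) objects as the question, and the buildSignedTx wrapper is a hypothetical helper. Note that the question's original code also forgot to await the async web3 calls (getTransactionCount and the gas price), so the nonce and gasPrice fields were being set to pending Promises.

```javascript
// Sketch of the suggested fixes. Assumes web3 and the Tx class from
// ethereumjs-tx are in scope; buildSignedTx is a hypothetical helper.
async function buildSignedTx(web3, from, to, data, chainId, privateKeyHex) {
  const rawTransaction = {
    from,
    // count pending transactions too, so a nonce is never reused
    nonce: web3.utils.toHex(await web3.eth.getTransactionCount(from, 'pending')),
    gasPrice: web3.utils.toHex(await web3.eth.getGasPrice()),
    gasLimit: web3.utils.toHex(100000), // 21000 only covers a plain ETH transfer; token transfers need more
    to,
    value: '0x00', // hex-encoded, as suggested above
    data,
    chainId: web3.utils.toHex(chainId),
  };
  const tx = new Tx(rawTransaction);
  tx.sign(Buffer.from(privateKeyHex, 'hex'));
  // sendSignedTransaction only needs this serialized string, not the key
  return '0x' + tx.serialize().toString('hex');
}
```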

Netlify Serverless Function returning 404

I am trying to set up a simple serverless function on Netlify just to test out usage of environment variables. I have defined the following two environment variables in Netlify for my site:
Variable Name         Value
ALPHABET_SEPARATION   2
CHARS_BETWEEN         3
I have also updated my functions directory as follows:
Functions directory: myfunctions
I am using continuous deployment from GitHub. As I do not know how to use npm at present, and find it convenient to test the production deploy directly, I have defined a subdirectory called myfunctions inside my root directory and placed the JavaScript file containing the "serverless" function inside it on my local machine. I have built in logic so that the "serverless" function gets called only when a "netlify" flag is set; otherwise, an alternate function gets executed client-side. Basically it works as follows:
const deploy = "netlify"; // Possible valid values are "local" and "netlify"

async function postRandomString() {
  const stringToUpdate = "THISISATESTSTRING";
  var stringToPost = "DUMMYINITIALVALUE";
  if (deploy === "local") {
    stringToPost = updateString(stringToUpdate); // updateString is a function defined elsewhere and executes client-side
  }
  else if (deploy === "netlify") {
    const config = {
      method: 'GET',
      headers: {
        'Accept': 'application/json',
      }
    };
    const res = await fetch(`myfunctions/serverUpdateString?input=${stringToUpdate}`, config);
    const data = await res.json();
    stringToPost = data.retVal;
    console.log(data.retVal);
  }
  else {
    stringToPost = "##ERROR##";
  }
  postString(stringToPost); // postString is a function defined elsewhere and executes client-side
}
The serverless function file serverUpdateString.js is coded as follows. It sets the character at a certain position in the string (determined by CHARS_BETWEEN) to the alphabetical character a certain number of places (determined by ALPHABET_SEPARATION) after the first character of the string. Don't ask why; the point is that it never even receives/handles the request:
exports.handler = async function (event) {
  const { CHARS_BETWEEN, ALPHABET_SEPARATION } = process.env;
  const charsBetween = CHARS_BETWEEN;
  const alphabetSeparation = ALPHABET_SEPARATION;
  const initString = event.queryStringParameters.input;
  const rootUnicode = initString.charCodeAt(0);
  const finalUnicode = "A".charCodeAt(0) + (rootUnicode - "A".charCodeAt(0) + alphabetSeparation) % 26;
  const finalChar = String.fromCharCode(finalUnicode);
  const stringArray = initString.split("");
  stringArray[charsBetween + 1] = finalChar;
  const stringToReturn = stringArray.join("");
  const response = {
    statusCode: 200,
    retVal: stringToReturn,
  }
  return JSON.stringify(response);
}
When I run it, I get a 404 error for the GET request. The error points at script.js:43, which is the line const res = await fetch(`myfunctions/serverUpdateString?input=${stringToUpdate}`, config); in the calling file, as shown in the first code block above.
What am I doing incorrectly? Surely Netlify should be able to pick up the serverless function file given that I have specified the folder alright and have placed it at the right place in the directory structure? I have given the whole code for completeness but the problem seems quite elementary. Look forward to your help, thanks.
I got assistance from Netlify forums. Basically the following changes needed to be made:
The fetch request -- line 43 in the calling code (script.js) -- needed to be changed to
const res = await fetch(`https://netlifytestserverless.netlify.app/.netlify/functions/serverUpdateString?input=${stringToUpdate}`, config);
The return statement in the lambda function needed to be changed to:
const response = {
  statusCode: 200,
  body: JSON.stringify(stringToReturn),
}
Other minor changes were also needed, such as using parseInt on the environment variables (which arrive as strings).
The code works now.

How to create reusable prepared statements in SQL Server using mssql?

I've been using prepared statements for SQL Server queries, but I don't think I'm using them properly. I want to be able to reuse the queries for different cases that require different parameters, but I'm not sure how to go about it, so I've been creating a new function for every different case.
Similar to this in Java:
http://tutorials.jenkov.com/jdbc/preparedstatement.html
I'm using
https://www.npmjs.com/package/mssql
and the documentation doesn't really show reusing prepared statements:
https://www.npmjs.com/package/mssql#prepared-statement
Instead of SELECT * I want to be able to control which parameters I use without creating a new function every time:
async function SelectByStatusId(statusId) {
  const ps = new sql.PreparedStatement(connectionPool);
  ps.input('statusId', sql.Int);
  const statement = await ps.prepare(`SELECT *
    FROM table1 WHERE statusId = @statusId`);
  const result = await statement.execute({
    statusId: statusId
  });
  await statement.unprepare();
  return result.recordset;
}
When updating, I have to define the parameter types and match the body values to the parameters, so I'm not sure how I can reuse statements when updating or inserting:
async function UpdateTable1(body) {
  const ps = new sql.PreparedStatement(pools.poolDatalupa);
  ps.input('id', sql.Int);
  ps.input('name', sql.NVarChar);
  ps.input('number', sql.Float);
  const statement = await ps.prepare(`UPDATE table1
    SET name=@name,
        number=@number
    WHERE id = @id`);
  const result = await statement.execute({
    id: body.id,
    name: body.name,
    number: body.number
  });
  await statement.unprepare();
  return result;
}
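One pattern that avoids re-preparing for every call is to prepare once and execute many times before unpreparing. A sketch (assuming the same sql import and connection pool as above, and @param placeholders, which is what the mssql package expects; note that mssql holds a pooled connection from prepare() until unprepare(), so keep that window short):

```javascript
// Sketch: prepare the statement once, execute it for many parameter sets,
// then unprepare. Assumes sql (the mssql package) and connectionPool
// from the question are in scope.
async function selectByStatusIds(statusIds) {
  const ps = new sql.PreparedStatement(connectionPool);
  ps.input('statusId', sql.Int);
  await ps.prepare('SELECT * FROM table1 WHERE statusId = @statusId');
  try {
    const results = [];
    for (const statusId of statusIds) {
      // reuse the same prepared statement with different parameter values
      const result = await ps.execute({ statusId });
      results.push(result.recordset);
    }
    return results;
  } finally {
    // always release the connection held by the prepared statement
    await ps.unprepare();
  }
}
```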

Using a cursor with pg-promise

I'm struggling to find an example of using a cursor with pg-promise. node-postgres supports its pg-cursor extension. Is there a way to use that extension with pg-promise? I'm attempting to implement an asynchronous generator (to support for-await-of). pg-query-stream doesn't seem to be appropriate for this use case (I need "pull", rather than "push").
As an example, I use SQLite for my unit tests and my (abridged) generator looks something like this...
async function* () {
  const stmt = await db.prepare(...);
  try {
    while (true) {
      const record = await stmt.get();
      if (isUndefined(record)) {
        break;
      }
      yield record;
    }
  }
  finally {
    stmt.finalize();
  }
}
Using pg-cursor, the assignment to stmt would become something like client.query(new Cursor(...)), stmt.get would become stmt.read(1) and stmt.finalize would become stmt.close.
Thanks
Following the original examples, we can modify them for use with pg-promise:
const pgp = require('pg-promise')(/* initialization options */);
const db = pgp(/* connection details */);
const Cursor = require('pg-cursor');

const c = await db.connect(); // manually managed connection

const text = 'SELECT * FROM my_large_table WHERE something > $1';
const values = [10];
const cursor = c.client.query(new Cursor(text, values));
cursor.read(100, (err, rows) => {
  cursor.close(() => {
    c.done(); // releasing connection
  });
  // or you can just do: cursor.close(c.done);
});
Since pg-promise doesn't support pg-cursor explicitly, one has to manually acquire the connection object and use it directly, as shown in the example above.
pg-query-stream doesn't seem to be appropriate for this use case (I need pull, rather than push).
Actually, in the context of these libraries, both streams and cursors are only for pulling data. So it would be ok for you to use streaming also.
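To get the pull-based for-await-of interface the question asks for, the cursor's callback API can be wrapped in an async generator. A sketch (untested against a live database; it assumes pg-promise's db object and the pg-cursor class, both passed in explicitly here):

```javascript
// Sketch: wrap pg-cursor's callback API in an async generator so callers
// can pull rows with for-await-of. db is a pg-promise database object,
// Cursor is the class from the pg-cursor package.
async function* queryRows(db, Cursor, text, values, batchSize = 100) {
  const c = await db.connect(); // manually managed connection
  const cursor = c.client.query(new Cursor(text, values));
  try {
    while (true) {
      // promisify cursor.read, which uses a node-style callback
      const rows = await new Promise((resolve, reject) =>
        cursor.read(batchSize, (err, r) => (err ? reject(err) : resolve(r)))
      );
      if (rows.length === 0) break; // cursor exhausted
      yield* rows; // hand rows to the consumer one at a time
    }
  } finally {
    // runs even if the consumer breaks out of the loop early
    await new Promise((resolve) => cursor.close(resolve));
    c.done(); // release the connection back to the pool
  }
}

// usage sketch:
// for await (const row of queryRows(db, Cursor, text, values)) { ... }
```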
UPDATE
For reading data in a simple and safe way, check out pg-iterator.
