Adonis JS moveToDisk not Working for Multiple Files

The request.files method is great. I can map through multipart files and push them to S3 with the moveToDisk method. However, when moveToDisk returns an error (or anything else goes wrong), it interrupts the loop. I'm pretty sure I have things at least close to set up correctly: I can successfully upload the first file and even store information to my database, but anything beyond one file doesn't work. Is anyone else having trouble with files or moveToDisk?
const sprites = request.files("sprites");

for (let sprite of sprites) {
  const collection = request.all().collection;
  const uuid = uuidv4();

  await client
    .db("pixel-shop")
    .collection("sprites")
    .insertOne({
      collection: collection,
      token: sprite.fileName,
      path: `pixel-shop/${collection}/${uuid}`,
      created_at: new Date(),
    });

  await sprite?.moveToDisk(
    `pixel-shop/sprites/${request.all().collection}`,
    { name: uuid }
  );
}
Some things I've noticed in the logs are a "cannot write file" error (which is odd, because the first file writes fine), something about a .getUrl method, and some messages about SSL, even though everything apart from my localhost is served securely.
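One pattern worth trying (a minimal sketch only, reusing the request, client and uuidv4 from the snippet above, not a confirmed fix) is to wrap each file's work in its own try/catch, so a single failed moveToDisk is logged and skipped instead of aborting the whole loop:

const sprites = request.files("sprites");
const collection = request.all().collection;

for (const sprite of sprites) {
  const uuid = uuidv4();
  try {
    // Move the file first, so a database row is only written for uploads that succeeded.
    await sprite.moveToDisk(`pixel-shop/sprites/${collection}`, { name: uuid });

    await client
      .db("pixel-shop")
      .collection("sprites")
      .insertOne({
        collection: collection,
        token: sprite.fileName,
        path: `pixel-shop/${collection}/${uuid}`,
        created_at: new Date(),
      });
  } catch (error) {
    // Log and move on to the next file instead of letting one failure stop the loop.
    console.error(`Upload failed for ${uuid}:`, error);
  }
}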

Related

Trying to work with POST request using Postman, cosmosDB and NodeJs

I am trying to learn how APIs work. Here I am trying to get the POST method to work. I am using this code to create the document in the database:
app.post('/add', async (req, res) => {
  try {
    const data = require('./test.json');
    const newItemId = Math.floor(Math.random() * 1000 + 10).toString();
    data.id = newItemId;
    data.Partnership_Id = newItemId;

    // for testing purposes only
    let documentDefinition = {
      "id": newItemId,
      "name": "Angus MacGyver",
      "state": "Building stuff"
    };

    // Open a reference to the database
    const dbResponse = await cosmosClient.databases.createIfNotExists({
      id: databaseId
    });
    let database = dbResponse.database;
    const { container } = await database.containers.createIfNotExists({ id: containerId });

    // Add a new item to the container
    console.log("** Create item **");
    const createResponse = await container.items.create(data);

    res.redirect('/');
  } catch (error) {
    console.log(error);
    res.status(500).send("Error with database query: " + error.body);
  }
})
Here I am using test.json for the data input. I am making a fake ID using newItemId for data.id and data.Partnership_Id.
With this approach I can create a document in the database and check it in Postman too, but there is nothing in the Body tab in Postman.
I am confused about this part; I feel like the data for the new document should be passed through the Postman Body rather than me generating newItemId for it.
This might be a silly question to ask, but I am trying to get my head around how APIs work and how to pass data into them.
IDs are almost always auto-generated on the backend (or at least they should be) when creating a database resource, so what you have seems correct. I would recommend using a library like nanoid to generate the IDs though, just to reduce the potential for errors.
It is RESTful convention to return the created data, so in this case you would return the created document as JSON and then handle the redirect etc. on the front end (to ensure complete separation of backend and front end, so that you could, say, host them separately). Your approach is also fine and works, though.
My advice is to think of your backend and frontend as being completely separate; personally, I would have a project for each. That way it is clearer how everything links together.
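As a rough illustration of that advice (a sketch only, reusing the cosmosClient, databaseId and containerId from the question and assuming an Express JSON body parser such as app.use(express.json()) is registered), the route could read the new document from the request body and return the created item:

app.post('/add', async (req, res) => {
  try {
    const { database } = await cosmosClient.databases.createIfNotExists({ id: databaseId });
    const { container } = await database.containers.createIfNotExists({ id: containerId });

    // Take the document from the Postman Body and generate the id on the backend.
    const newItem = { ...req.body, id: Math.floor(Math.random() * 1000 + 10).toString() };

    const { resource: created } = await container.items.create(newItem);

    // RESTful convention: return the created document with a 201 status.
    res.status(201).json(created);
  } catch (error) {
    console.log(error);
    res.status(500).send("Error with database query: " + error.message);
  }
});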

Node.js pg-promise Azure function to write to Postgres (timescaleDB)

Azure is driving me mad again. What I am trying to achieve is that data coming in through an Event Hub gets written to the database. What I have working thus far is that the data arrives at the Event Hub and that the Azure function is able to post data to the database. I would prefer to do this with Node.js, as the integration seems quite nice in Azure. The script I use to send some bogus data to the database is as follows:
module.exports = async function (context, eventHubMessages) {
  const initOptions = {
    query(e) { context.log(e.query) },
    capSQL: true // capitalize all generated SQL
  };

  const pgp = require('pg-promise')(initOptions);

  const db = pgp({
    host: '####',
    user: '####',
    password: '####',
    database: 'iotdemo',
    port: 5432,
    ssl: true
  });

  // our set of columns, to be created only once (statically), and then reused,
  // to let it cache up its formatting templates for high performance:
  const cs = new pgp.helpers.ColumnSet(['customer', 'tag', 'value', 'period'], { table: 'testtable' });

  // generating a multi-row insert query:
  const query = pgp.helpers.insert(JSON.parse(eventHubMessages), cs);
  //=> INSERT INTO "tmp"("col_a","col_b") VALUES('a1','b1'),('a2','b2')

  // executing the query:
  db.none(query);
};
And yes, this is a snippet from somewhere else. The 'eventHubMessages' should contain the payload. A couple of issues that I have had thus far are:
I can send a payload defined within the script, or when giving it a test payload, but I can't send the payload of the actual message.
pg-promise returns a 202 regardless of whether it fails or not, so debugging is 'blind' at the moment. Any tips on how to get proper logging would be much appreciated.
I used 'capture events' in the Event Hub instance to capture the actual messages. These were stored in blob storage. I noticed that the format is Avro. Do I need to peel away at that object to get to the actual array?
The input should look something like this:
[{"customer": duderino, "tag": nice_rug, "value": 10, "period": 163249839}]
I think I have 2 issues:
I don't know how to get meaningful logging out of the Azure function using Node.js.
Something is off about how my payload is coming in.
A deeper question is: how do I know whether the Azure function is getting the data that it should? I know that the Event Grid gets the data, but there is no throughput. The namespaces are consistent, and the Azure Function should be triggered by that namespace and receive the input as a string.
I am seriously lost and out of my depth. Apart from the solution I would also appreciate feedback on my request. I am not a pro on StackOverflow and don't want to waste your time.
Regards
OK, so after some digging I found a few things that resolved the issue. First of all, I was receiving the payload as a string, meaning that I needed to parse it first before I could use it as an object. In terms of code it's simple, and part of the built-in functionality of Node.js:
var parsed_payload = JSON.parse(payload_that_is_a_string);
Lastly, to get meaningful logging I found that the PG-Promise module has great support for that, and that this can be configured when loading the module itself. I was particularly interested in errors, so I enabled that option like so:
const initOptions = {
  query(e) { console.log(e.query) },
  capSQL: true, // capitalize all generated SQL
  error: function (error, e) {
    if (e.cn) {
      // A connection-related error;
      // console.log("DC:", e.cn);
      // console.log("EVENT:", error.message);
    }
  }
};
That then can be used as a settings object for loading PG-Promise:
const pgp = require('pg-promise')(initOptions);
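Putting those two pieces together, a rough sketch of the whole function might look like the following (assumptions: the Event Hub message arrives as a JSON string matching the input format above, and the connection details are filled in):

// Sketch only: combines the string parsing and the error logging described above.
const pgp = require('pg-promise')({
  query(e) { console.log(e.query) }, // log every generated query
  capSQL: true
});

const db = pgp({
  host: '####',
  user: '####',
  password: '####',
  database: 'iotdemo',
  port: 5432,
  ssl: true
});

const cs = new pgp.helpers.ColumnSet(['customer', 'tag', 'value', 'period'], { table: 'testtable' });

module.exports = async function (context, eventHubMessages) {
  try {
    // The payload arrives as a string, so parse it into an array of rows first.
    const rows = JSON.parse(eventHubMessages);

    const query = pgp.helpers.insert(rows, cs);
    await db.none(query); // await it so failures surface in the catch block

    context.log(`Inserted ${rows.length} row(s)`);
  } catch (error) {
    // context.log.error shows up in the Azure Functions log stream
    context.log.error('Insert failed:', error.message);
    throw error;
  }
};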
Thanks for considering my request for help. I hope this proves useful for anyone out there!
Regards Pieter

Is there a way to export and download your Angular front-end web page to pdf using back-end nodeJs's pdfmake?

I'm trying to export a certain page from my Angular/Node.js application using "pdfmake" and have it show up as a download option on the same page, after having heard that the best way to export PDFs is through the back end. After reading the guide and following a tutorial, however, the code writes data to my header but doesn't appear to do anything else.
In the past I've tried following the tutorial below and have read through the method documentation of pdfmake.
https://www.youtube.com/watch?v=0910p09D0Sg
https://pdfmake.github.io/docs/getting-started/client-side/methods/
I'm uncertain whether pdfmake is only supposed to be used by 'headless chrome' (of which I don't possess much knowledge) and wonder if my method can work.
I've also tried using the .download() and .open() functions with pdfMake.createPdf() which resulted in errors.
NodeJs code
router.post('/pdf', (req, res, next) => {
  var documentDefinition = {
    content: [
      'First paragraph',
      'Another paragraph, this time a little bit longer to make sure, this line will be divided into at least two lines'
    ]
  }

  const pdfDoc = pdfMake.createPdf(documentDefinition)

  pdfDoc.getBase64((data) => {
    res.writeHead(200, {
      'Content-Type': 'application/pdf',
      'Content-Disposition': 'attachment;filename="filename.pdf"'
    });

    const download = Buffer.from(data.toString('utf-8'), 'base64');
    res.end(download);
  })
})
Angular code
savePDF() {
  this.api.post('/bookings/pdf')
    .then(res => {
      console.log(res);
    });
}
In this case the savePDF() function is called when the user clicks on a button on the web page.
Because nothing was happening upon clicking the button, I decided to console.log the result, which showed up as a very long string of data.
The pdf document data only contains testdata for now as I was trying to get a download link to work before trying to download the webpage itself.
I can also assure you that there is nothing wrong with the routing and the functions are called properly.
I expected the savePDF() function to start a download of a pdf containing the test data shown in the NodeJs "documentDefinition" content, but the actual result did seemingly nothing.
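One thing worth noting (a sketch of a common pattern, not a confirmed fix for this exact setup): the route already sends back PDF bytes, but the browser will not download anything unless the client turns that response into a file. With Angular's HttpClient that usually looks something like the following, assuming HttpClient is injected as this.http and the backend route is unchanged:

savePDF() {
  this.http.post('/bookings/pdf', {}, { responseType: 'blob' })
    .subscribe(blob => {
      // Wrap the PDF bytes in an object URL and click a temporary link to trigger the download.
      const url = window.URL.createObjectURL(blob);
      const link = document.createElement('a');
      link.href = url;
      link.download = 'filename.pdf';
      link.click();
      window.URL.revokeObjectURL(url);
    });
}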

How to create a Shared Query Folder using the vso-node-api (VSTS)?

In the VSTS REST API, there's a piece of documentation showing me how to create a folder. Specifically, I would like to create a folder within the Shared Queries folder. It seems like I can do this with the REST API.
I would like to do the same thing with the VSTS Node API (vso-node-api). The closest analogous function I can seem to find would be WorkItemTrackingApi.createQuery. Is this the correct function to use?
When I try to use this function, I'm getting an error:
Failed request: (405)
That seems strange, since a "Method Not Allowed" error doesn't seem like the right error here. In other words, I'm not the one deciding which method (GET/POST/etc.) to use; I'm just calling the VSTS Node API's function, which should be using the correct HTTP request method.
I think the error code would/should be different if something about my request is wrong (like providing bad parameters/data).
But, I would not be surprised if VSTS didn't like the data I provided with the request. I wrote the following test function:
async function createQueryFolder (QueryHeirarchyItem, projectId, query) {
  let result = await (WorkItemTrackingApi.createQuery(QueryHeirarchyItem, projectId, query))
  return result
}
I set some variables and called the function:
let projectID = properties.project // A previously set project ID that works in other API calls

let QueryHeirarchyItem = {
  isFolder: true,
  name: 'Test Shared Query Folder 1'
}

try {
  let result = await createQueryFolder(QueryHeirarchyItem, projectID, '')
Notice that I provided a blank string for the query - I have no idea what to provide there when all I want to create is a folder.
So, I think a lot of things could be wrong with my approach here, but also if my request parameters are wrong maybe I should be getting a 400 error? 405 leads me to believe that the VSTS Node API is making a REST call that the underlying VSTS REST API doesn't understand.
For the third parameter of createQueryFolder, you should specify the path of the folder under which you want to create the new folder.
For example, if you want to create a folder named Test Shared Query Folder 1 under Shared Queries, you would call createQueryFolder as:
let result = await createQueryFolder(QueryHeirarchyItem, projectID, 'Shared Queries')
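A compact version of that call (a sketch only, assuming a connected work item tracking client like the WorkItemTrackingApi in the question and the same projectID) might look like:

// The third argument is the parent path; 'Shared Queries' places the folder at the top level.
const folder = {
  name: 'Test Shared Query Folder 1',
  isFolder: true
}

const created = await WorkItemTrackingApi.createQuery(folder, projectID, 'Shared Queries')
console.log(`Created query folder at: ${created.path}`)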

Node.js and mongodb access mongodb

I'm trying to set up MongoDB on Windows 8 using Node.js. Does anyone know why I'm getting this error? C:\users\phill\node_modules\mongodb\lib\mongodb\mongo_client.js:359. It also says, at collection = db.collection, "can't call method 'collection' of null". I'm having a hard time setting it up. My goal is to be able to add to MongoDB and then see or pull up what I added, but adding something is good enough for me for now. I'm trying everything I can find, even straight from the website, and I've tried everything I've seen on here as well.
I think it may be the way I have things set up. My Node.js is saved on my C: drive; there is a folder named program files(86x), and in there I have node_modules, npm and such. The path ends up being computer > windows (C:) > program files(86x) > nodejs. My MongoDB is saved right on my C: drive; the path ends up being windows (C:) > mongodb-win32-x86_64-2008plus-2.4.8. In my C: drive I also created a folder called data, and in it created another called db.
I have been told I should just use Mongoose. I'm just learning, so I'm open to any advice, links, or anything that will help. I have one last question as well: I learned PHP and then found out about SQL injections and things like that, but I'm not seeing anything about security here at all; should I expect the same concerns? For this I get text not defined, but I have been getting errors with everything I have done; the best I managed was getting stuck on a right concern screen.
var MongoClient = require('mongodb').MongoClient;

MongoClient.connect("mongodb://localhost:27017/integration_test", function(err, db) {
  test.equal(null, err);
  test.ok(db != null);

  db.collection("replicaset_mongo_client_collection").update({a:1}, {b:1}, {upsert:true}, function(err, result) {
    test.equal(null, err);
    test.equal(1, result);

    db.close();
    test.done();
  });
});
I tried this as well and am getting an error: C:\users\phill\node_modules\mongodb\lib\mongodb\mongo_client.js:359 ... at collection = db.collection ... can't call method 'collection' of null. I'm running it from the command prompt with node filename.js, and I'm saving it where my Node.js files are; I have pulled up files before and created a server.
var Db = require('mongodb').Db,
    MongoClient = require('mongodb').MongoClient,
    Server = require('mongodb').Server,
    ReplSetServers = require('mongodb').ReplSetServers,
    ObjectID = require('mongodb').ObjectID,
    Binary = require('mongodb').Binary,
    GridStore = require('mongodb').GridStore,
    Grid = require('mongodb').Grid,
    Code = require('mongodb').Code,
    BSON = require('mongodb').pure().BSON,
    assert = require('assert');

var db = new Db('test', new Server('localhost', 27017));

// Fetch a collection to insert document into
db.open(function(err, db) {
  var collection = db.collection("simple_document_insert_collection_no_safe");

  // Insert a single document
  collection.insert({hello:'world_no_safe'});

  // Wait for a second before finishing up, to ensure we have written the item to disk
  setTimeout(function() {
    // Fetch the document
    collection.findOne({hello:'world_no_safe'}, function(err, item) {
      assert.equal(null, err);
      assert.equal('world_no_safe', item.hello);
      db.close();
    })
  }, 100);
});
In your first code example, you said:
For this i get text not defined
I assume you meant "test not defined"? Your script only requires the mongodb library, and I don't believe test is a core Node.js function, so that would explain the error.
Referring to the driver documentation for db.collection(): an assert library is used there, but it is also properly imported (as you did in your second example).
Within your callback to db.open(), you don't check if an error occurred. That might shed some light on why db is null in that function.
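As a quick sketch of that point (same Db/Server setup as the second example), checking err before touching db surfaces the real connection problem instead of the "can't call method 'collection' of null" crash:

db.open(function(err, db) {
  if (err) {
    // If the connection failed, db is null; log the underlying cause and stop here.
    console.error('Failed to open connection:', err);
    return;
  }

  var collection = db.collection("simple_document_insert_collection_no_safe");
  // ... continue with the insert as before
});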
Regarding your question about the equivalent of SQL injection with MongoDB, the main areas of concern are places where you might pass untrusted input into evaluated JavaScript, or where you use such input to construct free-form query objects (not simply using a string, but more like dropping an object into your BSON query); there is a short illustration after the links below. Both of these links should provide more information on the subject:
What type of attacks can be used vs MongoDB?
How does MongoDB address SQL or Query injection?
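As a rough illustration of the query-object case (a hypothetical Express login route, not taken from the question): if user input is dropped straight into the query object, an attacker can send an operator instead of a string:

// Hypothetical route: req.body comes straight from the client.
// Sending { "username": "admin", "password": { "$gt": "" } } makes the
// password clause match any stored value, bypassing the check entirely.
app.post('/login', function(req, res) {
  db.collection('users').findOne({
    username: req.body.username,
    password: req.body.password
  }, function(err, user) {
    // ...
  });
});

// Coercing the values to strings removes the injected operator object:
//   { username: String(req.body.username), password: String(req.body.password) }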
