Consume S3 GetObjectCommand result in Angular (open as PDF) - JavaScript

I am trying to open a file from an S3 bucket in Angular as a PDF. To do this, I have a Node service running which gets the object and which I call from Angular. Then I try to open the response in Angular as a PDF. Is there something I am missing? When I open the PDF, it shows up as a blank (white) document.
Below is my Node code:
const streamToString = (stream) =>
  new Promise((resolve, reject) => {
    const chunks = [];
    stream.on("data", (chunk) => chunks.push(chunk));
    stream.on("error", reject);
    stream.on("end", () => resolve(Buffer.concat(chunks).toString("utf8")));
  });

const readFile = async function getObj(key) {
  const params = {
    Bucket: vBucket,
    Key: key,
  };
  const command = new GetObjectCommand(params);
  const response = await client.send(command);
  const { Body } = response;
  return streamToString(Body);
};
And here is how I consume it in Angular and open it as a PDF.
The service:
getObj(key: String): Observable<any> {
  const httpOptions = {
    'responseType': 'arraybuffer' as 'json'
    //'responseType': 'blob' as 'json' // This also worked
  };
  return this.http.get<any>(environment.s3Ep + '/getfile?key=' + key, httpOptions);
}
And code consuming the service:
this.s3Svc.getObj(key).subscribe(
  res => {
    let file = new Blob([res], { type: 'application/pdf' });
    var fileURL = URL.createObjectURL(file);
    window.open(fileURL);
  }
);

I started experiencing the same issue and found a solution: replace streamToString with streamToBuffer, as follows:
const streamToBuffer = async (stream: Readable): Promise<Buffer> => {
  return new Promise((resolve, reject) => {
    const chunks: Array<any> = []
    stream.on('data', (chunk) => chunks.push(chunk))
    stream.on('error', reject)
    stream.on('end', () => resolve(Buffer.concat(chunks)))
  })
}
and the code that consumes it:
const command = new GetObjectCommand({ Bucket, Key })
const data = await s3.send(command)
const content = await streamToBuffer(data.Body as Readable)
fs.writeFileSync(destPath, content)
In my case I'm writing to a PDF file.
Writing the string returned by streamToString, or writing buffer.toString(), resulted in a blank PDF, because decoding the binary PDF bytes as UTF-8 corrupts them.
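As a side note, newer releases of the AWS SDK v3 (roughly 3.188.0 onwards) attach helpers to the response body that make the hand-rolled stream collection unnecessary. A minimal sketch, assuming a current SDK version:

const response = await client.send(new GetObjectCommand(params));
// transformToByteArray() collects the stream without any string decoding,
// so the binary PDF bytes stay intact.
const bytes = await response.Body.transformToByteArray();
return Buffer.from(bytes);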

Related

Node.js: Download file to memory and pass to library function without saving to disk

The code below works fine, but I received some code-review feedback:
"Why download and save the file to disk, only to read it back into memory?"
However, after spending some hours exploring options with Buffer and stream, I just don't seem to be getting anywhere.
const fs = require('fs');
const { PdfData } = require('pdfdataextract');
const axios = require('axios').default;

const getPDFText = async ({ url }) => {
  const tmpDir = `${process.cwd()}/my_dir`;
  const writer = fs.createWriteStream(`${tmpDir}/document.pdf`);
  const response = await axios({
    url,
    method: 'get',
    responseType: 'stream'
  });
  response.data.pipe(writer);
  const text = await new Promise((resolve, reject) => {
    writer.on('finish', () => {
      const fileData = fs.readFileSync(`${tmpDir}/document.pdf`);
      PdfData.extract(fileData, {
        get: {
          // ...
        },
      })
        .then(resolve)
        .catch(reject);
    });
    writer.on('error', reject);
  });
  return text;
};
How can I avoid saving the file to disk and instead feed it directly into the PdfData.extract method?
The signature for .extract says it accepts a Uint8Array.
Something like
const { PdfData } = require('pdfdataextract');
const axios = require('axios').default;

async function getPDFText({ url }) {
  const response = await axios({
    url,
    method: 'get',
    responseType: 'arraybuffer',
  });
  const pdfUint8Array = new Uint8Array(response.data);
  const res = await PdfData.extract(pdfUint8Array, /* ... */);
  console.log(res);
  return res;
}
could do the trick?
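A quick usage sketch (the URL is a placeholder). Note that in Node, axios with responseType: 'arraybuffer' actually resolves with a Buffer, which is itself a Uint8Array subclass, so the extra new Uint8Array(...) copy is harmless but not strictly required:

// Hypothetical call, assuming the URL points at a reachable PDF:
getPDFText({ url: 'https://example.com/document.pdf' })
  .then((text) => console.log(text))
  .catch(console.error);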

Text -> PNG -> ReadStream, all done on the front-end?

I'm not sure if this is even possible, but here's what I'm trying to do:
Let the user enter some text
Generate a PNG from that text
Upload it to Pinata, which requires it to be in ReadStream format
Do all of this on the front-end
I've managed to accomplish (1) and (2) using html2canvas.
The tricky part is (3). It has to be in ReadStream format because that's the format Pinata's SDK expects:
const fs = require('fs');
const readableStreamForFile = fs.createReadStream('./yourfile.png');
const options = {
  pinataMetadata: {
    name: MyCustomName,
    keyvalues: {
      customKey: 'customValue',
      customKey2: 'customValue2'
    }
  },
  pinataOptions: {
    cidVersion: 0
  }
};
pinata.pinFileToIPFS(readableStreamForFile, options).then((result) => {
  // handle results here
  console.log(result);
}).catch((err) => {
  // handle error here
  console.log(err);
});
I realize that this would be no problem to do on the backend with Node, but I'd like to do it on the front-end. Is that at all possible? Or am I crazy?
I'm specifically using Vue, if that matters.
For anyone interested, the solution ended up being fetch + blob:
const generateImg = async () => {
  const canvas = await html2canvas(document.getElementById('hello'));
  const img = canvas.toDataURL('image/png');
  const res = await fetch(img);
  return res.blob();
};
This blob can then be passed into a more manual version of their SDK:
const uploadImg = (blob: Blob) => {
  const url = `https://api.pinata.cloud/pinning/pinFileToIPFS`;
  const data = new FormData();
  data.append('file', blob);
  const metadata = JSON.stringify({
    name: 'testname',
  });
  data.append('pinataMetadata', metadata);
  const pinataOptions = JSON.stringify({
    cidVersion: 0,
  });
  data.append('pinataOptions', pinataOptions);
  return axios
    .post(url, data, {
      maxBodyLength: 'Infinity' as any, // this is needed to prevent axios from erroring out with large files
      headers: {
        // @ts-ignore
        'Content-Type': `multipart/form-data; boundary=${data._boundary}`,
        pinata_api_key: apiKey,
        pinata_secret_api_key: apiSecret,
      },
    })
    .then((response) => {
      console.log(response);
    })
    .catch((error) => {
      console.log(error);
    });
};
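Putting the two together is then straightforward; a hypothetical glue snippet, assuming apiKey and apiSecret are defined in scope:

// Render the element to a PNG blob, then pin it to IPFS.
const blob = await generateImg();
await uploadImg(blob);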

Resizing images with sharp before uploading to google cloud storage

I am trying to resize or compress an image before uploading it to Google Cloud Storage.
The upload works fine, but the resizing does not seem to work.
Here is my code:
const uploadImage = async (file) => new Promise((resolve, reject) => {
  let { originalname, buffer } = file
  sharp(buffer)
    .resize(1800, 948)
    .toFormat("jpeg")
    .jpeg({ quality: 80 })
    .toBuffer()
  const blob = bucket.file(originalname.replace(/ /g, "_"))
  const blobStream = blob.createWriteStream({
    resumable: false
  })
  blobStream.on('finish', () => {
    const publicUrl = format(
      `https://storage.googleapis.com/${bucket.name}/${blob.name}`
    )
    resolve(publicUrl)
  }).on('error', () => {
    reject(`Unable to upload image, something went wrong`)
  })
  .end(buffer)
})
I ran into the same issue with a project I was working on. After lots of trial and error, I found the following solution. It might not be the most elegant, but it worked for me.
In my upload route I created a new thumbnail image object with the original file's values and passed it as the file parameter to the uploadFile function for Google Cloud Storage.
Inside my upload image route:
const file = req.file;
const thumbnail = {
  fieldname: file.fieldname,
  originalname: `thumbnail_${file.originalname}`,
  encoding: file.encoding,
  mimetype: file.mimetype,
  buffer: await sharp(file.buffer).resize({ width: 150 }).toBuffer()
}
const uploadThumbnail = await uploadFile(thumbnail);
My google cloud storage upload file function:
const uploadFile = async (file) => new Promise((resolve, reject) => {
  const gcsname = file.originalname;
  const bucketFile = bucket.file(gcsname);
  const stream = bucketFile.createWriteStream({
    resumable: false,
    metadata: {
      contentType: file.mimetype
    }
  });
  stream.on('error', (err) => {
    reject(err);
  });
  stream.on('finish', (res) => {
    resolve({
      name: gcsname
    });
  });
  stream.end(file.buffer);
});
I think the problem is with toFormat(); I can't find that function in the docs. Can you try removing it and checking whether it works?
sharp(buffer)
  .resize(1800, 948)
  .jpeg({ quality: 80 })
  .toBuffer()
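Whatever the case with toFormat(), note that the snippet in the question never uses the result of the sharp pipeline: .toBuffer() returns a Promise, and blobStream.end(buffer) uploads the original, unresized buffer. A minimal sketch of that fix, assuming the resized JPEG is what should be uploaded:

// Await the resized buffer, then upload that instead of the original.
const resized = await sharp(buffer)
  .resize(1800, 948)
  .jpeg({ quality: 80 })
  .toBuffer();
blobStream.end(resized);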
Modify the metadata once you have finished uploading the image.
import * as admin from "firebase-admin";
import * as functions from "firebase-functions";
import { log } from "firebase-functions/logger";
import * as sharp from "sharp";

export const uploadFile = functions.https.onCall(async (data, context) => {
  const bytes = data.imageData;
  const bucket = admin.storage().bucket();
  const buffer = Buffer.from(bytes, "base64");
  const bufferSharp = await sharp(buffer)
    .png()
    .resize({ width: 500 })
    .toBuffer();
  const nombre = "IMAGE_NAME";
  const fileName = `img/${nombre}.png`;
  const fileUpload = bucket.file(fileName);
  const uploadStream = fileUpload.createWriteStream();
  uploadStream.on("error", async (err) => {
    log("Error uploading image", err);
    throw new functions.https.HttpsError("unknown", "Error uploading image");
  });
  uploadStream.on("finish", async () => {
    await fileUpload.setMetadata({ contentType: "image/png" });
    log("Upload success");
  });
  uploadStream.end(bufferSharp);
});
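For completeness, a hypothetical client-side invocation of this callable function, assuming the Firebase compat client SDK is initialized and base64String holds the image data:

// Call the 'uploadFile' callable with base64-encoded image data.
const uploadFile = firebase.functions().httpsCallable('uploadFile');
await uploadFile({ imageData: base64String });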

JavaScript Azure Blob Storage move blob

I have a NodeJS backend which uses the official Blob Storage library (@azure/storage-blob) from Microsoft to manage my Blob Storage:
https://www.npmjs.com/package/@azure/storage-blob
I need to move a blob from one folder to another.
Unfortunately I can't find any documentation for that.
What I have so far is:
const { BlobServiceClient } = require("@azure/storage-blob");

const blobServiceClient = BlobServiceClient.fromConnectionString(process.env.storageconnection);
const containerClient = blobServiceClient.getContainerClient('import');
const blobClient = containerClient.getBlobClient('toImport/' + req.body.file);
const downloadBlockBlobResponse = await blobClient.download();
// ... do some stuff with the contents of the file
As you can see in the code, I read a file from the folder "toImport". After that I want to move the file to another folder, "finished". Is that possible? Maybe I need to create a copy of the file and delete the old one?
A move operation as such is not supported in Azure Blob Storage. What you have to do is copy the blob from source to destination, monitor the copy progress (because the copy operation is asynchronous), and delete the source blob once the copy is complete.
For copying, the method you would want to use is beginCopyFromURL(string, BlobBeginCopyFromURLOptions).
Please see this code:
const { BlobServiceClient } = require("@azure/storage-blob");

const connectionString = "DefaultEndpointsProtocol=https;AccountName=account-name;AccountKey=account-key;EndpointSuffix=core.windows.net";
const container = "container-name";
const sourceFolder = "source";
const targetFolder = "target";
const blobName = "blob.png";

async function moveBlob() {
  const blobServiceClient = BlobServiceClient.fromConnectionString(connectionString);
  const containerClient = blobServiceClient.getContainerClient(container);
  const sourceBlobClient = containerClient.getBlobClient(`${sourceFolder}/${blobName}`);
  const targetBlobClient = containerClient.getBlobClient(`${targetFolder}/${blobName}`);
  console.log('Copying source blob to target blob...');
  const copyResult = await targetBlobClient.beginCopyFromURL(sourceBlobClient.url);
  console.log('Blob copy operation started successfully...');
  console.log(copyResult);
  do {
    console.log('Checking copy status...');
    const blobCopiedSuccessfully = await checkIfBlobCopiedSuccessfully(targetBlobClient);
    if (blobCopiedSuccessfully) {
      break;
    }
  } while (true);
  console.log('Now deleting source blob...');
  await sourceBlobClient.delete();
  console.log('Source blob deleted successfully...');
  console.log('Move operation complete.');
}

async function checkIfBlobCopiedSuccessfully(targetBlobClient) {
  const blobPropertiesResult = await targetBlobClient.getProperties();
  const copyStatus = blobPropertiesResult.copyStatus;
  return copyStatus === 'success';
}

moveBlob();
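The status check above polls in a tight loop; in practice you would probably want a short pause between polls. A minimal sketch with a hypothetical delay helper:

// Wait between copy-status polls instead of busy-spinning.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));
// inside the do...while loop, after the status check:
await sleep(1000);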
The previous solution seems to work, but I don't like using an infinite loop.
So here is an alternative way to move a blob file; note that it downloads and re-uploads the blob instead of copying it server-side:
// connectionString, logger and storageUrl are assumed to be defined elsewhere.
const move = async (
  fileName: string,
  src: string,
  dest: string
) => {
  try {
    const blobServiceClient = BlobServiceClient.fromConnectionString(connectionString);
    logger.info(`Move storage file [ ${src} -> ${dest} | ${fileName} ]`);
    const srcContainerClient = blobServiceClient.getContainerClient(src);
    const destContainerClient = blobServiceClient.getContainerClient(dest);
    const blobClient = srcContainerClient.getBlobClient(fileName);
    // Download the source blob into memory...
    const downloadBlockBlobResponse = await blobClient.download();
    const buffer = await streamToBuffer(
      downloadBlockBlobResponse.readableStreamBody!
    );
    // ...upload it to the destination container...
    const blockBlobClient = destContainerClient.getBlockBlobClient(fileName);
    await blockBlobClient.upload(buffer, buffer.length);
    // ...and only delete the source once the upload has succeeded.
    await blobClient.delete();
    return `${storageUrl}/${destContainerClient.containerName}/${fileName}`;
  } catch (e) {
    throw new Error(
      `Fail to move storage file [ ${src} -> ${dest} | ${fileName} ]`
    );
  }
};
const streamToBuffer = async (readableStream: NodeJS.ReadableStream): Promise<Buffer> => {
  return new Promise((resolve, reject) => {
    const chunks: Buffer[] = [];
    readableStream.on("data", (data) => {
      chunks.push(data instanceof Buffer ? data : Buffer.from(data));
    });
    readableStream.on("end", () => {
      resolve(Buffer.concat(chunks));
    });
    readableStream.on("error", reject);
  });
};
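A hypothetical call, assuming both containers already exist:

const newUrl = await move('blob.png', 'source-container', 'target-container');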

How to pass parameter message as text in openpgp JS

What I'm trying to do:
Get some Azure Storage blobs from the container DIDE, encrypt them with RSA 2048, and upload them to another container called encrypted-dide.
These blobs are downloaded through a stream (here Microsoft did a good job: https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-nodejs#upload-blobs-to-a-container) and recomposed by the function streamToString(readableStream).
(I'm not using openpgp JS streams, as I don't know if Microsoft streams are the same as Node.js ones.)
My code works as expected with unencrypted text and uploads blobs into the supposedly encrypted container encrypted-dide.
I have followed the official documentation of openpgp js and some Internet resources.
The error I am getting is: Error: Parameter [message] needs to be of type Message
The publicKey is hardcoded in the file keys.js and is exported like this:
const publicKey = `-----BEGIN PGP PUBLIC KEY BLOCK-----
xsBNBGDgi3gBCADcZqIcczPDAx3+os5cCFVgaoT62Y+5rRvwPGPaPKKA1ajw
7NvoxS0RsJbqYAwNk0IEneoZvgSPpqkehGQCOBdsEhjcEgxVxaSlbbgPJkPh
avjTBDUhr6pIUc+MkSX7eh5LdkgWRSfzZdLXX2es5ycM5R1ZryzPTAenZh7D
l1g1x9TsyX+scI7gAtuyfbzAybYVqYMIvcHYZdIi8m6pGmxIqb0QW6sgO6nG
GyjVgxLDyMnHzYMInFRmKUV8XUUw9ECLZ6ioW4rthmpjoswh9vmP6vWI9OL/
Y7Zb3xY5XnIT6UFSpAHS5V/TNbEUD/EpoNtEI30bUl2X35UM277fUxetABEB
AAHNG0pvbiBTbWl0aCA8am9uQGV4YW1wbGUuY29tPsLAigQQAQgAHQUCYOCL
eAQLCQcIAxUICgQWAAIBAhkBAhsDAh4BACEJEGHAYnRSOf5GFiEExGMJvxnV
v1dXecI0YcBidFI5/kY5PAgAxL10QcUZIrxRXQIrqk04ZDhO4ehMirPqH/KT
L/MeHppHFqV06Fm4JDAOpGyh8lgleLwP4P9Lrod3AVSOKSX48u+UM/Bo4LtG
foAntS+tC9RjWlpR6PZ0aJA8SqHKLCnkaUvz7wv/M55fgGxeeQbhOLutNxN4
L8rCNhPo3UbWwoB+ifgQ9S4bv4kgyJxXYinyGYG0CD67YxQKxiAt58qjsdmK
x4uKCoFbHd1Oa4wfr6ezXet+2hCQvsf8eJV88+qL7TmpSe3ypiTWHNgxymNx
v77SlOkkzayJVWxrWtFU8ZoatlsfOP3A5tToio2rEhCHcnqYl4KtF1a0WUR8
KG+pJc7ATQRg4It4AQgA0Q2uZL9TIqGWtNzeAygdG0C3o+D+MoEYI/Qx0A6X
aZv7/1v84V++lRD0iuIMUlBgFEJWMsHF7cN1EMlUV1lRxOzyKTv+0FqyoSTr
bWexA+jG/Nb3Q8vSX1+eVHvm1+2H7AGhBH2szVmXeH15bGNaOaw03EmG5pCh
CIaCoXYUXKoavsa+C8827lGSuqLs1uRniCmIjQvkQSZg7a0IH6dpMIpxdHPh
h9Zyt8e74WwfmXW/be6cjWRI9FgBzl9U5EQEEVO1JdLvfzEEXkNthyAAhl+P
Z1oTR2PSs4ZUlYdb3MQrt7XoKeEOqCHHxoHB3gsj+75Jnc/aAbM+hb13imAJ
iwARAQABwsB2BBgBCAAJBQJg4It4AhsMACEJEGHAYnRSOf5GFiEExGMJvxnV
v1dXecI0YcBidFI5/kZYSQgAop0OsPV11O/fzbZ+oEabC3Ye9hNGapJQNdmJ
MJkiJg7Hnl1FO4MDtHK5OJ4YePFAqtlKRDIBCALPiN0E2v9+3yAafs6TQDc9
Lg3iIPDOnrXv7J7pv2WPnnue4o8Gkggpa+wEjbQJcUBLX311xJGBG4pSNIVN
FJcsl1fGoaxXB5ANPy/+UNMv0l/7cQWDzSw8V9WH10SO2Q4dQF7Zxw+UgBdb
mRVXWNHkcTs81WA/hYtAuLw0O5Q1QWfbXzlTJGNPy/lMMsxLF6La8fBPHlE0
CjYd4ZH9HgOvpCACjRtbc1jywaZJEisO2aJcO2BaozSzYUmkr5sH2wjSKcMS
nLviCw==
=Wg0i
-----END PGP PUBLIC KEY BLOCK-----`
The code is:
const { BlobServiceClient } = require('@azure/storage-blob');
// const { v1: uuidv1 } = require('uuid');
// const stream = require('stream').promises
const openpgp = require('openpgp');
// import * as openpgp from 'openpgp'
const { publicKey } = require('./keys')
async function main() {
  const AZURE_STORAGE_CONNECTION_STRING = process.env.AZURE_STORAGE_CONNECTION_STRING;
  const blobServiceClient = BlobServiceClient.fromConnectionString(AZURE_STORAGE_CONNECTION_STRING);
  const containerClient = blobServiceClient.getContainerClient("uploadebs");
  const containerEncryptedFiles = blobServiceClient.getContainerClient("encrypted-dide");
  await containerEncryptedFiles.createIfNotExists("encrypted-dide");
  // console.log(await openpgp.readKey({ armoredKey: publicKey })) <- THIS WORKS!
  for await (const blob of containerClient.listBlobsFlat()) {
    if (blob.name.match('^DIDE*')) {
      const blockBlobClient = containerClient.getBlockBlobClient(blob.name);
      const encryptedblockBlobClient = containerEncryptedFiles.getBlockBlobClient(blob.name);
      blockBlobClient.download(0)
        .then(downloadBlockBlobResponse => streamToString(downloadBlockBlobResponse.readableStreamBody))
        .then(blobAsString => openpgp.encrypt({
          message: openpgp.createMessage({ text: blobAsString }), // input as Message object
          publicKeys: openpgp.readKey({ armoredKey: publicKey }),
        }))
        // BELOW LINE SENDS TEXT IN BLOBS, ENCRYPTED OR NOT, THROUGH THE UPLOAD FUNCTION
        .then(encrypted => { encryptedblockBlobClient.upload(encrypted, encrypted.length); });
    }
  }
}

async function streamToString(readableStream) {
  return new Promise((resolve, reject) => {
    const chunks = [];
    readableStream.on("data", (data) => {
      chunks.push(data.toString());
    });
    readableStream.on("end", () => {
      resolve(chunks.join(""));
    });
    readableStream.on("error", reject);
  });
}

main().then(() => console.log('Done')).catch((ex) => console.log(ex.message));
openpgp.createMessage returns a Promise, so you need to call .then on it or add await before it.
The same goes for openpgp.readKey; it also returns a Promise.
For example, from the docs:
const publicKey = await openpgp.readKey({ armoredKey: publicKeyArmored });
const encrypted = await openpgp.encrypt({
  message: await openpgp.createMessage({ text: 'Hello, World!' }), // input as Message object
  publicKeys: publicKey, // for encryption
  privateKeys: privateKey // for signing (optional)
});
EDIT 2:
Without using await, chain the promises and plug the result into the upload step like this:

.then(blobAsString => {
  return Promise.all([openpgp.createMessage({ text: blobAsString }), openpgp.readKey({ armoredKey: publicKey })])
    .then(([message, publicKeys]) => {
      return openpgp.encrypt({
        message,
        publicKeys,
      });
    })
    .then(encrypted => { encryptedblockBlobClient.upload(encrypted, encrypted.length); });
})
