Write a csv file to AWS S3 Fails - javascript

I have this TypeScript code that writes a CSV file to AWS S3. It works fine locally, but recently I started getting an error saying:
s3 upload error unsupported body payload object
NOTES:
I'm not passing credentials because the code runs on EC2 alongside AWS S3, which is why I don't need to pass them explicitly.
I'm printing all the params I'm reading/passing, and they are read properly.
Here is the code:
public async writeFileToS3(datasetFile: any): Promise<boolean> {
  try {
    const readFile = util.promisify(this.fileWriter.readFile);
    const unlinkFile = util.promisify(this.fileWriter.unlink);
    const s3BucketName = this.appConfig.get<string>('infra.fileWriter.bucket');
    const s3Region = this.appConfig.get<string>('infra.fileWriter.region');
    this.s3Bucket.config.region = s3Region;
    console.log(`datasetFile ${datasetFile.path} ${datasetFile.originalname}`);
    const data = readFile(datasetFile.path);
    const params = {
      Bucket: s3BucketName,
      Key: datasetFile.originalname,
      Body: data,
      ACL: 'public-read'
    };
    console.log(`params ${params.Bucket} ${params.Key} ${params.Body} ${params.ACL}`);
    return await new Promise<boolean>((resolve, reject) => {
      this.s3Bucket.upload(params, function (err: any) {
        unlinkFile(datasetFile.path);
        if (err) {
          console.log(err);
          throw new OperationError('Error writing file to S3', err);
        } else {
          resolve(true);
        }
      });
    });
  } catch (err) {
    throw new OperationError('Error writing file to S3');
  }
}

readFile returns a Promise (you created it with util.promisify), thus data is a Promise here:
const data = readFile(datasetFile.path);
const params = {
  Bucket: s3BucketName,
  Key: datasetFile.originalname,
  Body: data,
  ACL: 'public-read'
};
You should await the Promise:
const data = await readFile(datasetFile.path);

Related

How to retrieve an OpenAI image and save it to an S3 bucket

I want to get an image generated in OpenAI/Dall E and save it to an S3 Bucket.
So far I can get the image URL and create a buffer, with the following:
const configuration = new Configuration({
  apiKey: procEnvVars.OPENAI_API_KEY,
});
export const openai = new OpenAIApi(configuration);
const defaultImageParams: CreateImageRequest = {
  n: 1,
  prompt: "a bad request message",
};
interface InputParams extends CreateImageRequest {
  prompt: string; // make this mandatory for the function params
}
// Once we get a URL from the OpenAI API, we want to convert it to a buffer
export async function getBufferFromUrl(openAiUrl: string) {
  const axiosResponse = await axios({
    url: openAiUrl, // your url
    method: "GET",
    responseType: "arraybuffer",
  });
  const data = axiosResponse.data;
  if (!(data instanceof Buffer))
    throw new Error("Axios response should be of type Buffer");
  return data;
}
export async function getUrlFromOpenAi(inputParams: InputParams) {
  const ImageResponse = await openai.createImage({
    ...defaultImageParams,
    ...inputParams,
  });
  const dataArray = ImageResponse.data.data;
  if (!dataArray || dataArray.length === 0) {
    console.error({
      error: "We did not return choices from createOpenAiImage()",
      data: ImageResponse.data,
      datadata: ImageResponse.data.data,
    });
  }
  return dataArray;
}
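The Buffer check inside getBufferFromUrl can be exercised locally without calling the API at all (the fake payload below is invented stand-in data):

```javascript
// Mimics the instanceof check in getBufferFromUrl above, using local data.
// In Node, axios with responseType "arraybuffer" hands back a Buffer.
function assertBuffer(data) {
  if (!(data instanceof Buffer)) {
    throw new Error('Axios response should be of type Buffer');
  }
  return data;
}

const fake = Buffer.from('pretend-image-bytes'); // stand-in for axiosResponse.data
console.log(assertBuffer(fake).length); // 19
```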
Next we need to take the buffer and save to S3:
// Create service client module using ES6 syntax.
import { S3Client } from "@aws-sdk/client-s3";
// Set the AWS Region.
const REGION = "eu-west-2";
// Create an Amazon S3 service client object.
const s3Client = new S3Client({ region: REGION });
export { s3Client };
// Import required AWS SDK clients and commands for Node.js.
import { PutObjectCommand } from "@aws-sdk/client-s3";
// Set the parameters.
export const bucketParams = {
  Bucket: "<my s3 bucket name. Can be found in S3 console>",
};
// Create and upload an object to the S3 bucket.
export async function putS3Object(inputParams: { Body: Buffer; Key: string }) {
  try {
    const data = await s3Client.send(
      new PutObjectCommand({
        ...bucketParams,
        Body: inputParams.Body,
        Key: `public/myFolder/${inputParams.Key}`,
      })
    );
    console.log(
      "Successfully uploaded object: " +
        bucketParams.Bucket +
        "/" +
        `public/myFolder/${inputParams.Key}`
    );
    return data; // For unit tests.
  } catch (err) {
    console.log("Error", err);
  }
}

How to use async/await inside promise executor to read pdf content?

import axios from 'axios'
export const findNumberOfPages = async pdfUrl => {
  return new Promise(async (resolve, reject) => {
    try {
      const response = await axios.get(pdfUrl, {
        headers: { 'Content-Type': 'application/pdf' }
      })
      const reader = new FileReader()
      reader.readAsBinaryString(new Blob([response.data]))
      reader.onloadend = () => {
        const pages = reader.result.match(/\/Type[\s]*\/Page[^s]/g)
        const count = pages?.length
        resolve(count)
      }
    } catch (error) {
      reject(new Error(`Error when fetching pdf: ${error.message}`))
    }
  })
}
I'm adding a utility to get the number of pages in a PDF (HTTPS URL) using the FileReader API.
But ESLint shows an error: Promise executor functions should not be async. (no-async-promise-executor). I want to await the axios request to get the PDF response. How can I await the axios request and resolve the number of pages in the FileReader's onloadend?
You don't need the promise constructor here. You would (although not quite in that way) if you had to use FileReader, but you don't: Blob has a text method that reads the blob as text:
import axios from 'axios';
export const findNumberOfPages = async pdfUrl => {
  try {
    const response = await axios.get(pdfUrl, {
      headers: { 'Content-Type': 'application/pdf' }
    });
    const result = await new Blob([response.data]).text();
    const pages = result.match(/\/Type[\s]*\/Page[^s]/g);
    const count = pages?.length;
    return count;
  } catch (error) {
    throw new Error(`Error when fetching pdf: ${error.message}`);
  }
};
I don't use axios, but I suspect that axios.get call is a bit off. You've specified a Content-Type, but you're not sending data (the Content-Type request header specifies the type of data you're sending, not receiving). Also, I suspect you don't need the Blob part of this at all.
Using the built-in fetch instead, I'd expect you could do it like this:
export const findNumberOfPages = async pdfUrl => {
  try {
    const response = await fetch(pdfUrl);
    if (!response.ok) {
      throw new Error(`HTTP error ${response.status}`);
    }
    const result = await response.text();
    const pages = result.match(/\/Type[\s]*\/Page[^s]/g);
    const count = pages?.length;
    return count;
  } catch (error) {
    throw new Error(`Error when fetching pdf: ${error.message}`);
  }
};
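The page-counting regex itself can be tried in isolation (the sample string below is a hand-made fragment, not a real PDF):

```javascript
// Count "/Type /Page" occurrences in raw PDF text; the trailing [^s] in the
// regex excludes the "/Type /Pages" tree node, which is not itself a page.
function countPages(pdfText) {
  const pages = pdfText.match(/\/Type[\s]*\/Page[^s]/g);
  return pages?.length ?? 0;
}

const sample = '<< /Type /Pages /Kids [...] >> << /Type /Page >> << /Type /Page >>';
console.log(countPages(sample)); // 2: the /Pages node is skipped
```

Note that a `/Page` token at the very end of the text would not match, since `[^s]` requires one more character after it.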

Azure function fails to create file on remote

I want to download a file locally, then create a stream from it and send that to an API.
On localhost the file gets created via blobClient.downloadToFile(defaultFile);
but when I deploy the function, it can't find the file to stream, so I think the download doesn't happen, or happens in the wrong location.
I get this error
[Error: ENOENT: no such file or directory, open 'D:\home\site\wwwroot\importPbix\exampleName.pbix'
Here's my code
const blobServiceClient = BlobServiceClient.fromConnectionString(
  process.env.CONNEXION_STRING
);
const containerClient = blobServiceClient.getContainerClient(
  params.containerName
);
const blobClient = containerClient.getBlobClient(process.env.FILE_LOCATION); // get file from storage
let blobData;
var defaultFile = path.join(params.baseDir, `${params.reportName}.pbix`); // use path module
let stream;
try {
  blobData = await blobClient.downloadToFile(defaultFile);
  console.log(blobData);
  stream = fs.createReadStream(defaultFile);
} catch (error) {
  params.context.log(error);
  console.log(error);
}
var options = {
  method: "POST",
  url: `https://api.powerbi.com/v1.0/myorg/groups/${params.groupId}/imports?datasetDisplayName=${params.reportName}`,
  headers: {
    "Content-Type": "multipart/form-data",
    Authorization: `Bearer ${params.accessToken} `,
  },
  formData: {
    "": {
      value: stream,
      options: {
        filename: `${params.reportName}.pbix`,
        contentType: null,
      },
    },
  },
};
// check if file is kept in memory
return new Promise(function (resolve, reject) {
  request(options, function (error, response) {
    if (error) {
      params.context.log(error);
      reject(error);
    } else {
      params.context.log(response);
      resolve(response.body);
    }
    fs.unlinkSync(defaultFile);
  });
});
I found this post with the same issue; that's why I use the path module and pass __dirname to the function as params.baseDir.
If you want to download a file from Azure blob storage and read it as a stream, try the code below. In this demo, I download a .txt file to a temp folder (you should create it first on the Azure Function) and print its content from the stream for a quick test:
module.exports = async function (context, req) {
  const { BlockBlobClient } = require("@azure/storage-blob")
  const fs = require('fs')
  const connStr = '<connection string>'
  const container = 'files'
  const blobName = 'test.txt'
  const tempPath = 'd:/home/temp/'
  const tempFilePath = tempPath + blobName
  const blobClient = new BlockBlobClient(connStr, container, blobName);
  await blobClient.downloadToFile(tempFilePath).then(async function () {
    context.log("download successful")
    let stream = fs.createReadStream(tempFilePath)
    // Print text content, just to check the stream has been read successfully
    context.log("text file content:")
    context.log(await streamToString(stream))
    // You can call your API here...
  })
  function streamToString(stream) {
    const chunks = [];
    return new Promise((resolve, reject) => {
      stream.on('data', (chunk) => chunks.push(Buffer.from(chunk)));
      stream.on('error', (err) => reject(err));
      stream.on('end', () => resolve(Buffer.concat(chunks).toString('utf8')));
    })
  }
  context.res = {
    body: 'done'
  }
}
Result: the file is downloaded and its content is read from the stream successfully.

'await' call doesn't wait

My app is trying to upload files to S3. S3 upload works fine. The problem is that after imageUpload returns, in handleSubmit(), it claims that the return value for imageUpload() is undefined. I suspect that it has to do with async/await, which I'm not too familiar with.
Can any expert explain what I'm missing?
async function imageUpload() {
  const params = {
    Bucket: BUCKET_NAME,
    Key: product.media.name,
    Body: product.media
  };
  s3.upload(params, function (s3Err, data) {
    if (s3Err) throw s3Err
    console.log(`File uploaded successfully at ${data.Location}`) // successfully get data.Location here
    return data.Location
  });
}
async function handleSubmit(event) {
  try {
    event.preventDefault();
    setLoading(true)
    setError('')
    const mediaUrl = await imageUpload()
    const url = `${baseUrl}/api/product`
    const { name, desc } = product
    const payload = { name, desc, mediaUrl } // mediaUrl is undefined here
    const response = await axios.post(url, payload)
  } catch (error) {
    catchErrors(error, setError)
  } finally {
    setLoading(false)
  }
}
You have to wrap your imageUpload code inside a promise and pass the data you want to return to the resolve callback; if there is an error, pass it to the reject callback. Throwing an error in an asynchronous task can give unexpected behaviour, so use the reject callback there.
async function imageUpload() {
  const params = {
    Bucket: BUCKET_NAME,
    Key: product.media.name,
    Body: product.media
  };
  return new Promise((resolve, reject) => {
    s3.upload(params, function (s3Err, data) {
      if (s3Err) {
        reject(s3Err);
        return;
      }
      console.log(`File uploaded successfully at ${data.Location}`) // successfully get data.Location here
      resolve(data.Location);
    });
  });
}
The problem is in your imageUpload function: you do not wait for the response from s3.upload.
function imageUpload() {
  return new Promise((resolve, reject) => {
    const params = {
      Bucket: BUCKET_NAME,
      Key: product.media.name,
      Body: product.media
    };
    s3.upload(params, function (s3Err, data) {
      if (s3Err) reject(s3Err)
      else resolve(data.Location)
    });
  });
}
Your call to s3.upload is in an async function, but using a callback, and only returning to the callback (not to the outer function in any way). The AWS SDKs for JS all (or mostly all) support Promises now, so you should be able to do this:
async function imageUpload() {
  const params = {
    Bucket: BUCKET_NAME,
    Key: product.media.name,
    Body: product.media
  };
  const data = await s3.upload(params).promise()
  console.log(`File uploaded successfully at ${data.Location}`)
  return data
}
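The callback-to-promise pattern both answers rely on can be tested without AWS at all, using a stand-in for s3.upload (fakeUpload below is invented for illustration):

```javascript
// fakeUpload mimics s3.upload's Node-style callback: it calls back with
// (err, data) asynchronously. The URL it returns is made up for the demo.
function fakeUpload(params, callback) {
  setImmediate(() => callback(null, { Location: `https://example.com/${params.Key}` }));
}

// Wrap the callback API in a Promise so callers can await it.
function upload(params) {
  return new Promise((resolve, reject) => {
    fakeUpload(params, (err, data) => {
      if (err) reject(err);
      else resolve(data.Location);
    });
  });
}

upload({ Key: 'photo.png' }).then((location) => {
  console.log(location); // https://example.com/photo.png
});
```

Without the Promise wrapper (or `.promise()`), `await upload(...)` would resolve to `undefined`, which is exactly the symptom in the question.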

Amazon S3 Remote File Upload with Axios

I am trying to write a function that would:
Take a remote URL as a parameter,
Get the file using axios
Upload the stream to amazon s3
And finally, return the uploaded url
I found help here on stackoverflow. So far, I have this:
/*
 * Method to pipe the stream
 */
const uploadFromStream = (file_name, content_type) => {
  const pass = new stream.PassThrough();
  const obj_key = generateObjKey(file_name);
  const params = { Bucket: config.bucket, ACL: config.acl, Key: obj_key, ContentType: content_type, Body: pass };
  s3.upload(params, function (err, data) {
    if (!err) {
      return data.Location;
    } else {
      console.log(err, data);
    }
  });
  return pass;
}
/*
 * Method to upload remote file to s3
 */
const uploadRemoteFileToS3 = async (remoteAddr) => {
  axios({
    method: 'get',
    url: remoteAddr,
    responseType: 'stream'
  }).then((response) => {
    if (response.status === 200) {
      const file_name = remoteAddr.substring(remoteAddr.lastIndexOf('/') + 1);
      const content_type = response.headers['content-type'];
      response.data.pipe(uploadFromStream(file_name, content_type));
    }
  });
}
But uploadRemoteFileToS3 does not return anything (because it's an asynchronous function). How can I get the uploaded URL?
UPDATE
I have further improved upon the code and wrote a class. Here is what I have now:
const config = require('../config.json');
const stream = require('stream');
const axios = require('axios');
const AWS = require('aws-sdk');
class S3RemoteUploader {
  constructor(remoteAddr) {
    this.remoteAddr = remoteAddr;
    this.stream = stream;
    this.axios = axios;
    this.config = config;
    this.AWS = AWS;
    this.AWS.config.update({
      accessKeyId: this.config.api_key,
      secretAccessKey: this.config.api_secret
    });
    this.spacesEndpoint = new this.AWS.Endpoint(this.config.endpoint);
    this.s3 = new this.AWS.S3({ endpoint: this.spacesEndpoint });
    this.file_name = this.remoteAddr.substring(this.remoteAddr.lastIndexOf('/') + 1);
    this.obj_key = this.config.subfolder + '/' + this.file_name;
    this.content_type = 'application/octet-stream';
    this.uploadStream();
  }
  uploadStream() {
    const pass = new this.stream.PassThrough();
    this.promise = this.s3.upload({
      Bucket: this.config.bucket,
      Key: this.obj_key,
      ACL: this.config.acl,
      Body: pass,
      ContentType: this.content_type
    }).promise();
    return pass;
  }
  initiateAxiosCall() {
    axios({
      method: 'get',
      url: this.remoteAddr,
      responseType: 'stream'
    }).then((response) => {
      if (response.status === 200) {
        this.content_type = response.headers['content-type'];
        response.data.pipe(this.uploadStream());
      }
    });
  }
  dispatch() {
    this.initiateAxiosCall();
  }
  async finish() {
    //console.log(this.promise); /* returns Promise { Pending } */
    return this.promise.then((r) => {
      console.log(r.Location);
      return r.Location;
    }).catch((e) => {
      console.log(e);
    });
  }
  run() {
    this.dispatch();
    this.finish();
  }
}
But I still have no clue how to catch the result when the promise is resolved. So far, I tried these:
testUpload = new S3RemoteUploader('https://avatars2.githubusercontent.com/u/41177');
testUpload.run();
//console.log(testUpload.promise); /* Returns Promise { Pending } */
testUpload.promise.then(r => console.log); // does nothing
But none of the above works. I have a feeling I am missing something very subtle. Any clue, anyone?
After an upload you can call the getSignedUrl function in the S3 SDK to get the URL; you can also specify the expiry of the URL there. You need to pass the key to that function.
To generate a simple pre-signed URL that allows any user to view the
contents of a private object in a bucket you own, you can use the
following call to getSignedUrl():
var s3 = new AWS.S3();
var params = { Bucket: 'myBucket', Key: 'myKey' };
s3.getSignedUrl('getObject', params, function (err, url) {
  console.log("The URL is", url);
});
Official documentation link
http://docs.amazonaws.cn/en_us/AWSJavaScriptSDK/guide/node-examples.html
The code must be something like this:
function uploadFileToS3AndGenerateUrl(cb) {
  const pass = new stream.PassThrough(); // I have generated streams from a file. Using this since this is what you have used. Must be a valid one.
  var params = {
    Bucket: "your-bucket", // required
    Key: key, // required
    Body: pass,
    ContentType: 'your content type',
  };
  s3.upload(params, function (s3Err, data) {
    if (s3Err) {
      cb(s3Err)
    }
    console.log(`File uploaded successfully at ${data.Location}`)
    const params = {
      Bucket: 'your-bucket',
      Key: data.key,
      Expires: 180
    };
    s3.getSignedUrl('getObject', params, (urlErr, urlData) => {
      if (urlErr) {
        console.log('There was an error getting your files: ' + urlErr);
        cb(urlErr);
      } else {
        console.log(`url: ${urlData}`);
        cb(null, urlData);
      }
    })
  })
}
Please check; I have updated your code, maybe it helps you.
/*
 * Method to upload remote file to s3
 */
const uploadRemoteFileToS3 = async (remoteAddr) => {
  const response = await axios({
    method: 'get',
    url: remoteAddr,
    responseType: 'stream'
  })
  if (response.status === 200) {
    const file_name = remoteAddr.substring(remoteAddr.lastIndexOf('/') + 1);
    const content_type = response.headers['content-type'];
    response.data.pipe(uploadFromStream(file_name, content_type));
  }
  return new Promise((resolve, reject) => {
    response.data.on('end', (response) => {
      console.log(response)
      resolve(response)
    })
    response.data.on('error', () => {
      console.log(response);
      reject(response)
    })
  })
};
/*
 * Method to pipe the stream
 */
const uploadFromStream = (file_name, content_type) => {
  return new Promise((resolve, reject) => {
    const pass = new stream.PassThrough();
    const obj_key = generateObjKey(file_name);
    const params = { Bucket: config.bucket, ACL: config.acl, Key: obj_key, ContentType: content_type, Body: pass };
    s3.upload(params, function (err, data) {
      if (!err) {
        console.log(data)
        return resolve(data.Location);
      } else {
        console.log(err)
        return reject(err);
      }
    });
  });
}
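generateObjKey is never shown in the thread; a plausible sketch (the timestamp-plus-sanitized-name scheme here is my own invention, not from the original code) that avoids key collisions:

```javascript
// Hypothetical generateObjKey: prefix a timestamp and replace characters
// that are awkward in S3 keys. One reasonable guess, not the thread's code.
function generateObjKey(fileName) {
  const safe = fileName.replace(/[^a-zA-Z0-9._-]/g, '_');
  return `${Date.now()}-${safe}`;
}

console.log(generateObjKey('my photo (1).png')); // e.g. 1700000000000-my_photo__1_.png
```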
// call uploadRemoteFileToS3
uploadRemoteFileToS3(remoteAddr)
  .then((finalResponse) => {
    console.log(finalResponse)
  })
  .catch((err) => {
    console.log(err);
  });
