Convert request.body to json - javascript

I am trying to assert the data that I am sending to the API.
When I log the body I see that it's of ArrayBuffer format. But when I use Buffer.from I get the message "First argument must be a string, Buffer, ArrayBuffer, Array, or array-like object".
cy.wait('@myrequest')
  .its('request')
  .then((request) => {
    console.log(request.body)
    const buffer = Buffer.from(request.body)
  })
What can I do to retrieve data in form object or json to then assert data that I am sending to an API?

Assuming that this is JavaScript, you should be able to use JSON.parse:
let jsonData = JSON.parse(request.body);
Note that this should be wrapped in a try {} catch {}, since JSON.parse throws on invalid input.
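For example, a minimal sketch of that guard (assuming the intercepted body arrives as JSON text):
let jsonData
try {
  jsonData = JSON.parse(request.body)
} catch (e) {
  // the body was not valid JSON text; surface a useful failure instead of a cryptic one
  throw new Error(`Could not parse request body as JSON: ${e.message}`)
}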
If you're using express, you can try https://www.npmjs.com/package/body-parser
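On the server side, a minimal sketch of wiring body-parser into an Express app (the route and port below are made-up placeholders):
const express = require('express')
const bodyParser = require('body-parser')

const app = express()
// parse incoming JSON request bodies into req.body
app.use(bodyParser.json())

app.post('/api/data', (req, res) => {
  console.log(req.body) // already a plain object here
  res.sendStatus(200)
})

app.listen(3000)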

There's an example recipe that handles multipart form data here: "stubs multipart form".
Test
cy.wait('@submitForm').its('request').then(({ headers, body }) => {
  expect(body, 'request body').to.be.a('ArrayBuffer')
  const contentType = headers['content-type']
  // the browser sets the separator string when sending the form
  // something like
  // "multipart/form-data; boundary=----WebKitFormBoundaryiJZt6b3aUg8Jybg2"
  // we want to extract it and pass to the utility function
  // to convert the multipart text into an object of values
  expect(contentType, 'boundary').to.match(/^multipart\/form-data; boundary=/)
  const boundary = contentType.split('boundary=')[1]
  const values = parseMultipartForm({ boundary, buffer: body })
  expect(values, 'form values').to.deep.equal({
    city: 'Boston',
    value: '28', // note this is a string
  })
})
Parsing
/*
  Utility: parses (very simply) a multipart buffer into string values.
  The decoded buffer string will be something like

  ------WebKitFormBoundaryYxsB3tlu9eJsoCeY
  Content-Disposition: form-data; name="city"

  Boston
  ------WebKitFormBoundaryYxsB3tlu9eJsoCeY
  Content-Disposition: form-data; name="value"

  28
  ------WebKitFormBoundaryYxsB3tlu9eJsoCeY--

  There are NPM packages for parsing such text into an object:
  - https://www.npmjs.com/package/parse-multipart
  - https://www.npmjs.com/package/multiparty
  - https://www.npmjs.com/package/busboy
  - https://www.npmjs.com/package/formidable
*/
const parseMultipartForm = ({ boundary, buffer }) => {
  expect(boundary, 'boundary').to.be.a('string')
  const decoder = new TextDecoder()
  const decoded = decoder.decode(buffer)
  const parts = decoded.split(`--${boundary}`)
    .map((s) => s.trim())
    .filter((s) => s.startsWith('Content-Disposition: form-data;'))
  console.log(decoded)
  console.log(parts)
  const result = {}
  parts.forEach((part) => {
    const lines = part.split(/\r?\n/g)
    console.log('lines')
    console.log(lines)
    const key = lines[0].match(/name="(.+)"/)[1]
    result[key] = lines[2].trim()
  })
  return result
}
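As a quick sanity check, the utility can be exercised inside the spec file with a hand-built multipart string (the boundary and values below are made up):
const boundary = '----WebKitFormBoundaryYxsB3tlu9eJsoCeY'
const sample = [
  `--${boundary}`,
  'Content-Disposition: form-data; name="city"',
  '',
  'Boston',
  `--${boundary}--`,
].join('\r\n')
// TextEncoder produces the same kind of binary buffer the intercepted request carries
const values = parseMultipartForm({ boundary, buffer: new TextEncoder().encode(sample) })
expect(values).to.deep.equal({ city: 'Boston' })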


How to implement types properly?

I'm new to TypeScript and have been refactoring a colleague's code. I'm currently doing a typecheck and removing all any types. The goal is to make an MS Graph API call and return a JSON response that is translated into a BirthdayPerson with a name, a birthday date, and an ID.
I've been trying to assign a type instead of any in the following code, but whether I assign number, string or any other type, a different error will show up.
Perhaps I'm not tackling the solution correctly:
graph.ts
/**
 * @param accessToken
 * @param endpoint url to call from MS Graph
 */
async function callMsGraph(accessToken: string, endpoint: string) {
  const headers = new Headers();
  const bearer = `Bearer ${accessToken}`;
  headers.append('Authorization', bearer);
  const options = {
    method: 'GET',
    headers,
  };
  try {
    return fetch(endpoint, options);
  } catch (error) {
    console.error(error);
    throw error;
  }
}
export const callMsGraphWithoutPagination = async (
  accessToken: string,
  url: string,
  dataToReturn: any[] = []
): Promise<any[]> => {
  try {
    const data = await callMsGraph(accessToken, url);
    const dataJson = await data.json();
    const newData = dataToReturn.concat(dataJson.value);
    if (dataJson['@odata.nextLink']) {
      const NEXT_URL = dataJson['@odata.nextLink'].split('/v1.0')[1];
      return await callMsGraphWithoutPagination(
        accessToken,
        process.env.REACT_APP_GRAPH_URL + NEXT_URL,
        newData
      );
    }
    return dataToReturn.concat(dataJson.value);
  } catch (error) {
    /* eslint-disable no-console */
    console.error(error);
    /* eslint-enable no-console */
    throw error;
  }
};
export default callMsGraph;
useUsers.tsx
export const useUsers = () => {
  const token = useToken();
  const [users, setUsers] = React.useState<BirthdayPerson[]>([]);
  React.useEffect(() => {
    if (token) {
      callMsGraphWithoutPagination(token, graphConfig.usersEndpoint).then(async (data: any) => {
        const processedData: any[] = await Promise.all(
          data.map(async (element: any) => {
            const user = await callMsGraph(token, graphConfig.userBirthdayEndpoint(element.id));
            const userJson = await user.json();
            const image = await callMsGraph(token, graphConfig.userPhotoEndpoint(element.id));
            const blob = await image.blob();
            const returnElement: BirthdayPerson = {
              displayName: element.displayName,
              birthday: userJson.value,
              id: element.id,
            };
            if (blob !== null) {
              window.URL = window.URL || window.webkitURL;
              returnElement.picture = window.URL.createObjectURL(blob);
            }
            return returnElement;
          })
        );
        setUsers([].concat(...processedData));
      });
    }
  }, [token]);
  return users;
};
helpers.ts
interface IUpdateData {
  slideCount: number;
}
const sortAndFilterBirthdays = (people: BirthdayPerson[], daysToGet: number) =>
  people
    .sort((firstEl, secondEl) => sortDate(firstEl.birthday, secondEl.birthday))
    .filter(({ birthday }) => filterByAmountOfDays({ date: birthday, daysAfter: daysToGet }));
const getBirthdays: any = (people: BirthdayPerson[], daysToGet: number) => {
  const validBirthdays = people.filter((element: any) => {
    const year = moment(element.birthday).year();
    return year !== 0;
  });
  const result = sortAndFilterBirthdays(validBirthdays, daysToGet);
  // if it's okay
  if (result.length > 1 && daysToGet <= 30) {
    return result;
  }
  // if not okay, filters by future dates, concats with 'next year' dates, returns 2 dates
  const fallbackResult = validBirthdays
    .sort((firstEl, secondEl) => sortDate(firstEl.birthday, secondEl.birthday))
    .filter((person: BirthdayPerson) => {
      const currentYear = moment().year();
      const date = moment(person.birthday, DATE_FORMAT).set('years', currentYear);
      return moment().diff(date, 'days') <= 0;
    });
  return fallbackResult.concat(validBirthdays).splice(0, 2);
};
Any help or indication would be great!
With all the changes I've made, another object will complain that Type 'x' is not assignable to type 'string'.
Firstly, you need to somehow type the responses from the API, because right now, as you have seen, a call to .json() on the Response object returns unknown, which makes sense because no one knows what response the server returns. You may know what response it is expected to return, but not what it actually does return.
Ideally, therefore, you need a parser that verifies that the response has the correct structure and throws an error otherwise. There are libraries such as superstruct, yup, joi and others which you can use for this. Of course this is a lot of work and will need refactoring. Or, if you don't care enough, you can just cast the response object to the appropriate type, but then if the server returns something unexpected and the application cannot handle it, it's your fault.
Example with response parsing using superstruct
import { string, object, create, Infer } from 'superstruct'
// I assume `BirthdayUser` is just a string, but you can type more complex objects as well
const BirthdayUser = string()
// This is so that you don't have to list fields twice: once
// for the parser and once for TypeScript
type BirthdayUser = Infer<typeof BirthdayUser>
// Then use the parser
const response = await callMsGraph(accessToken, url)
const userJson = await response.json()
// The user variable has the appropriate inferred type, and if the response doesn't
// comply with the definition of `BirthdayUser` an error will be thrown.
// Also I assume MS Graph doesn't just return a plain value but wraps it in an object with a `value` field, so writing it here
const user = create(userJson, object({ value: BirthdayUser }))
Example of the "I don't care enough" solution
type BirthdayUser = string
const response = await callMsGraph(accessToken, url)
// Same thing with wrapping into an object with a `value` field
const userJson = (await response.json()) as { value: BirthdayUser }
This is a bit awkward, because the API call returns a Response object and not the actual data. It might be easier to work with if you move the parsing and casting logic inside callMsGraph. It's not obligatory of course, but I still provide an example, because after that you need to type callMsGraphWithoutPagination and it will use the same idea.
import { object, create, Struct } from 'superstruct'
async function callMsGraphParsed<T>(
  accessToken: string,
  url: string,
  // Since we need information about the object structure at runtime, just making the function
  // generic is not enough: you need to pass the parser structure as an argument
  struct: Struct<T>
) {
  // ...
  const response = await fetch(...)
  const json = await response.json()
  // Same as before: verifies that the response complies with the provided structure; if it
  // does, returns the typed object (of type `T`), otherwise throws an error
  return create(json, object({ value: struct }))
}
async function callMsGraphLazy<T>(accessToken: string, url: string) {
  // ...
  const response = await fetch(...)
  const json = await response.json()
  return json as { value: T }
}
However, I only call .json() here; if you want to use this solution, you will then need either a different function or another argument if you also want it to call .blob() for some API calls.
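One possible shape for that extra argument, as a hedged sketch (the name callMsGraphAs and the ResponseKind type are hypothetical):
type ResponseKind = 'json' | 'blob'

async function callMsGraphAs(accessToken: string, url: string, kind: ResponseKind) {
  const headers = new Headers({ Authorization: `Bearer ${accessToken}` })
  const response = await fetch(url, { method: 'GET', headers })
  // the caller decides how the Response body is consumed
  return kind === 'blob' ? response.blob() : response.json()
}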
Now you type callMsGraphWithoutPagination in the same way:
export const callMsGraphWithoutPaginationParsed = async <T>(
  accessToken: string,
  url: string,
  // note: `struct` comes before the defaulted parameter so the default can actually be used
  struct: Struct<T>,
  dataToReturn: T[] = [],
): Promise<T[]> => {
  // If you rewrote `callMsGraph` to use parsing
  const dataJson = await callMsGraph(accessToken, url, struct);
  const newData = dataToReturn.concat(dataJson.value);
  // ...
}
export const callMsGraphWithoutPaginationLazy = async <T>(
  accessToken: string,
  url: string,
  dataToReturn: T[] = [],
): Promise<T[]> => {
  // If you left `callMsGraph` as is
  const data = await callMsGraph(accessToken, url);
  const dataJson = (await data.json()) as { value: T }
  const newData = dataToReturn.concat(dataJson.value);
  // ...
}
Then use it:
// Not sure if you are requesting a `BirthdayUser` array here or some other entity, so change it to whatever you expect to receive
callMsGraphWithoutPagination<BirthdayUser>(token, graphConfig.usersEndpoint).then(async (data) => {
  // "data" is inferred to have type BirthdayUser[]
  data.map(element => {
    // "element" is inferred to have type BirthdayUser
  })
})
Also, everywhere I wrote "I assume" and "Not sure" there is missing info that you should probably have provided in the question. It didn't turn out to be relevant for me, but it could have been. Good luck!

Wrong result with HMAC verification using SubtleCrypto in JavaScript

I am trying to verify an HMAC signature using the SubtleCrypto API. The whole thing is supposed to run in Cloudflare Workers and I am testing it locally using their wrangler tool.
This is my code so far, but it generates the wrong signature.
const message = "(query params from an url)";
const given_signature = "(extracted from the query params)";
const SECRET = "...";

const algorithm = { name: 'HMAC', hash: 'SHA-256' };
const encoder = new TextEncoder();
const key = await crypto.subtle.importKey(
  'raw',
  encoder.encode(SECRET),
  algorithm,
  false,
  ['sign', 'verify']
);
const signature = await crypto.subtle.sign(
  algorithm.name,
  key,
  encoder.encode(message)
);
const digest = btoa(String.fromCharCode(...new Uint8Array(signature)));
// The digest does not match the signature extracted from the query params.
// If I, for example, want to verify the signature directly, the result is still false.
const verify = await crypto.subtle.verify(
  algorithm.name,
  key,
  encoder.encode(given_signature),
  encoder.encode(message)
);
If I use the same secret and message in online HMAC testing tools, I get the correct results, so I am certain that there must be a bug in my code.
What I find interesting is that the signature generated by my code is much shorter than the given one (e.g. 3fn0mhrebHTJMhtOyvRP5nZIhogX/M1OKQ5GojniZTM= vs ddf9f49a1ade6c74c9321b4ecaf44fe67648868817fccd4e290e46a239e26533).
Does anyone have an idea where I am going wrong?
Thanks to the helpful comments! The problem in a nutshell was that the provided signature was encoded as a hex string, while the generated signature was base64-encoded.
To keep things clean, here is a working version that uses the crypto.subtle.verify function:
const message = "(query params from an url w/o the hmac signature)";
const given_signature = "(the given hmac signature extracted from the query params)";
const SECRET = "(my secret key)";

const hexToBuffer = (hex: string) => {
  const matches = hex.match(/[\da-f]{2}/gi) ?? [];
  const typedArray = new Uint8Array(
    matches.map(function (h) {
      return parseInt(h, 16);
    })
  );
  return typedArray.buffer;
};

const algorithm = { name: "HMAC", hash: "SHA-256" };
const encoder = new TextEncoder();
const key = await crypto.subtle.importKey(
  "raw",
  encoder.encode(SECRET),
  algorithm,
  false,
  ["sign", "verify"]
);
const result: boolean = await crypto.subtle.verify(
  algorithm.name,
  key,
  hexToBuffer(given_signature),
  encoder.encode(message)
);
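For debugging such mismatches, it can help to render the computed signature as a hex string so it can be compared by eye with the given one; a minimal sketch:
const bufferToHex = (buffer: ArrayBuffer) =>
  [...new Uint8Array(buffer)]
    .map((b) => b.toString(16).padStart(2, '0'))
    .join('');

const signature = await crypto.subtle.sign(algorithm.name, key, encoder.encode(message));
// should now match the hex-encoded signature from the query params
console.log(bufferToHex(signature));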

Crypto.sign() function to sign a message with given private key

I need to sign a message with the crypto.sign() function in Node.js to get a valid JWT.
I have a private key (base 64) like this:
Dm2xriMD6riJagld4WCA6zWqtuWh40UzT/ZKO0pZgtHATOt0pGw90jG8BQHCE3EOjiCkFR2/gaW6JWi+3nZp8A==
And I tried to get a signature:
const getJWT = () => {
  const privateKey =
    "Dm2xriMD6riJagld4WCA6zWqtuWh40UzT/ZKO0pZgtHATOt0pGw90jG8BQHCE3EOjiCkFR2/gaW6JWi+3nZp8A==";
  const payload = {
    iss: "test",
    aud: "test.com",
    iat: 1650101178,
    exp: 1650101278,
    sub: "12345678-1234-1234-1234-123456789123"
  };
  const token = encode(payload, privateKey);
  return token
};
const encode = (payload, key) => {
  const header = {
    typ: "JWT",
    alg: "EdDSA"
  };
  const headerBase64URL = base64url(JSON.stringify(header));
  const payloadBase64URL = base64url(JSON.stringify(payload));
  const msg = Buffer.from(`${headerBase64URL}.${payloadBase64URL}`);
  const keyDecoded = Buffer.from(key, "base64");
  const signature = crypto.sign("Ed25519", msg, keyDecoded); // Here is the problem
  const signatureBase64url = base64url(Buffer.from(signature));
  return `${msg}.${signatureBase64url}`;
};
I received this error:
internal/crypto/sig.js:142
  return _signOneShot(keyData, keyFormat, keyType, keyPassphrase, data,
         ^
Error: error:0909006C:PEM routines:get_name:no start line
  library: 'PEM routines',
  function: 'get_name',
  reason: 'no start line',
  code: 'ERR_OSSL_PEM_NO_START_LINE'
How can I adapt my private key to a valid format?
The crypto.sign() method requires an Ed25519 private key in PKCS#8 format. Your key is a raw key consisting of the concatenation of the raw 32-byte private key and the raw 32-byte public key, base64 encoded. A DER encoded PKCS#8 key can be derived and imported as follows:
Base64 decode your key and take the first 32 bytes of the raw 64-byte key (i.e. the raw private key).
Prepend the following prefix for a private Ed25519 key (hex): 302e020100300506032b657004220420
Import the resulting DER encoded PKCS#8 key.
Accordingly, the key import in getJWT() must be changed as follows:
const privateKey = toPkcs8der('Dm2xriMD6riJagld4WCA6zWqtuWh40UzT/ZKO0pZgtHATOt0pGw90jG8BQHCE3EOjiCkFR2/gaW6JWi+3nZp8A==');
with
const toPkcs8der = (rawB64) => {
  var rawPrivate = Buffer.from(rawB64, 'base64').subarray(0, 32);
  var prefixPrivateEd25519 = Buffer.from('302e020100300506032b657004220420', 'hex');
  var der = Buffer.concat([prefixPrivateEd25519, rawPrivate]);
  return crypto.createPrivateKey({ key: der, format: "der", type: "pkcs8" })
}
Furthermore, in the encode() function:
Remove the line const keyDecoded = Buffer.from(key, "base64")
Create the signature with
const signature = crypto.sign(null, msg, key)
Note that for Ed25519, null must be passed as the first parameter of the sign() call; the algorithm is determined by the key.
With these changes, the NodeJS code returns the following JWT:
eyJ0eXAiOiJKV1QiLCJhbGciOiJFZERTQSJ9.eyJpc3MiOiJ0ZXN0IiwiYXVkIjoidGVzdC5jb20iLCJpYXQiOjE2NTAxMDExNzgsImV4cCI6MTY1MDEwMTI3OCwic3ViIjoiMTIzNDU2NzgtMTIzNC0xMjM0LTEyMzQtMTIzNDU2Nzg5MTIzIn0.f7WG_02UKljrMeVVOTNNBAGxtLXJUT_8QAnujNhomV18Pn5cU-0lHRgVlmRttOlqI7Iol_fHut3C4AOXxDGnAQ
which matches the expected JWT.
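To double-check the result, the signature can be verified with the public half of the same raw key; a hedged sketch (assuming msg and signature from encode() are in scope; the hex prefix below is the standard SPKI header for Ed25519 public keys):
const toSpkiDer = (rawB64) => {
  // the raw public key is the last 32 bytes of the 64-byte raw key
  var rawPublic = Buffer.from(rawB64, 'base64').subarray(32, 64);
  var prefixPublicEd25519 = Buffer.from('302a300506032b6570032100', 'hex');
  var der = Buffer.concat([prefixPublicEd25519, rawPublic]);
  return crypto.createPublicKey({ key: der, format: 'der', type: 'spki' });
};

const publicKey = toSpkiDer('Dm2xriMD6riJagld4WCA6zWqtuWh40UzT/ZKO0pZgtHATOt0pGw90jG8BQHCE3EOjiCkFR2/gaW6JWi+3nZp8A==');
console.log(crypto.verify(null, msg, publicKey, signature)); // true if everything lines up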

Convert JSON to CSV using javascript

I'm a total JS noob. I have read a JSON file, filtered out specific items from it, and saved the result to a variable named mapped. How do I export this JSON to a properly formatted CSV file?
let json = require('./users.json')
let filtered = json.Users.filter((a) => {
  return new Date(a.UserCreateDate) > new Date('2020-05-11T00:00:00.000000+05:30')
})
let mapped = filtered.map((a) => {
  let email
  a.Attributes.forEach(element => {
    if (element.Name == 'email') {
      email = element.Value
    }
  });
  return {
    name: a.Username,
    email: email,
    UserCreateDate: a.UserCreateDate,
    UserStatus: a.UserStatus
  }
})
console.log(JSON.stringify(mapped, null, 4), mapped.length)
Although there are quite a few answers to this topic, I haven't been able to successfully implement any of those.
I assume you want to use the keys as the header row of the CSV file.
What you need is to open a write stream; a CSV file is just plain text with commas separating the values and newlines separating the rows.
// ...
let mapped = filtered.map((a) => {
  // ...
})
// ...
const fs = require('fs');
let writeStream = fs.createWriteStream('./output.csv');
writeStream.on('open', () => {
  // get the header
  const header = Object.keys(mapped[0]).toString().concat('\n');
  writeStream.write(header);
  mapped.forEach((obj) => {
    const values = Object.values(obj).toString().concat('\n');
    writeStream.write(values);
  });
  writeStream.end();
});
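One caveat with the plain toString() joining above: it breaks as soon as a value contains a comma, quote, or newline. A minimal escaping helper (a sketch, not a full CSV writer) would be:
// quote a field per RFC 4180 when it contains a comma, quote, or newline
const escapeCsvValue = (value) => {
  const s = String(value ?? '');
  return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
};

// then build each row with:
// const row = Object.values(obj).map(escapeCsvValue).join(',') + '\n';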

How to get response from S3 getObject in Node.js?

In a Node.js project I am attempting to get data back from S3.
When I use getSignedURL, everything works:
aws.getSignedUrl('getObject', params, function(err, url) {
  console.log(url);
});
My params are:
var params = {
  Bucket: "test-aws-imagery",
  Key: "TILES/Level4/A3_B3_C2/A5_B67_C59_Tiles.par"
};
If I take the URL output to the console and paste it in a web browser, it downloads the file I need.
However, if I try to use getObject I get all sorts of odd behavior. I believe I am just using it incorrectly. This is what I've tried:
aws.getObject(params, function(err, data) {
  console.log(data);
  console.log(err);
});
Outputs:
{
  AcceptRanges: 'bytes',
  LastModified: 'Wed, 06 Apr 2016 20:04:02 GMT',
  ContentLength: '1602862',
  ETag: '"9826l1e5725fbd52l88ge3f5v0c123a4"',
  ContentType: 'application/octet-stream',
  Metadata: {},
  Body: <Buffer 01 00 00 00 ... >
}
null
So it appears that this is working properly. However, when I put a breakpoint on one of the console.logs, my IDE (NetBeans) throws an error and refuses to show the value of data. While this could just be the IDE, I decided to try other ways to use getObject.
aws.getObject(params).on('httpData', function(chunk) {
  console.log(chunk);
}).on('httpDone', function(data) {
  console.log(data);
});
This does not output anything. Putting a breakpoint in shows that the code never reaches either of the console.logs. I also tried:
aws.getObject(params).on('success', function(data) {
  console.log(data);
});
However, this also does not output anything and placing a breakpoint shows that the console.log is never reached.
What am I doing wrong?
@aws-sdk/client-s3 (2022 Update)
Since I wrote this answer in 2016, Amazon has released a new JavaScript SDK, @aws-sdk/client-s3. This new version improves on the original getObject() by always returning a promise, instead of opting in via .promise() chained to getObject(). In addition, response.Body is no longer a Buffer but one of Readable | ReadableStream | Blob, which changes the handling of the response body a bit. This should be more performant, since we can stream the data returned instead of holding all of the contents in memory, with the trade-off being that it is a bit more verbose to implement.
In the example below, the response.Body data will be streamed into an array and then returned as a string. This is the equivalent of my original answer. Alternatively, the response.Body could be piped with stream.Readable.pipe() to an HTTP response, a file, or any other kind of stream.Writable for further usage; that would be the more performant approach when getting large objects.
If you want a Buffer, like the original getObject() response, this can be done by wrapping responseDataChunks in Buffer.concat() instead of using Array#join(), which is useful when interacting with binary data. Note that since Array#join() returns a string, each Buffer instance in responseDataChunks will have Buffer.toString() called implicitly, and the default encoding of utf8 will be used.
const { GetObjectCommand, S3Client } = require('@aws-sdk/client-s3')
const client = new S3Client() // Pass in opts to S3 if necessary

function getObject (Bucket, Key) {
  return new Promise(async (resolve, reject) => {
    const getObjectCommand = new GetObjectCommand({ Bucket, Key })
    try {
      const response = await client.send(getObjectCommand)
      // Store all of the data chunks returned from the response data stream
      // into an array, then use Array#join() to return the contents as a String
      let responseDataChunks = []
      // Handle an error while streaming the response body
      response.Body.once('error', err => reject(err))
      // Attach a 'data' listener to add the chunks of data to our array
      // Each chunk is a Buffer instance
      response.Body.on('data', chunk => responseDataChunks.push(chunk))
      // Once the stream has no more data, join the chunks into a string and return the string
      response.Body.once('end', () => resolve(responseDataChunks.join('')))
    } catch (err) {
      // Handle the error or throw
      return reject(err)
    }
  })
}
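As noted above, swapping Array#join() for Buffer.concat() returns a Buffer like the v2 getObject() did; a sketch of that variant, reusing the client and GetObjectCommand from the example:
function getObjectAsBuffer (Bucket, Key) {
  return new Promise((resolve, reject) => {
    client.send(new GetObjectCommand({ Bucket, Key }))
      .then((response) => {
        const chunks = []
        response.Body.once('error', reject)
        response.Body.on('data', (chunk) => chunks.push(chunk))
        // concatenate the Buffer chunks instead of joining them as strings
        response.Body.once('end', () => resolve(Buffer.concat(chunks)))
      })
      .catch(reject)
  })
}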
Comments on using Readable.toArray()
Using Readable.toArray() instead of working with the stream events directly might be more convenient, but it performs worse: it reads all response data chunks into memory before moving on. Since this removes all the benefits of streaming, the approach is discouraged per the Node.js docs:
"As this method reads the entire stream into memory, it negates the benefits of streams. It's intended for interoperability and convenience, not as the primary way to consume streams." Documentation Link
@aws-sdk/client-s3 Documentation Links
GetObjectCommand
GetObjectCommandInput
GetObjectCommandOutput
aws-sdk (Original Answer)
When doing a getObject() from the S3 API, per the docs, the contents of your file are located in the Body property, which you can see in your sample output. You should have code that looks something like the following:
const aws = require('aws-sdk');
const s3 = new aws.S3(); // Pass in opts to S3 if necessary

var getParams = {
  Bucket: 'abc', // your bucket name
  Key: 'abc.txt' // path to the object you're looking for
}

s3.getObject(getParams, function(err, data) {
  // Handle any error and exit
  if (err)
    return err;
  // No error happened
  // Convert Body from a Buffer to a String
  let objectData = data.Body.toString('utf-8'); // Use the encoding necessary
});
You may not need to create a new buffer from the data.Body object, but if you do, you can use the sample above to achieve that.
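If you do want a fresh copy rather than the Buffer the SDK hands you, a one-line sketch:
// data.Body is already a Buffer here; Buffer.from(buffer) makes a copy of it
const bodyCopy = Buffer.from(data.Body);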
Based on the answer by @peteb, but using Promises and Async/Await:
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

async function getObject (bucket, objectKey) {
  try {
    const params = {
      Bucket: bucket,
      Key: objectKey
    }
    const data = await s3.getObject(params).promise();
    return data.Body.toString('utf-8');
  } catch (e) {
    throw new Error(`Could not retrieve file from S3: ${e.message}`)
  }
}

// To retrieve you need to use `await getObject()` or `getObject().then()`
const myObject = await getObject('my-bucket', 'path/to/the/object.txt');
Updated (2022)
nodejs v17.5.0 added Readable.toArray(). If this API is available in your Node version, the code can be very short:
const buffer = Buffer.concat(
  await (
    await s3Client
      .send(new GetObjectCommand({
        Key: '<key>',
        Bucket: '<bucket>',
      }))
  ).Body.toArray()
)
If you are using TypeScript, you can safely cast the .Body part as Readable (the other types, ReadableStream and Blob, are only returned in the browser environment; moreover, in the browser, Blob is only used in the legacy fetch API when response.body is not supported):
(response.Body as Readable).toArray()
Note that Readable.toArray() is an experimental (yet handy) feature, so use it with caution.
=============
Original answer
If you are using AWS SDK v3, the SDK returns a nodejs Readable (precisely, an IncomingMessage, which extends Readable) instead of a Buffer.
Here is a TypeScript version. Note that this is for Node only; if you send the request from a browser, check the longer answer in the blog post mentioned below.
import { GetObjectCommand, S3Client } from '@aws-sdk/client-s3'
import type { Readable } from 'stream'

const s3Client = new S3Client({
  apiVersion: '2006-03-01',
  region: 'us-west-2',
  credentials: {
    accessKeyId: '<access key>',
    secretAccessKey: '<access secret>',
  }
})

const response = await s3Client
  .send(new GetObjectCommand({
    Key: '<key>',
    Bucket: '<bucket>',
  }))

const stream = response.Body as Readable

return new Promise<Buffer>((resolve, reject) => {
  const chunks: Buffer[] = []
  stream.on('data', chunk => chunks.push(chunk))
  stream.once('end', () => resolve(Buffer.concat(chunks)))
  stream.once('error', reject)
})
// if readable.toArray() is supported
// return Buffer.concat(await stream.toArray())
Why do we have to cast response.Body as Readable? The answer is too long; interested readers can find more information in my blog post.
For someone looking for a NestJS TypeScript version of the above:
/**
 * to fetch a signed URL of a file
 * @param key key of the file to be fetched
 * @param bucket name of the bucket containing the file
 */
public getFileUrl(key: string, bucket?: string): Promise<string> {
  var scopeBucket: string = bucket ? bucket : this.defaultBucket;
  var params: any = {
    Bucket: scopeBucket,
    Key: key,
    Expires: signatureTimeout // const value: 30
  };
  return this.account.getSignedUrlPromise(getSignedUrlObject, params); // const value: 'getObject'
}

/**
 * to get the downloadable file buffer of the file
 * @param key key of the file to be fetched
 * @param bucket name of the bucket containing the file
 */
public async getFileBuffer(key: string, bucket?: string): Promise<Buffer> {
  var scopeBucket: string = bucket ? bucket : this.defaultBucket;
  var params: GetObjectRequest = {
    Bucket: scopeBucket,
    Key: key
  };
  var fileObject: GetObjectOutput = await this.account.getObject(params).promise();
  return Buffer.from(fileObject.Body.toString());
}

/**
 * to upload a file stream onto AWS S3
 * @param file file buffer to be uploaded
 * @param key key of the file to be uploaded
 * @param bucket name of the bucket
 */
public async saveFile(file: Buffer, key: string, bucket?: string): Promise<any> {
  var scopeBucket: string = bucket ? bucket : this.defaultBucket;
  var params: any = {
    Body: file,
    Bucket: scopeBucket,
    Key: key,
    ACL: 'private'
  };
  var uploaded: any = await this.account.upload(params).promise();
  if (uploaded && uploaded.Location && uploaded.Bucket === scopeBucket && uploaded.Key === key)
    return uploaded;
  else {
    throw new HttpException("Error occurred while uploading a file stream", HttpStatus.BAD_REQUEST);
  }
}
Converting GetObjectOutput.Body to Promise<string> using node-fetch
In aws-sdk-js-v3 (@aws-sdk/client-s3), GetObjectOutput.Body is a subclass of Readable in nodejs (specifically an instance of http.IncomingMessage) instead of a Buffer as it was in aws-sdk v2, so resp.Body.toString('utf-8') will give you the wrong result "[object Object]". Instead, the easiest way to turn GetObjectOutput.Body into a Promise<string> is to construct a node-fetch Response, which takes a Readable subclass (or a Buffer instance, or other types from the fetch spec) and has the conversion methods .json(), .text(), .arrayBuffer(), and .blob().
This should also work in the other variants of aws-sdk and platforms (@aws-sdk v3 node Buffer, v3 browser Uint8Array subclass, v2 node Readable, v2 browser ReadableStream or Blob).
npm install node-fetch
import { Response } from 'node-fetch';
import * as s3 from '@aws-sdk/client-s3';

const client = new s3.S3Client({})
const s3Response = await client.send(new s3.GetObjectCommand({ Bucket: '…', Key: '…' }));
const response = new Response(s3Response.Body);

const obj = await response.json();
// or
const text = await response.text();
// or
const buffer = Buffer.from(await response.arrayBuffer());
// or
const blob = await response.blob();
Reference: GetObjectOutput.Body documentation, node-fetch Response documentation, node-fetch Body constructor source, minipass-fetch Body constructor source
Thanks to kennu's comment in the GetObjectCommand usability issue.
Extremely similar answer to @ArianAcosta's above, except I'm using import (for Node 12.x and up), adding AWS config, and sniffing for an image payload to apply base64 processing to the return.
// using v2.x of aws-sdk
import aws from 'aws-sdk'

aws.config.update({
  accessKeyId: process.env.YOUR_AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.YOUR_AWS_SECRET_ACCESS_KEY,
  region: "us-east-1" // or whatever
})

const s3 = new aws.S3();

/**
 * getS3Object()
 *
 * @param { string } bucket - the name of your bucket
 * @param { string } objectKey - object you are trying to retrieve
 * @returns { string } - data, formatted
 */
export async function getS3Object (bucket, objectKey) {
  try {
    const params = {
      Bucket: bucket,
      Key: objectKey
    }
    const data = await s3.getObject(params).promise();
    // Check for image payload and format appropriately
    if (data.ContentType === 'image/jpeg') {
      return data.Body.toString('base64');
    } else {
      return data.Body.toString('utf-8');
    }
  } catch (e) {
    throw new Error(`Could not retrieve file from S3: ${e.message}`)
  }
}
At first glance it doesn't look like you are doing anything wrong, but you don't show all your code. The following worked for me when I was first checking out S3 and Node:
var AWS = require('aws-sdk');

if (typeof process.env.API_KEY == 'undefined') {
  var config = require('./config.json');
  for (var key in config) {
    if (config.hasOwnProperty(key)) process.env[key] = config[key];
  }
}

var s3 = new AWS.S3({ accessKeyId: process.env.AWS_ID, secretAccessKey: process.env.AWS_KEY });
var objectPath = process.env.AWS_S3_FOLDER + '/test.xml';
s3.putObject({
  Bucket: process.env.AWS_S3_BUCKET,
  Key: objectPath,
  Body: "<rss><data>hello Fred</data></rss>",
  ACL: 'public-read'
}, function(err, data) {
  if (err) console.log(err, err.stack); // an error occurred
  else {
    console.log(data); // successful response
    s3.getObject({
      Bucket: process.env.AWS_S3_BUCKET,
      Key: objectPath
    }, function(err, data) {
      console.log(data.Body.toString());
    });
  }
});
Alternatively, you could use the minio-js client library (see get-object.js):
var Minio = require('minio')

var s3Client = new Minio({
  endPoint: 's3.amazonaws.com',
  accessKey: 'YOUR-ACCESSKEYID',
  secretKey: 'YOUR-SECRETACCESSKEY'
})

var size = 0
// Get a full object.
s3Client.getObject('my-bucketname', 'my-objectname', function(e, dataStream) {
  if (e) {
    return console.log(e)
  }
  dataStream.on('data', function(chunk) {
    size += chunk.length
  })
  dataStream.on('end', function() {
    console.log("End. Total size = " + size)
  })
  dataStream.on('error', function(e) {
    console.log(e)
  })
})
Disclaimer: I work for Minio. It's open source, S3-compatible object storage written in golang, with client libraries available in Java, Python, JS, and golang.
Just as an alternate solution:
As per this issue on the same subject, it seems that as of October 2022 there is a way of handling the body returned from an S3 GetObject request. Assuming you are using AWS SDK v3, you can take advantage of the @aws-sdk/util-stream-node package in the official AWS SDK:
import { GetObjectCommand, S3Client } from '@aws-sdk/client-s3';
import { sdkStreamMixin } from '@aws-sdk/util-stream-node';

const s3Client = new S3Client({});
const { Body } = await s3Client.send(
  new GetObjectCommand({
    Bucket: 'your-bucket',
    Key: 'your-key',
  }),
);

// Throws error if Body is undefined
const body = await sdkStreamMixin(Body).transformToString();
You can also transform the body into a byte array or web stream using the .transformToByteArray() and .transformToWebStream() functions.
Keep in mind that the package says you shouldn't be using it directly, but it seems to be the most straightforward way to handle the body from the request.
This was found in this reply, which highlighted a PR that added this feature.
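For instance, the byte-array variant would look like the sketch below (note that the mixed-in stream can only be consumed once, so pick one transform per GetObject call):
// same Body handle as above, materialized as a Uint8Array instead of a string
const bytes = await sdkStreamMixin(Body).transformToByteArray();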
This is the async / await version
var getObjectAsync = async function(bucket, key) {
  try {
    const data = await s3
      .getObject({ Bucket: bucket, Key: key })
      .promise();
    var contents = data.Body.toString('utf-8');
    return contents;
  } catch (err) {
    console.log(err);
  }
}

var getObject = async function(bucket, key) {
  const contents = await getObjectAsync(bucket, key);
  console.log(contents.length);
  return contents;
}

getObject(bucket, key);
The Body.toString() method no longer works with the latest version of the S3 API. Use the following instead:
const { S3Client, GetObjectCommand } = require("@aws-sdk/client-s3");

const streamToString = (stream) =>
  new Promise((resolve, reject) => {
    const chunks = [];
    stream.on("data", (chunk) => chunks.push(chunk));
    stream.on("error", reject);
    stream.on("end", () => resolve(Buffer.concat(chunks).toString("utf8")));
  });

(async () => {
  const region = "us-west-2";
  const client = new S3Client({ region });
  const command = new GetObjectCommand({
    Bucket: "test-aws-sdk-js-1877",
    Key: "readme.txt",
  });
  const { Body } = await client.send(command);
  const bodyContents = await streamToString(Body);
  console.log(bodyContents);
})();
Copied and pasted from here: https://github.com/aws/aws-sdk-js-v3/issues/1877#issuecomment-755387549
Not sure why this solution hasn't already been added, as I think it is cleaner than the top answer.
Using express and AWS SDK v3:
public downloadFeedFile = (req: IFeedUrlRequest, res: Response) => {
  const downloadParams: GetObjectCommandInput = parseS3Url(req.s3FileUrl.replace(/\s/g, ''));
  logger.info("requesting S3 file " + JSON.stringify(downloadParams));
  const run = async () => {
    try {
      const fileStream = await this.s3Client.send(new GetObjectCommand(downloadParams));
      if (fileStream.Body instanceof Readable) {
        fileStream.Body.once('error', err => {
          console.error("Error downloading s3 file")
          console.error(err);
        });
        fileStream.Body.pipe(res);
      }
    } catch (err) {
      logger.error("Error", err);
    }
  };
  run();
};
