Node S3Client Error - input.useDualstackEndpoint is not a function - javascript

I'm trying to get a list of files within an S3 folder in a Lambda written in TypeScript. I've added the following dependencies to my package.json:
"#aws-sdk/client-s3": "^3.41.0",
"#aws-sdk/node-http-handler": "^3.40.0",
I then use the S3 client like this:
const client = new S3Client({
  maxAttempts: 3,
  retryMode: 'STANDARD',
  region: getAwsRegion(),
  requestHandler: new NodeHttpHandler({
    connectionTimeout: 3000, // Timeout requests after 3 seconds
    socketTimeout: 5000, // Close socket after 5 seconds
  }),
  credentials: args.credentials,
});

const listObjectsCommand = new ListObjectsCommand({
  Bucket: args.bucketName,
  Delimiter: '/',
  Prefix: pathToPartition,
});

const objects = await client.send(listObjectsCommand);
I've tried using ListObjectsV2Command as well, but it has the same error. The error is:
TypeError: input.useDualstackEndpoint is not a function
    at Object.getEndpointFromRegion (/var/task/node_modules/<my_module>/node_modules/@aws-sdk/config-resolver/dist-cjs/endpointsConfig/utils/getEndpointFromRegion.js:12:46)
    at processTicksAndRejections (internal/process/task_queues.js:95:5)
    at async Object.serializeAws_restXmlListObjectsCommand (/var/task/node_modules/<my_module>/node_modules/@aws-sdk/client-s3/dist/cjs/protocols/Aws_restXml.js:2386:68)
    at async /var/task/node_modules/<my_module>/node_modules/@aws-sdk/middleware-serde/dist-cjs/serializerMiddleware.js:5:21
Any idea what I may be doing wrong?

This happened due to a mismatch in the aws-sdk version used in my package.json and the package.json of a dependency. Updating them to the same version fixed it!
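If you run into the same thing, it's worth confirming that only one version of each @aws-sdk package ends up in node_modules. A sketch of the kind of change that fixes it (the exact version should be whatever the dependency in question pins; these numbers are illustrative):

"@aws-sdk/client-s3": "3.41.0",
"@aws-sdk/node-http-handler": "3.41.0",

Running npm ls @aws-sdk/config-resolver (or yarn why @aws-sdk/config-resolver) will show whether two different copies are still being installed.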

Here is a working minimal version of your function. Get this simple version working first and incrementally add complexity.
AWS_REGION is a Lambda-provided env var; I defined the BUCKET env var myself. You can pass the bucket name with the event payload (or, to begin with, hard-code it to minimise sources of error).
import { S3Client, ListObjectsCommand } from '@aws-sdk/client-s3';

const client = new S3Client({ region: process.env.AWS_REGION });
const bucket = process.env.BUCKET;

export async function handler(): Promise<void> {
  const cmd = new ListObjectsCommand({
    Bucket: bucket,
  });
  const objects = await client.send(cmd);
  console.log(objects);
}
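Once the minimal version works, the configuration from the question can be layered back in one piece at a time, which makes it obvious which addition (if any) triggers the error. A sketch of the next increment, reusing the timeout values from the question (leave the custom credentials out until everything else works):

import { S3Client, ListObjectsCommand } from '@aws-sdk/client-s3';
import { NodeHttpHandler } from '@aws-sdk/node-http-handler';

const client = new S3Client({
  region: process.env.AWS_REGION,
  maxAttempts: 3,
  requestHandler: new NodeHttpHandler({
    connectionTimeout: 3000, // give up on connecting after 3 seconds
    socketTimeout: 5000, // close idle sockets after 5 seconds
  }),
});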

Related

AWS S3 V3 Error trying to get list of objects inside a bucket. SignatureDoesNotMatch

I have a React.js project created using create-react-app and an AWS S3 bucket in which I've saved some images that I want to display on my website.
I have created an aws.js file where I configure the client and make the call like this:
import { S3Client } from "@aws-sdk/client-s3";
import { ListObjectsV2Command } from "@aws-sdk/client-s3";

const REGION = 'eu-central-1'

const credentials = {
  accessKeyId: accessKeyId,
  privateKeyId: privateKeyId,
}

const config = {
  region: REGION,
  credentials: credentials,
}

const bucketName = {
  Bucket: bucketName,
}

const s3Client = new S3Client(config);

export const run = async () => {
  try {
    const command = new ListObjectsV2Command(bucketName);
    const data = await s3Client.send(command);
    console.log("SUCCESS\n", data);
  }
  catch (err) {
    console.log("ERROR\n", err);
  }
}
I have also created a .env file where I saved the keys, with and without the REACT_APP prefix, but the result is the same: the credentials are rejected as invalid.
I've checked and rechecked the credentials ten times, and I also created a new user and used those keys, but nothing changed. I also configured CORS to allow access from my localhost.
What am I doing wrong? And is there complete documentation, from A to Z, on how to use AWS services? Including v3, the API docs, credentials setup and everything.
P.S. It's my first time using AWS, so some docs would be much appreciated. Thanks in advance.
UPDATE---
I tried the AWS JavaScript SDK v2 and now it works. Here is the code I used to list objects inside a bucket.
It only works when I use AWS.config.update, though; if I passed the configuration to the bucket it still threw an error.
const AWS = require('aws-sdk');

AWS.config.update({
  region: region,
  accessKeyId: accessKeyId,
  secretAccessKey: secretAccessKey
});

let s3 = new AWS.S3()

export const testFnc = () => {
  s3.listObjects({
    Bucket: 'artgalleryszili.digital'
  }, (err, data) => { // v2 callbacks are error-first: (err, data)
    if (err) {
      console.log(err);
    }
    else {
      console.log(data);
    }
  })
}
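For what it's worth, the v3 snippet above builds the credentials object with a privateKeyId field, which the v3 client does not recognise; it expects accessKeyId and secretAccessKey. A minimal sketch of the original v3 listing with the expected field names (region, keys and bucket name are the ones from the question):

import { S3Client, ListObjectsV2Command } from "@aws-sdk/client-s3";

const s3Client = new S3Client({
  region: 'eu-central-1',
  credentials: {
    accessKeyId: accessKeyId, // your access key id
    secretAccessKey: secretAccessKey, // v3 expects this name, not privateKeyId
  },
});

export const run = async () => {
  const data = await s3Client.send(new ListObjectsV2Command({ Bucket: bucketName }));
  console.log("SUCCESS\n", data);
};

Also bear in mind that anything exposed to a create-react-app bundle (REACT_APP_*) ships to the browser, so keys used this way are visible to anyone who loads the site.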

aws javascript sdk v3 - signature mismatch error

I can generate the presigned URL following the steps described in this section. I wanted to test uploading a specific image, marble.jpg, so I tried Postman: I copied the presigned URL and hit the endpoint with a PUT request, and I got this error:
<?xml version="1.0" encoding="UTF-8"?>
<Error>
<Code>SignatureDoesNotMatch</Code>
<Message>The request signature we calculated does not match the signature you provided. Check your key and signing method.</Message>
<Key>records/marble_cave.jpg</Key>
<BucketName>bucket</BucketName>
<Resource>/bucket/records/marble.jpg</Resource>
<RequestId>17E3999B521ABB65</RequestId>
<HostId>50abb07a-2ad0-4948-96e0-23403f661cba</HostId>
</Error>
The following resources are set up:
I'm using the MinIO server to test this locally.
I'm using v3 of the AWS SDK for Node.js.
I've triple-checked my credentials (simple MinIO creds with no special characters), and I'm definitely making a PUT request.
So, the questions are:
How do I set the signatureVersion using the new JavaScript AWS SDK version 3? (getSignedUrl, imported from '@aws-sdk/s3-request-presigner', is what generates the presigned URL in v3.)
What might be causing this error?
The code I use for presigned URL generation is:
import { getSignedUrl } from '@aws-sdk/s3-request-presigner';
import { PutObjectCommand, S3Client } from '@aws-sdk/client-s3';

const s3Client = new S3Client({
  region: 'us-east-1',
  credentials: {
    accessKeyId: 'minioadmin',
    secretAccessKey: 'minioadmin',
  },
  endpoint: 'http://172.21.0.2:9000',
  forcePathStyle: true,
});

const bucketParams = {
  Bucket: 'myBucket',
  Key: `marbles.jpg`,
};

const command = new PutObjectCommand(bucketParams);
const signedUrl = await getSignedUrl(s3Client, command, {
  expiresIn: 10000,
});
I stumbled on this issue myself a year ago. The v3 SDK has a bug: it doesn't take the port into consideration when signing a URL.
See https://github.com/aws/aws-sdk-js-v3/issues/2726
The workaround I ended up implementing overrides getSignedUrl in my code and adds the missing port, as follows:
import { BuildMiddleware, MetadataBearer, RequestPresigningArguments } from '@aws-sdk/types';
import { Client, Command } from '@aws-sdk/smithy-client';
import { HttpRequest } from '@aws-sdk/protocol-http';
import { formatUrl } from '@aws-sdk/util-format-url';
import { S3RequestPresigner } from '@aws-sdk/s3-request-presigner';

export const getSignedUrl = async <
  InputTypesUnion extends object,
  InputType extends InputTypesUnion,
  OutputType extends MetadataBearer = MetadataBearer
>(
  client: Client<any, InputTypesUnion, MetadataBearer, any>,
  command: Command<InputType, OutputType, any, InputTypesUnion, MetadataBearer>,
  options: RequestPresigningArguments = {}
): Promise<string> => {
  const s3Presigner = new S3RequestPresigner({ ...client.config });
  const presignInterceptMiddleware: BuildMiddleware<InputTypesUnion, MetadataBearer> =
    (next, context) => async (args) => {
      const { request } = args;
      if (!HttpRequest.isInstance(request)) {
        throw new Error('Request to be presigned is not a valid HTTP request.');
      }
      // Retry information headers are not meaningful in presigned URLs
      delete request.headers['amz-sdk-invocation-id'];
      delete request.headers['amz-sdk-request'];
      // User agent header would leak sensitive information
      delete request.headers['x-amz-user-agent'];
      delete request.headers['x-amz-content-sha256'];
      delete request.query['x-id'];
      if (request.port) {
        request.headers['host'] = `${request.hostname}:${request.port}`;
      }
      const presigned = await s3Presigner.presign(request, {
        ...options,
        signingRegion: options.signingRegion ?? context['signing_region'],
        signingService: options.signingService ?? context['signing_service'],
      });
      return {
        // Intercept the middleware stack by returning a fake response
        response: {},
        output: {
          $metadata: { httpStatusCode: 200 },
          presigned,
        },
      } as any;
    };
  const middlewareName = 'presignInterceptMiddleware';
  client.middlewareStack.addRelativeTo(presignInterceptMiddleware, {
    name: middlewareName,
    relation: 'before',
    toMiddleware: 'awsAuthMiddleware',
    override: true,
  });
  let presigned: HttpRequest;
  try {
    const output = await client.send(command);
    // @ts-ignore the output is faked, so it's not actually OutputType
    presigned = output.presigned;
  } finally {
    client.middlewareStack.remove(middlewareName);
  }
  return formatUrl(presigned);
};
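A hypothetical usage sketch, assuming the override above lives in a local module (the './presign' path is made up) and reusing the same MinIO-backed client and command from the question:

import { PutObjectCommand, S3Client } from '@aws-sdk/client-s3';
import { getSignedUrl } from './presign'; // hypothetical path to the override above

const s3Client = new S3Client({
  region: 'us-east-1',
  credentials: { accessKeyId: 'minioadmin', secretAccessKey: 'minioadmin' },
  endpoint: 'http://172.21.0.2:9000',
  forcePathStyle: true,
});

const command = new PutObjectCommand({ Bucket: 'myBucket', Key: 'marbles.jpg' });
const signedUrl = await getSignedUrl(s3Client, command, { expiresIn: 10000 });
// the signed host now includes the :9000 port, so the signature matches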
The solution is probably the same as in my other question, so I'm simply copying the answer:
I was trying and changing ports, and the PUT command seems to work when I use only localhost for URL generation.
So, instead of this:
new S3Client({
  region: 'us-east-1',
  credentials: {
    accessKeyId: 'minioadmin',
    secretAccessKey: 'minioadmin',
  },
  endpoint: 'http://172.21.0.2:9000',
  forcePathStyle: true,
});
I use:
new S3Client({
  region: 'us-east-1',
  credentials: {
    accessKeyId: 'minioadmin',
    secretAccessKey: 'minioadmin',
  },
  endpoint: 'http://172.21.0.2', // or 'http://127.0.0.1'
  forcePathStyle: true,
});
Note that I haven't used any port number, so the default is 80.
If you're using docker-compose, add this config:
.
.
.
ports:
  - "80:9000"
and it works fine.

AWS SDK : s3.upload is not a function

I am trying to upload files to my S3 bucket from my Node.js app, so I am following some very simple tutorials like this one.
The code is pretty straightforward:
const AWS = require("aws-sdk"); // fresh install, version : ^2.697.0
AWS.config.update({ // Credentials are OK
accessKeyId: process.env.s3_accessKeyId,
secretAccessKey: process.env.s3_secretAccessKey,
region: 'eu-central-1'
});
const s3 = new AWS.S3();
let params = {
// (some upload params, file name, bucket name etc)
};
s3.upload(params); // <-- crash with error: "s3.upload is not a function"
I had a look at the official AWS documentation and s3.upload() seems to be a thing. I have no idea why I get an error.
If I console.log(s3.upload) I get undefined.
Node.js v13.11.0.
EDIT
I ended up using s3.putObject() which does pretty much the same thing as s3.upload(), and works, while the latter is still inexplicably undefined...
console.log(`typeof s3.upload = `);
console.log(typeof s3.upload); // undefined?? WHY
console.log(`typeof s3.putObject = `);
console.log(typeof s3.putObject); // function, and works
Use putObject, example:
s3.putObject({
  Bucket: bucketName,
  Key: 'folder/file.txt',
  Body: data,
  ACL: 'public-read'
}, function (err, data) { // error-first callback
  if (err) {
    console.log(err);
  } else {
    console.log('Successfully uploaded file.');
  }
})
Documentation: https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#putObject-property
You can also try reinstalling the aws-sdk package.
Refer: https://github.com/aws/aws-sdk-js/issues/916#issuecomment-191012462
You can try this
s3 = new AWS.S3({ apiVersion: '2006-03-01' });
s3.upload(params, function (err, data) {
  console.log(err, data);
});
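If callbacks are awkward, v2's upload() also returns a managed upload object with a promise() method, so the same call can be awaited inside an async function; a small sketch reusing the s3 and params from above:

try {
  const data = await s3.upload(params).promise();
  console.log('Uploaded to', data.Location);
} catch (err) {
  console.log(err);
}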

Upload csv file to aws s3 bucket directly from a server

Happy weekend all,
I'm working on a task that fetches data from an API, stores it in a CSV file, and then uploads that file directly to an AWS S3 bucket. I've tried several ways but I'm currently stuck at the very last point. Any help would be much appreciated.
The code below demonstrates most of the problem and what I've been trying so far.
First, I will fetch the data from an API
async systems() {
  const endpoint = sampleEndPoints.SYSTEMS
  return this.aggregateEndpoint(endpoint)
}
Second, I take the fetched data and put it into a CSV file as a buffer (because I have to pass it to fs.createReadStream later on).
// generate JSON to Buffer
async generateCsvToBuffer(json) {
  const { aws } = this.config
  var ws = xlsx.utils.json_to_sheet(json)
  var wb = xlsx.utils.book_new();
  await xlsx.utils.book_append_sheet(wb, ws, 'Systems')
  const csvParsed = xlsx.write(wb, { type: 'buffer' })
  return csvParsed;
}
Third, I take the buffer from csvParsed and upload it to AWS S3. The problem is right here: Body: fileStream.path is supposed to contain the content of the file, but unfortunately it ends up like this, which comes from fs.createReadStream:
'{"type":"Buffer","data":[80,75,3,4,10,0,0,0,0,0,249,117,199,78,214,146,124
async uploadSample(file) {
  const { aws } = this.config
  AWS.config.update({
    secretAccessKey: aws.secretAccessKey,
    accessKeyId: aws.accessKeyId,
    region: 'us-east-2'
  })
  const bufferObject = new Buffer.from(JSON.stringify(file))
  /*** WE NEED THE FILE SYSTEM IN ORDER TO STORE */
  const fileStream = fs.createReadStream(bufferObject)
  const uploadParams = { Bucket: aws.bucket, Key: aws.key, Body: fileStream.path }
  const s3 = new AWS.S3()
  await s3.upload(uploadParams, null, function (error, file) {
    if (error) {
      console.log(error)
    } else {
      console.log('Successfully uploaded')
    }
  })
}
All of my functions are executed in server.js, so if you have a look at this you can get the whole picture of the problem:
app.get('/systems/parsed', async (req, res) => {
  const Sample = await Sample()
  // Fetch the data from an API
  const systems = await Cache.remember('systems', async () => {
    return Sample.systems()
  })
  const integration = await IntegrationInstance()
  /** GET THE RESPONSE DATA AND PUT THEM IN A CSV FILE */
  const result = await integration.generateCsvToBuffer(systems)
  const aws = await AwsInstance()
  /*** GET THE SYSTEMS FILE (CSV FILE) THEN UPLOAD THEM INTO THE AWS S3 BUCKET */
  const awsUpload = await aws.uploadWorkedWithBuffer(result)
  return res.send(awsUpload);
})
My only concern here is that the file uploads successfully to Amazon S3, but the content of the file is still the serialized buffer. Any help with the existing functions, or a shorter way to do this, would be much appreciated.
To summarize again: fetch data from an API -> put it into a CSV file as a buffer (requested from a web browser) -> from there upload it to an AWS S3 bucket -> the problem is that the file is uploaded, but its content is still the raw buffer.
It looks like you are making things more complicated than necessary here. According to the documentation for .upload, you can pass a buffer to upload directly instead of creating a stream from the buffer. I suspect your underlying issue is passing the path from the stream instead of the stream itself, though.
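A minimal sketch of that suggestion, reusing the config object and the csvParsed buffer names from the question (not a drop-in for the original method, just the shape of the call):

async uploadSample(csvBuffer) {
  const { aws } = this.config
  AWS.config.update({
    secretAccessKey: aws.secretAccessKey,
    accessKeyId: aws.accessKeyId,
    region: 'us-east-2'
  })
  const s3 = new AWS.S3()
  // pass the buffer itself as Body; no temp file or read stream needed
  return s3.upload({ Bucket: aws.bucket, Key: aws.key, Body: csvBuffer }).promise()
}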
I actually solved it.
First, when you create the function generateCsvToBuffer, remember to set a bookType in the write options for your wb (Workbook) so that S3 recognizes the file. The function should be something like this:
async generateCsvToBuffer(json) {
  const { aws } = this.config
  var ws = xlsx.utils.json_to_sheet(json)
  var wb = xlsx.utils.book_new();
  await xlsx.utils.book_append_sheet(wb, ws, 'Systems')
  const csvParsed = xlsx.write(wb, { type: 'buffer', bookType: 'csv' })
  return csvParsed;
}
Second, you have to add ContentDisposition: 'attachment' to the uploadParams in the AWS configuration:
async uploadSample(file) {
  const { aws } = this.config
  AWS.config.update({
    secretAccessKey: aws.secretAccessKey,
    accessKeyId: aws.accessKeyId,
    region: 'us-east-2'
  })
  const bufferObject = Buffer.from(JSON.stringify(file))
  /*** WE NEED THE FILE SYSTEM IN ORDER TO STORE */
  const fileStream = fs.createReadStream(bufferObject)
  const uploadParams = {
    Bucket: aws.bucket,
    Key: aws.key,
    Body: fileStream.path,
    ContentDisposition: 'attachment' // as described above
  }
  const s3 = new AWS.S3()
  await s3.upload(uploadParams, null, function (error, file) {
    if (error) {
      console.log(error)
    } else {
      console.log('Successfully uploaded')
    }
  })
}

Missing ; before statement (Jira Addon - oAuth)

I get the error:
[WARNING] File encoding has not been set, using platform encoding windows-1252, i.e. build is platform dependent!
[INFO] Compiling javascript using YUI
[ERROR] missing ; before statement let privateKeyData = fs.readFileSync('location','utf-8');
As shown, I have put the ; before let. I don't understand the error. I am creating an add-on for Jira. I ran the JS file via cmd and it worked; however, when I package the project I get that error. Please help.
jQuery(function($) {
  var initmyConfluenceMacro = function() {
    $(".myConfluenceMacro").each(function() {
      const request = require('request');
      const fs = require('fs');
      let privateKeyData = fs.readFileSync('filelocation', 'utf-8');
      const oauth = {
        consumer_key: 'mykey',
        consumer_secret: privateKeyData,
        token: 'mytoken',
        token_secret: 'tokensecret',
        signature_method: 'signaturemethod'
      };
      request.get({
        url: 'thelink',
        oauth: oauth,
        qs: null,
        json: true
      }, function(e, r, user) {
        console.log(user)
      });
      var html = "output";
      $(this).html(html);
    });
  };
  $(document).ready(function() {
    initmyConfluenceMacro();
  });
});
The reason for the error is this line:
const fs = require('fs');
fs is (as stated on their page) removed by Atlassian for security reasons and cannot be used. My workaround was to use a Velocity template to import the file and then pass it to the JS file.
I hope this helps. If someone has other ideas, please let me know.
