I need to figure out where my files are downloading when I use filesDownload(). I don't see an argument for the file destination. Here's my code:
require('isomorphic-fetch');
var Dropbox = require('dropbox').Dropbox;
var dbx = new Dropbox({ accessToken: 'accessToken', fetch });
dbx.filesDownload({ path: 'filepath' })
  .then(function(response) {
    console.log(response);
  })
  .catch(function(error) {
    console.log(error);
  });
I'm getting a successful callback when I run the code but I don't see the file anywhere.
I need to know where my files are downloading to and how to specify the file destination in my function.
Thanks,
Gerald
I've used the function as described in the SDK's documentation (http://dropbox.github.io/dropbox-sdk-js/Dropbox.html#filesDownload__anchor) but I have no idea where my file goes.
Expected result: Files are downloaded from Dropbox to the path that I have designated.
Actual result: I get a successful callback from Dropbox, but I cannot find the downloaded files.
In Node.js, the Dropbox API v2 JavaScript SDK download-style methods return the file data in the fileBinary property of the object they pass to the callback (which is response in your code).
You can find an example of that here:
https://github.com/dropbox/dropbox-sdk-js/blob/master/examples/javascript/node/download.js#L20
So, you should be able to access the data as response.fileBinary. It doesn't automatically save it to the local filesystem for you, but you can then do so if you want.
You need to use the fs module to save the binary data to a file.
var fs = require('fs');

dbx.filesDownload({ path: YourfilePath })
  .then(function(response) {
    console.log(response.media_info);
    // response.fileBinary holds the file contents returned by Dropbox
    fs.writeFile(response.name, response.fileBinary, 'binary', function (err) {
      if (err) { throw err; }
      console.log('File: ' + response.name + ' saved.');
    });
  })
  .catch(function(error) {
    console.error(error);
  });
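Note: in newer releases of the SDK (roughly v9 and later), responses are wrapped, so the file data lives under response.result instead; a minimal sketch, assuming one of those versions:
// dropbox SDK v9+: the payload is nested under response.result
dbx.filesDownload({ path: YourfilePath })
  .then(function(response) {
    fs.writeFile(response.result.name, response.result.fileBinary, 'binary', function (err) {
      if (err) { throw err; }
    });
  });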
I am trying to fetch an S3 object using AWS Amplify Storage:
fetchAvatar = async () => {
  try {
    const imageData = await Storage.get("public/public/us-east-2:/1597842961073/family.jpg")
    console.log(imageData)
  } catch (err) {
    console.log('error fetching avatar: ')
    console.log(err)
  }
}
When I click on the link that imageData provides, I get a NoSuchKey error, even though the object does exist.
I've made sure that the image is public and accessible by everyone, so there shouldn't be any authentication problems. I've also looked at similar issues and made sure there are no spaces or special characters in my image keys. I am kind of stumped on this...
So I figured out the reason, and it has something to do with AWS S3 management. For some reason, every time I upload an image, the folder resets and becomes private. When I remake the folders and make the image public manually, I am able to render the image properly... So I guess it is more of an AWS issue or bug that they need to fix.
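One possible way to sidestep the ACL problem entirely is to let Amplify mint a pre-signed URL instead of hand-building the object path; a minimal sketch, assuming the file was uploaded through Amplify at the public level (the key family.jpg is illustrative):
import { Storage } from 'aws-amplify';

// Storage.get returns a pre-signed URL, so the object never needs a public ACL
const fetchAvatar = async () => {
  const url = await Storage.get('family.jpg', {
    level: 'public', // Amplify adds the public/ prefix to the key for you
    expires: 60      // seconds the signed URL stays valid
  });
  console.log(url);
};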
I suggest using the JavaScript AWS SDK; you can get an object from the bucket as shown below:
// assumes the aws-sdk package is installed and credentials are configured
var AWS = require('aws-sdk');
var s3 = new AWS.S3();

var params = {
  Bucket: "your-bucket-name",
  Key: "yourFileName.jpg"
};
s3.getObject(params, function(err, data) {
  if (err) console.log(err, err.stack); // an error occurred
  else console.log(data);               // successful response
});
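The object contents come back in data.Body as a Buffer; a minimal sketch of decoding it, assuming the object holds text:
// data.Body is a Buffer; decode it when the object is text
console.log(data.Body.toString('utf-8'));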
UPDATE:
You can define your region when you create an S3 instance, like:
const s3 = new AWS.S3({
  region: 'eu-central-1',
});
So I am making a local app using JavaScript, React and Electron, and I want it to work just fine without internet.
I can't use 'localStorage' because the data might get deleted if the user clears the cache.
I tried reading/writing using different modules; none of them worked, mostly because of CORS. Using XMLHttpRequest and Ajax doesn't work either, and I am running out of time.
When I use them on the test server, they return the index.html for the main page (they can at least access that... and still they can't read the data), but when I try it on the build I get the CORS error.
My idea for now is to enable CORS on my web page since I have no worries about security: the app will run ONLY offline, so there is no danger.
But after many hours... I didn't find a way to do it on the client side.
If anyone has an idea or suggestion I would be grateful.
I tried: fs, FileReader, FileSaver, $.ajax, XMLHttpRequest
// using $.ajax
var test = $.ajax({
  crossDomain: true,
  type: 'GET',
  url: '../data/DefaultCategorie.txt',
  contentType: 'text/plain',
  success: function(data) {
    console.log(data);
  },
  error: function() {
    alert('failed');
  }
});
// using fs
const fs = require('fs');

fs.readFile('../data/DefaultCategorie.txt', 'utf8', (err, data) => {
  if (err) {
    console.log("Failed");
    throw err;
  }
  console.log(data);
  // no fs.close() needed: fs.readFile opens and closes the file itself
});
This article covers the 3 most common ways to store user data: How to store user data in Electron
The Electron API for appData does what you want. It is very easy to use.
From the above article:
const electron = require('electron');
const path = require('path');
const fs = require('fs');

// works in both the main process and the renderer process
const userDataPath = (electron.app || electron.remote.app).getPath('userData');
this.path = path.join(userDataPath, opts.configName + '.json');
this.data = parseDataFile(this.path, opts.defaults);

function parseDataFile(filePath, defaults) {
  try {
    return JSON.parse(fs.readFileSync(filePath));
  } catch (error) {
    // if there was some kind of error, return the passed in defaults instead.
    return defaults;
  }
}
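The article's snippet only covers the read path; a minimal sketch of the matching write, assuming the same this.path as above:
// persist the in-memory data back to the same JSON file
function saveDataFile(filePath, data) {
  fs.writeFileSync(filePath, JSON.stringify(data));
}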
Docs
app.getPath(name)
name String
Returns String - A path to a special directory or file associated with
name. On failure, an Error is thrown.
You can request the following paths by the name:
appData - Per-user application data directory, which by default points to:
%APPDATA% on Windows
$XDG_CONFIG_HOME or ~/.config on Linux
~/Library/Application Support on macOS
userData - The directory for storing your app's configuration files,
which by default is the appData directory appended with your app's
name.
The event passed to my Google cloud function only really tells me the name of the bucket and file, and whether the file was deleted. Yes, there's more there, but it doesn't seem all that useful:
{ timestamp: '2017-03-25T07:13:40.293Z',
eventType: 'providers/cloud.storage/eventTypes/object.change',
resource: 'projects/_/buckets/my-echo-bucket/objects/base.json#1490426020293545',
data: { kind: 'storage#object',
resourceState: 'exists',
id: 'my-echo-bucket/base.json/1490426020293545',
selfLink: 'https://www.googleapis.com/storage/v1/b/my-echo-bucket/o/base.json',
name: 'base.json',
bucket: 'my-echo-bucket',
generation: '1490426020293545',
metageneration: '1',
contentType: 'application/json',
timeCreated: '2017-03-25T07:13:40.185Z',
updated: '2017-03-25T07:13:40.185Z',
storageClass: 'STANDARD',
size: '548',
md5Hash: 'YzE3ZjUyZjlkNDU5YWZiNDg2NWI0YTEyZWZhYzQyZjY=',
mediaLink: 'https://www.googleapis.com/storage/v1/b/my-echo-bucket/o/base.json?generation=1490426020293545&alt=media',
contentLanguage: 'en',
crc32c: 'BQDL9w==' }
}
How do I get the contents and not merely the meta-data of a new .json file uploaded to a gs bucket?
I tried using npm:request() on event.data.selfLink, which is a URL for the file in the storage bucket, and got back an authorization error:
"code": 401, "message": "Anonymous users does not have storage.objects.get access to object my-echo-bucket/base.json."
There was a similar question on SO about reading storage buckets, but probably on a different platform. Anyway it was unanswered:
How do I read the contents of a file on Google Cloud Storage using javascript
You need to use a client library for Google Storage instead of accessing via the URL. Using request() against the URL would only work if the file was exposed to public access.
Install the Google Cloud Storage library in the npm-managed directory containing your project:
npm i @google-cloud/storage -S
The npm page for google-cloud/storage has decent examples but I had to read through the API a bit to see an easy way to download to memory.
Within the Google Cloud Functions environment, you do not need to supply an API key or other credentials when initializing the storage client.
const storage = require('@google-cloud/storage')();
The metadata passed about the file can be used to determine if you really want the file or not.
When you want the file, you can download it with the file.download function, which can take either a callback or, lacking a callback, will return a promise.
The data, however, is returned as a Buffer (the promise resolves with an array whose first element is the Buffer), so you will need to call toString('utf-8') on it to convert it to a utf-8 encoded string.
const storage = require('@google-cloud/storage')();

exports.logNewJSONFiles = function logNewJSONFiles(event){
  return new Promise(function(resolve, reject){
    const file = event.data;
    if (!file){
      console.log("not a file event");
      return resolve();
    }
    if (file.resourceState === 'not_exists'){
      console.log("file deletion event");
      return resolve();
    }
    if (file.contentType !== 'application/json'){
      console.log("not a json file");
      return resolve();
    }
    if (!file.bucket){
      console.log("bucket not provided");
      return resolve();
    }
    if (!file.name){
      console.log("file name not provided");
      return resolve();
    }
    storage
      .bucket(file.bucket)
      .file(file.name)
      .download()
      .then(function(data){
        // download() resolves with an array; the first element is the Buffer
        if (data) return data[0].toString('utf-8');
      })
      .then(function(data){
        if (data) {
          console.log("new file " + file.name);
          console.log(data);
          resolve(data);
        }
      })
      .catch(function(e){ reject(e); });
  });
};
Deployment is as expected:
gcloud beta functions deploy logNewJSONFiles --stage-bucket gs://my-stage-bucket --trigger-bucket gs://my-echo-bucket
Remember to look in the Stackdriver:Logging page on Google Cloud Platform for the console.log entries.
UPDATE (2019): When Cloud Functions first released, it had some issues with ECONNRESET. I think that's fixed now. If not, use something like npm:promise-retry.
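If you do hit those transient errors, a minimal retry wrapper around the download, assuming npm:promise-retry is installed, could look like:
const promiseRetry = require('promise-retry');

// retry the download with exponential backoff on transient failures
promiseRetry(function (retry) {
  return storage
    .bucket(file.bucket)
    .file(file.name)
    .download()
    .catch(retry);
}, { retries: 3 })
  .then(function (data) {
    console.log(data[0].toString('utf-8'));
  });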
npm install @google-cloud/storage --production
package.json:
{
  "main": "app.js",
  "dependencies": {
    "@google-cloud/storage": "^1.2.1"
  }
}
Make sure that npm ls reports no errors such as npm ERR! missing:.
app.js:
...
const storage = require("@google-cloud/storage")();

storage
  .bucket("mybucket")
  .file("myfile.txt")
  .download(function(err, contents) {
    console.log(contents.toString());
  });
I have successfully uploaded files to Firebase's storage via Google Cloud Storage through JS! What I noticed is that unlike files uploaded directly, the files uploaded through Google Cloud only have a storage location URL, which isn't a full URL, so it cannot be read! I'm wondering if there is a way to generate a full URL on upload for the "Download URL" field of Firebase's actual storage.
Code being used:
// assumes the gcloud and request packages are installed
var request = require('request');
var gcloud = require('gcloud');

var filename = image.substring(image.lastIndexOf("/") + 1).split("?")[0];
var gcs = gcloud.storage();
var bucket = gcs.bucket('bucket-name-here.appspot.com');

request(image).pipe(bucket.file('photos/' + filename).createWriteStream(
  {metadata: {contentType: 'image/jpeg'}}))
  .on('error', function(err) {})
  .on('finish', function() {
    console.log('Upload of ' + filename + ' finished.');
  });
When using the GCloud client, you want to use getSignedUrl() to download the file, like so:
bucket.file('photos/' + filename).getSignedUrl({
action: 'read',
expires: '03-17-2025'
}, function(err, url) {
if (err) {
console.error(err);
return;
}
// The file is now available to read from this URL.
request(url, function(err, resp) {
// resp.statusCode = 200
});
});
You can either:
a) Create a download URL through the Firebase console, or
b) attempt to get the download URL programmatically from a Firebase client, in which case one will be created on the fly for you.
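For option (b), a minimal sketch with the Firebase JS client (the path photos/myphoto.jpg is illustrative):
// getDownloadURL() mints a download URL on demand
firebase.storage()
  .ref('photos/myphoto.jpg')
  .getDownloadURL()
  .then(function(url) {
    console.log(url);
  });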
I uploaded a file to my blob storage in Azure and now I want to get the Azure link for the upload. I'm using Node.js and below is my code:
// assumes the azure-storage package is installed
var azure = require('azure-storage');
var blobService = azure.createBlobService();

blobService.createContainerIfNotExists('trimfaces', {
  publicAccessLevel: 'blob'
}, function(error, result, response) {
  if (!error) {
    // if result = true, container was created.
    // if result = false, container already existed.
  }
});
You might want to call blobService.getUrl(containerName, blobName). The API document is here: http://azure.github.io/azure-storage-node/BlobService.html#getUrl
[Updated]
The versioned document for v10 or later is at:
https://azuresdkdocs.blob.core.windows.net/$web/javascript/azure-storage-blob/12.0.0/classes/blockblobclient.html#url
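For the classic azure-storage SDK, a minimal sketch of getUrl (the blob name is illustrative; the container above was created with public blob access, so no SAS token is needed):
// builds the blob's primary URL locally; no network call is made
var url = blobService.getUrl('trimfaces', 'photos/myface.jpg');
console.log(url);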
According to the documentation, this is the way to list blobs using Node:
blobSvc.listBlobsSegmented('mycontainer', null, function(error, result, response){
if(!error){
// result.entries contains the entries
// If not all blobs were returned, result.continuationToken has the continuation token.
}
});
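If not all blobs come back in one call, result.continuationToken can be passed back in to fetch the next page; a minimal sketch of walking the whole container (container name as above):
// page through every blob by feeding the continuation token back in
function listAllBlobs(token, entries, done) {
  blobSvc.listBlobsSegmented('mycontainer', token, function(error, result) {
    if (error) { return done(error); }
    entries = entries.concat(result.entries);
    if (result.continuationToken) {
      return listAllBlobs(result.continuationToken, entries, done);
    }
    done(null, entries);
  });
}

listAllBlobs(null, [], function(error, entries) {
  if (!error) { console.log(entries.length + ' blobs found'); }
});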
You can see the entire documentation for Azure Storage Blobs here:
https://azure.microsoft.com/en-us/documentation/articles/storage-nodejs-how-to-use-blob-storage/