Dropbox.js: how to download a specific revision? Is it possible? - javascript

I'm using dropbox.js to upload files from my web app to the cloud.
I noticed that if you upload two files with the same name, Dropbox just creates another version or revision of the file.
The thing is, I can't find any way to programmatically download a specific revision of the file.
Is there any workaround? Any help will be appreciated.
I'm using this function to generate the download link:
function downFile(i) {
    var client = new Dropbox.Client({
        key: "xxxxxxxxxxx",
        secret: "xxxxxxxxxx",
        sandbox: false,
        token: "xxxxxxxxxxxx"
    });
    client.makeUrl(i, {
        downloadHack: false
    }, function(error, data) {
        if (error) {
            return console.log(error); // Something went wrong.
        }
        $("#mylink").html(data.url);
        $("#mylink").attr("href", data.url);
    });
}

In dropbox.js, the method to download a file is called readFile, and it takes an optional rev parameter to specify which revision of the file you want to access.
So something like client.readFile(path, { rev: 'abc123' }, function (err, contents) { ... }); should work.
The code you have so far seems to be creating a share link for the file. There's no way to create a share link that points to a specific revision of the file... you'll always get the latest file contents from a share link.
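To get hold of a revision id in the first place, dropbox.js exposes a history call on the client that lists a file's revisions. A minimal sketch, assuming the authenticated client from the question (check your dropbox.js version for the exact method and field names):
// Minimal sketch: list a file's revisions, then read one of them.
// Assumes `client` is the authenticated Dropbox.Client from the question;
// method/field names (history, versionTag) may differ by dropbox.js version.
function downloadRevision(path) {
    client.history(path, function(error, versions) {
        if (error) {
            return console.log(error);
        }
        var oldest = versions[versions.length - 1]; // pick whichever revision you need
        client.readFile(path, { rev: oldest.versionTag }, function(err, contents) {
            if (err) {
                return console.log(err);
            }
            console.log(contents); // file contents at that revision
        });
    });
}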

Related

Read/Write and store data internally in a Local App (no server) with JavaScript

So I am making a local app using JavaScript, React and Electron, and I want it to work fine without internet access.
I can't use 'localStorage' because the data might get deleted if the user clears the cache.
I tried reading/writing using different modules, but none of them worked, mostly because of CORS. Using XMLHttpRequest and Ajax doesn't work either, and I'm running out of time.
When I use them on the test server, they return the index.html for the main page (so they can at least access that, but they still can't read the data), but when I try it on the build I get the CORS error.
My idea for now is to enable CORS on my page, since I have no security concerns: the app will run ONLY offline, so there is no danger.
But after many hours I didn't find a way to do it on the client side.
If anyone has an idea or suggestion I would be grateful.
I tried: fs, FileReader, FileSaver, $.ajax, XMLHttpRequest
// using $.ajax
var test = $.ajax({
    crossDomain: true,
    type: 'GET',
    url: '../data/DefaultCategorie.txt',
    contentType: 'text/plain',
    success: function(data) {
        console.log(data);
    },
    error: function() {
        alert('failed');
    },
});
// using fs
fs.readFile('../data/DefaultCategorie.txt', 'utf8', (err, data) => {
    if (err) {
        console.log("Failed");
        throw err;
    }
    console.log(data);
    // Note: fs.readFile does not hand back a file descriptor, so this
    // fs.close(data, ...) call receives the file contents and is unnecessary.
    fs.close(data, (err) => {
        if (err) throw err;
    });
});
This article covers the three most common ways to store user data: How to store user data in Electron
The Electron API for appData does what you want. It is very easy to use.
From the above article:
const electron = require('electron');
const path = require('path');
const fs = require('fs');

const userDataPath = (electron.app || electron.remote.app).getPath('userData');
this.path = path.join(userDataPath, opts.configName + '.json');
this.data = parseDataFile(this.path, opts.defaults);

function parseDataFile(filePath, defaults) {
    try {
        return JSON.parse(fs.readFileSync(filePath));
    } catch (error) {
        // if there was some kind of error, return the passed in defaults instead.
        return defaults;
    }
}
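The write side is symmetric; a minimal sketch, assuming the this.path set above (the saveDataFile helper name is illustrative, not from the article):
// Minimal sketch of writing the data back; helper name is illustrative.
function saveDataFile(filePath, data) {
    // Config files are small, so a synchronous write in the main process is fine.
    fs.writeFileSync(filePath, JSON.stringify(data));
}

// Usage: saveDataFile(this.path, this.data);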
Docs
app.getPath(name)
name String
Returns String - A path to a special directory or file associated with
name. On failure, an Error is thrown.
You can request the following paths by the name:
appData - Per-user application data directory, which by default points to:
%APPDATA% on Windows
$XDG_CONFIG_HOME or ~/.config on Linux
~/Library/Application Support on macOS
userData - The directory for storing your app's configuration files,
which by default it is the appData directory appended with your app's
name.

How do I read the contents of a new cloud storage file of type .json from within a cloud function?

The event passed to my Google cloud function only really tells me the name of the bucket and file, and whether the file was deleted. Yes, there's more there, but it doesn't seem all that useful:
{ timestamp: '2017-03-25T07:13:40.293Z',
  eventType: 'providers/cloud.storage/eventTypes/object.change',
  resource: 'projects/_/buckets/my-echo-bucket/objects/base.json#1490426020293545',
  data: { kind: 'storage#object',
    resourceState: 'exists',
    id: 'my-echo-bucket/base.json/1490426020293545',
    selfLink: 'https://www.googleapis.com/storage/v1/b/my-echo-bucket/o/base.json',
    name: 'base.json',
    bucket: 'my-echo-bucket',
    generation: '1490426020293545',
    metageneration: '1',
    contentType: 'application/json',
    timeCreated: '2017-03-25T07:13:40.185Z',
    updated: '2017-03-25T07:13:40.185Z',
    storageClass: 'STANDARD',
    size: '548',
    md5Hash: 'YzE3ZjUyZjlkNDU5YWZiNDg2NWI0YTEyZWZhYzQyZjY=',
    mediaLink: 'https://www.googleapis.com/storage/v1/b/my-echo-bucket/o/base.json?generation=1490426020293545&alt=media',
    contentLanguage: 'en',
    crc32c: 'BQDL9w==' }
}
How do I get the contents and not merely the meta-data of a new .json file uploaded to a gs bucket?
I tried using npm:request() on event.data.selfLink, which is a URL for the file in the storage bucket, and got back an authorization error:
"code": 401, "message": "Anonymous users does not have storage.objects.get access to object my-echo-bucket/base.json."
There was a similar question on SO about reading storage buckets, but probably on a different platform. Anyway it was unanswered:
How do I read the contents of a file on Google Cloud Storage using javascript
You need to use a client library for google storage instead of accessing via the URL. Using request() against the URL would only work if the file was exposed to public access.
Import the google cloud storage library in the npm-managed directory containing your project.
npm i @google-cloud/storage -S
The npm page for google-cloud/storage has decent examples but I had to read through the API a bit to see an easy way to download to memory.
Within the Google Cloud Functions environment, you do not need to supply an API key or other credentials when initializing the storage client.
const storage = require('@google-cloud/storage')();
The metadata passed about the file can be used to determine if you really want the file or not.
When you want the file, you can download it with the file.download function, which can take either a callback or, lacking a callback, will return a promise.
The data, however, is returned as a Buffer, so you will need to call data.toString('utf-8') to convert it to a UTF-8 encoded string.
const storage = require('@google-cloud/storage')();

exports.logNewJSONFiles = function logNewJSONFiles(event) {
    return new Promise(function(resolve, reject) {
        const file = event.data;
        if (!file) {
            console.log("not a file event");
            return resolve();
        }
        if (file.resourceState === 'not_exists') {
            console.log("file deletion event");
            return resolve();
        }
        if (file.contentType !== 'application/json') {
            console.log("not a json file");
            return resolve();
        }
        if (!file.bucket) {
            console.log("bucket not provided");
            return resolve();
        }
        if (!file.name) {
            console.log("file name not provided");
            return resolve();
        }
        storage
            .bucket(file.bucket)
            .file(file.name)
            .download()
            .then(function(data) {
                // download() resolves with an array: [contents]
                if (data)
                    return data.toString('utf-8');
            })
            .then(function(data) {
                if (data) {
                    console.log("new file " + file.name);
                    console.log(data);
                    resolve(data);
                }
            })
            .catch(function(e) { reject(e); });
    });
};
Deployment is as expected:
gcloud beta functions deploy logNewJSONFiles --stage-bucket gs://my-stage-bucket --trigger-bucket gs://my-echo-bucket
Remember to look in the Stackdriver:Logging page on Google Cloud Platform for the console.log entries.
UPDATE (2019): When Cloud Functions first released, it had some issues with ECONNRESET. I think that's fixed now; if not, use something like npm:promise-retry.
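A minimal sketch of wrapping the download in promise-retry (the downloadWithRetry helper is illustrative; the retry callback signature is from the npm promise-retry package):
// Illustrative sketch only: retries the download on transient errors
// such as ECONNRESET, using the npm `promise-retry` package.
const promiseRetry = require('promise-retry');

function downloadWithRetry(file) {
    return promiseRetry(function(retry, attempt) {
        return file.download().catch(function(err) {
            console.log("download attempt " + attempt + " failed: " + err.message);
            retry(err); // re-throws until the retry budget is exhausted
        });
    }, { retries: 3 });
}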
npm install @google-cloud/storage --production
package.json:
{
    "main": "app.js",
    "dependencies": {
        "@google-cloud/storage": "^1.2.1"
    }
}
Make sure that npm ls shows no errors such as npm ERR! missing:.
app.js:
...
const storage = require("@google-cloud/storage")();

storage
    .bucket("mybucket")
    .file("myfile.txt")
    .download(function(err, contents) {
        console.log(contents.toString());
    });

Convert wav to mp3 using Meteor FS Collections on Startup

I'm trying to transcode all wav files into mp3 using Meteor and Meteor FS Collections. My code works when I upload a wav file through the uploader: it converts the wav to an mp3 and lets me play the file. But I'm looking for a Meteor solution that will transcode the file and add it to the DB if the file is a wav and exists in a certain directory. According to Meteor FS Collection this should be possible if the files have already been stored. Here is their example code (gm is for ImageMagick; I've replaced gm with ffmpeg, installed from Atmosphere):
Images.find().forEach(function (fileObj) {
    var readStream = fileObj.createReadStream('images');
    var writeStream = fileObj.createWriteStream('images');
    gm(readStream).swirl(180).stream().pipe(writeStream);
});
I'm using Meteor-CollectionFS [https://github.com/CollectionFS/Meteor-CollectionFS]-
if (Meteor.isServer) {
    Meteor.startup(function () {
        Wavs.find().forEach(function (fileObj) {
            var readStream = fileObj.createReadStream('.wavs/mp3');
            var writeStream = fileObj.createWriteStream('.wavs/mp3');
            this.ffmpeg(readStream).audioCodec('libmp3lame').format('mp3').pipe(writeStream);
            Wavs.insert(fileObj, function(err) {
                console.log(err);
            });
        });
    });
}
And here is my FS.Collection and FS.Store information. Currently everything resides in one JS file.
Wavs = new FS.Collection("wavs", {
    stores: [
        new FS.Store.FileSystem("wav"),
        new FS.Store.FileSystem("mp3", {
            path: '~/wavs/mp3',
            beforeWrite: function(fileObj) {
                return {
                    extension: 'mp3',
                    fileType: 'audio/mp3'
                };
            },
            transformWrite: function(fileObj, readStream, writeStream) {
                ffmpeg(readStream).audioCodec('libmp3lame').format('mp3').pipe(writeStream);
            }
        })
    ]
});
When I try and insert the file into the db on the server side I get this error: MongoError: E11000 duplicate key error index:
Otherwise, If I drop a wav file into the directory and restart the server, nothing happens. I'm new to meteor, please help. Thank you.
The error is clear: you're trying to insert a new object with the same (duplicate) _id. You should either remove the _id first or update the existing document instead of inserting a new one. If you don't provide the _id field, it will be generated automatically.
delete fileObj._id;
Wavs.insert(fileObj, function(error, result) {
});
See this How do I remove a property from a JavaScript object?
Why do you want to convert the files only on startup, i.e. only once? You probably want to do this continuously; if so, you should use this:
Tracker.autorun(function(){
//logic
});
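Note that Tracker.autorun is a client-side reactive primitive; on the server, a cursor observer does the equivalent job. A rough sketch of reacting to newly added wav files (the query on original.type and the log message are assumptions, not the asker's code; the "mp3" store's transformWrite above does the actual transcoding):
// Illustrative sketch: watch for newly inserted wav files on the server.
if (Meteor.isServer) {
    Meteor.startup(function () {
        Wavs.find({ 'original.type': 'audio/wav' }).observe({
            added: function (fileObj) {
                // transformWrite on the "mp3" store handles transcoding
                // when the file is stored; here we just log the event.
                console.log("new wav file stored: " + fileObj._id);
            }
        });
    });
}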

Uploading a file with ng-file-upload directive to a sails.js server

I'm attempting to upload a file to a sails.js application server using this directive:
ng-file-upload
My client side to upload an already selected image is this:
$upload.upload({
    url: 'upload/item-image',
    headers: {
        'Content-Type': 'multipart/form-data'
    },
    data: { blah: 'blah' },
    file: $scope.uploadFile,
    fileName: $scope.uploadFile.name
}).success(function(data) {
    console.log(data);
}).error(function(err) {
    console.log(err);
});
And my sails.js controller method that handles the upload on the server starts like this:
upload: function (req, res) {
req.file('item-image').upload(function (err, files) {
if (err)
return res.serverError(err);
if(!files.length)
return res.serverError('No files received for upload.');
var file = files[0];
...
}
...
}
However, while the server function is being called and the JSON data in the body exists, no files are found by the req.file(...) callback, no matter what I put there.
Thanks in advance.
The argument to req.file needs to be the name of the field in the request where Sails should look for the uploaded files. If you're doing a simple HTML form with something like:
<input type="file" name="myFileField" />
then in your controller code you would have:
req.file('myFileField').upload(...);
With uploader widgets like the ng-file-upload widget referenced in this question, it can sometimes be a little tricky to figure out what that field name is. In this case it's hidden in the usage instructions:
//fileFormDataName: myFile, // or a list of names for multiple files (html5).
// Default is 'file'
So you can set fileFormDataName to change the upload field name, or else it defaults to file.
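For the code in this question, that means either reading files with req.file('file') on the Sails side, or setting the field name explicitly on the client. A rough sketch of the latter, reusing the upload call from the question (the fileFormDataName option comes from the ng-file-upload docs quoted above):
// Client side: name the file field 'item-image' so it matches
// req.file('item-image') in the Sails controller.
$upload.upload({
    url: 'upload/item-image',
    fileFormDataName: 'item-image', // otherwise the field defaults to 'file'
    data: { blah: 'blah' },
    file: $scope.uploadFile
}).success(function(data) {
    console.log(data);
});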
If you can't find the field name in the docs or by inspecting the HTML, the easiest thing to do is just do an upload (even if it's unsuccessful) and inspect the network request using the developer tools in Chrome.
For example, on the Angular file upload demo page, the request payload shows name="myFile"; that's what you're looking for. In that case the file field is called myFile.
One important thing: if you want to process more than one file in your backend using
req.file('files').upload({...});
you need to add the arrayKey: '' parameter in your Upload.upload({}) call, like this:
Upload.upload({
    url: 'http://localhost:1337/uploads',
    arrayKey: '',
    data: {
        files: files
    }
});

Moving to s3 storage with node.js

I'm getting ready to move a Node.js application I've been working on into a production environment (I'm using Heroku). Users add images to the site via URL. Right now they are just saved on the server; I'd like to move the storage to S3 but am having some difficulties.
The idea is to save the image to disk first and then upload it, but I'm struggling to find a way to be notified when the file has finished writing to disk. It doesn't seem to use the typical Node callback style.
Here is the code as it is now. I'm using the request module, which may be complicating things rather than simplifying them:
requestModule(request.payload.url).pipe(fs.createWriteStream("./public/img/submittedContent/" + name));
// what do I do here?
fs.readFile("./public/img/submittedContent/" + name, function(err, data) {
    if (err) { console.warn(err); }
    else {
        s3.putObject({
            Bucket: "submitted_images",
            Key: name,
            Body: data
        }).done(function(s3response) {
            console.log("success!");
            reply({message: 'success'});
        }).fail(function(s3response) {
            console.log("failure");
            reply({message: 'failure'});
        });
    }
});
Any advice would be appreciated. Thanks!
Try listening for the finish event on the writable stream:
requestModule(request.payload.url)
    .pipe(fs.createWriteStream("./public/img/submittedContent/" + name))
    .on('finish', function() {
        // do stuff with saved file
    });
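Since the goal is just to get the bytes into S3, you can also avoid the temporary file entirely by streaming the download straight into the bucket. A rough sketch (bucket, key, and reply come from the question's code; s3.upload, unlike s3.putObject, accepts a readable stream as Body):
// Illustrative sketch: pipe the remote image directly into S3 without
// writing it to the local filesystem first.
s3.upload({
    Bucket: "submitted_images",
    Key: name,
    Body: requestModule(request.payload.url)
}, function(err, uploadResult) {
    if (err) {
        console.warn(err);
        return reply({ message: 'failure' });
    }
    console.log("success!", uploadResult.Location);
    reply({ message: 'success' });
});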
Unless you're modifying the image, you shouldn't upload to Heroku - but rather directly to S3.
See Direct to S3 File Uploads in Node.js
