Trigger AWS Lambda from Dropbox file change - javascript

Not sure if this is the right place to ask this question.
I have a Dropbox folder which gets updated with new files from time to time.
I want to trigger my AWS Lambda function whenever a file changes in that Dropbox folder.
Is there a way to do this? If not, is there a way to call some other custom API from Dropbox?
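One common approach (sketched here, not taken from an answer in this thread): Dropbox supports webhooks, so you can point a webhook at a Lambda function exposed through Amazon API Gateway. Dropbox first sends a GET verification request whose challenge query parameter must be echoed back, then POSTs a notification whenever files change. A minimal Node.js handler, assuming the API Gateway proxy-integration event shape, might look like:

// Sketch: Lambda behind an API Gateway proxy integration, acting as a
// Dropbox webhook endpoint.
exports.handler = async (event) => {
  if (event.httpMethod === 'GET') {
    // Dropbox webhook verification: echo the 'challenge' parameter back.
    return {
      statusCode: 200,
      headers: { 'Content-Type': 'text/plain' },
      body: event.queryStringParameters.challenge,
    };
  }
  // Change notification: Dropbox reports which accounts changed; you then
  // query the Dropbox API (files/list_folder/continue) to see what changed.
  const notification = JSON.parse(event.body);
  console.log('Changed accounts:', notification.list_folder.accounts);
  return { statusCode: 200, body: '' };
};

In production you would also verify the X-Dropbox-Signature header against your app secret before trusting the notification.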

Related

How to setup an S3 bucket for a website where users can upload and download files in their specific folder

I am working on a React project where users can upload files and generate a unique passcode, which creates a folder in my S3 bucket named with this passcode. The user (or someone else) can then access the website on another computer, type in this passcode, and retrieve the files.
I don't have much experience with S3, so the settings are a bit overwhelming. How can I configure a bucket for this project? I read about something called a "signed URL". Would that accomplish what I want to do?
This sounds like a Pastebin with a password, except that it holds multiple files under one code. It's also a bit similar to Dropbox, in the way that it can 'share' files.
I would recommend:
Your app generates a Unique ID (UUID)
Your app invites the user to upload a set of files:
These can be uploaded to Amazon S3 using pre-signed URLs, which allow the files to go straight to S3. Make sure they are uploaded to a path prefixed with the UUID.
The app gives the user the UUID for later retrieval
Another user goes to the app and requests files, providing the UUID
The app then presents a list of files from that directory. When showing this list, the app creates an Amazon S3 pre-signed URL for each file, allowing the user to download them directly from S3.
You have some process that 'cleans up' files after a period of time, either based on the upload time and/or the download time
Basically, the Amazon S3 bucket is kept private and all objects are kept private. There is no configuration required on the bucket or the objects. Instead, the 'magic' comes from your application generating pre-signed URLs, which allow time-limited access to a private object.
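For illustration, here is a minimal sketch of generating those pre-signed URLs with the AWS SDK for JavaScript (v3); the bucket name, region, and expiry are placeholders:

// Sketch: pre-signed upload and download URLs with the AWS SDK for JavaScript v3.
const { S3Client, PutObjectCommand, GetObjectCommand } = require('@aws-sdk/client-s3');
const { getSignedUrl } = require('@aws-sdk/s3-request-presigner');

const s3 = new S3Client({ region: 'us-east-1' }); // placeholder region

// URL the browser can PUT a file to; the key is prefixed with the UUID.
async function uploadUrl(uuid, filename) {
  const command = new PutObjectCommand({ Bucket: 'my-bucket', Key: `${uuid}/${filename}` });
  return getSignedUrl(s3, command, { expiresIn: 900 }); // valid for 15 minutes
}

// URL the browser can GET a file from once the passcode (UUID) is entered.
async function downloadUrl(uuid, filename) {
  const command = new GetObjectCommand({ Bucket: 'my-bucket', Key: `${uuid}/${filename}` });
  return getSignedUrl(s3, command, { expiresIn: 900 });
}

The bucket itself stays private; only these time-limited URLs expose individual objects.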
Please refer to this article: How to host a website on S3 without getting lost in the sea.
Maybe it will help with the scenario you have described in the question.
Please refrain from asking for full tutorials on Stack Overflow, as that is against community guidelines. We are happy to point you in the right direction, but it's not right to ask for full code or a tutorial.
If you really need working code, post the code that isn't working and someone will likely fix it for you, but this is not a free coding service.

Allow full synchronized filesystem read/write access over a subdirectory of an S3 bucket?

I have a web service involving the attachment of multiple documents to one of many "objects". To simplify this process and make it easier to edit each file individually if the user so desires, I want the user to be able to synchronise all of these files onto a directory on his/her computer (much like that of Google Drive or Dropbox). If the user were to change one of these files, remove a file, or add a file, this would reflect on my web service, and hence affect the files that are attached to this "object".
What would be the best choice of services to do this? I am currently using a Node.js back-end, although I suspect this will do little to influence the choice of storage. I'm looking for a service that gives the user the flexibility of full filesystem-level CRUD, whilst synchronising the files in a secure manner to a subset of a larger object storage collection, much like synchronising (and only providing access to) a subdirectory of an AWS S3 bucket, hence the title of this question.
I'm currently looking into (somehow) doing this with AWS S3, although I am open to using another storage service.
Thanks in advance.
AWS Storage Gateway (in cached mode) allows you to edit files locally, and the Gateway automatically synchronizes the updates over to S3.
You will need to install a small VM on the local machine. Typically, if your clients have a private data centre or server, this configuration allows a "share folder" (or a NAS) to be synchronized with S3.

Display owncloud image in website

I'm trying to build a service into my website to allow uploads to be saved to owncloud and then displayed.
As per this: Uploading files to an ownCloud server programatically
I was able to set up Postman to upload files and successfully save them to the server. Now how do I display those files, or make a GET request to display an image on my website?
I found the GET command. You need to make sure you have the authorization header set, as in Postman for example.
You could make a GET call to
<baseurl>/owncloud/remote.php/<user>/<folder>/<pathtofile>
In Postman, if you click on Authorization, select Basic Auth, and put in your credentials, it will generate the header value for you; do this as a separate call each time so you get a fresh one.
You can also create folders to upload to using the MKCOL request. I had to export my Postman collection and edit the export to include the MKCOL request, because it is not built in.
<baseurl>/owncloud/remote.php/<user>/<foldertocreate>
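As a rough sketch from Node.js (the path pattern is taken from the calls above and may differ on your server; the credentials are placeholders), you could fetch the image bytes with Basic auth and then serve or embed them yourself, so your ownCloud credentials never reach the browser:

// Sketch: fetching a file from ownCloud over HTTP with Basic auth.
// Uses the global fetch available in Node 18+.
const auth = Buffer.from('myuser:mypassword').toString('base64'); // placeholder credentials

async function fetchImage(baseUrl, user, pathToFile) {
  const res = await fetch(`${baseUrl}/owncloud/remote.php/${user}/${pathToFile}`, {
    headers: { Authorization: `Basic ${auth}` },
  });
  if (!res.ok) throw new Error(`GET failed: ${res.status}`);
  return Buffer.from(await res.arrayBuffer()); // raw image bytes
}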

How do I handle local file uploads in electron?

I'm having a hard time figuring out how to approach file uploads in Electron. I would post code, but I don't even know where to begin with this one.
In a standard web app I would post from the client to the server, either via a standard postback or using an AJAX request. I have some pretty cool solutions for that. But in the case of Electron, I'm not sure where or how to "post" the file back. I guess I just want to access the contents of my <input type='file' /> from Node.js. How do I do this?
I could post to the browser process, but I don't know what the "address" would be. Or do I need to create a separate "page" in my app just to accept form posts? My background in web dev is probably blinding me to some obvious answer; can someone help?
EDIT
Just to add a little more context, I have a .csv file which I'd like to allow the user to upload. I will then process this using node-csv and insert each returned row into the app's nedb datastore.
If you're going to process the file on the user's machine then there is no need to upload it anywhere; it's already exactly where you need it. All you need to do is pop up a dialog that lets the user browse their file system and select the file they want your app to process. You can create a button and call dialog.showOpenDialog when the user presses it; that will get you a filename, and you can then use Node's fs.readFile to read the file from disk and process the contents however you want.
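A minimal sketch of that flow in the main process, assuming a recent Electron where dialog.showOpenDialog returns a promise (the IPC channel name is illustrative):

// Main process: let the renderer request a CSV file via IPC.
const { ipcMain, dialog } = require('electron');
const fs = require('fs/promises');

ipcMain.handle('pick-and-read-csv', async () => {
  const { canceled, filePaths } = await dialog.showOpenDialog({
    properties: ['openFile'],
    filters: [{ name: 'CSV files', extensions: ['csv'] }],
  });
  if (canceled || filePaths.length === 0) return null;
  // Read straight from disk; no upload involved.
  return fs.readFile(filePaths[0], 'utf8');
});

The renderer can then parse the returned text with node-csv and insert each row into the nedb datastore.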

Initiate a 'save as' dialog for files which reside on a remote server

I'm writing a web page (using Spring MVC) which displays a list of files that the user can download. I would like the 'save as' dialog to open when the user chooses a file and clicks the download button. To the best of my understanding, I can use an href with the path to the file (or use window.location); however, the files to be downloaded are located on a different server and need to be fetched from there first.
There are two options I have thought of, and both have big flaws:
1. Use window.location with a link to a Spring MVC controller. In the controller, make a call to fetch the file and set its content on the response (with a Content-Disposition header).
The problem is that for large files it will take some time for the 'save as' dialog to open (as the file needs to be fetched from the remote server first), and I have no way of telling the user that the download has started.
2. Make an HTTP call that retrieves the file from the second server (with some kind of indication while the HTTP promise has not yet resolved), save it on my server, return the path to the file in the response, and then call window.location with the returned path.
In this case, the downloaded files will need to somehow get deleted at the end of the download.
Any ideas and thoughts?
Thanks a lot!
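For what it's worth, a client-side variant of option 2 (a sketch; the spinner element and endpoint are hypothetical) fetches the file through your controller while showing an indicator, then triggers the save via a blob URL, so nothing needs to be saved and cleaned up on your server:

// Sketch: fetch the file while a spinner is visible, then prompt a save.
async function downloadFile(url, filename) {
  const spinner = document.getElementById('download-spinner'); // hypothetical element
  spinner.hidden = false;
  try {
    const res = await fetch(url);
    if (!res.ok) throw new Error(`Download failed: ${res.status}`);
    const blob = await res.blob();
    const objectUrl = URL.createObjectURL(blob);
    const a = document.createElement('a');
    a.href = objectUrl;
    a.download = filename; // asks the browser to save rather than navigate
    a.click();
    URL.revokeObjectURL(objectUrl);
  } finally {
    spinner.hidden = true;
  }
}

Note that whether the browser shows a 'save as' dialog or saves directly depends on the user's download settings.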
