I created a project, which is here: https://github.com/dartem/upload_files, and it uploads a file and saves it using FilesCollection. However, it looks like the file only gets saved temporarily in /cdn/storage: once I restart Meteor, or if I open an incognito window, the actual file no longer exists.
I specify the storage path, which is assets/app/uploads/Images, but the image doesn't get saved in that directory. How can I save the actual file there?
I ran your demo, and it does save files to that directory.
It's not advisable to store files in .meteor/local: the files there are temporary, and Meteor will remove and replace them as it rebuilds the app. This explains why you can't find them again later.
It is also possible to write the files to the public directory, but that triggers a rebuild of your app every time you save a file, which isn't a good side effect.
I would recommend saving the files to GridFS or AWS S3 (or any of the other storage options). You could also save them to a folder elsewhere in the file system, as long as you have a way of serving them up (some kind of web server like Apache, Express, or whatever). Your choice, based on what sysadmin capability you have.
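Since the question mentions FilesCollection, I'm assuming the ostrio:files (Meteor-Files) package here. A sketch of pointing its storagePath at an absolute directory outside the project tree, so Meteor rebuilds never touch the uploads (the path itself is just an example):

```javascript
// Sketch, assuming the ostrio:files (Meteor-Files) package.
// storagePath is an absolute path OUTSIDE .meteor/local and public/,
// so Meteor's rebuild cycle never removes the uploaded files.
import { FilesCollection } from 'meteor/ostrio:files';

export const Images = new FilesCollection({
  collectionName: 'Images',
  storagePath: '/var/data/uploads/Images', // example path, pick your own
});
```

You'd still need something (this app, or a separate web server) to serve files from that directory.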
Related
I am using an S3 bucket to serve the frontend app, and I sync the code to that bucket from a Jenkins server. Every deployment builds the code into a new folder on the Jenkins server; in it, some files are updated, some are renamed, and some are deleted.
When we run the s3 sync command on that freshly built folder, it pushes all the new code to S3 but doesn't touch files from earlier builds. For example, an image like test-123.jpg that was pushed in an earlier build, but has since been deleted, stays in the bucket.
Renamed and deleted files like this never get synced away, so they just sit in the bucket.
What is the ideal way to deal with this when pushing code to S3? My main goal is to automatically delete files that are no longer used or served by the Angular app.
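`aws s3 sync` has a `--delete` flag for exactly this: it removes objects from the destination that no longer exist in the source folder. A sketch (the bucket name and build path are placeholders):

```shell
# --delete removes objects in the bucket that are no longer present in the
# local build folder, so renamed/deleted files don't linger in S3.
aws s3 sync ./dist s3://my-frontend-bucket --delete

# optional: dry run first to see what would be uploaded and removed
aws s3 sync ./dist s3://my-frontend-bucket --delete --dryrun
```

The dry run is worth doing once in the Jenkins job before trusting it, since `--delete` will remove anything in the bucket that isn't in the build folder.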
I am working on a personal project and there are two things that keep bugging me.
This project is written in React, with an Express/Node app for the backend. In my frontend I can load and send images to the server; I save the files inside an uploads folder, and the path is stored in a Mongo database.
After the build, the files look like this:
As you can see, my uploads folder is inside the public folder. The public folder is the build of the React app; I just renamed it. The problem is, if I want to update the build, I have to save the uploads folder somewhere else and then reintroduce it inside the new build (public) folder. That sounds impractical to me. Is there a way around that? A best practice?
That would be the first thing.
The second one is the path that I am using to link the image folder to the <img src=''/>.
Right now the path looks like this: http://localhost:5000/${ filePath }, and it works just fine. But I don't think I'll be able to use this path in a production scenario.
The question is: do I have to include the port as well, something like http://localhost:${PORT}/${ filePath } with const PORT = process.env.PORT? Or is there a better practice? A way to use your own domain?
Hope I delivered all the info needed for an accurate answer. If not, I am waiting for feedback. Thanks!
"Best practice" would be uploading the files to a static file server like S3 and storing a reference to the file key.
But short of that, I believe Express lets you set multiple static file directories using
app.use('/uploads', express.static(path.join(__dirname, 'uploads')));
so you could then keep your uploads directory outside your build folder.
As for including the port in your asset URIs, what I've done in the past is use an environment variable (e.g. DOMAIN) that specifies the web address of the current environment, including protocol, domain, and port. This way your dev environment can use localhost, but if you decide to deploy your production app to the public, you can just set the variable to the domain name and tell your Express server to listen on ports 80/443.
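To make that concrete, here's a tiny sketch of the environment-variable approach; the variable name ASSET_BASE_URL and the helper name are made up for illustration:

```javascript
// Hypothetical helper: build absolute asset URLs from an env-configured base.
// In dev, ASSET_BASE_URL can stay unset (falls back to localhost:5000);
// in production, set it to your real domain and no code changes are needed.
const BASE_URL = process.env.ASSET_BASE_URL || 'http://localhost:5000';

function assetUrl(filePath) {
  // URL takes care of joining the base and the relative path correctly
  return new URL(filePath, BASE_URL).href;
}

console.log(assetUrl('uploads/photo.jpg'));
```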
I'm attempting to create an app that can access a directory in the app's directory, zip it with JSZip and send it to an API at the press of a single button.
For instance, say I have a directory "Components" in the app's directory. At the press of one button (Deploy Components) the app should zip up that directory using JSZip and send it off to an API. It seems like React cannot access its own directory short of imports, and imports do not import a directory, only a single file. What I need is the full Components directory, with all files and sub-directories (and their files and sub-directories, and so on), zipped up and punted off.
I know I could have a form and have the user upload a zip file, but this gives the user far too much freedom to upload whatever they like - they should only be able to send this Components directory (which lives with the app and can't be changed without re-deploying the app) to the API.
Does JSZip have any methods that can create a new zipped folder from a provided local directory path? Does React have any functions I can use to load a directory, rather than individual files? Does node.js provide anything I can use (I am not familiar with node.js)? In my API (written in C#) I can use things like the System.IO File and Directory static classes to do things like this, but I am not sure if it is possible with React.
Is this even remotely possible? I've spent the best part of an afternoon trying to find ways of accessing and manipulating a local directory with React and have mostly drawn blanks.
Edit: I stuck this on the back burner for a while, but eventually found this answer:
Need to ZIP an entire directory using Node.js, and it has solved my issue.
I have an Electron app, and when I build it for distribution, the actual app code and build folder end up in an app.asar file. At runtime, I have to copy certain files from app.asar onto the user's computer, wherever the user chooses, and then modify them from code.
The problem is that whenever a file is copied, it becomes read-only, and thus I cannot write to it. Is there any way to handle this?
I'm running into this issue as well. I think the problem is that only some of the fs methods are patched to work well with asar. According to the docs:
With special patches in Electron, Node APIs like fs.readFile and require treat asar archives as virtual directories, and the files in it as normal files in the filesystem.
Therefore, I think the solution is to manually copy the content of the files out of the asar using fs.readFile and then write it to the destination file yourself. I will try this today and hopefully post an update with some code.
I tried to upload a file to Heroku using https://www.npmjs.com/package/express-fileupload. On my PC it works great, but on Heroku I get this error:
{"errno":-2,"code":"ENOENT","syscall":"open","path":"./public/owner_photo/f28397baea8fb4d6f6dafed9f5586a9ac0b46843acf1120a0ecad24755cfff57.jpeg"}
How can I fix it?
Heroku has an ephemeral filesystem: you can write files at runtime, but they aren't shared between dynos and are discarded whenever the dyno restarts. You'll need to store your uploads somewhere else, like Amazon S3.
Also, many upload packages store the uploaded file in a temp directory by default. So even if you are sending the files to S3, you'll still need to make sure the methods you use don't attempt to do that, or set an option to disable it. I'm not familiar with express-fileupload, so I can't say which of its methods do or don't store copies on the filesystem.
I have successfully implemented this using multiparty, so I could be of more specific help with that package.