Heroku application file upload - file is deleted after some time? - javascript

So I have made a Node.js-based application with file upload and everything works fine:
I can upload files and browse the uploaded files, but after about an hour the file is, I guess, deleted from the server? It's no longer available. So my question is: does Heroku delete all non-application files after some time, or what else could be the reason for this?

Heroku's file system is ephemeral, so files are temporary and are removed at every dyno restart. It is likely that you redeployed your application, which caused the dyno to restart.
Heroku also restarts each dyno at least every 24 hours. See the Heroku documentation.
The good practice is to persist the files in external storage (S3, for example). If you are interested in finding out which free options are available, you can check this Git repo HerokuFiles.
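If you do go the S3 route, a minimal sketch of persisting an uploaded file from Node with the aws-sdk package could look like the following (the bucket name is a placeholder, and credentials are assumed to come from the standard AWS environment variables):

// npm install aws-sdk
const AWS = require('aws-sdk');
const fs = require('fs');

const s3 = new AWS.S3(); // picks up AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY from the environment

function saveUpload(localPath, key, callback) {
  // Stream the temporary upload into the bucket instead of leaving it on the dyno's disk
  s3.upload({
    Bucket: 'my-uploads-bucket', // placeholder bucket name
    Key: key,
    Body: fs.createReadStream(localPath)
  }, callback); // callback(err, data) -- data.Location is the stored file's URL
}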

Related

Is there a way to clean up old unused files on S3

I am using an S3 bucket to serve the frontend app, and I am syncing code into that bucket from the Jenkins server while building the code on every deployment. In the new build folder on the Jenkins server some files get updated, some get renamed, and some get deleted.
When we run the s3 sync command on that freshly built folder, it pushes all the new code to S3 but does not touch files that are no longer part of the build, for example a test-123.jpg image that was pushed in an earlier build but has been deleted in the recent one.
Renamed files behave the same way: the old copies are never synced away, so they just stay in the bucket.
May I know the ideal way to deal with this kind of issue while pushing the code to S3? My main goal is to automatically delete the files which are no longer used or served by the Angular app.
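Assuming the Jenkins job already calls the AWS CLI, one common approach is to pass the --delete flag to s3 sync, which removes objects from the bucket that no longer exist in the local build folder (the local path and bucket name below are placeholders):

aws s3 sync ./dist s3://my-frontend-bucket --delete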

How to make sure JavaScript and CSS are cached in an Angular app

I am reasonably new to Angular (5), and have noticed that the JavaScript files (vendor.bundle, main.bundle, etc.) are being reloaded each time I visit a page.
Is there anything in particular I should be doing to make sure that these are held in the browser cache after the first time they are loaded?
I guess I would need to add a Cache-Control header, but I am not sure where to put it in the code, or whether this is something the Angular CLI could generate.
Angular has a library called Service Worker, which can be added to a CLI project with the following command:
ng add @angular/pwa --project *project-name*
Note: the project name can be obtained from angular.json.
This command does most of the required configuration, but some additional setup is still needed; it can be found at the following link: service-worker config. Part of that configuration is already done by the command above, but more configuration may also be needed in the ngsw-config.json file.
Unfortunately, when I tested this inside a Spring WAR the big files were still downloaded every time without any caching, but when I deployed directly on an HTTP server it worked perfectly.
(Screenshot in the original answer: result inside the Spring WAR.)
For more info, please check the Angular blog post Service Worker - Step-By-Step Guide for turning your Application into a PWA.
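For reference, here is a trimmed-down ngsw-config.json along the lines of what ng add @angular/pwa generates (the exact file list is an assumption and will differ per project); the "app" group prefetches the hashed bundles so they are served from the service worker cache on later visits:

{
  "index": "/index.html",
  "assetGroups": [
    {
      "name": "app",
      "installMode": "prefetch",
      "resources": { "files": ["/index.html", "/*.css", "/*.js"] }
    },
    {
      "name": "assets",
      "installMode": "lazy",
      "updateMode": "prefetch",
      "resources": { "files": ["/assets/**"] }
    }
  ]
}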

Heroku / Node - how to add git commit on server

I'm learning Node and Git and I have a Heroku app that is reading and writing to a local file on the server (a very simple JSON database).
If I add the file to my gitignore locally, it disappears from my Heroku app and causes the app to error. But if I don't add it to my gitignore, it overwrites the latest version (on the server) with an old one I have locally.
Obviously the issue is because the changes on the server file aren't being committed. However, I don't know how to do that remotely, or if it's even possible. I can run heroku git:clone locally, but I can't run heroku:git add.
How do I handle this?
Generally, you should not commit a file that will be modified by the server.
It is not a good idea because, as you said, this file will be overwritten by the next push.
Usually you also do not want to commit from your deployment branch, so using git from the server is not a good idea either (and I doubt you even can with Heroku).
Instead, you could make your app check whether the file exists and, if not, create it on the server.
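As a sketch of that idea, the app could create an empty JSON database on startup if it is missing (db.json is a placeholder name):

const fs = require('fs');
const path = require('path');

const DB_FILE = path.join(__dirname, 'db.json'); // placeholder file name

// Create an empty database on first run instead of committing it to the repository
if (!fs.existsSync(DB_FILE)) {
  fs.writeFileSync(DB_FILE, JSON.stringify({}));
}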
That would work on a dedicated server you manage yourself, but Heroku doesn't work the same way. Each push you make to your Heroku repository bundles your application before launching it on a dyno, and this process overwrites all the files, including your JSON database file, which therefore does not persist.
So I think you have no choice but to switch to another storage method, for example the free Heroku Postgres plan or another database you prefer.

Heroku nodejs pathing issue... Error: ENOENT: no such file or directory

I have a Node.js application using hapi.js and I'm trying to download an image from a URL. Heroku is giving me errors with the pathing.
My code:
Request(uri).pipe(fs.createWriteStream(__dirname+'/../public/images/'+filename)).on('close', callback);
My errors:
Error: ENOENT: no such file or directory, open '/app/../public/images/1430540759757341747_4232065786.jpg'
My file structure is simple:
app.js
-public
  -images
    -sampleimage.jpg
  -videos
    -samplevideo.mp4
  -audio
    -sampleaudio.wav
As you can see, the __dirname for the Heroku application is /app. I've tried using __dirname plus all sorts of paths (../, ./, etc.) and I've also tried it without __dirname.
I will be creating a lot of these files using ffmpeg and a speech tool. Could anyone explain what kind of problem I am having? Is it something that can be solved by using the correct path name, or is it something I need to configure in my hapi.js server?
You just have the wrong path in your project.
On Heroku, you can't write OUTSIDE the root folder of your project.
In your case, your code is running in app.js, which is in the 'root' folder of your project.
So, on Heroku's filesystem, this means your project looks like this:
/app
/app/app.js
/app/public
/app/public/images
...
Heroku puts all your code into a folder called app.
Now, in your code pasted above, you show:
Request(uri).pipe(fs.createWriteStream(__dirname+'/../public/images/'+filename)).on('close', callback);
If this code is running in your app.js, it means that by going BACK a folder (eg: ..), you're trying to write to a non-writable part of Heroku's filesystem.
Instead, you want to write to:
Request(uri).pipe(fs.createWriteStream(__dirname+'/public/images/'+filename)).on('close', callback);
This will correctly write your file into the images folder like you want.
HOWEVER
Here's where things are going to get complicated for a moment.
On Heroku, you can indeed write files to the filesystem, but they will DISAPPEAR after a short period of time.
Heroku's filesystem is EPHEMERAL, this means that you should treat it like it doesn't exist.
The reason Heroku does this is to push you toward writing scalable software.
If your application writes files to the web server's disk, it won't scale very well: disk space is limited, each web server has its own disk, and you can end up with confusing / odd behavior where every web server holds its own copy of the same file(s). It just isn't a good practice.
Instead: what you should do is use a file storage service (usually Amazon S3) to store your files in a central location.
This service lets you store all of your files in a central location. This means:
You can easily access your files from ALL of your web servers.
You can have 'reliable' storage that is managed by a company.
You can scale your web applications better.
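Applied to the snippet above, the downloaded image can be piped straight into S3 instead of onto the dyno's disk; a rough sketch using the aws-sdk package (the bucket name is a placeholder, while uri, filename and callback are the same variables as in the question):

const AWS = require('aws-sdk');
const Request = require('request');

const s3 = new AWS.S3();

// Pass the HTTP response stream as the upload body rather than writing it to the local filesystem
s3.upload({
  Bucket: 'my-image-bucket', // placeholder
  Key: 'images/' + filename,
  Body: Request(uri)
}, function (err, data) {
  if (err) return callback(err);
  callback(null, data.Location); // URL of the stored object
});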
The folder you host on Heroku is considered to be "app", as you can see from the error you got. I'm commenting on this after 5 years just to let future viewers know: if a folder is empty, it is not pushed to GitHub or Heroku when you push the entire project, because empty folders are not tracked.
When your code then tries to access a folder that was empty initially, you get the above error because the folder was never pushed in the first place. So, if you want to get rid of the error, place a temp file of any type in it (I used a .txt file) and push the code. The error won't be there anymore, because this time the folder is pushed and the app can access it.
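An alternative to the placeholder file is to (re)create the directory at runtime before writing to it, so the deploy does not need to contain the empty folder at all; for example:

const fs = require('fs');
const path = require('path');

const imagesDir = path.join(__dirname, 'public', 'images');

// recursive: true makes this a no-op if the directory already exists
fs.mkdirSync(imagesDir, { recursive: true });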

Problems with public directory when deploying Node.js app with Heroku

I've been working on an app which will feature a Timelinejs (open-source JS library) element on the client side. I copied the Timelinejs library into my public/javascripts/ directory and linked it in my HTML header. When I serve my app locally, everything works fine with the timeline.
However, I noticed that when I deployed my app to Heroku it wasn't loading my timeline. Using the Chrome JS console I discovered that it didn't find my files in the public/Javascripts/Timelinejs folder. Using the heroku run bash command I discovered that none of my Timelinejs files were present in the file structure, although an empty Timelinejs directory was present. Is there any command or configuration I need to specify to get these files into my Heroku deployment?
Heroku's file system is (for practical purposes) read-only: the only directories you can write to are ./tmp and ./log, and you can't rely on writing inside the public folder.
That's because of how they manage their dynos and how they scale them. If you want to store something, use ./tmp or, preferably, an S3 bucket (as I presume 'tmp' stands for 'temporary' :D).
More info here: https://devcenter.heroku.com/articles/read-only-filesystem
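If you only need scratch space (for example an intermediate file before uploading it to S3), writing under the OS temp directory keeps you on the writable side; a small sketch:

const os = require('os');
const path = require('path');
const fs = require('fs');

// Writable, but ephemeral: anything here disappears on dyno restart
const tmpPath = path.join(os.tmpdir(), 'scratch-' + Date.now() + '.json');
fs.writeFileSync(tmpPath, JSON.stringify({ ok: true }));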
