I'm hosting a website on Firebase Hosting for free and I want it to stay that way :D.
It's a simple one-page landing page. My problem is that today under Hosting it says 3 GB was downloaded by clients (the free tier is 10 GB/month). How can I limit the client-side download in any way? I have a looping background video with a size of 100 MB, but each visitor only has to download it once, right? So that's 100 MB/user, which shouldn't add up to 3 GB in one day, since I only sent it to about 4 people to test it out. I also have about 5 embedded YouTube videos (do those count toward client-side data downloaded?) and some pictures under 1 MB each, but those shouldn't matter much.
Also, here is a screenshot from the Network tab after a Ctrl+Shift+R reload.
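One thing that may explain the numbers: a hard reload (Ctrl+Shift+R) bypasses the browser cache, so every test reload re-downloads the full 100 MB video, and a handful of testers reloading a few times reaches 3 GB quickly. Firebase Hosting lets you set long-lived Cache-Control headers per file pattern in firebase.json; a minimal sketch (the `mp4`/`webm` glob is an assumption about the video's format):

```json
{
  "hosting": {
    "headers": [
      {
        "source": "**/*.@(mp4|webm)",
        "headers": [
          { "key": "Cache-Control", "value": "public, max-age=31536000" }
        ]
      }
    ]
  }
}
```

With a header like that, normal visits after the first serve the video from the browser cache instead of Hosting bandwidth. (YouTube embeds stream from YouTube's servers, so they don't count toward your Hosting quota.)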
I have six devices on/above my desk:
A 4k TV
An Ultrawide monitor
Two laptop computers
An iPad
An Android phone
All of them are connected to the internet and browsing mydomain.org/page, an HTML page that I am editing on one of the laptops, which shows a full page of code. When I press a button on my Wacom tablet (which I rerouted through Wacom smart actions to run a script), my code is uploaded over FTP to my server. Right now, I have to reload every page on every device to see the updated result. I want a JavaScript file on my site that exposes a global LiveLoad() function. When I execute this function from the DevTools command line, the script stores a cookie marking the device as a debug device.
All pages with the script will then show a small icon over the page; when it is set to 'live', the script opens a connection over secure WebSockets to an update server that holds up to 10 connections (perhaps only from approved IPs, so that only I can use this on my live site), each registered to the page it is browsing. When I update mydomain.org/page or another page, my upload script securely opens a connection and POSTs a secret code to the update server, telling it to send a message over WebSockets to all connected debug devices with that page open. On fast internet, this truly will be the ultimate website-building setup if I can just overcome this design hurdle.
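The client side of this could be sketched as below. Everything here is an assumption: the cookie name, the wss:// URL, and the JSON message shape the PHP server would send (`{"type": "update", "page": "/page"}`).

```javascript
// Pure decision logic: reload only when the pushed update targets this page.
function shouldReload(message, currentPage) {
  let msg;
  try { msg = JSON.parse(message); } catch (e) { return false; }
  return msg.type === "update" && msg.page === currentPage;
}

function LiveLoad(wssUrl) {
  // Mark this browser as a debug device for one year.
  document.cookie = "liveload=1; max-age=31536000; path=/; Secure";
  const ws = new WebSocket(wssUrl);
  // Register the page this tab is viewing with the update server.
  ws.onopen = () => ws.send(JSON.stringify({ register: location.pathname }));
  ws.onmessage = (ev) => {
    if (shouldReload(ev.data, location.pathname)) location.reload();
  };
  // Simple reconnect so a dropped socket doesn't silently stop live updates.
  ws.onclose = () => setTimeout(() => LiveLoad(wssUrl), 5000);
}

// Attach globally only in a browser, so it is callable from the DevTools console.
if (typeof window !== "undefined") window.LiveLoad = LiveLoad;
```

Calling `LiveLoad("wss://mydomain.org/update")` (URL is a placeholder) once from DevTools would then keep that tab live.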
What I don't want:
Anything more than a single PHP script for the server-side implementation. I'm on shared hosting: no root access, no Node, no fancy crap. Single PHP files are the only option.
Bloated JavaScript add-ons. The bare minimum code for receiving a single type of message ("an update is needed") is all I need. Perhaps later I can make it more robust with a second message type (a hard update), where a PHP script processes all the fonts, scripts, images, etc. on a page and appends a random query string to each to force a hard reload of all page resources.
How can I achieve this? Where do I start with PHP WebSockets? The internet has proven to be a cynical wasteland, from bloated PHP libraries that require installation in a Node environment to freelancers struggling to write scripts off the 2013 WebSockets API documentation, with no good, simple solutions around.
I have designed my Meteor/Cordova app to allow offline use for some functionality. Images loaded from external sources are cached while the app is in memory, but once the app is removed from memory or the device is restarted with the data connection off, the images previously loaded from "https://graph.facebook.com/xxx" no longer load, while images from other sites such as "https://ucarecdn.com/" load perfectly.
Any idea why images from graph.facebook.com are not being stored in the app's data cache? The image link remains the same after the device is restarted.
I am not sure whether this is the reason, but all FB (and Instagram) links to images are signed URLs. They work for a while and then they don't. If you link an FB image to view it on another website, eventually it will not load, because the link expires (the link contains a token); at that point you have to request a new token, which means a different link. I see that FB images come with a 14-day cache policy. I do not know what the case is for the Graph API, but I'd assume you cannot get a photo with just "https://graph.facebook.com/some_image.jpg"; you would need a signed link, or eventually you will get an error saying that you need a token. Do you get an error? Could you provide a full link to an image? If you can load an image, can you check in Chrome DevTools / Network what the caching policy for the image is, and the full path (in case it redirects you to some other URL)?
Think of it this way: I post something on FB and then FB deletes it for some reason, or I want to delete it, but thousands of users can still see it in various mobile apps or other websites because the image is cached. That should not be the case, and this is why social networks expire links to resources. I think FB's T&Cs also mention that you should not cache or store their images.
I'm trying to build a site where users can upload videos, which will be embedded for other users to view and download. I have tried using Vimeo, but I would need Vimeo Pro in order to store other people's videos on my account, which is too expensive for my organization. How can I do this?
Here's one way to do it:
Convert the videos to JW Player files, upload them to an S3 bucket, and point the bucket to a CloudFront Web distribution. More on this here.
In that tutorial the videos are uploaded manually to an S3 bucket. Because your use case needs the upload to happen programmatically, you need to use an SDK for S3 in your code. Here's a tutorial on how to do it in PHP (with other languages available in the sidebar).
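The linked tutorial uses the PHP SDK; the same idea in JavaScript, as a hedged sketch. Assumptions: the @aws-sdk/client-s3 package (AWS SDK v3) is installed, credentials come from the environment, and the bucket name, region, and key scheme are placeholders.

```javascript
// Pure helper: build a collision-resistant object key for each upload.
function makeObjectKey(userId, filename) {
  return `uploads/${userId}/${Date.now()}-${filename}`;
}

async function uploadVideo(userId, filename, body) {
  // Lazy require so the helper above works even without the SDK installed.
  const { S3Client, PutObjectCommand } = require("@aws-sdk/client-s3");
  const s3 = new S3Client({ region: "us-east-1" });
  return s3.send(new PutObjectCommand({
    Bucket: "my-video-bucket",          // placeholder bucket name
    Key: makeObjectKey(userId, filename),
    Body: body,                          // a Buffer or stream of the video
    ContentType: "video/mp4",
  }));
}
```

CloudFront then serves whatever lands in the bucket, so the upload path is the only part your application code has to own.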
These services are not free, but they are cheap.
Suppose you upload 100 videos, 1 GB each, in one month. That will cost you about $2.85 for S3.
Suppose users view a total of 100 GB, with an average object size of 1 GB, and 100% of the viewers are in the United States. That will cost you about $7.10 for CloudFront.
I want to stream images (screenshots of the server) using a local server (Apache). For example, I would go to the website from a machine on the same network, and the site would show me a sequence of images at around 30 fps (so I would see it as a video). The image quality has to be good.
At the moment I can reach this website from a machine connected to the local network, but I cannot figure out a way to stream the images. And I have no knowledge of PHP...
Is this possible to achieve?
Can anyone point me in the right direction...
Thanks.
ffmpeg can help you create videos from images. It has a CLI binary that can do the task.
Ref: https://trac.ffmpeg.org/wiki/Create%20a%20video%20slideshow%20from%20images
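Alternatively, for a near-live view rather than a pre-built slideshow, a page can simply repaint an `<img>` from a server endpoint many times a second. A minimal browser sketch, where "/screen.jpg" is an assumed URL at which Apache serves the latest screenshot (the query string defeats caching so each request fetches a fresh frame):

```javascript
// Pure helper: cache-busting URL for frame n.
function frameUrl(base, n) {
  return `${base}?frame=${n}`;
}

function startStream(img, base, fps) {
  let n = 0;
  setInterval(() => {
    img.src = frameUrl(base, n++); // the browser re-fetches on each src change
  }, 1000 / fps);
}

// Usage in a page:
//   startStream(document.getElementById("screen"), "/screen.jpg", 30);
```

An MJPEG stream (multipart/x-mixed-replace) is the more traditional way to do this, but the polling version needs no special server code at all.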
So I need to allow clients to record audio to the internet. The best solution I've found so far that keeps them on our site is the SoundCloud API: I just give them our account details and they use a version of the SoundCloud recorder (Flash/JavaScript) hacked together for our site.
The main issue is that these recordings are long, maybe up to an hour. Because the SoundCloud API records in the browser, then uploads the audio and transcodes it on its server, there is a lot that can go wrong in the upload process, which takes a long time with a big file. It's OK if we just record the audio in, say, 20-minute chunks, but it's just not that reliable.
We tried to build our own using wami-recorder, but that meant transcoding from WAV to MP3 in the browser before upload to make the file smaller, which takes more time on the client machine; at least SoundCloud does the transcoding server-side.
Given the size of the files, should I be looking at a server-side recorder/streaming solution based on Red5 or something, or is a client-side recorder with upload a better proposal?
Thanks a lot!
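For what it's worth, the "20 min chunks" idea generalizes: splitting the recording into fixed-size pieces before upload means a failed request only loses one chunk, not the whole hour. A hedged sketch (the /upload-chunk endpoint is an assumption, and the server would need to reassemble the pieces):

```javascript
// Pure helper: [start, end) byte ranges covering totalBytes.
function chunkRanges(totalBytes, chunkBytes) {
  const ranges = [];
  for (let start = 0; start < totalBytes; start += chunkBytes) {
    ranges.push([start, Math.min(start + chunkBytes, totalBytes)]);
  }
  return ranges;
}

async function uploadInChunks(blob, chunkBytes) {
  for (const [start, end] of chunkRanges(blob.size, chunkBytes)) {
    // Each slice is small enough to retry cheaply on failure.
    await fetch("/upload-chunk", {
      method: "POST",
      headers: { "X-Chunk-Range": `${start}-${end}` },
      body: blob.slice(start, end),
    });
  }
}
```

This keeps the client-side recording approach but removes the single giant upload as a point of failure.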