I have photos in a folder on Google Drive. The folder is shared so that anyone on the web can view it.
I have a table in a static HTML page on my computer, and I set the background of a cell using JavaScript as below:
document.getElementById("ImageCell").style.backgroundImage = "url('https://drive.google.com/uc?id=xxxx')";
This works fine.
I have 20 photos per page, and when I use window.print() it takes a while to load. The main issue is that sometimes the page does not render at all, and I see memory usage go up to 100%.
So it looks like this is memory intensive.
My question is: is there a better method of implementation that will not be so slow?
I don't think Google Drive was ever meant to function as a CDN. Are there too many photos for you to just bundle them up with your site?
If so, look into dumping them into an Amazon S3 bucket and caching them with CloudFront.
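Whatever the images are hosted on, one mitigation for the print problem above is to wait until every image has actually finished loading before calling window.print(). A minimal sketch, reusing the Drive uc URL pattern from the question (the helper names driveImageUrl, preload, and printPage are mine, as is the ImageCell0…ImageCell19 id scheme):

```javascript
// Build the public Drive URL for a file id (pattern from the question).
function driveImageUrl(id) {
  return "https://drive.google.com/uc?id=" + encodeURIComponent(id);
}

// Resolve once an image URL has loaded (or reject on error).
function preload(url) {
  return new Promise(function (resolve, reject) {
    var img = new Image();
    img.onload = function () { resolve(url); };
    img.onerror = function () { reject(new Error("failed: " + url)); };
    img.src = url;
  });
}

// Preload all 20 photos, then set the backgrounds and print.
function printPage(photoIds) {
  var urls = photoIds.map(driveImageUrl);
  return Promise.all(urls.map(preload)).then(function () {
    urls.forEach(function (url, i) {
      document.getElementById("ImageCell" + i).style.backgroundImage =
        "url('" + url + "')";
    });
    window.print();
  });
}
```

Because each image goes through the browser's cache once preloaded, setting it again as a background should not refetch it, and print rendering starts only when everything is already in memory.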
I'm new to both web development and Firebase Storage. Usually, if I were to make a website with publicly available images, I would just put the image URL into the src attribute of an <img> tag. And probably if I were using custom content on Squarespace or something, there would be a way to upload your images so that you could still just add the photo the same way. But since I'm using Firebase Hosting, I thought I would use Firebase Storage to store my custom content.
So if I have a homepage that has images I want to display, is it effective to use Firebase Storage and just call the getDownloadURL method to get a URL that I can then inject into my HTML via JavaScript?
I tried one instance of this, and it seems that the page pops up and then the download URL comes in after (unsurprisingly), but that gives the whole thing a slow and laggy feel. Maybe it just feels that way because everything else was local on my machine, so it popped up effortlessly, and it would look better when they're all being downloaded? But that seems kind of expensive in terms of network speed and usage, right?
Perhaps I'm just calling it in the wrong way (right now the script is referenced in <head>, and that's how it's being rendered). Is there a best practice in terms of rendering content like this?
If the images are dynamic, you should use Firebase Storage, but if you mean static images on your site, I don't see any benefit to using Storage. You can host the images directly alongside your JS code with Firebase Hosting.
Hosting also uses Google's global network to reduce the loading time of your site, and with that of your images as well.
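To make the contrast concrete: a static image deployed with Hosting needs no SDK call at all, while the Storage route resolves a URL first and injects it afterwards. A sketch, assuming the Firebase v9 modular SDK and a placeholder path images/hero.jpg (the helper name imageTag is mine):

```javascript
// Build the <img> markup once a download URL is known (helper name is mine).
function imageTag(url, alt) {
  return '<img src="' + url + '" alt="' + alt + '">';
}

// With the Firebase v9 modular SDK, the Storage route would be wired
// up roughly like this (sketch, shown as comments so the snippet stays
// self-contained):
//
//   import { getStorage, ref, getDownloadURL } from "firebase/storage";
//   const storage = getStorage();
//   getDownloadURL(ref(storage, "images/hero.jpg")).then((url) => {
//     document.getElementById("hero").innerHTML = imageTag(url, "hero");
//   });
//
// For a static image deployed with Hosting, none of that is needed:
//
//   <img src="/images/hero.jpg">
```

The static version renders as soon as the HTML parses; the Storage version always pays an extra round trip before the image can even start downloading, which matches the laggy feel described in the question.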
I am showing images on my web application using this
https://drive.google.com/uc?id=photoID
I am uploading the images to my Google Drive via the Google Drive API, then grabbing the photoID of each uploaded image and showing them on my web app by placing them like this: https://drive.google.com/uc?id=photoID. So my question is: how reliable is this approach? Previously I used Facebook URLs, and later those links were no longer available because they were just temporary CDN URLs; this time I don't want to make the same mistake again.
Thanks.
Each file in Google Drive has its own id.
Therefore, as long as the file is still stored in the Drive, you will be able to retrieve it and use it for your web application.
According to the Drive documentation:
File IDs are stable throughout the life of the file, even if the file name changes. Search expressions are used to locate files by name, type, content, parent container, owner, or other metadata.
Reference
Files and folders overview.
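Since the ID is stable, the only real failure mode is the file being deleted or unshared. If you want to detect that, a periodic metadata check against the Drive v3 API is enough; a sketch, assuming you have an API key and the file is publicly shared (the helper names are mine):

```javascript
// Build the Drive v3 metadata URL for a file id (apiKey is assumed
// to be your own API key for a publicly shared file).
function driveMetadataUrl(fileId, apiKey) {
  return "https://www.googleapis.com/drive/v3/files/" +
    encodeURIComponent(fileId) + "?fields=id,name&key=" + apiKey;
}

// Returns true while the file still exists and is reachable;
// a deleted or unshared file comes back as a 404.
async function fileStillAvailable(fileId, apiKey) {
  const res = await fetch(driveMetadataUrl(fileId, apiKey));
  return res.ok;
}
```

Running a check like this on upload records lets you spot broken image links proactively instead of discovering them in the UI.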
I have an application built that utilizes the HTML5 FileSystem API, but it only works in Chrome.
Does anyone know of an existing plugin or a technique for replicating this functionality in iOS?
The catch is that I am rendering "mini-sites" for offline use. So I would need to be able to:
Download the files for the micro-site
Store them locally
Access them later (right now, I'm using an iframe to render the page)
My solution (right now) is to do the following.
Because I am caching microsite files that I am pulling from a third party, I set up a folder on a web server and built a PHP-based "caching" service that routinely compares the content stored on the third-party site to the same content stored locally on my server. It updates the content where necessary.
The app, when it is run from iOS, asynchronously loads each of the microsites in an iframe (it creates a frame of size 1 x 1 px with the appropriate src). The iframe self-destructs after loading completes.
Step 2 allows my service worker to cache all of the micro-sites locally, along with the main site.
I have other code in place to keep the local iOS cache "fresh".
This works, but it is nowhere near as ideal as the Chrome File System API, so any alternative suggestions would be great!
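The iframe warm-up from step 2 can be sketched as follows, with the service worker deciding what to keep in its fetch handler (the helper names shouldCache and warmCache are mine):

```javascript
// Decide whether a request belongs to one of the cached microsites.
function shouldCache(url, micrositeOrigins) {
  return micrositeOrigins.some(function (origin) {
    return url.indexOf(origin) === 0;
  });
}

// Warm the cache: load each microsite in a throwaway 1x1 iframe.
function warmCache(micrositeUrls) {
  micrositeUrls.forEach(function (src) {
    var frame = document.createElement("iframe");
    frame.width = frame.height = "1";
    frame.src = src;
    frame.onload = function () { frame.remove(); }; // self-destruct
    document.body.appendChild(frame);
  });
}

// In the service worker, a cache-first fetch handler (sketch):
//
// self.addEventListener("fetch", (event) => {
//   if (!shouldCache(event.request.url, MICROSITE_ORIGINS)) return;
//   event.respondWith(
//     caches.match(event.request).then((hit) => hit || fetch(event.request))
//   );
// });
```

The cache-first handler is what makes the microsites available offline once the hidden iframes have pulled everything through the network at least once.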
Thanks,
Wayne
I want to create an app to estimate engineering costs: lots of tables, forms, a sidebar with a tree structure and so on. You can access a database in the cloud and create table structures according to records in the database.
Naturally that would be a website; however, if the user does not always have access to the Internet, he/she can download a copy of that database (precisely, a copy of the current version of the database), so that it would be possible to get data from it and work locally. That is why an offline desktop app is needed.
Would it be possible to develop such a hybrid application without first creating a web app and then doing the same with a desktop app?
Previously I looked into JavaFX, but we saw that it's too difficult to create a website out of that. Then we saw Electron, but I'm not sure if it is the right choice, because it seems that Electron is only used for desktop apps.
I'm lost as to why you feel you need a website/desktop hybrid. There are many APIs available that let you access a cloud database. All you have to do is find an API with webhooks, or one that specifically targets the database type you are using (SQL, Mongo, or whatever). Then just cache a local copy once you've pulled down the database.
Work on the copy, then push back to the database when the user saves, or try a push/pull every 5 minutes or so.
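The cache-then-sync loop described above can be sketched with a dirty flag on the local working copy (all names here are mine; pushFn and pullFn stand in for whatever API client you end up using):

```javascript
// Minimal local working copy with a dirty flag.
function makeLocalCopy(records) {
  return { records: Object.assign({}, records), dirty: false };
}

function updateRecord(copy, key, value) {
  copy.records[key] = value;
  copy.dirty = true; // remember there is something to push
}

// Push local changes if any, then pull the latest server state.
async function sync(copy, pushFn, pullFn) {
  if (copy.dirty) {
    await pushFn(copy.records);
    copy.dirty = false;
  }
  copy.records = await pullFn();
}

// e.g. run it on the suggested schedule:
// setInterval(() => sync(copy, push, pull), 5 * 60 * 1000);
```

Note this push-then-pull order is last-writer-wins; if multiple users can edit the same records, you would need real conflict handling on top.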
You can make an HTML page which would:
Use AJAX calls if connected
Add a relative <script> tag, assuming the HTML file lies in a folder on the PC and that the script is somewhere nearby.
In both cases, the user will get the same results.
Downloads are:
HTML file with inline script
JS file with database
or
zipped folder with the HTML file, database JS file, and all scripts, images, CSS, etc. required for the HTML file to display properly.
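The online/offline branch described above might look like this (the function names, the /api/estimates endpoint, and the window.localDatabase global are all placeholders of mine):

```javascript
// Pick a data source based on connectivity.
function chooseSource(online) {
  return online ? "ajax" : "local-script";
}

function loadData(onData) {
  if (chooseSource(navigator.onLine) === "ajax") {
    // Connected: fetch fresh data from the cloud database's API.
    fetch("/api/estimates")
      .then(function (r) { return r.json(); })
      .then(onData);
  } else {
    // Offline: the relative <script> tag shipped in the download has
    // already defined window.localDatabase with the database copy.
    onData(window.localDatabase);
  }
}
```

Either way, onData receives the same shape of data, so the rest of the page (tables, forms, tree sidebar) does not need to know which mode it is in.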
I am looking for a way to know how many KB of information have been transferred as a web page loads. It would be nice for example to say:
loadedSoFar = window.document.loadedKB;
Two prospective uses would be to test connection speed or to draw a progress bar.
The Mac has the Activity Monitor application that can give network usage.
Essentially, what I am looking for is network usage information in real time via JavaScript.
Is there any such tool or function?
Ideally it would work without my having to calculate the file sizes of all the page components.
No, there is not.
JavaScript doesn't have access to network traffic like that.
If the purpose is to provide a loading bar or a measure of throughput, you could prime the client with file stats. At a minimum, if your script is allowed to "discover" the nodes responsible for loading each file and watch their load events, or if the script can be made directly responsible for file loads, you can at least show progress as files loaded over total files.
Beyond that, and for more detailed stats, you could prime the script with a profile of the files to load, accounting for gzip-compressed sizes. That would allow you to show an average throughput as files trickle in.
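The primed-profile idea can be sketched as a byte-weighted progress function; the manifest of per-file sizes has to be generated at build time, since the browser cannot discover those numbers on its own (the names below are mine):

```javascript
// Progress from a primed manifest of (gzip-compressed) file sizes in
// bytes, weighted by size rather than by file count.
function progress(manifest, loadedNames) {
  var total = 0, loaded = 0;
  for (var name in manifest) {
    total += manifest[name];
    if (loadedNames.indexOf(name) !== -1) loaded += manifest[name];
  }
  return total === 0 ? 1 : loaded / total;
}

// Wire it up to load events, e.g.:
//
// img.addEventListener("load", function () {
//   loadedNames.push("photo1.jpg");
//   bar.style.width = (progress(manifest, loadedNames) * 100) + "%";
// });
```

Weighting by bytes instead of file count keeps the bar honest when one large asset dominates the page weight.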