iOS application that self-updates javascript (or other interpreted) resources? - javascript

Based on several questions I've seen:
How to implement auto-updating iOS Apps from App Store?
auto upgrading iOS apps
Is it possible to have a self-updating iPhone application?
it seems pretty clear that apps are not allowed to "self-update" code on startup (or through some other mechanism within the app), since the new code would not have been part of the app originally submitted to the App Store.
However, there are plenty of applications out there that do perform automatic updates on startup (for example, one popular game downloads several hundred MB on first startup). It seems this type of thing is allowed for resources - as long as the download is data/resources, i.e. stuff that doesn't change the compiled code.
Technically, JavaScript included with an app's resource bundle could be considered a resource, so it would be safe to auto-update any included JavaScript. On the other hand, JavaScript could be considered 'code', since it's interpreted by WebKit (in a UIWebView) and executed. If this counts as code, one could even go to the extreme and create some interpreter that runs commands included in the app based on the contents of an XML file (very easily considered a resource) that gets auto-updated. Would this count as 'code'?
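To make that concrete, a toy version of such an interpreter might look like the sketch below (the handler names and command format are invented purely for illustration; the downloaded resource could equally be XML parsed into this shape):

```javascript
// Hypothetical sketch of the "interpreter driven by a resource file" idea.
// The app ships a fixed set of handlers; the auto-updated data file only
// names which ones to run and with what arguments.
const handlers = {
  showMessage: (args) => `message: ${args.text}`,
  setTheme: (args) => `theme set to ${args.name}`,
};

// `resource` stands in for the parsed contents of the auto-updated data file.
function runCommands(resource) {
  return resource.commands.map(({ cmd, args }) => {
    const fn = handlers[cmd];
    return fn ? fn(args) : `unknown command: ${cmd}`;
  });
}
```

Whether the data file driving `runCommands` counts as a 'resource' or as 'code' is exactly the question.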
So, the main questions I'm looking for answers to are:
Where exactly is the line between 'resource' and 'code'? Does 'code' just mean 'compiled Objective-C'?
Are there any known examples of accepted apps that do something like this?
Also, since I'm sure it'll be asked, my main reason for wanting to download and execute the javascript locally instead of just hitting the remote site all the time is for performance / offline capabilities.

Related

How to download and query html pages where JS processing is necessary?

I often compile informal datasets by running some kind of XPath/XQuery on publicly available web pages. Usually the structure of the HTML is regular enough that useful information can be extracted easily.
But today I've come across tunefind.com. This website makes extensive use of the ReactJS framework, and so most of the structure of the page is built client-side by JavaScript. The pages, when initially downloaded, are very basic and missing a lot of information; they are populated by a script that consumes a hopelessly messy blob of JSON data at the bottom of the page.
The only way I can think of to deal with this would be to use some kind of GUI-based web engine and just not display the GUI part. But that is a preposterous amount of work for these casual little CLI tools that I use to gather information.
Is there any way to perform the javascript preprocessing without dealing with unnecessary graphics?
Even if you could process the page without the graphics, the React JavaScript is geared towards running in a browser context: at the very least it expects a functioning DOM to exist, and the application itself may require clicks/transitions to happen before you can see some data.
Your best bet, then, is to load the page in a browser. To keep this simple, there are plenty of good browser automation frameworks designed for exactly this.
I've used a fair few libraries over the years, including PhantomJS, and recently I've gotten the most mileage out of Nightmare.js.
It runs an Electron browser for you and gives you a useful promisified JavaScript API to control it, with common browser actions such as clicking, following links, etc.
You can configure it to hide the browser, which is useful for making a CLI tool; however, it's a bit of a pseudo-headless mode and will still require a windowing/graphical context (e.g. an X server).
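As a rough sketch of what that looks like (the URL and CSS selector are placeholders you'd adjust for the real page; the Nightmare constructor is passed in as a parameter here only so the function stays testable):

```javascript
// Sketch of scraping a JS-rendered page with Nightmare. Pick a selector that
// only appears after the client-side render has populated the page.
function scrapeRows(Nightmare, url, selector) {
  const nightmare = Nightmare({ show: false }); // hide the Electron window
  return nightmare
    .goto(url)
    .wait(selector) // block until the rows exist in the DOM
    .evaluate(
      (sel) => Array.from(document.querySelectorAll(sel)).map((el) => el.textContent),
      selector
    )
    .end(); // resolves with the evaluate() result
}

// Usage (after `npm install nightmare`):
// scrapeRows(require('nightmare'), 'https://www.tunefind.com/...', '.SongRow')
//   .then(console.log);
```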
Hope this helps.
PS - If you're at all used to Docker, it's not hard to make this just a running container!

ElectronJS - Cache HTML and JS files from remote server

I have an electron app that retrieves the app files (.html & .js) from a remote server using the function mainWindow.loadURL('http://www.example.com/index.html')
The problem arises if the user's network connection is offline or disconnected.
Is there a way in Electron to cache the HTML and JS files, so that if the user is offline, Electron will automatically load from the cache?
I have tried to use the HTML5 Application Cache and a webpack plugin (https://github.com/NekR/offline-plugin), but these do not seem to work.
I see this is an old question but I stumbled across this when doing a semi-related search and there is no answer at all right now, so I'll provide one:
Ignoring the Electron-specific nature of this question, the web-standard way to do this is using Service Workers. Here are some docs on that:
"Using Service Workers" from MDN - https://developer.mozilla.org/en-US/docs/Web/API/Service_Worker_API/Using_Service_Workers - this is a reference source.
"Adding a Service Worker and Offline into your Web App" - https://developers.google.com/web/fundamentals/codelabs/offline/ - this is a tutorial.
"Creating Offline-First Web Apps with Service Workers" - https://auth0.com/blog/creating-offline-first-web-apps-with-service-workers/ - this is also a tutorial.
I think this would be the most direct way to solve this, even within Electron. (An advantage of Electron here is that you have a single, known browser to make this work for, but I think what you are trying to do fits perfectly within the problem-space that Service Workers are designed to address.)
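As a minimal sketch of the cache-first pattern those docs describe (asset names are placeholders for your app's real files; the logic is factored into plain functions here, with the event wiring for a real sw.js shown in the trailing comment):

```javascript
// Cache-first offline support, sketched as plain functions.
const CACHE = 'app-cache-v1';
const ASSETS = ['/index.html', '/app.js', '/style.css']; // placeholder asset names

// install step: precache the whole app shell so it works fully offline.
function precache(caches) {
  return caches.open(CACHE).then((cache) => cache.addAll(ASSETS));
}

// fetch step: serve the cached copy if there is one, else hit the network.
function cacheFirst(caches, fetchFn, request) {
  return caches.match(request).then((hit) => hit || fetchFn(request));
}

// In a real sw.js these are wired up as:
//   self.addEventListener('install', (e) => e.waitUntil(precache(caches)));
//   self.addEventListener('fetch', (e) => e.respondWith(cacheFirst(caches, fetch, e.request)));
```

With a cache-first strategy you'd also want some versioning scheme (e.g. bumping the cache name) so updated assets eventually replace the cached ones.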
That said, I think Sayam's comment/question is valid -- if this html/js is the actual content of your electron app, and assuming it doesn't change too often you could (and maybe should) distribute it with the app itself. Then you don't need to do anything special for offline support (as long as that html/js doesn't need network-based resources), and changes to that code are deployed as application updates.
Personally I think that once-per-week is about the maximum frequency of updates for which this approach is suitable. It would not bother me if an app auto-updated 2 or 3 times per month, but I think I'd uninstall an app that updates itself 2 or 3 times per week if I had that option.
There may also be some electron and/or node modules that address this problem-space, but I've never bothered to look because one of the two options above has always seemed appropriate to me.
Old question but still a valid use case (offline cache for dynamic assets).
Here is an article that describes one solution (a custom ExpressJS caching middleware). The author also made an npm library to address this.
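I haven't read that library's source, but the general shape of such a middleware - remember the last good response per URL and replay it on later requests - can be sketched like this (in-memory only; a real version would persist to disk so the cache survives restarts):

```javascript
// Sketch of an offline-cache middleware in the Express (req, res, next) style.
// This is a guess at the general approach, not the linked library's actual code.
const cache = new Map();

function offlineCache(req, res, next) {
  const hit = cache.get(req.url);
  if (hit !== undefined) return res.send(hit); // replay the last good copy

  // Wrap res.send so successful responses are remembered for later.
  const send = res.send.bind(res);
  res.send = (body) => {
    cache.set(req.url, body);
    return send(body);
  };
  next();
}
```

A real version would likely only fall back to the cache when the upstream request fails, rather than serving the cached copy unconditionally as this sketch does.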

In Aurelia.js, how do I make supporting html for routers load on demand rather than startup?

In Aurelia.js, if you look at network traffic, it seems that all HTML pages for the routers are loaded at start time (loaded by SystemJS, since I am bundling). This makes my page load slower. I need them to load on demand, i.e. when a corresponding router hyperlink is clicked.
This is an interesting idea. I'm not a core team member so maybe someone with deep knowledge of the router can comment on the ability to do what you're asking. I know what I'm about to say below doesn't directly answer your question, but I'm providing it as-is in case it's still helpful.
Generally, most people "bundle" their app before deploying to production. See https://github.com/aurelia/bundler for an example on how to do this in Aurelia. The Aurelia Skeleton app has this build pipeline wired in as well by default (https://github.com/aurelia/skeleton-navigation).
Bundling reduces the number of network requests, but it still loads the entire app at once. You have to think of SPA apps more like desktop apps. The HTML template or "view" template you edit is actually compiled into JavaScript, then combined with the data it needs by your "view model".
One approach that results in smaller downloads is to break up a large monolithic application into several smaller applications. You create separate Aurelia apps that are loaded at different server URLs. You then have a mixture of server routing (to route users between applications) and client routing (to route a user within an application). This obviously creates complexity in your development lifecycle though.
Several things occurred to me.
The question may be a bit confusing: the routers are not calling the resources, they are just consuming them. Also, it's about start-up speed; that has to be clear.
Ensure bundling is done correctly: copy the config file over to the deployment server, ensure you are bundling all necessary files, etc. (mine was not). You can gain a tremendous increase in start-up time if bundling is done correctly. Look at the developer tools and watch for anything suspicious.
If you still have trouble with a large app, you may experiment with having multiple bundles. See the Aurelia's documentation on bundling.
This is Beta 1; there should be some performance enhancements coming in Beta 2 and Release 1. So far, in Beta 1, for the same browser, Linux seems to be much faster than Windows in regards to start-up time. IE and Edge are slower than other browsers when using Aurelia.
This may help alleviate some of the problem: add the async attribute on external script tags (depending on needs, defer may work too), and add "lazyload" attributes on img tags. The latter only works in IE and Edge, but it helps, since IE and Edge are the slowest browsers in many cases.

Windows Store App: Dynamically load JavaScript for easier update?

Recently I have taken over a Windows Store app project written in C#/XAML and one of its requirements is,
Migrating the project to JavaScript/HTML5; by making the JavaScript dynamically loaded from our website, we can update the code logic as often as desired without having to prompt the user to download a new version of our app. The deployed app is very simple and does not require updates; each time the app launches, it will try to load the JavaScript and content from the web.
This requirement is from marketing and they think it is fantastic if the app can be updated in this way. I don't know if this is a good idea, or even feasible.
My questions
Give me some reasons if this is not a good idea/not feasible
If javascript can be dynamically loaded, what about html and css files?
Edit
When I asked this question, I had not heard of Cordova; the idea of a hybrid app was quite new at that time. With Cordova, we can write an app that runs this way: the web content (HTML/JS/CSS) is rendered in a WebView control and can be updated from the web each time.
In short, I don't believe the app can be written in this way.
The application code is part of the appx package when the app is deployed and if you wish it to be accessible via the store, it has to pass MS's app certification checks.
This ensures that the app code is of suitable quality and ticks the appropriate security boxes, etc.
I don't believe you can dynamically load application code from a remote source, otherwise you could be loading any old unverified cack and this code would not have been checked.
However, the updating mechanisms are very simple as I understand it and pushing a new version of your app code to the store will allow users to update the app at their leisure and handle the process for you.
I think the best you can get in terms of loading things dynamically would be to load a webpage remotely in your app (the webpage could be updated as you wanted). This would not allow you to run application code from the webpage, and with the webpage being remote, it would run under the 'web context' from a security perspective, so you'd not be able to use a lot of the native functionality you could run locally.
If the reason for re-developing the app in JS/HTML is to load code dynamically, I would advise against it. There are not a huge range of differences between a C# implementation and a JS implementation, so it would seem inefficient to redo all the existing work in a different language.

Place javascript files in 12 Hive or in Document Library?

Besides the obvious benefits of placing custom JavaScript files (or any other resource files) in a document library, such as:
versioning, history, tracking
easy to change/edit
Are there any other benefits?
Performance? Page Load time?
Are there any cons?
PS. This is not meant as a question about how the number of files/resources affects general HTTP performance, but rather about this specific SharePoint issue of file location.
http://site/_layouts/myjavascript.js
vs.
http://site/DocumentLibrary/myjavascript.js
If you are storing the javascript in a library then it is stored in the database.
It means that:
It has version control
It is slower than the filesystem (unless you are using the blob cache)
It will be included in any backups you do of your sharepoint install (stsadm for example)
It will be accessible (changeable) by anyone with access to the document library (easier to maintain, less secure)
Client side caching will behave differently (you'll need to configure it, it's a bit complicated for MOSS content vs filesystem content)
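For reference, the blob cache mentioned above is enabled via a one-line change in the web application's web.config; the attribute values here are illustrative (maxSize is in GB, and the path regex controls which file extensions get cached):

```xml
<!-- Inside <SharePoint> in the web application's web.config -->
<BlobCache location="C:\BlobCache\12"
           path="\.(gif|jpg|png|css|js)$"
           maxSize="10"
           enabled="true" />
```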
We decided to store it in the 12 hive as it feels better in regards to code vs. data separation. If you consider this file to be data then store it in MOSS, if you consider it to be "code" then store it in the filesystem.
Have you considered using Google to host JavaScript files (such as jQuery)?
This benefits from using their bandwidth for downloading the files:
faster page loading times
higher availability
chances are high that your javascript file is already cached on the user's machine
Document Library
Pros - Automatic delivery to all web front ends, easy, versioning, history, ease of editing
Cons - Slower (it's in the database); security issues brought about by accidentally securing the item's site; if you are referencing the JS via an absolute URL, your users may get repeated login prompts
Placing the js file in the 12 hive
Pros - Faster; no issues with the aforementioned security prompts
Cons - Not automagically delivered to all of your web front ends; possible AAM issues; technically you are not supposed to modify files in the 12 hive
