Phonegap local storage returns empty result - javascript

On the app's first load, I retrieve data from storage (PhoneGap's WebSQL storage type), and the results load fine. Until...
When I go to another page in the app via a plain link:
<a href="page2.html">Link</a>
If I return to the main page from the second page using window.location.href in JavaScript, I retrieve the WebSQL data again (same function as before).
PhoneGap storage calls the success function (no error), but with an empty result set. It seems to be reading from a second database that I haven't stored anything in (see edit below).
If I force-quit the app and reopen it, storage loads fine again, which shows me that the rows in the DB are NOT being deleted.
I load the storage after ondeviceready is fired. What could be wrong?
Note:
1. This is not happening on simulators; it only happens on a real Android 4.0 device.
2. This app uses jQuery / jQuery Mobile.
function ondeviceready() {
    db = window.openDatabase("test", "1.0", "test DB", 2000000);
    // . . . //
    db.transaction(function (tx) {
        tx.executeSql('SELECT * FROM table ORDER BY name', [], querysuccess, function (tx, e) {
            errorAlert();
        });
    }, errorCB); // was: }), errorCB; -- errorCB belongs inside the transaction() call
}
EDIT:
I'm noticing now that if I add a new row when the app is first loaded, it's stored in one database. Then, if I go to the second page, come back to the first page, and add another row, it's also stored, but in a separate database! (So that's what it seems to be doing.) So strange. All my rows are saved and persistent, but the query returns a different set of results depending on whether I visited the second page or not...
Also:
The second page is throwing a jQuery error for some reason. I created a blank page with just a script link to jQuery, and there is still an error. Strange... I wonder if this error is affecting the database? I'm trying to work out how to solve it.

I've encountered an issue similar to this one before, but in my case the problem was that I was writing to the same database from both the Java side and the JavaScript side. This is how I solved it:
First I ran the app on an Android emulator and used DDMS to access the databases, and that is when I realised that the two sides were writing to different databases.
I changed from WebSQL to a proper SQLite database, and I started getting consistent results after that. It's a drop-in replacement for PhoneGap's implementation of database storage, so there's no need to change your code much. Please see https://github.com/brodyspark/PhoneGap-SQLitePlugin-Android.
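For reference, the switch is mostly confined to the line that opens the database; here's a minimal sketch, assuming the plugin's drop-in openDatabase signature (verify against the plugin's README for your version):

// Before: the built-in WebSQL API
// var db = window.openDatabase("test", "1.0", "test DB", 2000000);

// After: the plugin exposes the same shape of API under window.sqlitePlugin
var db = window.sqlitePlugin.openDatabase("test", "1.0", "test DB", 2000000);

// Transactions and executeSql calls are unchanged ("mytable" is a placeholder)
db.transaction(function (tx) {
    tx.executeSql('SELECT * FROM mytable ORDER BY name', [], querysuccess, errorCB);
}, errorCB);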
I think it's also important to note that WebSQL has some storage limitations, so you might be better off using SQLite than WebSQL. Please see: http://www.steveworkman.com/html5-2/standards/2011/the-limitations-of-websql-and-offline-apps/
Hope this helps :-)

What I've done is load the external HTML page through Ajax instead:
$.mobile.changePage("page2.html");
I also switched to pgsqlite as recommended.

Related

localStorage getItem works on a local .NET project, but returns NULL once deployed to a real environment

I have a project that combines a .NET backend with React/Typescript frontend.
At some point, the backend (C#) executes the following instructions:
var name = AwesomeClass.GetName();
var surname = AwesomeClass.GetSurname();
var StorageScript = $"localStorage.setItem('name_token', '{name}'); localStorage.setItem('surname_token', '{surname}');";
ScriptManager.RegisterStartupScript(this, this.GetType(), "TokenLocalStorage", StorageScript, true);
Unless I'm doing something wrong (which is a clear possibility), that call to RegisterStartupScript executes the provided JavaScript, which stores those "tokens" in localStorage. The name and surname variables are NOT null (tested several times).
On the frontend, there are calls to localStorage.getItem as follows:
const nameToken = localStorage.getItem('name_token');
The thing is that, if I run the project locally (through Visual Studio 2022, for example), everything works:
The call to localStorage.getItem works: the value generated in the backend is successfully retrieved from the frontend
Opening a Code Inspector (in Chrome, Firefox, Edge, etc) shows the entries in localStorage, as expected.
However, when the project is deployed on an AWS instance (so it can be accessed through a "real" URL, with its login form and so on), the calls to localStorage.getItem ALWAYS return null. No matter which value is stored. No matter where the backend performs that RegisterStartupScript call. No matter where the frontend tries to retrieve the values. Always NULL.
The thing is that, through the Code Inspector, the tokens are present in localStorage. So the backend part is working (it's still storing the values via setItem). Even more: if localStorage.getItem is executed in the "Console" tab, the value returned is correct.
It seems that only the React/Typescript code is unable to retrieve the value.
Any idea of what could be the root cause?
I understand that problem might be located in the "deployment" process (AWS configuration? Pipeline configuration?), but want to be sure that there is nothing (obvious) wrong with the code.
Also, if there is another easy way to send backend data to the frontend, I'm all ears.
PS: Fairly new to the web development world, so I might be ignoring critical things.

Cache API with MVC Views

I have a basic MVC form and I've been trying to use the Javascript Cache API to cache all my css, js, html files so that when users (people in the field) do not have reliable access, they can still use my web form. Obviously I'm using IndexedDB and service workers as well to check for a connection and save locally when a connection is not available, syncing when it is available.
I've gone through some tutorials, and everything seems straightforward when dealing with caching actual, physical files (css, html, js). MVC is weird, though, since you're routing. I created the basic Index, Create, Edit, and Details views. When I create an array of URLs to cache, such as
var urlsToCache = [
    '/App/Details',
    '/App/Edit',
    '/App/Create',
    '/App/Index',
    '/App/Content/bootstrap.css',
    '/App/Content/site.css',
    '/App/Scripts/jquery-1.10.2.js',
    '/App/Scripts/jquery.form.js',
    '/App/sw.js',
    '/App/Scripts/bootstrap.js',
]
.. everything caches except for Details and Edit; Index and Create cache fine. I'm actually surprised the latter two cache at all, since they aren't physical files. I'm assuming Details and Edit don't cache because they don't work without querystring parameters.
Is it POSSIBLE to cache these two views at all? Or does anyone know of anything on NuGet that addresses this situation?
I changed the GET method for my Edit action to return an empty model if there was no ID:
if (id == null)
{
    //return new HttpStatusCodeResult(HttpStatusCode.BadRequest);
    return View();
}
This allowed me to load the Edit page without a querystring variable and not get an error message. The page loads with no data, but it allows me to cache it. At this point I suppose I would have to tell my service worker to check whether the page is online: if it is, route the request normally; otherwise, query local storage and manually plug the values into the fields.
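Here's a minimal sketch of that service-worker logic, assuming the urlsToCache array from the question is defined in sw.js (the cache name is hypothetical, and network-first is just one possible strategy):

// In sw.js: cache the app shell at install time
self.addEventListener('install', function (event) {
    event.waitUntil(
        caches.open('app-cache-v1').then(function (cache) { // hypothetical cache name
            return cache.addAll(urlsToCache);
        })
    );
});

// Network first; when offline, fall back to whatever was cached
self.addEventListener('fetch', function (event) {
    event.respondWith(
        fetch(event.request).catch(function () {
            return caches.match(event.request);
        })
    );
});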
So let this be a lesson to anyone creating offline-enabled apps with MVC and the Cache API: get rid of the lines that return bad-request errors in your CRUD views when ID numbers aren't passed. Just pass a blank model back to the view (return View()). This allows you to cache your pages. You'll obviously need to write code that runs when the page loads to handle offline retrieval and presentation, but it still lets you use the MVC/Razor features when online.
One thing to note: "/App/Edit" will cache. If you load "/App/Edit/2", it won't match a URL in your cache, so you'll get an offline message. However, you can easily modify your Index page to send the ID via POST. Just have a form on the page that posts to the Edit action, and change the link to an underlined span with an onclick that sets the value of a hidden field to the ID. You'll have to pass another hidden field to let the action know that it needs to retrieve instead of update (since the controller has different GET and POST actions for Edit). The GET action becomes useless, but keep it for caching. The retrieval that you normally would do in the GET is now done in the POST, with an if statement that checks for your hidden-field flag.
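A rough sketch of that pattern in plain JavaScript (the field names and the isRetrieve flag are hypothetical; wire the onclick of your underlined span to this function):

function editItem(id) {
    // Build a POST to /App/Edit so the cached "/App/Edit" URL is what loads,
    // instead of "/App/Edit/2", which was never cached.
    var form = document.createElement('form');
    form.method = 'post';
    form.action = '/App/Edit';

    var idField = document.createElement('input');
    idField.type = 'hidden';
    idField.name = 'id';
    idField.value = id;
    form.appendChild(idField);

    var flag = document.createElement('input');
    flag.type = 'hidden';
    flag.name = 'isRetrieve'; // hypothetical flag the POST action checks
    flag.value = 'true';
    form.appendChild(flag);

    document.body.appendChild(form);
    form.submit();
}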

Parse.com PHP SDK - Refresh user data - Fetch command

I'm using the Parse.com PHP SDK on one of my pages. I seem to have a problem similar to one I faced in the iOS version, but there I easily solved it using the 'fetch' command.
The problem is when I edit information in my database, it does not update on my web page when I refresh the page. The user has to log out then log back in for the new data to be shown.
Here is how I'm getting the data:
<?php $u1=ParseUser::getCurrentUser()->get("auto"); echo $u1; ?>
Here is the documentation on the 'fetch' command but I don't understand how it works or how it is implemented: http://parseplatform.org/parse-php-sdk/classes/Parse.ParseUser.html#method_fetch
Does anyone know how to show the updated string values using this command or anything similar that would work?
The issue you're seeing is a cached object in your current session, specifically the current ParseUser. You can see that the PHP SDK attempts to find the current user in a variety of places, all effectively independent of the server-side copy.
You were on the right track: you can use the fetch method to 'refresh' any ParseObject by updating it with any new changes from the database:
// get the current user
$user = ParseUser::getCurrentUser();
// fetch changes from parse
$user->fetch();
// get your newly set key/value pair
$isAuto = $user->get('auto');
Your user object will then be refreshed with the changes you need.

Force existing client web pages to reload - using only JSON (no eval)

I'm a consultant working on a web app that's basically a single page app. All it does is constantly retrieve new json data behind the scenes (like once a minute), and then display it on screen.
Our clients load this app and leave it running 24/7, for weeks on end. If errors happen when retrieving new JSON data, the app ignores them and keeps running.
We're rolling out an update, and want the existing clients to either become invalidated, or reload themselves without any user interaction. This feature wasn't "built in" by anyone, and we're trying to do this after the fact.
Is there some way to make the existing clients reload without telling our end users to just reload the page?
The following conditions define the app a bit more:
The app uses jQuery 1.9.0
Runs exclusively in Chrome
Retrieves new json data frequently using jquery
Throws away any errors it finds in json responses and uses old data.
EDIT:
I've had it suggested that we could try the following:
send invalid data through the JSON responses to crash chrome (like 500 megs of data, for example)
send window.location.reload through the JSON response (which supposedly won't work due to jQuery protecting against this type of thing)
send "script" data in the JSON response and if it gets $.html(....) at some point, then it may run the script as well.
and am open to any suggestions on getting this to reload or kill chrome, so the client is forced to reload the page.
If you're using $.ajax to request your data, and not explicitly setting the dataType option, then you may be able to do the following on the server:
set the content type header to "text/javascript"
respond with javascript, e.g. window.location = "http://www.yoursite.com"
jQuery may eval that, and simply run your javascript.
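A minimal sketch of that server side, written here with Node/Express purely for illustration (an assumption about the stack; any server that can set the Content-Type header works the same way):

var express = require('express');
var app = express();

// Served from the endpoint the app already polls once a minute.
app.get('/data', function (req, res) {
    // With no explicit dataType, jQuery's "intelligent guess" sees
    // text/javascript and evals the body instead of parsing it as JSON.
    res.type('text/javascript');
    res.send('window.location.reload(true);');
});

app.listen(3000);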
No, it is not possible. As far as I can tell, you do not execute code from the JSON responses (which is a very good thing), so you have no way of altering your current clients' behaviour. According to your own statement:
"Throws away any errors it finds in JSON responses and uses old data"
You will not be able to crash the user's browser by sending invalid JSON data as the errors will be suppressed.
You can build automatic redeployment into future versions by sending an application version number and testing for changes, or by using WebSockets (which the application seems better suited to anyway, as you can ensure your clients only fetch new JSON when it has actually changed).
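For the WebSocket variant, a sketch along these lines could work in a future version (the endpoint and message shape are hypothetical):

var ws = new WebSocket('wss://www.yoursite.com/updates'); // hypothetical endpoint

ws.onmessage = function (event) {
    var msg = JSON.parse(event.data);
    // The server pushes a message only when a new build is deployed.
    if (msg.type === 'new-version') {
        window.location.reload();
    }
};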
If I understand correctly: create a version reference page and make the client check it every couple of seconds. When you update the version, the client will reload itself with this script.
var buildNo = "1.2.0.1"; // current client build number
var cV = setInterval(checkVersion, 5 * 1000); // every 5 sec.

function checkVersion() {
    $.ajax({
        url: "checkVersion.php?v=" + buildNo,
        dataType: "JSON",
        success: function (d) {
            if (d.version != buildNo) { // if version is different
                window.location.reload();
                //chrome.runtime.reload(); // for Chrome extensions
            }
        }
    });
}
If you can't add an extra page, you can just add an extra version variable to the end of your JSON data.
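That variant could look something like this, reusing the buildNo idea from the snippet above (the version field and URL are hypothetical):

// In the app's existing once-a-minute poll:
$.getJSON('data.php', function (d) {
    // The server now includes a "version" field alongside the real data.
    if (d.version && d.version !== buildNo) {
        window.location.reload();
    }
    // ...handle d as before...
});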

Possible to cache JSON to increase performance / load time?

I'm using a JSON file to autopopulate a drop down list. It's by no means massive (3000 lines and growing) but the time taken to refresh the page is becoming very noticeable.
The first time the page is loaded, the JSON is read, and the option the user has selected dictates which part of the JSON is used to populate the drop-down.
It's then loaded again on every refresh or menu selection after that. Is it possible to somehow cache the values to prevent the need for them to be reloaded time and time again?
Thanks.
EDIT: More Info:
It's essentially a unit converter; the JSON holds all the details. When a user selects 'Temp', for example, a call is made and the lists are populated. Once a conversion is complete, you can spend all day running temp conversions and they'll be fine, but every time a user changes conversion type, say to 'Length', the page refreshes and takes a noticeable amount of time.
Unfortunately, I don't know of a standardized global caching mechanism in PHP. This article says that Optimizer Plus, a third party accelerator, is being included in core PHP starting in version 5.5. Not sure what version you are using but you could try that.
On a different note, have you considered file storage as andrew pointed out? I think it combined with $_SESSION could really help you in this case. Let me give you an example that would work with your existing JSON data:
Server Side
Store your JSON data in a .json file on your PHP server:
{
    "data": "some data",
    "data2": "more data",
    "data3": [
        ...
    ],
    etc.
}
Note: Make sure to properly format your JSON data. Remember all strings must be enclosed in double quotes ".
In PHP, use an if statement to decide the appropriate action:
error_reporting(E_ALL);
ini_set("display_errors", "On");

session_start();

if (isset($_SESSION['dataCache'])) {
    echo json_encode($_SESSION['dataCache']);
} else {
    $file = 'data.json';
    if (!is_file($file) || !is_readable($file)) {
        die("File not accessible.");
    }
    $contents = file_get_contents($file);
    $_SESSION['dataCache'] = json_decode($contents, true);
    echo $contents;
}
Let's dig into the above code a little more. Here's what we are doing in a nutshell:
Turn on error reporting and start session support.
Check whether we've already read the file for this user.
If so, pull the value from storage, echo it out, and exit. If not, continue below.
Save off the file name and do a little error checking to ensure PHP can find, open and read the contents of the file.
Read the file contents.
Save the decoded JSON, which is now an associative array because of the `true` parameter passed to `json_decode`, into your `$_SESSION` variable.
Echo the contents to the screen.
This will save you the time and hassle of parsing JSON data and/or building it manually on the server. It will be cached for the user's session so that they can use it throughout.
Client Side
I assume you are using Ajax to fetch the information? If not, correct me, but I was assuming that's where some of your JavaScript comes into play. If so, you may consider this:
Store the returned data in localStorage on the user's browser when it's returned from the server:
$.ajax({
    ...
    success: function (res) {
        localStorage.setItem("dataCache", JSON.stringify(res));
    },
    ...
});
Or if you use promise objects:
$.ajax({
    ...
}).done(function (res) {
    localStorage.setItem("dataCache", JSON.stringify(res));
});
When you need to read it you can do a simple test:
var data;

// getItem returns null if the item is not in local storage.
// Since JavaScript is truthy/falsy, null evaluates as false.
if (localStorage.getItem("dataCache")) {
    data = JSON.parse(localStorage.getItem("dataCache"));
} else {
    // Make the ajax call, fetch the object and store it in localStorage
    // in the success or done callback as described above.
}
Notes:
localStorage is a newer HTML5 feature, so it's not fully supported in all browsers yet, though most of the major ones support it, even as far back as IE8 (I think). However, there is no standardized limit on how much these browsers are required to hold per site.
It's important to take that into consideration: you probably will not be able to store your entire 3,000-line (and growing) string in localStorage in every browser. However, you could use this as a start. Combined with the server-side solution, you should see a performance increase.
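One defensive measure worth adding is a guard around the write, since browsers throw when the quota is exceeded (the in-memory fallback here is just one option):

try {
    localStorage.setItem("dataCache", JSON.stringify(res));
} catch (e) {
    // Quota exceeded (or storage disabled): fall back to keeping the data
    // in a plain variable for the lifetime of the page.
    window.dataCache = res;
}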
Hope this helps.
I use the browser's cache to ensure that my large chunk of JSON is only downloaded once per session. I program in ASP.NET, but I'm sure PHP has the same mechanisms:
On session start, I generate a random string as a session key for my dynamic JavaScripts. This key gets stored in the ASP.NET session state under the key JsonSessionID. That way I can refer to it in my page markup.
I have a "generic http handler" (an ashx file) that when called by the browser, returns a .js file containing my JSON.
In my HTML I include the dynamic script:
<script type="text/javascript" src="/dynamicJSON.ashx?v=<%= JsonSessionID %>"></script>
The browser will automatically cache any URLs included as scripts. The next time the browser is asked to load a cached script from a URL, it will just load up the file from the local disk. This includes dynamic pages like this.
By adding the ?v= in there, I ensure that the JSON is updated once per session.
Edit
I just realized that your JSON is probably static. If that's the case, you can just put your JSON into a static .js file that you include in your HTML, and the browser will cache it.
// conversionData.js
var conversionData = { "a":1,"b":2,"c":3 };
When you include the conversionData.js, the conversionData variable will be in scope with the rest of your page's JavaScript that dynamically updates the drop-downs.
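For example, a sketch of populating a drop-down from it (the element ID is hypothetical, and the data shape matches the tiny example above):

var select = document.getElementById('unitSelect'); // hypothetical element ID
for (var key in conversionData) {
    var option = document.createElement('option');
    option.value = key;
    option.text = key + ' (' + conversionData[key] + ')';
    select.add(option);
}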
Edit 2
If you are serving static files, this blog post has a good pattern for cache-busting based on the file's date modified property. i.e. the file is only downloaded when it is changed on the server.
I have yet to find a good method for cache-busting JSON created via database lookup tables, other than per-session. Which isn't ideal because the database could change mid-session.
Once you've got your JSON data decoded into an object, you can just keep the object around; it should persist until a page reload at least.
If you want to persist between reloads you might want to look at HTML5's localStorage etc.
You would need to come up with an age strategy; maybe just store the current date in there with the data, so you can compare it and expire the cache as needed.
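A sketch of that age strategy (the one-day lifetime is an arbitrary example, and jsonData stands in for your decoded JSON):

// Store the data together with a timestamp.
localStorage.setItem('dataCache', JSON.stringify({
    savedAt: Date.now(),
    payload: jsonData
}));

// On load, expire anything older than one day.
var entry = JSON.parse(localStorage.getItem('dataCache') || 'null');
var ONE_DAY = 24 * 60 * 60 * 1000;
if (!entry || Date.now() - entry.savedAt > ONE_DAY) {
    // Cache is missing or stale: refetch and re-store.
} else {
    var data = entry.payload;
}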
I would suggest storing your JSON data in a session. On first page load, you can write a script to fetch your JSON data and then store it in a session.
On each page load/refresh afterwards, you can check your session to decide what to do: use the session data or fetch your JSON data again.
This approach suits me for small-scale data (for example: an array of products - colors - sizes - prices).
Based on your data, you should test your loading times.
Here is a simple hack:
Create a call to a PHP file as a GET request with a parameter like "bla-bla.html"
or "bla-bla.css"... well, you know: it makes the browser think it is not PHP but rather "html" or "css", and the browser will cache it.
To verify that the trick is working, go to the "Network" tab of the browser dev panel, where you will see the "Type" column along with "Transferred": instead of php and the actual size, you will find "html" and "(cached)".
This is also good to know when you are passing parameters like "blah-blah.html" to a PHP file and expect it not to be cached. Well, it will be cached.
Tested on Firefox Quantum 57.0.1 (Mac, 64-bit).
P.S.
Chrome 63 on Mac is capable of recognising the real file type in this situation, so it cannot be fooled.
Thinking out of the box here: if your list has 3,000 lines and growing (as you said), is it possible for you to establish its maximum size? Let's say the answer is 10,000 items (max); then do you really need an Ajax call at all? You could transfer the data straight away with the page (depending on your architecture, of course, you could come up with a different solution).
