I am working on an iOS app that was originally written for PhoneGap 1.0.0 and was recently upgraded to 2.2.0 so I can use SQLitePlugin with SQLCipher. I'm having a problem where a SELECT statement never returns results, even though I know there's data in the table.
Here's the rundown:
The database is created with the following code:
THDatabase.DatabaseConnection = sqlitePlugin.openDatabase(THDatabase.name, THDatabase.currentVersion, THDatabase.displayName, THDatabase.size);
THDatabase.DatabaseConnection.transaction(THDatabase.CreateUsersTable, THDatabase.FatalErrorCallback);
THDatabase.DatabaseConnection.transaction(THDatabase.CreateCentersTable, THDatabase.FatalErrorCallback);
...
Once a user registers, the Centers and Users tables have data inserted.
When the login screen loads, the db is queried for Centers and Users.
/* Centers */
LoadCenters: function LoadCenters() {
    console.log("LoadCenters");
    THDatabase.DatabaseConnection.transaction(THDatabase.LoadCentersImpl, THDatabase.FatalErrorCallback);
    console.log('THDB.LoadCenters called from: ' + arguments.callee.caller.name);
},
LoadCentersImpl: function(tx) {
    console.log("Loading centers.");
    tx.executeSql('SELECT * FROM Centers', [], THDatabase.LoadCentersParse, THDatabase.RecoverableErrorCallback);
},
LoadCentersParse: function(tx, results) {
    console.log('parsing centers');
    var dataArray = [];
    var len = results.rows.length;
    ...
},
/* Users */
LoadUsers: function LoadUsers() {
    console.info("THDB.LoadUsers");
    THDatabase.DatabaseConnection.transaction(THDatabase.LoadUsersImpl, THDatabase.FatalErrorCallback);
},
LoadUsersImpl: function (tx) {
    console.info("Loading users...");
    tx.executeSql('SELECT * FROM Users', [], THDatabase.LoadUsersParse, THDatabase.RecoverableErrorCallback);
},
LoadUsersParse: function (tx, results) {
    console.info('parsing users...');
    var dataArray = [];
    var len = results.rows.length;
    console.log(len);
    ...
}
...
The problem I'm having is that after LoadUsersImpl() runs, LoadUsersParse() never gets the results of the SELECT statement. The console log stops at "Loading users...". I'm still able to log into the app, but nothing else that requires reading from the db works.
I've tried wrapping the call to LoadUsersParse() in an anonymous function that passes tx and results, which doesn't work. If I pass just results, the result of the call shows up in the console log, but it stops right after 'parsing users...' appears.
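For reference, the wrapper attempt looked roughly like this (a sketch from memory, not the exact code):
LoadUsersImpl: function (tx) {
    console.info("Loading users...");
    // Success callback wrapped in an anonymous function, as described above.
    // Passing both tx and results through never fires; passing only results
    // logs the result object but still stops after 'parsing users...'.
    tx.executeSql('SELECT * FROM Users', [], function (tx, results) {
        THDatabase.LoadUsersParse(tx, results);
    }, THDatabase.RecoverableErrorCallback);
},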
This same code works fine without SQLitePlugin installed in PhoneGap, but I can't figure out where the plugin might be causing the problem.
Turning DEBUG on logs this right after "Loading users...":
SQLitePlugin.backgroundExecuteSqlBatch: {"executes":[{"query":["BEGIN"],"path":"ThinkHealthDBs.db","callback":"cb55"},{"query":["SELECT * FROM Users"],"path":"ThinkHealthDBs.db","callback":"cb56"}]}
This is a problem both with and without SQLCipher, and also on PhoneGap/Cordova 2.1.0.
Related
I have a cross-platform app developed using AngularJS, Monaca and Onsen UI.
I have a service that gets and sets values to be accessed from various controllers throughout the app.
I have various SQLite database tables created already, each one populated with values. What I need to do is access the tables and store the table values in their "corresponding" arrays so that they are accessible from the app's views.
First, I initialise the arrays that will store the values retrieved from the database tables. Then I create the table-array association JSON, which matches each array to be populated with its corresponding table, as the small extract below shows.
// Init Arrays and Table association
var FruitArray = [];
var VegArray = [];
var formArrays = [{
    tablename: "tb_fruit",
    arrayname: "FruitArray"
}, {
    tablename: "tb_veg",
    arrayname: "VegArray"
}];
Next, I query the SQLite tables and use the retrieved values to populate the arrays, as the sample below shows.
var myFunctions = {
    initFormData: function () {
        for (var i = 0; i < formArrays.length; i++) {
            myFunctions.queryTable(formArrays[i].tablename, formArrays[i].arrayname);
        }
    },
    // Query the Tables
    queryTable: function (tableName, arrayName) {
        var sql = 'SELECT * FROM ' + tableName;
        db.transaction(function (tx) {
            tx.executeSql(sql, [],
                (function (arrayName) {
                    return function (tx, results) {
                        myFunctions.querySuccess(tx, results, arrayName);
                    };
                })(arrayName), myFunctions.queryError);
        });
    },
So far so good. In my success callback below I try to push the data retrieved from the table into the array, but with no luck. I have done something similar previously when adding data to arrays in AngularJS, à la $scope[arrayName].push(results.rows.item(i)), but I can't seem to do the same in this case.
// Success Callback
querySuccess: function (tx, results, arrayName) {
    for (var i = 0; i < results.rows.length; i++) {
        alert("Array Name: " + arrayName + // Working - returns name e.g. VegArray
            "\n Results: " + results.rows.item(i)); // Working - returns [object Object]
        [arrayName].push(results.rows.item(i));
        alert("Array: " + [arrayName]); // Not working here - simply returns name e.g. VegArray
    }
},
// Error Callback
queryError: function (tx, err) {
    alert("An error has occurred");
},
} // myFunctions() end
return myFunctions;
How do I push the values from my tables to the arrays? Is this the best method (guessing no), or is there a better way (guessing yes)?
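For reference, the earlier AngularJS pattern I mentioned looked roughly like this (a sketch from memory; the arrays lived as properties of $scope, so the name string worked directly as a key):
// Rough sketch of the earlier controller code: because the arrays are
// properties of $scope, the arrayname string from formArrays can be used
// as a key with bracket notation.
$scope.FruitArray = [];
$scope.VegArray = [];

function querySuccess(tx, results, arrayName) {
    for (var i = 0; i < results.rows.length; i++) {
        $scope[arrayName].push(results.rows.item(i));
    }
}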
I have a DynamoDB table with an attribute called "approval", and several test items in it with values for "approval" that include "pending", "approved", and "not approved". I have 3 items with "pending".
I'm using the following Lambda function to retrieve items, and I want to get just the items that are "pending", so I'm using FilterExpression. This is my complete function:
var doc = require('dynamodb-doc');
var dynamo = new doc.DynamoDB();

exports.handler = function(event, context) {
    var params = {
        TableName: 'mytable',
        FilterExpression: 'contains(approval, :approval_value)',
        ExpressionAttributeValues: {':approval_value': 'pending'}
    };
    dynamo.scan(params, onScan);

    function onScan(err, data) {
        if (err) {
            console.error("Unable to scan the table. Error JSON:", JSON.stringify(err, null, 2));
        } else {
            console.log("Scan succeeded.");
            context.succeed(data);
        }
    }
};
Basically I want to do a, "SELECT * FROM mytable WHERE approval LIKE 'pending';" if it were in SQL.
Weirdly, only one item is being returned while I'm expecting 3. I'm not using Limit. Why is it only returning one item?
Welp, embarrassingly, I was reading the execution result wrong. It was returning 3 results correctly, but only the first item appeared "above the fold" (probably another reason not to use the AWS console for this stuff).
Hopefully the code above, which works perfectly well, will help someone else as a simple example of using FilterExpression.
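As a side note, since the approval values here are exact strings, an equality comparison in the FilterExpression should work just as well as contains(); a variation of the params above (untested):
// Variation of the params above: compares the whole attribute value
// instead of doing a substring match with contains().
var params = {
    TableName: 'mytable',
    FilterExpression: 'approval = :approval_value',
    ExpressionAttributeValues: {':approval_value': 'pending'}
};
dynamo.scan(params, onScan);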
I am trying to create a user login application using PhoneGap.
Here is my code:
function successCB(tx) {
    var setName = $("#name").val();
    var setID = $("#ID").val();
    //alert('select * from USER where name='+setID+' and password='+setName+'');
    tx.executeSql('select * from USER where name='+setID+' and password='+setName+'', [], querySuccess, errorCB);
}
When I uncomment the alert, it shows the data entered in the text fields; however, I can't figure out what to do next.
I want to check the inputs against the database, and if the user ID and password are correct the flow should go to another page, like a proper login page.
What is the appropriate way to do that in
function querySuccess(tx, results)
You need to verify that some rows were selected. If the typed-in username and password are correct, the SQL statement will return a row for that user.
One option is to make a global variable called db where your connection is stored. Then you could do something like this.
var db = "";
db = window.openDatabase("Database", "1.0", "Cordova Demo", 200000);

// collect your values as you already do.. I just show them here for demonstration purposes.
var setName = $("#name").val();
var setID = $("#ID").val();

// execute this code from a function of your choice.
db.transaction(function(con) {
    // Using ? placeholders keeps the text values properly quoted/escaped.
    con.executeSql('select * from USER where name = ? and password = ?', [setID, setName], function (tx, results) {
        var len = results.rows.length;
        if (len == 1) { alert("you logged in with some user"); }
    });
});
Or, with what you posted:
function querySuccess(tx, results) {
    var len = results.rows.length;
    if (len == 1) { alert("you logged in with some user"); }
}
After all this, what you need is a way to detect whether you are logged in or not. It's not of much use if you have to log in every time you change page, or whatever your app contains. You could use localStorage to detect that.
You could also make a global variable that holds whether the user is logged in or not.
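For example, a minimal sketch of the localStorage approach (the key name is just an illustration):
// After a successful login (inside the success callback):
window.localStorage.setItem("loggedInUser", setID);

// On any later page load, check whether someone is logged in:
if (window.localStorage.getItem("loggedInUser")) {
    // already logged in - go straight to the main page
} else {
    // not logged in - show the login page
}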
:-)
I hope my code helped you.
I have a PhoneGap (2.1.0) application that, on deviceready, creates a DB and populates a table with info.
Running this locally (using the Ripple emulator) in Chrome works. Tables are being created and populated as required.
After installing the built .apk on my Android device, my Eclipse logcat shows:
sqlite returned: error code = 14, msg = cannot open file at line 27712 of [8609a15dfa], db=/data/data/<project>/databases/webview.db
sqlite returned: error code = 14, msg = os_unix.c: open() at line 27712 - "" errno=2 path=/CachedGeoposition.db, db=/data/data/<project>/databases/webview.db
Which, according to this post here, I believe can be ignored.
However - I also noticed this error in logcat:
sqlite returned: error code = 1, msg = no such table: latest_events, db=/data/data/<project>/databases/webview.db
I have also confirmed through adb shell that the DB is not created,
here: /data/data/com.application/databases
or here: /data/data/com.application/app_databases
So - my code:
if (!window.openDatabase) {
    doMessage('Databases are not supported on this device. Sorry','error');
    return;
} else {
    consoleLog('all good for storage');

    var db;
    var shortName = 'MyDB';
    var version = '1.0';
    var displayName = 'MyDB';
    var maxSize = 102400;

    function errorHandler(transaction, error) { consoleLog('Error: ' + error.message + ' code: ' + error.code); }
    function nullHandler() {};

    db = window.openDatabase(shortName, version, displayName, maxSize);

    consoleLog('starting table creation');
    db.transaction(function(tx){
        tx.executeSql( 'CREATE TABLE IF NOT EXISTS latest_events (id integer PRIMARY KEY AUTOINCREMENT,EventID integer,EventLocation text,EventName text,EventDateFrom varchar,EventTime timestamp,EventPresentedBy varchar,EventVenue varchar,EventScript text,RequireRSVP varchar)',[],nullHandler,errorHandler);

        db.transaction(function(tx){
            tx.executeSql('SELECT count(id) as RowCount FROM device_info ', [],
                function(tx, result) {
                    if (result != null && result.rows != null) {
                        for (var i = 0; i < result.rows.length; i++) {
                            var row = result.rows.item(i);
                            consoleLog('rowcount: '+row.RowCount);
                            if(row.RowCount==0){
                                tx.executeSql('INSERT INTO device_info (device_name, device_platform, device_uuid, device_os_ver, date_last_used) VALUES (?,?,?,?,?)',[device.name, device.platform, device.uuid, device.version, window.bowman_config.siteDate],nullHandler,errorHandler);
                                //doMessage('device info row added','notice');
                            }
                        }
                    }
                },errorHandler);
        },errorHandler,successCallBack('2'));
        //doMessage('device info row added','notice');
    },errorHandler,successCallBack('1'));
}
To add to my woes, in my logcat I do see the console.log output for the "all good for storage" and "starting table creation" messages.
My errorHandler functions are not returning anything and my successCallBack functions are triggered... but no DB is created.
Thanks for the help.
When you pass in successCallBack("2") and successCallBack("1") you are actually invoking them immediately, so you may be getting false positives on whether success has actually been called. You should either provide two separate success callbacks or just inline some functions that call console.log("1"), for instance.
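A minimal sketch of that fix, keeping the rest of the transaction code as posted (only the last argument changes):
db.transaction(function(tx) {
    // ... executeSql calls as above ...
}, errorHandler, function() { successCallBack('1'); });

// or inlined, just for tracing:
db.transaction(function(tx) {
    // ... executeSql calls as above ...
}, errorHandler, function() { console.log('transaction 1 succeeded'); });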
Today this issue cost me 3 hours. What I tried:
Rewriting the copy database code.
Deleting the app from the emulator / device
Wiping emulator(s)
Cleaning eclipse
Changing file permissions
Validating a working SQLite database file
I solved the problem by copying the code from a shared Dropbox account to another location and refactoring the code in the Android Manifest and Java files with another package name.
The application runs beautifully now, i.e. there is nothing wrong with the code, but somewhere it was getting messed up by Dropbox.
I broke the nested functions up into single functions and 'chained' them based on their success or failure. It was actually a lot simpler than I thought. RTFM, it seemed. Thanks for all the help.
Simplified:
var db = window.openDatabase("TheApp", "1.0", "The App", 50000000);
db.transaction(queryDB, errorCB, successCB);

// Query the database //
function queryDB(tx) {
    //tx.executeSql('SELECT * FROM table', [], querySuccess, errorCB);
}

function querySuccess(tx, results) {
    //do more functions here
}

function errorCB(err) {
    console.log("Error processing SQL: " + err.code);
}
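To illustrate the chaining, a rough sketch of how one transaction's success callback kicks off the next one (the table and column names here are just placeholders):
// Transaction 1: create the table. Its success callback starts transaction 2.
function createTables(tx) {
    tx.executeSql('CREATE TABLE IF NOT EXISTS latest_events (id integer PRIMARY KEY AUTOINCREMENT, EventName text)');
}

function createSuccess() {
    // Chained: runs only after the CREATE transaction has committed.
    db.transaction(insertData, errorCB, insertSuccess);
}

function insertData(tx) {
    tx.executeSql('INSERT INTO latest_events (EventName) VALUES (?)', ['example']);
}

function insertSuccess() {
    console.log('insert transaction committed');
}

db.transaction(createTables, errorCB, createSuccess);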
So I'm using node.js and the module instagram-node-lib to download metadata for Instagram posts. I have a couple of hashtags that I want to search for, and I want to download all existing posts (handling request failure during pagination) as well as monitor all new posts.
I have managed to crack the first part: downloading all existing posts and handling failure (I noticed that sometimes the Instagram API would just fail on me, so I've added redundancy to remember the last successful page I downloaded and attempt again from that point). For anyone who is interested, here is my code (note: I use Postgres to save the posts, and I've abbreviated/obfuscated some of the code for ease of reading and for commercial purposes). Apologies for the length of the code, but I think it will come in useful to someone:
var db = new (require('./postgres'))
,api = require("instagram-node-lib")
;
var HASHTAGS = ["fluffy", "kittens"] //this is just an example!
,CLIENT_ID = "YOUR_CLIENT_ID"
,CLIENT_SECRET = "YOUR_CLIENT_SECRET"
,HOST = "https://api.instagram.com"
,PORT = 443
,PATH = "/v1/media/popular?client_id=" + CLIENT_ID
;
var hashtagIndex = 0
,settings
;
/**
* Initialise the module for use
*/
exports.initialise = function(){
    api.set("client_id", CLIENT_ID);
    api.set("client_secret", CLIENT_SECRET);

    if( !settings){
        settings = {
            hashtags: []
        }
        for( var i in HASHTAGS){
            settings.hashtags[i] = {
                name: HASHTAGS[i],
                maxTagId: null,
                minTagId: null,
                nextMaxTagId: null,
            }
        }
    }
    // console.log(settings);
    db.initialiseSettings(); //I haven't included the code for this - basically just loads settings from the database, overwriting the defaults above if they exist, otherwise it creates them using the above object. I store the settings as a JSON object in the DB and parse them on load

    execute();
}
function execute(){
    var params = {
        name: HASHTAGS[hashtagIndex],
        complete: function(data, pagination){
            var hashtag = settings.hashtags[hashtagIndex];

            //from scratch
            if( !hashtag.maxTagId){
                console.log('Downloading old posts from scratch');
                getOldPosts();
            }
            //still loading old (previously failed)
            else if( hashtag.nextMaxTagId){
                console.log('Downloading old posts from last saved position');
                getOldPosts(hashtag.nextMaxTagId);
            }
            //new posts only
            else {
                console.log('Downloading new posts only');
                getNewPosts(hashtag.minTagId);
            }
        },
        error: function(msg, obj, caller){
            apiError(msg, obj, caller);
        }
    }

    api.tags.info(params);
}
function getOldPosts(maxTagId){
    console.log();
    var params = {
        name: HASHTAGS[hashtagIndex],
        count: 100,
        max_tag_id: maxTagId || undefined,
        complete: function(data, pagination){
            console.log(pagination);
            var hashtag = settings.hashtags[hashtagIndex];

            //reached the end
            if( pagination.next_max_tag_id == hashtag.maxTagId){
                console.log('Downloaded all posts for #' + HASHTAGS[hashtagIndex]);
                hashtag.nextMaxTagId = null; //reset nextMaxTagId - that way next time we execute the script we know to just look for new posts
                saveSettings(function(){
                    next();
                }); //Another function I haven't included - just saves the settings object, overwriting what is in the database. Once saved, executes the next() function
            }
            else {
                //from scratch
                if( !hashtag.maxTagId){
                    //these values will be saved once all posts in this batch have been saved. We set these only once, meaning that we have a baseline to compare to - enabling us to determine if we have reached the end of pagination
                    hashtag.maxTagId = pagination.next_max_tag_id;
                    hashtag.minTagId = pagination.min_tag_id;
                }

                //if there is a failure then we know where to start from - this is only saved to the database once the posts are successfully saved to database
                hashtag.nextMaxTagId = pagination.next_max_tag_id;

                //again, another function not included. saves the posts to database, then updates the settings. Once they have completed we get the next page of data
                db.savePosts(data, function(){
                    saveSettings(function(){
                        getOldPosts(hashtag.nextMaxTagId);
                    });
                });
            }
        },
        error: function(msg, obj, caller){
            apiError(msg, obj, caller);

            //keep calm and try again - this is our failure redundancy
            execute();
        }
    }

    var posts = api.tags.recent(params);
}
/**
* Still to be completed!
*/
function getNewPosts(minTagId){
}
function next(){
    if( hashtagIndex < HASHTAGS.length - 1){
        console.log("Moving onto the next hashtag...");
        hashtagIndex++;
        execute();
    }
    else {
        console.log("All hashtags processed...");
    }
}
OK, so here is my dilemma about solving the next piece of the puzzle: downloading new posts (in other words, only those posts that have come into existence since I last downloaded all the posts). Should I use Instagram subscriptions, or is there a way to implement paging similar to what I've already used? I'm worried that if I use the former solution and there is a problem with my server that takes it down for a period of time, then I will miss out on some posts. I'm worried that if I use the latter solution then it might not be possible to page through the records, because I'm not sure the Instagram API is set up to enable forward paging rather than backward paging.
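For what it's worth, the forward-paging version of getNewPosts I have in mind would look roughly like this (an untested sketch; it assumes instagram-node-lib passes min_tag_id through to the tags/recent endpoint the same way it passes max_tag_id):
function getNewPosts(minTagId){
    var params = {
        name: HASHTAGS[hashtagIndex],
        count: 100,
        min_tag_id: minTagId || undefined, //assumption: forwarded to the API like max_tag_id
        complete: function(data, pagination){
            var hashtag = settings.hashtags[hashtagIndex];

            //remember the newest tag id we've seen, so the next run starts from there
            if( pagination.min_tag_id){
                hashtag.minTagId = pagination.min_tag_id;
            }

            db.savePosts(data, function(){
                saveSettings(function(){
                    next();
                });
            });
        },
        error: function(msg, obj, caller){
            apiError(msg, obj, caller);
        }
    }

    api.tags.recent(params);
}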
I've attempted to post questions in the Google Instagram API Developers Group a couple of times and none of my messages seem to be appearing in the forum, so I thought I'd resort to trusty Stack Overflow.