Downloading multiple images with Phonegap - FileTransfer - javascript

I have built an offline app in PhoneGap. The JSON data gets pushed and arrives where it needs to go, but there is always a bunch of images that needs to follow, and the HTML5 cache just won't do, since it is cleared when the app is closed. I have been racking my brain over this for quite a while now.
The first issue is checking whether an image already exists on the file system. I've chosen to do that with an "async: false" AJAX call, which gets the code running up to a point. I haven't used the (almost hacky) PhoneGap way of writing the file with overwrite set to false (which produces an error when the file already exists), since that is yet another async function I didn't want to deal with (and async is exactly where the problem lies).
Another problem is that the PhoneGap browser handles at most about three downloads at a time (I looked it up). My previous function suffered from this: it would simply cut off downloads when the number of simultaneous transfers got too high, which happened at around 20 to 40 images at a time (depending on size). That is not surprising, since download speed plummets when you divide it over many simultaneous downloads.
So the question is, how to build a function that:
loops through the JSON data (the image variables);
downloads the images three at a time (and sets the metadata, because Apple says it has to be set);
deletes from disk the images that are no longer needed (reliable deletion seems to be a problem all over the net from what I've read, so we'll keep this optional; by now I honestly don't care about disk space anymore);
runs a callback once all images are downloaded (see the sketch after the code below).
My code so far:
var datadir = "";
var pics_at_the_time = 0;
var external_url_pics = "http://Folder on server where images are";
// new_config.pics holds the JSON data. Built: '[ key (same as id) { "id": 1234567890, "tld":"jpg" }'. id+'.'+tld = the image name.
window.requestFileSystem( // get the root folder path
LocalFileSystem.PERSISTENT,
0,
function(fileSystem) {
datadir = fileSystem.root.fullPath;
},
function() {
console.log("Could not get root FS");
}
);
function fetch_images() {
    $.each(new_config.pics, function(index, val) { // loop through all pics
        if (!val) { return; } // skip entries already removed from the list
        pic_exists(val, index);
    });
}

function pic_exists(val, index) {
    $.ajax({ // pic on disk or not
        async: false,
        url: 'file://' + datadir + '/' + val.id + '.' + val.tld, // or your url
        success: function() {
            // Already on disk, drop it from the list. (The original
            // `delete new_config.pics.obj` deleted a property literally
            // named "obj", not the entry itself.)
            delete new_config.pics[index];
        },
        error: function() {
            delete new_config.pics[index]; // not on disk: drop it and download it
            downloadImage(val);
        }
    });
}
function downloadImage(val) {
    if (pics_at_the_time < 3) { // only 3 at a time, else wait for a download to finish
        pics_at_the_time++;
        var ft = new FileTransfer();
        ft.download(
            external_url_pics + val.id + '.' + val.tld,
            datadir + "/" + val.id + '.' + val.tld,
            function(entry) {
                if (debug_console) { console.log("download complete: " + entry.name); }
                // No Apple cloud backup for these pics; we can always re-download them.
                entry.setMetadata(function(metadata) { }, function(error) { console.log("Could not set meta data: " + val.id); }, { "com.apple.MobileBackup": 1 });
                pics_at_the_time--;
                fetch_images();
            },
            function(error) {
                if (debug_console) { console.log("download error target " + error.target); }
                pics_at_the_time--;
                fetch_images();
            });
    }
}
As you can probably tell, the code is not very sophisticated and it does not really solve the existence check. Although this works, it is far from perfect, since it re-loops through the whole list every time a download finishes; that seemed like a good idea at first, but now I'm having second thoughts.
Any help is obviously appreciated
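For anyone tackling the same problem, here is a minimal sketch of a download queue that avoids re-looping the JSON on every completed transfer. It assumes the same datadir, external_url_pics, pics array shape and PhoneGap/Cordova FileTransfer API as the code above; treat it as a starting point under those assumptions, not a drop-in fix.

var MAX_PARALLEL = 3; // PhoneGap handles about three transfers at a time well
var queue = [];
var active = 0;
var remaining = 0;

function download_all(pics, onAllDone) {
    queue = pics.slice(); // work on a copy so the config stays untouched
    remaining = queue.length;
    if (remaining === 0) { onAllDone(); return; }

    function startOne(val) {
        active++;
        var ft = new FileTransfer();
        ft.download(
            external_url_pics + val.id + '.' + val.tld,
            datadir + '/' + val.id + '.' + val.tld,
            function(entry) {
                // Keep these pics out of iCloud backup, as in the code above.
                entry.setMetadata(function() { }, function() { }, { "com.apple.MobileBackup": 1 });
                done();
            },
            function(error) {
                console.log("download error for " + val.id);
                done();
            }
        );
    }

    function done() {
        active--;
        remaining--;
        if (remaining === 0) { onAllDone(); } else { fill(); }
    }

    function fill() {
        while (active < MAX_PARALLEL && queue.length > 0) {
            startOne(queue.shift());
        }
    }

    fill(); // kick off the first three downloads
}

Calling download_all(new_config.pics, function() { /* all images present */ }) then runs the final callback exactly once, after the last transfer finishes or fails.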

Related

How to efficiently stream a real-time chart from a local data file

Complete noob here, picking up Node.js over the last few days, and it looks like I've gotten myself into big trouble. I currently have a working Node.js + Express server instance running on a Raspberry Pi, acting as a web interface for a local data acquisition script ("the DAQ"). When executed, the script writes data to a local file on the Pi in .csv format, in real time, once per second.
My Node app is a simple web interface that starts the data acquisition script on click, plots previously acquired data logs, and visualizes the data currently being collected in real time. Plotting old logs was simple, and I wrote a JS function (using Plotly + d3) to read a local CSV file via an AJAX call and plot it, using this script as a starting point but reading the logs served by Express rather than an external file.
When I went to translate this into a real-time plot, I started out using the setInterval() method to update the graph periodically, based on other examples. After dealing with a few unwanted recursion issues, and adjusting the interval to a more reasonable setting, I eliminated the memory/traffic issues which were crashing the browser after a minute or two, and things are mostly stable.
However, I need help with two things primarily:
1. Improving the efficiency of my first-attempt approach: the acquisition script absolutely needs to write to file every second, but considering that a typical run might last one to two weeks, the file requested on every interval loop will quickly balloon in size. I'm completely new to Node/Express, so I'm sure there's a much better way of doing the real-time rendering aspect of this; that's the real issue here. Any pointers toward a better way of doing this would be massively helpful! (One idea is sketched after the MWE below.)
2. Right now, the killDAQ() call issued by the "Stop" button kills the underlying Python process that writes the data to disk. Is there a way to hook into that same button click to also terminate the setInterval() loop updating the graph? There's no need for the graph to keep updating after the data acquisition has stopped, so having a single click do double duty would be ideal. I think setting up a listener or a req/res approach would be an option, but pointers in the right direction would be massively helpful.
(Edit: I solved #2, using global window. variables. It's a hack, but it seems to work:
window.refreshIntervalId = setInterval(foo);
...
clearInterval(window.refreshIntervalId);
)
Thanks so much for the help!
MWE:
html (using Pug as a template engine):
doctype html
html
  body.default
    .container-fluid
      .row
        .col-md-5
          .row.text-center
            .col-md-6
              button#start_button(type="button", onclick="makeCallToDAQ()") Start Acquisition
            .col-md-6
              button#stop_button(type="button", onclick="killDAQ()") Stop Acquisition
        .col-md-7
          #myDAQDiv(style='width: 980px; height: 500px;')
javascript (start/stop acquisition):
function makeCallToDAQ() {
    fetch('/start_daq', {
        // call to app to start the acquisition script
    })
    .then(function(response) {
        // Note: the original .then(console.log(dateTime)) ran the log
        // immediately and passed undefined to .then(); the logging belongs
        // inside the handler instead.
        console.log(dateTime);
        console.log(response);
        setInterval(function() { callPlotly(dateTime.concat('.csv')); }, 5000);
    });
}
function killDAQ() {
    fetch('/stop_daq')
        // kills the process
        .then(function(response) {
            // Use the response sent here
            alert('DAQ has stopped!');
        });
}
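As a side note on #2: instead of stashing the handle on window (the hack from the edit above), the interval id can live in ordinary script scope and be cleared inside killDAQ() itself. A minimal sketch, assuming the same /start_daq and /stop_daq endpoints as above:

var refreshIntervalId = null; // makeCallToDAQ() would assign: refreshIntervalId = setInterval(...)

function killDAQ() {
    fetch('/stop_daq')
        .then(function(response) {
            if (refreshIntervalId !== null) {
                clearInterval(refreshIntervalId); // stop polling once the DAQ is dead
                refreshIntervalId = null;
            }
            alert('DAQ has stopped!');
        });
}

This keeps the stop logic in one place: a single click kills the Python process and silences the polling loop.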
javascript (call to Plotly for plotting):
function callPlotly(filename) {
    var csv_filename = filename;
    console.log(csv_filename);

    function makeplot(csv_filename) {
        // Read data via AJAX call and grab header names
        var headerNames = [];
        d3.csv(csv_filename, function(error, data) {
            headerNames = d3.keys(data[0]);
            processData(data, headerNames);
        });
    }

    function processData(allRows, headerNames) {
        // Plot data from relevant columns (the x/y extraction from allRows
        // is elided in this MWE)
        var traces = [{
            x: x,
            y: y
        }];
        Plotly.newPlot('myDAQDiv', traces, plotting_options);
    }

    makeplot(filename);
}
node.js (the actual Node app):
// Start the DAQ
app.use(express.json());
var isDaqRunning = true;
var pythonPID = 0;
const { spawn } = require('child_process')
var process;
app.post('/start_daq', function(req, res) {
isDaqRunning = true;
// Call the python script here.
const process = spawn('python', ['../private/BIC_script.py', arg1, arg2])
pythonPID = process.pid;
process.stdout.on('data', (myData) => {
res.send("Done!")
})
process.stderr.on('data', (myErr) => {
// If anything gets written to stderr, it'll be in the myErr variable
})
res.status(200).send(); //.json(result);
})
// Stop the DAQ
app.get('/stop_daq', function(req, res) {
isDaqRunning = false;
process.on('close', (code, signal) => {
console.log(
`child process terminated due to receipt of signal ${signal}`);
});
// Send SIGTERM to process
process.kill('SIGTERM');
res.status(200).send();
})
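On #1, one common pattern is to stop re-requesting the whole CSV and instead have the server hand back only the bytes appended since the client's last poll. A rough sketch under stated assumptions: the /tail_csv route, the offset query parameter, and the log path are made up for illustration, not part of the app above.

const fs = require('fs');

// Respond with everything appended to the log since byte `offset`.
app.get('/tail_csv', function(req, res) {
    const file = './logs/current.csv'; // hypothetical log location
    const offset = parseInt(req.query.offset, 10) || 0;
    fs.stat(file, function(err, stats) {
        if (err) { return res.status(500).send(err.message); }
        if (stats.size <= offset) {
            return res.json({ offset: offset, rows: '' }); // nothing new yet
        }
        var chunk = '';
        fs.createReadStream(file, { start: offset })
            .on('data', function(d) { chunk += d; })
            .on('end', function() {
                res.json({ offset: stats.size, rows: chunk });
            });
    });
});

The client keeps the returned offset, sends it back on the next poll, parses only the new rows, and appends them with Plotly.extendTraces() instead of rebuilding the whole chart with Plotly.newPlot(); the request size then stays constant no matter how long a run lasts.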

PHP - Update Displayed Info Upon Changing PDF Canvas

I've been running into an issue while trying to update a script for a client. I'm still somewhat new to the industry, so I wanted to make sure I'm not missing anything.
I am attempting to edit a web page which displays a number of PDFs in canvas views. The issue arises when a user tries to switch which PDF they're viewing. The function that retrieves the PDFs has a setTimeout in place to get the content to reload every minute, as the PDFs themselves are updated with new data every 5 minutes and the users need to be able to monitor the data as it comes in. However, when switching from one PDF canvas to the next, I found that the PDF doesn't update. The only PDF that reloads and updates is the one whose canvas has been open. This is what I'm trying to change. I want the PDF display to update not only every minute, but also whenever a user switches which PDF they're viewing.
What this told me is that I need to add something that loads the PDFs again when switching canvas views. First, I tried taking the function that updates the PDFs and calling it whenever the user switches the canvas view. This seemed to work, but the issue changed: the page began reloading more and more often the more times you changed canvas views, I assume because the setTimeout within the function had by then been scheduled more than once.
My second attempt was to make a new function that was identical but without the setTimeout, and call that instead. I must have done something wrong, however, because after calling that function, none of the PDFs loaded at all.
I'll paste the code for the functions I need to work with below. Any help is appreciated, especially if I'm just missing something obvious. If you need any other details, let me know. Thank you to anybody who offers help.
function getReport(reportName, frameId) {
    var params = {
        report: reportName
    };
    $.post('/getReport.php', params, function(data) {
        if (data == "") {
            window.location.href = "/login.php";
            return;
        }
        if (useBrowserViewer) {
            $("#pdfFrame" + frameId).attr("data", "data:application/pdf;base64," + data);
        } else {
            displayPdf(data, document.getElementById("canvas" + frameId), 1);
            if (frameId == "4") {
                displayPdf(data, document.getElementById("canvas4Page2"), 2);
            }
        }
        setTimeout(function() {
            getReport(reportName, frameId);
        }, 60 * 1000);
    }).fail(function() {
        console.log("Report not returned.");
        setTimeout(function() {
            getReport(reportName, frameId);
        }, 60 * 1000);
    });
}
function showReport(frameIdNum) {
    if (useBrowserViewer) {
        $(".pdfFrame").hide();
        $("#pdfFrame" + frameIdNum).show();
    } else {
        $(".pdfCanvas").hide();
        document.getElementById("canvas" + frameIdNum).style.display = 'block';
        if (frameIdNum == "4") {
            document.getElementById("canvas4Page2").style.display = 'block';
        }
    }
    $(".link").removeClass("active");
    $("#link" + frameIdNum).addClass("active");
}
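One way this kind of timer stacking is often solved (a sketch, not tested against the real page): keep one pending timer per frame and reset it every time that report is fetched. Then showReport() can trigger an immediate getReport() without timeouts piling up. reportTimers is a new, assumed global:

var reportTimers = {}; // frameId -> pending timeout handle

function scheduleRefresh(reportName, frameId) {
    clearTimeout(reportTimers[frameId]); // drop any refresh already queued for this frame
    reportTimers[frameId] = setTimeout(function() {
        getReport(reportName, frameId);
    }, 60 * 1000);
}

Inside getReport(), both setTimeout calls would be replaced by scheduleRefresh(reportName, frameId), and showReport() could call getReport(...) for the newly visible frame; the clearTimeout guarantees at most one pending refresh per report no matter how often the user switches views.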

Streaming images with Nodejs on embedded device

I'm trying to stream images from a Node.js server to a client site. I've started with socket.io, but my implementation is fairly memory-intensive (and possibly leaking, as I'm not fluent in JavaScript). I'm just prototyping at this point with 10 sample images:
Server Side
Responds to a socket.io trigger with the following function, which "streams" 10 images to the client at roughly 100 ms intervals.
socket.on('img_trigger', function(data) {
    var img_num = 0;
    var timeoutHandle = null;

    function startTimeout() {
        stopTimeout();
        if (img_num < 10) {
            timeoutHandle = setTimeout(updateStream, 100);
        }
    }

    function stopTimeout() {
        clearTimeout(timeoutHandle);
    }

    function updateStream() {
        var file = './sampleframes/sample-' + img_num + '.png';
        fs.readFile(file, function(err, file_buff) {
            if (err !== null) {
                console.log('readFile error: ' + err);
            } else {
                socket.emit('img_stream', { buffer: file_buff });
            }
            file_buff = null;
            ++img_num;
            // Schedule the next frame only after this one has been read and
            // sent; calling startTimeout() outside this callback raced the
            // asynchronous readFile and could re-send the same image.
            startTimeout();
        });
    }

    // kicks off first image
    startTimeout();
});
Client Side
Capture the raw buffer data and generate a PNG with an <img> element.
socket.on('img_stream', function(data) {
    var img_data = arrayBufferToBase64(data.buffer);
    var panel = $('#frame-panel');
    $('#frame').html('<img src="data:image/png;base64,' +
        img_data + '" width="' + panel.width() + '" height="' +
        panel.height() + '" />');
});
If I trigger the server once, it works, but not great: I notice memory usage climb significantly, and it crashes after several triggers. Can I make this code more efficient, or should I try a new approach?
I've looked into Node's file streams, socket.io-stream, and even Binary.js (though I hesitate to require our clients to have overly modern browsers), and they look promising, but I don't quite know which would be best for my use case. Any help or guidance would be greatly appreciated.
The web interface I'm developing is for an FPGA (Zynq-7000) based camera running PetaLinux with Node.js cross-compiled for the ARM processor, so I don't have a lot of server-side resources to work with. As such, I'd like to have the client-side do as much of the processing as possible. Eventually, streaming video would be incredible, but I'd be satisfied with reading and displaying successive frames at a reasonable rate.
This may be due to a memory leak within the socket.io library (see here for a description, and here for a proposed fix).
To fix this, download and use the latest version of socket.io from here.
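Independently of the library fix, the base64 detour on the client doubles every frame in memory and forces the browser to re-parse an <img> tag per frame. A leaner variant worth trying (a sketch, assuming the same img_stream event, a persistent <img id="frame-img"> element instead of the rewritten #frame div, and a socket.io version that delivers binary payloads as an ArrayBuffer):

var currentUrl = null;

socket.on('img_stream', function(data) {
    var blob = new Blob([data.buffer], { type: 'image/png' });
    var url = URL.createObjectURL(blob);
    var img = document.getElementById('frame-img');
    img.onload = function() {
        // Release the previous frame's object URL so memory can't pile up.
        if (currentUrl) { URL.revokeObjectURL(currentUrl); }
        currentUrl = url;
    };
    img.src = url;
});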

Facebook OG Meta with Angular and Node

I am trying to get my Angular / Node application to render dynamic Open Graph meta content.
I have been trying to follow this tutorial http://www.codewarmer.com/posts/1394433236-configuring-angularjs-nodejs-for-search-bots#!
I am having some problems getting phantom to work with Node; my issue seems similar to this one: Error message when using PhantomJS, breaks at random intervals,
except that my error does not happen at random intervals; it happens every time.
EDIT: Here is my code
In my server.js I require a module I created based on the above tutorial, called PhantomHandler.js, and it is loaded like so.
var crawler = require('./modules/PhantomHandler');
This is what PhantomHandler.js looks like:
var phantom = require('phantom');
var models = require('../models');
var mongoose = require('mongoose'); // `var` added: these were implicit globals
var Snapshot = models.Snapshot;
var url = require('url');
var baseUrl = 'my url';

function crawlSite(idx, arr, page, callback) {
    crawlUrl(arr[idx], page, function(data) {
        data.links.forEach(function(link) {
            if (arr.indexOf(link) < 0)
                arr.push(link);
        });
        Snapshot.upsert(data);
        if (++idx === arr.length)
            callback();
        else
            crawlSite(idx, arr, page, callback);
    });
}
function startPhantom(cb) {
    phantom.create(function(ph) {
        phInstance = ph;
        ph.createPage(function(page) {
            phPage = page;
            cb(ph, page);
        });
    });
}

function crawlUrl(path, page, cb) {
    var uri = url.resolve(baseUrl, path);
    page.open(uri, function(status) {
        var evaluateCb = function(result) {
            result.path = path;
            cb(result);
        };
        // A 2000 ms timeout seems to be enough for the majority of AJAX apps.
        setTimeout(function() {
            if (status == 'success')
                page.evaluate(function() {
                    var linkTags = document.querySelectorAll('a:not([rel="nofollow"])');
                    var links = [];
                    for (var i = 0, ln; ln = linkTags[i]; i++)
                        links.push(ln.getAttribute('href'));
                    return {
                        'links': links,
                        'html': document.documentElement.outerHTML
                    };
                }, evaluateCb);
        }, 2000);
    });
}
exports.crawlAll = function(callback) {
    startPhantom(function(ph, page) {
        crawlSite(0, ['/'], page, function() {
            ph.exit();
            callback();
        });
    });
};

exports.crawlOne = function(path, callback) {
    startPhantom(function(ph, page) {
        crawlUrl(path, page, function(data) {
            Snapshot.upsert(data);
            ph.exit();
            callback();
        });
    });
};
When I run this code, the exact error is:
phantom stderr: 'phantomjs' is not recognized as an internal or external command,
operable program or batch file.
assert.js:92
throw new assert.AssertionError({
^
AssertionError: abnormal phantomjs exit code: 1
at Console.assert (console.js:102:23)
at ChildProcess.<anonymous> (path to node modules\node_modules\phantom\phantom.js:150:28)
at ChildProcess.emit (events.js:98:17)
at Process.ChildProcess._handle.onexit (child_process.js:809:12)
My question: is this the best and easiest way to get Angular to play nicely with Facebook OG? If it is, can anyone confirm that they have managed to get this technique to work without phantom throwing an assertion error as described above?
It seems like this should be a relatively common job, and I am surprised that I haven't found a nice, straightforward tutorial on how to get this to work; unless I just haven't looked properly :s
Thanks
Okay,
because my question was essentially "What is the best way to get Angular and Node to respond to Facebook with the correct page meta?", I am now in a position to post my answer.
As stated above, I think the phantom.js method described above requires phantom to be installed and run as a separate process on the Node.js server. (Can anyone confirm or deny this?)
For my situation I just wanted a user to be able to post a link from the site onto Facebook and for Facebook to return a nice looking link using open graph meta.
With that in mind I decided to skip the phantom.js step from the solution in the tutorial above. Instead I rolled some code which essentially saves an HTML snippet into the DB when a user hits a page. The HTML snippet just contains the meta tags I need for Facebook. I then use the last part of the above tutorial to direct Facebook bots to my saved HTML snippet.
It seems to work pretty well.
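For reference, the bot-detection part of that approach can be as small as an Express middleware like the following sketch. The Snapshot lookup mirrors the model used in PhantomHandler.js above, but the route logic here is an assumption, not the poster's actual code; Facebook's crawler does identify itself with a "facebookexternalhit" user agent.

app.use(function(req, res, next) {
    var ua = req.headers['user-agent'] || '';
    if (ua.indexOf('facebookexternalhit') === -1) {
        return next(); // normal visitors get the Angular app
    }
    // Serve the stored snippet (og:title, og:image, ...) to the crawler.
    Snapshot.findOne({ path: req.path }, function(err, snap) {
        if (err || !snap) { return next(); }
        res.send(snap.html);
    });
});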

Error: The page has been destroyed and can no longer be used

I'm developing an add-on for the first time. It puts a little widget in the status bar that displays the number of unread Google Reader items. To accomplish this, the add-on process queries the Google Reader API every minute and passes the response to the widget. When I run cfx test I get this error:
Error: The page has been destroyed and can no longer be used.
I made sure to catch the widget's detach event and stop the refresh timer in response, but I'm still seeing the error. What am I doing wrong? Here's the relevant code:
// main.js - Main entry point
const tabs = require('tabs');
const widgets = require('widget');
const data = require('self').data;
const timers = require("timers");
const Request = require("request").Request;

function refreshUnreadCount() {
    // Put in Google Reader API request
    Request({
        url: "https://www.google.com/reader/api/0/unread-count?output=json",
        onComplete: function(response) {
            // Ignore response if we encountered a 404 (e.g. user isn't logged in)
            // or a different HTTP error.
            // TODO: Can I make this work when third-party cookies are disabled?
            if (response.status == 200) {
                monitorWidget.postMessage(response.json);
            } else {
                monitorWidget.postMessage(null);
            }
        }
    }).get();
}
var monitorWidget = widgets.Widget({
    // Mandatory widget ID string
    id: "greader-monitor",
    // A required string description of the widget used for
    // accessibility, title bars, and error reporting.
    label: "GReader Monitor",
    contentURL: data.url("widget.html"),
    contentScriptFile: [data.url("jquery-1.7.2.min.js"), data.url("widget.js")],
    onClick: function() {
        // Open Google Reader when the widget is clicked.
        tabs.open("https://www.google.com/reader/view/");
    },
    onAttach: function(worker) {
        // If the widget's inner width changes, reflect that in the GUI
        worker.port.on("widthReported", function(newWidth) {
            worker.width = newWidth;
        });
        var refreshTimer = timers.setInterval(refreshUnreadCount, 60000);
        // If the monitor widget is destroyed, make sure the timer gets cancelled.
        worker.on("detach", function() {
            timers.clearInterval(refreshTimer);
        });
        refreshUnreadCount();
    }
});
// widget.js - Status bar widget script
// Every so often, we'll receive the updated item feed. It's our job
// to parse it.
self.on("message", function(json) {
    if (json == null) {
        $("span#counter").attr("class", "");
        $("span#counter").text("N/A");
    } else {
        var newTotal = 0;
        for (var item in json.unreadcounts) {
            newTotal += json.unreadcounts[item].count;
        }
        // Since the cumulative reading list count is a separate part of the
        // unread count info, we have to divide the total by 2.
        newTotal /= 2;
        $("span#counter").text(newTotal);
        // Update style
        if (newTotal > 0)
            $("span#counter").attr("class", "newitems");
        else
            $("span#counter").attr("class", "");
    }
    // Reports the current width of the widget
    self.port.emit("widthReported", $("div#widget").width());
});
Edit: I've uploaded the project in its entirety to this GitHub repository.
I think if you use the method monitorWidget.port.emit("widthReported", response.json); you can fire the event. It's the second way to communicate between the content script and the add-on script.
Reference for the port communication
Reference for the communication with postMessage
I guess this message comes up when you call monitorWidget.postMessage() in refreshUnreadCount(). The obvious cause: while you make sure to call refreshUnreadCount() only when the worker is still active, this function makes an asynchronous request which might take a while, so by the time the request completes, the worker might already be destroyed.
One solution would be to pass the worker as a parameter to refreshUnreadCount(). It could then add its own detach listener (removing it when the request is done) and ignore the response if the worker was detached while the request was in flight.
function refreshUnreadCount(worker) {
    var detached = false;
    function onDetach() {
        detached = true;
    }
    worker.on("detach", onDetach);
    Request({
        ...
        onComplete: function(response) {
            worker.removeListener("detach", onDetach);
            if (detached)
                return; // Nothing to update, the worker is gone
            ...
        }
    }).get();
}
Then again, using try..catch to detect this situation and suppress the error would probably be simpler - but not exactly a clean solution.
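For completeness, the try..catch variant would look roughly like this (a sketch; the wrapper name safePost is made up):

function safePost(json) {
    try {
        monitorWidget.postMessage(json);
    } catch (e) {
        // The widget page was destroyed mid-request; nothing left to update.
    }
}

refreshUnreadCount() would then call safePost(response.json) and safePost(null) instead of using monitorWidget.postMessage() directly.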
I've just seen your message on IRC, thanks for reporting your issue.
You are facing an internal bug in the SDK. I've opened a bug about that here.
You should definitely keep the first version of your code, where you send messages to the widget, i.e. widget.postMessage (instead of worker.postMessage). Then we will have to fix the bug I linked to in order to just make your code work!!
I also suggest you move the setInterval to the top level, otherwise you will fire multiple intervals and requests, one per window; the attach event is fired for each new Firefox window. (A sketch follows.)
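A sketch of that last suggestion, assuming the same main.js modules as above: create the interval once at module load instead of inside onAttach, so opening a second Firefox window doesn't start a second timer.

// At the top level of main.js, after monitorWidget is created:
timers.setInterval(refreshUnreadCount, 60000); // one global refresh loop for all windows
refreshUnreadCount(); // prime the counter immediately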
