In the LightSwitch HTML client we have created a screen to display the work in progress for a particular business process.
This is to be displayed on a big screen, much like when you go to Argos to collect your order. Here's a screenshot...
We are using some JavaScript to refresh the page every 30 seconds:
setTimeout(function () {
    window.location.reload(1);
}, 30000);
However, there are two issues with this:
1. The 'maximum number of results' text input by the user is lost on refresh.
2. It doesn't look nice to refresh the whole page.
Is it therefore possible to just trigger each query to reload instead of the entire page?
(The data is provided to LightSwitch by a WCF RIA Service)
In the JavaScript, use screen.MyList.load(). It will reload the list asynchronously.
Note that IntelliSense does not always suggest list names on the screen object but will recognize them if you type the name.
Combined with the setTimeout() method and the created screen event, it should work.
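For example, a minimal sketch of that combination; the screen name BrowseWorkItems and list name WorkItems are placeholders for whatever your screen and list are actually called:

myapp.BrowseWorkItems.created = function (screen) {
    // Hypothetical screen/list names - substitute your own.
    // Rescheduling inside the callback keeps only one pending refresh at a time.
    function refreshList() {
        screen.WorkItems.load();
        setTimeout(refreshList, 30000);
    }
    setTimeout(refreshList, 30000);
};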
I had the same issue and finally found the solution. I added this in my created event:
myapp.BrowseMembers.created = function (screen) {
    setInterval(function () { screen.Members.load(true); }, 1000);
};
It works, but the screen flickers when the data reloads.
Note that setTimeout only fires once, whereas setInterval fires repeatedly (here, every 1000 ms).
I had the same problem with LightSwitch in VS2012 Update 3.
In my case just invalidating the cache is enough, so I can always work with a fresh entity.
This code runs once when the screen is created; it invalidates the loaded entities every 30 seconds and forces a refetch only when the data is actually needed.
myapp.aaHome.created = function (screen) {
    setInterval(function () {
        screen.details.dataWorkspace.ApplicationData.Currencies._loadedEntities = {};
    }, 30000);
};
I have a Django app with a view that pulls data from BigQuery before rendering it to the frontend. Pulling the data takes quite some time, and the frontend only loads once the view has finished. Is there a way I could show a loading page while the Django view is pulling the data, and then make it disappear once everything is ready?
I tried using the code below:
function onReady(callback) {
    var intervalId = window.setInterval(function() {
        if (document.getElementsByTagName('body')[0] !== undefined) {
            window.clearInterval(intervalId);
            callback.call(this);
        }
    }, 1000);
}

function setVisible(selector, visible) {
    document.querySelector(selector).style.display = visible ? 'block' : 'none';
}

onReady(function() {
    setVisible('#app', true);
    setVisible('.loading', false);
});
But it seems the browser still waits for the data to load; the loading page then shows for only 1-2 seconds before the frontend appears.
I am not good at JavaScript, but I faced the same problem some time ago and used Waypoints to build infinite scroll. That plugin has two very useful events, onBeforePageLoad and onAfterPageLoad: in onBeforePageLoad you can show a spinner (or whatever you want), and in onAfterPageLoad hide it again (see the sketch below).
Full documentation is available here: Full documentation for Waypoints
Best Tutorial: Best tutorial ever
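A rough sketch of that approach, assuming the jQuery Waypoints Infinite Scroll shortcut is loaded and that .infinite-container and .loading are your own (hypothetical) elements:

// Requires jquery.js, jquery.waypoints.js and the infinite scroll shortcut.
var infinite = new Waypoint.Infinite({
    element: $('.infinite-container')[0],  // container that receives the new items
    onBeforePageLoad: function () {
        $('.loading').show();              // show a spinner before the next page is requested
    },
    onAfterPageLoad: function ($items) {
        $('.loading').hide();              // hide it once the new items have been appended
    }
});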
I have an internet radio station and I need a script that will display a picture of the current song in a particular div with an id. The image is automatically uploaded via FTP to the server each time the song changes.
HTML:
<div id="auto"></div>
JS:
$(document).ready(function() {
    $('#auto').html('<img src="artwork.png"></img>');
    refresh();
});

function refresh() {
    setTimeout(function() {
        $('#auto').html('<img src="artwork.png"></img>');
        refresh();
    }, 1000);
}
I tried this, but all I get is the image loaded once; when the artwork changes, I have to manually refresh the whole page again.
I'll point out multiple things here.
I think your code is fine if you intend to use recursive setTimeout calls instead of a single setInterval to repeat the action.
File Caching
Your problem is probably the browser's cache, since you are using the same image name and directory all the time. Browsers compare the file name and directory to decide whether to load the image from cache or request it from the server. There are a couple of tricks you can use to force a reload from the server in this particular case:
1. Use different file names/directories for the songs loaded dynamically.
2. Use a randomized GET query string (e.g. image.png?v=<current timestamp>).
Your method for switching
You are replacing the file over FTP; I wouldn't recommend that. It may be better to upload all your albums and thumbnails to the server and switch between them dynamically. That is more efficient, less error-prone, and it helps you achieve method #1 from the previous section.
Loading with constant refresh
I would like to highlight that if you are using an event-based server such as Node.js (or nginx in front of one), you can achieve the same functionality with much less traffic. You don't need a refresh method: the server can push a message to the browser telling it to load a specific resource at that moment, so no constant polling is required (see the sketch below).
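For illustration, a minimal client-side sketch using Server-Sent Events; the /now-playing endpoint and the artwork field are assumptions about what such a push server might expose:

// The server pushes an event whenever the song changes; no polling needed.
var source = new EventSource('/now-playing');
source.onmessage = function (event) {
    var info = JSON.parse(event.data);  // e.g. { "artwork": "artwork.png?v=1700000000" }
    $('#auto').html('<img src="' + info.artwork + '">');
};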
Consider your options; I have tried to be as comprehensive as I could.
At the top level, the browser caches the image based on its absolute URL. You can add an extra query string to the URL to trick the browser into treating it as a new image. In this case, the new URL of artwork.png would be artwork.png?timestamp=123.
Check this out for the refresh():
function refresh() {
    setTimeout(function() {
        var timestamp = new Date().getTime();
        // reassign the url to be like artwork.png?timestamp=456784512 based on the timestamp
        $('#auto').html('<img src="artwork.png?timestamp=' + timestamp + '"></img>');
        refresh();
    }, 1000);
}
Alternatively, you may assign an id attribute to the image and change its src URL:
html
<img id="myArtworkId" src="artwork.png"/>
js in the refresh method
$('#myArtworkId').attr('src', 'artwork.png?timestamp=' + new Date().getTime());
You can use window.setInterval() to call a method every x seconds and clearInterval() to stop calling that method. View this answer for more information on this.
// Array containing image sources for the demo
var $srcs = [
    'https://www.petmd.com/sites/default/files/Acute-Dog-Diarrhea-47066074.jpg',
    'https://www.catster.com/wp-content/uploads/2018/05/Sad-cat-black-and-white-looking-out-the-window.jpg',
    'https://img.buzzfeed.com/buzzfeed-static/static/2017-05/17/13/asset/buzzfeed-prod-fastlane-03/sub-buzz-25320-1495040572-8.jpg?downsize=700:*&output-format=auto&output-quality=auto'
];
var $i = 0;
var intervalId;

$(document).ready(function() {
    $('#auto').html('<img src="https://images.pexels.com/photos/617278/pexels-photo-617278.jpeg?auto=compress&cs=tinysrgb&dpr=1&w=500"></img>');
    // call refresh() every 2 seconds
    intervalId = window.setInterval(function() {
        refresh();
    }, 2000);
    // To stop calling refresh(), uncomment the line below
    // clearInterval(intervalId);
});

function refresh() {
    $('#auto').html('<img src="' + $srcs[$i++] + '"></img>');
    // Wrap around to avoid an out-of-bounds index
    if ($srcs.length === $i) {
        $i = 0;
    }
}
<script src="https://ajax.googleapis.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
<div id="auto"></div>
What I want:
My goal is to check whether new content has been added to a page (that I do not own). I was thinking of making a script that saves the last content added in a cookie and refreshes the page every minute: if the cookie doesn't match the last content added, that means there is new content and I would receive a notification.
Let's try with pseudocode:
main_file:
    include: functions.js;

    cookie last_content_added = get_first_paragraph();

    // Refresh script
    do (every_minute) {
        page_reload();
    }

    when.page.reload.complete {
        run script_check_content
    }

functions.js:
    script_check_content {
        var content_check = get_first_paragraph();
        if (content_check == cookie[last_content_added]) {
            // do nothing
        } else {
            // new content was added
            play.notification.mp3
            cookie[last_content_added] = get_first_paragraph();
        }
    }
Is there an easier solution to what I'm looking for?
I'm new to Chrome extensions, so if you could separate the code into different files as in a real extension, I would appreciate it very much.
I recommend using chrome.tabs.query to get all tabs that have the specified properties (or all tabs, if no properties are specified), and chrome.tabs.executeScript to inject JavaScript code into the page that calls window.location.reload() to refresh it.
Here's a sample that gets the current tab and reloads it using the chrome.tabs methods:
chrome.tabs.query({ active: true, currentWindow: true }, function (arrayOfTabs) {
    var code = 'window.location.reload();';
    chrome.tabs.executeScript(arrayOfTabs[0].id, { code: code });
});
Also, add a webNavigation 'onCompleted' listener to be notified when the page has completely loaded and initialized:
chrome.webNavigation.onCompleted.addListener(callback)
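For example, a minimal sketch wiring that listener to the content check; it assumes functions.js is your checking script and that the manifest declares the webNavigation permission:

// Hypothetical sketch: once the reloaded page has finished loading,
// inject the script that compares the first paragraph against the stored value.
chrome.webNavigation.onCompleted.addListener(function (details) {
    if (details.frameId === 0) {  // top-level frame only
        chrome.tabs.executeScript(details.tabId, { file: 'functions.js' });
    }
});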
Take a look at MutationObserver; it provides a way to react to changes in the DOM. You can provide a callback that runs whenever the DOM changes, so you don't need a timer at all.
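A minimal sketch of that idea for a content script; the notification.mp3 file and the 'first paragraph' selector are assumptions carried over from the pseudocode above:

// Watch the page for added nodes and compare the first paragraph to the last value seen.
var lastSeen = (document.querySelector('p') || {}).textContent;

var observer = new MutationObserver(function () {
    var current = (document.querySelector('p') || {}).textContent;
    if (current !== lastSeen) {
        lastSeen = current;
        // Hypothetical bundled sound; it must be listed under web_accessible_resources.
        new Audio(chrome.runtime.getURL('notification.mp3')).play();
    }
});

observer.observe(document.body, { childList: true, subtree: true });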
I'm using BreezeJS and storing/restoring data in local storage. That's working great. The problem occurs when the user opens multiple tabs. Changes in each tab clobber each other. Changes should be synchronised between tabs.
NB: BreezeJS will take care of merging changes, I just need to deal with race conditions between tabs.
var stashName = 'stash_everything';

window.setInterval(function () {
    var exportData = manager.exportEntities();
    window.localStorage.setItem(stashName, exportData);
}, 5000);

// 'storage' fires in the other tabs when one tab writes to localStorage
window.addEventListener('storage', function (event) {
    if (event.key == stashName) {
        var importData = window.localStorage.getItem(stashName);
        manager.importEntities(importData);
    }
});
I've tried listening to the 'storage' event, but I haven't been able to get it working successfully. I either still clobber changes, or get into an infinite loop.
The crux of the issue is that I'm just saving on a timer; if I only saved after user interaction, then I'd avoid (most) race conditions. There's no 'has the user changed anything since last time I asked you' call in breeze, though, as far as I can tell.
Does anyone have advice on how to approach this?
Hmm, this doesn't seem impervious to problems, for several reasons; the main one is that you still won't prevent concurrent saves from each tab with different data sets. That being said, if you are comfortable with the two caches being out of sync, just use a unique identifier per tab.
Somewhere in your app, on load:
var stashName = 'stash-everything-' + new Date().getTime();

window.setInterval(function () {
    var exportData = manager.exportEntities();
    window.localStorage.setItem(stashName, exportData);
}, 5000);
Now each tab would have a unique set of data to work with.
I created a Magic: The Gathering site for my friends and me to use. On this site, we upload our decks of cards, and on the page where you can view all the cards in a deck, each card name is a link to the card on http://gatherer.wizards.com/. For ease of use, though, I made it so that when you hover over any of the card names, the card image gets Ajax'd in from Gatherer, letting you see the card without having to click the link.
The question is: should I load all of the ~40 or so card images all at once when the page loads, or should I continuously load the images as they are hovered over, or is there some other way I should be doing it?
As it stands, I load each card as it is hovered over. My concern is that, as people mouse up and down the list, that is a LOT of requests to Gatherer. It would probably save requests to load them all up at the start, but I'm not sure if Gatherer would be upset with me for a sudden flurry of requests every time someone loads one of the decks on my site.
A solution I thought of was to load cards as they are hovered over, but save the image in a hidden container and just reload it when they mouse over it AGAIN. Thus if they load the page and don't look at anything, no needless requests were sent, but if they stay on the page for 30 minutes looking at every card over and over again, we don't inundate Gatherer with requests.
I just don't know if the method I'm using is wasteful - from a bandwidth standpoint for me or for gatherer, or from any other standpoint that I'm not familiar with. Are there any golden rules of external Ajax that I should know, for instance?
The method I'm currently using, which I assume is probably the worst implementation possible, but it was a proof of concept:
$(document).ready(function(){
    var container = $('#cardImageHolder');

    $('.bumpin a').mouseenter(function(){
        doAjax($(this).attr('href'));
        return false;
    });

    function doAjax(url){
        // if it is an external URI
        if(url.match('^http')){
            // call YQL
            $.getJSON("http://query.yahooapis.com/v1/public/yql?"+
                "q=select%20*%20from%20html%20where%20url%3D%22"+
                encodeURIComponent(url)+
                "%22&format=xml'&callback=?",
                // this function gets the data from the successful
                // JSON-P call
                function(data){
                    // if there is data, filter it and render it out
                    if(data.results[0]){
                        var data = filterData(data.results[0]);
                        var src = $(data).find('.leftCol img').first().attr('src');
                        var fixedImageSrc = src.replace("../../", "http://gatherer.wizards.com/");
                        var image = $(data).find('.leftCol img').first().attr('src', fixedImageSrc);
                        container.html(image);
                    // otherwise tell the world that something went wrong
                    } else {
                        var errormsg = "<p>Error: can't load the page.</p>";
                        container.html(errormsg);
                    }
                }
            );
        // if it is not an external URI, use Ajax load()
        } else {
            $('#target').load(url);
        }
    }

    // filter out some nasties
    function filterData(data){
        data = data.replace(/<?\/body[^>]*>/g,'');
        data = data.replace(/[\r|\n]+/g,'');
        data = data.replace(/<--[\S\s]*?-->/g,'');
        data = data.replace(/<noscript[^>]*>[\S\s]*?<\/noscript>/g,'');
        data = data.replace(/<script[^>]*>[\S\s]*?<\/script>/g,'');
        data = data.replace(/<script.*\/>/,'');
        return data;
    }
});
No, there are no Golden Rules of Ajax. Loading 40 images up front would minimize load time upon hover, but would greatly increase how much bandwidth is used when the page is first loaded.
You will always have these types of balance questions. It's up to you to decide what is best, and tweak it based on empirical data.
"A solution I thought of was to load cards as they are hovered over,
but save the image in a hidden container and just reload it when they
mouse over it AGAIN. Thus if they load the page and don't look at
anything, no needless requests were sent, but if they stay on the page
for 30 minutes looking at every card over and over again, we don't
inundate Gatherer with requests."
This sounds reasonable.
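If you go that route, a minimal sketch of caching by URL on top of your existing doAjax function; the imageCache object and the reuse check are additions, not part of your current code:

var imageCache = {};  // card URL -> image element already fetched from Gatherer

$('.bumpin a').mouseenter(function () {
    var url = $(this).attr('href');
    if (imageCache[url]) {
        container.html(imageCache[url]);  // reuse the stored image, no new request
    } else {
        doAjax(url);  // fetch once; store the result in doAjax's success handler,
                      // e.g. imageCache[url] = image;
    }
    return false;
});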
If I were you, though, I would load every picture when the user first loads the page. Let the browser cache the images and you don't have to worry about it. Plus, this is likely the easiest method. Don't over complicate things when you don't have to :)
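If you prefer that route, a rough sketch of preloading; how you build the list of image URLs up front (e.g. from the deck data on your server) is left as an assumption:

// Preload every card image once so later hovers hit the browser cache.
var cardImageUrls = [/* image URLs for the ~40 cards in the deck */];

$(document).ready(function () {
    cardImageUrls.forEach(function (url) {
        new Image().src = url;  // fires the request; the browser caches the response
    });
});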