In my Electron app I have a webview. If I click a link (e.g. on Google) and then navigate back, the link does not turn purple like it would in a normal browser.
Can I enable this behaviour in Electron too? Or could I do it programmatically, if I stored the browsing history myself?
I guess this is somehow connected to the fact that history support in Electron is not yet the best?
Would this maybe be possible with the Muon fork of Electron?
As you can see from the comments at about line 16 of this file, the Electron guys/girls have created their own navigation solution. It would seem this doesn't handle a:visited properly.
Instead, we can easily create our own solution. Add the following function to all of your renderer scripts and ensure you put simuHistory(); at the start of every page:
function simuHistory() {
  // Load the visited-URL list from localStorage (or start fresh)
  var localstorageSimuHistory = localStorage.getItem('simuHistory');
  var simuHistory = localstorageSimuHistory ? JSON.parse(localstorageSimuHistory) : [];

  // Record the current URL if we haven't seen it before
  var windowHref = window.location.href;
  var found = false;
  for (let i = 0; i < simuHistory.length; i++) {
    if (simuHistory[i] == windowHref) {
      found = true;
      break;
    }
  }
  if (found === false) {
    simuHistory[simuHistory.length] = windowHref;
    localStorage.setItem('simuHistory', JSON.stringify(simuHistory));
  }

  // Mark every link on the page that points to a visited URL
  var elements = document.getElementsByTagName('a');
  for (let i = 0; i < elements.length; i++) {
    for (let h = 0; h < simuHistory.length; h++) {
      if (elements[i].href == simuHistory[h]) {
        elements[i].className += ' visited';
      }
    }
  }
}
And add the following CSS to your stylesheet:
.visited {
  color: purple;
}
Alternatively, if you don't want to include the CSS (or want it all self-contained), you could replace this line:
elements[i].className += ' visited';
...with this line:
elements[i].style.color = 'purple';
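One way to wire that up (assuming each page loads a shared renderer script) is to invoke the function once the DOM is ready, so the anchor tags already exist:
// Run simuHistory() on every page once the DOM is available
document.addEventListener('DOMContentLoaded', simuHistory);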
Scalability
The TL;DR is that unless you have more than around 25,000 fairly long unique URLs in your app, you don't need to worry about scalability.
The visited URLs are stored in localStorage, which is limited in size and therefore limits the number of URLs we can store. I believe in Electron this is around 5MB. I tested performance by adding 25,000 additional URLs to the simuHistory array using the following code:
for (var i = 0; i < 25000; i++) {
  simuHistory[simuHistory.length] = 'https://stackoverflow.com/questions/417142/what-is-the-maximum-length-of-a-url-in-different-browsers' + Math.floor((Math.random() * 1000000) + 1);
}
The URL was chosen because it was fairly long. The random number on the end actually affects nothing at all; I just included it to add some extra length. Then I timed running simuHistory() on my crappy slow laptop (with no SSD):
25,000 URLs == ~210ms
1,000 URLs == ~13ms
In my opinion that's plenty fast for nearly any Electron app, but it depends on your use case as to how many URLs you might end up with.
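If you want to reproduce the measurement, a rough sketch using nothing beyond standard APIs:
// Time a single simuHistory() pass
var t0 = performance.now();
simuHistory();
console.log('simuHistory() took ' + Math.round(performance.now() - t0) + ' ms');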
Related
I want to have a search engine which searches only my own site. I have some JavaScript currently, but it only searches words on that specific page. I need it to search the pages linked within my site if possible.
I cannot use the Google search engine as my site is on an internal intranet.
<SCRIPT language=JavaScript>
var NS4 = (document.layers);
var IE4 = (document.all);
var win = window;
var n = 0;

function findInPage(str) {
  var txt, i, found;
  if (str == "")
    return false;
  if (NS4) {
    if (!win.find(str))
      while (win.find(str, false, true))
        n++;
    else
      n++;
    if (n == 0)
      alert("Not found.");
  }
  if (IE4) {
    txt = win.document.body.createTextRange();
    for (i = 0; i <= n && (found = txt.findText(str)) != false; i++) {
      txt.moveStart("character", 1);
      txt.moveEnd("textedit");
    }
    if (found) {
      txt.moveStart("character", -1);
      txt.findText(str);
      txt.select();
      txt.scrollIntoView();
      n++;
    }
    else {
      if (n > 0) {
        n = 0;
        findInPage(str);
      }
      else
        alert("Sorry, we couldn't find it. Try again.");
    }
  }
  return false;
}
</SCRIPT>
(onsubmit="return findInPage(this.string.value);" in the button tag.)
It works great for searching that page, but I was hoping there was a way to search all pages on my site.
A few suggestions:
Unless you must, don't reinvent the wheel - there are open source libraries such as Tipue Search and others.
You can use jQuery/AJAX $.load() to dynamically load page content and search it, while still staying on the same page as far as your DOM and script are concerned (see the sketch after this list).
Node.js is also a good option, but would probably be overkill.
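To illustrate the second suggestion, here is a rough sketch; the page list, the searchSite name and the #results element are all hypothetical placeholders for your own markup:
// Hypothetical list of pages on the site to search through
var pages = ['about.html', 'products.html', 'contact.html'];

function searchSite(term) {
  $('#results').empty();
  pages.forEach(function (page) {
    // Fetch each page without leaving the current one, then search its text
    $.get(page, function (html) {
      var text = new DOMParser().parseFromString(html, 'text/html').body.textContent;
      if (text.toLowerCase().indexOf(term.toLowerCase()) !== -1) {
        $('#results').append('<li><a href="' + page + '">' + page + '</a></li>');
      }
    });
  });
}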
Hope this helps!
You could use search-index. It can run on the server and in the browser; there is an example of how to run it in the browser as well as an actual demo of it. You would have to write a crawler/spider that goes through your site. Lunr.js would also work well, I think.
If you had your site as JSON, the indexing would be a small task, or you could have a crawler running in the browser.
Disclaimer: I'm doing some work on search-index.
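For instance, with Lunr.js (assuming version 2.x) and a hypothetical pages.json dump of your site containing objects like {href, title, body}, the browser-side indexing could look like this:
// Build a client-side index from a JSON dump of the site (hypothetical file and structure)
$.getJSON('pages.json', function (pages) {
  var idx = lunr(function () {
    this.ref('href');
    this.field('title');
    this.field('body');
    pages.forEach(function (page) {
      this.add(page);
    }, this);
  });
  // idx.search() returns hits like [{ ref: 'some/page.html', score: ... }, ...]
  console.log(idx.search('intranet'));
});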
As you are on an intranet, and presumably all your pages are on the same server, it should be possible to make an XMLHttpRequest to each of your pages in turn, store each page in a variable and then search the stored page.
Possibly someone with more experience of XMLHttpRequest can say how efficient or effective this would be.
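Something along these lines, with a hand-maintained (hypothetical) list of page URLs, would be a starting point:
// Hypothetical list of pages on the same intranet server
var pages = ['index.html', 'about.html', 'contact.html'];

function searchAllPages(str) {
  pages.forEach(function (page) {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', page, true);
    xhr.onload = function () {
      // Search the raw markup of the fetched page
      if (xhr.responseText.indexOf(str) !== -1) {
        console.log('"' + str + '" found in ' + page);
      }
    };
    xhr.send();
  });
}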
I have gone through many Stack Overflow posts and other websites to find a solution to this problem, but none of the solutions I could find either fit my problem or just straight up didn't work.
I'm using JavaScript and jQuery (and some Java for background work) to build a page on a website. The page contains a table with data (the data is handled by Java) and I want that table to refresh every 10 seconds. That works.
But now I also want to highlight all the cells whose values have changed. For that, as you can see in my code snippet, I simply turn the background of those cells black. That works too.
My problem is that the colour only changes for a split second before changing back to the standard white. Using the console and playing around a bit, I was able to find out that the table is actually reloaded twice, which to me must mean the load command is executed twice.
window.setInterval(function () {
  if (frozen == false) {
    $('table').load(document.URL + ' table');
    var elem = document.getElementById("freezebtn"); // This is a button generated by Java code
    elem.style.background = getRandomColor();
    var tab = document.getElementsByTagName('table').item(1);
    var l = tab.rows.length;
    for (var i = 0; i < l; i++) {
      var tr = tab.rows[i];
      var cll = tr.cells[1];
      var check = tr.cells[0];
      if (check.innerText === "getAsk") {
        var valAsk = cll.innerText;
        var ask = Number(valAsk);
        if (ask != loadPreviousAsk()) {
          console.log("TELEFONMAST!");
          cll.style.backgroundColor = "#000000";
        }
      }
      //Do this for every other updateable cell
      ...
    }
    cachePreviousValues(someValues, thatINeed, To, BeCached);
  }
}, 10000); // refresh every 10 seconds
My question is: what is/could be causing the repeat of the load command (or why won't the cells stay black at all)?
Is there a better way of achieving what I'm trying to do?
Could my code sample be optimized in any other way?
$.fn.load() is asynchronous (by default). To handle any logic regarding the loaded data, use the complete callback:
$('table').load(document.URL + ' table', function () { /* your logic regarding the loaded data here */ });
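Applied to the snippet from the question, that means moving the cell-comparison work into the callback so it runs against the freshly inserted table rather than the old one (highlightChangedCells is a hypothetical wrapper around the existing for-loop):
window.setInterval(function () {
  if (frozen == false) {
    $('table').load(document.URL + ' table', function () {
      // This only runs after the new table HTML is in the DOM
      highlightChangedCells();
      cachePreviousValues(someValues, thatINeed, To, BeCached);
    });
  }
}, 10000);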
I'm trying to achieve the following in a web page:
Users can open multiple tabs/windows of the page.
Every few seconds, I need exactly one of those tabs/windows to execute a specific section of code (critical region).
I don't care which one of the tabs/windows executes the code, i.e. no need to worry about the fairness or starvation properties of the solution.
Since the user opened the tabs/windows him/herself, the different instances of the page have no knowledge about or direct references to each other (i.e. no window.parent, etc.)
I don't want to require Flash or Silverlight or other plugins and everything needs to run client-side, so the ways in which the tabs/windows can communicate are very limited (LocalStorage is the only one I found so far, but there might be others).
Any of the tabs/windows can crash or be closed or refreshed at any time and more tabs/windows can be opened at any time also, and the remaining windows must "react" such that I still get exactly one execution of the critical region every few seconds.
This needs to run reliably in as many browsers as possible, including mobile (a caniuse rating of over 90%).
My first attempt at a solution was to use a simple mutual exclusion algorithm that uses LocalStorage as the shared memory. For various reasons, I chose the mutual exclusion algorithm by Burns and Lynch from their paper "Mutual Exclusion Using Indivisible Reads and Writes" (page 4 (836)).
I built a jsfiddle (see code below) to try the idea out, and it works beautifully in Firefox. If you'd like to try it, open the link to the fiddle in several (up to 20) windows of Firefox and watch exactly one of them blink orange every second. If you see more than one blink at the same time, let me know! :) (Note: the way I assign the IDs in the fiddle is a little cheesy (simply looping over 0..19) and things will only work if every window is assigned a different ID. If two windows show the same ID, simply reload one.)
Unfortunately, in Chrome and especially in Internet Explorer things don't work as planned (multiple windows blink). I think this is due to a delay in the propagation of the data I write to LocalStorage from one tab/window to the other (see my question about this here).
So, basically, I need to find either a different mutex algorithm that can handle delayed data (sounds difficult/impossible) or I need to find an entirely different approach. Maybe StorageEvents can help? Or maybe there is a different mechanism that doesn't use LocalStorage?
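For what it's worth, the storage event fires in every other tab/window of the same origin whenever localStorage changes, so it could at least serve as a change-notification channel; a sketch, not a full solution:
// Fires in all *other* tabs of the same origin when localStorage is modified
window.addEventListener('storage', function (e) {
  if (e.key === 'LastRun') {
    console.log('Another tab updated LastRun to', e.newValue);
  }
});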
For completeness, here is the code of the fiddle:
// Global constants
var LOCK_TIMEOUT = 300; // Locks time out after 300ms
var INTERVAL = 1000; // Critical section should run every second
//==================================================================================
// Assign process ID
var myID;
var id = window.localStorage.getItem("id");
if (id == null) id = 0;
id = Number(id);
myID = id;
id = (id + 1) % 20;
window.localStorage.setItem("id", id);
document.documentElement.innerHTML = "ID: " + myID;
//==================================================================================
// Method to indicate critical section
var lastBlink = 0;
function blink() {
  var col = Math.round(Math.min((new Date().getTime() - lastBlink) * 2 / 3, 255));
  document.body.style.backgroundColor = "rgb(255, " + ((col >> 1) + 128) + ", " + col + ")";
}
//==================================================================================
// Helper methods to implement expiring flags
function flagUp() {
  window.localStorage.setItem("F" + myID, new Date().getTime());
}
function flagDown() {
  window.localStorage.setItem("F" + myID, 0);
}
// Try to refresh flag timeout and return whether we're sure that it never expired
function refreshFlag() {
  var content = window.localStorage.getItem("F" + myID);
  if (content == null) return false;
  content = Number(content);
  if (isNaN(content) || (Math.abs(new Date().getTime() - content) >= LOCK_TIMEOUT))
    return false;
  window.localStorage.setItem("F" + myID, new Date().getTime());
  return Math.abs(new Date().getTime() - content) < LOCK_TIMEOUT;
}
function setFlag(key) {
  window.localStorage.setItem(key, new Date().getTime());
}
function checkFlag(key, timeout) {
  var content = window.localStorage.getItem(key);
  if (content == null) return false;
  content = Number(content);
  if (isNaN(content)) return false;
  return Math.abs(new Date().getTime() - content) < timeout;
}
//==================================================================================
// Burns-Lynch mutual exclusion algorithm
var atLine7 = false;
function enterCriticalRegion() {
  // Refresh flag timeout and restart algorithm if flag may have expired
  if (atLine7) atLine7 = refreshFlag();
  // Check if run is due
  if (checkFlag("LastRun", INTERVAL)) return false;
  if (!atLine7) {
    // 3: F[i] down
    flagDown();
    // 4: for j:=1 to i-1 do if F[j] = up goto 3
    for (var j = 0; j < myID; j++)
      if (checkFlag("F" + j, LOCK_TIMEOUT)) return false;
    // 5: F[i] up
    flagUp();
    // 6: for j:=1 to i-1 do if F[j] = up goto 3
    for (var j = 0; j < myID; j++)
      if (checkFlag("F" + j, LOCK_TIMEOUT)) return false;
    atLine7 = true;
  }
  // 7: for j:=i+1 to N do if F[j] = up goto 7
  for (var j = myID + 1; j < 20; j++)
    if (checkFlag("F" + j, LOCK_TIMEOUT)) return false;
  // Check again if run is due
  return !checkFlag("LastRun", INTERVAL);
}
function leaveCriticalRegion() {
  // Remember time of last successful run
  setFlag("LastRun");
  // Release lock on critical region
  atLine7 = false;
  window.localStorage.setItem("F" + myID, 0);
}
//==================================================================================
// Keep trying to enter critical region and blink on success
function run() {
  if (enterCriticalRegion()) {
    lastBlink = new Date().getTime();
    leaveCriticalRegion();
  }
}
// Go!
window.setInterval(run, 10);
window.setInterval(blink, 10);
This is the algorithm of what I want to do:
1. Locate flickr links with class high_res_link and put them in an array [].
2. Open the flickr link with the extension "sizes/h/".
3. Find the largest photo dimensions on flickr, then go to that link. Or, if there aren't any big enough, go to step 2 with the next array element.
4. Then open the link to download if downloading is enabled. If not, go to step 2 with the next array element.
5. Go to step 2 with the next array element.
I am trying to write some code that crosses two domains: Tumblr and Flickr.
I have currently written 3 functions with jQuery and JavaScript which I want to run on 2 different URLs:
Function #1:
function link_to_flickr() {
  var hre = [];
  $('.high_res_link').parent(this).each(function () {
    var h = $(this).attr('href') + "sizes/o/";
    hre.push(h);
  });
  alert(hre[0]);
}
This finds the links on the Tumblr page to the Flickr pages I want. And puts them in an array.
Function #2:
function find_large_quality() {
  var w = 1280;
  var h = 720;
  var matchingDivs = $("small").each(function () {
    var match = /^\((\d+) x (\d+)\)$/.exec($(this).text());
    if (match) {
      if (parseInt(match[1], 10) >= w && parseInt(match[2], 10) >= h) {
        return true;
      }
    }
    return false;
  });
  var href = $.trim(matchingDivs.text()).match(/\(.*?\)/g);
  if (matchingDivs.length >= 1) {
    alert("success");
  } else {
    alert("fail");
  }
  var ho = $('small:contains("' + href[href.length - 1] + '")').parent(this).find("a").attr("href");
  alert("http://www.flickr.com" + ho);
}
This function, once on the Flickr URL, searches for an image with dimensions greater than 720p.
Function #3:
function Download() {
  var heyho = $('a:contains("Download the")').attr('href');
  window.open(heyho, '_blank');
}
This downloads the image file once on the highest-quality Flickr URL.
At each alert I want to open the URL instead, and then perform the next function on it. I have been trying for ages to find a method of doing something like this.
Using AJAX, using PHP, using JSONP, using jquery.xdomainajax.js, etc., but I can't come up with a sufficient method on my own.
Has anybody got a way they would recommend going about doing something like this?
You usually can't run functions on different domains due to CORS, unless the domains allow it.
If you're writing a userscript/extension, what you can do is use postMessage (there is a quick tutorial on how to use it cross-domain) on both pages in a content script, and use the achieved inter-page communication to control your script flow.
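A minimal sketch of that communication; the origins, timing and message format here are made up for illustration:
// In the script running on the Tumblr page: open the Flickr page and message it
var flickrWin = window.open('https://www.flickr.com/photos/example/12345/sizes/o/');
setTimeout(function () {
  flickrWin.postMessage({ action: 'find_large_quality' }, 'https://www.flickr.com');
}, 3000); // crude wait for the other page to load

// In the script running on the Flickr page: listen for instructions
window.addEventListener('message', function (e) {
  if (e.origin !== 'https://example.tumblr.com') return; // verify the sender
  if (e.data.action === 'find_large_quality') {
    find_large_quality();
  }
});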
An alternate method is to use the APIs of the websites you want to access.
Or use Python with BeautifulSoup.
I started learning JavaScript a couple of days ago, did the Codecademy stuff, and thought I would try to make a simple game.
So I came up with a memory game where you have to find pairs of images.
It is all working and I have a score system in place, but a few people have said that the delay that happens after the cards have been chosen, before another choice is allowed, is hindering them, and I can't figure out how to improve that performance.
Here is the bit of code I think is causing the delay. Is there any better way to produce the same result? Sorry, I am new to all this.
function check() {
  clearInterval(tid);
  if (people[secondchocie] === people[firstchocie]) {
    cntr++;
    if (cntr === numOfMatches) {
      stop();
      score = checkScore(amountGoes);
      $('#gameFinished').append('<p>Well done, you managed to complete the game your score is <span>' + score + '</span></p>');
    }
    turns = 0;
    return;
  } else {
    document.images[firstchocie + numOfImages].src = backcard;
    document.images[secondchocie + numOfImages].src = backcard;
    turns = 0;
    return;
  }
}
I can't create comments, so I'll put this in an answer.
Although I agree with lukas.pukenis ...
Changing images can take some time if they aren't preloaded. To test this, try to get them into the browser cache by adding them somewhere else in the page (e.g. with an IMG tag) before starting the game.
Then you'll be sure they are in the cache.
edit:
I recently used this:
var cache = [];
function preLoadImages(arrImg) {
  var args_len = arrImg.length;
  for (var i = args_len; i--;) {
    var cacheImage = document.createElement('img');
    cacheImage.src = arrImg[i];
    cache.push(cacheImage);
  }
}

preLoadImages(['images/img1.png', 'images/img2.png', 'images/img3.png']);
You can add all the images needed to the JavaScript array.
If you're a quick study :) you can do the following:
If your page is generated by PHP, you could let PHP read the entire images directory and write the filenames into the page as JavaScript code.
Or you could create an AJAX request which returns all the paths to the images and passes them to the preload function in a callback.
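The AJAX variant could be as small as this, assuming a hypothetical endpoint that returns the image paths as a JSON array:
// list_images.php (hypothetical) returns e.g. ["images/img1.png", "images/img2.png"]
$.getJSON('list_images.php', function (paths) {
  preLoadImages(paths);
});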