We are using Web Components and Polymer on our site, and have quite a few bits of JavaScript which wait for the "WebComponentsReady" event to be fired before executing. However, we have some asynchronous JS files which occasionally add an event listener for the event after it has been fired, meaning the script we want to run never runs.
Does anyone know if there is a flag for Web Components being ready which can be checked?
Something like this is what we would need:
if (WebComponents.ready) { // Does this flag, or something similar, exist?
    // do stuff
} else {
    document.addEventListener('WebComponentsReady', function() {
        // do stuff
    });
}
Any help appreciated.
The following flag is set during bootstrap:
window.CustomElements.ready
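A minimal sketch of how that check could be wrapped, assuming the CustomElements polyfill has set the flag by the time your async scripts run (the whenWebComponentsReady helper name is just for illustration):
// Run the callback now if the polyfill has already bootstrapped,
// otherwise wait for the WebComponentsReady event.
function whenWebComponentsReady(callback) {
    if (window.CustomElements && window.CustomElements.ready) {
        callback();
    } else {
        document.addEventListener('WebComponentsReady', callback);
    }
}

whenWebComponentsReady(function () {
    // do stuff
});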
Related
There is a website which fires a function when the tab is blurred. I don't want that to happen.
Is there a way I can stop JavaScript from firing the window.onblur event?
From my initial search, I have concluded that I need to override the page's handler, which can be done using a userscript manager like Greasemonkey.
I tried the following script in Greasemonkey:
window.onblur = null
This doesn't seem to have any effect, and the webpage behaves the same as before.
Have a look at Event.preventDefault() and Event.stopPropagation() to see if they help your case.
If you would like to override the function that is called on the event, you can simply redefine it and inject it using a script manager. For example:
var originalCallbackFunction = callbackFunction;
callbackFunction = function() { // Redefinition
    /* Do something else */
};
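If clearing window.onblur once has no effect, one likely reason is timing: the page may assign its handler after the userscript runs. A hedged sketch of one possible workaround, assuming the page uses the onblur property rather than addEventListener:
// Clear the page's handler again once the page has finished loading, in case it
// was assigned after the userscript ran. This does not help if the page registered
// its handler via addEventListener instead of window.onblur.
window.addEventListener('load', function () {
    window.onblur = null;
});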
I've implemented a chat application using socket.io and Node.js. The application runs fine but, sometimes, I have trouble manipulating the HTML: when I try to $('#id').hide() or $('#id').show(), nothing happens because the element id is not available.
If I refresh the page by pressing F5, it sometimes works because that element is rendered before I try to hide or show it. I observed this behavior while debugging with the Google Developer tools, but I'm not sure it's the real cause.
I tried to find out what the life cycle of DOM elements is, but I didn't find anything that clarified my thoughts.
I'm trying to reproduce this problem in a development environment, but I'm far from pinning it down:
<script>
    console.log('Creating socket');
    var socket = io();
    console.log('Socket created');

    socket.on('connected', function (data) {
        console.log('connected to server');
    });

    function timeout() {
        setTimeout(function() { console.log('sleeping'); }, 5000);
    }

    $(document).ready(function(){
        timeout(); // is it possible for the process to get stuck at this point?
        console.log('Ready');
    });
</script>
No matter where I put socket.on('connected', ...), it is always called after console.log('Ready'). Based on this, my F5-refresh theory is not correct and I feel that I'm running in circles.
Does anyone have an idea why HTML elements are sometimes not present?
And, if I put socket.on('anyevent', ...) inside $(document).ready(function(){ ... }), do I have any guarantee that the event will only be processed after the page is fully rendered?
In the real application, all our socket events are inside $(document).ready(function(){ ... }), but we still fail to hide or show some HTML elements because they aren't present.
I'm not sure about your HTML and code structure, but this sounds like you are binding your event listeners to a dynamically added element that does not exist at the time of the binding.
If my understanding is correct, you need to add the binding on an element but base the action on the newly added element, something along the lines of:
// Add event listener; the handler checks for the dynamically added element
$("button").on('click', function(){
    // if element exists, toggle it
    if ($("#newElemId").length) {
        $("#newElemId").toggle();
    } else {
        console.log('element not present yet');
    }
});
See demo below:
$(function(){
    // define function to add an element to the DOM
    var addElement = function(){
        var newElementAsObj = $(document.createElement('div'));
        // add an id for querying later
        newElementAsObj.attr('id', 'newElemId');
        // add some text (so it's visible)
        newElementAsObj.text('New Element');
        $("#container").append(newElementAsObj);
        console.log('new element added!');
    };

    // add a new element after a few secs
    setTimeout(addElement, 5 * 1000); // time is in ms, so 5 * 1000 = 5 secs

    // Add event listener; the handler checks for the dynamically added element
    $("button").on('click', function(){
        if ($("#newElemId").length) {
            // if element exists, toggle it
            $("#newElemId").toggle();
        } else {
            console.log('element not present yet');
        }
    });
});
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
<div id="container">
<button>Toggle</button>
</div>
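As an alternative sketch, jQuery's delegated events attach the handler to a stable ancestor with a selector string, so the handler also fires for elements added later (this reuses the #container and #newElemId ids from the demo above):
// Delegated handler on the container; the selector string '#newElemId' (rather than
// a jQuery object) is what makes this work for elements added at any time.
$("#container").on('click', '#newElemId', function () {
    $(this).toggle();
});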
First, regarding this:
$(document).ready(function(){
    timeout(); // is it possible for the process to get stuck at this point?
    console.log('Ready');
});
No, it's not possible. But you don't need to wait there. You can remove that timeout function entirely.
You should move the socket.on('connected', function ... into $(document).ready(... because you don't want to respond to any socket events until the document is ready.
<script>
    console.log('Creating socket');
    var socket = io(); // It's fine to do this before the document loads
    console.log('Socket created');

    $(document).ready(function(){
        socket.on('connected', function (data) {
            console.log('connected to server');
        });
        console.log('waiting for connection...');
    });
</script>
The jQuery documentation describes how you can use $(window).on('load', function() {}) to run your code after the entire page, not just the DOM, is ready. You might try that if it's not enough that the DOM is ready and you need the whole page to be loaded.
https://learn.jquery.com/using-jquery-core/document-ready/
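For example, a minimal sketch (the '#id' selector is just the placeholder from the question):
// Runs only after all resources (images, stylesheets, iframes) have loaded,
// not merely when the DOM has been parsed.
$(window).on('load', function () {
    $('#id').show();
});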
If I refresh the page by pressing F5, it sometimes works because that element is rendered before I try to hide or show it. I observed this behavior while debugging with the Google Developer tools, but I'm not sure it's the real cause.
Are you dynamically creating that element for example with $().append() or similar? If so, make sure that you actually create the element before you try to interact with it. If the element is not created dynamically, make sure that the code that interacts with the element is inside the $(document).ready or $(window).on('load') callback.
No matter where I put socket.on('connected', ...), it is always called after console.log('Ready'). Based on this, my F5-refresh theory is not correct and I feel that I'm running in circles.
This happens because establishing the socket connection takes more time than rendering the DOM. It's generally a good idea to attach event listeners as soon as possible so you don't miss any events; if you attach them only after the DOM has loaded, you might miss some. Be aware, though, that if you manipulate the DOM inside an event listener callback, you cannot be sure the DOM is loaded and your target element is there unless you attached the listener after the DOM loaded. So I recommend attaching event listeners after the DOM has loaded, at least those that modify the DOM.
Does anyone have an idea why HTML elements are sometimes not present?
There are not many possible reasons for this. Either your elements are not yet loaded, or your code has removed them for some reason. I suggest putting breakpoints at the places where you create or manipulate the elements and seeing what the execution order is.
And, if I put socket.on('anyevent', ...) inside $(document).ready(function(){ ... }), do I have any guarantee that the event will only be processed after the page is fully rendered?
You have a guarantee that the callback function will be executed when anyevent occurs and the DOM is ready, that is, when all the static HTML elements are there.
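A hedged sketch combining both points: attach the socket listener immediately so no events are missed, but defer the DOM work until the document is ready (the 'connected' event comes from your snippet; the '#status' id is just illustrative):
var socket = io();

// Attach the listener right away so early events are not missed.
socket.on('connected', function (data) {
    // $(fn) runs fn immediately if the DOM is already ready, otherwise it waits for it.
    $(function () {
        $('#status').show();
    });
});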
I have a Chrome extension which injects some DOM event listeners through its content scripts. I want to remove those event listeners from the DOM if the user deactivates the extension; is there a method to do so?
It's an interesting question. It has to do with a concept of "orphaned" scripts. I talk at length about those in an addendum here.
Problem is, as soon as the script becomes detached from the parent extension, Chrome APIs will fail. As such, detecting this is not straightforward.
There are a few possible approaches:
1. Maintain an open port to the background page. The port will fire an onDisconnect event if the background page ceases to exist (a sketch of this follows the heartbeat example below).
This is an event-based approach - you will be able to react immediately.
But it has an important downside: maintaining an open port will prevent an Event page from unloading. So if you use a non-persistent background page, this is not optimal.
2. Periodically, or better yet at the beginning of your handlers, try to do something with the Chrome API. This will fail, and you can catch the exception and assume that the extension is orphaned.
Please note that this is pretty much undefined behavior; how the Chrome API reacts can change over time.
function heartbeat(success, failure) {
    try {
        if (chrome.runtime.getManifest()) {
            success();
        } else { // will return undefined in an orphaned script
            failure();
        }
    } catch(e) { // currently doesn't happen, but may happen
        failure();
    }
}

function handler() {
    heartbeat(
        function(){ // heartbeat success
            /* Do stuff */
        },
        function(){ // heartbeat failure
            someEvent.removeListener(handler);
            console.log("Goodbye, cruel world!");
        }
    );
}

someEvent.addListener(handler);
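For the first approach, a hedged sketch of the port-based detection might look like the following; 'lifeline' is just an arbitrary port name, and someEvent/handler refer to the snippet above:
// Content script: keep a port open to the background page. port.onDisconnect
// fires when the background page goes away (extension disabled, reloaded or removed).
var port = chrome.runtime.connect({ name: 'lifeline' });
port.onDisconnect.addListener(function () {
    someEvent.removeListener(handler); // clean up DOM listeners here
});

// Background page: listen for the connection so the port stays open.
chrome.runtime.onConnect.addListener(function (port) {
    // Nothing to do; accepting the connection is enough to keep it alive.
});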
Finally, there is a proposal to make a special event for this situation, but it's not implemented yet.
Specifically for updates: when the extension is reloaded, you can make it inject scripts into existing pages and let the old scripts know they should deactivate; however, since your question is about the extension being removed, that doesn't help here.
With the hard part done, the actual removal of event listeners depends on how you added them, but should be straightforward.
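For instance, a hedged sketch of one way to keep removal straightforward is for the content script to record every listener it adds, so they can all be unbound in one place (the track and removeAllListeners helper names are hypothetical):
// Record each listener the content script adds...
var registered = [];

function track(target, type, listener) {
    target.addEventListener(type, listener);
    registered.push({ target: target, type: type, listener: listener });
}

// ...so they can all be removed at once, e.g. from the heartbeat failure branch above.
function removeAllListeners() {
    registered.forEach(function (entry) {
        entry.target.removeEventListener(entry.type, entry.listener);
    });
    registered = [];
}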
chrome.management.onDisabled is fired when an app or extension has been disabled:
chrome.management.onDisabled.addListener(function callback)
EventTarget.removeEventListener() removes an event listener that was previously registered with addEventListener():
var div = document.getElementById('div');
var listener = function (event) {
    /* do something here */
};
div.addEventListener('click', listener, false);
div.removeEventListener('click', listener, false);
I think you don't need to do this, since once your extension is disabled, your event listener will be removed and won't be injected.
My site runs on jQuery + AJAX and has a single JavaScript file which loads once when a user opens any page, so I'm used to adding event listeners to everything via $(document).on(...).
After a while I noticed that there were too many .on(...) calls in the code and got worried, so I made the code remove unneeded listeners every time the user clicks a link or the back button:
function page_reload(){
    if (c.r == 'http://example.com/page1') {
        $(document).on('click', '#send', func.send);
        $(document).on({mouseenter: func.me, mouseleave: func.ml}, '#chan');
    } else {
        $(document).off('click', '#send');
        $(document).off('*', '#chan');
    }
}
So is there any point to this? Does a big number of listeners do some bad thing I don't know about?
When you attach a listener to an event, it does take memory, and (if totally unchecked) it can cause memory-related issues. In my experience, it is best to employ cleanup methods in your objects so that, when a certain event fires, you use .off() to unregister your event listeners.
The particular logic to these types of methods will vary depending on your project but something of the form:
var MyApp = {
    cleanup: function cleanMyApp(event) {
        // unbind the handlers registered earlier ('click' is just an example event type)
        $('#myId1').off('click', myMethod1);
        $('#myId2').off('click', myMethod2);
    }
};

$(document).ready(function() {
    $(document).on('importantEvent', function(event) {
        event.preventDefault(); // if you need to
        MyApp.cleanup();
    });
    // or
    $('#elem').on('something', MyApp.cleanup);
});
So yes, having too many listeners registered at a time can cause issues, but you can monitor memory usage with your browser's dev tools and the like. In particular, each listener (and any closure it keeps alive) consumes memory, so an unbounded number of them can degrade performance and, in extreme cases, crash the tab.
There is also a great answer here on dealing with these kinds of issues.
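One option worth a look is jQuery's event namespaces, which make this kind of cleanup simpler: everything bound with a given namespace can be unbound in a single call. A hedged sketch, reusing func.send, func.me and func.ml from your snippet above (the '.page1' namespace is just an illustrative name):
// Bind the page-specific handlers under the ".page1" namespace...
$(document).on('click.page1', '#send', func.send);
$(document).on({ 'mouseenter.page1': func.me, 'mouseleave.page1': func.ml }, '#chan');

// ...and unbind everything in that namespace in one call when leaving the page.
$(document).off('.page1');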
I inherited some code, and all I want to do is run some jQuery code, specifically an alert.
I know how to do that with jQuery/JavaScript, but I'm not sure if I can just use what I already have, which is as follows:
<script>
    $LAB
        .script("js/lib/jquery-1.7.1.min.js").wait()
        .script("js/lib/jquery.mobile-1.1.0.min.js").wait()
        .script("../js/mobile/config/buildfile.js").wait()
        .script("js/init.js")
        .wait(function () {
            $(function() {
                setupHelpers();
                loadApp(true,
                    function () {
                    },
                    function () {
                    });
            });
        });
</script>
Do I still need to use document.ready, or can I just put that alert somewhere in the code above?
Thanks
Whether you need to wait for DOMready is a separate question from whether LABjs is in the picture. It's a common misunderstanding to conflate "DOMready" with "scripts are finished and ready", when in fact they're two separate events and should be treated individually in terms of what you need.
So, to answer your specific question of whether you should wait for DOMready: it depends on whether you need the DOM to display the alert. If you're alerting the user with the normal alert() dialog popup, then no, you don't. If you're using some plugin which "alerts" the user by injecting a <div> into the DOM to show the alert, then yes, you definitely need to wait for the DOM.
Whatever you do, don't fall for the fallacy that the "script loaded" event (which is what you get in your final wait(..) call) means the DOM is ready. This is not necessarily true.
Rule 1: if you need the DOM for some task, always wait using a DOMready event handler, don't assume.
Rule 2: if you need to wait for some or all scripts to finish loading, use a script-loaded event, like what a script loader like LABjs gives you.
Rule 3: if you need both the DOM and the scripts to finish loading, then compose both events, by embedding a DOMready handler in your script-loaded handler, as you have done above.
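To make that concrete, a hedged sketch based on the snippet above (setupHelpers and the script paths come from the question; the alert text is just illustrative):
$LAB
    .script("js/lib/jquery-1.7.1.min.js").wait()
    .script("js/init.js")
    .wait(function () {
        alert('Scripts loaded');      // fine here: alert() does not need the DOM
        $(function () {               // DOMready handler
            // anything that touches the DOM (e.g. a <div>-based "alert" plugin) goes here
            setupHelpers();
        });
    });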