Assuming there are no crazy optimizations (I'm looking at you, Chrome), I'm talking about the raw, nasty, ain't-broke-don't-fix-it, IE 6 JavaScript cost.
The lower limit being:
document.getElementById()
Versus:
document.getElementsByTagName('div') lookup.
getElementById can safely be assumed to be O(1) in a modern browser, as a hashtable is the perfect data structure for the id=>element mapping.
Without any optimizations, any simple query - be it a CSS selector, an id lookup, or a class or tag name lookup - is no worse than O(n), since one iteration over all elements is always enough.
However, in a good browser I'd expect it to have a tagname=>elements mapping, so getElementsByTagName would be O(1) too.
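As a rough illustration (not from the answer above, and the names are purely hypothetical), this is the kind of id=>element index that makes a constant-time lookup possible:
var idIndex = {};
// As the document is parsed, each element with an id is registered once.
function register(element) {
    if (element.id) {
        idIndex[element.id] = element; // constant-time insert into the hash
    }
}
// Lookup is then a single hash access instead of a walk over the whole tree.
function lookupById(id) {
    return idIndex[id] || null;
}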
I need to arrange an array from highest to lowest in a function.
I have used array.sort but was informed that it was not allowed for my exercise.
function minmax(array){
    var ar = array.sort().join(); // I can't use the array.join method
    return ar;
}
Please help out.
One solution could be to implement any of the popular sorting algorithms, but make the comparison used prioritize larger numbers. One example is this link, the code of which only needs a > flipped to a <.
Note that if this is a homework or school exercise, copying from anywhere is plagiarism and is generally discouraged.
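As a rough illustration of that approach, here is a minimal selection-sort sketch that orders the array from highest to lowest without using Array.prototype.sort (the function name and details are my own, not taken from the linked code):
function sortDescending(array) {
    var result = array.slice(); // copy so the input is not mutated
    for (var i = 0; i < result.length - 1; i++) {
        var maxIndex = i;
        for (var j = i + 1; j < result.length; j++) {
            if (result[j] > result[maxIndex]) {
                maxIndex = j; // remember the largest remaining value
            }
        }
        var tmp = result[i]; // swap it into position i
        result[i] = result[maxIndex];
        result[maxIndex] = tmp;
    }
    return result;
}
// sortDescending([3, 1, 4, 1, 5]) gives [5, 4, 3, 1, 1]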
Is there any resource that documents the time complexity of the natively defined array and string methods in JavaScript?
I have to guess while using them to solve algorithm problems, but I want to be sure about the time complexity of those functions.
This question has been answered previously:
Time Complexity for Javascript Methods in V8
In short, it's not specified, and the time complexity of common JS methods can differ between browsers.
Worse yet, some methods might not even exist, or may behave differently, between browsers and browser versions!
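Since nothing is guaranteed by the specification, one option is to measure a method's scaling empirically. The sketch below is my own illustration and assumes indexOf is a linear scan; it only shows the measurement idea:
function timeIndexOf(size) {
    var arr = new Array(size).fill(0);
    var start = Date.now();
    arr.indexOf(1); // worst case: the value is absent, so every element is checked
    return Date.now() - start;
}
// If indexOf is O(n), doubling the input should roughly double the time.
console.log(timeIndexOf(10000000), timeIndexOf(20000000));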
Running this line of code makes memory usage go up by about 100 megabytes, and that memory does not get freed:
var json = new Array(100000000).join(",");
Why?
As reported in this answer, each element of your array ends up costing roughly a byte, since the join produces a string of 99,999,999 commas; 100,000,000 bytes is roughly 95 MB. That memory will not be released until the garbage collector runs, which happens whenever it deems fit.
As stated on MDN:
Most memory management issues come at this phase. The hardest task here is to find when "the allocated memory is not needed any longer". It often requires the developer to determine where in the program such piece of memory is not needed anymore and free it.
So the memory will be released when the browser determines it is no longer needed, or when the developer explicitly states that it is no longer needed (demonstrated elsewhere on that page).
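For example (my own minimal sketch, not taken from the MDN page), dropping the only reference is the usual way to state that explicitly:
var json = new Array(100000000).join(","); // builds a ~100 MB string of commas
// ... use json ...
json = null; // no reference remains, so the garbage collector may reclaim the string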
I didn't find any documentation as to why XPath support was dropped from jQuery.
Read this: http://ejohn.org/blog/xpath-overnight/
I should, also, probably answer the inevitable question: “Why doesn’t jQuery have an XPath CSS Selector implementation?” For now, my answer is: I don’t want two selector implementations – it makes the code base significantly harder to maintain, increases the number of possible cross-browser bugs, and drastically increases the filesize of the resulting download. That being said, I’m strongly evaluating XPath for some troublesome selectors that could, potentially, provide some big performance wins to the end user. In the meantime, we’ve focused on optimizing the actual selectors that most people use (which are poorly represented in speed tests like SlickSpeed) but we hope to rectify in the future.
I made a website that does absolutely nothing, and I've proven to myself that people like to stay there - I've already logged 11+ hours worth of cumulative time on the page.
My question is whether it would be possible (or practical) to use the website as a distributed computing site.
My first impulse was to find out if there were any JavaScript distributed computing projects already active, so that I could put a piece of code on the page and be done. Unfortunately, all I could find was a big list of websites that thought it might be a cool idea.
I'm thinking that I might want to start with something like integer factorization - in this case, RSA numbers. It would be easy for the server to check whether an answer is correct (simply test whether the submitted factor divides the number with zero remainder), and also easy to implement.
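A minimal sketch of that server-side check (my own illustration; it assumes BigInt support, since RSA numbers are far larger than JavaScript's safe integer range):
function isFactor(candidate, rsaNumber) {
    return rsaNumber % candidate === 0n; // zero remainder means candidate divides it
}
// isFactor(3n, 15n) gives true, isFactor(4n, 15n) gives false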
Is my idea feasible? Is there already a project out there that I can use?
Take a look at http://www.igvita.com/2009/03/03/collaborative-map-reduce-in-the-browser/ and http://www.igvita.com/2009/03/07/collaborative-swarm-computing-notes/