What is the most conclusive JavaScript benchmark test?

Here are four of them:
SunSpider JavaScript 0.9.1
V8 Benchmark Suite
Peacekeeper
Kraken 1.0
Different benchmarks tend to show a different browser as the king. Which one (including any one I may not have listed) is the most conclusive benchmark?

Sports analysts will never agree as to who is/was the best soccer player in history. Maradona? Pelé? Cruyff? Another one?
Different people regard some characteristics as more important than others, and give them more weight when forming an opinion. Same happens with benchmarks: no benchmark is absolutely right.
Find the benchmark that values the characteristics that you think are important for your particular program, and trust that one.

The one that does exactly what your application does.
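If you do want a number for your own code rather than trusting a published suite, a minimal sketch is easy to write (assuming a browser with performance.now(); the function names below are made up for illustration):

    // Time how long a candidate implementation of your own hot path takes.
    function measureMs(label, fn, iterations) {
        var start = performance.now();
        for (var i = 0; i < iterations; i++) {
            fn();
        }
        var elapsed = performance.now() - start;
        console.log(label + ': ' + elapsed.toFixed(1) + ' ms for ' + iterations + ' runs');
        return elapsed;
    }

    // Compare two ways of doing the same thing your application actually does.
    measureMs('join',   function () { return [1, 2, 3, 4, 5].join(','); }, 100000);
    measureMs('concat', function () { return 1 + ',' + 2 + ',' + 3 + ',' + 4 + ',' + 5; }, 100000);

The usual caveats apply: warm-up, JIT effects and dead-code elimination can all skew microbenchmarks like this, which is exactly why the published suites disagree with each other.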

Related

How to introspect elements kind in an array in V8

After reading this article: https://v8.dev/blog/elements-kinds, I was wondering whether some of the arrays created in my code are packed or holey. Is there any way to check this at runtime in a browser environment (specifically Chrome)?
I am looking for something like %DebugPrint() that works in Chrome.
Maybe there are some other clues that I can look for. For example, some info that is captured in heap snapshots, etc.
(V8 developer here.)
As far as I'm aware, Chrome (DevTools) does not expose arrays' "elements kind".
And that's probably okay, because the performance difference between "packed" and "holey" elements is very small. Essentially, it only shows up in microbenchmarks. The typical benefit of a packed array is that the engine can skip two machine instructions that would otherwise be needed. If you're looking at a one-line hot loop body that compiles to maybe a dozen machine instructions, then saving two of them can be measurable. If you're looking at a real-world app where hundreds of kilobytes of code contribute to overall performance, saving two machine instructions here and there isn't going to matter.
I understand that you may be curious; but as far as tangible optimizations go, there are much more impactful ways to spend your time. Talking about elements kinds is pretty much a "for your curiosity, here are some of the lengths to which JS engines go to squeeze every last bit of performance out of JavaScript" story; it is not meant to be important performance advice that most developers should be aware of.
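One workaround, since Chrome and Node.js share the V8 engine: run the snippet you care about in Node.js (or the d8 shell) with natives syntax enabled and use %DebugPrint there. This is a sketch relying on V8 internals, so the exact output format may change between versions; the expected kinds in the comments follow the blog post linked in the question, and the file name is arbitrary.

    // Run with: node --allow-natives-syntax elements-kinds.js
    var packed = [1, 2, 3];        // expected: PACKED_SMI_ELEMENTS
    var holey = [1, , 3];          // expected: HOLEY_SMI_ELEMENTS (hole at index 1)
    var doubles = [1.5, 2.5];      // expected: PACKED_DOUBLE_ELEMENTS

    %DebugPrint(packed);           // the elements kind appears in the printed details
    %DebugPrint(holey);
    %DebugPrint(doubles);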

How expensive are power operations in Javascript?

Does it take a lot of CPU cycles to raise 2 to the power of, say, 29? Multiple times each second?
Or is it worth creating a cache of power variables?
Questions about performance in JavaScript don't usually have a single answer, for the simple reason that each engine is different. What's expensive in one engine (say, Microsoft's JScript) is cheap in another (say, Google's V8).
So a two-part answer:
Don't worry about it until/unless you have a real-world performance issue related to power operations. Until then, it's just a waste to spend time on it.
If that happens, profile the performance of alternatives on the engines you intend to support. If this is a browser-based thing you're working on, http://jsperf.com can be useful there, and all modern browsers have development tools, some with pretty decent profilers.
Modern JavaScript runtimes do this sort of thing fast!
JavaScript can perform (in Google's V8, SpiderMonkey, JavaScriptCore and in IE10) over 44 million 'two to the power of 29' operations per second.
Here is a perf
Whenever you run into this sort of question you should profile your code.
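For the concrete case asked about, here is a rough sketch of the kind of comparison a profiler or jsPerf run would give you (absolute numbers vary wildly between engines, so treat the relative results as a hint rather than a verdict):

    var ITERATIONS = 10000000;
    var CACHED = Math.pow(2, 29);          // the "cache of power variables" approach

    function time(label, fn) {
        var start = Date.now();
        var sink = 0;                      // accumulate so the loop isn't optimized away
        for (var i = 0; i < ITERATIONS; i++) {
            sink += fn();
        }
        console.log(label + ': ' + (Date.now() - start) + ' ms (' + sink + ')');
    }

    time('Math.pow(2, 29)', function () { return Math.pow(2, 29); });
    time('1 << 29',         function () { return 1 << 29; });
    time('cached constant', function () { return CACHED; });

In practice all three are likely to be so fast that the loop overhead dominates, which is the point of the answers above.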

Could WebWorkers be used for supercomputer power?

This is a general question really, not sure if this is the place for it (it might be deleted as too general), so please don't heckle (I am just curious).
I have been reading up on WebWorkers API and had a thought.
WebWorkers can be limited to using only small amounts of processing power for each machine/user. This could be tailored so as not to affect user experience, and might only slightly affect browser performance (if at all).
My question is, could they theoretically be used to turn a website/application into a highly distributed supercomputer?
Or is it more of an ethical question: even if it could be done, is it wrong if the user is not aware?
Yes, WebWorkers can be used for supercomputing a.k.a. distributed computing.
In fact, that's exactly what CrowdProcess does: http://crowdprocess.com/
DISCLAIMER: I work on CrowdProcess.
Websites can join the platform and supply it with processing power from the browsers that visit them, without disrupting the website visitors' experience in any way.
Developers can use the platform for their distributed computing jobs. Check the documentation to know how this happens: http://crowdprocess.com/doc-index
The website visitor can opt-in, opt-out or simply agree with the terms and conditions of the website that provides the platform with the browser's processing power.
We ask the website owners to tell the users what's going on in any way they find appropriate for their audience. CrowdProcess is aware that no one should power this platform against their consent and will. That's why we develop projects with a higher purpose: forest fire behavioural prediction, genetic sequence alignment and medical computer vision just to name a few.
Our vision is that one day soon we will have enough commercial applications running on the platform that allow us to pay websites for the processing power they provide.
It's possible, unethical and likely illegal.
It is certainly possible to do. In fact you don't even need to use web workers to do it. It is probably unethical to do if the user is not aware, but it may not actually degrade the user experience or even be noticeable. It may even be illegal, and you should get some legal advice.
For example, if you have an application where users are aware that they help folding proteins while playing your game or something like that, then it may be a great application. If, on the other hand, you want to mine bitcoins using the processing power and electricity of your unsuspecting visitors, then you are asking for trouble.
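For what it's worth, the browser side of such a system is only a few lines. A minimal sketch (the file name, message shape and /submit-result endpoint are made up for illustration; any real deployment should be opt-in and disclosed):

    // main.js - farm a work unit out to a background thread so the page stays responsive.
    var worker = new Worker('worker.js');
    worker.onmessage = function (event) {
        // Post the result back to the coordinating server (hypothetical endpoint).
        fetch('/submit-result', {
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify(event.data)
        });
    };
    worker.postMessage({ jobId: 42, from: 0, to: 1000000 });

    // worker.js - does the actual number crunching off the main thread.
    self.onmessage = function (event) {
        var job = event.data;
        var result = 0;
        for (var i = job.from; i < job.to; i++) {
            result += Math.sqrt(i);        // stand-in for the real computation
        }
        self.postMessage({ jobId: job.jobId, result: result });
    };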
I have found two projects...
SETI@home http://setiathome.berkeley.edu/
Gives users the chance to donate some processing power to help analyse data from their telescope.
Folding@home http://folding.stanford.edu/English/About
Users can give processing power to research labs for all sorts of scientific research and study purposes (including protein strings).
It seems it is LEGAL (via WebSockets or Ajax) as long as you give details in your Terms and Conditions, but it is not recommended, as better ways to do heavy processing exist (see the two examples above).

What's going to replace HTML & CSS & JS? [closed]

As it currently stands, this question is not a good fit for our Q&A format. We expect answers to be supported by facts, references, or expertise, but this question will likely solicit debate, arguments, polling, or extended discussion. If you feel that this question can be improved and possibly reopened, visit the help center for guidance.
Closed 11 years ago.
HTML and CSS are showing their age.
SASS generates CSS (because CSS isn't clean enough). Graphic designers don't work in HTML; they work in graphics tools and then have to translate the result to HTML/CSS. JavaScript has to have abstractions like jQuery, and CSS has a bunch of hacks to even start approaching consistent predictable user experience.
It feels like people are doing some wonderful things despite the technologies, not because of them.
Surely there is a better way?!? Something more closely aligned with the task at hand... of providing a fluid, intuitive (consistent) user experience to let users achieve their goals.
Thoughts?
Nothing, I believe they are here to stay for the next 10 years.
The Internet experience might be enhanced by technologies like Flash & Silverlight, but what's valuable about the Internet is not the technology but the information.
Therefore, breaking Internet compatibility for pure technological enhancement will never work.
BLTML: Bacon, Lettuce and Tomato Markup Language.
In the future, the web will only be used for posting tasty things we want to eat, so it makes sense to develop a language geared just for this.
The pictures of kittens with words on them will likely be supported with a gross hack.
Graphic Designers don't work in HTML
DTP designers don't produce paper and ink either. Designing something and producing something are separable tasks - when you have an idea for a TV spot you still need lots of technology between your creativity and the result; the same applies to the web.
JavaScript has to have abstractions like jQuery, and CSS has a bunch of hacks to even start approaching consistent predictable user experience.
Oh man, JS doesn't have to have them; developers simply like to make their job easier. This rule applies to various programming languages - it's like saying Python has to have Django. Frameworks and libraries sit on top of the language; they are not a must.
CSS 'has a bunch of hacks' because some browser producers don't give a damn about something called "standards", not because the language is badly designed.
Surely there is a better way?!? Something more closely aligned with the task at hand... of providing a fluid, intuitive (consistent) user experience to let users achieve their goals.
What user experience is not provided with HTML, CSS, JS? I really don't get your point, or what you expect from the Web.
Oh, and if you are like 'you know, you need Flash for something or whatever', start getting interested in canvas.
HTML and CSS are here to stay for a long time yet!
While they may not be as intuitive to use for designers as, say, Photoshop, well-formed HTML is machine readable - this means that it can be used by humans AND computers. This is very important and useful. Imagine a web full of pictures that look beautiful but cannot be crawled or searched by Google.
HTML and CSS are superior because of the structured information that underlies them.
I don't think that HTML and CSS are showing their age, I think that browsers are showing their age. I like being able to describe what I want done, but not HOW to do it.
I guess what I want is for browser vendors to use one rendering engine, or if nothing else, an SSL-type certification for browsers. Kind of a global impartial body that measures browsers on a quality bar-like scale.
Just like with SSL certification, it would be done by a third party. I'm not sure what the pricing structure would be, but I don't think you should have to pay for it. I think it would make a great "works in this browser" logo, like the Spam Free and Malware Free logos we've seen popping up on sites over the years. Or perhaps Acid1, Acid2 and Acid3 "passed" logos for browsers.
I would argue that libraries like jQuery and Prototype exist because browsers all have their quirks. We just got tired of writing all that handling code, so some very smart people did this for us.
Personally I think HTML and CSS are very elegant, and while the W3C certainly isn't hasty, I think it's probably fair to say that a certain browser has been holding back design on the web more than the technologies themselves.
CSS3 has support for fantastic things such as web fonts with @font-face. JavaScript engines are becoming increasingly speedy and allowing for things like John Resig's Processing port, which would have been unimaginable years ago.
We need to see users adopting new browsers at a faster pace, and we need to see vendors getting behind efforts to encourage their users to upgrade.
I think it's a mistake to think that abstractions are a negative thing, and indicate some problem with the base technology - technologies naturally evolve through abstraction. There's some inconsistency in your post in the sense that you decry the need for abstraction, but then mention that you desire consistency - that consistency across clients is achieved through abstraction. I no longer have to worry about how different clients handle the DOM - jQuery does this for me. CSS hacks by the same token aren't really necessary, and it's quite acceptable to serve a different stylesheet to that browser; the rendering differences between the other mainstream engines are pretty minimal.
Please also consider that we're still using a lot of "old" technologies (Unix, C, C++ to name a few), because they are functional, elegant and well designed.
Worse is better. HTML and CSS are never going away because of the large volume of content that's written against that platform. Same thing with C. It's a terrible language, but it will always be with us because nearly all software is written in either C or C++.
A huge volume of Java means that's never going away. There's still a market for COBOL programmers.
It's a popular idea among some programmers to get really frustrated at a crufty system like HTML/CSS/JS, and think "Hey, let's demolish it and start over". Well you know, I could invent my own phone that is 100 times better, with better sound quality than any other phone. That's the easy part. The hard part is having someone to call.
Like it or not, HTML/CSS/JS is the technology trio that became popular, and that means millions of people have invested trillions of dollars in producing content for the technology. Millions of people will be quite reluctant to throw that effort away because someone says HTML/CSS/JS sucks.
It's a mysterious thing figuring out what technology is going to become popular. It's not something you can control for the benefit of your own comfort. But at least, you know, there are such lovely computer science concepts as "abstraction", which, if you can master it, will be your secret weapon. jQuery is an example, but it can be taken very much further.
The problem would not be to design something to replace HTML/CSS/JS, but to get browser vendors to adopt it. Good luck with that.
As central as the web experience and HTML are in my life and in your life, the only computer most people in the world use is their phone. The web just doesn't work well even on a multitouch touchscreen.
Just like you no longer enter your game using ROM BASIC before playing it, and you no longer see text-only screens around (well, mostly), some day the web is going to be consumed by specialized devices or by specialized applications for your phone. A machine-readable web, or in other words, web services. You can call that Web 3.0 if you like.
I believe that the problem is not HTML, CSS and JS or their age (they are constantly evolving anyway). Theoretically you should be able to create one version of something and have it work exactly the same across different platforms. Which is where the problem lies: the platforms.
Saying those technologies are old and therefore need replacing is like saying C++ is old and therefore should not be used for game development. They are actually very appropriate and powerful tools for what they were designed for. Therefore I would predict that it's not HTML, CSS and JS that need to or will be replaced, but that the current platforms need to get their shit together (some more than others) and follow the bloody standards!
That said, they do need to keep evolving to stay relevant.
I think XML with XSL is the future. Graphical designer tools would come with their own XSL stylesheets, tailored to the strengths of the tool, and the tool would generate XML files that use the stylesheets.
But I'm no clairvoyant; what's going to be hot in the next 5 years, who knows. :-P
If you listen to Microsoft, Silverlight will feature prominently in the new web. Since it uses XAML, which is just a text file, it has the potential to be search engine friendly.
Others like Flash.
Of course I am sure that something else new will be invented in the future....
I think RIAs like Adobe Flex/AIR/Apollo and Silverlight will eat into some of the HTML market share, but not totally replace it.
Some of the issues that plagued RIAs, like SEO-related problems and lack of back-button support, are being resolved.
The good thing about RIAs is that they are browser-independent (as long as the user has the right plugin); I could foresee future browsers launching with RIA support built in, thereby ensuring 100% market penetration for apps built using them.
For documents and usual web pages, HTML and CSS are there to stay and evolve (Sass is nice indeed).
For applications, mobile code (javascript most probably) driving canvas-like graphics will help merge web apps and internet-enabled desktop applications.
— I wonder how ridiculous these predictions will look in 10 years :)

Alternatives to JavaScript

At the moment, the only fully supported language, and the de-facto standard for DOM tree manipulation in the browser is JavaScript. It looks like it has deep design issues that make it a minefield of bugs and security holes for the novice.
Do you know of any existing or planned initiative to introduce a better (redesigned) language of any kind (not only JavaScript) for DOM tree manipulation and HTTP requests in next-generation browsers? If yes, what is the roadmap for its integration into, say, Firefox, and if no, for what reasons (apart from interoperability) should JavaScript be the only supported language on the browser platform?
I already used jQuery and I also read "JavaScript: The Good Parts". Indeed the suggestions are good, but what I am not able to understand is: why only JavaScript? On the server side (your-favourite-OS platform), we can manipulate a DOM tree with every language, even Fortran. Why does the client side (the browser platform) support only JavaScript?
The problem with javascript is not the language itself - it's a perfectly good prototyped and dynamic language. If you come from an OO background there's a bit of a learning curve, but it's not the language's fault.
Most people assume that Javascript is like Java because it has similar syntax and a similar name, but actually it's a lot more like lisp. It's actually pretty well suited to DOM manipulation.
The real problem is that it's compiled by the browser, and that means it works in a very different way depending on the client.
Not only is the actual DOM different depending on the browser, but there's a massive difference in performance and layout.
Edit following clarification in question
Suppose multiple interpreted languages were supported - you still have the same problems. The various browsers would still be buggy and have different DOMs.
In addition you would have to have an interpreter built into the browser or somehow installed as a plug in (that you could check for before you served up the page) for each language. It took ages to get Javascript consistent.
You can't use compiled languages in the same way - then you're introducing an executable that can't easily be scrutinised for what it does. Lots of users would choose not to let it run.
OK, so what about some sort of sandbox for the compiled code? Sounds like Java Applets to me. Or ActionScript in Flash. Or C# in Silverlight.
What about some kind of IL standard? That has more potential. Develop in whatever language you want and then compile it to IL, which the browser then JITs.
Except, Javascript is kind of already that IL - just look at GWT. It lets you write programs in Java, but distribute them as HTML and JS.
Edit following further clarification in question
Javascript isn't, or rather wasn't, the only language supported by browsers: back in the Internet Explorer dark ages you could choose between Javascript and VBScript to run in IE. Technically IE didn't even run Javascript - it ran JScript (mainly to avoid having to pay Sun for the word Java; Oracle still owns the name Javascript).
The problem was that VBScript was proprietary to Microsoft, but also that it just wasn't very good. While Javascript was adding functionality and getting top-rate debugging tools in other browsers (like Firebug), VBScript remained IE-only and pretty much un-debuggable (dev tools in IE4/5/6 were non-existent). Meanwhile VBScript also expanded to become a pretty powerful scripting tool in the OS, but none of those features were available in the browser (and when they were, they became massive security holes).
There are still some corporate internal applications out there that use VBScript (and some rely on those security holes), and they're still running IE7 (they only stopped IE6 because MS finally killed it off).
Getting Javascript to its current state has been a nightmare and has taken 20 years. It still doesn't have consistent support, with language features (specified in 1999) still missing from some browsers and lots of shims being required.
Adding an alternate language for interpreting in browsers faces two major problems:
Getting all the browser vendors to implement the new language standard - something they still haven't managed for Javascript in 20 years.
A second language potentially dilutes the support you already have, allowing (for instance) IE to have second-rate Javascript support but great VBScript (again). I really don't want to be writing code in different languages for different browsers.
It should be noted that Javascript isn't 'finished' - it's still evolving to become better in new browsers. The latest version is years ahead of the browsers' implementations and they're working on the next one.
Compile to Javascript
For now, using a language which compiles to Javascript seems to be the only realistic way to reach all the platforms while writing smarter code, and this will likely remain the case for a long time. With any new offering, there will always be some reason why one or more vendors will not rush to ship it.
(But I don't really think this is a problem. Javascript has been nicely optimized by now. Machine code is also unsafe if written by hand, but works fine as a compile target and execution language.)
So many options
There is an ever growing pool of languages that compile to Javascript. A fairly comprehensive list can be found here:
List of languages that compile to JS on the Coffeescript Wiki
Noteworthy
I will mention a few I think are noteworthy (while no doubt neglecting some gems which I am unaware of):
Spider appeared in 2016. It claims to take the best ideas of Go, Swift, Python, C# and CoffeeScript. It isn't typesafe, but it does have some minor safety features.
Elm: Haskell may be the smartest language of them all, and Elm is a variant of Haskell for Javascript. It is highly type-aware and concise, and offers Functional Reactive Programming as a neat alternative to reactive templates or MVC spaghetti. But it may be quite a shock for procedural programmers.
Google's Go is aimed at conciseness, simplicity, and safety. Go code can be compiled into Javascript by GopherJS.
Dart was Google's later attempt to replace Javascript. It offers interfaces and abstract classes through a C/Java-like syntax with optional typing.
Haxe is like Flash's ActionScript, but it can target multiple languages so your code can be re-used in Java, C, Flash, PHP and Javascript programs. It offers type-safe and dynamic objects.
Opalang adds syntactic sugar to Javascript to provide direct database access, smart continuations, type-checking and assist with client/server separation. (Tied to NodeJS and MongoDB.)
GorillaScript, "a compile-to-JavaScript language designed to empower the user while attempting to prevent some common errors." is akin to Coffeescript but more comprehensive, providing a bunch of extra features to increase safety and reduce repetitive boilerplate patterns.
LiteScript falls somewhere inbetween Coffeescript and GorillaScript. It offers async/yield syntax for "inline" callbacks, and checking for variable typos.
Microsoft's TypeScript is a small superset of Javascript that lets you place type restrictions on function arguments, which might catch a few bugs. Similarly, BetterJS allows you to apply restrictions, but in pure Javascript, either by adding extra calls or by specifying types in JSDoc comments (a small JSDoc sketch follows this list). And now Facebook has offered Flow, which additionally performs type inference.
LiveScript is a spin-off from Coffeescript that was popular for its brevity but does not look very readable to me. Probably not the best for teams.
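As a small illustration of the JSDoc-comment approach mentioned for BetterJS and TypeScript above, here is a sketch that stays plain JavaScript; the // @ts-check directive asks a TypeScript-aware editor (or tsc with checkJs enabled) to verify the annotations:

    // @ts-check
    /**
     * @param {number} base
     * @param {number} exponent
     * @returns {number}
     */
    function power(base, exponent) {
        return Math.pow(base, exponent);
    }

    power(2, 29);     // fine
    power('2', 29);   // flagged by the checker: a string is not assignable to a number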
How to choose?
When choosing an alternative language, there are some factors to consider:
If other developers join your project in future, how long will it take them to get up to speed and learn this language, or what are the chances they know it already?
Does the language have too few features (code will still be full of boilerplate) or too many features (it will take a long time to master, and until then some valid code may be undecipherable)?
Does it have the features you need for your project? (Does your project need type-checking and interfaces? Does it need smart continuations to avoid nested callback hell? Is there a lot of reactivity? Might it need to target other environments in future?)
The future...
Jeff Walker has written a thought-provoking series of blog posts about "the Javascript problem", including why he thinks neither TypeScript, nor Dart nor Coffeescript offer adequate solutions. He suggests some desirable features for an improved language in the conclusion.
Should JavaScript be the only supported language on the browser platform?
Yes and no. There is an alternative out there called Dart by Google, which does compile to JavaScript and, just like jQuery, tries to make DOM manipulation a bit easier. It may be fun to experiment with, check it out.
From Google, see the Dart language
From Microsoft, see the TypeScript language
See also
Elm
Kal
It is true that Javascript was at one point notoriously hard to deal with but the web development community has come a long way since. Instead, I would encourage you to have a look at jQuery. It's easy and abstracts away all the various problems.
And there really are no alternatives that work across the board. Flash comes to mind, but that too is ECMAScript and it's probably overkill for most things.
Short term, I'd use things like jQuery to hide the browser incompatibilities. Long term, technologies like Silverlight or Adobe AIR may make this a very different minefield (but still a minefield) in the future.
Doug Crockford gave a talk to Google detailing the bad and good parts of JavaScript and its future. It actually hasn't changed much at all since 1999--which can be said to be a good thing (pretty much all browsers can run the same code as long as you're aware of their limitations) and Doug shows where the good parts were mostly misunderstandings that turn out to be very powerful.
For DOM manipulation, look at jQuery as a client-side library that replaces most of the awful DOM API operations that are a pain to write with pretty elegant bits of code that are easier to write.
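For a flavour of what that buys you, compare a small DOM task written both ways (the class names here are made up for illustration):

    // Raw DOM API: verbose but dependency-free.
    var items = document.querySelectorAll('.menu-item');
    for (var i = 0; i < items.length; i++) {
        items[i].classList.add('highlight');
        items[i].addEventListener('click', function (e) {
            console.log('clicked', e.currentTarget.textContent);
        });
    }

    // The same thing with jQuery.
    $('.menu-item')
        .addClass('highlight')
        .on('click', function () {
            console.log('clicked', $(this).text());
        });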
If you're thinking that JavaScript has deep issues, I recommend Doug Crockford's book, JavaScript: The Good Parts. (Or Google for "Crockford JavaScript" to find several video presentations he's done.) Crockford sketches out a safe subset and set of practices, and specifically lists some parts of the language to avoid.
I'm unaware of plans to replace JavaScript as the de facto means of manipulating the DOM. So best learn to use it safely and well.
On the client side, JavaScript is the only way to manipulate the DOM. On the server side, there are a multitude of ways.
Internet Explorer supports pluggable scripting languages, although the only one reliably included with IE besides JScript is VBScript.
As far as I have seen, there seems to be a general sort of bias toward dynamic languages in the browser, and JavaScript seems to fill this need adequately enough that network effects make any other language a non-starter. The language is actually quite powerful, though its implementation in browsers leaves much to be desired.
If you're willing to restrict your customers/visitors to specific browsers, and possibly willing to require them to install a plug-in, you could look at MS Silverlight -- a readable overview is on wikipedia. With Silverlight 2, you can run, client-side, code you've written in C#, IronPython, IronRuby, VB.NET, etc; the free Moonlight clone of Silverlight, from the Mono project, promises to bring the same functionality to Linux.
In practice, most developers of web apps and sites prefer to reach wider audiences than Silverlight (and eventually Moonlight) can currently deliver -- which means sticking with Javascript, or possibly Flash (which uses a similar programming language, Actionscript).
So, gaining substantial mindshare, adoption and traction for anything else is proving to be an uphill fight even for Microsoft with its large groups of engineers and marketing budgets and a free-software project on the side (to possibly ease worries about proprietary lock-in) -- which may help explain why there's very little interest, e.g. on the part of the Mozilla Foundation, in pushing towards such a goal. "Apart from interoperability", you say: but clearly the issue of interoperability is THE biggie here, given what we observe wrt Silverlight's progress...
As already said, you have Flash (ActionScript, which is a derived language from Javascript) and Silverlight/Moonlight (IronPython, IronRuby, JScript, VBScript, C#) that can run in the browser via plugins (the first one being much more ubiquitous).
There is also another alternative if you like Ruby: HotRuby, a Ruby implementation in JavaScript that will run in the browser. It is not very mature yet, but you can have a look at it.
One thing I haven't seen mentioned (oh, I see Alcides mentioned HotRuby while I was writing and Nosredna mentioned GWT and Script#) and would like to throw out there is there are a number of implementations of [insert language]-on-JavaScript (eg. translators that allow you to convert Ruby, Python, C#, Java, Obj-J/Cappuccino [similar to Obj-C/Cocoa] or Processing [for Canvas] to JavaScript either on the client or before deployment [and some of which also feature various abstraction libraries]). Of course there's a performance overhead if it is being translated on the client, but if you are more comfortable with another language it will allow you some flexibility.
Personally, though, I recommend learning to love JavaScript. It's an excellent, powerful language, and pretty elegant once you get to know it. I'm facing the opposite dilemma, chomping at the bit to have a capable server-side JavaScript/DOM solution that meets all of my needs. /unsolicited opinion
JavaScript is the English language of the web. English historically spread because it had a strong navy conquering various countries. This is comparable to the big companies that conquered the web with JavaScript. It's a language cobbled together from multiple European sources (Greek, Latin, Germanic languages, French, even some Chinese and Indian words). JavaScript borrowed a lot of concepts throughout the years from other languages (structural, OO, functional). English is spoken in different places with slight variations in dialect and accent that can render understanding difficult. Just like JavaScript has different browsers interpreting it a bit differently.
Even though English is easy to learn initially, it has very inconsistent pronunciation and more exceptions than rules. Just like JavaScript it's always there to offer a surprise.
Despite the different accents, JavaScript is the lingua franca of the web. Just like you might not be English and write here in English, every web browser has a certain degree of English understanding. IE6 is like the guy who says on his resume that he's fluent, but only went to a two week course on English as a foreign language.
There have been attempts to supplant English as the world's main language, e.g. Esperanto. But all of them failed, because most people on earth speak some English. In the same way, it will be difficult to introduce better alternatives to JavaScript.
jQuery (still JavaScript, but it will really help you): it has support for almost all the browsers and it isn't really that hard to learn :)
No. JavaScript is it, but it will evolve. The next version is "JavaScript Harmony," and you can learn more if you Google that.
Now and then someone suggests putting a byte code interpreter into the browsers alongside JavaScript. Probably won't happen, at least for a while.
I happen to love JavaScript. But there are other solutions, including GWT, which compiles Java to JavaScript and Script#, which compiles C# to JavaScript.
I don't think Javascript will be replaced any time soon. For a completely different approach to rich clients, you might want to investigate Flex, which is a Flash-based technology.
Maybe something like haxe (see haxe.org) could help you. It is a language which seems cleaner than JavaScript and can be compiled down to JavaScript, so it can be run inside a browser.
I know that this is not a direct answer to your question, but I thought it might be interesting for you, nevertheless.
Many people understand that Javascript isn't the best and prettiest language ever. However, it is currently supported by browsers, and thus it will be extremely hard to introduce a different language. We simply don't need another browser war.
This explains why I know of no plans of switching to a different client-side language.
But I think Javascript isn't so bad if you start thinking about the DOM model and how one would work with it. Many things that are messy with JS are the result of the way the DOM model works.
