This is a general question really, and I'm not sure if this is the place for it (it might be deleted as too general), so please don't heckle (I am just curious).
I have been reading up on the WebWorkers API and had a thought.
WebWorkers can be limited to using only small amounts of processing power on each machine/user. This could be tailored so that it does not affect the user experience and only slightly affects browser performance (if at all).
My question is: could they theoretically be used to turn a website/application into a highly distributed supercomputer?
Or is it more of an ethical question: if it could be done, is it wrong if the user is not aware?
Yes, WebWorkers can be used for supercomputing a.k.a. distributed computing.
In fact, that's exactly what CrowdProcess does: http://crowdprocess.com/
DISCLAIMER: I work on CrowdProcess.
Websites can join the platform and supply it with processing power from the browsers that visit them, without disrupting the website visitors' experience in any way.
Developers can use the platform for their distributed computing jobs. Check the documentation to see how this works: http://crowdprocess.com/doc-index
The website visitor can opt-in, opt-out or simply agree with the terms and conditions of the website that provides the platform with the browser's processing power.
We ask the website owners to tell their users what's going on in whatever way they find appropriate for their audience. CrowdProcess is aware that no one should power this platform against their will or without their consent. That's why we develop projects with a higher purpose: forest fire behavioural prediction, genetic sequence alignment and medical computer vision, just to name a few.
Our vision is that one day soon we will have enough commercial applications running on the platform to allow us to pay websites for the processing power they provide.
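For anyone curious about the basic mechanics rather than any particular platform, here is a minimal sketch of the general pattern: the page fetches a work unit from a job server, hands it to a Web Worker so the visitor's UI thread stays responsive, and posts the result back. The file names, endpoints and payload shape below are hypothetical illustrations, not CrowdProcess's actual API.

```js
// main.js - hypothetical page-side code
const worker = new Worker('worker.js');

async function runNextJob() {
  // '/job/next' and '/job/result' are made-up endpoints for illustration only.
  const job = await fetch('/job/next').then((res) => res.json());
  worker.postMessage(job);
}

worker.onmessage = async (event) => {
  // Send the finished result back to the server, then ask for more work.
  await fetch('/job/result', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(event.data),
  });
  runNextJob();
};

runNextJob();

// worker.js - runs off the main thread, so the visitor's UI never blocks
self.onmessage = (event) => {
  const { id, numbers } = event.data;                // example payload shape
  const result = numbers.reduce((a, b) => a + b, 0); // stand-in for real number crunching
  self.postMessage({ id, result });
};
```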
It's possible, unethical and likely illegal.
It is certainly possible to do. In fact, you don't even need to use web workers to do it. It is probably unethical if the user is not aware, but it may not actually degrade the user experience, or even be noticeable. It may also be illegal, and you should get some legal advice.
For example, if you have an application where users are aware that they help fold proteins while playing your game, or something like that, then it may be a great application. If, on the other hand, you want to mine bitcoins using the processing power and electricity of your unsuspecting visitors, then you are asking for trouble.
I have found two projects...
Seti at home http://setiathome.berkeley.edu/
Gives users the chance to donate some processing power to help analyse data from their telescope.
Folding at home http://folding.stanford.edu/English/About
Users can donate processing power to research labs for all sorts of scientific research and study purposes (including protein folding).
It seems it is LEGAL (via WebSockets or Ajax) as long as you give details in the terms and conditions, but it is not recommended, as better ways to do heavy processing exist (see the two examples above).
I am completely new to javascript programming and I have a question that I didn't manage to find an answer for anywhere.
I have recently put together a simple slideshow to remotely view the photos that I host on my home computer. This by itself works fine. The problem I run into is that when I'm viewing photos I don't interact with the hardware, which after some time causes the monitor to switch off. This is particularly annoying when viewing photos on my mobile phone.
My question is: is there a way to prevent this from happening? I am thinking in the direction of faking a mouse or other event every time I refresh the photo, but I have no clue how to do that and if it is possible.
Any help is greatly appreciated!
No. JavaScript on the browser cannot interact with the underlying system. Simulating keystrokes in the browser will not stop the screen saver from turning on. This is for security reasons, so that malicious code can't harm the system when you visit a web page.
Link on JavaScript Security
The modern JavaScript security model is based upon Java. In theory, downloaded scripts are run by default in a restricted “sandbox” environment that isolates them from the rest of the operating system. Scripts are permitted access only to data in the current document or closely related documents (generally those from the same site as the current document). No access is granted to the local file system, the memory space of other running programs, or the operating system’s networking layer. Containment of this kind is designed to prevent malfunctioning or malicious scripts from wreaking havoc in the user’s environment. The reality of the situation, however, is that often scripts are not contained as neatly as one would hope. There are numerous ways that a script can exercise power beyond what you might expect, both by design and by accident.
Over the decade since this question was originally asked, JavaScript has grown to provide much of this OS functionality (usually in a secure manner). "Wake lock" functionality is slowly being implemented. Currently there is a draft for the navigator.getWakeLock interface: https://www.w3.org/TR/wake-lock/#conformance
Chrome (https://developers.google.com/web/updates/2018/12/wakelock) and Mozilla (https://developer.mozilla.org/en-US/docs/Web/API/Screen_Wake_Lock_API) are considering it in various ways.
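The draft API has changed shape over time; the version browsers are shipping today looks roughly like the sketch below. The #start selector is hypothetical, and support still varies, so feature-detect and handle failures.

```js
let wakeLock = null;

async function keepScreenOn() {
  if (!('wakeLock' in navigator)) return;   // feature-detect: not all browsers ship it
  try {
    wakeLock = await navigator.wakeLock.request('screen');
    wakeLock.addEventListener('release', () => console.log('Wake lock released'));
  } catch (err) {
    console.error('Wake lock request failed:', err.message);
  }
}

// The request generally needs a visible page and a user gesture, e.g. starting the slideshow.
document.querySelector('#start').addEventListener('click', keepScreenOn);
```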
No, JavaScript cannot affect the hardware or operating system. Just turn off the monitor's power-saving settings until you're done with the slideshow.
Yes, it's possible now :)
Just use the NoSleep.js library: https://github.com/richtr/NoSleep.js
It works for me with Reveal.js slides on my Android tablet.
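For reference, the usage pattern from the NoSleep.js README is roughly the following; note that enable() has to be called from inside a user input event handler.

```js
// Assumes NoSleep.js has already been loaded on the page.
var noSleep = new NoSleep();

// enable() must be called from a user input event handler.
document.addEventListener('click', function enableNoSleep() {
  document.removeEventListener('click', enableNoSleep, false);
  noSleep.enable();   // keeps the screen awake until noSleep.disable() is called
}, false);
```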
You could do it with a console application written in C# that interacts with the OS.
Since JS is a client-side browser language, it can only interact with the browser.
I hear it mentioned a bit in the tutorials I watch that certain things won't work if JavaScript is disabled. Occasionally I see workarounds.
The question is, are these relevant? I can't imagine anyone not having a JavaScript-enabled browser nowadays, except the most ancient of phones, and chances are your page won't render properly on them anyway.
Do people still bother to write backup code for JavaScript being disabled?
Edit: As a test, I turned JavaScript off. Facebook doesn't work.
Edit: I understand about visually impaired users, but do people care (harsh, yes) if their experience is buggy? Not to sound disrespectful, but not sticking to strict standards will alienate people using Internet Explorer 4 and 5 too, but we don't seem to care about them...?
Edit: Saying that people should do it seems like a very automatic response, considering how many people use jQuery and other groovy add-on libraries.
Edit: I tried a bunch of Fortune 500 sites, and so far about 70% of the ones I tried have broken:
Dell
Walmart
Fedex
Intel
Coca Cola
Yes, we still need backup code for people who have JavaScript disabled.
JavaScript is often used to do things that break in screen readers (so many screen reader users disable it) or to cause changes to appear out of sight of a screen magnifier.
JavaScript is still one of the biggest attack vectors to exploit security holes in browsers.
Add-ons such as NoScript are increasing in popularity.
Search engines tend not to execute it (so you don't want to hide your content behind it)
I prefer to think of it as a foundation rather than backup.
I understand about visually impaired users, but do people care (harsh, yes) if their experience is buggy?
Nasty people don't.
The law (in many jurisdictions) does.
Not to sound disrespectful, but not sticking to strict standards will alienate people using Internet Explorer 4 and 5 too, but we don't seem to care about them...?
IE 4/5 have:
a smaller market share than users without JS
many security holes
no support from their own publisher
As a developer I no longer worry about the 1% of users who turn off JavaScript. It is too time consuming, and development time is too expensive to waste on such nonsense. AJAX saves an incredible amount of bandwidth, which turns directly into $$$ savings, which makes profits higher. If I lose one or two potential users of the site for every 100 users, those one or two lost users will cost a lot more in development than the potential income they could ever bring in.
Try turning off JavaScript and logging into Facebook; it becomes a very broken website after that. If it's good enough for Facebook, it's good enough for me.
Support for JavaScript-disabled web sites is a nice thought, but not of much help, and of questionable value, IMHO.
It is almost impossible to design a robust website without JavaScript, and those who disable JS, for whatever reason, probably don't expect much of a user experience. So if you are coding for that 1% of the population, you have no choice. But for the majority of us, it is a given that JS is there. Accessibility is a different issue, with its own challenges. When I was doing web sites for Hewlett-Packard, they had to meet strict accessibility standards, and it was tough to create anything more than very basic web pages. I wouldn't wish that on anybody.
I have a different opinion to many here. I don't think you necessarily should care in some scenarios, especially if your website is targeting a particular group of people or if supporting it is going to mean a lot of work.
If you refer to:
http://visualrevenue.com/blog/2007/08/eu-and-us-javascript-disabled-index.html
(source: visualrevenue.com)
you can see that, year on year, more browsers than ever have JavaScript enabled, contrary to the other answers' claims. It was at 96.9% in 2007.
So you lose 3% of potential viewers. So what? Your advertising campaign will do a lot more damage than that!
Yes. Especially when it comes to Section 508 and WCAG compliance. While the technologies for creating accessible JavaScript are coming out of their infancy (see ARIA), developers should still be coding sites in a way that does not require JavaScript.
http://www.w3.org/WAI/aria/faq
http://www.w3.org/TR/WCAG10/
All the other answers have most of the points covered, but I'll chime in with this: it's not a big deal to have your page(s) degrade gracefully in the absence of JavaScript. If you've got some super-whizzy, ajax-infested, real-time comet-style app that really isn't going to work without JavaScript, you should at least render a nice message to the effect that JavaScript is required.
It depends on your audience and type of website.
For instance, a graphic artist's portfolio will not be visited by blind people or people using text browsers. So in that case it's not so important to build nicely degradable JS, especially because it will most probably be used for graphic effects.
If, on the other hand, you're developing a news website and you decide (for whatever reason) to dynamically load your news with JS, then you should definitely make it degradable. Also, remember that search engine spiders may have difficulty indexing content loaded with JS on your page.
At the end of the day, in most cases it's not so difficult to program the site so that it works without JS. If you're retrieving content dynamically, you already have the server-side code to load the content; you just need to accommodate how the page is called.
Same thing for forms: you can send the content via Ajax or via a normal POST, and the backend will be pretty much the same, so it's again easy to implement (see the sketch below).
Of course, the problem is not even posed for JS code that is purely graphical.
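To make the "same backend" point concrete, here is a minimal sketch with hypothetical markup and container id: a normal link keeps working as a full page load without JS, and when JS is available the same URL is simply fetched and dropped into the page.

```js
// Hypothetical markup: <a class="news-item" href="/news/42">...</a> and <div id="content">.
// Without JS the links work as normal page loads; with JS we fetch the same URL inline.
document.querySelectorAll('a.news-item').forEach((link) => {
  link.addEventListener('click', async (event) => {
    event.preventDefault();
    const html = await fetch(link.href).then((res) => res.text());
    document.querySelector('#content').innerHTML = html;
  });
});
```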
My recent experience:
My former supervisor claimed, in earnest, that because Google Analytics told us that "87% of our users have Java enabled and less than 3% are using IE6," we didn't have to worry about supporting older browsers or users with JavaScript disabled.
Problem 1: Java is not JavaScript.
Problem 2: In order for Google Analytics to track a hit, the browser must have JavaScript enabled, because the GA interface is a JS include. GA is not, and cannot be, aware of users with JS disabled, which can severely skew its reports.
Problem 3: one of our biggest customers requires that all engineers use IE6 with JS disabled.
Problem 4: The boss (at the time) didn't know how to read analytics reports.
If you want to know how important this support is to your business, a good place to start is the IIS logs. Just about everything related to browser capabilities is stored by IIS. I regularly import the logs into SQL Server and run some basic reports from my admin site, which come in handy every time someone starts suggesting that we go crazy with the jQuery BS.
If you decide to start building complicated, script-dependent interfaces, be sure that your interface degrades gracefully and doesn't remove required functionality if JS is disabled.
It is not merely a question of whether a browser is capable of executing javascript, but if a user has disabled it for some reason.
For example, you need to be aware of vision-impaired users. Such users might disable JavaScript because its effects confuse their screen reader software.
I've been thinking about the access web-based applications have to an OS.
I'm curious:
What is the best way of determining this as it currently stands?
Is the trend leaning toward more, or less, access?
What functionality should be open/closed?
A simple example would be: say your Gmail alerted you in the taskbar when an incoming e-mail is received.
In the past, browsers have shown a strong tendency to prevent all manner of access to the operating system (and also the browser chrome). The fundamental risk is about trust and deception; if all you have done is visited a website, do you want it popping up dialog boxes or reading files on your hard drive? Even worse, you might not have visited the website on purpose; you might have been subject to a phishing attack where a spammy email linked you to a URL that looks like a popular site, but has a "1" instead of an "I".
It's nuanced to say which way the trend is going. In one sense, you can now do more than before, because you have the browsers enabling GPU support and, as part of HTML5, offline storage. All of this is being done with careful consideration of the security issues though, ensuring sandboxing takes place. On the other hand, you have browsers locking down on things like cookies on the file:// URI.
Many apps are increasingly web apps, but it's not as simple as just pointing to the web app in your browser. It might be an app you installed as a mobile web widget, or purchased in an app store like the Palm Pre's, where most apps are basically just web apps. The point is, the trust scenario is different in each case; I'd feel more confident giving certain OS access to an app from a reputable app store, where the code has been inspected, the producer has signed it, and the app's actions and privileges can be tracked ... than a random website I happened upon.
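As one concrete data point on where the trend has gone since, desktop notifications of exactly the Gmail kind are now exposed behind an explicit permission prompt. A minimal sketch using the standard Notifications API (the notifyNewMail helper is just for illustration):

```js
async function notifyNewMail(subject) {
  if (!('Notification' in window)) return;              // feature-detect first
  const permission = await Notification.requestPermission();
  if (permission === 'granted') {
    new Notification('New e-mail', { body: subject });  // shows a system-level notification
  }
}

notifyNewMail('Hello from a web app');
```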
Any access that is given is going to be abused by malware. Guaranteed.
I think the trend is toward less access; consider Google's Chrome OS, in which you do all of your work within web applications, which have no native access to your system at all.
I understand that server-side validation is an absolute must to prevent malicious users (or simply users who choose to disable javascript) from bypassing client-side validation. But that's mainly to protect your application, not to provide value for those who are running browsers with javascript disabled. Is it reasonable to assume visitors have javascript enabled and simply have an unusable site for those who don't?
I browse with NoScript in Firefox, and it always annoys me when I get pages that don't work. That said - know your audience. If you're trying to cater to paranoid computer security professionals - assume they might not have JavaScript enabled. If you're going for a general audience, JavaScript is probably on.
Totally depends on who you're aiming at.
If your site or app is for an intranet, you can make lots of assumptions. If your target audience is bleeding-edge social-networking types, you can assume JavaScript will work. If you anticipate a lot of paranoid sysadmin types, you can assume a bunch of them will be trying to access your site in Lynx or with JS turned off for "security reasons."
A good example of this is Amazon -- their approach is driven by their business goals. They are a mass-market site, but for them, locking out users in old/incapable browsers means potential lost sales, so they work hard on non-script fallbacks.
So like lots of these kinds of questions, the answer is not just regurgitating what you've read somewhere about accessibility or progressive enhancement. The real answer is "it depends."
I think there is another reason to support at least the main functionality without JS: lots of us now browse from mobiles and PDAs, which don't have the same level of JavaScript support.
http://www.w3schools.com/browsers/browsers_stats.asp
They claim 95% of users have JavaScript on.
Duplicate
There's at least one category where the answer is definitely "no". If you work for the government, you must make sure the site is accessible to those using screen readers.
Is it reasonable to assume visitors have javascript enabled and simply have an unusable site for those who don't?
There are actually two questions here, and the answers are: yes, it is reasonable to assume visitors have JavaScript enabled; and no, this does not mean the others should be left with an unusable site.
Progressive enhancement is the way to go. Have your site usable without JavaScript and then add the bells and whistles.
As for client-side validation, it is no more than a convenience for the user, avoiding unnecessary round trips to the server (where the real validation should be performed).
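To illustrate, a minimal sketch of that approach (the form id, field name, and endpoint are hypothetical): the form submits as a normal POST without JavaScript, and when JavaScript is available it adds the convenience check and an Ajax submit on top.

```js
// Assumes plain markup like <form id="signup" method="post" action="/signup"> with an "email" field.
const form = document.querySelector('#signup');

if (form) {
  form.addEventListener('submit', async (event) => {
    event.preventDefault();                          // only runs when JS is available
    const email = form.elements.email.value;
    if (!email.includes('@')) {                      // convenience check only;
      alert('Please enter a valid email address.');  // the server must validate again
      return;
    }
    await fetch(form.action, { method: 'POST', body: new FormData(form) });
  });
}
```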
I browse with the NoScript plugin in Firefox and I'm surprised at the number of developers who haven't even considered making their site degradable.
Never assume the user has JavaScript enabled - especially as it may not always be their fault when it's unavailable. Many enterprises have firewalls which block JavaScript/ActiveX etc. In this instance the <noscript> element won't work either, so I would NOT recommend relying on that!
Unless you're creating a full-on web application which is going to be 90% Ajax, you must make sure to abide by standards and progressively enhance your site through various layers of interactivity.
Also, don't forget the importance of object detection, especially with the rise of mobile phone web browsing. One of the most popular mobile web browsers (Opera Mini 4.0) doesn't allow all "background JavaScript" to work, and Ajax calls rarely execute correctly... Just something to be aware of.
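For readers unfamiliar with the term, "object detection" here just means testing for the capability itself rather than sniffing the user-agent string; a minimal sketch:

```js
function supportsAjax() {
  return typeof window.XMLHttpRequest !== 'undefined';  // test the object, not the browser name
}

if (supportsAjax()) {
  // enhance: load content in the background
} else {
  // degrade: rely on normal links and form posts
}
```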
To be honest I am sick and tired of developers that think everyone will have JS enabled! What ignorance!!
Yes it is. But expose as much of it as possible through regular HTML and URLs, if for nothing else than for Google.
Accessible, yes... functional? Not really.
This is really a customer requirement question more than developer-answerable, but if your customer tries to enforce a requirement that non-JS browsers work, you should argue heavily against it and really hammer them on the "cool" factor they'll be missing.
Given the heavy reliance by GWT, RichFaces, etc. on Javascript, it's just not feasible to make an app with any kind of user-friendly UI without it.
You should certainly warn non-JS enabled users that the site they're trying to visit relies heavily on JS, though. No point in being rude about it.
It is OK these days to assume your visitors have JS enabled. With that said, you should strive for the best possible degradation of your site with JS disabled. Ideally, your site should fall back to a state that is still usable without JS.
No! Some environments will have it disabled as a matter of policy, with nothing you can do to enable it. And even if it's enabled, it might be crippled.
This question has been asked before.
One interesting point to consider is that as a web developer you have a social responsibility to push technology forward - and by using things like AJAX, you increase exposure and potentially the rate of adoption along with it. The only thing that should stop you from using the tech to its fullest extent is money: if you won't make the money that you need because people will have trouble viewing the material, you've got to reconsider.
Never, ever rely on JavaScript alone for form validation, as your question implies you might. Someone will eventually realise this and turn JavaScript off.
Instead, code the app in a fairly regular HTML manner and use JavaScript for what it is: an optional perk for your users.
Even for an entirely AJAX app like Gmail, the complete set of form validations is still required on the server side.
Yes it is. JavaScript is as old as CSS, and no one tries to build around browsers that don't support CSS. Cross-site scripting is the reason people are afraid of JavaScript, but believe me, if a developer wants to screw you over, he doesn't need JavaScript to do it. As far as mobile browsers go, most of them now have JavaScript, and the others shouldn't be considered browsers. My advice is not to open yourself up to hackers by making your site vulnerable to those who choose to turn off their JavaScript, but at the same time don't go out of your way to support those who are living in the stone age. You aren't going to support IE 4 or Netscape, right? Then why support those who sabotage their own browsers out of blatant fear or paranoia?
I think it's fair to assume that the majority of visitors to your site will have JavaScript enabled. Some of the more trafficked sites out there have a dependency on JavaScript. For example, I was surprised to learn that you can't authenticate through a Passport-enabled site without a JS-enabled browser.
Nearly all (but not quite all!) users will have javascript enabled. (I believe the figure quoted above of about 5% is accurate.)
Given the vast improvement in usability you can make with the judicious use of javascript, my opinion is that most of the time, it is reasonable to assume it is enabled.
There will of course be some instances where that is not the case (i.e. a site designed for mobile devices, or one with a high percentage of disabled users, etc.), and an effort should always be made to make your site as accessible as possible to as large a percentage of the population as possible.
That said, if you only have a low-traffic site, 5% of a small number is a very small number. It may not be worth bending over backwards to make your site accessible to these people when it may only gain you one or two extra users.
I guess the short answer is (as always) that there is no correct answer: it will depend entirely on the target use, and target users, of the site in question.
According to this little page, JavaScript is enabled in 95% of browsers, and that figure keeps rising.
The W3C Browser Statistics page (scroll down) has some information on this; they say that 95% of visitors have JavaScript on as of January 2008.
It's reasonable to assume your visitors have javascript enabled !-)
-- but of course it depends on who you're trying to reach ...
Several times above w3schools has been mentioned, and, as Dan stated, it's their own visitors, which makes it somewhat quirky to draw conclusions from.
However, if you look at theCounter.com it seems that their audience has the same habits in general on this point ...
A twist that hasn't been mentioned yet is the growing number of crawlers, mail harvesters and so on; they definitely do not have JavaScript turned on, and how good are counters at detecting them ?-)
My guess would be that this sort of machine browser fills up a lot of those 5-6% !o]
-- that said, if it's at all possible, make your app degrade gracefully (as a wise man said)
Your question seems to suggest form-based input for an application. If it's an intranet application, then you'd be guided by the in-house security experts. If it's a public app, then, as other posters have suggested, fail gracefully.
I will argue that it is more than reasonable to expect visitors to have JavaScript, so long as you provide suitable means to replace it should it not be enabled. One of the reasons that I like the Yahoo UI Library is that it degrades gracefully.
I always try to code my sites as static ones first, then I add JS/Ajax functionality. This way I can be kinda sure they will work in non-JS browsers :)
But JavaScript is like Flash: all users have it, but the developer still has to worry about the WHAT IF...? :D
No it's not, period, full stop, end of story. It's just naive and wrong at an ethical level, not to mention that you miss out on around 50% of Internet users worldwide (believe it or not, 70% of web access worldwide is from mobile devices).
Add extra nifty stuff that requires JavaScript; that's fine. Don't make your site unusable without JavaScript unless you have a really, really, really good reason to do so.
Someone rightly pointed out that I don't have evidence to back up my claim of 70% mobile web users. Unfortunately I can't find the source I got it from, but I remember it being authoritative, so I have no reason to doubt it. It does make sense, though, when you consider worldwide usage: many developing countries have more mobile phones than landlines and broadband. A statistic quoted in my not-to-be-found source was that one African country in particular has 300,000 landlines but 1.5 million mobile phones!
This is totally a "it depends" question, as many people have pointed out.
This is why metrics are valuable on sites: they help show whether you can really run with the analogy that "major sites say that the majority of people have JS on" - you could have a site where it's 99%.
I won't dig into what's been said above, as it's been answered very well :)
Not that everyone else hasn't chimed in, but I disagree with the "look at your audience" position to some extent.
It really should be "look at your app." If you are just displaying some information and your JS is for bell/whistle purposes, then certainly look at nice degradation, if you want to.
However, if you're building something like Google Docs, it's really asinine that someone would think they could use your site without JS, so perhaps let them know that via a nice sarcastic message inside <noscript> tags.
From a purely philosophical point of view, if users want to access your site, they will flip the js switch, or upgrade to a decent browser, etc. And you should force them to do this, because evolution is important for the survival of the species.
According to this site, 95% of browsers use JavaScript.
That said, there are a LOT of bots that don't use JavaScript: scrapers, search bots, etc. I'd say closer to 100% of actual human users use JavaScript. But your guess is just as good as mine.