EU Cookies Law and JavaScript

This is probably not a technical question but more a legislative one:
In recent years, European countries have had to comply with the EU cookie law (in short, every web site must inform its users about, and ask their agreement to, every kind of cookie the site uses).
We recently spent some time analyzing possible implementations of the informational banner/pop-up. Our doubt concerns JavaScript implementations, for example those suggested at https://www.cookiechoices.org/ with Google's collaboration.
These implementations - pure JavaScript - only work if the browser has JavaScript enabled, so what happens if a browser disables JavaScript? Would the web site then be breaking the law?

The mechanism used to show agreement is quite simple; it is explained on the tool's first page and can be implemented in server-side code just as well as in JavaScript:
With each tool, you can customise a notice or statement directed to your visitors and invite your visitors to close the message (using wording of your choice such as "close", "accept" or "I agree"). Users also have the option of clicking a link to another page, such as a page that explains how your site uses cookies (again, you choose the wording for this link text). If users close the message, a cookie will be set to your domain. It is called 'displayCookieConsent' and has the value 'y'.
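For illustration, a minimal client-side sketch of that mechanism, assuming a banner element with id 'cookie-banner' and an accept button with id 'cookie-accept' (both assumptions; the cookie name and value come from the description quoted above):

// Show the notice only if the consent cookie has not been set yet.
function hasConsentCookie() {
  return document.cookie.split('; ').indexOf('displayCookieConsent=y') !== -1;
}

function showBannerIfNeeded() {
  if (hasConsentCookie()) return;
  var banner = document.getElementById('cookie-banner');
  banner.style.display = 'block';
  document.getElementById('cookie-accept').onclick = function () {
    // Persist consent for one year, then hide the banner.
    document.cookie = 'displayCookieConsent=y; max-age=' + (60 * 60 * 24 * 365) + '; path=/';
    banner.style.display = 'none';
  };
}

showBannerIfNeeded();

Because the same 'displayCookieConsent' cookie is sent with every request, server-side code can read it and render the notice in plain HTML instead, which covers visitors who have JavaScript disabled.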
Simply put an iframe in the div to communicate cookie preferences server-side.
Although your question specifically relates to the tools available from the cookiechoices website, you are no doubt aware that Article 5(3) of the ePrivacy Directive has a much wider scope and conditional context that may negate the need for this kind of consent, or indeed, depending on your application, require the capability to handle the scenario where the user explicitly refuses cookies. Hence whether your site would be illegal (which also raises the question of how the directive is implemented in different jurisdictions) is highly dependent on what your site does and how it uses cookies. (e.g. see this discussion of exemptions)

Related

Why CasperJS and browsers show different behaviors with CAPTCHA? [duplicate]

Is there any way to consistently detect PhantomJS/CasperJS? I've been dealing with a spate of malicious spambots built with it and have been able to mostly block them based on certain behaviours, but I'm curious if there's a rock-solid way to know if CasperJS is in use, as dealing with constant adaptations gets slightly annoying.
I don't believe in using Captchas. They are a negative user experience and ReCaptcha has never worked to block spam on my MediaWiki installations. As our site has no user registrations (anonymous discussion board), we'd need to have a Captcha entry for every post. We get several thousand legitimate posts a day and a Captcha would see that number divebomb.
I very much share your take on CAPTCHA. I'll list what I have been able to detect so far, for my own detection script, with similar goals. It's only partial, as there are many more headless browsers.
It's fairly safe to use exposed window properties to detect/assume those particular headless browsers:
window._phantom (or window.callPhantom) //phantomjs
window.__phantomas //PhantomJS-based web perf metrics + monitoring tool
window.Buffer //nodejs
window.emit //couchjs
window.spawn //rhino
The above is gathered from the JSLint docs and from testing with PhantomJS.
Browser automation drivers (used by BrowserStack or other web capture services for snapshots):
window.webdriver //selenium
window.domAutomation (or window.domAutomationController) //chromium based automation driver
The properties are not always exposed and I am looking into other, more robust ways to detect such bots, which I'll probably release as a full-blown script when done. But that mainly answers your question.
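As an example, a hedged sketch that folds the property checks above into one test (it only catches tools that actually expose these properties, so absence proves nothing):

function looksLikeKnownHeadlessTool() {
  return !!(window._phantom || window.callPhantom ||                 // PhantomJS
            window.__phantomas ||                                    // phantomas
            window.Buffer ||                                         // Node-like environment
            window.emit ||                                           // couchjs
            window.spawn ||                                          // Rhino
            window.webdriver ||                                      // Selenium
            window.domAutomation || window.domAutomationController); // Chromium automation
}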
Here is another fairly sound method to detect JS capable headless browsers more broadly:
if (window.outerWidth === 0 && window.outerHeight === 0) { /* headless browser */ }
This should work well because these properties default to 0 even if the headless browser sets a virtual viewport size: by default it can't report the size of a browser window that doesn't exist. In particular, PhantomJS doesn't support outerWidth or outerHeight.
ADDENDUM: There is, however, a Chrome/Blink bug with the outer/inner dimensions: Chromium does not report those dimensions when a page loads in a hidden tab, such as when restored from a previous session. Safari doesn't seem to have that issue.
Update: It turns out iOS Safari 8+ has a bug where outerWidth & outerHeight are 0, and a Sailfish WebView can too. So while it's a signal, it can't be used alone without being mindful of these bugs. Hence the warning: please don't use this raw snippet unless you really know what you are doing.
PS: If you know of other headless browser properties not listed here, please share in comments.
There is no rock-solid way: PhantomJS and Selenium are just software used to control browser software, instead of a user controlling it.
With PhantomJS 1.x, in particular, I believe there is some JavaScript you can use to crash the browser by exploiting a bug in the version of WebKit it embeds (it is equivalent to Chrome 13, so very few genuine users should be affected). (I remember this being mentioned on the Phantom mailing list a few months back, but I don't know if the exact JS to use was described.) More generally, you could combine user-agent matching with feature detection. E.g. if a browser claims to be "Chrome 23" but does not have a feature that Chrome 23 has (and that Chrome 13 did not have), then get suspicious.
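As a rough illustration of that cross-check (the version threshold and the chosen feature are assumptions, not a definitive fingerprint; pick ones that match the browsers you actually care about):

function userAgentDisagreesWithFeatures() {
  // The UA string claims a reasonably modern Chrome...
  var claimsModernChrome = /Chrome\/(2[0-9]|[3-9][0-9])/.test(navigator.userAgent);
  // ...but a feature such a Chrome should have is missing.
  var hasModernFeature = typeof window.requestAnimationFrame === 'function';
  return claimsModernChrome && !hasModernFeature;
}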
As a user, I hate CAPTCHAs too. But they are quite effective in that they increase the cost for the spammer: he has to write more software or hire humans to read them. (That is why I think easy CAPTCHAs are good enough: the ones that annoy users are those where you have no idea what it says and have to keep pressing reload to get something you recognize.)
One approach (which I believe Google uses) is to show the CAPTCHA conditionally. E.g. users who are logged-in never get shown it. Users who have already done one post this session are not shown it again. Users from IP addresses in a whitelist (which could be built from previous legitimate posts) are not shown them. Or conversely just show them to users from a blacklist of IP ranges.
I know none of those approaches are perfect, sorry.
You could detect PhantomJS on the client side by checking the window.callPhantom property. The minimal client-side script is:
var isPhantom = !!window.callPhantom;
Here is a gist with proof of concept that this works.
A spammer could try to delete this property with page.evaluate and then it depends on who is faster. After you tried the detection you do a reload with the post form and a CAPTCHA or not depending on your detection result.
The problem is that you incur a redirect that might annoy your users. This will be necessary with every client-side detection technique, and each of them can be subverted and changed with onResourceRequested.
Generally, I don't think that this is possible, because you can only detect on the client and send the result to the server. Adding the CAPTCHA combined with the detection step in only one page load does not really add anything, as it could be removed just as easily with PhantomJS/CasperJS. Defence based on the user agent also doesn't make sense, since it can be easily changed in PhantomJS/CasperJS.

Javascript that Creates an Outlook Appointment - Browser Issue

I was looking into a script embedded in a webpage that creates an Outlook appointment and opens it. I tested a sample appointment shared by Brian White: http://www.winscripter.com/WSH/MSOffice/90.aspx
and embedded it in a sample web page, but ran into two problems:
The script works only in IE and not in any other browser.
IE issues a security message about an ActiveX control and asks whether to enable it.
Do you have any idea how to make it work in all browsers and not to scare users with the ActiveX warning?
Thank you in advance!
The script you've linked to works by creating an instance of the Outlook ActiveX control. As such, no, there's no way to make this work in browsers that don't support ActiveX, which is effectively all of them except Internet Explorer.
As for not scaring the users with the ActiveX dialog box, that's not in your hands. The warning message is a security feature, part of the browser itself, and can only be disabled by changing the browser's settings - which isn't something you can do through code, for obvious reasons!
If it's appropriate to your situation, rather than doing this through client-side JavaScript you could instead use Exchange Web Services on the server side. This comes with its own set of limitations and things to be aware of, namely (a) it's obviously impossible to open Outlook with this method, and (b) on the server side you'd need access to the Exchange server and would need to know the username/password of an Exchange user with permission to write to the relevant calendar (which is only going to happen if we're talking about a corporate environment).
Although I realize it is an old post, I wanted to offer another approach.
I notice your question refers specifically to Outlook appointments, but what about using iCalendar?
http://en.wikipedia.org/wiki/ICalendar
This could offer a wider solution. A page could also offer two alternative icons: one for Outlook, another using iCalendar, and let the user choose which one to use.
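A minimal sketch of the iCalendar route (all field values are placeholders; dates use the iCalendar UTC format):

// Build a minimal .ics file and offer it as a download link.
var ics = [
  'BEGIN:VCALENDAR',
  'VERSION:2.0',
  'PRODID:-//Example//Sample//EN',
  'BEGIN:VEVENT',
  'UID:sample-appointment@example.com',
  'DTSTART:20240101T090000Z',
  'DTEND:20240101T100000Z',
  'SUMMARY:Sample appointment',
  'END:VEVENT',
  'END:VCALENDAR'
].join('\r\n');

var link = document.createElement('a');
link.href = 'data:text/calendar;charset=utf-8,' + encodeURIComponent(ics);
link.setAttribute('download', 'appointment.ics');
link.textContent = 'Add to calendar (iCalendar)';
document.body.appendChild(link);

Since Outlook also opens .ics files, a single link like this can often cover both cases.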
Hope this helps. Cheers.
Marcelo F.

Accessing a user's history using JS

I'm not looking for code/how to. Just knowledge.
A client has just come to us with a question: can we access the user's history from within a banner advert to give them some targeted advertising based on their history?
Obviously, this presents a privacy issue, but I need to give a good case for why it is technically not a viable option.
So I have a few questions...
Which browsers, if any, still support accessing a user's history using window.history?
If some do and some don't, when did those that don't allow it stop allowing it?
If all browsers allow it (I have yet to find a script that works), why is it not commonly used?
Finally,
Having been on Amazon.co.uk, I then go to Macrumors.com and the adverts give me adverts based on products I have bought/looked at. I'm guessing this is just based on cookies/a system that amazon has implemented?
Just to make things clear:
I know it is a privacy issue. I am not looking for code/a way to do it (as I mentioned above)
There are ways to "sniff" for visited links within a page.
There used to be a way, using the JavaScript history object, to list all the entries in your history (from the current site). history.length still works now. I seem to remember some browsers returning only undefined for each item, and some returning them as an unreadable object.
No!
There's no browser (that I know of) that legitimately gives you access to a user's browsing history.
There have been incidents where it was possible to do so by exploiting certain behaviors of the browser. Recently, Firefox 16 had a vulnerability that, if exploited properly, allowed you to peek into the user's browsing history.
In the case you're describing (Amazon), yes, cookies are used. To be more accurate, Third-Party Cookies are used.
Update:
I was very interested in your last edit (about history being completely open in the past), so I tried to go back a little.
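For reference, the classic "sniff for visited links" trick worked roughly like this (a historical sketch only: modern browsers deliberately report the unvisited style from getComputedStyle, so this no longer leaks anything):

// Relies on a stylesheet rule such as  a:visited { color: rgb(255, 0, 0); }
function probablyVisited(url) {
  var a = document.createElement('a');
  a.href = url;
  document.body.appendChild(a);
  var color = window.getComputedStyle(a).color;
  document.body.removeChild(a);
  return color === 'rgb(255, 0, 0)';
}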

Protect against browser extension injected Javascript code

Browsers allow extensions to inject code, manipulate the DOM, etc.
Over the years, I have noticed many and various uncaught errors (using window.onerror) on a website (app) I am watching, generated by unknown browser extensions on Firefox, Chrome and Internet Explorer (all versions).
These errors didn't seem to be interrupting anything. Now I want to increase the security of this website, because it will start processing credit cards. I have seen with my own eyes malware/spyware infecting browsers via modified browser extensions (an innocent browser extension, modified to report to attackers/script kiddies) working as keyloggers (using trivial onkey* event handlers, or just input.value checks).
Is there a way (meta tag, etc.) to tell the browser to disallow code injection or reading of the DOM, standard or non-standard? The page is already served over SSL, yet this doesn't seem to matter (as in, it gives the browser no hint to activate stricter security for extensions).
Possible workarounds (kind of a stretch vs. a simple meta tag) suggested by others or off the top of my head:
Virtual keyboard for entering numbers + non textual inputs (aka img for digits)
remote desktop using Flash (someone suggested HTML5, yet that doesn't stop a browser extension listening for keyboard events; only Flash, Java, etc. can).
Very complex JavaScript-based protection (removing non-whitelisted event listeners, keeping input values in memory while the inputs display actual asterisk characters, etc.) (not feasible, unless it already exists)
Browser extension with the role of an antivirus or which could somehow protect a specific webpage (this is not feasible, maybe not even possible without creating a huge array of problems)
Edit: Google Chrome disables extensions in Incognito Mode; however, there is no standard way to detect or automatically enable Incognito Mode, so a permanent warning must be displayed.
Being able to disable someone's browser extension usually implies taking over the browser. I don't think it's possible; it would be a huge security risk. Your purpose may be legit, but consider the scenario of webmasters programmatically disabling ad blockers so that users view the advertisements.
In the end it's the user's responsibility to make sure they have a clean OS when making online banking transactions. It's not the website's fault that the user is compromised.
UPDATE
We should wrap things up.
Something like:
<meta name="disable-extension-feature" content="read-dom" />
or
<script type="text/javascript">
Browser.MakeExtension.MallwareLogger.to.not.read.that.user.types(true);
</script>
don't exist, and I'm sure they won't be implemented in the near future.
Use any means necessary to make the best of the existing, up-to-date technologies and design your app as securely as you can. Don't waste your energy trying to cover for users who shouldn't be making payments over the internet in the first place.
UPDATE (2019-10-16): This isn't a "real" solution - meaning you should not rely on this as a security policy. Truth is, there is no "real" solution, because malicious add-ons can hijack/spoof JavaScript in a way which is not detectable. The technique below was more of an exercise for me to figure out how to prevent simple key logging. You could expand on this technique to make it more difficult for hackers... but Vlad Balmos said it best in his answer: don't waste your energy trying to cover for users who shouldn't be making payments over the internet in the first place.
You can get around the key logging by using a javascript prompt. I wrote a little test case (which ended up getting a little out of hand). This test case does the following:
Uses a prompt() to ask for the credit card number on focus.
Provides a failsafe when users check "prevent additional dialogs" or if the user is somehow able to type in the CC field
Periodically checks to make sure event handlers haven't been removed or spoofed and rebinds/ warns the user when necessary.
http://jsfiddle.net/ryanwheale/wQTtf/
prompt('Please enter your credit card number');
Tested in IE7+, Chrome, FF 3.6+, Android 2.3.5, iPad 2 (iOS 6.0)
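A stripped-down sketch of the focus-to-prompt idea described above (the element id is an assumption; the fiddle linked earlier adds the failsafes and the periodic rebinding):

var ccField = document.getElementById('cc-number'); // assumed input id
ccField.addEventListener('focus', function () {
  // The prompt dialog is native browser UI, so ordinary page-level
  // key listeners never see these keystrokes.
  var value = prompt('Please enter your credit card number');
  if (value !== null) {
    ccField.value = value;
  }
  ccField.blur();
});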
Your question is interesting and thoughtful (+1'd); however, unfortunately the proposed feature does not provide real security, so no browser will ever implement it.
One of the core principles of browser/web/network security is to resist the desire to implement a bogus security feature. The web would be less secure with the feature than without it!
Hear me out:
Everything executed on the client side can be manipulated. Browsers are just another HTTP client that talks to the server; the server should never, ever trust the computation results or checks done in front-end JavaScript. If someone can simply bypass your "security" check code executed in a browser with an extension, they can just as surely fire the HTTP request directly at your server with curl. At least in a browser, skilled users can turn to Firebug or the Web Inspector and bypass your script, just like you do when you debug your website.
A <meta> tag stopping extensions from injecting code would make the website more robust, but not more secure. There are a thousand ways to write robust JavaScript other than praying that no evil extension is present: hiding your global functions/objects is one of them, and performing environment sanity checks is another. Gmail checks for Firebug, for example. Many websites detect ad blockers.
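Not a security boundary, just a small illustration of that kind of environment sanity check (which APIs to test is an assumption, and a determined extension can spoof even this):

function looksNative(fn) {
  // Native browser functions stringify to something containing
  // "[native code]"; an overridden copy usually does not.
  return typeof fn === 'function' &&
         Function.prototype.toString.call(fn).indexOf('[native code]') !== -1;
}

if (!looksNative(document.addEventListener) ||
    !looksNative(XMLHttpRequest.prototype.send)) {
  // Something has wrapped core APIs; log it or warn the user.
  console.warn('Core DOM APIs appear to have been overridden.');
}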
The <meta> tag does make sense in terms of privacy (again, not security). There should be a way to tell the browser that the information currently present in the DOM is sensitive (e.g. my bank balance) and should not be exposed to third parties. Yet, if a user runs an OS from vendor A, a browser from vendor B and an extension from vendor C without reading through its source code to know exactly what it does, the user has already placed trust in those vendors. Your website is not at fault here. Users who really care about privacy will turn to their trusted OS and browser, and use another profile or the browser's private mode to check their sensitive information.
Conclusion: if you do all the input checks on the server side (again), your website is secure enough that no <meta> tag can make it more secure. Well done!
I saw something similar being done many times, although the protection was directed in the other way: quite a few sites, when they offer sensitive information in a form of text would use a Flash widget to display the text (for example, e-mail addresses, which would be otherwise found by bots and spammed).
A Flash applet can be configured to reject any code that comes from the HTML page; actually, unless you specifically allow it, this will not work out of the box. Flash also doesn't re-dispatch events to the browser, so if the keylogger works at the browser level, it won't be able to log the keys pressed. Certainly, Flash has its own disadvantages, but given all other options this seems the most feasible one. So you don't need remote desktop via Flash; a simple embedded applet will be just as good. Also, Flash alone can't be used to make a fully functional remote desktop client; you'd be looking at NaCl or JavaFX, which would make this usable only by corporate users, and only eventually by private users.
Other things to consider: write your own extension. Making a Firefox extension is really easy, and you could reuse a lot of your JavaScript code since extensions are also written in JavaScript. I never wrote a Google Chrome or MSIE extension, but I would imagine it's not much more difficult. You don't need to turn it into an antivirus extension, though. With the tools available, you could make it so no other extension can eavesdrop on what's going on inside your own extension. I'm not sure how warmly your audience will greet that, but if you are targeting the corporate sector, then that audience is, in a way, a very good one, as they don't get to choose their tools... so you can just oblige them to use the extension.
Any more ideas? Well, this one is very straightforward and efficient: have users open a pop-up window / separate tab and disable JavaScript in it :) I mean, you could decline to accept credit card info if JavaScript is enabled in the browser - obviously, that is very easy to check. This would require some mental effort from users to find the setting where they can disable it, plus they will be raging over a pop-up window... but almost certainly this will disable all code injection :)
This probably won't work, but I'd try something around document.createElement = function(){};
That should affect client-side scripts (Greasemonkey).
You can also try to submit the current DOM using a hidden input:
myform.onsubmit = function(){ myform.hiddeninput.value = document.body.innerHTML; };
and check on the server side for unwanted DOM elements. I guess using a server-side generated id/token on every element can help here (as an injected DOM node will surely miss it).
=> page should look like
<html uniqueid="121234"> <body uniqueid="121234"><form uniqueid="121234"> ...
So finding untracked elements in the POSTed DOM should be easy (using XPath, for example):
<?php
// Sketch: find any element in the posted DOM that lacks the expected attribute
$untracked = simplexml_load_string($_POST['currentdom'])->xpath('//*[not(@uniqueid)]');
Something around that for the DOM injection issue.
As for the keylogging part, I don't think you can do anything to prevent keyloggers from a client-side perspective (except using a virtual keyboard and the like), as there is no way to tell them apart from the browser internals. If you are paranoid, you could try a 100% canvas-generated design (mimicking HTML elements & interactions), as this might protect you (no DOM elements to bind to), but that would mean creating a browser within the browser.
And since we all know we cannot explicitly block extensions from our code,
another approach is to inspect the list of event listeners attached to key fields like password and SSN, as well as body-level events like keypress, keyup and keydown, and verify whether each listener belongs to your code; if not, show a flash message asking the user to disable add-ons.
You can also watch for mutations on your page and see whether new nodes are being created/generated by third-party code rather than your own; a rough sketch follows.
OK, it's obvious that you will run into performance issues, but that's a trade-off for your security.
Any takers?
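A rough sketch of the mutation idea, using the MutationObserver API rather than the deprecated mutation events (the data-own attribute and the warning hook are assumptions):

// Watch for nodes injected after page load.
var observer = new MutationObserver(function (mutations) {
  mutations.forEach(function (m) {
    Array.prototype.forEach.call(m.addedNodes, function (node) {
      // Assumption: everything rendered by our own code carries data-own.
      if (node.nodeType === 1 && !node.hasAttribute('data-own')) {
        console.warn('Unexpected node injected:', node);
      }
    });
  });
});
observer.observe(document.body, { childList: true, subtree: true });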

Is it worth it to code different functionality for users with javascript disabled?

I'm currently building a project and I would like to make use of some simple javascript - I know some people have it disabled to prevent XSS and other things. Should I...
a) Use the simple javascript, those users with it disabled are missing out
b) Don't use the simple javascript, users with it enabled have to click a little more
c) Code both javascript-enabled and javascript-disabled functionality
I'm not really sure as the web is always changing, what do you recommend?
Degrade gracefully - make sure the site works without JavaScript, then add bells and whistles for those with JavaScript enabled.
Everyone else has contributed good points, but there are a few other considerations to make.
Sometimes the JavaScript will be hosted on a different domain and be prone to timeouts.
Sometimes that domain may become inaccessible while your site remains accessible. It's not good to have your site fall over completely in this scenario.
For this reason, "blocking" scripts (i.e. inline document.write, like that present in Google's tracker) should be avoided, or at the very least should go as late in the page as possible, so the page renders whether or not the domain is timing out requests.
If you happen to be serving JS from a broken/malicious server, by intent or by accident, page rendering can be halted simply by having the script that serves that JavaScript call "sleep(forever)" once it's sent all the headers.
Some People Use NoScript
As with the above problem, sometimes the client's environment may block certain script sources, be it the user's choosing or for other reasons (i.e. browser security policies, odd antivirus/anti-malware apps). The most popular and controllable instance of this is NoScript, and I myself paranoidly block some of the popular tracking/advertising services with it (some proxy servers will do this too).
However, if a site is not well designed, the failure of one script to load still lets code that depended on that script execute, which yields errors and stops everything working.
My recommendation is:
Use Firebug.
Use NoScript and block out everything --> see the site still works.
Enable the core site scripts that you can't do without --> see the site still works and Firebug doesn't whine.
Enable 3rd party stuff --> see the site still works and Firebug doesn't whine.
There are a lot of other complications that can crop up, but satisfying the above two scenarios should solve most of them. Just assume that, for whatever reason, one or more of the resources that comprise a page can spontaneously disappear (they do, all the time), and you want the page to "survive" this problem as gracefully as possible. Problems that persist for less than 10 seconds are not so bad - refresh the page and it's fixed - but a recurring problem can severely hamper usability for an hour or more at a time.
In essence, instead of thinking "oh, there are the edge-case users who don't have JavaScript", try thinking more along the lines of "it's really easy for something to go wrong and leave ALL of our users with broken JavaScript. Ouch! Let's make it so we don't really hose ourselves when that does happen."
(I've seen IE updates get rolled out and hose JavaScript for that entire browser until the people who wrote the scripts found a workaround. Losing all your IE customers is not a good thing.)
:set sarcasm
:set ignoreSpelling
:set iq=76
Don't worry, its only a 5% Niché Market
Nobody cares about targeting Niché markets right? All those funny propeller heads running lynx in their geeky stupid linoox cpus, spending all their time on the intarwebs surfing because they have nothing better to do with their life or money? the crazy security paranoid nerds disabling javascript left and right because they don't like it?
Nobody wants them as your primary customer now do they?
Niché markets. Pfft. Who cares!
:set nosarcasm
Consider your audience
"Degrade gracefully" is generally the best answer. But lots of sites now depend on JS - especially AJAX.
Consider your audience. If your site is aimed at extremely tech-savvy people, the chances of them not having javascript are small, and you can notify them to turn it on if necessary.
If your audience may access your site with mobile devices, don't assume they have JavaScript, and don't even assume they support CSS properly. Aim to degrade gracefully all the way down to bare HTML.
I've learned a lot from my question: What's With Those Do-Not-Use Javascript People
Go with Ajax and Web 2.0. It's the way the web is going and it's wonderful. Isn't Stackoverflow great to be on? It's not quite as nice with your Javascript turned off.
Once you have your site ready, but before you let it go live, test it with Javascript off, and just add whatever you feel you need to make your site appear and function to them. You only need to add what you feel is essential.
Remember, except for visually impaired people using screen readers, the others have chosen to turn JavaScript off. They can also choose to trust your site and turn JavaScript on for your site if they want to use all the functionality you have. It really is their choice.
As other have said, it should "degrade gracefully".
In other words, it must work without JavaScript (period). It doesn't have to work well. The folks who've disabled JavaScript know the limitations that causes and have accepted them. But if you are trying to sell them something, it's important that they can still buy it.
On the site I'm designing, there's a javascript-based fly-out menu. With Javascript off, all the flyouts are always open. It doesn't look as cool as it would with JS, but it can still be used to navigate the site.
It depends on how much time you have to develop and maintain both solutions, and how much the non-javascript users are worth to you.
My e-commerce site relies heavily on javascript, and in over a year and a half, I've not received a single complaint.
In fact, I don't think I've seen a single visitor with JavaScript disabled in any of my logs since I started.
That doesn't mean they're not out there. It just means that either (a) they're a tiny percentage, (b) they're not interested in what I'm selling, or (c) both of the above.
Code your web site with support for the bare minimum kind of browser. Then more people can use your site without frustration even if they don't have all the bells and whistles--like Flash, Javascript, and Java--enabled. It may not be practical to continue support for ancient browsers, say Netscape Navigator 4, because a user can be reasonably expected to keep their computer up-to-date. However, features like Javascript, Flash, and Java can be security holes in old or modern browsers, as well as being an annoyance.
Neither of my parents keep Javascript or Flash enabled because they've had too many experiences with them slowing down their already slow connection, crashing their browsers, or being more of an annoyance on sites that use it stupidly (which is a lot of them...) than a useful feature. It's just bad design if, for example, your form requires an AJAX call be made and you can't actually hit a submit button to send the form when Javascript is disabled.
My mother was recently quite frustrated to discover that she is now unable to click through eBay results pages because each one requires Javascript. The only way she can see the next page of results is to turn on Javascript or to show more results per page. Now what reason would there be for page links to require Javascript while the 'results per page' links are just plain links? They should all be plain old HTML links. Maybe Javascript could be used to add some whiz-bang to the navigation, but a user should not be punished with a bad interface for having Javascript disabled. It's stupid on eBay's part, and it causes undue hassle for their users.
I am one of those who use NoScript, and I can tell you that sites that use JavaScript and don't work without it enabled are extremely annoying - Stack Overflow included. No, we don't expect it to be very fancy; if I upvote, load a new page that says "Thank you."
We expect to be able to use the site with reasonable limitations. Don't ever display a page that says JS must be enabled, though, even if the site is crap without it. And yes, if your site convinces us to stay, we will enable it. A function that isn't in common use on the site can also require JavaScript.
Please note that your site should also look good with no JS or CSS; if nothing else, it is good for bots.
As others have pointed out, some phones don't have JS. This is changing, but it's another good reason to have a reasonable non-JS version. I suggest coding the non-JS version first and adding JS once the former works; there are good ways in which JS can work with the non-JS layout.
It helps me in my implementations to think about it as "progressive enhancement" rather than graceful degradation. Degradation often leads you to figure out how to make it work without JS after it is implemented, instead of building a baseline and enhancing it with JS.
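A tiny sketch of that progressive-enhancement approach: the baseline is a normal form that posts and reloads, and script upgrades it only when it actually runs (the form id and endpoint are placeholders):

// Baseline: <form id="vote-form" action="/vote" method="post"> works with no JS.
var form = document.getElementById('vote-form');
if (form && window.fetch) {
  form.addEventListener('submit', function (e) {
    e.preventDefault(); // enhance: stay on the page instead of reloading
    fetch(form.action, { method: 'POST', body: new FormData(form) })
      .then(function () { form.classList.add('voted'); });
  });
}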
It is essential to at least test your website is functional when JavaScript is turned off.
As orip says, degrading gracefully is very important. It should be vital that your page both looks nice and functions when JavaScript is disabled.
For a standard web site that is primarily intended for conveying information, degrade gracefully always.
For web applications:
When building a web application for a standard internet audience, I would keep the three following facts in mind:
95%-97% of potential users will have JavaScript enabled.
At times established users will need to access functionality when JavaScript is not available.
3%-5% of potential users will have JavaScript intentionally disabled.
Given fact one, if you believe that building a JavaScript reliant web application will deliver a superior user experience, then by all means do it. Doing so may help you accumulate users.
However, given fact two, you should always provide a means by which your users can access core functionality without JavaScript. Do you need to offer every single feature? Probably not. But a user should be able to get his or her work done. This will keep your users happy when they find themselves temporarily without JavaScript.
Given fact three, I would also provide an in depth tour as an attempt to entice these users to enable JavaScript.
As an aside, one of my most favorite web applications, Remember The Milk follows this approach. Also, Google's Calendar application is unusable without JavaScript. So JavaScript reliant web apps are on the rise and that trend is probably unstoppable. In my opinion this is a good thing.
(Do keep in mind that JavaScript makes accessibility a bigger problem than it already is. Please do make an effort to make your apps usable by those with disabilities.)
As said before, it depends on your target audience.
If I'm part of it, you want to make sure that your site works (if not ideally) on my phone, and that it gives me reason to turn Javascript on when I surf there with it off. Nobody expects full functionality with Javascript disabled, and anybody who uses their phone to access websites expects some issues, but you need to at least provide teasers. For a web store, make sure customers can see at least some merchandise anyway, even if they can't buy without Javascript.
