Consider a complex rich internet application with lots of user interaction. I'm thinking of extensive drag-and-drop support, server-side user input validation, custom-drawn UI controls such as an Outlook-like calendar, real-time UI feedback, etc. Would such an application be debuggable? I mean, can you easily step through the source code, place breakpoints, view the contents of variables, see the current call stack, use a profiler to pinpoint performance issues, etc.?
Yes, why wouldn't it be?
Complexity just means more code to dig through, but tools like console.trace() from Firebug make that easier.
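For instance, a minimal sketch of what console.trace() buys you (the function names here are invented for illustration):

function saveItem(item) {
    // Prints the call stack at this point, e.g. saveItem <- onDrop
    console.trace();
}
function onDrop(event) {
    saveItem(event.target); // hypothetical drag-and-drop handler
}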
Yes, it would be debuggable.
If you're using IE8 to test your site, you could use the Developer Tools to inspect individual HTML elements and change their CSS on the fly. There's also the ability to break into Javascript from the same interface.
If you're using Firefox, Firebug has almost identical abilities with a different interface.
Safari also has developer tools installed by default, you just have to go through the hoops of enabling them.
When you are designing your application, design it with debuggability and testability in mind. Make sure that individual parts are independently testable, you have enough test data, you have appropriate debug/probe points in your program logic, etc. Essentially, if the complexity is properly managed, debuggability won't be an issue at all.
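A minimal sketch of such a debug/probe point, assuming nothing beyond a global flag you define yourself (the names DEBUG and probe are illustrative):

var DEBUG = true; // flip to false for production builds

function probe(label, data) {
    // One central place to log program state; easy to disable or redirect later
    if (DEBUG && window.console) {
        console.log(label, data);
    }
}

// Sprinkled through the program logic at interesting points:
probe('calendar.render', { visibleEvents: 42 }); // hypothetical call site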
If your job depended on it, you would find a way! :)
Seriously... a passenger jet has literally millions of parts and yet there are regular routine maintenance checks and if it breaks down it gets fixed. It's a very rare piece of software that approaches that much complexity.
Web app front ends tend to be relatively simple. Essentially you're just pushing some text from the server to the browser and making it pretty; and you're using various parts of the in-browser display as controls, some of which initiate some more text conversations with the server. There are lots of little things that can go wrong, of course, but much of the hardship is simply getting the browser (all of them!) to Do What You Mean.
The only truly difficult problems are those that are intermittent and/or timing sensitive. Those can be a bear to reproduce and trace. That calls for in-depth logical analysis of your source code and/or some specialized testing methods.
I am developing a system with JavaScript which I want to work only in common web browsers (like IE9, Firefox, Chrome, Safari, Opera, ...).
First, I've compressed my code using the Closure Library + Closure Compiler with the ADVANCED_OPTIMIZATIONS option, generating code that looks somewhat difficult to understand. Unfortunately, the code can easily be converted back to something beautiful (and readable) by using beautifier tools.
Second, I've chosen algorithms which are easy to read but difficult to understand. For example, scripts that decode Reed-Solomon codes may be difficult to understand for those who have never implemented such algorithms before. Of course, this solution is not perfect, because people with deep knowledge of Reed-Solomon codes may figure out what's written inside, even if the code is compressed and has no comments.
But the major problem is that my complicated code may run just as easily when copied and pasted into non-browser JavaScript environments like Rhino + env.js, PhantomJS, and so on.
Please teach me usable techniques to make my code refuse to run in non-browser environments, if there are any.
I don't really understand the point of this question, but it sounds like your worry is that you don't want people to steal your JS that you return as part of your site.
If that's the case, there is only one real solution: do the work on the server.
You can make it hard to run the code but in the end, it's source code that you give to the world. Any kind of obfuscation and encryption must be reversed before the browser can execute it which means that anyone can eventually reverse engineer the code.
If you don't want this / can't have it, then the browser isn't a suitable tool for you. You can try to write a desktop application or build an appliance (= code which is protected by hardware), but that just raises the bar for reverse engineering. People do grind the covers off chips to find out how they work.
From my experience, you can make it somewhat hard to "steal" your valuable data but you can't prevent it. Other downsides that you should take into account:
Paying customers will be offended by bugs and limitations that you impose on them (pirates will simply remove your copy protection).
Hollywood spent millions of dollars on DVD's CSS and HDMI's HDCP. Both protection systems were circumvented in a relatively short time. How much money do you plan to spend?
Maybe your time and energy are better spent on providing a better service to customers, so they don't feel any need to "steal" from you.
Add this to the beginning of your script:
if (typeof window === 'undefined')
    throw new Error('This script is meant to run in a web browser');
Obfuscate it as you see fit.
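If you want a slightly stronger (though still trivially fakeable) heuristic, you can probe a few more browser-only globals. A sketch with no real security guarantee -- env.js or a patched Rhino can fake all of these:

// Heuristic only: any of these can be faked by a non-browser host.
var looksLikeBrowser =
    typeof window !== 'undefined' &&
    typeof document !== 'undefined' &&
    typeof navigator !== 'undefined' &&
    typeof document.createElement === 'function';

if (!looksLikeBrowser) {
    throw new Error('This script is meant to run in a web browser');
}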
If you are doing this for security purposes it is a pointless endeavour. Everything you do in Javascript running on the users' browsers can be read and modified by the users. Not only that, but it is very easy for them to do so. Any data you don't want the users to see should not be sent to their browser at all. All processing should be done server-side.
I've got an HTML/JS (YUI framework) photo organizer that needs access to the local FS. Should I move the HTML/JS to AIR, or bite the bullet and "port" it to Flex AIR?
I know what the marketing says, but I want the real answer -- what am I "giving up" by going HTML/JS AIR? I'd like to get some feedback from people with deep experience building HTML-based AIR apps.
I don't think you'll see many issues in using the HTML AIR mode. AIR uses the WebKit engine under the covers, iirc, which works well enough and has most of the same native features as Flash/Flex-built applications. You'll also see most of the HTML5 features you'd find in Safari. I would say that if you need animations, Flash will generally run better than Canvas at this point... There are plenty of examples of ExtJS and other frameworks running on AIR.
As to what you are giving up, I don't think you'd lose anything from an HTML to HTML/AIR standpoint. You could gain a lot of what you gain in general from having an application based in Flash rather than straight HTML. In Flex specifically, controls and other features can be more readily tweaked than standard HTML controls. The animation tools in Flash are much nicer. ActionScript doesn't line up with JS on a one-to-one basis, so there may be issues with code. Dealing with remote content/data is actually a little nicer, imho, in AS than with XHR, though only when dealing with XML.
From an administrative standpoint, going to AIR with HTML from an already written application is probably the shortest path. If you REALLY needed to, you could convert later, and a lot of the underlying logic would be worked out. Time to market would be shorter with whatever is closest to what you are already using more often than not.
Not really, since FS access is available with the HTML/JS version. However, the other route does open up some more native support for application development -- Animation for example, richer controls etc which you will have to live without otherwise. You will greatly miss the debugger and the profiler as also the design view when you move to complex applications. Also, note that if you are worried about sharing your source you probably shouldn't use the HTML/JS way.
I'm fairly new to both tools and need to go hardcore with both as I manage, monitor, and tweak a new site's design process. What sort of strategies should I ask to be implemented that set a good solid foundation for debugging, testing, and logging?
(To the degree that back-end stuff can play a role: it's .NET MVC.)
thx
I would use Firebug to see how things are working with a few Firebug Add-ons.
I would use YSlow to check that you aren't downloading too much and it will make suggestions if you aren't minifying and gzipping your javascript.
I would also use FireQuery as that highlights jQuery very nicely in Firebug. I use it quite a lot these days to see what it should be firing.
Firebug doesn't rewrite XHR requests anymore, but there is a bug in the latest Firefox/Firebug where it can block long-running XHR calls. Details here
First off, make sure you've read Firebug's docs. Some of the commands work cross-browser with other tools as well.
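For example, a few of the console commands that work both in Firebug and in the WebKit consoles:

console.log('plain message');   // basic logging
console.dir(document.body);     // interactive listing of an object's properties
console.time('render');         // start a named timer
// ... code being measured ...
console.timeEnd('render');      // print the elapsed time for 'render'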
A simple search query will show you all the available extensions for Firebug. As some people have mentioned, some of them are really helpful.
Also, it's important not to limit yourself to just a single tool, since you will most likely be developing for multiple browsers. So make sure you take a look at WebKit's developer tools (Safari, Chrome) as well. Here's a good article which sums up the most popular development/debug tools.
You might want to research how jQuery/jQuery plugins are structured/organized so you have a general idea of how to organise your own JavaScript/jQuery code. It all depends on how JavaScript-heavy your application is. If jQuery just provides some visual enhancements and a few Ajaxified pages here and there, don't bother. On the other hand, if it's very JavaScript-heavy (as in a lot more site logic on the client side than on the backend), I would suggest Prototype over jQuery, but that's just my opinion.
You could consider using automatic tools to build your JavaScript if you have a lot of code.
For example:
Sprockets
Juicer
On the production server you want to end up with as few JavaScript files as possible, and make sure to compress them.
If you're interested in more links to articles/tools for JavaScript-heavy applications, drop a comment. I'm just trying to stay on topic at the moment.
I would just give a small warning about using Firebug's network monitor and AJAX together. When enabled, it rewrites some HTTP headers and breaks stuff badly (well, it used to; not sure anymore).
So if anything goes ape, check that network monitoring is disabled.
I'll also add FireCookie to the list of tools, as it goes very well with $.cookie.
When I am debugging jQuery code I use the Net panel a lot in Firebug, for all Ajax requests. It's very helpful to see what you are sending and what you are receiving.
I also use the command line a lot, to test snippets of code.
You cannot do without the console. It will be very helpful. Example:
$.get( 'url.php', {},
    function(data){
        // $.each passes (index, item) to its callback, so log the item
        $.each(data, function(i, item){
            console.log( item ); // log each item to see what it contains
        });
    }, 'json'
);
I also suggest you install the FireUnit addon. It helps you work with QUnit unit tests. Of course, that is only if you are planning to write unit tests, but in most cases that's a very good idea.
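To give an idea of the shape of such a test, here's a minimal sketch against the classic QUnit API (formatPrice is a made-up function under test):

// test() registers a test; ok() asserts that its first argument is truthy
test('formatPrice adds a currency symbol', function() {
    ok(typeof formatPrice === 'function', 'formatPrice exists');
    ok(formatPrice(5) === '$5.00', 'formats whole numbers');
});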
As much as you might love Firebug, Safari's developer tools are also quite powerful, and worth checking out. It's all I use when I dev.
Worth mentioning that Safari's javascript engine is still faster than FFX's, while Chrome reigns supreme. They're playing catch-up though, so this really isn't worth caring about.
I'm currently building a project and I would like to make use of some simple javascript - I know some people have it disabled to prevent XSS and other things. Should I...
a) Use the simple javascript, those users with it disabled are missing out
b) Don't use the simple javascript, users with it enabled have to click a little more
c) Code both javascript-enabled and javascript-disabled functionality
I'm not really sure as the web is always changing, what do you recommend?
Degrade gracefully - make sure the site works without JavaScript, then add bells and whistles for those with JavaScript enabled.
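For example, a jQuery sketch of the pattern (the selector, element IDs, and URL are invented): the link is a real link first, and Ajax second.

// Markup assumed: <a href="/comments?page=2" class="pager">Next</a>
// Without JavaScript this is a normal page load; with it, we fetch the
// same URL into the current page instead.
$('a.pager').click(function() {
    $('#comments').load(this.href);
    return false; // cancel the full-page navigation
});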
Everyone else has contributed good comments, but there are a few other considerations to make.
Sometimes the javascript will be hosted on a different domain, and be prone to timeout.
Sometimes that domain may become inaccessible while your site remains accessible. It's not good to have your site fall over completely in this scenario.
For this reason, "blocking" scripts (i.e., inline document.write) like the one present in Google's tracker should be avoided, or at the very least should go as late in the page as possible, so the page renders whether or not the domain is timing out requests.
If you happen to be serving JS from a broken/malicious server, by intent or by accident, page rendering can be halted simply by having the script that serves that JavaScript call "sleep(forever)" once it's sent all the headers.
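One common defence is to inject third-party scripts asynchronously instead of with a blocking inline document.write, so the page renders even if the remote host hangs. A sketch (the URL is illustrative):

// If example-tracker.com hangs, the rest of the page still renders,
// because nothing blocks waiting for this script to arrive.
var s = document.createElement('script');
s.src = 'http://example-tracker.com/tracker.js'; // hypothetical third-party host
s.async = true;
document.getElementsByTagName('head')[0].appendChild(s);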
Some People Use NoScript
Like the above problem, sometimes the client's environment may block certain script sources, be it the user's choosing or other reasons (i.e., browser security policies, odd antivirus/anti-malware apps). The most popular and controllable instance of this is NoScript, and I myself paranoidly block some of the popular tracking/advertising services with it (some proxy servers will do this too).
However, if a site is not well designed, the failure of one script to load still executes code that was dependent on that script being present, which yields errors and stops everything working.
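A cheap guard against that failure mode is to test for the dependency before using it. A sketch, assuming the optional script exposes a jQuery-style global:

// Only run the enhancement code if the third-party script actually arrived.
if (window.jQuery) {
    jQuery(function() {
        // enhancements that depend on jQuery go here
    });
}
// else: degrade silently -- the plain-HTML version of the page keeps working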
My recommendation is:
Use Firebug.
Use NoScript and block out everything --> see the site still works.
Enable the core site scripts that you can't do without --> see the site still works and Firebug doesn't whine.
Enable 3rd-party stuff --> see the site still works and Firebug doesn't whine.
There are a lot of other complications that can crop up, but satisfying the above two should solve most of them. Just assume that, for whatever reason, one or more of the resources that comprise a page is liable to spontaneously disappear (they do, all the time), and you want the page to survive this problem as gracefully as possible. A problem that persists for less than 10 seconds is not so bad -- refresh the page and it's fixed -- but one that can occur and persist for an hour or more at a time will severely hamper usability.
In essence, instead of thinking "oh, there's the edge case of users that don't have javascript", try thinking more along the lines of "it's really easy to have something go wrong and have ALL of our users with broken javascript. Ouch! Let's try to make it so we don't really hose ourselves when that does happen."
(I've seen IE updates get rolled out and hose javascript for that entire browser until the people who wrote the scripts found a workaround. Losing all your IE customers is not a good thing.)
:set sarcasm
:set ignoreSpelling
:set iq=76
Don't worry, its only a 5% Niché Market
Nobody cares about targeting Niché markets right? All those funny propeller heads running lynx in their geeky stupid linoox cpus, spending all their time on the intarwebs surfing because they have nothing better to do with their life or money? the crazy security paranoid nerds disabling javascript left and right because they don't like it?
Nobody wants them as your primary customer now do they?
Niché markets. Pfft. Who cares!
:set nosarcasm
Consider your audience
"Degrade gracefully" is generally the best answer. But lots of sites now depend on JS - especially AJAX.
Consider your audience. If your site is aimed at extremely tech-savvy people, the chances of them not having javascript are small, and you can notify them to turn it on if necessary.
If your audience may access your site with mobile devices, don't assume they have JavaScript, and don't even assume they support CSS properly. Aim to degrade gracefully all the way down to bare HTML.
I've learned a lot from my question: What's With Those Do-Not-Use Javascript People
Go with Ajax and Web 2.0. It's the way the web is going, and it's wonderful. Isn't Stack Overflow great to be on? It's not quite as nice with your Javascript turned off.
Once you have your site ready, but before you let it go live, test it with Javascript off, and just add whatever you feel you need to make your site appear and function to them. You only need to add what you feel is essential.
Remember, except for visually impaired people using screen readers, the others have chosen to turn javascript off. They can also choose to trust your site and turn javascript on for your site if they want to use all the functionality you have. It really is their choice.
As others have said, it should "degrade gracefully".
In other words, it must work without Javascript (period). It doesn't have to work well. The folks who've disabled Javascript know the limitations that causes and have accepted them. But if you are trying to sell them something, it's important that they can still buy it.
On the site I'm designing, there's a javascript-based fly-out menu. With Javascript off, all the flyouts are always open. It doesn't look as cool as it would with JS, but it can still be used to navigate the site.
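A sketch of how that can be wired (the class names are illustrative, not the actual site's): the flyouts are fully expanded in the plain HTML, and the script collapses them, so a missing script leaves them visible.

$('ul.flyout').hide(); // collapse all flyouts once JS is known to be running
$('li.menu-item').hover(
    function() { $(this).children('ul.flyout').show(); },
    function() { $(this).children('ul.flyout').hide(); }
);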
It depends on how much time you have to develop and maintain both solutions, and how much the non-javascript users are worth to you.
My e-commerce site relies heavily on javascript, and in over a year and a half, I've not received a single complaint.
In fact, I don't think I've seen a single visitor with javascript disabled in any of my logs since I started.
That doesn't mean they're not out there. It just means that either (a) they're a tiny percentage, (b) they're not interested in what I'm selling, or (c) both of the above.
Code your web site with support for the bare minimum kind of browser. Then more people can use your site without frustration even if they don't have all the bells and whistles--like Flash, Javascript, and Java--enabled. It may not be practical to continue support for ancient browsers, say Netscape Navigator 4, because a user can be reasonably expected to keep their computer up-to-date. However, features like Javascript, Flash, and Java can be security holes in old or modern browsers, as well as being an annoyance.
Neither of my parents keep Javascript or Flash enabled because they've had too many experiences with them slowing down their already slow connection, crashing their browsers, or being more of an annoyance on sites that use it stupidly (which is a lot of them...) than a useful feature. It's just bad design if, for example, your form requires an AJAX call be made and you can't actually hit a submit button to send the form when Javascript is disabled.
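The fix is to give the form a real action and hijack the submit only when JavaScript is present. A sketch (the URL and element IDs are invented):

// Markup assumed: <form id="contact" action="/contact" method="post">
// Without JavaScript the form posts normally; with it, we submit via Ajax.
$('#contact').submit(function() {
    $.post(this.action, $(this).serialize(), function() {
        $('#status').text('Thanks, your message was sent.');
    });
    return false; // suppress the full-page submit
});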
My mother was recently quite frustrated to discover that she is now unable to click through eBay results pages because each one requires Javascript. The only way she can see the next page of results is to turn on Javascript or to show more results per page. Now what reason would there be for page links to require Javascript while the 'results per page' links are just plain links? They should all be plain old HTML links. Maybe Javascript could be used to add some whiz-bang to the navigation, but a user should not be punished with a bad interface for having Javascript disabled. It's stupid on eBay's part, and it causes undue hassle for their users.
I am one of those who use NoScript. And I can tell you that sites that use javascript and don't work without it enabled are extremely annoying -- Stack Overflow included... No, we don't expect it to be very fancy; if I upvote, load a new page that says "Thank you."
We expect to be able to use the site with reasonable limitations. Don't ever display a page that says JS must be enabled, though, even if the site is crap without it. And yes, if your site convinces us to stay, we will enable it. A function that isn't in common use on the site can also require javascript.
Please note that your site should also look good with no JS or CSS; if nothing else, it is good for bots.
As others have pointed out, some phones don't have JS. This is changing, but it's another good reason to have a reasonable non-JS version. I suggest coding the non-JS version first and adding JS after the former works; there are good ways in which JS can work with the non-JS layout.
It helps me in my implementations to think about it as "progressive enhancement" rather than graceful degradation. Degradation often leads you to figure out how to make it work without JS after it is implemented, instead of making a baseline and enhancing it with JS.
It is essential to at least test that your website is functional when JavaScript is turned off.
As orip says, degrading gracefully is very important. It should be vital that your page both looks nice and functions when JavaScript is disabled.
For a standard web site that is primarily intended for conveying information, degrade gracefully always.
For web applications:
When building a web application for a standard internet audience, I would keep the three following facts in mind:
95%-97% of potential users will have JavaScript enabled.
At times established users will need to access functionality when JavaScript is not available.
3%-5% of potential users will have JavaScript intentionally disabled.
Given fact one, if you believe that building a JavaScript reliant web application will deliver a superior user experience, then by all means do it. Doing so may help you accumulate users.
However, given fact two, you should always provide a means by which your users can access core functionality without JavaScript. Do you need to offer every single feature? Probably not. But a user should be able to get his or her work done. This will keep your users happy when they find themselves temporarily without JavaScript.
Given fact three, I would also provide an in-depth tour as an attempt to entice these users to enable JavaScript.
As an aside, one of my favorite web applications, Remember The Milk, follows this approach. Also, Google's Calendar application is unusable without JavaScript. So JavaScript-reliant web apps are on the rise, and that trend is probably unstoppable. In my opinion this is a good thing.
(Do keep in mind that JavaScript makes accessibility an even bigger problem than it already is. Please do make an effort to make your apps usable by those with disabilities.)
As said before, it depends on your target audience.
If I'm part of it, you want to make sure that your site works (if not ideally) on my phone, and that it gives me reason to turn Javascript on when I surf there with it off. Nobody expects full functionality with Javascript disabled, and anybody who uses their phone to access websites expects some issues, but you need to at least provide teasers. For a web store, make sure customers can see at least some merchandise anyway, even if they can't buy without Javascript.
Is it alright to expect that the user using the back end will have Javascript enabled?
I guess the answer I'll get is "it depends on your target users". I am developing a system for fun that will hopefully be used by other people. I would like to hear from other people developing back-end systems: what did they decide to do, and why?
SEO I'm not concerned with, and semantics aren't of as much importance.
Personally I would expect the failover, but there are circumstances (particularly low profile sites, intranets, e-learning content) where you can assume JS.
Mostly you can even go with a simple "You require JS / This works better with JS" and I would consider that good enough, but there's a couple of instances where I would demand real failover:
.gov or other public service sites (legal requirements)
sites for web-tech companies (you need to demonstrate your ability to do this)
very high traffic sites (where the 3% of non-JS users becomes a high absolute number)
sites (or pages) for mobile devices (most of these haven't got JS reliably)
In general, it's reasonably easy to provide some kind of noscript, so why not do it anyway?
If it's for fun, please go ahead and require javascript.
Considering the 3 points:
a backend means only a few people will be accessing it (and all of them probably have knowledge about the web too, e.g. know what javascript is and how to get it enabled)
SEO isn't important
it's for fun
I'd say that it's alright. :)
annakata provided a pretty good insight as well.
It really depends on your application and its target audiences. Do you care about user accessibility (can disabled people use the site)? Do you want your site to work on various mobile browsers with limited JavaScript support? I would try to build the site so that it degrades gracefully without CSS or JavaScript. That is, unless your site is very dynamic, like say a word processor, which can't possibly work at all without JavaScript.
Yes, it mostly depends on your target users.
Whatever the front end is, the back end must be bulletproof.
At the least, it should ensure that nobody can hack it or make a mess of it by disabling javascript.
Server-side filtering/validation is important for security, while client-side validation and interactivity are important for usability.
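A sketch of that split (the form fields and IDs are invented): the client-side check gives instant feedback, but it can be bypassed simply by disabling JavaScript, so the server must repeat it.

$('#signup').submit(function() {
    var email = $('#email').val();
    // Usability only: a quick sanity check before the round trip
    if (!/^\S+@\S+\.\S+$/.test(email)) {
        $('#email-error').text('Please enter a valid email address.');
        return false; // block the submit and show instant feedback
    }
    // Otherwise fall through: the real POST happens and the server
    // performs the authoritative validation.
});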
I don't think it's unreasonable to require Javascript for a web based backend/CMS, where your target users are likely to be a fairly small and pretty specific group.
All the CMS systems that I've worked on so far have required it.
I refer you to this post by Jeff Atwood. The important assertion in it is that you can expect javascript to work as expected across browsers. The security risks are also lower today. So I would say that it is now safe to ignore clients that do not enable javascript. If you want to attract users, javascript gives you a clear advantage.
The only exception I can think of is mobile sites. Although mobile browsers have gotten better and do support javascript, the extra download bandwidth and the small screen make JS less suitable.
As long as the function that your application will be serving is general, I'd say it is safe to rely on Javascript. One of the sites that I manage receives ~35,000 UV's on a good day. I think it is fair to say we come in contact with quite a variety of browser and operating system combinations. According to our stats, roughly 97% of our users have Javascript enabled.
If it can fail elegantly without Javascript, I'd opt for that solution, but I wouldn't lose sleep over the fact that you might be losing a few people everyday.