jQuery trigger pinch gesture on iPad - javascript

So I am developing a web app for iPad and have a problem with device rotation. Depending on the initial orientation, the page appears zoomed in after the orientation changes. The user can simply pinch to return to an acceptable experience, but that is not nearly good enough. So here is what I want to do: I want to trigger a 'pinch' event with JavaScript that behaves the same as a real user zooming out.
However, I am not sure this is even possible, as I have had zero success triggering touchmove events. Plus, I would have to trigger two touchmove events moving toward each other.
With $('body').trigger('touchmove', ...), how would I pass in X and Y?
Now I saw this example in jQuery's documentation:
var event = jQuery.Event("logged");
event.user = "foo";
event.pass = "bar";
$("body").trigger(event);
which makes me think passing data into a triggered event object is possible, but how would a full pinch be triggered?
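For reference, browsers that support the Touch and TouchEvent constructors let you build a synthetic touchmove with explicit coordinates. A sketch (the coordinates are arbitrary), with the big caveat that synthetic events only reach JavaScript handlers; they do not drive the browser's native pinch-zoom, so this alone cannot undo the zoom:
var touch1 = new Touch({ identifier: 1, target: document.body, clientX: 100, clientY: 200 });
var touch2 = new Touch({ identifier: 2, target: document.body, clientX: 220, clientY: 200 });
var move = new TouchEvent('touchmove', {
    touches: [touch1, touch2],        // two fingers, as in a real pinch
    targetTouches: [touch1, touch2],
    changedTouches: [touch1, touch2],
    bubbles: true,
    cancelable: true
});
document.body.dispatchEvent(move);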

I may be misunderstanding your question, but are you not simply trying to stop the page from needing to be pinched?
<meta name="viewport" content="initial-scale=1, maximum-scale=1">
This will tell the device that you have designed the page for iPad and that it does not require scaling!

Related

Make a website load with browser zoomed at 1:1

I know similar questions have been asked, but I have found no solution of any kind for my issue. I am building a mobile web app that has a Google Map embedded in it. My issue is that if a user double-taps to try and zoom the map (which basically takes up the whole screen), it in some cases zooms the browser instead of the map. Then the trouble is that the user is unable to zoom back out: using two fingers to zoom out passes the event to the map instead of the browser, which renders the web app useless. Reloading the page keeps it zoomed in.
I understand that browsers typically don't allow a script to change the browser zoom because 'controlling UI for the user is a bad idea', but in this situation I am saving the user. I don't want to do it while viewing the page, just either on load (the user will undoubtedly try to reload when they can't view/use the web app) or on a button click.
To the best of my knowledge, after quite a bit of googling, it is indeed not possible to change the browser zoom once the page has loaded. However, you can request that the browser, prior to loading the DOM, start at a certain zoom level and/or limit the zoom. Here is the meta tag I used to do this:
<meta name="viewport" content="width=device-width, initial-scale=1.0, maximum-scale=1.0, user-scalable=no"/>
I think for the most part the attributes are pretty self-explanatory, and if you need more or different control, I'm sure there are more options to play with. Also keep in mind that whether or not to honor this is entirely up to the browser.

Detecting pinching in desktop browsers

So with scrolling events I can detect panning gestures in desktop browsers and modify content accordingly. But is there a way to also detect pinching (zooming) gestures, so that instead of the browser zooming the whole site (or doing nothing) I could modify a DOM element accordingly?
There are laptops with trackpads that support such gestures (like the Magic Trackpad and the Force Touch Trackpad). So the gestures can be captured, but how can I use them in desktop web apps?
Imagine that you could pinch to zoom in or out on Google Maps in a desktop browser, and pan left and right with a hand gesture.
I would suggest using a specialized library like hammer.js. Have a look at this example from the hammer.js documentation, where they utilize pinch gesture recognition.
According to that example, detecting the pinch gesture could be as simple as:
var myElement = document.getElementById('myElement'),
    mc = new Hammer.Manager(myElement),
    pinch = new Hammer.Pinch();

mc.add(pinch);

mc.on("pinch", function(ev) {
    console.log('You sir did just pinch!');
});
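On devices that do deliver touch events, the callback's ev.scale reports the current pinch ratio, so a sketch like this (reusing myElement from above) could zoom the element itself instead of the page:
mc.on("pinch", function(ev) {
    // ev.scale starts near 1.0 and shrinks/grows as the fingers move
    myElement.style.transform = 'scale(' + ev.scale + ')';
});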
However, if you only want to react to the changing viewport within your layout, you might be better off using responsive design features like media queries.
Edit: It does not work that way. That's because desktop browsers do not yet support gesture events. However, the second part of my answer remains true: you can use media queries based on the viewport dimensions, and you can hook into the window's resize events as usual.
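A minimal sketch of that fallback (the 600px breakpoint is arbitrary):
var query = window.matchMedia('(max-width: 600px)');

function onViewportChange() {
    if (query.matches) {
        // compact layout: the viewport is narrow (or the page is zoomed in)
    } else {
        // regular layout
    }
}

query.addListener(onViewportChange); // also fires when full-page zoom crosses the breakpoint
window.addEventListener('resize', onViewportChange, false);
onViewportChange(); // run once on load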
I am looking for the same thing: pinching on only a specific DOM element and not the entire webpage (say, an embedded drawing app instead of the Google Maps example).
In the developer tools of Chrome and Opera, select Toggle Device Toolbar and choose a touch device. Pinching can be simulated by keeping the Shift key + left mouse button pressed. Then check out monitorEvents: https://developers.google.com/web/tools/chrome-devtools/debug/command-line/events?hl=en
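For example, in the DevTools console (Command Line API, usable only there):
monitorEvents(document.body, 'touch'); // logs touchstart/touchmove/touchend/touchcancel
// ...and when you are done:
unmonitorEvents(document.body);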
Additionally, Hammer.js has a touch emulator library:
<script src="touch-emulator.js"></script>
<script> TouchEmulator(); </script>
function log(ev) {
    console.log(ev.touches);
}
document.body.addEventListener('touchmove', log, false);
Later edit: for the Google Maps example specifically, the viewport is set like this so that the map uses its own zooming libraries and not the browser's built-in zoom.
<meta name="viewport" content="width=device-width, user-scalable=no, initial-scale=1">

Is disabling all click events at first touchstart event a good idea?

Making a nice quick-responding website is relatively difficult because of the conflicts between touchstart, tap and the 300ms delayed click.
Of course vclick should fix these issues, but even vclick doesn't seem to fix them completely. From the documentation:
Warning: Use vclick with caution
Use vclick with caution on touch devices. Webkit based browsers synthesize mousedown, mouseup, and click events roughly 300ms after the touchend event is dispatched. The target of the synthesized mouse events are calculated at the time they are dispatched and are based on the location of the touch events and, in some cases, the implementation specific heuristics which leads to different target calculations on different devices and even different OS versions for the same device. This means the target element within the original touch events could be different from the target element within the synthesized mouse events.
We recommend using click instead of vclick anytime the action being triggered has the possibility of changing the content underneath the point that was touched on screen. This includes page transitions and other behaviors such as collapse/expand that could result in the screen shifting or content being completely replaced.
Now I'm thinking about doing something simpler. Whenever a touchstart event is triggered, I know for sure this is a touch device. I then disable all click events and listen only to touchstart (or tap) events, ignoring the 300ms-delayed click events.
Of course there are devices with a mouse and touch, but people using these at the same time seem like a minority to me.
Is this a good idea, or am I missing something in my thinking?
First off… what makes you say that people who use both touch and mouse input are a minority?
The 300ms click delay has been gone a while now on Android when using <meta name="viewport" content="width=device-width">. Unfortunately it can't be removed on iOS because it's a scroll gesture on unzoomable pages that almost nobody seems to be aware of.
I'd say that the best approach is still to support both mouse as well as touch input, despite the 300ms delay on iOS devices. It's dangerous to assume a user will exclusively use touch input when they use it once.
Imagine a user happily using a mouse to navigate. They see something interesting that they want to look at a bit closer so they use a touch gesture to zoom in and all of a sudden mouse clicks don't work anymore. That sounds broken to me.
I just remembered an interesting discussion about detecting a mouse user. Maybe it'll help you see things a bit differently.
Yes, in my honest opinion it is a smart way to go. This has proven to be quite a hard problem, and when you combine it with the compatibility problems caused by some really crappy mobile devices that don't follow standards even a bit, it quickly becomes a battle you can't win. We have adopted a solution close to this, while recognizing that problems may arise with poor devices. But after all, you can't satisfy the needs of everybody, and the distribution of usage tends to favor those devices (nowadays) that follow the standards.
Also note that you don't need to wait for the first touchstart to happen. Instead, you can do this trick once the DOM is ready and bind the events accordingly:
var isTouchDevice = 'ontouchstart' in document.documentElement;
which is copied from KevBurnsJr's answer.
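A sketch of binding accordingly on DOM ready (the selector and handler are only illustrations):
$(function () {
    var isTouchDevice = 'ontouchstart' in document.documentElement;
    var activateEvent = isTouchDevice ? 'touchstart' : 'click';

    $('.button').on(activateEvent, function (e) {
        e.preventDefault();
        // run the action immediately, with no 300ms delay on touch devices
    });
});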
Also, as you most probably already know, you can bind to all kinds of events and then check, when entering the callback, which type the event actually is with
event.type
Good luck!
It would be a bad idea to disable all click events on the basis of a single touchstart as you suggest. While using both pointer and touch input at the same time isn't a common use case, preventing dual use of mouse/pen and touch isn't a forward-compatible approach.
And when you say: "Ignoring the 300ms delayed click events."
I think you make a false assumption about click. You'd still have to synthesize clicks one way or another; touchstart alone isn't a click action. An assumed click happens on touchend, not touchstart. Here is the principle behind detecting clicks early on mobile: https://developers.google.com/mobile/articles/fast_buttons
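A minimal sketch of that fast-button principle, assuming a plain DOM element (real libraries handle many more edge cases):
function fastButton(element, handler) {
    var startX, startY, moved;

    element.addEventListener('touchstart', function (e) {
        moved = false;
        startX = e.touches[0].clientX;
        startY = e.touches[0].clientY;
    }, false);

    element.addEventListener('touchmove', function (e) {
        // Anything beyond a small threshold is a scroll, not a tap.
        if (Math.abs(e.touches[0].clientX - startX) > 10 ||
            Math.abs(e.touches[0].clientY - startY) > 10) {
            moved = true;
        }
    }, false);

    element.addEventListener('touchend', function (e) {
        if (!moved) {
            e.preventDefault(); // suppress the synthesized, delayed click
            handler(e);
        }
    }, false);

    element.addEventListener('click', handler, false); // mouse/pen fallback
}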
If you are looking for fast clicks, you may want to look into the fastclick script or other fastclick implementations on GitHub instead of vclick.
To avoid issues for people using both a touchscreen and a mouse while keeping things pleasantly reactive, I suggest this in jQuery; it works well enough for me:
$elem.on('click touchstart', function(e){
    var $self = $(this);
    if (e.type == 'touchstart') {
        $self.mouseenter(); // fire events you still need
        e.preventDefault();
    }
    /* your code */
});
From my experience it's better than keeping the delay on the click event and the hazardous behavior it causes across devices, but there are inconveniences too.
On the iPad I tested, it avoids the situation where the hover event is triggered on the first tap and the click event on the second tap. However, it also seems that when you tap near the border of the element, the click event fires rather than the touchstart; keep that in mind.
Also, it seems not to work well with the 'tap' event, probably because it's not yet well handled by jQuery.

Android browser touchend event bug workaround

I've been developing a mobile site for my homepage and I have run into an issue when hooking into mobile touch events. Basically I would like to accomplish the following:
User scrolls down
The touchend event fires --> a function is called to figure out how much of the document is hidden above after the scroll (like jQuery scrollTop)
The program takes action based on how much of the document is hidden up top
My issues are the following. touchend works as expected on iOS: when the user lifts her finger, the function calls jQuery.scrollTop(), which gives me a pixel value for how far the user has scrolled down. However, on Android Browser devices it seems that the jQuery.scrollTop() value is calculated on touchstart. That is to say, the event doesn't fire off properly: I get the correct pageX & Y coordinates from the touchend event, but scrollTop() returns the value from when the user started scrolling.
I've checked around on the internet and this seems to be a known Android Browser bug. What I want to know is whether there is a decent workaround for this issue, i.e. one that doesn't involve preventing the default scroll behaviour. Thanks in advance!
Are you taking into account smooth scrolling, or just basic scrolling?
With basic scrolling you should be able to get the correct value simply by using document.body.scrollTop
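If momentum scrolling is in play, a hedged workaround is to read the offset from a debounced scroll handler instead of from touchend (the 150ms settle delay is a guess):
var scrollTimer = null;
$(window).on('scroll', function () {
    clearTimeout(scrollTimer);
    scrollTimer = setTimeout(function () {
        var top = $(window).scrollTop(); // value has settled by now
        // ...take action based on how much of the document is hidden above...
    }, 150);
});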
Let me know if there is an issue

iOS Delay between touchstart and touchmove?

I'm attempting to convert my web app into a form usable on mobile devices, and to build in support for touch gestures like horizontal scrolling. I'm seeing some strange behavior in my app.
I start a gesture with a touchstart event, and then scroll on touchmove. However, my application sees a 500-700 ms delay between receiving these two events. As far as I can tell, my app is doing no other work between these two events.
Other aspects:
The code is written in jQuery, using
$(element).bind('touchmove', function(ev) { return myobject.DoTouch(ev); });
where the DoTouch command simply checks ev.type, records the touch position, and returns false.
Any ideas what I should look for to try to solve this? The lag between touching and getting a response from the app is very annoying.
Yes. It turns out this is how iOS works. I was pulling my own hair out for some time. Read more here: http://developer.apple.com/library/ios/#DOCUMENTATION/AppleApplications/Reference/SafariWebContent/HandlingEvents/HandlingEvents.html. Essentially, if iOS thinks it can handle this as an internal PAN gesture, it does, and doesn't even bother sending a touchmove event at all.
In my project, I found that if the viewer makes the touchmove gesture very deliberately and pauses a bit longer before lifting the finger at the end of the move, then the touchmove event is in fact sent as one might expect. So the documented behaviour may be a little iffy versus reality, which only added to the confusion and my debugging efforts.
Anyway, if iOS handles the event internally as a PAN gesture, it will send a scroll event before the touchend. In my project I was able to use this to set the flag I was using to distinguish dragging gestures (which was normally set in my touchmove handler) and to ignore anything in stand-alone touchend handlers that was not related to my own scroll handling.
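A sketch of that flag technique (the element and handler names are illustrative):
var nativePanHappened = false;

window.addEventListener('scroll', function () {
    // iOS fires this before touchend when it handled the gesture as a PAN
    nativePanHappened = true;
}, false);

element.addEventListener('touchstart', function () {
    nativePanHappened = false;
}, false);

element.addEventListener('touchend', function (e) {
    if (nativePanHappened) {
        return; // iOS panned natively; nothing for us to do
    }
    // ...handle the end of our own drag gesture here...
}, false);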
I hope this helps you (and others) as well!
