With scroll events I can detect panning gestures in desktop browsers and modify content accordingly. But is there also a way to detect pinch (zoom) gestures, so that instead of the browser zooming the whole site (or doing nothing) I could modify a DOM element accordingly?
There are laptops with trackpads that support such gestures (like the Magic Trackpad and the Force Touch trackpad), so the gestures can be captured. But how can I use them in desktop web apps?
Imagine being able to pinch to zoom in or out on Google Maps in a desktop browser, and to pan left and right with a hand gesture.
I would suggest using a specialized library like Hammer.js. Have a look at this example from the Hammer.js documentation, where they use pinch gesture recognition.
According to that example, detecting the pinch gesture could be as simple as:
var myElement = document.getElementById('myElement'),
    mc = new Hammer.Manager(myElement),  // manager bound to the target element
    pinch = new Hammer.Pinch();          // pinch recognizer

mc.add(pinch);

mc.on("pinch", function(ev) {
  console.log('You sir did just pinch!');
});
However, if you only want to react to a changing viewport within your layout, you might be better off using responsive design features like media queries.
Edit: It does not work that way, because desktop browsers do not yet support gesture events. However, the second part of my answer remains true: you can use media queries based on the viewport dimensions, and you can hook into the window's resize event as usual.
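As a rough sketch of that second approach (the 768px breakpoint, the #myElement id and the 'compact' class are placeholders, not something from the original question):
// React to the viewport instead of gesture events.
var query = window.matchMedia('(max-width: 768px)');

function applyLayout(mq) {
  // Toggle a class whenever the breakpoint flips.
  document.getElementById('myElement').classList.toggle('compact', mq.matches);
}

applyLayout(query);             // run once on load
query.addListener(applyLayout); // re-run when the media query result changes

// Or hook into the window's resize event as usual:
window.addEventListener('resize', function() {
  console.log('viewport is now', window.innerWidth, 'x', window.innerHeight);
});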
I am looking for the same thing - pinching on only a specific DOM element and not the entire web page (say, an embedded drawing app instead of the Google Maps example).
In the Chrome and Opera developer tools, select Toggle Device Toolbar and choose a touch device. Pinching can then be simulated by holding Shift and dragging with the left mouse button. Then check out monitorEvents: https://developers.google.com/web/tools/chrome-devtools/debug/command-line/events?hl=en
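For example, once the device toolbar is active you can log the relevant events straight from the console (document.body is just an example target):
// Chrome DevTools console only: log every touch event fired on the body.
monitorEvents(document.body, 'touch');

// Stop logging again when you are done.
unmonitorEvents(document.body, 'touch');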
Additionally, Hammer.js provides a touch emulator library:
<script src="touch-emulator.js"></script>
<script> TouchEmulator(); </script>

function log(ev) {
  console.log(ev.touches);
}

document.body.addEventListener('touchmove', log, false);
Later edit: for the Google Maps example specifically, the viewport is set like this so that the page uses its own zooming code rather than the browser's built-in zoom:
<meta name="viewport" content="width=device-width, user-scalable=no, initial-scale=1">
Since I'm working on a project where I need to be able to drag objects around my canvas, but also to scroll the entire page by dragging the canvas 'background' behind my PIXI sprites, I followed the findings described in https://github.com/pixijs/pixi.js/issues/2483:
By default, the Pixi canvas/display-area cannot be used to scroll the webpage that contains it. Which is important on touch screens. (eg. If you use the rest of the web-page to pinch-zoom into the Pixi canvas, you can become trapped and unable to zoom back out (or pan away), because there's no non-Pixi-canvas area of the page to "grab" with your pinch gesture).

To enable this functionality, I use autoPreventDefault. But this comes with some undesirable side-effects, like scroll/pinch-zoom actions over the canvas registering "taps" or clicks in a way that doesn't make sense. (ie. I'm attempting to zoom or scroll the outer page at that point, not interact with the Pixi canvas)

To work around that, I modify and compile my own custom version of Pixi where I can apply preventDefault in a more granular way...

To get page-scrolling functionality it seems I only need to preventDefault in the InteractionManager.prototype.onTouchEnd function. Whereas autoPreventDefault will also preventDefault on 3 other events. (onMouseDown, onTouchMove, onTouchStart).

Leaving autoPreventDefault = false and applying preventDefault only to onTouchEnd, gives me the functionality I need. But it'd be nice to not have to customize and rebuild Pixi in this way every release. (Sorry if there's something else I'm missing here; I don't completely understand the event system in Pixi, or what else to do about this scroll-touch problem)
So I disabled e.preventDefault() on 'onTouchStart' and on 'onMouseMove', but left it as is on 'onTouchEnd'.
This works perfectly on iOS devices but not on Android, the only exception being a Samsung A7 using the Adblock browser (it fails in Chrome).
Would really appreciate some help on this.
TLDR:
Disabling PIXI's e.preventDefault on onTouchStart and onMouseMove works on iOS devices and lets me scroll the page by dragging my canvas around, but it does not work on Android devices.
My solution for that was to use
renderer.plugins.interaction.autoPreventDefault = false
This should work on iOS and Android.
The docs for autoPreventDefault read:
Should the manager automatically prevent default browser actions.
Using PIXI 4.5.6.
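As a minimal sketch of how that fits together in a PIXI 4.x setup (the renderer variable and sizes are placeholders, and the commented-out touchend handler is only a guess at how you could reproduce the granular preventDefault described above without rebuilding Pixi):
var renderer = PIXI.autoDetectRenderer(800, 600);
document.body.appendChild(renderer.view);

// Let the page handle touch gestures itself: the InteractionManager no longer
// calls preventDefault on mouse/touch events over the canvas.
renderer.plugins.interaction.autoPreventDefault = false;

// If you still need to suppress the browser default for one specific event,
// you could attach a plain DOM listener to the canvas instead:
// renderer.view.addEventListener('touchend', function(e) { e.preventDefault(); });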
Take a look at the docs:
http://pixijs.download/dev/docs/PIXI.CanvasRenderer.html#plugins
http://pixijs.download/dev/docs/PIXI.interaction.InteractionManager.html
Using renderer.plugins.interaction.autoPreventDefault=true should do the trick.
I wonder if it is possible to detect what kind of device is running a JS app.
What do I mean? I want to support touch monitors (which can be used as normal touch devices but can also have a mouse attached, so they behave like a normal PC) as well as classic touch devices (e.g. mobile phones, with no possibility of using a mouse).
Is it possible? I know how to detect touch devices:
var isTouch = !!("ontouchstart" in window || navigator.msMaxTouchPoints);
But how do I detect that it is a touch monitor, or at least that it has a mouse attached?
Well, you already have a boolean reflecting touch capabilities. Obtaining the same for the mouse is doable, although not exactly elegant. You could do something like this:
var hasMouse;

window.addEventListener('mousemove', function mouseCheck(e) {
  // A real mouse moved: remember that and stop listening.
  window.removeEventListener('mousemove', mouseCheck);
  hasMouse = true;
});
Combined with user-agent inspection, you could get a decent estimate of the user's device. Naturally, some mobile browsers allow the user to spoof a non-mobile user agent for 'desktop'-like viewing. I'm not too fond of the idea of coercing interface formats in a situation where you aren't certain of the device's capabilities. How about simply showing a modal at application start and using isTouch && hasMouse, along with user-agent information, to indicate the 'recommended' layout?
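A rough sketch of combining those two signals (pickLayout is a hypothetical helper, not a real API):
// Detect touch support up front; detect a mouse lazily on first movement.
var isTouch = !!('ontouchstart' in window || navigator.msMaxTouchPoints);
var hasMouse = false;

window.addEventListener('mousemove', function mouseCheck() {
  window.removeEventListener('mousemove', mouseCheck);
  hasMouse = true;
  pickLayout();
});

// Hypothetical helper: decide which layout to recommend once we know more.
function pickLayout() {
  if (isTouch && hasMouse) {
    console.log('Touch monitor with a mouse attached - offer both layouts');
  } else if (isTouch) {
    console.log('Touch-only device - use the touch layout');
  } else {
    console.log('No touch support - use the desktop layout');
  }
}

pickLayout(); // initial guess before any mouse movement has been seen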
I'm developing a presentation-like local site (it will run locally, displayed on several screens, and that's all) and I picked Chrome to develop for (you know, awesome support for all the neat stuff). Chrome will run as a full-screen window on top of Windows 7 (or maybe 8, who knows?).
The app/site will be operated through a multi-touch screen.
However!
Since the screen is multi-touch, everything is zoomed in/out on pinch. Is there a way to disable this? (Even with a separate app running in the background, I don't know.) Also, I would like to be able to use multi-touch in the app later on, but for custom interactions, not for zooming the screen...
By the way, I tried the trick I used in an iPhone app:
document.addEventListener('mousemove', function(e){ e.preventDefault(); }, false);
and:
<meta name="viewport" content="width=device-width, initial-scale=1.0, user-scalable=no">
But it seems it doesn't work.
Sooo... Any ideas?
Thanks!
The best solution for now (until Google updates Chrome, if they ever will) is to disable gestures in your touchscreen/mouse/touchpad settings: Control Panel -> Mouse -> Device Settings -> (pick device) -> Settings -> Disable pinch zoom.
I am working on a similar project (a browser-based digital kiosk with a multi-touch screen) and ran into the same problem.
Since I have not found a workaround in Chrome, I am switching to Firefox. You can disable pinch zooming in Firefox by navigating to about:config and changing the values of browser.gesture.pinch.in and browser.gesture.pinch.out to "".
When I do a Google image search on my iPhone within the mobile Safari browser, it gives me this beautiful interface for flipping through the images. If I swipe left or right, it browses through the images; if I touch and move up or down, I get what appears to be the native Safari scroll. Can anyone explain how Google does this? I'm only beginning to learn the Safari API for touch events. It seems like either you capture the touch events to attach handlers for swiping left or right, or you let Safari handle the touch events natively, in which case you get the beautiful native Safari scrolling. Can anyone explain how Google captures the left/right swipe but not the scrolling?
There are touch-specific DOM events, and they've implemented a lot of JavaScript logic around them. Read the Safari Web Content Guide's "Handling Events" docs, and also check out the official spec for Touch Events.
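As a simplified sketch of the general technique (not Google's actual code): look at the touchmove deltas, call preventDefault only when the movement is mostly horizontal, and let vertical movement fall through to Safari's native scrolling.
var startX = 0, startY = 0;

document.addEventListener('touchstart', function(e) {
  startX = e.touches[0].clientX;
  startY = e.touches[0].clientY;
}, false);

document.addEventListener('touchmove', function(e) {
  var dx = e.touches[0].clientX - startX;
  var dy = e.touches[0].clientY - startY;

  if (Math.abs(dx) > Math.abs(dy)) {
    // Mostly horizontal: treat it as a swipe and block native scrolling.
    e.preventDefault();
    // ...flip to the previous/next image here...
  }
  // Mostly vertical: do nothing, so the page scrolls natively.
}, false);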
A while back, I wrote a quick library that wraps some native-like gestures as HTML events: JSGestureRecognizer. I don't really recommend using that library in production, but reading the source should give you a pretty good idea of how Google went about listening to native touch events and building complex user interfaces with them.
I am updating my scrolling game engine to output HTML5 code for the scrolling maps it generates, so that it can be used not only as a (somewhat platform-specific) complete game creator, but also as a cross-platform HTML5 scrolling map editor. I got past the challenge of supporting graphic tinting as described in my earlier question, and I have a nice sample running at http://sgdk2.enigmadream.com/ben/. However, I have noticed that the mouse interaction for scrolling the map does not work in Firefox or on an iPod. It looks like the iPod may use different events (ontouch etc.) according to "Native HTML5 Drag and Drop in Mobile Safari (iPad, iPod, iPhone)?", and that doesn't explain why Firefox wouldn't react. Isn't there a more universal way to support mouse or touch interaction? Do the touch events also work for the mouse, or are they specific to touch? How would you recommend interacting with this scrolling map in the most cross-platform-compatible way?
You need to retrieve the source element correctly:
var srcEl = e.srcElement ? e.srcElement : e.target;
Try it.
P.S.: see the documentation on event targets.
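As a rough sketch of the broader cross-platform pattern (the element id and handler are placeholders): register the same handler for both mouse and touch events, read coordinates from e.touches when present, and fall back from srcElement to target.
var map = document.getElementById('scrollingMap'); // placeholder id

function getPoint(e) {
  // Touch events carry coordinates in e.touches; mouse events carry them directly.
  var src = e.touches ? e.touches[0] : e;
  var el = e.srcElement ? e.srcElement : e.target;
  return { x: src.clientX, y: src.clientY, el: el };
}

function onStart(e) {
  var p = getPoint(e);
  console.log('drag started at', p.x, p.y, 'on', p.el);
  e.preventDefault(); // keep the page itself from panning/selecting while dragging the map
}

// Same handler, wired up for both input models.
map.addEventListener('mousedown', onStart, false);
map.addEventListener('touchstart', onStart, false);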