Capture horizontal touch movement while allowing native vertical scroll - javascript

I'm trying to explore how feasible it is to create an HTML5 interaction where a touch can be conditionally captured by the initial direction of the motion.
The goal is to capture and track the motion of the touch only if it initially moves in a horizontal direction. Combined with a responsive page layout which only scrolls vertically, we should be able to use horizontal swipe motion to do something cool by tracking it.
The problem is in the seeming impossibility (especially on iOS) of performing touch sequence capture conditionally.
In order to track a touch (and by this I mean specifically obtaining the stream of x,y coordinates which represent the position of the finger over time), touchstart has to be preventDefault()ed to keep the page from engaging native momentum scroll. Native momentum scroll, while in operation, suspends all JS execution (setTimeout, rAF, jQuery animate, et al.) and even CSS keyframe/animation/transition execution.
I would really like to know if there's a way to somehow condition the preventDefault() of the original touchstart event. It's entirely possible for JS to store that event object and defer calling preventDefault() on it until some later time (when it has been determined that we are indeed interested in preventing the native scroll from starting).
But this is certain to fail, because the default behavior when preventDefault() is not run on that event is to engage native scroll, which then blocks JS execution for the entire remaining duration of the scroll. Blocking inside the touchstart listener (not returning from it until we've decided) does not appear to be an option either; it would freeze everything.
The tentative answer, then, without any additional insight, is that all swipes must be captured if we want to capture any at all, and scrolling has to be "faked" via external means à la [iScroll](http://cubiq.org/iscroll-5-ready-for-beta-test) (personally I would want to explore some kind of combination of rAF and window.scrollTo, and would be surprised if iScroll 5 does not employ these APIs).
Incidentally the description for the 4th demo there got me really excited, but it turns out that all it does is create a page section that iScrolls horizontally while the rest of the page behaves regularly, which is entirely mundane.
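For what it's worth, here is a minimal sketch of the conditional capture being discussed, deferring the decision to the first meaningful touchmove instead of touchstart; whether this wins the race against native scroll on iOS is exactly the open question above. The 10px threshold and the onHorizontalSwipe callback are placeholders of mine, and modern browsers need `{ passive: false }` on the touchmove listener for preventDefault() to be honored.

```javascript
// Direction-lock sketch: decide on the first meaningful touchmove whether this
// gesture is "ours" (horizontal) or the browser's (vertical native scroll).
var startX = 0, startY = 0;
var locked = null;            // null = undecided, 'h' = ours, 'v' = native scroll
var THRESHOLD = 10;           // px of travel before deciding (placeholder value)

document.addEventListener('touchstart', function (e) {
  var t = e.touches[0];
  startX = t.clientX;
  startY = t.clientY;
  locked = null;              // note: no preventDefault() here
}, { passive: true });

document.addEventListener('touchmove', function (e) {
  var t = e.touches[0];
  var dx = t.clientX - startX;
  var dy = t.clientY - startY;

  if (locked === null && (Math.abs(dx) > THRESHOLD || Math.abs(dy) > THRESHOLD)) {
    locked = Math.abs(dx) > Math.abs(dy) ? 'h' : 'v';
  }

  if (locked === 'h') {
    e.preventDefault();       // suppress native scroll and track the swipe
    onHorizontalSwipe(dx);    // placeholder: do something cool with the motion
  }
  // locked === 'v' (or still undecided): do nothing, let native scroll proceed
}, { passive: false });       // passive: false so preventDefault() is honored

document.addEventListener('touchend', function () {
  locked = null;
});

function onHorizontalSwipe(dx) {
  // placeholder for the actual tracking/animation code
}
```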

Related

How can I use DeviceOrientationControls with scroll in iOS 13?

I am attempting to use the data received from DeviceOrientationEvents to animate (rotate) a camera in three.js using three's DeviceOrientationControls. The controls are updated upon every animation frame, and everything works as I would expect. However, if I begin to scroll, then no DeviceOrientationEvent is fired again until the inertia from the scrolling is complete.
I have confirmed that these events are not fired (or at least not dispatched) during the scroll by logging to the console from within the DeviceOrientationEvent handler. I can see the events fired regularly up until the moment I begin to scroll, then stop, and then resume firing from the moment the inertia from the scroll is complete.
Manually stopping the inertia mid-scroll (by touching the screen) also causes the deviceorientation events to resume.
I have disabled all other scroll event handlers in my script. I have made all touch event handlers passive, have tried making them non-passive as well, and have also tried disabling all touch event handlers in my script altogether.
By this point I am fairly sure this is a function of how the scroll thread (which operates separately from the main thread) and the processing of IMU data are scheduled/queued in the browser, in which case there may be no good solution, but I'm asking here in case there is something I've overlooked in my own troubleshooting. This does not appear to be an issue with three.js or with three.js's DeviceOrientationControls, but I've tagged this as three.js just in case anybody has come across this problem when attempting something similar.
My unique case for having DeviceOrientationControls enabled while scrolling is that scroll drives the animation of a "camera rig" (of empty objects whose rotation and position are animated), while moving the phone around rotates the camera itself. (It's a bit like being able to turn your head to look around while moving in a railcar.)
My testing has been on an iPhone 11 Pro, with iOS 13.5.1, in Chrome iOS 84 and Safari. I have not tested on Android.
iOS has slowed down repetitive JavaScript functionality during scroll for many years now. This is to reduce battery consumption: since the page has to be re-rendered many times while scrolling, iOS halts other, secondary work until scrolling is complete. See here for more.
You could create your own custom scrolling functionality without actually scrolling down an HTML page by capturing vertical swipe gestures via 'touchstart' and 'touchmove'. Or you could use a library like Hammer.js to help you.
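A rough sketch of that first suggestion, assuming the page itself is prevented from scrolling (e.g. overflow: hidden on body) and the visible content lives in a wrapper with id="content" (both assumptions of mine): the finger drags a CSS transform instead of the real scroll position, so iOS never enters momentum scroll and deviceorientation events keep firing. Inertia is not implemented here; that is where a library like Hammer.js (or your own rAF-based decay) would come in.

```javascript
// "Fake" vertical scrolling: translate a content wrapper instead of letting the
// page scroll natively, so the iOS momentum scroll (which pauses JS and
// deviceorientation events) is never triggered.
// Assumes <body> does not scroll and a wrapper element with id="content".
var content = document.getElementById('content');   // placeholder element
var offsetY = 0;
var lastY = 0;

document.addEventListener('touchstart', function (e) {
  lastY = e.touches[0].clientY;
}, { passive: true });

document.addEventListener('touchmove', function (e) {
  var y = e.touches[0].clientY;
  offsetY = Math.min(0, offsetY + (y - lastY));      // clamp at the top edge
  lastY = y;

  content.style.transform = 'translateY(' + offsetY + 'px)';

  e.preventDefault();                                // keep the real page still
}, { passive: false });                              // required for preventDefault()
```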

How can I allow a page to scroll while still capturing a dojox swipe gesture?

I am using dojox/gesture/swipe to trigger a carousel card change whenever a user performs a swipe gesture on my carousel widget. However, it seems that the user input is effectively "trapped" within whatever element is capturing the gesture, which means the user can't scroll the page by swiping up or down on the screen if the scroll starts with the user touching and swiping from within the carousel.
It seems that this is almost the intended behaviour as well, since this happens even with the most bare custom gesture that simply inherits from dojox/gesture/swipe without overriding any methods. Is there any way to let the scrolling of the page happen even while the gesture is being "interpreted", or am I misunderstanding the usage of the gesture system? I guess the intended use may actually be a means of more robust user input, like interpreting letter gestures akin to a Palm Pilot.

forwarding long touch to multiple targets

I am using javascript/jquery/jquerymobile/cordova in my application currently testing on Android.
I am trying to detect the movement of a touch in a screen into the multiple targets/buttons that there maybe in the screen.
On desktop, hover (mouseover) would do just what I want. However, it does not work on mobile platforms the way it does on desktops.
In my case I am looking to lay out some 288 very small buttons, and I want to receive an event for each target that is touched/moused-over along the journey of the finger across the screen (one single touchstart-touchmove-touchend sequence). As I understand it, as soon as a touchstart happens, that target is the only one receiving the event notifications, no matter where the touch ends up or what else is touched.
I could do it with canvas and calculate the distance to all 288 points on each finger movement, etc., but that would be resource intensive, especially for a task at which a web browser already excels. Hence I am looking to see how much I can delegate to the browser.
-- edited
I found a possible answer in How to find out the actual event.target of touchmove javascript event
I will be testing today. However, I am concerned by Steven Lu's comments on performance, especially as in my case there is a significant number of objects/points that I need to track.
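For reference, the technique in that linked question boils down to document.elementFromPoint: on every touchmove, ask the browser which element is currently under the finger, since the event's own target stays pinned to where the touch started. A minimal sketch, assuming the 288 buttons share a class name like tiny-button (my placeholder):

```javascript
// touchmove keeps reporting the element where the touch *started* as its
// target, so hit-test the current finger position with elementFromPoint.
var lastHit = null;

document.addEventListener('touchmove', function (e) {
  var t = e.touches[0];
  var el = document.elementFromPoint(t.clientX, t.clientY);

  // react only when the finger enters a new button (placeholder class name)
  if (el && el !== lastHit && el.classList.contains('tiny-button')) {
    lastHit = el;
    onButtonEntered(el);          // call your own handler directly rather than
  }                               // synthesizing mouseover events
}, { passive: true });

function onButtonEntered(button) {
  // placeholder: whatever should happen when the finger passes over a button
}
```

The per-move cost is a single native hit test plus whatever the handler does, which should generally be cheaper than computing distances to all 288 points in JS on every move.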

Native scroll delaying or stopping JavaScript execution on iOS

This isn't a specific JS code issue, but more the way iOS deals with JS that is causing more problems on my site than most others.
On iOS only (it doesn't happen on Android), if I'm natively scrolling (up/down) and then try to activate some JS just before the scroll has finished (i.e. very quickly), it completely ignores the JS.
I believe Apple does this so that the UX always remains the priority (don't let any crappy JS slow down the user), but in this case it's just a very simple piece of JS that I want to allow to run.
As an example, if a user is scrolling and then quickly presses a tab at the top of the screen that opens a fixed navigation panel then it won't register if the native scroll is still happening. If they press it again (the scroll has finished) then it works.
I'm also using a JS slider to scroll horizontally through images and if I try to scroll left/right just before the native up/down scroll has finished it sort of jumps and isn't good UX. I think it's prioritising the native scroll but still activating the horizontal scroll with some sort of delay.
It's not a massive problem, but not perfect. If everybody slowly navigated the site and waited for the native scroll to come to a complete stop, it would be great. But of course people won't do this.
I don't think preventing the default behaviour will do anything. I have tried to take over the native scroll before on iOS and I just don't think you can.
I think this may actually happen on many sites. I've just tried to find a good example by visiting stackoverflow.com on an iPhone: if you scroll quickly and then quickly hit a link before the scroll has finished, it won't register. I don't think text links are as big a UX issue, though; a horizontal slider and a big 'open menu' button at the top are much more likely to be hit quickly before the native scroll has ended (since you don't need to read anything before pressing them, unlike text links).
I have various JS scripts on a site that would benefit from this being improved in iOS, so if I can understand a way around it, why it happens, what is going on, then I can apply individual fixes to each of those scripts.
Thanks.
The problem is not that iOS ignores javascript while scrolling (more precisely, while the scroll momentum is active). The problem is that, while that happens, iOS does not really register the position change of elements on the screen. In fact, if you have a handler attached to the scroll event, it will stop firing the moment you stop touching the screen, and then will fire just once when the scrolling stops.
Consequence? You think you're touching a link, but you aren't. The image on the screen has moved up or down, but, to the browser, everything is still in the same position, so, actually, you aren't touching anything (or are touching something different). I got very annoyed when I found this behaviour because, in my case, my page is full of images that are links to a gallery, and if you touch them while scrolling, the gallery opens showing you not the image you touched, but another one (the one that was really at that position when your finger stopped touching the screen).
Is there a workaround? The only one that I know of is disabling the scroll momentum, but you lose scrolling performance.
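To make that concrete, a small sketch (the selector and the logging are my own placeholders): the scroll listener shows the gap described above, i.e. events stop when the finger lifts and one final event arrives when momentum ends, and for an overflow container the momentum itself can be traded away by not opting in to -webkit-overflow-scrolling: touch.

```javascript
// Observe the behaviour described above: scroll events stop firing the moment
// the finger lifts, then fire once more when the momentum scroll settles.
window.addEventListener('scroll', function () {
  console.log('scroll position', window.pageYOffset, 'at', Date.now());
});

// The only workaround mentioned above is giving up momentum. For an overflow
// container (placeholder selector), that means leaving
// -webkit-overflow-scrolling at 'auto' instead of 'touch'.
var scroller = document.querySelector('.scroll-container');  // placeholder
if (scroller) {
  scroller.style.webkitOverflowScrolling = 'auto';  // no momentum, accurate hit-testing
}
```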

Handle onTouchDown on mobile browser and onMouseDown on desktop/laptop?

I'm developing a web game application with an HTML5 canvas that has to react to "click" events (in the general meaning).
On a mobile platform (or touch-capable), that means reacting to onTouchDown, on a desktop/laptop platform (with a mouse or pad), that means reacting to onMouseDown.
Problem is, if I handle both events, then sometimes the same "click" will result in both events getting fired, so I get a double signaling.
What would be the best way to handle that?
Currently, upon the first touch event I receive, I turn off mouse events, but that may be a bit heavy-handed if the platform supports both touch & mouse clicks (e.g. an Android tablet with an attached keyboard/pad/mouse).
Measuring the delay between touch & click, to ignore a click right after a touch, doesn't work too well either: there are circumstances where the user may touch/click at a high frequency, so too long a delay leads to dropped double taps/clicks, while too short a delay lets the occasional double signaling slip through.
I've looked at user-agent detection, but that seems quite fragile (many user agents out there), and doesn't solve the cases where the platform has both touch & mouse/pad.
Maybe a combination of your suggestion:
Measuring the delay between touch & click to ignore a click after a touch doesn't work too well, as there are circumstances where user may touch/click at a high frequency, so a too long delay leads to dropped double taps/clicks, and a too short delay lets through the occasional double signaling slipping through
And detecting the x,y coordinates of the tap would decrease the false positives: if both events (tap & click) happen in sequence at the same coordinates, they are handled as the same interaction.
Or maybe let the user choose (through some sort of options screen) whether to use mouse or touch if the device supports both. Display a message, for example, saying "you're on a touch device so we enabled touch events; if you're using a mouse, please see the options", or something like that.
Most users would be happy with the auto-choice and everybody unhappy can change it.
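A rough sketch of that combination, assuming a canvas with id="game-canvas" (my placeholder, as are the 500 ms window and 25 px radius): remember where and when the last touch happened, and swallow any mouse event that arrives shortly afterwards at roughly the same spot, since that is the synthetic mouse event the browser fires for the same tap.

```javascript
// De-duplicate touch + mouse: handle the touch immediately, then ignore the
// synthetic mousedown the browser fires afterwards for the same tap.
var canvas = document.getElementById('game-canvas');  // placeholder element
var lastTouch = { x: 0, y: 0, time: 0 };
var CLICK_DELAY = 500;    // ms window after a touch (placeholder, tune it)
var CLICK_RADIUS = 25;    // px radius around the touch (placeholder, tune it)

canvas.addEventListener('touchstart', function (e) {
  var t = e.changedTouches[0];
  lastTouch = { x: t.clientX, y: t.clientY, time: Date.now() };
  handlePress(t.clientX, t.clientY);
});

canvas.addEventListener('touchend', function (e) {
  var t = e.changedTouches[0];
  // synthetic mouse events follow touchend, so refresh the timestamp here too
  lastTouch = { x: t.clientX, y: t.clientY, time: Date.now() };
});

canvas.addEventListener('mousedown', function (e) {
  var dt = Date.now() - lastTouch.time;
  var dx = e.clientX - lastTouch.x;
  var dy = e.clientY - lastTouch.y;
  if (dt < CLICK_DELAY && dx * dx + dy * dy < CLICK_RADIUS * CLICK_RADIUS) {
    return;               // same tap, already handled via the touch path
  }
  handlePress(e.clientX, e.clientY);
});

function handlePress(x, y) {
  // placeholder: react to a press at viewport coordinates (x, y)
}
```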
