Detect touch monitor with mouse attached - javascript

I wonder if it's possible to detect the kind of device that runs a JS app.
What do I mean? I want to serve touch monitors (which can be used as normal touch devices but can also have a mouse attached, so they behave like a normal PC) as well as classic touch devices (e.g. mobile phones, with no possibility of attaching a mouse).
Is it possible? I know how to detect touch devices:
var isTouch = !!("ontouchstart" in window || navigator.msMaxTouchPoints);
But how can I detect that the device is a touch monitor, or at least that it has a mouse attached?

Well, you already have a bool reflecting touch capabilities. Obtaining the same for the mouse is doable, although not exactly elegant. You could do something like this:
var hasMouse = false;
window.addEventListener('mousemove', function mouseCheck(e) {
  window.removeEventListener('mousemove', mouseCheck);
  hasMouse = true;
});
Combined with user-agent inspection, you could get a decent estimate of the user's device. Naturally, some mobile browsers let users spoof a non-mobile user agent for 'desktop' viewing. I'm not too fond of coercing interface formats in a situation where you aren't certain of the device's capabilities. How about simply showing a modal at application start, and using isTouch && hasMouse along with the user-agent information to indicate the 'recommended' layout?
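Putting the two flags together, here is a minimal sketch of that decision. The layout names ("hybrid", "touch", "desktop") are illustrative only, not from any API:

```javascript
// Minimal sketch, assuming the isTouch and hasMouse flags from above.
// The layout names are my own; map them to whatever your app uses.
function recommendedLayout(isTouch, hasMouse) {
  if (isTouch && hasMouse) return "hybrid";  // touch monitor with a mouse attached
  if (isTouch) return "touch";               // classic touch device, e.g. a phone
  return "desktop";                          // no touch capability detected
}

// In the browser the flags would come from the checks shown earlier:
// var isTouch = "ontouchstart" in window || navigator.maxTouchPoints > 0;
// plus the mousemove listener that sets hasMouse.
```

You could feed the result into the modal as the pre-selected option rather than forcing it.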

Related

Is there a way to detect if the device is a VR HEADSET and not just a "VR allowed" device?

I've been searching but I couldn't find anything. I'm making a three.js WebGL VR experience, but I don't want the VR to be accessible on mobile phones, ONLY on VR headsets.
At the moment I can only detect whether the device is able to use VR, but many mobiles are, and I don't want to allow VR on those. I've been searching but couldn't find any solution.
Not sure why you'd purposely prevent certain users from accessing content they can handle. But if you really want to exclude mobile devices, you're going to have to target a functionality specific to them. For example, you can disable your app when touch events fire:
button.addEventListener('touchstart', (event) => {
  // Disable your app however you please
  app.disabled = true;
});
Keep in mind this isn't 100% foolproof because some devices could pretend to have touch events, but it can get you pretty close.
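As a sketch of that approach, the check can be folded into one helper. The helper name and the user-agent backstop are my own assumptions, not a standard API, and the same spoofing caveat applies:

```javascript
// Hedged sketch: decide whether to offer the VR entry point.
// hasTouchEvents would come from a touchstart listener like the one above;
// the UA regex is a weak backstop, not a reliable signal.
function shouldOfferVR(hasTouchEvents, userAgent) {
  if (hasTouchEvents) return false;                     // likely a phone or tablet
  return !/Android|iPhone|iPad/i.test(userAgent || ""); // filter obvious mobiles
}

// In the browser you might call:
// shouldOfferVR(touchSeen, navigator.userAgent)
```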

Knowing when to display on screen touch controls

I'm making a game that has 2 controls, left and right. On desktop those are the left and right cursor keys. On a machine without a keyboard or a machine primarily interacted with via touch screen I need to show buttons the user can press with their fingers.
Is there a way to detect when it's appropriate to display these touch buttons and when it's not?
Solutions considered:
Check for iPhone/iPad/Android
That covers most phones and tablets, but of course it will be wrong for some other brands.
Use the CSS media queries hover and pointer
AFAIK these won't help. A phone probably has hover: none and pointer: coarse, but what about an iPad with a stylus? It should report both hover: hover and pointer: fine, making it indistinguishable from a desktop.
Check by size
Seems flaky. Tablets can be pretty large, often as large as laptops.
Check for touch support
isTouchDevice = "ontouchstart" in window;
This returns true on many Windows devices with touch screens.
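None of the options above is bulletproof on its own, but they can be combined into a heuristic. A sketch follows; matchMedia is passed in as a parameter only so the logic is testable outside a browser, and the stylus-iPad caveat above still applies:

```javascript
// Heuristic sketch: prefer the pointer media query, fall back to touch support.
// In the browser, matchMediaFn would be window.matchMedia.
function shouldShowTouchControls(matchMediaFn, hasTouchStart) {
  if (matchMediaFn) {
    if (matchMediaFn("(pointer: coarse)").matches) return true;  // finger-first device
    if (matchMediaFn("(pointer: fine)").matches) return false;   // mouse/trackpad-first
  }
  return hasTouchStart; // last resort; wrong on touch-screen laptops
}

// Browser usage:
// shouldShowTouchControls(window.matchMedia, "ontouchstart" in window);
```

Even then, offering a manual toggle for the on-screen buttons covers the cases the heuristic gets wrong.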

How to capture touch pad input

I've looked everywhere for how to capture touch pad input for laptops but I can't seem to find anything for Chrome extensions/JavaScript.
Question: how can I capture the number of fingers down (not clicked, just down and potentially moving as you would with a mouse), their corresponding x,y coordinates, and their corresponding up events, for a touch pad on a laptop?
Clarifications:
I'm not interested in detecting touch screen events, just touchpad events.
You can assume the touchpad lives on a laptop three years old or newer.
I can't find the source right now, but I have read about this topic. The synopsis is simple: it's a draft in development, but no browser supports it as of now.
Here is the W3C draft: https://w3c.github.io/pointerevents/
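For reference, the interface described in that draft looks like the sketch below. Note the catch for this question: on most systems a laptop touchpad is surfaced to the browser as a mouse, so events carry pointerType "mouse" and individual finger contacts are not exposed (the formatting helper is my own, not part of the spec):

```javascript
// Hedged sketch of the Pointer Events interface from the W3C draft above.
// describePointer is a hypothetical helper for logging.
function describePointer(e) {
  return e.pointerType + " #" + e.pointerId +
         " at (" + e.clientX + "," + e.clientY + ")";
}

// In a browser that implements the draft:
// document.addEventListener("pointerdown", function (e) {
//   console.log(describePointer(e)); // touchpads typically report "mouse"
// });
```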
I think it is not possible to do this using JavaScript alone. Let's take it this way:
Consider the following situation:
I am using a MacBook Pro 13in with a Retina display and a multi-touch, multi-gesture trackpad.
Now suppose my operating system is configured so that a two-finger tap registers as a normal left click, and a single-finger tap registers as a right click.
Now imagine we capture both the click and contextmenu events. Which event fires when I tap once with one finger? It will be contextmenu, and when I tap with two fingers a click event fires. The browser only ever sees the remapped result.
Another case: imagine I have inverted scrolling turned on. When I scroll upwards, my page scrolls downwards, and this is something Chrome or Firefox does not control.
Conclusion:
There can be any number of such settings across operating systems and across devices such as trackpads, trackballs, touchpads, mice, the Magic Mouse, etc. This suggests there is a layer between the external hardware and the browser's event firing, and that layer is provided by the operating system. It is the operating system that manipulates the events according to user-defined or preset settings.
Some devices, such as touch screens, are intended to fire multiple events on touch, but that is not the case for all devices. So whether you click with a mouse, a trackball, a touchpad, or a touch screen, you will get one common event: a click. Some additional events may fire, but they depend on the type of device, not on the settings in your operating system.
One way you could capture the event is by establishing some sort of connectivity between your browser web page and the operating system, as suggested by #AlvaroSanz.
To develop that kind of extension, you would need to write a Chrome Native Client module using Windows Touch Input to make it happen.
I know you're asking for a solution for Chrome extensions/JavaScript, but I've been searching and finding nothing, so I ended up with a possible solution combining VB and JavaScript.
There is a VB API for Synaptics touchpads (https://autohotkey.com/board/topic/65849-controlling-synaptics-touchpad-using-com-api/) and you can call JavaScript from VB (http://www.codeproject.com/Articles/35373/VB-NET-C-and-JavaScript-communication#cjfv). It's a long way, but it's a way.

Detecting pinching in desktop browsers

So with scroll events I can detect panning gestures in desktop browsers and modify content accordingly. But is there also a way to detect pinching (zooming) gestures, so that instead of the browser zooming the whole site (or doing nothing), I could modify a DOM element accordingly?
There are laptops with such trackpads (like the Magic Trackpad and the Force Touch trackpad), so the gestures can be captured, but how can I use them in desktop web apps?
Imagine being able to pinch to zoom in or out on Google Maps in a desktop browser, and to pan left and right with a hand gesture.
I would suggest using some specialized library like hammer.js. Have a look at this example of the hammer.js documentation, where they utilize pinch gesture recognition.
According to that example detecting the pinch gesture could be as simple as:
var myElement = document.getElementById('myElement'),
    mc = new Hammer.Manager(myElement),
    pinch = new Hammer.Pinch();

mc.add(pinch);
mc.on("pinch", function(ev) {
  console.log('You sir did just pinch!');
});
However, if you only wanted to react to the changing viewport within your layout, you might be better off with using responsive design features like media queries.
Edit: It does not work that way, because desktop browsers do not yet support gesture events. However, the second part of my answer remains true: you can use media queries based on the viewport dimensions, and you can hook into the window's resize events as usual.
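One workaround worth mentioning: several desktop browsers report a trackpad pinch as a wheel event with ctrlKey set (mimicking ctrl+wheel zoom). Treat that as an assumption to verify on your target browsers; the scaling formula below is a heuristic of my own:

```javascript
// Hedged sketch: convert a wheel event into a zoom factor, or null when the
// event is a plain scroll rather than a pinch (ctrlKey unset).
function pinchZoomFactor(e) {
  if (!e.ctrlKey) return null;       // regular scrolling, not a pinch
  return Math.exp(-e.deltaY / 100);  // heuristic: deltaY < 0 zooms in
}

// Browser usage (assumption: your browser maps pinch to ctrl+wheel):
// element.addEventListener('wheel', function (e) {
//   var f = pinchZoomFactor(e);
//   if (f !== null) {
//     e.preventDefault();           // stop the browser zooming the page
//     // scale your DOM element by f here
//   }
// }, { passive: false });
```

The passive: false option matters, because preventDefault() is ignored in passive wheel listeners.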
I am looking for the same thing: pinching on only a specific DOM element and not the entire web page (say, an embedded drawing app instead of the Google Maps example).
In the developer console of Chrome and Opera, select Toggle Device Toolbar and choose a touch device. Pinching can be simulated by keeping the Shift key + left mouse button pressed. Then check out monitorEvents: https://developers.google.com/web/tools/chrome-devtools/debug/command-line/events?hl=en
Additionally, Hammer.js has a touch emulator library:
<script> TouchEmulator(); </script>
function log(ev) {
  console.log(ev.touches);
}
document.body.addEventListener('touchmove', log, false);
Later edit: for the Google Maps example specifically, the viewport is set like this so that the page uses its own zooming libraries and not the browser's built-in zoom:
<meta name="viewport" content="width=device-width, user-scalable=no, initial-scale=1">

How to detect 4 fingers touches on iOS without exit the app?

I am creating a web-app for iPad where I need to detect when 4 fingers touch my div.page.
I can easily do this with:
$('.page').on("touchstart", function(e) {
  e.preventDefault();
  touchesNbr = e.originalEvent.touches.length;
});
But my problem is that, even with the e.preventDefault(), iOS automatically shows the "which app would you like to kill" screen when 4 fingers touch the screen and move slightly up.
Is there a way to avoid that?
Unfortunately, it seems impossible. Multitasking gestures are handled at the OS level, which means JavaScript can't access them.
