| Summary: | Enhance PointerEvent / Deprecate MouseEvent | | |
| --- | --- | --- | --- |
| Product: | [JogAmp] Newt | Reporter: | Sven Gothel <sgothel> |
| Component: | core | Assignee: | Julien Gouesse <gouessej> |
| Status: | CONFIRMED | | |
| Severity: | enhancement | CC: | gouessej |
| Priority: | --- | | |
| Version: | tbd | | |
| Hardware: | All | | |
| OS: | all | | |
| Type: | FEATURE | SCM Refs: | |
| Workaround: | --- | | |
| Bug Depends on: | 807, 813 | | |
| Bug Blocks: | 595, 1427 | | |
Description
Sven Gothel 2013-08-09 14:49:22 CEST
We have to handle the following kinds of events:

- button events (pressed, released) [digital]
- axis events (float) [analog]
- sliding axis events (float, like 2 axes: abscissa, ordinate) [analog]
- hat switch or POV (point-of-view) hat events (9 directions) [analog]
- vector3 events (float, like 3 axes: abscissa, ordinate, applicate) [analog]
- plug in/out events (when a gamepad/mouse/keyboard is added or removed)

We have to handle some effects:

- rumbler
- force feedback

We have to represent all these devices:

- keyboard
- mouse
- gamepad

For example, a mouse event should indicate which mouse has just generated it; nothing forces the end user to plug in a single mouse and a single keyboard, even though that is the general case. An input device must describe its "components", which is very important for gamepads. The developer should be able to listen to a specific device. It is already possible to listen to all keyboards, which is what the current key listeners do. Generating "mouse" events and moving the mouse pointer with a gamepad or any joystick can be useful when there is no mouse, for video game machines and in assistive technology, for people who can no longer use a mouse because of muscular weakness.

A mouse can be seen as an input device with 2 axes (X & Y), one or multiple buttons and maybe a wheel. The problem is that some native APIs only provide relative data about the mouse movements and the wheel. These "raw" APIs are useful to get more accurate and resolution-independent data, especially in games: they allow decreasing the screen resolution without decreasing the accuracy of the mouse. If we use these APIs, we have to compute the absolute position from the relative movement.

(In reply to comment #1)
> - axis events (float) [analog]
> - sliding axis events (float, like 2 axes: abscissa, ordinate) [analog]
> - hat switch or POV (point-of-view) hat events (9 directions) [analog]
> - vector3 events (float, like 3 axes: abscissa, ordinate, applicate) [analog]

.. can you elaborate on these? Best if you can add a reference/link to documentation.

(In reply to comment #1)
> We have to handle some effects:
> - rumbler
> - force feedback

I assume these are not input events, but triggers the user sends to the InputDevice itself. Hence InputDevice / PointerDevice may require methods like doRumble(int ms, float strength/frequency) .. or something?

(In reply to comment #4)
> .. can you elaborate on these?
> Best if you can add a reference/link to documentation.

https://en.wikipedia.org/wiki/File:Joyopis.svg

The main stick [1] can generate axis events. The green component [7] is a POV hat. Vector3 events concern natural input interfaces; the only examples that come to my mind are the Wiimote and the Kinect. My main source of inspiration is the OIS joystick listener: https://github.com/wgois/Object-oriented-Input-System--OIS-/blob/master/includes/OISJoyStick.h

(In reply to comment #5)
> I assume these are not input events, but triggers the user sends to the
> InputDevice itself. Hence InputDevice / PointerDevice may require methods
> like doRumble(int ms, float strength/frequency) .. or something?

You're right, haptic effects should be treated in the device itself.
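As a rough illustration of the two ideas above (a device describing its "components", and haptic effects as triggers sent to the device rather than input events), here is a minimal sketch. Every name in it (InputDevice, Component, doRumble(), ...) is a placeholder taken from this discussion, not existing NEWT API:

```java
/*
 * Hypothetical sketch only: neither InputDevice, Component nor doRumble()
 * exist in NEWT; the names follow the proposals in this thread.
 */
public interface InputDevice {

    /** Kinds of "components" a device can expose, as listed above. */
    enum ComponentType { BUTTON, AXIS, SLIDER, POV_HAT, VECTOR3 }

    /** A single button, axis, slider, POV hat or vector3 of this device. */
    interface Component {
        ComponentType getType();
        String getName();
        /** true if the reported value is a delta, false if it is absolute. */
        boolean isRelative();
    }

    /** Stable identifier so an event can name the device that produced it. */
    int getDeviceId();

    /** The components this device exposes (very important for gamepads). */
    java.util.List<Component> getComponents();

    /** Haptic effects are triggers sent *to* the device, not input events. */
    boolean isRumblerSupported();

    void doRumble(int durationMillis, float strength);
}
```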
(In reply to comment #0)
> Currently PointerEvent is still called MouseEvent, rename it!

On one hand, what will we do with the interfaces dealing with MouseEvent instances? MouseListener handles the callbacks mainly useful for mice; should we create another listener for the callbacks used by gamepads? On the other hand, MouseEvent.getButton() returns a short, which isn't enough for gamepads: the object representing the device should have a method that takes a short as input and provides some information about the button as output. Maybe some other kinds of "components" (axes, POV hats, ...) will require some complementary information.

Other kinds of devices have to be considered (as is already the case in JInput):

- fingerstick
- headtracker
- rudder
- stick (traditional joystick, like the one designed for the Apple 2E)
- trackball
- trackpad (?)
- wheel (a steering wheel, not a mouse wheel, see net.java.games.input.Controller)

Should the port type (USB, parallel, ...) be exposed in the API? The port number will have to be exposed. The device should indicate which effects are supported (rumbler(s)?). Look at net.java.games.input.Component to get an idea of which kind of information has to be returned to identify a "component" of a gamepad.

mousePressed and mouseReleased should be renamed; why not call them buttonPressed and buttonReleased? Instead of creating tons of classes to handle events, PointerEvent can fit most of our needs, except when adding or removing a "pointer device". PointerDevice can contain a mapping between short button codes and real "components". The value of a component can be relative or absolute. If every move can be managed as an axis move, PointerEvent.pointerMoved() is enough for point-of-view controls, sliders and axes. getTranslation() or getMove() would be enough to indicate the moves along 3 axes. A PointerEvent must contain some information about the concerned device, not just the pointer type; the pointer identifier (Pointer-ID) should allow retrieving some complementary information. PointerEvent.PovDirection could represent the 9 possible directions of a point-of-view control.

By default, multiple mice are seen as a single one by the operating system and the data are expressed in a resolution-dependent way. How should we allow the developer to switch to the resolution-independent raw data? Should each mouse be exposed as 2 pointer devices?

Actually, pointerPressed and pointerReleased might be better than buttonPressed and buttonReleased.

At first, maybe getMove() should be replaced by getD, or getDX + getDY + getDZ, to avoid any confusion: those names are already used in some other APIs and they clearly indicate that the data are relative (D -> delta). The mouse data of the poller should be considered relative only when the cursor is invisible (as is done in some APIs based on JInput). The poller should store the previously fetched data; the bookkeeping goes as follows (and is sketched in code after this list):

- visible cursor:
  - The new delta is the difference between the current absolute coordinates and the previous unclipped absolute coordinates.
  - The new unclipped absolute coordinates and the new (potentially clipped) absolute coordinates are equal to the current absolute coordinates.
- invisible cursor:
  - The new delta is the sum of the current relative coordinates and the previous delta.
  - The new unclipped absolute coordinates and the new (potentially clipped) absolute coordinates are equal to the sum of the current relative coordinates and themselves; the absolute clipped coordinates have to be clipped later. We have to take care of overflow too.
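Here is a hedged sketch of that bookkeeping; the class and field names are invented for illustration, this is not existing NEWT code:

```java
/*
 * Sketch of the poller state described above. Assumes integer screen
 * coordinates; all names are hypothetical.
 */
public final class PointerState {

    private int deltaX, deltaY;           // last computed delta
    private int unclippedX, unclippedY;   // absolute, never clipped
    private int clippedX, clippedY;       // absolute, clipped to the screen

    /** Visible cursor: the native API reports absolute coordinates. */
    public void updateAbsolute(int absX, int absY, int width, int height) {
        // New delta = current absolute - previous unclipped absolute.
        deltaX = absX - unclippedX;
        deltaY = absY - unclippedY;
        // Both absolute values become the current absolute coordinates.
        unclippedX = absX;
        unclippedY = absY;
        clippedX = clamp(absX, width);
        clippedY = clamp(absY, height);
    }

    /** Invisible cursor: the native "raw" API reports relative motion. */
    public void updateRelative(int relX, int relY, int width, int height) {
        // New delta = current relative + previous delta;
        // Math.addExact guards against the overflow mentioned above.
        deltaX = Math.addExact(relX, deltaX);
        deltaY = Math.addExact(relY, deltaY);
        // Unclipped absolute accumulates the relative motion.
        unclippedX = Math.addExact(unclippedX, relX);
        unclippedY = Math.addExact(unclippedY, relY);
        // The clipped coordinates are clipped afterwards.
        clippedX = clamp(unclippedX, width);
        clippedY = clamp(unclippedY, height);
    }

    private static int clamp(int v, int max) {
        return Math.max(0, Math.min(v, max - 1));
    }
}
```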
Then the mouse (PointerDevice) can expose both its relative data and its absolute data without appearing twice in the list of devices. Should the pollers be exposed in the public API too? N.B.: the absolute data depend on the display, monitor or screen; another conversion might be necessary.

This API could be a nice source of inspiration for manipulating USB HIDs: https://github.com/nyholku/purejavahidapi

We'll have to decide later whether to use udev (part of systemd) or evdev (inside the GNU Linux kernel). The Raw Input API will be used under Windows and probably IOKit under Mac OS X; it was already the case in JInput.

Actually, I think that I should use libinput for this purpose: it has a udev and a path backend, it works both with Wayland and X11, and it provides:

- Keyboard events
- Pointer events
- Touch events
- Gesture events
- Tablet events
- Tablet pad events
- Switch events

A risk of using several different libraries at the same time to manage input events is providing inconsistent data. I think it's a safer bet to use one and the same library both for the mouse and keyboard events that we have supported in NEWT from the very beginning and for the other events. libinput provides relative pointer motion events with libinput_event_pointer_get_dx_unaccelerated() and libinput_event_pointer_get_dy_unaccelerated(), which helps to get the absolute and the relative pointer motion events from the same API in the software stack that manages the input events. If I succeed in implementing a NEWT backend using it, it will allow providing relative pointer motion events to software in which absolute pointer motion events don't really make sense, typically fullscreen games with no pointer icon, mostly first-person shooters. In my humble opinion, computing absolute coordinates from relative coordinates is libinput's role; using it to avoid having to do that ourselves seems to be a good idea. libinput supports both X11 and Wayland, but I have no system to test the latter yet.

I haven't found a way of obtaining any information about gamepads through libinput, whereas a quick test with udevadm showed me that udev sees my gamepad as expected. If I can't use libinput to manage gamepads, I'll have to use either evdev (with libevdev or directly with ioctl like JInput) and/or udev, as I initially planned.

I'll try this: https://github.com/git-moss/purejavahidapi
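To tie the naming proposals in this thread together, one more hedged sketch of what the enhanced event could expose. The interface and its methods are hypothetical (only PovDirection, getDX/getDY/getDZ and the Pointer-ID idea come from this discussion); the relative values would be fed by a backend such as libinput's unaccelerated dx/dy getters:

```java
/*
 * Hypothetical sketch combining the proposals in this thread; these
 * methods do not exist in NEWT.
 */
public interface EnhancedPointerEvent {

    /** The 9 possible directions of a point-of-view control. */
    enum PovDirection {
        CENTERED, NORTH, NORTH_EAST, EAST, SOUTH_EAST,
        SOUTH, SOUTH_WEST, WEST, NORTH_WEST
    }

    /** Identifier used to retrieve the device that produced the event. */
    int getDeviceId();

    /** Absolute, resolution-dependent coordinates (clipped). */
    int getX();
    int getY();

    /** Relative, resolution-independent motion: D stands for delta. */
    float getDX();
    float getDY();
    float getDZ();

    /** Direction of the POV hat, if the source component is one. */
    PovDirection getPovDirection();
}
```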