Currently PointerEvent is still called MouseEvent, rename it!
PointerEvent currently lacks the following; enhance it to support 'joystick' controllers:
- Add PointerTypes:
- JoystickA(PointerClass.Offscreen) (analogue)
- JoystickD(PointerClass.Offscreen) (digital)
- JoystickD == JoyPad ?
- Add 3rd axis for abs coord, z.
- Rel. movement:
'float moveXYZ' and 'float moveScale'
similar to rotate* !
Abs. position may be calculated from the current position
plus moveScale * moveXYZ.
Resulting position may exceed window coordinates.
Requires a device context to keep track of position,
i.e. PointerDevice (see below).
Hence moveScale is optional.
- Send event to focused window only or
consider auto-assignment - see Bug 813
- PointerDevice extends InputDevice:
- Pool of static PointerDevice instances, accessible by the user to
query (position) and configure (origin, ..).
- Unique by its Pointer-ID (as already used in PointerEvent)
- Keep track of absolute position (see above).
- Used to fire 'added' and 'removed' events.
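A rough sketch of what such a device pool could look like; all names here (PointerDevice, getDevice, move, moveScale) are proposals for discussion, not existing API:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical sketch of the proposed PointerDevice pool; names are suggestions only.
public class PointerDevice {
    private static final Map<Integer, PointerDevice> POOL = new ConcurrentHashMap<>();

    private final int pointerId;     // unique Pointer-ID, as already used in PointerEvent
    private float x, y, z;           // tracked absolute position
    private float moveScale = 1.0f;  // optional scale applied to relative moves

    private PointerDevice(final int pointerId) { this.pointerId = pointerId; }

    /** Returns the unique device for the given Pointer-ID, creating it on first use. */
    public static PointerDevice getDevice(final int pointerId) {
        return POOL.computeIfAbsent(pointerId, PointerDevice::new);
    }

    public void setMoveScale(final float s) { moveScale = s; }

    /** Applies a relative move: absolute position += moveScale * moveXYZ. */
    public void move(final float dx, final float dy, final float dz) {
        x += moveScale * dx;
        y += moveScale * dy;
        z += moveScale * dz;  // resulting position may exceed window coordinates
    }

    public float getX() { return x; }
    public float getY() { return y; }
    public float getZ() { return z; }
    public int getPointerId() { return pointerId; }
}
```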
We have to handle the following kind of events:
- button events (pressed, released) [digital]
- axis events (float) [analog]
- sliding axis events (float, like 2 axes: abscissa, ordinate) [analog]
- hat switch or POV (point-of-view) hat events (9 directions) [analog]
- vector3 events (float, like 3 axes: abscissa, ordinate, applicate) [analog]
- plug in/out events (when a gamepad/mouse/keyboard is added or removed)
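To illustrate, the event kinds listed above could be classified with something like the following; this enum and its names are only a proposal, not existing API:

```java
// Hypothetical classification of the input-event kinds listed above.
public enum InputEventKind {
    BUTTON_PRESSED,   // digital
    BUTTON_RELEASED,  // digital
    AXIS,             // single float value                [analog]
    SLIDING_AXIS,     // two floats: abscissa, ordinate    [analog]
    POV_HAT,          // one of 9 discrete directions
    VECTOR3,          // three floats: abscissa, ordinate, applicate [analog]
    DEVICE_ADDED,     // plug-in  (gamepad/mouse/keyboard added)
    DEVICE_REMOVED    // plug-out (gamepad/mouse/keyboard removed)
}
```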
We have to handle some effects:
- force feedback
We have to represent all these devices:
For example, a mouse event should indicate which mouse has just generated it; nothing forces the end user to plug in a single mouse and a single keyboard, even though that is the general case.
An input device must describe its "components", which is very important for gamepads. The developer should be able to listen to a specific device. Listening to all keyboards at once is already possible with the current key listeners.
Generating "mouse" events and moving the mouse pointer with a gamepad or any joystick can be useful when there is no mouse, e.g. on video-game machines, and in assistive technology, for people who can no longer use a mouse because of muscular weakness.
A mouse can be seen as an input device with 2 axes (X & Y), one or more buttons and possibly a wheel. The problem is that some native APIs only provide relative data about the mouse movements and the wheel. These "raw" APIs are useful to get more accurate, resolution-independent data, especially in games: they allow decreasing the screen resolution without decreasing the accuracy of the mouse. If we use these APIs, we have to compute the absolute position from the relative movement.
(In reply to comment #1)
> - axis events (float) [analog]
> - sliding axis events (float, like 2 axes: abscissa, ordinate) [analog]
> - hat switch or POV (point-of-view) hat events (9 directions) [analog]
> - vector3 events (float, like 3 axes: abscissa, ordinate, applicate) [analog]
.. can you elaborate on these ?
Best if you can add a reference/link to documentation.
(In reply to comment #1)
> We have to handle some effects:
> - rumbler
> - force feedback
I assume these are not input events, but triggers the user
sends to the InputDevice itself.
Hence InputDevice / PointerDevice may require methods like:
doRumble(int ms, float strength/frequency)
or something ?
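Something along these lines, purely as a proposal; the interface, method names and the no-op implementation are made up for illustration:

```java
// Hypothetical haptic-effect methods on the device itself; names are proposals only.
public interface HapticDevice {
    /** True if the device supports the rumbler / force-feedback effect. */
    boolean isRumbleSupported();

    /**
     * Triggers a rumble effect.
     * @param ms       duration in milliseconds
     * @param strength normalized strength in [0, 1]
     */
    void doRumble(int ms, float strength);
}

/** Trivial recording implementation, used only to illustrate the contract. */
class NoopHapticDevice implements HapticDevice {
    private int lastMs = -1;

    @Override public boolean isRumbleSupported() { return false; }

    @Override public void doRumble(final int ms, final float strength) {
        lastMs = ms;  // a real device would forward this to the native effect API
    }

    public int getLastMs() { return lastMs; }
}
```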
(In reply to comment #4)
> (In reply to comment #1)
> > - axis events (float) [analog]
> > - sliding axis events (float, like 2 axes: abscissa, ordinate) [analog]
> > - hat switch or POV (point-of-view) hat events (9 directions) [analog]
> > - vector3 events (float, like 3 axes: abscissa, ordinate, applicate) [analog]
> .. can you elaborate on these ?
> Best if you can add a reference/link to documentation.
The main stick can generate axis events.
The green component is a POV hat.
Vector3 events concern natural input interfaces; the only examples that come to mind are the Wiimote and the Kinect.
My main source of inspiration is the OOIS Joystick listener:
(In reply to comment #5)
> (In reply to comment #1)
> > We have to handle some effects:
> > - rumbler
> > - force feedback
> I assume these are not input events, but triggers the user
> sends to the InputDevice itself.
> Hence InputDevice / PointerDevice may require methods like:
> doRumble(int ms, float strength/frequency)
> or something ?
You're right, haptic effects should be handled by the device itself.
(In reply to comment #0)
> Currently PointerEvent is still called MouseEvent, rename it!
On one hand, what will we do with the interfaces dealing with MouseEvent instances? MouseListener handles the callbacks mainly useful for mice. Should we create another listener for the callbacks used by gamepads?
On the other hand, MouseEvent.getButton() returns a short, which isn't enough for gamepads; the object representing the device should have a method that takes a short as input and provides some information about the button as output. Maybe some other kinds of "components" (axes, POV hats, ...) will require some complementary information.
Other kinds of devices have to be considered (like it is already the case in JInput):
- stick (traditional joystick, like the one designed for the Apple IIe)
- trackpad (?)
- wheel (a steering wheel, not a mouse wheel, see net.java.games.input.Controller)
Should the port type (USB, parallel, ...) be exposed in the API? The port number will have to be exposed. The device should indicate which effects are supported (rumbler(s)?).
Look at net.java.games.input.Component to get an idea of which kinds of information have to be returned to identify a "component" of a gamepad.
mousePressed and mouseReleased should be renamed; why not call them buttonPressed and buttonReleased?
Instead of creating tons of classes to handle events, PointerEvent can cover most of our needs, except when adding or removing a "pointer device".
PointerDevice can contain a mapping between short button codes and real "components". The value of a component can be relative or absolute.
If every new move can be managed as an axis move, PointerEvent.pointerMoved() is enough for point-of-view controls, sliders and axes. getTranslation() or getMove() would be enough to indicate the moves along 3 axes. A PointerEvent must contain some information about the device concerned, not just the pointer type; the pointer identifier (Pointer-ID) should allow retrieving some complementary information. PointerEvent.PovDirection could represent the 9 possible directions of a point-of-view control.
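As an illustration of the PovDirection idea, a possible sketch; the enum, its ordering and the degree mapping are assumptions for discussion, not existing API:

```java
// Hypothetical enum for the 9 possible directions of a point-of-view control.
public enum PovDirection {
    CENTERED,
    NORTH, NORTH_EAST, EAST, SOUTH_EAST,
    SOUTH, SOUTH_WEST, WEST, NORTH_WEST;

    /**
     * Maps a hat angle in degrees (0 = north, clockwise) to the nearest
     * 45-degree sector; a negative value means the hat is centered.
     */
    public static PovDirection fromDegrees(final int degrees) {
        if (degrees < 0) {
            return CENTERED;
        }
        // +22 rounds to the nearest sector; %8 wraps 337..359 back to NORTH
        final int idx = ((degrees % 360 + 22) / 45) % 8 + 1;
        return values()[idx];
    }
}
```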
By default, multiple mice are seen as a single one by the operating system, and the data are expressed in a resolution-dependent way. How should we allow the developer to switch to the resolution-independent raw data? Should each mouse be exposed as 2 pointer devices?
Actually, pointerPressed and pointerReleased might be better than buttonPressed and buttonReleased.
At first, maybe getMove() should be replaced by getD or getDX + getDY + getDZ to avoid any confusion: those names are already used in some other APIs and they clearly indicate that the data are relative (D -> delta).
The mouse data of the poller should be considered as relative only when the cursor is invisible (as is done in some APIs based on JInput). The poller should store the previously fetched data.
- visible cursor:
- The new delta is the difference between the current absolute coordinates and the previous unclipped absolute coordinates.
- The new unclipped absolute coordinates and the new (potentially clipped) absolute coordinates are equal to the current absolute coordinates.
- invisible cursor:
- The new delta is the sum of the current relative coordinates and the previous delta.
- The new unclipped absolute coordinates and the new (potentially clipped) absolute coordinates are obtained by adding the current relative coordinates to their previous values.
The clipped absolute coordinates have to be clipped later. We also have to take care of overflow.
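The bookkeeping above could be sketched per axis like this; all class and method names are illustrative, and clipping against the actual window bounds would happen later:

```java
// Sketch of the poller's per-axis state (one axis shown); names are illustrative.
public class MousePollerState {
    private float delta;         // relative movement reported to listeners
    private float unclippedAbs;  // absolute position, may exceed window bounds
    private float clippedAbs;    // absolute position after clipping (done later)

    /** Visible cursor: the native API reports absolute coordinates. */
    public void onAbsoluteSample(final float currentAbs) {
        delta = currentAbs - unclippedAbs;  // difference to previous unclipped abs
        unclippedAbs = currentAbs;
        clippedAbs = currentAbs;            // potentially clipped later
    }

    /** Invisible cursor: the native API reports relative ("raw") coordinates. */
    public void onRelativeSample(final float currentRel) {
        delta += currentRel;         // sum of current relative data and previous delta
        unclippedAbs += currentRel;  // integrate into the absolute position
        clippedAbs += currentRel;
    }

    /** Clamps the clipped absolute coordinate into [min, max]. */
    public void clip(final float min, final float max) {
        clippedAbs = Math.max(min, Math.min(max, clippedAbs));
    }

    public float getDelta() { return delta; }
    public float getUnclippedAbs() { return unclippedAbs; }
    public float getClippedAbs() { return clippedAbs; }
}
```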
Then, the mouse (PointerDevice) can expose both its relative data and its absolute data without appearing twice in the list of devices.
Should the pollers be exposed in the public API too?
N.B: The absolute data depend on the display, monitor or screen. Another conversion might be necessary.
This API could be a nice source of inspiration to manipulate USB HID:
We'll have to decide later whether to use udev (part of systemd) or evdev (inside the Linux kernel).
Raw Input API will be used under Windows and probably IOKit under Mac OS X. It was already the case in JInput.