SpecInputButtonCustomisation


Launchpad Entry: https://blueprints.launchpad.net/inkscape/+spec/mouse-button-customisation

Summary

Mouse buttons and other input buttons (tablet buttons, for instance) should be remappable. The assignable actions should include their existing functions (primary action, "push" page and context menu) plus a secondary action (what you currently get with a shift-click), a tertiary action (control-click), possibly other sub-actions currently accessed via keyboard modifiers, and zoom while the button is held. No doubt others will think of further possible mappings.
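
As a rough illustration of the idea (purely a hypothetical sketch -- none of these names exist in Inkscape's code base), the assignable actions could be modelled as a small enumeration, with a user's preference stored as a mapping from physical button number to logical action:

  // Hypothetical sketch only; not existing Inkscape code.
  #include <map>

  enum class ButtonAction {
      Primary,        // current left click
      Secondary,      // current shift + primary
      Tertiary,       // current control + primary
      PushPage,       // current middle-button "push"
      ZoomWhileHeld,  // proposed: zoom while the button is held
      ContextMenu     // current right click
  };

  // Current Inkscape behaviour expressed as a mapping (button number -> action).
  const std::map<int, ButtonAction> default_mapping = {
      {1, ButtonAction::Primary},
      {2, ButtonAction::PushPage},
      {3, ButtonAction::ContextMenu},
  };

  // The Xara-style setup described in the Rationale: the right button performs the secondary action.
  const std::map<int, ButtonAction> xara_style_mapping = {
      {1, ButtonAction::Primary},
      {2, ButtonAction::PushPage},
      {3, ButtonAction::Secondary},
  };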

Rationale

A preamble to set the scene:

I'm a long-time Xara user. Before that, I was a user of ArtWorks. ArtWorks was (and is) Xara's predecessor, and runs on RISC OS. RISC OS machines have always had three-button mice, the left button called "Select", the middle called "Menu" and the right called "Adjust".

Select was the standard clicking button -- it would select things or perform actions, exactly as a left click generally works in other OSs.

Menu would always bring up the application menu, instead of applications having menu bars across the top.

Adjust performed a similar action to Select, but in a different way. Sometimes this would be the opposite (Adjust on a scroll button would scroll in the opposite direction, for instance).

In ArtWorks, for example, Adjust adds to or removes from the current selection when the selection tool is active, resizes in a different way when resize handles are dragged, and so on. In short, it performs the functions that Xara and Inkscape would perform with a shift-click.

The Xara developers thoughtfully included an option to remap the mouse buttons. To be precise, the left and right mouse buttons can each be remapped to any of: normal click, shift-click, alt-click, control-click, context menu, toggle full screen, zoom or push. I can't see why toggle full screen is useful, but I can see use cases for the rest of them.

Setting the right click to shift-click therefore makes controlling Xara much more similar to controlling ArtWorks. This is a setting I jumped on right away when starting to use Xara, since I was frustrated that right-clicking things didn't add them to the selection, that right-clicking a colour didn't set it as the outline colour, and so on -- this was behaviour I was very used to.

Now, onwards...

I'm one user who hates context menus. I'll explain.

To perform an action with a context menu you must first click to get the menu, then read the items and find the one you want (since they'll often move around or disappear based on context), then aim at it and finally click again. This is incredibly inefficient.

Using a keyboard shortcut requires much less cognitive activity, since you have the tactile feedback of the keys and muscle memory putting your fingers in the right place subconsciously; not to mention that you don't need to look at all once you get good. For the most common tasks a single click is better still: for instance the existing push operation (dragging with the middle button), using a tool, or setting a colour.

Context menus are obviously useful for newer users (it would not be immediately obvious, for instance, that right-clicking a colour sets it as the outline colour), but in most cases they don't offer anything the menus across the top of the screen don't.

The only exceptions I've seen in Inkscape so far are the context menus of colours (though there are only two options -- these could easily be left and right click) and of the fill and stroke colour boxes at the bottom of the screen (in this case there could be a drop-down (or rather drop-up) menu button for these useful actions instead of relying on a context menu).

In short, it makes much more sense to me to use a mouse button for the second-most common tasks, which are currently accessed by shift-clicking, than for a context menu I almost never use, and whose actions can be performed more easily with keyboard commands or the main application menus.

I'm sure that many (most?) users accustomed to keyboard commands would find the same.

I realise and agree that it's good to have interface uniformity throughout an operating system, and that for Linux and Windows users it is therefore natural to get a menu when the right mouse button is pressed, but this isn't the case for all OSs (Macs only have one button).

So wouldn't it make sense to allow users to set up which action each button performs?

It would increase accessibility for people used to different actions (like me) and at the same time allow everyone to choose what suits their workflow best.

Design

I know that Jon Cruz is currently developing a module (http://codewideopen.blogspot.com/2008/03/tablet-test-area.html) which will ease setting up graphics tablets and the actions each of their buttons performs. I'm not clear on the details regarding mouse buttons, though, as his blog entry doesn't mention them. It sounds as if it would be the perfect input device setup module, though, if it could be extended to also allow the mouse buttons to be configured.
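
If that module were extended to cover mice as well, one simple way to hold the configuration (again a purely hypothetical sketch, reusing the invented ButtonAction type from the Summary and not a description of how Jon's module actually works) would be to key each button mapping by device name, so every mouse, pen and tablet pad could carry its own assignments:

  // Hypothetical per-device configuration; ButtonAction is the invented type sketched earlier.
  #include <map>
  #include <string>

  typedef std::map<int, ButtonAction> ButtonMap;             // button number -> logical action
  typedef std::map<std::string, ButtonMap> DeviceButtonMap;  // device name -> its button mapping

  DeviceButtonMap example_config()
  {
      DeviceButtonMap config;
      config["Core Pointer"][3] = ButtonAction::Secondary;    // right click adds to the selection
      config["Tablet Pad"][1]   = ButtonAction::PushPage;     // one tablet button pans the canvas
      config["Tablet Pad"][2]   = ButtonAction::ZoomWhileHeld; // another zooms while held
      return config;
  }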

Mappable actions

  • primary action (current left click)
  • secondary action (current shift-primary action)
  • tertiary action (control-primary action)
  • "push" page while held (current middle click)
  • zoom while held, centring on the mouse position
  • context menu (current right click)
  • possibly other sub-actions currently accessed via keyboard modifiers
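
As a rough sketch of how these might fit together (again hypothetical -- resolve_action() is invented here and reuses the ButtonAction mapping sketched under the Summary), a raw button press plus its modifier state could be resolved to one of the actions above before being handed to the active tool, with shift and control still selecting the sub-actions of whichever button carries the primary action:

  // Hypothetical dispatch sketch; not existing Inkscape code.
  ButtonAction resolve_action(const std::map<int, ButtonAction> &mapping,
                              int button, bool shift, bool ctrl)
  {
      auto it = mapping.find(button);
      if (it == mapping.end())
          return ButtonAction::Primary;   // unmapped buttons fall back to the primary action

      ButtonAction action = it->second;

      // Keyboard modifiers keep their current meaning on the primary button,
      // so shift-click and control-click still reach the secondary and tertiary actions.
      if (action == ButtonAction::Primary) {
          if (shift) return ButtonAction::Secondary;
          if (ctrl)  return ButtonAction::Tertiary;
      }
      return action;
  }

With the hypothetical xara_style_mapping from the Summary, resolve_action(xara_style_mapping, 3, false, false) would return the secondary action, matching the Xara behaviour described in the Rationale.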

TODO

Discussion