Interaction controller

From Buzztrax


Live interaction with sound generators and effects is an important goal for the future. We would like to allow configuring all sorts of HID devices to be used as controllers.

Control Sources

Open Sound Control

Open Sound Control is a communication protocol for network-connected controllers. One idea is to use a Nokia 770, N800 or N810 as a remote control. There are lightweight OSC implementations available which we could use. There are also efforts to make OSC devices discoverable and introspectable using Zeroconf.

We could just assign an OSC path to each parameter. One can then design a control application (or use e.g. khagan to build one) that provides a UI to change them remotely.
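To illustrate what sending a value to such a parameter path would involve, here is a minimal sketch of encoding an OSC 1.0 message in pure Python. The parameter path `/machine/1/cutoff` is a hypothetical example; the actual naming scheme is still open.

```python
import struct

def _pad(b: bytes) -> bytes:
    """Pad to a 4-byte boundary with NUL bytes (OSC 1.0 string alignment)."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(path: str, *args: float) -> bytes:
    """Encode an OSC message with float32 arguments.

    OSC 1.0 layout: NUL-padded address string, NUL-padded type-tag
    string (',' plus one 'f' per float), then big-endian float32 values.
    """
    msg = _pad(path.encode("ascii"))
    msg += _pad(("," + "f" * len(args)).encode("ascii"))
    for a in args:
        msg += struct.pack(">f", a)
    return msg

# Hypothetical parameter path for illustration only:
packet = osc_message("/machine/1/cutoff", 0.5)
```

The resulting byte string could be sent over UDP to the listening application; a real implementation would of course use an existing OSC library rather than hand-rolled packing.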

Determining HID Devices

Under Linux we can probably use HAL to find out about mice, keyboards, touchpads, joysticks, MIDI devices and LIRC receivers.

Tip - use hal-device-manager to explore:

  • midi:
    • linux.sysfs_path=/sys/class/sound/midi*
    • resmgr.class=sound

Getting events:

ALSA Sequencer client

http://www.alsa-project.org/~tiwai/alsa-subs.html


Plugins

The idea is to have externals that can print control-changes to stdout. The script/binary would need to support some option flags. Besides --help/--version it should have --format, upon which it prints something like this:

<control-name> <type> <min> <max> <def>
<control-name> ...
...
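The descriptor lines above could be parsed like this. This is a sketch that assumes whitespace-separated fields and type names such as "continuous" and "trigger"; neither is fixed yet.

```python
from typing import NamedTuple

class ControlSpec(NamedTuple):
    name: str
    type: str       # e.g. "continuous" or "trigger"; type names are assumptions
    min: float
    max: float
    default: float

def parse_format(text: str) -> list[ControlSpec]:
    """Parse one '<control-name> <type> <min> <max> <def>' line per control."""
    specs = []
    for line in text.splitlines():
        fields = line.split()
        if len(fields) != 5:
            continue  # skip blank or malformed lines
        name, ctype, lo, hi, dflt = fields
        specs.append(ControlSpec(name, ctype, float(lo), float(hi), float(dflt)))
    return specs

specs = parse_format("wheel continuous 0 127 0\nbutton1 trigger 0 1 0\n")
```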

Btic would read the available externals from a plugin dir (one external = one device) and check their capabilities (controls). When a control is used, it would spawn the process and read the control-changes from the external's stdout. When the last control is unbound, it terminates the process. The external would print all control values on one line, separated by spaces:

<control-value1> <control-value2> ...

Fractional values should use a locale-independent '.' as the decimal separator. It's up to the btic implementation to track previous values and to notify only changed values.
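The btic side of this protocol could look like the following sketch: spawn the external, read value lines from its stdout, and emit only the values that changed. The stand-in external at the bottom is purely for illustration; a real one would be a plugin binary from the btic plugin dir.

```python
import subprocess
import sys
from typing import Iterator

def control_changes(argv: list[str]) -> Iterator[tuple[int, float]]:
    """Spawn an external and yield (control-index, value) pairs only
    when a value differs from the previous line, as btic is expected
    to do. float() always uses '.' as the decimal separator, so the
    values parse regardless of locale."""
    proc = subprocess.Popen(argv, stdout=subprocess.PIPE, text=True)
    previous: list[float] = []
    try:
        for line in proc.stdout:
            values = [float(v) for v in line.split()]
            for i, v in enumerate(values):
                if i >= len(previous) or v != previous[i]:
                    yield (i, v)
            previous = values
    finally:
        proc.terminate()  # unbinding the last control kills the external
        proc.wait()

# Stand-in external printing two value lines (hypothetical):
fake = [sys.executable, "-c", "print('0.0 1.0'); print('0.5 1.0')"]
changes = list(control_changes(fake))
```

Note that only control 0 is reported for the second line, since control 1 kept its value.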

User Interface

We need a list of found devices/controllers. Once a device is selected, one sees some info and can choose to use it or not. Next we present a list of named controllers for the selected device. Some devices will have fixed/known controllers. For others maybe we can ship some controller maps. For not-yet-known devices we need a learn function, where the user tweaks the controls and then enters a name for the detected controller.
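One simple way to implement the learn step, assuming we receive a stream of (controller-id, value) events while the user wiggles a control, is to pick the controller whose value moved the most. The controller ids below are hypothetical.

```python
def detect_controller(events: list[tuple[str, float]]) -> str:
    """Return the id of the controller whose observed value range
    (max - min) is largest, i.e. the one the user is wiggling."""
    seen: dict[str, list[float]] = {}
    for cid, value in events:
        seen.setdefault(cid, []).append(value)
    return max(seen, key=lambda cid: max(seen[cid]) - min(seen[cid]))

# "cc.7" swings widely while "cc.1" only jitters:
events = [("cc.7", 0.1), ("cc.1", 0.0), ("cc.7", 0.9), ("cc.1", 0.02)]
learned = detect_controller(events)
```

A real implementation would also want a noise threshold so that jittery controls (e.g. an idle accelerometer) are not picked up by accident.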

If we use a controller app on the N800 we have the problem that the application needs to export the interfaces, and the controller device will find the control targets. For the other controllers we would use e.g. HAL to detect a list of available controllers.

Maybe we need to have controllers always available, and then it is up to the user to activate a controller and select the target.

There are also different strategies regarding Midi Controller Assignment.

Control targets

There are two kinds of control sources and targets: triggers and continuous parameters. Trigger sources are e.g. push buttons and can be used to play notes and trigger percussion.
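Binding a continuous source to a continuous target mostly means rescaling its value range. A minimal sketch, assuming a linear mapping (curve shapes would be a later refinement):

```python
def map_value(v: float, src_min: float, src_max: float,
              dst_min: float, dst_max: float) -> float:
    """Linearly rescale a continuous controller value from the
    source range into the target parameter's range."""
    pos = (v - src_min) / (src_max - src_min)
    return dst_min + pos * (dst_max - dst_min)

# e.g. a 7-bit MIDI controller (0..127) driving a 0.0..1.0 parameter:
cutoff = map_value(64, 0, 127, 0.0, 1.0)
```

Trigger sources need no such scaling; they map more naturally to note-on/percussion events.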

Song Transport

See playback controller for this.

  • play, stop, pause, record
  • navigation (timeline label as playlist).

This is a bit like the media-player interface in UPnP. From the controller (e.g. N800) we would like to select the control target (which application: e.g. Buzztrax, Rhythmbox, ...).

Mixing

  • track volumes and pans
  • mute, solo, bypass

Machine Parameters

  • notes and percussion triggers
  • sound parameters