Solitar's post prompted me to put this up. It's not aimed at answering his question, so I've put it in its own thread:
Assuming more and more of our work is going to end up being rendered via a PC, what would your dream interface be?
Right now we have mouse, trackball, keyboard, multiple screens, maybe tablets, boxes of assignable knobs, and more sophisticated controller sets like the HUI or Logic Control. As an intuitive interface that doesn't get in the way, none of these works for me.
My dream (for composing/programming/sequencing - not engineering) is a simple large touch-sensitive screen. Pull up your application and access functions by touching/dragging the switches, knobs, and sliders you see on the screen.
This way you wouldn't have to assign hardware knobs, and you wouldn't have to remember what switch #3 in the top-left row of your control surface does.
If you see a function, you reach out and change it, just as you do on a hardware surface.
Goodbye mouse, goodbye keyboard.
The OS would link screen position to app function, just as it does for the mouse now, so there would be no problem loading up a new plugin, sliding it over to the bottom right of your monitor, and playing with it straight away.
I just want to be clear that I don't think any engineer will EVER accept an interface that doesn't use long-throw faders, but I'm no engineer and my mixes tend to use sequenced automation rather than live moves, so I don't care about the 'feel' of the knobs or switches.
I know there are resolution and cost factors to overcome, but I can't think of a more intuitive interface for PCs in the future. Anyone got better alternatives?