This diagram (from Getting started with Unify: Overview) shows the basic signal routing in Unify:
For patches with no MIDI layers, incoming MIDI data (from your DAW and/or MIDI keyboard/controller) is always routed to ALL Instrument layers.
When there are MIDI layers, incoming MIDI data is always routed to ALL MIDI layers, and each Instrument layer can be set to receive MIDI either directly from the input stream, OR from the output of any selected MIDI layer.
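This top-level routing rule can be sketched as a small function. The names, event shape, and the use of `None` for the "direct from input" setting are all illustrative assumptions, not Unify's actual API:

```python
# Illustrative sketch of Unify's top-level MIDI routing rule (hypothetical
# names, not Unify code). Incoming MIDI always reaches every MIDI layer;
# each Instrument layer reads either the raw input stream or the output
# of one selected MIDI layer.

def route_midi(input_events, midi_layers, instrument_layers):
    # Every MIDI layer processes the full input stream.
    midi_outputs = {name: fx(input_events) for name, fx in midi_layers.items()}
    routed = {}
    for name, source in instrument_layers.items():
        if source is None:                  # receive directly from the input
            routed[name] = input_events
        else:                               # receive from a chosen MIDI layer
            routed[name] = midi_outputs[source]
    return routed

# One MIDI layer ("Arp") that transposes up an octave; "Pad" listens to the
# raw input, "Pluck" listens to the Arp layer's output. Events are (note, vel).
midi_layers = {"Arp": lambda evs: [(n + 12, v) for n, v in evs]}
instrument_layers = {"Pad": None, "Pluck": "Arp"}
result = route_midi([(60, 100)], midi_layers, instrument_layers)
# result["Pad"] == [(60, 100)]; result["Pluck"] == [(72, 100)]
```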
In Unify v1.0.x, MIDI data is not routed to audio-effect layers (either AUX or MASTER). This may change in a future version.
As the above diagram shows, MIDI data is passed in daisy-chain fashion along a chain of MIDI effects. The MIDI input stream is the MIDI input for the first MIDI effect, the output of the first effect is the input to the next, and so on, and the output of the last MIDI effect in the chain is the output of the entire MIDI layer.
This chaining can be useful when you need to modify the input MIDI stream before passing it on to the next effect, e.g. using an instance of MIDI Filter to interpret MIDI sustain-pedal (CC#64) events ahead of an instance of BlueARP, which doesn't understand those events itself.
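The daisy-chain described above amounts to folding the event stream through each effect in turn. The sketch below uses made-up stand-in functions (not Unify's real MIDI Filter or BlueARP) purely to show the chaining order:

```python
# Illustrative sketch of a MIDI-effect daisy chain (hypothetical functions,
# not Unify's actual API). Each "effect" takes a list of events and returns
# a new list. Events here are simple tuples: (kind, number, value).

def drop_sustain_pedal(events):
    """Stand-in for a MIDI Filter removing CC#64 (sustain-pedal) events."""
    return [e for e in events if not (e[0] == "cc" and e[1] == 64)]

def octave_up(events):
    """Stand-in for an effect transposing notes up one octave."""
    return [("note", n + 12, v) if kind == "note" else (kind, n, v)
            for (kind, n, v) in events]

def run_chain(effects, events):
    # The input stream feeds the first effect; each effect's output feeds
    # the next; the last effect's output is the whole layer's output.
    for fx in effects:
        events = fx(events)
    return events

stream = [("note", 60, 100), ("cc", 64, 127), ("note", 64, 90)]
out = run_chain([drop_sustain_pedal, octave_up], stream)
# out == [("note", 72, 100), ("note", 76, 90)]
```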
MIDI routing in Instrument layers is completely different from that in MIDI layers: The input MIDI stream is routed directly to EVERY plug-in in the layer, like this, and the layer does not have any MIDI output:
What this means is that not only the Instrument plug-in, but ALSO any insert effects on the layer, can be controlled via MIDI. This is particularly useful when using MIDI data-streams generated by MIDI generator plug-ins such as arpeggiators. Of course, only plug-ins that actually support MIDI control can take advantage of this feature. Nearly all instrument plug-ins support MIDI control (if only for playing notes), but quite a few audio effects do too. (Effect plug-ins by Polyverse/Infected Mushroom have particularly good MIDI control abilities.)
The above descriptions of MIDI routing in MIDI and Instrument layers don't tell the whole story. Both MIDI and Instrument layers have quite a bit of built-in MIDI-stream processing.
The following diagram shows both MIDI and audio routing in Unify's Instrument layers:
The box labeled MIDI filtering/processing on the left is represented in the Unify GUI by the “MIDI controls cluster” at the left-hand side of the layer, which consists of eight primary control rectangles plus two more (the MIDI activity light and the MIDI latch control) that appear only when they are relevant.
The MIDI Source box (top left corner) determines where the layer's MIDI data comes from. This routing choice happens before the MIDI data even enters the MIDI filtering/processing box.
The “MIDI filtering/processing” box in the diagram above is itself a sequence of five processing stages:
At the first stage, the MIDI channel number (1–16) of each incoming MIDI event is compared against the setting of the Input Channel control, and the event is only passed on to the next stage if it matches:
The second stage allows you to change the event's MIDI channel number. This can be useful if, say, you have multiple MIDI keyboards assigned to MIDI channels 4, 5, and 6, and you have set up channel-based filtering to route them to different layers, but you want to change the channel numbers back to 1 before further processing. You might do this because you're working with a plug-in that only responds on MIDI channel 1, or simply because you prefer not to set your individual plug-ins to specific MIDI channels. (For example, you might have loaded the layer from a layer preset defined with all plug-ins set to MIDI channel 1 for simplicity.) Stage 2 is controlled by the MIDI output channel control:
The third filtering stage is the basis for key- and velocity-splitting. By setting multiple layers to respond to non-overlapping pitch ranges, you can define key-splits. By setting two or more layers to respond to the same pitch range, but non-overlapping ranges of the MIDI note-velocity (1–127), you can define a velocity split, where the layer which plays depends on how hard you strike notes on the keyboard. Stage 3 is controlled by the four control boxes at the right-hand side of the MIDI-controls cluster:
The control box marked G8 in the image indicates the upper limit of pitch; the box below it marked C-2 is the lower limit. You can adjust these values in four ways:
The two control boxes on the right are similar, but define the upper (top-right) and lower (bottom-right) velocity limits. These can only be adjusted using the mouse (either by click-dragging or using the mouse-wheel).
MIDI pitch values for key-splits are defined prior to transposing, so you can specify splits based on physical keys on your keyboard. (Technically, the key-split ranges are defined based on the MIDI note-numbers output by your keyboard, which might themselves be transposed if you have activated any transpose capability built into the keyboard controller itself.)
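The first three stages described above (input-channel match, channel remap, key/velocity split) can be sketched together as one filter step. The function name, the event shape, and the use of `None` to mean "accept any channel" / "leave the channel unchanged" are illustrative assumptions, not Unify's actual implementation:

```python
# Illustrative sketch of filtering stages 1-3 (hypothetical names, not
# Unify code). An event is (channel, note, velocity); the function returns
# the (possibly channel-remapped) event, or None if it is filtered out.

def filter_event(event, in_channel, out_channel,
                 key_range=(0, 127), vel_range=(1, 127)):
    channel, note, velocity = event
    # Stage 1: pass only events on the selected input channel
    # (None here models an "accept any channel" setting).
    if in_channel is not None and channel != in_channel:
        return None
    # Stage 2: optionally rewrite the event's channel number.
    if out_channel is not None:
        channel = out_channel
    # Stage 3: key-split and velocity-split ranges (inclusive limits).
    if not (key_range[0] <= note <= key_range[1]):
        return None
    if not (vel_range[0] <= velocity <= vel_range[1]):
        return None
    return (channel, note, velocity)

# A layer listening on channel 4, remapping to 1, upper half of the keyboard:
print(filter_event((4, 72, 90), in_channel=4, out_channel=1, key_range=(60, 127)))
# (1, 72, 90)
print(filter_event((4, 48, 90), in_channel=4, out_channel=1, key_range=(60, 127)))
# None -- note 48 is below the key-split range
```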
The fourth processing stage allows you to transpose the pitch at which the layer sounds by adding an offset (in semitones, which may be negative) to the note-numbers for each MIDI note-on or note-off event. The offset is the sum of the layer transpose offset and the global transpose offset.
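The stage-4 arithmetic is simply the sum of the two offsets added to each note number. A minimal sketch (the function name is hypothetical, and the clamping to MIDI's 0–127 note range is an assumption for safety, not confirmed Unify behavior):

```python
def transpose_note(note_number, layer_offset=0, global_offset=0):
    """Stage 4 sketch: the total offset applied to each note-on/note-off
    is the sum of the layer transpose offset and the global transpose
    offset (either may be negative). Clamping to 0-127 is an assumption."""
    shifted = note_number + layer_offset + global_offset
    return max(0, min(127, shifted))

# Middle C (60), layer set an octave down, global transpose up a whole tone:
print(transpose_note(60, layer_offset=-12, global_offset=2))  # 50
```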
The layer transpose offset affects only the given layer, and is set by the layer transpose control at the bottom-left corner of the MIDI-controls cluster:
The global transpose offset affects ALL instrument and MIDI layers, and hence allows you to transpose the entire composite Unify “instrument” up and down. The global transpose offset is set by the global transpose box at the bottom-right corner of the Show MIDI view which appears in the Footer of the Unify GUI window when you click on the MIDI icon in the Icon Strip below:
Both the global and layer-transpose boxes can be adjusted in the same three ways:
Finally, the fifth stage of layer MIDI processing allows you to apply an arbitrary response curve to the velocity values associated with each note-on event. Clicking on the MIDI Velocity Graph control at the right-hand side of the MIDI-controls cluster pops up a curve editor window like this:
The graph itself represents incoming MIDI note-velocity values along the horizontal axis, with lowest (quietest) velocities on the left and highest (loudest) velocities on the right. The vertical axis represents outgoing (processed) velocity values, lowest at the bottom and highest at the top. The default velocity curve is a straight line: the so-called “identity mapping”, where each incoming velocity value is mapped to the identical value (no change at all).
In the graph control itself, you can:
For velocity curves, you will rarely need to create split points. You will usually only need to adjust the curvature (slightly – a little change goes a long way). Dragging upward to create a convex curve will make the layer more responsive to velocity, and can be useful when you are working with a weighted-key MIDI controller. Dragging downward to create a concave curve will make the layer less responsive to velocity, and can be useful when playing a very lightweight synth-action keyboard.
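One common way to model this kind of upward- or downward-bowed response is a power function applied to the normalized velocity. This is only an illustration of the idea described above, not Unify's actual curve model:

```python
def apply_velocity_curve(velocity, exponent=1.0):
    """Map an incoming note-on velocity (1-127) through a power curve.
    exponent < 1 bows the curve upward (the "more responsive" shape,
    boosting low velocities); exponent > 1 bows it downward ("less
    responsive"); exponent == 1 is the default identity straight line."""
    normalized = velocity / 127.0
    out = round(127 * normalized ** exponent)
    return max(1, min(127, out))  # keep note-ons audible and in range

print(apply_velocity_curve(64, exponent=1.0))  # 64 (identity: unchanged)
print(apply_velocity_curve(64, exponent=0.5))  # boosted above 64
print(apply_velocity_curve(64, exponent=2.0))  # reduced below 64
```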