Plugin Processing

A plugin processes signals, be they samples that represent audio, real-time or automated parameter changes, or control events such as MIDI events.

To give the plugin an extensible mechanism for processing one or more of these (types of) data streams, the following architecture is proposed. The plugin negotiates with the host the number and type of input and output channels it requires (a hypothetical sketch of such a declaration follows the ChannelDirection enum below).

// Base interface for specific channel implementations
public interface IChannel
{
    string TypeName { get; } // identifies the type of channel
}

public interface IMidiChannel : IChannel
{
    int MidiChannel { get; } // zero-based
}

public interface IAudioChannel<T> : IChannel
{
    int SampleCount { get; } // number of valid samples in Buffer
    T[] Buffer { get; }      // sample data for the current processing cycle
}

[Flags]
public enum ChannelDirection
{
  None = 0x00,
  Input = 0x01,
  Output = 0x02,
  Uni = Input | Output,  // Input and Output in the same buffer
}
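
As an illustration of the channel negotiation mentioned above, a plugin could declare its requirements roughly as sketched below. The declaration types (ChannelRequest, IChannelRequirements) are hypothetical and not part of the proposal; they only show the kind of information the host would need.

// Hypothetical declaration types, only to illustrate the negotiation.
public class ChannelRequest
{
    public string TypeName;            // e.g. "Audio" or "Midi"
    public ChannelDirection Direction; // Input, Output or Uni
    public int Count;                  // number of channels of this type
}

public interface IChannelRequirements
{
    // called by the host before processing starts
    IEnumerable<ChannelRequest> GetChannelRequests();
}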

public enum ProcessingType
{
  None,
  Realtime,
  Offline
}

public interface IProcessContext
{
  ProcessingType ProcessingType { get; }

  // provide access to the channels
  IList<IChannel> GetByBusName(ChannelDirection dir, string busName);
  IList<IChannel> GetByChannelType(ChannelDirection dir, string typeName);

  // provide access to processing specific services
  IServiceProvider ServiceProvider { get; } // IProcessContext could alternatively derive from IServiceProvider

  // TBD: time/clock related service
}

public enum ProcessResult
{
    Success, // ok - done
    Sleep, // no need to call again until the data changes
    Retry // may be ignored by the host for realtime processing
}

public interface IRealtimeProcessor
{
    ProcessResult Process(IProcessContext context);
}

public interface IOfflineProcessor
{
    ProcessResult Process(IProcessContext context);
}

Both Realtime and Offline processing are supported. The plugin can have a separate processor for each case and/or simply look at IProcessContext.ProcessingType to perform Realtime- or Offline-specific operations. The host checks whether the plugin supports the IOfflineProcessor service and uses it for Offline processing; otherwise the IRealtimeProcessor is used.
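
A minimal sketch of that host-side selection, assuming the plugin exposes its processors through an IServiceProvider (how a plugin publishes its services is not specified above, so RunCycle and the GetService lookups are assumptions):

// Hypothetical host-side dispatch; the plugin is modeled as an
// IServiceProvider that can be queried for its processor interfaces.
public ProcessResult RunCycle(IServiceProvider plugin, IProcessContext context)
{
    if (context.ProcessingType == ProcessingType.Offline)
    {
        // prefer a dedicated offline processor when the plugin provides one
        var offline = (IOfflineProcessor)plugin.GetService(typeof(IOfflineProcessor));
        if (offline != null)
            return offline.Process(context);
    }

    // otherwise fall back to the realtime processor
    var realtime = (IRealtimeProcessor)plugin.GetService(typeof(IRealtimeProcessor));
    return realtime.Process(context);
}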

Plugins that can process data in place (audio only?), or are compatible with doing so, negotiate this in their profile. The host then manages the channels (buffers) in such a way that this is possible, supplying Uni-directional channels where input and output share the same buffer.
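
For illustration, a sketch of a plugin that processes a Uni audio channel in place; the "Audio" type name and the float sample type are assumptions, not part of the proposal:

// Hypothetical in-place plugin; assumes audio channels are negotiated
// as Uni and use float samples with TypeName "Audio".
public class InPlaceGain : IRealtimeProcessor
{
    public float Gain = 0.5f;

    public ProcessResult Process(IProcessContext context)
    {
        foreach (IChannel channel in context.GetByChannelType(ChannelDirection.Uni, "Audio"))
        {
            var audio = (IAudioChannel<float>)channel;
            float[] buffer = audio.Buffer;

            // input and output share the same buffer: write back in place
            for (int i = 0; i < audio.SampleCount; i++)
                buffer[i] *= Gain;
        }

        return ProcessResult.Success;
    }
}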

Control Events

MIDI and parameter events are accessible during the processing cycle.
This allows the plugin to process all information at once, in whatever order is convenient for the plugin.
All events are tagged with a zero-based sample index relative to the start of the current processing cycle (call); a consumption sketch follows the event interfaces below.

public interface IEvent
{
    int FrameOffset { get; } // zero-based sample offset into the current processing cycle
}

public interface IControlEvent : IEvent
{
    // Gesture grouping information ??
}

public interface IMidiShortEvent : IControlEvent
{
    byte[] Data { get; }     // raw message bytes
    byte Status { get; }     // status byte (message type + channel)
    byte Parameter1 { get; } // first data byte
    byte Parameter2 { get; } // second data byte
    int ToInt();             // the message packed into a single integer
}

public interface IParameterEvent : IControlEvent
{
    // parameter reference
    // new parameter value
    // smoothing curve and duration information
}
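
The proposal does not yet define how events are exposed on a channel, so the sketch below assumes a hypothetical IEventChannel (and an "Event" type name). It shows the sample-accurate pattern that FrameOffset enables: render audio in slices between successive event offsets, applying each event at its exact sample position.

// Hypothetical event channel; how events are attached to a channel
// is not defined by the proposal, so this shape is an assumption.
public interface IEventChannel : IChannel
{
    IList<IControlEvent> Events { get; } // assumed sorted by FrameOffset
}

public class EventAwareProcessor : IRealtimeProcessor
{
    public ProcessResult Process(IProcessContext context)
    {
        var events = (IEventChannel)context.GetByChannelType(ChannelDirection.Input, "Event")[0];
        var audio = (IAudioChannel<float>)context.GetByChannelType(ChannelDirection.Uni, "Audio")[0];

        int position = 0;
        foreach (IControlEvent controlEvent in events.Events)
        {
            // render up to the event, then apply it at its exact sample position
            RenderSlice(audio.Buffer, position, controlEvent.FrameOffset - position);
            ApplyEvent(controlEvent);
            position = controlEvent.FrameOffset;
        }

        // render the remainder of the cycle
        RenderSlice(audio.Buffer, position, audio.SampleCount - position);
        return ProcessResult.Success;
    }

    private void RenderSlice(float[] buffer, int offset, int count) { /* DSP goes here */ }
    private void ApplyEvent(IControlEvent controlEvent) { /* update internal parameters */ }
}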

TBD: Is there a need for different (derived) event definitions for events received and events sent? For example, when a plugin sends parameter change events, these could require extra or other information.

[Figure: Plugin Event Processing.PNG]

Each color represents a different event type. All event streams (channels) are delivered to the plugin at the same time through the IProcessContext. Both the Event Sources and the Event Targets are (managed by) the host. A plugin can process events in all possible permutations (a sketch of the aggregation case follows this list):
  • It can receive multiple events and aggregate them into fewer events.
  • It can receive one event and generate multiple events from it.
  • It can receive events and process them internally without sending any events out.
  • It can generate events (based on an algorithm, a timer, a user action or otherwise) without consuming any.
Or a combination of any of these.
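
As one example of the aggregation permutation, the sketch below collapses all incoming MIDI controller-change events of a cycle into the last value seen per controller, and forwards only those. It reuses the hypothetical IEventChannel from above and assumes the output channel's Events list is writable by the plugin; how a plugin emits events is not defined by the proposal.

public class ControllerAggregator : IRealtimeProcessor
{
    public ProcessResult Process(IProcessContext context)
    {
        var input = (IEventChannel)context.GetByChannelType(ChannelDirection.Input, "Event")[0];
        var output = (IEventChannel)context.GetByChannelType(ChannelDirection.Output, "Event")[0];

        // keep only the latest control-change event per controller number
        var latest = new Dictionary<byte, IMidiShortEvent>();
        foreach (IControlEvent controlEvent in input.Events)
        {
            var midi = controlEvent as IMidiShortEvent;
            if (midi != null && (midi.Status & 0xF0) == 0xB0) // 0xB0 = MIDI control change
                latest[midi.Parameter1] = midi;
        }

        foreach (IMidiShortEvent midi in latest.Values)
            output.Events.Add(midi);

        return ProcessResult.Success;
    }
}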
