
...

To that end, everything about the GUI revolves around the ProcessorGraph, which is a subclass of Juce's AudioProcessorGraph. A list of available inputs, filters, and visualizers/outputs is presented in the ProcessorList, from which modules are dragged into the EditorViewport to construct the signal chain. The ProcessorGraph then takes care of the behind-the-scenes operations of connecting the individual modules and making sure they process data in the right order during acquisition. By structuring things around a flexible ProcessorGraph, the application makes few assumptions about what the user wants to do. Therefore, the GUI can be useful for things that don't involve electrophysiology at all: controlling trial structure for behavioral experiments, delivering open-loop optogenetic stimuli, or providing online feedback based on position data. Of course, the GUI's true power lies in its ability to easily condition stimuli based on neural events.

...

  • Solid black arrows denote ownership: which objects are responsible for creating and destroying other objects.

  • Dashed black arrows indicate the precisely timed callbacks that drive data acquisition.

  • Red arrows show how neural data flows through the application. All of the data processing takes place within the ProcessorGraph, but data may leave the graph to be stored to disk by the RecordNode, sent to the audio monitors by the AudioNode, or visualized within the DataViewport. Efforts have been made to keep everything outside of the ProcessorGraph as general as possible. Thus, extending the functionality of the GUI should only involve creating new processing modules, rather than modifying the rest of the source code.

  • Orange arrows denote the essential message-passing interactions. The messages sent between the EditorViewport and the ProcessorGraph are especially important, as they are responsible for constructing and verifying the signal chain.

...

When the application starts, the first object created is an OpenEphysApplication, which is derived from the JUCEApplication class. This object initializes the application and owns the MainWindow, but doesn't do much else. When the MainWindow is created, it creates the three central objects of the application: the UIComponent, the AudioComponent, and the ProcessorGraph.

  • The UIComponent owns all of the objects the user can interact with directly. Editing the ProcessorGraph, starting and stopping acquisition, and receiving updates about the application's state are all handled by objects created by the UIComponent. It is derived from Juce's Component class.

  • The AudioComponent communicates with the computer's audio card, which drives the callbacks to the ProcessorGraph and enables audio monitoring of neural signals. It is derived from Juce's AudioDeviceManager class.

  • The ProcessorGraph stores information about the signal chain and ensures that data is processed efficiently. It is derived from Juce's AudioProcessorGraph class.

User-interface classes

 

The UIComponent has five main regions, each of which is an object that it is responsible for creating and destroying:

  • The ControlPanel occupies the top of the application window. It contains a CPU usage meter, a disk space meter, a play button, and a record button. It also owns the Clock, which displays the total time spent acquiring or recording data.

  • The ProcessorList sits on the left-hand side of the application window and contains a list of all the available modules for creating and processing data. Modules are selected from the list and dragged onto the EditorViewport to construct the signal chain. The ProcessorList can be collapsed to provide additional room for data visualization once the signal chain is in place.

  • The EditorViewport, near the bottom of the application window, displays the editors for all of the modules currently in the signal chain. The editors provide a graphical interface for modifying the parameters of each module. Once they're inside the EditorViewport, editors can be dragged and dropped to change their ordering within a signal chain, but only if acquisition is paused.

  • The MessageCenter lives at the bottom-right of the application window and displays messages that are relevant to the user. Most of these messages currently originate in the ProcessorGraph, but they could come from almost any object within the program.

  • The DataViewport contains tabs with OpenGL visualizers. Whenever a new visualizer is added to the signal chain, it can either claim a tab within the DataViewport or place its display in a separate window. This affords the GUI incredible flexibility. In a setup with multiple monitors, each visualizer can occupy a separate window on its own screen. If the GUI is being used on a laptop, all of the visualizers are organized into easy-to-find tabs.

...

  • Sources feed data into the graph. They can only have outputs, never inputs. Examples of sources include the SourceNode, which communicates with external data sources, and the EventNode, which emits events at defined intervals. Every signal chain must have at least one source.

  • Filters modify the data they receive. They can be as simple as bandpass filters or resamplers, or as complex as spike detectors or ripple/spindle detectors. Filters can process continuous signals or discrete events.

  • Sinks take data and do something with it. They can only have inputs, never outputs (at least within the ProcessorGraph). Examples of sinks include LFP displays, spike displays, and network signaling nodes.

  • Utilities (not shown) allow signal chains to be combined or split. Robust utility classes are still under construction.

...