diff --git a/README.md b/README.md
index e941c1d..1e8efbb 100644
--- a/README.md
+++ b/README.md
@@ -8,7 +8,7 @@ access as well as optionally the centralized collection, viewing and disk record
 
 The most common way to use LSL is to use one or more applications with integrated LSL functionality.
 
-* Take a look at the list of [supported devices](https://labstreaminglayer.readthedocs.io/en/latest/info/supported_devices.html)
+* Take a look at the list of [supported devices and tools](https://labstreaminglayer.readthedocs.io/en/latest/info/supported_devices.html)
   and follow the instructions to start streaming data from your device. If your
   device is not in the list then see the "Getting Help" section below.
 * Download [LabRecorder](https://github.com/labstreaminglayer/App-LabRecorder) from its
diff --git a/docs/info/supported_devices.rst b/docs/info/supported_devices.rst
index 0a60c68..12289c2 100644
--- a/docs/info/supported_devices.rst
+++ b/docs/info/supported_devices.rst
@@ -1,8 +1,9 @@
-Supported Devices
-#################
-The lab streaming layer was originally developed to facilitate human-subject experiments that involve multi-modal data acquisition, including both brain dynamics (primarily EEG), physiology (EOG, EMG, heart rate, respiration, skin conductance, etc.), as well as behavioral data (motion capture, eye tracking, touch interaction, facial expressions, etc.) and finally environmental and program state (for example, event markers).
+Supported Devices and Tools
+###########################
+
+**For device applications and tools hosted on GitHub, please make sure to read the respective repository's README and to check the release page for downloads.**
 
-For device support hosted on GitHub, make sure to read the README and check the release page for downloads.
+The lab streaming layer was originally developed to facilitate human-subject experiments that involve multi-modal data acquisition, including both brain dynamics (primarily EEG), physiology (EOG, EMG, heart rate, respiration, skin conductance, etc.), as well as behavioral data (motion capture, eye tracking, touch interaction, facial expressions, etc.) and finally environmental and program state (for example, event markers).
 
 Supported EEG Hardware
 **********************