From 02b999b5899037ef188aba070097ee6255ec6212 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Miguel=20Garc=C3=ADa=20Garc=C3=ADa?= Date: Fri, 25 Oct 2024 12:37:46 +0200 Subject: [PATCH 01/14] Add monocular settings --- neon/.vitepress/config.mts | 4 ++ neon/data-collection/data-format/index.md | 1 + neon/data-collection/gaze-mode/index.md | 56 +++++++++++++++++++++++ 3 files changed, 61 insertions(+) create mode 100644 neon/data-collection/gaze-mode/index.md diff --git a/neon/.vitepress/config.mts b/neon/.vitepress/config.mts index 1c9f38d17..73b660244 100644 --- a/neon/.vitepress/config.mts +++ b/neon/.vitepress/config.mts @@ -106,6 +106,10 @@ let theme_config_additions = { text: "Measuring the IED", link: "/data-collection/measuring-ied/", }, + { + text: "Monocular Gaze", + link: "/data-collection/gaze-mode/", + }, { text: "Scene Camera Exposure", link: "/data-collection/scene-camera-exposure/", diff --git a/neon/data-collection/data-format/index.md b/neon/data-collection/data-format/index.md index 7fe76fdfb..e1aceaf61 100644 --- a/neon/data-collection/data-format/index.md +++ b/neon/data-collection/data-format/index.md @@ -34,6 +34,7 @@ This file contains meta-information on the recording. | **start_time** | Timestamp of when the recording was started. Given as UTC timestamp in nanoseconds. | | **template_data** | Data regarding the selected template for the recording as well as the response values. | | **wearer_id** | Unique identifier of the wearer selected for this recording. | +| **gaze_mode** | Indicates whether binocular or monocular (right/ left) pipeline was used to infer gaze. | | **wearer_name** | Name of the wearer selected for this recording. | | **workspace_id** | The ID of the Pupil Cloud workspace this recording has been assigned to. 
| diff --git a/neon/data-collection/gaze-mode/index.md b/neon/data-collection/gaze-mode/index.md new file mode 100644 index 000000000..c183a799a --- /dev/null +++ b/neon/data-collection/gaze-mode/index.md @@ -0,0 +1,56 @@ +# Binocular vs. Monocular Gaze Mode + +Starting from version 2.8.33, the Neon Companion App allows you to select between using both eyes (binocular) or a single eye (monocular) images for outputing gaze positions. This flexibility can help optimize performance based on your specific requirements, especially in wearers with eye phorias (strabismus) or other eye conditions. + +## Modes + +- `Binocular` _(default)_: Utilizes images from both the right and left eyes to infer gaze position. This mode offers higher accuracy and robustness by leveraging information from both eyes. +- `Mono Right`: Uses only the right eye's image to infer gaze position. This mode may be useful in scenarios where one eye provides a clearer view or when reducing computational load is necessary. +- `Mono Left`: Uses only the left eye's image to infer gaze position. Similar to Mono Right, this mode is beneficial when focusing on a single eye enhances performance or reliability. + +::: warning +**Monocular gaze is less accurate and robust** since it relies on a single eye image. Use this mode only if binocular tracking is not feasible or if there's a specific need for single-eye tracking. +::: + +## Changing Gaze Modes + +To switch between gaze modes, follow these steps: + +1. From the home screen of the Neon Companion App, tap the gear icon located at the top-right corner to open **Companion Settings**. +2. Scroll down to the **NeonNet** section. +3. Choose your desired **Gaze Mode** (`Binocular`, `Mono Right`, or `Mono Left`). +4. After selecting the new gaze mode, **unplug and re-plug** the Neon device to apply the changes. 
+ +::: tip +After altering the gaze mode to monocular, it's recommended to perform a new [Offset Correction](/data-collection/offset-correction/) to improve accuracy. +::: + +## Other Considerations + +- Changing the gaze mode modifies the existing gaze stream. It does **not** create an additional stream. +- All downstream processes, including fixations and enrichments, will utilize this monocular gaze data. +- Eye State and Pupillometry remain unaffected by changes to the gaze mode and will output the data for each eye. + +## In Pupil Cloud: + +Pupil Cloud handles gaze data processing as follows: + +- **Default Behavior**: Pupil Cloud reprocesses recordings to maintain a consistent sampling rate of **200Hz**, regardless of the real-time sampling rate set in the app. + +- **Monocular Mode**: If a monocular gaze mode is selected, Pupil Cloud **will not** reprocess the recordings. Ensure that this aligns with your data analysis requirements. + +## Where Can I Find Which Mode Was Used on a Recording? + +On the recording's view in the Neon Companion App, you can tap on the three dots to visualize the metadata. + +Additionally, the [info.json](/data-collection/data-format/#info-json) file now includes a new field `gaze_mode`. + +--- + +### Best Practices / Additional Recommendations + +- **Testing**: After changing the gaze mode, perform tests to verify that the gaze tracking meets your accuracy and performance needs. + +- **Update your Team**: Keep your team informed about changes in gaze modes to ensure consistency in data collection and analysis. 
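When batch-processing downloaded recordings, the new `gaze_mode` field can be used to keep monocular and binocular data separate. A minimal Python sketch (the field name comes from this page; the exact strings stored in `info.json`, and the helper names, are illustrative assumptions):

```python
import json
from pathlib import Path

def gaze_mode_of(recording_dir):
    """Read the gaze_mode field from a recording's info.json.

    Returns None for recordings made before the field was introduced.
    """
    info = json.loads((Path(recording_dir) / "info.json").read_text())
    return info.get("gaze_mode")

def split_by_gaze_mode(recording_dirs):
    """Group recording folders by the gaze mode they were recorded with."""
    groups = {}
    for rec in recording_dirs:
        groups.setdefault(gaze_mode_of(rec), []).append(rec)
    return groups
```

Recordings made with app versions that predate this setting will simply not contain the field, so they end up in the `None` group rather than raising an error.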
+ +--- From 28f9bf6f3255db7ab7ff2a8d2d0d26c3b0f997a9 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Miguel=20Garc=C3=ADa=20Garc=C3=ADa?= Date: Fri, 25 Oct 2024 13:11:57 +0200 Subject: [PATCH 02/14] Corrections --- neon/data-collection/gaze-mode/index.md | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/neon/data-collection/gaze-mode/index.md b/neon/data-collection/gaze-mode/index.md index c183a799a..32d30bc87 100644 --- a/neon/data-collection/gaze-mode/index.md +++ b/neon/data-collection/gaze-mode/index.md @@ -1,12 +1,12 @@ # Binocular vs. Monocular Gaze Mode -Starting from version 2.8.33, the Neon Companion App allows you to select between using both eyes (binocular) or a single eye (monocular) images for outputing gaze positions. This flexibility can help optimize performance based on your specific requirements, especially in wearers with eye phorias (strabismus) or other eye conditions. +Starting from version 2.8.33, the Neon Companion App allows you to select between using both eyes (binocular) or a single eye (monocular) images for outputing gaze positions. This flexibiliy enables you to isolate gaze from a specific eye, e.g. when recording from participants/wearers with phorias, or other experimental paradgims that require monocular gaze. ## Modes - `Binocular` _(default)_: Utilizes images from both the right and left eyes to infer gaze position. This mode offers higher accuracy and robustness by leveraging information from both eyes. -- `Mono Right`: Uses only the right eye's image to infer gaze position. This mode may be useful in scenarios where one eye provides a clearer view or when reducing computational load is necessary. -- `Mono Left`: Uses only the left eye's image to infer gaze position. Similar to Mono Right, this mode is beneficial when focusing on a single eye enhances performance or reliability. +- `Mono Right`: Uses only the right eye's image to infer gaze position. 
This mode may be useful in scenarios where one eye can only be used. +- `Mono Left`: Uses only the left eye's image to infer gaze position. Similar to Mono Right but using the right eye. ::: warning **Monocular gaze is less accurate and robust** since it relies on a single eye image. Use this mode only if binocular tracking is not feasible or if there's a specific need for single-eye tracking. From a36d90d11b9b8be66257c4f807f8ed2b0ed45d89 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Miguel=20Garc=C3=ADa=20Garc=C3=ADa?= Date: Fri, 25 Oct 2024 13:13:54 +0200 Subject: [PATCH 03/14] left :facepalm: --- alpha-lab/imu-transformations/pl-imu-transformations | 1 + neon/data-collection/gaze-mode/index.md | 2 +- 2 files changed, 2 insertions(+), 1 deletion(-) create mode 160000 alpha-lab/imu-transformations/pl-imu-transformations diff --git a/alpha-lab/imu-transformations/pl-imu-transformations b/alpha-lab/imu-transformations/pl-imu-transformations new file mode 160000 index 000000000..542043d8a --- /dev/null +++ b/alpha-lab/imu-transformations/pl-imu-transformations @@ -0,0 +1 @@ +Subproject commit 542043d8a4500ae908af6abe9846266a76bd1eeb diff --git a/neon/data-collection/gaze-mode/index.md b/neon/data-collection/gaze-mode/index.md index 32d30bc87..1e5441fd6 100644 --- a/neon/data-collection/gaze-mode/index.md +++ b/neon/data-collection/gaze-mode/index.md @@ -6,7 +6,7 @@ Starting from version 2.8.33, the Neon Companion App allows you to select betwee - `Binocular` _(default)_: Utilizes images from both the right and left eyes to infer gaze position. This mode offers higher accuracy and robustness by leveraging information from both eyes. - `Mono Right`: Uses only the right eye's image to infer gaze position. This mode may be useful in scenarios where one eye can only be used. -- `Mono Left`: Uses only the left eye's image to infer gaze position. Similar to Mono Right but using the right eye. +- `Mono Left`: Uses only the left eye's image to infer gaze position. 
Similar to Mono Right but using the left eye. ::: warning **Monocular gaze is less accurate and robust** since it relies on a single eye image. Use this mode only if binocular tracking is not feasible or if there's a specific need for single-eye tracking. From 5ac9c5e62d12f0dfc1d7e26c0a12111a114a7b08 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Miguel=20Garc=C3=ADa=20Garc=C3=ADa?= Date: Tue, 5 Nov 2024 10:54:18 +0100 Subject: [PATCH 04/14] Update app version and jargon --- neon/data-collection/gaze-mode/index.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/neon/data-collection/gaze-mode/index.md b/neon/data-collection/gaze-mode/index.md index 1e5441fd6..0ff0484fc 100644 --- a/neon/data-collection/gaze-mode/index.md +++ b/neon/data-collection/gaze-mode/index.md @@ -1,12 +1,12 @@ # Binocular vs. Monocular Gaze Mode -Starting from version 2.8.33, the Neon Companion App allows you to select between using both eyes (binocular) or a single eye (monocular) images for outputing gaze positions. This flexibiliy enables you to isolate gaze from a specific eye, e.g. when recording from participants/wearers with phorias, or other experimental paradgims that require monocular gaze. +Starting from version 2.8.34-prod, the Neon Companion App allows you to select between using both eyes (binocular) or a single eye (monocular) images for outputting gaze positions. This option enables you to isolate gaze from a specific eye, e.g. when recording participants with strabismus, or other experimental paradigms that require monocular gaze. ## Modes - `Binocular` _(default)_: Utilizes images from both the right and left eyes to infer gaze position. This mode offers higher accuracy and robustness by leveraging information from both eyes. - `Mono Right`: Uses only the right eye's image to infer gaze position. This mode may be useful in scenarios where one eye can only be used. -- `Mono Left`: Uses only the left eye's image to infer gaze position. 
Similar to Mono Right but using the left eye. +- `Mono Left`: Uses only the left eye's image to infer gaze position. Similar to `Mono Right` but using the left eye. ::: warning **Monocular gaze is less accurate and robust** since it relies on a single eye image. Use this mode only if binocular tracking is not feasible or if there's a specific need for single-eye tracking. From c062d1c08b8252c1b2e4b5e3aa03d4c1ac8b3a29 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Miguel=20Garc=C3=ADa=20Garc=C3=ADa?= Date: Mon, 11 Nov 2024 12:46:30 +0100 Subject: [PATCH 05/14] Consistent Neon Companion App capitalisation & internal linking --- alpha-lab/neon-with-capture/index.md | 10 ++--- neon/data-collection/data-format/index.md | 8 ++-- neon/data-collection/data-streams/index.md | 8 ++-- .../ecosystem-overview/index.md | 6 +-- neon/data-collection/first-recording/index.md | 4 +- neon/data-collection/measuring-ied/index.md | 13 +++--- neon/data-collection/monitor-app/index.md | 4 +- neon/data-collection/psychopy/index.md | 44 ++++++++++--------- neon/data-collection/recordings/index.md | 8 ++-- .../scene-camera-exposure/index.md | 18 +++++--- .../time-synchronization/index.md | 11 +++-- .../transfer-recordings-via-usb/index.md | 22 +++++----- neon/data-collection/troubleshooting/index.md | 2 +- neon/data-collection/wearers/index.md | 3 +- neon/hardware/compatible-devices/index.md | 14 +++--- .../module-technical-overview/index.md | 2 +- neon/neon-player/eye-state-timeline/index.md | 21 ++++----- neon/pupil-cloud/index.md | 2 +- neon/pupil-cloud/offset-correction/index.md | 10 ++--- neon/pupil-cloud/troubleshooting/index.md | 10 +++-- neon/real-time-api/tutorials/index.md | 18 ++++++-- 21 files changed, 136 insertions(+), 102 deletions(-) diff --git a/alpha-lab/neon-with-capture/index.md b/alpha-lab/neon-with-capture/index.md index d6eabb55d..10f1b89b7 100644 --- a/alpha-lab/neon-with-capture/index.md +++ b/alpha-lab/neon-with-capture/index.md @@ -1,6 +1,6 @@ # Use Neon with Pupil 
Capture -It's possible to make recordings using Neon in Pupil Capture under MacOS and Linux. You can use the Neon module as if it was a Pupil Core headset by plugging it into your Computer and running Pupil Capture from [source](https://github.com/pupil-labs/pupil/tree/master). +It's possible to make recordings using Neon in Pupil Capture under MacOS and Linux. You can use the Neon module as if it was a Pupil Core headset by plugging it into your Computer and running Pupil Capture from [source](https://github.com/pupil-labs/pupil/tree/master). Note, that in this case gaze estimation will **NOT** be done using **NeonNet**, but using Pupil Core's gaze estimation pipeline. This means you will have to do a calibration and experience a lack of robustness compared to NeonNet. @@ -11,10 +11,10 @@ To get started with Neon in Capture, follow these steps: 1. Connect **Neon** to your computer. 2. Open **Pupil Capture**. 3. Under Video Source, click **"Activate Device"** and choose **Neon** to activate the scene and eye cameras. -4. In the eye camera's window, you can adjust the absolute exposure time and gain. Keep in mind that the Neon eye cameras do not have individual controls, so changes will affect both cameras. -Additionally, you may want to switch to ROI mode in the general settings to define a specific area for pupil detection. For more guidance, check out this [video](https://drive.google.com/file/d/1tr1KQ7QFmFUZQjN9aYtSzpMcaybRnuqi/view?usp=sharing). +4. In the eye camera's window, you can adjust the absolute exposure time and gain. Keep in mind that the Neon eye cameras do not have individual controls, so changes will affect both cameras. + Additionally, you may want to switch to ROI mode in the general settings to define a specific area for pupil detection. For more guidance, check out this [video](https://drive.google.com/file/d/1tr1KQ7QFmFUZQjN9aYtSzpMcaybRnuqi/view?usp=sharing). 5. 
Select either `Neon 3D` or `2D` as the **Gaze Mapping** option in the **Calibration** tab. ::: warning -Recordings made with the Neon Companion app (rather than Pupil Capture) are **NOT** compatible with Pupil Player. -::: \ No newline at end of file +Recordings made with the Neon Companion App (rather than Pupil Capture) are **NOT** compatible with Pupil Player. +::: diff --git a/neon/data-collection/data-format/index.md b/neon/data-collection/data-format/index.md index e1aceaf61..35199f486 100644 --- a/neon/data-collection/data-format/index.md +++ b/neon/data-collection/data-format/index.md @@ -19,9 +19,9 @@ This file contains meta-information on the recording. | **android_device_id** | Unique identifier of the Android device used as Companion. | | **android_device_model** | Model name of the Companion device. | | **android_device_name** | Device name of the Companion device. | -| **app_version** | Version of the Neon Companion app used to make the recording. | -| **calib_version** | Version of the offset correction used by the Neon Companion app. | -| **data_format_version** | Version of the data format used by the Neon Companion app. | +| **app_version** | Version of the Neon Companion App used to make the recording. | +| **calib_version** | Version of the offset correction used by the Neon Companion App. | +| **data_format_version** | Version of the data format used by the Neon Companion App. | | **duration** | Duration of the recording in nanoseconds | | **firmware_version** | Version numbers of the firmware and FPGA. | | **frame_id** | Number identifying the type of frame used for this recording. | @@ -29,7 +29,7 @@ This file contains meta-information on the recording. | **gaze_offset** | Gaze offset applied to this recording using the offset correction. Values are in pixels. | | **module_serial_number** | Serial number of the Neon module used for the recording. This number is encoded in the QR code on the back of the Neon module. 
| | **os_version** | Version of the Android OS that was installed on the recording Companion device. | -| **pipeline_version** | Version of the gaze estimation pipeline used by the Neon Companion app. | +| **pipeline_version** | Version of the gaze estimation pipeline used by the Neon Companion App. | | **recording_id** | Unique identifier of the recording. | | **start_time** | Timestamp of when the recording was started. Given as UTC timestamp in nanoseconds. | | **template_data** | Data regarding the selected template for the recording as well as the response values. | diff --git a/neon/data-collection/data-streams/index.md b/neon/data-collection/data-streams/index.md index c454b3498..2bd9efe20 100644 --- a/neon/data-collection/data-streams/index.md +++ b/neon/data-collection/data-streams/index.md @@ -21,7 +21,7 @@ The scene camera can be operated with automatic or manual exposure. In situation ## Gaze Available in: Real-timePupil CloudNeon Player -The Neon Companion app can provide gaze data in real-time at up to 200 Hz. Gaze data is output in pixel space of the scene camera image. The origin is in the top-left corner of the image. +The Neon Companion App can provide gaze data in real-time at up to 200 Hz. Gaze data is output in pixel space of the scene camera image. The origin is in the top-left corner of the image. ![Gaze](./gaze.jpg) @@ -47,7 +47,7 @@ The downloads for gaze mapping enrichments ([Reference Image Mapper](/pupil-clou ## 3D Eye States Available in: Real-timePupil Cloud -The Neon Companion app provides 3D eye state data in real-time at up to 200 Hz. The 3D eye states are a time series of each eye's position and orientation in 3D space, given by the location of the eyeball center and the optical axis of each eye. The units are millimeters. +The Neon Companion App provides 3D eye state data in real-time at up to 200 Hz. 
The 3D eye states are a time series of each eye's position and orientation in 3D space, given by the location of the eyeball center and the optical axis of each eye. The units are millimeters. The coordinate system is depicted below. The origin corresponds to the scene camera of the Neon Module. @@ -64,7 +64,7 @@ If 200 Hz real-time data is essential, consider upgrading to a newer [Companion ## Pupil Diameters Available in: Real-timePupil Cloud -The Neon Companion app provides pupil diameter data in real-time at up to 200 Hz. Separately for the left and right eye. The computed pupil diameters correspond to the physical pupil size in mm, rather than the apparent pupil size in pixels as observed in the eye videos. You can find a high-level description as well as a thorough evaluation of the accuracy and robustness of Neon’s pupil-size measurements in our [white paper](https://zenodo.org/records/10057185). +The Neon Companion App provides pupil diameter data in real-time at up to 200 Hz. Separately for the left and right eye. The computed pupil diameters correspond to the physical pupil size in mm, rather than the apparent pupil size in pixels as observed in the eye videos. You can find a high-level description as well as a thorough evaluation of the accuracy and robustness of Neon’s pupil-size measurements in our [white paper](https://zenodo.org/records/10057185). Similar to the 3D eye states, the accuracy of the pupil diameter measurements improves when supplying the wearer's IED in the wearer profile before making a recording. @@ -82,7 +82,7 @@ The blink detection algorithm is operating directly on the eye video to detect t Available in: Pupil CloudNeon Player Stereo microphones are integrated into the Neon module. Recorded audio will be part of the resulting scene video. -Audio recording is disabled in the Neon Companion app by default and can be enabled in the settings. 
+Audio recording is disabled in the Neon Companion App by default and can be enabled in the settings. ## Movement (IMU Data) diff --git a/neon/data-collection/ecosystem-overview/index.md b/neon/data-collection/ecosystem-overview/index.md index 779befda8..484fbb030 100644 --- a/neon/data-collection/ecosystem-overview/index.md +++ b/neon/data-collection/ecosystem-overview/index.md @@ -4,7 +4,7 @@ The Neon ecosystem contains a range of tools that support you during data collec ## Neon Companion App -You should have already used the Neon Companion app to [make your first recording](/data-collection/first-recording/). This app is the core of every Neon data collection. +You should have already used the Neon Companion App to [make your first recording](/data-collection/first-recording/). This app is the core of every Neon data collection. When your Neon is connected to the Companion device, it supplies it with power and enables it to generate a real-time gaze signal as well as several other [data streams](/data-collection/data-streams/). When making a [recording](/data-collection/recordings/), all generated data is saved on the Companion device. @@ -12,7 +12,7 @@ The app automatically saves [UTC timestamps](https://en.wikipedia.org/wiki/Coord ## Other Data Collection Tools -Several other tools complement the Neon Companion app and can make data collection much easier in some scenarios. +Several other tools complement the Neon Companion App and can make data collection much easier in some scenarios. ### Neon Monitor @@ -40,7 +40,7 @@ Neon is compatible with LSL and you can learn more about how to use Neon with LS [Pupil Cloud](/pupil-cloud/) is our web-based storage and analysis platform located at [cloud.pupil-labs.com](https://cloud.pupil-labs.com/). It makes it easy to store all your data securely in one place and offers a variety of options for data analysis and visualization. 
-Pupil Cloud is the recommended tool for processing your Neon recordings and if you enable uploads in the Neon Companion app all recordings can be uploaded automatically. +Pupil Cloud is the recommended tool for processing your Neon recordings and if you enable uploads in the Neon Companion App all recordings can be uploaded automatically. ![Pupil Cloud](./pupil_cloud.webp) diff --git a/neon/data-collection/first-recording/index.md b/neon/data-collection/first-recording/index.md index 2b81450d4..149d75e98 100644 --- a/neon/data-collection/first-recording/index.md +++ b/neon/data-collection/first-recording/index.md @@ -12,11 +12,11 @@ Create a new Google account or use an existing Google account during setup. ## 2. Install and Start the Neon Companion App. -Next, install the Neon Companion app on your device: +Next, install the Neon Companion App on your device: - Launch the **Google Play Store** app. It is already installed by default on your Companion Device. - Search for [**Neon Companion**](https://play.google.com/store/apps/details?id=com.pupillabs.neoncomp) in the Google Play Store and click install. -- Start the Neon Companion app when the installation has finished. +- Start the Neon Companion App when the installation has finished. - A couple of permission prompts will appear, which you need to accept. - First-time users need to sign up for a [Pupil Cloud](https://cloud.pupil-labs.com/) account. Sign up using your Google account, or create an account with an email address and password. diff --git a/neon/data-collection/measuring-ied/index.md b/neon/data-collection/measuring-ied/index.md index 9b1ab5953..d908de9d7 100644 --- a/neon/data-collection/measuring-ied/index.md +++ b/neon/data-collection/measuring-ied/index.md @@ -1,15 +1,16 @@ # Measuring the Inter-Eye-Distance (IED) -The wearer's IED can be set in the Neon Companion app for applications requiring -precise pupillometry or eye-state measurements. 
-This does not require prior calibration of the device. However, the accuracy of Neon’s 3D eye-state and pupil-size +The wearer's IED can be set in the Neon Companion App for applications requiring +precise pupillometry or eye-state measurements. +This does not require prior calibration of the device. However, the accuracy of Neon’s 3D eye-state and pupil-size measurements can be enhanced by correctly setting the IED for each wearer. -To add the wearer's IED, input the value into their ‘Wearer Profile’ in the Neon Companion app before -starting +To add the wearer's IED, input the value into their ‘Wearer Profile’ in the Neon Companion App before +starting a recording. The default IED is set to 63 mm, which is the average for adults. Here's a simple way to measure IED: + 1. Ask the person to look at a distant object (to avoid convergence) 2. Hold a ruler in front of their eyes and measure from the center of one pupil to the center of the -other \ No newline at end of file + other diff --git a/neon/data-collection/monitor-app/index.md b/neon/data-collection/monitor-app/index.md index 76600d62c..255aaa965 100644 --- a/neon/data-collection/monitor-app/index.md +++ b/neon/data-collection/monitor-app/index.md @@ -4,10 +4,10 @@ Using the **Neon Monitor** app you can easily monitor your data collection in real-time and remote control all your Neons from another device. It's a web-app that can be opened in any browser on a computer, tablet or phone! The only requirement is that the Neon Companion device and the device you use to access the Monitor app are connected to the same network. -To access the Monitor app make sure the Neon Companion app is running and visit the page [neon.local:8080](http://neon.local:8080) on your monitoring device. +To access the Monitor app make sure the Neon Companion App is running and visit the page [neon.local:8080](http://neon.local:8080) on your monitoring device. 
::: tip -The Neon Companion app can display a QR code that gets you straight to the monitor website. Just select `Streaming` on the home screen! +The Neon Companion App can display a QR code that gets you straight to the monitor website. Just select `Streaming` on the home screen! ::: ## The App's User Interface diff --git a/neon/data-collection/psychopy/index.md b/neon/data-collection/psychopy/index.md index 55a8f74cf..07113c376 100644 --- a/neon/data-collection/psychopy/index.md +++ b/neon/data-collection/psychopy/index.md @@ -1,26 +1,27 @@ # PsychoPy -[PsychoPy](https://psychopy.org/) is widely used open-source software for creating and running psychophysics experiments. +[PsychoPy](https://psychopy.org/) is widely used open-source software for creating and running psychophysics experiments. It enables users to present stimuli, collect data, and interface with a variety of hardware and software applications. -We have created a dedicated plugin for PsychoPy that enables Neon to be used in PsychoPy experiments. PsychoPy +We have created a dedicated plugin for PsychoPy that enables Neon to be used in PsychoPy experiments. PsychoPy users have two options for designing their experiments, both of which can be used alongside Neon: - [Builder](https://www.psychopy.org/builder/) – Gives users a graphical interface with little or no need to write code - although it does support custom code when necessary. - [Coder](https://psychopy.org/coder/index.html) – Gives users the option to generate experiments or do other things programmatically, [using Psychopy like any other Python package](https://psychopy.org/api/). ## Using PsychoPy with Neon -When using PsychoPy with Neon, you can save eyetracking data in PsychoPy's hdf5 format, by enabling the "Save hdf5 file" -option within the experiment settings. 
But we also recommend recording in the Neon Companion app for the duration of + +When using PsychoPy with Neon, you can save eyetracking data in PsychoPy's hdf5 format, by enabling the "Save hdf5 file" +option within the experiment settings. But we also recommend recording in the Neon Companion App for the duration of the experiment for data redundancy. PsychoPy’s standard "Eyetracker Record" component can be used to start and stop recordings on the Companion Device accordingly. -For experiments that only require pupillometry/eye state, make sure the "Compute Eye State" setting is enabled in the companion app. +For experiments that only require pupillometry/eye state, make sure the "Compute Eye State" setting is enabled in the companion app. For experiments that do not require screen-based gaze coordinates, this is all that is required. -To use Neon for screen-based work in PsychoPy, the screen needs to be robustly located within the scene camera’s field of view, -and Neon’s gaze data subsequently transformed from scene camera-based coordinates to screen-based coordinates. The plugin for -PsychoPy achieves this with the use of AprilTag Markers and the +To use Neon for screen-based work in PsychoPy, the screen needs to be robustly located within the scene camera’s field of view, +and Neon’s gaze data subsequently transformed from scene camera-based coordinates to screen-based coordinates. The plugin for +PsychoPy achieves this with the use of AprilTag Markers and the [real-time-screen-gaze](https://github.com/pupil-labs/real-time-screen-gaze) Python package (installed automatically with the plugin). ## Builder @@ -45,27 +46,27 @@ The standard "Eyetracker Record" and "Region of Interest" components work with N Two new Builder components will be available in the components list under the Eyetracking section: "April Tag Frame" and "April Tag" (necessary for screen-based work). - April Tag Frame: this component is recommended for most users. 
Using it in your Builder experiment will display an array of AprilTag markers around the edge of the screen. You can configure the number of markers to display along the horizontal and vertical edges of the screen, the size and contrast of the markers, and (optionally) the marker IDs. A minimum of four markers (2 horizontally by 2 vertically) is recommended, but more markers will provide more robust detection and accurate mapping. -![AprilTag Frame](./apriltag-frame.png) + ![AprilTag Frame](./apriltag-frame.png) - April Tag: this component will add a single AprilTag marker to your display. It is intended for use when the April Tag Frame component cannot be used (e.g., you need to display stimuli on the edges of the display where the April Tag Frame component would place markers in the way). ### Data Format -[PsychoPy saves eyetracking data in its own format](https://psychopy.org/hardware/eyeTracking.html#what-about-the-data). +[PsychoPy saves eyetracking data in its own format](https://psychopy.org/hardware/eyeTracking.html#what-about-the-data). -When processing eyetracking data in PsychoPy's data format, please note that PsychoPy doesn’t have distinct record types -for gaze data versus eye state. If you’re collecting screen-gaze coordinates and pupillometry data, their records they will +When processing eyetracking data in PsychoPy's data format, please note that PsychoPy doesn’t have distinct record types +for gaze data versus eye state. If you’re collecting screen-gaze coordinates and pupillometry data, their records they will be intermixed, but they can be distinguished. 
- For screen gaze records - - `[left|right]_gaze_[x|y]` will be the screen coordinates in PsychoPy’s display units `[left|right]_gaze_z` will be `0` - - `[left|right]_eye_cam_[x|y|z]` will be `0` - - `left_pupil_measure1` and `left_pupil_measure1_type` will be `0` + - `[left|right]_gaze_[x|y]` will be the screen coordinates in PsychoPy’s display units `[left|right]_gaze_z` will be `0` + - `[left|right]_eye_cam_[x|y|z]` will be `0` + - `left_pupil_measure1` and `left_pupil_measure1_type` will be `0` - For eye state records - - `[left|right]_gaze_[x|y|z]` will be the optical axis vector - - `[left|right]_eye_cam_[x|y|z]` will be eye position - - `left_pupil_measure1` will be pupil diameter in mm - - `left_pupil_measure1_type` will be `77` + - `[left|right]_gaze_[x|y|z]` will be the optical axis vector + - `[left|right]_eye_cam_[x|y|z]` will be eye position + - `left_pupil_measure1` will be pupil diameter in mm + - `left_pupil_measure1_type` will be `77` ### Example Builder Experiment @@ -73,11 +74,12 @@ Check out our simple but complete [gaze contingent demo designed in PsychoPy Bui ## Coder -To use Neon with PsychoPy coder, we recommend interfacing directly with the [real-time API](https://docs.pupil-labs.com/neon/real-time-api/tutorials/) -and, for screen-based tasks, using the [real-time-screen-gaze](https://github.com/pupil-labs/real-time-screen-gaze) package. +To use Neon with PsychoPy coder, we recommend interfacing directly with the [real-time API](https://docs.pupil-labs.com/neon/real-time-api/tutorials/) +and, for screen-based tasks, using the [real-time-screen-gaze](https://github.com/pupil-labs/real-time-screen-gaze) package. `AprilTagFrameStim` and `AprilTagStim` classes are provided to more easily display screen markers and configure a screen-based gaze mapper. 
### Example Coder Experiment + ```python import numpy as np diff --git a/neon/data-collection/recordings/index.md b/neon/data-collection/recordings/index.md index 1ab2ff740..6e8e81fdf 100644 --- a/neon/data-collection/recordings/index.md +++ b/neon/data-collection/recordings/index.md @@ -1,15 +1,17 @@ # Recordings -A recording starts and stops when you press the record button in the Neon Companion app. While this should feel similar to recording a regular video on your phone, there is a lot more happening behind the scenes. When you are recording with the Neon Companion app, you are capturing not only video data but several more sensors (see [Data Streams](/data-collection/data-streams/)). + +A recording starts and stops when you press the record button in the Neon Companion App. While this should feel similar to recording a regular video on your phone, there is a lot more happening behind the scenes. When you are recording with the Neon Companion App, you are capturing not only video data but several more sensors (see [Data Streams](/data-collection/data-streams/)). Recordings are designed to be as robust as possible. If at any point the Neon module is temporarily disconnected from the Companion phone, it will automatically start capturing again as soon as it is reconnected. You could start a recording with no Neon connected and plug it in at a later time. As soon as it is connected, data will be captured. -The Neon Companion app has several more features to ensure robust data collection and will e.g. warn you in case the Companion device's battery is running low or if you run out of storage space. +The Neon Companion App has several more features to ensure robust data collection and will e.g. warn you in case the Companion device's battery is running low or if you run out of storage space. 
## Events, Wearers, & Templates + When making a recording you can capture various additional data to record things like metadata or key events that happened during data collection. These will be saved as part of the recording itself. [**Events**](/data-collection/events/) are key points in time in a recording that have been marked. [**Wearers**](/data-collection/wearers/) are the people who wear the Neon device while recording. -[**Templates**](/data-collection/templates/) are questionnaires that can be filled out at recording time. \ No newline at end of file +[**Templates**](/data-collection/templates/) are questionnaires that can be filled out at recording time. diff --git a/neon/data-collection/scene-camera-exposure/index.md b/neon/data-collection/scene-camera-exposure/index.md index a2276e518..cc280153b 100644 --- a/neon/data-collection/scene-camera-exposure/index.md +++ b/neon/data-collection/scene-camera-exposure/index.md @@ -1,30 +1,34 @@ # Scene Camera Exposure -The [scene camera’s](https://docs.pupil-labs.com/neon/data-collection/data-streams/#scene-video) exposure can be adjusted to improve image quality in different lighting conditions. There are four modes: + +The [scene camera’s](/data-collection/data-streams/#scene-video) exposure can be adjusted to improve image quality in different lighting conditions. There are four modes: - **Manual:** This mode lets you set the exposure time manually. - **Automatic**: `Highlights`, `Balanced`, and `Shadows` automatically adjust exposure according to the surrounding lighting. -::: tip -The mode you choose should depend on the lighting conditions in your environment. The images below provide some +::: tip +The mode you choose should depend on the lighting conditions in your environment. The images below provide some examples and important considerations. 
:::

## Changing Exposure Modes

-From the home screen of the Neon Companion app, tap
-the [Scene and Eye Camera preview](https://docs.pupil-labs.com/neon/data-collection/first-recording/#_4-open-the-live-preview),
+
+From the home screen of the Neon Companion App, tap
+the [Scene and Eye Camera preview](/data-collection/first-recording/#_4-open-the-live-preview),
and then select `Balanced` to reveal all four modes.

## Manual Exposure Mode

-Allows you to set the exposure time between 1 ms and 1000 ms.
+
+Allows you to set the exposure time between 1 ms and 1000 ms.

::: tip
Exposure duration is inversely related to camera frame rate. Exposure values above 330 ms will reduce the scene camera rate below 30 fps.
:::

## Automatic Exposure Modes

+
`Highlights` - optimizes the exposure to capture bright areas in the environment, while potentially underexposing dark areas.
![This mode optimizes the exposure to capture bright areas in the environment, while potentially underexposing dark areas.](Highlight.webp)
-
+
`Balanced` - optimizes the exposure to capture brighter and darker areas equally.
![This mode optimizes the exposure to capture brighter and darker areas in the environment equally.](./Balance.webp)
Neon provides UTC timestamps for all the data it generates, which makes it easy to sync the data to anything. Those timestamps are generated using the clock of the Companion device.

However, digital clocks can suffer from drift, meaning that they sometimes run slightly too fast or slow. Over time this error accumulates and can lead to errors when comparing two clocks. Therefore, digital clocks regularly readjust themselves by syncing to a master clock over the internet every other day or so. Two freshly synced clocks should have an offset of less than ~20 ms. From there, the offset increases on the order of a few tens of milliseconds per hour. After 24 hours it may reach about 1 second.

-
### Force Syncing to the Master Clock on Demand
+
The easiest way to achieve accurate time synchronization is to force a fresh sync-up to the master clock of all devices before starting data collection. This will ensure drift error is minimized for at least a few hours.

A sync-up can usually be forced by toggling the automatic determination of the current time off and back on in the operating system's settings. In Android 11, for example, the `Date & Time` settings in the `System` settings have a toggle called `Use network-provided time`. In Android 12, the toggle is called `Set time automatically`. Whenever this toggle is turned on, the system syncs up.
@@ -25,12 +26,14 @@ NTP_SERVER=north-america.pool.ntp.org
```

### Improving Synchronization further
+
While an error of `<20 ms` is sufficient for most applications, some require even better synchronization. To achieve this, you can estimate the offset between the clock used by Neon and the external clock down to single millisecond accuracy. This can be done using the `TimeOffsetEstimator` of the [real-time API](/real-time-api/tutorials/).

Using the following code, you can estimate the offset between the Neon clock and the clock of the host executing the code.
::: tip **Dependency**: `pip install "pupil-labs-realtime-api>=1.1.0"` ::: + ```python from pupil_labs.realtime_api.simple import discover_one_device @@ -43,7 +46,7 @@ if device is None: estimate = device.estimate_time_offset() if estimate is None: device.close() - raise SystemExit("Neon Companion app is too old") + raise SystemExit("Neon Companion App is too old") print(f"Mean time offset: {estimate.time_offset_ms.mean} ms") print(f"Mean roundtrip duration: {estimate.roundtrip_duration_ms.mean} ms") @@ -53,9 +56,11 @@ device.close() Using continuous offset estimates like this, you can precisely compensate for clock drifts by correcting the respective timestamps with it. The calculation would look like this: + ```python companion_app_time = external_clock_time - offset ``` + ::: tip **Note:** In very busy wifi networks the transfer speeds might fluctuate wildly and potentially impact the clock offset measurement. In such cases it would be helpful to connect the phone to the network via [ethernet](/hardware/using-a-usb-hub/) instead. ::: diff --git a/neon/data-collection/transfer-recordings-via-usb/index.md b/neon/data-collection/transfer-recordings-via-usb/index.md index 74cd3febd..960fbfd5d 100644 --- a/neon/data-collection/transfer-recordings-via-usb/index.md +++ b/neon/data-collection/transfer-recordings-via-usb/index.md @@ -4,21 +4,22 @@ The recommended way for transferring recordings off of the phone is to upload them to [Pupil Cloud](/pupil-cloud/). For some use-cases, however, this may not be possible and users may want to transfer the recordings via USB. ::: -To transfer recordings directly to a computer you first need to export the recordings to the Android filesystem. Then you need to access the filesystem to copy the data over to your computer. +To transfer recordings directly to a computer you first need to export the recordings to the Android filesystem. Then you need to access the filesystem to copy the data over to your computer. 
Recordings downloaded directly from the phone will be in a raw binary format.

-#### Export from Neon Companion app
-1. Open the recordings view in the Neon Companion app
+#### Export from Neon Companion App
+
+1. Open the recordings view in the Neon Companion App
2. Select desired recording/s
3. Export:
-   - For single recordings, the export button is found by clicking on the 3 vertical dots to
the right of the cloud symbol
-   - For multiple recordings, click the download symbol at the bottom of the screen
+   - For single recordings, the export button is found by clicking on the 3 vertical dots to
the right of the cloud symbol
+   - For multiple recordings, click the download symbol at the bottom of the screen
4. The app will show you a dialog indicating to which folder the recordings will be exported. Confirm this by clicking `Yes`.

-
#### Transfer Exported Recordings to a Computer
+
1. Connect your OnePlus device to a PC via USB (using the USB cable supplied)
2. Slide down from the top of the device's home-screen and click on 'Android System - USB charging this device'
3. Click on 'Tap for more options'
@@ -27,12 +28,11 @@ Recordings downloaded directly from the phone will be in a raw binary format.
6. Locate the export folder on the phone. Usually, it is in `Documents/Neon Export`.
7. Copy the recordings to your computer.

+Note that the export process does not delete the recordings from the Neon Companion App, and you can still upload
+to Pupil Cloud at a later date if required.
-
-Note that the export process does not delete the recordings from the Neon Companion app, and you can still upload
-to Pupil Cloud at a later date if required.
-
-Recordings that are deleted from the Neon Companion app, e.g. to free up storage space, cannot be transferred back
-to the Neon Companion app from your backup location (including Pupil Cloud, a laptop/desktop PC, or external HD).
+Recordings that are deleted from the Neon Companion App, e.g.
to free up storage space, cannot be transferred back +to the Neon Companion App from your backup location (including Pupil Cloud, a laptop/desktop PC, or external HD). This means that if you delete the recordings prior to uploading them to Pupil Cloud, they cannot be uploaded at a later date. diff --git a/neon/data-collection/troubleshooting/index.md b/neon/data-collection/troubleshooting/index.md index e9ef30d2f..c4faa8511 100644 --- a/neon/data-collection/troubleshooting/index.md +++ b/neon/data-collection/troubleshooting/index.md @@ -4,7 +4,7 @@ Below you can find a list of issues we have observed in the past and recommendat ## The Companion Device Is Vibrating and a Red LED Is Blinking in the Neon Module! -The vibrations and the blinking LED try to grab the wearer's attention to notify them of a problem that may critically hurt the ongoing recording. To get details on the problem, open the Neon Companion app, which will show an error description. +The vibrations and the blinking LED try to grab the wearer's attention to notify them of a problem that may critically hurt the ongoing recording. To get details on the problem, open the Neon Companion App, which will show an error description. Potential problems include: diff --git a/neon/data-collection/wearers/index.md b/neon/data-collection/wearers/index.md index 38bd6bbe6..02177fac5 100644 --- a/neon/data-collection/wearers/index.md +++ b/neon/data-collection/wearers/index.md @@ -1,5 +1,6 @@ # Wearers -Wearers are the people wearing Neon. In a typical study, each subject would be a wearer. Every recording you make is assigned to a wearer to help you organize your recordings. You can create new wearers on the fly in the Neon Companion app or in advance using [Pupil Cloud](/pupil-cloud/). It is also possible to change the assigned wearer in a recording post hoc in Pupil Cloud. Simply select *Change Wearer* from the context menu of a recording! + +Wearers are the people wearing Neon. 
In a typical study, each subject would be a wearer. Every recording you make is assigned to a wearer to help you organize your recordings. You can create new wearers on the fly in the Neon Companion App or in advance using [Pupil Cloud](/pupil-cloud/). It is also possible to change the assigned wearer in a recording post hoc in Pupil Cloud. Simply select _Change Wearer_ from the context menu of a recording! Every wearer is assigned a unique ID, such that you can edit the name and profile picture at any time without mixing up your recordings. diff --git a/neon/hardware/compatible-devices/index.md b/neon/hardware/compatible-devices/index.md index a8f6e1e54..c3512dde4 100644 --- a/neon/hardware/compatible-devices/index.md +++ b/neon/hardware/compatible-devices/index.md @@ -1,21 +1,23 @@ # Companion Device -The Companion device is a flagship Android smartphone. It is a regular phone that is not customized or modified in any way. To ensure maximum stability and performance we can only support a small number of carefully selected and tested models. The Neon Companion app is tuned to work with these particular models as we require full control over various low-level functions of the hardware. -The supported models are: OnePlus 8, OnePlus 8T, OnePlus 10 Pro, and Motorola Edge 40 Pro. Currently, Neon ships with a Motorola Edge 40 Pro. We highly recommend the Edge 40 Pro, giving you the best performance, endurance and stability. +The Companion device is a flagship Android smartphone. It is a regular phone that is not customized or modified in any way. To ensure maximum stability and performance we can only support a small number of carefully selected and tested models. The Neon Companion App is tuned to work with these particular models as we require full control over various low-level functions of the hardware. -If you want to replace or add an extra Companion device you can purchase it [directly from us](https://pupil-labs.com/products/neon) or from any other distributor. 
The Neon Companion app is free and can be downloaded from the [Play Store](https://play.google.com/store/apps/details?id=com.pupillabs.neoncomp). +The supported models are: OnePlus 8, OnePlus 8T, OnePlus 10 Pro, and Motorola Edge 40 Pro. Currently, Neon ships with a Motorola Edge 40 Pro. We highly recommend the Edge 40 Pro, giving you the best performance, endurance and stability. + +If you want to replace or add an extra Companion device you can purchase it [directly from us](https://pupil-labs.com/products/neon) or from any other distributor. The Neon Companion App is free and can be downloaded from the [Play Store](https://play.google.com/store/apps/details?id=com.pupillabs.neoncomp). Using a fully charged Motorola Edge 40 Pro device you get around 4 hours of continuous recording time. You can extend this duration by simultaneously charging the phone during a recording [using a powered USB-C hub](/hardware/using-a-usb-hub/). ## Companion Device Updates -### Neon Companion app -Make sure to update the Neon Companion app on a regular basis. The latest version will always be available on the +### Neon Companion App + +Make sure to update the Neon Companion App on a regular basis. The latest version will always be available on the [Play Store](https://play.google.com/store/apps/details?id=com.pupillabs.neoncomp). ### Android OS -We ship each Companion Device with a specific Android Version, carefully tested to ensure robustness and stability. +We ship each Companion Device with a specific Android Version, carefully tested to ensure robustness and stability. The currently supported Android versions are as follows: diff --git a/neon/hardware/module-technical-overview/index.md b/neon/hardware/module-technical-overview/index.md index 7533b0d4c..a5da2ad9c 100644 --- a/neon/hardware/module-technical-overview/index.md +++ b/neon/hardware/module-technical-overview/index.md @@ -10,7 +10,7 @@ The Neon module is a small powerhouse of sensors! 
It connects to the Companion d - **IR LEDs**: One infrared LED is located on each arm of the module. The LEDs illuminate the eyes of the wearer to improve image quality in dark environments. -- **Scene Camera**: A front-facing scene camera is located at the center of the module capturing [scene video](/data-collection/data-streams/#scene-video). A **microphone** is integrated into the module to capture [audio](/data-collection/data-streams/#audio). Capturing audio is optional and settable in the Neon Companion app settings. +- **Scene Camera**: A front-facing scene camera is located at the center of the module capturing [scene video](/data-collection/data-streams/#scene-video). A **microphone** is integrated into the module to capture [audio](/data-collection/data-streams/#audio). Capturing audio is optional and settable in the Neon Companion App settings. - **IMU**: A 9-degrees-of-freedom IMU is integrated into the module. It captures the [inertia](/data-collection/data-streams/#movement-imu-data) of the glasses, including translational acceleration, rotational speed, magnetic orientation, pitch, yaw, and roll. diff --git a/neon/neon-player/eye-state-timeline/index.md b/neon/neon-player/eye-state-timeline/index.md index f1ab4fd3c..2de008562 100644 --- a/neon/neon-player/eye-state-timeline/index.md +++ b/neon/neon-player/eye-state-timeline/index.md @@ -1,21 +1,22 @@ # Eye State Timeline -This plugin visualizes [3D eye state](/data-collection/data-streams/#_3d-eye-states) and [pupil diameter](/data-collection/data-streams/#pupil-diameters) data. +This plugin visualizes [3D eye state](/data-collection/data-streams/#_3d-eye-states) and [pupil diameter](/data-collection/data-streams/#pupil-diameters) data. ![Eye State Timeline](./eye-state-timeline.webp) ::: info -The data will only be visualized if Eye State computation was enabled in the Neon Companion app during recording. 
-::: +The data will only be visualized if Eye State computation was enabled in the Neon Companion App during recording. +::: ## Export Format + Results exported to `3d_eye_states.csv` with the following fields: -| Field | Description | -| ------------------------- | -------- | -| **section id** | Unique identifier of the corresponding section. | -| **recording id** | Unique identifier of the recording this sample belongs to. | -| **timestamp [ns]** | UTC timestamp in nanoseconds of the sample. Equal to the timestamp of the eye video frame this sample was generated with. | -| **pupil diameter left [mm]** | Physical diameter of the pupil of the left eye. | -| **pupil diameter right [mm]** | Physical diameter of the pupil of the right eye. | +| Field | Description | +| ------------------------- | -------- | +| **section id** | Unique identifier of the corresponding section. | +| **recording id** | Unique identifier of the recording this sample belongs to. | +| **timestamp [ns]** | UTC timestamp in nanoseconds of the sample. Equal to the timestamp of the eye video frame this sample was generated with. | +| **pupil diameter left [mm]** | Physical diameter of the pupil of the left eye. | +| **pupil diameter right [mm]** | Physical diameter of the pupil of the right eye. | | **eye ball center left x [mm]**
**eye ball center left y [mm]**
**eye ball center left z [mm]**
**eye ball center right x [mm]**
**eye ball center right y [mm]**
**eye ball center right z [mm]** | Location of left and right eye ball centers in millimeters in relation to the scene camera of the Neon module. For details on the coordinate systems see [here](/data-collection/data-streams/#_3d-eye-states). | | **optical axis left x**
**optical axis left y**
**optical axis left z**
**optical axis right x**
**optical axis right y**
**optical axis right z** | Directional vector describing the optical axis of the left and right eye, i.e. the vector pointing from eye ball center to pupil center of the respective eye. For details on the coordinate systems see [here](/data-collection/data-streams/#_3d-eye-states). | diff --git a/neon/pupil-cloud/index.md b/neon/pupil-cloud/index.md index a6d134177..cf94731f2 100644 --- a/neon/pupil-cloud/index.md +++ b/neon/pupil-cloud/index.md @@ -8,6 +8,6 @@ aside: false [Pupil Cloud](https://cloud.pupil-labs.com) is a web-based eye tracking platform for data logistics, analysis, and visualization. It is the recommended tool for processing your Neon recordings. It makes it easy to store all your data securely in one place and it offers a variety of options for analysis. -If Cloud upload is enabled in the Neon Companion app, then all recordings will be uploaded automatically to Pupil Cloud. +If Cloud upload is enabled in the Neon Companion App, then all recordings will be uploaded automatically to Pupil Cloud. We have a strict privacy policy that ensures your recording data is accessible only by you and those you explicitly grant access to. Pupil Labs will never access your recording data unless you explicitly instruct us to. diff --git a/neon/pupil-cloud/offset-correction/index.md b/neon/pupil-cloud/offset-correction/index.md index 321a498e6..8aadcbe49 100644 --- a/neon/pupil-cloud/offset-correction/index.md +++ b/neon/pupil-cloud/offset-correction/index.md @@ -1,19 +1,19 @@ # Offset Correction on Pupil Cloud -For some subjects, you may find a constant offset in their gaze estimates. This gaze offset can be [compensated for in the Neon Companion app](https://docs.pupil-labs.com/neon/data-collection/offset-correction/) before starting a recording, or post hoc in Pupil Cloud as shown below. Click on the gaze offset icon to start the gaze offset correction. +For some subjects, you may find a constant offset in their gaze estimates. 
This gaze offset can be [compensated for in the Neon Companion App](/data-collection/offset-correction/) before starting a recording, or post hoc in Pupil Cloud as shown below. Click on the gaze offset icon to start the gaze offset correction. ![Offset correction on Cloud header image](./offset-cloud-timeline.png) -Once you enter the `Edit Gaze Offset` view, simply drag the gaze circle to apply the correction. See the video below for reference. +Once you enter the `Edit Gaze Offset` view, simply drag the gaze circle to apply the correction. See the video below for reference. - The grey circle indicates the raw gaze estimate provided by Neon’s gaze estimation pipeline. - The red circle indicates the gaze position with the current offset applied. -- The blue circle indicates the offset correction applied in the Neon Companion app. +- The blue circle indicates the offset correction applied in the Neon Companion App. ::: tip -Modifying the gaze offset impacts all downstream data, such as fixations, mapped gaze from enrichments, and visualizations. Where possible, data is updated instantly. If not, the respective data will be deleted, requiring (partial) re-computation of enrichments or visualizations. In that case, the enrichment or visualization will be shown as *Not Started* and you need to re-run them. -::: \ No newline at end of file +Modifying the gaze offset impacts all downstream data, such as fixations, mapped gaze from enrichments, and visualizations. Where possible, data is updated instantly. If not, the respective data will be deleted, requiring (partial) re-computation of enrichments or visualizations. In that case, the enrichment or visualization will be shown as _Not Started_ and you need to re-run them. 
+::: diff --git a/neon/pupil-cloud/troubleshooting/index.md b/neon/pupil-cloud/troubleshooting/index.md index b4585eb9c..5147e114e 100644 --- a/neon/pupil-cloud/troubleshooting/index.md +++ b/neon/pupil-cloud/troubleshooting/index.md @@ -1,12 +1,16 @@ # Troubleshooting + Below you can find a list of issues we have observed in the past and recommendations on how to fix them. If you can not find your issue in the list, please reach out to us on [Discord](https://pupil-labs.com/chat/) or via email to `info@pupil-labs.com`. ## Recordings Are Not Uploading to Pupil Cloud Successfully -1. Make sure **Cloud upload** is enabled in the Neon Companion app's settings. + +1. Make sure **Cloud upload** is enabled in the Neon Companion App's settings. 1. Try logging out of the app and back in. ## My Enrichment Download Contains Only an ‘info.json’ File and Nothing Else! + Did you use **Safari browser** to make the download? - - Enrichment downloads come as ZIP files. By default, Safari will automatically extract ZIP files when the download is finished. However, it will only extract file types that are considered "safe". Surprisingly, CSV and MP4 files are not considered safe and Safari will by default only extract the remaining JSON files. - To fix this you can either use a different browser to make the download, or disable the **"Open 'safe' files after downloading"** setting in Safari. \ No newline at end of file +- Enrichment downloads come as ZIP files. By default, Safari will automatically extract ZIP files when the download is finished. However, it will only extract file types that are considered "safe". Surprisingly, CSV and MP4 files are not considered safe and Safari will by default only extract the remaining JSON files. + + To fix this you can either use a different browser to make the download, or disable the **"Open 'safe' files after downloading"** setting in Safari. 
diff --git a/neon/real-time-api/tutorials/index.md b/neon/real-time-api/tutorials/index.md index f04a4a748..d6963e734 100644 --- a/neon/real-time-api/tutorials/index.md +++ b/neon/real-time-api/tutorials/index.md @@ -20,7 +20,7 @@ The client comes in two modes, `simple` and `async`. The simple mode is very eas ## Connecting to a Neon Device -Using the [`discover_one_device`](https://pupil-labs-realtime-api.readthedocs.io/en/stable/api/simple.html#pupil_labs.realtime_api.simple.discover_one_device) function, we can connect to a Neon device connected to your local network. Make sure the Neon Companion app is running! If no device can be found, please check the [troubleshooting section](#troubleshooting) at the end. +Using the [`discover_one_device`](https://pupil-labs-realtime-api.readthedocs.io/en/stable/api/simple.html#pupil_labs.realtime_api.simple.discover_one_device) function, we can connect to a Neon device connected to your local network. Make sure the Neon Companion App is running! If no device can be found, please check the [troubleshooting section](#troubleshooting) at the end. 
```python from pupil_labs.realtime_api.simple import discover_one_device @@ -36,6 +36,7 @@ print(f"Battery level: {device.battery_level_percent}%") print(f"Free storage: {device.memory_num_free_bytes / 1024**3:.1f} GB") print(f"Serial number of connected glasses: {device.serial_number_glasses}") ``` + ``` Phone IP address: 192.168.1.168 Phone name: OnePlus8 @@ -63,6 +64,7 @@ time.sleep(5) device.recording_stop_and_save() ``` + ``` Started recording with id 2f99d9f9-f009-4015-97dd-eb253de443b0 ``` @@ -89,6 +91,7 @@ print(device.send_event("test event 2", event_timestamp_unix_ns=time.time_ns())) device.recording_stop_and_save() ``` + ``` Event(name=None recording_id=None timestamp_unix_ns=1642599117043000000 datetime=2022-01-19 14:31:57.043000) Event(name=None recording_id=fd8c98ca-cd6c-4d3f-9a05-fbdb0ef42668 timestamp_unix_ns=1642599122555200500 datetime=2022-01-19 14:32:02.555201) @@ -121,7 +124,9 @@ scene_image_rgb = cv2.cvtColor(scene_sample.bgr_pixels, cv2.COLOR_BGR2RGB) plt.imshow(scene_image_rgb) plt.scatter(gaze_sample.x, gaze_sample.y, s=200, facecolors='none', edgecolors='r') ``` + The output data would look as follows: + ``` This sample contains the following data: @@ -135,6 +140,7 @@ For the left eye x, y, z: -30.087890625, 10.048828125, -52.4462890625 and for th Directional vector describing the optical axis of the left and right eye. For the left eye x, y, z: -0.05339553952217102, 0.12345726788043976, 0.9909123182296753 and for the right eye x, y, z: -0.40384653210639954, 0.11708031594753265, 0.9073038101196289. ``` + Alternatively, you could also use the [`receive_scene_video_frame`](https://pupil-labs-realtime-api.readthedocs.io/en/stable/api/simple.html#pupil_labs.realtime_api.simple.Device.receive_scene_video_frame) and [`receive_gaze_datum`](https://pupil-labs-realtime-api.readthedocs.io/en/stable/api/simple.html#pupil_labs.realtime_api.simple.Device.receive_gaze_datum) methods to obtain each sample separately. 
The [`receive_matched_scene_video_frame_and_gaze`](https://pupil-labs-realtime-api.readthedocs.io/en/stable/api/simple.html#pupil_labs.realtime_api.simple.Device.receive_matched_scene_video_frame_and_gaze) method does however also ensure that both samples are matched temporally. ## IMU Data @@ -161,7 +167,9 @@ print(imu_sample.accel_data) print(f"Gyro data:") print(imu_sample.gyro_data) ``` + The output data would look as follows: + ``` This IMU sample was recorded at 2023-05-25 11:23:05.749155 It contains the following data: @@ -200,6 +208,7 @@ print(calibration["left_distortion_coefficients"][0]) ``` ## Template Data + You can access the response data entered into the template questionnaire on the phone and also set those responses remotely. Using the [`get_template`](https://pupil-labs-realtime-api.readthedocs.io/en/stable/api/simple.html#pupil_labs.realtime_api.simple.Device.get_template) method, you can receive the definition of the template containing all questions and sections. @@ -209,6 +218,7 @@ template = device.get_template() fstring = "{i.id}\t{i.title}\t{i.widget_type} \t{i.choices}" print("\n".join(fstring.format(i=i) for i in template.items)) ``` + ``` e3b94cc7-dce4-4781-a818-f769574c31d2 Section 1 SECTION_HEADER [] a54e85aa-5474-42f8-90c0-19f40e9ca825 Question 1 TEXT [] @@ -224,6 +234,7 @@ Using the [`get_template_data`](https://pupil-labs-realtime-api.readthedocs.io/e data = device.get_template_data() print("\n".join(f"{k}\t{v}" for k, v in data.items())) ``` + ``` 6169276c-91f4-4ef9-8e03-45759ff61477 ['a'] 3c7d620f-9f98-4556-92dd-b66df329999c An example paragraph. 
@@ -243,12 +254,13 @@ questionnaire = { device.post_template_data(questionnaire) ``` -You can also retrieve individual questions by their ID using the [`get_question_by_id`](https://pupil-labs-realtime-api.readthedocs.io/en/stable/api/models.html#pupil_labs.realtime_api.models.Template.get_question_by_id) method and check the validity of a response using the [`validate_answer`]( https://pupil-labs-realtime-api.readthedocs.io/en/stable/api/models.html#pupil_labs.realtime_api.models.TemplateItem.validate_answer) method. +You can also retrieve individual questions by their ID using the [`get_question_by_id`](https://pupil-labs-realtime-api.readthedocs.io/en/stable/api/models.html#pupil_labs.realtime_api.models.Template.get_question_by_id) method and check the validity of a response using the [`validate_answer`](https://pupil-labs-realtime-api.readthedocs.io/en/stable/api/models.html#pupil_labs.realtime_api.models.TemplateItem.validate_answer) method. ```python question = template.get_question_by_id("6169276c-91f4-4ef9-8e03-45759ff61477") question.validate_answer(["invalid_option"]) ``` + ``` pupil_labs/realtime_api/models.py", line 346, in validate_answer raise InvalidTemplateAnswersError(self, answers, errors) @@ -262,7 +274,7 @@ pupil_labs.realtime_api.models.InvalidTemplateAnswersError: Question 3 (6169276c If you are having trouble connecting to your Neon device via the real-time API, consider the following points: -1. Make sure the Neon Companion app and the device you are using to access the API are connected to the same local network. +1. Make sure the Neon Companion App and the device you are using to access the API are connected to the same local network. 1. For discovery the local network must allow MDNS and UDP traffic. In large public networks this may be prohibited for security reasons. - You may still be able to connect to Neon using its IP address. You can find the IP address in the WiFi settings of the phone. 
Once you have it, you can connect like this: From 47bdf56acb6924c3f5ded0d6564677ee7a1e0fad Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Miguel=20Garc=C3=ADa=20Garc=C3=ADa?= Date: Mon, 11 Nov 2024 12:46:53 +0100 Subject: [PATCH 06/14] @N-M-T suggestions applied. --- neon/.vitepress/config.mts | 2 +- neon/data-collection/gaze-mode/index.md | 55 ++++--------------------- 2 files changed, 8 insertions(+), 49 deletions(-) diff --git a/neon/.vitepress/config.mts b/neon/.vitepress/config.mts index 73b660244..dc74e1bf4 100644 --- a/neon/.vitepress/config.mts +++ b/neon/.vitepress/config.mts @@ -107,7 +107,7 @@ let theme_config_additions = { link: "/data-collection/measuring-ied/", }, { - text: "Monocular Gaze", + text: "Gaze Mode", link: "/data-collection/gaze-mode/", }, { diff --git a/neon/data-collection/gaze-mode/index.md b/neon/data-collection/gaze-mode/index.md index 0ff0484fc..9f8a4cf88 100644 --- a/neon/data-collection/gaze-mode/index.md +++ b/neon/data-collection/gaze-mode/index.md @@ -1,56 +1,15 @@ -# Binocular vs. Monocular Gaze Mode +# Gaze Mode -Starting from version 2.8.34-prod, the Neon Companion App allows you to select between using both eyes (binocular) or a single eye (monocular) images for outputing gaze positions. This option enables you to isolate gaze from a specific eye, e.g. when recording participants with strabismus, or other experimental paradgims that require monocular gaze. - -## Modes - -- `Binocular` _(default)_: Utilizes images from both the right and left eyes to infer gaze position. This mode offers higher accuracy and robustness by leveraging information from both eyes. -- `Mono Right`: Uses only the right eye's image to infer gaze position. This mode may be useful in scenarios where one eye can only be used. -- `Mono Left`: Uses only the left eye's image to infer gaze position. Similar to `Mono Right` but using the left eye. - -::: warning -**Monocular gaze is less accurate and robust** since it relies on a single eye image. 
Use this mode only if binocular tracking is not feasible or if there's a specific need for single-eye tracking. -::: +In the Neon Companion App, you can select between Binocular (default) or Monocular (left or right) gaze modes. Binocular mode captures gaze data from both eyes, and is recommended for most users. Monocular mode generates gaze data from a single eye (left or right, as chosen by the user), and is advisable only for those who specifically need it. ## Changing Gaze Modes -To switch between gaze modes, follow these steps: - -1. From the home screen of the Neon Companion App, tap the gear icon located at the top-right corner to open **Companion Settings**. -2. Scroll down to the **NeonNet** section. -3. Choose your desired **Gaze Mode** (`Binocular`, `Mono Right`, or `Mono Left`). -4. After selecting the new gaze mode, **unplug and re-plug** the Neon device to apply the changes. - -::: tip -After altering the gaze mode to monocular, it's recommended to perform a new [Offset Correction](/data-collection/offset-correction/) to improve accuracy. -::: - -## Other Considerations - -- Changing the gaze mode modifies the existing gaze stream. It does **not** create an additional stream. -- All downstream processes, including fixations and enrichments, will utilize this monocular gaze data. -- Eye State and Pupillometry remain unaffected by changes to the gaze mode and will output the data for each eye. - -## In Pupil Cloud: - -Pupil Cloud handles gaze data processing as follows: - -- **Default Behavior**: Pupil Cloud reprocesses recordings to maintain a consistent sampling rate of **200Hz**, regardless of the real-time sampling rate set in the app. - -- **Monocular Mode**: If a monocular gaze mode is selected, Pupil Cloud **will not** reprocess the recordings. Ensure that this aligns with your data analysis requirements. - -## Where Can I Find Which Mode Was Used on a Recording? 
- -On the recording's view in the Neon Companion App, you can tap on the three dots to visualize the metadata. - -Additionally, the [info.json](/data-collection/data-format/#info-json) file now includes a new field `gaze_mode`. - ---- +You can switch between gaze modes in the Companion App settings. After selecting a new gaze mode, be sure to unplug and re-plug the Neon device. -### Best Practices / Additional Recommendations +## Considerations When Switching to Monocular Gaze -- **Testing**: After changing the gaze mode, perform tests to verify that the gaze tracking meets your accuracy and performance needs. +- Switching to Monocular gaze mode alters the existing gaze stream without creating an additional one. This means that all downstream processes, including fixations and enrichments, will utilise this monocular gaze data. -- **Update your Team**: Keep your team informed about changes in gaze modes to ensure consistency in data collection and analysis. +- [Eye State](/data-collection/data-streams/#_3d-eye-states) and [Pupillometry](/data-collection/data-streams/#pupil-diameters) are unaffected by changes to the gaze mode and will continue to measure data for both eyes as usual. ---- +- Pupil Cloud will **not** re-process recordings at 200 Hz as with default binocular recordings. Only real-time recorded monocular gaze will be saved for processing with enrichments. 
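Since the selected gaze mode is written to the `gaze_mode` field of the recording's `info.json`, it can be checked programmatically when processing recordings. A minimal sketch — the sample payload and the exact value strings (`"Mono Right"`, `"Binocular"`) are illustrative assumptions, and real `info.json` files contain many more fields:

```python
import json

# Hypothetical minimal info.json payload; real files hold the full metadata
# table (app_version, wearer_id, start_time, ...).
sample = '{"app_version": "2.8.34", "gaze_mode": "Mono Right"}'

info = json.loads(sample)
# Recordings made before this field existed were always binocular, so a
# missing key can reasonably default to "Binocular".
gaze_mode = info.get("gaze_mode", "Binocular")
print(gaze_mode)  # prints "Mono Right"
```

This kind of check is useful for filtering out monocular recordings before running an analysis that assumes binocular data.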
From 7b71c681b65f7154d0f4a049abc3174b278ffa11 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Miguel=20Garc=C3=ADa=20Garc=C3=ADa?= Date: Mon, 11 Nov 2024 12:54:17 +0100 Subject: [PATCH 07/14] Fixed Events tabulation for merging --- neon/data-collection/psychopy/index.md | 20 ++++++++++---------- 1 file changed, 10 insertions(+), 10 deletions(-) diff --git a/neon/data-collection/psychopy/index.md b/neon/data-collection/psychopy/index.md index b5a75127a..1538e260c 100644 --- a/neon/data-collection/psychopy/index.md +++ b/neon/data-collection/psychopy/index.md @@ -8,11 +8,12 @@ We have created a dedicated plugin for PsychoPy that enables Neon to be used in - [Coder](https://psychopy.org/coder/index.html) – Gives users the option to generate experiments or do other things programmatically, [using Psychopy like any other Python package](https://psychopy.org/api/). ## Using PsychoPy with Neon + When using PsychoPy with Neon, you can save eyetracking data in PsychoPy's hdf5 format, by enabling the "Save hdf5 file" option within the experiment settings. But we also recommend recording in the Neon Companion app for the duration of the experiment for data redundancy. PsychoPy’s standard "Eyetracker Record" component can be used to start and stop recordings on the Companion Device accordingly. If desired, custom timestamped events can be triggered from PsychoPy and saved in the Neon recording. -* For experiments that only require pupillometry/eye state, make sure the "Compute Eye State" setting is enabled in the companion app. For experiments that do not require screen-based gaze coordinates, this is all that is required.
-* To use Neon for screen-based work in PsychoPy, the screen needs to be robustly located within the scene camera’s field of view, and Neon’s gaze data subsequently transformed from scene camera-based coordinates to screen-based coordinates. The plugin for PsychoPy achieves this with the use of AprilTag Markers and the [real-time-screen-gaze](https://github.com/pupil-labs/real-time-screen-gaze) Python package (installed automatically with the plugin). +- To use Neon for screen-based work in PsychoPy, the screen needs to be robustly located within the scene camera’s field of view, and Neon’s gaze data subsequently transformed from scene camera-based coordinates to screen-based coordinates. The plugin for PsychoPy achieves this with the use of AprilTag Markers and the [real-time-screen-gaze](https://github.com/pupil-labs/real-time-screen-gaze) Python package (installed automatically with the plugin). ## Builder @@ -37,13 +38,12 @@ Three new Builder components will be available in the components list under the - April Tag Markers: for screen-based work, you will need to render AprilTag markers on your display. These components make it easy to do so. We recommend at least four markers, but more markers will improve gaze mapping. - - **April Tag Frame**: this component is recommended for most users. Using it in your Builder experiment will display an array of AprilTag markers around the edge of the screen. You can configure the number of markers to display along the horizontal and vertical edges of the screen, the size and contrast of the markers, and (optionally) the marker IDs. A minimum of four markers (2 horizontally by 2 vertically) is recommended, but more markers will provide more robust detection and accurate mapping. Marker IDs are automatically chosen but can be manually specified if needed. - ![AprilTag Frame](./apriltag-frame.png) + ![AprilTag Frame](./apriltag-frame.png) - **April Tag**: this component will add a single AprilTag marker to your display. 
It is intended for use when the April Tag Frame component cannot be used (e.g., you need to display stimuli on the edges of the display where the April Tag Frame component would place markers in the way). Using this component will give you control over the size and position of each marker. You will need to ensure that a unique marker ID is assigned to each AprilTag marker. - - **Neon Event**: use this component to send a timestamped event annotation to the Neon Recording. You can mark the start and end of an experiment, the start and end of a trial, the timing of a stimulus presentation, etc. A timestamp can be manually specified or, if set to `0`, automatically assigned when the component start is triggered. +- **Neon Event**: use this component to send a timestamped event annotation to the Neon Recording. You can mark the start and end of an experiment, the start and end of a trial, the timing of a stimulus presentation, etc. A timestamp can be manually specified or, if set to `0`, automatically assigned when the component start is triggered. Events can only be saved to an active recording. You can use PsychoPy's standard "Eyetracking Record" component to start/stop a recording or manually start a recording from the Companion App. @@ -52,12 +52,12 @@ Three new Builder components will be available in the components list under the [PsychoPy saves eyetracking data in its own format](https://psychopy.org/hardware/eyeTracking.html#what-about-the-data). Screen gaze data will be saved as `MonocularEyeSampleEvent` records (even when using the binocular gaze mode). Eye state data, if enabled, will appear in `BinocularEyeSampleEvent` records. 
For eye state data in`BinocularEyeSampleEvent` records: -- For eye state records - - `[left|right]_gaze_[x|y|z]` will be the optical axis vectors - - `[left|right]_eye_cam_[x|y|z]` will be eye positions - - `[left|right]_pupil_measure1` will be pupil diameters in mm - - `[left|right]_pupil_measure1_type` will be `77` +- For eye state records + - `[left|right]_gaze_[x|y|z]` will be the optical axis vectors + - `[left|right]_eye_cam_[x|y|z]` will be eye positions + - `[left|right]_pupil_measure1` will be pupil diameters in mm + - `[left|right]_pupil_measure1_type` will be `77` ### Example Builder Experiment From de43be575c0008b7f34992251c2729dc7f466a1a Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Miguel=20Garc=C3=ADa=20Garc=C3=ADa?= Date: Mon, 11 Nov 2024 14:38:59 +0100 Subject: [PATCH 08/14] Neon Companion app. --- README.md | 16 ++++++---- alpha-lab/neon-with-capture/index.md | 2 +- .../offset-correction/index.md | 7 ++-- neon/data-collection/data-format/index.md | 8 ++--- neon/data-collection/data-streams/index.md | 8 ++--- .../ecosystem-overview/index.md | 6 ++-- neon/data-collection/first-recording/index.md | 6 ++-- neon/data-collection/index.md | 5 +-- .../lab-streaming-layer/index.md | 16 +++++----- neon/data-collection/measuring-ied/index.md | 4 +-- neon/data-collection/monitor-app/index.md | 4 +-- .../offset-correction/index.md | 3 +- neon/data-collection/recordings/index.md | 4 +-- .../scene-camera-exposure/index.md | 2 +- .../transfer-recordings-via-usb/index.md | 8 ++--- neon/data-collection/troubleshooting/index.md | 2 +- neon/data-collection/wearers/index.md | 2 +- neon/hardware/compatible-devices/index.md | 6 ++-- .../module-technical-overview/index.md | 2 +- neon/neon-player/eye-state-timeline/index.md | 2 +- neon/neon-xr/index.md | 11 ++++--- neon/neon-xr/neon-xr-core-package/index.md | 32 +++++++++++-------- neon/pupil-cloud/index.md | 2 +- neon/pupil-cloud/offset-correction/index.md | 4 +-- neon/pupil-cloud/troubleshooting/index.md | 2 +- 
neon/real-time-api/tutorials/index.md | 4 +-- 26 files changed, 90 insertions(+), 78 deletions(-) diff --git a/README.md b/README.md index c0e605e47..f45fcfdb5 100644 --- a/README.md +++ b/README.md @@ -1,18 +1,20 @@ # Recommended Defaults + - When importing images, use markdown-style imports + ```markdown ![alt text](/path/to/image.png) ``` + - When trying to create a gallery view, consider using the `PhotoGrid` component - Avoid using custom HTML and CSS as much as possible. - # Capitalization + - We use title case for all titles, headings, menu sections etc. We use title case as it is defined by the Associated Press Stylebook. There is a converter available [here](https://titlecaseconverter.com/) to double-check correctness. - Fixed product and feature names are capitalized, like e.g. - - Reference Image Mapper - - Neon Companion App - - Neon Companion Device - - Pupil Invisible Glasses - - Video Renderer - - Heatmap Visualization + - Reference Image Mapper + - Neon Companion + - Pupil Invisible Glasses + - Video Renderer + - Heatmap Visualization diff --git a/alpha-lab/neon-with-capture/index.md b/alpha-lab/neon-with-capture/index.md index 10f1b89b7..fc7baaafe 100644 --- a/alpha-lab/neon-with-capture/index.md +++ b/alpha-lab/neon-with-capture/index.md @@ -16,5 +16,5 @@ To get started with Neon in Capture, follow these steps: 5. Select either `Neon 3D` or `2D` as the **Gaze Mapping** option in the **Calibration** tab. ::: warning -Recordings made with the Neon Companion App (rather than Pupil Capture) are **NOT** compatible with Pupil Player. +Recordings made with the Neon Companion app (rather than Pupil Capture) are **NOT** compatible with Pupil Player. 
::: diff --git a/invisible/data-collection/offset-correction/index.md b/invisible/data-collection/offset-correction/index.md index 1e0ff52c7..079773957 100644 --- a/invisible/data-collection/offset-correction/index.md +++ b/invisible/data-collection/offset-correction/index.md @@ -1,7 +1,8 @@ # Using Offset Correction to Improve Gaze Accuracy -For some wearers, you may find a constant offset in their gaze estimates. Especially in setups where wearers gaze at somethings that is less than 1 meter away from them this can happen due to parallax error. -To compensate for those offsets, you can use the Offset Correction feature in the Pupil Invisible Companion App. See the video below to learn how it works! The video is using demonstrating the feature within the Neon Companion App, but the approach is very similar. +For some wearers, you may find a constant offset in their gaze estimates. Especially in setups where wearers gaze at something that is less than 1 meter away from them, this can happen due to parallax error. + +To compensate for those offsets, you can use the Offset Correction feature in the Pupil Invisible Companion App. See the video below to learn how it works! The video demonstrates the feature within the Neon Companion app, but the approach is very similar. @@ -9,4 +10,4 @@ To compensate for those offsets, you can use the Offset Correction feature in th Note, that the amount of offset introduced by parallax error is highly dependend on the distance between the wearer and the object they are looking at. The closer the object, the larger the offset. The offset correction is only valid for the specific distance at which it was recorded. If the wearer changes their distance to the object, you will need to record a new offset correction.
-::: \ No newline at end of file +::: diff --git a/neon/data-collection/data-format/index.md b/neon/data-collection/data-format/index.md index 35199f486..e1aceaf61 100644 --- a/neon/data-collection/data-format/index.md +++ b/neon/data-collection/data-format/index.md @@ -19,9 +19,9 @@ This file contains meta-information on the recording. | **android_device_id** | Unique identifier of the Android device used as Companion. | | **android_device_model** | Model name of the Companion device. | | **android_device_name** | Device name of the Companion device. | -| **app_version** | Version of the Neon Companion App used to make the recording. | -| **calib_version** | Version of the offset correction used by the Neon Companion App. | -| **data_format_version** | Version of the data format used by the Neon Companion App. | +| **app_version** | Version of the Neon Companion app used to make the recording. | +| **calib_version** | Version of the offset correction used by the Neon Companion app. | +| **data_format_version** | Version of the data format used by the Neon Companion app. | | **duration** | Duration of the recording in nanoseconds | | **firmware_version** | Version numbers of the firmware and FPGA. | | **frame_id** | Number identifying the type of frame used for this recording. | @@ -29,7 +29,7 @@ This file contains meta-information on the recording. | **gaze_offset** | Gaze offset applied to this recording using the offset correction. Values are in pixels. | | **module_serial_number** | Serial number of the Neon module used for the recording. This number is encoded in the QR code on the back of the Neon module. | | **os_version** | Version of the Android OS that was installed on the recording Companion device. | -| **pipeline_version** | Version of the gaze estimation pipeline used by the Neon Companion App. | +| **pipeline_version** | Version of the gaze estimation pipeline used by the Neon Companion app. | | **recording_id** | Unique identifier of the recording. 
| | **start_time** | Timestamp of when the recording was started. Given as UTC timestamp in nanoseconds. | | **template_data** | Data regarding the selected template for the recording as well as the response values. | diff --git a/neon/data-collection/data-streams/index.md b/neon/data-collection/data-streams/index.md index 2bd9efe20..c454b3498 100644 --- a/neon/data-collection/data-streams/index.md +++ b/neon/data-collection/data-streams/index.md @@ -21,7 +21,7 @@ The scene camera can be operated with automatic or manual exposure. In situation ## Gaze Available in: Real-timePupil CloudNeon Player -The Neon Companion App can provide gaze data in real-time at up to 200 Hz. Gaze data is output in pixel space of the scene camera image. The origin is in the top-left corner of the image. +The Neon Companion app can provide gaze data in real-time at up to 200 Hz. Gaze data is output in pixel space of the scene camera image. The origin is in the top-left corner of the image. ![Gaze](./gaze.jpg) @@ -47,7 +47,7 @@ The downloads for gaze mapping enrichments ([Reference Image Mapper](/pupil-clou ## 3D Eye States Available in: Real-timePupil Cloud -The Neon Companion App provides 3D eye state data in real-time at up to 200 Hz. The 3D eye states are a time series of each eye's position and orientation in 3D space, given by the location of the eyeball center and the optical axis of each eye. The units are millimeters. +The Neon Companion app provides 3D eye state data in real-time at up to 200 Hz. The 3D eye states are a time series of each eye's position and orientation in 3D space, given by the location of the eyeball center and the optical axis of each eye. The units are millimeters. The coordinate system is depicted below. The origin corresponds to the scene camera of the Neon Module. 
@@ -64,7 +64,7 @@ If 200 Hz real-time data is essential, consider upgrading to a newer [Companion ## Pupil Diameters Available in: Real-timePupil Cloud -The Neon Companion App provides pupil diameter data in real-time at up to 200 Hz. Separately for the left and right eye. The computed pupil diameters correspond to the physical pupil size in mm, rather than the apparent pupil size in pixels as observed in the eye videos. You can find a high-level description as well as a thorough evaluation of the accuracy and robustness of Neon’s pupil-size measurements in our [white paper](https://zenodo.org/records/10057185). +The Neon Companion app provides pupil diameter data in real-time at up to 200 Hz. Separately for the left and right eye. The computed pupil diameters correspond to the physical pupil size in mm, rather than the apparent pupil size in pixels as observed in the eye videos. You can find a high-level description as well as a thorough evaluation of the accuracy and robustness of Neon’s pupil-size measurements in our [white paper](https://zenodo.org/records/10057185). Similar to the 3D eye states, the accuracy of the pupil diameter measurements improves when supplying the wearer's IED in the wearer profile before making a recording. @@ -82,7 +82,7 @@ The blink detection algorithm is operating directly on the eye video to detect t Available in: Pupil CloudNeon Player Stereo microphones are integrated into the Neon module. Recorded audio will be part of the resulting scene video. -Audio recording is disabled in the Neon Companion App by default and can be enabled in the settings. +Audio recording is disabled in the Neon Companion app by default and can be enabled in the settings. 
## Movement (IMU Data) diff --git a/neon/data-collection/ecosystem-overview/index.md b/neon/data-collection/ecosystem-overview/index.md index 484fbb030..779befda8 100644 --- a/neon/data-collection/ecosystem-overview/index.md +++ b/neon/data-collection/ecosystem-overview/index.md @@ -4,7 +4,7 @@ The Neon ecosystem contains a range of tools that support you during data collec ## Neon Companion App -You should have already used the Neon Companion App to [make your first recording](/data-collection/first-recording/). This app is the core of every Neon data collection. +You should have already used the Neon Companion app to [make your first recording](/data-collection/first-recording/). This app is the core of every Neon data collection. When your Neon is connected to the Companion device, it supplies it with power and enables it to generate a real-time gaze signal as well as several other [data streams](/data-collection/data-streams/). When making a [recording](/data-collection/recordings/), all generated data is saved on the Companion device. @@ -12,7 +12,7 @@ The app automatically saves [UTC timestamps](https://en.wikipedia.org/wiki/Coord ## Other Data Collection Tools -Several other tools complement the Neon Companion App and can make data collection much easier in some scenarios. +Several other tools complement the Neon Companion app and can make data collection much easier in some scenarios. ### Neon Monitor @@ -40,7 +40,7 @@ Neon is compatible with LSL and you can learn more about how to use Neon with LS [Pupil Cloud](/pupil-cloud/) is our web-based storage and analysis platform located at [cloud.pupil-labs.com](https://cloud.pupil-labs.com/). It makes it easy to store all your data securely in one place and offers a variety of options for data analysis and visualization. -Pupil Cloud is the recommended tool for processing your Neon recordings and if you enable uploads in the Neon Companion App all recordings can be uploaded automatically. 
+Pupil Cloud is the recommended tool for processing your Neon recordings and if you enable uploads in the Neon Companion app all recordings can be uploaded automatically. ![Pupil Cloud](./pupil_cloud.webp) diff --git a/neon/data-collection/first-recording/index.md b/neon/data-collection/first-recording/index.md index 149d75e98..108c71966 100644 --- a/neon/data-collection/first-recording/index.md +++ b/neon/data-collection/first-recording/index.md @@ -12,11 +12,11 @@ Create a new Google account or use an existing Google account during setup. ## 2. Install and Start the Neon Companion App. -Next, install the Neon Companion App on your device: +Next, install the Neon Companion app on your device: - Launch the **Google Play Store** app. It is already installed by default on your Companion Device. - Search for [**Neon Companion**](https://play.google.com/store/apps/details?id=com.pupillabs.neoncomp) in the Google Play Store and click install. -- Start the Neon Companion App when the installation has finished. +- Start the Neon Companion app when the installation has finished. - A couple of permission prompts will appear, which you need to accept. - First-time users need to sign up for a [Pupil Cloud](https://cloud.pupil-labs.com/) account. Sign up using your Google account, or create an account with an email address and password. @@ -27,7 +27,7 @@ Next, install the Neon Companion App on your device: -![Neon Companion App](/ne-companion_app_logo-bg.png) +![Neon Companion app](/ne-companion_app_logo-bg.png) diff --git a/neon/data-collection/index.md b/neon/data-collection/index.md index 884f07589..5ead192df 100644 --- a/neon/data-collection/index.md +++ b/neon/data-collection/index.md @@ -1,10 +1,11 @@ # Data Collection with Neon + Data collection is a key step for any application of Neon! In this section you can learn everything about it, starting with how to make [your first recording](/data-collection/first-recording/)! 
You can find an overview of what data is contained in Neon recordings including all the [data streams](/data-collection/data-streams/) from the various sensors, as well as any additional data like [events](/data-collection/events/), [wearers](/data-collection/wearers/), and [templates](/data-collection/templates/). -You can find introductions on how to use the Neon Companion App, e.g. on [Offset Correction](/data-collection/offset-correction/), and how-to guides on common tasks during data collection, e.g. [Time Synchronization](/data-collection/time-synchronization/). +You can find introductions on how to use the Neon Companion app, e.g. on [Offset Correction](/data-collection/offset-correction/), and how-to guides on common tasks during data collection, e.g. [Time Synchronization](/data-collection/time-synchronization/). Documentation on useful software and integrations for data collection is also available, see e.g. [Monitor App](/data-collection/monitor-app/) or [Lab Streaming Layer](/data-collection/lab-streaming-layer/). -Finally, you can find a list of [troubleshooting](/data-collection/troubleshooting/) tips and tricks for common issues during data collection. \ No newline at end of file +Finally, you can find a list of [troubleshooting](/data-collection/troubleshooting/) tips and tricks for common issues during data collection. diff --git a/neon/data-collection/lab-streaming-layer/index.md b/neon/data-collection/lab-streaming-layer/index.md index e7696970b..ee2e97f96 100644 --- a/neon/data-collection/lab-streaming-layer/index.md +++ b/neon/data-collection/lab-streaming-layer/index.md @@ -2,7 +2,7 @@ [Lab Streaming Layer](https://labstreaminglayer.org/) (LSL) is an open-source framework that connects, manages, and synchronizes data streams from multiple sources, such as EEG, GSK, and motion capture systems. Check out the [LSL documentation](https://labstreaminglayer.readthedocs.io/info/intro.html) for a full overview of supported devices. 
-The Neon Companion App has built-in support for LSL, streaming Neon’s real-time generated data over the LSL network. This allows you to easily synchronize Neon with other LSL-supported devices. +The Neon Companion app has built-in support for LSL, streaming Neon’s real-time generated data over the LSL network. This allows you to easily synchronize Neon with other LSL-supported devices. ## **Usage** @@ -10,7 +10,7 @@ LSL streaming can be initiated in the Companion App by enabling the "Stream over When enabled, data will be streamed over the LSL network, and subsequently, to any connected LSL inlet (such as the LSL LabRecorder App, or another third-party system with inlet functionality) which is listening. Like the [Real-Time API](https://docs.pupil-labs.com/neon/real-time-api/tutorials/), it is not necessary for the Companion App to be actively recording, but simultaneously streaming LSL data while making a recording is supported. -Note that you'll need to ensure the Neon Companion App is connected to the same network as the other devices streaming via LSL. +Note that you'll need to ensure the Neon Companion app is connected to the same network as the other devices streaming via LSL. 
## **LSL Outlets** @@ -40,11 +40,11 @@ If your devices are on the same network but you have trouble connecting, it is l - UDP broadcasts to port `16571` and/or - UDP multicast to port `16571` at - - `FF02:113D:6FDD:2C17:A643:FFE2:1BD1:3CD2` - - `FF05:113D:6FDD:2C17:A643:FFE2:1BD1:3CD2` - - `FF08113D:6FDD:2C17:A643:FFE2:1BD1:3CD2` - - `FF0E:113D:6FDD:2C17:A643:FFE2:1BD1:3CD2` - - `224.0.0.1`, `224.0.0.183`, `239.255.172.215` + - `FF02:113D:6FDD:2C17:A643:FFE2:1BD1:3CD2` + - `FF05:113D:6FDD:2C17:A643:FFE2:1BD1:3CD2` + - `FF08:113D:6FDD:2C17:A643:FFE2:1BD1:3CD2` + - `FF0E:113D:6FDD:2C17:A643:FFE2:1BD1:3CD2` + - `224.0.0.1`, `224.0.0.183`, `239.255.172.215` - TCP and UDP connections to the ports `16572`-`16604` -More troubleshooting tips can be found in the [Network Troubleshooting](https://labstreaminglayer.readthedocs.io/info/network-connectivity.html) page in LSL’s documentation. \ No newline at end of file +More troubleshooting tips can be found in the [Network Troubleshooting](https://labstreaminglayer.readthedocs.io/info/network-connectivity.html) page in LSL’s documentation. diff --git a/neon/data-collection/measuring-ied/index.md b/neon/data-collection/measuring-ied/index.md index d908de9d7..00c2c6db6 100644 --- a/neon/data-collection/measuring-ied/index.md +++ b/neon/data-collection/measuring-ied/index.md @@ -1,11 +1,11 @@ # Measuring the Inter-Eye-Distance (IED) -The wearer's IED can be set in the Neon Companion App for applications requiring +The wearer's IED can be set in the Neon Companion app for applications requiring precise pupillometry or eye-state measurements. This does not require prior calibration of the device. However, the accuracy of Neon’s 3D eye-state and pupil-size measurements can be enhanced by correctly setting the IED for each wearer.
-To add the wearer's IED, input the value into their ‘Wearer Profile’ in the Neon Companion App before +To add the wearer's IED, input the value into their ‘Wearer Profile’ in the Neon Companion app before starting a recording. The default IED is set to 63 mm, which is the average for adults. diff --git a/neon/data-collection/monitor-app/index.md b/neon/data-collection/monitor-app/index.md index 255aaa965..76600d62c 100644 --- a/neon/data-collection/monitor-app/index.md +++ b/neon/data-collection/monitor-app/index.md @@ -4,10 +4,10 @@ Using the **Neon Monitor** app you can easily monitor your data collection in real-time and remote control all your Neons from another device. It's a web-app that can be opened in any browser on a computer, tablet or phone! The only requirement is that the Neon Companion device and the device you use to access the Monitor app are connected to the same network. -To access the Monitor app make sure the Neon Companion App is running and visit the page [neon.local:8080](http://neon.local:8080) on your monitoring device. +To access the Monitor app make sure the Neon Companion app is running and visit the page [neon.local:8080](http://neon.local:8080) on your monitoring device. ::: tip -The Neon Companion App can display a QR code that gets you straight to the monitor website. Just select `Streaming` on the home screen! +The Neon Companion app can display a QR code that gets you straight to the monitor website. Just select `Streaming` on the home screen! ::: ## The App's User Interface diff --git a/neon/data-collection/offset-correction/index.md b/neon/data-collection/offset-correction/index.md index 18e1883f2..3c41d8ffe 100644 --- a/neon/data-collection/offset-correction/index.md +++ b/neon/data-collection/offset-correction/index.md @@ -1,5 +1,6 @@ # Using Offset Correction to Improve Gaze Accuracy -For some wearers, you may find a constant offset in their gaze estimates. 
To compensate for those, you can use the Offset Correction feature in the Neon Companion App. See the video below to learn how it works!
+
+For some wearers, you may find a constant offset in their gaze estimates. To compensate for this offset, you can use the Offset Correction feature in the Neon Companion app. See the video below to learn how it works!
diff --git a/neon/data-collection/recordings/index.md b/neon/data-collection/recordings/index.md
index 6e8e81fdf..3573406bc 100644
--- a/neon/data-collection/recordings/index.md
+++ b/neon/data-collection/recordings/index.md
@@ -1,10 +1,10 @@
# Recordings
-A recording starts and stops when you press the record button in the Neon Companion App. While this should feel similar to recording a regular video on your phone, there is a lot more happening behind the scenes. When you are recording with the Neon Companion App, you are capturing not only video data but several more sensors (see [Data Streams](/data-collection/data-streams/)).
+A recording starts and stops when you press the record button in the Neon Companion app. While this should feel similar to recording a regular video on your phone, there is a lot more happening behind the scenes. When you are recording with the Neon Companion app, you are capturing not only video data but data from several more sensors (see [Data Streams](/data-collection/data-streams/)).
Recordings are designed to be as robust as possible. If at any point the Neon module is temporarily disconnected from the Companion phone, it will automatically start capturing again as soon as it is reconnected. You could start a recording with no Neon connected and plug it in at a later time. As soon as it is connected, data will be captured.
-The Neon Companion App has several more features to ensure robust data collection and will e.g. warn you in case the Companion device's battery is running low or if you run out of storage space.
+The Neon Companion app has several more features to ensure robust data collection and will, for example, warn you if the Companion device's battery is running low or if you run out of storage space.
## Events, Wearers, & Templates
diff --git a/neon/data-collection/scene-camera-exposure/index.md b/neon/data-collection/scene-camera-exposure/index.md
index cc280153b..95989e2e4 100644
--- a/neon/data-collection/scene-camera-exposure/index.md
+++ b/neon/data-collection/scene-camera-exposure/index.md
@@ -12,7 +12,7 @@ examples and important considerations.
## Changing Exposure Modes
-From the home screen of the Neon Companion App, tap
+From the home screen of the Neon Companion app, tap
the [Scene and Eye Camera preview](/data-collection/first-recording/#_4-open-the-live-preview),
and then select `Balanced` to reveal all four modes.
diff --git a/neon/data-collection/transfer-recordings-via-usb/index.md b/neon/data-collection/transfer-recordings-via-usb/index.md
index 960fbfd5d..0c86dbab0 100644
--- a/neon/data-collection/transfer-recordings-via-usb/index.md
+++ b/neon/data-collection/transfer-recordings-via-usb/index.md
@@ -10,7 +10,7 @@ Recordings downloaded directly from the phone will be in a raw binary format.
#### Export from Neon Companion App
-1. Open the recordings view in the Neon Companion App
+1. Open the recordings view in the Neon Companion app
2. Select desired recording/s
3. Export:
- For single recordings, the export button is found by clicking on the 3 vertical dots to
@@ -28,11 +28,11 @@ Recordings downloaded directly from the phone will be in a raw binary format.
6. Locate the export folder on the phone. Usually, it is in `Documents/Neon Export`.
7. Copy the recordings to your computer.
-Note that the export process does not delete the recordings from the Neon Companion App, and you can still upload
+Note that the export process does not delete the recordings from the Neon Companion app, and you can still upload
to Pupil Cloud at a later date if required.
-Recordings that are deleted from the Neon Companion App, e.g. to free up storage space, cannot be transferred back -to the Neon Companion App from your backup location (including Pupil Cloud, a laptop/desktop PC, or external HD). +Recordings that are deleted from the Neon Companion app, e.g. to free up storage space, cannot be transferred back +to the Neon Companion app from your backup location (including Pupil Cloud, a laptop/desktop PC, or external HD). This means that if you delete the recordings prior to uploading them to Pupil Cloud, they cannot be uploaded at a later date. diff --git a/neon/data-collection/troubleshooting/index.md b/neon/data-collection/troubleshooting/index.md index c4faa8511..e9ef30d2f 100644 --- a/neon/data-collection/troubleshooting/index.md +++ b/neon/data-collection/troubleshooting/index.md @@ -4,7 +4,7 @@ Below you can find a list of issues we have observed in the past and recommendat ## The Companion Device Is Vibrating and a Red LED Is Blinking in the Neon Module! -The vibrations and the blinking LED try to grab the wearer's attention to notify them of a problem that may critically hurt the ongoing recording. To get details on the problem, open the Neon Companion App, which will show an error description. +The vibrations and the blinking LED try to grab the wearer's attention to notify them of a problem that may critically hurt the ongoing recording. To get details on the problem, open the Neon Companion app, which will show an error description. Potential problems include: diff --git a/neon/data-collection/wearers/index.md b/neon/data-collection/wearers/index.md index 02177fac5..ea44ee926 100644 --- a/neon/data-collection/wearers/index.md +++ b/neon/data-collection/wearers/index.md @@ -1,6 +1,6 @@ # Wearers -Wearers are the people wearing Neon. In a typical study, each subject would be a wearer. Every recording you make is assigned to a wearer to help you organize your recordings. 
You can create new wearers on the fly in the Neon Companion App or in advance using [Pupil Cloud](/pupil-cloud/). It is also possible to change the assigned wearer in a recording post hoc in Pupil Cloud. Simply select _Change Wearer_ from the context menu of a recording! +Wearers are the people wearing Neon. In a typical study, each subject would be a wearer. Every recording you make is assigned to a wearer to help you organize your recordings. You can create new wearers on the fly in the Neon Companion app or in advance using [Pupil Cloud](/pupil-cloud/). It is also possible to change the assigned wearer in a recording post hoc in Pupil Cloud. Simply select _Change Wearer_ from the context menu of a recording! Every wearer is assigned a unique ID, such that you can edit the name and profile picture at any time without mixing up your recordings. diff --git a/neon/hardware/compatible-devices/index.md b/neon/hardware/compatible-devices/index.md index c3512dde4..e8a19a7b4 100644 --- a/neon/hardware/compatible-devices/index.md +++ b/neon/hardware/compatible-devices/index.md @@ -1,10 +1,10 @@ # Companion Device -The Companion device is a flagship Android smartphone. It is a regular phone that is not customized or modified in any way. To ensure maximum stability and performance we can only support a small number of carefully selected and tested models. The Neon Companion App is tuned to work with these particular models as we require full control over various low-level functions of the hardware. +The Companion device is a flagship Android smartphone. It is a regular phone that is not customized or modified in any way. To ensure maximum stability and performance we can only support a small number of carefully selected and tested models. The Neon Companion app is tuned to work with these particular models as we require full control over various low-level functions of the hardware. The supported models are: OnePlus 8, OnePlus 8T, OnePlus 10 Pro, and Motorola Edge 40 Pro. 
Currently, Neon ships with a Motorola Edge 40 Pro. We highly recommend the Edge 40 Pro, giving you the best performance, endurance and stability.
-If you want to replace or add an extra Companion device you can purchase it [directly from us](https://pupil-labs.com/products/neon) or from any other distributor. The Neon Companion App is free and can be downloaded from the [Play Store](https://play.google.com/store/apps/details?id=com.pupillabs.neoncomp).
+If you want to replace or add an extra Companion device, you can purchase it [directly from us](https://pupil-labs.com/products/neon) or from any other distributor. The Neon Companion app is free and can be downloaded from the [Play Store](https://play.google.com/store/apps/details?id=com.pupillabs.neoncomp).
Using a fully charged Motorola Edge 40 Pro device you get around 4 hours of continuous recording time. You can extend this duration by simultaneously charging the phone during a recording [using a powered USB-C hub](/hardware/using-a-usb-hub/).
@@ -12,7 +12,7 @@ Using a fully charged Motorola Edge 40 Pro device you get around 4 hours of cont
### Neon Companion App
-Make sure to update the Neon Companion App on a regular basis. The latest version will always be available on the
+Make sure to update the Neon Companion app on a regular basis. The latest version will always be available on the
[Play Store](https://play.google.com/store/apps/details?id=com.pupillabs.neoncomp).
### Android OS
diff --git a/neon/hardware/module-technical-overview/index.md b/neon/hardware/module-technical-overview/index.md
index a5da2ad9c..7533b0d4c 100644
--- a/neon/hardware/module-technical-overview/index.md
+++ b/neon/hardware/module-technical-overview/index.md
@@ -10,7 +10,7 @@ The Neon module is a small powerhouse of sensors! It connects to the Companion d
- **IR LEDs**: One infrared LED is located on each arm of the module. The LEDs illuminate the eyes of the wearer to improve image quality in dark environments.
-- **Scene Camera**: A front-facing scene camera is located at the center of the module capturing [scene video](/data-collection/data-streams/#scene-video). A **microphone** is integrated into the module to capture [audio](/data-collection/data-streams/#audio). Capturing audio is optional and settable in the Neon Companion App settings. +- **Scene Camera**: A front-facing scene camera is located at the center of the module capturing [scene video](/data-collection/data-streams/#scene-video). A **microphone** is integrated into the module to capture [audio](/data-collection/data-streams/#audio). Capturing audio is optional and settable in the Neon Companion app settings. - **IMU**: A 9-degrees-of-freedom IMU is integrated into the module. It captures the [inertia](/data-collection/data-streams/#movement-imu-data) of the glasses, including translational acceleration, rotational speed, magnetic orientation, pitch, yaw, and roll. diff --git a/neon/neon-player/eye-state-timeline/index.md b/neon/neon-player/eye-state-timeline/index.md index 2de008562..0a3a06b3a 100644 --- a/neon/neon-player/eye-state-timeline/index.md +++ b/neon/neon-player/eye-state-timeline/index.md @@ -5,7 +5,7 @@ This plugin visualizes [3D eye state](/data-collection/data-streams/#_3d-eye-sta ![Eye State Timeline](./eye-state-timeline.webp) ::: info -The data will only be visualized if Eye State computation was enabled in the Neon Companion App during recording. +The data will only be visualized if Eye State computation was enabled in the Neon Companion app during recording. ::: ## Export Format diff --git a/neon/neon-xr/index.md b/neon/neon-xr/index.md index 29a9a71d4..43a378a71 100644 --- a/neon/neon-xr/index.md +++ b/neon/neon-xr/index.md @@ -1,4 +1,5 @@ # Neon XR + Neon XR allows you to equip XR devices with research-grade eye tracking powered by Neon. This enables both gaze-based interaction for XR applications and visual behaviour analysis in XR environments. 
Thanks to the small form factor of the [Neon module](/hardware/module-technical-overview/), it can easily be integrated into a variety of XR devices. A hardware mount for the [Pico 4](https://pupil-labs.com/products/vr-ar) headset is available for purchase and additional mounts for other headsets are in development. You can also [build a mount yourself](/neon-xr/build-your-own-mount/) for any headset!
@@ -8,20 +9,22 @@ Thanks to the small form factor of the [Neon module](/hardware/module-technical-
Neon XR includes software integration with Unity. The [Neon XR Core Unity Package](/neon-xr/neon-xr-core-package/) allows you to receive gaze data from a Neon Module in your Unity project in real-time. We also provide a [template project](/neon-xr/MRTK3-template-project/) for the [Mixed Reality Toolkit 3.0](https://learn.microsoft.com/en-us/windows/mixed-reality/mrtk-unity/mrtk3-overview/) that makes it easy to get started with the development of your XR application.
## System Overview
+
-The **Neon Module** is attached to it's **mount**, which is in turn attached to the **XR device**. Similar to regular Neon frames, the mount is tethered to the **Neon Companion Device** with a USB-C cable.
+The **Neon Module** is attached to its **mount**, which is in turn attached to the **XR device**. Similar to regular Neon frames, the mount is tethered to the **Neon Companion Device** with a USB-C cable.
-The Neon Companion Device provides power to the module and runs the the **Neon Companion App**, which does all the real-time computation.
+The Neon Companion Device provides power to the module and runs the **Neon Companion app**, which does all the real-time computation.
-Depending on the XR device used, the **Unity application** is running on the XR device itself or on a separate computer the XR device is tethered to.
-It communicates with the Neon Companion App over the network, receives gaze data in real-time, and projects it into the virtual world thanks to the **Neon XR Core** Unity Package.
+Depending on the XR device used, the **Unity application** runs on the XR device itself or on a separate computer the XR device is tethered to.
+It communicates with the Neon Companion app over the network, receives gaze data in real-time, and projects it into the virtual world thanks to the **Neon XR Core** Unity Package.
![System Overview](./system_overview.png) ## Getting Started + The easiest starting point for building XR applications with Neon is to use our [MRTK3 template project](/neon-xr/MRTK3-template-project/). It uses MRTK3 as the foundation, and contains several demo scenes you can work off of. If you don't want to use MRTK3, you can also integrate the [`Neon XR Core` Unity Package](/neon-xr/neon-xr-core-package/) into your project directly, which contains only the ability to receive gaze data in real-time in Unity and to map it into the virtual world. -If the available Pico 4 mount does not fit your needs, [build your own mount](/neon-xr/build-your-own-mount/)! \ No newline at end of file +If the available Pico 4 mount does not fit your needs, [build your own mount](/neon-xr/build-your-own-mount/)! diff --git a/neon/neon-xr/neon-xr-core-package/index.md b/neon/neon-xr/neon-xr-core-package/index.md index b4751d488..412e2e83d 100644 --- a/neon/neon-xr/neon-xr-core-package/index.md +++ b/neon/neon-xr/neon-xr-core-package/index.md @@ -1,19 +1,21 @@ # The Neon XR Core Package + Using Neon XR Core Package in your Unity project enables you to receive eye tracking data from a Neon device over the local network in real-time. ## Adding Neon XR to Your Project -The [**Neon XR Unity package**](https://github.com/pupil-labs/neon-xr) enables you to receive eye tracking data from a Neon module in your Unity project in real-time. + +The [**Neon XR Unity package**](https://github.com/pupil-labs/neon-xr) enables you to receive eye tracking data from a Neon module in your Unity project in real-time. To integrate it in your project, follow these steps: 1. Add the `Neon XR` package in the Package Manager. - 1. Select `Window -> Package Manager` - 2. Select `+ -> Add Package from git URL…` - 3. Insert ` https://github.com/pupil-labs/neon-xr.git?path=/com.pupil-labs.neon-xr.core`. + 1. Select `Window -> Package Manager` + 2. Select `+ -> Add Package from git URL…` + 3. 
Insert `https://github.com/pupil-labs/neon-xr.git?path=/com.pupil-labs.neon-xr.core`.
1. If your project does not use Addressables, create default Addressables settings.
- 1. Select `Window -> Asset Management -> Addressables -> Groups`.
- 2. Click on `Create Addressables Settings`.
- 3. If legacy bundles are detected click on `Ignore`.
+ 1. Select `Window -> Asset Management -> Addressables -> Groups`.
+ 2. Click on `Create Addressables Settings`.
+ 3. If legacy bundles are detected, click on `Ignore`.
1. Select `Pupil Labs -> Addressables -> Import Groups`. After this step the `NeonXR Group` should appear in the `Addressables Groups` window (you can open this window again following step 2.1).
1. In the `Addressable Groups` window, select `Build -> New Build -> Default Build Script`.
1. Copy the `NeonXR` prefab from the imported package into the scene.
@@ -21,17 +23,19 @@ To integrate it in your project, follow these steps:
1. Add your own listener for the `gazeDataReady` event (see for example, `GazeDataVisualizer.OnGazeDataReady`).
## Connecting to Neon
-The Neon Companion App publishes the data it generates to the local network using the [real-time API](/real-time-api/tutorials/). The Neon XR Core package contains a client to receive this data and map it into the 3D virtual world. By default, it tries to connect the first Neon device it detects on the network.
+
+The Neon Companion app publishes the data it generates to the local network using the [real-time API](/real-time-api/tutorials/). The Neon XR Core package contains a client to receive this data and map it into the 3D virtual world. By default, it tries to connect to the first Neon device it detects on the network.
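This discovery behavior is driven by the `rtspSettings` block of the app's `config.json`, described below. As a sketch, a hypothetical `config.json` that disables auto-discovery and pins a fixed device IP might look like this — the values shown are illustrative, not defaults, and the real file may contain additional fields not shown here:

```json
{
  "rtspSettings": {
    "autoIP": false,
    "deviceName": "",
    "ip": "192.168.1.42"
  }
}
```

With `autoIP` set to `true` instead, the `ip` field is ignored and the client connects to the first Neon device it discovers; a non-empty `deviceName` restricts discovery to devices with that name.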
-Thie behavior can be further configured by editing the `config.json` file of the app, which is located at the following path:
+This behavior can be further configured by editing the `config.json` file of the app, which is located at the following path:
+
```
\Android\data\com.MixedRealityToolkitOrganization.MRTK3Sample\files\config.json
-```
+```
It contains a field `rtspSettings` with the following keys:
-| Field | Description |
-| --- | --- |
-| `autoIP` | Enables the automatic discovery of Neon devices connected to the local network. The first detected device will be used. |
-| `deviceName` | If not empty, only devices with the provided name can be discovered. |
-| `ip` | The IP address that will be used if automatic discovery is disabled. |
\ No newline at end of file
+| Field        | Description                                                                                                             |
+| ------------ | ----------------------------------------------------------------------------------------------------------------------- |
+| `autoIP`     | Enables the automatic discovery of Neon devices connected to the local network. The first detected device will be used.  |
+| `deviceName` | If not empty, only devices with the provided name can be discovered.                                                     |
+| `ip`         | The IP address that will be used if automatic discovery is disabled.                                                     |
diff --git a/neon/pupil-cloud/index.md b/neon/pupil-cloud/index.md
index cf94731f2..a6d134177 100644
--- a/neon/pupil-cloud/index.md
+++ b/neon/pupil-cloud/index.md
@@ -8,6 +8,6 @@ aside: false
[Pupil Cloud](https://cloud.pupil-labs.com) is a web-based eye tracking platform for data logistics, analysis, and visualization. It is the recommended tool for processing your Neon recordings. It makes it easy to store all your data securely in one place and it offers a variety of options for analysis.
-If Cloud upload is enabled in the Neon Companion App, then all recordings will be uploaded automatically to Pupil Cloud.
+If Cloud upload is enabled in the Neon Companion app, then all recordings will be uploaded automatically to Pupil Cloud.
We have a strict privacy policy that ensures your recording data is accessible only by you and those you explicitly grant access to. Pupil Labs will never access your recording data unless you explicitly instruct us to. diff --git a/neon/pupil-cloud/offset-correction/index.md b/neon/pupil-cloud/offset-correction/index.md index 8aadcbe49..b68ffdfaf 100644 --- a/neon/pupil-cloud/offset-correction/index.md +++ b/neon/pupil-cloud/offset-correction/index.md @@ -1,6 +1,6 @@ # Offset Correction on Pupil Cloud -For some subjects, you may find a constant offset in their gaze estimates. This gaze offset can be [compensated for in the Neon Companion App](/data-collection/offset-correction/) before starting a recording, or post hoc in Pupil Cloud as shown below. Click on the gaze offset icon to start the gaze offset correction. +For some subjects, you may find a constant offset in their gaze estimates. This gaze offset can be [compensated for in the Neon Companion app](/data-collection/offset-correction/) before starting a recording, or post hoc in Pupil Cloud as shown below. Click on the gaze offset icon to start the gaze offset correction. ![Offset correction on Cloud header image](./offset-cloud-timeline.png) @@ -8,7 +8,7 @@ Once you enter the `Edit Gaze Offset` view, simply drag the gaze circle to apply - The grey circle indicates the raw gaze estimate provided by Neon’s gaze estimation pipeline. - The red circle indicates the gaze position with the current offset applied. -- The blue circle indicates the offset correction applied in the Neon Companion App. +- The blue circle indicates the offset correction applied in the Neon Companion app.