I was happy to discover this project as I think the community could really use a standardized standalone latency testing tool.
That said, after taking a closer look, I don't think it gives meaningful results in its current state, because of how the auto-trigger timing is implemented.
If we want to use the average pin-toggle-to-USB-packet time as the primary metric, we have to be very conscious of when the GPIO pin toggle occurs. Because of the periodic nature of USB, not only do we need to know the precise time at which it occurred, but when taking, say, a thousand random samples with the intent of measuring the pin-to-USB times and averaging them, the pin toggle times also need to be uniformly distributed over the polling interval. Otherwise, the average will be influenced by the uneven distribution, not (just) by the characteristics of the device under test.
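To put numbers on it: for a device polled every T, an ideal zero-latency device yields a pin-to-USB time of T - (t mod T), where t is the toggle time, so the measured average is T - E[t mod T]. Only if t mod T is uniform over [0, T) does this come out to the expected T/2; any clustering of the toggles within the interval shifts the average away from that by the same amount.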
Looking at the code, the issue is acknowledged, but not really addressed in a satisfactory manner:
void xlat_auto_trigger_action(void)
{
    // random delay, such that we do not always perfectly align with USB timing
    srand(xTaskGetTickCount());
    int val = rand() & 0xFFF;
    for (volatile int i = 0; i < val; i++) {
        __NOP();
    }
    HAL_GPIO_WritePin(ARDUINO_D11_GPIO_Port, ARDUINO_D11_Pin, auto_trigger_level_high ? GPIO_PIN_SET : GPIO_PIN_RESET);
}
This NOP loop results in a maximum delay of around 140us, which I guess is meant to approximate the 125us duration of a High Speed microframe, but it would need to be exactly equal to it to even work for 8000Hz devices, let alone anything below that. And if there is any alignment or synchronization between the time at which xlat_auto_trigger_action is called and the USB frame timing, and it seems that there is, the results will mostly be dominated by that alignment rather than by the device. If a perfect zero-latency device reports at 1ms intervals and all of our pin toggles land within the 100us window right after the data transfer, we will measure an average pin-to-USB latency of around 900-1000us, even though the correct figure is near 500us.
One might be tempted by a solution along the following lines.
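A sketch of what I mean, assuming a free-running microsecond timebase; micros() is a hypothetical helper here, not an existing XLAT function:

void xlat_auto_trigger_action(void)
{
    // pick an offset uniformly distributed over the 1ms polling interval
    uint32_t offset = (uint32_t)rand() % 1000;
    uint32_t start = micros();

    // busy-wait on the microsecond timebase until the chosen offset is reached
    while ((uint32_t)(micros() - start) < offset) {
        // spin
    }

    HAL_GPIO_WritePin(ARDUINO_D11_GPIO_Port, ARDUINO_D11_Pin,
                      auto_trigger_level_high ? GPIO_PIN_SET : GPIO_PIN_RESET);
}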
But while this would be better than the current situation, it still has issues. Whenever a higher-priority thread or an interrupt handler runs at exactly the time we want to toggle the GPIO pin, we have to wait until our thread is scheduled again, which carves a gap into the distribution within the USB frame near the SOF packet, again making it non-uniform and skewing the average latency.
I think we should look into using a hardware peripheral of the microcontroller to generate the GPIO edge at a given time, which would guarantee a uniform distribution of the samples and make the toggle immune to our thread being preempted.
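For example, a timer output-compare channel could drive the trigger pin directly. A minimal sketch, assuming the trigger pin can be mapped to a timer channel in toggle-on-match mode (here TIM3_CH1, with htim3 configured elsewhere for a 1MHz tick and a 1ms period) and using the hardware RNG; the names are illustrative, not existing XLAT code:

extern TIM_HandleTypeDef htim3;   // configured elsewhere: 1MHz tick, ARR = 999 (1ms period)
extern RNG_HandleTypeDef hrng;    // hardware RNG, also avoids the per-call srand() issue

void xlat_auto_trigger_action(void)
{
    uint32_t r;
    HAL_RNG_GenerateRandomNumber(&hrng, &r);

    // compare value uniformly distributed over the 1ms period
    // (the modulo bias of a 32-bit value % 1000 is negligible here)
    uint32_t offset = r % 1000;

    // the edge is generated by the timer hardware on compare match,
    // exactly 'offset' microseconds from now, independent of scheduling
    // and interrupts
    __HAL_TIM_SET_COUNTER(&htim3, 0);
    __HAL_TIM_SET_COMPARE(&htim3, TIM_CHANNEL_1, offset);
    HAL_TIM_OC_Start(&htim3, TIM_CHANNEL_1);
}

The channel would still need to be disarmed after a single edge (e.g., from the compare-match interrupt), but toggle-on-match naturally alternates the pin level like the current auto_trigger_level_high logic, and the key point is that the toggle time is decided by the counter, not by software.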