
Speedtest Fail - starting yesterday the 21st #36

Open
imzenreally opened this issue Jun 23, 2023 · 23 comments · May be fixed by #37

Comments

@imzenreally

Been working like a charm, and then it's over. It's still happily talking to my InfluxDB, but it seems to be unhappy with Ookla. Did their client change or something?

@chrisweis

chrisweis commented Jun 24, 2023

Same here; I can't figure out why. I tried changing the LOG_TYPE environment variable to "debug" to see more detail, but got nothing.

Considering a switch over to this: https://github.com/alexjustesen/speedtest-tracker
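For anyone else trying the debug-logging step mentioned above, a minimal sketch of recreating the container with `LOG_TYPE=debug` set; the container name, image tag, and InfluxDB address here are illustrative assumptions, not values from this thread:

```shell
# Recreate the SpeedFlux container with debug logging enabled
# (container name, image tag, and InfluxDB address are assumed).
docker rm -f speedflux
docker run -d --name speedflux \
  -e LOG_TYPE=debug \
  -e INFLUX_DB_ADDRESS=192.168.1.10 \
  ghcr.io/breadlysm/speedflux:latest

# Follow the logs to look for the failure detail.
docker logs -f speedflux
```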

@imzenreally
Author

> Same here, can't seem to figure out why. Tried to change LOG_TYPE environment variable to "debug" to see more detail but nothing.
>
> Considering a switch over to this: https://github.com/alexjustesen/speedtest-tracker

Are you running this container in Unraid?

I've been seeing some odd SSL errors that (according to a separate thread) seem related to the Docker version.

I'm running Unraid 6.12.1, and everything is up to date.

@chrisweis

Nope, I'm running it atop an Ubuntu VM on Proxmox.

@imzenreally
Author

imzenreally commented Jun 24, 2023 via email

@chrisweis

Which logs are you seeing the SSL errors in? There are no SSL errors in my Docker logs. I have:

```
{"log":"2023-06-24 01:44:25,819 [INFO] Speedtest CLI data logger to InfluxDB started...\r\n","stream":"stdout","time":"2023-06-24T01:44:25.819485356Z"}
{"log":"2023-06-24 01:44:25,839 [INFO] Ping data written successfully\r\n","stream":"stdout","time":"2023-06-24T01:44:25.83961178Z"}
{"log":"2023-06-24 01:44:25,861 [INFO] Manual server choice : ID = 10162\r\n","stream":"stdout","time":"2023-06-24T01:44:25.861743637Z"}
{"log":"2023-06-24 01:44:25,861 [INFO] Speedtest Failed :\r\n","stream":"stdout","time":"2023-06-24T01:44:25.86180898Z"}
{"log":"2023-06-24 01:45:25,900 [INFO] Ping data written successfully\r\n","stream":"stdout","time":"2023-06-24T01:45:25.901179612Z"}
{"log":"2023-06-24 01:46:25,969 [INFO] Ping data written successfully\r\n","stream":"stdout","time":"2023-06-24T01:46:25.969823677Z"}
```

@chrisweis

Looks like it stopped working for me at 2023-06-21 13:30 (US Pacific).

@imzenreally
Author

imzenreally commented Jun 24, 2023 via email

@imzenreally
Author

```
2023-06-23 20:48:32,590 [INFO] Speedtest Failed :
2023-06-23 20:48:32,591 [DEBUG] b'[2023-06-23 20:48:32.588] [error] Configuration - SSL connect error (UnknownException)\n[2023-06-23 20:48:32.589] [error] Configuration - Cannot retrieve configuration document (0)\n[2023-06-23 20:48:32.589] [error] ConfigurationError - Could not retrieve or read configuration (Configuration)\n[2023-06-23 20:48:32.589] [error] ConfigurationError - Could not retrieve or read configuration (Configuration)\n{"type":"log","timestamp":"2023-06-24T03:48:32Z","message":"Configuration - Could not retrieve or read configuration (ConfigurationError)","level":"error"}\n'
2023-06-23 20:48:32,591 [DEBUG] b''
```

My last successful pull was 6/21 09:30 Pacific.

@mattdymes

I started having the problem on the same date. I just switched over to https://github.com/alexjustesen/speedtest-tracker in a Docker container. Be aware: the new speedtest-tracker is API- or InfluxDB 2.0-only. Out of the box, traditional username/password authentication to the InfluxDB database does not seem to be an option; it is key only. It will work, but it will not provide the same functionality as SpeedFlux.

@imzenreally
Author

@mattdymes I've avoided using that because I'm resisting upgrading my InfluxDB; I'm still running ... 1.8, I think.

@imzenreally
Author

@mattdymes But the graphs the guy has set up as defaults are nice; I suppose it would replace my Grafana dashboard.

@funnelcloudservices

Unraid guy here. Yep, mine started failing as well.

@imzenreally
Author

Looks like the speedtest-tracker that's been linked here a few times has plans to support InfluxDB v1. I'm probably bailing on SpeedFlux: the repo doesn't appear to be maintained, and I'm too lazy to troubleshoot when there are other working options.

@chrisweis

This thread helped me upgrade my InfluxDB v1 to v2 using a Docker container: https://community.influxdata.com/t/problem-backup-1-8-restore-2-6/28943

A hypothesis I considered but didn't test: did Ookla Speedtest start limiting how frequently tests can run, and that began causing the failures? Would decreasing to only hourly tests fix the old SpeedFlux? 🤔

@clintkev251

Nah, that's not it. I only test once a day, and mine has been failing consistently too.

@imzenreally
Author

> Would a decrease to only hourly fix the old Speedflux? 🤔

I only ran my SpeedFlux once every 2 or 3 hours, so I don't think it's Ookla rate limiting.

@ananyosen

ananyosen commented Jun 27, 2023

Looks like a case of a horribly outdated speedtest binary: the image was last published almost two years ago (https://github.com/breadlysm/SpeedFlux/pkgs/container/speedflux). I manually upgraded to the latest speedtest binary from Ookla (https://install.speedtest.net/app/cli/ookla-speedtest-1.2.0-linux-x86_64.tgz) after exec'ing into the container, replaced the binary at /usr/bin/speedtest, and speedtests are working fine now.

This will not persist across container updates (but that does not seem to be an issue at this point).
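One way to sketch the manual fix described above, using `docker cp` from the host rather than downloading inside the container; the container name (`speedflux`) is an assumption, and the tarball URL is the one given above:

```shell
# Download the newer Ookla CLI and extract just the binary.
wget https://install.speedtest.net/app/cli/ookla-speedtest-1.2.0-linux-x86_64.tgz
tar -xzf ookla-speedtest-1.2.0-linux-x86_64.tgz speedtest

# Copy the fresh binary over the stale one inside the running container,
# then confirm the version it reports.
docker cp speedtest speedflux:/usr/bin/speedtest
docker exec speedflux speedtest --version
```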

Edit: There's a fork that's regularly updated (https://hub.docker.com/r/dontobi/speedflux.rpi); I'll test it over the weekend and migrate if it's a drop-in replacement.

@imzenreally
Author

imzenreally commented Jun 27, 2023

> There's a fork that's regularly updated

Slick. I was just about to fork it and take a whack at making the update myself; let me know if that works as a drop-in replacement.

Edit: it fails as a drop-in replacement in Unraid; it's targeted specifically at ARM.

@faspina

faspina commented Jun 27, 2023

> Looks like a case of horribly updated speedtest binary, the image was last published almost 2 years back (https://github.com/breadlysm/SpeedFlux/pkgs/container/speedflux). Manually upgraded to latest speedtest binary from ookla (https://install.speedtest.net/app/cli/ookla-speedtest-1.2.0-linux-x86_64.tgz) after exec'ing into the container, replaced the binary in /usr/bin/speedtest and speedtests are working fine now.
>
> This will not persist across container updates (but that does not seem to be an issue at this point)

This worked for me. I downloaded the file, unpacked it, put it in a directory, and added that directory as a volume to the Docker container. Then I went into the container console, copied speedtest into /usr/bin to replace the existing one, and restarted the container; it started working.
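A rough sketch of that bind-mount approach, which keeps the binary on the host so it is easy to re-apply after the container is recreated; the host path, container name, and image tag here are assumptions:

```shell
# Prepare a host directory holding the newer binary (host path assumed).
mkdir -p /opt/speedtest-cli && cd /opt/speedtest-cli
wget https://install.speedtest.net/app/cli/ookla-speedtest-1.2.0-linux-x86_64.tgz
tar -xzf ookla-speedtest-1.2.0-linux-x86_64.tgz speedtest

# Recreate the container with the directory mounted in, then overwrite
# the bundled binary from inside the container and restart it.
docker run -d --name speedflux \
  -v /opt/speedtest-cli:/mnt/speedtest-cli:ro \
  ghcr.io/breadlysm/speedflux:latest
docker exec speedflux cp /mnt/speedtest-cli/speedtest /usr/bin/speedtest
docker restart speedflux
```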

@funnelcloudservices

Oddly enough mine started working again overnight.

@imzenreally
Author

imzenreally commented Jun 29, 2023 via email

@clintkev251

Yup, mine has also just randomly started working again. No changes, no updates.

@sethwv sethwv linked a pull request Jul 11, 2023 that will close this issue
@sethwv

sethwv commented Jul 11, 2023

> Looks like a case of horribly updated speedtest binary, the image was last published almost 2 years back (https://github.com/breadlysm/SpeedFlux/pkgs/container/speedflux). Manually upgraded to latest speedtest binary from ookla (https://install.speedtest.net/app/cli/ookla-speedtest-1.2.0-linux-x86_64.tgz) after exec'ing into the container, replaced the binary in /usr/bin/speedtest and speedtests are working fine now.
>
> This will not persist across container updates (but that does not seem to be an issue at this point)

I've made a PR that swaps out the binary, even though it's unlikely to be merged since this project is unfortunately so inactive. If you need a drop-in replacement for Unraid, for now you're welcome to build an image of your own from my PR fork, or use my repo at https://hub.docker.com/repository/docker/sethwv/speedflux/
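Building your own patched image, as suggested, might look something like this; the base image tag and tarball URL are taken from earlier in the thread, but the build steps are an illustrative assumption, not the contents of the actual PR:

```shell
# Write a minimal Dockerfile that layers the newer Ookla CLI on top of
# the published SpeedFlux image (base image tag assumed).
cat > Dockerfile <<'EOF'
FROM ghcr.io/breadlysm/speedflux:latest
ADD https://install.speedtest.net/app/cli/ookla-speedtest-1.2.0-linux-x86_64.tgz /tmp/speedtest.tgz
RUN tar -xzf /tmp/speedtest.tgz -C /usr/bin speedtest && rm /tmp/speedtest.tgz
EOF

docker build -t speedflux:patched .
```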


8 participants