
Delay when sending push to api.push.apple.com #124

Closed
cyrildurand opened this issue Dec 20, 2023 · 6 comments
Labels
stale Stale issue that will be closed soon

Comments

@cyrildurand

Hi,

I'm sending thousands of notifications through this package. Everything works, but I'm experiencing delays when sending requests to api.push.apple.com.

When sending a few notifications, each request takes around 250 ms. When sending thousands of notifications, it increases to a few seconds, and sometimes up to 60 seconds.

I've tried configuring HttpClient in various ways and still get the same timings. Does anyone experience the same issue?

How fast is api.push.apple.com when sending notifications? Do you also experience delays when sending push notifications?

I have thousands (10k~200k) of notifications to send at the same time. What would be the best approach for this? Right now I'm sending 500 notifications per minute, which is way too slow.

@tomalabaster

I'm curious about the sending speed too. For example:

The ApnsService.SendPushes method does something along the lines of...:

foreach (var push in pushes)
    result.Add(await client.Send(push)); // awaits each send before starting the next

... which, to the best of my novice C# knowledge, performs one push at a time?

When I perform...:

await Task.WhenAll(pushes.Select((push) => _apnsService.SendPush(push)));

... it sends them all at once as opposed to waiting for each one?

A very non-scientific test, but using the built-in SendPushes method it takes me between 2.5 and 3 seconds to send 16 pushes, while using the Task.WhenAll approach it takes between 400 and 600 milliseconds.

As I said, novice C# knowledge, I'm sure there's downsides to using Task.WhenAll that yourself or @alexalok might know about?

@alexalok
Owner

alexalok commented Jan 1, 2024

Hi everyone, sorry for the delayed response.

Unfortunately, I have never had to manage a large number of push notifications, so I have never looked into the speed matter.
As @tomalabaster correctly points out, ApnsService.SendPushes does indeed send one push at a time. This is a conservative but extremely safe default that will get pushes across, sooner or later.

Using await Task.WhenAll(pushes.Select((push) => _apnsService.SendPush(push))); will generally speed things up considerably because instead of sending pushes one by one, we basically just throw them all at the ThreadPool (which backs Tasks, or more specifically, the default TaskScheduler). The problem with that approach is that if you send thousands of pushes at the same time, and thus put thousands of tasks on the ThreadPool, it may starve either the ThreadPool or the HTTP connection pool (or both). ThreadPool starvation is unlikely, though, because very little work is done inside SendPush (basically just serialization of the push payload and some housekeeping) apart from the actual HTTP request, which won't use any threads. What I am more worried about is starvation of the HTTP connection pool. It may be affected by a lot of things, say, where you host your app or which HttpHandler ends up being used internally by HttpClient.

It is of course possible to test these scenarios and use "safe enough" estimates, but I am sure there will be tons of edge cases, because in my personal opinion the HTTP stack in .NET has always been somewhat inconsistent (see #4 and #6, which did take quite some time to figure out and document properly). Based on that, I suggest folks use the safe defaults when possible, but if speed becomes a concern, write their own wrapper around concurrent use of _apnsService.SendPush and test which level of concurrency works best for their goals and the environment their software runs in. I would personally suggest looking at Parallel.ForEach, which is a perfect fit for this issue.
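A wrapper like the one described above could be sketched with a SemaphoreSlim gate, which caps how many sends are in flight at once. (Note that Parallel.ForEach is synchronous; for an async method like SendPush, Parallel.ForEachAsync on .NET 6+ or a semaphore gate is usually a better fit.) The Throttled class, the send delegate, and the limit of 32 below are all illustrative, not part of this library:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

static class Throttled
{
    // Starts all sends, but allows at most maxConcurrency to run at a time.
    public static async Task SendAllAsync<T>(
        IEnumerable<T> items, Func<T, Task> sendAsync, int maxConcurrency)
    {
        using var gate = new SemaphoreSlim(maxConcurrency);
        var tasks = items.Select(async item =>
        {
            await gate.WaitAsync();
            try { await sendAsync(item); }
            finally { gate.Release(); }
        }).ToList(); // materialize so every task is started
        await Task.WhenAll(tasks);
    }
}
```

Usage would then look something like `await Throttled.SendAllAsync(pushes, p => _apnsService.SendPush(p), 32);`, and the limit can be tuned per environment.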

Finally, I would advise everyone to first evaluate whether the delay in bulk notifications is something that can't be tolerated. Speaking from my experience, when I have to send that many notifications it is usually not about something time-critical, i.e. it won't hurt the user experience to get the notification an hour later. Think of an incoming call notification (extremely time-critical but not bulk at all) vs. a new major update notification (might involve hundreds of thousands of users but can generally be sent out gradually over hours if not days).

@cyrildurand

Thanks for the reply.

Totally agree with @alexalok: multithreading may improve performance for small batches, but the real concern is HTTP connection starvation. My application is hosted on Azure App Service, and opening multiple connections may also result in SNAT port exhaustion.

In my case, I need to send thousands of notifications in less than a few minutes, and I'm not sure how to get better performance.

I know that api.push.apple.com resolves to multiple IP addresses (8 IPv4 and 8 IPv6 right now). I'm considering writing my own HttpClient pool that opens one connection per resolved IP. I'm also considering using more than one outbound IP address and opening one connection per source IP.

It's not straightforward to implement, and I'm looking for feedback before making these changes.
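As a rough sketch of the per-IP pool idea: on .NET 5+, SocketsHttpHandler.ConnectCallback lets each HttpClient pin its TCP connections to a single resolved address, while TLS still negotiates against the original host name (so SNI and certificate validation keep working). This is only a sketch of the approach, not a tested implementation; all names are illustrative:

```csharp
using System;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Net.Sockets;
using System.Threading.Tasks;

static class PerIpPool
{
    // Resolves the host once and builds one HttpClient per returned address.
    public static async Task<HttpClient[]> CreateAsync(string host = "api.push.apple.com")
    {
        IPAddress[] addresses = await Dns.GetHostAddressesAsync(host);
        return addresses.Select(ip =>
        {
            var handler = new SocketsHttpHandler
            {
                // Route every connection of this client to the pinned IP.
                ConnectCallback = async (ctx, ct) =>
                {
                    var socket = new Socket(ip.AddressFamily, SocketType.Stream, ProtocolType.Tcp)
                    {
                        NoDelay = true
                    };
                    try
                    {
                        await socket.ConnectAsync(new IPEndPoint(ip, 443), ct);
                        return new NetworkStream(socket, ownsSocket: true);
                    }
                    catch
                    {
                        socket.Dispose();
                        throw;
                    }
                }
            };
            // BaseAddress keeps the real host name for TLS and :authority.
            return new HttpClient(handler) { BaseAddress = new Uri($"https://{host}/") };
        }).ToArray();
    }
}
```

Requests would then be spread round-robin (or similar) across the returned clients. DNS answers rotate, so the pool would also need periodic re-resolution in a real implementation.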

@tomalabaster

@alexalok thank you so much for that extremely detailed response! Some great insights 🙌

@alexalok

alexalok commented Jan 2, 2024

@cyrildurand, since APNs servers only support HTTP/2 (and not HTTP/1), perhaps some/all HttpClientHandlers can take advantage of stream multiplexing. You may also try limiting the number of concurrent connections per server to a sensible number that would prevent starvation. Wireshark can be used to confirm that the limiting works as expected.
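On .NET 5+, these knobs map roughly onto SocketsHttpHandler properties: one HTTP/2 connection multiplexes many streams, and EnableMultipleHttp2Connections allows the handler to open additional connections once a connection's stream limit is exhausted. The values below are illustrative starting points under that assumption, not recommendations:

```csharp
using System;
using System.Net;
using System.Net.Http;

var handler = new SocketsHttpHandler
{
    // Cap on connections to one server (primarily enforced for HTTP/1.1).
    MaxConnectionsPerServer = 4,
    // Let the handler open extra HTTP/2 connections when all streams
    // on the existing connection are in use, instead of queueing requests.
    EnableMultipleHttp2Connections = true,
    // Recycle connections periodically so DNS changes are picked up.
    PooledConnectionLifetime = TimeSpan.FromMinutes(10)
};

var client = new HttpClient(handler)
{
    DefaultRequestVersion = HttpVersion.Version20,
    DefaultVersionPolicy = HttpVersionPolicy.RequestVersionOrHigher
};
```

Watching the traffic in Wireshark, as suggested above, is still the way to confirm how many connections actually get opened in a given environment.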

@tomalabaster, glad I was able to shed some light on the issue 😃

As a note, should anyone do research into this matter you are more than welcome to share any findings here.


stale bot commented Feb 8, 2024

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the stale Stale issue that will be closed soon label Feb 8, 2024
@stale stale bot closed this as completed Feb 16, 2024