
feat: Eagerly load and cache playlists #27

Merged

Conversation

@adambechtold adambechtold (Owner) commented Feb 24, 2024

Eagerly Generate Playlists and Store Them in a Cache

Playlists are taking a long time to load. For more, see Issue #25.

Features

🆕 Class - PlaylistCache - Stores entire playlists

  • function createPlaylistKey(users: [number, number], preferenceType: PreferenceType): string
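
As a rough sketch of what this key function might look like (the signature comes from the PR; the key format and any PreferenceType variants beyond the two shown in the inverse-lookup example below are assumptions):

```typescript
// Assumed union; USER1-ONLY / USER2-ONLY appear in the PR's inverse-lookup
// example, the remaining variant is a guess.
type PreferenceType = "USER1-ONLY" | "USER2-ONLY" | "BOTH";

// Assumed key format: a delimiter-joined string of both user ids and the preference.
function createPlaylistKey(
  users: [number, number],
  preferenceType: PreferenceType,
): string {
  // e.g. createPlaylistKey([1, 2], "USER1-ONLY") -> "1:2:USER1-ONLY"
  return `${users[0]}:${users[1]}:${preferenceType}`;
}
```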

Dependency - lru-cache

This library looks well supported.

  • Weekly Downloads - 192,121,185
  • Last Updated - One month ago
  • Documentation Quality - High
  • Dependencies - 0
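
A minimal sketch of how PlaylistCache could wrap lru-cache; the class name and dependency match the PR, but the Playlist shape, the max size, and the method names are assumptions:

```typescript
import { LRUCache } from "lru-cache"; // named export in lru-cache v10+; older versions use a default export

// Hypothetical playlist shape for illustration.
interface Playlist {
  trackIds: string[];
}

class PlaylistCache {
  // max: 100 is an illustrative bound, not a value taken from the PR.
  private cache = new LRUCache<string, Playlist>({ max: 100 });

  get(key: string): Playlist | undefined {
    return this.cache.get(key);
  }

  set(key: string, playlist: Playlist): void {
    this.cache.set(key, playlist);
  }
}
```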

🆕 Functionality - Playlist Service looks for cached playlists before generating new ones

Highlight - It looks for both the requested playlist and the inverse playlist

When a playlist is requested:

  1. Look for a cached playlist that fits the exact parameters
  2. Look for a cached playlist that fits the inverse parameters
     • Example
       • Requested Params
         • User 1 - id: 1
         • User 2 - id: 2
         • Preference - USER1-ONLY
       • Inverse
         • User 1 - id: 2
         • User 2 - id: 1
         • Preference - USER2-ONLY
  3. Generate a new playlist (sketched below)
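
A minimal sketch of this lookup order, reusing the types from the sketches above; invertPreference and generatePlaylist are hypothetical names introduced for illustration, not the PR's actual functions:

```typescript
// Hypothetical helper: swap the user-specific preference when the user order is reversed.
function invertPreference(p: PreferenceType): PreferenceType {
  if (p === "USER1-ONLY") return "USER2-ONLY";
  if (p === "USER2-ONLY") return "USER1-ONLY";
  return p; // symmetric preferences are their own inverse
}

// Assumed to exist elsewhere in the playlist service.
declare function generatePlaylist(
  users: [number, number],
  preference: PreferenceType,
): Promise<Playlist>;

async function getPlaylist(
  cache: PlaylistCache,
  users: [number, number],
  preference: PreferenceType,
): Promise<Playlist> {
  // 1. Exact parameters
  const exactKey = createPlaylistKey(users, preference);
  const exact = cache.get(exactKey);
  if (exact) return exact;

  // 2. Inverse parameters: users swapped, preference inverted
  const inverseKey = createPlaylistKey(
    [users[1], users[0]],
    invertPreference(preference),
  );
  const inverse = cache.get(inverseKey);
  if (inverse) return inverse;

  // 3. Cache miss: generate and store
  const playlist = await generatePlaylist(users, preference);
  cache.set(exactKey, playlist);
  return playlist;
}
```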


Impact - The site appears 10-20x faster

  • Request Time without Pre-cache
    • Median - 1.2s
    • 90th percentile - 2.0s
    • 99th percentile - 4.5s
  • Request Time with Pre-cache
    • Median - 150ms
    • 90th percentile - 200ms
    • 99th percentile - 250ms

Caveat - No speed-up if the pre-load has not yet completed

Approach for Future Optimization - Cache the pending preload function and return it

The preload call already kicks off 3 playlist-generation calls. We can store those pending calls (promises) in a cache. If one of those playlists is requested while its call is still in flight, the request can simply wait for the existing call to finish rather than kicking off a new one.
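
A sketch of that idea, assuming the pending work is represented as promises; pendingCache and getOrGeneratePlaylist are hypothetical names, and the other identifiers come from the sketches above:

```typescript
// Map from playlist key to the in-flight generation promise.
const pendingCache = new Map<string, Promise<Playlist>>();

async function getOrGeneratePlaylist(
  users: [number, number],
  preference: PreferenceType,
): Promise<Playlist> {
  const key = createPlaylistKey(users, preference);

  // If this playlist is already being generated, wait for that call
  // instead of kicking off a new one.
  const pending = pendingCache.get(key);
  if (pending) return pending;

  const promise = generatePlaylist(users, preference).finally(() => {
    pendingCache.delete(key); // clear the entry once the call settles
  });
  pendingCache.set(key, promise);
  return promise;
}
```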

Benchmarks are approximate

These numbers are approximate, based on informal benchmarking.

  • Median - What I saw most often
  • 90th percentile - Numbers I saw a few times during testing
  • 99th percentile - The highest number I saw

@adambechtold adambechtold self-assigned this Feb 24, 2024
@adambechtold adambechtold (Owner, Author) left a comment

Very fun! 👏

@adambechtold adambechtold marked this pull request as ready for review February 24, 2024 21:07
@adambechtold adambechtold merged commit 9452e03 into main Feb 24, 2024
@adambechtold adambechtold deleted the feat/optimize-playlist-generation/eager-load-and-cache/main branch February 24, 2024 21:08