Feature Request: Include total in queryChannels request #1429

Open
Aidurber opened this issue Jan 6, 2025 · 2 comments

Comments

@Aidurber

Aidurber commented Jan 6, 2025

First off, I need to clarify that our usage of Stream is fairly non-standard. In our application we need to fetch all channels up front because we do channel categorisation. We cannot categorise on scroll and infinitely load that way because channels will jump around.

Our issue is that the response from client.queryChannels doesn't return a total. Because of this, we need to request channels sequentially rather than in parallel, and we see some requests for channels in the 3-5s range pretty frequently.

Here's the code we have to use:

        let allChannels: Channel[][] = []
        let offset = 0

        while (true) {
          const response = await client.queryChannels(filters, undefined, {
            limit: QUERY_LIMIT,
            offset,
            watch: true,
            state: true,
            message_limit: 50,
          })
          allChannels.push(response)
          if (response.length < QUERY_LIMIT) {
            break
          }
          offset += QUERY_LIMIT
        }

        return allChannels.flat()

If a coach has 100 users assigned to them, we need to make 4 requests due to the query limit being 30.
If Stream exposed a total: number in the API response, we could request once to get the first page and the total, then fetch the rest in parallel, which would improve performance for us pretty drastically.
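
For illustration only (total does not exist in the response today), here's roughly what that would look like if queryChannels returned a hypothetical { channels, total } shape instead of Channel[]:

        // Hypothetical: assumes queryChannels returned { channels: Channel[]; total: number }
        // instead of Channel[]. That shape does not exist in the SDK today.
        const first = await client.queryChannels(filters, undefined, {
          limit: QUERY_LIMIT,
          offset: 0,
          watch: true,
          state: true,
          message_limit: 50,
        })

        // With the total known after one request, every remaining offset is known up front
        // and the remaining pages can be fetched in parallel.
        const remainingOffsets = Array.from(
          { length: Math.max(Math.ceil(first.total / QUERY_LIMIT) - 1, 0) },
          (_, i) => (i + 1) * QUERY_LIMIT,
        )
        const rest = await Promise.all(
          remainingOffsets.map((offset) =>
            client.queryChannels(filters, undefined, {
              limit: QUERY_LIMIT,
              offset,
              watch: true,
              state: true,
              message_limit: 50,
            }),
          ),
        )

        return [first.channels, ...rest.map((page) => page.channels)].flat()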

With the way the API is currently designed, we can sort of force parallel requests with something like this (naïve implementation):

        // Super rough and too static
        const ESTIMATED_MAX_CHANNELS = 300
        const requests = Math.ceil(ESTIMATED_MAX_CHANNELS / QUERY_LIMIT)
        const range = Array.from({ length: requests }, (_, i) => i * QUERY_LIMIT)

        const responses = await Promise.all(
          range.map((offset) =>
            client.queryChannels(filters, undefined, {
              limit: QUERY_LIMIT,
              offset,
              watch: true,
              state: true,
              message_limit: 50,
            }),
          ),
        )
        const data = responses.flat()

This could be improved by batching 100 channels at a time in parallel so we can overcome the ESTIMATED_MAX_CHANNELS limitation.
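
For example, something along these lines (still a rough sketch; BATCH_SIZE is made up) would keep firing parallel batches until a page comes back short, instead of relying on a guessed maximum:

        // Fetch BATCH_SIZE pages per Promise.all round and stop once any page comes back
        // short, which removes the need for ESTIMATED_MAX_CHANNELS. BATCH_SIZE is arbitrary.
        const BATCH_SIZE = 4 // 4 * QUERY_LIMIT channels per round
        const allChannels: Channel[] = []
        let batchStart = 0

        while (true) {
          const offsets = Array.from(
            { length: BATCH_SIZE },
            (_, i) => (batchStart + i) * QUERY_LIMIT,
          )
          const pages = await Promise.all(
            offsets.map((offset) =>
              client.queryChannels(filters, undefined, {
                limit: QUERY_LIMIT,
                offset,
                watch: true,
                state: true,
                message_limit: 50,
              }),
            ),
          )
          allChannels.push(...pages.flat())
          // A short (or empty) page means we have reached the end.
          if (pages.some((page) => page.length < QUERY_LIMIT)) {
            break
          }
          batchStart += BATCH_SIZE
        }

        return allChannels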

All of these are less efficient than the API telling us the total up front, though, and I'd rather not hit the Stream API more than we need to.

@szuperaz
Contributor

szuperaz commented Jan 6, 2025

Hi,

Unfortunately, we can't accommodate new features that support non-standard use cases of our API. Front-loading all channels is something we advise against, as it can cause performance issues on the client side as well as rate limit issues.

I'm not familiar with your use case, but if the issue is sorting ("We cannot categorise on scroll and infinitely load that way because channels will jump around."), you can use custom data on channels to categorize them and then use queryChannels to filter by category:

        client.queryChannels({ 'custom.category': 'Category 1' }) // once the user has scrolled to the bottom of this category, switch to the next query
        client.queryChannels({ 'custom.category': 'Category 2' })

The limitation is that your channel list will be sorted by category first, but it's a performant solution because you don't have to fire off multiple requests in parallel.
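
Roughly, that flow could look like this (a sketch only; the channel type/id, the category field name, and the exact filter key for custom data are illustrative and worth checking against the docs):

        // Tag a channel with a category via custom channel data (the field name is illustrative).
        const channel = client.channel('messaging', 'channel-id') // type and id are placeholders
        await channel.updatePartial({ set: { category: 'Category 1' } })

        // Then page through one category at a time; the filter key mirrors the example above
        // ('custom.category'). Check the docs for the exact key to use for custom fields.
        const categoryOne = await client.queryChannels(
          { 'custom.category': 'Category 1' },
          undefined,
          { limit: 30, watch: true, state: true },
        )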

@Aidurber
Author

Aidurber commented Jan 6, 2025

Hello, that's an interesting suggestion. Our categories are a mix of time-based and message-based. For example:

  • Active Coach
  • Active User
  • Needs Follow up

These would need to be computed periodically and/or on event, ideally on the backend. I'll run it by the team.

I do understand that our use case is non-standard. However, if Stream supports pagination, it also makes sense for there to be some indicator of position, whether via a cursor or a current page + total pages. I think that's standard for a pagination API.
