Race condition on mounting multiple components with requests #177

Open

thomassuckow opened this issue Feb 11, 2020 · 7 comments

@thomassuckow

Describe the bug
Race condition causes some requests to never be sent, so they stay loading forever.

Codesandbox:
https://codesandbox.io/s/falling-shape-3gcsu

Here is what I see. Since this is a race condition it doesn't always trigger, but it does quite often for me:
(screenshot: Screen Shot 2020-02-11 at 08 55 06)

To Reproduce
Steps to reproduce the behavior:

  1. Start a few requests so they are in flight.
  2. Before the first few complete, mount more components that cause more requests.
  3. Note that some/all of the new requests never complete and are loading "forever".
  4. Note that sometimes, when one of the new components is unmounted while in this forever-loading state, a React invariant exception occurs because setLoading(false) is called in a finally block, which is not allowed on an unmounted component (see the sketch after this list).
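
A minimal sketch of the kind of guard that avoids the warning in step 4 (this is an illustration of the general pattern with a hypothetical hook name, not the library's actual code):

```js
import { useCallback, useEffect, useRef, useState } from 'react'

// Illustration only: track whether the component is still mounted and make the
// loading setter a no-op after unmount, so a `finally { setLoading(false) }`
// can run safely even if the request outlives the component.
function useSafeLoading(initial = true) {
  const mounted = useRef(true)
  const [loading, setLoading] = useState(initial)

  useEffect(() => {
    mounted.current = true
    return () => { mounted.current = false }
  }, [])

  const setLoadingSafe = useCallback(value => {
    if (mounted.current) setLoading(value)
  }, [])

  return [loading, setLoadingSafe]
}
```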

Expected behavior
All requests should complete successfully.

@iamthesiz
Collaborator

You are seriously amazing for helping bulletproof this! ❤️ Again, I will get on this as soon as possible.

@iamthesiz
Collaborator

iamthesiz commented Feb 13, 2020

I tested the exact same code you have in the CodeSandbox at least 20 times on my local machine and was not able to reproduce the bug. I'll try some more, but it's possible this is an issue with CodeSandbox.

@iamthesiz
Collaborator

Update: I was able to reproduce.

@iamthesiz
Collaborator

iamthesiz commented Feb 13, 2020

Aha, here's the problem. The default cachePolicy is cache-first (meaning make 1 request, and use the cached response value for every identical request after that), and since you are making a bunch of the same request, it isn't sending them all; it returns the cached response for each. I guess it's not able to store the 1st http request's response in the cache fast enough before the other 3 requests in the 1st batch are made 🐛. So that's why all of the 1st 4 go true -> false.

Now, when we switch the count to 50, since the request is the same for them all, and loading is set to true initially when we have [] as the dependency array, the cached response is returned without setting loading to false because it never makes it that far. I'm still not 100% sure why the 1st 4 still remain false while the rest are true. Stay tuned.
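
To make the race concrete, here's a simplified sketch of the general cache-first pattern (an illustration only, not this library's actual code). Identical requests started before the first response has been written to the cache all miss the cache, while requests started after the write get the cached value; sharing the in-flight promise is one way to close that window:

```js
// Simplified sketch of the race described above (not the library's actual code).
const cache = new Map()

async function cacheFirstFetch(key, url, options) {
  if (cache.has(key)) {
    // Cache hit: the caller gets the stored value and skips the normal
    // request/"loading" lifecycle entirely.
    return cache.get(key)
  }
  const response = await fetch(url, options) // in-flight window: cache is still empty
  const data = await response.json()
  cache.set(key, data)                       // only now do later callers see a hit
  return data
}

// One way around the race (sketch only): also share the in-flight promise,
// so concurrent identical requests wait on the same fetch.
const inFlight = new Map()

function dedupedCacheFirstFetch(key, url, options) {
  if (cache.has(key)) return Promise.resolve(cache.get(key))
  if (!inFlight.has(key)) {
    inFlight.set(key, cacheFirstFetch(key, url, options).finally(() => inFlight.delete(key)))
  }
  return inFlight.get(key)
}
```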

Current workarounds until I get a fix for this:

  1. Use cachePolicy: 'no-cache':

```js
// (imports omitted as in the original snippet; useFetch is the library's hook)
function Thing() {
  const { loading, error, data } = useFetch(`https://httpbin.org/post`, {
    cachePolicy: 'no-cache',
    headers: {
      Accept: "application/json",
      "Content-Type": "application/json"
    },
    method: "POST",
    body: JSON.stringify({ width: 100 })
  }, [])

  return <div>Loading: {String(loading)}</div>
}
```
  2. Don't make the same request many times. You shouldn't be making requests in an array like this anyway, but even if you had to, the requests would all look different once they use IDs from the various items. So your code would look more like:

```js
// (imports omitted as in the original snippet: useFetch/Provider from the library,
// useState from React, and useTimeout from the sandbox)
function Thing({ i }) {
  const { loading, error, data } = useFetch(`https://httpbin.org/post`, {
    headers: {
      Accept: "application/json",
      "Content-Type": "application/json"
    },
    method: "POST",
    body: JSON.stringify({ width: i }) // where this would be an ID of whatever item this is
  }, [])

  console.log(`loading ${i}: `, loading)

  return <div style={{ fontSize: 12 }}>Loading: {i} {String(loading)}</div>
}

const BugRaceCondition = () => {
  const [more, setMore] = useState(false)
  useTimeout(() => setMore(true), 1000)
  return (
    <Provider url='https://httpbin.org/delay/3'>
      <div className="App">
        <h1>Hello CodeSandbox</h1>
        <h2>Start editing to see some magic happen!</h2>
        {Array.from({ length: more ? 50 : 4 }, (x, i) => <Thing key={i} i={i} />)}
      </div>
    </Provider>
  )
}
```

@thomassuckow
Author

It is worth noting that in my real application they are all different POST requests (same endpoint, different body). At least they should be 🤔. They should really be GETs, but that's an open issue with the API author.

For now I ended up making my own lightweight useFetch.

For what it is worth, all of these requests are for sparklines in an infinite scroller, so when you scroll quickly, random sparklines never finish loading.
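
(For reference, a lightweight replacement along these lines might look roughly like the sketch below. This is an illustration of the approach with a hypothetical hook name, not the exact code from my application.)

```js
import { useEffect, useState } from 'react'

// Each component instance does its own request and aborts it on unmount,
// so nothing is shared across components and no state is set after unmount.
function useLightweightFetch(url, options) {
  const [state, setState] = useState({ loading: true, error: null, data: null })

  useEffect(() => {
    const controller = new AbortController()
    fetch(url, { ...options, signal: controller.signal })
      .then(res => res.json())
      .then(data => setState({ loading: false, error: null, data }))
      .catch(error => {
        // Ignore the abort triggered by unmounting
        if (error.name !== 'AbortError') setState({ loading: false, error, data: null })
      })
    return () => controller.abort()
  }, [url]) // options intentionally left out of the deps for brevity

  return state
}
```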

@iamthesiz
Collaborator

Double check that they are different bodies. The way the cache works is we save the key as url:https://your-endpoint.com/whatever||method:POST||body:YOUR_JSON_STRINGIFIED_BODY and the body as the response body. (The cached body will change in the future to be the full response to the request, but for now it's "essentially" the response.json().)

Since the key includes the body, if the bodies are different for each request, it should work. I will look into this some more asap.
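
Roughly speaking, the key is composed something like this (simplified sketch, not the library's exact source):

```js
// Simplified illustration of how the cache key described above is built.
function makeCacheKey(url, options = {}) {
  const method = options.method || 'GET'
  const body = options.body || ''
  return `url:${url}||method:${method}||body:${body}`
}

// Two POSTs with different JSON bodies produce different keys,
// so they are cached (and fetched) separately:
makeCacheKey('https://httpbin.org/post', { method: 'POST', body: JSON.stringify({ width: 1 }) })
// => 'url:https://httpbin.org/post||method:POST||body:{"width":1}'
makeCacheKey('https://httpbin.org/post', { method: 'POST', body: JSON.stringify({ width: 2 }) })
// => 'url:https://httpbin.org/post||method:POST||body:{"width":2}'
```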

@thomassuckow
Author

I'm currently juggling some unrelated work, so it might be a bit before I get around to checking on the bodies.
