
I have a pretty weird case to handle.

We have a few boxes, and we can trigger an action on each box. When we click the button inside a box, we call an endpoint on the server (using axios). The server response returns updated information about all boxes, not only the one on which we triggered the action.

Issue: If the user clicks the submit button on many boxes very quickly, the requests hit the endpoint one after another, and this sometimes causes errors because the server processes them in the wrong order (the status of a group of boxes depends on the status of a single box). I know this is arguably a backend issue, but I have to try to fix it on the frontend.

Proposed fix: In my opinion the easiest fix is to disable every submit button while any request is in progress. Unfortunately this makes the UI feel very slow, so the head of the project rejected the proposal.

What we want to achieve: We want to queue the requests somehow without disabling every button. The ideal solution for me at the moment:

  • click the first button - the endpoint is called and the request is pending on the server.
  • click the second button - the button shows a spinner/loading indicator without calling the endpoint.
  • only when the server responds to the first click do we actually send the second request.

I think something like this is a huge antipattern, but I don't set the rules. ;)

I was reading about e.g. redux-observable, but I'd rather not add another Redux middleware if I don't have to (we currently use redux-thunk). redux-saga would be OK, but unfortunately I don't know that tool. I prepared a simple codesandbox example (I added timeouts in the Redux actions for easier testing).

I have only one rough idea: keep an array of the data needed to send the next request, and inside useEffect check whether the array length is equal to 1. Something like this:

const App = ({ boxActions, inProgress, ended }) => {
  const [queue, setQueue] = useState([]);

  const handleSubmit = async () => { // this code does not work correctly; it only shows what I was thinking about
    if (queue.length === 1) {
      const [data] = queue;
      await boxActions.submit(data.id, data.timeout);
      setQueue(queue.filter((item) => item.id !== data.id));
    }
  };

  useEffect(() => {
    handleSubmit();
  }, [queue]);


  return (
    <>
      <div>
        {config.map((item) => (
          <Box
            key={item.id}
            id={item.id}
            timeout={item.timeout}
            handleSubmit={(id, timeout) => setQueue([...queue, {id, timeout}])}
            inProgress={inProgress.includes(item.id)}
            ended={ended.includes(item.id)}
          />
        ))}
      </div>
    </>
  );
};

Any ideas?

3 Answers


I agree with your assessment that we ultimately need to make changes on the backend. Any user can mess with the frontend and submit requests in any order they want, regardless of how you organize it.

I get it though, you're looking to design the happy path on the frontend such that it works with the backend as it is currently.

It's hard to tell without knowing the use-case exactly, but there may generally be some improvements we can make from a UX perspective that will apply whether we make fixes on the backend or not.

Is there an endpoint to send multiple updates to? If so, we could debounce our network call to submit only when there is a delay in user activity.
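To illustrate the debounce idea, here is a minimal sketch. The `batchSubmit` function and the bulk endpoint are assumptions, not part of the original setup: updates are collected and sent together once the user pauses.

```javascript
// Minimal debounce sketch: collect box updates and submit them in one
// batch once the user stops clicking. `batchSubmit` is a hypothetical
// function that would POST the whole batch to a bulk endpoint.
const createBatchedSubmit = (batchSubmit, delayMs = 300) => {
  let pending = [];
  let timer = null;
  return (update) => {
    pending.push(update);
    clearTimeout(timer); // every click resets the timer
    timer = setTimeout(() => {
      const batch = pending;
      pending = [];
      batchSubmit(batch); // e.g. axios.post('/boxes/batch', batch)
    }, delayMs);
  };
};
```

A burst of clicks on several boxes then produces a single request containing all of them, so the server can apply the updates in one consistent step.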

Does the user need to be aware of order of selection and the impacts thereof? If so, it sounds like we'll need to update frontend to convey this information, which may then expose a natural solution to the situation.

It's fairly simple to create a request queue and execute them serially, but it seems potentially fraught with new challenges.

E.g. if a user clicks 5 checkboxes and order matters, a failed execution of the second update would mean we need to stop any further execution of boxes 3 through 5 until update 2 completes. We'll also need to figure out how to handle timeouts, retries, and backoff, and there is some complexity in how we convey all of this to the end user.

Let's say we're completely set on going that route, however. In that case, your use of Redux for state management isn't terribly important, nor is the library you use for sending your requests.

As you suggested, we'll just create an in-memory queue of updates to be made and dequeue serially. Each time a user makes an update to a box, we'll push to that queue and attempt to send updates. Our processEvents function will retain state as to whether a request is in motion or not, which it will use to decide whether to take action or not.

Each time a user clicks a box, the event is added to the queue, and we attempt processing. If processing is already ongoing or we have no events to process, we don't take any action. Each time a processing round finishes, we check for further events to process. You'll likely want to hook into this cycle with Redux and fire new actions to indicate event success and update the state and UI for each event processed and so on. It's possible one of the libraries you use offer some feature like this as well.

// Get a better Queue implementation if queue size may get high.
class Queue {
  _store = [];
  enqueue = (task) => this._store.push(task);
  dequeue = () => this._store.shift();
  length = () => this._store.length;
}

export const createSerialProcessor = (asyncProcessingCallback) => {
  const updateQueue = new Queue();

  const addEvent = (params, callback) => {
    updateQueue.enqueue([params, callback]);
  };

  const processEvents = (() => {
    let isReady = true;

    return async () => {
      if (isReady && updateQueue.length() > 0) {
        const [params, callback] = updateQueue.dequeue();
        isReady = false;

        try {
          await asyncProcessingCallback(params, callback); // retries and all that included
        } finally {
          isReady = true; // reset even if the callback throws, so the queue never stalls
        }

        processEvents();
      }
    };
  })();

  return {
    process: (params, callback) => {
      addEvent(params, callback);
      processEvents();
    }
  };
};
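To make the ordering guarantee concrete, here is a small usage sketch. The factory is repeated in simplified form so the snippet runs on its own, and the "request" is simulated with a timer:

```javascript
// Simplified standalone version of the serial processor above,
// demonstrating that rapid-fire calls still execute one at a time.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

const createSerialProcessor = (asyncProcessingCallback) => {
  const queue = [];
  let busy = false;
  const processEvents = async () => {
    if (busy || queue.length === 0) return;
    busy = true;
    const task = queue.shift();
    try {
      await asyncProcessingCallback(task);
    } finally {
      busy = false;
      processEvents(); // pick up anything queued in the meantime
    }
  };
  return {
    process: (task) => {
      queue.push(task);
      processEvents();
    },
  };
};

const log = [];
const processor = createSerialProcessor(async (id) => {
  log.push(`start ${id}`);
  await sleep(10); // stands in for the axios call
  log.push(`end ${id}`);
});

processor.process(1);
processor.process(2);
// once both settle, the log shows no overlap: start 1, end 1, start 2, end 2
```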

Hope this helps.

Edit: I just noticed you included a codesandbox, which is very helpful. I've created a copy of your sandbox with updates made to achieve your end and integrate it with your Redux setup. There are some obvious shortcuts still being taken, like the Queue class, but it should be about what you're looking for: https://codesandbox.io/s/dank-feather-hqtf7?file=/src/lib/createSerialProcessor.js



In case you would like to use redux-saga, you can use the actionChannel effect in combination with the blocking call effect to achieve your goal:

Working fork: https://codesandbox.io/s/hoh8n

Here is the code for boxSagas.js:

import {actionChannel, call, delay, put, take} from 'redux-saga/effects';
// import axios from 'axios';
import {submitSuccess, submitFailure} from '../actions/boxActions';
import {SUBMIT_REQUEST} from '../types/boxTypes';

function* requestSaga(action) {
  try {
    // const result = yield axios.get(`https://jsonplaceholder.typicode.com/todos`);
    yield delay(action.payload.timeout);
    yield put(submitSuccess(action.payload.id));
  } catch (error) {
    yield put(submitFailure());
  }
}

export default function* boxSaga() {
  const requestChannel = yield actionChannel(SUBMIT_REQUEST); // buffers incoming requests
  while (true) {
    const action = yield take(requestChannel); // takes a request from queue or waits for one to be added
    yield call(requestSaga, action); // starts request saga and _waits_ until it is done
  }
}

I am using the fact that the box reducer handles the SUBMIT_REQUEST actions immediately (and marks the given id as pending), while actionChannel+call handles them sequentially, so the actions trigger only one HTTP request at a time.

More on action channels here: https://redux-saga.js.org/docs/advanced/Channels/#using-the-actionchannel-effect

4 Comments

Thanks so much. In your example, where exactly should I call the endpoint? In this test case I use a dummy timeout for easier testing, but in my real code I wait not for a timeout from the action but for the response from the server, and in your fork we don't call an endpoint at all (as I mentioned, I don't know saga). Could you update your example to not use dummy timeouts?
You just have to remove the delay and uncomment the request (also, the axios import was wrong O:) ) - see codesandbox.io/s/hw4zb for a working version with an endpoint instead of a timeout.
OK, if you could be so kind, I have one last question. What should I do if I have another button calling another endpoint, but I want to put every endpoint call into one queue? Something like this: codesandbox.io/s/…
If you really need to have all the redux actions duplicated (e.g. because you want to do different operations in the reducer), then you can pass an array to the actionChannel effect and run different sagas based on the action type: codesandbox.io/s/iodsl .... If instead you just need to change the endpoint, perhaps a simple type parameter would be enough: codesandbox.io/s/1klz7 ... You can even combine the two solutions (e.g. have two actions for the different endpoints instead of a type param but keep the rest the same).

Just store the promise from the previous request and wait for it to resolve before initiating the next one. The example below uses a global variable for simplicity - but you can use something else to preserve state across requests (e.g. the extraArgument from the thunk middleware).

// boxActions.ts

let submitCall = Promise.resolve();

export const submit = (id, timeout) => async (dispatch) => {
  dispatch(submitRequest(id));

  // ignore a previous failure so one rejected request doesn't block the rest of the chain
  submitCall = submitCall.catch(() => {}).then(() => axios.get(`https://jsonplaceholder.typicode.com/todos`));

  try {
    await submitCall;

    setTimeout(() => {
      return dispatch(submitSuccess(id));
    }, timeout);
  } catch (error) {
    return dispatch(submitFailure());
  }
};
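The same trick can be factored into a reusable wrapper (a sketch; the `serialize` name is mine): it chains every call onto the previous one and swallows a previous failure, so one rejected request doesn't poison the whole chain.

```javascript
// Reusable sketch of the promise-chaining trick: wrap any async function
// so that overlapping calls run strictly one after another.
const serialize = (fn) => {
  let last = Promise.resolve();
  return (...args) => {
    // Chain onto the previous call; ignore its failure so the queue
    // keeps moving even after a rejected request.
    const next = last.catch(() => {}).then(() => fn(...args));
    last = next;
    return next;
  };
};
```

For example, `const submitSerial = serialize((id) => axios.get(...))` would give every caller a promise for its own request while still sending the requests serially.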

1 Comment

Very smart, not sure I would have thought of this 😛. It's especially clean abstracted out, where you can see we're really only adding a few lines of code here and the rest of the code flow stays pretty much unchanged. codesandbox.io/s/vigorous-bell-porcd?file=/src/lib/…
