Summary

Mutations are a common feature in data-fetching libraries:
While I don't find this fully necessary in many cases, since you can call `api.request()` directly, there is a lot of value in building a transaction / rollback strategy, for example:
- Managing a loading state to return from the hook and use in the UI
- Optimistically updating the cache value
- Making multiple API calls which are all required for a successful completion
- If one API call fails, we want to roll back to the original value
In between the API requests, we don't want the cache to be optimistically updated multiple times. Besides being jarring for the user, it may also introduce bugs. For example, if you're incrementing a counter rapidly with a button, we could optimistically show the last-pressed value, but the API would keep resetting it as each successive API call completes.
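As a contrived plain-JavaScript sketch of that reset behavior (the `cache` object here is a hypothetical stand-in, not the library's actual cache):

```js
// Naive behavior: every optimistic write and every API completion is
// written straight to the cache, so the UI replays stale values.
const uiValues = []
const cache = {
  _count: 0,
  set(count) {
    this._count = count
    uiValues.push(count) // what the user sees, in order
  }
}

// Three rapid "+1" clicks: the optimistic writes land first…
cache.set(1)
cache.set(2)
cache.set(3)

// …then each slow API call completes and re-writes the cache with its
// own (by now stale) server response:
cache.set(1)
cache.set(2)
cache.set(3)

console.log(uiValues) // [1, 2, 3, 1, 2, 3] – the counter visibly jumps back
```

Holding intermediate cache writes inside a transaction until all requests settle would avoid the visible jump back to stale values.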
An explicit mutations API which supports an optimistic response could solve the above issues.
Additionally, the optimistic setter should accept a "setter" function in addition to accepting the new response directly. This would facilitate updating the response relative to the previous response in a deterministic way, similar to react#useState's function setter option.
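A rough illustration of why the functional form matters (plain JavaScript; `createCacheEntry`/`setOptimistic` are hypothetical stand-ins mirroring the useState contract):

```js
// Minimal stand-in for one optimistic cache entry. `setOptimistic`
// accepts either a value or an updater function, like useState's setter.
function createCacheEntry(initial) {
  let value = initial
  return {
    get: () => value,
    setOptimistic(next) {
      value = typeof next === 'function' ? next(value) : next
      return value
    }
  }
}

const counter = createCacheEntry({count: 0})

// Direct values clobber each other when callers race: each caller
// computed its value from a snapshot that may already be stale.
counter.setOptimistic({count: 1})
counter.setOptimistic({count: 1}) // second caller never saw the first write

// Updater functions compose relative to the latest value, so the
// result is deterministic regardless of interleaving:
counter.setOptimistic((prev) => ({count: prev.count + 1}))
counter.setOptimistic((prev) => ({count: prev.count + 1}))
console.log(counter.get().count) // 3
```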
Goals
- Allow a minimal mutation to be performed without much boilerplate
- Allow various API methods to be accessed, such as `request`, `writeCachedResponse`, `readCachedResponse`, `buildUrl`, etc., that may be useful during the mutation. Event handlers such as `onError` and `onCachedUpdate` should not be accessible during the mutation.
- Manage a `loading` flag to be returned from the hook while the mutation is occurring
- Provide an optimistic mutation option which allows setting cache values in a transaction, so the cached response can be updated for the mutation query (or any other query) and rolled back if the request(s) fail
- Allow concurrent requests, with the most recent optimistic response being respected. The `loading` flag should remain `true` while any requests are in flight.
- The default fetch policy of the mutation `request` method should be `no-cache`, and should be customizable via `ApiProvider`.
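A minimal sketch of how the optimistic transaction could work internally (plain JavaScript; the cache shape, key scheme, and `createMutationTransaction` are assumptions for illustration, not the library's actual implementation):

```js
// Hypothetical key-value cache keyed by a serialized endpoint + params.
function createMutationTransaction(cache) {
  const rollbacks = []
  return {
    // Optimistically write `update` (a value or an updater function) and
    // remember how to undo it. Returns a `commit` function that replaces
    // the optimistic entry with the final server response.
    setOptimisticResponse(key, update) {
      const original = cache.get(key)
      rollbacks.push(() => cache.set(key, original))
      cache.set(key, typeof update === 'function' ? update(original) : update)
      return (finalValue) =>
        cache.set(
          key,
          typeof finalValue === 'function' ? finalValue(cache.get(key)) : finalValue
        )
    },
    // If any request fails, restore every touched entry in reverse order.
    rollback() {
      for (const undo of rollbacks.reverse()) undo()
    }
  }
}

const cache = new Map()
cache.set('user:1', {id: 1, name: 'Ada'})

const tx = createMutationTransaction(cache)
const commitUser = tx.setOptimisticResponse('user:1', (prev) => ({
  ...prev,
  name: 'Ada L.'
}))
// The cache now shows the optimistic value; on success we would call
// `commitUser(serverResponse)`. Here the request fails instead:
tx.rollback()
console.log(cache.get('user:1').name) // 'Ada'
```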
Usage Examples
Optimistic update pattern with key-value cache:
```js
import {useApiMutation, useApiQuery} from 'fairlight'

const {showSnackbar} = useSnackbar()
const {handleUnexpectedError} = useErrorHandler()
const [users, usersQueryActions] = useApiQuery(UserEndpoints.list())
const [user, userQueryActions] = useApiQuery(UserEndpoints.findById(props.id))

const [updateUser, {loading: updatingUser}] = useApiMutation({
  mutation: (values, formikBag) => async ({request, setOptimisticResponse}) => {
    // transform data into request body payload
    const updatePayload = transformDataForUpdate(values)

    // optimistically update the UI
    const commitUser = setOptimisticResponse(
      UserEndpoints.findById(props.id),
      (prev) => ({...prev, ...updatePayload})
    )
    const commitUsers = setOptimisticResponse(UserEndpoints.list(), (prev) =>
      prev.map((prevUser) => {
        return prevUser.id === props.id
          ? {...prevUser, ...updatePayload}
          : prevUser
      })
    )

    // make request(s)
    // - `deduplicate` is always `false`
    // - `fetchPolicy` is always `fetch-first` or `no-cache`
    const updatedUser = await request(
      UserEndpoints.partialUpdate(props.id, updatePayload)
    )

    // now we need to replace the cached versions with
    // the correct versions
    commitUser(updatedUser)
    commitUsers((prev) =>
      prev.map((prevUser) => {
        return prevUser.id === props.id ? updatedUser : prevUser
      })
    )

    // we may even want to refetch these related queries instead
    usersQueryActions.refetch()
    userQueryActions.refetch()

    showSnackbar({
      variant: 'Success',
      message: `User '${updatedUser.name}' has been updated.`
    })
  },
  onError(error, {args}) {
    handleFormError(error, args[1], 'An error occurred')
  }
})
```
Potential future-version if we start supporting a normalized cache:
```js
import {useApiMutation, useApiQuery} from 'fairlight'

const {showSnackbar} = useSnackbar()
const {handleUnexpectedError} = useErrorHandler()
const [users, usersQueryActions] = useApiQuery(UserEndpoints.list())
const [user, userQueryActions] = useApiQuery(UserEndpoints.findById(props.id))

const [updateUser, {loading: updatingUser}] = useApiMutation({
  mutation: (values, formikBag) => async ({request}) => {
    // transform data into request body payload
    const updatePayload = transformDataForUpdate(values)

    const updatedUser = await request(
      UserEndpoints.partialUpdate(props.id, updatePayload),
      {
        // must be set directly on the `request` because
        // it doesn't have a way to match up the request body
        // to identify the correct request (ie. running multiple `POST` requests)
        optimisticResponse: {
          ...user.data,
          ...updatePayload
        }
      }
    )

    // we may even want to refetch these related queries
    userQueryActions.refetch()

    showSnackbar({
      variant: 'Success',
      message: `User '${updatedUser.name}' has been updated.`
    })
  },
  onError(error, {args}) {
    handleFormError(error, args[1], 'An error occurred')
  }
})
```