Networking Overview #9

Closed
orta opened this Issue Aug 5, 2014 · 7 comments

@orta
Member

orta commented Aug 5, 2014

This was discussed in an Artsy hangout, but I wanted to get it into something actionable. Our networking API should:

  • Make it easy to run offline
  • Make it possible to run network requests synchronously
  • Treat stubs as a first-class citizen
  • Allow tests to state that only networking request X is allowed during a test run
  • Allow iterating through all potential API requests at runtime for API sanity checks
  • Keep track of in-flight requests and not allow duplicates

I think this is best done by keeping the request generation and the networking API client separate, a lot like how we have it in existing apps. I'm not assigning anyone, but these should get ticked off as they happen; definitely better to do it earlier.
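To make the split concrete, here's a minimal sketch of one way it could look. Nothing here exists in our apps today; `ArtsyAPI`, `APIClient`, the paths, and the stub JSON are all made up for illustration.

```swift
// Hypothetical sketch: request definitions live in an enum, separate from the
// client, and every case carries canned stub data so offline / synchronous
// test runs come for free.
import Foundation

enum ArtsyAPI {
    case artwork(id: String)
    case artist(id: String)

    var path: String {
        switch self {
        case .artwork(let id): return "api/v1/artwork/\(id)"
        case .artist(let id):  return "api/v1/artist/\(id)"
        }
    }

    // Stubs as first-class citizens: every request ships with sample data.
    var sampleData: Data {
        switch self {
        case .artwork: return Data("{\"id\": \"stubbed-artwork\"}".utf8)
        case .artist:  return Data("{\"id\": \"stubbed-artist\"}".utf8)
        }
    }
}

final class APIClient {
    let baseURL: URL
    let stubbed: Bool   // true = never touch the network, return stubs synchronously

    init(baseURL: URL, stubbed: Bool = false) {
        self.baseURL = baseURL
        self.stubbed = stubbed
    }

    func request(_ target: ArtsyAPI, completion: @escaping (Data?) -> Void) {
        if stubbed {
            // Stubbed mode: return canned data synchronously, no network at all.
            completion(target.sampleData)
            return
        }
        let url = baseURL.appendingPathComponent(target.path)
        URLSession.shared.dataTask(with: url) { data, _, _ in
            completion(data)
        }.resume()
    }
}
```

Because every request is just an enum case, whitelisting the one request a test allows, or walking the cases for API sanity checks, becomes a matter of switching over the enum.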

@orta orta added the enhancement label Aug 5, 2014

@orta

Member

orta commented Aug 5, 2014

I'd like the ability to easily cancel requests too; not really sure if that's the job of the API or the network models, though.
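For illustration only (again, nothing that exists yet): one hypothetical shape for cancellation is to hand callers back a token, which leaves the question of where cancellation lives open.

```swift
// Hypothetical cancellation sketch: the client returns a token the caller can
// cancel. Whether this belongs in the API client or the network models is
// exactly the open question above.
import Foundation

protocol Cancellable {
    func cancel()
}

// URLSessionDataTask already has cancel(), so it satisfies the protocol as-is.
extension URLSessionDataTask: Cancellable {}

final class CancellableClient {
    func request(_ url: URL, completion: @escaping (Data?) -> Void) -> Cancellable {
        let task = URLSession.shared.dataTask(with: url) { data, _, _ in
            completion(data)
        }
        task.resume()
        return task
    }
}

// Usage: keep the token and cancel it when, say, the view controller goes away.
// let token = client.request(artworkURL) { data in /* ... */ }
// token.cancel()
```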

@orta

Member

orta commented Aug 6, 2014

Ideally, in this use case most of our requests should be cached. We could try having an API layer that insta-returns cached data synchronously, then makes the network API call, updates the objects, and then fires the async callback. My gut impression is that this is how they do it in the Tumblr app.
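A rough sketch of that cache-then-network idea, assuming a simple in-memory cache keyed by URL (purely hypothetical, not an existing layer):

```swift
// Cache-then-network sketch: return cached data synchronously right away,
// then refresh from the network and call back a second time with fresh data.
import Foundation

final class CachedClient {
    private var cache: [URL: Data] = [:]   // assumed main-thread access

    func fetch(_ url: URL, update: @escaping (Data) -> Void) {
        // 1. Insta-return whatever we already have, synchronously.
        if let cached = cache[url] {
            update(cached)
        }
        // 2. Then hit the network and deliver the updated data asynchronously.
        URLSession.shared.dataTask(with: url) { [weak self] data, _, _ in
            guard let data = data else { return }
            DispatchQueue.main.async {
                self?.cache[url] = data
                update(data)
            }
        }.resume()
    }
}
```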

@shepting

shepting commented Aug 10, 2014

Another great feature would be:

  • Don't emit a second network call if a previous one for the same resource is already "in flight"
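A hypothetical sketch of what that de-duplication could look like: piggyback new callers onto the request that's already running instead of issuing a second call.

```swift
// In-flight de-duplication sketch: one network call per resource, with every
// waiting caller's callback queued against it.
import Foundation

final class DeduplicatingClient {
    private var inFlight: [URL: [(Data?) -> Void]] = [:]   // assumed main-thread access

    func fetch(_ url: URL, completion: @escaping (Data?) -> Void) {
        if inFlight[url] != nil {
            // Already in flight: just queue our callback on the existing request.
            inFlight[url]?.append(completion)
            return
        }
        inFlight[url] = [completion]

        URLSession.shared.dataTask(with: url) { [weak self] data, _, _ in
            DispatchQueue.main.async {
                // Fan the single response out to every waiting caller.
                let callbacks = self?.inFlight.removeValue(forKey: url) ?? []
                callbacks.forEach { $0(data) }
            }
        }.resume()
    }
}
```
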
@segiddins

Contributor

segiddins commented Aug 10, 2014

@orta best bet might be to separate the networking layer from your persistence layer and use something like NSFetchedResultsController for that binding goodness
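A loose sketch of that separation, where the networking layer only writes into Core Data and view controllers observe the store through NSFetchedResultsController. The "Artwork" entity and "createdAt" attribute are made-up examples.

```swift
// The UI binds to Core Data; the networking layer just saves parsed objects
// into the context after each response and never talks to the UI directly.
import CoreData

func makeArtworksController(context: NSManagedObjectContext) -> NSFetchedResultsController<NSFetchRequestResult> {
    let request = NSFetchRequest<NSFetchRequestResult>(entityName: "Artwork")
    // A fetched results controller requires at least one sort descriptor.
    request.sortDescriptors = [NSSortDescriptor(key: "createdAt", ascending: false)]
    return NSFetchedResultsController(fetchRequest: request,
                                      managedObjectContext: context,
                                      sectionNameKeyPath: nil,
                                      cacheName: nil)
}

// A view controller would set itself as the controller's delegate and reload
// on change notifications.
```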

@orta

Member

orta commented Aug 11, 2014

I agree with both (that's a smart one @shepting)

@ashfurrow

Member

ashfurrow commented Aug 17, 2014

(Working on something here)

@orta

Member

orta commented Sep 3, 2014

Closing; the responsibilities for this issue now lie in:
[screenshot: screen shot 2014-09-01 at 10 51 36 am]

@orta orta closed this Sep 3, 2014
