Update Docs for Scheduler refactor (DispatchTimeInterval)

freak4pc authored and kzaher committed Apr 16, 2019
1 parent c8a545a commit ac20a268a5a146657b2a70d7d3aabd4e6c833ea1
@@ -1,19 +1,20 @@
## Comparison with ReactiveCocoa
## Comparison with ReactiveSwift

RxSwift is somewhat similar to ReactiveCocoa since ReactiveCocoa borrows a large number of concepts from Rx.
RxSwift is somewhat similar to ReactiveSwift since ReactiveSwift borrows a large number of concepts from Rx.

One of the main goals of this project was to create a significantly simpler interface that is more aligned with other Rx implementations, offers a richer concurrency model, opens up more optimization opportunities, and integrates better with Swift's built-in error handling mechanisms.
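
As a quick sketch of that last point (the `ParsingError` type below is just an illustrative stand-in), operators such as `map` accept throwing closures, so a plain Swift `throw` flows straight into a sequence's error channel:

```swift
import RxSwift

// Illustrative error type; any Swift `Error` works here.
enum ParsingError: Error {
    case notANumber(String)
}

// `map` takes a throwing closure, so native Swift error handling
// terminates the sequence through its error channel.
let numbers = Observable.of("1", "2", "oops")
    .map { text -> Int in
        guard let value = Int(text) else { throw ParsingError.notANumber(text) }
        return value
    }

_ = numbers.subscribe(
    onNext: { print("parsed \($0)") },
    onError: { print("failed with \($0)") }
)
```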

We've also decided to rely only on the Swift/LLVM compiler and not introduce any external dependencies.

Probably the main difference between these projects is in their approach to building abstractions.

The main goal of RxSwift project is to provide environment agnostic compositional computation glue abstracted in the form of observable sequences.
The main goal of the RxSwift project is to provide environment-agnostic compositional computation glue abstracted in the form of observable sequences.

We then aim to improve the experience of using RxSwift on specific platforms. To do this, RxCocoa uses generic computations to build more practical abstractions that wrap the Foundation/Cocoa/UIKit frameworks. That means that other libraries give context and semantics to the generic computation engine RxSwift provides, such as `Driver`, `Signal`, `ControlProperty`, `ControlEvent` and more.
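
For example (a hedged sketch; `usernameField` and `greetingLabel` are made-up views standing in for real outlets), a `ControlProperty` from RxCocoa can be lifted into a `Driver`, which layers main-thread delivery, error-free semantics and side-effect sharing on top of the plain observable sequence:

```swift
import UIKit
import RxSwift
import RxCocoa

// Illustrative views; in a real app these would be outlets.
let usernameField = UITextField()
let greetingLabel = UILabel()

// `rx.text` is a ControlProperty that RxCocoa builds on top of UITextField.
// Converting it to a Driver guarantees no errors, delivery on the main
// thread, and sharing of side effects between subscribers.
let username: Driver<String> = usernameField.rx.text.orEmpty
    .asDriver()
    .distinctUntilChanged()

// `drive` keeps the whole binding on the main thread.
let binding = username
    .map { "Hello, \($0)" }
    .drive(greetingLabel.rx.text)
```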

One of the benefits of representing all of these abstractions as a single concept, _observable sequences_, is that all computation abstractions built on top of them are also composable in the same fundamental way. They all follow the same contract and implement the same interface.
It is also easy to create flexible subscription (resource) sharing strategies or use one of the built-in ones: `share`, `publish`, `multicast` ...
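
As an illustration (a minimal sketch, not taken verbatim from the library's docs), `share(replay:)` turns a cold sequence into one whose underlying work is shared among its subscribers:

```swift
import RxSwift

// A cold timer: without sharing, every subscriber would start its own timer.
let ticks = Observable<Int>.interval(.seconds(1), scheduler: MainScheduler.instance)
    .share(replay: 1)   // one underlying timer; the latest tick is replayed to newcomers

// Both subscriptions observe the same shared timer.
let first = ticks.subscribe(onNext: { print("first: \($0)") })
let second = ticks.subscribe(onNext: { print("second: \($0)") })
```

Without the `share(replay: 1)` line, each `subscribe` call would start its own independent timer.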

This library also offers a fine-tunable concurrency model. If concurrent schedulers are used, observable sequence operators will preserve sequence properties. The same observable sequence operators will also know how to detect and optimally use known serial schedulers. ReactiveCocoa has a more limited concurrency model and only allows serial schedulers.
This library also offers a fine-tunable concurrency model. If concurrent schedulers are used, observable sequence operators will preserve sequence properties. The same observable sequence operators will also know how to detect and optimally use known serial schedulers. ReactiveSwift has a more limited concurrency model and only allows serial schedulers.
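
A rough sketch of what that looks like in practice (scheduler choices and values here are arbitrary): heavy work can run on a concurrent scheduler while a serial scheduler funnels results back, and the operators preserve the sequence grammar in both cases:

```swift
import RxSwift

// Work may be scheduled on several threads of a background queue...
let concurrentScheduler = ConcurrentDispatchQueueScheduler(qos: .background)

// ...while this scheduler delivers events one at a time, in order.
let serialScheduler = SerialDispatchQueueScheduler(qos: .default)

let subscription = Observable.of(1, 2, 3)
    .observeOn(concurrentScheduler)   // move the expensive mapping off the main thread
    .map { $0 * 10 }
    .observeOn(serialScheduler)       // funnel results through a serial queue
    .subscribe(onNext: { print($0) })
```

Swapping either scheduler for the other would not change the sequence grammar, only where the work runs.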

Multithreaded programming is really hard, and detecting non-trivial loops is even harder. That's why all operators are built in a fault-tolerant way. Even if element generation occurs during element processing (recursion), operators will try to handle that situation and prevent deadlocks. This means that in the worst case a programming error will cause a stack overflow, but users won't have to manually kill the app, and you will get a crash report in your error reporting system so you can find and fix the problem.
@@ -4,7 +4,7 @@
To run the example app:

* Open `Rx.xcworkspace`
* Choose one of example schemes (RxExample-iOS, RxExample-OSX) and hit `Run`.
* Choose one of the example schemes (RxExample-iOS, RxExample-macOS) and hit `Run`.

You can also run the example app using CocoaPods.

@@ -116,10 +116,10 @@ enum Availability {
var message: String {
switch self {
case .available(message: let message),
.taken(message: let message),
.invalid(message: let message),
.pending(message: let message):
case .available(let message),
.taken(let message),
.invalid(let message),
.pending(let message):
return message
}
@@ -114,7 +114,7 @@ Here is an example with the `interval` operator.

```swift
let scheduler = SerialDispatchQueueScheduler(qos: .default)
let subscription = Observable<Int>.interval(0.3, scheduler: scheduler)
let subscription = Observable<Int>.interval(.milliseconds(300), scheduler: scheduler)
.subscribe { event in
print(event)
}
@@ -157,7 +157,7 @@ A few more examples just to be sure (`observeOn` is explained [here](Schedulers.
In case we have something like:

```swift
let subscription = Observable<Int>.interval(0.3, scheduler: scheduler)
let subscription = Observable<Int>.interval(.milliseconds(300), scheduler: scheduler)
.observeOn(MainScheduler.instance)
.subscribe { event in
print(event)
@@ -174,7 +174,7 @@ subscription.dispose() // called from main thread
Also, in this case:

```swift
let subscription = Observable<Int>.interval(0.3, scheduler: scheduler)
let subscription = Observable<Int>.interval(.milliseconds(300), scheduler: scheduler)
.observeOn(serialScheduler)
.subscribe { event in
print(event)
@@ -375,7 +375,7 @@ Ok, now something more interesting. Let's create that `interval` operator that w
*This is equivalent to the actual implementation for dispatch queue schedulers*

```swift
func myInterval(_ interval: TimeInterval) -> Observable<Int> {
func myInterval(_ interval: DispatchTimeInterval) -> Observable<Int> {
return Observable.create { observer in
print("Subscribed")
let timer = DispatchSource.makeTimerSource(queue: DispatchQueue.global())
@@ -402,7 +402,7 @@ func myInterval(_ interval: TimeInterval) -> Observable<Int> {
```

```swift
let counter = myInterval(0.1)
let counter = myInterval(.milliseconds(100))
print("Started ----")
@@ -435,7 +435,7 @@ Ended ----
What if you were to write

```swift
let counter = myInterval(0.1)
let counter = myInterval(.milliseconds(100))
print("Started ----")
@@ -499,7 +499,7 @@ There are two things that need to be defined.
The usual choice is a combination of `replay(1).refCount()`, aka `share(replay: 1)`.

```swift
let counter = myInterval(0.1)
let counter = myInterval(.milliseconds(100))
.share(replay: 1)
print("Started ----")
@@ -634,7 +634,7 @@ extension ObservableType {
So now you can use your own map:

```swift
let subscription = myInterval(0.1)
let subscription = myInterval(.milliseconds(100))
.myMap { e in
return "This is simply \(e)"
}
@@ -779,7 +779,7 @@ Using debugger alone is useful, but usually using `debug` operator will be more
`debug` acts like a probe. Here is an example of using it:

```swift
let subscription = myInterval(0.1)
let subscription = myInterval(.milliseconds(100))
.debug("my probe")
.map { e in
return "This is simply \(e)"
@@ -858,7 +858,7 @@ In case you want to have some resource leak detection logic, the simplest method
/* add somewhere in
func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplicationLaunchOptionsKey : Any]? = nil)
*/
_ = Observable<Int>.interval(1, scheduler: MainScheduler.instance)
_ = Observable<Int>.interval(.seconds(1), scheduler: MainScheduler.instance)
.subscribe(onNext: { _ in
print("Resource count \(RxSwift.Resources.total)")
})
@@ -988,8 +988,8 @@ Let's say you have something like this:

```swift
let searchResults = searchText
.throttle(0.3, $.mainScheduler)
.distinctUntilChanged
.throttle(.milliseconds(300), scheduler: MainScheduler.instance)
.distinctUntilChanged()
.flatMapLatest { query in
API.getSearchResults(query)
.retry(3)
@@ -3,6 +3,6 @@
To use playgrounds:

* Open `Rx.xcworkspace`
* Build the `RxSwift-macOS` scheme
* Build the `RxSwift` scheme on `My Mac`.
* Open `Rx` playground in the `Rx.xcworkspace` tree view.
* Choose `View > Debug Area > Show Debug Area`
@@ -282,7 +282,7 @@ This is a typical beginner example.

```swift
let results = query.rx.text
.throttle(0.3, scheduler: MainScheduler.instance)
.throttle(.milliseconds(300), scheduler: MainScheduler.instance)
.flatMapLatest { query in
fetchAutoCompleteItems(query)
}
@@ -313,7 +313,7 @@ A more appropriate version of the code would look like this:

```swift
let results = query.rx.text
.throttle(0.3, scheduler: MainScheduler.instance)
.throttle(.milliseconds(300), scheduler: MainScheduler.instance)
.flatMapLatest { query in
fetchAutoCompleteItems(query)
.observeOn(MainScheduler.instance) // results are returned on MainScheduler
@@ -340,7 +340,7 @@ The following code looks almost the same:

```swift
let results = query.rx.text.asDriver() // This converts a normal sequence into a `Driver` sequence.
.throttle(0.3, scheduler: MainScheduler.instance)
.throttle(.milliseconds(300), scheduler: MainScheduler.instance)
.flatMapLatest { query in
fetchAutoCompleteItems(query)
.asDriver(onErrorJustReturn: []) // Builder just needs info about what to return in case of error.
@@ -138,7 +138,7 @@ Writing all of this and properly testing it would be tedious. This is that same

```swift
searchTextField.rx.text
.throttle(0.3, scheduler: MainScheduler.instance)
.throttle(.milliseconds(300), scheduler: MainScheduler.instance)
.distinctUntilChanged()
.flatMapLatest { query in
API.getSearchResults(query)
@@ -169,7 +169,7 @@ This is how we can do it using Rx:
```swift
// this is a conceptual solution
let imageSubscription = imageURLs
.throttle(0.2, scheduler: MainScheduler.instance)
.throttle(.milliseconds(200), scheduler: MainScheduler.instance)
.flatMapLatest { imageURL in
API.fetchImage(imageURL)
}
@@ -50,7 +50,7 @@ KVO observing, async operations and streams are all unified under [abstraction o

###### ... understand the structure

RxSwift is comprised of five separate components depending on eachother in the following way:
RxSwift comprises five separate components depending on each other in the following way:

```none
┌──────────────┐ ┌──────────────┐
@@ -95,7 +95,7 @@ RxSwift is comprised of five separate components depending on eachother in the f
<tr>
<td><div class="highlight highlight-source-swift"><pre>
let searchResults = searchBar.rx.text.orEmpty
.throttle(0.3, scheduler: MainScheduler.instance)
.throttle(.milliseconds(300), scheduler: MainScheduler.instance)
.distinctUntilChanged()
.flatMapLatest { query -> Observable&lt;[Repository]&gt; in
if query.isEmpty {
