Tutorials for Orleans 2.0 #4310

Open
mehmetakbulut opened this Issue Mar 27, 2018 · 9 comments

@mehmetakbulut
Contributor

mehmetakbulut commented Mar 27, 2018

This issue is intended to start a discussion on how the tutorials for Orleans 2.0 (and beyond) should be developed. As we come up with tangible items to act on, we can update this post to track progress on them.

@mehmetakbulut


Contributor

mehmetakbulut commented Mar 27, 2018

Here are a few impressions I have from using the 1.x tutorials last year:

  • Confusing Order. The tutorials appear to be intended for sequential consumption, but the order in which they are presented doesn't really suit the needs of someone learning from scratch, particularly the first 3-4 tutorials.
  • Too Verbose. The tutorials are well written but suffer slightly from verbosity. We should aim to summarize as much as possible and refer the reader to other sections for detailed information.
  • Reliance on Third Parties. Showing how Orleans interfaces with third parties is valuable but detracts from the general learning goals. For example, the Declarative Persistence tutorial depends on Azure, which is great if the reader's target environment is Azure but a distraction for others. We should either write tutorials with no reliance on third-party services/software (i.e. the reader also learns how to substitute alternatives) and/or provide alternate versions of the tutorial for alternative services/software.
@sergeybykov


Member

sergeybykov commented Mar 27, 2018

@mehmetakbulut Thanks for raising this. These are some of the reasons we didn't want to simply copy and tweak the 1.5 tutorials. Most of them were written for the original OSS release more than 3 years ago.

I think it would be much better to step back and rethink how they should be organized, now that we have extensive community experience of what helps, what confuses, and what's missing.

For example, the Declarative Persistence tutorial depends on Azure.

The tutorial shouldn't depend on Azure. We need to make it clearer that there's a variety of providers to choose from. At the same time, I suspect that for the majority of people coming to Orleans (C# developers), Azure and Azure Storage are familiar, and I think using Azure Tables and Blobs as the initial examples can actually help.
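As a rough illustration of that direction (a hedged sketch, not an agreed design: the extension method names below come from the Orleans 2.x hosting and Azure persistence packages and may differ slightly between releases, and "profileStore" is a made-up provider name), a tutorial could keep the grain code identical and only swap one silo configuration call between a local in-memory provider and Azure Table storage:

```csharp
using Orleans.Hosting;

public static class PersistenceConfigurationSketch
{
    // Local development: no external services required.
    public static ISiloHostBuilder WithMemoryStorage(ISiloHostBuilder builder) =>
        builder.AddMemoryGrainStorage("profileStore");

    // Azure alternative (Microsoft.Orleans.Persistence.AzureStorage); grain code is unchanged,
    // since both variants register the same named storage provider.
    public static ISiloHostBuilder WithAzureTableStorage(ISiloHostBuilder builder, string connectionString) =>
        builder.AddAzureTableGrainStorage("profileStore",
            options => options.ConnectionString = connectionString);
}
```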

@sergeybykov sergeybykov added this to the 2.0.0-docs-tests milestone Mar 27, 2018

@jason-bragg


Contributor

jason-bragg commented Mar 27, 2018

I don’t have the experience of learning Orleans via the tutorials but will contribute some thoughts on this subject as a maintainer. I advocate we pay special attention to the input of those who have attempted to learn Orleans via the tutorials, more so than the input from maintainers like me.

It is important to have materials that help developers ramp up on Orleans and use it successfully in their environments. Tutorials help with this, but developing and maintaining them is time-consuming because of the wide range of features and capabilities, as well as the churn that necessitates regular updates.

The Orleans core team is small, and effort spent on developing and maintaining tutorials is always competing with other efforts (features, technical debt). This often leads to our tutorials becoming out of date, which diminishes their value and can leave new users with a negative impression of Orleans.

As an alternative to maintaining a set of tutorials and sample applications, I ask that the team and community consider maintaining a set of tests which can serve as tutorials. The advantages of this approach are:

  • We need to write tests for all features and capabilities anyway, so the overhead of making a subset of those more readable is significantly less than the cost of writing tutorials. This reduced cost means we can cover a much wider range of features.
  • Since the examples are also tests, users can prototype and play with the features within the existing test harnesses while learning, prior to setting up a service of their own.
  • Since our tests are run during CI and nightly, having tests double as tutorials and samples means they will be kept up to date, at a cost we're already paying to keep all our tests passing.
  • If users encounter issues using a feature, having working tests that demonstrate successful use of that feature can help them identify the problem by contrasting their code with the tests. Where incomplete test coverage has missed bugs, this process can even lead to users finding bugs in our code and contributing tests that reproduce them.

The above suggestion does have some downsides.

  • We’d need to update our test cluster such that tests (at least those that serve as examples/tutorials) can very closely resemble what a user’s service would look like.
  • We’d need to make branches for each release, so users could pull down the branch specific to the version of Orleans they are using to see the tests for that branch.
  • Documentation links pointing to example/tutorial tests may need to be updated per release.

While I understand that what I'm suggesting is not common practice, I'd note that most common practices are geared towards software for sale. I suggest that in an open source project, where the cost of the software is community involvement, it is not sufficient to ramp users up only on the use of the software; we also need to ramp them up on how to test and contribute. By reducing the cost of maintaining our tutorials, we reduce the maintenance cost for the community as well. By using our tests as examples, we surface our testing, making it more transparent and accessible to the community.
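To make that concrete, here is a minimal sketch of what a tutorial-style test could look like, assuming the Orleans 2.x Orleans.TestingHost API, xUnit, and a hypothetical IGreeterGrain defined in the test project:

```csharp
using System.Threading.Tasks;
using Orleans.TestingHost;
using Xunit;

public class GreeterTutorialTest
{
    [Fact]
    public async Task Grain_returns_a_greeting()
    {
        // 1. Start an in-process cluster, the same way a "run it locally" tutorial step would.
        var cluster = new TestClusterBuilder().Build();
        cluster.Deploy();
        try
        {
            // 2. Resolve a grain reference and call it, exactly as application code does.
            var greeter = cluster.GrainFactory.GetGrain<IGreeterGrain>(0);
            var reply = await greeter.SayHello("Orleans");

            // 3. Assert on observable behavior, so the example also acts as a regression test.
            Assert.Equal("Hello, Orleans!", reply);
        }
        finally
        {
            cluster.StopAllSilos();
        }
    }
}
```

Each numbered step could map to a paragraph of tutorial prose, while the test keeps running in CI, which is the maintenance advantage described above.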

@mehmetakbulut


Contributor

mehmetakbulut commented Mar 28, 2018

I think some tutorials are helpful (and at the end of the day, we might end up with more 2.0 tutorials than we had for 1.x), but looking back, I picked up most of my "practical Orleans skills" by browsing the samples and tests directories in the Orleans repo. So I agree with Jason's point.

We’d need to make branches for each release, so users could pull down the branch specific to the version of Orleans they are using to see the tests for that branch. Documentation links pointing to example/tutorial tests may need be updated per release.

I imagine using tags rather than branches, or at the very least having a "latest" branch that is kept up to date with the latest Orleans release, so documentation just points to that branch rather than being updated per release.

Has any consideration been given to maintaining a generated API reference? Being able to just look at what is available can be very effective for figuring out how to accomplish a task. Relying on autocomplete to suggest what you may do (or having no IDE at all) and browsing the source code are sometimes too slow and hard to interpret without context. I have been building a docfx-generated reference for the Orleans codebase for my own use and just pushed it to a repo. For example, I can see here that when writing grain code I have access to certain methods and properties, and also that there are grains with persisted state that offer additional functionality.
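As an example of what such a reference makes discoverable at a glance (a hedged illustration: the state and interface types here are made up, and the member names are from the Orleans 2.x base classes as I understand them), a grain deriving from Grain&lt;TState&gt; gets persisted-state members that a plain Grain does not have:

```csharp
using System.Threading.Tasks;
using Orleans;
using Orleans.Providers;

public class CounterState { public int Value { get; set; } }

public interface ICounterGrain : IGrainWithIntegerKey
{
    Task Increment();
}

[StorageProvider(ProviderName = "profileStore")]   // name of a storage provider configured on the silo
public class CounterGrain : Grain<CounterState>, ICounterGrain
{
    public async Task Increment()
    {
        // Members inherited from Grain<TState> that an API reference surfaces immediately:
        State.Value += 1;          // the persisted state object
        await WriteStateAsync();   // write it back through the configured storage provider
    }
}
```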

@ReubenBond


Contributor

ReubenBond commented Mar 28, 2018

DocFx has a way to reference code from your pages, so we can include snippets from samples/tests in documentation pages. See: https://dotnet.github.io/docfx/spec/docfx_flavored_markdown.html?tabs=tabid-1%2Ctabid-a#code-snippet

Thankfully we are already using tags
[screenshot of the repository's existing release tags]

A generated API reference is a good idea. At the moment, our docs live in a separate (entirely disjoint) branch, so generating API reference docs is a little trickier. I've been advocating moving the docs under the main tree, in a docs folder, but I believe it's an unpopular idea. Moving them would also make it easier to include snippets from the source tree. We could add a comment above and below the snippet source code to say "This is used as a snippet, so keep that in mind when editing this file. Referenced from /docs/SomeDoc.md".
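For illustration (a hedged sketch: the class, method, and region names are invented, and the docs path is the example mentioned above), the source side of such a snippet might look like this, so anyone editing the file knows it is rendered in a docs page:

```csharp
using Orleans.Hosting;

public static class GettingStartedSnippets
{
    // NOTE: the region below is pulled into /docs/SomeDoc.md via a DocFx code-snippet
    // reference. Keep it compiling and readable on its own when editing this file.
    #region configure_local_silo
    public static ISiloHost BuildLocalSilo() =>
        new SiloHostBuilder()
            .UseLocalhostClustering()   // single-node cluster for the getting-started page
            .Build();
    #endregion
}
```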

@mehmetakbulut


Contributor

mehmetakbulut commented Mar 28, 2018

Moving docs into master is one way, but what if the latest release were copied into the gh-pages branch instead?

@jason-bragg


Contributor

jason-bragg commented Mar 28, 2018

@mehmetakbulut I am in favor of having a generated API reference.

In the short run, it shouldn't be hard to generate an API reference and link to it from the docs. Most generators I've seen can also pull in full doc pages, so my tendency is to favor using the generator for the docs as a whole, not just the API reference. In the past I've done this by keeping doc pages in with the source and maintaining clean code comments. With that, docs can be maintained alongside the code with relatively little effort and are less likely to get out of date.
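A small, hedged example of the kind of clean code comments that approach leans on (the interface is hypothetical; generators such as docfx consume standard C# XML doc comments of this shape):

```csharp
using System.Threading.Tasks;
using Orleans;

/// <summary>
/// Grain that produces greetings. (Hypothetical; shown only to illustrate the doc-comment style.)
/// </summary>
public interface IGreeterGrain : IGrainWithIntegerKey
{
    /// <summary>Returns a greeting for the supplied <paramref name="name"/>.</summary>
    /// <param name="name">The name to greet.</param>
    /// <returns>A greeting such as "Hello, Ada!".</returns>
    Task<string> SayHello(string name);
}
```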

Regarding tags versus branches: tags should work, and I don't have a strong opinion. I mainly meant to call out that if the samples/tutorials were part of the tests, we'd need to figure out how to make them available via the docs and to users running older versions of Orleans. Not saying these are hard problems to solve. :)

@mehmetakbulut


Contributor

mehmetakbulut commented Mar 29, 2018

I summarized below some of the ideas we discussed along with a few questions/uncertainties associated with these ideas.

  • Develop documentation & tutorials for consumption by new users while minimizing burden on core team
    • What tutorials should be developed/maintained in the immediate future?
  • Write tests so they can serve as tutorials and/or use case examples for the community
    • Do they live in the orleans/master or do we move them to a different branch and/or repo?
    • How do we make tests easier to interpret/utilize as tutorials?
    • Would there potentially need to be changes to the TestCluster code to realize this goal?
  • Set up a generated API reference
    • Do they live in the orleans/master or do we move them to a different branch and/or repo?
    • Do we merge them with the documentation already on the orleans/gh-pages?
    • What about something like https://readthedocs.org/ so different API versions are maintained automatically?

It would be great to hear if anybody has any points against implementing these things. Once we answer these questions, we can better determine which action items to tackle.

@sergeybykov sergeybykov modified the milestones: 2.0.0-docs-tests, 2.1.0 Aug 16, 2018

@sergeybykov sergeybykov added the P3 label Aug 20, 2018

@sergeybykov sergeybykov modified the milestones: 2.1.0, Backlog Aug 20, 2018

@rzubek


rzubek commented Oct 7, 2018

@sergeybykov @JillHeaden Just wanted to +1 this effort. As someone new to Orleans 2 (but familiar with actors), I find it impossible to get started on Orleans 2 given the current state of the tutorials. The general docs are well considered and informative, but there is a lot to this implementation that a new user just doesn't understand out of context. Unfortunately it doesn't seem like I can just follow the 1.5 tutorials, since they're deprecated and it appears both the API and the deployment details have changed.

Also, I would like to suggest that sample code or API references alone are not sufficient: a new user doesn't have the mental models to understand the code they're seeing, especially for something like a distributed programming library, which requires things to be set up just so in order to work right.

What would be best is sample code, introduced bit by bit, alongside a good narrative explanation of why the author is doing what they're doing and what the results or consequences are. (And before that, even the very basic steps, like how to set up a new project and test a "hello world" client/server app locally, are also crucial.)
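For context on how small that first local step can be (a hedged sketch based on my reading of the Orleans 2.x hosting APIs; the grain interface and class here are made up for illustration, and exact builder methods may vary by minor version), a single console app can host a silo and call a grain:

```csharp
using System;
using System.Threading.Tasks;
using Orleans;
using Orleans.Hosting;

// Hypothetical "hello world" grain used only for this sketch.
public interface IGreeterGrain : IGrainWithIntegerKey
{
    Task<string> SayHello(string name);
}

public class GreeterGrain : Grain, IGreeterGrain
{
    public Task<string> SayHello(string name) => Task.FromResult($"Hello, {name}!");
}

public static class HelloWorldProgram
{
    public static async Task Main()
    {
        // Host a single silo in-process; localhost clustering avoids any external membership store.
        var silo = new SiloHostBuilder()
            .UseLocalhostClustering()
            .ConfigureApplicationParts(parts =>
                parts.AddApplicationPart(typeof(GreeterGrain).Assembly).WithReferences())
            .Build();
        await silo.StartAsync();

        // Connect a client to that silo and make a grain call.
        var client = new ClientBuilder()
            .UseLocalhostClustering()
            .ConfigureApplicationParts(parts =>
                parts.AddApplicationPart(typeof(IGreeterGrain).Assembly).WithReferences())
            .Build();
        await client.Connect();

        var greeter = client.GetGrain<IGreeterGrain>(0);
        Console.WriteLine(await greeter.SayHello("world"));

        await client.Close();
        await silo.StopAsync();
    }
}
```

A tutorial could then introduce each of these pieces (grain interface, grain class, silo, client) one at a time, with the narrative explanation described above.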
