
Revisit starter implementation policy for nextercism #82

Open
Stargator opened this issue Dec 6, 2017 · 7 comments

Comments

@Stargator
Contributor

Stargator commented Dec 6, 2017

@exercism/dart

The next version of Exercism breaks exercises into core exercises and branch exercises. It will therefore be possible for users to complete exercises in many different orders.

In light of this, we should revisit our current policy regarding starter implementations, which "assumes" a fixed order of exercise completion, and decide whether/how to update it. Discussion should occur in this issue thread.

This discussion stems from exercism/discussions

Current policy for reference:

We currently just create the implementation file with an empty class most of the time. Should we stub other methods and classes? I think a few exercises also stub a method.

I know our intent was to have the user generate or implement each required piece that was missing until all that was left was for them to figure out the logic.

peterseng summed it up nicely with this:

In general, it seems stubs may be provided for at least two reasons:

  1. If the implementation file needs to be placed in a certain directory due to the language or build tool's project structure.
    • Having a stub file (empty or not) saves the student the busy-work of having to create the directory structure then create the file.
    • Note that this busy-work has been described as supremely annoying.
    • I think this is always a good reason to have a stub file.
  2. The stub file could possibly provide the expected signatures.
    • There is healthy debate about whether it is a useful learning experience for the student to figure out the expected signatures versus just busy-work.
    • As a middle ground, some tracks may decide to place the expected signatures in a comment in the test file, or in HINTS.md such that it becomes included in the README.
    • Some tracks choose to do this only for the first few exercises, to get the student started on the track, and then wean them off.
    • Of course, tracks that do not wish to do this can just include an empty stub file, which at least will cause the directory structure to be preserved.
    • For statically-typed languages, typically the entire test suite must type-check before it can be run. Therefore, to make the test suite runnable, tracks of statically-typed languages tend to choose one of the following options:
      • nothing (as a consequence, students must figure out the signatures from reading the tests and write them all before any work can be done on the first test)
      • provide stubs with the signatures: saves student the above work, but also removes that part of the learning process
      • use conditional compilation to make the compiler not perform type-checking for later tests. This is not possible in all languages.
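
To make the options above concrete, here is a minimal sketch of the two stub styles in Dart, assuming a hypothetical lib/hamming.dart exercise; the class name and signature are illustrative, not the track's actual API.

```dart
// Hypothetical lib/hamming.dart; name and signature are illustrative only.

// Option A: an empty stub whose only job is to pin down the file location.
// class Hamming {}

// Option B: a stub that also spells out the signature the tests expect,
// leaving only the logic to the student.
class Hamming {
  int distance(String first, String second) {
    // The student replaces this with the real implementation.
    throw UnimplementedError('Calculate the Hamming distance.');
  }
}
```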
@devkabiir
Contributor

devkabiir commented Dec 6, 2017

I like the idea of having stubs for basic/entry-level exercises, or when an exercise introduces completely new language-specific syntax concepts that a student can't easily figure out on their own without a lot of googling; in those cases a stub should be present. Otherwise, an empty file should suffice for all exercises.

This is how I think it should be.
The student starts from:

  1. Just a function stub (entry-level exercises where a class makes no sense, e.g. Hello World).
  2. A class with all method stubs (entry-level exercises with two or fewer methods).
  3. Only an empty class (exercises with multiple methods called in the test suite).
  4. Only an empty file, with hints in the README.
  5. Only an empty exercise.dart file.

After solving the first five or so exercises, we can expect students to go on their own without a stub. The test suites will provide enough information on how to proceed and what is required.
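
To make that ladder concrete, here is a rough sketch of what the first three levels could look like in Dart; the exercise names and signatures (hello, Leap, Clock) are only illustrative assumptions, and levels 4 and 5 are simply empty files.

```dart
// Level 1: a bare function stub for an entry-level exercise such as Hello World.
String hello() => throw UnimplementedError('Return the expected greeting.');

// Level 2: full method stubs for an entry-level exercise with two or fewer methods.
class Leap {
  bool isLeapYear(int year) => throw UnimplementedError();
}

// Level 3: only an empty class; the student derives the signatures from the tests.
class Clock {}
```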

@Stargator
Contributor Author

I like the thought of "hand holding" for the first couple of exercises. I also think that for any exercises we consider to be "core", we should consider what level of stubbing is appropriate.

I see "core" exercises as ways to present the language itself in general, but also key concepts within the language, of which I envision there can be a lot.

But these "core" exercises depend on whether there are specifications that introduce new concepts without being overly difficult.

@kabiir As for the 5 levels of stubbing you presented, maybe we can codify them into categories? Otherwise, I was thinking of tying them to a difficulty level, but that may lead to confusion or conflict later.

@devkabiir
Contributor

devkabiir commented Dec 6, 2017

@Stargator Key concepts also include introductions to syntactic sugar, which I find difficult to explain or introduce via test suites. But what I'm aiming for is a clever example in the exercise README or HINTS.md that intrigues a student into either exploring or using the concept.

By a clever example I mean not just serving a concept up on a silver platter, but making it a bit of a puzzle, so that discovering it intrigues the student even more.

So maybe after figuring out how to assign difficulties and deciding on the five stub levels, any subsequent exercises may/should contain such clever examples.

@Stargator
Contributor Author

I'm unsure about being "clever". Some concepts would have to be introduced via hints or suggestions, leaving it to the user to decide how to implement the solution.

Because no matter how clever a problem is, Dart is flexible enough that we can't assume how a solution will be developed.

For example, if we hint at or suggest using regular expressions in a "core" exercise, and that exercise unlocks one that explicitly depends on regular expressions (e.g. the values used in its test suite are regular expressions), then it would be up to the user to get up to speed.

The core exercise may hint at or suggest regular expressions, but we have to tread lightly about what kinds of concepts we can assume the user already knows.
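
As a purely illustrative example (the function name and pattern are made up, not taken from any existing exercise), a hint of this kind might simply point the student at Dart's RegExp class:

```dart
// Hypothetical hint material: extracting words with a case-insensitive RegExp.
final wordPattern = RegExp(r"[a-z0-9']+", caseSensitive: false);

Iterable<String> words(String phrase) =>
    wordPattern.allMatches(phrase).map((match) => match.group(0)!);
```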

Using the configlet tree command can show us what exercises depend on others, so we can try and build our exercises off that.

I'm starting to think we may come to a point where we need to either update the README, add pages to the wiki, or create new documentation about the track's exercises and how they build off each other.

@devkabiir
Contributor

Well, exercises don't necessarily cover all aspects of the Dart world, which means we'll probably need to come up with our own exercises covering topics such as:

  1. pub
  2. pubspec.yml
  3. probably more topics but I can't think of them right now.

Exercises explaining getting started with Dart: where you are supposed to use Dart, how you are supposed to use it, and so on...
Exercises mostly explain syntax and a couple of essential programming concepts, sometimes even computer science concepts like a binary search tree. But where does Dart come into all this?

Why would a student choose the Dart track on Exercism in the first place? It's probably because they heard of or saw AngularDart, Flutter, or some other framework/platform use of Dart, and how easy and fast it is to develop things with.

I think we need to think this through and design a plan/policy for how Dart is to be taught, not just in the sense of Exercism but also Dart as its own world.

Now, again, is it Exercism's job to teach every framework out there? No, I don't think so. So we have to draw a line for our track between what things from the Dart world we will introduce and what will be left for the student to explore.

@Stargator
Contributor Author

Is exercism about teaching a language or providing an environment where people are given a problem to solve?

Currently, Exercism itself doesn't have pages detailing each language's features. Each language track provides a cursory overview of the language, some steps for installing software needed to use the language, and then two pages with resources to dig deeper into the language.

Outside of that, each exercise for a language may or may not provide language-specific information, hints, or suggestions. The central point is that it is up to each track to decide whether or not to include language-specific information in the README for an exercise.

So I think it is a little bit of a stretch to say exercism teaches users about languages. Though again, the track's maintainers are given a lot of room to decide the granular details.

Even after reading descriptions of v2 of exercism, I'm not convinced it's about "teaching", but more about providing a collaborative environment where people can reach out to teams, mentors, and track maintainers, if the resources (websites, documents, etc) they have are not helpful or clear.

And if exercism is about teaching, then what does it mean to "teach"? Where is that fine line between providing just enough information to allow the person to solve the problem, giving them too much information, or solving the problem for them? That's not including language barriers, or understanding new concepts when they are phrased differently than the "teacher" might phrase them.

Summary:

  • Is exercism about teaching a language or providing an environment where people are given a problem to solve?
  • For the Dart track, what does it mean to "teach" Dart? Where is that fine line between providing just enough information to allow the person to solve the problem, giving them too much information, or solving the problem for them?

@jvarness
Contributor

Is exercism about teaching a language or providing an environment where people are given a problem to solve?

I would say it provides an environment where people are given problems to solve, which in turn facilitates learning a language. Mentorship helps with learning further techniques and mechanics that others may not have been aware of before.

The exercises themselves are not enough to completely learn a language though. They're a catalyst in my opinion.

For the Dart track, what does it mean to "teach" Dart? Where is that fine line between providing just enough information to allow the person to solve the problem, giving them too much information, or solving the problem for them?

I think that's more of a question for the folks who create the problem specifications and less about the Dart track. Either the specifications give them enough information or not.

I will say that since each exercise has tests associated with it, it's easy for us to stub out classes and methods for the students so that they at least have something to base their code on.

When we go to implement an exercise, the paradigms of Dart will need to be considered up-front. Do we have enough exercises that utilize get and set properties? Dart allows users to override operators, do we have exercises for that? Do any of our exercises challenge someone to create a generic type?
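
For reference, here is a small illustrative sketch of the three Dart features mentioned above; the class names and behavior are made up, not taken from any exercise.

```dart
// Illustrative only: getters/setters, operator overriding, and a generic type.
class Temperature {
  double _celsius;
  Temperature(this._celsius);

  // A get/set property pair converting to and from Fahrenheit.
  double get fahrenheit => _celsius * 9 / 5 + 32;
  set fahrenheit(double value) => _celsius = (value - 32) * 5 / 9;

  // Operator overriding.
  Temperature operator +(Temperature other) =>
      Temperature(_celsius + other._celsius);
}

// A generic type.
class Pair<T> {
  final T first;
  final T second;
  Pair(this.first, this.second);
}
```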

I would say that having stubs in each exercise with the expected signatures removes the need for people to look at the tests, which I think they should do. I think our current model of providing either a class or high-order function stub works pretty well. If folks want to figure out a different way to solve a problem, they should feel empowered to change the tests and their implementation to learn something new. If we generate an exercise and think it could fit an educational use case that we haven't implemented yet, we should feel empowered to alter it to our will.

Stargator removed this from the "Ready for v2 of Exercism" milestone on Nov 23, 2018