API decoupling #231

Closed
pascalduez opened this Issue Sep 28, 2014 · 78 comments

@pascalduez
Member

pascalduez commented Sep 28, 2014

Observations

The currently exposed API (api.js, and more generally the file.js methods) combines file-based operations (IO) with data operations (parsing, post-processing). This makes it really hard, if not impossible, to call those operations independently. While working on the plugins (namely grunt-sassdoc, gulp-sassdoc, broccoli-sassdoc) I felt a bit frustrated not being able to fully take advantage of those systems' strengths.

This is especially true for Gulp, whose concept revolves around streams. Broccoli uses file-based trees.
At some point Grunt will also implement streams and piping.

Apart from plugin integration, I also think SassDoc's API could be cleaner, easier to work with and more future-proof if we split things up.

Case study

SassDoc 1.* exposes the following API:

documentize(source, destination [, config])

All in one function.

parse(source) alias to fs.getData(source)

Parse a folder and return the data object.

What a system providing its own file operations API would actually need is something similar to file.process, plus the final steps of file.getData, followed by a call to the theme system.

Suggestions

We could think of a new granular API based on the different steps of the process.

resolveConfig
checkDest
refresh
read
parse
process
postProcess
theme
exit

Discussion

  • Does it make sense?
    After writing this I'm not sure anymore...

The true benefit of streams and build systems like Gulp is applying in-memory transformations to files and piping them along, so you don't have to save intermediary states to disk. In the case of SassDoc we don't modify anything, we just parse and generate a separate set of files.

So the only feature I could think of, is being able to include SassDoc in a tasks chain.

gulp.task('styles', function () {
  return gulp.src('scss/**/*.scss')
    .pipe(plumber())
    .pipe(sassdoc({
      dest: 'docs'
    }))
    .pipe(rubySass())
    .pipe(autoprefixer())
    .pipe(gulp.dest('assets/css'));
});

This should already be possible with the current plugin.
Although it implies opening/reading the input files twice.

  • What's the best way to implement this?
  • Are there any other applications or implications?

Refs #217

@FWeinb

Member

FWeinb commented Sep 28, 2014

I like the first point about splitting file based operations (IO) and data ones. But I don't like to maintain such a big public API.

Basically, what we could do is separate IO and data tasks, and design the API around that.

@pascalduez

Member

pascalduez commented Sep 28, 2014

But I don't like to maintain such a big public API.

Agree.
The suggestion part is just some sort of brainstorm thing.

Or all those steps don't need to be "public", just used internally.
One plugin could always require('sassdoc/file') at its own risk.

@HugoGiraudel

Member

HugoGiraudel commented Sep 29, 2014

Maybe we could have @valeriangalliat's opinion?

@valeriangalliat

Member

valeriangalliat commented Sep 29, 2014

I'm totally in favor of a refactoring, and a proper separation of concerns in the API.

But @pascalduez, how do you "stream" a directory? Gulp will, for example, open a set of files in memory and only work on buffers. SassDoc would need an array of file buffers instead of an input directory, so the logic would be abstracted.

If so, the so-called "buffers" should implement Stream, and not be a simple string, so we're not required to load everything in memory when not needed.

To develop your list, I'd see something like this:

  • resolveConfig(String path = null) : Object Get the configuration object from path (or default finder).
  • checkDest(String dest) Reject the promise if dangerous overwrite.
  • refresh(String dest) Wipe the destination folder.
  • read(src) : Array<Stream> Get all the file streams from source folder.
  • parse(Array<Stream> buffers) : Object Parse a set of file streams and get raw SassDoc data.
  • process(Object ctx) Process the raw SassDoc data to resolve cross references and stuff.
  • postProcess I don't see the difference with process.
  • theme(String dest, Object ctx) Render the theme in dest with given context.
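
For illustration, a rough sketch of how such a decoupled, promise-based API could be chained by a consumer; the names and signatures are the hypothetical ones from the list above, not an existing SassDoc API:

// Hypothetical usage of the proposed granular API (names from the list above).
var sassdoc = require('sassdoc');

var config = sassdoc.resolveConfig(); // Object, from the default finder (would be threaded through the calls below)

sassdoc.checkDest('docs')                                   // rejects the promise on dangerous overwrite
  .then(function () { return sassdoc.refresh('docs'); })    // wipe the destination folder
  .then(function () { return sassdoc.parse(sassdoc.read('scss')); }) // Array<Stream> -> raw data
  .then(function (data) { return sassdoc.process(data); })  // resolve cross references
  .then(function (ctx) { return sassdoc.theme('docs', ctx); })
  .catch(function (err) { console.error(err); });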

@HugoGiraudel HugoGiraudel added this to the 2.0 milestone Oct 5, 2014

@HugoGiraudel HugoGiraudel added Refacto and removed Opinions needed labels Oct 7, 2014

@HugoGiraudel

Member

HugoGiraudel commented Oct 11, 2014

We are now working on 2.0. What is the idea behind this issue please @SassDoc/owners?

@FWeinb FWeinb referenced this issue Oct 13, 2014

Closed

[Meta] Development SassDoc 2.0 #255

20 of 20 tasks complete
@HugoGiraudel

Member

HugoGiraudel commented Oct 18, 2014

Hey @SassDoc/owners. So who takes what and when?

@valeriangalliat

Member

valeriangalliat commented Nov 10, 2014

Okay, I'm finally working on this one.

First, I propose to convert everything to ES6, to enjoy all the awesome stuff coming with it. After looking into this, Google Traceur seems to be a great solution. It supports ES6 modules and generators, unlike some other alternatives, and that's what I like the most.

ES6 modules

Here's the main structure I suggest for the ES6 workflow: 198e0cd. Note that this is the first time I've done such a setup, so there may be a better way. I'm totally open to suggestions.

We have a main ES6 module src/sassdoc.js which exposes the public API. A makefile converts this module to CommonJS style in index.js, ensuring the Traceur runtime is properly loaded.
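
For reference, the generated index.js presumably boils down to something like the following; the exact paths are assumptions, the point is only that the Traceur runtime gets required before the compiled module:

// Sketch of the generated CommonJS entry point (paths are assumptions).
require('traceur/bin/traceur-runtime');
module.exports = require('./dist/sassdoc');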

Streaming

I've made a basic example that searches for src/**/*.js and pipes the parse function, which will, for each file, print the filename and the contents: 26555b5.
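
The gist of that example, assuming vinyl-fs and through2 (the actual commit may differ in the details), is roughly:

var vfs = require('vinyl-fs');
var through = require('through2');

vfs.src('src/**/*.js')
  .pipe(through.obj(function (file, enc, cb) {
    // For each file in the object stream, print its path and contents.
    console.log(file.path);
    console.log(file.contents.toString());
    cb(null, file); // pass the file along for further piping
  }));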

From this, we can easily call the actual parser for each file like we already do in file.js, compute the whole SassDoc data and render the theme.

What do you think guys?

@HugoGiraudel

Member

HugoGiraudel commented Nov 11, 2014

I am fully okay with moving to ES6, I think it is an excellent idea. More and more projects are written in ES6 rather than ES5, starting with AngularJS 2.0. Also, the You Don't Know Javascript book series from @getify is written in an ES6 environment. Let's do this.

Regarding streaming, I won't be able to help much on this one. I trust your judgement @valeriangalliat, so if you feel confident with this, you have my full support.

@valeriangalliat

Member

valeriangalliat commented Nov 11, 2014

9 files changed, 198 insertions(+), 506 deletions

I'm rewriting SassDoc guys…

Code gonna be damn shorter thanks to ES6, I like it so bad!

@valeriangalliat

Member

valeriangalliat commented Nov 11, 2014

Time's passing so fast; I'm far from finished, but I'll push my progress to the refacto branch before going to bed anyway.

BTW I saw @pascalduez on IRC, we're gonna sync on the refactoring in the next few days.

@valeriangalliat

Member

valeriangalliat commented Nov 11, 2014

I pushed a lot of commits tonight (that moment when I see I pushed commits on what I would have called "tomorrow"…). Now all the CLI, config and theme stuff is done. I also refactored the logger and utils.

I think I finally found a stable workflow with Traceur (though I would like to see how other Node.js projects using Traceur are doing). The only inconvenience is that we can't access require.resolve without explicitly calling $traceurRuntime.require.resolve, because Traceur creates custom require functions for each module during compilation without binding resolve and other static methods. I think that's acceptable, and the good thing is we don't need any static compilation (though we may look at static compilation for further optimization before publishing).

There's still a lot of work to do, but now we have to focus on the streaming/parsing parts, possibly with vinyl-fs.

@HugoGiraudel

Member

HugoGiraudel commented Nov 12, 2014

Two things to discuss:

  • should we be using arrow functions since they implicitly rebind this?
  • should we use backticks (`) when interpolating a variable within a string and single quotes (') when not?
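
For context, here is a minimal illustration of the this behaviour in question (arrow functions capture the enclosing this instead of getting their own; the logger object is purely hypothetical):

var logger = {
  prefix: '[sassdoc]',
  logAll: function (messages) {
    // Arrow function: this is captured lexically, so it still points to logger.
    messages.forEach(msg => console.log(this.prefix, msg));
  },
  logAllOldStyle: function (messages) {
    // A regular anonymous function gets its own this, hence the usual self dance.
    var self = this;
    messages.forEach(function (msg) {
      console.log(self.prefix, msg);
    });
  }
};

logger.logAll(['a', 'b']);         // "[sassdoc] a", "[sassdoc] b"
logger.logAllOldStyle(['a', 'b']); // same output
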
@valeriangalliat

Member

valeriangalliat commented Nov 12, 2014

Too bad there's no ES6 style guide for this kind of stuff.

I didn't consider the this binding when using arrow functions, and now that I think about it, I don't like this unneeded "side effect", even if we're not currently using this.

Though, arrow functions are really appreciated in a case like this. But the export default just below could use a standard anonymous function instead of an arrow function.

About the backticks, for now I use them only when I'm using string interpolation, otherwise I stick with '. But there's not a lot of cases where I would want ${} in a string without interpolation, so I would be comfortable with using backticks everywhere. The only issue could be about performance, like in PHP between single and double quotes, when it's recommended to use single quotes to avoid the overhead of searching for variables inside the string when not needed. But that should be quite negligible, and anyway we have poor performance because of runtime ES6 transpilation.

@valeriangalliat

Member

valeriangalliat commented Nov 12, 2014

Note that ` is definitely harder to type than ' or ", at least on a French keyboard, so I'd prefer to keep it only for string interpolation, not for general-case strings.

// This is okay
console.log(`Some message with ${string.interpolation}.`);

// This is wrong IMO
foo.on(`message`, bar);
@HugoGiraudel

Member

HugoGiraudel commented Nov 12, 2014

on a French keyboard

Non-Mac.

@valeriangalliat

Member

valeriangalliat commented Nov 12, 2014

Indeed.

Here's some progress on the main API file.

It's more or less the same as documentize, but in ES6, and with the different parts of the process clearly decoupled and exposed publicly so the build tools can easily use the core functions.

The parse function is built for streaming, meaning it should be piped to an object stream (like the one returned by vinyl-fs). When the stream is ended (flush callback), a promise is resolved so the main function can continue with the passed data.
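
In other words, the parse stream could look roughly like the following simplified sketch, assuming through2; parseFile is a stand-in for the real comment parser, not an actual SassDoc function:

var through = require('through2');

function parse() {
  var data = [];
  var resolveData;

  var stream = through.obj(
    function transform(file, enc, cb) {
      // Accumulate parsed items for each file flowing through the stream.
      data = data.concat(parseFile(file.contents.toString()));
      cb();
    },
    function flush(cb) {
      // The source stream has ended: resolve the promise with the whole data set.
      resolveData(data);
      cb();
    }
  );

  stream.promise = new Promise(function (resolve) {
    resolveData = resolve;
  });

  return stream;
}

// Usage: vfs.src('scss/**/*.scss').pipe(parse()).promise.then(function (data) { /* ... */ });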

I would like some feedback on this, since this may be the definitive API for 2.0… it would be great if it can fit the build tools' needs without requiring much logic to be duplicated. For example, I totally don't know how to handle the logger for build tools. Currently the logger is only internal to the SassDoc CLI and the default all-in-one function, and can be injected in the cfg functions.

Anyway, let's see what's remaining to do:

  • Port the annotations to ES6. 9795d9a
  • Port src/parser.js to ES6. e1aa536
  • Port src/sorter.js to ES6. 66257c4
  • Port src/convert.js to ES6 (@pascalduez is working on it), and adapt to the new exposed API. Since we now use a streaming filesystem, we could add the Sass to SCSS conversion as one simple pipe before the parser pipe. This would require calling sass-convert file by file instead of processing the whole directory at once, but since the parser pipe parses the files one by one too, I think this fits really well into the streaming workflow (and we wouldn't need a temporary directory at all).
  • Find a way to handle input directories (compared to file streams), see the transform function. We could handle this at the very beginning, by testing the glob pattern, and adding **/*.scss if it seems to target a directory, like @pascalduez suggested, and like they do here. Though, it feels kinda hacky to me (applying a pattern on a glob expression). We can also add a condition directly in the transform function; if it's a directory, recursively call parse for a **/*.scss glob in this directory, and resolve the final promise only when the recursive calls are resolved (and the data merged), but this seems not easy to do at first sight. 183ebff
  • Adapt the tests.
    • Port test/annotations. a3e679e
    • Rewrite test/data to make it more adaptable and informative in case of error (@HugoGiraudel is working on it). b8b5074 - It will now dump the data for a test Sass file and compare it to a JSON fixture. If it doesn't match, a diff is printed between the 2 indented JSON files (now it's easy to see what went wrong, and to adapt the fixture in case of API changes). The current fixture is wrong though, since I've built it while the parser post processor wasn't active (see data flatten task below).
    • While the previous task will handle most cases, port the toJSON test somewhere. After reflection I think it's already tested by the above once the post processor works again.
    • Port the tests for SCSS to Sass converter.
  • Maybe adapt (or remove) the Gruntfile? Some work to port stuff to a simple makefile. d77f75b a3e679e 283eda8 059ce9b
  • JSHint for ES6. d77f75b, 5fb959e, ac5f5fe and 4bc20e9 - Note that the .jshintrc is generated from a .jshintrc.yaml file because I prefer YAML syntax, and I can put comments in it, which are really useful. Just run make lint to lint; it takes care of regenerating the .jshintrc if needed.
  • Remove Q dependency and use only native promises. We might need our own denodeify function to help. 2aa31d0
  • Be sure src/file.js, src/api.js, src/sorter.js are ported and used or removed.
  • Flatten data for #239.
  • Remove config.view anywhere it's used since it's the same object now. SassDoc/sassdoc-theme-default@7f92528
  • Document API changes (the CLI has slightly changed, the public API too, with streams in particular).
  • Port the Grunt/Gulp/Broccoli plugins to use the new API.
  • Maybe provide a backward compatible documentize function (also for ease of use) as suggested by @pascalduez. 5adb692
  • Allow sassdoc.parse(src).then(function (data) { .. }).

Also, see some inspiration on streaming from another documentation tool. It looks like a really smooth pipeline, but I'm not sure this could be applied to SassDoc… it looks like in grock, all destination files are mapped to a source file, and that's why it works seamlessly with filesystem streaming/transforming.

@HugoGiraudel

Member

HugoGiraudel commented Nov 16, 2014

Is there any point in adapting the Gruntfile? If not, just skip it. It doesn't matter much.

@pascalduez

Member

pascalduez commented Nov 16, 2014

Is there any point in adapting the Gruntfile?

There's no point in porting it to ES6.
And maybe some cleaning could be done; we should isolate the used/useful tasks.
Not a priority though.

@pascalduez

Member

pascalduez commented Nov 19, 2014

Progress on the converter

The converter is piped before the filter/parser, so it will look like this:

// --- sassdoc.js ---
stream.read(src)
   .pipe(converter)
   .pipe(filter);

Which means, right now .sass files are being streamed and converted through sass-convert in memory, without hitting the file system in between. They are returned in memory as well so that the filter can access them.
The converter is looking for .sass files: src/**/*.sass

Which makes me think we could actually remove the --sass-convert option.
If a provided source folder contains .sass files, they will be converted prior to being parsed; if there are none, then nothing happens at the converter level.
What do you think @SassDoc/owners, shall we remove the option?

@HugoGiraudel

Member

HugoGiraudel commented Nov 19, 2014

If we can ditch this manual step to include it in the build process with no drawback, then let's do it.

@valeriangalliat

Member

valeriangalliat commented Nov 19, 2014

+1 to remove the --sass-convert option. Just hope some people don't have Sass code in non .sass files.

@pascalduez

Member

pascalduez commented Nov 19, 2014

Just hope some people don't have Sass code in non .sass files.

I wondered as well whether it's safe to guess file content based on file extension, but that's already what the sass-convert binary does in a way. It guesses the conversion based on file extension, unless you pass the --from --to options.

@HugoGiraudel

Member

HugoGiraudel commented Nov 19, 2014

+1 to remove the --sass-convert option. Just hope some people don't have Sass code in non .sass files.

Don't build on anecdotal evidence.

Let's convert all the time.

@pascalduez

Member

pascalduez commented Nov 19, 2014

The only downside I can see is that it will cost one more file system lookup (vinyl-fs.src()) for projects without a single .sass file. I don't have numbers, but I guess it's completely acceptable.

@valeriangalliat

Member

valeriangalliat commented Nov 19, 2014

Why one more? We can change the default directory pattern to **/*.{scss,sass} (I don't know if vinyl-fs supports this, but there's probably something similar), and we won't have an additional filesystem lookup.

If we pipe the convert filter anyway, and it alters only .sass files, there will be nearly no overhead for .scss-only projects, nor for .sass projects.
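
For what it's worth, vinyl-fs delegates globbing to node-glob, which handles brace expansion, so a pattern like this should work (a sketch reusing the converter/filter pipes from this thread):

var vfs = require('vinyl-fs');

// Pick up both syntaxes in a single pass; brace expansion is handled by the glob library.
vfs.src('src/**/*.{scss,sass}')
  .pipe(converter) // only touches .sass files
  .pipe(filter);   // parse and generate documentation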

@pascalduez

Member

pascalduez commented Nov 19, 2014

We can change the default directory pattern to **/*.{scss,sass}

That's exactly what I thought.
But the current code articulation makes it difficult. @valeriangalliat we need to talk :-)

@valeriangalliat

Member

valeriangalliat commented Nov 19, 2014

Keeping track of the IRC talk.

There's indeed a problem with the current code articulation and the need to convert Sass to SCSS: when recursing inside a directory, the potential previous pipes/filters are totally ignored. This would be a problem for Sass to SCSS conversion because the Sass files could not be parsed with this flow.

The proposed solution is to do something like this:

src('src')
  .pipe(recurseDir('**/*.{scss,sass}')) // When a directory is found, recurse with the given pattern and yield files
  .pipe(convert()) // Convert Sass to SCSS
  .pipe(filter); // Generate documentation
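
A rough sketch of what such a recurseDir transform could look like (a hypothetical helper, not the actual implementation):

var path = require('path');
var vfs = require('vinyl-fs');
var through = require('through2');

// Hypothetical: when a directory comes through, expand it with the given
// glob pattern and yield the resulting files instead.
function recurseDir(pattern) {
  return through.obj(function (file, enc, cb) {
    if (!(file.stat && file.stat.isDirectory())) {
      return cb(null, file); // regular file: pass it along untouched
    }

    var self = this;

    vfs.src(path.join(file.path, pattern))
      .on('data', function (f) { self.push(f); })
      .on('error', cb)
      .on('end', function () { cb(); });
  });
}
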
@pascalduez

Member

pascalduez commented Nov 22, 2014

Update: pushing the recurse code and finalizing the converter one.

Question: should we include an alias method for .documentize so that third parties using the v1.* API still work out of the box, or do we decide not to be backward compatible? I'm really unsure about this.

@valeriangalliat

Member

valeriangalliat commented Nov 22, 2014

Awesome @pascalduez!

Can we tick "port src/convert.js" here?

I don't see any drawback to supporting a simple documentize function hiding "low-level" considerations (mostly streaming/pipe stuff), and if it can make the API backward-compatible, it's even better.

But anyway, the whole thing can't be fully backward compatible, mostly due to the configuration and data interface changes.

@valeriangalliat

Member

valeriangalliat commented Nov 23, 2014

I'm done with the data interface updates (in extras, theme and the core)!

I added some tasks above.

@HugoGiraudel, I don't know what the current state of the documentation is, but there have been some changes to sassdoc-extras and the core SassDoc API. Can you prepare the website for 2.0?

@pascalduez, you know the Grunt/Gulp/Broccoli plugins better than anyone; can you update them, taking advantage of the new streaming capabilities?

@SassDoc/owners, as asked by @pascalduez above, do you think we should maintain the documentize function with the exact same behavior as in 1.0? There's technically no problem doing it, so I'm in favor.

Also about the Gruntfile, there's a specific issue here.

@pascalduez You're probably gonna have a lot of work with the plugins. I don't know what your progress is with the converter, but I can help or even take over this part.

@HugoGiraudel

Member

HugoGiraudel commented Nov 23, 2014

@HugoGiraudel, I don't know what the current state of the documentation is, but there have been some changes to sassdoc-extras and the core SassDoc API. Can you prepare the website for 2.0?

I can give it a look but I'm not sure what has been changed regarding the public API.

@valeriangalliat

Member

valeriangalliat commented Nov 23, 2014

Basically, look at the src/sassdoc.js file; everything exported is the public API.

Also, see src/cli.js for the CLI options; there have at least been the cp-style options, and the --sass-convert removal.

For sassdoc-extras, I removed the flat indexer and the eachItem helper (since it's a flat array, there's no need to flatten it, and we can just use forEach instead of eachItem).

Also, for the data interface, the view is now merged into the top-level ctx (also called config), since nearly everything in the view ended up copied/transformed into the top-level object (now it's transformed/defaulted in place).

@valeriangalliat

Member

valeriangalliat commented Nov 23, 2014

About the documentize function, I'm realizing that the default export of src/sassdoc.js has exactly the same signature… it's only a matter of export var documentize = sassdoc;.

@HugoGiraudel

Member

HugoGiraudel commented Nov 23, 2014

Then do it and tick that damn box. ;)

HugoGiraudel referenced this issue in SassDoc/sassdoc.github.io Nov 24, 2014

@HugoGiraudel

Member

HugoGiraudel commented Nov 24, 2014

Please don't tell me you're expecting people to write:

var sassdoc = require('sassdoc').default;
var Logger = require('sassdoc/src/logger').default;
var config = require('sassdoc/src/cfg').default;
var Converter = require('sassdoc/src/converter').default;
var Parser = require('sassdoc/src/parser').default;
var sort = require('sassdoc/src/sorter').default;
@pascalduez

Member

pascalduez commented Nov 24, 2014

Please don't tell me you're expecting people to write:

That would be for someone willing to use the API in a decoupled way (being able to provide their own stream of files).
For normal, all-in-one use it is:

var sassdoc = require('sassdoc').default;
sassdoc('src', 'dest', {});

Still, the .default is a bit sad.

@pascalduez pascalduez closed this Nov 24, 2014

@pascalduez pascalduez reopened this Nov 24, 2014

@HugoGiraudel

Member

HugoGiraudel commented Nov 24, 2014

Shit, I forgot about .default. Should I add it here SassDoc/sassdoc.github.io#49?

@valeriangalliat

Member

valeriangalliat commented Nov 24, 2014

@HugoGiraudel The above case is really an edge case, the regular user won't need SassDoc's logger, config processor, and internal default sort function.

In some cases the converter can be used to convert Sass to SCSS in a custom workflow (even without using SassDoc; come to think of it, it could totally be its own library: a streaming Sass to SCSS converter).

The raw parser could sometimes be used too when the sassdoc.parse function is not sufficient (when there's a real need of full control over streaming), but it's like 0.1% of the use cases IMO.

@valeriangalliat

Member

valeriangalliat commented Nov 24, 2014

And yes, we're stuck with this .default to mimic default ES6 export with Traceur…

@HugoGiraudel

Member

HugoGiraudel commented Nov 24, 2014

Are we going to export refresh? I feel like it has no point.

@valeriangalliat

Member

valeriangalliat commented Nov 24, 2014

It looks more internal than anything, people can use safe-wipe and mkdirp on their own I think.

@pascalduez

Member

pascalduez commented Nov 24, 2014

The Gulp plugin refactor will bring more concrete usage demos and highlight pitfalls pretty quickly.
But I'm afraid it will need to replicate a lot of boilerplate code from both cli and sassdoc.

but while I think of it it could totally be in its own library; a streaming Sass to SCSS converter

I'm not against the idea, it could make sense. But I would say let's first clean up and finish things.
Once again, the Gulp refactor might push in this direction.

@FWeinb

Member

FWeinb commented Nov 24, 2014

Just want to say that I really like the new API. Awesome work!

@pascalduez

Member

pascalduez commented Nov 24, 2014

Okay, so after thinking about it, I started extracting the Sass converter into its own npm module.
This obviously lightens the core, both code- and dependency-wise.
Also it will make more sense in a Gulp workflow.

var gulp = require('gulp');
var sassdoc = require('gulp-sassdoc');
var converter = require('sass-convert');

gulp.task('sassdoc', function () {
  return gulp
    .src('path/to/**/*.sass')
    .pipe(converter())
    .pipe(sassdoc({
      'dest': 'path/to/docs'
    }));
});
@valeriangalliat

Member

valeriangalliat commented Nov 24, 2014

I really like it.

@HugoGiraudel

Member

HugoGiraudel commented Nov 24, 2014

I suppose you could even slightly tweak the API so you can simply pass a reference to the function (e.g. .pipe(converter)) rather than calling it right away. What do you think?

@valeriangalliat

Member

valeriangalliat commented Nov 24, 2014

Not sure, it's always good to have the API ready to be passed a configuration object or any arguments. Plus it won't invoke through2 and create a garbage stream at the beginning of the script if the converter is never used.

@HugoGiraudel

Member

HugoGiraudel commented Nov 24, 2014

Okay

@pascalduez

Member

pascalduez commented Nov 24, 2014

At the risk of building a separate library, I will implement options to be able to convert in both directions:
sass >> scss and scss >> sass and even css >> scss, as I think the converter supports this.
So the lib should basically be a node/stream convenience wrapper around the Sass core binary.

converter({ from: 'sass', to: 'scss' }); 

Or something in this vein.

EDIT: well, it could also detect syntax from file extension. :trollface:

@pascalduez

Member

pascalduez commented Nov 24, 2014

Guys, does sass-convert as the library name, similar to the Sass binary, sound good to you?
Otherwise node-sass-convert.

@valeriangalliat

Member

valeriangalliat commented Nov 24, 2014

I vote for sass-convert.

@FWeinb

Member

FWeinb commented Nov 24, 2014

👍 for sass-convert

@pascalduez

Member

pascalduez commented Nov 24, 2014

Silly idea for later: convert the Ruby converter to JS so we don't have to do all the shitty binary tests.
http://opalrb.org

@valeriangalliat

Member

valeriangalliat commented Nov 24, 2014

Interesting.

@pascalduez

Member

pascalduez commented Nov 24, 2014

Hmm, I prefer sass-convert, but wouldn't it be clearer and more standard to prefix it with node?
https://github.com/sass/node-sass

@valeriangalliat

Member

valeriangalliat commented Nov 24, 2014

I don't know, they don't use gem install ruby-sass after all.

@valeriangalliat

Member

valeriangalliat commented Nov 24, 2014

@pascalduez The love needed for the Node.js API is there: 6a4157a c12bb09, and the parse function 1218e02.

@HugoGiraudel

Member

HugoGiraudel commented Nov 24, 2014

Port src/convert.js to ES6 (@pascalduez is working on it), and adapt to the new exposed API. Since we now use a streaming filesystem, we could add the Sass to SCSS conversion as one simple pipe before the parser pipe. This would require calling sass-convert file by file instead of processing the whole directory at once, but since the parser pipe parses the files one by one too, I think this fits really well into the streaming workflow (and we wouldn't need a temporary directory at all).

Is this done?

@valeriangalliat

Member

valeriangalliat commented Nov 24, 2014

Not exactly yet, but @pascalduez's refactoring it and moving it to a separate module.

@HugoGiraudel

Member

HugoGiraudel commented Nov 24, 2014

Do you think we could merge refacto into develop at some point so that we can also fix #270, #277 and the likes?

@valeriangalliat

Member

valeriangalliat commented Nov 24, 2014

Soon. Just wait for the converter to be fully externalized I think.

@pascalduez

Member

pascalduez commented Nov 24, 2014

I completely removed converter stuff from core, so I guess you could merge into develop if needed.
I will re-add the needed bits once the lib is ready. No big deal.

@HugoGiraudel

Member

HugoGiraudel commented Nov 24, 2014

Alright, if you feel like merging refacto into develop @valeriangalliat, feel free to do so.

@valeriangalliat

Member

valeriangalliat commented Nov 24, 2014

Alright, let's do it.

@valeriangalliat

Member

valeriangalliat commented Nov 24, 2014

Done.

@valeriangalliat

Member

valeriangalliat commented Nov 24, 2014

The only remaining tasks are about the converter and the plugins. Maybe we can remove them from the list and close this issue?

@valeriangalliat

Member

valeriangalliat commented Nov 24, 2014

Move this to "2.0 checklist" maybe.

@HugoGiraudel

Member

HugoGiraudel commented Nov 24, 2014

Fully okay.

@pascalduez

Member

pascalduez commented Nov 24, 2014

Remove converter + plugins and close. Time for beta testing I guess.
