
Document the tokeniser #11

Merged
merged 5 commits into master from tokenizer-documentation on Feb 6, 2021
Conversation

GarkGarcia
Contributor

This is a follow up to #9. I documented everything I could find about the bits of the tokeniser used by core.

@GarkGarcia added the documentation label (Improvements or additions to documentation) on Feb 1, 2021
Member

@rocky left a comment

LGTM - thanks.

mathics_scanner/errors.py (review thread resolved)
"""
A tokeniser for the Wolfram Language.

When subclassing ``Tokeniser``, custom tokenisation rules can be defined by
Contributor Author

@GarkGarcia commented Feb 2, 2021

I don't think this mechanism should be exposed in the public API (and therefore it shouldn't be documented in the docstring). If you think about it, all consumers of this library want is a functioning WL tokeniser that they can use as a black box (that's what I think, at least).

This definitely should be documented somewhere, though. @rocky I'd appreciate it if we could merge #8 before this, so that I can move this information to implementation.rst. I also plan to convert implementation.rst and the rest of the documentation into a proper Sphinx document before we release the library (which should be pretty easy to do, so it's not gonna take too much time).
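For readers outside this thread: the mechanism being debated is that subclassing the tokeniser lets you add custom tokenisation rules. The snippet below is a minimal illustrative sketch of that general pattern (a rule table overridden in a subclass); the class and rule names here are hypothetical and are not mathics_scanner's actual implementation.

```python
import re

class Tokeniser:
    """A minimal rule-table tokeniser. Names and structure are
    illustrative only, not mathics_scanner's actual API."""

    # Default rules: (tag, regex) pairs tried in order; first match wins.
    rules = [
        ("Number", re.compile(r"\d+")),
        ("Symbol", re.compile(r"[A-Za-z][A-Za-z0-9]*")),
        ("Operator", re.compile(r"[+\-*/]")),
    ]

    def tokenise(self, text):
        tokens, pos = [], 0
        while pos < len(text):
            if text[pos].isspace():  # skip whitespace between tokens
                pos += 1
                continue
            for tag, pattern in self.rules:
                match = pattern.match(text, pos)
                if match:
                    tokens.append((tag, match.group()))
                    pos = match.end()
                    break
            else:
                raise ValueError(f"unrecognised character at position {pos}")
        return tokens

class CustomTokeniser(Tokeniser):
    # A subclass prepends a rule so it takes priority over the defaults.
    rules = [("Arrow", re.compile(r"->"))] + Tokeniser.rules

print(CustomTokeniser().tokenise("x -> 42"))
# → [('Symbol', 'x'), ('Arrow', '->'), ('Number', '42')]
```

GarkGarcia's point is that most consumers never need this extension hook; they only call the tokeniser as a black box, so the hook belongs in implementation.rst rather than in the public docstring.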

These methods are only useful internally and are not used by core anywhere
Member

@rocky left a comment

I only have a little time to make a release, so we'll merge this in and continue after 1.0.0.

@rocky rocky merged commit 656a7cc into master Feb 6, 2021
@GarkGarcia
Contributor Author

@rocky Please wait for me to finish documenting the tokeniser before the release. It's going to be very fast and it won't break anything, since we're only changing the documentation. I'd be very disappointed if you went through with the release without giving me time to fix the documentation, since I've been waiting the entire week for you to merge #8 so that I could proceed with the changes here.

@rocky
Member

rocky commented Feb 6, 2021

A release is already done. Sorry. We can make another release.

I don't understand why fixing documentation couldn't have been done previously.

@GarkGarcia
Contributor Author

I don't understand why fixing documentation couldn't have been done previously.

I needed implementation.rst to be around, and it was added in #8, as I explained here:

This definitely should be documented somewhere, though. @rocky I'd appreciate it if we could merge #8 before this, so that I can move this information to implementation.rst.

Since you didn't reply to my comment, I assumed you were going to wait for me to make the changes.

@rocky
Member

rocky commented Feb 6, 2021

implementation.rst is there in master. Just keep committing to it.

Sorry about the release.

@mmatera
Contributor

mmatera commented Feb 6, 2021

Rocky, please, before the release, check that Mathics master passes the CI

@rocky
Member

rocky commented Feb 6, 2021

@mmatera I have been working on that right now. It looks like there was longstanding breakage that I am only finding out about now. See mathics/Mathics#1146

The failing PossbleQ tests are, I think, related to something else.

@GarkGarcia GarkGarcia mentioned this pull request Feb 8, 2021
@rocky rocky deleted the tokenizer-documentation branch March 19, 2021 01:28