Simplify imports #594

Merged · 51 commits · Sep 8, 2018

Conversation

@jmfrank63 (Contributor) commented Aug 7, 2018

As advised, I replaced all import * with explicit imports in __init__.py of the trio root.

I tested against vscode code completion and it seems to work. I have not run any tests yet.

Also, pylint no longer complains about missing attributes.

The issue was #542.

@Zac-HD (Member) commented Aug 8, 2018

This looks great! You can see the formatting issues Travis is complaining about here, and fix them by running

pip install yapf==0.20.1
yapf -rpi setup.py trio

After that, we'll need a test that every name in the __all__ list appears in the trio namespace, i.e.

def test_dunder_all():
    for name in trio.__all__:
        assert hasattr(trio, name)

This will ensure that we keep the imports up to date when new names are added at lower levels.

The optional last step would be to check that the dynamically-created __all__ works for pylint and mypy; and if not, to declare it up front and write another test that it's consistent with the lower-level __all__s. However, that step can be left to a future pull request by someone else.

Finally - and most importantly - thanks for contributing! 😍 🎉

trio/_version.py Outdated
@@ -1,3 +1,3 @@
# This file is imported from __init__.py and exec'd from setup.py

-__version__ = "0.5.0+dev"
+__version__ = "0.5.1+dev"
Member

No need to change the version here :-) We'll do that when we release.

@njsmith (Member) commented Aug 8, 2018

If we're switching to explicitly listing all re-exports in the import statements... is there any point in maintaining the __all__ lists at all? Either in trio.__all__, or inside the individual sub-modules? Surely static analysis tools are all clever enough to understand that if they see from ... import X then that means X is defined, right...?

@Zac-HD (Member) commented Aug 8, 2018

@njsmith - at runtime, __all__ just defines which names are imported by from foo import *. However, it's also widely used for static analysis (e.g. pylint and mypy), because PEP 8 defines the names that appear in __all__ as public API.

Since the goal of #542 was to improve the static analysability of Trio, I'd like to keep or write __all__ literals for all our public modules. (This is actually the main blocker for static typing, FWIW.)
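For anyone skimming, a minimal illustration of that runtime behaviour (the module and names here are made up for the example):

# mod.py (hypothetical)
public = 1
_private = 2
also_public = 3
__all__ = ["public"]

# elsewhere:
from mod import *            # binds only public; also_public and _private are skipped
from mod import also_public  # explicit imports ignore __all__ entirely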

@njsmith (Member) commented Aug 8, 2018

@Zac-HD Huh, the rule for import * is that if there's an __all__, it uses that, and if there isn't, it defaults to "all non-underscored names". I would have assumed that pylint/mypy/etc. knew that rule. Are you sure pylint etc. really treat these two modules differently?

# module 1
def f():
    pass
# module 2
def f():
    pass
__all__ = ["f"]

And if so, I'm super curious what the difference is...

@Zac-HD (Member) commented Aug 8, 2018

I don't know of any tools that would treat those differently, but module three has no public API, and the interpretation of module four may vary depending on the tool.

# module 3
__all__ = []
def f():
    pass
# module 4
__all__ = []
def f():
    pass
__all__.append('f')  # or +=, or .extend, etc.

In particular I've encountered a few weird issues with non-literal __all__ in e.g. pydocstyle, and having empty declarations is really useful for Sphinx autodoc and similar tools which walk namespaces looking for something to do.

trio/__init__.py Outdated
current_time,
TaskLocal,
__all__)

dhirschfeld (Member)

Not sure what the trio style is, but rather than having towers of arguments at various arbitrary columns, I personally prefer to drop down to the next line and indent only one level:

from ._toplevel_core_reexports import (
    ClosedResourceError,
    MultiError,
    ResourceBusyError,
    RunFinishedError,
    TrioInternalError,
    Cancelled,
    WouldBlock,
    TaskLocal,
    TASK_STATUS_IGNORED,
    current_effective_deadline,
    current_time,
    open_cancel_scope,
    open_nursery,
    run,
)

...this also allows for a more compact form:

from ._toplevel_core_reexports import (
    ClosedResourceError, MultiError, ResourceBusyError, RunFinishedError,
    TrioInternalError, Cancelled, WouldBlock, TaskLocal, TASK_STATUS_IGNORED,
    current_effective_deadline, current_time, open_cancel_scope, open_nursery,
    run,
)

Member

Yep, the compact form is enforced by yapf; see my comment above re: Travis.

Member

Yapf allows either of the two forms @dhirschfeld showed – IIRC you get the first one if you put a trailing comma on the last item, and the "more compact" one if you don't use a trailing comma.

@njsmith (Member) commented Aug 8, 2018

@Zac-HD

In particular I've encountered a few weird issues with non-literal __all__ in e.g. pydocstyle

Right, that's one of the motivations here (and see also #314, which has some commentary from one of the PyCharm devs on what it can/can't understand).

Let me try phrasing my question a different way. For trio/__init__.py, I think we basically have these four options:

# option A
from ._submodule1 import *
__all__ += _submodule1.__all__
from ._submodule2 import *
__all__ += _submodule2.__all__

# option B
from ._submodule1 import X, Y
__all__ += _submodule1.__all__
from ._submodule2 import Z
__all__ += _submodule2.__all__

# Option C
from ._submodule1 import X, Y
from ._submodule2 import Z
__all__ = ["X", "Y", "Z"]

# option D
from ._submodule1 import X, Y
from ._submodule2 import Z
# (__all__ is left undefined)

Trio master currently uses option A, and that hasn't worked out so well. So now we're picking between B, C, or D.

With B, the __all__ += _submodule.__all__ thing may or may not be understood by static analysis tools. Options C and D don't have that problem.

With B and C, the definition of __all__ is more annoying for us to maintain, because now every time we add a new export, we have to manually update both the import line and the definition of __all__, and make sure they stay in sync. (With B we have to manually update trio._submodule.__all__; with C we have to manually update trio.__all__, but either way it's two places that need updating.)

Option D, it seems to me, doesn't have either problem: the exports are clearly visible to static analysis tools, there's no chance for __all__ to confuse static analysis because there is no __all__, and the exports are only listed once, so maintenance is easier.

In fact I assume mypy has to mostly ignore __all__, because when it sees something like trio.open_nursery it has to track that back to a definition so it can find the type signature – and only the import line can help it do that.

@Zac-HD (Member) commented Aug 8, 2018

I'd be happy with either C or D and leaning towards D as the less-effort option.

Either way, it would be good to write a test that everything in the lower-level modules' __all__ is exported at the top level.
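A rough sketch of what such a test could look like (the submodule list below is illustrative, not exhaustive):

import trio
from trio import _core, _timeouts  # illustrative subset of the lower-level modules

def test_lower_level_dunder_all_is_reexported():
    for submodule in (_core, _timeouts):
        for name in submodule.__all__:
            assert hasattr(trio, name)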

@njsmith (Member) commented Aug 8, 2018

Either way, it would be good to write a test that everything in the lower-level modules __all__ is exported at the top level.

I'd be tempted to get rid of the lower-level __all__'s too. If we're not using them, then what's the point of maintaining and testing them?

I guess the case where this test would catch a bug is: someone adds a new symbol, and remembers to add it to the module's __all__, but forgets to add it to the higher-level import line – AND if we didn't have the __all__ then they would have forgotten to export it anywhere. I guess this scenario could happen? We all forget stuff sometimes. But I feel like if someone's going to forget to export something, then it doesn't help to have two places they're supposed to export it.

@jmfrank63 (Contributor, Author) commented Aug 8, 2018 via email

@dhirschfeld (Member) commented Aug 8, 2018

I'm no doubt missing something but is there a reason you need to define an __all__ in __init__.py? If you do:

from ._submodule1 import *
from ._submodule2 import *

...that will only import the public objects (defined in __all__) from the submodules. I'm pretty sure that's the pattern I've used - define __all__ at the submodule level, not in the __init__.py. Does that confuse static analysis tools? It's certainly possible but not something I've noticed.

Edit: I did a bit of testing to make sure I wasn't crazy. What I found was that importing * picked up everything - so importing something that wasn't in __all__ didn't show up as an error in static analysis. This appeared to be true using either jedi or the python language server in vscode. The only way to avoid false positives appeared to be explicit imports in the __init__.py. Ugh.

@njsmith (Member) commented Aug 8, 2018

@dhirschfeld This is one of the frustrating things about this, yeah – it's really hard to know what works and what doesn't :-(. To make sure I understand correctly: you're saying that if you write:

# submodule.py
# has a mix of public and private with literal __all__
a = 1
b = 2
__all__ = ["a"]

# __init__.py
from .submodule import *

... Then both Jedi and VSCode think that now b is available in the __init__.py namespace?

Before I saw your edit, I was going to say: well, maybe that's the one case that works, and we can use it for the simpler parts of trio where that pattern applies – but if even that doesn't work, then never mind.

@njsmith (Member) commented Aug 8, 2018

@jmfrank63 I think first let's settle how we want to handle trio/__init__.py. Then we'll probably want to apply the same change to the other files that currently follow the same pattern:

  • trio/hazmat.py
  • trio/testing/__init__.py
  • trio/_core/__init__.py

For hazmat and _core this will be a little tricky because their exports are different on different platforms. But we can at least apply this trick to the parts that aren't platform-dependent.
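For the platform-dependent parts, the explicit imports would presumably end up behind a platform check, roughly like this (the specific names here are only illustrative):

import sys

from ._core import run, open_nursery  # platform-independent part, listed explicitly

if sys.platform == "win32":
    # Windows-only re-exports stay behind a platform check
    from ._core import monitor_completion_key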

And then after that, we'll want to think about what to do with the trio.abc and trio.socket modules, which are a bit different.

@dhirschfeld (Member)

Then both Jedi and VSCode think that now b is available in the __init__.py namespace?

Yep, that appears to be the case - i.e. val = mymodule.b didn't show as an error using static analysis and b was a tab-completion option. In a python terminal with live objects b correctly didn't show up as an option. I was surprised by that, so much so that I'd call it a bug in static analysis!

I wonder if PyCharm does any better... might check that out.

@jmfrank63 (Contributor, Author) commented Aug 9, 2018 via email

@codecov (bot) commented Aug 9, 2018

Codecov Report

Merging #594 into master will decrease coverage by 0.02%.
The diff coverage is 100%.


@@            Coverage Diff             @@
##           master     #594      +/-   ##
==========================================
- Coverage   99.31%   99.29%   -0.03%     
==========================================
  Files          95       95              
  Lines       11028    11033       +5     
  Branches      786      791       +5     
==========================================
+ Hits        10953    10955       +2     
- Misses         56       59       +3     
  Partials       19       19
Impacted Files Coverage Δ
trio/_deprecate.py 100% <ø> (ø) ⬆️
trio/_toplevel_core_reexports.py 0% <ø> (-100%) ⬇️
trio/_util.py 100% <100%> (ø) ⬆️
trio/tests/test_exports.py 100% <100%> (ø) ⬆️
trio/_signals.py 100% <100%> (ø) ⬆️
trio/__init__.py 100% <100%> (ø) ⬆️

Δ = absolute <relative> (impact), ø = not affected, ? = missing data

@jmfrank63 (Contributor, Author)

Hi, I did the formatting and reverted the version. Currently there is no test for this; I will need some guidance on the tests, as testing is an absolute weak spot for me.
I left __all__ intact, but I moved the call to the globals into a function so we can import it and call it for the re-export. I thought this won't hurt and might be beneficial in the future.
I don't think it is called anyway now that we do an explicit import and not a *, but I could be wrong.
PyCharm still works with the git repo installed and shows all the completions that VS Code with the Python extension shows.
jedi-vim testing is currently in progress; setup is tricky because jedi doesn't pick up the virtualenv.

@njsmith (Member) commented Aug 9, 2018

@jmfrank63 Thanks for checking in with the pylint developers – that's super interesting to hear that they explicitly don't use __all__. (Link to the pylint issue: pylint-dev/pylint#2382 (comment). By the way, as a general tip, I like to link between issues like this as much as I can, because it helps when someone is trying to track down why a decision was made like 5 years later and they have no idea what's going on. I guess github has some official syntax for linking to issues in other repositories, but I can't be bothered to learn it – I just cut and paste the actual url into my comment, and then github notices that it's a github url and renders it nicely.)

For this PR:

  • It sounds like it's more or less settled now that __all__ isn't doing anything useful. Can you delete all the __all__ update lines in trio/__init__.py, and also in all the trio/_something.py files that the __init__.py file imports from? (Just those files, not any others – for example trio/_abc.py will probably need some changes eventually but for this PR we shouldn't change it.)

  • Regarding trio/_toplevel_core_reexports.py: this whole file only exists because of an earlier attempt to fix this same issue. I guess it helps with pycharm, but apparently not with any other analyzer (because all it does is try to make __all__ more visible). Our new strategy replaces it. So we should delete this file entirely, and in __init__.py we should write:

from ._core import (
   # ... everything we currently import from _toplevel_core_reexports
)
  • For writing a test: the basic idea is to open one of the files in the tests directory – for this I would use trio/tests/test_exports.py – and add a function named test_<whatever you like>. The function should assert that whatever it's testing is true. Then to run the tests, use pytest (there's some more details here: https://trio.readthedocs.io/en/latest/contributing.html#pull-request-tests). If you haven't done this before, I'd try just locally playing around with writing tests like assert 1 + 1 == 2 and assert 1 + 1 == 987 to get the hang of it.

    Then for the actual test, I suppose we want to use the pylint and/or Jedi APIs to query for what exports they see, and compare that to what we see when looking at the same namespace dynamically. For an example of how to use the pylint API, see here. Jedi has an example of how to get completions on the first page of its docs; we could ask it for completions for a script like "import trio; trio.".
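A very rough sketch of the jedi half of that idea, written against the jedi API of the time (jedi.Script(source).completions(); newer jedi versions spell this Script(code).complete() instead):

import jedi
import trio

def test_jedi_sees_trio_exports():
    # Ask jedi for completions as if someone had typed "trio." and hit tab
    completions = jedi.Script("import trio\ntrio.").completions()
    seen = {c.name for c in completions}
    # Compare against the names that actually exist at runtime
    expected = {name for name in dir(trio) if not name.startswith("_")}
    assert expected - seen - {"tests", "testing"} == set()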

Please do ask for help if you get stuck!

@jmfrank63 (Contributor, Author) commented Aug 13, 2018

With the changes just as they are, the difference for jedi is

{'__builtins__', '__cached__', '__loader__', '__path__', '__spec__'}

and for pylint it is

{'fixup_module_metadata'}

@njsmith (Member) commented Aug 14, 2018

We could move the fixup_module_metadata stuff to a submodule, if it makes this easier to test. Or just import fixup_module_metadata as _fixup_module_metadata.

@jmfrank63 (Contributor, Author) commented Aug 17, 2018

I wrote a first test against pylint:

import trio
from pylint.lint import PyLinter


def test_pylint_sees_all_names_in_namespace():
    # Test that pylint's AST contains the same content as dir(trio)
    linter = PyLinter()
    ast_set = set(linter.get_ast('trio/__init__.py', 'trio'))
    trio_set = set([symbol for symbol in dir(trio) if symbol[0] != '_'])
    trio_set.remove('tests')
    assert trio_set - ast_set == set([])

However, since the testing import is commented out by default, the test fails because the testing module is not in the namespace.

But this one passes the test:

TESTING = False
if TESTING:
    from . import testing

So we could define an environment variable that, if not present, skips the import and otherwise imports it.
Any suggestions?

@njsmith (Member) commented Aug 17, 2018

I think it'd be fine for the test to special-case "testing" as a name that pylint/jedi might not pick up.

@njsmith (Member) left a comment

A few quick comments I noticed while looking in...

trio/__init__.py Outdated
TrioInternalError, RunFinishedError, WouldBlock, Cancelled,
ResourceBusyError, ClosedResourceError, MultiError, run, open_nursery,
open_cancel_scope, current_effective_deadline, TASK_STATUS_IGNORED,
current_time, TaskLocal, __all__
Member

You shouldn't be importing __all__ here I think.

trio/__init__.py Outdated
@@ -12,59 +12,62 @@

from ._version import __version__

__all__ = []
from ._toplevel_core_reexports import (
Member

I think we can make this from ._core import ..., and delete the _toplevel_core_reexports.py file entirely.


def test_jedi_sees_all_completions():
    # Test that the jedi completion library gets everything in dir(trio)
    script = jedi.Script(path=trio.__file__)
Member

I think the way jedi.Script is supposed to be used is maybe more like jedi.Script("import trio\ntrio.")? (As in, we imagine somebody typed those characters into a file and then hit tab to get a list of completions.)

    linter = PyLinter()
    ast_set = set(linter.get_ast(trio.__file__, 'trio'))
    trio_set = set([symbol for symbol in dir(trio) if symbol[0] != '_'])
    trio_set.remove('tests')
Member

Probably worthwhile to factor out some of the namespace cleanup code from the two functions – I think in both cases we want to test something like: "the two sets are equivalent, if you ignore names that .startswith("_"), "tests", and "testing""?
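Something along these lines, perhaps (the helper name is made up for illustration):

def _public_names(names):
    # Shared cleanup: drop private names plus the test packages
    return {n for n in names if not n.startswith("_")} - {"tests", "testing"}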

Contributor Author

Ok, I'll work my way through all the comments. It will take some time, as this is all new stuff.

Member

No worries, take your time, and don't be afraid to ask for help if you get stuck :-)

@njsmith (Member) commented Aug 19, 2018

@jmfrank63 Ah, so what's happened is you've discovered some places where we were unexpectedly relying on __all__. Exciting! :-)

In test_exports, it checks for symbol in source.__all__. Here symbol is the name of something exported from trio._core, like "TrioInternalError", and source is one of the modules that's supposed to re-export those symbols, like trio. But now we've gotten rid of trio.__all__, so this test no longer makes sense – we don't want "TrioInternalError" to show up in trio.__all__.

The quick fix is to replace symbol in source.__all__ with hasattr(source, symbol). Later on, when we clean up trio/_core/__init__.py, we might also need to adjust the for symbol in _core.__all__ loop that's a few lines up, but for right now that's not necessary.
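In other words, something like this inside the existing loop (the surrounding code is paraphrased from the description above, not copied from test_exports.py):

for symbol in _core.__all__:
    # old check, which breaks now that trio.__all__ is gone:
    #     assert symbol in source.__all__
    # new check:
    assert hasattr(source, symbol)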

In test_module_metadata_is_fixed_up, the problem is an actual bug in fixup_module_metadata. Here's the source:

trio/_util.py, lines 171 to 182 in ffb5bca:

def fixup_module_metadata(module_name, namespace):
    def fix_one(obj):
        mod = getattr(obj, "__module__", None)
        if mod is not None and mod.startswith("trio."):
            obj.__module__ = module_name
            if isinstance(obj, type):
                for attr_value in obj.__dict__.values():
                    fix_one(attr_value)

    for objname in namespace["__all__"]:
        obj = namespace[objname]
        fix_one(obj)

Notice how on line 180, we've got another of those references to __all__. The goal of this function is to take a module namespace, go through all the objects it exports, and fix up their __module__ attribute so that when someone later tries to figure out the public name of this object, they get the right thing. For example, at the bottom of tracebacks it shows you the name of the exception type:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
trio.TooSlowError: timeout expired

Notice how it says trio.TooSlowError there? That trio is coming from the exception type's __module__ attribute. If we didn't rewrite it, then it would instead say trio._timeouts.TooSlowError: timeout expired, which exposes internal implementation details and is also kinda weird and distracting.

Anyway, those details don't really matter right now, just giving you some context :-). The specific issue is that the function needs to iterate over all the public exports in a namespace, and it's trying to do that using __all__, but of course that doesn't work once we stop using __all__ to control public exports. Instead, we can rewrite that loop like:

for objname, obj in namespace.items():
    if not objname.startswith("_"):  # ignore private attributes
        fix_one(obj)

@jmfrank63 (Contributor, Author)

Great, I'll try this and see how it works. I'm currently moving house, so there may be some delay.

@njsmith (Member) commented Aug 23, 2018

Filed a bug upstream with jedi to ask about the skipped test: davidhalter/parso#47

@njsmith njsmith changed the base branch from master to release August 23, 2018 07:13
@jmfrank63 (Contributor, Author) commented Sep 6, 2018 via email

@belm0 (Member) commented Sep 7, 2018

I think this is the top user-happiness issue of Trio currently (watchdog seems like a distant 2nd).

@jmfrank63 do you have an idea of time frame? I'm happy to take a stab this weekend otherwise.

@@ -1,4 +1,4 @@
-pytest >= 3.3 # for catchlog fixture
+pytest == 3.7.4 # for catchlog fixture
Member

Is there a particular reason for pinning to exactly 3.7.4 rather than just using >=?

Contributor Author

This was just a workaround to see the real errors; see #650.
The change has been reverted and is handled in PR #650.

Member

Thanks! When making a change such as pinning to a particular version, it's useful to leave a comment (in the commit message) explaining why, so users know whether or not they can safely ignore it - e.g. because the underlying issue has been fixed.

Fixing it in #650 is great because now the blame https://github.com/jmfrank63/trio/blame/528a2986434d644be8138219f0f3087e111d79e8/test-requirements.txt#L1 is linked to the PR and so a user can follow that all the way to #651 to see if the problem has been fixed and they're safe to use pytest 3.8.1 with trio.

@njsmith (Member) commented Sep 8, 2018

We don't need _toplevel_core_reexports.py anymore, right? It doesn't look like anything is using it now...

@Zac-HD (Member) left a comment

Thanks so much for taking this on @jmfrank63 - I know it's a much larger job than it appears, but it's also appreciated by more people than will comment here 😍

I think that there are only three things left before we can merge this (🎉):

  • The formatting nitpick I note below
  • deleting trio/_toplevel_core_reexports.py entirely (via @njsmith)
  • Refactoring namespace cleanup code in your excellent tests (also via @njsmith), which I plan to copy for Hypothesis 😄 [deferred to a future PR]

trio/_signals.py Outdated
@@ -177,7 +177,7 @@ def __aiter__(self):
        return self

    async def __anext__(self):
-        return { await self._signal_queue.__anext__()}
+        return {await self._signal_queue.__anext__()}
Member

This is the line causing Travis to fail - yapf requires that leading space. I'd just drop it from the diff, as it's not really related to the imports issue.

Member

Yeah, this is some weird thing where yapf is apparently doing something different for @jmfrank63 than for the rest of us, even with the same version and settings. ¯\_(ツ)_/¯

And yeah, simplest thing is to manually delete the space...

@njsmith (Member) commented Sep 8, 2018

@Zac-HD Thanks for reviewing! Re:

Refactoring namespace cleanup code in your excellent tests

I think I'd also be OK with putting this off to a follow-up PR... this one has been dragging on a long time and we don't want it to get stuck again :-)

@belm0 (Member) commented Sep 8, 2018

My project just started using pylint for static checking in the continuous build, and many on our team rely on the PyCharm IDE, so this change will be fantastic -- thank you!

For the sake of those looking through historical commits, I'd like to ask again that the "squash" option be used on merge since 50 commits is a lot for what's ultimately a relatively small change.

@Zac-HD (Member) left a comment

This looks good to me - thanks again @jmfrank63 🎉

@Zac-HD Zac-HD merged commit 312b470 into python-trio:master Sep 8, 2018
@njsmith (Member) commented Sep 8, 2018

You know, in all that, I think we forgot to add a newsfragment. We should do that so users will realize how hard we're working.

@jmfrank63 (Contributor, Author) commented Sep 8, 2018 via email

@njsmith (Member) commented Sep 8, 2018

@jmfrank63 For the mechanics, see: https://trio.readthedocs.io/en/latest/contributing.html#pull-request-release-notes

Basically: make a file called newsfragments/542.bugfix.rst (or feature? Idk, use your judgement :-)), and write a sentence or two to explain to users why they should be interested in upgrading to 0.8.0. For examples, you can use the previous release notes: https://trio.readthedocs.io/en/latest/history.html

@jmfrank63 (Contributor, Author)

Now that the pull request is merged, how do I add a newsfragment? Via an additional PR?

@njsmith (Member) commented Sep 9, 2018

@jmfrank63 yep, exactly
