
[CI]: Consider/Discuss Tools for Verifying Fenced Code in Markdown Documents #2848

Open
@BethanyG

Description


This came up in discussing PR #2838, so opening an issue here for longer discussion/evaluation.

We currently hand-validate the example Python code used in introduction.md, about.md, instructions.md, and similar documents. This leads to errors: code gets published that won't actually run in the REPL, or that contains syntax and other mistakes. As we scale up the number of exercises, this doesn't feel like a sustainable approach, hence this issue to propose, evaluate, and track possible tools and strategies for verifying code, and (possibly) to add that verification to the track CI.
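For a sense of the problem space, here is a minimal sketch of what automated verification could look like: extract each python-tagged fence from a markdown document and execute it, collecting failures. (The regex and `check_markdown` helper are hypothetical illustrations, not the API of any of the libraries discussed below.)

```python
import re

# Build the fence marker dynamically so this snippet can itself live
# inside a fenced code block without clashing with the outer fence.
FENCE = "`" * 3
FENCE_RE = re.compile(FENCE + r"python\n(.*?)" + FENCE, re.DOTALL)

def check_markdown(text):
    """Run each python-tagged fence; return (block index, exception) pairs."""
    failures = []
    for index, block in enumerate(FENCE_RE.findall(text)):
        try:
            # Each block runs in a fresh namespace, like a new REPL session.
            exec(compile(block, f"<fence {index}>", "exec"), {})
        except Exception as exc:
            failures.append((index, exc))
    return failures

# A tiny illustrative document: one good fence, one with a typo.
doc = "\n".join([
    "Some introduction prose.",
    "",
    FENCE + "python",
    "x = 1 + 1",
    FENCE,
    "",
    FENCE + "python",
    "pritn('hello')  # deliberate typo -> NameError",
    FENCE,
])

print(check_markdown(doc))
```

This naive approach only catches exceptions at execution time; it says nothing about whether the *output* shown in the document is correct, which is where doctest-style tools come in.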

Below are three applicable libraries, but I'd warmly welcome more. Of the three, phmdoctest feels like the nicest solution, and I've run the comparisons concept exercise through it with reasonable results. But I'd like to see if there are other strategies/libraries out there.

doctest - this is the old-school original, but it doesn't really work well on markdown fences.
mkcodes - have not tried this yet.
phmdoctest - reasonably good, but it requires some weird quirks with code fence language names and/or extra `>>>` prompts in code fences to make parsing work.
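For fences that contain REPL-style sessions, the standard library's doctest machinery can drive the checking once the session text has been pulled out of a fence, which is roughly the approach tools in this space build on. A small sketch (the session content and file name are illustrative, not taken from the actual exercise docs):

```python
import doctest

# A ">>>"-style session as it might appear inside a markdown code fence
# in a comparisons-style concept document (content is illustrative).
session = """\
>>> [1, 2] < [1, 3]
True
>>> "apple" < "banana"
True
"""

# Parse the session into a DocTest and run it, checking expected outputs.
parser = doctest.DocTestParser()
test = parser.get_doctest(session, {}, "comparisons-fence", "introduction.md", 0)
runner = doctest.DocTestRunner(verbose=False)
results = runner.run(test)
print(results)  # e.g. TestResults(failed=0, attempted=2)
```

Unlike the exec-only approach, this also verifies that the output shown under each `>>>` prompt matches what the code actually produces, which is exactly the class of published-but-wrong examples we want CI to catch.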
