
Conversation

@dotNomad
Collaborator

@dotNomad dotNomad commented Mar 11, 2025

This PR adds a new GitHub workflow and associated actions to release, version, and list Connect extensions.

High Level

To set the stage, I'll go over some of the high-level changes:

  • Extensions that we package as tarballs now have a new extension section in their manifest.json that stores the metadata used to populate the extension list (see the sketch after this list)
    • This includes a name, title, description, homepage (currently the link to the extension's GitHub page), and a semver version
  • A new top-level file called extensions.json
    • This currently just contains { extensions: [] } but is where the Gallery's Extension List will live
    • This file is not intended to be edited by hand except for removals (documentation on this to come)
  • A new scripts folder that contains an extension.list.ts script that uses the GitHub Releases created by the new workflow to add entries to the Extension List
    • This is a TypeScript file, so there are additional files to add TypeScript support and types for easier script writing
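
For reference, here is a sketch of the shape of that extension section, written as a TypeScript type. The field list comes from the description above; the actual ExtensionManifest type used by scripts/extension.list.ts may be defined differently.

// Sketch only: field names are taken from the PR description above.
interface ExtensionManifestSection {
  name: string;        // e.g. "publisher-command-center"
  title: string;       // human-readable title shown in the Gallery
  description: string; // short description shown in the extension list
  homepage: string;    // currently the extension's GitHub page
  version: string;     // semver string, e.g. "1.0.0"
}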

GitHub Workflow

There are 3 new GitHub actions in this PR:

  • lint-extension: checks that a given extension in the repo has the new extension section in its manifest.json and that the version string is valid semver
  • package-extension: tars up the extension for releases
    • Takes an extension name and optionally an artifact-name for custom workflow extensions (see more below)
  • release-extension: publishes the packaged extension as a GitHub Release and updates the Extension List with the release data for the Gallery to use

The main workflow, extensions.yml, checks for changes to extensions and then lints, packages, and (when running on main) releases them.

It does this in two groups:

  • simple-extensions: includes all extensions in directories that can simply be tarballed for release
  • complex-extensions: all extensions that require custom workflows
    • The only one in this group is publisher-command-center at the moment

Once all of the extensions have been released as GitHub Releases, the extension list script runs to capture that information for the Gallery to consume, in something like the example below:

Example `extensions.json`
{
"extensions": [
  {
    "name": "publisher-command-center",
    "title": "Publisher Command Center",
    "description": "A dashboard for publishers to help manage and track their content",
    "homepage": "https://github.com/posit-dev/connect-extensions/tree/main/extensions/publisher-command-center",
    "latestVersion": {
      "version": "2.0.2",
      "released": "2025-03-11T22:55:27Z",
      "url": "https://github.com/dotNomad/connect-extensions/releases/download/publisher-command-center%40v2.0.2/publisher-command-center.tar.gz"
    },
    "versions": [
      {
        "version": "2.0.2",
        "released": "2025-03-11T22:55:27Z",
        "url": "https://github.com/dotNomad/connect-extensions/releases/download/publisher-command-center%40v2.0.2/publisher-command-center.tar.gz"
      },
      {
        "version": "2.0.1",
        "released": "2025-03-11T22:39:02Z",
        "url": "https://github.com/dotNomad/connect-extensions/releases/download/publisher-command-center%40v2.0.1/publisher-command-center.tar.gz"
      }
    ]
  },
  {
    "name": "reaper",
    "title": "Reaper",
    "description": "A view to a kill.",
    "homepage": "https://github.com/posit-dev/connect-extensions/tree/main/extensions/reaper",
    "latestVersion": {
      "version": "2.0.4",
      "released": "2025-03-11T22:54:56Z",
      "url": "https://github.com/dotNomad/connect-extensions/releases/download/reaper%40v2.0.4/reaper.tar.gz"
    },
    "versions": [
      {
        "version": "2.0.4",
        "released": "2025-03-11T22:54:56Z",
        "url": "https://github.com/dotNomad/connect-extensions/releases/download/reaper%40v2.0.4/reaper.tar.gz"
      },
      {
        "version": "2.0.3",
        "released": "2025-03-11T22:42:29Z",
        "url": "https://github.com/dotNomad/connect-extensions/releases/download/reaper%40v2.0.3/reaper.tar.gz"
      },
      {
        "version": "2.0.1",
        "released": "2025-03-11T22:38:48Z",
        "url": "https://github.com/dotNomad/connect-extensions/releases/download/reaper%40v2.0.1/reaper.tar.gz"
      }
    ]
  }
]
}
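
As a rough illustration (this is not code from the PR), a consumer such as the Gallery could read extensions.json and pull out the latest release of each extension along these lines; the field names match the example above:

import fs from "fs";

// Hypothetical consumer-side sketch: reads extensions.json (shape shown above)
// and prints the latest version and tarball URL for each extension.
interface ReleaseInfo {
  version: string;  // semver, e.g. "2.0.2"
  released: string; // ISO timestamp of the GitHub Release
  url: string;      // download URL of the packaged extension
}

interface ExtensionEntry {
  name: string;
  title: string;
  description: string;
  homepage: string;
  latestVersion: ReleaseInfo;
  versions: ReleaseInfo[]; // sorted so the newest release comes first
}

const list: { extensions: ExtensionEntry[] } = JSON.parse(
  fs.readFileSync("extensions.json", "utf8"),
);

for (const ext of list.extensions) {
  console.log(`${ext.title} ${ext.latestVersion.version}: ${ext.latestVersion.url}`);
}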

The actions and general extensions.yml workflow files are heavily commented to explain the steps and some of the decisions made.

Custom Workflow Extensions

The "complex extensions" have custom workflows. They cannot simply be tarballed; they require additional steps to package, and potentially other steps in the future.

The one example right now is publisher-command-center, which needs to set up Node to build the frontend it ships. This cannot be handled generically, so it requires a custom GitHub workflow that is called by the main workflow, extensions.yml.

The custom workflows can use the same actions above (and in this PR do) for easy linting, packaging, and releasing.

Approach

There were a few goals I kept in mind when writing the workflows and script:

  • Simple extensions should require little to no GitHub workflow edits to work
    • With this PR only 1 line is required to add a new extension
  • Don't release every extension in every directory, to avoid releasing an extension earlier than its developers would like
  • Complex extensions like publisher-command-center should be possible to build and release
  • Actions in the workflow should be generalized to avoid duplicate code
  • The flow should be easily extendable
  • The flow should run in parallel wherever possible
  • It should be easy to consume the latest version of an extension; versions in the Extension List are sorted to help with this as well
  • Commit once to update the extension list
  • Pull requests should show whether an extension would be released, and with what tarball (included as artifact(s) of the workflow)

Workflow testing

Each step was built as a proof of concept, then generalized into actions and applied to the other extensions. Development was done on a fork, dotNomad/connect-extensions, to run the workflows.

There are a few workflow runs worth pointing out as tests:

  • Workflow #77 updates two extensions, one with an error and one without. The successful one still updates the extension list, and the failing one is not released
  • Workflow #78 updates two extensions at once successfully
  • Workflow #80 shows no extensions updating, so no linting, packaging, or other extension commands run
  • Workflow #81 shows an update to a single extension
  • Workflow #82 shows a correct linting failure
  • Workflows #83 and #84 show two workflows completing at nearly the same time with the extension list still updating correctly

@dotNomad dotNomad marked this pull request as ready for review March 12, 2025 00:00
Contributor

@tdstein tdstein left a comment

Very neat!

Does it make sense to write up the instructions in a CONTRIBUTING.md document? That's the first place I would think to look when making changes.

Comment on lines +17 to +25
jq '
if (.extension | type) != "object" then error("Missing extension object")
elif (.extension.name | type) != "string" then error("Missing extension.name")
elif (.extension.title | type) != "string" then error("Missing extension.title")
elif (.extension.description | type) != "string" then error("Missing extension.description")
elif (.extension.homepage | type) != "string" then error("Missing extension.homepage")
elif (.extension.version | type) != "string" then error("Missing extension.version")
else . end
' ./extensions/${{ inputs.extension-name }}/manifest.json
Contributor

That's a lot of inline jq. Should this, or any of the other jq scripts, move into package.json so that they can be run locally?

Collaborator Author

Oh that is a good idea. I can look into that. It would be helpful to do a pre-push lint.

Collaborator Author

I'm going to look at this as a follow-up in combination with #35 (comment)

Comment on lines +165 to +170
- run: |
    git config user.name "github-actions[bot]"
    git config user.email "41898282+github-actions[bot]@users.noreply.github.com"
    git add extensions.json
    git commit -m "Update extension list"
    git push
Contributor

Is this pushing directly to main? Should this go through a pull request to ensure a human is in the loop before changes make it to Connect?

Collaborator Author

It is pushing directly to main yes. I could adjust it to instead open a PR.

My thought around this part of the flow is that a PR containing the updates to extensions has already been reviewed and merged and a release has already gone out in the repo, so the extension list should update right away.

I can see an argument though that any adjustments to main should go through a PR flow.

Contributor

That makes sense. I don't see anything wrong with moving forward with this approach. The risk seems minimal.

Collaborator Author

That makes sense. I don't see anything wrong with moving forward with this approach. The risk seems minimal.

Really appreciate the question and the consideration.

Comment on lines +1438 to +1444
"extension": {
"name": "audit-api",
"title": "Audit Log API",
"description": "An API Publisher can call for audit logs.",
"homepage": "https://github.com/posit-dev/connect-extensions/tree/main/extensions/audit-api",
"version": "0.0.0"
}
Contributor

Are there any drawbacks to using the manifest.json for this purpose?

Collaborator Author

One drawback that comes to mind is having to manually add this. That is true for the connect-extension.toml as well (which we plan to remove). But because this lives in the manifest.json we can expand tools that create manifests to support this in the future.

Contributor

One edge case we would want to document around is reminding any user who may regenerate a manifest using rsconnect write-manifest or other tools that they will lose their extension data and have to re-add it.

@dotNomad
Collaborator Author

Does it make sense to write up the instructions in a CONTRIBUTING.md document? That's the first place I would think to look when making changes.

Yes definitely! We are picking up that work right away 👍

@dotNomad dotNomad requested review from edavidaja and jonkeane March 12, 2025 16:28
Comment on lines +47 to +48
"homepage": "https://github.com/posit-dev/connect-extensions/tree/main/extensions/audit-reports",
"version": "0.0.0"
Collaborator

Would it make sense for this to use a link that won't change if the extension is updated?

Collaborator Author

I'm not sure I follow this question - can you expand a bit more?

Collaborator

As in, would it make sense to use a permalink:

https://github.com/posit-dev/connect-extensions/tree/5e5066d9b0ad5f5d3b1454a2cc2437cf97bad5c9/extensions/audit-reports

rather than the link to tree/main, which may refer to something different in the future if the extension is updated.

Collaborator Author

Ah, that is a great point. Currently the extension list doesn't capture a different homepage for each version; instead there is one homepage for the entire extension. This is similar in concept to how Homebrew has https://cli.github.com/ as the homepage for the gh formula.

There could be a good amount of value in splitting that up per version, especially since the homepage currently links to a git repo (it doesn't have to).

@edavidaja
Collaborator

Just a thought: what if requires-build or some more expansive notion of extension complexity was captured by the manifest, rather than as a workflow step?

@dotNomad
Collaborator Author

Just a thought: what if requires-build or some more expansive notion of extension complexity was captured by the manifest, rather than as a workflow step?

Interesting 🤔
Thinking this through, it would allow a possible generalization of complex-extension-changes: going through each extension that has an extension complexity flag and creating a matrix of the directory names. However, I believe each extension would still need its own custom workflow to set up the environment, run the right commands, and know how to package itself up.

Is there a big benefit you see in moving this over to the manifest.json that I may be missing?

@edavidaja
Collaborator

Is there a big benefit you see in moving this over to the manifest.json that I may be missing?

I don't think I would describe the benefit as large, necessarily, but I think it offers some interesting flexibility in terms of exposing the information to Connect for use later even if we don't do anything with it now.

Comment on lines +30 to +32
filters: |
reaper: extensions/reaper/**
integration-session-manager: extensions/integration-session-manager/**
Collaborator Author

After a conversation with @cgraham-rs and the recommendation from @edavidaja I think we can get the simple extension flow to require no change from the extension authors. It will require some additional investigation, but the idea would be to find every extension/** directory with a signifier in the manifest that it is a simple extension, and see if it has changes.

Collaborator Author

From a quick look into this, I cannot find a way to both build a matrix of directories and see which of them have changed without some inputs. There is probably a way to do it, but it is more complex than I expected. We can tackle this if the feedback on the CONTRIBUTING documentation is that it is too complex.

# Add a new line here with the name of the extension and directory path
# Be sure the extension name and directory have the same name
# e.g. `extension-name: extensions/extension-name/**`
filters: |
Contributor

Comment: I was not expecting extension authors to have to modify workflow code when adding new extensions; I expected the processing to be more automatic.

- run: |
jq '
if (.extension | type) != "object" then error("Missing extension object")
elif (.extension.name | type) != "string" then error("Missing extension.name")
Contributor

Should we add additional lint around name and folder matching? Or lose name entirely and only rely on folder?

Collaborator Author

Definitely a good idea. I'll review the workflow to see if this 1-1 match is required before adding a hard linting rule.

Collaborator Author

Going to do this as a follow-up for improved linting.

run: tar -czf ${{ inputs.extension-name}}.tar.gz ./extensions/${{ inputs.extension-name}}
shell: bash

# If we are passed an artifact to use as the source for the tarball
Contributor

I love all of the code comments. Keep doing that. 👍

shell: bash

# Grabs the latest version from the extensions.json file
# If an extension hasn't been released yet we default to `0.0.0` so any
Contributor

Do we really want to allow unreleased extensions to be merged to main? It could potentially simplify things if the act of merging to main implied releasing.

Collaborator Author

I didn't want to:

  • remove any extensions from the repo that we didn't want released initially
  • require that extension authors keep long-lived branches until the extension is ready to be released

This may not be the right choice though.

Collaborator

Yeah, I very much see where you're coming from @cgraham-rs: theoretically, keeping unreleased work out of main keeps things nice and clean, but PRs into other branches and long-running branches with development versions is advanced enough git that I would like to avoid requiring it if we can.

It would be nice to have this noted prominently in a README (if we don't have it already): just because something is on main doesn't mean it's released.

Collaborator Author

It would be nice to have this noted prominently in a README (if we don't have it already): just because something is on main doesn't mean it's released.

Absolutely. The CONTRIBUTING doc we write for adding extensions will have this callout and tips for reading the GitHub Action logs to determine if a PR will result in a release or not 👍

- name: Check if manifest has newer version
id: should_release
run: |
echo "The last released version is '$LATEST_VERSION' and the manifest version is '$MANIFEST_VERSION'"
Contributor

Similar to the great code comments, all of this output seems excellent and should really help humans understand what is happening and why.

@@ -0,0 +1,162 @@
import fs from "fs";
Contributor

For my education - not having worked on Posit public repos before, are there any requirements or strongly suggested guidance around languages to use? I would expect in our general public facing world that Python would be a more familiar language.

Collaborator Author

are there any requirements or strongly suggested guidance around languages to use

If there are I'm unaware of them as well. I went with TypeScript for two reasons:

  • it is my most comfortable scripting language
  • custom GitHub actions can be written in JavaScript (or TypeScript with additional tooling), so it seemed the most appropriate choice for GitHub workflow scripting

References:

Contributor

Bingo! Exactly the education I was looking for. 😄


static fromFile(path: string) {
  const extensions = JSON.parse(fs.readFileSync(path, "utf8")).extensions;
  return new ExtensionList(extensions);
Contributor

Do we need error handling for potentially non-existent files, bad JSON, etc?

Collaborator Author

We could, but I'm not sure how much it would add. The most important thing is that the workflow job errors and doesn't continue.

It would potentially improve debugging when/if things go wrong, though.
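
For illustration, a minimal sketch of what that extra error context could look like, assuming the class shape quoted above; this is not what the PR currently implements:

import fs from "fs";

// Hypothetical variant of ExtensionList.fromFile with friendlier error messages.
// ExtensionList here stands in for the class defined in extension.list.ts.
class ExtensionList {
  constructor(public extensions: unknown[]) {}

  static fromFile(path: string): ExtensionList {
    let raw: string;
    try {
      raw = fs.readFileSync(path, "utf8");
    } catch (err) {
      throw new Error(`Could not read extension list at ${path}: ${err}`);
    }

    let parsed: { extensions?: unknown };
    try {
      parsed = JSON.parse(raw);
    } catch (err) {
      throw new Error(`Extension list at ${path} is not valid JSON: ${err}`);
    }

    if (!Array.isArray(parsed.extensions)) {
      throw new Error(`Extension list at ${path} has no "extensions" array`);
    }
    return new ExtensionList(parsed.extensions);
  }
}

Either way the workflow job would still fail; the only difference is a clearer message in the logs.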

return new ExtensionList(extensions);
}

public addRelease(manifest: ExtensionManifest, githubRelease) {
Contributor

How rigorous do we need to be with typing? Should each and every thing always have typing, or is that unrealistic?

Collaborator Author

I have strict: false set in the TypeScript config and didn't pull in hard types for the GitHub release API JSON object. I can increase the strictness, though; it could be a nice iterative step to avoid mistakes when/if we edit this script.
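
As a sketch of what one iterative step toward stricter typing could look like, here is a minimal type covering only the release fields this flow relies on. The field names come from the GitHub release API response discussed elsewhere in this review; the real script may model this differently:

// Hypothetical minimal typing for the parts of a GitHub release the script reads.
interface GithubReleaseAsset {
  name: string;                 // e.g. "reaper.tar.gz"
  browser_download_url: string; // download URL recorded in the extension list
}

interface GithubRelease {
  tag_name: string;     // e.g. "reaper@v2.0.4"
  published_at: string; // ISO timestamp used as the "released" field
  assets: GithubReleaseAsset[];
}

The untyped githubRelease parameter quoted above could then be annotated as githubRelease: GithubRelease.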

"extensions.json"
);

const releases = JSON.parse(process.env.RELEASES);
Contributor

Similar question about error handling if there's bad JSON.

);

const releases = JSON.parse(process.env.RELEASES);
const list = ExtensionList.fromFile(extensionListFilePath);
Contributor

Similar question about file/path error handling.

Comment on lines +18 to +19
# If we are not passed an artifact to use as the source for the tarball
# we will create a tarball from the extension's directory
Collaborator

Mostly for my own education: it's not immediately clear to me when we would take this path versus the other path down below for these. Could you educate me?

Collaborator Author

Absolutely. Currently the only time we do this is in the publisher-command-center custom workflow, which builds the extension and then creates an artifact to package:

Collaborator Author

The creation and passing of the artifact to this action allows custom workflows to select which files to package.

Collaborator

Ah ok, so if it's a custom workflow style task, this allows us to build the artifact elsewhere (in the custom workflow) and then pass it to this workflow but/and therefore bypasses the building step in this workflow. Is that a good summary?

Collaborator

Or, even an accurate summary — doesn't need to be good 😂

Collaborator Author

Ah ok, so if it's a custom workflow style task, this allows us to build the artifact elsewhere (in the custom workflow) and then pass it to this workflow but/and therefore bypasses the building step in this workflow. Is that a good summary?

Very close. The only correction I'll make is that this is a composite action. It allows us to build the artifact we want packaged and pass it to this action. It avoids building as part of the packaging step (for those extensions with custom workflows), and simplifies the things that need to be passed to this action.

Comment on lines 47 to 49
echo "Should release: the manifest version is higher than the lastest version"
echo "should_release=true" >> "$GITHUB_OUTPUT"
else
Collaborator

Super minor, but it might be nice if these messages had slightly different shapes / cadences. I know I sometimes read too quickly and don't see a "not" that is critical :P Even like a 🚀 or 🛳️ at the front of the first and a 😴 or 🛏️ or something for the second would be helpful.

Collaborator Author

Great suggestion. I'll make that change so this is much easier to scan.

Collaborator Author

Changed the messaging and added emojis for glance-ability in dab3a99

They now read:

🚀 Will release! The manifest version is greater than the released version.
😴 Holding back from release: The manifest version is not greater than the released version.

# We fetch the full release data to utilize in the extension list update
- name: Get release data
if: github.ref_name == 'main' && steps.should_release.outputs.should_release == 'true'
run: gh api /repos/${{ github.repository }}/releases/tags/$RELEASE_TAG > release-${{ inputs.extension-name }}.json
Collaborator

More for my education / curiosity: what's in the release-....json?

Collaborator Author

Great question. It is the response from GitHub's "Get a release by tag name" API endpoint.

Using this as opposed to gh release list allowed me to get the specific release the flow needed. The script that updates the extension list uses the published_at time and the list of assets to grab the browser_download_url of the extension TAR we created.
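
As a simplified sketch of that mapping (not the exact code in extension.list.ts), the relevant part of the release JSON boils down to something like:

// Hypothetical simplification: turn the "Get a release by tag name" response
// into the { version, released, url } entry stored in extensions.json.
interface ReleaseJson {
  tag_name: string;     // e.g. "reaper@v2.0.4"
  published_at: string; // becomes the "released" timestamp
  assets: { name: string; browser_download_url: string }[];
}

function toVersionEntry(release: ReleaseJson) {
  // The packaged extension is uploaded as a .tar.gz asset, so grab its URL.
  const tarball = release.assets.find((a) => a.name.endsWith(".tar.gz"));
  return {
    // Version parsed from the "<name>@v<version>" tag here for illustration;
    // the real script may take it from the manifest instead.
    version: release.tag_name.split("@v")[1],
    released: release.published_at,
    url: tarball?.browser_download_url ?? "",
  };
}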

Collaborator

Ah, interesting. It might be worth stating (more?) explicitly in the comment "We fetch the full release data from the GitHub API". That wasn't at the top of my head for where we were fetching that from, actually.

Collaborator Author

Updated the comment in e534088 to mention that the release data is fetched from the GitHub API 👍

# we will download the artifact and create a tarball from it
# this avoids including extra files in the extension
- name: Download optional artifact
if: ${{ inputs.artifact-name != '' }}
Contributor

Do we offer a message to the user if an error occurs during this process? Or is that even possible?

Collaborator Author

There will be a failure if the artifact doesn't exist, and the logs will give the name and say that no artifact matched it. That's all built into the download-artifact action. Do you think that satisfies it?

Contributor

Yep, I think that makes sense.

Contributor

@cgraham-rs cgraham-rs left a comment

Looking forward to seeing all of this work merged. Again, great job on the code comments.


steps:
# If we are not passed an artifact to use as the source for the tarball
# we will create a tarball from the extension's directory
Contributor

Not a requested change, just a question: would it be beneficial to add additional detail such as "this is what we expect for simple content", and in the next one "we expect artifacts to be passed and processed for complex content"? Or is that getting too deep?

Collaborator Author

I kept the explanations in the actions a bit lighter since "simple content" probably doesn't need to interact with them. We can definitely add some deeper explanations if we get questions often.

"extension": {
"name": "reaper",
"title": "Reaper",
"description": "A view to a kill.",
Contributor

Do we want a better description here?

Collaborator Author

@dotNomad dotNomad Mar 18, 2025

Added this to a follow-up issue to add instructions to some of the extensions, we can improve the descriptions at the same time.

cache: "npm"
cache-dependency-path: extensions/${{ env.EXTENSION_NAME }}/package-lock.json

- run: npm ci
Contributor

This isn't important and should not prevent merge, but since this is the only complex extension in the gallery and is likely to be used as an example/template to follow, it could be useful to have some of your amazing code comments in there that fully explain what's happening at each step of the way and why things are in here.

Collaborator Author

Good point. I'll add comments before I merge this.

@dotNomad dotNomad merged commit 3b0b840 into main Mar 18, 2025
9 checks passed
@dotNomad dotNomad deleted the dotnomad/extension-release-list branch March 18, 2025 23:48