Releasing software at Open Collective #6951
BenJam started this conversation in Show and tell
In this post I talk about evolving how we release software at Open Collective, and how we can do a better job of making our release processes more transparent, accountable and engaging for users and staff alike.
Open Collective is continuing to mature as an organisation, and with that maturity comes a necessary degree of process, there to ensure that we're working on the most important and impactful projects at any given time. That process can begin to feel like bureaucracy, or worse still autocracy, unless the team and our users are engaged in prioritisation and there is a high level of accountability and transparency in judging the success or failure of our work.
I am hugely appreciative of the work that @iamronen has been doing to bring our staff into the prioritisation process on Coda, and I continue to support him in bringing our users into that environment. That said, as the person nominally responsible for product management, it falls on me to say that we have not been doing a good enough job, toward the end of a project, of conveying a clear release plan, including how and when we'll demonstrate the success of a project to the rest of the team. So I'm addressing that here.
Plausible Analytics
Plausible is a privacy-preserving front-end analytics tool that allows us to better understand interactions on the platform that can't be measured using Metabase (i.e. anything that doesn't write an entry to the database). We hooked up Plausible a while back, but we've not been getting the most out of it... until now.
We took the opportunity this cycle to A/B test some changes to our contribution flow around platform tips. @hdiniz created the necessary tags to build a funnel system describing how successful a new variant was compared to the last:
With data like this, alongside the data we already collect and represent in Metabase, we can make more informed decisions about whether the changes we're proposing are improving the outcomes we care about. Plausible addresses a blind spot that we've had, and I would like us to use it to demonstrate user experience improvements.
Feature Preview
Meanwhile, over the past few months, while working on a couple of projects, @gustavlrsn has stealthily delivered feature previews!
Feature previews provide us with some additional tools to test the water when managing a release. They allow us to test with users in a way that does not impact everyone, they give us valuable data about users who try new features and whether they continue to use them, and they give us an additional channel to communicate the work we're doing on the platform. Feature previews give us the opportunity to test significant new features and improvements, and that is how I intend to use them from here on.
Release categories
With the addition of feature preview, I think it's important for us to be clear and consistent about how we release software, so I would like us to use the following framework for describing how we intend to test improvements and new features:
When in a pre-release state we have an additional choice regarding the default setting on feature preview for the users who are included in a pre-release:
Typically a new feature will go through alpha, a limited beta with an opt-in, a beta with an opt-out, then a general release, but other combinations may be selected as needed. At each stage we monitor opt-ins and opt-outs to understand whether a new feature or improvement is actually working for the users who are experiencing it.
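The staged rollout described above can be sketched as a small gating function: each stage pairs an audience with a feature-preview default, and a user's explicit opt-in or opt-out overrides that default. The stage names follow this post, but the audience rules, data shape, and function are assumptions for illustration, not the platform's real feature-flag code.

```javascript
// Hypothetical sketch of the release stages described above. Each stage has
// an audience and a default feature-preview setting; these are assumptions.
const STAGES = {
  alpha:          { audience: 'staff',    previewDefault: 'on'  },
  'beta-opt-in':  { audience: 'everyone', previewDefault: 'off' },
  'beta-opt-out': { audience: 'everyone', previewDefault: 'on'  },
  release:        { audience: 'everyone', previewDefault: 'on'  },
};

// A feature is visible if the user is in the stage's audience, honouring any
// explicit feature-preview choice (`override`: true, false, or undefined).
function isFeatureEnabled(stage, user, override) {
  const cfg = STAGES[stage];
  if (!cfg) return false;
  if (cfg.audience === 'staff' && !user.isStaff) return false;
  if (override !== undefined) return override;
  return cfg.previewDefault === 'on';
}
```

Keeping the per-stage defaults in one table means moving a feature from opt-in beta to opt-out beta is a one-line change, while each user's recorded override survives the transition, which is exactly the signal we want to monitor.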
Addressing the backlog
Over the past 8 months we have accumulated a backlog of projects that have yet to be released in a way that meets our own expectations, let alone the expectations of our users.
I will be working with @iamronen and @aerugo to review and categorise the work we've done this year: creating feature preview entries for projects currently in a state of beta development, creating dashboards to measure and track their success, adding closing comments with follow-ups to issues, and otherwise polishing the work that has been left hanging this year.