Merged
@@ -56,7 +56,7 @@ Let's compare some aspects of each solution:
| Called directly within a job, not from a step | Run as a step within a job |
| Can contain multiple jobs | Does not contain jobs |
| Each step is logged in real-time | Logged as one step even if it contains multiple steps |
-| Can connect a maximum of ten levels of workflows | Can be nested to have up to 10 composite actions in one workflow |
+| Can connect a maximum of {% ifversion fpt or ghec %}ten {% else %}four {% endif %}levels of workflows | Can be nested to have up to 10 composite actions in one workflow |
| Can use secrets | Cannot use secrets |
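
For context, the two mechanisms compared in the table are consumed differently from a caller workflow. A minimal sketch with hypothetical paths and names (`uses:` at the job level calls a reusable workflow, while `uses:` inside `steps:` runs a composite action):

```yaml
name: Compare reuse mechanisms
on: [push]
jobs:
  via-reusable-workflow:
    # A reusable workflow is called directly within a job and can use secrets.
    uses: ./.github/workflows/reusable.yml
    secrets: inherit
  via-composite-action:
    runs-on: ubuntu-latest
    steps:
      # A composite action runs as a step; its inner steps log as one step.
      - uses: ./.github/actions/my-composite-action
```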

## Workflow templates
content/actions/how-tos/reuse-automations/reuse-workflows.md (4 changes: 3 additions & 1 deletion)
@@ -181,7 +181,9 @@ jobs:

## Nesting reusable workflows

-You can connect a maximum of ten levels of workflows - that is, the top-level caller workflow and up to nine levels of reusable workflows. For example: _caller-workflow.yml_ → _called-workflow-1.yml_ → _called-workflow-2.yml_ → _called-workflow-3.yml_ → ... → _called-workflow-9.yml_. Loops in the workflow tree are not permitted.
+You can connect a maximum of {% ifversion fpt or ghec %}ten levels of workflows - that is, the top-level caller workflow and up to nine levels of reusable workflows. For example: _caller-workflow.yml_ → _called-workflow-1.yml_ → _called-workflow-2.yml_ → _called-workflow-3.yml_ → ... → _called-workflow-9.yml_.{% else %}four levels of workflows - that is, the top-level caller workflow and up to three levels of reusable workflows. For example: _caller-workflow.yml_ → _called-workflow-1.yml_ → _called-workflow-2.yml_ → _called-workflow-3.yml_.{% endif %}
+
+Loops in the workflow tree are not permitted.

> [!NOTE] Nested reusable workflows require all workflows in the chain to be accessible to the caller, and permissions can only be maintained or reduced—not elevated—throughout the chain. For more information, see [AUTOTITLE](/actions/reference/reusable-workflows-reference#access-and-permissions-for-nested-workflows).
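
Each level in the chain described above is a workflow that is itself callable and calls the next one down. A minimal sketch of one such intermediate workflow (hypothetical file names taken from the example chain; `workflow_call` is the standard trigger that makes a workflow reusable):

```yaml
# .github/workflows/called-workflow-1.yml
# A reusable workflow that itself calls another reusable workflow,
# adding one more level to the nesting chain.
name: Called workflow 1
on:
  workflow_call:
jobs:
  call-next-level:
    uses: ./.github/workflows/called-workflow-2.yml
```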

@@ -46,8 +46,9 @@ For {% ifversion ghes or ghec %}internal or {% endif %}private repositories, the

### Limitations of reusable workflows

-* You can connect up to ten levels of workflows. For more information, see [Nesting reusable workflows](/actions/how-tos/sharing-automations/reuse-workflows#nesting-reusable-workflows).
-* You can call a maximum of 50 unique reusable workflows from a single workflow file. This limit includes any trees of nested reusable workflows that may be called starting from your top-level caller workflow file.
+* You can connect up to {% ifversion fpt or ghec %}ten {% else %}four {% endif %}levels of workflows. For more information, see [Nesting reusable workflows](/actions/how-tos/sharing-automations/reuse-workflows#nesting-reusable-workflows).
+* You can call a maximum of {% ifversion fpt or ghec %}50 {% else %}20 {% endif %}unique reusable workflows from a single workflow file. This limit includes any trees of nested reusable workflows that may be called starting from your top-level caller workflow file.

For example, _top-level-caller-workflow.yml_ → _called-workflow-1.yml_ → _called-workflow-2.yml_ counts as 2 reusable workflows.

@@ -80,8 +80,9 @@ If you're using a dedicated block device as your backup target, you need to init
This command:
* Formats the device (erases all data).
* Prepares it for use by the backup service.
-* Sets it to mount automatically at `/data/backup` on boot.
+* Sets it to mount automatically at `/data/backup` on boot.{% ifversion ghes > 3.19 %}
+* If in a clustered environment, configures the node in `cluster.conf` with the `backup-server` role.{% endif %}

{% ifversion ghes = 3.17 %}
From {% data variables.product.prodname_ghe_server %} 3.17.4 onward, the script is installed in PATH so you can run it directly using: `ghe-storage-init-backup /dev/YOUR_DEVICE_NAME`.
{% endif %}
@@ -109,9 +110,9 @@ If the device was already initialized using `ghe-storage-init-backup`, you can r

### Configuring backup settings

-After the backup target is mounted, the Backup Service page will become available in the {% data variables.enterprise.management_console %}. If you're using a block device, this requires completing the initialization or mount steps above.
+After the backup target is mounted, the Backup Service page will become available in the {% data variables.enterprise.management_console %}. {% ifversion ghes > 3.19 %} If your instance is part of a clustered environment, the system will automatically detect the node that was initialized with `ghe-storage-init-backup` and treat it as the backup server. {% endif %}

->[!NOTE] The settings page won’t appear until the backup storage is mounted at `/data/backup`.
+>[!NOTE] The settings page won’t appear until the backup storage is mounted at `/data/backup` by completing the initialization or mount steps above.

If you're migrating from {% data variables.product.prodname_enterprise_backup_utilities %}, you can transfer your configuration in one of two ways:

src/graphql/components/Changelog.tsx (24 changes: 15 additions & 9 deletions)
@@ -1,5 +1,6 @@
 import React from 'react'
 import cx from 'classnames'
+import GithubSlugger from 'github-slugger'
 
 import { HeadingLink } from '@/frame/components/article/HeadingLink'
 import { ChangelogItemT } from './types'
@@ -10,14 +11,19 @@ type Props = {
 }
 
 export function Changelog({ changelogItems }: Props) {
-  const changes = changelogItems.map((item) => {
+  const slugger = new GithubSlugger()
+
+  const changes = changelogItems.map((item, index) => {
     const heading = `Schema changes for ${item.date}`
+    const slug = slugger.slug(heading)
+
     return (
-      <div key={item.date}>
-        <HeadingLink as="h2">{heading}</HeadingLink>
-        {(item.schemaChanges || []).map((change, index) => (
-          <React.Fragment key={index}>
+      <div key={`${item.date}-${index}`}>
+        <HeadingLink as="h2" slug={slug}>
+          {heading}
+        </HeadingLink>
+        {(item.schemaChanges || []).map((change, changeIndex) => (
+          <React.Fragment key={changeIndex}>
             <p>{change.title}</p>
             <ul>
               {change.changes.map((changeItem) => (
@@ -26,8 +32,8 @@ export function Changelog({ changelogItems }: Props) {
             </ul>
           </React.Fragment>
         ))}
-        {(item.previewChanges || []).map((change, index) => (
-          <React.Fragment key={index}>
+        {(item.previewChanges || []).map((change, changeIndex) => (
+          <React.Fragment key={changeIndex}>
             <p>{change.title}</p>
             <ul>
               {change.changes.map((changeItem) => (
@@ -36,8 +42,8 @@ export function Changelog({ changelogItems }: Props) {
             </ul>
           </React.Fragment>
         ))}
-        {(item.upcomingChanges || []).map((change, index) => (
-          <React.Fragment key={index}>
+        {(item.upcomingChanges || []).map((change, changeIndex) => (
+          <React.Fragment key={changeIndex}>
             <p>{change.title}</p>
             {change.changes.map((changeItem) => (
               <li key={changeItem} dangerouslySetInnerHTML={{ __html: changeItem }} />
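The Changelog.tsx change above guards against slugger state leaking between renders: `GithubSlugger` deduplicates repeated headings by appending `-1`, `-2`, and so on, so a slugger shared across requests keeps mutating the ids it hands out. A minimal sketch of the failure mode, using `MiniSlugger` as a simplified stand-in for the real `github-slugger` package (assumed behavior only: lowercase, dash-separate, suffix duplicates):

```typescript
// Minimal stand-in for github-slugger: lowercases, dash-separates, and
// suffixes repeated slugs with -1, -2, ... (simplified; not the real library).
class MiniSlugger {
  private seen = new Map<string, number>()
  slug(value: string): string {
    const base = value
      .toLowerCase()
      .trim()
      .replace(/[^a-z0-9]+/g, '-')
      .replace(/^-|-$/g, '')
    const count = this.seen.get(base) ?? 0
    this.seen.set(base, count + 1)
    return count === 0 ? base : `${base}-${count}`
  }
}

// Module-level slugger (the bug): state accumulates across renders.
const sharedSlugger = new MiniSlugger()
function renderWithSharedSlugger(headings: string[]): string[] {
  return headings.map((h) => sharedSlugger.slug(h))
}

// Per-render slugger (the fix): each render starts with a fresh namespace.
function renderWithFreshSlugger(headings: string[]): string[] {
  const slugger = new MiniSlugger()
  return headings.map((h) => slugger.slug(h))
}

const headings = ['Schema changes for 2024-01-01']
console.log(renderWithSharedSlugger(headings)) // ['schema-changes-for-2024-01-01']
console.log(renderWithSharedSlugger(headings)) // second render: ['schema-changes-for-2024-01-01-1']
console.log(renderWithFreshSlugger(headings)) // ['schema-changes-for-2024-01-01']
console.log(renderWithFreshSlugger(headings)) // still ['schema-changes-for-2024-01-01']
```

Creating the slugger inside the component, as the diff does, gives each render a fresh id namespace, which is why the server-rendering test only fails on the second request to the page.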
src/graphql/tests/server-rendering.ts (20 changes: 20 additions & 0 deletions)
@@ -23,6 +23,26 @@ describe('server rendering certain GraphQL pages', () => {
     expect.assertions(hrefs.length + 1)
   })
 
+  test('minitoc hrefs on changelog match and verify slugger behavior', async () => {
+    // Test that the minitoc links match the heading ids, and also validate
+    // slugger behavior; see docs-engineering/issues#5792.
+    // It's a little funky because we need to make 2 requests to the page, to
+    // exercise the problem behavior where slugger state accumulates across
+    // requests; it won't fail the first time around.
+    await getDOM('/graphql/overview/changelog')
+    const $ = await getDOM('/graphql/overview/changelog')
+    const links = $('[data-testid="minitoc"] a[href]')
+    const hrefs = links.map((i, link) => $(link).attr('href')).get()
+    const headings = $('#article-contents h2')
+    const headingIds = headings.map((i, heading) => `#${$(heading).attr('id')}`).get()
+
+    expect(hrefs.length).toBe(headingIds.length)
+
+    for (let i = 0; i < hrefs.length; i++) {
+      expect(hrefs[i]).toBe(headingIds[i])
+    }
+  })
+
   const autogeneratedPages = pageList.filter(
     (page) => page.autogenerated === 'graphql' && page.relativePath.includes('reference'),
   )