Draft
Changes from all commits
Commits
91 commits
540b03b
Forked from oracle/graalpython - Initial Commit with ci
Ariouz Nov 14, 2025
c53861f
Fix set-export / unpack-artifact permissions
Ariouz Nov 14, 2025
445be61
Disabled OOM/Buff Overflow tests
Ariouz Nov 14, 2025
68c5dd4
Add phase shift warmup benchmark.
jchalou Nov 10, 2025
6781f3a
Use succinct option format.
jchalou Nov 12, 2025
aad9f30
Add documentation.
jchalou Nov 12, 2025
ff9f041
Process jobs "Downloads" and extract 'key' to env
Ariouz Nov 19, 2025
ff8d4c2
Fix PATH on ci
Ariouz Nov 20, 2025
4165828
Retag bytecode DSL tests
steve-s Nov 11, 2025
1d4d9bd
Retag more Bytecode DSL tests
steve-s Nov 11, 2025
bda8238
Retag more Bytecode DSL tests
steve-s Nov 12, 2025
008a521
Fix test_custom_iterator_return
steve-s Nov 12, 2025
3c75c08
Retag more Bytecode DSL tests
steve-s Nov 12, 2025
2d81110
Untag test failing in CI although it passes locally
steve-s Nov 12, 2025
dd6ed4d
Ignore some flaky ZoneInfoTest tests
steve-s Nov 14, 2025
ed34a7f
Remove transiently timing out test_threading:test_print_exception_gh_…
steve-s Nov 18, 2025
7156473
Update TCK provider to communicate that subscripting works with types…
timfel Nov 19, 2025
614748b
Fix PATH error on Linux/MacOS + Update imports
Ariouz Nov 20, 2025
a494af4
Disable Linux aarch64 svm build
Ariouz Nov 20, 2025
b742111
Construct download links using string format + Disable linux aarch64 …
Ariouz Nov 21, 2025
9f55132
Split python-unittest-retagger to 8 batches like python-tagged-unittests
Ariouz Nov 24, 2025
ed53829
Build: disable -g and enable -Ob while GITHUB_CI env is set
Ariouz Nov 24, 2025
0706bfe
Windows set-export / unpack-artifact scripts
Ariouz Nov 24, 2025
927f2e1
Add env: ${{ matrix.env }} to tier2 and 3 jobs
Ariouz Nov 25, 2025
59639ed
Add build arg -Ob on libpythonvm library build
Ariouz Nov 25, 2025
fce92cc
Add -J-XX:MaxRAMPercentage=90.0 on graalpy native / libpythonvm builds
Ariouz Nov 25, 2025
edabea7
upload report.json artifact
Ariouz Nov 25, 2025
f9dc6f5
test: python 3.12.8 on linux aarch64
Ariouz Dec 1, 2025
1cb5373
GItHub CI: Run linux amd64+aarch64 retagger weekly jobs and enable au…
Ariouz Dec 2, 2025
d4ef0c7
Update imports
Ariouz Dec 9, 2025
ee03bd3
WIP: Github tagged test exclusion character
Ariouz Dec 2, 2025
2e4bd61
revert aarch64 retag
Ariouz Dec 3, 2025
989cd60
Revert "Applied retagger diff for linux aarch64 tests"
Ariouz Dec 3, 2025
9de50d4
Revert "Applied diff with 'retagg failed' to true"
Ariouz Dec 3, 2025
bb981c3
Revert "Apply diff_report generated by retagger jobs"
Ariouz Dec 3, 2025
f635124
Update imports
Ariouz Dec 3, 2025
2b08d72
Disable PR test runs while in draft
Ariouz Dec 3, 2025
69d1581
Test: Apply new retagger exclusion character for github jobs
Ariouz Dec 3, 2025
8bd5752
Apply linux-amd64 new github exclusion test tag
Ariouz Dec 4, 2025
73c0cd2
wip: ci-unittest-retagger workflow file
Ariouz Dec 3, 2025
a53b747
Fix workflow
Ariouz Dec 3, 2025
a56e6f7
Use ci-matrix-gen to call retagger and run retagger merge script
Ariouz Dec 3, 2025
9988e22
Test MX_REPORT_SUFFIX with os-arch
Ariouz Dec 3, 2025
7b08e03
Use CURRENT_PLATFORM as mx report suffix while running in GITHUB_CI
Ariouz Dec 3, 2025
598935e
Tmp workflow retagger filter
Ariouz Dec 5, 2025
c961ad8
Extract PR test to a single workflow
Ariouz Dec 5, 2025
c1502fb
rm redundant check
Ariouz Dec 5, 2025
10ce536
PR workflow, rename job
Ariouz Dec 5, 2025
749d0e0
Allow retagger fiter for windows
Ariouz Dec 5, 2025
bfda686
Apply Linux aarch64 retags - fix retagger merge
Ariouz Dec 5, 2025
623c2f7
Fix Windows build
Ariouz Dec 5, 2025
2a09b26
unpack-artifact.cmd
Ariouz Dec 8, 2025
b88861e
Run retagger on Linux aarch64
Ariouz Dec 8, 2025
4e3248d
Run PR tests on synchronize
Ariouz Dec 8, 2025
c0f2511
Fix Windows arch name in retagger merge
Ariouz Dec 8, 2025
bcb63e9
disable tests on draft
Ariouz Dec 8, 2025
3576d43
Clean some debugs
Ariouz Dec 8, 2025
01354dc
Fix GITHUB_CI not set on merge job
Ariouz Dec 8, 2025
bf1ef89
Rm redundant check
Ariouz Dec 8, 2025
82afb67
Set build allocated memory relative system's available
Ariouz Dec 8, 2025
230f7e5
Open PR on weekly retagger (#8)
Ariouz Dec 8, 2025
e607a53
Add retagger label to PR
Ariouz Dec 9, 2025
6bd5d2a
Remove debugs
Ariouz Dec 9, 2025
908afa8
Merge oracle/graalpython master and fixes issues
Ariouz Dec 9, 2025
0aadcf2
Enable Windows in ci-unittest workflow
Ariouz Dec 9, 2025
9d17178
Update PR unittest trigger
Ariouz Dec 9, 2025
0df91d5
Rm debugs, fix github_ci_build_args not used
Ariouz Dec 9, 2025
5e1461d
Apply retags for linux-x86_64
Dec 9, 2025
8ce1883
Apply retags for win32-AMD64
Dec 9, 2025
4fbdefc
Update imports
Ariouz Dec 10, 2025
f3f43f1
Revert 'update imports', mx labsjdk version isn't up to date yet.
Ariouz Dec 9, 2025
9ecb978
Fix psutil package removed by mx update imports
Ariouz Dec 10, 2025
be4379c
test_repr_deep broken after mx update ?
Ariouz Dec 10, 2025
72adc6c
Fix choco / maven in PATH
Ariouz Dec 10, 2025
9de5ec0
Remove comments
Ariouz Dec 10, 2025
5a28761
Fix tagged test platform sort order
Ariouz Dec 10, 2025
b07efff
Add retagger workflow input
Ariouz Dec 10, 2025
6381b6e
Revert retagger gate to weekly, allow weekly in matrix
Ariouz Dec 10, 2025
eca2884
Fix .github/scripts/ path in github path
Ariouz Dec 10, 2025
bb9345a
Run weekly jobs only in tier1 otherwise it would run multiple times
Ariouz Dec 10, 2025
b5d9b6f
Build standalone artifacts before unittest/retags to avoid re-buildin…
Ariouz Dec 10, 2025
62b58b4
Handle case when no tag has changed
Ariouz Dec 10, 2025
0c708df
tmp: Add debug to retagger
Ariouz Dec 10, 2025
22ee071
Excluce 'bench' from pr unittest
Ariouz Dec 10, 2025
d591ea3
Revert mx update in ci/graal/common.jsonnet
Ariouz Dec 10, 2025
cd9a093
Update msbuild action / graal versions
Ariouz Dec 11, 2025
4b93051
Fix empty weekly retagger branch
Ariouz Dec 11, 2025
6104bd6
Rebase ci/graal on master
Ariouz Dec 11, 2025
7031498
Update microsoft/setup-msbuild version, fix empty retagger branch, fi…
Ariouz Dec 11, 2025
469adc5
Override linux aarch64 python version
Ariouz Dec 11, 2025
1391889
Override linux aarch64 python version
Ariouz Dec 11, 2025
322 changes: 322 additions & 0 deletions .github/scripts/extract_matrix.py
@@ -0,0 +1,322 @@
#!/usr/bin/env python3
import argparse
import fnmatch
import json
import os
import re
import shlex
import subprocess
import sys

from dataclasses import dataclass
from functools import cached_property, total_ordering
from typing import Any

DEFAULT_ENV = {
    "CI": "true",
    "PYTHONIOENCODING": "utf-8",
    "GITHUB_CI": "true"
}

# If any of these terms appear in the job JSON, the job does not run on public
# infrastructure.
JOB_EXCLUSION_TERMS = (
    "enterprise",
    "corporate-compliance",

    # Jobs failing in GitHub Actions: buffer overflow, out of memory
    "python-svm-unittest",
    "cpython-gate",

    "darwin",
)

DOWNLOADS_LINKS = {
    "GRADLE_JAVA_HOME": "https://download.oracle.com/java/{major_version}/latest/jdk-{major_version}_{os}-{arch_short}_bin{ext}"
}

# GitHub-hosted runner labels mapped to (os, arch)
OSS = {
    "macos-latest": ["darwin", "aarch64"],
    "ubuntu-latest": ["linux", "amd64"],
    "ubuntu-24.04-arm": ["linux", "aarch64"],
    "windows-latest": ["windows", "amd64"]
}

# Override unavailable Python versions for some OS/Arch combinations
PYTHON_VERSIONS = {
    "ubuntu-24.04-arm": "3.12.8",
}


@dataclass
class Artifact:
    name: str
    pattern: str


@total_ordering
class Job:
    def __init__(self, job: dict[str, Any]):
        self.job = job

    @cached_property
    def runs_on(self) -> str:
        capabilities = self.job.get("capabilities", [])

        for os, caps in OSS.items():
            if all(required in capabilities for required in caps):
                return os

        return "ubuntu-latest"

    @cached_property
    def name(self) -> str:
        return self.job["name"]

    @cached_property
    def targets(self) -> list[str]:
        return self.job.get("targets", [])

    @cached_property
    def env(self) -> dict[str, str]:
        return self.job.get("environment", {}) | DEFAULT_ENV

    @cached_property
    def mx_version(self) -> str | None:
        for k, v in self.job.get("packages", {}).items():
            if k == "mx":
                return v.strip("=<>~")

    @cached_property
    def python_version(self) -> str | None:
        python_version = None
        for k, v in self.job.get("packages", {}).items():
            if k == "python3":
                python_version = v.strip("=<>~")
        for k, v in self.job.get("downloads", {}).items():
            if k == "PYTHON3_HOME":
                python_version = v.get("version", python_version)
        if "MX_PYTHON" in self.env:
            del self.env["MX_PYTHON"]
        if "MX_PYTHON_VERSION" in self.env:
            del self.env["MX_PYTHON_VERSION"]

        if self.runs_on in PYTHON_VERSIONS:
            python_version = PYTHON_VERSIONS[self.runs_on]
        return python_version

    @cached_property
    def system_packages(self) -> list[str]:
        # TODO: support more packages
        system_packages = []
        for k, _ in self.job.get("packages", {}).items():
            if k.startswith("pip:"):
                continue
            elif k.startswith("00:") or k.startswith("01:"):
                k = k[3:]
            system_packages.append(f"'{k}'" if self.runs_on != "windows-latest" else f"{k}")
        return system_packages

    @cached_property
    def python_packages(self) -> list[str]:
        python_packages = []
        for k, v in self.job.get("packages", {}).items():
            if k.startswith("pip:"):
                python_packages.append(f"'{k[4:]}{v}'" if self.runs_on != "windows-latest" else f"{k[4:]}{v}")
        return python_packages

    def get_download_steps(self, key: str, version: str) -> str:
        download_link = self.get_download_link(key, version)
        filename = download_link.split('/')[-1]

        if self.runs_on == "windows-latest":
            return (f"""
Invoke-WebRequest -Uri {download_link} -OutFile {filename}
$dirname = (& tar -tzf {filename} | Select-Object -First 1).Split('/')[0]
tar -xzf {filename}
Add-Content $env:GITHUB_ENV "{key}=$(Resolve-Path $dirname)"
""")

        return (f"wget -q {download_link} && "
                f"dirname=$(tar -tzf {filename} | head -1 | cut -f1 -d '/') && "
                f"tar -xzf {filename} && "
                f'echo {key}=$(realpath "$dirname") >> $GITHUB_ENV')


    def get_download_link(self, key: str, version: str) -> str:
        os, arch = OSS[self.runs_on]
        major_version = version.split(".")[0]
        extension = ".tar.gz" if os != "windows" else ".zip"
        os = os if os != "darwin" else "macos"
        arch_short = {"amd64": "x64", "aarch64": "aarch64"}[arch]

        vars = {
            "major_version": major_version,
            "os": os,
            "arch": arch,
            "arch_short": arch_short,
            "ext": extension,
        }

        return DOWNLOADS_LINKS[key].format(**vars)
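    # Illustrative example only (the version string is a made-up assumption, not taken
    # from the job spec): on ubuntu-latest, get_download_link("GRADLE_JAVA_HOME", "21.0.1")
    # resolves to https://download.oracle.com/java/21/latest/jdk-21_linux-x64_bin.tar.gz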

    @cached_property
    def downloads(self) -> list[str]:
        downloads = []
        for k, download_info in self.job.get("downloads", {}).items():
            if k in DOWNLOADS_LINKS and download_info["version"]:
                downloads.append(self.get_download_steps(k, download_info["version"]))

        return downloads

    @staticmethod
    def common_glob(strings: list[str]) -> str:
        # Build a glob from the longest common prefix and suffix of the given names.
        assert strings
        if len(strings) == 1:
            return strings[0]
        prefix = strings[0]
        for s in strings[1:]:
            i = 0
            while i < len(prefix) and i < len(s) and prefix[i] == s[i]:
                i += 1
            prefix = prefix[:i]
            if not prefix:
                break
        suffix = strings[0][len(prefix):]
        for s in strings[1:]:
            i = 1
            while i <= len(suffix) and i <= len(s) and suffix[-i] == s[-i]:
                i += 1
            if i == 1:
                suffix = ""
                break
            suffix = suffix[-(i-1):]
        return f"{prefix}*{suffix}"

    @cached_property
    def upload_artifact(self) -> Artifact | None:
        if artifacts := self.job.get("publishArtifacts", []):
            assert len(artifacts) == 1
            dir = artifacts[0].get("dir", ".")
            patterns = artifacts[0].get("patterns", ["*"])
            return Artifact(
                artifacts[0]["name"],
                " ".join([os.path.normpath(os.path.join(dir, p)) for p in patterns])
            )
        return None

    @cached_property
    def download_artifact(self) -> Artifact | None:
        if artifacts := self.job.get("requireArtifacts", []):
            pattern = self.common_glob([a["name"] for a in artifacts])
            return Artifact(pattern, os.path.normpath(artifacts[0].get("dir", ".")))
        return None


    @staticmethod
    def flatten_command(args: list[str | list[str]]) -> list[str]:
        flattened_args = []
        for s in args:
            if isinstance(s, list):
                flattened_args.append(f"$( {shlex.join(s)} )")
            else:
                flattened_args.append(s)
        return flattened_args

    @cached_property
    def setup(self) -> str:
        cmds = [self.flatten_command(step) for step in self.job.get("setup", [])]
        return "\n".join(shlex.join(s) for s in cmds)

    @cached_property
    def run(self) -> str:
        cmds = [self.flatten_command(step) for step in self.job.get("run", [])]
        return "\n".join(shlex.join(s) for s in cmds)

    @cached_property
    def logs(self) -> str:
        return "\n".join(os.path.normpath(p) for p in self.job.get("logs", []))

    def to_dict(self):
        """
        This is the interchange with the YAML file that defines the GitHub jobs, so this
        is where we must match the strings and expectations of the GitHub workflow.
        """
        return {
            "name": self.name,
            "mx_version": self.mx_version,
            "os": self.runs_on,
            "python_version": self.python_version,
            "setup_steps": self.setup,
            "run_steps": self.run,
            "python_packages": " ".join(self.python_packages),
            "system_packages": " ".join(self.system_packages),
            "provide_artifact": [self.upload_artifact.name, self.upload_artifact.pattern] if self.upload_artifact else None,
            "require_artifact": [self.download_artifact.name, self.download_artifact.pattern] if self.download_artifact else None,
            "logs": self.logs.replace("../", "${{ env.PARENT_DIRECTORY }}/"),
            "env": self.env,
            "downloads_steps": " ".join(self.downloads),
        }

    def __str__(self):
        return str(self.to_dict())

    def __eq__(self, other):
        if isinstance(other, Job):
            return self.to_dict() == other.to_dict()
        return NotImplemented

    def __gt__(self, other):
        # Order jobs so that artifact producers sort before the jobs that consume them.
        if isinstance(other, Job):
            if self.job.get("runAfter") == other.name:
                return True
            if self.download_artifact and not other.download_artifact:
                return True
            if self.download_artifact and other.upload_artifact:
                if fnmatch.fnmatch(other.upload_artifact.name, self.download_artifact.name):
                    return True
            if not self.upload_artifact:
                return True
            return False
        return NotImplemented


def get_tagged_jobs(buildspec, target, filter=None):
    jobs = [Job({"name": target}).to_dict()]
    for job in sorted([Job(build) for build in buildspec.get("builds", [])]):
        if not any(t for t in job.targets if t in [target]):
            if "weekly" in job.targets and target == "tier1":
                pass
            else:
                continue
        if filter and not re.match(filter, job.name):
            continue
        if [x for x in JOB_EXCLUSION_TERMS if x in str(job)]:
            continue
        jobs.append(job.to_dict())
    return jobs


def main(jsonnet_bin, ci_jsonnet, target, filter=None, indent=False):
    result = subprocess.check_output([jsonnet_bin, ci_jsonnet], text=True)
    buildspec = json.loads(result)
    tagged_jobs = get_tagged_jobs(buildspec, target, filter=filter)
    matrix = tagged_jobs
    print(json.dumps(matrix, indent=2 if indent else None))
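

# Illustrative usage only (the binary path and workflow wiring are assumptions, not part
# of this script):
#   matrix=$(python3 .github/scripts/extract_matrix.py ./jsonnet ci.jsonnet tier1)
# The GitHub workflow is then presumably fed this JSON list through fromJSON() when
# building its job matrix.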


if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Generate GitHub CI matrix from Jsonnet buildspec.")
    parser.add_argument("jsonnet_bin", help="Path to jsonnet binary")
    parser.add_argument("ci_jsonnet", help="Path to ci.jsonnet spec")
    parser.add_argument("target", help="Target name (e.g., tier1)")
    parser.add_argument("filter", nargs="?", default=None, help="Regex filter for job names (optional)")
    parser.add_argument('--indent', action='store_true', help='Indent output JSON')
    args = parser.parse_args()
    main(
        jsonnet_bin=args.jsonnet_bin,
        ci_jsonnet=args.ci_jsonnet,
        target=args.target,
        filter=args.filter,
        indent=args.indent or sys.stdout.isatty(),
    )
72 changes: 72 additions & 0 deletions .github/scripts/merge_retagger_results.py
@@ -0,0 +1,72 @@
# ================================
#
# This script is used by CI to merge several retagger report JSON files; the merged
# report is then consumed by running: python3 runner.py merge-tags-from-reports reports-merged.json
#
# ================================
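#
# Example invocation (illustrative; the report directory and file names are assumptions):
#   python3 .github/scripts/merge_retagger_results.py --dir reports --pattern 'report-*' --outfile reports-merged.json
#   python3 runner.py merge-tags-from-reports reports-merged.json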

import os
import sys
import json
import glob
import argparse
from dataclasses import dataclass

# statuses we want to export
EXPORT_STATUS = ["FAILED"]

@dataclass
class Test:
    name: str
    status: str
    duration: str


def read_report(path: str) -> list[Test]:
    tests = []
    with open(path) as f:
        data = json.load(f)
        for result in data:
            name, status, duration = result.values()
            if status in EXPORT_STATUS:
                tests.append(Test(f"{name}", status, duration))

    return tests

def merge_tests(report: list[Test], merged: dict[str, dict]):
    for test in report:
        if test.name not in merged:
            merged[test.name] = test.__dict__

def export_reports(merged: dict[str, dict], outfile: str):
    with open(outfile, "w") as f:
        json.dump(list(merged.values()), f)
        print(f"=== Exported {len(merged)} ({EXPORT_STATUS}) tests to {f.name} ===")

def merge_reports(reports: list[str], outfile: str):
    merged_reports = {}
    for report in reports:
        report_tests = read_report(report)
        merge_tests(report_tests, merged_reports)

    export_reports(merged_reports, outfile)

def main(outfile: str, source_dir: str, pattern: str):
    path = f"{source_dir}/{pattern}"
    files = glob.glob(path)

    files = [file for file in files if file.endswith(".json")]
    merge_reports(files, outfile)


if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Merge unittest retagger report JSON files")
    parser.add_argument("--outfile", help="Output file name (optional)", default="reports-merged.json")
    parser.add_argument("--dir", help="Reports files directory (optional)", default=".")
    parser.add_argument("--pattern", default="*", help="Pattern matching for input files (optional)")

    args = parser.parse_args()
    main(
        outfile=args.outfile,
        source_dir=args.dir,
        pattern=args.pattern
    )