Merge branch 'master' into campaign-manager
* master: (126 commits)
  ReadMe.md: Mention 2.9.2 release
  Cleanup server output a bit (ornladios#3914)
  ci: set openmpi and openmp params
  Example using Kokkos buffers with SST
  Changes to MallocV to take into consideration the memory space of a variable
  Change install directory of Gray scott files again
  ci,crusher: increase supported num branches
  ci: add shellcheck coverage to source and testing
  Change install directory of Gray scott files
  Only rank 0 should print the initialization message in perfstub
  Defining and computing derived variables (ornladios#3816)
  Add Remote "-status" command to see if a server is running and where (ornladios#3911)
  examples,hip: use find_package(hip) once in proj
  Add Steps Tutorial
  Add Operators Tutorial
  Add Attributes Tutorial
  Add Variables Tutorial
  Add Hello World Tutorial
  Add Tutorials' Download and Build section
  Add Tutorials' Overview section
  Improve bpStepsWriteRead* examples
  Rename bpSZ to bpOperatorSZWriter
  Convert bpAttributeWriter to bpAttributeWriteRead
  Improve bpWriter/bpReader examples
  Close file after reading for hello-world.py
  Fix names of functions in engine
  Fix formatting warnings
  Add dataspaces.rst in the list of engines
  Add query.rst
  cmake: find threads package first
  docs: update new_release.md
  Bump version to v2.9.2
  ci: update number of task for mpich build
  clang-format: Correct format to old style
  Merge pull request ornladios#3878 from anagainaru/test-null-blocks
  Merge pull request ornladios#3588 from vicentebolea/fix-mpi-dp
  Adding tests for writing null blocks with and without compression
  bp5: make RecMap an static anon namespaced var
  Replace LookupWriterRec's linear search on RecList with an unordered_map. For 250k variables, time goes from 21sec to ~1sec in WSL. The order of entries in RecList was not necessary for the serializer to work correctly.
  Replace LookupWriterRec's linear search on RecList with an unordered_map. For 250k variables, time goes from 21sec to ~1sec in WSL. The order of entries in RecList was not necessary for the serializer to work correctly. (ornladios#3877) (an illustrative sketch of this lookup change follows the commit list)
  Fix data length calculation for hash (ornladios#3875)
  Merge pull request ornladios#3823 from eisenhauer/SstMemSel
  Merge pull request ornladios#3805 from pnorbert/fix-bpls-string-scalar
  Merge pull request ornladios#3804 from pnorbert/fix-aws-version
  Merge pull request ornladios#3759 from pnorbert/bp5dbg-metadata
  new attempt to commit query support of local array. (ornladios#3868)
  MPI::MPI_Fortran should be INTERFACE not PUBLIC
  Fix hip example compilation error (ornladios#3865)
  Server Improvements (ornladios#3862)
  ascent,ci: remove unshallow flag
  Remove Slack as a contact mechanism (ornladios#3866)
  bug fix:  syntax error in json  output (ornladios#3857)
  Update the bpWriterReadHip example's cmake to run on crusher
  Examples: Use BPFile instead of BP3/4/5 for future-proof
  inlineMWE example: Close files at the end
  Examples: Add BeginStep/EndStep wherever it was missing
  BP5Serializer: handle local variables that use operators (ornladios#3859)
  gha,ci: update checkout to v4
  Blosc2 USE ON: Fix Module Fallback
  cmake: correct prefer_shared_blosc behavior
  cmake: correct info.h installation path
  ci: disable MGARD static build
  operators: fix module library
  ci: add downloads readthedocs
  cmake: Add Blosc2 2.10.1 compatibility.
  Blosc2 USE ON: Fix Module Fallback (ornladios#3774)
  Fix destdir install test (ornladios#3850)
  cmake: update minimum cmake to 3.12 (ornladios#3849)
  MPI: add timeout for conf test for MPI_DP (ornladios#3848)
  MPI_DP: do not call MPI_Init (ornladios#3847)
  install: export adios2 device variables (ornladios#3819)
  Merge pull request ornladios#3799 from vicentebolea/support-new-yaml-cpp
  Merge pull request ornladios#3737 from vicentebolea/fix-evpath-plugins-path
  SST,MPI,DP: soft handle peer error
  SST,MPI,DP: improve uniq identifier
  Fix destdir install test (ornladios#3850)
  cmake: include ctest before detectoptions
  ci: enable tau check
  Add/Improve the ReadMe.md files in examples directory
  Disable BUILD_TESTING and ADIOS2_BUILD_EXAMPLES by default
  Remove testing based on ADIOS2-examples
  Fix formatting issue in DetectOptions.cmake
  Add examples from ADIOS2-Examples
  Improve existing examples
  MPI_DP: do not call MPI_Init (ornladios#3847)
  cmake: update minimum cmake to 3.12 (ornladios#3849)
  MPI: add timeout for conf test for MPI_DP (ornladios#3848)
  Tweak Remote class and test multi-threaded file remote access (ornladios#3834)
  Add prototype testing of remote functionality (ornladios#3830)
  Try always using the MPI version
  Try always using the MPI version
  Import tests from bp to staging common, implement memory selection in SST
  ci: fix codeql ignore path (ornladios#3772)
  install: export adios2 device variables (ornladios#3819)
  added support to query BP5 files (ornladios#3809)
  Partial FFS Upstream, only changes to type_id
  ffs 2023-09-19 (67e411c0)
  Fix abs/rel step in BP5 DoCount
  fix dummy Win build
  Pass Array Order of reader to remote server for proper Get() operation
  ...
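
The LookupWriterRec entry above replaces a linear scan of RecList with a hash-map lookup, turning an O(n) search per variable into an O(1) one; that is what the reported drop from 21 sec to ~1 sec for 250k variables reflects. The commit itself changes the C++ BP5 serializer, which is not reproduced here; the following is only a minimal Python analogue of the same idea, with invented record names, to illustrate why the lookup gets cheaper.

```python
# Minimal analogue of replacing a linear RecList scan with a hash-map lookup.
# The record names below are invented for illustration only.
import time

records = [{"name": f"var{i}", "rec": i} for i in range(250_000)]

# Old approach: scan the list until the matching name is found (O(n) per lookup).
def lookup_linear(name):
    for entry in records:
        if entry["name"] == name:
            return entry["rec"]
    return None

# New approach: build an index once, then look names up in O(1) on average.
index = {entry["name"]: entry["rec"] for entry in records}

def lookup_hashed(name):
    return index.get(name)

for fn in (lookup_linear, lookup_hashed):
    start = time.perf_counter()
    for i in range(0, 250_000, 2_500):  # sample 100 lookups spread across the list
        fn(f"var{i}")
    print(fn.__name__, f"{time.perf_counter() - start:.4f}s")
```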
pnorbert committed Nov 17, 2023
2 parents c6ae903 + 5752d99 commit cc2f83a
Showing 589 changed files with 109,992 additions and 4,477 deletions.
5 changes: 4 additions & 1 deletion .github/ISSUE_TEMPLATE/new_release.md
@@ -33,7 +33,7 @@ git push
- [ ] Create PR (BASE to master if release_@MAJOR@@MINOR@ does not exist; otherwise release_@MAJOR@@MINOR@)
- [ ] Ask for review
- [ ] Merge PR
- [ ] Create Tag commit `git tag -a v@VERSION@ the_merge_commit`
- [ ] Create Tag commit `git tag -a -m 'v@VERSION' v@VERSION@ the_merge_commit`
- [ ] Create Release in GitHub page
- Use the following script for getting the PR of this release
- `./scripts/developer/create-changelog.sh v@VERSION@ v@OLD_RELEASE@`
@@ -66,5 +66,8 @@ git push origin master
- CondaForge robot should do this for you automatically, expect a new PR at
https://github.com/conda-forge/adios2-feedstock a couple of hours after the
release.
- [ ] Submit a MR for ParaView Superbuild to use v@VERSION@ release.
- [ ] Update the website to point to the v@VERSION@ release
- [ ] Write an announcement in the ADIOS-ECP mail-list
(https://groups.google.com/a/kitware.com/g/adios-ecp)

43 changes: 20 additions & 23 deletions .github/workflows/everything.yml
@@ -51,7 +51,7 @@ jobs:
outputs:
num_code_changes: ${{ steps.get_code_changes.outputs.num_code_changes }}
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Check for appropriately named topic branch
@@ -80,10 +80,10 @@ jobs:
image: ghcr.io/ornladios/adios2:ci-formatting

steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
with:
path: gha
- uses: actions/checkout@v3
- uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
path: source
@@ -153,10 +153,10 @@ jobs:
shared: static
parallel: serial
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
with:
path: gha
- uses: actions/checkout@v3
- uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
path: source
@@ -218,10 +218,10 @@ jobs:
parallel: [ompi]

steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
with:
path: gha
- uses: actions/checkout@v3
- uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
path: source
@@ -285,10 +285,10 @@ jobs:
compiler: xcode13_4_1

steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
with:
path: gha
- uses: actions/checkout@v3
- uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
path: source
@@ -352,10 +352,10 @@ jobs:
shell: bash

steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
with:
path: gha
- uses: actions/checkout@v3
- uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
path: source
@@ -386,7 +386,7 @@ jobs:
baseos: [ubuntu-bionic]

steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
path: ci-source
@@ -448,24 +448,19 @@ jobs:
strategy:
fail-fast: false
matrix:
code: [examples, lammps, tau]
code: [lammps, tau]
include:
- code: examples
repo: ornladios/ADIOS2-Examples
ref: master
- code: lammps
repo: pnorbert/lammps
ref: fix-deprecated-adios-init
- code: tau
repo: ornladios/ADIOS2-Examples
ref: master

defaults:
run:
shell: bash -c "docker exec adios2-ci bash --login -e $(echo {0} | sed 's|/home/runner/work|/__w|g')"

steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
if: ${{ matrix.repo != '' }}
with:
repository: ${{ matrix.repo }}
ref: ${{ matrix.ref }}
@@ -523,10 +518,10 @@ jobs:
language: [ 'cpp' ]

steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
with:
path: gha
- uses: actions/checkout@v3
- uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
path: source
@@ -535,8 +530,10 @@
with:
languages: ${{ matrix.language }}
config: |
paths:
- source
paths-ignore:
- source/thirdparty/
- source/thirdparty
- name: Setup
run: gha/scripts/ci/gh-actions/linux-setup.sh
- name: Update
8 changes: 4 additions & 4 deletions .gitlab/config/SpackCIBridge.py
@@ -169,7 +169,7 @@ def list_github_prs(self):
# Check if we should defer pushing/testing this PR because it is based on "too new" of a commit
# of the main branch.
tmp_pr_branch = f"temporary_{pr_string}"
subprocess.run(["git", "fetch", "--unshallow", "github",
subprocess.run(["git", "fetch", "github",
f"refs/pull/{pull.number}/head:{tmp_pr_branch}"], check=True)
# Get the merge base between this PR and the main branch.
try:
@@ -226,7 +226,7 @@ def list_github_prs(self):
# then we will push the merge commit that was automatically created by GitHub to GitLab
# where it will kick off a CI pipeline.
try:
subprocess.run(["git", "fetch", "--unshallow", "github",
subprocess.run(["git", "fetch", "github",
f"{pull.merge_commit_sha}:{pr_string}"], check=True)
except subprocess.CalledProcessError:
print("Failed to locally checkout PR {0} ({1}). Skipping"
@@ -306,7 +306,7 @@ def setup_git_repo(self):
self.gitlab_shallow_fetch()

if self.main_branch:
subprocess.run(["git", "fetch", "--unshallow", "github", self.main_branch], check=True)
subprocess.run(["git", "fetch", "github", self.main_branch], check=True)

def get_gitlab_pr_branches(self):
"""Query GitLab for branches that have already been copied over from GitHub PRs.
@@ -350,7 +350,7 @@ def update_refspecs_for_tags(self, tags, open_refspecs, fetch_refspecs):
def fetch_github_branches(self, fetch_refspecs):
"""Perform `git fetch` for a given list of refspecs."""
print("Fetching GitHub refs for open PRs")
fetch_args = ["git", "fetch", "-q", "--unshallow", "github"] + fetch_refspecs
fetch_args = ["git", "fetch", "-q", "github"] + fetch_refspecs
subprocess.run(fetch_args, check=True)

def build_local_branches(self, protected_branches):
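
The SpackCIBridge.py hunks above drop `--unshallow` from several `git fetch` invocations: a plain fetch works whether or not the local clone is shallow, while `git fetch --unshallow` aborts on a repository that is already complete. Purely as an illustration (not part of this commit), a guarded variant could pass the flag only when the clone really is shallow:

```python
# Sketch: only pass --unshallow when the local clone is actually shallow.
# Not part of this commit; shown only to illustrate why the flag was removable.
import subprocess

def fetch(remote, refspec):
    probe = subprocess.run(
        ["git", "rev-parse", "--is-shallow-repository"],
        capture_output=True, text=True, check=True)
    args = ["git", "fetch"]
    if probe.stdout.strip() == "true":
        args.append("--unshallow")  # only valid on a shallow clone
    subprocess.run(args + [remote, refspec], check=True)

# Hypothetical usage:
# fetch("github", "refs/pull/123/head:temporary_pr123")
```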
62 changes: 51 additions & 11 deletions .gitlab/config/generate_pipelines.py
@@ -18,9 +18,43 @@
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)


def request_as_dict(url):
r = requests.get(url + '?per_page=100', verify=False)
return r.json()
class skip_after_n_successes:
def __init__(self, default_value, n):
self.runs_max = n
self.runs_current = 0
self.default_value = default_value

def __call__(self, fn, *args, **kwargs):
if self.runs_current >= self.runs_max:
return self.default_value

ret = fn(*args, **kwargs)
if ret:
self.runs_current += 1
return ret


def http_get_request(*args, **kwargs):
kwargs['verify'] = False
return requests.get(*args, **kwargs)


def request_as_list(url, *args, **kwargs):
current_url = url
body_json = []
while current_url:
response = http_get_request(current_url, *args, **kwargs)
body_json += response.json()

header = response.headers
current_url = None
if 'link' in header:
links = re.search(
r'(?<=\<)([\S]*)(?=>; rel="next")', header['link'], flags=re.IGNORECASE)
if links is not None:
current_url = links.group(0)

return body_json


def add_timestamp(branch):
Expand All @@ -44,7 +78,12 @@ def has_no_status(branch):
gh_commit_sha = branch['commit']['parent_ids'][1]

# Query GitHub for the status of this commit
commit = request_as_dict(gh_url + '/commits/' + gh_commit_sha + '/status')
response = http_get_request(
gh_url + '/commits/' + gh_commit_sha + '/status')
if int(response.headers['x-ratelimit-remaining']) <= 0:
raise ConnectionError(response.json())

commit = response.json()
if commit is None or 'sha' not in commit:
return False

@@ -88,14 +127,15 @@ def has_no_status(branch):
with open(args.template_file, 'r') as fd:
template_str = fd.read()

branches = request_as_dict(gl_url + '/repository/branches')
branches = map(add_timestamp, branches)
branches = filter(is_recent, branches)
branches = filter(has_no_status, branches)

# Select the arg.max most least recent branches
branches = request_as_list(gl_url + '/repository/branches')
branches = [add_timestamp(branch) for branch in branches]
branches = [b for b in branches if is_recent(b)]
branches = sorted(branches, key=lambda x: x['dt'])
branches = itertools.islice(branches, args.max)

  # Stop calling has_no_status (and return the default False) once it has returned True args.max times.
  # We need this to avoid exhausting the GitHub REST API's strict rate limit.
run_n_times = skip_after_n_successes(default_value=False, n=args.max)
branches = [b for b in branches if run_n_times(has_no_status, b)]

for branch in branches:
print(template_str.format(
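
The generate_pipelines.py changes above stop relying on a single `?per_page=100` request: `request_as_list` now walks the `Link: <...>; rel="next"` header until the last page, and `has_no_status` raises once `x-ratelimit-remaining` reaches zero. A condensed, standalone sketch of that pagination pattern follows; the endpoint URL is a placeholder, and unlike the script above, TLS verification is left enabled here.

```python
# Standalone sketch of the Link-header pagination added in generate_pipelines.py.
# The URL below is a placeholder; any GitLab/GitHub-style paginated endpoint works.
import re
import requests

def request_as_list(url, **kwargs):
    items = []
    while url:
        response = requests.get(url, **kwargs)
        response.raise_for_status()
        items += response.json()
        # The next page, if any, is advertised in the Link header as rel="next".
        link = response.headers.get("link", "")
        match = re.search(r'(?<=<)([\S]*)(?=>; rel="next")', link, flags=re.IGNORECASE)
        url = match.group(0) if match else None
    return items

if __name__ == "__main__":
    branches = request_as_list("https://gitlab.example.com/api/v4/projects/1/repository/branches")
    print(len(branches), "branches")
```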
15 changes: 15 additions & 0 deletions .shellcheck_exclude_paths
@@ -15,3 +15,18 @@ scripts/developer/setup.sh
scripts/docker/setup-user.sh
scripts/runconf/runconf.sh
scripts/runconf/runconf_olcf.sh
testing/contract/lammps/build.sh
testing/contract/lammps/config.sh
testing/contract/lammps/install.sh
testing/contract/lammps/setup.sh
testing/contract/lammps/test.sh
testing/contract/scorpio/build.sh
testing/contract/scorpio/config.sh
testing/contract/scorpio/install.sh
testing/contract/scorpio/setup.sh
testing/contract/scorpio/test.sh
testing/contract/tau/build.sh
testing/contract/tau/config.sh
testing/contract/tau/install.sh
testing/contract/tau/setup.sh
testing/contract/tau/test.sh
26 changes: 17 additions & 9 deletions CMakeLists.txt
@@ -137,7 +137,11 @@ if((NOT BUILD_SHARED_LIBS) AND (NOT DEFINED CMAKE_POSITION_INDEPENDENT_CODE))
set(CMAKE_POSITION_INDEPENDENT_CODE ON)
endif()

# Ctest creates BUILD_TESTING option and sets it to true by default
# Here we disable BUILD_TESTING option by default
option(BUILD_TESTING "Build testing" OFF)
include(CTest)
mark_as_advanced(CLEAR BUILD_TESTING)

adios_option(Blosc2 "Enable support for c-blosc-2 transforms" AUTO)
adios_option(BZip2 "Enable support for BZip2 transforms" AUTO)
@@ -166,8 +170,10 @@ adios_option(Profiling "Enable support for profiling" AUTO)
adios_option(Endian_Reverse "Enable support for Little/Big Endian Interoperability" AUTO)
adios_option(Sodium "Enable support for Sodium for encryption" AUTO)
adios_option(Catalyst "Enable support for in situ visualization plugin using ParaView Catalyst" AUTO)
adios_option(AWSSDK "Enable support for S3 compatible storage using AWS SDK's S3 module" AUTO)
adios_option(SQLite3 "Enable support for SQLite3 required by campaign manager" AUTO)
adios_option(SQLite3 "Enable support for SQLite3 required by campaign manager" OFF)
adios_option(AWSSDK "Enable support for S3 compatible storage using AWS SDK's S3 module" OFF)
adios_option(Derived_Variable "Enable support for derived variables" OFF)
include(${PROJECT_SOURCE_DIR}/cmake/DetectOptions.cmake)

if(ADIOS2_HAVE_CUDA OR ADIOS2_HAVE_Kokkos_CUDA)
@@ -177,7 +183,7 @@ if(ADIOS2_HAVE_CUDA OR ADIOS2_HAVE_Kokkos_CUDA)
if(DEFINED Kokkos_CUDA_ARCHITECTURES)
set(CMAKE_CUDA_ARCHITECTURES ${Kokkos_CUDA_ARCHITECTURES})
else()
# Mininum common non-deprecated architecture
# Minimum common non-deprecated architecture
set(CMAKE_CUDA_ARCHITECTURES 52)
endif()
endif()
@@ -240,8 +246,8 @@ endif()
set(ADIOS2_CONFIG_OPTS
DataMan DataSpaces HDF5 HDF5_VOL MHS SST Fortran MPI Python Blosc2 BZip2
LIBPRESSIO MGARD PNG SZ ZFP DAOS IME O_DIRECT Sodium Catalyst SysVShMem UCX
ZeroMQ Profiling Endian_Reverse AWSSDK GPU_Support CUDA Kokkos Kokkos_CUDA
Kokkos_HIP Kokkos_SYCL SQLite3
ZeroMQ Profiling Endian_Reverse Derived_Variable AWSSDK GPU_Support CUDA Kokkos
Kokkos_CUDA Kokkos_HIP Kokkos_SYCL SQLite3
)

GenerateADIOSHeaderConfig(${ADIOS2_CONFIG_OPTS})
@@ -288,12 +294,15 @@ if(BUILD_SHARED_LIBS AND ADIOS2_RUN_INSTALL_TEST)
endif()
endif()

if(MSVC AND BUILD_SHARED_LIBS AND ADIOS2_HAVE_HDF5)
# See note about building with visual studio and shared libs, here:
# https://github.com/HDFGroup/hdf5/blob/develop/release_docs/USING_HDF5_VS.txt
add_definitions(-DH5_BUILT_AS_DYNAMIC_LIB=1)
endif()

#------------------------------------------------------------------------------#
# Third party libraries
#------------------------------------------------------------------------------#
include(CTest)
mark_as_advanced(BUILD_TESTING)
add_subdirectory(thirdparty)

#------------------------------------------------------------------------------#
@@ -314,8 +323,7 @@ add_subdirectory(plugins)
#------------------------------------------------------------------------------#
# Examples
#------------------------------------------------------------------------------#
option(ADIOS2_BUILD_EXAMPLES "Build examples" ON)
option(ADIOS2_BUILD_EXAMPLES_EXPERIMENTAL "Build experimental examples" OFF)
option(ADIOS2_BUILD_EXAMPLES "Build examples" OFF)
if(ADIOS2_BUILD_EXAMPLES)
add_subdirectory(examples)
endif()
3 changes: 1 addition & 2 deletions ReadMe.md
@@ -70,15 +70,14 @@ Once ADIOS2 is installed refer to:

## Releases

* Latest release: [v2.9.0](https://github.com/ornladios/ADIOS2/releases/tag/v2.9.0)
* Latest release: [v2.9.2](https://github.com/ornladios/ADIOS2/releases/tag/v2.9.2)

* Previous releases: [https://github.com/ornladios/ADIOS2/releases](https://github.com/ornladios/ADIOS2/releases)

## Community

ADIOS2 is an open source project: Questions, discussion, and contributions are welcome. Join us at:

- Slack workspace: [![Slack](https://img.shields.io/badge/slack-ADIOS2-pink.svg)](https://adios2.spack.io)
- Mailing list: adios-ecp@kitware.com
- Github Discussions: https://github.com/ornladios/ADIOS2/discussions

