Support attention_bias on LLaMA architecture #6658
Re-run triggered: December 1, 2023 17:16
Status: Success
Total duration: 21m 43s
Artifacts

build.yml

on: pull_request
Matrix: windows-latest-cmake-cublas
Matrix: windows-latest-cmake
ubuntu-focal-make: 1m 17s
ubuntu-latest-cmake: 1m 23s
macOS-latest-make: 2m 18s
macOS-latest-cmake: 3m 48s
macOS-latest-cmake-ios: 1m 27s
macOS-latest-cmake-tvos: 1m 34s
ios-xcode-build: 1m 31s
Matrix: macOS-latest-swift
Matrix: ubuntu-latest-cmake-mpi
Matrix: ubuntu-latest-cmake-sanitizer
release: 0s

Annotations: 1 error