Codes for the paper "∞Bench: Extending Long Context Evaluation Beyond 100K Tokens": https://arxiv.org/abs/2402.13718
Official release of InternLM2 7B and 20B base and chat models, with 200K context support
Code and documents of LongLoRA and LongAlpaca (ICLR 2024 Oral)
Transformers with Arbitrarily Large Context
Open-source code for the paper "Retrieval Head Mechanistically Explains Long-Context Factuality"
Implementation of Infini-Transformer in PyTorch
"Found in the Middle: How Language Models Use Long Contexts Better via Plug-and-Play Positional Encoding" by Zhenyu Zhang, Runjin Chen, Shiwei Liu, Zhewei Yao, Olatunji Ruwase, Beidi Chen, Xiaoxia Wu, and Zhangyang Wang
PyTorch implementation of Infini-Transformer from "Leave No Context Behind: Efficient Infinite Context Transformers with Infini-attention" (https://arxiv.org/abs/2404.07143)
Implementation of MEGABYTE, "Predicting Million-byte Sequences with Multiscale Transformers", in PyTorch
LongAlign: A Recipe for Long Context Alignment Encompassing Data, Training, and Evaluation
The official implementation of "Ada-LEval: Evaluating long-context LLMs with length-adaptable benchmarks"
TriForce: Lossless Acceleration of Long Sequence Generation with Hierarchical Speculative Decoding
Implementation of 💍 Ring Attention, from Liu et al. at Berkeley AI, in PyTorch
The code of our paper "InfLLM: Unveiling the Intrinsic Capacity of LLMs for Understanding Extremely Long Sequences with Training-Free Memory"
The official repo for "LLoCo: Learning Long Contexts Offline"
Needle-in-a-haystack testing for LLMs
Implementation of paper "LM-Infinite: Simple On-the-Fly Length Generalization for Large Language Models"
LooGLE: Long Context Evaluation for Long-Context Language Models
LongBench: A Bilingual, Multitask Benchmark for Long Context Understanding
Implementation of Recurrent Memory Transformer (NeurIPS 2022 paper) in PyTorch
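Several of the repositories above (the needle-in-a-haystack tester, ∞Bench, LongBench, LooGLE, Ada-LEval) are long-context evaluations. The needle-in-a-haystack idea is the simplest: hide a single "needle" sentence at a controlled depth inside long filler text, then ask the model to recall it. A minimal sketch of how such a test case is constructed and scored, assuming nothing about any specific repo's API (all names here are hypothetical):

```python
def build_haystack_prompt(needle: str, filler: str, total_sentences: int, depth: float) -> str:
    """Embed a single 'needle' sentence at a relative depth inside filler text.

    depth=0.0 places the needle at the very start of the context,
    depth=1.0 at the very end; intermediate values interpolate linearly.
    """
    sentences = [filler] * total_sentences
    position = int(depth * total_sentences)
    sentences.insert(position, needle)
    return " ".join(sentences)

def needle_recalled(model_answer: str, expected_fact: str) -> bool:
    # A common scoring shortcut: case-insensitive substring match on the fact.
    return expected_fact.lower() in model_answer.lower()

# Example: a ~1000-sentence haystack with the needle buried at 50% depth.
needle = "The best thing to do in San Francisco is eat a sandwich."
prompt = build_haystack_prompt(needle, "The grass is green.", 1000, depth=0.5)
question = "What is the best thing to do in San Francisco?"
# The prompt + question would then be sent to the model under test,
# sweeping context length and depth to map where recall degrades.
```

Real harnesses sweep a grid of (context length, needle depth) pairs and often use an LLM judge rather than substring matching, but the construction above is the core of the test.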