ncnn is a high-performance neural network inference framework optimized for the mobile platform
Parsing gigabytes of JSON per second: used by Facebook/Meta Velox, WatermelonDB, Apache Doris, Milvus, StarRocks
An open source time-series database for fast ingest and SQL queries
OpenGL Mathematics (GLM)
A blazingly fast JSON serializing & deserializing library
Performance-portable, length-agnostic SIMD with runtime dispatch
The Compute Library is a set of computer vision and machine learning functions optimised for both Arm CPUs and GPUs using SIMD technologies.
🚀 A collection of efficient approximate nearest neighbor search algorithms, written in Rust 🦀.
Open source C++ skeletal animation library and toolset
C++ wrappers for SIMD intrinsics and parallelized, optimized mathematical functions (SSE, AVX, AVX512, NEON, SVE)
📽 Highly Optimized Graphics Math (glm) for C
C++ image processing and machine learning library using SIMD: SSE, AVX, AVX-512, and AMX for x86/x64, VMX (Altivec) and VSX (Power7) for PowerPC, and NEON for ARM.
Building game development ecosystem for @ziglang!
Inference Llama 2 in one file of pure 🔥
Acceleration package for neural networks on multi-core CPUs