Stars
An LLM-powered knowledge curation system that researches a topic and generates a full-length report with citations.
A deconvolution pipeline designed to enhance the quality of 3D cryo-EM maps that suffer from anisotropic resolutions.
CryoTEN: Efficiently Enhancing Cryo-EM Density Maps Using Transformers
Deep learning tools for converting cryo-EM density maps to protein structures
Backward-compatibly refactor header-based C++ into modules.
DeepChat - A smart assistant that connects powerful AI to your personal world
A list of useful Open Source tools and scrapers to gather data for LLMs
Anthropic’s Model Context Protocol implementation for Oat++
The largest collection of PyTorch image encoders / backbones. Including train, eval, inference, export scripts, and pretrained weights -- ResNet, ResNeXT, EfficientNet, NFNet, Vision Transformer (V…
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in Pytorch
🐳 Efficient Triton implementations for "Native Sparse Attention: Hardware-Aligned and Natively Trainable Sparse Attention"
A powerful VSCode extension that enables MCP server usage in Copilot, giving it access to MCP tools, resources, and more.
A standardized protein design benchmark for motif-scaffolding problems
A list of 3D computer vision papers with Transformers
🍒 Cherry Studio is a desktop client that supports multiple LLM providers, including deepseek-r1
Implementation of the sparse attention pattern proposed by the Deepseek team in their "Native Sparse Attention" paper
ChatMCP is an AI chat client implementing the Model Context Protocol (MCP).
(Supports DeepSeek R1) An AI-powered research assistant that performs iterative, deep research on any topic by combining search engines, web scraping, and large language models.
Materials for the Learn PyTorch for Deep Learning: Zero to Mastery course.
⭕ Quick-reference cheat sheets for developers.