My research process is fueled by a constant stream of ideas 😊. Naturally, many are rough drafts, far from ready for publication. Some turn out to be things others have already done; some I talk myself out of; and others get shot down by my students. (Ironically, we sometimes see those "students-do-not-like" ideas published at top conferences years later by other groups!)
That's why I've decided to start sharing most of these early-stage thoughts more openly. Perhaps a raw idea that didn't make the cut for me will spark inspiration for you and grow into something amazing.
Idea (1): Employing Reversible Dirichlet Processes for Diffusion-Model-Like Tasks
Idea (2): Combining Markov Chain Monte Carlo and Generative Modeling
Idea (3): Reducing Steps in Diffusion Models
Idea (4): Optimizing Attention Heads Using Determinantal Point Processes
Idea (6): Correlated Sampling from Multiple Softmax Distributions Using Gumbel-Max Trick and Copulas
Idea (7): "Elastic" Attention Mechanism and Sinkhorn Computation
Idea (8): Particle Transformer
Idea (9): NNGP, Transformer and Dirichlet Process Combination
Idea (10): Combining Dirichlet Processes (plus Gaussian Processes) with Flow Matching
Idea (11): Swin Transformer with Semantic Boundary Adherence using Swendsen-Wang Sampling
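To give a flavor of the kind of raw sketches behind these titles, here is a minimal illustration of the coupling mechanism behind Idea (6). The Gumbel-max trick says that `argmax(logits + Gumbel noise)` is an exact sample from the softmax of the logits; sharing the same Gumbel noise across several softmax distributions couples their samples. This sketch only shows the shared-noise special case (a comonotone coupling); the actual idea would replace the shared noise with a copula to control the correlation structure. All names here are my own illustration, not from any existing library.

```python
import numpy as np

rng = np.random.default_rng(0)

def gumbel_max_sample(logits, gumbels):
    # Gumbel-max trick: argmax of (logits + Gumbel noise) is an exact
    # sample from softmax(logits).
    return int(np.argmax(logits + gumbels))

logits_a = np.array([1.0, 0.5, 0.2])
logits_b = np.array([0.9, 0.6, 0.1])

# Sharing one Gumbel noise vector across both distributions couples the
# two samples (maximal positive dependence); a copula on the noise would
# generalize this to arbitrary correlation structures.
shared_noise = rng.gumbel(size=3)
sample_a = gumbel_max_sample(logits_a, shared_noise)
sample_b = gumbel_max_sample(logits_b, shared_noise)
```

Because the marginals stay exactly softmax while only the joint dependence changes, this kind of coupling is attractive for variance reduction or consistent discrete choices across related heads.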