Mamba-7B
This is a 7B-parameter model with the Mamba architecture, trained on multiple epochs (1.2T tokens) of the RefinedWeb dataset. Mamba is a state-space model that, unlike the standard transformer architecture, does not use self-attention. A minimal sketch of the state-space recurrence follows below.
Jun 28, 2024
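
As a rough illustration of what replaces attention in a model like this, here is a minimal sketch of the linear state-space recurrence h_t = A*h_{t-1} + B*x_t, y_t = C*h_t that underlies SSMs. The diagonal A, scalar input/output, and all numeric values are toy assumptions for brevity; Mamba's input-dependent (selective) parameters and learned discretization are omitted.

    // Minimal sketch of the linear state-space recurrence behind SSMs.
    // Diagonal A and scalar input/output are simplifying assumptions;
    // parameter values are toys, not anything from the trained model.
    #include <cstdio>
    #include <vector>

    int main() {
        const int state_dim = 4;
        std::vector<double> A = {0.9, 0.8, 0.7, 0.6};     // diagonal state decay
        std::vector<double> B = {1.0, 0.5, 0.25, 0.125};  // input projection
        std::vector<double> C = {1.0, -1.0, 0.5, -0.5};   // output projection
        std::vector<double> x = {1.0, 0.0, 2.0, -1.0};    // toy input sequence

        std::vector<double> h(state_dim, 0.0);            // hidden state
        for (double xt : x) {
            double yt = 0.0;
            for (int i = 0; i < state_dim; ++i) {
                h[i] = A[i] * h[i] + B[i] * xt;           // recurrent state update
                yt += C[i] * h[i];                        // readout
            }
            std::printf("y_t = %f\n", yt);
        }
        return 0;
    }

Because the update is a recurrence rather than an all-pairs attention map, inference cost per token is constant in sequence length, which is the main architectural selling point.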
A C++ implementation of a lattice Boltzmann D2Q9 fluid simulation with immersed boundaries. A sketch of the core collision step follows below.
Oct 25, 2017
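
For context, here is a minimal sketch of a single D2Q9 BGK collision step at one lattice site, the core update of a lattice Boltzmann solver. The weights, lattice velocities, and equilibrium distribution follow the standard D2Q9 formulation; the relaxation time tau and the initial populations are assumed toy values, and the streaming step and the project's immersed-boundary coupling are omitted.

    // Minimal sketch of one D2Q9 BGK collision step at a single lattice site.
    // tau and the starting populations are toy assumptions; streaming and
    // immersed boundaries are left out.
    #include <array>
    #include <cstdio>

    int main() {
        // D2Q9 lattice: rest, 4 axis-aligned, 4 diagonal directions.
        constexpr std::array<double, 9> w = {4.0/9,
            1.0/9, 1.0/9, 1.0/9, 1.0/9,
            1.0/36, 1.0/36, 1.0/36, 1.0/36};
        constexpr std::array<int, 9> ex = {0, 1, 0, -1, 0, 1, -1, -1, 1};
        constexpr std::array<int, 9> ey = {0, 0, 1, 0, -1, 1, 1, -1, -1};
        const double tau = 0.8;                 // assumed relaxation time

        // Start from equilibrium at rest (rho = 1, u = 0), then perturb.
        std::array<double, 9> f;
        for (int i = 0; i < 9; ++i) f[i] = w[i];
        f[1] += 0.01;                           // so the collision does something

        // Macroscopic moments: density and velocity.
        double rho = 0.0, ux = 0.0, uy = 0.0;
        for (int i = 0; i < 9; ++i) {
            rho += f[i];
            ux  += ex[i] * f[i];
            uy  += ey[i] * f[i];
        }
        ux /= rho; uy /= rho;

        // BGK relaxation toward the second-order equilibrium distribution.
        double usq = ux * ux + uy * uy;
        for (int i = 0; i < 9; ++i) {
            double eu  = ex[i] * ux + ey[i] * uy;
            double feq = w[i] * rho * (1.0 + 3.0 * eu + 4.5 * eu * eu - 1.5 * usq);
            f[i] -= (f[i] - feq) / tau;
        }

        for (int i = 0; i < 9; ++i) std::printf("f[%d] = %f\n", i, f[i]);
        return 0;
    }

In a full solver this collision would be applied at every site, followed by streaming the populations along their lattice directions; the immersed-boundary method then adds a forcing term to enforce no-slip conditions on bodies that do not align with the grid.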
WebGL Demo