Mamba-7B: a 7B-parameter model with the Mamba architecture, trained on multiple epochs (1.2T tokens) of the RefinedWeb dataset. Mamba is a state-space model that, unlike the standard transformer architecture, does not use self-attention.
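To make the contrast with attention concrete, here is a minimal sketch of the linear state-space recurrence that such layers use in place of self-attention. This is not the model's code: it assumes a diagonal, time-invariant SSM with per-channel parameters a, b, c (real Mamba makes these input-dependent, "selective", and computes the recurrence as a parallel scan). The point is that each token costs O(D) work against a fixed-size state, with no lookback over the sequence.

```cpp
#include <cstddef>
#include <vector>

// Sketch of a diagonal state-space recurrence (elementwise per channel d):
//   h_t[d] = a[d] * h_{t-1}[d] + b[d] * x_t[d]
//   y_t[d] = c[d] * h_t[d]
// a, b, c are hypothetical fixed parameters here; Mamba derives them
// from the input at each step ("selection") and fuses this into a scan.
std::vector<std::vector<double>> ssm_scan(
    const std::vector<std::vector<double>>& x,  // x[t][d]: T steps of D-dim input
    const std::vector<double>& a,               // per-channel decay, |a[d]| < 1
    const std::vector<double>& b,               // input gain
    const std::vector<double>& c) {             // readout gain
    const std::size_t D = a.size();
    std::vector<double> h(D, 0.0);  // fixed-size hidden state carried across time
    std::vector<std::vector<double>> y(x.size(), std::vector<double>(D, 0.0));
    for (std::size_t t = 0; t < x.size(); ++t) {
        for (std::size_t d = 0; d < D; ++d) {
            h[d] = a[d] * h[d] + b[d] * x[t][d];  // linear recurrence, O(1) state per channel
            y[t][d] = c[d] * h[d];                // readout
        }
    }
    return y;
}
```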
A C++ implementation of a lattice Boltzmann (D2Q9) fluid simulation with immersed boundaries; a minimal sketch of the core update loop follows below.
WebGL Demo
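As a rough illustration of what a D2Q9 step looks like, here is a minimal BGK collide-and-stream sketch on a periodic grid. It is not the repository's code: the function names, the flat array layout, and the periodic boundaries are assumptions, and the immersed-boundary forcing step (interpolating velocity to Lagrangian markers and spreading a reaction force back to the lattice) is omitted for brevity.

```cpp
#include <cstddef>
#include <vector>

// D2Q9 lattice: 9 discrete velocities with the standard weights.
constexpr int Q = 9;
constexpr int cx[Q] = {0, 1, 0, -1, 0, 1, -1, -1, 1};
constexpr int cy[Q] = {0, 0, 1, 0, -1, 1, 1, -1, -1};
constexpr double w[Q] = {4.0 / 9,
                         1.0 / 9,  1.0 / 9,  1.0 / 9,  1.0 / 9,
                         1.0 / 36, 1.0 / 36, 1.0 / 36, 1.0 / 36};

// Equilibrium distribution for direction i, given density rho and velocity (ux, uy).
double feq(int i, double rho, double ux, double uy) {
    double cu = 3.0 * (cx[i] * ux + cy[i] * uy);
    double usq = 1.5 * (ux * ux + uy * uy);
    return w[i] * rho * (1.0 + cu + 0.5 * cu * cu - usq);
}

// One BGK collide-and-stream step on a periodic nx*ny grid.
// f and ftmp are flat arrays of size nx*ny*Q, indexed as (x + nx*y)*Q + i.
void step(std::vector<double>& f, std::vector<double>& ftmp,
          int nx, int ny, double tau) {
    for (int y = 0; y < ny; ++y) {
        for (int x = 0; x < nx; ++x) {
            const std::size_t base = (std::size_t(x) + std::size_t(nx) * y) * Q;
            // Macroscopic moments: density and momentum.
            double rho = 0.0, ux = 0.0, uy = 0.0;
            for (int i = 0; i < Q; ++i) {
                rho += f[base + i];
                ux  += cx[i] * f[base + i];
                uy  += cy[i] * f[base + i];
            }
            ux /= rho;
            uy /= rho;
            // Collide (BGK relaxation toward equilibrium with time tau),
            // then stream each population to its periodic neighbor.
            for (int i = 0; i < Q; ++i) {
                double post = f[base + i] - (f[base + i] - feq(i, rho, ux, uy)) / tau;
                int xn = (x + cx[i] + nx) % nx;
                int yn = (y + cy[i] + ny) % ny;
                ftmp[(std::size_t(xn) + std::size_t(nx) * yn) * Q + i] = post;
            }
        }
    }
    f.swap(ftmp);  // double-buffer: streamed populations become the new state
}
```

In an immersed-boundary variant, a body force computed at the boundary markers would be added to the collision step before streaming; the sketch above covers only the plain fluid update.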