Liquid: LFM 40B MoE

liquid/lfm-40b

About Liquid: LFM 40B MoE

Liquid's 40.3B-parameter Mixture of Experts (MoE) model. Liquid Foundation Models (LFMs) are large neural networks built with computational units rooted in dynamical systems.

LFMs are general-purpose AI models that can be used to model any kind of sequential data, including video, audio, text, time series, and signals.

See the launch announcement for benchmarks and more info.
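As a concrete illustration, here is a minimal sketch of calling this model through an OpenAI-compatible chat-completions endpoint. The base URL and the OPENROUTER_API_KEY environment variable are assumptions (an OpenRouter-style gateway), not part of this listing:

```python
# Minimal sketch: calling liquid/lfm-40b through an OpenAI-compatible
# chat-completions endpoint. The base URL and the API-key environment
# variable are assumptions, not confirmed by this listing.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",   # assumed gateway endpoint
    api_key=os.environ["OPENROUTER_API_KEY"],  # assumed env var name
)

response = client.chat.completions.create(
    model="liquid/lfm-40b",
    messages=[{"role": "user", "content": "Summarize what an LFM is."}],
)
print(response.choices[0].message.content)
```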

Specifications

Context Length: 32,768 tokens

Tokenizer: Other

Pricing

Prompt: $0.150 / 1M tokens

Completion: $0.150 / 1M tokens

Image: $0

Request: $0
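
Assuming the prompt and completion prices above are quoted in USD per 1M tokens (the usual convention for listings like this; the original gives bare numbers), a rough per-request cost estimate works out as follows:

```python
# Rough cost estimate, assuming the listed prices are USD per 1M tokens.
PROMPT_PRICE_PER_M = 0.150      # USD per 1M prompt tokens (assumed unit)
COMPLETION_PRICE_PER_M = 0.150  # USD per 1M completion tokens (assumed unit)

def estimate_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Return the estimated cost in USD for a single request."""
    return (prompt_tokens * PROMPT_PRICE_PER_M
            + completion_tokens * COMPLETION_PRICE_PER_M) / 1_000_000

# Example: a 2,000-token prompt with a 500-token completion.
print(f"${estimate_cost(2_000, 500):.6f}")  # -> $0.000375
```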

Last updated: 4/11/2025