MosaicML, a team at Databricks

Created: by Pradeep Gowda Updated: Apr 02, 2024 Tagged: llm

MosaicML was a startup acquired by Databricks on July 6, 2023 for $1.3B. In March 2024, the team was behind the announcement of Databricks's new foundation model, DBRX.

Some libraries by MosaicML are:

  • LLM Foundry: code for training, finetuning, evaluating, and deploying LLMs for inference with Composer and the MosaicML platform.

  • Composer is an open-source deep learning training library. Built on top of PyTorch, it makes it easier to implement distributed training workflows on large-scale clusters. Use Composer to speed up experimentation when training neural networks of any size, including:

    • Large language models (LLMs)
    • Diffusion models
    • Embedding models (e.g., BERT)
    • Transformer-based models
    • Convolutional neural networks (CNNs)
  • Streaming, a data streaming library for efficient neural network training from cloud storage

  • and more
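To ground what Composer abstracts away: below is the kind of plain PyTorch training loop (toy model and synthetic data, all names illustrative) whose boilerplate Composer's `Trainer` replaces with a single `fit()` call, while also handling distributed data parallelism, mixed precision, and other speedups.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Toy regression task with synthetic data (illustrative only)
torch.manual_seed(0)
X = torch.randn(256, 8)
y = X @ torch.randn(8, 1) + 0.1 * torch.randn(256, 1)
loader = DataLoader(TensorDataset(X, y), batch_size=32, shuffle=True)

model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

initial_loss = loss_fn(model(X), y).item()  # loss before any training

# The boilerplate a training library wraps for you: the epoch loop,
# forward pass, backward pass, and optimizer step.
for epoch in range(5):
    for xb, yb in loader:
        opt.zero_grad()
        loss = loss_fn(model(xb), yb)
        loss.backward()
        opt.step()

final_loss = loss_fn(model(X), y).item()  # loss after training
```

With Composer, this loop becomes roughly `Trainer(model=..., train_dataloader=loader, max_duration='5ep').fit()`, where the model is wrapped as a `ComposerModel`; consult the Composer documentation for the exact API.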

Hugging Face: mosaicml (Mosaic ML, Inc.)