Adapters v1.2.0

Blog post: https://adapterhub.ml/blog/2025/05/adapters-for-any-transformer

This version is built for Hugging Face Transformers v4.51.x.

New

Adapter Model Plugin Interface (@calpt via [#738]; @lenglaender via [#797])

The new adapter model interface makes it easy to plug most adapter features into any new or custom Transformer model. See our release blog post for details, and https://docs.adapterhub.ml/plugin_interface.html for the full documentation.
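
For illustration, here is a minimal sketch of wrapping a decoder-style model with the interface, loosely following the layout in the linked documentation. The attribute values (embed_tokens, layers, self_attn, mlp.up_proj, etc.) are assumptions about the wrapped model's submodule names and must be adjusted to the actual architecture:

```python
import adapters
from adapters import AdapterModelInterface
from transformers import AutoModelForCausalLM

# Map adapter hook points onto the submodule names of the target model.
# These names are assumptions for a Llama/Qwen-style decoder; adjust them
# to match the modules of your own architecture.
plugin_interface = AdapterModelInterface(
    adapter_methods=["lora", "reft"],
    model_embeddings="embed_tokens",
    model_layers="layers",
    layer_self_attn="self_attn",
    layer_cross_attn=None,
    attn_k_proj="k_proj",
    attn_q_proj="q_proj",
    attn_v_proj="v_proj",
    attn_o_proj="o_proj",
    layer_intermediate_proj="mlp.up_proj",
    layer_output_proj="mlp.down_proj",
)

model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen2.5-0.5B")
adapters.init(model, interface=plugin_interface)

# Once initialized, the usual adapter workflow applies.
model.add_adapter("my_lora", config="lora")
model.set_active_adapters("my_lora")
```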

Multi-Task composition with MTL-LoRA (@FrLdy via [#792])

MTL-LoRA (Yang et al., 2024) is a new adapter composition method leveraging LoRA for multi-task learning. See https://docs.adapterhub.ml/multi_task_methods.html#mtl-lora.
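
As a rough sketch (the config class name MTLLoRAConfig is taken from the linked docs and, like its default parameters, is an assumption here), per-task MTL-LoRA modules are added like any other adapter config:

```python
from adapters import AutoAdapterModel, MTLLoRAConfig  # class name assumed per the linked docs

model = AutoAdapterModel.from_pretrained("roberta-base")

# One MTL-LoRA module per task: the method shares low-rank projections across
# tasks while keeping task-specific weightings (Yang et al., 2024).
config = MTLLoRAConfig()
for task in ["task_a", "task_b", "task_c"]:
    model.add_adapter(task, config=config)
```

The per-task modules are then activated jointly via the multi-task composition described in the linked documentation and trained on batches carrying task labels.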

VeRA - parameter-efficient LoRA variant (@julian-fong via [#763])

VeRA (Kopiczko et al., 2024) is a LoRA adapter variant that requires even fewer trainable parameters. See https://docs.adapterhub.ml/methods.html#vera.
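
A rough sketch of adding a VeRA adapter (assuming the config class is exposed as VeraConfig, analogous to LoRAConfig; check the linked docs for the exact name and parameters):

```python
from adapters import AutoAdapterModel, VeraConfig  # class name assumed per the linked docs

model = AutoAdapterModel.from_pretrained("roberta-base")

# VeRA freezes shared random low-rank projections and trains only small
# scaling vectors, so it adds far fewer trainable parameters than plain LoRA.
model.add_adapter("vera_adapter", config=VeraConfig())
model.train_adapter("vera_adapter")  # freeze the base model, train only the adapter
```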

New Models (via new interface)

Several new models are supported out-of-the-box via the new adapter model plugin interface:

  • Gemma 2, Gemma 3
  • ModernBERT
  • Phi 1, Phi 2
  • Qwen 2, Qwen 2.5, Qwen 3

More

  • New init_weights_seed adapter config attribute to initialize adapters with identical weights (@TimoImhof via [#786]); see the sketch after this list
  • Support defining custom forward method args via ForwardContext (@calpt via [#789])
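
A short sketch of the seeded initialization (assuming the attribute is accepted by the standard config classes such as LoRAConfig, as the note above suggests):

```python
from adapters import AutoAdapterModel, LoRAConfig

model = AutoAdapterModel.from_pretrained("roberta-base")

# With a fixed init_weights_seed, independently added adapters start from
# identical weights, which helps reproducibility and run-to-run comparisons.
config = LoRAConfig(init_weights_seed=42)
model.add_adapter("run_a", config=config)
model.add_adapter("run_b", config=config)  # same initialization as "run_a"
```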

Changed

  • Upgrade supported Transformers version (@calpt via [#799]; @TimoImhof via [#805])