I know almost nothing about this stuff, but what I know about Apple adapters I learned from this page:

https://machinelearning.apple.com/research/introducing-apple...

> Our foundation models are fine-tuned for users’ everyday activities, and can dynamically specialize themselves on-the-fly for the task at hand. We utilize adapters, small neural network modules that can be plugged into various layers of the pre-trained model, to fine-tune our models for specific tasks. For our models we adapt the attention matrices, the attention projection matrix, and the fully connected layers in the point-wise feedforward networks for a suitable set of the decoding layers of the transformer architecture.
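Concretely, that description matches the LoRA pattern (Hu et al., 2021): a small trainable low-rank update added in parallel with a frozen pre-trained weight matrix. A minimal PyTorch sketch of the idea, not Apple's actual code; the rank, scaling, and layer attribute names below are illustrative assumptions:

    import torch
    import torch.nn as nn

    class LoRALinear(nn.Module):
        """Frozen linear layer plus a trainable low-rank update
        (generic LoRA sketch, not Apple's implementation)."""
        def __init__(self, base: nn.Linear, rank: int = 16, alpha: float = 32.0):
            super().__init__()
            self.base = base
            for p in self.base.parameters():
                p.requires_grad = False  # pre-trained weights stay frozen
            # Effective weight: W + (alpha / rank) * B @ A.
            # B starts at zero, so the adapter is a no-op before fine-tuning.
            self.A = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
            self.B = nn.Parameter(torch.zeros(base.out_features, rank))
            self.scale = alpha / rank

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)

    # Plugging adapters into the spots the quote names: attention
    # projections and feed-forward layers of selected decoder layers.
    # The attribute names here are hypothetical:
    # layer.self_attn.q_proj = LoRALinear(layer.self_attn.q_proj)
    # layer.self_attn.o_proj = LoRALinear(layer.self_attn.o_proj)
    # layer.mlp.up_proj      = LoRALinear(layer.mlp.up_proj)

Because only A and B are trained, each task-specific adapter is tiny relative to the base model, which is what lets the model "dynamically specialize" by swapping adapters at runtime.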

Sounds like regular LoRA adapters.