Just published this recap of a paper reading we did last week with Frank Liu, ML Architect at Zilliz, on "Extending the Context Window of LLaMA Models."
The paper examines Position Interpolation, a method that extends the context window of LLaMA models to up to 32,768 positions with minimal fine-tuning. 🦙!
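For a quick sense of the idea: rather than asking RoPE to extrapolate to position indices it never saw in training, Position Interpolation scales the indices down so a longer sequence still maps into the original trained range. The sketch below is an illustrative toy in NumPy, not the paper's reference code; the function names, shapes, and the 2,048 to 32,768 window sizes are assumptions for the example.

```python
# Minimal sketch of Position Interpolation on top of rotary embeddings (RoPE).
# Instead of feeding RoPE positions beyond the trained window, we compress
# positions by (original_len / extended_len) so they stay in-range.
import numpy as np

def rope_angles(positions, dim, base=10000.0):
    """Rotary angles for (possibly fractional) positions; shape (seq, dim/2)."""
    inv_freq = 1.0 / (base ** (np.arange(0, dim, 2) / dim))
    return np.outer(positions, inv_freq)

def apply_rope(x, angles):
    """Rotate channel pairs of x by the given angles."""
    x1, x2 = x[..., 0::2], x[..., 1::2]
    cos, sin = np.cos(angles), np.sin(angles)
    out = np.empty_like(x)
    out[..., 0::2] = x1 * cos - x2 * sin
    out[..., 1::2] = x1 * sin + x2 * cos
    return out

original_len, extended_len, dim = 2048, 32768, 128  # assumed sizes for illustration
seq_len = 8192                                      # longer than the original window
x = np.random.randn(seq_len, dim)

# Position Interpolation: rescale indices into the trained [0, original_len) range.
scale = original_len / extended_len
interpolated_positions = np.arange(seq_len) * scale
x_rope = apply_rope(x, rope_angles(interpolated_positions, dim))
```

Because the interpolated positions stay inside the range the model was pretrained on, only a small amount of fine-tuning is needed to adapt to the denser spacing.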
Watch/read/learn here:
https://arize.com/blog/extending-the-context-window-of-llama-models-paper-reading/