Ritvik Rastogi

May 17, 2024

3 stories

LLM Lingua Series

LLMLingua-2: A task-agnostic prompt compression approach that uses data distillation and a Transformer encoder for token classification, aiming to improve generalizability across tasks.
LongLLMLingua: A prompt compression approach for long-context scenarios that combines question-aware compression with document reordering to improve performance.
LLMLingua: A coarse-to-fine prompt compression method with a budget controller, an iterative token-level compression algorithm, and distribution alignment, achieving up to 20x compression with minimal performance loss.
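The token-classification idea behind LLMLingua-2 can be illustrated with a minimal sketch: a classifier assigns each token a "keep" probability, and the prompt is compressed by retaining the highest-scoring tokens up to a target ratio while preserving their original order. The scorer below is a stand-in placeholder, not the trained encoder from the paper.

```python
import math

def compress_prompt(tokens, keep_probs, ratio=0.5):
    """Keep the ceil(ratio * n) highest-scoring tokens, in original order.

    `keep_probs` stands in for the per-token "keep" probabilities that
    LLMLingua-2 obtains from a Transformer encoder; here they are supplied
    directly for illustration.
    """
    n_keep = math.ceil(len(tokens) * ratio)
    # Pick the indices of the top-scoring tokens...
    top = sorted(range(len(tokens)), key=lambda i: keep_probs[i], reverse=True)[:n_keep]
    # ...then re-sort the indices so token order is preserved.
    return [tokens[i] for i in sorted(top)]

tokens = ["Please", "kindly", "summarize", "the", "attached", "report"]
probs = [0.20, 0.10, 0.90, 0.30, 0.60, 0.95]
print(compress_prompt(tokens, probs, ratio=0.5))
# → ['summarize', 'attached', 'report']
```

Because the classifier scores every token independently, this formulation is task-agnostic: no question or instruction is needed at compression time, in contrast to LongLLMLingua's question-aware scheme.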