
Gemini_Connect

This is the official repository for my submission to the Kaggle Gemini Long Context Challenge.
Submission Notebook: https://www.kaggle.com/code/ariondas/iiit-ranchi-gemini-long-context-challenge

I showcase a use case for the long context that Gemini's new models (as of 2 December 2024) provide: I use that context window to surface papers related to a particular topic. Hopefully, it helps fellow researchers.
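At a high level, the idea amounts to packing a large set of candidate papers into a single prompt and letting the model do the relevance filtering. Below is a minimal sketch of that pattern using the google-generativeai Python SDK; the paper entries, prompt wording, and model choice are illustrative assumptions, not the exact notebook code.

```python
# Sketch: put many paper titles/abstracts into one long prompt and ask a
# Gemini 1.5 model to pick the ones most related to a query topic.
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # or read the key from an environment variable
model = genai.GenerativeModel("gemini-1.5-pro")

# Hypothetical corpus: one string per candidate paper (title + abstract).
papers = [
    "Title: <paper 1 title>\nAbstract: <paper 1 abstract>",
    "Title: <paper 2 title>\nAbstract: <paper 2 abstract>",
    # ... potentially thousands more, thanks to the long context window
]

topic = "long-context language models for literature review"
prompt = (
    "You are given a list of papers. List the ones most related to the topic "
    f"'{topic}', with a one-line reason for each.\n\n" + "\n\n".join(papers)
)

print(model.count_tokens(prompt))   # check the prompt fits within the context limit
response = model.generate_content(prompt)
print(response.text)
```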


Here are a few plots highlighting the models' performance:

- Related papers plot
- Maximum token limit across models
- Time taken to generate responses

REPORT

The Gemini-1.5 model variants claim context windows upwards of 1 million tokens (Gemini-1.5-pro claims 2 million). But how do they fare in practice? I have prepared a report on my observations; all the details of my experiments are included in it:

Report
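For context, the kind of measurement behind these observations can be sketched as follows: grow the input, count its tokens, and time the response. This is only an illustrative assumption about the methodology (the filler text, model name, and prompt sizes are placeholders); see the report and the notebook for the actual setup.

```python
# Sketch: measure response time as the input prompt grows toward the
# advertised context limit. Not the exact experimental code.
import time
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")
model = genai.GenerativeModel("gemini-1.5-flash")

# Filler text used only to inflate the prompt; roughly 10k tokens per chunk.
chunk = "The quick brown fox jumps over the lazy dog. " * 1000

for n_chunks in (1, 10, 50, 100):
    prompt = chunk * n_chunks + "\nIn one sentence, what is this text about?"
    n_tokens = model.count_tokens(prompt).total_tokens
    start = time.time()
    response = model.generate_content(prompt)
    elapsed = time.time() - start
    print(f"{n_tokens:>9} input tokens -> {elapsed:.1f} s")
```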


I have also summarized the entire work in this video:

Is.Gemini.s.2M.context.length.a.myth_.Presenting.Gemini_Connect.for.Researchers.mp4