
Meeting Note #7 11.04.2019

ugurcanarikan edited this page Apr 29, 2019

Location: Bogazici University Computer Engineering Building

Date/Time: 11.04.2019 / 12:00

Attendees:

  • Suzan Üsküdarlı
  • Onur Güngör
  • Uğurcan Arıkan

1. Preparation Before Meeting

  • 1.1. Deciding the batch size to be used in BERT's pretraining for a sequence length of 128

2. Agenda

  • 2.1. New roadmap
  • 2.2. BERT's pretraining
  • 2.3. BERT's fine-tuning process

3. Discussion

  • 3.1. The new roadmap has been discussed
  • 3.2. BERT's fine-tuning process has been discussed
  • 3.3. BERT's pretraining parameters have been discussed

4. Outcomes

  • 4.1. BERT pretraining
    • 4.1.1. BERT will be pretrained with a batch size of 56 for 26.5 million train steps in order to train for 10 epochs
  • 4.2. BERT fine-tuning
    • 4.2.1. BERT's fine-tuning will be done using flair
  • 4.3. New Roadmap
    • 4.3.1. BERT's pretraining will be completed for at least 2.6 million train steps before the beginning of the fine-tuning process
    • 4.3.2. Fine-tuning will start in 2 weeks
    • 4.3.3. Deployment of the project will be done after 3 weeks
    • 4.3.4. Preparations for the project's article will start
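As a sanity check on the pretraining figures in 4.1.1, the batch size, step count, and epoch count together imply a corpus size. This is a small arithmetic sketch using only the numbers decided above; the implied corpus size is derived, not a figure stated in the meeting:

```python
# Figures decided in outcome 4.1.1:
batch_size = 56              # sequences per train step
total_train_steps = 26_500_000
epochs = 10

# Total sequences processed over the whole pretraining run:
sequences_seen = batch_size * total_train_steps

# Implied corpus size: sequences processed per epoch
corpus_sequences = sequences_seen // epochs
print(corpus_sequences)  # 148,400,000 sequences per epoch
```

So the chosen parameters correspond to a pretraining corpus of roughly 148 million sequences (of length 128) per epoch.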

5. TO-DO list

Deadline: 18.04 12:00 Assignee: Uğurcan Arıkan

  • 5.1. Start pretraining BERT with the decided parameters

Deadline: 18.04 12:00 Assignee: Uğurcan Arıkan

  • 5.2. Start working with flair
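For item 5.2, a minimal fine-tuning sketch using the flair library (v0.4-era API, which was current at the time of this meeting) might look like the following. The data paths, column layout, embedding model name, and training hyperparameters are placeholder assumptions, not decisions recorded in this meeting:

```python
from flair.data import Corpus
from flair.datasets import ColumnCorpus
from flair.embeddings import BertEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# Hypothetical CoNLL-style corpus: token in column 0, NER tag in column 1
columns = {0: "text", 1: "ner"}
corpus: Corpus = ColumnCorpus(
    "data/",                  # placeholder data folder
    columns,
    train_file="train.txt",
    dev_file="dev.txt",
    test_file="test.txt",
)

# Embeddings from a BERT checkpoint; the pretrained model from 4.1
# could be pointed to here once pretraining finishes (placeholder name)
embeddings = BertEmbeddings("bert-base-multilingual-cased")

tag_type = "ner"
tag_dictionary = corpus.make_tag_dictionary(tag_type=tag_type)

tagger = SequenceTagger(
    hidden_size=256,
    embeddings=embeddings,
    tag_dictionary=tag_dictionary,
    tag_type=tag_type,
)

# Train the sequence tagger on the corpus (hyperparameters are examples)
trainer = ModelTrainer(tagger, corpus)
trainer.train("models/ner-bert", learning_rate=0.1, mini_batch_size=32, max_epochs=10)
```

This only illustrates the overall flair workflow (corpus loading, embedding choice, tagger, trainer); the actual task, dataset, and hyperparameters were left to the fine-tuning work starting in two weeks.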