Meeting Note #7 11.04.2019
ugurcanarikan edited this page Apr 29, 2019
Location: Bogazici University Computer Engineering Building
Date/Time: 11.04.2019 / 12:00
- Suzan Üsküdarlı
- Onur Güngör
- Uğurcan Arıkan
- 1.1. Deciding the batch size to be used in BERT's pretraining for a sequence length of 128
- 2.1. New roadmap
- 2.2. BERT's pretraining
- 2.3. BERT's fine-tuning process
- 3.1. The new roadmap was discussed
- 3.2. BERT's fine-tuning process was discussed
- 3.3. BERT's pretraining parameters were discussed
- 4.1. BERT pretraining
- 4.1.1. BERT will be pretrained with batch size 56 for 26.5 million training steps in order to train for 10 epochs
- 4.2. BERT fine-tuning
- 4.2.1. BERT's fine-tuning will be done using flair
- 4.3. New Roadmap
- 4.3.1. BERT's pretraining will run for at least 2.6 million training steps before fine-tuning begins
- 4.3.2. Fine-tuning will start in 2 weeks
- 4.3.3. Deployment of the project will be done after 3 weeks
- 4.3.4. Preparations for the project's article will start
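As a rough sanity check on the parameters decided in 4.1.1, the relation between training steps, batch size, and epochs can be sketched as below. The corpus size is derived from the decided numbers, not a figure stated in the meeting:

```python
# Sketch: relate BERT pretraining steps, batch size, and epochs.
# steps = epochs * examples_per_epoch / batch_size
batch_size = 56               # decided in 4.1.1
num_train_steps = 26_500_000  # decided in 4.1.1
epochs = 10                   # target from 4.1.1

# Corpus size (training sequences per epoch) implied by these settings:
examples_per_epoch = num_train_steps * batch_size // epochs
print(examples_per_epoch)  # 148,400,000 sequences per epoch
```

Conversely, if the actual corpus is smaller than this implied size, the same step count yields proportionally more epochs.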
- 5.1. Start pretraining BERT with the decided parameters
  - Deadline: 18.04 12:00, Assignee: Uğurcan Arıkan
- 5.2. Start working with flair
  - Deadline: 18.04 12:00, Assignee: Uğurcan Arıkan
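For action item 5.2, a flair fine-tuning run (per decision 4.2.1) could look roughly like the sketch below. The corpus paths, column format, tag type, and the checkpoint path handed to `BertEmbeddings` are all placeholders, and pointing flair at the custom-pretrained BERT this way is an assumption about the setup, not a step agreed in the meeting:

```python
# Sketch only: fine-tune a pretrained BERT for sequence labeling with flair.
from flair.datasets import ColumnCorpus
from flair.embeddings import BertEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# Placeholder corpus in CoNLL-style column format (paths are assumptions).
columns = {0: "text", 1: "ner"}
corpus = ColumnCorpus("data/", columns,
                      train_file="train.txt",
                      dev_file="dev.txt",
                      test_file="test.txt")

# Placeholder path to the checkpoint produced by the pretraining in 4.1.1.
embeddings = BertEmbeddings("path/to/pretrained-bert")

tag_type = "ner"
tag_dictionary = corpus.make_tag_dictionary(tag_type=tag_type)

tagger = SequenceTagger(hidden_size=256,
                        embeddings=embeddings,
                        tag_dictionary=tag_dictionary,
                        tag_type=tag_type,
                        use_crf=True)

trainer = ModelTrainer(tagger, corpus)
trainer.train("models/ner", learning_rate=0.1,
              mini_batch_size=32, max_epochs=10)
```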