Research question
Is the last hidden layer of BERT best suited for contextualized text embeddings?
Hypothesis
The last hidden layer is where the contextual structure is best defined, since it builds on the relations captured in the 11 preceding layers.
Method
Instantiate a pretrained ClinicalBERT model (see the sketch after these steps).
Gather a dataset of medical terms with class labels, e.g. brain locations grouped by the occurrence of tumours in those regions.
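A minimal sketch of the first step, assuming the Hugging Face transformers library and the emilyalsentzer/Bio_ClinicalBERT checkpoint; the checkpoint choice and the mean pooling over tokens are illustrative assumptions, not part of this issue:

```python
# Sketch: load ClinicalBERT and extract one embedding per layer.
# The checkpoint and pooling strategy below are assumptions.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("emilyalsentzer/Bio_ClinicalBERT")
model = AutoModel.from_pretrained(
    "emilyalsentzer/Bio_ClinicalBERT", output_hidden_states=True
)
model.eval()

# Placeholder example sentence; the real dataset would supply the terms.
text = "Glioblastoma was identified in the left temporal lobe."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# outputs.hidden_states is a tuple of 13 tensors: the embedding-layer
# output plus one tensor per transformer layer. Mean-pool over tokens
# to get a single vector per layer for comparison across layers.
layer_embeddings = [h.mean(dim=1).squeeze(0) for h in outputs.hidden_states]
print(len(layer_embeddings), layer_embeddings[-1].shape)  # 13, torch.Size([768])
```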
Why is this experiment worthwhile?
Papers report different accuracies depending on which embedding strategy is used with pretrained models (ref!).
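One way the per-layer comparison could be run, not part of the issue's stated method: fit a linear probe on each layer's embeddings and compare cross-validated accuracy. The logistic-regression probe and the random placeholder arrays below are assumptions; in the real experiment, X_layers would hold the pooled ClinicalBERT embeddings of each term at each layer, and y the tumour-occurrence class of each brain location.

```python
# Sketch: linear probe per layer; all data here is a placeholder.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Placeholder stand-ins for the real dataset described above.
n_terms, hidden_size, n_layers = 120, 768, 13
X_layers = [rng.normal(size=(n_terms, hidden_size)) for _ in range(n_layers)]
y = rng.integers(0, 3, size=n_terms)

def layer_accuracy(X, y, cv=5):
    """Mean cross-validated accuracy of a linear probe on one layer."""
    probe = LogisticRegression(max_iter=1000)
    return cross_val_score(probe, X, y, cv=cv).mean()

scores = [layer_accuracy(X, y) for X in X_layers]
print("Best layer:", int(np.argmax(scores)))
```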