I'm trying to use Hugging Face's pretrained transformers model bert-base-uncased, but I want to increase the dropout. The from_pretrained method's documentation doesn't mention these parameters, yet Colab ran the object instantiation below without any problem. I found the dropout parameters in the transformers.BertConfig documentation.
Am I using bert-base-uncased AND changing the dropout in the correct way?
model = BertForSequenceClassification.from_pretrained(
    pretrained_model_name_or_path='bert-base-uncased',
    num_labels=2,
    output_attentions=False,
    output_hidden_states=False,
    attention_probs_dropout_prob=0.5,
    hidden_dropout_prob=0.5,
)
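For reference, here's a minimal sketch (assuming the transformers library is installed; no model download needed) showing that these dropout values are ordinary BertConfig fields, which is what made me expect the extra keyword arguments above to be forwarded to the model's config:

```python
from transformers import BertConfig

# Build a config directly with the same overrides I passed to from_pretrained.
# attention_probs_dropout_prob and hidden_dropout_prob are documented
# BertConfig parameters (default 0.1); num_labels is a generic config kwarg.
config = BertConfig(
    num_labels=2,
    attention_probs_dropout_prob=0.5,
    hidden_dropout_prob=0.5,
)

print(config.attention_probs_dropout_prob)  # 0.5
print(config.hidden_dropout_prob)           # 0.5
```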