Untitled — pasted by a guest, May 19th, 2019
# use this for BERT-base
# Assumes MODEL_DIR, VOC_SIZE, VOC_FNAME, and bert_vocab are defined earlier.
import json

bert_base_config = {
  "attention_probs_dropout_prob": 0.1,
  "directionality": "bidi",
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "max_position_embeddings": 512,
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "pooler_fc_size": 768,
  "pooler_num_attention_heads": 12,
  "pooler_num_fc_layers": 3,
  "pooler_size_per_head": 128,
  "pooler_type": "first_token_transform",
  "type_vocab_size": 2,
  "vocab_size": VOC_SIZE
}

# Write the model configuration to bert_config.json.
with open("{}/bert_config.json".format(MODEL_DIR), "w") as fo:
  json.dump(bert_base_config, fo, indent=2)

# Write the vocabulary file, one token per line.
with open("{}/{}".format(MODEL_DIR, VOC_FNAME), "w") as fo:
  for token in bert_vocab:
    fo.write(token + "\n")
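As a sanity check, the two files can be read back to confirm the config round-trips through JSON and the vocabulary keeps one token per line in order. This is a minimal sketch: the directory, vocab filename, and sample tokens below are placeholders standing in for MODEL_DIR, VOC_FNAME, and bert_vocab, not values from the paste.

```python
import json
import os
import tempfile

# Placeholder stand-ins for MODEL_DIR, VOC_FNAME, bert_vocab, and VOC_SIZE.
model_dir = tempfile.mkdtemp()
voc_fname = "vocab.txt"
bert_vocab = ["[PAD]", "[UNK]", "[CLS]", "[SEP]", "[MASK]", "the"]
config = {"hidden_size": 768, "vocab_size": len(bert_vocab)}

# Write the files the same way the snippet above does.
with open(os.path.join(model_dir, "bert_config.json"), "w") as fo:
    json.dump(config, fo, indent=2)
with open(os.path.join(model_dir, voc_fname), "w") as fo:
    for token in bert_vocab:
        fo.write(token + "\n")

# Read both back and verify nothing was lost or reordered.
with open(os.path.join(model_dir, "bert_config.json")) as fi:
    loaded = json.load(fi)
with open(os.path.join(model_dir, voc_fname)) as fi:
    tokens = fi.read().splitlines()

assert loaded == config
assert tokens == bert_vocab
```

If the assertions pass, the on-disk files match what the training code expects: a JSON config whose vocab_size agrees with the vocabulary file's line count.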