Untitled

a guest
Mar 19th, 2019
// Load the toxicity classifier from the @tensorflow-models/toxicity package.
const toxicity = require('@tensorflow-models/toxicity');

// The minimum prediction confidence.
const threshold = 0.9;

// Which toxicity labels to return.
const labelsToInclude = ['identity_attack', 'insult', 'threat'];

toxicity.load(threshold, labelsToInclude).then(model => {
  // Now you can use the `model` object to label sentences.
  model.classify(['you suck']).then(predictions => { /* handle predictions */ });
});
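For context, `classify()` resolves to an array with one entry per requested label, each of the shape `{ label, results: [{ probabilities, match }] }`, where `match` is `true` when the toxic probability exceeds the threshold. The helper below (`toxicLabels` is a hypothetical name, not part of the library) sketches how you might pull out the flagged labels; the mocked array only illustrates the documented shape and is not real model output.

```javascript
// Hypothetical helper: return the labels the model flagged as toxic.
// Expects the documented predictions shape from model.classify():
//   [{ label, results: [{ probabilities, match }] }, ...]
function toxicLabels(predictions) {
  return predictions
    .filter(p => p.results.some(r => r.match === true))
    .map(p => p.label);
}

// Mocked predictions illustrating the shape (not real model output).
// probabilities is [P(not toxic), P(toxic)].
const mock = [
  { label: 'identity_attack', results: [{ probabilities: [0.97, 0.03], match: false }] },
  { label: 'insult',          results: [{ probabilities: [0.04, 0.96], match: true  }] },
  { label: 'threat',          results: [{ probabilities: [0.92, 0.08], match: false }] },
];

console.log(toxicLabels(mock)); // → ['insult']
```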