Sample - TensorFlow.js Toxicity

Run the TensorFlow.js toxicity model with the ScaleDynamics WarpJS SDK.

The toxicity model detects whether text contains toxic content such as threatening language, insults, obscenities, identity-based hate, or sexually explicit language.

👉 Try a live demo
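For orientation, here is a minimal sketch of how the toxicity model is typically loaded and queried with the public @tensorflow-models/toxicity package, assuming a Node.js environment with @tensorflow/tfjs installed. The classify helper and the example sentence are illustrative assumptions, not code taken from this sample's WarpJS backend.

// Minimal sketch: classify one sentence with the TensorFlow.js toxicity model.
require('@tensorflow/tfjs'); // registers a backend when running under Node
const toxicity = require('@tensorflow-models/toxicity');

async function classify(sentence) {
  // Only labels predicted with at least 90% confidence are flagged as a match.
  const threshold = 0.9;
  const model = await toxicity.load(threshold);

  // model.classify takes an array of sentences and returns one entry per label
  // (toxicity, insult, obscene, threat, identity_attack, ...), each with
  // per-sentence probabilities and a boolean `match`.
  const predictions = await model.classify([sentence]);
  return predictions.map(({ label, results }) => ({
    label,
    match: results[0].match,
  }));
}

classify('You are a terrible person.').then(console.log);

In this sample, the equivalent classification logic runs as a backend function deployed with WarpJS rather than directly in the page.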

Setup

  • Clone the project
  • Go to the warp-samples/tensorflowjs-toxicity directory
  • Run the following commands:
# install dependencies
$ npm install

# login to ScaleDynamics
$ npx warp login

Run

# run a dev server
$ npm run dev

# build and deploy to production
$ npm run build
$ npm run deploy

Resources