Konstantina-Lazaridou/toxic-comments-bias-kaggle
Classifying toxic online comments

Reproducing the solution from the Kaggle kernel Simple LSTM - PyTorch version.

Table of Contents
  1. About The Project
  2. Getting Started
  3. Usage
  4. Roadmap
  5. Contributing
  6. License
  7. Contact
  8. Acknowledgments

About The Project

Classifying toxic online comments.
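The reproduced kernel is built around a simple LSTM classifier. A minimal sketch of that kind of model is shown below; the layer sizes, label count, and pooling choice here are illustrative assumptions, not the kernel's actual values:

```python
import torch
import torch.nn as nn

class ToxicLSTM(nn.Module):
    """Illustrative bidirectional LSTM classifier for comment text."""

    def __init__(self, vocab_size=20000, embed_dim=300, hidden=128, n_labels=6):
        super().__init__()
        # In the actual kernel, the embedding layer is initialized
        # from pretrained word vectors.
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden, batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden, n_labels)

    def forward(self, token_ids):
        x = self.embedding(token_ids)   # (batch, seq, embed_dim)
        x, _ = self.lstm(x)             # (batch, seq, 2 * hidden)
        x = x.max(dim=1).values         # max-pool over the sequence
        return self.out(x)              # one logit per toxicity label

model = ToxicLSTM()
logits = model(torch.randint(0, 20000, (4, 50)))  # 4 comments, 50 tokens each
```

The logits would typically be passed through a sigmoid, since toxicity labels are multi-label rather than mutually exclusive.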

(back to top)

Built With

  • Python
  • PyTorch

(back to top)

Getting Started

Prerequisites

Things you need in order to use the software:

  • Python
  • Jupyter notebook
  • Poetry (installation via pip did not work on Ubuntu 20.04; the curl installer worked, after also executing source $HOME/.poetry/env)
  • Text data from Kaggle
  • Pretrained word embeddings from Stanford and Facebook
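The Stanford and Facebook embedding files (presumably GloVe and fastText) share a simple text format: one token per line, followed by its vector components. A sketch of a loader for that format, with a hypothetical file path:

```python
import numpy as np

def load_embeddings(path):
    """Parse a GloVe/fastText-style text file: `token v1 v2 ... vN` per line."""
    vectors = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            if len(parts) < 2:
                continue  # skip blank lines
            if len(parts) == 2 and parts[0].isdigit() and parts[1].isdigit():
                continue  # skip the "count dim" header that fastText .vec files start with
            vectors[parts[0]] = np.asarray(parts[1:], dtype=np.float32)
    return vectors

# Usage (hypothetical path):
# glove = load_embeddings("embeddings/glove.840B.300d.txt")
```

The resulting dictionary can then be used to build the embedding matrix for the model's vocabulary.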

Installation

Follow these steps to create a virtual environment and activate it in Jupyter.

  • In this project's repository, execute poetry install, which creates a virtual environment based on the pyproject.toml file
  • Activate the virtual environment with poetry shell
  • If you need any new dependencies, e.g., nltk, execute poetry add nltk
  • To run the notebook with this environment, execute poetry run jupyter notebook
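Put together, the steps above look like this (assuming Poetry is already on your PATH):

```shell
# Create the virtual environment from pyproject.toml
poetry install

# Activate the environment in a subshell
poetry shell

# Optionally add a new dependency, e.g. nltk
poetry add nltk

# Launch Jupyter inside the environment
poetry run jupyter notebook
```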

(back to top)

Usage

(back to top)
