Aspect Term Extraction (ATE) & Aspect Polarity Classification (APC)
Fast, low-memory, enhanced implementation of Local Context Focus.
Built on LC-ABSA / LCF-ABSA / LCF-BERT and LCF-ATEPC.
PyTorch implementations (CPU & CUDA supported).
If you would like to support the PyABSA project, please star this repository as your contribution.
| Models | Laptop14 (acc) | Rest14 (acc) | Rest15 (acc) | Rest16 (acc) |
|---|---|---|---|---|
| LSA-T-DeBerta | 84.16 | 90.45 | 88.15 | 93.98 |
| LSA-S-DeBerta | 84.33 | 89.64 | 89.04 | 94.47 |
These results are based on roberta-base (v1.3.5), not the bert-base-uncased backbone used in our original paper. We are working on updating the experimental results of our paper; please see the APC leaderboard for more.
The ATEPC leaderboard update is pending due to resource limitations; if you can run it, please contact me.
| Module | Description |
|---|---|
| pyabsa | package root (including all interfaces) |
| pyabsa.functional | recommended interface entry |
| pyabsa.functional.checkpoint | checkpoint manager entry, inference model entry |
| pyabsa.functional.dataset | dataset entry |
| pyabsa.functional.config | predefined config managers |
| pyabsa.functional.trainer | training modules; every trainer returns an inference model |
PyABSA uses FindFile to locate target files, which means you can specify a dataset/checkpoint by keyword instead of by absolute path, e.g.,
- First, refer to ABSADatasets to prepare your dataset in an acceptable format.
- You can open a PR to contribute your dataset and use it via `ABSADatasets.your_dataset`, or refer to it by absolute/relative path, or by dataset directory name:
```python
dataset = './laptop'               # relative path
dataset = 'ABSOLUTE_PATH/laptop/'  # absolute path
dataset = 'laptop'                 # dataset directory name; keyword case doesn't matter
dataset = 'lapto'                  # matches any directory whose path contains 'lapto' or 'aptop'
checkpoint = 'lcfs'                # checkpoint assignment works the same way
```
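To make the keyword-matching behavior concrete, here is a minimal stdlib sketch of how a FindFile-style lookup can resolve a keyword to matching directories. The function name `find_by_keyword` is illustrative only and is not PyABSA's actual implementation.

```python
from pathlib import Path

def find_by_keyword(root: str, keyword: str) -> list:
    """Return every directory under `root` whose path contains `keyword` (case-insensitive)."""
    keyword = keyword.lower()
    return [str(p) for p in Path(root).rglob('*')
            if p.is_dir() and keyword in str(p).lower()]
```

With a directory tree containing `laptop14`, the keyword `'lapto'` resolves to that directory, which is why partial keywords work for dataset and checkpoint assignment.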
PyABSA uses AutoCUDA to support automatic CUDA device assignment, but you can still set a preferred device.
```python
auto_device = True      # auto-assign a CUDA device for training / inference
auto_device = False     # use CPU only
auto_device = 'cuda:1'  # specify a preferred device
auto_device = 'cpu'     # specify a preferred device
```
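The resolution logic behind `auto_device` can be sketched as a pure function. This is an illustrative approximation of the behavior described above, not PyABSA's actual code; `cuda_available` stands in for a runtime CUDA check such as `torch.cuda.is_available()`.

```python
def resolve_device(auto_device, cuda_available: bool) -> str:
    """Map an auto_device setting to a torch-style device string."""
    if auto_device is True:       # auto-assign: prefer CUDA when present
        return 'cuda' if cuda_available else 'cpu'
    if auto_device is False:      # explicitly disable CUDA
        return 'cpu'
    return auto_device            # an explicit string such as 'cuda:1' or 'cpu'
```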
PyABSA encourages you to use string labels instead of numbers, e.g., sentiment labels = {negative, positive, unknown}.
- Whatever labels you use in your dataset are the labels output at inference.
- The PyABSA version information is also shown in the output when loading a checkpoint's training args.
- You can train a model on multiple datasets that share the same sentiment labels, and you can even contribute and define a combination of datasets here!
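Combining datasets is only safe when they share one label set. The helper below is a hedged sketch (not part of PyABSA) that makes this consistency check explicit before merging.

```python
def combined_label_set(datasets: dict) -> set:
    """datasets maps a dataset name to a list of (text, label) rows.

    Raises ValueError if any dataset's label set differs from the union.
    """
    label_sets = {name: {label for _, label in rows}
                  for name, rows in datasets.items()}
    merged = set().union(*label_sets.values())
    for name, labels in label_sets.items():
        if labels != merged:
            raise ValueError(f'{name} uses labels {labels}, expected {merged}')
    return merged
```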
The default spaCy English model is en_core_web_sm; if it is not installed, PyABSA will download and install it automatically.
If you would like to change the English model (or other predefined options), you can get/set them as follows:
```python
from pyabsa.functional.config.apc_config_manager import APCConfigManager
from pyabsa.functional.config.atepc_config_manager import ATEPCConfigManager
from pyabsa.functional.config.classification_config_manager import ClassificationConfigManager

# Set
APCConfigManager.set_apc_config_english({'spacy_model': 'en_core_web_lg'})
ATEPCConfigManager.set_atepc_config_english({'spacy_model': 'en_core_web_lg'})
ClassificationConfigManager.set_classification_config_english({'spacy_model': 'en_core_web_lg'})

# Get
APCConfigManager.get_apc_config_english()
ATEPCConfigManager.get_atepc_config_english()
ClassificationConfigManager.get_classification_config_english()

# Manually set the spaCy nlp Language object
from pyabsa.core.apc.dataset_utils.apc_utils import configure_spacy_model
nlp = configure_spacy_model(APCConfigManager.get_apc_config_english())
```
- Create a new Python environment and install pyabsa
- Find a target demo script (ATEPC, APC, Text Classification) to prepare your work
- Format your dataset referring to ABSADatasets, or use a public dataset from ABSADatasets
- Init your config to specify the model, dataset, and hyper-parameters
- Train your model and get checkpoints
- Share your checkpoint and dataset
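The steps above can be sketched with PyABSA's functional interface. This is a hedged outline, not a verified script: `Trainer`, `ABSADatasetList`, and `load_trained_model` follow the v1-style API referenced in this README, so check the demo scripts of your installed version for the exact signatures.

```python
from pyabsa.functional import APCConfigManager, ABSADatasetList, Trainer

config = APCConfigManager.get_apc_config_english()   # init a predefined config
config.num_epoch = 5                                 # adjust hyper-parameters

trainer = Trainer(config=config,
                  dataset=ABSADatasetList.Laptop14,  # a public dataset keyword
                  auto_device=True)                  # auto-assign CUDA / CPU
sent_classifier = trainer.load_trained_model()       # every trainer returns an inference model
```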
Please do not install versions without a corresponding release note, to avoid installing a test version.
To use PyABSA, install the latest version from pip or from source:
```bash
# install from pip
pip install -U pyabsa

# or install from source
git clone https://github.com/yangheng95/PyABSA --depth=1
cd PyABSA
python setup.py install
```
PyABSA checks for the latest available checkpoints before loading, and loads the latest checkpoint from Google Drive. To view the available checkpoints, use the following code and load a checkpoint by name:
```python
from pyabsa import available_checkpoints

checkpoint_map = available_checkpoints()  # show checkpoints available for the current version of PyABSA
```
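A downloaded checkpoint can then be loaded for inference. The snippet below is a hedged sketch: the checkpoint keyword `'multilingual'` and the `APCCheckpointManager.get_sentiment_classifier` / `infer` calls follow PyABSA's v1-style interface, so adjust them to the names printed by `available_checkpoints()` and to the demos of your installed version.

```python
from pyabsa import APCCheckpointManager

# download (if needed) and load a checkpoint by name / keyword
sent_classifier = APCCheckpointManager.get_sentiment_classifier(checkpoint='multilingual',
                                                                auto_device=True)
sent_classifier.infer('the [ASP]battery life[ASP] is excellent !')
```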
If you cannot access Google Drive, you can download our checkpoints from here (extraction code: ABSA) and load the unzipped checkpoint manually. (The checkpoints mirrored on Baidu Cloud are updated more slowly and may lag behind; please make sure to use the corresponding version of PyABSA.)
More datasets are available at ABSADatasets.
- Laptop14
- Restaurant14
- Restaurant15
- Restaurant16
- Phone
- Car
- Camera
- Notebook
- MAMS
- TShirt
- Television
- MOOC
- Shampoo
- Multilingual (the combination of all the datasets above)
You don't have to download the datasets, as the datasets will be downloaded automatically.
In addition to the following models, we provide a template model involving the LCF vec; you can develop your own model based on the LCF-APC model template or the LCF-ATEPC model template.
- LCF-ATEPC
- LCF-ATEPC-LARGE (Dual BERT)
- FAST-LCF-ATEPC
- LCFS-ATEPC
- LCFS-ATEPC-LARGE (Dual BERT)
- FAST-LCFS-ATEPC
- BERT-BASE
- SLIDE-LCF-BERT (Faster & Performs Better than LCF/LCFS-BERT)
- SLIDE-LCFS-BERT (Faster & Performs Better than LCF/LCFS-BERT)
- LCF-BERT (Reimplemented & Enhanced)
- LCFS-BERT (Reimplemented & Enhanced)
- FAST-LCF-BERT (Faster, with slight performance loss)
- FAST_LCFS-BERT (Faster, with slight performance loss)
- LCF-DUAL-BERT (Dual BERT)
- LCFS-DUAL-BERT (Dual BERT)
- BERT-BASE
- BERT-SPC
- LCA-Net
- DLCF-DCA-BERT *
- AOA_BERT
- ASGCN_BERT
- ATAE_LSTM_BERT
- Cabasc_BERT
- IAN_BERT
- LSTM_BERT
- MemNet_BERT
- MGAN_BERT
- RAM_BERT
- TD_LSTM_BERT
- TC_LSTM_BERT
- TNet_LF_BERT
We hope you can help us improve this project; your contributions are welcome. You can contribute in many ways, including:
- Share your custom dataset in PyABSA and ABSADatasets
- Integrate your models into PyABSA (you can share your models whether or not they are based on PyABSA; if you are interested, we will help you)
- Raise a bug report while using PyABSA or reviewing the code (PyABSA is an individual project driven by enthusiasm, so your help is needed)
- Give us advice about feature design/refactoring (you can suggest improvements to some features)
- Correct/rewrite error messages or code comments (the comments were not written by native English speakers; you can help us improve the documentation)
- Create an example script for a particular situation (such as specifying a spaCy model, a pretrained BERT type, or some hyper-parameters)
- Star this repository to keep it active
The LCF is a simple and adaptive mechanism proposed for ABSA. Many models based on LCF have been proposed and have achieved SOTA performance. Developing your models based on LCF will significantly improve your ABSA models. If you are looking for the original proposal of local context focus, please see the introduction of LCF. If you are looking for the original code of the LCF-related papers, please see LC-ABSA / LCF-ABSA or LCF-ATEPC.
This work is built on LC-ABSA/LCF-ABSA and LCF-ATEPC, and on other impressive works such as PyTorch-ABSA and LCFS-BERT.
MIT
Thanks goes to these wonderful people (emoji key):
XuMayi 💻 |
YangHeng 📆 |
brtgpy 🔣 |
Ryan 💻 |
lpfy 💻 |
Jackie Liu 💻 |
This project follows the all-contributors specification. Contributions of any kind welcome!