# Hidden Interaction Tensor Factorization (IJCAI-18)

This repo contains the PyTorch implementation of the IJCAI-18 paper *Joint Learning of Phenotypes and Diagnosis-Medication Correspondence via Hidden Interaction Tensor Factorization*. [paper] [dataset]
The code has been tested with the following packages:
- Python 3.6
- PyTorch 0.4.1
To run the model on the quick demo data, clone the repo, decompress the data archive, and start training:

```bash
git clone git@github.com:jakeykj/hitf.git
cd hitf
tar -xzvf demo_data.tar.gz
python train.py ./demo_data/
```
A folder `./results/` will be created automatically and the results will be saved there.
The data are stored in three separate files contained in a folder (we refer to its path as `<DATA_PATH>`): `<DATA_PATH>/D.csv`, `<DATA_PATH>/M.csv` and `<DATA_PATH>/labels.csv`.

- `D.csv` and `M.csv`: contain the patient-by-diagnosis binary matrix and the patient-by-medication counting matrix, respectively. These two files should be comma-separated.
- `labels.csv`: contains the binary label of each patient, one per line. The order of patients must be aligned with the two matrices above.
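The expected file layout above can be illustrated with a small synthetic example. The snippet below is a sketch, not part of the repo: the folder name `my_data/`, the matrix sizes, and the random data are all hypothetical, chosen only to show the shapes and delimiters that `train.py` expects.

```python
import os
import numpy as np

# Hypothetical toy dataset illustrating the expected input format.
rng = np.random.default_rng(0)
n_patients, n_diag, n_med = 100, 20, 15  # sizes are arbitrary for illustration

D = rng.integers(0, 2, size=(n_patients, n_diag))   # binary patient-by-diagnosis matrix
M = rng.poisson(1.0, size=(n_patients, n_med))      # patient-by-medication counting matrix
labels = rng.integers(0, 2, size=n_patients)        # one binary label per patient

os.makedirs("my_data", exist_ok=True)
np.savetxt("my_data/D.csv", D, fmt="%d", delimiter=",")       # comma-separated
np.savetxt("my_data/M.csv", M, fmt="%d", delimiter=",")       # comma-separated
np.savetxt("my_data/labels.csv", labels, fmt="%d")            # one label per line
```

The row order is what aligns the three files, so the i-th row of each file must refer to the same patient.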
If you use other datasets, organize the input data in the format described above and pass the `<DATA_PATH>` as a parameter to the training script:

```bash
python train.py <DATA_PATH>
```
If you find the paper or the implementation helpful, please cite the following paper:
```bibtex
@inproceedings{yin2018joint,
  title={Joint learning of phenotypes and diagnosis-medication correspondence via hidden interaction tensor factorization},
  author={Yin, Kejing and Cheung, William K and Liu, Yang and Fung, Benjamin C. M. and Poon, Jonathan},
  booktitle={Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence ({IJCAI-18})},
  pages={3627--3633},
  year={2018},
  organization={AAAI Press}
}
```
If you have any enquiries, please contact Mr. Kejing Yin by email (cskjyin [AT] comp [DOT] hkbu.edu.hk), or leave your questions in the issues.
👉 Check out my home page for more of our research work.