Welcome to the repository for K-ON. This project investigates the potential of LLMs in understanding and interacting with knowledge graphs, a domain that has received limited exploration in the context of NLP.
Recent advancements in large language models (LLMs) have significantly improved various natural language processing (NLP) tasks. Typically, LLMs are trained to predict the next token, which aligns well with many NLP tasks. However, in knowledge graph (KG) scenarios, entities are the fundamental units, and identifying an entity requires at least several tokens. This leads to a granularity mismatch between KGs and natural language. To address this issue, we propose K-ON, which integrates KG knowledge into the LLM by employing multiple head layers for next k-step prediction. K-ON can not only generate entity-level results in one step, but also enables contrastive loss against entities, which is the most powerful tool in KG representation learning. Experimental results show that K-ON outperforms state-of-the-art methods that incorporate text and even other modalities.
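To make the core idea concrete, here is a minimal NumPy sketch of entity-level scoring with k head layers. All shapes, names, and values are illustrative toys (not the actual K-ON architecture, which operates on LLM hidden states): each of the k heads predicts the token at its own future step in a single forward pass, and an entity (a fixed sequence of k tokens) is scored by summing its per-step token log-probabilities.

```python
import numpy as np

rng = np.random.default_rng(0)

d, vocab, k = 16, 100, 4          # hidden size, vocab size, prediction steps (toy values)
h = rng.normal(size=d)            # final hidden state of the LLM for one input
heads = rng.normal(size=(k, d, vocab)) / np.sqrt(d)  # one head layer per future step

def log_softmax(x):
    x = x - x.max()
    return x - np.log(np.exp(x).sum())

# Each head predicts the token at its own step, so all k token
# distributions are produced in one forward pass rather than k
# autoregressive steps.
step_logits = np.einsum('d,kdv->kv', h, heads)                     # (k, vocab)
step_logprobs = np.array([log_softmax(row) for row in step_logits])

# An entity is a fixed sequence of k tokens; its score is the sum of
# its per-step token log-probabilities, which yields one score per
# candidate entity in a single step.
entities = rng.integers(0, vocab, size=(10, k))                    # 10 candidate entities
entity_scores = step_logprobs[np.arange(k), entities].sum(axis=1)  # (10,)

# Entity-level scores make a contrastive (cross-entropy) loss over
# candidate entities straightforward:
target = 3                                                         # index of the true entity
loss = -log_softmax(entity_scores)[target]
best = int(entity_scores.argmax())
```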
To run the code, please first install all required packages:
pip install --upgrade pandas transformers peft==0.9 bitsandbytes swifter deepspeed easydict pyyaml
Then, preprocess the datasets:
python dataset.py -c config/DB15K.yaml
python dataset.py -c config/MKG-W.yaml
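For reference, the `-c`/`--config` flag above points the preprocessing script at a per-dataset YAML config. A minimal argparse sketch of that command-line interface (the flag names follow the commands above; everything else is hypothetical):

```python
import argparse

# Sketch of the CLI that the preprocessing commands above exercise.
parser = argparse.ArgumentParser(description="Preprocess a KG dataset for K-ON")
parser.add_argument(
    "-c", "--config", required=True,
    help="path to a dataset config, e.g. config/DB15K.yaml",
)

# Parse from an explicit list here so the sketch is self-contained;
# the real script parses sys.argv.
args = parser.parse_args(["-c", "config/DB15K.yaml"])
print(args.config)
```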
Run with the following scripts:
# for the DB15K dataset
sh scripts/run_db15k.sh
or
# for the MKG-Y dataset
sh scripts/run_mkgy.sh
We train K-ON with 8 GPUs; you can change this by editing "gpu_ids" and "num_processes" in the scripts above.
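The "gpu_ids" and "num_processes" settings typically live in a Hugging Face Accelerate launch config. A hypothetical fragment showing where those two keys would be adjusted (values are examples only; match them to your own hardware):

```yaml
# illustrative Accelerate config fragment
distributed_type: DEEPSPEED
num_processes: 8      # one process per GPU
gpu_ids: all          # or an explicit list such as 0,1,2,3,4,5,6,7
mixed_precision: bf16
```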
Please consider citing our paper if it is helpful to your work!
@inproceedings{K-ON,
author = {Lingbing Guo and
Yichi Zhang and
Zhongpu Bo and
Zhuo Chen and
Mengshu Sun and
Zhiqiang Zhang and
Yangyifei Luo and
Wen Zhang and
Huajun Chen},
title = {K-ON: Knowledge On the Head Layer of Large Language Model},
booktitle = {{AAAI}},
year = {2025}
}
We appreciate LLaMA, Huggingface Transformers, Alpaca, Alpaca-LoRA, and many other related works for their open-source contributions.
R.I.P. Kyoto Animation.