Code for bilinear transduction on regression and imitation learning tasks, as proposed in *Learning to Extrapolate: A Transductive Approach* (ICLR 2023).
Clone the repository:
```
git clone https://github.com/avivne/bilinear-transduction.git
```
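Then change into the repository root (directory name taken from the clone URL):
```
cd bilinear-transduction
```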
Create a virtual environment with the provided requirements:
```
conda env create -f environment.yml
```
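Activate the environment before running any of the commands below. The environment name is defined in `environment.yml`; `bilinear-transduction` is used here as an assumed name:
```
conda activate bilinear-transduction
```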
Download the grasping data and place `demos.pkl` under `data/grasping`.
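A minimal sketch of the expected layout, assuming `demos.pkl` was downloaded to the current directory:
```
mkdir -p data/grasping
mv demos.pkl data/grasping/demos.pkl
```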
Training and evaluating bilinear transduction on regression tasks (grasping):
```
python bilinear_transduction_regression.py
```
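For orientation, here is a minimal sketch of the kind of bilinear predictor these scripts train: the target for a query `x` is predicted from a training anchor `x_anchor` by combining an embedding of the difference `x - x_anchor` with an embedding of the anchor through a dot product. The module, layer sizes, and names below are illustrative assumptions, not the repository's actual implementation.

```python
import torch
import torch.nn as nn

class BilinearTransducer(nn.Module):
    """Illustrative bilinear transduction head (not the repo's exact code)."""

    def __init__(self, x_dim, y_dim, embed_dim=64):
        super().__init__()
        # Embedding of the difference between the query and the anchor.
        self.phi = nn.Sequential(nn.Linear(x_dim, 128), nn.ReLU(),
                                 nn.Linear(128, y_dim * embed_dim))
        # Embedding of the anchor itself.
        self.psi = nn.Sequential(nn.Linear(x_dim, 128), nn.ReLU(),
                                 nn.Linear(128, y_dim * embed_dim))
        self.y_dim, self.embed_dim = y_dim, embed_dim

    def forward(self, x, x_anchor):
        delta = x - x_anchor
        a = self.phi(delta).view(-1, self.y_dim, self.embed_dim)
        b = self.psi(x_anchor).view(-1, self.y_dim, self.embed_dim)
        # Bilinear combination: one dot product per output dimension.
        return (a * b).sum(dim=-1)
```

The idea in the paper is that an out-of-support query can still be handled if an anchor is chosen from the training set such that the difference `x - x_anchor` resembles differences seen during training.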
Training and evaluating bilinear transduction on imitation learning tasks:
```
python bilinear_transduction_imitation.py --config-name configs/reach_metaworld.yaml
```
Replace `reach_metaworld` with `push_metaworld`, `slider`, or `adroit` for the other imitation learning environments. Add `--model-type bc` to train and evaluate the neural net baseline on any of the tasks.
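For example, to run bilinear transduction on the push environment and, separately, the neural net baseline on the same task (flags combined as described above):
```
python bilinear_transduction_imitation.py --config-name configs/push_metaworld.yaml
python bilinear_transduction_imitation.py --config-name configs/push_metaworld.yaml --model-type bc
```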
Rendering sample videos of expert demonstrations and model performance on in-distribution and out-of-support (OOS) data points:
```
python render_videos.py
```
Modify the corresponding block in `render_videos.py` to select which environments, models, and data points are rendered.
If you use this code in your research, please consider citing:
```
@inproceedings{netanyahu2023transduction,
  title={Learning to Extrapolate: A Transductive Approach},
  author={Netanyahu, Aviv and Gupta, Abhishek and Simchowitz, Max and Zhang, Kaiqing and Agrawal, Pulkit},
  booktitle={The Eleventh International Conference on Learning Representations (ICLR)},
  year={2023}
}
```
The environments and data are derived from Meta-World, Adroit relocate, mjrl and NDF.