push_to_hub
I created a 1.5B-parameter model and want to push the newly initialized model to the Hub.
Doing a simple `model.push_to_hub("thomwolf/my-model")` fails with:
---------------------------------------------------------------------------
CalledProcessError                        Traceback (most recent call last)
~/miniconda2/envs/datasets/lib/python3.7/site-packages/huggingface_hub/repository.py in git_push(self)
    445                 encoding="utf-8",
--> 446                 cwd=self.local_dir,
    447             )

~/miniconda2/envs/datasets/lib/python3.7/subprocess.py in run(input, capture_output, timeout, check, *popenargs, **kwargs)
    511             raise CalledProcessError(retcode, process.args,
--> 512                                      output=stdout, stderr=stderr)
    513     return CompletedProcess(process.args, retcode, stdout, stderr)

CalledProcessError: Command '['git', 'push']' returned non-zero exit status 1.

During handling of the above exception, another exception occurred:

OSError                                   Traceback (most recent call last)
<ipython-input-4-3b4d432f9948> in <module>
----> 1 model.push_to_hub("thomwolf/codeparrot")

~/miniconda2/envs/datasets/lib/python3.7/site-packages/transformers/file_utils.py in push_to_hub(self, repo_path_or_name, repo_url, use_temp_dir, commit_message, organization, private, use_auth_token)
   2029         self.save_pretrained(repo_path_or_name)
   2030         # Commit and push!
-> 2031         url = self._push_to_hub(repo, commit_message=commit_message)
   2032
   2033         # Clean up! Clean up! Everybody everywhere!

~/miniconda2/envs/datasets/lib/python3.7/site-packages/transformers/file_utils.py in _push_to_hub(cls, repo, commit_message)
   2109             commit_message = "add model"
   2110
-> 2111         return repo.push_to_hub(commit_message=commit_message)

~/miniconda2/envs/datasets/lib/python3.7/site-packages/huggingface_hub/repository.py in push_to_hub(self, commit_message)
    460         self.git_add()
    461         self.git_commit(commit_message)
--> 462         return self.git_push()
    463
    464     @contextmanager

~/miniconda2/envs/datasets/lib/python3.7/site-packages/huggingface_hub/repository.py in git_push(self)
    448             logger.info(result.stdout)
    449         except subprocess.CalledProcessError as exc:
--> 450             raise EnvironmentError(exc.stderr)
    451
    452         return self.git_head_commit_url()

OSError: batch response: You need to configure your repository to enable upload of files > 5GB.
Run "huggingface-cli lfs-enable-largefiles ./path/to/your/repo" and try again.
error: failed to push some refs to 'https://huggingface.co/thomwolf/codeparrot'
Since `push_to_hub()` creates a temporary folder (AFAIU), I can't, by default, run `huggingface-cli lfs-enable-largefiles ./path/to/your/repo` in it.
A workaround is probably to manually create the folder to save into and run the LFS command in it before pushing, but maybe this shouldn't fail by default?
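For reference, a minimal sketch of that manual workaround, assuming `model` is the 1.5B-parameter model created above and that the target repo `thomwolf/my-model` already exists on the Hub (both names are placeholders):

```python
import subprocess
from huggingface_hub import Repository

# Clone the Hub repo into a known local folder instead of a temporary one.
local_dir = "./my-model"
repo = Repository(local_dir, clone_from="https://huggingface.co/thomwolf/my-model")

# Enable uploads of files > 5GB in this clone (the command suggested by the error message).
subprocess.run(["huggingface-cli", "lfs-enable-largefiles", local_dir], check=True)

# Save the weights into the clone, then add/commit/push.
model.save_pretrained(local_dir)
repo.push_to_hub(commit_message="add model")
```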
That's good feedback.
@LysandreJik `Repository` should apply `lfs-enable-largefiles` by default, no?
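Purely as an illustration of that suggestion (not the actual implementation), a thin wrapper that runs the command as soon as the local clone exists could look like this:

```python
import subprocess
from huggingface_hub import Repository

def repository_with_large_files(local_dir, clone_from=None):
    """Hypothetical helper: a Repository that always allows files > 5GB."""
    repo = Repository(local_dir, clone_from=clone_from)
    subprocess.run(
        ["huggingface-cli", "lfs-enable-largefiles", local_dir],
        check=True,
    )
    return repo
```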
Yes, got the same feedback from @sgugger yesterday! Will work on a PR.