Discussed in https://github.com/orgs/executablebooks/discussions/1116

Originally posted by wissemkarous on December 30, 2023:

My data is hosted on Kaggle, and I have moved my notebook from Kaggle to GitHub. I want to use GitHub Actions for automated deployment, but I am running into problems with my workflow YAML file. I have tried using the Kaggle API to download the data during deployment, but the dataset is quite large (over 9 GB), which has proven challenging. Are there alternative approaches for handling the data transfer efficiently during deployment through GitHub Actions?
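For illustration, here is a minimal sketch of the kind of workflow the question describes, assuming a placeholder dataset slug `owner/dataset-name` and Kaggle API credentials stored as the repository secrets `KAGGLE_USERNAME` and `KAGGLE_KEY`; the `actions/cache` step avoids re-downloading the data on every run (GitHub's Actions cache is capped at 10 GB per repository, so a 9 GB dataset fits, though tightly):

```yaml
name: deploy

on:
  push:
    branches: [main]

jobs:
  build:
    runs-on: ubuntu-latest
    env:
      # Kaggle API credentials, assumed to be configured as repository secrets
      KAGGLE_USERNAME: ${{ secrets.KAGGLE_USERNAME }}
      KAGGLE_KEY: ${{ secrets.KAGGLE_KEY }}
    steps:
      - uses: actions/checkout@v4

      # Restore the dataset from the Actions cache if a previous run saved it
      - name: Cache Kaggle dataset
        id: dataset-cache
        uses: actions/cache@v4
        with:
          path: data/
          key: kaggle-dataset-v1  # bump the suffix to force a fresh download

      # Download only on a cache miss; 'owner/dataset-name' is a placeholder slug
      - name: Download dataset from Kaggle
        if: steps.dataset-cache.outputs.cache-hit != 'true'
        run: |
          pip install kaggle
          kaggle datasets download -d owner/dataset-name -p data/ --unzip
```

If even a cached 9 GB download proves impractical, hosting the data in external object storage (e.g. an S3 bucket) and fetching only the files the build actually needs is a common alternative.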
Thanks for opening your first issue here! Engagement like this is essential for open source projects! 🤗
If you haven't done so already, check out EBP's Code of Conduct. Also, please try to follow the issue template, as it helps other community members contribute more effectively.
If your issue is a feature request, others may react to it to raise its prominence (see Feature Voting).