Application for extra disk quota #4469
Comments
I granted you the extra space. Out of curiosity, what do you mean by "deleted some of my bundle"? What commands did you run, if you don't mind me asking?
I deleted some large bundles to free up disk space, but I found that the space was not reclaimed after deleting them. I have received the extra disk quota you granted me. Thanks!
I deleted some bundles to free up disk space, but found that the space could not be reused, so I asked for extra disk quota.
Thank you for granting me the extra disk space.
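For context, below is a minimal sketch of what that kind of cleanup might look like with the CodaLab `cl` CLI. It is an illustration under assumptions, not a record of what was actually run in this thread: it assumes an installed, logged-in `cl` client, and `old-bundle` is a placeholder bundle name. A likely source of confusion is that removing a bundle from a worksheet does not delete the bundle itself, so it does not free disk quota; only deleting the bundle does.

```bash
# Hedged sketch, assuming a logged-in CodaLab `cl` client.
# "old-bundle" is a hypothetical bundle name, not one from this thread.

# Show user info, including current disk usage and quota.
cl uinfo

# `cl detach` only removes the bundle from the current worksheet;
# the bundle still exists and still counts against your disk quota.
#   cl detach old-bundle

# `cl rm` permanently deletes the bundle, which is what frees quota.
cl rm old-bundle

# Re-check disk usage afterwards.
cl uinfo
```

If cleanup like this still does not reclaim enough space, asking the maintainers for a quota increase, as in this issue, is the remaining option.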
I use CodaLab to submit my model so that the blind test can be run, but I have used up my disk quota. Even though I have deleted some of my bundles, I still cannot run my model. Could you allocate me an extra 30 GB of disk space? I need to evaluate about 5 models, each of which takes about 5 GB, and I need to finish this evaluation to write my paper. I would appreciate it if you could process my request as soon as possible. Thanks.
My username: atom
My email: [email protected]