Reaching maximum value for TaskResult primary key field causes task failures #307
Comments
That could be making the default BigAutoField in this package.
Django 3.2 allows configuring the type of AutoFields, so it seems there's no need for action from this package.
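For context, the Django 3.2 mechanism this comment refers to is the project-level DEFAULT_AUTO_FIELD setting; a minimal sketch of how a project might opt in:

```python
# settings.py -- minimal sketch of the Django 3.2+ setting the comment
# refers to. It controls the implicit primary key type for apps that do
# not declare their own default_auto_field on their AppConfig.
DEFAULT_AUTO_FIELD = "django.db.models.BigAutoField"
```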
We run into this problem at least twice a year. It could be solved by just using a UUID field and populating it with a uuid4 value.
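A minimal sketch of that approach, assuming a simplified stand-in for the package's TaskResult model rather than its actual definition:

```python
# Hypothetical model illustrating the UUID-primary-key idea from the
# comment above; this is not the package's actual TaskResult definition.
import uuid

from django.db import models


class TaskResult(models.Model):
    id = models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False)
    task_id = models.CharField(max_length=255, unique=True)
```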
I've changed the default primary keys to BigAutoField in #426. It feels like the primary keys of these models shouldn't be integers, but rather some sort of UUID. That seems like a much more difficult change to make as a third-party package. I don't know what mechanisms Django migrations provides that could be applied here. Doing a data migration on tables with millions of rows might take a while to run, so that seems out of the question...
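For illustration, a migration that switches the primary key to BigAutoField could look roughly like this; the app label, dependency, and field options below are assumptions, not necessarily what #426 actually ships:

```python
# Hypothetical sketch of a schema migration that widens the implicit
# integer primary key to BigAutoField; the dependency is a placeholder.
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ("django_celery_results", "0001_initial"),  # placeholder
    ]

    operations = [
        migrations.AlterField(
            model_name="taskresult",
            name="id",
            field=models.BigAutoField(primary_key=True, serialize=False),
        ),
    ]
```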
When using django-celery-results with the Postgres backend, it's possible to reach the maximum integer value for the primary key column, and this causes every task to fail. We're using Python 3.9.7, Django 2.2.28, django-celery-results 2.2.0, and Postgres 12.

Traceback:

The fix was to manually ALTER SEQUENCE and ALTER COLUMN to bigint. After this, tasks continued to work as expected. Although our problem is now solved, I'm not sure if there was a way to prevent this, or at least handle the issue in a better way.
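For anyone hitting the same wall, here is a sketch of what that manual fix can look like when driven from Django; the table and sequence names are assumptions based on the package's default naming, so verify them against your schema first:

```python
# Sketch of the manual fix described above, run once against the database
# (e.g. from a Django shell). Note that the ALTER TABLE rewrites the column
# and will lock the table while it runs.
from django.db import connection

with connection.cursor() as cursor:
    # Widen the primary key column from integer to bigint.
    cursor.execute(
        "ALTER TABLE django_celery_results_taskresult "
        "ALTER COLUMN id TYPE bigint;"
    )
    # Let the backing sequence count past the 32-bit maximum as well.
    cursor.execute(
        "ALTER SEQUENCE django_celery_results_taskresult_id_seq AS bigint;"
    )
```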