Django + S3BotoStorage, chunking?
So basically, I'm getting an error on my production server: large files won't upload. Specifically, files of 90 MB or greater fail, although the error has happened sporadically with even an 8.8 MB file.
The usual stack trace looks like this:
Stacktrace (most recent call last):
  File "django/core/handlers/base.py", line 111, in get_response
    response = callback(request, *callback_args, **callback_kwargs)
  File "mobileapi/decorators.py", line 11, in wrapper
    return viewfunc(request, **kwargs)
In my view, I have a model instance and call save() on it...
    super(File, self).save(*args, **kwargs)
  File "django/db/models/base.py", line 463, in save
    self.save_base(using=using, force_insert=force_insert, force_update=force_update)
  File "django/db/models/base.py", line 551, in save_base
    result = manager._insert([self], fields=fields, return_id=update_pk, using=using, raw=raw)
  File "django/db/models/manager.py", line 203, in _insert
    return insert_query(self.model, objs, fields, **kwargs)
  File "django/db/models/query.py", line 1576, in insert_query
    return query.get_compiler(using=using).execute_sql(return_id)
  File "django/db/models/sql/compiler.py", line 910, in execute_sql
    cursor.execute(sql, params)
  File "django/db/backends/mysql/base.py", line 114, in execute
    return self.cursor.execute(query, args)
  File "MySQLdb/cursors.py", line 155, in execute
    charset = db.character_set_name()
My impression is that the database connection is closing before the full contents of the file have been uploaded, so the INSERT fails when save() finally reaches the database.
I've been neck-deep in the boto library and the django-storages library trying to debug this, but to no avail :(
I know that django-storages' s3boto module offers multipart uploading, but I'm unsure of how to go about implementing it.
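For reference, here is a rough sketch of what I think the multipart flow would look like using boto's S3 API directly, based on my reading of its docs. The bucket name, chunk size, and helper function names are my own placeholders, not anything django-storages provides, so I may be off base:

```python
import math
import os

# 50 MB per part; S3's multipart API requires each part except the
# last to be at least 5 MB. This value is my assumption.
CHUNK_SIZE = 50 * 1024 * 1024


def iter_chunk_offsets(total_size, chunk_size=CHUNK_SIZE):
    """Yield (part_number, offset, size_in_bytes) triples covering the file.

    S3 part numbers are 1-based.
    """
    part_count = int(math.ceil(total_size / float(chunk_size)))
    for i in range(part_count):
        offset = i * chunk_size
        yield i + 1, offset, min(chunk_size, total_size - offset)


def multipart_upload(bucket_name, key_name, file_path):
    """Upload file_path to S3 as key_name in CHUNK_SIZE-sized parts."""
    # Imported here so the chunking helper above stays importable
    # without boto installed.
    from boto.s3.connection import S3Connection

    conn = S3Connection()  # picks up AWS credentials from the environment
    bucket = conn.get_bucket(bucket_name)
    mp = bucket.initiate_multipart_upload(key_name)
    try:
        with open(file_path, 'rb') as fp:
            total = os.path.getsize(file_path)
            for part_num, offset, nbytes in iter_chunk_offsets(total):
                fp.seek(offset)
                mp.upload_part_from_file(fp, part_num, size=nbytes)
    except Exception:
        # Abandon the upload so S3 doesn't keep billing for orphaned parts.
        mp.cancel_upload()
        raise
    return mp.complete_upload()
```

The idea would be to run something like this instead of (or inside an override of) the storage backend's save path, so the 90 MB file goes up in several small requests rather than one long PUT. Am I on the right track, and is there a supported way to hook this into S3BotoStorage?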
Any help is greatly appreciated.