Uploading Files to a Database

I need to be able to upload several files to a database and associate them with a given record in a given table. I initially thought about zipping all the necessary files and sending the resulting byte array as a parameter, along with the remaining record data, to a stored procedure, so that I could be sure both that the whole operation acts as a single transaction and that retrieving all the files associated with a given record would be a piece of cake. However, this plan was not accepted for performance reasons, and now I am left with two options:

  1. Create a data structure that wraps an array of byte[], binary-serialize it, and send it along with the rest of the data; or

  2. Send the remaining data first, retrieve the ID of the newly created record, and send each file separately, associating it with that ID.

Now, 1) looks too much like my first plan and is likely to be rejected as well, even though this time no compression would be involved. Option 2) seems to be the way to go, but I must guarantee that I upload both the record's data and the files in a single transaction, which will force me to make changes (albeit slight) to my data layer...
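To make option 1) concrete, it would boil down to something like the sketch below. I'm showing Java and standard binary serialization purely for illustration, and FileBundle is just a placeholder name, not something that exists in my code:

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.ObjectOutputStream;
    import java.io.Serializable;

    // Hypothetical wrapper for option 1): holds the raw contents of each file.
    class FileBundle implements Serializable {
        private static final long serialVersionUID = 1L;
        final byte[][] files;

        FileBundle(byte[][] files) {
            this.files = files;
        }
    }

    class Option1Sketch {
        // Binary-serialize the wrapper so it can be passed as a single
        // BLOB/varbinary parameter alongside the rest of the record data.
        static byte[] serialize(FileBundle bundle) throws IOException {
            ByteArrayOutputStream buffer = new ByteArrayOutputStream();
            try (ObjectOutputStream out = new ObjectOutputStream(buffer)) {
                out.writeObject(bundle);
            }
            return buffer.toByteArray();
        }
    }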

If you were facing this problem, which option would you choose? If neither of the ones I stated above, what would you suggest instead?

Edit: The database is located on a remote server. The files can be arbitrarily large, although I don't expect them to be larger than, say, 1-2 MB. Storing the files in a location accessible by both the application and the database server is not an option, so I can't just send paths to the files to the database (they must really be stored as BLOBs in the database).
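For reference, option 2) would look roughly like the sketch below, with the commit deferred until the last file has been inserted. Java/JDBC and the table and column names (records, record_files, name, content) are assumptions made purely for illustration; the real data layer might call stored procedures instead of inline SQL:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;
    import java.util.List;

    class Option2Sketch {
        // Insert the record, grab its generated ID, then insert each file as a
        // BLOB row pointing at that ID, all under a single commit so that a
        // failure on any file rolls everything back.
        static void saveRecordWithFiles(String url, String name, List<byte[]> files)
                throws SQLException {
            try (Connection conn = DriverManager.getConnection(url)) {
                conn.setAutoCommit(false);           // start the single transaction
                try (PreparedStatement insertRecord = conn.prepareStatement(
                             "INSERT INTO records (name) VALUES (?)",
                             Statement.RETURN_GENERATED_KEYS)) {
                    insertRecord.setString(1, name);
                    insertRecord.executeUpdate();

                    long recordId;
                    try (ResultSet keys = insertRecord.getGeneratedKeys()) {
                        keys.next();
                        recordId = keys.getLong(1);  // the ID the files get associated with
                    }

                    try (PreparedStatement insertFile = conn.prepareStatement(
                                 "INSERT INTO record_files (record_id, content) VALUES (?, ?)")) {
                        for (byte[] file : files) {
                            insertFile.setLong(1, recordId);
                            insertFile.setBytes(2, file);   // stored as a BLOB
                            insertFile.executeUpdate();
                        }
                    }
                    conn.commit();                   // everything or nothing
                } catch (SQLException e) {
                    conn.rollback();                 // undo the record and any files already sent
                    throw e;
                }
            }
        }
    }

The point being that autocommit is turned off before the first insert and commit() is only called after the last file, so the record and its files either all make it in or none of them do.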

Answers


Your first plan was rejected due to performance issues - presumably this means network transfer speed?

Your alternate plan is to send the files individually and then tie them together on your server. That's a fine plan, but you want to ensure all the files are uploaded and committed in a single transaction. Following from that, your server can't respond with a success signal until it has received all the files and successfully committed them.

So your second scenario requires the same amount of data to be uploaded in the transaction. How, then, will it be able to perform any better?

In fact, any possible design you can come up with - assuming the files need to be received within a single transaction - will have the same 'performance issue'.

You may have a different understanding of 'transaction' than I do, but typically you can't go and use the originally submitted 'remaining data' until the transaction is complete. I think your problem lies in the required 'single transaction' nature of your system, rather than in any network bottleneck.

