I have a JS/HTML5 front end that calls a C# WebAPI to upload a large file (1-3GB) that needs to be encrypted and then stored in a SQL DB.
My restrictions are that I can't store the file unencrypted, it can't be stored outside of SQL, and I can't use SQL FILESTREAM.
For the WebAPI, I disabled request buffering (by overriding WebHostBufferPolicySelector) to avoid out-of-memory exceptions. This lets me stream uploads directly to disk without significantly increasing the memory footprint.
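For reference, the override looks roughly like this (the class name is mine; `UseBufferedInputStream` is the actual WebHostBufferPolicySelector virtual):

```csharp
using System.Web.Http.Hosting;
using System.Web.Http.WebHost;

// Illustrative name; returning false makes large request bodies stream
// instead of being buffered in memory.
public class NoBufferPolicySelector : WebHostBufferPolicySelector
{
    public override bool UseBufferedInputStream(object hostContext)
    {
        return false;
    }
}

// Registered in WebApiConfig.Register:
// config.Services.Replace(typeof(IHostBufferPolicySelector), new NoBufferPolicySelector());
```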
I would like to store the series of encrypted chunks that make up the file in order in rows in a SQL table. Then I could pull the chunks out, decrypt them, and stream them back to the user when requested.
I do not know how to take a multipart message, encrypt the chunks, and write them to rows in the database. Can someone point me in the right direction?
OK. So you've solved the issue of WebAPI running out of RAM. But the next concern is that you've got to pass the data to SQL Server. You can pipe the request stream through any of the .NET stream encryptors (a CryptoStream with an AES transform, for example), but then the output has to go somewhere after that.
I suggest you do something like:
Request stream -> encryptor -> custom stream
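Concretely, the wiring might look something like this (a sketch, not production code: `SqlChunkStream` is a hypothetical write-only stream that persists whatever is written to it as rows, and key/IV management is entirely out of scope here):

```csharp
using System.Net.Http;
using System.Security.Cryptography;

// Inside your upload action; connectionString and fileId are assumed to exist.
using (var aes = Aes.Create())   // the key/IV must be stored where you can recover them later
using (var sqlStream = new SqlChunkStream(connectionString, fileId))
using (var crypto = new CryptoStream(sqlStream, aes.CreateEncryptor(), CryptoStreamMode.Write))
{
    // Streams the (unbuffered) request body through the encryptor into SQL Server.
    await Request.Content.CopyToAsync(crypto);
}   // disposing the CryptoStream flushes the final encrypted block
```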
The custom stream will be responsible for sending data to SQL Server in chunks. You can fine-tune the chunk size, but the point is to avoid issuing a single SQL command with a 1GB+ parameter. The custom stream accumulates the encrypted bytes into a fixed-size buffer; when the buffer fills, it writes a row to SQL Server, clears the buffer, and waits for more data from the encryptor stream. The rows that get written are tied together by some sort of file ID and, of course, a sequence number that tells you the order to use when reassembling them.
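A minimal sketch of such a custom stream, assuming a table like `FileChunks(FileId uniqueidentifier, Seq int, Data varbinary(max))` — all names here are illustrative:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

// Write-only stream: buffers incoming (encrypted) bytes and inserts one row per full chunk.
public class SqlChunkStream : Stream
{
    private readonly string _connectionString;
    private readonly Guid _fileId;
    private readonly byte[] _buffer;
    private int _count;   // bytes currently in the buffer
    private int _seq;     // sequence number, used to reassemble chunks in order

    public SqlChunkStream(string connectionString, Guid fileId, int chunkSize = 4 * 1024 * 1024)
    {
        _connectionString = connectionString;
        _fileId = fileId;
        _buffer = new byte[chunkSize];
    }

    public override void Write(byte[] buffer, int offset, int count)
    {
        while (count > 0)
        {
            int n = Math.Min(count, _buffer.Length - _count);
            Buffer.BlockCopy(buffer, offset, _buffer, _count, n);
            _count += n; offset += n; count -= n;
            if (_count == _buffer.Length) WriteChunkRow();   // buffer full: persist a row
        }
    }

    public override void Flush()
    {
        if (_count > 0) WriteChunkRow();   // final, possibly partial, chunk
    }

    private void WriteChunkRow()
    {
        var data = new byte[_count];
        Buffer.BlockCopy(_buffer, 0, data, 0, _count);
        using (var conn = new SqlConnection(_connectionString))
        using (var cmd = new SqlCommand(
            "INSERT INTO FileChunks (FileId, Seq, Data) VALUES (@id, @seq, @data)", conn))
        {
            cmd.Parameters.Add("@id", SqlDbType.UniqueIdentifier).Value = _fileId;
            cmd.Parameters.Add("@seq", SqlDbType.Int).Value = _seq++;
            cmd.Parameters.Add("@data", SqlDbType.VarBinary, -1).Value = data;
            conn.Open();
            cmd.ExecuteNonQuery();
        }
        _count = 0;
    }

    protected override void Dispose(bool disposing)
    {
        if (disposing) Flush();
        base.Dispose(disposing);
    }

    // Boilerplate: this stream is write-only and non-seekable.
    public override bool CanRead { get { return false; } }
    public override bool CanSeek { get { return false; } }
    public override bool CanWrite { get { return true; } }
    public override long Length { get { throw new NotSupportedException(); } }
    public override long Position
    {
        get { throw new NotSupportedException(); }
        set { throw new NotSupportedException(); }
    }
    public override int Read(byte[] buffer, int offset, int count) { throw new NotSupportedException(); }
    public override long Seek(long offset, SeekOrigin origin) { throw new NotSupportedException(); }
    public override void SetLength(long value) { throw new NotSupportedException(); }
}
```

Note that no row is written until a chunk fills (or the stream is flushed/disposed), so partial writes never hit the database half-formed.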
When you want to send the data back, you just do this again in reverse: read the rows in sequence order, decrypt them, and stream the result to the client.
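The reverse path might be sketched with PushStreamContent like this (assuming you can recover the `key`, `iv`, and `connectionString` for the file — decryption happens by writing the ciphertext rows through a CryptoStream wrapped around the response stream):

```csharp
using System;
using System.Data.SqlClient;
using System.Net;
using System.Net.Http;
using System.Security.Cryptography;
using System.Web.Http;

public class FilesController : ApiController
{
    // Placeholders: in reality you'd load these per file from secure storage.
    private byte[] key, iv;
    private string connectionString;

    public HttpResponseMessage GetFile(Guid fileId)
    {
        var response = Request.CreateResponse(HttpStatusCode.OK);
        response.Content = new PushStreamContent(async (outputStream, content, context) =>
        {
            using (var aes = Aes.Create())
            using (var crypto = new CryptoStream(
                outputStream, aes.CreateDecryptor(key, iv), CryptoStreamMode.Write))
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand(
                "SELECT Data FROM FileChunks WHERE FileId = @id ORDER BY Seq", conn))
            {
                cmd.Parameters.AddWithValue("@id", fileId);
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        var chunk = (byte[])reader["Data"];
                        // Writing ciphertext through the decryptor emits plaintext to the client.
                        await crypto.WriteAsync(chunk, 0, chunk.Length);
                    }
                }
            }   // disposing the CryptoStream flushes the final block and closes the output stream
        });
        return response;
    }
}
```

For very large chunks you'd probably read with `CommandBehavior.SequentialAccess` and `GetBytes` rather than materializing each row as a single array, but the `ORDER BY Seq` reassembly is the essential part.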