SilentNot - 2 months ago
C# Question

Equivalent of Objective-C Dispatch Queues in C# .NET approach or idiom

I'm a newbie to the C# .NET world, and I'm creating a .NET API service that writes to an MS Access database table. Access takes full table locks, and there will be up to 20 people writing to the table at the same time (we are getting rid of Access, but not soon enough!). We need to ensure "writers" don't get locked out.

While a threading solution would work in most languages, Apple recommends using Dispatch Queues for concurrency challenges like this (see excerpt below): requests to a shared resource are queued and processed one at a time to avoid conflicts.

Is there an equivalent approach or idiom in C#?

I did see this SO question, but it did not really have an answer in the correct context I need.

For more details:

Excerpt from Dispatch Queues Objective-C docs:

Dispatch queues are a C-based mechanism for executing custom tasks. A dispatch queue executes tasks either serially or concurrently but always in a first-in, first-out order. (In other words, a dispatch queue always dequeues and starts tasks in the same order in which they were added to the queue.) A serial dispatch queue runs only one task at a time, waiting until that task is complete before dequeuing and starting a new one. By contrast, a concurrent dispatch queue starts as many tasks as it can without waiting for already started tasks to finish.

Dispatch queues have other benefits:

- They provide a straightforward and simple programming interface.
- They offer automatic and holistic thread pool management.
- They provide the speed of tuned assembly.
- They are much more memory efficient (because thread stacks do not linger in application memory).
- They do not trap to the kernel under load.
- The asynchronous dispatching of tasks to a dispatch queue cannot deadlock the queue.
- They scale gracefully under contention.
- Serial dispatch queues offer a more efficient alternative to locks and other synchronization primitives.


The tasks you submit to a dispatch queue must be encapsulated inside either a function or a block object

EDIT:
Turns out that the .NET "main loop" (the main thread) is where you can make requests to process your code; the main loop is where all the UI work is typically done. According to the SO question "The Windows GUI Main Loop in C#...where is it?", you can also access it via Application.Run and a Timer.

Answer Source

I found your question interesting and related to an area of the .NET framework I could use more knowledge about, so I did a little research on the topic. Here you go:

There are several .NET options for managing threads in a friendly way that might help with what you are trying to do. The standouts are TaskScheduler.QueueTask and ThreadPool.QueueUserWorkItem. BackgroundWorker may also be applicable to your architecturally unenviable situation.

The documentation for neither the TaskScheduler nor the ThreadPool task/thread queues mentions any guarantee about the order of queued items. If the order of your threaded tasks is important, then, based on my admittedly limited knowledge of the .NET framework, you might want to guarantee the ordering yourself with a queue-starter method that takes each write request and either writes to the database immediately or queues the write until write access is available. That is going to be a little messy, because you will need to lock for concurrency. This GitHub-hosted code from a similar SO question might be good for that; I have not tested it.
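One way to sketch that queue-starter idea without hand-rolled locks is a single consumer thread draining a BlockingCollection, which gives FIFO, one-at-a-time execution much like a serial dispatch queue. The `SerialQueue` and `Enqueue` names below are hypothetical, and the actual Access write is stubbed as a list append:

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading;

// A minimal serial work queue: one consumer thread drains actions in FIFO
// order, so only one "writer" ever touches the shared resource at a time.
class SerialQueue : IDisposable
{
    private readonly BlockingCollection<Action> _work = new BlockingCollection<Action>();
    private readonly Thread _worker;

    public SerialQueue()
    {
        _worker = new Thread(() =>
        {
            // GetConsumingEnumerable blocks until work arrives and ends
            // once CompleteAdding has been called and the queue is drained.
            foreach (var action in _work.GetConsumingEnumerable())
                action(); // runs one at a time, in enqueue order
        });
        _worker.IsBackground = true;
        _worker.Start();
    }

    public void Enqueue(Action action) { _work.Add(action); }

    public void Dispose()
    {
        _work.CompleteAdding(); // stop accepting work, drain the rest
        _worker.Join();
    }
}

class Program
{
    static void Main()
    {
        var results = new List<int>();
        using (var queue = new SerialQueue())
        {
            for (int i = 0; i < 5; i++)
            {
                int n = i;
                queue.Enqueue(() => results.Add(n)); // stand-in for an Access write
            }
        } // Dispose waits for all queued work to finish
        Console.WriteLine(string.Join(",", results)); // prints "0,1,2,3,4"
    }
}
```

Because every write funnels through the single worker thread, the Access table only ever sees one writer, regardless of how many callers enqueue at once.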

Alternatively, you could queue your write tasks to a thread pool limited to one thread at a time using SetMaxThreads. No guarantees from me about that being appropriate for your situation, or ever (note that ThreadPool.SetMaxThreads cannot reduce the maximum below the number of processors on the machine), though it does seem simple.
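If you are on .NET 4.5 or later, a sketch that avoids the SetMaxThreads caveat is ConcurrentExclusiveSchedulerPair: its ExclusiveScheduler runs at most one queued task at a time, which behaves much like a serial dispatch queue. The writes here are stubbed as string appends:

```csharp
using System;
using System.Linq;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        // The exclusive scheduler executes at most one task at a time,
        // so tasks queued to it never run concurrently (.NET 4.5+).
        var pair = new ConcurrentExclusiveSchedulerPair();
        var serial = new TaskFactory(pair.ExclusiveScheduler);

        string output = "";
        for (int i = 0; i < 5; i++)
        {
            int n = i;
            // Safe without locks: no two of these lambdas overlap.
            serial.StartNew(() => output += n);
        }

        pair.Complete();        // no more tasks will be queued
        pair.Completion.Wait(); // block until all queued tasks finish

        // All five writes ran, with no lost updates from races.
        Console.WriteLine(string.Concat(output.OrderBy(c => c))); // "01234"
    }
}
```

The exclusive scheduler gives mutual exclusion without dedicating a thread; the tasks still run on pool threads, just never two at once.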

Hope that helps.

EDIT: Further research, prompted by the similar SO question the original poster pointed out: ConcurrentQueue, new in .NET 4.0, lets a consumer take the next task off the queue without the extra locking code that a regular Queue would require. That approach also allows multiple tasks to run simultaneously where possible, instead of always waiting for the previous task to complete as in the single-thread thread-pool approach.
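A minimal producer/consumer sketch with ConcurrentQueue might look like the following; the 20 "writers" just enqueue requests, and the actual database write is stubbed as a counter increment:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        // Many producers enqueue write requests concurrently; ConcurrentQueue
        // is FIFO and thread-safe, so no extra lock is needed around it.
        var writes = new ConcurrentQueue<string>();
        Parallel.For(0, 20, i => writes.Enqueue("row " + i)); // 20 "writers"

        // A single consumer dequeues and performs the actual database write,
        // so the Access table only ever sees one writer at a time.
        int processed = 0;
        string request;
        while (writes.TryDequeue(out request))
        {
            processed++; // stand-in for the real INSERT into Access
        }
        Console.WriteLine(processed); // prints 20
    }
}
```

In a real service the consumer would be a long-running loop (or a BlockingCollection wrapping the ConcurrentQueue, so it can block instead of spin while the queue is empty).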