I have the following code in my imaging library, which utilises a list of processors to dynamically manipulate images captured by an HttpModule.

At present only one instance of each processor is created in order to keep memory overhead down, and each of these processors has writeable properties that help determine the order in which to process each matched querystring parameter and store the parsed values to process.

As you can see, I am currently wrapping the method's functionality in a lock statement to prevent different threads from the HttpModule overwriting the processors' properties, though I know this could act as a bottleneck. What I am wondering is: is there a design pattern or method by which I can make my processors thread-safe without the lock?
public static ImageFactory AutoProcess(this ImageFactory factory)
{
    if (factory.ShouldProcess)
    {
        // TODO: This is going to be a bottleneck for speed. Find a faster way.
        lock (SyncLock)
        {
            // Get a list of all graphics processors that
            // have parsed and matched the querystring.
            List<IGraphicsProcessor> list =
                ImageProcessorConfig.Instance.GraphicsProcessors
                    .Where(x => x.MatchRegexIndex(factory.QueryString) != int.MaxValue)
                    .OrderBy(y => y.SortOrder)
                    .ToList();

            // Loop through and process the image.
            foreach (IGraphicsProcessor graphicsProcessor in list)
            {
                factory.Image = graphicsProcessor.ProcessImage(factory);
            }
        }
    }

    return factory;
}
A producer consumer queue may be of interest to you. Very generally, your HttpModule would receive events (the producer) and queue them to one or more instances of IGraphicsProcessor (the consumer(s)).
This is the canonical, simplest possible producer/consumer queue implementation: http://www.albahari.com/threading/part4.aspx#_Wait_Pulse_Producer_Consumer_Queue
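The pattern on that page can be sketched roughly as follows; this is a minimal Wait/Pulse queue in the same style, not the article's exact code.

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

// Minimal single-worker producer/consumer queue using Monitor.Wait/Pulse.
public class ProducerConsumerQueue : IDisposable
{
    private readonly object locker = new object();
    private readonly Queue<Action> tasks = new Queue<Action>();
    private readonly Thread worker;

    public ProducerConsumerQueue()
    {
        worker = new Thread(Work);
        worker.Start();
    }

    // Producer side: enqueue work and wake the worker.
    public void Enqueue(Action task)
    {
        lock (locker)
        {
            tasks.Enqueue(task);
            Monitor.Pulse(locker);
        }
    }

    // Consumer side: sleep until work arrives, then run it.
    private void Work()
    {
        while (true)
        {
            Action task;
            lock (locker)
            {
                while (tasks.Count == 0)
                {
                    Monitor.Wait(locker);
                }
                task = tasks.Dequeue();
            }

            if (task == null) return; // null signals shutdown
            task();
        }
    }

    public void Dispose()
    {
        Enqueue(null);  // ask the worker to exit
        worker.Join();  // block until it has drained the queue
    }
}
```

Note the locking here is brief (only around the queue itself), so the expensive image processing runs outside any lock.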
If you are intent on eliminating locking, you should experiment with producer/consumer queues using a lock-free queue implementation, such as System.Collections.Concurrent.ConcurrentQueue&lt;T&gt; in .NET 4.0.
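For example, a minimal sketch using BlockingCollection&lt;T&gt; backed by ConcurrentQueue&lt;T&gt; (both real .NET 4.0 types; the image names and the `LockFreeQueueDemo` wrapper are illustrative only):

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading.Tasks;

public static class LockFreeQueueDemo
{
    // BlockingCollection<T> wraps the lock-free ConcurrentQueue<T>
    // (its default backing store) and adds blocking/completion semantics.
    // Returns the processed items so the flow is easy to verify.
    public static List<string> Run(IEnumerable<string> images)
    {
        var processed = new List<string>();
        using (var queue = new BlockingCollection<string>(new ConcurrentQueue<string>()))
        {
            Task consumer = Task.Factory.StartNew(() =>
            {
                // Blocks until items arrive; the loop ends once
                // CompleteAdding has been called and the queue is drained.
                foreach (string item in queue.GetConsumingEnumerable())
                {
                    processed.Add(item); // stand-in for ProcessImage
                }
            });

            foreach (string image in images)
            {
                queue.Add(image); // producer side, e.g. the HttpModule
            }

            queue.CompleteAdding();
            consumer.Wait();
        }

        return processed;
    }
}
```

Multiple consumers can share the same BlockingCollection safely, so you can scale out the processing side without adding any locks of your own.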