
Limiting Concurrent Operations With SemaphoreSlim Using C#

If you’re looking to limit the number of concurrent operations while maintaining as high throughput as possible, SemaphoreSlim can help! For instance, it can maintain a consistent flow of HTTP requests to an external API during a bulk processing operation, respecting the limits of the external API so it isn’t overwhelmed by too many concurrent requests.

Example

using System.Threading;
using System.Threading.Tasks;

namespace MyApp
{
    public class MyService
    {
        private const int MaximumConcurrentOperations = 10;

        private static readonly SemaphoreSlim RateLimit = new SemaphoreSlim(
            initialCount: MaximumConcurrentOperations,
            maxCount: MaximumConcurrentOperations);

        public async Task Process()
        {
            await RateLimit.WaitAsync();

            try
            {
                // Rate limited logic here
            }
            finally
            {
                RateLimit.Release();
            }
        }
    }
}

maxCount

If maxCount is not specified, int.MaxValue is used. From my understanding, maxCount in this context acts as a safety mechanism, making sure you aren’t calling Release more times than Wait.

Consider the following example:

using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

namespace MyApp
{
    class Program
    {
        static readonly SemaphoreSlim SemaphoreSlim =
            new SemaphoreSlim(initialCount: 1);

        static void Main(string[] args)
        {
            var tasks = new List<Task>();

            for (var i = 0; i < 10; i++)
            {
                tasks.Add(Task.Run(() =>
                {
                    SemaphoreSlim.Wait();

                    Console.WriteLine($"Start task, CurrentCount: {SemaphoreSlim.CurrentCount}");
                    Thread.Sleep(100);
                    Console.WriteLine($"End task, CurrentCount: {SemaphoreSlim.CurrentCount}");

                    SemaphoreSlim.Release();
                    SemaphoreSlim.Release(); // Extra call to Release();
                }));
            }

            Task.WaitAll(tasks.ToArray());
        }
    }
}

With example output:

Start task, CurrentCount: 0
End task, CurrentCount: 0
Start task, CurrentCount: 1
Start task, CurrentCount: 0
End task, CurrentCount: 0
End task, CurrentCount: 0
Start task, CurrentCount: 1
Start task, CurrentCount: 0
Start task, CurrentCount: 0
Start task, CurrentCount: 1
End task, CurrentCount: 0
End task, CurrentCount: 0
End task, CurrentCount: 0
End task, CurrentCount: 0
Start task, CurrentCount: 1
Start task, CurrentCount: 0
Start task, CurrentCount: 1
End task, CurrentCount: 5
End task, CurrentCount: 5
End task, CurrentCount: 5

However, if the SemaphoreSlim is changed to new SemaphoreSlim(initialCount: 1, maxCount: 1), a System.Threading.SemaphoreFullException is thrown instead.
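A minimal sketch of that behavior: with initialCount equal to maxCount, the semaphore is already “full”, so a single unmatched Release is enough to trigger the exception.

```csharp
using System;
using System.Threading;

class SemaphoreFullDemo
{
    static void Main()
    {
        // initialCount == maxCount == 1, so the count is already at its ceiling.
        var semaphore = new SemaphoreSlim(initialCount: 1, maxCount: 1);

        try
        {
            // Releasing without a matching Wait would push the count past maxCount.
            semaphore.Release();
        }
        catch (SemaphoreFullException)
        {
            Console.WriteLine("SemaphoreFullException thrown");
        }
    }
}
```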

Summary

Rate limiting using SemaphoreSlim can help avoid overwhelming external services with too many concurrent requests while maintaining high throughput.
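As a closing sketch, the pattern from the first example can be combined with Task.WhenAll for a bulk operation: all tasks are started up front, but the semaphore ensures only ten run at a time. ProcessItemAsync here is a hypothetical stand-in for the real work, such as an HTTP call to the external API.

```csharp
using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

class BulkProcessor
{
    // At most 10 operations run concurrently; the rest queue on WaitAsync.
    private static readonly SemaphoreSlim RateLimit = new SemaphoreSlim(
        initialCount: 10,
        maxCount: 10);

    static async Task Main()
    {
        // Start a task per item; the semaphore throttles how many are
        // inside the rate-limited section at any moment.
        var tasks = Enumerable.Range(1, 100).Select(async item =>
        {
            await RateLimit.WaitAsync();
            try
            {
                await ProcessItemAsync(item);
            }
            finally
            {
                RateLimit.Release();
            }
        }).ToList();

        await Task.WhenAll(tasks);
        Console.WriteLine("All items processed");
    }

    // Hypothetical stand-in for the real rate-limited work.
    static Task ProcessItemAsync(int item) => Task.Delay(10);
}
```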

This post is licensed under CC BY 4.0 by the author.
