ASP.NET Multithreading Web Requests

Accepted answer
Score: 14

Multi-threading is the right choice, but I would call it doing stuff asynchronously.

Anyway, you should know that multi-threading works differently in IIS.

The IIS worker process will only finish a request once all of its child threads end, and this is a big problem, because you don't want to hold a worker process for a long time but rather re-use it for other requests. Actually, it's a thread pool.

This is why ASP.NET offers its own approach to implementing asynchrony, and if you use the right approach, IIS will be able to process more requests at once, because the asynchronous work is executed outside the IIS process model.
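
For example, in Web Forms one framework-supported way is to register the asynchronous work with the page, so the worker thread is handed back to the pool while the remote call is pending. A minimal sketch, assuming .NET 4.5, an Async="true" page directive and Json.NET; the URL and the deserialization step are placeholders, not code from the question:

// Sketch only: Web Forms flavour, assuming .NET 4.5 and <%@ Page Async="true" %>
// Requires: using System.Net.Http; using System.Threading.Tasks; using System.Web.UI;
protected void Page_Load(object sender, EventArgs e)
{
    RegisterAsyncTask(new PageAsyncTask(async () =>
    {
        using (var client = new HttpClient())
        {
            // The IIS worker thread is released while this download is in flight
            string json = await client.GetStringAsync("http://api.example.com/occupations");
            // ...deserialize into OccupationSearch and bind to controls here
        }
    }));
}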

I would suggest you read more about ASP.NET async.

Conclusion: use asynchronous work and it will make more efficient use of server resources, but first learn more about how to do it the right way!

Score: 10

Multithreading is not recommended on ASP.NET, because ASP.NET/IIS does its own multithreading, and any multithreading you do will interfere with those heuristics.

What you really want is concurrency; more specifically, asynchronous concurrency, since your operations are I/O-bound.

The best approach is to use HttpClient with the Task-based Asynchronous Pattern:

// Requires: using System.IO; using System.Net.Http; using System.Threading.Tasks; using Newtonsoft.Json;
public async Task<OccupationSearch> GetOccupationAsync(string requestUrl)
{
  // You can also reuse a single HttpClient instance instead of creating a new one each time
  using (var client = new HttpClient())
  {
    // Await the download so the thread is released while the I/O is in flight
    string response = await client.GetStringAsync(requestUrl);
    return new JsonSerializer().Deserialize<OccupationSearch>(new JsonTextReader(new StringReader(response)));
  }
}

This is asynchronous, and you can easily make it concurrent by using Task.WhenAll:

List<string> urls = ...;
OccupationSearch[] results = await Task.WhenAll(urls.Select(GetOccupationAsync));
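
If the caller is itself an ASP.NET action, keeping everything async end-to-end means no request thread sits blocked while the downloads run. A rough usage sketch, assuming ASP.NET MVC 4+ on .NET 4.5 and that GetOccupationAsync from above is available on the controller; the controller, action name and the way the URL list arrives are made up for illustration:

// Sketch only: hosting the fan-out in an async MVC action (assumes MVC 4+ on .NET 4.5)
// Requires: using System.Collections.Generic; using System.Linq;
//           using System.Threading.Tasks; using System.Web.Mvc;
public class OccupationsController : Controller
{
    public async Task<ActionResult> SearchAll(IEnumerable<string> urls)
    {
        // Start every download, then asynchronously wait for all of them to finish
        OccupationSearch[] results = await Task.WhenAll(urls.Select(GetOccupationAsync));
        return Json(results, JsonRequestBehavior.AllowGet);
    }
}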

Score: 3

I would argue that multithreading (within reason) would provide benefits, because as your code is written now, the calls to GetResponse().GetResponseStream() are blocking.

One of the easiest ways to improve performance is to use Parallel.ForEach:

// Requires: using System.Collections.Concurrent; using System.Collections.Generic; using System.IO;
//           using System.Net; using System.Threading.Tasks; using Newtonsoft.Json;
var urls = new List<string>();

var results = new ConcurrentBag<OccupationSearch>();

Parallel.ForEach(urls, url =>
{
    // Each iteration runs on a thread-pool thread; the request itself still blocks that thread
    WebRequest request = WebRequest.Create(url);

    string response = new StreamReader(request.GetResponse().GetResponseStream()).ReadToEnd();

    var result = new JsonSerializer().Deserialize<OccupationSearch>(new JsonTextReader(new StringReader(response)));

    results.Add(result);
});

If you are using .NET 4.5, the new approach is to use async/await. MSDN has a pretty extensive article on this very topic here: http://msdn.microsoft.com/en-us/library/hh300224.aspx and http://msdn.microsoft.com/en-us/library/hh696703.aspx

Scott Hanselman also has a good blog post on this topic: http://www.hanselman.com/blog/TheMagicOfUsingAsynchronousMethodsInASPNET45PlusAnImportantGotcha.aspx
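
For comparison, here is a rough async/await version of the blocking WebRequest call above; a sketch only, assuming .NET 4.5 and Json.NET, with GetOccupationAsync as an illustrative name rather than anything from the question:

// Sketch: non-blocking version of the WebRequest code above (assumes .NET 4.5 and Json.NET)
// Requires: using System.IO; using System.Net; using System.Threading.Tasks; using Newtonsoft.Json;
public async Task<OccupationSearch> GetOccupationAsync(string url)
{
    WebRequest request = WebRequest.Create(url);

    using (WebResponse response = await request.GetResponseAsync())
    using (var reader = new StreamReader(response.GetResponseStream()))
    {
        // ReadToEndAsync keeps the thread free while the response body downloads
        string json = await reader.ReadToEndAsync();
        return new JsonSerializer().Deserialize<OccupationSearch>(new JsonTextReader(new StringReader(json)));
    }
}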
