Unable to resolve DNS (sometimes?)

Given an application that requests 100 URLs at a time in parallel, out of a total of 10,000 URLs, I receive the following error for 50-5,000 of them:

The remote name could not be resolved: 'www.url.com'

I understand that the error means the DNS server was unable to resolve the hostname. However, the number of URLs that cannot be resolved changes from run to run (ranging from 50 to 5,000).

Am I making too many requests too fast, and can I even do that? Running the same test on a much more powerful server, only 10 URLs could not be resolved, which sounds much more realistic.

The code that does the parallel requesting:

var semp = new SemaphoreSlim(100);
var uris = File.ReadAllLines(@"C:\urls.txt").Select(x => new Uri(x));

var tasks = uris.Select(uri => Task.Run(async () =>
{
   await semp.WaitAsync();
   try
   {
      var result = await Web.TryGetPage(uri); // Using HttpWebRequest
   }
   finally
   {
      semp.Release(); // released even if the request throws
   }
}));

await Task.WhenAll(tasks);
asked Jan 19 '26 by ebb

1 Answer

I'll bet you didn't know that the DNS lookup in HttpWebRequest (which underpins all of the .NET HTTP APIs) happens synchronously, even when making async requests (annoying, right?). This means that firing off many requests at once causes severe ThreadPool strain and a large amount of latency, which can lead to unexpected timeouts. If you really want to step things up, don't use the .NET DNS implementation: use a third-party library to resolve the hosts, create your web request with an IP address instead of a hostname, then manually set the Host header before firing off the request. You can achieve much higher throughput this way.
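A minimal sketch of that resolve-first approach, assuming the third-party DnsClient.NET package (`LookupClient`) for the DNS lookup; `www.url.com` and the request path are placeholders, not values from the question:

```csharp
using System;
using System.Linq;
using System.Net;
using System.Threading.Tasks;
using DnsClient; // third-party NuGet package: DnsClient.NET (assumption)

class DnsFirstRequest
{
    static async Task Main()
    {
        // Resolve the host ourselves instead of letting HttpWebRequest
        // perform a synchronous DNS lookup on a ThreadPool thread.
        var lookup = new LookupClient();
        var response = await lookup.QueryAsync("www.url.com", QueryType.A);
        var ip = response.Answers.ARecords().First().Address;

        // Build the request against the raw IP address...
        var request = (HttpWebRequest)WebRequest.Create($"http://{ip}/");

        // ...but keep the original hostname in the Host header so the
        // server (and any virtual hosting) still sees the right name.
        request.Host = "www.url.com";

        using var reply = (HttpWebResponse)await request.GetResponseAsync();
        Console.WriteLine(reply.StatusCode);
    }
}
```

Because the lookup is now fully asynchronous (and can be cached or batched by the resolver library), the ThreadPool is no longer blocked while thousands of hostnames resolve.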

answered Jan 22 '26 by spender

