So I encountered a problem with deadlocks; I was getting this exception:
System.InvalidOperationException: An exception has been raised that is likely due to a transient failure. Consider enabling transient error resiliency by adding 'EnableRetryOnFailure()' to the 'UseSqlServer' call.
I even made a second project to write the simplest code I could that reproduces the issue; it's just a few lines of code. It is really basic: you just need two entities that reference the same third entity. That is such a common thing, in my opinion, yet EF Core can't handle it by default and hints to use EnableRetryOnFailure().
I think it is quite common to have two or more entities that have a relation to the same entity. We could even have something like a logging system in our application that adds a record to a log table whenever someone adds data, with the data always related to some log record. Or we could save which user added some data, etc.; the possibilities are endless.
My question is: if we have a big application, then 1000 users could trigger a retry every second, depending on how frequently they use some action that results in this deadlock. Doesn't this hurt performance? Won't this clog the database eventually?
It's just an API with all the code inside controllers, for simplicity. The database is generated by EF Core (code-first approach). There are two controllers with endpoints that accept a JSON file whose data is added to the database. Information about the file is saved every time data from a file is added (file name, date and time). Posting one file results in many data records (depending on what is inside) and one record with the file name, date and time (every data record gets the same file reference, so all records will have the same InputFileId).
To reproduce it, you just need to send requests to both endpoints at the same time, or send, for example, 4 requests at the same time to the same endpoint. I had 2000 entries in the JSON file, so each request takes a little longer, which makes the deadlock easier to trigger.
User Controller:
[Route("api/[controller]")]
[ApiController]
public class UsersController : ControllerBase
{
    public MyDbContext DbContext { get; }

    public UsersController(MyDbContext dbContext)
    {
        DbContext = dbContext;
    }

    [HttpPost]
    public async Task<IActionResult> Import([FromForm] IFormFile file)
    {
        using var streamReader = new StreamReader(file.OpenReadStream());
        var serializer = new JsonSerializer(); // Newtonsoft.Json
        var users = (List<User>)serializer.Deserialize(streamReader, typeof(List<User>));

        var inputFile = new InputFile
        {
            FileName = file.FileName,
            DateAdded = DateTime.Now
        };

        foreach (var user in users)
        {
            user.InputFile = inputFile;
        }

        await DbContext.AddRangeAsync(users);
        await DbContext.SaveChangesAsync();
        return Ok();
    }
}
Product Controller:
[Route("api/[controller]")]
[ApiController]
public class ProductsController : ControllerBase
{
    public MyDbContext DbContext { get; }

    public ProductsController(MyDbContext dbContext)
    {
        DbContext = dbContext;
    }

    [HttpPost]
    public async Task<IActionResult> Import([FromForm] IFormFile file)
    {
        using var streamReader = new StreamReader(file.OpenReadStream());
        var serializer = new JsonSerializer(); // Newtonsoft.Json
        var products = (List<Product>)serializer.Deserialize(streamReader, typeof(List<Product>));

        var inputFile = new InputFile
        {
            FileName = file.FileName,
            DateAdded = DateTime.Now
        };

        foreach (var product in products)
        {
            product.InputFile = inputFile;
        }

        await DbContext.AddRangeAsync(products);
        await DbContext.SaveChangesAsync();
        return Ok();
    }
}
Db context:
public class MyDbContext : DbContext
{
    public DbSet<Product> Products { get; set; }
    public DbSet<User> Users { get; set; }
    public DbSet<InputFile> InputFiles { get; set; }

    public MyDbContext(DbContextOptions<MyDbContext> options) : base(options)
    {
    }
}
Product:
public class Product
{
    public long Id { get; set; }
    public string SerialNumber { get; set; }
    public string Name { get; set; }
    public string Description { get; set; }
    public long InputFileId { get; set; }
    public InputFile InputFile { get; set; }
}
User:
public class User
{
    public long Id { get; set; }
    public string Name { get; set; }
    public string Surname { get; set; }
    public long InputFileId { get; set; }
    public InputFile InputFile { get; set; }
}
InputFile:
public class InputFile
{
    public long Id { get; set; }
    public string FileName { get; set; }
    public DateTime DateAdded { get; set; }
}
Startup.cs, ConfigureServices:
public void ConfigureServices(IServiceCollection services)
{
    services.AddControllers();
    services.AddDbContext<MyDbContext>(options =>
        options.UseSqlServer(Configuration.GetConnectionString("DefaultConnection")));
}
Won't this clog database eventually?
The default EnableRetryOnFailure implementation uses an increasing (exponentially backed-off) delay between retries, so it usually works out of the box without you even needing to adjust it.
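Enabling it is a one-line change to the DbContext registration shown in the question. A minimal sketch (the parameter values here are illustrative, and the ones shown happen to match the defaults, so a bare EnableRetryOnFailure() call would behave the same):

```csharp
// Sketch: registering the DbContext with the built-in retrying
// execution strategy for SQL Server.
services.AddDbContext<MyDbContext>(options =>
    options.UseSqlServer(
        Configuration.GetConnectionString("DefaultConnection"),
        sqlOptions => sqlOptions.EnableRetryOnFailure(
            maxRetryCount: 6,                        // give up after 6 attempts
            maxRetryDelay: TimeSpan.FromSeconds(30), // cap on the backed-off delay
            errorNumbersToAdd: null)));              // extra SQL error numbers to treat as transient
```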
Doesn't this hurt performance?
No, it doesn't. A retry is only attempted after a transient failure has actually occurred, and the growing delay between attempts keeps retries from hammering the database.
If we have big application, then 1000 users could trigger retry every second, depending on how frequently they use some action that results in this deadlock.
Also make sure that you have READ_COMMITTED_SNAPSHOT set to ON; it reduces reader/writer blocking, which makes deadlocks less likely in the first place, so retries stay rare even under load.
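This is a one-time, database-level setting. You can run it directly in SSMS, or execute it once from code; a sketch, assuming dbContext is an instance of the MyDbContext above:

```csharp
// Sketch: turning on READ_COMMITTED_SNAPSHOT for the current database.
// WITH ROLLBACK IMMEDIATE disconnects other sessions so the ALTER can
// acquire the exclusive access it needs; run this during a maintenance window.
dbContext.Database.ExecuteSqlRaw(
    "ALTER DATABASE CURRENT SET READ_COMMITTED_SNAPSHOT ON WITH ROLLBACK IMMEDIATE;");
```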
For more information, see Connection Resiliency in the EF Core documentation. Also take a look at the SqlServerRetryingExecutionStrategy and ExecutionStrategy implementations.
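One thing to be aware of: once retries are enabled, EF Core throws if you open your own transaction outside the strategy, because it cannot safely replay a partially committed unit of work. If you ever need an explicit transaction, wrap the whole operation in the execution strategy so it can be retried as one piece. A sketch, reusing the users/inputFile variables from the controller above:

```csharp
// Sketch: combining a user-initiated transaction with the retrying strategy.
// The entire delegate is replayed from the start if a transient failure occurs.
var strategy = DbContext.Database.CreateExecutionStrategy();
await strategy.ExecuteAsync(async () =>
{
    using var transaction = await DbContext.Database.BeginTransactionAsync();
    await DbContext.AddRangeAsync(users);
    await DbContext.SaveChangesAsync();
    await transaction.CommitAsync();
});
```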