With the code below I am getting "Safe handle has been closed." from CopyToAsync. This happens even when only a single file operation is taking place, and only on large(ish) files.
As you can see, the file(s) are being uploaded via an HTTP POST request.
Is some sort of "keepalive" option needed? Is CopyToAsync inappropriate here?
[HttpPost]
public async Task<IActionResult> PostFormData(List<IFormFile> files, CancellationToken cancellationToken = default)
{
    try
    {
        if (!Request.ContentType?.Contains("multipart/form-data") == true)
        {
            return BadRequest("Unsupported media type.");
        }

        string webRootPath = _webHostEnvironment.WebRootPath;
        string contentRootPath = _webHostEnvironment.ContentRootPath;

        List<Task> taskList = new List<Task>();
        foreach (var file in files)
        {
            if (file.Length > 0)
            {
                var fileName = Path.GetFileName(file.FileName);
                var filePath = Path.Combine(_webHostEnvironment.WebRootPath, fileName);
                using (var stream = new FileStream(filePath, FileMode.Create))
                {
                    taskList.Add(file.CopyToAsync(stream, cancellationToken));
                }
            }
        }

        int filesCopied = 0;
        if (taskList.Count > 0)
        {
            await Task.WhenAll(taskList);
            foreach (var task in taskList)
                if (task.Status == TaskStatus.RanToCompletion)
                    filesCopied++;
        }

        return Ok(new { message = $"{filesCopied} Files uploaded successfully" });
    }
    catch (TaskCanceledException)
    {
        throw;
    }
    catch (System.Exception ex)
    {
        return BadRequest($"Error: {ex.Message}");
    }
}
The using block disposes the FileStream as soon as the loop iteration ends, while the CopyToAsync task you stored in taskList is still running; when the copy then tries to write to the closed handle you get "Safe handle has been closed." Small files only appear to work because the copy finishes before the dispose. Await each copy while its stream is still open, e.g. with await Parallel.ForEachAsync(files, async (file, cancellationToken) => { ... });. It's also easier to handle errors by not letting them escape the loop, i.e. use try/catch inside it. Instead of catch (TaskCanceledException) { throw; } you can just do catch (System.Exception ex) when (ex is not TaskCanceledException). Also, BadRequest is probably the wrong HTTP status code for a generalized failure.
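A minimal sketch of what the rewritten handler body could look like (the field _webHostEnvironment and the parameters files and cancellationToken are from the question; everything else is an assumption, not a drop-in implementation):

// Requires .NET 6+ for Parallel.ForEachAsync and C# 8+ for await using.
int filesCopied = 0;

await Parallel.ForEachAsync(files, cancellationToken, async (file, ct) =>
{
    if (file.Length == 0)
        return;

    var fileName = Path.GetFileName(file.FileName);
    var filePath = Path.Combine(_webHostEnvironment.WebRootPath, fileName);

    try
    {
        // Await the copy *before* the stream is disposed, so the handle
        // is still open while CopyToAsync writes to it.
        await using var stream = new FileStream(filePath, FileMode.Create);
        await file.CopyToAsync(stream, ct);
        Interlocked.Increment(ref filesCopied);
    }
    catch (Exception ex) when (ex is not OperationCanceledException)
    {
        // Log and swallow per-file failures here so one bad file
        // doesn't abort the whole batch; cancellation still propagates.
    }
});

return Ok(new { message = $"{filesCopied} Files uploaded successfully" });

Because the lambda runs concurrently for multiple files, the counter is bumped with Interlocked.Increment rather than filesCopied++.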