Comments (19)
Thanks for providing more details. Support for uploading unseekable streams in Transfer Utility
was added in AWSSDK.S3 3.7.202.0. You may want to upgrade to at least that version to confirm that it's an unseekable-stream issue.
However, AWSSDK.S3 3.7.305.8 may have introduced a regression in the above feature. I will discuss this with the team and keep you posted.
Regards,
Chaitanya
from aws-sdk-net.
Thank you for reporting the issue. Unfortunately, I am unable to reproduce the exception using your code sample.
Can you please elaborate on how the sourceStream is populated? If you can enable AWS SDK verbose logging, that might also be helpful here. You can turn on verbose logging with the code below to see detailed logs:
Amazon.AWSConfigs.LoggingConfig.LogResponses = Amazon.ResponseLoggingOption.Always;
Amazon.AWSConfigs.LoggingConfig.LogTo = Amazon.LoggingOptions.SystemDiagnostics;
Amazon.AWSConfigs.AddTraceListener("Amazon", new System.Diagnostics.ConsoleTraceListener());
Regards,
Chaitanya
Hello,
The sourceStream is coming from a multipart form in an API request:
var boundary = Request.GetBoundary();
var reader = new MultipartReader(boundary, Request.Body, _bufferSize);

MultipartSection section;
while ((section = await reader.ReadNextSectionAsync(cancellationToken)) != null)
{
    var contentDisposition = section.GetContentDispositionHeader();
    if (contentDisposition.IsFileDisposition())
    {
        var fileSection = section.AsFileSection();
        _logger.LogInformation("Received file to upload: {0}", fileSection.FileName);
        Validate.FormFile(fileSection.FileStream, fileSection.FileName, fileSection.Section.ContentType);

        var stream = new TypedStream
        {
            ContentType = fileSection.Section.ContentType,
            ContentStream = fileSection.FileStream,
            Filename = fileSection.FileName
        };

        await _documentManager.UploadDocument(tenantId, new DocumentReference(documentId), name, stream);
    }
}
the _documentManager.UploadDocument calls:
public async Task UploadDocument(Guid tenantId, IDocumentReference document, string name, ITypedStream stream)
{
    // ... other code logic
    await _storageManager.UploadContentStream(tenantId, document.DocumentId, name, stream);
}
the _storageManager.UploadContentStream calls:
public async Task UploadContentStream(Guid tenantId, Guid id, string name, ITypedStream typedStream, CancellationToken cancellationToken = default)
{
    var fullPath = $"{tenantId}/{id}/{name.ToLower()}";
    await _blobStorageRepo.UploadAsync(typedStream.ContentStream, fullPath, typedStream.ContentType, cancellationToken);
    _logger.LogInformation("Content stream has been uploaded");
}
the _blobStorageRepo.UploadAsync is from a company's private NuGet package whose source I do not have access to. I can only decompile it and see what is called there (which is where the code snippet in the original question comes from), so I have no way to turn on the detailed logging inside that package. Is there a way I can enable it in this situation?
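One thing that may still work here (a sketch, assuming the private package uses the same AWS SDK for .NET in the same process): the AWSConfigs logging settings are static and process-wide, so enabling them once in the host application's startup should also capture requests made by SDK clients created inside the package.

```csharp
// Program.cs (or any code that runs before the first upload).
// Assumption: the private package calls the AWS SDK for .NET in this same
// process, so these static, process-wide settings apply to its clients too.
using System.Diagnostics;

Amazon.AWSConfigs.LoggingConfig.LogResponses = Amazon.ResponseLoggingOption.Always;
Amazon.AWSConfigs.LoggingConfig.LogTo = Amazon.LoggingOptions.SystemDiagnostics;
Amazon.AWSConfigs.AddTraceListener("Amazon", new ConsoleTraceListener());
```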
One more thing. I altered the code
var boundary = Request.GetBoundary();
var reader = new MultipartReader(boundary, Request.Body, _bufferSize);

MultipartSection section;
while ((section = await reader.ReadNextSectionAsync(cancellationToken)) != null)
{
    var contentDisposition = section.GetContentDispositionHeader();
    if (contentDisposition.IsFileDisposition())
    {
        var fileSection = section.AsFileSection();
        _logger.LogInformation("Received file to upload: {0}", fileSection.FileName);
        Validate.FormFile(fileSection.FileStream, fileSection.FileName, fileSection.Section.ContentType);

        var stream = new TypedStream
        {
            ContentType = fileSection.Section.ContentType,
            ContentStream = fileSection.FileStream,
            Filename = fileSection.FileName
        };

        await _documentManager.UploadDocument(tenantId, new DocumentReference(documentId), name, stream);
    }
}
to this:
var fStream = System.IO.File.OpenRead(@"C:\original0001.pdf");
var stream = new TypedStream
{
    ContentType = "application/pdf",
    ContentStream = fStream,
    Filename = "original0001.pdf"
};
await _documentManager.UploadDocument(tenantId, new DocumentReference(documentId), name, stream);
Basically, instead of reading the stream from the multipart form in the request, I read a local file, and the upload succeeds. I hope this helps narrow down the possible cause of the problem.
Regards,
Hello,
I recently upgraded the NuGet package to the latest version and have had this same issue since.
I downgraded the package to 3.7.305.2 and that works fine.
I'm really interested in this fix too.
The use case is the same: upload from multipart form data without storing anything in memory (for big files, e.g. 10 GB), uploading chunks to S3 directly without knowing the content length in advance.
All the best,
Adrien
I'm also seeing the same issue.
Looks like TransferUtility doesn't support streams where CanSeek is false: https://jasonterando.medium.com/net-core-tee-streaming-and-buffered-s3-uploads-4a063230d99f
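A workaround sketch, assuming it is acceptable to buffer the payload: copy the non-seekable stream to a seekable temporary-file stream before handing it to TransferUtility, so that CanSeek is true and Length is known (the ToSeekableAsync helper name is made up for illustration).

```csharp
using System.IO;
using System.Threading;
using System.Threading.Tasks;

public static class StreamBuffering
{
    // Workaround sketch: if the incoming stream is not seekable, spool it to a
    // temporary file first. The returned stream has CanSeek == true and a real
    // Length, which is what the uploader expects. Spooling to disk (rather
    // than a MemoryStream) keeps multi-GB uploads out of process memory.
    public static async Task<Stream> ToSeekableAsync(Stream source, CancellationToken ct = default)
    {
        if (source.CanSeek)
            return source; // already seekable, hand it through unchanged

        var temp = new FileStream(
            Path.GetTempFileName(),
            FileMode.Create, FileAccess.ReadWrite, FileShare.None,
            bufferSize: 81920,
            FileOptions.Asynchronous | FileOptions.DeleteOnClose); // file vanishes on Dispose

        await source.CopyToAsync(temp, ct);
        temp.Position = 0; // rewind so the uploader reads from the start
        return temp;
    }
}
```

The cost is writing the whole payload to disk once before the upload starts, which trades latency for seekability.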
Hi!
I noticed that the latest version of the AWSSDK.S3 package where this sort of works is 3.7.305.7, also using .NET 8.
From then onwards we get the aforementioned error.
It is also worth noting that the following conditions must be met for this to work (please verify before trying them):
- Buffering needs to be disabled for the particular request.
- Synchronous operations must be allowed (this seems to be a problem within the UploadAsync call in TransferUtility, related to #1534).
Hope this helps in any way.
@julian-dimitroff Good afternoon. Using the latest version of AWSSDK.S3 (3.7.305.31) in an ASP.NET Core project targeting .NET 8.0, I am unable to reproduce the issue using the code below with a 500 KB file (kindly note that the code is for demonstration purposes only):
Views > Home > Index.cshtml
@{
    ViewData["Title"] = "Home Page";
}

<div class="text-center">
    <form method="post" enctype="multipart/form-data" asp-controller="FileUpload" asp-action="Index">
        <div class="form-group">
            <div class="col-md-10">
                <p>Upload one or more files using this form:</p>
                <input type="file" name="files" multiple />
            </div>
        </div>
        <div class="form-group">
            <div class="col-md-10">
                <input type="submit" value="Upload" />
            </div>
        </div>
    </form>
</div>
HomeController.cs
using Microsoft.AspNetCore.Mvc;
using TestWebAppNetCore.Models;

namespace TestWebAppNetCore.Controllers
{
    public class HomeController : Controller
    {
        public IActionResult Index()
        {
            return View();
        }
    }
}
FileUploadController.cs
using Amazon.Runtime;
using Amazon.S3;
using Amazon.S3.Transfer;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.WebUtilities;
using System;
using System.Threading;

namespace TestWebAppNetCore.Controllers
{
    public class FileUploadController : Controller
    {
        [HttpPost("FileUpload")]
        public async Task<IActionResult> Index(List<IFormFile> files)
        {
            long size = files.Sum(f => f.Length);
            foreach (var formFile in files)
            {
                if (formFile.Length > 0)
                {
                    AmazonS3Client amazonS3Client = new AmazonS3Client();
                    await UploadAsync(amazonS3Client, "testbucket-issue3201", formFile.OpenReadStream(), formFile.FileName, formFile.ContentType);
                }
            }
            return Ok(new { count = files.Count, size });
        }

        public async Task UploadAsync(AmazonS3Client amazonS3Client, string bucketName, Stream sourceStream, string fullPath, string contentType, CancellationToken cancellationToken = default)
        {
            try
            {
                TransferUtility transferUtility = new TransferUtility(amazonS3Client);
                await transferUtility.UploadAsync(sourceStream, bucketName, fullPath, cancellationToken);
            }
            catch (AmazonServiceException ex)
            {
                throw new Exception($"Got AmazonServiceException in {nameof(UploadAsync)} while uploading object to '{fullPath}'.", ex);
            }
            catch (Exception ex)
            {
                throw new Exception($"Got exception in {nameof(UploadAsync)} while uploading object to '{fullPath}'.", ex);
            }
        }
    }
}
Kindly note that in the above code I'm relying on the ASP.NET model binding feature to bind the incoming files to a list of IFormFile objects in the file upload controller.
Could you please:
- Check if the issue goes away with the latest version of SDK.
- Share the base minimum code to reproduce the issue (removing any internally used packages or code base)
- List of NuGet packages used in the project.
Thanks,
Ashish
As I said in the previous posts: "the _blobStorageRepo.UploadAsync is from a company's private nuget package that I do not have access to the code, I only can decompile and see what is called there". The only thing that I can verify is the AWSSDK.S3 version, which is:
I'll ask my management to contact whoever is responsible for that package to update the version. Once I am able to test it, I'll write the result here.
Thanks for your quick response
Could you please retry, but this time use a stream instead of a FormFile as the parameter? This issue happens mainly when working with the stream directly (such as with multipart content) and then trying to upload a stream that has a Length of 0.
Okay, I'll try to set up a sample project that can reproduce the problem without this internal NuGet package.
I'll get back to you when it's done.
Have a great day!
Hello again @JFSiller @ashishdhingra. I made a simple setup using .NET 8 and AWSSDK.S3 v3.7.307, but I still got the error about the content length.
I'll attach the simple project; I hope you can run it!
The request to the endpoint is made with VS Code's "Thunder Client" extension.
The file that I'm using to upload is in the .zip archive.
I hope that helps reproduce the problem.
AWS Tests.zip
Best regards
@julian-dimitroff Thanks for providing the reproduction code and steps. I was able to reproduce the issue, which gave the error "Unable to write content to request stream; content would exceed Content-Length." with the following stack trace:
at System.IO.Stream.<<CopyToAsync>g__Core|27_0>d.MoveNext()
at System.Net.Http.HttpContent.<<CopyToAsync>g__WaitAsync|56_0>d.MoveNext()
at System.Net.Http.HttpConnection.<SendRequestContentAsync>d__61.MoveNext()
at System.Net.Http.HttpConnection.<SendRequestContentWithExpect100ContinueAsync>d__62.MoveNext()
at System.Net.Http.HttpConnection.<SendAsync>d__57.MoveNext()
at System.Net.Http.HttpConnectionPool.<SendWithVersionDetectionAndRetryAsync>d__89.MoveNext()
at System.Net.Http.DiagnosticsHandler.<SendAsyncCore>d__10.MoveNext()
at System.Net.Http.HttpClient.<<SendAsync>g__Core|83_0>d.MoveNext()
at Amazon.Runtime.HttpWebRequestMessage.<GetResponseAsync>d__20.MoveNext()
at Amazon.Runtime.Internal.HttpHandler`1.<InvokeAsync>d__9`1.MoveNext()
at Amazon.Runtime.Internal.RedirectHandler.<InvokeAsync>d__1`1.MoveNext()
at Amazon.Runtime.Internal.Unmarshaller.<InvokeAsync>d__3`1.MoveNext()
at Amazon.S3.Internal.AmazonS3ResponseHandler.<InvokeAsync>d__1`1.MoveNext()
at Amazon.Runtime.Internal.ErrorHandler.<InvokeAsync>d__5`1.MoveNext()
at Amazon.Runtime.Internal.ErrorHandler.<InvokeAsync>d__5`1.MoveNext()
at Amazon.Runtime.Internal.CallbackHandler.<InvokeAsync>d__9`1.MoveNext()
at Amazon.Runtime.Internal.Signer.<InvokeAsync>d__1`1.MoveNext()
at Amazon.S3.Internal.S3Express.S3ExpressPreSigner.<InvokeAsync>d__5`1.MoveNext()
at Amazon.Runtime.Internal.EndpointDiscoveryHandler.<InvokeAsync>d__2`1.MoveNext()
at Amazon.Runtime.Internal.EndpointDiscoveryHandler.<InvokeAsync>d__2`1.MoveNext()
at Amazon.Runtime.Internal.CredentialsRetriever.<InvokeAsync>d__7`1.MoveNext()
at Amazon.Runtime.Internal.RetryHandler.<InvokeAsync>d__10`1.MoveNext()
at Amazon.Runtime.Internal.RetryHandler.<InvokeAsync>d__10`1.MoveNext()
at Amazon.Runtime.Internal.CallbackHandler.<InvokeAsync>d__9`1.MoveNext()
at Amazon.Runtime.Internal.CallbackHandler.<InvokeAsync>d__9`1.MoveNext()
at Amazon.S3.Internal.AmazonS3ExceptionHandler.<InvokeAsync>d__1`1.MoveNext()
at Amazon.Runtime.Internal.ErrorCallbackHandler.<InvokeAsync>d__5`1.MoveNext()
at Amazon.Runtime.Internal.MetricsHandler.<InvokeAsync>d__1`1.MoveNext()
at Amazon.S3.Transfer.Internal.SimpleUploadCommand.<ExecuteAsync>d__10.MoveNext()
at AWS_Tests.Controllers.WeatherForecastController.<UploadDocumentWithTenantId>d__6.MoveNext() in D:\source\repros\AWS.Tests\AWS Tests\AWS.Tests\Controllers\WeatherForecastController.cs:line 89
Based on the stack trace, TransferUtility correctly used a simple upload based on the file size. We will investigate the issue on our end.
Thanks,
Ashish
@julian-dimitroff While debugging, I noticed that fileSection.FileStream.Length has the value 0. The issue persists in .NET 7.0. However, the code works fine using .NET 6.0, where fileSection.FileStream.Length has the value 17939.
One weird behavior I noticed is that temporarily copying the contents of fileSection.FileStream to a MemoryStream causes the fileSection.FileStream.Length property to be populated.
In other words, copy this code:
var memoryStream = new MemoryStream();
fileSection.FileStream.CopyTo(memoryStream);
after
ValidateFormFile(fileSection.FileStream, fileSection.FileName, fileSection.Section.ContentType);
fileSection.FileStream.Position = 0;
...
So there appears to be an issue with the Microsoft.AspNetCore.WebUtilities package (targeting .NET 7.0 and .NET 8.0) where FileMultipartSection.FileStream.Length is initially missing its value. Most likely this is an issue that should be reported to Microsoft.
Thanks,
Ashish
@ashishdhingra Thank you again for your effort.
Please advise: should I report the problem, or can you do it? My concern is that if I report it, I will not be able to explain in detail what is going on.
Best regards,
Yulian
@julian-dimitroff We have reviewed this and created a backlog item for it to be groomed and to decide a further course of action.
@ashishdhingra Again thank you for your time and effort!
Regards,
Julian
We also have this issue with uploads failing for non-seekable streams.
From this:
"Use the low-level API when you need to pause and resume multipart uploads, vary part sizes during the upload, or do not know the size of the upload data in advance. When you don't have these requirements, use the high-level API."
From the above, you are suggesting the low-level API for non-seekable streams.
But this documentation says non-seekable streams should work with the high-level API:
"For nonseekable streams or streams with an unknown length, TransferUtility will use multipart upload and buffer up to a part size in memory until the final part is reached and complete the upload."
So which one is it? Do you expect the high-level API to work with non-seekable streams, or do you expect us to use the low-level API in future versions?
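For reference, here is a rough sketch of the low-level path the first quote points to (an illustration only: the MultipartUploadAsync helper is hypothetical, and it assumes a configured IAmazonS3 client). It reads the non-seekable stream into part-sized buffers and calls UploadPart until the source is exhausted:

```csharp
using System.Collections.Generic;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using Amazon.S3;
using Amazon.S3.Model;

public static class LowLevelUpload
{
    // Sketch: low-level multipart upload for a non-seekable stream of unknown
    // length. Parts must be at least 5 MB, except the last one.
    public static async Task MultipartUploadAsync(
        IAmazonS3 s3, string bucket, string key, Stream source, CancellationToken ct = default)
    {
        const int partSize = 5 * 1024 * 1024;
        var init = await s3.InitiateMultipartUploadAsync(
            new InitiateMultipartUploadRequest { BucketName = bucket, Key = key }, ct);

        var parts = new List<PartETag>();
        var buffer = new byte[partSize];
        try
        {
            for (int partNumber = 1; ; partNumber++)
            {
                // Fill the buffer; a single ReadAsync may return fewer bytes than asked.
                int filled = 0, read;
                while (filled < partSize &&
                       (read = await source.ReadAsync(buffer, filled, partSize - filled, ct)) > 0)
                {
                    filled += read;
                }
                if (filled == 0) break; // source exhausted on a part boundary

                var resp = await s3.UploadPartAsync(new UploadPartRequest
                {
                    BucketName = bucket,
                    Key = key,
                    UploadId = init.UploadId,
                    PartNumber = partNumber,
                    PartSize = filled,
                    InputStream = new MemoryStream(buffer, 0, filled)
                }, ct);
                parts.Add(new PartETag(partNumber, resp.ETag));

                if (filled < partSize) break; // short part => this was the last part
            }

            await s3.CompleteMultipartUploadAsync(new CompleteMultipartUploadRequest
            {
                BucketName = bucket,
                Key = key,
                UploadId = init.UploadId,
                PartETags = parts
            }, ct);
        }
        catch
        {
            // Avoid leaking a pending multipart upload (and its storage charges).
            await s3.AbortMultipartUploadAsync(bucket, key, init.UploadId, ct);
            throw;
        }
    }
}
```

The trade-off versus TransferUtility is holding one part (here 5 MB) in memory at a time while keeping full control over part sizing and abort behavior.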
Thank you in advance