You have two main options for uploading files in WCF: streamed or buffered/chunked transfer. The latter is the option that provides reliable data transfer, as you can use wsHttpBinding with WS-ReliableMessaging turned on. Reliable messaging cannot be used with streaming, as the WS-RM mechanism requires processing the data as a unit to apply checksums, etc. Processing a large file this way would require a huge buffer and thus a lot of available memory on both client and server; denial-of-service comes to mind. The workaround is chunking: splitting the file into e.g. 64KB fragments and using reliable, ordered messaging to transfer the data.
If you do not need the robustness of reliable messaging, streaming lets you transfer large amounts of data using small message buffers without the overhead of implementing chunking. Streaming over HTTP requires you to use basicHttpBinding; thus you will need SSL to encrypt the transferred data. Buffered transfer, on the other hand, can use wsHttpBinding, which by default provides integrity and confidentiality for your messages; thus there is no need for SSL then.
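As a sketch, turning on WS-ReliableMessaging for the buffered/chunked option is a matter of enabling a reliable, ordered session on the wsHttpBinding; the binding name here is my own placeholder:

```xml
<wsHttpBinding>
  <!-- reliable, ordered messaging for buffered/chunked transfer -->
  <binding name="ChunkedUploadBinding">
    <reliableSession enabled="true" ordered="true" />
  </binding>
</wsHttpBinding>
```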
Read more about 'Large Data and Streaming' at MSDN.
So I implemented my upload prototype following the 'How to: Enable Streaming' guidelines, using a VS2005 web application (Cassini) as the service host. I made my operation a OneWay=true void operation with some SOAP headers to provide meta-data:
[OperationContract(Action = "UploadFile", IsOneWay = true)]
void UploadFile(ServiceContracts.FileUploadMessage request);
[MessageContract]
public class FileUploadMessage
{
    [MessageHeader(MustUnderstand = true)]
    public DataContracts.DnvsDnvxSession DnvxSession;

    [MessageHeader(MustUnderstand = true)]
    public DataContracts.EApprovalContext Context;

    [MessageHeader(MustUnderstand = true)]
    public DataContracts.FileMetaData FileMetaData;

    [MessageBodyMember(Order = 1)]
    public System.IO.Stream FileByteStream;
}
The reason for using message headers for meta-data is that WCF requires the stream object to be the only item in the message body of a streamed operation. Headers are the recommended way of sending meta-data when streaming. Note that headers are always sent before the body; thus you can depend on receiving the header data before processing the streamed data.
Note that you should provide a separate endpoint (address, binding, contract) for your streamed services. The main reason for this is that binding configuration such as transferMode = "Streamed" applies to all operations on the endpoint. The same goes for maxBufferSize, maxReceivedMessageSize, receiveTimeout, etc.
I use MTOM as the encoding format for the streamed data and set the max file size to 64MB. I also set the buffer size to 64KB even though I read the input stream in 4KB chunks, to avoid receive buffer underruns. My binding config looks like this:
<!-- buffer: 64KB; max size: 64MB -->
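Based on the settings described above (MTOM encoding, streamed transfer, 64KB buffer, 64MB max message size, no transport security), the binding config would look roughly like this; the binding name is my own placeholder:

```xml
<basicHttpBinding>
  <binding name="FileTransferBinding"
           transferMode="Streamed"
           messageEncoding="Mtom"
           maxBufferSize="65536"
           maxReceivedMessageSize="67108864">
    <security mode="None" />
  </binding>
</basicHttpBinding>
```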
I have set the transport security to none as I am testing the prototype without SSL. I have not modified the service behavior; the default authentication (credentials) and authorization behaviors of the binding are used.
Note that the transferMode setting does not propagate to clients when using an HTTP binding. You must manually edit the client config file to set transferMode = "Streamed" after using 'Add service reference' or running SVCUTIL.EXE. If you forget to do this, the transfer mode will be "Buffered" and you will get an error like this:
System.ServiceModel.CommunicationException: An error occurred while receiving the HTTP response to http://localhost:1508/Host/FileTransferService.svc. This could be due to the service endpoint binding not using the HTTP protocol. This could also be due to an HTTP request context being aborted by the server (possibly due to the service shutting down). See server logs for more details.
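For reference, the corrected client-side binding would look something like this; the binding name is whatever the generated proxy uses, and transferMode is the attribute you have to add by hand:

```xml
<basicHttpBinding>
  <binding name="BasicHttpBinding_IFileTransferService"
           transferMode="Streamed"
           messageEncoding="Mtom"
           maxReceivedMessageSize="67108864" />
</basicHttpBinding>
```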
After ensuring that the client basicHttpBinding config was correct, I ran the unit test once more. Now I got this error from VS2005 Cassini:
System.ServiceModel.ProtocolException: The remote server returned an unexpected response: (400) Bad Request.
System.Net.WebException: The remote server returned an error: (400) Bad Request.
I have inspected the HTTP traffic using Fiddler and can see nothing wrong with the MTOM-encoded request. Googling led me to a lot of frustrated "streamed" entries in the forums.microsoft.com "Indigo" group, but no solution to my problem. So I made a self-hosted service using a console application, using the code provided in the WCF samples (see below), and it worked like a charm. I suspect that VS2005 Cassini simply does not support streamed transfers/MTOM encoding.
I then installed IIS to check whether HTTP streaming works better with IIS. The first error I ran into when uploading my 1KB test file was this:
System.ServiceModel.CommunicationException: The underlying connection was closed: The connection was closed unexpectedly.
System.Net.WebException: The underlying connection was closed: The connection was closed unexpectedly.
Inspecting the traffic with Fiddler, I found the HTTP request to be fine, but there was no HTTP response. Knowing that this is a sure sign of a server-side exception, I debugged the service operation. The error was caused by the IIS application pool identity (the ASPNET worker process) lacking the NTFS rights to store the uploaded file on disk. Note that WCF by default will not impersonate the caller; thus the identity used to run your service needs sufficient permissions on all resources that your service accesses. This includes the temp folder, which is used to provide the WSDL file for HTTP GET.
As the UploadFile operation is marked [OperationContract(IsOneWay = true)], it can have neither a response nor a fault contract. Thus, there will be no HTTP response when a server-side exception occurs, just the famous "The underlying connection was closed" message.
I assigned the correct NTFS rights to my upload folder and re-ran the unit test: streamed upload to a WCF service hosted by IIS worked. The next unit test used a 43MB file to check if uploading data larger than the WCF buffer size works. The test gave me this error:
System.ServiceModel.CommunicationException: The socket connection was aborted. This could be caused by an error processing your message or a receive timeout being exceeded by the remote host, or an underlying network resource issue. Local socket timeout was '00:00:59.8590000'.System.IO.IOException: Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.
System.Net.Sockets.SocketException: An existing connection was forcibly closed by the remote host
I was a bit puzzled, as the file was smaller than the WCF maxReceivedMessageSize limit of 64MB, and the stream read buffer was only 4KB compared to the WCF maxBufferSize of 64KB. IIS/ASP.NET, however, enforces its own limit on the size of an HTTP request to prevent denial-of-service attacks. This limit is unrelated to WCF and applies even though I do not use the AspNetCompatibilityRequirements service setting. I decided to set the maxRequestLength limit to 64MB as well (default: 4096KB). This time the "stream large file upload" test passed!
So for WCF streaming to IIS to work properly, you need this <system.web> config in addition to the <system.serviceModel> config in the IIS web.config file:
<!-- maxRequestLength (in KB) max size: 2048MB -->
Make the maximum HTTP request size equal to the WCF maximum request size.
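Concretely, a sketch of that <system.web> addition, with the 64MB limit expressed in KB as maxRequestLength requires (65536 KB = 64MB):

```xml
<system.web>
  <!-- match the WCF maxReceivedMessageSize of 64MB -->
  <httpRuntime maxRequestLength="65536" />
</system.web>
```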
I have to recommend the 'Service Trace Viewer' (SvcTraceViewer.exe) and the 'Service Configuration Editor' (SvcConfigEditor.exe) provided in the WCF toolbox (download the .NET 3.0 SDK). These tools make it really simple to add tracing and logging to your service for debugging and diagnosing errors. Start the WCF config editor, open your WCF config file, select the 'Diagnostics' node and just click "Enable tracing" in the "Tracing" section to turn on tracing. The figure shows some typical settings (click to enlarge):
Save the config file and run the unit test to invoke the service. This will generate the web_tracelog.svclog file to use as input for the trace viewer. Open the trace file in the trace viewer and look for the errors (red text). Click an error to see the exception details. The figure shows some of the details available for the "Maximum request length exceeded" error (click to enlarge):
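Under the hood, the config editor emits something along these lines; the listener name and switch level here are typical values, not necessarily the exact ones shown in the figure:

```xml
<system.diagnostics>
  <sources>
    <source name="System.ServiceModel"
            switchValue="Warning, ActivityTracing"
            propagateActivity="true">
      <listeners>
        <add name="xmlTraceListener"
             type="System.Diagnostics.XmlWriterTraceListener"
             initializeData="web_tracelog.svclog" />
      </listeners>
    </source>
  </sources>
</system.diagnostics>
```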
The WCF tools installed by the SDK are located here:
C:\Program Files\Microsoft SDKs\Windows\v6.0\Bin\
The code to process the incoming stream is rather simple, storing the uploaded files to a single folder on disk and overwriting any existing files:
public void UploadFile(FileUploadMessage request)
{
    FileStream targetStream = null;
    Stream sourceStream = request.FileByteStream;

    string uploadFolder = @"C:\work\temp\";
    string filename = request.FileMetaData.Filename;
    string filePath = Path.Combine(uploadFolder, filename);

    using (targetStream = new FileStream(filePath, FileMode.Create,
        FileAccess.Write, FileShare.None))
    {
        //read from the input stream in 4K chunks
        //and save to the output stream
        const int bufferLen = 4096;
        byte[] buffer = new byte[bufferLen];
        int count = 0;
        while ((count = sourceStream.Read(buffer, 0, bufferLen)) > 0)
        {
            targetStream.Write(buffer, 0, count);
        }
        targetStream.Close();
        sourceStream.Close();
    }
}
The 'Stream Sample' available on MSDN contains all the code you need to upload a file as a stream to a self-hosted WCF service and then save it to disk on the server by reading the stream in 4KB chunks. The download contains about 150 solutions with more than 4800 files, so there is a lot of stuff in it. Absolutely worth a look. The streaming sample is located here:
<unzip folder> \TechnologySamples\Basic\Contract\Service\Stream
Please let me know if you have been able to get HTTP streaming with MTOM encoding to work with Microsoft VS2005 Cassini.
Finally, a useful tip for streaming downloads: how to release streams and files using Dispose() on the message contract.