Asynchronous Method Invocation (AMI) in C#
Asynchronous Method Invocation (AMI) is the term used to describe the client-side support for the asynchronous programming model. AMI supports both oneway and twoway requests, but unlike their synchronous counterparts, AMI requests never block the calling thread. When a client issues an AMI request, the Ice runtime hands the message off to the local transport buffer or, if the buffer is currently full, queues the request for later delivery. The application can then continue its activities and poll or wait for completion of the invocation, or receive a callback when the invocation completes.
AMI is transparent to the server: there is no way for the server to tell whether a client sent a request synchronously or asynchronously.
In a modern C# application, you should always use AMI. The synchronous API is provided for backwards compatibility.
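For example, a client can start an invocation, continue with other activities, and await the result only when it actually needs it. A minimal sketch, using the EmployeesPrx proxy generated in the next section and a hypothetical DoOtherWork method:
EmployeesPrx e = ...;

// Start the invocation; this call does not block.
Task<string> nameTask = e.GetNameAsync(99);

// Continue with other work while the request is in flight.
DoOtherWork();

// Pick up the result when it is needed.
string name = await nameTask;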
Asynchronous API
Consider the following Slice definition:
module Demo
{
    interface Employees
    {
        ["cs:identifier:GetName"]
        string getName(int number);
    }
}
slice2cs generates the following asynchronous proxy method:
public partial interface EmployeesPrx : Ice.ObjectPrx
{
    Task<string> GetNameAsync(
        int number,
        Dictionary<string, string>? context = null,
        IProgress<bool>? progress = null,
        CancellationToken cancel = default);
    ...
}
As you can see, the getName operation generates a GetNameAsync method that accepts several optional parameters:
a request context
a sent callback
a cancellation token
The GetNameAsync method sends (or queues) an invocation of getName. This method does not block the calling thread. It returns a Task<string> that you typically await. Here's an example that calls GetNameAsync:
EmployeesPrx e = ...;
string name = await e.GetNameAsync(99);
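You can also pass a per-invocation request context, just as with the synchronous mapping. A sketch, using a hypothetical context entry:
EmployeesPrx e = ...;

// Hypothetical context entry; Ice transmits the context with the request.
var ctx = new Dictionary<string, string> { ["requestId"] = "12345" };

string name = await e.GetNameAsync(99, context: ctx);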
Asynchronous Exception Semantics
If an invocation throws an exception, the exception can be obtained from the task.
The exception is provided by the task, even if the actual error condition for the exception was encountered during the call to the Async method ("on the way out"). The advantage of this behavior is that all exception handling is located with the code that handles the task (instead of being present twice, once where the Async method is called, and again where the task is handled).
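For example, a sketch that places all error handling in a single try/catch block around the await:
EmployeesPrx e = ...;
try
{
    string name = await e.GetNameAsync(99);
    Console.WriteLine(name);
}
catch (Ice.LocalException ex)
{
    // Any run-time error, whether it occurred while sending the request
    // or while waiting for the reply, is reported here.
    Console.Error.WriteLine(ex);
}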
There are two exceptions to this rule:
if you destroy the communicator and then make an asynchronous invocation, the Async method throws CommunicatorDestroyedException directly.
a call to an Async method can throw TwowayOnlyException. An Async method throws this exception if you call an operation that has a return value or out-parameters on a oneway proxy.
This behavior is provided for consistency with other Ice language mappings. In modern C#, it is preferable to report synchronous exceptions (such as marshaling exceptions) synchronously.
Asynchronous Oneway Invocations
You can invoke operations via oneway proxies asynchronously, provided the operation has void return type, does not have any out-parameters, and does not throw user exceptions. If you call an asynchronous method on a oneway proxy for an operation that returns values or throws a user exception, the proxy method throws TwowayOnlyException.
The task returned for a oneway invocation completes as soon as the request is successfully written to the client-side transport. The task completes with an exception if an error occurs before the request is successfully written.
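For example, a sketch that awaits a oneway invocation, assuming a hypothetical Logger interface with a void log operation (mapped to LogAsync) and a proxy that is already configured as oneway:
LoggerPrx logger = ...; // oneway proxy obtained elsewhere

// The task completes as soon as the request has been written to the
// client-side transport, not when the server executes the operation.
await logger.LogAsync("backup completed");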
Flow Control
Asynchronous method invocations never block the thread that calls the asynchronous proxy method. The Ice runtime checks to see whether it can write the request to the local transport. If it can, it does so immediately in the caller's thread. Alternatively, if the local transport does not have sufficient buffer space to accept the request, the Ice runtime queues the request internally for later transmission in the background.
This creates a potential problem: if a client sends many asynchronous requests at a time when the server is too busy to keep up with them, the requests pile up in the client-side runtime until, eventually, the client runs out of memory.
The API provides a way for you to implement flow control by counting the number of queued requests: if that number exceeds some threshold, the client stops invoking more operations until some of the queued operations have drained out of the local transport. One of the optional arguments to every asynchronous proxy invocation is a System.IProgress<bool>. If you supply a progress callback, the Ice runtime eventually invokes its Report method when the request has been sent, passing a boolean argument that indicates whether the request was sent synchronously. This argument is true if the entire request could be transferred to the local transport in the caller's thread without blocking; otherwise it is false. Furthermore, a value of true indicates that Ice is invoking your callback recursively from the calling thread, whereas a value of false indicates that Ice is invoking the callback from an Ice thread pool thread.
Here's a simple example to demonstrate the flow control feature:
ExamplePrx proxy = ...;

// SentCallback (defined below) implements IProgress<bool>; Ice calls its
// Report method once the request has been sent.
proxy.DoSomethingAsync(progress: new SentCallback());

class SentCallback : IProgress<bool>
{
    public void Report(bool sentSynchronously)
    {
        if (sentSynchronously)
        {
            // Entire request was accepted by the transport;
            // Report was called recursively from the calling thread.
        }
        else
        {
            // Request was queued but has now been sent;
            // Report was called from an Ice thread pool thread.
        }
    }
}
Using this feature, you can limit the number of queued requests by incrementing a counter each time you start an invocation and decrementing it when the Ice runtime notifies you that the request has been passed to the local transport.
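A minimal sketch of such a counter, with a hypothetical threshold (a production implementation would also decrement the count when an invocation fails before it is sent):
ExamplePrx proxy = ...;
var flowControl = new FlowControl();

for (int i = 0; i < 10_000; ++i)
{
    if (flowControl.TryAcquire())
    {
        // Start the invocation without awaiting it; Report decrements
        // the counter once the request reaches the transport.
        _ = proxy.DoSomethingAsync(progress: flowControl);
    }
    else
    {
        // Too many requests are still queued; back off and retry later.
    }
}

class FlowControl : IProgress<bool>
{
    private const int MaxQueued = 100; // hypothetical threshold
    private int _pending;

    // Returns true if another request may be started.
    public bool TryAcquire()
    {
        if (Interlocked.Increment(ref _pending) > MaxQueued)
        {
            Interlocked.Decrement(ref _pending);
            return false;
        }
        return true;
    }

    // Ice calls Report once the request has been passed to the local transport.
    public void Report(bool sentSynchronously) => Interlocked.Decrement(ref _pending);
}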
Canceling an Asynchronous Invocation
Every asynchronous proxy method accepts an optional CancellationToken argument. Its default value is default, which is equivalent to passing CancellationToken.None.
Cancellation prevents a queued invocation from being sent or, if the invocation has already been sent, ignores a reply if the server sends one. Cancellation is a local operation and has no effect on the server. The result of a canceled invocation is an Ice.InvocationCanceledException.
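For example, a sketch that cancels the invocation if it has not completed within a hypothetical five-second deadline:
EmployeesPrx e = ...;
using var cts = new CancellationTokenSource(TimeSpan.FromSeconds(5));
try
{
    string name = await e.GetNameAsync(99, cancel: cts.Token);
}
catch (Ice.InvocationCanceledException)
{
    // The invocation was canceled locally; the server may still have
    // executed the operation.
}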