Below is a piece of code that uses HttpWebRequest to fetch a remote file:
private void GetStreamCallback(IAsyncResult asynchronousResult)
{
    try
    {
        HttpWebRequest request = (HttpWebRequest)asynchronousResult.AsyncState;
        HttpWebResponse response = (HttpWebResponse)request.EndGetResponse(asynchronousResult);
        if (response.StatusCode == HttpStatusCode.OK)
        {
            Stream streamSource = response.GetResponseStream();
            const int STREAM_READ_BUFFER = 65536;
            byte[] btReadBuf = new byte[STREAM_READ_BUFFER];
            int nRead = 0;
            do
            {
                // Read returns 0 once the end of the stream has been reached
                nRead = streamSource.Read(btReadBuf, 0, STREAM_READ_BUFFER);
            } while (nRead > 0);
        }
    }
    catch (Exception)
    {
        NotifyComplete();
    }
}
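For context, this callback is typically kicked off with BeginGetResponse, passing the request itself as the async state so the callback can call EndGetResponse on it. A minimal sketch, assuming a hypothetical StartDownload helper:

private void StartDownload(Uri url)
{
    // Create the request and start the asynchronous fetch; GetStreamCallback
    // above is invoked once the response headers are available.
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    request.BeginGetResponse(new AsyncCallback(GetStreamCallback), request);
}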
This simple code runs without any problem in a normal application, but once it is run inside a BackgroundAgent you will find that, after the loop has executed a few times, the streamSource.Read line easily gets stuck, with no exception thrown. For a long time I assumed this was an unsolvable bug in BackgroundAgent. Some people claimed that adding an appropriate sleep solves the stuck Read, and adding a sleep did indeed fix the hang for me, but there was never an answer that explained why, until I recently came across a discussion thread that finally made the cause clear.
In short, unless the stream's source is a local file and therefore fast enough, you cannot skip the sleep; adding a sleep still seems to be the only workable method for now. The cause, roughly, is that the BackgroundAgent suppresses notifications that are delivered too close together, which stalls the Stream.Read loop. The link and original text are below, followed by a sketch of the workaround…
Unbuffered downloads deadlock when run from a background agent
In background agents, we limit the amount of the stream that we prebuffer because of the memory constraints, so we’ll pre-buffer up to 128KB of data. (2 64KB blocks).
When you issue a read, if the data isn’t available, it will wait until the data is. As data is coming in, we send notifications to say that more data has arrived. We don’t want these notifications to be too chatty, so we won’t fire a notification if the last one was fired too close before. This is not a problem normally when we can prebuffer the whole stream, but in the background agent case, that can mean that we have filled the buffer that we have but the Read that is waiting for the data doesn’t get notified that it can complete because we’ve suppressed that notification. And since the buffer is full, another notification will not come in until we read some data, and that is the root of the issue. If you wait 20ms between the time a read completes and the time you issue the next one, then the notifications shouldn’t get suppressed and the data should keep progressing as you’d expect.
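The workaround, then, is simply to pause briefly between reads so the notifications are not suppressed. Below is a minimal sketch of the same callback with that pause added; the only change from the code at the top of the post is the Thread.Sleep call inside the loop, and the 20 ms figure is taken from the quoted post and may need tuning for your own agent.

private void GetStreamCallback(IAsyncResult asynchronousResult)
{
    try
    {
        HttpWebRequest request = (HttpWebRequest)asynchronousResult.AsyncState;
        HttpWebResponse response = (HttpWebResponse)request.EndGetResponse(asynchronousResult);
        if (response.StatusCode == HttpStatusCode.OK)
        {
            Stream streamSource = response.GetResponseStream();
            const int STREAM_READ_BUFFER = 65536;
            byte[] btReadBuf = new byte[STREAM_READ_BUFFER];
            int nRead = 0;
            do
            {
                nRead = streamSource.Read(btReadBuf, 0, STREAM_READ_BUFFER);
                // Give the agent time to deliver the next "more data arrived"
                // notification before issuing another Read; without this pause
                // the Read can block forever once the 128KB prebuffer is full.
                System.Threading.Thread.Sleep(20);
            } while (nRead > 0);
        }
    }
    catch (Exception)
    {
        NotifyComplete();
    }
}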