C# - Azure Data Lake Store - error while reading from file


I use the FileSystemOperationsExtensions.Open method, which returns a Stream that I can read from. When the service reads big files (~150-300 MB) from this stream, it gets the following exceptions:

    System.IO.IOException: The read operation failed, see inner exception. ---> System.Net.WebException: The request was aborted: The request was canceled.
       at System.Net.ConnectStream.Read(Byte[] buffer, Int32 offset, Int32 size)
       at System.Net.Http.HttpClientHandler.WebExceptionWrapperStream.Read(Byte[] buffer, Int32 offset, Int32 count)

    "ClassName": "System.IO.IOException",
    "Message": "Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host."
       at System.Net.ConnectStream.Read(Byte[] buffer, Int32 offset, Int32 size)
       at System.Net.Http.HttpClientHandler.WebExceptionWrapperStream.Read(Byte[] buffer, Int32 offset, Int32 count)

And it occurs randomly. Also, I create the DataLakeStoreFileSystemManagementClient object with a 60-minute timeout, but these errors occur before it elapses: it may take 3, 10, 20 or however many minutes. Of course, I can re-read the stream from an offset, but that requires extra development time. Perhaps there is a way to avoid these exceptions. Can you help me with it?
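As a stopgap for the "re-read the stream from an offset" idea, here is a minimal sketch of a resumable copy loop. It assumes the source stream can be re-opened at an arbitrary offset (the ADLS SDK's FileSystem.Open accepts optional offset/length arguments, which a caller could wrap in the delegate below); the ResumableCopy name and the openAtOffset callback are hypothetical, not part of the SDK.

```csharp
using System;
using System.IO;

// Hypothetical helper: copies everything from a re-openable source stream to
// 'destination', resuming at the last successfully written offset whenever a
// transient IOException aborts a read.
static class ResumableCopy
{
    public static void Copy(Func<long, Stream> openAtOffset, Stream destination,
                            int maxRetries = 5, int bufferSize = 4 * 1024 * 1024)
    {
        long offset = 0;   // bytes successfully written so far
        int attempt = 0;   // consecutive failed attempts
        var buffer = new byte[bufferSize];

        while (true)
        {
            try
            {
                using (var source = openAtOffset(offset))
                {
                    int read;
                    while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
                    {
                        destination.Write(buffer, 0, read);
                        offset += read;  // remember progress for a resume
                        attempt = 0;     // reset the retry budget after progress
                    }
                }
                return; // end of stream reached
            }
            catch (IOException) when (attempt++ < maxRetries)
            {
                // Transient failure: loop around and re-open at 'offset'.
            }
        }
    }
}
```

A caller would pass something like `off => adlsFileSystemClient.FileSystem.Open(accountName, srcPath, offset: off)` as the first argument; once the retry budget is exhausted, the last IOException propagates to the caller.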

I ran a demo test with a 270 MB+ file 3 times, and it works correctly for me. Please have a try using the following code to test it. You can find more Data Lake Store demo code in Get started with Azure Data Lake Store using the .NET SDK.


Demo code:

    var applicationId = "application id";
    var secretKey = "secretkey";
    var tenantId = "tenant id";
    var adlsAccountName = "account name";
    var creds = ApplicationTokenProvider.LoginSilentAsync(tenantId, applicationId, secretKey).Result;
    var adlsFileSystemClient = new DataLakeStoreFileSystemManagementClient(creds, clientTimeoutInMinutes: 60);
    var srcPath = "/mytempdir/fordemocode.zip";
    var destPath = @"c:\tom\fordemocode1.zip";

    Stopwatch stopwatch = new Stopwatch();
    stopwatch.Start();
    using (var stream = adlsFileSystemClient.FileSystem.Open(adlsAccountName, srcPath))
    using (var fileStream = new FileStream(destPath, FileMode.Create))
    {
        stream.CopyTo(fileStream);
    }
    var file = new FileInfo(destPath);
    Console.WriteLine($"File size: {file.Length}");
    stopwatch.Stop();
    // Get the elapsed time as a TimeSpan value.
    TimeSpan ts = stopwatch.Elapsed;
    // Format and display the TimeSpan value.
    string elapsedTime = $"{ts.Hours:00}:{ts.Minutes:00}:{ts.Seconds:00}.{ts.Milliseconds / 10:00}";
    Console.WriteLine("RunTime " + elapsedTime);
    Console.ReadKey();

packages.config file:

    <?xml version="1.0" encoding="utf-8"?>
    <packages>
      <package id="Microsoft.Azure.Management.DataLake.Store" version="2.1.1-preview" targetFramework="net452" />
      <package id="Microsoft.Azure.Management.DataLake.StoreUploader" version="1.0.0-preview" targetFramework="net452" />
      <package id="Microsoft.IdentityModel.Clients.ActiveDirectory" version="3.13.8" targetFramework="net452" />
      <package id="Microsoft.Rest.ClientRuntime" version="2.3.5" targetFramework="net452" />
      <package id="Microsoft.Rest.ClientRuntime.Azure" version="3.3.5" targetFramework="net452" />
      <package id="Microsoft.Rest.ClientRuntime.Azure.Authentication" version="2.2.0-preview" targetFramework="net452" />
      <package id="Newtonsoft.Json" version="9.0.2-beta1" targetFramework="net452" />
    </packages>
