Below is the method I am using to read a stream, and I am getting the error "No header record found." I have checked the stream and it returns the length of the stream. To verify that my stream is valid, I also wrapped it in a StreamReader and called ReadToEnd(), and that gives me all the contents. I can't do ReadToEnd() for large files, though, so I am going with a stream. Below is how I am reading blobs and passing the stream. I am using CsvHelper version 15.
```
public Type DataType
{
    get
    {
        switch (Type.ToUpper())
        {
            case "STRING":
                return typeof(string);
            case "INT":
                return typeof(int);
            case "BOOL":
            case "BOOLEAN":
                return typeof(bool);
            case "FLOAT":
            case "SINGLE":
            case "DOUBLE":
                return typeof(double);
            case "DATETIME":
                return typeof(DateTime);
            default:
                throw new NotSupportedException( $"CSVColumn data type '{Type}' not supported" );
        }
    }
}
```
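The mapped CLR type is later used to convert raw field text (the ReadCSV method below passes it to csv.GetField(col.Type, col.Index)). Outside CsvHelper, the same idea can be sketched with the BCL's Convert.ChangeType; the helper below is hypothetical and only illustrates the conversion step, it is not part of the code above:

```csharp
using System;
using System.Globalization;

static class FieldConversionDemo
{
    // Converts a raw CSV field into the CLR type produced by a
    // DataType-style mapping, using the invariant culture so that
    // "3.5" parses the same way regardless of machine locale.
    public static object ConvertField(string raw, Type target)
    {
        return Convert.ChangeType(raw, target, CultureInfo.InvariantCulture);
    }

    public static void Main()
    {
        Console.WriteLine(FieldConversionDemo.ConvertField("42", typeof(int)));  // 42
        Console.WriteLine(FieldConversionDemo.ConvertField("3.5", typeof(double)).GetType().Name);
    }
}
```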
// This is the method the stream is passed to; the error is thrown at the line csv.ReadHeader()
```
private IEnumerable<Dictionary<string, EntityProperty>> ReadCSV(Stream source, IEnumerable<TableField> cols)
{
    using (TextReader reader = new StreamReader(source, Encoding.UTF8))
    {
        var cache = new TypeConverterCache();
        cache.AddConverter<float>(new CSVSingleConverter());
        cache.AddConverter<double>(new CSVDoubleConverter());
        var csv = new CsvReader(reader,
            new CsvHelper.Configuration.CsvConfiguration(global::System.Globalization.CultureInfo.InvariantCulture)
            {
                Delimiter = ";",
                HasHeaderRecord = true,
                TypeConverterCache = cache
            });
        csv.Read();
        csv.ReadHeader();
        var map = (
            from col in cols
            from src in col.Sources()
            let index = csv.GetFieldIndex(src, isTryGet: true)
            where index != -1
            select new { col.Name, Index = index, Type = col.DataType }).ToList();
        while (csv.Read())
        {
            yield return map.ToDictionary(
                col => col.Name,
                col => EntityProperty.CreateEntityPropertyFromObject(csv.GetField(col.Type, col.Index)));
        }
    }
}
```
// This is the method from where the stream is being returned to ReadCsv() method above
```
public async Task<Stream> ReadStream(string containerName, string digestFileName, string fileName, string connectionString)
{
    string data = string.Empty;
    string fileExtension = Path.GetExtension(fileName);
    var contents = await DownloadBlob(containerName, digestFileName, connectionString);
    return contents;
}

// method where blob is read as stream
public async Task<Stream> DownloadBlob(string containerName, string fileName, string connectionString)
{
    Microsoft.Azure.Storage.CloudStorageAccount storageAccount = Microsoft.Azure.Storage.CloudStorageAccount.Parse(connectionString);
    CloudBlobClient serviceClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer container = serviceClient.GetContainerReference(containerName);
    CloudBlockBlob blob = container.GetBlockBlobReference(fileName);
    if (!blob.Exists())
    {
        throw new Exception($"Unable to upload data in table store for document");
    }
    return await blob.OpenReadAsync();
}
```
Read returns a bool. If that bool is false, meaning there is no more data, you'll get that error when calling ReadHeader because there is no header record. Basically it's an empty file.
I can't think of any other reason this would happen.
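One scenario that matches the symptoms described (the stream reports a length and ReadToEnd() shows the full contents, yet Read() finds nothing) is that the stream was already consumed by an earlier read and never rewound, so the CSV reader starts at the end. A minimal sketch of that failure mode, with a MemoryStream standing in for the blob stream (this is illustrative, not the actual blob code above):

```csharp
using System;
using System.IO;
using System.Text;

static class StreamRewindDemo
{
    // Reads the whole stream once (like a ReadToEnd() validation pass),
    // then tries to read the header line WITHOUT rewinding.
    public static string HeaderWithoutRewind(string csv)
    {
        var source = new MemoryStream(Encoding.UTF8.GetBytes(csv));
        new StreamReader(source, Encoding.UTF8, false, 1024, leaveOpen: true).ReadToEnd();
        // The stream position is now at the end, so there is nothing left.
        return new StreamReader(source, Encoding.UTF8).ReadLine();
    }

    // Same, but seeks back to the start before the second read.
    public static string HeaderWithRewind(string csv)
    {
        var source = new MemoryStream(Encoding.UTF8.GetBytes(csv));
        new StreamReader(source, Encoding.UTF8, false, 1024, leaveOpen: true).ReadToEnd();
        source.Seek(0, SeekOrigin.Begin);
        return new StreamReader(source, Encoding.UTF8).ReadLine();
    }

    public static void Main()
    {
        const string csv = "Name;Age\nAlice;30\n";
        Console.WriteLine(HeaderWithoutRewind(csv) ?? "(null)"); // (null)
        Console.WriteLine(HeaderWithRewind(csv));                // Name;Age
    }
}
```

If the blob stream has been read before reaching ReadCSV, resetting it with source.Seek(0, SeekOrigin.Begin) (or re-opening the blob) before constructing the StreamReader would restore the header.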