
Onnx with single float output throws exception during MLContext predict #7225

@xqiu

Description


System Information (please complete the following information):

  • OS & Version: [Windows 11]
  • ML.NET Version: [ML.NET v3.0.1]
  • .NET Version: [.NET 8.0]
  • Microsoft.ML.OnnxRuntime Version: [1.1.9.1]
  • Microsoft.ML.OnnxTransformer Version: [1.1.9.1]

Describe the bug
I have an ONNX model with a single float32 output. When I use the class below to map the output and build a pipeline with mlContext.Transforms.ApplyOnnxModel followed by CreatePredictionEngine, the Predict call throws System.ArgumentNullException: Value cannot be null. (Parameter 'source') at Microsoft.ML.Transforms.Onnx.OnnxTransformer.Mapper.<>c__DisplayClass16_0`1.

I can run the same model directly with InferenceSession without any problem.
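For reference, the direct OnnxRuntime path that works looks roughly like this (a minimal sketch; the tensor names and shape match the repro below, everything else is assumed):

```csharp
using System;
using System.Linq;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

public static class DirectInference
{
    // Sketch of running the model via InferenceSession directly.
    // Assumes input "input" (1x1x40x40 float) and a single float output "output".
    public static float PredictWithInferenceSession(string modelPath)
    {
        using var session = new InferenceSession(modelPath);

        var data = new float[1 * 1 * 40 * 40];
        Array.Fill(data, 0.1f); // example values, matching the repro
        var tensor = new DenseTensor<float>(data, new[] { 1, 1, 40, 40 });

        var inputs = new[] { NamedOnnxValue.CreateFromTensor("input", tensor) };
        using var results = session.Run(inputs);

        // Even a scalar output comes back as a tensor; read its first element.
        return results.First().AsTensor<float>().First();
    }
}
```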

To Reproduce
Steps to reproduce the behavior:

  1. Clone https://github.com/xqiu/ml_onnx_output_error
  2. Run the program to observe the exception

Code is:

    public class ModelInput
    {
        [ColumnName("input")]
        [VectorType(1, 1, 40, 40)]  // Adjust dimensions to match your model input
        public float[] Features { get; set; } = new float[1 * 1 * 40 * 40];  // Flattened 4D array for ONNX input
    }

    public class ModelOutput
    {
        [ColumnName("output")]
        public float Prediction { get; set; }
    }

    public static void PredictWithMLContextSingle(string modelPath)
    {
        Console.WriteLine("\r\nRun PredictWithMLContextSingle:");

        var mlContext = new MLContext();

        // Define the ONNX pipeline
        var pipeline = mlContext.Transforms.ApplyOnnxModel(
            modelFile: modelPath,
            outputColumnNames: new[] { "output" },
            inputColumnNames: new[] { "input" });

        // Prepare input data (replace this with your actual input data)
        var input = new ModelInput();
        for (int i = 0; i < input.Features.Length; i++)
        {
            input.Features[i] = 0.1f; // Example value, replace with your actual data
        }

        // Create a data view from a single-item enumerable
        var dataView = mlContext.Data.LoadFromEnumerable(new[] { input });

        // Fit the pipeline
        var mlModel = pipeline.Fit(dataView);

        // Create the prediction engine
        var predictionEngine = mlContext.Model.CreatePredictionEngine<ModelInput, ModelOutput>(mlModel);

        // Perform prediction -- this line throws the ArgumentNullException
        var output = predictionEngine.Predict(input);

        Console.WriteLine("Using MLContext Prediction result:");
        Console.WriteLine($"Output: {output}");
    }
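A possible workaround (an assumption on my part, not a confirmed fix): OnnxTransformer exposes ONNX outputs as vector columns, so mapping the output as a length-1 float vector instead of a scalar may avoid the exception. The class name below is hypothetical:

```csharp
using Microsoft.ML.Data;

// Hypothetical workaround class: declare the output as a length-1 vector
// rather than a scalar float, since OnnxTransformer produces vector columns.
public class ModelOutputVector
{
    [ColumnName("output")]
    [VectorType(1)]
    public float[] Prediction { get; set; } = new float[1];
}
```

Usage would be `CreatePredictionEngine<ModelInput, ModelOutputVector>(mlModel)` and reading `output.Prediction[0]`.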

Expected behavior
Prediction works with MLContext.

Screenshots, Code, Sample Projects
Sample code in https://github.com/xqiu/ml_onnx_output_error

