## SYNOPSIS

Loads a TensorFlow model into memory.

## DESCRIPTION

Loads a TensorFlow model into memory. This convenience cmdlet allows the model to be loaded once and then reused to query its schema and to create a TensorFlowEstimator.
## SYNTAX

Import-TensorFlowModel [-Path] <String> [-OutputAsBatched] [-Context <MLContext>] [<CommonParameters>]
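A minimal usage sketch. The model path is hypothetical, and the example assumes the module that exports `Import-TensorFlowModel` is already installed and imported:

```powershell
# Load a frozen TensorFlow model once (path is illustrative only).
$model = Import-TensorFlowModel -Path 'C:\models\frozen_graph.pb'

# The returned Microsoft.ML.Transforms.TensorFlowModel can then be reused,
# for example to inspect the model's input/output schema.
$model.GetModelSchema()
```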
## PARAMETERS

### -Path

The location of the TensorFlow model to load.
Type: System.String
Required: True
Position: 0
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
### -OutputAsBatched

If the first dimension of the model's output is unknown, specifies whether it should be treated as batched.
Type: System.Management.Automation.SwitchParameter
Required: False
Position: named
Default value: False
Accept pipeline input: False
Accept wildcard characters: False
### -Context

The context on which to perform the action. If omitted, the current (cached) context will be used.
Type: Microsoft.ML.MLContext
Required: False
Position: named
Default value: Current context
Accept pipeline input: False
Accept wildcard characters: False
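The two optional parameters can be combined as below. This is an illustrative sketch: the path is hypothetical, and it assumes `Microsoft.ML` types are already loadable in the session:

```powershell
# Create an explicit MLContext instead of relying on the cached one.
$ctx = [Microsoft.ML.MLContext]::new()

# Treat an unknown leading output dimension as batched, and run the
# import against the explicitly supplied context.
$model = Import-TensorFlowModel -Path 'C:\models\frozen_graph.pb' -OutputAsBatched -Context $ctx
```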
### CommonParameters

This cmdlet supports the common parameters: Verbose, Debug, ErrorAction, ErrorVariable, WarningAction, WarningVariable, OutBuffer, PipelineVariable, and OutVariable. For more information, see about_CommonParameters.
## INPUTS

Type | Description
---|---
None | This cmdlet does not accept pipeline input.
## OUTPUTS

Type | Description
---|---
Microsoft.ML.Transforms.TensorFlowModel | The loaded TensorFlow model, which can be used to query the model schema or to create a TensorFlowEstimator.