[onert] Improve model type inference with filesystem path handling #16258
glistening merged 1 commit into Samsung:master
Conversation
```cpp
if (!file_path.has_extension())
  return "";

return file_path.extension().string().substr(1);
```
The `loadModel()` function checks the model type in a case-sensitive way, but file name extensions are in most cases treated case-insensitively, e.g. photo.jpg vs. photo.JPG, or config.yaml vs. config.YAML - such files are recognized as JPEG and YAML by e.g. VSCode. How about converting the extension to lowercase here?
I think this is a question about the loader rather than a review of this PR, because this PR does not change the loader's current case-handling policy.
The loader supports the tflite, circle, and tvn types. circle and tvn files are created by our frontend, and our frontend creates them with lowercase extensions. I have never seen a tflite file with an uppercase extension.
If we need to handle uppercase extensions later, we can implement that.
> We are supporting tflite, circle, and tvn type on loader. circle and tvn files are created by our frontend, and our frontend creates file as lowercase.
If you expect that the extensions are always lowercase, that's fine. I've just spotted that user input (the file name) is used down the line without any sanitization. In most cases that's bad practice (sometimes leading to security issues), so I thought that since you are changing the code here, you could add some sort of "sanitization" - converting the file extension to a model type tag (which in the code base is lowercase). :)
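A minimal sketch of such sanitization, assuming a free-standing helper (the actual name and placement in onert may differ): drop the leading dot, then lowercase the extension so it can serve directly as a model type tag.

```cpp
#include <algorithm>
#include <cctype>
#include <filesystem>
#include <string>

// Sketch: infer a lowercase model type tag from a file path, so that
// "MODEL.TFLITE" and "model.tflite" both map to "tflite".
std::string inferModelType(const std::filesystem::path &file_path)
{
  std::string ext = file_path.extension().string();
  if (ext.length() <= 1) // "" or "." -> no usable extension
    return "";
  ext.erase(0, 1); // drop the leading '.'
  std::transform(ext.begin(), ext.end(), ext.begin(),
                 [](unsigned char c) { return std::tolower(c); });
  return ext;
}
```

With this shape, the case-handling policy lives in one place and callers compare only against the lowercase tags already used in the code base.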
```cpp
if (!file_path.has_extension())
  return "";

return file_path.extension().string().substr(1);
```
Beyond this PR:
When `hello.` is provided, `std::out_of_range` will be thrown.
I missed this point in my previous PR.
It can be handled in this PR, or it can be done in the next PR (including https://github.com/Samsung/ONE/pull/16258/files#r2476759843).
```cpp
std::string inferModelType(const std::filesystem::path &file_path)
{
  std::string ext = file_path.extension().string();
  if (ext.empty() || ext.length() <= 1)
    return "";
  return ext.substr(1);
}
```
> When `hello.` is provided, `std::out_of_range` will be thrown.
Because `file_path.extension().string()` is `"."`, `inferModelType` will not throw a `std::out_of_range` exception.
I know the extension is `"."`. My (and the AI coding assistant's) concern was `substr`. I searched the reference and found that `substr(pos)` returns `""` if `pos` equals the string length.
Force-pushed from eb66c3a to 219bdd2
This commit updates inferModelType function to accept filesystem::path directly and handle extension in a case-insensitive way. It updates load_model_from_path function to use inferModelType.

ONE-DCO-1.0-Signed-off-by: Hyeongseok Oh <hseok82.oh@samsung.com>
Force-pushed from 219bdd2 to 8901c4e
@arkq @glistening I've rebased and updated to handle the extension in a case-insensitive way. PTAL.

I'm not able to resolve my previous comment nor submit an approval (not even a "gray" one), so I will post my +1 here :)

@seanshpark could you please take a look? Since @arkq is one of our team members, can we add him to this org? 😄
```cpp
{
  std::cerr << "Error: Cannot determine model type for '" << filename << "'."
            << "Please use a file with valid extension." << std::endl;
  return NNFW_STATUS_ERROR;
}
else
```
(optional) We may remove this repeating error handling by:
- considering `""` as unknown
- early exiting in `loadModelFile` when `model_type == ""`
@dahlinPL @seanshpark resigned from the company last week.
That's very unexpected news! In that case, who could add @arkq to the org?