
Is it possible to load a TF model from a memory buffer? #10

Closed
zjd1988 opened this issue Oct 27, 2023 · 4 comments

Comments

@zjd1988

zjd1988 commented Oct 27, 2023

Wonderful job! But I also want to load a model from memory. I googled it but could not find any useful info. Do you have a solution?

@lreiher
Member

lreiher commented Oct 27, 2023

I guess you need to provide a little more info on what exactly you intend to do. In what form do you have your model in memory?

@zjd1988
Author

zjd1988 commented Oct 30, 2023

Hi @lreiher, thanks for your reply. Here is a C++ demo of what I mean:

#include <cstdint>
#include <fstream>
#include <iostream>
#include <string>
#include <vector>

// Return the file size in bytes, or -1 on error.
std::streamsize get_file_size(const std::string& file_name)
{
	std::ifstream in(file_name, std::ios::binary);
	if (!in.is_open()) {
		std::cout << "fail to open file\n";
		return -1;
	}

	std::streampos begin = in.tellg();
	in.seekg(0, std::ios::end);
	std::streampos end = in.tellg();

	std::cout << "this file's size is: " << (end - begin) << " bytes.\n";

	return end - begin;
}

int main()
{
	std::string model_file = "savedmodel.pb";
	std::streamsize file_size = get_file_size(model_file);
	if (file_size < 0)
		return -1;

	std::vector<uint8_t> model_data(file_size);
	std::ifstream fin(model_file, std::ios::binary);
	if (!fin) {
		std::cerr << "error: open file for input failed!" << std::endl;
		return -1;
	}
	fin.read(reinterpret_cast<char*>(model_data.data()), file_size);
	fin.close();

	// Proposed API, not yet available in tensorflow_cpp:
	tensorflow_cpp::Model model;
	model.loadModelFromData(model_data.data(), file_size);
	return 0;
}
  

Is it possible to implement a function like loadModelFromData to load the model?

@lreiher
Member

lreiher commented Oct 30, 2023

Kind of depends on the C++ APIs provided by TensorFlow. For example, the actual TensorFlow API call for loading a saved model is called here.
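For what it's worth, if the buffer holds a frozen graph (a single GraphDef `.pb`) rather than a SavedModel, TensorFlow's C++ API can already deserialize it from memory, because `GraphDef` is a protobuf message. A rough sketch under that assumption (the function name `loadGraphFromBuffer` is made up for illustration; this does not cover SavedModels, which span multiple files):

```cpp
#include <cstdint>
#include <memory>
#include <vector>

#include "tensorflow/core/framework/graph.pb.h"
#include "tensorflow/core/lib/core/errors.h"
#include "tensorflow/core/public/session.h"

// Sketch: create a session from a frozen GraphDef held in memory.
// ParseFromArray is the standard protobuf deserialization call.
tensorflow::Status loadGraphFromBuffer(
    const std::vector<uint8_t>& buffer,
    std::unique_ptr<tensorflow::Session>& session) {
  tensorflow::GraphDef graph_def;
  if (!graph_def.ParseFromArray(buffer.data(),
                                static_cast<int>(buffer.size())))
    return tensorflow::errors::InvalidArgument(
        "failed to parse GraphDef from buffer");
  session.reset(tensorflow::NewSession(tensorflow::SessionOptions()));
  return session->Create(graph_def);
}
```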

If you find a suitable API for the desired behavior, I would also be happy to look over a PR that is exposing this API in this tensorflow_cpp wrapper convenience library.

Alternatively, I guess you could first write the data stream to a file and then load from there?

@zjd1988
Author

zjd1988 commented Nov 28, 2023

Hi @lreiher, I googled it but did not find a suitable solution.

@zjd1988 zjd1988 closed this as completed Nov 28, 2023