TensorFlow engine not found when exported to a runnable jar file #940

Closed
macster110 opened this issue May 5, 2021 · 7 comments
Labels: question (Further information is requested)

Comments


macster110 commented May 5, 2021

Description

We have a large project in which we use DJL to allow users to load acoustic deep learning models. The DJL library is packaged within a single dependency (jdl4pam) generated from a separate project. During development everything worked great; however, when we packaged the project into a jar file, the TensorFlow-based classifiers stopped working, returning "Deep learning engine not found" exceptions. PyTorch models were unaffected.

Expected Behavior

It should make no difference whether the project is run from an IDE (in this case Eclipse) or packaged into a jar file.

Error Message

Exception in thread "main" java.lang.IllegalArgumentException: Deep learning engine not found: TensorFlow
at ai.djl.engine.Engine.getEngine(Engine.java:150)
at ai.djl.Model.newInstance(Model.java:82)
at testjfrog.TestJFrog.main(TestJFrog.java:32)

How to Reproduce?

I've created a minimal project (master branch) which imports a TensorFlow "***.pb" file and loads it. The program accepts a path as an argument so any TensorFlow model can be loaded. The available engines are [TensorFlow, PyTorch] when run from Eclipse. However, if this is exported to a jar and run from PowerShell on Windows, the only available engine is PyTorch and the program throws an exception.
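For context, here is a minimal sketch of the kind of DJL loading code involved (an illustration rather than the test project's exact source; the model name, path handling and the Engine.getAllEngines() print are assumptions):

import java.nio.file.Path;
import java.nio.file.Paths;

import ai.djl.Model;
import ai.djl.engine.Engine;

public class TestJFrog {
    public static void main(String[] args) throws Exception {
        // List the engines that DJL's service loader discovered at runtime.
        // From Eclipse this prints [TensorFlow, PyTorch]; from the exported
        // jar only [PyTorch] shows up.
        System.out.println("Available engines: " + Engine.getAllEngines());

        // Requesting the TensorFlow engine by name is the call that throws
        // "Deep learning engine not found: TensorFlow" when run from the jar.
        Path savedModelDir = Paths.get(args[0]).getParent(); // argument is the path to saved_model.pb
        try (Model model = Model.newInstance("saved_model", "TensorFlow")) {
            model.load(savedModelDir);
        }
    }
}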

Interestingly, if the TensorFlow engine is added before the jdl4pam dependency then the opposite occurs: TensorFlow becomes the only available engine and PyTorch is no longer available.

i.e. this works for TensorFlow models:

<!-- https://mvnrepository.com/artifact/ai.djl.tensorflow/tensorflow-engine-->
	<dependency>
		<groupId>ai.djl.tensorflow</groupId>
		<artifactId>tensorflow-engine</artifactId>
		<version>0.11.0</version>
	</dependency>

	 <dependency>
		<groupId>org.jamdev</groupId>
		<artifactId>jdl4pam</artifactId>
		<version>0.0.87</version>
	</dependency>

but it should work using just the jdl4pam dependency.

Steps to reproduce

  1. Import the minimal example Maven project into Eclipse.
  2. Download the TensorFlow model from here.
  3. Run the project in Eclipse.
  4. Export the project as a .jar file and run it from PowerShell, e.g. java -jar test_jdl4pam.jar "C:\Users\Jamie\Desktop\Right_whales_DG\model_lenet_dropout_input_conv_all\saved_model.pb"
  5. The project will run fine in Eclipse but will throw an exception when run from the jar file. In Eclipse the available engines will be [PyTorch, TensorFlow]; from the jar, only [PyTorch].

What have you tried to solve it?

I updated jdl4pam to the latest version of the DJL library. I played around with the Maven dependencies and found where the issue is (see above) but cannot explain why it occurs.

Environment Info

Windows 10.
Version 10.0.19042 Build 19042
Java 14
(This issue also occurs on macOS.)

macster110 added the "bug (Something isn't working)" label May 5, 2021
@frankfliu (Contributor)

@macster110 You also need to include tensorflow-native-auto as a dependency. See: http://docs.djl.ai/tensorflow/tensorflow-engine/index.html#installation
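
For reference, a dependency block along these lines (the version shown is an assumption; check the installation page above for the native-library version that matches your tensorflow-engine release):

	<dependency>
		<groupId>ai.djl.tensorflow</groupId>
		<artifactId>tensorflow-native-auto</artifactId>
		<!-- assumed version; use the one documented for tensorflow-engine 0.11.0 -->
		<version>2.4.1</version>
		<scope>runtime</scope>
	</dependency>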

macster110 (Author) commented May 5, 2021

> @macster110 You also need to include tensorflow-native-auto as a dependency. See: http://docs.djl.ai/tensorflow/tensorflow-engine/index.html#installation
All the required dependencies are in the jdl4pam library. I only added the tensorflow-engine dependency as an example to show how this issue can be resolved. But this is not an ideal solution, because it should just work when all the required DJL dependencies are bundled with jdl4pam.

I've updated the issue to add a little more clarity.

@frankfliu (Contributor)

@macster110
I cloned your project: git@github.com:macster110/jpam.git, but I'm not able to build it. The dependencies aren't configured properly.

Based on your description, it looks like you are trying to create a fat jar that includes everything.
I noticed you include both the PyTorch and TensorFlow engines, and both jar files contain a file named:

META-INF/services/ai.djl.engine.EngineProvider

When you merge the two jar files into a single fat jar, one of these files gets overridden (depending on your merge policy).
If the PyTorch one is preserved, the TensorFlow one will be lost.

The only workaround is to manually merge this file and add it into your fat jar.
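
One way to see what actually ended up in the fat jar is a small ServiceLoader check (a sketch using only the standard java.util.ServiceLoader and DJL's public EngineProvider interface):

import java.util.ServiceLoader;

import ai.djl.engine.EngineProvider;

public class ListEngineProviders {
    public static void main(String[] args) {
        // DJL finds engines by reading every
        // META-INF/services/ai.djl.engine.EngineProvider resource on the
        // classpath. A fat jar can hold only one file at that path, so
        // whichever copy survived the merge decides which providers
        // (and therefore which engines) are visible here.
        for (EngineProvider provider : ServiceLoader.load(EngineProvider.class)) {
            System.out.println("Found provider: " + provider.getClass().getName());
        }
    }
}

Run from inside the fat jar, this would be expected to list only the PyTorch provider, matching the missing-engine error above.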

@macster110 (Author)

Thanks @frankfliu. I'll check out what's going on with jpam.

Does this mean that there is no easy way to bundle two engines in one jar using djl? If this is the case, what is the recommended way to structure and then export a project using multiple engines and djl?

frankfliu added the "question (Further information is requested)" label and removed the "bug (Something isn't working)" label May 7, 2021
@macster110 (Author)

The Java service loader is a common pattern and there seem to be existing solutions for this. A quick search turned up:

* https://stackoverflow.com/questions/42540485/how-to-stop-maven-shade-plugin-from-blocking-java-util-serviceloader-initializat

* https://stackoverflow.com/questions/47310215/merging-meta-inf-services-files-with-maven-assembly-plugin

* https://maven.apache.org/plugins/maven-shade-plugin/examples/resource-transformers.html#ServicesResourceTransformer

Thanks @zachgk. After @frankfliu's diagnosis of the issue I managed to get the test project working by exporting from Eclipse and selecting "Package required libraries into generated JAR" instead of "Extract required libraries into generated JAR".

The Maven Shade plugin should be able to deal with this, but so far I have been unsuccessful in getting it to work. I will post the solution here when I do.

Thanks again for all the help on this.

@macster110 (Author)

For reference, there are three solutions to this issue.

Export the jar file without extracting libraries
In the Eclipse IDE, right-click on the project -> Export. In the Runnable JAR File Specification dialog, select "Package required libraries into generated JAR" instead of the default "Extract required libraries into generated JAR".
Note that this solution is not ideal for larger projects.

Manually change the META-INF/services/ folder in the jar file

  • Create a jar file.
  • Unzip it using 7-Zip or equivalent.
  • Navigate to the META-INF/services folder.
  • Open the ai.djl.engine.EngineProvider file (e.g. in Sublime).
  • Replace the contents of the file with the desired engines. For example, to run both PyTorch and TensorFlow the text should read:
ai.djl.pytorch.engine.PtEngineProvider
ai.djl.tensorflow.engine.TfEngineProvider
  • Save the file and re-zip the contents of the jar file. Rename the zip file to .jar. Everything should work.

Let Maven do it all for you

Add the Maven Shade plugin with the following settings and the jar will be created properly. Note that you must use Maven to generate the jar, i.e. set the goal to package shade:shade.

	<!-- Maven Shade plugin - for creating the uberjar / fatjar -->
		<!-- see http://maven.apache.org/plugins/maven-shade-plugin/index.html for 
			details -->
		<plugin>
			<groupId>org.apache.maven.plugins</groupId>
			<artifactId>maven-shade-plugin</artifactId>
			<version>3.2.1</version>
			<configuration>
				<transformers>
					<transformer
						implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer" />
				</transformers>
			</configuration>
			<executions>
				<execution>
					<phase>package</phase>
					<goals>
						<goal>shade</goal>
					</goals>
					<configuration>
						<filters>
							<filter>
								<artifact>*:*</artifact>
								<excludes>
								 	<exclude>PamModel/*.*</exclude>  <!--  don't include files in the PamModel folder -->
									<exclude>META-INF/*.SF</exclude> <!-- get rid of manifests from library jars -->
									<exclude>META-INF/*.DSA</exclude>
									<exclude>META-INF/*.RSA</exclude>
								</excludes>
							</filter>
						</filters>
					</configuration>
				</execution>
			</executions>
		</plugin>
