A sample application that uses the Jetpack CameraX library. The app takes a phrase as input from the user, then uses CameraX together with ML Kit Text Recognition to preview the camera feed, analyze each image buffer for the phrase, and capture a photo once the phrase has been detected.
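Below is a minimal sketch of what the analysis step can look like, assuming the Firebase ML Kit on-device text recognizer and a CameraX `ImageAnalysis.Analyzer`. The class and callback names (`PhraseAnalyzer`, `onPhraseFound`) are illustrative, not the app's actual code.

```kotlin
import android.annotation.SuppressLint
import androidx.camera.core.ImageAnalysis
import androidx.camera.core.ImageProxy
import com.google.firebase.ml.vision.FirebaseVision
import com.google.firebase.ml.vision.common.FirebaseVisionImage
import com.google.firebase.ml.vision.common.FirebaseVisionImageMetadata

// Analyzer that scans each camera frame for the target phrase and invokes a
// callback once it is found (e.g. to trigger ImageCapture.takePicture()).
class PhraseAnalyzer(
    private val phrase: String,
    private val onPhraseFound: () -> Unit
) : ImageAnalysis.Analyzer {

    private var busy = false // skip frames while a recognition task is in flight

    @SuppressLint("UnsafeOptInUsageError")
    override fun analyze(imageProxy: ImageProxy) {
        val mediaImage = imageProxy.image
        if (mediaImage == null || busy) {
            imageProxy.close()
            return
        }
        busy = true
        val rotation = degreesToFirebaseRotation(imageProxy.imageInfo.rotationDegrees)
        val visionImage = FirebaseVisionImage.fromMediaImage(mediaImage, rotation)

        FirebaseVision.getInstance().onDeviceTextRecognizer
            .processImage(visionImage)
            .addOnSuccessListener { result ->
                // FirebaseVisionText.text is the full recognized text of the frame.
                if (result.text.contains(phrase, ignoreCase = true)) {
                    onPhraseFound()
                }
            }
            .addOnCompleteListener {
                busy = false
                imageProxy.close() // release the buffer so CameraX can deliver the next frame
            }
    }

    private fun degreesToFirebaseRotation(degrees: Int): Int = when (degrees) {
        0 -> FirebaseVisionImageMetadata.ROTATION_0
        90 -> FirebaseVisionImageMetadata.ROTATION_90
        180 -> FirebaseVisionImageMetadata.ROTATION_180
        270 -> FirebaseVisionImageMetadata.ROTATION_270
        else -> throw IllegalArgumentException("Unexpected rotation: $degrees")
    }
}
```

The analyzer would be attached to an `ImageAnalysis` use case via `setAnalyzer(...)` alongside the `Preview` and `ImageCapture` use cases when binding to the camera lifecycle.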
To build this locally, create a Firebase project, download its google-services.json file, add it to the project (see the layout sketch below), and then build.
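Assuming the standard single-module Android project layout (the module name here is illustrative), the file goes in the app module directory:

```
<your-checkout>/
└── app/
    ├── build.gradle
    └── google-services.json   # downloaded from the Firebase console
```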