
Project import generated by Copybara.

PiperOrigin-RevId: 264072870
MediaPipe Team authored and chuoling committed Aug 19, 2019
1 parent c27a7c1 commit 71a47bb18b122f8b571fb7018f083dced5893fab
@@ -1 +1,8 @@
mediapipe/provisioning_profile.mobileprovision
+bazel-bin
+bazel-genfiles
+bazel-mediapipe-ioss
+bazel-out
+bazel-testlogs
+mediapipe/MediaPipe.xcodeproj
+mediapipe/MediaPipe.tulsiproj/*.tulsiconf-user
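
The `bazel-*` entries above cover the convenience symlinks that Bazel creates in the workspace root on every build; a minimal illustration (true of any Bazel workspace, nothing specific to this commit):

```bash
# After any bazel build, the workspace root contains symlinks such as
# bazel-bin, bazel-genfiles, bazel-out and bazel-testlogs; listing them
# shows what the new .gitignore entries keep out of version control.
ls -d bazel-*
```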
@@ -0,0 +1,115 @@
{
"additionalFilePaths" : [
"/BUILD",
"mediapipe/BUILD",
"mediapipe/objc/BUILD",
"mediapipe/examples/ios/BUILD",
"mediapipe/examples/ios/edgedetectiongpu/BUILD",
"mediapipe/examples/ios/facedetectioncpu/BUILD",
"mediapipe/examples/ios/facedetectiongpu/BUILD",
"mediapipe/examples/ios/handdetectiongpu/BUILD",
"mediapipe/examples/ios/handtrackinggpu/BUILD",
"mediapipe/examples/ios/objectdetectioncpu/BUILD",
"mediapipe/examples/ios/objectdetectiongpu/BUILD"
],
"buildTargets" : [
"//mediapipe/examples/ios/edgedetectiongpu:EdgeDetectionGpuApp",
"//mediapipe/examples/ios/facedetectioncpu:FaceDetectionCpuApp",
"//mediapipe/examples/ios/facedetectiongpu:FaceDetectionGpuApp",
"//mediapipe/examples/ios/handdetectiongpu:HandDetectionGpuApp",
"//mediapipe/examples/ios/handtrackinggpu:HandTrackingGpuApp",
"//mediapipe/examples/ios/objectdetectioncpu:ObjectDetectionCpuApp",
"//mediapipe/examples/ios/objectdetectiongpu:ObjectDetectionGpuApp",
"//mediapipe/objc:mediapipe_framework_ios"
],
"optionSet" : {
"BazelBuildOptionsDebug" : {
"p" : "$(inherited)"
},
"BazelBuildOptionsRelease" : {
"p" : "$(inherited)"
},
"BazelBuildStartupOptionsDebug" : {
"p" : "$(inherited)"
},
"BazelBuildStartupOptionsRelease" : {
"p" : "$(inherited)"
},
"BuildActionPostActionScript" : {
"p" : "$(inherited)"
},
"BuildActionPreActionScript" : {
"p" : "$(inherited)"
},
"CommandlineArguments" : {
"p" : "$(inherited)"
},
"EnvironmentVariables" : {
"p" : "$(inherited)"
},
"LaunchActionPostActionScript" : {
"p" : "$(inherited)"
},
"LaunchActionPreActionScript" : {
"p" : "$(inherited)"
},
"ProjectGenerationBazelStartupOptions" : {
"p" : "$(inherited)"
},
"TestActionPostActionScript" : {
"p" : "$(inherited)"
},
"TestActionPreActionScript" : {
"p" : "$(inherited)"
}
},
"projectName" : "Mediapipe",
"sourceFilters" : [
"mediapipe",
"mediapipe/calculators",
"mediapipe/calculators/core",
"mediapipe/calculators/image",
"mediapipe/calculators/internal",
"mediapipe/calculators/tflite",
"mediapipe/calculators/util",
"mediapipe/examples",
"mediapipe/examples/ios",
"mediapipe/examples/ios/edgedetectiongpu",
"mediapipe/examples/ios/edgedetectiongpu/Base.lproj",
"mediapipe/examples/ios/facedetectioncpu",
"mediapipe/examples/ios/facedetectioncpu/Base.lproj",
"mediapipe/examples/ios/facedetectiongpu",
"mediapipe/examples/ios/facedetectiongpu/Base.lproj",
"mediapipe/examples/ios/handdetectiongpu",
"mediapipe/examples/ios/handdetectiongpu/Base.lproj",
"mediapipe/examples/ios/handtrackinggpu",
"mediapipe/examples/ios/handtrackinggpu/Base.lproj",
"mediapipe/examples/ios/objectdetectioncpu",
"mediapipe/examples/ios/objectdetectioncpu/Base.lproj",
"mediapipe/examples/ios/objectdetectiongpu",
"mediapipe/examples/ios/objectdetectiongpu/Base.lproj",
"mediapipe/framework",
"mediapipe/framework/deps",
"mediapipe/framework/formats",
"mediapipe/framework/formats/annotation",
"mediapipe/framework/formats/object_detection",
"mediapipe/framework/port",
"mediapipe/framework/profiler",
"mediapipe/framework/stream_handler",
"mediapipe/framework/tool",
"mediapipe/gpu",
"mediapipe/graphs",
"mediapipe/graphs/edge_detection",
"mediapipe/graphs/face_detection",
"mediapipe/graphs/hand_tracking",
"mediapipe/graphs/object_detection",
"mediapipe/models",
"mediapipe/objc",
"mediapipe/util",
"mediapipe/util/android",
"mediapipe/util/android/file",
"mediapipe/util/android/file/base",
"mediapipe/util/tflite",
"mediapipe/util/tflite/operations"
]
}
@@ -0,0 +1,24 @@
{
"configDefaults" : {
"optionSet" : {
"CLANG_CXX_LANGUAGE_STANDARD" : {
"p" : "c++14"
}
}
},
"packages" : [
"",
"mediapipe",
"mediapipe/objc",
"mediapipe/examples/ios",
"mediapipe/examples/ios/edgedetectiongpu",
"mediapipe/examples/ios/facedetectioncpu",
"mediapipe/examples/ios/facedetectiongpu",
"mediapipe/examples/ios/handdetectiongpu",
"mediapipe/examples/ios/handtrackinggpu",
"mediapipe/examples/ios/objectdetectioncpu",
"mediapipe/examples/ios/objectdetectiongpu"
],
"projectName" : "Mediapipe",
"workspaceRoot" : "../.."
}
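
The two JSON files above are the Tulsi project configuration used to generate the `MediaPipe.xcodeproj` that the `.gitignore` hunk excludes. As a hedged sketch of how such a config is typically consumed — the Tulsi checkout location, generator script path, and the `MediaPipe` config name below are assumptions, not specified by this commit — the Xcode project could be generated roughly like this:

```bash
# Hedged sketch: generating the Xcode project from the Tulsi config above.
# Assumes Tulsi (https://github.com/bazelbuild/tulsi) is cloned at ~/tulsi;
# the script path, --genconfig flag, and "MediaPipe" config name may differ
# across Tulsi versions.
cd /path/to/mediapipe            # repo root, where the WORKSPACE file lives
~/tulsi/src/tools/generate_xcodeproj.sh \
    --genconfig mediapipe/MediaPipe.tulsiproj:MediaPipe
# The generated mediapipe/MediaPipe.xcodeproj is disposable, which is why the
# .gitignore hunk above excludes it (and *.tulsiconf-user files).
```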
@@ -30,8 +30,8 @@ detection example.
use MediaPipe with a TFLite model for object detection in a GPU-accelerated
pipeline.

-* [Android](./object_detection_mobile_gpu.md#android)
-* [iOS](./object_detection_mobile_gpu.md#ios)
+* [Android](./object_detection_mobile_gpu.md)
+* [iOS](./object_detection_mobile_gpu.md)

### Object Detection with CPU

@@ -48,24 +48,24 @@ The selfie face detection TFLite model is based on
and model details are described in the
[model card](https://sites.google.com/corp/view/perception-cv4arvr/blazeface#h.p_21ojPZDx3cqq).

-* [Android](./face_detection_mobile_gpu.md#android)
-* [iOS](./face_detection_mobile_gpu.md#ios)
+* [Android](./face_detection_mobile_gpu.md)
+* [iOS](./face_detection_mobile_gpu.md)

### Hand Detection with GPU

[Hand Detection with GPU](./hand_detection_mobile_gpu.md) illustrates how to use
MediaPipe with a TFLite model for hand detection in a GPU-accelerated pipeline.

-* [Android](./hand_detection_mobile_gpu.md#android)
-* [iOS](./hand_detection_mobile_gpu.md#ios)
+* [Android](./hand_detection_mobile_gpu.md)
+* [iOS](./hand_detection_mobile_gpu.md)

### Hand Tracking with GPU

[Hand Tracking with GPU](./hand_tracking_mobile_gpu.md) illustrates how to use
MediaPipe with a TFLite model for hand tracking in a GPU-accelerated pipeline.

-* [Android](./hand_tracking_mobile_gpu.md#android)
-* [iOS](./hand_tracking_mobile_gpu.md#ios)
+* [Android](./hand_tracking_mobile_gpu.md)
+* [iOS](./hand_tracking_mobile_gpu.md)

### Hair Segmentation with GPU

@@ -76,7 +76,7 @@ pipeline. The selfie hair segmentation TFLite model is based on
and model details are described in the
[model card](https://sites.google.com/corp/view/perception-cv4arvr/hair-segmentation#h.p_NimuO7PgHxlY).

-* [Android](./hair_segmentation_mobile_gpu.md#android)
+* [Android](./hair_segmentation_mobile_gpu.md)

## Desktop

@@ -208,8 +208,8 @@ of a MediaPipe graph. In the specification, a node in the graph represents an
instance of a particular calculator. All the necessary configurations of the
node, such as its type, inputs and outputs, must be described in the specification.
Description of the node can also include several optional fields, such as
-node-specific options, input policy and executor, discussed in Section
-[Framework Concepts > Scheduling mechanics](scheduling_sync.md#scheduling-mechanics).
+node-specific options, input policy and executor, discussed in
+[Framework Architecture](scheduling_sync.md).
`GraphConfig` has several other fields to configure the global graph-level
settings, e.g., graph executor configs, number of threads, and maximum queue size
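
To make the node specification concrete, here is a hedged sketch of a single `node` inside a `GraphConfig`, written in MediaPipe's text-proto graph format; the calculator type, stream tags, and option values are illustrative placeholders, not content from this commit:

```
# Hedged sketch of one node in a GraphConfig (text proto); names below
# are illustrative, not taken from this commit.
node {
  calculator: "TfLiteInferenceCalculator"     # the calculator type
  input_stream: "TENSORS:image_tensor"        # inputs ...
  output_stream: "TENSORS:detection_tensors"  # ... and outputs
  options: {                                  # optional node-specific options
    [mediapipe.TfLiteInferenceCalculatorOptions.ext] {
      model_path: "mediapipe/models/some_model.tflite"
    }
  }
}
```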
@@ -58,8 +58,9 @@ The hand detection [main graph](#main-graph) internally utilizes a
[hand detection subgraph](#hand-detection-subgraph). The subgraph shows up in
the main graph visualization as the `HandDetection` node colored in purple, and
the subgraph itself can also be visualized just like a regular graph. For more
-information on how to visualize a graph that includes subgraphs, see
-[visualizing subgraphs](./visualizer.md#visualizing-subgraphs).
+information on how to visualize a graph that includes subgraphs, see the
+Visualizing Subgraphs section in the
+[visualizer documentation](./visualizer.md).

### Main Graph

@@ -22,6 +22,15 @@ performed only within the hand rectangle for computational efficiency and
accuracy, and hand detection is only invoked when landmark localization could
not identify hand presence in the previous iteration.

+The example also comes with an experimental mode that localizes hand landmarks
+in 3D (i.e., estimating an extra z coordinate):
+
+![hand_tracking_3d_android_gpu.gif](images/mobile/hand_tracking_3d_android_gpu.gif)
+
+In the visualization above, the localized hand landmarks are represented by dots
+in different shades, with the brighter ones denoting landmarks closer to the
+camera.
+
## Android

Please see [Hello World! in MediaPipe on Android](hello_world_android.md) for
@@ -35,6 +44,12 @@ To build the app, run:
bazel build -c opt --config=android_arm64 mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu
```

+To build for the experimental mode that localizes hand landmarks in 3D, run:
+
+```bash
+bazel build -c opt --config=android_arm64 --define 3D=true mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu
+```

To further install the app on an Android device, run:

```bash
@@ -56,6 +71,12 @@ Specific to this example, run:
bazel build -c opt --config=ios_arm64 mediapipe/examples/ios/handtrackinggpu:HandTrackingGpuApp
```

+To build for the experimental mode that localizes hand landmarks in 3D, run:
+
+```bash
+bazel build -c opt --config=ios_arm64 --define 3D=true mediapipe/examples/ios/handtrackinggpu:HandTrackingGpuApp
+```

## Graph

The hand tracking [main graph](#main-graph) internally utilizes a
@@ -66,7 +87,8 @@ The hand tracking [main graph](#main-graph) internally utilizes a
The subgraphs show up in the main graph visualization as nodes colored in
purple, and the subgraph itself can also be visualized just like a regular
graph. For more information on how to visualize a graph that includes subgraphs,
-see [visualizing subgraphs](./visualizer.md#visualizing-subgraphs).
+see the Visualizing Subgraphs section in the
+[visualizer documentation](./visualizer.md).

### Main Graph

@@ -20,8 +20,8 @@ stream on an Android device.

1. Install MediaPipe on your system, see [MediaPipe installation guide] for
details.
-2. Install Android Development SDK and Android NDK. See how to do so in
-[Setting up Android SDK and NDK].
+2. Install Android Development SDK and Android NDK; these are also covered in
+the [MediaPipe installation guide].
3. Enable [developer options] on your Android device.
4. Set up [Bazel] on your system to build and deploy the Android app.
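
As a quick, hedged sanity check of the prerequisites above — the commands assume `bazel` and `adb` are on `PATH`, and that `ANDROID_HOME`/`ANDROID_NDK_HOME` were exported when setting up the SDK and NDK (a common convention, not mandated by this tutorial):

```bash
# Hedged sanity check of the setup steps above.
echo "$ANDROID_HOME" "$ANDROID_NDK_HOME"  # step 2: SDK/NDK locations, if exported
adb devices                               # step 3: device listed => USB debugging on
bazel version                             # step 4: Bazel installed and on PATH
```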

@@ -728,7 +728,6 @@ If you ran into any issues, please see the full code of the tutorial
[`FrameProcessor`]:https://github.com/google/mediapipe/tree/master/mediapipe/java/com/google/mediapipe/components/FrameProcessor.java
[MediaPipe installation guide]:./install.md
[`PermissionHelper`]: https://github.com/google/mediapipe/tree/master/mediapipe/java/com/google/mediapipe/components/PermissionHelper.java
-[Setting up Android SDK and NDK]:./install.md#setting-up-android-sdk-and-ndk
[`SurfaceHolder.Callback`]:https://developer.android.com/reference/android/view/SurfaceHolder.Callback.html
[`SurfaceView`]:https://developer.android.com/reference/android/view/SurfaceView
@@ -183,6 +183,8 @@ bazel build -c opt --config=ios_arm64 mediapipe/examples/ios/edgedetectiongpu:Ed

Then, go back to Xcode, open Window > Devices and Simulators, select your
device, and add the `.ipa` file generated by the command above to your device.
+See the documentation on [setting up and compiling](./mediapipe_ios_setup.md)
+iOS MediaPipe apps for more details.

Open the application on your device. Since it is empty, it should display a
blank white screen.
Binary file not shown.
