Can you share the MPPIrisTracker.mm file? #1

Closed
kiranscaria opened this issue Jun 7, 2021 · 7 comments
Labels: question (Further information is requested)

kiranscaria commented Jun 7, 2021

Can you share the MPPIrisTracker.mm file?

@61315 61315 self-assigned this Jun 7, 2021
61315 (Owner) commented Jun 7, 2021

Sure thing.

#import "MPPIrisTracker.h"
#import "mediapipe/objc/MPPGraph.h"
#import "mediapipe/objc/MPPCameraInputSource.h"
#import "mediapipe/objc/MPPLayerRenderer.h"
#include "mediapipe/framework/formats/landmark.pb.h"

static NSString* const kGraphName = @"iris_tracking_gpu";

static const char* kInputStream = "input_video";
static const char* kOutputStream = "output_video";

static const char* kLandmarksOutputStream = "iris_landmarks";
static const char* kVideoQueueLabel = "com.google.mediapipe.example.videoQueue";


/// Input side packet for focal length parameter.
std::map<std::string, mediapipe::Packet> _input_side_packets;
mediapipe::Packet _focal_length_side_packet;

@interface MPPIrisTracker() <MPPGraphDelegate>
@property(nonatomic) MPPGraph* mediapipeGraph;
@end


@implementation MPPIrisTracker { }

#pragma mark - Cleanup methods

- (void)dealloc {
    self.mediapipeGraph.delegate = nil;
    [self.mediapipeGraph cancel];
    // Ignore errors since we're cleaning up.
    [self.mediapipeGraph closeAllInputStreamsWithError:nil];
    [self.mediapipeGraph waitUntilDoneWithError:nil];
}

#pragma mark - MediaPipe graph methods
// https://google.github.io/mediapipe/getting_started/hello_world_ios.html#using-a-mediapipe-graph-in-ios

+ (MPPGraph*)loadGraphFromResource:(NSString*)resource {
    // Load the graph config resource.
    NSError* configLoadError = nil;
    NSBundle* bundle = [NSBundle bundleForClass:[self class]];
    if (!resource || resource.length == 0) {
        return nil;
    }
    NSURL* graphURL = [bundle URLForResource:resource withExtension:@"binarypb"];
    NSData* data = [NSData dataWithContentsOfURL:graphURL options:0 error:&configLoadError];
    if (!data) {
        NSLog(@"Failed to load MediaPipe graph config: %@", configLoadError);
        return nil;
    }
    
    // Parse the graph config resource into mediapipe::CalculatorGraphConfig proto object.
    mediapipe::CalculatorGraphConfig config;
    config.ParseFromArray(data.bytes, data.length);
    
    // Create MediaPipe graph with mediapipe::CalculatorGraphConfig proto object.
    MPPGraph* newGraph = [[MPPGraph alloc] initWithGraphConfig:config];
    
    _focal_length_side_packet =
    mediapipe::MakePacket<std::unique_ptr<float>>(absl::make_unique<float>(0.0));
    _input_side_packets = {
        {"focal_length_pixel", _focal_length_side_packet},
    };
    [newGraph addSidePackets:_input_side_packets];
    [newGraph addFrameOutputStream:kLandmarksOutputStream outputPacketType:MPPPacketTypeRaw];
    [newGraph addFrameOutputStream:kOutputStream outputPacketType:MPPPacketTypePixelBuffer];
    
    return newGraph;
}

- (instancetype)init
{
    self = [super init];
    if (self) {
        self.mediapipeGraph = [[self class] loadGraphFromResource:kGraphName];
        self.mediapipeGraph.delegate = self;
        self.mediapipeGraph.maxFramesInFlight = 2;
    }
    return self;
}

- (void)startGraph {
    // Start running self.mediapipeGraph.
    NSError* error;
    if (![self.mediapipeGraph startWithError:&error]) {
        NSLog(@"Failed to start graph: %@", error);
    }
}

#pragma mark - MPPInputSourceDelegate methods

- (void)processVideoFrame:(CVPixelBufferRef)imageBuffer
                timestamp:(CMTime)timestamp {
    
    mediapipe::Timestamp graphTimestamp(static_cast<mediapipe::TimestampBaseType>(
        mediapipe::Timestamp::kTimestampUnitsPerSecond * CMTimeGetSeconds(timestamp)));
    
    [self.mediapipeGraph sendPixelBuffer:imageBuffer
                              intoStream:kInputStream
                              packetType:MPPPacketTypePixelBuffer
                               timestamp:graphTimestamp];
}

#pragma mark - MPPGraphDelegate methods

// Receives CVPixelBufferRef from the MediaPipe graph. Invoked on a MediaPipe worker thread.
- (void)mediapipeGraph:(MPPGraph*)graph
  didOutputPixelBuffer:(CVPixelBufferRef)pixelBuffer
            fromStream:(const std::string&)streamName {
    if (streamName == kOutputStream) {
        [_delegate irisTracker: self didOutputPixelBuffer: pixelBuffer];
    }
}

// Receives a raw packet from the MediaPipe graph. Invoked on a MediaPipe worker thread.
- (void)mediapipeGraph:(MPPGraph*)graph
       didOutputPacket:(const ::mediapipe::Packet&)packet
            fromStream:(const std::string&)streamName {
    if (streamName == kLandmarksOutputStream) {
        if (packet.IsEmpty()) {
            NSLog(@"[TS:%lld] No iris landmarks", packet.Timestamp().Value());
            return;
        }
        
        const auto& landmarks = packet.Get<::mediapipe::NormalizedLandmarkList>();
        NSLog(@"[TS:%lld] Number of landmarks on iris: %d", packet.Timestamp().Value(),
              landmarks.landmark_size());
        for (int i = 0; i < landmarks.landmark_size(); ++i) {
          NSLog(@"\tLandmark[%d]: (%f, %f, %f)", i, landmarks.landmark(i).x(),
                landmarks.landmark(i).y(), landmarks.landmark(i).z());
        }
    }
}

@end

kiranscaria (Author) replied:

Thanks

61315 (Owner) commented Jun 7, 2021

@kiranscaria The implementation is still in progress; I'll add it to the repo once the pose estimation part is done. Thank you.

kiranscaria (Author) replied:

> Implementation is still going, I'll add it to the repo once the pose estimation part is done.

Looking forward to it

@61315 61315 pinned this issue Sep 18, 2021
resignedScientist commented

Could you please upload it to this repo as well?

61315 (Owner) commented Jul 25, 2022

@P1xelfehler You can check out the latest commit (05deb55).

+new file:   src/ios/hand/BUILD
+new file:   src/ios/hand/Info.plist
+new file:   src/ios/hand/MPPBHand.h
+new file:   src/ios/hand/MPPBHand.mm
+new file:   src/ios/iris/BUILD
+new file:   src/ios/iris/Info.plist
+new file:   src/ios/iris/MPPBIris.h
+new file:   src/ios/iris/MPPBIris.mm

@61315 61315 reopened this Jul 25, 2022
resignedScientist replied:

Thank you!

@61315 61315 added the question Further information is requested label Jul 27, 2022
@61315 61315 closed this as completed Sep 25, 2022