From 2dc8333c2d2000a1ce239dfa8d0ee095495bd225 Mon Sep 17 00:00:00 2001
From: Arren Glover
Date: Mon, 6 Feb 2023 10:40:24 +0100
Subject: [PATCH 01/23] Update README.md

---
 README.md | 56 +++++++++++++++++++++++++++++++++++++++++++------------
 1 file changed, 44 insertions(+), 12 deletions(-)

diff --git a/README.md b/README.md
index 67c97c43..22cb3a1d 100644
--- a/README.md
+++ b/README.md
@@ -1,17 +1,49 @@
 # Human Pose Estimation Core Library
-A library of functions for human pose estimation with event-driven cameras
+_A library of functions for human pose estimation with event-driven cameras_
 
-* Compile and link the [core-library](https://github.com/event-driven-robotics/hpe-core/tree/main/core) in your application to use the event-based human pose estimation functions including:
-  * joint detectors (neural network implementations including OpenPose)
-  * joint velocity estimation
-  * asynchronous pose fusion
-* [Example applications](https://github.com/event-driven-robotics/hpe-core/tree/main/example) are available to show how to connect the HPE blocks
-* [Evaluate](https://github.com/event-driven-robotics/hpe-core/tree/main/evaluation) your performance using the scripts in evaluation
-* Scripts to convert datasets are also available. Please contribute to the ever-growing datasets for event-driven HPE! For curated datasets compatible with the example function see [hosted datasts](https://github.com/event-driven-robotics/hpe-core) TODO
+Please contribute your event-driven HPE application and datasets to enable comparisons!
+
+```
+@INPROCEEDINGS{9845526,
+  author={Carissimi, Nicolò and Goyal, Gaurvi and Pietro, Franco Di and Bartolozzi, Chiara and Glover, Arren},
+  booktitle={2022 8th International Conference on Event-Based Control, Communication, and Signal Processing (EBCCSP)},
+  title={Unlocking Static Images for Training Event-driven Neural Networks},
+  year={2022},
+  pages={1-4},
+  doi={10.1109/EBCCSP56922.2022.9845526}}
+```
+
+### [Core (C++)](https://github.com/event-driven-robotics/hpe-core/tree/main/core)
+
+Compile and link the core C++ library in your application to use the event-based human pose estimation functions including:
+* joint detectors: OpenPose built upon greyscale images formed from events
+* joint velocity estimation at >500 Hz
+* asynchronous pose fusion of joint velocity and detection
+* event representation methods compatible with convolutional neural networks.
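+
+A minimal sketch of how the OpenPose wrapper can be called from your own application; the header path, model folder, and input image below are placeholders, while the `init`/`detect`/`stop` calls follow the `OpenPoseDetector` interface provided by the core library:
+
+```cpp
+#include <opencv2/opencv.hpp>
+#include "openpose_detector.h"   // hpe-core OpenPose wrapper (include path is an assumption)
+
+int main()
+{
+    hpecore::OpenPoseDetector detector;
+    // model folder, model name and network size; "256" matches the wrapper's default
+    if (!detector.init("/openpose/models", "COCO", "256"))
+        return 1;
+
+    // a greyscale image formed from events, loaded here as a 3-channel 8-bit image
+    cv::Mat event_frame = cv::imread("event_frame.png", cv::IMREAD_COLOR);
+    if (event_frame.empty())
+        return 1;
+
+    hpecore::skeleton13 pose = detector.detect(event_frame);   // 13-joint skeleton
+
+    detector.stop();
+    return 0;
+}
+```
+
+The `openposethread` helper wraps the same detector in a background thread, so detection can run asynchronously alongside event processing and velocity estimation.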
+
+[installation](https://github.com/event-driven-robotics/hpe-core/tree/main/core)
+
+### PyCore
+
+Importable python libraries for joint detection
+* event-based movenet: MoveEnet built on PyTorch
+
+### [Examples](https://github.com/event-driven-robotics/hpe-core/tree/main/example)
+
+Example applications are available to show how to use the HPE-core libraries
+
+### [Evaluation](https://github.com/event-driven-robotics/hpe-core/tree/main/evaluation)
+
+Python scripts can be used to compare the performance of different joint detector and velocity estimation combinations
+
+### Datasets and Conversion
+
+Scripts to convert datasets into common formats to easily facilitate valid comparisons
 
 ### Authors
-@arrenglover
-@nicolocarissimi
-@gaurvigoyal
-@francodipietro
+> [@arrenglover](https://www.linkedin.com/in/arren-glover/)
+> [@nicolocarissimi](https://www.linkedin.com/in/nicolocarissimi/)
+> [@gaurvigoyal](https://www.linkedin.com/in/gaurvigoyal/)
+> [@francodipietro](https://www.linkedin.com/in/francodipietrophd/)
+

From 0a641f9c0cf50768f623e8a3470f5103cbc3706f Mon Sep 17 00:00:00 2001
From: Arren Glover
Date: Mon, 6 Feb 2023 10:49:18 +0100
Subject: [PATCH 02/23] Update README.md

---
 README.md | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/README.md b/README.md
index 22cb3a1d..8939d674 100644
--- a/README.md
+++ b/README.md
@@ -13,6 +13,8 @@ Please contribute your event-driven HPE application and datasets to enable compa
   doi={10.1109/EBCCSP56922.2022.9845526}}
 ```
 
+![demo](https://user-images.githubusercontent.com/9265237/216939617-703fc4ef-b4b9-4cbc-aab8-a87c04822be2.gif)
+

From 9185afa8fc9871b3acbdb19c58aef886f3af0064 Mon Sep 17 00:00:00 2001
From: Arren Glover
Date: Mon, 6 Feb 2023 10:49:51 +0100
Subject: [PATCH 03/23] Update README.md

---
 README.md | 2 --
 1 file changed, 2 deletions(-)

diff --git a/README.md b/README.md
index 8939d674..32b13d97 100644
--- a/README.md
+++ b/README.md
@@ -23,8 +23,6 @@ Compile and link the core C++ library in your application to use the event-based
 * asynchronous pose fusion of joint velocity and detection
 * event representation methods compatible with convolutional neural networks.
-[installation](https://github.com/event-driven-robotics/hpe-core/tree/main/core) - ### PyCore Importable python libraries for joint detection From e784629bb4d3146ac5afd513a02933704c7fb5dd Mon Sep 17 00:00:00 2001 From: Arren Glover Date: Mon, 6 Feb 2023 10:56:42 +0100 Subject: [PATCH 04/23] Update README.md --- README.md | 1 + 1 file changed, 1 insertion(+) diff --git a/README.md b/README.md index 32b13d97..584b3fce 100644 --- a/README.md +++ b/README.md @@ -47,3 +47,4 @@ Scripts to convert datasets into common formats to easily facilitate valid compa > [@gaurvigoyal](https://www.linkedin.com/in/gaurvigoyal/) > [@francodipietro](https://www.linkedin.com/in/francodipietrophd/) +[Event-driven Perception for Robotics](https://www.edpr.iit.it/research) From 0c2681b1dbe0c8b409e35077bcb7acdcc97cd586 Mon Sep 17 00:00:00 2001 From: Arren Glover Date: Mon, 6 Feb 2023 10:58:34 +0100 Subject: [PATCH 05/23] Update README.md --- core/README.md | 2 ++ 1 file changed, 2 insertions(+) diff --git a/core/README.md b/core/README.md index 559d54a2..8e268af2 100644 --- a/core/README.md +++ b/core/README.md @@ -10,6 +10,8 @@ C++ src for compiling a library of functions useful for HPE: # Build the library +The library is designed to use CMake to configure and build. + * Clone the repository: e.g. `git clone https://github.com/event-driven-robotics/hpe-core.git` * `cd hpe-core/core` * `mkdir build && cd build` From 5b2e92d17a2ba94f8726b1f765d74a789fc22cd1 Mon Sep 17 00:00:00 2001 From: arrenglover Date: Mon, 6 Feb 2023 17:07:11 +0100 Subject: [PATCH 06/23] removed volume compensation example --- example/volume_compensation/.gitignore | 2 - example/volume_compensation/CMakeLists.txt | 19 - .../example_volumecomp.cpp | 478 ------------------ example/volume_compensation/plot_velocity.py | 34 -- example/volume_compensation/visualisation.h | 175 ------- 5 files changed, 708 deletions(-) delete mode 100644 example/volume_compensation/.gitignore delete mode 100644 example/volume_compensation/CMakeLists.txt delete mode 100644 example/volume_compensation/example_volumecomp.cpp delete mode 100644 example/volume_compensation/plot_velocity.py delete mode 100644 example/volume_compensation/visualisation.h diff --git a/example/volume_compensation/.gitignore b/example/volume_compensation/.gitignore deleted file mode 100644 index f7857461..00000000 --- a/example/volume_compensation/.gitignore +++ /dev/null @@ -1,2 +0,0 @@ -build/ -.vscode diff --git a/example/volume_compensation/CMakeLists.txt b/example/volume_compensation/CMakeLists.txt deleted file mode 100644 index 1d287f15..00000000 --- a/example/volume_compensation/CMakeLists.txt +++ /dev/null @@ -1,19 +0,0 @@ -# requires minimum cmake version -cmake_minimum_required(VERSION 3.5) - -# produce the cmake var PROJECT_NAME -project(volume_compensation) -find_package(YARP) -find_package(event-driven) -find_package(hpe-core) - -#default the install location to that of event-driven -add_executable(${PROJECT_NAME} example_volumecomp.cpp visualisation.h) - -target_link_libraries(${PROJECT_NAME} PRIVATE YARP::YARP_os - YARP::YARP_init - hpe-core::hpe-core - ev::event-driven) - -install(TARGETS ${PROJECT_NAME} DESTINATION ${CMAKE_INSTALL_BINDIR}) - diff --git a/example/volume_compensation/example_volumecomp.cpp b/example/volume_compensation/example_volumecomp.cpp deleted file mode 100644 index 690ed549..00000000 --- a/example/volume_compensation/example_volumecomp.cpp +++ /dev/null @@ -1,478 +0,0 @@ -/*BSD 3-Clause License -Copyright (c) 2021, Event Driven Perception for Robotics 
-All rights reserved. */ - -#include -#include -#include -#include -#include -#include -#include -#include -#include -#include "visualisation.h" -#include - - -// #define f 274.573 -// #define x0 139.228 -// #define y0 123.092 - -#define f 500 -#define x0 152 -#define y0 120 - - -using namespace ev; -using namespace yarp::os; -using namespace yarp::sig; -using std::vector; -using std::string; - -class flowVisualiser { -private: - cv::Mat masker; - cv::Mat debug_image; - cv::Mat arrow_image; - cv::Mat compensated_image; - const static int arrow_space{20}; - - pixelShifter ps; - cv::Mat blank_iso; - -public: - - flowVisualiser() {} - void initialise(int height, int width, double window_size) { - - masker = cv::Mat(height, width, CV_8U); - debug_image = cv::Mat(height, width, CV_8UC3); - arrow_image = cv::Mat(height, width, CV_8UC3); - compensated_image = cv::Mat(height, width, CV_32FC3); - reset(); - - ps = drawISOBase(height, width, window_size, blank_iso); - - } - - void reset() { - masker.setTo(1); - debug_image.setTo(128); - arrow_image.setTo(0); - compensated_image.setTo(cv::Scalar(0, 0, 0)); - - } - - void addPixel(int u, int v, int uhat, int vhat, double udot, double vdot, double period) - { - //update the original uncompensated image - debug_image.at(v, u) = cv::Vec3b(255, 255, 255); - - //update the compensated image with HSV calculated colours - if (vhat >= 0 && vhat < compensated_image.rows && uhat >= 0 && uhat < compensated_image.cols) { - double magnitude = 255.0; - magnitude *= sqrt(udot * udot + vdot * vdot) / 200.0; - if (magnitude > 255.0) magnitude = 255.0; - if (magnitude < 1.0) magnitude = 1.0; - double angle = atan2(vdot, udot); - angle = 180.0 * (angle + M_PI) / (2.0 * M_PI); - compensated_image.at(vhat, uhat) = cv::Vec3f(angle, 255.0f, magnitude); - } - - //if an arrow hasn't been drawn nearby, draw one - if (masker.at(v, u)) { - cv::line(arrow_image, cv::Point(u, v), - cv::Point(u + udot * period, v + vdot * period), - CV_RGB(1, 1, 1)); - - //and update the masker to say an arrow is already drawn - //in this region - int xl = std::max(u - arrow_space, 0); - int xh = std::min(u + arrow_space, masker.cols); - int yl = std::max(v - arrow_space, 0); - int yh = std::min(v + arrow_space, masker.rows); - masker(cv::Rect(cv::Point(xl, yl), cv::Point(xh, yh))).setTo(0); - } - } - - void drawISO(const std::deque &volume, std::string name) - { - cv::namedWindow(name, cv::WINDOW_NORMAL); - cv::Mat base_copy; - blank_iso.copyTo(base_copy); - for (auto &c : volume) { - int x = c.x; - int y = c.y; - double z = ev::vtsHelper::deltaS(volume.back().stamp, c.stamp); - ps.pttr(x, y, z); - if (x >= 0 && y >= 0 && x < base_copy.cols && y < base_copy.rows) - base_copy.at(y, x) = cv::Vec3b(120, 120, 120); - } - for (auto &c : volume) { - int x = c.x; - int y = c.y; - double z = ev::vtsHelper::deltaS(volume.back().stamp, c.stamp); - ps.pttr(x, y, z); - base_copy.at(c.y, x) = cv::Vec3b(255, 255, 255); - } - cv::imshow(name, base_copy); - } - - void countNonZero() - { - static double mean_val{0.0}; - static int mean_count{0}; - - cv::Mat temp, temp2; - cv::cvtColor(debug_image, temp, cv::COLOR_BGR2GRAY); - cv::threshold(temp, temp, 129, 255, cv::THRESH_BINARY); - int c_orig = cv::countNonZero(temp); - - compensated_image.convertTo(temp, CV_8UC3); - cv::cvtColor(temp, temp2, cv::COLOR_BGR2GRAY); - int c_comp = cv::countNonZero(temp2); - - double score = (c_orig - c_comp) / (double)std::max(c_orig, c_comp); - mean_count++; - mean_val += (score - mean_val) / mean_count; - - //yInfo() << 100 * 
mean_val; - } - - void show() - { - - //build hsv image - cv::Mat bgr, hsv8; - compensated_image.convertTo(hsv8, CV_8UC3); - cv::cvtColor(hsv8, bgr, cv::COLOR_HSV2BGR); - - cv::Mat together = cv::Mat(bgr.rows, bgr.cols*2, CV_8UC3, cv::Scalar(128, 128, 128)); - cv::Rect roi1 = cv::Rect(0, 0, bgr.cols, bgr.rows); - cv::Rect roi2 = cv::Rect(bgr.cols, 0, bgr.cols, bgr.rows); - - //debug_image.copyTo(together(roi2)); - bgr.copyTo(together(roi2), bgr); - - debug_image.copyTo(together(roi1)); - arrow_image.copyTo(together(roi1), arrow_image); - //cv::threshold(together, together, 129, 255, cv::THRESH_BINARY); - - cv::namedWindow("Compensated", cv::WINDOW_NORMAL); - cv::imshow("Compensated", together); - cv::waitKey(3); - - } - -}; - -class volumeCompensator : public RFModule, - public Thread { - private: - - vReadPort< vector > vis_port; - vReadPort< vector > imu_port; - BufferedPort< Vector > scope_port; - - imuAdvHelper imu_helper; - double period{0.1}; - double window_size{0.1}; - bool rot_only{false}; - - resolution res{304, 240}; - - deque window; - flowVisualiser fv; - std::mutex m; - - std::fstream vel_writer; - -public: - - volumeCompensator(){}; - - virtual bool configure(yarp::os::ResourceFinder& rf) - { - if(rf.check("help") || rf.check("h")) { - yInfo() << "Event Volumer and Compensator"; - yInfo() << "--height : image height"; - yInfo() << "--width : image width"; - yInfo() << "--period : image output rate"; - yInfo() << "--rot_only : use 3DoF rotation only"; - yInfo() << "--window : period of data to compensate"; - yInfo() << "--vel_dump : path to file to dump velocities"; - yInfo() << "--imu : path to imu calibration file"; - yInfo() << "--imu_beta : AHRS beta parameter"; - return false; - } - - //set the module name used to name ports - setName((rf.check("name", Value("/volumer")).asString()).c_str()); - - res.height = rf.check("height", Value(240)).asInt(); - res.width = rf.check("width", Value(304)).asInt(); - period = rf.check("period", Value(0.05)).asDouble(); - rot_only = rf.check("rot_only", Value(false)).asBool(); - window_size = rf.check("window", Value(0.1)).asDouble(); - std::string vel_dump_path = ""; - if(rf.check("vel_dump")) - vel_dump_path = rf.find("vel_dump").asString(); - - if(!vel_dump_path.empty()) { - vel_writer.open(vel_dump_path, std::ios_base::out|std::ios_base::trunc); - if(!vel_writer.is_open()) { - yError() << "Could not open supplied writer path" << vel_dump_path; - return false; - } - vel_writer << std::fixed << std::setprecision(6); - } - - Network yarp; - if(!yarp.checkNetwork(2.0)) { - std::cout << "Could not connect to YARP" << std::endl; - return false; - } - - //open io ports - if(!vis_port.open(getName("/AE:i"))) { - yError() << "Could not open input port"; - return false; - } - - if(!imu_port.open(getName("/IMU:i"))) { - yError() << "Could not open input port"; - return false; - } - - if(!scope_port.open(getName() + "/scope:o")) { - yError() << "Could not open scope out port"; - return false; - } - - yarp.connect("/vPreProcess/imu_samples:o", - getName("/IMU:i"), - "fast_tcp"); - - yarp.connect("/vPreProcess/right:o", - getName("/AE:i"), - "fast_tcp"); - - fv.initialise(res.height, res.width, window_size); - - if(rf.check("imu")) - imu_helper.loadIMUCalibrationFiles(rf.find("imu").asString()); - - if(rf.check("imu_beta")) - imu_helper.setBeta(rf.find("imu_beta").asDouble()); - - //start the asynchronous and synchronous threads - return Thread::start(); - } - - virtual double getPeriod() - { - return period; //period of synchrnous thread - } - - 
bool interruptModule() //RFMODULE - { - //if the module is asked to stop ask the asynchrnous thread to stop - return Thread::stop(); - } - - void onStop() //THREAD - { - //when the asynchrnous thread is asked to stop, close ports and do - //other clean up - vis_port.close(); - imu_port.close(); - scope_port.close(); - vel_writer.close(); - } - - //synchronous thread - virtual bool updateModule() - { - - //get all available updates to the IMU, then push them to the - //new IMU helper that does orientation estimation and velocity - //estimation. extract the velocity vector - static Stamp yarpstamp_imu; - auto n_packets = imu_port.queryunprocessed(); - - //update the current value of the imu sensor values - for (auto i = 0; i < n_packets; i++) { - const vector *q_imu = imu_port.read(yarpstamp_imu); - if (!q_imu || Thread::isStopping()) return false; - imu_helper.addIMUPacket(*q_imu); - } - if(window.empty()) return true; - - //update the madgwick filter - // double dtime = vtsHelper::deltaMS(timestamp, p_timestamp); - // orientation_filter.setPeriod(dtime); - // p_timestamp = timestamp; - - // orientation_filter.AHRSupdateIMU(imu_state[imuHelper::GYR_X], - // imu_state[imuHelper::GYR_Y], - // imu_state[imuHelper::GYR_Z], - // imu_state[imuHelper::ACC_X], - // imu_state[imuHelper::ACC_Y], - // imu_state[imuHelper::ACC_Z]); - - // //perform compensation - // auto gp = orientation_filter.gravity_compensate( - // imu_state[imuHelper::ACC_X], - // imu_state[imuHelper::ACC_Y], - // imu_state[imuHelper::ACC_Z]); - // vel_estimate.integrate(gp, dtime); - - //form the velocity vector correctly - // static vector vel_imu(6, 0.0); - // vel_imu[0] = vel_estimate.getLinearVelocity()[0]; - // vel_imu[1] = vel_estimate.getLinearVelocity()[1]; - // vel_imu[2] = vel_estimate.getLinearVelocity()[2]; - // vel_imu[3] = imu_state[imuHelper::GYR_X]; - // vel_imu[4] = imu_state[imuHelper::GYR_Y]; - // vel_imu[5] = imu_state[imuHelper::GYR_Z]; - - - - double period = vtsHelper::deltaS(window.back().stamp, window.front().stamp); - hpecore::camera_velocity cam_vel = imu_helper.extractVelocity(); - double temp3 = cam_vel[0]; cam_vel[0] = cam_vel[1]; cam_vel[1] = -temp3; - temp3 = cam_vel[3]; cam_vel[3] = cam_vel[4]; cam_vel[4] = -temp3; - //cam_vel[0] = 0.0; cam_vel[1] = 0.0; cam_vel[2] = 0.0; - hpecore::camera_params cam_par = {f, x0, y0}; - fv.reset(); - deque warped_volume; - m.lock(); - for (auto &v : window) { - hpecore::point_flow pf = hpecore::estimateVisualFlow({(double)v.x, (double)v.y, 0.0}, cam_vel, cam_par); - pf.udot *= 12.5; pf.vdot *= 12.5; - auto we = hpecore::spatiotemporalWarp(v, pf, vtsHelper::deltaS(window.back().stamp, v.stamp)); - fv.addPixel(v.x, v.y, we.x, we.y, pf.udot, pf.vdot, period); - warped_volume.push_back(we); - } - //fv.drawISO(window, "Original Events"); - //fv.drawISO(warped_volume, "Warped Events"); - std::array qd = imu_helper.extractHeading(); - //for(auto i : qd) std::cout << i; std::cout << std::endl; - //std::array qf = qd; - - m.unlock(); - - //fv.countNonZero(); - fv.show(); - - static std::array< deque, 6> values; - - yarp::sig::Vector &yarp_vel = scope_port.prepare(); - yarp_vel.resize(6); - std::array cam_acc = imu_helper.extractAcceleration(); - static std::array mean_acc = {0, 0, 0, 0, 0, 0}; - static std::array mean_vel = {0, 0, 0, 0, 0, 0}; - static std::array my_vel = {0, 0, 0, 0, 0, 0}; - std::array gv = imu_helper.getGravityVector(); - static double tic = yarp::os::Time::now(); - double dt = yarp::os::Time::now() - tic; tic += dt; - double max_diff{0.0}; - if(dt > 
1.0) dt = 1.0; - - //display gravity vector -#if 0 - for(auto i = 0; i < 3; i++) { - yarp_vel[i] = gv[i]; - yarp_vel[i+3] = cam_acc[i]; - } -#endif - - //display integrated velocity -#if 1 - for(auto i = 0; i < 3; i++) { - - values[i].push_back(cam_acc[i]); - if(values[i].size() > 5) values[i].pop_front(); - - double mean = 0.0; - for(auto &v : values[i]) mean += v; - mean /= values[i].size(); - - double var = 0.0; - for(auto &v : values[i]) var += (v-mean)*(v-mean); - var /= values[i].size(); - var = sqrt(var); - - if(var > 0.1) { - my_vel[i] += (cam_acc[i] - gv[i]) * dt; - mean_vel[i] = mean_vel[i]*(1.0-dt*0.5) + my_vel[i]*dt*0.5; - } else { - my_vel[i] = 0.0; - mean_vel[i] = 0.0; - } - - yarp_vel[i] = my_vel[i]; - max_diff = std::max(std::fabs(gv[i]- cam_acc[i]), max_diff); - } -#endif - //yInfo() << max_diff << " ms^-2"; - static double mean_max_diff{0.0}; - static int count_max_diff{0}; - mean_max_diff += (max_diff - mean_max_diff) / ++count_max_diff; - - std::stringstream ss; - ss << std::fixed << std::setprecision(2) << mean_max_diff << "m/s^2"; - - cv::Mat temp = ev::drawRefAxis(qd); - cv::putText(temp, ss.str(), cv::Point(0, temp.rows), cv::FONT_HERSHEY_PLAIN, 1.0, cv::Scalar(255, 255, 255)); - cv::imshow("AHRS", temp); - cv::waitKey(1); - - scope_port.write(); - - if (vel_writer.is_open()) { - vel_writer << yarpstamp_imu.getTime() << " "; - for (auto &i : cam_vel) - vel_writer << i << " "; - vel_writer << std::endl; - } - - return Thread::isRunning(); - } - - //asynchronous thread run forever - void run() - { - Stamp yarpstamp; - while(true) { - const vector * q_flow = vis_port.read(yarpstamp); - if(!q_flow || Thread::isStopping()) - return; - - m.lock(); - for(auto &v : *q_flow) - if(v.x >= 0 && v.y >= 0 && v.x < res.width && v.y < res.height) - window.push_back(v); - - while(ev::vtsHelper::deltaS(window.back().stamp, window.front().stamp) > window_size) - window.pop_front(); - m.unlock(); - - } - } -}; - -int main(int argc, char * argv[]) -{ - - /* prepare and configure the resource finder */ - yarp::os::ResourceFinder rf; - rf.configure( argc, argv ); - - /* create the module */ - volumeCompensator instance; - return instance.runModule(rf); -} - - diff --git a/example/volume_compensation/plot_velocity.py b/example/volume_compensation/plot_velocity.py deleted file mode 100644 index 6cc602d7..00000000 --- a/example/volume_compensation/plot_velocity.py +++ /dev/null @@ -1,34 +0,0 @@ -#BSD 3-Clause License -#Copyright (c) 2021, Event Driven Perception for Robotics -#All rights reserved. - -import matplotlib.pyplot as plt -import numpy as np - -#%% -input_file_path = "/home/aglover/imu_out_temp.txt" -data = np.loadtxt(input_file_path); -data[:, 0] = data[:, 0] - data[0, 0] -labels = ["X", "Y", "Z", "PITCH", "YAW", "ROLL"] - -#%% -plt.rcParams.update({'font.size': 12}) -#plt.figure(figsize=(12, 9)) -fig,axs = plt.subplots(6) -#fig.set_figsize(12, 9) -#plt.xlim(0.0, 1.0) -#plt.ylim(0.0, 0.75) -axs[2].set_ylabel('velocity') -axs[5].set_xlabel('time') -plt.draw() -plt.pause(0.01) - -for i in range(0, 6): - axs[i].plot(data[:, 0], data[:, i+1], label=labels[i]) - axs[i].legend(loc="upper right") - -for ax in axs: - ax.label_outer() - -plt.show() - diff --git a/example/volume_compensation/visualisation.h b/example/volume_compensation/visualisation.h deleted file mode 100644 index b51c8c3b..00000000 --- a/example/volume_compensation/visualisation.h +++ /dev/null @@ -1,175 +0,0 @@ -/*BSD 3-Clause License -Copyright (c) 2021, Event Driven Perception for Robotics -All rights reserved. 
*/ - -#pragma once - -#include -#include -#include - -class pixelShifter { - //angles - double thetaY; - double thetaX; - double CY, SY; - double CX, SX; - double xshift; - double yshift; - double ts_scaler; - - public: - - pixelShifter() - { - setRotation(20.0, 40.0); - setShift(0, 0, 1.0); - } - - void setRotation(double pitch, double yaw) - { - thetaX = pitch * 3.14 / 180.0; //PITCH - thetaY = yaw * 3.14 / 180.0; //YAW - - CY = cos(thetaY); - SY = sin(thetaY); - CX = cos(thetaX); - SX = sin(thetaX); - } - - void setShift(int xoffset, int yoffset, double tsoffset) - { - xshift = xoffset; - yshift = yoffset; - ts_scaler = tsoffset; - } - - void pttr(int &x, int &y, double &z) { - // we want a negative rotation around the y axis (yaw) - // a positive rotation around the x axis (pitch) (no roll) - // the z should always be negative values. - // the points need to be shifted across by negligble amount - // the points need to be shifted up by (x = max, y = 0, ts = 0 rotation) - z = z*ts_scaler; - int xmod = x*CY + z*SY + 0.5; // +0.5 rounds rather than floor - int ymod = y*CX - SX*(-x*SY + z*CY) + 0.5; - //int zmod = y*SX + CX*(-x*SY + z*CY) + 0.5; - x = xmod + xshift; y = ymod + yshift; //z = zmod; - } -}; - -pixelShifter drawISOBase(int height, int width, double period, cv::Mat &baseimage) -{ - int Xlimit = width; - int Ylimit = height; - int Zlimit = width * 3; - double ts_to_axis = (double)Zlimit / period; - - pixelShifter pr; - pr.setRotation(-20.0, 0.0); - - //the following calculations make the assumption of a negative yaw and - //a positive pitch - int x, y; double z; - int maxx = 0, maxy = 0, miny = Ylimit, minx = Xlimit; - for(int xi = 0; xi <= Xlimit; xi+=Xlimit) { - for(int yi = 0; yi <= Ylimit; yi+=Ylimit) { - for(double zi = 0; zi <= Zlimit; zi+=Zlimit) { - x = xi; y = yi; z = zi; pr.pttr(x, y, z); - maxx = std::max(maxx, x); - maxy = std::max(maxy, y); - minx = std::min(minx, x); - miny = std::min(miny, y); - } - } - } - - - int imagexshift = -minx + 10; - int imageyshift = -miny + 10; - pr.setShift(imagexshift, imageyshift, ts_to_axis); - - int imagewidth = maxx + imagexshift + 10; - int imageheight = maxy + imageyshift + 10; - - baseimage = cv::Mat(imageheight, imagewidth, CV_8UC3); - baseimage.setTo(0); - - - - //cv::putText(baseimage, std::string("X"), cv::Point(100, 100), 1, 0.5, CV_RGB(0, 0, 0)); - - cv::Scalar invertedtextc = CV_RGB(125, 125, 125); - cv::Vec3b invertedaxisc = cv::Vec3b(255, 255, 255); - cv::Vec3b invertedframec = cv::Vec3b(125, 125, 125); - - for(int xi = 0; xi < Xlimit; xi++) { - x = xi; y = 0; z = 0; pr.pttr(x, y, z); - baseimage.at(y, x) = invertedaxisc; - x = xi; y = Ylimit; z = 0; pr.pttr(x, y, z); - baseimage.at(y, x) = invertedaxisc; - if(xi == Xlimit / 2) { - cv::putText(baseimage, std::string("x"), cv::Point(x-10, y+10), - cv::FONT_ITALIC, 0.5, invertedtextc, 1, 8, false); - } - } - - for(int yi = 0; yi <= Ylimit; yi++) { - x = 0; y = yi; z = 0; pr.pttr(x, y, z); - baseimage.at(y, x) = invertedaxisc; - if(yi == Ylimit / 2) { - cv::putText(baseimage, std::string("y"), cv::Point(x-10, y+10), - cv::FONT_ITALIC, 0.5, invertedtextc, 1, 8, false); - } - x = Xlimit; y = yi; z = 0; pr.pttr(x, y, z); - baseimage.at(y, x) = invertedaxisc; - - } - - unsigned int tsi; - for(tsi = 0; tsi < (unsigned int)(period*0.3); tsi++) { - - x = Xlimit; y = Ylimit; z = tsi; pr.pttr(x, y, z); - baseimage.at(y, x) = invertedaxisc; - - if(tsi == (unsigned int)(period *0.15)) { - cv::putText(baseimage, std::string("t"), cv::Point(x, y+12), - cv::FONT_ITALIC, 0.5, 
invertedtextc, 1, 8, false);
-        }
-
-    }
-
-    // for(int i = 0; i < 14; i++) {
-
-    //     x = Xlimit-i/2; y = Ylimit; z = tsi-i; pr.pttr(x, y, z);
-    //     baseimage.at(y, x) = invertedaxisc;
-
-    //     x = Xlimit+i/2; y = Ylimit; z = tsi-i; pr.pttr(x, y, z);
-    //     baseimage.at(y, x) = invertedaxisc;
-    // }
-
-    for(tsi = ev::vtsHelper::vtsscaler / 10.0;
-        tsi < (unsigned int)period;
-        tsi += ev::vtsHelper::vtsscaler / 10.0) {
-
-        int zc = tsi + 0.5;
-
-        for(int xi = 0; xi < Xlimit; xi++) {
-            x = xi; y = 0; z = zc; pr.pttr(x, y, z);
-            baseimage.at(y, x) = invertedframec;
-            x = xi; y = Ylimit; z = zc; pr.pttr(x, y, z);
-            baseimage.at(y, x) = invertedframec;
-        }
-
-        for(int yi = 0; yi <= Ylimit; yi++) {
-            x = 0; y = yi; z = zc; pr.pttr(x, y, z);
-            baseimage.at(y, x) = invertedframec;
-            x = Xlimit; y = yi; z = zc; pr.pttr(x, y, z);
-            baseimage.at(y, x) = invertedframec;
-
-        }
-
-    }
-
-    return pr;
-}
\ No newline at end of file

From a5d1049abde75e0d83c7b7653c9b776ba692307f Mon Sep 17 00:00:00 2001
From: Arren Glover
Date: Tue, 7 Feb 2023 08:45:16 +0100
Subject: [PATCH 07/23] Update README.md

---
 core/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/core/README.md b/core/README.md
index 8e268af2..5e08c5b4 100644
--- a/core/README.md
+++ b/core/README.md
@@ -5,7 +5,7 @@ C++ src for compiling a library of functions useful for HPE:
 * human pose estimation from events
 * joint tracking
 * skeleton tracking
-* fusion
+* position and velocity fusion
 * on-line visualisation
 
 # Build the library

From b72ce693d14f624206201b0975c879479a8b79e1 Mon Sep 17 00:00:00 2001
From: Arren Glover
Date: Tue, 7 Feb 2023 09:04:52 +0100
Subject: [PATCH 08/23] Update README.md

---
 core/README.md | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/core/README.md b/core/README.md
index 5e08c5b4..1906d254 100644
--- a/core/README.md
+++ b/core/README.md
@@ -27,5 +27,6 @@ Using cmake, add the following to your `CMakeLists.txt`
 * `find_package(hpe-core)`
 * `target_link_libraries(${PROJECT_NAME} PRIVATE hpe-core::hpe-core)`
 
+# Example
-
+Examples of how to install dependencies, build the library, and link it in your own project can be found [here](https://github.com/event-driven-robotics/hpe-core/tree/main/example/op_detector_example_module). The Dockerfile can be used to automatically build the dependencies required in an isolated container, or, if you prefer, you can follow its instructions to install the dependencies natively on your machine.
[Interested to learn more about Docker?](https://www.docker.com/) From 76a6df886ea8c6cb3f39371aac5a86690ca09da5 Mon Sep 17 00:00:00 2001 From: arrenglover Date: Tue, 7 Feb 2023 10:09:03 +0000 Subject: [PATCH 09/23] removed old files --- .gitignore | 2 + core/CMakeLists.txt | 4 - core/detection_wrappers/detection.cpp | 40 - core/detection_wrappers/detection.h | 40 - .../jointMotionEstimator.cpp | 708 ------------------ core/motion_estimation/jointMotionEstimator.h | 67 -- 6 files changed, 2 insertions(+), 859 deletions(-) delete mode 100644 core/detection_wrappers/detection.cpp delete mode 100644 core/detection_wrappers/detection.h delete mode 100644 core/motion_estimation/jointMotionEstimator.cpp delete mode 100644 core/motion_estimation/jointMotionEstimator.h diff --git a/.gitignore b/.gitignore index 46d80014..33596926 100644 --- a/.gitignore +++ b/.gitignore @@ -37,3 +37,5 @@ #compiled python caches *.pyc +.vscode/ +build/ \ No newline at end of file diff --git a/core/CMakeLists.txt b/core/CMakeLists.txt index 86a07496..9397ca94 100644 --- a/core/CMakeLists.txt +++ b/core/CMakeLists.txt @@ -52,8 +52,6 @@ if(OpenCV_FOUND) endif() set( folder_source - detection_wrappers/detection.cpp - motion_estimation/jointMotionEstimator.cpp event_representations/representations.cpp motion_estimation/motion_estimation.cpp fusion/fusion.cpp @@ -61,9 +59,7 @@ set( folder_source set( folder_header utility/utility.h - detection_wrappers/detection.h motion_estimation/motion_estimation.h - motion_estimation/jointMotionEstimator.h event_representations/representations.h fusion/fusion.h ) diff --git a/core/detection_wrappers/detection.cpp b/core/detection_wrappers/detection.cpp deleted file mode 100644 index c245a6f9..00000000 --- a/core/detection_wrappers/detection.cpp +++ /dev/null @@ -1,40 +0,0 @@ -/* BSD 3-Clause License - -Copyright (c) 2021, Event Driven Perception for Robotics -All rights reserved. - -Redistribution and use in source and binary forms, with or without -modification, are permitted provided that the following conditions are met: - -1. Redistributions of source code must retain the above copyright notice, this - list of conditions and the following disclaimer. - -2. Redistributions in binary form must reproduce the above copyright notice, - this list of conditions and the following disclaimer in the documentation - and/or other materials provided with the distribution. - -3. Neither the name of the copyright holder nor the names of its - contributors may be used to endorse or promote products derived from - this software without specific prior written permission. - -THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" -AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE -IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE -DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE -FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL -DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR -SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER -CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, -OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE -OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
*/ - -#include "detection.h" - -using namespace hpecore; - -skeleton13 detect(cv::Mat image) -{ - skeleton13 pose; - return pose; -} - diff --git a/core/detection_wrappers/detection.h b/core/detection_wrappers/detection.h deleted file mode 100644 index 8231ba87..00000000 --- a/core/detection_wrappers/detection.h +++ /dev/null @@ -1,40 +0,0 @@ -/* BSD 3-Clause License - -Copyright (c) 2021, Event Driven Perception for Robotics -All rights reserved. - -Redistribution and use in source and binary forms, with or without -modification, are permitted provided that the following conditions are met: - -1. Redistributions of source code must retain the above copyright notice, this - list of conditions and the following disclaimer. - -2. Redistributions in binary form must reproduce the above copyright notice, - this list of conditions and the following disclaimer in the documentation - and/or other materials provided with the distribution. - -3. Neither the name of the copyright holder nor the names of its - contributors may be used to endorse or promote products derived from - this software without specific prior written permission. - -THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" -AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE -IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE -DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE -FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL -DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR -SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER -CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, -OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE -OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */ -#pragma once - -#include "utility.h" -#include - - -namespace hpecore { - -skeleton13 detect(cv::Mat image); - -} diff --git a/core/motion_estimation/jointMotionEstimator.cpp b/core/motion_estimation/jointMotionEstimator.cpp deleted file mode 100644 index 9d521a52..00000000 --- a/core/motion_estimation/jointMotionEstimator.cpp +++ /dev/null @@ -1,708 +0,0 @@ -/* BSD 3-Clause License - -Copyright (c) 2021, Event Driven Perception for Robotics -All rights reserved. - -Redistribution and use in source and binary forms, with or without -modification, are permitted provided that the following conditions are met: - -1. Redistributions of source code must retain the above copyright notice, this - list of conditions and the following disclaimer. - -2. Redistributions in binary form must reproduce the above copyright notice, - this list of conditions and the following disclaimer in the documentation - and/or other materials provided with the distribution. - -3. Neither the name of the copyright holder nor the names of its - contributors may be used to endorse or promote products derived from - this software without specific prior written permission. - -THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" -AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE -IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE -DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE -FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL -DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR -SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER -CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, -OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE -OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */ - -#include "jointMotionEstimator.h" - -using namespace hpecore; - - -skeleton13 jointMotionEstimator::resetPose(skeleton13 detection) -{ - return detection; -} - -skeleton13 jointMotionEstimator::resetVel() -{ - skeleton13 vel; - for (size_t i = 0; i < vel.size(); i++) - { - vel[i].u = 0; - vel[i].v = 0; - } - return vel; -} - -// Velocity estimation method 2: neighbor events -skeleton13 jointMotionEstimator::estimateVelocity(std::deque evs, std::deque evsTs, int jointName, int nevs, std::deque& vels) -{ - double vx = 0, vy = 0; - int num = 0; - - skeleton13 vel; - for (size_t i = 0; i < vel.size(); i++) - { - vel[i].u = 0; - vel[i].v = 0; - } - - // erase repeated events from the past - for(size_t i=nevs-1; i < evs.size()-1; i++) - { - for(size_t j=i+1; j 0)) - // { - // distMin = dist; - // jmin = j; - // } - // } - // // event-by-event velocity - // double vxE = (x - evs[jmin].u)/(ts - evsTs[jmin]); - // double vyE = (y - evs[jmin].v)/(ts - evsTs[jmin]); - // vx += vxE; - // vy += vyE; - // joint V {vxE, vyE}; - // vels.push_back(V); - // // accumulated velocity - // vx += (x - evs[jmin].u)/(ts - evsTs[jmin]); - // vy += (y - evs[jmin].v)/(ts - evsTs[jmin]); - // num++; - // } - // if(num) // average velocity - // { - // vx /= num; - // vy /= num; - // } - // } - - // Estimation using neighbor events from the past - if(nevs < int(evs.size())) // there are past events - { - for(int i = 0; i < nevs-1; i++) // new events - { - int x = evs[i].u; - int y = evs[i].v; - double ts = evsTs[i]; - double rad = 2.97; - double vxE = 0, vyE = 0; - int numE = 0; - for(std::size_t j = nevs; j < evs.size(); j++) // past events - { - int xp = evs[j].u; - int yp = evs[j].v; - double tsp = evsTs[j]; - double dist = sqrt((xp-x)*(xp-x) + (yp-y)*(yp-y)); - double dt = ts-tsp; - if(dist <= rad && dist > 0) - // if(fabs(x-xp) <= rad && fabs(y-yp) <= rad) - { - vxE += (x-xp)/dt; - vyE += (y-yp)/dt; - numE++; - } - } - // event-by-event velocity - if(numE) - { - vxE /= numE; - vyE /= numE; - vx += vxE; - vy += vyE; - joint V {vxE, vyE}; - vels.push_back(V); - num++; - } - } - if(num) // average velocity - { - vx /= num; - vy /= num; - } - } - - vel[jointName].u = vx; - vel[jointName].v = vy; - - return vel; -} - -// Velocity estimation using past events surfaces -skeleton13 jointMotionEstimator::estimateVelocity(std::deque evs, std::deque evsTs, int jointName, int nevs, std::deque& vels, cv::Mat eros, cv::Mat SAE) -{ - double vx = 0, vy = 0; - int num = 0; - - skeleton13 vel; - for (size_t i = 0; i < vel.size(); i++) - { - vel[i].u = 0; - vel[i].v = 0; - } - int th = 100; - double tFirst = evsTs[0]; - double tLast = evsTs[nevs-1]; - - // std::cout << tFirst << ", " << tLast << std::endl; - - // std::cout<< evs.size() << ", " << nevs << std::endl; - - // Estimation using neighbor events from the past - for(int v = 0; v < nevs; v++) // new events - { - int x = evs[v].u; - int y = evs[v].v; - double ts = evsTs[v]; - double rad = 2; - double vxE = 0, vyE = 0; - int numE = 0; - - for(int i = x-rad; i <= 
x+rad; i++) // past events - { - for(int j = y-rad; j <= y+rad; j++) - { - // float tsp = SAE.at(j, i); - // auto asd = int(eros.at(j, i)); - // std::cerr << x << ", " << y << " @ " << ts << " -> " << i << ", " << j << " @ " << tsp << " -> " << asd; - // std::cout << asd << "\t"; - if(int(eros.at(j, i)) > th && (x!=i || y!=j)) - { - // std::cerr << " * "; - double dt = ts - SAE.at(j, i); - // std::cout << dt << std::endl; - vxE += (x-i)/dt; - vyE += (y-j)/dt; - numE++; - } - // std::cerr << std::endl; - - } - // std::cout << std::endl; - } - // std::cout << numE << std::endl; - - // event-by-event velocity - if(numE > 1) - { - vxE /= numE; - vyE /= numE; - vx += vxE; - vy += vyE; - joint V {vxE, vyE}; - vels.push_back(V); - num++; - } - } - if(num) // average velocity - { - vx /= num; - vy /= num; - } - - vel[jointName].u = vx; - vel[jointName].v = vy; - - return vel; -} - -void jointMotionEstimator::fusion(skeleton13 *pose, skeleton13 dpose, double dt) -{ - for (size_t i = 0; i < (*pose).size(); i++) - { - (*pose)[i].u = (*pose)[i].u + dpose[i].u * dt; - (*pose)[i].v = (*pose)[i].v + dpose[i].v * dt; - } -} - - -// Funtiones using joints instead of skeleton13 - -// Velocity estimation method 2: neighbor events -joint jointMotionEstimator::estimateVelocity(std::deque evs, std::deque evsTs, int nevs, std::deque& vels) -{ - double vx = 0, vy = 0; - int num = 0; - - joint vel ={0, 0}; - - // erase repeated events from the past - for(size_t i=nevs-1; i < evs.size()-1; i++) - { - for(size_t j=i+1; j 0) - // if(fabs(x-xp) <= rad && fabs(y-yp) <= rad) - { - vxE += (x-xp)/dt; - vyE += (y-yp)/dt; - numE++; - } - } - // std::cout << numE << std::endl; - // event-by-event velocity - if(numE) - { - vxE /= numE; - vyE /= numE; - vx += vxE; - vy += vyE; - joint V {vxE, vyE}; - vels.push_back(V); - num++; - } - } - if(num) // average velocity - { - vx /= num; - vy /= num; - } - } - - vel.u = vx; - vel.v = vy; - - return vel; -} - - -// Velocity estimation using past events surfaces -joint jointMotionEstimator::estimateVelocity(std::deque evs, std::deque evsTs, int nevs, std::deque& vels, cv::Mat eros, cv::Mat SAE) -{ - double vx = 0, vy = 0; - int num = 0; - - joint vel ={0, 0}; - - int th = 100; - double tFirst = evsTs[0]; - double tLast = evsTs[nevs-1]; - - // std::cout << tFirst << ", " << tLast << std::endl; - - // std::cout<< evs.size() << ", " << nevs << std::endl; - - // Estimation using neighbor events from the past - for(int v = 0; v < nevs; v++) // new events - { - int x = evs[v].u; - int y = evs[v].v; - double ts = evsTs[v]; - double rad = 2; - double vxE = 0, vyE = 0; - int numE = 0; - - for(int i = x-rad; i <= x+rad; i++) // past events - { - for(int j = y-rad; j <= y+rad; j++) - { - // float tsp = SAE.at(j, i); - // auto asd = int(eros.at(j, i)); - // std::cerr << x << ", " << y << " @ " << ts << " -> " << i << ", " << j << " @ " << tsp << " -> " << asd; - // std::cout << asd << "\t"; - if(int(eros.at(j, i)) > th && (x!=i || y!=j)) - { - // std::cerr << " * "; - double dt = ts - SAE.at(j, i); - // std::cout << dt << std::endl; - vxE += (x-i)/dt; - vyE += (y-j)/dt; - numE++; - } - // std::cerr << std::endl; - - } - // std::cout << std::endl; - } - // std::cout << numE << std::endl; - - // event-by-event velocity - if(numE > 0) - { - vxE /= numE; - vyE /= numE; - vx += vxE; - vy += vyE; - joint V {vxE, vyE}; - vels.push_back(V); - num++; - } - } - if(num) // average velocity - { - vx /= num; - vy /= num; - } - - if(num > 2) - { - vel.u = vx; - vel.v = vy; - } - - return vel; -} - -void 
jointMotionEstimator::fusion(joint *pose, joint dpose, double dt) -{ - (*pose).u = (*pose).u + dpose.u * dt; - (*pose).v = (*pose).v + dpose.v * dt; -} - -// Functions no longer being used -// Velocity estimation method 1: time diff on adjacent events -skeleton13 jointMotionEstimator::method1(std::deque evs, std::deque evsTs, int jointName, int nevs, std::deque& vels) -{ - skeleton13 vel; - for (size_t i = 0; i < vel.size(); i++) - { - vel[i].u = 0; - vel[i].v = 0; - } - // erase repeated events - for(size_t i=0; i < evs.size()-1; i++) - { - for(size_t j=i+1; j lim) vel[jointName].u = 0; - // if(std::fabs(vel[jointName].v) > lim) vel[jointName].v = 0; - - // Arren'sway - // if(nx == 0 && ny ==0) { - // dtx = 1e6; dty = 1e6; - // } - // if(nx) dtx /= nx; - // if(ny) dty /= ny; - // double dtdp = sqrt(dtx*dtx + dty*dty); - // double speed = 1.0 / dtdp; - // double angle = atan2(dty, dtx); - // vel[jointName].u = speed * cos(angle); - // vel[jointName].v = speed * sin(angle); - - return vel; -} - - -// Estimated time of fire functions -void jointMotionEstimator::estimateFire(std::deque evs, std::deque evsTs, std::deque evsPol, int jointName, int nevs, skeleton13 pose, skeleton13 dpose, double **Te, cv::Mat matTe) -{ - double du = dpose[jointName].u; - double dv = dpose[jointName].v; - double den = du*du + dv*dv + 1e-10; - double duI = du/den; - double dvI = dv/den; - - int dimY = 240, dimX = 346; - // for (int i = 0; i < dimY; i++) - // { - // for (int j = 0; j < dimX; j++) - // Te[i][j] = 0 ; - // } - - // Use actual events to compute Te - /* - for(std::size_t k = nevs; k < evs.size(); k++) - - { - int x = evs[k].u; - int y = evs[k].v; - double ts = evsTs[k]; - // // int value = evsPol[i]; - for (int i = 0; i < dimY; i++) - { - for (int j = 0; j < dimX; j++) - { - double prod = duI * (j-x) + dvI * (i-y) + ts; - // if(prod > Te[i][j] || fabs(Te[i][j]) > 1e5) - if(prod > 0) - { - Te[i][j] = prod; - } - } - } - } - // for (int i = 0; i < dimY; i++) - // { - // for (int j = 0; j < dimX; j++) - // Te[i][j] = Te[i][j]/(evs.size()-nevs); - // } - */ - - - // Test: create simple edge to check the gradient - std::deque evs2; - int x0 = pose[jointName].u; - int y0 = pose[jointName].v; - for(int i=x0-10; i b-eps && a < b+eps)) - if((prod < Te[i][j] || Te[i][j] == 0) && (a == b || (a > b-eps && a < b+eps)) && (j-x)*du>0) - { - Te[i][j] = prod; - } - } - } - - } - // search max - double max = Te[0][0]; - for (int i = 0; i < dimY; i++) - { - for (int j = 0; j < dimX; j++) - { - if(Te[i][j] > max) - max = Te[i][j]; - } - } - // substract max where non zero - for (int i = 0; i < dimY; i++) - { - for (int j = 0; j < dimX; j++) - { - if(Te[i][j] > 0) - Te[i][j] = max - Te[i][j]; - } - } - // cv mat - for (int i = 0; i < dimY-1; i++) - { - for (int j = 0; j < dimX-1; j++) - { - matTe.at(i, j) = float(Te[i][j]); - } - } - cv::normalize(matTe,matTe, 0, 1, cv::NORM_MINMAX); - -} - -double jointMotionEstimator::getError(std::deque evs, std::deque evsTs, std::deque evsPol, int jointName, int nevs, skeleton13 pose, skeleton13 dpose, double **Te, cv::Mat matTe) -{ - double sum = 0; - int cont = 0; - int dimY = 240, dimX = 346; - - int end = (nevs < int(evs.size())) ? nevs : evs.size(); - for(int k = 0; k < end; k++) - { - int x = evs[k].u; - int y = evs[k].v; - double ts = evsTs[k]; - if(x >= 0 && x= 0 && y evs, int jointName, int nevs, skeleton13 pose) -{ - double sumX = 0, sumY = 0; - int cont = 0; - int dimY = 240, dimX = 346; - joint center = {0, 0}; - - - int end = (nevs < int(evs.size())) ? 
nevs : evs.size(); - for(int k = 0; k < end; k++) - { - int x = evs[k].u; - int y = evs[k].v; - if(x >= 0 && x= 0 && y -#include -#include "utility.h" -#include -#include -#include - -namespace hpecore -{ - class jointMotionEstimator - { - - public: - - skeleton13 resetPose(skeleton13 detection); - skeleton13 resetVel(); - skeleton13 estimateVelocity(std::deque evs, std::deque evsTs, int jointName, int nevs, std::deque& vels); - skeleton13 estimateVelocity(std::deque evs, std::deque evsTs, int jointName, int nevs, std::deque& vels, cv::Mat eros, cv::Mat timeSurf); - void fusion(skeleton13 *pose, skeleton13 dpose, double dt); - - // Funtiones using joints instead of skeleton13 - joint estimateVelocity(std::deque evs, std::deque evsTs, int nevs, std::deque& vels); - joint estimateVelocity(std::deque evs, std::deque evsTs, int nevs, std::deque& vels, cv::Mat eros, cv::Mat timeSurf); - void fusion(joint *pose, joint dpose, double dt); - - // Functions no longer being used - // method one for velocity estimation - skeleton13 method1(std::deque evs, std::deque evsTs, int jointName, int nevs, std::deque& vels); - // Estimated time of fire functions - void estimateFire(std::deque evs, std::deque evsTs, std::deque evsPol, int jointName, int nevs, skeleton13 pose, skeleton13 dpose, double **Te, cv::Mat matTe); - double getError(std::deque evs, std::deque evsTs, std::deque evsPol, int jointName, int nevs, skeleton13 pose, skeleton13 dpose, double **Te, cv::Mat matTe); - joint centerMass(std::deque evs, int jointName, int nevs, skeleton13 pose); - skeleton13 setVel(int jointName, skeleton13 dpose, double dx, double dy, double err); - - }; -} \ No newline at end of file From d195a28740bd3925a60eecf4d9d3c6790c7ea7ae Mon Sep 17 00:00:00 2001 From: arrenglover Date: Tue, 7 Feb 2023 10:09:11 +0000 Subject: [PATCH 10/23] openposethread in hpecore --- core/detection_wrappers/openpose_detector.cpp | 90 +++++++++++++++++-- core/detection_wrappers/openpose_detector.h | 24 +++++ 2 files changed, 109 insertions(+), 5 deletions(-) diff --git a/core/detection_wrappers/openpose_detector.cpp b/core/detection_wrappers/openpose_detector.cpp index 22debea2..c9099a5e 100644 --- a/core/detection_wrappers/openpose_detector.cpp +++ b/core/detection_wrappers/openpose_detector.cpp @@ -38,10 +38,11 @@ bool OpenPoseDetector::init(std::string models_path, std::string pose_model, std // description of detector's parameters can be found at // https://github.com/CMU-Perceptual-Computing-Lab/openpose/blob/master/include/openpose/flags.hpp const auto poseMode = op::flagsToPoseMode(1); // body keypoints detection - const auto poseModel = op::flagsToPoseModel(op::String(pose_model)); // 'BODY_25', 25 keypoints, fastest with CUDA, most accurate, includes foot keypoints - // 'COCO', 18 keypoints - // 'MPI', 15 keypoints, least accurate model but fastest on CPU - // 'MPI_4_layers', 15 keypoints, even faster but less accurate + const auto poseModel = op::flagsToPoseModel(op::String(pose_model)); + // 'BODY_25', 25 keypoints, fastest with CUDA, most accurate, includes foot keypoints + // 'COCO', 18 keypoints + // 'MPI', 15 keypoints, least accurate model but fastest on CPU + // 'MPI_4_layers', 15 keypoints, even faster but less accurate // TODO: set poseJointsNum (get number of joints from openpose mapping) @@ -148,4 +149,83 @@ skeleton13 OpenPoseDetector::detect(cv::Mat &input) pose13 = hpecore::coco18_to_dhp19(pose); return pose13; -} \ No newline at end of file +} + + +void openposethread::run() +{ + while (true) + { + m.lock(); + if 
(stop) + return; + auto t0 = std::chrono::high_resolution_clock::now(); + pose.pose = detop.detect(image); + auto t1 = std::chrono::high_resolution_clock::now(); + std::chrono::microseconds ms = std::chrono::duration_cast(t1 - t0); + latency = std::chrono::duration(ms).count() * 1e3; + data_ready = true; + } +} + + +bool openposethread::init(std::string model_path, std::string model_name, std::string model_size) +{ + // initialise open pose + if (!detop.init(model_path, model_name, model_size)) + return false; + + // make sure the thread won't start until an image is provided + m.lock(); + + // make sure that providing an image will start things for the first go + data_ready = true; + + // start the thread + th = std::thread([this] + { this->run(); }); + + return true; +} + +void openposethread::close() +{ + stop = true; + m.try_lock(); + m.unlock(); +} + +bool openposethread::update(cv::Mat next_image, double image_timestamp, hpecore::stampedPose &previous_result) +{ + // if no data is ready (still processing) do nothing + if (!data_ready) + return false; + + // else set the result to the provided stampedPose + previous_result = pose; + + // set the timestamp + pose.timestamp = image_timestamp; + + // and the image for the next detection + static cv::Mat img_u8, img_float; + next_image.copyTo(img_float); + double min_val, max_val; + cv::minMaxLoc(img_float, &min_val, &max_val); + max_val = std::max(fabs(max_val), fabs(min_val)); + img_float /= (2 * max_val); + img_float += 0.5; + img_float.convertTo(img_u8, CV_8U, 255, 0); + cv::cvtColor(img_u8, image, cv::COLOR_GRAY2BGR); + + // and unlock the procesing thread + m.try_lock(); + m.unlock(); + data_ready = false; + return true; +} + +double openposethread::getLatency() +{ + return latency; +} diff --git a/core/detection_wrappers/openpose_detector.h b/core/detection_wrappers/openpose_detector.h index a2f5532e..a90cca24 100644 --- a/core/detection_wrappers/openpose_detector.h +++ b/core/detection_wrappers/openpose_detector.h @@ -46,4 +46,28 @@ class OpenPoseDetector { void stop(); }; +class openposethread +{ +private: + std::thread th; + hpecore::OpenPoseDetector detop; + hpecore::stampedPose pose{0.0, -1.0, 0.0}; + cv::Mat image; + double latency = 0; + + bool stop{false}; + bool data_ready{true}; + std::mutex m; + + void run(); + +public: + bool init(std::string model_path, std::string model_name, + std::string model_size = "256"); + void close(); + bool update(cv::Mat next_image, double image_timestamp, + hpecore::stampedPose &previous_result); + double getLatency(); +}; + } \ No newline at end of file From 1a88c9eef1a8c5282c167378a7391eb069e462b3 Mon Sep 17 00:00:00 2001 From: arrenglover Date: Tue, 7 Feb 2023 10:17:11 +0000 Subject: [PATCH 11/23] removed old evaluation files --- .../utils}/genYarpImages.py | 0 evaluation/OP_EVALUATION.md | 73 --- evaluation/dhp19/load_dhp19.py | 447 ------------------ evaluation/example_evaluation.py | 58 --- evaluation/plot_circle.py | 151 ------ evaluation/test_eval.py | 148 ------ evaluation/test_single_joint.py | 114 ----- 7 files changed, 991 deletions(-) rename {evaluation => datasets/utils}/genYarpImages.py (100%) delete mode 100644 evaluation/OP_EVALUATION.md delete mode 100755 evaluation/dhp19/load_dhp19.py delete mode 100644 evaluation/example_evaluation.py delete mode 100755 evaluation/plot_circle.py delete mode 100755 evaluation/test_eval.py delete mode 100755 evaluation/test_single_joint.py diff --git a/evaluation/genYarpImages.py b/datasets/utils/genYarpImages.py similarity index 100% 
rename from evaluation/genYarpImages.py rename to datasets/utils/genYarpImages.py diff --git a/evaluation/OP_EVALUATION.md b/evaluation/OP_EVALUATION.md deleted file mode 100644 index b6e46cfd..00000000 --- a/evaluation/OP_EVALUATION.md +++ /dev/null @@ -1,73 +0,0 @@ -## OpenPose Evaluation on DHP19 -Evaluation of OpenPose on DHP19 can be done using the following steps - - -### 1. Create Grayscale Frames -- Follow instructions in file `/hpe-core/evaluation/dhp19/e2vid/README.MD` to create grayscale frames from preprocessed -DHP19 events. - - -### 2. Extract 2D Ground-Truth Poses -- After downloading the `.mat` files for the events and the Vicon data, run - ```shell - $ cd hpe-core/evaluation/dhp19 - $ python export_gt_to_numpy.py - -e /path/to/events/files/S%d_%d_%d.mat - -v /path/to/vicon/files/S%d_%d_%d.mat - -c - -p /path/to/P_matrices/P%d.npy - -o /path/to/output_folder - -td - ``` - - where - * `-e` and `-v` specify the paths to the files containing the preprocessed events and vicon data, respectively - * `-c` is the id of the camera (in range [0, 3]) - * `-p` is the path to the projection matrix file used to project 3D poses to 2D (cam_id_0->P4.npy, cam_id_1->P1.npy, cam_id_2->P3.npy, cam_id_3->P2.npy) - * `-o` is the path to the output folder. - - -### 3. Predict 2D Poses with OpenPose -- Download the Dockerfile from the [EDPR-APRIL OpenPose repository](https://github.com/event-driven-robotics/EDPR-APRIL/tree/openpose-yarp-docker) -and build the Docker image - ```shell - $ cd /path/to/dockerfile - $ docker build -t op-yarp - < Dockerfile - ``` -- Run the Docker container and, inside it, run the pose detector - ```shell - $ xhost + - $ docker run -it -v /tmp/.X11-unix/:/tmp/.X11-unix -v /path/to/grayscale/frames:/data -e DISPLAY=unix$DISPLAY --runtime=nvidia - $ ./pose_detector_json - -input_folder /path/to/grayscale/frames - -output_folder /path/to/output/folder - --write_json /path/to/output/folder - --model-folder /openpose/models - ``` -OpenPose will create in `/path/to/output/folder` one `.json` file for each input frame. - - -### 4. Evaluate OpenPose -Once all previous steps have been completed, there should be -- one folder with `n` grayscale frames generated by E2Vid -- one folder with `n` `.json` pose files created by OpenPose -- one file `.npy` file containing `n` ground-truth poses extracted from DHP19 -with `n` varying according to the length of the DHP19 recording (see [DHP19's website](https://sites.google.com/view/dhp19/home)). - -Run -```shell -$ cd hpe-core/evaluation -$ python example_evaluation.py - -gt /path/to/recording/gt/poses.npy - -p /path/to/json/files - -if /path/to/grayscale/frames - -o /path/to/output/folder -``` -where -* `-gt` is the path to the .npy -* `-p` is the path to the json files folder -* `-if` is the path to the grayscale frames -* `-o` is the path to the output folder. - -The script will plot ground-truth (in green) and predicted (in blue) poses on the input grayscale frames and will print -the computed Mean Per Joint Position Error (MPJPE) metric in the terminal. 
\ No newline at end of file diff --git a/evaluation/dhp19/load_dhp19.py b/evaluation/dhp19/load_dhp19.py deleted file mode 100755 index 2a6f326f..00000000 --- a/evaluation/dhp19/load_dhp19.py +++ /dev/null @@ -1,447 +0,0 @@ -#!/usr/bin/env python3 -# -*- coding: utf-8 -*- -""" -Copyright (C) 2021 Event-driven Perception for Robotics -Author: Franco Di Pietro - -LICENSE GOES HERE -""" - -# %% Preliminaries -import math -import numpy as np -import os -import sys - -from os.path import join - -import datasets.utils.mat_files as mat_utils - -# Load env variables set on .bashrc -bimvee_path = os.environ.get('BIMVEE_PATH') -mustard_path = os.environ.get('MUSTARD_PATH') - -# Add local paths -sys.path.insert(0, bimvee_path) -sys.path.insert(0, mustard_path) - -# Directory with DVS (after Matlab processing) and Vicon Data -datadir = '/data/dhp19' - -# Selected recording -subj, sess, mov = 1, 1, 1 -datafile = 'S{}_{}_{}'.format(subj, sess, mov) + '.mat' - -# %% Load DVS data -DVS_dir = join(datadir, 'events_preprocessed') -dataDvs = mat_utils.loadmat(join(DVS_dir, datafile)) - -# Build container -info = {} -# info['filePathOrName'] = '' -container = {} -container['info'] = info -container['data'] = {} -startTime = dataDvs['out']['extra']['startTime'] -for i in range(4): - container['data']['ch' + str(i)] = dataDvs['out']['data']['cam' + str(i)] - container['data']['ch' + str(i)]['dvs']['x'] = container['data']['ch' + str(i)]['dvs']['x'] - 1 - 346 * i - container['data']['ch' + str(i)]['dvs']['y'] = container['data']['ch' + str(i)]['dvs']['y'] - 1 - container['data']['ch' + str(i)]['dvs']['ts'] = (container['data']['ch' + str(i)]['dvs']['ts'] - startTime) * 1e-6 - container['data']['ch' + str(i)]['dvs']['pol'] = np.array(container['data']['ch' + str(i)]['dvs']['pol'], - dtype=bool) - -# %% Load Vicon data -Vicon_dir = join(datadir, 'vicon') -dataVicon = mat_utils.loadmat(join(Vicon_dir, datafile)) - -# dt = (dataDvs['out']['extra']['ts'][-1]-startTime)/np.shape(dataVicon['XYZPOS']['head'])[0] -dt = 10000 -thz = np.arange(dataDvs['out']['extra']['ts'][0] - startTime, dataDvs['out']['extra']['ts'][-1] - startTime + dt, - dt) * 1e-6 # Vicon timestams @ 100Hz -diff = len(thz) - dataVicon['XYZPOS']['head'].shape[0] -if diff > 0: - thz = thz[:-diff] - -# %% Vicon 3D -> 2D -# % Load P Matrix -P_mat_dir = join(datadir, 'P_matrices/') - -# constant parameters -H = 260 -W = 344 -num_joints = 13 - -# % Load camera matrices, camera centers, input and label files, and import CNN mode -P_mat_cam1 = np.load(join(P_mat_dir, 'P1.npy')) -P_mat_cam2 = np.load(join(P_mat_dir, 'P2.npy')) -P_mat_cam3 = np.load(join(P_mat_dir, 'P3.npy')) -P_mat_cam4 = np.load(join(P_mat_dir, 'P4.npy')) -P_mats = [P_mat_cam1, P_mat_cam2, P_mat_cam3, P_mat_cam4] -cameras_pos = np.load(join(P_mat_dir, 'camera_positions.npy')) - - -def get_all_joints(viconData, idx): - viconMat = np.zeros([13, 3], dtype=viconData['XYZPOS']['head'].dtype) - for i in range(0, len(viconData['XYZPOS'])): - viconMat[i, :] = viconData['XYZPOS'][list(viconData['XYZPOS'])[i]][idx] - return viconMat - - -def get_2Dcoords_and_heatmaps_label(vicon_xyz, ch_idx): - # " From 3D label, get 2D label coordinates and heatmaps for selected camera " - if ch_idx == 1: - P_mat_cam = np.load(join(P_mat_dir, 'P1.npy')) - elif ch_idx == 3: - P_mat_cam = np.load(join(P_mat_dir, 'P2.npy')) - elif ch_idx == 2: - P_mat_cam = np.load(join(P_mat_dir, 'P3.npy')) - elif ch_idx == 0: - P_mat_cam = np.load(join(P_mat_dir, 'P4.npy')) - # use homogeneous coordinates representation to project 3d 
XYZ coordinates to 2d UV pixel coordinates. - vicon_xyz_homog = np.concatenate([vicon_xyz, np.ones([1, 13])], axis=0) - coord_pix_homog = np.matmul(P_mat_cam, vicon_xyz_homog) - coord_pix_homog_norm = coord_pix_homog / coord_pix_homog[-1] - u = coord_pix_homog_norm[0] - v = H - coord_pix_homog_norm[1] # flip v coordinate to match the image direction - # mask is used to make sure that pixel positions are in frame range. - mask = np.ones(u.shape).astype(np.float32) - mask[np.isnan(u)] = 0 - mask[np.isnan(v)] = 0 - mask[u > W] = 0 - mask[u <= 0] = 0 - mask[v > H] = 0 - mask[v <= 0] = 0 - # pixel coordinates - u = u.astype(np.int32) - v = v.astype(np.int32) - return np.stack((v, u), axis=-1), mask - - -size = dataVicon['XYZPOS']['head'].shape[0] -dataType = dataVicon['XYZPOS']['head'].dtype -joints = {} -for ch in range(4): - joints['ch' + str(ch)] = {} - for key in dataVicon['XYZPOS']: - joints['ch' + str(ch)][key] = np.empty((size, 2), dtype='uint16') - for i in range(len(thz)): # CHECK -1 - viconMat = get_all_joints(dataVicon, i) - y_2d, gt_mask = get_2Dcoords_and_heatmaps_label(np.transpose(viconMat), ch) - y_2d_float = y_2d.astype(np.uint16) - for j in range(13): - joints['ch' + str(ch)][list(joints['ch' + str(ch)])[j]][i, 1] = y_2d_float[j, 0].astype(np.uint16) - joints['ch' + str(ch)][list(joints['ch' + str(ch)])[j]][i, 0] = y_2d_float[j, 1].astype(np.uint16) - joints['ch' + str(ch)]['ts'] = thz - -# %% Build DVS+GT container -for ch in range(4): - # Change dvs data to hpe - container['data']['ch' + str(ch)]['hpe'] = container['data']['ch' + str(ch)].pop('dvs') - # Add GT data - container['data']['ch' + str(ch)]['hpe']['skeleton'] = {} - container['data']['ch' + str(ch)]['hpe']['skeleton']['gt'] = joints['ch' + str(ch)] - -# %% Plot joints vs t -import matplotlib.pyplot as plt - -plt.close('all') -fig, ax = plt.subplots(2) -ch = 3 # choose cahnnel to plot -l = joints['ch' + str(ch)]['head'].shape[0] -for j in range(13): - ax[0].plot(thz[0:l], joints['ch' + str(ch)][list(joints['ch' + str(ch)])[j]][:, 0], ) - ax[0].set_ylabel('x coordinate [px]', fontsize=12, labelpad=5) - ax[1].plot(thz[0:l], joints['ch' + str(ch)][list(joints['ch' + str(ch)])[j]][:, 1]) - ax[1].set_ylabel('y coordinate [px]', fontsize=12, labelpad=5) - -for i in range(2): - ax[i].legend(list(joints['ch' + str(ch)]), loc='upper right', fontsize=11) - ax[i].set_xlabel('time [sec]', fontsize=12, labelpad=-5) - ax[i].grid() -fig.suptitle('Ground truth 2D position for ch' + str(ch) + ' camera', fontsize=18) - -# figManager = plt.get_current_fig_manager() -# figManager.window.showMaximized() -plt.show() - -# %% Plot events (RoI) + GT (2 Roi values) -import matplotlib.pyplot as plt - -plt.close('all') -fig, ax = plt.subplots(2, 2, sharex=True) - -# choose joint and camera cahnnel to plot -ch = 3 -joint = 'handL' -# set RoI values -RoI = [2, 4] -# set plot markers -plotMarkers = True - - -def findNearest(array, value): - idx = np.searchsorted(array, value) # side="left" param is the default - if idx > 0 and ( - idx == len(array) or - math.fabs(value - array[idx - 1]) < math.fabs(value - array[idx])): - return idx - 1 - else: - return idx - - -# evs -t = container['data']['ch' + str(ch)]['hpe']['ts'] -x = container['data']['ch' + str(ch)]['hpe']['x'] -y = container['data']['ch' + str(ch)]['hpe']['y'] -# extract events inside roi(t) -for idx, roi in enumerate(RoI): - prev = -1 - xev = np.array([]) - yev = np.array([]) - tev = np.array([]) - for i in range(0, len(thz)): - if i > 0: - t2 = findNearest(t, thz[i]) - 1 - else: - t2 = 
0 - auxX = np.array(x[prev:t2]) - auxY = np.array(y[prev:t2]) - auxT = np.array(t[prev:t2]) - condX = np.logical_and(auxX > joints['ch' + str(ch)][joint][i, 0] - roi, - auxX < joints['ch' + str(ch)][joint][i, 0] + roi) - condY = np.logical_and(auxY > joints['ch' + str(ch)][joint][i, 1] - roi, - auxY < joints['ch' + str(ch)][joint][i, 1] + roi) - cond = np.logical_and(condX, condY) - xev = np.append(xev, auxX[cond]) - yev = np.append(yev, auxY[cond]) - tev = np.append(tev, auxT[cond]) - prev = t2 - if plotMarkers: - # With markers - ax[0, idx].plot(tev, xev, color='red', marker=".") - ax[1, idx].plot(tev, yev, color='red', marker=".") - ax[0, idx].plot(thz, joints['ch' + str(ch)][joint][:, 0], marker="D", markersize=12) - ax[1, idx].plot(thz, joints['ch' + str(ch)][joint][:, 1], marker="D", markersize=12) - else: - # Without markers - ax[0, idx].plot(tev, xev, color='red') - ax[1, idx].plot(tev, yev, color='red') - ax[0, idx].plot(thz, joints['ch' + str(ch)][joint][:, 0]) - ax[1, idx].plot(thz, joints['ch' + str(ch)][joint][:, 1]) - ax[0, idx].set_ylabel('x coordinate [px]', fontsize=12, labelpad=5) - ax[1, idx].set_ylabel('y coordinate [px]', fontsize=12, labelpad=5) - ax[0, idx].set_title('RoI = ' + str(roi), fontsize=20) - -figManager = plt.get_current_fig_manager() -figManager.window.showMaximized() -plt.show() - -# %% Plot events (RoI) + GT (FILTER) -import math -import matplotlib.pyplot as plt - -plt.close('all') -fig, ax = plt.subplots(2, sharex=True) -# choose joint and camera cahnnel to plot -ch = 3 -joint = 'handL' -# set RoI values -roi = 5 -# set order of the polynomial used for filter -order = 3 - - -def findNearest(array, value): - idx = np.searchsorted(array, value) # side="left" param is the default - if idx > 0 and ( - idx == len(array) or - math.fabs(value - array[idx - 1]) < math.fabs(value - array[idx])): - return idx - 1 - else: - return idx - - -# evs -t = container['data']['ch' + str(ch)]['hpe']['ts'] -x = container['data']['ch' + str(ch)]['hpe']['x'] -y = container['data']['ch' + str(ch)]['hpe']['y'] -# extract events inside roi(t) -prev = -1 -xev = np.array([]) -yev = np.array([]) -tev = np.array([]) -avg = 0 -for i in range(0, len(thz)): - if i > 0: - t2 = findNearest(t, thz[i]) - 1 - else: - t2 = 0 - auxX = np.array(x[prev:t2]) - auxY = np.array(y[prev:t2]) - auxT = np.array(t[prev:t2]) - condX = np.logical_and(auxX > joints['ch' + str(ch)][joint][i, 0] - roi, - auxX < joints['ch' + str(ch)][joint][i, 0] + roi) - condY = np.logical_and(auxY > joints['ch' + str(ch)][joint][i, 1] - roi, - auxY < joints['ch' + str(ch)][joint][i, 1] + roi) - cond = np.logical_and(condX, condY) - xev = np.append(xev, auxX[cond]) - yev = np.append(yev, auxY[cond]) - tev = np.append(tev, auxT[cond]) - prev = t2 - avg += len(auxT[cond]) -avg = int(avg / len(thz)) * 10 -print('Average num evs per vicon sample = ', avg) -if avg % 2 == 0: - avg += 1 - -# Without markers -# ax[0].plot(tev, xev, color='red') -# # ax[1].plot(tev, yev, color='red') -# # GT -# ax[0].plot(thz, joints['ch'+str(ch)][joint][:,0]) -# ax[0].set_ylabel('x coordinate [px]', fontsize=12, labelpad=5) -# ax[1].plot(thz, joints['ch'+str(ch)][joint][:,1]) -# ax[1].set_ylabel('y coordinate [px]', fontsize=12, labelpad=5) -# # With big markers -# ax[0].plot(tev, xev, color='red',marker = ".") -# ax[1].plot(tev, yev, color='red',marker = ".") -# # GT -# ax[0].plot(thz, joints['ch'+str(ch)][joint][:,0],marker = "D", markersize=12) -# ax[0].set_ylabel('x coordinate [px]', fontsize=12, labelpad=5) -# ax[1].plot(thz, 
joints['ch'+str(ch)][joint][:,1],marker = "D", markersize=12) -# ax[1].set_ylabel('y coordinate [px]', fontsize=12, labelpad=5) - - -# Filter -from scipy.signal import savgol_filter # Savitzky-Golay filter - -# x-axis -ax[0].plot(tev, xev, color='red', marker=".", label='raw events') -ax[0].plot(thz, joints['ch' + str(ch)][joint][:, 0], marker="D", markersize=12, label='GT') -ax[0].set_ylabel('x coordinate [px]', fontsize=12, labelpad=5) -xS = savgol_filter(xev, avg, order) -ax[0].plot(tev, xS, color='blue', marker="o", label='filtered events') - -# y-axis -ax[1].plot(tev, yev, color='red', marker=".") -ax[1].plot(thz, joints['ch' + str(ch)][joint][:, 1], marker="D", markersize=12) -ax[1].set_ylabel('y coordinate [px]', fontsize=12, labelpad=5) -yS = savgol_filter(yev, avg, order) -ax[1].plot(tev, yS, color='blue', marker="o") - -lines_labels = [ax.get_legend_handles_labels() for ax in fig.axes] -lines, labels = [sum(lol, []) for lol in zip(*lines_labels)] - -fig.legend(lines, labels, loc=1, prop={'size': 16}) -figManager = plt.get_current_fig_manager() -figManager.window.showMaximized() -plt.show() - -# %% Plot GT 10Hz -import matplotlib.pyplot as plt - -plt.close('all') -fig, ax = plt.subplots(2, sharex=True) -# choose joint and camera cahnnel to plot -ch = 3 -joint = 'handL' - -f = 10 -T = int(100 / f) -gtf = joints['ch' + str(ch)][joint][::T, :] -tf = thz[::T] - -# x-axis -ax[0].plot(tf, gtf[:, 0], color='red', marker="o", markersize=12, label='GT') -ax[0].plot(thz, joints['ch' + str(ch)][joint][:, 0], marker="*", markersize=8, label='sampled GT') -ax[0].set_ylabel('x coordinate [px]', fontsize=14, labelpad=5) -ax[0].set_title('Left hand joint - Sampling freq = ' + str(f) + ' Hz', fontsize=18) -# inset axes -axins = ax[0].inset_axes([0.5, 0.1, 0.45, 0.8]) -# sub region of the original image -x1, x2, y1, y2 = 22, 25, 168, 215 -axins.set_xlim(x1, x2) -axins.set_ylim(y1, y2) -axins.set_xticklabels('') -axins.set_yticklabels('') -axins.plot(tf, gtf[:, 0], color='red', marker="o", markersize=12) -axins.plot(thz, joints['ch' + str(ch)][joint][:, 0], marker="*", markersize=8) -axins.set_facecolor('gainsboro') -ax[0].indicate_inset_zoom(axins, edgecolor="black", label='_nolegend_') - -# y-axis -ax[1].plot(tf, gtf[:, 1], color='red', marker="o", markersize=12) -ax[1].plot(thz, joints['ch' + str(ch)][joint][:, 1], marker="*", markersize=8) -ax[1].set_ylabel('y coordinate [px]', fontsize=14, labelpad=5) -ax[1].set_xlabel('time [sec]', fontsize=14) -axins = ax[1].inset_axes([0.5, 0.15, 0.45, 0.8]) -y1, y2 = 115, 150 -axins.set_xlim(x1, x2) -axins.set_ylim(y1, y2) -axins.set_xticklabels('') -axins.set_yticklabels('') -axins.plot(tf, gtf[:, 1], color='red', marker="o", markersize=12) -axins.plot(thz, joints['ch' + str(ch)][joint][:, 1], marker="*", markersize=8) -axins.set_facecolor('gainsboro') -ax[1].indicate_inset_zoom(axins, edgecolor="black", label='_nolegend_') - -# global legend -lines_labels = [ax.get_legend_handles_labels() for ax in fig.axes] -lines, labels = [sum(lol, []) for lol in zip(*lines_labels)] -fig.legend(lines, labels, loc=1, prop={'size': 16}) -figManager = plt.get_current_fig_manager() -figManager.window.showMaximized() -plt.show() -plt.tight_layout() - -# %% Generate a second skeleton with noise to see if comparisson works -import copy - -noisySkt = copy.deepcopy(joints) -for ch in range(4): - for key in noisySkt['ch' + str(ch)]: - if key != 'ts': - noise = np.int_(np.random.normal(0, 1, noisySkt['ch' + str(ch)][key].shape)) - noisySkt['ch' + str(ch)][key] = noisySkt['ch' + 
str(ch)][key] + noise - -for ch in range(4): - container['data']['ch' + str(ch)]['hpe']['skeleton']['test'] = noisySkt['ch' + str(ch)] - -# %% Container with just one channel -contCh3 = {} -contCh3['info'] = container['info'] -contCh3['data'] = {} -contCh3['data']['ch3'] = container['data']['ch3'] - -# %% Start Mustard -cwd = os.getcwd() - -import threading -import mustard - -app = mustard.Mustard() -thread = threading.Thread(target=app.run) -thread.daemon = True -thread.start() - -# %% Once mustard is open, undo the change of working directory -os.chdir(cwd) - -# %% Visualize data -app.setData(container) -# app.setData(contCh3) - - -# %% Export to yarp (only DVS) -from bimvee import exportIitYarp - -datafile = 'S{}_{}_{}'.format(subj, sess, mov) - -# exportIitYarp.exportIitYarp(container, -# exportFilePath= '/home/fdipietro/hpe-data/yarp/'+datafile, -# pathForPlayback= '/home/fdipietro/hpe-data/yarp/'+datafile, -# protectedWrite = False) -exportIitYarp.exportIitYarp(container, - exportFilePath='/home/fdipietro/hpe-data/yarp/' + datafile, - protectedWrite=False) diff --git a/evaluation/example_evaluation.py b/evaluation/example_evaluation.py deleted file mode 100644 index b21a11b0..00000000 --- a/evaluation/example_evaluation.py +++ /dev/null @@ -1,58 +0,0 @@ - -import argparse -import cv2 -import numpy as np -import os - -from pathlib import Path - -from utils import metrics, plots -from utils import parse_openpose_keypoints_json -from datasets.dhp19.utils import DHP19_BODY_PARTS, openpose_to_dhp19 - - -# def evaluate_openpose(): -# pass - - -if __name__ == '__main__': - parser = argparse.ArgumentParser(description='Evaluate OpenPose on DHP19, plotting predicted and ground-truth poses \ - on input frames and computing Mean Per Joint Position Error metric') - parser.add_argument('-gt', '--ground_truth_path', help='Path to the .npy file containing the ground truth poses for every frame (computed using dhp19/extract_gt_poses.py)', required=True) - parser.add_argument('-p', '--predictions_folder', help='Path to the folder containing the .json files (one for each frame) with predicted poses (computed using OpenPose)', required=True) - parser.add_argument('-if', '--images_folder', help='Path to the folder containing the image frames') - parser.add_argument('-o', '--output_folder', help='Path to the folder where evaluation results will be saved', required=True) - args = parser.parse_args() - - # read data - poses_gt = np.load(args.ground_truth_path) - poses_pred_files = sorted(Path(args.predictions_folder).glob('*.json')) - image_files = sorted(Path(args.images_folder).glob('*.png')) - - # check if the number of images, gt and predicted poses is the same - assert len(poses_gt) == len(poses_pred_files) - assert len(poses_pred_files) == len(image_files) - - # read and preprocess predicted poses - poses_pred = np.zeros((len(poses_pred_files), len(DHP19_BODY_PARTS), 2), dtype=int) - for pi, path in enumerate(poses_pred_files): - op_pose = parse_openpose_keypoints_json(path) - poses_pred[pi, :] = openpose_to_dhp19(op_pose) - - os.makedirs(args.output_folder, exist_ok=True) - - # compute metrics - mpjpe = metrics.compute_mpjpe(poses_pred, poses_gt) - print(f'MPJPE for OpenPose') - print(f'ground-truth path: {args.ground_truth_path}') - print(f'predictions folder: {args.predictions_folder}') - print(f'images folder: {args.images_folder}') - metrics.print_mpjpe(mpjpe, list(DHP19_BODY_PARTS.keys())) - - # TODO: plot some examples (good and bad based on metrics?) 
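The helper `parse_openpose_keypoints_json` used in the script above reads the `.json` files produced by OpenPose's `--write_json` option. A possible sketch is shown below, assuming the standard OpenPose output layout (a `people` list whose `pose_keypoints_2d` field is a flat `[x, y, confidence, ...]` array); the exact layout and keypoint ordering should be checked against the OpenPose version in use.

```python
import json
import numpy as np

def parse_openpose_keypoints_json(path, num_keypoints=25):
    """Read one OpenPose --write_json file and return an (N, 3) array
    of (x, y, confidence) keypoints for the first detected person."""
    with open(path, 'r') as f:
        data = json.load(f)
    people = data.get('people', [])
    if not people:
        # no person detected in this frame
        return np.zeros((num_keypoints, 3), dtype=float)
    flat = np.array(people[0]['pose_keypoints_2d'], dtype=float)
    return flat.reshape(-1, 3)
```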
- - # plot predictions and gt - for i, path in enumerate(image_files): - img = cv2.imread(str(path)) - plots.plot_poses(img, poses_gt[i], poses_pred[i]) - cv2.imwrite(os.path.join(args.output_folder, path.name), img) diff --git a/evaluation/plot_circle.py b/evaluation/plot_circle.py deleted file mode 100755 index 6f84301b..00000000 --- a/evaluation/plot_circle.py +++ /dev/null @@ -1,151 +0,0 @@ -#!/usr/bin/env python3 -# -*- coding: utf-8 -*- -""" -Created on Thu Nov 4 15:01:04 2021 - -@author: fdipietro -""" - -# %% Load data -import numpy as np -import matplotlib.pyplot as plt -import pickle - -# Load trakcing output -prefix = '/home/fdipietro/projects/hpe-core/example/joint_tracking_example/build/' -filename = 'output.txt' -lines = tuple(open(prefix+filename, 'r')) -data= np.empty((len(lines),27)) -for i in range(len(lines)): - data[i,:] = np.array(lines[i].split()) - -# Load ground-truth -subj, sess, mov = 1, 1, 1 # Selected recording -recording = 'S{}_{}_{}'.format(subj, sess, mov) -cam = str(3) -gt2Dfile = '/home/fdipietro/hpe-data/gt2D/' + recording +'.pkl' -with open(gt2Dfile, 'rb') as f: - dataGT = pickle.load(f) -# Build full 100Hz ground-truth (for plots) -L = len(dataGT['ch3']['ts']) -gt = np.empty([L,13,2]) -for j in range(L): - i=0 - for key, value in dataGT['ch3'].items(): - if(key!='ts'): - gt[j,i,:] = value[j] - i=i+1 -t_gt = np.array(dataGT['ch3']['ts']) - -# Load detections (aux output in tracking app) -aux = 'aux_out.txt' -lines = tuple(open(prefix+aux, 'r')) -aux= np.empty((len(lines),27)) -for i in range(len(lines)): - aux[i,:] = np.array(lines[i].split()) - -# Load velocities estimations (vel output in tracking app) -vel = 'vel_out.txt' -lines = tuple(open(prefix+vel, 'r')) -vel= np.empty((len(lines),3)) -for i in range(len(lines)): - if(i!=0): - vel[i,:] = np.array(lines[i].split()) - -# Get index for selected joint -DHP19_BODY_PARTS = { # map from body parts to indices for dhp19 - 'head': 1, - 'shoulderR': 2, - 'shoulderL': 3, - 'elbowR': 4, - 'elbowL': 5, - 'hipL': 6, - 'hipR': 7, - 'handR': 8, - 'handL': 9, - 'kneeR': 10, - 'kneeL': 11, - 'footR': 12, - 'footL': 13 -} -j = DHP19_BODY_PARTS['handL'] # selected joint -p = dict(zip(DHP19_BODY_PARTS.values(),DHP19_BODY_PARTS.keys())) - -plt.close('all') -my_dpi = 96 - -# %% Plot 1: Tracking a ground-truth v time -plotGT = True -# detections -x10 = aux[:,j*2-1] -y10 = aux[:,j*2] -t10 = aux[:,0] -#tracking -st = 1 -xTR = data[st:,j*2-1] -yTR = data[st::,j*2] -t = data[st::,0] -idx = np.logical_and(np.logical_not(np.isnan(xTR)), np.logical_not(np.isnan(yTR))) -xTR = xTR[idx] -yTR = yTR[idx] -t = t[idx] -xOS = np.interp(t, t_gt, gt[:,j-1,0]) -yOS = np.interp(t, t_gt, gt[:,j-1,1]) - - -fig1 = plt.figure(figsize=(2048/my_dpi, 900/my_dpi), dpi=my_dpi) -ax1 = plt.subplot(111) -fig1.tight_layout(pad=5) -ax1.plot(t,xTR, marker = ".", label =r'$x_{tracked}$',linestyle = 'None', alpha=1) -ax1.plot(t,yTR, marker = ".", label =r'$y_{tracked}$',linestyle = 'None', alpha=1) -ax1.set_xlim([1, t[-1]/2]) -ax1.set_ylim([min(min(xTR),min(yTR))*0.6, max(max(xTR),max(yTR))*1.4]) - -plt.xlabel('time [sec]', fontsize=22, labelpad=5) -plt.ylabel('x/y coordinate [px]', fontsize=22, labelpad=5) -fig1.suptitle('Joint tracker output - '+recording+' - '+p[j], fontsize=28, y=0.97) -plt.tick_params(axis='both', which='major', labelsize=18) -if(plotGT): - ax1.plot(t,xOS,color='tab:blue', alpha=0.3, label =r'$x_{GT}$') - ax1.plot(t,yOS,color='tab:orange', alpha=0.3, label =r'$y_{GT}$') -ax1.legend(fontsize=18, loc = 'upper right') -plt.show() - -# 
%%Plot 2: Observation model - velocities circle - -# Compute deivatives -dt = 0.1 -dx = np.gradient(x10)/dt -dy = np.gradient(y10)/dt - -t0 = 3.7 # starting time to plot -tf = t0+dt - -# get velocities in the selected interval -idx1 = np.argwhere(np.logical_and(t10 > t0, t10 < tf)) -tdx1 = t10[idx1] -dx1 = dx[idx1] -dy1 = dy[idx1] - -idx2 = np.argwhere(np.logical_and(vel[:,0] > t0, vel[:,0] < tf)) -auxt = vel[:,0] -auxdx = vel[:,1] -auxdy = vel[:,2] -tdx2 = auxt[idx2] -dx2 = auxdx[idx2] -dy2 = auxdy[idx2] - -fig2 = plt.figure(figsize=(2048/my_dpi, 1600/my_dpi), dpi=my_dpi) -ax2= plt.subplot(111) -ax2.plot(dx2,dy2, marker = ".", label =r'$d\hat x/dt$',linestyle = 'None', color='black', markersize = 2) -# determine circle based on ground-truth veolicty -cx = np.average(dx1) -cy = np.average(dy1) -rad = np.sqrt(cx*cx+cy*cy)/2 -circle1 = plt.Circle((cx/2, cy/2), rad, color='r', linestyle='--', fill = False, linewidth=3) -plt.gca().add_patch(circle1) -plt.gca().set_aspect('equal') -plt.arrow(0, 0, cx, cy, head_width=5, length_includes_head = True, width=1.5, color='r', label =r'$dx/dt$') -plt.xlabel(r'$v_x$', fontsize=22, labelpad=5) -plt.ylabel(r'$v_y$', fontsize=22, labelpad=5) -plt.show() \ No newline at end of file diff --git a/evaluation/test_eval.py b/evaluation/test_eval.py deleted file mode 100755 index 0648b324..00000000 --- a/evaluation/test_eval.py +++ /dev/null @@ -1,148 +0,0 @@ -# %% Start -import numpy as np -import pickle -from pathlib import Path -from utils import metrics -from utils import parse_openpose_keypoints_json -from datasets.dhp19.utils import DHP19_BODY_PARTS, openpose_to_dhp19 - -# selected recording -subj, sess, mov = 1, 1, 1 -recording = 'S{}_{}_{}'.format(subj, sess, mov) -cam = str(3) - -# read data -poses_gt = np.load('/home/fdipietro/hpe-data/2d_Nicolo/' + recording +'/2d_poses_cam_3_7500_events.npy') -poses_pred_files = sorted(Path('/home/fdipietro/hpe-data/open-pose/' + recording).glob('*.json')) -image_files = sorted(Path('/home/fdipietro/hpe-data/grayscale/' + recording +'/' + cam +'/reconstruction').glob('*.png')) -t_op = np.load('/home/fdipietro/hpe-data/2d_Nicolo/' + recording +'/2d_poses_cam_3_7500_ts.npy') -gt2Dfile = '/home/fdipietro/hpe-data/gt2D/' + recording +'.pkl' -with open(gt2Dfile, 'rb') as f: - data = pickle.load(f) - -# build full GT (for plots) -L = len(data['ch3']['ts']) -gt = np.empty([L,13,2]) -for j in range(L): - i=0 - for key, value in data['ch3'].items(): - if(key!='ts'): - gt[j,i,:] = value[j] - i=i+1 -t_gt = np.array(data['ch3']['ts']) - -# Frame rate information -diff = np.ediff1d(t_op) -print('Mean freq = ', 1/np.mean(diff)) -print('Min freq = ', 1/np.max(diff)) -print('Max freq = ', 1/np.min(diff)) -print('\n') - -# check if the number of images, gt and predicted poses is the same -assert len(poses_gt) == len(poses_pred_files) -assert len(poses_pred_files) == len(image_files) - -# read and preprocess predicted poses -poses_pred = np.zeros((len(poses_pred_files), len(DHP19_BODY_PARTS), 2), dtype=int) -for pi, path in enumerate(poses_pred_files): - op_pose = parse_openpose_keypoints_json(path) - poses_pred[pi, :] = openpose_to_dhp19(op_pose) - - -# compute metrics -poses_gt_flip = np.array(poses_gt) # bug: x and y are fliped for some reason -> should be fixed -for i in range(len(poses_gt)): - poses_gt_flip[i,:,:] = np.fliplr(poses_gt[i,:,:]) - -mpjpe = metrics.compute_mpjpe(poses_pred, poses_gt_flip) -metrics.print_mpjpe(mpjpe, list(DHP19_BODY_PARTS.keys())) - -# save metric for bar graph plots 
-np.save('/home/fdipietro/hpe-data/op-eval_Franco/m_' + recording + '.npy', mpjpe, allow_pickle=False) - - -# %% Plot joint vs t (one joint per figure) -import matplotlib.pyplot as plt - -#select interesting joints for each dataset -if(recording == 'S1_1_1'): - J = [DHP19_BODY_PARTS['head'], DHP19_BODY_PARTS['elbowL'], DHP19_BODY_PARTS['shoulderL']] -elif(recording == 'S1_2_1'): - J = [DHP19_BODY_PARTS['handL'], DHP19_BODY_PARTS['kneeL'], DHP19_BODY_PARTS['footR']] -elif(recording == 'S1_2_3'): - J = [DHP19_BODY_PARTS['head'], DHP19_BODY_PARTS['shoulderL'], DHP19_BODY_PARTS['kneeR']] -elif(recording == 'S1_2_4'): - J = [DHP19_BODY_PARTS['head'], DHP19_BODY_PARTS['shoulderL'], DHP19_BODY_PARTS['footL']] -elif(recording == 'S1_3_3'): - J = [DHP19_BODY_PARTS['handL'], DHP19_BODY_PARTS['elbowL'], DHP19_BODY_PARTS['shoulderL']] -elif(recording == 'S1_3_6'): - J = [DHP19_BODY_PARTS['handR'], DHP19_BODY_PARTS['elbowR'], DHP19_BODY_PARTS['shoulderR']] -elif(recording == 'S10_1_4'): - J = [DHP19_BODY_PARTS['footR'], DHP19_BODY_PARTS['kneeR'], DHP19_BODY_PARTS['hipR']] -elif(recording == 'S13_1_8'): - J = [DHP19_BODY_PARTS['footR'], DHP19_BODY_PARTS['kneeR'], DHP19_BODY_PARTS['hipR']] -else: - print('Recording not present') - -plt.close('all') - -my_dpi = 96 -for i in range(len(J)): - fig = plt.figure(figsize=(2048/my_dpi, 600/my_dpi), dpi=my_dpi) - ax = plt.subplot(111) - coord = 1 - ax.plot(t_gt,gt[:,J[i],1-coord], label ='GT [x]') - ax.plot(t_gt,gt[:,J[i],coord], label ='GT [y]') - ax.plot(t_op, poses_pred[:,J[i],1-coord], marker = ".", markersize=12, linestyle = 'None', label ='OP [x]') - ax.plot(t_op, poses_gt[:,J[i],1-coord], marker = ".", markersize=12, linestyle = 'None', label ='OP [x]') - coord=1 - ax.plot(t_op, poses_pred[:,J[i],coord], marker = ".", markersize=12, linestyle = 'None', label ='OP [y]') - ax.plot(t_op, poses_gt[:,J[i],coord], marker = ".", markersize=12, linestyle = 'None', label ='OP [y]') - plt.xlabel('time [sec]', fontsize=14, labelpad=5) - plt.ylabel('x/y coordinate [px]', fontsize=14, labelpad=5) - ax.legend(fontsize=12, loc = 'lower right') - fig.suptitle('Recording: ' + recording + ' - Joint: ' + list(DHP19_BODY_PARTS.keys())[list(DHP19_BODY_PARTS.values()).index(J[i])], fontsize=18, y=0.92) - plt.show() - # plt.savefig('/home/fdipietro/hpe-data/op-eval_Franco/' + recording + '_' + list(DHP19_BODY_PARTS.keys())[list(DHP19_BODY_PARTS.values()).index(J[i])] + '.png', bbox_inches='tight') - - -# %% BAR GRAPH FULL -import numpy as np -import matplotlib.pyplot as plt -import seaborn as sns -plt.close('all') - -m111 = np.load('/home/fdipietro/hpe-data/op-eval_Franco/m_S1_1_1.npy') -m121 = np.load('/home/fdipietro/hpe-data/op-eval_Franco/m_S1_2_1.npy') -m123 = np.load('/home/fdipietro/hpe-data/op-eval_Franco/m_S1_2_3.npy') -m124 = np.load('/home/fdipietro/hpe-data/op-eval_Franco/m_S1_2_4.npy') -m133 = np.load('/home/fdipietro/hpe-data/op-eval_Franco/m_S1_3_6.npy') -m136 = np.load('/home/fdipietro/hpe-data/op-eval_Franco/m_S1_3_6.npy') -m1014 = np.load('/home/fdipietro/hpe-data/op-eval_Franco/m_S10_1_4.npy') -m1318 = np.load('/home/fdipietro/hpe-data/op-eval_Franco/m_S13_1_8.npy') - -width =0.1 -my_dpi=96 -fig = plt.figure(figsize=(2048/my_dpi, 1200/my_dpi), dpi=my_dpi) - -plt.grid(axis='y') -plt.bar(np.arange(len(m111)), m111, width=width, label = 'S1_1_1') -plt.bar(np.arange(len(m121))+ width, m121, width=width, label = 'S1_2_1') -plt.bar(np.arange(len(m123))+ 2*width, m123, width=width, label = 'S1_2_3') -plt.bar(np.arange(len(m124))+ 3*width, m124, width=width, label = 
'S1_2_4') -plt.bar(np.arange(len(m133))+ 4*width, m133, width=width, label = 'S1_3_3') -plt.bar(np.arange(len(m136))+ 5*width, m136, width=width, label = 'S1_3_6') -plt.bar(np.arange(len(m1014))+ 6*width, m1014, width=width, label = 'S10_1_4') -plt.bar(np.arange(len(m1318))+ 7*width, m1318, width=width, label = 'S13_1_8') -# plt.yscale("log") -locs, labels = plt.xticks() -plt.xticks(np.arange(0.35, 13.35, step=1)) -plt.ylabel('Error [px]', fontsize=16, labelpad=5) -ax = plt.gca() -ax.tick_params(axis='x', labelrotation = 45) -ax.legend(fontsize=18, loc = 'upper left', mode = "expand", ncol = 8) -ax.set_xticklabels(DHP19_BODY_PARTS.keys(), fontsize = 14) -sns.despine(bottom=True) -plt.suptitle('Mean Per Joint Position Error', fontsize=18, y=0.92) -plt.show() -# plt.savefig('/home/fdipietro/hpe-data/op-eval_Franco/bars.png', bbox_inches='tight') \ No newline at end of file diff --git a/evaluation/test_single_joint.py b/evaluation/test_single_joint.py deleted file mode 100755 index b87709ed..00000000 --- a/evaluation/test_single_joint.py +++ /dev/null @@ -1,114 +0,0 @@ -#!/usr/bin/env python3 -# -*- coding: utf-8 -*- -""" -@author: fdipietro -""" - -import numpy as np -import matplotlib.pyplot as plt -import pickle - -# map from body parts to indices for dhp19 -DHP19_BODY_PARTS = { - 'head': 1, - 'shoulderR': 2, - 'shoulderL': 3, - 'elbowR': 4, - 'elbowL': 5, - 'hipL': 6, - 'hipR': 7, - 'handR': 8, - 'handL': 9, - 'kneeR': 10, - 'kneeL': 11, - 'footR': 12, - 'footL': 13 -} - -# selected recording -subj, sess, mov = 1, 1, 1 -# select joint -j = DHP19_BODY_PARTS['elbowL'] -p = dict(zip(DHP19_BODY_PARTS.values(),DHP19_BODY_PARTS.keys())) - -# Load trakcing output -prefix = '/home/fdipietro/projects/hpe-core/example/joint_tracking_example/build/' -filename = 'output.txt' - -# Load ground-truth -lines = tuple(open(prefix+filename, 'r')) -data= np.empty((len(lines),27)) -for i in range(len(lines)): - data[i,:] = np.array(lines[i].split()) - -recording = 'S{}_{}_{}'.format(subj, sess, mov) -cam = str(3) -gt2Dfile = '/home/fdipietro/hpe-data/gt2D/' + recording +'.pkl' -with open(gt2Dfile, 'rb') as f: - dataGT = pickle.load(f) -# Build full GT (for plots) -L = len(dataGT['ch3']['ts']) -gt = np.empty([L,13,2]) -for k in range(L): - i=0 - for key, value in dataGT['ch3'].items(): - if(key!='ts'): - gt[k,i,:] = value[k] - i=i+1 -t_gt = np.array(dataGT['ch3']['ts']) - - -# Interpolate GT to tracking output -import math -st = 1 -xTR = data[st:,j*2-1] -yTR = data[st::,j*2] -t = data[st::,0] -idx = np.logical_and(np.logical_not(np.isnan(xTR)), np.logical_not(np.isnan(yTR))) -xTR = xTR[idx] -yTR = yTR[idx] -t = t[idx] -xOS = np.interp(t, t_gt, gt[:,j-1,0]) -yOS = np.interp(t, t_gt, gt[:,j-1,1]) - -# calculate nrmse -nrmse = np.zeros(2) -from sklearn.metrics import mean_squared_error -mse = np.zeros(2) -mse[0] = mean_squared_error(xOS, xTR) -nrmse[0] = math.sqrt(mse[0]) -mse[1] = mean_squared_error(yOS, yTR) -nrmse[1] = math.sqrt(mse[1]) -Gnrmse = np.amax(nrmse) -print('============== sklearn ===================') -print("nRMSE(x) = ", nrmse[0] ) -print("nRMSE(y) = ", nrmse[1] ) -print("nRMSE = ",Gnrmse) -print('==========================================') -ex = str("%.2f%%" % round(nrmse[0],2)) -ey = str("%.2f%%" % round(nrmse[1],2)) - - -# Plots -plt.close('all') -plotGT = True -my_dpi = 96 -fig = plt.figure(figsize=(2048/my_dpi, 900/my_dpi), dpi=my_dpi) -ax = plt.subplot(111) -fig.tight_layout(pad=5) -ax.plot(t,xTR, marker = ".", label =r'$x_{tracked}\,,\,e_x=$'+ex,linestyle = 'None') -ax.plot(t,yTR, marker 
= ".", label =r'$y_{tracked}\,,\,e_y=$'+ey,linestyle = 'None') -# ax2.set_xlim([0.8, 4.2]) -ax.set_ylim([45, 320]) -ax.set_xlim([0, 12]) -# ax.set_ylim([min(min(xTR),min(yTR))*0.6, max(max(xTR),max(yTR))*1.4]) -plt.xlabel('time [sec]', fontsize=22, labelpad=5) -plt.ylabel('x/y coordinate [px]', fontsize=22, labelpad=5) -fig.suptitle('Joint tracker output - '+recording+' - '+p[j], fontsize=28, y=0.97) -plt.tick_params(axis='both', which='major', labelsize=18) -if(plotGT): - ax.plot(t,xOS,color='tab:blue', alpha=0.3, label =r'$x_{GT}$') - ax.plot(t,yOS,color='tab:orange', alpha=0.3, label =r'$y_{GT}$') -ax.legend(fontsize=22, loc = 'upper left') -plt.show() - From e890c4c10c0c3d1f88fc6c15ebb7cede0efa86c2 Mon Sep 17 00:00:00 2001 From: Arren Glover Date: Tue, 7 Feb 2023 11:28:34 +0100 Subject: [PATCH 12/23] Update README.md --- datasets/README.md | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/datasets/README.md b/datasets/README.md index bd8b9dd5..d86fc13b 100644 --- a/datasets/README.md +++ b/datasets/README.md @@ -1,8 +1,8 @@ -# Datasets +# Dataset Conversion -This folder contains scripts for reading, preprocessing and exporting HPE datasets to different formats. +This folder contains scripts for reading, preprocessing and exporting HPE datasets to common formats to enable valid comparison of human pose estimation algorithms. As each dataset has a unique format, tailored scripts are required per-dataset. As a final step we use [BIMVEE](https://github.com/event-driven-robotics/bimvee) to export events to any common format. -We use the folowing datasets +We currently support the folowing datasets * [DHP19](dhp19/README.md) * [H3.6M](h36m/README.md) -* [MPII](mpii/README.md). +* [MPII](mpii/README.md) From 80ad5af1239c4ac3e93133e64b373540156828f4 Mon Sep 17 00:00:00 2001 From: Arren Glover Date: Tue, 7 Feb 2023 11:38:39 +0100 Subject: [PATCH 13/23] Update README.md --- evaluation/README.md | 79 +++++++++++++++++++++----------------------- 1 file changed, 37 insertions(+), 42 deletions(-) diff --git a/evaluation/README.md b/evaluation/README.md index 0bdf2958..d23d9889 100644 --- a/evaluation/README.md +++ b/evaluation/README.md @@ -1,50 +1,10 @@ # Evaluating HPE and Joints Tracking Algorithms -This folder contains scripts for evaluating HPE and joints tracking algorithms. - - -## Metrics - -The following metrics are computed for every algorithm - -* Percentage of Correct Keypoints (PCK) -* Root Mean Square Error (RMSE). - -### PCK -PCK is computed -* for every single body joint -* as an average on all body joints - -and - -* for each single input dataset -* for all datasets -* for every specified classification threshold. - -The classification uses the body's head or torso size, depending on the available ground truth. - -Metric output -* tables with all computed PCK values for every input algorithm's predictions -* plots showing average PCK for every algorithm over the selected classification thresholds. - -### RMSE -RMSE is computed -* for every body joint coordinate -* as an average on all joints for each coordinate - -and - -* for each single input dataset -* for all datasets. - -Metric output -* tables with all computed RMSE values for every input algorithm's predictions -* plots for each joint showing ground truth and predicted x and y coordinates over time. - +This folder contains scripts for evaluating HPE and joints tracking algorithms. 
In particular, each algorithm to be compared should process each dataset to produce a `.csv` file and stored in a particular folder structure as outlined below. The scripts can be run to evaluate joint position comparisons, or joint velocity comparisons. ## Usage -The evaluation script can be run with +The pose evaluation script can be run with ```shell $ python3 evaluate_hpe.py $ -d path/to/datasets/folder @@ -99,3 +59,38 @@ $ -latex - `-latex` flag specifying that tables must be saved to latex format. The output folder will contain files with metric tables and prediction plots. + +## Metrics + +The following metrics are computed for every algorithm + +### Percentage of Correct Keypoints (PCK) +PCK is computed +* for every single body joint +* as an average on all body joints + +and + +* for each single input dataset +* for all datasets +* for every specified classification threshold. + +The classification uses the body's head or torso size, depending on the available ground truth. + +Metric output +* tables with all computed PCK values for every input algorithm's predictions +* plots showing average PCK for every algorithm over the selected classification thresholds. + +### Root Mean Square Error (RMSE) +RMSE is computed +* for every body joint coordinate +* as an average on all joints for each coordinate + +and + +* for each single input dataset +* for all datasets. + +Metric output +* tables with all computed RMSE values for every input algorithm's predictions +* plots for each joint showing ground truth and predicted x and y coordinates over time. From 8f4bb60e24d5bf71b220f2a5d5616cb49adae0b6 Mon Sep 17 00:00:00 2001 From: Arren Glover Date: Tue, 7 Feb 2023 11:46:12 +0100 Subject: [PATCH 14/23] Create README.MD --- example/test_dataset/README.MD | 6 ++++++ 1 file changed, 6 insertions(+) create mode 100644 example/test_dataset/README.MD diff --git a/example/test_dataset/README.MD b/example/test_dataset/README.MD new file mode 100644 index 00000000..e9c18c3a --- /dev/null +++ b/example/test_dataset/README.MD @@ -0,0 +1,6 @@ +# Test Datasets + +A small dataset is included to validate example algorithms can be correctly run. Evaluation should be performed on complete datasets which we provide informaiton in [datasets](../datasets) + +- DPH19_s111_ch3: is YARP format converted of a single DHP19 session that can be used to validate live streaming applications +- dhp19_eros: is some EROS representations from a single DHP19 session that can be used to evaluate MoveEnet operation From 2e93f7815c7604ccccce6b8de595890b169922f2 Mon Sep 17 00:00:00 2001 From: Arren Glover Date: Tue, 7 Feb 2023 11:48:36 +0100 Subject: [PATCH 15/23] Update README.md --- example/yarp-glhpe/README.md | 6 +----- 1 file changed, 1 insertion(+), 5 deletions(-) diff --git a/example/yarp-glhpe/README.md b/example/yarp-glhpe/README.md index a01feb15..53463af8 100644 --- a/example/yarp-glhpe/README.md +++ b/example/yarp-glhpe/README.md @@ -13,14 +13,10 @@ The software was tested on Ubuntu 20.04.2 LTS without GPU support. $ cd $ git clone git@github.com:event-driven-robotics/hpe-core.git $ cd hpe-core/example/yarp_glhpe - $ docker build -t gl_hpe:yarp --ssh default --build-arg ssh_pub_key="$(cat ~/.ssh/.pub)" --build-arg ssh_prv_key="$(cat ~/.ssh/)" - < Dockerfile + $ docker build -t gl_hpe:yarp < Dockerfile ``` :bulb: `` is the parent directory in which the repository is cloned -:bulb: The ssh keys are required to access hpe-core as it is currently private. 
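As a reference for the PCK metric described in the evaluation README above: a joint is counted as correct when its distance to the ground truth falls below a threshold expressed as a fraction of a reference size (head or torso, depending on the available ground truth). A minimal sketch follows, with illustrative names only, assuming poses of shape `(num_frames, num_joints, 2)` and one reference size per frame.

```python
import numpy as np

def pck(pred, gt, ref_size, threshold=0.5):
    """Percentage of Correct Keypoints.

    pred, gt:  (num_frames, num_joints, 2) pixel coordinates
    ref_size:  (num_frames,) head or torso size per frame
    threshold: fraction of ref_size below which a joint counts as correct
    Returns the per-joint PCK and the average over all joints.
    """
    dist = np.linalg.norm(pred - gt, axis=-1)        # (frames, joints)
    correct = dist < threshold * ref_size[:, None]   # broadcast per frame
    per_joint = correct.mean(axis=0)
    return per_joint, per_joint.mean()
```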
[Create a new ssh key](https://docs.github.com/en/github/authenticating-to-github/connecting-to-github-with-ssh/generating-a-new-ssh-key-and-adding-it-to-the-ssh-agent) if required. - -:warning: Ensure your ssh key is built without a passphrase. - ## Usage - Download the pre-trained model from [here](https://drive.google.com/drive/folders/1AgsQl6sTJBygPvgbdR1e9IfVAYxupMGI) and store it into folder `/path/to/pre/trained/model/folder` - Run the Docker container From d98e861a38e56ce90a311499e496e6c191a9a369 Mon Sep 17 00:00:00 2001 From: arrenglover Date: Tue, 7 Feb 2023 10:52:40 +0000 Subject: [PATCH 16/23] removed old joint tracking example --- example/joint_tracking_example/.gitignore | 2 - example/joint_tracking_example/CMakeLists.txt | 46 -- example/joint_tracking_example/full13.cpp | 780 ------------------ .../joint_tracking_example/jointTracking.xml | 37 - .../joint_tracking_example.cpp | 613 -------------- .../joint_tracking_example/past_surface.cpp | 657 --------------- example/joint_tracking_example/roiq.cpp | 34 - example/joint_tracking_example/roiq.h | 23 - 8 files changed, 2192 deletions(-) delete mode 100644 example/joint_tracking_example/.gitignore delete mode 100644 example/joint_tracking_example/CMakeLists.txt delete mode 100644 example/joint_tracking_example/full13.cpp delete mode 100644 example/joint_tracking_example/jointTracking.xml delete mode 100644 example/joint_tracking_example/joint_tracking_example.cpp delete mode 100644 example/joint_tracking_example/past_surface.cpp delete mode 100644 example/joint_tracking_example/roiq.cpp delete mode 100644 example/joint_tracking_example/roiq.h diff --git a/example/joint_tracking_example/.gitignore b/example/joint_tracking_example/.gitignore deleted file mode 100644 index 6f31401f..00000000 --- a/example/joint_tracking_example/.gitignore +++ /dev/null @@ -1,2 +0,0 @@ -build/ -.vscode/ diff --git a/example/joint_tracking_example/CMakeLists.txt b/example/joint_tracking_example/CMakeLists.txt deleted file mode 100644 index ca2919cc..00000000 --- a/example/joint_tracking_example/CMakeLists.txt +++ /dev/null @@ -1,46 +0,0 @@ -# requires minimum cmake version -cmake_minimum_required(VERSION 3.5) - -# produce the cmake var PROJECT_NAME -project(joint_track) - -#find_package(YCM REQUIRED) -find_package(YARP COMPONENTS OS sig cv REQUIRED) -find_package(hpe-core REQUIRED) -find_package(event-driven REQUIRED) - -add_executable(${PROJECT_NAME} joint_tracking_example.cpp - roiq.cpp - roiq.h) - -add_executable(pastSurface past_surface.cpp - roiq.cpp - roiq.h) - -add_executable(full13 full13.cpp - roiq.cpp - roiq.h) - -target_compile_options(${PROJECT_NAME} PRIVATE -Wall -Wextra) -target_link_libraries(${PROJECT_NAME} hpe-core::hpe-core - YARP::YARP_os - YARP::YARP_init - YARP::YARP_cv - ev::event-driven) - -target_compile_options(pastSurface PRIVATE -Wall -Wextra) -target_link_libraries(pastSurface hpe-core::hpe-core - YARP::YARP_os - YARP::YARP_init - YARP::YARP_cv - ev::event-driven) - -target_compile_options(full13 PRIVATE -Wall -Wextra -g) -target_link_libraries(full13 hpe-core::hpe-core - YARP::YARP_os - YARP::YARP_init - YARP::YARP_cv - ev::event-driven) - -# install(TARGETS ${PROJECT_NAME} pastSurface DESTINATION ${CMAKE_INSTALL_BINDIR}) -install(TARGETS ${PROJECT_NAME} pastSurface full13 DESTINATION ${CMAKE_INSTALL_BINDIR}) diff --git a/example/joint_tracking_example/full13.cpp b/example/joint_tracking_example/full13.cpp deleted file mode 100644 index cb77d954..00000000 --- a/example/joint_tracking_example/full13.cpp +++ /dev/null @@ 
-1,780 +0,0 @@ -#include -#include -#include -#include -#include -#include -#include -#include -#include "roiq.h" -#include -#include -#include -#include -#include -#include -#include - -using namespace ev; -using namespace yarp::os; -using namespace yarp::sig; -using namespace hpecore; -using std::tuple; -using yarp::os::Bottle; -using yarp::os::BufferedPort; -// using std::tuple; -// using skeleton = std::vector>; - -#define N 13 -typedef std::array roiqSklt; - - -class jointTrack : public RFModule, public Thread -{ - -private: - vReadPort> input_port; - BufferedPort input_sklt; - std::ofstream output_writer, aux_out, vel_out; - deque evsFullImg; - deque pose2img[N]; - deque vels; - std::mutex m; - skeleton13 pose, dpose, poseGT; - hpecore::jointMotionEstimator tracker; - // roiqSklt qROI; - roiq qROI[N]; - int roiWidth = 32; - int roiHeight = roiWidth; - jointName jointNum; - string jointNameStr; - int dimY, dimX; - int nevs[N] = {0}; - bool plotCv = false; - cv::Mat fullImg; - bool initTimer = false; - double avgF = 0; - BufferedPort > image_port; - float displayF = 25.0; // display frequency in Hz - bool showF = false; - cv::Mat SAE, SAE_vis; - double** Te; - int method; - bool avgV = false; - int detF; - bool objectTracking = false; - bool h36 = false; - bool pastSurf = false; - bool roi = false; - cv::Point ptJoint; - hpecore::surface S; - -public: - jointTrack() {} - - virtual bool configure(yarp::os::ResourceFinder &rf) - { - std::cout << "\033c"; - yInfo() << "\t\t - = HPE - JOINT MOTION ESTIMATION = -"; - //set the module name used to name ports - setName((rf.check("name", Value("/jointTrack")).asString()).c_str()); - - /* initialize yarp network */ - yarp::os::Network yarp; - if (!yarp.checkNetwork(2.0)) - { - std::cout << "Could not connect to YARP" << std::endl; - return false; - } - - //open io ports - if (!input_port.open(getName("/AE:i"))) - { - yError() << "Could not open input port"; - return false; - } - - if (!input_sklt.open(getName("/SKLT:i"))) - { - yError() << "Could not open input port"; - return false; - } - - if(!image_port.open(getName("/img:o"))) { - yError() << "Could not open output port"; - return false; - } - - detF = rf.check("detF", Value(10)).asInt(); - yInfo() << "Detections frequency = " << detF << " Hz"; - if(rf.check("obj")) - objectTracking = true; - - - - output_writer.open("output.txt"); - if (!output_writer.is_open()) - yError() << "Could not open pose writer!!"; - else - yInfo() << "Pose writer opened"; - - aux_out.open("aux_out.txt"); - if (!aux_out.is_open()) - yError() << "Could not open pose writer!!"; - else - yInfo() << "aux_out writer opened"; - - vel_out.open("vel_out.txt"); - if (!vel_out.is_open()) - yError() << "Could not open pose writer!!"; - else - yInfo() << "vel_out writer opened"; - - jointNameStr = rf.check("joint", Value("handL")).asString(); - jointNum = str2enum(jointNameStr); - displayF = rf.check("F", Value(25)).asFloat32(); - method = rf.check("M", Value(2)).asInt(); - - if(rf.check("avgV")) - avgV = true; - - if(rf.check("roi")) - roi = true; - - - yInfo() << "Method for velocity estimations = " << method; - - // intialize velocities - for (size_t i = 0; i < dpose.size(); i++) - { - dpose[i].u = 0; - dpose[i].v = 0; - } - - if(rf.check("cv")) - plotCv = true; - dimX = rf.check("dimX", Value(346)).asInt(); - dimY = rf.check("dimY", Value(260)).asInt(); - if(rf.check("h36")) - h36 = true; - - // connect ports - if(!objectTracking) - { - if(!h36) // dhp19 dataset - { - yarp.connect("/file/ch3dvs:o", getName("/AE:i"), 
"fast_tcp"); - // yarp.connect("/file/ch3GT10Hzskeleton:o", getName("/SKLT:i"), "fast_tcp"); - yarp.connect("/file/ch3GTskeleton:o", getName("/SKLT:i"), "fast_tcp"); - } - else // H 3.6 dataset - { - yarp.connect("/file/ch0dvs:o", getName("/AE:i"), "fast_tcp"); - yarp.connect("/file/ch0GT50Hzskeleton:o", getName("/SKLT:i"), "fast_tcp"); - roiWidth = 20; - roiHeight = roiWidth; - } - } - else - { - yarp.connect("/file/leftdvs:o", getName("/AE:i"), "fast_tcp"); - yarp.connect("/file/502Hz:o", getName("/SKLT:i"), "fast_tcp"); - roiWidth = 60; - roiHeight = 100; - dimX = 640; - dimY = 480; - jointNum = str2enum("head"); - } - - yarp.connect(getName("/img:o"), "/yarpview/img:i", "fast_tcp"); - - - yInfo() << "Image dimensions = [" << dimX << ", " << dimY << "]"; - fullImg = cv::Mat::zeros(cv::Size(dimX, dimY), CV_32F); - cvtColor(fullImg, fullImg, cv::COLOR_GRAY2RGB); - if(rf.check("showF")) - showF = true; - - if(rf.check("past")) - pastSurf = true; - - if(plotCv) - { - int k = 1; - if(!objectTracking) k =2; - int x0 = 1600, y0 = 0; - cv::namedWindow("HPE OUTPUT", cv::WINDOW_NORMAL); - cv::resizeWindow("HPE OUTPUT", 640, 480); - cv::moveWindow("HPE OUTPUT", x0+1920/3, y0); - - if(pastSurf) - { - cv::namedWindow("TOS", cv::WINDOW_NORMAL); - cv::resizeWindow("TOS", 640, 480); - cv::moveWindow("TOS", x0+1920/3-325, y0+ 600); - - - cv::namedWindow("SAE", cv::WINDOW_NORMAL); - cv::resizeWindow("SAE", 640, 480); - cv::moveWindow("SAE", x0+1920/3+325,y0+ 600); - } - } - - - SAE = cv::Mat::zeros(cv::Size(dimX, dimY), CV_32F); - SAE_vis = cv::Mat::zeros(cv::Size(dimX, dimY), CV_32F); - // // Expected times - // Te = new double*[dimY]; - // for (int i = 0; i < dimY; i++) - // Te[i] = new double[dimX]; - - // for (int i = 0; i < dimY; i++) - // { - // for (int j = 0; j < dimX; j++) - // { - // Te[i][j] = 0; - // } - // } - - - // past events surface - // S.getSurface(); - // S.init(dimX, dimY, 7, 0.3); // EROS - S.init(dimX, dimY, 7, 2); // TOS - - - //start the asynchronous and synchronous threads - return Thread::start(); - } - - - virtual double getPeriod() - { - return 1/displayF; //period of synchrnous thread - } - - - bool interruptModule() - { - //if the module is asked to stop ask the asynchrnous thread to stop - std::cout << "\033c"; - return Thread::stop(); - } - - - void onStop() - { - //when the asynchrnous thread is asked to stop, close ports and do - //other clean up - input_port.close(); - input_sklt.close(); - image_port.close(); - output_writer.close(); - aux_out.close(); - // output_port.close(); - } - - - skeleton13 buildSklt(Bottle &readBottle) - { - skeleton13 newPose; - for (size_t i = 0; i < readBottle.size(); i = i + 2) - { - Value &u = readBottle.get(i); - Value &v = readBottle.get(i + 1); - newPose[i / 2].u = u.asInt32(); - newPose[i / 2].v = v.asInt32(); - } - return newPose; - } - -void drawSkeleton(skeleton13 pose, cv::Mat img, cv::Scalar color) - { - // plot detected joints - for(int i=0; i<13; i++) - { - int x = int(pose[i].u); - int y = int(pose[i].v); - cv::Point pt(x, y); - if(x && y) cv::drawMarker(img, pt, color, 1, 8); - - } - // plot links between joints - cv::Point head(int(pose[0].u), int(pose[0].v)); - cv::Point shoulderR(int(pose[1].u), int(pose[1].v)); - cv::Point shoulderL(int(pose[2].u), int(pose[2].v)); - cv::Point elbowR(int(pose[3].u), int(pose[3].v)); - cv::Point elbowL(int(pose[4].u), int(pose[4].v)); - cv::Point hipL(int(pose[5].u), int(pose[5].v)); - cv::Point hipR(int(pose[6].u), int(pose[6].v)); - cv::Point handR(int(pose[7].u), int(pose[7].v)); - 
cv::Point handL(int(pose[8].u), int(pose[8].v)); - cv::Point kneeR(int(pose[9].u), int(pose[9].v)); - cv::Point kneeL(int(pose[10].u), int(pose[10].v)); - cv::Point footR(int(pose[11].u), int(pose[11].v)); - cv::Point footL(int(pose[12].u), int(pose[12].v)); - - int th = 1; - if(head.x && head.y && shoulderL.x && shoulderL.y) cv::line(img, head, shoulderL, color, th); - if(head.x && head.y && shoulderR.x && shoulderR.y) cv::line(img, head, shoulderR, color, th); - if(shoulderL.x && shoulderL.y && shoulderR.x && shoulderR.y) cv::line(img, shoulderL, shoulderR, color, th); - if(shoulderL.x && shoulderL.y && elbowL.x && elbowL.y) cv::line(img, shoulderL, elbowL, color, th); - if(shoulderR.x && shoulderR.y && elbowR.x && elbowR.y) cv::line(img, shoulderR, elbowR, color, th); - if(shoulderL.x && shoulderL.y && hipL.x && hipL.y) cv::line(img, shoulderL, hipL, color, th); - if(shoulderR.x && shoulderR.y && hipR.x && hipR.y) cv::line(img, shoulderR, hipR, color, th); - if(hipL.x && hipL.y && hipR.x && hipR.y) cv::line(img, hipL, hipR, color, th); - if(elbowL.x && elbowL.y && handL.x && handL.y) cv::line(img, elbowL, handL, color, th); - if(elbowR.x && elbowR.y && handR.x && handR.y) cv::line(img, elbowR, handR, color, th); - if(hipL.x && hipL.y && kneeL.x && kneeL.y) cv::line(img, hipL, kneeL, color, th); - if(hipR.x && hipR.y && kneeR.x && kneeR.y) cv::line(img, hipR, kneeR, color, th); - if(kneeR.x && kneeR.y && footR.x && footR.y) cv::line(img, kneeR, footR, color, th); - if(kneeL.x && kneeL.y && footL.x && footL.y) cv::line(img, kneeL, footL, color, th); - } - -void drawRoi(cv::Mat img, cv::Scalar color, int th) -{ - // draw Roi rectangles - if(ptJoint.x && ptJoint.y) - { - for(int i=0; i(img_out)); - image_port.write(); - fullImg = cv::Vec3f(0.4, 0.4, 0.4); - - return Thread::isRunning(); - } - - - - - - void evsToImage(deque &evs) - { - while(!evs.empty()) - { - int x = evs.front().x; - int y = evs.front().y; - int p = evs.front().polarity; - if(x>=0 && x< dimX && y>=0 && y(y, x) = cv::Vec3f(1.0, 1.0, 1.0); - else - fullImg.at(y, x) = cv::Vec3f(0.0, 0.0, 0.0); - } - evs.pop_front(); - } - } - - - //asynchronous thread run forever - void run() - { - Stamp evStamp, skltStamp; - Bottle *bot_sklt; - double skltTs; - const vector *q; - double t0 = Time::now(), t1=t0, t2 = t0, t2prev = t0, t3[N] = {t0}, t4 = t0; - // t0 = initial time - // t1 = global timer - // t2 = timer used for operation frequency calculation - // t3 = timer used for fusion - // t4 = timer used for detection frequency downsampling - long int mes = 0; // amount of porcessing cycles used to average operation frequency - double freq = 0; // operation frequenc - bool firstDet = false; // first detection took place - - while (!Thread::isStopping()) - { - t1 = Time::now() - t0; - // read detections - // int N = input_sklt.getPendingReads(); - bot_sklt = input_sklt.read(false); - input_sklt.getEnvelope(skltStamp); - - skltTs = skltStamp.getTime(); - - ptJoint.x = int(poseGT[jointNum].u); - ptJoint.y = int(poseGT[jointNum].v); - - if(bot_sklt && 1/(t1 - t4) <= detF) // there is a detection - { - Value &coords = (*bot_sklt).get(1); - Bottle *sklt_lst = coords.asList(); - // build skeleton from reading - skeleton13 builtPose = buildSklt(*sklt_lst); - pose = tracker.resetPose(builtPose); // reset pose - poseGT = pose; - // set roi for just all 13 joints - for(int i=0; i1000) - { - mes= 0; - freq=0; - } - t2prev = t2; - // avgF = 1/(t2-tprev); // instant freq - // yInfo() << "\033c" << avgF; - } - for(int i=0; i= qROI[j].roi[0] && qi.x= 
qROI[j].roi[2] && qi.y skltTs) // there are events to process - { - // separate events into deques to avoid using event-driven in hpe-core - std::deque evs; - std::deque evsTs; - getEventsUV(qROI[i].q, evs, evsTs, vtsHelper::tsscaler); // get events u,v coords - - if(method == 1) // Velocity estimation Method 1: time diff on adjacent events - { - qROI[i].setSize(int((qROI[i].roi[1] - qROI[i].roi[0]) * (qROI[i].roi[3] - qROI[i].roi[2])*1.5)); - if(nevs[i] > 20) - { - dpose = tracker.method1(evs, evsTs, i, nevs[i], vels); // get veocities from delta ts - double dt = (qROI[i].q.front().stamp - qROI[i].q.back().stamp) * vtsHelper::tsscaler; - tracker.fusion(&pose, dpose, dt); - } - else - { - dpose = tracker.resetVel(); - double dt = (qROI[i].q.front().stamp - qROI[i].q.back().stamp) * vtsHelper::tsscaler; - tracker.fusion(&pose, dpose, dt); - } - } - else if(method == 2) // Velocity estimation method 2: neighbor events - { - int halfRoi = int((qROI[i].roi[1] - qROI[i].roi[0]) * (qROI[i].roi[3] - qROI[i].roi[2])*0.5); - if(pastSurf) qROI[i].setSize(nevs[i]); - else qROI[i].setSize(halfRoi); - // yInfo() << nevs; - // tracker.estimateFire(evs, evsTs, evsPol, jointNum, nevs, pose, dpose, Te, SAE); - // double err = tracker.getError(evs, evsTs, evsPol, jointNum, nevs, pose, dpose, Te, SAE); - // dpose = tracker.setVel(jointNum, dpose, pose[jointNum].u, pose[jointNum].v, err); - if(pastSurf) dpose[i] = tracker.estimateVelocity(evs, evsTs, nevs[i], vels, S.getSurface(), SAE); - else dpose[i] = tracker.estimateVelocity(evs, evsTs, nevs[i], vels); - double dt = t1 - t3[i]; - tracker.fusion(&pose[i], dpose[i], dt); - t3[i] = t1; - } - - // // write integrated pose output to file - // output_writer << t1 << " "; - // for (auto &t : pose) - // output_writer << t.u << " " << t.v << " "; - // output_writer << std::endl; - - // update roi - int x = pose[i].u; - int y = pose[i].v; - qROI[i].setROI(x - roiWidth / 2, x + roiWidth / 2, y - roiHeight / 2, y + roiHeight / 2); - pose2img[i].push_back(pose[i]); - - // // output velocities estimations to file - // if(avgV) // true = write averaged vel - false = write event by event vel - // { - // vel_out << t1 << " " << dpose[jointNum].u << " " << dpose[jointNum].v << std::endl; - // } - // else - // { - // while(!vels.empty()) - // { - // joint V = vels.front(); - // vel_out << t1 << " " << V.u << " " << V.v << std::endl; - // vels.pop_front(); - // } - // } - } - // else if (bot_sklt) // there weren't events to process but a detection occured - // { - // // write detected output to file - // output_writer << t1 << " "; - // for (auto &t : pose) - // output_writer << t.u << " " << t.v << " "; - // output_writer << std::endl; - // } - } - } - if(pastSurf && initTimer) - { - for (auto &qi : *q) - { - // S.EROSupdate(qi.x, qi.y); - S.TOSupdate(qi.x, qi.y); - SAE.at(qi.y, qi.x) = float(qi.stamp*vtsHelper::tsscaler); - } - cv::normalize(SAE,SAE_vis, 0, 1, cv::NORM_MINMAX); - } - } - } -}; - -int main(int argc, char *argv[]) -{ - /* initialize yarp network */ - yarp::os::Network yarp; - if (!yarp.checkNetwork(2)) - { - std::cout << "Could not connect to YARP" << std::endl; - return false; - } - /* prepare and configure the resource finder */ - yarp::os::ResourceFinder rf; - rf.configure(argc, argv); - /* create the module */ - jointTrack instance; - return instance.runModule(rf); -} diff --git a/example/joint_tracking_example/jointTracking.xml b/example/joint_tracking_example/jointTracking.xml deleted file mode 100644 index 5a8ac817..00000000 --- 
a/example/joint_tracking_example/jointTracking.xml +++ /dev/null @@ -1,37 +0,0 @@ - - jointTrack - - - - - - /mnt/Shared/code/hpe-core/example/joint_tracking_example/build/joint_track - --dimX 640 --dimY 480 --cv --F 24 --joint handR - localhost - - - - /file/ch3dvs:o - /glhpe-framer/AE:i - fast_tcp - - - - /atis3/AE:o - /jointTrack/AE:i - fast_tcp - - - - /op_detector_example_module/skl:o - /jointTrack/SKLT:i - fast_tcp - - - - /jointTrack/img:o - /img_vs - fast_tcp - - - \ No newline at end of file diff --git a/example/joint_tracking_example/joint_tracking_example.cpp b/example/joint_tracking_example/joint_tracking_example.cpp deleted file mode 100644 index bdba7c72..00000000 --- a/example/joint_tracking_example/joint_tracking_example.cpp +++ /dev/null @@ -1,613 +0,0 @@ -#include -#include -#include -#include -#include -#include -#include -#include "roiq.h" -#include -#include -#include -#include -#include -#include -#include - -using namespace ev; -using namespace yarp::os; -using namespace yarp::sig; -using namespace hpecore; -using std::tuple; -using yarp::os::Bottle; -using yarp::os::BufferedPort; -// using std::tuple; -// using skeleton = std::vector>; - - -class jointTrack : public RFModule, public Thread -{ - -private: - vReadPort> input_port; - BufferedPort input_sklt; - std::ofstream output_writer, aux_out, vel_out; - deque evsFullImg; - deque pose2img; - deque vels; - std::mutex m; - skeleton13 pose, dpose, poseGT; - hpecore::jointMotionEstimator tracker; - roiq qROI; - int roiWidth = 32; - int roiHeight = roiWidth; - jointName jointNum; - string jointNameStr; - int dimY, dimX; - int nevs = 0; - bool plotCv = false; - cv::Mat fullImg; - bool initTimer = false; - double avgF = 0; - BufferedPort > image_port; - float displayF = 25.0; // display frequency in Hz - bool showF = false; - cv::Mat matTe; - double** Te; - int method; - bool avgV = false; - int detF; - bool objectTracking = false; - bool h36 = false; - cv::Point ptJoint; - -public: - jointTrack() {} - - virtual bool configure(yarp::os::ResourceFinder &rf) - { - std::cout << "\033c"; - yInfo() << "\t\t - = HPE - JOINT MOTION ESTIMATION = -"; - //set the module name used to name ports - setName((rf.check("name", Value("/jointTrack")).asString()).c_str()); - - /* initialize yarp network */ - yarp::os::Network yarp; - if (!yarp.checkNetwork(2.0)) - { - std::cout << "Could not connect to YARP" << std::endl; - return false; - } - - //open io ports - if (!input_port.open(getName("/AE:i"))) - { - yError() << "Could not open input port"; - return false; - } - - if (!input_sklt.open(getName("/SKLT:i"))) - { - yError() << "Could not open input port"; - return false; - } - - if(!image_port.open(getName("/img:o"))) { - yError() << "Could not open output port"; - return false; - } - - detF = rf.check("detF", Value(10)).asInt(); - yInfo() << "Detections frequency = " << detF << " Hz"; - if(rf.check("obj")) - objectTracking = true; - - - - output_writer.open("output.txt"); - if (!output_writer.is_open()) - yError() << "Could not open pose writer!!"; - else - yInfo() << "Pose writer opened"; - - aux_out.open("aux_out.txt"); - if (!aux_out.is_open()) - yError() << "Could not open pose writer!!"; - else - yInfo() << "aux_out writer opened"; - - vel_out.open("vel_out.txt"); - if (!vel_out.is_open()) - yError() << "Could not open pose writer!!"; - else - yInfo() << "vel_out writer opened"; - - jointNameStr = rf.check("joint", Value("handL")).asString(); - jointNum = str2enum(jointNameStr); - displayF = rf.check("F", Value(25)).asFloat32(); - 
method = rf.check("M", Value(2)).asInt(); - - if(rf.check("avgV")) - avgV = true; - - - yInfo() << "Method for velocity estimations = " << method; - - // intialize velocities - for (size_t i = 0; i < dpose.size(); i++) - { - dpose[i].u = 0; - dpose[i].v = 0; - } - - if(rf.check("cv")) - plotCv = true; - dimX = rf.check("dimX", Value(304)).asInt(); - dimY = rf.check("dimY", Value(240)).asInt(); - if(rf.check("h36")) - h36 = true; - - // connect ports - if(!objectTracking) - { - if(!h36) // dhp19 dataset - { - yarp.connect("/file/ch3dvs:o", getName("/AE:i"), "fast_tcp"); - // yarp.connect("/file/ch3GT10Hzskeleton:o", getName("/SKLT:i"), "fast_tcp"); - yarp.connect("/file/ch3GTskeleton:o", getName("/SKLT:i"), "fast_tcp"); - } - else // H 3.6 dataset - { - yarp.connect("/file/ch0dvs:o", getName("/AE:i"), "fast_tcp"); - yarp.connect("/file/ch0GT50Hzskeleton:o", getName("/SKLT:i"), "fast_tcp"); - roiWidth = 20; - roiHeight = roiWidth; - } - } - else - { - yarp.connect("/file/leftdvs:o", getName("/AE:i"), "fast_tcp"); - yarp.connect("/file/502Hz:o", getName("/SKLT:i"), "fast_tcp"); - roiWidth = 60; - roiHeight = 100; - dimX = 640; - dimY = 480; - jointNum = str2enum("head"); - } - - yarp.connect(getName("/img:o"), "/yarpview/img:i", "fast_tcp"); - - - yInfo() << "Image dimensions = [" << dimX << ", " << dimY << "]"; - fullImg = cv::Mat::zeros(cv::Size(dimX, dimY), CV_32F); - cvtColor(fullImg, fullImg, cv::COLOR_GRAY2RGB); - if(rf.check("showF")) - showF = true; - if(plotCv) - { - cv::namedWindow("HPE OUTPUT", cv::WINDOW_NORMAL); - cv::resizeWindow("HPE OUTPUT", 346, 260); - - // cv::namedWindow("METHOD II", cv::WINDOW_NORMAL); - // cv::resizeWindow("METHOD II", 346, 260); - } - - - matTe = cv::Mat::zeros(cv::Size(dimX, dimY), CV_32F); - // Expected times - Te = new double*[dimY]; - for (int i = 0; i < dimY; i++) - Te[i] = new double[dimX]; - - for (int i = 0; i < dimY; i++) - { - for (int j = 0; j < dimX; j++) - { - Te[i][j] = 0; - } - } - - //start the asynchronous and synchronous threads - return Thread::start(); - } - - - virtual double getPeriod() - { - return 1/displayF; //period of synchrnous thread - } - - - bool interruptModule() - { - //if the module is asked to stop ask the asynchrnous thread to stop - std::cout << "\033c"; - return Thread::stop(); - } - - - void onStop() - { - //when the asynchrnous thread is asked to stop, close ports and do - //other clean up - input_port.close(); - input_sklt.close(); - image_port.close(); - output_writer.close(); - aux_out.close(); - // output_port.close(); - } - - - skeleton13 buildSklt(Bottle &readBottle) - { - skeleton13 newPose; - for (size_t i = 0; i < readBottle.size(); i = i + 2) - { - Value &u = readBottle.get(i); - Value &v = readBottle.get(i + 1); - newPose[i / 2].u = u.asInt32(); - newPose[i / 2].v = v.asInt32(); - } - return newPose; - } - - - //synchronous thread - virtual bool updateModule() - { - // plot ground-truth skeleton - if (initTimer) - drawSkeleton(poseGT); - - // plot tracked joint - while (!pose2img.empty()) - { - int x = pose2img.front()[jointNum].u; - int y = pose2img.front()[jointNum].v; - cv::Point pt(x, y); - cv::drawMarker(fullImg, pt, cv::Scalar(0.8, 0, 0), 0, 6); - pose2img.pop_front(); - } - - // double the resolution to add text - cv::Mat aux; - fullImg.copyTo(aux); - cv::resize(fullImg, aux, cv::Size(dimX * 2, dimY * 2), 0, 0, cv::INTER_CUBIC); - // Add text - cv::putText(aux, - "Detection", - cv::Point(dimX * 1.6, dimY * 1.8), // Coordinates - cv::FONT_HERSHEY_SIMPLEX, // Font - 0.8, // Scale - cv::Scalar(0.0, 
0.0, 0.8), // BGR Color - 1, // Line Thickness - cv::LINE_AA); // Anti-alias - cv::putText(aux, - "Tracking", - cv::Point(dimX * 1.6, dimY * 1.9), // Coordinates - cv::FONT_HERSHEY_SIMPLEX, // Font - 0.8, // Scale - cv::Scalar(0.8, 0.0, 0.0), // BGR Color - 1, // Line Thickness - cv::LINE_AA); // Anti-alias - std::string strF = std::to_string(int(avgF)); - std::string strdetF = std::to_string(int(detF)); - if (showF) - cv::putText(aux, - "Freq = " + strF + "Hz" + " - detF = " + strdetF + "Hz", - cv::Point(dimX * 0.05, dimY * 1.9), // Coordinates - cv::FONT_HERSHEY_SIMPLEX, // Font - 0.8, // Scale - cv::Scalar(0.8, 0.8, 0.8), // BGR Color - 1, // Line Thickness - cv::LINE_AA); // Anti-alias - - cv::putText(aux, - "HPE-core EDPR", - cv::Point(dimX * 0.5, dimY * 0.2), // Coordinates - cv::FONT_HERSHEY_SIMPLEX, // Font - 1.2, // Scale - cv::Scalar(0.8, 0.8, 0.8), // BGR Color - 1, // Line Thickness - cv::LINE_AA); // Anti-alias - - if (plotCv) // output image using opencv - { - cv::imshow("HPE OUTPUT", aux); - // cv::imshow("METHOD II", matTe); - cv::waitKey(1); - } - - // output image using yarp - aux *= 255; - cv::Mat img_out; - aux.convertTo(img_out, CV_8UC3); - image_port.prepare().copy(yarp::cv::fromCvMat(img_out)); - image_port.write(); - fullImg = cv::Vec3f(0.4, 0.4, 0.4); - - return Thread::isRunning(); - } - - - void drawSkeleton(skeleton13 poseGT) - { - // plot detected joints - for(int i=0; i<13; i++) - { - int x = int(poseGT[i].u); - int y = int(poseGT[i].v); - cv::Point pt(x, y); - if(x && y) cv::drawMarker(fullImg, pt, cv::Scalar(0, 0, 0.8), 1, 8); - - } - // plot links between joints - cv::Point head(int(poseGT[0].u), int(poseGT[0].v)); - cv::Point shoulderR(int(poseGT[1].u), int(poseGT[1].v)); - cv::Point shoulderL(int(poseGT[2].u), int(poseGT[2].v)); - cv::Point elbowR(int(poseGT[3].u), int(poseGT[3].v)); - cv::Point elbowL(int(poseGT[4].u), int(poseGT[4].v)); - cv::Point hipL(int(poseGT[5].u), int(poseGT[5].v)); - cv::Point hipR(int(poseGT[6].u), int(poseGT[6].v)); - cv::Point handR(int(poseGT[7].u), int(poseGT[7].v)); - cv::Point handL(int(poseGT[8].u), int(poseGT[8].v)); - cv::Point kneeR(int(poseGT[9].u), int(poseGT[9].v)); - cv::Point kneeL(int(poseGT[10].u), int(poseGT[10].v)); - cv::Point footR(int(poseGT[11].u), int(poseGT[11].v)); - cv::Point footL(int(poseGT[12].u), int(poseGT[12].v)); - - cv::Scalar colorS = cv::Scalar(0, 0, 0.5); - int th = 1; - if(head.x && head.y && shoulderL.x && shoulderL.y) cv::line(fullImg, head, shoulderL, colorS, th); - if(head.x && head.y && shoulderR.x && shoulderR.y) cv::line(fullImg, head, shoulderR, colorS, th); - if(shoulderL.x && shoulderL.y && shoulderR.x && shoulderR.y) cv::line(fullImg, shoulderL, shoulderR, colorS, th); - if(shoulderL.x && shoulderL.y && elbowL.x && elbowL.y) cv::line(fullImg, shoulderL, elbowL, colorS, th); - if(shoulderR.x && shoulderR.y && elbowR.x && elbowR.y) cv::line(fullImg, shoulderR, elbowR, colorS, th); - if(shoulderL.x && shoulderL.y && hipL.x && hipL.y) cv::line(fullImg, shoulderL, hipL, colorS, th); - if(shoulderR.x && shoulderR.y && hipR.x && hipR.y) cv::line(fullImg, shoulderR, hipR, colorS, th); - if(hipL.x && hipL.y && hipR.x && hipR.y) cv::line(fullImg, hipL, hipR, colorS, th); - if(elbowL.x && elbowL.y && handL.x && handL.y) cv::line(fullImg, elbowL, handL, colorS, th); - if(elbowR.x && elbowR.y && handR.x && handR.y) cv::line(fullImg, elbowR, handR, colorS, th); - if(hipL.x && hipL.y && kneeL.x && kneeL.y) cv::line(fullImg, hipL, kneeL, colorS, th); - if(hipR.x && hipR.y && kneeR.x && kneeR.y) 
cv::line(fullImg, hipR, kneeR, colorS, th); - if(kneeR.x && kneeR.y && footR.x && footR.y) cv::line(fullImg, kneeR, footR, colorS, th); - if(kneeL.x && kneeL.y && footL.x && footL.y) cv::line(fullImg, kneeL, footL, colorS, th); - - // draw Roi rectangle - if(ptJoint.x && ptJoint.y) - { - cv::Point roi02(int(qROI.roi[0]), int(qROI.roi[2])); - cv::Point roi12(int(qROI.roi[1]), int(qROI.roi[2])); - cv::Point roi03(int(qROI.roi[0]), int(qROI.roi[3])); - cv::Point roi13(int(qROI.roi[1]), int(qROI.roi[3])); - cv::line(fullImg, roi02, roi12, cv::Scalar(0, 0.5, 0), 2); - cv::line(fullImg, roi03, roi13, cv::Scalar(0, 0.5, 0), 2); - cv::line(fullImg, roi02, roi03, cv::Scalar(0, 0.5, 0), 2); - cv::line(fullImg, roi12, roi13, cv::Scalar(0, 0.5, 0), 2); - } - - } - - - void evsToImage(deque &evs) - { - while(!evs.empty()) - { - int x = evs.front().x; - int y = evs.front().y; - int p = evs.front().polarity; - if(x>=0 && x< dimX && y>=0 && y(y, x) = cv::Vec3f(1.0, 1.0, 1.0); - else - fullImg.at(y, x) = cv::Vec3f(0.0, 0.0, 0.0); - } - evs.pop_front(); - } - } - - - //asynchronous thread run forever - void run() - { - Stamp evStamp, skltStamp; - Bottle *bot_sklt; - double skltTs; - const vector *q; - double t0 = Time::now(), t1=t0, t2 = t0, t2prev = t0, t3 = t0, t4 = t0; - // t0 = initial time - // t1 = global timer - // t2 = timer used for operation frequency calculation - // t3 = timer used for fusion - // t4 = timer used for detection frequency downsampling - long int mes = 0; // amount of porcessing cycles used to average operation frequency - double freq = 0; // operation frequenc - bool firstDet = false; // first detection took place - - while (!Thread::isStopping()) - { - t1 = Time::now() - t0; - // read detections - // int N = input_sklt.getPendingReads(); - bot_sklt = input_sklt.read(false); - input_sklt.getEnvelope(skltStamp); - - skltTs = skltStamp.getTime(); - - ptJoint.x = int(poseGT[jointNum].u); - ptJoint.y = int(poseGT[jointNum].v); - - if(bot_sklt && 1/(t1 - t4) <= detF) // there is a detection - { - Value &coords = (*bot_sklt).get(1); - Bottle *sklt_lst = coords.asList(); - // build skeleton from reading - skeleton13 builtPose = buildSklt(*sklt_lst); - pose = tracker.resetPose(builtPose); // reset pose - poseGT = pose; - // set roi for just one joint - int x = builtPose[jointNum].u; - int y = builtPose[jointNum].v; - qROI.setROI(x - roiWidth / 2, x + roiWidth / 2, y - roiHeight / 2, y + roiHeight / 2); - // output detections to file - aux_out << t1 << " "; - for (auto &t : builtPose) - aux_out << t.u << " " << t.v << " "; - aux_out << std::endl; - firstDet = true; - t4 = Time::now() - t0; - } - - // read events - int np = input_port.queryunprocessed(); - if(np && initTimer) - { - t2 = Time::now() - t0; - freq += 1/(t2-t2prev); - mes++; - avgF = freq/mes; // average freq - if(mes>1000) - { - mes= 0; - freq=0; - } - t2prev = t2; - // avgF = 1/(t2-tprev); // instant freq - // yInfo() << "\033c" << avgF; - } - nevs = 0; - for (int i = 0; i < np; i++) - { - if(!initTimer) - { - initTimer = true; - t0 = Time::now(); - } - q = input_port.read(evStamp); - if (!q || Thread::isStopping()) - return; - for (auto &qi : *q) - { - evsFullImg.push_back(qi); // save events to visualize in sync thread - if(qi.x >= qROI.roi[0] && qi.x= qROI.roi[2] && qi.y 2*nevs) - // qROI.setSize(nevs*2); - // else - // qROI.setSize(halfRoi); - qROI.setSize(halfRoi); - } - } - - // Add events to output image - if (initTimer) - evsToImage(evsFullImg); - - // Process data for tracking - if(pose.size() && firstDet) // a pose 
has been detected before - { - if (nevs && qROI.q.size() && !bot_sklt)// && qROI.q.front().stamp * vtsHelper::tsscaler > skltTs) // there are events to process - { - // separate events into deques to avoid using event-driven in hpe-core - std::deque evs; - std::deque evsTs; - getEventsUV(qROI.q, evs, evsTs, vtsHelper::tsscaler); // get events u,v coords - - if(method == 1) // Velocity estimation Method 1: time diff on adjacent events - { - if(nevs > 20) - { - dpose = tracker.method1(evs, evsTs, jointNum, nevs, vels); // get veocities from delta ts - double dt = (qROI.q.front().stamp - qROI.q.back().stamp) * vtsHelper::tsscaler; - tracker.fusion(&pose, dpose, dt); - } - else - { - dpose = tracker.resetVel(); - double dt = (qROI.q.front().stamp - qROI.q.back().stamp) * vtsHelper::tsscaler; - tracker.fusion(&pose, dpose, dt); - } - } - else if(method == 2) // Velocity estimation method 2: neighbor events - { - // yInfo() << nevs; - // tracker.estimateFire(evs, evsTs, evsPol, jointNum, nevs, pose, dpose, Te, matTe); - // double err = tracker.getError(evs, evsTs, evsPol, jointNum, nevs, pose, dpose, Te, matTe); - // dpose = tracker.setVel(jointNum, dpose, pose[jointNum].u, pose[jointNum].v, err); - dpose = tracker.estimateVelocity(evs, evsTs, jointNum, nevs, vels); - double dt = t1 - t3; - tracker.fusion(&pose, dpose, dt); - t3 = t1; - } - - // write integrated pose output to file - output_writer << t1 << " "; - for (auto &t : pose) - output_writer << t.u << " " << t.v << " "; - output_writer << std::endl; - - // update roi - int x = pose[jointNum].u; - int y = pose[jointNum].v; - qROI.setROI(x - roiWidth / 2, x + roiWidth / 2, y - roiHeight / 2, y + roiHeight / 2); - pose2img.push_back(pose); - - // output velocities estimations to file - if(avgV) // true = write averaged vel - false = write event by event vel - { - vel_out << t1 << " " << dpose[jointNum].u << " " << dpose[jointNum].v << std::endl; - } - else - { - while(!vels.empty()) - { - joint V = vels.front(); - vel_out << t1 << " " << V.u << " " << V.v << std::endl; - vels.pop_front(); - } - } - } - else if (bot_sklt) // there weren't events to process but a detection occured - { - // write detected output to file - output_writer << t1 << " "; - for (auto &t : pose) - output_writer << t.u << " " << t.v << " "; - output_writer << std::endl; - } - } - } - } -}; - -int main(int argc, char *argv[]) -{ - /* initialize yarp network */ - yarp::os::Network yarp; - if (!yarp.checkNetwork(2)) - { - std::cout << "Could not connect to YARP" << std::endl; - return false; - } - /* prepare and configure the resource finder */ - yarp::os::ResourceFinder rf; - rf.configure(argc, argv); - /* create the module */ - jointTrack instance; - return instance.runModule(rf); -} diff --git a/example/joint_tracking_example/past_surface.cpp b/example/joint_tracking_example/past_surface.cpp deleted file mode 100644 index 045b38be..00000000 --- a/example/joint_tracking_example/past_surface.cpp +++ /dev/null @@ -1,657 +0,0 @@ -#include -#include -#include -#include -#include -#include -#include -#include -#include "roiq.h" -#include -#include -#include -#include -#include -#include -#include - -using namespace ev; -using namespace yarp::os; -using namespace yarp::sig; -using namespace hpecore; -using std::tuple; -using yarp::os::Bottle; -using yarp::os::BufferedPort; -// using std::tuple; -// using skeleton = std::vector>; - - -class jointTrack : public RFModule, public Thread -{ - -private: - vReadPort> input_port; - BufferedPort input_sklt; - std::ofstream 
output_writer, aux_out, vel_out; - deque evsFullImg; - deque pose2img; - deque vels; - std::mutex m; - skeleton13 pose, dpose, poseGT; - hpecore::jointMotionEstimator tracker; - roiq qROI; - int roiWidth = 32; - int roiHeight = roiWidth; - jointName jointNum; - string jointNameStr; - int dimY, dimX; - int nevs = 0; - bool plotCv = false; - cv::Mat fullImg; - bool initTimer = false; - double avgF = 0; - BufferedPort > image_port; - float displayF = 25.0; // display frequency in Hz - bool showF = false; - cv::Mat matTe, matTe_vis; - double** Te; - int method; - bool avgV = false; - int detF; - bool objectTracking = false; - bool h36 = false; - bool pastSurf = false; - cv::Point ptJoint; - hpecore::surface S; - -public: - jointTrack() {} - - virtual bool configure(yarp::os::ResourceFinder &rf) - { - std::cout << "\033c"; - yInfo() << "\t\t - = HPE - JOINT MOTION ESTIMATION = -"; - //set the module name used to name ports - setName((rf.check("name", Value("/jointTrack")).asString()).c_str()); - - /* initialize yarp network */ - yarp::os::Network yarp; - if (!yarp.checkNetwork(2.0)) - { - std::cout << "Could not connect to YARP" << std::endl; - return false; - } - - //open io ports - if (!input_port.open(getName("/AE:i"))) - { - yError() << "Could not open input port"; - return false; - } - - if (!input_sklt.open(getName("/SKLT:i"))) - { - yError() << "Could not open input port"; - return false; - } - - if(!image_port.open(getName("/img:o"))) { - yError() << "Could not open output port"; - return false; - } - - detF = rf.check("detF", Value(10)).asInt(); - yInfo() << "Detections frequency = " << detF << " Hz"; - if(rf.check("obj")) - objectTracking = true; - - - - output_writer.open("output.txt"); - if (!output_writer.is_open()) - yError() << "Could not open pose writer!!"; - else - yInfo() << "Pose writer opened"; - - aux_out.open("aux_out.txt"); - if (!aux_out.is_open()) - yError() << "Could not open pose writer!!"; - else - yInfo() << "aux_out writer opened"; - - vel_out.open("vel_out.txt"); - if (!vel_out.is_open()) - yError() << "Could not open pose writer!!"; - else - yInfo() << "vel_out writer opened"; - - jointNameStr = rf.check("joint", Value("handL")).asString(); - jointNum = str2enum(jointNameStr); - displayF = rf.check("F", Value(25)).asFloat32(); - method = rf.check("M", Value(2)).asInt(); - - if(rf.check("avgV")) - avgV = true; - - - yInfo() << "Method for velocity estimations = " << method; - - // intialize velocities - for (size_t i = 0; i < dpose.size(); i++) - { - dpose[i].u = 0; - dpose[i].v = 0; - } - - if(rf.check("cv")) - plotCv = true; - dimX = rf.check("dimX", Value(346)).asInt(); - dimY = rf.check("dimY", Value(260)).asInt(); - if(rf.check("h36")) - h36 = true; - - // connect ports - if(!objectTracking) - { - if(!h36) // dhp19 dataset - { - yarp.connect("/file/ch3dvs:o", getName("/AE:i"), "fast_tcp"); - // yarp.connect("/file/ch3GT10Hzskeleton:o", getName("/SKLT:i"), "fast_tcp"); - yarp.connect("/file/ch3GTskeleton:o", getName("/SKLT:i"), "fast_tcp"); - } - else // H 3.6 dataset - { - yarp.connect("/file/ch0dvs:o", getName("/AE:i"), "fast_tcp"); - yarp.connect("/file/ch0GT50Hzskeleton:o", getName("/SKLT:i"), "fast_tcp"); - roiWidth = 20; - roiHeight = roiWidth; - } - } - else - { - yarp.connect("/file/leftdvs:o", getName("/AE:i"), "fast_tcp"); - yarp.connect("/file/502Hz:o", getName("/SKLT:i"), "fast_tcp"); - roiWidth = 60; - roiHeight = 100; - dimX = 640; - dimY = 480; - jointNum = str2enum("head"); - } - - yarp.connect(getName("/img:o"), "/yarpview/img:i", 
"fast_tcp"); - - - yInfo() << "Image dimensions = [" << dimX << ", " << dimY << "]"; - fullImg = cv::Mat::zeros(cv::Size(dimX, dimY), CV_32F); - cvtColor(fullImg, fullImg, cv::COLOR_GRAY2RGB); - if(rf.check("showF")) - showF = true; - - if(rf.check("past")) - pastSurf = true; - - if(plotCv) - { - int x0 = 2148, y0 = 0; - cv::namedWindow("HPE OUTPUT", cv::WINDOW_NORMAL); - cv::resizeWindow("HPE OUTPUT", dimX*2, dimY*2); - cv::moveWindow("HPE OUTPUT", x0, y0); - - if(pastSurf) - { - cv::namedWindow("TOS", cv::WINDOW_NORMAL); - cv::resizeWindow("TOS", dimX*2, dimY*2); - cv::moveWindow("TOS", x0+dimX*2.1, y0); - - - cv::namedWindow("TIME SURFACE", cv::WINDOW_NORMAL); - cv::resizeWindow("TIME SURFACE", dimX*2, dimY*2); - cv::moveWindow("TIME SURFACE", x0+dimX*2*2.1, y0); - } - } - - - matTe = cv::Mat::zeros(cv::Size(dimX, dimY), CV_32F); - // Expected times - Te = new double*[dimY]; - for (int i = 0; i < dimY; i++) - Te[i] = new double[dimX]; - - for (int i = 0; i < dimY; i++) - { - for (int j = 0; j < dimX; j++) - { - Te[i][j] = 0; - } - } - - - // past events surface - // S.getSurface(); - // S.init(dimX, dimY, 7, 0.3); // EROS - S.init(dimX, dimY, 7, 3); // TOS - - - //start the asynchronous and synchronous threads - return Thread::start(); - } - - - virtual double getPeriod() - { - return 1/displayF; //period of synchrnous thread - } - - - bool interruptModule() - { - //if the module is asked to stop ask the asynchrnous thread to stop - std::cout << "\033c"; - return Thread::stop(); - } - - - void onStop() - { - //when the asynchrnous thread is asked to stop, close ports and do - //other clean up - input_port.close(); - input_sklt.close(); - image_port.close(); - output_writer.close(); - aux_out.close(); - // output_port.close(); - } - - - skeleton13 buildSklt(Bottle &readBottle) - { - skeleton13 newPose; - for (size_t i = 0; i < readBottle.size(); i = i + 2) - { - Value &u = readBottle.get(i); - Value &v = readBottle.get(i + 1); - newPose[i / 2].u = u.asInt32(); - newPose[i / 2].v = v.asInt32(); - } - return newPose; - } - - - //synchronous thread - virtual bool updateModule() - { - // plot ground-truth skeleton - if (initTimer) - drawSkeleton(poseGT); - - // plot tracked joint - while (!pose2img.empty()) - { - int x = pose2img.front()[jointNum].u; - int y = pose2img.front()[jointNum].v; - cv::Point pt(x, y); - cv::drawMarker(fullImg, pt, cv::Scalar(0.8, 0, 0), 0, 6); - pose2img.pop_front(); - } - - // double the resolution to add text - cv::Mat aux; - fullImg.copyTo(aux); - cv::resize(fullImg, aux, cv::Size(dimX * 2, dimY * 2), 0, 0, cv::INTER_CUBIC); - // Add text - cv::putText(aux, - "Detection", - cv::Point(dimX * 1.6, dimY * 1.8), // Coordinates - cv::FONT_HERSHEY_SIMPLEX, // Font - 0.8, // Scale - cv::Scalar(0.0, 0.0, 0.8), // BGR Color - 1, // Line Thickness - cv::LINE_AA); // Anti-alias - cv::putText(aux, - "Tracking", - cv::Point(dimX * 1.6, dimY * 1.9), // Coordinates - cv::FONT_HERSHEY_SIMPLEX, // Font - 0.8, // Scale - cv::Scalar(0.8, 0.0, 0.0), // BGR Color - 1, // Line Thickness - cv::LINE_AA); // Anti-alias - std::string strF = std::to_string(int(avgF)); - std::string strdetF = std::to_string(int(detF)); - if (showF) - cv::putText(aux, - "Freq = " + strF + "Hz" + " - detF = " + strdetF + "Hz", - cv::Point(dimX * 0.05, dimY * 1.9), // Coordinates - cv::FONT_HERSHEY_SIMPLEX, // Font - 0.8, // Scale - cv::Scalar(0.8, 0.8, 0.8), // BGR Color - 1, // Line Thickness - cv::LINE_AA); // Anti-alias - - cv::putText(aux, - "HPE-core EDPR", - cv::Point(dimX * 0.5, dimY * 0.2), // 
Coordinates - cv::FONT_HERSHEY_SIMPLEX, // Font - 1.2, // Scale - cv::Scalar(0.8, 0.8, 0.8), // BGR Color - 1, // Line Thickness - cv::LINE_AA); // Anti-alias - - if (plotCv) // output image using opencv - { - cv::imshow("HPE OUTPUT", aux); - if(pastSurf) - { - cv::imshow("TOS", S.getSurface()); - cv::imshow("TIME SURFACE", matTe_vis); - } - cv::waitKey(1); - } - - // output image using yarp - aux *= 255; - cv::Mat img_out; - aux.convertTo(img_out, CV_8UC3); - image_port.prepare().copy(yarp::cv::fromCvMat(img_out)); - image_port.write(); - fullImg = cv::Vec3f(0.4, 0.4, 0.4); - - return Thread::isRunning(); - } - - - void drawSkeleton(skeleton13 poseGT) - { - // plot detected joints - for(int i=0; i<13; i++) - { - int x = int(poseGT[i].u); - int y = int(poseGT[i].v); - cv::Point pt(x, y); - if(x && y) cv::drawMarker(fullImg, pt, cv::Scalar(0, 0, 0.8), 1, 8); - - } - // plot links between joints - cv::Point head(int(poseGT[0].u), int(poseGT[0].v)); - cv::Point shoulderR(int(poseGT[1].u), int(poseGT[1].v)); - cv::Point shoulderL(int(poseGT[2].u), int(poseGT[2].v)); - cv::Point elbowR(int(poseGT[3].u), int(poseGT[3].v)); - cv::Point elbowL(int(poseGT[4].u), int(poseGT[4].v)); - cv::Point hipL(int(poseGT[5].u), int(poseGT[5].v)); - cv::Point hipR(int(poseGT[6].u), int(poseGT[6].v)); - cv::Point handR(int(poseGT[7].u), int(poseGT[7].v)); - cv::Point handL(int(poseGT[8].u), int(poseGT[8].v)); - cv::Point kneeR(int(poseGT[9].u), int(poseGT[9].v)); - cv::Point kneeL(int(poseGT[10].u), int(poseGT[10].v)); - cv::Point footR(int(poseGT[11].u), int(poseGT[11].v)); - cv::Point footL(int(poseGT[12].u), int(poseGT[12].v)); - - cv::Scalar colorS = cv::Scalar(0, 0, 0.5); - int th = 1; - if(head.x && head.y && shoulderL.x && shoulderL.y) cv::line(fullImg, head, shoulderL, colorS, th); - if(head.x && head.y && shoulderR.x && shoulderR.y) cv::line(fullImg, head, shoulderR, colorS, th); - if(shoulderL.x && shoulderL.y && shoulderR.x && shoulderR.y) cv::line(fullImg, shoulderL, shoulderR, colorS, th); - if(shoulderL.x && shoulderL.y && elbowL.x && elbowL.y) cv::line(fullImg, shoulderL, elbowL, colorS, th); - if(shoulderR.x && shoulderR.y && elbowR.x && elbowR.y) cv::line(fullImg, shoulderR, elbowR, colorS, th); - if(shoulderL.x && shoulderL.y && hipL.x && hipL.y) cv::line(fullImg, shoulderL, hipL, colorS, th); - if(shoulderR.x && shoulderR.y && hipR.x && hipR.y) cv::line(fullImg, shoulderR, hipR, colorS, th); - if(hipL.x && hipL.y && hipR.x && hipR.y) cv::line(fullImg, hipL, hipR, colorS, th); - if(elbowL.x && elbowL.y && handL.x && handL.y) cv::line(fullImg, elbowL, handL, colorS, th); - if(elbowR.x && elbowR.y && handR.x && handR.y) cv::line(fullImg, elbowR, handR, colorS, th); - if(hipL.x && hipL.y && kneeL.x && kneeL.y) cv::line(fullImg, hipL, kneeL, colorS, th); - if(hipR.x && hipR.y && kneeR.x && kneeR.y) cv::line(fullImg, hipR, kneeR, colorS, th); - if(kneeR.x && kneeR.y && footR.x && footR.y) cv::line(fullImg, kneeR, footR, colorS, th); - if(kneeL.x && kneeL.y && footL.x && footL.y) cv::line(fullImg, kneeL, footL, colorS, th); - - // draw Roi detection rectangle - if(ptJoint.x && ptJoint.y) - { - cv::Point roi02(ptJoint.x-roiWidth/2, ptJoint.y-roiHeight/2); - cv::Point roi12(ptJoint.x+roiWidth/2, ptJoint.y-roiHeight/2); - cv::Point roi03(ptJoint.x-roiWidth/2, ptJoint.y+roiHeight/2); - cv::Point roi13(ptJoint.x+roiWidth/2, ptJoint.y+roiHeight/2); - cv::line(fullImg, roi02, roi12, cv::Scalar(0, 0, 0.25), 1); - cv::line(fullImg, roi03, roi13, cv::Scalar(0, 0, 0.25), 1); - cv::line(fullImg, roi02, roi03, 
cv::Scalar(0, 0, 0.25), 1); - cv::line(fullImg, roi12, roi13, cv::Scalar(0, 0, 0.25), 1); - } - - - // draw Roi rectangle - if(ptJoint.x && ptJoint.y) - { - cv::Point roi02(int(qROI.roi[0]), int(qROI.roi[2])); - cv::Point roi12(int(qROI.roi[1]), int(qROI.roi[2])); - cv::Point roi03(int(qROI.roi[0]), int(qROI.roi[3])); - cv::Point roi13(int(qROI.roi[1]), int(qROI.roi[3])); - cv::line(fullImg, roi02, roi12, cv::Scalar(0, 0.5, 0), 2); - cv::line(fullImg, roi03, roi13, cv::Scalar(0, 0.5, 0), 2); - cv::line(fullImg, roi02, roi03, cv::Scalar(0, 0.5, 0), 2); - cv::line(fullImg, roi12, roi13, cv::Scalar(0, 0.5, 0), 2); - } - - } - - - void evsToImage(deque &evs) - { - while(!evs.empty()) - { - int x = evs.front().x; - int y = evs.front().y; - int p = evs.front().polarity; - if(x>=0 && x< dimX && y>=0 && y(y, x) = cv::Vec3f(1.0, 1.0, 1.0); - else - fullImg.at(y, x) = cv::Vec3f(0.0, 0.0, 0.0); - } - evs.pop_front(); - } - } - - - //asynchronous thread run forever - void run() - { - Stamp evStamp, skltStamp; - Bottle *bot_sklt; - double skltTs; - const vector *q; - double t0 = Time::now(), t1=t0, t2 = t0, t2prev = t0, t3 = t0, t4 = t0; - // t0 = initial time - // t1 = global timer - // t2 = timer used for operation frequency calculation - // t3 = timer used for fusion - // t4 = timer used for detection frequency downsampling - long int mes = 0; // amount of porcessing cycles used to average operation frequency - double freq = 0; // operation frequenc - bool firstDet = false; // first detection took place - - while (!Thread::isStopping()) - { - t1 = Time::now() - t0; - // read detections - // int N = input_sklt.getPendingReads(); - bot_sklt = input_sklt.read(false); - input_sklt.getEnvelope(skltStamp); - - skltTs = skltStamp.getTime(); - - ptJoint.x = int(poseGT[jointNum].u); - ptJoint.y = int(poseGT[jointNum].v); - - if(bot_sklt && 1/(t1 - t4) <= detF) // there is a detection - { - Value &coords = (*bot_sklt).get(1); - Bottle *sklt_lst = coords.asList(); - // build skeleton from reading - skeleton13 builtPose = buildSklt(*sklt_lst); - pose = tracker.resetPose(builtPose); // reset pose - poseGT = pose; - // set roi for just one joint - int x = builtPose[jointNum].u; - int y = builtPose[jointNum].v; - qROI.setROI(x - roiWidth / 2, x + roiWidth / 2, y - roiHeight / 2, y + roiHeight / 2); - // output detections to file - aux_out << t1 << " "; - for (auto &t : builtPose) - aux_out << t.u << " " << t.v << " "; - aux_out << std::endl; - firstDet = true; - t4 = Time::now() - t0; - } - - // read events - int np = input_port.queryunprocessed(); - if(np && initTimer) - { - t2 = Time::now() - t0; - freq += 1/(t2-t2prev); - mes++; - avgF = freq/mes; // average freq - if(mes>1000) - { - mes= 0; - freq=0; - } - t2prev = t2; - // avgF = 1/(t2-tprev); // instant freq - // yInfo() << "\033c" << avgF; - } - nevs = 0; - for (int i = 0; i < np; i++) - { - if(!initTimer) - { - initTimer = true; - t0 = Time::now(); - } - q = input_port.read(evStamp); - if (!q || Thread::isStopping()) - return; - for (auto &qi : *q) - { - evsFullImg.push_back(qi); // save events to visualize in sync thread - if(qi.x >= qROI.roi[0] && qi.x= qROI.roi[2] && qi.y skltTs) // there are events to process - { - // separate events into deques to avoid using event-driven in hpe-core - std::deque evs; - std::deque evsTs; - getEventsUV(qROI.q, evs, evsTs, vtsHelper::tsscaler); // get events u,v coords - - if(method == 1) // Velocity estimation Method 1: time diff on adjacent events - { - qROI.setSize(int((qROI.roi[1] - qROI.roi[0]) * (qROI.roi[3] - 
qROI.roi[2])*1.5)); - if(nevs > 20) - { - dpose = tracker.method1(evs, evsTs, jointNum, nevs, vels); // get veocities from delta ts - double dt = (qROI.q.front().stamp - qROI.q.back().stamp) * vtsHelper::tsscaler; - tracker.fusion(&pose, dpose, dt); - } - else - { - dpose = tracker.resetVel(); - double dt = (qROI.q.front().stamp - qROI.q.back().stamp) * vtsHelper::tsscaler; - tracker.fusion(&pose, dpose, dt); - } - } - else if(method == 2) // Velocity estimation method 2: neighbor events - { - int halfRoi = int((qROI.roi[1] - qROI.roi[0]) * (qROI.roi[3] - qROI.roi[2])*0.5); - if(pastSurf) qROI.setSize(nevs); - else qROI.setSize(halfRoi); - // yInfo() << nevs; - // tracker.estimateFire(evs, evsTs, evsPol, jointNum, nevs, pose, dpose, Te, matTe); - // double err = tracker.getError(evs, evsTs, evsPol, jointNum, nevs, pose, dpose, Te, matTe); - // dpose = tracker.setVel(jointNum, dpose, pose[jointNum].u, pose[jointNum].v, err); - if(pastSurf) dpose[jointNum] = tracker.estimateVelocity(evs, evsTs, nevs, vels, S.getSurface(), matTe); - else dpose[jointNum] = tracker.estimateVelocity(evs, evsTs, nevs, vels); - double dt = t1 - t3; - tracker.fusion(&pose[jointNum], dpose[jointNum], dt); - t3 = t1; - } - - // write integrated pose output to file - output_writer << t1 << " "; - for (auto &t : pose) - output_writer << t.u << " " << t.v << " "; - output_writer << std::endl; - - // update roi - int x = pose[jointNum].u; - int y = pose[jointNum].v; - qROI.setROI(x - roiWidth / 2, x + roiWidth / 2, y - roiHeight / 2, y + roiHeight / 2); - pose2img.push_back(pose); - - // output velocities estimations to file - if(avgV) // true = write averaged vel - false = write event by event vel - { - vel_out << t1 << " " << dpose[jointNum].u << " " << dpose[jointNum].v << std::endl; - } - else - { - while(!vels.empty()) - { - joint V = vels.front(); - vel_out << t1 << " " << V.u << " " << V.v << std::endl; - vels.pop_front(); - } - } - } - else if (bot_sklt) // there weren't events to process but a detection occured - { - // write detected output to file - output_writer << t1 << " "; - for (auto &t : pose) - output_writer << t.u << " " << t.v << " "; - output_writer << std::endl; - } - } - - for (auto &qi : *q) - { - // S.EROSupdate(qi.x, qi.y); - S.TOSupdate(qi.x, qi.y); - matTe.at(qi.y, qi.x) = float(qi.stamp*vtsHelper::tsscaler); - } - cv::normalize(matTe,matTe_vis, 0, 1, cv::NORM_MINMAX); - // cv::normalize(matTe,matTe, 0, 1, cv::NORM_MINMAX); - } - } -}; - -int main(int argc, char *argv[]) -{ - /* initialize yarp network */ - yarp::os::Network yarp; - if (!yarp.checkNetwork(2)) - { - std::cout << "Could not connect to YARP" << std::endl; - return false; - } - /* prepare and configure the resource finder */ - yarp::os::ResourceFinder rf; - rf.configure(argc, argv); - /* create the module */ - jointTrack instance; - return instance.runModule(rf); -} diff --git a/example/joint_tracking_example/roiq.cpp b/example/joint_tracking_example/roiq.cpp deleted file mode 100644 index 0de1232d..00000000 --- a/example/joint_tracking_example/roiq.cpp +++ /dev/null @@ -1,34 +0,0 @@ -#include "roiq.h" - -roiq::roiq() -{ - roi.resize(4); - n = 1000; - roi[0] = 0; roi[1] = 1000; - roi[2] = 0; roi[3] = 1000; - use_TW = false; -} - -void roiq::setSize(unsigned int value) -{ - //if TW n is in clock-ticks - //otherwise n is in # events. 
- n = value; - while(q.size() > n) - q.pop_back(); -} - -void roiq::setROI(int xl, int xh, int yl, int yh) -{ - roi[0] = xl; roi[1] = xh; - roi[2] = yl; roi[3] = yh; -} - -int roiq::add(const ev::AE &v) -{ - - if(v.x < roi[0] || v.x > roi[1] || v.y < roi[2] || v.y > roi[3]) - return 0; - q.push_front(v); - return 1; -} \ No newline at end of file diff --git a/example/joint_tracking_example/roiq.h b/example/joint_tracking_example/roiq.h deleted file mode 100644 index fe10597e..00000000 --- a/example/joint_tracking_example/roiq.h +++ /dev/null @@ -1,23 +0,0 @@ -#include -#include -#include - -using namespace ev; -using namespace yarp::os; -using namespace yarp::sig; - -class roiq -{ -public: - - std::deque q; - unsigned int n; - yarp::sig::Vector roi; - bool use_TW; - - roiq(); - void setSize(unsigned int value); - void setROI(int xl, int xh, int yl, int yh); - int add(const ev::AE &v); - -}; \ No newline at end of file From bc60ee1602727ae710a2e7a423be5f61039c6168 Mon Sep 17 00:00:00 2001 From: Arren Glover Date: Tue, 7 Feb 2023 11:53:19 +0100 Subject: [PATCH 17/23] Update README.md --- example/op_detector_example_module/README.md | 6 +----- 1 file changed, 1 insertion(+), 5 deletions(-) diff --git a/example/op_detector_example_module/README.md b/example/op_detector_example_module/README.md index dd877d15..5925a8c8 100644 --- a/example/op_detector_example_module/README.md +++ b/example/op_detector_example_module/README.md @@ -28,14 +28,10 @@ The software was tested on Ubuntu 20.04.2 LTS with an Nvidia GPU. $ cd $ git clone git@github.com:event-driven-robotics/hpe-core.git $ cd hpe-core/example/op_detector_example_module - $ docker build -t op-yarp --ssh default --build-arg ssh_pub_key="$(cat ~/.ssh/)" --build-arg ssh_prv_key="$(cat ~/.ssh/.pub)" - < Dockerfile + $ docker build -t op-yarp - < Dockerfile ``` :bulb: `` is the parent directory in which the repository is cloned -:bulb: The ssh keys are required to access hpe-core as it is currently private. [Create a new ssh key](https://docs.github.com/en/github/authenticating-to-github/connecting-to-github-with-ssh/generating-a-new-ssh-key-and-adding-it-to-the-ssh-agent) if required. - -:warning: Ensure your ssh key is built without a passphrase. - ## Live Atis Camera Usage - Run the Docker container and, inside it, run the yarp manager and server ```shell From ceca5c28afa9b7c0bb608844458a953d034d0698 Mon Sep 17 00:00:00 2001 From: Arren Glover Date: Tue, 7 Feb 2023 11:54:05 +0100 Subject: [PATCH 18/23] Update README.md --- example/movenet/README.md | 6 ++---- 1 file changed, 2 insertions(+), 4 deletions(-) diff --git a/example/movenet/README.md b/example/movenet/README.md index 9d03116a..823a309c 100644 --- a/example/movenet/README.md +++ b/example/movenet/README.md @@ -17,9 +17,7 @@ This is A Pytorch implementation of MoveNet inspired from the [Movenet.Pytorch]( $ cd $ git clone git@github.com:event-driven-robotics/hpe-core.git $ cd hpe-core/example/movenet - $ docker build -t movenet --ssh default \ - $ --build-arg ssh_pub_key="$(cat ~/.ssh/.pub)"\ - $ --build-arg ssh_prv_key="$(cat ~/.ssh/)" - < Dockerfile + $ docker build -t movenet - < Dockerfile ``` :bulb: `` is the parent directory of choice in which the repository is cloned This will create a Docker image names movenet. 
Before running docker, instruct the host to accept GUI windows with the following command: @@ -67,4 +65,4 @@ To create csv files from a stored dataset of eros frames: --results_path <> -``` \ No newline at end of file +``` From c2ac9ee5b83578ef4a9a7cdcb1df0a3c28ba0c64 Mon Sep 17 00:00:00 2001 From: Arren Glover Date: Tue, 7 Feb 2023 11:54:39 +0100 Subject: [PATCH 19/23] Update README.md --- example/yarp-glhpe/README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/example/yarp-glhpe/README.md b/example/yarp-glhpe/README.md index 53463af8..78a786ef 100644 --- a/example/yarp-glhpe/README.md +++ b/example/yarp-glhpe/README.md @@ -13,7 +13,7 @@ The software was tested on Ubuntu 20.04.2 LTS without GPU support. $ cd $ git clone git@github.com:event-driven-robotics/hpe-core.git $ cd hpe-core/example/yarp_glhpe - $ docker build -t gl_hpe:yarp < Dockerfile + $ docker build -t gl_hpe:yarp - < Dockerfile ``` :bulb: `` is the parent directory in which the repository is cloned From 3b3635692d7191812687cb761f57bc605c22e7f8 Mon Sep 17 00:00:00 2001 From: Arren Glover Date: Tue, 7 Feb 2023 12:03:44 +0100 Subject: [PATCH 20/23] Create README.MD --- datasets/h36m/README.MD | 5 +++++ 1 file changed, 5 insertions(+) create mode 100644 datasets/h36m/README.MD diff --git a/datasets/h36m/README.MD b/datasets/h36m/README.MD new file mode 100644 index 00000000..0cf3eefb --- /dev/null +++ b/datasets/h36m/README.MD @@ -0,0 +1,5 @@ +# Human 3.6 million + +link to dataset + +how to use file to produce events from H3.6m From 384cd09e520cc74c60c2d0e8ace70ca1f980fa87 Mon Sep 17 00:00:00 2001 From: arrenglover Date: Tue, 7 Feb 2023 11:09:26 +0000 Subject: [PATCH 21/23] removed old dataset files --- datasets/e2vid_iterator.py | 130 ------------------------------------- datasets/yarp_to_eros.py | 84 ------------------------ 2 files changed, 214 deletions(-) delete mode 100644 datasets/e2vid_iterator.py delete mode 100644 datasets/yarp_to_eros.py diff --git a/datasets/e2vid_iterator.py b/datasets/e2vid_iterator.py deleted file mode 100644 index 22c6a47b..00000000 --- a/datasets/e2vid_iterator.py +++ /dev/null @@ -1,130 +0,0 @@ - -import argparse -import os -import torch - -from image_reconstructor import ImageReconstructor -from options.inference_options import set_inference_options -from utils.inference_utils import events_to_voxel_grid, events_to_voxel_grid_pytorch -from utils.loading_utils import load_model, get_device -from utils.timers import Timer - - -import sys -sys.path.append('/path/to/hpe-core') - -import datasets.utils.parsing as parsing - -from datasets.utils.mat_files import loadmat - - -if __name__ == "__main__": - - parser = argparse.ArgumentParser() - parser.add_argument('-sw', '--sensor_width', default=346, type=int, help='width of the sensor') - parser.add_argument('-sh', '--sensor_height', default=260, type=int, help='height of the sensor') - parser.add_argument('-cid', '--cam_id', default=0, type=int, help='id of the DVS camera') - parser.add_argument('-ie', '--events_file', required=True, type=str) - parser.add_argument('--fixed_duration', dest='fixed_duration', action='store_true') - parser.set_defaults(fixed_duration=False) - parser.add_argument('-N', '--window_size', default=7500, type=int, - help="Size of each event window, in number of events. Ignored if --fixed_duration=True") - parser.add_argument('-T', '--window_duration', default=33.33, type=float, - help="Duration of each event window, in milliseconds. 
Ignored if --fixed_duration=False") - parser.add_argument('--num_events_per_pixel', default=0.35, type=float, - help='in case N (window size) is not specified, it will be \ - automatically computed as N = width * height * num_events_per_pixel') - parser.add_argument('--skipevents', default=0, type=int) - parser.add_argument('--suboffset', default=0, type=int) - parser.add_argument('--compute_voxel_grid_on_cpu', dest='compute_voxel_grid_on_cpu', action='store_true') - parser.set_defaults(compute_voxel_grid_on_cpu=False) - - set_inference_options(parser) - - args = parser.parse_args() - - width = args.sensor_width - height = args.sensor_height - print('Sensor size: {} x {}'.format(width, height)) - - # create subfolder for the selected cam - subfolder_path = f'{args.output_folder}/{args.cam_id}' - args.output_folder = subfolder_path - try: - os.umask(0) - os.makedirs(subfolder_path) - except FileExistsError: - print(f'Folder {subfolder_path} already exists') - exit(0) - except: - print(f'Could not create folder {subfolder_path}') - exit(0) - - # Load model - model = load_model('pretrained/E2VID_lightweight.pth.tar') - device = get_device(args.use_gpu) - - model = model.to(device) - model.eval() - - reconstructor = ImageReconstructor(model, height, width, model.num_bins, args) - - """ Read chunks of events using Pandas """ - - # Loop through the events and reconstruct images - N = args.window_size - if not args.fixed_duration: - if N is None: - N = int(width * height * args.num_events_per_pixel) - print('Will use {} events per tensor (automatically estimated with num_events_per_pixel={:0.2f}).'.format( - N, args.num_events_per_pixel)) - else: - print('Will use {} events per tensor (user-specified)'.format(N)) - mean_num_events_per_pixel = float(N) / float(width * height) - if mean_num_events_per_pixel < 0.1: - print('!!Warning!! the number of events used ({}) seems to be low compared to the sensor size. \ - The reconstruction results might be suboptimal.'.format(N)) - elif mean_num_events_per_pixel > 1.5: - print('!!Warning!! the number of events used ({}) seems to be high compared to the sensor size. 
\ - The reconstruction results might be suboptimal.'.format(N)) - - initial_offset = args.skipevents - sub_offset = args.suboffset - start_index = initial_offset + sub_offset - - if args.compute_voxel_grid_on_cpu: - print('Will compute voxel grid on CPU.') - - # if args.fixed_duration: - # event_window_iterator = FixedDurationEventReader(args.events_file, - # duration_ms=args.window_duration, - # start_index=start_index) - # else: - # event_window_iterator = FixedSizeEventReader(args.events_file, num_events=N, start_index=start_index) - - data_dvs = loadmat(args.events_file) - - iterator = parsing.YarpHPEIterator(data_dvs['data']['left']['dvs'], data_skeleton) - - for fi, (events, skeleton, skeleton_ts, head_size, torso_size) in tqdm(enumerate(iterator)): - - last_timestamp = events[-1, 0] - - with Timer('Building event tensor'): - if args.compute_voxel_grid_on_cpu: - event_tensor = events_to_voxel_grid(events, - num_bins=model.num_bins, - width=width, - height=height) - event_tensor = torch.from_numpy(event_tensor) - else: - event_tensor = events_to_voxel_grid_pytorch(events, - num_bins=model.num_bins, - width=width, - height=height, - device=device) - - num_events_in_window = events.shape[0] - reconstructor.update_reconstruction(event_tensor, start_index + num_events_in_window, last_timestamp) - - start_index += num_events_in_window diff --git a/datasets/yarp_to_eros.py b/datasets/yarp_to_eros.py deleted file mode 100644 index 3828f8eb..00000000 --- a/datasets/yarp_to_eros.py +++ /dev/null @@ -1,84 +0,0 @@ - -import argparse - -import cv2 -import json -import pathlib - -from bimvee.importIitYarp import importIitYarp -from tqdm import tqdm - -import datasets.utils.export as export -import datasets.utils.parsing as parsing - -from datasets.utils.events_representation import EROS - - -def yarp_to_eros(data_dvs, data_skeleton, output_folder_path, frame_width, frame_height, - eros_k_size=7, gaussian_blur_k_size=5, gaussian_blur_sigma=0, frames_name_prefix=None): - - iterator = parsing.YarpHPEIterator(data_dvs['data']['left']['dvs'], data_skeleton) - eros = EROS(kernel_size=eros_k_size, frame_width=frame_width, frame_height=frame_height) - - hpecore_skeletons = [] - for fi, (events, skeleton, skeleton_ts, head_size, torso_size) in tqdm(enumerate(iterator)): - for ei in range(len(events)): - eros.update(vx=int(events[ei, 1]), vy=int(events[ei, 2])) - frame = eros.get_frame() - - # apply gaussian filter - kernel = (gaussian_blur_k_size, gaussian_blur_k_size) - frame = cv2.GaussianBlur(frame, kernel, gaussian_blur_sigma) - - if fi == 0: # Almost empty image, not beneficial for training - # kps_old = get_keypoints(pose, frame_height, frame_width) - continue - - # save eros frame - if frames_name_prefix: - frame_name = f'{frames_name_prefix}_frame_{fi:06}.png' - else: - frame_name = f'frame_{fi:06}.png' - frame_path = output_folder_path / frame_name - cv2.imwrite(str(frame_path.resolve()), frame) - - hpecore_skeletons.append(export.skeleton_to_dict(skeleton, frame_name, frame_width, frame_height, head_size, torso_size)) - - # save skeletons to a json files - skeletons_json_path = output_folder_path / 'skeletons.json' - with open(str(skeletons_json_path.resolve()), 'w') as f: - json.dump(hpecore_skeletons, f, ensure_ascii=False) - - -def main(args): - - # import skeletons - skeleton_folder_path = pathlib.Path(args.s) - skeleton_folder_path = skeleton_folder_path.resolve() - skeleton_file_path = skeleton_folder_path / 'data.log' - data_skeleton = parsing.import_yarp_skeleton_data(skeleton_file_path) 
-
-    # import events
-    dvs_folder_path = pathlib.Path(args.e)
-    dvs_folder_path = dvs_folder_path.resolve()
-    data_dvs = importIitYarp(filePathOrName=str(dvs_folder_path))
-
-    # create output folder
-    output_folder_path = pathlib.Path(args.o)
-    output_folder_path = output_folder_path.resolve()
-    output_folder_path.mkdir(parents=True, exist_ok=True)
-
-    yarp_to_eros(data_dvs, data_skeleton, output_folder_path, args.fw, args.fh, frames_name_prefix='')
-    # yarp_to_hpecore_skeleton(data_dvs, data_skeleton, output_folder_path, args.w, args.h, file_name_prefix='')
-
-
-if __name__ == '__main__':
-
-    parser = argparse.ArgumentParser()
-    parser.add_argument('-e', help='path to events yarp folder', required=True)
-    parser.add_argument('-s', help='path to skeletons yarp folder', required=True)
-    parser.add_argument('-fw', help='frame width', type=int, required=True)
-    parser.add_argument('-fh', help='frame height', type=int, required=True)
-    parser.add_argument('-o', help='path to output folder', required=True)
-    args = parser.parse_args()
-    main(args)

From 65f6fe437f10e1ab95246d89666cc9bd05169967 Mon Sep 17 00:00:00 2001
From: Arren Glover
Date: Tue, 7 Feb 2023 12:16:58 +0100
Subject: [PATCH 22/23] Update README.MD

---
 datasets/h36m/README.MD | 13 +++++++++++--
 1 file changed, 11 insertions(+), 2 deletions(-)

diff --git a/datasets/h36m/README.MD b/datasets/h36m/README.MD
index 0cf3eefb..4eb1b113 100644
--- a/datasets/h36m/README.MD
+++ b/datasets/h36m/README.MD
@@ -1,5 +1,14 @@
 # Human 3.6 million

-link to dataset
+[Human 3.6 million](http://vision.imar.ro/human3.6m/description.php) is an image-based dataset recorded at 50 Hz. An event-based version of the dataset can be created from the frames using [v2e](https://github.com/SensorsINI/v2e).
+
+### Usage
+
+Make events from frames
+
+Convert H5 to YARP
+
+Convert GT to YARP
+
+Convert YARP to training formats.

-how to use file to produce events from H3.6m

From f4bed56236e35dd48851d96cbd335728319d77bb Mon Sep 17 00:00:00 2001
From: Arren Glover
Date: Tue, 7 Feb 2023 13:27:15 +0100
Subject: [PATCH 23/23] Update README.MD

---
 datasets/h36m/README.MD | 11 ++++-------
 1 file changed, 4 insertions(+), 7 deletions(-)

diff --git a/datasets/h36m/README.MD b/datasets/h36m/README.MD
index 4eb1b113..5621609c 100644
--- a/datasets/h36m/README.MD
+++ b/datasets/h36m/README.MD
@@ -4,11 +4,8 @@

 ### Usage

-Make events from frames
-
-Convert H5 to YARP
-
-Convert GT to YARP
-
-Convert YARP to training formats.
+- `Run_v2e_on_h36m.py` is used to create high-temporal-resolution events from the frame inputs. Cropping can be used, in which case the video is cropped to `640x480` centered on the person in the image.
+- `h36_h5_to_yarp.py` is used to convert the output of `v2e` to the YARP format.
+- `export_to_yarp.py` is used to convert the ground truth and images to YARP format. If cropping is used in step 1, the same crop can be applied to the images and ground-truth joint positions.
+- The `export_xxx.py` scripts are used to convert YARP-format events to the image-based formats used for training and testing detectors.
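
The four scripts above form a frames → events → YARP → training-data pipeline. A minimal sketch of how they might be chained is shown below; every path and command-line flag here is an illustrative assumption rather than the scripts' actual interface (check each script's argument parser for the real options), and `export_eros.py` stands in for whichever `export_xxx.py` variant is required.

```shell
# Hypothetical invocation -- flag names and paths are placeholders, not the scripts' real options.
python3 Run_v2e_on_h36m.py --input S1_Walking.mp4 --output v2e_out/ --crop      # frames -> events (640x480 crop around the subject)
python3 h36_h5_to_yarp.py  --input v2e_out/events.h5 --output yarp_events/      # v2e HDF5 output -> YARP event format
python3 export_to_yarp.py  --input h36m_gt/ --output yarp_gt/ --crop            # ground-truth joints and images -> YARP, using the same crop
python3 export_eros.py     --input yarp_events/ --output training_frames/       # YARP events -> image-based training/testing format
```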