From 70269fe9601b6189914a739c4c0d42b598c3425b Mon Sep 17 00:00:00 2001 From: Atsushi Sakai Date: Mon, 3 Feb 2025 22:19:10 +0900 Subject: [PATCH 01/35] update getting started (#1138) --- .../python_for_robotics_main.rst} | 5 +---- docs/modules/1_introduction/introduction_main.rst | 7 ++++++- 2 files changed, 7 insertions(+), 5 deletions(-) rename docs/modules/1_introduction/{2_software_for_robotics/software_for_robotics_main.rst => 2_python_for_robotics/python_for_robotics_main.rst} (51%) diff --git a/docs/modules/1_introduction/2_software_for_robotics/software_for_robotics_main.rst b/docs/modules/1_introduction/2_python_for_robotics/python_for_robotics_main.rst similarity index 51% rename from docs/modules/1_introduction/2_software_for_robotics/software_for_robotics_main.rst rename to docs/modules/1_introduction/2_python_for_robotics/python_for_robotics_main.rst index 835441f85d..23f31da779 100644 --- a/docs/modules/1_introduction/2_software_for_robotics/software_for_robotics_main.rst +++ b/docs/modules/1_introduction/2_python_for_robotics/python_for_robotics_main.rst @@ -1,7 +1,4 @@ -Software for Robotics ----------------------- - Python for Robotics -~~~~~~~~~~~~~~~~~~~~~ +---------------------- TBD diff --git a/docs/modules/1_introduction/introduction_main.rst b/docs/modules/1_introduction/introduction_main.rst index ec1f237545..a7ce55f9bf 100644 --- a/docs/modules/1_introduction/introduction_main.rst +++ b/docs/modules/1_introduction/introduction_main.rst @@ -3,11 +3,16 @@ Introduction ============ +PythonRobotics is composed of two words: "Python" and "Robotics". +Therefore, I will first explain these two topics, Robotics and Python. +After that, I will provide an overview of the robotics technologies +covered in PythonRobotics. + .. 
toctree:: :maxdepth: 2 :caption: Table of Contents 1_definition_of_robotics/definition_of_robotics - 2_software_for_robotics/software_for_robotics + 2_python_for_robotics/python_for_robotics 3_technology_for_robotics/technology_for_robotics From 5b06435be9e24a73c8ee9d0f0acb1e409a118141 Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Tue, 4 Feb 2025 07:37:17 +0900 Subject: [PATCH 02/35] build(deps): bump ruff from 0.9.3 to 0.9.4 in /requirements (#1139) Bumps [ruff](https://github.com/astral-sh/ruff) from 0.9.3 to 0.9.4. - [Release notes](https://github.com/astral-sh/ruff/releases) - [Changelog](https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md) - [Commits](https://github.com/astral-sh/ruff/compare/0.9.3...0.9.4) --- updated-dependencies: - dependency-name: ruff dependency-type: direct:production update-type: version-update:semver-patch ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- requirements/requirements.txt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/requirements/requirements.txt b/requirements/requirements.txt index e8bed7b6d7..9d4e7deb4d 100644 --- a/requirements/requirements.txt +++ b/requirements/requirements.txt @@ -5,4 +5,4 @@ cvxpy == 1.5.3 pytest == 8.3.4 # For unit test pytest-xdist == 3.6.1 # For unit test mypy == 1.14.1 # For unit test -ruff == 0.9.3 # For unit test +ruff == 0.9.4 # For unit test From 7b7bd784093e4d46108b667c54261d054894e875 Mon Sep 17 00:00:00 2001 From: Atsushi Sakai Date: Wed, 5 Feb 2025 13:41:28 +0900 Subject: [PATCH 03/35] update introduction (#1141) --- .../python_for_robotics_main.rst | 46 +++++++++++++++++++ 1 file changed, 46 insertions(+) diff --git a/docs/modules/1_introduction/2_python_for_robotics/python_for_robotics_main.rst b/docs/modules/1_introduction/2_python_for_robotics/python_for_robotics_main.rst index 23f31da779..1ad5316f53 100644 --- 
a/docs/modules/1_introduction/2_python_for_robotics/python_for_robotics_main.rst
+++ b/docs/modules/1_introduction/2_python_for_robotics/python_for_robotics_main.rst
@@ -1,4 +1,50 @@
 Python for Robotics
 ----------------------
+Python for general-purpose programming
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+`Python `_ is a general-purpose programming language developed by
+`Guido van Rossum `_ in the late 1980s.
+
+Its features are as follows:
+
+#. High-level
+#. Interpreted
+#. Dynamic type system (type annotations are also supported)
+#. Emphasizes code readability
+#. Rapid prototyping
+#. Batteries included
+#. Interoperability with C and Fortran
+
+Due to these features, Python is one of the most popular programming
+languages for teaching programming to beginners.
+
+Python for Scientific Computing
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+Python itself was not designed for scientific computing.
+However, scientists quickly recognized its strengths.
+For example:
+
+#. Its high-level, interpreted nature lets scientists focus on their problems without dealing with low-level programming tasks such as memory management.
+#. Code readability, rapid prototyping, and the batteries-included standard library let scientists who are not professional programmers solve their problems easily.
+#. The interoperability with C and Fortran gives scientists access to existing, powerful scientific computing libraries.
+
+To address the growing needs of scientific computing, many libraries have been developed.
+
+- `NumPy `_ is the fundamental package for scientific computing with Python.
+- `SciPy `_ is a library that builds on NumPy and provides a large number of functions that operate on NumPy arrays and are useful for different types of scientific and engineering applications.
+- `Matplotlib `_ is a plotting library for the Python programming language and its numerical mathematics extension NumPy.
+- `Pandas `_ is a fast, powerful, flexible, and easy-to-use open-source data analysis and data manipulation library built on top of NumPy.
+- `SymPy `_ is a Python library for symbolic mathematics.
+
+More domain-specific libraries have also been developed:
+
+- `Scikit-learn `_ is a free software machine learning library for the Python programming language.
+- `Scikit-image `_ is a collection of algorithms for image processing.
+
+Python for Robotics
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
 TBD
+

From 322fead45a62417c5d0d7dbb07ef17067122da7b Mon Sep 17 00:00:00 2001
From: Aglargil <34728006+Aglargil@users.noreply.github.com>
Date: Wed, 5 Feb 2025 20:56:13 +0800
Subject: [PATCH 04/35] feat: add DistanceMap (#1142)

* feat: add DistanceMap

* feat: add DistanceMap test

* feat: add DistanceMap doc

* feat: DistanceMap doc update

* feat: DistanceMap update
---
 Mapping/DistanceMap/distance_map.py | 151 ++++++++++++++++++
 .../3_mapping/distance_map/distance_map.png | Bin 0 -> 32698 bytes
 .../distance_map/distance_map_main.rst | 27 ++++
 docs/modules/3_mapping/mapping_main.rst | 1 +
 tests/test_distance_map.py | 118 ++++++++++++++
 5 files changed, 297 insertions(+)
 create mode 100644 Mapping/DistanceMap/distance_map.py
 create mode 100644 docs/modules/3_mapping/distance_map/distance_map.png
 create mode 100644 docs/modules/3_mapping/distance_map/distance_map_main.rst
 create mode 100644 tests/test_distance_map.py

diff --git a/Mapping/DistanceMap/distance_map.py b/Mapping/DistanceMap/distance_map.py
new file mode 100644
index 0000000000..54c98c6a75
--- /dev/null
+++ b/Mapping/DistanceMap/distance_map.py
@@ -0,0 +1,151 @@
+"""
+Distance Map
+
+author: Wang Zheng (@Aglargil)
+
+Ref:
+
+- [Distance Transforms of Sampled Functions]
+(https://cs.brown.edu/people/pfelzens/papers/dt-final.pdf)
+"""
+
+import numpy as np
+import matplotlib.pyplot as plt
+
+INF = 1e20
+ENABLE_PLOT = True
+
+
+def compute_sdf(obstacles):
+    """
+    Compute the signed distance field (SDF) from a boolean field.
+
+    Parameters
+    ----------
+    obstacles : array_like
+        A 2D boolean array where '1' represents obstacles and '0' represents free space.
+
+    Returns
+    -------
+    array_like
+        A 2D array representing the signed distance field: positive values give the distance
+        to the nearest obstacle, and negative values give minus the distance to the nearest free cell.
+    """
+    a = compute_udf(obstacles)
+    b = compute_udf(obstacles == 0)
+    return a - b
+
+
+def compute_udf(obstacles):
+    """
+    Compute the unsigned distance field (UDF) from a boolean field.
+
+    Parameters
+    ----------
+    obstacles : array_like
+        A 2D boolean array where '1' represents obstacles and '0' represents free space.
+
+    Returns
+    -------
+    array_like
+        A 2D array of distances to the nearest obstacle, with the same shape as `obstacles`.
+    """
+    edt = obstacles.copy()
+    if not np.all(np.isin(edt, [0, 1])):
+        raise ValueError("Input array should only contain 0 and 1")
+    edt = np.where(edt == 0, INF, edt)
+    edt = np.where(edt == 1, 0, edt)
+    for row in range(len(edt)):
+        dt(edt[row])
+    edt = edt.T
+    for row in range(len(edt)):
+        dt(edt[row])
+    edt = edt.T
+    return np.sqrt(edt)
+
+
+def dt(d):
+    """
+    Compute the 1D distance transform under the squared Euclidean distance, in place.
+
+    Parameters
+    ----------
+    d : array_like
+        Input array containing the squared distances.
+
+    Returns
+    -------
+    None
+        `d` is overwritten with the transformed squared distances.
+ """ + v = np.zeros(len(d) + 1) + z = np.zeros(len(d) + 1) + k = 0 + v[0] = 0 + z[0] = -INF + z[1] = INF + for q in range(1, len(d)): + s = ((d[q] + q * q) - (d[int(v[k])] + v[k] * v[k])) / (2 * q - 2 * v[k]) + while s <= z[k]: + k = k - 1 + s = ((d[q] + q * q) - (d[int(v[k])] + v[k] * v[k])) / (2 * q - 2 * v[k]) + k = k + 1 + v[k] = q + z[k] = s + z[k + 1] = INF + k = 0 + for q in range(len(d)): + while z[k + 1] < q: + k = k + 1 + dx = q - v[k] + d[q] = dx * dx + d[int(v[k])] + + +def main(): + obstacles = np.array( + [ + [1, 0, 0, 0, 0], + [0, 1, 1, 1, 0], + [0, 1, 1, 1, 0], + [0, 0, 1, 1, 0], + [0, 0, 1, 0, 0], + [0, 0, 0, 0, 0], + [0, 0, 0, 0, 0], + [0, 0, 0, 0, 0], + [0, 0, 1, 0, 0], + [0, 0, 0, 0, 0], + [0, 0, 0, 0, 0], + ] + ) + + # Compute the signed distance field + sdf = compute_sdf(obstacles) + udf = compute_udf(obstacles) + + if ENABLE_PLOT: + _, (ax1, ax2, ax3) = plt.subplots(1, 3, figsize=(15, 5)) + + obstacles_plot = ax1.imshow(obstacles, cmap="binary") + ax1.set_title("Obstacles") + ax1.set_xlabel("x") + ax1.set_ylabel("y") + plt.colorbar(obstacles_plot, ax=ax1) + + udf_plot = ax2.imshow(udf, cmap="viridis") + ax2.set_title("Unsigned Distance Field") + ax2.set_xlabel("x") + ax2.set_ylabel("y") + plt.colorbar(udf_plot, ax=ax2) + + sdf_plot = ax3.imshow(sdf, cmap="RdBu") + ax3.set_title("Signed Distance Field") + ax3.set_xlabel("x") + ax3.set_ylabel("y") + plt.colorbar(sdf_plot, ax=ax3) + + plt.tight_layout() + plt.show() + + +if __name__ == "__main__": + main() diff --git a/docs/modules/3_mapping/distance_map/distance_map.png b/docs/modules/3_mapping/distance_map/distance_map.png new file mode 100644 index 0000000000000000000000000000000000000000..2d89252a70358ad1b60025d7bb3d9a217a2cefcc GIT binary patch literal 32698 zcmbrm1yq&m_C35Yjx81jAO;wKf*_zYqJoHoq>?Hk-3=0U5sD&82#R!x(rJRC(ug1_ z0s_*pX}pF~IQnuGgV3br1&E)};?wTvS? 
zdXb_A+vZ0VUp1Cr^EJfF3q``7P}NxWSD>*qy> z(r5qt`2^hphhM+mv9W3C&!0W!venZ4{P~H;Md4p>S^fX(;u{-HYm#OBb=@*ntXQ#X z_3DU`5l3zNYW-nW!>*JUxCE#BZZvJbzgs7Wzl5~)f`*W&sZBv{@z zPN5jv$L%eP*W9y&d2dP5c$!6nPMDOJwW*t?@yax@=?R+*Ev_EJ`s|FFWTP;h9A~4# zJM7NWlT=RaXZBqh8CnTK1@%q?t?YaDq{&y!d3tX4ys>P1X>V%1OYwMvk!wXuO>;}0 zG)1mF&>}bb26yhUu9_5ON5}N9U%%dY@F3xfdDh3aY2a5JYH~q)k5*~fc!{w1xI5w z+x8&JcH)k*eCa(NJ=>|#w(*myl#;zV&h1+!$Exx^JwG?wj*RK1P5TZWJXrHo|HcF7 z4))1*1^503!WQ)+;t3V&{I%`xn`dE~VgXU=r^%5s$^8E^<6gRyKZ=T z#to2W*6sA}BNvK`7cN|A`)-T>b|E1R9v+^tj}~rW7P+JO zZ9!E!E^@QyEmgr{TeWUow0&1idzmbECB67WXtDToEt6+YMjK0ydMO7-o7m)_a-HR? zB^{rW!ZeZ%AKsQ1l3|tdEXnH0)od3$i;2waO8rvtaR0nZSFUgh2&lcfyrf)ugR4rj z4L29pr=4lFk8Uf79Y1;UBzbsIYHe75zjep}?l}>gHCF%q-Rsw{E0I}h3wHR_F*KLL zcjVxrJKI9eDwRhnsGmCZ=uudhU3W&Cae?=Gu{Lw8$2D4(=T7PAF|e>CyncOP$p$ek zUANJ2YEVqB`P9^S4<{?@{+$}hC1f4;H5YV-`|HG1IO{h)KiBaoK2X4-K5lAqJoey? z)g2`dFHH4b5r5m*W2B*&I52XU`L?T>HsIrX1jzt=Lf%xO=Z?QIefu*fCCZ?ScWxgBC#$~OseFov>k#E%gW}oRS0-j zN!yv7yvwbdFgR!%Z25A5arM*i%Zt}N!uDKmlY3zvy)mBSx{r@KZoDEHPn{G&Q7)$EJf*y8Q$567@X$E;={qso=_gBQCrS~XTJbGXJN1?%f@ytlU% zas~0qCnsN3Jk^)ovSmwq>KF5Y(O@naeSQ5*Y{tqA8v`tg&etmyDwgc2RcyD3P|j1{ zzkmNw?<@LP+>%(*vzWWQ4QB_u%yUQ2;0jJvHE9;QIe0z{8Ro5Qjqcf|a_8`uWZhtB zHRCnY)NE~&i$n&FV2TgruMOc(K@cY^LM>h+oW!MDw{CURW;|nY9V|WE$-QI8U;_J& z9Y?NRyO#Om>HEQ=5(P*EQa^@prczC#~<{Q5|cOF+%sN`T|th4Isn{m5I z3F%)KdaV~$>&ctWb)#lAv|peXD1|vF~BVl`(z`7cELL zuH^9Z^J7Ouj8%&-LugMd4U;zcaF54#>GZ(~+@%tB>fjaeiGAB1I6lgq9M%*a{;Vf7 z{^MDS*%tx3jtUvyWQztd65~VMr!vg*Chdb=hpTJy-19yq>LxyVwArq^HlwgjRZA=G zS&@+JrcIkN$Gfxa3fUtKoAbTMWDVh4P5QpR@qHZm!1*wKS^oOkN`8YP=1UhZekb!h zZ^PLWT@7>;OH+q>ClVTC^QI?zIM~@syV4pGma|HIe?>2@VrUq(lzDIHK#{NlJ3G7c zw_DPyuu{K%Jr#fanH07{tbIN1X35%J65E0-&yy(d`SUSR1j@=(B~HYPlq#KAq)EBm zXFil*!eis&c5KQu4(nIRBf_s ztTVKD^S?j3!Buv736m|g&LLJkQD+RnzByjXA_|)_JR-sv5sHJGJCQef^ZJ;p4q0`X z(^KP>3EIM4^*NeM0;VOnD0jOe(}VY1Sw|2z>r-bnH8uG@E-O=V9q&pD5VDJr7iwq2 zgtb?y>sD^@*Lw8k#u^vh+Y7mK27O6RlQdvPXoI7vgX-)AvR%!1FO^cjxtU!gli5Bi51jo+! 
zwED6KqAqOfjbGW26o}^w0W7ILT;-0anAvhyvjS5jb7fd1aIe)PY?_=znM-L6Zki;F zcBeNlee>o`J2r2Yln+ZhzD%_($j1Be{&&NGaqW#SF4R+5T!w#;?E9!EcU<1e$|}Xp z-Fk;~LD6-c5!K_Fw$TY$-9p-qO_!X1Bo$O-JK0PQCAwGM+_ZnNkTcP+xA8?~ecJ<< zZdwY%?(BT}o=bV(zH{c7fQ&kG3o%OXNytLzZ3uKQ%leIC8b zw@ybKx&PIsXr+S4Nw)jpvi^*Eo}+Ez(?)FfyS}8U3-|14IIG8Yf8Fjg%r@~YEiDmo zaUX1dd@gOPe|F|0ukR5Z9UZT#4Xz;oiEolktIN(}Wr?+YyuZ8P5HE=}Jo)z)tlWAJ zVQ{^TE5)NG`YJ0TtDK*BPfbefx?LwcsC@tko@~1TT|x$?`h48i%ACPI&tI#TYFd4Y z?LM9^A%A?^wrzo!)%x)f0`;siwCWqi8IVHo27Z%Ijy~AB&kykeMnn{6L zm1+eb0(8X_xW{5xFmba$f>uV^TD=eNx${if?vp7NYpcV$NxAML?9k2U`(ZY{h+>zZ zbiSqD(!2)@#U|A5Iw{|G@7_iC=}1p!sYGUH*D{MZmUyy=)(rlrE^jmNJ))%*gn-h= zf*>J)@U>M~_>0RA^}I1ZBgD|^6chCnYP1=1qWqh@meS@LlSKY}=Bb`!NuXb&PjUC} zW6czcApY#`>+5U1&Y955B0N?Dyz?ZVDla`nolo)7$4x}&=DF23xMPd#aUC1vsjjX* zgPb7NwiFM&?Jtr3x2#gWPNOaUu0>f`ke+PM9Q#i7B$44d#5+AwAbz%Jy@2Vx_KN7+ z)IOjdOf3Va+DK5E)4&TAgdn(*zxRLow`kvBkXl97XCwwqSOyUY#G}NNx5J5BLb;J8{*~G&6{6pVQGm~O<`^F_!<5GHMgWGq} z9t84F-0q{Y!7kkyb^+6*xcO@d<6}f=KYxE0>4U0GKul~h+)osxk7p zND#3i;$mVtxTIQmib~vmp;dbLjtCY}=aJX5}OM3 za-UC`b|%5rh3w9a=jQP_bZY^+#4EW3={b$HM^*x`q?lGKVJwt^1cBBzhd64VL~sSJ z?HBXDaYNbEG>$qnj*2VUyiQ2l>J0<`@VR)b)pzWXL60zNYEkLf?uLx7IV&k6`w0vf>r6DkH;uj8FcrFpJ6mys`$W>yr%!Eh zGgTuQ26dTs_~w}83|qUTazM$~SC@+w#9R+XbT7NQob{|*@$mQ3{n!Qtp1}g{ku03{ zhp&ijYi(`qsCVh9jF1boMXVF5Yev2(&uDuP3wV0z^5qx+KLQz242oG$)V^=4#h5vA zHf>?wyjha$a95o?;H2Xh{#v5s&!rucV_?=0Qo+W{`&8+NRj8*b$q^fg+^2z^96guClz~8-p}z zz-iQK4te9jT$$fD=Zt*4;fxdmEPKYTr(RopY9x%BO(L33V1Vo7P?a+xYb9VB5I5BCI-QRp+wSB71F6piurUj=`O2Yd6lkv zJb_nAp4YBl_vOiF(gQAU_6m^}63Q2N4S4wwpg&}-8=L?8a2f(Y^zF`@b_;fDy)V@F zdL_!^#~QBu(}1m|4C-36YE9Okt1{neEeBWHjWqLkpd5J&NO55eW7mSN9KuMB^+zgqqJO@JL|{O!_)36m(O8QiE;FWa_!I}8 zBbB22^c287;VB3SBKJV#IKV?@M@+1-mgd34wCZODJm#x1UAzh2umW_np& z6EUKaAXMy`#KMpbqpcm6naoR228AL38%~Z51ppIpr8mxUMV&Wcx1Q1w_4Sn_hAN!C zHGeGw0|Wcco$)$0*cI&vYLTZsOND!~!)j}_2yjG=liBPo9FvXhDj*hkO*QJ{yw;9n zlyCTps=1+B-@tVjbp~-J>mF8y)vIaGxa)+3m#RCGY<)+vkNDIvlCSt|T8{vE8of9_ zm%m=Gy)2y6=$LcY+yxGUQjRH=2z=Q5EW1;yws?P0?6VjKlzvu}l=rnEHIph&S++f3 
zk3|9?%Xjj7#E%@~Vw3U+IlsU=0(|zVM&(kdXJY!>7#h5-#~eEFwNdK2uBwqi)`z_o z{6M+Mv2B~QprBx3ad8zwS*UP#`u;5Yu9#s#6-PWY4TW{;PWO-Qe{EANs@`K))oNpf z!17s%r(r+6=;Qhv7j>tiq=n2eV-_*#>7TQlge}=PYU$>9qHb8H$7f!fez@|od7378 zl<94Ho(Pj?-G)(*j)+VZh)+H2@9#gHRV906#kfI;Ybv41WXga{w?tdpeZ2){h1$kI zXe$ii7&8+ViP+SL9x3)x98<4oTfUg<+>=AKMG>Vkw#ZdOL&I-Fg~HyZIfMx!+{N+Z z_v=CW2-U5mY@F;+&+`RAiNWQZG_8zLMd1`}6Zj+yj3Ws`?o$IH_${YJZFC_iaxj*v z#`^{;locw+r$~7_1sorZ*;dWJcb)zk*08UNU<8+DYFs@vp%-Al7}X-GnkukS#;$cF zt=8F9X$wsbmb>btg>klXfCWYR9cBA+g~-< z#=uMD+f<$M0;UeB^wmd^scxhP4WP=BXgE);yr-0I_FslTM$QBadGFz~@2-9(ZJw4@ z*pS4aS|L9rZ$40cMA9fcESKBE4zdZ%k*cKR`Rg}s_yq^Y zBQFniRK6b;pGVJlBcq{?lT*M-vOo>x2iIsn z221;PZJ5CWCn@q?)Y93;iGiSvNsc3n7X^nB@HXL8!Tla8dzdI+RS9k~<hF_6X%4*#dVRVeG8gP zv14%W9mFPv{$^kPYkO~XhH0c-`SdL*Ng>1iv z0{jMJxsm}K#9SKIrdttSM@3OFlz^zihYpnx>aeX!-nX?VIM!`)tQ}zV(|U*6homSZ z#TY5KNo}X9qM{PtoW2v)R~1TjKG!i@)SCuE(-Q+jH*_Z?$$O{98>Txhu9PnU0j1oM zqMd15YT-7()bj0H6abMBwbo_`JJqNtNbtJ1cWSB>CF%xHhY*^KP|KA%E;aAELgCn=d-u;fk{?i?HWm~Kr zRTnTDF8qci8?|jd@)L6G(W6JnfCHr92U&29nvI06T`V%-NeCGbS{A~0n_gWp0cVui z8en3J9b{7cKr~>tfqeM+XPNgXW6A38u#mkfL+EC$9=w_0IkK(M7?+Rb!La zcJ9bKaC^2bP%~Cei`u_LR)AyDmfUIO;#;)M=pOK52&!cpMh?bw`cV`bpD*A0T))skoAolFp3uHnoQNi1j}} zk#~>hRa+)u*^dP9y?gxl@jcLffu!ETuiuh1 zukON_34}@e2#{O4b-Lbd+^olG_Fs#4X=mBT51u6e?RHSm4;$0zsmWGwB)q{wcD&@4 zw{Fc3!d5&3jVuDW*QFl3nT2_FpHw;W7`bPgdZ)JgL_8`?w@bcW6?*F8<;wwNs<-m< zR~~sFQk@Ng%r7C@QItH3Cr>LDiU9OD-e6$|0YVQNp(++^_T2pe=+8h|N2fcd8wBc} zb?eq`<=`kh)bEt*KJseA5iEN-h-`o_H%7lSOtre^S+dETnHcHw`=nBt_l^*kTzM1L zWZKCvBC;ERz}CmTZ|f0Exz=16-QDn1QnhnW{ymII(PBovw+E!9$x6Y4o03&KfHm^T z(agvwIs<|32nsqfa5*ki0jOYc{TZ_*^hkipz-BoDvQO^)-JNm>wsOESAKu<#{fJv= zv@ur=dC@;G@DYUJ*GUn__Hh3I1bIf8RunbV2M!1cfh*Rg)jJ5`T1kmRUb{}-{ zCb`RVz3`_XtNe9ag@kGjNJ#}CzYu_5HQa0JrJ`>l6=wF_)H$ln*{a z1`~YcyP_fi=mfmJAF#f1+6ec6sGzQ)tUA4Q?%cVpgmge1Ni?VW;sj|o`9$yzF)CDx z<0nBl_(e{pt$+9q{7pVwbF#gds-}LgTP)*>eBMlU+9X>kRG@ z0=*D)i=OBEdm5%riy<99=cdsL$b-P-3Msk{kmausUcsTx&L86$@AiH^#rOg!3JOKS 
zgT$s4D_2%RkLqg31Eo-vH%J&zym`$#dkQ6{7utiYFG<%>Lpn&23w}PTVWM>p+ZVc# zximq0qXeHty%xPluSf1gzbwcIl9K|29n8HZP3S1|&k%KqEJ;_e5ga>rs$!aLLFS-# z>F%eNQRc8~6#mk@PwcBpGr=9J#p&khKsQr_poPH(k(i;U^9+0d2x$(K(CdY4OU}<- z6r{VI_Q@y)BEFBjgH~LDC1BC;PLI$xBpaY4F@e}h0xuCm1ZZ+AM0y`pN z-df5c5{00rnL)qo$JJ+sg^Zl}hliU@!4HAbi2FG2xj6al)n_+hBq#_cWu%OZ%#_Yw zjp^sl&k8=<|IYivB=QjuPgzG0$su@RSMQJac=lsQB=fpkStUVFKkJP$F&b=1mP3yk zP?6s_<%Rd&8z&l1d-^SXQ+?;#U5@&eQ~?8!1oqvvGQ5<%MavDAFJ@#dl9EcfTj-uF zu+8cd%Zl#jR@)Iv8f2r&kRvB^=I||t4uu-ndN9u}(2v3}+|!VV`+_>|;!_=Co@i;y zpm@+=Z_EXX3x^wZuzVN4a^qQ>Z-Ifu!_*au3gbeK6FGCZixtG%-`}SSO}Radi#zfr zt9I;^iOWH0{_K~Dr$eOQMTbDx@H{}V42m6@eD-80`ugGpLj%T3JBM5Ia1Rybnz~&B z)D_uoB2RQz7WGtaw^C~nkEu3NKlE^QePSkk%*X}zWd=JpH7#YnGK)XS{bs~sz{+xUV z5)_$s>Ps?;&C7P=r(F;a_QYeM14TkHjC^<76C(}^rD#D?vQbP@>D|9{h|gx97jf7W z^}T&7N8TfyzGF-6!IyO$zr`RrR7oD8QkTg49+D7I^zfzVuA@y$N8V*S4czhdeFRvI zK-2+sluGrbPz-smKEn-O*}I5R$%?nVK-tOQWPzUNV}gLhDlJ$}sg zF+{wfL3Z}}r3)NF+cy=>I`>M?iZXxk>*E_ORg&){@tmN?A8rwT0%@f1?b}|#2e)oL zT*4$UVx)>OUGn-kg%Y$sXs(BzUb#x>Eh68c)CV~uR?z(QtBRJERtrBED@5oaF)ns? 
zD_bv#T@QuU8){sACUL^*{adq>uauc=D297+-^Z4wTu4eX;^40q=)(xkS`AX4RI`OyIAV3*@ zckf0(_TQMIa#6T(8=FXZ;whIWipRXOESenZ!qOkU#3z}oNOh3?3 zB$QXKSP`>~sj^)qT@%_mWn+{oTdf(EXJhQt3UOZCw@89pJN({jxsC7Ak z;nYaCe5jl%$IHQK~ZaT9QSorB~}50YPF3Ifsg^c96w-p&Kn=v7=@UEwdNTv zr@SOEVira5+I#)zmvwfA-rv6)hFo;9^W+JD5=ZQ2@!DBx5cGyJdq4x;xp`AD(PZ{I zpf3qU1RPw$)XcFkBAP-`X`SyW{v~nZkGPj-n{2#0DhM1BHP`(y2>oamPvfa|n&YgGZvI+#?iUtV;)=l*=3<%;PcEO*xvc(!l?iyDqH08JO6Qz^R$LR)y-)3K zTMwA)oh+|tbd88U;(a$zBRh=smO7Fdo^Q;sz<~)a z&=DbIqTr)zAe44g#`=R+ScM&OgDdk8A@ceH!1}4JYY>Oi<#P5MTr!{G>r(*|*2(ut z+xDT5d!eMG^PuqkdjQFYs7{y%YT%8Cii&zGjH37Rd`=)}xw$*_DadvN=W^>RifmnS>CR^gXsEO8H#il*ug@T=Ml@ z5mGP%eG~hj;PdX@T?ro*qG){004|Vh*(8Zf8k3x?0%kM)FM5_Z3a%>4ZQ$@gdP!TLjP`0H#vWAL{bKS83hMA0w0^k1 zp9IpIT3o`F?PnjP+?d&^jp>Tuzp%x6n)OuE6yV_G&h9-=oIem-Gm5C4by;b5xwJLf zIx*_KGZ=-8Mk8~Y)jKmr7FN#ItKQzjGU@6CYs0Rs+}1&=2SgV?Utgufabmec02<0% zO`$|)>=@FGyppB;8}hK?JYc?Pr+%$?+MVcl8177}pXxiKGJ=K?4oO8u2?PXqh%GYM zQ5-vVRQPU@Ws?EvZ`_cZlL@l_=~LdF+LV`JmWq7Wz`2ZA8B+?7J<-KJ4EXDh-4FcXQ; zZ6mP+G`;Y+=yJk%L>SO><|RRLR{BiH6Y;H$XyPd=?+r)L{dVzU`l?e8h3XWW%&}Ac zj%gwbZxy^;6Kh6Ge$5L7`Ak2|c5xK^wLfQ|UP+h`zSAIG?rLrR%ZTXJtynnGO-x_T zU31v^>zj|!zB=mYauj_wBG&I%nF1qL095SNICK@!!uTZ5&siqZWVw{K=Dj&zZ}Go$ zG0RKPa&3>suPgllFVh(|r(F`_1|~w{NY}c{Q9CMP^2Y<@Z}CYgc~$q48TgBT6PLf_ zjrFX%_w|YAO z1ZI*h$r^|KM6$`;pJycTBB=SUtU14k)J#_1%Jw>Z*`jyWOg4?X3K3amUVl7H2{JmR zt=CCl*9_#Y4+XGbPWzF3!D`Q0cHF;U!p;4!W)RKH$CsT@>k(?~*`ngR+~s_pKV^={ zlpD%@V+O!>T&q{LR4G3nZ_VWpkW1m|*bHo#<2*Iph;2nlk>mzS=OnV6X1D0Xl5+G7g=%4qO= zDKXFw4VEhq(*on}Ge`F8VlbIce)BDOxofq6&Ri_-J_owvmq4c-_Jszl-6aTCO$AV` zI0tq0^}SeFa`NE4sTpMbTMp4f2|=VZ z(oB#&EFnq5t`n-u6=A<4b?Vq6cifzS>1V|je%ORc!B$6u$tzqNMyUAdi8gWmvm?FE z;_g_r2V~dbzj`j^6)1S#yngK{ZEru!Q?&~&{2VYiq?q+_>Mjil^x{!v5wGVTpn}Tj z-NvLF5_?ZtNM*Staeo67stdWpp9CFRb?Q=HdU|@H3zTqaq7Wox_aBDJr$FwQ7-C2d zz!DvC3Prwmt-nYFAbdV;w{of1jVG$_@nK_FwF-{#8kg*t;-}ICjuCb!9`qE16Kk)w z{DK1YvA8UgDlYiFjsQDHw-wG_sgK(_rc!m?BWCKQZruh7WVzXL0emvV4VY?P2j*YR z2o?=eb)*c@`a{ku12+~&aW$EH5^0C1HQzzEDIY#u$kRp=8Zv%y5+FS%L?Up5X-}sl 
z_R*?<>IBO~vx5>?Qw)rZ8ZEjbJ_*i=Hz9fDx{h}cR~lQp=kB7C1>1r`K@D{hsTAh; z3n7?afsEfn*u~4~tBfDMtgEo{{xdJNOlmQ$Tp9NE?I9T@HMP6Mf_wVi3Yjt#1H_xm zxqG)V@q80o3ieWE48zC1%ZR=np=YwLxR`OynyJ_{+PNqGuGc_R&@Wujx1{dj!-u|0 zskyT$8*kF8ScK@rn;|Q^m#p7pcO<Qc+|>$4Wo^3=0Br3fyvqJW2_ zi3|v`gczi-yg38~k1MrY_W(#XBGNDDM7u9$X~Y>mR+;D)E)GlK&vm!`0LvoXxKtW( z;uU|Fmt_&cG>yH3`Xs^R^t^8xWwU6cuwvvXFbSsiEk`6JWh7c$0}02TkM#Oo`M0TsgdHx*I< zG|>C(SD5N*^P{|Y7lJa!lErFuFSnlLH=ZTar&5x3&7+K8edgiLzvx!L@hJIiA8HQb zbFBY?72()NE!W|yrKD80_WkVhz=omin~P>mwf&vvJhWdQe?D!Ra-k0plmoeFgtn2O z*M3L&71x*jcr>p8r$A`~y%1`Q%r8z{kF?DVWqoFIpV8Mh>Z#AE zKdPu$awv9AfS7>X+@IBZ{-^xtDYvZ@?*kZ@{a3@6DDMZYbNw!_1ncg6!>_Nekv^Fe zv_XYGe#|^T)G~ynJuf?uTXL8GNobHMfd9E2!iGAsBLwR$GGI}zdwLEIba*uGr~gT4 zywNi%SNWTPk<{L_aEXzl=VH6%286@od-`91Po6m_ZikwEM7ljPbnuM@vBZamhBpSy>Qcu4)?qyh^1!ynU8-LVIqFM{tF;6>a^+WEu~Nb0?usW;(w+X+#La??>D46 z1A0o7w(dH*_19HA{wV5L{Y9PvmYyV6VdSWty%PKW(^|cWoQsDKo&S0a88)gYcPY!I z1K_beU*%K#!1rErJ$)h1--9BJWLD|a8S83XTGkM|H?GPh5T7f|92%xBT-GS5eRoAS z!Gbg?hW72ha4Lr0!)Av4*>qQ<@FH2yKT9gSkJcjy@<%28%UUp@>IW}ndHqXGxwn;6 zTFS=8G0+l7)6vdjPeOAQ5EctmO#_Veo`caxn3jG@AoK`lQ|`Z2m?|zGA!_l%3q91+ zpo>&f4&^mk=Jr;jH`f`KxFTs=&iRMKLaB-tjgW#HYq-@Alt{vZ_8nPhvU*mlXPed!rRZJh81x0lwQ?T~JwRc>##}4I0t6cnGHpTFp1zqav|(WES4jC! zT_C@B#lB%4)Oc>JEpuKevt zV6d1er{P9L6Vua=(MVCKCMr5Q3B>aI;}Qm%F8R&^T2+iW@LmK(4<3eLu>?80agwh| znDN?Ezqd6*dYZ0ot#XSMGFb;{;he)yxaG#6F;hp4IuTmW+igqnyQFZ0EujR8Iq7<< zax{nbg1ztR4Xs@Er%$sZ*AU?EOsr7wLWe9vcG`w9R01so)6N5@)^)^w16(WE%8!01 z5CQ{&sm?>jU~x4k})87;2VcvvvDqdS6EmE z3sjq{2!sS;-6TQ+oX1h5I}N&KSw*yRd=BdJ&#O*FySp3`<*2Pf$qf6zb{?LiT>i4? 
zkJR#NOVrLvEi5jDtbSB9#I^AeO2Lg$v`yLuZcSMe4hDZV(ofnp0)~*^xMI5u$(l5Y zLH1uIhU9%FAZ^$mqJ-Lcc6EF)Ge{3bAUwNQOZHT-t92vgMT*vve607@}zdeQ&q;>;RpOJVud|i+ZfMXcO;X z+q9;V(pMW15fLr0A4~GiJ-&Ex4xyT(qoYBvS7O8y2(yW)Cj3iWEvw?+-M$_h{lP#E zIL>yR|-)EP}oEq!&3k-|}-wegX#a#z=1X>Mao;*<`&1z`RASOfT8gaiaOJ~Ht+nAb)9wQk7UX0EjVV(P8cJPLP<%j zg-JJeLd;sasE^bbGcUu1zt`ZmRQ9`634O$-JUePruA<`edzQ-hru0hhaluUK|M0GJ z`WK|ye@j+8T%1e~o)WdnuaqPs%|&> zDjkb=%})JUD+!e`^ZKLM|0#^wt!ETyJts<=8K2WgsYPUsc-Uwm>Gaeq6M4=YAFYy?l5|=cUz28fh!JWlN;yen^ZEmyxk=_0>Fz$Q>Z& znhEnHY1f4v{~!!OhOF^3$g1KZE$$2$TpItWFrpAtfX}fK&JA^9MMOi(IWUpxE&S?Q zR+n<8#!N`xCt4I^kaol2-@VKQKF$E^oRv&25Ec}N@Woc22!96yMQw%+X<{QJhA-L9 z=x;lSh^=mKZy!FOj+Rm2iFSDTl>sZZ$iNC{@nH)sDS0r?#82Q9GtbQJ*#00#Xg&uK za5ZQXjOq^{VhHS&v_8KwnQiUVxI+7qs>Nyi;LLcRdohH#kM02mk?hcOv-9t zJI2tN6ODrQi|bL3trzz$8u1^^2>0}&+t>@_KOJMpu1CW`9vjT+vyU%Y!$mqJn|i`2 zPuSSm2@bD6s%mnNb-ngF3(x3Ojr!)kw=j$#)s&Dr@#L$^Mxgsu0SQyivizuvWXlNL2JPhxFY7B~Z z=w768oP4<+Oq92Hq)2mdxcVGYUP*ZD;&PcS@q8mR&KE6D_Yi^LxdO@Tup4!vd2on~ z)fwR?7IfF|MsGoEwgucI;Jv_{VXRQ;frg3h4k*@H$0~ zNYaa;XUvp4lHVXiDC~CUrp%X85FNTx@7%c)!}Mg6Cvfz z91R;@cx0qW)5}YQCj(FLC!I}BOdt)Coy6o%nw0^vXlEr5KOE$vv4Lm`9i&^tqMt&n z1S{wE?Iq}Xj3E@r=FN}Ps!rp41sH>@9Q7qEg`NuKJRGQ~^i$V~28U<(DegE22Zs^> zU9xq{VOX(be+gBhc__7jlt{lWk$^!een*2_2`qkJ8i!`f&Bb%?YHd5I$3k(FMrn5= z-hLOdI|59}TLNS;MeOk7xqRrdHvwiObv+RoOG*?&OyHLXL)K1;)JO0#N@?zTe}8jF>f7%5l&qXs0(2gsqSw$Fry1lI0^-z;&k^?)y#z^>U#LQhT{ zZ(90XjEL`^R_}f<}J!_?O6e8CXW^2LiC$@vy20QKTUk=)G}5auN=U zjg&a;tWOQHl=G_@x+TgnP7`Q)oB3he0j^6?O*~Jl zctHFqKqVdG)1yA{Lq&p#O;W4cpzu%7Wj=j6GJLTaj(mywpd@>L?AwnkAWI+azw&8p z0u*r+rUA0$EjGevqJ6eBNYExg>t*oYwbZh2AUV%U!Gf5Qu?M`Ar|04jJ#uCQtgh(T z+Gj#tFo%}+5+4=jY(fZAd81DdJL{$NA4{Mz9VRl21D2SkrpbRhs8a1iBqRt`Nqfz= z{#6Vtg~Ll|y7Y~4{yh$#4LviDLo1M%(f;ch-{YbBVLA5YZOb%vbk-qgi4~C040{1- zSMaVa*Zpk}Xcn4+i|GI(y~k0w7x+SThUu@g?K1>KvV7qj6Ly=<|HnKa5l@VK2+UQ0 zVq0X;Kw5zx{6a#wiRMc~_@eGVJ|~g>V?yVmsZ)`dPSG^>7$|E@Z`7OU>_6QH`||A! 
zKi>ng9kG;o+IKWrOxg6`OhS7`xn!VFYh+l5qjUF>G>g*#oFd@d`(mCQgkS;NQ92@& z|4P#N%zz%WoO(LaRmQb-El!SbC@+us1Fa9Gf9bpQ_mon8FOM(KO2mTy&?`1N6UO`A zKl?%Awwkgy4-T@i);N;%Qd59R<;}Jaswskj&GhUB_ry?NUuDrL9my^JOKQLMY;pAK z7LU3zku#ribI`2M?b79WMLj=g=tOc$PW_^wk)xZI=|)y7m{XxQ*M^IaxGb1YnE|%Q z%n)68e*zP3|4!oW_yO6BED?#@&)g=Ry+Ai&NYHfeOb+M}aR>UZ7k|A$Pz;K1{_$O) zG*PyCXkcLhg$~{QS1+=dU;5%kY{JXpd@W7*DbwY^uN* zNCFX<>|-Iw$9!hAXN!-~i625a*|&-hj!vVb%J#UMOQd28_Ry9P7M6-MtsuUwM ze31V46c-BRg!0w*^&X-G7tP_%JwK1)abXA5k{i`Ydm;-9i}}g^{qR!!XKhm|*cPPM zC5A%t^|sRIz>efj8`GDOY-YN(HnuFw!NL_4vv)nVE#FSU6#Xn*-dbOB(E~XV2xWP~ z2C|&@3pL@j8ik+9$lz7-?FaS->71nYA;hM*j@y%?L@*7?ztP%C^77trawaBdXT`vc zNcsUGCncklaTf*iYRUYYBD}F#2&$yAGzmdCA0XRj_wL+GcW6IveH(F-EPJaZPEVjJjo+ckvbb}Rg)=2r z6cII18pyysUIzYQG6%>wfbHuJQ~+>JNt{4=d5D-<(NI{Cq$f@6!)S(@(pljlrF4{d zjiMNlO?0@{*Y8c)_o_i*&ioUwp*dz~0k@IQ68{KP0)EIgq7Yk%#6+|%(zQhxnmm6L z^HyHJ^7}GUe2SwH&dWS!NDXL64ku-}+o7Y)_}klC#At}lj6ra?BQDn{%#wD(jUq^4 zk3HYlQxtdNrywnu=!nzG@(yaoc%2*#;-|#X8pQugj=Uf()l<-!#$B8!JHo;jm!tzl z-Q-z4&qEIs^1gh>A4`rQdEO+60p+SBX&@v0OGJf(T^~rIOwUI&0?4DYuV4e|ewoO@ zzV;UWn=XA`MoyHPwGSHzCqX6Tjv14qVu&ii5xo2K+hxS~R9f}d^W&*8< zaQnU1_i2S*j1=us_Pf|>*Ts@(HS>BtiiKCSgD3_7RJ zWBX?b&Vyr+EU_k*n?JcMeE5$N_X=6nmdhBr>1HSvxETn^Ng@y4n(Haue{Ag#WFeG2 z$TiomUR4CQiGGtI<6>+pGry29;+`238~6CRtANkle^UxsHvO;l`#O`9MAOi*s1}p!8ft*Fp(*?$Do41=+AtW`M-zvg#J{R#qWOUQbTiFaUY~!)LQCJa#-W5xwN8i zRKs`lO&>)E1kDDdn^)g$vVt;`z%<`kll7$XLhWe~A{)hToegk=v=Ur7BzawljWHK5 zD-F5e#Z&1f(EZ<@I%$w%@~-MEH~(H?9vjz?1P~*eo`$|%sD^$q)ec=Xs?i-eTR{ka zf?!h?%0F-O%xasFk@a4w^yGsZd+y(Imp29BHv(NnP`O5#i{HOjM(ez>Xz@=wU8Ktf zda@?x87qbu6xoW{_C%b_P)&IAu2DRqDnRKQk$X1(qDVm_F4kx{yVG{a;9>Ut}M{gjSPx42`r9^=&~~ zy84q5`d^!s@Jg1;NTz9A>)%I<2l>orm&`!ra{}OiDGad_6Og$R$_@97Ti%KJ4aq88#jIQLRj@8st13 z7{l{wNbyPI6hQtOzGj$hS@^Fe^eV?XG;i1uzX`yaSF%#$2u#Q`xni#X`65}^VQ^Ss zt4$ox5C7KBs6iRX?xTEeDqcwapHZ-KEg z-~b7__Q_>ZNt5%iU|Aet9v^A;!NUI$$w2qwqy5Xt?^AccfvMJ~0eL~5RwcHUJ``_e zpPxLe8RF8(J_g!gnjDCU{f_f=hNxv8p%pimDEK^(_C;^z+1#y6EG#;) 
zog8cRVzhj|bbaEl(h5UclQcOEgehAmG#XHZE+2SxVULH01X-R|vMg!APA8Ut-k~Q) zACMM$H2zSk71FCxx%B~biQywUz!lNKFQ$GA@fnbF7l;uGXB!=Y+tiOo3QAHrip8N= z-N~32#j_0;Vw@-lAbfxZ8!gxRNavULPIU@EkfrgL^#8+NA3hKk7B78jrbaiNEwuO=wUd?tT8kGt$T03P4+6iS-ck%WZjx;eQx-9G z=X5WkNQ8yb(jFYWG!y(Be)nxaFouOc6`(IWo6Q}c){?09FGzbv)N&LdScgo~FW=$? zS_Q`JQ=mle04n<5IR8!>#{`F&V=eD$u%_Ug|4w$4Kj0LtxDV1JPQ~x&Vfgh;p+XKl zLfTE@e59zI%1BMi^H-D^j66rZRJ}ejl>i2&?>%#w)y7`!ld<5hnbaZ;2MXH6xQ z2wDJK7R9M}nK%N@IQtCp5Yhb5nw$vx3)l!bw7vZ1^{KFS+GeNNqRj=L}hQ#gnY{J}6Mwd+Q0Wf~rg6r&ve4LRB3T|8Ci1)`8K` z@GuQRh{luy30~cxS-2Ry_WmnN&tMI8k60~Xkj?5T84w1$( zpum;T6`y7_9h&UA>&9f2?BTPT>qv))h)8W=?H_Krzcpj}pw-G0$|0eOygKN_Og7KS zk)e2bG)w-Y7JsR!1!DurbU+a2vqa*=CPa=eHLl1F<)mNemu1$}B0z&?F@6;|zb3|< zTHhAsXzm%^zn4Ne_dLCh*L+c6gvJck|E1f!wWg=iT>n{eXcFb%;u;oA{l`>h_3Z&_ z-Kzz9(N~1g4nkB_yf%IZ$;!#iUHppjnMbe-Mir%<694Jc6=SzLm zmH#DM{I|(Khq}a;+y-89Mc~{j(2PoQmM`I)AgFdAn%rfB=h^9YkMA9{RKYT3j6%Gl zH0%nE8OE8e6~q?c4}09Qx8#gYa{lbtU^_a70>`L5iQodqUVmaGj{-`O^jMKo66$bb z2Fbysd*Lmo0}G+u_Ia@Yi|!>OocTX_Cc8XPd2S~}Sk^6?tm@od@PDcxYyP)~!N}UZ zHiA}8%>lrGuh9l3cdD!JMuld&K_oN|-7>>#;&wG}Lzj)~qVUFSV6Dt_#@2`kd z2;oSS0W7ec`i;3vRdfHnzWgWl_kA{ESN7o;{qW(47qg%eR3ku&DjaBxqAd~Y^sq1B zj>@rPQ8>HdkZ->C`dBy>j-d%Cvppg(87ChQ*-Jre^z^H$4JSxEo_QH}0j5=8`SF>2 z+JVJKjI4yPM6tg`2CY&ME2s}bey&Dxm^BXMymRkf6dKYMLBB=kJvguTTY-3$wyp|b z%c^iL;`tqV|Km6q%BebOz3SkQfW8!h{9Wr@MuXbk z1yZg*H-L?JdwJawI}j*ht0|;TWmEhyM>#`94}&=gDuro1_?QuD6t#ut7g* zLT1_$VTPo2?0X*VZ~yl-^iJ0baMUhH7Wfy1`pV3Jq>I{f(7I`6#)`I*Xp4jSRYdf; zYyZjUem@#aRA<9^7?RyYn3(w(?ZI#Ul6vV^bpQ7!e>PBQ<;%;it0_b1yKj}%VwDw- zK7s$prF=%`j@g-h-^ZwkKPU#%59h}ocS_ImZ(^7Hc(_%Cu@ zOvKzrWOdDp&qy`@=Zt}Np*h=rc9UKBP(0%ym^b71{JDGn_!2?dvJ%X9JS_tBvgFTJ z-M7C?2=o5agh0SQ)HfK79EkW$3>{FGfStGaTuP+Sp@(`QTBSUuiS1O0)9H*5drXyB)F>zLydJck)798nG|3ItAew8;>jyar4s{ppnCHuv z3%l$)bH}^OaVD-dtys*&VH%89eh9iDzgcMEoS-GYpQbGwhm$t=S>%&e|g< z%MyOHnIA-O%;0^{J@wuR<(SWNQo}eJM9rbr>PkXp2Zu}<3aA9`+!0AswZxwjo4s|$ z+T@jSWTPSMBqhnD2*>u8xJ`!rMxW#99j1=PdfowAs0YXT-*b~wn`O#A?rDg-#VXzL 
zW{qxKUnV*ebzu@57F)V#k&SI|DsUTPH91!O=CTHEf1H2vl$DP{S^cX4SVEhe%76nn z{75SZIa1Jf5JkEiyl*v-MW7|9bfv&=g%i6i;=(m(dMG)39c)4*M7C%clL&esCmk_a zWP?PTeI6%b&5!{%WbFk+wiM1fcaB(No0Fb3{eQ)sc|6tm{>KluZE}ll3rQ&2NTw{E zva|@7rlzt-R7SF-RI+7h+B_Od+@y>_H;pY+wnkY-I+8|H$h9wL3Td$<3disD{+>hX zj{E!L{vMCt`D0Xbob~(td_V8k`}KOh9du6*Jo!%+PnjHVEpka z>_-)s4%vYDf-nn_$cWJl(IN(aotF{KSx-lpsMfpl#I$w*b_uSG2=rFnN|mzP@GZQj z`cU_l)c&oEp>mcS^cx3X*OcjKm?mU6v1lG68R5W)*ooi28#I zS0<1ELvP&k%hP;&6(OjoHPsqJ#gVjtl^B^6Y?iq&Gobe|@+t1G5$pkctc>=0qa>mu z&oSyC@+M(PD-Nr0D$ECuJr6hY8XiV@hGTmRC^!`6BshMBhmD`en)leJtM1Z&>(y|6 zy9|C0DTQ-zm{Kg8dLK1MV&F}LL)4QXi%XZ|tEZSt(h zvX5MT1<0ZdgBa3oFLsaAA`UMD;Eae}n$*I8Y5ggA4)+#K$p!p{Mi$6KGh0+y3G5xI z^Pu&@lMW{2&l!{z=-*-TbYJ=x>^+S-2FU4;b}s@$34M|CL>N>6YRx|vMMxSpiH?Dh zhN#c5>C;$%eUGD4g6C_{Dd5~zip#f3;C;M`;}RjoHHQq5_opEc9Y6o*@)Xh?dOnNm zL8T&s*Kzf4u-i?iIIsbkHq>yu)1U@om_JM1l?*Fp0^UOofSvO#llhIYojNtYN@;}{ z5A3QJY(5FDrd_PEYE=drP&@0(xm|>&Bg~qX*)*1aI?4O|q(4$^5}Q$|RC{DjqUa2z zgdoKjJY{k-NN{MkLh;hU>Ju&x?BPo=iz|^1Nhf z`tWzK;a<;BN>$!A8VKdRi(l7{ghx)xImMT`*38u=ssb!Umpy{}kJP=xC!hW{(_3Y^5-3Ew*8etT}U7UR#H}|qwk|$aP<@LSWL4kLg8->$h zClmbo%r~>|{35gYGySBvF45;h_j5@tuMF`09Eh>oYlF3CI@(2OS%47g0IV7o zU}#*+w6u1?ZDfoZ4EtBbv5D;VZMTL1FfX^^b&Df2UTZmD24g2KUAi<0^g7mc6>9C_ zk!3ixxcKgTXfC+C^HmbqBA%c_Me-_@`T|(Nh)UAXLyRB56Rf~|HG`DzUWaQ6G-WtQ z!8rTUk@or*teL&M1-u7cPxNKfS`bYv=n5Wjq1};(*8Iucq8);(v1OV!`ZcSWElA?k z=cN%&0QKeHK~SJcAo-Gz^iV~oWCeD zd!m#i!>o_f@t~!4uYz*G6-p&$`AS<(0>nwmuA@Fjf(N}EsKi(~`ho$J@Y2T1h_b7b z22ull$yfswb%iA4q1!Vq?LD3C{01(>*X+!X=dAMQ+6*mz%-eC;vB_umqy4?%=&+d7 zyo%+$eygkcJydussU~zC9F-wCP9URy*;RQ^a!q(cM{Uc{)M=tojnzMjY^KJIS8jF* zww3ZQ4QS~OkS)@^U6f}nEX-m=oE~@DZbzz?D%ruTGE++sls@|Lb8B1Ko%a4Gha;PE=w2SQ@UAl5v#opAfetTCnkR6Zyr1`@5 zG-N&NID7$%S^~I9iIa?}u9}jRvnvdjQ3l&l`3ibWl5wx~Kd`fL0y&`95>4bOhSVvX z;>h$1Pcr&L7}`EQdTw9!0bNB5ClW0@5NU%EZK&ZdI2DbMNgIkm4sZJ2@`{2@jF`em zXOl0-oDK4PNgjRdSB!V=j0C1X6S;9>DPsKzivzrrsI4Voi^Ag2lA{S1So=-T5krbQ z34N;l2r0~gfWmt385mk@dj;;lA=9Q&Q^5bOEjILw+rt0Qfoq{Zm&)Z4?9A=J(9(^S zRz9r~W97$Bq3J5=j 
zae2s9cS21NcoQ3SK!Neat|G@$)57nQ$jN? zBBuJcklF{fd#=v>;zdhEAmP&=nG>{zB0Pb$56t_MGZrBEP4yoC_L-r9=@|#FF-8cd z+*pBWa^uh%qfzLHGNH7QUmol2-N59RB&faxjzM!;qsQ#=3ZVybC3*pIo!C?uwlzx& zN>Hj?Vd~LaA@X`RpTT0QsmIw^Snm;#KAy14?}|xf>vQm&U~-K00Y9*nk2w9^8XGr& zN0^VNIpnEy%XJf~rT8~?3yK12O;&&iNUi}>#BK@fh_N1I9s%uv1lqT+HWJPXAA}%n z%{D|oo?%RU?P7a?orv}c?GJfxhk0qws%V%JEu7NWV3Ere3L_z1X$Z<>Ge;CHqojxu zLBm$RM*>-NDc-&JuxH8{z~7ou`GF&W?#6!XgL3`|1C)RWhzQAGy27gZ!=*}Fl`5yw zg3a7yAD{u(ABZ6!`PXfcbcLX4ie65iIbeB&fg&uC-4-`~+=o%jPBQ!U8 znxE(ddFpT(ctNbehyKjg=ToQkK(XI9;4Ngb{~uz{A*SDo-yeftF40dmgQd)xv+G$x zpT@2E-{`a^CZn5^gk9aVJa@Dqcg&KJNkaynky2^C3Hh;UAcFEHi5Mx`(Um4QD|jDtw&^ zXDQNo;Bi+0yt&YTzGE6mMETTqL-a_TPgE` zt5K5+R;2ZEHNY=Iq^1Ob)*M6`h=2{5zS7zZW@!?{sp1)xSRAqC_&oskUu3{U@4zFw zyDr%fV^Wxy9y+y8+=PEUC5>90a9QVXR^a~*r_O8daX zl@<{o<1y<394F-58A5rBc%!1S48pFu-22$=0^x4x6C2~^axx?wA0`ND-I z9P-n#Rc9x?!f)OnEbK@4%5p>8-_#`{K#`0ynD7yTGm+4lBSj)Cxowe7sa@Qp6HRuC z7F+SaQuOuP2Fi=bv=~fa1%1yJk+KPtpXV30_#wrnZ#aYgAPG)xpDUU;^6lVUAi17O zcxi%%6Nd~Y^LSED$RY=YAphcDVZ}Q+V#WLY5I)3GKx38~=}Zj9G5P4SiW>4}9Lrh$ zjwrdMYy#2?y6$^uu1Y;q)i75FS@4w*e&{#^U?{#E>>pZZ`M0sLUcSjF*mYjX0SCsZ zKDvhq3AJJ_=3vE*msgrRSFWV+OuTUez$jnQ-I&c5Rda~y6+eKPol)ia|8MLzg4yYe zYrN3(L(%&MdZZ%{OT<*}&DAfV1ECwN`Jy#z)if#Tsr-${C&d9g3FGvp6>K$UMG51QFL4X{WpePB3!3bHVrBj+Z?_Z#P-V)`VZEs7TQ!F78a5WS!C7a3*Ci5q6^PBCe{^8N% zp*!adwE-6i78!ouC!ExIK3{y>#3i=ZkFv}1w%>hP6IEmp>yQwT{zjD7+#25hv8jFu zsyo#Mcr`IgS7b08RTwF*|0ezj@_+FaEa$`#QBX+QaFDmz(2zDbd`zmP)t>Ug4r2{< z8w^~)z9SxY-JG1f90nnH_l+$s=Q;)?X&0H9xGgcjHh>8VDcqs@CMh-fvlg7>jl(Sc zg=Ok1tmA8L_O4aa3k-*l46Eq@tGin>b)vK`6k$Z)4RxEZapp*!MZI+tE^5mkmDjF~ zEU*;8eQZ124&RcD-3UrfW5c#jEj&_`a;c4XwE48Sspl39AF)2(r8~kcjYN6>rp@cb z?u}8sqV(ilqff5K|Ca@2g=Y$!U~XQ2p6H8ka=BA?`6CPg#1kD`$q{Yr7JH#V$Roe4I}Ie zoaWH9j66Mmf&2StQ-P45vXbl<(Kx#Dh6J%KSM#66+qtMZ2jB7Hh?Sgv?-?w~M(ih3 zXnG#jvOeh!STzVt0#wO>tKEqX0PVK}3qVf@EI@H%=~qiBUQDwip+L10pa#ZiM!2>U z@^2Kg?+9jFYGJACUs(2RzH63Ok;T8@wh%joK+Biy^udgzg6t8B6LP2g<1P}*6_n-d-u6uI^9UEVT9yQjOpxu1+IiI 
z#+W7%VsNdW@{(B1-U=gX#n4m$Kjn!mVPV&-H_Rgm`zte_c(Q)f>6tk7H#~ZN_=Bc8 z$9U}-9hx{Hpw*qpkNDd*=qX3>AdVOa0Vi`O7}Bz_V)jlUY)dt~S!4Y4E~-wNVc$Fp zo;)!}Owrn?qoX*~>&N)LCdkt4;W6a8YWg4{o?|xS9VqcHwO(NMCSoT91t#n^4Dm8% znB0cPU0qcD64#>GRd}sp_VikfAVN&I@^9iyJ2&2BqO@%wy>uFYk(T>D5{v6#Zk)dg zAXB***F&|1dmJVm!Sz;&1o%Xn`Qsc4f6JfDTbw>MgHs(mQcvozu8qizg(~*kBr92qhGAE8@_|aCrVO91R3CUW7S-UoA2~DgACu9zcjel4?m^;uBbG z7cn}jMOX>S{6`&joogGIFFsP6L;0lsHvjhjHsd>?5R(tD!fK9|24?!9p#~qsCDSzP z*Tv!RZ49D^#5if2Y*yF?cJ(b}$%+vqi3qS@a8a}{+5T}f&1^w;2%sPAZoNl#w}T}8 zRqVzr057o3k#|t0Mkl6@rK7tum7XAz@ysTk!I;6;h?F)DtsYL5OxdyCZ*$+|ztsud<&qni*UaCVaxrFfj*=_rPI9 z`Z(biQ1(;al>!}T23`R?hvFeDsjAT@5&uHs5`6Rj#a#COVR$F$VRoO-@0C9`M-a?m hO~OY>xBs)BLaGTf+b>>o`Ia8lKzHk=)Q!8p{~vM(KKB3s literal 0 HcmV?d00001 diff --git a/docs/modules/3_mapping/distance_map/distance_map_main.rst b/docs/modules/3_mapping/distance_map/distance_map_main.rst new file mode 100644 index 0000000000..45273cd3bb --- /dev/null +++ b/docs/modules/3_mapping/distance_map/distance_map_main.rst @@ -0,0 +1,27 @@ +Distance Map +------------ + +This is an implementation of the Distance Map algorithm for path planning. + +The Distance Map algorithm computes the unsigned distance field (UDF) and signed distance field (SDF) from a boolean field representing obstacles. + +The UDF gives the distance from each point to the nearest obstacle. The SDF gives positive distances for points outside obstacles and negative distances for points inside obstacles. + +Example +~~~~~~~ + +The algorithm is demonstrated on a simple 2D grid with obstacles: + +.. image:: distance_map.png + +API +~~~ + +.. autofunction:: PathPlanning.DistanceMap.distance_map.compute_sdf + +.. autofunction:: PathPlanning.DistanceMap.distance_map.compute_udf + +References +~~~~~~~~~~ + +- `Distance Transforms of Sampled Functions `_ paper by Pedro F. Felzenszwalb and Daniel P. Huttenlocher. 
\ No newline at end of file diff --git a/docs/modules/3_mapping/mapping_main.rst b/docs/modules/3_mapping/mapping_main.rst index 28e18984d3..825b08d3ec 100644 --- a/docs/modules/3_mapping/mapping_main.rst +++ b/docs/modules/3_mapping/mapping_main.rst @@ -17,3 +17,4 @@ Mapping is the ability of a robot to understand its surroundings with external s circle_fitting/circle_fitting rectangle_fitting/rectangle_fitting normal_vector_estimation/normal_vector_estimation + distance_map/distance_map diff --git a/tests/test_distance_map.py b/tests/test_distance_map.py new file mode 100644 index 0000000000..df6e394e2c --- /dev/null +++ b/tests/test_distance_map.py @@ -0,0 +1,118 @@ +import conftest # noqa +import numpy as np +from Mapping.DistanceMap import distance_map as m + + +def test_compute_sdf(): + """Test the computation of Signed Distance Field (SDF)""" + # Create a simple obstacle map for testing + obstacles = np.array([[0, 0, 0], [0, 1, 0], [0, 0, 0]]) + + sdf = m.compute_sdf(obstacles) + + # Verify basic properties of SDF + assert sdf.shape == obstacles.shape, "SDF should have the same shape as input map" + assert np.all(np.isfinite(sdf)), "SDF should not contain infinite values" + + # Verify SDF value is negative at obstacle position + assert sdf[1, 1] < 0, "SDF value should be negative at obstacle position" + + # Verify SDF value is positive in free space + assert sdf[0, 0] > 0, "SDF value should be positive in free space" + + +def test_compute_udf(): + """Test the computation of Unsigned Distance Field (UDF)""" + # Create obstacle map for testing + obstacles = np.array([[0, 0, 0], [0, 1, 0], [0, 0, 0]]) + + udf = m.compute_udf(obstacles) + + # Verify basic properties of UDF + assert udf.shape == obstacles.shape, "UDF should have the same shape as input map" + assert np.all(np.isfinite(udf)), "UDF should not contain infinite values" + assert np.all(udf >= 0), "All UDF values should be non-negative" + + # Verify UDF value is 0 at obstacle position + assert 
np.abs(udf[1, 1]) < 1e-10, "UDF value should be 0 at obstacle position" + + # Verify UDF value is 1 for adjacent cells + assert np.abs(udf[0, 1] - 1.0) < 1e-10, ( + "UDF value should be 1 for cells adjacent to obstacle" + ) + assert np.abs(udf[1, 0] - 1.0) < 1e-10, ( + "UDF value should be 1 for cells adjacent to obstacle" + ) + assert np.abs(udf[1, 2] - 1.0) < 1e-10, ( + "UDF value should be 1 for cells adjacent to obstacle" + ) + assert np.abs(udf[2, 1] - 1.0) < 1e-10, ( + "UDF value should be 1 for cells adjacent to obstacle" + ) + + +def test_dt(): + """Test the computation of 1D distance transform""" + # Create test data + d = np.array([m.INF, 0, m.INF]) + m.dt(d) + + # Verify distance transform results + assert np.all(np.isfinite(d)), ( + "Distance transform result should not contain infinite values" + ) + assert d[1] == 0, "Distance at obstacle position should be 0" + assert d[0] == 1, "Distance at adjacent position should be 1" + assert d[2] == 1, "Distance at adjacent position should be 1" + + +def test_compute_sdf_empty(): + """Test SDF computation with empty map""" + # Test with empty map (no obstacles) + empty_map = np.zeros((5, 5)) + sdf = m.compute_sdf(empty_map) + + assert np.all(sdf > 0), "All SDF values should be positive for empty map" + assert sdf.shape == empty_map.shape, "Output shape should match input shape" + + +def test_compute_sdf_full(): + """Test SDF computation with fully occupied map""" + # Test with fully occupied map + full_map = np.ones((5, 5)) + sdf = m.compute_sdf(full_map) + + assert np.all(sdf < 0), "All SDF values should be negative for fully occupied map" + assert sdf.shape == full_map.shape, "Output shape should match input shape" + + +def test_compute_udf_invalid_input(): + """Test UDF computation with invalid input values""" + # Test with invalid values (not 0 or 1) + invalid_map = np.array([[0, 2, 0], [0, -1, 0], [0, 0.5, 0]]) + + try: + m.compute_udf(invalid_map) + assert False, "Should raise ValueError for invalid input 
values" + except ValueError: + pass + + +def test_compute_udf_empty(): + """Test UDF computation with empty map""" + # Test with empty map + empty_map = np.zeros((5, 5)) + udf = m.compute_udf(empty_map) + + assert np.all(udf > 0), "All UDF values should be positive for empty map" + assert np.all(np.isfinite(udf)), "UDF should not contain infinite values" + + +def test_main(): + """Test the execution of main function""" + m.ENABLE_PLOT = False + m.main() + + +if __name__ == "__main__": + conftest.run_this_test(__file__) From 2234abf63d07e5496e37c1b57a1b927c303272a8 Mon Sep 17 00:00:00 2001 From: Aglargil <34728006+Aglargil@users.noreply.github.com> Date: Thu, 6 Feb 2025 12:05:20 +0800 Subject: [PATCH 05/35] fix: DistanceMap doc autofunction (#1143) --- docs/modules/3_mapping/distance_map/distance_map_main.rst | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/docs/modules/3_mapping/distance_map/distance_map_main.rst b/docs/modules/3_mapping/distance_map/distance_map_main.rst index 45273cd3bb..0ef9e3022f 100644 --- a/docs/modules/3_mapping/distance_map/distance_map_main.rst +++ b/docs/modules/3_mapping/distance_map/distance_map_main.rst @@ -17,9 +17,9 @@ The algorithm is demonstrated on a simple 2D grid with obstacles: API ~~~ -.. autofunction:: PathPlanning.DistanceMap.distance_map.compute_sdf +.. autofunction:: Mapping.DistanceMap.distance_map.compute_sdf -.. autofunction:: PathPlanning.DistanceMap.distance_map.compute_udf +.. 
autofunction:: Mapping.DistanceMap.distance_map.compute_udf References ~~~~~~~~~~ From 0676dfd67e099198c34c485e176b077ad6fa7374 Mon Sep 17 00:00:00 2001 From: Atsushi Sakai Date: Thu, 6 Feb 2025 15:16:17 +0900 Subject: [PATCH 06/35] update introduction (#1144) --- .../python_for_robotics_main.rst | 57 +++++++++++++++++-- 1 file changed, 51 insertions(+), 6 deletions(-) diff --git a/docs/modules/1_introduction/2_python_for_robotics/python_for_robotics_main.rst b/docs/modules/1_introduction/2_python_for_robotics/python_for_robotics_main.rst index 1ad5316f53..2f89f0c7b5 100644 --- a/docs/modules/1_introduction/2_python_for_robotics/python_for_robotics_main.rst +++ b/docs/modules/1_introduction/2_python_for_robotics/python_for_robotics_main.rst @@ -1,11 +1,13 @@ Python for Robotics ---------------------- +This section explains the Python itself and features for Robotics. + Python for general-purpose programming ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ `Python `_ is an general-purpose programming language developed by -`Guido van Rossum `_ in the late 1980s. +`Guido van Rossum `_ from the late 1980s. It features as follows: @@ -17,7 +19,7 @@ It features as follows: #. Batteries included #. Interoperability for C and Fortran -Due to these features, Python is the most popular programming language +Due to these features, Python is one of the most popular programming language for educational purposes for programming beginners. Python for Scientific Computing @@ -29,9 +31,9 @@ For example, #. High-level and interpreted features enable scientists to focus on their problems without dealing with low-level programming tasks like memory management. #. Code readability, rapid prototyping, and batteries included features enables scientists who are not professional programmers, to solve their problems easily. -#. The interoperability to wrap C and Fortran libraries enables scientists to access already existed powerful scientific computing libraries. +#. 
The interoperability to wrap C and Fortran libraries enables scientists to access already existed powerful and optimized scientific computing libraries. -To address the more needs of scientific computing, many libraries have been developed. +To address the more needs of scientific computing, many fundamental scientific computation libraries have been developed based on the upper features. - `NumPy `_ is the fundamental package for scientific computing with Python. - `SciPy `_ is a library that builds on NumPy and provides a large number of functions that operate on NumPy arrays and are useful for different types of scientific and engineering applications. @@ -39,12 +41,55 @@ To address the more needs of scientific computing, many libraries have been deve - `Pandas `_ is a fast, powerful, flexible, and easy-to-use open-source data analysis and data manipulation library built on top of NumPy. - `SymPy `_ is a Python library for symbolic mathematics. -And more domain-specific libraries have been developed: +Also, more domain-specific libraries have been developed based on these fundamental libraries: + - `Scikit-learn `_ is a free software machine learning library for the Python programming language. - `Scikit-image `_ is a collection of algorithms for image processing. +- `Networkx `_ is a Python package for the creation, manipulation, and study of the structure, dynamics, and functions of complex networks. +- `SunPy `_ is a community-developed free and open-source software package for solar physics. +- `Astropy `_ is a community-developed free and open-source software package for astronomy. + +Currently, Python is one of the most popular programming languages for scientific computing. Python for Robotics ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -TBD +Scientific computation routine are very important for robotics. +For example, matrix operation, optimization, and visualization are fundamental for robotics. 
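The matrix-operation point above can be made concrete with a short NumPy sketch (a hypothetical illustration, not code from the PythonRobotics repository): a planar rigid-body transform built from a rotation matrix and a translation vector.

```python
import numpy as np


def rigid_transform_2d(points: np.ndarray, theta: float, t: np.ndarray) -> np.ndarray:
    """Apply a 2D rigid-body transform: rotate by theta, then translate by t."""
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    # Row vectors of `points` are transformed as p' = R p + t
    return points @ R.T + t


# Rotate a unit square's corners by 90 degrees and shift by (1, 0)
square = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
moved = rigid_transform_2d(square, np.pi / 2, np.array([1.0, 0.0]))
```

The same pattern (a rotation matrix times stacked points, plus a translation) underlies pose composition and sensor-frame conversions throughout robotics code.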
+ +Python has become an increasingly popular language in robotics due to its versatility, readability, and extensive libraries. Here's a breakdown of why Python is a great choice for robotics development: + +Advantages of Python for Robotics: + +Simplicity and Readability: Python's syntax is clear and concise, making it easier to learn and write code. This is crucial in robotics where complex algorithms and control logic are involved. +Extensive Libraries: Python boasts a rich collection of libraries specifically designed for robotics: +ROS (Robot Operating System): ROS, a widely used framework for robotics development, has strong Python support (rospy). This allows developers to easily create nodes, manage communication between different parts of a robot system, and utilize various ROS tools. +OpenCV: This powerful library provides tools for computer vision tasks like image processing, object detection, and motion tracking, essential for robots that perceive and interact with their environment. +NumPy and SciPy: These libraries offer efficient numerical computation and scientific tools, enabling developers to implement complex mathematical models and control algorithms. +Scikit-learn: This library provides machine learning algorithms, which are increasingly important in robotics for tasks like perception, planning, and control. +Cross-Platform Compatibility: Python code can run on various operating systems (Windows, macOS, Linux), providing flexibility in choosing hardware platforms for robotics projects. +Large Community and Support: Python has a vast and active community, offering ample resources, tutorials, and support for developers. This is invaluable when tackling challenges in robotics development. +Use Cases of Python in Robotics: + +Robot Control: Python can be used to write control algorithms for robot manipulators, mobile robots, and other robotic systems. 
+Perception: Python, combined with libraries like OpenCV, enables robots to process sensor data (camera images, lidar data) to understand their surroundings. +Path Planning: Python algorithms can be used to plan collision-free paths for robots to navigate in complex environments. +Machine Learning: Python libraries like Scikit-learn empower robots to learn from data and improve their performance in tasks like object recognition and manipulation. +Simulation: Python can be used to create simulated environments for testing and developing robot algorithms before deploying them on real hardware. +Examples of Python in Robotics: + +Autonomous Navigation: Python is used in self-driving cars and other autonomous vehicles for tasks like perception, localization, and path planning. +Industrial Robotics: Python is employed in manufacturing for robot control, quality inspection, and automation. +Service Robotics: Python powers robots that perform tasks like cleaning, delivery, and customer service in various environments. +Research and Education: Python is a popular choice in robotics research and education due to its ease of use and versatility. +Getting Started with Python in Robotics: + +Learn Python Basics: Familiarize yourself with Python syntax, data structures, and programming concepts. +Explore Robotics Libraries: Dive into libraries like ROS, OpenCV, and others relevant to your robotics interests. +Practice with Projects: Start with small projects to gain hands-on experience, gradually increasing complexity. +Join the Community: Engage with the robotics community through online forums, workshops, and conferences to learn from others and share your knowledge. +In conclusion, Python's simplicity, extensive libraries, and strong community support make it an ideal language for robotics development. Whether you're a beginner or an experienced programmer, Python offers the tools and resources you need to build innovative and capable robots. 
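As a toy illustration of the path-planning use case listed above, a breadth-first search over a small occupancy grid can find a collision-free path in plain Python (a hypothetical example, not from this repository):

```python
from collections import deque


def bfs_path(grid, start, goal):
    """Shortest 4-connected path on an occupancy grid (1 = obstacle), or None."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([start])
    came_from = {start: None}  # maps each visited cell to its predecessor
    while queue:
        cur = queue.popleft()
        if cur == goal:
            # Walk predecessors back to the start and reverse
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cur
                queue.append((nr, nc))
    return None  # goal unreachable


grid = [[0, 1, 0],
        [0, 1, 0],
        [0, 0, 0]]
path = bfs_path(grid, (0, 0), (0, 2))
```

Because BFS expands cells in order of step count, the returned path is the shortest one around the obstacle column.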
+ +Python is used for this `PythonRobotics` project because of the above features +to achieve the purpose of this project described in the :ref:`What is PythonRobotics?`. From 9936f344635e146b9a2ee402ab1672b4e7216d6e Mon Sep 17 00:00:00 2001 From: Atsushi Sakai Date: Fri, 7 Feb 2025 13:32:11 +0900 Subject: [PATCH 07/35] update introduction (#1145) --- .../definition_of_robotics_main.rst | 7 +++ .../python_for_robotics_main.rst | 63 +++++++++---------- 2 files changed, 37 insertions(+), 33 deletions(-) diff --git a/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst b/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst index 2f31834017..fd151e3f20 100644 --- a/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst +++ b/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst @@ -5,3 +5,10 @@ In recent years, autonomous navigation technologies have received huge attention in many fields. Such fields include, autonomous driving[22], drone flight navigation, and other transportation systems. + +Examples of Python in Robotics: + +Autonomous Navigation: Python is used in self-driving cars and other autonomous vehicles for tasks like perception, localization, and path planning. +Industrial Robotics: Python is employed in manufacturing for robot control, quality inspection, and automation. +Service Robotics: Python powers robots that perform tasks like cleaning, delivery, and customer service in various environments. +Research and Education: Python is a popular choice in robotics research and education due to its ease of use and versatility. 
\ No newline at end of file diff --git a/docs/modules/1_introduction/2_python_for_robotics/python_for_robotics_main.rst b/docs/modules/1_introduction/2_python_for_robotics/python_for_robotics_main.rst index 2f89f0c7b5..90edd5dc0c 100644 --- a/docs/modules/1_introduction/2_python_for_robotics/python_for_robotics_main.rst +++ b/docs/modules/1_introduction/2_python_for_robotics/python_for_robotics_main.rst @@ -40,55 +40,52 @@ To address the more needs of scientific computing, many fundamental scientific c - `Matplotlib `_ is a plotting library for the Python programming language and its numerical mathematics extension NumPy. - `Pandas `_ is a fast, powerful, flexible, and easy-to-use open-source data analysis and data manipulation library built on top of NumPy. - `SymPy `_ is a Python library for symbolic mathematics. +- `CVXPy `_ is a Python-embedded modeling language for convex optimization problems. Also, more domain-specific libraries have been developed based on these fundamental libraries: - `Scikit-learn `_ is a free software machine learning library for the Python programming language. - `Scikit-image `_ is a collection of algorithms for image processing. -- `Networkx `_ is a Python package for the creation, manipulation, and study of the structure, dynamics, and functions of complex networks. +- `Networkx `_ is a package for the creation, manipulation for complex networks. - `SunPy `_ is a community-developed free and open-source software package for solar physics. - `Astropy `_ is a community-developed free and open-source software package for astronomy. Currently, Python is one of the most popular programming languages for scientific computing. Python for Robotics -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ +^^^^^^^^^^^^^^^^^^^^ -Scientific computation routine are very important for robotics. -For example, matrix operation, optimization, and visualization are fundamental for robotics. +Python has become an increasingly popular language in robotics. 
-Python has become an increasingly popular language in robotics due to its versatility, readability, and extensive libraries. Here's a breakdown of why Python is a great choice for robotics development: +These are advantages of Python for Robotics: -Advantages of Python for Robotics: +Simplicity and Readability +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +Python's syntax is clear and concise, making it easier to learn and write code. +This is crucial in robotics where complex algorithms and control logic are involved. -Simplicity and Readability: Python's syntax is clear and concise, making it easier to learn and write code. This is crucial in robotics where complex algorithms and control logic are involved. -Extensive Libraries: Python boasts a rich collection of libraries specifically designed for robotics: + +Extensive libraries for scientific computation. +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +Scientific computation routine are fundamental for robotics. +For example: + +- Matrix operation is needed for rigid body transformation, state estimation, and model based control. +- Optimization is needed for optimization based SLAM, optimal path planning, and optimal control. +- Visualization is needed for robot teleoperation, debugging, and simulation. + +ROS supports Python +~~~~~~~~~~~~~~~~~~~~~~~~~~~ ROS (Robot Operating System): ROS, a widely used framework for robotics development, has strong Python support (rospy). This allows developers to easily create nodes, manage communication between different parts of a robot system, and utilize various ROS tools. -OpenCV: This powerful library provides tools for computer vision tasks like image processing, object detection, and motion tracking, essential for robots that perceive and interact with their environment. -NumPy and SciPy: These libraries offer efficient numerical computation and scientific tools, enabling developers to implement complex mathematical models and control algorithms. 
-Scikit-learn: This library provides machine learning algorithms, which are increasingly important in robotics for tasks like perception, planning, and control. -Cross-Platform Compatibility: Python code can run on various operating systems (Windows, macOS, Linux), providing flexibility in choosing hardware platforms for robotics projects. -Large Community and Support: Python has a vast and active community, offering ample resources, tutorials, and support for developers. This is invaluable when tackling challenges in robotics development. -Use Cases of Python in Robotics: - -Robot Control: Python can be used to write control algorithms for robot manipulators, mobile robots, and other robotic systems. -Perception: Python, combined with libraries like OpenCV, enables robots to process sensor data (camera images, lidar data) to understand their surroundings. -Path Planning: Python algorithms can be used to plan collision-free paths for robots to navigate in complex environments. -Machine Learning: Python libraries like Scikit-learn empower robots to learn from data and improve their performance in tasks like object recognition and manipulation. -Simulation: Python can be used to create simulated environments for testing and developing robot algorithms before deploying them on real hardware. -Examples of Python in Robotics: - -Autonomous Navigation: Python is used in self-driving cars and other autonomous vehicles for tasks like perception, localization, and path planning. -Industrial Robotics: Python is employed in manufacturing for robot control, quality inspection, and automation. -Service Robotics: Python powers robots that perform tasks like cleaning, delivery, and customer service in various environments. -Research and Education: Python is a popular choice in robotics research and education due to its ease of use and versatility. 
-Getting Started with Python in Robotics: - -Learn Python Basics: Familiarize yourself with Python syntax, data structures, and programming concepts. -Explore Robotics Libraries: Dive into libraries like ROS, OpenCV, and others relevant to your robotics interests. -Practice with Projects: Start with small projects to gain hands-on experience, gradually increasing complexity. -Join the Community: Engage with the robotics community through online forums, workshops, and conferences to learn from others and share your knowledge. -In conclusion, Python's simplicity, extensive libraries, and strong community support make it an ideal language for robotics development. Whether you're a beginner or an experienced programmer, Python offers the tools and resources you need to build innovative and capable robots. + +Cross-Platform Compatibility +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +Python code can run on various operating systems (Windows, macOS, Linux), providing flexibility in choosing hardware platforms for robotics projects. + +Large Community and Support +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +Python has a vast and active community, offering ample resources, tutorials, and support for developers. This is invaluable when tackling challenges in robotics development. + Python is used for this `PythonRobotics` project because of the above features to achieve the purpose of this project described in the :ref:`What is PythonRobotics?`. From 15e106839285033e1fa68cbed6f14e219f3af88f Mon Sep 17 00:00:00 2001 From: Atsushi Sakai Date: Fri, 7 Feb 2025 13:42:56 +0900 Subject: [PATCH 08/35] Update CONTRIBUTING.md --- CONTRIBUTING.md | 24 +++--------------------- 1 file changed, 3 insertions(+), 21 deletions(-) diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index 91f6dfa822..3bcc499e6a 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -1,23 +1,5 @@ -# Contributing to Python +# Contributing -:+1::tada: First of off, thanks very much for taking the time to contribute! 
:tada::+1: +:+1::tada: First of all, thank you very much for taking the time to contribute! :tada::+1: -The following is a set of guidelines for contributing to PythonRobotics. - -These are mostly guidelines, not rules. - -Use your best judgment, and feel free to propose changes to this document in a pull request. - -# Before contributing - -## Taking a look at the paper. - -Please check this paper to understand the philosophy of this project. - -- [\[1808\.10703\] PythonRobotics: a Python code collection of robotics algorithms](https://arxiv.org/abs/1808.10703) ([BibTeX](https://github.com/AtsushiSakai/PythonRoboticsPaper/blob/master/python_robotics.bib)) - -## Check your Python version. - -We only accept a PR for Python 3.8.x or higher. - -We will not accept a PR for Python 2.x. +Please check this document for contribution: [How to contribute — PythonRobotics documentation](https://atsushisakai.github.io/PythonRobotics/modules/0_getting_started/3_how_to_contribute.html) From a8f3388bbe80e41655acedc44c2cbb86d011fb48 Mon Sep 17 00:00:00 2001 From: Atsushi Sakai Date: Sun, 9 Feb 2025 21:00:02 +0900 Subject: [PATCH 09/35] update introduction (#1146) --- .../3_how_to_contribute_main.rst | 37 ++++++++++++++----- .../python_for_robotics_main.rst | 29 ++++++++++++--- 2 files changed, 51 insertions(+), 15 deletions(-) diff --git a/docs/modules/0_getting_started/3_how_to_contribute_main.rst b/docs/modules/0_getting_started/3_how_to_contribute_main.rst index 6e5c1be8ee..874564cbb8 100644 --- a/docs/modules/0_getting_started/3_how_to_contribute_main.rst +++ b/docs/modules/0_getting_started/3_how_to_contribute_main.rst @@ -9,10 +9,30 @@ There are several ways to contribute to this project as below: #. `Adding missed documentations for existing examples`_ #. 
`Supporting this project`_ +Before contributing +^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +Please check following items before contributing: + +Understanding this project +--------------------------- + +Please check this :ref:`What is PythonRobotics?` section and this paper +`PythonRobotics: a Python code collection of robotics algorithms`_ +to understand the philosophies of this project. + +.. _`PythonRobotics: a Python code collection of robotics algorithms`: https://arxiv.org/abs/1808.10703 + +Check your Python version. +--------------------------- + +We only accept a PR for Python 3.12.x or higher. + +We will not accept a PR for Python 2.x. .. _`Adding a new algorithm example`: -Adding a new algorithm example +1. Adding a new algorithm example ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ This is a step by step manual to add a new algorithm example. @@ -112,8 +132,8 @@ Note that this is my hobby project; I appreciate your patience during the review .. _`Reporting and fixing a defect`: -Reporting and fixing a defect -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ +2. Reporting and fixing a defect +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Reporting and fixing a defect is also great contribution. @@ -136,8 +156,8 @@ This doc `submit a pull request`_ can be helpful to submit a pull request. .. _`Adding missed documentations for existing examples`: -Adding missed documentations for existing examples -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ +3. Adding missed documentations for existing examples +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Adding the missed documentations for existing examples is also great contribution. @@ -150,8 +170,8 @@ This doc `how to write doc`_ can be helpful to write documents. .. _`Supporting this project`: -Supporting this project -^^^^^^^^^^^^^^^^^^^^^^^^ +4. Supporting this project +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Supporting this project financially is also a great contribution!!. 
@@ -165,8 +185,7 @@ If you or your company would like to support this project, please consider: If you would like to support us in some other way, please contact with creating an issue. -Current Major Sponsors ------------------------ +Current Major Sponsors: #. `JetBrains`_ : They are providing a free license of their IDEs for this OSS development. #. `1Password`_ : They are providing a free license of their 1Password team license for this OSS project. diff --git a/docs/modules/1_introduction/2_python_for_robotics/python_for_robotics_main.rst b/docs/modules/1_introduction/2_python_for_robotics/python_for_robotics_main.rst index 90edd5dc0c..65b1705150 100644 --- a/docs/modules/1_introduction/2_python_for_robotics/python_for_robotics_main.rst +++ b/docs/modules/1_introduction/2_python_for_robotics/python_for_robotics_main.rst @@ -1,6 +1,8 @@ Python for Robotics ---------------------- +Python is used for this `PythonRobotics` project because of the above features +to achieve the purpose of this project described in the :ref:`What is PythonRobotics?`. This section explains the Python itself and features for Robotics. Python for general-purpose programming @@ -76,7 +78,27 @@ For example: ROS supports Python ~~~~~~~~~~~~~~~~~~~~~~~~~~~ -ROS (Robot Operating System): ROS, a widely used framework for robotics development, has strong Python support (rospy). This allows developers to easily create nodes, manage communication between different parts of a robot system, and utilize various ROS tools. +`ROS`_ (Robot Operating System) is an open-source and widely used framework for robotics development. +It is designed to help developping complicated robotic applications. +ROS provides essential tools, libraries, and drivers to simplify robot programming and integration. + +Key Features of ROS: + +- Modular Architecture – Uses a node-based system where different components (nodes) communicate via messages. 
+- Hardware Abstraction – Supports various robots, sensors, and actuators, making development more flexible. +- Powerful Communication System – Uses topics, services, and actions for efficient data exchange between components. +- Rich Ecosystem – Offers many pre-built packages for navigation, perception, and manipulation. +- Multi-language Support – Primarily uses Python and C++, but also supports other languages. +- Simulation & Visualization – Tools like Gazebo (for simulation) and RViz (for visualization) aid in development and testing. +- Scalability & Community Support – Widely used in academia and industry, with a large open-source community. + +ROS has strong Python support (`rospy`_ for ROS1 and `rclpy`_ for ROS2). +This allows developers to easily create nodes, manage communication between +different parts of a robot system, and utilize various ROS tools. + +.. _`ROS`: https://www.ros.org/ +.. _`rospy`: http://wiki.ros.org/rospy +.. _`rclpy`: https://docs.ros.org/en/jazzy/Tutorials/Beginner-Client-Libraries/Writing-A-Simple-Py-Publisher-And-Subscriber.html Cross-Platform Compatibility ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ @@ -85,8 +107,3 @@ Python code can run on various operating systems (Windows, macOS, Linux), provid Large Community and Support ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Python has a vast and active community, offering ample resources, tutorials, and support for developers. This is invaluable when tackling challenges in robotics development. - - -Python is used for this `PythonRobotics` project because of the above features -to achieve the purpose of this project described in the :ref:`What is PythonRobotics?`. 
- From e304f07a997bf40977a004ac2a66f476aa2aa861 Mon Sep 17 00:00:00 2001 From: Atsushi Sakai Date: Mon, 10 Feb 2025 11:21:19 +0900 Subject: [PATCH 10/35] update introduction (#1147) --- .../python_for_robotics_main.rst | 32 +++++++++++++++++-- 1 file changed, 29 insertions(+), 3 deletions(-) diff --git a/docs/modules/1_introduction/2_python_for_robotics/python_for_robotics_main.rst b/docs/modules/1_introduction/2_python_for_robotics/python_for_robotics_main.rst index 65b1705150..b677d2c59b 100644 --- a/docs/modules/1_introduction/2_python_for_robotics/python_for_robotics_main.rst +++ b/docs/modules/1_introduction/2_python_for_robotics/python_for_robotics_main.rst @@ -1,9 +1,10 @@ Python for Robotics ---------------------- -Python is used for this `PythonRobotics` project because of the above features -to achieve the purpose of this project described in the :ref:`What is PythonRobotics?`. -This section explains the Python itself and features for Robotics. +A programing language, Python is used for this `PythonRobotics` project +to achieve the purposes of this project described in the :ref:`What is PythonRobotics?`. + +This section explains the Python itself and features for science computing Robotics. Python for general-purpose programming ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ @@ -107,3 +108,28 @@ Python code can run on various operating systems (Windows, macOS, Linux), provid Large Community and Support ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Python has a vast and active community, offering ample resources, tutorials, and support for developers. This is invaluable when tackling challenges in robotics development. + +Situations which Python is NOT suitable for Robotics +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +We explained the advantages of Python for robotics. +However, Python is not always the best choice for robotics development. 
+ +These are situations where Python is NOT suitable for robotics: + +High-speed real-time control +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +Python is an interpreted language, which means it is slower than compiled languages like C++. +This can be a disadvantage when real-time control is required, +such as in high-speed motion control or safety-critical systems. + +So, for these applications, we recommend to understand the each algorithm you +needed using this project and implement it in other suitable languages like C++. + +Resource-constrained systems +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +Python is a high-level language that requires more memory and processing power +compared to low-level languages. +So, it is difficult to run Python on resource-constrained systems like +microcontrollers or embedded devices. +In such cases, C or C++ is more suitable for these applications. From 610f35ff58e6535efa619f9ed73a113c6ca2c7f7 Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Tue, 11 Feb 2025 10:42:10 +0900 Subject: [PATCH 11/35] build(deps): bump mypy from 1.14.1 to 1.15.0 in /requirements (#1148) Bumps [mypy](https://github.com/python/mypy) from 1.14.1 to 1.15.0. - [Changelog](https://github.com/python/mypy/blob/master/CHANGELOG.md) - [Commits](https://github.com/python/mypy/compare/v1.14.1...v1.15.0) --- updated-dependencies: - dependency-name: mypy dependency-type: direct:production update-type: version-update:semver-minor ... 
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- requirements/requirements.txt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/requirements/requirements.txt b/requirements/requirements.txt index 9d4e7deb4d..178046ac0d 100644 --- a/requirements/requirements.txt +++ b/requirements/requirements.txt @@ -4,5 +4,5 @@ matplotlib == 3.10.0 cvxpy == 1.5.3 pytest == 8.3.4 # For unit test pytest-xdist == 3.6.1 # For unit test -mypy == 1.14.1 # For unit test +mypy == 1.15.0 # For unit test ruff == 0.9.4 # For unit test From ba307673013376204ceb5a6def16da0e3a86a15d Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Tue, 11 Feb 2025 11:35:28 +0900 Subject: [PATCH 12/35] build(deps): bump ruff from 0.9.4 to 0.9.6 in /requirements (#1149) Bumps [ruff](https://github.com/astral-sh/ruff) from 0.9.4 to 0.9.6. - [Release notes](https://github.com/astral-sh/ruff/releases) - [Changelog](https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md) - [Commits](https://github.com/astral-sh/ruff/compare/0.9.4...0.9.6) --- updated-dependencies: - dependency-name: ruff dependency-type: direct:production update-type: version-update:semver-patch ... 
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- requirements/requirements.txt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/requirements/requirements.txt b/requirements/requirements.txt index 178046ac0d..f5f674d7d2 100644 --- a/requirements/requirements.txt +++ b/requirements/requirements.txt @@ -5,4 +5,4 @@ cvxpy == 1.5.3 pytest == 8.3.4 # For unit test pytest-xdist == 3.6.1 # For unit test mypy == 1.15.0 # For unit test -ruff == 0.9.4 # For unit test +ruff == 0.9.6 # For unit test From b298609b2832c8029baa98b2d3906abce0263ec4 Mon Sep 17 00:00:00 2001 From: Atsushi Sakai Date: Tue, 11 Feb 2025 21:15:34 +0900 Subject: [PATCH 13/35] update introduction doc (#1151) --- .../definition_of_robotics_main.rst | 81 ++++++++++++++++--- .../python_for_robotics_main.rst | 2 +- .../technology_for_robotics_main.rst | 9 +++ 3 files changed, 82 insertions(+), 10 deletions(-) diff --git a/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst b/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst index fd151e3f20..f6fba646b4 100644 --- a/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst +++ b/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst @@ -1,14 +1,77 @@ Definition of Robotics ---------------------- -In recent years, autonomous navigation technologies have received huge -attention in many fields. -Such fields include, autonomous driving[22], drone flight navigation, -and other transportation systems. +What is Robotics? +^^^^^^^^^^^^^^^^^^ -Examples of Python in Robotics: +A robot is a machine that can perform tasks automatically or semi-autonomously. +Robotics is the study of robots.
+The field of robotics has wide areas of technologies such as mechanical engineering, +electrical engineering, computer science, and artificial intelligence (AI), +to create machines that can perform tasks autonomously or semi-autonomously. + +The History of Robots +^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +This timeline highlights key milestones in the history of robotics: + +Ancient and Early Concepts (Before 1500s) +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +The idea of **automated machines** has existed for thousands of years. Ancient civilizations imagined mechanical beings: + +- **Ancient Greece (4th Century BC)** – Greek engineer **Hero of Alexandria** designed early **automata** (self-operating machines) powered by water or air. +- **Chinese and Arabic Automata (9th–13th Century)** – Inventors like **Al-Jazari** created intricate mechanical devices, including water clocks and humanoid robots. + +The Birth of Modern Robotics (1500s–1800s) +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +- **Leonardo da Vinci’s Robot (1495)** – Designed a humanoid knight with mechanical movement. +- **Jacques de Vaucanson’s Automata (1738)** – Created robotic figures like a mechanical duck that could "eat" and "digest." +- **Industrial Revolution (18th–19th Century)** – Machines began replacing human labor in factories, setting the foundation for automation. + +The Rise of Industrial Robots (1900s–1950s) +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +- **The Term “Robot” (1921)** – Czech writer **Karel Čapek** introduced the word *“robot”* in his play *R.U.R. (Rossum’s Universal Robots)*. +- **Early Cybernetics (1940s–1950s)** – Scientists like **Norbert Wiener** developed theories of self-regulating machines, influencing modern robotics. 
+ +The Birth of Modern Robotics (1950s–1980s) +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +- **First Industrial Robot (1961)** – *Unimate*, created by **George Devol and Joseph Engelberger**, was the first programmable robot used in a factory. +- **Rise of AI & Autonomous Robots (1970s–1980s)** – Researchers developed mobile robots like **Shakey** (Stanford, 1966) and AI-based control systems. + +Advanced Robotics and AI Integration (1990s–Present) +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +- **Autonomous Vehicles & Drones** – Self-driving cars and UAVs (unmanned aerial vehicles) became more advanced. +- **Medical Robotics** – Robots like **da Vinci Surgical System** revolutionized healthcare. +- **Personal Robots** – Devices like **Roomba** (vacuum robot) and **Sophia** (AI humanoid) became popular. +- **Collaborative Robots (Cobots)** – Robots started working alongside humans in industries. + +Key Components of Robotics +^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +Robotics consists of several essential components: + +#. Sensors – Gather information from the environment (e.g., cameras, LiDAR, gyro, accelerometer, wheel encoders). +#. Actuators – Enable movement and interaction with the world (e.g., motors, hydraulic systems). +#. Computers – Process sensor data and make decisions (e.g., micro-controllers, CPUs, GPUs). +#. Power Supply – Provides energy to run the robot (e.g., batteries, solar power). +#. Software & Algorithms – Allow the robot to function and make intelligent decisions (e.g., ROS, machine learning models, localization, mapping, path planning, control). + +This project, PythonRobotics, focuses on the software and algorithms part of robotics. + +Applications of Robots +^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +Robots come in various forms depending on their purpose: + +#. 🤖 Industrial Robots – Used in manufacturing (e.g., robotic arms in manufacturing factories). +#. 🏠 Service Robots – Assist in daily life (e.g., vacuum robots, delivery robots). +#. 
🚗 Autonomous Vehicles – Self-driving cars and drones. +#. 👨‍⚕️ Medical Robots – Assist in surgeries and healthcare. +#. 🚀 Space & Exploration Robots – Used for planetary exploration (e.g., NASA’s Mars rovers). +#. 🐶 Humanoid & Social Robots – Designed to interact with humans (e.g., ASIMO, Sophia). -Autonomous Navigation: Python is used in self-driving cars and other autonomous vehicles for tasks like perception, localization, and path planning. -Industrial Robotics: Python is employed in manufacturing for robot control, quality inspection, and automation. -Service Robotics: Python powers robots that perform tasks like cleaning, delivery, and customer service in various environments. -Research and Education: Python is a popular choice in robotics research and education due to its ease of use and versatility. \ No newline at end of file diff --git a/docs/modules/1_introduction/2_python_for_robotics/python_for_robotics_main.rst b/docs/modules/1_introduction/2_python_for_robotics/python_for_robotics_main.rst index b677d2c59b..c47c122853 100644 --- a/docs/modules/1_introduction/2_python_for_robotics/python_for_robotics_main.rst +++ b/docs/modules/1_introduction/2_python_for_robotics/python_for_robotics_main.rst @@ -4,7 +4,7 @@ Python for Robotics A programming language, Python is used for this `PythonRobotics` project to achieve the purposes of this project described in the :ref:`What is PythonRobotics?`. -This section explains the Python itself and features for science computing Robotics. +This section explains Python itself and its features for scientific computing and robotics.
Python for general-purpose programming ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ diff --git a/docs/modules/1_introduction/3_technology_for_robotics/technology_for_robotics_main.rst b/docs/modules/1_introduction/3_technology_for_robotics/technology_for_robotics_main.rst index 4dd1d1842f..93dc9e3466 100644 --- a/docs/modules/1_introduction/3_technology_for_robotics/technology_for_robotics_main.rst +++ b/docs/modules/1_introduction/3_technology_for_robotics/technology_for_robotics_main.rst @@ -1,6 +1,15 @@ Technology for Robotics ------------------------- + +Autonomous Navigation +^^^^^^^^^^^^^^^^^^^^^^^^ + +In recent years, autonomous navigation technologies have received huge +attention in many fields. +Such fields include, autonomous driving[22], drone flight navigation, +and other transportation systems. + An autonomous navigation system is a system that can move to a goal over long periods of time without any external control by an operator. The system requires a wide range of technologies: From be608f067cbd96f34955b81d3e5be9e22a46f588 Mon Sep 17 00:00:00 2001 From: Atsushi Sakai Date: Wed, 12 Feb 2025 21:51:29 +0900 Subject: [PATCH 14/35] update introduction doc (#1152) --- .../definition_of_robotics_main.rst | 38 ++++++++++++++----- .../technology_for_robotics_main.rst | 3 ++ 2 files changed, 31 insertions(+), 10 deletions(-) diff --git a/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst b/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst index f6fba646b4..63525057fe 100644 --- a/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst +++ b/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst @@ -1,14 +1,24 @@ Definition of Robotics ---------------------- +This section explains the definition, history, key components, and applications of robotics. + What is Robotics? 
^^^^^^^^^^^^^^^^^^ A robot is a machine that can perform tasks automatically or semi-autonomously. Robotics is the study of robots. -The field of robotics has wide areas of technologies such as mechanical engineering, -electrical engineering, computer science, and artificial intelligence (AI), -to create machines that can perform tasks autonomously or semi-autonomously. + +The word “robot” comes from the Czech word “robota,” which means “forced labor” or “drudgery.” +It was first used in the 1920 science fiction play `R.U.R.`_ (Rossum’s Universal Robots) +by the Czech writer `Karel Čapek`_. +In the play, robots were artificial workers created to serve humans, but they eventually rebelled. + +Over time, “robot” came to refer to machines or automated systems that can perform tasks, +often with some level of intelligence or autonomy. + +.. _`R.U.R.`: https://thereader.mitpress.mit.edu/origin-word-robot-rur/ +.. _`Karel Čapek`: https://en.wikipedia.org/wiki/Karel_%C4%8Capek The History of Robots ^^^^^^^^^^^^^^^^^^^^^^^^^^^ @@ -18,22 +28,30 @@ This timeline highlights key milestones in the history of robotics: Ancient and Early Concepts (Before 1500s) ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ -The idea of **automated machines** has existed for thousands of years. Ancient civilizations imagined mechanical beings: +The idea of **automated machines** has existed for thousands of years. +Ancient civilizations imagined mechanical beings: + +- **Ancient Greece (4th Century BC)** – Greek engineer `Hero of Alexandria`_ designed early **automata** (self-operating machines) powered by water or air. +- **Chinese and Arabic Automata (9th–13th Century)** – Inventors like `Ismail Al-Jazari`_ created intricate mechanical devices, including water clocks and automated moving peacocks driven by hydropower. -- **Ancient Greece (4th Century BC)** – Greek engineer **Hero of Alexandria** designed early **automata** (self-operating machines) powered by water or air.
-- **Chinese and Arabic Automata (9th–13th Century)** – Inventors like **Al-Jazari** created intricate mechanical devices, including water clocks and humanoid robots. +.. _`Hero of Alexandria`: https://en.wikipedia.org/wiki/Hero_of_Alexandria +.. _`Ismail Al-Jazari`: https://en.wikipedia.org/wiki/Ismail_al-Jazari The Birth of Modern Robotics (1500s–1800s) ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ -- **Leonardo da Vinci’s Robot (1495)** – Designed a humanoid knight with mechanical movement. -- **Jacques de Vaucanson’s Automata (1738)** – Created robotic figures like a mechanical duck that could "eat" and "digest." -- **Industrial Revolution (18th–19th Century)** – Machines began replacing human labor in factories, setting the foundation for automation. +- `Leonardo da Vinci’s Robot`_ (1495) – Designed a humanoid knight with mechanical movement. +- `Jacques de Vaucanson’s Digesting Duck`_ (1738) – Created robotic figures like a mechanical duck that could "eat" and "digest." +- `Industrial Revolution`_ (18th–19th Century) – Machines began replacing human labor in factories, setting the foundation for automation. + +.. _`Leonardo da Vinci’s Robot`: https://en.wikipedia.org/wiki/Leonardo%27s_robot +.. _`Jacques de Vaucanson’s Digesting Duck`: https://en.wikipedia.org/wiki/Jacques_de_Vaucanson +.. _`Industrial Revolution`: https://en.wikipedia.org/wiki/Industrial_Revolution The Rise of Industrial Robots (1900s–1950s) ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ -- **The Term “Robot” (1921)** – Czech writer **Karel Čapek** introduced the word *“robot”* in his play *R.U.R. (Rossum’s Universal Robots)*. +- **The Term “Robot” (1921)** – Czech writer `Karel Čapek`_ introduced the word *“robot”* in his play `R.U.R.`_ (Rossum’s Universal Robots). - **Early Cybernetics (1940s–1950s)** – Scientists like **Norbert Wiener** developed theories of self-regulating machines, influencing modern robotics. 
The Birth of Modern Robotics (1950s–1980s) diff --git a/docs/modules/1_introduction/3_technology_for_robotics/technology_for_robotics_main.rst b/docs/modules/1_introduction/3_technology_for_robotics/technology_for_robotics_main.rst index 93dc9e3466..e460059e20 100644 --- a/docs/modules/1_introduction/3_technology_for_robotics/technology_for_robotics_main.rst +++ b/docs/modules/1_introduction/3_technology_for_robotics/technology_for_robotics_main.rst @@ -1,6 +1,9 @@ Technology for Robotics ------------------------- +The field of robotics needs wide areas of technologies such as mechanical engineering, +electrical engineering, computer science, and artificial intelligence (AI). + Autonomous Navigation ^^^^^^^^^^^^^^^^^^^^^^^^ From 1ecc154fbaf9bc2342671fe93a305e8f0e3f510f Mon Sep 17 00:00:00 2001 From: Atsushi Sakai Date: Thu, 13 Feb 2025 17:19:03 +0900 Subject: [PATCH 15/35] update contribution link in README.md to fix invalid link (#1154) --- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index cb1816c2b5..818d7b0d4e 100644 --- a/README.md +++ b/README.md @@ -617,7 +617,7 @@ This is a list of user's comment and references:[users\_comments](https://github Any contribution is welcome!! 
-Please check this document:[How To Contribute — PythonRobotics documentation](https://atsushisakai.github.io/PythonRobotics/how_to_contribute.html) +Please check this document:[How To Contribute — PythonRobotics documentation](https://atsushisakai.github.io/PythonRobotics/modules/0_getting_started/3_how_to_contribute.html) # Citing From 156483000524488d64687a853ac030e8d57c18a6 Mon Sep 17 00:00:00 2001 From: Atsushi Sakai Date: Thu, 13 Feb 2025 17:56:17 +0900 Subject: [PATCH 16/35] update robotics definition document to enhance references and clarity (#1155) --- .../definition_of_robotics_main.rst | 29 ++++++++++++++----- 1 file changed, 22 insertions(+), 7 deletions(-) diff --git a/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst b/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst index 63525057fe..265814e068 100644 --- a/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst +++ b/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst @@ -52,21 +52,36 @@ The Rise of Industrial Robots (1900s–1950s) ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ - **The Term “Robot” (1921)** – Czech writer `Karel Čapek`_ introduced the word *“robot”* in his play `R.U.R.`_ (Rossum’s Universal Robots). -- **Early Cybernetics (1940s–1950s)** – Scientists like **Norbert Wiener** developed theories of self-regulating machines, influencing modern robotics. +- **Early Cybernetics (1940s–1950s)** – Scientists like `Norbert Wiener`_ developed theories of self-regulating machines, influencing modern robotics (Cybernetics). + +.. _`Norbert Wiener`: https://en.wikipedia.org/wiki/Norbert_Wiener The Birth of Modern Robotics (1950s–1980s) ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ -- **First Industrial Robot (1961)** – *Unimate*, created by **George Devol and Joseph Engelberger**, was the first programmable robot used in a factory. 
-- **Rise of AI & Autonomous Robots (1970s–1980s)** – Researchers developed mobile robots like **Shakey** (Stanford, 1966) and AI-based control systems. +- **First Industrial Robot (1961)** – `Unimate`_, created by `George Devol`_ and `Joseph Engelberger`_, was the first programmable robot used in a factory. +- **Rise of AI & Autonomous Robots (1970s–1980s)** – Researchers developed mobile robots like `Shakey`_ (Stanford, 1966) and AI-based control systems. + +.. _`Unimate`: https://en.wikipedia.org/wiki/Unimate +.. _`George Devol`: https://en.wikipedia.org/wiki/George_Devol +.. _`Joseph Engelberger`: https://en.wikipedia.org/wiki/Joseph_Engelberger +.. _`Shakey`: https://en.wikipedia.org/wiki/Shakey_the_robot Advanced Robotics and AI Integration (1990s–Present) ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ -- **Autonomous Vehicles & Drones** – Self-driving cars and UAVs (unmanned aerial vehicles) became more advanced. -- **Medical Robotics** – Robots like **da Vinci Surgical System** revolutionized healthcare. -- **Personal Robots** – Devices like **Roomba** (vacuum robot) and **Sophia** (AI humanoid) became popular. -- **Collaborative Robots (Cobots)** – Robots started working alongside humans in industries. +- **Autonomous Vehicles** – Self-driving cars for robo taxi like `Waymo`_ and autonomous haulage system in mining like `AHS`_ became more advanced and business-ready. +- **Medical Robotics** – Robots like `da Vinci Surgical System`_ revolutionized healthcare. +- **Personal Robots** – Devices like `Roomba`_ (vacuum robot) and `Sophia`_ (AI humanoid) became popular. +- **Service Robots** - Assistive robots like serving robots in restaurants and hotels like `Bellabot`_. +- **Collaborative Robots (Drones)** – Collaborative robots like UAV (Unmanned Aerial Vehicle) in drone shows and delivery services. + +.. _`Waymo`: https://waymo.com/ .. 
_`AHS`: https://www.futurebridge.com/industry/perspectives-industrial-manufacturing/autonomous-haulage-systems-the-future-of-mining-operations/ +.. _`da Vinci Surgical System`: https://en.wikipedia.org/wiki/Da_Vinci_Surgical_System +.. _`Roomba`: https://en.wikipedia.org/wiki/Roomba +.. _`Sophia`: https://en.wikipedia.org/wiki/Sophia_(robot) +.. _`Bellabot`: https://www.pudurobotics.com/en Key Components of Robotics ^^^^^^^^^^^^^^^^^^^^^^^^^^^ From 77ad3344b5e9df26ca370725d3921dd354c755b1 Mon Sep 17 00:00:00 2001 From: Atsushi Sakai Date: Fri, 14 Feb 2025 21:20:35 +0900 Subject: [PATCH 17/35] update robotics definition document to improve clarity and add references (#1157) --- .../definition_of_robotics_main.rst | 28 +++++++------------ 1 file changed, 10 insertions(+), 18 deletions(-) diff --git a/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst b/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst index 265814e068..f54d4d41fa 100644 --- a/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst +++ b/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst @@ -70,12 +70,17 @@ The Birth of Modern Robotics (1950s–1980s) Advanced Robotics and AI Integration (1990s–Present) ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +- **Industrial Robots** – Advanced robots like `Baxter`_ and `Amazon Robotics`_ revolutionized manufacturing and logistics. - **Autonomous Vehicles** – Self-driving cars for robo taxi like `Waymo`_ and autonomous haulage system in mining like `AHS`_ became more advanced and business-ready. +- **Exploration Robots** – Used for planetary exploration (e.g., NASA’s `Mars rovers`_). - **Medical Robotics** – Robots like `da Vinci Surgical System`_ revolutionized healthcare. - **Personal Robots** – Devices like `Roomba`_ (vacuum robot) and `Sophia`_ (AI humanoid) became popular.
- **Service Robots** - Assistive robots like serving robots in restaurants and hotels like `Bellabot`_. - **Collaborative Robots (Drones)** – Collaborative robots like UAV (Unmanned Aerial Vehicle) in drone shows and delivery services. +.. _`Baxter`: https://en.wikipedia.org/wiki/Baxter_(robot) +.. _`Amazon Robotics`: https://en.wikipedia.org/wiki/Amazon_Robotics +.. _`Mars rovers`: https://en.wikipedia.org/wiki/Mars_rover .. _`Waymo`: https://waymo.com/ .. _`AHS`: https://www.futurebridge.com/industry/perspectives-industrial-manufacturing/autonomous-haulage-systems-the-future-of-mining-operations/ .. _`da Vinci Surgical System`: https://en.wikipedia.org/wiki/Da_Vinci_Surgical_System @@ -88,23 +93,10 @@ Key Components of Robotics Robotics consists of several essential components: -#. Sensors – Gather information from the environment (e.g., cameras, LiDAR, gyro, accelerometer, wheel encoders). -#. Actuators – Enable movement and interaction with the world (e.g., motors, hydraulic systems). -#. Computers – Process sensor data and make decisions (e.g., micro-controllers, CPUs, GPUs). -#. Power Supply – Provides energy to run the robot (e.g., batteries, solar power). -#. Software & Algorithms – Allow the robot to function and make intelligent decisions (e.g., ROS, machine learning models, localization, mapping, path planning, control). +#. Sensors – Gather information from the environment (e.g., Cameras, LiDAR, GNSS, Gyro, Accelerometer, Wheel encoders). +#. Actuators – Enable movement and interaction with the world (e.g., Motors, Hydraulic systems). +#. Computers – Process sensor data and make decisions (e.g., Micro-controllers, CPUs, GPUs). +#. Power Supply – Provides energy to run the robot (e.g., Batteries, Solar power). +#. Software & Algorithms – Allow the robot to function and make intelligent decisions (e.g., ROS, Machine learning models, Localization, Mapping, Path planning, Control). 
This project, PythonRobotics, focuses on the software and algorithms part of robotics. - -Applications of Robots -^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -Robots come in various forms depending on their purpose: - -#. 🤖 Industrial Robots – Used in manufacturing (e.g., robotic arms in manufacturing factories). -#. 🏠 Service Robots – Assist in daily life (e.g., vacuum robots, delivery robots). -#. 🚗 Autonomous Vehicles – Self-driving cars and drones. -#. 👨‍⚕️ Medical Robots – Assist in surgeries and healthcare. -#. 🚀 Space & Exploration Robots – Used for planetary exploration (e.g., NASA’s Mars rovers). -#. 🐶 Humanoid & Social Robots – Designed to interact with humans (e.g., ASIMO, Sophia). - From 35c08824d00bd9fb452d1ee951018b177e65fd00 Mon Sep 17 00:00:00 2001 From: Atsushi Sakai Date: Sat, 15 Feb 2025 16:08:23 +0900 Subject: [PATCH 18/35] add external sensors documentation to appendix (#1159) --- docs/modules/12_appendix/appendix_main.rst | 1 + .../12_appendix/external_sensors_main.rst | 56 +++++++++++++++++++ 2 files changed, 57 insertions(+) create mode 100644 docs/modules/12_appendix/external_sensors_main.rst diff --git a/docs/modules/12_appendix/appendix_main.rst b/docs/modules/12_appendix/appendix_main.rst index cb1ac04066..89a7fa9303 100644 --- a/docs/modules/12_appendix/appendix_main.rst +++ b/docs/modules/12_appendix/appendix_main.rst @@ -10,4 +10,5 @@ Appendix steering_motion_model Kalmanfilter_basics Kalmanfilter_basics_2 + external_sensors diff --git a/docs/modules/12_appendix/external_sensors_main.rst b/docs/modules/12_appendix/external_sensors_main.rst new file mode 100644 index 0000000000..a1dba3b214 --- /dev/null +++ b/docs/modules/12_appendix/external_sensors_main.rst @@ -0,0 +1,56 @@ +External Sensors for Robots +============================ + +Introduction +------------ + +In recent years, the application of robotic technology has advanced, +particularly in areas such as autonomous vehicles and disaster response robots. 
+A crucial element in these technologies is external recognition—the robot's ability to understand its surrounding environment, identify safe zones, and detect moving objects using onboard sensors. Achieving effective external recognition involves various techniques, but equally important is the selection of appropriate sensors. Robots, like the sensors they employ, come in many forms, but external recognition sensors can be broadly categorized into three types. Developing an advanced external recognition system requires a thorough understanding of each sensor's principles and characteristics to determine their optimal application. This article summarizes the principles and features of these sensors for personal study purposes. + +Laser Sensors +------------- + +Laser sensors measure distances by utilizing light, commonly referred to as Light Detection and Ranging (LIDAR). They operate by emitting light towards an object and calculating the distance based on the time it takes for the reflected light to return, using the speed of light as a constant. + +Radar Sensors +------------- + +TBD + + +Monocular Cameras +----------------- + +Monocular cameras utilize a single camera to recognize the external environment. Compared to other sensors, they can detect color and brightness information, making them primarily useful for object recognition. However, they face challenges in independently measuring distances to surrounding objects and may struggle in low-light or dark conditions. + +Requirements for Cameras and Image Processing in Robotics +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +While camera sensors are widely used in applications like surveillance, deploying them in robotics necessitates meeting specific requirements: + +1. High dynamic range to adapt to various lighting conditions +2. Wide measurement range +3. Capability for three-dimensional measurement through techniques like motion stereo +4. Real-time processing with high frame rates +5. 
Cost-effectiveness + +Stereo Cameras +-------------- + +Stereo cameras employ multiple cameras to measure distances to surrounding objects. By knowing the positions and orientations of each camera and analyzing the disparity in the images (parallax), the distance to a specific point (the object represented by a particular pixel) can be calculated. + +Characteristics of Stereo Cameras +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +Advantages of stereo cameras include the ability to obtain high-precision and high-density distance information at close range, depending on factors like camera resolution and the distance between cameras (baseline). This makes them suitable for indoor robots that require precise shape recognition of nearby objects. Additionally, stereo cameras are relatively cost-effective compared to other sensors, leading to their use in consumer products like Subaru's EyeSight system. However, stereo cameras are less effective for long-distance measurements due to a decrease in accuracy proportional to the square of the distance. They are also susceptible to environmental factors such as lighting conditions. + +Ultrasonic Sensors +------------------ + +Ultrasonic sensors are commonly used in indoor robots and some automotive autonomous driving systems. Their features include affordability compared to laser or radar sensors, the ability to detect very close objects, and the capability to sense materials like glass, which may be challenging for lasers or cameras. However, they have limitations such as shorter maximum measurement distances and lower resolution and accuracy. 
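The range-measurement principles described above reduce to two one-line formulas: time-of-flight for laser sensors and the pinhole parallax relation for stereo cameras. A minimal sketch of both (all numeric values here are illustrative, not from the PythonRobotics codebase):

```python
C = 299_792_458.0  # speed of light [m/s]

def lidar_distance(round_trip_time_s: float) -> float:
    # Light travels to the target and back, so halve the path length.
    return C * round_trip_time_s / 2.0

def stereo_depth(focal_length_px: float, baseline_m: float,
                 disparity_px: float) -> float:
    # Classic pinhole stereo relation: Z = f * B / d.
    return focal_length_px * baseline_m / disparity_px

print(lidar_distance(66.7e-9))         # ~10 m for a 66.7 ns round trip
print(stereo_depth(700.0, 0.12, 8.4))  # ~10 m for this camera geometry
```

Differentiating Z = f·B/d with respect to disparity gives dZ/dd = -f·B/d², which is why stereo accuracy degrades with the square of the distance, as noted above.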
+ +References +---------- + +TBD From e82a12319b5336beaff01ab9fd2ce7818bb08dfb Mon Sep 17 00:00:00 2001 From: Atsushi Sakai Date: Sun, 16 Feb 2025 21:41:22 +0900 Subject: [PATCH 19/35] add internal sensors documentation to appendix and create new internal sensors overview (#1161) --- docs/modules/12_appendix/appendix_main.rst | 1 + .../12_appendix/external_sensors_main.rst | 4 +++ .../12_appendix/internal_sensors_main.rst | 32 +++++++++++++++++++ 3 files changed, 37 insertions(+) create mode 100644 docs/modules/12_appendix/internal_sensors_main.rst diff --git a/docs/modules/12_appendix/appendix_main.rst b/docs/modules/12_appendix/appendix_main.rst index 89a7fa9303..a55389e1e6 100644 --- a/docs/modules/12_appendix/appendix_main.rst +++ b/docs/modules/12_appendix/appendix_main.rst @@ -10,5 +10,6 @@ Appendix steering_motion_model Kalmanfilter_basics Kalmanfilter_basics_2 + internal_sensors external_sensors diff --git a/docs/modules/12_appendix/external_sensors_main.rst b/docs/modules/12_appendix/external_sensors_main.rst index a1dba3b214..d36b852d42 100644 --- a/docs/modules/12_appendix/external_sensors_main.rst +++ b/docs/modules/12_appendix/external_sensors_main.rst @@ -1,6 +1,10 @@ External Sensors for Robots ============================ +This project, `PythonRobotics`, focuses on algorithms, so hardware is not included. +However, having basic knowledge of hardware in robotics is also important for understanding algorithms. +Therefore, we will provide an overview. + Introduction ------------ diff --git a/docs/modules/12_appendix/internal_sensors_main.rst b/docs/modules/12_appendix/internal_sensors_main.rst new file mode 100644 index 0000000000..13b8de4203 --- /dev/null +++ b/docs/modules/12_appendix/internal_sensors_main.rst @@ -0,0 +1,32 @@ +Internal Sensors for Robots +============================ + +This project, `PythonRobotics`, focuses on algorithms, so hardware is not included. 
+However, having basic knowledge of hardware in robotics is also important for understanding algorithms. +Therefore, we will provide an overview. + +Introduction +------------ + +Global Navigation Satellite System (GNSS) +------------------------------------------- + +Gyroscope +---------- + +Accelerometer +-------------- + +Magnetometer +-------------- + +Inertial Measurement Unit (IMU) +-------------------------------- + +Pressure Sensor +----------------- + +Temperature Sensor +-------------------- + + From c92aaf36d8665a3b9ecd1a661f151cfd6ede4c66 Mon Sep 17 00:00:00 2001 From: Aglargil <34728006+Aglargil@users.noreply.github.com> Date: Mon, 17 Feb 2025 18:47:04 +0800 Subject: [PATCH 20/35] feat: add ElasticBands (#1156) * feat: add ElasticBands * feat: Elastic Bands update * feat: ElasticBands update * feat: ElasticBands add test * feat: ElasticBands reduce occupation * fix: ElasticBands test * feat: ElasticBands remove tangential component * feat: Elastic Bands update * feat: Elastic Bands doc * feat: Elastic Bands update * feat: ElasticBands update --- Mapping/DistanceMap/distance_map.py | 51 +++ PathPlanning/ElasticBands/elastic_bands.py | 300 ++++++++++++++++++ PathPlanning/ElasticBands/obstacles.npy | Bin 0 -> 384 bytes PathPlanning/ElasticBands/path.npy | Bin 0 -> 224 bytes .../elastic_bands/elastic_bands_main.rst | 73 +++++ .../5_path_planning/path_planning_main.rst | 1 + tests/test_elastic_bands.py | 23 ++ 7 files changed, 448 insertions(+) create mode 100644 PathPlanning/ElasticBands/elastic_bands.py create mode 100644 PathPlanning/ElasticBands/obstacles.npy create mode 100644 PathPlanning/ElasticBands/path.npy create mode 100644 docs/modules/5_path_planning/elastic_bands/elastic_bands_main.rst create mode 100644 tests/test_elastic_bands.py diff --git a/Mapping/DistanceMap/distance_map.py b/Mapping/DistanceMap/distance_map.py index 54c98c6a75..0dcc7380c5 100644 --- a/Mapping/DistanceMap/distance_map.py +++ b/Mapping/DistanceMap/distance_map.py @@ 
-11,11 +11,62 @@
 import numpy as np
 import matplotlib.pyplot as plt
+import scipy
 
 INF = 1e20
 ENABLE_PLOT = True
 
+
+def compute_sdf_scipy(obstacles):
+    """
+    Compute the signed distance field (SDF) from a boolean field using scipy.
+    This function has the same functionality as compute_sdf.
+    However, by using scipy.ndimage.distance_transform_edt, it can compute much faster.
+
+    Example: 500×500 map
+      • compute_sdf: 3 sec
+      • compute_sdf_scipy: 0.05 sec
+
+    Parameters
+    ----------
+    obstacles : array_like
+        A 2D boolean array where '1' represents obstacles and '0' represents free space.
+
+    Returns
+    -------
+    array_like
+        A 2D array representing the signed distance field, where positive values indicate distance
+        to the nearest obstacle, and negative values indicate distance to the nearest free space.
+    """
+    # distance_transform_edt uses '0' as obstacles, so we need to convert the obstacles to '0'
+    a = scipy.ndimage.distance_transform_edt(obstacles == 0)
+    b = scipy.ndimage.distance_transform_edt(obstacles == 1)
+    return a - b
+
+
+def compute_udf_scipy(obstacles):
+    """
+    Compute the unsigned distance field (UDF) from a boolean field using scipy.
+    This function has the same functionality as compute_udf.
+    However, by using scipy.ndimage.distance_transform_edt, it can compute much faster.
+
+    Example: 500×500 map
+      • compute_udf: 1.5 sec
+      • compute_udf_scipy: 0.02 sec
+
+    Parameters
+    ----------
+    obstacles : array_like
+        A 2D boolean array where '1' represents obstacles and '0' represents free space.
+
+    Returns
+    -------
+    array_like
+        A 2D array of distances from the nearest obstacle, with the same dimensions as `obstacles`.
+    """
+    return scipy.ndimage.distance_transform_edt(obstacles == 0)
+
+
 def compute_sdf(obstacles):
     """
     Compute the signed distance field (SDF) from a boolean field.
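The `compute_sdf_scipy` helper in the hunk above reduces to two Euclidean distance transforms subtracted from each other: distance to the nearest obstacle minus distance to the nearest free cell. The same idea can be sketched in pure Python on a tiny grid as a sanity check — an illustration only, not part of the patch, and `compute_sdf_bruteforce` is a hypothetical name:

```python
import math


def compute_sdf_bruteforce(obstacles):
    # Signed distance field: positive in free cells (distance to the nearest
    # obstacle cell), negative in obstacle cells (distance to the nearest free
    # cell). O((H*W)^2) brute force, so only sensible for tiny grids; assumes
    # the grid contains at least one free cell and one obstacle cell.
    h, w = len(obstacles), len(obstacles[0])
    cells = [(i, j) for i in range(h) for j in range(w)]
    obs = [c for c in cells if obstacles[c[0]][c[1]] == 1]
    free = [c for c in cells if obstacles[c[0]][c[1]] == 0]

    def min_dist(cell, targets):
        return min(math.hypot(cell[0] - t[0], cell[1] - t[1]) for t in targets)

    sdf = [[0.0] * w for _ in range(h)]
    for i, j in cells:
        if obstacles[i][j] == 0:
            sdf[i][j] = min_dist((i, j), obs)   # distance to nearest obstacle
        else:
            sdf[i][j] = -min_dist((i, j), free)  # negative inside obstacles
    return sdf


grid = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 0],
]
sdf = compute_sdf_bruteforce(grid)
print(sdf[0][0])  # √2 ≈ 1.414: diagonal distance to the obstacle at (1, 1)
print(sdf[1][1])  # -1.0: obstacle cell, one step from the free cell at (0, 1)
```

With SciPy the same field is obtained as `distance_transform_edt(grid == 0) - distance_transform_edt(grid == 1)`; the subtraction is what makes the field signed.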
diff --git a/PathPlanning/ElasticBands/elastic_bands.py b/PathPlanning/ElasticBands/elastic_bands.py new file mode 100644 index 0000000000..785f822d14 --- /dev/null +++ b/PathPlanning/ElasticBands/elastic_bands.py @@ -0,0 +1,300 @@ +""" +Elastic Bands + +author: Wang Zheng (@Aglargil) + +Ref: + +- [Elastic Bands: Connecting Path Planning and Control] +(http://www8.cs.umu.se/research/ifor/dl/Control/elastic%20bands.pdf) +""" + +import numpy as np +import sys +import pathlib +import matplotlib.pyplot as plt +from matplotlib.patches import Circle + +sys.path.append(str(pathlib.Path(__file__).parent.parent.parent)) + +from Mapping.DistanceMap.distance_map import compute_sdf_scipy + +# Elastic Bands Params +MAX_BUBBLE_RADIUS = 100 +MIN_BUBBLE_RADIUS = 10 +RHO0 = 20.0 # Maximum distance for applying repulsive force +KC = 0.05 # Contraction force gain +KR = -0.1 # Repulsive force gain +LAMBDA = 0.7 # Overlap constraint factor +STEP_SIZE = 3.0 # Step size for calculating gradient + +# Visualization Params +ENABLE_PLOT = True +# ENABLE_INTERACTIVE is True allows user to add obstacles by left clicking +# and add path points by right clicking and start planning by middle clicking +ENABLE_INTERACTIVE = False +# ENABLE_SAVE_DATA is True allows saving the path and obstacles which added +# by user in interactive mode to file +ENABLE_SAVE_DATA = False +MAX_ITER = 50 + + +class Bubble: + def __init__(self, position, radius): + self.pos = np.array(position) # Bubble center coordinates [x, y] + self.radius = radius # Safety distance radius ρ(b) + if self.radius > MAX_BUBBLE_RADIUS: + self.radius = MAX_BUBBLE_RADIUS + if self.radius < MIN_BUBBLE_RADIUS: + self.radius = MIN_BUBBLE_RADIUS + + +class ElasticBands: + def __init__( + self, + initial_path, + obstacles, + rho0=RHO0, + kc=KC, + kr=KR, + lambda_=LAMBDA, + step_size=STEP_SIZE, + ): + self.distance_map = compute_sdf_scipy(obstacles) + self.bubbles = [ + Bubble(p, self.compute_rho(p)) for p in initial_path + ] # Initialize bubble 
chain + self.kc = kc # Contraction force gain + self.kr = kr # Repulsive force gain + self.rho0 = rho0 # Maximum distance for applying repulsive force + self.lambda_ = lambda_ # Overlap constraint factor + self.step_size = step_size # Step size for calculating gradient + self._maintain_overlap() + + def compute_rho(self, position): + """Compute the distance field value at the position""" + return self.distance_map[int(position[0]), int(position[1])] + + def contraction_force(self, i): + """Calculate internal contraction force for the i-th bubble""" + if i == 0 or i == len(self.bubbles) - 1: + return np.zeros(2) + + prev = self.bubbles[i - 1].pos + next_ = self.bubbles[i + 1].pos + current = self.bubbles[i].pos + + # f_c = kc * ( (prev-current)/|prev-current| + (next-current)/|next-current| ) + dir_prev = (prev - current) / (np.linalg.norm(prev - current) + 1e-6) + dir_next = (next_ - current) / (np.linalg.norm(next_ - current) + 1e-6) + return self.kc * (dir_prev + dir_next) + + def repulsive_force(self, i): + """Calculate external repulsive force for the i-th bubble""" + h = self.step_size # Step size + b = self.bubbles[i].pos + rho = self.bubbles[i].radius + + if rho >= self.rho0: + return np.zeros(2) + + # Finite difference approximation of the gradient ∂ρ/∂b + dx = np.array([h, 0]) + dy = np.array([0, h]) + grad_x = (self.compute_rho(b - dx) - self.compute_rho(b + dx)) / (2 * h) + grad_y = (self.compute_rho(b - dy) - self.compute_rho(b + dy)) / (2 * h) + grad = np.array([grad_x, grad_y]) + + return self.kr * (self.rho0 - rho) * grad + + def update_bubbles(self): + """Update bubble positions""" + new_bubbles = [] + for i in range(len(self.bubbles)): + if i == 0 or i == len(self.bubbles) - 1: + new_bubbles.append(self.bubbles[i]) # Fixed start and end points + continue + + f_total = self.contraction_force(i) + self.repulsive_force(i) + v = self.bubbles[i - 1].pos - self.bubbles[i + 1].pos + + # Remove tangential component + f_star = f_total - f_total * v * v / 
(np.linalg.norm(v) ** 2 + 1e-6) + + alpha = self.bubbles[i].radius # Adaptive step size + new_pos = self.bubbles[i].pos + alpha * f_star + new_pos = np.clip(new_pos, 0, 499) + new_radius = self.compute_rho(new_pos) + + # Update bubble and maintain overlap constraint + new_bubble = Bubble(new_pos, new_radius) + new_bubbles.append(new_bubble) + + self.bubbles = new_bubbles + self._maintain_overlap() + + def _maintain_overlap(self): + """Maintain bubble chain continuity (simplified insertion/deletion mechanism)""" + # Insert bubbles + i = 0 + while i < len(self.bubbles) - 1: + bi, bj = self.bubbles[i], self.bubbles[i + 1] + dist = np.linalg.norm(bi.pos - bj.pos) + if dist > self.lambda_ * (bi.radius + bj.radius): + new_pos = (bi.pos + bj.pos) / 2 + rho = self.compute_rho( + new_pos + ) # Calculate new radius using environment model + self.bubbles.insert(i + 1, Bubble(new_pos, rho)) + i += 2 # Skip the processed region + else: + i += 1 + + # Delete redundant bubbles + i = 1 + while i < len(self.bubbles) - 1: + prev = self.bubbles[i - 1] + next_ = self.bubbles[i + 1] + dist = np.linalg.norm(prev.pos - next_.pos) + if dist <= self.lambda_ * (prev.radius + next_.radius): + del self.bubbles[i] # Delete if redundant + else: + i += 1 + + +class ElasticBandsVisualizer: + def __init__(self): + self.obstacles = np.zeros((500, 500)) + self.obstacles_points = [] + self.path_points = [] + self.elastic_band = None + self.running = True + + if ENABLE_PLOT: + self.fig, self.ax = plt.subplots(figsize=(8, 8)) + self.fig.canvas.mpl_connect("close_event", self.on_close) + self.ax.set_xlim(0, 500) + self.ax.set_ylim(0, 500) + + if ENABLE_INTERACTIVE: + self.path_points = [] # Add a list to store path points + # Connect mouse events + self.fig.canvas.mpl_connect("button_press_event", self.on_click) + else: + self.path_points = np.load(pathlib.Path(__file__).parent / "path.npy") + self.obstacles_points = np.load( + pathlib.Path(__file__).parent / "obstacles.npy" + ) + for x, y in 
self.obstacles_points: + self.add_obstacle(x, y) + self.plan_path() + + self.plot_background() + + def on_close(self, event): + """Handle window close event""" + self.running = False + plt.close("all") # Close all figure windows + + def plot_background(self): + """Plot the background grid""" + if not ENABLE_PLOT or not self.running: + return + + self.ax.cla() + self.ax.set_xlim(0, 500) + self.ax.set_ylim(0, 500) + self.ax.grid(True) + + if ENABLE_INTERACTIVE: + self.ax.set_title( + "Elastic Bands Path Planning\n" + "Left click: Add obstacles\n" + "Right click: Add path points\n" + "Middle click: Start planning", + pad=20, + ) + else: + self.ax.set_title("Elastic Bands Path Planning", pad=20) + + if self.path_points: + self.ax.plot( + [p[0] for p in self.path_points], + [p[1] for p in self.path_points], + "yo", + markersize=8, + ) + + self.ax.imshow(self.obstacles.T, origin="lower", cmap="binary", alpha=0.8) + self.ax.plot([], [], color="black", label="obstacles") + if self.elastic_band is not None: + path = [b.pos.tolist() for b in self.elastic_band.bubbles] + path = np.array(path) + self.ax.plot(path[:, 0], path[:, 1], "b-", linewidth=2, label="path") + + for bubble in self.elastic_band.bubbles: + circle = Circle( + bubble.pos, bubble.radius, fill=False, color="g", alpha=0.3 + ) + self.ax.add_patch(circle) + self.ax.plot(bubble.pos[0], bubble.pos[1], "bo", markersize=10) + self.ax.plot([], [], color="green", label="bubbles") + + self.ax.legend(loc="upper right") + plt.draw() + plt.pause(0.01) + + def add_obstacle(self, x, y): + """Add an obstacle at the given coordinates""" + size = 30 # Side length of the square + half_size = size // 2 + x_start = max(0, x - half_size) + x_end = min(self.obstacles.shape[0], x + half_size) + y_start = max(0, y - half_size) + y_end = min(self.obstacles.shape[1], y + half_size) + self.obstacles[x_start:x_end, y_start:y_end] = 1 + + def on_click(self, event): + """Handle mouse click events""" + if event.inaxes != self.ax: + return + 
+ x, y = int(event.xdata), int(event.ydata) + + if event.button == 1: # Left click to add obstacles + self.add_obstacle(x, y) + self.obstacles_points.append([x, y]) + + elif event.button == 3: # Right click to add path points + self.path_points.append([x, y]) + + elif event.button == 2: # Middle click to end path input and start planning + if len(self.path_points) >= 2: + if ENABLE_SAVE_DATA: + np.save( + pathlib.Path(__file__).parent / "path.npy", self.path_points + ) + np.save( + pathlib.Path(__file__).parent / "obstacles.npy", + self.obstacles_points, + ) + self.plan_path() + + self.plot_background() + + def plan_path(self): + """Plan the path""" + + initial_path = self.path_points + # Create an elastic band object and optimize + self.elastic_band = ElasticBands(initial_path, self.obstacles) + for _ in range(MAX_ITER): + self.elastic_band.update_bubbles() + self.path_points = [b.pos for b in self.elastic_band.bubbles] + self.plot_background() + + +if __name__ == "__main__": + _ = ElasticBandsVisualizer() + if ENABLE_PLOT: + plt.show(block=True) diff --git a/PathPlanning/ElasticBands/obstacles.npy b/PathPlanning/ElasticBands/obstacles.npy new file mode 100644 index 0000000000000000000000000000000000000000..af4376afcf0e987bbb62c4a80c7afac3d221961d GIT binary patch literal 384 zcmbR27wQ`j$;eQ~P_3SlTAW;@Zl$1ZlWC!@qoAIaUsO_*m=~X4l#&V(cT3DEP6dh= zXCxM+0{I$-W;zN+nmP)#3giN=d=|8_M@a;}<~r z%~1IiD1RoD9|z^LL+NCwxEhq72<3}I`IS(%lCm`7X literal 0 HcmV?d00001 diff --git a/docs/modules/5_path_planning/elastic_bands/elastic_bands_main.rst b/docs/modules/5_path_planning/elastic_bands/elastic_bands_main.rst new file mode 100644 index 0000000000..139996f291 --- /dev/null +++ b/docs/modules/5_path_planning/elastic_bands/elastic_bands_main.rst @@ -0,0 +1,73 @@ +Elastic Bands +------------- + +This is a path planning with Elastic Bands. + +.. 
image:: https://github.com/AtsushiSakai/PythonRoboticsGifs/raw/master/PathPlanning/ElasticBands/animation.gif
+
+
+Core Concept
+~~~~~~~~~~~~
+- **Elastic Band**: A dynamically deformable collision-free path initialized by a global planner.
+- **Objective**:
+
+  * Shorten and smooth the path.
+  * Maximize obstacle clearance.
+  * Maintain global path connectivity.
+
+Bubble Representation
+~~~~~~~~~~~~~~~~~~~
+- **Definition**: A local free-space region around configuration :math:`b`:
+
+  .. math::
+    B(b) = \{ q: \|q - b\| < \rho(b) \},
+
+  where :math:`\rho(b)` is the radius of the bubble.
+
+
+Force-Based Deformation
+~~~~~~~~~~~~~~~~~~~~~~~
+The elastic band deforms under artificial forces:
+
+Internal Contraction Force
+++++++++++++++++++++++++++
+- **Purpose**: Reduces path slack and length.
+- **Formula**: For node :math:`b_i`:
+
+  .. math::
+    f_c(b_i) = k_c \left( \frac{b_{i-1} - b_i}{\|b_{i-1} - b_i\|} + \frac{b_{i+1} - b_i}{\|b_{i+1} - b_i\|} \right)
+
+  where :math:`k_c` is the contraction gain.
+
+External Repulsion Force
++++++++++++++++++++++++++
+- **Purpose**: Pushes the path away from obstacles.
+- **Formula**: For node :math:`b_i`:
+
+  .. math::
+    f_r(b_i) = \begin{cases}
+    k_r (\rho_0 - \rho(b_i)) \nabla \rho(b_i) & \text{if } \rho(b_i) < \rho_0, \\
+    0 & \text{otherwise}.
+    \end{cases}
+
+  where :math:`k_r` is the repulsion gain, :math:`\rho_0` is the maximum distance for applying repulsion force, and :math:`\nabla \rho(b_i)` is approximated via finite differences:
+
+  .. math::
+    \frac{\partial \rho}{\partial x} \approx \frac{\rho(b_i + h) - \rho(b_i - h)}{2h}.
+
+Dynamic Path Maintenance
+~~~~~~~~~~~~~~~~~~~~~~~
+1. **Node Update**:
+
+  .. math::
+    b_i^{\text{new}} = b_i^{\text{old}} + \alpha (f_c + f_r),
+
+  where :math:`\alpha` is a step-size parameter, which is often proportional to :math:`\rho(b_i^{\text{old}})`
+
+2. 
**Overlap Enforcement**: +- Insert new nodes if adjacent nodes are too far apart +- Remove redundant nodes if adjacent nodes are too close + +Ref: + +- `Elastic Bands: Connecting Path Planning and Control `__ diff --git a/docs/modules/5_path_planning/path_planning_main.rst b/docs/modules/5_path_planning/path_planning_main.rst index 4960330b3e..65fbfdbc3d 100644 --- a/docs/modules/5_path_planning/path_planning_main.rst +++ b/docs/modules/5_path_planning/path_planning_main.rst @@ -31,3 +31,4 @@ Path planning is the ability of a robot to search feasible and efficient path to hybridastar/hybridastar frenet_frame_path/frenet_frame_path coverage_path/coverage_path + elastic_bands/elastic_bands \ No newline at end of file diff --git a/tests/test_elastic_bands.py b/tests/test_elastic_bands.py new file mode 100644 index 0000000000..ad4e13af1a --- /dev/null +++ b/tests/test_elastic_bands.py @@ -0,0 +1,23 @@ +import conftest +import numpy as np +from PathPlanning.ElasticBands.elastic_bands import ElasticBands + + +def test_1(): + path = np.load("PathPlanning/ElasticBands/path.npy") + obstacles_points = np.load("PathPlanning/ElasticBands/obstacles.npy") + obstacles = np.zeros((500, 500)) + for x, y in obstacles_points: + size = 30 # Side length of the square + half_size = size // 2 + x_start = max(0, x - half_size) + x_end = min(obstacles.shape[0], x + half_size) + y_start = max(0, y - half_size) + y_end = min(obstacles.shape[1], y + half_size) + obstacles[x_start:x_end, y_start:y_end] = 1 + elastic_bands = ElasticBands(path, obstacles) + elastic_bands.update_bubbles() + + +if __name__ == "__main__": + conftest.run_this_test(__file__) From cbe61f8ca62785a072652741d8b33832caee1942 Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Tue, 18 Feb 2025 10:45:49 +0900 Subject: [PATCH 21/35] build(deps): bump scipy from 1.15.1 to 1.15.2 in /requirements (#1163) Bumps [scipy](https://github.com/scipy/scipy) from 1.15.1 to 1.15.2. 
- [Release notes](https://github.com/scipy/scipy/releases) - [Commits](https://github.com/scipy/scipy/compare/v1.15.1...v1.15.2) --- updated-dependencies: - dependency-name: scipy dependency-type: direct:production update-type: version-update:semver-patch ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- requirements/requirements.txt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/requirements/requirements.txt b/requirements/requirements.txt index f5f674d7d2..ea4550edb8 100644 --- a/requirements/requirements.txt +++ b/requirements/requirements.txt @@ -1,5 +1,5 @@ numpy == 2.2.2 -scipy == 1.15.1 +scipy == 1.15.2 matplotlib == 3.10.0 cvxpy == 1.5.3 pytest == 8.3.4 # For unit test From 395fca59cc1f9c44e9ea7f7c08ce41958e399481 Mon Sep 17 00:00:00 2001 From: Atsushi Sakai Date: Tue, 18 Feb 2025 20:06:34 +0900 Subject: [PATCH 22/35] fix: update robotics documentation for clarity and correct terminology (#1165) --- docs/modules/12_appendix/external_sensors_main.rst | 2 ++ docs/modules/12_appendix/internal_sensors_main.rst | 2 ++ .../1_definition_of_robotics/definition_of_robotics_main.rst | 5 +++++ .../technologies_for_robotics_main.rst} | 4 +++- docs/modules/1_introduction/introduction_main.rst | 2 +- 5 files changed, 13 insertions(+), 2 deletions(-) rename docs/modules/1_introduction/{3_technology_for_robotics/technology_for_robotics_main.rst => 3_technologies_for_robotics/technologies_for_robotics_main.rst} (88%) diff --git a/docs/modules/12_appendix/external_sensors_main.rst b/docs/modules/12_appendix/external_sensors_main.rst index d36b852d42..3597418150 100644 --- a/docs/modules/12_appendix/external_sensors_main.rst +++ b/docs/modules/12_appendix/external_sensors_main.rst @@ -1,3 +1,5 @@ +.. 
_`External Sensors for Robots`:
+
 External Sensors for Robots
 ============================
 
diff --git a/docs/modules/12_appendix/internal_sensors_main.rst b/docs/modules/12_appendix/internal_sensors_main.rst
index 13b8de4203..18f209098e 100644
--- a/docs/modules/12_appendix/internal_sensors_main.rst
+++ b/docs/modules/12_appendix/internal_sensors_main.rst
@@ -1,3 +1,5 @@
+.. _`Internal Sensors for Robots`:
+
 Internal Sensors for Robots
 ============================
 
diff --git a/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst b/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst
index f54d4d41fa..1e7a833b19 100644
--- a/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst
+++ b/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst
@@ -17,6 +17,10 @@ In the play, robots were artificial workers created to serve humans, but they ev
 Over time, “robot” came to refer to machines or automated systems that can perform tasks,
 often with some level of intelligence or autonomy.
 
+Currently, 2 million robots are working in the world, and the number is increasing every year.
+In South Korea, where the adoption of robots is particularly rapid,
+50 robots are operating per 1,000 people.
+
 .. _`R.U.R.`: https://thereader.mitpress.mit.edu/origin-word-robot-rur/
 .. _`Karel Čapek`: https://en.wikipedia.org/wiki/Karel_%C4%8Capek
 
@@ -100,3 +104,4 @@ Robotics consists of several essential components:
 #. Software & Algorithms – Allow the robot to function and make intelligent decisions (e.g., ROS, Machine learning models, Localization, Mapping, Path planning, Control).
 
 This project, PythonRobotics, focuses on the software and algorithms part of robotics.
+If you are interested in `Sensors` hardware, you can check :ref:`Internal Sensors for Robotics`_ or :ref:`External Sensors for Robotics`_. 
diff --git a/docs/modules/1_introduction/3_technology_for_robotics/technology_for_robotics_main.rst b/docs/modules/1_introduction/3_technologies_for_robotics/technologies_for_robotics_main.rst similarity index 88% rename from docs/modules/1_introduction/3_technology_for_robotics/technology_for_robotics_main.rst rename to docs/modules/1_introduction/3_technologies_for_robotics/technologies_for_robotics_main.rst index e460059e20..64c6945c32 100644 --- a/docs/modules/1_introduction/3_technology_for_robotics/technology_for_robotics_main.rst +++ b/docs/modules/1_introduction/3_technologies_for_robotics/technologies_for_robotics_main.rst @@ -1,8 +1,10 @@ -Technology for Robotics +Technologies for Robotics ------------------------- The field of robotics needs wide areas of technologies such as mechanical engineering, electrical engineering, computer science, and artificial intelligence (AI). +This project, `PythonRobotics`, only focus on computer science and artificial intelligence. + Autonomous Navigation diff --git a/docs/modules/1_introduction/introduction_main.rst b/docs/modules/1_introduction/introduction_main.rst index a7ce55f9bf..1871dfc3b1 100644 --- a/docs/modules/1_introduction/introduction_main.rst +++ b/docs/modules/1_introduction/introduction_main.rst @@ -14,5 +14,5 @@ covered in PythonRobotics. 
1_definition_of_robotics/definition_of_robotics 2_python_for_robotics/python_for_robotics - 3_technology_for_robotics/technology_for_robotics + 3_technologies_for_robotics/technologies_for_robotics From d53711998aeb1d0b3aab7a62862ef5d358f2aff9 Mon Sep 17 00:00:00 2001 From: Atsushi Sakai Date: Tue, 18 Feb 2025 21:49:28 +0900 Subject: [PATCH 23/35] fix: update robotics documentation for clarity and correct terminology (#1166) --- README.md | 4 ++++ .../0_getting_started/1_what_is_python_robotics_main.rst | 6 ++++++ .../definition_of_robotics_main.rst | 2 +- .../5_path_planning/elastic_bands/elastic_bands_main.rst | 7 ++++--- 4 files changed, 15 insertions(+), 4 deletions(-) diff --git a/README.md b/README.md index 818d7b0d4e..9e605435ce 100644 --- a/README.md +++ b/README.md @@ -89,6 +89,10 @@ See this documentation - [Getting Started — PythonRobotics documentation](https://atsushisakai.github.io/PythonRobotics/getting_started.html#what-is-pythonrobotics) +or this Youtube video: + +- [PythonRobotics project audio overview](https://www.youtube.com/watch?v=uMeRnNoJAfU) + or this paper for more details: - [\[1808\.10703\] PythonRobotics: a Python code collection of robotics algorithms](https://arxiv.org/abs/1808.10703) ([BibTeX](https://github.com/AtsushiSakai/PythonRoboticsPaper/blob/master/python_robotics.bib)) diff --git a/docs/modules/0_getting_started/1_what_is_python_robotics_main.rst b/docs/modules/0_getting_started/1_what_is_python_robotics_main.rst index 8c932b7263..2a7bd574f0 100644 --- a/docs/modules/0_getting_started/1_what_is_python_robotics_main.rst +++ b/docs/modules/0_getting_started/1_what_is_python_robotics_main.rst @@ -110,6 +110,12 @@ the following additional libraries are required. For instructions on installing the above libraries, please refer to this section ":ref:`How to run sample codes`". +Audio overview of this project +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +For an audio overview of this project, please refer to this `YouTube video`_. + +.. 
_`YouTube video`: https://www.youtube.com/watch?v=uMeRnNoJAfU + Arxiv paper ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ diff --git a/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst b/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst index 1e7a833b19..5d78bf858b 100644 --- a/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst +++ b/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst @@ -103,5 +103,5 @@ Robotics consists of several essential components: #. Power Supply – Provides energy to run the robot (e.g., Batteries, Solar power). #. Software & Algorithms – Allow the robot to function and make intelligent decisions (e.g., ROS, Machine learning models, Localization, Mapping, Path planning, Control). -This project, PythonRobotics, focuses on the software and algorithms part of robotics. +This project, `PythonRobotics`, focuses on the software and algorithms part of robotics. If you are interested in `Sensors` hardware, you can check :ref:`Internal Sensors for Robotics`_ or :ref:`External Sensors for Robotics`_. diff --git a/docs/modules/5_path_planning/elastic_bands/elastic_bands_main.rst b/docs/modules/5_path_planning/elastic_bands/elastic_bands_main.rst index 139996f291..8a3e517105 100644 --- a/docs/modules/5_path_planning/elastic_bands/elastic_bands_main.rst +++ b/docs/modules/5_path_planning/elastic_bands/elastic_bands_main.rst @@ -16,7 +16,7 @@ Core Concept * Maintain global path connectivity. Bubble Representation -~~~~~~~~~~~~~~~~~~~ +~~~~~~~~~~~~~~~~~~~~~~~~ - **Definition**: A local free-space region around configuration :math:`b`: .. math:: @@ -56,7 +56,7 @@ External Repulsion Force \frac{\partial \rho}{\partial x} \approx \frac{\rho(b_i + h) - \rho(b_i - h)}{2h}. Dynamic Path Maintenance -~~~~~~~~~~~~~~~~~~~~~~~ +~~~~~~~~~~~~~~~~~~~~~~~~~~ 1. **Node Update**: .. 
math:: @@ -68,6 +68,7 @@ Dynamic Path Maintenance - Insert new nodes if adjacent nodes are too far apart - Remove redundant nodes if adjacent nodes are too close -Ref: +References +~~~~~~~~~~~~~~~~~~~~~~~ - `Elastic Bands: Connecting Path Planning and Control `__ From 8064488a1d63a1579d8c9d786a376c8b583c02a8 Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Tue, 18 Feb 2025 21:49:40 +0900 Subject: [PATCH 24/35] build(deps): bump numpy from 2.2.2 to 2.2.3 in /requirements (#1164) Bumps [numpy](https://github.com/numpy/numpy) from 2.2.2 to 2.2.3. - [Release notes](https://github.com/numpy/numpy/releases) - [Changelog](https://github.com/numpy/numpy/blob/main/doc/RELEASE_WALKTHROUGH.rst) - [Commits](https://github.com/numpy/numpy/compare/v2.2.2...v2.2.3) --- updated-dependencies: - dependency-name: numpy dependency-type: direct:production update-type: version-update:semver-patch ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- requirements/requirements.txt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/requirements/requirements.txt b/requirements/requirements.txt index ea4550edb8..8176364c29 100644 --- a/requirements/requirements.txt +++ b/requirements/requirements.txt @@ -1,4 +1,4 @@ -numpy == 2.2.2 +numpy == 2.2.3 scipy == 1.15.2 matplotlib == 3.10.0 cvxpy == 1.5.3 From 2b7080991e5289c9524a505244fcd2a4995da06e Mon Sep 17 00:00:00 2001 From: Atsushi Sakai Date: Wed, 19 Feb 2025 13:36:07 +0900 Subject: [PATCH 25/35] Add GitHub copilot pro sponser (#1167) * fix: correct terminology in documentation and update Sphinx options * fix: correct terminology in documentation and update Sphinx options * fix: correct terminology in documentation and update Sphinx options * fix: correct terminology in documentation and update Sphinx options * fix: correct terminology in documentation and update Sphinx options --- docs/Makefile | 3 
++- docs/modules/0_getting_started/3_how_to_contribute_main.rst | 2 ++ .../1_definition_of_robotics/definition_of_robotics_main.rst | 2 +- docs/modules/4_slam/graph_slam/graphSLAM_SE2_example.rst | 2 +- 4 files changed, 6 insertions(+), 3 deletions(-) diff --git a/docs/Makefile b/docs/Makefile index 9296811e02..ae495eb36d 100644 --- a/docs/Makefile +++ b/docs/Makefile @@ -2,7 +2,8 @@ # # You can set these variables from the command line. -SPHINXOPTS = +# SPHINXOPTS with -W means turn warnings into errors to fail the build when warnings are present. +SPHINXOPTS = -W SPHINXBUILD = sphinx-build SPHINXPROJ = PythonRobotics SOURCEDIR = . diff --git a/docs/modules/0_getting_started/3_how_to_contribute_main.rst b/docs/modules/0_getting_started/3_how_to_contribute_main.rst index 874564cbb8..1e61760649 100644 --- a/docs/modules/0_getting_started/3_how_to_contribute_main.rst +++ b/docs/modules/0_getting_started/3_how_to_contribute_main.rst @@ -187,6 +187,7 @@ If you would like to support us in some other way, please contact with creating Current Major Sponsors: +#. `GitHub`_ : They are providing a GitHub Copilot Pro license for this OSS development. #. `JetBrains`_ : They are providing a free license of their IDEs for this OSS development. #. `1Password`_ : They are providing a free license of their 1Password team license for this OSS project. @@ -202,6 +203,7 @@ Current Major Sponsors: .. _`doc README`: https://github.com/AtsushiSakai/PythonRobotics/blob/master/docs/README.md .. _`test_codestyle.py`: https://github.com/AtsushiSakai/PythonRobotics/blob/master/tests/test_codestyle.py .. _`JetBrains`: https://www.jetbrains.com/ +.. _`GitHub`: https://www.github.com/ .. _`Sponsor @AtsushiSakai on GitHub Sponsors`: https://github.com/sponsors/AtsushiSakai .. _`Become a backer or sponsor on Patreon`: https://www.patreon.com/myenigma .. 
_`One-time donation via PayPal`: https://www.paypal.com/paypalme/myenigmapay/ diff --git a/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst b/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst index 5d78bf858b..ca595301a6 100644 --- a/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst +++ b/docs/modules/1_introduction/1_definition_of_robotics/definition_of_robotics_main.rst @@ -104,4 +104,4 @@ Robotics consists of several essential components: #. Software & Algorithms – Allow the robot to function and make intelligent decisions (e.g., ROS, Machine learning models, Localization, Mapping, Path planning, Control). This project, `PythonRobotics`, focuses on the software and algorithms part of robotics. -If you are interested in `Sensors` hardware, you can check :ref:`Internal Sensors for Robotics`_ or :ref:`External Sensors for Robotics`_. +If you are interested in `Sensors` hardware, you can check :ref:`Internal Sensors for Robots` or :ref:`External Sensors for Robots`. diff --git a/docs/modules/4_slam/graph_slam/graphSLAM_SE2_example.rst b/docs/modules/4_slam/graph_slam/graphSLAM_SE2_example.rst index 491320512b..15963aff79 100644 --- a/docs/modules/4_slam/graph_slam/graphSLAM_SE2_example.rst +++ b/docs/modules/4_slam/graph_slam/graphSLAM_SE2_example.rst @@ -165,7 +165,7 @@ different data sources into a single optimization problem. 6 215.8405 -0.000000 -.. figure:: graphSLAM_SE2_example_files/Graph_SLAM_optimization.gif +.. image:: https://github.com/AtsushiSakai/PythonRoboticsGifs/raw/master/SLAM/GraphBasedSLAM/Graph_SLAM_optimization.gif .. 
code:: ipython3

From c7fb228d24856d6c7a0cb28c715176bfb8b17281 Mon Sep 17 00:00:00 2001
From: Atsushi Sakai
Date: Thu, 20 Feb 2025 12:11:04 +0900
Subject: [PATCH 26/35] fix: update section references to use consistent formatting (#1169)

---
 .../technologies_for_robotics_main.rst | 48 +++++++++++++++----
 .../2_localization/localization_main.rst | 2 +-
 docs/modules/3_mapping/mapping_main.rst | 2 +-
 docs/modules/4_slam/slam_main.rst | 2 +-
 .../5_path_planning/path_planning_main.rst | 2 +-
 .../6_path_tracking/path_tracking_main.rst | 2 +-
 .../7_arm_navigation/arm_navigation_main.rst | 2 +-
 .../aerial_navigation_main.rst | 2 +-
 docs/modules/9_bipedal/bipedal_main.rst | 2 +-
 9 files changed, 48 insertions(+), 16 deletions(-)

diff --git a/docs/modules/1_introduction/3_technologies_for_robotics/technologies_for_robotics_main.rst b/docs/modules/1_introduction/3_technologies_for_robotics/technologies_for_robotics_main.rst
index 64c6945c32..c77997a138 100644
--- a/docs/modules/1_introduction/3_technologies_for_robotics/technologies_for_robotics_main.rst
+++ b/docs/modules/1_introduction/3_technologies_for_robotics/technologies_for_robotics_main.rst
@@ -5,22 +5,54 @@ The field of robotics needs wide areas of technologies such as mechanical engine
 electrical engineering, computer science, and artificial intelligence (AI).
 This project, `PythonRobotics`, only focus on computer science and artificial intelligence.
 
+The technologies for robotics are categorized into the following 3 categories:
+#. `Autonomous Navigation`_
+#. `Manipulation`_
+#. `Robot type specific technologies`_
+
+.. _`Autonomous Navigation`:
 Autonomous Navigation
 ^^^^^^^^^^^^^^^^^^^^^^^^
-
-In recent years, autonomous navigation technologies have received huge
-attention in many fields.
-Such fields include, autonomous driving[22], drone flight navigation,
-and other transportation systems. 
-
-An autonomous navigation system is a system that can move to a goal over long
+Autonomous navigation is a capability that can move to a goal over long
 periods of time without any external control by an operator.
-The system requires a wide range of technologies:
+
+To achieve autonomous navigation, the robot needs to have the following technologies:
 
 - It needs to know where it is (localization)
 - Where it is safe (mapping)
+- Where it is safe and where the robot is in the map (Simultaneous Localization and Mapping (SLAM))
 - Where and how to move (path planning)
 - How to control its motion (path following).
 
 The autonomous system would not work correctly if any of these technologies is missing.
+
+In recent years, autonomous navigation technologies have received huge
+attention in many fields.
+For example, self-driving cars, drones, and autonomous mobile robots in indoor and outdoor environments.
+
+In this project, we provide many algorithms, sample codes,
+and documentation for autonomous navigation.
+
+#. :ref:`Localization`
+#. :ref:`Mapping`
+#. :ref:`SLAM`
+#. :ref:`Path planning`
+#. :ref:`Path tracking`
+
+
+
+.. _`Manipulation`:
+
+Manipulation
+^^^^^^^^^^^^^^^^^^^^^^^^
+
+#. :ref:`Arm Navigation`
+
+.. _`Robot type specific technologies`:
+
+Robot type specific technologies
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+#. :ref:`Aerial Navigation`
+#. :ref:`Bipedal`
diff --git a/docs/modules/2_localization/localization_main.rst b/docs/modules/2_localization/localization_main.rst
index 22cbd094da..770a234b69 100644
--- a/docs/modules/2_localization/localization_main.rst
+++ b/docs/modules/2_localization/localization_main.rst
@@ -1,4 +1,4 @@
-.. _localization:
+.. _`Localization`:
 
 Localization
 ============
diff --git a/docs/modules/3_mapping/mapping_main.rst b/docs/modules/3_mapping/mapping_main.rst
index 825b08d3ec..34a3744893 100644
--- a/docs/modules/3_mapping/mapping_main.rst
+++ b/docs/modules/3_mapping/mapping_main.rst
@@ -1,4 +1,4 @@
-.. _mapping:
+.. 
_`Mapping`: Mapping ======= diff --git a/docs/modules/4_slam/slam_main.rst b/docs/modules/4_slam/slam_main.rst index dec04f253a..98211986c2 100644 --- a/docs/modules/4_slam/slam_main.rst +++ b/docs/modules/4_slam/slam_main.rst @@ -1,4 +1,4 @@ -.. _slam: +.. _`SLAM`: SLAM ==== diff --git a/docs/modules/5_path_planning/path_planning_main.rst b/docs/modules/5_path_planning/path_planning_main.rst index 65fbfdbc3d..a0f9c30a3d 100644 --- a/docs/modules/5_path_planning/path_planning_main.rst +++ b/docs/modules/5_path_planning/path_planning_main.rst @@ -1,4 +1,4 @@ -.. _path_planning: +.. _`Path Planning`: Path Planning ============= diff --git a/docs/modules/6_path_tracking/path_tracking_main.rst b/docs/modules/6_path_tracking/path_tracking_main.rst index d7a895b562..130a2340c1 100644 --- a/docs/modules/6_path_tracking/path_tracking_main.rst +++ b/docs/modules/6_path_tracking/path_tracking_main.rst @@ -1,4 +1,4 @@ -.. _path_tracking: +.. _`Path Tracking`: Path Tracking ============= diff --git a/docs/modules/7_arm_navigation/arm_navigation_main.rst b/docs/modules/7_arm_navigation/arm_navigation_main.rst index bbbc872c58..7acd3ee7d3 100644 --- a/docs/modules/7_arm_navigation/arm_navigation_main.rst +++ b/docs/modules/7_arm_navigation/arm_navigation_main.rst @@ -1,4 +1,4 @@ -.. _arm_navigation: +.. _`Arm Navigation`: Arm Navigation ============== diff --git a/docs/modules/8_aerial_navigation/aerial_navigation_main.rst b/docs/modules/8_aerial_navigation/aerial_navigation_main.rst index b2ccb071af..7f76689770 100644 --- a/docs/modules/8_aerial_navigation/aerial_navigation_main.rst +++ b/docs/modules/8_aerial_navigation/aerial_navigation_main.rst @@ -1,4 +1,4 @@ -.. _aerial_navigation: +.. 
_`Aerial Navigation`: Aerial Navigation ================= diff --git a/docs/modules/9_bipedal/bipedal_main.rst b/docs/modules/9_bipedal/bipedal_main.rst index fc5b933055..dc387dc4e8 100644 --- a/docs/modules/9_bipedal/bipedal_main.rst +++ b/docs/modules/9_bipedal/bipedal_main.rst @@ -1,4 +1,4 @@ -.. _bipedal: +.. _`Bipedal`: Bipedal ================= From f343573a7bb9b99fd8fcfc72cef7336ccf859c28 Mon Sep 17 00:00:00 2001 From: Aglargil <34728006+Aglargil@users.noreply.github.com> Date: Thu, 20 Feb 2025 18:09:30 +0800 Subject: [PATCH 27/35] Update move_to_pose for cases where alpha > pi/2 or alpha < -pi/2 (#1168) * Update move_to_pose for cases where alpha > pi/2 or alpha < -pi/2 * Update move_to_pose * Add move_to_pose test * Update move_to_pose --- Control/move_to_pose/move_to_pose.py | 90 +++++++++++++++++++--------- tests/test_move_to_pose.py | 74 ++++++++++++++++++++++- 2 files changed, 134 insertions(+), 30 deletions(-) diff --git a/Control/move_to_pose/move_to_pose.py b/Control/move_to_pose/move_to_pose.py index 279ba0625b..34736a2e21 100644 --- a/Control/move_to_pose/move_to_pose.py +++ b/Control/move_to_pose/move_to_pose.py @@ -5,6 +5,7 @@ Author: Daniel Ingram (daniel-s-ingram) Atsushi Sakai (@Atsushi_twi) Seied Muhammad Yazdian (@Muhammad-Yazdian) + Wang Zheng (@Aglargil) P. I. Corke, "Robotics, Vision & Control", Springer 2017, ISBN 978-3-319-54413-7 @@ -12,8 +13,13 @@ import matplotlib.pyplot as plt import numpy as np -from random import random +import sys +import pathlib + +sys.path.append(str(pathlib.Path(__file__).parent.parent.parent)) from utils.angle import angle_mod +from random import random + class PathFinderController: """ @@ -70,14 +76,20 @@ def calc_control_command(self, x_diff, y_diff, theta, theta_goal): # [-pi, pi] to prevent unstable behavior e.g. difference going # from 0 rad to 2*pi rad with slight turn + # Ref: The velocity v always has a constant sign which depends on the initial value of α. 
rho = np.hypot(x_diff, y_diff) - alpha = angle_mod(np.arctan2(y_diff, x_diff) - theta) - beta = angle_mod(theta_goal - theta - alpha) v = self.Kp_rho * rho - w = self.Kp_alpha * alpha - self.Kp_beta * beta + alpha = angle_mod(np.arctan2(y_diff, x_diff) - theta) + beta = angle_mod(theta_goal - theta - alpha) if alpha > np.pi / 2 or alpha < -np.pi / 2: + # recalculate alpha to make alpha in the range of [-pi/2, pi/2] + alpha = angle_mod(np.arctan2(-y_diff, -x_diff) - theta) + beta = angle_mod(theta_goal - theta - alpha) + w = self.Kp_alpha * alpha - self.Kp_beta * beta v = -v + else: + w = self.Kp_alpha * alpha - self.Kp_beta * beta return rho, v, w @@ -85,6 +97,7 @@ def calc_control_command(self, x_diff, y_diff, theta, theta_goal): # simulation parameters controller = PathFinderController(9, 15, 3) dt = 0.01 +MAX_SIM_TIME = 5 # seconds, robot will stop moving when time exceeds this value # Robot specifications MAX_LINEAR_SPEED = 15 @@ -101,18 +114,19 @@ def move_to_pose(x_start, y_start, theta_start, x_goal, y_goal, theta_goal): x_diff = x_goal - x y_diff = y_goal - y - x_traj, y_traj = [], [] + x_traj, y_traj, v_traj, w_traj = [x], [y], [0], [0] rho = np.hypot(x_diff, y_diff) - while rho > 0.001: + t = 0 + while rho > 0.001 and t < MAX_SIM_TIME: + t += dt x_traj.append(x) y_traj.append(y) x_diff = x_goal - x y_diff = y_goal - y - rho, v, w = controller.calc_control_command( - x_diff, y_diff, theta, theta_goal) + rho, v, w = controller.calc_control_command(x_diff, y_diff, theta, theta_goal) if abs(v) > MAX_LINEAR_SPEED: v = np.sign(v) * MAX_LINEAR_SPEED @@ -120,18 +134,35 @@ def move_to_pose(x_start, y_start, theta_start, x_goal, y_goal, theta_goal): if abs(w) > MAX_ANGULAR_SPEED: w = np.sign(w) * MAX_ANGULAR_SPEED + v_traj.append(v) + w_traj.append(w) + theta = theta + w * dt x = x + v * np.cos(theta) * dt y = y + v * np.sin(theta) * dt if show_animation: # pragma: no cover plt.cla() - plt.arrow(x_start, y_start, np.cos(theta_start), - np.sin(theta_start), 
color='r', width=0.1) - plt.arrow(x_goal, y_goal, np.cos(theta_goal), - np.sin(theta_goal), color='g', width=0.1) + plt.arrow( + x_start, + y_start, + np.cos(theta_start), + np.sin(theta_start), + color="r", + width=0.1, + ) + plt.arrow( + x_goal, + y_goal, + np.cos(theta_goal), + np.sin(theta_goal), + color="g", + width=0.1, + ) plot_vehicle(x, y, theta, x_traj, y_traj) + return x_traj, y_traj, v_traj, w_traj + def plot_vehicle(x, y, theta, x_traj, y_traj): # pragma: no cover # Corners of triangular vehicle when pointing to the right (0 radians) @@ -144,16 +175,16 @@ def plot_vehicle(x, y, theta, x_traj, y_traj): # pragma: no cover p2 = np.matmul(T, p2_i) p3 = np.matmul(T, p3_i) - plt.plot([p1[0], p2[0]], [p1[1], p2[1]], 'k-') - plt.plot([p2[0], p3[0]], [p2[1], p3[1]], 'k-') - plt.plot([p3[0], p1[0]], [p3[1], p1[1]], 'k-') + plt.plot([p1[0], p2[0]], [p1[1], p2[1]], "k-") + plt.plot([p2[0], p3[0]], [p2[1], p3[1]], "k-") + plt.plot([p3[0], p1[0]], [p3[1], p1[1]], "k-") - plt.plot(x_traj, y_traj, 'b--') + plt.plot(x_traj, y_traj, "b--") # for stopping simulation with the esc key. 
plt.gcf().canvas.mpl_connect( - 'key_release_event', - lambda event: [exit(0) if event.key == 'escape' else None]) + "key_release_event", lambda event: [exit(0) if event.key == "escape" else None] + ) plt.xlim(0, 20) plt.ylim(0, 20) @@ -162,15 +193,16 @@ def plot_vehicle(x, y, theta, x_traj, y_traj): # pragma: no cover def transformation_matrix(x, y, theta): - return np.array([ - [np.cos(theta), -np.sin(theta), x], - [np.sin(theta), np.cos(theta), y], - [0, 0, 1] - ]) + return np.array( + [ + [np.cos(theta), -np.sin(theta), x], + [np.sin(theta), np.cos(theta), y], + [0, 0, 1], + ] + ) def main(): - for i in range(5): x_start = 20.0 * random() y_start = 20.0 * random() @@ -178,10 +210,14 @@ def main(): x_goal = 20 * random() y_goal = 20 * random() theta_goal = 2 * np.pi * random() - np.pi - print(f"Initial x: {round(x_start, 2)} m\nInitial y: {round(y_start, 2)} m\nInitial theta: {round(theta_start, 2)} rad\n") - print(f"Goal x: {round(x_goal, 2)} m\nGoal y: {round(y_goal, 2)} m\nGoal theta: {round(theta_goal, 2)} rad\n") + print( + f"Initial x: {round(x_start, 2)} m\nInitial y: {round(y_start, 2)} m\nInitial theta: {round(theta_start, 2)} rad\n" + ) + print( + f"Goal x: {round(x_goal, 2)} m\nGoal y: {round(y_goal, 2)} m\nGoal theta: {round(theta_goal, 2)} rad\n" + ) move_to_pose(x_start, y_start, theta_start, x_goal, y_goal, theta_goal) -if __name__ == '__main__': +if __name__ == "__main__": main() diff --git a/tests/test_move_to_pose.py b/tests/test_move_to_pose.py index 8bc11a8d24..94c3ec1102 100644 --- a/tests/test_move_to_pose.py +++ b/tests/test_move_to_pose.py @@ -1,13 +1,81 @@ +import itertools +import numpy as np import conftest # Add root path to sys.path from Control.move_to_pose import move_to_pose as m -def test_1(): +def test_random(): m.show_animation = False m.main() -def test_2(): +def test_stability(): + """ + This unit test tests the move_to_pose.py program for stability + """ + m.show_animation = False + x_start = 5 + y_start = 5 + theta_start = 
0 + x_goal = 1 + y_goal = 4 + theta_goal = 0 + _, _, v_traj, w_traj = m.move_to_pose( + x_start, y_start, theta_start, x_goal, y_goal, theta_goal + ) + + def v_is_change(current, previous): + return abs(current - previous) > m.MAX_LINEAR_SPEED + + def w_is_change(current, previous): + return abs(current - previous) > m.MAX_ANGULAR_SPEED + + # Check if the speed is changing too much + window_size = 10 + count_threshold = 4 + v_change = [v_is_change(v_traj[i], v_traj[i - 1]) for i in range(1, len(v_traj))] + w_change = [w_is_change(w_traj[i], w_traj[i - 1]) for i in range(1, len(w_traj))] + for i in range(len(v_change) - window_size + 1): + v_window = v_change[i : i + window_size] + w_window = w_change[i : i + window_size] + + v_unstable = sum(v_window) > count_threshold + w_unstable = sum(w_window) > count_threshold + + assert not v_unstable, ( + f"v_unstable in window [{i}, {i + window_size}], unstable count: {sum(v_window)}" + ) + assert not w_unstable, ( + f"w_unstable in window [{i}, {i + window_size}], unstable count: {sum(w_window)}" + ) + + +def test_reach_goal(): + """ + This unit test tests the move_to_pose.py program for reaching the goal + """ + m.show_animation = False + x_start = 5 + y_start = 5 + theta_start_list = [0, np.pi / 2, np.pi, 3 * np.pi / 2] + x_goal_list = [0, 5, 10] + y_goal_list = [0, 5, 10] + theta_goal = 0 + for theta_start, x_goal, y_goal in itertools.product( + theta_start_list, x_goal_list, y_goal_list + ): + x_traj, y_traj, _, _ = m.move_to_pose( + x_start, y_start, theta_start, x_goal, y_goal, theta_goal + ) + x_diff = x_goal - x_traj[-1] + y_diff = y_goal - y_traj[-1] + rho = np.hypot(x_diff, y_diff) + assert rho < 0.001, ( + f"start:[{x_start}, {y_start}, {theta_start}], goal:[{x_goal}, {y_goal}, {theta_goal}], rho: {rho} is too large" + ) + + +def test_max_speed(): """ This unit test tests the move_to_pose.py program for a MAX_LINEAR_SPEED and MAX_ANGULAR_SPEED @@ -18,5 +86,5 @@ def test_2(): m.main() -if __name__ == '__main__': 
+if __name__ == "__main__": conftest.run_this_test(__file__) From 64779298ffa77c918c63f9206b694f0d8d439c71 Mon Sep 17 00:00:00 2001 From: Atsushi Sakai Date: Fri, 21 Feb 2025 21:40:21 +0900 Subject: [PATCH 28/35] =?UTF-8?q?refactor:=20rename=20files=20and=20update?= =?UTF-8?q?=20references=20for=20inverted=20pendulum=20an=E2=80=A6=20(#117?= =?UTF-8?q?1)?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit * refactor: rename files and update references for inverted pendulum and path tracking modules * refactor: rename inverted pendulum control files and update type check references * refactor: update import statements to use consistent casing for InvertedPendulum module --- Control/move_to_pose/__init__.py | 0 .../inverted_pendulum_lqr_control.py | 0 .../inverted_pendulum_mpc_control.py | 0 {Control => PathTracking/move_to_pose}/__init__.py | 0 .../move_to_pose/move_to_pose.py | 0 .../move_to_pose/move_to_pose_robot.py | 0 docs/index_main.rst | 2 +- docs/modules/10_control/control_main.rst | 12 ------------ .../inverted-pendulum.png | Bin .../inverted_pendulum_main.rst} | 6 ++++-- docs/modules/11_utils/utils_main.rst | 2 +- docs/modules/12_appendix/appendix_main.rst | 2 +- .../technologies_for_robotics_main.rst | 10 ++++++++++ .../move_to_a_pose_control_main.rst | 0 docs/modules/6_path_tracking/path_tracking_main.rst | 1 + tests/test_inverted_pendulum_lqr_control.py | 2 +- tests/test_inverted_pendulum_mpc_control.py | 2 +- tests/test_move_to_pose.py | 2 +- tests/test_move_to_pose_robot.py | 2 +- tests/test_mypy_type_check.py | 2 +- 20 files changed, 23 insertions(+), 22 deletions(-) delete mode 100644 Control/move_to_pose/__init__.py rename {Control/inverted_pendulum => InvertedPendulum}/inverted_pendulum_lqr_control.py (100%) rename {Control/inverted_pendulum => InvertedPendulum}/inverted_pendulum_mpc_control.py (100%) rename {Control => PathTracking/move_to_pose}/__init__.py (100%) rename {Control => 
PathTracking}/move_to_pose/move_to_pose.py (100%) rename {Control => PathTracking}/move_to_pose/move_to_pose_robot.py (100%) delete mode 100644 docs/modules/10_control/control_main.rst rename docs/modules/{10_control/inverted_pendulum_control => 10_inverted_pendulum}/inverted-pendulum.png (100%) rename docs/modules/{10_control/inverted_pendulum_control/inverted_pendulum_control_main.rst => 10_inverted_pendulum/inverted_pendulum_main.rst} (97%) rename docs/modules/{10_control => 6_path_tracking}/move_to_a_pose_control/move_to_a_pose_control_main.rst (100%) diff --git a/Control/move_to_pose/__init__.py b/Control/move_to_pose/__init__.py deleted file mode 100644 index e69de29bb2..0000000000 diff --git a/Control/inverted_pendulum/inverted_pendulum_lqr_control.py b/InvertedPendulum/inverted_pendulum_lqr_control.py similarity index 100% rename from Control/inverted_pendulum/inverted_pendulum_lqr_control.py rename to InvertedPendulum/inverted_pendulum_lqr_control.py diff --git a/Control/inverted_pendulum/inverted_pendulum_mpc_control.py b/InvertedPendulum/inverted_pendulum_mpc_control.py similarity index 100% rename from Control/inverted_pendulum/inverted_pendulum_mpc_control.py rename to InvertedPendulum/inverted_pendulum_mpc_control.py diff --git a/Control/__init__.py b/PathTracking/move_to_pose/__init__.py similarity index 100% rename from Control/__init__.py rename to PathTracking/move_to_pose/__init__.py diff --git a/Control/move_to_pose/move_to_pose.py b/PathTracking/move_to_pose/move_to_pose.py similarity index 100% rename from Control/move_to_pose/move_to_pose.py rename to PathTracking/move_to_pose/move_to_pose.py diff --git a/Control/move_to_pose/move_to_pose_robot.py b/PathTracking/move_to_pose/move_to_pose_robot.py similarity index 100% rename from Control/move_to_pose/move_to_pose_robot.py rename to PathTracking/move_to_pose/move_to_pose_robot.py diff --git a/docs/index_main.rst b/docs/index_main.rst index c1e8d22d32..75805d1184 100644 --- 
a/docs/index_main.rst +++ b/docs/index_main.rst @@ -43,7 +43,7 @@ this graph shows GitHub star history of this project: modules/7_arm_navigation/arm_navigation modules/8_aerial_navigation/aerial_navigation modules/9_bipedal/bipedal - modules/10_control/control + modules/10_inverted_pendulum/inverted_pendulum modules/11_utils/utils modules/12_appendix/appendix diff --git a/docs/modules/10_control/control_main.rst b/docs/modules/10_control/control_main.rst deleted file mode 100644 index cee2aa9e8e..0000000000 --- a/docs/modules/10_control/control_main.rst +++ /dev/null @@ -1,12 +0,0 @@ -.. _control: - -Control -================= - -.. toctree:: - :maxdepth: 2 - :caption: Contents - - inverted_pendulum_control/inverted_pendulum_control - move_to_a_pose_control/move_to_a_pose_control - diff --git a/docs/modules/10_control/inverted_pendulum_control/inverted-pendulum.png b/docs/modules/10_inverted_pendulum/inverted-pendulum.png similarity index 100% rename from docs/modules/10_control/inverted_pendulum_control/inverted-pendulum.png rename to docs/modules/10_inverted_pendulum/inverted-pendulum.png diff --git a/docs/modules/10_control/inverted_pendulum_control/inverted_pendulum_control_main.rst b/docs/modules/10_inverted_pendulum/inverted_pendulum_main.rst similarity index 97% rename from docs/modules/10_control/inverted_pendulum_control/inverted_pendulum_control_main.rst rename to docs/modules/10_inverted_pendulum/inverted_pendulum_main.rst index e41729fd61..048cbea9ac 100644 --- a/docs/modules/10_control/inverted_pendulum_control/inverted_pendulum_control_main.rst +++ b/docs/modules/10_inverted_pendulum/inverted_pendulum_main.rst @@ -1,5 +1,7 @@ -Inverted Pendulum Control ------------------------------ +.. _`Inverted Pendulum`: + +Inverted Pendulum +------------------ An inverted pendulum on a cart consists of a mass :math:`m` at the top of a pole of length :math:`l` pivoted on a horizontally moving base as shown in the adjacent figure.
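The cart-pole system described in the hunk above (mass :math:`m` on a pole of length :math:`l` over a horizontally moving base) is typically linearized about the upright equilibrium before a controller such as LQR is applied. The following is a minimal sketch of that linearization; the function name, default parameter values, and the frictionless-cart/massless-rod assumptions are illustrative, not taken from the repository's `inverted_pendulum_lqr_control.py`.

```python
import numpy as np

def cart_pole_linear_model(M=1.0, m=0.3, l=2.0, g=9.8):
    """Continuous-time linearization x_dot = A x + B u of a cart-pole
    about the upright equilibrium (theta = 0).

    State: [x, x_dot, theta, theta_dot]; input u: horizontal force on the cart.
    Assumes a frictionless cart of mass M and a point mass m on a massless rod.
    """
    A = np.array([
        [0.0, 1.0, 0.0,                   0.0],
        [0.0, 0.0, m * g / M,             0.0],
        [0.0, 0.0, 0.0,                   1.0],
        [0.0, 0.0, g * (M + m) / (l * M), 0.0],
    ])
    B = np.array([[0.0], [1.0 / M], [0.0], [1.0 / (l * M)]])
    return A, B

A, B = cart_pole_linear_model()
# The upright pose is open-loop unstable: A has the positive eigenvalue
# +sqrt(g * (M + m) / (l * M)), which is why feedback control is needed.
print(np.linalg.eigvals(A).real.max())  # ≈ 2.52 for the defaults above
```

Because the pair (A, B) is controllable, state feedback (LQR or MPC, as in the modules renamed by this patch) can place the unstable eigenvalue in the left half-plane.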
diff --git a/docs/modules/11_utils/utils_main.rst b/docs/modules/11_utils/utils_main.rst index ff79a26205..95c982b077 100644 --- a/docs/modules/11_utils/utils_main.rst +++ b/docs/modules/11_utils/utils_main.rst @@ -1,4 +1,4 @@ -.. _utils: +.. _`utils`: Utilities ========== diff --git a/docs/modules/12_appendix/appendix_main.rst b/docs/modules/12_appendix/appendix_main.rst index a55389e1e6..d0b9eeea3a 100644 --- a/docs/modules/12_appendix/appendix_main.rst +++ b/docs/modules/12_appendix/appendix_main.rst @@ -1,4 +1,4 @@ -.. _appendix: +.. _`Appendix`: Appendix ============== diff --git a/docs/modules/1_introduction/3_technologies_for_robotics/technologies_for_robotics_main.rst b/docs/modules/1_introduction/3_technologies_for_robotics/technologies_for_robotics_main.rst index c77997a138..0ed51e961b 100644 --- a/docs/modules/1_introduction/3_technologies_for_robotics/technologies_for_robotics_main.rst +++ b/docs/modules/1_introduction/3_technologies_for_robotics/technologies_for_robotics_main.rst @@ -56,3 +56,13 @@ Robot type specific technologies #. :ref:`Aerial Navigation` #. :ref:`Bipedal` +#. :ref:`Inverted Pendulum` + + +Additional Information +^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +#. :ref:`utils` +#. 
:ref:`Appendix` + + diff --git a/docs/modules/10_control/move_to_a_pose_control/move_to_a_pose_control_main.rst b/docs/modules/6_path_tracking/move_to_a_pose_control/move_to_a_pose_control_main.rst similarity index 100% rename from docs/modules/10_control/move_to_a_pose_control/move_to_a_pose_control_main.rst rename to docs/modules/6_path_tracking/move_to_a_pose_control/move_to_a_pose_control_main.rst diff --git a/docs/modules/6_path_tracking/path_tracking_main.rst b/docs/modules/6_path_tracking/path_tracking_main.rst index 130a2340c1..d98e324583 100644 --- a/docs/modules/6_path_tracking/path_tracking_main.rst +++ b/docs/modules/6_path_tracking/path_tracking_main.rst @@ -16,3 +16,4 @@ Path tracking is the ability of a robot to follow the reference path generated b lqr_speed_and_steering_control/lqr_speed_and_steering_control model_predictive_speed_and_steering_control/model_predictive_speed_and_steering_control cgmres_nmpc/cgmres_nmpc + move_to_a_pose_control/move_to_a_pose_control diff --git a/tests/test_inverted_pendulum_lqr_control.py b/tests/test_inverted_pendulum_lqr_control.py index cbbabb93b1..62afda71c3 100644 --- a/tests/test_inverted_pendulum_lqr_control.py +++ b/tests/test_inverted_pendulum_lqr_control.py @@ -1,5 +1,5 @@ import conftest -from Control.inverted_pendulum import inverted_pendulum_lqr_control as m +from InvertedPendulum import inverted_pendulum_lqr_control as m def test_1(): diff --git a/tests/test_inverted_pendulum_mpc_control.py b/tests/test_inverted_pendulum_mpc_control.py index 800aefd7d5..94859c2e0a 100644 --- a/tests/test_inverted_pendulum_mpc_control.py +++ b/tests/test_inverted_pendulum_mpc_control.py @@ -1,6 +1,6 @@ import conftest -from Control.inverted_pendulum import inverted_pendulum_mpc_control as m +from InvertedPendulum import inverted_pendulum_mpc_control as m def test1(): diff --git a/tests/test_move_to_pose.py b/tests/test_move_to_pose.py index 94c3ec1102..e06d801555 100644 --- a/tests/test_move_to_pose.py +++ 
b/tests/test_move_to_pose.py @@ -1,7 +1,7 @@ import itertools import numpy as np import conftest # Add root path to sys.path -from Control.move_to_pose import move_to_pose as m +from PathTracking.move_to_pose import move_to_pose as m def test_random(): diff --git a/tests/test_move_to_pose_robot.py b/tests/test_move_to_pose_robot.py index a93b44d198..7a82f98556 100644 --- a/tests/test_move_to_pose_robot.py +++ b/tests/test_move_to_pose_robot.py @@ -1,5 +1,5 @@ import conftest # Add root path to sys.path -from Control.move_to_pose import move_to_pose as m +from PathTracking.move_to_pose import move_to_pose as m def test_1(): diff --git a/tests/test_mypy_type_check.py b/tests/test_mypy_type_check.py index 07afb40afd..6b933c1011 100644 --- a/tests/test_mypy_type_check.py +++ b/tests/test_mypy_type_check.py @@ -7,12 +7,12 @@ "AerialNavigation", "ArmNavigation", "Bipedal", - "Control", "Localization", "Mapping", "PathPlanning", "PathTracking", "SLAM", + "InvertedPendulum" ] From 6e13e8292aad80661093603ed262c6f0bdcb4137 Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Tue, 25 Feb 2025 08:07:42 +0900 Subject: [PATCH 29/35] build(deps): bump ruff from 0.9.6 to 0.9.7 in /requirements (#1173) Bumps [ruff](https://github.com/astral-sh/ruff) from 0.9.6 to 0.9.7. - [Release notes](https://github.com/astral-sh/ruff/releases) - [Changelog](https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md) - [Commits](https://github.com/astral-sh/ruff/compare/0.9.6...0.9.7) --- updated-dependencies: - dependency-name: ruff dependency-type: direct:production update-type: version-update:semver-patch ... 
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- requirements/requirements.txt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/requirements/requirements.txt b/requirements/requirements.txt index 8176364c29..b439ea4266 100644 --- a/requirements/requirements.txt +++ b/requirements/requirements.txt @@ -5,4 +5,4 @@ cvxpy == 1.5.3 pytest == 8.3.4 # For unit test pytest-xdist == 3.6.1 # For unit test mypy == 1.15.0 # For unit test -ruff == 0.9.6 # For unit test +ruff == 0.9.7 # For unit test From 0c8ff11645e6804db18bec6bd918e03ed03454f7 Mon Sep 17 00:00:00 2001 From: Jonathan Schwartz Date: Tue, 25 Feb 2025 06:53:36 -0500 Subject: [PATCH 30/35] Space-Time AStar (#1170) * wip - sketch out obstacles * move to correct path * better animation * clean up * use np to sample points * implemented time-based A* * cleaning up Grid + adding new obstacle arrangement * added unit test * formatting p1 * format STA* file * remove newlines by docstrings * linter * working on typehints * fix linter errors * lint some more * appease AppVeyor * dataclasses are :fire: * back to @total_ordering * trailing whitespace * add docs page on SpaceTimeA* * docs lint * remove trailing newlines in doc * address comments * Update docs/modules/5_path_planning/time_based_grid_search/time_based_grid_search_main.rst --------- Co-authored-by: Atsushi Sakai --- .../GridWithDynamicObstacles.py | 273 ++++++++++++++++++ .../TimeBasedPathPlanning/SpaceTimeAStar.py | 220 ++++++++++++++ .../TimeBasedPathPlanning/__init__.py | 0 .../5_path_planning/path_planning_main.rst | 1 + .../time_based_grid_search_main.rst | 22 ++ tests/test_space_time_astar.py | 33 +++ 6 files changed, 549 insertions(+) create mode 100644 PathPlanning/TimeBasedPathPlanning/GridWithDynamicObstacles.py create mode 100644 PathPlanning/TimeBasedPathPlanning/SpaceTimeAStar.py create mode 100644 PathPlanning/TimeBasedPathPlanning/__init__.py create mode 100644 
docs/modules/5_path_planning/time_based_grid_search/time_based_grid_search_main.rst create mode 100644 tests/test_space_time_astar.py diff --git a/PathPlanning/TimeBasedPathPlanning/GridWithDynamicObstacles.py b/PathPlanning/TimeBasedPathPlanning/GridWithDynamicObstacles.py new file mode 100644 index 0000000000..7b0190d023 --- /dev/null +++ b/PathPlanning/TimeBasedPathPlanning/GridWithDynamicObstacles.py @@ -0,0 +1,273 @@ +""" +This file implements a grid with a 3d reservation matrix with dimensions for x, y, and time. There +is also infrastructure to generate dynamic obstacles that move around the grid. The obstacles' paths +are stored in the reservation matrix on creation. +""" +import numpy as np +import matplotlib.pyplot as plt +from enum import Enum +from dataclasses import dataclass + +@dataclass(order=True) +class Position: + x: int + y: int + + def as_ndarray(self) -> np.ndarray: + return np.array([self.x, self.y]) + + def __add__(self, other): + if isinstance(other, Position): + return Position(self.x + other.x, self.y + other.y) + raise NotImplementedError( + f"Addition not supported for Position and {type(other)}" + ) + + def __sub__(self, other): + if isinstance(other, Position): + return Position(self.x - other.x, self.y - other.y) + raise NotImplementedError( + f"Subtraction not supported for Position and {type(other)}" + ) + + +class ObstacleArrangement(Enum): + # Random obstacle positions and movements + RANDOM = 0 + # Obstacles start in a line in y at center of grid and move side-to-side in x + ARRANGEMENT1 = 1 + + +class Grid: + # Set in constructor + grid_size: np.ndarray + reservation_matrix: np.ndarray + obstacle_paths: list[list[Position]] = [] + # Obstacles will never occupy these points. 
Useful to avoid impossible scenarios + obstacle_avoid_points: list[Position] = [] + + # Number of time steps in the simulation + time_limit: int + + # Logging control + verbose = False + + def __init__( + self, + grid_size: np.ndarray, + num_obstacles: int = 40, + obstacle_avoid_points: list[Position] = [], + obstacle_arrangement: ObstacleArrangement = ObstacleArrangement.RANDOM, + time_limit: int = 100, + ): + self.obstacle_avoid_points = obstacle_avoid_points + self.time_limit = time_limit + self.grid_size = grid_size + self.reservation_matrix = np.zeros((grid_size[0], grid_size[1], self.time_limit)) + + if num_obstacles > self.grid_size[0] * self.grid_size[1]: + raise Exception("Number of obstacles is greater than grid size!") + + if obstacle_arrangement == ObstacleArrangement.RANDOM: + self.obstacle_paths = self.generate_dynamic_obstacles(num_obstacles) + elif obstacle_arrangement == ObstacleArrangement.ARRANGEMENT1: + self.obstacle_paths = self.obstacle_arrangement_1(num_obstacles) + + for i, path in enumerate(self.obstacle_paths): + obs_idx = i + 1 # avoid using 0 - that indicates free space in the grid + for t, position in enumerate(path): + # Reserve old & new position at this time step + if t > 0: + self.reservation_matrix[path[t - 1].x, path[t - 1].y, t] = obs_idx + self.reservation_matrix[position.x, position.y, t] = obs_idx + + """ + Generate dynamic obstacles that move around the grid. 
Initial positions and movements are random + """ + def generate_dynamic_obstacles(self, obs_count: int) -> list[list[Position]]: + obstacle_paths = [] + for _ in range(obs_count): + # Sample until a free starting space is found + initial_position = self.sample_random_position() + while not self.valid_obstacle_position(initial_position, 0): + initial_position = self.sample_random_position() + + positions = [initial_position] + if self.verbose: + print("Obstacle initial position: ", initial_position) + + # Encourage obstacles to mostly stay in place - too much movement leads to chaotic planning scenarios + # that are not fun to watch + weights = [0.05, 0.05, 0.05, 0.05, 0.8] + diffs = [ + Position(0, 1), + Position(0, -1), + Position(1, 0), + Position(-1, 0), + Position(0, 0), + ] + + for t in range(1, self.time_limit - 1): + sampled_indices = np.random.choice( + len(diffs), size=5, replace=False, p=weights + ) + rand_diffs = [diffs[i] for i in sampled_indices] + + valid_position = None + for diff in rand_diffs: + new_position = positions[-1] + diff + + if not self.valid_obstacle_position(new_position, t): + continue + + valid_position = new_position + break + + # Impossible situation for obstacle - stay in place + # -> this can happen if the paths of other obstacles trap this one + if valid_position is None: + valid_position = positions[-1] + + positions.append(valid_position) + + obstacle_paths.append(positions) + + return obstacle_paths + + """ + Generate a line of obstacles in y at the center of the grid that move side-to-side in x + Bottom half start moving right, top half start moving left. If `obs_count` is less than the length of + the grid, only the first `obs_count` obstacles will be generated.
+ """ + def obstacle_arrangement_1(self, obs_count: int) -> list[list[Position]]: + obstacle_paths = [] + half_grid_x = self.grid_size[0] // 2 + half_grid_y = self.grid_size[1] // 2 + + for y_idx in range(0, min(obs_count, self.grid_size[1])): + moving_right = y_idx < half_grid_y + position = Position(half_grid_x, y_idx) + path = [position] + + for t in range(1, self.time_limit - 1): + # sit in place every other time step + if t % 2 == 0: + path.append(position) + continue + + # first check if we should switch direction (at edge of grid) + if (moving_right and position.x == self.grid_size[0] - 1) or ( + not moving_right and position.x == 0 + ): + moving_right = not moving_right + # step in direction + position = Position( + position.x + (1 if moving_right else -1), position.y + ) + path.append(position) + + obstacle_paths.append(path) + + return obstacle_paths + + """ + Check if the given position is valid at time t + + input: + position (Position): (x, y) position + t (int): time step + + output: + bool: True if position/time combination is valid, False otherwise + """ + def valid_position(self, position: Position, t: int) -> bool: + # Check if new position is in grid + if not self.inside_grid_bounds(position): + return False + + # Check if new position is not occupied at time t + return self.reservation_matrix[position.x, position.y, t] == 0 + + """ + Returns True if the given position is valid at time t and is not in the set of obstacle_avoid_points + """ + def valid_obstacle_position(self, position: Position, t: int) -> bool: + return ( + self.valid_position(position, t) + and position not in self.obstacle_avoid_points + ) + + """ + Returns True if the given position is within the grid's boundaries + """ + def inside_grid_bounds(self, position: Position) -> bool: + return ( + position.x >= 0 + and position.x < self.grid_size[0] + and position.y >= 0 + and position.y < self.grid_size[1] + ) + + """ + Sample a random position that is within the grid's boundaries 
+ + output: + Position: (x, y) position + """ + def sample_random_position(self) -> Position: + return Position( + np.random.randint(0, self.grid_size[0]), + np.random.randint(0, self.grid_size[1]), + ) + + """ + Returns a tuple of (x_positions, y_positions) of the obstacles at time t + """ + def get_obstacle_positions_at_time(self, t: int) -> tuple[list[int], list[int]]: + x_positions = [] + y_positions = [] + for obs_path in self.obstacle_paths: + x_positions.append(obs_path[t].x) + y_positions.append(obs_path[t].y) + return (x_positions, y_positions) + + +show_animation = True + + +def main(): + grid = Grid( + np.array([11, 11]), + num_obstacles=10, + obstacle_arrangement=ObstacleArrangement.ARRANGEMENT1, + ) + + if not show_animation: + return + + fig = plt.figure(figsize=(8, 7)) + ax = fig.add_subplot( + autoscale_on=False, + xlim=(0, grid.grid_size[0] - 1), + ylim=(0, grid.grid_size[1] - 1), + ) + ax.set_aspect("equal") + ax.grid() + ax.set_xticks(np.arange(0, 11, 1)) + ax.set_yticks(np.arange(0, 11, 1)) + (obs_points,) = ax.plot([], [], "ro", ms=15) + + # for stopping simulation with the esc key. + plt.gcf().canvas.mpl_connect( + "key_release_event", lambda event: [exit(0) if event.key == "escape" else None] + ) + + for i in range(0, grid.time_limit - 1): + obs_positions = grid.get_obstacle_positions_at_time(i) + obs_points.set_data(obs_positions[0], obs_positions[1]) + plt.pause(0.2) + plt.show() + + +if __name__ == "__main__": + main() diff --git a/PathPlanning/TimeBasedPathPlanning/SpaceTimeAStar.py b/PathPlanning/TimeBasedPathPlanning/SpaceTimeAStar.py new file mode 100644 index 0000000000..3b3613d695 --- /dev/null +++ b/PathPlanning/TimeBasedPathPlanning/SpaceTimeAStar.py @@ -0,0 +1,220 @@ +""" +Space-time A* Algorithm + This script demonstrates the Space-time A* algorithm for path planning in a grid world with moving obstacles. 
+    This algorithm is different from normal 2D A* in one key way - the cost (often notated as g(n)) is
+    the number of time steps it took to get to a given node, instead of the number of cells it has
+    traversed. This ensures the path is time-optimal, while respecting any dynamic obstacles in the environment.
+
+    Reference: https://www.davidsilver.uk/wp-content/uploads/2020/03/coop-path-AIWisdom.pdf
+"""
+
+import numpy as np
+import matplotlib.pyplot as plt
+from PathPlanning.TimeBasedPathPlanning.GridWithDynamicObstacles import (
+    Grid,
+    ObstacleArrangement,
+    Position,
+)
+import heapq
+from collections.abc import Generator
+import random
+from dataclasses import dataclass
+from functools import total_ordering
+
+
+# Seed randomness for reproducibility
+RANDOM_SEED = 50
+random.seed(RANDOM_SEED)
+np.random.seed(RANDOM_SEED)
+
+@dataclass()
+# Note: Total_ordering is used instead of adding `order=True` to the @dataclass decorator because
+# this class needs to override the __lt__ and __eq__ methods to ignore parent_index. Parent
+# index is just used to track the path found by the algorithm, and has no effect on the quality
+# of a node.
+@total_ordering
+class Node:
+    position: Position
+    time: int
+    heuristic: int
+    parent_index: int
+
+    """
+    This is what is used to drive node expansion. The node with the lowest value is expanded next.
+    This comparison prioritizes the node with the lowest cost-to-come (self.time) + cost-to-go (self.heuristic)
+    """
+    def __lt__(self, other: object):
+        if not isinstance(other, Node):
+            # Returning NotImplemented (not raising) lets Python fall back to the
+            # other operand's comparison, per the data model protocol.
+            return NotImplemented
+        return (self.time + self.heuristic) < (other.time + other.heuristic)
+
+    def __eq__(self, other: object):
+        if not isinstance(other, Node):
+            return NotImplemented
+        return self.position == other.position and self.time == other.time
+
+
+class NodePath:
+    path: list[Node]
+    positions_at_time: dict[int, Position]
+
+    def __init__(self, path: list[Node]):
+        self.path = path
+        # Initialized per-instance; a class-level default dict would be shared
+        # across all NodePath objects.
+        self.positions_at_time = {}
+        for node in path:
+            self.positions_at_time[node.time] = node.position
+
+    """
+    Get the position of the path at a given time
+    """
+    def get_position(self, time: int) -> Position | None:
+        return self.positions_at_time.get(time)
+
+    """
+    Time stamp of the last node in the path
+    """
+    def goal_reached_time(self) -> int:
+        return self.path[-1].time
+
+    def __repr__(self):
+        repr_string = ""
+        for i, node in enumerate(self.path):
+            repr_string += f"{i}: {node}\n"
+        return repr_string
+
+
+class SpaceTimeAStar:
+    grid: Grid
+    start: Position
+    goal: Position
+
+    def __init__(self, grid: Grid, start: Position, goal: Position):
+        self.grid = grid
+        self.start = start
+        self.goal = goal
+
+    def plan(self, verbose: bool = False) -> NodePath:
+        open_set: list[Node] = []
+        heapq.heappush(
+            open_set, Node(self.start, 0, self.calculate_heuristic(self.start), -1)
+        )
+
+        expanded_set: list[Node] = []
+        while open_set:
+            expanded_node: Node = heapq.heappop(open_set)
+            if verbose:
+                print("Expanded node:", expanded_node)
+
+            if expanded_node.time + 1 >= self.grid.time_limit:
+                if verbose:
+                    print(f"\tSkipping node that is past time limit: {expanded_node}")
+                continue
+
+            if expanded_node.position == self.goal:
+                print(f"Found path to goal after {len(expanded_set)}
expansions") + path = [] + path_walker: Node = expanded_node + while True: + path.append(path_walker) + if path_walker.parent_index == -1: + break + path_walker = expanded_set[path_walker.parent_index] + + # reverse path so it goes start -> goal + path.reverse() + return NodePath(path) + + expanded_idx = len(expanded_set) + expanded_set.append(expanded_node) + + for child in self.generate_successors(expanded_node, expanded_idx, verbose): + heapq.heappush(open_set, child) + + raise Exception("No path found") + + """ + Generate possible successors of the provided `parent_node` + """ + def generate_successors( + self, parent_node: Node, parent_node_idx: int, verbose: bool + ) -> Generator[Node, None, None]: + diffs = [ + Position(0, 0), + Position(1, 0), + Position(-1, 0), + Position(0, 1), + Position(0, -1), + ] + for diff in diffs: + new_pos = parent_node.position + diff + if self.grid.valid_position(new_pos, parent_node.time + 1): + new_node = Node( + new_pos, + parent_node.time + 1, + self.calculate_heuristic(new_pos), + parent_node_idx, + ) + if verbose: + print("\tNew successor node: ", new_node) + yield new_node + + def calculate_heuristic(self, position) -> int: + diff = self.goal - position + return abs(diff.x) + abs(diff.y) + + +show_animation = True +verbose = False + +def main(): + start = Position(1, 11) + goal = Position(19, 19) + grid_side_length = 21 + grid = Grid( + np.array([grid_side_length, grid_side_length]), + num_obstacles=40, + obstacle_avoid_points=[start, goal], + obstacle_arrangement=ObstacleArrangement.ARRANGEMENT1, + ) + + planner = SpaceTimeAStar(grid, start, goal) + path = planner.plan(verbose) + + if verbose: + print(f"Path: {path}") + + if not show_animation: + return + + fig = plt.figure(figsize=(10, 7)) + ax = fig.add_subplot( + autoscale_on=False, + xlim=(0, grid.grid_size[0] - 1), + ylim=(0, grid.grid_size[1] - 1), + ) + ax.set_aspect("equal") + ax.grid() + ax.set_xticks(np.arange(0, grid_side_length, 1)) + 
ax.set_yticks(np.arange(0, grid_side_length, 1)) + + (start_and_goal,) = ax.plot([], [], "mD", ms=15, label="Start and Goal") + start_and_goal.set_data([start.x, goal.x], [start.y, goal.y]) + (obs_points,) = ax.plot([], [], "ro", ms=15, label="Obstacles") + (path_points,) = ax.plot([], [], "bo", ms=10, label="Path Found") + ax.legend(bbox_to_anchor=(1.05, 1)) + + # for stopping simulation with the esc key. + plt.gcf().canvas.mpl_connect( + "key_release_event", lambda event: [exit(0) if event.key == "escape" else None] + ) + + for i in range(0, path.goal_reached_time()): + obs_positions = grid.get_obstacle_positions_at_time(i) + obs_points.set_data(obs_positions[0], obs_positions[1]) + path_position = path.get_position(i) + path_points.set_data([path_position.x], [path_position.y]) + plt.pause(0.2) + plt.show() + + +if __name__ == "__main__": + main() diff --git a/PathPlanning/TimeBasedPathPlanning/__init__.py b/PathPlanning/TimeBasedPathPlanning/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/docs/modules/5_path_planning/path_planning_main.rst b/docs/modules/5_path_planning/path_planning_main.rst index a0f9c30a3d..0c84a19c22 100644 --- a/docs/modules/5_path_planning/path_planning_main.rst +++ b/docs/modules/5_path_planning/path_planning_main.rst @@ -12,6 +12,7 @@ Path planning is the ability of a robot to search feasible and efficient path to dynamic_window_approach/dynamic_window_approach bugplanner/bugplanner grid_base_search/grid_base_search + time_based_grid_search/time_based_grid_search model_predictive_trajectory_generator/model_predictive_trajectory_generator state_lattice_planner/state_lattice_planner prm_planner/prm_planner diff --git a/docs/modules/5_path_planning/time_based_grid_search/time_based_grid_search_main.rst b/docs/modules/5_path_planning/time_based_grid_search/time_based_grid_search_main.rst new file mode 100644 index 0000000000..0c26badec7 --- /dev/null +++ 
b/docs/modules/5_path_planning/time_based_grid_search/time_based_grid_search_main.rst
@@ -0,0 +1,22 @@
+Time based grid search
+----------------------
+
+Space-time A*
+~~~~~~~~~~~~~~~~~~~~~~
+
+This is an extension of the A* algorithm that supports planning around dynamic obstacles.
+
+.. image:: https://github.com/AtsushiSakai/PythonRoboticsGifs/raw/master/PathPlanning/TimeBasedPathPlanning/SpaceTimeAStar/path_animation.gif
+
+.. image:: https://github.com/AtsushiSakai/PythonRoboticsGifs/raw/master/PathPlanning/TimeBasedPathPlanning/SpaceTimeAStar/path_animation2.gif
+
+The key difference of this algorithm compared to vanilla A* is that the cost and heuristic are now time-based instead of distance-based.
+Using a time-based cost and heuristic ensures the path found is optimal in terms of time to reach the goal.
+
+The cost is the amount of time it takes to reach a given node, and the heuristic is the minimum amount of time it could take to reach the goal from that node, disregarding all obstacles.
+For a simple scenario where the robot can move 1 cell per time step and stop and go as it pleases, the heuristic for time is equivalent to the heuristic for distance.
+
+References:
+~~~~~~~~~~~
+
+- `Cooperative Pathfinding `__
diff --git a/tests/test_space_time_astar.py b/tests/test_space_time_astar.py
new file mode 100644
index 0000000000..5290738eb4
--- /dev/null
+++ b/tests/test_space_time_astar.py
@@ -0,0 +1,33 @@
+from PathPlanning.TimeBasedPathPlanning.GridWithDynamicObstacles import (
+    Grid,
+    ObstacleArrangement,
+    Position,
+)
+from PathPlanning.TimeBasedPathPlanning import SpaceTimeAStar as m
+import numpy as np
+import conftest
+
+
+def test_1():
+    start = Position(1, 11)
+    goal = Position(19, 19)
+    grid_side_length = 21
+    grid = Grid(
+        np.array([grid_side_length, grid_side_length]),
+        obstacle_arrangement=ObstacleArrangement.ARRANGEMENT1,
+    )
+
+    m.show_animation = False
+    planner = m.SpaceTimeAStar(grid, start, goal)
+
+    path = planner.plan(False)
+
+    # path should have 31 entries
+    assert len(path.path) == 31
+
+    # path should end at the goal
+    assert path.path[-1].position == goal
+
+
+if __name__ == "__main__":
+    conftest.run_this_test(__file__)

From 67a3ca7138384b4a013974dbf22383252f83686e Mon Sep 17 00:00:00 2001
From: Aglargil <34728006+Aglargil@users.noreply.github.com>
Date: Fri, 28 Feb 2025 19:30:24 +0800
Subject: [PATCH 31/35] add state machine (#1172)

* add state machine

state_machine test

state_machine_update

* add state machine test/doc

* state machine update

* state machine generate_plantuml() can show diagram by using https://www.plantuml.com/plantuml/

---
 .../StateMachine/robot_behavior_case.py       | 111 +++++++
 MissionPlanning/StateMachine/state_machine.py | 294 ++++++++++++++++++
 docs/index_main.rst                           |   1 +
 .../mission_planning_main.rst                 |  12 +
 .../state_machine/robot_behavior_case.png     | Bin 0 -> 28565 bytes
 .../state_machine/state_machine_main.rst      |  74 +++++
 tests/test_state_machine.py                   |  51 +++
 7 files changed, 543 insertions(+)
 create mode 100644 MissionPlanning/StateMachine/robot_behavior_case.py
 create mode 100644 MissionPlanning/StateMachine/state_machine.py
 create mode 100644
docs/modules/13_mission_planning/mission_planning_main.rst create mode 100644 docs/modules/13_mission_planning/state_machine/robot_behavior_case.png create mode 100644 docs/modules/13_mission_planning/state_machine/state_machine_main.rst create mode 100644 tests/test_state_machine.py diff --git a/MissionPlanning/StateMachine/robot_behavior_case.py b/MissionPlanning/StateMachine/robot_behavior_case.py new file mode 100644 index 0000000000..03ee60ae9f --- /dev/null +++ b/MissionPlanning/StateMachine/robot_behavior_case.py @@ -0,0 +1,111 @@ +""" +A case study of robot behavior using state machine + +author: Wang Zheng (@Aglargil) +""" + +from state_machine import StateMachine + + +class Robot: + def __init__(self): + self.battery = 100 + self.task_progress = 0 + + # Initialize state machine + self.machine = StateMachine("robot_sm", self) + + # Add state transition rules + self.machine.add_transition( + src_state="patrolling", + event="detect_task", + dst_state="executing_task", + guard=None, + action=None, + ) + + self.machine.add_transition( + src_state="executing_task", + event="task_complete", + dst_state="patrolling", + guard=None, + action="reset_task", + ) + + self.machine.add_transition( + src_state="executing_task", + event="low_battery", + dst_state="returning_to_base", + guard="is_battery_low", + ) + + self.machine.add_transition( + src_state="returning_to_base", + event="reach_base", + dst_state="charging", + guard=None, + action=None, + ) + + self.machine.add_transition( + src_state="charging", + event="charge_complete", + dst_state="patrolling", + guard=None, + action="battery_full", + ) + + # Set initial state + self.machine.set_current_state("patrolling") + + def is_battery_low(self): + """Battery level check condition""" + return self.battery < 30 + + def reset_task(self): + """Reset task progress""" + self.task_progress = 0 + print("[Action] Task progress has been reset") + + # Modify state entry callback naming convention (add state_ prefix) + def 
on_enter_executing_task(self): + print("\n------ Start Executing Task ------") + print(f"Current battery: {self.battery}%") + while self.machine.get_current_state().name == "executing_task": + self.task_progress += 10 + self.battery -= 25 + print( + f"Task progress: {self.task_progress}%, Remaining battery: {self.battery}%" + ) + + if self.task_progress >= 100: + self.machine.process("task_complete") + break + elif self.is_battery_low(): + self.machine.process("low_battery") + break + + def on_enter_returning_to_base(self): + print("\nLow battery, returning to charging station...") + self.machine.process("reach_base") + + def on_enter_charging(self): + print("\n------ Charging ------") + self.battery = 100 + print("Charging complete!") + self.machine.process("charge_complete") + + +# Keep the test section structure the same, only modify the trigger method +if __name__ == "__main__": + robot = Robot() + print(robot.machine.generate_plantuml()) + + print(f"Initial state: {robot.machine.get_current_state().name}") + print("------------") + + # Trigger task detection event + robot.machine.process("detect_task") + + print("\n------------") + print(f"Final state: {robot.machine.get_current_state().name}") diff --git a/MissionPlanning/StateMachine/state_machine.py b/MissionPlanning/StateMachine/state_machine.py new file mode 100644 index 0000000000..de72f0f451 --- /dev/null +++ b/MissionPlanning/StateMachine/state_machine.py @@ -0,0 +1,294 @@ +""" +State Machine + +author: Wang Zheng (@Aglargil) + +Ref: + +- [State Machine] +(https://en.wikipedia.org/wiki/Finite-state_machine) +""" + +import string +from urllib.request import urlopen, Request +from base64 import b64encode +from zlib import compress +from io import BytesIO +from collections.abc import Callable +from matplotlib.image import imread +from matplotlib import pyplot as plt + + +def deflate_and_encode(plantuml_text): + """ + zlib compress the plantuml text and encode it for the plantuml server. 
+ + Ref: https://plantuml.com/en/text-encoding + """ + plantuml_alphabet = ( + string.digits + string.ascii_uppercase + string.ascii_lowercase + "-_" + ) + base64_alphabet = ( + string.ascii_uppercase + string.ascii_lowercase + string.digits + "+/" + ) + b64_to_plantuml = bytes.maketrans( + base64_alphabet.encode("utf-8"), plantuml_alphabet.encode("utf-8") + ) + zlibbed_str = compress(plantuml_text.encode("utf-8")) + compressed_string = zlibbed_str[2:-4] + return b64encode(compressed_string).translate(b64_to_plantuml).decode("utf-8") + + +class State: + def __init__(self, name, on_enter=None, on_exit=None): + self.name = name + self.on_enter = on_enter + self.on_exit = on_exit + + def enter(self): + print(f"entering <{self.name}>") + if self.on_enter: + self.on_enter() + + def exit(self): + print(f"exiting <{self.name}>") + if self.on_exit: + self.on_exit() + + +class StateMachine: + def __init__(self, name: str, model=object): + """Initialize the state machine. + + Args: + name (str): Name of the state machine. + model (object, optional): Model object used to automatically look up callback functions + for states and transitions: + State callbacks: Automatically searches for 'on_enter_' and 'on_exit_' methods. + Transition callbacks: When action or guard parameters are strings, looks up corresponding methods in the model. + + Example: + >>> class MyModel: + ... def on_enter_idle(self): + ... print("Entering idle state") + ... def on_exit_idle(self): + ... print("Exiting idle state") + ... def can_start(self): + ... return True + ... def on_start(self): + ... 
print("Starting operation") + >>> model = MyModel() + >>> machine = StateMachine("my_machine", model) + """ + self._name = name + self._states = {} + self._events = {} + self._transition_table = {} + self._model = model + self._state: State = None + + def _register_event(self, event: str): + self._events[event] = event + + def _get_state(self, name): + return self._states[name] + + def _get_event(self, name): + return self._events[name] + + def _has_event(self, event: str): + return event in self._events + + def add_transition( + self, + src_state: str | State, + event: str, + dst_state: str | State, + guard: str | Callable = None, + action: str | Callable = None, + ) -> None: + """Add a transition to the state machine. + + Args: + src_state (str | State): The source state where the transition begins. + Can be either a state name or a State object. + event (str): The event that triggers this transition. + dst_state (str | State): The destination state where the transition ends. + Can be either a state name or a State object. + guard (str | Callable, optional): Guard condition for the transition. + If callable: Function that returns bool. + If str: Name of a method in the model class. + If returns True: Transition proceeds. + If returns False: Transition is skipped. + action (str | Callable, optional): Action to execute during transition. + If callable: Function to execute. + If str: Name of a method in the model class. + Executed after guard passes and before entering new state. + + Example: + >>> machine.add_transition( + ... src_state="idle", + ... event="start", + ... dst_state="running", + ... guard="can_start", + ... action="on_start" + ... 
) + """ + # Convert string parameters to objects if necessary + self.register_state(src_state) + self._register_event(event) + self.register_state(dst_state) + + def get_state_obj(state): + return state if isinstance(state, State) else self._get_state(state) + + def get_callable(func): + return func if callable(func) else getattr(self._model, func, None) + + src_state_obj = get_state_obj(src_state) + dst_state_obj = get_state_obj(dst_state) + + guard_func = get_callable(guard) if guard else None + action_func = get_callable(action) if action else None + self._transition_table[(src_state_obj.name, event)] = ( + dst_state_obj, + guard_func, + action_func, + ) + + def state_transition(self, src_state: State, event: str): + if (src_state.name, event) not in self._transition_table: + raise ValueError( + f"|{self._name}| invalid transition: <{src_state.name}> : [{event}]" + ) + + dst_state, guard, action = self._transition_table[(src_state.name, event)] + + def call_guard(guard): + if callable(guard): + return guard() + else: + return True + + def call_action(action): + if callable(action): + action() + + if call_guard(guard): + call_action(action) + if src_state.name != dst_state.name: + print( + f"|{self._name}| transitioning from <{src_state.name}> to <{dst_state.name}>" + ) + src_state.exit() + self._state = dst_state + dst_state.enter() + else: + print( + f"|{self._name}| skipping transition from <{src_state.name}> to <{dst_state.name}> because guard failed" + ) + + def register_state(self, state: str | State, on_enter=None, on_exit=None): + """Register a state in the state machine. + + Args: + state (str | State): The state to register. Can be either a string (state name) + or a State object. + on_enter (Callable, optional): Callback function to be executed when entering the state. + If state is a string and on_enter is None, it will look for + a method named 'on_enter_' in the model. 
+ on_exit (Callable, optional): Callback function to be executed when exiting the state. + If state is a string and on_exit is None, it will look for + a method named 'on_exit_' in the model. + Example: + >>> machine.register_state("idle", on_enter=on_enter_idle, on_exit=on_exit_idle) + >>> machine.register_state(State("running", on_enter=on_enter_running, on_exit=on_exit_running)) + """ + if isinstance(state, str): + if on_enter is None: + on_enter = getattr(self._model, "on_enter_" + state, None) + if on_exit is None: + on_exit = getattr(self._model, "on_exit_" + state, None) + self._states[state] = State(state, on_enter, on_exit) + return + + self._states[state.name] = state + + def set_current_state(self, state: State | str): + if isinstance(state, str): + self._state = self._get_state(state) + else: + self._state = state + + def get_current_state(self): + return self._state + + def process(self, event: str) -> None: + """Process an event in the state machine. + + Args: + event: Event name. + + Example: + >>> machine.process("start") + """ + if self._state is None: + raise ValueError("State machine is not initialized") + + if self._has_event(event): + self.state_transition(self._state, event) + else: + raise ValueError(f"Invalid event: {event}") + + def generate_plantuml(self) -> str: + """Generate PlantUML state diagram representation of the state machine. + + Returns: + str: PlantUML state diagram code. 
+ """ + if self._state is None: + raise ValueError("State machine is not initialized") + + plant_uml = ["@startuml"] + plant_uml.append("[*] --> " + self._state.name) + + # Generate transitions + for (src_state, event), ( + dst_state, + guard, + action, + ) in self._transition_table.items(): + transition = f"{src_state} --> {dst_state.name} : {event}" + + # Add guard and action if present + conditions = [] + if guard: + guard_name = guard.__name__ if callable(guard) else guard + conditions.append(f"[{guard_name}]") + if action: + action_name = action.__name__ if callable(action) else action + conditions.append(f"/ {action_name}") + + if conditions: + transition += "\\n" + " ".join(conditions) + + plant_uml.append(transition) + + plant_uml.append("@enduml") + plant_uml_text = "\n".join(plant_uml) + + try: + url = f"http://www.plantuml.com/plantuml/img/{deflate_and_encode(plant_uml_text)}" + headers = {"User-Agent": "Mozilla/5.0"} + request = Request(url, headers=headers) + + with urlopen(request) as response: + content = response.read() + + plt.imshow(imread(BytesIO(content), format="png")) + plt.axis("off") + plt.show() + except Exception as e: + print(f"Error showing PlantUML: {e}") + + return plant_uml_text diff --git a/docs/index_main.rst b/docs/index_main.rst index 75805d1184..65634f32e8 100644 --- a/docs/index_main.rst +++ b/docs/index_main.rst @@ -44,6 +44,7 @@ this graph shows GitHub star history of this project: modules/8_aerial_navigation/aerial_navigation modules/9_bipedal/bipedal modules/10_inverted_pendulum/inverted_pendulum + modules/13_mission_planning/mission_planning modules/11_utils/utils modules/12_appendix/appendix diff --git a/docs/modules/13_mission_planning/mission_planning_main.rst b/docs/modules/13_mission_planning/mission_planning_main.rst new file mode 100644 index 0000000000..385e62f68e --- /dev/null +++ b/docs/modules/13_mission_planning/mission_planning_main.rst @@ -0,0 +1,12 @@ +.. 
_`Mission Planning`:
+
+Mission Planning
+================
+
+Mission planning includes tools such as finite state machines and behavior trees used to describe robot behavior and high level task planning.
+
+.. toctree::
+   :maxdepth: 2
+   :caption: Contents
+
+   state_machine/state_machine
diff --git a/docs/modules/13_mission_planning/state_machine/robot_behavior_case.png b/docs/modules/13_mission_planning/state_machine/robot_behavior_case.png
new file mode 100644
index 0000000000000000000000000000000000000000..fbc1369cbcd71f278a89d38adbcda6da4b737e36
GIT binary patch
literal 28565
zr=}bl>P2q@=f462F+Eyum^Bxe09}Br-?47$%EXg5@81I_1dcm-0G(&qn3*Rl%!EM* zcXIj+?Am;q|L?u4;^K917D-k{CXbh2zp}i1XKQQuZOLnRv9P_3sG@II8lZyjkAnfm zeKZ%UH(ldE=F=z&rcaH&w>jBTP&9*Y+v1Od7@6WbZ4kbWh;>CQBHZ7uEogfI>>te; zt9tq6L{_&cQJHeV zJ3w}BD}o{r9~UPkCbq?s#f*_Sf8N>El{~}>8x+9|DyMdv{3CjpCB$S72y;xfv1k2i zm^UcHlqN+%bZ6?Crvgsg&DnX*Y!#jZrzj7-nVfEs3y09MF7Vj>pf1fKU0Yw@PH6bQnE2K{1OihESqbXO@vrwC)l1o+JzI0$ zeeqp*7mkN)LJJQXDEPI8($9e-;+BknS50R10M}`{CQ_o)af#%P5pqs(Lp#b3$=~$o~s2F|`laQSA zLY4`R0C)t~7xdDGSH_IGMn|`xL`(`wLSqGI0UdsRhB&FN@&hRk-t0v~O)U1{OX?*N z!o&0BfQ-T-n2!ysz&{G?=^7tl(ts7%to5+S_&#L#m+heL#8@k2wC@O%z?$E8fqig- z`Tp*|MV}i91Q&=FMR|Ip`NPotSi@`gpfWZpn z{_#vk8_{+-iV~8j%BokBCc+Q;`W)0@gFw32UVWw(&p6*Q9B;LM*N8SE8gzTkvnXiI z2{>oWIDlt#=)SYDs2Isd3j50$q&@TV;D;|3ix3?rx zCYF|Q|2)q+yHhk$&F7E`*kT|CS*031r^G^RN*klRr7<)?gO{*K$NffgLS8IE@C0ztf*3bkw4?-M^d9 z3vY^Dt8qn_6P1f5RBW#DYqADwrL~@&-SI+SJnScP?VzB2AVwm7Y8lQcNTXAhjz4p6 z8OESj#0KXEawwZKwZLKP>U#D#$MJz~N>$L-oYGMrkbjB5N_Ri41N~f-fBq5|Jcy!q zf6;*cw3OFJ*m02dz)b`vgnhJrZGC#y*yq?SO?mj%F7x;7LM=!-d(-E z{z1am4TRV@E6)Dcq~HiOGizySagqd$Ef*9h?>#OEOc7HE%m)0xBgRRs8)L2a`@=M2 zzUz0Jcq1YRFk}Q=2JZ$3NpIr4B?K6UvX!;9AA5W3(mr!QvV!fg5SvQ6hU@6kR8$N% zZi$Cg9po9J^{U|V`AA2g{7gtep#x5ylLhttHM1!wAleR1E;Ge|)bagzq(=waXb#{f zCGkZoDV38@3dJl`48?ZZ8#Wgoc`ZC~;ZtN}M2g^Qx>GUV%Rvs4oz3(UN*! 
zWhKJGj(_~{cMPMSrbt6mWFLk}su=?R0Y_FI%LUgyf4&FD*06{D#f?i(v}Wx8x4WM& zDg=6hm6>@?n19%>k=HVMYIPj?T^ZZ^LN>qf%F@FHUb0o-GZNzC0fBZmkkhbVGGV=^ z+1^452DwXVm?_h!;vE8d`WWr(ru!G65PY2Fp{4Pk6 zUzoMAv)44aD#CTJ!NrWBku$jOY{l@<5b|^eJd!_Itdt*5&skK@K*OS{iY(^C{SQZG zl6{Hg+Ck0W(vyX34hlIdhSIp3T<1b^40rGghr5Qqp`oXzz_Dp}Et^m-_emJiSnjUf z6i}@|_D8r5h;2*`^f9D0x==4_}9NXee`QcvaBh2Ub`;D z)kC)kMA7MsH_{TdF+>Q2Ubb8;5d|@xUqn-MPfoC97AGYoWo1o+)!jw3Q1&$LIujWc zRp7O2S)xvNSGex`I$yeU$M!K+q;S&bV-SL85I2e=!W~iUS?`=mIkR0yzY5-aYLLvAax$0MukQJJ&g`R~vKfJDp6?6Tuvp5{jD2Ov)frlzWAD57D z^0b^wS`u6={tkb&2waEwXP!ni0`3JeX?c#XAOxOy@iC|$i0Qr|FN*VJ(Tad)TW8d7 zS;C$i_y}HlJX5lI`|L9s^CrJVh@!G}ZlX3K_*of9-QaA^u;DmgmJklMn>|5EkCqqb zH{qAB@$m-BePwD_DYcu_z6!S$BuD;xQ0G1V;wtoBkj-7*ljxFjga3PeeO=dqgk|-V zKF+!fPSQ!YnsPCL-jf&`djRRMY~^(7A3(zATJZLSg#k&6yCVrOQCQ)5|{hm)JYXTftr572?QGz2_v>#gaIUxuVir_*Eq0oLN6JCc7-uCwJI3?rvC0FS- zfJiv2%2OpXYQ~2E*%S;U`ehU=^3wV^L}EA;bU52Zb?Q7RBbJ{)_uYe1u-U;GrpIWL zjEG|?F@2ol*6ZQn-#f#?4k?9>qsA!=jVXm(b1jtGm$jfO3J3~b^k3Eoz!5lf3_+j~ z%u(bca+osgDDWXI+nOf+{*9;t5pOWgQ;3Zxu&&lM})PnrWq&;zpoFIV1_bG3&Jgco8 zgVUp4=zv8MhH+jEyuJCFp&>3thETU$9Sk6Ha02tm6CL7#vYluhEYMcQE&jP(7Mq_Y0Wd7 z%7YW$6+10$cYSbpKIDt7Kj-)h)A`fjENjQrX-N+q*{yt5q@kf9b6^8rAO7?yo1Bg| z1kM1jPu9Sx;{2?#+_4QfK?p|^)0;URw<@g~Km#^cjTXy-zdphIoJ1Rb46k#PkO7s8 z#6U|0NBem_XhXr1o|l&wS9^CFYzDS1fr7Rs#!7nLa9RwOzJ%8e-9X2|VF$;Cqqxn_ z$b(K)SX|8COV6A_2hzhB&?iR#n>6J)VlXowKblrrstKq(wv}l!Vfe@G0kn=EdkDRO zH=O*Fkbu9lQ4*uzv2g-S03dMA_#vNr;%!9G0D<>G9rfjZI6o|65);`)?8AYef3MKd z-QE2!Z^s_!&S2cADX$($PEH0E8VI07X3_2!FZkr;+08xzkz$KMefDdDxdA5ws+NwX zW(@e8qkDAci5637|J7Hgsa}(9qw;n;klk@qq7_D`<&t8ieyOzvVH)VHkKL@PIAzR zfQ8(m5gvu3Qer-HilA%5Kmhfgm6wNOWS?MccxM_F)do;X;7*L>1agCo3U08ztf1ie z2Vgz37tmqD0|O)#@P7Qu!ZBWE^w+`xf4YV1?Al+8hqb{Yb5LXee!$Wwd?d;i zaZm>Nb$i~2V=J;6e{i?$#%^3pjP%XDHMjxrXb`GC!Sj*v@!Ie`I6QG{Z*2-(KX`)7 z#0&!P`PFGKYWeJZ!D?s>fP>1R8Q2c+ydJ=TOd*?Qe>nIC=gi@phrJvx?WK8|?fY9> zJ|}bt=>Jz;R~`;!`|pQT_83x@5F?Z%iAjrzQAUJ8p^}iLvZk^nyM}Djki1zVq0QEU zq-+rtr6eRvWl6RqIXR!H_jk^9o$EU9A6@TNW9FIrexC3B{eCw3*dK8bzD4S~JcgY` 
z4oibFW7Z)d1)^be5>R_2+T(-&m3Z7}fBhODh7g-wpYMS<^6cDD-u-)^0ND4z8xG-Wdpo=8LGO(r z!NCM{tQ;Tz;$5$T*CuqXZ$f>LyOMa8C34!N6Cx{7}iax@kG zia_=sKD%5>suGYQl24FJ0@~}wrE^SqZe6t8#zt@J>LM0)yvBl{PcBl$(5ei|WIJ%4 zv@>=#=)zc6m}*r|pFgkn5$5_gX=kFq^Ko6>QAGCm{=x*Op`if=9H7;k+lO_a8pS67 zY7r2C-&^!T3#?_4{Nh9*p@h`bIry-al^R@7bU&C`&2I>~fZE25U?Q%5*KlG^`DByk zTj@475;1kW+}s^aKVstI>Ob*sRKRR6@|pW%Kn>@595+yaG&J*?636&MW$#n>lUzg25}vvc zTBs$Ywg{kNoBg{Q8ykHzT3mXhq!0@0519C1G&PXERZEen*}Up<{z=aDL=n07q4k_J7ld>u-w0YgH7pma1Xc)=uaTm zU}LA6e%KbNRjXhLi|Br}UR4#fa2OCEX1zS)e?AQjF;0f7EB!wzkGd8Zt+^`C{-3BJ zuJR7oGyvlNw6LwZuU})Irg6s!bG<7&z8~TebggWy8+0B}KRd9HulF5@B;1wefV=wR zb!NuL!7%R>3deU5R10R}3ZbVr{;{Q5ZNXKKCa1UyGZp|L=>GwG==0jeqCtWj39H4W z-t?v5_HBq*8=mjWDLxEKZ&Ff3Vnlc}%s9P|cp_vC{)j*{>rFX#sr`ELjE__+Qc_Pa zWG6a-CRnMNJ0vamP?=_T%`9sUXWA$nwp+LU!uA6S$G6Fvtq4#L!w*bOPgkBA5$pb} zRx`iUkm+X_y|42Bs6g({m;X_L$W(dTl-^vp+m!$N@Fw2h8Dp&4L2tm8A18F$R|+W6 z9*bD!JOCNwXr{xnp=H2g#Sn8o)+djx!Gafu?xq{6gI4y1@#Zqh8ABT=YOp^coYy14 zNTm{^=a0NfWdfE%!_@vcz|^9D73aBN=0L_j$Bu=(Vfo;Mg+yM+i2nWlOBE5fbGOxgC)rCF5kmEHrH0+LkaVJ!1vyfG(i}2$R{fS$MW`%tHeBa)JyZ}}v ziKeNTxNuwB%2b7Odi$QW>(>u@lE;_Rj?tyj%YDw5u)exMVHMevpt+E@MkY&%mzA@N zl-*%Ew1*u#9jx5atK(VC@(ba|rLiR+lq zq2NcNNlLK@QEHJPeRP|p{Bis(jz{2cQcgRLBFX9RYkhK%0i%iL8l%72Y28o=N}|*9 zjycB0u~?XEhK6`vr8lv~GQtccO$%*09S37z;I<@1((M0S$8H+br!^m0ELL@Co;rJp zsB+N)NvB!U5GP;>O#L)NJuIx(DmN~@zkF+2b?Q9#%dxBNm_$matOaP%L}%K3MqBc^ zg5bs8Lg>9R--e^rXN%Du2JX}IQ2tZwV0K&WEcaLmGHP_6wuc>6l$AsFySeW4WM$^S zTuwHlbDJkPk(8B(Fe;XkYd0!F#g5LDA|KWM#`2H61KCSZ2CjD7^_b!JY^IX3e21nlv_&md_Jm!ZExw*Sz?Ah2d&LlyV+kUN#GN|1#mL1*I2Df==z~ z+bhU6djlQam%)-w|2?bOYJJeAWb#TY2JdV6wkSC9;aKu-NiHwfHaC9-6N3MPn|FKY zF-*06NIHRfL2Qdcrdb3x`bdCZ!Kcj!Aqif$??z_Y`(q<&gGw+`fNeu=2m6jnf1IBX zJR5~Fi@t@e>sa-zWq8Y~*5nV)Yw1KWON23@PXU^YWSS2k?>ai3>d;cm5T}@X zLKY7jbRKY4zk6`f*}_8f(zB!m7x@pm4i3mwb2T>?`KecPzIZj|kl~X)*u61T#@hF7 zP*-p3=pd`8Txfa>6(MvTmUkUucx~`RO-)Q>g(~)jg>##xF-iPk*X+TEg7elml`iIf z4$U;WsB000y(IQCx;8#MG?Wuc)i-ah87ii1ok8sbc;M>a*w|I-BO-1}*CHrapc{LK 
zrqXgoY1nlE;1Ezg%NHF(l2BIFW)8;{tePc4%5`t{?%?A+d7v_nYAI`}K}yz1=a%}B6Va);D` z)Lo8&v8d}1T54%|AmPzgltiNRf*Hkh%c2NTm7);HYe_H0y@NZlz*q024h+gJ_Z&D{ zR_{-Nf`B)SzE=-L8%=jq+OolLVf|J-$|p}w!PWfJIArl89&$OElST6Wetv9&!3ixN zgUD(Kvv4e}xvXMfCJ4}sH+KLrCWvFqOwgP5Ioe%*%BSlsBi(LY4|FJX4r@#Fu|DYL zL0>nX;1K;-5Ifeu4hmNHg_Z^kh}knm8^4Q)MI0KUt-JL2ohoX@sSv%vBwUJM)D*&W z3cEy7wF&#(G>oOjG);+ui!IqSe>@vKf{0QajcQK z)#+FhOShrWScVM?XVyRx32Rkew)5+mAwY}~&Tg;wn*HcOWL&)PPa=9+B%wjrz2=6>MeqNSJM;Y6_!Jr&1ve{3nK@Jg->hzebASDfy@@gL{cZRzxE zU$bSOMKYIRw)<&x(j{lU*$&U`R(R2CEtyoj`Vy>t-jDMv1~ICjh25;BweFOt)5@qH zohgaC?_Z*R-*(HU>=;G{iyutR-;24pNNd*6tA#yx+H&=(&}_Mq3YIAvGOPLayD>AV zE5tKN1-J+A!c@NPHBY*?IkDnqacrm{nrVP9xXt;Q?YyzhYhrT)G!Sk&2+HhluZP>ZQ^g)4{l+RN>Ec#VaI+&SOvAitoZ|Klx3`__F#SBrimJQqrAp zO-5bsy7#rSFN+{T)U*zmn`XxoHO4%6cVY%W0;$$u?GYX~V$4n3hjY-m)~dhUvwON+h4gcnII(N)b+@);-*1 zlsvmT6C0W5y&di4LYbRV#@EU&f32&pp^`-9x=~7uI1;Y6C{ikMx8E98{}I0>-#QLI z`VjYP>$e#(^2}IOK;YQ2j{n*bSoQ7OuaaM>-4DAT0?WXT!S_+;r>UVaFgOS_!$kGc zy5whrljwmfD=Tq!WM9cN0@@0V56YRnL;KO}B8#;T!N}|Bh1oAUB@dqB1CmG(83^F_ z#O=MB?}DvXN=Wog_|o$&@}>4!e~H}@Uvf5C?@;|1=t3N+*u6~Ay|A+o|{7wg#|z5q%-5)?Q??9ii99ZnT9 zZ0a5X#vDOsCc4M&9YKmfDpFx!gtae9VPEFGk`j5LO`4kQtxq(dmNq|s4lAeZ-npmp zvJVc9jg0K6tn!8f7lsQpwFI&|v(XVE1uiCN#fdl;(nFjx&h8tkHZ*H|w3`O-VQhH# ziPttAoe#COIk#>JHh47&WAKLfAgE1AnT1=+9r-da50T^G=<+5@ zweMG-{5fA+OMP;6Y|I_!L|a?-bM+k=GLn*e-hjLo*=@;^*VNL2ZqHmr472COLoSXN z-OfkAX)7JSyVoS2_wOE_9fAhQ)7STYfqPxyGdyyDLnPCGiyl329WkuXbKg`Z)Cyag&ZWj!jk2Vo_jT3V^CTuPwbFdD?X+!OF^yTVYk!4t18n8ZFHkM~gq*}B53KhkK; z@e?PIqyuoJ&a3j;H4qG;6k=Te?W#G<2H$#nW#jrmsn^o3*f%p|8=IP<$t4xQTv0Yr zQQBiW0Df3n>KlCo_Vr@h6#qn-Z0?EN%Yyv;i2J}6<$7CfCj7Y$xNm<(lO^9%e(G6QS0H9T8Yrveo+xg?jLSG~;>e@2@&ayhvAju$L((GyoFJt~EQGFf zObK^e#%V&rbg9e@8?+zkNHesEbJ0j~N4(sgwHe6s6wq~jGl)(Gt>)_+8>V$Xa8QVI zTe8hjdklMGsBhyQXg@gEb91s)K@1cV)X-16!exmzf&}KvrUu7$)Et;)Z=gRbUS-`W zU89D3y{ERi%{usm1ir4ur#n}Np(J?qnYso@6`7o}ioua2v4vpJ_EeLl4M#uXBBKKf z#1vecj|0?v-qLb(hbom?kE>A6PYF!`fD6H0->s+0;X5B5cXPW*k~4L20fe$b5;@F+4JGuH{THV)twtL0m{U$wSA{yZh%@fNkH3wA~c=`h}}+W4-w 
z9+k)cM*YPgP`Kl1L4bl$tkT6sgnk_|HLbRwoDVC;_OVK?X59rsijZl6tykZ!-@biS zkiF!?QkOl#va|!whc~)|QH28A+uM8kdPsg2ybh?=5Heim<@ORfLr4hs4b@=nx3wLE zQJ3(d3UQo)5UWd+6S}Rw;0hyX|D+5csqNS7?47xELcYZ@50AqK4s0zF*ITFONl0+y zw$8ay13ME^W6+^1AGbDJDJ@-fZXsx&!QwpXrJs*6^dF!+Odx0$B2m$bD zvO@l0azX+udhw-Hp}bYNXF4aA7hOj?WvG*Iz|1TprGDWV!e6E18wzjVUN|SLVl0Ul zy^?E3m@(p`0mBZQ(D^wa;w_#Z0V*ND&ku~KsIsN(E@#{UK2BxnOaG=WZl73Gf=>j* zj7sGHhPrtk)CWZTq)X58*Jd@3uOg7dgW?*+ujS7o@qGkzK)Le zvSN+iX&fYwjBn>eMMef8beb(`yYvi>I$8Qf8Av|i;Q^-NtE|P;8e>BTVIZ{w`bvxeu%`na!u}fy=L6-_B=;yo@0Yh3lJkdGBhME zCYGF<3aLoaVm>j?2VDw#lGV1Sewu(ss2FGI{f+dKO9gTjWgAV^XgQWWX`P3c4iNY=SiyJluy03i>2sn$EfOc8@xZQnx*dQyiqBTBkJr z&?8oGu!4cyx|5n^BbSH|Fu*HkmxjrHB*^kGG>i^WaeKi=hpDqJy=5{Rzyo&1q+9NM zH7~C5hQTYcT!6p-lbRaj>RSuSlucqZ_}zWxLu@~%g=9D$XFY?f1|;>T{qp)$0Ab2i z-x`J5mYmG=CL~Y?c?+8`U-J*mCCWvzlEDB}l$Tp2NeOLr#d}}8%U>pl@dY=4S!Y1Q zq1d=M$N;AAw`jPJx9>j;4axDYv|U>DIOk|uRE*gDq`^!8GtV;(c$5E1I)?3-CNTsbbI@3*aRUiSiY72t$Q%zPCj{HK!V-(Clv3fQm4nd7)PvtWS4n1#E6T*7Bzz*#P+r`470Ivsk$|iR!eT8f8!0w8c z1#jtuFF4bATue+1$eo-7B10isPF&Rhz0m!++cO#WR@APv0;ShMTeZaTtpqG}NEYtfadQAex*#KMKt+T7UairyZgLsb=( z%Co=jG8m1}LFaTv^rrRIeb?&o)mz0>YRHbyEV!)1K{9r%vGb6m5CcyBI6OFQfV5l< zO-=My7ly}fPRB|zKm;5^yGp0)`k+(&=t*W7%wgf2no|;!lXsPl4iBg5oco=VK$Y2P zxyg*v!2ZXYt?B-hHebQ4LniSg{u)>n;cHe(G*IIo86B-W{Z*pKp->OWAP6pKL=jix z$M22;g`uXdo=qi29)Up@Ck}kiI#BHInC)GJMvk_cv4c~V=c}>Ffnl66-utw?8uaNX zl?OZ?LRPH-&<^R4Y=ew6ZDl0JU^MdhRV`1Nn@7R&qthEsKtkPJ1p7g4hi4DnCo(*7 zr^{8fX}Pl$-B(7oQn7NxNaTO+kH4G>xlc76PQ*>szO`w zp*a)VbssF2I72 zgSN}cl6R_%NO%BbRIV@yi}B`PilN z=`|EF^@!11J2<9Z~a^g6|v((gRxH|Auzk@&Dq3Xi+`vP@dF~&)MV&5@%a87 zFvSRwJ^h*g^(*vv*=0+AHh~5L4_N;r&~Z_HLj(L=vKeAR*B@XlS{~}au1{4Pran_-JnU<`Wc6pEQc6ZeqlX*;%&>F3|Gj%a7P2h*# zuye&uI|~YtlSbg_C6LFl@@*oDP_hyPWqYxEA*5g~PlA*$c5;XzGA<)USSsY%tK(<( VMQeA>!#)E3qv`LZ-qN)T{cpp^WxxOc literal 0 HcmV?d00001 diff --git a/docs/modules/13_mission_planning/state_machine/state_machine_main.rst b/docs/modules/13_mission_planning/state_machine/state_machine_main.rst new file mode 100644 index 
0000000000..abaece1b11 --- /dev/null +++ b/docs/modules/13_mission_planning/state_machine/state_machine_main.rst @@ -0,0 +1,74 @@ +State Machine +------------- + +A state machine is a model used to describe the transitions of an object between different states. It clearly shows how an object changes state based on events and may trigger corresponding actions. + +Core Concepts +~~~~~~~~~~~~~ + +- **State**: A distinct mode or condition of the system (e.g. "Idle", "Running"). Managed by State class with optional on_enter/on_exit callbacks +- **Event**: A trigger signal that may cause state transitions (e.g. "start", "stop") +- **Transition**: A state change path from source to destination state triggered by an event +- **Action**: An operation executed during transition (before entering new state) +- **Guard**: A precondition that must be satisfied to allow transition + +API +~~~ + +.. autoclass:: MissionPlanning.StateMachine.state_machine.StateMachine + :members: add_transition, process, register_state + :special-members: __init__ + +PlantUML Support +~~~~~~~~~~~~~~~~ + +The ``generate_plantuml()`` method creates diagrams showing: + +- Current state (marked with [*] arrow) +- All possible transitions +- Guard conditions in [brackets] +- Actions prefixed with / + +Example +~~~~~~~ + +state machine diagram: ++++++++++++++++++++++++ +.. image:: robot_behavior_case.png + +state transition table: ++++++++++++++++++++++++ +.. 
list-table:: State Transitions + :header-rows: 1 + :widths: 20 15 20 20 20 + + * - Source State + - Event + - Target State + - Guard + - Action + * - patrolling + - detect_task + - executing_task + - + - + * - executing_task + - task_complete + - patrolling + - + - reset_task + * - executing_task + - low_battery + - returning_to_base + - is_battery_low + - + * - returning_to_base + - reach_base + - charging + - + - + * - charging + - charge_complete + - patrolling + - + - \ No newline at end of file diff --git a/tests/test_state_machine.py b/tests/test_state_machine.py new file mode 100644 index 0000000000..e36a8120fd --- /dev/null +++ b/tests/test_state_machine.py @@ -0,0 +1,51 @@ +import conftest + +from MissionPlanning.StateMachine.state_machine import StateMachine + + +def test_transition(): + sm = StateMachine("state_machine") + sm.add_transition(src_state="idle", event="start", dst_state="running") + sm.set_current_state("idle") + sm.process("start") + assert sm.get_current_state().name == "running" + + +def test_guard(): + class Model: + def can_start(self): + return False + + sm = StateMachine("state_machine", Model()) + sm.add_transition( + src_state="idle", event="start", dst_state="running", guard="can_start" + ) + sm.set_current_state("idle") + sm.process("start") + assert sm.get_current_state().name == "idle" + + +def test_action(): + class Model: + def on_start(self): + self.start_called = True + + model = Model() + sm = StateMachine("state_machine", model) + sm.add_transition( + src_state="idle", event="start", dst_state="running", action="on_start" + ) + sm.set_current_state("idle") + sm.process("start") + assert model.start_called + + +def test_plantuml(): + sm = StateMachine("state_machine") + sm.add_transition(src_state="idle", event="start", dst_state="running") + sm.set_current_state("idle") + assert sm.generate_plantuml() + + +if __name__ == "__main__": + conftest.run_this_test(__file__) From 346037a6e2362ed17870928ccf26992b60b0e54c Mon Sep 17 
00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Tue, 4 Mar 2025 07:26:47 +0900 Subject: [PATCH 32/35] build(deps): bump pytest from 8.3.4 to 8.3.5 in /requirements (#1178) Bumps [pytest](https://github.com/pytest-dev/pytest) from 8.3.4 to 8.3.5. - [Release notes](https://github.com/pytest-dev/pytest/releases) - [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst) - [Commits](https://github.com/pytest-dev/pytest/compare/8.3.4...8.3.5) --- updated-dependencies: - dependency-name: pytest dependency-type: direct:production update-type: version-update:semver-patch ... Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- requirements/requirements.txt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/requirements/requirements.txt b/requirements/requirements.txt index b439ea4266..b46a0e41f1 100644 --- a/requirements/requirements.txt +++ b/requirements/requirements.txt @@ -2,7 +2,7 @@ numpy == 2.2.3 scipy == 1.15.2 matplotlib == 3.10.0 cvxpy == 1.5.3 -pytest == 8.3.4 # For unit test +pytest == 8.3.5 # For unit test pytest-xdist == 3.6.1 # For unit test mypy == 1.15.0 # For unit test ruff == 0.9.7 # For unit test From cd09abd5e00ca29d15f42c539121cf62751ecd52 Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Tue, 4 Mar 2025 12:21:33 +0900 Subject: [PATCH 33/35] build(deps): bump ruff from 0.9.7 to 0.9.9 in /requirements (#1179) Bumps [ruff](https://github.com/astral-sh/ruff) from 0.9.7 to 0.9.9. - [Release notes](https://github.com/astral-sh/ruff/releases) - [Changelog](https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md) - [Commits](https://github.com/astral-sh/ruff/compare/0.9.7...0.9.9) --- updated-dependencies: - dependency-name: ruff dependency-type: direct:production update-type: version-update:semver-patch ... 
Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
 requirements/requirements.txt | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/requirements/requirements.txt b/requirements/requirements.txt
index b46a0e41f1..3e2a1e7c7c 100644
--- a/requirements/requirements.txt
+++ b/requirements/requirements.txt
@@ -5,4 +5,4 @@ cvxpy == 1.5.3
 pytest == 8.3.5 # For unit test
 pytest-xdist == 3.6.1 # For unit test
 mypy == 1.15.0 # For unit test
-ruff == 0.9.7 # For unit test
+ruff == 0.9.9 # For unit test

From 5f3be9bccd6366493ea8d159ceeae26d55ed5250 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Tue, 4 Mar 2025 12:52:57 +0900
Subject: [PATCH 34/35] build(deps): bump matplotlib from 3.10.0 to 3.10.1 in /requirements (#1181)

Bumps [matplotlib](https://github.com/matplotlib/matplotlib) from 3.10.0 to 3.10.1.
- [Release notes](https://github.com/matplotlib/matplotlib/releases)
- [Commits](https://github.com/matplotlib/matplotlib/compare/v3.10.0...v3.10.1)

---
updated-dependencies:
- dependency-name: matplotlib
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
---
 requirements/requirements.txt | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/requirements/requirements.txt b/requirements/requirements.txt
index 3e2a1e7c7c..552bf8482b 100644
--- a/requirements/requirements.txt
+++ b/requirements/requirements.txt
@@ -1,6 +1,6 @@
 numpy == 2.2.3
 scipy == 1.15.2
-matplotlib == 3.10.0
+matplotlib == 3.10.1
 cvxpy == 1.5.3
 pytest == 8.3.5 # For unit test
 pytest-xdist == 3.6.1 # For unit test

From 30a61add126e2c21c035d50a7a448ed8eb06afb0 Mon Sep 17 00:00:00 2001
From: Surya Singh <133056660+spnsingh@users.noreply.github.com>
Date: Fri, 7 Mar 2025 09:01:37 -0500
Subject: [PATCH 35/35] bug: fix typo on line 6 of SpaceTimeAStar.py (#1182)

* bug: fix typo on line 6 of SpaceTimeAStar.py

* bug: removed extra line return on line 11 of SpaceTimeAStar.py
---
 PathPlanning/TimeBasedPathPlanning/SpaceTimeAStar.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/PathPlanning/TimeBasedPathPlanning/SpaceTimeAStar.py b/PathPlanning/TimeBasedPathPlanning/SpaceTimeAStar.py
index 3b3613d695..a7aed41869 100644
--- a/PathPlanning/TimeBasedPathPlanning/SpaceTimeAStar.py
+++ b/PathPlanning/TimeBasedPathPlanning/SpaceTimeAStar.py
@@ -3,7 +3,7 @@
     This script demonstrates the Space-time A* algorithm for path planning in a grid world
     with moving obstacles. This algorithm is different from normal 2D A* in one key way - the cost
     (often notated as g(n)) is the number of time steps it took to get to a given node, instead of the number of cells it has
-    traversed. This ensures the path is time-optimal, while respescting any dynamic obstacles in the environment.
+    traversed. This ensures the path is time-optimal, while respecting any dynamic obstacles in the environment.
 
     Reference: https://www.davidsilver.uk/wp-content/uploads/2020/03/coop-path-AIWisdom.pdf
 """
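Editor's illustration (not part of the patch series): the state machine patch above documents its Core Concepts (state, event, transition, guard, action) and exercises them only through unit tests against the repository's `MissionPlanning.StateMachine` module. The sketch below is a minimal, self-contained rendering of those concepts under stated assumptions: states are plain strings rather than the `State` objects with `on_enter`/`on_exit` callbacks the real class manages, and the `Robot` model with its `battery`, `is_battery_low`, and `reset_task` members is a hypothetical stand-in for the robot-behavior example in the transition table.

```python
class StateMachine:
    """Minimal sketch of an event-driven state machine.

    Transitions are keyed by (source state, event); each may carry a
    guard (a model method name that must return True to allow the
    transition) and an action (a model method name run during the
    transition, before the new state is entered).
    """

    def __init__(self, name, model=None):
        self.name = name
        self.model = model
        self._transitions = {}  # (src, event) -> (dst, guard, action)
        self._state = None

    def add_transition(self, src_state, event, dst_state, guard=None, action=None):
        self._transitions[(src_state, event)] = (dst_state, guard, action)

    def set_current_state(self, state):
        self._state = state

    def get_current_state(self):
        return self._state

    def process(self, event):
        key = (self._state, event)
        if key not in self._transitions:
            return  # event is ignored in the current state
        dst, guard, action = self._transitions[key]
        # Guard: precondition checked on the model; transition skipped if it fails.
        if guard is not None and not getattr(self.model, guard)():
            return
        # Action: executed during the transition, before entering the new state.
        if action is not None:
            getattr(self.model, action)()
        self._state = dst


class Robot:
    """Hypothetical model for the robot-behavior example."""

    def __init__(self, battery=10):
        self.battery = battery
        self.task = "inspect"

    def is_battery_low(self):
        return self.battery < 20

    def reset_task(self):
        self.task = None


robot = Robot()
sm = StateMachine("robot_behavior", robot)
sm.add_transition("patrolling", "detect_task", "executing_task")
sm.add_transition("executing_task", "task_complete", "patrolling", action="reset_task")
sm.add_transition("executing_task", "low_battery", "returning_to_base",
                  guard="is_battery_low")
sm.set_current_state("patrolling")
sm.process("detect_task")   # patrolling -> executing_task
sm.process("low_battery")   # guard passes (battery 10 < 20) -> returning_to_base
print(sm.get_current_state())  # returning_to_base
```

The unit tests in the patch drive the repository's real class the same way: configure transitions, set an initial state, feed events through `process`, and assert on the resulting state.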