DM-25856: add support for compensated moves in the CSC #20

Merged: 11 commits, Aug 11, 2020
8 changes: 1 addition & 7 deletions bin/command_hexapod.py
@@ -36,13 +36,7 @@
separated by spaces, then <return>. "help" is a command.
"""
import asyncio
import argparse

from lsst.ts import hexapod

parser = argparse.ArgumentParser("Command the hexapod from the command line")
parser.add_argument("index", type=int, help="Hexapod index; 1=Camera 2=M2")
args = parser.parse_args()

commander = hexapod.HexapodCommander(index=args.index)
asyncio.get_event_loop().run_until_complete(commander.amain())
asyncio.run(hexapod.HexapodCommander.amain(index=True))
28 changes: 27 additions & 1 deletion doc/version_history.rst
@@ -6,10 +6,36 @@
Version History
###############

v0.8.0
======

Major Changes:

* Add support for compensated moves, where compensation is done in the CSC instead of in the low-level controller.
* Add a data fitter for compensation data. See the README in new directory ``fitter``.
* Overhaul the SAL API.

Minor Changes:

* Add missing ``config_dir`` constructor argument to `HexapodCsc`.
* Use `lsst.ts.salobj.BaseCscTestCase` and `lsst.ts.salobj.CscCommander` instead of the versions in ts_hexrotcomm.
* Add several ``<x>_jitter`` attributes to `MockMTHexapodController` to clarify the amount of jitter added to measured values.

Requires:

* ts_hexrotcomm 0.7
* ts_salobj 5.15
* ts_idl 1
* ts_xml 6.2
* Hexapod IDL files, e.g. made using ``make_idl_files.py Hexapod``
* ts_simactuators 2

v0.7.0
======

Changes:

* Make `HexapodCsc` configurable.

Requires:

36 changes: 36 additions & 0 deletions fitter/README.rst
@@ -0,0 +1,36 @@
Code to fit coefficients for compensated moves.

Contents
--------

* ``fit_data.py``: a command-line script that fits a data file
* ``zd_camera.txt`` zenith distance data for the Camera hexapod, from finite element analysis
* ``zd_m2.txt`` zenith distance data for the M2 hexapod, from finite element analysis

Data Format
-----------

The data format is as follows. See the data files for examples:

* Data is text, with space separated columns.
* Comment lines must begin with ``#``.
* There must be exactly 7 columns of data in the following order:

* cause: cause of motion; one of zenith distance, elevation, azimuth (deg), or temperature (C).
* x offset: (um)
* y offset: (um)
* z offset: (um)
* u (rot about x) offset: (deg)
* v (rot about y) offset: (deg)
* w (rot about z) offset: (deg)
* The sign of the offsets is: compensated value = offset + uncompensated (user-specified) value.
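The format and sign convention above can be sketched in a few lines of NumPy (a hypothetical snippet for illustration, not part of the fitter; the sample values are made up):

```python
import io

import numpy as np

# A tiny data file in the format described above:
# cause (deg) followed by x, y, z (um) and u, v, w (deg) offsets.
sample = io.StringIO(
    "# zd dx dy dz du dv dw\n"
    "90 0.9 641.9 -605.2 0.018 0.0 0.0\n"
    "45 -0.3 275.0 35.7 0.007 0.0 0.0\n"
)
names = ("cause", "x", "y", "z", "u", "v", "w")
data = np.loadtxt(sample, dtype=dict(names=names, formats=(float,) * 7))

# Apply the sign convention: compensated = offset + uncompensated.
uncompensated_y = 100.0  # user-specified y (um), for illustration
compensated_y = data["y"][0] + uncompensated_y
```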

The files were obtained by selecting and copying the appropriate region of these spreadsheets:
https://github.com/bxin/hexrot/blob/master/LUT/Camera%20Hexapod%20Motions%20in%20Elevation%20Axis%202020%2007%2023.xlsx
https://github.com/bxin/hexrot/blob/master/LUT/M2%20Hexapod%20Motions%20in%20Elevation%20Axis%202020%2007%2023.xlsx

Notes
-----

``fit_data.py`` is completely self-contained because I wanted to be able to run it
on macOS without worrying about OpenSplice.
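Under the hood, ``fit_data.py`` fits each axis with ``scipy.optimize.curve_fit``; the same cosine-polynomial ("cospoly") fit can be sketched with plain NumPy least squares (a simplified illustration on synthetic data, not the script itself):

```python
import numpy as np

# Synthetic "zenith distance vs offset" data following a known model:
# offset = C0 + C1 * cos(zd), which is what the "cospoly" model assumes.
zd = np.linspace(0.0, 90.0, 19)  # zenith distance (deg)
true_coeffs = (295.0, -900.0)
offset = true_coeffs[0] + true_coeffs[1] * np.cos(np.radians(zd))

# Build the design matrix [1, cos(zd)] and solve by linear least squares.
design = np.vander(np.cos(np.radians(zd)), N=2, increasing=True)
coeffs, *_ = np.linalg.lstsq(design, offset, rcond=None)

# Residuals should be at machine precision for noise-free data.
residuals = design @ coeffs - offset
```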
285 changes: 285 additions & 0 deletions fitter/fit_data.py
@@ -0,0 +1,285 @@
#!/usr/bin/env python

"""Fit coefficients for a Hexpod compensation model.

Example of use::

    fit_data.py hexapod_motions_camera.txt zd poly -n 5 6 7

You can change matplotlib's backend by editing the code.

The data format is explained in the README file.
"""
import argparse
import itertools
import pdb
import math

import numpy as np
import matplotlib
import scipy.optimize

# Edit the following to change matplotlib's backend
matplotlib.use("Qt5Agg") # noqa

import matplotlib.pyplot as plt

RAD_PER_DEG = math.pi / 180.0

# Allowed names for the cause being compensated for
CAUSE_NAMES = ("elevation", "zd", "azimuth", "temperature")

# Names of the columns in the data files: the cause plus the six hexapod axes.
COLUMN_NAMES = ("cause", "x", "y", "z", "u", "v", "w")

UNITS_DICT = dict(
elevation="deg",
azimuth="deg",
temperature="C",
x="um",
y="um",
z="um",
u="deg",
v="deg",
w="deg",
)

# Names of hexapod axes to fit.
# We only expect compensation for y, z, and u.
# However, if you measure reproducible effects in other axes
# then edit the following.
AXES_TO_FIT = ["x", "y", "z", "u"]


def read_data(path):
"""Read and parse a data file.

Parameters
----------
path : `str` or `pathlib.Path`
Path to data file

Returns
-------
data : `numpy.ndarray`
        Parsed data as a structured array with fields named
        cause, x, y, z (um), u, v, w (deg),
        where "cause" holds the quantity being compensated for.
"""
data = np.loadtxt(
fname=path,
dtype=dict(names=COLUMN_NAMES, formats=(float,) * 7),
usecols=[0, 1, 2, 3, 4, 5, 6],
)
return data


def cospoly(ang, *coeffs):
"""Polynomial with x = cos(ang) and ang in degrees.

f(ang) = C0 + C1 cos(ang) + C2 cos(ang)^2 + ...
"""
x = np.cos(np.radians(ang))
poly = np.polynomial.Polynomial(coeffs)
return poly(x)


def poly(x, *coeffs):
"""Standard polynomial.

f(x) = C0 + C1 x + C2 x^2 + ...
"""
poly = np.polynomial.Polynomial(coeffs)
return poly(x)


def return_one(data):
    """Return 1; stands in for the constant term of the Fourier series."""
    return 1


def fourier(ang, *coeffs):
"""Real-valued Fourier series with ang in degrees.

f(ang) = C0 + C1 sin(ang) + C2 cos(ang) + C3 sin(2 ang) + C4 cos(2 ang) ...
"""
function_iter = itertools.cycle([np.sin, np.cos])
functions = [return_one] + [next(function_iter) for i in range(len(coeffs) - 1)]
    # Angle multipliers for C0, C1, C2, C3, ...: 0, 1, 1, 2, 2, ... (in rad/deg)
angle_multipliers = [((i + 1) // 2) * RAD_PER_DEG for i in range(len(coeffs))]

result = 0
for coeff, function, angle_multiplier in zip(coeffs, functions, angle_multipliers):
result += coeff * function(ang * angle_multiplier)
return result


def fit_one(data, cause, axis, model, ncoeffs):
"""Fit one axis of Hexapod motion vs cause to a model.

Parameters
----------
data : `numpy.ndarray`
        Data as a structured array with all fields in COLUMN_NAMES,
        where the "cause" field is typically elevation (deg).
cause : `str`
Name of cause of motion, e.g. "elevation".
axis : `str`
Name of hexapod axis to fit, e.g. "u".
model : callable
Model to fit. It must take the following arguments:

* x (positional): a numpy.ndarray of values;
most models require an angle in degrees.
* *coeffs: a sequence of coefficients
ncoeffs : `int`
Number of coefficients.

Returns
-------
coeffs : `list` [`float`]
The fit coefficients.
covariance : `numpy.ndarray`
Estimated covariance of coeffs. The diagonals provide the variance
of the parameter estimate. See the documentation for
``scipy.optimize.curve_fit`` for more information.
"""
if ncoeffs < 1:
raise ValueError("Must be at least 1 coeff")
cause_data = data["cause"]
effect_data = data[axis]
coeffs, covariance = scipy.optimize.curve_fit(
model, cause_data, effect_data, p0=[0] * ncoeffs
)
return coeffs, covariance


def fit_all_axes(
data, cause, model, ncoeffs_arr, print_covariance=False, graph_path=None
):
"""Fit and plot all AXES_TO_FIT using varying numbers of coeffs.

Parameters
----------
data : `numpy.ndarray`
        Data as a structured array with all fields in COLUMN_NAMES,
        where the "cause" field is typically elevation (deg).
cause : `str`
Name of cause being compensated for, e.g. "elevation".
        Note that `main` converts zenith distance data to elevation
        before fitting, so specify "elevation" in that case.
model : callable
Model to fit. It must take the following arguments:

* ang: a numpy.ndarray of angles, in deg
* *coeffs: a sequence of coefficients
ncoeffs_arr : `list` [`int`]
Sequence of numbers of coefficients to try.
print_covariance : `bool`, optional
Print the covariance matrix from the fit?
graph_path : `str` or `None`, optional
Path to which to save the graph.
If `None` do not save the graph and pause to show it.
"""
cause_data = data["cause"]
cause_units = UNITS_DICT[cause]
dense_cause_data = np.linspace(start=cause_data[0], stop=cause_data[-1])

nfits = len(AXES_TO_FIT)
fig, plots = plt.subplots(2, nfits, sharex=True, figsize=(25, 10))
plot_dict = {axis: (plots[0, i], plots[1, i]) for i, axis in enumerate(AXES_TO_FIT)}

for axis in AXES_TO_FIT:
print()
effect_units = UNITS_DICT[axis]
effect_data = data[axis]

data_plot, resid_plot = plot_dict[axis]
data_plot.set(ylabel=f"{axis} ({effect_units})")
        resid_plot.set(xlabel=f"{cause} ({cause_units})", ylabel=f"{axis} residuals")
data_plot.set_title(f"{axis} vs. {cause}")

for ncoeffs in ncoeffs_arr:
coeffs, covariance = fit_one(
data=data, cause=cause, axis=axis, model=model, ncoeffs=ncoeffs
)
dense_modeled_data = model(dense_cause_data, *coeffs)
data_plot.plot(dense_cause_data, dense_modeled_data)
residuals = model(cause_data, *coeffs) - effect_data
resid_plot.plot(cause_data, residuals, ".")
print(
f"{axis} coeffs: {np.array2string(coeffs, separator=', ', max_line_width=1000)}"
)
abs_residuals = np.abs(residuals)
print(
f"{axis} residuals: mean(abs) = {abs_residuals.mean():0.3g}; "
f"max(abs) = {abs_residuals.max():0.3g}; "
f"std dev = {residuals.std():0.3g}"
)
if print_covariance:
print(f"{axis} coeffs covariance = {covariance}")

# Plot data last so colors match between data and residual plots.
data_plot.plot(cause_data, effect_data, ".")

legend_labels = [f"{n} coeffs" for n in ncoeffs_arr]
resid_plot.legend(legend_labels)

fig.show()
if graph_path:
fig.savefig(graph_path)
else:
pdb.set_trace()


model_dict = dict(poly=poly, fourier=fourier, cospoly=cospoly)
model_names = sorted(model_dict.keys())


def main():
    parser = argparse.ArgumentParser("fit Hexapod compensation coefficients")
parser.add_argument(
"datafile", help="Path to data file.",
)
parser.add_argument(
"cause", help="Cause being compensated for.", choices=CAUSE_NAMES
)
parser.add_argument("model", help="Model to fit.", choices=model_names)
parser.add_argument(
"-n",
"--ncoeffs",
type=int,
help="Number of coefficients",
nargs="*",
default=(3, 5, 7, 9),
)
parser.add_argument(
"--covar",
help="Print the covariance matrix returned by the fitter?",
action="store_true",
)
parser.add_argument(
"--graphpath",
help="Path to which to save the graph. The graph is not saved, if omitted.",
)
args = parser.parse_args()

model = model_dict[args.model]
if args.cause == "temperature" and args.model != "poly":
parser.error("Only the poly model can be used to fit temperature")

data = read_data(path=args.datafile)
if args.cause == "zd":
print("Converting zenith distance to elevation")
data["cause"][:] = 90 - data["cause"]
args.cause = "elevation"
fit_all_axes(
data=data,
cause=args.cause,
model=model,
ncoeffs_arr=args.ncoeffs,
print_covariance=args.covar,
graph_path=args.graphpath,
)


if __name__ == "__main__":
    main()
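The ``fourier`` model in ``fit_data.py`` can be sanity-checked against its closed form. Here is a standalone reimplementation of the same series evaluated at one angle (a sketch for illustration, not part of the PR):

```python
import itertools
import math

import numpy as np

RAD_PER_DEG = math.pi / 180.0


def fourier(ang, *coeffs):
    """Real-valued Fourier series with ang in degrees:
    f(ang) = C0 + C1 sin(ang) + C2 cos(ang) + C3 sin(2 ang) + C4 cos(2 ang) ...
    """
    function_iter = itertools.cycle([np.sin, np.cos])
    functions = [lambda x: 1.0] + [next(function_iter) for _ in range(len(coeffs) - 1)]
    # Harmonic number for each coefficient: 0, 1, 1, 2, 2, ... (in rad/deg)
    angle_multipliers = [((i + 1) // 2) * RAD_PER_DEG for i in range(len(coeffs))]
    result = 0.0
    for coeff, function, multiplier in zip(coeffs, functions, angle_multipliers):
        result += coeff * function(ang * multiplier)
    return result


ang = 30.0  # degrees
value = fourier(ang, 1.0, 2.0, 3.0)
expected = 1.0 + 2.0 * math.sin(math.radians(ang)) + 3.0 * math.cos(math.radians(ang))
```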
23 changes: 23 additions & 0 deletions fitter/zd_camera.txt
@@ -0,0 +1,23 @@
# Camera Hexapod motion
# from https://github.com/bxin/hexrot/blob/master/LUT/Camera%20Hexapod%20Motions%20in%20Elevation%20Axis%202020%2007%2023.xlsx
# ZD dx dy dz du dv dw
# deg um um um deg deg deg
90 0.927786 641.860517 -605.175634 0.017729 -2.275764e-06 -3.441770e-05
85 0.782674 639.419713 -525.880258 0.017593 -1.451699e-06 -2.948852e-05
80 0.637150 626.915855 -447.260678 0.017188 -6.995946e-07 -2.465315e-05
75 0.492320 604.444105 -369.915237 0.016518 -2.517423e-08 -1.994838e-05
70 0.349289 572.175486 -294.432581 0.015586 5.664294e-07 -1.541003e-05
65 0.209143 530.355582 -221.387177 0.014401 1.070714e-06 -1.107263e-05
60 0.072949 479.302667 -151.334946 0.012971 1.483841e-06 -6.969198e-06
55 -0.058255 419.405286 -84.809028 0.011308 1.802667e-06 -3.130954e-06
50 -0.183472 351.119292 -22.315724 0.009424 2.024765e-06 4.128845e-07
45 -0.301748 274.964385 35.669353 0.007332 2.148445e-06 3.635348e-06
40 -0.412184 191.520149 88.704902 0.005050 2.172766e-06 6.511912e-06
35 -0.513938 101.421645 136.387290 0.002595 2.097542e-06 9.020684e-06
30 -0.606237 5.354576 178.353626 -0.000015 1.923347e-06 1.114257e-05
25 -0.688378 -95.949928 214.284521 -0.002761 1.651505e-06 1.286142e-05
20 -0.759735 -201.720880 243.906518 -0.005620 1.284086e-06 1.416416e-05
15 -0.819767 -311.153298 266.994177 -0.008571 8.238858e-07 1.504086e-05
10 -0.868015 -423.414336 283.371786 -0.011592 2.744070e-07 1.548487e-05
5 -0.904113 -537.649619 292.914701 -0.014660 -3.601685e-07 1.549279e-05
0 -0.927786 -652.989748 295.550297 -0.017752 -1.075011e-06 1.506457e-05