542 xarray (#543)
* added sarus doc 🍔

* enhanced doc 🎏

* added conda-prefix in dockerfile to have correct set env-var in sarus container 👨

* bugfix gpu-testing 📒

* skip pinocchio-test because it does not seem to work properly on every platform 😖

* integrated user-build testing in workflow 8️⃣

* updated dockerfile-dev 🎅

* removed test-user-package because testing is now happening before pushing the image 👠

* adapted docker-dev 🍹

* Revert "removed test-user-package because testing is now happening before pushing the image 👠"

This reverts commit a828dfd.

* updated dockerfile-dev 🔌

* changed dev-workflow for testing 💀

* removed workflow-run trigger of dev-image because out of memory 🚈

* added mpich std-loc in Dockerfile.user 🍞

* adapted docker-user ⛎

* minor bugfix dockerfile-user 📥

* updated environments-files 🎇

* updated mpi-version in dockerfile 🆎

* updated Dockerfile-user structure 👾

* added mpi-tests to karabo 🌱

* updated requirements.txt with new versions 🎼

* integrated mpi-tests into github workflow and codecov-report 🆖

* minor fix using pytest-cov instead of coverage in github workflow 🌊

* fixed coverage-files discovery (I think) 🏧

* adapted codecov workflow according to github.com/codecov/codecov-action 🎲

* adapted container-doc 🍗

* bugfix Dockerfile mpi-compilation needs python 🍚

* minor adjustments in Dockerfile.user 😉

* added mpi-supported h5py wheel as karabo dependency 🍟

* removed pinocchio from environment.yaml 🔘

* updated environment.yaml with mpich-deps 🈴

* updated conda-build files 🚟

* replaced np.object0 with the corresponding non-deprecated alias 🐳

* minor bugfix in pyproject.toml 🙇

* deleted test-pinocchio (wrongly merged) 🍭

* updated environment files 🐎

* improved dev-setup 🔉

* removed requirements.txt :godmode:

* added ipykernel to dev-deps 🚙

* bugfix filter-sources for xarray>2023.2 🔦

* updated build-procedure ⛪

* implemented dynamic versioneering 😔

* removed remaining requirements.txt stuff 👜

* updated Dockerfile-user mpi-installation to be dependent on karabo-user installation version 🛀

* refactored dockerfiles 🐭

* added versioneer to meta.yaml build-stage 📚

* added versioneer to meta.yaml build-stage 🚕

* bugfix conda-build meta.yaml for versioneer 💩

* adapted conda-build to custom pkg-version 😤

* adapted Dockerfile.user to a testable (before release) setup 👸

* bugfix build-user-image.yml ➿

* bugfix build-user-image.yml 🚧

* updated Dockerfile-user 👏

* bugfix user-dockerfile 🕘

* bugfix Dockerfile-dev 🔖

* minor doc-update in dockerfile-dev 🚸

* updated description of build-user-image workflow inputs 🔫

* introduced venv in dockerfile-user to not f*** up base env 😌

* updated dev-img with venv 😓

* updated build-user-image workflow to be able to run on workflow-dispatch 🐬

* added file to build-and-export-docker action 📌

* bugfix build-args in build-user-image workflow 👨

* bugfix get-shallow git-repo from git-rev (not only branches or tags) 📷

* bugfix remote from https and not from ssh 🎮

* updated workflow-dispatch descriptions ‼️

* bugfix build-args passing to docker-build-push action 🆘

* throw exit-code in dockerfile-user if build not set correctly 🎵

* added dev-flag to conda-build 🌹

* added security to dev-builds in conda-build workflow 🛄

* bugfix dev-evaluation in conda-build.yml 🈂️

* commented out mpich-compilation in Dockerfile 😥

* bugfix conda-build export of env-vars in same step ⛺

* added failing tests if PR is draft 🆖

* setup build-user-img, conda-build & test-user-package for dev-testing 🔍

* added df-h to build-user-image workflow 🎃

* made docker-build by myself ✂️

* made docker-build by myself entirely 🚥

* ensured to push on ghcr.io ✈️

* bugfix docker-push user-image workflow 💘

* bugfix docker-push user-image workflow 📢

* bugfix added img-name to push image 🚸

* bugfix registry docker push 🍎

* adapted docker-img address accordingly to ghcr.io 🎩

* adapted build-user-image to standard 🐗

* added pytest installation to docker-user image testing workflow 🎢

* bugfix pytest-call in user-image 💴

* added bash shell in docker-run in build-test-user-image 👥

* adapted ld-library-path to base-image 2️⃣

* made site-package-location identification in build-user-image more robust 📵

* defined entrypoint to Dockerfile-user ➡️

* bugfix calling tests in build-user-image 🐻

* readded conda-activate-karabo to .bashrc for interactive mode 💓

* bugfix removed unnecessary " at the end of docker-run 🚅

* set env-vars in test-user-image ♓

* minor changes 🎵

* adapted test-workflow to main-setup 💡

* adapted docker-img dev to karabo-venv 📄

* adapted mpi-doc ❇️

* addressed mypy-issues 6️⃣

* added type-ignore to __init__.py because mypy can't handle that 🍥

* adapted exclude-option in setup.cfg to hopefully ignore __init__.py on the runners 🐽

* adapted mypy-exclude-regex to exclude all __init__.py ⏰

* hopefully bugfix to ignore __init__.py by mypy 🍟

* trying editable install to avoid duplicate modules 😓

* added verbose-flag to pytest-testing of docker-image 📫

* removed pytest-mpi as dep and added --only-mpi flag handling in conftest.py 👪

* minor changes in pytest test-discovery 🎁

* removed mpi-pytest from codecov because of race-condition issue which doesn't seem to be solvable atm 👬

* bugfix compute filter-mask in filter-sky 👫

* ugly hotfix to initialize dask-mpi inside docker-container 🏯

* added mpi-tests to docker-user tests 💀

* bugfix only enter dask-initialize with mpi if mpirun 🐚

* bugfix conda-build inherited env-vars [skip ci] 👚

* bugfix conda-build inherited env-vars [skip ci] 🚑

* bugfix conda-build set output-vars [skip ci] 🌈

* removed notebook-test from user-img test because data-dirs are not part of package and therefore would cause an error [skip ci] 🕠

* bugfix str-to-boolean casting for reproducible github-workflows 🔩

* added debugging logs to github workflows [skip ci] ☎️

* adapted meta.yaml to pin compatible numpy ❓

* updated conda-build to be closer to a best-practice build 😿

* fixed bluebild-version 🌅

* removed echos and improved build-user-image workflow 🔻

* removed build-string default to ensure that the workflow-user knows what they are doing 🐟

* minor update in container.md ⏩

* bugfix check leading v in tag in conda-build workflow 🍙

* bugfix set dev-string check correctly in conda-build.yml 🚷

* added versioneer to dev-dep ⚪

* improved version definition security for conda-build workflows ♒

* minor improvements in conda-build ⤵️

* hopefully bugfix of github boolean passing in reusable workflow see #1483 of github runners 🎈

* improved version definition security for conda-build workflows ↪️

* added little verbosity to conda-build workflow 📍

* hopefully bugfix to trigger build-docker by taking boolean values directly from input 👂

* bugfix build-user-image.yml 🛅

* minor improvements in build-user-image.yml 🎫

* replaced exit 2 with exit 1 in all bash scripts 3️⃣

* bugfix install environment in user Dockerfile correctly 💙

* removed weird leading v to github-workflows version-args 🍚

* updated codecov.yml to not fail if below target 🎰

* bugfix: added python interpreter and versioneer through conda for checking conda-build worklow inputs 🇷🇺

* added conda-prefix to python interpreter to hopefully get viable binary Ⓜ️

* added conda-prefix to python interpreter to hopefully get viable binary :bowtie:

* removed dev-deps in environment.yaml and meta.yaml ⌛

* bugfix: removed bdsf dev-dep 💒

* addressed mypy-issues 🚝

* updated mypy-complaints chunks-dict issue in sky-model 🚞

* fixed build-nr of feedstock-deps 🐹

* addresses mypy attr-defined for matplotlib BLUE 😾

* renamed build-user-image to build-docker-image 🐚

* adapted readme-badges 🍰

* addressed pr-request to install mpich via apt 🐒

* adapted documentation and dockerfile-steps 🍶

* bugfix added -y to apt install in Dockerfile [skip ci] ↩️

* improved docs ♈

* changed Dockerfile setup to use andromeda user instead of root 😉

* minor update in dockerfile 🔛

* changed run-stages of dockerfile 🍲

* removed user-changing because of singularity uid issues 🔫

* moved Dockerfile to root & removed docker-dir 🚌

* added bash-env to env-vars for singularity noninteractive-shell [skip ci] 🕣

* bugfix: correctly activate venv in docker & singularity container for interactive and non-interactive shells 💛

* fixed conda activate functionality for docker & singularity interactive and non-interactive shells 💉

* removed unnecessary sourcing in dockerfile [skip ci] 🔣

* added ldconfig after mpich-installation in dockerfile 👤

* minor changes in dockerfile [skip ci] 🐟

* commented out mpi-hook in dockerfile because it's still error-prone 📢

* adapted container-doc ↩️

* bugfix: in test docker-image to enable --only-mpi custom flag for pytest 📓

* added karabo shared lib to ldconfig cache to enable native cscs-mpi-hook 🕡

* replaced weird file-handler root-dir setup with /tmp with honor of TMP, TMPDIR & TEMP 🌐

* changed tmp-dir-name setup to avoid collisions 🐂

* loosened mpich-version constraints because we no longer rely on apt to install mpich ◻️

* updated mpi-doc 😡

* minor bugfix for mpi-tests in ci :hurtrealbad:

* added scratch as a possible tmpdir in FileHandler ⏰

* bugfix get-tmp-dir 🐲

* added cachetools as dep 😇

* redesigned FileHandler for short- and long-term-memory caching 😡

* improved file-handler by getting unique tmp-dir per unique object 💥

* refactored image and imager to new filehandler-setup 😔

* adapted interferometer.py and telescope.py to new FileHandler setup ↪️

* adapted visibility and sourcedetection to new FileHandler setup 🙌

* enhanced FileHandler get-tmp-dir with subdir & mkdir option 🚘

* adapted Karabo to new FileHandler setup 🐍

* bugfix accessing .ltm and .stm through FileHandler 👅

* bugfixes FileHandler & according tests 📝

* changed Downloadobject-storage cache from site-packages to tmp 🌁

* removed weird KaraboResource pseudo-interface from repo ♥️

* removed FileHandler get-tmp-dir subdir option because seems unnecessary ⛽

* implemented seed-option to FileHandler.get-tmp-dir 🍢

* changed ltm & stm of FileHandler from static functions to lazy class-attributes 📄

* intermediate commit separating Dask & Slurm concerns in dask.py 🕔

* adapted other karabo-files to new DaskHandler setup 🚨

* readded accidentally removed plotting-util.py 👡

* bugfix dask-usage 🍀

* minor bugfix in test-dask 🚬

* refactored create_baseline_cut_telescope to improve disk-caching & be less error-prone 😷

* bugfixes in Telescope.create-baseline-cut-telescope 🔑

* addressed mypy-issues 😐

* updated singularity-doc 🍓

* addressed PR-requests 🎍

* addressed PR526 requests 🍀

* bugfix removed set libmamba-solver globally in installation-user.md 📦

* made plot-function api more consistent 😰

* refactored DaskHandler to use just a single DaskHandler class for function-calling purpose 🎼

* bugfix removed fetch-dask-handler function from karabo 🎁

* addressed PR540 requests 🐻

* bugfix seeding ltm-memory 🎵

* improved api-signatures & bugfix in imager and result 🆖

* constrained xarray <=2023.2 (see issue #542) 💒

* bugfix cls to classmethod 🐐

* bugfix kwargs quiet passing in detect-sources-in-images 🌱

* removed guess-beam-parameters from result.py because it doesn't work anymore 🌀

* removed fixing of build-nr to enable dependency-constraint updates for past releases 💉

* reverted xarray-constraint because it's an issue of ska-sdp-func-python and not of Karabo ⚠️

* added imaging-rascil test to test-image 💯

* enhanced environment.yaml description 🈴

* added guess-beam-parameters again after removal with improvements ♐

* relocated and enhanced guess-beam-parameters and its usage 📔
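The tmp-dir handling mentioned in the commits above (honoring TMP, TMPDIR & TEMP instead of a hard-coded /tmp) mirrors what Python's stdlib already does. A minimal sketch of such a resolution order — the exact precedence and dir-naming in Karabo's FileHandler may differ:

```python
import os
import tempfile


def resolve_tmp_root() -> str:
    """Pick a tmp root, honoring TMPDIR, TEMP and TMP (sketch only;
    Karabo's FileHandler may use a different precedence)."""
    for var in ("TMPDIR", "TEMP", "TMP"):
        candidate = os.environ.get(var)
        if candidate and os.path.isdir(candidate):
            return candidate
    # tempfile.gettempdir() itself falls back through the same env-vars
    # before defaulting to /tmp (or a platform equivalent).
    return tempfile.gettempdir()


print(resolve_tmp_root())
```

This also shows why relying on `tempfile.gettempdir()` alone is often enough: it already consults TMPDIR, TEMP and TMP in that order.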
Lukas113 committed Feb 26, 2024
1 parent e1b75fa commit d8dcf2e
Showing 13 changed files with 427 additions and 301 deletions.
26 changes: 13 additions & 13 deletions conda/meta.yaml
@@ -23,39 +23,39 @@ requirements:
- versioneer
run: # constrain-notes see `environment.yaml`
- python {{ python }}
- aratmospy =1.0.0=*_0
- aratmospy =1.0.0
- astropy
- bdsf =1.10.2=*_0
- bluebild =0.1.0=*_0
- bdsf =1.10.2
- bluebild =0.1.0
- cuda-cudart
- dask =2022.12.1
- dask-mpi
- distributed
- eidos =1.1.0=*_0
- eidos =1.1.0
- healpy
- h5py =*=mpi_mpich*
- ipython
- katbeam =0.1.0=*_0
- katbeam =0.1.0
- libcufft
- matplotlib
- montagepy =6.0.0=*_0
- montagepy =6.0.0
- mpi4py
- mpich
- nbformat
- nbconvert
- {{ pin_compatible('numpy') }}
- oskarpy =2.8.3=*_0
- oskarpy =2.8.3
- pandas
- psutil
- rascil =1.0.0=*_0
- rascil =1.0.0
- reproject >=0.9,<=10.0
- requests
- scipy >=1.10.1
- ska-gridder-nifty-cuda =0.3.0=*_0
- ska-sdp-datamodels =0.1.3=*_0
- ska-sdp-func-python =0.1.4=*_0
- tools21cm =2.0.2=*_0
- xarray >=2022.10.0
- ska-gridder-nifty-cuda =0.3.0
- ska-sdp-datamodels =0.1.3
- ska-sdp-func-python =0.1.4
- tools21cm =2.0.2
- xarray >=2022.11
# transversal dependencies which we need to reference to get mpi-wheels
- conda-forge::fftw =*=mpi_mpich*

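The `{{ pin_compatible('numpy') }}` entry above makes conda-build emit a run-constraint derived from the numpy version present at build time. Roughly, with default arguments, the lower bound is the build-time version and the upper bound is the next major version. A simplified re-implementation (a sketch; conda-build's real `pin_compatible` also handles `lower_bound`/`upper_bound` overrides and non-numeric segments):

```python
def pin_compatible_sketch(version: str, max_pin: str = "x") -> str:
    """Approximate conda-build's pin_compatible default behavior:
    lower bound = exact build-time version, upper bound = version bumped
    at the last segment max_pin covers ("x" -> major, "x.x" -> minor)."""
    parts = [int(p) for p in version.split(".")]
    keep = len(max_pin.split("."))  # number of segments to keep
    upper = parts[:keep]
    upper[-1] += 1  # bump the last kept segment
    upper_str = ".".join(str(p) for p in upper)
    return f">={version},<{upper_str}"


print(pin_compatible_sketch("1.24.3"))  # -> ">=1.24.3,<2"
```

So a package built against numpy 1.24.3 would get the run-constraint `>=1.24.3,<2`, keeping the ABI-compatible range open without pinning the exact build.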
32 changes: 18 additions & 14 deletions environment.yaml
@@ -2,41 +2,45 @@ channels:
- i4ds
- nvidia/label/cuda-11.7.0
- conda-forge
dependencies: # package-version & build-number of Karabo-Feedstock deps should be fixed (see PR #526)
dependencies:
# Set dependencies and their constraints in the package they belong to. Otherwise you just cause chaos for future builds.
# Just Karabo's direct dependencies and Karabo constraints (from our code) should be handled here.
# Dependencies with unstable APIs (this usually includes Karabo-Feedstock) should be fixed.
# Don't fix anything regarding build-string (except mpi) here. There's a lot you could do wrong.
- python =3.9
- aratmospy =1.0.0=*_0
- aratmospy =1.0.0
- astropy
- bdsf =1.10.2=*_0
- bluebild =0.1.0=*_0
- bdsf =1.10.2
- bluebild =0.1.0
- cuda-cudart
- dask =2022.12.1
- dask-mpi
- distributed
- eidos =1.1.0=*_0
- eidos =1.1.0
- healpy
- h5py =*=mpi_mpich*
- ipython
- katbeam =0.1.0=*_0
- katbeam =0.1.0
- libcufft
- matplotlib
- montagepy =6.0.0=*_0
- montagepy =6.0.0
- mpi4py
- mpich
- nbformat
- nbconvert
- numpy >=1.21, !=1.24.0
- oskarpy =2.8.3=*_0
- oskarpy =2.8.3
- pandas
- psutil
- rascil =1.0.0=*_0
- rascil =1.0.0
- reproject >=0.9,<=10.0
- requests
- scipy >=1.10.1
- ska-gridder-nifty-cuda =0.3.0=*_0
- ska-sdp-datamodels =0.1.3=*_0
- ska-sdp-func-python =0.1.4=*_0
- tools21cm =2.0.2=*_0
- xarray >=2022.10.0
- ska-gridder-nifty-cuda =0.3.0
- ska-sdp-datamodels =0.1.3
- ska-sdp-func-python =0.1.4
- tools21cm =2.0.2
- xarray >=2022.11
# transversal dependencies which we need to reference to get mpi-wheels
# casacore has only no-mpi & open-mpi wheels, but no mpich-wheel
- conda-forge::fftw =*=mpi_mpich* # oskarpy(oskar(casacore)), tools21cm, bluebild(finufft) -> from conda-forge to ignore channel-prio & not take our legacy fftw-wheel
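A conda dependency line carries up to three `=`-separated parts: name, version, and build-string — which is how `h5py =*=mpi_mpich*` selects the mpich variant while the former `=*_0` build-string pins (dropped in this diff) froze a specific feedstock build. A hypothetical helper splitting such `=`-pinned specs (illustration only; conda itself uses `MatchSpec`, and this sketch does not handle `>=`-style constraints):

```python
from typing import Optional, Tuple


def split_conda_spec(spec: str) -> Tuple[str, Optional[str], Optional[str]]:
    """Split an '='-pinned conda dependency line like 'h5py =*=mpi_mpich*'
    into (name, version, build_string). Hypothetical helper for
    illustration; not a replacement for conda's MatchSpec."""
    name, _, constraint = spec.partition("=")
    version, _, build = constraint.partition("=")
    return name.strip(), version.strip() or None, build.strip() or None


print(split_conda_spec("h5py =*=mpi_mpich*"))  # ('h5py', '*', 'mpi_mpich*')
```

The `mpi_mpich*` build-string glob is what forces the mpi-enabled wheel; the version stays `*` so any compatible release can satisfy it.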
302 changes: 126 additions & 176 deletions karabo/examples/source_detection.ipynb

Large diffs are not rendered by default.

41 changes: 35 additions & 6 deletions karabo/imaging/image.py
@@ -19,6 +19,7 @@
import matplotlib.pyplot as plt
import numpy as np
from astropy.io import fits
from astropy.io.fits.header import Header
from astropy.nddata import Cutout2D, NDData
from astropy.wcs import WCS
from numpy.typing import NDArray
@@ -27,7 +28,7 @@
from reproject.mosaicking import find_optimal_celestial_wcs, reproject_and_coadd
from scipy.interpolate import RegularGridInterpolator

from karabo.util._types import FilePathType
from karabo.util._types import BeamType, FilePathType
from karabo.util.file_handler import FileHandler, assert_valid_ending
from karabo.util.plotting_util import get_slices

@@ -56,7 +57,7 @@ def __init__(
*,
path: Literal[None] = None,
data: NDArray[np.float_],
header: fits.header.Header,
header: Header,
**kwargs: Any,
) -> None:
...
@@ -66,9 +67,10 @@ def __init__(
*,
path: Optional[FilePathType] = None,
data: Optional[NDArray[np.float_]] = None,
header: Optional[fits.header.Header] = None,
header: Optional[Header] = None,
**kwargs: Any,
) -> None:
self.header: Header
if path is not None and (data is None and header is None):
self.path = path
self.data, self.header = fits.getdata(
@@ -208,10 +210,10 @@ def cutout(

@staticmethod
def update_header_from_image_header(
new_header: fits.header.Header,
old_header: fits.header.Header,
new_header: Header,
old_header: Header,
keys_to_copy: Optional[List[str]] = None,
) -> fits.header.Header:
) -> Header:
if keys_to_copy is None:
keys_to_copy = [
"CTYPE3",
@@ -417,6 +419,33 @@ def has_beam_parameters(self) -> bool:
["BMAJ", "BMIN", "BPA"],
)

def get_beam_parameters(self) -> BeamType:
"""Gets the beam-parameters from the image-header.
"bmaj": FWHM of the major axis of the elliptical Gaussian beam in arcsec
"bmin": FWHM of the minor axis of the elliptical Gaussian beam in arcsec
"bpa": position angle of the major axis of the elliptical Gaussian beam in
degrees, counter-clock from the North direction
Returns:
"bmaj" (arcsec), "bmin" (arcsec), "bpa" (deg)
"""
try:
bmaj = float(self.header["BMAJ"])
bmin = float(self.header["BMIN"])
bpa = float(self.header["BPA"])
except Exception as e:
raise RuntimeError(
f"No beam-parameters 'BMAJ', 'BMIN', 'BPA' found in {self.path}. "
+ "Use `has_beam_parameters` for safe use of this function."
) from e
beam: BeamType = {
"bmaj": bmaj,
"bmin": bmin,
"bpa": bpa,
}
return beam

def get_quality_metric(self) -> Dict[str, Any]:
"""
Get image statistics.
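The new `get_beam_parameters` above reads BMAJ/BMIN/BPA from the FITS header and raises a chained `RuntimeError` when any key is missing. A sketch of the same pattern, using a plain dict as a stand-in for the astropy `Header` (the real method works on `self.header` and reports `self.path` in the error):

```python
from typing import Dict


def read_beam_parameters(header: Dict[str, float]) -> Dict[str, float]:
    """Mirror of Image.get_beam_parameters using a dict stand-in for the
    FITS header (sketch only)."""
    try:
        return {
            "bmaj": float(header["BMAJ"]),
            "bmin": float(header["BMIN"]),
            "bpa": float(header["BPA"]),
        }
    except KeyError as e:
        # chain the KeyError so the missing card stays visible
        raise RuntimeError(
            "No beam-parameters 'BMAJ', 'BMIN', 'BPA' found in header."
        ) from e


beam = read_beam_parameters({"BMAJ": 2.0e-3, "BMIN": 1.5e-3, "BPA": 45.0})
```

Guarding the call with `has_beam_parameters()` first, as the docstring suggests, avoids the exception path entirely.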
106 changes: 104 additions & 2 deletions karabo/imaging/imager.py
@@ -1,7 +1,11 @@
from __future__ import annotations

import os
import warnings
from typing import Dict, List, Literal, Optional, Tuple, Union

import numpy as np
from astropy.modeling import fitting, models
from astropy.wcs import WCS
from distributed import Client
from numpy.typing import NDArray
@@ -12,6 +16,7 @@
create_visibility_from_ms_rsexecute,
)
from rascil.workflows.rsexecute.execution_support import rsexecute
from scipy.optimize import minpack
from ska_sdp_datamodels.image.image_model import Image as SkaSdpImage
from ska_sdp_datamodels.science_data_model import PolarisationFrame
from ska_sdp_func_python.image import image_gather_channels
@@ -26,9 +31,10 @@
from karabo.imaging.image import Image
from karabo.simulation.sky_model import SkyModel
from karabo.simulation.visibility import Visibility
from karabo.util._types import FilePathType
from karabo.util._types import BeamType, FilePathType
from karabo.util.dask import DaskHandler
from karabo.util.file_handler import FileHandler, assert_valid_ending
from karabo.warning import KaraboWarning

ImageContextType = Literal["awprojection", "2d", "ng", "wg"]
CleanAlgorithmType = Literal["hogbom", "msclean", "mmclean"]
@@ -355,7 +361,10 @@ def imaging_rascil(
if not client:
client = DaskHandler.get_dask_client()
print(client.cluster)
rsexecute.set_client(use_dask=use_dask, client=client, use_dlg=False)
rsexecute.set_client(client=client, use_dask=use_dask, use_dlg=False)
else:  # set use_dask to False through `set_client`, because it's the
# only way to disable dask for the `rsexecute` singleton
rsexecute.set_client(client=None, use_dask=False, use_dlg=False)
# Set CUDA parameters
if use_cuda:
if img_context != "wg":
@@ -516,3 +525,96 @@ def radian_degree(rad: float) -> float:
img_coords = np.array([px, py])

return img_coords, idxs

@classmethod
def _convert_clean_beam_to_degrees(
cls,
im: Image,
beam_pixels: tuple[float, float, float],
) -> BeamType:
"""Convert clean beam in pixels to arcsec, arcsec, degree.
Source: https://gitlab.com/ska-telescope/sdp/ska-sdp-func-python/-/blob/main/src/ska_sdp_func_python/image/operations.py # noqa: E501
Args:
im: Image
beam_pixels: Beam size in pixels
Returns:
"bmaj" (arcsec), "bmin" (arcsec), "bpa" (degree)
"""
cellsize = im.get_cellsize()
to_mm: np.float64 = np.sqrt(8.0 * np.log(2.0))
clean_beam: BeamType
if beam_pixels[1] > beam_pixels[0]:
clean_beam = {
"bmaj": np.rad2deg(beam_pixels[1] * cellsize * to_mm),
"bmin": np.rad2deg(beam_pixels[0] * cellsize * to_mm),
"bpa": np.rad2deg(beam_pixels[2]),
}
else:
clean_beam = {
"bmaj": np.rad2deg(beam_pixels[0] * cellsize * to_mm),
"bmin": np.rad2deg(beam_pixels[1] * cellsize * to_mm),
"bpa": np.rad2deg(beam_pixels[2]) + 90.0,
}
return clean_beam
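The conversion above can be checked in isolation: Gaussian stddevs in pixels become FWHMs via sigma * sqrt(8 ln 2), scaled by the cellsize (radians per pixel) and converted to degrees, with the larger axis always reported as bmaj. A stdlib-only mirror of that logic (a sketch; the real classmethod reads the cellsize from the `Image`):

```python
import math
from typing import Dict, Tuple


def beam_pixels_to_degrees(
    beam_pixels: Tuple[float, float, float], cellsize_rad: float
) -> Dict[str, float]:
    """Mirror of Imager._convert_clean_beam_to_degrees (sketch):
    (x_stddev_px, y_stddev_px, theta_rad) -> bmaj/bmin/bpa in degrees."""
    to_fwhm = math.sqrt(8.0 * math.log(2.0))  # stddev -> FWHM factor
    sx, sy, theta = beam_pixels
    if sy > sx:
        bmaj_px, bmin_px, bpa = sy, sx, math.degrees(theta)
    else:
        # axes swapped: rotate the position angle by 90 degrees
        bmaj_px, bmin_px, bpa = sx, sy, math.degrees(theta) + 90.0
    return {
        "bmaj": math.degrees(bmaj_px * cellsize_rad * to_fwhm),
        "bmin": math.degrees(bmin_px * cellsize_rad * to_fwhm),
        "bpa": bpa,
    }
```

For example, stddevs of (2, 1) pixels at 1e-5 rad/px give a bmaj roughly twice bmin, with the position angle rotated by 90 degrees because the first axis is the major one.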

@classmethod
def guess_beam_parameters(cls, img: Image) -> BeamType:
"""Fit a two-dimensional Gaussian to img using astropy.modeling.
This function is usually applied to a PSF-image. Therefore, only
images that don't have beam-params in the header (e.g. a dirty image) may need a
beam-guess.
Source: https://gitlab.com/ska-telescope/sdp/ska-sdp-func-python/-/blob/main/src/ska_sdp_func_python/image/deconvolution.py # noqa: E501
Args:
img: Image to guess the beam
Returns:
major-axis (arcsec), minor-axis (arcsec), position-angle (degree)
"""
if img.has_beam_parameters():
warnings.warn(
f"Image {img.path} already has beam-info in the header.",
KaraboWarning,
)
npixel = img.data.shape[3]
sl = slice(npixel // 2 - 7, npixel // 2 + 8)
y, x = np.mgrid[sl, sl]
z = img.data[0, 0, sl, sl]

# isotropic at the moment!
try:
p_init = models.Gaussian2D(
amplitude=np.max(z), x_mean=np.mean(x), y_mean=np.mean(y)
)
fit_p = fitting.LevMarLSQFitter()
with warnings.catch_warnings():
# Ignore model linearity warning from the fitter
warnings.simplefilter("ignore")
fit = fit_p(p_init, x, y, z)
if fit.x_stddev <= 0.0 or fit.y_stddev <= 0.0:
warnings.warn(
"guess_beam_parameters: error in fitting to psf, "
+ "using 1 pixel stddev"
)
beam_pixels = (1.0, 1.0, 0.0)
else:
beam_pixels = (
fit.x_stddev.value,
fit.y_stddev.value,
fit.theta.value,
)
except minpack.error:
warnings.warn("guess_beam_parameters: minpack error, using 1 pixel stddev")
beam_pixels = (1.0, 1.0, 0.0)
except ValueError:
warnings.warn(
"guess_beam_parameters: warning in fit to psf, using 1 pixel stddev"
)
beam_pixels = (1.0, 1.0, 0.0)

return cls._convert_clean_beam_to_degrees(img, beam_pixels)
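`guess_beam_parameters` above fits an astropy `Gaussian2D` to a 15x15 patch around the image center and falls back to a 1-pixel stddev when the fit fails. For a clean, single-peaked patch, an intensity-weighted second-moment estimate gives comparable widths; this is a simplified stand-in for the LevMar fit, not the method itself:

```python
import numpy as np


def estimate_psf_stddev(z: np.ndarray) -> tuple:
    """Estimate x/y stddev of a single-peaked, non-negative patch via
    intensity-weighted second moments (simplified stand-in for the
    astropy LevMarLSQFitter fit; assumes no strong sidelobes)."""
    z = np.clip(np.asarray(z, dtype=float), 0.0, None)
    total = z.sum()
    y, x = np.mgrid[: z.shape[0], : z.shape[1]]
    x_mean = (x * z).sum() / total
    y_mean = (y * z).sum() / total
    x_std = np.sqrt(((x - x_mean) ** 2 * z).sum() / total)
    y_std = np.sqrt(((y - y_mean) ** 2 * z).sum() / total)
    return x_std, y_std


# synthetic circular Gaussian "PSF", sigma = 2 px, centered at (7, 7)
yy, xx = np.mgrid[0:15, 0:15]
psf = np.exp(-((xx - 7.0) ** 2 + (yy - 7.0) ** 2) / (2.0 * 2.0**2))
sx, sy = estimate_psf_stddev(psf)
```

On this synthetic patch both estimates land close to the true sigma of 2 pixels; real PSFs with sidelobes are exactly why the production code prefers a proper least-squares fit.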
3 changes: 1 addition & 2 deletions karabo/performance_test/time_karabo.py
@@ -4,7 +4,6 @@
import numpy as np
from numpy.typing import NDArray

from karabo.error import KaraboError
from karabo.imaging.imager import Imager
from karabo.simulation.interferometer import InterferometerSimulation
from karabo.simulation.observation import Observation
@@ -125,7 +124,7 @@ def main(n_random_sources: int) -> None:
# Source detection
detection_result = PyBDSFSourceDetectionResult.detect_sources_in_image(restored)
if detection_result is None:
raise KaraboError("`detection_result` is None.")
raise ValueError("`detection_result` is None.")

ground_truth, sky_idxs = Imager.project_sky_to_image(
sky=sky,
4 changes: 2 additions & 2 deletions karabo/simulation/telescope.py
@@ -207,12 +207,12 @@ def constructor(
)
try:
configuration = create_named_configuration(name)
except ValueError:
except ValueError as e:
raise ValueError(
f"""Requested telescope {name} is not supported by this backend.
For more details, see
https://gitlab.com/ska-telescope/sdp/ska-sdp-datamodels/-/blob/d6dcce6288a7bf6d9ce63ab16e799977723e7ae5/src/ska_sdp_datamodels/configuration/config_create.py""" # noqa
)
) from e

config_earth_location = configuration.location
telescope = Telescope(
Expand Down
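The `raise ... from e` change above preserves the backend's original `ValueError` as `__cause__` instead of discarding it, so the root cause stays in the traceback. A minimal sketch of the pattern (stub lookup and names are hypothetical, not Karabo's API):

```python
def create_named_configuration_stub(name: str) -> str:
    """Hypothetical stand-in for the backend's configuration lookup."""
    known = {"MID", "LOW"}
    if name not in known:
        raise ValueError(f"Unknown configuration {name!r}")
    return name


def constructor(name: str) -> str:
    """Re-raise with a friendlier message while chaining the original
    error via 'from e', mirroring the telescope.py change above."""
    try:
        return create_named_configuration_stub(name)
    except ValueError as e:
        raise ValueError(
            f"Requested telescope {name} is not supported by this backend."
        ) from e
```

Without `from e`, Python would still show the original error as "during handling of the above exception" context, but explicit chaining marks it as the direct cause and survives `raise ... from None` audits.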
