Merge branch 'dev' into feature/swift-facility
jchate6 committed Oct 17, 2023
2 parents 98c0498 + 7ee5576 commit 26874a3
Showing 41 changed files with 2,304 additions and 268 deletions.
128 changes: 128 additions & 0 deletions docs/managing_data/forced_photometry.rst
@@ -0,0 +1,128 @@
Integrating Forced Photometry Service Queries
---------------------------------------------------

The base TOM Toolkit comes with Atlas, panSTARRS, and ZTF query services. More services
can be added by extending the base ForcedPhotometryService implementation.


Integrating existing Forced Photometry Services
###############################################

You must add certain configuration to your TOM's ``settings.py`` to set up the existing forced
photometry services. This configuration goes in the ``FORCED_PHOTOMETRY_SERVICES`` section
shown below:

.. code:: python

    FORCED_PHOTOMETRY_SERVICES = {
        'atlas': {
            'class': 'tom_dataproducts.forced_photometry.atlas.AtlasForcedPhotometryService',
            'url': "https://fallingstar-data.com/forcedphot",
            'api_key': os.getenv('ATLAS_FORCED_PHOTOMETRY_API_KEY', 'your atlas account api token')
        },
        'panstarrs': {
            # TODO
        },
        'ztf': {
            # TODO
        }
    }

    DATA_PRODUCT_TYPES = {
        ...
        'atlas_photometry': ('atlas_photometry', 'Atlas Photometry'),
        ...
    }

    DATA_PROCESSORS = {
        ...
        'atlas_photometry': 'tom_dataproducts.processors.atlas_processor.AtlasProcessor',
        ...
    }

Configuring your TOM to serve tasks asynchronously:
***************************************************

Several of the services are best suited to asynchronous queries, especially if you plan to make large
queries that would take a long time. The TOM Toolkit is set up to use `dramatiq <https://dramatiq.io/index.html>`_
as an asynchronous task manager, but doing so requires you to run either a `redis <https://github.com/redis/redis>`_
or `rabbitmq <https://github.com/rabbitmq/rabbitmq-server>`_ server to act as the task queue. To use dramatiq with
a redis server, add the following to your ``settings.py``:

.. code:: python

    INSTALLED_APPS = [
        ...
        'django_dramatiq',
        ...
    ]

    DRAMATIQ_BROKER = {
        "BROKER": "dramatiq.brokers.redis.RedisBroker",
        "OPTIONS": {
            "url": "redis://your-redis-service-url:your-redis-port"
        },
        "MIDDLEWARE": [
            "dramatiq.middleware.AgeLimit",
            "dramatiq.middleware.TimeLimit",
            "dramatiq.middleware.Callbacks",
            "dramatiq.middleware.Retries",
            "django_dramatiq.middleware.DbConnectionsMiddleware",
        ]
    }

After adding ``django_dramatiq`` to your installed apps, you will need to run ``./manage.py migrate`` once to set up
its DB tables. If this configuration is set in your TOM, the existing services that support asynchronous queries,
Atlas and ZTF, will start querying asynchronously. If you do not add these settings, those services will still
function but will fall back to synchronous queries.


Adding a new Forced Photometry Service
######################################

The Forced Photometry services fulfill an interface defined in
`BaseForcedPhotometryService <https://github.com/TOMToolkit/tom_base/blob/dev/tom_dataproducts/forced_photometry/forced_photometry_service.py>`_.
To implement your own Forced Photometry service, you need to do 3 things:

1. Subclass ``BaseForcedPhotometryService``
2. Subclass ``BaseForcedPhotometryQueryForm``
3. Subclass ``DataProcessor``

Once those are implemented, don't forget to update your settings for ``FORCED_PHOTOMETRY_SERVICES``,
``DATA_PRODUCT_TYPES``, and ``DATA_PROCESSORS`` for your new service and its associated data product type.


Subclass BaseForcedPhotometryService:
*************************************

The most important method here is the ``query_service`` method which is where you put your service's business logic
for making the query, given the form parameters and target. This method is expected to create a DataProduct in the database
at the end of the query, storing the result file or files. If queries to your service are expected to take a long time and
you would like to make them asynchronous (not blocking the UI during the call), then follow the example in the
`atlas implementation <https://github.com/TOMToolkit/tom_base/blob/dev/tom_dataproducts/forced_photometry/atlas.py>`_ and place your
actual asynchronous query method in your module's ``tasks.py`` file so it can be found by dramatiq. Like in the atlas implementation,
your code should check to see if ``django_dramatiq`` is in the settings ``INSTALLED_APPS`` before trying to enqueue it with dramatiq.
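The enqueue-or-fall-back decision described above can be sketched with the standard library alone. In this illustration, the ``installed_apps`` list and the query function stand in for the real Django settings and dramatiq actor, so only the control flow is shown:

```python
# Stdlib-only sketch of the "enqueue with dramatiq if available, otherwise run
# synchronously" pattern used by the atlas implementation. The function names
# here are hypothetical stand-ins, not TOM Toolkit API.
def query_service_synchronously(target_name):
    # Placeholder for the real blocking remote query.
    return f"{target_name}_photometry.csv"


def submit_query(target_name, installed_apps):
    """Enqueue the query when dramatiq is configured, else run it inline."""
    if 'django_dramatiq' in installed_apps:
        # Real code would call an actor defined in tasks.py, e.g.
        # tasks.atlas_query.send(...), so a worker picks it up.
        return ('enqueued', target_name)
    # Fallback: blocking call in the request thread.
    return ('ran', query_service_synchronously(target_name))


print(submit_query('M31', ['django.contrib.admin', 'django_dramatiq']))
# ('enqueued', 'M31')
print(submit_query('M31', ['django.contrib.admin']))
# ('ran', 'M31_photometry.csv')
```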

The ``get_data_product_type`` method should return the name of your new data product type you are going to define a
DataProcessor for. This must match the name you add to ``DATA_PROCESSORS`` and ``DATA_PRODUCT_TYPES`` in your ``settings.py``.
You will also need to define a `DataProcessor <https://github.com/TOMToolkit/tom_base/blob/dev/tom_dataproducts/data_processor.py#L46>`_
for this data type.
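A skeleton of such a subclass might look like the following sketch. The stub base class stands in for ``BaseForcedPhotometryService`` so the example is self-contained, and the service name and query logic are hypothetical:

```python
# Self-contained skeleton of a forced photometry service subclass. In real
# code you would import and subclass tom_dataproducts'
# BaseForcedPhotometryService instead of this minimal stub.
class BaseForcedPhotometryService:  # stub for illustration only
    def query_service(self, query_parameters):
        raise NotImplementedError

    def get_data_product_type(self):
        raise NotImplementedError


class MyForcedPhotometryService(BaseForcedPhotometryService):
    name = 'MyService'

    def query_service(self, query_parameters):
        # Real code would call the remote service here and create a
        # DataProduct in the database storing the returned file(s).
        target = query_parameters.get('target', 'unknown')
        return f"queried {self.name} for {target}"

    def get_data_product_type(self):
        # Must match the key added to DATA_PRODUCT_TYPES and DATA_PROCESSORS.
        return 'myservice_photometry'


service = MyForcedPhotometryService()
print(service.get_data_product_type())  # myservice_photometry
```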


Subclass BaseForcedPhotometryQueryForm:
***************************************

This class defines the form users will need to fill out to query the service. It uses
`django-crispy-forms <https://django-crispy-forms.readthedocs.io/en/latest/>`_ to define the layout
programmatically. You will first add whatever form fields you need to your
subclass, then fill in the ``layout()`` method with a django-crispy-forms layout
for your fields, and optionally the ``clean()`` method if you want to perform any field validation.
The values of the fields from this form will be available to you in your service class in the
``query_service`` method.
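The shape of such a form can be sketched as follows. The base class and the crispy-forms ``Layout`` are stood in by minimal stubs so the example runs on its own, and the field names (``min_date``, ``max_date``) are hypothetical:

```python
# Self-contained sketch of a query form subclass. Real code would subclass
# BaseForcedPhotometryQueryForm and use crispy_forms.layout.Layout; the stubs
# below only illustrate the layout()/clean() structure.
class Layout:  # stands in for crispy_forms.layout.Layout
    def __init__(self, *field_names):
        self.field_names = field_names


class BaseForcedPhotometryQueryForm:  # stub for the tom_dataproducts base class
    def layout(self):
        raise NotImplementedError


class MyQueryForm(BaseForcedPhotometryQueryForm):
    # In a real form these would be django form fields declared on the class,
    # e.g. min_date = forms.FloatField(required=False, label='Min date (MJD)')
    def __init__(self, data):
        self.cleaned_data = dict(data)

    def layout(self):
        # crispy-forms layout naming the fields defined on the form.
        return Layout('min_date', 'max_date')

    def clean(self):
        cleaned = dict(self.cleaned_data)
        # Example validation: require min_date <= max_date when both are given.
        if cleaned.get('min_date') and cleaned.get('max_date'):
            if float(cleaned['min_date']) > float(cleaned['max_date']):
                raise ValueError('min_date must not exceed max_date')
        return cleaned


form = MyQueryForm({'min_date': '60000', 'max_date': '60010'})
print(form.layout().field_names)  # ('min_date', 'max_date')
```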


Subclass DataProcessor:
***********************

You must create a custom DataProcessor that knows how to convert data returned from your service into
a series of either photometry or spectroscopy datums. Without defining this step, your queries will still
result in a DataProduct file being stored from the service's ``query_service`` method, but those files will
not be parsed into photometry or spectroscopy datums. You can read more about how to implement a custom
DataProcessor `here <../customizing_data_processing>`_.
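The parsing step can be sketched like this. The base class is stubbed so the example runs on its own; a real implementation subclasses the toolkit's ``DataProcessor`` and reads the stored ``DataProduct`` file, and the CSV column names here are hypothetical:

```python
# Self-contained sketch of a custom DataProcessor that turns service output
# (here, CSV text) into photometry datums. The stub base class stands in for
# tom_dataproducts.data_processor.DataProcessor.
import csv
import io


class DataProcessor:  # stub for illustration only
    def process_data(self, data_product):
        return []


class MyServiceProcessor(DataProcessor):
    def process_data(self, csv_text):
        """Parse each CSV row into a photometry datum dictionary."""
        datums = []
        for row in csv.DictReader(io.StringIO(csv_text)):
            datums.append({
                'timestamp': float(row['mjd']),
                'magnitude': float(row['mag']),
                'error': float(row['magerr']),
                'filter': row['filter'],
            })
        return datums


sample = "mjd,mag,magerr,filter\n60000.1,18.2,0.05,r\n"
print(MyServiceProcessor().process_data(sample))
# [{'timestamp': 60000.1, 'magnitude': 18.2, 'error': 0.05, 'filter': 'r'}]
```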
4 changes: 3 additions & 1 deletion docs/managing_data/index.rst
@@ -11,6 +11,7 @@ Managing Data
customizing_data_processing
tom_direct_sharing
stream_pub_sub
forced_photometry


:doc:`Creating Plots from TOM Data <plotting_data>` - Learn how to create plots using plot.ly and your TOM
@@ -23,4 +24,5 @@ TOM from uploaded data products.

:doc:`Publish and Subscribe to a Kafka Stream <stream_pub_sub>` - Learn how to publish and subscribe to a Kafka stream topic.


:doc:`Integrating Forced Photometry Service Queries <forced_photometry>` - Learn how to integrate the existing Atlas, panSTARRS, and ZTF
forced photometry services into your TOM, and learn how to add new services.
100 changes: 71 additions & 29 deletions docs/managing_data/tom_direct_sharing.rst
@@ -1,32 +1,74 @@
Sharing Data with Other TOMs
############################

TOM Toolkit does not yet support direct sharing between TOMs, however we hope to add this functionality soon.


.. Configuring your TOM to submit data to another TOM:
.. ***************************************************
..
.. You will need to add a ``DATA_SHARING`` configuration dictionary to your ``settings.py`` that gives the credentials
.. for the various TOMs with which you wish to share data.
..
.. .. code:: python
..
.. # Define the valid data sharing destinations for your TOM.
.. DATA_SHARING = {
.. 'tom-demo-dev': {
.. 'DISPLAY_NAME': os.getenv('TOM_DEMO_DISPLAY_NAME', 'TOM Demo Dev'),
.. 'BASE_URL': os.getenv('TOM_DEMO_BASE_URL', 'http://tom-demo-dev.lco.gtn/'),
.. 'USERNAME': os.getenv('TOM_DEMO_USERNAME', 'set TOM_DEMO_USERNAME value in environment'),
.. 'PASSWORD': os.getenv('TOM_DEMO_PASSWORD', 'set TOM_DEMO_PASSWORD value in environment'),
.. },
.. 'localhost-tom': {
.. # for testing; share with yourself
.. 'DISPLAY_NAME': os.getenv('LOCALHOST_TOM_DISPLAY_NAME', 'Local'),
.. 'BASE_URL': os.getenv('LOCALHOST_TOM_BASE_URL', 'http://127.0.0.1:8000/'),
.. 'USERNAME': os.getenv('LOCALHOST_TOM_USERNAME', 'set LOCALHOST_TOM_USERNAME value in environment'),
.. 'PASSWORD': os.getenv('LOCALHOST_TOM_PASSWORD', 'set LOCALHOST_TOM_PASSWORD value in environment'),
.. }
..
.. }
..
TOM Toolkit supports direct data sharing between TOMs.


Permissions:
************
To save data to a destination TOM your TOM will need to have access to a user account on that TOM with the correct
permissions. This is handled by your TOM's administrator as described below.

.. warning:: Any user who has permission to access the relevant target or data in your TOM will have permission to
submit that data to the destination TOM once DATA_SHARING is configured.


Configuring your TOM to submit data to another TOM:
***************************************************

You will need to add a ``DATA_SHARING`` configuration dictionary to your ``settings.py`` that gives the credentials
for the various TOMs with which you wish to share data. This should be the same ``DATA_SHARING`` dictionary used to
:doc:`publish and subscribe to a Kafka stream </managing_data/stream_pub_sub>` such as `Hermes <https://hermes.lco.global>`_.

.. code:: python

    # Define the valid data sharing destinations for your TOM.
    DATA_SHARING = {
        'not-my-tom': {
            # For sharing data with another TOM
            'DISPLAY_NAME': os.getenv('NOT_MY_TOM_DISPLAY_NAME', 'Not My Tom'),
            'BASE_URL': os.getenv('NOT_MY_TOM_BASE_URL', 'http://notmytom.com/'),
            'USERNAME': os.getenv('NOT_MY_TOM_USERNAME', 'set NOT_MY_TOM_USERNAME value in environment'),
            'PASSWORD': os.getenv('NOT_MY_TOM_PASSWORD', 'set NOT_MY_TOM_PASSWORD value in environment'),
        },
        'localhost-tom': {
            # for testing; share with yourself
            'DISPLAY_NAME': os.getenv('LOCALHOST_TOM_DISPLAY_NAME', 'Local'),
            'BASE_URL': os.getenv('LOCALHOST_TOM_BASE_URL', 'http://127.0.0.1:8000/'),
            'USERNAME': os.getenv('LOCALHOST_TOM_USERNAME', 'set LOCALHOST_TOM_USERNAME value in environment'),
            'PASSWORD': os.getenv('LOCALHOST_TOM_PASSWORD', 'set LOCALHOST_TOM_PASSWORD value in environment'),
        }
    }

Receiving Shared Data:
**********************

Reduced Datums:
---------------
When your TOM receives a new ``ReducedDatum`` from another TOM, it will be saved to your TOM's database with its source
set to the name of the TOM that submitted it. Currently, only photometry data can be directly shared between
TOMs, and a ``Target`` with a matching name or alias must exist in both TOMs for sharing to take place.

Data Products:
--------------
When your TOM receives a new ``DataProduct`` from another TOM it will be saved to your TOM's database / storage and run
through the appropriate :doc:`data_processor </managing_data/customizing_data_processing>` pipeline. Only data products
associated with a ``Target`` with a name or alias that matches that of a target in the destination TOM will be shared.

Targets:
--------
When your TOM receives a new ``Target`` from another TOM it will be saved to your TOM's database. If the target's name
or alias doesn't match that of a target that already exists in the database, a new target will be created and added to a
new ``TargetList`` called "Imported from <TOM Name>".

Target Lists:
-------------
When your TOM receives a new ``TargetList`` from another TOM it will be saved to your TOM's database. If the targets in
the ``TargetList`` are also shared, but already exist in the destination TOM, they will be added to the new
``TargetList``.






5 changes: 4 additions & 1 deletion setup.py
@@ -32,12 +32,14 @@
'astroplan~=0.8',
'astropy>=5.0',
'beautifulsoup4~=4.9',
'dramatiq[redis, watch]<2.0.0',
'django>=3.1,<5', # TOM Toolkit requires db math functions
'djangorestframework~=3.12',
'django-bootstrap4>=3,<24',
'django-contrib-comments~=2.0', # Earlier version are incompatible with Django >= 3.0
'django-crispy-forms~=2.0',
'crispy-bootstrap4~=2022.0',
'django-dramatiq<1.0.0',
'django-extensions~=3.1',
'django-filter>=21,<24',
'django-gravatar2~=1.4',
@@ -52,7 +54,8 @@
'specutils~=1.8',
],
extras_require={
'test': ['factory_boy>=3.2.1,<3.4.0'],
'test': ['factory_boy>=3.2.1,<3.4.0',
'responses~=0.23'],
'docs': [
'recommonmark~=0.7',
'sphinx>=4,<8',
48 changes: 24 additions & 24 deletions tom_base/settings.py
@@ -237,30 +237,30 @@
}

# Configuration for the TOM/Kafka Stream receiving data from this TOM
DATA_SHARING = {
'hermes': {
'DISPLAY_NAME': os.getenv('HERMES_DISPLAY_NAME', 'Hermes'),
'BASE_URL': os.getenv('HERMES_BASE_URL', 'https://hermes.lco.global/'),
'CREDENTIAL_USERNAME': os.getenv('SCIMMA_CREDENTIAL_USERNAME',
'set SCIMMA_CREDENTIAL_USERNAME value in environment'),
'CREDENTIAL_PASSWORD': os.getenv('SCIMMA_CREDENTIAL_PASSWORD',
'set SCIMMA_CREDENTIAL_PASSWORD value in environment'),
'USER_TOPICS': ['hermes.test', 'tomtoolkit.test']
},
'tom-demo-dev': {
'DISPLAY_NAME': os.getenv('TOM_DEMO_DISPLAY_NAME', 'TOM Demo Dev'),
'BASE_URL': os.getenv('TOM_DEMO_BASE_URL', 'http://tom-demo-dev.lco.gtn/'),
'USERNAME': os.getenv('TOM_DEMO_USERNAME', 'set TOM_DEMO_USERNAME value in environment'),
'PASSWORD': os.getenv('TOM_DEMO_PASSWORD', 'set TOM_DEMO_PASSWORD value in environment'),
},
'localhost-tom': {
# for testing; share with yourself
'DISPLAY_NAME': os.getenv('LOCALHOST_TOM_DISPLAY_NAME', 'Local'),
'BASE_URL': os.getenv('LOCALHOST_TOM_BASE_URL', 'http://127.0.0.1:8000/'),
'USERNAME': os.getenv('LOCALHOST_TOM_USERNAME', 'set LOCALHOST_TOM_USERNAME value in environment'),
'PASSWORD': os.getenv('LOCALHOST_TOM_PASSWORD', 'set LOCALHOST_TOM_PASSWORD value in environment'),
}
}
# DATA_SHARING = {
# 'hermes': {
# 'DISPLAY_NAME': os.getenv('HERMES_DISPLAY_NAME', 'Hermes'),
# 'BASE_URL': os.getenv('HERMES_BASE_URL', 'https://hermes.lco.global/'),
# 'CREDENTIAL_USERNAME': os.getenv('SCIMMA_CREDENTIAL_USERNAME',
# 'set SCIMMA_CREDENTIAL_USERNAME value in environment'),
# 'CREDENTIAL_PASSWORD': os.getenv('SCIMMA_CREDENTIAL_PASSWORD',
# 'set SCIMMA_CREDENTIAL_PASSWORD value in environment'),
# 'USER_TOPICS': ['hermes.test', 'tomtoolkit.test']
# },
# 'tom-demo-dev': {
# 'DISPLAY_NAME': os.getenv('TOM_DEMO_DISPLAY_NAME', 'TOM Demo Dev'),
# 'BASE_URL': os.getenv('TOM_DEMO_BASE_URL', 'http://tom-demo-dev.lco.gtn/'),
# 'USERNAME': os.getenv('TOM_DEMO_USERNAME', 'set TOM_DEMO_USERNAME value in environment'),
# 'PASSWORD': os.getenv('TOM_DEMO_PASSWORD', 'set TOM_DEMO_PASSWORD value in environment'),
# },
# 'localhost-tom': {
# # for testing; share with yourself
# 'DISPLAY_NAME': os.getenv('LOCALHOST_TOM_DISPLAY_NAME', 'Local'),
# 'BASE_URL': os.getenv('LOCALHOST_TOM_BASE_URL', 'http://127.0.0.1:8000/'),
# 'USERNAME': os.getenv('LOCALHOST_TOM_USERNAME', 'set LOCALHOST_TOM_USERNAME value in environment'),
# 'PASSWORD': os.getenv('LOCALHOST_TOM_PASSWORD', 'set LOCALHOST_TOM_PASSWORD value in environment'),
# }
# }

TOM_CADENCE_STRATEGIES = [
'tom_observations.cadences.retry_failed_observations.RetryFailedObservationsStrategy',
2 changes: 1 addition & 1 deletion tom_dataproducts/alertstreams/hermes.py
@@ -136,7 +136,7 @@ def get_hermes_topics(**kwargs):
response = requests.get(url=submit_url, headers=headers)

topics = response.json()['writable_topics']
except KeyError:
except (KeyError, requests.exceptions.JSONDecodeError):
topics = settings.DATA_SHARING['hermes']['USER_TOPICS']
return topics

31 changes: 29 additions & 2 deletions tom_dataproducts/api_views.py
@@ -4,15 +4,15 @@
from guardian.shortcuts import assign_perm, get_objects_for_user
from rest_framework import status
from rest_framework.mixins import CreateModelMixin, DestroyModelMixin, ListModelMixin
from rest_framework.parsers import MultiPartParser
from rest_framework.parsers import MultiPartParser, FormParser, JSONParser
from rest_framework.response import Response
from rest_framework.viewsets import GenericViewSet

from tom_common.hooks import run_hook
from tom_dataproducts.data_processor import run_data_processor
from tom_dataproducts.filters import DataProductFilter
from tom_dataproducts.models import DataProduct, ReducedDatum
from tom_dataproducts.serializers import DataProductSerializer
from tom_dataproducts.serializers import DataProductSerializer, ReducedDatumSerializer


class DataProductViewSet(CreateModelMixin, DestroyModelMixin, ListModelMixin, GenericViewSet, PermissionListMixin):
@@ -38,6 +38,7 @@ def create(self, request, *args, **kwargs):
response = super().create(request, *args, **kwargs)

if response.status_code == status.HTTP_201_CREATED:
response.data['message'] = 'Data product successfully uploaded.'
dp = DataProduct.objects.get(pk=response.data['id'])
try:
run_hook('data_product_post_upload', dp)
@@ -68,3 +69,29 @@ def get_queryset(self):
)
else:
return get_objects_for_user(self.request.user, 'tom_dataproducts.view_dataproduct')


class ReducedDatumViewSet(CreateModelMixin, DestroyModelMixin, ListModelMixin, GenericViewSet, PermissionListMixin):
"""
Viewset for ReducedDatum objects. Supports list, create, and delete.
To view supported query parameters, please use the OPTIONS endpoint, which can be accessed through the web UI.
    **Please note that ``groups`` is an accepted query parameter for the ``CREATE`` endpoint. The ``groups`` parameter
    specifies which ``groups`` can view the created ``ReducedDatum``. If no ``groups`` are specified, the
    ``ReducedDatum`` will only be visible to the user that created it. Make sure to check your
    ``groups``!!**
"""
queryset = ReducedDatum.objects.all()
serializer_class = ReducedDatumSerializer
filter_backends = (drf_filters.DjangoFilterBackend,)
permission_required = 'tom_dataproducts.view_reduceddatum'
parser_classes = [FormParser, JSONParser]

def create(self, request, *args, **kwargs):
response = super().create(request, *args, **kwargs)

if response.status_code == status.HTTP_201_CREATED:
response.data['message'] = 'Data successfully uploaded.'

return response
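A client for the new ``ReducedDatumViewSet`` might build its request like the sketch below. The URL path and payload field names are assumptions inferred from the viewset and serializer names, not confirmed API documentation:

```python
# Hypothetical client-side sketch for the new ReducedDatum endpoint; only the
# JSON payload construction is executed here, the HTTP call is shown as a
# comment.
import json

payload = {
    'target': 'M31',
    'data_type': 'photometry',
    'timestamp': '2023-10-17T00:00:00Z',
    'value': {'magnitude': 18.2, 'error': 0.05, 'filter': 'r'},
}
body = json.dumps(payload)  # JSON body accepted by the viewset's JSONParser
print(json.loads(body)['data_type'])  # photometry
# A real client would POST this with an auth token, for example:
# requests.post('https://your-tom/api/reduceddatums/', json=payload,
#               headers={'Authorization': 'Token <your-api-token>'})
```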
