Merge pull request #186 from simonsobs/develop
Release v0.7.1
BrianJKoopman committed Oct 29, 2020
2 parents c8c7979 + 13d49cc commit d8dcb41
Showing 59 changed files with 3,584 additions and 351 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/develop.yml
@@ -25,7 +25,7 @@ jobs:
- name: Test with pytest within a docker container
run: |
docker run -v $PWD:/coverage --rm ocs sh -c "COVERAGE_FILE=/coverage/.coverage.docker python3 -m pytest -p no:wampy --cov /app/ocs/ocs/ ./tests/"
docker run -v $PWD:/coverage --rm ocs sh -c "COVERAGE_FILE=/coverage/.coverage.docker python3 -m pytest -p no:wampy -m 'not integtest' --cov /app/ocs/ocs/ ./tests/"
- name: Report test coverage
env:
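The new `-m 'not integtest'` option in these workflow commands deselects tests carrying a custom pytest marker, keeping integration tests out of the unit-test CI run. A minimal sketch of how such a marker is applied (test names here are hypothetical, not from the repository):

```python
# Sketch: tagging a test with a custom 'integtest' marker so that
# `pytest -m "not integtest"` deselects it during unit-test runs.
import pytest


@pytest.mark.integtest
def test_full_system():
    # Hypothetical: would need a live crossbar server, so it is
    # deselected in the unit-test CI jobs above.
    assert True


def test_unit_math():
    # Plain unit test; always selected.
    assert 1 + 1 == 2
```

The marker itself would typically be registered in `pytest.ini` or `setup.cfg` to silence unknown-marker warnings.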
2 changes: 1 addition & 1 deletion .github/workflows/official-docker-images.yml
@@ -26,7 +26,7 @@ jobs:
# Tests (already run by the pytest workflow, but they don't take long...)
- name: Test with pytest within a docker container
run: |
docker run -v $PWD:/coverage --rm ocs sh -c "COVERAGE_FILE=/coverage/.coverage.docker python3 -m pytest -p no:wampy --cov /app/ocs/ocs/ ./tests/"
docker run -v $PWD:/coverage --rm ocs sh -c "COVERAGE_FILE=/coverage/.coverage.docker python3 -m pytest -p no:wampy -m 'not integtest' --cov /app/ocs/ocs/ ./tests/"
- name: Test documentation build
run: |
2 changes: 1 addition & 1 deletion .github/workflows/pytest.yml
@@ -33,7 +33,7 @@ jobs:

- name: Test with pytest within a docker container
run: |
docker run -v $PWD:/coverage --rm ocs sh -c "COVERAGE_FILE=/coverage/.coverage.docker python3 -m pytest -p no:wampy --cov /app/ocs/ocs/ ./tests/"
docker run -v $PWD:/coverage --rm ocs sh -c "COVERAGE_FILE=/coverage/.coverage.docker python3 -m pytest -p no:wampy -m 'not integtest' --cov /app/ocs/ocs/ ./tests/"
- name: Report test coverage
env:
3 changes: 3 additions & 0 deletions Dockerfile
@@ -19,6 +19,9 @@ RUN apt-get update && apt-get install -y python3 \
python3-pip \
vim

# Install init system
RUN pip3 install dumb-init

# Copy the current directory contents into the container at /app
COPY . /app/ocs/

36 changes: 27 additions & 9 deletions README.rst
@@ -6,10 +6,9 @@ OCS - Observatory Control System
:target: https://github.com/simonsobs/ocs/actions?query=workflow%3A%22Build+Develop+Images%22
:alt: GitHub Workflow Status

.. image:: https://readthedocs.org/projects/ocs/badge/?version=latest
:target: https://ocs.readthedocs.io/en/latest/?badge=latest
.. image:: https://readthedocs.org/projects/ocs/badge/?version=develop
:target: https://ocs.readthedocs.io/en/develop/?badge=develop
:alt: Documentation Status

.. image:: https://coveralls.io/repos/github/simonsobs/ocs/badge.svg
:target: https://coveralls.io/github/simonsobs/ocs

@@ -55,10 +54,10 @@ Installation
------------
Clone this repository and install using pip::

git clone https://github.com/simonsobs/ocs.git
cd ocs/
pip3 install -r requirements.txt
python3 setup.py install
$ git clone https://github.com/simonsobs/ocs.git
$ cd ocs/
$ pip3 install -r requirements.txt
$ python3 setup.py install

**Note:** If you want to install locally, not globally, pass the ``--user``
flag to both the pip3 and setup.py commands.
@@ -89,14 +88,33 @@ Documentation
The OCS documentation can be built using sphinx once you have performed the
installation::

cd docs/
make html
$ cd docs/
$ make html

You can then open ``docs/_build/html/index.html`` in your preferred web
browser. You can also find a copy hosted on `Read the Docs`_.

.. _Read the Docs: https://ocs.readthedocs.io/en/latest/

Tests
-----
The tests for OCS can be run using pytest, and should be run from the
``tests/`` directory::

$ cd tests/
$ python3 -m pytest

To run the tests within a Docker container (useful if your local environment is
missing some dependencies), first make sure you build the latest ocs image,
then use docker run::

$ docker build -t ocs .
$ docker run --rm ocs sh -c "python3 -m pytest -p no:wampy -m 'not integtest' ./tests/"

For more details see `tests/README.rst <tests_>`_.

.. _tests: tests/README.rst

Example
-------

8 changes: 3 additions & 5 deletions agents/aggregator/Dockerfile
@@ -13,11 +13,9 @@ COPY aggregator_agent.py .
RUN mkdir -p /data && \
chown ocs:ocs /data



# Run aggregator on container startup
ENTRYPOINT ["python3", "-u", "aggregator_agent.py"]
ENTRYPOINT ["dumb-init", "python3", "-u", "aggregator_agent.py"]

# Sensible defaults for setup with sisock
CMD ["--site-hub=ws://sisock-crossbar:8001/ws", \
"--site-http=http://sisock-crossbar:8001/call"]
CMD ["--site-hub=ws://crossbar:8001/ws", \
"--site-http=http://crossbar:8001/call"]
27 changes: 27 additions & 0 deletions agents/aggregator/aggregator_agent.py
@@ -1,10 +1,16 @@
import time
import queue
import argparse
import txaio

from os import environ
from ocs import ocs_agent, site_config
from ocs.agent.aggregator import Aggregator

# For logging
txaio.use_twisted()
LOG = txaio.make_logger()


class AggregatorAgent:
"""
@@ -66,12 +72,30 @@ def enqueue_incoming_data(self, _data):
return

self.incoming_data.put((data, feed))
self.log.debug("Enqueued {d} from Feed {f}", d=data, f=feed)

def start_aggregate(self, session: ocs_agent.OpSession, params=None):
"""
Process for starting data aggregation. This process will create an
Aggregator instance, which will collect and write provider data to disk
as long as this process is running.
The most recent file and active providers will be returned in
session.data::
{"current_file": "/data/16020/1602089117.g3",
"providers": {
"observatory.fake-data1.feeds.false_temperatures": {
"last_refresh": 1602089118.8225083,
"sessid": "1602088928.8294137",
"stale": false,
"last_block_received": "temps"},
"observatory.LSSIM.feeds.temperatures": {
"last_refresh": 1602089118.8223345,
"sessid": "1602088932.335811",
"stale": false,
"last_block_received": "temps"}}}
"""
session.set_status('starting')
self.aggregate = True
@@ -117,6 +141,9 @@ def make_parser(parser=None):


if __name__ == '__main__':
# Start logging
txaio.start_logging(level=environ.get("LOGLEVEL", "info"))

parser = make_parser()
args = site_config.parse_args(agent_class='AggregatorAgent',
parser=parser)
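The `session.data` structure documented in the `start_aggregate` docstring above is a plain dict, so a monitoring client that has fetched a session snapshot can inspect it directly. A minimal sketch, with values copied from the docstring example (the client-side fetch itself is omitted):

```python
# Sketch: inspect an aggregator session.data payload of the documented
# shape, reporting the current output file and any stale providers.
# Values are copied from the docstring example above.
session_data = {
    "current_file": "/data/16020/1602089117.g3",
    "providers": {
        "observatory.fake-data1.feeds.false_temperatures": {
            "last_refresh": 1602089118.8225083,
            "sessid": "1602088928.8294137",
            "stale": False,
            "last_block_received": "temps",
        },
        "observatory.LSSIM.feeds.temperatures": {
            "last_refresh": 1602089118.8223345,
            "sessid": "1602088932.335811",
            "stale": False,
            "last_block_received": "temps",
        },
    },
}

# Providers flagged stale have stopped refreshing and may need attention.
stale = [name for name, info in session_data["providers"].items()
         if info["stale"]]
print(session_data["current_file"])
print(stale)
```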
6 changes: 3 additions & 3 deletions agents/fake_data/Dockerfile
@@ -10,8 +10,8 @@ WORKDIR /app/ocs/agents/fake_data/
COPY . .

# Run fake data agent on container startup
ENTRYPOINT ["python3", "-u", "fake_data_agent.py"]
ENTRYPOINT ["dumb-init", "python3", "-u", "fake_data_agent.py"]

# Sensible defaults for setup with sisock
CMD ["--site-hub=ws://sisock-crossbar:8001/ws", \
"--site-http=http://sisock-crossbar:8001/call"]
CMD ["--site-hub=ws://crossbar:8001/ws", \
"--site-http=http://crossbar:8001/call"]
81 changes: 77 additions & 4 deletions agents/fake_data/fake_data_agent.py
@@ -2,13 +2,23 @@
import time
import threading
import os
from autobahn.wamp.exception import ApplicationError
import txaio

from os import environ
import numpy as np
from autobahn.wamp.exception import ApplicationError
from twisted.internet.defer import inlineCallbacks
from autobahn.twisted.util import sleep as dsleep

# For logging
txaio.use_twisted()
LOG = txaio.make_logger()

class FakeDataAgent:
def __init__(self, agent,
num_channels=2,
sample_rate=10.):
sample_rate=10.,
frame_length=60):
self.agent = agent
self.log = agent.log
self.lock = threading.Semaphore()
@@ -18,7 +28,7 @@ def __init__(self, agent,

# Register feed
agg_params = {
'frame_length': 60
'frame_length': frame_length
}
print('registering')
self.agent.register_feed('false_temperatures',
@@ -44,6 +54,20 @@ def start_acq(self, session, params=None):
This Process has no useful parameters.
The most recent fake values are stored in the session.data object in
the format::
{"fields":
{"channel_00": 0.10250430068515494,
"channel_01": 0.08550903376216404,
"channel_02": 0.10481891991693446,
"channel_03": 0.10793263271024509},
"timestamp":1600448753.9288929}
The channels kept in fields are the 'faked' data, in a similar
structure to the Lakeshore agents. 'timestamp' is the latest time these
values were updated.
"""
ok, msg = self.try_set_job('acq')
if not ok: return ok, msg
@@ -111,6 +135,13 @@ def start_acq(self, session, params=None):
# self.log.info('Sending %i data on %i channels.' % (len(t), len(T)))
session.app.publish_to_feed('false_temperatures', block.encoded())

# Update session.data
data_cache = {"fields": {}, "timestamp": None}
for channel, samples in block.data.items():
data_cache['fields'][channel] = samples[-1]
data_cache['timestamp'] = block.timestamps[-1]
session.data.update(data_cache)

self.agent.feeds['false_temperatures'].flush_buffer()
self.set_job_done()
return True, 'Acquisition exited cleanly.'
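The `data_cache` built in the loop above mirrors the `{"fields": ..., "timestamp": ...}` format documented in the `start_acq` docstring. A minimal sketch of reading the newest values back out of such a payload (values here are illustrative, not from a live agent):

```python
# Sketch: pull the most recent fake readings out of a session.data
# payload of the documented {"fields": ..., "timestamp": ...} shape.
data = {
    "fields": {
        "channel_00": 0.10250430068515494,
        "channel_01": 0.08550903376216404,
    },
    "timestamp": 1600448753.9288929,
}

# Age of the readings relative to a later query time (illustrative).
query_time = 1600448755.0
age = query_time - data["timestamp"]

for channel, value in sorted(data["fields"].items()):
    print(f"{channel}: {value:.4f} (age {age:.1f} s)")
```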
@@ -124,6 +155,8 @@ def stop_acq(self, session, params=None):
return (ok, {True: 'Requested process stop.',
False: 'Failed to request process stop.'}[ok])

# Tasks

def set_heartbeat_state(self, session, params=None):
"""Task to set the state of the agent heartbeat.
Expand All @@ -139,6 +172,39 @@ def set_heartbeat_state(self, session, params=None):

return True, "Set heartbeat_on: {}".format(heartbeat_state)

@inlineCallbacks
def delay_task(self, session, params={}):
"""Task that will take the requested number of seconds to complete.
This can run simultaneously with the acq Process. This Task
should run in the reactor thread.
The session data will be updated with the requested delay as
well as the time elapsed so far, for example::
{'requested_delay': 5.,
'delay_so_far': 1.2}
Args:
delay (float): Time to wait before returning, in seconds.
Defaults to 5.
succeed (bool): Whether to return success or not.
Defaults to True.
"""
session.set_status('running')
delay = params.get('delay', 5)
session.data = {'requested_delay': delay,
'delay_so_far': 0}
succeed = params.get('succeed', True) is True
t0 = time.time()
while True:
session.data['delay_so_far'] = time.time() - t0
sleep_time = min(0.5, delay - session.data['delay_so_far'])
if sleep_time < 0:
break
yield dsleep(sleep_time)
return succeed, 'Exited after %.1f seconds' % session.data['delay_so_far']
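The loop in `delay_task` sleeps in short increments so that `session.data['delay_so_far']` stays fresh while the task runs. The same chunked-wait pattern can be sketched synchronously, with plain `time.sleep` standing in for the deferred `dsleep`:

```python
import time


def chunked_delay(delay=5.0, chunk=0.5):
    """Wait `delay` seconds in `chunk`-sized sleeps, recording progress.

    Mirrors the loop in delay_task above: each pass updates the
    elapsed-time record before deciding whether to keep sleeping.
    """
    data = {'requested_delay': delay, 'delay_so_far': 0.0}
    t0 = time.time()
    while True:
        data['delay_so_far'] = time.time() - t0
        # Sleep at most `chunk` seconds, or whatever remains of `delay`.
        sleep_time = min(chunk, delay - data['delay_so_far'])
        if sleep_time < 0:
            break
        time.sleep(sleep_time)
    return data


result = chunked_delay(delay=0.2, chunk=0.05)
```

Keeping each sleep short means a caller polling the progress record sees it advance roughly every `chunk` seconds instead of only at the end.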


def add_agent_args(parser_in=None):
@@ -152,10 +218,15 @@
'Channels are co-sampled.')
pgroup.add_argument('--sample-rate', default=9.5, type=float,
help='Frequency at which to produce data.')
pgroup.add_argument('--frame-length', default=60, type=int,
help='Frame length to pass to the aggregator parameters.')

return parser_in

if __name__ == '__main__':
# Start logging
txaio.start_logging(level=environ.get("LOGLEVEL", "info"))

parser = add_agent_args()
args = site_config.parse_args(agent_class='FakeDataAgent', parser=parser)

@@ -167,9 +238,11 @@ def add_agent_args(parser_in=None):

fdata = FakeDataAgent(agent,
num_channels=args.num_channels,
sample_rate=args.sample_rate)
sample_rate=args.sample_rate,
frame_length=args.frame_length)
agent.register_process('acq', fdata.start_acq, fdata.stop_acq,
blocking=True, startup=startup)
agent.register_task('set_heartbeat', fdata.set_heartbeat_state)
agent.register_task('delay_task', fdata.delay_task, blocking=False)

runner.run(agent, auto_reconnect=True)
6 changes: 3 additions & 3 deletions agents/influxdb_publisher/Dockerfile
@@ -10,8 +10,8 @@ WORKDIR /app/ocs/agents/influxdb_publisher/
COPY influxdb_publisher.py .

# Run publisher on container startup
ENTRYPOINT ["python3", "-u", "influxdb_publisher.py"]
ENTRYPOINT ["dumb-init", "python3", "-u", "influxdb_publisher.py"]

# Sensible defaults for setup with sisock
CMD ["--site-hub=ws://sisock-crossbar:8001/ws", \
"--site-http=http://sisock-crossbar:8001/call"]
CMD ["--site-hub=ws://crossbar:8001/ws", \
"--site-http=http://crossbar:8001/call"]
6 changes: 3 additions & 3 deletions agents/registry/Dockerfile
@@ -10,8 +10,8 @@ WORKDIR /app/ocs/agents/registry/
COPY . .

# Run registry on container startup
ENTRYPOINT ["python3", "-u", "registry.py"]
ENTRYPOINT ["dumb-init", "python3", "-u", "registry.py"]

# Sensible defaults for setup with sisock
CMD ["--site-hub=ws://sisock-crossbar:8001/ws", \
"--site-http=http://sisock-crossbar:8001/call"]
CMD ["--site-hub=ws://crossbar:8001/ws", \
"--site-http=http://crossbar:8001/call"]
