
Commit

Improve security documentation
matllubos committed Jan 20, 2023
1 parent 9604086 commit c52b4a0
Showing 4 changed files with 84 additions and 37 deletions.
19 changes: 3 additions & 16 deletions docs/commands.rst
@@ -11,24 +11,11 @@ Remove old request, command or celery logs that are older than defined value, pa
* ``expiration`` - logs older than this timedelta will be removed. Units are h - hours, d - days, w - weeks, m - months, y - years
* ``noinput`` - tells Django to NOT prompt the user for input of any kind
* ``backup`` - tells Django where to backup removed logs in JSON format
* ``type`` - tells Django what type of requests should be removed (input-request/output-request/command/celery)
* ``type`` - tells Django what type of requests should be removed (input-request/output-request/command/celery-task-invocation/celery-task-run)

Logs can be removed only for ``elasticsearch`` and ``sql`` backends.
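
For illustration, assuming the management command described above is named ``purge_logs`` (the real command name is not visible in this excerpt), a cleanup run removing input request logs older than two months might look like::

    # hypothetical command name; replace it with the real command from this document
    ./manage.py purge_logs --type input-request --expiration 2m --backup /var/backups/security-logs --noinput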

set_celery_task_log_state
-------------------------

Set the state of celery tasks which are stuck in the WAITING state. Tasks that have not been started for more than ``SECURITY_CELERY_STALE_TASK_TIME_LIMIT_MINUTES`` (60 minutes by default) are set to the failed state. A task with a succeeded/failed task run is set to the succeeded/failed state.

run_celery
---------

Run a celery worker or beat scheduler with autoreload, parameters:

* ``type`` - type of the startup (beat or worker)
* ``celerysettings`` - path to the celery configuration file
* ``autoreload`` - tells Django to use the auto-reloader
* ``extra`` - extra celery startup arguments

celery_health_check
-------------------

Check Celery queue health, either by the count of tasks in the ``WAITING`` state (``--max-tasks-count``), by the time tasks spend waiting in the queue (``--max-created-at-diff``, in seconds), or both at once. The default queue name is ``default``; you can change it with the ``--queue-name`` argument.
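
For example, a check that fails when more than 100 tasks are waiting in the ``default`` queue or when the oldest waiting task was created more than 5 minutes ago (the thresholds here are only illustrative)::

    ./manage.py celery_health_check --queue-name default --max-tasks-count 100 --max-created-at-diff 300
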
18 changes: 18 additions & 0 deletions docs/contrib.rst
@@ -56,3 +56,21 @@ Do not forget to turn on Django DEBUG.
To show results in ``django-is-core`` you must set this setting::

SECURITY_SHOW_DEBUG_TOOLBAR = True


django-is-core
--------------

The ``elasticsearch`` and ``sql`` backends provide prepared django-is-core administration. If you are using the django-is-core library, you can find the admin core classes in:

* elasticsearch - ``security.elasticsearch.is_core.cores``

  * ``InputRequestLogCore``
  * ``OutputRequestLogCore``
  * ``CommandLogCore``
  * ``CeleryTaskRunLogCore``
  * ``CeleryTaskInvocationLogCore``

* sql - ``security.sql.is_core.cores``

  * ``InputRequestLogCore``
  * ``OutputRequestLogCore``
  * ``CommandLogCore``
  * ``CeleryTaskRunLogCore``
  * ``CeleryTaskInvocationLogCore``
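
If you need to adjust one of the prepared cores, a minimal sketch of the usual pattern — subclassing in your own application — is shown below (the subclassing approach is a generic django-is-core convention, not something this commit prescribes)::

    from security.sql.is_core.cores import InputRequestLogCore

    class ProjectInputRequestLogCore(InputRequestLogCore):
        # customize menu placement, permissions or listed fields here
        # according to your django-is-core setup
        pass
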
46 changes: 37 additions & 9 deletions docs/installation.rst
Expand Up @@ -94,7 +94,7 @@ For test purposes you will need to configure both databases to be tested::
databases = ('default', 'log')


The second solution is to have a second storage for logs; in this case you will use ``MultipleDBSecurityLoggerRouter``::
The second solution is to have an independent database for logs; in this case you can use ``MultipleDBSecurityLoggerRouter``::

DATABASES = {
'default': {
@@ -130,18 +130,45 @@ Elasticsearch backend can be configured via ``SECURITY_ELASTICSEARCH_DATABASE``
}


For elasticsearch database initialization you must run the ``./manage.py init_elasticsearch_log`` command.
For elasticsearch database initialization you must run the ``./manage.py init_elasticsearch_log`` command to create the indexes in the database.

There are two ways to store logs in elasticsearch: a direct connection or via logstash. The direct connection is used by default and no extra configuration is required. For the logstash solution you need to enable the ``SECURITY_ELASTICSEARCH_LOGSTASH_WRITER`` setting::

SECURITY_ELASTICSEARCH_LOGSTASH_WRITER = True


Now you have to run logstash with the configuration defined in ``logstash.example.conf``.
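
A minimal sketch of starting logstash with that file, assuming the ``logstash`` binary is on ``PATH`` and the example configuration is used unchanged::

    logstash -f logstash.example.conf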

Django will send data to logstash via a logger with these settings::

LOGGING.update({
'handlers': {
...
'logstash': {
'level': 'INFO',
'class': 'security.backends.elasticsearch.logstash.handler_tcp.TCPLogstashHandler',
'host': 'logstash',
'port': 5044,
'formatter': 'logstash',
},
...
},
'loggers': {
...
'security.logstash': {
'handlers': ['logstash'],
'level': 'INFO',
'propagate': False,
},
        }
    })


Testing backend
---------------

For testing purposes you can use `'security.backends.testing'` and add it to the installed apps::
For testing purposes you can use `'security.backends.testing'` and turn off log writers::

INSTALLED_APPS = (
...
'security.backends.testing',
...
)
SECURITY_BACKEND_WRITERS = [] # Turn off log writers

You can surround your test with the `security.backends.testing.capture_security_logs` decorator/context processor::

@@ -153,11 +180,12 @@ Your test you can surround with `security.backends.testing.capture_security_logs
assert_length_equal(logged_data.command, 1)
assert_length_equal(logged_data.celery_task_invocation, 1)
assert_length_equal(logged_data.celery_task_run, 1)
assert_equal(logged_data.input_request[0].request_body, 'test')
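
A minimal sketch of the context-processor form, assuming a pytest-django style ``client`` fixture and an endpoint whose call triggers input request logging (both are only illustrative)::

    from security.backends.testing import capture_security_logs

    def test_something(client):
        with capture_security_logs() as logged_data:
            client.post('/api/test/', data={'value': 'test'})  # any code under test that triggers logging
        assert len(logged_data.input_request) == 1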

Readers
-------

The ``elasticsearch``, ``sql`` and ``testing`` backends can be used as readers too. You can use these helpers to get data from them:
The ``elasticsearch``, ``sql`` and ``testing`` backends can be used as readers too. You can use these helpers to get data from these backends (no matter which one is set):

* ``security.backends.reader.get_count_input_requests(from_time, ip=None, path=None, view_slug=None, slug=None, method=None, exclude_log_id=None)`` - to get the count of input requests matching the given arguments
* ``security.backends.reader.get_logs_related_with_object(logger_name, related_object)`` - to get the list of logs which are related with the given object
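
A small sketch using the two helpers above (the time window, path and logger name are only illustrative values)::

    from datetime import datetime, timedelta

    from security.backends.reader import get_count_input_requests, get_logs_related_with_object
    from users.models import User

    # number of POST requests to the login endpoint over the last hour
    count = get_count_input_requests(
        from_time=datetime.utcnow() - timedelta(hours=1),
        path='/api/login/',
        method='POST',
    )

    # all logs related to a concrete user instance (the logger name value is illustrative)
    user = User.objects.first()
    logs = get_logs_related_with_object('input-request', user)
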
38 changes: 26 additions & 12 deletions docs/logger.rst
@@ -8,8 +8,8 @@ Input requests

Input requests are logged automatically with ``security.middleware.LogMiddleware``. The middleware creates a ``security.models.InputLoggedRequest`` object before passing the request to the next middleware. Response data for the logged request are filled in at the end. You can find the logged request on the Django request object as ``request.input_logged_request``.
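
A minimal sketch of enabling the middleware in settings (its exact position in the list is an assumption; place it according to your project's needs)::

    MIDDLEWARE = [
        ...
        'security.middleware.LogMiddleware',
        ...
    ]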

Decorators
^^^^^^^^^^
View decorators
^^^^^^^^^^^^^^^

There are several decorators for views and generic views that can be used for view logging configuration:

@@ -28,29 +28,43 @@ Logging of output requests is a little bit complicated and is related to the way
requests
^^^^^^^^

The first method is used for logging simple HTTP requests made with the ``requests`` library. The only change necessary is to import ``from security.transport import security_requests as requests`` instead of ``import requests``. The same methods (get, post, put, ...) are available as in the requests library. Every method has two extra optional parameters:
The first method is used for logging simple HTTP requests made with the ``requests`` library. The only change necessary is to import ``from security import requests`` instead of ``import requests``. The same methods (get, post, put, ...) are available as in the requests library. Every method has two extra optional parameters:

* ``slug`` - text slug that is stored with the logged request to tag concrete logged value
* ``related_objects`` - list or tuple of related objects that will be related with output logged request

Example where a user is stored in the related objects and the log slug is set to the value ``'request'``::

    from security import requests
    from users.models import User

    user = User.objects.first()
    requests.get('https://github.com/druids/', slug='request', related_objects=[user])

suds
^^^^

For SOAP based clients there are extensions to the ``suds`` library. You must only use the ``security.transport.security_suds.Client`` class instead of the standard suds client, or ``security.transport.security_suds.SecurityRequestsTransport`` with the standard suds client object.
As init data of ``security.transport.security_suds.SecurityRequestsTransport`` you can pass ``slug`` and ``related_objects``.
The ``security.transport.security_suds.Client`` has ``slug`` as an initial parameter but related objects must be added via the ``add_related_objects(self, *related_objects)`` method.
For SOAP based clients there are extensions to the ``suds`` library. You must only use the ``security.suds.Client`` class instead of the standard suds client, or ``security.suds.SecurityRequestsTransport`` with the standard suds client object.
As init data of ``security.suds.SecurityRequestsTransport`` you can pass ``slug`` and ``related_objects``.
The ``security.suds.Client`` has ``slug`` and ``related_objects`` input parameters::

    from security.suds import Client
    from users.models import User

    user = User.objects.first()
    client = Client('http://your.service.url', slug='suds', related_objects=[user])

Decorators/context processors
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

``security.decorators.atomic_log`` - because logged requests are stored in models, they are subject to rollback if you are using transactions. To solve this problem you can use this decorator before the Django ``transaction.atomic`` decorator. The logs are stored at the end of the transaction (even when an exception is raised). The decorator can be nested; logs are saved only with the last decorator. If you want to join an object with an output request log you can use this decorator too. In the example a user is logged with the output request::
``security.decorators.log_with_data`` - because logged requests are stored in models, they are subject to rollback if you are using transactions. To solve this problem you can use this decorator before the Django ``transaction.atomic`` decorator. The logs are stored at the end of the transaction (even when an exception is raised). The decorator can be nested; logs are saved only with the last decorator. If you want to join an object with an output request log you can use this decorator too. In the example a user is logged with the output request::

    from security.decorators import atomic_log, log_with_data
    from security.transport import security_requests as requests
    from security import requests
    from users.models import User

    user = User.objects.first()
    with atomic_log(output_requests_slug='github-request', output_requests_related_objects=[user]):
        requests.get('https://github.com/druids/')
    with log_with_data(slug='github-request', output_requests_related_objects=[user], extra_data={'extra': 'data'}):
        requests.get('https://github.com/druids/')



@@ -103,7 +117,7 @@ If you want to call command from code, you should use ``security.management.call
Celery tasks log
----------------

If you want to log celery tasks you must first install the celery library (celery==4.3.0). Then you must define your task as in the example::
If you want to log celery tasks you must install the celery library (``celery>=5``). Then you must use ``security.task.LoggedTask`` as the base class of your celery task, for example::

from security.task import LoggedTask

@@ -114,4 +128,4 @@
    def sum_task(self, task_id, a, b):
        return a + b

Task result will be automatically logged to the ``security.models.CeleryTaskLog``.
The task result will be automatically logged to the configured log backend.
