DRF API Logger


An API Logger for your Django Rest Framework project.

It logs all the API information for content type "application/json".

  1. URL
  2. Request Body
  3. Request Headers
  4. Request Method
  5. API Response
  6. Status Code
  7. API Call Time
  8. Server Execution Time
  9. Client IP Address

You can log API information into the database or listen to the logger signals for different use cases, or you can do both.

  • The logger uses a separate thread to run, so it won't affect your API response time.


Install or add drf-api-logger.

pip install drf-api-logger

Add drf_api_logger to INSTALLED_APPS in your Django project's settings.py file:

INSTALLED_APPS = [
    ...
    'drf_api_logger',  # Add here
]

Add the middleware to MIDDLEWARE in settings.py:

MIDDLEWARE = [
    ...
    'drf_api_logger.middleware.api_logger_middleware.APILoggerMiddleware',  # Add here
]

Store logs into the database

Log every request into the database.

DRF_API_LOGGER_DATABASE = True  # Default to False
  • Logs will be available in the Django Admin Panel.

  • The search bar will search in Request Body, Response, Headers, and API URL.

  • You can also filter the logs based on the "added_on" date, Status Code, and Request Methods.


Note: Make sure to migrate. If "DRF_API_LOGGER_DATABASE" is True, the migration will create a table for the logger; if it is False and the table already exists, the migration will delete it.

To listen for the logger signals.

Listen to the signal as soon as any API is called, so you can log the API data into a file or use it for other purposes.

DRF_API_LOGGER_SIGNAL = True  # Default to False

Example code to listen to the API Logger Signal:

from drf_api_logger import API_LOGGER_SIGNAL


# Create functions that are going to listen to the API logger signals.
def listener_one(**kwargs):
    print(kwargs)


def listener_two(**kwargs):
    print(kwargs)


# The functions will receive all the API logs whenever an API is called.
# You can also listen to signals in multiple functions.
API_LOGGER_SIGNAL.listen += listener_one
API_LOGGER_SIGNAL.listen += listener_two

# Unsubscribe from the signals.
API_LOGGER_SIGNAL.listen -= listener_one

DRF API Logger uses a queue to hold the logs before inserting them into the database. Once the queue is full, it bulk-inserts the logs into the database.

Specify the queue size.

DRF_LOGGER_QUEUE_MAX_SIZE = 50  # Default to 50 if not specified.


DRF API Logger also waits for a period of time. If the queue is not full but there are logs to be inserted, it inserts them after the interval ends.

Specify an interval (In Seconds).

DRF_LOGGER_INTERVAL = 10  # In Seconds, Default to 10 seconds if not specified.

Note: The API call time (added_on) is a timezone-aware datetime object. It is the actual time of the API call irrespective of interval value or queue size.
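The flush-when-full-or-after-interval behavior that DRF_LOGGER_QUEUE_MAX_SIZE and DRF_LOGGER_INTERVAL describe can be sketched as below. This is a minimal single-threaded sketch of the pattern, not the library's actual threaded implementation; the `bulk_insert` callable is an assumption standing in for a bulk database write such as `bulk_create`.

```python
import time


class BufferedInserter:
    """Buffer items and flush them in bulk when the buffer is full
    or when the configured interval has elapsed since the last flush."""

    def __init__(self, bulk_insert, max_size=50, interval=10):
        self.bulk_insert = bulk_insert  # callable receiving a list of items
        self.max_size = max_size        # cf. DRF_LOGGER_QUEUE_MAX_SIZE
        self.interval = interval        # cf. DRF_LOGGER_INTERVAL (seconds)
        self._buffer = []
        self._last_flush = time.monotonic()

    def add(self, item):
        self._buffer.append(item)
        full = len(self._buffer) >= self.max_size
        stale = time.monotonic() - self._last_flush >= self.interval
        if full or stale:
            self.flush()

    def flush(self):
        if self._buffer:
            self.bulk_insert(self._buffer)
            self._buffer = []
        self._last_flush = time.monotonic()
```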

Skip namespace

You can skip an entire app from being logged into the database by specifying the app's namespace as a list.

DRF_API_LOGGER_SKIP_NAMESPACE = ['APP_NAMESPACE1', 'APP_NAMESPACE2']
Skip URL Name

You can also skip any API from being logged by using the url_name of the API.

DRF_API_LOGGER_SKIP_URL_NAME = ['url_name1', 'url_name2']

Note: It does not log Django Admin Panel API calls.

Hide Sensitive Data From Logs

You may wish to hide sensitive information from being exposed in the logs. Do this by setting DRF_API_LOGGER_EXCLUDE_KEYS in settings.py to a list of the keys you want hidden. The default is:

DRF_API_LOGGER_EXCLUDE_KEYS = ['password', 'token', 'access', 'refresh']
# Sensitive data will be replaced with "***FILTERED***".
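To illustrate what such key-based filtering does, here is a minimal sketch of recursively masking sensitive keys in a request or response body. This is not the library's actual implementation, and `mask_sensitive` is a hypothetical helper name; it only shows the effect described above.

```python
def mask_sensitive(data, exclude_keys=('password', 'token', 'access', 'refresh')):
    """Recursively replace values of sensitive keys with "***FILTERED***"."""
    if isinstance(data, dict):
        return {
            key: '***FILTERED***' if key in exclude_keys
            else mask_sensitive(value, exclude_keys)
            for key, value in data.items()
        }
    if isinstance(data, list):
        return [mask_sensitive(item, exclude_keys) for item in data]
    return data
```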

Change the default database to store API logs

DRF_API_LOGGER_DEFAULT_DATABASE = 'default'  # Default to "default" if not specified
Make sure to migrate the database specified in DRF_API_LOGGER_DEFAULT_DATABASE.

Want to identify slow APIs? (Optional)

You can also identify slow APIs by specifying DRF_API_LOGGER_SLOW_API_ABOVE in settings.py.

A new filter (By API Performance) will be visible, and you can choose a slow or fast API.

DRF_API_LOGGER_SLOW_API_ABOVE = 200  # Default to None
# Specify in milliseconds.

Want to log only selected request methods? (Optional)

You can log only selected methods by specifying DRF_API_LOGGER_METHODS in settings.py.

DRF_API_LOGGER_METHODS = ['GET', 'POST', 'DELETE', 'PUT']  # Default to an empty list (Log all the requests).

Want to log only selected response status codes? (Optional)

You can log only selected responses by specifying DRF_API_LOGGER_STATUS_CODES in settings.py.

DRF_API_LOGGER_STATUS_CODES = [200, 400, 404, 500]  # Default to an empty list (Log all responses).
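Both settings above behave as allow-lists where an empty list means "log everything". The combined decision can be sketched as the following hypothetical helper (not the library's code):

```python
def should_log(method, status_code, allowed_methods=(), allowed_status_codes=()):
    """Return True if a request/response pair should be logged.

    Empty allow-lists mean "log everything", mirroring the documented
    defaults of DRF_API_LOGGER_METHODS and DRF_API_LOGGER_STATUS_CODES.
    """
    if allowed_methods and method.upper() not in allowed_methods:
        return False
    if allowed_status_codes and status_code not in allowed_status_codes:
        return False
    return True
```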

Want to log custom content types? (Optional)

You can log custom content types by specifying DRF_API_LOGGER_CONTENT_TYPES in settings.py. The specified content types will be added alongside the default content types:

DRF_API_LOGGER_CONTENT_TYPES = [
    'application/json',
    'application/vnd.api+json',
    'application/gzip',
    'application/octet-stream',
]  # Default content types.

Want to see the API information in the local timezone? (Optional)

You can also change the displayed timezone by specifying DRF_API_LOGGER_TIMEDELTA in settings.py. It won't change the database timezone; the stored value remains UTC or whatever timezone you have defined.

DRF_API_LOGGER_TIMEDELTA = 330  # UTC + 330 minutes = IST (5 hours 30 minutes ahead of UTC)
# Specify in minutes.
# You can specify negative values for timezones behind UTC.
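The minute-based offset works like a plain timedelta applied for display only. A small sketch, where `to_display_time` is a hypothetical helper, not a library function:

```python
from datetime import datetime, timedelta, timezone


def to_display_time(added_on_utc, offset_minutes=330):
    """Shift a stored UTC timestamp by a minute offset for display.

    The database value stays in UTC, as the docs note; 330 minutes
    corresponds to IST (UTC+5:30).
    """
    return added_on_utc + timedelta(minutes=offset_minutes)


utc_time = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
ist_time = to_display_time(utc_time, 330)  # 17:30 the same day
```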

Ignore data based on maximum request or response body? (Optional)

By default, DRF API Logger saves the request and response bodies of every request for future viewing, no matter how large. If DRF API Logger is used in production under heavy volume with large bodies, this can have a huge impact on space and time performance.

This behavior can be configured with the following additional options:

# DRF API LOGGER takes anything < 0 as no limit.
# If response body > 1024 bytes, ignore.
DRF_API_LOGGER_MAX_REQUEST_BODY_SIZE = 1024  # default to -1, no limit.
DRF_API_LOGGER_MAX_RESPONSE_BODY_SIZE = 1024  # default to -1, no limit.
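The documented semantics (any limit below zero means "no limit"; oversized bodies are ignored, not truncated) can be sketched as follows. `body_to_store` is a hypothetical helper, not the library's code:

```python
def body_to_store(body: bytes, max_size: int = -1) -> bytes:
    """Return the body to persist, or empty bytes if it exceeds the limit.

    A negative max_size means "no limit"; a body larger than the limit
    is dropped entirely rather than truncated.
    """
    if max_size < 0 or len(body) <= max_size:
        return body
    return b''
```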


Want to enable tracing? (Optional)

You can enable tracing by specifying DRF_API_LOGGER_ENABLE_TRACING in settings.py. This will add a tracing ID (uuid.uuid4()) to the signals of the DRF API Logger (if signals are enabled).

In views, you can use request.tracing_id to get the tracing ID.

DRF_API_LOGGER_ENABLE_TRACING = True  # default to False

Want to generate your own tracing UUID?

By default, the DRF API Logger uses uuid.uuid4() to generate the tracing ID. If you want to use a custom function instead, specify DRF_API_LOGGER_TRACING_FUNC in the settings.py file.

DRF_API_LOGGER_TRACING_FUNC = 'foo.bar.tracing_func_name'  # Dotted path to your function.
Tracing already present in headers?

If the tracing ID is already coming as a part of request headers, you can specify the header name.

DRF_API_LOGGER_TRACING_ID_HEADER_NAME: str = 'X_TRACING_ID'  # Replace with actual header name.
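The fallback behavior the tracing settings describe (reuse an incoming header's tracing ID if present, otherwise generate a fresh uuid4) can be sketched like this. `resolve_tracing_id` is a hypothetical helper, not the library's code:

```python
import uuid


def resolve_tracing_id(headers, header_name='X_TRACING_ID'):
    """Return the tracing ID from the incoming headers if present,
    otherwise generate a fresh uuid4 string."""
    tracing_id = headers.get(header_name)
    return tracing_id if tracing_id else str(uuid.uuid4())
```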

API with or without Host

You can specify whether the logged API endpoint should contain the absolute URI or not by setting this variable in the settings.py file.

DRF_API_LOGGER_PATH_TYPE = 'ABSOLUTE'  # Default to ABSOLUTE if not specified
# Possible values are ABSOLUTE, FULL_PATH or RAW_URI

Considering we are accessing the URL http://127.0.0.1:8000/api/v1/?page=123, the DRF_API_LOGGER_PATH_TYPE possible values are:

  1. ABSOLUTE (Default):

    Function used: request.build_absolute_uri()

    Output: http://127.0.0.1:8000/api/v1/?page=123

  2. FULL_PATH:

    Function used: request.get_full_path()

    Output: /api/v1/?page=123

  3. RAW_URI:

    Function used: request.get_raw_uri()

    Output: http://127.0.0.1:8000/api/v1/?page=123

    Note: Similar to ABSOLUTE but skips the allowed hosts protection, so it may return an insecure URI.
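The difference between the path types can be illustrated without a real Django request by splitting a URL with urllib.parse. This is only an illustration of what each mode records; ABSOLUTE and RAW_URI differ in allowed-hosts validation, which this sketch does not model, and `logged_path` is a hypothetical helper:

```python
from urllib.parse import urlsplit


def logged_path(url, path_type='ABSOLUTE'):
    """Show what each DRF_API_LOGGER_PATH_TYPE would record for a URL."""
    if path_type == 'FULL_PATH':
        parts = urlsplit(url)
        query = f'?{parts.query}' if parts.query else ''
        return parts.path + query
    return url  # ABSOLUTE / RAW_URI keep the full URI
```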

Use the DRF API Logger Model to query

You can use the DRF API Logger Model to query some information.

Note: Make sure to set "DRF_API_LOGGER_DATABASE = True" in the settings.py file.

from drf_api_logger.models import APILogsModel

Select records for status_code 200.

result_for_200_status_code = APILogsModel.objects.filter(status_code=200)

DRF API Logger Model:

from django.db import models

class APILogsModel(models.Model):
    id = models.BigAutoField(primary_key=True)
    api = models.CharField(max_length=1024, help_text='API URL')
    headers = models.TextField()
    body = models.TextField()
    method = models.CharField(max_length=10, db_index=True)
    client_ip_address = models.CharField(max_length=50)
    response = models.TextField()
    status_code = models.PositiveSmallIntegerField(help_text='Response status code', db_index=True)
    execution_time = models.DecimalField(decimal_places=5, max_digits=8,
                                         help_text='Server execution time (Not complete response time.)')
    added_on = models.DateTimeField()

    def __str__(self):
        return self.api

    class Meta:
        db_table = 'drf_api_logs'
        verbose_name = 'API Log'
        verbose_name_plural = 'API Logs'

After some time, there will be a lot of data in the database, and searching and filtering may get slower. If needed, delete or archive the older data. To improve searching and filtering, consider adding indexes to the 'drf_api_logs' table.