Merge pull request #10538 from DefectDojo/master-into-dev/2.36.1-2.37.0-dev

Release: Merge back 2.36.1 into dev from: master-into-dev/2.36.1-2.37.0-dev
blakeaowens authored Jul 8, 2024
2 parents d7c6d9c + 42ef652 commit f8cff1b
Showing 41 changed files with 353 additions and 303 deletions.
6 changes: 5 additions & 1 deletion .github/renovate.json
@@ -6,7 +6,11 @@
"dependencyDashboardApproval": false,
"baseBranches": ["dev"],
"rebaseWhen": "conflicted",
"ignorePaths": ["requirements.txt", "components/package.json", "components/package-lock.json", "dojo/components/yarn.lock", "dojo/components/package.json", "Dockerfile**"],
"ignorePaths": ["requirements.txt", "requirements-lint.txt", "components/package.json", "components/package-lock.json", "dojo/components/yarn.lock", "dojo/components/package.json", "Dockerfile**"],
"ignoreDeps": [
"mysql",
"rabbitmq"
],
"packageRules": [{
"packagePatterns": ["*"],
"commitMessageExtra": "from {{currentVersion}} to {{#if isMajor}}v{{{newMajor}}}{{else}}{{#if isSingleVersion}}v{{{toVersion}}}{{else}}{{{newValue}}}{{/if}}{{/if}}",
36 changes: 0 additions & 36 deletions .github/workflows/flake8.yml

This file was deleted.

20 changes: 1 addition & 19 deletions .github/workflows/ruff.yml
@@ -2,31 +2,13 @@ name: Ruff Linter

on:
workflow_dispatch:
pull_request_target:
push:

pull_request:
jobs:
ruff-linting:
runs-on: ubuntu-latest
steps:
- name: Checkout
if: github.event_name == 'pull_request' || github.event_name == 'pull_request_target'
uses: actions/checkout@v4
# by default the pull_request_target event checks out the base branch, i.e. dev
# so we need to explicitly checkout the head of the PR
# we use fetch-depth 0 to make sure the full history is checked out and we can compare against
# the base commit (branch) of the PR
# more info https://github.community/t/github-actions-are-severely-limited-on-prs/18179/16
# we checkout merge_commit here as this contains all new code from dev also. we don't need to compare against base_commit
with:
persist-credentials: false
fetch-depth: 0
ref: refs/pull/${{ github.event.pull_request.number }}/merge
# repository: ${{github.event.pull_request.head.repo.full_name}}

- name: Checkout
# for non PR runs we just checkout the default, which is a sha on a branch probably
if: github.event_name != 'pull_request' && github.event_name != 'pull_request_target'
uses: actions/checkout@v4

- name: Install Ruff Linter
1 change: 0 additions & 1 deletion docker-compose.override.unit_tests.yml
@@ -1,5 +1,4 @@
---
version: '3.8'
services:
nginx:
image: busybox:1.36.1-musl
1 change: 1 addition & 0 deletions docker-compose.override.unit_tests_cicd.yml
@@ -15,6 +15,7 @@ services:
environment:
PYTHONWARNINGS: error # We are strict about Warnings during testing
DD_DEBUG: 'True'
DD_LOG_LEVEL: 'ERROR'
DD_TEST_DATABASE_NAME: ${DD_TEST_DATABASE_NAME}
DD_DATABASE_NAME: ${DD_TEST_DATABASE_NAME}
DD_DATABASE_ENGINE: ${DD_DATABASE_ENGINE}
4 changes: 3 additions & 1 deletion docker/entrypoint-unit-tests-devDocker.sh
@@ -53,7 +53,9 @@ EOF

echo "Unit Tests"
echo "------------------------------------------------------------"
python3 manage.py test unittests -v 3 --keepdb --no-input

python3 manage.py test unittests -v 3 --keepdb --no-input --failfast --shuffle --parallel --exclude-tag="non-parallel"
python3 manage.py test unittests -v 3 --keepdb --no-input --failfast --shuffle --tag="non-parallel"

# you can select a single file to "test" unit tests
# python3 manage.py test unittests.tools.test_npm_audit_scan_parser.TestNpmAuditParser --keepdb -v 3
4 changes: 3 additions & 1 deletion docker/entrypoint-unit-tests.sh
@@ -79,4 +79,6 @@ python3 manage.py migrate

echo "Unit Tests"
echo "------------------------------------------------------------"
python3 manage.py test unittests -v 3 --keepdb --no-input

python3 manage.py test unittests -v 3 --keepdb --no-input --failfast --shuffle --parallel --exclude-tag="non-parallel"
python3 manage.py test unittests -v 3 --keepdb --no-input --failfast --shuffle --tag="non-parallel"
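The two-pass invocation above runs the bulk of the suite with `--parallel` while excluding tests tagged `non-parallel`, then runs those tagged tests serially. A minimal sketch of how a test case opts into the serial pass with Django's `tag` decorator is shown below; the class and test names are illustrative only, not taken from this PR.

```python
# Illustrative only: how a test case could be routed to the serial pass.
# django.test.tag attaches labels that the test runner's --tag / --exclude-tag
# options match against.
from django.test import TestCase, tag


@tag("non-parallel")
class StatefulUnitTest(TestCase):
    """Excluded from the --parallel pass and picked up by --tag="non-parallel"."""

    def test_shared_state(self):
        # Tests that mutate shared global state (settings toggles, ordering
        # assumptions) are safer outside the parallel workers.
        self.assertTrue(True)
```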
6 changes: 5 additions & 1 deletion docs/config.dev.toml
@@ -85,7 +85,8 @@ weight = 1
# See a complete list of available styles at https://xyproto.github.io/splash/docs/all.html
style = "dracula"
# Uncomment if you want your chosen highlight style used for code blocks without a specified language
guessSyntax = "true"
# Do not uncomment otherwise it breaks mermaid
# guessSyntax = "true"

# Everything below this are Site Params

@@ -198,3 +199,6 @@ enable = false
url = "https://owasp.slack.com/archives/C014H3ZV9U6"
icon = "fab fa-slack"
desc = "Chat with other project developers"

[params.mermaid]
enable = true
6 changes: 5 additions & 1 deletion docs/config.master.toml
@@ -85,7 +85,8 @@ weight = 1
# See a complete list of available styles at https://xyproto.github.io/splash/docs/all.html
style = "dracula"
# Uncomment if you want your chosen highlight style used for code blocks without a specified language
guessSyntax = "true"
# Do not uncomment otherwise it breaks mermaid
# guessSyntax = "true"

# Everything below this are Site Params

@@ -198,3 +199,6 @@ enable = false
url = "https://owasp.slack.com/archives/C014H3ZV9U6"
icon = "fab fa-slack"
desc = "Chat with other project developers"

[params.mermaid]
enable = true
13 changes: 11 additions & 2 deletions docs/content/en/getting_started/upgrading/2.36.md
@@ -2,6 +2,15 @@
title: 'Upgrading to DefectDojo Version 2.36.x'
toc_hide: true
weight: -20240603
description: No special instructions.
description: Breaking Change for HELM deployments with PostgreSQL
---
There are no special instructions for upgrading to 2.36.x. Check the [Release Notes](https://github.com/DefectDojo/django-DefectDojo/releases/tag/2.36.0) for the contents of the release.

Previous HELM deployments (HELM chart `<=1.6.136`, DefectDojo `<=2.35.4`) used a pinned PostgreSQL version from the `11.x` line. This version is incompatible with Django `4.2` (used since DefectDojo version `2.36.0`; HELM chart `1.6.137`), so it is necessary to upgrade PostgreSQL to version `12.x` or higher. DefectDojo `2.36.1` (HELM chart `1.6.138`) uses this newer version of PostgreSQL.

Unfortunately, upgrading PostgreSQL itself is not enough, because PostgreSQL does not automatically migrate the data structures on the filesystem between major versions, so a data migration is needed. There are different ways to do this (many of them similar to migrating between database backends, e.g. from MySQL to PostgreSQL); a minimal dump-and-restore sketch follows the list below. Find inspiration and the approach that best fits your setup in:

- https://github.com/DefectDojo/django-DefectDojo/discussions/9480
- https://owasp.slack.com/archives/C2P5BA8MN/p1717610931766739?thread_ts=1717587117.831149&cid=C2P5BA8MN
- https://dev.to/jkostolansky/how-to-upgrade-postgresql-from-11-to-12-2la6
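As one illustration of the dump-and-restore route, here is a minimal sketch using the standard PostgreSQL client tools. Host names, user, and database are placeholders; verify the exact procedure against the discussions linked above before relying on it.

```bash
# Hypothetical sketch only: host names, user and database are placeholders.
# 1. Dump the data from the old PostgreSQL 11 instance.
pg_dump -h old-postgres.example.local -U defectdojo -d defectdojo -Fc -f defectdojo.dump

# 2. Deploy the new PostgreSQL (>= 12) instance with an empty data volume
#    (e.g. via the updated HELM chart), then restore the dump into it.
pg_restore -h new-postgres.example.local -U defectdojo -d defectdojo --clean --if-exists defectdojo.dump
```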

There are no other special instructions for upgrading to 2.36.x. Check the [Release Notes](https://github.com/DefectDojo/django-DefectDojo/releases/tag/2.36.0) for the contents of the release.
4 changes: 2 additions & 2 deletions docs/content/en/integrations/ldap-authentication.md
@@ -41,7 +41,7 @@ Please check for the latest version of these requirements at the time of impleme

Otherwise add the following to requirements.txt:

```
```python
python-ldap==3.4.2
django-auth-ldap==4.1.0
```
@@ -119,7 +119,7 @@ Read the docs for Django Authentication with LDAP here: https://django-auth-ldap
In order to pass the variables to the settings.dist.py file via docker, it's a good idea to add these to the docker-compose file.

You can do this by adding the following variables to the environment section for the uwsgi image:
```
```yaml
DD_LDAP_SERVER_URI: "${DD_LDAP_SERVER_URI:-ldap://ldap.example.com}"
DD_LDAP_BIND_DN: "${DD_LDAP_BIND_DN:-}"
DD_LDAP_BIND_PASSWORD: "${DD_LDAP_BIND_PASSWORD:-}"
2 changes: 1 addition & 1 deletion docs/content/en/integrations/parsers/file/fortify.md
@@ -20,6 +20,6 @@ per category. To get all issues, copy the [DefaultReportDefinitionAllIssues.xml]

Once this is complete, you can run the following command on your .fpr file to generate the
required XML:
```
```bash
./path/to/ReportGenerator -format xml -f /path/to/output.xml -source /path/to/downloaded/artifact.fpr -template DefaultReportDefinitionAllIssues.xml
```
4 changes: 2 additions & 2 deletions docs/content/en/integrations/parsers/file/veracode.md
@@ -14,7 +14,7 @@ Veracode reports can be ingested in either XML or JSON Format
- Requires slight modification of the response returned from the API
- Example of a request being: `url <endpoint> | jq "{findings}"`
- Desired Format:
```
```json
{
"findings": [
{
@@ -28,7 +28,7 @@ Veracode reports can be ingested in either XML or JSON Format
- This response can be saved directly to a file and uploaded
- Not as ideal for crafting a refined report consisting of multiple requests
- Desired Format:
```
```json
{
"_embedded": {
"findings": [
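For the first (API-based) format, a hedged sketch of assembling the upload file is shown below. The `curl` call, authorization header, and output file name are assumptions; only the `jq "{findings}"` filter comes from the documentation above, and `<endpoint>` remains a placeholder.

```bash
# Hypothetical sketch: fetch findings from the (placeholder) endpoint, wrap them
# under a top-level "findings" key with jq, and save a file that can be uploaded.
curl -s -H "Authorization: <token>" "<endpoint>" | jq "{findings}" > veracode_findings.json
```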
4 changes: 2 additions & 2 deletions docs/content/en/integrations/social-authentication.md
@@ -312,7 +312,7 @@ Edit the settings (see [Configuration]({{< ref "/getting_started/configuration"

or, alternatively, for helm configuration, add this to the `extraConfig` section:

```
```yaml
DD_SESSION_COOKIE_SECURE: 'True'
DD_CSRF_COOKIE_SECURE: 'True'
DD_SECURE_SSL_REDIRECT: 'True'
@@ -453,7 +453,7 @@ Some Identity Providers are able to send list of groups to which should user bel

You can bypass the login form if you are only using SSO/social authentication for logging in by enabling these two environment variables:

```
```yaml
DD_SOCIAL_LOGIN_AUTO_REDIRECT: "true"
DD_SOCIAL_AUTH_SHOW_LOGIN_FORM: "false"
```
2 changes: 1 addition & 1 deletion docs/content/en/usage/productgrading.md
@@ -27,7 +27,7 @@ Note that the following abbreviations were used:
- med: amount of medium findings within the product
- low: amount of low findings within the product

```
```python
health=100
if crit > 0:
health = 40
6 changes: 3 additions & 3 deletions dojo/api_v2/serializers.py
@@ -1645,10 +1645,10 @@ class FindingSerializer(TaggitSerializer, serializers.ModelSerializer):
age = serializers.IntegerField(read_only=True)
sla_days_remaining = serializers.IntegerField(read_only=True)
finding_meta = FindingMetaSerializer(read_only=True, many=True)
related_fields = serializers.SerializerMethodField()
related_fields = serializers.SerializerMethodField(allow_null=True)
# for backwards compatibility
jira_creation = serializers.SerializerMethodField(read_only=True)
jira_change = serializers.SerializerMethodField(read_only=True)
jira_creation = serializers.SerializerMethodField(read_only=True, allow_null=True)
jira_change = serializers.SerializerMethodField(read_only=True, allow_null=True)
display_status = serializers.SerializerMethodField()
finding_groups = FindingGroupSerializer(
source="finding_group_set", many=True, read_only=True,
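For context, `allow_null=True` declares that these method fields may legitimately return `None`, so the serializer (and schema generators that honor the flag) treat them as nullable. A minimal, self-contained sketch of the pattern follows; the serializer and method below are illustrative only, not DefectDojo code.

```python
# Illustrative sketch of SerializerMethodField(allow_null=True); not DefectDojo code.
from rest_framework import serializers


class ExampleFindingSerializer(serializers.Serializer):
    title = serializers.CharField()
    # allow_null=True marks the computed value as nullable for API consumers and
    # for schema generation that respects the flag.
    jira_creation = serializers.SerializerMethodField(read_only=True, allow_null=True)

    def get_jira_creation(self, obj):
        # Return None when the object has no associated JIRA issue.
        return getattr(obj, "jira_creation_date", None)
```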