Migrate to Legacy Builtin NDB Library (#4018)
Towards #3995

This will simplify a few things, mainly that we can have native interop between old and new apps without a bunch of hackery to get datastore serialization/memcache to agree on a data format.
phil-lopreiato committed Nov 26, 2021
1 parent 42ff5dc commit 707e9b7
Showing 398 changed files with 8,348 additions and 6,141 deletions.
4 changes: 2 additions & 2 deletions .flake8
@@ -15,8 +15,8 @@ exclude =
application_import_names = backend
import-order-style = edited
banned-modules =
google.appengine.api.datastore = Use google.cloud.datastore
google.appengine.ext.ndb = Use google.cloud.ndb
google.cloud.datastore = Use google.appengine.api.datastore
google.cloud.ndb = Use google.appengine.ext.ndb

[flake8:local-plugins]
extension =
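The `.flake8` change above flips the direction of flake8-tidy-imports' `banned-modules` rule: the `google.cloud.*` modules are now the banned side. As a rough illustration of what such a check does (a hypothetical standalone sketch, not the actual plugin), a banned-import scan can be written with `ast`:

```python
import ast

# Hypothetical sketch mirroring the banned-modules config above;
# the real enforcement is done by the flake8-tidy-imports plugin.
BANNED = {
    "google.cloud.datastore": "Use google.appengine.api.datastore",
    "google.cloud.ndb": "Use google.appengine.ext.ndb",
}


def check_banned_imports(source: str) -> list:
    """Return one warning per import of a banned module (or submodule)."""
    warnings = []
    for node in ast.walk(ast.parse(source)):
        names = []
        if isinstance(node, ast.Import):
            names = [alias.name for alias in node.names]
        elif isinstance(node, ast.ImportFrom) and node.module:
            # "from google.cloud import ndb" -> "google.cloud.ndb"
            names = [f"{node.module}.{alias.name}" for alias in node.names]
        for name in names:
            for banned, msg in BANNED.items():
                if name == banned or name.startswith(banned + "."):
                    warnings.append(f"line {node.lineno}: {banned} is banned. {msg}")
    return warnings
```

With this sketch, `from google.cloud import ndb` is flagged with the replacement hint, while `from google.appengine.ext import ndb` passes cleanly.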
3 changes: 0 additions & 3 deletions .pyre_configuration
@@ -10,10 +10,7 @@
{"site-package": "flask_cors"},
{"site-package": "freezegun"},
{"site-package": "google/appengine"},
{"site-package": "google/cloud/datastore_v1"},
{"site-package": "google/cloud/tasks_v2/"},
{"site-package": "google/cloud/storage"},
{"site-package": "InMemoryCloudDatastoreStub"},
{"site-package": "numpy"},
{"site-package": "pytest"},
{"site-package": "_pytest"},
4 changes: 2 additions & 2 deletions docs/Architecture/Architecture-Data-Model.md
@@ -1,8 +1,8 @@
TBA uses the [`google-cloud-ndb`](https://googleapis.dev/python/python-ndb/latest/) library to interface with the [Google Cloud Datastore](https://cloud.google.com/datastore)
TBA uses the ndb library from the [Appengine Standard Runtime](https://github.com/GoogleCloudPlatform/appengine-python-standard) to interface with the [Google Cloud Datastore](https://cloud.google.com/datastore)

Datastore is a highly scalable, document-based NoSQL database. On top of the datastore itself, the NDB library acts as an [ORM](https://en.wikipedia.org/wiki/Object%E2%80%93relational_mapping) and provides interfaces for managing the data's schema and manipulating the stored data.

[This page](https://cloud.google.com/appengine/docs/standard/python/ndb) describes an overview of the legacy NDB library - the python3 one we use is meant to be a drop-in replacement.
[This page](https://cloud.google.com/appengine/docs/standard/python/ndb) gives an overview of the NDB library. We continue to use the legacy library to avoid breaking changes with the py2 runtime, and to keep using the builtin memcache.

The core models of the site are:
- `Team`, which represents the most up to date info about a given FRC team. The primary key for these is of the format `frc<team number>`
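The `frc<team number>` key format described above can be made concrete with a tiny helper (hypothetical, for illustration only; the repo builds these keys elsewhere):

```python
def team_key_name(team_number: int) -> str:
    """Build a Team primary key in the `frc<team number>` format."""
    return f"frc{team_number}"
```

So team 604 gets the key `frc604`, which matches ids like `Team(id="frc604")` used in the tests below.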
2 changes: 1 addition & 1 deletion docs/Developing/tba_dev_config.md
@@ -11,7 +11,7 @@ $ vagrant halt && vagrant up
| Parameter | Description |
| --- | --- |
| `auth_use_prod` | Set to `true` (or any non-empty value) to use an upstream Firebase project for authentication. Requires `google_application_credentials` to be set. If unset, will use the Firebase emulator locally for authentication. |
| `datastore_mode` | Can be either `local` or `remote`. By default this is set to `local` and will use the [Datastore emulator](https://cloud.google.com/datastore/docs/tools/datastore-emulator) bundled with the App Engine SDK. If instead you want to point your instance to a real Datastore instance, set this to `remote` and also set the `google_application_credentials` property |
| `datastore_mode` | Can be either `local` or `remote`. By default this is set to `local` and will use the [Datastore emulator](https://cloud.google.com/datastore/docs/tools/datastore-emulator) bundled with the App Engine SDK. If instead you want to point your instance to a real Datastore instance, set this to `remote` and also set the `google_application_credentials` property (**note: remote mode does not currently work with the builtin NDB library**) |
| `redis_cache_url` | The url of a Redis cache used for caching datastore responses. Defaults to `redis://localhost:6739`, which is the address of Redis running inside the dev container. To disable the global cache, set this property to an empty string. |
| `flask_response_cache_enabled` | Can be either `true` or `false`. This is used to configure whether or not we store rendered html pages in Redis or not. By default this is `false`, to make development iteration faster. |
| `google_application_credentials` | A path (relative to the repository root) to a [service account JSON key](https://cloud.google.com/iam/docs/creating-managing-service-account-keys) used to authenticate to a production Google Cloud service. We recommend putting these in `ops/dev/keys` (which will be ignored by `git`). Example: `ops/dev/keys/tba-prod-key.json` |
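Put together, a typical local-development config built from the table above might look like the following (illustrative values only; the exact filename and value types are assumptions based on the parameter descriptions):

```json
{
  "auth_use_prod": "",
  "datastore_mode": "local",
  "redis_cache_url": "redis://localhost:6739",
  "flask_response_cache_enabled": "false",
  "google_application_credentials": ""
}
```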
2 changes: 1 addition & 1 deletion ops/shell/lib/__init__.py
@@ -1,7 +1,7 @@
import contextlib
import os

from google.cloud import ndb
from google.appengine.ext import ndb


@contextlib.contextmanager
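The shell helper above pairs `contextlib.contextmanager` with the ndb import; since the real ndb context needs the App Engine runtime, here is a stdlib-only sketch of the same setup/yield/teardown shape (all names hypothetical):

```python
import contextlib


@contextlib.contextmanager
def shell_context(state: dict):
    """Set up a context for an interactive session, yield it,
    and always tear it down, even if the session raises."""
    state["active"] = True  # stand-in for entering a real ndb context
    try:
        yield state
    finally:
        state["active"] = False  # teardown runs on normal exit and on error
```

A `with shell_context(...)` block then mirrors how the ops shell wraps an interactive session around a datastore context.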
2 changes: 0 additions & 2 deletions requirements.txt
@@ -6,15 +6,13 @@ flake8-tidy-imports
blinker==1.4
fakeredis
freezegun==1.1.0
InMemoryCloudDatastoreStub
pre-commit
pyre-check==0.9.8
pytest
pytest-cov
pytest-custom_exit_code
pytest-testmon
PyYAML==6.0
redis==3.5.3
hiredis==2.0.0
requests-mock
shellcheck-py
61 changes: 29 additions & 32 deletions src/backend/api/handlers/helpers/tests/model_properties_test.py
@@ -2,7 +2,7 @@

import pytest

from google.cloud import ndb
from google.appengine.ext import ndb

from backend.api.handlers.helpers.model_properties import (
filter_event_properties,
@@ -23,16 +23,15 @@
from backend.common.queries.dict_converters.team_converter import TeamConverter


def test_filter_event_properties(ndb_client: ndb.Client) -> None:
with ndb_client.context():
event = EventConverter(
Event(
id="2020casj",
event_type_enum=EventType.REGIONAL,
event_short="casj",
year=2020,
)
).convert(ApiMajorVersion.API_V3)
def test_filter_event_properties(ndb_stub) -> None:
event = EventConverter(
Event(
id="2020casj",
event_type_enum=EventType.REGIONAL,
event_short="casj",
year=2020,
)
).convert(ApiMajorVersion.API_V3)

assert set(event.keys()).difference(set(simple_event_properties)) != set()

@@ -47,24 +46,23 @@ def test_filter_event_properties(ndb_client: ndb.Client) -> None:
filter_event_properties([event], ModelType("bad_type"))


def test_filter_match_properties(ndb_client: ndb.Client) -> None:
with ndb_client.context():
match = MatchConverter(
Match(
id="2020casj_qm1",
year=2020,
event=ndb.Key("Event", "2020casj"),
comp_level="qm",
match_number=1,
set_number=1,
alliances_json=json.dumps(
{
"red": {"score": 0, "teams": []},
"blue": {"score": 0, "teams": []},
}
),
)
).convert(ApiMajorVersion.API_V3)
def test_filter_match_properties(ndb_stub) -> None:
match = MatchConverter(
Match(
id="2020casj_qm1",
year=2020,
event=ndb.Key("Event", "2020casj"),
comp_level="qm",
match_number=1,
set_number=1,
alliances_json=json.dumps(
{
"red": {"score": 0, "teams": []},
"blue": {"score": 0, "teams": []},
}
),
)
).convert(ApiMajorVersion.API_V3)

assert set(match.keys()).difference(set(simple_match_properties)) != set()

@@ -79,9 +77,8 @@ def test_filter_match_properties(ndb_client: ndb.Client) -> None:
filter_match_properties([match], ModelType("bad_type"))


def test_filter_team_properties(ndb_client: ndb.Client) -> None:
with ndb_client.context():
team = TeamConverter(Team(id="frc604")).convert(ApiMajorVersion.API_V3)
def test_filter_team_properties(ndb_stub) -> None:
team = TeamConverter(Team(id="frc604")).convert(ApiMajorVersion.API_V3)

assert set(team.keys()).difference(set(simple_team_properties)) != set()

54 changes: 24 additions & 30 deletions src/backend/api/handlers/tests/add_match_video_test.py
@@ -1,7 +1,7 @@
import json
from typing import Dict, List

from google.cloud import ndb
from google.appengine.ext import ndb
from werkzeug.test import Client

from backend.api.trusted_api_auth_helper import TrustedApiAuthHelper
@@ -68,19 +68,17 @@ def get_auth_headers(request_path: str, request_body) -> Dict[str, str]:
}


def test_no_auth(ndb_client: ndb.Client, api_client: Client) -> None:
with ndb_client.context():
setup_event()
def test_no_auth(ndb_stub, api_client: Client) -> None:
setup_event()

resp = api_client.post(REQUEST_PATH, data=json.dumps({}))
assert resp.status_code == 401


def test_set_video(ndb_client: ndb.Client, api_client: Client, taskqueue_stub) -> None:
with ndb_client.context():
setup_event()
setup_auth(access_types=[AuthType.MATCH_VIDEO])
setup_matches()
def test_set_video(ndb_stub, api_client: Client, taskqueue_stub) -> None:
setup_event()
setup_auth(access_types=[AuthType.MATCH_VIDEO])
setup_matches()

request_body = json.dumps({"qm1": "aFZy8iibMD0", "sf1m1": "RpSgUrsghv4"})

@@ -91,22 +89,19 @@ def test_set_video(ndb_client: ndb.Client, api_client: Client, taskqueue_stub) -
)
assert response.status_code == 200

with ndb_client.context():
assert set(Match.get_by_id("2014casj_qm1").youtube_videos) == {
"abcdef",
"aFZy8iibMD0",
}
assert set(Match.get_by_id("2014casj_sf1m1").youtube_videos) == {"RpSgUrsghv4"}
assert set(Match.get_by_id("2014casj_qm1").youtube_videos) == {
"abcdef",
"aFZy8iibMD0",
}
assert set(Match.get_by_id("2014casj_sf1m1").youtube_videos) == {"RpSgUrsghv4"}


def test_bad_match_id(ndb_client: ndb.Client, api_client: Client) -> None:
with ndb_client.context():
setup_event()
setup_auth(access_types=[AuthType.MATCH_VIDEO])
setup_matches()
def test_bad_match_id(ndb_stub, api_client: Client) -> None:
setup_event()
setup_auth(access_types=[AuthType.MATCH_VIDEO])
setup_matches()

request_body = json.dumps({"qm1": "aFZy8iibMD0", "qm2": "abc123"})

response = api_client.post(
REQUEST_PATH,
headers=get_auth_headers(REQUEST_PATH, request_body),
@@ -115,15 +110,15 @@ def test_bad_match_id(ndb_client: ndb.Client, api_client: Client) -> None:
assert response.status_code == 404

# make sure the valid match is unchanged
with ndb_client.context():
assert set(Match.get_by_id("2014casj_qm1").youtube_videos) == {"abcdef"}
assert set(Match.get_by_id("2014casj_qm1", use_cache=False).youtube_videos) == {
"abcdef"
}


def test_malformed_match_id(ndb_client: ndb.Client, api_client: Client) -> None:
with ndb_client.context():
setup_event()
setup_auth(access_types=[AuthType.MATCH_VIDEO])
setup_matches()
def test_malformed_match_id(ndb_stub, api_client: Client) -> None:
setup_event()
setup_auth(access_types=[AuthType.MATCH_VIDEO])
setup_matches()

request_body = json.dumps({"qm1": "aFZy8iibMD0", "zzz": "abc123"})

@@ -136,5 +131,4 @@ def test_malformed_match_id(ndb_client: ndb.Client, api_client: Client) -> None:
assert response.json["Error"] == "Invalid match IDs provided: ['zzz']"

# make sure the valid match is unchanged
with ndb_client.context():
assert set(Match.get_by_id("2014casj_qm1").youtube_videos) == {"abcdef"}
assert set(Match.get_by_id("2014casj_qm1").youtube_videos) == {"abcdef"}
