Create in-db audit log for archetype changes. #7052

Merged
merged 1 commit into from Feb 13, 2020

Conversation

@bakert
Member

bakert commented Feb 12, 2020

No description provided.
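
Since no description was provided, here is a minimal sketch of what an in-database audit log for archetype changes could look like. It is an illustration only: the archetype_change table, its columns, and the log_archetype_change helper are hypothetical, not necessarily the schema or code added by this PR. The sketch assumes a MySQL database reached through a parameterized execute(sql, args) call like the one visible in the test log below.

    # Hypothetical sketch -- table and column names are illustrative, not this PR's actual schema.
    from typing import Any, Optional

    ARCHETYPE_CHANGE_DDL = """
        CREATE TABLE IF NOT EXISTS archetype_change (
            id INTEGER PRIMARY KEY AUTO_INCREMENT,
            deck_id INTEGER NOT NULL,
            old_archetype_id INTEGER,      -- NULL when the deck had no archetype yet
            new_archetype_id INTEGER NOT NULL,
            changed_by INTEGER NOT NULL,   -- person id of whoever made the change
            changed_date INTEGER NOT NULL  -- unix timestamp of the change
        )
    """

    def log_archetype_change(db: Any, deck_id: int, old_archetype_id: Optional[int],
                             new_archetype_id: int, changed_by: int, changed_date: int) -> None:
        # Append one audit row recording who changed a deck's archetype, from what, to what, and when.
        sql = ('INSERT INTO archetype_change '
               '(deck_id, old_archetype_id, new_archetype_id, changed_by, changed_date) '
               'VALUES (%s, %s, %s, %s, %s)')
        db.execute(sql, [deck_id, old_archetype_id, new_archetype_id, changed_by, changed_date])

Reading the log back later is then a plain SELECT ordered by changed_date, joined to the deck and person tables as needed.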

@TravisBuddy


TravisBuddy commented Feb 12, 2020

Travis tests have failed

Hey @bakert,
Please read the following log to understand the reason for the failure.
It would be great if you could fix what's wrong and commit the changes.

1st Build


echo $CMD
python dev.py tests
$CMD
$ $CMD
CONFIG: web_cache=.web_cache
CONFIG: redis_enabled=True
CONFIG: redis_host=localhost
CONFIG: redis_port=6379
CONFIG: redis_db=0
CONFIG: whoosh_index_dir=whoosh_index
CONFIG: image_dir=./images
CONFIG: production=False
>>>> Running tests with ""
Fetching https://api.scryfall.com/bulk-data (cache ok)
CONFIG: slow_fetch=10.0
CONFIG: magic_database=cards
CONFIG: mysql_host=localhost
CONFIG: mysql_port=3306
CONFIG: mysql_user=root
CONFIG: mysql_passwd=
CONFIG: slow_query=999.0
Creating database cards
Database update required
Before BEGIN ([])
After BEGIN (['update_database'])
Fetching https://api.scryfall.com/sets (cache ok)
Fetching https://archive.scryfall.com/json/scryfall-default-cards.json (cache ok)
Fetching https://pennydreadfulmtg.github.io/modo-bugs/bugs.json (cache ok)
Before BEGIN (['update_database'])
After BEGIN (['update_database', 'update_bugged_cards'])
UNKNOWN BUGGED CARD: Eldrazi Spawn
UNKNOWN BUGGED CARD: Eldrazi Spawn
Before COMMIT (['update_database', 'update_bugged_cards'])
After COMMIT (['update_database'])
Fetching http://whatsinstandard.com/api/v6/standard.json (cache ok)
Fetching http://pdmtgo.com/EMN_legal_cards.txt (no cache)
CONFIG: save_historic_legal_lists=False
Before BEGIN (['update_database'])
After BEGIN (['update_database', 'set_legal_cards'])
Before COMMIT (['update_database', 'set_legal_cards'])
After COMMIT (['update_database'])
Fetching http://pdmtgo.com/KLD_legal_cards.txt (no cache)
Before BEGIN (['update_database'])
After BEGIN (['update_database', 'set_legal_cards'])
Before COMMIT (['update_database', 'set_legal_cards'])
After COMMIT (['update_database'])
Fetching http://pdmtgo.com/AER_legal_cards.txt (no cache)
Before BEGIN (['update_database'])
After BEGIN (['update_database', 'set_legal_cards'])
Before COMMIT (['update_database', 'set_legal_cards'])
After COMMIT (['update_database'])
Fetching http://pdmtgo.com/AKH_legal_cards.txt (no cache)
Before BEGIN (['update_database'])
After BEGIN (['update_database', 'set_legal_cards'])
Before COMMIT (['update_database', 'set_legal_cards'])
After COMMIT (['update_database'])
Fetching http://pdmtgo.com/HOU_legal_cards.txt (no cache)
Before BEGIN (['update_database'])
After BEGIN (['update_database', 'set_legal_cards'])
Before COMMIT (['update_database', 'set_legal_cards'])
After COMMIT (['update_database'])
Fetching http://pdmtgo.com/XLN_legal_cards.txt (no cache)
Before BEGIN (['update_database'])
After BEGIN (['update_database', 'set_legal_cards'])
Before COMMIT (['update_database', 'set_legal_cards'])
After COMMIT (['update_database'])
Fetching http://pdmtgo.com/RIX_legal_cards.txt (no cache)
Before BEGIN (['update_database'])
After BEGIN (['update_database', 'set_legal_cards'])
Before COMMIT (['update_database', 'set_legal_cards'])
After COMMIT (['update_database'])
Fetching http://pdmtgo.com/DOM_legal_cards.txt (no cache)
Before BEGIN (['update_database'])
After BEGIN (['update_database', 'set_legal_cards'])
Before COMMIT (['update_database', 'set_legal_cards'])
After COMMIT (['update_database'])
Fetching http://pdmtgo.com/M19_legal_cards.txt (no cache)
Before BEGIN (['update_database'])
After BEGIN (['update_database', 'set_legal_cards'])
Before COMMIT (['update_database', 'set_legal_cards'])
After COMMIT (['update_database'])
Fetching http://pdmtgo.com/GRN_legal_cards.txt (no cache)
Before BEGIN (['update_database'])
After BEGIN (['update_database', 'set_legal_cards'])
Before COMMIT (['update_database', 'set_legal_cards'])
After COMMIT (['update_database'])
Fetching http://pdmtgo.com/RNA_legal_cards.txt (no cache)
Before BEGIN (['update_database'])
After BEGIN (['update_database', 'set_legal_cards'])
Before COMMIT (['update_database', 'set_legal_cards'])
After COMMIT (['update_database'])
Fetching http://pdmtgo.com/WAR_legal_cards.txt (no cache)
Before BEGIN (['update_database'])
After BEGIN (['update_database', 'set_legal_cards'])
Before COMMIT (['update_database', 'set_legal_cards'])
After COMMIT (['update_database'])
Fetching http://pdmtgo.com/M20_legal_cards.txt (no cache)
Before BEGIN (['update_database'])
After BEGIN (['update_database', 'set_legal_cards'])
Before COMMIT (['update_database', 'set_legal_cards'])
After COMMIT (['update_database'])
Fetching http://pdmtgo.com/ELD_legal_cards.txt (no cache)
Before BEGIN (['update_database'])
After BEGIN (['update_database', 'set_legal_cards'])
Before COMMIT (['update_database', 'set_legal_cards'])
After COMMIT (['update_database'])
Before COMMIT (['update_database'])
After COMMIT ([])
Fetching http://pdmtgo.com/legal_cards.txt (no cache)
Before BEGIN ([])
After BEGIN (['set_legal_cards'])
Before COMMIT (['set_legal_cards'])
After COMMIT ([])
CONFIG: card_alias_file=./card_aliases.tsv
Rewriting index in whoosh_index
============================= test session starts ==============================
platform linux -- Python 3.6.7, pytest-5.3.5, py-1.7.0, pluggy-0.13.1
rootdir: /home/travis/build/PennyDreadfulMTG/Penny-Dreadful-Tools, inifile: pytest.ini
plugins: cov-2.8.1, asyncio-0.10.0
collected 238 items                                                            

dev_test.py .                                                            [  0%]
analysis/analysis_test.py .                                              [  0%]
decksite/deck_name_test.py ............................................. [ 19%]
..........................................                               [ 37%]
decksite/league_test.py ..                                               [ 38%]
decksite/smoke_test.py .                                                 [ 38%]
decksite/translation_test.py .                                           [ 39%]
decksite/view_test.py ...                                                [ 40%]
decksite/data/card_test.py ..                                            [ 41%]
decksite/data/query_test.py ..                                           [ 42%]
decksite/scrapers/gatherling_test.py .....                               [ 44%]
decksite/scrapers/scraper_test.py xxFF                                   [ 45%]
discordbot/command_test.py .....                                         [ 47%]
discordbot/emoji_test.py .                                               [ 48%]
discordbot/functional_test.py ...........................                [ 59%]
discordbot/unit_test.py ............                                     [ 64%]
logsite/smoke_test.py ..                                                 [ 65%]
logsite/tests/importing_test.py ...                                      [ 66%]
magic/card_test.py .                                                     [ 67%]
magic/decklist_test.py ..............                                    [ 73%]
magic/legality_test.py .                                                 [ 73%]
magic/mana_test.py ....................                                  [ 81%]
magic/multiverse_test.py ....                                            [ 83%]
magic/oracle_test.py .....                                               [ 85%]
magic/rotation_test.py ....                                              [ 87%]
magic/whoosh_search_test.py ...............                              [ 93%]
shared/database_test.py .                                                [ 94%]
shared/dtutil_test.py ..........                                         [ 98%]
shared/text_test.py ..                                                   [ 99%]
shared_web/template_test.py ..                                           [100%]

=================================== FAILURES ===================================
____________________________ test_manual_tappedout _____________________________

self = <shared.database.Database object at 0x7f04a7fa49b0>
sql = 'INSERT INTO deck_card (deck_id, card, n, sideboard) VALUES (%s, %s, %s, %s)'
args = [13, 'Island', 60, False], fetch_rows = False

    def execute_anything(self, sql: str, args: Optional[List[ValidSqlArgumentDescription]] = None, fetch_rows: bool = True) -> Tuple[int, List[Dict[str, ValidSqlArgumentDescription]]]:
        if args is None:
            args = []
        try:
>           return self.execute_with_reconnect(sql, args, fetch_rows)

shared/database.py:51: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <shared.database.Database object at 0x7f04a7fa49b0>
sql = 'INSERT INTO deck_card (deck_id, card, n, sideboard) VALUES (%s, %s, %s, %s)'
args = [13, 'Island', 60, False], fetch_rows = False

    def execute_with_reconnect(self, sql: str, args: Optional[List[ValidSqlArgumentDescription]] = None, fetch_rows: Optional[bool] = False) -> Tuple[int, List[ValidSqlArgumentDescription]]:
        result = None
        # Attempt to execute the query and reconnect 3 times, then give up
        for _ in range(3):
            try:
                p = perf.start()
                n = self.cursor.execute(sql, args)
                perf.check(p, 'slow_query', (f'```{sql}```', f'```{args}```'), 'mysql')
                if fetch_rows:
                    rows = self.cursor.fetchall()
                    result = (n, rows)
                else:
                    result = (n, [])
                break
            except OperationalError as e:
                if 'MySQL server has gone away' in str(e):
                    print('MySQL server has gone away: trying to reconnect')
                    self.connect()
                else:
                    # raise any other exception
>                   raise e

shared/database.py:81: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <shared.database.Database object at 0x7f04a7fa49b0>
sql = 'INSERT INTO deck_card (deck_id, card, n, sideboard) VALUES (%s, %s, %s, %s)'
args = [13, 'Island', 60, False], fetch_rows = False

    def execute_with_reconnect(self, sql: str, args: Optional[List[ValidSqlArgumentDescription]] = None, fetch_rows: Optional[bool] = False) -> Tuple[int, List[ValidSqlArgumentDescription]]:
        result = None
        # Attempt to execute the query and reconnect 3 times, then give up
        for _ in range(3):
            try:
                p = perf.start()
>               n = self.cursor.execute(sql, args)

shared/database.py:67: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <MySQLdb.cursors.DictCursor object at 0x7f04a7ef2320>
query = b"INSERT INTO deck_card (deck_id, card, n, sideboard) VALUES (13, 'Island', 60, 0)"
args = (b'13', b"'Island'", b'60', b'0')

    def execute(self, query, args=None):
        """Execute a query.
    
        query -- string, query to execute on server
        args -- optional sequence or mapping, parameters to use with query.
    
        Note: If args is a sequence, then %s must be used as the
        parameter placeholder in the query. If a mapping is used,
        %(key)s must be used as the placeholder.
    
        Returns integer represents rows affected, if any
        """
        while self.nextset():
            pass
        db = self._get_db()
    
        if isinstance(query, unicode):
            query = query.encode(db.encoding)
    
        if args is not None:
            if isinstance(args, dict):
                nargs = {}
                for key, item in args.items():
                    if isinstance(key, unicode):
                        key = key.encode(db.encoding)
                    nargs[key] = db.literal(item)
                args = nargs
            else:
                args = tuple(map(db.literal, args))
            try:
                query = query % args
            except TypeError as m:
                raise ProgrammingError(str(m))
    
        assert isinstance(query, (bytes, bytearray))
>       res = self._query(query)

../../../virtualenv/python3.6.7/lib/python3.6/site-packages/MySQLdb/cursors.py:209: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <MySQLdb.cursors.DictCursor object at 0x7f04a7ef2320>
q = b"INSERT INTO deck_card (deck_id, card, n, sideboard) VALUES (13, 'Island', 60, 0)"

    def _query(self, q):
        db = self._get_db()
        self._result = None
>       db.query(q)

../../../virtualenv/python3.6.7/lib/python3.6/site-packages/MySQLdb/cursors.py:315: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <_mysql.connection open to 'localhost' at 0x42cbf98>
query = b"INSERT INTO deck_card (deck_id, card, n, sideboard) VALUES (13, 'Island', 60, 0)"

    def query(self, query):
        # Since _mysql releases GIL while querying, we need immutable buffer.
        if isinstance(query, bytearray):
            query = bytes(query)
>       _mysql.connection.query(self, query)
E       MySQLdb._exceptions.OperationalError: (1205, 'Lock wait timeout exceeded; try restarting transaction')

../../../virtualenv/python3.6.7/lib/python3.6/site-packages/MySQLdb/connections.py:239: OperationalError

During handling of the above exception, another exception occurred:

    @pytest.mark.functional
    @pytest.mark.tappedout
    @pytest.mark.external
    @TEST_VCR.use_cassette
    def test_manual_tappedout() -> None:
        with APP.app_context(): # type: ignore
>           tappedout.scrape_url('https://tappedout.net/mtg-decks/60-island/')

decksite/scrapers/scraper_test.py:41: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
decksite/scrapers/tappedout.py:99: in scrape_url
    return deck.add_deck(raw_deck)
decksite/data/deck.py:377: in add_deck
    add_cards(deck_id, params['cards'])
decksite/data/deck.py:410: in add_cards
    insert_deck_card(deck_id, name, n, False)
decksite/data/deck.py:427: in insert_deck_card
    db().execute(sql, [deck_id, name, n, in_sideboard])
shared/database.py:44: in execute
    [n, _] = self.execute_anything(sql, args, False)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <shared.database.Database object at 0x7f04a7fa49b0>
sql = 'INSERT INTO deck_card (deck_id, card, n, sideboard) VALUES (%s, %s, %s, %s)'
args = [13, 'Island', 60, False], fetch_rows = False

    def execute_anything(self, sql: str, args: Optional[List[ValidSqlArgumentDescription]] = None, fetch_rows: bool = True) -> Tuple[int, List[Dict[str, ValidSqlArgumentDescription]]]:
        if args is None:
            args = []
        try:
            return self.execute_with_reconnect(sql, args, fetch_rows)
        except MySQLdb.Warning as e:
            if e.args[0] == 1050 or e.args[0] == 1051:
                return (0, []) # we don't care if a CREATE IF NOT EXISTS raises an "already exists" warning or DROP TABLE IF NOT EXISTS raises an "unknown table" warning.
            if e.args[0] == 1062:
                return (0, []) # We don't care if an INSERT IGNORE INTO didn't do anything.
            raise DatabaseException('Failed to execute `{sql}` with `{args}` because of `{e}`'.format(sql=sql, args=args, e=e))
        except MySQLdb.Error as e:
>           raise DatabaseException('Failed to execute `{sql}` with `{args}` because of `{e}`'.format(sql=sql, args=args, e=e))
E           shared.pd_exception.DatabaseException: Failed to execute `INSERT INTO deck_card (deck_id, card, n, sideboard) VALUES (%s, %s, %s, %s)` with `[13, 'Island', 60, False]` because of `(1205, 'Lock wait timeout exceeded; try restarting transaction')`

shared/database.py:59: DatabaseException
----------------------------- Captured stdout call -----------------------------
Fetching https://tappedout.net/mtg-decks/60-island/?fmt=printable (cache ok)
Fetching https://tappedout.net/mtg-decks/60-island/?fmt=txt (cache ok)
CONFIG: mysql_user=root
Before BEGIN ([])
After BEGIN (['add_deck'])
Before BEGIN (['add_deck'])
After BEGIN (['add_deck', 'add_cards'])
________________________________ test_goldfish _________________________________

self = <shared.database.Database object at 0x7f04a7dba6d8>
sql = 'INSERT INTO deck_card (deck_id, card, n, sideboard) VALUES (%s, %s, %s, %s)'
args = [14, 'Bosk Banneret', 4, False], fetch_rows = False

    def execute_anything(self, sql: str, args: Optional[List[ValidSqlArgumentDescription]] = None, fetch_rows: bool = True) -> Tuple[int, List[Dict[str, ValidSqlArgumentDescription]]]:
        if args is None:
            args = []
        try:
>           return self.execute_with_reconnect(sql, args, fetch_rows)

shared/database.py:51: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <shared.database.Database object at 0x7f04a7dba6d8>
sql = 'INSERT INTO deck_card (deck_id, card, n, sideboard) VALUES (%s, %s, %s, %s)'
args = [14, 'Bosk Banneret', 4, False], fetch_rows = False

    def execute_with_reconnect(self, sql: str, args: Optional[List[ValidSqlArgumentDescription]] = None, fetch_rows: Optional[bool] = False) -> Tuple[int, List[ValidSqlArgumentDescription]]:
        result = None
        # Attempt to execute the query and reconnect 3 times, then give up
        for _ in range(3):
            try:
                p = perf.start()
                n = self.cursor.execute(sql, args)
                perf.check(p, 'slow_query', (f'```{sql}```', f'```{args}```'), 'mysql')
                if fetch_rows:
                    rows = self.cursor.fetchall()
                    result = (n, rows)
                else:
                    result = (n, [])
                break
            except OperationalError as e:
                if 'MySQL server has gone away' in str(e):
                    print('MySQL server has gone away: trying to reconnect')
                    self.connect()
                else:
                    # raise any other exception
>                   raise e

shared/database.py:81: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <shared.database.Database object at 0x7f04a7dba6d8>
sql = 'INSERT INTO deck_card (deck_id, card, n, sideboard) VALUES (%s, %s, %s, %s)'
args = [14, 'Bosk Banneret', 4, False], fetch_rows = False

    def execute_with_reconnect(self, sql: str, args: Optional[List[ValidSqlArgumentDescription]] = None, fetch_rows: Optional[bool] = False) -> Tuple[int, List[ValidSqlArgumentDescription]]:
        result = None
        # Attempt to execute the query and reconnect 3 times, then give up
        for _ in range(3):
            try:
                p = perf.start()
>               n = self.cursor.execute(sql, args)

shared/database.py:67: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <MySQLdb.cursors.DictCursor object at 0x7f04a7dfe828>
query = b"INSERT INTO deck_card (deck_id, card, n, sideboard) VALUES (14, 'Bosk Banneret', 4, 0)"
args = (b'14', b"'Bosk Banneret'", b'4', b'0')

    def execute(self, query, args=None):
        """Execute a query.
    
        query -- string, query to execute on server
        args -- optional sequence or mapping, parameters to use with query.
    
        Note: If args is a sequence, then %s must be used as the
        parameter placeholder in the query. If a mapping is used,
        %(key)s must be used as the placeholder.
    
        Returns integer represents rows affected, if any
        """
        while self.nextset():
            pass
        db = self._get_db()
    
        if isinstance(query, unicode):
            query = query.encode(db.encoding)
    
        if args is not None:
            if isinstance(args, dict):
                nargs = {}
                for key, item in args.items():
                    if isinstance(key, unicode):
                        key = key.encode(db.encoding)
                    nargs[key] = db.literal(item)
                args = nargs
            else:
                args = tuple(map(db.literal, args))
            try:
                query = query % args
            except TypeError as m:
                raise ProgrammingError(str(m))
    
        assert isinstance(query, (bytes, bytearray))
>       res = self._query(query)

../../../virtualenv/python3.6.7/lib/python3.6/site-packages/MySQLdb/cursors.py:209: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <MySQLdb.cursors.DictCursor object at 0x7f04a7dfe828>
q = b"INSERT INTO deck_card (deck_id, card, n, sideboard) VALUES (14, 'Bosk Banneret', 4, 0)"

    def _query(self, q):
        db = self._get_db()
        self._result = None
>       db.query(q)

../../../virtualenv/python3.6.7/lib/python3.6/site-packages/MySQLdb/cursors.py:315: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <_mysql.connection open to 'localhost' at 0x432e328>
query = b"INSERT INTO deck_card (deck_id, card, n, sideboard) VALUES (14, 'Bosk Banneret', 4, 0)"

    def query(self, query):
        # Since _mysql releases GIL while querying, we need immutable buffer.
        if isinstance(query, bytearray):
            query = bytes(query)
>       _mysql.connection.query(self, query)
E       MySQLdb._exceptions.OperationalError: (1205, 'Lock wait timeout exceeded; try restarting transaction')

../../../virtualenv/python3.6.7/lib/python3.6/site-packages/MySQLdb/connections.py:239: OperationalError

During handling of the above exception, another exception occurred:

    @pytest.mark.functional
    @pytest.mark.goldfish
    @pytest.mark.external
    @TEST_VCR.use_cassette
    def test_goldfish() -> None:
        with APP.app_context(): # type: ignore
>           mtggoldfish.scrape(1)

decksite/scrapers/scraper_test.py:49: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
decksite/scrapers/mtggoldfish.py:43: in scrape
    deck.add_deck(d)
decksite/data/deck.py:377: in add_deck
    add_cards(deck_id, params['cards'])
decksite/data/deck.py:410: in add_cards
    insert_deck_card(deck_id, name, n, False)
decksite/data/deck.py:427: in insert_deck_card
    db().execute(sql, [deck_id, name, n, in_sideboard])
shared/database.py:44: in execute
    [n, _] = self.execute_anything(sql, args, False)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <shared.database.Database object at 0x7f04a7dba6d8>
sql = 'INSERT INTO deck_card (deck_id, card, n, sideboard) VALUES (%s, %s, %s, %s)'
args = [14, 'Bosk Banneret', 4, False], fetch_rows = False

    def execute_anything(self, sql: str, args: Optional[List[ValidSqlArgumentDescription]] = None, fetch_rows: bool = True) -> Tuple[int, List[Dict[str, ValidSqlArgumentDescription]]]:
        if args is None:
            args = []
        try:
            return self.execute_with_reconnect(sql, args, fetch_rows)
        except MySQLdb.Warning as e:
            if e.args[0] == 1050 or e.args[0] == 1051:
                return (0, []) # we don't care if a CREATE IF NOT EXISTS raises an "already exists" warning or DROP TABLE IF NOT EXISTS raises an "unknown table" warning.
            if e.args[0] == 1062:
                return (0, []) # We don't care if an INSERT IGNORE INTO didn't do anything.
            raise DatabaseException('Failed to execute `{sql}` with `{args}` because of `{e}`'.format(sql=sql, args=args, e=e))
        except MySQLdb.Error as e:
>           raise DatabaseException('Failed to execute `{sql}` with `{args}` because of `{e}`'.format(sql=sql, args=args, e=e))
E           shared.pd_exception.DatabaseException: Failed to execute `INSERT INTO deck_card (deck_id, card, n, sideboard) VALUES (%s, %s, %s, %s)` with `[14, 'Bosk Banneret', 4, False]` because of `(1205, 'Lock wait timeout exceeded; try restarting transaction')`

shared/database.py:59: DatabaseException
----------------------------- Captured stdout call -----------------------------
Fetching https://www.mtggoldfish.com/deck/custom/penny_dreadful?page=1#online (cache ok)
Fetching https://www.mtggoldfish.com/deck/2596440#online (cache ok)
Fetching https://www.mtggoldfish.com/deck/download/2596440 (cache ok)
CONFIG: mysql_user=root
Before BEGIN ([])
After BEGIN (['add_deck'])
Before BEGIN (['add_deck'])
After BEGIN (['add_deck', 'add_cards'])
=============================== warnings summary ===============================
discordbot/functional_test.py:19
  /home/travis/build/PennyDreadfulMTG/Penny-Dreadful-Tools/discordbot/functional_test.py:19: PytestCollectionWarning: cannot collect test class 'TestContext' because it has a __init__ constructor (from: discordbot/functional_test.py)
    class TestContext(MtgContext):

discordbot/functional_test.py::test_command[art-kwargs0]
  /home/travis/virtualenv/python3.6.7/lib/python3.6/site-packages/cachecontrol/serialize.py:182: DeprecationWarning: encoding is deprecated, Use raw=False instead.
    cached = msgpack.loads(data, encoding="utf-8")

-- Docs: https://docs.pytest.org/en/latest/warnings.html


=========================== slowest 2 test durations ===========================
56.49s call     decksite/scrapers/scraper_test.py::test_goldfish
51.12s call     decksite/scrapers/scraper_test.py::test_manual_tappedout
=========================== short test summary info ============================
XFAIL decksite/scrapers/scraper_test.py::test_tappedout
  Tappedout temporarily disabled due to rate limiting.
XFAIL decksite/scrapers/scraper_test.py::test_gatherling
  Tappedout temporarily disabled due to rate limiting.
FAILED decksite/scrapers/scraper_test.py::test_manual_tappedout - shared.pd_e...
FAILED decksite/scrapers/scraper_test.py::test_goldfish - shared.pd_exception...
======= 2 failed, 234 passed, 2 xfailed, 2 warnings in 161.83s (0:02:41) =======
>>>> Upload coverage
Storing https://codecov.io/bash in codecov.sh

  _____          _
 / ____|        | |
| |     ___   __| | ___  ___ _____   __
| |    / _ \ / _` |/ _ \/ __/ _ \ \ / /
| |___| (_) | (_| |  __/ (_| (_) \ V /
 \_____\___/ \__,_|\___|\___\___/ \_/
                              Bash-20191211-b8db533


==> Travis CI detected.
    project root: .
    Fixing merge commit SHA
    Yaml found at: .codecov.yml
==> Running gcov in . (disable via -X gcov)
==> Python coveragepy exists disable via -X coveragepy
    -> Running coverage xml
==> Searching for coverage reports in:
    + .
    -> Found 1 reports
==> Detecting git/mercurial file structure
==> Appending build variables
    + TRAVIS_OS_NAME
    + 3.6
==> Reading reports
    + ./coverage.xml bytes=688651
==> Appending adjustments
    http://docs.codecov.io/docs/fixing-reports
    -> No adjustments found
==> Gzipping contents
==> Uploading reports
    url: https://codecov.io
    query: branch=master&commit=05d3defbf0c1809a5c0b77b9c1aaf8dcac8f0923&build=8245.1&build_url=&name=&tag=&slug=PennyDreadfulMTG%2FPenny-Dreadful-Tools&service=travis&flags=&pr=7052&job=649696319
    -> Pinging Codecov
https://codecov.io/upload/v4?package=bash-20191211-b8db533&token=secret&branch=master&commit=05d3defbf0c1809a5c0b77b9c1aaf8dcac8f0923&build=8245.1&build_url=&name=&tag=&slug=PennyDreadfulMTG%2FPenny-Dreadful-Tools&service=travis&flags=&pr=7052&job=649696319
    -> Uploading
    -> View reports at https://codecov.io/github/PennyDreadfulMTG/Penny-Dreadful-Tools/commit/05d3defbf0c1809a5c0b77b9c1aaf8dcac8f0923
TestFailedException running ['dev.py', 'tests']:  [(<ExitCode.TESTS_FAILED: 1>,)] ExitCode.TESTS_FAILED
TravisBuddy Request Identifier: 1baaf190-4e41-11ea-b00e-d175f66e4ddd
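
Both failures above are the same thing: MySQL error 1205, 'Lock wait timeout exceeded; try restarting transaction', raised from the INSERT into deck_card while a test deck was being added. The execute_with_reconnect method shown in the traceback only retries when the server has gone away, so a 1205 bubbles straight up as a DatabaseException. Purely as an illustration of how a 1205 could be retried, and not the fix (if any) applied in this PR, a wrapper might look like the sketch below. Note that MySQL's own advice is to restart the whole transaction, so retrying a single statement like this is only safe when it is not part of a larger open transaction, and the likelier root cause here is another connection holding locks on deck_card for too long.

    import time
    from typing import Any, List, Optional

    from MySQLdb import OperationalError

    MYSQL_LOCK_WAIT_TIMEOUT = 1205  # 'Lock wait timeout exceeded; try restarting transaction'

    def execute_with_lock_retry(cursor: Any, sql: str, args: Optional[List[Any]] = None, attempts: int = 3) -> int:
        # Hypothetical helper, not part of this PR: retry a statement a few times when MySQL
        # reports a lock wait timeout, backing off briefly between attempts.
        for attempt in range(attempts):
            try:
                return cursor.execute(sql, args)
            except OperationalError as e:
                # MySQLdb exposes the server error code as e.args[0].
                if e.args[0] != MYSQL_LOCK_WAIT_TIMEOUT or attempt == attempts - 1:
                    raise
                time.sleep(1 + attempt)  # brief backoff before the next attempt
        return 0  # unreachable; present only to satisfy type checkers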
@bakert bakert merged commit b9c56f7 into master Feb 13, 2020
3 of 6 checks passed
continuous-integration/travis-ci/pr The Travis CI build failed
continuous-integration/travis-ci/push The Travis CI build failed
pdm/automerge Waiting for continuous-integration/travis-ci/push
Codacy/PR Quality Review Up to standards. A positive pull request.
Summary no rules match, no planned actions
pyup.io/safety-ci No dependencies with known security vulnerabilities.
@vorpal-buildbot

Contributor

vorpal-buildbot commented Feb 13, 2020

Seen on LOGS, PROD (merged by @bakert 17 seconds ago). Please check your changes!
