Commit

Merge pull request #151 from jacebrowning/release/v0.6

Release v0.6

jacebrowning committed Jan 25, 2020
2 parents 010d68a + 33f0342 commit b4279b4
Showing 27 changed files with 920 additions and 345 deletions.
2 changes: 2 additions & 0 deletions .appveyor.yml
@@ -2,6 +2,8 @@ environment:
matrix:
- PYTHON_MAJOR: 3
PYTHON_MINOR: 7
- PYTHON_MAJOR: 3
PYTHON_MINOR: 8

cache:
- .venv -> poetry.lock
1 change: 1 addition & 0 deletions .gitignore
@@ -3,6 +3,7 @@
*.egg-info
__pycache__
.ipynb_checkpoints
setup.py

# Temporary OS files
Icon*
10 changes: 4 additions & 6 deletions .travis.yml
@@ -2,19 +2,17 @@ dist: xenial

language: python
python:
- "3.7"
- "3.8-dev"
- 3.7
- 3.8

cache:
pip: true
directories:
- ${VIRTUAL_ENV}

before_install:
# https://github.com/sdispater/poetry/issues/613
# - curl -sSL https://raw.githubusercontent.com/sdispater/poetry/master/get-poetry.py | python
# - source $HOME/.poetry/env
- pip install poetry
- curl -sSL https://raw.githubusercontent.com/sdispater/poetry/master/get-poetry.py | python
- source $HOME/.poetry/env
- make doctor

install:
4 changes: 2 additions & 2 deletions .verchew.ini
@@ -6,12 +6,12 @@ version = GNU Make
[Python]

cli = python
version = Python 3.7 || Python 3.8
version = 3.7 || 3.8

[Poetry]

cli = poetry
version = 0.12
version = 1

[Graphviz]

33 changes: 0 additions & 33 deletions .vscode/settings.json

This file was deleted.

10 changes: 8 additions & 2 deletions CHANGELOG.md
@@ -1,3 +1,9 @@
# 0.6 (2020-01-25)

- Added a registration system for custom formatter classes.
- Fixed loading of missing attributes from disk for ORM methods.
- Added support for file patterns relative to the current directory.

# 0.5.1 (2019-11-14)

- Removed unnecessary warning when loading objects.
@@ -32,9 +38,9 @@

- Added an option to automatically resave files after loading.
- Added an option to automatically reload files after saving.
- Added registration system for custom class converters.
- Added a registration system for custom converter classes.
- Added initial support for file inference via `auto(filename)`.

# 0.1 (2019-01-13)

- Initial release.
- Initial release.
11 changes: 6 additions & 5 deletions README.md
@@ -2,18 +2,19 @@

Datafiles is a bidirectional serialization library for Python [dataclasses](https://docs.python.org/3/library/dataclasses.html) that synchronizes objects to the filesystem using type annotations. It supports a variety of file formats with round-trip preservation of formatting and comments, where possible. Object changes are automatically saved to disk and only include the minimum data needed to restore each object.

[![PyPI Version](https://img.shields.io/pypi/v/datafiles.svg)](https://pypi.org/project/datafiles)
[![PyPI License](https://img.shields.io/pypi/l/datafiles.svg)](https://pypi.org/project/datafiles)
[![Travis CI](https://img.shields.io/travis/jacebrowning/datafiles/develop.svg?label=unix)](https://travis-ci.org/jacebrowning/datafiles)
[![AppVeyor](https://img.shields.io/appveyor/ci/jacebrowning/datafiles/develop.svg?label=windows)](https://ci.appveyor.com/project/jacebrowning/datafiles)
[![Coveralls](https://img.shields.io/coveralls/jacebrowning/datafiles.svg)](https://coveralls.io/r/jacebrowning/datafiles)
[![PyPI License](https://img.shields.io/pypi/l/datafiles.svg)](https://pypi.org/project/datafiles)
[![PyPI Version](https://img.shields.io/pypi/v/datafiles.svg)](https://pypi.org/project/datafiles)
[![Gitter](https://img.shields.io/gitter/room/jacebrowning/datafiles?color=blue)](https://gitter.im/jacebrowning/datafiles)

Popular use cases include:

- Coercing user-editable files into the proper Python types
- Storing program configuration and data in version control
- Loading data fixtures for demonstration or testing purposes
- Prototyping data models agnostic of persistance backends
- Prototyping data models agnostic of persistence backends

## Overview

@@ -34,7 +35,7 @@ class InventoryItem:
return self.unit_price * self.quantity_on_hand
```

and decorate it with directory pattern to synchronize instances:
and decorate it with a directory pattern to synchronize instances:

```python
from datafiles import datafile
@@ -113,4 +114,4 @@ $ poetry add datafiles

## Documentation

To see additional syncrhonization and formatting options, please consult the [full documentation](https://datafiles.readthedocs.io).
To see additional synchronization and formatting options, please consult the [full documentation](https://datafiles.readthedocs.io).
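
For context, the decorator usage this README section refers to (its second snippet is elided above) looks roughly like the sketch below; the path pattern and default values are illustrative, while the field names come from the excerpt.

```python
from datafiles import datafile


@datafile("inventory/{self.name}.yml")  # illustrative pattern
class InventoryItem:
    name: str
    unit_price: float
    quantity_on_hand: int = 0


item = InventoryItem("widget", 3.0)
item.quantity_on_hand = 100  # per the README, changes are saved to disk automatically
```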
1 change: 1 addition & 0 deletions datafiles/converters/__init__.py
@@ -15,6 +15,7 @@


def register(cls: Union[type, str], converter: type):
"""Associate the given type signature with a converter class."""
_REGISTRY[cls] = converter
if not isinstance(cls, str):
_REGISTRY[cls.__name__] = converter
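
The docstring added here describes the converter registry: `register()` maps a type (and its name) to a converter class. A minimal sketch of registering a custom converter follows; the `Converter` base class and its `to_python_value` / `to_preserialization_data` hooks are assumptions based on the library's documented extension pattern, not shown in this diff.

```python
import datetime

from datafiles import converters


class TimestampConverter(converters.Converter):
    """Hypothetical converter storing datetimes as ISO 8601 strings."""

    @classmethod
    def to_python_value(cls, deserialized_data, **kwargs):
        return datetime.datetime.fromisoformat(deserialized_data)

    @classmethod
    def to_preserialization_data(cls, python_value, **kwargs):
        return python_value.isoformat()


# register() stores the converter under both the class and its name (see above)
converters.register(datetime.datetime, TimestampConverter)
```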
28 changes: 21 additions & 7 deletions datafiles/formats.py
@@ -1,5 +1,6 @@
import json
from abc import ABCMeta, abstractmethod
from contextlib import suppress
from io import StringIO
from pathlib import Path
from typing import IO, Any, Dict, List
@@ -11,6 +12,14 @@
from . import settings


_REGISTRY: Dict[str, type] = {}


def register(extension: str, formatter: type):
"""Associate the given file extension with a formatter class."""
_REGISTRY[extension] = formatter


class Formatter(metaclass=ABCMeta):
"""Base class for object serialization and text deserialization."""

@@ -92,17 +101,22 @@ def serialize(cls, data):


def deserialize(path: Path, extension: str) -> Dict:
for formatter in Formatter.__subclasses__():
if extension in formatter.extensions():
with path.open('r') as file_object:
return formatter.deserialize(file_object)

raise ValueError(f'Unsupported file extension: {extension}')
formatter = _get_formatter(extension)
with path.open('r') as file_object:
return formatter.deserialize(file_object)


def serialize(data: Dict, extension: str = '.yml') -> str:
formatter = _get_formatter(extension)
return formatter.serialize(data)


def _get_formatter(extension: str):
with suppress(KeyError):
return _REGISTRY[extension]

for formatter in Formatter.__subclasses__():
if extension in formatter.extensions():
return formatter.serialize(data)
return formatter

raise ValueError(f'Unsupported file extension: {extension!r}')
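
This is the formatter registration system mentioned in the CHANGELOG: `register()` maps a file extension to a formatter class, and `_get_formatter()` checks that registry before falling back to scanning `Formatter` subclasses. A sketch of registering a custom formatter; the extension and class are hypothetical, and the three classmethod hooks are inferred from how the diff calls them.

```python
import json
from typing import IO, Dict, List

from datafiles import formats


class JSON5(formats.Formatter):
    """Hypothetical formatter for a '.json5' extension (plain JSON as a stand-in)."""

    @classmethod
    def extensions(cls) -> List[str]:
        return ['.json5']

    @classmethod
    def deserialize(cls, file_object: IO) -> Dict:
        return json.load(file_object)

    @classmethod
    def serialize(cls, data: Dict) -> str:
        return json.dumps(data, indent=2)


formats.register('.json5', JSON5)  # consulted before the subclass scan
```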
9 changes: 5 additions & 4 deletions datafiles/hooks.py
@@ -159,9 +159,10 @@ def enabled(mapper, args) -> bool:
@contextmanager
def disabled():
"""Globally disable method hooks, temporarily."""
if settings.HOOKS_ENABLED:
enabled = settings.HOOKS_ENABLED
if enabled:
settings.HOOKS_ENABLED = False
try:
yield
settings.HOOKS_ENABLED = True
else:
yield
finally:
settings.HOOKS_ENABLED = enabled
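
The rewritten context manager snapshots the flag and restores it in a `finally` clause, so hooks are re-enabled even when the body raises (previously the flag stayed `False` after an exception). A small sketch of that guarantee, assuming `datafiles.hooks` and `datafiles.settings` are importable as the diff suggests:

```python
from datafiles import hooks, settings

try:
    with hooks.disabled():
        raise RuntimeError("boom")  # would have left hooks disabled before this change
except RuntimeError:
    pass

assert settings.HOOKS_ENABLED  # restored to its pre-block value (True by default)
```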
75 changes: 37 additions & 38 deletions datafiles/manager.py
@@ -12,6 +12,8 @@
import log
from parse import parse

from . import hooks


if TYPE_CHECKING:
from .mapper import Mapper
@@ -21,10 +23,44 @@
Missing = dataclasses._MISSING_TYPE


class HasDatafile(Protocol):
datafile: Mapper


class Splats:
def __getattr__(self, name):
return '*'


class Manager:
def __init__(self, cls):
self.model = cls

def get(self, *args, **kwargs) -> HasDatafile:
fields = dataclasses.fields(self.model)
missing_args = [Missing] * (len(fields) - len(args) - len(kwargs))
args = (*args, *missing_args)

with hooks.disabled():
instance = self.model(*args, **kwargs)
instance.datafile.load()

return instance

def get_or_none(self, *args, **kwargs) -> Optional[HasDatafile]:
try:
return self.get(*args, **kwargs)
except FileNotFoundError:
log.info("File not found")
return None

def get_or_create(self, *args, **kwargs) -> HasDatafile:
try:
return self.get(*args, **kwargs)
except FileNotFoundError:
log.info(f"File not found, creating '{self.model.__name__}' object")
return self.model(*args, **kwargs)

def all(self) -> Iterator[HasDatafile]:
root = Path(inspect.getfile(self.model)).parent
pattern = str(root / self.model.Meta.datafile_pattern)
@@ -33,35 +69,7 @@ def all(self) -> Iterator[HasDatafile]:
for filename in iglob(splatted):
log.debug(f'Found matching path: {filename}')
results = parse(pattern, filename)
args = list(results.named.values())
for _ in range(9):
try:
yield self.model(*args)
except TypeError:
args.append(Missing)
else:
break

def get_or_none(self, *args, **kwargs) -> Optional[HasDatafile]:
original_manual = self.model.Meta.datafile_manual

self.model.Meta.datafile_manual = True
instance = self.model(*args, **kwargs)
self.model.Meta.datafile_manual = original_manual

if instance.datafile.exists:
instance.datafile._manual = original_manual
return instance

return None

def get_or_create(self, *args, **kwargs) -> HasDatafile:
instance = self.model(*args, **kwargs)

if not instance.datafile.exists:
instance.datafile.save()

return instance
yield self.get(*results.named.values())

def filter(self, **query):
for item in self.all():
@@ -71,12 +79,3 @@ def filter(self, **query):
match = False
if match:
yield item


class HasDatafile(Protocol):
datafile: Mapper


class Splats:
def __getattr__(self, name):
return '*'
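
The manager is now built around `get()`, which pads missing positional arguments with the `Missing` sentinel, constructs the instance with hooks disabled, and loads it from disk; `get_or_none()` and `get_or_create()` simply catch `FileNotFoundError`, and `all()` reuses `get()` for each matched file. A usage sketch; the `objects` attribute name and the model definition are illustrative, not shown in this diff.

```python
from datafiles import datafile


@datafile("./inventory/{self.name}.yml")  # illustrative pattern
class InventoryItem:
    name: str
    unit_price: float = 0.0


item = InventoryItem.objects.get_or_create("widget")         # loads, or creates if the file is missing
maybe = InventoryItem.objects.get_or_none("does-not-exist")  # returns None on FileNotFoundError
for existing in InventoryItem.objects.all():
    print(existing.name, existing.unit_price)
```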
7 changes: 5 additions & 2 deletions datafiles/mapper.py
@@ -57,6 +57,10 @@ def path(self) -> Optional[Path]:
if not self._pattern:
return None

path = Path(self._pattern.format(self=self._instance))
if path.is_absolute() or self._pattern.startswith('./'):
return path.resolve()

cls = self._instance.__class__
try:
root = Path(inspect.getfile(cls)).parent
@@ -65,8 +69,7 @@
log.log(level, f'Unable to determine module for {cls}')
root = Path.cwd()

relpath = self._pattern.format(self=self._instance)
return (root / relpath).resolve()
return (root / path).resolve()

@property
def relpath(self) -> Path:
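
This is the CHANGELOG's "file patterns relative to the current directory" item: a pattern that is absolute or starts with `./` now resolves against the working directory instead of the directory containing the module that defines the class. A minimal sketch with an illustrative pattern:

```python
from datafiles import datafile


# The leading './' anchors the file to the current working directory,
# not to the package directory where this class is defined.
@datafile("./settings/{self.key}.yml")  # illustrative pattern
class Setting:
    key: str
    value: str = ""
```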
