Create python-package.yml (#141)
* Create python-package.yml

* Switch from setup.py to pyproject.toml

* Add caspailleur to dependencies

* Update requirements.txt and pyproject.toml

* Fix Python version nuances

* Update python version

* Update requirements

* Update README shields

* Skip plotly tests

---------

Co-authored-by: Egor Dudyrev <egor.dudyrev@loria.fr>
EgorDudyrev and Egor Dudyrev committed May 31, 2023
1 parent eb60549 commit 42f56c3
Showing 19 changed files with 461 additions and 140 deletions.
41 changes: 41 additions & 0 deletions .github/workflows/python-package.yml
@@ -0,0 +1,41 @@
# This workflow will install Python dependencies, run tests and lint with a variety of Python versions
# For more information see: https://docs.github.com/en/actions/automating-builds-and-tests/building-and-testing-python

name: Python package

on:
push:
branches: [ "main" ]
pull_request:
branches: [ "main" ]

jobs:
build:

runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
python-version: ["3.8", "3.9", "3.10"]

steps:
- uses: actions/checkout@v3
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v3
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies
run: |
python -m pip install --upgrade pip
python -m pip install flake8 pytest
if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
- name: Lint with flake8
run: |
# stop the build if there are Python syntax errors or undefined names
flake8 . --count --select=E9,F63,F7,F82 --show-source --statistics
# exit-zero treats all errors as warnings. The GitHub editor is 127 chars wide
flake8 . --count --exit-zero --max-complexity=10 --max-line-length=127 --statistics
- name: Test with pytest
run: |
pip install pytest pytest-cov
pytest --doctest-modules --cov=com --cov-report=xml --cov-report=html
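The test step runs `pytest --doctest-modules`, which also collects and executes the examples embedded in docstrings across the package. A minimal sketch of the kind of docstring example that flag picks up (the function below is illustrative, not part of FCApy):

```python
import doctest


def intent_size(attrs):
    """Return the number of attributes in an intent.

    The example below is executed by `pytest --doctest-modules`
    (or by `doctest.testmod()` directly):

    >>> intent_size({"a", "b"})
    2
    """
    return len(attrs)


if __name__ == "__main__":
    # Run the docstring examples in this module; failed == 0 when they pass.
    print(doctest.testmod().failed)
```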
2 changes: 1 addition & 1 deletion .readthedocs.yml
@@ -19,7 +19,7 @@ formats:

# Optionally set the version of Python and requirements required to build your docs
python:
version: 3.7
version: 3.8
install:
- method: pip
path: .
2 changes: 1 addition & 1 deletion .travis.yml
@@ -1,6 +1,6 @@
language: python
python:
- 3.6
- 3.8
before_install:
- python --version
- pip install -U pip
6 changes: 6 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,11 @@
# Changelog

## [0.1.4.2] - 2023-06-01

Elaborate Pattern Structures.
Rewrite the Sofia algorithm to mine hundreds of the most stable concepts from big data.
Add IntervalPS and SetPS to basic pattern structures.

## [0.1.4.1] - 2022-12-03

OSDA toolkit edition.
3 changes: 2 additions & 1 deletion README.md
@@ -1,6 +1,7 @@
# FCApy

[![Travis (.com)](https://img.shields.io/travis/com/EgorDudyrev/FCApy)](https://travis-ci.com/github/EgorDudyrev/FCApy)
[![PyPi](https://img.shields.io/pypi/v/fcapy)](https://pypi.org/project/fcapy)
[![GitHub Workflow](https://img.shields.io/github/actions/workflow/status/EgorDudyrev/caspailleur/python-package.yml?logo=github)](https://github.com/EgorDudyrev/fcapy/actions/workflows/python-package.yml)
[![Read the Docs (version)](https://img.shields.io/readthedocs/fcapy/latest)](https://fcapy.readthedocs.io/en/latest/)
[![Codecov](https://img.shields.io/codecov/c/github/EgorDudyrev/FCApy)](https://codecov.io/gh/EgorDudyrev/FCApy)
[![GitHub](https://img.shields.io/github/license/EgorDudyrev/FCApy)](https://github.com/EgorDudyrev/FCApy/blob/main/LICENSE)
2 changes: 1 addition & 1 deletion docs/source/conf.py
@@ -24,7 +24,7 @@
author = 'Egor Dudyrev'

# The full version, including alpha/beta/rc tags
release = '0.1.4.1'
release = '0.1.4.2'


# -- General configuration ---------------------------------------------------
2 changes: 1 addition & 1 deletion docs/source/index.rst
@@ -14,7 +14,7 @@ FCApy can be installed from `PyPI <https://pypi.org/project/fcapy>`_::

The library has no strict dependencies. However, it is best installed together with all the additional packages::

pip install fcapy[all]
pip install "fcapy[all]"


Contents
3 changes: 3 additions & 0 deletions fcapy/__init__.py
@@ -22,3 +22,6 @@ def check_installed_packages(package_descriptions):
'networkx': "The package to convert POSets to Graphs and to visualize them as graphs",
}
LIB_INSTALLED = check_installed_packages(PACKAGE_DESCRIPTION)


__version__ = '0.1.4.2'
27 changes: 14 additions & 13 deletions fcapy/algorithms/concept_construction.py
@@ -5,7 +5,7 @@
"""
from collections import deque
from typing import List, Tuple, Iterator, Iterable
from typing import List, Tuple, Iterator, Iterable, Union

import numpy as np
from bitarray import frozenbitarray as fbarray
@@ -20,8 +20,8 @@
from fcapy.utils import utils


def close_by_one(context: FormalContext | MVContext, n_projections_to_binarize: int = 1000)\
-> Iterator[FormalConcept | PatternConcept]:
def close_by_one(context: Union[FormalContext, MVContext], n_projections_to_binarize: int = 1000)\
-> Iterator[Union[FormalConcept, PatternConcept]]:
"""Return a list of concepts generated by CloseByOne (CbO) algorithm
Parameters
@@ -77,7 +77,7 @@ def pattern_concept_factory(extent_i, extent)
return concepts_iterator


def close_by_one_objectwise(context: FormalContext | MVContext) -> Iterator[FormalConcept | PatternConcept]:
def close_by_one_objectwise(context: Union[FormalContext, MVContext]) -> Iterator[Union[FormalConcept, PatternConcept]]:
"""Return a list of concepts generated by CloseByOne (CbO) algorithm
Parameters
@@ -135,7 +135,8 @@ def create_concept(extent_idxs, intent_idxs):
combinations_to_check.extend(new_combs)


def close_by_one_objectwise_fbarray(context: FormalContext | MVContext) -> Iterator[FormalConcept | PatternConcept]:
def close_by_one_objectwise_fbarray(context: Union[FormalContext, MVContext])\
-> Iterator[Union[FormalConcept, PatternConcept]]:
"""Return a list of concepts generated by CloseByOne (CbO) algorithm
Parameters
@@ -154,7 +155,7 @@ def close_by_one_objectwise_fbarray(context: FormalContext | MVContext) -> Itera
object_names, attribute_names = context.object_names, context.attribute_names
context_hash = context.hash_fixed()

def create_concept(extent_idxs: list[int], intent_ba: fbarray):
def create_concept(extent_idxs: List[int], intent_ba: fbarray):
extent_idxs = sorted(extent_idxs)
extent = [object_names[g_i] for g_i in extent_idxs]
if type(context) == FormalContext:
@@ -215,13 +216,13 @@ def extension_iter(intent_ba: fbarray, base_objects: Iterable[int] = range(n_obj


def sofia(
K: FormalContext | MVContext, L_max: int = 100, min_supp: float = 0,
K: Union[FormalContext, MVContext], L_max: int = 100, min_supp: float = 0,
use_tqdm: bool = False, use_log_stability_bound: bool = True
) -> list[FormalConcept | PatternConcept]:
) -> List[Union[FormalConcept, PatternConcept]]:
min_supp = min_supp * len(K) if min_supp < 1 else min_supp

if use_log_stability_bound:
def stability_lbounds(extents: list[fbarray]) -> list[float]:
def stability_lbounds(extents: List[fbarray]) -> List[float]:
#assert all(a.count() <= b.count() for a, b in zip(extents, extents[1:]))
bounds = []
for i, extent in enumerate(extents):
@@ -233,7 +234,7 @@ def stability_lbounds(extents: list[fbarray]) -> list[float]:
bounds.append(bound)
return bounds
else:
def stability_lbounds(extents: list[fbarray]) -> list[float]:
def stability_lbounds(extents: List[fbarray]) -> List[float]:
children_ordering = inverse_order(sort_intents_inclusion(extents))
children_intersections = (
((extent & (~extents[child])).count() for child in children.itersearch(True))
@@ -243,7 +244,7 @@ def stability_lbounds(extents: list[fbarray]) -> list[float]:
bounds = [1-sum(2**(-v) for v in intersections) for intersections in children_intersections]
return bounds

extents_proj: list[fbarray] = [fbarray(~bazeros(K.n_objects))]
extents_proj: List[fbarray] = [fbarray(~bazeros(K.n_objects))]

n_projs = K.n_bin_attrs
proj_iterator = utils.safe_tqdm(enumerate(K.to_bin_attr_extents()), total=n_projs,
@@ -475,8 +476,8 @@ def direct_super_concepts(concept):
return lattice


def lcm_skmine(context: FormalContext | MVContext, min_supp: float = 1, n_jobs: int = 1)\
-> list[FormalConcept | PatternConcept]:
def lcm_skmine(context: Union[FormalContext, MVContext], min_supp: float = 1, n_jobs: int = 1)\
-> List[Union[FormalConcept, PatternConcept]]:
from skmine.itemsets import LCM

context_bin = context if isinstance(context, FormalContext) else context.binarize()
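The recurring change in this file — `FormalContext | MVContext` becoming `Union[FormalContext, MVContext]` and `list[...]` becoming `List[...]` — is what makes the module importable on the Python 3.8 and 3.9 rows of the new CI matrix: PEP 604 `X | Y` unions need 3.10, and PEP 585 builtin generics like `list[int]` need 3.9. A minimal sketch with stand-in classes (not the real FCApy types):

```python
from typing import Iterator, List, Union


class FormalContext: ...   # stand-ins for fcapy's real classes
class MVContext: ...
class FormalConcept: ...
class PatternConcept: ...


# On Python 3.8/3.9 this definition imports fine, whereas annotating with
# `FormalContext | MVContext` would raise TypeError at import time on
# those versions (annotations are evaluated when the `def` executes).
def close_by_one_sketch(
    context: Union[FormalContext, MVContext]
) -> Iterator[Union[FormalConcept, PatternConcept]]:
    yield FormalConcept()


# `List[...]` from typing works on 3.8; the builtin `list[...]` would not.
concepts: List[Union[FormalConcept, PatternConcept]] = list(
    close_by_one_sketch(FormalContext())
)
print(type(concepts[0]).__name__)  # FormalConcept
```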
20 changes: 10 additions & 10 deletions fcapy/algorithms/lattice_construction.py
@@ -8,17 +8,17 @@
"""
from copy import deepcopy
from typing import Collection
from typing import Collection, Union, Tuple, List, Dict, Set

from fcapy.lattice.formal_concept import FormalConcept
from fcapy.lattice.pattern_concept import PatternConcept
from fcapy.utils import utils


def complete_comparison(
concepts: Collection[FormalConcept or PatternConcept],
concepts: Collection[Union[FormalConcept, PatternConcept]],
is_concepts_sorted: bool = False, n_jobs: int = 1, use_tqdm: bool = False
) -> dict[int, set[int]]:
) -> Dict[int, Set[int]]:
"""Return a dict with subconcepts relation on given ``concepts``. A slow but accurate bruteforce method
Parameters
@@ -76,7 +76,7 @@ def get_subconcepts(a_i, a, concepts):
return subconcepts_dict


def construct_spanning_tree(concepts, is_concepts_sorted=False, use_tqdm=False) -> dict[int, int]:
def construct_spanning_tree(concepts, is_concepts_sorted=False, use_tqdm=False) -> Dict[int, int]:
"""Return a spanning tree of subconcepts relation on given ``concepts``.
A spanning tree means that for each concept from ``concepts`` we look for one parent concept only
@@ -145,7 +145,7 @@


def construct_lattice_from_spanning_tree(concepts, sptree_chains, is_concepts_sorted=False, use_tqdm=False)\
-> dict[int, set[int]]:
-> Dict[int, Set[int]]:
"""Return a dict with subconcepts relation on given concepts from given spanning tree of the relation.
Parameters
@@ -419,7 +419,7 @@ def sort_key(sc_i):


def construct_lattice_by_spanning_tree(concepts, is_concepts_sorted=False, n_jobs=1, use_tqdm=False)\
-> dict[int, set[int]]:
-> Dict[int, Set[int]]:
"""Return a dict with subconcepts relation on given ``concepts``. Uses spanning tree approach to fasten the computation
Parameters
@@ -453,9 +453,9 @@ def construct_lattice_by_spanning_tree(concepts, is_concepts_sorted=False, n_job
return subconcepts_dict


def order_extents_comparison(concepts: list[FormalConcept | PatternConcept]) -> dict[int, set[int]]:
def order_extents_comparison(concepts: List[Union[FormalConcept, PatternConcept]]) -> Dict[int, Set[int]]:
from caspailleur.order import inverse_order, sort_intents_inclusion, topological_sorting, test_topologically_sorted
from caspailleur.base_functions import isets2bas, bas2isets
from caspailleur.base_functions import isets2bas

n_objects = max(len(c.extent_i) for c in concepts)
extents_ba = list(isets2bas([c.extent_i for c in concepts], n_objects))
@@ -477,7 +477,7 @@ def add_concept(
new_concept, concepts, subconcepts_dict, superconcepts_dict,
top_concept_i=None, bottom_concept_i=None,
inplace=True
) -> tuple[list[FormalConcept | PatternConcept], dict[int, set[int]], dict[int, set[int]], int, int]:
) -> Tuple[List[Union[FormalConcept, PatternConcept]], Dict[int, Set[int]], Dict[int, Set[int]], int, int]:
"""Add ``new_concept`` into a set of ``concepts`` regarding its subconcept relation
Parameters
@@ -590,7 +590,7 @@ def remove_concept(
concept_i, concepts, subconcepts_dict, superconcepts_dict,
top_concept_i=None, bottom_concept_i=None,
inplace=True
) -> tuple[list[FormalConcept | PatternConcept], dict[int, set[int]], dict[int, set[int]], int, int]:
) -> Tuple[List[Union[FormalConcept, PatternConcept]], Dict[int, Set[int]], Dict[int, Set[int]], int, int]:
"""Remove a ``concept_i`` from a set of ``concepts`` regarding its subconcept relation
Parameters
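All the functions in this file return the subconcept relation in the same shape — `Dict[int, Set[int]]`, mapping a concept's index to the indices of its direct subconcepts. A toy sketch of consuming such a dict (the data below is invented, not produced by FCApy):

```python
from typing import Dict, Set

# Direct-subconcept relation in the shape complete_comparison() returns:
# concept index -> indices of its direct subconcepts (toy data).
subconcepts: Dict[int, Set[int]] = {0: {1, 2}, 1: {3}, 2: {3}, 3: set()}


def all_subconcepts(i: int, rel: Dict[int, Set[int]]) -> Set[int]:
    """Transitive closure of the direct-subconcept relation for concept ``i``."""
    out: Set[int] = set()
    stack = list(rel[i])
    while stack:
        j = stack.pop()
        if j not in out:
            out.add(j)
            stack.extend(rel[j])
    return out


print(sorted(all_subconcepts(0, subconcepts)))  # [1, 2, 3]
```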
4 changes: 2 additions & 2 deletions fcapy/context/bintable.py
@@ -8,7 +8,7 @@
from fcapy.context import bintable_errors as berrors
from fcapy import LIB_INSTALLED
#if LIB_INSTALLED['bitarray']:
from bitarray import frozenbitarray as fbarray, bitarray as barray, util as butil
from bitarray import frozenbitarray as fbarray, util as butil

#if LIB_INSTALLED['numpy']:
import numpy as np
@@ -232,7 +232,7 @@ def decide_dataclass(data: Collection) -> str:
return 'BinTableBitarray'
if isinstance(data, np.ndarray):
return 'BinTableNumpy'
if isinstance(data, tuple) and len(data) == 2 and isinstance(data[0], fbitarray) and isinstance(data[1], int):
if isinstance(data, tuple) and len(data) == 2 and isinstance(data[0], fbarray) and isinstance(data[1], int):
return 'BinTableBitarray'

raise berrors.UnknownDataTypeError(type(data))
2 changes: 1 addition & 1 deletion fcapy/context/formal_context.py
@@ -713,7 +713,7 @@ def to_numeric(self):
"""
return self._data.to_list(), self._attribute_names

def to_bin_attr_extents(self) -> Iterator[tuple[str, fbarray]]:
def to_bin_attr_extents(self) -> Iterator[Tuple[str, fbarray]]:
for i, m in enumerate(self.attribute_names):
extent = fbarray(self.data[:, i])
yield m, extent
10 changes: 5 additions & 5 deletions fcapy/mvcontext/mvcontext.py
@@ -6,7 +6,7 @@
from frozendict import frozendict
from itertools import combinations
import zlib
from typing import Tuple, Iterator
from typing import Tuple, Iterator, List
import json
from bitarray import frozenbitarray as fbarray

@@ -97,7 +97,7 @@ def attribute_names(self, value):
self._attribute_names = value

@property
def pattern_structures(self) -> list[PS.AbstractPS]:
def pattern_structures(self) -> List[PS.AbstractPS]:
"""A list of pattern structures kept in a context"""
return self._pattern_structures

@@ -115,7 +115,7 @@ def target(self):
"""A list of target values for Supervised ML scenarios"""
return self._target

def assemble_pattern_structures(self, data, pattern_types) -> list[PS.AbstractPS]:
def assemble_pattern_structures(self, data, pattern_types) -> List[PS.AbstractPS]:
"""Return pattern_structures based on ``data`` and the ``pattern_types``"""
if data is None:
return None
@@ -302,7 +302,7 @@ def read_json(path: str = None, json_data: str = None, pattern_types: Tuple[PS.A
A path to .json file
json_data: `str`
A json encoded data
pattern_types: `tuple[AbstractPS]`
pattern_types: `Tuple[AbstractPS]`
Tuple of additional Pattern Structures not defined in fcapy.mvcontext.pattern_structure
Returns
@@ -608,7 +608,7 @@ def describe_pattern(self, data: dict) -> str:
description = [descr for descr in description if descr]
return '; '.join(description)

def to_bin_attr_extents(self) -> Iterator[tuple[str, fbarray]]:
def to_bin_attr_extents(self) -> Iterator[Tuple[str, fbarray]]:
for ps_i, ps in enumerate(self.pattern_structures):
for m, extent in ps.to_bin_attr_extents():
yield m, extent