Update change log [skip ci]
standage committed Nov 13, 2018
1 parent 3f9d0e3 commit f6e4e65
1 changed file: CHANGELOG.md (23 additions, 0 deletions)
All notable changes to this project will be documented in this file.
This project adheres to [Semantic Versioning](http://semver.org/).

## [Unreleased]

### Added
- The `kevlar count` operation now supports masks and 8-, 4-, or 1-bit counters (see #277 and #291).
- A Jupyter notebook and supporting code and data for evaluating kevlar's performance on a simulated data set (see #271).
- New flags for filtering gDNA cutouts or calls from specified sequences (see #285).
- New filter that discards any contig/gDNA alignment with more than 4 mismatches (see #288).
- A new feature that generates a Nodetable containing only variant-spanning k-mers to support re-counting k-mers and computing likelihood scores in low memory (see #289, #292).
- A new `kevlar cutout` command that mimics the behavior of `kevlar localize` but does so much more efficiently (see #294).

### Changed
- Ported augfastx handling from the `kevlar.seqio` module to a new Cython module (see #279).
- The dynamic error model for likelihood calculations is now a configurable option (see #286).
- Cleaned up overlap-related code with a new `ReadPair` class (see #283).
- Updated `kevlar assemble`, `kevlar localize`, and `kevlar call` to accept streams of partitioned reads—previously, only reads for a single partition were permitted (see #294).

### Fixed
- Minor bug with .gml output due to a change in the networkx package (see #278).

### Removed
- Dropped the buggy home-grown greedy assembler (see #279); some parts of the overlap code were retained and refactored (see #283).


## [0.5.0] 2018-06-14

### Fixed
