Commit

Merge pull request #15 from slicebox/release-2017-05-10
Release 2017 05 10
kobmic committed May 10, 2017
2 parents 39bb283 + 3970ca6 commit 73971db
Showing 20 changed files with 1,729 additions and 252 deletions.
12 changes: 12 additions & 0 deletions .travis.yml
@@ -0,0 +1,12 @@
language: scala

jdk:
- oraclejdk8

notifications:
email:
- karl.sjostrand@gmail.com

script: "sbt clean coverage test"
after_success: "sbt coverageReport coveralls"

56 changes: 53 additions & 3 deletions README.md
@@ -1,5 +1,11 @@
# dcm4che-streams

Service | Status | Description
------- | ------ | -----------
Travis | [![Build Status](https://travis-ci.org/slicebox/dcm4che-streams.svg?branch=develop)](https://travis-ci.org/slicebox/dcm4che-streams) | [Tests](https://travis-ci.org/slicebox/dcm4che-streams/)
Coveralls | [![Coverage Status](https://coveralls.io/repos/github/slicebox/dcm4che-streams/badge.svg?branch=develop)](https://coveralls.io/github/slicebox/dcm4che-streams?branch=develop) | Code coverage


The purpose of this project is to integrate [akka-streams](http://doc.akka.io/docs/akka/current/scala/stream/index.html)
with [dcm4che](https://github.com/dcm4che/dcm4che). Features will be added as needed (mainly in the
[slicebox](https://github.com/slicebox/slicebox) project) and may include streaming reading and writing of DICOM data,
@@ -11,8 +17,8 @@ DICOM data chunk size and network utilization using back-pressure as specified i

### Usage

The following example reads a DICOM file from disk, validates that it is a DICOM file, discards all attributes but
PatientName and PatientID and writes it to a new file.
The following example reads a DICOM file from disk, validates that it is a DICOM file, discards all private attributes
and writes it to a new file.

```scala
import akka.stream.scaladsl.FileIO
import java.nio.file.Paths
import org.dcm4che3.data.Tag
import se.nimsa.dcm4che.streams.DicomFlows._
import se.nimsa.dcm4che.streams.DicomPartFlow._
FileIO.fromPath(Paths.get("source-file.dcm"))
.via(validateFlow)
.via(partFlow)
.via(partFilter(Seq(Tag.PatientName, Tag.PatientID)))
.via(blacklistFilter(DicomParsing.isPrivateAttribute(_)))
.map(_.bytes)
.runWith(FileIO.toPath(Paths.get("target-file.dcm")))
```

The same result can be achieved with a whitelist filter instead, but we need to tell the filter
to keep the preamble:

```scala
import akka.stream.scaladsl.FileIO
import java.nio.file.Paths
import org.dcm4che3.data.Tag
import se.nimsa.dcm4che.streams.DicomFlows._
import se.nimsa.dcm4che.streams.DicomPartFlow._

FileIO.fromPath(Paths.get("source-file.dcm"))
  .via(validateFlow)
  .via(partFlow)
  .via(whitelistFilter(tag => !DicomParsing.isPrivateAttribute(tag), keepPreamble = true))
  .map(_.bytes)
  .runWith(FileIO.toPath(Paths.get("target-file.dcm")))
```


The next example materializes the above stream as dcm4che `Attributes` objects instead of writing data to disk.


```scala
import akka.stream.scaladsl.FileIO
import java.nio.file.Paths
import org.dcm4che3.data.{Attributes, Tag}
import scala.concurrent.Future
import se.nimsa.dcm4che.streams.DicomAttributesSink._
import se.nimsa.dcm4che.streams.DicomFlows._
import se.nimsa.dcm4che.streams.DicomPartFlow._

val futureAttributes: Future[(Option[Attributes], Option[Attributes])] =
  FileIO.fromPath(Paths.get("source-file.dcm"))
    .via(validateFlow)
    .via(partFlow)
    .via(whitelistFilter(Seq(Tag.PatientName, Tag.PatientID)))
    .via(attributeFlow) // must turn headers + chunks into complete attributes before materializing
    .runWith(attributesSink)

futureAttributes.map {
  case (maybeMetaInformation, maybeDataset) => ??? // do something with attributes here
}
```
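The snippets above assume that an implicit `ActorSystem`, materializer and execution context are already in scope, which `FileIO` sources/sinks and `Future` callbacks require in Akka 2.4. A minimal, hypothetical setup (not part of this commit) might look like:

```scala
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import scala.concurrent.ExecutionContext

// Boilerplate assumed by the examples: running a stream needs a
// materializer, and mapping over the resulting Future needs an
// ExecutionContext. The system name is arbitrary.
implicit val system: ActorSystem = ActorSystem("dcm4che-streams-example")
implicit val materializer: ActorMaterializer = ActorMaterializer()
implicit val ec: ExecutionContext = system.dispatcher
```

Remember to call `system.terminate()` when the streams have completed.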
48 changes: 43 additions & 5 deletions build.sbt
@@ -1,10 +1,10 @@
import de.heikoseeberger.sbtheader.license.Apache2_0

name := "dcm4che-streams"
version := "1.0-SNAPSHOT"
version := "0.1.1"
organization := "se.nimsa"
scalaVersion := "2.12.1"
crossScalaVersions := Seq("2.11.8", "2.12.1")
scalaVersion := "2.12.2"
crossScalaVersions := Seq("2.11.8", "2.12.2")
scalacOptions := Seq("-encoding", "UTF-8", "-Xlint", "-deprecation", "-unchecked", "-feature", "-target:jvm-1.8")

// define the project
@@ -14,16 +14,18 @@ lazy val root = (project in file(".")).enablePlugins(GitBranchPrompt)
// repos

resolvers ++= Seq(
"Typesafe Repository" at "http://repo.typesafe.com/typesafe/releases/")
"Typesafe Repository" at "http://repo.typesafe.com/typesafe/releases/",
"dcm4che Repository" at "http://www.dcm4che.org/maven2/")

// deps

libraryDependencies ++= {
val akkaVersion = "2.4.16"
val akkaVersion = "2.4.17"
Seq(
"com.typesafe.akka" %% "akka-stream" % akkaVersion,
"com.typesafe.akka" %% "akka-slf4j" % akkaVersion,
"org.slf4j" % "slf4j-simple" % "1.7.22",
"org.dcm4che" % "dcm4che-core" % "3.3.8" % "provided",
"org.scalatest" %% "scalatest" % "3.0.1" % "test",
"com.typesafe.akka" %% "akka-stream-testkit" % akkaVersion % "test"
)
@@ -34,3 +36,39 @@ updateOptions := updateOptions.value.withCachedResolution(true)
// for automatic license stub generation

headers := Map("scala" -> Apache2_0("2017", "Lars Edenbrandt"))

// publish
publishMavenStyle := true

publishArtifact in Test := false

publishTo := {
val nexus = "https://oss.sonatype.org/"
if (isSnapshot.value)
Some("snapshots" at nexus + "content/repositories/snapshots")
else
Some("releases" at nexus + "service/local/staging/deploy/maven2")
}

pomIncludeRepository := { _ => false }

pomExtra := (
<url>https://github.com/slicebox/dcm4che-streams</url>
<licenses>
<license>
<name>Apache-2.0</name>
<url>https://opensource.org/licenses/Apache-2.0</url>
<distribution>repo</distribution>
</license>
</licenses>
<scm>
<url>git@github.com:slicebox/dcm4che-streams.git</url>
<connection>scm:git:git@github.com:slicebox/dcm4che-streams.git</connection>
</scm>
<developers>
<developer>
<id>KarlSjostrand</id>
<name>Karl Sjöstrand</name>
<url>https://github.com/KarlSjostrand</url>
</developer>
</developers>)
Binary file removed lib/dcm4che-core-3.3.7.jar
@@ -25,6 +25,14 @@ import akka.util.ByteString
import scala.annotation.tailrec
import scala.util.control.{NoStackTrace, NonFatal}

/**
* This class is borrowed (with minor modifications) from the
 * <a href="https://github.com/akka/akka/blob/master/akka-stream/src/main/scala/akka/stream/impl/io/ByteStringParser.scala">Akka internal API</a>.
 * It provides a stateful parser from a stream of byte chunks to a stream of objects of type <code>T</code>. The main
 * addition made to this class is the possibility to handle deflated byte streams, which may be inflated on the fly before
 * parsing.
* @tparam T the type created by this parser
*/
abstract class ByteStringParser[T] extends GraphStage[FlowShape[ByteString, T]] {

import ByteStringParser._
17 changes: 13 additions & 4 deletions src/main/scala/se/nimsa/dcm4che/streams/DicomAttributesSink.scala
@@ -18,15 +18,12 @@ package se.nimsa.dcm4che.streams

import akka.stream.scaladsl.Sink
import org.dcm4che3.data.{Attributes, Fragments, Sequence}
import se.nimsa.dcm4che.streams.DicomParts._

import scala.concurrent.{ExecutionContext, Future}


object DicomAttributesSink {

import DicomPartFlow._
import DicomFlows._

private case class AttributesData(attributesStack: Seq[Attributes],
sequenceStack: Seq[Sequence],
currentFragments: Option[Fragments])
@@ -65,6 +62,18 @@
}
}

/**
* Creates a <code>Sink</code> which ingests DICOM parts as output by the <code>DicomPartFlow</code> followed by the
* <code>DicomFlows.attributeFlow</code> and materializes into two dcm4che <code>Attributes</code> objects, one for
* meta data and one for the dataset.
*
 * Based closely on the dcm4che
 * <a href="https://github.com/dcm4che/dcm4che/blob/master/dcm4che-core/src/main/java/org/dcm4che3/io/DicomInputStream.java">DicomInputStream</a>
 * class (complementing what is not covered by <code>DicomPartFlow</code>).
*
* @param ec an implicit ExecutionContext
* @return a <code>Sink</code> for materializing a flow of DICOM parts into dcm4che <code>Attribute</code>s.
*/
def attributesSink(implicit ec: ExecutionContext): Sink[DicomPart, Future[(Option[Attributes], Option[Attributes])]] =
Sink.fold[AttributesSinkData, DicomPart](AttributesSinkData(None, None)) { case (attributesSinkData, dicomPart) =>
dicomPart match {
