
Commit

Add example in readme file
bassel-zeidan committed Oct 2, 2017
1 parent acc4cb0 commit 6a1e018
Showing 1 changed file with 57 additions and 1 deletion.
58 changes: 57 additions & 1 deletion scala/README.md
@@ -133,7 +133,7 @@ within a DSX Jupyter notebook, you can obtain your account credentials in the fo…
If your Object Storage was created with a Softlayer account, each part of the credentials will
be found as text that you can copy and paste into the example code below.

-### IBM Cloud Object Storage / Data Science Experience
+### Softlayer Cloud Object Storage / Data Science Experience
```scala
import com.ibm.ibmos2spark.CloudObjectStorage

// @@ -161,6 +161,62 @@ var dfData1 = spark. (unchanged lines collapsed in the diff view)
load(cos.url(bucketName, objectname))
```
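The diff collapses most of the Softlayer example above; only the import and the final load call are visible. A minimal sketch of what the full connection likely looks like, assuming HMAC-style credential keys (endPoint, accessKey, secretKey) and a constructor whose cosType defaults to the Softlayer variant; these details are assumptions, not shown in this hunk:
```scala
import com.ibm.ibmos2spark.CloudObjectStorage
import org.apache.spark.sql.SparkSession

// Credential key names below (endPoint, accessKey, secretKey) are an assumption;
// the actual keys are in the collapsed part of the README.
var credentials = scala.collection.mutable.HashMap[String, String](
    "endPoint"->"xxx",
    "accessKey"->"xxx",
    "secretKey"->"xxx"
)
var bucketName = "myBucket"
var objectname = "mydata.csv"

var configurationName = "cos_config_name" // you can choose any string you want
// cosType is assumed to default to Softlayer COS when omitted.
var cos = new CloudObjectStorage(sc, credentials, configurationName)
var spark = SparkSession.builder().getOrCreate()

var dfData1 = spark.
    read.format("org.apache.spark.sql.execution.datasources.csv.CSVFileFormat").
    option("header", "true").
    option("inferSchema", "true").
    load(cos.url(bucketName, objectname))
```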

### Bluemix Cloud Object Storage / Data Science Experience
The CloudObjectStorage class allows you to connect to Bluemix Cloud Object Storage (COS). You can connect to
Bluemix COS using API keys as follows:
```scala
import com.ibm.ibmos2spark.CloudObjectStorage
import org.apache.spark.sql.SparkSession

// The credentials HashMap may be created for you with the
// "insert to code" link in your DSX notebook.

var credentials = scala.collection.mutable.HashMap[String, String](
"endPoint"->"xxx",
"apiKey"->"xxx",
"serviceId"->"xxx"
)
var bucketName = "myBucket"
var objectname = "mydata.csv"

var configurationName = "cos_config_name" // you can choose any string you want
var cos = new CloudObjectStorage(sc, credentials, configurationName, "bluemix_cos")
var spark = SparkSession.
builder().
getOrCreate()

var dfData1 = spark.
read.format("org.apache.spark.sql.execution.datasources.csv.CSVFileFormat").
option("header", "true").
option("inferSchema", "true").
load(cos.url(bucketName, objectname))
```
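Once the configuration above is registered, the URL returned by cos.url can also be used as a write target. A minimal sketch, not part of this commit; the output object name "mydata_out.csv" and the short "csv" format alias are illustrative:
```scala
// Hypothetical write-back of the DataFrame to the same bucket.
// "mydata_out.csv" is an illustrative object name, not from the README.
dfData1.write.
    format("csv").
    option("header", "true").
    save(cos.url(bucketName, "mydata_out.csv"))
```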
Alternatively, you can connect to Bluemix COS using an IAM token. Example:
```scala
import com.ibm.ibmos2spark.CloudObjectStorage
import org.apache.spark.sql.SparkSession

// The credentials HashMap may be created for you with the
// "insert to code" link in your DSX notebook.

var credentials = scala.collection.mutable.HashMap[String, String](
"endPoint"->"xxx",
"iamToken"->"xxx",
"serviceId"->"xxx"
)
var bucketName = "myBucket"
var objectname = "mydata.csv"

var configurationName = "cos_config_name" // you can choose any string you want
var cos = new CloudObjectStorage(sc, credentials, configurationName, "bluemix_cos", "iam_token")
var spark = SparkSession.
builder().
getOrCreate()

var dfData1 = spark.
read.format("org.apache.spark.sql.execution.datasources.csv.CSVFileFormat").
option("header", "true").
option("inferSchema", "true").
load(cos.url(bucketName, objectname))
```
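Because CloudObjectStorage appears to register the connection in Spark's Hadoop configuration (not shown in this diff), the URL returned by cos.url should also work with other Spark readers. A minimal sketch under that assumption:
```scala
// Read the same object as a plain text RDD through the registered configuration.
// This usage is an assumption based on cos.url returning a Hadoop-compatible URL.
val lines = sc.textFile(cos.url(bucketName, objectname))
println(lines.count())
```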

### Bluemix / Data Science Experience

