
update comments and readme file
bassel-zeidan committed Sep 22, 2017
1 parent 74463f2 commit 54967f9
Showing 2 changed files with 8 additions and 10 deletions.
7 changes: 4 additions & 3 deletions scala/README.md
@@ -49,8 +49,8 @@ This library is cross-built on both Scala 2.10 (for Spark 1.6.0) and Scala 2.11

#### IBM Spark Service

-The `ibmos2spark` Scala library package is now pre-installed on IBM Apache Spark as a service. This includes
-service instances created in Bluemix or in Data Science Experience.
+The `ibmos2spark` Scala library package is now pre-installed on IBM Apache Spark as a service. This includes
+service instances created in Bluemix or in Data Science Experience.

### Snapshots

@@ -148,7 +148,8 @@ var credentials = scala.collection.mutable.HashMap[String, String](
var bucketName = "myBucket"
var objectname = "mydata.csv"

-var cos = new CloudObjectStorage(sc, credentials)
+var configurationName = "cos_config_name" // you can choose any string you want
+var cos = new CloudObjectStorage(sc, credentials, configurationName)
var spark = SparkSession.
builder().
getOrCreate()
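The README snippet above creates the `CloudObjectStorage` instance but stops before any data is read. A minimal sketch of the next step, assuming the library's `url` helper composes paths of the form `cos://<bucket>.<configurationName>/<object>` (the exact scheme is an assumption here, not shown in this diff):

```scala
// Sketch: how an object-storage path for Spark might be composed.
// The cos://<bucket>.<configName>/<object> layout is an assumption
// about this library's URL scheme, mirroring what cos.url(...) would return.
object CosUrlSketch {
  def cosUrl(bucket: String, configName: String, objectName: String): String =
    s"cos://$bucket.$configName/$objectName"
}

val path = CosUrlSketch.cosUrl("myBucket", "cos_config_name", "mydata.csv")
// path == "cos://myBucket.cos_config_name/mydata.csv"
// A path like this could then be handed to spark.read.csv(path).
println(path)
```

The configuration name in the path tells the Hadoop layer which set of credentials (registered under that name) to use when resolving the bucket.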
11 changes: 4 additions & 7 deletions scala/src/main/scala/Osconfig.scala
@@ -141,20 +141,17 @@ class bluemix(sc: SparkContext, name: String, creds: HashMap[String, String],
* secretKey
* cosId [optional]: the unique id of the Cloud Object Storage instance. It is useful
to keep in the class instance for later checks after initialization, but
it is not required for the class to work. This value can be retrieved by
calling the getCosId function.
* configurationName [optional]: string that identifies this configuration. You can
use any string you like. This allows you to create
multiple configurations for different Object Storage accounts.
If a configuration name is not passed, the default name "service" is used.
* bucket_name (projectId in DSX) [optional]: string that identifies the default
bucket name you want to access files from in the COS service instance.
In DSX, bucket_name is the same as projectId. One bucket is
associated with one project.
If this value is not specified, you need to pass it when
you use the url function.
*
Warning: creating a new instance of this class with the same SparkContext
instance will overwrite any Spark Hadoop configuration set earlier.
*/
class CloudObjectStorage(sc: SparkContext, credentials: HashMap[String, String], configurationName: String = "") {

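The warning above exists because each configuration name is typically baked into the Hadoop property keys that the class writes into the shared SparkContext. A sketch of that key namespacing, assuming the common `fs.cos.<configName>.<field>` convention used by COS/Stocator connectors (the exact key names are an assumption, not confirmed by this diff):

```scala
// Sketch: per-configuration Hadoop property keys, namespaced by the
// configuration name. The fs.cos.<name>.<field> pattern is an assumption
// based on common COS connector conventions.
object HadoopKeySketch {
  def hadoopKey(configName: String, field: String): String = {
    // An empty configurationName falls back to the default name "service",
    // matching the documented default above.
    val name = if (configName.isEmpty) "service" else configName
    s"fs.cos.$name.$field"
  }
}

println(HadoopKeySketch.hadoopKey("", "access.key"))
println(HadoopKeySketch.hadoopKey("cos_config_name", "endpoint"))
```

Because two instances created with the same configuration name write to the same keys on the same SparkContext, the second instance silently replaces the first instance's credentials; distinct configuration names keep them separate.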
