Merge pull request #43 from Tubular/fix/enable-setting-override
FIX make it possible to override spark.sql.catalogImplementation
thesamet committed Jun 2, 2017
2 parents ba137da + 4370a95 commit c892c7e
Showing 3 changed files with 5 additions and 2 deletions.
3 changes: 3 additions & 0 deletions CHANGELOG.md
@@ -1,3 +1,6 @@
+## 2.0.4
+* Make it possible to override default value of spark.sql.catalogImplementation
+
 ## 2.0.3
 * Add KafkaWatcher to facilitate testing of writing to Kafka
 * Fix a few minor pyflakes warnings and typos
2 changes: 1 addition & 1 deletion sparkly/__init__.py
@@ -19,4 +19,4 @@
assert SparklySession


-__version__ = '2.0.3'
+__version__ = '2.0.4'
2 changes: 1 addition & 1 deletion sparkly/session.py
@@ -74,8 +74,8 @@ def __init__(self, additional_options=None):

# Init SparkContext
spark_conf = SparkConf()
-        spark_conf.setAll(self._setup_options(additional_options))
         spark_conf.set('spark.sql.catalogImplementation', 'hive')
+        spark_conf.setAll(self._setup_options(additional_options))
spark_context = SparkContext(conf=spark_conf)

# Init HiveContext
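The ordering change above is the whole fix: `SparkConf.set` writes a single key and `setAll` writes each pair in turn, so whichever call runs last wins for a given key. Before this commit the hard-coded `'hive'` default was applied after the user's options and silently clobbered any user-supplied `spark.sql.catalogImplementation`. A minimal sketch of that semantics, using a toy stand-in class rather than `pyspark.SparkConf` (the names `MiniConf` and `user_options` are illustrative, not from the source):

```python
class MiniConf:
    """Toy stand-in mimicking pyspark.SparkConf's set/setAll semantics:
    both are plain last-write-wins key assignments."""

    def __init__(self):
        self._conf = {}

    def set(self, key, value):
        self._conf[key] = value
        return self

    def setAll(self, pairs):
        # setAll is just repeated set(); later calls overwrite earlier ones.
        for key, value in pairs:
            self._conf[key] = value
        return self


# Hypothetical user option that should override the library default.
user_options = {'spark.sql.catalogImplementation': 'in-memory'}

# Old order: the hard-coded default runs last and clobbers the user's choice.
old = MiniConf()
old.setAll(user_options.items())
old.set('spark.sql.catalogImplementation', 'hive')
assert old._conf['spark.sql.catalogImplementation'] == 'hive'

# New order: the default is set first, so user-supplied options win.
new = MiniConf()
new.set('spark.sql.catalogImplementation', 'hive')
new.setAll(user_options.items())
assert new._conf['spark.sql.catalogImplementation'] == 'in-memory'
```

With the commit applied, a session constructed with `additional_options` containing `spark.sql.catalogImplementation` keeps the caller's value, while callers who pass nothing still get the `'hive'` default.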
