Add migration doc and update R doc
zjffdu committed Dec 15, 2015
1 parent 3e71b06 commit d72a3af
Showing 2 changed files with 9 additions and 2 deletions.
4 changes: 2 additions & 2 deletions R/pkg/R/DataFrame.R
@@ -1886,7 +1886,7 @@ setMethod("except",
#' @param df A SparkSQL DataFrame
#' @param path A name for the table
#' @param source A name for external data source
-#' @param mode One of 'append', 'overwrite', 'error', 'ignore' save mode
+#' @param mode One of 'append', 'overwrite', 'error', 'ignore' save mode (it is 'error' by default)
#'
#' @family DataFrame functions
#' @rdname write.df
@@ -1951,7 +1951,7 @@ setMethod("saveDF",
#' @param df A SparkSQL DataFrame
#' @param tableName A name for the table
#' @param source A name for external data source
-#' @param mode One of 'append', 'overwrite', 'error', 'ignore' save mode
+#' @param mode One of 'append', 'overwrite', 'error', 'ignore' save mode (it is 'error' by default)
#'
#' @family DataFrame functions
#' @rdname saveAsTable
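As an aside (not part of the diff), here is a minimal SparkR 1.x sketch of the two APIs whose documentation is touched above, passing `mode` explicitly rather than relying on the default; the dataset, output path, and table name are placeholders:

```r
library(SparkR)

sc <- sparkR.init()                         # assumes a local Spark installation
sqlContext <- sparkRSQL.init(sc)
df <- createDataFrame(sqlContext, faithful) # small built-in dataset as a stand-in

# Pass the save mode explicitly so the result does not depend on the default
# ('error' after this change).
write.df(df, path = "faithful.parquet", source = "parquet", mode = "overwrite")
saveAsTable(df, tableName = "faithful_tbl", source = "parquet", mode = "overwrite")
```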
7 changes: 7 additions & 0 deletions docs/sparkr.md
@@ -387,3 +387,10 @@ The following functions are masked by the SparkR package:
Since part of SparkR is modeled on the `dplyr` package, certain functions in SparkR share the same names with those in `dplyr`. Depending on the load order of the two packages, some functions from the package loaded first are masked by those in the package loaded after. In such case, prefix such calls with the package name, for instance, `SparkR::cume_dist(x)` or `dplyr::cume_dist(x)`.

You can inspect the search path in R with [`search()`](https://stat.ethz.ch/R-manual/R-devel/library/base/html/search.html)
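
As an illustration (not part of this commit), a short sketch of disambiguating a masked name; it assumes the SparkR session and `df` from the sketch above, plus a local `dplyr` installation:

```r
library(dplyr)
library(SparkR)  # loaded after dplyr, so shared names now resolve to SparkR

# Prefix the package name to pick the intended implementation explicitly.
head(SparkR::filter(df, df$waiting > 70))  # SparkR DataFrame filter
dplyr::filter(faithful, waiting > 70)      # dplyr filter on a local data.frame
```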


# Migration Guide

## Upgrading From SparkR 1.6 to 1.7

- Before Spark 1.7, the default save mode of the `saveDF`/`write.df`/`saveAsTable` APIs was `append`. It has been changed to `error` to be consistent with the Scala API.
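
A hypothetical before/after sketch of this behavioral change, again reusing `df` and the SparkR session from the first sketch above (the output path is a placeholder):

```r
# Old behavior (pre-1.7): the default mode was 'append', so a repeated write
# silently added rows. New behavior: the default is 'error', so the second
# call fails because the path already exists.
write.df(df, path = "out.parquet", source = "parquet")
write.df(df, path = "out.parquet", source = "parquet")                  # now errors
write.df(df, path = "out.parquet", source = "parquet", mode = "append") # opt in to the old behavior
```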
