Merge 883f37c into 5fbc644
zzcclp committed Mar 18, 2019
2 parents 5fbc644 + 883f37c commit 79106d1
Showing 2 changed files with 19 additions and 10 deletions.
23 changes: 15 additions & 8 deletions docs/dml-of-carbondata.md
@@ -35,10 +35,14 @@ CarbonData DML statements are documented here, which includes:
This command is used to load CSV files into CarbonData; OPTIONS are not mandatory for the data loading process.

```
LOAD DATA [LOCAL] INPATH 'folder_path'
LOAD DATA INPATH 'folder_path'
INTO TABLE [db_name.]table_name
OPTIONS(property_name=property_value, ...)
```
**NOTE**:
- If the SQL is run in cluster mode, upload all input files to a distributed store such as HDFS or S3 first (a minimal sketch follows this list).
- If the SQL is run in local mode with HDFS, upload all input files to HDFS.
- If the SQL is run in local mode with the local file system, input files can only be read from the local file system.
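As a minimal sketch of the cluster-mode case (the HDFS URI, directory, and table name below are illustrative, not part of this change):

```
# Upload the input CSV to HDFS before running the load (paths are examples)
hdfs dfs -mkdir -p /user/carbon/input
hdfs dfs -put /opt/rawdata/data.csv /user/carbon/input/

-- Then point LOAD DATA at the HDFS location instead of a local path
LOAD DATA INPATH 'hdfs://<namenode-host>:8020/user/carbon/input/data.csv'
INTO TABLE carbontable
```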

**Supported Properties:**

@@ -232,7 +236,7 @@ CarbonData DML statements are documented here, which includes:
Example:

```
LOAD DATA local inpath '/opt/rawdata/data.csv' INTO table carbontable
LOAD DATA inpath '/opt/rawdata/data.csv' INTO table carbontable
options('DELIMITER'=',', 'QUOTECHAR'='"','COMMENTCHAR'='#',
'HEADER'='false',
'FILEHEADER'='empno,empname,designation,doj,workgroupcategory,
@@ -350,17 +354,19 @@ CarbonData DML statements are documented here, which includes:
This command allows you to load data using a static partition.

```
LOAD DATA [LOCAL] INPATH 'folder_path'
LOAD DATA INPATH 'folder_path'
INTO TABLE [db_name.]table_name PARTITION (partition_spec)
OPTIONS(property_name=property_value, ...)
INSERT INTO TABLE [db_name.]table_name PARTITION (partition_spec) <SELECT STATEMENT>
```

Example:
```
LOAD DATA LOCAL INPATH '${env:HOME}/staticinput.csv'
LOAD DATA INPATH '${env:HOME}/staticinput.csv'
INTO TABLE locationTable
PARTITION (country = 'US', state = 'CA')
INSERT INTO TABLE locationTable
PARTITION (country = 'US', state = 'AL')
SELECT <columns list excluding partition columns> FROM another_user
@@ -372,8 +378,9 @@ CarbonData DML statements are documented here, which includes:

Example:
```
LOAD DATA LOCAL INPATH '${env:HOME}/staticinput.csv'
INTO TABLE locationTable
LOAD DATA INPATH '${env:HOME}/staticinput.csv'
INTO TABLE locationTable
INSERT INTO TABLE locationTable
SELECT <columns list excluding partition columns> FROM another_user
```
6 changes: 4 additions & 2 deletions docs/quick-start-guide.md
@@ -241,7 +241,9 @@ mv carbondata.tar.gz carbonlib/
--executor-cores 2
```

**NOTE**: Make sure you have permissions on the CarbonData JARs and files that the driver and executors load at start-up.
**NOTE**:
- Make sure you have permissions on the CarbonData JARs and files that the driver and executors load at start-up.
- If you use Spark with Hive 1.1.X, add the CarbonData assembly jar and the carbondata-hive jar to the 'spark.sql.hive.metastore.jars' parameter in the spark-defaults.conf file (an illustrative entry follows this list).
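For illustration only (the jar names, versions, and paths are placeholders, not taken from this commit), such an entry in spark-defaults.conf might look like:

```
# Illustrative spark-defaults.conf entry; replace jar names/paths with those from your build.
# The value is a classpath, so entries are separated by ':' on Linux.
spark.sql.hive.metastore.jars /opt/spark/carbonlib/apache-carbondata-<version>-assembly.jar:/opt/spark/carbonlib/carbondata-hive-<version>.jar:<existing Hive metastore jars>
```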



@@ -485,4 +487,4 @@ select * from carbon_table;

**Note:** Create tables and load data before executing queries, as carbon tables cannot be created from this interface.

```
