Docs (#938)
* docs

* docs

* format

* lint
haifeng-jin committed Jan 31, 2020
1 parent 9f33532 commit bfaa475
Showing 8 changed files with 37 additions and 41 deletions.
23 changes: 11 additions & 12 deletions autokeras/hypermodels/wrapper.py
@@ -111,19 +111,18 @@ class StructuredDataBlock(block_module.Block):
     """Block for structured data.
     # Arguments
-        feature_encoding: Boolean. Whether to use feature encoding block to encode
-            the categorical features. Defaults to True. If specified as None, it will
-            be tuned automatically.
+        categorical_encoding: Boolean. Whether to use the CategoricalToNumerical to
+            encode the categorical features to numerical features. Defaults to True.
+            If specified as None, it will be tuned automatically.
         seed: Int. Random seed.
     """

     def __init__(self,
-                 feature_encoding=True,
                  block_type=None,
+                 categorical_encoding=True,
                  seed=None,
                  **kwargs):
         super().__init__(**kwargs)
-        self.feature_encoding = feature_encoding
+        self.categorical_encoding = categorical_encoding
         self.seed = seed
         self.column_types = None
         self.column_names = None
@@ -136,12 +135,12 @@ def get_config(self):

     def build_feature_encoding(self, hp, input_node):
         output_node = input_node
-        feature_encoding = self.feature_encoding
-        if feature_encoding is None:
-            feature_encoding = hp.Choice('feature_encoding',
-                                         [True, False],
-                                         default=True)
-        if feature_encoding:
+        categorical_encoding = self.categorical_encoding
+        if categorical_encoding is None:
+            categorical_encoding = hp.Choice('feature_encoding',
+                                             [True, False],
+                                             default=True)
+        if categorical_encoding:
             block = preprocessing.CategoricalToNumerical()
             block.column_types = self.column_types
             block.column_names = self.column_names
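The `build_feature_encoding` method above follows a common AutoKeras pattern: a boolean argument that is respected when the user pins it, but tuned with `hp.Choice` when left as `None`. A minimal self-contained sketch of that pattern, using a hypothetical `HyperParameters` stub in place of the real KerasTuner object:

```python
class HyperParameters:
    """Tiny stand-in for KerasTuner's HyperParameters; it simply
    returns the declared default instead of sampling a trial value."""
    def Choice(self, name, values, default=None):
        return default if default is not None else values[0]


def resolve_categorical_encoding(hp, categorical_encoding):
    # If the user pinned the value (True/False), respect it;
    # only a None value is handed to the tuner.
    if categorical_encoding is None:
        categorical_encoding = hp.Choice('categorical_encoding',
                                         [True, False],
                                         default=True)
    return categorical_encoding


hp = HyperParameters()
print(resolve_categorical_encoding(hp, False))  # pinned by the user -> False
print(resolve_categorical_encoding(hp, None))   # left to the tuner -> True (the default)
```

In a real search, `hp.Choice` would sample `True` or `False` per trial; the stub only shows the control flow of the tune-if-None idiom.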
6 changes: 2 additions & 4 deletions docs/autogen.py
@@ -55,7 +55,6 @@
     ],
     'base.md': [
         'autokeras.Node',
-        'autokeras.Preprocessor',
         'autokeras.Block',
         'autokeras.Block.build',
         'autokeras.Head',
@@ -69,7 +68,7 @@
     'block.md': [
         'autokeras.ConvBlock',
         'autokeras.DenseBlock',
-        'autokeras.EmbeddingBlock',
+        'autokeras.Embedding',
         'autokeras.Merge',
         'autokeras.ResNetBlock',
         'autokeras.RNNBlock',
@@ -81,12 +80,11 @@
         'autokeras.TextBlock',
     ],
     'preprocessor.md': [
-        'autokeras.FeatureEngineering',
         'autokeras.ImageAugmentation',
-        'autokeras.LightGBM',
         'autokeras.Normalization',
         'autokeras.TextToIntSequence',
         'autokeras.TextToNgramVector',
+        'autokeras.CategoricalToNumerical',
     ],
     'head.md': [
         'autokeras.ClassificationHead',
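The `autogen.py` lists above map each generated docs page to the public symbols documented on it. A minimal sketch of how such a page-to-symbols mapping can drive markdown generation (the `PAGES` dict and `render` helper here are hypothetical illustrations, not AutoKeras's actual autogen implementation):

```python
# Hypothetical sketch: turn a {page: [symbol, ...]} mapping into markdown stubs.
PAGES = {
    'block.md': ['autokeras.ConvBlock', 'autokeras.DenseBlock', 'autokeras.Embedding'],
    'preprocessor.md': ['autokeras.Normalization', 'autokeras.CategoricalToNumerical'],
}


def render(page, symbols):
    # One "### <Name> class" heading per documented symbol, which is what
    # produces anchors like /block/#embedding-class in the built docs.
    lines = ['# ' + page.replace('.md', '').title()]
    for symbol in symbols:
        class_name = symbol.split('.')[-1]
        lines.append('### {} class'.format(class_name))
    return '\n\n'.join(lines)


for page, symbols in PAGES.items():
    markdown = render(page, symbols)
    print(page, '->', markdown.count('###'), 'entries')
```

This is why renaming `autokeras.EmbeddingBlock` to `autokeras.Embedding` in the mapping also changes every doc link from `#embeddingblock-class` to `#embedding-class`, as the later files in this commit show.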
2 changes: 1 addition & 1 deletion docs/templates/tutorial/customized.md
@@ -151,7 +151,7 @@ print(auto_model.evaluate(x_test, y_test))
 **Blocks**:
 [ConvBlock](/block/#convblock-class),
 [DenseBlock](/block/#denseblock-class),
-[EmbeddingBlock](/block/#embeddingblock-class),
+[Embedding](/block/#embedding-class),
 [Merge](/block/#merge-class),
 [ResNetBlock](/block/#resnetblock-class),
 [RNNBlock](/block/#rnnblock-class),
14 changes: 8 additions & 6 deletions docs/templates/tutorial/multi.md
@@ -110,8 +110,8 @@ graph LR
 id3 --> id5(ResNet V2)
 id4 --> id6(Merge)
 id5 --> id6
-id7(StructuredDataInput) --> id8(Feature Engineering)
-id8 --> id9(LightGBM)
+id7(StructuredDataInput) --> id8(CategoricalToNumerical)
+id8 --> id9(DenseBlock)
 id6 --> id10(Merge)
 id9 --> id10
 id10 --> id11(Classification Head)
@@ -129,8 +129,8 @@ output_node2 = ak.ResNetBlock(version='v2')(output_node)
 output_node1 = ak.Merge()([output_node1, output_node2])

 input_node2 = ak.StructuredDataInput()
-output_node = ak.FeatureEngineering()(input_node2)
-output_node2 = ak.LightGBM()(output_node)
+output_node = ak.CategoricalToNumerical()(input_node2)
+output_node2 = ak.DenseBlock()(output_node)

 output_node = ak.Merge()([output_node1, output_node2])
 output_node1 = ak.ClassificationHead()(output_node)
@@ -160,5 +160,7 @@ You can also refer to the Data Format section of the tutorials of
 [AutoModel](/auto_model/#automodel-class),
 [ImageInput](/node/#imageinput-class),
 [StructuredDataInput](/node/#structureddatainput-class),
-[RegressionHead](/head/#regressionhead-class).
-[ClassificationHead](/head/#classificationhead-class).
+[DenseBlock](/block/#denseblock-class),
+[RegressionHead](/head/#regressionhead-class),
+[ClassificationHead](/head/#classificationhead-class),
+[CategoricalToNumerical](/preprocessor/#categoricaltonumerical-class).
7 changes: 3 additions & 4 deletions docs/templates/tutorial/overview.md
@@ -39,17 +39,16 @@ The following are the links to the documentation of the predefined input nodes a
 [TextInput](/node/#textinput-class).

 **Preprocessors**:
-[FeatureEngineering](/preprocessor/#featureengineering-class),
 [ImageAugmentation](/preprocessor/#imageaugmentation-class),
-[LightGBM](/preprocessor/#lightgbm-class),
 [Normalization](/preprocessor/#normalization-class),
 [TextToIntSequence](/preprocessor/#texttointsequence-class),
-[TextToNgramVector](/preprocessor/#texttongramvector-class).
+[TextToNgramVector](/preprocessor/#texttongramvector-class),
+[CategoricalToNumerical](/preprocessor/#categoricaltonumerical-class).

 **Blocks**:
 [ConvBlock](/block/#convblock-class),
 [DenseBlock](/block/#denseblock-class),
-[EmbeddingBlock](/block/#embeddingblock-class),
+[Embedding](/block/#embedding-class),
 [Merge](/block/#merge-class),
 [ResNetBlock](/block/#resnetblock-class),
 [RNNBlock](/block/#rnnblock-class),
18 changes: 7 additions & 11 deletions docs/templates/tutorial/structured_data_classification.md
@@ -148,11 +148,8 @@ For advanced users, you may customize your search space by using
 [AutoModel](/auto_model/#automodel-class) instead of
 [StructuredDataClassifier](/structured_data_classifier). You can configure the
 [StructuredDataBlock](/block/#structureddatablock-class) for some high-level
-configurations, e.g., `feature_engineering` for whether to use the
-[FeatureEngineering](/preprocessor/#featureengineering-class) block, `block_type` for
-which type of block you want to use in the search space. You can use 'dense' for
-[DenseBlock](/block/#denseblock-class), or you can use 'lightgbm' for
-[LightGBM](/preprocessor/#lightgbm-class). You can also leave these
+configurations, e.g., `categorical_encoding` for whether to use the
+[CategoricalToNumerical](/preprocessor/#categoricaltonumerical-class) block. You can also leave these
 arguments unspecified, letting the different choices be tuned automatically. See
 the following example for details.

@@ -161,7 +158,7 @@ import autokeras as ak

 input_node = ak.StructuredDataInput()
 output_node = ak.StructuredDataBlock(
-    feature_engineering=False,
+    categorical_encoding=True,
     block_type='dense')(input_node)
 output_node = ak.ClassificationHead()(output_node)
 clf = ak.AutoModel(inputs=input_node, outputs=output_node, max_trials=10)
@@ -180,8 +177,8 @@ further. See the following example.
 import autokeras as ak

 input_node = ak.StructuredDataInput()
-output_node = ak.FeatureEngineering(max_columns=500)(input_node)
-output_node = ak.LightGBM()(output_node)
+output_node = ak.CategoricalToNumerical()(input_node)
+output_node = ak.DenseBlock()(output_node)
 output_node = ak.ClassificationHead()(output_node)
 clf = ak.AutoModel(inputs=input_node, outputs=output_node, max_trials=10)
 clf.fit(x_train, y_train)
@@ -193,8 +190,7 @@ clf.fit(x_train, y_train)
 [AutoModel](/auto_model/#automodel-class),
 [StructuredDataClassifier](/structured_data_classifier),
 [StructuredDataBlock](/block/#structureddatablock-class),
-[FeatureEngineering](/preprocessor/#featureengineering-class),
 [DenseBlock](/block/#denseblock-class),
-[LightGBM](/preprocessor/#lightgbm-class),
 [StructuredDataInput](/node/#structureddatainput-class),
-[ClassificationHead](/head/#classificationhead-class).
+[ClassificationHead](/head/#classificationhead-class),
+[CategoricalToNumerical](/preprocessor/#categoricaltonumerical-class).
6 changes: 3 additions & 3 deletions docs/templates/tutorial/text_classification.md
@@ -82,7 +82,7 @@ For advanced users, you may customize your search space by using
 [TextBlock](/block/#textblock-class) for some high-level configurations, e.g., `vectorizer`
 for the type of text vectorization method to use. You can use 'sequence', which uses
 [TextToIntSequence](/preprocessor/#texttointsequence-class) to convert the words to
-integers and use [EmbeddingBlock](/block/#embeddingblock-class) for embedding the
+integers and use [Embedding](/block/#embedding-class) for embedding the
 integer sequences, or you can use 'ngram', which uses
 [TextToNgramVector](/preprocessor/#texttongramvector-class) to vectorize the
 sentences. You can also leave these
@@ -111,7 +111,7 @@ import autokeras as ak

 input_node = ak.TextInput()
 output_node = ak.TextToIntSequence()(input_node)
-output_node = ak.EmbeddingBlock()(output_node)
+output_node = ak.Embedding()(output_node)
 # Use separable Conv layers in Keras.
 output_node = ak.ConvBlock(separable=True)(output_node)
 output_node = ak.ClassificationHead()(output_node)
@@ -152,7 +152,7 @@ print(clf.evaluate(test_set))
 [AutoModel](/auto_model/#automodel-class),
 [TextBlock](/block/#textblock-class),
 [TextToIntSequence](/preprocessor/#texttointsequence-class),
-[EmbeddingBlock](/block/#embeddingblock-class),
+[Embedding](/block/#embedding-class),
 [TextToNgramVector](/preprocessor/#texttongramvector-class),
 [ConvBlock](/block/#convblock-class),
 [TextInput](/node/#textinput-class),
2 changes: 2 additions & 0 deletions shell/lint.sh
@@ -0,0 +1,2 @@
+isort -rc -sl -c
+flake8
