This repository has been archived by the owner on May 12, 2021. It is now read-only.

Commit

Fix typos in Markdown files
Closes #464
takezoe authored and dszeto committed Sep 20, 2018
1 parent c9e564d commit 6c607aa
Showing 35 changed files with 66 additions and 66 deletions.
2 changes: 1 addition & 1 deletion docs/manual/source/appintegration/index.html.md
@@ -34,7 +34,7 @@ Overview](/images/overview-singleengine.png)
## Sending Event Data

Apache PredictionIO's Event Server receives event data from your
-application. The data can be used by engines as training data to build preditive
+application. The data can be used by engines as training data to build predictive
models.

Event Server listens to port 7070 by default. You can change the port with the
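For reference, the Event Server described in this excerpt accepts events as JSON over HTTP. A minimal sketch of such a request follows; the access key, entity IDs, and event name are placeholder values, not taken from this commit.

```bash
# Illustrative only: post a single event to a local Event Server on the default port 7070.
# ACCESS_KEY and all field values below are placeholders.
$ curl -i -X POST "http://localhost:7070/events.json?accessKey=$ACCESS_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "event": "view",
    "entityType": "user",
    "entityId": "u1",
    "targetEntityType": "item",
    "targetEntityId": "i1",
    "eventTime": "2018-09-20T00:00:00.000Z"
  }'
```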
@@ -85,7 +85,7 @@ Please follow this styleguide for any documentation contributions.

### Text

-View our [Sample Typography](/samples/) page for all posible styles.
+View our [Sample Typography](/samples/) page for all possible styles.

### Headings

2 changes: 1 addition & 1 deletion docs/manual/source/community/contribute-webhook.html.md
@@ -197,7 +197,7 @@ and tests should be in
data/src/test/scala/org.apache.predictionio/data/webhooks/segmentio/
```

-**For form-submission data**, you can find the comple example [the GitHub
+**For form-submission data**, you can find the complete example [the GitHub
repo](https://github.com/apache/predictionio/blob/develop/data/src/main/scala/org/apache/predictionio/data/webhooks/exampleform/ExampleFormConnector.scala)
and how to write [tests for the
connector](https://github.com/apache/predictionio/blob/develop/data/src/test/scala/org/apache/predictionio/data/webhooks/exampleform/ExampleFormConnectorSpec.scala).
2 changes: 1 addition & 1 deletion docs/manual/source/customize/dase.html.md.erb
@@ -27,7 +27,7 @@ DataSource reads and selects useful data from the Event Store (data store of the

## readTraining()

-You need to implment readTraining() of [PDataSource](https://predictionio.apache.org/api/current/#org.apache.predictionio.controller.PDataSource), where you can use the [PEventStore Engine API](https://predictionio.apache.org/api/current/#org.apache.predictionio.data.store.PEventStore$) to read the events and create the TrainingData based on the events.
+You need to implement readTraining() of [PDataSource](https://predictionio.apache.org/api/current/#org.apache.predictionio.controller.PDataSource), where you can use the [PEventStore Engine API](https://predictionio.apache.org/api/current/#org.apache.predictionio.data.store.PEventStore$) to read the events and create the TrainingData based on the events.

The following code example reads user "view" and "buy" item events, filters specific type of events for future processing and returns TrainingData accordingly.

2 changes: 1 addition & 1 deletion docs/manual/source/datacollection/eventapi.html.md
@@ -328,7 +328,7 @@ Field | Type | Description
| | are reserved and shouldn't be used.
`targetEntityId` | String | (Optional) The target entity ID.
`properties` | JSON | (Optional) See **Note About Properties** below
-| | **Note**: All peroperty names start with "$" and "pio_"
+| | **Note**: All property names start with "$" and "pio_"
| | are reserved and shouldn't be used as keys inside `properties`.
`eventTime` | String | (Optional) The time of the event. Although Event Server's
| | current system time and UTC timezone will be used if this is
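To make the optional fields in the table above concrete, here is a hedged sketch of an event that carries `targetEntityId`, `properties`, and an explicit `eventTime`; all values are illustrative, and the property keys deliberately avoid the reserved "$" and "pio_" prefixes.

```bash
# Illustrative only: an event using the optional targetEntityId, properties and eventTime fields.
$ curl -i -X POST "http://localhost:7070/events.json?accessKey=$ACCESS_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "event": "rate",
    "entityType": "user",
    "entityId": "u2",
    "targetEntityType": "item",
    "targetEntityId": "i3",
    "properties": { "rating": 4 },
    "eventTime": "2014-09-09T16:17:42.937Z"
  }'
```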
10 changes: 5 additions & 5 deletions docs/manual/source/datacollection/eventmodel.html.md.erb
@@ -25,7 +25,7 @@ This section explains how to model your application data as events.

For example, your application may have users and some items which the user can interact with. Then you can model them as two entity types: **user** and **item** and the entityId can uniquely identify the entity within each entityType (e.g. user with ID 1, item with ID 1).

-An entity may peform some events (e.g user 1 does something), and entity may have properties associated with it (e.g. user may have gender, age, email etc). Hence, **events** involve **entities** and there are three types of events, respectively:
+An entity may perform some events (e.g user 1 does something), and entity may have properties associated with it (e.g. user may have gender, age, email etc). Hence, **events** involve **entities** and there are three types of events, respectively:

1. Generic events performed by an entity.
2. Special events for recording changes of an entity's properties
@@ -78,7 +78,7 @@ The following are some simple examples:

## 2. Special events for recording changes of an entity's properties

-The generic events described above are used to record general actions performed by the entity. However, an entity may have properties (or attributes) associated with it. Morever, the properties of the entity may change over time (for example, user may have new address, item may have new categories). In order to record such changes of an entity's properties. Special events `$set` , `$unset` and `$delete` are introduced.
+The generic events described above are used to record general actions performed by the entity. However, an entity may have properties (or attributes) associated with it. Moreover, the properties of the entity may change over time (for example, user may have new address, item may have new categories). In order to record such changes of an entity's properties. Special events `$set` , `$unset` and `$delete` are introduced.

The following special events are reserved for updating entities and their properties:

@@ -108,7 +108,7 @@ For example, setting entity `user-1`'s properties `birthday` and `address`:

NOTE: Although it doesn't hurt to import duplicated special events for an entity (exactly same properties) into event server (it just means that the entity changes to the same state as before and new duplicated event provides no new information about the user), it could waste storage space.

-To demonstrate the concept of these special events, we are going to import a sequence of events and see how it affects the retrieved entitiy's properties.
+To demonstrate the concept of these special events, we are going to import a sequence of events and see how it affects the retrieved entity's properties.

Assuming you have created the App (named "MyTestApp") for testing and Event Server is started.
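The concrete request shown on the original page is not part of this diff; a sketch of a `$set` event of the kind referenced in the hunk above (property values are illustrative) could look like:

```bash
# Illustrative only: a $set event recording properties "birthday" and "address" for user-1.
$ curl -i -X POST "http://localhost:7070/events.json?accessKey=$ACCESS_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "event": "$set",
    "entityType": "user",
    "entityId": "user-1",
    "properties": {
      "birthday": "1984-10-11",
      "address": "1234 Street, San Francisco, CA 94107"
    }
  }'
```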

@@ -151,7 +151,7 @@ After this eventTime, user-2 is created and has properties of a = 3 and b = 4.

#### Event 2

-Then, on `2014-09-10T...`, let's say the user has updated the properties b = 5 and c = 6. To record such propertiy change, create another `$set` event. Run the following command:
+Then, on `2014-09-10T...`, let's say the user has updated the properties b = 5 and c = 6. To record such property change, create another `$set` event. Run the following command:

```bash
$ curl -i -X POST http://localhost:7070/events.json?accessKey=$ACCESS_KEY \
@@ -283,7 +283,7 @@ scala> import org.joda.time.DateTime
scala> PEventStore.aggregateProperties(appName=appName, entityType="user", untilTime=Some(new DateTime(2014, 9, 11, 0, 0)))(sc).collect()
```

-You should see the following ouptut and the aggregated properties matches what we expected as described earlier (right befor event 3): user-2 has properties of a = 3, b = 5 and c = 6.
+You should see the following ouptut and the aggregated properties matches what we expected as described earlier (right before event 3): user-2 has properties of a = 3, b = 5 and c = 6.

```
res2: Array[(String, org.apache.predictionio.data.storage.PropertyMap)] =
2 changes: 1 addition & 1 deletion docs/manual/source/demo/tapster.html.md
@@ -417,7 +417,7 @@ demo and build upon it. If you produce something cool shoot us an email and we
will link to it from here.

Found a typo? Think something should be explained better? This tutorial (and all
-our other documenation) live in the main repo
+our other documentation) live in the main repo
[here](https://github.com/apache/predictionio/blob/livedoc/docs/manual/source/demo/tapster.html.md).
Our documentation is in the `livedoc` branch. Find out how to contribute
documentation at
6 changes: 3 additions & 3 deletions docs/manual/source/demo/textclassification.html.md.erb
@@ -23,7 +23,7 @@ limitations under the License.

## Introduction

-In the real world, there are many applications that collect text as data. For example, spam detectors take email and header content to automatically determine what is or is not spam; applications can gague the general sentiment in a geographical area by analyzing Twitter data; and news articles can be automatically categorized based solely on the text content.There are a wide array of machine learning models you can use to create, or train, a predictive model to assign an incoming article, or query, to an existing category. Before you can use these techniques you must first transform the text data (in this case the set of news articles) into numeric vectors, or feature vectors, that can be used to train your model.
+In the real world, there are many applications that collect text as data. For example, spam detectors take email and header content to automatically determine what is or is not spam; applications can gauge the general sentiment in a geographical area by analyzing Twitter data; and news articles can be automatically categorized based solely on the text content.There are a wide array of machine learning models you can use to create, or train, a predictive model to assign an incoming article, or query, to an existing category. Before you can use these techniques you must first transform the text data (in this case the set of news articles) into numeric vectors, or feature vectors, that can be used to train your model.

The purpose of this tutorial is to illustrate how you can go about doing this using PredictionIO's platform. The advantages of using this platform include: a dynamic engine that responds to queries in real-time; [separation of concerns](http://en.wikipedia.org/wiki/Separation_of_concerns), which offers code re-use and maintainability, and distributed computing capabilities for scalability and efficiency. Moreover, it is easy to incorporate non-trivial data modeling tasks into the DASE architecture allowing Data Scientists to focus on tasks related to modeling. This tutorial will exemplify some of these ideas by guiding you through PredictionIO's [text classification template](/gallery/template-gallery/#natural-language-processing).

@@ -91,7 +91,7 @@ $ pio import --appid *** --input data/emails.json

### 3. Set the engine parameters in the file `engine.json`.

-The default settings are shown below. By default, it uses the algorithm name "lr" which is logstic regression. Please see later section for more detailed explanation of engine.json setting.
+The default settings are shown below. By default, it uses the algorithm name "lr" which is logistic regression. Please see later section for more detailed explanation of engine.json setting.

Make sure the "appName" is same as the app you created in step1.

@@ -272,7 +272,7 @@ Note that `readEventData` and `readStopWords` use different entity types and eve

Now, the default dataset used for training is contained in the file `data/emails.json` and contains a set of e-mail spam data. If we want to switch over to one of the other data sets we must make sure that the `eventNames` and `entityType` fields are changed accordingly.

-In the data/ directory, you will find different sets of data files for different types of text classifcaiton application. The following show one observation from each of the provided data files:
+In the data/ directory, you will find different sets of data files for different types of text classificaiton application. The following show one observation from each of the provided data files:

- `emails.json`:

2 changes: 1 addition & 1 deletion docs/manual/source/deploy/monitoring.html.md
@@ -28,7 +28,7 @@ sudo apt-get install monit
```

##Configure Basics
-Now we can configure monit by the configuration file `/etc/monit/monitrc` with your favorite editor. You will notice that this file contains quite a bit already, most of which is commented intructions/examples.
+Now we can configure monit by the configuration file `/etc/monit/monitrc` with your favorite editor. You will notice that this file contains quite a bit already, most of which is commented instructions/examples.

First, choose the interval on which you want monit to check the status of your system. Use the `set daemon` command for this, it should already exist in the configuration file.

4 changes: 2 additions & 2 deletions docs/manual/source/evaluation/index.html.md
@@ -21,7 +21,7 @@ limitations under the License.

PredictionIO's evaluation module allows you to streamline the process of
testing lots of knobs in engine parameters and deploy the best one out
-of it using statisically sound cross-validation methods.
+of it using statistically sound cross-validation methods.

There are two key components:

@@ -51,6 +51,6 @@ We will discuss various aspects of evaluation with PredictionIO.
where you can see a detailed breakdown of all previous evaluations.
- [Choosing Evaluation Metrics](/evaluation/metricchoose/) - we cover some basic
machine learning metrics
-- [Bulding Evaluation Metrics](/evaluation/metricbuild/) - we illustrate how to
+- [Building Evaluation Metrics](/evaluation/metricbuild/) - we illustrate how to
implement a custom metric with as few as one line of code (plus some
boilerplates).
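For orientation, evaluations like the one described in this file are launched with the `pio eval` command, passing an `Evaluation` object and an `EngineParamsGenerator` object; the class names below are placeholders, not from this commit.

```bash
# Illustrative only: run an evaluation over a list of candidate engine parameters.
# Replace the class names with your own Evaluation and EngineParamsGenerator objects.
$ pio eval org.example.recommendation.RecommendationEvaluation \
    org.example.recommendation.EngineParamsList
```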
4 changes: 2 additions & 2 deletions docs/manual/source/evaluation/metricbuild.html.md
@@ -97,11 +97,11 @@ negative cases.

PredictionIO provides a helper class `OptionAverageMetric` allows user to
specify *don't care* values as `None`. It only aggregates the non-None values.
-Lines 3 to 4 is the method signature of `calcuate` method. The key difference
+Lines 3 to 4 is the method signature of `calculate` method. The key difference
is that the return value is a `Option[Double]`, in contrast to `Double` for
`AverageMetric`. This class only computes the average of `Some(.)` results.
Lines 5 to 13 are the actual logic. The first `if` factors out the
-positively predicted case, and the computation is similiar to the accuracy
+positively predicted case, and the computation is similar to the accuracy
metric. The negatively predicted case are the *don't cares*, which we return
`None`.

2 changes: 1 addition & 1 deletion docs/manual/source/evaluation/paramtuning.html.md
@@ -278,7 +278,7 @@ validation set, `EvaluationInfo` can be used to hold some global evaluation data
; it is not used in the current example.

Lines 11 to 41 is the logic of reading and transforming data from the
-datastore; it is equvialent to the existing `readTraining` method. After line
+datastore; it is equivalent to the existing `readTraining` method. After line
41, the variable `labeledPoints` contains the complete dataset with which we use
to generate the (training, validation) sequence.

4 changes: 2 additions & 2 deletions docs/manual/source/install/install-vagrant.html.md.erb
@@ -64,7 +64,7 @@ INFO: When you run `vagrant up` for the first time, it will download the base
box ubuntu/trusty64 if you don't have it. Then it will also install all
necessary libraries and setup PredictionIO in the virtual machine.

-When it finishes successfully, you should see somthing like the following:
+When it finishes successfully, you should see something like the following:

```
==> default: Installation done!
@@ -112,7 +112,7 @@ $ vagrant halt
```

WARNING: If you didn't shut down VM properly or you ran `vagrant suspend`, the
-VM may go to suspend state. HBase may not be running propoerly next time when
+VM may go to suspend state. HBase may not be running properly next time when
you run `vagrant up.` In this case, you can always run `vagrant halt` to do a
clean shutdown first before run `vagrant up` again.

2 changes: 1 addition & 1 deletion docs/manual/source/install/launch-aws.html.md.erb
@@ -40,7 +40,7 @@ You should see the following screen after you have logged in.

![alt text](../images/awsm-product.png)

-Under the big yellow "Continue" botton, select the region where you want to
+Under the big yellow "Continue" button, select the region where you want to
launch the PredictionIO EC2 instance, then click "Continue".

![alt text](../images/awsm-1click.png)
@@ -176,7 +176,7 @@ The data is now in the event server.
## Principal Component Analysis


-PCA begins with the data matrix \\(\bf X\\) whose rows are feature vectors corresponding to a set of observations. In our case, each row represents the pixel information of the corresponding hand-written numerc digit image. The model then computes the [covariance matrix](https://en.wikipedia.org/wiki/Covariance_matrix) estimated from the data matrix \\(\bf X\\). The algorithm then takes the covariance matrix and computes the [eigenvectors](https://en.wikipedia.org/wiki/Eigenvalues_and_eigenvectors) that correspond to its \\(k\\) (some integer) largest [eigenvalues](https://en.wikipedia.org/wiki/Eigenvalues_and_eigenvectors). The data matrix is then mapped to the space generated by these \\(k\\) vectors, which are called the \\(k\\) **ptincipal components** of \\(\bf X\\). What this is doing is mapping the data observations into a lower-dimensional space that explains the largest variability in the data (contains the most information). The algorithm for implementing PCA is listed as follows:
+PCA begins with the data matrix \\(\bf X\\) whose rows are feature vectors corresponding to a set of observations. In our case, each row represents the pixel information of the corresponding hand-written numeric digit image. The model then computes the [covariance matrix](https://en.wikipedia.org/wiki/Covariance_matrix) estimated from the data matrix \\(\bf X\\). The algorithm then takes the covariance matrix and computes the [eigenvectors](https://en.wikipedia.org/wiki/Eigenvalues_and_eigenvectors) that correspond to its \\(k\\) (some integer) largest [eigenvalues](https://en.wikipedia.org/wiki/Eigenvalues_and_eigenvectors). The data matrix is then mapped to the space generated by these \\(k\\) vectors, which are called the \\(k\\) **principal components** of \\(\bf X\\). What this is doing is mapping the data observations into a lower-dimensional space that explains the largest variability in the data (contains the most information). The algorithm for implementing PCA is listed as follows:

### PCA Algorithm

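The algorithm listing under this heading is cut off in the diff; a standard formulation of the steps just described (a sketch, not necessarily the notation used on the original page) is:

```latex
% Sketch of the PCA steps described above, for an n-by-d data matrix X and target dimension k.
\begin{align*}
\bar{x} &= \tfrac{1}{n} \sum_{i=1}^{n} x_i          && \text{column means} \\
X_c &= X - \mathbf{1}\bar{x}^{\top}                  && \text{center the data} \\
\Sigma &= \tfrac{1}{n-1} X_c^{\top} X_c              && \text{sample covariance matrix} \\
\Sigma w_j &= \lambda_j w_j, \quad j = 1, \dots, k   && \text{eigenvectors of the } k \text{ largest eigenvalues} \\
Z &= X_c \, [\, w_1 \;\cdots\; w_k \,]               && \text{projection onto the } k \text{ principal components}
\end{align*}
```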
2 changes: 1 addition & 1 deletion docs/manual/source/resources/faq.html.md
@@ -81,7 +81,7 @@ Storage Backend Connections
2015-02-03 18:40:04,812 ERROR zookeeper.ZooKeeperWatcher - hconnection-0x1e4075ce, quorum=localhost:2181, baseZNode=/hbase Received unexpected KeeperException, re-throwing exception
org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /hbase/hbaseid
...
-2015-02-03 18:40:07,021 ERROR hbase.StorageClient - Failed to connect to HBase. Plase check if HBase is running properly.
+2015-02-03 18:40:07,021 ERROR hbase.StorageClient - Failed to connect to HBase. Please check if HBase is running properly.
2015-02-03 18:40:07,026 ERROR storage.Storage$ - Error initializing storage client for source HBASE
2015-02-03 18:40:07,027 ERROR storage.Storage$ - Can't connect to ZooKeeper
java.util.NoSuchElementException: None.get
2 changes: 1 addition & 1 deletion docs/manual/source/resources/glossary.html.md
@@ -34,7 +34,7 @@ Algorithm, [S] Serving, [E] Evaluation Metrics.

**EngineClient**
- Part of PredictionSDK. It sends queries to a deployed engine instance through
-the Engine API and retrives prediction results.
+the Engine API and retrieves prediction results.

**Event API**
- Please see Event Server.
2 changes: 1 addition & 1 deletion docs/manual/source/resources/intellij.html.md.erb
@@ -220,7 +220,7 @@ the following.

You can execute a query with the correct SDK. For a recommender that has been
trained with the sample MovieLens dataset perhaps the easiest query is a `curl`
-one. Start by running or debuging your `pio deploy` config so the service is
+one. Start by running or debugging your `pio deploy` config so the service is
waiting for the query. Then go to the "Terminal" tab at the very bottom of the
IntelliJ IDEA window and enter the `curl` request:

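The request itself is truncated in this diff; for an engine deployed on the default port, a query of the kind described here usually looks like the following (field names and values depend on the template and are placeholders):

```bash
# Illustrative only: query a deployed engine (default port 8000) from the IDE terminal.
# A MovieLens-style recommender typically takes a user ID and the number of results wanted.
$ curl -H "Content-Type: application/json" \
  -d '{ "user": "1", "num": 4 }' \
  http://localhost:8000/queries.json
```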
4 changes: 2 additions & 2 deletions docs/manual/source/support/index.html.md.erb
@@ -21,8 +21,8 @@ limitations under the License.

## Community Support

-Apahce PredictionIO has a welcoming and active community. We are
-here to support you and make sure that you can use Apahce PredictionIO
+Apache PredictionIO has a welcoming and active community. We are
+here to support you and make sure that you can use Apache PredictionIO
successfully.

If you are a user, please subscribe to our user mailing list.
@@ -191,7 +191,7 @@ client.createEvent(event);

Note that you can also set the properties for the user with multiple `$set` events (They will be aggregated during engine training).

-To set properties "attr0", "attr1" and "attr2", and "plan" for user "u1" at different time, you can send follwing `$set` events for the user. To send these events, run the following `curl` command:
+To set properties "attr0", "attr1" and "attr2", and "plan" for user "u1" at different time, you can send following `$set` events for the user. To send these events, run the following `curl` command:

<div class="tabs">
<div data-tab="REST API" data-lang="json">
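The tabbed examples are truncated here; a sketch of two such `$set` events, sent with different `eventTime` values so they are aggregated at training time (all values illustrative), might be:

```bash
# Illustrative only: set initial properties for user "u1", then update "attr2" and "plan"
# later with a second $set event carrying a later eventTime.
$ curl -i -X POST "http://localhost:7070/events.json?accessKey=$ACCESS_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "event": "$set",
    "entityType": "user",
    "entityId": "u1",
    "properties": { "attr0": 0, "attr1": 1, "attr2": 2, "plan": 1 },
    "eventTime": "2014-09-09T16:17:42.937Z"
  }'

$ curl -i -X POST "http://localhost:7070/events.json?accessKey=$ACCESS_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "event": "$set",
    "entityType": "user",
    "entityId": "u1",
    "properties": { "attr2": 20, "plan": 2 },
    "eventTime": "2014-09-10T13:12:04.937Z"
  }'
```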