
Commit 1c68c0c

Sam Foo committed:
Fixes to avoid incorrect dictionary generation

1 parent 5eac812

6 files changed: 18 additions & 18 deletions


docs/applications/big-data/how-to-move-machine-learning-model-to-production.md

Lines changed: 3 additions & 3 deletions
@@ -4,7 +4,7 @@ author:
 email: docs@linode.com
 description: 'This guide shows how to use an existing deep learning model as part of a production application. A pre-trained model is included as an API endpoint for a Flask app.'
 keywords: ["deep learning", "big data", "python", "keras", "flask", "machine learning", "neural networks"]
-og_description: 'Use an pre-trained deep learning model as part of a production application.'
+og_description: 'Use a pre-trained deep learning model as part of a production application.'
 license: '[CC BY-ND 4.0](https://creativecommons.org/licenses/by-nd/4.0)'
 published: 2017-10-09
 modified: 2017-10-10
@@ -43,7 +43,7 @@ You will be using Python both to create a model and to deploy the model to a Fla
 1. Download and install Miniconda, a lightweight version of Anaconda. Follow the instructions in the terminal and allow Anaconda to add a PATH location to `.bashrc`:
 
 wget https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh
-bash Anaconda3-5.0.0.1-Linux-x86_64.sh
+bash Miniconda3-latest-Linux-x86_64.sh
 source .bashrc
 
 2. Create and activate a new Python virtual environment:
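
The hunk ends just before the environment-creation commands. A minimal sketch of that step, assuming a conda environment named `deeplearning` on Python 3.6 (the name and version implied by the `WSGIPythonHome` path later in this diff):

    # Create and activate an isolated conda environment for the model and Flask app
    conda create -n deeplearning python=3.6
    source activate deeplearning
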
@@ -218,7 +218,7 @@ Apache modules are typically installed with the system installation of Apache. H
 The output should be similar to:
 
 LoadModule wsgi_module "/home/linode/miniconda3/envs/deeplearning/lib/python3.6/site-packages/mod_wsgi-4.5.20-py3.6-linux-x86_64.egg/mod_wsgi/server/mod_wsgi-py36.cpython-36m-x86_64-linux-gnu.so"
-WSGIPythonHome "/home/linode/miniconda3/envs/deeplearning"
+WSGIPythonHome "/home/linode/miniconda3/envs/deeplearning"
 
 4. Create a `wsgi.load` file in the Apache `mods-available` directory. Copy the `LoadModule` directive from above and paste it into the file:
 
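For context on step 4, the resulting `wsgi.load` contains only the `LoadModule` line from the output above. A sketch, assuming the Debian-style path `/etc/apache2/mods-available/wsgi.load` (not stated in this diff):

    # /etc/apache2/mods-available/wsgi.load -- sketch; paste the LoadModule line
    # printed on your own system, which may differ from the example above
    LoadModule wsgi_module "/home/linode/miniconda3/envs/deeplearning/lib/python3.6/site-packages/mod_wsgi-4.5.20-py3.6-linux-x86_64.egg/mod_wsgi/server/mod_wsgi-py36.cpython-36m-x86_64-linux-gnu.so"

The module could then be enabled with `sudo a2enmod wsgi` and an Apache restart, assuming a Debian/Ubuntu Apache layout.
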
docs/applications/cloud-storage/how-to-install-a-turtl-server-on-ubuntu.md

Lines changed: 11 additions & 11 deletions
@@ -45,22 +45,22 @@ The Turtl server has to be built from source. Download all of the dependencies a
 
 Download the Libuv package from the official repository:
 
-wget https://dist.libuv.org/dist/v1.13.0/libuv-v1.13.0.tar.gz
-tar -xvf libuv-v1.13.0.tar.gz
+wget https://dist.libuv.org/dist/v1.13.0/libuv-v1.13.0.tar.gz
+tar -xvf libuv-v1.13.0.tar.gz
 
 Build the package from source:
 
 cd libuv-v1.13.0
 sudo sh autogen.sh
-sudo ./configure
-sudo make
-sudo make install
+sudo ./configure
+sudo make
+sudo make install
 
 After the package is built, run `sudo ldconfig` to maintain the shared libracy cache.
 
 #### RethinkDB
 
-[RethinkDB](https://rethinkdb.com/faq/) is a flexible JSON datbase. According to the Turtl [documentation](https://turtlapp.com/docs/server/), RethinkDB just needs to be installed; Turtl will take care of the rest.
+[RethinkDB](https://rethinkdb.com/faq/) is a flexible JSON database. According to the Turtl [documentation](https://turtlapp.com/docs/server/), RethinkDB just needs to be installed; Turtl will take care of the rest.
 
 RehinkDB has community-maintained packages on most distributions. On Ubuntu, you have to add the RethinkDB to your list of repositories:
 
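The hunk cuts off just before the repository commands. For reference, the RethinkDB setup on Ubuntu at the time looked roughly like the sketch below; treat the URLs and key location as assumptions and confirm them against the current RethinkDB documentation:

    # Add the RethinkDB apt repository and signing key, then install the package
    source /etc/lsb-release && echo "deb https://download.rethinkdb.com/apt $DISTRIB_CODENAME main" | sudo tee /etc/apt/sources.list.d/rethinkdb.list
    wget -qO- https://download.rethinkdb.com/apt/pubkey.gpg | sudo apt-key add -
    sudo apt-get update && sudo apt-get install rethinkdb
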
@@ -98,13 +98,13 @@ According to the CCL [documentation](https://ccl.clozure.com/download.html), you
 Quickly check if CCL has been installed correctly by updating the sources:
 
 cd ccl
-svn update
+svn update
 
 Move `ccl` to `/usr/bin` so `ccl` can run from the command line:
 
 cd ..
 sudo cp -r ccl/ /usr/local/src
-sudo cp /usr/local/src/ccl/scripts/ccl64 /usr/local/bin
+sudo cp /usr/local/src/ccl/scripts/ccl64 /usr/local/bin
 
 Now, running `ccl64`, or `ccl` depending on your system, will launch a Lisp environment:
 
@@ -163,8 +163,8 @@ Download ASDF:
 Load and install `asdf.lisp` in your CCL environment:
 
 ccl64 --load quicklisp.lisp
-(load (compile-file "asdf.lisp"))
-(quit)
+(load (compile-file "asdf.lisp"))
+(quit)
 
 
 ### Install Turtl
@@ -191,7 +191,7 @@ Turtl does not ship with all of its dependencies. Instead, the Turtl community p
 Edit the `/home/turtl/.ccl-init.lisp` to include:
 
 (cwd "/home/turtl/api")
-(load "/home/turtl/api/launch")
+(load "/home/turtl/api/launch")
 
 The first line tells Lisp to use the `cl-cwd` package that you cloned to change the current working directory to `/home/turtl/api`. You can change this to anything, but your naming conventions should be consistent. The second line loads your `launch.lisp`, loading `asdf` so that Turtl can run.
 
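Taken together, the finished init file described above amounts to just these two forms; the comments here are editorial annotations rather than lines from the guide:

    ; /home/turtl/.ccl-init.lisp
    (cwd "/home/turtl/api")           ; cl-cwd: make the Turtl API checkout the working directory
    (load "/home/turtl/api/launch")   ; load launch.lisp, which loads asdf so Turtl can run
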
docs/applications/messaging/instant-messaging-services-with-ejabberd-on-ubuntu-8-04-hardy.md

Lines changed: 1 addition & 1 deletion
@@ -16,7 +16,7 @@ title: 'Instant Messaging Services with ejabberd on Ubuntu 8.04 (Hardy)'
 
 
 
-Ejabberd is a Jabber daemon written in the Erlang programming language. It is extensible, flexible and very high performance. With a web-based interface, and broad support for [XMPP standards](http://xmpp.org/), ejabberd is a great choice for a multi-purpose XMPP server. Ejabberd can be considered "heavyweight" by critics, but mostly due to the requirements of the Erlang run-times. However, it is incredibly robust and can scale to support incredibly heavy loads: ebjabberd servers are believed to be the backbone for some of the largest Jabber servers running now.
+Ejabberd is a Jabber daemon written in the Erlang programming language. It is extensible, flexible and very high performance. With a web-based interface, and broad support for [XMPP standards](http://xmpp.org/), ejabberd is a great choice for a multi-purpose XMPP server. Ejabberd can be considered "heavyweight" by critics, but mostly due to the requirements of the Erlang run-times. However, it is incredibly robust and can scale to support incredibly heavy loads: ejabberd servers are believed to be the backbone for some of the largest Jabber servers running now.
 
 This installation process assumes that you have a working installation of Ubuntu 8.04 (Hardy) and have followed the steps in the [getting started](/docs/getting-started/) guide, and now have an up to date instance of the Ubuntu Hardy operating system and are connected to your Linode via SSH and have root access. Once you've completed these requirements we can begin with the installation process.
 
docs/applications/messaging/using-irssi-for-internet-relay-chat.md

Lines changed: 1 addition & 1 deletion
@@ -243,7 +243,7 @@ The `hilight` command will highlight certain words used in the channels you have
 
 /hilight word
 
-To remove a hilight, use the command:
+To remove a `hilight`, use the command:
 
 /dehilight word
 
docs/development/frameworks/yesod-nginx-mysql-on-debian-7-wheezy.md

Lines changed: 1 addition & 1 deletion
@@ -196,7 +196,7 @@ We don't need to modify this configuration file, it's acceptable as is. So you o
 
 If your Linode has a firewall, the port ``3000`` is probably inaccessible from outside, so you will not be able to see your site at http://www.yoursite.com:3000/. This port is only for testing or developing, so don't open it on your firewall. Instead, you can set up an SSH tunnel on your Linode, and view your site at http://localhost:3000/ via this tunnel. Please check [Setting up an SSH Tunnel with Your Linode for Safe Browsing](/docs/networking/ssh/setting-up-an-ssh-tunnel-with-your-linode-for-safe-browsing/) for more details.
 
-You may have noticed that we haven't configure Nginx yet. In fact, Yesod applications contain an http server called Warp, which is written in Haskell, and has a very fast run-time. Without http servers like Apache or Nginx installed, you can run Yesod applications as standalones. This feature is similar to the Express framework on Node.js.
+You may have noticed that we haven't configure Nginx yet. In fact, Yesod applications contain an http server called Warp, which is written in Haskell, and has a very fast run-time. Without http servers like Apache or Nginx installed, you can run standalone Yesod applications. This feature is similar to the Express framework on Node.js.
 
 The initial setup of your first Yesod site has been finished. To start more advanced development of your Yesod site, please read [The Yesod Book](http://www.yesodweb.com/book/) for more details.
 
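The linked tunnelling guide has the details; as a minimal sketch, such a tunnel from a local machine would look like the following, with `user` and `your_linode_ip` as placeholders:

    # Forward local port 3000 to port 3000 on the Linode over SSH
    ssh -L 3000:localhost:3000 user@your_linode_ip

With the tunnel open, the development site is reachable at http://localhost:3000/ without opening port 3000 in the firewall.
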
docs/uptime/analytics/open-web-analytics-install-and-launch-on-your-server.md

Lines changed: 1 addition & 1 deletion
@@ -154,7 +154,7 @@ Version 1.5.7 is the current version and may be different by the time you read t
 
 ### Configure
 
-1. Navigate to the OWA installation page in your webbrowser. Replace `your.domain` with your Linode's IP address or FQDN:
+1. Navigate to the OWA installation page in your web browser. Replace `your.domain` with your Linode's IP address or FQDN:
 
 http://your.domain/owa/
