Commit 7073957 (parent bcdb5b6): edit start dev env
Stan Varlamov committed Dec 21, 2018
Showing 11 changed files with 107 additions and 52 deletions.
@@ -0,0 +1,27 @@
### Setting Up Local .env Configuration File

Now that we have the development environment hot, let's make sure our local configuration file `.env` is set up correctly.

As explained before, we need to copy the default configuration file provided in the Git project into it. In the host shell:

```
# On Windows:
cd C:/docker-vol/demo-gql-mongo
# On macOS/Linux:
cd ~/docker-vol/demo-gql-mongo

# then, in either case:
cp .default.env .env
```

You can also use the Copy functionality of the Explorer in your local VSC IDE.

Unless you use a MongoDB setup different from the one described in the previous chapter, you should not need to change anything in the `.env` file. Let's just review it:

- `PORT=80` - indicates that the HTTP server in NodeJS will run on port 80. As you recall, *container* port 80 in the `docker run` command is mapped to port `8080` of the host
- `DB_URI=mongodb://172.17.0.1:27017` - sets the MongoDB URI for the NodeJS application. From inside the NodeJS container, the path to MongoDB is via port `27017` of the docker *host*.
In the "Docker Networking ..." lesson you learned that `172.17.0.1` is the IP of the *host* on the default Docker network that both of our containers run in. How do we know the containers are running in the *default* Docker network? First, because we haven't configured any additional Docker network on our host. Second, in the `docker inspect node-dev` output, `"NetworkSettings"`, you can see the details, e.g., `"Gateway": "172.17.0.1"`. Now, on the MongoDB container side, local port `27017` is mapped to port `27017` on the host. So, host's port `27017` in effect is the port that our dev MongoDB is listening on.
This is the classic way of orchestrating multiple services and servers running on a Docker host. Simple and powerful.
As a disclaimer, the database is not secured, so don't put any sensitive info in it, and don't expose your host's port `27017` to the open Internet. Even if you configure this DB with a password, you'll end up saving it in the open in the `.env` file. So don't bother - just don't keep anything confidential in your dev DB. As we discussed in the "Project Custom Configuration File" lesson, the Production environment is well protected by virtue of running your Docker Production containers and services inside a secured orchestration framework. In dev, it's all on your laptop, with the intent that you can develop efficiently. Unless you know how to absolutely secure access to your laptop, don't copy your production database down to seed development - not a good practice.
- `DB_NAME=web_dev` - defines the name of the MongoDB database. The database will be automatically created if it doesn't exist - the first time we hit the DB server and load data into it.
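
Putting the three settings above together, your copied `.env` file should look like this (confirm against the actual `.default.env` in the repo):

```
PORT=80
DB_URI=mongodb://172.17.0.1:27017
DB_NAME=web_dev
```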


Next, on to the `install` step in our dev project lifecycle!
14 changes: 0 additions & 14 deletions 00_A$AP Learn/06_Running NodeJS/00_Setting up dot-env/index.md

This file was deleted.

30 changes: 24 additions & 6 deletions 00_A$AP Learn/06_Running NodeJS/01_Running npm install/index.md
@@ -1,16 +1,34 @@
### Doing npm install

As a reminder, in order to load the required npm packages, your host should be connected to the Internet. We talked about `npm install` in the "Use Case and Project Components" chapter. It is run in the project folder in the `node-dev` container shell (where we've got NodeJS and npm, as we're using the official NodeJS image for the container). To get into the `node-dev` shell, from your host's terminal shell (PowerShell on Windows), type:
```
docker exec -it node-dev bash
```
This puts you into the container's shell. Then switch to the folder that is mapped to the host's project folder:

```
cd /myapp
```

All set, now run:

```
npm install
```

The process takes a minute or two to complete - much longer on slow connections and weaker hardware.

Conveniently, the process displays its status and produces some output as well. The warnings are generally ignorable. If you see vulnerabilities reported, please enter an issue in the [demo GitHub project](https://github.com/exlskills/demo-gql-mongo/issues). That said, npm vulnerabilities are unlikely to compromise your box when running the dev/demo in an isolated local environment, with no access to your serving port (8080) from the Internet.

As a recap of what we learned earlier, `npm install` is a standard `npm` command that reads `package.json` and `package-lock.json` (if present). It evaluates which external package dependencies should be installed and loads them into the `node_modules` folder in the project directory. `package-lock.json` is created (or updated if anything in `package.json` changed since the last `npm install` run). Every time you update `package.json`, you should re-run `npm install`, and upon significant updates or package version changes, you should be deleting `node_modules` folder and `package-lock.json` to allow `npm` to reprocess the dependencies tree from scratch. However, changes in the tree may cause your code to break, so test thoroughly. Luckily, you always have your last working version and a full changes history available in the GitHub project to fall back to. What a wonderful dev world!
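
As a minimal illustration of what `npm install` reads, a `package.json` fragment might look like this (the package names match ones used in this demo, but the versions and exact layout here are illustrative, not the demo's actual manifest):

```
{
  "dependencies": {
    "js-yaml": "^3.12.0",
    "mongoose": "^5.4.0"
  },
  "devDependencies": {
    "@babel/core": "^7.2.0"
  }
}
```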

As `node_modules` folder is created and loaded by the container-run process, in Linux, your laptop user will not have access to it without sudo. This is somewhat inconvenient when you want to check external packages out from your IDE. If that is the case, you can change the ownership of the loaded folder, of course. Or, as mentioned earlier, run the container under a non-root user.

On to seeding the test data!
38 changes: 23 additions & 15 deletions 00_A$AP Learn/06_Running NodeJS/02_Seeding Test Data/index.md
@@ -1,43 +1,51 @@
### Seeding Test Data

People make a living developing test data generators. Here, as a poor man's seeding method, we'll type some data into YAML files and run a few simple custom programs to load the data into our dev instance of MongoDB.

## Review data-seed YAML Files

In the IDE, review `src/data-seed/sample-data/item.yaml` and `user.yaml`. Note that we're hardcoding the `_id` values so that we can use them in the sample GraphQL queries and mutations delivered with the demo. When seeding actual development, you'd leave the `_id` assignment lines out - just make sure to keep the dash separating the objects per the YAML List format, as in `src/data-seed/sample-data/user-order.yaml`.

In case you don't know much about YAML other than that everything must be indented and list elements must start with a dash: `>-` is a YAML indicator used in front of open text info.

Notice that `item_details` have different structures in the two sample Items, and in the User Order Items as well - to illustrate how MongoDB handles flexible structure Documents.

## Review data-seed Loaders

Next, using the IDE, review `src/data-seed/item-seed.js`, `user-seed.js` and `user-order-seed.js`.

In `async function startRun()`, we plug into the code of the core application so that we can utilize the same configuration and connect to MongoDB. This part of the code is, basically, borrowed from `src/server.js`, which we'll review later. Once connected to the DB, we call `await loadData()` and then exit, closing the DB connection.

`async function loadData()` reads data from the YAML file with the hardcoded name, converts it into a JS object with the help of the `js-yaml` external 3rd party package, loops over the object's records and creates DB Documents, one by one. Not a particularly efficient way, but simple enough to load a few records. One area for improvement: batch up the Creates, which is achievable by using *bulk* operations in MongoDB. This requires some tweaking, which is out of scope for this course. If interested, you can check out [this sample in the EXLskills gql-server](https://github.com/exlskills/gql-server/blob/master/src/data-load/maintenance-and-conversion/card-interaction-set-course-item-ref.js).

`src/data-seed/user-order-seed.js` is similar in its structure to the two files above.
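
To make the YAML conventions above concrete, a seed record could look like this (the field names and the `_id` value here are illustrative, not the demo's exact data):

```
- _id: 1a2b3c4d5e6f7a8b9c0d1e2f
  name: Sample Item
  desc: >-
    Free-form text after the >- indicator is folded:
    these two lines become one string.
  item_details:
    color: blue
    sizes: [S, M, L]
```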

## Run the Loaders

From your host's terminal shell (PowerShell on Windows), enter `node-dev` shell:

```
docker exec -it node-dev bash
cd /myapp
```

Then copy - paste the load launch commands from the top of the `.js` files and execute them from the `node-dev` shell:

```
npx babel-node src/data-seed/item-seed.js
npx babel-node src/data-seed/user-seed.js
npx babel-node src/data-seed/user-order-seed.js
```

There may be a few-seconds delay after launching the command till the detailed debug logging starts filling the screen, but, overall, the process completes fast.

Yes!!! We've got it kicking.

## Review The Data in MongoDB Compass

Start MongoDB Compass on your host and connect to the `localhost` port `27017` database, with no Authentication - the default offered on the entry screen. You'll see the `web_dev` database. Open it up and start clicking around. It's fun!
We can start the GraphQL server now, let's do it!
@@ -1,24 +1,30 @@
### Starting the Server Flow

If you are not in the `node-dev` shell yet, get in there from your host's terminal shell (PowerShell on Windows):

```
docker exec -it node-dev bash
cd /myapp
```

We'll start the server via the `npm start` command, which has an override in `package.json` channeling it through `babel`, as was discussed in the "Use Case and Project Components" chapter.

So,
```
npm start
```

After a brief hesitation, you'll see `info` and `debug` messages coming up on the screen from the `logger` commands, and `Mongoose:` messages from the `mongoose` debug logging.

The process runs in the `node-dev` bash shell, so to stop it you do `CTRL-C` while in the shell. In case you'd like to check something out in the running container while your first shell is occupied by the NodeJS server process, you can open more container shell sessions by launching `docker exec -it node-dev bash` from your host's terminal shell.

In the dev mode, you do want to run NodeJS "interactively" so that you can see the real-time logs as you test functionality via the browser. Multiple displays would come in handy.

You can check `docker stats` on the host to see how much resources the containers take up. Not bad at all! Neither NodeJS nor MongoDB are resource hogs, thankfully.


### Quick Test from the Browser

On the host, open a browser and navigate to `http://localhost:8080/graph` (change the `8080` port to the one you used in the `docker run` when launching `node-dev`, if it was different).


You'll see a `GraphiQL` work screen. Great, everything works! Back in the code now - to see what is actually running.
