
update code blocks to always specify syntax, fix ordered list (#3734)

jesseditson committed Sep 23, 2017
1 parent e6c106f commit 8be0ae7c6b4548df49314e23d9a74ccb78bd7b14
@@ -9,7 +9,7 @@ Don't forget to [setup your `$GOPATH` and `$BIN` environment variables](https://
You can test your setup like so:
-```
+```shell
# This should print something
echo $GOPATH
@@ -23,7 +23,7 @@ Add `NOMS_VERSION_NEXT=1` to your environment. The current trunk codebase is a d
## Get and build Noms
-```
+```shell
go get github.com/attic-labs/noms/cmd/noms
cd $GOPATH/src/github.com/attic-labs/noms/cmd/noms
go build
@@ -50,7 +50,7 @@ Embed Noms into mobile applications, making it easier to build offline-first, fu
## Setup
-```
+```shell
# You probably want to add this to your environment
export NOMS_VERSION_NEXT=1
@@ -64,21 +64,21 @@ go install github.com/attic-labs/noms/cmd/noms
Import some data:
-```
+```shell
go install github.com/attic-labs/noms/samples/go/csv/import
curl 'https://data.cityofnewyork.us/api/views/kku6-nxdu/rows.csv?accessType=DOWNLOAD' > /tmp/data.csv
csv-import /tmp/data.csv /tmp/noms::nycdemo
```
Explore:
-```
+```shell
noms show /tmp/noms::nycdemo
```
Should show:
-```
+```go
struct Commit {
meta: struct Meta {
date: "2017-09-19T19:33:01Z",
@@ -1,6 +1,6 @@
## Example
-```
+```shell
cd $GOPATH/src/github.com/attic-labs/noms/samples/go/counter
go build
./counter /tmp/nomsdb::counter
@@ -12,7 +12,7 @@ noms serve /tmp/nomsdb
Then, in a separate shell:
-```
+```shell
# This starts where the previous count left off because we're serving the same database
./counter http://localhost:8000::counter
@@ -16,7 +16,7 @@ This is a quick introduction to the Noms command-line interface. It should only
Now you should be able to run `noms`:
-```
+```shell
> noms
Noms is a tool for goofing with Noms data.
@@ -39,7 +39,7 @@ Use "noms help [command]" for more information about a command.
Without any arguments, `noms` lists out all available commands. To get information on a specific command, we can use `noms help [command]`:
-```
+```shell
> noms help sync
usage: noms sync [options] <source-object> <dest-dataset>
@@ -54,7 +54,7 @@ There's a sample database running at http://demo.noms.io. Let's take a look insi
The `noms ds` command lists the _datasets_ within a particular database:
-```
+```shell
> noms ds http://demo.noms.io
...
sf-film-locations/raw
@@ -66,7 +66,7 @@ sf-film-locations
Noms datasets are versioned. You can see the history with `log`:
-```
+```shell
> noms log http://demo.noms.io::sf-film-locations
commit aprsmg0j2eegk8eehbgj7cd3tmmd1be8
Parent: None
@@ -82,7 +82,7 @@ Note that Noms is a typed system. What is being shown here for each entry is not
You can see the entire serialization of any object in the database with `noms show`:
-```
+```shell
> noms show 'http://demo.noms.io::#aprsmg0j2eegk8eehbgj7cd3tmmd1be8'
struct Commit {
@@ -121,22 +121,22 @@ You can work with Noms databases that are remote exactly the same as you work wi
Moving data in Noms is done with the `sync` command. Note that unlike Git, we do not make a distinction between _push_ and _pull_. It's the same operation in both directions:
-```
+```shell
> noms sync http://demo.noms.io::sf-film-locations /tmp/noms::films
> noms ds /tmp/noms
films
```
We can now make an edit locally:
-```
+```shell
> go install github.com/attic-labs/noms/samples/go/csv/...
> csv-export /tmp/noms::films > /tmp/film-locations.csv
```
open /tmp/film-locations.csv and edit it, then:
-```
+```shell
> csv-import --column-types=String,String,String,String,String,String,String,String,Number,String,String \
/tmp/film-locations.csv /tmp/noms::films
```
@@ -145,7 +145,7 @@ open /tmp/film-location.csv and edit it, then:
The `noms diff` command can show you the differences between any two values. Let's see our change:
-```
+```shell
> noms diff http://demo.noms.io::sf-film-locations /tmp/noms::films
./.meta {
@@ -161,4 +161,3 @@ The `noms diff` command can show you the differences between any two values. Let
- "Locations": "Epic Roasthouse (399 Embarcadero)"
+ "Locations": "Epic Roadhouse (399 Embarcadero)"
```
@@ -17,7 +17,7 @@ Peers that are listening for these message can decide if that data is relevent t
Peers can use a flow similar to the following in order to sync changes with one another:
-```
+```nohighlight
for {
listen for new message
if new msg is relevant {
@@ -49,7 +49,7 @@ Another potential architecture for decentralized apps uses a decentralized chunk
The flow used by peers to sync with one another is similar to the peer-to-peer architecture. The main difference is that data is not duplicated on local machines and doesn't have to be pulled during sync. Each app keeps track of its latest commit in the chunk store.
-```
+```nohighlight
for {
listen for new message
if new msg is relevant {
@@ -11,28 +11,36 @@ This sample app demonstrates backing a P2P noms app by a decentralized blockstor
Demo app code is in the
[ipfs-chat](https://github.com/attic-labs/noms/tree/master/samples/go/decent/ipfs-chat/)
directory. To get it up and running take the following steps:
* Use git to clone the noms repository onto your computer:
-```
+```shell
go get github.com/attic-labs/noms/samples/go/decent/ipfs-chat
```
* From the noms/samples/go/decent/ipfs-chat directory, build the program with the following command:
-```
+```shell
go build
```
* Run the ipfs-chat client with the following command:
-```
+```shell
./ipfs-chat client --username <aname1> --node-idx=1 ipfs:/tmp/ipfs1::chat >& /tmp/err1
```
* Run a second ipfs-chat client with the following command:
-```
+```shell
./ipfs-chat client --username <aname2> --node-idx=2 ipfs:/tmp/ipfs2::chat >& /tmp/err2
```
If desired, ipfs-chat can be run as a daemon that replicates all
chat content in a local store, enabling clients to go offline
without making data unavailable to other clients:
-```
+```shell
./ipfs-chat daemon --node-idx=3 ipfs:/tmp/ipfs3::chat
```
@@ -13,21 +13,29 @@ Currently, nodes have to have a publicly routable IP, but it should be possible
Demo app code is in the
[p2p](https://github.com/attic-labs/noms/tree/master/samples/go/decent/p2p-chat)
directory. To get it up and running take the following steps:
* Use git to clone the noms repository onto your computer:
-```
+```shell
go get github.com/attic-labs/noms/samples/go/decent/p2p-chat
```
* From the noms/samples/go/decent/p2p-chat directory, build the program with the following command:
-```
+```shell
go build
```
* Run the p2p client with the following command:
-```
+```shell
mkdir /tmp/noms1
./p2p-chat client --username=<aname1> --node-idx=1 /tmp/noms1 >& /tmp/err1
```
* Run a second p2p client with the following command:
-```
+```shell
mkdir /tmp/noms2
./p2p-chat client --username=<aname2> --node-idx=2 /tmp/noms2 >& /tmp/err2
```
@@ -14,7 +14,7 @@ The steps you’ll need to take are:
number, string, blob, map, list, set, structs, ref, and
union. (Note: if you are interested in using CRDTs as an alternative
to classic datatypes please let us know.)
-1. Consider...
+2. Consider...
* How peers will discover each other
* How peers will notify each other of changes
* How and when they will pull changes, and
@@ -34,31 +34,40 @@ The steps you’ll need to take are:
`Struct{sender, ordinal}`: the resulting `Map` is the same no
matter what order messages are added.
-1. Vendor the code into your project.
-1. Set `NOMS_VERSION_NEXT=1` in your environment.
-1. Decide which type of storage you'd like to use: memory (convenient for playing around), disk, IPFS, or S3. (If you want to implement a store on top of another type of storage that's possible too; email us or reach out on slack and we can help.)
-1. Set up and instantiate a database for your storage. Generally, you use the spec package to parse a [dataset spec](https://github.com/attic-labs/noms/blob/master/doc/spelling.md) like `mem::mydataset` which you can then ask for [`Database`](https://github.com/attic-labs/noms/blob/master/go/datas/database.go) and [`Dataset`](https://github.com/attic-labs/noms/blob/master/go/datas/dataset.go).
+3. Vendor the code into your project.
+4. Set `NOMS_VERSION_NEXT=1` in your environment.
+5. Decide which type of storage you'd like to use: memory (convenient for playing around), disk, IPFS, or S3. (If you want to implement a store on top of another type of storage that's possible too; email us or reach out on slack and we can help.)
+6. Set up and instantiate a database for your storage. Generally, you use the spec package to parse a [dataset spec](https://github.com/attic-labs/noms/blob/master/doc/spelling.md) like `mem::mydataset` which you can then ask for [`Database`](https://github.com/attic-labs/noms/blob/master/go/datas/database.go) and [`Dataset`](https://github.com/attic-labs/noms/blob/master/go/datas/dataset.go).
* **Memory**: no setup required, just instantiate it:
-```go
-sp := spec.ForDataset("mem::test") // Dataset name is "test"
-```
+```go
+sp := spec.ForDataset("mem::test") // Dataset name is "test"
+```
* **Disk**: identify a directory for storage, say `/path/to/chunks`, and then instantiate:
-```go
-sp := spec.ForDataset("/path/to/chunks::test") // Dataset name is "test"
-```
+```go
+sp := spec.ForDataset("/path/to/chunks::test") // Dataset name is "test"
+```
* **IPFS**: identify an IPFS node by directory. If an IPFS node doesn't exist at that directory, one will be created:
-```go
-sp := spec.ForDataset("ipfs:/path/to/ipfs_repo::test") // Dataset name is "test"
-```
+```go
+sp := spec.ForDataset("ipfs:/path/to/ipfs_repo::test") // Dataset name is "test"
+```
* **S3**: Follow the [S3 setup instructions](https://github.com/attic-labs/noms/blob/master/go/nbs/NBS-on-AWS.md) then instantiate a database and dataset:
-```go
-sess := session.Must(session.NewSession(aws.NewConfig().WithRegion("us-west-2")))
-store := nbs.NewAWSStore("dynamo-table", "store-name", "s3-bucket", s3.New(sess), dynamodb.New(sess), 1<<28))
-database := datas.NewDatabase(store)
-dataset := database.GetDataset("aws://dynamo-table:s3-bucket/store-name::test") // Dataset name is "test"
-```
-1. Implement using the [Go API](https://github.com/attic-labs/noms/blob/master/doc/go-tour.md). If you're just playing around you could try something like this:
-```go
+```go
+sess := session.Must(session.NewSession(aws.NewConfig().WithRegion("us-west-2")))
+store := nbs.NewAWSStore("dynamo-table", "store-name", "s3-bucket", s3.New(sess), dynamodb.New(sess), 1<<28)
+database := datas.NewDatabase(store)
+dataset := database.GetDataset("aws://dynamo-table:s3-bucket/store-name::test") // Dataset name is "test"
+```
+7. Implement using the [Go API](https://github.com/attic-labs/noms/blob/master/doc/go-tour.md). If you're just playing around you could try something like this:
+```go
package main
import (
@@ -109,12 +118,16 @@ The steps you’ll need to take are:
"male": types.Bool(male),
})
}
-```
-1. You can inspect data that you've committed via the [noms command-line interface](https://github.com/attic-labs/noms/blob/master/doc/cli-tour.md). For example:
-```
-noms log /path/to/store::ds
-noms show /path/to/store::ds
-```
-* Note that Memory tables won't be inspectable because they exist only in the memory of the process that created them.
-1. Implement pull and merge. The [pull API](../../go/datas/pull.go) is used pull changes from a peer and the [merge API](../../go/merge/) is used to merge changes before commit. There's an [example of merging in the IPFS-based-chat sample
+```
+8. You can inspect data that you've committed via the [noms command-line interface](https://github.com/attic-labs/noms/blob/master/doc/cli-tour.md). For example:
+```shell
+noms log /path/to/store::ds
+noms show /path/to/store::ds
+```
+> Note that Memory tables won't be inspectable because they exist only in the memory of the process that created them.
+9. Implement pull and merge. The [pull API](../../go/datas/pull.go) is used to pull changes from a peer and the [merge API](../../go/merge/) is used to merge changes before commit. There's an [example of merging in the IPFS-based-chat sample
app](https://github.com/attic-labs/noms/blob/master/samples/go/ipfs-chat/pubsub.go).
@@ -149,7 +149,7 @@ func main() {
Now you will get output of the data type of our Dataset value:
-```
+```shell
> go run noms-tour.go
data type: List<struct {
given: String
@@ -68,7 +68,7 @@ func main() {
A dataset is nothing more than a named pointer into the DAG. Consider the following command to copy the dataset named `foo` to the dataset named `bar` within a database:
-```
+```shell
noms sync http://localhost:8000::foo http://localhost:8000::bar
```
@@ -144,7 +144,7 @@ This is usually completely implicit, done based on the data you store (you can s
We do the same thing for datasets. If you commit a `Set<Number>`, the type of the commit we create for you is:
-```
+```go
struct Commit {
Value: Set<Number>
Parents: Set<Ref<Cycle<Commit>>>
@@ -155,7 +155,7 @@ This tells you that the current and all previous commits have values of type `Se
But if you then commit a `Set<String>` to this same dataset, then the type of that commit will be:
-```
+```go
struct Commit {
Value: Set<String>
Parents: Set<Ref<Cycle<Commit>> |