This repository has been archived by the owner on Feb 12, 2024. It is now read-only.

feat: switch to esm (#3879)
Refactors the code to be ESM. Dual-publishes CJS and ESM for maximum compatibility.

There are no default exports, so code similar to:

```js
import IPFS from 'ipfs'

const ipfs = await IPFS.create()
```

should be refactored to:

```js
import { create } from 'ipfs'

const ipfs = await create()
```

BREAKING CHANGE: There are no default exports and everything is now dual published as ESM/CJS
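
Dual publishing like this is typically wired up through a conditional `exports` map in `package.json`, so Node picks the ESM build for `import` and the CJS build for `require`. A minimal sketch of that pattern (illustrative only, with made-up paths; not the actual `ipfs` manifest):

```json
{
  "name": "example-dual-package",
  "type": "module",
  "exports": {
    ".": {
      "import": "./dist/esm/index.js",
      "require": "./dist/cjs/index.js"
    }
  }
}
```

With a map like this, existing CommonJS consumers can keep calling `require()` while ESM consumers use `import`, which is what makes the dual publish non-breaking for module resolution (the removal of default exports is the breaking part).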
achingbrain authored Sep 22, 2021
1 parent 4126a5a commit 9a40109
Showing 1,024 changed files with 9,298 additions and 7,976 deletions.
263 changes: 137 additions & 126 deletions .github/workflows/test.yml

Large diffs are not rendered by default.

1 change: 1 addition & 0 deletions .gitignore
@@ -33,3 +33,4 @@ tsconfig-check.aegir.json

# Operating system files
.DS_Store
types
2 changes: 1 addition & 1 deletion README.md
@@ -85,7 +85,7 @@ $ npm install ipfs-core
Then start a node in your app:

```javascript
const IPFS = require('ipfs-core')
import * as IPFS from 'ipfs-core'

const ipfs = await IPFS.create()
const { cid } = await ipfs.add('Hello world')
2 changes: 1 addition & 1 deletion docs/DAEMON.md
@@ -23,7 +23,7 @@ The IPFS Daemon exposes the API defined in the [HTTP API spec](https://docs.ipfs
If you want a programmatic way to spawn a IPFS Daemon using JavaScript, check out the [ipfsd-ctl](https://github.com/ipfs/js-ipfsd-ctl) module.

```javascript
const { createFactory } = require('ipfsd-ctl')
import { createFactory } from 'ipfsd-ctl'
const factory = createFactory({
type: 'proc' // or 'js' to run in a separate process
})
4 changes: 2 additions & 2 deletions docs/FAQ.md
@@ -73,8 +73,8 @@ Yes, however, bear in mind that there isn't a 100% stable solution to use WebRTC
To add WebRTC support in a IPFS node instance, do:

```JavaScript
const wrtc = require('wrtc') // or require('electron-webrtc')()
const WebRTCStar = require('libp2p-webrtc-star')
import wrtc from 'wrtc' // or 'electron-webrtc'
import WebRTCStar from 'libp2p-webrtc-star'

const node = await IPFS.create({
repo: 'your-repo-path',
38 changes: 22 additions & 16 deletions docs/IPLD.md
@@ -49,38 +49,41 @@ If your application requires support for extra codecs, you can configure them as
1. Configure the [IPLD layer](https://github.com/ipfs/js-ipfs/blob/master/packages/ipfs/docs/MODULE.md#optionsipld) of your IPFS daemon to support the codec. This step is necessary so the node knows how to prepare data received over HTTP to be passed to IPLD for serialization:

```javascript
const ipfs = require('ipfs')
import { create } from 'ipfs'
import customBlockCodec from 'custom-blockcodec'
import customMultibase from 'custom-multibase'
import customMultihasher from 'custom-multihasher'

const node = await ipfs({
const node = await create({
ipld: {
// either specify BlockCodecs as part of the `codecs` list
codecs: [
require('custom-blockcodec')
customBlockCodec
],

// and/or supply a function to load them dynamically
loadCodec: async (codecNameOrCode) => {
return require(codecNameOrCode)
return import(codecNameOrCode)
},

// either specify Multibase codecs as part of the `bases` list
bases: [
require('custom-multibase')
customMultibase
],

// and/or supply a function to load them dynamically
loadBase: async (baseNameOrCode) => {
return require(baseNameOrCode)
return import(baseNameOrCode)
},

// either specify Multihash hashers as part of the `hashers` list
hashers: [
require('custom-multibase')
customMultihasher
],

// and/or supply a function to load them dynamically
loadHasher: async (hashNameOrCode) => {
return require(hashNameOrCode)
return import(hashNameOrCode)
}
}
})
@@ -89,39 +92,42 @@ If your application requires support for extra codecs, you can configure them as
2. Configure your IPFS HTTP API Client to support the codec. This is necessary so that the client can send the data to the IPFS node over HTTP:

```javascript
const ipfsHttpClient = require('ipfs-http-client')
import { create } from 'ipfs-http-client'
import customBlockCodec from 'custom-blockcodec'
import customMultibase from 'custom-multibase'
import customMultihasher from 'custom-multihasher'
const client = ipfsHttpClient({
const client = create({
url: 'http://127.0.0.1:5002',
ipld: {
// either specify BlockCodecs as part of the `codecs` list
codecs: [
require('custom-blockcodec')
customBlockCodec
],
// and/or supply a function to load them dynamically
loadCodec: async (codecNameOrCode) => {
return require(codecNameOrCode)
return import(codecNameOrCode)
},
// either specify Multibase codecs as part of the `bases` list
bases: [
require('custom-multibase')
customMultibase
],
// and/or supply a function to load them dynamically
loadBase: async (baseNameOrCode) => {
return require(baseNameOrCode)
return import(baseNameOrCode)
},
// either specify Multihash hashers as part of the `hashers` list
hashers: [
require('custom-multibase')
customMultihasher
],
// and/or supply a function to load them dynamically
loadHasher: async (hashNameOrCode) => {
return require(hashNameOrCode)
return import(hashNameOrCode)
}
}
})
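
The `loadCodec`, `loadBase` and `loadHasher` hooks in the hunks above all follow the same pattern: resolve a name or numeric code to an implementation on demand, typically falling back to a dynamic `import()`. A rough sketch of that lookup pattern in plain JavaScript (the registry and codec values here are illustrative, not real packages):

```js
// Hypothetical codec registry keyed by both name and numeric code,
// mirroring how IPLD can address a codec either way.
const codecs = new Map()

function register (codec) {
  codecs.set(codec.name, codec)
  codecs.set(codec.code, codec)
}

register({ name: 'raw', code: 0x55 })
register({ name: 'dag-pb', code: 0x70 })

// Async, like the loadCodec hook: a real implementation would fall
// back to `await import(...)` when the codec is not preloaded.
async function loadCodec (nameOrCode) {
  const codec = codecs.get(nameOrCode)
  if (codec == null) {
    throw new Error(`unknown codec: ${nameOrCode}`)
  }
  return codec
}
```

Registering under both keys means the daemon and the HTTP client can ask for `'raw'` or `0x55` interchangeably, which is why the hooks accept a `codecNameOrCode` argument.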
50 changes: 25 additions & 25 deletions docs/MIGRATION-TO-ASYNC-AWAIT.md
@@ -100,7 +100,7 @@ const peerId = PeerId.createFromB58String(peerIdStr)
You can get hold of the `PeerId` class using npm or in a script tag:

```js
const PeerId = require('peer-id')
import PeerId from 'peer-id'
const peerId = PeerId.createFromB58String(peerIdStr)
```

@@ -128,7 +128,7 @@ You can get hold of the `PeerInfo` class using npm or in a script tag:

```js
const PeerInfo = require('peer-info')
const PeerId = require('peer-id')
import PeerId from 'peer-id'
const peerInfo = new PeerInfo(PeerId.createFromB58String(info.id))
info.addrs.forEach(addr => peerInfo.multiaddrs.add(addr))
```
@@ -217,7 +217,7 @@ readable.on('end', () => {
Becomes:

```js
const toStream = require('it-to-stream')
import toStream from 'it-to-stream'
const readable = toStream.readable(ipfs.cat('QmHash'))
const decoder = new TextDecoder()

@@ -285,7 +285,7 @@ console.log(decoder.decode(data))
...which, by the way, could more succinctly be written as:

```js
const toBuffer = require('it-to-buffer')
import toBuffer from 'it-to-buffer'
const decoder = new TextDecoder()
const data = await toBuffer(ipfs.cat('QmHash'))
console.log(decoder.decode(data))
@@ -321,7 +321,7 @@ pipeline(
Becomes:

```js
const toStream = require('it-to-stream')
import toStream from 'it-to-stream'
const { pipeline, Writable } = require('stream')
const decoder = new TextDecoder()

@@ -353,7 +353,7 @@ Use `it-pipe` and a [for/await](https://developer.mozilla.org/en-US/docs/Web/Jav
e.g.

```js
const fs = require('fs')
import fs from 'fs'
const { pipeline } = require('stream')

const items = []
@@ -378,7 +378,7 @@ pipeline(
Becomes:

```js
const fs = require('fs')
import fs from 'fs'
const pipe = require('it-pipe')

const items = []
@@ -400,9 +400,9 @@ console.log(items)
...which, by the way, could more succinctly be written as:

```js
const fs = require('fs')
import fs from 'fs'
const pipe = require('it-pipe')
const all = require('it-all')
import all from 'it-all'

const items = await pipe(
fs.createReadStream('/path/to/file'),
@@ -420,7 +420,7 @@ Convert the async iterable to a readable stream.
e.g.

```js
const fs = require('fs')
import fs from 'fs'
const { pipeline } = require('stream')

const items = []
@@ -445,8 +445,8 @@ pipeline(
Becomes:

```js
const toStream = require('it-to-stream')
const fs = require('fs')
import toStream from 'it-to-stream'
import fs from 'fs'
const { pipeline } = require('stream')

const items = []
@@ -568,7 +568,7 @@ Becomes:

```js
const pipe = require('it-pipe')
const concat = require('it-concat')
import concat from 'it-concat'
const decoder = new TextDecoder()

const data = await pipe(
@@ -590,7 +590,7 @@ Use `it-pipe` and `it-all` to collect all items from an async iterable.
e.g.

```js
const fs = require('fs')
import fs from 'fs'
const toPull = require('stream-to-pull-stream')

pull(
@@ -605,7 +605,7 @@ pull(
Becomes:

```js
const fs = require('fs')
import fs from 'fs'

const file = await ipfs.add(fs.createReadStream('/path/to/file'))

@@ -619,7 +619,7 @@ Convert the async iterable to a pull stream.
e.g.

```js
const fs = require('fs')
import fs from 'fs'
const toPull = require('stream-to-pull-stream')

pull(
@@ -634,7 +634,7 @@ pull(
Becomes:

```js
const fs = require('fs')
import fs from 'fs'
const streamToPull = require('stream-to-pull-stream')
const itToPull = require('async-iterator-to-pull-stream')

@@ -685,7 +685,7 @@ for await (const file of addSource) {
Alternatively you can buffer up the results using the `it-all` utility:

```js
const all = require('it-all')
import all from 'it-all'

const results = await all(ipfs.addAll([
{ path: 'root/1.txt', content: 'one' },
@@ -744,7 +744,7 @@ Reading files.
e.g.

```js
const fs = require('fs')
import fs from 'fs'

const data = await ipfs.cat('/ipfs/QmHash')

@@ -759,8 +759,8 @@ Becomes:

```js
const pipe = require('it-pipe')
const toIterable = require('stream-to-it')
const fs = require('fs')
import toIterable from 'stream-to-it'
import fs from 'fs'

// Note that as chunks arrive they are written to the file and memory can be freed and re-used
await pipe(
@@ -774,8 +774,8 @@ console.log('done')
Alternatively you can buffer up the chunks using the `it-concat` utility (not recommended!):

```js
const fs = require('fs')
const concat = require('it-concat')
import fs from 'fs'
import concat from 'it-concat'

const data = await concat(ipfs.cat('/ipfs/QmHash'))

@@ -812,7 +812,7 @@ for await (const file of filesSource) {
Alternatively you can buffer up the directory listing using the `it-all` utility:

```js
const all = require('it-all')
import all from 'it-all'

const results = await all(ipfs.ls('/ipfs/QmHash'))

@@ -905,7 +905,7 @@ files.forEach(file => {
Becomes:

```js
const fs = require('fs')
import fs from 'fs'
const ipfs = IpfsHttpClient()

const file = await ipfs.add(fs.createReadStream('/path/to/file.txt'))
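
Much of the migration guide updated above replaces Node stream utilities with async-iterable ones. The buffering helpers it leans on (`it-all`, `it-concat`) boil down to a `for await` loop; a rough sketch of the `it-all` idea, with a stand-in async generator since no real IPFS node is assumed here:

```js
// Collect every item yielded by an (async) iterable into an array.
// This is roughly what the it-all utility does.
async function all (source) {
  const items = []
  for await (const item of source) {
    items.push(item)
  }
  return items
}

// Any async generator works as a stand-in for e.g. ipfs.ls('/ipfs/QmHash')
async function * numbers () {
  yield 1
  yield 2
  yield 3
}
```

Note that buffering like this trades memory for convenience; for large outputs the guide's streaming `for await` form consumes items as they arrive instead.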