lowdb

Simple to use local JSON database. Powered by plain JavaScript 🦉

// Edit db.json content using plain JS
db.data.posts.push({ id: 1, title: 'lowdb is awesome' })

// Save to file
await db.write()

// db.json
{
  "posts": [
    { "id": 1, "title": "lowdb is awesome" }
  ]
}

If you like lowdb, see also xv (test runner) and steno (fast file writer).


Become a sponsor and have your company logo here.

Please help me build OSS 👉 GitHub Sponsors


  • Lightweight
  • Minimalist
  • TypeScript
  • Plain JS
  • Atomic write
  • Hackable:
    • Change storage, file format (JSON, YAML, ...) or add encryption via adapters
    • Add lodash, ramda, ... for super powers!


npm install lowdb


Lowdb 3 is a pure ESM package. If you're having trouble importing it in your project, please read this.

import { join, dirname } from 'path'
import { Low, JSONFile } from 'lowdb'
import { fileURLToPath } from 'url'

const __dirname = dirname(fileURLToPath(import.meta.url))

// Use JSON file for storage
const file = join(__dirname, 'db.json')
const adapter = new JSONFile(file)
const db = new Low(adapter)

// Read data from JSON file, this will set db.data content
await db.read()

// If file.json doesn't exist, db.data will be null
// Set default data
// db.data = db.data || { posts: [] } // Node < v15.x
db.data ||= { posts: [] }             // Node >= 15.x

// Create and query items using plain JS
db.data.posts.push('hello world')
const firstPost = db.data.posts[0]

// Alternatively, you can also use this syntax if you prefer
const { posts } = db.data
posts.push('hello world')

// Finally write db.data content to file
await db.write()

// db.json
{
  "posts": [ "hello world" ]
}


You can use TypeScript to type check your data.

type Data = {
  words: string[]
}

const adapter = new JSONFile<Data>('db.json')
const db = new Low(adapter)

db.data.words.push('foo') // ✅
db.data.words.push(1) // ❌


You can also add lodash or other utility libraries to improve lowdb.

import lodash from 'lodash'

type Post = {
  id: number;
  title: string;
}

type Data = {
  posts: Post[]
}

// Extend Low class with a new `chain` field
class LowWithLodash<T> extends Low<T> {
  chain: lodash.ExpChain<this['data']> = lodash.chain(this).get('data')
}

const adapter = new JSONFile<Data>('db.json')
const db = new LowWithLodash(adapter)
await db.read()

// Instead of db.data use db.chain to access lodash API
const post = db.chain
  .get('posts')
  .find({ id: 1 })
  .value() // Important: value() must be called to execute chain

More examples

For CLI, server and browser usage, see examples/ directory.



Lowdb has two classes (for asynchronous and synchronous adapters).

new Low(adapter)

import { Low, JSONFile } from 'lowdb'

const db = new Low(new JSONFile('file.json'))
await db.read()
await db.write()

new LowSync(adapterSync)

import { LowSync, JSONFileSync } from 'lowdb'

const db = new LowSync(new JSONFileSync('file.json'))
db.read()
db.write()


db.read()

Calls adapter.read() and sets db.data.

Note: JSONFile and JSONFileSync adapters will set db.data to null if file doesn't exist.

db.data // === null
db.read()
db.data // !== null


db.write()

Calls adapter.write(db.data).

db.data = { posts: [] }
db.write() // file.json will be { posts: [] }

db.data = {}
db.write() // file.json will be {}


db.data

Holds your db content. If you're using the adapters coming with lowdb, it can be any type supported by JSON.stringify.

For example:

db.data = 'string'
db.data = [1, 2, 3]
db.data = { key: 'value' }


Lowdb adapters


JSONFile JSONFileSync

Adapters for reading and writing JSON files.

new Low(new JSONFile(filename))
new LowSync(new JSONFileSync(filename))

Memory MemorySync

In-memory adapters. Useful for speeding up unit tests.

new Low(new Memory())
new LowSync(new MemorySync())
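To see why these adapters speed up tests, it helps to know that an in-memory adapter just holds data in a variable and never touches the disk. The sketch below illustrates the behavior of a MemorySync-style adapter; `MemorySketch` is an invented name for illustration, not lowdb's actual source:

```javascript
// Illustrative sketch of how an in-memory (MemorySync-style) adapter behaves.
// Like lowdb's file adapters, read() returns null before anything is written.
class MemorySketch {
  constructor() {
    this.data = null
  }
  read() {
    return this.data // null until something has been written
  }
  write(data) {
    this.data = data
  }
}

const adapter = new MemorySketch()
console.log(adapter.read()) // null
adapter.write({ posts: ['hello world'] })
console.log(adapter.read()) // { posts: [ 'hello world' ] }
```

Because no I/O happens, each test can start from a fresh, empty database with no temp files to create or clean up.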


LocalStorage

Synchronous adapter for window.localStorage.

new LowSync(new LocalStorage(name))

TextFile TextFileSync

Adapters for reading and writing text. Useful for creating custom adapters.

Third-party adapters

If you've published an adapter for lowdb, feel free to create a PR to add it here.

Writing your own adapter

You may want to create an adapter to write to YAML, XML, encrypt data, a remote storage, ...

An adapter is a simple class that just needs to expose two methods:

class AsyncAdapter {
  read() { /* ... */ } // should return Promise<data>
  write(data) { /* ... */ } // should return Promise<void>
}

class SyncAdapter {
  read() { /* ... */ } // should return data
  write(data) { /* ... */ } // should return nothing
}
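To make the contract concrete, here is a hypothetical sync adapter that keeps its "file" in a string instead of on disk, performing the same serialize/deserialize round-trip a real file adapter would. `JSONStringAdapter` is an invented name for this sketch:

```javascript
// Hypothetical sync adapter storing serialized JSON in a string,
// following the same read()/write() contract described above.
class JSONStringAdapter {
  constructor() {
    this.contents = null // stands in for a file on disk
  }
  read() {
    // A real file adapter returns null when the file doesn't exist
    return this.contents === null ? null : JSON.parse(this.contents)
  }
  write(data) {
    this.contents = JSON.stringify(data)
  }
}

const adapter = new JSONStringAdapter()
adapter.write({ posts: [{ id: 1 }] })
const data = adapter.read()
console.log(data.posts[0].id) // 1
```

Note that read() returns a fresh object on every call, because the data is re-parsed from the serialized form, just as it would be when reading a file.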

For example, let's say you have some async storage and want to create an adapter for it:

import { api } from './AsyncStorage'

class CustomAsyncAdapter {
  // Optional: your adapter can take arguments
  constructor(args) {
    // ...
  }

  async read() {
    const data = await api.read()
    return data
  }

  async write(data) {
    await api.write(data)
  }
}
const adapter = new CustomAsyncAdapter()
const db = new Low(adapter)

See src/adapters/ for more examples.

Custom serialization

To create an adapter for another format than JSON, you can use TextFile or TextFileSync.

For example:

import { Adapter, Low, TextFile } from 'lowdb'
import YAML from 'yaml'

class YAMLFile {
  constructor(filename) {
    this.adapter = new TextFile(filename)
  }

  async read() {
    const data = await this.adapter.read()
    if (data === null) {
      return null
    } else {
      return YAML.parse(data)
    }
  }

  write(obj) {
    return this.adapter.write(YAML.stringify(obj))
  }
}
const adapter = new YAMLFile('file.yaml')
const db = new Low(adapter)


Limits

Lowdb doesn't support Node's cluster module.

If you have large JavaScript objects (~10-100MB) you may hit some performance issues. This is because whenever you call db.write, the whole db.data is serialized using JSON.stringify and written to storage.

Depending on your use case, this can be fine or not. It can be mitigated by doing batch operations and calling db.write only when you need it.
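The batching idea can be sketched as follows: mutate the data object many times in memory, then persist once at the end. `CountingAdapter` below is an illustrative stand-in (not part of lowdb) that tracks how many serializations happen:

```javascript
// Illustrative sketch of batching: many in-memory mutations, one write.
// CountingAdapter counts how often write() (i.e. serialization) occurs.
class CountingAdapter {
  constructor() {
    this.data = null
    this.writes = 0
  }
  read() { return this.data }
  write(data) {
    this.data = data
    this.writes += 1
  }
}

const adapter = new CountingAdapter()
const data = { posts: [] }

// Mutate in memory: no serialization cost per operation
for (let i = 0; i < 1000; i++) {
  data.posts.push({ id: i })
}

// Persist once at the end, like a single db.write()
adapter.write(data)

console.log(adapter.writes) // 1
console.log(adapter.data.posts.length) // 1000
```

With lowdb the pattern is the same: perform all your mutations on db.data, and call await db.write() once when you are done, rather than after every change.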

If you plan to scale, it's highly recommended to use databases like PostgreSQL or MongoDB instead.