Lowdb

Simple to use local JSON database. Use native JavaScript API to query. Written in TypeScript. 🦉


```js
// Edit db.json content using native JS API
db.data
  .posts
  .push({ id: 1, title: 'lowdb is awesome' })

// Save to file
db.write()
```

```js
// db.json
{
  "posts": [
    { "id": 1, "title": "lowdb is awesome" }
  ]
}
```

If you like lowdb, see also xv (test runner) and steno (fast file writer).

Sponsors


Please help me build OSS 👉 GitHub Sponsors

Features


- __Lightweight__
- __Minimalist__
- __TypeScript__
- __plain JS__
- Atomic write
- Hackable:
  - Change storage, file format (JSON, YAML, ...) or add encryption via adapters
  - Add lodash, ramda, ... for super powers!

Install


```sh
npm install lowdb
npm install lowdb@4 # If you're using Next.js or having trouble importing lowdb/node
```

See v4 docs for usage.

Usage


_Lowdb is a pure ESM package. If you're having trouble using it in your project, please read this._

__Next.js__: there's a known issue with Next.js. Until it's fixed, please use lowdb `^4.0.0`. The only difference between v5 and v4 is how lowdb is imported.

```js
// Remember to set type: module in package.json or use .mjs extension
import { join, dirname } from 'node:path'
import { fileURLToPath } from 'node:url'

import { Low } from 'lowdb'
import { JSONFile } from 'lowdb/node'

// File path
const __dirname = dirname(fileURLToPath(import.meta.url))
const file = join(__dirname, 'db.json')

// Configure lowdb to write to JSONFile
const adapter = new JSONFile(file)
const db = new Low(adapter)

// Read data from JSON file, this will set db.data content
await db.read()

// If db.json doesn't exist, db.data will be null
// Use the code below to set default data
// db.data = db.data || { posts: [] } // For Node < v15.x
db.data ||= { posts: [] }             // For Node >= 15.x

// Create and query items using native JS API
db.data.posts.push('hello world')
const firstPost = db.data.posts[0]

// Alternatively, you can also use this syntax if you prefer
const { posts } = db.data
posts.push('hello world')

// Finally write db.data content to file
await db.write()
```

```js
// db.json
{
  "posts": [ "hello world" ]
}
```

TypeScript


You can use TypeScript to check your data types.

```ts
type Data = {
  words: string[]
}

const adapter = new JSONFile<Data>('db.json')
const db = new Low(adapter)

db.data
  .words
  .push('foo') // ✅ Success

db.data
  .words
  .push(1) // ❌ TypeScript error
```

Lodash


You can also add lodash or other utility libraries to improve lowdb.

```ts
import lodash from 'lodash'

type Post = {
  id: number
  title: string
}

type Data = {
  posts: Post[]
}

// Extend Low class with a new `chain` field
class LowWithLodash<T> extends Low<T> {
  chain: lodash.ExpChain<this['data']> = lodash.chain(this).get('data')
}

const adapter = new JSONFile<Data>('db.json')
const db = new LowWithLodash(adapter)
await db.read()

// Instead of db.data use db.chain to access lodash API
const post = db.chain
  .get('posts')
  .find({ id: 1 })
  .value() // Important: value() must be called to execute chain
```

CLI, Server and Browser usage


See the [examples/](/examples) directory.

API


Classes


Lowdb has two classes (for asynchronous and synchronous adapters).

new Low(adapter)


```js
import { Low } from 'lowdb'
import { JSONFile } from 'lowdb/node'

const db = new Low(new JSONFile('file.json'))
await db.read()
await db.write()
```

new LowSync(adapterSync)


```js
import { LowSync } from 'lowdb'
import { JSONFileSync } from 'lowdb/node'

const db = new LowSync(new JSONFileSync('file.json'))
db.read()
db.write()
```

Methods


db.read()


Calls adapter.read() and sets db.data.

Note: the JSONFile and JSONFileSync adapters will set db.data to null if the file doesn't exist.

```js
db.data // === null
db.read()
db.data // !== null
```

db.write()


Calls adapter.write(db.data).

```js
db.data = { posts: [] }
db.write() // file.json will be { posts: [] }
db.data = {}
db.write() // file.json will be {}
```

Properties


db.data


Holds your db content. If you're using the adapters coming with lowdb, it can be any type supported by [JSON.stringify](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/stringify).

For example:

```js
db.data = 'string'
db.data = [1, 2, 3]
db.data = { key: 'value' }
```

Adapters


Lowdb adapters


JSONFile JSONFileSync


Adapters for reading and writing JSON files.

```js
import { JSONFile, JSONFileSync } from 'lowdb/node'

new Low(new JSONFile(filename))
new LowSync(new JSONFileSync(filename))
```

Memory MemorySync


In-memory adapters. Useful for speeding up unit tests. See the [examples/](/examples) directory.

```js
import { Memory, MemorySync } from 'lowdb'

new Low(new Memory())
new LowSync(new MemorySync())
```
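
The Memory adapter keeps everything in RAM, which is why tests using it run fast. Conceptually it is little more than a class holding data in a field; the sketch below is illustrative only, not lowdb's actual source:

```javascript
// Illustrative sketch of an in-memory adapter
// (not lowdb's source, but the same read/write contract)
class MemorySketch {
  #data = null

  async read() {
    return this.#data // resolves with whatever was last written, or null
  }

  async write(data) {
    this.#data = data
  }
}

// Usage: same interface as JSONFile, but nothing touches the disk
const adapter = new MemorySketch()
await adapter.write({ posts: ['hello'] })
const data = await adapter.read()
```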

LocalStorage


Synchronous adapter for window.localStorage.

```js
import { LocalStorage } from 'lowdb/browser'

new LowSync(new LocalStorage(name))
```

TextFile TextFileSync


Adapters for reading and writing text. Useful for creating custom adapters.

Third-party adapters


If you've published an adapter for lowdb, feel free to create a PR to add it here.

Writing your own adapter


You may want to create an adapter to write db.data to YAML or XML, to encrypt data, to use remote storage, ...

An adapter is a simple class that just needs to expose two methods:

```js
class AsyncAdapter {
  read() { /* ... */ } // should return Promise
  write(data) { /* ... */ } // should return Promise
}

class SyncAdapter {
  read() { /* ... */ } // should return data
  write(data) { /* ... */ } // should return nothing
}
```

For example, let's say you have some async storage and want to create an adapter for it:

```js
import { api } from './AsyncStorage'

class CustomAsyncAdapter {
  // Optional: your adapter can take arguments
  constructor(args) {
    // ...
  }

  async read() {
    const data = await api.read()
    return data
  }

  async write(data) {
    await api.write(data)
  }
}

const adapter = new CustomAsyncAdapter()
const db = new Low(adapter)
```

See [src/adapters/](src/adapters) for more examples.

Custom serialization


To create an adapter for a format other than JSON, you can use TextFile or TextFileSync.

For example:

```js
import { Adapter, Low } from 'lowdb'
import { TextFile } from 'lowdb/node'
import YAML from 'yaml'

class YAMLFile {
  constructor(filename) {
    this.adapter = new TextFile(filename)
  }

  async read() {
    const data = await this.adapter.read()
    if (data === null) {
      return null
    } else {
      return YAML.parse(data)
    }
  }

  write(obj) {
    return this.adapter.write(YAML.stringify(obj))
  }
}

const adapter = new YAMLFile('file.yaml')
const db = new Low(adapter)
```

Limits


Lowdb doesn't support Node's cluster module.

If you have large JavaScript objects (~10-100MB) you may hit some performance issues. This is because whenever you call db.write, the whole db.data is serialized using JSON.stringify and written to storage.

Whether this is acceptable depends on your use case. It can be mitigated by batching operations and calling db.write only when you need it.
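
One way to batch is to debounce writes so that a burst of mutations results in a single serialization. The sketch below is illustrative, not part of lowdb; debounceWrite and the delay are made-up names/values, and a stub stands in for the db so the example is self-contained:

```javascript
// Coalesce many write requests into one actual db.write() (illustrative sketch)
function debounceWrite(db, delayMs = 100) {
  let timer = null
  return () =>
    new Promise((resolve) => {
      clearTimeout(timer) // drop the previously scheduled write
      timer = setTimeout(() => resolve(db.write()), delayMs)
    })
}

// Works with any object exposing data and write(), e.g. a Low instance.
// Here a stub counts writes so the sketch runs on its own:
const db = { data: { posts: [] }, writes: 0, async write() { this.writes++ } }
const scheduleWrite = debounceWrite(db, 10)

for (let i = 0; i < 100; i++) {
  db.data.posts.push(i)
  scheduleWrite() // rescheduled on each call; only the last one fires
}
await scheduleWrite() // 100 mutations, a single serialization
```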

If you plan to scale, it's highly recommended to use databases like PostgreSQL or MongoDB instead.