This package recursively walks through a directory and caches files that match a regexp into a radix tree.
Since it spawns one goroutine per file / directory lookup, it is also context-aware, so the whole process can return early when the context is done.
Full documentation here.
vgo get -u github.com/gbrlsnchs/filecache
go get -u github.com/gbrlsnchs/filecache
import (
// ...
"github.com/gbrlsnchs/filecache"
)
c, err := filecache.ReadDir("foobar", "")
if err != nil {
// If err != nil, either directory "foobar" doesn't exist or one of the
// files inside it was deleted while it was being read.
}
txt := c.Get("bazqux.txt")
log.Print(txt)
c, err := filecache.ReadDir("foobar", `\.sql$`)
if err != nil {
// ...
}
q := c.Get("bazqux.sql")
log.Print(q)
log.Print(c.Len()) // amount of files cached
log.Print(c.Size()) // total size in bytes
c := filecache.New("foobar")
// do stuff...
if err := c.Load(`\.log$`); err != nil {
// ...
}
By default, this package spawns one goroutine for each file inside each directory.
The number of goroutines running concurrently is capped at runtime.NumCPU(). However, a custom limit can be set with the SetSemaphoreSize method.
c := filecache.New("foobar")
c.SetSemaphoreSize(100)
if err := c.Load(`\.log$`); err != nil {
// ...
}
- For bugs and suggestions, please open an issue
- For pushing changes, please open a pull request