Influxdb - Draft v0.4.0 #222

Merged on May 9, 2022 (87 commits)

Commits
8a46931
!!!!WIP!!!!
AnalogJ Jun 27, 2021
694fc74
fixing history.
AnalogJ Jun 27, 2021
5c614c5
adding docker build -> ghcr.io
AnalogJ Jul 25, 2021
ef01415
tagging.
AnalogJ Jul 25, 2021
967a927
setting the image type as suffix.
AnalogJ Jul 25, 2021
1fc910f
retest
AnalogJ Jul 25, 2021
80f4660
validate thresholds whenever SMART data is received.
AnalogJ Jul 26, 2021
bd19230
make sure data is persisted to DB.
AnalogJ Jul 26, 2021
a60edff
fixing mocked data
AnalogJ Aug 1, 2021
abe7a16
creating influxdb config file during startup.
AnalogJ Aug 7, 2021
975c034
WIP downsample scripts.
AnalogJ Oct 23, 2021
9878985
adding aggregation code
AnalogJ Oct 24, 2021
31b5dfa
ensure that all buckets are created during init. Remove all reference…
AnalogJ Oct 24, 2021
deba21f
update timestamps for testing.
AnalogJ Oct 24, 2021
5789c83
make sure the status is always exposed in the json data. make sure di…
AnalogJ Oct 25, 2021
7b7b4fe
fixing test.
AnalogJ Oct 25, 2021
ce032c5
fixes for Scrutiny end-to-end testing.
AnalogJ Oct 28, 2021
8fb5859
fixes for Scrutiny end-to-end testing.
AnalogJ Oct 28, 2021
b776fb8
tweaking retention policy code so we can test downsampling scripts.
AnalogJ Nov 17, 2021
772063a
find the temp history for the last week (by default). Smooth out data…
AnalogJ Nov 17, 2021
03bfdd3
changing the duration dropdown for temp history data. adding an /api/…
AnalogJ Nov 17, 2021
bff83de
query temp data across multiple buckets
AnalogJ Nov 18, 2021
47e8595
using constant vars for duration key magic strings. Fixing Errorf cal…
AnalogJ Nov 18, 2021
0872da5
fixes for tests.
AnalogJ Nov 18, 2021
903d571
fixes for tests.
AnalogJ Nov 21, 2021
f569ab6
[BROKEN COMMIT]
AnalogJ Apr 28, 2022
7a77719
broke scrutiny_repository.go into multiple files for easier explorati…
AnalogJ Apr 29, 2022
f60636a
broke scrutiny_repository.go into multiple files for easier explorati…
AnalogJ Apr 29, 2022
0a9d364
adding duration key to smart attributes api endpoint
AnalogJ Apr 29, 2022
bd39b2c
fixes for aggregation.
AnalogJ Apr 29, 2022
7cd828e
update the influxdb version in the standalone container.
AnalogJ Apr 29, 2022
00bc6ec
make sure we can pull config from env variables.
AnalogJ Apr 29, 2022
702c7cd
if running tests in github actions, use influxdb service for testing.
AnalogJ Apr 29, 2022
8462d21
waiting for influxdb before starting scrutiny app.
AnalogJ Apr 30, 2022
9a5c667
tweak the wait script.
AnalogJ Apr 30, 2022
88a99a1
information about downsampling.
AnalogJ Apr 30, 2022
9ebf252
information about downsampling.
AnalogJ Apr 30, 2022
5fb5b9a
if we're completing the InfluxDB setup via automation, attempt to sto…
AnalogJ Apr 30, 2022
5a1e390
started writing a TROUBLESHOOTING guide for the device collector.
AnalogJ Apr 30, 2022
9ee2674
started writing a TROUBLESHOOTING guide for the device collector.
AnalogJ Apr 30, 2022
e243d55
Document NVMe block device vs device controller binding.
AnalogJ Apr 30, 2022
5a4bcda
adding more docs.
AnalogJ May 1, 2022
d42faf3
fix WriteConfig interface.
AnalogJ May 1, 2022
0dba9f8
Merge branch 'master' into influxdb
AnalogJ May 1, 2022
62e5a71
build darwin/amd64 and darwin/arm64 binaries.
AnalogJ May 1, 2022
cfe77c9
install tzdata package everywhere.
AnalogJ May 1, 2022
8d052f0
update the bug report template
AnalogJ May 1, 2022
ccbb922
remove darwin builds.
AnalogJ May 1, 2022
84f3327
adding codecov coverage support.
AnalogJ May 1, 2022
20411af
adding name for coverage step.
AnalogJ May 1, 2022
035c946
referencing Github container registry for all images.
AnalogJ May 1, 2022
36617c8
using environmental variable in files path.
AnalogJ May 1, 2022
08f2471
using environmental variable in files path.
AnalogJ May 1, 2022
ae99ffd
using github container registry images.
AnalogJ May 2, 2022
49c1ef6
fixing local persistent dir for influxdb in omnibus.
AnalogJ May 2, 2022
75de6eb
adding ability to customize the scrutiny collector cron schedule usin…
AnalogJ May 2, 2022
ce3d45e
instructions in readme for how to override cron schedule.
AnalogJ May 2, 2022
646d0ef
change the environmental variable to COLLECTOR_CRON_SCHEDULE
AnalogJ May 2, 2022
97f6564
adding documentation for smartctl exit codes and addl test.
AnalogJ May 2, 2022
9d85920
started working on migration code.
AnalogJ May 3, 2022
2750cce
call out deprecated structs so they are not accidentally used via aut…
AnalogJ May 3, 2022
7d963c9
writing pseudocode algorithm for data migration.
AnalogJ May 3, 2022
8fe0dbe
partially working. Some datapoints are failing with panic and are sil…
AnalogJ May 4, 2022
fc5a9ba
fixed device processing in details page. Summary query is still broken.
AnalogJ May 5, 2022
7025185
fixed summary query.
AnalogJ May 5, 2022
5f12fbb
enable final migration cleanup.
AnalogJ May 5, 2022
1ced219
cleanup log messages.
AnalogJ May 5, 2022
3dbe597
Merge pull request #224 from AnalogJ/influx_migrations
AnalogJ May 5, 2022
fabc629
handle case where WWN not detected for a device (print error messages…
AnalogJ May 6, 2022
5bab9ac
make sure we can correctly save the config file if onboarding influx.
AnalogJ May 6, 2022
57c0f89
build multi arch controller image.
AnalogJ May 6, 2022
d48d0b9
build multi arch images
AnalogJ May 6, 2022
6fbe710
Merge pull request #228 from AnalogJ/multiarch_builds
AnalogJ May 6, 2022
a893d2d
fix multi-arch builds.
AnalogJ May 6, 2022
2350c13
fix multi-arch builds.
AnalogJ May 6, 2022
77da0f5
fixing download command (using curl).
AnalogJ May 6, 2022
d26e452
fixing download command (using curl).
AnalogJ May 6, 2022
8ea194b
fixing download command (using curl).
AnalogJ May 6, 2022
87ba8ff
better message for what services we're currently waiting for.
AnalogJ May 6, 2022
f39628e
by default show all temp data.
AnalogJ May 6, 2022
0cee744
highlight last updated dates when more than 2 weeks or 1 month.
AnalogJ May 7, 2022
786e7d0
make sure we print the overall device status in the details page.
AnalogJ May 7, 2022
2214feb
simple rename.
AnalogJ May 7, 2022
21d07a0
adding tests for Detect struct in collector. Adding ability to mock o…
AnalogJ May 7, 2022
2b5c864
update the codecov action version.
AnalogJ May 7, 2022
5ed69d7
adding tests for Smart and parser.
AnalogJ May 7, 2022
2967b6c
make sure that we set the config path when ReadConfig is called.
AnalogJ May 8, 2022
4 changes: 2 additions & 2 deletions .github/ISSUE_TEMPLATE/bug_report.md
@@ -18,6 +18,7 @@ If applicable, add screenshots to help explain your problem.

**Log Files**
If related to missing devices or SMART data, please run the `collector` in DEBUG mode, and attach the log file.
See [/docs/TROUBLESHOOTING_DEVICE_COLLECTOR.md](docs/TROUBLESHOOTING_DEVICE_COLLECTOR.md) for other troubleshooting tips.

```
docker run -it --rm -p 8080:8080 \
@@ -29,13 +30,12 @@ docker run -it --rm -p 8080:8080 \
-e COLLECTOR_LOG_FILE=/tmp/collector.log \
-e SCRUTINY_LOG_FILE=/tmp/web.log \
--name scrutiny \
analogj/scrutiny
ghcr.io/analogj/scrutiny:master-omnibus

# in another terminal trigger the collector
docker exec scrutiny scrutiny-collector-metrics run

# then use docker cp to copy the log files out of the container.
docker cp scrutiny:/tmp/collector.log collector.log
docker cp scrutiny:/tmp/web.log web.log

```
27 changes: 26 additions & 1 deletion .github/workflows/build.yaml
@@ -7,6 +7,20 @@ jobs:
name: Build
runs-on: ubuntu-latest
container: techknowlogick/xgo:go-1.13.x

# Service containers to run with `build` (Required for end-to-end testing)
services:
influxdb:
image: influxdb:2.2
env:
DOCKER_INFLUXDB_INIT_MODE: setup
DOCKER_INFLUXDB_INIT_USERNAME: admin
DOCKER_INFLUXDB_INIT_PASSWORD: password12345
DOCKER_INFLUXDB_INIT_ORG: scrutiny
DOCKER_INFLUXDB_INIT_BUCKET: metrics
DOCKER_INFLUXDB_INIT_ADMIN_TOKEN: my-super-secret-auth-token
ports:
- 8086:8086
env:
PROJECT_PATH: /go/src/github.com/analogj/scrutiny
CGO_ENABLED: 1
@@ -26,6 +40,13 @@ jobs:

go mod vendor
go test -race -coverprofile=coverage.txt -covermode=atomic -v -tags "static" $(go list ./... | grep -v /vendor/)
- name: Generate coverage report
uses: codecov/codecov-action@v2
with:
files: ${{ env.PROJECT_PATH }}/coverage.txt
flags: unittests
fail_ci_if_error: true
verbose: true
- name: Build Binaries
run: |

@@ -49,9 +70,13 @@
/build/scrutiny-collector-metrics-linux-arm-7
/build/scrutiny-web-windows-4.0-amd64.exe
/build/scrutiny-collector-metrics-windows-4.0-amd64.exe
# /build/scrutiny-web-darwin-arm64
# /build/scrutiny-collector-metrics-darwin-arm64
# /build/scrutiny-web-darwin-amd64
# /build/scrutiny-collector-metrics-darwin-amd64
# /build/scrutiny-web-freebsd-amd64
# /build/scrutiny-collector-metrics-freebsd-amd64
- uses: codecov/codecov-action@v1
- uses: codecov/codecov-action@v2
with:
file: ${{ env.PROJECT_PATH }}/coverage.txt
flags: unittests
16 changes: 15 additions & 1 deletion .github/workflows/docker-build.yaml
@@ -11,7 +11,6 @@ env:
REGISTRY: ghcr.io
IMAGE_NAME: ${{ github.repository }}


jobs:
collector:
runs-on: ubuntu-latest
@@ -22,6 +21,10 @@ jobs:
steps:
- name: Checkout repository
uses: actions/checkout@v2
- name: Set up QEMU
uses: docker/setup-qemu-action@v1
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v1
# Login against a Docker registry except on PR
# https://github.com/docker/login-action
- name: Log into registry ${{ env.REGISTRY }}
@@ -46,6 +49,7 @@
- name: Build and push Docker image
uses: docker/build-push-action@ad44023a93711e3deb337508980b4b5e9bcdc5dc
with:
platforms: linux/amd64,linux/arm64
context: .
file: docker/Dockerfile.collector
push: ${{ github.event_name != 'pull_request' }}
@@ -61,6 +65,10 @@
steps:
- name: Checkout repository
uses: actions/checkout@v2
- name: Set up QEMU
uses: docker/setup-qemu-action@v1
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v1
# Login against a Docker registry except on PR
# https://github.com/docker/login-action
- name: Log into registry ${{ env.REGISTRY }}
@@ -85,6 +93,7 @@
- name: Build and push Docker image
uses: docker/build-push-action@ad44023a93711e3deb337508980b4b5e9bcdc5dc
with:
platforms: linux/amd64,linux/arm64
context: .
file: docker/Dockerfile.web
push: ${{ github.event_name != 'pull_request' }}
@@ -100,6 +109,10 @@
steps:
- name: Checkout repository
uses: actions/checkout@v2
- name: Set up QEMU
uses: docker/setup-qemu-action@v1
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v1
# Login against a Docker registry except on PR
# https://github.com/docker/login-action
- name: Log into registry ${{ env.REGISTRY }}
@@ -124,6 +137,7 @@
- name: Build and push Docker image
uses: docker/build-push-action@ad44023a93711e3deb337508980b4b5e9bcdc5dc
with:
platforms: linux/amd64,linux/arm64
context: .
file: docker/Dockerfile
push: ${{ github.event_name != 'pull_request' }}
41 changes: 41 additions & 0 deletions .github/workflows/release.yaml
@@ -212,3 +212,44 @@ jobs:
asset_path: /build/scrutiny-collector-metrics-windows-4.0-amd64.exe
asset_name: scrutiny-collector-metrics-windows-4.0-amd64.exe
asset_content_type: application/octet-stream
#
# - name: Release Asset - Web - darwin-arm64
# id: upload-release-asset13
# uses: actions/upload-release-asset@v1
# env:
# GITHUB_TOKEN: ${{ secrets.SCRUTINY_GITHUB_TOKEN }}
# with:
# upload_url: ${{ steps.create_release.outputs.upload_url }} # This pulls from the CREATE RELEASE step above, referencing it's ID to get its outputs object, which include a `upload_url`. See this blog post for more info: https://jasonet.co/posts/new-features-of-github-actions/#passing-data-to-future-steps
# asset_path: /build/scrutiny-web-darwin-arm64
# asset_name: scrutiny-web-darwin-arm64
# asset_content_type: application/octet-stream
# - name: Release Asset - Collector - darwin-arm64
# id: upload-release-asset14
# uses: actions/upload-release-asset@v1
# env:
# GITHUB_TOKEN: ${{ secrets.SCRUTINY_GITHUB_TOKEN }}
# with:
# upload_url: ${{ steps.create_release.outputs.upload_url }} # This pulls from the CREATE RELEASE step above, referencing it's ID to get its outputs object, which include a `upload_url`. See this blog post for more info: https://jasonet.co/posts/new-features-of-github-actions/#passing-data-to-future-steps
# asset_path: /build/scrutiny-collector-metrics-darwin-arm64
# asset_name: scrutiny-collector-metrics-darwin-arm64
# asset_content_type: application/octet-stream
# - name: Release Asset - Web - darwin-amd64
# id: upload-release-asset15
# uses: actions/upload-release-asset@v1
# env:
# GITHUB_TOKEN: ${{ secrets.SCRUTINY_GITHUB_TOKEN }}
# with:
# upload_url: ${{ steps.create_release.outputs.upload_url }} # This pulls from the CREATE RELEASE step above, referencing it's ID to get its outputs object, which include a `upload_url`. See this blog post for more info: https://jasonet.co/posts/new-features-of-github-actions/#passing-data-to-future-steps
# asset_path: /build/scrutiny-web-darwin-amd64
# asset_name: scrutiny-web-darwin-amd64
# asset_content_type: application/octet-stream
# - name: Release Asset - Collector - darwin-amd64
# id: upload-release-asset16
# uses: actions/upload-release-asset@v1
# env:
# GITHUB_TOKEN: ${{ secrets.SCRUTINY_GITHUB_TOKEN }}
# with:
# upload_url: ${{ steps.create_release.outputs.upload_url }} # This pulls from the CREATE RELEASE step above, referencing it's ID to get its outputs object, which include a `upload_url`. See this blog post for more info: https://jasonet.co/posts/new-features-of-github-actions/#passing-data-to-future-steps
# asset_path: /build/scrutiny-collector-metrics-darwin-amd64
# asset_name: scrutiny-collector-metrics-darwin-amd64
# asset_content_type: application/octet-stream
38 changes: 36 additions & 2 deletions CONTRIBUTING.md
@@ -12,7 +12,7 @@ docker run -it --rm -p 8080:8080 \
--cap-add SYS_RAWIO \
--device=/dev/sda \
--device=/dev/sdb \
analogj/scrutiny
ghcr.io/analogj/scrutiny:master-omnibus
/scrutiny/bin/scrutiny-collector-metrics run
```

@@ -54,7 +54,8 @@ web:
src:
frontend:
path: ./dist

influxdb:
retention_policy: false

log:
file: 'web.log' #absolute or relative paths allowed, eg. web.log
@@ -71,6 +72,39 @@ go run webapp/backend/cmd/scrutiny/scrutiny.go start --config ./scrutiny.yaml
Now visit http://localhost:8080


If you'd like to populate the database with some test data, you can run the following commands:

> NOTE: you may need to update the `local_time` key within the JSON file; any timestamps older than ~3 weeks will be automatically ignored
> (since the downsampling & retention policy takes effect at 2 weeks).
> The `webapp/backend/pkg/models/testdata/helper.go` script handles this for you automatically.

```
docker run -p 8086:8086 --rm influxdb:2.2


docker run --rm -p 8086:8086 \
-e DOCKER_INFLUXDB_INIT_MODE=setup \
-e DOCKER_INFLUXDB_INIT_USERNAME=admin \
-e DOCKER_INFLUXDB_INIT_PASSWORD=password12345 \
-e DOCKER_INFLUXDB_INIT_ORG=scrutiny \
-e DOCKER_INFLUXDB_INIT_BUCKET=metrics \
influxdb:2.2


# curl -X POST -H "Content-Type: application/json" -d @webapp/backend/pkg/web/testdata/register-devices-req.json localhost:8080/api/devices/register
# curl -X POST -H "Content-Type: application/json" -d @webapp/backend/pkg/models/testdata/smart-ata.json localhost:8080/api/device/0x5000cca264eb01d7/smart
# curl -X POST -H "Content-Type: application/json" -d @webapp/backend/pkg/models/testdata/smart-ata-date.json localhost:8080/api/device/0x5000cca264eb01d7/smart
# curl -X POST -H "Content-Type: application/json" -d @webapp/backend/pkg/models/testdata/smart-ata-date2.json localhost:8080/api/device/0x5000cca264eb01d7/smart
# curl -X POST -H "Content-Type: application/json" -d @webapp/backend/pkg/models/testdata/smart-fail2.json localhost:8080/api/device/0x5000cca264ec3183/smart
# curl -X POST -H "Content-Type: application/json" -d @webapp/backend/pkg/models/testdata/smart-nvme.json localhost:8080/api/device/0x5002538e40a22954/smart
# curl -X POST -H "Content-Type: application/json" -d @webapp/backend/pkg/models/testdata/smart-scsi.json localhost:8080/api/device/0x5000cca252c859cc/smart
# curl -X POST -H "Content-Type: application/json" -d @webapp/backend/pkg/models/testdata/smart-scsi2.json localhost:8080/api/device/0x5000cca264ebc248/smart
go run webapp/backend/pkg/models/testdata/helper.go

curl localhost:8080/api/summary

```

### Collector
```
brew install smartmontools
1 change: 0 additions & 1 deletion Makefile
@@ -9,7 +9,6 @@ BINARY=\
linux/arm-7 \
linux/arm64 \


.PHONY: all $(BINARY)
all: $(BINARY) windows/amd64

31 changes: 23 additions & 8 deletions README.md
@@ -72,40 +72,48 @@ If you're using Docker, getting started is as simple as running the following co

```bash
docker run -it --rm -p 8080:8080 \
-v `pwd`/scrutiny:/scrutiny/config \
-v `pwd`/influxdb2:/scrutiny/influxdb \
-v /run/udev:/run/udev:ro \
--cap-add SYS_RAWIO \
--device=/dev/sda \
--device=/dev/sdb \
--name scrutiny \
analogj/scrutiny
ghcr.io/analogj/scrutiny:master-omnibus
```

- `/run/udev` is necessary to provide the Scrutiny collector with access to your device metadata
- `--cap-add SYS_RAWIO` is necessary to allow `smartctl` permission to query your device SMART data
- NOTE: If you have **NVMe** drives, you must add `--cap-add SYS_ADMIN` as well. See issue [#26](https://github.com/AnalogJ/scrutiny/issues/26#issuecomment-696817130)
- `--device` entries are required to ensure that your hard disk devices are accessible within the container.
- `analogj/scrutiny` is a omnibus image, containing both the webapp server (frontend & api) as well as the S.M.A.R.T metric collector. (see below)
- `ghcr.io/analogj/scrutiny:master-omnibus` is an omnibus image, containing both the webapp server (frontend & api) and the S.M.A.R.T. metric collector (see below)

### Hub/Spoke Deployment

In addition to the Omnibus image (available under the `latest` tag) there are 2 other Docker images available:

- `analogj/scrutiny:collector` - Contains the Scrutiny data collector, `smartctl` binary and cron-like scheduler. You can run one collector on each server.
- `analogj/scrutiny:web` - Contains the Web UI, API and Database. Only one container necessary
- `ghcr.io/analogj/scrutiny:master-collector` - Contains the Scrutiny data collector, `smartctl` binary and cron-like scheduler. You can run one collector on each server.
- `ghcr.io/analogj/scrutiny:master-web` - Contains the Web UI, API and Database. Only one container is necessary.

```bash
docker run -it --rm -p 8080:8080 \
docker run --rm -p 8086:8086 \
-v `pwd`/influxdb2:/var/lib/influxdb2 \
--name scrutiny-influxdb \
influxdb:2.2

docker run --rm -p 8080:8080 \
-v `pwd`/scrutiny:/scrutiny/config \
--name scrutiny-web \
analogj/scrutiny:web
ghcr.io/analogj/scrutiny:master-web

docker run -it --rm \
docker run --rm \
-v /run/udev:/run/udev:ro \
--cap-add SYS_RAWIO \
--device=/dev/sda \
--device=/dev/sdb \
-e SCRUTINY_API_ENDPOINT=http://SCRUTINY_WEB_IPADDRESS:8080 \
--name scrutiny-collector \
analogj/scrutiny:collector
ghcr.io/analogj/scrutiny:master-collector
```

## Manual Installation (without-Docker)
@@ -140,6 +148,13 @@ There are two configuration files available:

Neither file is required, however if provided, it allows you to configure how Scrutiny functions.

## Cron Schedule
Unfortunately, the cron schedule cannot be configured via `collector.yaml` (as the collector binary needs to be triggered by a scheduler/cron).
However, if you are using the official `ghcr.io/analogj/scrutiny:master-collector` or `ghcr.io/analogj/scrutiny:master-omnibus` docker images,
you can use the `COLLECTOR_CRON_SCHEDULE` environmental variable to override the default cron schedule (daily @ midnight - `0 0 * * *`).

`docker run -e COLLECTOR_CRON_SCHEDULE="0 0 * * *" ...`

## Notifications

Scrutiny supports sending SMART device failure notifications via the following services:
1 change: 1 addition & 0 deletions collector/pkg/collector/base.go
@@ -40,6 +40,7 @@ func (c *BaseCollector) postJson(url string, body interface{}, target interface{
return json.NewDecoder(r.Body).Decode(target)
}

// http://www.linuxguide.it/command_line/linux-manpage/do.php?file=smartctl#sect7
func (c *BaseCollector) LogSmartctlExitCode(exitCode int) {
if exitCode&0x01 != 0 {
c.logger.Errorln("smartctl could not parse commandline")
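For reference, smartctl's exit status is a bitmask rather than a single pass/fail value, which is why `LogSmartctlExitCode` tests individual bits. The snippet below is a minimal illustrative sketch of decoding the full mask (bit meanings taken from the smartctl man page); it is not code from this PR.

```go
package main

import "fmt"

// smartctl exit-status bits, as documented in the smartctl man page.
var smartctlExitBits = map[int]string{
	0: "command line did not parse",
	1: "device open failed, or device did not return an IDENTIFY DEVICE structure",
	2: "a SMART or other ATA command to the disk failed, or a checksum error was found in a SMART data structure",
	3: "SMART status check returned 'DISK FAILING'",
	4: "prefail attributes found <= threshold",
	5: "disk OK, but some attributes have been <= threshold in the past",
	6: "the device error log contains records of errors",
	7: "the device self-test log contains records of errors",
}

// decodeSmartctlExitCode returns a message for every bit set in the exit
// code, mirroring the bitwise checks in LogSmartctlExitCode.
func decodeSmartctlExitCode(exitCode int) []string {
	var messages []string
	for bit := 0; bit < 8; bit++ {
		if exitCode&(1<<bit) != 0 {
			messages = append(messages, smartctlExitBits[bit])
		}
	}
	return messages
}

func main() {
	// 192 = bits 6 and 7: both the error log and the self-test log contain errors.
	fmt.Println(decodeSmartctlExitCode(192))
}
```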
10 changes: 8 additions & 2 deletions collector/pkg/collector/metrics.go
@@ -4,7 +4,7 @@ import (
"bytes"
"encoding/json"
"fmt"
"github.com/analogj/scrutiny/collector/pkg/common"
"github.com/analogj/scrutiny/collector/pkg/common/shell"
"github.com/analogj/scrutiny/collector/pkg/config"
"github.com/analogj/scrutiny/collector/pkg/detect"
"github.com/analogj/scrutiny/collector/pkg/errors"
@@ -20,6 +20,7 @@ type MetricsCollector struct {
config config.Interface
BaseCollector
apiEndpoint *url.URL
shell shell.Interface
}

func CreateMetricsCollector(appConfig config.Interface, logger *logrus.Entry, apiEndpoint string) (MetricsCollector, error) {
@@ -34,6 +35,7 @@ func CreateMetricsCollector(appConfig config.Interface, logger *logrus.Entry, ap
BaseCollector: BaseCollector{
logger: logger,
},
shell: shell.Create(),
}

return sc, nil
@@ -107,6 +109,10 @@ func (mc *MetricsCollector) Validate() error {
//func (mc *MetricsCollector) Collect(wg *sync.WaitGroup, deviceWWN string, deviceName string, deviceType string) {
func (mc *MetricsCollector) Collect(deviceWWN string, deviceName string, deviceType string) {
//defer wg.Done()
if len(deviceWWN) == 0 {
mc.logger.Errorf("no device WWN detected for %s. Skipping collection for this device (no data association possible).\n", deviceName)
return
}
mc.logger.Infof("Collecting smartctl results for %s\n", deviceName)

args := []string{"-x", "-j"}
@@ -116,7 +122,7 @@
}
args = append(args, fmt.Sprintf("%s%s", detect.DevicePrefix(), deviceName))

result, err := common.ExecCmd(mc.logger, "smartctl", args, "", os.Environ())
result, err := mc.shell.Command(mc.logger, "smartctl", args, "", os.Environ())
resultBytes := []byte(result)
if err != nil {
if exitError, ok := err.(*exec.ExitError); ok {
5 changes: 5 additions & 0 deletions collector/pkg/common/shell/factory.go
@@ -0,0 +1,5 @@
package shell

func Create() Interface {
return new(localShell)
}
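The `localShell` type returned by `Create()` is not included in this diff view. A minimal sketch of what such an implementation could look like (an assumption for illustration, not the PR's actual code) is a thin wrapper around `exec.Command` that satisfies the `Interface` defined in the next file:

```go
package shell

import (
	"os/exec"

	"github.com/sirupsen/logrus"
)

// localShell is a hypothetical sketch: it runs the command on the local
// host and returns combined stdout/stderr as a single string.
type localShell struct{}

func (s *localShell) Command(logger *logrus.Entry, cmdName string, cmdArgs []string, workingDir string, environ []string) (string, error) {
	cmd := exec.Command(cmdName, cmdArgs...)
	if workingDir != "" {
		cmd.Dir = workingDir
	}
	if len(environ) > 0 {
		cmd.Env = environ
	}
	logger.Debugf("executing: %s %v", cmdName, cmdArgs)
	out, err := cmd.CombinedOutput()
	return string(out), err
}
```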
11 changes: 11 additions & 0 deletions collector/pkg/common/shell/interface.go
@@ -0,0 +1,11 @@
package shell

import (
"github.com/sirupsen/logrus"
)

// Create mock using:
// mockgen -source=collector/pkg/common/shell/interface.go -destination=collector/pkg/common/shell/mock/mock_shell.go
type Interface interface {
Command(logger *logrus.Entry, cmdName string, cmdArgs []string, workingDir string, environ []string) (string, error)
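With the shell abstraction in place, collector tests can stub smartctl output instead of shelling out to real hardware. The sketch below shows how a generated mock might be used; the `mock_shell` package name and `NewMockInterface` constructor are gomock's defaults and are assumed here rather than taken from this diff.

```go
package shell_test

import (
	"os"
	"testing"

	"github.com/golang/mock/gomock"
	"github.com/sirupsen/logrus"

	mock_shell "github.com/analogj/scrutiny/collector/pkg/common/shell/mock"
)

func TestMockShellReturnsCannedSmartctlOutput(t *testing.T) {
	ctrl := gomock.NewController(t)
	defer ctrl.Finish()

	// Return canned JSON whenever "smartctl" is invoked through the interface.
	fakeShell := mock_shell.NewMockInterface(ctrl)
	fakeShell.EXPECT().
		Command(gomock.Any(), "smartctl", gomock.Any(), gomock.Any(), gomock.Any()).
		Return(`{"json_format_version": [1, 0]}`, nil)

	logger := logrus.WithField("test", t.Name())
	out, err := fakeShell.Command(logger, "smartctl", []string{"-x", "-j", "/dev/sda"}, "", os.Environ())
	if err != nil || out == "" {
		t.Fatalf("expected canned smartctl output, got %q (err: %v)", out, err)
	}
	// In the real tests, the mock would be injected into MetricsCollector so
	// Collect() can be exercised without a physical device.
}
```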
}