
diskusage: Support Amazon AWS NVMe disks #693

Merged: 2 commits merged into python-diamond:master from somechris:diskusage-nvme on Mar 2, 2022
Conversation

@somechris (Contributor)

Amazon EC2's new C5 and M5 instance types expose their disks as NVMe
devices [1]. diskusage's default devices list did not match them, so we
extend the default to also match NVMe devices.

[1] https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/nvme-ebs-volumes.html
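For illustration, the change boils down to widening the collector's device-matching regular expression so that NVMe namespaces and their partitions are accepted alongside the existing disk names. The snippet below is a minimal sketch assuming a regex-style devices filter like diskusage's; the pattern pieces are illustrative, not the collector's exact default:

import re

# Hypothetical devices filter: the sd*/xvd* alternatives stand in for the
# existing default; the nvme alternative is the kind of addition this PR makes.
DEVICES = re.compile(
    r'(sd[a-z]+[0-9]*$)'                 # SCSI/SATA disks and partitions
    r'|(x?vd[a-z]+[0-9]*$)'              # Xen/virtio disks and partitions
    r'|(nvme[0-9]+n[0-9]+(p[0-9]+)?$)'   # NVMe namespaces and partitions
)

for name in ['sda', 'sda1', 'xvda', 'nvme0n1', 'nvme0n1p1', 'loop0']:
    print(name, bool(DEVICES.search(name)))
# Expected: everything matches except loop0.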

@coveralls commented Jan 2, 2018

Coverage Status

Coverage increased (+0.04%) to 24.703% when pulling 1c90be4 on somechris:diskusage-nvme into f2bece4 on python-diamond:master.

@shortdudey123 (Member)

#723 added the collector code but no tests. Can you rebase on master so your tests can be added?

@somechris (Contributor, Author)

#723 added the collector code but no tests [...]

#723 also came without support for partitions.

Can you rebase on master so your tests can be added?

Sure thing. I've split the commit in two (adding NVMe partition support and adding an NVMe test) so the separate changes are easier to see, and rebased onto master.

However, I currently lack a setup to run the tests. I hope they still pass.

shortdudey123 merged commit f518e72 into python-diamond:master on Mar 2, 2022
@adsharmaroku commented Aug 9, 2022

Do we have the same support for GCP VMs as well? Unfortunately, I don't see metrics collected for NVMe SSDs attached to GCP VMs. Below is the /proc/diskstats output:
[root@gcp1 diskusage]# cat /proc/diskstats
259 0 nvme0n1 679 0 377600 263 31606 0 9544800 97399 0 104426 97662
259 1 nvme0n2 694 0 382088 238 31618 0 9550944 87974 0 95299 88212
259 2 nvme0n3 690 0 383760 289 31806 0 9647200 91713 0 99148 92002
259 3 nvme0n4 693 0 385808 303 31642 0 9563232 85520 0 93017 85823
259 4 nvme0n5 693 0 385808 268 31822 0 9655392 87752 0 95166 88020
259 5 nvme0n6 686 0 379680 280 31666 0 9575520 97347 0 104401 97627
259 6 nvme0n7 702 0 384784 236 31718 0 9602144 90051 0 97448 90287
259 7 nvme0n8 703 0 385040 292 31714 0 9600096 91561 0 98616 91853
259 8 nvme0n9 679 0 377600 262 31682 0 9583712 110936 0 118057 111198
259 9 nvme0n10 688 0 383744 225 31714 0 9600096 98759 0 106141 98984
259 10 nvme0n11 688 0 383744 272 31658 0 9571424 109249 0 116445 109521
259 11 nvme0n12 688 0 383744 281 31610 0 9546848 109529 0 116500 109810
259 12 nvme0n13 685 0 381696 266 31658 0 9571424 109378 0 116436 109644
259 13 nvme0n14 688 0 383744 246 31654 0 9569376 100230 0 107557 100476
259 14 nvme0n15 679 0 377600 259 31742 0 9614432 104777 0 111978 105036
259 15 nvme0n16 688 0 383744 261 31690 0 9587808 102222 0 109702 102483
8 0 sda 24917 109 1145330 14814 38064 3546 1241817 163566 0 39843 178380
8 1 sda1 318 0 6505 565 1 0 1 0 0 52 565
8 2 sda2 24464 109 1131953 14208 37850 3546 1241816 163558 0 39792 177766
8 16 sdb 182 0 8722 47 39387 26318 698064 15806 0 31535 15853
[root@gcp1 diskusage]#

@shortdudey123 (Member)

Please confirm your build includes this PR (merged Mar 2, 2022). If it does and you still don't see the metrics, please open an issue.
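As a quick sanity check (illustrative pattern, not necessarily the collector's exact default), the device names in the paste above are ordinary NVMe namespace names and should be matched by an NVMe-aware filter:

import re

# Hypothetical NVMe pattern of the kind this PR adds.
nvme = re.compile(r'nvme[0-9]+n[0-9]+(p[0-9]+)?$')

for name in ['nvme0n1', 'nvme0n16', 'sda1', 'sdb']:
    print(name, bool(nvme.search(name)))
# nvme0n1 and nvme0n16 match, so if those disks still produce no metrics,
# the running build most likely predates this change.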
