
Commit

Merge branch 'release/0.0.10'
Jenkins committed Apr 5, 2022
2 parents 3ff9ea0 + 8b1674c commit 3ff6f4b
Showing 23 changed files with 127 additions and 101 deletions.
6 changes: 3 additions & 3 deletions README.md
@@ -9,15 +9,15 @@
Export your favorite GitHub repositories to Prometheus

* Use it _as a service_: See https://gh.skuzzle.de for instructions
* Deploy it _on-premise_: `docker pull ghcr.io/skuzzle/gh-prom-exporter/gh-prom-exporter:0.0.9`
* Deploy it _on-premise_: `docker pull ghcr.io/skuzzle/gh-prom-exporter/gh-prom-exporter:0.0.10`

## On-Premise deployment with docker
This application can easily be run as a docker container in whatever environment you like:

```
docker run -p 8080:8080 \
-e WEB_ALLOWANONYMOUSSCRAPE=true \
ghcr.io/skuzzle/gh-prom-exporter/gh-prom-exporter:0.0.9
ghcr.io/skuzzle/gh-prom-exporter/gh-prom-exporter:0.0.10
```

With _anonymous scraping_ allowed, you can now easily view the scrape results directly in the browser by navigating to
@@ -38,7 +38,7 @@ scrape_configs:
In case you want to enforce authenticated scrapes only, use this configuration instead:
```
docker run -p 8080:8080 \
ghcr.io/skuzzle/gh-prom-exporter/gh-prom-exporter:0.0.9
ghcr.io/skuzzle/gh-prom-exporter/gh-prom-exporter:0.0.10
```

Scraping now requires a GitHub access token; otherwise the service will respond with 401/Unauthorized.
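
For example, an authenticated scrape from the command line could look roughly like the sketch below. The endpoint path and the way the token is passed are assumptions made for illustration; consult the service documentation for the exact scheme it expects:
```
# Sketch only: the endpoint path and Authorization scheme are assumptions,
# not taken from this repository's documentation.
curl -H "Authorization: token <YOUR_GITHUB_TOKEN>" \
  http://localhost:8080/<owner>/<repository>
```
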
6 changes: 2 additions & 4 deletions RELEASE_NOTES.md
@@ -2,12 +2,10 @@

[![Coverage Status](https://coveralls.io/repos/github/skuzzle/gh-prom-exporter/badge.svg?branch=master)](https://coveralls.io/github/skuzzle/gh-prom-exporter?branch=master) [![Twitter Follow](https://img.shields.io/twitter/follow/skuzzleOSS.svg?style=social)](https://twitter.com/skuzzleOSS)

* Upgrade to Spring-Boot `2.6.3` (coming from `2.6.2`)
* Refactoring and improve internal documentation
* Add new metric: main branch commit count
* Add more internal metrics: `registered_scrapers`, `scrape_failures`, `abuses`, `api_calls` and `rate_limit_hits`
* Upgrade to Spring-Boot 2.6.6 (coming from 2.6.3)


```
docker pull ghcr.io/skuzzle/gh-prom-exporter/gh-prom-exporter:0.0.9
docker pull ghcr.io/skuzzle/gh-prom-exporter/gh-prom-exporter:0.0.10
```
6 changes: 3 additions & 3 deletions pom.xml
@@ -11,7 +11,7 @@

<groupId>de.skuzzle.ghpromexporter</groupId>
<artifactId>gh-prom-exporter</artifactId>
<version>0.0.9</version>
<version>0.0.10</version>

<name>gh-prom-exporter</name>
<description>Export GitHub repository metrics in prometheus format</description>
@@ -36,11 +36,11 @@
<docker.registry.name>ghcr.io</docker.registry.name>
<docker.image.name>${docker.registry.name}/${github.user}/${github.name}/${project.artifactId}</docker.image.name>

<spring-boot.version>2.6.3</spring-boot.version>
<spring-boot.version>2.6.6</spring-boot.version>
<spring-cloud.version>2021.0.0-RC1</spring-cloud.version>
<guava.version>31.0.1-jre</guava.version>
<github-api.version>1.135</github-api.version>
<snapshottest.version>0.0.5</snapshottest.version>
<snapshottest.version>1.2.3</snapshottest.version>
</properties>

<repositories>
4 changes: 1 addition & 3 deletions readme/RELEASE_NOTES.md
@@ -2,10 +2,8 @@

[![Coverage Status](https://coveralls.io/repos/github/${github.user}/${github.name}/badge.svg?branch=${github.main-branch})](https://coveralls.io/github/${github.user}/${github.name}?branch=${github.main-branch}) [![Twitter Follow](https://img.shields.io/twitter/follow/skuzzleOSS.svg?style=social)](https://twitter.com/skuzzleOSS)

* Upgrade to Spring-Boot `2.6.3` (coming from `2.6.2`)
* Refactoring and improve internal documentation
* Add new metric: main branch commit count
* Add more internal metrics: `registered_scrapers`, `scrape_failures`, `abuses`, `api_calls` and `rate_limit_hits`
* Upgrade to Spring-Boot 2.6.6 (coming from 2.6.3)


```
ScrapableRepository.java
@@ -14,6 +14,11 @@
import org.kohsuke.github.GHRepositoryStatistics.ContributorStats;
import org.kohsuke.github.GitHub;

/**
 * Downloads all relevant information from a GitHub repository.
*
* @author Simon Taddiken
*/
public final class ScrapableRepository {

private final GHRepository repository;
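
For context, a short usage sketch based on the `load(...)` and `totalAdditions()` calls visible further down in this diff (the repository name is a placeholder and `authentication` is assumed to be a `GitHubAuthentication` instance):
```
// Sketch only; "some-owner/some-repo" and 'authentication' are placeholders.
ScrapableRepository scrapable = ScrapableRepository.load(authentication, "some-owner/some-repo");
long additions = scrapable.totalAdditions();
```
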
AsynchronousScrapeService.java
@@ -30,13 +30,12 @@ public class AsynchronousScrapeService {
this.tracer = tracer;
}

public Mono<RepositoryMetrics> scrapeReactive(GitHubAuthentication authentication,
ScrapeRepositoryRequest request) {
final RegisteredScraper scrapeTarget = new RegisteredScraper(authentication, request);
public Mono<ScrapeResult> scrapeReactive(GitHubAuthentication authentication, ScrapeTarget scrapeTarget) {
final RegisteredScraper registeredScraper = new RegisteredScraper(authentication, scrapeTarget);

return registrationRepository
.getExistingOrLoad(scrapeTarget, scraper -> {
final RepositoryMetrics repositoryMetrics = scraper.scrapeWith(scrapeRepositoryService);
.getExistingOrLoad(registeredScraper, scraper -> {
final ScrapeResult repositoryMetrics = scraper.scrapeWith(scrapeRepositoryService);
log.info("Cache miss for {}. Scraped fresh metrics now in {}ms", scraper,
repositoryMetrics.scrapeDuration());
return repositoryMetrics;
@@ -58,26 +57,24 @@ void scheduledScraping() {
final Span newSpan = tracer.nextSpan().name("scheduledScrape");
try (var ws = tracer.withSpan(newSpan.start())) {
registrationRepository.registeredScrapers()
.doOnNext(scrapeTarget -> scrapeAndUpdateCache(newSpan, scrapeTarget))
.doOnNext(registeredScraper -> scrapeAndUpdateCache(newSpan, registeredScraper))
.doOnTerminate(() -> log.info("Updated cached metrics for all registered scrapers"))
.blockLast();
} finally {
newSpan.end();
}
}

private void scrapeAndUpdateCache(Span parentSpan, RegisteredScraper scrapeTarget) {
private void scrapeAndUpdateCache(Span parentSpan, RegisteredScraper scraper) {
final Span nextSpan = tracer.nextSpan(parentSpan).name("scrapeSingleRepo");
try (var ws = tracer.withSpan(nextSpan.start())) {
final RepositoryMetrics repositoryMetrics = scrapeTarget.scrapeWith(scrapeRepositoryService);
registrationRepository.updateRegistration(scrapeTarget, repositoryMetrics);
log.info("Asynschronously updated metrics for: {} in {}ms", scrapeTarget,
repositoryMetrics.scrapeDuration());
final ScrapeResult scrapeResult = scraper.scrapeWith(scrapeRepositoryService);
registrationRepository.updateRegistration(scraper, scrapeResult);
log.info("Asynschronously updated metrics for: {} in {}ms", scraper, scrapeResult.scrapeDuration());
} catch (final Exception e) {
registrationRepository.deleteRegistration(scrapeTarget);
registrationRepository.deleteRegistration(scraper);
AppMetrics.scrapeFailures().increment();
log.error("Scrape using '{}' threw exception. Will be removed from cache of active scrapers", scrapeTarget,
e);
log.error("Scrape using '{}' threw exception. Will be removed from cache of active scrapers", scraper, e);
} finally {
nextSpan.end();
}
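
A rough caller sketch for the renamed reactive API (hypothetical code, not part of this commit; `asynchronousScrapeService`, `authentication` and `log` are assumed to exist in the caller):
```
// Hypothetical caller of the renamed scrapeReactive(GitHubAuthentication, ScrapeTarget) method.
asynchronousScrapeService
        .scrapeReactive(authentication, ScrapeTarget.of("some-owner", "some-repo"))
        .subscribe(result -> log.info("Scrape took {}ms", result.scrapeDuration()));
```
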
MemoryRegistrationRepository.java
@@ -12,9 +12,9 @@

class MemoryRegistrationRepository implements RegistrationRepository {

private final Cache<RegisteredScraper, RepositoryMetrics> registeredScrapers;
private final Cache<RegisteredScraper, ScrapeResult> registeredScrapers;

public MemoryRegistrationRepository(Cache<RegisteredScraper, RepositoryMetrics> registeredScrapers) {
public MemoryRegistrationRepository(Cache<RegisteredScraper, ScrapeResult> registeredScrapers) {
this.registeredScrapers = registeredScrapers;
}

@@ -36,7 +36,7 @@ public Flux<RegisteredScraper> registeredScrapers() {
}

@Override
public void updateRegistration(RegisteredScraper scraper, RepositoryMetrics freshResult) {
public void updateRegistration(RegisteredScraper scraper, ScrapeResult freshResult) {
registeredScrapers.put(scraper, freshResult);
}

@@ -51,8 +51,8 @@ public void deleteAll() {
}

@Override
public Mono<RepositoryMetrics> getExistingOrLoad(RegisteredScraper scraper,
Function<RegisteredScraper, RepositoryMetrics> loader) {
public Mono<ScrapeResult> getExistingOrLoad(RegisteredScraper scraper,
Function<RegisteredScraper, ScrapeResult> loader) {
return Mono.fromSupplier(() -> {
try {
return registeredScrapers.get(scraper, () -> loader.apply(scraper));
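
A minimal construction sketch for this repository implementation, assuming the `Cache` type is Guava's `com.google.common.cache.Cache` (the expiry and size values below are illustrative, not the project's actual configuration):
```
// Sketch only: cache settings are assumptions.
Cache<RegisteredScraper, ScrapeResult> cache = CacheBuilder.newBuilder()
        .expireAfterWrite(Duration.ofMinutes(30))
        .maximumSize(10_000)
        .build();
RegistrationRepository registrationRepository = new MemoryRegistrationRepository(cache);
```
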
PrometheusRepositoryMetricAggration.java
@@ -5,12 +5,12 @@
import io.prometheus.client.Summary;

/**
* Holds all the prometheus meters that will be updated when a repository is freshly
* scraped.
* Aggregates the scrape results of multiple repositories into a single Prometheus
* registry.
*
* @author Simon Taddiken
*/
public final class RepositoryMeters {
public final class PrometheusRepositoryMetricAggration {

private static final String LABEL_REPOSITORY = "repository";
private static final String LABEL_OWNER = "owner";
@@ -28,12 +28,12 @@ public final class RepositoryMeters {
private final Counter size;
private final Summary scrapeDuration;

public static RepositoryMeters newRegistry() {
return new RepositoryMeters(new CollectorRegistry());
public static PrometheusRepositoryMetricAggration newRegistry() {
return new PrometheusRepositoryMetricAggration();
}

private RepositoryMeters(CollectorRegistry registry) {
this.registry = registry;
private PrometheusRepositoryMetricAggration() {
this.registry = new CollectorRegistry();
this.additions = Counter.build("additions", "Sum of additions over the last 52 weeks")
.namespace(NAMESPACE).labelNames(LABEL_OWNER, LABEL_REPOSITORY).register(registry);
this.deletions = Counter.build("deletions", "Negative sum of deletions over the last 52 weeks")
@@ -56,7 +56,9 @@ private RepositoryMeters(CollectorRegistry registry) {
.namespace(NAMESPACE).labelNames(LABEL_OWNER, LABEL_REPOSITORY).register(registry);
}

public RepositoryMeters addRepositoryScrapeResults(ScrapeRepositoryRequest repository, RepositoryMetrics metrics) {
public PrometheusRepositoryMetricAggration addRepositoryScrapeResults(
ScrapeTarget repository,
ScrapeResult metrics) {
additions.labels(repository.owner(), repository.name()).inc(metrics.totalAdditions());
deletions.labels(repository.owner(), repository.name()).inc(metrics.totalDeletions());
commitsToMainBranch.labels(repository.owner(), repository.name()).inc(metrics.commitsToMainBranch());
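
A rough usage sketch of the renamed aggregation type (hypothetical; `target`/`result` and `otherTarget`/`otherResult` stand for `ScrapeTarget` and `ScrapeResult` values obtained from scrapes):
```
// Hypothetical usage: results for several repositories end up in one collector registry.
PrometheusRepositoryMetricAggration aggregation = PrometheusRepositoryMetricAggration.newRegistry();
aggregation.addRepositoryScrapeResults(target, result);
aggregation.addRepositoryScrapeResults(otherTarget, otherResult);
```
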
RegistrationRepository.java
@@ -41,7 +41,7 @@ interface RegistrationRepository {
* @param scraper The scraper.
* @param freshResult The scrape result.
*/
void updateRegistration(RegisteredScraper scraper, RepositoryMetrics freshResult);
void updateRegistration(RegisteredScraper scraper, ScrapeResult freshResult);

/**
* Deletes the given single scraper from this repository.
@@ -65,19 +65,19 @@ interface RegistrationRepository {
* @return The metrics for the given scraper. Either freshly scraped or obtained from
* cache.
*/
Mono<RepositoryMetrics> getExistingOrLoad(RegisteredScraper scraper,
Function<RegisteredScraper, RepositoryMetrics> loader);
Mono<ScrapeResult> getExistingOrLoad(RegisteredScraper scraper,
Function<RegisteredScraper, ScrapeResult> loader);

/**
* Combines a single scrape target (GitHub repository) along with authentication
* information that are needed to access said target.
*
* @author Simon Taddiken
*/
record RegisteredScraper(GitHubAuthentication authentication, ScrapeRepositoryRequest repository) {
record RegisteredScraper(GitHubAuthentication authentication, ScrapeTarget target) {

RepositoryMetrics scrapeWith(ScrapeService scrapeRepositoryService) {
return scrapeRepositoryService.scrape(authentication, repository);
ScrapeResult scrapeWith(ScrapeService scrapeRepositoryService) {
return scrapeRepositoryService.scrape(authentication, target);
}
}
}
ScrapeResult.java
@@ -5,7 +5,7 @@
*
* @author Simon Taddiken
*/
public record RepositoryMetrics(
public record ScrapeResult(
long totalAdditions,
long totalDeletions,
int commitsToMainBranch,
19 changes: 9 additions & 10 deletions src/main/java/de/skuzzle/ghpromexporter/scrape/ScrapeService.java
@@ -11,9 +11,8 @@
import reactor.core.scheduler.Schedulers;

/**
* Internal service for actually scraping a repository (given as
* {@link ScrapeRepositoryRequest}), accessing it using a given
* {@link GitHubAuthentication}.
* Internal service for actually scraping a repository (given as {@link ScrapeTarget}),
* accessing it using a given {@link GitHubAuthentication}.
*
* @author Simon Taddiken
*/
@@ -22,19 +21,19 @@ class ScrapeService {

private static final Logger log = LoggerFactory.getLogger(ScrapeService.class);

public Mono<RepositoryMetrics> scrapeReactive(GitHubAuthentication authentication,
ScrapeRepositoryRequest repository) {
return Mono.fromSupplier(() -> scrape(authentication, repository))
public Mono<ScrapeResult> scrapeReactive(GitHubAuthentication authentication,
ScrapeTarget target) {
return Mono.fromSupplier(() -> scrape(authentication, target))
.subscribeOn(Schedulers.boundedElastic());
}

public RepositoryMetrics scrape(GitHubAuthentication authentication, ScrapeRepositoryRequest repository) {
public ScrapeResult scrape(GitHubAuthentication authentication, ScrapeTarget target) {
final long start = System.currentTimeMillis();
return AppMetrics.scrapeDuration().record(() -> {
final var repositoryFullName = repository.repositoryFullName();
final var repositoryFullName = target.repositoryFullName();
final var scrapableRepository = ScrapableRepository.load(authentication, repositoryFullName);

final RepositoryMetrics repositoryMetrics = new RepositoryMetrics(
final ScrapeResult repositoryMetrics = new ScrapeResult(
scrapableRepository.totalAdditions(),
scrapableRepository.totalDeletions(),
scrapableRepository.commitsToMainBranch(),
@@ -46,7 +45,7 @@ public RepositoryMetrics scrape(GitHubAuthentication authentication, ScrapeRepos
scrapableRepository.sizeInKb(),
System.currentTimeMillis() - start);

log.debug("Scraped fresh metrics for {} in {}ms", repository, repositoryMetrics.scrapeDuration());
log.debug("Scraped fresh metrics for {} in {}ms", target, repositoryMetrics.scrapeDuration());
return repositoryMetrics;
});
}
ScrapeTarget.java
@@ -2,10 +2,10 @@

import java.util.Objects;

public record ScrapeRepositoryRequest(String owner, String repository) {
public record ScrapeTarget(String owner, String repository) {

public static ScrapeRepositoryRequest of(String owner, String repository) {
return new ScrapeRepositoryRequest(
public static ScrapeTarget of(String owner, String repository) {
return new ScrapeTarget(
Objects.requireNonNull(owner, "owner must not be null"),
Objects.requireNonNull(repository, "repository must not be null"));
}
13 changes: 13 additions & 0 deletions src/main/java/de/skuzzle/ghpromexporter/web/AbuseLimiter.java
@@ -23,6 +23,13 @@ public AbuseLimiter(Cache<InetAddress, Integer> abusers, int abuseLimit) {
this.abuseLimit = abuseLimit;
}

/**
     * Returns an empty Mono if the abuse limit was hit for the given origin.
     * Otherwise the Mono will contain an arbitrary object.
*
* @param origin The origin IP to check.
* @return An empty Mono if the abuse limit was violated by that IP.
*/
Mono<Object> blockAbusers(InetAddress origin) {
return Mono.fromSupplier(() -> _0IfNull(abusers.getIfPresent(origin)))
.filter(actualAbuses -> abuseLimitExceeded(origin, actualAbuses))
@@ -38,6 +45,12 @@ private boolean abuseLimitExceeded(InetAddress origin, int actualAbuses) {
return true;
}

/**
* Records a potential abuse case for the given origin IP.
*
* @param e The error that occurred during request processing.
* @param origin The origin IP.
*/
void recordFailedCall(Throwable e, InetAddress origin) {
log.warn("Abuse recorded for {}: {}", origin, e.getMessage());
abusers.put(origin, _0IfNull(abusers.getIfPresent(origin)) + 1);
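
A sketch of how a caller could wire the two documented methods together (hypothetical; `serveMetrics(...)` and the error signalling are assumptions, not code from this repository):
```
// Hypothetical wiring; serveMetrics(...) is an assumed method of the surrounding handler.
Mono<String> response = abuseLimiter.blockAbusers(clientAddress)
        .flatMap(ok -> serveMetrics(clientAddress))
        .doOnError(e -> abuseLimiter.recordFailedCall(e, clientAddress))
        .switchIfEmpty(Mono.error(new IllegalStateException("abuse limit exceeded")));
```
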
MultipleScrapeTargets.java
@@ -3,27 +3,27 @@
import java.util.Arrays;
import java.util.List;

import de.skuzzle.ghpromexporter.scrape.ScrapeRepositoryRequest;
import de.skuzzle.ghpromexporter.scrape.ScrapeTarget;
import reactor.core.publisher.Flux;

final class MultipleRepositories {
final class MultipleScrapeTargets {

private final String owner;
private final List<String> repositories;

private MultipleRepositories(String owner, List<String> repositories) {
private MultipleScrapeTargets(String owner, List<String> repositories) {
this.owner = owner;
this.repositories = repositories;
}

public static MultipleRepositories parse(String owner, String repositoriesString) {
public static MultipleScrapeTargets parse(String owner, String repositoriesString) {
final String[] repositories = repositoriesString.split(",");
return new MultipleRepositories(owner, Arrays.asList(repositories));
return new MultipleScrapeTargets(owner, Arrays.asList(repositories));
}

Flux<ScrapeRepositoryRequest> requests() {
Flux<ScrapeTarget> targets() {
return Flux.fromStream(repositories.stream()
.map(repository -> ScrapeRepositoryRequest.of(owner, repository)));
.map(repository -> ScrapeTarget.of(owner, repository)));
}

@Override
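
A small usage sketch for the renamed class (hypothetical; the owner and repository names are placeholders):
```
// Hypothetical usage of the renamed parse(...)/targets() API; values are placeholders.
MultipleScrapeTargets targets = MultipleScrapeTargets.parse("some-owner", "repo-one,repo-two");
targets.targets()
        .doOnNext(target -> System.out.println(target.owner() + "/" + target.repository()))
        .blockLast();
```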