Annotation Sniffer

Annotation Sniffer is a tool that extracts code annotation metrics from Java source code.

How to install

Download the source code and generate an executable jar file, or download the jar file provided with the latest release:

mvn clean package -P executable

How to use

java -jar asniffer.jar -p <path to project> -r <path to output report> -t <report type> -m <single/multi>

The "path to project" is mandatory, and should be the path to the java project to be analyzed (i.e, contains the source code files). Considering that only one java project is being analyzed, the directory should have the arrangement below.

.
├── project                # Directory containing the source file for the project. This is the path provided

In this case the ASniffer will consider that every .java file inside the directory project belongs to the same project.

The ASniffer can also analyze multiple projects at once. In this case, the user should provide a directory with the arrangement described below.

.
├── projects                # Root directory for projects. This is the path to be provided
    ├── project1            # Contains the source files for project1
    ├── project2            # Contains the source files for project2
    └── ...         

In this case the directory projects is a root folder, and the sub-directories project1, project2, and so forth are each separate Java projects. They can be completely unrelated projects. To use this feature, manually arrange the project directories to fit the layout described above.

The second parameter, "path to output report", is optional. It tells ASniffer where to store the output report file. If no path is provided, ASniffer places the report in the "path to project". This parameter is a path to a directory and should not include a file name or .json extension. ASniffer generates the report file itself, using the project's name as the file name, i.e., projectName.json. ASniffer assumes that the name of the root directory is the name of the project. When several projects are being analyzed, ASniffer treats each sub-directory (inside the provided root directory) as a separate project, and each project gets its own output report file placed in the "path to output report" (if provided, or in the "path to project" otherwise).

The third parameter determines the type of the output report file. Currently, ASniffer outputs a .json file. If no option is provided, ASniffer outputs a default JSON. The output is placed in a folder called asniffer_results.

The following parameters can be used for the type of output file:

  • -t json    # default JSON output
  • -t jsonAV  # outputs three JSON files suitable to be used by the Annotation Visualizer

For more information about the Annotation Visualizer and its views, please refer to AVisualizer.

The fourth parameter (single/multi) informs ASniffer whether the "path to project" contains only one project (i.e., every .java file belongs to only one project) or several projects (i.e., the root directory contains several sub-directories, each being a separate project). If no option is provided, ASniffer assumes it is a single project.
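
Putting the options together, a full invocation for the multi-project layout above would look like this (directory names are placeholders):

java -jar asniffer.jar -p projects -r reports -t json -m multi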

Example Usage

As a running example, we will collect annotation metrics from the ASniffer code itself. Consider that both the asniffer.jar and the asniffer directory with the source code are in the same root directory. We have the following directory structure:

.
├── Documents                
    ├── asniffer.jar         # The ASniffer jar file. Can be manually generated or downloaded from the release section
    ├── asniffer/            # The ASniffer project folder, downloaded from its GitHub repository   

To run the tool, we use the following command:

java -jar asniffer.jar -p asniffer

Notice that only one argument is passed: asniffer, the path to the source code being analyzed. Since no -m (single/multi) option was provided, ASniffer assumes this is a single project, i.e., every .java file inside the directory asniffer/ belongs to one project, the asniffer project. No output report type was provided (-t option), hence the output will be a JSON file. And since no output path was provided, the JSON report will be placed under asniffer/ with the name asniffer.json.

After executing the command, asniffer.json is generated and placed under asniffer/asniffer_results. The following is a sample of the asniffer directory.

.
├── asniffer                
    ├── asniffer_results
        ├── asniffer.json  # The JSON report generated after the ASniffer collected metrics from the ASniffer project
    ├── src            # The source code folder 
    ├── pom.xml        # The pom file 
    └── ... 

The generated JSON can be found in the repository, under annotationtest/asniffer.json.

Here is a sample of this JSON file.

{
  "projectName": "asniffer",
  "packages": [
    {
      "packageName": "annotationtest",
      "results": {
        "annotationtest.AnnotationTest": {
          "className": "annotationtest.AnnotationTest",
          "type": "class",
          "annotSchemasMap": {
            "Annotation0-149": "java.lang",
            "JoinColumn-139": "javax.persistence",
            "Override-192": "java.lang",
            "JoinColumn-137": "javax.persistence",
            "AssociationOverride-138": "javax.persistence",
            "Override-72": "java.lang",
            "AssociationOverride-136": "javax.persistence",
            "Annotation0-144": "java.lang",
            "Override-49": "java.lang",
            "Id-150": "javax.persistence",
            ...
          },
          "classMetric": {
            "LOC": 189,
            "ASC": 3,
            "AC": 27,
            "NAEC": 16,
            "UAC": 17,
            "NEC": 32
          },
          ...

Annotation Metrics

The Annotation Sniffer was developed to aid research in code annotation analysis. It collects nine annotation metrics, proposed and defined in the paper A Metrics Suite for Code Annotation Assessment.

Collected metrics

  • AC: Annotations in Class
  • UAC: Unique Annotations in Class
  • ASC: Annotation Schema in Class
  • AED: Annotation in Element Declaration
  • AA: Attributes in Annotation
  • ANL: Annotation Nesting Level
  • LOCAD: LOC in Annotation Declaration
  • NEC: Number of Elements in Class
  • NAEC: Number of Annotated Elements in Class
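
For intuition, consider the small, hypothetical JPA-style class below. The metric values in the comments are a manual reading of the definitions above, not actual ASniffer output, so treat them as an illustration of what each metric counts.

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;

// Hypothetical example; metric values in the comments are a manual
// reading of the definitions above, not ASniffer output.
@Entity                     // AED = 1 for the class declaration
public class Person {

    @Id                     // first annotation on this field
    @Column(name = "id")    // AA = 1 (one attribute: name); AED = 2 for this field
    private Long id;

    @Column(name = "name")  // AED = 1 for this field
    private String name;
}

// AC  = 4: four annotations in total (@Entity, @Id, @Column, @Column)
// UAC = 3: three distinct annotations (Entity, Id, Column)
// ASC = 1: all annotations come from a single schema (javax.persistence)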

JSON Output Format

  • Class Metrics: These metrics have one value per class: AC, UAC, ASC, NAEC, and NEC.

  • Code Element Metrics: These metrics have one value per code element (method, field, enum, type). Our suite has one metric, AED (Annotations in Element Declaration), that measures the number of annotations declared in any given code element.

  • Annotation Metrics: These metrics have one value per annotation declared in the class. They evaluate the annotation itself (AA, LOCAD, ANL).

  • For each code element, the report contains the element name, type (field, method, enum, etc.), the source code line where the element is located, and the "code element metric values" (for now, only AED fits this category).

  • If the AED is greater than zero, the code element contains annotations, so the "annotation metrics" values are printed in the JSON. The report has the annotation name, source-code line, and the values for AA, ANL, and LOCAD (see the illustrative fragment after this list).

  • In case of multiple projects, one JSON file is generated for each one of them
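
To illustrate, the fragment below sketches the shape these bullet points imply for a single annotated code element. It is hand-written from the description above; the key names are assumptions, so use a real report (e.g., annotationtest/asniffer.json) as the authoritative reference.

{
  "elementName": "name",
  "type": "field",
  "line": 150,
  "AED": 1,
  "annotations": [
    { "name": "Column", "line": 150, "AA": 1, "ANL": 1, "LOCAD": 1 }
  ]
}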

Creating a new Metric for Annotation Sniffer

The Annotation Sniffer uses Reflection to discover which metrics it should collect. If you wish to use Annotation Sniffer on your project and create your own custom metrics, follow these steps:

  • Class Metrics: If you wish to create your own Metric Class, your class must:

    • Extend ASTVisitor (to visit the compilation unit)
    • Implement the IClassMetricCollector interface. It contains two methods, execute(CompilationUnit, MetricResult, AMReport) and setResult(MetricResult). The MetricResult class is where you store your metric's value, as well as the name of your custom metric. Check the code for AC, ASC, and UAC for examples.
    • Annotate the class with @ClassMetric.
  • If you wish to create new Annotation Metrics, then you need to:

    • Annotate the class with @AnnotationMetric.
    • Implement the interface IAnnotationMetricCollector. This interface has only one method, execute(CompilationUnit, AnnotationMetricModel, Annotation). The AnnotationMetricModel class is where you will store the metric value and name. The Annotation class is the JDT (Java Development Tools) representation of the annotation, on which you can perform your analysis. Check the code for ANL, AA, and LOCAD for more examples.
  • Check the metrics included in the package br.inpe.cap.asniffer.metric for more information.
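
To make the steps concrete, here is a minimal sketch of a custom class metric that counts field declarations. It assumes the Eclipse JDT visitor API; the exact MetricResult method for storing the name and value is an assumption (mirror what AC, ASC, and UAC actually do), and the package names of ASniffer's own types may vary between versions, so their imports are omitted here.

import org.eclipse.jdt.core.dom.ASTVisitor;
import org.eclipse.jdt.core.dom.CompilationUnit;
import org.eclipse.jdt.core.dom.FieldDeclaration;
// plus imports for ASniffer's own types (@ClassMetric, IClassMetricCollector,
// MetricResult, AMReport), whose packages may vary by version

@ClassMetric // lets ASniffer discover this collector via reflection
public class NumberOfFieldsMetric extends ASTVisitor implements IClassMetricCollector {

    private int fieldCount = 0;

    // Standard JDT visitor callback: called once per field declaration.
    @Override
    public boolean visit(FieldDeclaration node) {
        fieldCount++;
        return super.visit(node);
    }

    @Override
    public void execute(CompilationUnit cu, MetricResult result, AMReport report) {
        cu.accept(this); // walk the AST, triggering visit(...) above
    }

    @Override
    public void setResult(MetricResult result) {
        // Hypothetical setter; the real MetricResult API may differ —
        // check the AC, ASC, and UAC implementations for the actual calls.
        result.setMetric("NOF", fieldCount);
    }
}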

ASniffer API

ASniffer is available on Maven Central:

<dependency>
  <groupId>io.github.phillima</groupId>
  <artifactId>asniffer</artifactId>
  <version>2.4.7</version>
</dependency>

If you wish to use ASniffer as an API in your own projects, we provide some methods for this. First, create an instance of the ASniffer class, passing two parameters to its constructor: the path to the source code to be analyzed and the path where you wish to store the output file. Afterwards you may call collectSingle() or collectMultiple().

 
 String pathToCode = "path to the source code to be analyzed";
 String pathToReport = "path where you wish to store the generated report file";

 ASniffer aSniffer = new ASniffer(pathToCode, pathToReport);

 aSniffer.collectSingle();   // for a single project

 aSniffer.collectMultiple(); // for multiple projects; assumes the directory structure
                             // is prepared for multiple projects, as described in the
                             // "How to use" section of this README
 

With these calls, ASniffer will run, collect the annotation metrics, and place the output report file at the provided path. However, if you would like to perform some analysis on the metric values, collectSingle() and collectMultiple() return, respectively, an instance of AMReport and a List<AMReport>. The AMReport class contains the complete report of the collected metrics for a Java project, which is why collectMultiple() returns a list of AMReports (one per Java project).

The AMReport (located here) contains the project's name and the list of packages. The packages are stored in PackageModel instances, which in turn contain ClassModel instances that store the annotation metric values.
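
Since collectSingle() returns a single AMReport, the single-project case can be inspected directly:

 // Collect metrics for a single project and read the report back.
 AMReport report = new ASniffer(pathToCode, pathToReport).collectSingle();
 System.out.println(report.getProjectName());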

The following example collects the annotation metrics on multiple projects and prints each project's name along with the name of every package.

 String path = "projects";

 ASniffer aSniffer = new ASniffer(path, path);
 List<AMReport> reports = aSniffer.collectMultiple();

 for (AMReport amReport : reports) {
     System.out.println(amReport.getProjectName());
     for (PackageModel packageModel : amReport.getPackages())
         System.out.println(packageModel.getPackageName());
 }

How to Cite ASniffer

@article{Lima2020,
  doi = {10.21105/joss.01960},
  url = {https://doi.org/10.21105/joss.01960},
  year = {2020},
  publisher = {The Open Journal},
  volume = {5},
  number = {47},
  pages = {1960},
  author = {Phyllipe Lima and Eduardo Guerra and Paulo Meirelles},
  title = {Annotation Sniffer: A tool to Extract Code Annotations Metrics},
  journal = {Journal of Open Source Software}
}