Install the Lombok plugin in IntelliJ IDEA.
./gradlew :user-service:bootRun
./gradlew :logistics-service:bootRun
./gradlew :product-service:bootRun
./gradlew :order-service:bootRun
Debug mode: append --debug-jvm to any bootRun command above
Continuous build: ./gradlew build --continuous
├── README.md
├── order-service
├── user-service
├── product-service
└── logistics-service
Request flow for http://localhost:8080/order-service/orders/1234567890:
- orderId -> Order Service
  - userId -> User Service
  - productId -> Product Service
  - logisticsId -> Logistics Service
Once the services are started, you can access two endpoints:
- http://localhost:8080/order-service/orders/1234567890
- http://localhost:8080/order-service/orders/1234567890/sync
Set up zipkin-server:
- add dependencies for zipkin-server:
  compile('io.zipkin.java:zipkin-server')
  runtime('io.zipkin.java:zipkin-autoconfigure-ui')
- add the @EnableZipkinServer annotation to ZipkinServerApplication
- run ./gradlew :zipkin-server:bootRun
- check the zipkin dashboard
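The main class from the steps above can be sketched like this (the package layout is an assumption; @EnableZipkinServer comes from the io.zipkin.java:zipkin-server dependency):

```java
// Hypothetical main class for the zipkin-server module.
// @EnableZipkinServer turns this Spring Boot app into a Zipkin collector with a UI.
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import zipkin.server.EnableZipkinServer;

@SpringBootApplication
@EnableZipkinServer
public class ZipkinServerApplication {

    public static void main(String[] args) {
        SpringApplication.run(ZipkinServerApplication.class, args);
    }
}
```

By default the dashboard is served on port 9411; set server.port in application.properties to change it.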
Configure the trace client (for order-service, user-service, product-service and logistics-service):
- add dependencies to each service:
  compile('org.springframework.cloud:spring-cloud-starter-sleuth')
  compile('org.springframework.cloud:spring-cloud-starter-zipkin')
- run ./gradlew cI idea, then start each service with ./gradlew :xxx-service:bootRun
- add logging for HTTP requests (slf4j)
- start the application and check the log
- check zipkin dashboard
- add the sampler property spring.sleuth.sampler.percentage=0.2
- check zipkin dashboard
- add dependencies:
  compile('io.zipkin.java:zipkin-autoconfigure-storage-mysql:2.2.1')
  compile('mysql:mysql-connector-java:5.1.13')
  compile('org.springframework.boot:spring-boot-starter-jdbc')
- we use MySQL as the DBMS; create the related database
- add the Spring datasource configuration (schema/username/password/url/driver-class-name, ...)
- check the database
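The datasource settings from the step above might look like this in application.properties (the database name, credentials and schema location are placeholders, not values from the source):

```properties
# Hypothetical MySQL datasource for zipkin span storage
spring.datasource.url=jdbc:mysql://localhost:3306/zipkin
spring.datasource.username=zipkin
spring.datasource.password=zipkin
spring.datasource.driver-class-name=com.mysql.jdbc.Driver
spring.datasource.schema=classpath:/mysql.sql
# tell zipkin to persist spans to MySQL instead of in-memory storage
zipkin.storage.type=mysql
```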
- add a log output file
- start the application and check that the log file is generated
- add a log appender to generate the log (create a file named logback.xml under the resources directory):
<configuration>
    <appender name="FILE" class="ch.qos.logback.core.FileAppender">
        <file>build/log/application.log</file>
        <encoder>
            <pattern>%-4relative [%thread] %-5level %logger{35} - %msg%n</pattern>
        </encoder>
    </appender>
    <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
        </encoder>
    </appender>
    <root level="info">
        <appender-ref ref="FILE" />
        <appender-ref ref="STDOUT" />
    </root>
</configuration>
- create a LoggingInterceptor class that implements the HandlerInterceptor interface
- override the preHandle behavior
- create a log configuration class that extends WebMvcConfigurerAdapter
- override the addInterceptors method and register the LoggingInterceptor with the InterceptorRegistry
- restart the server and check the log.
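The interceptor and its registration from the steps above can be sketched as follows (class names and the log message format are assumptions; the empty postHandle/afterCompletion overrides are required when implementing HandlerInterceptor on Spring 4.x):

```java
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.context.annotation.Configuration;
import org.springframework.web.servlet.HandlerInterceptor;
import org.springframework.web.servlet.ModelAndView;
import org.springframework.web.servlet.config.annotation.InterceptorRegistry;
import org.springframework.web.servlet.config.annotation.WebMvcConfigurerAdapter;

// Logs every incoming HTTP request before it reaches the handler.
public class LoggingInterceptor implements HandlerInterceptor {

    private static final Logger log = LoggerFactory.getLogger(LoggingInterceptor.class);

    @Override
    public boolean preHandle(HttpServletRequest request, HttpServletResponse response,
                             Object handler) {
        log.info("Incoming request: {} {}", request.getMethod(), request.getRequestURI());
        return true; // continue with the handler chain
    }

    @Override
    public void postHandle(HttpServletRequest request, HttpServletResponse response,
                           Object handler, ModelAndView modelAndView) {
        // no-op
    }

    @Override
    public void afterCompletion(HttpServletRequest request, HttpServletResponse response,
                                Object handler, Exception ex) {
        // no-op
    }
}

// (in a separate file) Registers the interceptor for all request paths.
@Configuration
class LoggingConfiguration extends WebMvcConfigurerAdapter {

    @Override
    public void addInterceptors(InterceptorRegistry registry) {
        registry.addInterceptor(new LoggingInterceptor());
    }
}
```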
- create a log annotation
- implement an AOP aspect for logging
- Configure Forwarder connection to Index Server: ./SplunkForwarder/bin/splunk add forward-server localhost:9997
- restart the forwarder: ./SplunkForwarder/bin/splunk restart
- go to the splunk dashboard (default port: 8000) and enable port 9997 on the indexer (Settings -> Forwarding and receiving -> Receive data -> Add new)
- check that the forwarder-to-server connection is good: ./SplunkForwarder/bin/splunk list forward-server
- search index=_internal in splunk and check the log
- add monitor for our api log: ./SplunkForwarder/bin/splunk add monitor /path/to/app/logs/ -index ${index} -sourcetype ${sourcetype}
- restart the forwarder
- add an index for the server in the splunk dashboard (Settings -> Indexes -> New Index)
- call our API
- search our index in splunk and check the log.