A professional Java Spring Boot microservice with OAuth2 authentication and WebSocket capabilities following MCP (Model Context Protocol) design principles.
- Quick Start
- Features
- System Architecture
- API Documentation
- Configuration
- Testing
- Deployment
- Security
- Contributing
- License
- Acknowledgments
1. Clone the repository

   ```bash
   git clone https://github.com/markbsigler/CodePipeline-MCP-JavaSpring-OAuth2.git
   cd CodePipeline-MCP-JavaSpring-OAuth2
   ```

2. Start dependencies

   ```bash
   docker-compose up -d
   ```

3. Import the Keycloak realm
   - Go to the Keycloak Admin Console
   - Log in with `admin`/`admin`
   - Import `keycloak/realm-export.json`

4. Run the app

   ```bash
   ./mvnw spring-boot:run -Dspring-boot.run.profiles=dev
   ```

5. Access the API at http://localhost:8080/api (Swagger UI: http://localhost:8080/api/swagger-ui.html)
- **Test Reliability & Transactional Context**
  - All integration tests now run within a transactional context, ensuring reliable rollback and isolation.
  - Fixed `TransactionRequiredException` errors in repository integration tests by annotating test classes and setup methods with `@Transactional`.
  - Improved documentation for running and troubleshooting integration tests with Testcontainers and PostgreSQL.
- **Enhanced Message Service**
  - Added input validation for message creation (null/empty checks for content and sender)
  - Improved error messages for validation failures
  - Fixed concurrency issues in message deletion
  - Added comprehensive test coverage for all service methods
  - Improved test reliability and isolation
  - Added proper cleanup in test cases
  - Fixed version handling in update operations
- **Infrastructure & Testing**
  - Upgraded to Testcontainers 1.19.7 for better Java 17 compatibility
  - Added PostgreSQL Testcontainers integration for reliable database testing
  - Implemented comprehensive MessageRepository integration tests against a real database
  - Added BasePostgresRepositoryTest for consistent test setup (see the sketch after this list)
  - Improved test isolation with @DataJpaTest and proper transaction management
  - Added PostgresTestConfig for container configuration
  - Implemented comprehensive test cases for all CRUD operations
  - Added test coverage for custom repository methods
  - Improved test reliability with proper container lifecycle management
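A minimal sketch of what such a Testcontainers-backed, transactional integration test can look like; class names, entity fields, and the repository are assumptions for illustration, not the project's actual code.

```java
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.jdbc.AutoConfigureTestDatabase;
import org.springframework.boot.test.autoconfigure.orm.jpa.DataJpaTest;
import org.springframework.test.context.DynamicPropertyRegistry;
import org.springframework.test.context.DynamicPropertySource;
import org.springframework.transaction.annotation.Transactional;
import org.testcontainers.containers.PostgreSQLContainer;
import org.testcontainers.junit.jupiter.Container;
import org.testcontainers.junit.jupiter.Testcontainers;

import static org.assertj.core.api.Assertions.assertThat;

// Hypothetical sketch: a PostgreSQL-backed repository integration test.
@DataJpaTest
@AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE) // use the container, not H2
@Testcontainers
@Transactional // each test rolls back, avoiding TransactionRequiredException in setup methods
class MessageRepositoryIT {

    @Container
    static PostgreSQLContainer<?> postgres = new PostgreSQLContainer<>("postgres:14");

    @DynamicPropertySource
    static void datasourceProperties(DynamicPropertyRegistry registry) {
        registry.add("spring.datasource.url", postgres::getJdbcUrl);
        registry.add("spring.datasource.username", postgres::getUsername);
        registry.add("spring.datasource.password", postgres::getPassword);
    }

    @Autowired
    private MessageRepository messageRepository; // assumed Spring Data repository

    @Test
    void shouldSaveMessageSuccessfully() {
        Message saved = messageRepository.save(new Message("Hello, World!", "user1")); // assumed constructor
        assertThat(saved.getId()).isNotNull();
    }
}
```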
This service implements the ISPW API specification, providing endpoints for managing assignments, tasks, releases, and release sets. The implementation follows RESTful principles and includes comprehensive error handling, validation, and security.
**Assignments API**

```mermaid
graph TD
A[GET /api/assignments] --> B[List Assignments]
B --> C[Filter by status/application]
D[POST /api/assignments] --> E[Create Assignment]
E --> F[Validate input]
G[GET /api/assignments/id] --> H[Get Assignment Details]
H --> I[Include tasks]
J[PUT /api/assignments/id] --> K[Update Assignment]
K --> L[Validate ownership]
M[DELETE /api/assignments/id] --> N[Delete Assignment]
N --> O[Cascade delete tasks]
```

**Tasks API**

```mermaid
graph TD
A[GET /api/assignments/id/tasks] --> B[List Tasks]
B --> C[Filter by status]
D[POST /api/assignments/id/tasks] --> E[Create Task]
E --> F[Validate assignment exists]
G[PUT /api/tasks/id] --> H[Update Task]
H --> I[Check permissions]
J[DELETE /api/tasks/id] --> K[Delete Task]
K --> L[Update assignment status]
```

**Releases API**

```mermaid
graph TD
A[GET /api/releases] --> B[List Releases]
B --> C[Filter by application/status]
D[POST /api/releases] --> E[Create Release]
E --> F[Validate input]
G[GET /api/releases/id] --> H[Get Release Details]
H --> I[Include release sets]
J[PUT /api/releases/id] --> K[Update Release]
K --> L[Validate ownership]
M[DELETE /api/releases/id] --> N[Delete Release]
N --> O[Cascade delete sets]
```

**Release Sets API**

```mermaid
graph TD
A[POST /api/releases/id/sets] --> B[Create Release Set]
B --> C[Validate release exists]
D[GET /api/releases/id/sets/setId] --> E[Get Set Details]
E --> F[Include deployment history]
G[PUT /api/releases/id/sets/setId] --> H[Update Set]
H --> I[Validate status transition]
J[POST /api/releases/id/sets/setId/deploy] --> K[Deploy Set]
K --> L[Trigger deployment workflow]
```

**Data Model**

```mermaid
erDiagram
ASSIGNMENT ||--o{ TASK : has
ASSIGNMENT {
String id PK
String title
String description
String status
String owner
LocalDateTime dueDate
}
TASK {
String id PK
String title
String description
String status
String assignee
LocalDateTime dueDate
String assignmentId FK
}
RELEASE ||--o{ RELEASE_SET : contains
RELEASE {
String id PK
String name
String application
String status
String owner
LocalDateTime targetDate
}
RELEASE_SET {
String id PK
String name
String status
String owner
String description
String releaseId FK
LocalDateTime deployedAt
String deployedBy
}
```

**Authentication Flow**

```mermaid
sequenceDiagram
participant C as Client
participant A as Application
participant K as Keycloak
C->>K: 1. Request Access Token (Client Credentials/Password)
K-->>C: 2. JWT Token
C->>A: 3. API Request with JWT
A->>K: 4. Validate Token
K-->>A: 5. Token Info (Roles/Permissions)
A-->>C: 6. Response
Note over C,A: All endpoints require valid JWT token
Note over C,A: Admin role required for write operations
```
The API returns appropriate HTTP status codes and error messages in the following format:
```json
{
  "timestamp": "2025-06-05T19:45:30.123456Z",
  "status": 404,
  "error": "Not Found",
  "message": "Release not found with id: 123",
  "path": "/api/releases/123"
}
```
Common error responses include:

- `400 Bad Request`: Invalid input data
- `401 Unauthorized`: Missing or invalid authentication
- `403 Forbidden`: Insufficient permissions
- `404 Not Found`: Resource not found
- `409 Conflict`: Resource conflict (e.g., duplicate name)
- `500 Internal Server Error`: Server-side error
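For illustration, error bodies in this shape can be produced by a `@RestControllerAdvice`. This is only a hedged sketch; the exception type and DTO are assumptions, not necessarily the classes used in this project.

```java
import java.time.Instant;
import jakarta.servlet.http.HttpServletRequest;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.ExceptionHandler;
import org.springframework.web.bind.annotation.RestControllerAdvice;

// Hypothetical sketch: shapes error responses like the JSON example above.
@RestControllerAdvice
public class ApiExceptionHandler {

    // Simple error payload matching the documented fields.
    public record ApiError(Instant timestamp, int status, String error, String message, String path) {}

    @ExceptionHandler(ResourceNotFoundException.class)
    public ResponseEntity<ApiError> handleNotFound(ResourceNotFoundException ex, HttpServletRequest request) {
        ApiError body = new ApiError(
                Instant.now(),
                HttpStatus.NOT_FOUND.value(),
                HttpStatus.NOT_FOUND.getReasonPhrase(),
                ex.getMessage(),
                request.getRequestURI());
        return ResponseEntity.status(HttpStatus.NOT_FOUND).body(body);
    }
}

// Assumed application exception; the real project may define its own.
class ResourceNotFoundException extends RuntimeException {
    ResourceNotFoundException(String message) { super(message); }
}
```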
All endpoints are protected by rate limiting:
- 100 requests per minute per authenticated user
- 1000 requests per minute per IP for public endpoints
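Rate limiting like this is often enforced at a gateway or with a dedicated library; the sketch below is only a naive in-memory illustration of the per-user limit (100 requests/minute), not the project's actual mechanism, and it omits the separate per-IP limit.

```java
import java.io.IOException;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;

import jakarta.servlet.FilterChain;
import jakarta.servlet.ServletException;
import jakarta.servlet.http.HttpServletRequest;
import jakarta.servlet.http.HttpServletResponse;
import org.springframework.stereotype.Component;
import org.springframework.web.filter.OncePerRequestFilter;

// Illustrative only: a naive fixed-window limiter keyed by authenticated user (or IP as a fallback).
@Component
public class RateLimitFilter extends OncePerRequestFilter {

    private static final int LIMIT_PER_MINUTE = 100;
    private final Map<String, AtomicInteger> counters = new ConcurrentHashMap<>();
    private volatile long windowStart = System.currentTimeMillis();

    @Override
    protected void doFilterInternal(HttpServletRequest request, HttpServletResponse response,
                                    FilterChain chain) throws ServletException, IOException {
        // Reset all counters when the one-minute window rolls over.
        long now = System.currentTimeMillis();
        if (now - windowStart > 60_000) {
            counters.clear();
            windowStart = now;
        }
        String key = request.getUserPrincipal() != null
                ? request.getUserPrincipal().getName()
                : request.getRemoteAddr();
        int count = counters.computeIfAbsent(key, k -> new AtomicInteger()).incrementAndGet();
        if (count > LIMIT_PER_MINUTE) {
            response.setStatus(429); // Too Many Requests
            return;
        }
        chain.doFilter(request, response);
    }
}
```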
All endpoints are instrumented for monitoring:
- Request/response metrics
- Error rates
- Response times
- Active connections
All responses include security headers:

```text
Content-Security-Policy
X-Content-Type-Options: nosniff
X-Frame-Options: DENY
X-XSS-Protection: 1; mode=block
Strict-Transport-Security: max-age=31536000; includeSubDomains
```
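These headers can be set by a gateway or by Spring Security itself. A minimal sketch using the Spring Security 6 headers DSL follows; the CSP directives shown are placeholders, not the project's actual policy.

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.security.config.annotation.web.builders.HttpSecurity;
import org.springframework.security.web.SecurityFilterChain;

// Sketch only: configures a CSP, frame denial, and HSTS similar to the headers listed above.
@Configuration
public class SecurityHeadersConfig {

    @Bean
    public SecurityFilterChain headersFilterChain(HttpSecurity http) throws Exception {
        http.headers(headers -> headers
                .contentSecurityPolicy(csp -> csp.policyDirectives("default-src 'self'")) // placeholder policy
                .frameOptions(frame -> frame.deny())
                .httpStrictTransportSecurity(hsts -> hsts
                        .maxAgeInSeconds(31536000)
                        .includeSubDomains(true)));
        // X-Content-Type-Options: nosniff is part of Spring Security's header defaults.
        return http.build();
    }
}
```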
**System Architecture**

```mermaid
graph TD
A[Client] -->|HTTPS| B[API Gateway / Load Balancer]
B --> C[Spring Boot Application]
C --> D[PostgreSQL]
C --> E[Keycloak]
C --> F[Redis Cache]
G[WebSocket Client] -->|WSS| C
C -->|WebSocket Events| G
H[Monitoring] -->|Metrics| C
C -->|Logs| I[ELK Stack]
```

**OAuth2 Authentication Flow**

```mermaid
sequenceDiagram
participant C as Client
participant A as Application
participant K as Keycloak
C->>K: 1. Request Access Token (Client Credentials/Password)
K-->>C: 2. JWT Token
C->>A: 3. API Request with JWT
A->>K: 4. Validate Token
K-->>A: 5. Token Info
A-->>C: 6. Response
```
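The resource-server side of this flow comes down to validating the bearer token on every request. A minimal sketch, assuming the standard Spring Security 6 DSL; the permitted paths are placeholders and the project's real `SecurityConfig` may differ.

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.security.config.Customizer;
import org.springframework.security.config.annotation.web.builders.HttpSecurity;
import org.springframework.security.web.SecurityFilterChain;

// Sketch only: every request needs a valid JWT issued by the Keycloak realm
// configured via spring.security.oauth2.resourceserver.jwt.issuer-uri.
@Configuration
public class ResourceServerConfig {

    @Bean
    public SecurityFilterChain apiFilterChain(HttpSecurity http) throws Exception {
        http
            .authorizeHttpRequests(auth -> auth
                .requestMatchers("/actuator/health", "/swagger-ui/**", "/v3/api-docs/**").permitAll() // placeholder public paths
                .anyRequest().authenticated())
            .oauth2ResourceServer(oauth2 -> oauth2.jwt(Customizer.withDefaults()));
        return http.build();
    }
}
```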
**WebSocket Message Flow**

```mermaid
sequenceDiagram
participant C as WebSocket Client
participant S as Spring App
participant B as Message Broker
C->>S: 1. Connect (STOMP)
S-->>C: 2. CONNECTED
C->>S: 3. SUBSCRIBE /topic/messages
C->>S: 4. SEND /app/chat
S->>B: 5. Process Message
B-->>S: 6. Broadcast
S-->>C: 7. MESSAGE (to all subscribers)
```
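The broker configuration behind this flow typically looks like the sketch below (illustrative only; the project's real `WebSocketMessageBrokerConfig` may differ in detail).

```java
import org.springframework.context.annotation.Configuration;
import org.springframework.messaging.simp.config.MessageBrokerRegistry;
import org.springframework.web.socket.config.annotation.EnableWebSocketMessageBroker;
import org.springframework.web.socket.config.annotation.StompEndpointRegistry;
import org.springframework.web.socket.config.annotation.WebSocketMessageBrokerConfigurer;

// Sketch only: STOMP endpoint at /ws, simple broker for /topic and /queue,
// application destinations prefixed with /app, per-user queues under /user.
@Configuration
@EnableWebSocketMessageBroker
public class WebSocketConfig implements WebSocketMessageBrokerConfigurer {

    @Override
    public void configureMessageBroker(MessageBrokerRegistry registry) {
        registry.enableSimpleBroker("/topic", "/queue");
        registry.setApplicationDestinationPrefixes("/app");
        registry.setUserDestinationPrefix("/user");
    }

    @Override
    public void registerStompEndpoints(StompEndpointRegistry registry) {
        registry.addEndpoint("/ws").withSockJS(); // SockJS fallback, matching the client example later on
    }
}
```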
- **Spring Boot 3.2 with Java 17**
  - Modern Java features and performance improvements
  - Auto-configuration and standalone deployment
- **Security**
  - OAuth2 Resource Server with JWT validation
  - Keycloak integration for identity management
  - Role-based access control (RBAC)
  - CSRF protection
  - CORS configuration
- **Real-time Communication**
  - WebSocket support with STOMP protocol
  - Message broadcasting and direct messaging
  - User presence tracking
  - Event-driven architecture
- **Data Layer**
  - PostgreSQL with Spring Data JPA
  - Flyway for database migrations
  - QueryDSL for type-safe queries
  - Auditing (createdAt, updatedAt, etc.; see the sketch after this list)
- **API**
  - RESTful endpoints
  - OpenAPI 3.0 documentation with Swagger UI
  - HATEOAS for discoverable APIs
  - Pagination and filtering
  - Validation and error handling
- **DevOps**
  - Docker and Docker Compose support
  - Maven build system
  - Git version control
  - CI/CD ready
- **Monitoring**
  - Spring Boot Actuator
  - Health checks
  - Metrics with Prometheus
  - Distributed tracing
- **Testing**
  - JUnit 5 for unit and integration testing
  - Testcontainers for integration testing with real databases
  - PostgreSQL Testcontainers for reliable database testing
  - @DataJpaTest for repository layer testing
  - MockMvc for controller testing
  - Test coverage reports with JaCoCo
  - Test profiles for different environments
  - Container reuse for faster test execution
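As a small illustration of the auditing feature listed under Data Layer, here is a minimal Spring Data JPA auditing sketch; the entity and its fields are assumptions, not the project's actual `Message` class.

```java
import java.time.Instant;
import jakarta.persistence.Entity;
import jakarta.persistence.EntityListeners;
import jakarta.persistence.GeneratedValue;
import jakarta.persistence.Id;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.annotation.CreatedDate;
import org.springframework.data.annotation.LastModifiedDate;
import org.springframework.data.jpa.domain.support.AuditingEntityListener;
import org.springframework.data.jpa.repository.config.EnableJpaAuditing;

// Sketch only: enables JPA auditing so createdAt/updatedAt are populated automatically.
@Configuration
@EnableJpaAuditing
class JpaAuditingConfig {}

@Entity
@EntityListeners(AuditingEntityListener.class)
class AuditedMessage {

    @Id
    @GeneratedValue
    private Long id;

    private String content;

    @CreatedDate
    private Instant createdAt;   // set once when the row is inserted

    @LastModifiedDate
    private Instant updatedAt;   // refreshed on every update
}
```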
- **Java Development Kit (JDK) 17 or later**
  - Download Eclipse Temurin JDK 17
  - Verify: `java -version`
  - Required for Java 17 language features and modules
- **Maven 3.9+ or Gradle 8+**
  - Install Maven
  - Verify: `mvn -v`
- **Docker and Docker Compose**
  - Install Docker Desktop
  - Verify: `docker --version` and `docker-compose --version`
- **Database**
  - PostgreSQL 14+ (included in Docker Compose)
  - Or install PostgreSQL locally
- **IDE (Recommended)**
  - IntelliJ IDEA
  - VS Code with Java extensions
  - Eclipse
- **API Testing Tools**
Run all unit tests:

```bash
./mvnw test
```

Integration tests require Docker (for Testcontainers). If Docker is not available, these tests will be skipped automatically.

Run integration tests:

```bash
./mvnw verify -Pintegration-test
```
- Integration tests use Testcontainers with a real PostgreSQL instance for full reliability and production-like behavior.
- All repository and service integration tests are annotated with `@Transactional` to ensure automatic rollback and isolation.
- Test data is isolated and rolled back after each test using `@Transactional`.
- Test reliability is improved by parameterized tests and Docker availability checks.
- **Transactional Context**: All integration tests are annotated with `@Transactional` at the class or method level. If you see `TransactionRequiredException`, ensure your test class or setup methods are annotated with `@Transactional`.
- **Testcontainers**: Integration tests require Docker to be running; if Docker is not available, the tests will be skipped. Make sure Docker is running before executing integration tests.
- **Database Cleanup**: Test data is automatically cleaned up between tests using `deleteAllInBatch()` and transactional rollbacks.
- **Running Specific Tests**:
  - Run only repository integration tests: `./mvnw test -Dtest=MessageRepositoryIT`
  - Run a specific test method: `./mvnw test -Dtest=MessageRepositoryIT#shouldSaveMessageSuccessfully`
- **Common Issues**:
  - `TransactionRequiredException`: Add `@Transactional` to your test class or method.
  - `Connection refused` or `Could not connect to PostgreSQL`: Ensure Docker and the PostgreSQL container are running.
  - `Testcontainers not found`: Ensure you are using the correct Maven profile and dependencies.
1. Clone the repository

   ```bash
   git clone https://github.com/markbsigler/CodePipeline-MCP-JavaSpring-OAuth2.git
   cd CodePipeline-MCP-JavaSpring-OAuth2
   ```

   Or using SSH:

   ```bash
   git clone git@github.com:markbsigler/CodePipeline-MCP-JavaSpring-OAuth2.git
   cd CodePipeline-MCP-JavaSpring-OAuth2
   ```

2. Start the infrastructure

   ```bash
   docker-compose up -d
   ```

   This will start:
   - PostgreSQL database
   - Keycloak OAuth2 server
   - PgAdmin (optional, for database management)

3. Configure Keycloak
   - Access the Keycloak Admin Console: http://localhost:8081/
   - Log in with `admin`/`admin`
   - Import the realm configuration from `keycloak/realm-export.json`

4. Run the application

   ```bash
   # Using the Maven wrapper
   ./mvnw spring-boot:run -Dspring-boot.run.profiles=dev

   # Or build and run the JAR
   ./mvnw clean package
   java -jar target/codepipeline-mcp-0.0.1-SNAPSHOT.jar
   ```
5. Access the application
   - API Base URL: http://localhost:8080/api
   - Swagger UI: http://localhost:8080/api/swagger-ui.html
   - H2 Console: http://localhost:8080/api/h2-console
     - JDBC URL: `jdbc:h2:mem:testdb`
     - Username: `sa`
     - Password: (leave empty)

6. Test users
   - Admin User
     - Username: `admin`
     - Password: `admin`
     - Roles: `ROLE_ADMIN`, `ROLE_USER`
   - Regular User
     - Username: `user`
     - Password: `password`
     - Roles: `ROLE_USER`
1. Run all tests

   ```bash
   ./mvnw clean test
   ```

2. Run integration tests with Testcontainers

   ```bash
   # Make sure Docker is running
   docker --version

   # Run integration tests
   ./mvnw test -Dtest=*IT
   ```

3. Run a specific test class

   ```bash
   ./mvnw test -Dtest=MessageRepositoryIT
   ```

4. Run a specific test method

   ```bash
   ./mvnw test -Dtest=MessageRepositoryIT#shouldSaveMessage
   ```

5. Generate a test coverage report

   ```bash
   ./mvnw clean verify
   # Report will be available at: target/site/jacoco/index.html
   ```

Notes:

- Tests use Testcontainers with PostgreSQL 14
- Test data is automatically cleaned up between tests
- Database schema is managed by Hibernate in the test profile
- Tests run with JUnit 5 and AssertJ for assertions

Test profiles:

- `test`: Default profile for unit tests (in-memory H2 database)
- `itest`: Integration test profile (PostgreSQL Testcontainer)

To run with a specific profile:

```bash
./mvnw test -Pitest
```

Generate a JaCoCo coverage report:

```bash
./mvnw jacoco:report
# Open target/site/jacoco/index.html in a browser
```
1. Get an access token

   ```http
   POST /auth/realms/mcp/protocol/openid-connect/token
   Content-Type: application/x-www-form-urlencoded

   client_id=mcp-client
   &username=user
   &password=password
   &grant_type=password
   &client_secret=your-client-secret
   ```

2. Use the access token

   ```http
   GET /api/messages
   Authorization: Bearer <access_token>
   ```
REST endpoints:

- `GET /api/messages` - Get all messages (paginated)
- `GET /api/messages/{id}` - Get message by ID
- `POST /api/messages` - Create a new message
- `PUT /api/messages/{id}` - Update a message
- `DELETE /api/messages/{id}` - Delete a message
- `GET /api/messages/search?query={query}` - Search messages

WebSocket endpoints:

- `/ws` - WebSocket endpoint
- `/topic/messages` - Subscribe to message updates
- `/queue/private` - Private message queue
- `/app/chat` - Send a message
Create a Message:

```http
POST /api/messages
Content-Type: application/json
Authorization: Bearer <access_token>

{
  "content": "Hello, World!",
  "recipient": "user2"
}
```
Subscribe to Messages
const socket = new SockJS('/ws');
const stompClient = Stomp.over(socket);
stompClient.connect({}, function(frame) {
console.log('Connected: ' + frame);
// Subscribe to public messages
stompClient.subscribe('/topic/messages', function(message) {
console.log('New message: ' + message.body);
});
// Subscribe to private messages
stompClient.subscribe('/user/queue/private', function(message) {
console.log('Private message: ' + message.body);
});
});
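For context, the server side of this client could look like the sketch below, which handles `/app/chat` and broadcasts to `/topic/messages`. The `ChatMessage` payload and the private-message destination are assumptions; the project's actual `WebSocketController` may differ.

```java
import org.springframework.messaging.handler.annotation.MessageMapping;
import org.springframework.messaging.handler.annotation.SendTo;
import org.springframework.messaging.simp.SimpMessagingTemplate;
import org.springframework.stereotype.Controller;

// Sketch only: receives messages sent to /app/chat and fans them out to subscribers.
@Controller
public class ChatController {

    private final SimpMessagingTemplate messagingTemplate;

    public ChatController(SimpMessagingTemplate messagingTemplate) {
        this.messagingTemplate = messagingTemplate;
    }

    // Assumed payload type for illustration.
    public record ChatMessage(String sender, String content, String recipient) {}

    @MessageMapping("/chat")          // client sends to /app/chat
    @SendTo("/topic/messages")        // broadcast to all subscribers of /topic/messages
    public ChatMessage broadcast(ChatMessage message) {
        return message;
    }

    @MessageMapping("/chat.private")  // hypothetical destination for direct messages
    public void sendPrivate(ChatMessage message) {
        // Delivered to the recipient's /user/queue/private subscription.
        messagingTemplate.convertAndSendToUser(message.recipient(), "/queue/private", message);
    }
}
```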
Create a `.env` file in the project root with the following variables:
```env
# Application
SERVER_PORT=8080
SERVER_SERVLET_CONTEXT_PATH=/api

# Database
SPRING_DATASOURCE_URL=jdbc:postgresql://localhost:5432/mcp_db
SPRING_DATASOURCE_USERNAME=postgres
SPRING_DATASOURCE_PASSWORD=postgres

# JPA/Hibernate
SPRING_JPA_HIBERNATE_DDL_AUTO=update
SPRING_JPA_SHOW_SQL=true

# OAuth2 Resource Server
SPRING_SECURITY_OAUTH2_RESOURCESERVER_JWT_ISSUER_URI=http://localhost:8081/realms/mcp
SPRING_SECURITY_OAUTH2_RESOURCESERVER_JWT_JWK_SET_URI=http://localhost:8081/realms/mcp/protocol/openid-connect/certs

# Logging
LOGGING_LEVEL_ORG_SPRINGFRAMEWORK_WEB=INFO
LOGGING_LEVEL_COM_CODEPIPELINE=DEBUG

# Actuator
MANAGEMENT_ENDPOINTS_WEB_EXPOSURE_INCLUDE=health,info,metrics
MANAGEMENT_ENDPOINT_HEALTH_SHOW_DETAILS=always
```
1. Build the application

   ```bash
   ./mvnw clean package -DskipTests
   docker build -t codepipeline-mcp .
   ```

2. Run with Docker Compose

   ```bash
   # For development
   docker-compose -f docker-compose.yml -f docker-compose.dev.yml up -d

   # For production
   docker-compose -f docker-compose.yml -f docker-compose.prod.yml up -d
   ```

3. Create Kubernetes secrets

   ```bash
   kubectl create secret generic db-secret \
     --from-literal=username=postgres \
     --from-literal=password=postgres
   ```

4. Deploy the application

   ```bash
   kubectl apply -f k8s/
   ```
- Client requests an access token from Keycloak
- Keycloak issues a JWT token
- Client includes the token in the `Authorization` header
- Resource server validates the token and checks scopes/roles
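The scope/role check usually means mapping JWT claims to Spring authorities. Below is a hedged sketch in the spirit of the project's `JwtRoleConverter`; it assumes Keycloak's `realm_access.roles` claim layout and may differ from the actual implementation.

```java
import java.util.Collection;
import java.util.List;
import java.util.Map;

import org.springframework.core.convert.converter.Converter;
import org.springframework.security.core.GrantedAuthority;
import org.springframework.security.core.authority.SimpleGrantedAuthority;
import org.springframework.security.oauth2.jwt.Jwt;

// Sketch only: turns Keycloak realm roles into Spring authorities for RBAC checks.
// Would typically be registered on a JwtAuthenticationConverter via setJwtGrantedAuthoritiesConverter.
public class KeycloakRealmRoleConverter implements Converter<Jwt, Collection<GrantedAuthority>> {

    @Override
    @SuppressWarnings("unchecked")
    public Collection<GrantedAuthority> convert(Jwt jwt) {
        Map<String, Object> realmAccess = jwt.getClaimAsMap("realm_access");
        if (realmAccess == null || realmAccess.get("roles") == null) {
            return List.of();
        }
        List<String> roles = (List<String>) realmAccess.get("roles");
        return roles.stream()
                .map(role -> (GrantedAuthority) new SimpleGrantedAuthority(role)) // realm roles such as ROLE_ADMIN, ROLE_USER
                .toList();
    }
}
```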
The application includes the following security headers by default:
- Content Security Policy (CSP)
- X-Content-Type-Options
- X-Frame-Options
- X-XSS-Protection
- Strict-Transport-Security (HSTS)
| Endpoint | Description |
|---|---|
| `/actuator/health` | Application health information |
| `/actuator/info` | Application information |
| `/actuator/metrics` | Application metrics |
| `/actuator/prometheus` | Prometheus metrics |
Logs are written to `logs/application.log` and can be configured in `logback-spring.xml`.
```text
.
├── src/
│   ├── main/
│   │   ├── java/com/codepipeline/mcp/
│   │   │   ├── config/          # Configuration classes
│   │   │   ├── controller/      # REST controllers
│   │   │   ├── dto/             # Data Transfer Objects
│   │   │   ├── exception/       # Exception handling
│   │   │   ├── model/           # JPA entities
│   │   │   ├── repository/      # Spring Data repositories
│   │   │   ├── security/        # Security configurations
│   │   │   ├── service/         # Business logic
│   │   │   └── websocket/       # WebSocket configurations
│   │   └── resources/
│   │       ├── application.yml       # Main configuration
│   │       ├── application-dev.yml   # Development profile
│   │       └── application-prod.yml  # Production profile
│   └── test/                    # Test classes
├── .github/                     # GitHub workflow files
├── docker/                      # Docker configuration files
├── k8s/                         # Kubernetes manifests
├── keycloak/                    # Keycloak configuration
├── .gitignore
├── docker-compose.yml
├── Dockerfile
├── mvnw
├── pom.xml
└── README.md
```
- `config/`: Spring configuration classes
  - `SecurityConfig.java`: Security configuration
  - `WebSocketConfig.java`: WebSocket configuration
  - `OpenAPIConfig.java`: OpenAPI/Swagger configuration
- `controller/`: REST controllers
  - `MessageController.java`: Message REST endpoints
  - `WebSocketController.java`: WebSocket message handling
- `model/`: JPA entities
  - `Message.java`: Message entity with JPA annotations
- `service/`: Business logic
  - `MessageService.java`: Message business logic
- `security/`: Security configurations
  - `JwtRoleConverter.java`: JWT role conversion
  - `SecurityUtils.java`: Security utilities
- `websocket/`: WebSocket components
  - `WebSocketEventListener.java`: WebSocket event handling
  - `WebSocketMessageBrokerConfig.java`: WebSocket broker configuration
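To give a feel for the service layer and the null/empty validation described in the change notes, here is a hedged sketch of message creation; the `Message` accessors, repository, and exception choice are assumptions rather than the project's exact code.

```java
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

// Sketch only: message creation with the null/empty validation described earlier.
@Service
public class MessageServiceSketch {

    private final MessageRepository messageRepository; // assumed Spring Data repository

    public MessageServiceSketch(MessageRepository messageRepository) {
        this.messageRepository = messageRepository;
    }

    @Transactional
    public Message createMessage(Message message) {
        if (message.getContent() == null || message.getContent().isBlank()) {
            throw new IllegalArgumentException("Message content must not be null or empty");
        }
        if (message.getSender() == null || message.getSender().isBlank()) {
            throw new IllegalArgumentException("Message sender must not be null or empty");
        }
        return messageRepository.save(message);
    }
}
```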
API documentation is available at runtime using Swagger UI:
- Swagger UI: http://localhost:8080/api/swagger-ui.html
- OpenAPI JSON: http://localhost:8080/api/v3/api-docs
Run unit tests: `./mvnw test`

Run integration tests (requires Docker): `./mvnw verify -Pintegration-test`

Build the application: `./mvnw clean package -DskipTests`

Build the Docker image: `docker build -t codepipeline-mcp .`

Run with Docker Compose: `docker-compose up -d`

Deploy to Kubernetes: `kubectl apply -f k8s/`
The application can be deployed to any cloud platform that supports Docker containers:
- AWS ECS/EKS
- Google Cloud Run/GKE
- Azure Container Apps/AKS
- Heroku
- OAuth2 with JWT tokens
- Role-based access control (RBAC)
- CSRF protection
- CORS configuration
- Secure headers
- Input validation
- Fork the repository
- Create your feature branch (`git checkout -b feature/AmazingFeature`)
- Commit your changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request

This project is licensed under the MIT License - see the LICENSE file for details.