AI-powered semantic validation for Spring Boot applications using Large Language Models (LLMs)
Semantic AI Validator is a Kotlin library that enables intelligent, context-aware validation of form fields using Large Language Models. Instead of writing complex regex patterns or business logic, simply describe what you want to validate in plain English.
Created and maintained by SoftwareMill - we help clients scale their business through software.
- Simple Annotation-Based API: Add `@AIVerify` to any field with a validation prompt
- Multiple LLM Providers: OpenAI (GPT-4), Anthropic (Claude), Google (Gemini)
- Context-Aware Validation: Validate fields based on values of other fields
- Spring Boot Integration: Seamless integration with Spring Validation framework
- Async by Default: Built on Kotlin coroutines for high performance
- Type-Safe Configuration: Fully typed configuration properties with IDE support
- Production Ready: Comprehensive error handling, logging, and testing
Get started with AI-powered validation in under 5 minutes:
Gradle (Kotlin DSL):

```kotlin
dependencies {
    implementation("com.softwaremill.aivalidator:semantic-ai-validator-spring-boot-starter:0.2.1")
}
```

Gradle (Groovy DSL):

```groovy
dependencies {
    implementation 'com.softwaremill.aivalidator:semantic-ai-validator-spring-boot-starter:0.2.1'
}
```

Maven:

```xml
<dependency>
    <groupId>com.softwaremill.aivalidator</groupId>
    <artifactId>semantic-ai-validator-spring-boot-starter</artifactId>
    <version>0.2.1</version>
</dependency>
```

Add to application.yml:
```yaml
ai:
  validator:
    openai:
      api-key: ${OPENAI_API_KEY}
    default-provider: OPENAI
    default-model: gpt-4
```

Or use environment variables:

```bash
export OPENAI_API_KEY=sk-...
```

```kotlin
import com.softwaremill.ai.validator.annotations.AIVerify
import com.softwaremill.ai.validator.spring.web.ValidateWithAI

@ValidateWithAI
data class UserProfileForm(
    @AIVerify(
        prompt = "Verify this bio is professional and appropriate for a business platform"
    )
    val bio: String,

    @AIVerify(
        prompt = "Check if this looks like a valid LinkedIn profile URL"
    )
    val linkedInUrl: String
)
```

```kotlin
import jakarta.validation.Valid
import org.springframework.http.ResponseEntity
import org.springframework.web.bind.annotation.*

@RestController
@RequestMapping("/api/users")
class UserController {

    @PostMapping("/profile")
    suspend fun updateProfile(@Valid @RequestBody form: UserProfileForm): ResponseEntity<*> {
        // Validation happens automatically before this method is called
        return ResponseEntity.ok(mapOf("message" to "Profile updated"))
    }
}
```

That's it! The library will automatically validate fields using AI when forms are submitted.
This library is fully compatible with Java projects! While the library is written in Kotlin, it provides Java-friendly APIs.
When using this library from a Java project, you need to ensure compatible Kotlin and kotlinx-serialization versions:
Gradle (Groovy DSL):
```groovy
plugins {
    id 'java'
    id 'org.springframework.boot' version '3.5.7'
    id 'io.spring.dependency-management' version '1.1.7'
    id 'org.jetbrains.kotlin.jvm' version '2.1.20' // Required for compatibility
}

configurations.all {
    resolutionStrategy {
        eachDependency { DependencyResolveDetails details ->
            // Force Kotlin 2.1.20 for all Kotlin libraries
            if (details.requested.group == 'org.jetbrains.kotlin') {
                details.useVersion '2.1.20'
                details.because 'Koog/Ktor 3.3.0 requires Kotlin 2.1+'
            }
            // Force kotlinx-serialization 1.8.1 for Koog compatibility
            if (details.requested.group == 'org.jetbrains.kotlinx' &&
                details.requested.name.startsWith('kotlinx-serialization')) {
                switch (details.requested.name) {
                    case 'kotlinx-serialization-core':
                    case 'kotlinx-serialization-core-jvm':
                    case 'kotlinx-serialization-json':
                    case 'kotlinx-serialization-json-jvm':
                        details.useVersion '1.8.1'
                        details.because 'Koog 0.5.0 requires kotlinx-serialization 1.8.1'
                        break
                }
            }
        }
    }
}

dependencies {
    implementation 'org.springframework.boot:spring-boot-starter-web'
    implementation 'com.softwaremill.aivalidator:semantic-ai-validator-spring-boot-starter:0.2.1'
    implementation 'org.jetbrains.kotlin:kotlin-stdlib:2.1.20' // Required
}
```

```java
import com.softwaremill.ai.validator.annotations.AIVerify;
import com.softwaremill.ai.validator.spring.web.ValidateWithAI;
import jakarta.validation.Valid;
import org.springframework.web.bind.annotation.*;

@ValidateWithAI
public class UserProfileForm {

    @AIVerify(
        prompt = "Verify this bio is professional and appropriate for a business platform"
    )
    private String bio;

    @AIVerify(
        prompt = "Check if this looks like a valid LinkedIn profile URL"
    )
    private String linkedInUrl;

    // Constructors, getters, setters...
}
```

```java
import jakarta.validation.Valid;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;

import java.util.Map;

@RestController
@RequestMapping("/api/users")
public class UserController {

    @PostMapping("/profile")
    public ResponseEntity<?> updateProfile(@Valid @RequestBody UserProfileForm form) {
        // Validation happens automatically before this method is called
        return ResponseEntity.ok(Map.of("message", "Profile updated"));
    }
}
```

For more control, inject `BlockingAIObjectValidator`:
```java
import com.softwaremill.ai.validator.model.ValidationResult;
import com.softwaremill.ai.validator.validator.BlockingAIObjectValidator;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;

import java.util.List;
import java.util.Map;

@RestController
@RequestMapping("/api/users")
public class UserController {

    @Autowired
    private BlockingAIObjectValidator validator;

    @PostMapping("/profile")
    public ResponseEntity<?> updateProfile(@RequestBody UserProfileForm form) {
        // Manual validation
        Map<String, ValidationResult> results = validator.validateBlocking(form);

        // Check if validation passed
        if (!validator.isValid(results)) {
            Map<String, List<String>> errors = validator.getAllErrors(results);
            return ResponseEntity.badRequest().body(errors);
        }

        // Process valid form
        return ResponseEntity.ok(Map.of("message", "Profile updated"));
    }
}
```

- Kotlin stdlib required: The library requires `kotlin-stdlib:2.1.20` due to internal dependencies
- Blocking API: Java code should use `BlockingAIObjectValidator` instead of the suspend-based `AIObjectValidator`
- Lombok compatibility: Works fine with Lombok - annotate your forms with `@Data` as usual
- Performance: Fields are still validated in parallel internally for optimal performance
- Java 17+ or Kotlin 1.9+
- Spring Boot 3.0+
- At least one configured LLM provider
- For Java projects: Kotlin 2.1.20+ and kotlinx-serialization 1.8.1 (see Java Usage section)
```yaml
ai:
  validator:
    openai:
      api-key: ${OPENAI_API_KEY}
      base-url: https://api.openai.com  # optional, default shown
    default-model: gpt-4
```

```yaml
ai:
  validator:
    anthropic:
      api-key: ${ANTHROPIC_API_KEY}
    default-provider: ANTHROPIC
    default-model: claude-3-opus-20240229
```

```yaml
ai:
  validator:
    google:
      api-key: ${GOOGLE_API_KEY}
    default-provider: GOOGLE
    default-model: gemini-2.0-flash
```

| Property | Description | Default |
|---|---|---|
| `ai.validator.openai.api-key` | OpenAI API key | - |
| `ai.validator.anthropic.api-key` | Anthropic API key | - |
| `ai.validator.google.api-key` | Google API key | - |
| `ai.validator.default-provider` | Default LLM provider | `OPENAI` |
| `ai.validator.default-model` | Default model name | `gpt-4` |
| `ai.validator.default-temperature` | Temperature (0.0-2.0) | `0.0` |
| `ai.validator.default-max-tokens` | Max response tokens | `150` |
| `ai.validator.default-timeout-millis` | Request timeout in milliseconds | `60000` |
| `ai.validator.enable-logging` | Enable debug logging | `false` |
Note: Ollama support is planned for a future release. Currently supported providers are OpenAI, Anthropic, and Google.
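
These defaults apply globally; individual fields can override them through the `@AIVerify` attributes shown later in this README (`model`, `maxTokens`, `llmProvider`). A minimal sketch - the form and prompts below are illustrative, not part of the library:

```kotlin
import com.softwaremill.ai.validator.annotations.AIVerify
import com.softwaremill.ai.validator.spring.web.ValidateWithAI

@ValidateWithAI
data class ArticleForm( // hypothetical form used only for illustration
    // Uses the configured defaults (provider, model, temperature, max tokens)
    @AIVerify(prompt = "Check that this title is descriptive and free of clickbait")
    val title: String,

    // Overrides the default model and token budget for this field only
    @AIVerify(
        prompt = "Verify this summary describes the article in neutral, professional language",
        model = "gpt-4",
        maxTokens = 300
    )
    val summary: String
)
```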
Validate that a project description contains all required elements:
```kotlin
@ValidateWithAI
data class ProjectDescriptionForm(
    @AIVerify(
        prompt = """Verify that this project description contains:
            1. Project name
            2. Main goal or objective
            3. Target audience
            4. Timeline
            If any are missing, list what's missing."""
    )
    val description: String
)
```

Valid Input:

```
CloudSync Pro - A real-time file synchronization tool for remote teams.
Target audience: Small to medium businesses with distributed teams.
Timeline: Beta launch in Q2 2025, full release Q3 2025.
```

Invalid Input:

```
A new app for file sharing
```

AI Response: "Missing: project name, target audience, and timeline"
Validate product reviews for quality and rating consistency:
```kotlin
@ValidateWithAI
data class ProductReviewForm(
    @AIVerify(
        prompt = "Check if this review is constructive, non-offensive, and product-related"
    )
    val reviewText: String,

    @AIVerify(
        prompt = "Verify the {fieldValue} star rating matches the review sentiment",
        contextFields = ["reviewText", "productName"]
    )
    val rating: Int,

    val productName: String
)
```

Valid Review:

```json
{
  "reviewText": "Excellent laptop! Fast performance, great battery life.",
  "rating": 5,
  "productName": "Dell XPS 15"
}
```

Invalid Review (rating mismatch):

```json
{
  "reviewText": "Terrible battery life, keeps crashing, very disappointed.",
  "rating": 5,
  "productName": "Smartphone X"
}
```

AI Response: "Rating mismatch: Review is negative but rating is 5 stars"
Validate invoice numbers against uploaded PDFs:
```kotlin
@ValidateWithAI
data class InvoiceValidationForm(
    @AIVerify(
        prompt = "Extract the invoice number from the PDF and verify it matches the provided number",
        contextFields = ["invoicePdf"],
        llmProvider = LLMProvider.OPENAI,
        model = "gpt-4-vision-preview"
    )
    val invoiceNumber: String,

    @AIVerifyContext
    val invoicePdf: MultipartFile
)
```

Image fields can be validated with a vision-capable model:

```kotlin
@AIVerify(
    prompt = "Analyze this image and verify it's appropriate",
    llmProvider = LLMProvider.OPENAI,
    model = "gpt-4-vision-preview",
    maxTokens = 300
)
val imageDescription: String
```

Validation groups restrict when a given check runs:

```kotlin
interface CreateValidation
interface UpdateValidation

@AIVerify(
    prompt = "Verify this is a unique username",
    groups = [CreateValidation::class]
)
val username: String

@Validated(CreateValidation::class)
@PostMapping("/users")
suspend fun createUser(@Valid @RequestBody form: UserForm) { ... }
```

Validation can degrade gracefully when the LLM is unreachable:

```kotlin
@AIVerify(
    prompt = "Validate the content",
    failOnError = false // Don't fail if LLM is unreachable
)
val content: String
```

The core library can be used standalone without Spring Boot:
```kotlin
import com.softwaremill.ai.validator.validator.AIObjectValidator
import com.softwaremill.ai.validator.validator.AIFieldValidator
import com.softwaremill.ai.validator.llm.LLMClientFactory
import com.softwaremill.ai.validator.llm.PromptBuilder
import com.softwaremill.ai.validator.annotations.LLMProvider
import kotlinx.coroutines.runBlocking

// Create LLM client factory
val clientFactory = LLMClientFactory(
    openAIApiKey = "sk-...",
    defaultModels = mapOf(LLMProvider.OPENAI to "gpt-4"),
    enableLogging = true,
    promptBuilder = PromptBuilder()
)

// Create validators
val fieldValidator = AIFieldValidator(llmClientFactory = clientFactory)
val objectValidator = AIObjectValidator(fieldValidator = fieldValidator)

// Validate
runBlocking {
    val results = objectValidator.validate(myFormObject)
    results.forEach { (field, result) ->
        if (!result.isValid) {
            println("$field: ${result.errors}")
        }
    }
}
```

See semantic-ai-validator-core/README.md for standalone usage details.
- Annotation Processing: Library scans for `@AIVerify` annotations on form fields
- Prompt Building: Constructs prompts with field values and context data
- LLM Validation: Sends prompts to configured LLM provider asynchronously
- Response Parsing: Parses LLM response to determine validity and error messages
- Spring Integration: Integrates with Spring's `@Valid` mechanism for seamless validation
```
┌─────────────┐
│  Form Data  │
└──────┬──────┘
       │
       ▼
┌─────────────────────────────────┐
│ Spring Validation Framework     │
│ (@Valid triggers validation)    │
└──────┬──────────────────────────┘
       │
       ▼
┌─────────────────────────────────┐
│ AI Validator                    │
│  - Find @AIVerify fields        │
│  - Build prompts with context   │
│  - Validate in parallel         │
└──────┬──────────────────────────┘
       │
       ▼
┌─────────────────────────────────┐
│ LLM Provider (Koog Framework)   │
│  - OpenAI / Anthropic / etc.    │
│  - Async execution              │
└──────┬──────────────────────────┘
       │
       ▼
┌─────────────────────────────────┐
│ Validation Result               │
│  - isValid: true/false          │
│  - errors: List<String>         │
└─────────────────────────────────┘
```
- Parallel Validation: Multiple fields validated concurrently
- Typical Response Time: 1-3 seconds per field (depends on LLM provider)
- Async Architecture: Non-blocking coroutine-based implementation
- Efficient Context Passing: Only relevant data sent to LLM
- Use `temperature = 0.0` for consistent, fast validation responses
- Keep `maxTokens` low (150-300) for faster responses
- Configure appropriate timeouts for your use case
- Consider caching strategies for repeated validations (see the sketch below)
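
The library itself does not ship a cache; if the same form values are validated repeatedly, a small application-level wrapper can avoid repeated LLM calls. A hypothetical sketch (the `CachingValidator` class below is not part of the library):

```kotlin
import com.softwaremill.ai.validator.model.ValidationResult
import com.softwaremill.ai.validator.validator.BlockingAIObjectValidator
import java.util.concurrent.ConcurrentHashMap

// Hypothetical application-level cache around the library's blocking validator.
// Assumes forms are data classes, so equals/hashCode are value-based and
// identical submissions map to the same cache key.
class CachingValidator(private val delegate: BlockingAIObjectValidator) {

    private val cache = ConcurrentHashMap<Any, Map<String, ValidationResult>>()

    fun validate(form: Any): Map<String, ValidationResult> =
        cache.getOrPut(form) { delegate.validateBlocking(form) }
}
```

In production you would bound such a cache (size limit or TTL), since verdicts can drift as prompts and models change.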
The project includes a full demo application showcasing all features:
```bash
export OPENAI_API_KEY=sk-...
./gradlew :semantic-ai-validator-demo:bootRun -x test
```

Visit http://localhost:8080 to try:
- Project description validation
- Invoice PDF validation
- Product review quality checks
See semantic-ai-validator-demo/README.md for details.
Full API documentation is available in the module READMEs:
- Core Library - Standalone usage, architecture, extension points
- Spring Boot Starter - Auto-configuration, properties reference
- Demo Application - Running demos, API endpoints, examples
Problem: LLM validation doesn't seem to work correctly.
Solutions:
- Check your API key is correctly set: `echo $OPENAI_API_KEY`
- Enable debug logging: `ai.validator.enable-logging: true`
- Review logs for LLM errors or rate limiting
- Verify your prompt is clear and specific (see the sketch below)
- Try reducing `temperature` to 0.0 for more deterministic results
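
As an illustration of prompt specificity, compare a vague check ("Check this field") with one that states the criteria and asks the model to name what failed. The form and prompt below are hypothetical, not part of the library:

```kotlin
import com.softwaremill.ai.validator.annotations.AIVerify
import com.softwaremill.ai.validator.spring.web.ValidateWithAI

@ValidateWithAI
data class CompanyProfileForm( // illustrative form
    // A specific prompt lists the rules and asks which rule was violated,
    // which makes verdicts more consistent and error messages more useful.
    @AIVerify(
        prompt = """Verify this company bio is written in a professional tone,
            describes what the company does, and contains no contact details.
            If it fails, name the specific rule that was violated."""
    )
    val bio: String
)
```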
Problem: Validation takes too long.
Solutions:
- Use faster models (e.g., `gpt-3.5-turbo` instead of `gpt-4`)
- Reduce `maxTokens` to the minimum needed
- Check your network connection to the LLM provider
- Increase the timeout value if needed: `ai.validator.default-timeout-millis`
Problem: Cannot connect to LLM provider.
Solutions:
- Verify API key is valid and has sufficient credits
- Check firewall/proxy settings
- Review base URL configuration (if using custom endpoints)
- Check provider status page for outages
- Verify network connectivity and DNS resolution
Problem: Spring Boot doesn't return validation errors.
Solutions:
- Ensure the `@Valid` annotation is present on the controller parameter
- Check that `@ValidateWithAI` is on the form class
- Verify error handling is configured:

  ```yaml
  server:
    error:
      include-message: always
      include-binding-errors: always
  ```

- Review exception handler configuration (see the example below)
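
For reference, a typical Spring exception handler looks like the sketch below. It uses only standard Spring APIs and assumes AI validation failures surface as regular binding errors through `@Valid` (i.e. as `MethodArgumentNotValidException`); check the library's logs if your setup reports a different exception type.

```kotlin
import org.springframework.http.ResponseEntity
import org.springframework.web.bind.MethodArgumentNotValidException
import org.springframework.web.bind.annotation.ExceptionHandler
import org.springframework.web.bind.annotation.RestControllerAdvice

// Standard Spring advice: turns binding errors into a field -> messages map.
@RestControllerAdvice
class ValidationErrorHandler {

    @ExceptionHandler(MethodArgumentNotValidException::class)
    fun handleValidationErrors(ex: MethodArgumentNotValidException): ResponseEntity<Map<String, List<String>>> {
        val errors = ex.bindingResult.fieldErrors
            .groupBy({ it.field }, { it.defaultMessage ?: "Invalid value" })
        return ResponseEntity.badRequest().body(errors)
    }
}
```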
Problem: Cannot use multiple providers simultaneously.
Solutions:
- Configure API keys for all providers you want to use
- Specify the provider per field: `llmProvider = LLMProvider.ANTHROPIC` (see the sketch below)
- Check auto-configuration logs to see which clients were created
- Verify conditional beans are not being excluded
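
For example, two fields in one form can target different providers, as long as both API keys are configured. The form and prompts below are illustrative, not part of the library:

```kotlin
import com.softwaremill.ai.validator.annotations.AIVerify
import com.softwaremill.ai.validator.annotations.LLMProvider
import com.softwaremill.ai.validator.spring.web.ValidateWithAI

@ValidateWithAI
data class ListingForm( // hypothetical form used only for illustration
    // Validated by the default provider configured under ai.validator
    @AIVerify(prompt = "Check that this title describes a concrete product")
    val title: String,

    // Explicitly routed to Anthropic for this field only
    @AIVerify(
        prompt = "Verify this description makes no unsupported claims",
        llmProvider = LLMProvider.ANTHROPIC
    )
    val description: String
)
```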
We welcome contributions! Please see CONTRIBUTING.md for:
- Development setup
- Code style guidelines
- Testing requirements
- Pull request process
- Fork the repository
- Create a feature branch: `git checkout -b feature/my-feature`
- Make your changes with tests
- Run tests: `./gradlew test`
- Submit a pull request
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
Copyright 2025 SoftwareMill
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
SoftwareMill is a software development and consulting company with a focus on building high-quality, scalable applications. We specialize in:
- Custom software development
- Scala, Kotlin, and Java expertise
- Distributed systems and microservices
- Cloud-native architectures
- AI/ML integration
Want to build something amazing? Get in touch with us!
This library is built on top of:
- Koog - AI agent framework for Kotlin
- Spring Boot - Application framework
- Kotlin Coroutines - Async programming
- Issues: GitHub Issues
- Discussions: GitHub Discussions
- Commercial Support: Contact SoftwareMill
- Blog: SoftwareMill Blog
Star the project if you find it useful! Follow @softwaremill for updates.