
Why Adaptive Process Architecture is the Future

The evolution from traditional jBPM deployments to Kogito's microservice-oriented approach represents a fundamental shift in how we design and deploy business automation solutions. The Adaptive Process Architecture emerged as the optimal solution to address the challenges of modern cloud-native deployments while retaining the power of sophisticated process orchestration.

Strategic Advantages of Adaptive Process Architecture

  1. Balance of Scalability and Operational Simplicity. While fully distributed architectures offer maximum theoretical scalability, they introduce significant complexity in deployment, monitoring, and troubleshooting. The Adaptive Process Architecture strikes an optimal balance by intelligently co-locating related services while maintaining clean service boundaries. This approach delivers near-linear scalability with significantly reduced operational overhead.

  2. Drastically Reduced Operational Complexity. By carefully packaging contextually related services together, operational overhead is kept to a minimum. This makes sophisticated process orchestration accessible to organizations without specialized microservice expertise, while still preserving the cloud-native benefits that modern enterprises demand.

  3. Perfect Alignment with Domain-Driven Design. The architecture naturally supports bounded contexts where each business service encapsulates its complete functionality. This alignment with DDD principles ensures that your technical architecture mirrors your business domains, creating a unified language between technical and business stakeholders.

  4. Intelligent Context Management. Modern business processes require sophisticated context management across multiple components. Co-location dramatically reduces the challenges of distributed context coordination, allowing your processes to maintain rich contextual information without the complexity of distributed state synchronization.

  5. Optimized Performance for Critical Paths. Critical process operations execute through in-memory or local service calls rather than network calls, delivering significant performance improvements. This architecture ensures consistent sub-millisecond response times for critical operations while maintaining the flexibility of distributed communication where appropriate.

Core Components of Adaptive Process Architecture

Process Definition Models

The Business Process Model and Notation (BPMN) artifacts serve as the digital blueprint of your business processes. These models transcend traditional software specifications by providing a visual representation that both business and technical stakeholders can understand and refine.

During compilation, the Kogito build chain transforms these models into highly optimized executable code, generating specialized components tailored to your specific process requirements. This bridge between business design and technical implementation enables true collaborative development where domain experts and engineers work with a unified understanding.

The resulting executable models capture business intent with exceptional fidelity while leveraging modern software engineering principles. This separation of concerns allows business experts to focus on process design while technical teams optimize deployment and integration aspects.
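
In practice, each process model is exposed through REST endpoints generated from its process id. As a hedged illustration, assuming a process with id hiring and a candidate process variable (neither of which is defined in this guide), a new instance could be started with:

curl -X POST http://localhost:8080/hiring \
  -H "Content-Type: application/json" \
  -d '{"candidate": {"name": "Jane Doe"}}'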

Orchestration Engine

The intelligent core of your process applications, powered by the battle-tested jBPM engine reimagined for cloud-native environments. This sophisticated engine coordinates the flow of activities across your entire process landscape, making intelligent routing decisions and dynamically delegating specialized capabilities to purpose-built subsystems.

The engine handles complex orchestration patterns including parallel execution paths, conditional branching logic, event-driven flow control, and compensation handling - all while maintaining transactional integrity across distributed components. Its modular architecture allows seamless interaction with other specialized subsystems like the Human Collaboration Framework for human-in-the-loop scenarios and the Temporal Event Coordinator for time-based orchestration.

This next-generation engine combines the reliability of traditional BPM systems with the agility and scalability demands of modern cloud architectures, delivering a foundation that adapts to changing business requirements without sacrificing operational stability.

Cloud-Native Runtime

The Kogito-powered foundation provides essential enterprise capabilities required for mission-critical deployments. This runtime layer elegantly handles cross-cutting concerns including transaction management, API exposure, resource pooling, security controls, and component lifecycle management.

Built on Quarkus™, the runtime delivers exceptional startup performance and memory efficiency, whether deployed in containers, Kubernetes clusters, or serverless environments. The cloud-native design ensures your process applications can scale elastically to meet demand spikes while maintaining consistent performance characteristics.

The runtime's modular design allows selective inclusion of capabilities based on your specific requirements, avoiding the bloat of traditional application servers while delivering enhanced resilience through simplified deployment topologies.

Data Intelligence Layer (Data-Index)

A sophisticated real-time snapshot of process state that enables immediate process visibility across your organization. This component receives incremental state change events from the Orchestration Engine and efficiently computes the current state by intelligently integrating these changes with existing contextual data.

The Data Intelligence Layer exposes powerful GraphQL interfaces that allow both systems and users to query process data using flexible, domain-specific queries. This capability enables dashboards, monitoring tools, and advanced analytics to provide real-time insights into your processes without impacting operational performance.

When the GraphQL UI is included in your application, it is available at:

http://localhost:8080/<root-path>/graphql-ui/
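
For example, pasting a query like the following into the GraphQL UI returns the currently active process instances with a few selected fields. This is a minimal sketch: the available fields and filters depend on your process models and the Data-Index version in use.

{
  ProcessInstances(where: { state: { equal: ACTIVE } }) {
    id
    processId
    state
    start
  }
}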

To add the Data Intelligence Layer to your project, include the kogito-addons-quarkus-data-index-jpa dependency:

<dependency>
  <groupId>org.kie</groupId>
  <artifactId>kogito-addons-quarkus-data-index-jpa</artifactId>
</dependency>

Process History Service (Data-Audit)

A comprehensive temporal view of process execution that captures the complete evolution of process instances throughout their lifecycle. This component maintains an immutable record of every significant event, creating a trusted audit trail for compliance, analysis, and process improvement initiatives.

The rich historical data enables authorized users to replay processes from any point in time, understand decision paths, and identify optimization opportunities. Through its flexible GraphQL interface, the Process History Service supports complex historical queries that can reveal insights for performance analysis and business intelligence.

The Data-Audit subsystem provides several powerful capabilities:

  • Runs as a colocated service within your Quarkus application
  • Includes extension points for customization and integration
  • Provides GraphQL querying capabilities for flexible data access
  • Supports multiple storage implementations through extension points

To add the Data-Audit capability, include the following dependencies in your project:

<dependency>
  <groupId>org.kie</groupId>
  <artifactId>kogito-addons-quarkus-data-audit</artifactId>
</dependency>

<dependency>
  <groupId>org.kie</groupId>
  <artifactId>kogito-addons-quarkus-data-audit-jpa</artifactId>
</dependency>
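
Once the add-on is included, the Data-Audit GraphQL endpoint can be queried for historical process data. The query below is an illustrative sketch only; query names and fields (for example GetAllProcessInstancesState) vary across versions, so confirm them against the GraphQL schema exposed by your application.

{
  GetAllProcessInstancesState {
    processInstanceId
    processId
    state
    eventDate
  }
}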

Temporal Event Coordinator (Jobs Service)

An intelligent scheduling system that manages the time-dependent aspects of your business processes. This component handles various temporal patterns including timers, deadlines, SLA enforcement, and scheduled notifications.

The Temporal Event Coordinator maintains execution guarantees even in distributed environments, ensuring time-based events trigger reliably despite infrastructure changes or restarts. It seamlessly integrates with other components to initiate time-based process actions or trigger escalation paths when activities exceed their expected duration.

To enable this capability in your project:

<!-- Required for the Jobs Service add-on transport tier definition -->
<dependency>
  <groupId>org.kie</groupId>
  <artifactId>kogito-addons-quarkus-jobs-management</artifactId>
</dependency>

<dependency>
  <groupId>org.kie</groupId>
  <artifactId>kogito-addons-quarkus-jobs</artifactId>
</dependency>

<!-- If you are coming from other distributions of the Jobs Service, remove the Jobs Service add-on storage definition -->
<!-- <dependency>
  <groupId>org.kie</groupId>
  <artifactId>jobs-service-storage-jpa</artifactId>
</dependency> -->

The Temporal Event Coordinator uses ISO-8601 expressions to define time-based events, covering time formats, durations, repeat intervals, and more.
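
For example, a timer event definition in a BPMN model can use an ISO-8601 duration or repeating interval. The snippet below is illustrative; the element names follow the standard BPMN 2.0 schema used by the jBPM and Kogito editors.

<bpmn2:timerEventDefinition>
  <!-- Fire once, 5 minutes after the timer is activated -->
  <bpmn2:timeDuration xsi:type="bpmn2:tFormalExpression">PT5M</bpmn2:timeDuration>
</bpmn2:timerEventDefinition>

<bpmn2:timerEventDefinition>
  <!-- Repeat 3 times, once every 10 minutes (ISO-8601 repeating interval) -->
  <bpmn2:timeCycle xsi:type="bpmn2:tFormalExpression">R3/PT10M</bpmn2:timeCycle>
</bpmn2:timerEventDefinition>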

Human Collaboration Framework (User Tasks)

A sophisticated system for seamlessly integrating human judgment and expertise into automated processes. This framework implements a comprehensive lifecycle for tasks requiring human input, approval, or decision-making.

The Human Collaboration Framework manages task assignment strategies, permission controls, and state transitions using a well-defined task lifecycle. It supports rich interaction patterns including attachments, comments, forms, and notifications - creating an intuitive collaboration experience that connects human expertise with automated process execution.

When a process instance reaches a User Task, the framework creates a new task in the Created state, which then moves through various states including Ready, Reserved, and ultimately Completed (or alternative terminal states like Failed or Obsolete).

By default, User Tasks are stored in memory, but they can be moved to persistent storage by adding the following dependency:

<dependency>
  <groupId>org.jbpm</groupId>
  <artifactId>jbpm-addons-quarkus-usertask-storage-jpa</artifactId>
</dependency>

Persistent Context Store (Storage)

The resilient foundation that maintains process context across all components. This storage layer ensures process consistency and durability, protecting against data loss even during system failures or maintenance windows.

Using a relational database, it maintains a consistent view of process state that can be accessed by all components within the architecture. This shared storage approach dramatically simplifies deployment while providing strong data consistency guarantees. The Temporal Event Coordinator and Data Intelligence Layer work seamlessly with the Persistent Context Store to provide real-time process updates and interactions.

The storage layer is optimized for both transactional integrity and query performance, supporting both operational needs and analytical workloads from a single data store. This unified approach eliminates the complexity of data synchronization while providing a single source of truth for all process-related information.

Getting Started with Adaptive Process Architecture

Implementing this architecture has been streamlined to allow teams to quickly build sophisticated process applications. The following steps will get you started:

  1. Create a new Quarkus project with the required dependencies:
<!-- Core dependencies -->
<dependency>
    <groupId>io.quarkus</groupId>
    <artifactId>quarkus-resteasy</artifactId>
</dependency>
<dependency>
    <groupId>io.quarkus</groupId>
    <artifactId>quarkus-resteasy-jackson</artifactId>
</dependency>
<dependency>
    <groupId>io.quarkus</groupId>
    <artifactId>quarkus-smallrye-openapi</artifactId>
</dependency>
<dependency>
    <groupId>io.quarkus</groupId>
    <artifactId>quarkus-smallrye-health</artifactId>
</dependency>

<!-- For intelligent process orchestration with Adaptive Architecture -->

<!-- Process and Decisions  -->
<dependency>
    <groupId>org.jbpm</groupId>
    <artifactId>jbpm-with-drools-quarkus</artifactId>
</dependency>

<dependency>
    <groupId>org.kie</groupId>
    <artifactId>kie-addons-quarkus-process-management</artifactId>
</dependency>
<!-- Process History Service -->
<dependency>
    <groupId>org.kie</groupId>
    <artifactId>kogito-addons-quarkus-data-audit-jpa</artifactId>
</dependency>
<dependency>
    <groupId>org.kie</groupId>
    <artifactId>kogito-addons-quarkus-data-audit</artifactId>
</dependency>
<!-- Temporal Event Coordinator -->
<dependency>
  <groupId>org.kie</groupId>
  <artifactId>kogito-addons-quarkus-jobs-management</artifactId>
</dependency>
<dependency>
  <groupId>org.kie</groupId>
  <artifactId>kogito-addons-quarkus-jobs</artifactId>
</dependency>
<!-- Reminder to remove jobs-service-storage-jpa -->
<!--
<dependency>
  <groupId>org.kie</groupId>
  <artifactId>jobs-service-storage-jpa</artifactId>
</dependency> -->

<!-- Human Collaboration Framework Persisted -->
<dependency>
  <groupId>org.jbpm</groupId>
  <artifactId>jbpm-addons-quarkus-usertask-storage-jpa</artifactId>
</dependency>

<!-- Data Intelligence Layer (Optional) -->
<dependency>
  <groupId>org.kie</groupId>
  <artifactId>kogito-addons-quarkus-data-index-jpa</artifactId>
</dependency>

<!-- Database connectivity -->
<dependency>
  <groupId>io.quarkus</groupId>
  <artifactId>quarkus-jdbc-postgresql</artifactId>
</dependency>
<dependency>
    <groupId>io.quarkus</groupId>
    <artifactId>quarkus-agroal</artifactId>
</dependency>
<dependency>
    <groupId>org.kie</groupId>
    <artifactId>kie-addons-quarkus-persistence-jdbc</artifactId>
</dependency>
<dependency>
    <groupId>org.kie</groupId>
    <artifactId>kogito-addons-quarkus-data-index-persistence-postgresql</artifactId>
</dependency>

<!-- Process Diagram SVGs used with Consoles -->
<dependency>
    <groupId>org.kie</groupId>
    <artifactId>kie-addons-quarkus-process-svg</artifactId>
</dependency>
<dependency>
    <groupId>org.kie</groupId>
    <artifactId>kie-addons-quarkus-source-files</artifactId>
</dependency>

<!-- Container image creation -->
<dependency>
    <groupId>io.quarkus</groupId>
    <artifactId>quarkus-container-image-jib</artifactId>
</dependency>
  2. Configure your application in application.properties:
#####################################
# Core HTTP Configuration
#####################################
quarkus.http.port=8080
quarkus.http.root-path=/

# CORS Configuration
quarkus.http.cors=true
quarkus.http.cors.origins=*
quarkus.http.cors.methods=GET,POST,PUT,DELETE,OPTIONS,PATCH
quarkus.http.cors.headers=accept,authorization,content-type,x-requested-with,x-forward-for,content-length,host,origin,referer,Access-Control-Request-Method,Access-Control-Request-Headers
quarkus.http.cors.exposed-headers=Content-Disposition,Content-Type
quarkus.http.cors.access-control-max-age=24H
quarkus.http.cors.access-control-allow-credentials=true

#####################################
# API Documentation
#####################################
quarkus.smallrye-openapi.path=/docs/openapi.json
quarkus.swagger-ui.always-include=true

#####################################
# Logging Configuration
#####################################
# Minimize logging for all categories
quarkus.log.level=WARN
# Enable more verbose logging for application specific messages
quarkus.log.category."com.example".level=INFO
# Uncomment for troubleshooting
#quarkus.log.category."org.jbpm".level=DEBUG
#quarkus.log.category."org.kie.kogito".level=DEBUG

#####################################
# Database Configuration
#####################################
# Common database settings
quarkus.hibernate-orm.database.generation=update
quarkus.hibernate-orm.log.sql=false

# Production Database Configuration
%prod.quarkus.datasource.db-kind=postgresql
%prod.quarkus.datasource.username=${POSTGRESQL_USER:kogito}
%prod.quarkus.datasource.password=${POSTGRESQL_PASSWORD:kogito123}
%prod.quarkus.datasource.jdbc.url=jdbc:postgresql://${POSTGRESQL_SERVICE}:5432/${POSTGRESQL_DATABASE}

# Development Database Configuration
%dev.quarkus.datasource.db-kind=postgresql
%dev.quarkus.datasource.devservices.enabled=true
%dev.quarkus.datasource.devservices.port=5432

#####################################
# Flyway Migration Settings
#####################################
# Development Mode - Clean start for rapid iteration
%dev.quarkus.flyway.clean-at-start=true
%dev.quarkus.flyway.migrate-at-start=true
%dev.quarkus.flyway.baseline-on-migrate=true
%dev.quarkus.flyway.out-of-order=true
%dev.quarkus.flyway.baseline-version=0.0
%dev.quarkus.flyway.locations=classpath:/db/migration,classpath:/db/jobs-service,classpath:/db/data-audit/postgresql
%dev.quarkus.flyway.table=FLYWAY_RUNTIME_SERVICE
%dev.kie.flyway.enabled=true

# Production Mode - Safe migration
%prod.kie.flyway.enabled=false
%prod.quarkus.flyway.migrate-at-start=true
%prod.quarkus.flyway.baseline-on-migrate=true
%prod.quarkus.flyway.out-of-order=true
%prod.quarkus.flyway.baseline-version=0.0
%prod.quarkus.flyway.locations=classpath:/db/migration,classpath:/db/jobs-service,classpath:/db/data-audit/postgresql
%prod.quarkus.flyway.table=FLYWAY_RUNTIME_SERVICE

#####################################
# Process Automation Engine
#####################################
# Enable transactions
kogito.transactionEnabled=true

# Dev users for testing
%dev.jbpm.devui.users.jdoe.groups=admin,HR,IT
%dev.jbpm.devui.users.mscott.groups=admin,HR,IT

#####################################
# Jobs Service Configuration
#####################################
# Run periodic job loading every minute
kogito.jobs-service.loadJobIntervalInMinutes=1
# Load jobs into the InMemory scheduler that expire within the next 10 minutes
kogito.jobs-service.schedulerChunkInMinutes=10
# Load jobs into the InMemory scheduler that have expired in the last 5 minutes
kogito.jobs-service.loadJobFromCurrentTimeIntervalInMinutes=5

#####################################
# Security Configuration
#####################################
# Security disabled by default
quarkus.oidc.enabled=false
quarkus.kogito.security.auth.enabled=false

# OIDC Configuration (commented out, enable when needed)
#quarkus.oidc.auth-server-url=https://keycloak.aletyx-labs.aletyx.dev/auth/realms/jbpm-openshift
#quarkus.oidc.client-id=cc-application-approval
#quarkus.oidc.credentials.secret=your-secret-here
#quarkus.oidc.enabled=true
#quarkus.oidc.tenant-enabled=true
#quarkus.oidc.application-type=service
#quarkus.http.auth.permission.authenticated.paths=/*
#quarkus.http.auth.permission.authenticated.policy=authenticated
#quarkus.http.auth.permission.public.paths=/q/*,/docs/*,/kogito/security/oidc/*
#quarkus.http.auth.permission.public.policy=permit
#kogito.security.auth.enabled=true
#kogito.security.auth.impersonation.allowed-for-roles=managers

#####################################
# Environment URLs
#####################################
# Development Environment
%dev.kogito.service.url=http://localhost:8080
%dev.quarkus.devservices.enabled=true
%dev.quarkus.kogito.devservices.enabled=true

# Production Environment
%prod.quarkus.devservices.enabled=false
%prod.quarkus.kogito.devservices.enabled=false
%prod.kogito.service.url=${KOGITO_SERVICE_URL:http://localhost:8080}

#####################################
# Container Image Configuration
#####################################
# Kubernetes Deployment
%prod.quarkus.kubernetes.deploy=true
%prod.quarkus.kubernetes.deployment-target=kubernetes
%prod.quarkus.kubernetes.ingress.expose=true
%prod.quarkus.kubernetes.ingress.host=${SERVICE_HOST:aletyx.dev}

# Container Image Settings
%prod.quarkus.container-image.build=true
%prod.quarkus.container-image.registry=${CONTAINER_REGISTRY:docker.io}
%prod.quarkus.container-image.group=${user.name}
%prod.quarkus.container-image.name=claim-initiation

#####################################
# Event Configuration (Optional)
#####################################
# Uncomment to enable Kafka event publishing
# kafka.bootstrap.servers=localhost:9092
# kogito.events.usertasks.enabled=true
# kogito.events.variables.enabled=true
# kogito.events.processinstances.enabled=true
# mp.messaging.outgoing.kogito-processinstances-events.connector=smallrye-kafka
# mp.messaging.outgoing.kogito-processinstances-events.topic=kogito-processinstances-events
# mp.messaging.outgoing.kogito-processinstances-events.value.serializer=org.apache.kafka.common.serialization.StringSerializer
# mp.messaging.outgoing.kogito-usertaskinstances-events.connector=smallrye-kafka
# mp.messaging.outgoing.kogito-usertaskinstances-events.topic=kogito-usertaskinstances-events
# mp.messaging.outgoing.kogito-usertaskinstances-events.value.serializer=org.apache.kafka.common.serialization.StringSerializer
# mp.messaging.outgoing.kogito-variables-events.connector=smallrye-kafka
# mp.messaging.outgoing.kogito-variables-events.topic=kogito-variables-events
# mp.messaging.outgoing.kogito-variables-events.value.serializer=org.apache.kafka.common.serialization.StringSerializer
  3. Create your BPMN process models in the src/main/resources directory

  4. Launch your application:

mvn quarkus:dev

All components will be intelligently co-located within your service and share a consistent data store for seamless process execution.
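
With the application running, you can quickly verify the generated APIs. For instance, the OpenAPI document is served at the path configured above, and each process exposes its own resource (the hiring process id below is purely illustrative):

# Browse the generated API documentation
curl http://localhost:8080/docs/openapi.json

# List active instances of a process (replace 'hiring' with your process id)
curl http://localhost:8080/hiring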

Conclusion

The Adaptive Process Architecture represents the future of process orchestration, bringing together the best elements of traditional business process management and modern cloud-native architectures. By intelligently co-locating related process components, it dramatically simplifies development, deployment, and operations while maintaining the benefits of a modular, scalable design. With the Aletyx Enterprise Build of Kogito and Drools, truly cloud-native processes become achievable, with the Temporal Event Coordinator and audit services scaling to meet the demands of enterprise deployments!

This architecture is uniquely suited for modern business processes that require sophisticated orchestration including human collaboration, temporal coordination, and long-running transactions. The unified approach delivers exceptional developer productivity without sacrificing operational flexibility, allowing your organization to rapidly adapt to changing business requirements.