Overview
Develop, scale, and troubleshoot event-driven microservice applications.
Learn to use Kafka and AMQ Streams to design, develop, and test event-driven applications. Event-driven microservices scale globally, store and stream-process data, and provide low-latency feedback to customers. This course is for application developers and is based on Red Hat AMQ Streams 1.8 and Red Hat OpenShift Container Platform 4.6.
Skills Covered
- Describe the basics of Kafka and its architecture.
- Develop applications with the Kafka Streams API.
- Integrate applications with Kafka Connect.
- Capture data changes with Debezium.
- Troubleshoot common application streaming issues.
Who Should Attend
- Application developers with microservice development experience.
Course Curriculum
Prerequisites
- Experience with microservice application development and design, such as that gained in DO378, or equivalent experience.
- OpenShift experience is recommended, but not required.
Course Modules
- Describe the principles of event-driven applications.
- Build applications with basic read-and-write messaging capabilities.
- Leverage the Streams API to create data streaming applications.
- Create and migrate to asynchronous services using the event collaboration pattern.
- Connect data systems and react to data changes using Kafka Connect and Debezium.
- Handle common problems in Kafka and AMQ Streams applications.
Training Options
- ILT: Instructor-Led Training
- VILT: Virtual Instructor-Led Training
RM4,890.00 Enroll Now
Exam & Certification
Red Hat Certified Specialist in Event-Driven Development with Kafka.
A Red Hat Certified Specialist in Event-Driven Development with Kafka has demonstrated the ability to develop applications using Apache Kafka and Apache Kafka Streams.
A Red Hat Certified Specialist in Event-Driven Development with Kafka is able to:
- Understand and work with event-driven applications with the AMQ Streams API
- Understand the Kafka ecosystem and architecture
- Understand and work with a Quarkus application connected to Kafka
- Provide and configure access to a Kafka cluster
- Provision and use the schema registry (Red Hat Service Registry) to decouple data from client applications, and to share and manage data types at runtime
- Understand, produce, test, and secure data-stream processing using the Kafka Streams API for efficient management and real-time querying of application state
- Integrate data with Kafka Connect
- Understand and use advanced event-driven patterns in applications based on Apache Kafka
- Troubleshoot the most common problems in event-driven applications, such as maintaining message ordering, retries and idempotency, handling duplicate events, and implementing Streams test cases
Training & Certification Guide
- Organizations are recognizing that traditional synchronous applications are not able to scale consistently and adjust to the massive amounts of data from customers while still meeting customers’ expectations of immediate results. With event-driven applications using Kafka and AMQ Streams, organizations can expect to be able to globally scale their applications, store and stream-process data, and provide feedback to customers with extremely low latency.
- As a result of attending this course, students will understand the architecture of Kafka and AMQ Streams and will be able to identify proper use cases for event-driven applications. In addition to learning the fundamental principles and features of Kafka and AMQ Streams, students will learn how to design, develop, and test event-driven applications.
- Students should be able to demonstrate the following skills:
- Design, build, and use event-driven applications for relevant scenarios with standard patterns.
- Detect and react to data changes with Debezium to improve application performance.
- Troubleshoot common problems with event-driven applications.
The Red Hat Certified Specialist in Event-Driven Application Development exam tests your skills and knowledge with regard to coding event-driven applications using Apache Kafka and developing Apache Kafka Streams. The exam focuses on the basic skills required for building applications using event-driven architecture.
By passing this exam, you become a Red Hat Certified Specialist in Event-Driven Development with Kafka, which also counts toward earning a Red Hat Certified Architect (RHCA®) certification.
This exam is based on Red Hat® AMQ® Streams 1.8 with Apache Kafka 2.8.
The following audiences may be interested in earning the Red Hat Certified Specialist in Event-Driven Development with Kafka credential:
- Java developers and architects who are implementing event-driven applications using Apache Kafka and Kubernetes.
- Red Hat Certified professionals who wish to pursue Red Hat Certified Architect (RHCA) certification.
In preparation, candidates should have:
- Familiarity with using VSCode/VSCodium in a Red Hat Enterprise Linux environment.
- Good experience with Java SE, including knowledge and understanding of core Java concepts and APIs; for example, exceptions, annotations, lambdas, functional programming, and the Collections API are all required.
- Some familiarity with OpenShift/Kubernetes is beneficial.
- Take our free assessment to find the course that best supports your preparation for this exam.
As part of this exam, you should be able to perform these tasks:
- Understand and work with event-driven applications with AMQ Streams API.
- Know how to send and read data from Kafka (a minimal producer/consumer sketch follows this group of objectives).
- Be able to develop microservices and other types of applications to share data with extremely high throughput and low latency.
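To make this first objective concrete, here is a minimal sketch of sending and reading records with the plain Kafka client API. The bootstrap address, topic name, and group id are illustrative placeholders, not exam values.

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class HelloKafka {
        public static void main(String[] args) throws Exception {
            Properties producerProps = new Properties();
            producerProps.put("bootstrap.servers", "localhost:9092");
            producerProps.put("key.serializer", StringSerializer.class.getName());
            producerProps.put("value.serializer", StringSerializer.class.getName());

            // Send one record and block until the broker acknowledges it.
            try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
                producer.send(new ProducerRecord<>("greetings", "key-1", "hello")).get();
            }

            Properties consumerProps = new Properties();
            consumerProps.put("bootstrap.servers", "localhost:9092");
            consumerProps.put("group.id", "greetings-reader");
            consumerProps.put("auto.offset.reset", "earliest");
            consumerProps.put("key.deserializer", StringDeserializer.class.getName());
            consumerProps.put("value.deserializer", StringDeserializer.class.getName());

            // Subscribe, poll once, and print whatever arrived.
            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
                consumer.subscribe(List.of("greetings"));
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
                for (ConsumerRecord<String, String> r : records) {
                    System.out.printf("%s -> %s%n", r.key(), r.value());
                }
            }
        }
    }

The sketch blocks on the broker acknowledgement only because it sends a single record; high-throughput code would normally rely on the asynchronous send callback instead.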
- Understand the Kafka ecosystem and architecture:
- How to create, configure, and manage topics.
- How to configure the ecosystem to share data with extremely high throughput and low latency.
- How to scale and guarantee message ordering.
- How to use message compaction to remove old records, and how to configure it.
- How to configure and use data replication to control fault tolerance.
- How to retain high volumes of data for immediate access (a topic-creation sketch covering several of these settings follows this list).
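As a rough illustration of topic creation with replication, compaction, and retention settings, here is a sketch using the Kafka AdminClient; the topic name, partition count, and replica count are assumptions chosen for the example.

    import java.util.Map;
    import java.util.Properties;
    import java.util.Set;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.NewTopic;
    import org.apache.kafka.common.config.TopicConfig;

    public class CreateOrdersTopic {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");

            try (AdminClient admin = AdminClient.create(props)) {
                // 6 partitions for parallelism; Kafka guarantees ordering only within a partition.
                // Replication factor 3 lets the topic survive broker failures.
                NewTopic topic = new NewTopic("orders", 6, (short) 3)
                        .configs(Map.of(
                                // Log compaction keeps the latest record per key.
                                TopicConfig.CLEANUP_POLICY_CONFIG, TopicConfig.CLEANUP_POLICY_COMPACT,
                                // Acknowledged writes survive the loss of one replica.
                                TopicConfig.MIN_IN_SYNC_REPLICAS_CONFIG, "2"));
                // For time-based retention instead of compaction, use
                // TopicConfig.CLEANUP_POLICY_DELETE together with TopicConfig.RETENTION_MS_CONFIG.
                admin.createTopics(Set.of(topic)).all().get();
            }
        }
    }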
- Understand and work with a Quarkus application connected to Kafka
- Connect to Kafka with Reactive Messaging
- Connect to Apache Kafka with its native API
- Produce and consume messages and implement event-driven and data-streaming applications
- Be familiar with the reactive libraries used by Quarkus: asynchronous Java or the Publisher API, RxJava or Reactor APIs, Mutiny, etc. (a Reactive Messaging sketch follows this group).
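Below is a minimal sketch of a Quarkus bean using SmallRye Reactive Messaging and Mutiny. The channel names ("orders-in", "orders-out", "ticks") are hypothetical and would be mapped to Kafka topics in application.properties; note that Quarkus releases of this course's vintage import javax.* packages, while Quarkus 3+ uses jakarta.*.

    import java.time.Duration;
    import javax.enterprise.context.ApplicationScoped;
    import io.smallrye.mutiny.Multi;
    import org.eclipse.microprofile.reactive.messaging.Incoming;
    import org.eclipse.microprofile.reactive.messaging.Outgoing;

    @ApplicationScoped
    public class OrderProcessor {

        // Each record arriving on the "orders-in" channel is transformed and
        // re-emitted on "orders-out"; both channels map to Kafka topics in
        // application.properties.
        @Incoming("orders-in")
        @Outgoing("orders-out")
        public String enrich(String order) {
            return order.toUpperCase();
        }

        // A Mutiny stream used as a message source: one tick per second.
        @Outgoing("ticks")
        public Multi<Long> ticks() {
            return Multi.createFrom().ticks().every(Duration.ofSeconds(1));
        }
    }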
- Provide and configure access to a Kafka cluster.
- Be able to access the external listeners of Kafka in the cloud. On Kubernetes or Red Hat OpenShift, connect via node ports, load balancers, or, for external access, an ingress or OpenShift route.
- Understand how to configure the security of the communications between the Kafka client and the cluster.
- Understand and provide the Kafka client configuration for the required authentication and authorization security (see the client configuration sketch below).
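For example, a client connecting to a TLS-protected listener with SCRAM authentication might be configured as sketched below; all host names, file paths, and credentials are placeholders.

    import java.util.Properties;

    public class SecureClientConfig {
        public static Properties build() {
            Properties props = new Properties();
            // Placeholder bootstrap address, e.g. an OpenShift route to an external listener.
            props.put("bootstrap.servers", "my-cluster-kafka-bootstrap-myproject.apps.example.com:443");
            // TLS for encryption; the truststore holds the cluster CA certificate.
            props.put("security.protocol", "SASL_SSL");
            props.put("ssl.truststore.location", "/tmp/truststore.p12");
            props.put("ssl.truststore.password", "changeit");
            props.put("ssl.truststore.type", "PKCS12");
            // SCRAM credentials, e.g. as managed by a KafkaUser resource on OpenShift.
            props.put("sasl.mechanism", "SCRAM-SHA-512");
            props.put("sasl.jaas.config",
                    "org.apache.kafka.common.security.scram.ScramLoginModule required "
                    + "username=\"my-user\" password=\"my-password\";");
            return props;
        }
    }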
- Provision and use the schema registry (Red Hat Service Registry) to decouple data from client applications, and to share and manage data types at runtime:
- Understand and work with the different Kafka Streams APIs like Streams DSL and Processor API.
- Configure and provide the proper Kafka SerDes (serializer/deserializer) for the records to correctly materialize the data.
- Be able to receive data from one or more input streams, execute complex operations like mapping, filtering or joining, repartition and/or grouping, and write the results into one or more output streams.
- Understand the stream-table duality and perform stateful operations like joins, aggregations, and windowed joins.
- Understand how to define and connect custom processors and transformers to interact with state stores using the Processor API.
- Understand event manipulation: deriving new collections from existing ones and describing changes between them (a Streams DSL sketch follows this list).
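A minimal Streams DSL sketch tying these points together appears below: it counts orders per customer with explicit SerDes and a materialized state store. The topic names and application id are illustrative.

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.Consumed;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.KTable;
    import org.apache.kafka.streams.kstream.Produced;

    public class OrderCounts {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "order-counts");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            // Default SerDes so every operator can materialize records correctly.
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            KStream<String, String> orders = builder.stream("orders",
                    Consumed.with(Serdes.String(), Serdes.String()));

            // Stateful aggregation: the KTable side of the stream-table duality,
            // backed by a local state store that can be queried in real time.
            KTable<String, Long> countsPerCustomer = orders
                    .filter((customerId, order) -> order != null)
                    .groupByKey()
                    .count();

            countsPerCustomer.toStream()
                    .to("order-counts", Produced.with(Serdes.String(), Serdes.Long()));

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }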
- Data integration with Kafka Connect:
- Understand how Kafka Connect provides reliable and scalable data transfer between Kafka and other heterogeneous data systems.
- Understand how Kafka Connect facilitates data conversion, transformation, and offset management.
- Detect and capture data changes (CDC) with Debezium.
- Understand the different standalone and distributed running modes and their use cases.
- Use the pre-built AMQ Streams connectors (a Debezium connector registration sketch follows this list).
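As one hedged example, a Debezium MySQL source connector could be registered against the Kafka Connect REST API as sketched below. The Connect URL, database coordinates, and connector name are placeholders; Debezium 1.x (the generation shipped with AMQ Streams 1.8) uses database.server.name, which newer Debezium releases renamed to topic.prefix, and on OpenShift the same configuration is more commonly declared in a KafkaConnector custom resource.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class RegisterDebeziumConnector {
        public static void main(String[] args) throws Exception {
            // Connector JSON: class, database coordinates, and the tables to capture.
            String config = """
                {
                  "name": "inventory-connector",
                  "config": {
                    "connector.class": "io.debezium.connector.mysql.MySqlConnector",
                    "database.hostname": "mysql",
                    "database.port": "3306",
                    "database.user": "debezium",
                    "database.password": "dbz",
                    "database.server.id": "184054",
                    "database.server.name": "inventory",
                    "table.include.list": "inventory.customers"
                  }
                }""";

            // POST the configuration to the Connect REST API (distributed mode).
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("http://connect:8083/connectors"))
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(config))
                    .build();

            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode() + " " + response.body());
        }
    }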
- Understand and use advanced event-driven patterns in applications based on Apache Kafka:
- Recognize and work with the Event Sourcing and CQRS patterns in an application
- Know and work with advanced techniques such as long-running business transactions with Saga orchestration, and outbox patterns to exchange data between different services (an outbox sketch follows).
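A minimal sketch of the transactional outbox pattern follows: the business row and the event row are written in one database transaction, and a CDC pipeline such as Debezium relays the outbox rows to Kafka. The table and column names follow a common Debezium outbox convention but are assumptions here.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.util.UUID;

    public class OrderService {

        public void placeOrder(Connection conn, String orderId, String payloadJson) throws Exception {
            conn.setAutoCommit(false);
            try {
                // 1. The business write.
                try (PreparedStatement ps =
                        conn.prepareStatement("INSERT INTO orders(id, payload) VALUES (?, ?)")) {
                    ps.setString(1, orderId);
                    ps.setString(2, payloadJson);
                    ps.executeUpdate();
                }
                // 2. The event row, committed atomically with the business write;
                // a CDC connector relays it to Kafka, so no dual write can diverge.
                try (PreparedStatement ps = conn.prepareStatement(
                        "INSERT INTO outbox(id, aggregatetype, aggregateid, type, payload) "
                        + "VALUES (?, ?, ?, ?, ?)")) {
                    ps.setString(1, UUID.randomUUID().toString());
                    ps.setString(2, "order");
                    ps.setString(3, orderId);
                    ps.setString(4, "OrderPlaced");
                    ps.setString(5, payloadJson);
                    ps.executeUpdate();
                }
                conn.commit();
            } catch (Exception e) {
                conn.rollback();
                throw e;
            }
        }
    }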
- Troubleshoot the most common problems in event-driven applications, such as maintaining message ordering, retries and idempotency, handling duplicate events, and implementing Streams test cases (a TopologyTestDriver sketch follows).
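For the last point, here is a minimal Streams test sketch using TopologyTestDriver from kafka-streams-test-utils, which pushes records through a topology synchronously with no broker; the topology under test is the hypothetical order-counting example sketched earlier.

    import static org.junit.jupiter.api.Assertions.assertEquals;

    import java.util.Properties;
    import org.apache.kafka.common.serialization.LongDeserializer;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.apache.kafka.common.serialization.StringSerializer;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.TestInputTopic;
    import org.apache.kafka.streams.TestOutputTopic;
    import org.apache.kafka.streams.TopologyTestDriver;
    import org.apache.kafka.streams.kstream.Consumed;
    import org.apache.kafka.streams.kstream.Produced;
    import org.junit.jupiter.api.Test;

    class OrderCountsTest {

        @Test
        void countsOrdersPerCustomer() {
            // Rebuild the hypothetical counting topology inline.
            StreamsBuilder builder = new StreamsBuilder();
            builder.stream("orders", Consumed.with(Serdes.String(), Serdes.String()))
                   .groupByKey()
                   .count()
                   .toStream()
                   .to("order-counts", Produced.with(Serdes.String(), Serdes.Long()));

            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "order-counts-test");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "dummy:1234");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            try (TopologyTestDriver driver = new TopologyTestDriver(builder.build(), props)) {
                TestInputTopic<String, String> orders = driver.createInputTopic(
                        "orders", new StringSerializer(), new StringSerializer());
                TestOutputTopic<String, Long> counts = driver.createOutputTopic(
                        "order-counts", new StringDeserializer(), new LongDeserializer());

                orders.pipeInput("customer-1", "{\"item\":\"book\"}");
                orders.pipeInput("customer-1", "{\"item\":\"pen\"}");

                // The latest count for the key should reflect both records.
                assertEquals(Long.valueOf(2), counts.readKeyValuesToMap().get("customer-1"));
            }
        }
    }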
During the exam you may be required to work with one or more pre-written Java applications. You will be required to modify some parts of the application code.
As with all Red Hat performance-based exams, configurations must persist after reboot without intervention.
This exam is a hands-on, practical exam that requires you to undertake real-world development tasks. Internet access is not provided during the exam, and you will not be permitted to bring any hard copy or electronic documentation into the exam, including notes, books, or any other material. Documentation for AMQ, AMQ Streams, Kafka, and Kafka Streams is available during the exam.
DO328: Building Resilient Microservices with Istio and Red Hat OpenShift Service Mesh
Building Resilient Microservices with Istio and Red Hat OpenShift Service Mesh (DO328) is an introduction to Red Hat OpenShift Service Mesh that teaches students installation, service monitoring, service resilience, and service security with Red Hat OpenShift Service Mesh.
Frequently Asked Questions
Red Hat official training courses delivered by Trainocate Malaysia teach you the fundamentals of each Red Hat technology, with practical use cases and hands-on labs designed to help you use the products successfully. By certifying those skills, you prove what you know and can do, validating your knowledge to the market or an employer.
You can get further details—including requirements and objectives—about each offering by reviewing the individual descriptions within our full lists of courses and certifications.
An IT professional who becomes a Red Hat Certified System Administrator (RHCSA) is able to perform the core system administration skills required in Red Hat Enterprise Linux environments. The credential is earned after successfully passing the Red Hat Certified System Administrator (RHCSA) exam.
A Red Hat Certified Engineer (RHCE) is an RHCSA who is ready to automate Red Hat Enterprise Linux tasks, integrate Red Hat emerging technologies, and apply automation for efficiency and innovation. To learn more about how the credential is changing to match new needs and fill skills gaps, check out our frequently asked questions about the RHCE program.
Depending on the course or certification, we offer a number of learning styles for you. We conduct classroom, virtual, and on-site training, along with video classroom and online learning formats. Learn more about all the ways to train that Red Hat offers.
Organizations hiring employees, contractors, and consultants can look to Red Hat certifications as an input into hiring, assignment, promotion, and other management decisions. Similarly, individuals who earn these certifications benefit and see value by having official, impartial, and proven validation of their skills and knowledge.