A comprehensive example demonstrating how to use Confluent's JavaScript Client for Apache Kafka® (CJSK) with TypeScript. This project showcases real-world implementation of Kafka producers, consumers, admin operations, and Schema Registry integration.
- TypeScript Integration: Fully typed Kafka implementation
- Schema Registry: AVRO schema definition and management
- Producer/Consumer Pattern: Complete producer and consumer implementation
- Admin Operations: Programmatic topic creation and management
- Error Handling: Robust error handling and graceful shutdowns
This project demonstrates a complete Kafka workflow:
- Schema Registration: Registers an AVRO schema with Schema Registry
- Admin Operations: Creates Kafka topics if they don't exist
- Producer: Sends serialized messages to Kafka topics
- Consumer: Consumes and deserializes messages from topics
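The startup sequence above can be sketched end to end. `registerSchema`, `createTopicIfMissing`, `startConsumer`, and `startProducer` are hypothetical stand-ins for this project's Admin/Producer/Consumer modules; only the ordering is the point:

```typescript
// Sketch of the workflow: schema and topic must exist before clients start.
const steps: string[] = [];

async function registerSchema(): Promise<void> {
  steps.push("schema registered"); // real code talks to Schema Registry
}
async function createTopicIfMissing(topic: string): Promise<void> {
  steps.push(`topic ${topic} ready`); // real code uses the Admin client
}
async function startConsumer(topic: string): Promise<void> {
  steps.push(`consumer subscribed to ${topic}`);
}
async function startProducer(topic: string): Promise<void> {
  steps.push(`producer sending to ${topic}`);
}

async function main(): Promise<void> {
  await registerSchema();
  await createTopicIfMissing("orders"); // "orders" topic name is an assumption
  await startConsumer("orders");
  await startProducer("orders");
}

const ready = main();
ready.then(() => console.log(steps.join(" -> ")));
```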
- Node.js (v16+)
- Kafka cluster (local or remote)
- Schema Registry service
```bash
# Clone the repository
git clone https://github.com/MrDay2Day/typescript_kafka.git
cd typescript_kafka

# Install dependencies
npm install
```
Update the Kafka and Schema Registry connection details in:
- `src/config/Config.ts` - Kafka broker configuration
- `src/schema/Config.ts` - Schema Registry configuration
For production environments, uncomment and configure the SSL and SASL authentication options.
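As a rough sketch of what that production configuration might look like (broker address and credentials are placeholders, and the exact shape of this project's config is an assumption; the keys follow the librdkafka-style naming used by Confluent's client):

```typescript
// Hypothetical production connection config. All values are placeholders.
interface KafkaConnectionConfig {
  "bootstrap.servers": string;
  "security.protocol"?: string;
  "sasl.mechanisms"?: string;
  "sasl.username"?: string;
  "sasl.password"?: string;
}

const productionConfig: KafkaConnectionConfig = {
  "bootstrap.servers": "broker-1.example.com:9092",
  // SSL/SASL options, normally left commented out for local development:
  "security.protocol": "sasl_ssl",
  "sasl.mechanisms": "PLAIN",
  "sasl.username": "API_KEY",    // placeholder
  "sasl.password": "API_SECRET", // placeholder
};

console.log(Object.keys(productionConfig).join(", "));
```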
```bash
# Start the application
npm run dev
```
This will:
- Register the schema with Schema Registry
- Create the topic if it doesn't exist
- Start the consumer
- Start the producer (which sends a message every second)
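The once-per-second produce loop can be sketched as below; `send` is a hypothetical stand-in for the real serializing producer, and the interval is shortened here so the demo finishes quickly:

```typescript
// Sketch of the periodic producer loop; the real project uses 1000 ms.
const PERIOD_MS = 100; // shortened from 1000 for this demo
let sent = 0;

// Hypothetical stand-in for producing a serialized message to Kafka.
const send = (value: Record<string, unknown>): void => {
  sent++;
  console.log(`sent message ${sent}:`, JSON.stringify(value));
};

const timer = setInterval(() => {
  send({ order_id: sent + 1, region: "NA", item_type: "book", item_id: "b-1", units: 2 });
  if (sent >= 3) clearInterval(timer); // stop after a few messages for the demo
}, PERIOD_MS);
```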
```
typescript_kafka/
├── index.ts              # Main application entry point
├── src/
│   ├── Admin.ts          # Kafka Admin operations
│   ├── Consumer.ts       # Kafka Consumer implementation
│   ├── Producer.ts       # Kafka Producer implementation
│   ├── config/
│   │   └── Config.ts     # Kafka configuration
│   └── schema/
│       ├── Config.ts     # Schema Registry configuration
│       └── Order.ts      # AVRO schema definition
├── package.json
└── README.md
```
```typescript
const schemaString = JSON.stringify({
  type: "record",
  name: "Order",
  fields: [
    { name: "region", type: "string" },
    { name: "item_type", type: "string" },
    { name: "item_id", type: "string" },
    { name: "order_id", type: "int" },
    { name: "units", type: "int" },
  ],
});
```
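Because the schema is plain JSON, it can be sanity-checked locally before registration. A minimal sketch; the `orders-value` subject name assumes the common `<topic>-value` naming convention rather than anything confirmed by this repo:

```typescript
// Rebuild the Order schema string and inspect it before registering.
const schemaString = JSON.stringify({
  type: "record",
  name: "Order",
  fields: [
    { name: "region", type: "string" },
    { name: "item_type", type: "string" },
    { name: "item_id", type: "string" },
    { name: "order_id", type: "int" },
    { name: "units", type: "int" },
  ],
});

interface AvroField { name: string; type: string }
const parsed = JSON.parse(schemaString) as { name: string; fields: AvroField[] };

const subject = "orders-value"; // assumed <topic>-value subject convention
console.log(subject, parsed.name, parsed.fields.map((f) => f.name).join(","));
```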
The project demonstrates both serialization for producers:
```typescript
const serializer = new AvroSerializer(
  RegistryClient,
  SerdeType.VALUE,
  avroSerializerConfig
);
```
And deserialization for consumers:
```typescript
const deserializer = new AvroDeserializer(RegistryClient, SerdeType.VALUE, {});
```
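Under the hood, Confluent serializers frame every message with a magic byte (`0x00`) and a 4-byte big-endian schema ID before the Avro payload. The sketch below illustrates that wire format only; JSON stands in for the real Avro binary encoding:

```typescript
// Confluent Schema Registry wire format: [magic 0x00][schema ID u32 BE][payload].
// JSON is used here as a stand-in for Avro binary bytes.
function frame(schemaId: number, payload: Record<string, unknown>): Buffer {
  const body = Buffer.from(JSON.stringify(payload));
  const header = Buffer.alloc(5);
  header.writeUInt8(0, 0);           // magic byte
  header.writeUInt32BE(schemaId, 1); // schema ID
  return Buffer.concat([header, body]);
}

function unframe(buf: Buffer): { schemaId: number; payload: Record<string, unknown> } {
  if (buf.readUInt8(0) !== 0) throw new Error("not Confluent wire format");
  return {
    schemaId: buf.readUInt32BE(1),
    payload: JSON.parse(buf.subarray(5).toString()),
  };
}

const msg = frame(42, { order_id: 1, region: "NA" });
const back = unframe(msg);
console.log(back.schemaId, JSON.stringify(back.payload));
```

This framing is why a consumer without a deserializer sees five bytes of "garbage" before the payload: the deserializer strips the header and looks the schema up by ID.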
Advanced topic configuration with retention policies:
```typescript
const configEntries: IResourceConfigEntry[] = [
  { name: "cleanup.policy", value: "delete" },
  { name: "retention.ms", value: "30000" }, // 30 seconds retention
];
```
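`retention.ms` takes milliseconds as a string, which makes it easy to mislabel. A small helper (hypothetical, not part of this repo) keeps the units explicit:

```typescript
// Unit helpers so retention values read unambiguously.
const seconds = (n: number): string => String(n * 1000);
const minutes = (n: number): string => seconds(n * 60);

interface IResourceConfigEntry { name: string; value: string }

const configEntries: IResourceConfigEntry[] = [
  { name: "cleanup.policy", value: "delete" },
  { name: "retention.ms", value: seconds(30) }, // 30 seconds retention
];

console.log(configEntries[1].value); // → "30000"
```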
The implementation includes proper error handling and graceful shutdown hooks:
```typescript
// Handle termination signals
process.on("SIGINT", shutdown);
process.on("SIGTERM", shutdown);
```
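A minimal sketch of an idempotent `shutdown` handler; the `producer` and `consumer` objects are stubs standing in for the real client instances, and the guard matters because both signals may fire:

```typescript
// Stub clients standing in for the real Kafka producer/consumer.
let disconnects = 0;
const consumer = { disconnect: async (): Promise<void> => { disconnects++; } };
const producer = { disconnect: async (): Promise<void> => { disconnects++; } };

let shuttingDown = false;
async function shutdown(): Promise<void> {
  if (shuttingDown) return; // SIGINT and SIGTERM may both arrive
  shuttingDown = true;
  await consumer.disconnect(); // stop consuming before the producer closes
  await producer.disconnect();
  console.log("shutdown complete");
}

process.on("SIGINT", shutdown);
process.on("SIGTERM", shutdown);
```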
- Strongly Typed: Complete TypeScript implementation
- Async/Await Patterns: Modern asynchronous code patterns
- Separation of Concerns: Modular code organization
- Graceful Error Handling: Proper error handling throughout
- Resource Management: Proper connection and disconnection
- Kafka JavaScript Client
- Introducing Confluent’s JavaScript Client for Apache Kafka®
- Building a Full-Stack Application With Kafka and Node.js
MrDay2Day
This project demonstrates proficiency with Confluent's JavaScript Client for Apache Kafka® (CJSK), a fully supported JavaScript client backed by Confluent with native support for Confluent's Governance products.