Kafka Fix Unknown Magic Byte Errors
Kafka makes it easy to exchange messages between producers and consumers, but things can go wrong when the message format is not what the consumer expects. A common error, especially with Avro serialization, is the “Unknown magic byte” exception. This article explores what causes this error and how to fix it using Spring Kafka.
1. Understanding the “Unknown Magic Byte” Error
In Kafka, a magic byte (or magic number) is a special byte at the start of a serialized message that identifies its wire format, commonly used with Avro, Protobuf, or JSON Schema via the Confluent Schema Registry. When using KafkaAvroSerializer, the message begins with the magic byte (0x00), followed by a 4-byte schema ID and the Avro-encoded payload. This format requires the consumer to use a matching deserializer that understands it.
The “Unknown magic byte” error occurs when the consumer tries to read an Avro-encoded message using an incompatible deserializer, such as StringDeserializer or JsonDeserializer, leading to a deserialization failure.
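To make the framing concrete, here is a minimal standalone sketch (the class and method names are illustrative, not part of any Kafka library) that inspects the raw bytes of a record value and reports whether they follow the Confluent wire format:
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// Illustrative helper, not part of any Kafka library.
public class WireFormatProbe {

    // Confluent wire format: [magic byte 0x00][4-byte big-endian schema ID][payload]
    static void describe(byte[] value) {
        if (value == null || value.length < 5 || value[0] != 0x00) {
            System.out.println("Not Confluent wire format -> an Avro consumer throws 'Unknown magic byte!'");
            return;
        }
        int schemaId = ByteBuffer.wrap(value, 1, 4).getInt();
        System.out.println("Magic byte 0x00 present, schema ID = " + schemaId);
    }

    public static void main(String[] args) {
        describe("This is NOT Avro!".getBytes(StandardCharsets.UTF_8)); // plain string bytes
        describe(new byte[] {0x00, 0, 0, 0, 42, 0x02});                 // fabricated Avro-style frame
    }
}
A plain string's first byte is simply its first character, so the Avro deserializer rejects it immediately.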
2. Reproducing the “Unknown Magic Byte” Error
Maven Dependencies
Ensure that the project’s pom.xml includes the dependencies for Spring Kafka, Avro, and the Confluent kafka-avro-serializer, along with the Confluent Maven repository that hosts the Confluent artifacts.
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
</dependency>
<dependency>
    <groupId>org.apache.avro</groupId>
    <artifactId>avro</artifactId>
    <version>1.12.0</version>
</dependency>
<dependency>
    <groupId>io.confluent</groupId>
    <artifactId>kafka-avro-serializer</artifactId>
    <version>8.0.0</version>
</dependency>

<repositories>
    <repository>
        <id>confluent</id>
        <url>https://packages.confluent.io/maven/</url>
    </repository>
</repositories>
This configuration ensures the application has the Kafka client libraries and Confluent Avro serializer/deserializer support required to read Avro messages.
To reproduce the error, we simulate a mismatch where the producer sends messages using StringSerializer, while the consumer incorrectly expects Avro-formatted messages using KafkaAvroDeserializer.
Send a Plain String Message (No Magic Byte)
@Configuration
public class AvroProducerConfig {

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        // StringSerializer writes raw UTF-8 bytes: no magic byte, no schema ID
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return new DefaultKafkaProducerFactory<>(props);
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}
This configuration class sets up a Kafka producer to send plain string messages. It defines a ProducerFactory that connects to a Kafka broker at localhost:9092 and uses StringSerializer, meaning the messages will be sent as plain text.
@Component
public class AvroMessageSender implements CommandLineRunner {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public AvroMessageSender(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    @Override
    public void run(String... args) {
        String rawMessage = "This is NOT Avro!";
        kafkaTemplate.send("avro-topic", rawMessage);
        System.out.println("Produced raw message: " + rawMessage);
    }
}
This class sends a plain text (String) message to the Kafka topic avro-topic at application startup using KafkaTemplate.
Faulty Consumer to Trigger the Error
Now let’s simulate a misconfigured consumer that expects Avro-formatted messages instead of plain strings. Because the incoming bytes do not start with the expected magic byte, the deserializer will fail on the very first byte.
# application.yml (wrong config)
spring:
  kafka:
    bootstrap-servers: localhost:9092
    consumer:
      group-id: faulty-group
      auto-offset-reset: earliest
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
      properties:
        schema.registry.url: http://localhost:8081
        specific.avro.reader: true
@Component
public class FaultyListener {

    @KafkaListener(topics = "avro-topic", groupId = "faulty-group")
    public void listen(String message) {
        System.out.println("Received message: " + message);
    }
}
When you run the application, the KafkaTemplate sends a plain text message to the avro-topic, but the consumer attempts to deserialize it using KafkaAvroDeserializer. Since the message doesn’t begin with Avro’s expected magic byte (0x00), a deserialization error is thrown.
java.lang.IllegalStateException: This error handler cannot process 'SerializationException's directly; please consider configuring an 'ErrorHandlingDeserializer' in the value and/or key deserializer
    at org.springframework.kafka.listener.DefaultErrorHandler.handleOtherException(DefaultErrorHandler.java:192) ~[spring-kafka-3.3.7.jar:3.3.7]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:1335) ~[spring-kafka-3.3.7.jar:3.3.7]
    ... 2 common frames omitted
Caused by: org.apache.kafka.common.errors.SerializationException: Unknown magic byte!
    at io.confluent.kafka.serializers.AbstractKafkaSchemaSerDe.getByteBuffer(AbstractKafkaSchemaSerDe.java:649) ~[kafka-schema-serializer-7.9.1.jar:na]
    at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer$DeserializationContext.<init>(AbstractKafkaAvroDeserializer.java:389) ~[kafka-avro-serializer-7.9.1.jar:na]
    at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer.deserialize(AbstractKafkaAvroDeserializer.java:181) ~[kafka-avro-serializer-7.9.1.jar:na]
3. Fixing the Error with ErrorHandlingDeserializer
To fix the issue, configure the consumer to use Spring Kafka’s ErrorHandlingDeserializer, which wraps the actual Avro deserializer and handles any deserialization exceptions gracefully.
spring:
  kafka:
    consumer:
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
      properties:
        spring.deserializer.value.delegate.class: io.confluent.kafka.serializers.KafkaAvroDeserializer
        schema.registry.url: http://localhost:8081
With this configuration, ErrorHandlingDeserializer delegates to KafkaAvroDeserializer for well-formed Avro messages; when a record cannot be deserialized, it captures the exception and passes it to the container’s error handler instead of crashing the consumer.
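Handling bad records gracefully is only half of the fix; for valid end-to-end operation the producer must actually emit Confluent-framed Avro. A minimal sketch of such a producer factory (the bean name, GenericRecord value type, and registry URL are illustrative assumptions, not code from this article’s source project):
@Bean
public ProducerFactory<String, GenericRecord> avroProducerFactory() {
    Map<String, Object> props = new HashMap<>();
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    // KafkaAvroSerializer registers the schema and prefixes each value with
    // the magic byte (0x00) and the 4-byte schema ID
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);
    props.put("schema.registry.url", "http://localhost:8081");
    return new DefaultKafkaProducerFactory<>(props);
}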
4. Using DefaultErrorHandler and DeadLetterPublishingRecoverer
To prevent bad messages (also known as poison pills) from crashing the consumer, configure Spring Kafka to reroute them to a Dead Letter Topic (DLT).
@Bean
public ConcurrentKafkaListenerContainerFactory<String, Object> kafkaListenerContainerFactory(
        DefaultKafkaConsumerFactory<String, Object> consumerFactory,
        KafkaTemplate<String, String> kafkaTemplate) {
    ConcurrentKafkaListenerContainerFactory<String, Object> factory
            = new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory);
    // By default, failed records are routed to <original topic>.DLT on the same partition
    DeadLetterPublishingRecoverer recoverer = new DeadLetterPublishingRecoverer(kafkaTemplate);
    // Retry twice with no back-off, then hand the record to the recoverer
    DefaultErrorHandler errorHandler = new DefaultErrorHandler(recoverer, new FixedBackOff(0L, 2));
    // Never retry serialization problems; they fail the same way every time
    errorHandler.addNotRetryableExceptions(SerializationException.class);
    factory.setCommonErrorHandler(errorHandler);
    return factory;
}
This configuration retries each failed record up to two more times with no back-off and uses the DeadLetterPublishingRecoverer to route records that still fail to a dead-letter topic, by default the original topic name with a .DLT suffix (here, avro-topic.DLT), for later examination or reprocessing. It also marks SerializationException as not retryable, so records that can never deserialize are not retried pointlessly. One caveat: for a record that failed deserialization, the recoverer publishes the original raw bytes, so in practice the template it is given should use serializers that can handle byte[] values (for example ByteArraySerializer) rather than a String-only template.
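If the default <topic>.DLT naming does not fit, the recoverer also accepts a destination resolver. A minimal sketch, assuming a pre-created shared topic named poison-pills (the name is illustrative; TopicPartition comes from org.apache.kafka.common):
// Route every failed record to one shared dead-letter topic instead of <topic>.DLT.
// A negative partition tells the recoverer to let Kafka choose the partition.
DeadLetterPublishingRecoverer recoverer = new DeadLetterPublishingRecoverer(kafkaTemplate,
        (record, ex) -> new TopicPartition("poison-pills", -1));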
Now the listener can consume Avro messages without crashing on bad data: valid messages are processed, while poison pills are routed to the dead-letter topic.
@Component
public class AvroMessageListener {

    private static final Logger logger = Logger.getLogger(AvroMessageListener.class.getName());

    @KafkaListener(topics = "avro-topic", groupId = "avro-consumer-group")
    public void listen(GenericRecord record) {
        logger.log(Level.INFO, "Received Avro record: {0}", record);
    }
}
Handling the Dead-Letter Topic
To monitor or debug failed messages, set up a listener for the DLT:
@Component
public class DeadLetterListener {

    private static final Logger logger = Logger.getLogger(DeadLetterListener.class.getName());

    // The default recoverer publishes to the original topic name plus the ".DLT" suffix
    @KafkaListener(topics = "avro-topic.DLT", groupId = "avro-consumer-group")
    public void handleDlq(ConsumerRecord<String, byte[]> record) {
        logger.log(Level.WARNING, "Received poison pill from DLT: {0}", record);
    }
}
This allows the system to log, inspect, and possibly reprocess poison-pill messages later.
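In addition to the payload, DeadLetterPublishingRecoverer stamps diagnostic headers (original topic, exception class and message, and more) onto each dead-letter record. A brief sketch of reading two of them inside handleDlq(), using the KafkaHeaders constants from Spring Kafka (Header is org.apache.kafka.common.header.Header):
// Headers added by DeadLetterPublishingRecoverer; their values are raw bytes
Header originalTopic = record.headers().lastHeader(KafkaHeaders.DLT_ORIGINAL_TOPIC);
Header exceptionMessage = record.headers().lastHeader(KafkaHeaders.DLT_EXCEPTION_MESSAGE);
if (originalTopic != null && exceptionMessage != null) {
    logger.log(Level.WARNING, "Failed in topic {0}: {1}", new Object[] {
            new String(originalTopic.value(), StandardCharsets.UTF_8),
            new String(exceptionMessage.value(), StandardCharsets.UTF_8) });
}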
5. Conclusion
In this article, we examined how to resolve the “Unknown magic byte” error in Kafka when consuming Avro messages with Spring Boot. We covered the root cause, a mismatch between the producer’s serializer and the consumer’s deserializer, and showed how to handle it using ErrorHandlingDeserializer with KafkaAvroDeserializer as its delegate. Finally, we implemented a DefaultErrorHandler with a DeadLetterPublishingRecoverer to ensure failed records are redirected to a dead-letter topic, allowing consumers to continue processing without interruption.
6. Download the Source Code
This article covered how to fix “unknown magic byte” errors in Kafka.
You can download the full source code of this example here: kafka fix unknown magic byte errors