Make it easier to configure the JsonDeserializer ObjectMapper without losing the JsonDeserializer configuration loaded from application.yaml #1703
I think this one has to be handled on the Spring Boot side: when we have a property like ... I mean, the issue must go to Spring Boot, but that's my view of how it should be. |
```java
ConsumerFactory<String, Object> kafkaConsumerFactory(
        KafkaProperties properties, ObjectMapper objectMapper) {
```

The Boot team doesn't consider ... The preferred mechanism is to use something like this...

```java
@Component
class Customizer {

    Customizer(DefaultKafkaConsumerFactory<?, ?> factory, ObjectMapper mapper) {
        factory.setValueDeserializer(...);
    }

}
```

i.e. customize the auto-configured factory. But yes, auto-configuration of the serializer/deserializer would have to be done in Boot. |
In spring-cloud-stream we added this customizer - the binder detects if one of these beans is present and calls it. Perhaps Boot could do something similar. |
Can you point to the right documentation for this? Can't find how to actually get this to work. |
"Can't find how to actually get this to work" is not very helpful; you should state what issues you saw. Turns out I was wrong: Boot wires the CF as a `ConsumerFactory`, not a `DefaultKafkaConsumerFactory` (which is a bit surprising, because they generally prefer to narrow types as much as possible). This is not currently documented, but it's just normal Spring bean wiring. This works...

```java
@SpringBootApplication
public class Kgh1703Application {

    public static void main(String[] args) {
        SpringApplication.run(Kgh1703Application.class, args);
    }

    @KafkaListener(id = "kgh1703", topics = "kgh1703")
    public void listen(String in) {
        System.out.println(in);
    }

}

@Configuration
class CustomizeIt {

    CustomizeIt(ConsumerFactory<Object, Object> cf, ObjectMapper mapper) {
        ((DefaultKafkaConsumerFactory<Object, Object>) cf).setValueDeserializer(new JsonDeserializer<>(mapper));
    }

}
```
|
See my comment in the mentioned SO thread:
The main point of that question is to keep as much configuration as possible in the YAML file and let Spring Boot auto-wire everything properly. |
I don't see much value in configuring the deserializer via properties when you can create it directly. Such properties would rarely change, and are typically set via properties only because Kafka instantiates the deserializer itself. The Boot team discourages direct use of ... |
Well, this issue is really just an outcome from that SO thread discussion... We probably just need to move this issue to Spring Boot since there is really nothing more to do from this project perspective. |
Correct me if I am wrong, but this still doesn't seem to solve the original problem. The ... |
That is correct.
|
Well, that's quite a subjective opinion. If it is not supposed to be used that way, then why provide it in the first place? It is going to be confusing for another developer (or even myself, after a few months) to set a property and find it doesn't work, because somewhere hidden in the code a new bean is instantiating a new deserializer without considering any of the other properties that should apply. So essentially the properties listed here will be ignored: https://docs.spring.io/spring-kafka/api/constant-values.html. Yes, these properties rarely change, but sometimes they do. Case in point: this whole thread started because I noticed that my application wasn't deserializing the right timezones, so I naively tried to just set ... |
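As background on the timezone symptom mentioned above (this context is not from the thread itself): the usual culprit is Jackson's `ADJUST_DATES_TO_CONTEXT_TIME_ZONE` deserialization feature, which by default normalizes an explicit offset in the payload to the context time zone. A minimal stdlib-only sketch of the "retain the offset" behavior the commenter wanted, using `java.time` alone (no Jackson involved, so the offset survives):

```java
import java.time.OffsetDateTime;
import java.time.ZoneOffset;

public class OffsetRetentionDemo {

    public static void main(String[] args) {
        // An ISO-8601 timestamp carrying an explicit +02:00 offset
        OffsetDateTime parsed = OffsetDateTime.parse("2021-02-05T10:15:30+02:00");

        // java.time retains the offset exactly as parsed; the "wrong
        // timezone" symptom only appears when a deserializer re-adjusts
        // the value to the context time zone afterwards.
        System.out.println(parsed.getOffset().equals(ZoneOffset.ofHours(2))); // prints true
    }

}
```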
Fair point. I don't personally agree with the Boot position (that ...). This should work for you (and comply with Boot's position)...

```java
@Configuration
class CustomizeIt {

    CustomizeIt(ConsumerFactory<Object, Object> cf, ObjectMapper mapper) {
        JsonDeserializer<Object> valueDeserializer = new JsonDeserializer<>(mapper);
        valueDeserializer.configure(cf.getConfigurationProperties(), false);
        ((DefaultKafkaConsumerFactory<Object, Object>) cf).setValueDeserializer(valueDeserializer);
    }

}
```
|
In a Spring Cloud Stream (Boot) application I've tried both the ...

```java
@Configuration
class KafkaClientConfig {

    KafkaClientConfig(ConsumerFactory<Object, Object> consumerFactory, ObjectMapper mapper) {
        JsonDeserializer<Object> valueDeserializer = new JsonDeserializer<>(mapper);
        valueDeserializer.configure(consumerFactory.getConfigurationProperties(), false);
        ((DefaultKafkaConsumerFactory<Object, Object>) consumerFactory).setValueDeserializer(valueDeserializer);
    }

    @Bean
    public ClientFactoryCustomizer kafkaClientCustomizer(ObjectMapper mapper) {
        return new ClientFactoryCustomizer() {

            @Override
            public void configure(ConsumerFactory<?, ?> cf) {
                Deserializer<Object> valueDeserializer = new JsonDeserializer<>(mapper);
                valueDeserializer.configure(cf.getConfigurationProperties(), false);
                ((DefaultKafkaConsumerFactory<Object, Object>) cf).setValueDeserializer(valueDeserializer);
            }

        };
    }

}
```

Based on another discussion thread, I also defined the following bean (an org.apache.kafka class), which is also being called but does not work, either.

```java
@Bean
public Deserializer<?> kafkaDeserializer(ObjectMapper objectMapper) {
    return new JsonDeserializer<>(objectMapper);
}
```

None of these resolves the deserialization errors (which are solved by my custom mix-in that I add to the app context's ... |
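The commenter's mix-in fix itself is not shown in the thread. For readers unfamiliar with the technique, here is a hedged sketch of what registering a Jackson mix-in on the application's ObjectMapper can look like; the class and accessor names (`EventPayload`, `getTimestamp`) are purely hypothetical, and only the Jackson `addMixIn` / `@JsonFormat` APIs are assumed:

```java
@Configuration
class ObjectMapperMixInConfig {

    ObjectMapperMixInConfig(ObjectMapper mapper) {
        // Register a mix-in so Jackson applies the mix-in's annotations
        // to the target class without modifying the target class itself.
        // EventPayload and EventPayloadMixIn are illustrative names only.
        mapper.addMixIn(EventPayload.class, EventPayloadMixIn.class);
    }

    abstract static class EventPayloadMixIn {

        // Keep the offset exactly as written in the JSON payload instead
        // of adjusting it to the context time zone
        @JsonFormat(without = JsonFormat.Feature.ADJUST_DATES_TO_CONTEXT_TIME_ZONE)
        abstract OffsetDateTime getTimestamp();

    }

}
```

Note this only helps if the JsonDeserializer actually uses that ObjectMapper, which is the crux of this issue.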
Don't comment on old, closed issues; the ...

```java
@SpringBootApplication
public class Kgh17031Application {

    public static void main(String[] args) {
        SpringApplication.run(Kgh17031Application.class, args);
    }

    @Bean
    Consumer<String> input() {
        return System.out::println;
    }

    @Bean
    ClientFactoryCustomizer cust() {
        return new ClientFactoryCustomizer() {

            @Override
            public void configure(ConsumerFactory<?, ?> cf) {
                System.out.println("here");
            }

        };
    }

}
```

If you can't figure out what's wrong, I suggest you ask a question on Stack Overflow, showing a Minimal, Complete, Reproducible Example that exhibits the behavior you see. If you can prove it to be a bug, open an issue against the binder, not here. |
@garyrussell what are the preconditions for a default ConsumerFactory bean to be initialised, following the spring.kafka.* auto-config? No matter what I try, I'm getting ... ; on the contrary, the sample code: ... |
Don't ask questions in closed issues; use the ... See ... If you run with ... If you can't figure it out, start a discussion and provide a minimal, complete, reproducible example. |
Currently it is very cumbersome to configure the ObjectMapper of the JsonDeserializer. One would incorrectly assume that the ObjectMapper configuration set through application.yaml, such as the following, would work:

But it doesn't: for some reason the JsonDeserializer uses its own ObjectMapper, ignoring the one configured in the rest of the application. More details seem to be here: #680

The only alternative here, as indicated in the documentation, is to define your own JsonDeserializer bean, like this:

However, this means that the JsonDeserializer configuration passed through the application.yaml is now lost.

After lots of searching and seeking help here: https://stackoverflow.com/questions/66061869/how-to-make-spring-kafka-jsondeserializer-retain-the-timezone-offset-when-deseri/ and thanks to @artembilan, the only solution to this is to do something like the following:
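The snippet originally attached here did not survive extraction. As a sketch only, assuming it matches the approach @artembilan and @garyrussell outlined earlier in this thread (build a JsonDeserializer from the context's ObjectMapper, re-apply the consumer factory's properties, and set it on the downcast DefaultKafkaConsumerFactory):

```java
@Configuration
class CustomizeIt {

    CustomizeIt(ConsumerFactory<Object, Object> cf, ObjectMapper mapper) {
        // Use the application context's ObjectMapper...
        JsonDeserializer<Object> valueDeserializer = new JsonDeserializer<>(mapper);
        // ...while still honoring the spring.json.* properties from application.yaml
        valueDeserializer.configure(cf.getConfigurationProperties(), false);
        ((DefaultKafkaConsumerFactory<Object, Object>) cf).setValueDeserializer(valueDeserializer);
    }

}
```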
For some reason the following typesafe way does not work either, which makes things even more confusing:

This is not intuitive at all, and difficult to come up with on your own without knowing the internals (apart from the ugly Object generic type).

If it is not possible to use the same ObjectMapper used in the Spring context, maybe the right properties for the ObjectMapper can be passed through spring.kafka.consumer.properties, just like spring.json.value.default.type.