So I have the dependency in my pom:

    <dependency>
        <groupId>org.springframework.kafka</groupId>
        <artifactId>spring-kafka</artifactId>
    </dependency>
And I have it configured in application.yml:

    spring:
      kafka:
        bootstrap-servers: localhost:9092
        consumer:
          group-id: foo
          auto-offset-reset: earliest
          key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
          value-deserializer: org.springframework.kafka.support.serializer.JsonDeserializer
          properties:
            spring.json.value.default.type: java.lang.Object
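As a side note on what this config implies: when JsonDeserializer is given java.lang.Object as the default target type, Jackson has no concrete class to bind to, so a JSON object is materialized as a LinkedHashMap. A minimal sketch of that Jackson behavior (the JSON fields here are just illustrative):

```java
import com.fasterxml.jackson.databind.ObjectMapper;

public class JsonToObjectDemo {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        // With a target type of java.lang.Object, Jackson falls back to its
        // generic tree of Maps and Lists, so a JSON object becomes a LinkedHashMap.
        Object payload = mapper.readValue("{\"id\":0,\"driverName\":\"demo\"}", Object.class);
        System.out.println(payload.getClass()); // class java.util.LinkedHashMap
    }
}
```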
And finally, here is my listener code:

    @KafkaListener(topics = "videoEnrichedEvents")
    public void consume(@Payload VideoEnrichedEvents videoEnrichedEvents) {
        LOGGER.debug("Consumed message :" + videoEnrichedEvents);
        System.out.println("Consumed Message :" + videoEnrichedEvents);
    }
Since I have different topics and different consumers, I want the consumer configuration to be generic enough that I can read any object and then delegate it to a handler. In the error log I can see:

    Caused by: org.springframework.messaging.converter.MessageConversionException: Cannot handle message; nested exception is org.springframework.messaging.converter.MessageConversionException: Cannot convert from [java.util.LinkedHashMap] to [com.calamp.connect.vs.model.VideoEnrichedEvents] for GenericMessage [payload={anyotherjson={groups=null, id=0, driverName=from Kusum's console, deviceIdType=null, assetId=null, operatorId=null, avlEventTime=null, videoLink=null, tripId=null, avlEventUuid=null, deviceId=null, appMessageUuid=null, parentAccountList=null, appmsgEventTime=null, enrichedMessage=null, accountId=null}}, headers={kafka_offset=9, kafka_consumer=org.apache.kafka.clients.consumer.KafkaConsumer@18213932, kafka_timestampType=CREATE_TIME, kafka_receivedMessageKey=null, kafka_receivedPartitionId=0, kafka_receivedTopic=videoEnrichedEvents, kafka_receivedTimestamp=1590218109430}], failedMessage=GenericMessage [payload={anyotherjson={groups=null, id=0, driverName=from Kusum's console, deviceIdType=null, assetId=null, operatorId=null, avlEventTime=null, videoLink=null, tripId=null, avlEventUuid=null, deviceId=null, appMessageUuid=null, parentAccountList=null, appmsgEventTime=null, enrichedMessage=null, accountId=null}}, headers={kafka_offset=9, kafka_consumer=org.apache.kafka.clients.consumer.KafkaConsumer@18213932, kafka_timestampType=CREATE_TIME, kafka_receivedMessageKey=null, kafka_receivedPartitionId=0, kafka_receivedTopic=videoEnrichedEvents, kafka_receivedTimestamp=1590218109430}]
I looked it up and saw ConsumerRecord being used everywhere in place of LinkedHashMap, so my new code looks like this:

    @KafkaListener(topics = "videoEnrichedEvents")
    public void consume(@Payload ConsumerRecord consumerRecord) {
        LOGGER.debug("Consumed message!!! Full :" + consumerRecord);
        System.out.println("Consumed Message!!! Actual object :" + ((LinkedHashMap) consumerRecord.value()));
    }
Technically this can handle any object sent to me, so it serves my purpose. But my question is: why ConsumerRecord instead of LinkedHashMap? Is there a specific reason?
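For the "delegate to a handler" part of the setup above, one way a handler could recover a typed object from the LinkedHashMap that record.value() holds is Jackson's convertValue. This is only a sketch: the nested VideoEnrichedEvents class here is a simplified stand-in for the real model class, with hypothetical fields:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import com.fasterxml.jackson.databind.ObjectMapper;

public class DelegateDemo {
    // Simplified stand-in for the real event class (fields are hypothetical).
    public static class VideoEnrichedEvents {
        public long id;
        public String driverName;
    }

    public static void main(String[] args) {
        ObjectMapper mapper = new ObjectMapper();

        // Roughly what the generic listener receives as record.value():
        Map<String, Object> raw = new LinkedHashMap<>();
        raw.put("id", 7L);
        raw.put("driverName", "from Kusum's console");

        // A per-topic handler can rebuild the typed object from the map:
        VideoEnrichedEvents event = mapper.convertValue(raw, VideoEnrichedEvents.class);
        System.out.println(event.driverName);
    }
}
```

This keeps the listener itself generic while pushing the type decision into topic-specific handlers.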