Apache Camel - Using Retry and a Hystrix Circuit Breaker Together


I am trying to configure a route that uses both a Hystrix circuit breaker and delivery retries. The circuit breaker works correctly and trips, but when the breaker's fallback fires to reject subsequent calls, no retries occur.

Here is the route configuration with the circuit breaker and the retry policy:

  // Route: DES Evaluation Engine -> DES Service Bus -> VSA and KMS Data Collectors
  from("activemq:" + amqBrokerConfig.getDesDataCollectionRequestsOutQueue() + jmsFromEndpointOptions)
     .onException(Exception.class)
        .retryWhile(deliveryRetryRuleset)
        .maximumRedeliveries(serviceBusConfig.getDesMaxRedeliveries())
        .maximumRedeliveryDelay(serviceBusConfig.getDesMaxRedeliveryDelayMs())
        .redeliveryDelay(serviceBusConfig.getDesInitialRedeliveryDelayMs())
        .end()
     .log("Relaying messages to Luma/VSA & Luma/KMS endpoints:: Sending ${body} with correlation key ${header.CorrelationId}")
     // Splits the message to be sent to different endpoints
     .split().method(ServiceCommandSplitter.class, "splitCommands")
     // Use the assembler to aggregate facts received from each individual endpoints i.e. Luma/VSA & Luma/KMS
     .aggregationStrategy(new FactsAssembler())
     // Sets a total timeout to breakout from multicast if replies aren't received within the stipulated time
     .timeout(serviceBusConfig.getDesMulticastProcessingTimeoutMs())
     // Sending of message to the multiple endpoints must occur concurrently
     .parallelProcessing()
     .log("Splitted message: ${body}")
     // Encapsulate operation within a circuit breaker and monitor it for failures
     .circuitBreaker().hystrixConfiguration(hystrixConfig)
        // Identify the endpoint
        .choice()
           // Forward the message to Luma/VSA Data Collector
           .when(jsonpath("$.commandName").isEqualTo(CommandName.GET_INTENT_REPRESENTATION.getName()))
              .to("activemq:" + amqBrokerConfig.getVsaDataCollectionRequestsInQueue() + toOptions)
           // Forward the message to Luma/KMS Data Collector
           .when(jsonpath("$.commandName").isEqualTo(CommandName.SEARCH_KNOWLEDGE_BASE.getName()))
              .to("activemq:" + amqBrokerConfig.getKmsDataCollectionRequestsInQueue() + toOptions)
     .endCircuitBreaker()
     // The fallback route path to execute that does not go over the network.
     .onFallback()
        // Add the custom processor
        .process(new DataCollectionFallbackProcessor())
     // Marks the end of the current block and returns to the circuitBreaker()
     .end();
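For context, my understanding (which may be wrong) is that once `onFallback()` handles the `HystrixTimeoutException`, the exchange completes successfully, so the `onException` clause never sees a failure and the redelivery policy is never consulted. Below is a minimal sketch of the variant I would expect to redeliver, with the fallback removed so the exception can propagate to the error handler. The queue names and retry values here are placeholders, not my real configuration:

```java
// Minimal sketch (assumption): without onFallback(), a HystrixTimeoutException
// raised inside circuitBreaker() should propagate out of the block and be
// caught by onException, triggering the redelivery policy.
from("activemq:MQ.DataCollectionRequests.out")
   .onException(Exception.class)
      .maximumRedeliveries(3)          // placeholder values
      .redeliveryDelay(1000)
      .maximumRedeliveryDelay(10000)
   .end()
   .circuitBreaker().hystrixConfiguration(hystrixConfig)
      .to("activemq:MQ.VSADataCollectionRequests.in")
   .endCircuitBreaker();
```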
Can anyone suggest what I might be missing that prevents the retries from being triggered when an exception occurs?

Here is the log trace confirming that the circuit breaker functionality is triggered:

29-Sep-2020 06:49:57,615 | DEBUG | DESserviceBus | Darpan-Laptop |  |  | 225 |  | o.a.c.c.j.JmsConfiguration | Sending JMS message to: queue://MQ.VSADataCollectionRequests.in with message: ActiveMQTextMessage {commandId = 0, responseRequired = false, messageId = null, originalDestination = null, originalTransactionId = null, producerId = null, destination = null, transactionId = null, expiration = 1601362198609, timestamp = 0, arrival = 0, brokerInTime = 0, brokerOutTime = 0, correlationId = Camel-ID-Darpan-Laptop-1601362137830-0-60, replyTo = temp-queue://ID:Darpan-Laptop-55243-1601362137447-13:1:1, persistent = true, type = null, priority = 4, groupID = null, groupSequence = 0, targetConsumerId = null, compressed = false, userID = null, content = null, marshalledProperties = null, dataStructure = null, redeliveryCounter = 0, size = 0, properties = {TenantExternalId=48939482-840a-4c0a-8df1-437c27208eff, ContactId=-1}, readOnlyProperties = false, readOnlyBody = false, droppable = false, jmsXGroupFirstForConsumer = false, text = {"serviceId":"DESserviceBus","requestId":"Dar...ctId":-1}}}}}
29-Sep-2020 06:49:57,615 | DEBUG | DESserviceBus | Darpan-Laptop |  |  | 224 |  | o.a.c.c.j.JmsConfiguration | Sending JMS message to: queue://MQ.KMSDataCollectionRequests.in with message: ActiveMQTextMessage {commandId = 0, responseRequired = false, messageId = null, originalDestination = null, originalTransactionId = null, producerId = null, destination = null, transactionId = null, expiration = 1601362198609, timestamp = 0, arrival = 0, brokerInTime = 0, brokerOutTime = 0, correlationId = Camel-ID-Darpan-Laptop-1601362137830-0-61, replyTo = temp-queue://ID:Darpan-Laptop-55243-1601362137447-14:1:1, persistent = true, type = null, priority = 4, groupID = null, groupSequence = 0, targetConsumerId = null, compressed = false, userID = null, content = null, marshalledProperties = null, dataStructure = null, redeliveryCounter = 0, size = 0, properties = {TenantExternalId=51a014bd-cb6c-44d1-b44f-43ea87d9d2ac, ContactId=-1}, readOnlyProperties = false, readOnlyBody = false, droppable = false, jmsXGroupFirstForConsumer = false, text = {"serviceId":"DESserviceBus","requestId":"Dar...ctId":-1}}}}}
29-Sep-2020 06:50:00,615 | WARN  | DESserviceBus | Darpan-Laptop |  |  | 231 |  | c.s.d.e.c.FactsAssembler | Parallel processing timed out after 3000 millis for number 0. This task will be cancelled and will not be aggregated.
29-Sep-2020 06:50:00,615 | DEBUG | DESserviceBus | Darpan-Laptop |  |  | 223 |  | o.a.c.c.h.p.HystrixProcessorCommand | Error occurred processing. Will now run fallback. Exception class: com.netflix.hystrix.exception.HystrixTimeoutException message: null.
29-Sep-2020 06:50:00,615 | DEBUG | DESserviceBus | Darpan-Laptop |  |  | 223 |  | o.a.c.c.h.p.HystrixProcessorCommand | Running fallback: Channel[DelegateSync[com.sa.des.esb.components.DataCollectionFallbackProcessor@7ed8b44]] with exchange: Exchange[ID-Darpan-Laptop-1601362137830-0-59]
29-Sep-2020 06:50:00,615 | DEBUG | DESserviceBus | Darpan-Laptop |  |  | 222 |  | o.a.c.c.h.p.HystrixProcessorCommand | Error occurred processing. Will now run fallback. Exception class: com.netflix.hystrix.exception.HystrixTimeoutException message: null.
29-Sep-2020 06:50:00,615 | DEBUG | DESserviceBus | Darpan-Laptop |  |  | 225 |  | o.a.c.c.h.p.HystrixProcessorCommand | Exiting run command due to a hystrix execution timeout in processing exchange: Exchange[ID-Darpan-Laptop-1601362137830-0-58]
29-Sep-2020 06:50:00,615 | DEBUG | DESserviceBus | Darpan-Laptop |  |  | 224 |  | o.a.c.c.h.p.HystrixProcessorCommand | Exiting run command due to a hystrix execution timeout in processing exchange: Exchange[ID-Darpan-Laptop-1601362137830-0-59]
29-Sep-2020 06:50:00,615 | WARN  | DESserviceBus | Darpan-Laptop |  |  | 231 |  | c.s.d.e.c.FactsAssembler | Parallel processing timed out after 3000 millis for number 1. This task will be cancelled and will not be aggregated.
29-Sep-2020 06:50:00,615 | DEBUG | DESserviceBus | Darpan-Laptop |  |  | 222 |  | o.a.c.c.h.p.HystrixProcessorCommand | Running fallback: Channel[DelegateSync[com.sa.des.esb.components.DataCollectionFallbackProcessor@7ed8b44]] with exchange: Exchange[ID-Darpan-Laptop-1601362137830-0-58]
29-Sep-2020 06:50:00,616 | DEBUG | DESserviceBus | Darpan-Laptop |  |  | 231 |  | c.s.d.e.c.FactsAssembler | Assembling and aggregating all the facts collected using aggregated exchange: 
29-Sep-2020 06:50:00,616 | DEBUG | DESserviceBus | Darpan-Laptop |  |  | 222 |  | c.s.d.e.c.DataCollectionFallbackProcessor | Executing fallback processing for data collection message routing.
29-Sep-2020 06:50:00,615 | DEBUG | DESserviceBus | Darpan-Laptop |  |  | 223 |  | c.s.d.e.c.DataCollectionFallbackProcessor | Executing fallback processing for data collection message routing.
29-Sep-2020 06:50:00,616 | DEBUG | DESserviceBus | Darpan-Laptop |  |  | 222 |  | o.a.c.c.h.p.HystrixProcessorCommand | Running fallback: Channel[DelegateSync[com.sa.des.esb.components.DataCollectionFallbackProcessor@7ed8b44]] with exchange: Exchange[ID-Darpan-Laptop-1601362137830-0-58] done
29-Sep-2020 06:50:00,616 | DEBUG | DESserviceBus | Darpan-Laptop |  |  | 223 |  | o.a.c.c.h.p.HystrixProcessorCommand | Running fallback: Channel[DelegateSync[com.sa.des.esb.components.DataCollectionFallbackProcessor@7ed8b44]] with exchange: Exchange[ID-Darpan-Laptop-1601362137830-0-59] done
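As the traces show, `HystrixProcessorCommand` runs the fallback after the timeout, and the exchange then completes without error, so nothing reaches `onException`. One idea I am considering (purely a sketch, not something the Camel docs prescribe) is a variant of `DataCollectionFallbackProcessor` that rethrows the caught exception so the error handler can apply the redelivery rules:

```java
import org.apache.camel.Exchange;
import org.apache.camel.Processor;

// Hypothetical variant of DataCollectionFallbackProcessor: instead of
// swallowing the failure, rethrow the exception captured on the exchange
// so Camel's error handler (and the onException redelivery policy) sees it.
public class RethrowingFallbackProcessor implements Processor {
    @Override
    public void process(Exchange exchange) throws Exception {
        Exception cause =
            exchange.getProperty(Exchange.EXCEPTION_CAUGHT, Exception.class);
        if (cause != null) {
            throw cause; // propagate so retryWhile/maximumRedeliveries can apply
        }
    }
}
```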