kafka+java

Author: 猴子小心粑 | Published 2016-09-13 17:08

Preface

In the previous two articles we connected the log -> logstash -> kafka pipeline and set up the related environment. With that groundwork in place, let's start consuming error logs from Kafka and sending alert emails.

Project setup

We will build two applications here:

  • kafka-spring: listens to the Kafka service, works out which system the error belongs to, and sends an email to that system's owner.
  • MessageDispatcher: delivers the actual messages. Only MAIL is supported for now; SMS and other channels will be added later.

kafka-spring

1. Project directory structure

(image: kafka-spring project directory structure)

2. Key files

The two files worth focusing on are applicationContext-consumer.xml and KafkaConsumerService.

  • applicationContext-consumer.xml
    This file wires Kafka into Spring; here is what it contains:
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:int="http://www.springframework.org/schema/integration"
       xmlns:int-kafka="http://www.springframework.org/schema/integration/kafka"
       xmlns:task="http://www.springframework.org/schema/task"
       xsi:schemaLocation="http://www.springframework.org/schema/integration/kafka http://www.springframework.org/schema/integration/kafka/spring-integration-kafka.xsd
      http://www.springframework.org/schema/integration http://www.springframework.org/schema/integration/spring-integration.xsd
      http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
      http://www.springframework.org/schema/task http://www.springframework.org/schema/task/spring-task.xsd">
    <int:channel id="inputFromKafka">
        <int:queue/>
    </int:channel>

    <!--<int:service-activator auto-startup="true"-->
    <!--input-channel="inputFromKafka" ref="kafkaConsumerService" method="receiveMessage">-->
    <!--</int:service-activator>-->
    <!-- Either the service-activator above or the adapter below works -->
    <!-- Use kafkaConsumerService to receive messages from Kafka -->
    <int:outbound-channel-adapter channel="inputFromKafka"
                                  ref="kafkaConsumerService" method="receiveMessage" auto-startup="true"/>

    <int:poller default="true" id="default" fixed-rate="5"
                time-unit="MILLISECONDS" max-messages-per-poll="5">
    </int:poller>
    <int-kafka:inbound-channel-adapter
            kafka-consumer-context-ref="consumerContext" channel="inputFromKafka">
    </int-kafka:inbound-channel-adapter>
    <bean id="consumerProperties" class="org.springframework.beans.factory.config.PropertiesFactoryBean">
        <property name="properties">
            <props>
                <prop key="auto.offset.reset">smallest</prop>
                <prop key="socket.receive.buffer.bytes">10485760</prop>
                <!-- 10M -->
                <prop key="fetch.message.max.bytes">5242880</prop>
                <prop key="auto.commit.interval.ms">1000</prop>
            </props>
        </property>
    </bean>
    <int-kafka:consumer-context id="consumerContext"
                                consumer-timeout="4000" zookeeper-connect="zookeeperConnect"
                                consumer-properties="consumerProperties">
        <int-kafka:consumer-configurations>
            <int-kafka:consumer-configuration
                    group-id="mygroup" max-messages="5000">
                <!-- This is the topic we created in Kafka earlier -->
                <int-kafka:topic id="kafkatopic" streams="4"/>
            </int-kafka:consumer-configuration>
        </int-kafka:consumer-configurations>
    </int-kafka:consumer-context>
    <!-- Configure the ZooKeeper address for your own environment -->
    <int-kafka:zookeeper-connect id="zookeeperConnect"
                                 zk-connect="192.168.1.120:2181" zk-connection-timeout="6000"
                                 zk-session-timeout="400" zk-sync-time="200"/>
</beans>
  • KafkaConsumerService.java
    Now let's look at how the receiveMessage method of KafkaConsumerService is implemented (a sketch of the surrounding class follows the snippet).
public void receiveMessage(HashMap map)
    {
        // The inbound adapter hands over a map of topic -> (partition -> list of raw message payloads)
        logger.info("received Messages from kafka ================" + map.size());
        Set<Map.Entry> set = map.entrySet();

        for (Map.Entry entry : set)
        {
            String topic = (String) entry.getKey();
            logger.info("Topic:" + topic);
            ConcurrentHashMap<Integer, List<byte[]>> messages = (ConcurrentHashMap<Integer, List<byte[]>>) entry
                    .getValue();
            Collection<List<byte[]>> values = messages.values();

            for (Iterator<List<byte[]>> iterator = values.iterator(); iterator.hasNext(); )
            {
                List<byte[]> list = iterator.next();
                for (byte[] object : list)
                {
                    // Deserialize the JSON log entry into our Message POJO
                    String message = new String(object);
                    Message msg = JSON.parseObject(message.replace("@", ""), Message.class);
                    // Look up the recipient address for the system that produced the error
                    String address = addressStore.pick(msg);
                    Map<String, String> request = new HashMap<String, String>();
                    request.put("messageType", Constants.MESSAGE_TYPE_MAIL);
                    request.put("address", address);
                    request.put("content", msg.toString());

                    try
                    {
                        // Call MessageDispatcher's mail-sending endpoint over HTTP
                        HttpClientUtil.postParameters(postUrl, request);
                    }
                    catch (Exception e)
                    {
                        logger.error("HttpClientUtil postParameters exception :" + e.getMessage());
                    }
                }
            }
        }
    }
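
The snippet above relies on several collaborators that are not shown in the article: the Message POJO the JSON log entries are parsed into, the addressStore that maps a message to a recipient, the postUrl pointing at MessageDispatcher, and the HttpClientUtil/Constants helpers. The skeleton below is only a rough sketch of how the class around receiveMessage might be put together; the @Service registration, the AddressStore type, and the dispatcher.post.url property name are assumptions, not the article's actual code.

import java.util.HashMap;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Service;

// Hypothetical skeleton: the bean name must match ref="kafkaConsumerService"
// in applicationContext-consumer.xml (registered here via component scan; an
// explicit <bean> element works just as well).
@Service("kafkaConsumerService")
public class KafkaConsumerService
{
    private final Logger logger = LoggerFactory.getLogger(KafkaConsumerService.class);

    // Assumed collaborator that maps a parsed Message to the owning system's email address
    @Autowired
    private AddressStore addressStore;

    // URL of MessageDispatcher's /api/sendMessage endpoint; the property name is hypothetical
    @Value("${dispatcher.post.url}")
    private String postUrl;

    public void receiveMessage(HashMap map)
    {
        // ... body as shown above ...
    }
}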

3. Key dependencies

We use the Kafka connector from the Spring Integration project:

<dependency>
      <groupId>org.springframework.integration</groupId>
      <artifactId>spring-integration-kafka</artifactId>
      <version>1.1.0.RELEASE</version>
</dependency>

I won't list the remaining dependencies one by one; they are easy enough to sort out. That covers the configuration and Java code kafka-spring needs to receive messages from Kafka; any further functionality can simply be built on top of this.

MessageDispatcher

1. Directory structure

(image: MessageDispatcher project directory structure)

A typical Spring MVC project, used to dispatch messages.

2. Key files

The top-level interface for sending messages is MessageSender; each message type just implements its sendMessage method according to its own needs (a sample implementation is sketched after the interface below).

package com.allinpay.message;

/**
 * Message sending interface.
 * Created by gejunqing on 16/9/8.
 */
public interface MessageSender
{
    void sendMessage(String address, String content);
}
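
As an illustration, here is a minimal sketch of what a MAIL implementation could look like. It is not the article's actual code: the class name MailMessageSender, the use of Spring's JavaMailSender, and the subject line are assumptions. The bean name passed to @Service must equal the messageType value callers send (Constants.MESSAGE_TYPE_MAIL on the kafka-spring side), because MessageController resolves the sender by that name.

package com.allinpay.message;

import com.allinpay.base.Constants;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.mail.SimpleMailMessage;
import org.springframework.mail.javamail.JavaMailSender;
import org.springframework.stereotype.Service;

// Hypothetical MAIL implementation; registered under the messageType name
// so that SpringContextHolder.getBean(messageType) can find it.
@Service(Constants.MESSAGE_TYPE_MAIL)
public class MailMessageSender implements MessageSender
{
    // Spring's mail abstraction; assumes a JavaMailSender bean (SMTP host,
    // credentials, etc.) is configured elsewhere in the application context.
    @Autowired
    private JavaMailSender mailSender;

    @Override
    public void sendMessage(String address, String content)
    {
        SimpleMailMessage mail = new SimpleMailMessage();
        mail.setTo(address);
        mail.setSubject("Error log alert");   // subject is an assumption
        mail.setText(content);
        mailSender.send(mail);
    }
}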

Now let's look at the MVC entry point, MessageController:

package com.allinpay.mvc;

import com.allinpay.base.Constants;
import com.allinpay.base.MessageRequest;
import com.allinpay.base.SpringContextHolder;
import com.allinpay.message.MessageSender;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Controller;
import org.springframework.ui.ModelMap;
import org.springframework.web.bind.annotation.ModelAttribute;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.servlet.ModelAndView;
import org.springframework.web.servlet.view.json.MappingJackson2JsonView;

import java.util.HashMap;
import java.util.Map;

@Controller
@RequestMapping("/api")
public class MessageController
{
    private Logger logger = LoggerFactory.getLogger(MessageController.class);


    @RequestMapping(value = "/sendMessage", method = RequestMethod.POST)
    public ModelAndView sendMessage(@ModelAttribute("request") MessageRequest request, ModelMap model)
    {
        logger.info("sendMessage {}", request);
        Map<String, String> result = new HashMap<String, String>();
        try
        {
            // Resolve the MessageSender implementation whose bean name matches the requested messageType
            MessageSender sender = SpringContextHolder.getBean(request.getMessageType());
            sender.sendMessage(request.getAddress(), request.getContent());
            result.put("retCode", Constants.RET_CODE_SUCCESS);
        }
        catch (Exception e)
        {
            logger.error(e.getMessage());
            result.put("retCode", Constants.RET_CODE_SYTEMERROR);
            result.put("retMsg", e.getMessage());
        }
        // Render the result map as a JSON response
        return new ModelAndView(new MappingJackson2JsonView(), result);
    }
}
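
MessageController resolves senders through SpringContextHolder, a small helper that is imported from com.allinpay.base but not shown in the article. A common way to implement such a holder is via ApplicationContextAware; the sketch below is an assumption about how it might look, not the article's actual code.

package com.allinpay.base;

import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.stereotype.Component;

// Hypothetical helper: captures the ApplicationContext so beans can be looked
// up by name, e.g. SpringContextHolder.getBean("MAIL").
@Component
public class SpringContextHolder implements ApplicationContextAware
{
    private static ApplicationContext context;

    @Override
    public void setApplicationContext(ApplicationContext applicationContext) throws BeansException
    {
        context = applicationContext;
    }

    @SuppressWarnings("unchecked")
    public static <T> T getBean(String name)
    {
        return (T) context.getBean(name);
    }
}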

Other systems can invoke the message-sending feature simply by issuing an HTTP POST to /api/sendMessage, as sketched below.
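
Because the controller binds a form-backed MessageRequest via @ModelAttribute, callers just post ordinary form parameters. Here is a minimal sketch of such a call using java.net.HttpURLConnection; the host, port, context path, and the "MAIL" messageType value are placeholders (the real value must equal Constants.MESSAGE_TYPE_MAIL), and the parameter names assume MessageRequest exposes messageType, address and content, matching what kafka-spring sends.

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;

public class SendMessageExample
{
    public static void main(String[] args) throws Exception
    {
        // Placeholder URL; point this at your MessageDispatcher deployment
        URL url = new URL("http://localhost:8080/MessageDispatcher/api/sendMessage");

        // Form parameters bound to MessageRequest by @ModelAttribute
        String body = "messageType=" + URLEncoder.encode("MAIL", "UTF-8")
                + "&address=" + URLEncoder.encode("owner@example.com", "UTF-8")
                + "&content=" + URLEncoder.encode("NullPointerException in order-service", "UTF-8");

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");

        OutputStream out = conn.getOutputStream();
        out.write(body.getBytes("UTF-8"));
        out.close();

        // The controller responds with a JSON body such as {"retCode":"..."}
        System.out.println("HTTP status: " + conn.getResponseCode());
        conn.disconnect();
    }
}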

Summary

At this point, the entire flow of monitoring the logs and sending alert emails is complete.
