
How Flink Reads Data Sources

This article shows how Flink reads from different data sources. The content is concise and clearly organized; I hope it helps resolve your questions. Let's work through "How Flink Reads Data Sources" together.


Reading from a collection
    private static void readFromCollection(String[] args) throws Exception {
        // parse the command-line arguments into a parameter object
        MultipleParameterTool params = MultipleParameterTool.fromArgs(args);
        // batch execution environment (not used here):
        // ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
        // create the streaming execution environment
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        // set the default parallelism of every operator
        // (in a local test environment this defaults to the number of CPU cores)
        env.setParallelism(2);
        // set the maximum parallelism
        env.setMaxParallelism(6);

        // read from a collection
        List<String> collectionData = Arrays.asList("a", "b", "c", "d");
        DataStreamSource<String> dataStreamSource = env.fromCollection(collectionData);
        // read from individual elements:
        // env.fromElements("a", "b", "c", "d");
        dataStreamSource.print(); // equivalent to dataStreamSource.addSink(new PrintSinkFunction<>());

        env.execute();
    }
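MultipleParameterTool.fromArgs turns "--key value" pairs on the command line into a lookup object. As a rough, hypothetical sketch of that parsing (the real Flink class also handles repeated keys and valueless flags, which this toy version omits):

```java
import java.util.HashMap;
import java.util.Map;

public class ArgsSketch {
    // Simplified stand-in for MultipleParameterTool.fromArgs:
    // pair each "--key" with the argument that follows it.
    static Map<String, String> fromArgs(String[] args) {
        Map<String, String> m = new HashMap<>();
        for (int i = 0; i + 1 < args.length; i += 2) {
            if (args[i].startsWith("--")) {
                m.put(args[i].substring(2), args[i + 1]);
            }
        }
        return m;
    }

    public static void main(String[] args) {
        Map<String, String> params = fromArgs(new String[]{"--input", "word.txt"});
        System.out.println(params.get("input"));
    }
}
```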
Reading from a file
    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
    DataStreamSource<String> dataStreamSource = env.readTextFile("E:\\GIT\\flink-learn\\flink1\\word.txt", "utf-8");
    dataStreamSource.print();
    env.execute();
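readTextFile reads the file once, decodes it with the given charset, and emits each line as one record. Stripped of the Flink types, the behavior looks roughly like this plain-Java sketch (it writes its own temporary file rather than using the word.txt path above):

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;
import java.util.List;

public class ReadLinesSketch {
    // Conceptual stand-in for readTextFile: decode the file, emit each line.
    static List<String> readLines(Path file) throws IOException {
        return Files.readAllLines(file, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("word", ".txt");
        Files.write(tmp, Arrays.asList("hello flink", "hello kafka"));
        // each line becomes one element of the "stream"
        readLines(tmp).forEach(System.out::println);
        Files.delete(tmp);
    }
}
```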
Reading from Kafka
    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

    Properties properties = new Properties();
    properties.put("bootstrap.servers", "10.1.5.130:9092");
    properties.put("zookeeper.connect", "10.2.5.135:2181");
    properties.put("group.id", "my-flink");
    properties.put("auto.offset.reset", "latest");
    properties.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
    properties.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

    FlinkKafkaConsumer010<String> kafkaConsumer010 = new FlinkKafkaConsumer010<>(
            "flink",                  // topic
            new SimpleStringSchema(), // deserialize each record as a plain string
            properties
    );
    DataStreamSource<String> dataStreamSource = env.addSource(kafkaConsumer010);
    dataStreamSource.print();
    env.execute();
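The consumer settings above can be bundled into a small helper; this is just a hypothetical convenience method, not a Flink or Kafka API. Note that auto.offset.reset set to "latest" means the consumer starts from the newest records when the group has no committed offset:

```java
import java.util.Properties;

public class KafkaProps {
    // Hypothetical helper: bundles the consumer settings used in the article.
    static Properties consumerProps(String brokers, String groupId) {
        Properties p = new Properties();
        p.put("bootstrap.servers", brokers);
        p.put("group.id", groupId);
        // start from the newest offset when the group has no committed offset
        p.put("auto.offset.reset", "latest");
        p.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        p.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        return p;
    }

    public static void main(String[] args) {
        Properties p = consumerProps("10.1.5.130:9092", "my-flink");
        System.out.println(p.getProperty("group.id"));
    }
}
```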
Reading from a custom Source
  • Implement org.apache.flink.streaming.api.functions.source.SourceFunction

    public static final class MyDataSource implements SourceFunction<String> {

        // volatile so a cancel() call from another thread is seen by run()
        private volatile boolean running = true;

        @Override
        public void run(SourceContext<String> sourceContext) throws Exception {
            Random random = new Random();
            while (running) {
                double data = random.nextDouble() * 100;
                // emit the record together with an event timestamp
                sourceContext.collectWithTimestamp(String.valueOf(data), System.currentTimeMillis());
                TimeUnit.SECONDS.sleep(1);
            }
        }

        @Override
        public void cancel() {
            this.running = false;
        }
    }

    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
    DataStreamSource<String> dataStreamSource = env.addSource(new MyDataSource());
    dataStreamSource.print();
    env.execute();
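The cancel() method above is the standard cooperative-shutdown pattern: run() loops while a volatile flag is true, and cancel(), invoked from another thread, flips it. Stripped of the Flink types, the pattern looks like this (a plain-Java sketch, not Flink code):

```java
public class CancellableLoop {
    // volatile: a write from the cancelling thread is visible to the loop
    private volatile boolean running = true;
    private int emitted = 0;

    // stand-in for SourceFunction.run(): emit until cancelled or done
    public void run(int maxRecords) {
        while (running && emitted < maxRecords) {
            emitted++; // stand-in for sourceContext.collect(...)
        }
    }

    // stand-in for SourceFunction.cancel()
    public void cancel() {
        running = false;
    }

    public int emittedCount() {
        return emitted;
    }

    public static void main(String[] args) {
        CancellableLoop loop = new CancellableLoop();
        loop.run(5);   // emits 5 records, then returns
        loop.cancel(); // after cancellation, run() emits nothing more
        loop.run(10);
        System.out.println(loop.emittedCount()); // prints 5
    }
}
```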

That is all the content of "How Flink Reads Data Sources". Thanks for reading! I hope this overview was helpful; if you want to learn more, welcome to follow the 创新互联 industry news channel!


Article title: How Flink Reads Data Sources
Article URL: http://cdxtjz.cn/article/gogjee.html
