
When is a Flink window computation triggered?


datastream
  .assignAscendingTimestamps(_.times) // event time taken from the record
  .keyBy(k => (k.db, k.tbl))
  .window(SlidingEventTimeWindows.of(Time.days(3), Time.days(1)))
  .reduce((x, y) => TableCall(x.db, x.tbl, x.conuts + 1, y.times))
  .print()
I can't tell when this computation actually gets triggered. My understanding is that it should fire once per day and then print the result.
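From the documentation, my understanding is that an event-time window fires only once the watermark passes the window's end, and assignAscendingTimestamps emits a watermark that trails the largest timestamp seen by 1 ms. Below is a minimal self-contained sketch of the same pipeline (the TableCall case class and its field names are my guess from the snippet above; the base timestamp is treated as UTC for simplicity). A bounded fromElements source emits a final MAX_WATERMARK at end of input, which flushes every remaining window; on an unbounded source, a window only fires once a record arrives whose timestamp pushes the watermark past that window's end.

import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.api.windowing.assigners.SlidingEventTimeWindows
import org.apache.flink.streaming.api.windowing.time.Time

// Field names assumed from the snippet above.
case class TableCall(db: String, tbl: String, conuts: Long, times: Long)

object WindowFireDemo {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.setParallelism(1) // a single source subtask, so every record advances the watermark
    // On Flink < 1.12 you would also need:
    // env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime)

    val day  = 24 * 60 * 60 * 1000L
    val base = 1627831916895L // 2021-08-01 15:31:56,895 UTC, matching the logs below

    env.fromElements(
        TableCall("hivetest", "chinese_part", 1, base),
        TableCall("hivetest", "chinese_part", 1, base + day),
        TableCall("hivetest", "chinese_part", 1, base + 2 * day))
      .assignAscendingTimestamps(_.times) // watermark = max timestamp seen - 1 ms
      .keyBy(k => (k.db, k.tbl))
      .window(SlidingEventTimeWindows.of(Time.days(3), Time.days(1)))
      .reduce((x, y) => TableCall(x.db, x.tbl, x.conuts + 1, y.times))
      .print()

    env.execute("window-fire-demo")
  }
}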
{"timestamp":"2021-08-01 15:31:56,895","msg":" INFO org.apache.hadoop.hive.metastore.HiveMetaStore: [pool-9-thread-97]: 88: get_table : db=hivetest tbl=chinese_part"}
{"timestamp":"2021-08-01 15:31:56,895","msg":" INFO org.apache.hadoop.hive.metastore.HiveMetaStore: [pool-9-thread-97]: 88: get_table : db=hivetest tbl=chinese_part1"}
{"timestamp":"2021-08-02 15:31:56,895","msg":" INFO org.apache.hadoop.hive.metastore.HiveMetaStore: [pool-9-thread-97]: 88: get_table : db=hivetest tbl=chinese_part"}
{"timestamp":"2021-08-02 15:31:56,895","msg":" INFO org.apache.hadoop.hive.metastore.HiveMetaStore: [pool-9-thread-97]: 88: get_table : db=hivetest tbl=chinese_part1"}
{"timestamp":"2021-08-03 15:31:56,895","msg":" INFO org.apache.hadoop.hive.metastore.HiveMetaStore: [pool-9-thread-97]: 88: get_table : db=hivetest tbl=chinese_part"}
{"timestamp":"2021-08-03 15:31:56,895","msg":" INFO org.apache.hadoop.hive.metastore.HiveMetaStore: [pool-9-thread-97]: 88: get_table : db=hivetest tbl=chinese_part1"}
{"timestamp":"2021-08-04 15:31:56,895","msg":" INFO org.apache.hadoop.hive.metastore.HiveMetaStore: [pool-9-thread-97]: 88: get_table : db=hivetest tbl=chinese_part"}
{"timestamp":"2021-08-04 15:31:56,895","msg":" INFO org.apache.hadoop.hive.metastore.HiveMetaStore: [pool-9-thread-97]: 88: get_table : db=hivetest tbl=chinese_part1"}
{"timestamp":"2021-08-05 15:31:56,895","msg":" INFO org.apache.hadoop.hive.metastore.HiveMetaStore: [pool-9-thread-97]: 88: get_table : db=hivetest tbl=chinese_part"}
{"timestamp":"2021-08-05 15:31:56,895","msg":" INFO org.apache.hadoop.hive.metastore.HiveMetaStore: [pool-9-thread-97]: 88: get_table : db=hivetest tbl=chinese_part1"}
{"timestamp":"2021-08-06 15:31:56,895","msg":" INFO org.apache.hadoop.hive.metastore.HiveMetaStore: [pool-9-thread-97]: 88: get_table : db=hivetest tbl=chinese_part"}
{"timestamp":"2021-08-06 15:31:56,895","msg":" INFO org.apache.hadoop.hive.metastore.HiveMetaStore: [pool-9-thread-97]: 88: get_table : db=hivetest tbl=chinese_part1"}
{"timestamp":"2021-08-07 15:31:56,895","msg":" INFO org.apache.hadoop.hive.metastore.HiveMetaStore: [pool-9-thread-97]: 88: get_table : db=hivetest tbl=chinese_part"}
{"timestamp":"2021-08-07 15:31:56,895","msg":" INFO org.apache.hadoop.hive.metastore.HiveMetaStore: [pool-9-thread-97]: 88: get_table : db=hivetest tbl=chinese_part1"}
{"timestamp":"2021-08-08 15:31:56,895","msg":" INFO org.apache.hadoop.hive.metastore.HiveMetaStore: [pool-9-thread-97]: 88: get_table : db=hivetest tbl=chinese_part"}
{"timestamp":"2021-08-08 15:31:56,895","msg":" INFO org.apache.hadoop.hive.metastore.HiveMetaStore: [pool-9-thread-97]: 88: get_table : db=hivetest tbl=chinese_part1"}
{"timestamp":"2021-08-09 15:31:56,895","msg":" INFO org.apache.hadoop.hive.metastore.HiveMetaStore: [pool-9-thread-97]: 88: get_table : db=hivetest tbl=chinese_part"}
{"timestamp":"2021-08-09 15:31:56,895","msg":" INFO org.apache.hadoop.hive.metastore.HiveMetaStore: [pool-9-thread-97]: 88: get_table : db=hivetest tbl=chinese_part1"}
{"timestamp":"2021-08-10 15:31:56,895","msg":" INFO org.apache.hadoop.hive.metastore.HiveMetaStore: [pool-9-thread-97]: 88: get_table : db=hivetest tbl=chinese_part"}
{"timestamp":"2021-08-10 15:31:56,895","msg":" INFO org.apache.hadoop.hive.metastore.HiveMetaStore: [pool-9-thread-97]: 88: get_table : db=hivetest tbl=chinese_part1"}
Above is the data I'm sending, but the computation doesn't seem to run once per day. For example, I sent three records with timestamps from 2021-08-01 15:31:56,895 through 2021-08-03 15:31:56,895, yet nothing was printed to the terminal. I don't know how far the timestamps have to advance before the window computes and emits output.
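Working through the assigner's math (my reading of SlidingEventTimeWindows, which aligns windows to the epoch unless an offset is passed): a record stamped 2021-08-01 15:31 lands in three day-sliding windows, and the earliest of them ends at 2021-08-02 00:00 UTC. So nothing can fire before the watermark, i.e. the largest timestamp seen minus 1 ms, passes that point; after the 2021-08-02 15:31 record it should already be past it. A small sketch to print the windows a record belongs to and when each one fires:

import java.time.Instant
import org.apache.flink.streaming.api.windowing.windows.TimeWindow

object WhichWindows {
  def main(args: Array[String]): Unit = {
    val size  = 3L * 24 * 60 * 60 * 1000 // Time.days(3)
    val slide = 1L * 24 * 60 * 60 * 1000 // Time.days(1)
    val ts    = Instant.parse("2021-08-01T15:31:56.895Z").toEpochMilli

    // Same arithmetic SlidingEventTimeWindows.assignWindows uses internally.
    val lastStart = TimeWindow.getWindowStartWithOffset(ts, 0, slide)
    for (start <- lastStart to (lastStart - size + slide) by -slide) {
      val end = start + size
      println(s"[${Instant.ofEpochMilli(start)}, ${Instant.ofEpochMilli(end)}) " +
        s"fires once watermark >= ${Instant.ofEpochMilli(end - 1)}")
    }
  }
}

If the second record really doesn't produce output, something other than the window math is holding the watermark back (see the follow-up below).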
I know I can override the methods of the Trigger class:
override def onElement(t: TableCall, l: Long, w: Window, triggerContext: Trigger.TriggerContext): TriggerResult = {
  TriggerResult.FIRE
}
so that a computation fires for every incoming record.
But at what point does onEventTime actually get called?
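As far as I can tell, onEventTime has nothing to do with wall-clock time: it is called only when an event-time timer registered earlier through ctx.registerEventTimeTimer fires, which happens when the watermark reaches that timer's timestamp. The built-in EventTimeTrigger registers a single timer at window.maxTimestamp() (= window end - 1 ms). A sketch combining that with firing on every element (the class name and the early-fire behavior are my own):

import org.apache.flink.streaming.api.windowing.triggers.{Trigger, TriggerResult}
import org.apache.flink.streaming.api.windowing.windows.TimeWindow

class EagerEventTimeTrigger extends Trigger[TableCall, TimeWindow] {

  override def onElement(element: TableCall, timestamp: Long, window: TimeWindow,
                         ctx: Trigger.TriggerContext): TriggerResult = {
    // Register a timer for the window's end; onEventTime is called back
    // once the watermark passes window.maxTimestamp().
    ctx.registerEventTimeTimer(window.maxTimestamp())
    TriggerResult.FIRE // additionally emit an updated result for every record
  }

  override def onEventTime(time: Long, window: TimeWindow,
                           ctx: Trigger.TriggerContext): TriggerResult =
    // Reached only via registered event-time timers, never on wall-clock time.
    if (time == window.maxTimestamp()) TriggerResult.FIRE else TriggerResult.CONTINUE

  override def onProcessingTime(time: Long, window: TimeWindow,
                                ctx: Trigger.TriggerContext): TriggerResult =
    TriggerResult.CONTINUE

  override def clear(window: TimeWindow, ctx: Trigger.TriggerContext): Unit =
    ctx.deleteEventTimeTimer(window.maxTimestamp())
}

It would be attached with .window(...).trigger(new EagerEventTimeTrigger) in place of the default EventTimeTrigger.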


#1 · 2021-08-26 15:50

It looks like the computation fires as soon as the second record arrives, but no result shows up in the terminal.
Switching to a JDBC sink doesn't write anything to the database either.
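Two things worth checking here, as far as I understand them: print() writes to the TaskManager's stdout when the job runs on a cluster, not to the client console; and with source parallelism > 1, the operator watermark is the minimum across all input subtasks, so one subtask that never receives data pins the watermark at Long.MinValue and no event-time window can ever fire. A sketch of the newer WatermarkStrategy API (Flink 1.11+) with idle-source handling; the 10-second idle timeout is an arbitrary choice:

import java.time.Duration
import org.apache.flink.api.common.eventtime.{SerializableTimestampAssigner, WatermarkStrategy}
import org.apache.flink.streaming.api.scala._

object Watermarking {
  // Drop-in replacement for assignAscendingTimestamps in the pipeline above;
  // TableCall is the case class assumed in the first post.
  def withWatermarks(datastream: DataStream[TableCall]): DataStream[TableCall] = {
    val strategy = WatermarkStrategy
      .forMonotonousTimestamps[TableCall]()
      .withTimestampAssigner(new SerializableTimestampAssigner[TableCall] {
        override def extractTimestamp(e: TableCall, recordTs: Long): Long = e.times
      })
      .withIdleness(Duration.ofSeconds(10)) // mark a subtask idle after 10 s of silence
    datastream.assignTimestampsAndWatermarks(strategy)
  }
}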


#2 · 2021-08-26 16:12