Java class name: com.alibaba.alink.operator.batch.timeseries.DeepARPredictBatchOp
Python class name: DeepARPredictBatchOp

Description

Performs time series prediction with a DeepAR model trained by DeepARTrainBatchOp.

Usage

See the reference documentation: https://www.yuque.com/pinshu/alink_guide/xbp5ky
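
When linked, DeepARPredictBatchOp takes two inputs: the model output by DeepARTrainBatchOp and a batch table whose value column is of MTable type (for example, built with `mtable_agg` in GroupByBatchOp). The sketch below shows only this wiring; `trainedModel` and `mtableInput` are placeholders, and complete runnable examples follow in the Code Examples section.

```python
# Wiring sketch only: `trainedModel` (output of DeepARTrainBatchOp) and
# `mtableInput` (a table with an MTable-typed column) are placeholders.
prediction = DeepARPredictBatchOp() \
    .setValueCol("mtable_agg_series") \
    .setPredictionCol("pred") \
    .setPredictNum(2) \
    .linkFrom(trainedModel, mtableInput)
prediction.print()
```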

Parameters

| Name | Chinese Name | Description | Type | Required? | Valid Values | Default |
| --- | --- | --- | --- | --- | --- | --- |
| predictionCol | 预测结果列名 | Column name of the prediction result | String | ✓ | | |
| valueCol | value列,类型为MTable | Value column, of MTable type | String | ✓ | The selected column must be of type M_TABLE | |
| modelFilePath | 模型的文件路径 | File path of the model | String | | | null |
| predictNum | 预测条数 | Number of time steps to predict | Integer | | | 1 |
| predictionDetailCol | 预测详细信息列名 | Column name of the prediction details | String | | | |
| reservedCols | 算法保留列名 | Names of the columns reserved in the output | String[] | | | null |
| numThreads | 组件多线程线程个数 | Number of threads used by the component | Integer | | | 1 |
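
Each parameter above maps to a setter on the operator, following the usual Alink naming convention (for example, predictNum becomes setPredictNum). The following is a minimal configuration sketch, assuming the setter names derived from this table; it only constructs and configures the operator and does not run a prediction.

```python
from pyalink.alink import *

useLocalEnv(1)

# Configure the operator; setter names mirror the parameter table above.
op = (
    DeepARPredictBatchOp()
    .setValueCol("mtable_agg_series")  # required: value column of MTable type
    .setPredictionCol("pred")          # required: output column for the forecast
    .setPredictNum(2)                  # optional: number of steps to forecast, default 1
    .setNumThreads(1)                  # optional: number of worker threads, default 1
)
```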

Code Examples

Python Code

    from pyalink.alink import *
    import pandas as pd
    import datetime

    useLocalEnv(1)

    # Five daily observations of a single series (id = 0).
    data = pd.DataFrame([
        [0, datetime.datetime.fromisoformat('2021-11-01 00:00:00'), 100.0],
        [0, datetime.datetime.fromisoformat('2021-11-02 00:00:00'), 100.0],
        [0, datetime.datetime.fromisoformat('2021-11-03 00:00:00'), 100.0],
        [0, datetime.datetime.fromisoformat('2021-11-04 00:00:00'), 100.0],
        [0, datetime.datetime.fromisoformat('2021-11-05 00:00:00'), 100.0]
    ])

    source = dataframeToOperator(data, schemaStr='id int, ts timestamp, series double', op_type='batch')

    # Train a DeepAR model on the raw series.
    deepARTrainBatchOp = DeepARTrainBatchOp()\
        .setTimeCol("ts")\
        .setSelectedCol("series")\
        .setNumEpochs(10)\
        .setWindow(2)\
        .setStride(1)

    # Aggregate each id's rows into a single MTable column for prediction input.
    groupByBatchOp = GroupByBatchOp()\
        .setGroupByPredicate("id")\
        .setSelectClause("mtable_agg(ts, series) as mtable_agg_series")

    # Forecast the next 2 time steps.
    deepARPredictBatchOp = DeepARPredictBatchOp()\
        .setPredictNum(2)\
        .setPredictionCol("pred")\
        .setValueCol("mtable_agg_series")

    deepARPredictBatchOp\
        .linkFrom(
            deepARTrainBatchOp.linkFrom(source),
            groupByBatchOp.linkFrom(source)
        )\
        .print()

Java Code

    import org.apache.flink.types.Row;

    import com.alibaba.alink.operator.batch.BatchOperator;
    import com.alibaba.alink.operator.batch.source.MemSourceBatchOp;
    import com.alibaba.alink.operator.batch.sql.GroupByBatchOp;
    import com.alibaba.alink.operator.batch.timeseries.DeepARPredictBatchOp;
    import com.alibaba.alink.operator.batch.timeseries.DeepARTrainBatchOp;
    import org.junit.Test;

    import java.sql.Timestamp;
    import java.util.Arrays;
    import java.util.List;

    public class DeepARTrainBatchOpTest {

        @Test
        public void testDeepARTrainBatchOp() throws Exception {
            BatchOperator.setParallelism(1);

            // Five daily observations of a single series (id = 0).
            List<Row> data = Arrays.asList(
                Row.of(0, Timestamp.valueOf("2021-11-01 00:00:00"), 100.0),
                Row.of(0, Timestamp.valueOf("2021-11-02 00:00:00"), 100.0),
                Row.of(0, Timestamp.valueOf("2021-11-03 00:00:00"), 100.0),
                Row.of(0, Timestamp.valueOf("2021-11-04 00:00:00"), 100.0),
                Row.of(0, Timestamp.valueOf("2021-11-05 00:00:00"), 100.0)
            );

            MemSourceBatchOp memSourceBatchOp = new MemSourceBatchOp(data, "id int, ts timestamp, series double");

            // Train a DeepAR model on the raw series.
            DeepARTrainBatchOp deepARTrainBatchOp = new DeepARTrainBatchOp()
                .setTimeCol("ts")
                .setSelectedCol("series")
                .setNumEpochs(10)
                .setWindow(2)
                .setStride(1);

            // Aggregate each id's rows into a single MTable column for prediction input.
            GroupByBatchOp groupByBatchOp = new GroupByBatchOp()
                .setGroupByPredicate("id")
                .setSelectClause("mtable_agg(ts, series) as mtable_agg_series");

            // Forecast the next 2 time steps.
            DeepARPredictBatchOp deepARPredictBatchOp = new DeepARPredictBatchOp()
                .setPredictNum(2)
                .setPredictionCol("pred")
                .setValueCol("mtable_agg_series");

            deepARPredictBatchOp
                .linkFrom(
                    deepARTrainBatchOp.linkFrom(memSourceBatchOp),
                    groupByBatchOp.linkFrom(memSourceBatchOp)
                )
                .print();
        }
    }

Results

| id | mtable_agg_series | pred |
| --- | --- | --- |
| 0 | {"data":{"ts":["2021-11-01 00:00:00.0","2021-11-02 00:00:00.0","2021-11-03 00:00:00.0","2021-11-04 00:00:00.0","2021-11-05 00:00:00.0"],"series":[100.0,100.0,100.0,100.0,100.0]},"schema":"ts TIMESTAMP,series DOUBLE"} | {"data":{"ts":["2021-11-06 00:00:00.0","2021-11-07 00:00:00.0"],"series":[31.424224853515625,39.10265350341797]},"schema":"ts TIMESTAMP,series DOUBLE"} |
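
The pred column holds the forecast as an MTable covering the predictNum time steps after the end of the input series. To work with the result in Python rather than printing it, one option is to collect the output into a pandas DataFrame; the sketch below assumes the variables from the Python example above have already been linked.

```python
# Sketch: reuse `deepARPredictBatchOp` from the Python example above, whose
# inputs were already set by linkFrom, and collect its output locally.
result_df = deepARPredictBatchOp.collectToDataframe()
print(result_df["pred"][0])  # forecast for series id 0
```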