Java class: com.alibaba.alink.operator.batch.timeseries.LSTNetTrainBatchOp
Python class: LSTNetTrainBatchOp

Description

Trains an LSTNet model for time-series forecasting.

Usage

See the reference documentation: https://www.yuque.com/pinshu/alink_guide/xbp5ky

Parameters

| Name | Display Name | Description | Type | Required? | Valid Values | Default |
| ---- | ---- | ---- | ---- | ---- | ---- | ---- |
| checkpointFilePath | Checkpoint path | Path for saving intermediate results; passed to the TensorFlow Estimator as `model_dir`. Must be a directory accessible to all workers. | String | ✓ | | |
| timeCol | Timestamp column | Name of the timestamp column | String | ✓ | Column type must be [TIMESTAMP] | |
| batchSize | Batch size | Mini-batch size | Integer | | | 128 |
| horizon | Horizon | Prediction horizon | Integer | | [1, +inf) | 12 |
| intraOpParallelism | Intra-op parallelism | Intra-op parallelism | Integer | | | 4 |
| learningRate | Learning rate | Learning rate | Double | | | 0.001 |
| numEpochs | Number of epochs | Number of training epochs | Integer | | | 10 |
| numPSs | Number of PS roles | Number of parameter-server (PS) roles. If unset and the number of worker roles is also unset, it defaults to 1/4 of the job's total parallelism (rounded down); otherwise it is the total parallelism minus the number of worker roles. | Integer | | | null |
| numWorkers | Number of worker roles | Number of worker roles. If unset and the number of PS roles is also unset, it defaults to 3/4 of the job's total parallelism (rounded down); otherwise it is the total parallelism minus the number of PS roles. | Integer | | | null |
| pythonEnv | Python environment path | Path of the Python environment; usually not needed. If it is a compressed archive, it must extract to a directory whose name matches the archive's base name, and http://, https://, oss://, or hdfs:// paths may be used. If it is a directory, only a local path (file://) may be used. | String | | | "" |
| removeCheckpointBeforeTraining | Remove checkpoint before training | Whether to remove checkpoint-related files before training so the model is retrained from scratch; only the necessary files are deleted. | Boolean | | | null |
| selectedCol | Selected column | Name of the column to compute on | String | | | null |
| vectorCol | Vector column | Name of the vector column | String | | Column type must be [DENSE_VECTOR, SPARSE_VECTOR, STRING, VECTOR] | null |
| window | Window size | Size of the input window | Integer | | | 5 |
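Together, `window` and `horizon` determine how the series is sliced into training samples: each sample uses `window` consecutive values as input and the value `horizon` steps after the window as the target. The sketch below is an illustration only, not Alink's internal implementation; `make_samples` is a hypothetical helper.

```python
def make_samples(series, window, horizon):
    """Slice a univariate series into (input, target) pairs.

    Each input is `window` consecutive values; the target is the value
    `horizon` steps after the end of the window.
    """
    samples = []
    for i in range(len(series) - window - horizon + 1):
        x = series[i : i + window]            # `window` past values as input
        y = series[i + window + horizon - 1]  # value `horizon` steps ahead
        samples.append((x, y))
    return samples

series = [100.0, 200.0, 300.0, 400.0, 500.0, 600.0, 700.0]
for x, y in make_samples(series, window=3, horizon=2):
    print(x, "->", y)
```

With `window=3` and `horizon=2`, the first sample uses [100.0, 200.0, 300.0] as input and 500.0 (two steps after the window) as the target.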

Code Examples

Python code

```python
from pyalink.alink import *
import datetime
import pandas as pd

useLocalEnv(1)

data = pd.DataFrame([
    [0, datetime.datetime.fromisoformat("2021-11-01 00:00:00"), 100.0],
    [0, datetime.datetime.fromisoformat("2021-11-02 00:00:00"), 200.0],
    [0, datetime.datetime.fromisoformat("2021-11-03 00:00:00"), 300.0],
    [0, datetime.datetime.fromisoformat("2021-11-04 00:00:00"), 400.0],
    [0, datetime.datetime.fromisoformat("2021-11-06 00:00:00"), 500.0],
    [0, datetime.datetime.fromisoformat("2021-11-07 00:00:00"), 600.0],
    [0, datetime.datetime.fromisoformat("2021-11-08 00:00:00"), 700.0],
    [0, datetime.datetime.fromisoformat("2021-11-09 00:00:00"), 800.0],
    [0, datetime.datetime.fromisoformat("2021-11-10 00:00:00"), 900.0],
    [0, datetime.datetime.fromisoformat("2021-11-11 00:00:00"), 800.0],
    [0, datetime.datetime.fromisoformat("2021-11-12 00:00:00"), 700.0],
    [0, datetime.datetime.fromisoformat("2021-11-13 00:00:00"), 600.0],
    [0, datetime.datetime.fromisoformat("2021-11-14 00:00:00"), 500.0],
    [0, datetime.datetime.fromisoformat("2021-11-15 00:00:00"), 400.0],
    [0, datetime.datetime.fromisoformat("2021-11-16 00:00:00"), 300.0],
    [0, datetime.datetime.fromisoformat("2021-11-17 00:00:00"), 200.0],
    [0, datetime.datetime.fromisoformat("2021-11-18 00:00:00"), 100.0],
    [0, datetime.datetime.fromisoformat("2021-11-19 00:00:00"), 200.0],
    [0, datetime.datetime.fromisoformat("2021-11-20 00:00:00"), 300.0],
    [0, datetime.datetime.fromisoformat("2021-11-21 00:00:00"), 400.0],
    [0, datetime.datetime.fromisoformat("2021-11-22 00:00:00"), 500.0],
    [0, datetime.datetime.fromisoformat("2021-11-23 00:00:00"), 600.0],
    [0, datetime.datetime.fromisoformat("2021-11-24 00:00:00"), 700.0],
    [0, datetime.datetime.fromisoformat("2021-11-25 00:00:00"), 800.0],
    [0, datetime.datetime.fromisoformat("2021-11-26 00:00:00"), 900.0],
    [0, datetime.datetime.fromisoformat("2021-11-27 00:00:00"), 800.0],
    [0, datetime.datetime.fromisoformat("2021-11-28 00:00:00"), 700.0],
    [0, datetime.datetime.fromisoformat("2021-11-29 00:00:00"), 600.0],
    [0, datetime.datetime.fromisoformat("2021-11-30 00:00:00"), 500.0],
    [0, datetime.datetime.fromisoformat("2021-12-01 00:00:00"), 400.0],
    [0, datetime.datetime.fromisoformat("2021-12-02 00:00:00"), 300.0],
    [0, datetime.datetime.fromisoformat("2021-12-03 00:00:00"), 200.0]
])

source = dataframeToOperator(data, schemaStr='id int, ts timestamp, series double', op_type='batch')

lstNetTrainBatchOp = LSTNetTrainBatchOp()\
    .setTimeCol("ts")\
    .setSelectedCol("series")\
    .setNumEpochs(10)\
    .setWindow(24)\
    .setHorizon(1)

groupByBatchOp = GroupByBatchOp()\
    .setGroupByPredicate("id")\
    .setSelectClause("mtable_agg(ts, series) as mtable_agg_series")

lstNetPredictBatchOp = LSTNetPredictBatchOp()\
    .setPredictNum(1)\
    .setPredictionCol("pred")\
    .setReservedCols([])\
    .setValueCol("mtable_agg_series")

lstNetPredictBatchOp\
    .linkFrom(
        lstNetTrainBatchOp.linkFrom(source),
        groupByBatchOp.linkFrom(source.filter("ts >= TO_TIMESTAMP('2021-11-10 00:00:00')"))
    )\
    .print()
```

Java code

```java
import org.apache.flink.types.Row;
import com.alibaba.alink.operator.batch.BatchOperator;
import com.alibaba.alink.operator.batch.source.MemSourceBatchOp;
import com.alibaba.alink.operator.batch.sql.GroupByBatchOp;
import com.alibaba.alink.operator.batch.timeseries.LSTNetPredictBatchOp;
import com.alibaba.alink.operator.batch.timeseries.LSTNetTrainBatchOp;
import org.junit.Test;

import java.sql.Timestamp;
import java.util.Arrays;
import java.util.List;

public class LSTNetTrainBatchOpTest {

    @Test
    public void testLSTNetTrainBatchOp() throws Exception {
        BatchOperator.setParallelism(1);

        List<Row> data = Arrays.asList(
            Row.of(0, Timestamp.valueOf("2021-11-01 00:00:00"), 100.0),
            Row.of(0, Timestamp.valueOf("2021-11-02 00:00:00"), 200.0),
            Row.of(0, Timestamp.valueOf("2021-11-03 00:00:00"), 300.0),
            Row.of(0, Timestamp.valueOf("2021-11-04 00:00:00"), 400.0),
            Row.of(0, Timestamp.valueOf("2021-11-06 00:00:00"), 500.0),
            Row.of(0, Timestamp.valueOf("2021-11-07 00:00:00"), 600.0),
            Row.of(0, Timestamp.valueOf("2021-11-08 00:00:00"), 700.0),
            Row.of(0, Timestamp.valueOf("2021-11-09 00:00:00"), 800.0),
            Row.of(0, Timestamp.valueOf("2021-11-10 00:00:00"), 900.0),
            Row.of(0, Timestamp.valueOf("2021-11-11 00:00:00"), 800.0),
            Row.of(0, Timestamp.valueOf("2021-11-12 00:00:00"), 700.0),
            Row.of(0, Timestamp.valueOf("2021-11-13 00:00:00"), 600.0),
            Row.of(0, Timestamp.valueOf("2021-11-14 00:00:00"), 500.0),
            Row.of(0, Timestamp.valueOf("2021-11-15 00:00:00"), 400.0),
            Row.of(0, Timestamp.valueOf("2021-11-16 00:00:00"), 300.0),
            Row.of(0, Timestamp.valueOf("2021-11-17 00:00:00"), 200.0),
            Row.of(0, Timestamp.valueOf("2021-11-18 00:00:00"), 100.0),
            Row.of(0, Timestamp.valueOf("2021-11-19 00:00:00"), 200.0),
            Row.of(0, Timestamp.valueOf("2021-11-20 00:00:00"), 300.0),
            Row.of(0, Timestamp.valueOf("2021-11-21 00:00:00"), 400.0),
            Row.of(0, Timestamp.valueOf("2021-11-22 00:00:00"), 500.0),
            Row.of(0, Timestamp.valueOf("2021-11-23 00:00:00"), 600.0),
            Row.of(0, Timestamp.valueOf("2021-11-24 00:00:00"), 700.0),
            Row.of(0, Timestamp.valueOf("2021-11-25 00:00:00"), 800.0),
            Row.of(0, Timestamp.valueOf("2021-11-26 00:00:00"), 900.0),
            Row.of(0, Timestamp.valueOf("2021-11-27 00:00:00"), 800.0),
            Row.of(0, Timestamp.valueOf("2021-11-28 00:00:00"), 700.0),
            Row.of(0, Timestamp.valueOf("2021-11-29 00:00:00"), 600.0),
            Row.of(0, Timestamp.valueOf("2021-11-30 00:00:00"), 500.0),
            Row.of(0, Timestamp.valueOf("2021-12-01 00:00:00"), 400.0),
            Row.of(0, Timestamp.valueOf("2021-12-02 00:00:00"), 300.0),
            Row.of(0, Timestamp.valueOf("2021-12-03 00:00:00"), 200.0)
        );

        MemSourceBatchOp memSourceBatchOp = new MemSourceBatchOp(data, "id int, ts timestamp, series double");

        LSTNetTrainBatchOp lstNetTrainBatchOp = new LSTNetTrainBatchOp()
            .setTimeCol("ts")
            .setSelectedCol("series")
            .setNumEpochs(10)
            .setWindow(24)
            .setHorizon(1);

        GroupByBatchOp groupByBatchOp = new GroupByBatchOp()
            .setGroupByPredicate("id")
            .setSelectClause("mtable_agg(ts, series) as mtable_agg_series");

        LSTNetPredictBatchOp lstNetPredictBatchOp = new LSTNetPredictBatchOp()
            .setPredictNum(1)
            .setPredictionCol("pred")
            .setReservedCols()
            .setValueCol("mtable_agg_series");

        lstNetPredictBatchOp
            .linkFrom(
                lstNetTrainBatchOp.linkFrom(memSourceBatchOp),
                groupByBatchOp.linkFrom(memSourceBatchOp.filter("ts >= TO_TIMESTAMP('2021-11-10 00:00:00')"))
            )
            .print();
    }
}
```

Output

| pred |
| ---- |
| {"data":{"ts":["2021-12-04 00:00:00.0"],"series":[441.76019287109375]},"schema":"ts TIMESTAMP,series DOUBLE"} |
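The `pred` column holds an MTable serialized as JSON, so the forecast can be read back with an ordinary JSON parser. A minimal sketch using the result string shown above:

```python
import json

# The prediction string as printed in the result table above.
pred = ('{"data":{"ts":["2021-12-04 00:00:00.0"],'
        '"series":[441.76019287109375]},'
        '"schema":"ts TIMESTAMP,series DOUBLE"}')

obj = json.loads(pred)
timestamps = obj["data"]["ts"]    # forecast timestamps
values = obj["data"]["series"]    # forecast values
print(timestamps[0], values[0])
```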