Java class name: com.alibaba.alink.operator.batch.regression.GlmEvaluationBatchOp
Python class name: GlmEvaluationBatchOp

Function Description

GLM (Generalized Linear Model), also known as generalized linear regression, is a widely used statistical model and a family of nonlinear models; many common models are special cases of it. A GLM describes a nonlinear relationship between the response and the predictors while keeping the generalized structure of linear regression: the response variable follows a normal, binomial, Poisson, gamma, or inverse Gaussian distribution, and the link function f defines the relationship between μ (the expected response) and a linear combination of the predictors.

The GLM functionality includes GLM training, GLM prediction (batch and stream), and GLM evaluation; training is performed with an iterative (reweighted) least squares method.
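As a compact sketch of the relationship described above (standard GLM notation, not an Alink-specific formula): writing μ = E[y | x] for the expected response and f for the link function, the model assumes

$$
f(\mu) = \beta_0 + \beta_1 x_1 + \cdots + \beta_p x_p ,
$$

where the chosen family determines which distribution (normal, binomial, Poisson, gamma, or inverse Gaussian) the response y follows around μ.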

Algorithm Usage

| Distribution | Link function | Corresponding algorithm |
| --- | --- | --- |
| Binomial | Logit | Logistic regression |
| Multinomial | Logit | Softmax |
| Gaussian | Identity | Linear regression |
| Poisson | Log | Poisson regression |
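
Each row of the table corresponds to a particular combination of setFamily and setLink on the training operator. A minimal sketch for the Poisson/Log row (the column names x1, x2, and cnt are placeholders, and the family string "poisson" is assumed to be accepted analogously to "gamma" in the examples below):

```python
from pyalink.alink import *

# Hypothetical schema for illustration: features x1, x2 and a count label cnt.
poisson_train = GlmTrainBatchOp()\
    .setFamily("poisson")\
    .setLink("Log")\
    .setFeatureCols(["x1", "x2"])\
    .setLabelCol("cnt")
```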

References

[1] https://en.wikipedia.org/wiki/Generalized_linear_model

Code Examples

Python Code

```python
from pyalink.alink import *
import pandas as pd

useLocalEnv(1)

df = pd.DataFrame([
    [1.6094, 118.0000, 69.0000, 1.0000, 2.0000],
    [2.3026, 58.0000, 35.0000, 1.0000, 2.0000],
    [2.7081, 42.0000, 26.0000, 1.0000, 2.0000],
    [2.9957, 35.0000, 21.0000, 1.0000, 2.0000],
    [3.4012, 27.0000, 18.0000, 1.0000, 2.0000],
    [3.6889, 25.0000, 16.0000, 1.0000, 2.0000],
    [4.0943, 21.0000, 13.0000, 1.0000, 2.0000],
    [4.3820, 19.0000, 12.0000, 1.0000, 2.0000],
    [4.6052, 18.0000, 12.0000, 1.0000, 2.0000]
])

source = BatchOperator.fromDataframe(df, schemaStr='u double, lot1 double, lot2 double, offset double, weights double')

featureColNames = ["lot1", "lot2"]
labelColName = "u"

# train
train = GlmTrainBatchOp()\
    .setFamily("gamma")\
    .setLink("Log")\
    .setRegParam(0.3)\
    .setMaxIter(5)\
    .setFeatureCols(featureColNames)\
    .setLabelCol(labelColName)

source.link(train)

# predict
predict = GlmPredictBatchOp()\
    .setPredictionCol("pred")

predict.linkFrom(train, source)

# eval
eval = GlmEvaluationBatchOp()\
    .setFamily("gamma")\
    .setLink("Log")\
    .setRegParam(0.3)\
    .setMaxIter(5)\
    .setFeatureCols(featureColNames)\
    .setLabelCol(labelColName)

eval.linkFrom(train, source)

predict.lazyPrint(10)
eval.print()
```

Java Code

```java
import org.apache.flink.types.Row;

import com.alibaba.alink.operator.batch.BatchOperator;
import com.alibaba.alink.operator.batch.regression.GlmEvaluationBatchOp;
import com.alibaba.alink.operator.batch.regression.GlmPredictBatchOp;
import com.alibaba.alink.operator.batch.regression.GlmTrainBatchOp;
import com.alibaba.alink.operator.batch.source.MemSourceBatchOp;
import org.junit.Test;

import java.util.Arrays;
import java.util.List;

public class GlmEvaluationBatchOpTest {
	@Test
	public void testGlmEvaluationBatchOp() throws Exception {
		List <Row> df = Arrays.asList(
			Row.of(1.6094, 118.0000, 69.0000, 1.0000, 2.0000),
			Row.of(2.3026, 58.0000, 35.0000, 1.0000, 2.0000),
			Row.of(2.7081, 42.0000, 26.0000, 1.0000, 2.0000),
			Row.of(2.9957, 35.0000, 21.0000, 1.0000, 2.0000),
			Row.of(3.4012, 27.0000, 18.0000, 1.0000, 2.0000),
			Row.of(3.6889, 25.0000, 16.0000, 1.0000, 2.0000),
			Row.of(4.0943, 21.0000, 13.0000, 1.0000, 2.0000),
			Row.of(4.3820, 19.0000, 12.0000, 1.0000, 2.0000),
			Row.of(4.6052, 18.0000, 12.0000, 1.0000, 2.0000)
		);
		BatchOperator <?> source = new MemSourceBatchOp(df,
			"u double, lot1 double, lot2 double, offset double, weights double");
		String[] featureColNames = new String[] {"lot1", "lot2"};
		String labelColName = "u";

		BatchOperator <?> train = new GlmTrainBatchOp()
			.setFamily("gamma")
			.setLink("Log")
			.setRegParam(0.3)
			.setMaxIter(5)
			.setFeatureCols(featureColNames)
			.setLabelCol(labelColName);
		source.link(train);

		BatchOperator <?> predict = new GlmPredictBatchOp()
			.setPredictionCol("pred");
		predict.linkFrom(train, source);

		BatchOperator <?> eval = new GlmEvaluationBatchOp()
			.setFamily("gamma")
			.setLink("Log")
			.setRegParam(0.3)
			.setMaxIter(5)
			.setFeatureCols(featureColNames)
			.setLabelCol(labelColName);
		eval.linkFrom(train, source);

		predict.lazyPrint(10);
		eval.print();
	}
}
```

Results

Prediction Results

|  | u | lot1 | lot2 | offset | weights | pred |
| --- | --- | --- | --- | --- | --- | --- |
| 0 | 1.6094 | 118.0 | 69.0 | 1.0 | 2.0 |  |
| 1 | 2.3026 | 58.0 | 35.0 | 1.0 | 2.0 |  |
| 2 | 2.7081 | 42.0 | 26.0 | 1.0 | 2.0 |  |
| 3 | 2.9957 | 35.0 | 21.0 | 1.0 | 2.0 |  |
| 4 | 3.4012 | 27.0 | 18.0 | 1.0 | 2.0 |  |
| 5 | 3.6889 | 25.0 | 16.0 | 1.0 | 2.0 |  |
| 6 | 4.0943 | 21.0 | 13.0 | 1.0 | 2.0 |  |
| 7 | 4.3820 | 19.0 | 12.0 | 1.0 | 2.0 |  |
| 8 | 4.6052 | 18.0 | 12.0 | 1.0 | 2.0 |  |

Evaluation Results

| summary |
| --- |
| {"rank":3,"degreeOfFreedom":6,"residualDegreeOfFreeDom":6,"residualDegreeOfFreedomNull":8,"aic":9702.08856968678,"dispersion":0.01600672089664272,"deviance":0.09638590199190636,"nullDeviance":0.8493577599031797,"coefficients":[0.007797743508551773,-0.031175844426501245],"intercept":1.6095243247335171,"coefficientStandardErrors":[0.030385113783611032,0.05301723001061871,0.10937960484662167],"tValues":[0.2566303869744822,-0.5880323136508093,14.715031444760513],"pValues":[0.8060371545111102,0.5779564640149484,6.188226474801439E-6]} |
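
The evaluation output is a single-row table whose summary column holds a JSON string with the fitting statistics shown above. A minimal sketch of extracting individual metrics from it on the Python side, assuming collectToDataframe() is available on the evaluation result as it is for other batch operators (eval refers to the GlmEvaluationBatchOp from the Python example):

```python
import json

# Collect the one-row evaluation result and parse the JSON in its 'summary' column.
summary_df = eval.collectToDataframe()
summary = json.loads(summary_df["summary"][0])

# Keys follow the JSON shown above, e.g. aic, deviance, degreeOfFreedom.
print(summary["aic"], summary["deviance"], summary["degreeOfFreedom"])
```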