Created
August 23, 2019 08:12
"C:\Program Files\Java\jdk1.8.0_191\bin\java.exe" "-javaagent:C:\Program Files\JetBrains\IntelliJ IDEA 2018.3.5\lib\idea_rt.jar=56190:C:\Program Files\JetBrains\IntelliJ IDEA 2018.3.5\bin" -Dfile.encoding=UTF-8 -classpath C:\Users\zhaoy\AppData\Local\Temp\classpath1662208160.jar com.nari.bdp.mine_server_test_behavior.classification.RNNClassificationTest
heelo
[INFO ] 2019-08-23 16:04:19 [main] c.n.b.m.e.service.MineJobExecutor - ========= Starting execution =========
########### className = com.nari.bdp.mine.cloud.io.readdatabank.behavior.ReadDataBankBehavior
########### className = com.nari.bdp.mine.cloud.dp.setrole.behavior.SetroleBehavior
########### className = com.nari.bdp.mine.classification.rnn.behavior.RNNClassifyBehavior
RNN classification cost time: 1.308000s
[INFO ] 2019-08-23 16:04:19 [main] c.n.b.m.c.i.r.b.ReadDataBankBehavior - =========== Starting execution of node com.nari.bdp.mine.cloud.io.readdatabank.behavior.ReadDataBankBehavior ===========
DataCheck-LabelString cost time: 56.350000s
Temp--static column:ID,V,M,W,S,P,Q,I,P_DVALUE,Q_DVALUE,I_DVALUE,P_PVALUE,Q_PVALUE,I_PVALUE used 7.882 seconds
Temp--static column:null,null,null,null,null,null,null,null,null,null,null,null,null,null,null used 0.0 seconds
+----+-----+---+---+-------+------------+------------+-----------+------------+------------+------------+-----------+-----------+-----------+-----+
| ID| V| M| W| S| P| Q| I| P_DVALUE| Q_DVALUE| I_DVALUE| P_PVALUE| Q_PVALUE| I_PVALUE|LABEL|
+----+-----+---+---+-------+------------+------------+-----------+------------+------------+------------+-----------+-----------+-----------+-----+
| 1.0|500.0|2.0|4.0| 9660.0| 651.6300049| 131.9499969|762.4500122| 10.15003014| 2.029999971| 9.384030342|1.015820026|1.015619993|1.012459993| 异常|
| 2.0|500.0|2.0|4.0|12600.0| 651.6300049| 131.9499969|762.4500122| 18.27001953| 6.090000153| 23.46002007|1.028849959|1.048390031|1.031749964| 异常|
| 3.0|500.0|2.0|4.0|13560.0| 655.6900024| 133.9799957|764.7959595| 20.2999897| 6.090000153| 25.80596924|1.031949997|1.047620058|1.034919977| 异常|
| 4.0|500.0|2.0|4.0|14520.0| 651.6300049| 133.9799957|760.1040039| 6.090030193| 6.090000153| 9.384030342|1.009430051|1.047620058|1.012500048| 异常|
| 5.0|500.0|2.0|4.0|14580.0| 655.6900024| 136.0099945|764.7959595| 4.059999943| 2.029999971| 4.691959858|1.006229997|1.015149951|1.006170034| 异常|
| 6.0|500.0|2.0|4.0|14640.0| 657.7199707| 136.0099945|764.7959595| 2.029969931| 0.0| 0.0|1.003100038| 1.0| 1.0| 异常|
| 7.0|500.0|2.0|4.0|14700.0| 657.7199707| 136.0099945|771.8339844| 0.0| 0.0| 7.038020134| 1.0| 1.0|1.009199977| 异常|
| 8.0|500.0|2.0|4.0|15300.0| 659.75| 133.9799957|767.1419678| 12.17998981| 6.090000153| 9.383970261|1.018810034|1.047620058|1.012380004| 异常|
| 9.0|500.0|2.0|4.0|15780.0| 651.6300049| 129.9199982|762.4500122| 26.39001083| 4.059999943| 32.84405899|1.042209983|1.032259941|1.045019984| 异常|
|10.0|500.0|2.0|4.0|18960.0| 655.6900024| 131.9499969|762.4500122| 36.54003906| 8.119999886| 44.57403183|1.059020042|1.065569997|1.062090039| 异常|
|11.0|500.0|2.0|4.0|22440.0| 655.6900024| 127.8899994|764.7959595| 42.63000107| 6.090000153| 51.61199951|1.069540024|1.049999952|1.072370052| 异常|
|12.0|500.0|2.0|4.0|84180.0| 655.6900024| 125.8600006|764.7959595| 36.54003906| 10.14999962| 39.88196182|1.059020042|1.087720037|1.055019975| 异常|
|13.0|500.0|2.0|4.0|85860.0| 657.7199707| 119.7699966|755.4119873| 20.2999897| 0.0| 16.42200089| 1.03184998| 1.0|1.022220016| 异常|
|14.0|500.0|2.0|2.0|30360.0|-838.0743408| -10.9871397|921.7219238| 0.0|-10.37674046| 0.078919999| 1.0|17.99999046|1.000090003| 异常|
|15.0|500.0|2.0|2.0|41160.0|-716.6054688|-12.20792961|788.1759644| 10.37670994|-10.37674046|-11.29961967|0.985729992|6.666659832|0.985870004| 异常|
|16.0|500.0|2.0|2.0|67740.0| -873.477356|-15.25990963|960.7225342| 0.0|-14.64951992| 0.146300003| 1.0|24.99998093|1.000149965| 异常|
|17.0|500.0|2.0|2.0|68280.0|-875.9189453|-15.25990963|963.4072266| 0.0|-11.59753036| 0.137759998| 1.0|4.166669846|1.000139952| 异常|
|18.0|500.0|2.0|2.0|73380.0|-868.5941772|-11.59753036|955.2910156| 0.0| -10.9871397| 0.084899999| 1.0|18.99998093|1.000090003| 异常|
|19.0|500.0|2.0|2.0|74400.0|-869.8150024|-6.103960037|956.5719604|-5.493589878|-5.493569851| 6.064700127|1.006360054|9.999990463|1.006379962| 异常|
|20.0|500.0|2.0|2.0|74820.0|-864.3214111|-12.20792961|950.6018677| 0.0|-11.59753036| 0.094609998| 1.0|19.99998093|1.000100017| 异常|
+----+-----+---+---+-------+------------+------------+-----------+------------+------------+------------+-----------+-----------+-----------+-----+
only showing top 20 rows
Set role cost time: 1.016000s
+----+-----+---+---+-------+------------+------------+-----------+------------+------------+------------+-----------+-----------+-----------+-----+-------------+--------------------+--------------------+
| ID| V| M| W| S| P| Q| I| P_DVALUE| Q_DVALUE| I_DVALUE| P_PVALUE| Q_PVALUE| I_PVALUE|LABEL|label_numeric| featureoutput| featureoutputStd|
+----+-----+---+---+-------+------------+------------+-----------+------------+------------+------------+-----------+-----------+-----------+-----+-------------+--------------------+--------------------+
| 1.0|500.0|2.0|4.0| 9660.0| 651.6300049| 131.9499969|762.4500122| 10.15003014| 2.029999971| 9.384030342|1.015820026|1.015619993|1.012459993| 异常| 1.0|[1.0,500.0,2.0,4....|[0.0,0.5,0.5,0.66...|
| 2.0|500.0|2.0|4.0|12600.0| 651.6300049| 131.9499969|762.4500122| 18.27001953| 6.090000153| 23.46002007|1.028849959|1.048390031|1.031749964| 异常| 1.0|[2.0,500.0,2.0,4....|[0.00199203187250...|
| 3.0|500.0|2.0|4.0|13560.0| 655.6900024| 133.9799957|764.7959595| 20.2999897| 6.090000153| 25.80596924|1.031949997|1.047620058|1.034919977| 异常| 1.0|[3.0,500.0,2.0,4....|[0.00398406374501...|
| 4.0|500.0|2.0|4.0|14520.0| 651.6300049| 133.9799957|760.1040039| 6.090030193| 6.090000153| 9.384030342|1.009430051|1.047620058|1.012500048| 异常| 1.0|[4.0,500.0,2.0,4....|[0.00597609561752...|
| 5.0|500.0|2.0|4.0|14580.0| 655.6900024| 136.0099945|764.7959595| 4.059999943| 2.029999971| 4.691959858|1.006229997|1.015149951|1.006170034| 异常| 1.0|[5.0,500.0,2.0,4....|[0.00796812749003...|
| 6.0|500.0|2.0|4.0|14640.0| 657.7199707| 136.0099945|764.7959595| 2.029969931| 0.0| 0.0|1.003100038| 1.0| 1.0| 异常| 1.0|[6.0,500.0,2.0,4....|[0.00996015936254...|
| 7.0|500.0|2.0|4.0|14700.0| 657.7199707| 136.0099945|771.8339844| 0.0| 0.0| 7.038020134| 1.0| 1.0|1.009199977| 异常| 1.0|[7.0,500.0,2.0,4....|[0.01195219123505...|
| 8.0|500.0|2.0|4.0|15300.0| 659.75| 133.9799957|767.1419678| 12.17998981| 6.090000153| 9.383970261|1.018810034|1.047620058|1.012380004| 异常| 1.0|[8.0,500.0,2.0,4....|[0.01394422310756...|
| 9.0|500.0|2.0|4.0|15780.0| 651.6300049| 129.9199982|762.4500122| 26.39001083| 4.059999943| 32.84405899|1.042209983|1.032259941|1.045019984| 异常| 1.0|[9.0,500.0,2.0,4....|[0.01593625498007...|
|10.0|500.0|2.0|4.0|18960.0| 655.6900024| 131.9499969|762.4500122| 36.54003906| 8.119999886| 44.57403183|1.059020042|1.065569997|1.062090039| 异常| 1.0|[10.0,500.0,2.0,4...|[0.01792828685258...|
|11.0|500.0|2.0|4.0|22440.0| 655.6900024| 127.8899994|764.7959595| 42.63000107| 6.090000153| 51.61199951|1.069540024|1.049999952|1.072370052| 异常| 1.0|[11.0,500.0,2.0,4...|[0.01992031872509...|
|12.0|500.0|2.0|4.0|84180.0| 655.6900024| 125.8600006|764.7959595| 36.54003906| 10.14999962| 39.88196182|1.059020042|1.087720037|1.055019975| 异常| 1.0|[12.0,500.0,2.0,4...|[0.02191235059760...|
|13.0|500.0|2.0|4.0|85860.0| 657.7199707| 119.7699966|755.4119873| 20.2999897| 0.0| 16.42200089| 1.03184998| 1.0|1.022220016| 异常| 1.0|[13.0,500.0,2.0,4...|[0.02390438247011...|
|14.0|500.0|2.0|2.0|30360.0|-838.0743408| -10.9871397|921.7219238| 0.0|-10.37674046| 0.078919999| 1.0|17.99999046|1.000090003| 异常| 1.0|[14.0,500.0,2.0,2...|[0.02589641434262...|
|15.0|500.0|2.0|2.0|41160.0|-716.6054688|-12.20792961|788.1759644| 10.37670994|-10.37674046|-11.29961967|0.985729992|6.666659832|0.985870004| 异常| 1.0|[15.0,500.0,2.0,2...|[0.02788844621513...|
|16.0|500.0|2.0|2.0|67740.0| -873.477356|-15.25990963|960.7225342| 0.0|-14.64951992| 0.146300003| 1.0|24.99998093|1.000149965| 异常| 1.0|[16.0,500.0,2.0,2...|[0.02988047808764...|
|17.0|500.0|2.0|2.0|68280.0|-875.9189453|-15.25990963|963.4072266| 0.0|-11.59753036| 0.137759998| 1.0|4.166669846|1.000139952| 异常| 1.0|[17.0,500.0,2.0,2...|[0.03187250996015...|
|18.0|500.0|2.0|2.0|73380.0|-868.5941772|-11.59753036|955.2910156| 0.0| -10.9871397| 0.084899999| 1.0|18.99998093|1.000090003| 异常| 1.0|[18.0,500.0,2.0,2...|[0.03386454183266...|
|19.0|500.0|2.0|2.0|74400.0|-869.8150024|-6.103960037|956.5719604|-5.493589878|-5.493569851| 6.064700127|1.006360054|9.999990463|1.006379962| 异常| 1.0|[19.0,500.0,2.0,2...|[0.03585657370517...|
|20.0|500.0|2.0|2.0|74820.0|-864.3214111|-12.20792961|950.6018677| 0.0|-11.59753036| 0.094609998| 1.0|19.99998093|1.000100017| 异常| 1.0|[20.0,500.0,2.0,2...|[0.03784860557768...|
+----+-----+---+---+-------+------------+------------+-----------+------------+------------+------------+-----------+-----------+-----------+-----+-------------+--------------------+--------------------+
only showing top 20 rows
root
|-- ID: double (nullable = true)
|-- V: double (nullable = true)
|-- M: double (nullable = true)
|-- W: double (nullable = true)
|-- S: double (nullable = true)
|-- P: double (nullable = true)
|-- Q: double (nullable = true)
|-- I: double (nullable = true)
|-- P_DVALUE: double (nullable = true)
|-- Q_DVALUE: double (nullable = true)
|-- I_DVALUE: double (nullable = true)
|-- P_PVALUE: double (nullable = true)
|-- Q_PVALUE: double (nullable = true)
|-- I_PVALUE: double (nullable = true)
|-- LABEL: string (nullable = true)
|-- prediction: string (nullable = true)
|-- probability: vector (nullable = false)
|-- prob_正常: double (nullable = true)
|-- prob_异常: double (nullable = true)
RNN classification cost time: 319.274000s
org.apache.spark.SparkException: Job aborted due to stage failure: Task 3 in stage 146.0 failed 1 times, most recent failure: Lost task 3.0 in stage 146.0 (TID 551, localhost, executor driver): java.lang.RuntimeException: Error while encoding: java.lang.RuntimeException: org.apache.spark.ml.linalg.DenseVector is not a valid external type for schema of vector
if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 0, ID), DoubleType) AS ID#522
if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 1, V), DoubleType) AS V#523
if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 2, M), DoubleType) AS M#524
if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 3, W), DoubleType) AS W#525
if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 4, S), DoubleType) AS S#526
if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 5, P), DoubleType) AS P#527
if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 6, Q), DoubleType) AS Q#528
if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 7, I), DoubleType) AS I#529
if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 8, P_DVALUE), DoubleType) AS P_DVALUE#530
if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 9, Q_DVALUE), DoubleType) AS Q_DVALUE#531
if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 10, I_DVALUE), DoubleType) AS I_DVALUE#532
if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 11, P_PVALUE), DoubleType) AS P_PVALUE#533
if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 12, Q_PVALUE), DoubleType) AS Q_PVALUE#534
if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 13, I_PVALUE), DoubleType) AS I_PVALUE#535
if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else staticinvoke(class org.apache.spark.unsafe.types.UTF8String, StringType, fromString, validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 14, LABEL), StringType), true, false) AS LABEL#536
validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 15, label_numeric), DoubleType) AS label_numeric#537
if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else newInstance(class org.apache.spark.ml.linalg.VectorUDT).serialize AS featureoutput#538
if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else newInstance(class org.apache.spark.ml.linalg.VectorUDT).serialize AS featureoutputStd#539
validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 18, prediction), DoubleType) AS prediction#540
newInstance(class org.apache.spark.mllib.linalg.VectorUDT).serialize AS probability#541
	at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.toRow(ExpressionEncoder.scala:291)
	at org.apache.spark.sql.SparkSession$$anonfun$4.apply(SparkSession.scala:589)
	at org.apache.spark.sql.SparkSession$$anonfun$4.apply(SparkSession.scala:589)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
	at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
	at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
	at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$10$$anon$1.hasNext(WholeStageCodegenExec.scala:614)
	at org.apache.spark.sql.execution.UnsafeExternalRowSorter.sort(UnsafeExternalRowSorter.java:216)
	at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$2.apply(ShuffleExchangeExec.scala:295)
	at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$2.apply(ShuffleExchangeExec.scala:266)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:830)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:830)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
	at org.apache.spark.scheduler.Task.run(Task.scala:109)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: org.apache.spark.ml.linalg.DenseVector is not a valid external type for schema of vector
	at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.Invoke2$(Unknown Source)
	at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.writeFields3_5$(Unknown Source)
	at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.apply(Unknown Source)
	at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.toRow(ExpressionEncoder.scala:288)
	... 25 more
Driver stacktrace:
	at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1599)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1587)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1586)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
	at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1586)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:831)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:831)
	at scala.Option.foreach(Option.scala:257)
	at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:831)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1820)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1769)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1758)
	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:642)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2027)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2048)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2067)
	at org.apache.spark.sql.execution.SparkPlan.executeTake(SparkPlan.scala:363)
	at org.apache.spark.sql.execution.CollectLimitExec.executeCollect(limit.scala:38)
	at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$collectFromPlan(Dataset.scala:3272)
	at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:2484)
	at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:2484)
	at org.apache.spark.sql.Dataset$$anonfun$52.apply(Dataset.scala:3253)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:77)
	at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3252)
	at org.apache.spark.sql.Dataset.head(Dataset.scala:2484)
	at org.apache.spark.sql.Dataset.take(Dataset.scala:2698)
	at org.apache.spark.sql.Dataset.takeAsList(Dataset.scala:2709)
	at com.nari.bdp.mine.base.utils.MineBaseUtil.convertDfToJavaListDColumns(MineBaseUtil.java:75)
	at com.nari.bdp.mine.base.utils.MineBaseUtil.convertDfToJavaList(MineBaseUtil.java:28)
	at com.nari.bdp.mine.operator.impl.AutoBehavior.getResuleTable(AutoBehavior.java:646)
	at com.nari.bdp.mine.operator.impl.AutoBehavior.getDatasetMeta(AutoBehavior.java:475)
	at com.nari.bdp.mine.operator.impl.AutoBehavior.generateDatasetInsight(AutoBehavior.java:499)
	at com.nari.bdp.mine.operator.impl.AutoBehavior.generatePortInsight(AutoBehavior.java:523)
	at com.nari.bdp.mine.operator.impl.AutoBehavior.leave(AutoBehavior.java:612)
	at com.nari.bdp.mine.operator.impl.AutoBehavior.executeBehavior(AutoBehavior.java:437)
	at com.nari.bdp.mine.operator.impl.AutoBehavior.leave(AutoBehavior.java:635)
	at com.nari.bdp.mine.operator.impl.AutoBehavior.executeBehavior(AutoBehavior.java:437)
	at com.nari.bdp.mine.operator.impl.AutoBehavior.leave(AutoBehavior.java:635)
	at com.nari.bdp.mine.operator.impl.AutoBehavior.executeBehavior(AutoBehavior.java:437)
	at com.nari.bdp.mine.operator.impl.AutoBehavior.leave(AutoBehavior.java:635)
	at com.nari.bdp.mine.operator.impl.AutoBehavior.executeBehavior(AutoBehavior.java:437)
	at com.nari.bdp.mine.operator.impl.ExecutionImpl.executeNode(ExecutionImpl.java:75)
	at com.nari.bdp.mine.operator.impl.ProcessImpl.run(ProcessImpl.java:79)
	at com.nari.bdp.mine.executor.executor.Executor.rProcessExecute(Executor.java:125)
	at com.nari.bdp.mine.executor.executor.Executor.rProcessExecute(Executor.java:111)
	at com.nari.bdp.mine.executor.executor.Executor.excute(Executor.java:89)
	at com.nari.bdp.mine.executor.service.MineJobExecutor.jobExecutor(MineJobExecutor.java:75)
	at com.nari.bdp.mine_server_test_behavior.classification.RNNClassificationTest.main(RNNClassificationTest.java:34)
process: org.apache.spark.SparkException: Job aborted due to stage failure: Task 3 in stage 146.0 failed 1 times, most recent failure: Lost task 3.0 in stage 146.0 (TID 551, localhost, executor driver): java.lang.RuntimeException: Error while encoding: java.lang.RuntimeException: org.apache.spark.ml.linalg.DenseVector is not a valid external type for schema of vector
| at org.apache.spark.sql.SparkSession$$anonfun$4.apply(SparkSession.scala:589) | |
| at org.apache.spark.sql.SparkSession$$anonfun$4.apply(SparkSession.scala:589) | |
| at scala.collection.Iterator$$anon$11.next(Iterator.scala:409) | |
| at scala.collection.Iterator$$anon$11.next(Iterator.scala:409) | |
| at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source) | |
| at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43) | |
| at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$10$$anon$1.hasNext(WholeStageCodegenExec.scala:614) | |
| at org.apache.spark.sql.execution.UnsafeExternalRowSorter.sort(UnsafeExternalRowSorter.java:216) | |
| at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$2.apply(ShuffleExchangeExec.scala:295) | |
| at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$2.apply(ShuffleExchangeExec.scala:266) | |
| at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:830) | |
| at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:830) | |
| at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38) | |
| at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324) | |
| at org.apache.spark.rdd.RDD.iterator(RDD.scala:288) | |
| at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38) | |
| at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324) | |
| at org.apache.spark.rdd.RDD.iterator(RDD.scala:288) | |
| at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96) | |
| at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53) | |
| at org.apache.spark.scheduler.Task.run(Task.scala:109) | |
| at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345) | |
| at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) | |
| at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) | |
| at java.lang.Thread.run(Thread.java:748) | |
| Caused by: java.lang.RuntimeException: org.apache.spark.ml.linalg.DenseVector is not a valid external type for schema of vector | |
| at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.Invoke2$(Unknown Source) | |
| at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.writeFields3_5$(Unknown Source) | |
| at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.apply(Unknown Source) | |
| at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.toRow(ExpressionEncoder.scala:288) | |
| ... 25 more | |
| Driver stacktrace: | |
| process: java.lang.RuntimeException: Error while encoding: java.lang.RuntimeException: org.apache.spark.ml.linalg.DenseVector is not a valid external type for schema of vector | |
| process: java.lang.RuntimeException: org.apache.spark.ml.linalg.DenseVector is not a valid external type for schema of vector | |
| Match Error: Job aborted due to stage failure: Task 3 in stage 146.0 failed 1 times, most recent failure: Lost task 3.0 in stage 146.0 (TID 551, localhost, executor driver): java.lang.RuntimeException: Error while encoding: java.lang.RuntimeException: org.apache.spark.ml.linalg.DenseVector is not a valid external type for schema of vector | |
| 2019-08-23 16:11:21 ERROR ExecutionImpl:94 - Job aborted due to stage failure: Task 3 in stage 146.0 failed 1 times, most recent failure: Lost task 3.0 in stage 146.0 (TID 551, localhost, executor driver): java.lang.RuntimeException: Error while encoding: java.lang.RuntimeException: org.apache.spark.ml.linalg.DenseVector is not a valid external type for schema of vector | |
| org.apache.spark.SparkException: Job aborted due to stage failure: Task 3 in stage 146.0 failed 1 times, most recent failure: Lost task 3.0 in stage 146.0 (TID 551, localhost, executor driver): java.lang.RuntimeException: Error while encoding: java.lang.RuntimeException: org.apache.spark.ml.linalg.DenseVector is not a valid external type for schema of vector | |
| Driver stacktrace: | |
| at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1599) | |
| at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1587) | |
| at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1586) | |
| at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59) | |
| at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48) | |
| at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1586) | |
| at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:831) | |
| at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:831) | |
| at scala.Option.foreach(Option.scala:257) | |
| at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:831) | |
| at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1820) | |
| at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1769) | |
| at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1758) | |
| at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48) | |
| at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:642) | |
| at org.apache.spark.SparkContext.runJob(SparkContext.scala:2027) | |
| at org.apache.spark.SparkContext.runJob(SparkContext.scala:2048) | |
| at org.apache.spark.SparkContext.runJob(SparkContext.scala:2067) | |
| at org.apache.spark.sql.execution.SparkPlan.executeTake(SparkPlan.scala:363) | |
| at org.apache.spark.sql.execution.CollectLimitExec.executeCollect(limit.scala:38) | |
| at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$collectFromPlan(Dataset.scala:3272) | |
| at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:2484) | |
| at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:2484) | |
| at org.apache.spark.sql.Dataset$$anonfun$52.apply(Dataset.scala:3253) | |
| at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:77) | |
| at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3252) | |
| at org.apache.spark.sql.Dataset.head(Dataset.scala:2484) | |
| at org.apache.spark.sql.Dataset.take(Dataset.scala:2698) | |
| at org.apache.spark.sql.Dataset.takeAsList(Dataset.scala:2709) | |
| at com.nari.bdp.mine.base.utils.MineBaseUtil.convertDfToJavaListDColumns(MineBaseUtil.java:75) | |
| at com.nari.bdp.mine.base.utils.MineBaseUtil.convertDfToJavaList(MineBaseUtil.java:28) | |
| at com.nari.bdp.mine.operator.impl.AutoBehavior.getResuleTable(AutoBehavior.java:646) | |
| at com.nari.bdp.mine.operator.impl.AutoBehavior.getDatasetMeta(AutoBehavior.java:475) | |
| at com.nari.bdp.mine.operator.impl.AutoBehavior.generateDatasetInsight(AutoBehavior.java:499) | |
| at com.nari.bdp.mine.operator.impl.AutoBehavior.generatePortInsight(AutoBehavior.java:523) | |
| at com.nari.bdp.mine.operator.impl.AutoBehavior.leave(AutoBehavior.java:612) | |
| at com.nari.bdp.mine.operator.impl.AutoBehavior.executeBehavior(AutoBehavior.java:437) | |
| at com.nari.bdp.mine.operator.impl.AutoBehavior.leave(AutoBehavior.java:635) | |
| at com.nari.bdp.mine.operator.impl.AutoBehavior.executeBehavior(AutoBehavior.java:437) | |
| at com.nari.bdp.mine.operator.impl.AutoBehavior.leave(AutoBehavior.java:635) | |
| at com.nari.bdp.mine.operator.impl.AutoBehavior.executeBehavior(AutoBehavior.java:437) | |
| at com.nari.bdp.mine.operator.impl.AutoBehavior.leave(AutoBehavior.java:635) | |
| at com.nari.bdp.mine.operator.impl.AutoBehavior.executeBehavior(AutoBehavior.java:437) | |
| at com.nari.bdp.mine.operator.impl.ExecutionImpl.executeNode(ExecutionImpl.java:75) | |
| at com.nari.bdp.mine.operator.impl.ProcessImpl.run(ProcessImpl.java:79) | |
| at com.nari.bdp.mine.executor.executor.Executor.rProcessExecute(Executor.java:125) | |
| at com.nari.bdp.mine.executor.executor.Executor.rProcessExecute(Executor.java:111) | |
| at com.nari.bdp.mine.executor.executor.Executor.excute(Executor.java:89) | |
| at com.nari.bdp.mine.executor.service.MineJobExecutor.jobExecutor(MineJobExecutor.java:75) | |
| at com.nari.bdp.mine_server_test_behavior.classification.RNNClassificationTest.main(RNNClassificationTest.java:34) | |
| Caused by: java.lang.RuntimeException: Error while encoding: java.lang.RuntimeException: org.apache.spark.ml.linalg.DenseVector is not a valid external type for schema of vector | |
| if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 0, ID), DoubleType) AS ID#522 | |
| if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 1, V), DoubleType) AS V#523 | |
| if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 2, M), DoubleType) AS M#524 | |
| if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 3, W), DoubleType) AS W#525 | |
| if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 4, S), DoubleType) AS S#526 | |
| if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 5, P), DoubleType) AS P#527 | |
| if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 6, Q), DoubleType) AS Q#528 | |
| if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 7, I), DoubleType) AS I#529 | |
| if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 8, P_DVALUE), DoubleType) AS P_DVALUE#530 | |
| if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 9, Q_DVALUE), DoubleType) AS Q_DVALUE#531 | |
| if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 10, I_DVALUE), DoubleType) AS I_DVALUE#532 | |
| if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 11, P_PVALUE), DoubleType) AS P_PVALUE#533 | |
| if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 12, Q_PVALUE), DoubleType) AS Q_PVALUE#534 | |
| if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 13, I_PVALUE), DoubleType) AS I_PVALUE#535 | |
| if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else staticinvoke(class org.apache.spark.unsafe.types.UTF8String, StringType, fromString, validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 14, LABEL), StringType), true, false) AS LABEL#536 | |
| validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 15, label_numeric), DoubleType) AS label_numeric#537 | |
| if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else newInstance(class org.apache.spark.ml.linalg.VectorUDT).serialize AS featureoutput#538 | |
| if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else newInstance(class org.apache.spark.ml.linalg.VectorUDT).serialize AS featureoutputStd#539 | |
| validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 18, prediction), DoubleType) AS prediction#540 | |
| newInstance(class org.apache.spark.mllib.linalg.VectorUDT).serialize AS probability#541 | |
| at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.toRow(ExpressionEncoder.scala:291) | |
| at org.apache.spark.sql.SparkSession$$anonfun$4.apply(SparkSession.scala:589) | |
| at org.apache.spark.sql.SparkSession$$anonfun$4.apply(SparkSession.scala:589) | |
| at scala.collection.Iterator$$anon$11.next(Iterator.scala:409) | |
| at scala.collection.Iterator$$anon$11.next(Iterator.scala:409) | |
| at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source) | |
| at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43) | |
| at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$10$$anon$1.hasNext(WholeStageCodegenExec.scala:614) | |
| at org.apache.spark.sql.execution.UnsafeExternalRowSorter.sort(UnsafeExternalRowSorter.java:216) | |
| at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$2.apply(ShuffleExchangeExec.scala:295) | |
| at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$2.apply(ShuffleExchangeExec.scala:266) | |
| at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:830) | |
| at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:830) | |
| at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38) | |
| at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324) | |
| at org.apache.spark.rdd.RDD.iterator(RDD.scala:288) | |
| at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38) | |
| at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324) | |
| at org.apache.spark.rdd.RDD.iterator(RDD.scala:288) | |
| at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96) | |
| at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53) | |
| at org.apache.spark.scheduler.Task.run(Task.scala:109) | |
| at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345) | |
| at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) | |
| at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) | |
| at java.lang.Thread.run(Thread.java:748) | |
| Caused by: java.lang.RuntimeException: org.apache.spark.ml.linalg.DenseVector is not a valid external type for schema of vector | |
| at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.Invoke2$(Unknown Source) | |
| at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.writeFields3_5$(Unknown Source) | |
| at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.apply(Unknown Source) | |
| at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.toRow(ExpressionEncoder.scala:288) | |
| ... 25 more | |
| com.meritdata.tempo.force.exit is:null | |
| Process finished with exit code 1024 |
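A note on the root cause shown in the trace: the encoder's projection serializes most columns with `org.apache.spark.ml.linalg.VectorUDT` (`featureoutput#538`, `featureoutputStd#539`) but `probability#541` with the legacy `org.apache.spark.mllib.linalg.VectorUDT`, while the rows carry `ml.linalg.DenseVector` instances, hence "not a valid external type for schema of vector". A minimal sketch of the usual remedy, assuming a hand-built schema; the column names `prediction` and `probability` are taken from the trace, everything else here is illustrative and requires Spark 2.x on the classpath:

```java
import org.apache.spark.ml.linalg.SQLDataTypes;
import org.apache.spark.ml.linalg.Vector;
import org.apache.spark.mllib.linalg.Vectors;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.StructType;

// When declaring a vector column by hand, use the ml-package UDT so the
// schema matches the ml.linalg.DenseVector instances stored in the rows.
StructType schema = new StructType()
        .add("prediction", DataTypes.DoubleType)
        .add("probability", SQLDataTypes.VectorType()); // ml VectorUDT, not mllib

// Alternatively, convert legacy mllib vectors before re-encoding the rows:
org.apache.spark.mllib.linalg.Vector legacy = Vectors.dense(0.1, 0.9);
Vector converted = legacy.asML(); // yields an ml.linalg.DenseVector
```

Either keeping the schema and the row contents in the same linalg package, or converting via `asML()`, would avoid the mismatch the projection above trips over.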