
How do I use the LEN function?

数据分析 posted on 2023-4-23 11:56:42

Is `LEN([字段名])` the correct syntax? It throws an error when I run it.


Log:


2023-04-23 11:51:02.152 [993] INFO launcher.DefaultLauncher.run:59 - Task start. (id:76fac8d9c16e6859656a643cd0db203e,name:DERIVE_COLUMN)
2023-04-23 11:51:02.163 [993] INFO repository.NodeStatusRepository.executeUpdate:138 - Report status successful.(state:RUNNING)
2023-04-23 11:51:02.170 [993] WARN sql.SparkSession$Builder.logWarning:69 - Using an existing SparkSession; some spark core configurations may not take effect.
2023-04-23 11:51:02.170 [993] INFO node.GenericNode.start:107 - Node start. (id:76fac8d9c16e6859656a643cd0db203e,name:DERIVE_COLUMN)
2023-04-23 11:51:02.203 [993] INFO datasources.InMemoryFileIndex.logInfo:57 - It took 3 ms to list leaf files for 1 paths.
2023-04-23 11:51:02.283 [993] INFO spark.SparkContext.logInfo:57 - Starting job: parquet at DatasetEvent.java:229
2023-04-23 11:51:02.325 [993] INFO scheduler.DAGScheduler.logInfo:57 - Job 77 finished: parquet at DatasetEvent.java:229, took 0.039619 s
2023-04-23 11:51:02.332 [993] INFO util.EventSerializeUtil.deserialize:106 - Deserialization event finished,took 0.162 s
2023-04-23 11:51:02.432 [993] ERROR node.GenericNode.handleExecuteError:148 - Node execution failed.(id:76fac8d9c16e6859656a643cd0db203e,name:DERIVE_COLUMN)
org.apache.spark.sql.AnalysisException: Undefined function: 'len'. This function is neither a registered temporary function nor a permanent function registered in the database 'default'.; line 1 pos 11
        at org.apache.spark.sql.catalyst.analysis.Analyzer$LookupFunctions$$anonfun$apply$16.$anonfun$applyOrElse$121(Analyzer.scala:2108) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.analysis.package$.withPosition(package.scala:53) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.analysis.Analyzer$LookupFunctions$$anonfun$apply$16.applyOrElse(Analyzer.scala:2108) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.analysis.Analyzer$LookupFunctions$$anonfun$apply$16.applyOrElse(Analyzer.scala:2099) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDown$1(TreeNode.scala:318) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:74) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:318) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDown$3(TreeNode.scala:323) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$mapChildren$1(TreeNode.scala:408) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:244) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:406) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:359) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:323) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$transformExpressionsDown$1(QueryPlan.scala:94) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$mapExpressions$1(QueryPlan.scala:116) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:74) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpression$1(QueryPlan.scala:116) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.QueryPlan.recursiveTransform$1(QueryPlan.scala:127) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$mapExpressions$3(QueryPlan.scala:132) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:238) ~[scala-library-2.12.10.jar:?]
        at scala.collection.immutable.List.foreach(List.scala:392) ~[scala-library-2.12.10.jar:?]
        at scala.collection.TraversableLike.map(TraversableLike.scala:238) ~[scala-library-2.12.10.jar:?]
        at scala.collection.TraversableLike.map$(TraversableLike.scala:231) ~[scala-library-2.12.10.jar:?]
        at scala.collection.immutable.List.map(List.scala:298) ~[scala-library-2.12.10.jar:?]
        at org.apache.spark.sql.catalyst.plans.QueryPlan.recursiveTransform$1(QueryPlan.scala:132) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$mapExpressions$4(QueryPlan.scala:137) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:244) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.QueryPlan.mapExpressions(QueryPlan.scala:137) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpressionsDown(QueryPlan.scala:94) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpressions(QueryPlan.scala:85) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveExpressions$1.applyOrElse(AnalysisHelper.scala:153) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveExpressions$1.applyOrElse(AnalysisHelper.scala:152) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDown$2(AnalysisHelper.scala:110) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:74) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDown$1(AnalysisHelper.scala:110) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:223) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDown(AnalysisHelper.scala:108) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDown$(AnalysisHelper.scala:106) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsDown(LogicalPlan.scala:29) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperators(AnalysisHelper.scala:73) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperators$(AnalysisHelper.scala:72) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperators(LogicalPlan.scala:29) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveExpressions(AnalysisHelper.scala:152) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveExpressions$(AnalysisHelper.scala:151) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveExpressions(LogicalPlan.scala:29) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.analysis.Analyzer$LookupFunctions$.apply(Analyzer.scala:2099) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.analysis.Analyzer$LookupFunctions$.apply(Analyzer.scala:2096) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$2(RuleExecutor.scala:216) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at scala.collection.IndexedSeqOptimized.foldLeft(IndexedSeqOptimized.scala:60) ~[scala-library-2.12.10.jar:?]
        at scala.collection.IndexedSeqOptimized.foldLeft$(IndexedSeqOptimized.scala:68) ~[scala-library-2.12.10.jar:?]
        at scala.collection.mutable.WrappedArray.foldLeft(WrappedArray.scala:38) ~[scala-library-2.12.10.jar:?]
        at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$1(RuleExecutor.scala:213) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$1$adapted(RuleExecutor.scala:205) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at scala.collection.immutable.List.foreach(List.scala:392) ~[scala-library-2.12.10.jar:?]
        at org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:205) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.analysis.Analyzer.org$apache$spark$sql$catalyst$analysis$Analyzer$$executeSameContext(Analyzer.scala:198) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:192) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:155) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$executeAndTrack$1(RuleExecutor.scala:183) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:88) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.rules.RuleExecutor.executeAndTrack(RuleExecutor.scala:183) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeAndCheck$1(Analyzer.scala:176) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:230) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:175) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.execution.QueryExecution.$anonfun$analyzed$1(QueryExecution.scala:73) ~[spark-sql_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:111) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:143) ~[spark-sql_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775) ~[spark-sql_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:143) ~[spark-sql_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:73) ~[spark-sql_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:71) ~[spark-sql_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:63) ~[spark-sql_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:98) ~[spark-sql_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775) ~[spark-sql_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:96) ~[spark-sql_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:618) ~[spark-sql_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775) ~[spark-sql_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:613) ~[spark-sql_2.12-3.1.3.jar:3.1.3]
        at smartbix.datamining.engine.execute.node.preprocess.DeriveColumnNode.execute(DeriveColumnNode.java:67) ~[EngineCommonNode-1.0.jar:?]
        at smartbix.datamining.engine.execute.node.GenericNode.start(GenericNode.java:117) ~[EngineCore-1.0.jar:?]
        at smartbix.datamining.engine.agent.execute.executor.DefaultNodeExecutor.execute(DefaultNodeExecutor.java:43) ~[EngineAgent-1.0.jar:?]
        at smartbix.datamining.engine.agent.execute.launcher.DefaultLauncher.run(DefaultLauncher.java:67) ~[EngineAgent-1.0.jar:?]
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_202-ea]
        at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_202-ea]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_202-ea]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_202-ea]
        at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_202-ea]
2023-04-23 11:51:02.449 [993] INFO repository.NodeStatusRepository.executeUpdate:138 - Report status successful.(state:FAIL)
Reply posted on 2023-4-23 13:37:55
In what scenario are you using the len function? Judging from the error, the underlying engine does not support a len function: the `AnalysisException: Undefined function: 'len'` in the log means Spark SQL (3.1 here) has no built-in or registered function by that name. Spark's built-in string-length function is `length()`, so writing `length([字段名])` instead should work.
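The mismatch is easy to reproduce outside Spark. As a minimal sketch (using SQLite via Python's standard library as a stand-in, since it behaves the same way on this point: it exposes `LENGTH()` but no `LEN()`):

```python
import sqlite3

# In-memory database with one text column, mimicking a string field.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (name TEXT)")
conn.execute("INSERT INTO t VALUES ('hello'), ('麦粉')")

# LENGTH() is the standard function name and works
# (returns character count for text values):
rows = conn.execute("SELECT name, LENGTH(name) FROM t ORDER BY rowid").fetchall()
print(rows)  # [('hello', 5), ('麦粉', 2)]

# LEN() is a SQL Server-style name; calling it fails here, mirroring
# the "Undefined function: 'len'" AnalysisException in the Spark log:
try:
    conn.execute("SELECT LEN(name) FROM t").fetchall()
except sqlite3.OperationalError as e:
    print(e)  # no such function: LEN
```

The general rule: `LEN()` is specific to SQL Server/Transact-SQL, while Spark SQL, Hive, MySQL, PostgreSQL, and SQLite all use `LENGTH()` (or `CHAR_LENGTH()`), so formulas written against one engine need the function name adapted when the node runs on Spark.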
