Late-Month Update Digest | Deeper Hands-On Scenarios, Upgraded Integration, and Advanced Development

As early winter sets in, a fresh batch of updates arrives right on schedule! This round focuses on practical techniques, integration enhancements, and advanced development: a set of new features and tutorials to take your data analysis and system development efficiency another step forward.

1. Hands-On Techniques, Explained in Depth

"Radar Charts: A 'Lens' into Multi-Dimensional Data, Read the Truth in 3 Steps" → Covers the core application scenarios of radar charts, helping you quickly master visual analysis of multi-object, multi-dimensional data.

"Unlock the 'Data Code' of Everyday Life with Charts!" → Explores how charts apply to day-to-day scenarios, making data interpretation more intuitive and actionable.

2. Live Streams

"2025 New Features in Practice (Part 1): Secrets to Multiplying Data Analysis Efficiency" → Walks through how to put the 2025 new features into practice and achieve a multi-fold boost in data analysis efficiency.

3. Technical Experience Sharing

"[Expert Share] The 'Power Game' of Data Sorting: Priority Rules Decide Who Comes First" → Explains the business configuration logic behind advanced sorting, so key data always appears in the top display positions.

4. Custom Development Videos

"Extension Package Development: Front-End Customization" → A full walkthrough from requirements analysis to final implementation, helping you get started quickly with Smartbi front-end customization.

5. Ongoing Community Tasks

"[Beginner Task] Unlock Life's 'Data Code': A Creative Visualization Challenge" → Launches a creative visualization task, encouraging more vivid and engaging data storytelling.

"[Beginner Task] Master Radar Charts, 200 麦豆 Up for Grabs!" → A hands-on radar chart task that uses rewards to sharpen multi-dimensional data analysis skills.

6. Brand-New Resources

"AD Domain (LDAP/LDAPS) Login Verification V2" → Extends domain account login support for seamless integration with the enterprise Windows authentication system.

"Data Model: Connecting to RESTful API Endpoints" → Opens up the channel between data models and RESTful APIs, improving system integration and data retrieval efficiency.

"Scheduled Task: Periodically Clear the User Attribute Cache" → Introduces an automatic cache-clearing mechanism, so permission changes take effect immediately and business data stays current.

"User Sync: Custom User Group Assignment in the BI System" → Optimizes user-group synchronization logic, with automatic recognition and completion of custom group information.

"Approval Workflow: Invoking Self-Service ETL" → Enhances workflow integration, allowing user task nodes to invoke self-service ETL processes directly.

麦粉社区 · Post Details

How do I use the LEN function?

数据分析 posted on 2023-4-23 11:56:42

Is the expression LEN([字段名]) written correctly? It throws an error when I run it.


Log:


2023-04-23 11:51:02.152 [993] INFO launcher.DefaultLauncher.run:59 - Task start. (id:76fac8d9c16e6859656a643cd0db203e,name:DERIVE_COLUMN)
2023-04-23 11:51:02.163 [993] INFO repository.NodeStatusRepository.executeUpdate:138 - Report status successful.(state:RUNNING)
2023-04-23 11:51:02.170 [993] WARN sql.SparkSession$Builder.logWarning:69 - Using an existing SparkSession; some spark core configurations may not take effect.
2023-04-23 11:51:02.170 [993] INFO node.GenericNode.start:107 - Node start. (id:76fac8d9c16e6859656a643cd0db203e,name:DERIVE_COLUMN)
2023-04-23 11:51:02.203 [993] INFO datasources.InMemoryFileIndex.logInfo:57 - It took 3 ms to list leaf files for 1 paths.
2023-04-23 11:51:02.283 [993] INFO spark.SparkContext.logInfo:57 - Starting job: parquet at DatasetEvent.java:229
2023-04-23 11:51:02.325 [993] INFO scheduler.DAGScheduler.logInfo:57 - Job 77 finished: parquet at DatasetEvent.java:229, took 0.039619 s
2023-04-23 11:51:02.332 [993] INFO util.EventSerializeUtil.deserialize:106 - Deserialization event finished,took 0.162 s
2023-04-23 11:51:02.432 [993] ERROR node.GenericNode.handleExecuteError:148 - Node execution failed.(id:76fac8d9c16e6859656a643cd0db203e,name:DERIVE_COLUMN)
org.apache.spark.sql.AnalysisException: Undefined function: 'len'. This function is neither a registered temporary function nor a permanent function registered in the database 'default'.; line 1 pos 11
        at org.apache.spark.sql.catalyst.analysis.Analyzer$LookupFunctions$$anonfun$apply$16.$anonfun$applyOrElse$121(Analyzer.scala:2108) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.analysis.package$.withPosition(package.scala:53) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.analysis.Analyzer$LookupFunctions$$anonfun$apply$16.applyOrElse(Analyzer.scala:2108) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.analysis.Analyzer$LookupFunctions$$anonfun$apply$16.applyOrElse(Analyzer.scala:2099) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDown$1(TreeNode.scala:318) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:74) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:318) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDown$3(TreeNode.scala:323) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$mapChildren$1(TreeNode.scala:408) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:244) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:406) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:359) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:323) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$transformExpressionsDown$1(QueryPlan.scala:94) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$mapExpressions$1(QueryPlan.scala:116) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:74) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpression$1(QueryPlan.scala:116) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.QueryPlan.recursiveTransform$1(QueryPlan.scala:127) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$mapExpressions$3(QueryPlan.scala:132) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:238) ~[scala-library-2.12.10.jar:?]
        at scala.collection.immutable.List.foreach(List.scala:392) ~[scala-library-2.12.10.jar:?]
        at scala.collection.TraversableLike.map(TraversableLike.scala:238) ~[scala-library-2.12.10.jar:?]
        at scala.collection.TraversableLike.map$(TraversableLike.scala:231) ~[scala-library-2.12.10.jar:?]
        at scala.collection.immutable.List.map(List.scala:298) ~[scala-library-2.12.10.jar:?]
        at org.apache.spark.sql.catalyst.plans.QueryPlan.recursiveTransform$1(QueryPlan.scala:132) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$mapExpressions$4(QueryPlan.scala:137) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:244) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.QueryPlan.mapExpressions(QueryPlan.scala:137) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpressionsDown(QueryPlan.scala:94) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpressions(QueryPlan.scala:85) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveExpressions$1.applyOrElse(AnalysisHelper.scala:153) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveExpressions$1.applyOrElse(AnalysisHelper.scala:152) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDown$2(AnalysisHelper.scala:110) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:74) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDown$1(AnalysisHelper.scala:110) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:223) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDown(AnalysisHelper.scala:108) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDown$(AnalysisHelper.scala:106) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsDown(LogicalPlan.scala:29) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperators(AnalysisHelper.scala:73) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperators$(AnalysisHelper.scala:72) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperators(LogicalPlan.scala:29) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveExpressions(AnalysisHelper.scala:152) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveExpressions$(AnalysisHelper.scala:151) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveExpressions(LogicalPlan.scala:29) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.analysis.Analyzer$LookupFunctions$.apply(Analyzer.scala:2099) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.analysis.Analyzer$LookupFunctions$.apply(Analyzer.scala:2096) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$2(RuleExecutor.scala:216) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at scala.collection.IndexedSeqOptimized.foldLeft(IndexedSeqOptimized.scala:60) ~[scala-library-2.12.10.jar:?]
        at scala.collection.IndexedSeqOptimized.foldLeft$(IndexedSeqOptimized.scala:68) ~[scala-library-2.12.10.jar:?]
        at scala.collection.mutable.WrappedArray.foldLeft(WrappedArray.scala:38) ~[scala-library-2.12.10.jar:?]
        at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$1(RuleExecutor.scala:213) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$1$adapted(RuleExecutor.scala:205) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at scala.collection.immutable.List.foreach(List.scala:392) ~[scala-library-2.12.10.jar:?]
        at org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:205) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.analysis.Analyzer.org$apache$spark$sql$catalyst$analysis$Analyzer$$executeSameContext(Analyzer.scala:198) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:192) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:155) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$executeAndTrack$1(RuleExecutor.scala:183) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:88) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.rules.RuleExecutor.executeAndTrack(RuleExecutor.scala:183) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeAndCheck$1(Analyzer.scala:176) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:230) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:175) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.execution.QueryExecution.$anonfun$analyzed$1(QueryExecution.scala:73) ~[spark-sql_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:111) ~[spark-catalyst_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:143) ~[spark-sql_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775) ~[spark-sql_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:143) ~[spark-sql_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:73) ~[spark-sql_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:71) ~[spark-sql_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:63) ~[spark-sql_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:98) ~[spark-sql_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775) ~[spark-sql_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:96) ~[spark-sql_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:618) ~[spark-sql_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775) ~[spark-sql_2.12-3.1.3.jar:3.1.3]
        at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:613) ~[spark-sql_2.12-3.1.3.jar:3.1.3]
        at smartbix.datamining.engine.execute.node.preprocess.DeriveColumnNode.execute(DeriveColumnNode.java:67) ~[EngineCommonNode-1.0.jar:?]
        at smartbix.datamining.engine.execute.node.GenericNode.start(GenericNode.java:117) ~[EngineCore-1.0.jar:?]
        at smartbix.datamining.engine.agent.execute.executor.DefaultNodeExecutor.execute(DefaultNodeExecutor.java:43) ~[EngineAgent-1.0.jar:?]
        at smartbix.datamining.engine.agent.execute.launcher.DefaultLauncher.run(DefaultLauncher.java:67) ~[EngineAgent-1.0.jar:?]
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_202-ea]
        at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_202-ea]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_202-ea]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_202-ea]
        at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_202-ea]
2023-04-23 11:51:02.449 [993] INFO repository.NodeStatusRepository.executeUpdate:138 - Report status successful.(state:FAIL)
Reply posted on 2023-4-23 13:37:55
In what scenario are you using the len function? Judging from the error, the execution engine behind this node (Spark SQL, per the stack trace) does not support a len function.
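A minimal sketch of the likely fix, assuming the derived-column expression is evaluated by Spark SQL 3.1 as the spark-catalyst frames in the log suggest: Spark SQL has no built-in len function, but it does provide length for string length, so the expression can be rewritten accordingly ([字段名] is the placeholder column from the question).

```sql
-- Fails on Spark SQL 3.1: "Undefined function: 'len'"
-- LEN([字段名])

-- Works: LENGTH is Spark SQL's built-in string-length function
LENGTH([字段名])
```

If the field is numeric, it may also need an explicit cast (e.g. LENGTH(CAST([字段名] AS STRING))) before taking its length.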
