While developing a Spark MLlib program with IntelliJ IDEA on Windows, I ran into the following error.
val spark = SparkSession.builder().master("local").appName("spark_mllib").getOrCreate()
Running the program produces the following error:
16/08/11 15:39:20 INFO SharedState: Warehouse path is 'file:D:\development\intellij_idea\workspace\SparkFaultBench/spark-warehouse'.
Exception in thread "main" java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: file:D:/development/intellij_idea/workspace/SparkFaultBench/spark-warehouse
at org.apache.hadoop.fs.Path.initialize(Path.java:205)
at org.apache.hadoop.fs.Path.<init>(Path.java:171)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.makeQualifiedPath(SessionCatalog.scala:114)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.createDatabase(SessionCatalog.scala:145)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.<init>(SessionCatalog.scala:89)
at org.apache.spark.sql.internal.SessionState.catalog$lzycompute(SessionState.scala:95)
at org.apache.spark.sql.internal.SessionState.catalog(SessionState.scala:95)
at org.apache.spark.sql.internal.SessionState$$anon$1.<init>(SessionState.scala:112)
at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:112)
at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:111)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:49)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:64)
at org.apache.spark.sql.SparkSession.createDataFrame(SparkSession.scala:266)
at mllib.ml_wjf.TestMLlib$.main(TestMLlib.scala:29)
at mllib.ml_wjf.TestMLlib.main(TestMLlib.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: java.net.URISyntaxException: Relative path in absolute URI: file:D:/development/intellij_idea/workspace/SparkFaultBench/spark-warehouse
Solution: the default spark.sql.warehouse.dir is derived from the working directory, and on Windows the resulting file:D:\... value is not a valid URI for Hadoop's Path class, which is what triggers the URISyntaxException. Override the setting when building the SparkSession:
val spark = SparkSession.builder()
  .master("local")
  .appName("spark_mllib")
  .config("spark.sql.warehouse.dir", "file:///")
  .getOrCreate()
With this setting in place, the error no longer occurs.
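For reference, here is a self-contained sketch of how the workaround fits into a full program. The object name mirrors the one in the stack trace, while the sample data and column names are made up for illustration:

import org.apache.spark.sql.SparkSession

object TestMLlib {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local")
      .appName("spark_mllib")
      // Point the warehouse at a well-formed URI so the SessionCatalog
      // does not choke on the Windows working-directory path.
      .config("spark.sql.warehouse.dir", "file:///")
      .getOrCreate()

    // Creating a DataFrame previously triggered the URISyntaxException;
    // with the config above it runs. The data here is purely illustrative.
    val df = spark.createDataFrame(Seq((1.0, 0.5), (0.0, 1.5)))
      .toDF("label", "feature")
    df.show()

    spark.stop()
  }
}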