scala - HDFS Directory as parameter in Spark Streaming



I'm having problems with the Spark Streaming examples:

When I start it with sbt, passing a plain path:

  run local /user/dir/subdir/

I get this exception:

  [info] Running org.apache.spark.streaming.examples.HdfsWordCount local /user/dir/subdir/
  14/04/21 18:45:55 INFO StreamingExamples: Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
  14/04/21 18:45:55 INFO StreamingExamples: Setting log level to [WARN] for streaming example. To override add a custom log4j.properties to the classpath.
  14/04/21 18:45:55 WARN Utils: Your hostname, ubuntu resolves to a loopback address: 127.0.1.1; using 10.4.4.6 instead (on interface eth0)
  14/04/21 18:45:55 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
  14/04/21 18:45:57 WARN NativeCodeLoader: Unable to load native-hadoop library... using builtin-java classes where applicable
  14/04/21 18:46:00 ERROR JobScheduler: Error generating jobs for time 1398098760000 ms
  java.io.FileNotFoundException: File /user/dir/subdir/ does not exist

I'm sure the directory exists on the Hadoop fs and I have copied a file into it. Is there some input directory formatting I'm not aware of?

I have found the solution to my own question. The right way to pass an HDFS directory, at least in my case, is:

  run local hdfs://localhost:9000/user/dir/subdir/

I found it in the Spark documentation: the input must be a full HDFS URI, not a bare path, otherwise the path is resolved against the local filesystem.
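For context, a minimal sketch of the HdfsWordCount pattern the example implements, showing where the full `hdfs://` URI goes. This is an illustrative reconstruction based on the Spark Streaming API of that era, not the example's exact source; the host, port, and directory are the ones from this question.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object HdfsWordCountSketch {
  def main(args: Array[String]): Unit = {
    // Run locally with 2 threads; a 2-second batch interval.
    val conf = new SparkConf().setMaster("local[2]").setAppName("HdfsWordCount")
    val ssc = new StreamingContext(conf, Seconds(2))

    // Pass the directory as a full HDFS URI. A bare /user/dir/subdir/
    // is resolved against the local filesystem, which triggers the
    // FileNotFoundException seen above.
    val lines = ssc.textFileStream("hdfs://localhost:9000/user/dir/subdir/")

    // Count words in each batch of newly appearing files.
    val counts = lines.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _)
    counts.print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

Note that `textFileStream` only picks up files newly created in the directory after the stream starts, which is why copying a file in beforehand produces no output even once the URI is correct.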
