hadoop - Reading a file into an ArrayList in Scala Spark


I am new to Spark and Scala.

I want to read a file into an array list.

This is how it is done in Java:

import java.io.BufferedReader;
import java.io.FileReader;
import java.util.ArrayList;
import java.util.List;

List<String> sourceRecords = new ArrayList<String>();
BufferedReader sw = new BufferedReader(new FileReader(srcPath[0].toString()));
String srcLine;
while ((srcLine = sw.readLine()) != null) {
    sourceRecords.add(srcLine.toString());
}

How do I do this in Scala with Spark?

It's easy. For example:

val rdd = sc.textFile("your_file_path")
val sourceRecords = rdd.toArray
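Note that toArray on an RDD was later deprecated in favour of collect. Here is a minimal sketch of the same thing using collect, assuming sc is the usual SparkContext and "your_file_path" stands in for your input file:

// Read the file into an RDD of lines, then materialize it on the driver
// as an Array[String]; collect() is the non-deprecated equivalent of toArray.
val rdd = sc.textFile("your_file_path")
val sourceRecords: Array[String] = rdd.collect()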

However, you don't need to convert the RDD to an array; you can manipulate the RDD directly, much like an array.
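For instance, here is a minimal sketch of working on the RDD directly, without pulling everything back to the driver (the ERROR filter is just an illustration, not part of the original question):

// Transformations like filter run lazily on the cluster;
// only the final count is sent back to the driver.
val errorLines = rdd.filter(line => line.contains("ERROR"))
val lineCount = errorLines.count()
println(lineCount)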

You can find more information at https://spark.incubator.apache.org/examples.html

