
Spark Map function

In Spark, the map transformation passes each element of the source RDD through a function and returns a new distributed dataset (RDD) containing the results.

Example of Map function

In this example, we add a constant value of 10 to each element of an RDD.

  • To open Spark in Scala mode, run the spark-shell command.
  • Create an RDD using a parallelized collection.
  • Read the contents of the RDD to verify the input data.
  • Apply the map function, passing the expression to be evaluated for each element.
  • Read the generated result.
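The original screenshots of the spark-shell session are not reproduced here; the commands below are a sketch of what the steps likely looked like, with the input values (10, 20, 30) assumed for illustration. Because `map` on an RDD applies a function element-wise, just like `map` on an ordinary Scala collection, the same logic can also be checked locally without a cluster:

```scala
// In spark-shell, the session would use the driver's SparkContext (sc):
//   val data = sc.parallelize(List(10, 20, 30))   // assumed sample values
//   data.collect()                                // read the input RDD
//   val mapped = data.map(x => x + 10)            // add 10 to each element
//   mapped.collect()                              // read the generated result

// The same element-wise transformation on a local Scala collection:
val data = List(10, 20, 30)          // assumed sample values
val mapped = data.map(x => x + 10)   // add the constant 10 to each element
println(mapped)
```

Note that `map` is a transformation, so in Spark it is evaluated lazily; nothing is computed until an action such as `collect()` is called.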

Here, each element of the output equals the corresponding input element plus 10, which is the desired result.





