PySpark StorageLevel

PySpark StorageLevel is used to decide how an RDD should be stored. It also determines whether to serialize the RDD and whether to replicate its partitions. In Apache Spark, StorageLevel controls whether an RDD is kept in memory, stored on disk, or both. It also exposes commonly used storage levels as static constants, such as MEMORY_ONLY. The following code block contains the class definition of StorageLevel:
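(An abridged sketch of the definition in pyspark/storagelevel.py; the docstring and helper methods are omitted, and minor details may vary across Spark versions.)

class StorageLevel:
    # Flags for controlling the storage of an RDD: each constructor
    # argument records where and how the partitions are kept.
    def __init__(self, useDisk, useMemory, useOffHeap, deserialized, replication=1):
        self.useDisk = useDisk
        self.useMemory = useMemory
        self.useOffHeap = useOffHeap
        self.deserialized = deserialized
        self.replication = replication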
Class Variables

There are different PySpark StorageLevels to decide how an RDD is stored, such as:

DISK_ONLY: store the RDD partitions only on disk.
DISK_ONLY_2: same as DISK_ONLY, but replicate each partition on two cluster nodes.
MEMORY_ONLY: store the RDD in memory (PySpark always keeps the data serialized); partitions that do not fit are recomputed when needed.
MEMORY_ONLY_2: same as MEMORY_ONLY, but replicate each partition on two cluster nodes.
MEMORY_AND_DISK: store the RDD in memory and spill partitions that do not fit to disk.
MEMORY_AND_DISK_2: same as MEMORY_AND_DISK, but replicate each partition on two cluster nodes.
OFF_HEAP: store the RDD in serialized form in off-heap memory.

Instance Method

Example of PySpark StorageLevel

Here we use the storage level MEMORY_AND_DISK_2, which means the RDD partitions are stored in memory and on disk with a replication factor of 2, as shown in the sketch below.
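A minimal runnable sketch (the app name "storagelevel_app" and the sample data are illustrative):

from pyspark import SparkContext
import pyspark

sc = SparkContext("local", "storagelevel_app")
rdd1 = sc.parallelize([1, 2])

# Persist the RDD in memory and on disk, replicated on two nodes
rdd1.persist(pyspark.StorageLevel.MEMORY_AND_DISK_2)

# Print the storage level that was applied to the RDD
print(rdd1.getStorageLevel())

sc.stop()

Output:

Disk Memory Serialized 2x Replicated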