How to get the most common element for each array in a list (PySpark)

Keywords: apache-spark pyspark rdd


I have a list of arrays, e.g. [array([0, 1, 1]), array([0, 0, 1]), array([1, 1, 0])], and for each array in the list I need to find its most frequent element.

from collections import Counter

def finalml(listn):
    return Counter(listn).most_common(1)
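For reference, `most_common(1)` returns a list of `(value, count)` tuples rather than the value itself, so the majority label sits at `[0][0]`. A quick check on a plain Python list:

```python
from collections import Counter

def finalml(listn):
    # most_common(1) returns e.g. [(1, 2)]: the value 1 appears twice
    return Counter(listn).most_common(1)

print(finalml([0, 1, 1]))        # [(1, 2)]
print(finalml([0, 1, 1])[0][0])  # 1 -- the majority element itself
```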

results = rdd.map(lambda xw: bc_knnobj.value.kneighbors(xw, return_distance=False)).collect()  # the list of neighbor arrays is returned by this

labels = rdd.map(lambda xw: finalml(xw)).collect()

The above code throws a `TypeError: unhashable type: 'list'`.
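`Counter` hashes every element it counts, so it fails as soon as the elements are themselves lists (or rows of a 2-D array). A likely cause here is that `kneighbors` returns a 2-D array of shape `(1, n_neighbors)` per query point, so `Counter` receives a sequence containing one row instead of a flat sequence of labels. A minimal sketch of the fix, using a plain nested list to stand in for the `kneighbors` output (no Spark needed to show the idea):

```python
from collections import Counter

# Hypothetical kneighbors-style output for one query point:
# a nested list of shape (1, 3), like a 2-D array
neighbors = [[0, 1, 1]]

# Counter(neighbors) would raise "TypeError: unhashable type: 'list'",
# because the single element of `neighbors` is itself a list.

# Flatten to the inner row first, then count:
flat = neighbors[0]
label = Counter(flat).most_common(1)[0][0]
print(label)  # 1
```

The same idea applies inside the Spark `map`: take the inner row (or convert it to a tuple, which is hashable) before handing it to `Counter`.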

Expected output:

labels
------
1
0
1