I have a DataFrame named df, and I want to find the most frequent elements across columns app_0, app_1 and app_2 for each sex.
import pandas as pd
import numpy as np
df = pd.DataFrame({'id': [1, 2, 3, 4], 'app_0': ['a', 'b', 'c', 'd'],
                   'app_1': ['b', 'c', 'd', np.nan], 'app_2': ['c', 'b', 'a', 'a'],
                   'sex': [0, 0, 1, 1]})
Input:
df
id app_0 app_1 app_2 sex
0 1 a b c 0
1 2 b c b 0
2 3 c d a 1
3 4 d NaN a 1
As you can see, the sex of both id 1 and id 2 is 0. For sex 0, b appears most often across columns app_0, app_1 and app_2, and c appears second most often. So for id 1 and id 2, the most frequent element is b, and the second most frequent is c.
Expected:
df
id app_0 app_1 app_2 sex top_1 top_2
0 1 a b c 0 b c
1 2 b c b 0 b c
2 3 c d a 1 a d
3 4 d NaN a 1 a d
Use a custom function with stack and value_counts:
def f(x):
    # stack drops NaN and flattens the three columns into one Series,
    # then value_counts sorts the values by frequency
    s = x.stack().value_counts()
    return pd.Series([s.index[0], s.index[1]], index=['top_1','top_2'])
Or use Counter on the flattened values with Counter.most_common:
from collections import Counter
def f(x):
    # flatten all values into one list and count them (note: NaN is counted too)
    c = Counter([v for row in x.values.tolist() for v in row])
    a = c.most_common(2)
    return pd.Series([a[0][0], a[1][0]], index=['top_1','top_2'])
df1 = df.groupby('sex')[['app_0','app_1','app_2']].apply(f)
df = df.join(df1, on='sex')
print(df)
id app_0 app_1 app_2 sex top_1 top_2
0 1 a b c 0 b c
1 2 b c b 0 b c
2 3 c d a 1 a d
3 4 d NaN a 1 a d
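The same result can also be reached without a custom function by reshaping to long form first. This is a sketch (not part of the original answer) using melt, dropna and value_counts; the names long_df, top and out are illustrative:

```python
import numpy as np
import pandas as pd

df = pd.DataFrame({'id': [1, 2, 3, 4], 'app_0': ['a', 'b', 'c', 'd'],
                   'app_1': ['b', 'c', 'd', np.nan], 'app_2': ['c', 'b', 'a', 'a'],
                   'sex': [0, 0, 1, 1]})

# reshape to long form: one row per (sex, app value), with missing values dropped
long_df = (df.melt(id_vars='sex', value_vars=['app_0', 'app_1', 'app_2'])
             .dropna(subset=['value']))

# per sex, rank values by frequency and keep the two most common
top = long_df.groupby('sex')['value'].apply(
    lambda s: pd.Series(s.value_counts().index[:2], index=['top_1', 'top_2']))
top = top.unstack()

out = df.join(top, on='sex')
print(out)
```

Note that ties (e.g. a and d each appearing twice for sex 1) are broken by value_counts' internal ordering, so the exact order of tied top values is not guaranteed.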
EDIT:
A more general solution, which also works when a second top value does not exist, uses next with a default:
df = pd.DataFrame({'id': [1, 2, 3, 4], 'app_0': ['a', 'a', 'a', 'a'],
                   'app_1': ['a', 'a', 'a', np.nan], 'app_2': ['a', 'a', 'a', 'a'],
                   'sex': [0, 0, 1, 1]})
print(df)
id app_0 app_1 app_2 sex
0 1 a a a 0
1 2 a a a 0
2 3 a a a 1
3 4 a NaN a 1
def f(x):
    c = Counter([v for row in x.values.tolist() for v in row])
    # next with a default avoids an IndexError when the group
    # has fewer than two distinct values
    a = iter(c.most_common(2))
    return pd.Series([next(a, ['no top1'])[0],
                      next(a, ['no top2'])[0]], index=['top_1','top_2'])
df1 = df.groupby('sex')[['app_0','app_1','app_2']].apply(f)
df = df.join(df1, on='sex')
print(df)
id app_0 app_1 app_2 sex top_1 top_2
0 1 a a a 0 a no top2
1 2 a a a 0 a no top2
2 3 a a a 1 a NaN
3 4 a NaN a 1 a NaN
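For sex 1 the output shows NaN as top_2 because Counter counts the missing value itself, so it becomes the second most common entry. If NaN should never be reported as a top value, a variant (a sketch, not part of the original answer) can filter it out before counting:

```python
import math
import numpy as np
import pandas as pd
from collections import Counter

df = pd.DataFrame({'id': [1, 2, 3, 4], 'app_0': ['a', 'a', 'a', 'a'],
                   'app_1': ['a', 'a', 'a', np.nan], 'app_2': ['a', 'a', 'a', 'a'],
                   'sex': [0, 0, 1, 1]})

def f(x):
    # skip NaN so a missing value can never become top_1 or top_2
    c = Counter(v for row in x.values.tolist() for v in row
                if not (isinstance(v, float) and math.isnan(v)))
    a = iter(c.most_common(2))
    return pd.Series([next(a, ['no top1'])[0],
                      next(a, ['no top2'])[0]], index=['top_1', 'top_2'])

df1 = df.groupby('sex')[['app_0', 'app_1', 'app_2']].apply(f)
out = df.join(df1, on='sex')
print(out)
```

With this filter both sex groups report 'no top2', since a is the only real value anywhere.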