Python Question

Applying the same calculation to each element of a DataFrame in Python

I have a DataFrame like this:

          user  tag1  tag2  tag3
0  Roshan ghai   0.0   1.0   1.0
1    mank nion   1.0   1.0   2.0
2   pop rajuel   2.0   0.0   1.0
3   random guy   2.0   1.0   1.0


I have to apply a calculation to each element x of each row:

x = (that user's count for the tag, i.e. the element itself, divided by the maximum count in that tag's column) * ln(total number of users, i.e. the length of the DataFrame, divided by the number of users with a non-zero count for that tag)

What is the most efficient way to do this for every element of the DataFrame, given that I have a large number of elements? I am using Python 2.7.
Expected output:

          user   tag1   tag2  tag3
0  Roshan ghai  0.000  0.287   0.0
1    mank nion  0.143  0.287   0.0
2   pop rajuel  0.287  0.000   0.0
3   random guy  0.287  0.287   0.0


I have just applied the formula above; for example, for mank nion and tag1:

x = (1.0 / 2.0) * ln(4 / 3) = 0.143
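
A straightforward version that just loops over the tag columns would be something like the sketch below (it rebuilds the sample DataFrame so it runs on its own); it reproduces the 0.143 above for mank nion and tag1, but I would like something faster for my real data:

import numpy as np
import pandas as pd

df = pd.DataFrame({'user': ['Roshan ghai', 'mank nion', 'pop rajuel', 'random guy'],
                   'tag1': [0.0, 1.0, 2.0, 2.0],
                   'tag2': [1.0, 1.0, 0.0, 1.0],
                   'tag3': [1.0, 2.0, 1.0, 1.0]},
                  columns=['user', 'tag1', 'tag2', 'tag3'])

n_users = len(df)
for tag in ['tag1', 'tag2', 'tag3']:
    col_max = df[tag].max()              # max count of that tag over all users
    n_with_tag = (df[tag] != 0).sum()    # users with a non-zero count for that tag
    df[tag] = df[tag] / col_max * np.log(float(n_users) / n_with_tag)

print(df)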

Answer

You can first select all values except the first column with ix, then use max, the sum of non-zero values, and numpy.log:

import pandas as pd
import numpy as np

print (df.ix[:, 'tag1':].max())
tag1    2.0
tag2    1.0
tag3    2.0
dtype: float64

print ((df.ix[:, 'tag1':] != 0).sum())
tag1    3
tag2    3
tag3    4
dtype: int64

# divide each tag column by its own max, then scale by ln(total users / users with a non-zero count)
df.ix[:, 'tag1':] = (df.ix[:, 'tag1':] / df.ix[:, 'tag1':].max() *
                     (np.log(len(df) / (df.ix[:, 'tag1':] != 0).sum())))

print (df)
          user      tag1      tag2  tag3
0  Roshan-ghai  0.000000  0.287682   0.0
1    mank-nion  0.143841  0.287682   0.0
2   pop-rajuel  0.287682  0.000000   0.0
3   random-guy  0.287682  0.287682   0.0
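
The division and the multiplication here broadcast column-wise: df.ix[:, 'tag1':].max() and (df.ix[:, 'tag1':] != 0).sum() are Series indexed by the tag columns, so each column is divided by its own maximum and scaled by its own log factor, and no explicit loop over rows or elements is needed.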

Another solution with iloc:

df1 = df.iloc[:, 1:]    # the tag columns, selected by position
df.iloc[:, 1:] = (df1 / df1.max() * (np.log(len(df) / (df1 != 0).sum())))
print (df)
          user      tag1      tag2  tag3
0  Roshan-ghai  0.000000  0.287682   0.0
1    mank-nion  0.143841  0.287682   0.0
2   pop-rajuel  0.287682  0.000000   0.0
3   random-guy  0.287682  0.287682   0.0
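
Note that ix was deprecated in pandas 0.20 and removed in pandas 1.0, so on newer versions the same computation can be written with loc along these lines (a sketch, starting again from the original DataFrame and taking the tag column names from the sample data):

import numpy as np

tag_cols = ['tag1', 'tag2', 'tag3']   # tag columns from the sample data
tags = df.loc[:, tag_cols]
# same formula: value / column max * ln(total users / users with a non-zero count)
df.loc[:, tag_cols] = tags / tags.max() * np.log(len(df) / (tags != 0).sum())
print (df)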