Asked by thomas.mac

Calculating yearly standard deviation, given monthly returns in pandas

I have a function to calculate monthly returns:

def monthlyreturns(df):
    # First and last price within each calendar month, indexed by month period
    first = df.resample('M').first().to_period('M')
    last = df.resample('M').last().to_period('M')
    return ((last - first) / first) * 100
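
For context, this assumes stocks is a DataFrame of prices with a DatetimeIndex and one column per ticker. A minimal usage sketch with made-up data, purely for illustration:

import numpy as np
import pandas as pd

# Hypothetical daily prices on business days; the real `stocks` frame is assumed
# to look like this: a DatetimeIndex and one price column per ticker.
dates = pd.date_range('2012-01-01', '2017-08-31', freq='B')
rng = np.random.default_rng(0)
prices = 100 + rng.normal(0, 1, (len(dates), 3)).cumsum(axis=0)
stocks = pd.DataFrame(prices, index=dates, columns=['FOX', 'FOXA', 'MMM'])

monthly_returns = monthlyreturns(stocks)  # % change from first to last price of each month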


The resulting df from monthlyreturns(stocks) looks like this:

           FOX  FOXA   MMM
Date
2012-01   5.40  3.20 -0.08
2012-02   0.07  1.20 -0.62
...
2017-08  -0.20 -4.20  2.30


My question is: how can I calculate the standard deviation for each year? I'd like to keep the DataFrame in the same format (stocks as columns, dates as index), but compute the yearly standard deviation of the monthly returns (so there should be about 7 values for each stock).

So far I have tried:

import numpy as np
import pandas as pd

sd = pd.DataFrame()
x = -13
y = -1
for date in reversed(periods):  # where periods holds each year
    sd[date] = np.std(monthly_returns.iloc[x:y])
    x -= 12
    y -= 12
    if x < -72:
        break


This works, but the dates and columns are swapped, and I was wondering if there is a cleaner way to do this.
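
As a side note, if only the orientation is the problem, transposing the accumulated frame at the end should restore the requested layout. A minimal sketch, assuming sd was built by the loop above:

sd = sd.T  # swap axes so the years become the index and the tickers the columns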

Answer by DYZ:
monthly_returns.groupby(monthly_returns.index.year).std()

For your example:

#           FOX      FOXA       MMM
#2012  3.768879  1.414214  0.381838
#2017       NaN       NaN       NaN
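
This groups the monthly rows by the year of the period index and takes the column-wise standard deviation, so the stocks stay in the columns and the years become the index. A self-contained sketch using the sample returns above (the 2017 row is NaN because a single observation has no sample standard deviation; also note that pandas' .std() uses ddof=1 while np.std defaults to ddof=0, so the numbers will differ slightly from the loop version):

import pandas as pd

monthly_returns = pd.DataFrame(
    {'FOX': [5.4, 0.07, -0.2],
     'FOXA': [3.2, 1.2, -4.2],
     'MMM': [-0.08, -0.62, 2.3]},
    index=pd.PeriodIndex(['2012-01', '2012-02', '2017-08'], freq='M', name='Date'))

# Group by the year of each monthly period and take the column-wise std
yearly_sd = monthly_returns.groupby(monthly_returns.index.year).std()
print(yearly_sd)
#            FOX      FOXA       MMM
# 2012  3.768879  1.414214  0.381838
# 2017       NaN       NaN       NaN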