NoobyD - 7 months ago

Python Question

I am stuck - need some help to get started - just hitting a brick wall.

So there are supposed to be two lists and one table defined as follows.

• Actors, a list of strings that are actor names. For each i such that 0 ≤ i ≤ len(Actors) − 1, we refer to Actors[i] as the i-th actor.

• Films, a list of strings that are film names. For each i such that 0 ≤ i ≤ len(Films) − 1, we refer to Films[i] as the i-th film.

• Scores, a table whose rows correspond to the actors and columns correspond to the films. Scores[i][j] is an integer number defined as follows.

– If Scores[i][j] = −1 this means that the i-th actor is not a star for the j-th film.

– If Scores[i][j] ≥ 0 then this is the score of the i-th actor for the j-th film. You can assume that the scores are in range 0-100, there is no need to check the validity of the data.

I am allowed to define the above structures as fixed in my program, there is no need to request the user to enter them.
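For instance, the fixed structures described above might be written directly as Python literals; the score values below are made-up placeholders, not data from the assignment:

```python
# Example fixed data, assuming made-up scores in the range 0-100.
Actors = ['Brad Pitt', 'George Clooney', 'Matt Damon', 'Rowan Atkinson']
Films = ['Oceans 11', 'Oceans 12', 'Bean']

# Scores[i][j]: score of the i-th actor for the j-th film,
# or -1 if the actor is not a star for that film.
Scores = [
    [80, 75, -1],   # Brad Pitt
    [90, 85, -1],   # George Clooney
    [70, 65, -1],   # Matt Damon
    [-1, -1, 95],   # Rowan Atkinson
]
```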

So how do I write a function whose arguments are a table A of integer numbers and a positive integer i? The function should return the average of the non-negative entries of A[i] (row i of A).

Thanks Jemma

Answer

```
import numpy as np

actors = ['Brad Pitt', 'George Clooney', 'Matt Damon', 'Rowan Atkinson']
films = ['Oceans 11', 'Oceans 12', 'Bean']

actors_dimension = len(actors)
longest_actor_length = len(max(actors, key=len))
longest_film_length = len(max(films, key=len))
padding = max(longest_actor_length, longest_film_length)

# The table has one extra row (film names) and one extra column (actor names).
scores_width = len(films) + 1
scores_height = len(actors) + 1
scores = [[' '.rjust(padding) for x in range(scores_width)] for y in range(scores_height)]

# Header row: film names
for i, film in enumerate(films):
    scores[0][i+1] = film.rjust(padding)
# Header column: actor names
for i, actor in enumerate(actors):
    scores[i+1][0] = actor.rjust(padding)

# Filling data
# Brad Pitt
scores[1][1] = '1'.rjust(padding)
scores[1][2] = '1'.rjust(padding)
scores[1][3] = '-1'.rjust(padding)
# George Clooney
scores[2][1] = '1'.rjust(padding)
scores[2][2] = '1'.rjust(padding)
scores[2][3] = '-1'.rjust(padding)
# Matt Damon
scores[3][1] = '1'.rjust(padding)
scores[3][2] = '1'.rjust(padding)
scores[3][3] = '-1'.rjust(padding)
# Rowan Atkinson
scores[4][1] = '-1'.rjust(padding)
scores[4][2] = '-1'.rjust(padding)
scores[4][3] = '1'.rjust(padding)

def average_of_row(row):
    if row > actors_dimension or row <= 0:
        print('That row is not in the table or has no actor')
        return
    actor = scores[row][0].strip()
    # Convert the score cells to integers and keep only the non-negative
    # ones, i.e. skip the -1 entries that mark "not a star for this film".
    row_scores = [int(x) for x in scores[row][1:] if int(x) >= 0]
    print("%s's average score is: %f" % (actor, sum(row_scores) / len(row_scores)))

print(np.matrix(scores))
average_of_row(1)  # Brad Pitt
average_of_row(4)  # Rowan Atkinson
```

**Output:**

```
[[' ' ' Oceans 11' ' Oceans 12' ' Bean']
 [' Brad Pitt' ' 1' ' 1' ' -1']
 ['George Clooney' ' 1' ' 1' ' -1']
 [' Matt Damon' ' 1' ' 1' ' -1']
 ['Rowan Atkinson' ' -1' ' -1' ' 1']]
Brad Pitt's average score is: 1.000000
Rowan Atkinson's average score is: 1.000000
```
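As a side note, since the question only asks for the average of the non-negative entries of a row, a more direct sketch keeps the scores as plain integers and passes the table in as an argument; the data below is a made-up placeholder:

```python
def average_of_row(A, i):
    """Return the average of the non-negative entries of row i of table A,
    or None if the row has no non-negative entries."""
    non_negative = [x for x in A[i] if x >= 0]
    if not non_negative:
        return None
    return sum(non_negative) / len(non_negative)

# Example table: -1 marks "not a star for this film".
scores = [
    [80, 75, -1],
    [90, 85, -1],
    [70, 65, -1],
    [-1, -1, 95],
]
print(average_of_row(scores, 0))  # 77.5 -> average of 80 and 75
print(average_of_row(scores, 3))  # 95.0
```

This separates the data (plain integers) from the presentation, so no string-to-int conversion or padding logic is needed inside the averaging function.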


Source (Stackoverflow)