Calculating entropy of source

What would be the entropy of a source that generates equal-length messages of 200 symbols
chosen uniformly and randomly from an alphabet of 32 symbols?

I tried calculating log2 of 32, which gives 5, but is that the correct answer? Is it even possible to calculate the source entropy from the information given above? I can't make sense of it: to calculate source entropy, I thought we needed the number of possible messages, but here only the message length and the alphabet size are given.
 
Log 2 of 32 and it gives me 5

All that tells you is that
2x2x2x2x2, i.e. 2^5, equals 32 — in other words, each symbol carries 5 bits.
It doesn't yet use the fact that each message is 200 symbols long.

If for each position in the message there are 32 possible choices, then the total number of equally likely messages is 32^200. The entropy of the source is the log2 of that count: log2(32^200) = 200 x log2(32) = 200 x 5 = 1000 bits per message.
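A quick sanity check of that arithmetic, as a minimal Python sketch using only the standard library:

```python
import math

alphabet_size = 32
message_length = 200

# Entropy per symbol: log2 of the number of equally likely choices.
bits_per_symbol = math.log2(alphabet_size)  # 5.0

# Total number of equally likely messages: 32^200.
total_messages = alphabet_size ** message_length

# Entropy per message: log2(32^200) = 200 * log2(32).
bits_per_message = math.log2(total_messages)

print(bits_per_symbol)   # 5.0
print(bits_per_message)  # 1000.0
```

Note that Python's `math.log2` handles arbitrarily large integers, so computing `32**200` exactly and taking its log works directly.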

How does that work? Take a 2-position message choosing from 3 letters, X, Y, or Z:

XX  YX  ZX
XY  YY  ZY
XZ  YZ  ZZ

That's 9 possible outcomes, or 3 x 3 = 9, i.e. 3^2:
the number of choices per position, raised to the power of the number of positions.
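The enumeration above can be reproduced with a short Python sketch, using `itertools.product` from the standard library:

```python
from itertools import product

letters = "XYZ"
positions = 2

# Enumerate every 2-position message over {X, Y, Z}.
messages = ["".join(m) for m in product(letters, repeat=positions)]

print(messages)       # ['XX', 'XY', 'XZ', 'YX', 'YY', 'YZ', 'ZX', 'ZY', 'ZZ']
print(len(messages))  # 9, i.e. 3**2
```

Changing `positions` to 200 and `letters` to a 32-symbol alphabet would give 32^200 messages — far too many to enumerate, which is exactly why the log2 shortcut matters.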

Is this what you are looking for?
 