What would be the entropy of a source that generates equal-length messages of 200 symbols
chosen uniformly and randomly from an alphabet of 32 symbols?
I tried calculating log2(32), which gives 5, but is that the correct answer? Is it even possible to calculate the source entropy from the information given above? I can't make sense of it: I thought that to calculate source entropy we need to be given the set of possible messages, but here only the message length and the alphabet size are given.
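A quick numeric sketch of the reasoning above, assuming each of the 200 symbols is drawn independently and uniformly from the 32-symbol alphabet (so per-symbol entropies add across the message):

```python
import math

alphabet_size = 32    # uniform alphabet
message_length = 200  # symbols per message

# Entropy of one uniformly chosen symbol: log2(32) = 5 bits
h_symbol = math.log2(alphabet_size)

# Under the independence assumption, entropy is additive,
# so the whole 200-symbol message carries 200 * 5 bits
h_message = message_length * h_symbol

print(h_symbol)   # 5.0 bits per symbol
print(h_message)  # 1000.0 bits per message
```

So the 5 you computed is the entropy per symbol; whether that or the per-message total (1000 bits) is "the" source entropy depends on whether the source is taken to emit symbols or whole messages.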