ZOJ 3827 Information Entropy (2014 Mudanjiang Regional Contest, Problem I)
2014-10-12 16:22
Information Entropy
Time Limit: 2 Seconds Memory Limit: 65536 KB Special Judge
Information Theory is one of the most popular courses at Marjar University. In this course, there is an important chapter about information entropy.
Entropy is the average amount of information contained in each message received. Here, a message stands for an event, or a sample or a character drawn from a distribution or a data stream.
Entropy thus characterizes our uncertainty about our source of information. The source is also characterized by the probability distribution of the samples drawn from it. The idea here is that the less likely an event is, the more information it provides when it occurs.
Generally, "entropy" stands for "disorder" or uncertainty. The entropy we talk about here was introduced by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication".
We also call it Shannon entropy or information entropy to distinguish it from other occurrences of the term, which appears in various parts of physics in different forms.
Named after Boltzmann's H-theorem, Shannon defined the entropy H (Greek capital letter eta) of a discrete random variable X with possible values {x1, x2, ..., xn} and probability mass function P(X) as:

$$H(X) = \mathrm{E}\big[-\ln(P(X))\big]$$
Here E is the expected value operator. When taken from a finite sample, the entropy can explicitly be written as
$$H(X) = -\sum_{i=1}^{n} P(x_i) \log_b P(x_i)$$

where b is the base of the logarithm used. Common values of b are 2, Euler's number e, and 10. The unit of entropy is bit for b = 2, nat for b = e, and dit (or digit) for b = 10, respectively.
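As an aside (not part of the original statement): these units differ only by constant factors, because changing the base of the logarithm rescales the whole sum:

$$\log_b(x) = \frac{\ln x}{\ln b} \quad\Longrightarrow\quad H_{\mathrm{bit}}(X) = \frac{H_{\mathrm{nat}}(X)}{\ln 2} \approx 1.4427\, H_{\mathrm{nat}}(X)$$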
In the case of P(xi) = 0 for some i, the value of the corresponding summand $0 \log_b(0)$ is taken to be a well-known limit:

$$0 \log_b(0) = \lim_{p \to 0^+} p \log_b(p)$$
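This limit is 0 (a step the statement leaves implicit); one application of l'Hôpital's rule makes it explicit:

$$\lim_{p \to 0^+} p \ln p = \lim_{p \to 0^+} \frac{\ln p}{1/p} = \lim_{p \to 0^+} \frac{1/p}{-1/p^2} = \lim_{p \to 0^+} (-p) = 0$$

Zero-probability terms therefore contribute nothing to the sum, which is why the solution below simply skips them.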
Your task is to calculate the entropy of a finite sample with N values.
Input
There are multiple test cases. The first line of input contains an integer T indicating the number of test cases. For each test case:

The first line contains an integer N (1 <= N <= 100) and a string S. The string S is one of "bit", "nat" or "dit", indicating the unit of entropy.
In the next line, there are N non-negative integers P1, P2, ..., PN. Pi is the probability of the i-th value, given as a percentage; the Pi sum to 100.
Output
For each test case, output the entropy in the corresponding unit. Any solution with a relative or absolute error of at most $10^{-8}$ will be accepted.
Sample Input
3
3 bit
25 25 50
7 nat
1 2 4 8 16 32 37
10 dit
10 10 10 10 10 10 10 10 10 10
Sample Output
1.500000000000
1.480810832465
1.000000000000
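As a quick sanity check (not part of the original statement), the first test case has distribution (0.25, 0.25, 0.5), so in bits:

$$H = -\big(0.25 \log_2 0.25 + 0.25 \log_2 0.25 + 0.5 \log_2 0.5\big) = -(-0.5 - 0.5 - 0.5) = 1.5$$

matching the first output line. Likewise the third case is ten equiprobable values, giving $-10 \cdot 0.1 \log_{10} 0.1 = 1$ dit.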
Code:
#include <iostream>
#include <cstdio>
#include <cstring>
#include <algorithm>
#include <cmath>
using namespace std;

const int N = 111;
int a[N];  // percentages P1..PN

int main() {
    int T;
    scanf("%d", &T);
    while (T--) {
        int n;
        char s[5];
        scanf("%d", &n);
        scanf("%s", s);
        for (int i = 1; i <= n; ++i)
            scanf("%d", &a[i]);
        double ans = 0;
        // Accumulate sum of P(x_i) * log_b(P(x_i)); skip P(x_i) = 0,
        // since 0 * log_b(0) is taken to be 0.
        if (s[0] == 'b') {          // bit: base 2
            for (int i = 1; i <= n; ++i) {
                if (!a[i]) continue;
                ans += 1.0 * a[i] * log2(a[i] / 100.0);
            }
        } else if (s[0] == 'n') {   // nat: base e
            for (int i = 1; i <= n; ++i) {
                if (!a[i]) continue;
                ans += 1.0 * a[i] * log(a[i] / 100.0);
            }
        } else if (s[0] == 'd') {   // dit: base 10
            for (int i = 1; i <= n; ++i) {
                if (!a[i]) continue;
                ans += 1.0 * a[i] * log10(a[i] / 100.0);
            }
        }
        ans /= 100;                 // percentages -> probabilities
        printf("%.12f\n", -ans);    // entropy is the negated sum
    }
    return 0;
}
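The three branches above differ only in the base of the logarithm. Since $\log_b(x) = \ln(x)/\ln(b)$, the same answer can be computed with a single loop; a minimal sketch (not the original submission; it assumes the same input format):

#include <cstdio>
#include <cmath>

int main() {
    int T;
    scanf("%d", &T);
    while (T--) {
        int n;
        char s[8];
        scanf("%d %s", &n, s);
        // Map the unit to the logarithm base: bit -> 2, nat -> e, dit -> 10.
        double base = (s[0] == 'b') ? 2.0 : (s[0] == 'n') ? exp(1.0) : 10.0;
        double ans = 0;
        for (int i = 0; i < n; ++i) {
            int p;
            scanf("%d", &p);
            if (p) ans -= (p / 100.0) * log(p / 100.0);  // sum in nats; zero terms skipped
        }
        printf("%.12f\n", ans / log(base));  // convert nats to the requested unit
    }
    return 0;
}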