ZOJ 3827 Information Entropy [easy problem: simple summation]
2017-06-26 15:28
Information Entropy
Time Limit: 2 Seconds Memory Limit: 65536 KB Special Judge
Information Theory is one of the most popular courses at Marjar University. In this course, there is an important chapter about information entropy.
Entropy is the average amount of information contained in each message received. Here, a message stands for an event, or a sample or a character drawn from a distribution or a data stream.
Entropy thus characterizes our uncertainty about our source of information. The source is also characterized by the probability distribution of the samples drawn from it. The idea here is that the less likely an event is, the more information it provides when it occurs.
Generally, "entropy" stands for "disorder" or uncertainty. The entropy we talk about here was introduced by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication".
We also call it Shannon entropy or information entropy to distinguish it from other occurrences of the term, which appears in various parts of physics in different forms.
Named after Boltzmann's H-theorem, Shannon defined the entropy H (Greek letter eta, Η/η) of a discrete random variable X with possible values {x1, x2, ..., xn} and probability mass function P(X) as:
H(X) = \mathrm{E}(-\ln(P(X)))
Here E is the expected value operator. When taken from a finite sample, the entropy can explicitly be written as
H(X) = -\sum_{i=1}^{n} P(x_i) \log_b(P(x_i))
where b is the base of the logarithm used. Common values of b are 2, Euler's number e, and 10. The unit of entropy is bit for b = 2, nat for b = e, and dit (or digit) for b = 10, respectively.
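As a quick sanity check, take the first sample test case below: probabilities 25%, 25%, 50% measured in bits (b = 2). The formula gives

H(X) = -(0.25 \log_2 0.25 + 0.25 \log_2 0.25 + 0.5 \log_2 0.5) = 0.5 + 0.5 + 0.5 = 1.5

which matches the expected output 1.500000000000.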
In the case of P(x_i) = 0 for some i, the value of the corresponding summand 0 · log_b(0) is taken to be a well-known limit:

0 \log_b(0) = \lim_{p \to 0^+} p \log_b(p)
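This limit is 0, as a one-line application of L'Hôpital's rule shows (rewrite the product as a quotient):

\lim_{p \to 0^+} p \log_b(p) = \lim_{p \to 0^+} \frac{\ln p}{(\ln b)/p} = \lim_{p \to 0^+} \frac{1/p}{-(\ln b)/p^2} = \lim_{p \to 0^+} \frac{-p}{\ln b} = 0

So zero-probability values contribute nothing to the sum, which is exactly why the code below skips them.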
Your task is to calculate the entropy of a finite sample with N values.
Input
There are multiple test cases. The first line of input contains an integer T indicating the number of test cases. For each test case:
The first line contains an integer N (1 <= N <= 100) and a string S. The string S is one of "bit", "nat" or "dit", indicating the unit of entropy.
In the next line, there are N non-negative integers P1, P2, ..., PN. Pi means the probability of the i-th value in percentage, and the sum of Pi will be 100.
Output
For each test case, output the entropy in the corresponding unit. Any solution with a relative or absolute error of at most 10^-8 will be accepted.
Sample Input
3
3 bit
25 25 50
7 nat
1 2 4 8 16 32 37
10 dit
10 10 10 10 10 10 10 10 10 10
Sample Output
1.500000000000
1.480810832465
1.000000000000
Note: handle Pi == 0 as a special case (skip it; such a term contributes nothing to the entropy).
#include <cstdio>
#include <cstring>
#include <cmath>
using namespace std;

int main() {
    int t;
    scanf("%d", &t);
    while (t--) {
        int n;
        char s[20] = { 0 };
        scanf("%d %s", &n, s);
        // Choose the logarithm base b from the requested unit.
        double b;
        if (!strcmp(s, "bit"))      b = 2.0;
        else if (!strcmp(s, "nat")) b = 2.71828182845904523536;  // e = 2.71828 18284 59045 23536 02874...
        else                        b = 10.0;                    // "dit"
        double ans = 0;
        for (int i = 0; i < n; i++) {
            int p;
            scanf("%d", &p);
            // Skip p == 0: the summand 0 * log_b(0) is taken to be 0 (the limit above).
            if (p)
                ans -= (double)p / 100.0 * (log((double)p / 100.0) / log(b));
        }
        printf("%.12f\n", ans);
    }
    return 0;
}
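An equivalent variant, shown as a sketch rather than the judge-tested code above: instead of converting bases by hand via log(x)/log(b), pick the matching <cmath> logarithm for each unit (log2 for bit, log for nat, log10 for dit):

#include <cstdio>
#include <cstring>
#include <cmath>

int main() {
    int t;
    scanf("%d", &t);
    while (t--) {
        int n;
        char s[20] = { 0 };
        scanf("%d %s", &n, s);
        double ans = 0.0;
        for (int i = 0; i < n; i++) {
            int p;
            scanf("%d", &p);
            if (!p) continue;          // 0 * log(0) is taken to be 0; skip it
            double q = p / 100.0;      // percentage -> probability
            if (!strcmp(s, "bit"))      ans -= q * log2(q);
            else if (!strcmp(s, "nat")) ans -= q * log(q);
            else                        ans -= q * log10(q);  // "dit"
        }
        printf("%.12f\n", ans);
    }
    return 0;
}

Either way the arithmetic stays far inside the allowed 10^-8 error tolerance.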