Add Again
Input: Standard Input

Output: Standard Output

Summation of a sequence of integers is a common problem in computer science. Rather than computing blindly, some intelligent techniques make the task simpler. Here you have to find the summation of a sequence of integers. The sequence is an interesting one: it is all possible permutations of a given set of digits. For example, if the digits are <1 2 3>, then the six possible permutations are <123>, <132>, <213>, <231>, <312>, <321>, and their sum is 1332.

Input

Each input set will start with a positive integer N (1≤N≤12). The next line will contain N decimal digits. Input will be terminated by N=0. There will be at most 20000 test sets.

Output

For each test set, there should be one line of output containing the summation. The value will fit in a 64-bit unsigned integer.

Sample Input

3
1 2 3
3
1 1 2
0

Output for Sample Input

1332
444

Problem: given n digits (each 0~9, 1<=n<=12), find the sum of all distinct permutations formed from these digits.

Analysis: let's work through an example.

Even when duplicate digits are present, the number of distinct permutations that can be formed is n!/(c0!*c1!*...*c9!), where ci is the number of times digit i occurs.

Moreover, each digit is equally likely to appear at each position.

For example, with digits 1 2 3, the distinct permutations are:

1 2 3

1 3 2

2 1 3

2 3 1

3 1 2

3 2 1

and 1, 2, 3 each appear the same number of times in the units, tens, and hundreds places, namely 6/3 = 2 times;

so each input digit (counted with multiplicity) appears at each position [n!/(c0!*c1!*...*c9!)]/n times (this also holds when duplicate digits are present).

To simplify the addition: if a digit x appeared exactly once at every position, its contribution would be x*11...1 (n ones).

Since every digit appears that many times at every position, the answer is: ans = (a1+a2+...+an) * (11...1, n ones) * [n!/(c0!*c1!*...*c9!)] / n.

Caution: writing it as [n!/(c0!*c1!*...*c9!)]/n * (a1+a2+...+an) * (11...1) is wrong. When all n digits are identical, n!/(c0!*c1!*...*c9!) = 1, and the integer division 1/n yields 0 — so multiply first, then divide.

Code:

#include <iostream>
#include <cstdio>
#include <cstdlib>
#include <cstring>
using namespace std;
typedef unsigned long long ull;
const int maxn = 15;
int x, num[maxn];       // num[d] = how many times digit d occurs
ull C[maxn];            // C[i] = i!
const ull basic[] =     // basic[i] = repunit 11...1 with i+1 ones
{
    1ULL, 11ULL, 111ULL, 1111ULL, 11111ULL, 111111ULL,
    1111111ULL, 11111111ULL, 111111111ULL, 1111111111ULL,
    11111111111ULL, 111111111111ULL
};
void init()
{
    C[0] = C[1] = 1;
    for(int i = 2; i <= 12; i++)
    {
        C[i] = C[i-1]*i;
    }
}
int main()
{
    init();
    int n;
    while(scanf("%d", &n) && n)
    {
        memset(num, 0, sizeof(num));
        ull ans = 0;                 // sum of the input digits
        for(int i = 0; i < n; i++)
        {
            scanf("%d", &x);
            ans += x;
            num[x]++;
        }
        ull times = C[n];            // n! / (c0! * c1! * ... * c9!)
        for(int i = 0; i < 10; i++)
        {
            times /= C[num[i]];
        }
        // multiply first, divide by n last (see the note above)
        ans = ans*times*basic[n-1]/n;
        cout << ans << endl;
    }
    return 0;
}