# 4.9


Here the main thing to notice is that we need an $O(n^{k-1}\lg n)$ solution.
For various values of $k$:

| $k$ | Solution Time Complexity |
|-----|--------------------------|
| 1   | $O(n^{0}\lg n)$          |
| 2   | $O(n^{1}\lg n)$          |
| 3   | $O(n^{2}\lg n)$          |
| 4   | $O(n^{3}\lg n)$          |

For $k \geq 2$:

1. Sort the array ($O(n\lg n)$).
2. Use $k-1$ nested loops over the array, where the $i$-th loop starts one element past the current index of the $(i-1)$-th loop, and then binary search for the last element.

E.g., for $k = 3$:

```
for ( i = 0; i < n; i++ )
    for ( j = i+1; j < n; j++ )
        // binary search for ( T - a[i] - a[j] ) in a[j+1 .. n-1]
```
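As a concrete illustration, here is a minimal Python sketch of the $k=3$ case (the function name and the use of the standard `bisect` module are my own choices):

```python
from bisect import bisect_left

def three_sum_to_target(a, T):
    """Return True if some three distinct elements of a sum to T.

    Sort a (O(n log n)), then for each pair (i, j) binary-search for
    T - a[i] - a[j] in the suffix a[j+1:], for O(n^2 log n) overall.
    """
    a = sorted(a)
    n = len(a)
    for i in range(n):
        for j in range(i + 1, n):
            need = T - a[i] - a[j]
            # Search only to the right of j so the three indices are distinct.
            pos = bisect_left(a, need, j + 1)
            if pos < n and a[pos] == need:
                return True
    return False
```

The same pattern extends to larger $k$ by adding more loops, each starting just past the previous loop's index.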

### Another recursive solution

First, we note that an $O(n\lg n)$ solution for $k=2$ exists (exercise 4.8), which is already within the required bound. Now, for $k\geq 3$, we can do something like this:

```
sort S ascending;
CheckSumOfK(S, k, T); // defined below

// S must be a sorted array of integers
function CheckSumOfK(S, k, T)
    if k <= 2:
        Use any method in the solution of ex 4.8 and return the result.
        This can be done in O(n) time because S is sorted.

    Initialize array A of n - 1 elements;
    for i from 0 to n - 1:
        m = 0   // index into A; must not shadow the parameter k
        // Collect in A all numbers in S but S[i]. Note that A will still be sorted.
        for j from 0 to n - 1:
            if i != j:
                A[m] = S[j]
                m = m + 1
        // If S[i] is included in the k integers that sum up to T, then there must be
        // exactly (k - 1) integers in the rest of S (i.e., A) that sum to (T - S[i]).
        // We can find that out by calling ourselves recursively.
        if CheckSumOfK(A, k - 1, T - S[i]):
            return True
    return False
```
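A direct Python translation of the pseudocode above (the two-pointer scan used for the $k \le 2$ base case is one common method for ex 4.8; the handling of $k = 1$ is my own addition, since the pseudocode assumes $k \ge 2$):

```python
def check_sum_of_k(S, k, T):
    """Return True if some k distinct elements of the sorted list S sum to T."""
    if k <= 2:
        if k == 1:
            return T in S
        # Base case (ex 4.8): two-pointer scan, O(n) because S is sorted.
        lo, hi = 0, len(S) - 1
        while lo < hi:
            s = S[lo] + S[hi]
            if s == T:
                return True
            if s < T:
                lo += 1
            else:
                hi -= 1
        return False
    for i in range(len(S)):
        # A holds every element of S except S[i]; it stays sorted.
        A = S[:i] + S[i + 1:]
        # If S[i] is one of the k numbers, the other k - 1 must sum to T - S[i].
        if check_sum_of_k(A, k - 1, T - S[i]):
            return True
    return False
```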

### Complexity

For each item in S, there is $O(n)$ work done in assembling the array A, except at the $(k-1)$-th recursive call, which completes in $O(n\lg n)$ time. So, for each number in S, we have $O(kn)+O(n\lg n)$ work, and since $k\le n$, each iteration of the outer loop takes $O(n^{2})+O(n\lg n)=O(n^{2})$ work. Since the outer loop runs for at most $n$ iterations, we have a total runtime of $O(n^{3})$.

A trick can be used to lower this runtime to $O(n^{2}\lg n)$: have the routines take a set of indices to ignore when iterating in their inner loops, instead of building A. With this, we save the $O(n)$ construction of A for every item, and each iteration of the outer loop becomes $O(n\lg n)$ (how? $O(k)=O(n)$ constant-time recursive calls, plus the $O(n\lg n)$ time spent in the $(k-1)$-th calls), giving a total runtime of $O(n^{2}\lg n)$ for the $n$ elements of S.
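One way to realize this trick in Python is to pass down a set of indices already used at outer levels, so no copy of S is ever made; the base case's two-pointer scan simply steps over those indices. A sketch under that assumption (names and the `frozenset` representation are my own, and $k \ge 2$ is assumed as in the pseudocode):

```python
def check_sum_of_k_noalloc(S, k, T, skip=frozenset()):
    """Like CheckSumOfK, but never copies S: indices consumed at outer
    recursion levels are carried in `skip` instead. Assumes k >= 2."""
    n = len(S)
    if k <= 2:
        # Two-pointer scan over S, stepping over skipped indices.
        lo, hi = 0, n - 1
        while True:
            while lo < n and lo in skip:
                lo += 1
            while hi >= 0 and hi in skip:
                hi -= 1
            if lo >= hi:
                return False
            s = S[lo] + S[hi]
            if s == T:
                return True
            if s < T:
                lo += 1
            else:
                hi -= 1
    for i in range(n):
        if i in skip:
            continue
        # Pick S[i]; the remaining k - 1 numbers must sum to T - S[i].
        if check_sum_of_k_noalloc(S, k - 1, T - S[i], skip | {i}):
            return True
    return False
```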

Note that this cost includes the initial $O(n\lg n)$ cost of sorting S. Also, this algorithm is always $O(n^{k-1}\lg n)$ for all $k\ge 2$; the exercise merely states an upper bound.
