An easy-to-understand, but not completely correct, argument:
First consider only series of positive terms that diverge to infinity. Given any divergent series of this type a_n, we are asked to partition it into two other series a_(n_i) and a_(n_j) such that both diverge. Such a series diverges if, for every real number K, there is a partial sum of it that is larger than K. Obviously, if the series diverges and we remove a finite number of terms from it, it will continue to diverge.
Now we have to partition it into two divergent series. Start with a_n and pick the number 1: keep summing the series until the partial sum exceeds 1 (this will always happen, because the series diverges), and put all of these terms into the subsequence a_(n_i). You have now removed a finite number of terms from the series, so it still diverges. Then, starting from the first term you didn't remove, keep summing until you exceed 1 again, and put all of those terms into a_(n_j).
Keep repeating this process with the remaining sequence. By construction, each subsequence we created splits into infinitely many blocks, each of whose sums exceeds 1. So, given any real number K, pick a natural number n > K; the sum of any n such blocks exceeds n, and since n > K, it exceeds K as well. So both series diverge.
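The construction above can be sketched in code. This is only an illustration under my own naming (`partition_positive` is not from the original argument), again using the harmonic series as the divergent positive series:

```python
import itertools

def partition_positive(terms, num_blocks):
    """Alternately hand out blocks of consecutive terms, each block
    summing to more than 1, to two subsequences."""
    subseqs = ([], [])  # the two subsequences a_(n_i) and a_(n_j)
    which = 0
    for _ in range(num_blocks):
        block_sum = 0.0
        # Consume terms until this block's sum exceeds 1; this always
        # terminates because the tail of a divergent positive series
        # still diverges.
        while block_sum <= 1.0:
            t = next(terms)
            subseqs[which].append(t)
            block_sum += t
        which = 1 - which  # alternate between the two subsequences
    return subseqs

harmonic = (1.0 / n for n in itertools.count(1))
a_ni, a_nj = partition_positive(harmonic, 6)
# Each subsequence received 3 blocks, each summing to more than 1,
# so both partial sums already exceed 3.
```

Drawing 2n blocks guarantees each subsequence's partial sum exceeds n, mirroring the "n blocks exceed K" step of the argument.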
Obviously, that doesn't solve the whole problem, because in order to diverge a series doesn't need to have positive terms, nor do its partial sums need to go to infinity ((-1, 1, -1, 1, ...), for example). So we need to generalize this argument to account for the general case.
Proof:
By definition, a series diverges if and only if the sequence of its partial sums diverges. Since the set of real numbers is complete, a sequence diverges if and only if it is not a Cauchy sequence.
So, given a divergent series a_n, there is an epsilon > 0 such that for any choice of N > 0, there are n, m >= N with |a_n + a_(n+1) + ... + a_m| >= epsilon.
We then partition the series as in the previous method. Fix such an epsilon; find N, n, m satisfying the inequality and put all terms from a_1 to a_m into a_(n_i). After removing this finite number of terms, the inequality can still be satisfied with the same epsilon, since it holds for every N > 0. So we find another pair n, m satisfying the inequality and put all remaining terms up to the new a_m into a_(n_j).
Repeating this process indefinitely, each of the two series contains infinitely many blocks whose sums are at least epsilon in absolute value, so the partial sums of neither series form a Cauchy sequence, and therefore both diverge.
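The general construction can also be sketched, at least for a concrete example. This is a hypothetical illustration of my own: for the alternating series with terms (-1)^(n+1), epsilon = 1 works, and every single-term block already has |sum| >= epsilon, so the greedy accumulation below is guaranteed to close each block:

```python
import itertools

def partition_general(terms, epsilon, num_blocks):
    """Alternately assign blocks of consecutive terms to two subsequences,
    closing a block as soon as its sum reaches epsilon in absolute value."""
    subseqs = ([], [])  # the two subsequences a_(n_i) and a_(n_j)
    which = 0
    for _ in range(num_blocks):
        block, block_sum = [], 0.0
        # For the alternating series this loop ends after one term; in
        # general the Cauchy-criterion argument only guarantees that some
        # block with |sum| >= epsilon exists, not that this greedy scan
        # finds it, so this sketch is tailored to the example.
        while abs(block_sum) < epsilon:
            t = next(terms)
            block.append(t)
            block_sum += t
        subseqs[which].extend(block)
        which = 1 - which
    return subseqs

alternating = ((-1.0) ** (n + 1) for n in itertools.count(1))
a_ni, a_nj = partition_general(alternating, 1.0, 8)
print(a_ni, a_nj)  # [1.0, 1.0, 1.0, 1.0] and [-1.0, -1.0, -1.0, -1.0]
```

Here the two halves come out as 1 + 1 + 1 + ... and (-1) + (-1) + (-1) + ..., both of which visibly fail the Cauchy criterion.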
This method partitions any divergent series into two divergent ones, so no series can "diverge minimally".