ah, homework time again.
if i remember correctly, O(f(n) + g(n)) is the same as O(f(n)) whenever g grows no faster than f (i.e. g = O(f)). thus O(n^2 - n) is effectively O(n^2). look at the definition of big O notation again, i'm sure this result follows directly from it.
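to spell that out for your case: n^2 - n <= n^2 for every n >= 1, so taking c = 1 and n0 = 1 in the definition (T(n) <= c * f(n) for all n >= n0) gives n^2 - n = O(n^2) immediately. (c and n0 here are just the constants from the definition, not anything specific to your assignment.)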
as for the optimal solution, i can think of one with O(n) time complexity but also O(M) space, where M is the target sum you want to reach.
using your example, given a[4] = {1, 2, 3, 4} and a target of 7, initialize an array b[7] to all 0s (in general b has to be indexable up to M). then loop from i = 0 to 3, setting b[a[i]] = 1. then loop again, this time checking whether b[7 - a[i]] == 1; if it is, you've found two numbers that add up to 7 (namely a[i] and 7 - a[i]). of course, this only works if you're dealing with integers and those integers are bounded, you'll have to go through some additional pains to handle negatives, and if the target happens to be twice some element you need to make sure that element actually occurs twice.
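here's a minimal sketch of that idea in C, assuming all values are non-negative and no larger than the target (the function name has_pair_sum is mine, not from your assignment). i've folded the two passes into a single loop, which also sidesteps the pair-an-element-with-itself gotcha, because we only look up values that were inserted on earlier iterations:

```c
#include <stdio.h>
#include <string.h>

/* sketch: returns 1 and fills *x, *y if two elements of a[] sum to target,
   assuming 0 <= a[i] <= target. O(n) time, O(target) extra space. */
int has_pair_sum(const int a[], int n, int target, int *x, int *y)
{
    /* seen[v] == 1 means value v appeared at an earlier index */
    char seen[target + 1];               /* C99 variable-length array */
    memset(seen, 0, target + 1);

    for (int i = 0; i < n; i++) {
        int need = target - a[i];
        if (need >= 0 && need <= target && seen[need]) {
            *x = need;
            *y = a[i];
            return 1;
        }
        if (a[i] >= 0 && a[i] <= target)
            seen[a[i]] = 1;
    }
    return 0;
}

int main(void)
{
    int a[4] = {1, 2, 3, 4};
    int x, y;
    if (has_pair_sum(a, 4, 7, &x, &y))
        printf("%d + %d = 7\n", x, y);   /* prints 3 + 4 = 7 */
    return 0;
}
```

the VLA for seen[] is just for brevity; calloc would be the safer choice if M can get large.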
i can't really think of a reasonable solution that runs in O(n * log(n)). perhaps something involving an initial sort?
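if the initial-sort idea pans out, it might look roughly like this (again just a sketch, and has_pair_sum_sorted is my name, not from your assignment): sort in O(n * log(n)), then walk two indices inward from the ends in O(n). as a bonus this version doesn't care whether the values are negative or how big they are, it just sorts the array in place:

```c
#include <stdio.h>
#include <stdlib.h>

/* overflow-safe comparator for qsort */
static int cmp_int(const void *p, const void *q)
{
    int a = *(const int *)p, b = *(const int *)q;
    return (a > b) - (a < b);
}

/* sketch: sort, then close in from both ends. O(n log n) overall. */
int has_pair_sum_sorted(int a[], int n, int target, int *x, int *y)
{
    qsort(a, n, sizeof a[0], cmp_int);    /* O(n log n) */
    int lo = 0, hi = n - 1;
    while (lo < hi) {                     /* O(n) scan */
        int s = a[lo] + a[hi];
        if (s == target) { *x = a[lo]; *y = a[hi]; return 1; }
        if (s < target) lo++;             /* sum too small: move left edge up */
        else            hi--;             /* sum too big: move right edge down */
    }
    return 0;
}

int main(void)
{
    int a[4] = {1, 2, 3, 4};
    int x, y;
    if (has_pair_sum_sorted(a, 4, 7, &x, &y))
        printf("%d + %d = 7\n", x, y);    /* prints 3 + 4 = 7 */
    return 0;
}
```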