alizadeh@umn-cs.CS.UMN.EDU (Farid Alizadeh) (11/26/89)
Let S be a set of m affine functions over R^n:
S = { (a_i,x) + b_i | a_i and x are n-vectors, b_i is a real number, 1 <= i <= m }
where (x,y) denotes the inner product of two vectors.
Define the function f_k : R^n -> R by
f_k(x) = the sum of the k largest of the values (a_i,x) + b_i, 1 <= i <= m.
We want to compute min over x of f_k(x).
Now it is easy to show that f_k is a convex function: f_k(x) is the maximum, over all
k-element subsets T of {1,...,m}, of the affine function sum_{i in T} ((a_i,x) + b_i),
and a maximum of affine functions is convex. Clearly f_k(x) can also be evaluated for
any given x. Therefore one can use the ellipsoid method and compute the minimum in
polynomial time. Are there better methods?
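For concreteness, here is a minimal sketch of evaluating f_k at a point x; it assumes
numpy, with the rows of an m-by-n array A holding the vectors a_i and an m-vector b
holding the scalars b_i (these array names are my own illustration, not part of the
problem statement):

    import numpy as np

    def f_k(x, A, b, k):
        # evaluate all m affine functions at x, then sum the k largest values
        values = A @ x + b
        return np.sort(values)[-k:].sum()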
Note that for k=1 one can easily transform the problem into a linear program
by introducing a new variable z. So, for k=1 the problem is equivalent to
the following linear program with m constraints:
minimize z
subject to z - (a_i,x) - b_i >= 0 for 1 <= i <= m
Of course, the analogous transformation for general k requires one constraint for each
k-element subset T of {1,...,m}, namely z >= sum_{i in T} ((a_i,x) + b_i). That gives
C(m,k) (m choose k) constraints, which is not polynomial.
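As a sanity check, the k=1 formulation above can be handed to any LP solver directly.
Here is a minimal sketch assuming scipy's linprog; the function name minimize_f1 and
the array names A, b are hypothetical, chosen only for illustration:

    import numpy as np
    from scipy.optimize import linprog

    def minimize_f1(A, b):
        # variables are (x_1, ..., x_n, z); the objective is to minimize z
        m, n = A.shape
        c = np.zeros(n + 1)
        c[-1] = 1.0
        # z - (a_i,x) - b_i >= 0  rewritten as  (a_i,x) - z <= -b_i
        A_ub = np.hstack([A, -np.ones((m, 1))])
        b_ub = -b
        bounds = [(None, None)] * (n + 1)   # x and z are free variables
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
        return res.x[:n], res.x[-1]         # minimizer x and minimum value z = f_1(x)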
My question is: are there good algorithms other than the ellipsoid method for this
problem? I would accept a simplex-type method as an efficient method. How about
interior point methods? Can we transform this problem into one whose number of
constraints is polynomially bounded in n, m and k? Would it help to look at the dual?