Abstract
In the load balancing problem, there is a set of servers, and jobs arrive sequentially. Each job can be run on some subset of the servers and must be assigned to one of them in an online fashion. Traditionally, an assignment of jobs to servers is measured by the L∞ norm; in other words, it is quantified by the maximum load assigned to any server. Under this measure, the performance of the greedy load balancing algorithm may be a logarithmic factor worse than the offline optimal [3]. In many applications, however, the L∞ norm is not a suitable way to measure how well the jobs are balanced. If each job experiences a delay proportional to the number of jobs on its server, then the average delay over all jobs is proportional to the sum of the squares of the server loads, so minimizing the average delay is equivalent to minimizing the Euclidean (or L2) norm of the load vector. For any fixed p, 1 ≤ p ≤ ∞, we show that the greedy algorithm performs within a constant factor of the offline optimal with respect to the Lp norm. The constant grows linearly with p, which is best possible, but does not depend on the number of servers or jobs.
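As an illustration of the setting (a minimal sketch, not code from the paper), the following Python snippet implements a natural greedy rule for unit-size jobs: assign each arriving job to the least-loaded server among the servers permitted to run it, and evaluate the resulting load vector under the Lp norm. All function and variable names here are illustrative assumptions.

```python
from typing import Sequence


def greedy_assign(jobs: Sequence[Sequence[int]], num_servers: int) -> list[int]:
    """Assign each job to the least-loaded server among those allowed to run it.

    jobs[i] lists the server indices on which job i may run.
    Jobs are assumed to have unit size, so a server's load is its job count.
    """
    loads = [0] * num_servers
    assignment = []
    for allowed in jobs:
        # Greedy rule: pick the permitted server with the smallest current load.
        target = min(allowed, key=lambda s: loads[s])
        loads[target] += 1
        assignment.append(target)
    return assignment


def lp_norm(loads: Sequence[int], p: float) -> float:
    """Lp norm of the load vector; p = 2 corresponds to minimizing average delay."""
    return sum(x ** p for x in loads) ** (1.0 / p)


# Example: 3 servers, each job restricted to a subset of them.
jobs = [[0, 1], [1, 2], [0, 2], [0], [1, 2]]
assignment = greedy_assign(jobs, num_servers=3)
loads = [assignment.count(s) for s in range(3)]
print(assignment, loads, lp_norm(loads, p=2))
```

The paper's result, under this kind of greedy assignment, is that the online load vector's Lp norm is within a factor of O(p) of the offline optimum, independent of the number of servers and jobs.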
Original language | English (US)
---|---
Pages (from-to) | 383-391
Number of pages | 9
Journal | Annual Symposium on Foundations of Computer Science - Proceedings
State | Published - 1995
Event | Proceedings of the 1995 IEEE 36th Annual Symposium on Foundations of Computer Science, Milwaukee, WI, USA. Duration: Oct 23, 1995 → Oct 25, 1995
ASJC Scopus subject areas
- Hardware and Architecture