This paper concerns the power minimization problem in server farms. The power minimization problem over dynamic power allocation schemes is formally defined and formulated as an optimization problem. It is shown that computing the optimal solution to this optimization problem is not feasible. To obtain a sub-optimal solution, Real-Time Optimization (RTO), a well-established control-theoretic method for optimizing a cost function subject to the constraints imposed by the evolution of a dynamical system, is invoked. The resulting algorithm is simulated and compared with the state-of-the-art optimal static power allocation solution. When dynamic power allocation is used, a considerable improvement in energy consumption is attained for the same quality-of-service (QoS) level.
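To convey the intuition behind why dynamic allocation can save energy at the same QoS level, the following is a minimal sketch under modeling assumptions that are not taken from the paper: each server is treated as an M/M/1 queue whose service rate scales linearly with allocated power (a hypothetical coefficient `k`), and the QoS target is a mean response time bound `D`. Under these assumptions, the smallest power that meets the target at arrival rate `lam` is `(lam + 1/D) / k`; a static scheme must provision for the peak load, while a dynamic (RTO-style) scheme re-solves the allocation each measurement interval.

```python
# Hypothetical sketch: dynamic vs. static power allocation for one server.
# Assumptions (NOT from the paper): M/M/1 queue, service rate mu = k * p,
# QoS target = mean response time <= D.  Minimum feasible power at arrival
# rate lam is then p = (lam + 1/D) / k.

def min_power(lam, delay_target, k=1.0):
    """Smallest power meeting the mean-delay target for an M/M/1 server."""
    return (lam + 1.0 / delay_target) / k

def static_energy(loads, delay_target):
    """Static scheme: provision every interval for the peak arrival rate."""
    peak = max(loads)
    return sum(min_power(peak, delay_target) for _ in loads)

def dynamic_energy(loads, delay_target):
    """Dynamic scheme: re-solve the allocation each interval using the
    measured arrival rate, mimicking a real-time optimization loop."""
    return sum(min_power(lam, delay_target) for lam in loads)

if __name__ == "__main__":
    # 24 hourly measurements of the arrival rate (jobs/s), illustrative only.
    loads = [2, 1, 1, 1, 2, 4, 8, 12, 14, 15, 14, 13,
             12, 13, 14, 15, 14, 12, 10, 8, 6, 4, 3, 2]
    D = 0.5  # mean-delay target (s); both schemes meet the same QoS
    print(f"static : {static_energy(loads, D):.1f}")   # prints 408.0
    print(f"dynamic: {dynamic_energy(loads, D):.1f}")  # prints 248.0
```

The gap between the two totals is exactly the energy wasted by sizing power for the worst case during off-peak intervals; the paper's RTO approach pursues the same idea under the dynamics and constraints of its formal model.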