This paper describes the implementation of algorithms for unconstrained optimization that minimize conic objective functions in a finite number of steps when line searches are exact. This work extends the algorithms of Davidon [“Conjugate directions for conic functions,” in Nonlinear Optimization 1981, M.J.D. Powell, ed., Academic Press, London, 1981] and Gourgeon and Nocedal [SIAM J. Sci. Statist. Comput., 6 (1985), pp. 253–267] to general nonlinear objective functions, with particular attention to the practical behavior of the new methods. Three types of algorithms are described; they are extensions of the conjugate gradient method, the BFGS method, and a limited-memory BFGS method. The numerical results show that the new methods are very effective in solving practical problems.
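For readers unfamiliar with the term, a conic function in the sense of Davidon is commonly written in the following standard form (the paper's exact notation may differ); here $g$, $A$, and the horizon vector $a$ are the model's data, and $s = x - x_0$ is the step from the reference point:

$$
f(x_0 + s) \;=\; f(x_0) \;+\; \frac{g^{\mathsf T} s}{1 - a^{\mathsf T} s} \;+\; \frac{1}{2}\,\frac{s^{\mathsf T} A s}{(1 - a^{\mathsf T} s)^2}.
$$

Setting $a = 0$ recovers an ordinary quadratic model, so methods with finite termination on conics generalize the classical finite-termination property of conjugate-direction methods on quadratics.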