Stratified sampling and Latin hypercube sampling (LHS) reduce variance, relative to naïve Monte Carlo sampling, by partitioning the support of a random vector into strata. When constructing these estimators, we must determine (i) the number of strata and (ii) the partition that defines them. In this paper, we address the second question by formulating a nonlinear optimization model that designs the strata to yield a minimum-variance stratified sampling estimator. Given a discrete set of candidate boundary points, the optimization model can be solved via dynamic programming. We extend this technique to LHS, using an approximation of estimator variance to obtain strata for the domain of a multivariate function. Empirical results show significant variance reduction compared to LHS with equal-probability strata and to naïve Monte Carlo sampling.
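A minimal sketch of the dynamic-programming idea, assuming a one-dimensional function f on [0,1] with proportional sample allocation, in which case the stratified estimator's variance is proportional to the sum over strata of p_h·σ_h² (stratum probability times within-stratum variance). This is our illustration, not the paper's implementation; the function name `stratify_dp`, the grid-based variance estimates, and all parameters are hypothetical:

```python
import numpy as np

def stratify_dp(f, candidates, n_strata, n_grid=2000):
    """Choose strata boundaries on [0, 1] from a discrete candidate set,
    minimizing sum_h p_h * sigma_h^2 (the variance of a proportionally
    allocated stratified estimator, up to the 1/n factor), via DP.
    NOTE: illustrative sketch only; within-stratum variances are
    estimated on a fine grid of function evaluations."""
    # fine midpoint grid used to estimate within-stratum variances
    x = np.linspace(0.0, 1.0, n_grid, endpoint=False) + 0.5 / n_grid
    fx = f(x)
    m = len(candidates)  # candidates must include 0.0 (first) and 1.0 (last)
    # cost[i, j]: contribution p * sigma^2 of stratum [candidates[i], candidates[j])
    cost = np.full((m, m), np.inf)
    for i in range(m - 1):
        for j in range(i + 1, m):
            a, b = candidates[i], candidates[j]
            vals = fx[(x >= a) & (x < b)]
            if len(vals) > 1:
                cost[i, j] = (b - a) * vals.var()
    # best[k, j]: minimum cost of covering [0, candidates[j]) with k strata
    best = np.full((n_strata + 1, m), np.inf)
    arg = np.zeros((n_strata + 1, m), dtype=int)
    best[0, 0] = 0.0
    for k in range(1, n_strata + 1):
        for j in range(1, m):
            for i in range(j):
                c = best[k - 1, i] + cost[i, j]
                if c < best[k, j]:
                    best[k, j], arg[k, j] = c, i
    # backtrack the optimal boundary sequence
    bounds, j = [candidates[-1]], m - 1
    for k in range(n_strata, 0, -1):
        j = arg[k, j]
        bounds.append(candidates[j])
    return bounds[::-1], best[n_strata, m - 1]
```

Because any fixed set of boundaries drawn from the candidate grid is a feasible DP solution, the optimum is never worse than, e.g., equal-probability strata on the same grid.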