Demodulation of a direct-sequence spread-spectrum (DS/SS) code-division multiple-access (CDMA) signal in the presence of multiple-access interference is considered. The channel output is first passed through a filter matched to the chip waveform, and then sampled at the chip rate. Rather than correlating the output samples with the spreading sequence of the desired user, an N-tap delay line can be used to minimize the mean squared error (MSE) between the transmitted and detected symbols, where N is the processing gain. In principle, this filter can be adapted to minimize the MSE for a desired user in the absence of knowledge of the spreading sequences for the other users. Because of the complexity and coefficient noise associated with such an adaptive filter when N is large, simpler structures with fewer adaptive components are proposed. In the first, the channel output is connected to a bank of filters, each of which is a shifted version of the standard matched filter and whose output is sampled once per symbol interval. In the second, the output of a standard matched filter is sampled multiple times per symbol interval. In each case the multiple samples per symbol are combined via a tapped delay line, where the taps are selected to minimize the MSE. It is shown that the complexities of these two schemes are comparable, but that the first scheme is somewhat more effective in cancelling interference. Numerical results are presented for specific examples illustrating the efficacy of the proposed methods.
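The N-tap MMSE receiver described above can be sketched as follows. This is a minimal illustration, not the paper's simulation setup: the spreading sequences, user amplitudes, noise level, and LMS step size are all invented for this example, and a synchronous two-user channel is assumed. The filter taps are adapted by least-mean-squares (one standard way to minimize the MSE) using known training symbols, with no knowledge of the interferer's spreading sequence.

```python
import numpy as np

# Hypothetical parameters for illustration only.
rng = np.random.default_rng(0)
N = 16                    # processing gain = number of taps
n_symbols = 2000

# Random +/-1 spreading sequences for the desired user and one interferer.
s_des = rng.choice([-1.0, 1.0], size=N)
s_int = rng.choice([-1.0, 1.0], size=N)

b_des = rng.choice([-1.0, 1.0], size=n_symbols)  # desired user's symbols
b_int = rng.choice([-1.0, 1.0], size=n_symbols)  # interferer's symbols

# Chip-rate samples of the chip-matched-filter output in each symbol
# interval (synchronous model: strong interferer, mild Gaussian noise).
r = (b_des[:, None] * s_des
     + 3.0 * b_int[:, None] * s_int
     + 0.1 * rng.standard_normal((n_symbols, N)))

# LMS adaptation of the N taps to minimize the MSE between the filter
# output and the desired symbol; note the interferer's sequence s_int
# is never used here.
w = np.zeros(N)
mu = 0.001
for k in range(n_symbols):
    y = w @ r[k]
    e = b_des[k] - y      # error against the known training symbol
    w += mu * e * r[k]

# After adaptation, the sign of the filter output recovers the symbols
# despite the interferer being 3x stronger than the desired user.
decisions = np.sign(r @ w)
ber = np.mean(decisions != b_des)
```

A conventional correlator (`np.sign(r @ s_des)`) would be degraded by the strong interferer whenever the sequences' cross-correlation is large, whereas the adapted taps place a near-null in the interferer's direction while preserving the desired signal.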