This study considers a modification of the usual continuous-time optimal control problem in which a decision, chosen from a continuum of admissible decisions, is made at each point of a discrete, finite set of times. Dynamic programming techniques are used to derive two necessary conditions for the relative minimality of a trajectory, under the rather strong assumption that the optimal value function is sufficiently smooth. Under certain convexity assumptions, a "maximum principle" of Pontryagin type is also deduced, although in general no such principle holds for discrete problems. 31 pp. Ref.
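
The abstract gives no algorithmic details, but the problem class it describes — a finite set of decision times, each decision drawn from a continuum — can be sketched with a standard backward dynamic-programming recursion. The scalar linear-quadratic example below is purely illustrative and is not taken from the paper; all names and parameter values are assumptions. Here the minimization over the continuum of controls at each stage has a closed form, yielding the familiar Riccati recursion, and the optimal value function is `P[k] * x**2`.

```python
def riccati_backward(a, b, q, r, qf, N):
    # Backward dynamic programming for the scalar LQR problem:
    #   minimize  sum_{k=0}^{N-1} (q x_k^2 + r u_k^2) + qf x_N^2
    #   subject to x_{k+1} = a x_k + b u_k,  u_k ranging over the reals.
    # At each stage the minimization over the continuum of controls u_k
    # is solved in closed form, giving value P[k] x^2 and gain K[k].
    P = [0.0] * (N + 1)
    K = [0.0] * N
    P[N] = qf
    for k in range(N - 1, -1, -1):
        K[k] = a * b * P[k + 1] / (r + b * b * P[k + 1])
        P[k] = q + a * a * P[k + 1] - a * b * P[k + 1] * K[k]
    return P, K

def simulate(a, b, q, r, qf, x0, K):
    # Roll the closed-loop system forward and accumulate the cost.
    x, cost = x0, 0.0
    for Kk in K:
        u = -Kk * x
        cost += q * x * x + r * u * u
        x = a * x + b * u
    return cost + qf * x * x
```

For this example the simulated closed-loop cost equals `P[0] * x0**2`, the value-function identity that underlies the dynamic-programming derivation of necessary conditions.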