I doubt you'll get far with timid attempts to guess the correct answer. A better strategy would be to post a solution and let others review it. I'll try doing this myself, but I am very rusty on differential geometry and I don't have any textbooks at hand, so my sketch of a proof may have holes and errors. For that reason I've decided to break the rules and post it now rather than wait the mandatory week after the OP. Please feel free to question my write-up and ask about anything unclear.
Let $\phi: U_x \to \mathbb{R}^{n-1}$ be a local chart at $x$ and $\theta$ its inverse (a.k.a. a parametrization of $M$):
$$\theta: V \subset \mathbb{R}^{n-1} \to U \subset M \subset \mathbb{R}^n \quad\text{or}\quad \theta: V \subset \mathbb{R}^{n-1} \to \mathbb{R}^n.$$
The functions $\theta$ and $f$ have derivatives $\nabla\theta$ and $\nabla f$ respectively, where $\nabla\theta$ is an $n \times (n-1)$ matrix, a.k.a. the Jacobian, and $\nabla f$, a.k.a. the gradient, is a $1 \times n$ matrix, i.e. a row vector. Note: it can be shown that $\nabla\theta$ and $\nabla f$ have ranks $n-1$ and $1$ respectively.
Since $f \circ \theta = 0$, the chain rule gives $\nabla f \times \nabla\theta = 0$.
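As a concrete sanity check (my own example, not part of the question): take the unit sphere $S^2 = f^{-1}(0)$ in $\mathbb{R}^3$ with $f(x) = |x|^2 - 1$ and the usual spherical parametrization, and verify numerically that $\nabla\theta$ and $\nabla f$ have the claimed ranks and that $\nabla f \times \nabla\theta = 0$:

```python
import numpy as np

# Hypothetical numeric check on the unit sphere S^2 = f^{-1}(0) in R^3,
# with f(x) = |x|^2 - 1 and the spherical parametrization theta.
def f(x):
    return np.dot(x, x) - 1.0

def theta(u):  # u = (u1, u2) in V ⊂ R^2, image lies on S^2 ⊂ R^3
    return np.array([np.sin(u[0]) * np.cos(u[1]),
                     np.sin(u[0]) * np.sin(u[1]),
                     np.cos(u[0])])

def jacobian(g, u, h=1e-6):
    """Central-difference Jacobian; columns are partial derivatives of g."""
    u = np.asarray(u, dtype=float)
    cols = [(g(u + h * e) - g(u - h * e)) / (2 * h) for e in np.eye(len(u))]
    return np.column_stack(cols)

u0 = np.array([0.7, 1.2])                  # a point of the chart domain V
grad_theta = jacobian(theta, u0)           # n x (n-1) matrix, here 3 x 2
grad_f = jacobian(f, theta(u0))            # 1 x n row vector, here 1 x 3

print(np.linalg.matrix_rank(grad_theta))   # 2, i.e. n-1
print(np.linalg.matrix_rank(grad_f))       # 1
print(np.allclose(grad_f @ grad_theta, 0, atol=1e-6))  # True: ∇f × ∇θ = 0
```

This is only a finite-difference sketch at one point of one chart, of course, not a proof; it is just to make the shapes and ranks of the two matrices tangible.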
Lemma: a vector $X \in \mathbb{R}^n$ is tangent to $M$ if and only if $X = \nabla\theta \times Y$ for some $Y \in \mathbb{R}^{n-1}$.
The proof of the lemma is left to the reader, but here is the rest of the proof, in both directions of the "if and only if":
- If $X \in \mathbb{R}^n$ is a tangent vector to $M$, then by the lemma $X = \nabla\theta \times Y$ for some $Y$, and so $X_x(f) = \nabla f \times X = \nabla f \times \nabla\theta \times Y = 0$.
- If for some $X \in \mathbb{R}^n$ we have $\nabla f \times X = 0$, then consider the $n \times n$ matrix $A = (\nabla\theta, X)$, i.e. $\nabla\theta$ with $X$ appended as the $n$-th column. We now have $\nabla f \times A = 0$, and since $\nabla f \neq 0$ this forces $\det A = 0$, so the columns of $A$ are linearly dependent. Because the $n-1$ columns of $\nabla\theta$ are linearly independent (it has rank $n-1$), the dependence must involve $X$ with a nonzero coefficient, i.e. $X$ belongs to the subspace spanned by the columns of $\nabla\theta$, which are tangent vectors to $M$; by the lemma, $X$ is tangent to $M$.
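The converse direction can also be sanity-checked numerically (again my own toy example on the sphere, with the analytic gradient $\nabla f(p) = 2p$): build an $X$ with $\nabla f \times X = 0$ and confirm it lies in the column space of $\nabla\theta$, i.e. $X = \nabla\theta \times Y$ for some $Y$:

```python
import numpy as np

# Hypothetical check of the converse on S^2 = f^{-1}(0), f(x) = |x|^2 - 1:
# a vector X annihilated by ∇f lies in the column space of ∇θ.
def theta(u):  # spherical parametrization of S^2
    return np.array([np.sin(u[0]) * np.cos(u[1]),
                     np.sin(u[0]) * np.sin(u[1]),
                     np.cos(u[0])])

def jacobian(g, u, h=1e-6):
    u = np.asarray(u, dtype=float)
    cols = [(g(u + h * e) - g(u - h * e)) / (2 * h) for e in np.eye(len(u))]
    return np.column_stack(cols)

u0 = np.array([0.7, 1.2])
p = theta(u0)
grad_f = 2 * p                      # analytic ∇f(p) for f(x) = |x|^2 - 1
grad_theta = jacobian(theta, u0)    # 3 x 2; columns span the tangent plane

# By construction grad_f · X = 0 exactly:
X = np.array([0.0, grad_f[2], -grad_f[1]])
# Solve ∇θ × Y = X in the least-squares sense; a near-zero residual
# means X really is in the column space of ∇θ, as the argument claims.
Y, *_ = np.linalg.lstsq(grad_theta, X, rcond=None)
print(np.allclose(grad_theta @ Y, X, atol=1e-5))  # True: X = ∇θ × Y
```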