Traditionally, multiplication takes two elements of a set and turns them into another element of that set. The dot product doesn't do that: it takes two vectors and turns them into a scalar.
The cross product only exists in three dimensions. And it's not associative (A×B×C gives a different answer depending on which order you do it in), which is another property multiplication usually satisfies.
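To make both points concrete, here's a minimal NumPy sketch (the arrays are arbitrary example values): the dot product leaves the vector space entirely, and the cross product depends on how you group it.

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])
c = np.array([7.0, 8.0, 9.0])

# Dot product: two vectors in, one scalar out -- it leaves the vector space.
print(np.dot(a, b))                  # 32.0

# Cross product: stays in R^3, but grouping matters, so it's not associative.
left = np.cross(np.cross(a, b), c)   # (a x b) x c
right = np.cross(a, np.cross(b, c))  # a x (b x c)
print(left)                          # [ 78.   6. -66.]
print(right)                         # [-24.  -6.  12.]
print(np.allclose(left, right))      # False
```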
There are two other not-quite-multiplication operators that I recall seeing. There's an analog of the cross product in two dimensions: (a,b,0)×(c,d,0) = (0,0,ad-bc), so it can be useful to have an operator (a,b)×(c,d) = ad-bc, again turning two vectors into a scalar.
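A quick sketch of that 2D version (cross_2d is just an illustrative helper name, not a standard function):

```python
import numpy as np

def cross_2d(u, v):
    """Scalar 'cross product' in 2D: the z-component of the embedded 3D cross product."""
    return u[0] * v[1] - u[1] * v[0]

u = np.array([2.0, 1.0])
v = np.array([3.0, 4.0])

print(cross_2d(u, v))                                   # 5.0
# The same number shows up as the only nonzero component of the 3D version:
print(np.cross(np.append(u, 0.0), np.append(v, 0.0)))   # [0. 0. 5.]
```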
And if the dot product is defined in terms of matrix multiplication by A·B = AᵀB, then you can also define an operator ABᵀ, turning two vectors into a matrix. These vectors don't even need to have the same length.
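This ABᵀ construction is usually called the outer product; NumPy exposes it as np.outer, and it happily accepts vectors of different lengths:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])    # length 3
b = np.array([10.0, 20.0])       # length 2 -- a different length is fine

# a b^T: a 3x1 column times a 1x2 row gives a 3x2 matrix.
print(np.outer(a, b))
# [[10. 20.]
#  [20. 40.]
#  [30. 60.]]

# a^T a with matching lengths collapses back to the ordinary dot product.
print(np.dot(a, a))              # 14.0
```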
> The cross product only exists in three dimensions
While true, the wedge product is a useful concept that generalizes the cross product to arbitrary dimensions and is used in multivariable calculus for proving various integral theorems in higher dimensions. There, the generalized Stokes theorem applies even though the cross product isn't defined. Admittedly, it isn't really a map on the vector space, but the fact that Stokes' theorem still holds makes it pretty darn useful to me.
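One concrete way to play with it numerically is to represent u ∧ v by its antisymmetric coefficient matrix (a sketch; wedge here is just an illustrative helper, and in 3D its independent entries recover the cross product):

```python
import numpy as np

def wedge(u, v):
    """Coefficients of the bivector u ∧ v, stored as an antisymmetric matrix."""
    return np.outer(u, v) - np.outer(v, u)

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

B = wedge(u, v)
print(B)
# In three dimensions the independent off-diagonal entries are exactly
# the components of the cross product (up to ordering and sign):
print(B[1, 2], B[2, 0], B[0, 1])   # -3.0 6.0 -3.0
print(np.cross(u, v))              # [-3.  6. -3.]
```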
To add to the other responses at this level, I want to point out that one form of vector-vector "multiplication"—inner products—corresponds to applying linear functionals, i.e., to linearly mapping a vector space into its underlying scalar field.
So just as every matrix is a representation of a linear transformation taking vectors to other vectors, with matrix-vector multiplication corresponding to function application, each vector in an inner-product space represents a linear transformation taking vectors in the space to scalars, with vector-vector multiplication in the form of an inner product corresponding to function application. The converse is also true: every linear functional on the space can be represented by a vector in the space.
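In coordinates the parallel is literal (arbitrary example values): a matrix applies a linear map, and a vector, via the inner product, applies a linear functional.

```python
import numpy as np

A = np.array([[1.0, 0.0, 2.0],
              [0.0, 3.0, 1.0]])    # a linear map R^3 -> R^2
v = np.array([1.0, -2.0, 4.0])     # a vector, read as a functional R^3 -> R
x = np.array([5.0, 6.0, 7.0])

# Matrix-vector multiplication = applying the linear map to x.
print(A @ x)                       # [19. 25.]

# Inner product = applying the functional represented by v to x.
print(np.dot(v, x))                # 21.0
```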
This last insight is known (in various forms) as the Riesz representation theorem and holds not only on finite-dimensional inner-product spaces (i.e., vector spaces on which an inner product is defined) but also on Hilbert spaces (complete inner-product spaces, whether finite- or infinite-dimensional). It turns out to be quite powerful.
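In finite dimensions the representing vector is easy to recover concretely: evaluate the functional on the standard basis. A sketch, with f as an arbitrary example functional given only as a black box:

```python
import numpy as np

def f(x):
    """Some linear functional on R^3, known only as a black-box function."""
    return 2.0 * x[0] - x[1] + 5.0 * x[2]

n = 3
# Riesz in finite dimensions: the representing vector's components are
# f's values on the standard basis vectors, and then f(x) = <v, x> for all x.
v = np.array([f(e) for e in np.eye(n)])
print(v)                           # [ 2. -1.  5.]

x = np.array([1.0, 2.0, 3.0])
print(f(x), np.dot(v, x))          # 15.0 15.0
```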
Well, it actually depends on what you've been told "multiplication" is. Multiplication should be closed, so the dot product is not a multiplication because the result is not a vector (unless you are using 1-dimensional vectors, sure, but even then the result is still not a vector). The wedge (or outer, or cross) product is a more delicate issue: it works as a product, but to get it actually well defined you end up at the generalisation (exterior algebras), and those are also not closed in this sense, because the exterior algebra is a different space from the source and only has the same dimension in a few cases.
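The "same dimension in a few cases" point is just counting: the wedge of two vectors in Rⁿ is a bivector, living in a space of dimension n-choose-2, and that equals n only at n = 3. A quick check:

```python
from math import comb

# dim of the bivector space over R^n is C(n, 2); it matches n only at n = 3,
# which is why the cross product (a bivector read back as a vector) is a
# three-dimensional accident.
for n in range(1, 8):
    print(n, comb(n, 2), comb(n, 2) == n)
```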