Covariance kernels of random processes with multidimensional output do not always yield symmetric matrices

I think there is a mistake in the way covariance kernels for random processes with multidimensional output are handled by the library.

In the following code block, operator() returns a CovarianceMatrix, which I do not think is correct, because the returned matrix is not necessarily symmetric.

To prove this point, consider a stationary Gaussian process Y with 1D input and 1D output. Let k:\mathbb{R} \rightarrow \mathbb{R} be its (stationary) covariance kernel.

Now let h be a real number and let Z be the Gaussian process defined by Z(t) = Y(t+h).

The process (Y,Z)^\intercal is Gaussian, stationary, with 1D input and 2D output. Let K be its covariance kernel. For any \tau \in \mathbb{R},

K(\tau) = \begin{pmatrix} Cov(Y(0), Y(\tau)) & Cov(Y(0), Z(\tau)) \\ Cov(Z(0), Y(\tau)) & Cov(Z(0), Z(\tau)) \end{pmatrix} ;
K(\tau) = \begin{pmatrix} Cov(Y(0), Y(\tau)) & Cov(Y(0), Y(\tau+h)) \\ Cov(Y(h), Y(\tau)) & Cov(Y(h), Y(\tau+h)) \end{pmatrix} ;
K(\tau) = \begin{pmatrix} k(\tau) & k(\tau + h) \\ k(\tau-h) & k(\tau) \end{pmatrix} .

This matrix is not symmetric in general, since k(\tau + h) need not equal k(\tau - h).
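Here is a minimal numerical sketch of the counterexample, assuming a squared-exponential kernel k(\tau) = \exp(-\tau^2/2) and a hypothetical shift h = 1; it only illustrates the math above, not the library API.

```python
import numpy as np

# Illustration only: squared-exponential kernel k and the 2x2 matrix K(tau)
# of the process (Y, Z) with Z(t) = Y(t + h).
def k(tau):
    return np.exp(-0.5 * tau ** 2)

def K(tau, h):
    return np.array([[k(tau),     k(tau + h)],
                     [k(tau - h), k(tau)    ]])

tau, h = 0.3, 1.0   # hypothetical values
M = K(tau, h)
print("symmetric:", np.allclose(M, M.T))                      # False: k(tau+h) != k(tau-h)
print("K(-tau) == K(tau)^T:", np.allclose(K(-tau, h), M.T))   # True
```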

I will try to propose a fix shortly.

Let \boldsymbol{Y} = (Y_1,\dots,Y_d)^\intercal be a stationary Gaussian process with one-dimensional input and d-dimensional output. Let K be its (stationary) covariance kernel. For any \tau \in \mathbb{R}, the (i,j)-th element of K(\tau) is equal to Cov(Y_i(0), Y_j(\tau)). The (j,i)-th element of K(-\tau) is

Cov(Y_j(0), Y_i(-\tau)) = Cov(Y_i(-\tau), Y_j(0)) = Cov(Y_i(0), Y_j(\tau)),

where the first equality holds by symmetry of the covariance and the second by stationarity (shifting both time arguments by \tau). This is exactly the (i,j)-th element of K(\tau).
Therefore the matrix K(-\tau) is the transpose of the matrix K(\tau).
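As a quick numerical check of this identity (again only a sketch, with a squared-exponential k and a hypothetical d-dimensional process built from shifted copies Y_i(t) = Y(t + h_i), so that K(\tau)_{ij} = k(\tau + h_j - h_i)):

```python
import numpy as np

# Check K(-tau) == K(tau)^T for a d-dimensional process made of shifted copies
# of a 1D stationary process: Y_i(t) = Y(t + h_i).
def k(tau):
    return np.exp(-0.5 * tau ** 2)

h = np.array([0.0, 0.5, 1.7])              # hypothetical shifts, d = 3
def K(tau):
    return k(tau + h[None, :] - h[:, None])  # entry (i, j) is k(tau + h_j - h_i)

tau = 0.8
print(np.allclose(K(-tau), K(tau).T))       # True: K(-tau) is the transpose of K(tau)
```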

In the UserDefinedStationaryCovarianceModel class, the user only needs to specify the matrices K(t_k) for nonnegative time stamps t_k, but operator() must return the transpose of K(t_k) when it is called at -t_k.
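A hypothetical sketch of that lookup rule (not the actual UserDefinedStationaryCovarianceModel implementation; the class name, grid handling and interpolation below are placeholders):

```python
import numpy as np

# Illustration of the proposed behavior: store K(t_k) for t_k >= 0 only,
# and return the transpose when evaluated at a negative lag.
class StationaryKernelTable:
    def __init__(self, times, matrices):
        # times: increasing nonnegative lags t_k; matrices: K(t_k) as (d, d) arrays
        self.times = np.asarray(times)
        self.matrices = [np.asarray(m) for m in matrices]

    def __call__(self, tau):
        # Nearest stored lag to |tau|; the real model would match the grid
        # exactly or interpolate, this is only an illustration of the sign rule.
        idx = int(np.argmin(np.abs(self.times - abs(tau))))
        K = self.matrices[idx]
        return K if tau >= 0 else K.T
```

With such a rule, evaluating the model at -t_k automatically returns K(t_k)^\intercal, consistent with the identity K(-\tau) = K(\tau)^\intercal derived above.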