The Effective Number of Parameters
The more complex a model is, the more parameters it has. Many parameters help reduce training error, but they also make the model vulnerable to overfitting. In that case we can lower the model's complexity by imposing a penalty on it. Using this penalty term, only the betas that are truly necessary are kept in the model. How is the number of necessary betas (parameters) decided? It depends on the properties of the matrix that is multiplied by the output vector, i.e., the smoother matrix $S$ in the linear fit $\hat{y} = Sy$.
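As a minimal sketch of this idea (assuming a plain ridge penalty and made-up synthetic data, not anything from the text), the betas are pulled toward zero as the penalty $\lambda$ grows, so only the truly necessary ones stay clearly non-zero:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: only the first 2 of 6 coefficients truly matter.
N, p = 50, 6
X = rng.normal(size=(N, p))
beta_true = np.array([3.0, -2.0, 0.0, 0.0, 0.0, 0.0])
y = X @ beta_true + rng.normal(scale=1.0, size=N)

for lam in [0.0, 1.0, 10.0, 100.0]:
    # Ridge solution: beta = (X^T X + lam I)^{-1} X^T y
    beta_hat = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
    n_active = np.sum(np.abs(beta_hat) > 0.1)  # crude "effectively nonzero" count
    print(f"lambda={lam:6.1f}  betas={np.round(beta_hat, 2)}  active={n_active}")
```

Ridge never sets coefficients exactly to zero, so the 0.1 "active" threshold above is an arbitrary illustration choice, not part of the theory.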
The degrees of freedom are tied to the eigenvalues of the smoother matrix: $\mathrm{df}(S) = \mathrm{trace}(S)$, which equals the sum of the eigenvalues of $S$.
Smoother matrix: https://math.stackexchange.com/questions/2784061/how-to-interpret-the-smoother-matrix
The rank of a matrix is the number of independent column vectors (for a symmetric matrix, the number of non-zero eigenvalues). We can apply an eigendecomposition to our smoother matrix because it is a symmetric matrix.
If $S$ is a projection matrix rather than a smoother matrix, $\mathrm{trace}(S)$ becomes $\mathrm{rank}(S)$, because a projection (idempotent) matrix has only 0 or 1 as eigenvalues. For a shrinking smoother such as the ridge operator $S_\lambda = X(X^TX + \lambda I)^{-1}X^T$, $\|S_\lambda y\|$ is always smaller than $\|y\|$, so each eigenvalue lies between 0 and 1; for ridge they are $\theta_i = d_i^2/(d_i^2 + \lambda)$, where the $d_i$ are the singular values of $X$. I think this is the reason $\mathrm{df}(S_\lambda) = \mathrm{trace}(S_\lambda)$ ends up smaller than the raw parameter count $p$.
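A short numerical check of these claims, assuming the ridge smoother $S_\lambda = X(X^TX + \lambda I)^{-1}X^T$ as the concrete example: its eigenvalues fall in $[0, 1]$, its trace matches $\sum_i d_i^2/(d_i^2+\lambda)$, and at $\lambda = 0$ it reduces to the orthogonal projection with $\mathrm{trace} = \mathrm{rank} = p$.

```python
import numpy as np

rng = np.random.default_rng(1)
N, p = 30, 5
X = rng.normal(size=(N, p))
d = np.linalg.svd(X, compute_uv=False)  # singular values of X

for lam in [0.0, 1.0, 10.0]:
    # Ridge smoother matrix: y_hat = S @ y
    S = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)
    eig = np.linalg.eigvalsh(S)  # S is symmetric, so eigvalsh applies
    print(f"lambda={lam:5.1f}  df=trace(S)={np.trace(S):.3f}  "
          f"sum d_i^2/(d_i^2+lam)={np.sum(d**2 / (d**2 + lam)):.3f}  "
          f"eigenvalues in [{eig.min():.3f}, {eig.max():.3f}]")
# At lam=0, S is the orthogonal projection onto col(X):
# eigenvalues are exactly 0 or 1, and trace(S) = rank(S) = p.
```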
If $S$ is an orthogonal-projection matrix onto a basis set spanned by $M$ features, then $\mathrm{trace}(S) = M$. This effective number of parameters, $\mathrm{df}(S) = \mathrm{trace}(S)$, replaces $d$ in the $C_p$ statistic.
Under the above assumption of an additive-error model $Y = f(X) + \varepsilon$ with $\mathrm{Var}(\varepsilon) = \sigma_\varepsilon^2$, the following expression is satisfied: $\sum_{i=1}^{N} \mathrm{Cov}(\hat{y}_i, y_i) = \mathrm{trace}(S)\,\sigma_\varepsilon^2$, which motivates the more general definition $\mathrm{df}(\hat{y}) = \sum_{i=1}^{N} \mathrm{Cov}(\hat{y}_i, y_i)/\sigma_\varepsilon^2$.
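A Monte Carlo sketch of this identity, assuming a fixed design matrix, a ridge smoother, and Gaussian noise (all constants here are arbitrary): averaged over many draws of $y$, the summed covariance $\sum_i \mathrm{Cov}(\hat{y}_i, y_i)$ should come out close to $\mathrm{trace}(S)\,\sigma_\varepsilon^2$.

```python
import numpy as np

rng = np.random.default_rng(2)
N, p, lam, sigma = 40, 4, 5.0, 1.0
X = rng.normal(size=(N, p))
f = X @ rng.normal(size=p)                 # fixed true mean f(X)
S = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)

# Simulate many responses y = f + eps and estimate sum_i Cov(yhat_i, y_i).
R = 20000
Y = f + sigma * rng.normal(size=(R, N))    # each row is one draw of y
Yhat = Y @ S.T                             # yhat = S y, applied row-wise
cov_sum = np.sum(np.mean((Yhat - Yhat.mean(0)) * (Y - Y.mean(0)), axis=0))

print(f"sum_i Cov(yhat_i, y_i) ~= {cov_sum:.3f}")
print(f"trace(S) * sigma^2      = {np.trace(S) * sigma**2:.3f}")
```

The two printed numbers should agree up to simulation noise; $\mathrm{df}(S) = \mathrm{trace}(S)$ is then the quantity that stands in for $d$ in the $C_p$ statistic above.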
VC dimension
The Vapnik-Chervonenkis theory provides such a general measure of complexity, and gives associated bounds on the optimism. It measures the complexity of a class of functions by assessing how wiggly its members can be.
The VC dimension of the class $\{f(x, \alpha)\}$ is defined to be the largest number of points (in some configuration) that can be shattered by members of $\{f(x, \alpha)\}$.
If, for some configuration of three points, our function class can realize every one of the $2^3$ possible two-class labelings (i.e., it shatters the points), then its VC dimension is at least 3. For linear classifiers in the plane the VC dimension is exactly 3, because no configuration of four points can be shattered.
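To make "shattering" concrete, here is a small sketch assuming linear classifiers in the plane, with linear separability checked exactly by a feasibility LP via scipy.optimize.linprog; the point configurations are arbitrary examples:

```python
import itertools
import numpy as np
from scipy.optimize import linprog

def separable(points, labels):
    """Linearly separable iff there exist w, b with y_i (w.x_i + b) >= 1 for all i."""
    pts = np.asarray(points, dtype=float)
    y = np.asarray(labels, dtype=float)
    # Variables [w1, w2, b]; constraints -y_i * (w.x_i + b) <= -1.
    A = -y[:, None] * np.hstack([pts, np.ones((len(pts), 1))])
    res = linprog(c=np.zeros(3), A_ub=A, b_ub=-np.ones(len(pts)),
                  bounds=[(None, None)] * 3)
    return res.status == 0  # status 0 means a feasible separator exists

three = [(0, 0), (1, 0), (0, 1)]         # non-collinear triple
four = [(0, 0), (1, 0), (0, 1), (1, 1)]  # unit square

for pts in (three, four):
    shattered = all(separable(pts, labs)
                    for labs in itertools.product([-1, 1], repeat=len(pts)))
    print(f"{len(pts)} points shattered by lines: {shattered}")
# Expected: 3 points -> True, 4 points -> False (the XOR labeling is not separable).
```

The three non-collinear points admit all $2^3$ labelings, while the square fails on the XOR labeling; since no four-point configuration works either, the VC dimension of lines in the plane is 3.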
https://keepmind.net/%EA%B8%B0%EA%B3%84%ED%95%99%EC%8A%B5-vc-dimension/