python – ValueError: A value in x_new is below the interpolation range

This is a scikit-learn error that I ran into while training a model:

my_estimator = LassoLarsCV(fit_intercept=False, normalize=False, positive=True, max_n_alphas=1e5)

Note that if I reduce max_n_alphas from 1e5 down to 1e4, I no longer get this error.

Does anyone have an idea of what is going on?

The error happens when I call

my_estimator.fit(x, y)

I have 40k data points in 40 dimensions.

The full stack trace looks like this:

File "/usr/lib64/python2.7/site-packages/sklearn/linear_model/least_angle.py", line 1113, in fit
    axis=0)(all_alphas)
  File "/usr/lib64/python2.7/site-packages/scipy/interpolate/polyint.py", line 79, in __call__
    y = self._evaluate(x)
  File "/usr/lib64/python2.7/site-packages/scipy/interpolate/interpolate.py", line 498, in _evaluate
    out_of_bounds = self._check_bounds(x_new)
  File "/usr/lib64/python2.7/site-packages/scipy/interpolate/interpolate.py", line 525, in _check_bounds
    raise ValueError("A value in x_new is below the interpolation "
ValueError: A value in x_new is below the interpolation range.
There must be something particular about your data. LassoLarsCV() seems to work fine with this synthetic example of fairly well-behaved data:

import numpy
import sklearn.linear_model

# create 40000 x 40 sample data from linear model with a bit of noise
npoints = 40000
ndims = 40
numpy.random.seed(1)
X = numpy.random.random((npoints, ndims))
w = numpy.random.random(ndims)
y = X.dot(w) + numpy.random.random(npoints) * 0.1

clf = sklearn.linear_model.LassoLarsCV(fit_intercept=False, normalize=False, max_n_alphas=1e6)
clf.fit(X, y)

# coefficients are almost exactly recovered, this prints 0.00377
print max(abs( clf.coef_ - w ))

# the number of alphas actually used is 41, i.e. ndims + 1
print clf.alphas_.shape

That's with sklearn 0.16; I don't have the positive=True option there.

I am not sure why you would want to use a very large max_n_alphas. While I don't know why 1e4 works and 1e5 doesn't in your case, I suspect that the paths you get from max_n_alphas=ndims+1 and max_n_alphas=1e4 (or whatever) are the same for well-behaved data. Also, the optimal alpha estimated by cross-validation in clf.alpha_ is going to be the same. Check out the Lasso path using LARS example for what alpha is trying to do.
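As a quick sanity check, here is a minimal sketch (it reuses the synthetic X, y and ndims from the example above, so it is only an illustration, not a reproduction of your data): two fits that differ only in max_n_alphas should agree on the number of alphas in the path and on the cross-validated alpha_.

import sklearn.linear_model

# Sketch: compare a tight and a huge max_n_alphas on the same data.
# Assumes X, y and ndims from the synthetic example above.
clf_small = sklearn.linear_model.LassoLarsCV(
    fit_intercept=False, normalize=False, max_n_alphas=ndims + 1)
clf_large = sklearn.linear_model.LassoLarsCV(
    fit_intercept=False, normalize=False, max_n_alphas=1e6)
clf_small.fit(X, y)
clf_large.fit(X, y)

# For well-behaved data both should report the same path length
# and the same cross-validated alpha.
print(clf_small.alphas_.shape)  # expected to match clf_large.alphas_.shape
print(clf_large.alphas_.shape)
print(clf_small.alpha_)         # expected to match clf_large.alpha_
print(clf_large.alpha_)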

Also, from the LassoLars documentation:

alphas_ : array, shape (n_alphas + 1,)

    Maximum of covariances (in absolute value) at each iteration. n_alphas
    is either max_iter, n_features, or the number of nodes in the path with
    correlation greater than alpha, whichever is smaller.

So it makes sense that we ended up with alphas_ of size ndims+1 (i.e. n_features+1) above.
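To see those per-iteration maximum covariances directly, here is a rough sketch using sklearn.linear_model.lars_path on the same synthetic X, y as above (an assumption; your own data may behave differently):

from sklearn.linear_model import lars_path

# alphas are the maximum absolute covariances at each step of the LARS/lasso
# path; for this 40-feature example there are at most ndims + 1 of them.
alphas, active, coefs = lars_path(X, y, method='lasso')
print(alphas.shape)  # e.g. (41,), i.e. ndims + 1
print(alphas[:5])    # decreasing along the path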

P.S. Tested with sklearn 0.17.1 and positive=True as well, and also with some positive and negative coefficients; the result is the same: alphas_ has ndims+1 entries or fewer.
