Our ongoing exploration of the structure function for oddly shaped masks has shown that the slope is dominated by the mask on most of the relevant scales. Since the VCS is inherently linked to this slope, it should show the same sensitivity. The slopes should therefore be affected, returning $m\approx1.5$, where $m$ is the structure function slope; but will the break point change as well? The break point marks the resolution limit of the data (Lazarian & Pogosyan, 2006). Even without adding noise to the simulation, we see the position of the break change.
In [86]:
%matplotlib inline
import matplotlib.pyplot as p
In [87]:
from spectral_cube import SpectralCube
cube = SpectralCube.read("/Users/eric/Dropbox/AstroStatistics/Design4_21_0_0_flatrho_0021_13co.fits")
In [88]:
from turbustat.statistics import VCS
In [108]:
vcs_clean = VCS(cube).run(verbose=True)
Breaks found from spline are: [-1.19382003 -0.89962945 -0.60205999]
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.869
Model: OLS Adj. R-squared: 0.869
Method: Least Squares F-statistic: 3284.
Date: Mon, 21 Mar 2016 Prob (F-statistic): 4.07e-220
Time: 23:19:47 Log-Likelihood: -389.24
No. Observations: 495 AIC: 782.5
Df Residuals: 493 BIC: 790.9
Df Model: 1
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [95.0% Conf. Int.]
------------------------------------------------------------------------------
const 0.3577 0.051 7.079 0.000 0.258 0.457
x1 -3.5672 0.062 -57.305 0.000 -3.690 -3.445
==============================================================================
Omnibus: 27.236 Durbin-Watson: 0.005
Prob(Omnibus): 0.000 Jarque-Bera (JB): 30.403
Skew: -0.595 Prob(JB): 2.50e-07
Kurtosis: 2.760 Cond. No. 4.07
==============================================================================
Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
Iteration: 1/100
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.994
Model: OLS Adj. R-squared: 0.993
Method: Least Squares F-statistic: 2.509e+04
Date: Mon, 21 Mar 2016 Prob (F-statistic): 0.00
Time: 23:19:47 Log-Likelihood: 353.94
No. Observations: 495 AIC: -699.9
Df Residuals: 491 BIC: -683.1
Df Model: 3
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [95.0% Conf. Int.]
------------------------------------------------------------------------------
const 2.0408 0.023 90.163 0.000 1.996 2.085
x1 -2.1486 0.021 -101.120 0.000 -2.190 -2.107
x2 -5.8827 0.089 -65.779 0.000 -6.058 -5.707
x3 0.3356 0.020 16.832 0.000 0.296 0.375
==============================================================================
Omnibus: 130.367 Durbin-Watson: 0.054
Prob(Omnibus): 0.000 Jarque-Bera (JB): 350.335
Skew: -1.285 Prob(JB): 8.43e-77
Kurtosis: 6.223 Cond. No. 22.7
==============================================================================
Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
Break Point: -0.716158235774
Epsilon: -0.950341940125
Iteration: 2/100
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.997
Model: OLS Adj. R-squared: 0.997
Method: Least Squares F-statistic: 6.021e+04
Date: Mon, 21 Mar 2016 Prob (F-statistic): 0.00
Time: 23:19:47 Log-Likelihood: 569.66
No. Observations: 495 AIC: -1131.
Df Residuals: 491 BIC: -1114.
Df Model: 3
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [95.0% Conf. Int.]
------------------------------------------------------------------------------
const 2.3755 0.019 126.460 0.000 2.339 2.412
x1 -1.9033 0.016 -117.463 0.000 -1.935 -1.871
x2 -5.6072 0.041 -136.981 0.000 -5.688 -5.527
x3 -0.0150 0.013 -1.142 0.254 -0.041 0.011
==============================================================================
Omnibus: 90.404 Durbin-Watson: 0.039
Prob(Omnibus): 0.000 Jarque-Bera (JB): 156.479
Skew: -1.079 Prob(JB): 1.05e-34
Kurtosis: 4.711 Cond. No. 17.2
==============================================================================
Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
Break Point: -0.710806867003
Epsilon: -0.581628914735
Iteration: 3/100
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.997
Model: OLS Adj. R-squared: 0.997
Method: Least Squares F-statistic: 6.019e+04
Date: Mon, 21 Mar 2016 Prob (F-statistic): 0.00
Time: 23:19:47 Log-Likelihood: 569.59
No. Observations: 495 AIC: -1131.
Df Residuals: 491 BIC: -1114.
Df Model: 3
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [95.0% Conf. Int.]
------------------------------------------------------------------------------
const 2.3648 0.019 127.151 0.000 2.328 2.401
x1 -1.9108 0.016 -118.715 0.000 -1.942 -1.879
x2 -5.6246 0.041 -135.949 0.000 -5.706 -5.543
x3 0.0024 0.013 0.185 0.854 -0.023 0.028
==============================================================================
Omnibus: 95.893 Durbin-Watson: 0.040
Prob(Omnibus): 0.000 Jarque-Bera (JB): 174.907
Skew: -1.111 Prob(JB): 1.05e-38
Kurtosis: 4.883 Cond. No. 17.3
==============================================================================
Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
Break Point: -0.711666713818
Epsilon: 0.000273066454996
In [104]:
print([(slop, err) for slop, err in zip(vcs_clean.slopes, vcs_clean.slope_errs)])
[(-1.9108164313550606, 0.01609584638707879), (-7.535453828029417, 0.044393727235689219)]
The fit gives slopes of $-1.91\pm0.02$ and $-7.54\pm0.04$, with a break at $\sim5.1$ pixels. Since the difference between the slopes is $-4/m$, we find $m\approx0.71$.
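The conversion from fitted quantities to these numbers can be sketched with a small helper (the $-4/m$ slope-difference relation and the log-frequency break value are taken from the fit above; the function names are my own):

```python
def vcs_index(slope_lowres, slope_highres):
    # Asymptotic VCS relation used above: slope difference = -4/m
    return -4.0 / (slope_highres - slope_lowres)

def break_scale_pix(log_break):
    # The fit reports the break as log10 of a spectral frequency (1/pix);
    # invert it to get a scale in pixels.
    return 1.0 / 10 ** log_break

m = vcs_index(-1.9108, -7.5354)    # ~0.71
scale = break_scale_pix(-0.71167)  # ~5.1 pixels
```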
Now what happens when I start masking?
In [90]:
masked_cube = cube.with_mask(cube > cube.mean())
In [91]:
p.imshow(masked_cube.moment0().value, origin='lower', cmap='afmhot')
Out[91]:
<matplotlib.image.AxesImage at 0x12b686bd0>
In [92]:
vcs_masked = VCS(masked_cube).run(verbose=True)
Breaks found from spline are: [-1.46852108 -1.19382003 -0.89962945 -0.60205999 -0.4271284 ]
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.880
Model: OLS Adj. R-squared: 0.879
Method: Least Squares F-statistic: 3599.
Date: Mon, 21 Mar 2016 Prob (F-statistic): 1.08e-228
Time: 23:13:30 Log-Likelihood: -356.05
No. Observations: 495 AIC: 716.1
Df Residuals: 493 BIC: 724.5
Df Model: 1
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [95.0% Conf. Int.]
------------------------------------------------------------------------------
const 0.4450 0.047 9.415 0.000 0.352 0.538
x1 -3.4922 0.058 -59.989 0.000 -3.607 -3.378
==============================================================================
Omnibus: 22.677 Durbin-Watson: 0.006
Prob(Omnibus): 0.000 Jarque-Bera (JB): 25.043
Skew: -0.547 Prob(JB): 3.65e-06
Kurtosis: 2.866 Cond. No. 4.07
==============================================================================
Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
Iteration: 1/100
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.949
Model: OLS Adj. R-squared: 0.949
Method: Least Squares F-statistic: 3054.
Date: Mon, 21 Mar 2016 Prob (F-statistic): 4.74e-317
Time: 23:13:30 Log-Likelihood: -142.65
No. Observations: 495 AIC: 293.3
Df Residuals: 491 BIC: 310.1
Df Model: 3
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [95.0% Conf. Int.]
------------------------------------------------------------------------------
const 1.1605 0.041 28.048 0.000 1.079 1.242
x1 -2.8581 0.045 -63.342 0.000 -2.947 -2.769
x2 -2.9016 0.786 -3.693 0.000 -4.446 -1.358
x3 0.8314 0.064 12.982 0.000 0.706 0.957
==============================================================================
Omnibus: 51.659 Durbin-Watson: 0.034
Prob(Omnibus): 0.000 Jarque-Bera (JB): 65.043
Skew: -0.849 Prob(JB): 7.52e-15
Kurtosis: 3.523 Cond. No. 68.7
==============================================================================
Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
Break Point: -1.00016156605
Epsilon: -0.577781215487
Iteration: 2/100
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.985
Model: OLS Adj. R-squared: 0.985
Method: Least Squares F-statistic: 1.058e+04
Date: Mon, 21 Mar 2016 Prob (F-statistic): 0.00
Time: 23:13:30 Log-Likelihood: 155.67
No. Observations: 495 AIC: -303.3
Df Residuals: 491 BIC: -286.5
Df Model: 3
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [95.0% Conf. Int.]
------------------------------------------------------------------------------
const 2.7071 0.084 32.344 0.000 2.543 2.872
x1 -1.6824 0.060 -28.211 0.000 -1.800 -1.565
x2 -4.0373 0.076 -53.397 0.000 -4.186 -3.889
x3 -0.5754 0.036 -15.873 0.000 -0.647 -0.504
==============================================================================
Omnibus: 24.325 Durbin-Watson: 0.051
Prob(Omnibus): 0.000 Jarque-Bera (JB): 26.796
Skew: -0.525 Prob(JB): 1.52e-06
Kurtosis: 3.442 Cond. No. 23.3
==============================================================================
Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
Break Point: -0.715136740973
Epsilon: -0.700389701019
Iteration: 3/100
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.997
Model: OLS Adj. R-squared: 0.997
Method: Least Squares F-statistic: 6.367e+04
Date: Mon, 21 Mar 2016 Prob (F-statistic): 0.00
Time: 23:13:30 Log-Likelihood: 596.83
No. Observations: 495 AIC: -1186.
Df Residuals: 491 BIC: -1169.
Df Model: 3
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [95.0% Conf. Int.]
------------------------------------------------------------------------------
const 2.3805 0.018 133.880 0.000 2.346 2.415
x1 -1.9018 0.015 -123.998 0.000 -1.932 -1.872
x2 -5.1594 0.039 -133.155 0.000 -5.236 -5.083
x3 0.0504 0.012 4.058 0.000 0.026 0.075
==============================================================================
Omnibus: 91.251 Durbin-Watson: 0.047
Prob(Omnibus): 0.000 Jarque-Bera (JB): 187.128
Skew: -1.005 Prob(JB): 2.32e-41
Kurtosis: 5.244 Cond. No. 17.2
==============================================================================
Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
Break Point: -0.734669481989
Epsilon: -0.831726266753
Iteration: 4/100
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.998
Model: OLS Adj. R-squared: 0.998
Method: Least Squares F-statistic: 6.587e+04
Date: Mon, 21 Mar 2016 Prob (F-statistic): 0.00
Time: 23:13:30 Log-Likelihood: 605.22
No. Observations: 495 AIC: -1202.
Df Residuals: 491 BIC: -1186.
Df Model: 3
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [95.0% Conf. Int.]
------------------------------------------------------------------------------
const 2.4217 0.018 132.911 0.000 2.386 2.458
x1 -1.8728 0.016 -120.761 0.000 -1.903 -1.842
x2 -5.1111 0.037 -139.710 0.000 -5.183 -5.039
x3 -0.0062 0.012 -0.503 0.615 -0.030 0.018
==============================================================================
Omnibus: 71.359 Durbin-Watson: 0.044
Prob(Omnibus): 0.000 Jarque-Bera (JB): 115.386
Skew: -0.902 Prob(JB): 8.79e-26
Kurtosis: 4.530 Cond. No. 16.8
==============================================================================
Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
Break Point: -0.73224678363
Epsilon: -0.0332985553999
Iteration: 5/100
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.998
Model: OLS Adj. R-squared: 0.998
Method: Least Squares F-statistic: 6.587e+04
Date: Mon, 21 Mar 2016 Prob (F-statistic): 0.00
Time: 23:13:30 Log-Likelihood: 605.22
No. Observations: 495 AIC: -1202.
Df Residuals: 491 BIC: -1186.
Df Model: 3
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [95.0% Conf. Int.]
------------------------------------------------------------------------------
const 2.4217 0.018 132.911 0.000 2.386 2.458
x1 -1.8728 0.016 -120.761 0.000 -1.903 -1.842
x2 -5.1111 0.037 -139.710 0.000 -5.183 -5.039
x3 0.0062 0.012 0.505 0.614 -0.018 0.030
==============================================================================
Omnibus: 71.359 Durbin-Watson: 0.044
Prob(Omnibus): 0.000 Jarque-Bera (JB): 115.386
Skew: -0.902 Prob(JB): 8.79e-26
Kurtosis: 4.530 Cond. No. 16.8
==============================================================================
Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
Break Point: -0.734669481989
Epsilon: -1.76668104229e-16
In [93]:
print(vcs_masked.brk_err)
0.0122812865175
In [109]:
print([(slop, err) for slop, err in zip(vcs_masked.slopes, vcs_masked.slope_errs)])
[(-1.8727869476340779, 0.015508154579274434), (-6.983839275664506, 0.03973454319892631)]
The difference in the slopes has decreased in magnitude from $-5.62\pm0.04$ to $-5.11\pm0.04$, while the break point has not moved significantly. We now have $m=0.78$.
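Since $m$ comes from the slope difference, its uncertainty follows by standard error propagation (a sketch assuming independent slope errors; the helper name is hypothetical):

```python
from math import hypot

def m_with_error(s1, s1_err, s2, s2_err):
    # m = -4 / (s2 - s1); propagate the two (assumed independent) slope errors.
    delta = s2 - s1
    m = -4.0 / delta
    m_err = 4.0 * hypot(s1_err, s2_err) / delta ** 2
    return m, m_err

# Masked-cube slopes quoted above
m, m_err = m_with_error(-1.873, 0.016, -6.984, 0.040)
```

This gives $m\approx0.78$ with a propagated error of about $0.007$.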
Increasing the mask threshold:
In [94]:
masked_cube_70 = cube.with_mask(cube > cube.percentile(70))
p.imshow(masked_cube_70.moment0().value, origin='lower', cmap='afmhot')
Out[94]:
<matplotlib.image.AxesImage at 0x12c33cc90>
In [95]:
vcs_masked_70 = VCS(masked_cube_70).run(verbose=True)
Breaks found from spline are: [-1.19382003 -0.89962945 -0.60205999]
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.869
Model: OLS Adj. R-squared: 0.869
Method: Least Squares F-statistic: 3284.
Date: Mon, 21 Mar 2016 Prob (F-statistic): 4.07e-220
Time: 23:13:34 Log-Likelihood: -389.24
No. Observations: 495 AIC: 782.5
Df Residuals: 493 BIC: 790.9
Df Model: 1
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [95.0% Conf. Int.]
------------------------------------------------------------------------------
const 0.3577 0.051 7.079 0.000 0.258 0.457
x1 -3.5672 0.062 -57.305 0.000 -3.690 -3.445
==============================================================================
Omnibus: 27.236 Durbin-Watson: 0.005
Prob(Omnibus): 0.000 Jarque-Bera (JB): 30.403
Skew: -0.595 Prob(JB): 2.50e-07
Kurtosis: 2.760 Cond. No. 4.07
==============================================================================
Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
Iteration: 1/100
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.994
Model: OLS Adj. R-squared: 0.993
Method: Least Squares F-statistic: 2.509e+04
Date: Mon, 21 Mar 2016 Prob (F-statistic): 0.00
Time: 23:13:34 Log-Likelihood: 353.94
No. Observations: 495 AIC: -699.9
Df Residuals: 491 BIC: -683.1
Df Model: 3
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [95.0% Conf. Int.]
------------------------------------------------------------------------------
const 2.0408 0.023 90.163 0.000 1.996 2.085
x1 -2.1486 0.021 -101.120 0.000 -2.190 -2.107
x2 -5.8827 0.089 -65.779 0.000 -6.058 -5.707
x3 0.3356 0.020 16.832 0.000 0.296 0.375
==============================================================================
Omnibus: 130.367 Durbin-Watson: 0.054
Prob(Omnibus): 0.000 Jarque-Bera (JB): 350.335
Skew: -1.285 Prob(JB): 8.43e-77
Kurtosis: 6.223 Cond. No. 22.7
==============================================================================
Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
Break Point: -0.716158244488
Epsilon: -0.950341948667
Iteration: 2/100
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.997
Model: OLS Adj. R-squared: 0.997
Method: Least Squares F-statistic: 6.021e+04
Date: Mon, 21 Mar 2016 Prob (F-statistic): 0.00
Time: 23:13:34 Log-Likelihood: 569.66
No. Observations: 495 AIC: -1131.
Df Residuals: 491 BIC: -1114.
Df Model: 3
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [95.0% Conf. Int.]
------------------------------------------------------------------------------
const 2.3755 0.019 126.460 0.000 2.339 2.412
x1 -1.9033 0.016 -117.463 0.000 -1.935 -1.871
x2 -5.6072 0.041 -136.981 0.000 -5.688 -5.527
x3 -0.0150 0.013 -1.142 0.254 -0.041 0.011
==============================================================================
Omnibus: 90.404 Durbin-Watson: 0.039
Prob(Omnibus): 0.000 Jarque-Bera (JB): 156.479
Skew: -1.079 Prob(JB): 1.05e-34
Kurtosis: 4.711 Cond. No. 17.2
==============================================================================
Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
Break Point: -0.710806855939
Epsilon: -0.581628998695
Iteration: 3/100
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.997
Model: OLS Adj. R-squared: 0.997
Method: Least Squares F-statistic: 6.019e+04
Date: Mon, 21 Mar 2016 Prob (F-statistic): 0.00
Time: 23:13:34 Log-Likelihood: 569.59
No. Observations: 495 AIC: -1131.
Df Residuals: 491 BIC: -1114.
Df Model: 3
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [95.0% Conf. Int.]
------------------------------------------------------------------------------
const 2.3648 0.019 127.151 0.000 2.328 2.401
x1 -1.9108 0.016 -118.715 0.000 -1.942 -1.879
x2 -5.6246 0.041 -135.949 0.000 -5.706 -5.543
x3 0.0024 0.013 0.185 0.854 -0.023 0.028
==============================================================================
Omnibus: 95.893 Durbin-Watson: 0.040
Prob(Omnibus): 0.000 Jarque-Bera (JB): 174.907
Skew: -1.111 Prob(JB): 1.05e-38
Kurtosis: 4.883 Cond. No. 17.3
==============================================================================
Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
Break Point: -0.711666722852
Epsilon: 0.000273064698413
In [110]:
print([(slop, err) for slop, err in zip(vcs_masked_70.slopes, vcs_masked_70.slope_errs)])
[(-1.9108164580331877, 0.016095843108132023), (-7.5354537927887302, 0.04439371819207235)]
The fit has converged remarkably close to the unmasked data. Hmm...
Masking at the 93rd percentile? This completely removes some spectra.
In [96]:
masked_cube_93 = cube.with_mask(cube > cube.percentile(93))
p.imshow(masked_cube_93.moment0().value, origin='lower', cmap='afmhot')
Out[96]:
<matplotlib.image.AxesImage at 0x12c402690>
In [117]:
vcs_masked_93 = VCS(masked_cube_93).run(verbose=True, breaks=-0.9)
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.925
Model: OLS Adj. R-squared: 0.925
Method: Least Squares F-statistic: 6120.
Date: Mon, 21 Mar 2016 Prob (F-statistic): 4.33e-280
Time: 23:30:50 Log-Likelihood: -154.25
No. Observations: 495 AIC: 312.5
Df Residuals: 493 BIC: 320.9
Df Model: 1
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [95.0% Conf. Int.]
------------------------------------------------------------------------------
const 1.0011 0.031 31.844 0.000 0.939 1.063
x1 -3.0292 0.039 -78.228 0.000 -3.105 -2.953
==============================================================================
Omnibus: 31.716 Durbin-Watson: 0.009
Prob(Omnibus): 0.000 Jarque-Bera (JB): 42.760
Skew: -0.521 Prob(JB): 5.19e-10
Kurtosis: 3.995 Cond. No. 4.07
==============================================================================
Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
Iteration: 1/100
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.995
Model: OLS Adj. R-squared: 0.995
Method: Least Squares F-statistic: 3.377e+04
Date: Mon, 21 Mar 2016 Prob (F-statistic): 0.00
Time: 23:30:51 Log-Likelihood: 523.39
No. Observations: 495 AIC: -1039.
Df Residuals: 491 BIC: -1022.
Df Model: 3
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [95.0% Conf. Int.]
------------------------------------------------------------------------------
const 2.6944 0.032 85.330 0.000 2.632 2.756
x1 -1.6996 0.024 -70.988 0.000 -1.747 -1.653
x2 -2.9940 0.036 -84.225 0.000 -3.064 -2.924
x3 -0.1137 0.016 -7.151 0.000 -0.145 -0.082
==============================================================================
Omnibus: 6.264 Durbin-Watson: 0.034
Prob(Omnibus): 0.044 Jarque-Bera (JB): 4.289
Skew: 0.063 Prob(JB): 0.117
Kurtosis: 2.562 Cond. No. 19.0
==============================================================================
Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
Break Point: -0.824038632294
Epsilon: -0.935280245174
Iteration: 2/100
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.996
Model: OLS Adj. R-squared: 0.995
Method: Least Squares F-statistic: 3.623e+04
Date: Mon, 21 Mar 2016 Prob (F-statistic): 0.00
Time: 23:30:51 Log-Likelihood: 540.77
No. Observations: 495 AIC: -1074.
Df Residuals: 491 BIC: -1057.
Df Model: 3
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [95.0% Conf. Int.]
------------------------------------------------------------------------------
const 2.6165 0.026 102.011 0.000 2.566 2.667
x1 -1.7508 0.020 -85.633 0.000 -1.791 -1.711
x2 -3.0593 0.036 -84.842 0.000 -3.130 -2.988
x3 0.0347 0.015 2.378 0.018 0.006 0.063
==============================================================================
Omnibus: 5.010 Durbin-Watson: 0.030
Prob(Omnibus): 0.082 Jarque-Bera (JB): 5.305
Skew: 0.156 Prob(JB): 0.0705
Kurtosis: 3.399 Cond. No. 17.0
==============================================================================
Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
Break Point: -0.846731022687
Epsilon: -0.0677833413379
Iteration: 3/100
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.996
Model: OLS Adj. R-squared: 0.995
Method: Least Squares F-statistic: 3.634e+04
Date: Mon, 21 Mar 2016 Prob (F-statistic): 0.00
Time: 23:30:51 Log-Likelihood: 541.52
No. Observations: 495 AIC: -1075.
Df Residuals: 491 BIC: -1058.
Df Model: 3
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [95.0% Conf. Int.]
------------------------------------------------------------------------------
const 2.6394 0.027 98.983 0.000 2.587 2.692
x1 -1.7356 0.021 -82.640 0.000 -1.777 -1.694
x2 -3.0514 0.035 -86.173 0.000 -3.121 -2.982
x3 -0.0155 0.015 -1.050 0.294 -0.045 0.014
==============================================================================
Omnibus: 3.401 Durbin-Watson: 0.029
Prob(Omnibus): 0.183 Jarque-Bera (JB): 3.193
Skew: 0.162 Prob(JB): 0.203
Kurtosis: 3.224 Cond. No. 17.3
==============================================================================
Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
Break Point: -0.836572772591
Epsilon: -0.0030257949433
Iteration: 4/100
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.996
Model: OLS Adj. R-squared: 0.995
Method: Least Squares F-statistic: 3.636e+04
Date: Mon, 21 Mar 2016 Prob (F-statistic): 0.00
Time: 23:30:51 Log-Likelihood: 541.62
No. Observations: 495 AIC: -1075.
Df Residuals: 491 BIC: -1058.
Df Model: 3
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [95.0% Conf. Int.]
------------------------------------------------------------------------------
const 2.6319 0.026 100.077 0.000 2.580 2.684
x1 -1.7405 0.021 -83.694 0.000 -1.781 -1.700
x2 -3.0547 0.036 -85.831 0.000 -3.125 -2.985
x3 0.0089 0.015 0.610 0.542 -0.020 0.038
==============================================================================
Omnibus: 3.993 Durbin-Watson: 0.030
Prob(Omnibus): 0.136 Jarque-Bera (JB): 3.886
Skew: 0.162 Prob(JB): 0.143
Kurtosis: 3.288 Cond. No. 17.2
==============================================================================
Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
Break Point: -0.842432508797
Epsilon: -0.00039957203152
In [118]:
print([(slop, err) for slop, err in zip(vcs_masked_93.slopes, vcs_masked_93.slope_errs)])
[(-1.7355690041491698, 0.021001509778810018), (-4.7869927487605572, 0.041169841198553082)]
$m=1.3$. As soon as those fully masked regions appear, the break point becomes particularly indistinct, and the slopes converge toward the $m=3/2$ limit. The increased power in the noise is having the same effect as smoothing over a larger region.
Does this hold true if I actually smooth/downsample the data?
In [98]:
from astropy.wcs import WCS
import numpy as np
from scipy import ndimage as nd
downgraded_wcs = cube.wcs.copy()
downgraded_wcs.wcs.cdelt[2] *= 2.
data = cube.filled_data[:].value
data[np.isnan(data)] = 0.0
downgraded_cube = SpectralCube(data=nd.zoom(data, (0.5, 1., 1.))*cube.unit, wcs=downgraded_wcs)
downgraded_cube
Out[98]:
SpectralCube with shape=(250, 128, 128):
n_x: 128 type_x: RA---CAR unit_x: deg range: 179.999664 deg: 180.000330 deg
n_y: 128 type_y: DEC--CAR unit_y: deg range: 0.000005 deg: 0.000671 deg
n_s: 250 type_s: VOPT unit_s: m / s range: -20.040 m / s: 19939.877 m / s
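Note that `nd.zoom` interpolates with a spline rather than averaging channels, which can alias small-scale spectral power. A hypothetical alternative (not what was run above) smooths along the spectral axis before decimating:

```python
import numpy as np
from scipy import ndimage as nd

def downsample_spectral(data, factor=2, presmooth=True):
    # Downsample the spectral (first) axis by an integer factor.
    if presmooth:
        # Gaussian with FWHM comparable to the new channel width,
        # to suppress aliasing before decimation.
        sigma = factor / np.sqrt(8 * np.log(2))
        data = nd.gaussian_filter1d(data, sigma, axis=0)
    return data[::factor]

small = downsample_spectral(np.random.rand(500, 8, 8))
print(small.shape)  # (250, 8, 8)
```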
In [119]:
vcs_downgrade = VCS(downgraded_cube).run(verbose=True, breaks=None)
Breaks found from spline are: [-0.59859946]
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.970
Model: OLS Adj. R-squared: 0.970
Method: Least Squares F-statistic: 7988.
Date: Mon, 21 Mar 2016 Prob (F-statistic): 6.92e-188
Time: 23:31:05 Log-Likelihood: 151.91
No. Observations: 245 AIC: -299.8
Df Residuals: 243 BIC: -292.8
Df Model: 1
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [95.0% Conf. Int.]
------------------------------------------------------------------------------
const 2.1478 0.018 117.113 0.000 2.112 2.184
x1 -2.0830 0.023 -89.374 0.000 -2.129 -2.037
==============================================================================
Omnibus: 25.175 Durbin-Watson: 0.034
Prob(Omnibus): 0.000 Jarque-Bera (JB): 29.752
Skew: -0.796 Prob(JB): 3.46e-07
Kurtosis: 3.619 Cond. No. 4.29
==============================================================================
Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
Iteration: 1/100
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.998
Model: OLS Adj. R-squared: 0.998
Method: Least Squares F-statistic: 4.040e+04
Date: Mon, 21 Mar 2016 Prob (F-statistic): 0.00
Time: 23:31:05 Log-Likelihood: 482.62
No. Observations: 245 AIC: -957.2
Df Residuals: 241 BIC: -943.2
Df Model: 3
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [95.0% Conf. Int.]
------------------------------------------------------------------------------
const 2.5572 0.010 257.296 0.000 2.538 2.577
x1 -1.7168 0.010 -177.846 0.000 -1.736 -1.698
x2 -1.6791 0.037 -45.708 0.000 -1.751 -1.607
x3 0.0236 0.008 2.897 0.004 0.008 0.040
==============================================================================
Omnibus: 153.316 Durbin-Watson: 0.278
Prob(Omnibus): 0.000 Jarque-Bera (JB): 3044.442
Skew: -2.045 Prob(JB): 0.00
Kurtosis: 19.778 Cond. No. 22.8
==============================================================================
Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
Break Point: -0.626686584011
Epsilon: -0.932549137958
Iteration: 2/100
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.998
Model: OLS Adj. R-squared: 0.998
Method: Least Squares F-statistic: 4.088e+04
Date: Mon, 21 Mar 2016 Prob (F-statistic): 0.00
Time: 23:31:05 Log-Likelihood: 484.09
No. Observations: 245 AIC: -960.2
Df Residuals: 241 BIC: -946.2
Df Model: 3
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [95.0% Conf. Int.]
------------------------------------------------------------------------------
const 2.5679 0.010 246.798 0.000 2.547 2.588
x1 -1.7083 0.010 -171.884 0.000 -1.728 -1.689
x2 -1.6479 0.034 -48.912 0.000 -1.714 -1.582
x3 -0.0089 0.008 -1.084 0.279 -0.025 0.007
==============================================================================
Omnibus: 141.765 Durbin-Watson: 0.276
Prob(Omnibus): 0.000 Jarque-Bera (JB): 2641.090
Skew: -1.845 Prob(JB): 0.00
Kurtosis: 18.656 Cond. No. 21.3
==============================================================================
Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
Break Point: -0.615876223336
Epsilon: -0.0118966803249
Iteration: 3/100
OLS Regression Results
==============================================================================
Dep. Variable: y R-squared: 0.998
Model: OLS Adj. R-squared: 0.998
Method: Least Squares F-statistic: 4.089e+04
Date: Mon, 21 Mar 2016 Prob (F-statistic): 0.00
Time: 23:31:05 Log-Likelihood: 484.10
No. Observations: 245 AIC: -960.2
Df Residuals: 241 BIC: -946.2
Df Model: 3
Covariance Type: nonrobust
==============================================================================
coef std err t P>|t| [95.0% Conf. Int.]
------------------------------------------------------------------------------
const 2.5643 0.010 250.830 0.000 2.544 2.584
x1 -1.7111 0.010 -174.246 0.000 -1.730 -1.692
x2 -1.6600 0.035 -47.994 0.000 -1.728 -1.592
x3 0.0038 0.008 0.460 0.646 -0.012 0.020
==============================================================================
Omnibus: 146.316 Durbin-Watson: 0.277
Prob(Omnibus): 0.000 Jarque-Bera (JB): 2818.979
Skew: -1.920 Prob(JB): 0.00
Kurtosis: 19.168 Cond. No. 21.8
==============================================================================
Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
Break Point: -0.620395993476
Epsilon: -7.74970931068e-05
In [112]:
print([(slop, err) for slop, err in zip(vcs_downgrade.slopes, vcs_downgrade.slope_errs)])
[(-1.7082764862189657, 0.0099385504139785398), (-3.3561555024270286, 0.03512571873738126)]
This would give $m=2.4$, which doesn't make any sense. I think this has reached the unresolved regime and the asymptotic solution no longer holds.
In [120]:
p.loglog(vcs_clean.vel_freqs, vcs_clean.ps1D, 'bD', alpha=0.05)
p.loglog(vcs_masked.vel_freqs, vcs_masked.ps1D, 'go', alpha=0.05)
p.loglog(vcs_masked_93.vel_freqs, vcs_masked_93.ps1D, 'ro', alpha=0.05)
p.loglog(vcs_downgrade.vel_freqs, vcs_downgrade.ps1D, 'cD', alpha=0.05)
Out[120]:
[<matplotlib.lines.Line2D at 0x122de9fd0>]
In [ ]:
Content source: e-koch/ewky_scripts