In this first example, we will explore a simulated lightcurve that follows a damped random walk, which is often used to model variability in the optical flux of quasars.
In [1]:
import numpy as np
from matplotlib import pyplot as plt
In [9]:
from astroML.time_series import lomb_scargle, generate_damped_RW
from astroML.time_series import ACF_scargle
Use the numpy.arange function to generate 1000 days of data.
In [7]:
tdays = np.arange(0, 1E3)
z = 2.0 # redshift
tau = 300 # damping timescale
Use the help function to figure out how to generate a dataset of this evenly spaced damped random walk over the 1000 days.
In [ ]:
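One possible sketch, based on the generate_damped_RW signature reported by help; the xmean value below is an arbitrary choice for the mean magnitude, not something required by the exercise.

# evenly sampled damped random walk at the rest-frame times tdays
y = generate_damped_RW(tdays, tau=tau, z=z, xmean=20)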
Add errors to your 1000 points using numpy.random.normal. Note that you will need 1000 points, each centered on the actual data point, and assume a sigma of 0.1.
In [ ]:
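A minimal sketch, drawing one Gaussian deviate per point with sigma = 0.1:

dy = 0.1  # assumed measurement error
y_noisy = np.random.normal(y, dy)  # each point centered on the true value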
Randomly select a subsample of 200 data points from your generated dataset. This is now unevenly spaced, and will serve as your observed lightcurve.
In [ ]:
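One way to do this, using numpy.random.choice to pick 200 indices without replacement (sorting keeps the observation times in increasing order):

ind = np.sort(np.random.choice(len(tdays), 200, replace=False))
t_obs = tdays[ind]
y_obs = y_noisy[ind]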
Plot the observed lightcurve.
In [ ]:
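For example, with matplotlib's errorbar; the axis labels are assumed conventions, not part of the exercise.

plt.errorbar(t_obs, y_obs, dy, fmt='.k', ecolor='gray')
plt.xlabel('t (days)')
plt.ylabel('observed flux')
plt.show()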
Use the help function to figure out how to calculate the autocorrelation function of your lightcurve with ACF_scargle.
In [ ]:
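A sketch of the call, following the argument order shown by help; the n_omega and omega_max values are example choices rather than required settings.

# ACF_scargle returns the autocorrelation function and the corresponding lags
C, t_lag = ACF_scargle(t_obs, y_obs, dy, n_omega=2**12, omega_max=np.pi / 5.0)
plt.plot(t_lag, C, '-k')
plt.xlabel('lag (days)')
plt.ylabel('ACF')
plt.show()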
In this next example, we will explore data drawn from a Gaussian process.
In [10]:
from sklearn.gaussian_process import GaussianProcess
Define a covariance function as the one-dimensional squared-exponential covariance function described in class. This will be a function of x1, x2, and the bandwidth h. Name this function covariance_squared_exponential.
In [ ]:
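A minimal implementation of the squared-exponential kernel, K(x1, x2) = exp(-(x1 - x2)^2 / (2 h^2)):

def covariance_squared_exponential(x1, x2, h):
    # one-dimensional squared-exponential covariance
    return np.exp(-0.5 * (x1 - x2) ** 2 / h ** 2)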
Generate values for the x-axis as 1000 evenly spaced points between 0 and 10 using numpy.linspace. Define a bandwidth of h=1.
In [ ]:
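For example:

x = np.linspace(0, 10, 1000)
h = 1.0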
Compute the output of your covariance_squared_exponential with x as x1, x[:,None] as x2, and h as the bandwidth. This gives the full covariance matrix.
In [ ]:
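Broadcasting x against x[:, None] produces the 1000 x 1000 covariance matrix in one call:

cov = covariance_squared_exponential(x, x[:, None], h)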
Use numpy.random.multivariate_normal to generate a numpy array of the same length as your x-axis points. Each point is centered on 0 (your mean is a 1-d array of zeros), and your covariance is the output of your covariance_squared_exponential above.
In [ ]:
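One possible draw; numpy may warn that the covariance matrix is not exactly positive semi-definite, which is a numerical artifact that can be ignored here.

mu = np.zeros(len(x))  # zero mean at every x
y_gp = np.random.multivariate_normal(mu, cov)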
Choose two values in your x-range as sample x values, and put them in an array, x_sample_test. Choose a function (e.g. numpy.cos) as your example function to constrain.
In [ ]:
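For instance (the two sample locations below are arbitrary choices within the x-range):

x_sample_test = np.array([2.5, 7.0])
f = np.cos  # example function to constrain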
Define an instance of a Gaussian process.
In [11]:
gp = GaussianProcess(corr='squared_exponential', theta0=0.5,
                     random_state=0)
Fit the Gaussian process to your sample points, x_sample_test[:,None], with the output of the function on those sample x values (e.g. numpy.cos(x_sample_test) ) as the target.
In [ ]:
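A sketch using the GaussianProcess instance defined above; fit expects a 2-d array of input locations.

gp.fit(x_sample_test[:, None], np.cos(x_sample_test))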
Predict on the full grid, x[:,None], and get the MSE values. Plot the predicted function and its errors.
In [ ]:
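One possible sketch, assuming predict is called with eval_MSE=True so that it returns both the prediction and its mean squared error; the 2-sigma band and plot styling are illustrative choices.

y_pred, MSE = gp.predict(x[:, None], eval_MSE=True)
sigma = np.sqrt(MSE)

plt.plot(x, np.cos(x), ':r', label='true function')
plt.plot(x, y_pred, '-k', label='GP prediction')
plt.fill_between(x, y_pred - 2 * sigma, y_pred + 2 * sigma,
                 color='gray', alpha=0.3, label='2-sigma error')
plt.plot(x_sample_test, np.cos(x_sample_test), 'ob', label='constraints')
plt.legend()
plt.show()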