Document retrieved from a search-engine cache. Original document: http://hea-www.harvard.edu/AstroStat/astro193/VLK_slides_apr06.pdf
Last modified: Tue Apr 7 00:25:29 2015
Astro 193 : 2015 Apr 6
· Follow-up
  · p-value
  · Linear smoothing edge effects
  · Kernel Density Estimation
  · locfit
  · HW 10
· Signal Processing: Fourier Transforms
· Signal Processing: Wavelets


Kernel Density Estimation
· A "kernel" K(x) is a non-negative real-valued function such that
  ∫_R dx K(x) = 1 ;  E[x] = ∫_R dx x K(x) = 0 ;  E[x²] = ∫_R dx x² K(x) < ∞
· The kernel density estimator for a bandwidth h:
  f̂_h(x) = (1/nh) Σ_{i=1..n} K((x−x_i)/h)
· Types of kernels
  · Boxcar, Gaussian, Triangle, Tophat
  · RMF
· For linear smoothers, f̂_h(x) = Σ_{i=1..n} l_i(x) Y_i, with weights
  l_i(x) = K((x−x_i)/h) / Σ_{j=1..n} K((x−x_j)/h),
  centered on x_i and weighting all of the sample
· Can write f̂_h = L Y in matrix notation
· Effective degrees of freedom, ν = trace(L) = Σ_{i=1..n} L_ii
· Choosing the bandwidth via cross-validation (Jackknife / leave-one-out):
  CV(h) = (1/n) Σ_{i=1..n} (Y_i − f̂_h^(−i)(x_i))²
· Generalized CV:
  GCV(h) = (1/n) [1 − ν/n]^(−2) Σ_{i=1..n} (Y_i − f̂_h(x_i))²
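The estimator, the smoother matrix L, and ν = trace(L) above are easy to sketch numerically. A minimal example with a Gaussian kernel (numpy only; the sample data and bandwidth are illustrative assumptions, not from the slides):

```python
import numpy as np

def gaussian_kernel(u):
    # Non-negative, integrates to 1, zero mean, finite variance
    return np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)

def kde(x_grid, x_samples, h):
    # f_h(x) = (1/nh) sum_{i=1..n} K((x - x_i)/h)
    n = len(x_samples)
    u = (x_grid[:, None] - x_samples[None, :]) / h
    return gaussian_kernel(u).sum(axis=1) / (n * h)

def smoother_matrix(x_samples, h):
    # Row j holds l_i(x_j) = K((x_j - x_i)/h) / sum_k K((x_j - x_k)/h)
    u = (x_samples[:, None] - x_samples[None, :]) / h
    K = gaussian_kernel(u)
    return K / K.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
x = rng.normal(size=200)
grid = np.linspace(-4.0, 4.0, 401)

fh = kde(grid, x, h=0.3)        # density estimate on the grid
L = smoother_matrix(x, h=0.3)   # f_h = L Y for any response vector Y
nu = np.trace(L)                # effective degrees of freedom = sum_i L_ii
print(round(float(nu), 2))
```

Each row of L sums to 1, so the smoother reproduces constants exactly; ν sits between 1 (one global average) and n (pure interpolation) as h shrinks.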



KDE: Histograms
Risk, R = ∫ dx (f̂(x) − f(x))² = ∫ dx [f̂² − 2 f̂ f + f²]  →  minimize ∫ dx f̂² − 2 ∫ dx f̂ f
CV estimate: CV(h) = ∫ dx [f̂(x)]² − (2/n) Σ_{i=1..n} f̂^(−i)(x_i)
Histogram with m bins I_j of width h and counts C_j:  f̂(x) = (1/h)(C_j/n) for x ∈ I_j
GCV(h) = 2/[h(n−1)] − ((n+1)/[h(n−1)]) Σ_{i=1..m} (C_i/n)²
Reading: Chapters 4 and 6 of All of Non-parametric Statistics, by Larry Wasserman
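The histogram GCV score above can be sketched directly as a binwidth search (numpy only; the data, range, and binwidth grid are illustrative assumptions):

```python
import numpy as np

def hist_gcv(data, h, lo, hi):
    # GCV(h) = 2/[h(n-1)] - ((n+1)/[h(n-1)]) sum_{i=1..m} (C_i/n)^2
    n = len(data)
    edges = np.arange(lo, hi + h, h)           # bins I_1..I_m of width h
    counts, _ = np.histogram(data, bins=edges)
    p2 = float(np.sum((counts / n) ** 2))
    return 2.0 / (h * (n - 1)) - ((n + 1) / (h * (n - 1))) * p2

rng = np.random.default_rng(1)
data = rng.normal(size=500)
hs = np.linspace(0.05, 1.0, 40)
scores = np.array([hist_gcv(data, h, -4.0, 4.0) for h in hs])
h_best = float(hs[int(np.argmin(scores))])     # minimize the estimated risk
print(round(h_best, 3))
```

The score trades off the 2/[h(n−1)] roughness penalty (which grows as bins shrink) against the Σ(C_i/n)² term (which rewards bins that capture real structure).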


KDE: locfit
R package to do kernel density estimation, regression, and cross-validation to get optimum bandwidth. http://cran.r-project.org/web/packages/locfit/ Documentation: http://cran.r-project.org/web/packages/locfit/locfit.pdf

Usual calling sequence:
  result = locfit(y ~ lp(x, nn=0, h=bandwidth, deg=d))
  gcvout = gcvplot(y ~ x, alpha=array, deg=d)
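locfit itself is R; for readers working in Python, here is a rough analog of the bandwidth search that gcvplot performs, done instead with leave-one-out log-likelihood for a Gaussian KDE (numpy only; this is a stand-in sketch, not the locfit algorithm):

```python
import numpy as np

def loo_loglik(x, h):
    # Leave-one-out CV: mean of log f_h^(-i)(x_i), where f_h^(-i) is the
    # Gaussian KDE built from all samples except x_i
    n = len(x)
    u = (x[:, None] - x[None, :]) / h
    K = np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)
    np.fill_diagonal(K, 0.0)                   # exclude the i-th point
    f_loo = K.sum(axis=1) / ((n - 1) * h)
    return float(np.mean(np.log(f_loo)))

rng = np.random.default_rng(2)
x = rng.normal(size=300)
hs = np.linspace(0.1, 1.5, 50)
ll = np.array([loo_loglik(x, h) for h in hs])
h_best = float(hs[int(np.argmax(ll))])         # maximize held-out likelihood
print(round(h_best, 3))
```

Too-small h overfits the held-out points (tiny f̂^(−i)(x_i)), too-large h flattens the density; the likelihood peaks in between.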


Wavelets

· Reading
  · Wavelets: An Analysis Tool, M. Holschneider
  · A Wavelet Tour of Signal Processing, Stéphane Mallat
  · Ten Lectures on Wavelets, Ingrid Daubechies
· A small wave
· Wavelets as filters of structure at some scale


The Mexican Hat
g(x; σ) = 1/[σ√(2π)] exp(−(x−a)²/2σ²)
g(x; σ) − g(x; σ+δσ) = −δσ ∂g/∂σ
∂g/∂σ = −g(x)/σ + g(x) (x−a)²/σ³ = −(1/σ) (1 − (x−a)²/σ²) g(x)
MH(x; a, σ) = (1/σ) (1 − (x−a)²/σ²) exp(−(x−a)²/2σ²)
MH(x, y; a, b, σ_x, σ_y) = (1/σ_x σ_y) (1 − (x−a)²/σ_x² − (y−b)²/σ_y²) exp[−(1/2)((x−a)²/σ_x² + (y−b)²/σ_y²)]
FT[MH] ∝ ω² exp[−σ²ω²/2]


Freeman et al. 2002, ApJS, 138, 185
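The 1-D MH above is simple to check numerically. A minimal sketch (numpy only; the grid is an illustrative choice):

```python
import numpy as np

def mexican_hat(x, a=0.0, sigma=1.0):
    # MH(x; a, sigma) = (1/sigma) (1 - (x-a)^2/sigma^2) exp(-(x-a)^2 / (2 sigma^2))
    u = (x - a) / sigma
    return (1.0 / sigma) * (1.0 - u**2) * np.exp(-0.5 * u**2)

x = np.linspace(-8.0, 8.0, 2001)
w = mexican_hat(x)

area = w.sum() * (x[1] - x[0])   # ~0: the central peak cancels the negative lobes
print(area)
```

The zero crossings sit at x = a ± σ, so the filter responds most strongly to structure of size ~σ, which is why it underlies source detection at multiple scales (Freeman et al. 2002).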


Wavelets: Theory and Applications, Louis, Maass, & Rieder 1997


Wavelet Analysis
· Scaling and shifting
  · Father wavelet φ(x)
  · Mother wavelet ψ(x) = Σ_k c_k φ(2x − k)
  · Continuous: ψ_{a,s}(x) = (1/s) ψ[(x−a)/s]
  · Discrete: ψ_{m,n}(x) = s₀^(−m/2) ψ(x/s₀^m − n a₀)
· Correlation and coefficients
  · W[f](a, s) = ∫ dx f(x) ψ*_{a,s}(x)
  · W[f]_{m,n} = ∫ dx f(x) ψ_{m,n}(x)
· Inverse transforms
  · f(x) = C_ψ^(−1) ∫∫ da ds s^(−2) W[f](a, s) ψ_{a,s}(x),
    with C_ψ = 2π ∫ dω |F[ψ](ω)|² / |ω|
  · f(x) = Σ_{m,n} W[f]_{m,n} ψ_{m,n}(x)
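A sketch of the correlation W[f](a, s) on a sampled signal, discretizing the integral as a sum and using the Mexican hat for ψ (numpy only; the test signals are illustrative assumptions):

```python
import numpy as np

def psi(u):
    # Mexican hat mother wavelet: zero mean, localized
    return (1.0 - u**2) * np.exp(-0.5 * u**2)

def cwt(x, f, a_grid, s):
    # W[f](a, s) = integral dx f(x) (1/s) psi((x - a)/s), as a Riemann sum
    dx = x[1] - x[0]
    u = (x[None, :] - a_grid[:, None]) / s
    return (f[None, :] * psi(u) / s).sum(axis=1) * dx

x = np.linspace(-10.0, 10.0, 2001)
f_flat = np.ones_like(x)                 # no structure at any scale
f_bump = np.exp(-0.5 * (x / 0.5) ** 2)   # feature of width ~0.5 at x = 0

a = np.linspace(-5.0, 5.0, 201)
w_flat = cwt(x, f_flat, a, s=0.5)
w_bump = cwt(x, f_bump, a, s=0.5)
print(float(np.abs(w_flat).max()), float(a[int(np.argmax(w_bump))]))
```

Because ψ has zero mean, a constant signal transforms to zero, while the bump lights up at the matching shift and scale: the "filter of structure at some scale" idea from the earlier slide.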
Filtering