
Abstract

The best mean square error that the classical kernel density estimator achieves when the kernel is non-negative and f has only two continuous derivatives is of the order n^{-4/5}. If negative kernels are allowed, then this rate can be improved, depending on the smoothness of f and the order of the kernel.
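For reference, here is a minimal sketch of the standard bias-variance calculation behind that rate (textbook kernel density theory; the notation is assumed rather than quoted from the thesis):

\hat f_n(t) = \frac{1}{n h_n} \sum_{i=1}^{n} K\!\left(\frac{t - X_i}{h_n}\right),
\qquad
\mathrm{Bias}\bigl(\hat f_n(t)\bigr) = O(h_n^{2}),
\qquad
\mathrm{Var}\bigl(\hat f_n(t)\bigr) = O\!\left(\frac{1}{n h_n}\right),

so the mean square error is O(h_n^{4} + (n h_n)^{-1}), which is minimized by taking h_n of order n^{-1/5} and yields the n^{-4/5} rate.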

Abramson and others modified the classical kernel estimator, assumed non-negative, by allowing the bandwidth h_n to depend on the data. The sharpest result in the literature is due to Hall, Hu and Marron, who show that, under suitable assumptions on a non-negative kernel K and the density f, |f̂_n(t) − f(t)| = O_P(n^{-4/9}) for fixed t. The main result of this thesis states that sup_{t ∈ D_n} |f̂_n(t) − f(t)| = O_P((log n / n)^{4/9}), where D_n and f̂_n(t) are purely data driven and D_n can be taken as close as desired to the set {t : f(t) > 0}. This rate is best possible for estimating a density in the sup norm. The data-driven f̂_n(t) and D_n have 'ideal' counterparts that depend on f, and for the ideal estimator slightly sharper results are proven.
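To make the variable-bandwidth idea concrete, below is a minimal Python sketch of an Abramson-type square-root-law estimator, in which each data point X_i receives the local bandwidth h / sqrt(pilot(X_i)) computed from a fixed-bandwidth pilot estimate. The function name abramson_kde, the Gaussian kernel, and the pilot construction are illustrative assumptions; the sketch omits refinements such as the clipping of Hall, Hu and Marron, Abramson's geometric-mean normalization, and the data-driven set D_n studied in the thesis.

import numpy as np

def abramson_kde(x_eval, data, h):
    """Two-stage variable-bandwidth (square-root-law) KDE sketch.

    Stage 1 computes a fixed-bandwidth pilot estimate at the data points;
    stage 2 rescales the bandwidth at X_i by 1 / sqrt(pilot(X_i)).
    Illustrative only: no clipping or geometric-mean normalization.
    """
    def kernel(u):
        # Gaussian kernel (any smooth non-negative kernel would do here).
        return np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)

    data = np.asarray(data, dtype=float)
    n = data.size

    # Stage 1: fixed-bandwidth pilot density evaluated at the sample points.
    pilot = kernel((data[:, None] - data[None, :]) / h).sum(axis=1) / (n * h)
    pilot = np.maximum(pilot, 1e-12)  # guard against division by zero

    # Stage 2: square-root-law local bandwidths h_i = h / sqrt(pilot(X_i)).
    h_i = h / np.sqrt(pilot)

    x_eval = np.atleast_1d(np.asarray(x_eval, dtype=float))
    u = (x_eval[:, None] - data[None, :]) / h_i[None, :]
    return (kernel(u) / h_i[None, :]).sum(axis=1) / n

# Example: estimate a standard normal density from a simulated sample.
rng = np.random.default_rng(0)
sample = rng.standard_normal(500)
print(abramson_kde(np.linspace(-3.0, 3.0, 7), sample, h=0.4))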

Details

Title: Asymptotic properties of generalized kernel density estimators
Author: Sang, Hailin
Year: 2008
Publisher: ProQuest Dissertations Publishing
ISBN: 978-0-549-87736-3
Source type: Dissertation or Thesis
Language of publication: English
ProQuest document ID: 304629303
Copyright: Database copyright ProQuest LLC; ProQuest does not claim copyright in the individual underlying works.