Ultrafast laser pulse propagation in dielectrics is often modeled by a modified nonlinear Schrödinger equation (NLSE). At laser intensities sufficient to cause ionization, a plasma term can be included in the modified NLSE to account for free-carrier optical effects. This term is linear in the field and may be calculated from the classical Drude theory. We explore the consequences of a newly developed method for including the dispersion relations of the plasma term, as predicted by Drude theory, in the framework of the modified NLSE. The plasma-induced dispersion terms can be shown to strongly affect ultrashort pulse evolution through regions of high plasma density after propagation distances as short as the skin depth. Our results suggest that, in regions of high plasma density, plasma-induced dispersion may be more significant than other linear or nonlinear dispersion corrections to the NLSE.
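As background to the abstract, the classical Drude model it invokes gives the free-carrier contribution to the dielectric function, from which the skin depth mentioned above follows. The sketch below is not the paper's method; it is a minimal illustration of the standard Drude dielectric function and the field-amplitude skin depth, with the carrier density and collision time chosen purely for illustration.

```python
import numpy as np

# Physical constants (SI units)
e = 1.602176634e-19      # electron charge, C
m_e = 9.1093837015e-31   # electron mass, kg
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m
c = 2.99792458e8         # speed of light, m/s

def drude_epsilon(omega, N, tau, eps_b=1.0, m_eff=m_e):
    """Drude dielectric function for carrier density N (m^-3),
    collision time tau (s), and background permittivity eps_b."""
    omega_p2 = N * e**2 / (eps0 * m_eff)          # plasma frequency squared
    return eps_b - omega_p2 / (omega**2 + 1j * omega / tau)

def skin_depth(omega, N, tau, eps_b=1.0, m_eff=m_e):
    """Field-amplitude skin depth: delta = c / (omega * Im n),
    with n the complex refractive index sqrt(epsilon)."""
    n = np.sqrt(drude_epsilon(omega, N, tau, eps_b, m_eff))
    return c / (omega * n.imag)

# Illustrative numbers (assumed, not from the paper): an 800 nm pulse
# in a fused-silica-like host (n0 ~ 1.45) with a dense laser-generated plasma.
lam = 800e-9                       # vacuum wavelength, m
omega = 2 * np.pi * c / lam        # angular frequency, rad/s
N = 1e27                           # assumed free-carrier density, m^-3
tau = 1e-15                        # assumed collision time, s
print(skin_depth(omega, N, tau, eps_b=1.45**2))   # skin depth in meters
```

For densities of this order the resulting skin depth is on the micron scale, which is why dispersion acquired over propagation distances as short as the skin depth can matter for ultrashort pulses.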
- ultrafast pulse propagation,
- plasma generation,
- nonlinear Schrödinger equation
Available at: http://works.bepress.com/jeremy_gulley/8/