Ultrafast laser pulse propagation in dielectrics is often modeled by a modified nonlinear Schrödinger equation (NLSE). At laser intensities sufficient to cause ionization, a plasma term can be included in the modified NLSE to account for free-carrier optical effects. This term is linear in the field and may be calculated from classical Drude theory. We explore the consequences of a newly developed method for incorporating the dispersion of the plasma term, as predicted by Drude theory, into the framework of the modified NLSE. The plasma-induced dispersion terms are shown to strongly affect ultrashort pulse evolution through regions of high plasma density after propagation distances as short as the skin depth. Our results suggest that, in regions of high plasma density, plasma-induced dispersion may be more significant than other linear or nonlinear dispersion corrections to the NLSE.
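For context, a commonly used form of such an envelope equation in the filamentation literature couples the NLSE to a Drude-model plasma response. The notation and coefficients below are illustrative assumptions following that convention, not the exact equation or method of this work:

$$
\frac{\partial A}{\partial z} = \frac{i}{2k_0}\nabla_{\perp}^{2}A
- \frac{i k''}{2}\frac{\partial^{2}A}{\partial t^{2}}
+ i k_0 n_2 |A|^{2}A
- \frac{\sigma}{2}\left(1 + i\omega_0\tau_c\right)\rho\, A ,
$$

where $A$ is the field envelope, $\rho$ the free-carrier density, and the last term, linear in $A$, is the plasma contribution. In the Drude model the cross-section is typically written as

$$
\sigma = \frac{k_0 e^{2}}{n_0^{2}\,\epsilon_0\, m_e\,\omega_0^{2}}\,
\frac{\omega_0\tau_c}{1+\omega_0^{2}\tau_c^{2}} ,
$$

with $\tau_c$ the carrier collision time. Evaluating this response only at the central frequency $\omega_0$, as in the sketch above, neglects its frequency dependence; retaining that dependence is what produces the plasma-induced dispersion terms discussed in the abstract.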
- ultrafast pulse propagation,
- plasma generation,
- nonlinear Schrödinger equation
Available at: http://works.bepress.com/jeremy_gulley/8/