Abstract
We describe a model evaluating changes in the optical isolation of a Faraday isolator when passing from air to vacuum in terms of different thermal effects in the crystal. The changes are particularly significant in the crystal thermal lensing (refractive index and thermal expansion) and in its Verdet constant, and can be ascribed to the less efficient convection cooling of the magneto-optic crystal of the Faraday isolator. An isolation decrease by a factor of 10 is experimentally observed when going from air to vacuum in a Faraday isolator used in a gravitational wave experiment (Virgo) with a 10 W input laser. A finite element model simulation reproduces with great accuracy the experimental data measured on Virgo and on a test bench. A first set of thermal lensing measurements has been used to characterize the losses of the crystal, which depend on the sample. The isolation factor measured on Virgo confirms the simulation model and absorption losses of 0.0016 ± 0.0002/cm for the TGG magneto-optic crystal used in the Faraday isolator. © 2008 Optical Society of America
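As a minimal illustrative sketch (not the paper's finite element model), the link between crystal heating and isolation can be captured from the Faraday rotation angle θ(T) = V(T)·B·L: a deviation δ from the ideal 45° lets a fraction sin²(δ) of the backward beam leak through the input polarizer, so the single-pass isolation is roughly −10·log₁₀(sin²δ). All numerical values below (Verdet constant, its temperature coefficient, magnetic field) are hypothetical placeholders and are not taken from the paper.

```python
import numpy as np

# Hypothetical TGG parameters (illustrative only, not from the paper)
V0 = 40.0        # Verdet constant at T0, rad/(T*m)
dV_dT = -0.08    # assumed temperature coefficient, rad/(T*m*K)
T0 = 293.0       # reference temperature, K
B = 1.0          # magnetic field, T (placeholder)
L = np.pi / 4 / (V0 * B)   # crystal length giving exactly 45 deg rotation at T0

def isolation_dB(T):
    """Isolation limited only by the rotation-angle error at temperature T."""
    V = V0 + dV_dT * (T - T0)          # temperature-dependent Verdet constant
    delta = V * B * L - np.pi / 4      # deviation from 45 deg, in radians
    leak = np.sin(delta) ** 2          # leaked power fraction of backward beam
    return -10.0 * np.log10(max(leak, 1e-12))

# Crystal heating by a few kelvin (e.g. weaker convective cooling in vacuum)
for dT in (0.0, 1.0, 3.0, 10.0):
    print(f"dT = {dT:4.1f} K  ->  isolation ~ {isolation_dB(T0 + dT):5.1f} dB")
```

This sketch ignores the thermally induced depolarization and thermal lensing that the paper also models; it only shows how a small temperature rise of the TGG crystal degrades the isolation through the Verdet constant.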
| Original language | English |
|---|---|
| Pages (from-to) | 5853-5861 |
| Number of pages | 9 |
| Journal | Applied Optics |
| Volume | 47 |
| Issue number | 31 |
| DOIs | |
| Publication status | Published - 1 Nov 2008 |
| Externally published | Yes |
Keywords
- SELF-INDUCED DEPOLARIZATION
- ADAPTIVE COMPENSATION
- LASER-RADIATION
- DISTORTIONS