Neil Alden Armstrong
The first dawning rays of this new age of seeing appeared—quite literally—just before the beginning of the 20th century. In 1895 a German physicist named Wilhelm Conrad Roentgen accidentally discovered a form of radiation that could penetrate opaque objects and cast ghostly images on a photographic plate. Roentgen called his discovery X-radiation (the X was for "unknown"), and to prove its existence he took a picture of his wife's hand by exposing it to a beam of the rays. The result showed the bones of her hand and a ring on her finger as dark shadows on the photographic plate. It was the first X-ray image ever deliberately recorded.

The rays were soon identified as a form of electromagnetic radiation with wavelengths very much shorter than those of visible light. The shortness, or high frequency, of these wavelengths accounted for their penetrating power; their ability to delineate internal structure came from the fact that denser materials, such as bone, absorbed more of the rays. An American named William Coolidge soon put all this to practical effect with his 1913 invention of a vacuum tube that conveniently—and relatively safely—generated X rays. Medical doctors quickly seized on this wonderful new tool that enabled them to see, for example, how a bone was broken, where a bullet might be lodged, or whether a lung harbored potentially lethal growths. A new field of diagnostic—and later therapeutic—medicine, radiology, was born. X rays also found their way out of the doctor's office and into the industrial world, where they were used to check for hidden cracks or other flaws in complex machinery and in structures such as bridges.

Behind X rays' power lay a principle of physics that eventually helped researchers see things that were infinitesimally small.
According to this principle, a given form of radiation cannot be used to distinguish anything smaller than half its own wavelength; therefore, the shorter the wavelength, the smaller the object that can be imaged. Beginning in the 1930s, efforts to improve traditional microscopes' magnifying power—up to 5,000 times for the best optical instruments—came to fruition as several researchers across the globe began to use streams of electrons to "illuminate" surfaces. Electron microscopes, as they were called, could reveal details many times smaller than visible light could because the wavelengths of the electron beams were up to 100,000 times shorter than the wavelengths of visible light. With improvements over the years that included the ability to scan across a surface or even probe to reveal subsurface details, electron microscopes ultimately achieved magnifications of up to two million times—equivalent to enlarging a postage stamp to the size of a large city. Now scientists can see things right down to the level of individual atoms on a surface.
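The half-wavelength rule described above can be sketched in a few lines of code. The wavelength values here are illustrative assumptions (a typical green-light wavelength of about 550 nm, and the text's "100,000 times shorter" figure for electron beams), not numbers given in the article.

```python
def resolution_limit_nm(wavelength_nm: float) -> float:
    """Smallest feature distinguishable with radiation of the given
    wavelength: roughly half that wavelength, per the rule above."""
    return wavelength_nm / 2.0

# Visible green light, ~550 nm (assumed typical value):
print(resolution_limit_nm(550.0))  # -> 275.0 nm, far larger than an atom

# An electron beam with a wavelength ~100,000 times shorter,
# as the text describes -- the limit drops to the atomic scale:
print(resolution_limit_nm(550.0 / 100_000))  # roughly 0.003 nm
```

This is why optical microscopes plateau around a few thousandfold magnification while electron microscopes can resolve individual atoms: halving the wavelength halves the smallest object that can be imaged.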
One major drawback to these approaches was that they carried the risks associated with exposure to ionizing radiation, which even in relatively small amounts can cause irreparable damage to cells and tissues. But at about the same time that CAT (computerized axial tomography) scanning was becoming a practical tool, a less harmful—and indeed more revealing—imaging technology appeared. Magnetic resonance imaging (MRI) relies not on X rays or gamma rays but on the interaction of harmless radio waves with hydrogen atoms in cells subjected to a magnetic field. It took a great deal of work by radiologists, scientists, and engineers to iron out all the wrinkles in MRI, but by the 1980s it too was proving to be an indispensable diagnostic tool. Not only was MRI completely noninvasive and free of ill effects, it could also create images of nearly any soft tissue. In its most developed form it can even chart blood flow and chemical activity in areas of the brain and heart, revealing details of functioning that had never been seen before.