The idea that all mental functions are derived from the brain originated with Hippocrates, but it was largely neglected until the late 18th century, when Franz Gall attempted to link psychology and brain science. Gall took advantage of what was already known about the cerebral cortex. He was aware that it was bilaterally symmetrical and subdivided into four lobes. However, he found that these four lobes were, by themselves, inadequate to account for the forty-odd distinct psychological functions that psychologists had characterized by 1790. As a result he began to analyze the heads of hundreds of musicians, actors, and other notable individuals, relating certain bony elevations or depressions under the scalp to the predominant talents or defects of their owners. Based on this skull palpation, Gall subdivided the cortex into roughly forty regions, each of which served as an organ for a specific mental function.
While Gall's theory that all mental processes derive from the brain proved to be correct, his methods for localizing specific functions were deeply flawed because they were not based on what we would now consider valid evidence. Gall did not test his ideas empirically by performing autopsies on the brains of patients and correlating damage to specific regions with defects in mental attributes; he distrusted the diseased brain and did not think it could reveal anything about normal behavior. Instead, he developed the notion that as each mental function is used, the particular area of the brain responsible for that function becomes enlarged. Eventually, a given area may become so bulky that it pushes out against the skull and produces a bump on the head.
The historical basis for the King Arthur legend has long been debated by scholars. One school of thought, citing entries in the History of the Britons and the Annales Cambriae (Welsh Annals), sees Arthur as a genuine historical figure, a Romano-British leader who fought against the invading Anglo-Saxons sometime in the late 5th to early 6th century. The second of these texts, the 10th-century Annales Cambriae, offers weaker support than it might appear. The latest research shows that it was based on a chronicle begun in the late 8th century in Wales, but the complex textual history of the Annales Cambriae precludes any certainty that the Arthurian annals were added to it even that early. They were more likely added at some point in the 10th century and may never have existed in any earlier set of annals.
This lack of convincing early evidence is the reason many recent historians exclude Arthur from their accounts of post-Roman Britain. In the view of historian Thomas Charles-Edwards, there may well have been a historical Arthur, but a historian can as yet say nothing of value about him. These modern admissions of ignorance are a relatively recent trend; earlier generations of historians were less skeptical. Historian John Morris made the putative reign of Arthur the organizing principle of his history of post-Roman Britain and Ireland. Even so, he found little to say about a historical Arthur. Partly in reaction to such theories, another school of thought emerged which argued that Arthur had no historical existence at all. Morris's Age of Arthur prompted archaeologist Nowell Myres to observe that no figure on the borderline of history and mythology has wasted more of the historian's time. Arthur is not mentioned in the Anglo-Saxon Chronicle or named in any surviving manuscript written between 400 and 820. He is absent from Bede's early-8th-century Ecclesiastical History of the English People, another major early source for post-Roman history.
Some scholars argue that Arthur was originally a fictional hero of folklore—or even a half-forgotten Celtic deity—who became credited with real deeds in the distant past. They cite parallels with figures such as the Kentish totemic horse-gods Hengest and Horsa, who later became historicized. Bede ascribed to these legendary figures a historical role in the 5th-century Anglo-Saxon conquest of eastern Britain.
Historical documents for the post-Roman period are scarce. Of the many post-Roman archaeological sites and places, only a handful have been identified as "Arthurian," and these date from the 12th century or later. Archaeology can confidently reveal names only through inscriptions found in reliably dated sites. In the absence of compelling new information about post-Roman England, a definitive answer to the question of Arthur's historical existence is unlikely.
People associate global warming with temperature, but the phrase is misleading—it fails to mention the relevance of water. Nearly every significant indicator of hydrological activity—rainfall, snowmelt, glacial melt—is changing at an accelerating pace (one can arbitrarily pick any point of the hydrological cycle and notice a disruption). One analysis pegged the increase in precipitation at 2 percent over the century. In water terms this sounds auspicious, promising increased supply, but the changing timing and composition of the precipitation more than neutralizes the advantage. For one thing, it is likely that more of the precipitation will fall in intense episodes, with flooding a reasonable prospect. In addition, while rainfall will increase, snowfall will decrease. Such an outcome means that in watersheds that depend on snowmelt, like the Indus, Ganges, and Colorado river basins, less water will be stored as snow, and more of it will flow in the winter, when it plays no agricultural role; conversely, less of it will flow in the summer, when it is most needed. One computer model showed that on the Animas River an increase in temperature of 3.6 degrees Fahrenheit would cause runoff to rise by 85 percent from January to March, but drop by 40 percent from July to September. The rise in temperature increases the probability and intensity of spring floods and threatens dam safety, which is predicated on lower runoff projections. Dams in arid areas also may face increased sedimentation, since a 10 percent annual increase in precipitation can double the volume of sediment washed into rivers.
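The seasonal redistribution the Animas River model describes can be made concrete with a small sketch. The baseline monthly runoff figures below are hypothetical, chosen only to illustrate the effect; the percentage changes (+85 percent for January–March, −40 percent for July–September) are the ones cited above.

```python
# Sketch of the modeled seasonal runoff shift on a snowmelt-fed river.
# Baseline volumes are hypothetical (arbitrary units); only the percentage
# changes come from the model cited in the text.

BASELINE = {
    "Jan": 10, "Feb": 12, "Mar": 20, "Apr": 40, "May": 60, "Jun": 50,
    "Jul": 35, "Aug": 25, "Sep": 15, "Oct": 12, "Nov": 10, "Dec": 10,
}

def shifted_runoff(baseline):
    """Apply the modeled warming response: winter runoff up, summer down."""
    out = {}
    for month, volume in baseline.items():
        if month in ("Jan", "Feb", "Mar"):
            out[month] = volume * 1.85   # +85 percent
        elif month in ("Jul", "Aug", "Sep"):
            out[month] = volume * 0.60   # -40 percent
        else:
            out[month] = float(volume)
    return out

shifted = shifted_runoff(BASELINE)
winter = sum(shifted[m] for m in ("Jan", "Feb", "Mar"))
summer = sum(shifted[m] for m in ("Jul", "Aug", "Sep"))
print(f"Winter (Jan-Mar) runoff: {winter:.1f}")  # up from 42.0 baseline
print(f"Summer (Jul-Sep) runoff: {summer:.1f}")  # down from 75.0 baseline
```

Under these illustrative numbers, winter runoff nearly doubles while summer runoff falls well below the baseline—exactly the mismatch between water availability and agricultural demand described above.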
The consequences multiply. Soil moisture will increase at the highest northern latitudes, where precipitation will grow far more than evaporation and plant transpiration but where agriculture is nonexistent. At the same time, precipitation will drop over northern mid-latitude continents in summer months, when ample soil moisture is an agricultural necessity. Meanwhile the sea level will continue to rise as temperatures warm, accelerating saline contamination of freshwater aquifers and river deltas. Rising temperatures will also increase evaporation, which in turn will lead to a greater incidence of drought.
Perhaps most disturbing of all, the hydrologic cycle is becoming increasingly unpredictable. This means that the last century's hydrological cycle—the set of assumptions about water on which modern irrigation is based—has become unreliable. Build a dam too large, and it may not generate its designed power; build it too small, and it may collapse or flood. Release too little dam runoff in the spring and risk flood, as the snowmelt cascades downstream with unexpected volume; release too much and the water will not be available for farmers when they need it. At a time when water scarcity calls out for intensified planning, planning itself may be stymied.
The US Constitution established both gold and silver as the basis of US currency: that is to say, it established a bimetallic standard for currency. This remained in place for about a century, until the Coinage Act of 1873, which embraced a "gold only" standard, a monometallic standard, effectively dropping silver as the basis of currency. Over the next several decades, advocates of bimetallism and advocates of the "gold only" standard fiercely debated.
The "gold only" advocates, such as William McKinley, argued that shifts in the relative value of the two precious metals could lead to wild fluctuations in the values of currency in a bimetallic system. Early in United States history, Alexander Hamilton had tried to fix the gold-silver exchange rate by fiat, but of course, such restraints only inhibit the natural development of a free market.
Unemployment was high in the depression caused by the Panic of 1893, and many argued that these economic challenges had been triggered by abandoning bimetallism. One of the more prominent advocates of bimetallism was William Jennings Bryan: indeed, bimetallism was the very center of his presidential campaigns in 1896 and 1900, both of which he lost to McKinley. Bryan articulated the popular view that a "gold only" standard limited the money supply, and thus favored those who were already quite wealthy, against the interests of working people of all professions. He famously expressed this argument in his "Cross of Gold" speech at the 1896 Democratic National Convention, in which he argued that continuing the "gold only" standard would "crucify" the honest laboring classes on a "cross of gold."
Despite the eloquence of Bryan's arguments, history strongly favored the "gold only" standard. The argument that increasing the money supply would lead to greater prosperity strikes us now as naïve: of course, we now understand that increasing the money supply can lead to runaway inflation, which hurts everyone. Furthermore, gold did not remain as limited as the advocates of bimetallism imagined. In the 1890s, scientists discovered a cyanide process that allowed workers to extract pure gold from much lower grade ore, thus significantly increasing domestic gold production. Additionally, the discovery of two immense gold deposits in South Africa substantially increased the world gold supply. Thus, the "gold only" standard allowed for ample currency, and even robust prosperity in the 1920s, so bimetallism died a quiet death.
The US Constitution established both gold and silver as the basis of US currency: that is to say, it established a bimetallic standard for currency. This remained in place for about a century, until the Coinage Act of 1873, which embraced a "gold only" standard, a monometallic standard, effectively dropping silver as the basis of currency. Over the next several decades, advocates of bimetallism and advocates of the "gold only" standard fiercely debated.
The "gold only" advocates, such as William McKinley, argued that shifts in the relative value of the two precious metals could lead to wild fluctuations in the values of currency in a bimetallic system. Early in the United States history, Alexander Hamilton had tried to fix the gold-silver exchange rate by fiat, but of course, such restraints only inhibit the natural development of a free market.
Unemployment was high in the depression caused by the Panic of 1893, and many argued that these economic challenges had been triggered by abandoning bimetallism. One of the more prominent advocates of bimetallism was William Jennings Bryant: indeed, bimetallism was the very center of his presidential campaigns in 1896 and 1900, both of which he lost to McKinley. Bryant articulated the popular view that a "gold only" standard limited the money supply, and thus favored those who were already quite wealthy, against the interests of working people of all professions. He famously expressed this argument in his "Cross of Gold" speech at the 1896 Democratic National Convention, in which he argued that continuing the "gold only" standard would "crucify" the honest laboring classes on a "cross of gold."
Despite the eloquence of Bryant's arguments, history strongly favored the "gold-only" standard. The argument that increasing the money supply would lead to greater prosperity strikes us now as naïve: of course, we now understand that increasing the monetary supply can lead to runaway inflation, which hurts everyone. Furthermore, gold did not remain as limited as the advocates of bimetallism imagined. In the 1890s, scientists discovered a cyanide process that allowed workers to extract pure gold from much lower grade ore, thus significantly increasing domestic gold production. Additionally, the discovery of two immense gold deposits in South Africa substantially increased world gold supply. Thus, the "gold only" standard allowed for ample currency, and even robust prosperity in the 1920s, so bimetallism died a quiet death.
The US Constitution established both gold and silver as the basis of US currency: that is to say, it established a bimetallic standard for currency. This remained in place for about a century, until the Coinage Act of 1873, which embraced a "gold only" standard, a monometallic standard, effectively dropping silver as the basis of currency. Over the next several decades, advocates of bimetallism and advocates of the "gold only" standard fiercely debated.
The "gold only" advocates, such as William McKinley, argued that shifts in the relative value of the two precious metals could lead to wild fluctuations in the values of currency in a bimetallic system. Early in the United States history, Alexander Hamilton had tried to fix the gold-silver exchange rate by fiat, but of course, such restraints only inhibit the natural development of a free market.
Unemployment was high in the depression caused by the Panic of 1893, and many argued that these economic challenges had been triggered by abandoning bimetallism. One of the more prominent advocates of bimetallism was William Jennings Bryant: indeed, bimetallism was the very center of his presidential campaigns in 1896 and 1900, both of which he lost to McKinley. Bryant articulated the popular view that a "gold only" standard limited the money supply, and thus favored those who were already quite wealthy, against the interests of working people of all professions. He famously expressed this argument in his "Cross of Gold" speech at the 1896 Democratic National Convention, in which he argued that continuing the "gold only" standard would "crucify" the honest laboring classes on a "cross of gold."
Despite the eloquence of Bryant's arguments, history strongly favored the "gold-only" standard. The argument that increasing the money supply would lead to greater prosperity strikes us now as naïve: of course, we now understand that increasing the monetary supply can lead to runaway inflation, which hurts everyone. Furthermore, gold did not remain as limited as the advocates of bimetallism imagined. In the 1890s, scientists discovered a cyanide process that allowed workers to extract pure gold from much lower grade ore, thus significantly increasing domestic gold production. Additionally, the discovery of two immense gold deposits in South Africa substantially increased world gold supply. Thus, the "gold only" standard allowed for ample currency, and even robust prosperity in the 1920s, so bimetallism died a quiet death.
The US Constitution established both gold and silver as the basis of US currency: that is to say, it established a bimetallic standard for currency. This remained in place for about a century, until the Coinage Act of 1873, which embraced a "gold only" standard, a monometallic standard, effectively dropping silver as the basis of currency. Over the next several decades, advocates of bimetallism and advocates of the "gold only" standard debated the issue fiercely.
The "gold only" advocates, such as William McKinley, argued that shifts in the relative value of the two precious metals could lead to wild fluctuations in the value of currency in a bimetallic system. Early in United States history, Alexander Hamilton had tried to fix the gold-silver exchange rate by fiat, but of course, such restraints only inhibit the natural development of a free market.
Unemployment was high in the depression caused by the Panic of 1893, and many argued that these economic challenges had been triggered by abandoning bimetallism. One of the more prominent advocates of bimetallism was William Jennings Bryan: indeed, bimetallism was the very center of his presidential campaigns in 1896 and 1900, both of which he lost to McKinley. Bryan articulated the popular view that a "gold only" standard limited the money supply, and thus favored those who were already quite wealthy, against the interests of working people of all professions. He famously expressed this argument in his "Cross of Gold" speech at the 1896 Democratic National Convention, in which he argued that continuing the "gold only" standard would "crucify" the honest laboring classes on a "cross of gold."
Despite the eloquence of Bryan's arguments, history strongly favored the "gold only" standard. The argument that increasing the money supply would lead to greater prosperity now strikes us as naïve: we understand that increasing the money supply can lead to runaway inflation, which hurts everyone. Furthermore, gold did not remain as scarce as the advocates of bimetallism imagined. In the 1890s, scientists discovered a cyanide process that allowed workers to extract pure gold from much lower-grade ore, significantly increasing domestic gold production. Additionally, the discovery of two immense gold deposits in South Africa substantially increased the world gold supply. Thus, the "gold only" standard allowed for ample currency, and even robust prosperity in the 1920s, and bimetallism died a quiet death.
Unlike Mercury and Mars, Venus has a dense, opaque atmosphere that prevents direct observation of its surface. For years, Earth-based telescopes could glean no information about the surface of Venus. In 1989, the Magellan probe was launched to conduct a five-year radar mapping of the entire surface of Venus. The data that emerged provided by far the most detailed map of the Venusian surface ever seen.
The surface shows an astonishing level of past volcanic activity: more than one hundred large shield volcanoes, many more than Earth has, and a solidified river of lava longer than the Nile. Yet the entire surface is volcanically dead, with not a single active volcano. This surface is relatively young in planetary terms, about 300 million years old. The whole surface, planet-wide, is the same age: the even pattern of craters, randomly distributed across the surface, demonstrates this.
To explain this puzzling surface, Turcotte suggested a radical model. For a period, the surface of Venus is as it is now: a surface of uniform age with no active volcanism. While the surface is fixed, volcanic pressure builds up inside the planet. At a certain point, the pressure ruptures the surface, and the entire planet is re-coated in lava in a massive planet-wide outburst of volcanism. Having spent all this thermal energy in one gigantic outpouring, the surface cools and hardens, again producing the kind of surface we see today. Turcotte proposed that this cycle has repeated several times in the past and will repeat again in the future.
To most planetary geologists, Turcotte's model is a return to catastrophism. For two centuries, geologists of all kinds fought against the idea of catastrophic, planet-wide changes, such as the Biblical idea of Noah's Flood. The triumph of gradualism was essential to the success of geology as a serious science. Indeed, all features of Earth's geology, and all features of the other moons and planets in the Solar System, even those that are not volcanically active, are explained very well by current gradualist models. Planetary geologists question why every other object would obey gradualist models while Venus alone obeyed a catastrophic model. These geologists insist that the features of Venus must be explicable in terms of incremental changes occurring continuously over a long period.
Turcotte, anticipating these objections, points out that no incremental process could result in a planet-wide surface all of the same age. Furthermore, a slow process of continual change does not readily explain why a planet with such an astounding history of volcanic activity is now volcanically dead. Turcotte argues that only his catastrophic model adequately explains the extremes of the Venusian surface.