Sunday, June 8, 2008

I Want Answers

I Want Answers:

See http://en.wikipedia.org/wiki/Supercomputer:
"(NNSA) selected IBM to design and build the world's first supercomputer to use the Cell Broadband Engine (Cell B.E.) processor aiming to produce a machine capable of a sustained speed of up to 1,000 trillion (one quadrillion) calculations per second, or one PFLOPS."

*****

Dave,

The June 2008 issue of National Review is interesting, especially in relation to some information on the internet.

*****

See http://www.youtube.com/watch?v=8hGvQtumNAY:
(Exchange between Nicholson and Cruise)
You want answers?
I think I’m entitled.
You want answers?
I want the truth.
You can’t handle the truth!

*****

Taking “Free” Out Of “Will” (?):

Super data-crunching computers have aided the successful mapping of the human genome. Now, ambitious efforts are underway to use computers to help conduct genome-wide association studies (GWAS). The goal, insofar as possible, is to reduce all psychological tendencies (such as depression, shyness, disorderliness, aggressiveness, clumsiness, dullness, inattentiveness) to genetically operative explanations, rather like an engineering problem. (See the June 2, 2008 issue of National Review, “Undetermined,” page 26.)

Structural limits likely remain, because much behavior correlates not with single genes alone, but with interactions among many genes and biochemical processes: interactions among components within components, interfacing with environmental factors within factors (nutrition, culture, previous conditioning, previous education in the inculcation of ideals).
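To get a rough feel for why “interactions among many genes” resists brute-force treatment, here is a small back-of-the-envelope Python sketch. The gene count and the interaction orders are illustrative assumptions of mine, not figures from the National Review article:

from math import comb

N = 20_000  # roughly the number of human protein-coding genes (order of magnitude only)

pairs = comb(N, 2)    # possible two-gene interactions
triples = comb(N, 3)  # possible three-gene interactions

print(f"two-gene interactions among {N:,} genes:   {pairs:,}")    # about 2.0 x 10^8
print(f"three-gene interactions among {N:,} genes: {triples:,}")  # about 1.3 x 10^12

Even before environmental factors enter, the number of candidate interactions grows combinatorially with the interaction order.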

See http://www.theuniversityconcourse.com/IV,2,11-18-1998/Kovach.htm :
“Solvable problems belong to the group of languages called recursive. For these problems the TM will return a definite answer yes or no in a finite period of time. Yet many of these cannot be practically solved because of limits of time or space.
A particularly simple example of this is the traveling salesperson problem. There is a salesperson who has to visit 50 cities. He wishes to do so by traveling the least number of miles and by not visiting any city more than once. The algorithm for this problem is deceptively simple. Measure and store the distances between the starting point and all 50 cities. Find the shortest distance between each set of the remaining 49 cities and add it to the first distance, then determine which is the shortest distance. Simple huh? Until one considers the time needed to solve this problem. This problem requires at least 50! steps to solve it. (50!, read 50 factorial, is the product of 50x49x48x47x...x3x2x1. It is approximately 314 followed by 64 zeros. To give you a feel for the time needed to solve this, assume a computer could perform 10 billion steps a second. This would translate into approximately 31.5 quadrillion steps in a year. At this rate, it would take "a little" more than 9,650,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000 years to solve this problem. (In Asimov's story, the universe ends in a mere 10,000,000,000,000 years.) Thus even solvable problems may be beyond the power of computers.”

[PERSONAL COMMENTS: The above article was written in 1998. Efforts are now underway to build a supercomputer capable of 1,000 trillion (one quadrillion) calculations per second, or one PFLOPS. That is 100,000 times faster than the 10 billion steps a second assumed in the example above.
Even so, taking Kovach’s figure at face value, even a state-of-the-art PFLOPS supercomputer would need roughly 9.65 x 10 to the 46th power years (96,500 followed by 42 zeros) to solve his problem, which would still far exceed the ten trillion years remaining in Asimov’s universe, by a factor of roughly 9.65 x 10 to the 33rd power. (A quick arithmetic check is sketched just after these comments.)

NOT SO HARD: However, intuition suggests solving such problems will soon become far less difficult, for the following reasons:
1) Raw computing power is combined with the guiding intuitive power of its programmers;
2) Supercomputing capacities will continue to explode; and
3) AI will likely explode the intuitive power of the programmers who guide that raw computing.]
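For readers who want to verify the arithmetic in the comments above, here is a quick Python check. Kovach’s estimate of 9.65 x 10 to the 51st power years at 10 billion steps per second and the one-PFLOPS target are taken from the quoted sources; the rest is plain division:

from math import factorial

print(f"50! is about {float(factorial(50)):.3e} steps")   # ~3.041e+64

kovach_years = 9.65e51          # Kovach's estimate at 10 billion steps per second
speedup = 1e15 / 1e10           # one PFLOPS vs. 10 billion steps/sec = 100,000x

years_at_one_pflops = kovach_years / speedup
print(f"years at one PFLOPS: about {years_at_one_pflops:.3e}")   # ~9.65e+46

asimov_years_left = 1e13        # "a mere 10,000,000,000,000 years"
print(f"factor beyond Asimov's universe: {years_at_one_pflops / asimov_years_left:.3e}")  # ~9.65e+33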

See http://compnetworking.about.com/od/basicnetworkingconcepts/g/bldef_kilobit.htm:
“Definition: In computer networking, a kilobit normally represents 1000 bits of data. A megabit represents 1000 kilobits and a gigabit represents 1000 megabits (equal to one million kilobits).”

See http://www.defensenews.com/story.php?i=3306303&c=FEA&s=CVS :
“Some Army officials see hope in the Air Force's Transformational Satellite communications system, which will allow data rates of about two gigabits per second when it arrives around 2020, about eight times faster than the 250 megabits of today's Advanced Extremely High Frequency Satellites. TSAT's Internet Protocol-based system will also offer more flexibility than AEHF's point-to-point connections.”
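As a small sanity check on those figures, here is a short Python sketch of my own; the 2 gigabit and 250 megabit rates come from the Defense News article, and the decimal prefixes from the networking definition above:

KILO = 1_000          # bits per kilobit (decimal, per the definition above)
MEGA = 1_000 * KILO   # bits per megabit
GIGA = 1_000 * MEGA   # bits per gigabit

tsat_bps = 2 * GIGA    # planned TSAT rate, about 2 gigabits per second
aehf_bps = 250 * MEGA  # today's AEHF rate, 250 megabits per second

print(f"TSAT / AEHF = {tsat_bps / aehf_bps:.0f}x")  # 8x, matching "about eight times faster"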

See http://www.top500.org/system/8968 :
“BlueGene/L (BGL) clocked 478.2 trillion floating point operations per second (teraFLOPS) on LINPACK, the industry standard of measure for high-performance computing.”

See http://en.wikipedia.org/wiki/Supercomputer :
“Supercomputers are used for highly calculation-intensive tasks such as problems involving quantum mechanical physics, weather forecasting, climate research (including research into global warming), molecular modeling (computing the structures and properties of chemical compounds, biological macromolecules, polymers, and crystals), physical simulations (such as simulation of airplanes in wind tunnels, simulation of the detonation of nuclear weapons, and research into nuclear fusion), cryptanalysis, and the like. Major universities, military agencies and scientific research laboratories are heavy users.
A particular class of problems, known as Grand Challenge problems, are problems whose full solution requires semi-infinite computing resources.
Relevant here is the distinction between capability computing and capacity computing, as defined by Graham et al. Capability computing is typically thought of as using the maximum computing power to solve a large problem in the shortest amount of time. Often a capability system is able to solve a problem of a size or complexity that no other computer can. Capacity computing in contrast is typically thought of as using efficient cost-effective computing power to solve somewhat large problems or many small problems or to prepare for a run on a capability system.
….
As of November 2007, the IBM Blue Gene/L at Lawrence Livermore National Laboratory (LLNL) is the fastest operational supercomputer, with a sustained processing rate of 478.2 TFLOPS.
….
On September 9, 2006 the U.S. Department of Energy's National Nuclear Security Administration (NNSA) selected IBM to design and build the world's first supercomputer to use the Cell Broadband Engine (Cell B.E.) processor aiming to produce a machine capable of a sustained speed of up to 1,000 trillion (one quadrillion) calculations per second, or one PFLOPS. Another project in development by IBM is the Cyclops64 architecture, intended to create a "supercomputer on a chip".”
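For scale, the one-PFLOPS goal is only about twice the sustained speed of BlueGene/L, the fastest machine then operating. A quick Python check of my own arithmetic, using the 478.2 TFLOPS and one-PFLOPS figures from the quotes above:

TERA = 1e12
PETA = 1e15

blue_gene_L_flops = 478.2 * TERA   # sustained LINPACK rate, November 2007
pflops_goal = 1 * PETA             # target for the NNSA/IBM Cell B.E. machine

print(f"PFLOPS goal / BlueGene/L = {pflops_goal / blue_gene_L_flops:.2f}x")  # about 2.09x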

[PERSONAL COMMENTS, with quote snippets from the June 2008 National Review:
The June 2008 issue of National Review suggests that we will not know whether the project of understanding how genes, body, brain, environment, and social conditioning interact to create the characteristics of mind can be achieved until we achieve it. We will not know whether the problem is solvable until we solve it. (“We are like cavemen trying to figure out how a computer works by poking at it with sharpened sticks.”) This situation may be similar to the “halting problem,” as described by Turing and referenced in Kovach’s 1998 article; a minimal sketch of that problem follows these comments.
“Science may someday allow us to predict human behavior comprehensively and reliably, so that we can live in Woodrow Wilson’s “perfected beehive.” Until then, however, we need to keep stumbling forward in freedom as best we can.”]
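For reference, here is a minimal Python sketch of the paradox behind Turing’s halting problem. The halts() function below is hypothetical; the whole point of the argument is that no such function can exist for all programs and inputs:

def halts(program, data):
    """Hypothetically returns True if program(data) eventually halts."""
    raise NotImplementedError("no general implementation can exist")

def contrary(program):
    # Do the opposite of whatever halts() predicts about program run on itself.
    if halts(program, program):
        while True:   # predicted to halt, so loop forever
            pass
    else:
        return        # predicted to loop forever, so halt immediately

# Feeding contrary to itself yields a contradiction either way:
# if halts(contrary, contrary) were True, contrary(contrary) would loop forever;
# if it were False, contrary(contrary) would halt. So no universal halts() can exist.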
