


--  Author: Logician
--  Posted: 3/15/2005 2:17:00 PM

--  Three Problems in Computer Science (repost)
[ The following text is reposted from the Algorithm board of the Lily BBS ]
[ Originally posted by starfish@lilybbs ]

Three Problems in Computer Science

LESLIE G. VALIANT
Harvard University

We describe three problems that each have a history of research that
goes back to the early days of computer science. These past efforts,
which we do not review here, have succeeded in uncovering some of the
essence of these otherwise elusive challenges, and make it possible to
focus on these questions more clearly now. Difficult as they may
appear to be, they are well-posed scientific questions, we believe,
and therefore will be solved eventually. While their solutions can be
expected to have mathematical content, the problems as we state them
are not purely mathematical. In each case, a part of the solution
would be the emergence of some consensus on the nature of the right
mathematical formulation.

1. Characterizing the Power of Computation

The aim here is to achieve a theory of practical resource bounded
computability that is as satisfying as the theory of general
computability developed by Turing [1936] and his contemporaries. In
particular, the goal is to fully characterize what can be computed in
practice in the physical world.

First, we need one characterization of the class of computations that
can be carried out in the physical world in practice. In this pursuit, we
embrace the polynomial resource cutoff that has proved so fruitful to
date for articulating the limits of what is currently understood. We
therefore call our class PhP, the class of physically constructible
polynomial resource computers. This class PhP certainly contains P,
the class of problems computable in polynomial time by deterministic
Turing machines, and also, we believe, the class BPP of problems that
can be computed in polynomial time by classical randomization. The
question of whether the quantum class BQP [Bernstein and Vazirani
1997] is contained in PhP, we consider to be open, since there is no
practical proposal for realizing these machines for which scalability
has been demonstrated beyond reasonable doubt. In principle, PhP could
be smaller than, equal to, larger than, or incomparable with BQP.
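
For orientation, the known relations among these classes can be
summarized in one line. None of the inclusions below is known to be
strict, and the position of PhP within them is exactly what remains
open (the containment of BPP in BQP is due to Bernstein and Vazirani
[1997], and BQP is known to lie within PSPACE):

    \[
      \mathrm{P} \;\subseteq\; \mathrm{BPP} \;\subseteq\; \mathrm{BQP}
        \;\subseteq\; \mathrm{PSPACE},
      \qquad
      \mathrm{BPP} \;\subseteq\; \mathrm{PhP} \;\;\text{(believed)},
      \qquad
      \mathrm{BQP} \;\overset{?}{\subseteq}\; \mathrm{PhP}.
    \]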

Once we settle on a characterization of PhP from such a physics
viewpoint, it will remain to understand the scope of this class with
respect to its computing power, as compared, in particular, with the
known fundamental complexity classes such as NP, #P and PSPACE (see
Papadimitriou [1994] for definitions). Numerous significant
computational problems are complete for various of these classes, and
it is clearly important to resolve whether these are computable in
practice.
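
To make concrete what is at stake, the following minimal Python sketch
decides SAT, the canonical NP-complete problem [Cook 1971], by
exhaustive search. Its running time of 2^n on n variables is precisely
the kind of cost that a characterization of PhP would either show how
to circumvent or certify as unavoidable; the example formula at the
end is ours, for illustration only.

    from itertools import product

    def brute_force_sat(clauses, n_vars):
        """Decide satisfiability of a CNF formula by exhaustive search.

        clauses: list of clauses; each clause is a list of nonzero ints,
                 where literal k means variable k is true, -k false.
        n_vars:  number of variables, numbered 1..n_vars.

        Tries all 2**n_vars assignments, so the cost is exponential in
        the number of variables -- feasible only for small instances.
        """
        for bits in product([False, True], repeat=n_vars):
            assignment = {i + 1: b for i, b in enumerate(bits)}
            if all(any(assignment[abs(lit)] == (lit > 0) for lit in clause)
                   for clause in clauses):
                return True
        return False

    # (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
    print(brute_force_sat([[1, 2], [-1, 3], [-2, -3]], 3))  # True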

When all this has been done, and we have a full characterization of
PhP, then we should have a good understanding of which problems can be
computed in practice, and which cannot. What we are doing here is, of
course, simply folding together the question of which basic
computational processes physics allows with the several central
problems of complexity theory that have emerged since the discovery of
the NP-completeness phenomenon three decades ago [Cook 1971; Karp
1972; Levin 1973]. However, it is the interplay between these two
disparate modes of thinking, and the possibility of bringing these
together to phrase a single question, the full characterization of
PhP, that lends significance to both. Since a positive solution to
almost any component of it may give powerful new computational methods
for almost all domains of computing, this single question appears at
this time to be scientifically the most fundamental in computer
science.


2. Characterizing a Semantics for Cognitive Computation

The aim here is to identify a way of looking at and manipulating
commonsense knowledge that is consistent with and can support what we
consider to be the two most fundamental aspects of intelligent
cognitive behavior: the ability to learn from experience, and the
ability to reason from what has been learned. We are therefore seeking
a semantics of knowledge that can computationally support the basic
phenomena of intelligent behavior.

The force of this formulation is that we place at the fore the
question of semantics, and posit that, if we get this right, then
algorithms and architectures for intelligent systems will follow. The
formulation brings into focus the dilemma that while both learning and
reasoning have been widely studied within computer science, and are,
in some senses, well understood, they have been formalized with
irreconcilably different semantics. Formulations of learning appear to
require a statistical component, as in the PAC model, while those of
reasoning, such as the standard Tarski semantics of predicate
calculus, do not.
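
The statistical component shows up in even the simplest PAC learner.
The Python sketch below uses the textbook elimination algorithm for
monotone conjunctions; the target function, the uniform example
distribution, and the sample size are hypothetical choices made only
for illustration.

    import random

    def pac_learn_conjunction(sample):
        """Elimination algorithm for learning a monotone conjunction.

        sample: list of (x, label) pairs, where x is a tuple of booleans
                drawn from a fixed distribution and label is the value
                of an unknown monotone conjunction on x.

        Start from the conjunction of all variables and delete every
        variable that is false in some positive example. The guarantee
        is statistical: with enough random examples, the hypothesis
        errs with probability at most epsilon on the same distribution.
        """
        n = len(sample[0][0])
        relevant = set(range(n))
        for x, label in sample:
            if label:  # positive example: keep only variables set to True
                relevant &= {i for i in range(n) if x[i]}
        return lambda x: all(x[i] for i in relevant)

    # Hypothetical target: x0 AND x2, over 5 boolean variables.
    target = lambda x: x[0] and x[2]
    data = [(x, target(x))
            for x in (tuple(random.random() < 0.5 for _ in range(5))
                      for _ in range(200))]
    h = pac_learn_conjunction(data)
    print(all(h(x) == y for x, y in data))  # consistent on the sample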

The question of semantics is, of course, integral to much work in
artificial intelligence, though often only implicitly. Among the more
explicit proposals, the one that has been pursued the furthest is that
of adopting the semantics of the predicate calculus [McCarthy
1959]. There has been much discussion on whether this approach is
inherently limited, and much commentary on where it falls short. The
strength of formal logic can be expected to be most visible in domains
that lend themselves to axiomatization, where it ensures principled
deduction. In other domains, and particularly in areas related to
commonsense knowledge, its tendency to take a position in situations
that are uncertain or unknown appears to be a liability. Alternative
starting points that still offer principled deduction on the basis of
some form of axiomatization are probability theory (e.g., Carnap
[1950] and Pearl [1988]), and probabilistic logics (e.g., Halpern
[1990]).
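
A standard toy contrast (ours, not from the sources cited) makes the
difference concrete: the predicate calculus must commit to a
categorical axiom, while a probabilistic statement in the spirit of
Halpern [1990] attaches a degree of belief and so tolerates
exceptions; a single penguin falsifies the first formula below but
leaves the second intact.

    \[
      \forall x\,\bigl(\mathrm{Bird}(x) \rightarrow \mathrm{Flies}(x)\bigr)
      \qquad\text{versus}\qquad
      \Pr\bigl(\mathrm{Flies}(x) \mid \mathrm{Bird}(x)\bigr) \;\ge\; 0.9.
    \]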

The requirements for a semantics to be adequate for commonsense
knowledge appear to be quite subtle and not fully realized by the
above-mentioned systems. Some discussion can be found in Valiant
[2000]. One set of requirements is that it should support integrated
algorithms for learning and reasoning that are computationally
tractable and have some nontrivial scope. Another requirement is that
it has a principled way of ensuring that the knowledge base from which
reasoning is done is robust, in the sense that errors in the
deductions are at least controlled. The main suggestion made in that
paper is that the way to make reasoning robust enough to be viable in
a world as complex and ill-understood as this one is to have the
knowledge base constantly checked and updated against real world
experience. In turn, this leads to the idea that the semantics of PAC
learning may be an adequate semantics, since in that field performance
is measured by goodness of fit with real world experience. This
contrasts with the standard semantics of logic, where there is a
presumption that a precise formalization of the knowledge is
possible. In practice, human designers who attempt to axiomatize
knowledge in complex commonsense domains fail to foresee all the
necessary possibilities or eventualities, and the resulting reasoning
systems that use these knowledge bases turn out to be brittle.
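
The PAC criterion appealed to here has a standard quantitative form,
stated below for a finite hypothesis class H (this is the general
textbook formulation, not a claim specific to Valiant [2000]): a
hypothesis h learned from m examples drawn from a distribution D is
judged by its fit to fresh experience from the same D.

    % With probability at least 1 - \delta over the random m-sample S,
    % the learned hypothesis h_S errs on at most an \varepsilon
    % fraction of the distribution D generating experience:
    \[
      \Pr_{S \sim D^m}\Bigl[\Pr_{x \sim D}\bigl[h_S(x) \neq f(x)\bigr]
        \le \varepsilon\Bigr] \;\ge\; 1 - \delta,
    \]
    % which a learner that outputs any hypothesis consistent with the
    % sample achieves whenever
    \[
      m \;\ge\; \frac{1}{\varepsilon}
        \Bigl(\ln\lvert H\rvert + \ln\frac{1}{\delta}\Bigr).
    \]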

The technological consequence of success here may be the final
emergence of computer systems that can interact and cope with complex,
uncertain and incompletely understood environments much as humans
appear to be able to do.

3. Characterizing Cortical Computation

The aim here is to describe how knowledge is represented in the brain
and what the algorithms are for computing the most basic behavioral
tasks. The first challenge is to obtain explicit computational
accounts of such basic functions as memorizing a scene, recalling an
event, deducing a consequence of two facts, and simple supervised and
unsupervised learning. Only after this has been done need one proceed
to more complex phenomena.

In order to achieve this, one first needs a model of computation that
describes the essential capabilities and limitations of the brain for
computing such functions.  Models of the activity of single cells are
probably too low-level, just as in a digital computer the bit level is
not right for describing the corresponding level of
functionality. However, some biological facts about single cells may
be crucial in determining what the right high-level model is. In our
study [Valiant 1994], we concluded that the following biological
questions may be particularly informative in guiding one to the right
model: the strength of synapses, the accuracy of timing mechanisms,
the existence of state not just at synapses but also globally in a
cell, and the numerical parameters of cell interconnectivity. It was
shown there that sufficiently positive answers to these four questions
are enough to support the set of basic tasks, such as memorization,
that we listed above. In the absence of such positive answers, it
appears much more difficult, we believe, to account for such a breadth
of cognitive phenomena. Mathematical proofs of impossibility are,
however, lacking and clearly offer a possible avenue for future
enquiry.
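
The flavor of that analysis can be conveyed by one back-of-the-envelope
computation concerning the fourth question, interconnectivity. If two
items are each represented by r neurons in a network of n neurons in
which each directed connection is present independently with
probability p, then the number of neurons receiving synapses from both
representations bounds what an operation such as memorization can
recruit. The sketch below is our illustration, and the parameter
values are hypothetical, chosen only to show the orders of magnitude:

    def expected_common_targets(n, p, r):
        """Expected number of neurons receiving synapses from both of
        two disjoint sets of r neurons, in a random network of n
        neurons with independent connection probability p.

        A neuron is hit by a set iff at least one of its r members
        connects to it, an event of probability 1 - (1 - p)**r; the
        two disjoint sets hit it independently.
        """
        hit = 1.0 - (1.0 - p) ** r
        return n * hit * hit

    # Hypothetical parameters: n neurons, connection probability p,
    # items represented by r neurons each.
    n, p, r = 10**9, 10**-5, 10**3
    print(expected_common_targets(n, p, r))  # roughly n * (r*p)**2 ~ 1e5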

On the biology side, it would appear that questions about cortical
cells, such as the four mentioned above, are resolvable by experiment
in the foreseeable future.  Consensus can then emerge on the needed
high-level model of computation. Some facts about the brain, such as
the stylized nature of the spikes in the long range axons, and the
apparently huge increase in size of the brain as humans evolved in the
last several hundred thousand years, suggest that the computations may
share some simple underlying algorithmic processes, and hence that
this overall endeavor of understanding the brain through its basic
computational capabilities might be viable.

Cortical computation may well turn out to be the science that speaks
to us most directly about our experience as humans.

REFERENCES

BERNSTEIN, E., AND VAZIRANI, U. V. 1997. Quantum complexity theory.
SIAM J. Comput., 26, 1411--1473.

CARNAP, R. 1950. Logical Foundations of Probability. University of
Chicago Press, Chicago, Ill.

COOK, S. A. 1971. The complexity of theorem proving procedures. In
Proceedings of the 3rd Annual ACM Symposium on Theory of Computing.
ACM, New York, 151--158.

HALPERN, J. Y. 1990. An analysis of first-order logics of
probability. Artif. Int., 46, 311--350.

KARP, R. M. 1972. Reducibility among combinatorial problems. In
Complexity of Computer Computations, R. E. Miller and J. W. Thatcher,
Eds. Plenum Press, New York, 85--103.

LEVIN, L. A. 1973. Universal sequential search problems. Prob. Inf.
Trans., 9, 265--266.

MCCARTHY, J. 1959. Programs with common sense. In Proceedings of the
Teddington Conference on the Mechanization of Thought Processes. HMSO,
London, England.

PAPADIMITRIOU, C. H. 1994. Computational Complexity. Addison-Wesley,
Reading, Mass.

PEARL, J. 1988. Probabilistic Reasoning in Intelligent Systems:
Networks of Plausible Inference. Morgan-Kaufmann, Los Altos, Calif.

TURING, A. M. 1936. On computable numbers, with an application to the
Entscheidungsproblem. Proc. Lond. Math. Soc. 42, 230--265. Also 43,
(1937), 544--546.

VALIANT, L. G. 1994. Circuits of the Mind. Oxford University Press,
New York.

VALIANT, L. G. 2000. Robust logics. Artif. Int., 117, 231--253.

Journal of the ACM, Vol. 50, No. 1, January 2003.


--  Author: eyounx
--  Posted: 3/15/2005 8:08:00 PM

--  
They all look like the hardest ones, huh.