Archive for ‘Technology’

January 31, 2013

P versus NP

P versus NP is the name of a question that many mathematicians, scientists, and computer programmers want to answer. P and NP are two classes of mathematical problems. P problems are considered ‘easy’ for computers to solve. NP problems are those whose proposed answers are easy for a computer to check, even though finding an answer in the first place may be hard.

For example, if you have an NP problem, and someone says ‘The answer to your problem is 12345,’ a computer can quickly figure out if the answer is right or wrong, but it may take a very long time for the computer to come up with ‘12345’ on its own. All P problems are NP problems, because it is easy to check that a solution is correct by solving the problem and comparing the two solutions. However, people want to know about the opposite: Are there any NP problems that are not P problems, or are all NP problems just P problems?
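
A small, hypothetical illustration of the difference (not from the original post; the function and variable names are invented for the sketch): for the subset-sum problem, checking a proposed answer takes one quick pass, while finding an answer by brute force may mean trying every possible subset.

```python
from itertools import combinations

def verify(numbers, target, candidate):
    # Checking a proposed answer is fast: one membership test and one sum.
    # (Assumes the numbers are distinct, to keep the sketch simple.)
    return set(candidate) <= set(numbers) and sum(candidate) == target

def solve_brute_force(numbers, target):
    # Finding an answer may mean trying every subset: 2**len(numbers) of them.
    for r in range(len(numbers) + 1):
        for subset in combinations(numbers, r):
            if sum(subset) == target:
                return subset
    return None

numbers = [3, 34, 4, 12, 5, 2]
print(verify(numbers, 9, (4, 5)))     # True, confirmed almost instantly
print(solve_brute_force(numbers, 9))  # (4, 5), but only after searching many subsets
```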

January 30, 2013

Susan P. Crawford

Susan P. Crawford (b. 1963) is a professor at the Benjamin N. Cardozo School of Law. She served as President Barack Obama’s Special Assistant for Science, Technology, and Innovation Policy (2009) and is a columnist for ‘Bloomberg View.’

She is a former Board Member of ICANN (which coordinates the Internet’s domain name system), the founder of OneWebDay (an annual day of Internet celebration and awareness held on September 22), and a legal scholar. Her research focuses on telecommunications and information law.

January 28, 2013

Centre for the Study of Existential Risk

The Centre for the Study of Existential Risk (CSER) is a proposed research centre at the University of Cambridge, intended to study possible catastrophic threats posed by present or future technology. The co-founders of the project to establish the centre are Huw Price (a philosophy professor at Cambridge), Martin Rees (a cosmology and astrophysics professor and former President of the Royal Society), and Jaan Tallinn (a computer programmer and co-founder of Skype).

Among the risks to be studied by the proposed centre are those that might arise from developments in artificial intelligence, a risk likened in some press coverage to that of a robot uprising à la ‘The Terminator.’ Speaking about this case, Professor Price said, ‘It seems a reasonable prediction that some time in this or the next century intelligence will escape from the constraints of biology.’ He added that when this happens ‘we’re no longer the smartest things around,’ and will risk being at the mercy of ‘machines that are not malicious, but machines whose interests don’t include us.’

January 28, 2013

Security Through Obscurity

Security through obscurity is a pejorative term for a principle in security engineering that attempts to use secrecy of design or implementation to provide security. A system relying on security through obscurity may have theoretical or actual security vulnerabilities, but its owners or designers believe that if the flaws are not known, then attackers will be unlikely to find them. The technique stands in contrast with security by design and open security, although many real-world projects include elements of several strategies.

Security through obscurity has never achieved engineering acceptance as an approach to securing a system, as it contradicts the principle of ‘keeping it simple.’ The United States National Institute of Standards and Technology (NIST) specifically recommends against security through obscurity in more than one document. Quoting from one, ‘System security should not depend on the secrecy of the implementation or its components.’
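
As a minimal, hypothetical sketch of the contrast (the paths, key, and handler names below are invented for illustration): an unpublished admin URL is obscurity, because anyone who learns the URL gets in, whereas a design based on a verified credential stays secure even if the URL is public knowledge.

```python
import hmac
import hashlib

# Security through obscurity: the only 'protection' is that the path is unpublished.
HIDDEN_ADMIN_PATH = "/x9f3-admin"   # hypothetical; anyone who learns it gets in

def handle_obscure(path):
    if path == HIDDEN_ADMIN_PATH:
        return "admin panel"        # no credential check at all
    return "404 Not Found"

# Security by design: access depends on a verified credential, not on a secret URL.
SECRET_KEY = b"server-side-key"     # hypothetical key, stored only on the server

def sign(user):
    return hmac.new(SECRET_KEY, user.encode(), hashlib.sha256).hexdigest()

def handle_authenticated(path, user, token):
    if path == "/admin" and hmac.compare_digest(token, sign(user)):
        return "admin panel"
    return "403 Forbidden"

print(handle_obscure("/x9f3-admin"))                           # granted to anyone who finds the path
print(handle_authenticated("/admin", "alice", sign("alice")))  # granted only with a valid token
```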

January 23, 2013

Second Variety

‘Second Variety’ is an influential short story by Philip K. Dick, first published in ‘Space Science Fiction’ magazine in 1953. It is set in a world where a nuclear war between the Soviet Union and the West has reduced much of the planet to a barren wasteland.

The war, however, continues among the scattered remnants of humanity. The Western forces have recently developed ‘claws’: autonomous, self-replicating robots that fight on their side. It is one of Dick’s many stories in which nuclear war has rendered the Earth’s surface uninhabitable. The story was adapted into the 1995 film ‘Screamers.’

January 23, 2013

The Cathedral and the Bazaar

‘The Cathedral and the Bazaar: Musings on Linux and Open Source by an Accidental Revolutionary’ is an essay by Eric S. Raymond on software engineering methods, based on his observations of the Linux kernel development process and his experience managing an open source project, fetchmail. It examines the struggle between top-down and bottom-up design. It was first presented by the author at the Linux Kongress in Germany in 1997 and was published as part of a book of the same name in 1999.

The essay contrasts two different free software development models: the Cathedral model, in which source code is available with each software release but code developed between releases is restricted to an exclusive group of software developers, and the Bazaar model, in which the code is developed over the Internet in view of the public. Raymond credits Linus Torvalds, leader of the Linux kernel project, as the inventor of this process. Raymond also provides anecdotal accounts of his own use of this model for the fetchmail project.

January 23, 2013

Neats vs. Scruffies

Neat and scruffy are labels for two different approaches to artificial intelligence research. ‘Neats’ hold that solutions should be elegant, clear, and provably correct. ‘Scruffies’ believe that intelligence is too complicated (or computationally intractable) to be captured by the sorts of homogeneous systems that such neat requirements usually mandate. Much of AI’s success has come from combining the two approaches: for example, many cognitive models matching human psychological data have been built in the cognitive architectures Soar and ACT-R.

Both of these systems have formal representations and execution systems, but the rules put into the systems to create the models are generated ad hoc. The distinction was originally made by AI theorist Roger Schank in the mid-1970s to distinguish his own work on natural language processing (which represented commonsense knowledge in the form of large, amorphous semantic networks) from the work of John McCarthy, Allen Newell, Herbert A. Simon, Robert Kowalski, and others, whose work was based on logic and formal extensions of logic.

January 23, 2013

Moravec’s Paradox

Moravec’s paradox is the discovery by artificial intelligence and robotics researchers that, contrary to traditional assumptions, high-level reasoning requires very little computation, but low-level sensorimotor skills require enormous computational resources.

The principle was articulated by Hans Moravec, Rodney Brooks, Marvin Minsky and others in the 1980s. As Moravec writes, ‘it is comparatively easy to make computers exhibit adult level performance on intelligence tests or playing checkers, and difficult or impossible to give them the skills of a one-year-old when it comes to perception and mobility.’

January 23, 2013

AI Effect

The AI effect occurs when onlookers discount the behavior of an artificial intelligence program by arguing that it is not real intelligence. Technology journalist Pamela McCorduck writes: ‘It’s part of the history of the field of artificial intelligence that every time somebody figured out how to make a computer do something—play good checkers, solve simple but relatively informal problems—there was a chorus of critics to say, “that’s not thinking.”’

AI researcher Rodney Brooks complains: ‘Every time we figure out a piece of it, it stops being magical; we say, “Oh, that’s just a computation.”’ As soon as AI successfully solves a problem, the problem is no longer considered a part of AI. McCorduck calls it an ‘odd paradox’ that ‘practical AI successes, computational programs that actually achieved intelligent behavior, were soon assimilated into whatever application domain they were found to be useful in, and became silent partners alongside other problem-solving approaches, which left AI researchers to deal only with the “failures,” the tough nuts that couldn’t yet be cracked.’

January 21, 2013

JSTOR

JSTOR (short for Journal Storage) is a digital library founded in 1995. Originally containing digitized back issues of academic journals, it now also includes books and primary sources, and current issues of journals. It provides full-text searches of more than a thousand journals. More than 7,000 institutions in more than 150 countries have access to JSTOR. Most access is by subscription, but some old public domain content is freely available to anyone, and in 2012 JSTOR launched a program providing limited no-cost access to old articles for individual scholars and researchers who register.

JSTOR’s founder was William G. Bowen, president of Princeton University from 1972 to 1988. JSTOR was originally conceived as a solution to a problem faced by libraries, especially research and university libraries: the ever-increasing number of academic journals in existence. Most libraries found it prohibitively expensive in terms of cost and space to maintain a comprehensive collection of journals. By digitizing many journal titles, JSTOR allowed libraries to outsource the storage of these journals with the confidence that they would remain available for the long term. Online access and full-text search ability improved access dramatically.

January 10, 2013

Extropianism

Extropianism [eks-tro-pee-ahn-iz-uhm], also referred to as the philosophy of Extropy, is an evolving framework of values and standards for continuously improving the human condition. Extropians believe that advances in science and technology will some day let people live indefinitely.

An extropian may wish to contribute to this goal, e.g. by doing research and development or volunteering to test new technology. Extropianism describes a pragmatic consilience of transhumanist thought guided by a proactionary approach to human evolution and progress.

January 10, 2013

User Illusion

The user illusion is the illusion created for the user by a human-computer interface, for example the visual metaphor of a desktop used in many graphical user interfaces. The phrase originated at Xerox PARC. Some philosophers of mind have argued that consciousness is a form of user illusion. This notion is explored by Danish popular science author Tor Nørretranders in his 1991 book ‘Mærk verden,’ issued in a 1998 English edition as ‘The User Illusion: Cutting Consciousness Down to Size.’

He introduced the notion of exformation (explicitly discarded information) in this book. According to this picture, our experience of the world is not immediate, as all sensation requires processing time. It follows that our conscious experience is less a perfect reflection of what is occurring and more a simulation produced unconsciously by the brain. Therefore, there may be phenomena that lie beyond the periphery of our awareness, beyond what consciousness could isolate or reduce.