Friday, September 29, 2006

Open Access In America

When Senators John Cornyn (R-TX) and Joseph Lieberman (D-CT) introduced the Federal Research Public Access Act of 2006 this spring, many scientists had a warm fuzzy feeling: The bill would require any published paper drawing on research funded by a major US government agency to be put online within six months, enabling anyone with Internet access to obtain the latest scientific research.

The bill hasn't yet passed, but a look at the possible consequences (and at the state of the open access movement in general) is here.

Wednesday, September 27, 2006

Physics Processing Units, Game AI and AI Cards

In short, this article states that Intel (like AMD) is developing standards and interfaces to allow other companies to build accelerator cards that can interact very closely with the CPU.  This new technique will "improve the ubiquitous PCI technology to accommodate a wide variety of accelerator chips".  Now I'm wondering: in recent years more and more functions that used to be done by cards were integrated into the mainboard or even emulated by the CPU (think of modems or network cards, for example).  Why now the movement toward more specialized processing hardware? Will it be common for computers to have a card dedicated to physics simulation (a PPU, or Physics Processing Unit) in the near future? A game-AI card in two years? A real AI card in five years? And what would a real AI card look like - what kind of functions would it perform? Some analog computations? Very fast similarity-based retrieval?

List of 175 Semantic Web Tools

Can be found on the blog of Michael K. Bergman.

Semantic Web tool sets span from comprehensive engineering environments to specific converters and editors and the like. The entire workflow extends from getting the initial content, annotating or tagging it according to existing or built ontologies, reconciling heterogeneities, and then storing and managing the RDF or OWL with subsequent querying and inferencing.

Thus, listed below, is today's current, most comprehensive list of 175 semantic Web software tools and applications. I am now further characterizing these offline as to open source vs. proprietary and categorizing them according to SW-related workflow. I may later post those expansions.

The entire list is here.
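To make the storage-and-query end of that workflow concrete, here is a toy sketch: facts held as RDF-style (subject, predicate, object) triples plus a minimal pattern query. A real tool from the list would use an RDF library and SPARQL; all names and triples below are invented for illustration.

```python
# Toy triple store: RDF-style facts plus a minimal pattern query.
# Real Semantic Web tools would use an RDF library and SPARQL;
# the triples here are made up for the sketch.

def query(triples, pattern):
    """Return all triples matching a pattern; None acts as a wildcard."""
    return [t for t in triples
            if all(p is None or p == v for p, v in zip(pattern, t))]

triples = [
    ("SemanticWeb", "consistsOf", "RDF"),
    ("SemanticWeb", "consistsOf", "OWL"),
    ("RDF", "type", "DataModel"),
]

# Everything the Semantic Web consists of:
matches = query(triples, ("SemanticWeb", "consistsOf", None))
```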

Parallel Programming

In case you were wondering how far Intel is planning to go with the number of processing cores per CPU (from news.com):

SAN FRANCISCO--Intel has built a prototype of a processor with 80 cores that can perform a trillion floating-point operations per second.

CEO Paul Otellini held up a silicon wafer with the prototype chips before several thousand attendees at the Intel Developer Forum here Tuesday. The chips are capable of exchanging data at a terabyte a second, Otellini said during a keynote speech. The company hopes to have these chips ready for commercial production within a five-year window.

This is just another reminder that anyone designing computation-intensive algorithms needs to consider whether they can be parallelized. In the coming years we will not see big speed increases for algorithms that can only run on one CPU core (unlike in the past decades).
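A minimal sketch of what "parallelizable" means in practice, using Python's standard multiprocessing module: the work must split into independent chunks that separate cores can process without talking to each other. The prime-counting workload here is just a stand-in for any compute-heavy task.

```python
# Splitting a compute-heavy task across CPU cores with the standard
# multiprocessing module. The workload (counting primes) is a stand-in;
# the point is that the work decomposes into independent chunks.
from multiprocessing import Pool

def count_primes(bounds):
    """Count primes in the half-open range [lo, hi)."""
    lo, hi = bounds
    def is_prime(n):
        return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))
    return sum(1 for n in range(lo, hi) if is_prime(n))

def parallel_count(limit, workers=4):
    """Divide [0, limit) into chunks and count primes on several cores."""
    step = limit // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    chunks[-1] = (chunks[-1][0], limit)  # cover any remainder
    with Pool(workers) as pool:
        return sum(pool.map(count_primes, chunks))

if __name__ == "__main__":
    print(parallel_count(100_000))
```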

Thursday, September 21, 2006

Exploiting Usage Data for the Visualization of Rule Bases

For a change, something about my own work.  The abstract of one of our recent papers (or actually posters - the reviewers downgraded it to one). The authors are Valentin Zacharias and Imen Borgi.

Recent developments have made it clear that the Semantic Web will not only consist of RDF data and OWL ontologies but also of knowledge formulated as rules, hence the visualization of rules will also be an important part of user interfaces for the Semantic Web.
In this paper we describe novel ideas and their prototypical implementation for the visualization of rule bases. In the creation of the visualization our approach considers not only the structure of a rule base but also records of its usage. We also discuss the challenges for visualization algorithms posed by rule bases created with high level knowledge acquisition tools. We describe the methods we employ to deal with these challenges.

I believe that what we describe is indeed a new and important idea for the visualization of rule bases. Most of the negative remarks from the reviewers stem from flaws in the way the paper was written (as opposed to flaws in the work itself) and from the fact that the work hasn't been evaluated yet.  I'll continue to work in this direction.

I will present it at the SWUI (Semantic Web User Interaction) workshop at the ISWC.  You can read the entire paper as well as the other submissions to the workshop online.  There you can also find a paper on the Tabulator tool from TBL and an interesting - albeit a bit abstract and inexact - article on Explanations for the Semantic Web.

Wednesday, September 20, 2006

Image Recognition

I think that image recognition, and image processing more broadly, are the next low-hanging fruit for AI and computer science in general. I really think that the next amazing mainstream applications will come out of this area.

Just have a look at the following recent developments: The most amazing is probably Microsoft's Photosynth, an application that constructs a 3D model from many snapshots of the same scene.  Or research into an application that can automatically make a picture of a person more beautiful [German]. In a related development, HP is already offering a tool that can automatically make you look slimmer.  Newer Canon cameras already automatically identify faces in a picture (and then focus on them). Google recently bought a company that focuses on face recognition to make your image collection searchable by person. And finally, on the Riya website (the pioneers of face recognition for personal use) we can see what a next-generation visual search engine could look like.

Tuesday, September 19, 2006

The Holy Grail For Game-Playing Computers

Now that computers have mastered chess, that's playing Go. Wired has an interesting article about a recent improvement in Go-playing computer programs and an interview with the researcher who had the idea.

It is commonly believed that humans play Go using mostly complex pattern recognition - so I was somewhat disappointed that the recent approaches described did not go down this path but rather use a brute-force approach similar to the one used for chess. The innovation is mostly a new way to evaluate board positions using a Monte Carlo algorithm.
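The Monte Carlo idea itself is simple: instead of a hand-crafted evaluation function, score a position by playing many random games to completion and averaging the outcomes. Here is a sketch of that idea on a trivial stand-in game (a Nim-like pile, not Go - everything below is invented for illustration):

```python
# Monte Carlo position evaluation on a toy game: players alternately take
# 1-3 stones from a pile; whoever takes the last stone wins. A position's
# value is estimated as the win rate over many random playouts.
import random

def random_playout(pile, my_turn, rng):
    """Play random moves until the pile is empty; 1 if 'we' win, else 0."""
    while pile > 0:
        pile -= rng.randint(1, min(3, pile))
        if pile == 0:
            return 1 if my_turn else 0
        my_turn = not my_turn
    return 0

def monte_carlo_value(pile, playouts=2000, seed=0):
    """Estimated probability that the player to move wins this position."""
    rng = random.Random(seed)
    wins = sum(random_playout(pile, True, rng) for _ in range(playouts))
    return wins / playouts
```

With one stone left the mover always wins (value 1.0); with four stones the random-play win rate settles near 1/3. For Go the playouts are random Go games and the positions are board states, but the evaluation principle is the same.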

Monday, September 18, 2006

Loebner Prize 2006

Icogno scooped the 2006 Loebner Prize Bronze Medal after judges decided that its AI called Joan was the "most human computer program".

The competition is based on the Turing test, which suggests computers could be seen as "intelligent" if their chat was indistinguishable from humans.

The gold medal, which goes to an AI that fools the judges, is unclaimed.

(From this BBC article)

What the article fails to mention is that the Loebner Prize is not generally accepted as an AI prize. The setup of the competition encourages people to build systems that try to fool the judges without any real reasoning: systems that can send responses to chat messages but that otherwise have nothing to do with human intelligence.
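To illustrate the criticism, here is a toy responder in the ELIZA tradition: it produces plausible-sounding replies purely by keyword pattern matching, with no reasoning behind them. All patterns and templates below are invented for the sketch.

```python
# ELIZA-style chatbot sketch: canned reply templates triggered by regex
# keyword matches. There is no understanding here, only pattern matching.
import re

RULES = [
    (re.compile(r"\bI am (.+)", re.I), "Why do you say you are {0}?"),
    (re.compile(r"\bI feel (.+)", re.I), "How long have you felt {0}?"),
    (re.compile(r"\bbecause\b", re.I), "Is that the real reason?"),
]
FALLBACK = "Please tell me more."

def reply(message):
    """Return the first matching rule's reply, or a generic fallback."""
    for pattern, template in RULES:
        m = pattern.search(message)
        if m:
            return template.format(*m.groups())
    return FALLBACK
```

A handful of such rules already yields surprisingly human-seeming exchanges - which is exactly why a fooled judge proves so little.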

Saturday, September 16, 2006

Computer Judge

Xinhua:

A court in China has used a software program to help decide prison sentences in more than 1,500 criminal cases, a Hong Kong newspaper said Wednesday.

Judges enter details of a case and the system produces a sentence, the paper said.

    "The software can avoid abuse of discretionary power of judges as a result of corruption or insufficient training," the paper quoted Zichuan District People's Court chief judge, Wang Hongmei, as saying.

The article leaves many questions open, but I liked the idea that they are using an expert system for its impartiality (alright, "expert system" is not a good term for systems like this - nobody has yet figured out how to replicate expert reasoning in a computer. Maybe "clerk system"?).
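A "clerk system" of the kind the article hints at could be little more than a fixed table of rules: case details go in, a sentence comes out, identically for every defendant. Every offence, factor, and number below is invented purely for illustration - the article gives no details of the actual system.

```python
# Toy table-driven sentencing sketch. All offences, base sentences, and
# adjustment factors are invented for illustration; the point is only
# that fixed rules applied uniformly leave no room for discretion.

BASE_MONTHS = {"theft": 12, "fraud": 24}  # invented base sentences

def sentence(offence, prior_convictions=0, confessed=False):
    """Compute a sentence in months from case details by fixed rules."""
    months = BASE_MONTHS[offence]
    months += 6 * prior_convictions       # invented aggravating factor
    if confessed:
        months = int(months * 0.75)       # invented mitigating factor
    return months
```

Impartial, yes - but only as fair as the table itself.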

Friday, September 15, 2006

Another Expert System On The Web?

The Product Chiron™ is a proven Artificial Intelligence like ("AI") tool, able to create the very realistic illusion that a client is actually interacting with another human. MCF's Chiron™ technology is the next generation in the convergence of Internet and telephony customer interaction. It can be used for help, learning, entertainment or education.

MCF III provides front-end solutions for companies who want the personal touch in responding to their customers' needs. Using real world testing the Chiron systems can address many applications where no satisfactory solution currently exists. Relative to present generations of AI technology, MCF's new generation with the "human touch" of interaction is significantly more proactive. Other AI technology is only reactive. When you ask it a question, it gives you only a single answer and comes to a grinding halt.

MCFIII is the most powerful "Artificial Intelligence like" technology publicly available for any price. Our patent pending technology has over nine years of full time development. It is a proven technology that has been in real world use around the globe for one and a half years. We have had over 100,000 users worldwide; they have helped us prove, polish and improve Chiron.
Hmm, here's more ...

Tuesday, September 12, 2006

Open, Web-Based Expert System

Exists already: have a look at the Law Underground, an expert system for legal advice. It's maintained by volunteer lawyers and law students.

It uses very simple rules, similar to those used in early expert systems. In fact the rules are even simpler than those used by Mycin (because they don't utilize any numeric certainty factors). A sample rule is:

CONCLUDE THAT The person has standing to object to the legality of a search

IF

1. The person owns or has a right to possess the place searched OR
2. The place searched is the person's home OR
3. The person is an overnight guest at the place searched

Rules are entered in textual form and versioned, the rule base can be navigated along the "depends on" links, and there is always a simple way to test rules.
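A rule like the sample above is easy to evaluate mechanically: a conclusion holds if it is a known fact or if any of its OR-connected conditions holds. Here is a minimal sketch of such an interpreter - the rule text follows the sample above, but the condition names and the tiny engine are my own invention, far simpler than what a real system would use.

```python
# Minimal rule interpreter sketch for OR-connected conditions, encoding
# the "standing to object to a search" sample rule. Condition names are
# invented; a real engine would handle AND/OR nesting, variables, etc.

LEGAL_RULES = {
    # CONCLUDE goal IF condition1 OR condition2 OR condition3
    "has_standing": [
        "owns_or_may_possess_place",
        "place_is_persons_home",
        "person_is_overnight_guest",
    ],
}

def concludes(goal, facts):
    """True if the goal is a known fact or any condition of its rule holds."""
    if goal in facts:
        return True
    return any(concludes(cond, facts) for cond in LEGAL_RULES.get(goal, []))
```

Because `concludes` recurses on conditions, conclusions of one rule can serve as conditions of another - which is exactly the "depends on" navigation the site offers.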

Monday, September 11, 2006

Bursting Tech Bubbles Before They Balloon

Interesting IEEE Spectrum article on the technology trends for the next 50 years:
The survey identified five themes that we believe are the main arteries of science and technology over the next 50 years: “Computation and Bandwidth to Burn” involves the shift of computing power and network connectivity from scarcity to utter abundance; “Sensory Transformation” hints at what happens when, as Neil Gershenfeld, director of MIT’s Center for Bits and Atoms, puts it, “things start to think”; “Lightweight Infrastructure” is precisely the opposite of the railways, fiber-optic networks, centralized power distribution, and other massively expensive and complicated projects of the 20th century; “Small World” is what happens when nanotechnology starts to get real and is integrated with microelectromechanical systems (MEMS) and biosystems; and finally, “Extending Biology” is what results when a broad array of technologies, from genetic engineering to bioinformatics, are applied to create new life forms and reshape existing ones.

Thursday, September 07, 2006

Dedicated AI-Card

A new company called AIseek announced what it describes as the world's first dedicated processor for artificial intelligence. Called the Intia Processor, the AI chip would work in conjunction with optimized titles to improve nonplayer character AI. Similar to the way in which physics accelerators can make a game's environment look much more realistic, Intia would make the NPCs act more true to life.
More.

It's obviously not the first dedicated AI processor - but still a great idea. It will be a while until we can get these things for our computers, though: it still seems to be at a very early stage.