%------------------------------------------------------------------------------%
 ILP Newsletter Volume 2, Number 4, 10th December 1995
%------------------------------------------------------------------------------%
 Editors: Saso Dzeroski and Nada Lavrac, Jozef Stefan Institute, Ljubljana, SI
%------------------------------------------------------------------------------%
 Address all communication related to the ILP Newsletter to ilpnet@ijs.si
 To subscribe/unsubscribe send email with subject SUBSCRIBE/UNSUBSCRIBE ILPNEWS
 Send contributions in messages with subject heading ILPNEWS CONTRIBUTION
 Send comments and suggestions under subject heading ILPNEWS COMMENTS
 Back issues of the Newsletter and other information about ILPNET and ILP
 available via the World Wide Web (WWW), URL http://www-ai.ijs.si/ilpnet.html
%------------------------------------------------------------------------------%

Contents:
- Abstracts of PhD theses related to ILP
  * Gunther Sablon: Iterative versionspaces with an application in ILP
  * Patrick van der Laag: An analysis of refinement operators in ILP
  * Aram Karalic: First order regression
  * Saso Dzeroski: Numerical constraints and learnability in ILP
- AI and Mathematics, International Symposium 96 (Program)
- International Conference on Machine Learning 96 (Call for papers)
- European Conference on Artificial Intelligence 96 (Call for papers)
- Ninth Conference on Computational Learning Theory 96 (Call for papers)

%------------------------------------------------------------------------------%
%------------------------------------------------------------------------------%

Abstracts of PhD theses related to ILP

In this issue we include the abstracts of four ILP-related PhD theses
defended this year. If you have recently defended a thesis on an ILP-related
topic, please send us an ASCII abstract. If you have recently supervised such
a thesis, please encourage your student to send us an ASCII abstract.
%------------------------------------------------------------------------------%

Author: Gunther Sablon
PhD Thesis Title: Iterative versionspaces with an application in inductive
                  logic programming
University: Katholieke Universiteit Leuven, Belgium
Year: 1995

Abstract:
This thesis consists of two parts: in the first part we develop a
language-independent framework for efficiently solving concept learning
problems; in the second part we apply this framework to Inductive Logic
Programming.

We view concept learning as a search problem. In the well-known framework of
Versionspaces, a bi-directional depth-first search algorithm that learns a
maximally general and a maximally specific concept representation is
presented, and contrasted to the breadth-first approach of Mellish's
Description Identification algorithm. In this context, we identify redundant
information elements, in order to reduce the memory needed for storing
information elements. We describe how automatically generated information
elements can replace less informative ones. Next, we extend this framework
to describe the more complex versionspaces that originate from introducing
disjunctions. To be practically useful, we gradually restrict these
disjunctive versionspaces by imposing preference criteria based on notions
of minimality. This leads to extensions of the non-disjunctive algorithms to
the disjunctive case.

In the second part of the thesis we show in detail how this general
framework can be instantiated to Inductive Logic Programming. In this
respect we also discuss the integration of machine learning in a planning
system based on Horn clause logic. This illustrates that the use of a
logical representation allows a smooth integration of Machine Learning and
Problem Solving.
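The search between a maximally specific and a maximally general bound can be
illustrated with the classic candidate-elimination scheme that versionspace
algorithms build on. The sketch below is the editors' own illustration for
conjunctive attribute-value concepts, with a single specific bound `s` and a
list `g` of maximally general hypotheses; it is not Sablon's algorithm, whose
language-independent framework is considerably more general.

```python
# Illustrative versionspace sketch (Mitchell-style candidate elimination),
# not the thesis algorithm. Hypotheses are attribute tuples; "?" matches
# any value. Positive examples generalize s; negative examples specialize g.

WILD = "?"

def covers(h, x):
    """A hypothesis covers an example iff every fixed attribute matches."""
    return all(hv == WILD or hv == xv for hv, xv in zip(h, x))

def generalize(s, x):
    """Minimally generalize the specific bound to cover a positive example."""
    return tuple(sv if sv == xv else WILD for sv, xv in zip(s, x))

def specialize(g, s, x):
    """Minimally specialize a general hypothesis to exclude a negative
    example, staying more general than the specific bound s."""
    out = []
    for i, gv in enumerate(g):
        if gv == WILD and s[i] != WILD and s[i] != x[i]:
            h = list(g)
            h[i] = s[i]
            out.append(tuple(h))
    return out

def version_space(examples):
    """examples: list of (attribute-tuple, is_positive). Returns (s, g)."""
    s = next(x for x, pos in examples if pos)   # specific bound
    g = [tuple(WILD for _ in s)]                # general bound
    for x, pos in examples:
        if pos:
            s = generalize(s, x)
            g = [h for h in g if covers(h, x)]
        else:
            g = ([h2 for h in g if covers(h, x)
                       for h2 in specialize(h, s, x)]
                 + [h for h in g if not covers(h, x)])
    return s, g
```

On a toy two-attribute task the bounds converge to a single concept; the
disjunctive and memory-reducing techniques of the thesis extend this basic
picture.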
In summary, the thesis contributes to the understanding and the development
of search algorithms for concept learning in general, by developing a
language-independent framework, and by introducing several novel and
generally applicable concept learning techniques. The application in the
second part of the thesis shows that the framework is practically useful,
and that it contributes to the field of Inductive Logic Programming.

%------------------------------------------------------------------------------%

Author: Patrick van der Laag
Title: An analysis of refinement operators in inductive logic programming
University: Erasmus Universiteit, Rotterdam, the Netherlands
Year: 1995

Abstract:
This thesis develops a classification theory for search operators employed
in incremental learning systems. It characterizes specialization and
generalization operators in terms of their effectiveness and efficiency, and
investigates the problems of combining these desirable properties. Applying
this theory to the relational, first-order representations of Inductive
Logic Programming (ILP), it provides positive and negative results
concerning effective and efficient learning in practical ILP systems.

%------------------------------------------------------------------------------%

Author: Aram Karalic
PhD Thesis Title: First order regression
University: University of Ljubljana, Slovenia
Year: 1995

Abstract:
A system named FORS (First Order Regression System) is developed, capable of
inducing numerical concepts in first order logic. FORS utilizes some useful
ILP concepts (such as background knowledge), alleviates some shortcomings of
the existing propositional regression systems, and integrates some of their
advantages.
Our goals were to develop a system which can: induce first-order logic
concepts which incorporate continuous variables; make use of background
knowledge in intensional form; model dynamic systems (learn from time
series); partition the attribute space into subspaces and find a submodel
for each subspace; and handle noisy data. FORS was successfully applied in
several synthetic and real-world domains where these capabilities proved
necessary and useful.

FORS constructs a model in the form of a Prolog program. A covering
approach, similar to that of FOIL, is used. The clause-building part of the
algorithm uses a top-down approach: it starts with the most general
candidate clause, covering the entire example set, and then specializes the
clause by adding literals. Clause construction uses beam search to guide the
algorithm through the space of possible clauses. As part of the system,
pruning based on the Minimum Description Length (MDL) principle was
developed that can also handle continuous variables. It turned out that MDL
pruning helps to build more comprehensible models while preserving the
models' predictive power.

In experiments with physical domains, FORS rediscovered Kepler's third law
of planetary motion and the ideal gas law (including rediscovery of the gas
constant and the absolute temperature scale). In all the real-world domains,
models using linear regression appealed most to the experts, since linear
regression largely increased the expressive power of the models in
comparison with pure first order logic. The experts were able to thoroughly
analyse the induced models, exploit selected regions of the attribute space,
and more reliably evaluate the models' quality inside those regions.
Modelling of water behavior in a surge tank showed that FORS can
successfully handle time series data.
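The covering strategy described above (learn one clause top-down, remove the
examples it covers, repeat) can be sketched schematically. This is a hedged
propositional stand-in with greedy search, i.e. beam width 1, rather than the
actual first-order FORS implementation; `matches`, `learn_clause`, and
`covering` are illustrative names of our own.

```python
# Propositional separate-and-conquer sketch (not FORS itself). A rule is a
# list of (attribute, value) tests; a clause is specialized by adding tests
# until it excludes all negative examples.

def matches(rule, x):
    """A rule covers an example iff all of its attribute tests hold."""
    return all(x[a] == v for a, v in rule)

def learn_clause(pos, neg, attrs):
    """Top-down: start with the empty (most general) rule and add the test
    that best separates positives from negatives (greedy, beam width 1)."""
    rule = []
    while neg:
        best = max(((a, v) for a in attrs for v in {x[a] for x in pos}),
                   key=lambda t: (sum(x[t[0]] == t[1] for x in pos)
                                  - sum(x[t[0]] == t[1] for x in neg)))
        rule.append(best)
        pos = [x for x in pos if matches(rule, x)]
        neg = [x for x in neg if matches(rule, x)]
        if not pos:
            break
    return rule

def covering(pos, neg, attrs):
    """Learn clauses until every positive example is covered."""
    rules = []
    while pos:
        r = learn_clause(pos, neg, attrs)
        rules.append(r)
        pos = [x for x in pos if not matches(r, x)]
    return rules
```

FORS additionally handles continuous variables, fits regression submodels in
each covered region, and prunes with MDL; none of that is shown here.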
Additionally, FORS' ability to partition the attribute space, coupled with
its linear regression capabilities, proved to be crucial in inducing a
useful model without prior knowledge of the ``absolute value'' function.
Models induced from the Lake of Bled data describe the growth of algae in
the lake adequately well. Newly induced background literals defining seasons
improve the comprehensibility of the induced models. For the expert this was
not the first approach to modelling the Lake of Bled behavior, and the
experiments with FORS have confirmed his opinion that not much more can be
done with the current data without additional, more precise and frequent
measurements.

A domain expert in the domain of steel grinding claims to be satisfied with
the results of machine learning, since our models enabled him to grasp some
additional process properties which he would not have been able to discover
with classical statistical tools alone. However, he suggests that the
machine learning approach should not be used alone, but should be considered
a powerful supplement to existing instruments. In this domain it also turned
out that simple background knowledge can sometimes significantly improve the
usefulness of induced models.

FORS was also used in the domains of finite element mesh design and for the
construction of rules for the prediction of the mutagenic activity of
nitroaromatic compounds, where its performance was at the level of other
machine learning algorithms used in those domains. During the work on the
electrical discharge machining process, a data acquisition environment was
established which enabled us to monitor and record crucial process
parameters as well as the operator's control actions. Models were induced
which, according to the expert, capture the main behaviour patterns of the
operator.
During the knowledge acquisition process, several important guidelines
emerged, concerning mainly the interaction with the domain experts. They
confirm that comprehensibility of the induced model plays an important role
in behaviour cloning, and that for successful modelling of expert behavior
one cannot rely solely on interviews with the expert.

%------------------------------------------------------------------------------%

Author: Saso Dzeroski
Title: Numerical constraints and learnability in inductive logic programming
University: University of Ljubljana, Slovenia
Year: 1995

Abstract:
Inductive logic programming (ILP) is a part of machine learning, a subarea
of artificial intelligence, concerned with the induction of first-order
clausal theories from examples and background knowledge. Two main approaches
can be distinguished within ILP: the normal and the nonmonotonic semantics.
While several ILP systems have been developed for each of the two semantics,
efficiency problems and problems with handling real numbers, among others,
have hindered practical applications.

We first present a new formalization of the nonmonotonic ILP setting, where
examples are interpretations and concepts are clausal theories. Examples may
also be given in the form of definite logic programs, from which
interpretations are obtained as minimal Herbrand models. A background theory
may be taken into account. This formalization makes nonmonotonic ILP a
natural extension of propositional learning where a first-order
representation and background knowledge are used.

We then present a study of the computational complexity of inductive logic
programming. We introduce PAC-learnability models for both semantics and
prove positive learnability results in both. Results concerning learning in
the presence of noise are also presented.
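In the nonmonotonic setting just described, an example (an interpretation)
is covered when it is a model of the hypothesis. A minimal sketch of that
membership test, under our own assumed encoding of ground clauses as
(body, head) pairs of atom sets, rather than the thesis's formalization:

```python
# Assumed encoding, for illustration only: an interpretation is a set of
# true ground atoms; a clause (body, head) reads "body -> head".

def satisfies(interp, clause):
    """A clause is violated only when its whole body is true in the
    interpretation and none of its head atoms is."""
    body, head = clause
    if all(b in interp for b in body):
        return any(h in interp for h in head)
    return True

def is_model(interp, theory):
    """An example is covered iff it satisfies every clause of the theory."""
    return all(satisfies(interp, c) for c in theory)
```

For instance, the interpretation {bird(tweety), flies(tweety)} is a model of
the single clause bird(tweety) -> flies(tweety), while {bird(tweety)} alone
is not.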
The results for normal ILP are based on the transformation of ILP problems
to propositional form performed by DINUS. The results for nonmonotonic ILP
are based on our new formalization. The concept classes considered in our
results can be induced by existing ILP systems, such as GOLEM and CLAUDIEN.

Finally, we develop methods for handling real numbers and numerical
constraints in both semantics. The DINUS system can be used for learning
relational rules with real numbers and for relational regression in normal
ILP. We extend number handling in CLAUDIEN, adding the capability of using
linear equations, thus combining quantitative and qualitative discovery. We
finally present LAGRANGE, a machine discovery system closely related to the
two ILP semantics. From example behaviors, LAGRANGE discovers differential
equations that describe the dynamics of a given system.

%------------------------------------------------------------------------------%
%------------------------------------------------------------------------------%

ARTIFICIAL INTELLIGENCE AND MATHEMATICS
January 3-5, 1996, Marina Marriott, Fort Lauderdale, Florida

General Chair: Martin Golumbic, Bar-Ilan University, Ramat Gan
Conference Chair: Frederick Hoffman, Florida Atlantic University
Program co-chairs: Henry Kautz and Bart Selman, AT&T Bell Labs
Publicity Chair: Ugur Halici, Middle East Technical University
Home Page:
http://www.research.att.com/orgs/ssr/people/kautz/aimath96/aimath96.html

Program at a glance
===================

Wedn., Jan. 3
 9:00--10:15  OPENING ADDRESS: DNA Computing, Richard Lipton
10:15--10:45  break
10:45--12:00  TECHNICAL SESSIONS
12:00-- 1:30  lunch
 1:30-- 2:45  INVITED TALK: A Unified Semantics for Probability and Logic,
              Glenn Shafer
 2:45-- 3:00  break
 3:00-- 4:15  TECHNICAL SESSIONS
 4:15-- 4:30  break
 4:30-- 6:00  TECHNICAL SESSIONS

Thurs., Jan.
4
 9:00--10:15  INVITED TALK: Constraint Programming, Jean-Louis Lassez
10:15--10:45  break
10:45--12:00  TECHNICAL SESSIONS
12:00-- 1:30  lunch
 1:30-- 2:45  TECHNICAL SESSIONS
 2:45-- 3:15  break
 3:15-- 4:30  TECHNICAL SESSIONS
 4:30-- 4:45  break
 4:45-- 6:00  PANEL - Knowledge Compilation and Approximation
 7:00--10:00  banquet / banquet speech: Lessons from Computer Chess,
              Monte Newborn

Fri., Jan. 5
 9:00--10:15  INVITED TALK: Deduction with Constraints in Rewriting Logic,
              Kirchner
10:15--10:45  break
10:45--12:00  TECHNICAL SESSIONS
12:00-- 1:30  lunch
 1:30-- 2:45  TECHNICAL SESSIONS
 2:45-- 3:15  break
 3:15-- 4:30  TUTORIAL: Logical Analysis of Data, Peter Hammer

Technical Sessions
==================

Wedn. morning
..................

COMPLEXITY / ALGORITHMS
  A formal framework for evaluating heuristic programs
    Lenore Cowen, Joan Feigenbaum, and Sampath Kannan
  Graph parameters for time-space tradeoff
    Rina Dechter
  Anytime families of tractable propositional reasoners
    Mukesh Dalal

LEARNING / DISCOVERY
  Computing optimal shallow decision trees
    David Dobkin, Dimitrios Gunopulos, and Simon Kasif
  PAC-learning from general examples
    Paul Fischer, Klaus-Uwe Hoffgen, and Hanno Lefmann
  Algorithms for PAC learning of functions with smoothness properties
    Nageswara S.V. Rao and V. Protopopescu

THEOREM PROVING
  Heuristics for a semantic tree theorem prover
    Qingxun Yu, Mohammed Almulla, and Monroe Newborn
  Inductive theorem proving via abstraction
    Adolfo Villafiorita and Fausto Giunchiglia
  The adaption of proof methods by reformulation
    Xiaorong Huang, Manfred Kerber, and Lassaad Cheikhrouhou

Wedn. afternoon
....................

LEARNING / DISCOVERY / PROBABILITY
  Axioms of causal relevance
    David Galles and Judea Pearl
  On the complexity of learning decision trees
    J. Kent Martin and D. S. Hirschberg
  Scientific discovery based on belief revision
    Eric Martin and Daniel Osherson

COMPLEXITY / ALGORITHMS
  An efficient algorithm for unit propagation
    Hantao Zhang and Mark E.
Stickel
  Abstraction and the CSP phase transition boundary
    Robert Schrag and Daniel Miranker
  The very particular structure of the very hard instances
    Dan Vlasie

SYSTEMS
  Representing systems of interacting components in EUCLID
    K. J. Dryllerakis and M. J. Sergot
  Progenes: using metaknowledge to solve scientific problems
    P. Castells, R. Moriyon, and F. Saiz
  Blending AI and mathematics: the case of resource allocation
    Berthe Choueiry, Guevara Noubir, and Boi Faltings

Wedn. 4:30
...................

ABSTRACTION AND REFORMULATION
  Fausto Giunchiglia, Thomas Ellman, Pandurang Nayak, Christer Backstrom,
  Mike Lowry, Henry Kautz, and Alon Levy

NONMON / LOGIC PROGRM.
  Mixing a default rule with stable negation
    Jack Minker and Carolina Ruiz
  Representability by default theories
    Wiktor Marek, Jan Treur, and Miroslaw Truszczynski
  Classical negation in non-monotonic reasoning and logic programming
    Jose Alferes, Luis Moniz Pereira, and Teodor Przymusinski

Thurs. morning
......................

PLANNING
  Tractable plan existence does not imply tractable plan generation
    Peter Jonsson and Christer Backstrom
  A re-examination of the modal truth criteria for non-linear planning
    Maria Fox and Derek Long
  Simple recurrent neural networks as probability models
    Mostefa Golea, Masahiro Matsuoka, and Yasubumi Sakakibara

AI&OR / OPTIMIZATION
  Using computational learning strategies as a tool for comb. optim.
    Andreas Birkendorf and Hans Ulrich Simon
  Towards a closer integration of finite domain prop. and simplex-based alg.
    M.T. Hajian, H. El-Sakkout, M. Wallace, J.M. Lever, and E. B. Richards
  Formalizing commonsense topology: the INCH calculus
    Nicholas Mark Gotts

PERCEPTION
  Introducing the mathematical category of artificial perceptions
    Zippora Arzi-Gonczarowski and Daniel Lehmann
  Sensor planning in 3D object search: its formulation and complexity
    Yiming Ye and John Tsotsos

Thurs., 1:30
......................

COMPLEXITY / ALGORITHMS
  Verification of prop.
formulae by means of convex and concave transforms
    Hans van Maaren, Jan Friso Groote, and Michiel Rozema
  Categorical decompositions, graph searching and problem solving
    Robert Zimmer and Robert Holte
  On the model checking complexity of circumscription
    Alexei P. Lisitsa

MATHEMATICAL ADVANCES IN PROBABILISTIC REASONING
  Markov properties of cyclic graphs
    Jan Koster
  Probability Trees and their Generalizations
    Glenn Shafer
  Conditional Independence Structures Examined via Minors
    Frantisek Matus

UNCERTAINTY
  Bayesian knowledge incorporation
    Irina Tchoumatchenko and Jean-Gabriel Ganascia
  Fuzzy modal logic
    Andrew Mironov
  Uncertainty reasoning using Gaussian belief functions
    Liping Liu

Thurs., 3:15
......................

THEOREM PROVING
  Detecting logical inconsistencies
    Bertrand Mazure, Lakhdar Sais, and Eric Gregoire
  Focusing ATMS problem-solving: a formal approach
    Gregory Provan
  Diagrams and mathematics
    Dave Barker-Plummer, Sidney Bailin, and Samuel Ehrlichman

NONMONOTONIC AND META-REASONING
  Minimal number of perm. to compute all ext. for a finite default theory
    Pawel Cholewinski and Miroslaw Truszczynski
  Non-monotonic reasoning: from complexity to algorithms
    C. Cayrol, M.C. Lagasquie-Schiex, and Thomas Schiex
  Automating the synthesis of decision proc. in a constructive metatheory
    Alessandro Armando, Jason Gallagher, and Alan Smaill

Fri. morning
......................

AI AND OR
  A Fast Algorithm for Determining Falsifiability of Implication Formulas
    John Franco
  Distance-Based Classification Methods
    Alexander Kogan
  Learning of Boolean Functions under a Monotonicity Assumption
    Evagelos Triantaphyllou
  Structure and Representation of Horn Rule Bases
    Endre Boros

MATHEMATICAL STRUCTURES
  Topological modal logic for subset frames with finite descent
    Bernhard Heinemann
  The structure of interlaced bilattices
    Aron Avron

Fri. afternoon
......................

KR FORMALISMS
  Horn Rules and Description Logics: Can they be combined?
    Alon Levy and Marie-Christine Rousset
  From knowing how to knowing that
    Gilbert Ndjatou

SATISFIABILITY
  Davis-Putnam-Loveland versus ``Resolution Search''
    Vasek Chvatal
  Analysis of the Space of Solutions for Random Instances of the
  Satisfiability Problem
    Olivier Dubois
  Solving Problems with Hard and Soft Constraints: Using a Stochastic
  Algorithm for MAX-SAT
    Bart Selman
  Satisfiability from a different point of view
    Ewald Speckenmeyer

Registration and Travel Info
============================

HOTEL AND TRAVEL INFORMATION

Rooms have been blocked at the Fort Lauderdale Marina Marriott that are
available to the participants at a rate of $95 per night, single or double
occupancy, for the Symposium, a huge savings against the ``rack rate.''
They must be booked directly with the hotel by December 19:

  Fort Lauderdale Marina Marriott
  1881 Southeast 17th Street, Fort Lauderdale, FL 33316
  Phone: (305) 463-4000   Fax: (305) 527-6705

The applicants must mention the symposium, and should probably mention
Florida Atlantic University. The hotel management would like the applicants
to specify arrival and departure dates and estimated time of arrival, room
preference (single or double/double, smoking or non-smoking), and credit
card type to be used for payment, including number and expiration date
(this, or a one-night deposit, is absolutely necessary to hold rooms past
6PM).

Delta Air Lines, Inc., in cooperation with the Fourth International
Symposium on Artificial Intelligence and Mathematics, is offering special
rates to the meeting. These fares are based on Delta's published round-trip
fares within the United States and Canada, San Juan, Nassau, Bermuda, St.
Thomas, and St. Croix. To take advantage of these discounts, call your
travel agent or call Delta at 1-800-241-6760 for reservations (7:30
a.m.-11:00 p.m. Mon-Fri, 8:30 a.m.-11:00 p.m. Sat/Sun, Eastern time) and
refer to file number XM0039.
SYMPOSIUM REGISTRATION INFORMATION

Regular Advanced Registration Fee: $165
Student Advanced Registration Fee: $85
Student Advanced Registration Fee, excluding banquet: $50
Regular On-site Registration Fee: $195
Student On-site Registration Fee: $105

Fee includes banquet (except for the special $50 student rate) and
reception. Additional banquet tickets are $35. Advanced registrations must
be received by December 22, 1995. Refunds, less a $15 processing fee, will
be made for cancellations made in 1995. If you have full-time professional
employment, you are not entitled to the student rate.

To register, provide the following information:
___________________________________________________________________

Name:
Affiliation:
Mailing Address:
Phone:
Email:

__ Regular $165            __ Extra banquet ticket $35
__ Student $85             __ Student, no banquet $50

If student, specify student status:

Payment made by
__ Mastercard, card number:_________________ expires:_____
__ Visa, card number:_________________ expires:_____
__ Check (enclosed) payable to the FAU Foundation, account S078,
   in US dollars.
___________________________________________________________________

** Postal Address for registration **
International Symposium on Artificial Intelligence and Mathematics
Mathematics Department
Florida Atlantic University
Boca Raton, FL 33431 USA

** FAX address for registration **
(407) 367-2436, attn: Frederick Hoffman

** Email address for registration **
aim4@acc.fau.edu

** Telephone registration **
Contact Mary Anne Bytheway at (407) 367-3010; specify the Math Conference
account, S078.
DIRECT ALL QUESTIONS CONCERNING REGISTRATION AND TRAVEL TO:
Frederick Hoffman, hoffman@acc.fau.edu or hoffman@fauvax.bitnet

SYMPOSIUM HOME PAGE
http://www.research.att.com/orgs/ssr/people/kautz/aimath96/aimath96.html

DIRECT QUESTIONS CONCERNING THE PROGRAM (ONLY) TO:
Henry Kautz, kautz@research.att.com
Bart Selman, selman@research.att.com

%------------------------------------------------------------------------------%
%------------------------------------------------------------------------------%

*****************************************************************
                          ICML'96
       13th International Conference on Machine Learning
              Bari (Italy), July 3-6th, 1996
*****************************************************************
        First Call for Papers and Workshop Proposals
*****************************************************************

General Information
===================

The 13th International Conference on Machine Learning (ICML'96) will be held
in Bari, Italy, during July 3-6th, 1996, with informal workshops on July
3rd. The purpose of the conference is twofold: firstly, to emphasize the
potential of machine learning approaches for solving problems in a wide
range of application domains; secondly, to highlight relationships between
machine learning and other fields, such as statistics, pattern recognition,
artificial intelligence, control theory, instructional and cognitive
sciences, computational complexity theory and software engineering.

Program
=======

The scientific program will include invited talks, presentations of refereed
papers and a session of general discussion.
Submissions are invited in all areas of Machine Learning, including, but not
limited to:

  Abduction
  Analogy
  Applications of machine learning
  Artificial neural networks
  Case-based learning
  Cognitive models of learning
  Computational learning theory
  Explanation-based learning
  Formal models of learning
  Inductive learning
  Inductive logic programming
  Genetic algorithms
  Knowledge discovery in databases
  Learning and problem solving
  Multistrategy learning
  Reinforcement learning
  Representation change
  Scientific discovery
  Theory revision

Paper Format
============

Submissions must be clearly legible, with good quality print. Papers are
limited to twelve (12) pages, excluding title page and bibliography, but
including all tables and figures. Papers must be printed on 8-1/2 x 11 inch
or A4 paper, using 12 point type (10 characters per inch), with no more than
40 lines per page. A separate title page must include the title of the
paper, the email and postal addresses of all authors, and a clear summary of
the main contributions of the paper. The title pages of accepted papers will
be made available via the World-Wide Web before the conference takes place.
Double-sided printing is encouraged.

Requirements for Submissions
============================

Please send five (5) copies of each submitted paper to the Conference Chair.
Submissions must be received by January 21st, 1996. Electronic or fax
submissions are not acceptable. Notification of acceptance or rejection will
be mailed to the first (or designated) author by March 8th, 1996.
Camera-ready accepted papers are due on April 6th, 1996.

Review Criteria
===============

Each submitted paper will be reviewed by at least two members of the Program
or Advisory Committee, and will be judged on significance, originality and
clarity. Papers addressing application issues are welcome. Simultaneous
submission to other conferences must be explicitly declared.
In the case of multiple acceptance, presentation at ICML'96 and inclusion in
the proceedings is only granted upon withdrawal from the other
conference(s).

Workshop Proposals
==================

Workshop proposals are invited in all areas of Machine Learning. Please send
a two (2) page description of the proposed workshop, its objectives,
organizer(s), and expected number of attendees. The proposal must be
received by the Workshop Chair by December 15th, 1995. Descriptions of
accepted workshops will be made available via the World-Wide Web.
Notification of acceptance or rejection will be mailed to the organizer by
January 31st, 1996. Calls for Papers for accepted workshops will be the
responsibility of the organizer(s).

Program Chair
=============
Lorenza Saitta, University of Torino        saitta@di.unito.it
Dipartimento di Informatica                 Phone: (+39) 11 - 7429.214
Corso Svizzera 185, 10149 Torino (Italy)    Fax:   (+39) 11 - 751.603

Local Chair
===========
Floriana Esposito, University of Bari       esposito@vm.csata.it
Dipartimento di Informatica                 Phone: (+39) 80 - 5443.264
Via Orabona 4, 70125 Bari (Italy)           Fax:   (+39) 80 - 5443.196

Workshop Chair
==============
Stefan Wrobel (wrobel@gmdzi.gmd.de)
GMD, FIT.KI
Schloss Birlinghoven
53754 Sankt Augustin (Germany)

Publicity Chair
===============
Jeff Schlimmer (schlimme@eecs.wsu.edu)
School of Electrical Engineering and Computer Science
Washington State University
Pullman, WA 99164-2752 (USA)

Advisory Committee
==================
Jaime Carbonell (USA)      William Cohen (USA)
Kenneth De Jong (USA)      Tom Dietterich (USA)
Tom Mitchell (USA)         Stuart Russell (USA)
Derek Sleeman (UK)         Paul Utgoff (USA)

Organizing Committee
====================
Donato Malerba and Giovanni Semeraro (Italy)  {malerbad, semeraro}@vm.csata.it
Marco Botta and Filippo Neri (Italy)          {botta, neri}@di.unito.it

General Inquiries
=================
Please address general inquiries to any of the members of the Organizing
Committee or to the address: icml96@di.unito.it

ICML'96 has its own page on
the World-Wide Web at:
http://www.di.unito.it/pub/WWW/ICML96/home.html
This announcement is also available in PostScript at:
ftp://ftp.di.unito.it/pub/ICML96/callforpapers.ps
To receive further information, please send a note to the Publicity Chair.

-----------------------------------------------------------------------------

Important dates
===============
Workshop submission deadline:         December 15, 1995
Paper submission deadline:            January 21, 1996
Notification of workshop acceptance:  January 31, 1996
Notification of paper acceptance:     March 8, 1996
Camera-ready copy:                    April 6, 1996

%------------------------------------------------------------------------------%
%------------------------------------------------------------------------------%

The 12th European Conference on Artificial Intelligence (ECAI-96)
Call for Papers (Short Version)

The 12th European Conference on Artificial Intelligence (ECAI-96) will be
held in Budapest, Hungary, from August 12-16, 1996. The biennial European
Conference on Artificial Intelligence (ECAI) is the European forum for
international scientific exchange and presentation of AI research. The aim
of the conference is to cover all aspects of AI and to bring together basic
and applied research. The Conference Technical Programme will include paper
presentations, invited talks, panels, workshops, and tutorials. ECAI-96 is
organized by the European Coordinating Committee for Artificial Intelligence
(ECCAI) and will be hosted by the John von Neumann Computer Society (NJSZT,
Hungary). The conference venue will be the Budapest University of Economics.
Submissions are invited on substantial, original and previously unpublished
research in all aspects of AI, including, but not limited to: Automated
Reasoning; Application and Enabling Technologies; Cognitive Modelling and
Philosophical Foundations; Connectionist and PDP Models of AI; Distributed
AI and Multiagent Systems; Knowledge Representation; Machine Learning;
Natural Language and Intelligent User Interfaces; Planning, Scheduling, and
Reasoning about Actions; Reasoning about Physical Systems and under
Uncertainty; Robotics, Vision, and Signal Understanding; Standardisation,
Verification, Validation and Testing of Knowledge-based Systems.

Submissions must be received by the Programme Chair in hardcopy form by 10th
January 1996. Successful authors will be notified on or before 18th March
1996. Camera-ready copies of the final versions of accepted papers must be
received by 30th April 1996.

%------------------------------------------------------------------------------%
%------------------------------------------------------------------------------%

CALL FOR PAPERS --- COLT '96
Ninth Conference on Computational Learning Theory
Desenzano del Garda, Italy
June 28 -- July 1, 1996
______________________________________________________________________

The Ninth Conference on Computational Learning Theory (COLT '96) will be
held in the town of Desenzano del Garda, Italy, from Friday, June 28,
through Monday, July 1, 1996. COLT '96 is sponsored by the Universita` degli
Studi di Milano.

We invite papers in all areas that relate directly to the analysis of
learning algorithms and the theory of machine learning, including neural
networks, statistics, statistical physics, Bayesian/MDL estimation,
reinforcement learning, inductive inference, knowledge discovery in
databases, robotics, and pattern recognition. We also encourage the
submission of papers describing experimental results that are supported by
theoretical analysis.

ABSTRACT SUBMISSION.
Authors should submit fifteen copies (preferably two-sided) of an extended
abstract to:

    Michael Kearns --- COLT '96
    AT&T Bell Laboratories, Room 2A-423
    600 Mountain Avenue
    Murray Hill, New Jersey 07974-0636
    Telephone (for overnight mail): (908) 582-4017

Abstracts must be RECEIVED by FRIDAY, JANUARY 12, 1996. This deadline is
firm. We are also allowing electronic submissions as an alternative to
submitting hardcopy. Instructions for how to submit papers electronically
can be obtained by sending email to colt96@cs.cmu.edu with subject "help",
or from our web site:

    http://www.cs.cmu.edu/~avrim/colt96.html

which will also be used to provide other program-related information.
Authors will be notified of acceptance or rejection on or before Friday,
March 15, 1996. Final camera-ready papers will be due by Friday, April 5.
Papers that have appeared in journals or other conferences, or that are
being submitted to other conferences, are not appropriate for submission to
COLT. An exception to this policy is that COLT and STOC have agreed that a
paper can be submitted to both conferences, with the understanding that the
paper will be automatically withdrawn from COLT if accepted to STOC.

ABSTRACT FORMAT. The extended abstract should include a clear definition of
the theoretical model used and a clear description of the results, as well
as a discussion of their significance, including comparison to other work.
Proofs or proof sketches should be included. If the abstract exceeds 10
pages, only the first 10 pages may be examined. A cover letter specifying
the contact author and his or her email address should accompany the
abstract.

PROGRAM FORMAT. At the discretion of the program committee, the program may
consist of both long and short talks, corresponding to longer and shorter
papers in the proceedings. The short talks will also be coupled with a
poster presentation.

PROGRAM CHAIRS. Avrim Blum (Carnegie Mellon University) and Michael Kearns
(AT&T Bell Laboratories).
CONFERENCE AND LOCAL ARRANGEMENTS CHAIRS. Nicolo` Cesa-Bianchi (Universita`
di Milano) and Giancarlo Mauri (Universita` di Milano).

PROGRAM COMMITTEE. Martin Anthony (London School of Economics), Avrim Blum
(Carnegie Mellon University), Bill Gasarch (University of Maryland), Lisa
Hellerstein (Northwestern University), Robert Holte (University of Ottawa),
Sanjay Jain (National University of Singapore), Michael Kearns (AT&T Bell
Laboratories), Nick Littlestone (NEC Research Institute), Yishay Mansour
(Tel Aviv University), Steve Omohundro (NEC Research Institute), Manfred
Opper (University of Wuerzburg), Lenny Pitt (University of Illinois), Dana
Ron (Massachusetts Institute of Technology), and Rich Sutton (University of
Massachusetts).

COLT, ML, AND EUROCOLT. The Thirteenth International Conference on Machine
Learning (ICML '96) will be held right after COLT '96, on July 3--6 in Bari,
Italy. In cooperation with COLT, the EuroCOLT conference will not be held in
1996.

STUDENT TRAVEL. We anticipate some funds will be available to partially
support travel by student authors. Details will be distributed as they
become available.

%------------------------------------------------------------------------------%