ICCS 2007 is privileged to have the following internationally renowned speakers, making it even more important that you do not miss this event.
Simon Buckingham Shum is a Senior Lecturer at the UK Open University's Knowledge Media Institute. Following degrees in Psychology, Ergonomics and HCI, he has worked on visual hypertext for mapping meetings and argumentation since 1990. He co-edited "Visualizing Argumentation" (2003), which brought together leading figures in argument mapping, and "Knowledge Cartography" (2007), which expands this work. He has received UK and US funding for e-science and e-learning projects, and is co-founder of the Compendium Institute, leading development of the Compendium tool for Dialogue Mapping and Conversational Modelling. He co-chairs the 2nd International Conference on the Pragmatic Web later this year. www.kmi.open.ac.uk/people/sbs
About Simon's ICCS 2007 Paper
"Hypermedia Discourse: Contesting Networks of Ideas and Arguments"
Simon Buckingham Shum
Knowledge Media Institute, The Open University, Milton Keynes, UK
In this talk I will motivate the concept of Hypermedia Discourse, an approach to reading, writing and contesting ideas as hypermedia networks grounded in discourse schemes. We're striving for cognitively and computationally tractable conceptual structures: fluid enough to serve as augmentations to group working memory, yet structured enough to support long term memory. I will describe how such networks can be (i) mapped by multiple analysts to visualize and interrogate the claims and arguments in a literature, and (ii) mapped in real time to manage a team's information sources, competing interpretations, arguments and decisions, particularly in time-pressured scenarios where harnessing collective intelligence is a priority. I will suggest that given the current geo-political and environmental context, the growth in distributed teamwork, and the need for multidisciplinary approaches to wicked problems, there has never been a greater need for sensemaking tools to help diverse stakeholders build common ground.
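The hypermedia networks described above can be pictured, loosely, as graphs of typed nodes and typed links in the IBIS style used by Dialogue Mapping tools such as Compendium. The sketch below is illustrative only (all class and relation names are our own, not Compendium's API):

```python
# Illustrative sketch of a discourse network: typed nodes (questions,
# ideas, arguments) joined by typed links, in the spirit of IBIS-style
# Dialogue Mapping. Names and structure are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class Node:
    label: str
    kind: str                      # e.g. "question", "idea", "pro", "con"

@dataclass
class DiscourseMap:
    nodes: list = field(default_factory=list)
    links: list = field(default_factory=list)   # (source, relation, target)

    def add(self, label, kind):
        node = Node(label, kind)
        self.nodes.append(node)
        return node

    def link(self, source, relation, target):
        self.links.append((source, relation, target))

    def arguments_for(self, idea):
        """All pro/con nodes attached to an idea: the contested ground."""
        return [s for (s, rel, t) in self.links
                if t is idea and rel in ("supports", "objects-to")]

m = DiscourseMap()
q = m.add("How should the team share findings?", "question")
idea = m.add("Use a shared concept map", "idea")
pro = m.add("Keeps competing interpretations visible", "pro")
m.link(idea, "responds-to", q)
m.link(pro, "supports", idea)
print([n.label for n in m.arguments_for(idea)])
# → ['Keeps competing interpretations visible']
```

The point of the typed structure is exactly the balance the abstract describes: fluid enough to capture a live conversation, structured enough to be queried later.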
Yehudit Judy Dori
Yehudit Judy Dori is Associate Professor at the Department of Education in Technology and Science, Technion - Israel Institute of Technology, Haifa, Israel. She is also a Research Scholar at the Center for Educational Computing Initiatives, Massachusetts Institute of Technology, Cambridge, MA. Between 2000 and 2005 she was the Assessment Leader of the Technology Enabled Active Learning project at MIT.
Over the past twenty years, she has investigated teaching and learning of science in general and chemistry in particular. The research has focused on visualizations, higher order thinking skills, and metacognition, spanning the spectra of student levels on one hand and educational approaches on the other hand. The studies include development and implementation of new teaching approaches and curricula, and assessment of their educational value. The subjects of the studies have been students and teachers from junior high school and high school, as well as students at the higher education level.
She received her B.Sc. in chemistry from the Hebrew University of Jerusalem in 1975, and her M.Sc. in Life Sciences (1981) and Ph.D. in Science Education (1988), both from the Weizmann Institute of Science, Rehovot, Israel.
Prof. Dori is a member of the National Association for Research in Science Teaching (NARST), where she served on the Editorial Board of the Journal of Research in Science Teaching. At the European Association for Learning and Instruction (EARLI) she has been the Israeli Correspondent for the last four years. Between 1997 and 2001 she served as Co-Chairperson of the EARLI Special Interest Group on Evaluation and Assessment. Since 2003 Prof. Dori has been the Chairperson of the National Committee for Chemistry Curriculum appointed by the Minister of Education, Israel.
She is the author of over 45 articles in international science education and technology journals, 5 book chapters, 12 textbooks in Hebrew, and of 10 courseware modules in chemistry and biology. She has been Principal Investigator of several Israeli national science education and EU projects. Prof. Dori is also a Co-Editor of two special issues on Educational Reform at MIT in the Journal of Science Education and Technology.
Wiebe is head of the Agent ART (Applications, Research and Technology) group in the Department of Computer Science at the University of Liverpool, and is renowned for his research on knowledge representation formalisms and the logical foundations of agent theories, in particular on cooperation and coordination in Multi-Agent Systems. He has made contributions to the theory of modal and epistemic logic in AI and computer science, belief revision, non-monotonic reasoning, logical foundations of game theory, coalition logic, knowledge and rationality, and temporal-epistemic theories of agency. He is also co-authoring a chapter on Modal Logic and Game Theory for the Handbook of Modal Logic, and a chapter on Multi-Agent Systems for the forthcoming Handbook of Knowledge Representation.
Prof. van der Hoek is founding Editor-in-Chief of the journal `Knowledge, Rationality and Action', a member of the editorial board of the `Journal of Autonomous Agents and Multi-Agent Systems', an associate editor of `Studia Logica', and a member of the international advisory board of `Logique et Analyse'. He is also series editor for Computer Science of a new book series, `Texts in Logic and Games'. He was an invited/keynote speaker at the European Workshop on Multi-Agent Systems (EUMAS) in 2003, Autonomous Agents and Multi-Agent Systems (AAMAS) in 2004, and The Logic Colloquium in 2005. Since 1998 he has been one of the two co-chairs of the Logic and the Foundations of Game and Decision Theory (LOFT) Conference; he was Program Chair of EUMAS 2005, and General Chair of the 10th European Conference on Logics in Artificial Intelligence (JELIA). Van der Hoek is a member of the EPSRC College.
About Wiebe's ICCS 2007 paper
"Dynamic Epistemic Logic"
(Joint work with Hans van Ditmarsch and Barteld Kooi)
When giving an analysis of knowledge in multiagent systems, one needs a framework in which higher-order information and its dynamics can both be represented. Our work contributes to such a framework. It also fits in approaches that not only dynamize the epistemics, but also epistemize the dynamics: the actions that (groups of) agents perform are epistemic actions. Different agents may have different information about which action is taking place, including higher-order information. We demonstrate that such information changes require subtle descriptions. Our contribution is to provide a complete axiomatization for an action language, in which an action is interpreted as a relation between epistemic states (pointed models) and sets of epistemic states. The applicability of the framework is found in every context where multiagent strategic decision making is at stake, and already demonstrated in game-like scenarios such as Cluedo and card games.
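The full action language in the paper is beyond a short sketch, but its simplest special case, updating an epistemic state by a public announcement, can be illustrated in a few lines. The Python below is our own illustrative rendering, not the authors' notation or axiomatization:

```python
# Minimal sketch of epistemic-state change under a public announcement,
# the simplest special case of the epistemic actions the paper treats.
# The class names and the toy card scenario are illustrative assumptions.

class EpistemicModel:
    def __init__(self, worlds, relations, valuation):
        self.worlds = set(worlds)       # possible worlds
        self.relations = relations      # agent -> set of (w, v) accessibility pairs
        self.valuation = valuation      # world -> set of true atoms

    def knows(self, agent, atom, world):
        """Agent knows `atom` at `world` iff it holds in every world
        the agent cannot distinguish from `world`."""
        accessible = {v for (w, v) in self.relations[agent] if w == world}
        return all(atom in self.valuation[v] for v in accessible)

    def announce(self, atom):
        """Public announcement of `atom`: delete the worlds where it is false."""
        keep = {w for w in self.worlds if atom in self.valuation[w]}
        rels = {a: {(w, v) for (w, v) in r if w in keep and v in keep}
                for a, r in self.relations.items()}
        vals = {w: self.valuation[w] for w in keep}
        return EpistemicModel(keep, rels, vals)

# Toy card scenario: the card is red only in world w1; agent b cannot
# tell w1 and w2 apart, so b does not know the card's colour.
m = EpistemicModel(
    worlds={"w1", "w2"},
    relations={"a": {("w1", "w1"), ("w2", "w2")},
               "b": {("w1", "w1"), ("w1", "w2"), ("w2", "w1"), ("w2", "w2")}},
    valuation={"w1": {"red"}, "w2": set()},
)
print(m.knows("b", "red", "w1"))   # → False
m2 = m.announce("red")             # "the card is red" is publicly announced
print(m2.knows("b", "red", "w1"))  # → True
```

More general epistemic actions, where agents have different (and higher-order) information about which action occurred, require the richer relational semantics the paper axiomatizes; this sketch only shows why updating the model, not just the facts, is the natural move.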
After completing his PhD in Philosophy at Cambridge, Christopher Hookway taught at the University of Birmingham for eighteen years before joining the Department of Philosophy at the University of Sheffield in 1995. His main research area is the thought of Charles S Peirce. His book Peirce, a general survey of Peirce's work for the Arguments of the Philosophers series, appeared in 1985 and was drafted during a year at Harvard working on Peirce's manuscripts. In 1995, he was President of the Charles S Peirce Society. He has continued to work on Peirce, and a selection of papers on Peirce written after 1985 appeared as Truth, Rationality, and Pragmatism (2000, OUP). More recent work has concerned the content of Peirce's pragmatist maxim and his attempts to defend it, Peirce's important but neglected claim that the conclusion of an abductive reasoning is typically an interrogative, and his reasons for thinking that iconic representations have an indispensable role in cognition. The first published paper on this last topic addressed the claim that an idea is 'a sort of composite photograph'.
Hookway's other philosophical interests lie in the Philosophy of Language and, most of all, in epistemology. He is currently finishing a book defending a sort of pragmatist (indeed Peircean) version of virtue epistemology, currently entitled Epistemology as Theory of Inquiry (OUP 2008, hopefully).
About Chris's ICCS07 paper
Peirce has claimed that any language adequate for describing our surroundings and reasoning about them must make use of iconic representations, and it is a corollary of this that all cognition involves the use of icons. Such claims prompt questions about just what cognitive functions require the use of iconic representations and just what sort of content these representations have. The paper will be a sequel to '… a sort of composite photograph', and, after some general reflections upon the special contributions of iconic signs, it will examine how iconic representations can be involved in perceptual experience and survey other ways in which iconicity can be important.
Gary invented Trikonic as a diagrammatic transmutation and expansion of Charles S. Peirce's applied science of Trichotomic (representing his category theory). With support from the Center for Teaching and Learning (CTL) at the City University of New York (CUNY) he has begun applying Trikonic to the creation, observation, and manipulation of diagrams involving, for example, patterns of processes in inquiry and especially the development of information architecture. He teaches critical and creative thinking at LaGuardia College of CUNY with an emphasis on the semeiotic and pragmatism of Peirce. Through the CTL, he has been active in developing faculty programs employing an array of new technologies such as those involved in the creation and development of electronic portfolios. The implementation of Trikonic as a collaborative tool in the evolution of a Pragmatic Web is currently his dominant focus.
About Gary's ICCS07 paper
"Trikonic Architectonic" argues for the development of enterprise architecture robust and flexible enough to meet the challenges of today's heterogeneous and rapidly changing environment. Creating, developing, and deploying such architectures may prove necessary for achieving emerging research, business, and other significant social goals. In particular, a new kind of information architecture may be needed to bring about what ought to be in effect an inter-enterprise architectonic (I-EA) capable of integrating all key components and processes in an increasingly interconnected environment. The paper outlines an architectonic based on the trichotomic category theory of Peirce involving a pragmatic and evolutionary approach to the observation and manipulation of diagrams for structuring enterprise and inter-enterprise processes.
Ronald studied at Oxford in the 1950s, where he developed a passion for singing opera, but was diverted into hospital administration and then the steel industry, where he began to apply computers. Soon disillusioned by the poor organisational returns from technically excellent systems, he began to look for an alternative approach. The opportunity came when he was asked by the steel industry staff college to create courses for systems analysts in heavy industry. At that time, computer companies ran all the other courses as marketing for their products. Instead, he treated organisations as the real information systems, in which computers could play a part if appropriate.
He was one of the main contributors to a national training programme in systems analysis and was invited to join a team at the London School of Economics to develop teaching and research in information systems in 1969. His book Information, based on semiotics, was published in 1973. He had begun the research reported here in 1971 with Research Council funding. The theoretical work was largely completed before he left the LSE 20 years later for the University of Twente. With his students there and at other universities, the theory was put to the test in a large number of diverse organisations. Since retiring in 1999 he has continued the work, with funding from the EPSRC concentrating on writing up results from this lengthy research programme.
About Ronald's ICCS07 paper
Devlin argues for a "soft mathematics" to handle "meaning, context, cultural knowledge, [and] the structure of conversation". Perhaps it lies in the direction we have taken. Semiotics reveals that many information systems fail, despite their technical excellence, because they marginalize issues of meaning, intention and the social role of information. To investigate information systems and organizations, we started from the proposition that all organized behavior depends on the norms that people share. Then, to investigate social norms, we conjectured a series of formalisms for modeling organizations and, following Popper, refuted each one by empirical tests against legislation of increasing complexity. Step by step this led us into those three neglected areas, starting with semantics. Progress stalled until we acknowledged and then replaced our tacit objectivist assumption in favor of actualism, an ontology based on Gibson's theory of affordances, which accounts for perception in terms of the invariant repertoires of behavior from which each animal constructs the reality it knows. Norms that define permitted repertoires of behavior extend the theory of affordances into the social domain. Individual humans create their own limited realities but then, through words and pictures, we combine them into a shared, socially-constructed world that seems to have an objective existence, independent of any individual. But actualism overcomes a dangerous lacuna in objectivism by forcing into every sentence about reality a term that acknowledges the roles played by the agents responsible for creating it, even its most basic ingredients of space and time. 
We have evolved many practical tools from this seemingly abstract study: a method of semantic analysis that yields canonical schemas, which massively cut system development, support and maintenance costs; a semantic temporal manipulation language for expressing norms; and a classification of norms relevant to the architecture of organizations. The paper illustrates some of these methods in the hope of tempting better-qualified people to follow this path towards a more human-centred treatment of information systems.
(see Ronald's publications)
Mike Uschold is a research scientist at Boeing Phantom Works, the advanced research and development organization of The Boeing Company. His interests center on the development and application of ontologies. This includes the emerging Semantic Web, semantic integration, knowledge management, and more recently, applying ontologies to autonomous system navigation. For over two decades, Mike has been involved in a wide range of activities in these areas, including research, applications and teaching. Mike has served on the industrial advisory boards of various projects and initiatives, and given a number of invited talks at conferences, workshops, and universities. He received his B.S. in mathematics and physics from Canisius College in Buffalo, NY, in 1977, a Masters in computer science from Rutgers University in 1982, and a Ph.D. in Artificial Intelligence from The University of Edinburgh in 1991. Before arriving at the Boeing Company in 1997, he was a senior member of technical staff in the Artificial Intelligence Applications Institute (AIAI) at the University of Edinburgh. He has also been a lecturer and a research associate at the Department of AI at the University of Edinburgh.
For information on Mike's paper, please visit the Industry Day page.
Keith Hawker has been in the computer industry for over 20 years. He has been involved in pioneering the market acceptance of new technologies, having led the early development of first the workflow and then the rules automation markets. His expertise is in turning new enabling technology into high-value repeatable business solutions. He has held senior sales roles within the European market and was formerly Head of worldwide sales for a major rules automation vendor based in the US, leading the sale into many of the world's largest banks.
He has been working for Metatomix for 3 years and has pioneered the sale of semantic technology into the financial services and manufacturing markets, working with clients to develop new knowledge-based applications that exploit the semantic capabilities of the Metatomix Platform.
For information on Keith's paper, please visit the Industry Day page.
Dominik Slezak received his PhD in Computer Science in 2002 from Warsaw University, Poland. In an instructional capacity he has supervised more than 15 graduate students in Canada, Poland, and the United Kingdom. He has pursued academic collaborations with Warsaw University, University of Regina, and the Polish-Japanese Institute of Information Technology. Currently, he is working as chief scientist for Infobright Inc.
Dominik Slezak serves as a guest editor and reviewer for a number of international scientific journals, and as chair of several international scientific conferences. He has published over 50 peer-reviewed papers in books, journals, and conference proceedings. He has delivered a number of invited talks in Canada, China, Egypt, India, Japan, South Korea, Poland, and the United Kingdom. His research interests relate mainly to rough sets, data mining, KDD, data warehousing, bioinformatics, and medical and multimedia data analysis.
About Dominik's ICCS07 tutorial
The theory of rough sets focuses on the derivation of knowledge from data. Its advantage lies in simple yet powerful knowledge representation, as well as its relationship to the KDD task of feature reduction. Rough set principles are supported by efficient algorithms that are easily combinable with other methodologies, leading to valuable results in many fields, including multimedia, medicine, bioinformatics, web analytics, and more.
In this tutorial, we focus on both the foundations and the applications of rough sets. The first part of the tutorial presents examples of rough set-based data mining projects, while the second part shows how rough sets enabled BrightHouse - the database engine developed by Infobright Inc. - to efficiently query terabytes of compressed data with only a limited need for decompression. In summary, drawing on our academic and industry experience, we hope to convince attendees of the usefulness of rough sets in data mining and data warehousing.
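The core rough-set construction behind both parts of the tutorial is the pair of lower and upper approximations of a target concept under an indiscernibility relation. A minimal sketch, with a toy decision table of our own invention, might look like this:

```python
# Sketch of the basic rough-set construction: lower and upper
# approximations of a target set under an indiscernibility relation.
# The toy patient data and attribute names are illustrative assumptions.

from collections import defaultdict

def partition(objects, attributes):
    """Group objects that are indiscernible on the chosen attributes."""
    blocks = defaultdict(set)
    for name, row in objects.items():
        key = tuple(row[a] for a in attributes)
        blocks[key].add(name)
    return list(blocks.values())

def approximations(objects, attributes, target):
    """Lower approximation: indiscernibility blocks certainly inside
    `target`; upper approximation: blocks possibly intersecting it."""
    lower, upper = set(), set()
    for block in partition(objects, attributes):
        if block <= target:        # block entirely within the concept
            lower |= block
        if block & target:         # block overlaps the concept
            upper |= block
    return lower, upper

# Toy decision table: patients described by two symptoms.
patients = {
    "p1": {"fever": "yes", "cough": "yes"},
    "p2": {"fever": "yes", "cough": "yes"},
    "p3": {"fever": "yes", "cough": "no"},
    "p4": {"fever": "no",  "cough": "no"},
}
flu = {"p1", "p3"}   # target concept: patients diagnosed with flu

lower, upper = approximations(patients, ["fever", "cough"], flu)
print(lower)   # → {'p3'}: certainly flu, given these attributes
print(upper)   # → {'p1', 'p2', 'p3'}: possibly flu
```

The gap between the two approximations (here p1 and p2, indiscernible yet differently diagnosed) is exactly what drives rough-set feature reduction: an attribute subset is good if it keeps that boundary region small.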
|© ICCS 2007 Conference. Site Design by MSS Designs.|