WEDNESDAY, NOVEMBER 14
Access Grid Enabled
The Computational Continuum from Human Genomes to Human Health
Moderator: Raquell M. Holmes, Research Assistant Professor, Center
for Computational Science, Boston University
Panelists: Thomas Bartol,
Computational Neurobiology Laboratory, The Salk Institute for Biological
Studies; Charles Delisi, Professor Biomedical Engineering, Boston
University; Micah Dembo, Professor Biomedical Engineering, Boston
University; Joel Stiles, Senior Scientific Specialist, Computational
Neuroscience, Pittsburgh Supercomputing Center
The success of the Human
Genome Project has thrust the field of bioinformatics and computational
biology into the national limelight. As more and more scientists turn
to the frontier of computational biology, it is important to identify
the current trends and future computational needs of research across
the biological spectrum. This panel brings together computational
scientists who have pushed, and continue to push, the envelope in the areas
of genomics, cellular processes, and human health. Panelists will
discuss computational resources and approaches that are transforming
the way life sciences are done and understood.
The panel is directed towards
scientists who are interested in better understanding computational
challenges in the biosciences. Questions to be addressed include the following:
What are the computational
approaches used in current research of human genomes, cellular processes,
and human health?
What computing resources
are utilized in these areas?
What technological and
computing advances have aided in bringing the research to where
it is today?
What are the computing
advances that are needed to improve research in these areas?
Access Grid Enabled
Do Undergraduate Faculty Know that Computational Science is the Future?
Moderator: Scott Lathrop, Program Manager for Education, Outreach
and Training, National Center for Supercomputing Applications
Panelists: Rubin Landau, Professor Physics, Director of Computational
Physics Program, Oregon State University; E. Bruce Pitman, Professor
Mathematics and Vice-Provost for Educational Technology, State University
of New York, Buffalo; Kris Stewart, Professor Computer Science, San
Diego State University and Director of NPACI/CSU Education Center
on Computational Science and Engineering; Gabriele Wienhausen, Provost
for the Sixth College, University of California, San Diego
Computational Science is
making possible significant innovations and breakthroughs within
academia, industry, and government. These advances have followed
the efforts of schools, colleges, and universities to prepare mathematicians,
scientists, and engineers with the skills and insights needed to effectively
utilize modern computational and informational technologies. Yet many
educators are either unaware of how to integrate computational science
into their courses, or feel that the mere use of computers in their
teaching and research automatically provides them with the necessary expertise.
This panel is directed
to faculty, scientists, technologists, deans, and administrators.
We aim to stimulate their interest in the developing resources and
opportunities for expanding computational science on their own campuses.
The proposed panel session
will address Computational Science and its integration into undergraduate
education. Topics to be discussed will include the following:
What is computational science?
How do Computational Science programs differ from Informatics
and Information Technology programs?
What balance between
mathematics, computer science, and scientific discipline is necessary
to ensure a successful program?
What are the challenges
in keeping this a strong and vibrant triad?
Do we really need to
educate students in computational science?
Will this benefit careers
in business and industry? How will this improve research in industry?
Which fields will benefit?
The industry perspective on needed skills and knowledge.
The impact of programs
as experienced by their alumni.
Review of exemplary programs:
What are the qualities in these programs that
make them successful and attractive to students?
What are the aspects of programs that have
not succeeded and should be avoided?
The computational science
continuum from K-12 to undergraduate to graduate education.
What resources and opportunities
(including funding) are available or needed to aid educators in
integrating computational science into their courses?
How do we ensure that
these educational innovations address all fields, including the social
sciences, and are fair to the developers?
Cultural and societal issues as we move into a global IT society,
including privacy, multi-cultural, and multi-lingual issues.
THURSDAY, NOVEMBER 15
10:30am - Noon
Access Grid Enabled
The Access Grid: Where
the Vision Meets Reality
Moderator: Emilee Patrick, Motorola Labs
Panelists: Crysta Metcalf, Anthropologist, Motorola Research Labs;
Don Morton, Associate Professor Computer Science, The University of
Montana, Missoula; Rick L. Stevens, Director Mathematics and Computer
Science Division, Argonne National Laboratory, and Professor of Computer
Science, University of Chicago; Jennifer Teig von Hoffman, Senior
Analyst, Boston University
The Access Grid (AG) is
the future of the Internet: a network of multimedia nodes, connected
by multicast over high-bandwidth networks. Each AG node is an ensemble
of resources enabling human interaction across the Grid, from high-end
audio and visual technology to interfaces to grid middleware and visualization
environments. In addition, large-format displays, multiple cameras
and microphones, and dedicated meeting rooms provide support for real-time
communication among entire groups, not just individuals.
Experiencing the AG can
feel as intimate as a private telephone call between two locations,
or as impersonal as attending a talk in a large auditorium. This is
not your mother's idea of videoconferencing. This panel will take
place on the Access Grid. Not only will the session be made available
to anyone with the requisite equipment and high-speed connection,
but attendees will experience firsthand the nature of interaction
over the AG. Four panelists will each present their own vision for
the future of communication and collaboration via the Internet, and
then engage in a discussion of how this compares with the usage, utility,
and usability of the Access Grid as it exists today.
Specific questions will
include, but will not be limited to:
Should the AG support
large scheduled meetings or smaller, informal gatherings?
Is it possible to do both?
What is the nature of
the AG community today: who are the people that use the AG, and
how do they use it?
What is the most compelling
application for the AG in the future?
How should it evolve?
What are the privacy
implications of ubiquitous AG technology?
From the perspective
of a novice user, what is the single most important problem that,
if it were solved, would bring the biggest improvement to the AG?
For more information about the Access Grid, see the AG Home Page.
FRIDAY, NOVEMBER 16
HPC Software: Have
We Succeeded in Spite of It or Because of It?
Moderator: John M. Levesque, Senior Technologist, Cray Inc.
Panelists: Walt Brainerd, Owner, The Fortran Company; Chris Doehlert,
President & CEO, Etnus; Michael Gittings, Guest Scientist, Los Alamos
National Laboratory, and Assistant Vice President and Chief Scientist,
Science Applications International Corporation; Bill Gropp, Senior
Computer Scientist and Associate Division Director, Mathematics and
Computer Science Division, Argonne National Laboratory; David Kuck,
Intel Fellow and Director KAI Software Lab, Intel Corporation; James
R. Taft, Technical Director, Advanced Computing Technologies, NASA
Ames Research Center
The moderator believes that we have succeeded
in spite of HPC software. The state of HPC software is poor at best
and the future of HPC software is dismal. The current thrust towards
Linux and open source software does not bode well for software. My
questions are targeted at a few specific problem areas.
First, I believe standards
have the most impact on the efficiency of the resulting software.
Fortran 90 standards have ruined the efficiency of the Fortran language.
On the other hand, the MPI standards group was able to define a set
of calls that could utilize the efficiency of the system without many constraints on the implementation.
A second area of concern
is academic HPC software research. Has this research contributed
to principal application developers? There are two different ways
such research can benefit the end-user: (1) be implemented within
the principal software components developed by the hardware/software
vendors or (2) be used directly by the end-user. An important measurement
for the panel is the efficiency of a software component on state-of-the-art
hardware. HPF formulated a language that was impossible to implement
efficiently and subsequently was never used by the major application
developers; MPI, on the other hand, formulated a language (library)
that could be implemented efficiently and is the principal tool in
the HPC industry today.
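To make the contrast concrete, the following is a minimal sketch (in C, with hypothetical values, not drawn from any panelist's code) of the kind of simple, portable calls the MPI standard defines for passing a message between two processes:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, value;
        MPI_Status status;

        MPI_Init(&argc, &argv);               /* start the MPI runtime        */
        MPI_Comm_rank(MPI_COMM_WORLD, &rank); /* rank identifies this process */

        if (rank == 0) {
            value = 42;
            /* send one integer to rank 1 with message tag 0 */
            MPI_Send(&value, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
        } else if (rank == 1) {
            /* receive one integer from rank 0 with message tag 0 */
            MPI_Recv(&value, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, &status);
            printf("rank 1 received %d\n", value);
        }

        MPI_Finalize();                       /* shut down the MPI runtime    */
        return 0;
    }

Each call corresponds closely to an operation the underlying system can perform efficiently (copying a buffer, delivering a message), which is the property credited here for MPI's success relative to HPF.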
Questions addressed to
Bill Gropp and Walt Brainerd on standards:
How are features brought
to a standards group?
When considering a feature,
are the efficiencies of implementation considered? For example,
if a feature is very difficult or impossible to implement efficiently
on a majority of the hardware architectures, is it given serious
consideration?
What suggestions do you have to improve HPC software standards?
Questions addressed to David Kuck and Chris Doehlert on software implementation:
What are the priorities for the following
characteristics of software development?
(c) ease of use
Please comment on the
difficulties of targeting Fortran 90 in your software. How much
of your work is based on academic research?
Has government funding
of numerous universities for HPC software research aided your development?
Questions addressed to Jim Taft and Michael Gittings on use of HPC software:
Considering all software
components, prioritize the software that you use to get your job done.
What non-application software, if any, that you or your group developed
out of necessity do you feel should have been supplied by a hardware
and/or software vendor?
Going back over the years, please list what,
if any, software you have used which was developed by HPC academic
research groups. If the list is small or empty, please give your
views on the third question under the software implementers' list.
Best and Worst Ideas
Moderator: H. J. Siegel, Professor, Colorado State University
Panelists: James C. Browne, Professor Computer Science, University
of Texas, Austin; Cherri M. Pancake, Professor, Oregon State University;
Guy Robinson, Research Liaison/MPP Specialist, Arctic Region Supercomputing
Center, University of Alaska; Charles Seitz, CEO & CTO, Myricom, Inc.;
Burton Smith, Chief Scientist, Cray Inc.; Marc Snir, Professor, University
of Illinois, Urbana-Champaign
The title of this panel
is, to some extent, self-explanatory, but if that were all we said,
the abstract would be too short. Thus, the questions below expand
on the issues that the panelists may wish to consider. We are pleased
to have a very distinguished group of panelists covering a wide range
of aspects of supercomputing. We invite you to join in the discussion
with your answers to the following questions, and with your own questions
for the panelists and audience.
Some questions to be put
forth at the panel are as follows:
What makes an idea a
"supercomputing idea"? (You cannot use the word "supercomputer"
in your answer.)
What are the criteria
for deciding what is best or worst?
What ideas have gone,
over time, from best to worst, worst to best, or both?
What ideas have stayed
around too long?
From ideas that were
bad, what good kernels have been extracted?
From ideas that were
good, what bad impacts have occurred?
Are some ideas reminiscent
of a "Tale of Two Cities"it is the best of ideas and the worst
Is one person's best
idea choice another person's worst idea choice?
What best and worst
ideas will hardware, software, and applications people agree to?
What ideas change from
best to worst (or vice versa) if your perspective changes among
hardware, software, and applications?
What best ideas have
been stifled due to industry (i.e., economic) factors?
How much should breadth
of applicability be used to judge the "bestness" of an idea?
What has been the impact
of marketing (i.e., mass appeal) on the ability to develop an idea
to find out if it is best or worst, or does this not affect supercomputers?
What has been the impact
of government initiatives on what can become a best idea?
When have standards
been enablers/stiflers of best ideas?
When have standards
been enablers/stiflers of worst ideas?
Can we use a knowledge
and understanding of supercomputing's best and worst ideas of the
past to develop new best ideas and avoid new worst ideas in the future?