2013-2014 Distinguished Lecture Abstracts

Yale N. Patt
Professor of Electrical and Computer Engineering
Ernest Cockrell, Jr. Centennial Chair in Engineering, and
University Distinguished Teaching Professor
The University of Texas at Austin

The correct FIRST course in computing for serious students

11:00 a.m. Friday, May 2, 2014 
Room 2005 Emerging Technologies Building


After 50 years of teaching computer science and computer engineering students, I am convinced that the conventional method of introducing computing to students via a high-level language course (and worse yet, an object-oriented course in Java) is a mistake. Students have no understanding of how a computer works, and so they are forever memorizing patterns and hoping they can apply those patterns to the application at hand. Unfortunately, memorizing is not learning, and the results have been as expected. I believe the correct approach is my "motivated bottom-up" approach, which I introduced to the freshman class at Michigan in 1995 and have been continually teaching and refining ever since.

I start with the switch-level behavior of a transistor and build from there: from transistors to gates, muxes, decoders, gated latches, finite state machines, memory, the LC-3 computer, machine language programming, assembly language programming, and finally C programming. Students continue to build on what they already know, continually raising the level of abstraction. I have taught the course 12 times over the last 18 years to freshmen at Michigan and Texas, and I have taught an accelerated version at USTC in Hefei and at Zhejiang University in Hangzhou at the invitation of those universities. In this talk, I will describe the course, why I think it makes sense, how it affects the rest of the curriculum, and my experiences with it.
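
To make the layering concrete (this is an editorial illustration, not material from the talk), the following Python sketch builds ordinary logic gates out of a single NAND primitive and then a gated D latch out of those gates, mirroring the transistors-to-storage steps of the progression; all names and the fixed-point settling loop are assumptions of the sketch.

    # Toy bottom-up layering: one primitive, then gates, then a storage element.
    def nand(a, b):
        """Primitive gate: the level a transistor network implements directly."""
        return 0 if (a and b) else 1

    def not_(a):    return nand(a, a)
    def and_(a, b): return not_(nand(a, b))
    def or_(a, b):  return nand(not_(a), not_(b))

    class GatedDLatch:
        """Gated D latch built only from NAND gates (level-sensitive storage)."""
        def __init__(self):
            self.q, self.qn = 0, 1
        def step(self, d, we):
            s = nand(d, we)        # active-low set
            r = nand(not_(d), we)  # active-low reset
            for _ in range(2):     # settle the cross-coupled NAND pair
                self.q = nand(s, self.qn)
                self.qn = nand(r, self.q)
            return self.q

    latch = GatedDLatch()
    print(latch.step(1, 1))  # write enable on, store 1 -> prints 1
    print(latch.step(0, 0))  # write enable off -> latch holds 1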


Parallelism: A Serious Goal or a Silly Mantra (... and what else is needed for the Microprocessor of 2024)

4:00 p.m. Friday, May 2, 2014 
Room 2005 Emerging Technologies Building


The microprocessor of 2024 will have two things going for it: more than 50 billion transistors on each chip and an opportunity to properly harness the transformation hierarchy.  We hear a lot about the parallelism that one will get from those 50 billion transistors, but very little about what rethinking the transformation hierarchy will enable.  In fact, almost everyone in the computer industry these days seems to be promoting parallelism, whether or not they have any clue whatsoever as to what they are talking about. That is not to say everyone is clueless about parallelism.  Indeed, the Parasol Lab here at Texas A&M is an excellent example that not everyone is clueless.  However, most of the current preoccupation with parallelism is due in large part to the highly visible and well-advertised continuing benefits of Moore's Law, manifested in more and more cores on a chip.  More cores = more opportunity for parallelism. By 2024, we will clearly have more than 1000 cores on a chip -- whether we can effectively utilize them or not does not seem to curb the enthusiasm.  What I would like to do today is examine parallelism, note that it did not start with the multicore chip, observe some of the silliness it has recently generated, identify its fundamental pervasive element, and discuss some of the problems that have surfaced due to its major enabler, Moore's Law.  I would also like to try to show how the transformation hierarchy can turn the bad news of Moore's Law into good news, and play an important role in the microprocessor of 2024.



Yale N. Patt is Professor of ECE and the Ernest Cockrell, Jr. Centennial Chair in Engineering at The University of Texas at Austin. He continues to thrive on teaching both the large (400+ students) freshman introductory course in computing and advanced graduate courses in microarchitecture, directing the research of eight PhD students, and consulting in the microprocessor industry.  Some of his research ideas (e.g., HPS, the two-level branch predictor, ACMP) have ended up in the cutting-edge chips of Intel, AMD, and others, and some of his teaching ideas have resulted in his motivated bottom-up approach for introducing computing to serious students. The textbook for his unconventional approach, "Introduction to Computing Systems: from bits and gates to C and beyond," co-authored with Prof. Sanjay Jeram Patel of Illinois (McGraw-Hill, 2nd ed., 2004), has been adopted by more than 100 universities worldwide.  He has received the highest honors in his field for both his research (the 1996 IEEE/ACM Eckert-Mauchly Award) and his teaching (the 2000 ACM Karl V. Karlstrom Outstanding Educator Award). He was the inaugural recipient of the IEEE Computer Society B. Ramakrishna (Bob) Rau Award in 2011, and was named the 2013 recipient of the IEEE Harry H. Goode Memorial Award.  He is a Fellow of both IEEE and ACM, and a member of the National Academy of Engineering.  More detail can be found on his web page: www.ece.utexas.edu/~patt.

Faculty Contacts: Dr. Nancy M. Amato (amato [at] cse.tamu.edu) and Dr. Lawrence Rauchwerger (rwerger [at] cse.tamu.edu)

The Science of Cause and Effect

Judea Pearl
Departments of Computer Science and Statistics
University of California, Los Angeles

10:20 a.m., Friday, May 9, 2014
Room 169, Blocker Building

Recent developments in graphical models and the logic of causation have had a drastic effect on the way scientists now treat problems involving cause-effect relationships. Paradoxes and controversies have been resolved, slippery concepts have been demystified, and practical problems requiring causal information, which were long regarded as either metaphysical or unmanageable, can now be solved using elementary mathematics.

I will review concepts, principles, and mathematical tools that were found useful in this transformation, and will demonstrate their applications in several data-intensive sciences. These include questions of causal effect estimation, confounding control, policy analysis, misspecification tests, heterogeneity, selection bias, missing data, and the integration of data from diverse studies.
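
One of the elementary tools alluded to here, taken from Pearl's Causality, is the back-door adjustment formula: if a set of covariates Z satisfies the back-door criterion relative to treatment X and outcome Y, then the effect of the intervention do(x) reduces to ordinary conditional probabilities,

    P(y \mid do(x)) = \sum_{z} P(y \mid x, z)\, P(z),

so the causal effect can be estimated from observational data alone once such a Z is identified.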

Reference: J. Pearl, Causality (Cambridge University Press, 2000)
Background material: http://ftp.cs.ucla.edu/pub/stat_ser/r355-reprint.pdf; http://ftp.cs.ucla.edu/pub/stat_ser/r372.pdf; http://ftp.cs.ucla.edu/pub/stat_ser/r417.pdf
Working papers: http://bayes.cs.ucla.edu/csl_papers.html


Judea Pearl is a Professor of Computer Science and Statistics at UCLA. He is a graduate of the Technion, Israel, and joined the faculty of UCLA in 1970, where he currently directs the Cognitive Systems Laboratory and conducts research in artificial intelligence, causal inference, and the philosophy of science. Pearl has authored several hundred research papers and three books: Heuristics (1984), Probabilistic Reasoning (1988), and Causality (2000; 2009). He is a member of the National Academy of Engineering, the American Academy of Arts and Sciences, and the National Academy of Sciences, and a Fellow of the IEEE, AAAI, and the Cognitive Science Society.

Pearl received the 2008 Benjamin Franklin Medal for Computer and Cognitive Science and the 2011 David Rumelhart Prize from the Cognitive Science Society. In 2012 he received the Technion's Harvey Prize and the ACM A.M. Turing Award.

Home page: http://bayes.cs.ucla.edu/jp_home.html
Extended bio: http://amturing.acm.org/award_winners/pearl_2658896.cfm

Faculty Contact: Dr. Nancy Amato (amato [at] cse.tamu.edu)

Dealing with your Digital Stuff: Managing Digital Collections

Steven M. Drucker
Principal Researcher and Manager of the VUE group in VIBE
Microsoft Research

4:10 p.m., Monday, March 31, 2014
Room 124, Bright Building


George Carlin once famously talked about 'stuff'...

That's all I want, that's all you need in life, is a little place for your stuff, ya know? I can see it on your table, everybody's got a little place for their stuff. This is my stuff, that's your stuff, that'll be his stuff over there. That's all your house is: a place to keep your stuff. If you didn't have so much stuff, you wouldn't need a house.

But now our 'stuff' is digital and there's more of it than ever. It comes in the form of photos, music, presentations, tweets, web pages, email, documents and data in an endless variety of other forms. How can we organize our stuff? Find it? Browse it? Make sense of it? Show it to other people? We explore a variety of projects, from photo collection organizers to assisted clustering systems, all with interfaces inspired by principles of information visualization, and find some common techniques for dealing with our digital 'stuff'.



Dr. Steven M. Drucker is a Principal Researcher and manager of the VUE group in VIBE at Microsoft Research (MSR), focusing on human-computer interaction for dealing with large amounts of information. He is also an affiliate professor in the University of Washington Computer Science and Engineering Department.

Before coming to Microsoft, he received his Ph.D. from the Computer Graphics and Animation Group at the MIT Media Lab in May 1994. His thesis research was on intelligent camera control interfaces for graphical environments.

He has demonstrated his work on stage with Bill Gates at the Consumer Electronics Show (CES); shipped web software for gathering and acting on information collected on the web; been written up in The New York Times; filed over 108 patents; and published papers on topics as diverse as exploratory search, information visualization, multi-user environments, online social interaction, hypermedia research, human and robot perceptual capabilities, robot learning, parallel computer graphics, spectator-oriented gaming, and human interfaces for camera control.

His email address is sdrucker [at] microsoft.com, and his web site is http://research.microsoft.com/~sdrucker.

Faculty Contact: Dr. Andruid Kerne (andruid [at] cse.tamu.edu)


Parsing with Pictures

Keshav Pingali
W.A."Tex" Moncrief Chair of Grid and Distributed Computing Professor
Department of Computer Science
University of Texas, Austin

4:10 p.m., Monday, October 14, 2013
Room 124, Bright Building


The development of elegant and practical algorithms for parsing context-free languages is one of the major accomplishments of 20th century Computer Science. These algorithms are presented in the literature using string rewriting or abstract machines like pushdown automata, but the resulting descriptions are difficult to understand even for experts in programming languages.

In this talk, we show that these problems are avoided if parsing is reformulated as the problem of finding certain kinds of paths in a graph called the Grammar Flow Graph (GFG) that is easily constructed from a context-free grammar. Intuitively, the GFG permits parsing problems for context-free grammars to be formulated as path problems in graphs in the same way that non-deterministic finite-state automata do for regular grammars. We show that the GFG enables a unified and elementary treatment of parsing algorithms such as Earley's parser for general context-free grammars, recursive-descent parsers for LL(k) and SLL(k) grammars, and shift-reduce parsers for LR(k) and SLR(k) grammars, among others.
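
For readers who have not seen the item-based formulation that the GFG recasts, here is a minimal Python sketch of Earley's recognizer for a toy expression grammar (an editorial illustration, not an artifact of the talk; the grammar and all names are assumptions). The talk's point is that exactly this kind of item bookkeeping becomes a path problem over the Grammar Flow Graph.

    # Minimal Earley recognizer. chart[i] holds items (lhs, rhs, dot, origin):
    # "lhs -> rhs with the dot after position `dot`, started at token `origin`".
    GRAMMAR = {
        "S": [("S", "+", "M"), ("M",)],
        "M": [("M", "*", "T"), ("T",)],
        "T": [("a",)],
    }
    START = "S"

    def earley_recognize(tokens):
        n = len(tokens)
        chart = [set() for _ in range(n + 1)]
        chart[0] = {(START, rhs, 0, 0) for rhs in GRAMMAR[START]}
        for i in range(n + 1):
            work = list(chart[i])
            while work:
                lhs, rhs, dot, origin = work.pop()
                if dot < len(rhs) and rhs[dot] in GRAMMAR:
                    # Predict: start every production of the expected nonterminal.
                    for prod in GRAMMAR[rhs[dot]]:
                        item = (rhs[dot], prod, 0, i)
                        if item not in chart[i]:
                            chart[i].add(item)
                            work.append(item)
                elif dot < len(rhs):
                    # Scan: move the dot across a matching terminal.
                    if i < n and tokens[i] == rhs[dot]:
                        chart[i + 1].add((lhs, rhs, dot + 1, origin))
                else:
                    # Complete: advance items that were waiting on this lhs.
                    # (No epsilon productions here, so origin < i and
                    # chart[origin] is already final when we read it.)
                    for (l2, r2, d2, o2) in list(chart[origin]):
                        if d2 < len(r2) and r2[d2] == lhs:
                            item = (l2, r2, d2 + 1, o2)
                            if item not in chart[i]:
                                chart[i].add(item)
                                work.append(item)
        return any(l == START and d == len(r) and o == 0
                   for (l, r, d, o) in chart[n])

    print(earley_recognize("a + a * a".split()))  # True
    print(earley_recognize("a + * a".split()))    # False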

These results suggest that the GFG can be a new foundation for the study of context-free languages. No knowledge of parsing is required to understand this lecture -- the only prerequisites are elementary graph theory and an open mind.

This is joint work with Gianfranco Bilardi of the University of Padova, Italy.



Keshav Pingali is a Professor in the Department of Computer Science at The University of Texas at Austin, and he holds the W.A. "Tex" Moncrief Chair of Computing in the Institute for Computational Engineering and Sciences (ICES) at UT Austin. He was on the faculty of the Department of Computer Science at Cornell University from 1986 to 2006, where he held the India Chair of Computer Science. Pingali is a Fellow of the ACM, IEEE, and AAAS. He served as co-Editor-in-Chief of ACM Transactions on Programming Languages and Systems (TOPLAS) from 2007 to 2010, and on the NSF CISE Advisory Committee from 2009 to 2012.

Faculty Contact: Dr. Lawrence Rauchwerger (rwerger [at] cse.tamu.edu)