http://cacm.acm.org/blogs/blog-cacm/98517
In the 19th century, writing about his work on mechanical calculating devices, Charles Babbage noted, "The most constant difficulty in contriving the engine has arisen from the desire to reduce the time in which the calculations were executed to the shortest which is possible." Roughly a century later, Daniel Slotnick wrote retrospectively about the ILLIAC IV parallel computing design, "By sacrificing a factor of roughly three in circuit speed, it's possible that we could have built a more reliable multiquadrant system in less time, for no more money, and with a comparable overall performance."
Babbage's design challenged the machining and manufacturing capabilities of his day, though others were recently able to build a functioning system using parts fabricated to tolerances achievable with 19th-century processes. Similarly, Slotnick's design strained the electronics and early semiconductor fabrication and assembly capabilities of his era. Today, of course, parallel computing designs embodying tens of thousands of processors are commonplace, leveraging inexpensive commodity hardware.
There is a lesson here that systems designers repeatedly ignore at their peril. Simple designs usually triumph, and the artful exploitation of mainstream technologies usually bests radical change.
All of which is to say that incrementalism wins repeatedly, right up to the point when a dislocating phase transition occurs. There are, of course, many paths to failure. One can be too early or too late. Put another way, you want to be the first person to design a successful transistorized computer system, not the last person to design a vacuum tube computer. The same is true of design approaches such as pipelining, out-of-order issue and completion, superscalar dispatch, cache design, and programming tools.
Any designer's challenge is to pick the right technologies at the right time, recognizing when inflection points, such as the maturing of disruptive technologies, are near.
The shift from largely proprietary high-performance computing designs to predominantly commodity clusters a decade ago was only the most recent such transition. Arguably, we are near another disruptive technology point. The embedded hardware ecosystem offers one intriguing new performance-power-price point, particularly as we consider trans-petascale and exascale designs that are energy constrained. The experience of cloud providers in building massive-scale infrastructures for data analytics and on-demand computing suggests another possibility.
As I frequently told my graduate students at the University of Illinois at Urbana-Champaign, the great thing about parallel computing is that the question ("How can I increase performance?") never changes, but the answers do. Babbage would have understood.
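One classic formalization makes the point concrete: Amdahl's law. If a fraction p of a computation can be parallelized across N processors, the overall speedup is at most

    S(N) = 1 / ((1 - p) + p/N),

which approaches 1/(1 - p) no matter how large N grows. The serial fraction, not the processor count, eventually dominates, which is one reason each generation of designers must find new answers to the old question.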
http://cacm.acm.org/blogs/blog-cacm/98702
I direct "Georgia Computes!", an effort funded by the National Science Foundation's Broadening Participation in Computing program, in which we are trying to improve and broaden computing education across the pipeline, statewide. We spend a lot of effort offering professional development to high school teachers and undergraduate faculty. We are increasingly getting signals that faculty in the University System of Georgia (USG) are turning more toward research and away from teaching.
Over the four years of the project, we have had fewer and fewer USG faculty attend our workshops. We have had many return visitors, and good coverage across the 29 USG institutions with computing departments. But these days we are lucky to get a half-dozen attendees overall. Our external evaluator conducted interviews with faculty around the state to help us understand the attendance issue. The answer was pretty much the same from everyone, characterized by this interview quote: "In any department, only about 20% of the faculty care about undergraduate teaching. You got them all."
One of my Ph.D. students, Lijun Ni, is studying computing teachers: their sense of identity and how they improve their practice. She interviewed USG faculty at institutions whose mission is primarily undergraduate teaching. When asked, "What do you do to improve as a computing educator?" one faculty member told her, "I'm not a computing educator. I don't want to improve at it. My tenure case depends on my research work. I'm not going to spend any time working at being a better teacher."
This tension has always existed in the American university system. In his book How Scholars Trumped Teachers, Larry Cuban points out that American land-grant universities were designed in the late 1800s to merge the British focus on undergraduate education with the German model of the research university. But nobody asked whether that merger was even possible, and Cuban argues that the structure of the American university pretty much prevents the potential synergy of research and teaching from ever working.
The question I'm raising here is whether we can afford a shift toward research and away from teaching in the United States. There is evidence suggesting that the increasing costs of higher education are due not to growth in instructional costs, but to growth in costs associated with sponsored programs and graduate education. In his "Why Universities Do Research" post, Rich DeMillo points out that university research rarely pays for itself. Doing research is more expensive than doing education well.
Of course, we need research in American universities. It's absolutely critical for graduate education. But do we need research in our undergraduate institutions? Rich also cites a new study in his "Damaged Pipelines and the Future of Innovation" post showing that many of the undergraduate institutions that produce the most future Ph.D. students do not have (large, expensive) research programs. Number two on that list is Harvey Mudd College, which prides itself on a "liberal arts engineering" focus, where undergraduate education is the top priority. If we can have high-quality undergraduate education without research programs, and research programs are so expensive, and higher education costs are growing too quickly for our nation's economy to absorb, does it make sense to steer away from teaching and toward research?
http://cacm.acm.org/blogs/blog-cacm/99121
A thought-provoking article on games in education, "Learning by Playing: Video Games in the Classroom," appeared in The New York Times recently. Katie Salen, author of the indigestible yet indispensable academic game design bible Rules of Play, has a project in a New York City school called Quest to Learn in which children's learning is based around game playing and design. It sounds like a big-budget, integrated version of what teachers in the U.K. have been doing in their own practice: teachers playing games on a big screen in front of the class while soliciting advice from a super-engaged, drooling set of kids; and learners using game-authoring software to explore concepts through game design. What's different is that this project has been systematically devised and rolled out across a school, rather than relying on the individual innovative practices of some teachers. Particularly telling is the fact that three game designers are working with 11 teachers to devise games around interdisciplinary curricular content. That's a fantastic resource.
The Times article acknowledges that sixth-graders who took part in Quest to Learn did no better on standardized tests than learners who had not had the privilege. This makes me uneasy. Sure, you can argue that Quest to Learn values different educational goals, goals more relevant to 21st-century society than those of ordinary curricula, and that different assessment tools are therefore required. But is this just special pleading? It may be that we need to win the wider battle of making school assessment more appropriate to the skills required in current society before we worry about installing Wiis in the classroom.
One of the most interesting aspects of the article relates to the metaphor used to organize school life. The author writes: "What if we blurred the line between academic subjects and reimagined the typical American classroom so that, at least in theory, it came to resemble a typical American living room, or child's bedroom or even a child's pocket circa 2010...what if, instead of seeing school the way we have always seen it, we saw school for what our children dreamed it might be: a big, delicious video game?"
Guy Claxton argues that schools are currently organized around a factory metaphor: grim tasks that must be completed under the eye of a supervisor figure, with time strictly marshaled by bells. He proposes a metaphor he considers more fruitful: learning as gymnasium, where learners strive to improve their own performance by exercising their brains in an enjoyable fashion. This is consistent with the learning-as-game metaphor, but perhaps more general purpose. In any case, it is about time we reconsidered how schools work as learning spaces. The industrial revolution has had its day, and we are beginning to realize it is counterproductive to coop children up in learning factories. That's why we need experiments like Quest to Learn.