When a corporate, government, or university IT department of the 1970s debated an upgrade to its IBM S/370 mainframes, it is doubtful that the IT director (this was before CIO was a common title) was in any way influenced by the computing experiences or opinions of their teenage children. These decisions were made by a select few, driven by institutional budgets and objectives.
Nor were the research debates then raging over the language features in the U.S. Department of Defense (DoD) draft specifications for what would become Ada driven by the need to support consumer devices or rapid software development by the distributed ranks of language aficionados. Many of us remember the vigor of the debates and the successive language specification drafts (e.g., Strawman, Ironman, Steelman), some of which were published in ACM SIGPLAN Notices.
To say that the world of computing has changed over the past four decades is to espouse a now tiresome platitude. Among the most profound changes has been the shift in the locus of influence from organizations to individuals. Today, no CIO wishing to remain employed would consider a major hardware or software deployment without broad input from his or her users, nor ignore the personal devices and expectations those users bring to the workplace. Smartphones, social networks, consumer email, cloud services, and community software and tools are blurring the boundaries between personal and corporate computing.
This is an inevitable consequence of the consumerization of IT and, equally importantly, the crumbling demarcations that historically separated our personal and professional lives. We have all responded to work-related email while nominally on vacation. I daresay most of us have also checked the status of our Facebook friends while at work.
This consumerization brings challenges and opportunities. Organizational leaders rightly fear the security risks of consumer devices on institutional networks, with possible information leakage and concomitant political, legal, and financial exposure. Conversely, employees desire convenience, flexibility, and familiarity, particularly when economic constraints are forcing them to work even harder and do more with less.
It is worth remembering that we in research and education are not immune to these trends. I was once forced to explain to a colleague, in the midst of a desperate, final push to finish a grant proposal, that we had no choice but to sandbox his machine (i.e., limit its network access) because it had been infected and was the source of a denial-of-service attack on the network. He protested, rightly, that the proposal was important. I observed, also rightly, that the risks and consequences of the attack were not his alone. His personal choices had organizational implications.
Beyond the inevitable balancing of personal prerogatives and organizational processes lie the social contexts that shape our research agendas. IT consumerization touches the way we envision the future, the projects we pursue, and the methods we embrace. Even more importantly, it defines the framework of discourse. Yes, we can do amazing things with today's consumer technology, and we should, but other opportunities also await.
One could have conducted social network research using those 1970s mainframes and their timesharing systems. Only a few hardy and intrepid souls did, for which we must be thankful. Computing research is about the judicious balance between the present practical and the future possible. We must leverage the practical while envisioning the possible.