In the not-so-distant past, there was a proverbial "digital divide" that bifurcated workers into those who knew how to use computers and those who didn't.[1] Young Gen Xers and their later millennial counterparts grew up with Power Macs and Wintel boxes, and that experience supposedly made them natives who knew how to make these technologies do productive work. Older generations were going to be wiped out by younger workers better adapted to the needs of the modern digital economy, upending the customary notion that professional experience equals value.
Of course, that was just a narrative. Facility with computers was measured by the ability to turn one on and log in, a bar so low that it can be shocking to the modern reader that a "divide" existed at all. Software engineering, computer science, and statistics remained quite unpopular compared to other academic programs, even in universities, let alone in primary and secondary schools. Most Gen Xers and millennials never learned to code, or, frankly, even to build a pivot table or calculate basic statistical averages.
There's a sociological change underway, though, and it's going to make the first divide look quaint in hindsight.
From TechCrunch