Sunday, September 20, 2009

Self-checkout lines: the future of education?



As do many millions of people in the U.S., I am now regularly forced to use the self-checkout apparatuses at such national chain stores as Home Depot (shown above) and Walmart (shown here, in case you live under a rock). I must admit, I don't like them; but I also admit they make personnel and commercial sense for the companies that use them. After all, a manager doesn't have to hire or fire a machine, and machines don't need insurance, or morale talks, or discipline, or the zillion other interactions required for human resource management. My guess is that they also break down at a much more quantifiably predictable rate, and are probably easier to price and to plan for installing than employees are.

In the '80s, I was part of an (educationally profitable) band. One day, a few bandmates and I were standing around, killing time, and the drummer made a good-natured joke at one of the instrumentalists' expense. The bemused object of the joke retorted, "Hey, buddy, you can be replaced by a machine!" Back then it was quite funny, as the drum machines of the day sounded like a cross between tin foil being crinkled and wine glasses being dropped. With today's technology, however, any such retort would be quite accurate, and maybe also in bad taste. Still, even back then, a short conversation broke out about machines and drummers, with the eventual conclusion being that "you can 'hang out' with a drummer, but you can't with a machine." That was true at the time and will probably remain true for another couple of decades, though the general drift of how we'll first come to have A.I. buddies is now clear.

So, as has been well attested since the industrial revolution, people are being replaced by machines. Well, not completely, of course. At checkout kiosks, one employee keeps watch over a half dozen or so of the devices. The same thing happened in the yarn industry in the late 1700s, when one (lucky) employee could watch over eight or more spinning jennies. The difference, I suppose, is that such replacements are now occurring in the service industry, not just in manufacturing. It used to be said that this wasn't so bad, since somebody had to be trained to fix the machines. An optimistic view, but not true to the scale of the employment threat, since the number of people displaced is far greater than the number of technicians required to maintain the displacing technology, as the automobile industry's use of robots makes plain (video).

The sudden rise of the internet was not at first a problem for the education industry per se, but the super-addition of broadband tools that make posting audio and video almost effortless is. Add to this ever more powerful free, open-source, cloud-based software, and one begins to think that education would not require institutions like colleges and universities to deliver an adequate education. At this point, however, it is far too early to make such a claim, since having access to information and learning a subject are two different things, as anyone who has bought a calculus or foreign-language textbook knows.

From my observation, most of the educational success that occurs in formal institutional settings comes from peer motivation and personal coaching. People do best when operating in face-to-face social groups, whether in academics, sports, or even casual exercise programs. World of Warcraft crackheads notwithstanding, the internet is not yet able to offer a significant substitute for this social activity. Granted, that it might offer one someday cannot be ruled out, but we have neither fully immersive, three-dimensional displays (i.e., interfaces) nor even the internet infrastructure to deliver them. Furthermore, not all disciplines are skill-based, template-based, or procedure-based; or, put differently, education and engineering are not the same kind of endeavor. Still, in the early stages of education there is information that must be mastered, and often it is somewhat template-based, such as a technical vocabulary, a fairly standardized history of the discipline, and other lower-division kinds of overviews and introductions. And it is with these that educational institutions are being forced to change. By analogy, just because the checkout kiosk is tolerable for some retail activities does not mean I would want it for all of them. For instance, the case-by-case, situational complexities of medicine and chemistry mean that pharmacists are still required to review, and perhaps even discuss, what products I get at the back of Walmart, even if the fruit and vegetable guy at the front of Walmart need not quiz me about my choice of onion.

Ultimately, universities will have to separate (1) content that can be delivered for impersonal learning from (2) discipline-specific knowledge that requires a certain amount of personal interviewing and coaching. Quite a bit of money has traditionally been generated by using instructors to teach the former, but it turns out the latter is the more difficult and important task of education. Acquiring data is one thing; analyzing it is quite another, as astronomers, for example, well know. The internet will continue to be a fine (if financially disruptive) tool for content delivery, but the more content a student acquires, the more difficult the questions of what to do with that content become, and answering them requires a mind-to-mind engagement.

O.


Saturday, September 19, 2009

Is Higher Education Worth it?



Certainly here in the U.S., and across the world's developed nations, the answer is still a clear Yes.

The above image (duly lifted from The Economist) shows that in the U.S. a degree yields roughly a $100,000 advantage to the state's coffers (even after student-aid programs are taken into account) and around a $165,000 advantage to the individual him- or herself.

Some people have worried that there are too many college graduates, and that an oversupply would drive down what employers are willing to pay for them, but this has not been the case, so the time and money invested by a person seeking a college degree still pay off.
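
To put those lifetime figures in annual terms, here is a minimal back-of-the-envelope sketch in Python; the forty-year career length is my own assumption, not The Economist's, and discounting is ignored entirely:

    # Rough annual-terms reading of the chart's lifetime figures.
    # The 40-year career length is an assumption for illustration only.
    net_private_advantage = 165_000  # lifetime gain to the graduate (from the chart)
    net_public_advantage = 100_000   # lifetime gain to the state (from the chart)
    career_years = 40                # assumed working life after graduation

    # Spread the net gains evenly over a career, ignoring discounting:
    print(f"Private gain per year: ${net_private_advantage / career_years:,.0f}")
    print(f"Public gain per year:  ${net_public_advantage / career_years:,.0f}")

Even spread that thin, roughly $4,100 a year to the graduate and $2,500 a year to the state is not nothing, which is one more reason the answer above is still a clear Yes.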

Also, why people choose college at all is not clearly based on a calculation of these financial advantages:
Alison Wolf of King’s College London, the author of a book provocatively entitled “Does Education Matter?” says a big reason why school-leavers go to university is peer pressure. With many graduates to choose from, employers increasingly turn up their noses at anyone who does not sport a degree, no matter what the job’s requirements. The result is more akin to an arms race, with everyone running to stand still, than a recipe for increasing prosperity.[1]
Finally, higher education is always a good way to ride out times of unemployment and recession, because when the economy recovers, graduates are best placed to enter the marketplace with newly acquired technical skills. Of course, as a college professor, I find it both prudent and enjoyable to put forward such analysis.


O.

REFERENCES

[image] The Economist (from article below)

[1] "It still pays to study" The Economist Sept. 12, 2009.


Sunday, September 06, 2009

The ethics of getting tested for Alzheimer's



The New York Times is reporting that European researchers have discovered two genetic variants for Alzheimer's that "account for about 20 percent of the genetic risk of the disease." A second research team has also found one of the variants, as well as an additional one of its own.[1]

Not surprisingly, this brings up the old question: would you want to get tested for a disease for which there is no cure? As with all ethical issues, I imagine that people's intuitions will differ on this matter. For example, a young, single person might think that knowledge of an event that is very far off could only lower the quality of life in the here and now. But an older person with lots of family members reliant on his income or status might consider it a prudent way of managing an otherwise uncontrollable end.

A relevant consideration is Alzheimer's cost:
"In a 1994 report from the American Journal of Public Health on the economical and social costs of Alzheimer’s, it was the third most expensive disease in the United States after heart disease and cancer. They reported that the average lifetime cost of care for an Alzheimer’s patient is $174,000 with a two to twenty year life expectancy after diagnosis. This figure does not include the loss of wages both for the Alzheimer’s sufferer or the caregiver."[2]
Given this, some might argue that there is an ethical imperative to get tested early, no matter what your age. Health-care costs go up as the symptoms of Alzheimer's worsen, and the earlier somebody is diagnosed, the better position they will be in to prepare financially for when those costs arrive. Therefore, in a medical system which is partially reliant on taxpayers (e.g., Medicare), one owes it to the financially supporting community to put all information on the table.
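
As a minimal sketch of that financial-preparation point, consider the arithmetic below; the lifetime-cost figure is the $174,000 quoted above, while the lead times and the 3% real return are assumptions of mine, not anything from the cited report:

    # Annual savings needed to cover an average lifetime care cost of
    # $174,000 (the 1994 figure quoted above). The lead times and the
    # 3% real return are illustrative assumptions, not data from [2].
    lifetime_care_cost = 174_000
    real_return = 0.03  # assumed annual real return on savings

    for years_of_warning in (5, 15, 25):
        # Standard sinking-fund payment: P = F * r / ((1 + r)**n - 1)
        payment = lifetime_care_cost * real_return / ((1 + real_return) ** years_of_warning - 1)
        print(f"{years_of_warning} years of warning: ~${payment:,.0f} per year")

On these assumptions, five years of warning demands roughly $33,000 a year in savings, while twenty-five years of warning demands under $5,000 a year; that gap is the arithmetic behind the imperative.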

Consider a somewhat parallel case where one owes the community an acknowledgment of any blood disease when donating blood, since tainted blood would immediately harm those in the community. Alzheimer's also harms the community, though its peculiar harm is merely financial. Does that make a difference in the imperative to get tested?

O.

REFERENCES

[image] Jim Baen's Universe Blog

[1] Nicholas Wade, "Scientists Connect Gene Variations to Alzheimer’s," New York Times, Sept. 6, 2009.


[2] Carolyn Dean, MD, "The High Cost of Alzheimer’s," everything.com (accessed Sept. 6, 2009).
