Monday, November 26, 2007

Cyborg army coming soon

I am amazed at how fast infantry-level technology is moving. Here is a video titled "Exoskeleton Turns Humans Into Terminators." It shows actual (and successful) research into exoskeletal robotic systems that enhance soldiers' abilities.

At this point the power supply seems to be the problem, since it takes a lot of juice to move both the suit and its load around. Such a suit would have to be plugged into a portable generator, and would probably work behind the lines in a supply and warehouse labor pool. Still, the time saved in moving missiles, bombs, and other heavy items around could very well be mission critical. Much of this has been conceived before [ex. 1] [ex. 2] [ex. 3]; but with the advent of reliable, high-wattage portable power supplies (even short-term ones), infantry technology would move very quickly toward such techno-infantry conceptions.



Friday, November 23, 2007

Stem cell breakthrough announced

The New York Times reported yesterday that a new method of creating stem cells has been discovered. In some ways this validates the charges of those who said the first method (which used developing embryos) was rushed into technological implementation without consideration of other possible approaches.

Happily, the scientist who bore the brunt of the controversy for using the old embryonic method is the same one who came up with the new method. Here is a short review of what stem cells are:
Stem cells, universal cells that can turn into any of the body’s 220 cell types, normally emerge only fleetingly after a few days of embryo development. Scientists want to use them to study complex human diseases like Alzheimer’s or Parkinson’s in a petri dish, finding causes and treatments. And, they say, it may be possible to use the cells to grow replacement tissues for patients.[1]
I am most excited about the option of growing replacement tissue. This might be the magic bullet that fixes the problem of too few organs for too many organ-transplant needs. One could grow the specific type of tissue for organ repair, or perhaps even grow a second organ on site and then do a "parts swap" within a patient.

Also, this might solve many issues in what is called xenotransplantation -- the act of transplanting organs or tissue material between two species. At first it was thought that one would take stem cells from animals, since such cells have not specialized much and are therefore less likely to provoke an immune response in humans. Those cells could then be harvested at some timely moment in development for use in humans. (Today faulty human heart valves are routinely replaced with ones taken from cows and pigs, so xenotransplantation is already a reality; doctors now simply want to extend the process.) However, with this new discovery the stem cells used can be human cells, so perhaps animals could serve as surrogate carriers for a developing human organ, which would then be removed when it has fully matured. A dozen years ago, Congress had to ban research on human embryos, but science progressed nonetheless.


[1] Gina Kolata, "Man Who Helped Start Stem Cell War May End It," New York Times, November 22, 2007



Thursday, November 22, 2007

Chimpanzees (and humans) trade for biological "services."

{ Podcast this essay } Seeing this sign reminded me of an article I recently read. Chimpanzee society bears many striking resemblances to human society. Like us, chimps form complex social bonds, and they do so with symbolic gestures, most notably by sharing prized resources. For example, though chimps rarely share the wild plant foods they forage, in strategic social situations they will share prized foods, such as meat or pilfered cultivated plants (from human crops), as a way of reinforcing social bonds and constructing new alliances.

Most of the sharing occurs when adult males make offers to females that are in the proper cycle for reproduction -- for reference, call such females who receive these goods "hotties." Exchanging food for increased sexual access is certainly to the male's advantage, and may showcase his suitability to sire, given that he has the capability to acquire such goods. Hotties apparently find this a fairly good indicator of male reproductive health. This analysis can generalize even beyond primates as an account of altruism:
Food sharing has important implications for the evolution of cooperation, offering a means to evaluate the ‘paradox’ of altruism, whereby a recipient gains fitness benefits at the expense of a donor. When individuals control a highly valued resource, they may opt to use that resource as a tool for social bargaining. Thus, even acts that appear altruistic may serve to ultimately enhance one’s own fitness.[1]
What is also interesting in this study is that the sharing of prized resources was not an immediate cost-benefit payoff: "Males shared crops with a maximally swollen female in 16% of sharing events, but were never observed mating with that female immediately after sharing." I would speculate that hotties would be far better off (in terms of expected reproductive benefits) waiting to see if the pattern could be (i.e. would be) maintained over time.

Diamonds (and of course other valued stones) have been regarded by women across history as attractive gifts. It is known that primitive humans were the first mineral collectors of such stones as chert, chalcedony, and obsidian. These were shaped into arrowheads and spear points for hunting. Later in prehistory, humans used turquoise, gold, silver, and copper in religious ceremonies, for spiritual enhancement, and for protection from evil. I think a reasonable speculation would be that the symbols of such stone technology could stand for the hunting capability of the man who gave such a symbol to a desired hottie. Throwing a deer carcass at a hottie would not be practical, or perhaps desirable, but bequeathing a symbol (or an actual instance) of the technology that shows clear access to a desirable resource would be a very small step from the meat-sharing behaviors of chimpanzees. (Also, it could be that good hunting technology shows intelligence, and intelligence is what is actually strongly favored by women in modern mate selection.[2]) Modern jewelry exchange seemingly began as an abstraction of just such a process about 100,000 years ago.[3]

I've been thinking about getting my wife a necklace for Christmas, but I don't want any more children. So perhaps I should just cease this whole line of thinking.


[1] Hockings KJ, Humle T, Anderson JR, Biro D, Sousa C, et al. (2007) "Chimpanzees Share Forbidden Fruit." PLoS ONE (Accessed 11/22/2007)

[2] Ray Fisman, "An Economist Goes to a Bar and Solves the Mysteries of Dating," Slate, 11/7/2007 (Accessed 11/22/2007)

[3] "Researchers Identify What May Be Oldest Known Jewelry" Voice of America 6/22/2006 (Accessed 11/22/2007)



Thursday, November 15, 2007

Attitude and Ethics

{ Podcast this essay } Suppose a person far more powerful than myself were to suddenly grab me, overwhelm me, and do some evil deed. Perhaps they would grab my arm and beat someone to death. Or perhaps they would work me like a puppet somehow. Would I be responsible for the consequences? I don't think so, at least at first glance.

Suppose I would try to resist, but to no avail. Then, of course, no one would say I was ethically accountable. One ought to be held accountable for what one can control. And I can't control what this powerful person is doing to me. Therefore, I ought not be held accountable for it. At first glance this seems right.

Recently I was on a trip to Boston, and I rode the subway. The subway platform was very crowded at the end of the work day. Suppose someone had pushed me from behind, and the force of their push had toppled me into someone in front of me. And that poor someone fell onto the tracks and was crushed by the train.

No one would say I was to be held accountable for what happened; the person behind me would be. In this case, it is presumed I was pushed inadvertently into an innocent bystander. Perhaps I would even have protested along the way.

But now let us vary the scenario a bit. Suppose I am standing on a subway platform, and I see someone in front of me known to be a heinous serial killer who got off on a technicality. Perhaps he has a dastardly look on his face. Perhaps I recently heard him whisper to an accomplice that he is going to kill again as soon as he gets the opportunity.

Again, Dick the Bruiser sneaks up behind me and pushes me forward. I recognize Mr. Bruiser's grasp; I recall, and note yet again, his power at propelling me toward the person in front of me -- albeit, this time, a serial killer. Past experience, and the expert witnesses who attested the first time Mr. Bruiser pushed me, agree that there was nothing I could have done about it. He was just so huge.

This time, however, I recall all this, but I see where I'm being pushed. Even though I can't stop it, and even though I'm not responsible for starting it, I find I'm now glad Mr. Bruiser has pushed me. Quickly, as he's hurling my body into this serial killer, before the man drops onto the tracks, I say, "Yes, Mr. Bruiser. Push harder. Faster!" All the while agreeing with him, all the while knowing (counterfactually) there is nothing I can do to stop him.

Again, the inevitable collision. Again, a body drops onto the tracks. But this time it is the body of a serial killer. The train runs over him. But in this case society has been granted an advantage by such a man's death.

Again, the question is asked: would I be ethically responsible for the serial killer's death? Ought implies can. I ought to be held accountable only for what I could have stopped. It has already been stipulated that there is no way I could have stopped this.

I enjoyed the inevitable outcome. It seems, therefore, that I am not responsible, even though I was in agreement with the outcome -- and even though the outcome itself might be held unethical, since killing a person arbitrarily, without due process, seems about as unethical as could be in otherwise civil circumstances.

Now people might not approve of my attitude toward this. One should perhaps always feel revulsion toward unethical acts which end in the death, even murder, of another person. And yet it is this very revulsion at the possibility, even likelihood, of further deaths at the hands of this serial killer which brought me to such a celebratory state upon being pushed again by Mr. Bruiser.

Certainly Mr. Bruiser would be rightfully held guilty. He didn't have the set-up that I had. He didn't hear the whispering of the serial killer to his accomplice. He didn't have the relevant moral knowledge, perhaps, that the serial killer was rightly accounted as guilty. So Mr. Bruiser would be sent to prison, and rightly so.

I would, perhaps, be socially condemned, but there are seemingly no grounds to convict me. There was nothing I could have done about it. Furthermore, people are not held legally accountable for their emotional responses to certain circumstances. People are held accountable for their actions. This might be just a claim about legal ethics, and not ethics proper. As a legal matter, it would not be right to hold someone accountable to the state for their emotive responses.

Still, perhaps, as an ethical matter, we should see ourselves as somehow flawed when we enjoy being a part of unethical circumstances, even if we cannot stop or interfere with ongoing circumstances into which we are involuntarily drawn.


[image] "Boston Subway" Flickr by saaam Uploaded on July 27, 2005.



Sunday, November 11, 2007

Intel chip manufacturing trends

Once again, Intel Corp. is ready to release a new line of microchips, and everyone is rightly dazzled by them. This chart was taken from The Wall Street Journal (Nov 7, 2007), and is interesting for the trend it shows. It would appear that in very late 2008 or early 2009, on this curve, we should see a 30-nanometer chip with just over a billion transistors on it. The key to this is switching the underlying materials used in the manufacturing process. Silicon dioxide has been used since the 1960s, but now a hafnium-based material will be used, since it is much more stable with regard to the temperature fluctuations that occur at such small scales.
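The trend the chart depicts is essentially Moore's law: transistor counts doubling roughly every two years. Here is a minimal sketch of that extrapolation; the 2007 starting count and the two-year doubling period are illustrative assumptions, not figures taken from the WSJ chart:

```python
# Moore's-law style extrapolation. The starting count and the
# two-year doubling period are assumptions for illustration only.
def transistors(start_count, start_year, year, doubling_period=2.0):
    """Estimated transistor count in `year`, doubling every `doubling_period` years."""
    return start_count * 2 ** ((year - start_year) / doubling_period)

# Assuming ~582 million transistors on a 2007-era chip:
estimate_2009 = transistors(582e6, 2007, 2009)
print(f"{estimate_2009 / 1e9:.2f} billion")  # -> 1.16 billion
```

On these assumed numbers the curve crosses the one-billion-transistor mark in about two years, consistent with reading "just over a billion by late 2008 or early 2009" off the chart.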



How to use a cell phone screen to wake up more quickly in the morning

{ Podcast this essay } When one's alarm (cellphone or otherwise) goes off during friendly summer mornings, usually there is sunlight to help the body's natural biological rhythms awaken the mind. However, in the winter this sunlight advantage fades (literally, to black), and depending on where you live on the planet, even disappears completely for a time.

An unhelpful solution might be to reach over and turn on a reading lamp, or even an end-table lamp if one is available. This has some drawbacks. First, and most obvious, the sudden light is painful to your eyes, and will probably end up negatively conditioning you out of doing it. You will probably self-talk along the following lines: "I hate when that d@#^ light flares me blind every morning. Screw it, I'm going to lay here a bit longer." Before you know it, you're fast asleep, and any sleep-schedule plans you made in your more rational state, before last night's turn-in, are now psychological history. Second, if you are lucky enough to be sleeping beside someone you care about, then you risk (or even guarantee) waking them, which is hardly polite, if not downright dangerous.

Here is a method I've found very helpful for moving from just barely being awake to being workably, even fully awake. It requires that you have a cellphone that uses the same kind of back-lit LCD screen technology as the typical flat screen computer monitor. (I have a Samsung phone[1], for example, which uses this run-of-the-mill technology. I believe most phones now use this type of LCD screen.)

First, after your alarm goes off, grab your phone, and open it close to your face, but don't stare directly at the screen. (I keep the angle of the light being emitted pointed away from me.) Second, slowly rotate the light source toward you. No need to overwhelm your eyes; so, when it gets too bright, stop the rotation, maybe even rotating it back away from you a bit. Third, repeat step two until your eyes are adjusted and you can look directly at the light without feeling irritated by the glare.

At the completion of this little exercise, you will be far more awake than you otherwise would be by just trying to will your drowsy consciousness away unaided.

Why is this trick effective? Many surveys have documented that people who spend more pre-bedtime hours using the Internet or watching television are more likely to report that they don't get enough sleep. This is the case even when users of such technology get about as much sleep as those who don't use such devices before bedtime. A study published in Sleep and Biological Rhythms[2] confirms this and argues that using such electronic media before sleep triggers a self-perception of having had insufficient sleep. This means that Internet and TV use prior to bedtime affects the quality of sleep. It turns out that exposure to certain light frequencies can interrupt the body's melatonin production, and it is melatonin that promotes sleep:
Circadian rhythms are regular changes in mental and physical characteristics that occur in the course of a day (circadian is Latin for "around a day"). Most circadian rhythms are controlled by the body’s biological "clock." This clock, called the suprachiasmatic nucleus or SCN, is actually a pair of pinhead-sized brain structures that together contain about 20,000 neurons. The SCN rests in a part of the brain called the hypothalamus, just above the point where the optic nerves cross. Light that reaches photoreceptors in the retina (a tissue at the back of the eye) creates signals that travel along the optic nerve to the SCN. Signals from the SCN travel to several brain regions, including the pineal gland, which responds to light-induced signals by switching off production of the hormone melatonin. The body’s level of melatonin normally increases after darkness falls, making people feel drowsy. [3]
Ironically, the same cellphone light that creates this problem at one end of the sleep ritual can solve the corresponding problem at the other end. How? First, a growing and ultimately focused source of light provides one means of waking you up. And second, the frequencies of that light help reset the part of your brain that uses melatonin to regulate circadian rhythms -- this time, though, at the hour you want to be awake rather than the hour you want to sleep. Naturally, this also means that if one wants to increase the odds of a good night's sleep, one should stay away from technology that emits such light frequencies before bed.


[image] Untitled by think.bubbly Flickr October 24, 2007 (accessed Nov. 11, 2007)

[1] Samsung phone image.

[2] Nakamori Suganuma et al., "Using electronic media before sleep can curtail sleep time and result in self-perceived insufficient sleep," Sleep and Biological Rhythms, Volume 5, Issue 3, pp. 204-214, July 2007

[3] "Sleep and Circadian Rhythms" HealthLink (Medical College of Wisconsin) (Accessed Nov 11, 2007)



Friday, November 09, 2007

McDonald's, a dipped cone, and my freewill

{Podcast this essay} Occasionally, threats to my good health aside, I eat at McDonald's. Sadly, eating at McDonald's is the closest one can come to eating a direct-from-the-factory product. But occasionally, I hear the faint callings of my starving ancestors echoing up from my genome.

"We want fat. We want salt; in fact, we want 1040mg of salt in one sandwich which constitutes 43% of the daily total."

My hominid ancestors clearly scan the off-site propositions of my long-term memory for such information. That's why they are so convincing. I regularly do what they say. Prima facie, this seems wise, since their callings are the results of some five to seven million years of successful primate evolution, operating right up to my own lunchtime cravings. I'm conservative, you see. Therefore, tradition matters to me. And there's no more informative tradition than the tradition of bio-psychology.

Although I regularly do what the ancestors call me to do, it is anything but clear that I choose to do what the ancestors call me to do. The former is simply behavior, while the latter is forming an intention between options, and then actualizing one of those options uncoerced. In the latter case, I would be deliberating whether to listen to the whisperings of my hominid ancestors and select the fatty, salty delight of a supersized #2 meal; or not to listen to them, perhaps selecting a salad instead.

I certainly was doing what I wanted to do. Thus, I was free. After all, anyone who does what he wants is uncoerced. And if one is uncoerced, then one is free.

I'm still not convinced, however. Yes, I was doing what I wanted. But maybe my wants were determined. To be free, I would have to be able to choose against my wants. Is that even possible?

I ate the Big Mac meal (along with the supersize fries, thank you). I was surprised that I wasn't full. The meal seemed large enough. And at 540 calories for the sandwich, plus 570 calories for the large fries, not counting the Sprite and ketchup, I knew there were plenty of calories in the meal to sustain my system. Reason easily acknowledged this. Yet parts of my brain signalled that I desired more to eat. Now I've always been a fan of the chocolate-dipped ice cream cone, and what better 330 calories could one spend than on that sugary-sweet treat.[2] I wanted to do it, and was hardly coerced by any outside force. So I was free. It was, I assure you, quite tasty.
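Tallying the figures actually listed shows where the "somewhere around 1,500" estimate below comes from. A quick sketch; the Sprite value is an assumed, illustrative figure, since I left it uncounted:

```python
# Calories explicitly listed for the meal; the Sprite value is an
# assumption for illustration, not a figure from the nutrition chart.
meal = {"Big Mac": 540, "large fries": 570, "dipped cone": 330}
listed_total = sum(meal.values())
print(listed_total)  # -> 1440

ASSUMED_LARGE_SPRITE = 210  # hypothetical figure
print(listed_total + ASSUMED_LARGE_SPRITE)  # -> 1650
```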

The problem came when I left McD's, or when I almost left. Somewhere around 1,500 calories is far more than most people ever dream of eating in a single meal -- a sizable fraction of what is needed for an entire day:
Take Bangladesh [as compared to] the USA: the average food intake for a Bangladeshi is 1930 calories per day, while for an American it is 3650 calories. It has been estimated that the minimum amount of food needed for good health is 2360 calories per day. So you can see, the average person in Bangladesh has too little food while the average American eats too much[1]
Clearly, what I had just put away was a luxury meal on the basis of calories. But as I drove my car around the backside of McD's in order to exit, passing under the gigantic golden double-arches -- that abstract icon signalling to all the presence of this false temple of good food -- I suddenly realized I wanted yet another ice cream cone. Obviously, I didn't need it. Unquestionably, I would be further compounding the nutritional errors I had already made just by showing up to McDonald's in the first place. And yet here it was: this want.

Suddenly, I found I did not want the want that I was having. I wanted a different want, I wanted a want that focused on something besides food; and, certainly on something besides a second chocolate-dipped ice cream cone. And yet here it was: this want.

Initially, I reasoned about its badness, its cumulative effect on me, for which I would assuredly pay the next day. I remembered the last time I ate too much McDonald's ice cream, how the sugar made me shake a bit, and how the jello-horde of glucose too suddenly retreated from my metabolism and made me drop both cognitively and emotionally. I accurately self-assessed that I'd regret it.

Then, I rationalized. "What the heck," I said. "It's good to be irrational once in a while!"

Now, and at some days after the event, I realized that during those moments, and under those peculiar conditions, I was no more free to say no to the whims of my hypothalamus hunger control system than is a smoker free to quit "anytime he wants to."

But he and I both share the same curse, if only in difference of degree: neither of us can simply want what we want anytime we want to.



[1] Juliet Gellatley "Food for a Future" (Accessed Nov. 5, 2007)

[2]"McDonald's USA Nutrition Facts for Popular Menu Items" McDonald' (Accessed Nov. 9, 2007)



Wednesday, November 07, 2007

When Johnny comes marching home (vets and homelessness)

The Associated Press is running a story through Yahoo which notes that "Veterans make up one in four homeless people in the United States, though they are only 11 percent of the general adult population."[1] This has been a problem for American society in all of its major military engagements.

Most alarming in this report is the trend. Veterans from Iraq and Afghanistan are trickling into shelters and soup kitchens seeking services, treatment, or help locating employment. Given the long rotation times and short leave times of the Iraq/Afghanistan conflicts, Daniel Tooth, director of veterans affairs for Lancaster County, Pa., sums up the coming trend: "We're going to be having a tsunami of them eventually because the mental health toll from this war is enormous."[1]

This is not good news for American society. However, I think there is a bright spot. Traditionally, the Democratic party robustly supports funding for mental health and related social services,[2] and it looks like they are going to take control of the government for a while, so we might get the opportunity to substantially lessen this problem before it becomes critical.


[1] "Study: 1 out of 4 homeless are veterans" Yahoo News (Accessed November 7, 2007).

[2] "Democrats Working to Expand Veterans Mental Health Care" The Democratic Party Website (Accessed November 7, 2007).