Friday, April 26, 2013

Aging in Place: The ‘In’ Market

 

“Aging in Place” is a phrase that I come across on a regular basis and appears to be the ‘in’ phrase right now for the 55+ segment of the population. What does it mean to you? How would you define it? According to Wikipedia, it is: “The ability to live in one’s own home and community safely, independently, and comfortably, regardless of age, income, or ability level.” If one is a younger senior or a baby boomer (born between 1946 and 1964), there is still time to make this decision, to save money and to plan for the ‘golden years,’ although many may continue to work beyond the typical retirement age of 65-67.

From everything I have heard and read, most adults would like to stay in their own home until they are no longer able or until they die, but is this realistic? Is your current home modified for future needs, are renovations needed, or are you in the market for an already modified home? An existing home needs to have a bedroom, bathroom, kitchen and laundry room on the main floor. If you are shopping for a new or already modified home, look for one that is all on one level (ground floor only), with the main features of the house accommodating the needs of an older senior or someone with a physical disability: wider doorways, no entranceway stairs, walk-in showers with a built-in bench, a pull-down or adjustable shower head, non-slip floor surfaces throughout the home, lower cabinets, easy and accessible closet/storage space, etc.

Home maintenance and finances will be huge factors in the decision process as well. Are amenities close by: public or alternative transportation, medical care, family support, private support/services, socialization opportunities? And, most importantly, is it affordable? It could cost quite a sum of money to renovate/modify an existing house. Does it make more sense to buy a house that is already modified, whether it is a stand-alone house or one that is part of a community? These are all questions to consider before making a decision. What this means is a shift in housing from a youth-driven market to the aging market.

Louis Tennenbaum, a carpenter and former contractor, says it best in his article Why You Should Remodel to ‘Age in Place’ Now on Next Avenue:
If boomers think about aging in place at all, we usually regard it as something we can put off until much later in life. Or to be more accurate, it’s something we hope we can put off until much later. That’s because we associate these kinds of modifications with growing old, which doesn’t sound like much fun.
There’s a better way to think about it, though: Aging in place is about creating a home so beautiful, comfortable and expressive of your personality that you never want to leave. We can’t fight aging, but we can take steps to make our house the place we want it to be.
A reframing of aging itself is needed in order for the phrase ‘aging in place’ to become acceptable. Aging is inevitable, but health, physical and mental well-being, social opportunities, access to necessary support, services and healthcare, and finances all play a part in how one ages.

I also came across the Kendall Northern Ohio Blog, which has an article, Aging in Place: Look for Universal Design Features, offering very practical advice for those looking to relocate into an existing home that was not purpose-built in the ‘aging in place’ style but already has universal design features that will accommodate someone with a physical disability or a senior with physical/health limitations:
  • No-Step Entry
  • French Doors
  • Wide Interior Doors
  • Open Space
  • Downstairs Bathroom/Bedroom
  • Low Shelving and Sinks
  • Smooth Shower Entry
  • Easy Turn Faucets/Fixtures
  • Easy Twist Doorknobs
  • Grab Bars
Baby boomers do not want to age in the typical senior residences or nursing homes that current ‘older’ seniors have chosen or been placed in due to health conditions, diseases, physical limitations or cognitive disorders. The larger a cohort is, the more influence it can have on the market, and this is already being seen. Baby boomers are well-educated and demanding, and they research information as needed to expand their knowledge. Their demands will be met, or businesses will lose out financially. It will be interesting to watch this market grow.

Written by Victoria Brewster, MSW
SJS Staff Writer

Tuesday, April 23, 2013

Samsung Demos a Tablet Controlled by Your Brain

Via MIT Tech Review http://www.technologyreview.com/news/513861/samsung-demos-a-tablet-controlled-by-your-brain/

An easy-to-use EEG cap could expand the number of ways to interact with your mobile devices.                   

One day, we may be able to check e-mail or call a friend without ever touching a screen or even speaking to a disembodied helper. Samsung is researching how to bring mind control to its mobile devices with the hope of developing ways for people with mobility impairments to connect to the world. The ultimate goal of the project, say researchers in the company’s Emerging Technology Lab, is to broaden the ways in which all people can interact with devices.

In collaboration with Roozbeh Jafari, an assistant professor of electrical engineering at the University of Texas, Dallas, Samsung researchers are testing how people can use their thoughts to launch an application, select a contact, select a song from a playlist, or power up or down a Samsung Galaxy Note 10.1. While Samsung has no immediate plans to offer a brain-controlled phone, the early-stage research, which involves a cap studded with EEG-monitoring electrodes, shows how a brain-computer interface could help people with mobility issues complete tasks that would otherwise be impossible.

Brain-computer interfaces that monitor brainwaves through EEG have already made their way to the market. NeuroSky’s headset uses EEG readings as well as electromyography to pick up signals about a person’s level of concentration to control toys and games (see “Next-Generation Toys Read Brain Waves, May Help Kids Focus”). Emotiv Systems sells a headset that reads EEG and facial expression to enhance the experience of gaming (see “Mind-Reading Game Controller”).
To use EEG-detected brain signals to control a smartphone, the Samsung and UT Dallas researchers monitored well-known brain activity patterns that occur when people are shown repetitive visual patterns. In their demonstration, the researchers found that people could launch an application and make selections within it by concentrating on an icon that was blinking at a distinctive frequency.
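For readers curious about what this looks like in signal-processing terms, here is a minimal sketch of a flicker-frequency detector of the general kind described above. It assumes a generic single-channel EEG feed rather than Samsung's actual system; the sampling rate, icon flicker frequencies, and function names are all illustrative assumptions.

```python
# Minimal sketch of selecting an icon by its flicker frequency (illustrative only;
# not Samsung's or UT Dallas's code). Assumes one EEG channel sampled at 256 Hz
# and four icons blinking at hypothetical frequencies of 8, 10, 12, and 15 Hz.
import numpy as np
from scipy.signal import butter, filtfilt, welch

FS = 256                                  # sampling rate in Hz (assumption)
ICON_FREQS = [8.0, 10.0, 12.0, 15.0]      # flicker frequency per on-screen icon (assumption)

def bandpass(eeg, low=5.0, high=30.0, fs=FS, order=4):
    """Remove slow drift and high-frequency noise before spectral analysis."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, eeg)

def detect_icon(eeg_window):
    """Return the index of the icon whose flicker frequency dominates the EEG window."""
    filtered = bandpass(np.asarray(eeg_window, dtype=float))
    freqs, power = welch(filtered, fs=FS, nperseg=min(len(filtered), FS * 2))
    scores = []
    for f in ICON_FREQS:
        band = (freqs >= f - 0.5) & (freqs <= f + 0.5)   # narrow band around each flicker rate
        scores.append(power[band].sum())
    return int(np.argmax(scores))

# Usage: pass in a few seconds of samples, e.g. selected = detect_icon(latest_eeg_samples),
# then launch whichever application the selected icon represents.
```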
Robert Jacob, a human-computer interaction researcher at Tufts University, says the project fits into a broader effort by researchers to find more ways for communicating with small devices like smartphones. “This is one of the ways to expand the type of input you can have and still stick the phone in the pocket,” he says.

Finding new ways to interact with mobile devices has driven the project, says Insoo Kim, Samsung’s lead researcher. “Several years ago, a small keypad was the only input modality to control the phone, but nowadays the user can use voice, touch, gesture, and eye movement to control and interact with mobile devices,” says Kim. “Adding more input modalities will provide us with more convenient and richer ways of interacting with mobile devices.”

Still, it will take considerable research for a brain-computer interface to become a new way of interacting with smartphones, says Kim. The initial focus for the team was to develop signal processing methods that could extract the right information to control a device from weak and noisy EEG signals, and to get those methods to work on a mobile device.

Jafari’s research is addressing another challenge—developing more convenient EEG sensors. Classic EEG systems have gel or wet contact electrodes, which means a bit of liquid material has to come between a person’s scalp and the sensor. “Depending on how many electrodes you have, this can take up to 45 minutes to set up, and the system is uncomfortable,” says Jafari. His sensors, however, do not require a liquid bridge and take about 10 seconds to set up, he says. But they still require the user to wear a cap covered with wires.

The concept of a dry EEG is not new, and it can carry the drawback of lower signal quality, but Jafari says his group is improving the system’s processing of brain signals. Ultimately, if reliable EEG contacts were convenient to use and slimmed down, a brain-controlled device could look like “a cap that people wear all day long,” says Jafari.

Kim says the speed with which a user of the EEG-control system can control the tablet depends on the user. In the team’s limited experiments, users could, on average, make a selection once every five seconds with an accuracy ranging from 80 to 95 percent.

“It is nearly impossible to accurately predict what the future might bring,” says Kim, “but given the broad support for initiatives such as the U.S. BRAIN initiative, improvements in man-machine interfaces seem inevitable” (see “Interview with BRAIN Project Pioneer: Miyoung Chun”).

TED talks AT: Assistive Technology Brings Beauty, Laughter, Freedom and Light.

Via http://atnetworkblog.blogspot.com/2013/04/ted-talks-at-assistive-technology.html
 
If you have never checked out any TED talks before... now is the time to start. I promise you that you will be hooked!

TED's mission statement begins:
We believe passionately in the power of ideas to change attitudes, lives and ultimately, the world. So we're building here a clearinghouse that offers free knowledge and inspiration from the world's most inspired thinkers, and also a community of curious souls to engage with ideas and each other...

TED stands for Technology, Entertainment and Design. Their talks are dedicated to disseminating "ideas worth spreading". Take a break from all the reality shows, sitcoms and dramas and watch a TED talk instead with your loved ones. You will be moved, enlightened, informed and inspired!

Here are just some of our favorite TED talks dedicated to assistive technology and disability-related topics. Have you seen a talk you would like to share? Put it in our comment box - and enjoy!

1. Sue Austin: Deep sea diving … in a wheelchair

When Sue Austin got a power chair 16 years ago, she felt a tremendous sense of freedom -- yet others looked at her as though she had lost something. In her art, she aims to convey the spirit of wonder she feels wheeling through the world. Includes thrilling footage of an underwater wheelchair that lets her explore ocean beds, drifting through schools of fish, floating free in 360 degrees.



2. Todd Kuiken: A prosthetic arm that "feels"

Physiatrist and engineer Todd Kuiken is building a prosthetic arm that connects with the human nervous system -- improving motion, control and even feeling. Onstage, patient Amanda Kitts helps demonstrate this next-gen robotic arm.



3. Aimee Mullins: The opportunity of adversity
The thesaurus might equate "disabled" with synonyms like "useless" and "mutilated," but ground-breaking runner Aimee Mullins is out to redefine the word. Defying these associations, she shows how adversity -- in her case, being born without shinbones -- actually opens the door for human potential.




4. Joshua Walters: On being just crazy enough

At TED's Full Spectrum Auditions, comedian Joshua Walters, who's bipolar, walks the line between mental illness and mental "skillness." In this funny, thought-provoking talk, he asks: What's the right balance between medicating craziness away and riding the manic edge of creativity and drive?


Now control your tablet with thoughts

Via http://www.deccanchronicle.com/130423/news-businesstech/article/now-control-your-tablet-thoughts

A phone company is exploring ways to bring mind control to its mobile devices in hopes of allowing people with mobility impairments to communicate and function more easily in modern society.
But the ultimate goal of the brain-controlled computer project is to broaden the ways in which all people can interact with devices, researchers in Samsung’s Emerging Technology Lab told MIT Technology Review.

The Samsung researchers are testing how people can use their thoughts to open an application, communicate a message, select a song from a playlist, or turn on or off a Samsung Galaxy Note 10.1.
The researchers are working on the new brain-controlled technology in collaboration with Roozbeh Jafari, an assistant professor of electrical engineering at the University of Texas, Dallas.
The early-stage research, which utilizes a plastic cap covered with EEG-monitoring electrodes and a tablet device, shows how a brain-computer interface could help someone with mobility issues complete tasks that otherwise could not be done.

In using EEG-detected brain signals to control the interface, the researchers monitored typical brain activity patterns that occur when people are shown repetitive visual patterns.

The Samsung and UT Dallas researchers found that people could launch an application and make selections within it by concentrating on an icon that was blinking at a distinctive frequency.
Discovering new ways to interact with mobile devices has been a driving force behind the project, Insoo Kim, Samsung’s lead researcher, told Technology Review.

These Brain-Scanning Neuro-Toys Are About To Change Everything

Via http://www.fastcompany.com/3008499/tech-forecast/these-brain-scanning-neuro-toys-are-about-change-everything

 
New technology that lets users control game avatars and music playlists with their brainwaves could give stroke patients and the profoundly disabled new ways to communicate.
A small but growing industry of inventors, neurologists, and investors is betting on consumers controlling smartphones, music players, and even desktop computers with their brains.
Innovations in software development kits (SDKs) alongside cheap, ever more sophisticated brainwave readers mean people with money to spend can play computer games through thought alone. But the products they are using--and the patents behind them--could change the world for the neurologically impaired in a decade or two.

Last month at SXSW, Canadian neuroscientist and artist Ariel Garten showed off her commercial brainchild. The Muse is a $200 sensor-enabled headband that connects with PCs and Macs and allows users to control games with their thoughts or engage in rudimentary neurofeedback. Garten spoke about the Muse and her company, InteraXon, in late 2012 at a TEDx talk in Toronto, which went viral thanks to a discussion of the technology the Muse could lead to. Headbands are expected to ship to customers in late 2013.


Using the Muse was an interesting experience. I had the opportunity to test a prototype out, and the headband slipped on easily--no sterile environment or special electrode setup was required. The headband was accompanied by a number of games and apps, all of which turn brainwaves into data input through embedded electroencephalograph (EEG) sensors. Although the games were dead simple, they were controlled by my thoughts. I was able to manipulate my avatar's motions on screen by thinking happy, sad, or anxious thoughts. Whenever I tried to throw the interface a curveball, it appeared to decently react to whatever line of thought or emotion I was engaged in.

Garten told Fast Company that she first began experimenting with brain-computer interfaces in 2003. Along with InteraXon co-founder Chris Aimone, she created public art installations where people's brainwaves could change the art. “We started by creating concerts where 48 people at a time could control a musician's output, which would then affect people's brain state when they heard it, in a regenerative cycle. We went on to create more musical performances, where musicians could be jamming along to music directly with their brain, it was tons of fun,” Garten said.
EEG-reading headbands aren't only used for consumer games either. Another product making the rounds at SXSW was the Zen Tunes app from Japanese firm Neurowear. Neurowear, who were featured in Co.Design a few years ago for their cosplay brain-powered cat ears (really), manufactured an integrated prototype headset and iOS app combo which generates playlists tailored to a user's brainwaves. Neurowear customers put on an EEG-enabled headset and load songs from their music library onto a playlist. Once the songs are playing, algorithms within the Zen Tunes app analyze brainwaves for EEG patterns associated with focus and relaxation. These patterns are then used to sort music into playlists that, ideally, will match users' specific moods.

Brain-computer interfaces (BCIs) have been around since the 1970s, when clunky EEG readers were used in laboratory settings for rudimentary neurofeedback and biofeedback programs. Although the readings and data inputs from EEG readers have not changed significantly over the past forty-odd years, the equipment used has changed dramatically. Instead of requiring a university laboratory or a quiet room shielded from outside sounds that could cause false positives, and instead of requiring nurses or lab technicians to assist with setup, EEG readers have become consumer technology. The Muse headband, Neurowear's floppy animal ears, and competing products from firms like Axio are easy-to-use diversions for anyone with a few hundred dollars to burn. Today's brain-reading headbands require no medical training to use, have a tiny learning curve, and frankly are a ton of fun to use.

As Garten put it, “The main concept in brain-machine interfaces is that changes in your brain are reflected in changes in some signal, in our case EEG, which can then be used as a kind of control action to a machine, without the need of using any physical action or command. Your brainwaves (EEG) are small electrical potentials on your scalp as a result of neurons firing in your brain, and Muse's four electrodes record this fluctuating voltage several hundred times per second. These voltages are converted to digital signals and a stream of numbers is sent to your PC or Mac via Bluetooth.”
The SDK lets users turn these EEGs into data, which can then control program avatars. Alternatively, developers could use the SDK to write neurofeedback software that lets users view their brain behavior and trace patterns related to hyperactivity or anxiety. According to Garten, Muse's SDK will also provide some preliminary analysis tools that let you extract more meaningful interpretations of the data, such as the power of "alpha" or "beta" frequencies, and use that as control signals to various devices.
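As a rough illustration of the kind of preliminary analysis Garten describes, the sketch below computes alpha and beta band power from a window of EEG samples and turns their ratio into a simple relaxation score that an app could use as a control signal. This is my own sketch, not the Muse SDK; the sampling rate, band edges, and threshold are assumptions.

```python
# Hedged sketch of alpha/beta band-power extraction as a control signal (not the Muse SDK).
import numpy as np
from scipy.signal import welch

FS = 220  # "several hundred times per second"; the exact sampling rate is an assumption

def band_power(samples, low, high, fs=FS):
    """Integrate spectral power between low and high Hz."""
    freqs, power = welch(np.asarray(samples, dtype=float), fs=fs,
                         nperseg=min(len(samples), fs * 2))
    mask = (freqs >= low) & (freqs <= high)
    return power[mask].sum()

def relaxation_score(samples):
    """Crude control value: alpha (8-12 Hz) power relative to beta (13-30 Hz) power."""
    alpha = band_power(samples, 8, 12)
    beta = band_power(samples, 13, 30)
    return alpha / (alpha + beta + 1e-9)   # near 1.0 = relaxed, near 0.0 = focused/active

# A game or playlist sorter could poll this once a second, e.g.:
# if relaxation_score(latest_window) > 0.6: play_calm_track()   # threshold and callback are hypothetical
```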

However, the real future potential for brain-computer interfaces is in healthcare. When I spoke to Garten, she was outgoing about everything but the potential repurposing of Muse's technology for clinical applications. There's a reason for that. Brain-computer interfaces, and higher-end versions of the sensors used in consumer headbands like Muse, have world-changing ramifications for traumatic brain injury patients, stroke victims, and individuals with physical disabilities.

Kamran Fallahpour is a psychologist at New York's Brain Resource Center who has used brain-computer interfaces in the workplace for more than 20 years. The Brain Resource Center takes advantage of brain mapping and mind-computer interfaces for patients with everything from mood disorders to traumatic brain injuries. Other patients are professional musicians or actors seeking brain mapping in the course of peak performance training. When these patients visit the office, they essentially use a more complicated version of Neurowear and Muse's software input kits.
According to Fallahpour, the big innovation in brain-computer interfaces is the ever-increasing capability of computers. Even an iPhone has the sheer processing power to parse through data points that a 1990s-vintage 486 could not. The human mind is immensely complicated, and neurologists understand very little of it. Nonetheless, even the information that brain-computer interfaces transform into bits and bytes overwhelmed past computers. Advances in technology mean it's now possible to have basic home mind-reading headbands for your smartphone or laptop--something that was science fiction until quite recently.

But while it's amazingly fascinating to control avatars in Angry Birds-type games with thoughts, it's still all fun and games. According to Fallahpour, most consumer EEG headbands and brain-computer interfaces are “toys” that lack the capabilities of research and clinical-grade systems. The inexpensive sensors used create a large number of artifacts, are very imprecise, can be affected by physical movement, and only read basic emotions like stress and relaxation. The more sophisticated versions of these commercial brain-computer interfaces are now being used in hundreds of laboratories nationwide in neurofeedback projects that treat post-traumatic stress disorder, hyperactivity, and a host of other conditions.

Other brain-computer interfaces, however, are far more sophisticated. Back in 2010, Fast Company reported on an early project to type into computers using brain-computer interfaces. Since then, they have gotten even more capable—and they can change the world for disabled patients. Researchers at Drexel University College of Medicine in Pennsylvania are currently studying brain-computer interfaces for ALS patients. Using laboratory-quality EEG headsets, scientists hope to see whether individuals with ALS with “extreme loss of neuromuscular control and severe communication impairments” can make selections on a computer screen with their brains. In similar projects, patients were able to type short text messages using only their brain waves.

Drexel is currently recruiting participants with ALS in the Philadelphia metropolitan area for the study.

Beleaguered Caregivers Getting Help from Apps

Via http://www.wirelessdesignmag.com/news/2013/04/beleaguered-caregivers-getting-help-apps


NEW YORK (AP) — As her mother and father edged toward dementia, Nancy D'Auria kept a piece of paper in her wallet listing their medications.

It had the dosages, the time of day each should be taken and a check mark when her folks, who live 10 miles away, assured her the pills had been swallowed.

"I work full time so it was very challenging," said D'Auria, 63, of West Nyack.

Now she has an app for that. With a tap or two on her iPhone, D'Auria can access a "pillbox" program that keeps it all organized for her and other relatives who share in the caregiving and subscribe to the app.

"I love the feature that others can see this," D'Auria said. "I'm usually the one who takes care of this, but if I get stuck, they're all up to date."

From GPS devices and computer programs that help relatives track a wandering Alzheimer's patient to iPad apps that help an autistic child communicate, a growing number of tools for the smartphone, the tablet and the laptop are catering to beleaguered caregivers. With the baby boom generation getting older, the market for such technology is expected to increase.

The pillbox program is just one feature of a $3.99 app called Balance that was launched last month by the National Alzheimer Center, a division of the Hebrew Home at Riverdale in the Bronx.
"We thought there would be an opportunity here to reach caregivers in a different way," said David Pomerantz, executive vice president of the Hebrew Home. "It would be a way to reach people the way people like to be reached now, on their phone."

The app also includes sections for caregiving tips, notes for the doctor and the patient's appointments, plus a "learning section" with articles on aspects of Alzheimer's and an RSS feed for news about the disease.

Trackers are also important tools for Alzheimer's caregivers.

Laura Jones of Lighthouse Point, Fla., says she was able to extend her husband's independence for a year and a half by using a program called Comfort Zone.

"He was just 50 when he was diagnosed," she said.

Jones said she went to work so he would continue to get insurance coverage.

"Day care was not appropriate, home care was not affordable," she said. "Even when he stopped driving, he would ride his bike all over town, to the gym, for coffee, errands. He would take the dog for a walk and be out and about when he was alone and I was working."

Using Comfort Zone, which is offered by the Alzheimer's Association starting at $43 a month, she was able to go online and track exactly where he was and where he had been.

Her husband carried a GPS device, which sent a signal every five minutes. If Jones checked online every hour, she would see 12 points on a map revealing her husband's travels. She would also get an alert if he left a designated area.
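The "designated area" alert is essentially a geofence check. Here is a minimal sketch of that idea, not Comfort Zone's actual implementation: each incoming GPS fix is compared against a home point and an allowed radius. The coordinates, radius, and function names are made up for illustration.

```python
# Hedged geofence sketch (illustrative; not the Comfort Zone service).
import math

HOME = (26.2757, -80.0872)   # hypothetical home latitude/longitude
RADIUS_METERS = 3000         # hypothetical "designated area" radius

def distance_m(p1, p2):
    """Great-circle distance in meters between two (lat, lon) points (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p1, *p2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 6371000 * 2 * math.asin(math.sqrt(a))

def should_alert(lat, lon):
    """Called for each GPS fix (one every five minutes); True means send an alert."""
    return distance_m(HOME, (lat, lon)) > RADIUS_METERS
```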

Eventually, the tracking revealed that Jones' husband was getting lost.

"He would make a big funny loop off the usual route and we knew it was time to start locking down on him," she said.

Mended Hearts, an organization of heart patients and their caregivers, is about to start a program to reach caregivers by texting tips to their phones.

"We hope this will be the beginning of several patient- and caregiver-based texting programs that reach people where they are," said executive director Karen Caruth.

Lisa Goring, vice president of Autism Speaks, said tablets have been a boon to families with autistic children. The organization has given iPads to 850 low-income families. And the Autism Speaks website lists hundreds of programs — from Angry Birds to Autism Language Learning — that families have found useful.

Samantha Boyd of McConnellstown, Pa., said her 8-year-old autistic son gets very excited when the iPad is brought out.

"There's no way he'd be able to use a keyboard and mouse," she said. "But with the iPad, we use the read-aloud books, the songs, the flash card apps."

She said the repetitiveness and visuals help. "He catches a word and repeats it back. He says the name of a picture, and the iPad says it back."

Boyd said the iPad also works as a reward: "He likes to watch Netflix on it."

One of the most popular online tools for caregivers is one of the oldest: the message board, available all over the Internet and heavily used by caregivers of dementia and autism patients, who perhaps can't find the time for conventional support groups.

"It's a place for families to talk about the strengths and the accomplishments of their child with autism but also talk about some of the challenges and be able to find the support of other families," Goring said.

Some tools are not specific to a particular disease or condition.

CareFamily, which prescreens in-home caregivers and matches them to customers over the Internet, has online tools that let a family remotely monitor a caregiver's attendance, provide reminders about medications and appointments, and exchange care plans and notes via email, texting or phone.
"We're in the infancy of what technology can do for caregiving and it's only going to grow," said Beth Kallmyer, a vice president at the Alzheimer's Association.

But she cautioned that it's too soon to depend entirely on online tools.

"It's not a good fit for everybody," she said. "When you're looking at people impacted by Alzheimer's disease, including some caregivers, you're looking at an older population that might not be comfortable. We always have to remember technology is great — when it works."

Monday, April 22, 2013

Magic Reader--iPad app that allows you to turn pages of books with head movement

A Real Page-Turner
via http://community.advanceweb.com/blogs/sp_6/archive/2013/04/22/a-real-page-turner.aspx


 
Spinal cord injury, stroke, and hundreds of congenital and acquired disorders impair the use of hands--an essential body part for using touch-screen technology. A handful of apps are switch-accessible, but these consist mainly of AAC apps and some early childhood books and games (Jane Farrell keeps a list here). For all other apps, these users are out of luck for now. However, there is at least one app that shows potential for readers.

MagicReader is a free, ad-supported app for iPad released by the Japanese developer GimmiQ about a year ago. The app uses the iPad's camera to recognize a face, and then track head movement, allowing users to turn the pages of books. The app currently only supports PDF files and compressed comic book files (there are several comics available free in-app), but the developer promises to support more formats soon. After importing a PDF through iTunes or email, you need to find the right distance and lighting to optimize the facial recognition. Once the app reliably finds your face, it is fairly simple to turn the pages forward and back with a turn of the head, even while wearing clear glasses. Two blue stars at the top light up when the app has found your face, letting you know you can turn your head to turn the page. Looking upwards navigates in and out of the library.
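To give a sense of how camera-based head tracking can drive page turns, here is a rough analogy in Python using OpenCV. MagicReader is a native iPad app, so none of this is its actual code; it simply shows one way to classify a webcam frame as a frontal or turned head with stock Haar cascades and issue a page command when the head turns. The detector parameters and page-turn hooks are assumptions.

```python
# Rough head-turn page-flip sketch with OpenCV (an analogy to MagicReader, not its code).
import cv2

frontal = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
profile = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_profileface.xml")

def head_direction(gray_frame):
    """Classify the head pose as 'left', 'right', 'forward', or 'none'."""
    # The stock profile cascade detects only one side; flipping the frame catches the other.
    if len(profile.detectMultiScale(gray_frame, 1.2, 5)) > 0:
        return "left"
    if len(profile.detectMultiScale(cv2.flip(gray_frame, 1), 1.2, 5)) > 0:
        return "right"
    if len(frontal.detectMultiScale(gray_frame, 1.2, 5)) > 0:
        return "forward"
    return "none"

cap = cv2.VideoCapture(0)   # MagicReader uses the iPad's front camera; a webcam here
last = "forward"
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    direction = head_direction(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    if last == "forward" and direction == "right":
        print("next page")        # a real reader app would call its page-turn routine here
    elif last == "forward" and direction == "left":
        print("previous page")
    if direction != "none":
        last = direction
cap.release()
```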



The usefulness for disability populations is currently limited in that the app requires a 45 degree turn of the head rather than tracking only eye movements. Of course, users will still need assistance in opening the app unless they have a more sophisticated set-up. When I first used the app, it took some time to find just the right distance, head-turn speed, and lighting conditions for reliable turning, and sometimes the pages flipped when I wasn't ready or had just looked up from the tablet. I tend to read a lot of PDF files, but most people read e-books, which are not yet supported.


When the iPad is mounted on a wheelchair or supported on a stand, this app could be of great use to many people. Stroke survivors who can hold the device in one hand can now use their heads to turn the page instead of setting it down to touch the screen. The description recommends the app for those reading recipes while cooking, musicians turning sheet music, parents reading while holding babies, and even people reading while eating.
This app may be useful in your practice now, but more than that, I think it shows the potential for alternative means of accessing tablet technology. Given that the app's FAQ states a paid version is coming, it's probably worthwhile to download MagicReader now while it's free.