New penile implant endowed with "authenticity"
Boston Scientific announced the launch this week of its Tactra penile prosthesis, which is easier to implant than existing products and offers the user and partner a “more authentic, natural-feeling” erection, the company said. It is Boston Scientific’s first innovation in penile implants in more than a decade. The FDA cleared the device in April, but Boston Scientific kept it under wraps until this week, reports sister brand MD+DI.
The manually operated prosthesis—the patient lifts it up for intercourse and pushes it back down when not in use—contains a nitinol core between two silicone layers.
More than half of men over the age of 40 are affected by erectile dysfunction, some of whom cannot take Viagra or similar medications because of medical conditions, reports MD+DI. There are approximately 69,000 new penile implant candidates in the United States each year.
Boston Scientific plans to launch the Tactra in Europe, Canada, Latin America, Asia Pacific and Saudi Arabia later this year.
New set of medical tools for Uber Health
Henry Schein Medical, the exclusive seller, marketer and distributor of the Medpod MobileDoc 2 (pictured), has formed a partnership with Uber Health. The agreement will enable doctors to have the portable diagnostic devices transported to people’s homes for remote examinations, reports Medical Design & Outsourcing.
Made by software company Medpod, the carry-on-sized MobileDoc 2 includes professional-grade medical devices and instruments, including dermatoscopes to examine skin lesions and electrocardiograms to detect heart attacks and heart rhythm problems during remote consultations. There are also instruments to capture the patient’s temperature, peripheral capillary oxygen saturation, blood pressure, height, weight and body mass index, writes Chris Newmarker in Medical Design & Outsourcing.
An initial pilot program is currently being rolled out.
Translating brain signals into words
Patients with paralysis-related speech loss typically must spell words using eye movements or muscle twitches to control a computer interface to communicate. “In many cases, information needed to produce fluent speech is still there in their brains,” said University of California San Francisco (UCSF) neurosurgery professor Eddie Chang. “We just need the technology to allow them to express it.” That’s what he and his team are working on, with some success.
Research volunteers with recording electrodes on the surface of their brains listened to nine simple questions and responded out loud with one of 24 answer choices. Meanwhile, a set of machine-learning algorithms went to work decoding specific speech sounds from the participants’ brain activity. “After some training, the machine learning algorithms learned to detect when participants were hearing a new question or beginning to respond, and to identify which of the two dozen standard responses the participant was giving with up to 61% accuracy as soon as they had finished speaking,” said the press release on the UCSF website.
This is reportedly the first time that this approach has been used to identify spoken words and phrases. A key finding of the research, according to the study published on July 30 in Nature Communications, is the importance of context in improving the algorithm’s speed and accuracy.
“Most previous approaches have focused on decoding speech alone, but here we show the value of decoding both sides of a conversation—both the questions someone hears and what they say in response,” Chang said in the press release posted on the university website. “This reinforces our intuition that speech is not something that occurs in a vacuum and that any attempt to decode what patients with speech impairments are trying to say will be improved by taking into account the full context in which they are trying to communicate.”
Image: Eddie Chang (right), MD, and David Moses, PhD, in Chang’s laboratory at UCSF. Photo by Noah Berger.