Sound of Music and Fancy Lights with 3D Printing

Reporter: Danut Dragoi, PhD

 

Recent initiatives related to 3D printing run the gamut from practical to whimsical, from the rapid turnaround of optical prototypes based on CAD data to the development by university researchers of a children's toy called a zoolophone, a xylophone with animal-shaped keys. Even the whimsical project turns out to have practical applications, ranging from noise and vibration control to RF filtering (see the source link at the end of this post).

Practicality was at the forefront of a joint initiative announced last November by OPTIS, a CAD software vendor, and Luxexcel, a 3D-printing service for optical products. They teamed up to give automotive manufacturers a fast and easy way to go from design to 3D-printed prototype.

Software from OPTIS, which specializes in the simulation of light, human vision, and physically correct visualization, enables automotive designers and manufacturers to simulate their lighting and optical designs, testing and verifying virtual prototypes within their CAD environment.

But the transition to real prototypes traditionally has been time-consuming and expensive, involving, for example, diamond milling and tuning or, for elaborate free-form shapes, injection molding.

Luxexcel's Printoptical technology offers an alternative. OPTIS has integrated Luxexcel material into the OPTIS library, giving customers fast access to 3D-printed, customized, and fully optimized prototypes (see the figure below) within a few days. Paul Cornelissen, head of marketing and online business development for Luxexcel, commented in a press release, "With this digital process, we change a 3,000 years old analog industry and make it future proof."

Figure: Opto shape, sounds, and light.

Image source: http://www.evaluationengineering.com/2016/02/19/3d-printing-initiatives-span-whimsical-to-practical/

Whimsical could describe the zoolophone, shown in the figure below, built by computer scientists at Columbia Engineering, Harvard, and MIT, who used it to demonstrate that sound can be controlled by 3D-printing shapes. They presented their work at SIGGRAPH Asia last November in Kobe, Japan, an event targeting the computer-graphics community, which has long been interested in the simulation of contact sounds as well as computational fabrication techniques such as those the researchers used.

Figure: Zoolophone toys.

Image source: http://www.evaluationengineering.com/2016/02/19/3d-printing-initiatives-span-whimsical-to-practical/

Watch video on Computational Design of Metallophone Contact Sounds

https://youtu.be/tVoPAgvLzJc

Watch video on Luxexcel lights

https://www.luxexcel.com/

The researchers developed an algorithm that optimized for 3D printing the instrument's keys in the shape of lions, turtles, elephants, giraffes, and more, modeling the geometry to achieve the desired pitch and amplitude of each part. The zoolophone is an idiophone, an instrument that produces sound through its own vibration rather than through strings (chordophones), columns of air (aerophones), or membranes (membranophones). Most idiophones employ rectangular bars, for which the relationship between sound and geometry is well understood.

In contrast, determining the optimal animal shape that produces the desired amplitude and frequency proved computationally challenging. The team spent nearly two years developing computational methods, borrowing concepts from computer graphics, acoustic modeling, and mechanical engineering as well as 3D printing. To increase the chances of finding the optimal shape, the researchers developed a fast stochastic optimization method they call Latin Complement Sampling (LCS). LCS takes as input a desired shape along with target frequency and amplitude spectra, and it optimizes the shape through deformation and perforation to produce the wanted sounds, including overtones. Previous algorithms had been able to optimize either amplitude or frequency, but not both.
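To give a sense of what such a search involves, here is a minimal sketch of a stochastic shape optimization against a target spectrum. It is illustrative only: the actual LCS sampler, the finite-element modal analysis, and the perforation parameterization are described in Zheng et al. (2015) [1], and the toy "analyze" function below is an assumption made purely for demonstration.

import random

def spectrum_cost(analyze, shape, target_freqs, target_amps):
    """Distance between a candidate shape's simulated spectrum and the target."""
    freqs, amps = analyze(shape)
    return (sum((f - t) ** 2 for f, t in zip(freqs, target_freqs))
            + sum((a - t) ** 2 for a, t in zip(amps, target_amps)))

def optimize_key(analyze, initial_shape, target_freqs, target_amps,
                 iters=5000, step=0.01):
    """Stochastic local search over shape parameters (a crude stand-in for LCS)."""
    best = list(initial_shape)
    best_cost = spectrum_cost(analyze, best, target_freqs, target_amps)
    for _ in range(iters):
        candidate = [p + random.gauss(0.0, step) for p in best]  # small random deformation
        cost = spectrum_cost(analyze, candidate, target_freqs, target_amps)
        if cost < best_cost:
            best, best_cost = candidate, cost
    return best, best_cost

if __name__ == "__main__":
    # Toy stand-in for modal analysis: two mode frequencies and amplitudes
    # scale linearly with two abstract shape parameters.
    toy_analyze = lambda s: ([440.0 * s[0], 880.0 * s[1]], [1.0 * s[0], 0.5 * s[1]])
    print(optimize_key(toy_analyze, [1.0, 1.0], [392.0, 784.0], [0.9, 0.45]))

A full implementation would replace the toy analysis with a physical vibration simulation of the key and would presumably also enforce fabrication constraints on the deformed, perforated shape.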

"Our zoolophone's keys are automatically tuned to play notes on a scale with overtones and frequency of a professionally produced xylophone," said Changxi Zheng, assistant professor of computer science at Columbia Engineering, who led the research team [2]. "By automatically optimizing the shape of 2D and 3D objects through deformation and perforation, we were able to produce such professional sounds that our technique will enable even novices to design metallophones [metal idiophones] with unique sound and appearance."

More than a toy, the researchers say, the zoolophone represents fundamental research into the complex relationships between an object's geometry and its material properties. "Our discovery could lead to a wealth of possibilities that go well beyond musical instruments," Zheng added. For example, the algorithm could help reduce computer-fan noise, control vibration in bridges, and advance the construction of micro-electro-mechanical resonators. Zheng has already been contacted by researchers interested in applying his approach to MEMS RF filters.

"Acoustic design of objects today remains slow and expensive," Zheng said. "We would like to explore computational design algorithms to improve the process for better controlling an object's acoustic properties, whether to achieve desired sound spectra or to reduce undesired noise. This project underscores our first step toward this exciting direction in helping us design objects in a new way."

SOURCE

http://www.evaluationengineering.com/2016/02/19/3d-printing-initiatives-span-whimsical-to-practical/

REFERENCES

1. Zheng, C., et al., "Computational Design of Metallophone Contact Sounds," ACM Transactions on Graphics, November 2015. http://dl.acm.org/citation.cfm?id=2818108

2. "Change the Shape, Change the Sound," Newswise, Oct. 28, 2015. http://www.newswise.com/articles/view/642217/?sc=dwhr&xy=10008807

https://youtu.be/tVoPAgvLzJc

https://www.luxexcel.com/



Advances in acoustics and in learning

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

Controlling acoustic properties with algorithms and computational methods

http://www.kurzweilai.net/controlling-acoustic-properties-with-algorithms-and-computational-methods

October 28, 2015

Computer scientists at Columbia Engineering, Harvard, and MIT have demonstrated that acoustic properties — both sound and vibration — can be controlled by 3D-printing specific shapes.

They designed an optimization algorithm and used computational methods and digital fabrication to alter the shape of 2D and 3D objects, creating what looks to be a simple children’s musical instrument — a xylophone with keys in the shape of zoo animals.

Practical uses

“Our discovery could lead to a wealth of possibilities that go well beyond musical instruments,” says Changxi Zheng, assistant professor of computer science at Columbia Engineering, who led the research team.

“Our algorithm could lead to ways to build less noisy computer fans and bridges that don’t amplify vibrations under stress, and advance the construction of micro-electro-mechanical resonators whose vibration modes are of great importance.”

Zheng, who works in the area of dynamic, physics-based computational sound for immersive environments, wanted to see if he could use computation and digital fabrication to actively control the acoustical property, or vibration, of an object.

Zheng’s team decided to focus on simplifying the slow, complicated, manual process of designing “idiophones” — musical instruments that produce sounds through vibrations in the instrument itself, not through strings or reeds.

The surface vibration and resulting sounds depend on the idiophone's shape in a complex way, so designing shapes to obtain desired sound characteristics is not straightforward. As a result, idiophone forms have so far been limited to well-understood designs such as rectangular bars, which are tuned by carefully drilling dimples on the underside of the instrument.
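For that well-understood bar case, the pitch can be estimated directly from geometry and material properties. The back-of-envelope sketch below (an illustration added here, not from the article) uses standard Euler-Bernoulli theory for the first bending mode of a uniform free-free rectangular bar; the bar dimensions and aluminum properties are assumed example values.

import math

def bar_fundamental_hz(length_m, thickness_m, youngs_modulus_pa, density_kg_m3):
    """First free-free bending mode of a uniform rectangular bar (Euler-Bernoulli)."""
    beta_l = 4.730  # first root of cos(x)*cosh(x) = 1 for free-free boundaries
    # For a rectangular cross-section, I/A = h^2/12, so the bar's width cancels out.
    radius_of_gyration = thickness_m / math.sqrt(12.0)
    wave_speed = math.sqrt(youngs_modulus_pa / density_kg_m3)
    return (beta_l ** 2) * radius_of_gyration * wave_speed / (2.0 * math.pi * length_m ** 2)

# Example: a 30 cm long, 6 mm thick aluminum bar lands near 350 Hz (roughly F4).
print(round(bar_fundamental_hz(0.30, 0.006, 69e9, 2700)))

Tuning then amounts to shortening the bar (raising the pitch) or thinning its underside (lowering it), which is the dimple-drilling step mentioned above; no such closed-form relationship exists for the animal-shaped keys.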

Optimizing sound properties

To demonstrate their new technique, the team settled on building a “zoolophone,” a metallophone with playful animal shapes (a metallophone is an idiophone made of tuned metal bars that can be struck to make sound, such as a glockenspiel).

 

What happens in the brain when we learn

http://www.kurzweilai.net/what-happens-in-the-brain-when-we-learn

Findings could enhance teaching methods and lead to treatments for cognitive problems
October 28, 2015

A Johns Hopkins University-led research team has proven a working theory that explains what happens in the brain when we learn, as described in the current issue of the journal Neuron.

More than a century ago, Pavlov figured out that dogs fed after hearing a bell eventually began to salivate when they heard the bell ring. The team looked into the question of how Pavlov’s dogs (in “classical conditioning”) managed to associate an action with a delayed reward to create knowledge. For decades, scientists had a working theory of how it happened, but the team is now the first to prove it.

“If you’re trying to train a dog to sit, the initial neural stimuli, the command, is gone almost instantly — it lasts as long as the word sit,” said neuroscientist Alfredo Kirkwood, a professor with the university’s Zanvyl Krieger Mind/Brain Institute. “Before the reward comes, the dog’s brain has already turned to other things. The mystery was, ‘How does the brain link an action that’s over in a fraction of a second with a reward that doesn’t come until much later?’ ”

Eligibility traces

The working theory — which Kirkwood’s team has now validated experimentally — is that invisible “synaptic eligibility traces” effectively tag the synapses activated by the stimuli so that the learning can be cemented with the arrival of a reward. The reward is a neuromodulator* (neurochemical) that floods the dog’s brain with “good feelings.” Though the brain has long since processed the “sit” command, eligibility traces in the synapse respond to the neuromodulators, prompting a lasting synaptic change, a.k.a. “learning.”
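As a rough intuition aid, the toy simulation below (an illustration added here, not the paper's biophysical model) shows how a decaying eligibility trace lets a reward that arrives seconds later still strengthen the synapse tagged by a brief stimulus; the time constant, learning rate, and delays are made-up values.

import math

def run_trial(reward_delay_s, tau_trace_s=1.0, dt=0.01, learning_rate=0.5):
    """One conditioning trial: brief stimulus at t = 0.1 s, reward after a delay."""
    trace, weight = 0.0, 1.0
    stim_time = 0.1
    reward_time = stim_time + reward_delay_s
    t = 0.0
    while t <= reward_time + dt / 2:
        if abs(t - stim_time) < dt / 2:
            trace += 1.0                      # the brief stimulus tags the synapse
        trace *= math.exp(-dt / tau_trace_s)  # the silent tag decays over time
        if abs(t - reward_time) < dt / 2:
            weight += learning_rate * trace   # neuromodulator converts the tag into lasting change
        t += dt
    return weight

for delay in (0.2, 1.0, 3.0):
    print(f"reward {delay:.1f} s after stimulus -> synaptic weight {run_trial(delay):.3f}")

The longer the reward is delayed, the less of the trace remains and the smaller the resulting change, matching the idea that the tag is transient.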

The team was able to prove the eligibility-traces theory by isolating cells in the visual cortex of a mouse. When they stimulated the axon of one cell with an electrical impulse, they sparked a response in another cell. By doing this repeatedly, they mimicked the synaptic response between two cells as they process a stimulus and create an eligibility trace.

When the researchers later flooded the cells with neuromodulators, simulating the arrival of a delayed reward, the response between the cells strengthened (“long-term potentiation”) or weakened (“long-term depression”), showing that the cells had “learned” and were able to do so because of the eligibility trace.

“This is the basis of how we learn things through reward,” Kirkwood said, “a fundamental aspect of learning.”

In addition to a greater understanding of the mechanics of learning, these findings could enhance teaching methods and lead to treatments for cognitive problems, the researchers suggest.

Scientists at the University of Texas at Houston and the University of California, Davis were also involved in the research, which was supported by grants from JHU’s Science of Learning Institute and National Institutes of Health.

* The neuromodulators tested were norepinephrine, serotonin, dopamine, and acetylcholine, all of which have been implicated in cortical plasticity (ability to grow and form new connections to other neurons).


Abstract of Distinct Eligibility Traces for LTP and LTD in Cortical Synapses

In reward-based learning, synaptic modifications depend on a brief stimulus and a temporally delayed reward, which poses the question of how synaptic activity patterns associate with a delayed reward. A theoretical solution to this so-called distal reward problem has been the notion of activity-generated “synaptic eligibility traces,” silent and transient synaptic tags that can be converted into long-term changes in synaptic strength by reward-linked neuromodulators. Here we report the first experimental demonstration of eligibility traces in cortical synapses. We demonstrate the Hebbian induction of distinct traces for LTP and LTD and their subsequent timing-dependent transformation into lasting changes by specific monoaminergic receptors anchored to postsynaptic proteins. Notably, the temporal properties of these transient traces allow stable learning in a recurrent neural network that accurately predicts the timing of the reward, further validating the induction and transformation of eligibility traces for LTP and LTD as a plausible synaptic substrate for reward-based learning.

 

Holographic sonic tractor beam lifts and moves objects using soundwaves

Another science-fiction idea realized
October 27, 2015

British researchers have built a working Star-Trek-style "tractor beam": a device that can attract or repel objects from a distance. It uses high-amplitude soundwaves to generate an acoustic hologram that can grasp and move small objects.

The technique, published in an open-access paper in Nature Communications October 27, has a wide range of potential applications, the researchers say. A sonic production line could transport delicate objects and assemble them, all without physical contact. Or a miniature version could grip and transport drug capsules or microsurgical instruments through living tissue.

The device was developed at the Universities of Sussex and Bristol in collaboration with Ultrahaptics.

https://youtu.be/wDzhlW-rKvM
University of Sussex | Levitation using sound waves

The researchers used an array of 64 miniature loudspeakers. The whole system consumes just 9 watts of power, which it uses to create high-pitched (40 kHz), high-intensity sound waves that levitate a spherical bead, 4 mm in diameter, made of expanded polystyrene.

The tractor beam works by surrounding the object with high-intensity sound to create a force field that keeps it in place. By carefully controlling the output of the loudspeakers, the object can be held still, moved, or rotated.

Three different shapes of acoustic force field work as tractor beams: a field that resembles a pair of fingers or tweezers; an acoustic vortex, in which objects become trapped at the core; and a high-intensity "cage" that surrounds the objects and holds them in place from all directions.
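The published method optimizes the drive phases of the ultrasonic array to sculpt those tweezer, vortex, and cage fields. As a much simpler illustration of the underlying idea, the sketch below computes the per-speaker phases that make a flat 8 x 8 array (64 emitters, matching the article) focus 40 kHz sound at a single point; the grid pitch and focal point are assumed values, and the real traps come from the optimizer described in the paper rather than this naive focusing rule.

import math

SPEED_OF_SOUND = 343.0   # m/s in air
FREQ_HZ = 40_000.0       # 40 kHz ultrasound, as in the article
WAVENUMBER = 2.0 * math.pi * FREQ_HZ / SPEED_OF_SOUND

def focus_phases(speaker_positions, focal_point):
    """Drive phase (radians) for each speaker so all waves arrive in phase at the focus."""
    return [(-WAVENUMBER * math.dist(pos, focal_point)) % (2.0 * math.pi)
            for pos in speaker_positions]

# Assumed geometry: 8 x 8 grid of emitters on a 1 cm pitch, focusing 5 cm above the centre.
grid = [(i * 0.01, j * 0.01, 0.0) for i in range(8) for j in range(8)]
phases = focus_phases(grid, (0.035, 0.035, 0.05))
print([round(p, 2) for p in phases[:4]])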

Previous attempts surrounded the object with loudspeakers, which limited the extent of movement and restricted many applications. Last year, the University of Dundee presented the concept of a tractor beam, but no objects were held in the ray.

The team is now designing different variations of this system: a bigger version that aims to levitate a soccer ball from 10 meters away, and a smaller version that aims to manipulate particles inside the human body.

https://youtu.be/g_EM1y4MKSc
Asier Marzo, Matt Sutton, Bruce Drinkwater and Sriram Subramanian | Acoustic holograms are projected from a flat surface and contrary to traditional holograms, they exert considerable forces on the objects contained within. The acoustic holograms can be updated in real time to translate, rotate and combine levitated particles enabling unprecedented contactless manipulators such as tractor beams.


Abstract of Holographic acoustic elements for manipulation of levitated objects

Sound can levitate objects of different sizes and materials through air, water and tissue. This allows us to manipulate cells, liquids, compounds or living things without touching or contaminating them. However, acoustic levitation has required the targets to be enclosed with acoustic elements or had limited maneuverability. Here we optimize the phases used to drive an ultrasonic phased array and show that acoustic levitation can be employed to translate, rotate and manipulate particles using even a single-sided emitter. Furthermore, we introduce the holographic acoustic elements framework that permits the rapid generation of traps and provides a bridge between optical and acoustical trapping. Acoustic structures shaped as tweezers, twisters or bottles emerge as the optimum mechanisms for tractor beams or containerless transportation. Single-beam levitation could manipulate particles inside our body for applications in targeted drug delivery or acoustically controlled micro-machines that do not interfere with magnetic resonance imaging.

 

A drug-delivery technique to bypass the blood-brain barrier

http://www.kurzweilai.net/a-drug-delivery-technique-to-bypass-the-blood-brain-barrier

Could benefit a large population of patients with neurodegenerative disorders
October 26, 2015

Researchers at Massachusetts Eye and Ear/Harvard Medical School and Boston University have developed a new technique to deliver drugs across the blood-brain barrier and have successfully tested it in a Parkinson’s mouse model (a line of mice that has been genetically modified to express the symptoms and pathological features of Parkinson’s to various extents).

Their findings, published in the journal Neurosurgery, lend hope to patients with neurological conditions that are difficult to treat due to a barrier mechanism that prevents approximately 98 percent of drugs from reaching the brain and central nervous system.

“Although we are currently looking at neurodegenerative disease, there is potential for the technology to be expanded to psychiatric diseases, chronic pain, seizure disorders, and many other conditions affecting the brain and nervous system down the road,” said senior author Benjamin S. Bleier, M.D., of the department of otolaryngology at Mass. Eye and Ear/Harvard Medical School.

The nasal mucosal grafting solution

Researchers delivered glial derived neurotrophic factor (GDNF), a therapeutic protein in testing for treating Parkinson’s disease, to the brains of mice. They showed that their delivery method was equivalent to direct injection of GDNF, which has been shown to delay and even reverse disease progression of Parkinson’s disease in pre-clinical models.

Nasal mucosal grafting is a technique regularly used in the ENT (ear, nose, and throat) field to reconstruct the barrier around the brain after surgery to the skull base. ENT surgeons commonly use endoscopic approaches to remove brain tumors through the nose, making a window through the blood-brain barrier to access the brain. Once the treatment is finished, they use adjacent nasal lining to rebuild the opening in a permanent and safe way.

The safety and efficacy of these methods have been well established through long-term clinical outcomes studies in the field, with the nasal lining protecting the brain from infection just as the blood brain barrier has done.

By functionally replacing a section of the blood-brain barrier with nasal mucosa, which is more than 1,000 times more permeable than the native barrier, surgeons could create a “screen door” to allow for drug delivery to the brain and central nervous system.

The technique has the potential to benefit a large population of patients with neurodegenerative disorders, where there is still a specific unmet need for blood-brain-penetrating therapeutic delivery strategies.

The study was funded by The Michael J. Fox Foundation for Parkinson’s Research (MJFF).


Abstract of Heterotopic Mucosal Grafting Enables the Delivery of Therapeutic Neuropeptides Across the Blood Brain Barrier

BACKGROUND: The blood-brain barrier represents a fundamental limitation in treating neurological disease because it prevents all neuropeptides from reaching the central nervous system (CNS). Currently, there is no efficient method to permanently bypass the blood-brain barrier.

OBJECTIVE: To test the feasibility of using nasal mucosal graft reconstruction of arachnoid defects to deliver glial-derived neurotrophic factor (GDNF) for the treatment of Parkinson disease in a mouse model.

METHODS: The Institutional Animal Care and Use Committee approved this study in an established murine 6-hydroxydopamine Parkinson disease model. A parietal craniotomy and arachnoid defect was repaired with a heterotopic donor mucosal graft. The therapeutic efficacy of GDNF (2 μg/mL) delivered through the mucosal graft was compared with direct intrastriatal GDNF injection (2 μg/mL) and saline control through the use of 2 behavioral assays (rotarod and apomorphine rotation). An immunohistological analysis was further used to compare the relative preservation of substantia nigra cell bodies between treatment groups.

RESULTS: Transmucosal GDNF was equivalent to direct intrastriatal injection at preserving motor function at week 7 in both the rotarod and apomorphine rotation behavioral assays. Similarly, both transmucosal and intrastriatal GDNF demonstrated an equivalent ratio of preserved substantia nigra cell bodies (0.79 ± 0.14 and 0.78 ± 0.09, respectively, P = NS) compared with the contralateral control side, and both were significantly greater than saline control (0.53 ± 0.21; P = .01 and P = .03, respectively).

CONCLUSION: Transmucosal delivery of GDNF is equivalent to direct intrastriatal injection at ameliorating the behavioral and immunohistological features of Parkinson disease in a murine model. Mucosal grafting of arachnoid defects is a technique commonly used for endoscopic skull base reconstruction and may represent a novel method to permanently bypass the blood-brain barrier.

 

Creating an artificial sense of touch by electrical stimulation of the brain

http://www.kurzweilai.net/creating-an-artificial-sense-of-touch-by-electrical-stimulation-of-the-brain

DARPA-funded study may lead to building prosthetic limbs for humans using a direct brain-electrode interface to recreate the sense of touch
October 26, 2015

Neuroscientists in a project headed by the University of Chicago have determined some of the specific characteristics of electrical stimuli that should be applied to the brain to produce different sensations in an artificial upper limb intended to restore natural motor control and sensation in amputees.

The research is part of Revolutionizing Prosthetics, a multi-year Defense Advanced Research Projects Agency (DARPA) program.

For this study, the researchers used monkeys, whose sensory systems closely resemble those of humans. They implanted electrodes into the primary somatosensory cortex, the area of the brain that processes touch information from the hand. The animals were trained to perform two perceptual tasks: one in which they detected the presence of an electrical stimulus, and a second task in which they indicated which of two successive stimuli was more intense.

The sense of touch is made up of a complex and nuanced set of sensations, from contact and pressure to texture, vibration and movement. The goal of the research is to document the range, composition and specific increments of signals that create sensations that feel different from each other.

To achieve that, the researchers manipulated various features of the electrical pulse train, such as its amplitude, frequency, and duration, and noted how the interaction of each of these factors affected the animals’ ability to detect the signal.

Of specific interest were the "just-noticeable differences" (JNDs), the incremental changes needed to produce a sensation that feels different. For instance, at a certain frequency, the signal may first be detectable at a strength of 20 microamps of electricity. If the signal has to be increased to 50 microamps before a difference is noticed, the JND in that case is 30 microamps.*

“When you grasp an object, for example, you can hold it with different grades of pressure. To recreate a realistic sense of touch, you need to know how many grades of pressure you can convey through electrical stimulation,” said Sliman Bensmaia, PhD, Associate Professor in the Department of Organismal Biology and Anatomy at the University of Chicago and senior author of the study, which was published today (Oct. 26) in the Proceedings of the National Academy of Sciences. “Ideally, you can have the same dynamic range for artificial touch as you do for natural touch.”

“This study gets us to the point where we can actually create real algorithms that work. It gives us the parameters as to what we can achieve with artificial touch, and brings us one step closer to having human-ready algorithms.”

Researchers from the University of Pittsburgh and Johns Hopkins University were also involved in the DARPA-supported study.

* The study also has important scientific implications beyond neuroprosthetics. In natural perception, a principle known as Weber’s Law states that the just-noticeable difference between two stimuli is proportional to the size of the stimulus. For example, with a 100-watt light bulb, you might be able to detect a difference in brightness by increasing its power to 110 watts. The JND in that case is 10 watts. According to Weber’s Law, if you double the power of the light bulb to 200 watts, the JND would also be doubled to 20 watts.

However, Bensmaia’s research shows that with electrical stimulation of the brain, Weber’s Law does not apply — the JND remains nearly constant, no matter the size of the stimulus. This means that the brain responds to electrical stimulation in a much more repeatable, consistent way than through natural stimulation.
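The contrast is easy to see numerically. In the sketch below, the 10 percent Weber fraction and the 30-microamp constant JND are illustrative numbers only (the latter borrowed from the worked example above), not measured values from the study.

def weber_jnd(stimulus, weber_fraction=0.10):
    """Weber's Law: the just-noticeable difference scales with the stimulus."""
    return weber_fraction * stimulus            # e.g., 100 W -> 10 W, 200 W -> 20 W

def icms_jnd(stimulus_microamps, constant_jnd_microamps=30.0):
    """Reported ICMS behavior: the JND stays roughly constant at every level."""
    return constant_jnd_microamps

for amps in (20, 50, 100, 200):
    print(f"{amps:>4} uA stimulus | Weber-style JND: {weber_jnd(amps):6.1f} uA | "
          f"constant JND: {icms_jnd(amps):5.1f} uA")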

“It shows that there is something fundamentally different about the way the brain responds to electrical stimulation than it does to natural stimulation,” Bensmaia said.


Abstract of Behavioral assessment of sensitivity to intracortical microstimulation of primate somatosensory cortex

Intracortical microstimulation (ICMS) is a powerful tool to investigate the functional role of neural circuits and may provide a means to restore sensation for patients for whom peripheral stimulation is not an option. In a series of psychophysical experiments with nonhuman primates, we investigate how stimulation parameters affect behavioral sensitivity to ICMS. Specifically, we deliver ICMS to primary somatosensory cortex through chronically implanted electrode arrays across a wide range of stimulation regimes. First, we investigate how the detectability of ICMS depends on stimulation parameters, including pulse width, frequency, amplitude, and pulse train duration. Then, we characterize the degree to which ICMS pulse trains that differ in amplitude lead to discriminable percepts across the range of perceptible and safe amplitudes. We also investigate how discriminability of pulse amplitude is modulated by other stimulation parameters—namely, frequency and duration. Perceptual judgments obtained across these various conditions will inform the design of stimulation regimes for neuroscience and neuroengineering applications.

REFERENCES

  • Sungshin Kim, Thierri Callier, Gregg A. Tabot, Robert A. Gaunt, Francesco V. Tenore, and Sliman J. Bensmaia. Behavioral assessment of sensitivity to intracortical microstimulation of primate somatosensory cortex. PNAS 2015; doi:10.1073/pnas.1509265112
