RIT becomes a test site for Yamaha’s immersive audio technology

Researchers at Rochester Institute of Technology are changing how people experience sound. Some of those innovations will contribute to the high-tech acoustics developed by global entertainment company Yamaha Corp.

Sungyoung Kim’s Applied and Innovative Research in Immersive Sound (AIRIS) laboratory at RIT is a test site for exploring the future capabilities of one of Yamaha’s newest technologies, the Active Field Control (AFC) system. Kim and his students will help develop innovations for the next phase of the high-tech audio system.

“I was approached by Yamaha to talk about the future of this technology. The first step was to adapt Yamaha’s entire system in our laboratory. That was the first phase,” said Kim, an associate professor of audio engineering technology in RIT’s College of Engineering Technology, who has expertise in creating immersive audio systems that reproduce sound in modern spaces.

AFC technology enhances an environment by controlling reverberation and the positioning of sound objects, enriching the acoustics of passive spaces and creating aural parameters that build on the architecture of a site. Often referred to as three-dimensional audio, virtual and immersive sound is an emerging field of studio and production work in which companies such as Yamaha continually seek to deliver rich, high-quality sound across platforms, especially for 3D classical music concerts.

“One of the challenges in 3D music performances is how to synchronize computer-generated parts and acoustic environments with a human performance. And how computers recognize music and track human interpretation is a hot research topic,” Kim said. “Composers use technology as a means to realize their musical creativity. We simply changed that concept.”

Part of that concept involved understanding the integration of music and listening environment. The system allows certain acoustic parameters to be recreated for live performance. It is more than speakers: the system adjusts the acoustics of a modern space, for example, to give the audience the impression of experiencing sound anywhere from an ancient cathedral to windswept caves, without having to be in the actual environment.
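AFC itself works live, with microphones and loudspeakers in the hall, so the sketch below is only a rough analogy: offline convolution reverb, which imposes a target room's measured impulse response on a "dry" recording. It illustrates the general idea of lending one space the acoustic signature of another, not Yamaha's implementation, and the file names are hypothetical.

```python
# A minimal sketch of the general idea only: give a dry recording the
# reverberation of another space by convolving it with an impulse response
# measured in that space (e.g., a cathedral). This is ordinary offline
# convolution reverb, NOT Yamaha's AFC, which operates in real time with
# microphones and loudspeakers in the room. File names are placeholders.
import numpy as np
from scipy.io import wavfile
from scipy.signal import fftconvolve

rate, dry = wavfile.read("quartet_dry.wav")        # hypothetical dry recording
ir_rate, ir = wavfile.read("cathedral_ir.wav")     # hypothetical impulse response
assert rate == ir_rate, "resample first so both files share one sample rate"

# Work in mono floating point for simplicity.
dry = dry.astype(np.float64)
ir = ir.astype(np.float64)
if dry.ndim > 1:
    dry = dry.mean(axis=1)
if ir.ndim > 1:
    ir = ir.mean(axis=1)

wet = fftconvolve(dry, ir)                         # apply the room's acoustic signature
wet /= np.max(np.abs(wet))                         # normalize to avoid clipping

wavfile.write("quartet_in_cathedral.wav", rate, (wet * 32767).astype(np.int16))
```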

In a first experiment, Kim worked with Sihyun Uhm, a composer from the Eastman School of Music, who was asked to develop a new piece of music using the system. Computers can supply pre-recorded effects, but this system immerses the audience in the sound environment. The composer wanted the audience to feel, or visualize internally, two environments, a mountain and a desert, with the composition moving back and forth between the two, Kim explained.

“Many people are exploring this today. With orchestras, for example, there are changes across orchestral movements, comparable to moving from scene to scene in a hall,” he said. “This new composition lasted only 10 minutes. We changed the acoustic setting, or the musical setting, several times within the same movement, which is unique and challenging. It is another approach to music.”

Uhm’s composition, String Quartet No. 2, was performed at Yamaha Ginza Studio in Tokyo, Japan, last summer. Kim was joined by Hideo Miyazaki, a space acoustic design engineer at Yamaha, local musicians, company colleagues, and several alumni of RIT’s engineering technology program living in Japan.

Four of the RIT alumni participated as assistant engineers/operators at the concert, running the AFC system while Kim prepared for the performance.

“While students can learn many facets of acoustics from books, the system has provided a unique learning opportunity on how to virtually manipulate acoustics in real time,” said Kim, whose work is funded through two Yamaha corporate grants. The first, “Toward an Individualized Presentation of the Immersive Experience,” is a three-year grant to examine the auditory selective attention process, a cognitive process humans use to distinguish sounds and environments. The second, “Investigating the Perceptual Cues Needed to Remotely Tune the Yamaha AFC System,” looks at virtual connections and how engineers can tune the system without being in the actual space.

“It is more technical to tune a system remotely, and that connects to the concept of performing in the metaverse. With more people doing virtual work remotely, it is essential to have audio systems that are compatible and deliver the best possible audio quality,” Kim said.

This audio research for Yamaha will also have an impact at RIT, with its multiple conference spaces and auditoriums on campus. The ITL is a testbed where physically separated spaces can be virtually synchronized.

“We can have a musician in one building, another musician in another building, and I want to see if they can play together,” he said. “This is the future of music in the metaverse. You can have an artist performing in Korea, while others play in the United States, even here at RIT. In the metaverse, there are no walls or barriers between those musicians. I think the acoustic landscape is what makes them feel as if they are in the same space.”
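The article does not describe how the lab synchronizes separated spaces. Purely as an illustration of one generic signal-processing building block, the sketch below estimates the time offset between captures from two rooms using cross-correlation; the technique and file names are assumptions, not the lab's or Yamaha's method.

```python
# A hedged sketch (not the AIRIS lab's method): estimate how far apart in time
# two room captures are by cross-correlating them, a common first step before
# aligning streams from physically separated spaces. File names are hypothetical.
import numpy as np
from scipy.io import wavfile
from scipy.signal import correlate, correlation_lags

rate_a, room_a = wavfile.read("musician_room_a.wav")   # hypothetical capture, building A
rate_b, room_b = wavfile.read("musician_room_b.wav")   # hypothetical capture, building B
assert rate_a == rate_b, "captures must share one sample rate"

a = room_a.astype(np.float64)
b = room_b.astype(np.float64)
if a.ndim > 1:
    a = a.mean(axis=1)
if b.ndim > 1:
    b = b.mean(axis=1)

# Find the lag (in samples) at which the two streams line up best.
corr = correlate(a, b, mode="full")
lags = correlation_lags(len(a), len(b), mode="full")
offset_samples = lags[np.argmax(corr)]
print(f"Estimated offset: {offset_samples / rate_a * 1000:.1f} ms")
```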

