Native Instruments Leverages Amazon Bedrock for Smarter, More Intuitive Search and Discovery for Music Creators

About Native Instruments

For over 25 years, Native Instruments has been at the forefront of musical innovation with technology that has revolutionized how people make music. They develop, manufacture, and supply music software and hardware for music production, audio engineering, and DJing for amateurs and professionals alike.

Native Instruments’ Challenge

Andy Sarroff, Senior Research Manager at Native Instruments, leads a research team that creates virtual instruments and synthesizers, in addition to software for audio production. Native Instruments sells both the software-based instruments that make sounds and access to their expansive catalog of sounds. This catalog, a digital library of sounds, is searched using a Contrastive Language-Audio Pretraining (CLAP) model.

Sarroff and his team have worked for some time to improve the discoverability of audio files within the library, knowing that end-user creators searched in a multitude of ways, using varied descriptors. One critical piece that Sarroff and his team hadn’t yet accounted for was how to provide social context insights in the search and discovery process. 

“We knew that social context was going to be a huge part of delivering a smarter, more intuitive search. Of course, GenAI was the likely solution, but that wasn’t a space we specialized in,” said Sarroff.

Because Sarroff and his team primarily worked with Machine Learning (ML) and digital signal processing, and already had a backlog of business-critical projects in motion, they knew they’d need a third-party expert to develop and fast-track a GenAI strategy for addressing this challenge.

Why Tribe AI?

“Of course, there are plenty of other consultancies specializing in GenAI and LLMs. It was especially important to us, however, that we partner with people who share the empathy we have for users in the music space,” said Sarroff.

Native Instruments discovered Tribe AI through a recommendation from their trusted tech investor Francisco Partners. Francisco Partners knew from experience that Tribe AI team members were not just experts on the technology. The Tribe AI team regularly demonstrated their commitment to understanding the end user’s challenge as a means to developing truly impactful solutions.

Proposed Solution

Native Instruments and Tribe AI began a 4-week proof of concept (POC) engagement centered on building a prototype of a human-like chat interface that would deliver an exceptional user search and discovery experience specifically focused on improved recognition of social context.

The GenAI strategy would combine Amazon Bedrock with Native Instruments’ indexed sound file data, using Large Language Models (LLMs) to deliver improved search results to end-user creators. This technology would allow creators to use more human-like language when searching the library. It would also recognize that a search for “Arcade Fire” should produce audio files from the band named Arcade Fire rather than files that sound like an “arcade” or a “fire”. Sarroff believes the chat interface could revolutionize the creator experience, as it’s a GenAI application never before implemented in this space.

“AI is a very touchy subject for the creative workforce because there are some companies trying to replace the talent. We are invested in serving the creators, not disenfranchising them. Instead, we want to empower them to realize their creative outcomes as quickly as possible,” said Sarroff.

Tech Stack Details

Full-stack cloud-based application working alongside the existing Native Instruments environment: 

  • Cloud: Amazon Web Services (AWS) with Amazon Bedrock
  • Large language models: Anthropic Claude 2
  • Embedding models: CLAP (Audio<>Text)
  • Programming languages: Python + TypeScript

How It Works

The project’s backend is built with Django and its frontend with React. The backend is built via a GitHub Action, containerized using Docker, and pushed to Amazon Elastic Container Registry (ECR). An EC2 instance runs the Docker container; in the future, this will be auto-scaled using Amazon Elastic Container Service (ECS).

For DNS routing, the teams used Route 53, and for security they utilized AWS Elastic Load Balancing (ELB), whose associated security groups define the firewall rules.

The backend interacts with Amazon Bedrock to perform LLM queries using Claude. The LLM transforms user queries into text suitable for searching via a Contrastive Language-Audio Pretraining (CLAP) model. The CLAP model currently lives outside the AWS infrastructure, with plans to move it into Amazon SageMaker.
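A minimal sketch of this query-rewriting step, assuming `boto3` and the Claude 2 model ID on Bedrock; the prompt wording and the `build_prompt`/`rewrite_query` helpers are illustrative, not Native Instruments’ actual code:

```python
import json


def build_prompt(user_query: str) -> str:
    """Wrap the user's search in Claude's Human/Assistant prompt format.

    The instruction text is a hypothetical example of asking the LLM to
    resolve social context (band names, pop-culture references) into a
    plain description of the audio itself.
    """
    return (
        "\n\nHuman: Rewrite this sound-library search so it describes the "
        "audio itself, resolving band names and pop-culture references: "
        f"{user_query}\n\nAssistant:"
    )


def rewrite_query(user_query: str, region: str = "us-east-1") -> str:
    """Send the prompt to Claude 2 on Amazon Bedrock and return its rewrite."""
    import boto3  # AWS SDK; requires configured AWS credentials

    bedrock = bedrock_client = boto3.client("bedrock-runtime", region_name=region)
    response = bedrock_client.invoke_model(
        modelId="anthropic.claude-v2",
        body=json.dumps({
            "prompt": build_prompt(user_query),
            "max_tokens_to_sample": 200,
        }),
    )
    return json.loads(response["body"].read())["completion"].strip()
```

The rewritten text can then be passed to the CLAP text encoder, so a query like “Arcade Fire” reaches the embedding model as a description of the band’s sound rather than of arcades or fires.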

The frontend is deployed via a GitHub Action, which packages the NPM build and pushes it to an S3 bucket; Route 53 handles DNS routing here as well.

Developing the Roadmap

The engagement spanned four weeks, with the first two weeks focused primarily on discovery and scope development.

The Tribe AI team learned that the CLAP library was quite extensive and the audio files within it had been tagged with indexable data. These descriptor tags could certainly be leveraged, but on their own wouldn’t deliver the smarter search and discovery interface that Native Instruments desired. The Tribe AI team planned to couple the indexed data tags with GenAI to produce a more human-like interface that would be able to understand more nuanced prompts and pop culture references.
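To illustrate how embedding-based retrieval of this kind can work, here is a toy sketch of CLAP-style search: query text and audio files share one embedding space, and search is nearest-neighbour by cosine similarity. The file names and three-dimensional vectors below are invented for illustration; real CLAP embeddings are high-dimensional and produced by a trained model.

```python
import math

# Hypothetical library: each audio file maps to an embedding vector that a
# CLAP audio encoder would normally produce.
LIBRARY = {
    "anthemic indie drums": [0.9, 0.1, 0.2],
    "8-bit arcade blips":   [0.1, 0.9, 0.1],
    "crackling fire loop":  [0.1, 0.2, 0.9],
}


def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


def search(query_embedding, top_k=1):
    """Return the top_k library entries closest to the query embedding."""
    ranked = sorted(
        LIBRARY,
        key=lambda name: cosine(query_embedding, LIBRARY[name]),
        reverse=True,
    )
    return ranked[:top_k]


# A query embedding near the "anthemic indie drums" vector retrieves that
# file even if the raw query text never used those words.
print(search([0.8, 0.2, 0.1]))  # → ['anthemic indie drums']
```

In the combined design, the indexed descriptor tags remain available for exact filtering, while the embedding search supplies the fuzzier, more human-like matching.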

The interface would be powered by Anthropic models and would rely on LLMs to deliver the social context layer that had been missing from the previous solution. Sarroff and his team were relieved to lean on the GenAI strategy, and the right tech partners, rather than having to build the social context connection layer themselves.

“Working with Tribe AI on this project has brought so much value. This kind of work sits outside of our circle of experience. Without having to completely divert our team’s attention, we were able to develop the prototype and learn a lot via osmosis,” said Sarroff.

The final two weeks of the POC engagement involved rapid prototyping for the Tribe AI team. Sarroff knew he’d have some work to do before the testing phase in order to ready his internal team. Sarroff made plans to present on the capabilities of the prototype along with details on the POC engagement work and his team’s learnings from the process. He hoped the presentations would reconfirm the team’s trust in the GenAI strategy and bolster idea generation for developing monetization models for the new interface.

Tribe Team Members

Arushi - Product Manager

Marius - GenAI Lead

Craig - Engagement Lead

Native Instruments’ Experience Working with Tribe

Sarroff and his team had identified the technical gap in their search interface long before the engagement. They also knew that GenAI and LLMs were the likely source of a solution. The Tribe AI team brought tremendous value by narrowing the problem space and providing expert guidance on the technology, resulting in a working prototype of the human-like chat interface. The prototype will allow Sarroff and his team to perform A/B testing against their existing solution.

“Thanks to the Tribe AI team, we were able to quickly ideate and test new AI concepts that would differentiate us from others in the market. The Tribe AI team demonstrated deep expertise in AI and a commitment to rapid execution,” said Sarroff.

Impact

The POC engagement succeeded in producing a functional human-like chat interface that better addresses the social context shortcomings of the existing search and discovery process. The Amazon Bedrock-powered GenAI and LLM deployment lets end users get more relevant results when searching with pop culture references. Previously, users commonly had to make multiple attempts at a search to get their desired result.

Based on the outcomes of the POC engagement, Sarroff affirmed that focusing on this deliverable was the right choice for exploring GenAI-centered innovation for Native Instruments. He has shared initial results and prototype details internally at Native Instruments in presentation format, and plans for an upcoming A/B testing phase where leadership can interact with the interface. Sarroff believes his leadership team will have a critical eye and valuable feedback to share as they test the prototype.

The Future

“We are living and breathing AI right now. It exists all around us and it's not going away. It's part of our new life stack. That’s a fact,” said Sarroff.

Sarroff and his team are optimistic about the ways GenAI can further innovate the music industry. Leaning on the same technology that improved the search and discovery interface for creators, Sarroff sees revolutionized workflows for amateur music producers as they more intuitively discover production techniques and products, further streamlining the overall creative process.

“The technology is versatile. We can see it being used almost anywhere: from product catalogs and the software, to the ecommerce website, or even in the instruments themselves. Every product we are working on would be less powerful without this piece,” said Sarroff.

