Audio Software Engineer

Need help leveling up game audio systems and workflows to push the boundaries of gameplay and interactive soundscapes?

From high-level systems to tools to platform technology, Michelle can adapt to support the needs of audio and gameplay teams.

Innovation in art and technology for sound

Michelle loves to collaborate on complex cross-disciplinary projects to solve creative problems, especially in game audio.

After realizing that her favorite games all have exceptional audio, she set out to learn the underlying technology, and she remains excited to keep growing with new challenges.

She thrives on teams striving to reach new frontiers in gameplay and exploring ways audio can enhance interactive experiences.

With her experience in various areas of the game audio pipeline, Michelle is ready to jump into any gameplay system to empower designers and engineers while quickly learning the tech and tools.

Skills honed over the years at:
  • Obsidian Entertainment
  • 343 Industries (now Halo Studios)
  • Dolby Laboratories

Interests

  • Learning different areas of game development
  • Technical sound design and prototyping
  • Advocating for audio and diversity in games

Where you can hear and learn about Michelle's work

Learn about the impact of Michelle's work on past shipped projects showcasing gameplay audio systems, tools, and integrations.


Gameplay systems in Avowed

Obsidian Entertainment, 2025
Unreal Engine 5, Wwise, C++, PC, Xbox

  • Dynamic music system that responds to combat intensity and to scripted areas during open-world exploration.
  • Custom Slate UI for organizing music data assets for each level.
  • Override feature for ambient audio volumes placed in levels to change the music in specific areas, such as boss combat or sweeping vistas.
  • Debug menu to quickly test combat music with a slider to change threat intensity rather than recreating a full combat encounter with enemy AI.
  • Cutscene implementation to seamlessly transition between ambient music and scripted scenes.
  • Ambient emitter technology for rivers, driven by player movement and location and integrated with the tech art pipeline for automatic generation and placement through worldbuilding tools.
  • Parameters, including width and velocity (flow), used to generate sections of emitters at each point on the spline.
  • Saved sound designers implementation time by integrating with the tech art workflow to generate the ambient audio setup with a single button click.
  • Integration of the Triton Acoustics spatial audio plugin into the game project, using custom audio components to stress test the latest features in collaboration with the Xbox team.
  • Wrapper to pipe Triton Acoustics processing into custom audio components.
  • Prototype bake process that divides a level into a grid, improving acoustics data bake times before UE5 World Partition support.
  • Audio interfaces for characters, weapons, and other gameplay objects to trigger audio for various systems like movement and events unique to each object type.
  • Custom behaviors for different interfaces to activate/deactivate audio components to save CPU resources.
  • AI VO utilities to add organic cooldowns for enemy and companion banter and hooks for major combat events.
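
As an illustration of the river emitter work above, here is a minimal sketch of generating emitters along a spline from per-point width and flow parameters. The data layout, the 0.5 radius factor, and the 2.0 velocity threshold are hypothetical stand-ins, not the shipped Avowed implementation.

```python
from dataclasses import dataclass

@dataclass
class SplinePoint:
    """One point on a river spline; fields are illustrative, not engine data."""
    x: float
    y: float
    width: float     # river width at this point, in meters
    velocity: float  # flow speed, used to pick the sound variation

def generate_river_emitters(points):
    """Emit one ambient-emitter description per spline point, sized by width."""
    emitters = []
    for p in points:
        emitters.append({
            "position": (p.x, p.y),
            # Wider sections get a larger attenuation radius.
            "radius": p.width * 0.5,
            # Faster flow selects a more energetic water loop (threshold is made up).
            "variation": "rapids" if p.velocity > 2.0 else "calm",
        })
    return emitters
```

In the shipped pipeline, the equivalent step runs inside the worldbuilding tools, which is what lets a sound designer get a full emitter setup from a single button click.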

Audio Tools and Extensions in Halo: Infinite

343 Industries (now Halo Studios), 2020
Slipspace Engine, Wwise, C#, C++, PC, Xbox

  • Scripting toolkit that automatically imports audio assets into the Wwise project and copies object properties using a template hierarchy, streamlining implementation.
  • Pushed the Wwise Authoring API (WAAPI) to the limits of its capabilities to extract as much data as possible.
  • Naming conventions for files and folders to assist with hierarchy and tokenization of audio objects.
  • In-game debug view of audio memory usage and soundbank loading status.
  • Filterable and sortable categories to display subsets of audio objects, ordered by size.
  • Visual scripting node for proprietary Slipspace game engine to cache vehicle object velocity for sound designers.
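
To sketch how a naming convention can drive hierarchy and tokenization (as in the import toolkit above): the `category_source_action_variant` pattern, field names, and example path below are invented for illustration; the actual Halo Infinite conventions are internal.

```python
import re

# Invented convention: category_source_action_variant, e.g. "wpn_br_fire_01".
# The real Halo Infinite naming rules are internal; this only shows the idea.
TOKEN_FIELDS = ("category", "source", "action", "variant")

def tokenize_asset_name(filename):
    """Split an audio asset filename into hierarchy tokens."""
    stem = re.sub(r"\.\w+$", "", filename)  # drop the file extension
    tokens = stem.split("_")
    if len(tokens) != len(TOKEN_FIELDS):
        raise ValueError(f"{filename!r} does not match the naming convention")
    return dict(zip(TOKEN_FIELDS, tokens))

def template_path(filename, root="Actor-Mixer Hierarchy"):
    """Map the tokens onto a template hierarchy path for automated import."""
    t = tokenize_asset_name(filename)
    return "/".join([root, t["category"], t["source"], t["action"], t["variant"]])
```

A scheme like this lets an import script place each asset under the right template object and copy its properties without any manual bookkeeping.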

Platform Technology - Dolby Atmos for Xbox and Games

Dolby Laboratories, 2018
C++, Python, PC, Xbox

  • Automated testing for the Dolby Atmos for VR application with text-based head-tracking input.
  • Test apps for Dolby Atmos SDKs to help Microsoft with OS integration.
  • Sample code showing how to decode Dolby Atmos content to route bed channels and audio objects.
  • Test automation framework for validating Dolby Atmos processing with various content types on multiple platforms and endpoints (e.g., playing a game on Xbox through speakers or streaming a movie on Windows through headphones).
  • Regression and smoke tests integrated with builds and merges.
  • Audio quality verification through signal analysis and by re-encoding content decoded by the OS.
  • Custom wrapper to integrate Dolby Atmos with EA's Frostbite engine for Star Wars: Battlefront.
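
A test matrix like the one described (content types × platforms × endpoints) can be enumerated mechanically. The dimension values below are illustrative placeholders, not Dolby's real test plan.

```python
from itertools import product

# Illustrative matrix dimensions; each combination becomes one validation run.
CONTENT_TYPES = ["game_7_1_4", "movie_atmos", "music_stereo"]
PLATFORMS = ["xbox", "windows"]
ENDPOINTS = ["speakers", "headphones", "soundbar"]

def build_test_matrix(content=CONTENT_TYPES, platforms=PLATFORMS, endpoints=ENDPOINTS):
    """Enumerate every content/platform/endpoint combination as a named test case."""
    return [
        {"content": c, "platform": p, "endpoint": e, "name": f"{c}-{p}-{e}"}
        for c, p, e in product(content, platforms, endpoints)
    ]
```

One design upside of generating cases this way is that smoke tests on merges can run a reduced matrix while the full matrix runs as a nightly regression pass.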

Conferences

Game Developers Conference, 2025

Audio Summit talk
(GDC Vault access required to watch)
Understanding Game Audio Programming and How I Got There
Overview:
Game audio programming covers many areas of game development, and this talk explains what they are. It also offers a unique perspective on starting a career in the field. From platform development to engine, tools, and gameplay programming, audio spans a wide breadth of disciplines, and game development can benefit from dedicated audio programming support. Michelle shares how she became an audio programmer and her experience as a minority (a woman and a person of color) in the industry, as well as feeling imposter syndrome and striving to overcome it. There is no one way to be an audio programmer, but that's what makes it so exciting.

Published Works

Game Audio Programming: Volume 4, 2023

Book chapter
Data-Driven Music Systems for Open Worlds
Abstract:
Interactive music in games is a broad topic, and there are several ways to develop an interactive music system. This chapter covers implementing a basic data-driven, reactive music system for open-world games, with gameplay examples. It gives an overview of the several game components that work with the music system and how data is mapped between the game and audio engines. The different gameplay systems within a game help drive the music data through the music system, creating dynamic, responsive music that transitions seamlessly between combat and ambience. The main takeaway is a foundation for building a flexible music system that works with any kind of game data to automatically fill the musical space of a large open-world game.
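
The data mapping the chapter describes, gameplay state remapped into audio engine parameters, can be sketched as follows; the parameter names and input ranges here are illustrative and not taken from the book.

```python
def map_range(value, in_min, in_max, out_min=0.0, out_max=100.0):
    """Linearly remap (and clamp) a gameplay value into a parameter range."""
    value = max(in_min, min(in_max, value))
    return out_min + (value - in_min) / (in_max - in_min) * (out_max - out_min)

def gather_music_data(game_state):
    """Collect gameplay state into normalized, RTPC-style music parameters."""
    return {
        # All input ranges below are invented for this sketch.
        "threat": map_range(game_state["enemies_engaged"], 0, 8),
        "player_health": map_range(game_state["health"], 0, 100),
        "time_of_day": map_range(game_state["hour"], 0, 24),
    }
```

With a layer like this in the middle, any gameplay system can feed the music system without the audio engine needing to know where each value came from.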

Toolchains and Tech Stacks

Programming languages, game engines, and software

Coding Environment

  • Languages: C++, C#, C, Python
  • IDEs: Visual Studio, Rider, PyCharm
  • Source Control: Perforce, GitHub

Game Development Stack

  • Game Engines: Unreal Engine 5, Unity
  • Audio Middleware: Wwise, FMOD

Let’s connect!

Email or connect on LinkedIn to inquire about gameplay and audio needs or chat about game audio.

  • CA, USA

    Available for remote or hybrid opportunities in California
  • Flexible working hours in PST