
Analyzing Audio

To visualize audio, it must first be processed into a format more conducive to visualization; broadly, this is the domain of digital signal processing (DSP). I started this project by writing a custom DSP library for UE4 that relied on FMOD, borrowing inspiration from parallelcube.

This worked for a basic frequency analyzer!
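
For context, a basic frequency analyzer boils down to turning a window of time-domain samples into per-bin magnitudes. The sketch below is a minimal, generic illustration of that transform, not the FMOD-based code from the project; a real analyzer would use an FFT rather than the naive DFT shown here.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Turn a window of time-domain samples into per-bin magnitudes.
// Naive DFT for clarity; a real analyzer would use an FFT.
std::vector<float> MagnitudeSpectrum(const std::vector<float>& Samples)
{
    const std::size_t N = Samples.size();
    const double Pi = 3.141592653589793;
    std::vector<float> Magnitudes(N / 2, 0.0f);   // bins above Nyquist are redundant

    for (std::size_t k = 0; k < N / 2; ++k)
    {
        double Real = 0.0;
        double Imag = 0.0;
        for (std::size_t n = 0; n < N; ++n)
        {
            const double Angle = 2.0 * Pi * static_cast<double>(k) * static_cast<double>(n)
                                 / static_cast<double>(N);
            Real += Samples[n] * std::cos(Angle);
            Imag -= Samples[n] * std::sin(Angle);
        }
        // Bin k corresponds to frequency k * SampleRate / N.
        Magnitudes[k] = static_cast<float>(std::sqrt(Real * Real + Imag * Imag)
                                           / static_cast<double>(N));
    }
    return Magnitudes;
}
```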

However, after some experimentation, FMOD proved to be an “all-in” kind of solution to audio analysis, and I was after something more flexible. I later switched from my custom DSP library to an existing audio analysis plugin, mostly for the ability to process external audio streams (i.e., PC audio) rather than an imported file. This let me focus my effort on the visualizations.

Visualizing the Frequency Spectrum

I started by creating a ReactiveVisualizer interface and a manager which pipes the audio analysis to any implementations that exist in the scene. The first implementation was a simple exercise in moving from familiar frequency spectrum visualizations to actual 3D geometry. A procedurally generated row of cubes corresponds to various frequency ranges. Each cube grows in size based on the intensity/amplitude of the corresponding frequency. I extended these principles to 2D and spherical coordinates as well.

Frequency Spectrum visualization for 1D, 2D and spherical geometry.
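
The sketch below shows the shape of that interface/manager pattern. The names (IReactiveVisualizer, VisualizerManager, OnSpectrumUpdated, CubeRowVisualizer) and the plain-C++ form are illustrative assumptions; the in-project version is built on Unreal types and a UE4 interface.

```cpp
#include <cstddef>
#include <vector>

// Interface implemented by anything that wants to react to the audio analysis.
class IReactiveVisualizer
{
public:
    virtual ~IReactiveVisualizer() = default;
    // Called once per analysis frame with the latest magnitude spectrum.
    virtual void OnSpectrumUpdated(const std::vector<float>& Spectrum) = 0;
};

// Manager that pipes each analysis frame to every registered visualizer.
class VisualizerManager
{
public:
    void Register(IReactiveVisualizer* Visualizer) { Visualizers.push_back(Visualizer); }

    void BroadcastSpectrum(const std::vector<float>& Spectrum)
    {
        for (IReactiveVisualizer* Visualizer : Visualizers)
        {
            Visualizer->OnSpectrumUpdated(Spectrum);
        }
    }

private:
    std::vector<IReactiveVisualizer*> Visualizers;
};

// Example implementation: a row of cubes where cube i scales with the
// average amplitude of its frequency band.
class CubeRowVisualizer : public IReactiveVisualizer
{
public:
    explicit CubeRowVisualizer(std::size_t NumCubes) : Scales(NumCubes, 1.0f) {}

    void OnSpectrumUpdated(const std::vector<float>& Spectrum) override
    {
        const std::size_t BinsPerCube = Spectrum.size() / Scales.size();
        if (BinsPerCube == 0) { return; }

        for (std::size_t i = 0; i < Scales.size(); ++i)
        {
            float Sum = 0.0f;
            for (std::size_t b = 0; b < BinsPerCube; ++b)
            {
                Sum += Spectrum[i * BinsPerCube + b];
            }
            // Base size plus the band's average amplitude.
            Scales[i] = 1.0f + Sum / static_cast<float>(BinsPerCube);
        }
        // In UE4 these scales would be written back to the instanced static mesh transforms.
    }

    std::vector<float> Scales;   // per-cube scale factors, consumed by the rendering side
};
```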

Details include:

  • Procedurally generated grid
  • Instanced static meshes for performance
  • Can be modified in real-time
  • Implemented various interpolation methods to yield smoother or more pixelated visuals
  • Grid visualization styles (row, column, radial, inverse radial, etc.)
  • Spherical visualization styles (Fibonacci sphere index, equator & inverse, latitude & inverse, longitude & inverse, lat+long & inverse); see the Fibonacci sphere sketch after this list
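
For the spherical layouts, the Fibonacci sphere places N points roughly evenly over a sphere using the golden angle, and each point becomes a cube position. This is a generic sketch of that placement; the function and type names are mine, not the plugin's.

```cpp
#include <cmath>
#include <vector>

struct Vec3 { float X, Y, Z; };

// Distribute Count points roughly evenly on a unit sphere using the
// Fibonacci (golden-angle) spiral; point i becomes the position of cube i.
std::vector<Vec3> FibonacciSpherePoints(int Count)
{
    std::vector<Vec3> Points;
    Points.reserve(Count);

    const float GoldenAngle = 3.14159265f * (3.0f - std::sqrt(5.0f));   // ~2.39996 rad

    for (int i = 0; i < Count; ++i)
    {
        // Y runs from +1 to -1 so the points cover pole to pole.
        const float Y      = 1.0f - 2.0f * (i + 0.5f) / Count;
        const float Radius = std::sqrt(1.0f - Y * Y);   // radius of the latitude ring at this Y
        const float Theta  = GoldenAngle * i;           // longitude advances by the golden angle

        Points.push_back({ Radius * std::cos(Theta), Y, Radius * std::sin(Theta) });
    }
    return Points;
}
```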

Adding Interaction

The goal is interactivity, and up until now all these visuals have been reactive but not interactive. I started with a simple experiment where the intensity/behavior of a reactive visualizer is influenced by the proximity of a target object.
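
A minimal sketch of that idea, assuming a simple linear falloff between an inner and outer radius; the function names, falloff curve, and radii are illustrative, not the project's exact code.

```cpp
#include <algorithm>

// Scale a visualizer's response by how close a target (e.g. a tracked hand
// controller) is to it: 1.0 inside InnerRadius, 0.0 beyond OuterRadius,
// linear falloff in between.
float ProximityFactor(float DistanceToTarget, float InnerRadius, float OuterRadius)
{
    const float T = (DistanceToTarget - InnerRadius) / (OuterRadius - InnerRadius);
    return 1.0f - std::clamp(T, 0.0f, 1.0f);
}

// Applied per frame: amplitude from the audio analysis, attenuated by proximity.
// The radii here are arbitrary example values.
float ReactiveIntensity(float BandAmplitude, float DistanceToTarget)
{
    return BandAmplitude * ProximityFactor(DistanceToTarget,
                                           /*InnerRadius=*/50.0f,
                                           /*OuterRadius=*/500.0f);
}
```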

I then extended this experiment by using the proximity of tracked hand controllers to control the behavior/intensity of the reactive visualizer. Unfortunately, I don't have a video, as desktop capture in VR is wonky. It was cool though - trust me!

Adding Dimensions to the Medium: Playing Video

I was playing around with video textures for another project (interactive cloth simulation), and thought it would be fun to try adding a dimension to what is traditionally a very 2D medium (video).
