Tim Loderhose

Daemonium Machinae | Robotic Cello

A Finis Musicae project for Mercer Labs in NYC. Two robotic arms play a cello: one holds the bow, the other carries an end effector that fingers pitches. A podium with a hand-tracking camera lets visitors control the robots in real time.

I wrote the software for this project, including (re-)calibration of the robots and cello, motion design and planning, MIDI sync, and real-time control from camera input in TouchDesigner. The cello can play pieces fed to it via MIDI, with smooth handover occurring when moving between direct motion control and other motion. The robots calibrate themselves using a combination of sound feedback, force-torque sensing, and a calibration camera mounted on the bow holder. The project uses SciPy for 3D transformation logic, JAX for inverse rendering in the eye-in-hand camera calibration, and UR-RTDE for communication with the UR5e robots.
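The 3D transformation logic boils down to composing calibrated poses. A minimal sketch of that step, using SciPy's `Rotation` (the pose values and frame names here are illustrative, not real calibration data):

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def make_pose(rotvec, translation):
    """Build a 4x4 homogeneous transform from an axis-angle rotation
    vector (the convention UR robots use) and a translation in metres."""
    T = np.eye(4)
    T[:3, :3] = R.from_rotvec(rotvec).as_matrix()
    T[:3, 3] = translation
    return T

# Hypothetical calibration result: the pose of the cello frame
# expressed in the robot base frame.
base_T_cello = make_pose(rotvec=[0.0, 0.0, np.pi / 2],
                         translation=[0.4, -0.1, 0.2])

# A fingering target in the cello frame, e.g. a point on the
# fingerboard for a given string and pitch.
target_in_cello = np.array([0.02, 0.35, 0.0, 1.0])

# Map the target into the base frame to command the arm.
target_in_base = base_T_cello @ target_in_cello
```

Keeping every pose as a homogeneous transform makes re-calibration cheap: when the cello moves, only `base_T_cello` needs updating and all fingering targets follow.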

mercer.webp

Real estate property search using LLMs

For a US client, I used LLM structured outputs (via OpenAI's API) to build property search from natural-language queries. The search handles a huge variety of differently worded real-estate queries and incorporates location awareness (via the Google Maps API) as well as isochrone maps for travel-time-based queries. FastAPI/uvicorn serve tens of thousands of requests every day.
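The core of structured outputs is constraining the model to a fixed schema, then deserialising its JSON into typed fields. A minimal sketch with a hypothetical schema (the field names are illustrative, not the client's actual ones):

```python
import json
from dataclasses import dataclass
from typing import Optional

@dataclass
class PropertyQuery:
    """Hypothetical structured-output schema for a parsed search query."""
    location: Optional[str] = None
    max_price: Optional[int] = None
    min_bedrooms: Optional[int] = None
    property_type: Optional[str] = None
    max_commute_minutes: Optional[int] = None  # drives an isochrone lookup

def parse_llm_output(raw_json: str) -> PropertyQuery:
    # In production this would be the JSON the LLM returns under a
    # JSON-schema constraint; here we just deserialise it.
    return PropertyQuery(**json.loads(raw_json))

query = parse_llm_output(
    '{"location": "Austin, TX", "max_price": 450000,'
    ' "min_bedrooms": 3, "max_commute_minutes": 20}'
)
```

Fields the user did not mention stay `None`, so downstream search filters are only applied when the query actually constrains them.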

Faces

An iOS app developed for a client in 2020 which guides the user through taking a number of selfies, captured alongside depth and head-segmentation information and used to reconstruct a photo-realistic 3D model of the subject's head. Reconstruction happens in PyTorch on a server and takes around a minute (depending on the resolution and the number of pictures used). Head shape is fitted using the FLAME 3D morphable model, and facial reflectance is optimized by backpropagating through a physically-based rendering pipeline. The custom shader is implemented in PyTorch3D and fits de-lit textures, including normal maps.
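The inverse-rendering idea can be illustrated in miniature: recover a de-lit albedo value by gradient descent on a photometric loss under a known (hypothetical) shading term. The real pipeline optimizes full texture maps through a PyTorch3D shader; this toy uses a single scalar so the gradient can be written by hand:

```python
# Fit a de-lit albedo so that albedo * shading matches the observed
# pixel. Values are illustrative, not from the actual pipeline.
observed = 0.6   # pixel intensity from a selfie
shading = 0.75   # known Lambertian shading factor for that pixel

albedo = 0.5     # initial guess for the de-lit texture value
lr = 0.5
for _ in range(200):
    rendered = albedo * shading                 # forward render
    grad = 2 * (rendered - observed) * shading  # d(loss)/d(albedo)
    albedo -= lr * grad                         # gradient step

# albedo converges towards observed / shading = 0.8
```

Because shading is divided out by the optimization, the fitted texture stores reflectance rather than baked-in lighting, which is what makes it relightable.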

faces.webp

Ghostnote

Project to regress drum-hit location from multi-channel microphone audio. Using two overhead microphones and one close microphone, the location of hits on the drumhead can be trilaterated. By running real-time onset detection across the microphone channels, we can augment an acoustic drumkit with electronic samples in real time, mapping parameters of our choosing to X-Y coordinates on the drumhead.
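The trilateration step itself is linear once per-mic distances are in hand (e.g. onset-time delays times the speed of sound). A minimal sketch with a hypothetical mic layout:

```python
def trilaterate_2d(mics, dists):
    """Solve for the 2D hit position given three mic positions and
    distances to the hit. Subtracting the circle equations pairwise
    removes the quadratic terms and leaves a 2x2 linear system."""
    (x1, y1), (x2, y2), (x3, y3) = mics
    d1, d2, d3 = dists
    # 2*(p2 - p1) . x = (d1^2 - d2^2) + (|p2|^2 - |p1|^2), same for p3
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 + y2**2 - x1**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 + y3**2 - x1**2 - y1**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det,
            (a11 * b2 - a21 * b1) / det)

# Hypothetical geometry: two overheads and a close mic around a drumhead
# (coordinates in metres, origin at the drumhead centre).
mics = [(-0.2, 0.3), (0.2, 0.3), (0.0, -0.25)]
hit = (0.05, 0.1)
dists = [((hit[0] - mx)**2 + (hit[1] - my)**2) ** 0.5 for mx, my in mics]
estimate = trilaterate_2d(mics, dists)  # recovers approximately (0.05, 0.1)
```

In practice the microphones only give arrival-time *differences*, so the distances carry a shared unknown offset; the repos above deal with that calibration, but the linear-solve core is the same.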

https://github.com/timlod/ghostnote/

https://github.com/timlod/onset-fingerprinting

Loopmate

A prototype loopstation developed in Python on top of my own real-time sound engine built on PortAudio (using python-sounddevice). It includes novel features such as near-real-time beat analysis (used for correcting loop boundaries) and back-recording, which allows loops to be added retroactively.
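Back-recording can be sketched with a ring buffer: audio is written continuously, so a loop can be captured after the fact, once the player decides they liked what they just played. Class and method names here are illustrative, not loopmate's actual API:

```python
from collections import deque

class BackRecorder:
    """Continuously records into a fixed-size ring buffer so that the
    most recent audio can be turned into a loop retroactively."""

    def __init__(self, max_samples):
        # deque with maxlen silently discards the oldest samples.
        self.ring = deque(maxlen=max_samples)

    def audio_callback(self, block):
        # Called by the sound engine for every incoming audio block.
        self.ring.extend(block)

    def capture_last(self, n_samples):
        # Turn the most recent n_samples into a new loop.
        return list(self.ring)[-n_samples:]

rec = BackRecorder(max_samples=8)
for block in ([0, 1], [2, 3], [4, 5], [6, 7], [8, 9]):
    rec.audio_callback(block)
loop = rec.capture_last(4)  # -> [6, 7, 8, 9]
```

The beat analysis then decides where within the captured window the loop boundaries should snap, so the retroactive loop still lines up with the existing ones.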

https://github.com/timlod/loopmate

Audio Effects/Plugins

DLFX (deep learning audio effects) originated with my Master's thesis, in which I recreated analogue guitar distortion pedals (like the PLASMA pedal) using neural networks.
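For context, the classical baseline such neural models improve on is a static waveshaper, which clips each sample independently; a trained network can additionally capture a pedal's dynamic, state-dependent behaviour. A minimal sketch (the `drive` parameter is illustrative):

```python
import math

def waveshaper(samples, drive=5.0):
    """Static tanh waveshaper: a memoryless nonlinearity applied
    per sample, the textbook starting point for distortion modelling."""
    return [math.tanh(drive * s) for s in samples]

clean = [0.0, 0.1, 0.5, -0.8]
distorted = waveshaper(clean)
```

Because this mapping has no memory, it cannot reproduce effects like sag or hysteresis, which is exactly where the learned models earn their keep.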

I also rapidly prototype plugins in Python and develop them for production using the JUCE framework.