I've been working on a metronome app for quite some time now. Following Apple's sample Metronome application, I've been using a timer on a high-priority background thread to play system sounds. My results are okay, but far from perfect. Some issues I face are:
The OS X audio libraries are collectively referred to as "Core Audio". There are a few different ways of interacting with the audio hardware and the Core Audio frameworks.

There's OpenAL (as mentioned in another comment), which is a cross-platform standard approach to audio, much like OpenGL is for graphics. There are also Audio Units, a callback-based interface directly to the audio hardware that lets you supply raw PCM audio in real time. Audio Units are extremely time-constrained, so if you're looking for metronomic rhythm, Audio Units can do that for you. There's also a layer on top of Audio Units known as AudioQueue, which allows you to schedule a stream of audio for playback in real time. One of the benefits of AudioQueue is that it can automatically decode compressed formats such as MP3 or AAC.
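The reason Audio Units give you metronomic rhythm is that the render callback is asked for buffers of PCM frames at a known sample rate, so you can compute exactly which sample frame each click falls on instead of relying on a timer firing on schedule. Here's a minimal sketch of that arithmetic in plain C; the function names (`frame_for_beat`, `beat_offset_in_buffer`) are my own illustrations, not Core Audio APIs, and in a real app you'd call something like this from inside your `AURenderCallback`:

```c
#include <stdint.h>

/* Sketch: sample-accurate click scheduling for a metronome.
   Assumes a fixed 44.1 kHz sample rate for illustration. */

static const double kSampleRate = 44100.0;

/* Frame index at which beat n should sound, for a given tempo in BPM. */
uint64_t frame_for_beat(uint64_t beat, double bpm) {
    double seconds_per_beat = 60.0 / bpm;
    return (uint64_t)((double)beat * seconds_per_beat * kSampleRate + 0.5);
}

/* Given the first frame of the buffer the render callback is currently
   filling and the buffer's length in frames, report whether the next beat
   lands inside this buffer, and if so at which offset the click should
   be mixed in. Returns 1 if the beat falls in this buffer, else 0. */
int beat_offset_in_buffer(uint64_t buffer_start, uint32_t frame_count,
                          double bpm, uint64_t next_beat, uint32_t *offset) {
    uint64_t beat_frame = frame_for_beat(next_beat, bpm);
    if (beat_frame >= buffer_start && beat_frame < buffer_start + frame_count) {
        *offset = (uint32_t)(beat_frame - buffer_start);
        return 1;
    }
    return 0;
}
```

For example, at 120 BPM a beat falls every 0.5 s, i.e. every 22050 frames; beat 2 (frame 44100) lands 196 frames into a 512-frame buffer that starts at frame 43904. Because the click is placed by frame index rather than wall-clock timer, jitter in when the callback fires never shifts where the click sounds.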
Search for "Core Audio Overview" in the docs to learn more about the various audio frameworks for iPhone.