Ableton Push - Qt in music making

Berlin-based music company Ableton recently released the new Push, a digital instrument for creating and performing music. It is a really impressive tool; take a look:

As the Push UI is built with Qt, we had a chance to ask Stephen Kelly, Benjamin Jefferys and Nikolai Wuttke - developers at Ableton - about how it was built and how it works.

Ableton has been around for a long time, where did it all start?

It all started in an apartment in Berlin back in 1999. The three founders – Gerhard Behles, Bernd Roggendorf and Robert Henke – wanted to create an alternative to the linear workflow of established DAWs (Digital Audio Workstations). In 2001 Ableton released Live 1 and moved into the workspace that we still occupy today – just a few streets from the apartment where the first lines of code were written. We have since released nine major versions of Live, and in 2013 we announced our first hardware instrument: Push, which was designed by Ableton but manufactured by Akai. This was well received by the community, but we wanted to evolve the concept so we completely redesigned Push and took hardware engineering in-house. We released the new Push at the beginning of November, together with Live 9.5 and Link.

You mentioned Live and Push, can you give a short overview of the different products you have, and how Qt fits in there?

We make three products: Live, Push and Link. Live is software for creating musical ideas, turning them into finished songs, and also taking them onto the stage. Push is a hardware instrument that puts everything you need to make music with Live at your fingertips. We've also recently developed Link - a technology that keeps devices in time over a wireless network. It is embedded within Live but it is also inside an increasing number of iOS music-making apps. The new Push is our first released product using Qt technology – for its animated display.

Can you give a general overview of the architecture used in Live/Push?

The updated Push hardware is designed to enable new workflows for music makers. Part of the intention of the hardware product is to let a music maker stay in the creative flow without needing to turn to a different interface on a computer. The hardware needs to show the user everything they need to see in order to make music, from detailed waveforms to browsing for samples.

The interaction between the Push hardware and Live is a little complicated, so let's start with a diagram:

[Push architecture diagram]

Push communicates with the computer via two protocols over a USB interface, one of which is standard MIDI. This is a bidirectional protocol: Live uses it to control the pad colors and button lights on the hardware, and when the user presses a button or a pad, a signal is sent over the MIDI channel back to Live. The other protocol carried over USB sends image data to the hardware display. This means Push doesn't need a processor running a more complex operating system and application to show GUI elements on the screen; the processor on the Push hardware is optimized solely for handling user input from the pads and buttons. So although Qt is used as the display technology on Push, it is not actually running on the device. When Live and Push are used together, Live starts another process which is responsible for rendering all of the content shown on the Push screen.

This architecture forces a strong implementation of the Model-View-ViewModel (MVVM) pattern. MVVM emphasises separating the lowest level of the data representation, the Model, from the representation intended to appear in a View: 'the model of the view', or ViewModel. Using QObjects with signalling functionality for the ViewModel, and the same system for the View in QML, gives a high level of compatibility between the components of the design.
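
As a rough sketch of that split (the names trackViewModel, trackName and isSelected are invented for illustration; the ViewModel would be a QObject exposed to the engine, for example as a context property), the View side stays purely declarative:

```qml
// View.qml - a minimal sketch, assuming the display process exposes a
// hypothetical ViewModel QObject as the context property `trackViewModel`.
import QtQuick 2.5

Rectangle {
    width: 960; height: 160; color: "black"

    Text {
        anchors.centerIn: parent
        // Declarative bindings: whenever the ViewModel emits a property's
        // NOTIFY signal, the View updates itself; no event handling here.
        text: trackViewModel.trackName
        color: trackViewModel.isSelected ? "yellow" : "white"
    }
}
```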

The data model of what is shown on Push - the values of parameters and the content of a song - is defined in the Live 9.5 process and then shared with the Push display process over an IPC channel, using a JSON protocol. The Push display process creates a ViewModel - QObjects with properties representing what should be shown on screen - and makes it available in the context of a QML engine, which then defines the View for the Push screen. The rendered pixels are converted to the pixel format expected by the hardware and then transferred over USB to Push.
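
A hedged sketch of the ViewModel side of that flow, with an invented JSON shape and property names (the real protocol is Ableton's own, and in the real architecture the update would be driven from C++ rather than QML JavaScript):

```qml
// ViewModel.qml - a minimal sketch of the idea, not Ableton's protocol.
import QtQuick 2.5

QtObject {
    property string trackName: ""
    property real volume: 0.0

    // Apply one JSON update message received over the IPC channel.
    function applyUpdate(message) {
        var update = JSON.parse(message)
        if (update.trackName !== undefined) trackName = update.trackName
        if (update.volume !== undefined) volume = update.volume
        // Each assignment emits the property's change signal, so any
        // QML bindings on these properties refresh automatically.
    }
}
```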

[Image: On Push]

What were the biggest benefits of using Qt for Push?

We evaluated several GUI technologies when starting work on Push, such as HTML and JUCE. Qt stood out mainly for the QML language, the Qt Quick technology, the Scene Graph API and easy integration with OpenGL.

The obvious advantages of the QML language for rapid prototyping have really stood out for us, as well as the ability to extend it with plugins written in C++. Qt contains a broad range of ready-to-use APIs which we wouldn't have access to if we had written the UI in HTML, for example. We try to make sure we make full use of all Qt/QML functionality that is appropriate for us, and keep an eye on new releases to see what becomes available.

How did the team take the new tools? How was the learning curve?

Initially a small task force investigated the GUI technologies available and when Qt was selected for Push, that task force began sharing their knowledge with the rest of the team. From that point, we have heavily relied on the invaluable Qt documentation.

The Qt blog is also an important resource, because articles there often take a more philosophical approach to writing QML. While the Qt Forum or Stack Overflow can be useful resources for QML beginners on how to get specific things done with the available APIs, they often don't cover the principles and mindset required to write and design declaratively with QML.

Writing QML code as a beginner is relatively easy. Trying things out is cheap, and the interpreted nature of the language gives very fast feedback. However, figuring things out as we go does not always lead to the best design, so a more interesting question is 'how difficult is it to do QML right as a beginner?'.

There are certain traps that we found, such as over-use of procedural code like onFooChanged handlers to try to enforce a particular order of operations. It is better to write a declarative interface that tolerates an indeterminate order of operations, but learning exactly how to do that, and how to recognise the traps before we fall into them (or create new ones), is something that comes only with real-world experience!
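
A small sketch of the kind of trap meant here, with invented names: a procedural handler quietly depends on change signals arriving in the order the author imagined, while a binding states the relationship once and holds regardless of ordering.

```qml
// Sketch.qml - the trap and the fix side by side; all names invented.
import QtQuick 2.5

Item {
    property bool trackSelected: false

    // The trap: a procedural handler that only behaves correctly if
    // signals arrive in the expected order (and that would also break
    // the binding below by assigning over it):
    //
    // onTrackSelectedChanged: highlight.opacity = trackSelected ? 1.0 : 0.0

    Rectangle {
        id: highlight
        anchors.fill: parent
        color: "yellow"
        // The declarative fix: state the relationship once; it holds no
        // matter when, or in what order, trackSelected changes.
        opacity: trackSelected ? 1.0 : 0.0
    }
}
```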

Developers new to QML need to learn a different approach to programming which is declarative instead of imperative, and where things can change 'spontaneously' or in an unexpected order. As we grow the team working on Push here at Ableton, new developers might leave QML to others because of caution about 'doing it right', or worrying that 'there must be a better way'.

We keep extending our best practices on how to write QML code and how to interface it with non-Qt code, so we have accumulated some good guidelines by now.

Were there technical challenges in getting the remote screen to work properly?

The technical separation of the user interface from the screen requires an unusually rigorous implementation of the MVVM pattern. The screen is really just a View, and it doesn't offer any way for the user to interact with it directly, so there is no way to take convenient shortcuts on the hardware in response to a user interaction.

That separation means no GUI events are handled in QML at all: no mouse, keyboard, button pushes or pad presses. The View for Push is simply a declarative response to ViewModel updates received from Live, so changes are represented as 'this is now the selected track' rather than 'this track was just clicked'.
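
For example (with a hypothetical songViewModel carrying a list of track names and a selectedTrackIndex), the View can render the selection without any input handling at all:

```qml
// A sketch of the idea: the View never sees 'this track was clicked';
// it only reflects ViewModel state. All names here are invented.
import QtQuick 2.5

Row {
    spacing: 16
    Repeater {
        model: songViewModel.trackNames
        delegate: Text {
            text: modelData
            // Purely a response to state: no MouseArea, no key handling.
            color: index === songViewModel.selectedTrackIndex ? "yellow"
                                                              : "white"
        }
    }
}
```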

However, user-interaction hints do sometimes need to be part of the communication with the display process. For example, a single long list of items is broken up into a multi-column list, and changes to the 'currently highlighted item' are animated vertically through the adjacent column when the user scrolls.

The user can also use directional buttons on the hardware to change the selection horizontally, which calls for a horizontal animation. So it is not enough to update the ViewModel with the information that the selected item is now 'My Heart Will Go On'; whether that change resulted from horizontal or vertical movement must also be conveyed in the protocol.
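
A sketch of how such a hint might drive the View, assuming a hypothetical moveDirection property alongside the selected row and column (none of these names are from the actual protocol):

```qml
import QtQuick 2.5

Rectangle {
    id: cursor
    width: 120; height: 20; color: "yellow"

    x: listViewModel.selectedColumn * width
    y: listViewModel.selectedRow * height

    // Animate whichever axis the protocol says the user moved along.
    Behavior on x {
        enabled: listViewModel.moveDirection === "horizontal"
        NumberAnimation { duration: 150 }
    }
    Behavior on y {
        enabled: listViewModel.moveDirection === "vertical"
        NumberAnimation { duration: 150 }
    }
}
```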

How about timing? Music making is real time, how does it all fit together?

The parts of Live that concern playback of music run in a separate thread to the UI, so that operations that can stall the Live UI, such as loading a large drum rack, do not stall the sound playback.

In a sense, the process separation for the Push display is an extension of this isolation from expensive operations in Live: if Live is stalled in an expensive operation, the Push display process remains responsive. It is not possible to have multiple independent GUI threads with Qt, so the multi-process solution is the next best thing. In an early iteration we did attempt to combine the event loop used in Live with the Qt event loop. Qt APIs made the integration possible, but it seemed impossible to achieve our 60 fps target for the Push GUI while the Live GUI shared the same 16 ms time slices.

In fact, there is another level of separation in the architecture for real-time GUI updates. The IPC mechanism described previously is used only for updates to the song definition and user-interface events. Real-time information such as the playhead progress uses a separate channel between the Live engine thread and the Push display process, bypassing the Live GUI thread entirely. This channel carries much lower traffic, so the playhead display on Push reflects real-time information more accurately.
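
On the QML side the distinction is invisible; a sketch, assuming a hypothetical transport object (fed by the real-time channel) with a progress value between 0 and 1:

```qml
import QtQuick 2.5

// `transport` stands in for an object updated from the separate
// real-time channel, not from the (possibly stalled) Live GUI thread.
Rectangle {
    width: 960; height: 4; color: "#333333"

    Rectangle {
        // The playhead simply binds to the real-time value.
        width: parent.width * transport.progress
        height: parent.height
        color: "white"
    }
}
```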

Did you find any other benefits of QML when creating the Push interface?

One of the impressive things about QML is how animations work and how easy they are to use. There is something slightly magical about them. They're so smooth and easy to add that we are not afraid of adding them - our designers sometimes have to restrain us!
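
A taste of that, as a minimal sketch with invented sizes: a single Behavior turns every change of a property into a smooth, interruptible animation.

```qml
import QtQuick 2.5

Rectangle {
    width: 40; height: 40; color: "yellow"

    // Wherever x is set from - a binding, a ViewModel update - the
    // movement is animated instead of jumping.
    Behavior on x {
        NumberAnimation { duration: 200; easing.type: Easing.InOutQuad }
    }
}
```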

Because our QML doesn't deal with user input events, the View is often heavier than it needs to be. Reusable generic widgets such as Button usually implement some form of mouse and keyboard handling which we don't need. Using primitives such as Rectangle could avoid the weight of Button features we don't use, but Button also has extra features we do use, such as text that fades out towards the edge and consistent styling. Additionally, we want to reuse code as far as possible. These concerns make it worthwhile to use the Button instead of a simpler alternative.
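
To illustrate the trade-off, a sketch of what a display-only alternative might look like (all names invented; Text.ElideRight stands in for the fade-out effect, which would need a shader or mask): even this stripped-down version ends up re-implementing styling that a shared Button component already provides.

```qml
// DisplayButton.qml - a hypothetical lightweight stand-in: styling and
// a label, but no mouse or keyboard handling at all.
import QtQuick 2.5

Rectangle {
    property alias label: labelText.text
    property bool selected: false

    width: 120; height: 28; radius: 4
    color: selected ? "#ffcc00" : "#222222"

    Text {
        id: labelText
        anchors.fill: parent
        anchors.margins: 4
        verticalAlignment: Text.AlignVCenter
        elide: Text.ElideRight   // stands in for the fade-out effect
        color: selected ? "black" : "white"
    }
}
```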

Any future plans on Push that you can share?

We are always looking for opportunities to develop our products, but I'm afraid we can't give information about future plans. As any product developer knows: things can change!

Thank you for sharing your experiences with Qt and QML on the Push. It really is an amazing piece of music hardware.

Thank you! We are very proud of Push and what it enables musicians to create. Thank you too, for taking the time to conduct this interview.

