Easily play combinations of sound effects and Taptic Engine vibrations on iOS.
Platform: iOS 10.0+ | Language: Swift 4 | CocoaPods compatible | Carthage compatible | License: MIT

Installation | Usage | Documentation | Why I Built Piano | License | Contribute | Questions? | Credits

Piano is a convenient, easy-to-use wrapper around AVFoundation and UIKit's haptic feedback APIs, leveraging the full capabilities of the Taptic Engine while following Apple's guidelines to preserve battery life. Ultimately, Piano allows you, the composer, to conduct masterful symphonies of sounds and vibrations, and to create a more immersive, usable, and meaningful user experience in your app or game.


Installation

Piano requires iOS 10+ and is compatible with Swift 4.2 projects.


Support Piano's contributors with a monthly subscription on https://gitroyalty.com/saoudrizwan/Piano to install this package.

Subscribe on GitRoyalty
* comes with a 2 week free trial and can be cancelled anytime


Usage

Using Piano is simple.

let symphony: [Piano.Note] = [
    .sound(.asset(name: "acapella")),
    .vibration(.default)
]
Piano.play(symphony)


... or better yet:

Piano.play([
    .sound(.asset(name: "acapella"))
])

Optionally add a completion block to be called when all the notes are finished playing:

Piano.play([
    .sound(.asset(name: "acapella"))
]) {
    // all notes have finished playing
}

Or cancel the currently playing symphony:


In the background, each note has an internal completion block, so you can add a .waitUntilFinished note that tells Piano not to play the next note until the previous note has finished. This is useful for creating custom haptic feedback patterns beyond the ones Apple predefined, and for building complex combinations of sound effects and vibrations.
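For example, the sketch below plays a sound to completion and then a strong tap; the asset name "chime" is a placeholder for one of your own sound assets:

```swift
Piano.play([
    .sound(.asset(name: "chime")),    // placeholder asset name
    .waitUntilFinished,               // hold the next note until the sound ends
    .hapticFeedback(.impact(.heavy))  // then play a strong tap (iPhone 7+)
])
```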



.sound

Plays an audio file.

  • .asset(name: String) — The name of a sound asset in any of your .xcassets catalogs. It's recommended to add your sound files to Asset Catalogs instead of as standalone files in your main bundle.
  • .file(name: String, extension: String) — Retrieves a file from the main bundle. For example, a file named Beep.wav would be accessed with .file(name: "Beep", extension: "wav").
  • .url(URL) — Plays a file URL. This only works for file URLs, not network URLs.
  • .system(SystemSound) — Predefined system sounds included on every iPhone. See all available options here.


.vibration(Vibration)

Plays standard vibrations available on all models of the iPhone.

  • .default — Basic one-second vibration
  • .alert — Two short consecutive vibrations


.tapticEngine(TapticEngine)

Plays Taptic Engine vibrations available on the iPhone 6S and above.

  • .peek — One weak boom
  • .pop — One strong boom
  • .cancelled — Three sequential weak booms
  • .tryAgain — One weak boom, then one strong boom
  • .failed — Three sequential strong booms


.hapticFeedback(HapticFeedback)

Plays Taptic Engine haptic feedback available on the iPhone 7 and above.

  • .notification(Notification) — Communicates that a task or action has succeeded, failed, or produced a warning of some kind.
      • .success — Indicates that a task or action has completed successfully.
      • .warning — Indicates that a task or action has produced a warning.
      • .failure — Indicates that a task or action has failed.
  • .impact(Impact) — Indicates that an impact has occurred. For example, you might trigger impact feedback when a user interface object collides with something or snaps into place.
      • .light — Provides a physical metaphor representing a collision between small, light user interface elements.
      • .medium — Provides a physical metaphor representing a collision between moderately sized user interface elements.
      • .heavy — Provides a physical metaphor representing a collision between large, heavy user interface elements.
  • .selection — Indicates that the selection is actively changing. For example, the user feels light taps while scrolling a picker wheel.

See: Apple's Guidelines for using Haptic Feedback
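For instance, a minimal sketch confirming a completed task with success feedback (on devices that support Haptic Feedback):

```swift
Piano.play([
    .hapticFeedback(.notification(.success))
])
```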


.waitUntilFinished

Tells Piano to wait until the previous note is done playing before playing the next note.


.wait

Tells Piano to wait a given duration before playing the next note.
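A custom double-tap pattern, for example, might be sketched as follows. Note that passing the duration in seconds directly as the wait note's associated value is an assumption here — Option + click the note in Xcode for the exact signature:

```swift
Piano.play([
    .hapticFeedback(.impact(.medium)),
    .wait(0.2),                        // 0.2-second pause (signature assumed)
    .hapticFeedback(.impact(.medium))
])
```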

Device Capabilities

  • The iPhone 6S and 6S Plus carry the first-generation Taptic Engine, which supports a handful of "haptic" vibration patterns; you can play these with Piano using the .tapticEngine() notes.

  • The iPhone 7 and above carry the latest version of the Taptic Engine which supports the iOS 10 Haptic Feedback frameworks, allowing you to select from many more vibration types. You can play these vibrations using the .hapticFeedback() notes.

  • All versions of the iPhone can play the .vibration() notes.

Piano also includes a useful extension for UIDevice to check if the user's device has a Taptic Engine and if it supports Haptic Feedback. This extension is especially useful for creating symphonies for all devices:

if UIDevice.current.hasHapticFeedback {
    // use .hapticFeedback(HapticFeedback) notes
} else if UIDevice.current.hasTapticEngine {
    // use .tapticEngine(TapticEngine) notes
} else {
    // use .vibration(Vibration) notes
}
Note: This extension does not work on simulators; hasTapticEngine and hasHapticFeedback always return false there.

Taptic Engine Guide

Apple's guide to the Haptic Feedback framework is very clear about using the Taptic Engine appropriately in order to preserve the user's battery life. Piano was built with this in mind and handles most cases as efficiently as possible, but you can help preserve battery life and reduce latency further by calling these helper methods based on your specific needs.

1. Wake up the Taptic Engine

Piano.wakeTapticEngine()

This initializes and allocates the Haptic Feedback framework, essentially "waking up" the Taptic Engine from its normal idle state. A good place to call this is at the begin state of a gesture or action, in anticipation of playing a .hapticFeedback() note.

2. Prepare the Taptic Engine


This tells the Taptic Engine to prepare itself before creating any feedback to reduce latency when triggering feedback.

From Apple's documentation:

This is particularly important when trying to match feedback to sound or visual cues. To preserve power, the Taptic Engine stays in this prepared state for only a short period of time (on the order of seconds), or until you next trigger feedback. Think about when and where you can best prepare your generators. If you call prepare and then immediately trigger feedback, the system won’t have enough time to get the Taptic Engine into the prepared state, and you may not see a reduction in latency. On the other hand, if you call prepare too early, the Taptic Engine may become idle again before you trigger feedback.

tl;dr A good place to put this is right after calling .wakeTapticEngine(), usually at the beginning of a gesture or action, in anticipation of playing a .hapticFeedback() note.

3. Put the Taptic Engine back to Sleep


Once we know we're done using the Taptic Engine, we can deallocate the Haptic Feedback framework, returning the Taptic Engine to its idle state. A good place to put this is at the end of a finished, cancelled, or failed gesture or action.
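Taken together, the three steps might look like the sketch below inside a gesture handler. wakeTapticEngine() is named in this guide, but prepareTapticEngine() and putTapticEngineToSleep() are assumed method names following the same pattern — Option + click Piano's methods for the exact API.

```swift
import UIKit

// Sketch of the wake/prepare/sleep lifecycle around a gesture.
// prepareTapticEngine() and putTapticEngineToSleep() are assumed names.
func handlePan(_ gesture: UIPanGestureRecognizer) {
    switch gesture.state {
    case .began:
        Piano.wakeTapticEngine()       // 1. allocate the feedback generators
        Piano.prepareTapticEngine()    // 2. reduce latency before triggering
    case .ended:
        Piano.play([.hapticFeedback(.impact(.medium))])
        Piano.putTapticEngineToSleep() // 3. return the engine to idle
    case .cancelled, .failed:
        Piano.putTapticEngineToSleep()
    default:
        break
    }
}
```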

But you don't have to.

Piano automatically wakes and prepares the Taptic Engine when you call .play([ ... ]) with a symphony that includes a .hapticFeedback() note, and puts the Taptic Engine back to sleep when the notes finish playing.

The Example App

The example app is a great place to get started. It's designed as a playground for you to compose and test out your own symphonies of sounds and vibrations.


You can even drag and drop your own sound files into the project and tweak the code a bit to see how your own sounds can work alongside the Taptic Engine. To add your own sound file, simply drag it into Sounds.xcassets, name it accordingly, then edit the cellData property in ViewController.swift (Scroll down to case 7 in cellData, or look for "Add your own sound assets here..." in the Jump Bar using Ctrl + 6).


Option + click on any of Piano's methods or notes for detailed documentation.

Why I Built Piano

With the new iPhone 8 and iPhone X, we are going to see many new Augmented Reality apps, and one of the key points in the Human Interface Guidelines for AR is to avoid cluttering the AR view, allowing as much of the augmented content to be displayed as possible. Beyond AR, Apple has spent tremendous time and manpower giving the iPhone an interface beyond our vision with the Taptic Engine and Siri, and even held a session during WWDC 2017 on the importance of sound design and the impact it can have on a user experience. It's clear that the future of technology is not visual interfaces, but augmenting our connection with the real world. By engaging our physical and auditory senses alongside our visual ones, we can see the world in a whole new light. That's why I built Piano and ARLogger, frameworks I hope will help developers create immersive and uncluttered interfaces while keeping the user aware of the technology's state and purpose. If you'd like my help on an AR project, or just want to chat about the future of technology, don't hesitate to reach out to me on Twitter @sdrzn.


Piano uses the MIT license. Please file an issue if you have any questions or if you'd like to share how you're using Piano.


Please feel free to create issues for feature requests or send pull requests of any additions you think would complement Piano and its philosophy.


Contact me by email at hello@saoudmr.com or on Twitter @sdrzn. Please create an issue if you come across a bug or would like a feature added.

