Swift code helps get going quickly (AudioKit for iOS does this well) #10
Comments
Hi @loopbum - do you have a specific suggestion of what you would want to see with Swift bindings? There's only a single Objective-C class with a single factory method in LinkKit; the rest is C. I could imagine that a Swift playground for exploring the C API in some way would be cool to have, but I'm afraid experimenting with this is not terribly high priority for us right now. |
I'm more interested in getting it to work with AudioKit, since AudioKit is C and Objective-C based. C itself isn't the issue so much as C++, which will be addressed in Swift 3.x. The immediate use case for LinkKit: I haven't been able to pull AudioKit and LinkKit together, whereas Swift 2.x is very easy to use with AudioKit. |
+1 |
I've gotten ABLLink to work with Swift 3.0. I recoded the app classes in Swift 3 and used a Bridging-Header.h, which lets the Swift code call the C API directly. I also needed to add libABLLink.a and libc++.tbd to my Build Phases to get it to link. |
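For anyone following the same route, here is a minimal sketch of that setup. The class and property names are made up for illustration, and the LinkKit calls (ABLLinkNew, ABLLinkSetActive, ABLLinkSettingsViewController.instance) should be checked against the headers shipped in your copy of LinkKit:

```swift
// Bridging-Header.h (set as the Objective-C Bridging Header in Build Settings) contains:
//     #include "ABLLink.h"
//     #include "ABLLinkSettingsViewController.h"
// Build Phases > Link Binary With Libraries: libABLLink.a and libc++.tbd.

import UIKit

final class LinkWrapper {
    // Opaque C handle from ABLLink.h; 120.0 is the initial tempo in BPM.
    let linkRef = ABLLinkNew(120.0)

    func enable() {
        ABLLinkSetActive(linkRef, true)
    }

    // LinkKit's single Objective-C class, created through its factory method.
    func settingsViewController() -> UIViewController? {
        return ABLLinkSettingsViewController.instance(linkRef) as? UIViewController
    }
}
```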
@jasonjsnell is there a repo for this Swift class? |
I have it as a private repo because I assume Ableton doesn't want the code public (yet? Correct me if I'm wrong). I'll send you a private invite to it. |
Hi Robert,
I've used Link in Swift projects, but my underlying audio engine is still written in Objective-C and C to work with the audio thread (likely in a similar way to what Jason described). But I'm assuming you are really asking whether people have used Link when using AudioKit to build the audio engine in Swift, right? If so, then I think your question would be worth asking on the AudioKit forum directly, since:
1. It seems to be something that the AudioKit team has on their roadmap (https://github.com/audiokit/AudioKit/projects/2).
2. From a recent AudioKit forum thread (https://groups.google.com/forum/#!topic/audiokit/PVl0vKc7EcU) it looks like they still need to implement an interface for working at the audio buffer level in C. Quoting from that thread: "Getting custom C/C++ code working in the buffer callback has been a commonly requested feature. Right now, a formal interface for doing such things doesn't exist, but I see no reason why it can't. We just need the users who really need it to build it."
Julien
|
I was able to build an audio engine in Swift. I happen to be using a file player unit to play my sounds, but it is also possible with a render loop in Swift. It takes a few conversions in and out of unsafe raw pointers, but it can be done. |
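A hedged sketch of the raw-pointer round trip being described (the AudioEngine class is made up for illustration; it uses the Core Audio AURenderCallback shape, so check the exact closure signature against the AudioToolbox headers you build with). As later comments in this thread point out, keeping the body of such a callback real-time safe is the hard part:

```swift
import AudioToolbox

final class AudioEngine {
    var phase: Double = 0
}

// A C render callback cannot capture Swift context, so the engine is passed
// through the refCon raw pointer and recovered inside the callback.
let renderCallback: AURenderCallback = { inRefCon, _, _, _, inNumberFrames, ioData in
    let engine = Unmanaged<AudioEngine>.fromOpaque(inRefCon).takeUnretainedValue()
    // Fill ioData's buffers for inNumberFrames frames from `engine` state here.
    _ = engine
    return noErr
}

let engine = AudioEngine()
var callbackStruct = AURenderCallbackStruct(
    inputProc: renderCallback,
    inputProcRefCon: Unmanaged.passUnretained(engine).toOpaque()
)
// callbackStruct is then installed on an audio unit via
// AudioUnitSetProperty(..., kAudioUnitProperty_SetRenderCallback, ...).
```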
Hello, we have just released two Link-enabled apps. I am trying to find out where to register these apps, or how to let the Ableton Link community know of their existence.
Thank you for the help in advance.
Kind regards
Rikard
|
Could I receive an invite as well? I haven't started integrating Link yet
for my project, which is mostly written in Swift, with some Objective-C.
|
Sure, I'll do that now. I also reached out to Ableton to see if I can make that repo public... |
"Me too please", he says jumping up and down like an excited (if slightly lazy) 5 year old... |
I would be very interested too... |
Me too!
Thank you
|
Talking with Ableton now about making it public. I'll have news soon... |
Got the OK from Ableton, here is the Swift wrapper: |
Perfect timing: tomorrow I start converting my MIDI generator app to fully fledged sequencing. Thanks. |
Great - if you have any questions or need better comments in the code, let me know. |
Thanks Jason !
|
Sorry for being a party pooper, but using Swift in your audio thread is not something you want to do. Swift's runtime and memory allocation use locks, which can cause audio glitches. |
Hi, as the creator of AudioKit I just thought I'd chime in. First of all, people like @bangerang are absolutely correct: you should not do audio processing in Swift directly. AudioKit doesn't do this either; it's all at the C/C++ level. AudioKit just makes it easier to chain these nodes together on the Swift side to get a lot done quickly. And as @jbloit said, Link is on our roadmap, and thanks to @JoshuaBThompson we're moving it up the priority list! So, sorry to pounce on this issue discussion so late in the game, but happy to be here and working with Link! |
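As an illustration of that "chain nodes in Swift, process in C/C++" point, a minimal sketch using AudioKit-3/4-era names (AKOscillator, AKReverb, AudioKit.output, AudioKit.start()); the API has changed between releases, so verify these against the AudioKit version you actually use:

```swift
import AudioKit

// The DSP inside these nodes runs at the C/C++ level; Swift only wires them together.
let oscillator = AKOscillator()
let reverb = AKReverb(oscillator)

do {
    AudioKit.output = reverb
    try AudioKit.start()   // throwing variant in AudioKit 4.x; earlier versions do not throw
    oscillator.start()
} catch {
    print("AudioKit failed to start: \(error)")
}
```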
I agree. C and C++ are always going to be more efficient than Swift. For my uses, the Swift code hasn't caused any audio glitches for me - I have 32 channels of sound, but they are short 1-5 second samples (a drum machine, in essence). I haven't done any experimenting with longer-form audio samples, streams, etc. |
Chiming in here too... as the developer of AUM, I'm a bit concerned about music apps' ability to work well together :) C/C++ vs. Swift in the audio thread is not about efficiency; it's about real-time safety. Some calls and code paths do things that are not safe to do in the audio thread. Swift and Objective-C do this all the time, since their messaging system is itself the culprit. You must use C or C++ in the audio thread, and even then there's stuff in C/C++ you need to avoid: anything that can block or take locks. So no memory allocation/freeing, no disk or network I/O, etc. I often hear the argument "I know, I know... but for me it works, I haven't noticed any problem". This might be true when running your app standalone, but not when it's combined in a mixing environment where lots of apps need to run during the same audio render slice and share the resources. If any app blocks the audio thread, it ruins the party for all apps in the configuration. For people playing around on the couch, some glitches and audio drop-outs might be OK. For musicians playing live on stage, not so much... |
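To make that constraint concrete, here is a hedged sketch of the rule (written in Swift only for readability; per this thread the real render path should live in C/C++, and all names here are made up for illustration):

```swift
// Everything the render path touches is allocated up front, before audio starts.
final class RenderState {
    var gain: Float = 1.0
    let scratch: UnsafeMutablePointer<Float>

    init(capacity: Int) {
        scratch = UnsafeMutablePointer<Float>.allocate(capacity: capacity)
        scratch.initialize(repeating: 0, count: capacity)
    }

    deinit {
        scratch.deallocate()
    }
}

// The render path only does bounded work on pre-allocated memory.
func render(state: RenderState, out: UnsafeMutablePointer<Float>, frameCount: Int) {
    for i in 0..<frameCount {
        out[i] = state.scratch[i] * state.gain
    }
    // Not allowed here, because all of these can block for an unbounded time:
    // - allocating or freeing memory (new objects, growing arrays, string interpolation)
    // - taking a mutex, DispatchQueue.sync, waiting on semaphores
    // - disk or network I/O, print/NSLog
    // - Objective-C messaging
}
```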
Ah thank you, that's good to learn. It's unfortunate that Swift can't operate safely in the audio thread :( |
Thanks Jonatan, for clarifying the ongoing hidden issues with the “I know, I know … but for me it works, I haven’t noticed any problem” … problem.
C
查理
|
They are not a problem because of how they are used: OSSpinLockTry and OSSpinLockUnlock do not block the thread. This is key to understanding what is safe to do on the audio thread: when something blocks, you don't know when it will resume. It could be very fast (most of the time), or very slow (rarely, but still unacceptable for audio). Swift and Objective-C actually block in a lot of places; basically, even a method call cannot be proven to execute within a certain amount of time. In most applications this is OK, but not for real-time programming. Note that avoiding mutexes, method calls, malloc and the like is still not enough. If you're serious about audio programming, you will adopt a different programming style. Recursion is out of the question in many cases. Do you send messages from the main thread? How do you process them on the audio thread? |
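A hedged sketch of the try-lock handoff being described, written in Swift for readability even though (per this thread) the render-side code would normally live in C. The type names are made up, and OSSpinLock is deprecated on newer SDKs in favour of os_unfair_lock, so treat the lock choice as illustrative:

```swift
import Darwin

struct Params {
    var tempo: Double = 120.0
    var volume: Float = 1.0
}

final class ParamBridge {
    private var lock = OS_SPINLOCK_INIT
    private var pending = Params()   // written by the main thread
    private var current = Params()   // consumed by the audio thread

    // Main thread: may spin briefly while the audio thread holds the lock.
    func update(_ params: Params) {
        OSSpinLockLock(&lock)
        pending = params
        OSSpinLockUnlock(&lock)
    }

    // Audio thread: never blocks. If the try fails, keep using the last values.
    func paramsForRender() -> Params {
        if OSSpinLockTry(&lock) {
            current = pending
            OSSpinLockUnlock(&lock)
        }
        return current
    }
}
```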
…linking Fix ABLLinkForceBeatAtTime linker error
Hello People, any chance of someone from Ableton providing a best-practices, working Swift 2.x example that wraps the Objective-C code?
Github.com/audiokit/AudioKit has created some nice Swift examples, and they help get things rolling really quickly. Thanks. ~ Robert