NowPlaying needs further work before it's ready #414
Some questions I have at the moment: Initial story sizing and prioritization: |
I've been working to understand the functional requirements of NowPlaying, but it isn't yet clear to me. This is my current understanding: QML Example:
C++ Example:
NowPlaying Cordova:
|
It may be important to note that this Plugin's primary goal is not to play audio - there is already a standard HTML5 way to play audio. The goal here is to allow audio to be played while being a good citizen on the system, and to interact with the overlay feature. So, we want to provide an API where the user (app developer) can "request" playback of their media, and when the service is acquired, playback starts with the source URL, metadata, and icon that the user requested. Options to enable or disable Next/Previous buttons should be included. The user should be able to receive callbacks on the various events that can happen: Acquire, Revoke, Play, Pause, Stop, Error, Next, Previous. The first six are primarily notifications, in case the application wishes to update some UI to reflect that state. The Next and Previous events would be something the application responds to by setting new media details. Accordingly, there should be a trackChange API, which doesn't do the Acquire again, but sets the new icon, metadata, and URL, and calls the trackChange slot to notify the Now Playing Connection. It should also include the options for enabling and disabling the Next/Previous buttons, as that would likely happen at this time. Since the app will almost certainly have its own UI for playback, we'll also need APIs for Play, Pause, Stop, which will pass through to the enclosed MediaPlayer object. Stop should probably release everything as well, so that Acquiring would be needed to start playback again. It may be useful for debugging to have methods to return the current media playback state, and whether the connection is "acquired" or "preempted". I would propose something like this for the API: cordova.plugins.nowplaying {
requestPlayback: function(
{ url: "/filepath",
icon: "/path",
metadata: { artist: "Jane Doe", ... },
nextEnabled: true,
prevEnabled: true,
callbacks: {
acquired: function() { ... },
...
}
}) { ... },
trackChange: function ({
same as requestPlayback, but no callback piece
}) { ... },
stop: function() { ... },
resume: function() { ... },
pause: function() { ... },
currentState: function() { returns { state: string, acquired: boolean, preempted: boolean } }
}
The callbacks defined in the first method would be used for success/error when calling the other methods. |
The previous developer thought it was useful to define their own signals and slots in the plugin. It worked for them, but it's not the only way to do this. I've tried to define above what the API should provide to a user of the plugin - the implementation details don't need to be apparent at that level. An app developer is unlikely to know or care about Signals and Slots or that the media player and now playing connection are separate objects for example. |
Edited the API to use "resume" instead of "play" as it's really the opposite of "pause". One other thing I was thinking is that we should only allow nextEnabled or prevEnabled to be set true if there is a callback defined for them. |
(It is easier to read this post by copy-pasting the text to a text-editor that line-wraps.) Native functionality (as I imagine it to be):
Desired functionality:
|
Here's some information on the file structure with respect to the architecture of this plugin. The plugin is in "plugin". Below, you'll see the layers wrt the drawing Tim made on the whiteboard during the code sprint. The first file listed in each layer is what you need to change (I like to open the non-.cpp/.hpp files in order side by side in my editor, then the .cpp/.hpp files in order side by side in Momentics). If I remember correctly, the second file (second "->") is where the file is copied to when you add the plugin or when it gets copied from the "sample" app to the newly created "debug1" app; the third file is where the file is copied to when you add blackberry10 as a platform, or is what the file is compiled to (a library).
App "main" html: sample/www/index.html
App "main" js: sample/www/js/index.js
App: plugin/www/client.js
Cordova: debug1/platforms/blackberry10/platform_www/cordova.js
Controller: plugin/src/blackberry10/index.js
TemplateJS: plugin/src/blackberry10/native/src/NowPlaying_js.cpp
TemplateNDK: plugin/src/blackberry10/native/src/NowPlaying_ndk.cpp |
We can divide the user stories into these six main items (I tried to make them as mutually exclusive as possible):
|
Answers to your questions from earlier:
In the course of operation, an error might be encountered - like media not being found, low level issues from the native API/hardware, attempts to play without setting media, etc. There should be a way to get those messages.
I think that should be a property set on the input when calling trackChange. When you set the track, that would be when you know if you are able to handle a Next or Previous request. It could be that you've reached the end of the list you were playing from, or you are starting a new list, or perhaps you are playing a single track with no other tracks to switch to. It seems less likely that the value would change after starting to play a track.
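For example, the flags could be computed from the playlist position at the moment trackChange is called, rather than toggled mid-track. This is a sketch only; the playlist shape and the helper name are hypothetical, not part of the plugin:

```javascript
// Sketch: derive nextEnabled/prevEnabled from the playlist position at the
// moment of a track change. The playlist shape and this helper are
// hypothetical illustrations, not actual plugin code.
function trackChangeOptions(playlist, index) {
    return {
        url: playlist[index].songURL,
        icon: playlist[index].iconURL,
        metadata: playlist[index].metadata,
        nextEnabled: index < playlist.length - 1, // more tracks after this one?
        prevEnabled: index > 0                    // any track before this one?
    };
}
```

The app would build this object inside its next/previous callbacks and pass it straight to trackChange.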
With JavaScript you typically document the properties you are expecting, and you can check for them and return an error if something is missing or incorrectly formatted. What I'm suggesting is the format of that object and the property names that you would use. The { ... } parts are what a developer would supply. With JavaScript you can specify the name of a function or define it inline.
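A minimal sketch of that kind of validation, folding in the earlier idea that nextEnabled/prevEnabled should only be allowed when a matching callback exists (the property names follow the proposed API above; the error-message format is an assumption):

```javascript
// Sketch: validate the options object passed to requestPlayback.
// Returns null if valid, otherwise an error message string.
// The required-property list follows the proposed API; the messages are made up.
function validatePlaybackOptions(options) {
    if (!options || typeof options !== "object") {
        return "options object is required";
    }
    var required = ["url", "icon", "metadata"];
    for (var i = 0; i < required.length; i++) {
        if (!(required[i] in options)) {
            return "missing required property: " + required[i];
        }
    }
    // Only allow next/prev to be enabled when a matching callback exists.
    var callbacks = options.callbacks || {};
    if (options.nextEnabled && typeof callbacks.next !== "function") {
        return "nextEnabled requires a 'next' callback";
    }
    if (options.prevEnabled && typeof callbacks.previous !== "function") {
        return "prevEnabled requires a 'previous' callback";
    }
    return null;
}
```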
I think stop, resume, and pause would likely be simple methods that take no parameters and return nothing. The output of anything that happens with them would likely fall into one of the callbacks defined earlier. currentState could return a JSON object directly, or through a callback. |
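Either way, the returned shape could be the same. Here is a sketch of the callback-style variant (the field names follow the proposal above; the internal state variables are stand-ins for whatever the plugin actually tracks):

```javascript
// Sketch: currentState delivered through a callback, built from internal
// plugin state. "internal" is a hypothetical stand-in for the plugin's
// real bookkeeping, not actual plugin code.
var internal = { state: "paused", acquired: true, preempted: false };

function currentState(callback) {
    // No parameters needed; hand back a plain JSON-friendly object.
    callback({
        state: internal.state,
        acquired: internal.acquired,
        preempted: internal.preempted
    });
}
```

Usage would be as simple as `currentState(function (s) { console.log(s.state); });`.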
"I think that should be a property set on the input when calling trackChange. When you set the track, that would be when you know if you are able to handle a Next or Previous request. It could be that you've reached the end of the list you were playing from, or you are starting a new list, or perhaps you are playing a single track with no other tracks to switch to. It seems less likely that the value would change after starting to play a track."
"I think stop, resume, and pause would likely be simple methods that take no parameters and return nothing. The output of anything that happens with them would likely fall into one of the callbacks defined earlier."
|
trackChange is one of the methods on the native NowPlayingConnection API. It's intended to notify the system that the current playing track has changed. From the perspective of a developer using the plugin, I think that changing the track and starting to play music the first time share a large amount of overlap. That's why I defined it as I did above. The first time you use the plugin, you'll need to define some callback methods, in addition to the track details.

I think that next and previous would not be APIs on this plugin, but that you can set a callback function for when these events happen. For example, you start playing music with the API, having enabled next and previous in that call and set a callback function for next and previous events. Then the user taps on the "next" button on the Overlay. Your callback function for the next event is called, you do whatever application logic you would like, and at some point call the trackChange method, providing the new media you would like to play. So, I don't think there should be next or previous methods on the API, and I don't think there should be specific APIs for enabling and disabling the next and previous buttons on the overlay. That ability should be included in the requestPlayback and trackChange methods only.

For a stop/resume/pause method call, it will start from the application code. For example, an application is playing music with this plugin, and the user taps on an application button to pause the music. The application then calls the "pause" method on the plugin. Back when they first started playback, they defined callback methods for the Pause event. That callback gets fired when the event is received, and then they can update their UI, knowing that the music has actually paused, not just that the user tapped the button.

So they are using callbacks, but we are not forcing them to define one every time they call a method, just once at the start of using the plugin, because those events can happen at many different times and through different paths. |
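The register-once pattern described above might look like this on the JavaScript side (a sketch; the event names follow the list earlier in the thread, and the dispatcher itself is hypothetical, not actual plugin code):

```javascript
// Sketch: callbacks are registered once (at requestPlayback time) and every
// later event - whether user-, app-, or system-initiated - is routed through
// the same table. Illustrative only.
var registered = {};

function registerCallbacks(callbacks) {
    // Called once, from requestPlayback.
    registered = callbacks || {};
}

function dispatchEvent(name, data) {
    // Called whenever a native event arrives (pause, resume, revoke, ...).
    if (typeof registered[name] === "function") {
        registered[name](data);
        return true;
    }
    return false; // no callback registered for this event
}
```

So a pause initiated from the overlay and a pause initiated by the app's own button both end up in the same pause callback.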
So earlier I was talking about finally understanding signals and slots. This made
So we need to refactor to this API. In the native code, we connect signals to slots in this manner: (This isn't how it looks like in the code atm, but I am showing this here for the purposes of discussion.)
mp->pause() should trigger the signal bb::multimedia::NowPlayingConnection::pause() (according to http://developer.blackberry.com/native/reference/cascades/bb__multimedia__nowplayingconnection.html#function-acquired). The slot pause() is the function we provide to handle what happens when we hit the signal. We connect the two. So the slot pause() should, at some point, be provided with a callback into the JavaScript layer. Tim's point in our discussions was that this callback should be provided as the very first thing a developer using this plugin does. This happens when the developer calls requestPlayback in the API. He/she provides the callback functions for all of resume, stop, pause, etc. Then in the native code, requestPlayback will set these callbacks as the callbacks that correspond to each slot. This way, when the pause signal is emitted and the pause() slot is executed, it will use that callback. Essentially, e.g. for pause(), there will be three functions:
So the developer writing in javascript first sets up the various callbacks, which will set up the callback infrastructure, and then he/she can resume, stop, pause by just calling the native NowPlayingNDK::NowPlayingResume/Stop/Pause() to emit their corresponding signals. Now providing these callbacks through requestPlayback can be tricky, because of how Cordova keeps track of them. I was investigating this. Here's my understanding so far. I tested my understanding by walking through the debugger.
I am looking at the code path for setting the metadata and how it sets up the dummy aSyncCallback:
4'. In the native front-end part of the plugin:
3'. In the javascript server-side part of the plugin:
2'. In the javascript client-side part of the plugin:
1'. In the sample app, we:
Now look at 2.a. above:
And 3.a. above:
And take a look at debug1/platforms/blackberry10/platform_www/cordova.js. Look at the exec function at line 897. exec() indexes the given success and fail callbacks with a new callbackId at line 906. It then calls into 3.a. at line 932, which calls into native code to do work. It then applies the success or fail callbacks at line 293 and deletes the callbackId index to them, depending on the result of 3.a (result.noResult(true)). This is how synchronous success and fail callbacks are made. PluginResult is defined in debug1/platforms/blackberry10/native/device/chrome/lib/PluginResult.js.

To see how async callbacks are made, look at 3.a. again. Note that the true boolean in result.noResult(true) actually keeps the callbacks around, rather than deleting the callbackId index to them. This callbackId index is used as a key for resultObjects in 3.a., which is a list of results kept around for asynchronous calls. When a signal/event is fired in the native layer, the corresponding slot makes async calls through sendEvent() (5.b.) on such a callbackId. This goes up to 3'a., where the corresponding result is obtained from resultObjects, which is used to execute the success/fail functions associated with the callbackId. This is how async success and fail callbacks are made.

What's important to see here is that because exec indexes success and fail callbacks with one new callbackId, then in order to set up callbacks using
in plugin/www/client.js, we need this function in this file to distribute each callback into its own function that will call exec into the javascript server side of the plugin in plugin/src/blackberry10/index.js and whose success function is the given callback function to requestPlayback. Again, this is because there is only ONE new callbackId associated per exec(), each of which has only ONE success/fail callback associated with it. Otherwise, the success function for requestPlayback will have to determine which callback to execute, which is messy and overloads what it has to do. In the sample app, we'll probably want a playlist to fiddle with, then:
One thing that made this challenging to understand is insufficient documentation. The best I found is https://cordova.apache.org/docs/en/5.1.1/guide/platforms/blackberry10/plugin.html, but we will probably have to reference the actual code for details on how some of the modules work: |
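To make the "one callbackId per exec()" constraint concrete, here is a sketch of how client.js could fan the callbacks out, one exec() call per callback. exec is mocked here so the sketch is self-contained; in the real plugin it would come from cordova/exec, and the action names ("on_pause" etc.) are assumptions, not the plugin's actual actions:

```javascript
// Mock of cordova's exec so this sketch runs standalone. Each real exec()
// call would be assigned its own callbackId by cordova.js.
var execCalls = [];
function exec(success, fail, service, action, args) {
    execCalls.push({ success: success, action: action });
}

// Sketch: distribute each user callback into its own exec() call, so each
// gets its own callbackId and its own success function. Action names are
// made up for illustration.
function requestPlayback(options) {
    var callbacks = options.callbacks || {};
    Object.keys(callbacks).forEach(function (eventName) {
        // One exec per callback: the success function IS the user's callback.
        exec(callbacks[eventName], null, "NowPlaying", "on_" + eventName, null);
    });
    // Finally, kick off playback itself with its own exec call.
    exec(null, null, "NowPlaying", "requestPlayback", options);
}
```

This avoids one overloaded success function that would have to figure out which user callback an incoming event belongs to.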
Some questions about acquire/revoke/play/stop and trackChange (thru email): Q1: Regarding the acquire/revoke/play/stop implementation I was considering: Q2: Regarding trackChange API:
Tim's response to Q1: What you’re describing is a different API – where you have a sort of setup method, followed by standard playback controls. Either approach can work, but it seems to me that you’re forcing multiple method calls where you want one action. Every time the developer wants to play something, the developer will need to call the setup function to provide the metadata and settings for next/previous. Then they call play(). To change tracks, you would call the setup function again? And then call play() again? I wouldn’t be surprised if you find some better ways to do the API as you get more familiar with the low level capabilities, but I don’t see how this makes it better. I think you still need callbacks for acquire and revoke, because those events can be fired as a result of external actions, and the application may need to handle them. For example, if you are using this API to play music, and a phone call comes in, the revoke signal would happen, and the app would lose its control over the audio. Maybe that could be abstracted into the pause event, but I don’t know. If it’s possible to handle all the acquire/revoke situations without needing to inform the application, then that would be a good improvement. Tim's response to Q2: You are right that most of the time it will be in a callback function on a next/previous event which determines what to do. At this point the app will figure out what track it wants to play, gather the metadata, and determine if it should have next or previous turned on, then it will call trackChange with that information. However, some applications may only use this to play a single track, turning off the next/previous buttons. I don’t see how adding helper methods for next previous provide anything? The application developer is going to define methods for doing next and previous. These methods will do whatever logic makes sense to the app, and then call trackChange. 
Why would we need another method of for them to call, when they’ve already defined it, and can call it directly? |
I think one ambiguity I'm having is how NowPlayingConnection will remain a good citizen on the system and with MP, NPConnection, and NPController.
|
About Q2 In the third paragraph:
I imagine three use cases which will eventually call trackChange in the native C++ layer:
For case 1, the developer must first specify next/previous callbacks through requestPlayback. By providing a next/previous API, the developer could just specify an anonymous function for case 1 without having to create separate and possibly inconsistent next/previous functions. When I suggested the API I thought a reason it would help is that we could output an error in case the next/previous callbacks were not yet defined through requestPlayback, but because the same error could be caught through trackChange (which the developer-defined next/previous functions would call), this is a moot point/question. However, I do feel like it's natural for a music playing API like NowPlaying to have next/previous options, especially since the developer provides callbacks for them through requestPlayback. |
I'm conjecturing that MediaPlayer and NowPlayingConnection should not "depend" on one another. After some inspection, I found that this assumption led me to believe that MediaPlayer's public slot bb::multimedia::MediaError::Type play () emits NowPlayingConnection's signal void play () because the documentation for the latter says:
However, I noticed that
So, in working with Qt to connect the NowPlayingConnection signal to a slot, I realized I was failing to fire the NowPlayingConnection signal because I thought it would be fired by MediaPlayer's play() slot, but that actually fires a mediaStateChanged() signal. The question is, then: how do I fire the NowPlayingConnection play() signal? I suspect these are fired internally, just like through preemption (see bottom of post). So the question might be rephrased: how do I get an internal system to fire a NowPlayingConnection play() signal, and can I catch it? The same question applies for NowPlayingConnection's next() signal. I need to know this so the API behaves as a good citizen on the system. It is important to note that:
What does this mean? Maybe it will be useful in teaching us how to fire the signals. This post ties back to my question three posts up:
What does the doc mean by "call play()"? Does it mean:
|
I'm curious about this "playlist" - can we use it? How does it work? I assume we won't for now.
|
About Q1
I'm not too sure what you mean by "either approach". I am guessing your approach to be this:
But I think requestPlayback should be decoupled. From my understanding it is only very slightly different from what Tim proposed.
/**********************
* APP LOGIC VARIABLES
**********************/
var currentSong = -1;
var myPlaylist = [
{
songURL: "http://www.pch.gc.ca/DAMAssetPub/DAM-hymChs-antSgn/STAGING/audio-audio/o-canada_1359474460106_eng.MP3",
iconURL: "http://flaglane.com/download/canadian-flag/canadian-flag-small.jpg",
metadata: {
Title: "O Canada",
Artist: "Canada",
Album: "Canada's Favorites"
}
},
{
songURL: "sounds/highhat.mp3",
iconURL: "img/Hi-hat.jpg",
metadata: {
Title: "High Hat",
Artist: "Drum Kit",
Album: "Instruments"
}
}
];
/************
* CALLBACKS
************/
function myAcquireCallback() { // No need?
// I can't think of any UI or app logic updates that warrants us needing this callback.
// It seems myPlayCallback() should do all the work, which is always triggered alongside.
}
function myRevokeCallback() { // No need?
// I can't think of any UI or app logic updates that warrants us needing this callback.
// It seems myStopCallback() should do all the work, which is always triggered alongside.
}
function myPlayResumeCallback() { // Essentially the same for play and resume
// 1. Update the app logic.
// 2. Update the UI.
}
function myPauseCallback() {
// 1. Update the app logic.
// 2. Update the UI.
}
function myStopCallback() {
// 1. Update the app logic.
// 2. Update the UI.
}
function myNextCallback() {
// 1. Update the UI
// 2. Update the app logic, including currentSong.
// 3. cordova.plugins.nowplaying.trackChange(
// {
// myPlaylist[currentSong + 1].songURL,
// myPlaylist[currentSong + 1].iconURL,
// myPlaylist[currentSong + 1].metadata,
// currentSong + 1 < myPlaylist.length - 1, // nextEnabled. Here: false
// currentSong + 1 > 0 // previousEnabled. Here: true
// });
}
function myPreviousCallback() {
// 1. Update the UI
// 2. Update the app logic, including currentSong.
// 3. cordova.plugins.nowplaying.trackChange(
// {
// myPlaylist[currentSong - 1].songURL,
// myPlaylist[currentSong - 1].iconURL,
// myPlaylist[currentSong - 1].metadata,
// currentSong - 1 < myPlaylist.length - 1, // nextEnabled. Here: true
// currentSong - 1 > 0 // previousEnabled. Here: false
// });
}
function myErrorCallback() {
// 1. Update the app logic.
// 2. Update the UI.
}
/******************************************************
* Main execution using cordova.plugins.nowplaying API
******************************************************/
cordova.plugins.nowplaying.requestPlayback(
{
myAcquireCallback, myRevokeCallback, // No need for these two?
myPlayResumeCallback, myPauseCallback, myStopCallback,
myNextCallback, myPreviousCallback,
myErrorCallback
});
// 1. Set up the NowPlayingConnection volume overlay.
//
// 2. Set up user-triggerable callbacks:
// - Connect the NowPlayingConnection->acquired() signal, // No need?
// as emitted by NowPlayingConnection->acquire(), to the acquireSlot() that triggers myAcquireCallback.
// - Connect the NowPlayingConnection->revoked() signal, // No need?
// as emitted by NowPlayingConnection->revoke(), to the revokeSlot() that triggers myRevokeCallback.
// - Connect the mediaStateChanged(MediaState::Started) signal,
// as emitted by MediaPlayer->play(), to the playSlot() that triggers myPlayResumeCallback.
// - Connect the mediaStateChanged(MediaState::Paused) signal,
// as emitted by MediaPlayer->pause(), to the pauseSlot() that triggers myPauseCallback.
// - Connect the mediaStateChanged(MediaState::Stopped) signal,
// as emitted by MediaPlayer->stop() to the stopSlot() that triggers myStopCallback.
//
// These two are optional and not specified by Tim's API:
// - Connect a newly defined nextSignal() to the nextSlot() that triggers myNextCallback.
// - Connect a newly defined previousSignal() to the previousSlot() that triggers myPreviousCallback.
//
// 3. Set up system-triggerable callbacks:
// - Connect the NowPlayingConnection->acquired() signal, // No need?
// as emitted internally thru preemption, to the acquireSlot() that triggers myAcquireCallback.
// - Connect the NowPlayingConnection->revoked() signal, // No need?
// as emitted internally thru preemption, to the revokeSlot() that triggers myRevokeCallback.
// - Connect the NowPlayingConnection->play() signal,
// as emitted internally thru preemption or volumeOverlay, to the playSlot() that triggers myPlayResumeCallback.
// - Connect the NowPlayingConnection->pause() signal,
// as emitted internally thru preemption or volumeOverlay, to the pauseSlot() that triggers myPauseCallback.
// - Connect the NowPlayingConnection->stop() signal,
// as emitted internally thru preemption or volumeOverlay, to the stopSlot() that triggers myStopCallback.
// - Connect the NowPlayingConnections->next() signal,
// as emitted internally thru volumeOverlay, to the nextSlot() that triggers myNextCallback.
// - Connect the NowPlayingConnections->previous() signal,
// as emitted internally thru volumeOverlay, to the previousSlot() that triggers myPreviousCallback.
//
// 4. Set up other callbacks:
// - Connect a newly-defined errorSignal() to the errorSlot() that triggers the callback.
cordova.plugins.nowplaying.play( // Possibly overwrite trackChange()
{
myPlaylist[0].songURL,
myPlaylist[0].iconURL,
myPlaylist[0].metadata,
currentSong < myPlaylist.length - 1, // nextEnabled. Here: true
currentSong > 0 // previousEnabled. Here: false
});
// 1. Setup songURL, iconURL, metadata.
// 2. Enable/disable next/previous buttons on the volume overlay.
// 3. NowPlayingConnection->acquire().
// 4. MediaPlayer->play().
cordova.plugins.nowplaying.pause();
// 1. MediaPlayer->pause().
cordova.plugins.nowplaying.resume();
// 1. MediaPlayer->play().
cordova.plugins.nowplaying.trackChange( // Possibly merge into/overwrite with play()
{
myPlaylist[1].songURL,
myPlaylist[1].iconURL,
myPlaylist[1].metadata,
currentSong < myPlaylist.length - 1, // nextEnabled. Here: false
currentSong > 0 // previousEnabled. Here: true
});
// 1. MediaPlayer->stop().
// 2. Setup songURL, iconURL, metadata.
// 3. Enable/disable next/previous buttons on the volume overlay.
// 4. MediaPlayer->play().
// This API method is optional and is not specified by Tim's API
cordova.plugins.nowplaying.previous();
// 1. emit the newly defined previousSignal().
// This API method is optional and is not specified by Tim's API
cordova.plugins.nowplaying.next();
// 1. emit the newly defined nextSignal().
cordova.plugins.nowplaying.stop();
// 1. MediaPlayer->stop().
// 2. NowPlayingConnection->revoke(). |
I can't seem to get the volumeOverlay buttons to fire signals, e.g. next() or pause() buttons. I am trying this, for example:
I am also trying to fire a signal when the app gets preempted by a video playing in the video app and by using the mic button, but it isn't firing.
|
I don't think we need to use NowPlayingController at all - I think it might be a trimmed version of the NowPlayingConnection without the Overlay interface. I'll confirm that with some colleagues if I can. Let's also leave out audioManagerHandle - it's a way to control how the audio is played, like which output [handset, speakerphone, line out, bluetooth], and it's not something we really need to tackle in this version. The MediaPlayer instance that our NowPlayingConnection interacts with is the one that we create. I thought that was already in the original plugin. |
Ok. Yes, MediaPlayer was already here. |
Firing the Next() and Play() signals on the NowPlayingConnection probably requires you to tap on the buttons on the Overlay. |
You can give the mediaplayer a playlist file, like a .m3u file, instead of an audio track and that will give you the ability to do next and previous within the mediaplayer object. I wasn't planning on using that approach though, as I preferred to leave that control up to the developer, in case they weren't able to use a standard playlist file. |
I'm fine with you changing the API to what you've got there - the only thing is that I think you can go with a single play() method, and not have a trackChange at all, right? It seems like the small differences could be handled by some status checks inside the method. |
Right, yeah I think this can be done. |
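The single-play() idea could be sketched like this, with an internal status check deciding between first-time setup and a track change. This is a hedged sketch only: the acquired flag and the "native" call names are hypothetical stand-ins for the plugin's real native layer:

```javascript
// Sketch: one play() method covering both requestPlayback and trackChange.
// "native" is a hypothetical stand-in for calls into the native layer.
var acquired = false;

function play(options, native) {
    if (!acquired) {
        // First call: acquire the NowPlayingConnection, then start playback.
        native.acquire(options.callbacks);
        acquired = true;
    } else {
        // Already acquired: this is just a track change; stop current media first.
        native.stop();
    }
    native.setTrack(options.url, options.icon, options.metadata);
    native.setButtons(options.nextEnabled, options.prevEnabled);
    native.play();
}
```

The small differences between the two cases collapse into the one status check, as suggested above.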
Yes, I've tried this. The overlay does respond to taps in the sense that the buttons change visuals ("tapped" state vs "untapped" state, "pause" toggles with "play"), but they aren't firing anything. At the same time, the volume overlay isn't showing the icon and metadata. And this is odd because I do setVolumeOverlay to Fancy, the NowPlayingConnection is acquired, setNextEnabled(true), and the connect succeeds. |
In this thread, a Paul Bernhardt suggests using NowPlayingController. I tried this but it doesn't seem to work either. |
Regarding your signals not firing - I think we're missing some critical code here and I don't know where it was lost. These sorts of plugins that use signals and slots leverage the ApplicationThread class to make this all work. The plugin code is run on that ApplicationThread, and that's what makes the signals and slots work - because the ApplicationThread is a QThread with the necessary signal and slot capabilities. Looking at both MediaKeys and SystemDialog as examples, I see calls to the "join" method on initialization, passing in the Window Group so that the Application Thread runs in the same group as the app itself, but I don't see that in this plugin: https://github.com/parker-mar/WebWorks-Community-APIs/blob/master/BB10-Cordova/MediaKeys/plugin/src/blackberry10/index.js#L92-L104 Secondly, the method calls need to go from the Plugin's thread, to the ApplicationThread where the NowPlayingNDK code was moved. MediaKeys and SystemDialog do that like the following, but I see direct method calls in this plugin: I think we need to check out these differences in order to make the communication work properly. There could be other things that I missed too. |
Yes! This worked perfectly for the volume overlay. I am trying to figure out how notifications to the media notification area through signals and slots will work for preemption, as it doesn't yet. I am testing this by using the mic button or playing another video, then returning to the NowPlaying app and expecting play/pause callbacks to be shown (but they are not). I looked at almost all the other plugins, but have found nothing yet. Do you have any advice here, Tim? |
I don't see you handling the revoke signal at all - that's what you should be getting when another audio signal takes over. When you get that signal, you can pause the mediaplayer. |
Right, okay I modified this and preemption works now. Just some personal notes: Cases:
If we were revoked while playing/paused, and so return from preemption paused, and we want to continue from that point, then we would need all of play() except for the initial stop(). Such a method is too complicated, as it does some of play() and some of resume() (even though the mediaplayer is set with the music, the icon and metadata aren't - so these aren't synced together). Simplify by just always stopping. |
Notes and corresponding slides
Blog post summarizing work as term comes to a close: google drive link
Plugin:
Sample app using plugin: |