This repository has been archived by the owner on Sep 4, 2019. It is now read-only.

NowPlaying needs further work before it's ready #414

Open · 6 tasks
timwindsor opened this issue Jul 22, 2015 · 33 comments

@timwindsor
Collaborator

  1. Should use the NPconnection->acquire() call and wait for the signal before playing.
  2. Should listen for Revoked signal and stop playback if received.
  3. Do not hardcode any file paths, or default audio streams.
  4. Should accept more metadata properties than the 3 given now.
  5. Should take an icon for the media being played
  6. Should use the Fancy OverlayStyle always (Plain is default and shows nothing valuable).
@parker-mar
Collaborator

Some questions I have at the moment:
- Are we supposed to allow the user to specify actions with respect to a signal?
Story implies we hard-code, but https://developer.blackberry.com/native/documentation/graphics_multimedia/audio_video/accessing_media_notification_areas.html implies we don't.
- How do we specify signal handling normally then? Are Cordova slots/signals replacing internal npc handling?
- What about play, stop, pause - since we hard-code that behavior, is there no need for signals/slots there?
- How do WE specify what acquire does in C++ (not QML)? Signals and slots? Then the user who uses our plugin won't? Shouldn't the user specify some kind of signals/slots thing?

Initial story sizing and prioritization:
1. ~7 Should use the NPconnection->acquire() call and wait for the signal before playing.
- New function that just emits NPconnection->acquire() signal
- New slot tied to acquire()
- mp->play()
npc->setMediaState(bb::multimedia::MediaState::Started);
npc->setMetaData(mp->metaData());
2. ~5 Should listen for Revoked signal and stop playback if received.
- New function that just emits NPconnection->revoke() signal
- New slot tied to revoke()
- mp->stop()
3. ~1 Do not hardcode any file paths, or default audio streams.
- How is the NowPlaying plugin intended to provide access to stored media? Where would the media be? (https://developer.blackberry.com/native/documentation/graphics_multimedia/audio_video/accessing_media_notification_areas.html - assets... right? Or media player songs added by user? If the app accesses files there, how can I specify them anymore through the plugin?)
4. ~1 Should accept more metadata properties than the 3 given now.
- Create a test function to see json values set?
- Are we checking whether individual json values are set?
5. ~1 Should take an icon for the media being played
- Need to know how NPconnection works first? Get sample?
6. ~1 Should use the Fancy OverlayStyle always (Plain is default and shows nothing valuable).
- https://developer.blackberry.com/native/documentation/graphics_multimedia/audio_video/accessing_media_notification_areas.html
*- What is the overlay, exactly? (see the link above)
-> The pop-out with playback/music controls that shows the metadata.

@parker-mar
Collaborator

I've been working to understand the functional requirements of NowPlaying, but they aren't yet clear to me. This is my current understanding:

QML Example:

C++ Example:

  • User can specify when npc.acquire(), npc.revoke(), mp.play(), mp.pause(), mp.stop() are called (main function)
  • Don't see how users specify handlers.
    Q: How are users intended to interact with the plugin?

NowPlaying Cordova:

  • User can specify when npc.acquire(), npc.revoke(), mp.play(), mp.pause(), mp.stop() are called (called in the application's index.js)
  • Don't see how users specify handlers.
    Q: What exactly do we want? How are users intended to interact with the plugin?
    • The stories sound like we hard-code the signal handlers.
    • What is it that we really want? From my understanding, we don't need play, stop, pause, etc. (these belong to the music player - do we have another Cordova module for this?), just native NowPlaying's acquire and revoke.
    • Why are we defining our own signals and slots in cordova NowPlaying? Shouldn't we use native NowPlaying's internal signals and slots? Or is defining our own signals and slots really how C++ is meant to interact with the native NowPlaying (By ignoring the internal signals and slots and instead using our own method of communication)?

@timwindsor
Collaborator Author

It may be important to note that this Plugin's primary goal is not to play audio - there is already a standard HTML5 way to play audio. The goal here is to allow audio to be played while being a good citizen on the system, and to interact with the overlay feature.

So, we want to provide an API where the user (app developer) can "request" playback of their media, and when the service is acquired, playback starts with the source URL, metadata, and icon, that the user requested. Options to enable or disable Next/Previous buttons should be included.

The user should be able to receive callbacks on the various events that can happen: Acquire, Revoke, Play, Pause, Stop, Error, Next, Previous. The first six will be primarily notification in case the application wishes to update some UI to reflect that state. The Next and Previous events would be something the application would respond to by setting new media details.

Accordingly, there should be a trackChange API, which doesn't do the Acquire again, but sets the new icon, metadata, and url, and calls the trackChange slot to notify the Now Playing Connection. It should also include the options for enabling and disabling the Next/Previous buttons, as that would likely happen at this time.

Since the app will almost certainly have its own UI for playback, we'll also need APIs for Play, Pause, Stop, which will pass through to the enclosed MediaPlayer object. Stop should probably release everything as well, so that it would need Acquiring to start playback again.

It may be useful for debugging to have methods to return the current media playback state, and whether the connection is "acquired" or "preempted".

I would propose something like this for the API:

cordova.plugins.nowplaying {
    requestPlayback: function( 
        { url: "/filepath",
          icon: "/path",
          metadata: { artist: "Jane Doe", ... },
          nextEnabled: true,
          prevEnabled: true,
          callbacks: {
               acquired: function() { ... },
               ...
          }
        }) { ... },
     trackChange: function ({
          same as requestPlayback, but no callback piece
     }) { ... },
     stop: function() { ... },
     resume: function() { ... },
     pause: function() { ... },
     currentState: function() { returns { state: string value?, acquired: boolean, preempted: boolean } }
}

The callbacks defined in the first method would be used for success/error when calling the other methods.
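
For illustration, here's roughly how a developer might call the proposed API. This is only a sketch against the shape above (the paths and metadata are made up), since none of it is implemented yet:

```
cordova.plugins.nowplaying.requestPlayback({
    url: "/accounts/1000/shared/music/track1.mp3",   // hypothetical path
    icon: "/accounts/1000/shared/photos/cover1.png", // hypothetical path
    metadata: { artist: "Jane Doe", album: "Example Album" },
    nextEnabled: true,
    prevEnabled: false,
    callbacks: {
        acquired: function () { /* update UI: playback is starting */ },
        next: function () {
            // App logic picks the next track, then notifies the connection:
            cordova.plugins.nowplaying.trackChange({
                url: "/accounts/1000/shared/music/track2.mp3",
                icon: "/accounts/1000/shared/photos/cover2.png",
                metadata: { artist: "Jane Doe", album: "Example Album" },
                nextEnabled: false,
                prevEnabled: true
            });
        }
    }
});

// Later, driven by the app's own playback UI:
cordova.plugins.nowplaying.pause();
cordova.plugins.nowplaying.resume();
cordova.plugins.nowplaying.stop();
```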

@timwindsor
Collaborator Author

The previous developer thought it was useful to define their own signals and slots in the plugin. It worked for them, but it's not the only way to do this. I've tried to define above what the API should provide to a user of the plugin - the implementation details don't need to be apparent at that level. An app developer is unlikely to know or care about Signals and Slots or that the media player and now playing connection are separate objects for example.

@timwindsor
Collaborator Author

Edited the API to use "resume" instead of "play" as it's really the opposite of "pause".

One other thing I was thinking is that we should only allow nextEnabled or prevEnabled to be set true if there is a callback defined for them.
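
A minimal sketch of that guard, using the option names proposed above (the validation helper itself is hypothetical):

```
// Reject nextEnabled/prevEnabled when no matching callback was supplied.
function validatePlaybackOptions(options) {
    var callbacks = options.callbacks || {};
    if (options.nextEnabled && typeof callbacks.next !== "function") {
        throw new Error("nextEnabled is true but no 'next' callback was given");
    }
    if (options.prevEnabled && typeof callbacks.previous !== "function") {
        throw new Error("prevEnabled is true but no 'previous' callback was given");
    }
}
```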

@parker-mar
Collaborator

(It is easier to read this post by copy-pasting the text to a text-editor that line-wraps.)

Native functionality (as I imagine it to be):

Underlying logic:
    - Each media has a source and a logical (not physical) media notification area associated with it. Events associated with a media are sent and handled in its media notification area. 
    - There is underlying logic for designating a "main" active media source between several. This is abstracted away from all media notification areas.
    - There is a volume overlay widget that is associated with the media notification area of the "main" active media source. The logic (JS), structure and presentation (HTML, CSS)... everything for the widget wrt the media notification area, the media source, etc. is handled by the OS.
    - There is underlying logic for playing/stopping/pausing media when an app is closed, sent to the background, or sent to the foreground. This is abstracted away from all media notification areas. It is done through connection priorities (https://developer.blackberry.com/native/documentation/graphics_multimedia/audio_video/accessing_media_notification_areas.html see that section.)
    - There is underlying logic for how playing/stopping/pausing media causes the hardware to do so. This is abstracted away from all media notification areas.
    - Has concept of signal and slots: http://developer.blackberry.com/native/documentation/dev/signals_slots/ Signals are handled by slots.

APIs: 
    MediaPlayer: An instance of this class can:
        - control a single media source (up to 8 instances can play simultaneously) by sending a play/stop/pause command to the hardware.
    NowPlayingConnection: An instance of this class can:
        - call acquire()/revoke() to gain exclusive system-wide access to the media notification area for the currently "main" active media source.
        - when access is acquired to the media notification area, it can display/hide and modify the data in its volume overlay from specified data.
        - enable/disable the next/previous buttons on the volume overlay of the media notification area.
        - handle native signals for play/pause/stop/acquired/revoked which can be attached to a slot
        - return the current media playback state (stopped, playing, paused), whether the media notification area is acquired, and whether the connection to the media notification area is preempted by a higher priority connection (https://developer.blackberry.com/native/documentation/graphics_multimedia/audio_video/accessing_media_notification_areas.html under connection priority). 
    NowPlayingController: An instance of this class can:
        - control the media source that is acquired by an instance of NowPlayingConnection by sending a play/stop/pause command to the hardware.

Desired functionality:

Plugin providing an API between JS and the native APIs, which will allow a
developer to:
1. Play audio "while being a good citizen on the system"
2. Interact with the overlay feature

User Stories: 
-------------
(Note: play is the initial "play", as opposed to resume)

- "As a developer, I want API to acquire(play)/revoke(stop) system-wide access to the media notification area for the currently "main" active media source."
    - NowPlayingConnection.acquire()/revoke().
    - Update the current media player state on stop with NowPlayingConnection.setMediaState(Stopped)

- "As a developer, I want API to change the track being played."
    - NowPlayingConnection.trackChange().
    - Ensure you don't call NowPlayingConnection.acquire(), in case you intended to reuse that API. Can create a helper function instead.

- "As a developer, I want API to resume/pause the currently "main" active media (that was started with acquire)."
    - MediaPlayer.play()/pause().
    - Update the current media player state with NowPlaying.setMediaState(Started/Paused)

- "As a developer, I want to be able to specify the music I want to play when I acquire the media notification area or change the track."
    - Basically, call MediaPlayer.play()/pause()/stop().
    - The user specifies music through a URL (MediaPlayer.setSourceUrl()) or through the device filesystem (need to find out how)

- "As a developer, I want to be able to specify metadata for the music I want to play when I acquire the media notification area or change the track."
    - Basically, call NowPlayingConnection.setMetaData().
    - Do more than just the three current ones (title, artist, album) http://developer.blackberry.com/native/reference/cascades/bb__multimedia__metadata.html 

- "As a developer, I want to be able to specify the icon for the music I want to play when I acquire the media notification area or change the track."
    - NowPlayingConnection->setIconUrl()

- "As a developer, when I acquire the media notification area or change the track, I want it to play the music specified."

- "As a developer, when I acquire the media notification area or change the track, I want the volume overlay to be displayed with the metadata and icon specified."
    - NowPlayingConnection->setOverlayStyle(OverlayStyle::Fancy)

- "As a developer, I want API to enable/disable the next/previous buttons of the volume overlay"
    - NowPlayingConnection->setNext/PreviousEnabled(bool)
    - Assert callback is specified when bool is true.

- "As a developer, I want API to receive callbacks from Acquire(play), Revoke(stop), Resume, Pause, Stop, Error, Next, Previous so that I can do actions like updating UI (first six) or set new media details (last two: Next, Previous) to reflect the state of the app after the function is done executing."
    - See https://github.com/blackberry/WebWorks-Community-APIs/tree/master/BB10/Memory for how SendPluginEvent works: "If you want the native code to be able to trigger an event on the JavaScript side then you'll need to call the SendPluginEvent function."
    - Example code path: setMetaButtonClick() 

- "As a developer, I want API to return the current media playback state and if the connection is acquired or preempted"
    - NowPlayingConnection.mediaState(),NowPlayingConnection.isAcquired(), NowPlayingConnection.isPreempted()

- "As a developer, I want API that will allow me to define callback handlers for when my app gets preempted or is no longer preempted."
    - "With no audioManagerHandle specified, the now playing service will automatically call [emit] play() when no longer preempted. If an audioManagerHandle is specified, the now playing service will only call [emit] play() if the audio routing has not changed to a more public device while preempted." (http://developer.blackberry.com/native/reference/cascades/bb__multimedia__nowplayingconnection.html#function-play). I think this play signal is the one for NowPlayingConnection (http://developer.blackberry.com/native/reference/cascades/bb__multimedia__nowplayingconnection.html#function-play), and can be tied to a slot. We just have to provide an API function which requires a callback function (that we should assert must be specified?) for this slot, which will execute when the play signal occurs, i.e. when the media is no longer preempted. 
        - I am not sure this is feasibly implementable though, because I don't know if we can catch/handle/slot the emitted play() signal. One way to find out is to create an app and see how preempting behaves, and see if we can catch the signal and slot it.
    - I'm assuming pause() signal is emitted when we are preempted (I didn't find this specified in the native references though) and that we can do the same as for the play() signal.


To ask:
    - What do you mean by "Error" event? ("The user should be able to receive callbacks on the various events that can happen: Acquire, Revoke, Play, Pause, Stop, _Error_, Next, Previous.") 

    - "[The trackChange API] should also include the options for enabling and disabling the Next/Previous buttons, as that would likely happen at this time."? Do you mean because e.g. we might hit the last track or something, we should ensure next/previous button enabling/disabling can be specified as a callback?

    - Clarification on syntax for API for requestPlayback: Are you proposing to use a JSON parameter to specify how the JSON argument must be formatted (is that valid? I am unfamiliar with this syntax)? Are you proposing to specify acquired as a named function callback parameter that must be specified in the function JSON argument (is that valid syntax, with the {...}?)? Is this intended to be a variable argument list of callbacks?

    - Clarification on syntax for API for stop, resume, pause, currentState: These will take callbacks as parameters too, right?

@parker-mar
Collaborator

Storyboard of example app using the NowPlaying plugin.

  1. Play -> Pause -> Resume -> Pause -> Resume -> Stop
  2. Play -> trackChange -> Stop
  3. Preempted -> ?

[attached photo: storyboard sketch of the example app]

@parker-mar
Collaborator

Here's some information on the file structure with respect to the architecture of this plugin.

The plugin is in "plugin".
"sample" is a sample app that uses the plugin.
nowplaying_build.sh automatically creates another app called "debug1" and adds the plugin to it. It then uses the "sample" app as boilerplate code to copy into the newly created "debug1" app. The script then adds blackberry10 as a platform, and does a cordova run. You can use this script to run and test changes, because the first thing it always does is remove the "debug1" app.

Below, you'll see the layers with respect to the drawing Tim made on the whiteboard during the code sprint. The first file listed in each layer is what you need to change (I like to open the non-.cpp/.hpp files in order side by side in my editor, and the .cpp/.hpp files in order side by side in Momentics). If I remember correctly, the second file (second "->") is where the file is copied to when you add the plugin or when it gets copied from the "sample" app to the newly created "debug1" app; the third file is where the file is copied to when you add blackberry10 as a platform, or is what the file is compiled to (a library).

[attached screenshot: whiteboard diagram of the plugin layers]

App "main" html: sample/www/index.html
-> debug1/www/index.html
-> debug1/platforms/blackberry10/www/index.html

App "main" js: sample/www/js/index.js
-> debug1/www/js/index.js
-> debug1/platforms/blackberry10/www/js/index.js

App: plugin/www/client.js
-> debug1/plugins/com.blackberry.community.nowplaying/www/client.js
-> debug1/platforms/blackberry10/www/plugins/com.blackberry.community.nowplaying/www/client.js

Cordova: debug1/platforms/blackberry10/platform_www/cordova.js

Controller: plugin/src/blackberry10/index.js
-> debug1/plugins/com.blackberry.community.nowplaying/src/blackberry10/index.js
-> debug1/platforms/blackberry10/native/device/chrome/plugin/com.blackberry.community.nowplaying/index.js

TemplateJS: plugin/src/blackberry10/native/src/NowPlaying_js.cpp
-> debug1/plugins/com.blackberry.community.nowplaying/src/blackberry10/native/src/NowPlaying_js.cpp
-> debug1/platforms/blackberry10/native/device/plugins/jnext/libNowPlaying.so

TemplateNDK: plugin/src/blackberry10/native/src/NowPlaying_ndk.cpp
-> debug1/plugins/com.blackberry.community.nowplaying/src/blackberry10/native/src/NowPlaying_ndk.cpp
-> debug1/platforms/blackberry10/native/device/plugins/jnext/libNowPlaying.so

@parker-mar
Collaborator

We can divide the user stories into these 6 main items (tried to make them as mutually exclusive as possible):

  1. Implement function to return state: media playback state and connection acquired or preempted.
  2. Implement acquire, revoke, resume, pause, stop, error, next, previous using native MediaPlayer API.
  3. Implement callbacks into JS layer for acquire, revoke, resume, pause, stop, error, next, previous.
  4. Implement handlers for when the app gets preempted or is no longer preempted (needs further investigation).
  5. Implement functions for specifying metadata, icon, and music to play.
  6. Implement functions to acquire/revoke media notification area. Prepare volume overlay over media notification area.
  • Populate it with metadata, icon, and music.
  • Provide functionality to enable/disable the next/previous buttons of volume overlay.

@timwindsor
Collaborator Author

Answers to your questions from earlier:

  • What do you mean by "Error" event? ("The user should be able to receive callbacks on the various events that can happen: Acquire, Revoke, Play, Pause, Stop, Error, Next, Previous.")

In the course of operation, an error might be encountered - like media not being found, low level issues from the native API/hardware, attempts to play without setting media, etc. There should be a way to get those messages.

  • "[The trackChange API] should also include the options for enabling and disabling the Next/Previous buttons, as that would likely happen at this time."? Do you mean because e.g. we might hit the last track or something, we should ensure next/previous button enabling/disabling can be specified as a callback?

I think that should be a property set on the input when calling trackChange. When you set the track, that would be when you know if you are able to handle a Next or Previous request. It could be that you've reached the end of the list you were playing from, or you are starting a new list, or perhaps you are playing a single track with no other tracks to switch to. It seems less likely that the value would change after starting to play a track.

  • Clarification on syntax for API for requestPlayback: Are you proposing to use a JSON parameter to specify how the JSON argument must be formatted (is that valid? I am unfamiliar with this syntax)? Are you are proposing to specify acquired as a named function callback paramter that must be specified in the function JSON argument (is that valid syntax, with the {...}?)? Is this intended to be a variable argument list of callbacks?

With JavaScript you typically document the properties you are expecting, and you can check for them and return an error if something is missing or incorrectly formatted. What I'm suggesting is the format of that object and the property names that you would use. The { ... } parts are what a developer would supply. With JavaScript you can specify the name of a function or define it inline.

  • Clarification on syntax for API for stop, resume, pause, currentState: These will take callbacks as parameters too, right?

I think stop, resume, and pause would likely be simple methods that take no parameters and return nothing. The output of anything that happens with them would likely fall into one of the callbacks defined earlier.

currentState could return a JSON object directly, or through a callback.
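
For illustration, the two shapes that could take (a sketch only; neither variant exists yet):

```
// 1. Direct return:
var state = cordova.plugins.nowplaying.currentState();
console.log(state); // e.g. { state: "playing", acquired: true, preempted: false }

// 2. Through a callback:
cordova.plugins.nowplaying.currentState(function (state) {
    console.log(state.state, state.acquired, state.preempted);
});
```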

@parker-mar
Collaborator

"I think that should be a property set on the input when calling trackChange. When you set the track, that would be when you know if you are able to handle a Next or Previous request. It could be that you've reached the end of the list you were playing from, or you are starting a new list, or perhaps you are playing a single track with no other tracks to switch to. It seems less likely that the value would change after starting to play a track."

  • I just want to clarify what you mean by the trackChange API. Stuff that has to do with changing tracks (next/previous actions) is just part of the overall NowPlaying API, right? "trackChange" is just a name associated with that set of actions?
  • So the proposal is for the next/previous API to have optional parameters to enable/disable the next/previous buttons on the overlay? Just to clarify, couldn't the user just call these right after calling the next/previous API? So this is just for convenience?

"I think stop, resume, and pause would likely be simple methods that take no parameters and return nothing. The output of anything that happens with them would likely fall into one of the callbacks defined earlier."

  • I thought that they needed callbacks to update the UI?

@timwindsor
Collaborator Author

trackChange is one of the methods on the native NowPlayingConnection API. It's intended to notify the system that the current playing track has changed.

From the perspective of a developer using the plugin, I think that changing the track and starting to play music the first time share a large amount of overlap. That's why I defined it as I did above. The first time you use the plugin, you'll need to define some callback methods, in addition to the track details.

I think that next and previous would not be APIs on this plugin, but that you can set a callback function for when these events happen. For example, you start playing music with the API and have enabled next and previous in that call, and set a callback function for next and previous events. Then the user taps on the "next" button on the Overlay. Your callback function for the next event is called, and you do whatever application logic you would like, and at some point call the trackChange method, providing the new media you would like to play.

So, I don't think there should be next or previous methods on the API and I don't think there should be specific APIs for enabling and disabling the next and previous buttons on the overlay. That ability should be included in the requestPlayback and trackChange methods only.

For a stop/resume/pause method call, it will start from the application code. For example, an application is playing music with this plugin, and the user taps on an application button to pause the music. The application then calls the "pause" method on the plugin. Back when they first started playback, they defined callback methods for the Pause event. That callback gets fired when the event is received and then they can update their UI, knowing that the music has actually paused, not just that the user tapped the button.

So they are using callbacks, but we are not forcing them to define one every time they call a method, just once at the start of using the plugin, because those events can happen at many different times and through different paths.

@parker-mar
Collaborator

So earlier I was talking about finally understanding signals and slots. This made me realize how Tim's proposed API is supposed to work now.
Here is the API. I think it should be in NowPlaying/plugin/www/client.js:

    requestPlayback: function( 
        {
            songURL: "/songURL",
            iconURL: "/iconURL",
            metadata: { Title: "/Title", Artist: "/Artist", Album: "/Album" },
            nextEnabled: true,
            prevEnabled: true,
            callbacks: {
                play: function() {},
                stop: function() {},
                pause: function() {},
                resume: function() {},
                error: function() {},
                next: function() {},
                previous: function() {}
            }
        }
    ) { ... },
    trackChange: function (
        {
            songURL: "/songURL",
            iconURL: "/iconURL",
            metadata: { Title: "/Title", Artist: "/Artist", Album: "/Album" },
            nextEnabled: true,
            prevEnabled: true
        }
    ) { ... },
    stop: function() { ... },
    resume: function() { ... },
    pause: function() { ... },
    getState: function() { ... }

So we need to refactor to this API.

In the native code, we connect signals to slots in this manner (this isn't how it looks in the code at the moment, but I am showing it here for the purposes of discussion):

    NowPlayingCordova/src/NowPlaying_ndk.hpp:
    class NowPlayingNDK : public QObject {
            Q_OBJECT
            ...
            public:
                NowPlayingNDK() {
                    ...
                    // npc is the bb::multimedia::NowPlayingConnection instance.
                    // connect() must run inside a function body (e.g. the
                    // constructor), and SIGNAL() takes the bare signature.
                    QObject::connect(
                        npc,
                        SIGNAL(pause()),
                        this,
                        SLOT(pause())
                        );
                    ...
                }
            public slots:
                void pause();
            ...
    };

mp->pause() should trigger the signal bb::multimedia::NowPlayingConnection::pause() (according to http://developer.blackberry.com/native/reference/cascades/bb__multimedia__nowplayingconnection.html#function-acquired). The slot pause() is the function we provide to handle what happens when we hit the signal. We connect the two. So the slot pause() should, at some point, be provided with a callback into the JavaScript layer.

Tim's point in our discussions was that this callback should be provided as the very first thing a developer using this plugin would do. This happens when the developer calls requestPlayback in the API. He/she provides the callback functions for all of resume, stop, pause, etc. Then in the native code, requestPlayback will set these callbacks as the callbacks that correspond to each slot. This way, when the pause signal is emitted and the pause() slot is executed, it will use that callback.

Essentially, e.g. for pause(), there will be three functions:

NowPlayingNDK::NowPlayingPause(): which will just emit bb::multimedia::NowPlayingConnection::pause() by doing "mp->pause();"

void pause(): the slot which will do a callback into the javascript layer using NowPlayingNDK::sendEvent on a callbackId.

void setUpPauseCallback(callbackId): 
    1. Connect the pause signal to its slot
        QObject::connect(
            npc,             // the NowPlayingConnection instance, not `this`
            SIGNAL(pause()), // bare signature, without the class qualifier
            this, 
            SLOT(pause())
        );
    (I think it's better to do it here, rather than in NowPlayingCordova/src/NowPlaying_ndk.hpp)
    2. Sets up the callbackId by which the slot pause() can make a callback into the javascript layer (class NowPlayingNDK should probably keep a dictionary of callbackIDs, one per slot).

So the developer writing in javascript first sets up the various callbacks, which will set up the callback infrastructure, and then he/she can resume, stop, pause by just calling the native NowPlayingNDK::NowPlayingResume/Stop/Pause() to emit their corresponding signals.
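
In other words, from the JavaScript side the intended flow would look roughly like this (a sketch; the track details are just examples):

```
// 1. One-time setup: register every callback and start playback.
cordova.plugins.nowplaying.requestPlayback({
    songURL: "sounds/highhat.mp3",
    iconURL: "img/Hi-hat.jpg",
    metadata: { Title: "High Hat", Artist: "Drum Kit", Album: "Instruments" },
    callbacks: {
        pause: function () { /* update UI: paused */ },
        resume: function () { /* update UI: playing */ },
        stop: function () { /* update UI: stopped */ }
    }
});

// 2. Afterwards, plain control calls. Each one just makes the native layer
//    emit the corresponding signal; the slot then fires the callback above.
cordova.plugins.nowplaying.pause();
cordova.plugins.nowplaying.resume();
cordova.plugins.nowplaying.stop();
```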

Now providing these callbacks through requestPlayback can be tricky, because of how Cordova keeps track of them. I was investigating this. Here's my understanding so far; I tested it by walking through the debugger.

  1. [sample app]
    https://github.com/parker-mar/WebWorks-Community-APIs/blob/b3fb7f5ae166ece359e04a272a0a6c88886bdfef/BB10-Cordova/NowPlaying/sample/www/js/index.js
  2. [javascript client-side part of the plugin]
    https://github.com/parker-mar/WebWorks-Community-APIs/blob/b3fb7f5ae166ece359e04a272a0a6c88886bdfef/BB10-Cordova/NowPlaying/plugin/www/client.js
  3. [javascript server-side part of the plugin]
    https://github.com/parker-mar/WebWorks-Community-APIs/blob/b3fb7f5ae166ece359e04a272a0a6c88886bdfef/BB10-Cordova/NowPlaying/plugin/src/blackberry10/index.js
  4. [native front-end part of the plugin]
    https://github.com/parker-mar/WebWorks-Community-APIs/blob/b3fb7f5ae166ece359e04a272a0a6c88886bdfef/BB10-Cordova/NowPlaying/plugin/src/blackberry10/native/src/NowPlaying_js.cpp
  5. [native back-end part of the plugin]
    https://github.com/parker-mar/WebWorks-Community-APIs/blob/b3fb7f5ae166ece359e04a272a0a6c88886bdfef/BB10-Cordova/NowPlaying/plugin/src/blackberry10/native/src/NowPlaying_ndk.cpp

I am looking at the code path for setting the metadata and how it sets up the dummy aSyncCallback:

  1. In the sample app, we:

    1.a. Bind the metadata button:

    setMetaButtonClick: function(){
    
        var jsonData = {"Title":"MyTitle",
                "Artist":"MyArtist",
                "Album":"MyAlbum"};
        com.blackberry.community.nowplaying.NowPlayingSetMetadata(jsonData, app.aSyncCallback);
    
    },
    

    1.b. aSyncCallback is defined:

    aSyncCallback: function(data) {
        if (data) {
            console.log(data);
            app.writeOut(data.result);
        }
    },
    
  2. In the javascript client-side part of the plugin:

    2.a. setMetaButtonClick (in 1.a.) calls this:

    _self.NowPlayingSetMetadata = function (input, callback) {
            var success = function (data, response) {
                var json = JSON.parse(data);
                callback(json);
            },
            fail = function (data, response) {
                console.log("Error: " + data);
            };
        exec(success, fail, _ID, "NowPlayingSetMetadata", { input: input });
    };
    
  3. In the javascript server-side part of the plugin:

    3.a. exec(success, fail, _ID, "NowPlayingSetMetadata", { input: input }) (in 2.a.) calls this:

    NowPlayingSetMetadata:  function (success, fail, args, env) {
        var result = new PluginResult(args, env);
        resultObjs[result.callbackId] = result;
        args = JSON.parse(decodeURIComponent(args["input"]));
        nowPlaying.getInstance().NowPlayingSetMetadata(result.callbackId, args);
        result.noResult(true);
    }
    

    3.b. nowPlaying.getInstance().NowPlayingSetMetadata(result.callbackId, args); (in 3.a.) calls this:

    self.NowPlayingSetMetadata = function (callbackId, input) {
        return JNEXT.invoke(self.m_id, "NowPlayingSetMetadata " + callbackId + " " + JSON.stringify(input));
    };
    
  4. In the native front-end part of the plugin:

    4.a. JNEXT.invoke(self.m_id, "NowPlayingSetMetadata " + callbackId + " " + JSON.stringify(input)); (in 3.b.) calls this:

    string NowPlayingJS::InvokeMethod(const string& command) {
        ...
        } else if (strCommand == "NowPlayingSetMetadata") {
            m_NowPlayingMgr->NowPlayingSetMetadata(callbackId, arg);
            return "";
        ...
    }
    
  5. In the native back-end part of the plugin:

    5.a. m_NowPlayingMgr->NowPlayingSetMetadata(callbackId,arg); (in 4.a.) calls this:

    void NowPlayingNDK::NowPlayingSetMetadata(const std::string& callbackId, const std::string& data){
        ...
        Json::Value root;
        ...
        root["result"] = "SetMetadata Succeed.";
        sendEvent(callbackId + " " + writer.write(root));
        ...
    }
    

    5.b. sendEvent(callbackId + " " + writer.write(root)); (in 5.a.) calls this:

    void NowPlayingNDK::sendEvent(const string& msg){
        m_pParent->NotifyEvent(msg);
    }
    

4'. In the native front-end part of the plugin:

4'.a. m_pParent->NotifyEvent(msg); (in 5.b.) calls this:
```
// Notifies JavaScript of an event
void NowPlayingJS::NotifyEvent(const std::string& event) {
    std::string eventString = m_id + " ";
    eventString.append(event);
    SendPluginEvent(eventString.c_str(), m_pContext);           
}
```

3'. In the javascript server-side part of the plugin:

3'.a. SendPluginEvent(eventString.c_str(), m_pContext); (in 4'.a.) calls this:
```
// Fired by the Event framework (used by asynchronous callbacks)
self.onEvent = function (strData) {
    var arData = strData.split(" "),
        callbackId = arData[0],
        result = resultObjs[callbackId],
        data = arData.slice(1, arData.length).join(" ");

    if (result) {
        if (callbackId != threadCallback) {
            result.callbackOk(data, false);
            delete resultObjs[callbackId];
        } else {
            result.callbackOk(data, true);
        }
    }
};
```

2'. In the javascript client-side part of the plugin:

2'.a. result.callbackOk(data, true); (in 3'.a.) calls the function success in this:
```
_self.NowPlayingSetMetadata = function (input, callback) {
        var success = function (data, response) {
            var json = JSON.parse(data);
            callback(json);
        },
        fail = function (data, response) {
            console.log("Error: " + data);
        };
    exec(success, fail, _ID, "NowPlayingSetMetadata", { input: input });
};
```

1'. In the sample app, we:

1'.a. callback(json); (in 2'.a.) calls this:
```
aSyncCallback: function(data) {
    if (data) {
        console.log(data);
        app.writeOut(data.result);
    }
},
```

Now look at 2.a. above:

    _self.NowPlayingSetMetadata = function (input, callback) {
            var success = function (data, response) {
                var json = JSON.parse(data);
                callback(json);
            },
            fail = function (data, response) {
                console.log("Error: " + data);
            };
        exec(success, fail, _ID, "NowPlayingSetMetadata", { input: input });
    };

And 3.a. above:

    NowPlayingSetMetadata: function (success, fail, args, env) {
        var result = new PluginResult(args, env);
        resultObjs[result.callbackId] = result;
        args = JSON.parse(decodeURIComponent(args["input"]));
        nowPlaying.getInstance().NowPlayingSetMetadata(result.callbackId, args);
        result.noResult(true);
    }

And take a look at debug1/platforms/blackberry10/platform_www/cordova.js.

Look at the exec function at line 897. exec() indexes the given success and fail callbacks with a new callbackId at line 906. It then calls into 3.a. at line 932, which calls into native code to do work. It then applies the success or fail callbacks at line 293 and deletes the callbackId index to them, depending on the result of 3.a (result.noResult(true)). This is how synchronous success and fail callbacks are made.

PluginResult is defined in debug1/platforms/blackberry10/native/device/chrome/lib/PluginResult.js.

To see how async callbacks are made, look at 3.a. again. Note that the true boolean in result.noResult(true) actually keeps the callbacks around, rather than deleting the callbackId index to them. This callbackId is used as a key into resultObjs in 3.a., which is a list of results kept around for asynchronous calls. When a signal/event is fired in the native layer, the corresponding slot makes async calls through sendEvent() (5.b.) on such a callbackId. This goes up to 3'.a., where the corresponding result is obtained from resultObjs and is used to execute the success/fail functions associated with the callbackId. This is how async success and fail callbacks are made.
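
Condensed, the lifecycle of one async callback looks like this (only using calls already shown above; the wiring is simplified):

```
// index.js, when exec() arrives from the client:
var result = new PluginResult(args, env);
resultObjs[result.callbackId] = result; // remember it for later events
result.noResult(true);                  // true => keep the JS callbacks around

// Native layer, when the event eventually fires:
//   sendEvent(callbackId + " " + payload) -> NotifyEvent -> SendPluginEvent

// index.js, in onEvent(strData):
var kept = resultObjs[callbackId];
kept.callbackOk(data, false);           // fire the success callback once...
delete resultObjs[callbackId];          // ...then drop the entry
```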

What's important to see here is that exec() indexes the success and fail callbacks under ONE new callbackId. So, in order to set up callbacks using

    requestPlayback: function( 
            {
                songURL: "/songURL",
                iconURL: "/iconURL",
                metadata: { Title: "/Title", Artist: "/Artist", Album: "/Album" },
                nextEnabled: true,
                prevEnabled: true,
                callbacks: {
                    play: function() {},
                    stop: function() {},
                    pause: function() {},
                    resume: function() {},
                    error: function() {},
                    next: function() {},
                    previous: function() {}
                }
            }
        ) { ... }

in plugin/www/client.js, we need this function to distribute each callback into its own function that calls exec() into the JavaScript server side of the plugin (plugin/src/blackberry10/index.js), with the callback given to requestPlayback as that exec()'s success function. Again, this is because there is only ONE new callbackId associated per exec(), each of which has only ONE success/fail callback associated with it. Otherwise, the success function for requestPlayback would have to determine which callback to execute, which is messy and overloads what it has to do.
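
A sketch of that distribution in client.js (the "setUpCallback" action name and the overall wiring are hypothetical):

```
_self.requestPlayback = function (options) {
    var events = ["play", "stop", "pause", "resume", "error", "next", "previous"];
    events.forEach(function (eventName) {
        var userCallback = options.callbacks[eventName];
        if (!userCallback) { return; }
        var success = function (data, response) {
                userCallback(JSON.parse(data));
            },
            fail = function (data, response) {
                console.log("Error: " + data);
            };
        // One exec() per event => one callbackId (and one native slot) each.
        exec(success, fail, _ID, "setUpCallback", { eventName: eventName });
    });
    // Finally, one more exec() to actually acquire and start playback.
    exec(function () {}, function () {}, _ID, "requestPlayback", { input: options });
};
```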

In the sample app, we'll probably want a playlist to fiddle with, then:

var playlist = [
    {
        songURL: "http://www.pch.gc.ca/DAMAssetPub/DAM-hymChs-antSgn/STAGING/audio-audio/o-canada_1359474460106_eng.MP3",
        iconURL: "http://flaglane.com/download/canadian-flag/canadian-flag-small.jpg",
        metadata: {
            Title: "O Canada",
            Artist: "Canada",
            Album: "Canada's Favorites"
        }
    },
    {
        songURL: "sounds/highhat.mp3",
        iconURL: "img/Hi-hat.jpg",
        metadata: {
            Title: "High Hat",
            Artist: "Drum Kit",
            Album: "Instruments"
        }
    }
];

One thing that made this challenging to understand is insufficient documentation. The best I found is https://cordova.apache.org/docs/en/5.1.1/guide/platforms/blackberry10/plugin.html, but we will probably have to reference the actual code for details on how some of the modules work:
debug1/platforms/blackberry10/platform_www/cordova.js
debug1/platforms/blackberry10/platform_www/cordova-js-src/exec.js
debug1/platforms/blackberry10/native/device/chrome/lib/PluginResult.js
debug1/platforms/blackberry10/native/device/chrome/lib/jnext.js

@parker-mar
Collaborator

Some questions about acquire/revoke/play/stop and trackChange (through email):

Q1: Regarding the acquire/revoke/play/stop implementation I was considering:
The NowPlaying API's requestPlayback method can acquire to set up stuff like overlay style and music, then it can revoke before the method ends.
The NowPlaying API's play method can implicitly reacquire. The NowPlaying API's stop method can implicitly revoke. (Why would we need callbacks for Acquire, Revoke, then?)

Q2: Regarding trackChange API:
It can be called in two ways:

  1. Developer can directly call trackChange() which in the native layer will do a stop(), set track according to details, then play(), and also do enabling/disabling of next/previous buttons from the volume overlay as specified.
  2. requestPlayback is first given next()/previous() callbacks. These are called when internal logic (e.g. triggered by user tapping next/previous on the volume overlay) emits native next()/previous() signals, which connect to slots that execute the callbacks. The callbacks defined earlier should update the UI and call trackChange() according to "next"/"previous" songs and metadata that the developer keeps track of with a playlist in the JS client layer.
    So my question is why requestPlayback needs parameters for enabling and disabling the next/previous buttons, if it's really the callback that calls trackChange that determines whether to disable/enable the buttons?
    And why not provide next/previous helper NowPlayingConnection methods which emit the native next()/previous() signals (which do the callback and call trackChange eventually)?

Tim's response to Q1:
I don’t understand why you’d revoke in the method that is trying to acquire.
There’s no “play” method defined in the API that I gave – it was changed to “resume” to better reflect its purpose, as the opposite of “pause”.

What you’re describing is a different API – where you have a sort of setup method, followed by standard playback controls. Either approach can work, but it seems to me that you’re forcing multiple method calls where you want one action. Every time the developer wants to play something, the developer will need to call the setup function to provide the metadata and settings for next/previous. Then they call play(). To change tracks, you would call the setup function again? And then call play() again? I wouldn’t be surprised if you find some better ways to do the API as you get more familiar with the low level capabilities, but I don’t see how this makes it better.

I think you still need callbacks for acquire and revoke, because those events can be fired as a result of external actions, and the application may need to handle them. For example, if you are using this API to play music, and a phone call comes in, the revoke signal would happen, and the app would lose its control over the audio. Maybe that could be abstracted into the pause event, but I don’t know. If it’s possible to handle all the acquire/revoke situations without needing to inform the application, then that would be a good improvement.

Tim's response to Q2:
This is related to Q1 – in my design requestPlayback handles setup and begins playing the file. Whenever you start playing a file, you need to turn on or off the next/previous buttons, because they go into the overlay, which will be available as soon as audio starts playing. The requestPlayback method also works as the first time setup function, so it allows the developer to set up all their callback functions at one time, rather than every time they play a track.

You are right that most of the time it will be in a callback function on a next/previous event which determines what to do. At this point the app will figure out what track it wants to play, gather the metadata, and determine if it should have next or previous turned on, then it will call trackChange with that information. However, some applications may only use this to play a single track, turning off the next/previous buttons.

I don’t see how adding helper methods for next/previous provides anything. The application developer is going to define methods for doing next and previous. These methods will do whatever logic makes sense to the app, and then call trackChange. Why would we need another method for them to call, when they’ve already defined it, and can call it directly?

@parker-mar
Collaborator

I think one ambiguity I'm having is how NowPlaying will remain a good citizen on the system, and how that works across MP, NPConnection, and NPController.

https://developer.blackberry.com/native/documentation/graphics_multimedia/audio_video/accessing_media_notification_areas.html

The notification area emits signals that can be used by your app to keep track of when the Play, Pause, Prev or Next media player control buttons have been pressed.

  • I understand this sends a NPC Play(), Pause(), Prev(), Next() signal. But which music player instance does this interact with? Or is this a plain NPC signal which we must connect to a slot which calls methods of a music player instance we keep track of?

http://developer.blackberry.com/native/reference/cascades/bb__multimedia__nowplayingconnection.html#property-audiomanagerhandle

The controllerActive property provides notification to indicate that one or more controllers are attached to the service that is currently acquired. The controllers require regular status updates to function accurately. This means that even if an application is in standby mode, regular status messages should continue to be sent to it.

With no audioManagerHandle specified, the now playing service will automatically call play() when no longer preempted. If an audioManagerHandle is specified, the now playing service will only call play() if the audio routing has not changed to a more public device while preempted.

  • What this means to me is that the play() signal will be emitted, which means that we must connect it to a slot which calls methods of a music player instance we keep track of.
  • I didn't look at audioManager before. Is it a good idea to use it? Is it necessary?

@parker-mar
Collaborator

About Q2

In the third paragraph:

I don’t see how adding helper methods for next/previous provides anything. The application developer is going to define methods for doing next and previous. These methods will do whatever logic makes sense to the app, and then call trackChange. Why would we need another method for them to call, when they’ve already defined it, and can call it directly?

I imagine three use cases which will eventually call trackChange in the native C++ layer:

  1. Volume Overlay next/previous buttons are touched.
  2. The developer creates a next/previous button on their app.
  3. The developer creates a method to jump to a particular song from their app.

For case 1, the developer must first specify next/previous callbacks through requestPlayback.
For case 2, the developer may create next/previous functions.
For case 3, the developer would use trackChange.

By providing a next/previous API, the developer could just specify an anonymous function for case 1 without having to create separate and possibly inconsistent next/previous functions.

When I suggested the API I thought a reason it would help is that we could output an error in case the next/previous callbacks were not yet defined through requestPlayback, but because the same error could be caught through trackChange (which the developer-defined next/previous functions would call), this is a moot point/question.

However, I do feel like it's natural for a music playing API like NowPlaying to have next/previous options, especially since the developer provides callbacks for them through requestPlayback.
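
For illustration, the helper I'm imagining would just be a thin wrapper in client.js (hypothetical; registeredCallbacks stands for wherever the plugin stores the callbacks given to requestPlayback):

```
_self.next = function () {
    if (typeof registeredCallbacks.next !== "function") {
        throw new Error("No 'next' callback was defined through requestPlayback");
    }
    // The app's next logic runs and eventually calls trackChange itself.
    registeredCallbacks.next();
};
```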

@parker-mar
Collaborator

I'm conjecturing that MediaPlayer and NowPlayingConnection should not "depend" on one another. Before inspecting further, this assumption had led me to believe that MediaPlayer's public slot bb::multimedia::MediaError::Type play() emits NowPlayingConnection's signal void play(), because the documentation for the latter says:

This signal is emitted on notification that a play track command has been received.
and I presumed this "play track command" to be a signal emitted by the former.

However, I noticed that

  1. Since these are separate APIs, we should expect them to behave independently.
  2. The documentation for MediaPlayer's play never actually says it emits a "play track command"-like signal:

Successful playback will emit a mediaStateChanged() signal with a MediaState::Started value. If the player is in the MediaState.Unprepared state internally prepare() will be called. In this case a mediaStateChanged() signal is emitted twice, once with a value of MediaState.Prepared and next MediaState.Started.

So, in working with Qt to connect the NowPlayingConnection signal to a slot, I realized I was failing to fire the NowPlayingConnection signal because I thought it would be fired by MediaPlayer's play() slot, but that actually fires a mediaStateChanged() signal.

The question, then, is: how do I fire the NowPlayingConnection play() signal? I suspect these are fired internally, just as through preemption (see bottom of post). So the question might be rephrased: how do I get the internal system to fire a NowPlayingConnection play() signal, and can I catch it? The same question applies to NowPlayingConnection's next() signal. I need to know this so the API behaves as a good citizen on the system.

It is important to note that:

  1. These two signals (NowPlayingConnection play() and next()) differ from the NowPlayingConnection acquired() signal, which can be fired using the slot acquire() and caught using the signal void acquired().
  2. These two have similar "cousin" emitters and catchers in MediaPlayer. Over there, "play()" can be called: it is caught with "mediaStateChanged()", and "nextTrack()" can be called: it is caught with "trackChanged()".

http://developer.blackberry.com/native/reference/cascades/bb__multimedia__nowplayingconnection.html#function-play

Acquired signals can be used in calls to next(), previous(), play(), pause(), and stop(). Upon receiving a signal, the user must call the function that is most appropriate for their media.

What does this mean? Maybe it will be useful in teaching us how to fire the signals.

This post ties back to my question three posts up:
http://developer.blackberry.com/native/reference/cascades/bb__multimedia__nowplayingconnection.html#function-play

With no audioManagerHandle specified, the now playing service will automatically call play() when no longer preempted. If an audioManagerHandle is specified, the now playing service will only call play() if the audio routing has not changed to a more public device while preempted.

What does the doc mean by "call play()"? Does it mean:

  1. Execution of the MediaPlayer public slot bb::multimedia::MediaError::Type play (), which emits a mediaStateChanged() signal?
  2. Since this was in the NowPlayingConnection docs, then this means that internal code causes a NowPlayingConnection signal void play() to be emitted?
  3. Something else?
    I will try to see if I can trigger this.

@parker-mar
Collaborator

I'm curious about this "playlist" - can we use it? How does it work? I assume we won't for now.
http://developer.blackberry.com/native/reference/cascades/bb__multimedia__mediaplayer.html#function-previoustrack

bb::multimedia::MediaError::Type nextTrack ()
Moves playback to the next track in the playlist.
If the media source is a playlist, this function will move to the next track. On success it will result in a trackChanged() signal.
If the media source is not a playlist then calling this function does nothing.
This method call is blocking.

@parker-mar
Collaborator

About Q1

What you’re describing is a different API – where you have a sort of setup method, followed by standard playback controls. Either approach can work, but it seems to me that you’re forcing multiple method calls where you want one action. Every time the developer wants to play something, the developer will need to call the setup function to provide the metadata and settings for next/previous. Then they call play(). To change tracks, you would call the setup function again? And then call play() again?

I'm not too sure what you mean by "either approach".

I am guessing your approach to be this:

requestPlayback should:
    1. Set up the callbacks by connecting the pre-defined acquire, revoke, play, stop, ... signals to slots that will call the given callbacks.
    2. Acquire the NowPlayingConnection.
    3. Set up the music to be played, with metadata and icon.
    4. Play the MediaPlayer instance. This should call the play() slot.
    (Does not revoke the NowPlayingConnection before exiting)

trackChange should:
    1. Stop the MediaPlayer instance. This should call the stop() slot.
    2. Set up the music to be played, with metadata and icon.
    3. Play the MediaPlayer instance. This should call the play() slot.

But I think requestPlayback should be decoupled. From my understanding it is only very slightly different from what Tim proposed.
The developer would use the API like this:

/**********************
 * APP LOGIC VARIABLES
 **********************/
var currentSong = -1;

var myPlaylist = [
    {
        songURL: "http://www.pch.gc.ca/DAMAssetPub/DAM-hymChs-antSgn/STAGING/audio-audio/o-canada_1359474460106_eng.MP3",
        iconURL: "http://flaglane.com/download/canadian-flag/canadian-flag-small.jpg",
        metadata: {
            Title: "O Canada",
            Artist: "Canada",
            Album: "Canada's Favorites"
        }
    },
    {
        songURL: "sounds/highhat.mp3",
        iconURL: "img/Hi-hat.jpg",
        metadata: {
            Title: "High Hat",
            Artist: "Drum Kit",
            Album: "Instruments"
        }
    }
];

/************
 * CALLBACKS
 ************/
function myAcquireCallback() { // No need?
    // I can't think of any UI or app logic updates that warrants us needing this callback.
    // It seems myPlayCallback() should do all the work, which is always triggered alongside.
}

function myRevokeCallback() { // No need?
    // I can't think of any UI or app logic updates that warrants us needing this callback.
    // It seems myStopCallback() should do all the work, which is always triggered alongside.
}

function myPlayResumeCallback() { // Essentially the same for play and resume
    // 1. Update the app logic.
    // 2. Update the UI.
}

function myPauseCallback() {
    // 1. Update the app logic.
    // 2. Update the UI.
}

function myStopCallback() {
    // 1. Update the app logic.
    // 2. Update the UI.
}

function myNextCallback() {
    // 1. Update the UI
    // 2. Update the app logic, including currentSong. 
    // 3. cordova.plugins.nowplaying.trackChange(
    //    {
    //        songURL: myPlaylist[currentSong + 1].songURL,
    //        iconURL: myPlaylist[currentSong + 1].iconURL,
    //        metadata: myPlaylist[currentSong + 1].metadata,
    //        nextEnabled: currentSong + 1 < myPlaylist.length - 1, // Here: false
    //        previousEnabled: currentSong + 1 > 0 // Here: true
    //    }).

}

function myPreviousCallback() {
    // 1. Update the UI
    // 2. Update the app logic, including currentSong.
    // 3. cordova.plugins.nowplaying.trackChange(
    //    {
    //        songURL: myPlaylist[currentSong - 1].songURL,
    //        iconURL: myPlaylist[currentSong - 1].iconURL,
    //        metadata: myPlaylist[currentSong - 1].metadata,
    //        nextEnabled: currentSong - 1 < myPlaylist.length - 1, // Here: true
    //        previousEnabled: currentSong - 1 > 0 // Here: false
    //    }).
}

function myErrorCallback() {
    // 1. Update the app logic.
    // 2. Update the UI.
}


/******************************************************
 * Main execution using cordova.plugins.nowplaying API
 ******************************************************/
cordova.plugins.nowplaying.requestPlayback(
    {
        callbacks: {
            acquire: myAcquireCallback, revoke: myRevokeCallback, // No need these two?
            play: myPlayResumeCallback, resume: myPlayResumeCallback,
            pause: myPauseCallback, stop: myStopCallback,
            next: myNextCallback, previous: myPreviousCallback,
            error: myErrorCallback
        }
    });
    // 1. Set up the NowPlayingConnection volume overlay.
    //
    // 2. Set up user-triggerable callbacks:
    //   - Connect the NowPlayingConnection->acquired() signal, // No need?
    //   as emitted by NowPlayingConnection->acquire(), to the acquireSlot() that triggers myAcquireCallback.
    //   - Connect the NowPlayingConnection->revoked() signal, // No need?
    //   as emitted by NowPlayingConnection->revoke(), to the revokeSlot() that triggers myRevokeCallback.
    //   - Connect the mediaStateChanged(MediaState::Started) signal, 
    //   as emitted by MediaPlayer->play(), to the playSlot() that triggers myPlayResumeCallback.
    //   - Connect the mediaStateChanged(MediaState::Paused) signal, 
    //   as emitted by MediaPlayer->pause(), to the pauseSlot() that triggers myPauseCallback.
    //   - Connect the mediaStateChanged(MediaState::Stopped) signal, 
    //   as emitted by MediaPlayer->stop(), to the stopSlot() that triggers myStopCallback.
    //
    //   These two are optional and not specified by Tim's API:
    //   - Connect a newly defined nextSignal() to the nextSlot() that triggers myNextCallback.
    //   - Connect a newly defined previousSignal() to the previousSlot() that triggers myPreviousCallback.
    //
    // 3. Set up system-triggerable callbacks:
    //   - Connect the NowPlayingConnection->acquired() signal, // No need?
    //   as emitted internally thru preemption, to the acquireSlot() that triggers myAcquireCallback.
    //   - Connect the NowPlayingConnection->revoked() signal, // No need?
    //   as emitted internally thru preemption, to the revokeSlot() that triggers myRevokeCallback.
    //   - Connect the NowPlayingConnection->play() signal, 
    //   as emitted internally thru preemption or volumeOverlay, to the playSlot() that triggers myPlayResumeCallback.
    //   - Connect the NowPlayingConnection->pause() signal, 
    //   as emitted internally thru preemption or volumeOverlay, to the pauseSlot() that triggers myPauseCallback.
    //   - Connect the NowPlayingConnection->stop() signal, 
    //   as emitted internally thru preemption or volumeOverlay, to the stopSlot() that triggers myStopCallback.
    //   - Connect the NowPlayingConnection->next() signal, 
    //   as emitted internally thru volumeOverlay, to the nextSlot() that triggers myNextCallback.
    //   - Connect the NowPlayingConnection->previous() signal, 
    //   as emitted internally thru volumeOverlay, to the previousSlot() that triggers myPreviousCallback.
    //
    // 4. Set up other callbacks:
    //   - Connect a newly-defined errorSignal() to the errorSlot() that triggers the callback.
    //   (A native-side sketch of this wiring follows the example below.)

currentSong = 0; // App logic: about to play the first song.
cordova.plugins.nowplaying.play( // Possibly replaces trackChange()
    {
        songURL: myPlaylist[0].songURL,
        iconURL: myPlaylist[0].iconURL,
        metadata: myPlaylist[0].metadata,
        nextEnabled: currentSong < myPlaylist.length - 1, // Here: true
        previousEnabled: currentSong > 0 // Here: false
    });
    // 1. Setup songURL, iconURL, metadata.
    // 2. Enable/disable next/previous buttons on the volume overlay.
    // 3. NowPlayingConnection->acquire().
    // 4. MediaPlayer->play().

cordova.plugins.nowplaying.pause();
    // 1. MediaPlayer->pause().

cordova.plugins.nowplaying.resume();
    // 1. MediaPlayer->play().

currentSong = 1; // App logic: moving to the second song.
cordova.plugins.nowplaying.trackChange( // Possibly merged into play()
    {
        songURL: myPlaylist[1].songURL,
        iconURL: myPlaylist[1].iconURL,
        metadata: myPlaylist[1].metadata,
        nextEnabled: currentSong < myPlaylist.length - 1, // Here: false
        previousEnabled: currentSong > 0 // Here: true
    });
    // 1. MediaPlayer->stop().
    // 2. Setup songURL, iconURL, metadata.
    // 3. Enable/disable next/previous buttons on the volume overlay.
    // 4. MediaPlayer->play().

// This API method is optional and is not specified by Tim's API
cordova.plugins.nowplaying.previous();
    // 1. emit the newly defined previousSignal().

// This API method is optional and is not specified by Tim's API
cordova.plugins.nowplaying.next();
    // 1. emit the newly defined nextSignal().

cordova.plugins.nowplaying.stop();
    // 1. MediaPlayer->stop().
    // 2. NowPlayingConnection->revoke().
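
For reference, a rough native-side (C++/Qt) sketch of the signal wiring that requestPlayback's steps 2-4 describe, e.g. in NowPlayingNDK's init code. This is only a sketch: npc, mp, and the slot names are assumptions, and a single mediaStateChanged slot that switches on the new state stands in for the separate play/pause/stop slots listed above.

// User-triggered transitions, reported by our own MediaPlayer:
QObject::connect(mp, SIGNAL(mediaStateChanged(bb::multimedia::MediaState::Type)),
                 this, SLOT(mediaStateSlot(bb::multimedia::MediaState::Type)));

// Acquisition lifecycle:
QObject::connect(npc, SIGNAL(acquired()), this, SLOT(acquireSlot()));
QObject::connect(npc, SIGNAL(revoked()), this, SLOT(revokeSlot()));

// System-triggered controls, from the volume overlay or preemption:
QObject::connect(npc, SIGNAL(play()), this, SLOT(playSlot()));
QObject::connect(npc, SIGNAL(pause()), this, SLOT(pauseSlot()));
QObject::connect(npc, SIGNAL(stop()), this, SLOT(stopSlot()));
QObject::connect(npc, SIGNAL(next()), this, SLOT(nextSlot()));
QObject::connect(npc, SIGNAL(previous()), this, SLOT(previousSlot()));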

@parker-mar
Copy link
Collaborator

I can't seem to get the volumeOverlay buttons to fire signals, e.g. the next() or pause() signals.

I am trying this, for example:

newConnectResult = QObject::connect(npc,
                                    SIGNAL(next()),
                                    this,
                                    SLOT(pauseSlot())); // dummy

I am also trying to get a signal to fire when the app is preempted (by playing a video in the video app, or by using the mic button), but nothing fires.

newConnectResult = QObject::connect(npc,
                                    SIGNAL(play()),
                                    this,
                                    SLOT(pauseSlot())); // dummy

newConnectResult = QObject::connect(npc,
                                    SIGNAL(revoked()),
                                    this,
                                    SLOT(pauseSlot())); // dummy

@timwindsor
Copy link
Collaborator Author

I don't think we need to use NowPlayingController at all - I think it might be a trimmed version of NowPlayingConnection without the Overlay interface. I'll confirm with some colleagues if I can.

Let's also leave out audioManagerHandle - it's a way to control how the audio is played, like which output it uses [handset, speakerphone, line out, bluetooth], and it's not something we really need to tackle in this version.

The MediaPlayer instance that our NowPlayingConnection interacts with is the one that we create. I thought that was already in the original plugin.

@parker-mar
Copy link
Collaborator

Ok. Yes, MediaPlayer was already here.

@timwindsor
Copy link
Collaborator Author

Firing the next() and play() signals on the NowPlayingConnection probably requires you to tap the buttons on the Overlay.

@timwindsor
Copy link
Collaborator Author

You can give the MediaPlayer a playlist file, like a .m3u file, instead of an audio track, and that gives you the ability to do next and previous within the MediaPlayer object. I wasn't planning on using that approach, though, as I preferred to leave that control up to the developer in case they aren't able to use a standard playlist file.
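
As a rough sketch of that alternative (not the approach we're taking), assuming the plugin's existing bb::multimedia::MediaPlayer instance; mp and the asset path are placeholders:

#include <bb/multimedia/MediaPlayer>
#include <QtCore/QUrl>

using namespace bb::multimedia;

void playFromPlaylist(MediaPlayer *mp)
{
    // Point the player at a playlist file instead of a single track;
    // the player then tracks the playlist position itself.
    mp->setSourceUrl(QUrl("asset:///sounds/playlist.m3u"));
    mp->play();
}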

@timwindsor
Copy link
Collaborator Author

I'm fine with you changing the API to what you've got there - the only thing is that I think you can go with a single play() method, and not have a trackChange at all, right? It seems like the small differences could be handled by some status checks inside the method.

@parker-mar
Copy link
Collaborator

I'm fine with you changing the API to what you've got there - the only thing is that I think you can go with a single play() method, and not have a trackChange at all, right?

Right, yeah I think this can be done.
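
A rough sketch of what the merged play() might look like on the native side, assuming the plugin's existing MediaPlayer (mp) and NowPlayingConnection (npc) members; the signature and the stop-first check are assumptions:

void NowPlayingNDK::play(const QUrl &songUrl, const QUrl &iconUrl,
                         const QVariantMap &metadata,
                         bool nextEnabled, bool previousEnabled)
{
    // The only extra step trackChange had: stop first if a track is active.
    if (mp->mediaState() == MediaState::Started
            || mp->mediaState() == MediaState::Paused) {
        mp->stop();
    }

    mp->setSourceUrl(songUrl);
    npc->setIconUrl(iconUrl);
    npc->setMetaData(metadata);
    npc->setNextEnabled(nextEnabled);
    npc->setPreviousEnabled(previousEnabled);
    npc->setOverlayStyle(OverlayStyle::Fancy);

    // Acquire and wait for acquired(); mp->play() is then called
    // from the slot connected to the acquired() signal.
    npc->acquire();
}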

@parker-mar
Copy link
Collaborator

Firing the next() and play() signals on the NowPlayingConnection probably requires you to tap the buttons on the Overlay.

Yes, I've tried this. The overlay does respond to taps in the sense that the buttons change visuals ("tapped" vs "untapped" state, and "pause" toggles with "play"), but they aren't firing any signals. At the same time, the volume overlay isn't showing the icon and metadata. This is odd, because I do set the volume overlay to Fancy, the NowPlayingConnection is acquired, setNextEnabled(true) is called, and the connect succeeds.

@parker-mar
Copy link
Collaborator

I don't think we need to use NowPlayingController at all - I think it might be a trimmed version of NowPlayingConnection without the Overlay interface. I'll confirm with some colleagues if I can.

In this thread, Paul Bernhardt suggests using NowPlayingController. I tried it, but it doesn't seem to work either:
https://supportforums.blackberry.com/t5/Native-Development/How-do-I-invoke-the-Pause-button-functionality-to-pause-media/td-p/2685185

@timwindsor
Copy link
Collaborator Author

Regarding your signals not firing - I think we're missing some critical code here and I don't know where it was lost. These sorts of plugins that use signals and slots leverage the ApplicationThread class to make this all work. The plugin code is run on that ApplicationThread, and that's what makes the signals and slots work - because the ApplicationThread is a QThread with the necessary signal and slot capabilities.

Looking at both MediaKeys and SystemDialog as examples, I see calls to the "join" method on initialization, passing in the Window Group so that the ApplicationThread runs in the same group as the app itself, but I don't see that in this plugin: https://github.com/parker-mar/WebWorks-Community-APIs/blob/master/BB10-Cordova/MediaKeys/plugin/src/blackberry10/index.js#L92-L104

Secondly, the method calls need to go from the plugin's thread to the ApplicationThread where the NowPlayingNDK code was moved. MediaKeys and SystemDialog do that like the following, but I see direct method calls in this plugin:
https://github.com/parker-mar/WebWorks-Community-APIs/blob/master/BB10-Cordova/MediaKeys/plugin/src/blackberry10/native/src/mediaKeys_js.cpp#L102-L103

I think we need to check out these differences in order to make the communication work properly. There could be other things that I missed too.
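
Following the MediaKeys example, the cross-thread call would look roughly like this (the receiver object and method name are placeholders, not the plugin's actual ones):

// Queue the call onto the NowPlayingNDK object living on the
// ApplicationThread, instead of calling its method directly:
QMetaObject::invokeMethod(nowPlayingNDK,
                          "play",               // an invokable method on that object
                          Qt::QueuedConnection, // executes on the receiver's thread
                          Q_ARG(QString, callbackId));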

@parker-mar
Copy link
Collaborator

Yes! This worked perfectly for the volume overlay.

I am still trying to figure out how notifications to the media notification area through signals and slots will work for preemption; it doesn't work yet. I am testing this by using the mic button or playing another video, then returning to the NowPlaying app and expecting the play/pause callbacks to fire (but they don't). I've looked at almost all the other plugins, but haven't found anything yet. Do you have any advice here, Tim?

@timwindsor
Copy link
Collaborator Author

I don't see you handling the revoked signal at all - that's what you should be getting when another audio source takes over. When you get that signal, you can pause the MediaPlayer.
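
A sketch of that wiring (slot name assumed):

QObject::connect(npc, SIGNAL(revoked()), this, SLOT(revokedSlot()));
// revokedSlot() would pause (or stop) the MediaPlayer.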

@parker-mar
Copy link
Collaborator

Right, okay I modified this and preemption works now.

Just some personal notes:
Preemption protocol described here - https://developer.blackberry.com/native/documentation/graphics_multimedia/audio_video/accessing_media_notification_areas.html

Cases:

  • Just requested
    • preempt by high: wasn't acquired yet, so nothing happens.
    • preempt by low: wasn't acquired yet, so nothing happens.
  • Stopped
    • preempt by high: since it was stopped, the system doesn't play it, and it remains stopped. It stays acquired but inactive (the volume overlay is not shown while in the preempting app).
    • preempt by low: revoked should stop it. The overlay is gone, so the media notification area, preemption, and buttons are already cut off from notifications. The developer is forced to call play() to reset the overlay.
  • Paused
    • preempt by high: since it was paused, the system doesn't play it, and it remains paused. It stays acquired but inactive (the volume overlay is not shown while in the preempting app).
    • preempt by low: revoked should stop it. As with Stopped, the overlay is gone and the developer is forced to call play() to reset the overlay.
  • Playing
    • preempt by high: the system automatically calls pause, then automatically calls play. It stays acquired but inactive (the volume overlay is not shown while in the preempting app).
    • preempt by low: revoked should stop it. As with Stopped, the overlay is gone and the developer is forced to call play() to reset the overlay.

If we were revoked while playing or paused, we return from preemption paused; if we wanted to continue from that point, we would need everything play() does except the initial stop(). Such a method is too complicated, since it does some of play() and some of resume() (and even though the MediaPlayer is still set up with the music, the icon and metadata are not, so they wouldn't be synced). Simplify by just always stopping - see the sketch below.
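
Under that simplification, the revoked handling reduces to stopping outright. A minimal sketch, assuming the plugin's MediaPlayer member mp and a revokedSlot() connected to the revoked() signal (names assumed):

void NowPlayingNDK::revokedSlot()
{
    // Always stop on revoke; the developer calls play() again to
    // re-acquire the connection and reset the overlay.
    mp->stop();
}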

@parker-mar
Copy link
Collaborator

Notes and corresponding slides for the presentation I made to UCOSP mentors and students:
CSC494 Presentation.docx
pics.pdf

Blog post summarizing the work as the term comes to a close: google drive link

Plugin:
https://github.com/blackberry/WebWorks-Community-APIs/tree/master/BB10-Cordova/NowPlaying

Sample app using plugin:
https://github.com/blackberry/BB10-WebWorks-Community-Samples/tree/master/NowPlaying
