- Installation
- Examples
- Usage
- iOS App Transport Security
- Audio Mixing
- Android Expansion File Usage
- Updating
- Contributing
Using npm:
npm install --save react-native-video
or using yarn:
yarn add react-native-video
Then follow the instructions for your platform to link react-native-video into your project:
iOS details
React Native 0.60 and above
Run npx pod-install. Linking is not required in React Native 0.60 and above.
React Native 0.59 and below
Run react-native link react-native-video to link the react-native-video library.
Add use_frameworks! :linkage => :static just under platform :ios in your iOS project Podfile.
See the example ios project for reference
Setup your Podfile like it is described in the react-native documentation.
Depending on your requirements you have to choose between the two possible subpodspecs:
Video only:
pod 'Folly', :podspec => '../node_modules/react-native/third-party-podspecs/Folly.podspec'
+ `pod 'react-native-video', :path => '../node_modules/react-native-video/react-native-video.podspec'`
end
Video with caching (more info):
pod 'Folly', :podspec => '../node_modules/react-native/third-party-podspecs/Folly.podspec'
+ `pod 'react-native-video/VideoCaching', :path => '../node_modules/react-native-video/react-native-video.podspec'`
end
tvOS details
react-native link react-native-video doesn't work properly with the tvOS target, so we need to add the library manually.
First select your project in Xcode.
After that, select the tvOS target of your application and select « General » tab
Scroll to « Linked Frameworks and Libraries » and tap on the + button
Select RCTVideo-tvOS
Android details
Linking is not required in React Native 0.60 and above.
If your project is using React Native < 0.60, run react-native link react-native-video to link the react-native-video library.
Or if you have trouble, make the following additions to the given files manually:
Add the player source to your build configuration in android/settings.gradle:
include ':react-native-video'
project(':react-native-video').projectDir = new File(rootProject.projectDir, '../node_modules/react-native-video/android')
From version 5.0.0, you have to apply these changes in android/app/build.gradle:
dependencies {
...
compile project(':react-native-video')
+ implementation "androidx.appcompat:appcompat:1.0.0"
- implementation "com.android.support:appcompat-v7:${rootProject.ext.supportLibVersion}"
}
Migrating to AndroidX (needs version >= 5.0.0), in android/gradle.properties:
android.useAndroidX=true
android.enableJetifier=true
If your project uses com.facebook.react.PackageList to auto-import native dependencies, no updates are required here; see the Android example project (/examples/basic/android/app/src/main/java/com/videoplayer/MainApplication.java) for more details. Otherwise, update MainApplication.java as follows.
On top, where imports are:
import com.brentvatne.react.ReactVideoPackage;
Add the ReactVideoPackage
class to your list of exported packages.
@Override
protected List<ReactPackage> getPackages() {
return Arrays.asList(
new MainReactPackage(),
new ReactVideoPackage()
);
}
To enable client-side ads insertion (CSAI) with the Google IMA SDK, you need to enable it in your Gradle file:
buildscript {
ext {
...
RNVUseExoplayerIMA = true
...
}
}
Windows RNW C++/WinRT details
React Native Windows 0.63 and above
Autolinking should automatically add react-native-video to your app.
React Native Windows 0.62
Make the following additions to the given files manually:
Add the ReactNativeVideoCPP project to your solution (e.g. windows\myapp.sln):
- Open your solution in Visual Studio 2019
- Right-click Solution icon in Solution Explorer > Add > Existing Project...
- Select
node_modules\react-native-video\windows\ReactNativeVideoCPP\ReactNativeVideoCPP.vcxproj
Add a reference to ReactNativeVideoCPP to your main application project (e.g. windows\myapp\myapp.vcxproj):
- Open your solution in Visual Studio 2019
- Right-click main application project > Add > Reference...
- Check ReactNativeVideoCPP from Solution Projects
Add #include "winrt/ReactNativeVideoCPP.h".
Add PackageProviders().Append(winrt::ReactNativeVideoCPP::ReactPackageProvider()); before InitializeComponent();.
React Native Windows 0.61 and below
Follow the manual linking instructions for React Native Windows 0.62 above, but substitute ReactNativeVideoCPP61 for ReactNativeVideoCPP.
Run yarn xbasic install
in the root directory before running any of the examples.
yarn xbasic ios
yarn xbasic android
yarn xbasic windows
// Load the module
import Video from 'react-native-video';
// Within your render function, assuming you have a file called
// "background.mp4" in your project. You can include multiple videos
// on a single screen if you like.
<Video source={{uri: "background"}} // Can be a URL or a local file.
ref={(ref) => {
this.player = ref
}} // Store reference
onBuffer={this.onBuffer} // Callback when remote video is buffering
onError={this.videoError} // Callback when video cannot be loaded
style={styles.backgroundVideo} />
// Later on in your styles..
var styles = StyleSheet.create({
backgroundVideo: {
position: 'absolute',
top: 0,
left: 0,
bottom: 0,
right: 0,
},
});
Events:
Name | Platforms Support |
---|---|
onAudioBecomingNoisy | Android, iOS |
onAudioTracks | Android |
onBandwidthUpdate | Android |
onBuffer | Android, iOS |
onEnd | All |
onError | Android, iOS |
onExternalPlaybackChange | iOS |
onFullscreenPlayerWillPresent | Android, iOS |
onFullscreenPlayerDidPresent | Android, iOS |
onFullscreenPlayerWillDismiss | Android, iOS |
onFullscreenPlayerDidDismiss | Android, iOS |
onLoad | All |
onLoadStart | All |
onPictureInPictureStatusChanged | iOS |
onPlaybackRateChange | All |
onProgress | All |
onReadyForDisplay | Android, iOS, Web |
onReceiveAdEvent | Android, iOS |
onRestoreUserInterfaceForPictureInPictureStop | iOS |
onSeek | Android, iOS, Windows UWP |
onTimedMetadata | Android, iOS |
onTextTracks | Android |
onVideoTracks | Android |
Methods:
Name | Platforms Support |
---|---|
dismissFullscreenPlayer | Android, iOS |
presentFullscreenPlayer | Android, iOS |
save | iOS |
restoreUserInterfaceForPictureInPictureStop | iOS |
seek | All |
Static methods:
Name | Platforms Support |
---|---|
getWidevineLevel | Android |
isCodecSupported | Android |
isHEVCSupported | Android |
Sets the VAST uri to play AVOD ads.
Example:
adTagUrl="https://pubads.g.doubleclick.net/gampad/ads?iu=/21775744923/external/vmap_ad_samples&sz=640x480&cust_params=sample_ar%3Dpremidpostoptimizedpodbumper&ciu_szs=300x250&gdfp_req=1&ad_rule=1&output=vmap&unviewed_position_start=1&env=vp&impl=s&cmsid=496&vid=short_onecue&correlator="
Note: On Android, you need to enable the IMA SDK in your Gradle file; see Enable client side ads insertion above.
Platforms: Android, iOS
Indicates whether the player allows switching to external playback mode such as AirPlay or HDMI.
- true (default) - allow switching to external playback mode
- false - Don't allow switching to external playback mode
Platforms: iOS
Indicates whether the player should only play the audio track and show the poster instead of displaying the video track.
- false (default) - Display the video as normal
- true - Show the poster and play the audio
For this to work, the poster prop must be set.
Platforms: all
A Boolean value that indicates whether the player should automatically delay playback in order to minimize stalling. Applies to clients linked against iOS 10.0 and later.
- false - Immediately starts playback
- true (default) - Delays playback in order to minimize stalling
Platforms: iOS
The number of milliseconds of buffer to keep before the current position. This allows rewinding without rebuffering within that duration.
Platforms: Android
Adjust the buffer settings. This prop takes an object with one or more of the properties listed below.
Property | Type | Description |
---|---|---|
minBufferMs | number | The default minimum duration of media that the player will attempt to ensure is buffered at all times, in milliseconds. |
maxBufferMs | number | The default maximum duration of media that the player will attempt to buffer, in milliseconds. |
bufferForPlaybackMs | number | The default duration of media that must be buffered for playback to start or resume following a user action such as a seek, in milliseconds. |
bufferForPlaybackAfterRebufferMs | number | The default duration of media that must be buffered for playback to resume after a rebuffer, in milliseconds. A rebuffer is defined to be caused by buffer depletion rather than a user action. |
maxHeapAllocationPercent | number | The percentage of available heap that the video can use to buffer, between 0 and 1 |
minBackBufferMemoryReservePercent | number | The percentage of available app memory at which during startup the back buffer will be disabled, between 0 and 1 |
minBufferMemoryReservePercent | number | The percentage of available app memory to keep in reserve that prevents buffer from using it, between 0 and 1 |
This prop should only be set when you are setting the source; changing it after the media is loaded will cause it to be reloaded.
Example with default values:
bufferConfig={{
minBufferMs: 15000,
maxBufferMs: 50000,
bufferForPlaybackMs: 2500,
bufferForPlaybackAfterRebufferMs: 5000
}}
Platforms: Android
When playing an HLS live stream with an EXT-X-PROGRAM-DATE-TIME tag configured, this property will contain the epoch value in milliseconds.
Platforms: Android, iOS
Determines whether to show player controls.
- false (default) - Don't show player controls
- true - Show player controls
Note: on iOS, controls are always shown when in fullscreen mode. Note: on Android, native controls are available by default. If needed, you can also add your own controls or use a package like react-native-video-controls or react-native-media-console; see Useful Side Project.
The start time in ms for SSAI content. This determines at what time to load video info such as resolutions. Use this only when you have an SSAI stream where the ad resolution is not the same as the content resolution.
Platforms: Android, iOS
Determines whether video audio should override background music/audio in Android devices.
- false (default) - Override background audio/music
- true - Let background audio/music from other apps play
Note: Allows multiple videos to play if set to true. If false, when one video is playing and another is started, the first video will be paused.
Platforms: Android
Determines whether the player should throw an error when the connection is lost.
- false (default) - Player will throw an error when connection is lost
- true - Player will keep trying to buffer when the network connection is lost
Platforms: Android
To set up DRM, please follow this guide.
Platforms: Android, iOS
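Purely as a hedged sketch of what a drm configuration can look like (the stream URL, license server, and header below are placeholders, and the exact options depend on your DRM provider; the linked guide is authoritative):

```jsx
import Video, { DRMType } from 'react-native-video';

<Video
  source={{ uri: 'https://example.com/stream.mpd' }} // placeholder stream URL
  drm={{
    type: DRMType.WIDEVINE,                        // or PLAYREADY, CLEARKEY, FAIRPLAY
    licenseServer: 'https://example.com/license',  // placeholder license server
    headers: {
      'X-Custom-Auth': 'some-token',               // optional headers sent with the license request
    },
  }}
/>
```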
Add video filter
- FilterType.NONE (default) - No Filter
- FilterType.INVERT - CIColorInvert
- FilterType.MONOCHROME - CIColorMonochrome
- FilterType.POSTERIZE - CIColorPosterize
- FilterType.FALSE - CIFalseColor
- FilterType.MAXIMUMCOMPONENT - CIMaximumComponent
- FilterType.MINIMUMCOMPONENT - CIMinimumComponent
- FilterType.CHROME - CIPhotoEffectChrome
- FilterType.FADE - CIPhotoEffectFade
- FilterType.INSTANT - CIPhotoEffectInstant
- FilterType.MONO - CIPhotoEffectMono
- FilterType.NOIR - CIPhotoEffectNoir
- FilterType.PROCESS - CIPhotoEffectProcess
- FilterType.TONAL - CIPhotoEffectTonal
- FilterType.TRANSFER - CIPhotoEffectTransfer
- FilterType.SEPIA - CISepiaTone
For more details on these filters refer to the iOS docs.
Notes:
- Using a filter can impact CPU usage. A workaround is to save the video with the filter and then load the saved video.
- Video filter is currently not supported on HLS playlists.
filterEnabled must be set to true.
Platforms: iOS
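For example, a minimal sketch applying the sepia filter (the source URL is a placeholder):

```jsx
import Video, { FilterType } from 'react-native-video';

<Video
  source={{ uri: 'https://example.com/video.mp4' }} // placeholder source
  filterEnabled={true}                              // filters only apply when filterEnabled is true
  filter={FilterType.SEPIA}                         // applies CISepiaTone
/>
```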
Enable video filter.
- false (default) - Don't enable filter
- true - Enable filter
Platforms: iOS
Whether this video view should be focusable with a non-touch input device, e.g. receive focus with a hardware keyboard.
- false - Makes view unfocusable
- true (default) - Makes view focusable
Platforms: Android
Controls whether the player enters fullscreen on play.
- false (default) - Don't display the video in fullscreen
- true - Display the video in fullscreen
Platforms: iOS
If a preferred fullscreenOrientation is set, this causes the video to rotate to that orientation, but permits rotation of the screen to the orientation held by the user. Defaults to true.
Platforms: iOS
- all (default) -
- landscape
- portrait
Platforms: iOS
Pass headers to the HTTP client. Can be used for authorization. Headers must be a part of the source object.
Example:
source={{
uri: "https://www.example.com/video.mp4",
headers: {
Authorization: 'bearer some-token-value',
'X-Custom-Header': 'some value'
}
}}
Platforms: Android
Controls whether the ExoPlayer shutter view (black screen while loading) is enabled.
- false (default) - Show shutter view
- true - Hide shutter view
Platforms: Android
Controls the iOS silent switch behavior
- "inherit" (default) - Use the default AVPlayer behavior
- "ignore" - Play audio even if the silent switch is set
- "obey" - Don't play audio if the silent switch is set
Platforms: iOS
Sets the desired limit, in bits per second, of network bandwidth consumption when multiple video streams are available for a playlist.
Default: 0. Don't limit the maxBitRate.
Example:
maxBitRate={2000000} // 2 megabits
Platforms: Android, iOS
Sets the minimum number of times to retry loading data before failing and reporting an error to the application. Useful to recover from transient internet failures.
Default: 3. Retry 3 times.
Example:
minLoadRetryCount={5} // retry 5 times
Platforms: Android
Controls how audio from this video mixes with audio from other apps.
- "inherit" (default) - Use the default AVPlayer behavior
- "mix" - Audio from this video mixes with audio from other apps.
- "duck" - Reduces the volume of other apps while audio from this video plays.
Platforms: iOS
Controls whether the audio is muted
- false (default) - Don't mute audio
- true - Mute audio
Platforms: all
Controls whether the media is paused
- false (default) - Don't pause the media
- true - Pause the media
Platforms: all
Determines whether the media should be played as picture in picture.
- false (default) - Don't play as picture in picture
- true - Play the media as picture in picture
Platforms: iOS
Determine whether the media should continue playing while the app is in the background. This allows customers to continue listening to the audio.
- false (default) - Don't continue playing the media
- true - Continue playing the media
To use this feature on iOS, you must:
- Enable Background Audio in your Xcode project
- Set the ignoreSilentSwitch prop to "ignore"
Platforms: Android, iOS
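A minimal sketch combining the two iOS requirements listed above (Background Audio must still be enabled in your Xcode project; the source URL is a placeholder):

```jsx
<Video
  source={{ uri: 'https://example.com/episode.mp3' }} // placeholder source
  playInBackground={true}
  ignoreSilentSwitch="ignore" // required on iOS so audio keeps playing in the background
/>
```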
Determine whether the media should continue playing when notifications or the Control Center are in front of the video.
- false (default) - Don't continue playing the media
- true - Continue playing the media
Platforms: iOS
An image to display while the video is loading
Value: string with a URL for the poster, e.g. "https://baconmockup.com/300/200/"
Platforms: all
Determines how to resize the poster image when the frame doesn't match the raw video dimensions.
- "contain" (default) - Scale the image uniformly (maintain the image's aspect ratio) so that both dimensions (width and height) of the image will be equal to or less than the corresponding dimension of the view (minus padding).
- "center" - Center the image in the view along both dimensions. If the image is larger than the view, scale it down uniformly so that it is contained in the view.
- "cover" - Scale the image uniformly (maintain the image's aspect ratio) so that both dimensions (width and height) of the image will be equal to or larger than the corresponding dimension of the view (minus padding).
- "none" - Don't apply resize
- "repeat" - Repeat the image to cover the frame of the view. The image will keep its size and aspect ratio. (iOS only)
- "stretch" - Scale width and height independently, This may change the aspect ratio of the src.
Platforms: all
The duration the player should buffer media from the network ahead of the playhead to guard against playback disruption. Sets the preferredForwardBufferDuration instance property on AVPlayerItem.
Default: 0
Platforms: iOS
Controls whether or not the display should be allowed to sleep while playing the video. Default is not to allow display to sleep.
Default: true
Platforms: iOS, Android
Delay in milliseconds between onProgress events.
Default: 250.0
Platforms: all
Speed at which the media should play.
- 0.0 - Pauses the video
- 1.0 - Play at normal speed
- Other values - Slow down or speed up playback
Platforms: all
Determine whether to repeat the video when the end is reached
- false (default) - Don't repeat the video
- true - Repeat the video
Platforms: all
Callback function that is called when audio tracks change
Payload:
Property | Type | Description |
---|---|---|
index | number | Internal track ID |
title | string | Descriptive name for the track |
language | string | 2 letter ISO 639-1 code representing the language |
bitrate | number | bitrate of track |
type | string | Mime type of track |
selected | boolean | true if track is playing |
Example:
{
audioTracks: [
{ language: 'es', title: 'Spanish', type: 'audio/mpeg', index: 0, selected: true },
{ language: 'en', title: 'English', type: 'audio/mpeg', index: 1 }
],
}
Platforms: Android
Determine whether to generate onBandwidthUpdate events. This is needed due to the high frequency of these events on ExoPlayer.
- false (default) - Don't generate onBandwidthUpdate events
- true - Generate onBandwidthUpdate events
Platforms: Android
Determines how to resize the video when the frame doesn't match the raw video dimensions.
- "none" (default) - Don't apply resize
- "contain" - Scale the video uniformly (maintain the video's aspect ratio) so that both dimensions (width and height) of the video will be equal to or less than the corresponding dimension of the view (minus padding).
- "cover" - Scale the video uniformly (maintain the video's aspect ratio) so that both dimensions (width and height) of the image will be equal to or larger than the corresponding dimension of the view (minus padding).
- "stretch" - Scale width and height independently, This may change the aspect ratio of the src.
Platforms: Android, iOS, Windows UWP
Configure which audio track, if any, is played.
selectedAudioTrack={{
type: Type,
value: Value
}}
Example:
selectedAudioTrack={{
type: "title",
value: "Dubbing"
}}
Type | Value | Description |
---|---|---|
"system" (default) | N/A | Play the audio track that matches the system language. If none match, play the first track. |
"disabled" | N/A | Turn off audio |
"title" | string | Play the audio track with the title specified as the Value, e.g. "French" |
"language" | string | Play the audio track with the language specified as the Value, e.g. "fr" |
"index" | number | Play the audio track with the index specified as the value, e.g. 0 |
If a track matching the specified Type (and Value if appropriate) is unavailable, the first audio track will be played. If multiple tracks match the criteria, the first match will be used.
Platforms: Android, iOS
Configure which text track (caption or subtitle), if any, is shown.
selectedTextTrack={{
type: Type,
value: Value
}}
Example:
selectedTextTrack={{
type: "title",
value: "English Subtitles"
}}
Type | Value | Description |
---|---|---|
"system" (default) | N/A | Display captions only if the system preference for captions is enabled |
"disabled" | N/A | Don't display a text track |
"title" | string | Display the text track with the title specified as the Value, e.g. "French 1" |
"language" | string | Display the text track with the language specified as the Value, e.g. "fr" |
"index" | number | Display the text track with the index specified as the value, e.g. 0 |
Both iOS & Android (only 4.4 and higher) offer Settings to enable Captions for hearing impaired people. If "system" is selected and the Captions Setting is enabled, iOS/Android will look for a caption that matches that customer's language and display it.
If a track matching the specified Type (and Value if appropriate) is unavailable, no text track will be displayed. If multiple tracks match the criteria, the first match will be used.
Platforms: Android, iOS
Configure which video track should be played. By default, the player uses Adaptive Bitrate Streaming to automatically select the stream it thinks will perform best based on available bandwidth.
selectedVideoTrack={{
type: Type,
value: Value
}}
Example:
selectedVideoTrack={{
type: "resolution",
value: 480
}}
Type | Value | Description |
---|---|---|
"auto" (default) | N/A | Let the player determine which track to play using ABR |
"disabled" | N/A | Turn off video |
"resolution" | number | Play the video track with the height specified, e.g. 480 for the 480p stream |
"index" | number | Play the video track with the index specified as the value, e.g. 0 |
If a track matching the specified Type (and Value if appropriate) is unavailable, ABR will be used.
Platforms: Android
Sets the media source. You can pass an asset loaded via require or an object with a uri.
Setting the source will trigger the player to attempt to load the provided media with all other given props. Please be sure that all props are provided before/at the same time as setting the source.
Rendering the player component with a null source will init the player, and start playing once a source value is provided.
Providing a null source value after loading a previous source will stop playback, and clear out the previous source content.
The docs for this prop are incomplete and will be updated as each option is investigated and tested.
Example:
const sintel = require('./sintel.mp4');
source={sintel}
A number of URI schemes are supported by passing an object with a uri attribute.
Example:
source={{uri: 'https://www.sample-videos.com/video/mp4/720/big_buck_bunny_720p_10mb.mp4' }}
Platforms: all
Example:
source={{ uri: 'file:///sdcard/Movies/sintel.mp4' }}
Note: Your app will need to request permission to read external storage if you're accessing a file outside your app.
Platforms: Android, possibly others
Path to a sound file in your iTunes library. Typically shared from iTunes to your app.
Example:
source={{ uri: 'ipod-library:///path/to/music.mp3' }}
Note: Using this feature requires adding an entry for NSAppleMusicUsageDescription to your Info.plist file, as described here.
Platforms: iOS
Provide a member type with value (mpd/m3u8/ism) inside the source object. This is sometimes needed when the URL extension does not match the mime type you are expecting, as seen in the next example (the extension is .ism, i.e. Smooth Streaming, but the file served is in MPEG-DASH format, mpd).
Example:
source={{ uri: 'http://host-serving-a-type-different-than-the-extension.ism/manifest(format=mpd-time-csf)',
type: 'mpd' }}
The following other types are supported on some platforms, but aren't fully documented yet:
content://, ms-appx://, ms-appdata://, assets-library://
Property | Description | Platforms |
---|---|---|
fontSizeTrack | Adjust the font size of the subtitles. Default: font size of the device | Android |
paddingTop | Adjust the top padding of the subtitles. Default: 0 | Android |
paddingBottom | Adjust the bottom padding of the subtitles. Default: 0 | Android |
paddingLeft | Adjust the left padding of the subtitles. Default: 0 | Android |
paddingRight | Adjust the right padding of the subtitles. Default: 0 | Android |
Example:
subtitleStyle={{ paddingBottom: 50, fontSize: 20 }}
Load one or more "sidecar" text tracks. This takes an array of objects representing each track. Each object should have the format:
Property | Description |
---|---|
title | Descriptive name for the track |
language | 2 letter ISO 639-1 code representing the language |
type | Mime type of the track * TextTrackType.SRT - SubRip (.srt) * TextTrackType.TTML - TTML (.ttml) * TextTrackType.VTT - WebVTT (.vtt) iOS only supports VTT, Android supports all 3 |
uri | URL for the text track. Currently, only tracks hosted on a webserver are supported |
On iOS, sidecar text tracks are only supported for individual files, not HLS playlists. For HLS, you should include the text tracks as part of the playlist.
Note: Due to iOS limitations, sidecar text tracks are not compatible with Airplay. If textTracks are specified, AirPlay support will be automatically disabled.
Example:
import Video, { TextTrackType } from 'react-native-video';
textTracks={[
{
title: "English CC",
language: "en",
type: TextTrackType.VTT, // "text/vtt"
uri: "https://bitdash-a.akamaihd.net/content/sintel/subtitles/subtitles_en.vtt"
},
{
title: "Spanish Subtitles",
language: "es",
type: TextTrackType.SRT, // "application/x-subrip"
uri: "https://durian.blender.org/wp-content/content/subtitles/sintel_es.srt"
}
]}
Platforms: Android, iOS
Configure an identifier for the video stream to link the playback context to the events emitted.
Platforms: Android
Controls whether to output to a TextureView or SurfaceView.
SurfaceView is more efficient and provides better performance but has two limitations:
- It can't be animated, transformed or scaled
- You can't overlay multiple SurfaceViews
useTextureView can only be set at the same time you're setting the source.
- true (default) - Use a TextureView
- false - Use a SurfaceView
Platforms: Android
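For example, a sketch of opting into SurfaceView, set together with the source (placeholder URL):

```jsx
<Video
  source={{ uri: 'https://example.com/stream.m3u8' }} // placeholder source
  useTextureView={false} // use a SurfaceView; must be set at the same time as the source
/>
```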
Forces the output to a SurfaceView and enables the secure surface.
This will override the useTextureView flag.
SurfaceView is the only view that can be labeled as secure.
- true - Use security
- false (default) - Do not use security
Platforms: Android
Adjust the volume.
- 1.0 (default) - Play at full volume
- 0.0 - Mute the audio
- Other values - Reduce volume
Platforms: all
Sets the URL scheme for the stream encryption key for local assets.
Type: String
Example:
localSourceEncryptionKeyScheme="my-offline-key"
Platforms: iOS
Callback function that is called when the audio is about to become 'noisy' due to a change in audio outputs. Typically this is called when audio output is being switched from an external source like headphones back to the internal speaker. It's a good idea to pause the media when this happens so the speaker doesn't start blasting sound.
Payload: none
Platforms: Android, iOS
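A sketch of the pause-on-noisy pattern described above, assuming the paused prop is driven by component state (the source URL is a placeholder):

```jsx
<Video
  source={{ uri: 'https://example.com/video.mp4' }} // placeholder source
  paused={this.state.paused}
  onAudioBecomingNoisy={() => this.setState({ paused: true })} // pause when headphones are unplugged
/>
```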
Callback function that is called when the available bandwidth changes.
Payload:
Property | Type | Description |
---|---|---|
bitrate | number | The estimated bitrate in bits/sec |
Example:
{
bitrate: 1000000
}
Note: On Android, you must set the reportBandwidth prop to enable this event. This is due to the high volume of events generated.
Platforms: Android
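A sketch showing the callback together with the reportBandwidth prop it requires on Android (placeholder source URL):

```jsx
<Video
  source={{ uri: 'https://example.com/stream.m3u8' }} // placeholder source
  reportBandwidth={true} // required on Android for onBandwidthUpdate to fire
  onBandwidthUpdate={({ bitrate }) => console.log(`Estimated bandwidth: ${bitrate} bits/sec`)}
/>
```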
Callback function that is called when the player buffers.
Payload:
Property | Type | Description |
---|---|---|
isBuffering | boolean | Boolean indicating whether buffering is active |
Example:
{
isBuffering: true
}
Platforms: Android, iOS
Callback function that is called when the player reaches the end of the media.
Payload: none
Platforms: all
Callback function that is called when the player experiences a playback error.
Payload:
Property | Type | Description |
---|---|---|
error | object | Object containing properties with information about the error |
Platforms: all
Callback function that is called when external playback mode for the currently playing video has changed. Mostly useful when connecting/disconnecting to an Apple TV; it's called on connection/disconnection.
Payload:
Property | Type | Description |
---|---|---|
isExternalPlaybackActive | boolean | Boolean indicating whether external playback mode is active |
Example:
{
isExternalPlaybackActive: true
}
Platforms: iOS
Callback function that is called when the player is about to enter fullscreen mode.
Payload: none
Platforms: Android, iOS
Callback function that is called when the player has entered fullscreen mode.
Payload: none
Platforms: Android, iOS
Callback function that is called when the player is about to exit fullscreen mode.
Payload: none
Platforms: Android, iOS
Callback function that is called when the player has exited fullscreen mode.
Payload: none
Platforms: Android, iOS
Callback function that is called when the media is loaded and ready to play.
Payload:
Property | Type | Description |
---|---|---|
currentTime | number | Time in seconds where the media will start |
duration | number | Length of the media in seconds |
naturalSize | object | Properties: * width - Width in pixels that the video was encoded at * height - Height in pixels that the video was encoded at * orientation - "portrait" or "landscape" |
audioTracks | array | An array of audio track info objects with the following properties: * index - Index number * title - Description of the track * language - 2 letter ISO 639-1 or 3 letter ISO639-2 language code * type - Mime type of track |
textTracks | array | An array of text track info objects with the following properties: * index - Index number * title - Description of the track * language - 2 letter ISO 639-1 or 3 letter ISO 639-2 language code * type - Mime type of track |
videoTracks | array | An array of video track info objects with the following properties: * trackId - ID for the track * bitrate - Bit rate in bits per second * codecs - Comma separated list of codecs * height - Height of the video * width - Width of the video |
Example:
{
canPlaySlowForward: true,
canPlayReverse: false,
canPlaySlowReverse: false,
canPlayFastForward: false,
canStepForward: false,
canStepBackward: false,
currentTime: 0,
duration: 5910.208984375,
naturalSize: {
height: 1080,
orientation: 'landscape',
width: 1920
},
audioTracks: [
{ language: 'es', title: 'Spanish', type: 'audio/mpeg', index: 0 },
{ language: 'en', title: 'English', type: 'audio/mpeg', index: 1 }
],
textTracks: [
{ title: '#1 French', language: 'fr', index: 0, type: 'text/vtt' },
{ title: '#2 English CC', language: 'en', index: 1, type: 'text/vtt' },
{ title: '#3 English Director Commentary', language: 'en', index: 2, type: 'text/vtt' }
],
videoTracks: [
{ bitrate: 3987904, codecs: "avc1.640028", height: 720, trackId: "f1-v1-x3", width: 1280 },
{ bitrate: 7981888, codecs: "avc1.640028", height: 1080, trackId: "f2-v1-x3", width: 1920 },
{ bitrate: 1994979, codecs: "avc1.4d401f", height: 480, trackId: "f3-v1-x3", width: 848 }
]
}
Platforms: all
Callback function that is called when the media starts loading.
Payload:
Property | Type |
---|---|
isNetwork | boolean |
type | string |
uri | string |
Example:
{
isNetwork: true,
type: '',
uri: 'https://bitdash-a.akamaihd.net/content/sintel/hls/playlist.m3u8'
}
Platforms: all
Callback function that is called when the playback state changes.
Payload:
Property | Type |
---|---|
isPlaying | boolean |
Example:
{
isPlaying: true,
}
Platforms: Android
Callback function that is called when picture in picture becomes active or inactive.
Property | Type | Description |
---|---|---|
isActive | boolean | Boolean indicating whether picture in picture is active |
Example:
{
isActive: true
}
Platforms: iOS
Callback function that is called when the rate of playback changes - either paused or starts/resumes.
Property | Type | Description |
---|---|---|
playbackRate | number | 0 when playback is paused, 1 when playing at normal speed. Other values when playback is slowed down or sped up |
Example:
{
playbackRate: 0, // indicates paused
}
Platforms: all
Callback function that is called every progressUpdateInterval milliseconds with info about which position the media is currently playing.
Property | Type | Description |
---|---|---|
currentTime | number | Current position in seconds |
playableDuration | number | Position to where the media can be played to using just the buffer in seconds |
seekableDuration | number | Position to where the media can be seeked to in seconds. Typically, the total length of the media |
Example:
{
currentTime: 5.2,
playableDuration: 34.6,
seekableDuration: 888
}
Platforms: all
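A sketch of a typical handler that derives a progress fraction from the payload above (the source URL and state handling are illustrative):

```jsx
<Video
  source={{ uri: 'https://example.com/video.mp4' }} // placeholder source
  onProgress={({ currentTime, seekableDuration }) => {
    // e.g. drive a progress bar; guard against a zero duration while the stream is still loading
    const progress = seekableDuration > 0 ? currentTime / seekableDuration : 0;
    this.setState({ progress });
  }}
/>
```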
Callback function that is called when the first video frame is ready for display. This is when the poster is removed.
Payload: none
- iOS: readyForDisplay
- Android STATE_READY
Platforms: Android, iOS, Web
Callback function that is called when an AdEvent is received from the IMA SDK.
Possible values of the AdEvent enum for Android and iOS:
Event | Platform | Description |
---|---|---|
AD_BREAK_ENDED | iOS | Fired the first time each ad break ends. Applications must reenable seeking when this occurs (only used for dynamic ad insertion). |
AD_BREAK_READY | Android, iOS | Fires when an ad rule or a VMAP ad break would have played if autoPlayAdBreaks is false. |
AD_BREAK_STARTED | iOS | Fired first time each ad break begins playback. If an ad break is watched subsequent times this will not be fired. Applications must disable seeking when this occurs (only used for dynamic ad insertion). |
AD_BUFFERING | Android | Fires when the ad has stalled playback to buffer. |
AD_CAN_PLAY | Android | Fires when the ad is ready to play without buffering, either at the beginning of the ad or after buffering completes. |
AD_METADATA | Android | Fires when an ads list is loaded. |
AD_PERIOD_ENDED | iOS | Fired every time the stream switches from advertising or slate to content. This will be fired even when an ad is played a second time or when seeking into an ad (only used for dynamic ad insertion). |
AD_PERIOD_STARTED | iOS | Fired every time the stream switches from content to advertising or slate. This will be fired even when an ad is played a second time or when seeking into an ad (only used for dynamic ad insertion). |
AD_PROGRESS | Android | Fires when the ad's current time value changes. Calling getAdData() on this event will return an AdProgressData object. |
ALL_ADS_COMPLETED | Android, iOS | Fires when the ads manager is done playing all the valid ads in the ads response, or when the response doesn't return any valid ads. |
CLICK | Android, iOS | Fires when the ad is clicked. |
COMPLETE | Android, iOS | Fires when the ad completes playing. |
CONTENT_PAUSE_REQUESTED | Android | Fires when content should be paused. This usually happens right before an ad is about to cover the content. |
CONTENT_RESUME_REQUESTED | Android | Fires when content should be resumed. This usually happens when an ad finishes or collapses. |
CUEPOINTS_CHANGED | iOS | Cuepoints changed for VOD stream (only used for dynamic ad insertion). |
DURATION_CHANGE | Android | Fires when the ad's duration changes. |
FIRST_QUARTILE | Android, iOS | Fires when the ad playhead crosses first quartile. |
IMPRESSION | Android | Fires when the impression URL has been pinged. |
INTERACTION | Android | Fires when an ad triggers the interaction callback. Ad interactions contain an interaction ID string in the ad data. |
LINEAR_CHANGED | Android | Fires when the displayed ad changes from linear to nonlinear, or the reverse. |
LOADED | Android, iOS | Fires when ad data is available. |
LOG | Android, iOS | Fires when a non-fatal error is encountered. The user need not take any action since the SDK will continue with the same or next ad playback depending on the error situation. |
MIDPOINT | Android, iOS | Fires when the ad playhead crosses midpoint. |
PAUSED | Android, iOS | Fires when the ad is paused. |
RESUMED | Android, iOS | Fires when the ad is resumed. |
SKIPPABLE_STATE_CHANGED | Android | Fires when the displayed ad's skippable state is changed. |
SKIPPED | Android, iOS | Fires when the ad is skipped by the user. |
STARTED | Android, iOS | Fires when the ad starts playing. |
STREAM_LOADED | iOS | Stream request has loaded (only used for dynamic ad insertion). |
TAPPED | iOS | Fires when the ad is tapped. |
THIRD_QUARTILE | Android, iOS | Fires when the ad playhead crosses third quartile. |
UNKNOWN | iOS | An unknown event has fired. |
USER_CLOSE | Android | Fires when the ad is closed by the user. |
VIDEO_CLICKED | Android | Fires when the non-clickthrough portion of a video ad is clicked. |
VIDEO_ICON_CLICKED | Android | Fires when a user clicks a video icon. |
VOLUME_CHANGED | Android | Fires when the ad volume has changed. |
VOLUME_MUTED | Android | Fires when the ad volume has been muted. |
Payload:
Property | Type | Description |
---|---|---|
event | AdEvent | The ad event received |
Example:
{
"event": "LOADED"
}
Platforms: Android, iOS
Callback function that corresponds to Apple's restoreUserInterfaceForPictureInPictureStopWithCompletionHandler. Call restoreUserInterfaceForPictureInPictureStopCompleted inside of this function when done restoring the user interface.
Payload: none
Platforms: iOS
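A sketch of how this callback pairs with the restoreUserInterfaceForPictureInPictureStopCompleted method documented below (assuming this.player holds the Video ref and a placeholder source):

```jsx
<Video
  source={{ uri: 'https://example.com/video.mp4' }} // placeholder source
  ref={(ref) => { this.player = ref; }}
  onRestoreUserInterfaceForPictureInPictureStop={() => {
    // Restore your UI here, then signal that restoration is complete
    this.player.restoreUserInterfaceForPictureInPictureStopCompleted(true);
  }}
/>
```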
Callback function that is called when a seek completes.
Payload:
Property | Type | Description |
---|---|---|
currentTime | number | The current time after the seek |
seekTime | number | The requested time |
Example:
{
currentTime: 100.5,
seekTime: 100
}
Both the currentTime & seekTime are reported because the video player may not seek to the exact requested position in order to improve seek performance.
Platforms: Android, iOS, Windows UWP
Callback function that is called when timed metadata becomes available
Payload:
Property | Type | Description |
---|---|---|
metadata | array | Array of metadata objects |
Example:
{
metadata: [
{ value: 'Streaming Encoder', identifier: 'TRSN' },
{ value: 'Internet Stream', identifier: 'TRSO' },
{ value: 'Any Time You Like', identifier: 'TIT2' }
]
}
Platforms: Android, iOS
Callback function that is called when text tracks change
Payload:
Property | Type | Description |
---|---|---|
index | number | Internal track ID |
title | string | Descriptive name for the track |
language | string | 2 letter ISO 639-1 code representing the language |
type | string | Mime type of the track * TextTrackType.SRT - SubRip (.srt) * TextTrackType.TTML - TTML (.ttml) * TextTrackType.VTT - WebVTT (.vtt) iOS only supports VTT, Android supports all 3 |
selected | boolean | true if track is playing |
Example:
{
textTracks: [
{
index: 0,
title: 'Any Time You Like',
type: 'srt',
selected: true
}
]
}
Platforms: Android
Callback function that is called when video tracks change
Payload:
Property | Type | Description |
---|---|---|
trackId | number | Internal track ID |
codecs | string | MimeType of codec used for this track |
width | number | Track width |
height | number | Track height |
bitrate | number | Bitrate in bps |
selected | boolean | true if track is selected for playing |
Example:
{
videoTracks: [
{
trackId: 0,
codecs: 'video/mp4',
width: 1920,
height: 1080,
bitrate: 10000,
selected: true
}
]
}
Platforms: Android
Methods operate on a ref to the Video element. You can create a ref using code like:
return (
<Video source={...}
ref={ref => (this.player = ref)} />
);
dismissFullscreenPlayer()
Take the player out of fullscreen mode.
Example:
this.player.dismissFullscreenPlayer();
Platforms: Android, iOS
presentFullscreenPlayer()
Put the player in fullscreen mode.
On iOS, this displays the video in a fullscreen view controller with controls.
On Android, this puts the navigation controls in fullscreen mode. It is not a complete fullscreen implementation, so you will still need to apply a style that makes the width and height match your screen dimensions to get a fullscreen video.
Example:
this.player.presentFullscreenPlayer();
Platforms: Android, iOS
save(): Promise
Save video to your Photos with current filter prop. Returns promise.
Example:
let response = await this.player.save();
let path = response.uri;
Notes:
- Currently only supports highest quality export
- Currently only supports MP4 export
- Currently only supports exporting to user's cache directory with a generated UUID filename.
- User will need to remove the saved video through their Photos app
- Works with cached videos as well (check out the video-caching example)
- If the video has not begun buffering (e.g. there is no internet connection), the save function will throw an error.
- If the video is buffering then the save function promise will return after the video has finished buffering and processing.
Future:
- Will support multiple qualities through options
- Will support more formats in the future through options
- Will support custom directory and file name through options
Platforms: iOS
restoreUserInterfaceForPictureInPictureStopCompleted(restored)
This function corresponds to the completion handler in Apple's restoreUserInterfaceForPictureInPictureStop. IMPORTANT: This function must be called after onRestoreUserInterfaceForPictureInPictureStop is called.
Example:
this.player.restoreUserInterfaceForPictureInPictureStopCompleted(true);
Platforms: iOS
seek(seconds)
Seek to the specified position represented by seconds. seconds is a float value.
seek() can only be called after the onLoad event has fired. Once completed, the onSeek event will be called.
Example:
this.player.seek(200); // Seek to 3 minutes, 20 seconds
Platforms: all
By default iOS seeks within 100 milliseconds of the target position. If you need more accuracy, you can use the seek with tolerance method:
seek(seconds, tolerance)
tolerance is the max distance in milliseconds from the seconds position that's allowed. Using a more exact tolerance can cause seeks to take longer. If you want to seek exactly, set tolerance to 0.
Example:
this.player.seek(120, 50); // Seek to 2 minutes with +/- 50 milliseconds accuracy
Platforms: iOS
A module embedded in react-native-video allows querying device-supported features. To use it, include the module as follows:
import { VideoDecoderProperties } from 'react-native-video'
Platforms: Android
Indicates the Widevine level supported by the device.
Possible results:
- 0 - unable to determine widevine support (typically not supported)
- 1, 2, 3 - Widevine level supported
Platforms: Android
Example:
VideoDecoderProperties.getWidevineLevel().then((widevineLevel) => {
...
});
Indicates whether the provided codec is supported by the device.
parameters:
- mimetype: mime type of codec to query
- width, height: resolution to query
Possible results:
- true - codec supported
- false - codec is not supported
Example:
VideoDecoderProperties.isCodecSupported('video/avc', 1920, 1080).then((isSupported) => {
...
});
Platforms: Android
Helper which indicates whether HEVC at 1920x1080 is supported by the device. It uses isCodecSupported internally.
Example:
VideoDecoderProperties.isHEVCSupported().then((hevcSupported) => {
...
});
- By default, iOS will only load encrypted (https) URLs. If you want to load content from an unencrypted (http) source, you will need to modify your Info.plist file and add the following entry:
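A typical entry allowing arbitrary (http) loads looks like the following; prefer scoping exceptions to specific domains where you can:

```xml
<key>NSAppTransportSecurity</key>
<dict>
  <key>NSAllowsArbitraryLoads</key>
  <true/>
</dict>
```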
For more detailed info check this article
At some point in the future, react-native-video will include an Audio Manager for configuring how videos mix with other apps playing sounds on the device.
On iOS, if you would like to allow other apps to play music over your video component, make the following change:
AppDelegate.m
#import <AVFoundation/AVFoundation.h> // import
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
...
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryAmbient error:nil]; // allow
...
}
You can also use the ignoreSilentSwitch prop.
Expansion files allow you to ship assets that exceed the 100MB apk size limit and don't need to be updated each time you push an app update.
This only supports mp4 files and they must not be compressed. Example command line for preventing compression:
zip -r -n .mp4 *.mp4 player.video.example.com
// Within your render function, assuming you have a file called
// "background.mp4" in your expansion file. Just add your main and (if applicable) patch version
<Video source={{uri: "background", mainVer: 1, patchVer: 0}} // Looks for .mp4 file (background.mp4) in the given expansion version.
resizeMode="cover" // Fill the whole screen at aspect ratio.
style={styles.backgroundVideo} />
The asset system introduced in RN 0.14 allows loading image resources shared across iOS and Android without touching native code. As of RN 0.31 the same is true of mp4 video assets for Android. As of RN 0.33 iOS is also supported. Requires [email protected].
<Video
source={require('../assets/video/turntable.mp4')}
/>
To enable audio to play in background on iOS, the audio session needs to be set to AVAudioSessionCategoryPlayback. See the [Apple documentation][3] for additional details. (NOTE: there is now a ticket to expose this as a prop.)
- See an [Example integration][1] in react-native-login; note that this example uses an older version of this library, before we used export default -- if you use require you will need to do require('react-native-video').default as per the instructions above.
- Try the included [VideoPlayer example][2] yourself:
git clone git@github.com:react-native-community/react-native-video.git
cd react-native-video/example
npm install
open ios/VideoPlayer.xcodeproj
Then Cmd+R to start the React Packager, build and run the project in the simulator.
- Lumpen Radio contains another example integration using local files and full screen background video.
In your project Podfile add support for static dependency linking. This is required to support the new Promises subdependency in the iOS swift conversion.
Add use_frameworks! :linkage => :static just under platform :ios in your iOS project Podfile.
See the example ios project for reference
You probably want to update your Gradle version (distributionUrl in gradle-wrapper.properties):
- distributionUrl=https\://services.gradle.org/distributions/gradle-3.3-all.zip
+ distributionUrl=https\://services.gradle.org/distributions/gradle-5.1.1-all.zip
From version 5.0.0, you have to apply these changes in android/app/build.gradle:
dependencies {
...
compile project(':react-native-video')
+ implementation "androidx.appcompat:appcompat:1.0.0"
- implementation "com.android.support:appcompat-v7:${rootProject.ext.supportLibVersion}"
}
Migrating to AndroidX (needs version >= 5.0.0), in android/gradle.properties:
android.useAndroidX=true
android.enableJetifier=true
In order to support ExoPlayer 2.9.0, you must use version 3 or higher of the Gradle plugin. This is included by default in React Native 0.57.
ExoPlayer 2.9.0 uses some Java 1.8 features, so you may need to enable support for Java 1.8 in your app/build.gradle file. If you get an error compiling with ExoPlayer like:
Default interface methods are only supported starting with Android N (--min-api 24)
Add the following to your app/build.gradle file:
android {
... // Various other settings go here
compileOptions {
targetCompatibility JavaVersion.VERSION_1_8
}
}
When using a router like the react-navigation TabNavigator, switching between tab routes would previously cause ExoPlayer to detach causing the video player to pause. We now don't detach the view, allowing the video to continue playing in a background tab. This matches the behavior for iOS.
The SurfaceView, which ExoPlayer has been using by default, has a number of quirks that people are unaware of and that often cause issues. These include not supporting animations or scaling. It also causes strange behavior if you overlay two videos on top of each other, because the SurfaceView will punch a hole through other views. Since TextureView doesn't have these issues and behaves in the way most developers expect, it makes sense to make it the default.
TextureView is not as fast as SurfaceView, so you may still want to enable SurfaceView support. To do this, you can set useTextureView={false}
.
Previously, on Android ExoPlayer if the paused prop was not set, the media would not automatically start playing. The only way it would work was if you set paused={false}
. This has been changed to automatically play if paused is not set so that the behavior is consistent across platforms.
Previously, on Android MediaPlayer, if you set up an AppState event handler that set the paused prop when the app went into the background (so that the video would be paused when you returned to the app), it would be ignored.
Note, Windows does not have a concept of an app going into the background, so this doesn't apply there.
Version 3.0 updates the Android build tools and SDK to version 27. React Native is in the process of switching over to SDK 27 in preparation for Google's requirement that new Android apps use SDK 26 by August 2018.
You will either need to install the version 27 SDK and version 27.0.3 buildtools or modify your build.gradle file to configure react-native-video to use the same build settings as the rest of your app as described below.
You will need to create a project.ext
section in the top-level build.gradle file (not app/build.gradle). Fill in the values from the example below using the values found in your app/build.gradle file.
// Top-level build file where you can add configuration options common to all sub-projects/modules.
buildscript {
... // Various other settings go here
}
allprojects {
... // Various other settings go here
project.ext {
compileSdkVersion = 31
buildToolsVersion = "30.0.2"
minSdkVersion = 21
targetSdkVersion = 22
}
}
If you encounter an error Could not find com.android.support:support-annotations:27.0.0., reinstall your Android Support Repository.