iOS/tvOS Player - Documentation¶
The Red Bee Media Managed OTT iOS/tvOS SDK is our playback client SDK for platforms supporting iOS and tvOS. It acts as a middle layer between your iOS/tvOS client application, the video player APIs and the Red Bee Platform, so you don't have to deal with complicated and time-consuming details like authentication, analytics and more.
Requirements¶
- iOS 12.0+
- tvOS 12.0+
- Swift 5.0+
- Xcode 13.0+
- Framework dependencies: iOSClientPlayer, iOSClientExposure
Sample Application¶
You can find a sample implementation showing how to use the SDK in the SDKSampleApp GitHub repository.
Installation¶
The Red Bee iOS/tvOS Player SDK supports Swift Package Manager, Carthage and CocoaPods.
We recommend installing the player using SPM or CocoaPods.
Swift Package Manager¶
The Swift Package Manager is a tool for automating the distribution of Swift code and is integrated into the Swift compiler.
Once you have your Swift package set up, adding iOSClientExposurePlayback as a dependency is as easy as adding it to the dependencies value of your Package.swift.
dependencies: [
.package(url: "https://github.com/EricssonBroadcastServices/iOSClientExposurePlayback", from: "3.6.0")
]
Carthage¶
Carthage is a decentralised dependency manager that builds your dependency graph without interfering with your Xcode project setup. CI integration through fastlane is also available.
Install Carthage through Homebrew by performing the following commands:
$ brew update
$ brew install carthage
Once Carthage has been installed, you need to create a Cartfile which specifies your dependencies. Please consult the artifacts documentation for in-depth information about Cartfiles and the other artifacts created by Carthage.
github "EricssonBroadcastServices/iOSClientExposurePlayback"
Running carthage update will fetch your dependencies and place them in /Carthage/Checkouts. You can either build the .frameworks and drag them into your Xcode project, or attach the fetched projects to your Xcode workspace.
Finally, make sure you add the .frameworks to your target's General -> Embedded Binaries section.
CocoaPods¶
CocoaPods is a dependency manager for Cocoa projects. For usage and installation instructions, visit their website. To integrate iOSClientExposurePlayback into your Xcode project using CocoaPods, specify it in your Podfile:
pod 'iOSClientExposurePlayback', '~> 3.6.0'
Supported Versions¶
iOS 12.0+
tvOS 12.0+
Swift 5.0+
Xcode 13.0+
Implementation guide¶
Authentication¶
Accessing most functionality on the EMP Platform requires a valid SessionToken.
Authentication requests return a valid SessionToken (or an encapsulating Credentials) if the request is successful. This sessionToken should be persisted and used in subsequent calls when an authenticated user is required.
Authenticate(environment: exposureEnv)
    .login(username: someUser,
           password: somePassword)
    .request()
    .response {
        if let credentials = $0.value {
            // Store/pass along the returned SessionToken
            let sessionToken: SessionToken = credentials.sessionToken
        }
    }
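One way to persist the token between launches is sketched below; the storage key is an assumption (not from the SDK), and the Keychain is preferable to UserDefaults for credentials in a production app:

```swift
import Foundation

// Sketch of one possible persistence scheme. The key name is an illustrative
// assumption; prefer the Keychain for real credentials.
func storeRawToken(_ raw: String) {
    UserDefaults.standard.set(raw, forKey: "sessionToken")
}

func loadRawToken() -> String? {
    UserDefaults.standard.string(forKey: "sessionToken")
}
```

On a later launch, the stored string can be used to recreate the session instead of prompting the user to log in again.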
Creation¶
ExposureContext-based Players require an Environment and a SessionToken to operate, which are provided at initialisation through a convenience initialiser.
public convenience init(environment: Environment, sessionToken: SessionToken)
This will configure the Player
for playback using EMP functionality.
- If you want to send the analytics events to a custom endpoint, you can optionally pass your analyticsBaseUrl when creating the player.
- If you want to send the app name / app version to the analytics, you can optionally pass your appName and appVersion when creating the player.
import UIKit
import AVFoundation
import AVKit
import iOSClientPlayer
import iOSClientExposure
import iOSClientExposurePlayback
class SimplePlayerViewController: UIViewController, AVPictureInPictureControllerDelegate {

    var environment: Environment!
    var sessionToken: SessionToken!

    @IBOutlet weak var playerView: UIView!

    fileprivate(set) var player: Player<HLSNative<ExposureContext>>!

    // Optional: pictureInPictureController
    private var pictureInPictureController: AVPictureInPictureController?

    override func viewDidLoad() {
        super.viewDidLoad()

        /// This will configure the player with the `SessionToken` acquired in the specified `Environment`
        player = Player(environment: environment, sessionToken: sessionToken)

        /// Alternatively, this also sends analytics events to a custom endpoint
        player = Player(environment: environment, sessionToken: sessionToken, analyticsBaseUrl: "analyticsBaseUrl")

        /// Alternatively, this also sends the app name and version to the analytics
        player = Player(environment: environment, sessionToken: sessionToken, appName: "App Name", appVersion: "App Version")

        player.configure(playerView: playerView)

        // Optional:
        // If you want to support picture in picture in the player or need to access `AVPlayerLayer`,
        // you can configure the player with `AVPlayerLayer`
        /*
        player = Player(environment: environment, sessionToken: sessionToken)
        let avPlayerLayer = player.configure(playerView: playerView)

        // Enable picture in picture mode in the player
        pictureInPictureController = AVPictureInPictureController(playerLayer: avPlayerLayer)
        pictureInPictureController?.delegate = self
        */

        // Note:
        // Client developers can access all AVPictureInPictureController lifecycle events by implementing
        // the `AVPictureInPictureControllerDelegate` delegate in your player view controller.
        // Read more: https://developer.apple.com/documentation/avkit/avpictureinpicturecontrollerdelegate
        //
        // Check the SDK sample app for an example implementation:
        // https://github.com/EricssonBroadcastServices/iOSClientSDKSampleApp
    }
}
Event Listeners¶
The preparation and loading process can be followed by listening to associated events.
player
    .onPlaybackCreated { player, source in
        // Fires once the associated MediaSource has been created.
        // Playback is not ready to start at this point.
    }
    .onPlaybackPrepared { player, source in
        // Published when the associated MediaSource completed asynchronous loading of relevant properties.
        // Playback is not ready to start at this point.
    }
    .onPlaybackReady { player, source in
        // When this event fires, starting playback is possible (playback can optionally be set to autoplay instead)
        player.play()
    }
Once playback is in progress, the Player continuously publishes events related to media status and user interaction.
player
    .onPlaybackStarted { player, source in
        // Published once the playback starts for the first time.
        // This is a one-time event.
    }
    .onPlaybackPaused { [weak self] player, source in
        // Fires when the playback pauses for some reason
        self?.pausePlayButton.toggle(paused: true)
    }
    .onPlaybackResumed { [weak self] player, source in
        // Fires when the playback resumes from a paused state
        self?.pausePlayButton.toggle(paused: false)
    }
    .onPlaybackAborted { player, source in
        // Published once the player.stop() method is called.
        // This is considered a user action
    }
    .onPlaybackCompleted { player, source in
        // Published when playback reached the end of the current media.
    }
Besides playback control events, the Player also publishes several status-related events.
player
    .onProgramChanged { [weak self] player, source, program in
        // Update user facing program information
        self?.updateProgram(with: program)
    }
    .onEntitlementResponse { player, source, entitlement in
        // Fires when a new entitlement is received, such as after attempting to start playback
    }
    .onBitrateChanged { [weak self] player, source, bitrate in
        // Published whenever the current bitrate changes
        self?.updateQualityIndicator(with: bitrate)
    }
    .onBufferingStarted { player, source in
        // Fires whenever the buffer is unable to keep up with playback
    }
    .onBufferingStopped { player, source in
        // Fires when buffering is no longer needed
    }
    .onDurationChanged { player, source in
        // Published when the active media received an update to its duration property
    }
    .onDateRangeMetadataChanged { metadataGroups in
        // Fires when the current playback `DateRangeMetadata` (`#EXT-X-DATERANGE` tag) changes.
    }
Starting the Playback¶
Client applications start playback by supplying Player
with a Playable
.
From SDK version 2.x onwards, ChannelPlayable and ProgramPlayable are deprecated. Instead, you can simply create an AssetPlayable to play a live channel, program or VOD asset.
let assetPlayable = AssetPlayable(assetId: "assetId")
player.startPlayback(playable: assetPlayable)
Optionally, client applications can set specific playback options by specifying them in PlaybackProperties. These options include maximum bitrate, autoplay mode, custom start time and language preferences.
let properties = PlaybackProperties(autoPlay: true,
                                    playFrom: .bookmark,
                                    language: .custom(text: "fr", audio: "en"),
                                    maxBitrate: 300000)
player.startPlayback(playable: assetPlayable, properties: properties)
If application developers want to disable analytics for the playback, they can pass the enableAnalytics property when starting playback. By default, analytics are enabled for the player.
player.startPlayback(playable: assetPlayable, properties: properties, enableAnalytics: false)
If application developers want to play a specific material variant, they can pass materialProfile when starting the playback.
(For example, for a given asset the "default" material contains the full-length movie, while a "TRAILER" material might contain only an extract: a virtual subclip generated using the VOD-to-VOD flow.)
player.startPlayback(playable: assetPlayable, properties: properties, materialProfile: "TRAILER")
Application developers can also filter the output to a particular set of metadata identifiers. For instance, if the stream's #EXT-X-DATERANGE tags define multiple metadata attributes, but you are only interested in the values of the X-AD-ID and X-AD-URL attributes, you can pass an array of identifiers when starting the playback.
player.startPlayback(playable: assetPlayable, properties: properties, metadataIdentifiers: ["X-AD-ID", "X-AD-URL"])
Check the Server-Side Ad Insertion (SSAI) section for ad-related parameters.
Audio only Playback¶
The SDK provides an out-of-the-box implementation for audio-only playback. You can create and start the playback using an AssetPlayable. If the stream is audio only, the SDK will return the mediaType as audio. This event fires when a new entitlement is received, such as after attempting a new playback request.
player.onMediaType { [weak self] mediaType in
// mediaType : audio / video
}
You will find an example implementation of audio-only playback with a sticky player, background audio handling, iOS Control Center integration, audio interruption handling and AirPlay / Chromecast in the SDKSampleApp.
EPG and Content Presentation¶
The Exposure module provides metadata integration with the EMP Exposure layer for quick and typesafe content access.
Listing all available channels can be done by calling
FetchAsset(environment: environment)
    .list()
    .includeUserData(for: sessionToken)
    .filter(on: "TV_CHANNEL")
    .sort(on: ["assetId","originalTitle"])
    .request()
    .validate()
    .response {
        if let assetList = $0.value {
            // Present a list of channels
        }
    }
EPG, or the electronic programming guide, details previous, current and upcoming programs on a specific channel. Client applications may request EPG data through the FetchEpg endpoint.
let current = player.serverTime ?? Date()
FetchEpg(environment: environment)
    .channel(id: channelId)
    .show(page: 1, spanning: 100)
    .filter(starting: current.subtract(days: 1), ending: current.add(days: 1) ?? current)
    .request()
    .validate()
    .response {
        if let channelEpg = $0.value {
            // Present the EPG
        }
    }
Client applications relying on ExposureContext may also fetch the currently playing Program directly from the player object.
let nowPlaying = player.currentProgram
Or listen to the onProgramChanged event.
player.onProgramChanged { player, source, program in
// Update user facing program information
}
Playback Configuration and Stream Navigation¶
Custom Playback Properties¶
Client applications can apply custom options to any playback request when the default options need to be tweaked. Scenarios include enabling autoplay mode, maximum bitrate restrictions, language preferences and start time offset.
With autoplay enabled, playback will start as soon as the player is ready.
let properties = PlaybackProperties(autoplay: true)
Client applications may opt in to limiting the bitrate at a preferred maximum during playback. Specifying a non-zero value will indicate the player should attempt to limit playback to that bitrate. If network bandwidth consumption cannot be lowered to meet the requested maximum, it will be reduced as much as possible while continuing to play the item.
let properties = PlaybackProperties(maxBitrate: 300000)
Subtitle and audio preferences can be indicated by specifying LanguagePreferences, which will be applied automatically if the current playback stream supports the requested selection.
Language Preferences¶
.defaultBehavior defers track selection to whatever is specified as default by the selected stream.
let properties = PlaybackProperties(language: .defaultBehavior)
.userLocale takes advantage of the device's Locale settings when searching for a language to select. For example, if Locale.current specifies fr, this will be the preferred language. In the event the stream does not support the device's Locale, stream defaults will be applied.
let properties = PlaybackProperties(language: .userLocale)
When using .userPreference, the SDK checks the user's language preferences stored locally on the device. These preferences are updated whenever the user changes the audio or subtitle options for any asset during playback.
If the user has specified a preferred language for audio or text content, it will be selected for playback. If the user has not specified a preference or if the specified preference is not available, the SDK will fall back to the device's language and then to the stream's default language.
Additionally, if the preferred audio language does not match the device's language settings and there are no preferred subtitles, the SDK automatically applies subtitles in the device's language (if available).
let properties = PlaybackProperties(language: .userPreference)
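The fallback order described above can be restated as a pure function. This is an illustrative sketch only, not the SDK's actual implementation:

```swift
// Sketch of the .userPreference selection order: stored preference,
// then device language, then the stream's default.
func resolveAudioLanguage(userPreference: String?,
                          deviceLanguage: String,
                          streamDefault: String,
                          available: Set<String>) -> String {
    if let preferred = userPreference, available.contains(preferred) {
        return preferred           // 1. the user's stored preference
    }
    if available.contains(deviceLanguage) {
        return deviceLanguage      // 2. the device's language
    }
    return streamDefault           // 3. the stream's default language
}
```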
Finally, client applications may specify a custom selection for just subtitles, audio or both.
let properties = PlaybackProperties(language: .custom(text: "en", audio: nil))
For more information regarding track selection, please see the Player module.
Start Time¶
PlaybackProperties specifies 5 different modes, PlayFrom, for defining the playback start position:

- .beginning: playback starts from the beginning of the program.
- .bookmark: playback starts from the bookmarked position if available, falling back to .defaultBehavior.
- .customPosition(position:): playback starts from the specified buffer position (in milliseconds). Positions outside the seekableRange are ignored and the application is presented with an invalidStartTime(startTime:seekableRanges:) warning.
- .customTime(time:): playback starts from the specified unix timestamp (in milliseconds). Timestamps not within the seekableTimeRange are ignored and the application is presented with an invalidStartTime(startTime:seekableRanges:) warning.
- .defaultBehavior: the default start position, which depends on the stream type (see below).
let properties = PlaybackProperties(playFrom: .bookmark)
The .defaultBehavior varies according to stream type:

- Live channel: playback starts from the live edge
- Live program: playback starts from the live edge
- Catchup program: playback starts from the program start
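As a sketch of how a .customTime start position might be computed, the helper below is hypothetical (not part of the SDK); the SDK expects a unix timestamp in milliseconds:

```swift
import Foundation

// Hypothetical helper: unix timestamp in milliseconds,
// the unit expected by .customTime(time:)
func unixMilliseconds(_ date: Date) -> Int64 {
    Int64(date.timeIntervalSince1970 * 1000.0)
}

// Start playback ten minutes behind the current wallclock time:
let tenMinutesAgoMs = unixMilliseconds(Date()) - 10 * 60 * 1000
// let properties = PlaybackProperties(playFrom: .customTime(time: tenMinutesAgoMs))
```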
Playback Progress¶
Playback progress is available in two formats. Playhead position reports the position timestamp using the internal buffer time reference, in milliseconds. It is also possible to seek to an offset relative to the current position.
let position = player.playheadPosition
player.seek(toPosition: position - 30 * 1000)
For date-time related streams, playheadTime reports the offset mapped to the current wallclock time. This feature is used for live and catchup.
let position = player.playheadTime
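Since playheadTime maps to unix time in milliseconds, it can be turned into a wallclock display directly. A minimal sketch (the time zone and format here are illustrative choices, not SDK behaviour):

```swift
import Foundation

// Format a playheadTime-style value (unix timestamp in ms) as a wallclock string.
func wallclockString(fromUnixMs ms: Int64) -> String {
    let date = Date(timeIntervalSince1970: Double(ms) / 1000.0)
    let formatter = DateFormatter()
    formatter.locale = Locale(identifier: "en_US_POSIX")
    formatter.timeZone = TimeZone(identifier: "UTC")
    formatter.dateFormat = "HH:mm:ss"
    return formatter.string(from: date)
}
```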
Stream Navigation¶
The most intuitive and easy way to navigate the stream is the unix-timestamp-based seek(toTime:) API, which enables client applications to transition seamlessly between programs.
In order to check what unix timestamp (ms) the playhead is currently at, client applications use
let currentTime = player.playheadTime
Two important ranges influence the playback experience: seekableTimeRange and bufferedTimeRange.

- seekableTimeRange: navigating within this range will not require a new play request
- bufferedTimeRange: navigating within this range will not fetch new segments from the CDN
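The distinction can be restated as a small decision sketch. The ranges are simplified here to ClosedRange&lt;Int64&gt; in milliseconds; this is illustrative only, the SDK exposes the real ranges on the player:

```swift
// Classify a seek target against the buffered and seekable ranges.
func seekCost(to target: Int64,
              buffered: ClosedRange<Int64>,
              seekable: ClosedRange<Int64>) -> String {
    if buffered.contains(target) {
        return "in buffer"              // no new segments fetched from the CDN
    }
    if seekable.contains(target) {
        return "in seekable range"      // new segments, but no new play request
    }
    return "outside seekable range"     // requires a new play request
}
```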
Although playback started in a specific program, the player will continue playing until it reaches a program without a valid license for the user. Program boundary crossings can occur, meaning that the program being displayed has changed. When this happens, onProgramChanged will fire with updated Program information.
Going live in a catchup or timeshifted scenario is as easy as calling seekToLive().
player.seekToLive()
Restarting the currently playing program can be done by calling
if let programStartTime = player.currentProgram?.startDate?.millisecondsSince1970 {
    player.seek(toTime: programStartTime)
}
Seeking 30 seconds back
player.seek(toTime: currentTime - 30 * 1000)
All information regarding the currently playing program is encapsulated in the Program struct, accessed through player.currentProgram. This data can be used to populate the user interface.
In the event that the player tries to play a program the user does not have the right to watch, either through natural progression over a program boundary or by stream navigation, playback will stop and throw an ExposureError.ExposureResponseMessage specifying NOT_ENTITLED.
Continuous validation of the current playback occurs at each program boundary or navigation attempt. If the validation process fails, onWarning messages will be triggered. These warnings occur on gaps in the EPG or on failure to validate or fetch the current program. When this happens, playback continues without interruption; client applications may choose to take other actions, such as stopping playback, if needed.
Contract Restrictions¶
Client applications may fetch the playback-related contract restrictions for the current playback. This can, for example, be used to enable or disable specific user controls.
let entitlement = player.context.entitlement
Another option is to register for the onEntitlementResponse callback, which will fire every time a new entitlement is received.
Three PlaybackEntitlement properties are of special interest:

- ffEnabled: specifies if fast-forwarding is enabled
- rwEnabled: specifies if rewinding is enabled
- timeshiftEnabled: if timeshift is disabled, playback cannot be paused
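A sketch of how those three flags might drive user-facing controls; the ControlState type below is hypothetical, not part of the SDK:

```swift
// Map the entitlement flags onto player control availability.
struct ControlState {
    let canFastForward: Bool
    let canRewind: Bool
    let canPause: Bool
}

func controlState(ffEnabled: Bool, rwEnabled: Bool, timeshiftEnabled: Bool) -> ControlState {
    ControlState(canFastForward: ffEnabled,
                 canRewind: rwEnabled,
                 canPause: timeshiftEnabled) // timeshift disabled => pausing unavailable
}
```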
Adobe Primetime Authentication¶
If you are planning to use Adobe Primetime authentication, you can pass the base64EncodedToken value when starting the playback.
player.startPlayback(playable: assetPlayable, properties: properties, adobePrimetimeToken: "base64EncodedToken")
Server-Side Ad Insertion (SSAI)¶
Preparation
If you are planning to use server-side ad insertion with the player, you can set AdsOptions to pass client- and device-specific information that can be used for ad targeting when starting the playback.
let adsOptions = AdsOptions(latitude: 18.000, longitude: 18.000, mute: true, consent: "consent", deviceMake: "deviceMake", ifa: "ifa", gdprOptin: true)
player.startPlayback(playable: assetPlayable, properties: properties, adsOptions: adsOptions)
Application developers can also pass custom key/value pairs related to the server-side ads.
let customAdParams: [String: Any] = ["TestKey": 1 , "TestKeyTwo": "test", "TestKeyThree": true]
player.startPlayback(playable: assetPlayable, properties: properties, customAdParams: customAdParams)
Events Related to SSAI
If the stream has server-side ads enabled, the player will publish several events related to the ads.
You can replace your .onPlaybackStarted event with .onPlaybackStartWithAds, which returns several attributes related to the ads.
.onPlaybackStartWithAds { [weak self] vodDurationInMs, adDurationInMs, totalDurationInMs, adMarkers in
    // vodDurationInMs : Actual vod content duration
    // adDurationInMs : Total ads duration
    // totalDurationInMs : Total duration (vod + ads)
    // adMarkers : Ad markers (MarkerPoints) that you can place in your timeline
}
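The adMarkers can be placed on a timeline UI by normalising each offset against the total duration. A sketch, assuming each marker reduces to an offset in milliseconds (the SDK's MarkerPoint type may carry more fields):

```swift
// Convert ad marker offsets (ms) into fractional 0...1 positions for a timeline.
func timelineFractions(markerOffsetsMs: [Int64], totalDurationMs: Int64) -> [Double] {
    guard totalDurationMs > 0 else { return [] }
    return markerOffsetsMs.map { Double($0) / Double(totalDurationMs) }
}
```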
The player will publish onWillPresentInterstitial when an ad starts playing and onDidPresentInterstitial when an ad ends.
.onWillPresentInterstitial { [weak self] contractRestrictionService, clickThroughUrl, adTrackingUrls, adClipDuration, noOfAds, adIndex in
    // contractRestrictionService : contractRestrictionsPolicy.fastForwardEnabled & contractRestrictionsPolicy.rewindEnabled
    // clickThroughUrl : External link to navigate to when the user clicks the ad (e.g. show/hide a link button while the ad is playing)
    // adTrackingUrls : If the user clicked the ad's `clickThroughUrl` link/button, send these URLs back to the player to track the ad click.
    // adClipDuration : Duration of the currently playing ad clip
    // noOfAds : Number of ads in the ad break
    // adIndex : Index of the currently playing ad
}
.onDidPresentInterstitial { [weak self] contractRestrictionService in
    // contractRestrictionService : contractRestrictionsPolicy.fastForwardEnabled & contractRestrictionsPolicy.rewindEnabled
}
The player will publish the onServerSideAdShouldSkip event when it requires the player to seek to a specific position. App developers must implement this event.
.onServerSideAdShouldSkip { [weak self] skipTime in
    self?.player.seek(toPosition: Int64(skipTime))
}
Implementing for tvOS
When implementing SSAI on the tvOS player, you need to implement the following AVPlayerViewControllerDelegate method:
class PlayerViewController: UIViewController, AVPlayerViewControllerDelegate {
    func playerViewController(_ playerViewController: AVPlayerViewController,
                              willResumePlaybackAfterUserNavigatedFrom oldTime: CMTime,
                              to targetTime: CMTime) {
        if let targetTime = targetTime.milliseconds {
            self.player.seek(toPosition: targetTime)
        }
    }
}
Tracking an Ad's clickThroughUrl
Optionally, app developers can use clickThroughUrl to navigate users to the ad's external link if the ad contains that URL. This can be done by adding a button or link in the player skin.
When (or if) the clickThroughUrl button is clicked, app developers should pass the relevant adTrackingUrls back to the player to track the click and send analytics back to the ad server.
self.player.trackClickedAd(adTrackingUrls: adTrackingUrls)
Sprites¶
Client applications can use sprites (preview thumbnails) for VOD assets and catchups if they are provided by the backend. If any sprite streams are available for the given asset, the client application can find them in the source object returned from the player events.
player
    .onEntitlementResponse { player, source, entitlement in
        // Fires when a new entitlement is received, such as after attempting to start playback
        // source.sprites contains an array of Sprites, each with its own vtt url & width
        print(source.sprites)
    }
Developers can use their own implementation to handle the vtt streams and fetch the sprite images, but the ExposurePlayback module provides an out-of-the-box solution that developers can use to add sprites to their players.
Activate Sprites¶
First you need to activate the sprites to use in the player. Client applications can easily do this by passing the assetId, width and JPEGQuality. width and JPEGQuality are optional fields; if they are not provided, the SDK will use the width of the first available sprite stream and cache the sprite images with the highest JPEGQuality.
if let playable = playable, let sprites = sprites, let width = sprites.first?.width {
    let _ = self.player.activateSprites(assetId: playable.assetId, width: width, quality: .medium) { spritesData, error in
        // print("Sprites have been activated", spritesData)
    }
}
When the developer activates the sprites, the SDK will fetch all the sprite images and cache them.
Get Sprites¶
After the activation, client developers can get the sprite images by passing the player's current time when scrubbing through the timeline.
let sliderPosition = Int64(sender.value * Float(duration))
let currentTime = timeFormat(time: sliderPosition)

if let assetId = self?.playable?.assetId {
    let _ = self?.player.getSprite(time: currentTime, assetId: assetId, callback: { image, startTime, endTime in
        // Assign the image to a UIImageView
        // startTime : sprite's start offset
        // endTime : sprite's end offset
    })
}
You can find a sample implementation showing how to use sprites in the SDKSampleApp GitHub repository.
Offline Playback¶
Client applications can use ExposureDownload by conforming any class to EnigmaDownloadManager.
class MyTestViewController: UIViewController, EnigmaDownloadManager {
    // After conforming, client applications can use the `enigmaDownloadManager` instance to perform any download related tasks.
}
Playback of a downloaded Asset¶
Check ExposureDownload for more information regarding downloads.
Client applications can get an offlineMediaAsset (a downloaded asset) by using the EnigmaDownloadManager.
let downloadedAsset = enigmaDownloadManager.getDownloadedAsset(assetId: assetId)
Or client applications can get all downloaded assets by using getDownloadedAssets().
let allDownloadedAssets = enigmaDownloadManager.getDownloadedAssets()
Then developers can create an OfflineMediaPlayable and pass it to the player to play any downloaded asset.
There is one exception when playing downloaded mp3: AVPlayer sometimes doesn't play offline mp3 files, so client application developers are encouraged to use AVAudioPlayer or AVAudioEngine to play offline mp3 files.
Check the SDK sample application for an example implementation: https://github.com/EricssonBroadcastServices/iOSClientSDKSampleApp
OfflineMediaPlayable has the attribute format, which holds the format of the downloaded file.
let downloadedAsset = enigmaDownloadManager.getDownloadedAsset(assetId: assetId)

if let entitlement = downloadedAsset?.entitlement, let urlAsset = downloadedAsset?.urlAsset, let format = downloadedAsset?.format {
    if format == "MP3" || format == "mp3" {
        // Create `AVAudioPlayer` or `AVAudioFile` and pass to `AVAudioEngine`
    } else {
        let offlineMediaPlayable = OfflineMediaPlayable(assetId: assetId, entitlement: entitlement, url: urlAsset.url)

        // Play downloaded asset
        player.startPlayback(offlineMediaPlayable: offlineMediaPlayable)
    }
}
Analytics for Offline Playback¶
The SDK automatically stores analytics for offline playback in local storage and flushes them periodically when the device has an internet connection.
Optionally, client developers can have offline playback analytics flushed as a background task. The SDK will check if there are any locally stored events and, if found, will send them to the analytics server.
To configure your app to allow background tasks, enable the background capabilities that you need, and then create a list of unique identifiers for each task.
Configure Background Processing¶
- Add the capability: Background processing
- Add the BGTaskSchedulerPermittedIdentifier key with the value com.emp.ExposurePlayback.SampleApp.analyticsFlush to the Info.plist

Read more about background processing in Apple's documentation: Using background tasks to update your app
Check the SDK sample application for a sample implementation: SDKSampleApp
class AppDelegate: UIResponder, UIApplicationDelegate {

    let appRefreshTaskId = "com.emp.ExposurePlayback.SampleApp.analyticsFlush"

    func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        BGTaskScheduler.shared.register(forTaskWithIdentifier: self.appRefreshTaskId, using: nil) { task in
            self.handleFlushingOfflineAnalytics(task: task as! BGProcessingTask)
        }
        return true
    }

    func applicationDidEnterBackground(_ application: UIApplication) {
        self.scheduleAppRefresh(minutes: 2)
    }

    func applicationWillEnterForeground(_ application: UIApplication) {
        self.cancelAllPendingBGTask()
    }

    func cancelAllPendingBGTask() {
        BGTaskScheduler.shared.cancelAllTaskRequests()
    }

    func scheduleAppRefresh(minutes: Int) {
        let seconds = TimeInterval(minutes * 60)
        let request = BGProcessingTaskRequest(identifier: self.appRefreshTaskId)
        request.earliestBeginDate = Date(timeIntervalSinceNow: seconds)
        request.requiresNetworkConnectivity = true
        do {
            try BGTaskScheduler.shared.submit(request)
        } catch {
            print("Could not schedule app refresh task \(error.localizedDescription)")
        }
    }

    func handleFlushingOfflineAnalytics(task: BGProcessingTask) {
        // Schedule the next refresh task: define the minutes
        scheduleAppRefresh(minutes: 2)

        let manager = iOSClientExposure.BackgroundAnalyticsManager()
        manager.flushOfflineAnalytics()

        DispatchQueue.main.asyncAfter(deadline: .now() + .seconds(10)) {
            task.setTaskCompleted(success: true)
        }

        task.expirationHandler = {
            self.cancelAllPendingBGTask()
        }
    }
}