iOS AVPlayerItem initial timedMetadata not observed (KVO)

I have a class that manages an AVPlayer (and AVPlayerItem) and reports status, time, and timedMetadata back to a delegate.

It works great, except that 70-80% of the time the initial timedMetadata item is not key-value observed. After that first missed timedMetadata instance, though, every subsequent timedMetadata seems to be observed without a problem.

As a temporary fix, I've started embedding a dummy timedMetadata marker at the start of the video that does nothing except, so to speak, kick the tires, and after that everything works fine. That feels pretty clunky, though. I suspect I'm either setting up the AVPlayerItem and its KVO in a sub-optimal way, or there's simply a bug here.

Any thoughts on why this might be happening would be greatly appreciated! Code below.

// CL: Define constants for the key-value observation contexts.
static const NSString *ItemStatusContext;
static const NSString *ItemMetadataContext;
static const NSString *ItemPlaybackForcastContext;


- (id)initWithURL:(NSURL *)url
{
    if (self = [super init]) {

        __weak TFPAVController *_self = self;

        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
        NSString *tracksKey = @"tracks";

        [asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:tracksKey] completionHandler:
         ^{
             dispatch_async(dispatch_get_main_queue(),
                            ^{
                                NSError *error = nil;
                                AVKeyValueStatus status = [asset statusOfValueForKey:tracksKey error:&error];

                                if (status == AVKeyValueStatusLoaded) {
                                    AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
                                    [item addObserver:_self forKeyPath:@"status" options:0 context:&ItemStatusContext];
                                    [item addObserver:_self forKeyPath:@"timedMetadata" options:0 context:&ItemMetadataContext];
                                    [item addObserver:_self forKeyPath:@"playbackLikelyToKeepUp" options:0 context:&ItemPlaybackForcastContext];

                                    [[NSNotificationCenter defaultCenter] addObserver:_self
                                                                             selector:@selector(playerItemDidReachEnd:)
                                                                                 name:AVPlayerItemDidPlayToEndTimeNotification
                                                                               object:item];

                                    AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
                                    _self.totalRunTime = CMTimeGetSeconds(item.duration);
                                    [_self.delegate avPlayerNeedsView:player];

                                    _self.playerItem = item;
                                    _self.player = player;
                                }
                                else {
                                    NSLog(@"The asset's tracks were not loaded: %@ // [%@ %@]",
                                          error.localizedDescription,
                                          NSStringFromClass([self class]),
                                          NSStringFromSelector(_cmd));
                                }

                                _self.playerObserver = [_self.player addPeriodicTimeObserverForInterval:CMTimeMake(1, _FrameRate_) 
                                                                                                  queue:NULL
                                                                                             usingBlock: ^(CMTime time) {
                                                                                                 _self.currentVideoTime = CMTimeGetSeconds([_self.playerItem currentTime]);
                                                                                             }];
                            });
         }];
    }

    return self;
}
#pragma mark - KVO Response Methods
- (void)observeValueForKeyPath:(NSString *)keyPath 
                      ofObject:(id)object 
                        change:(NSDictionary *)change 
                       context:(void *)context 
{
    __weak TFPAVController *_self = self;

    if (context == &ItemStatusContext) {
        dispatch_async(dispatch_get_main_queue(),
                       ^{
                           if (((AVPlayerItem *)object).status == AVPlayerItemStatusReadyToPlay) {

                               [_self.delegate videoIsLoadedInPlayer:_self];
                           }
                       });
        return;
    }
    else if (context == &ItemMetadataContext) {
        dispatch_async(dispatch_get_main_queue(),
                       ^{
                           [_self checkMetaDataForPlayerItem: (AVPlayerItem *)object];
                       });
        return;
    }
    else if (context == &ItemPlaybackForcastContext) {
        dispatch_async(dispatch_get_main_queue(),
                       ^{
                           AVPlayerItem *playerItem = object;                           
                           if (CMTimeGetSeconds([playerItem currentTime]) <= 0) return;

                           NSDictionary *notificationDictionary = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:playerItem.playbackLikelyToKeepUp] 
                                                                                              forKey:kAVPlayerStateKey];

                           [[NSNotificationCenter defaultCenter] postNotificationName:kAVPlayerNotification 
                                                                               object:self 
                                                                             userInfo:notificationDictionary];
                        });
        return;
    }

    [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];

}

- (void)checkMetaDataForPlayerItem:(AVPlayerItem *)item
{
    NSMutableDictionary *metaDict = [NSMutableDictionary dictionary];

    // CL: make sure there's stuff there
    if (item.timedMetadata != nil && [item.timedMetadata count] > 0) {
        // CL: if there is, cycle through the items and create a Dictionary
        for (AVMetadataItem *metadata in item.timedMetadata) {
            [metaDict setObject:[metadata valueForKey:@"value"] forKey:[metadata valueForKey:@"key"]];
        }
        // CL: pass it to the delegate
        [self.delegate parseNewMetaData:[NSDictionary dictionaryWithDictionary:metaDict]];
    }
}
Ahhh, KVO. Possibly one of the worst design decisions Apple has ever made.

I guess this is no longer relevant, but my guess is that the problem you're running into is that sometimes the value you're trying to observe has already been assigned to the key by the time you add yourself as an observer, so your observer selector never gets called.


To avoid this, you can add NSKeyValueObservingOptionInitial to the options when you call addObserver:forKeyPath:options:context:; your observer method will then be invoked immediately with the current value.
