Audio: how to record a perfect loop in iOS and Xcode


I've been struggling with this for about a year, and I've now tried to isolate my problem and present it for others to look at.

I've been writing an app that depends on recording in the style of GarageBand: I want to record exactly 8 beats from the user and then let them loop that recording. While they record, I play a metronome for them (the user wears headphones to hear the metronome and records into the device's microphone).

I can manage to turn recording on for about 4.8 seconds (0.6 s x 8 beats), and my timer shows it ran for 4.8 seconds, but my audio recording always comes out a little shorter than 4.8 seconds, something like 4.78 or 4.71, and that makes the loop glitch.

I've experimented with AVAudioRecorder, AudioQueue, and AudioUnits, thinking one of the latter approaches would eventually solve my problem.

I'm using an NSTimer that fires every 0.6 seconds to play a short blip as the metronome. After 4 beats, the metronome timer's handler turns on the recorder, then waits 4.8 seconds and stops the recording.

I use NSTimeInterval values to time how long the metronome ran (it looks tight, 4.800xxx) and compare that with the duration of the audio file, and they never match.

I wish I could attach my project, but I'll have to settle for attaching my header and implementation. To test it you would need to make a project with the following IB features:

Record, Play, and Stop buttons; a song/track duration label; a timer duration label; a debug label

If you launch the app and tap Record, you get counted in for 4 beats, then the recorder starts. Tap your finger on the desk along with the beats; after 8 more beats (12 in total) the recorder stops.

You can see on the labels that the recorded track is slightly shorter than 4.8 seconds, and in some cases much shorter, so the audio doesn't loop cleanly.

Does anyone know what I can do to tighten this up? Thanks for reading.

Here is my code:

//
//  ViewController.h
//  speakagain
//
//  Created by NOTHING on 2014-03-18.
//

#import <UIKit/UIKit.h>
#import <Foundation/Foundation.h>
#import "CoreAudio/CoreAudioTypes.h"
#import <AudioToolbox/AudioQueue.h>
#import <AudioToolbox/AudioFile.h>
#import <AVFoundation/AVFoundation.h>

#define kNumberBuffers 3   // used below but never defined in the original; 3 is the value Apple's Audio Queue sample code uses

@interface ViewController : UIViewController
{
    IBOutlet UIButton *btnRecord;
    IBOutlet UIButton *btnPlay;
    IBOutlet UIButton *btnStop;
    IBOutlet UILabel *debugLabel;
    IBOutlet UILabel *timerDuration;
    IBOutlet UILabel *songDuration;

    //UILabel *labelDebug;

    struct AQRecorderState {
        AudioStreamBasicDescription  mDataFormat;
        AudioQueueRef                mQueue;
        AudioQueueBufferRef          mBuffers[kNumberBuffers];
        AudioFileID                  mAudioFile;
        UInt32                       bufferByteSize;
        SInt64                       mCurrentPacket;
        bool                         mIsRunning;                    // 8

    };
    struct AQRecorderState aqData;
    AVAudioPlayer *audioPlayer;

    NSString *songName;
    NSTimer *recordTimer;
    NSTimer *metroTimer;
    NSTimeInterval startTime, endTime, elapsedTime;

    int inputBuffer;
    int beatNumber;

}
@property (nonatomic, retain)   IBOutlet UIButton *btnRecord;
@property (nonatomic, retain)   IBOutlet UIButton *btnPlay;
@property (nonatomic, retain)   IBOutlet UIButton *btnStop;
@property (nonatomic, retain)   IBOutlet UILabel *debugLabel;
@property (nonatomic, retain) IBOutlet UILabel *timerDuration;
@property (nonatomic, retain) IBOutlet UILabel *songDuration;


- (IBAction) record;
- (IBAction) stop;
- (IBAction) play;

static void HandleInputBuffer (void *aqData,AudioQueueRef inAQ,AudioQueueBufferRef inBuffer,const AudioTimeStamp *inStartTime, UInt32 inNumPackets,const AudioStreamPacketDescription  *inPacketDesc);

@end
The implementation:

//
    //  ViewController.m
    //  speakagain
    //
    //  Created by NOTHING on 2014-03-18.
    //

    #import "ViewController.h"


    @interface ViewController ()

    @end

    @implementation ViewController
    @synthesize btnPlay, btnRecord,btnStop,songDuration, timerDuration, debugLabel;


    - (void)viewDidLoad
    {
        debugLabel.text = @"";
        //NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        //NSString *documentsDirectory = [paths objectAtIndex:0];
        songName = @"TestingQueue.caf";   // the alloc/init in the original was redundant; the literal replaces it



        [super viewDidLoad];
        // Do any additional setup after loading the view, typically from a nib.
    }
    - (void)prepareAudioQueue
    {
        //struct AQRecorderState *pAqData;
        inputBuffer=0;
        aqData.mDataFormat.mFormatID         = kAudioFormatLinearPCM;
        aqData.mDataFormat.mSampleRate       = 44100.0;
        aqData.mDataFormat.mChannelsPerFrame = 1;
        aqData.mDataFormat.mBitsPerChannel   = 16;
        aqData.mDataFormat.mBytesPerPacket   =
        aqData.mDataFormat.mBytesPerFrame = aqData.mDataFormat.mChannelsPerFrame * sizeof (SInt16);
        aqData.mDataFormat.mFramesPerPacket  = 1;

        //    AudioFileTypeID fileType             = kAudioFileAIFFType;
        AudioFileTypeID fileType             = kAudioFileCAFType;
        aqData.mDataFormat.mFormatFlags = kLinearPCMFormatFlagIsBigEndian| kLinearPCMFormatFlagIsSignedInteger| kLinearPCMFormatFlagIsPacked;

        AudioQueueNewInput (&aqData.mDataFormat,HandleInputBuffer, &aqData,NULL, kCFRunLoopCommonModes, 0,&aqData.mQueue);

        UInt32 dataFormatSize = sizeof (aqData.mDataFormat);

        // in Mac OS X, instead use
        //    kAudioConverterCurrentInputStreamDescription
        AudioQueueGetProperty (aqData.mQueue,kAudioQueueProperty_StreamDescription,&aqData.mDataFormat,&dataFormatSize);

        //Verify
        NSFileManager *fileManager = [NSFileManager defaultManager];
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths objectAtIndex:0];
        NSString *txtPath = [documentsDirectory stringByAppendingPathComponent:songName];

        NSLog(@"INITIALIZING FILE");
        if ([fileManager fileExistsAtPath:txtPath] == YES) {
            NSLog(@"PREVIOUS FILE REMOVED");
            [fileManager removeItemAtPath:txtPath error:nil];
        }


        const char *filePath = [txtPath UTF8String];
        CFURLRef audioFileURL = CFURLCreateFromFileSystemRepresentation ( NULL,(const UInt8 *) filePath,strlen (filePath),false );
        AudioFileCreateWithURL (audioFileURL,fileType,&aqData.mDataFormat, kAudioFileFlags_EraseFile,&aqData.mAudioFile );

        DeriveBufferSize (aqData.mQueue,aqData.mDataFormat,0.5,&aqData.bufferByteSize);

        for (int i = 0; i < kNumberBuffers; ++i)
        {
            AudioQueueAllocateBuffer (aqData.mQueue,aqData.bufferByteSize,&aqData.mBuffers[i]);
            AudioQueueEnqueueBuffer (aqData.mQueue,aqData.mBuffers[i], 0,NULL );
        }

    }

    - (void) metronomeFire
    {
        if(beatNumber < 5)
        {
            //count in time.
            // just play the metro beep but don't start recording
            debugLabel.text = @"count in (1,2,3,4)";
            [self playSound];
        }
        if(beatNumber == 5)
        {
            //start recording
            aqData.mCurrentPacket = 0;
            aqData.mIsRunning = true;
            startTime = [NSDate timeIntervalSinceReferenceDate];
            recordTimer = [NSTimer scheduledTimerWithTimeInterval:4.8 target:self selector:@selector(killTimer) userInfo:nil repeats:NO];
            AudioQueueStart (aqData.mQueue,NULL);
            debugLabel.text = @"Recording for 8 beats (1,2,3,4 1,2,3,4)";
            [self playSound];
        }
        else if (beatNumber < 12)
        {   //play metronome for beats 6-11
            [self playSound];
        }
        if(beatNumber == 12)
        {
            [metroTimer invalidate]; metroTimer = nil;
            [self playSound];
        }

        beatNumber++;

    }
    - (IBAction) play
    {
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths objectAtIndex:0];
        NSString *txtPath = [documentsDirectory stringByAppendingPathComponent:songName];
        NSURL *url = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@",txtPath]];

        if (audioPlayer)
        {
            [audioPlayer stop];
            audioPlayer = nil;
        }
        NSError *error;
        audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];

        if (audioPlayer == nil)
        {
            NSLog(@"%@",[error description]);
        }
        else
        {
            [audioPlayer setNumberOfLoops:-1];   // set the loop count before starting playback
            [audioPlayer play];
        }
    }
    - (void) killTimer
    {
        //this is the timer function.  Runs once after 4.8 seconds.
       [self stop];

    }
    - (IBAction) stop
    {
        if (audioPlayer)
        {
            [audioPlayer stop];
            audioPlayer = nil;



        }
        else
        {

            if(metroTimer)
            {
                [metroTimer invalidate];metroTimer = nil;
            }
            //Stop the audio queue
            AudioQueueStop (aqData.mQueue,true);
            aqData.mIsRunning = false;
            AudioQueueDispose (aqData.mQueue,true);
            AudioFileClose (aqData.mAudioFile);

            //Get elapsed time of timer
            endTime = [NSDate timeIntervalSinceReferenceDate];
            elapsedTime = endTime - startTime;

            //Get elapsed time of audio file
            NSArray *pathComponents = [NSArray arrayWithObjects:
                                       [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject],
                                       songName,
                                       nil];
            NSURL *audioFileURL = [NSURL fileURLWithPathComponents:pathComponents];
            AVURLAsset* audioAsset = [AVURLAsset URLAssetWithURL:audioFileURL options:nil];
            CMTime audioDuration = audioAsset.duration;
            float audioDurationSeconds = CMTimeGetSeconds(audioDuration);

            //Log values
            NSLog(@"Track Duration: %f",audioDurationSeconds);
            NSLog(@"Timer Duration: %.6f", elapsedTime);

            //Show values on GUI too
            songDuration.text = [NSString stringWithFormat: @"Track Duration: %f",audioDurationSeconds];
            timerDuration.text = [NSString stringWithFormat:@"Timer Duration: %@",[NSString stringWithFormat: @"%.6f", elapsedTime]];
            debugLabel.text = @"Why is the duration of the track less than the duration the timer ran?";
        }


    }
    -(void) playSound
    {
        NSString *path = [[NSBundle mainBundle] pathForResource:@"blip2" ofType:@"aif"];
        SystemSoundID soundID;
        AudioServicesCreateSystemSoundID((__bridge CFURLRef)[NSURL fileURLWithPath:path],  &soundID);
        AudioServicesPlaySystemSound (soundID);
    }

    - (IBAction) record
    {
        [self prepareAudioQueue];
        songDuration.text = @"";
        timerDuration.text = @"";
        //debugLabel.text = @"Please wait 12 beats (The first four are count in)";
        //init beat number
        beatNumber = 1;

        //safe guard
        if(aqData.mIsRunning)
        {
            AudioQueueStop (aqData.mQueue,true);

            aqData.mIsRunning = false;

            AudioQueueDispose (aqData.mQueue,true);
            AudioFileClose (aqData.mAudioFile);
        }

        //start count in (metro will start recording)
        //aqData.mCurrentPacket = 0;
        //aqData.mIsRunning = true;
        startTime = [NSDate timeIntervalSinceReferenceDate];
        metroTimer = [NSTimer scheduledTimerWithTimeInterval:.6 target:self selector:@selector(metronomeFire) userInfo:nil repeats:YES];
        //recordTimer = [NSTimer scheduledTimerWithTimeInterval:4.8 target:self selector:@selector(killTimer) userInfo:nil repeats:NO];
        //AudioQueueStart (aqData.mQueue,NULL);

    }
    static void HandleInputBuffer (void *aqData,AudioQueueRef inAQ,AudioQueueBufferRef inBuffer,const AudioTimeStamp *inStartTime,UInt32 inNumPackets,const AudioStreamPacketDescription *inPacketDesc)
    {
        //boiler plate
        NSLog(@"HandleInputBuffer");

        struct AQRecorderState *pAqData = (struct AQRecorderState *) aqData;

        if (inNumPackets == 0 && pAqData->mDataFormat.mBytesPerPacket != 0)
            inNumPackets = inBuffer->mAudioDataByteSize / pAqData->mDataFormat.mBytesPerPacket;

        if (AudioFileWritePackets (pAqData->mAudioFile,false,inBuffer->mAudioDataByteSize,inPacketDesc,pAqData->mCurrentPacket,&inNumPackets,inBuffer->mAudioData) == noErr)
        {
            pAqData->mCurrentPacket += inNumPackets;
        }

        if (pAqData->mIsRunning == 0)
            return;

        AudioQueueEnqueueBuffer (pAqData->mQueue,inBuffer,0,NULL);
    }

    void DeriveBufferSize(AudioQueueRef audioQueue,AudioStreamBasicDescription ASBDescription,Float64 seconds,UInt32 *outBufferSize)
    {
        //boiler plate
        static const int maxBufferSize = 0x50000;
        int maxPacketSize = ASBDescription.mBytesPerPacket;
        if(maxPacketSize == 0)
        {
            UInt32 maxVBRPacketSize = sizeof(maxPacketSize);
            AudioQueueGetProperty(audioQueue, kAudioQueueProperty_MaximumOutputPacketSize, &maxPacketSize, &maxVBRPacketSize);
            NSLog(@"max buffer = %d",maxPacketSize);
        }
        Float64 numBytesForTime = ASBDescription.mSampleRate * maxPacketSize * seconds;
        *outBufferSize = (UInt32)(numBytesForTime < maxBufferSize ? numBytesForTime : maxBufferSize);
    }

    OSStatus SetMagicCookieForFile (AudioQueueRef inQueue, AudioFileID inFile)
    {
        //boiler plate
        OSStatus result = noErr;
        UInt32 cookieSize;
        if (AudioQueueGetPropertySize (inQueue,kAudioQueueProperty_MagicCookie,&cookieSize) == noErr)
        {
            char* magicCookie =(char *) malloc (cookieSize);
            if (AudioQueueGetProperty (inQueue,kAudioQueueProperty_MagicCookie,magicCookie,&cookieSize) == noErr)
            {
                result =    AudioFileSetProperty (inFile,kAudioFilePropertyMagicCookieData,cookieSize,magicCookie);
            }

            free (magicCookie);
        }
        return result;

    }

    - (void)didReceiveMemoryWarning
    {
        [super didReceiveMemoryWarning];
        // Dispose of any resources that can be recreated.
    }
    @end