    - (void)audioPlayerEndInterruption:(AVAudioPlayer *)player
                             withFlags:(NSUInteger)flags{
        if (flags == AVAudioSessionInterruptionFlags_ShouldResume &&
            player != nil){
            [player play];
        }
    }

Discussion

On an iOS device, such as an iPhone, a phone call can interrupt the execution of the foreground application. In that case, the audio session(s) associated with the application are deactivated, and audio files will not be played until the interruption has ended. At the beginning and at the end of an interruption, we receive delegate messages from the AVAudioPlayer informing us of the different states the audio session is passing through. After the interruption ends, we can simply resume the playback of audio.

Incoming phone calls cannot be simulated with iPhone Simulator. You must always test your applications on a real device.

When an interruption occurs, the audioPlayerBeginInterruption: delegate method of an AVAudioPlayer instance will be called. At this point, your audio session has been deactivated. In the case of a phone call, the user simply hears her ring tone. When the interruption ends (the phone call is finished or the user rejects the call), the audioPlayerEndInterruption:withFlags: delegate method of your AVAudioPlayer will be invoked. If the withFlags parameter contains the value AVAudioSessionInterruptionFlags_ShouldResume, you can immediately resume the playback of your audio player using the play instance method of AVAudioPlayer.

The playback of audio files using AVAudioPlayer might show memory leaks in Instruments when the application is run on iPhone Simulator. Testing the same application on an iOS device shows that these leaks are unique to the simulator, not the device. I strongly suggest that you run, test, debug, and optimize your applications on real devices before releasing them on the App Store.

9.3 Recording Audio

Problem

You want to be able to record audio files on an iOS device.

Solution

Make sure you have added CoreAudio.framework to your target, and use the AVAudioRecorder class in the AV Foundation framework:

    NSError *error = nil;
    NSString *pathAsString = [self audioRecordingPath];
    NSURL *audioRecordingURL = [NSURL fileURLWithPath:pathAsString];

    self.audioRecorder = [[AVAudioRecorder alloc]
                          initWithURL:audioRecordingURL
                             settings:[self audioRecordingSettings]
                                error:&error];

For information about the audioRecordingSettings and audioRecordingPath methods used in this example, refer to this recipe's Discussion.

Discussion

The AVAudioRecorder class in the AV Foundation framework facilitates audio recording in iOS applications. To start a recording, you need to pass various pieces of information to the initWithURL:settings:error: instance method of AVAudioRecorder:

The URL of the file where the recording should be saved
This is a local URL. The AV Foundation framework decides which audio format to use for the recording based on the file extension provided in this URL, so choose the extension carefully.

The settings that must be used before and while recording
Examples include the sampling rate, the number of channels, and other information that helps the audio recorder start the recording. This is a dictionary object.

The address of an instance of NSError where any initialization error should be saved
The error information could be valuable later, and you can retrieve it from this parameter in case something goes wrong.
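If initialization does fail, the recorder reference will typically be nil and the NSError object describes the reason. A small sketch of inspecting it, using the error variable from the Solution code above (this check is an addition, not part of the recipe's listing):

    if (self.audioRecorder == nil){
        /* Initialization failed; the error object explains why */
        NSLog(@"Could not create the audio recorder: %@",
              [error localizedDescription]);
    }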
The settings parameter of the initWithURL:settings:error: method is particularly interesting. There are many keys that can be stored in the settings dictionary, but we will discuss only some of the most important ones in this recipe:

AVFormatIDKey
The format of the recorded audio. Some of the values that can be specified for this key are:
• kAudioFormatLinearPCM
• kAudioFormatAppleLossless

AVSampleRateKey
The sample rate to be used for the recording.

AVNumberOfChannelsKey
The number of channels to be used for the recording.

AVEncoderAudioQualityKey
The quality with which the recording must be made. Some of the values that can be specified for this key are:
• AVAudioQualityMin
• AVAudioQualityLow
• AVAudioQualityMedium
• AVAudioQualityHigh
• AVAudioQualityMax

With all this information in hand, we can go on and write an application that records audio input into a file and then plays it back using AVAudioPlayer. Specifically, we want to:

1. Start recording audio in Apple Lossless format.
2. Save the recording into a file named Recording.m4a in our application's Documents directory.
3. Five seconds after the recording starts, finish the recording process and immediately start playing the file into which we recorded the audio input.

We will start by declaring the required properties in the .h file of a simple view controller:

    #import <UIKit/UIKit.h>
    #import <CoreAudio/CoreAudioTypes.h>
    #import <AVFoundation/AVFoundation.h>

    @interface Recording_AudioViewController : UIViewController
      <AVAudioPlayerDelegate, AVAudioRecorderDelegate>

    @property (nonatomic, strong) AVAudioRecorder *audioRecorder;
    @property (nonatomic, strong) AVAudioPlayer *audioPlayer;

    - (NSString *) audioRecordingPath;
    - (NSDictionary *) audioRecordingSettings;

    @end

When the view inside our view controller is loaded for the first time, we attempt to start the recording process and then, if it started successfully, stop it after five seconds:

    - (void)viewDidLoad {
        [super viewDidLoad];

        NSError *error = nil;
        NSString *pathAsString = [self audioRecordingPath];
        NSURL *audioRecordingURL = [NSURL fileURLWithPath:pathAsString];

        self.audioRecorder = [[AVAudioRecorder alloc]
                              initWithURL:audioRecordingURL
                                 settings:[self audioRecordingSettings]
                                    error:&error];

        if (self.audioRecorder != nil){

            self.audioRecorder.delegate = self;

            /* Prepare the recorder and then start the recording */
            if ([self.audioRecorder prepareToRecord] &&
                [self.audioRecorder record]){

                NSLog(@"Successfully started to record.");

                /* After 5 seconds, let's stop the recording process */
                [self performSelector:@selector(stopRecordingOnAudioRecorder:)
                           withObject:self.audioRecorder
                           afterDelay:5.0f];

            } else {
                NSLog(@"Failed to record.");
                self.audioRecorder = nil;
            }

        } else {
            NSLog(@"Failed to create an instance of the audio recorder.");
        }

    }

    - (void) viewDidUnload{
        [super viewDidUnload];

        if ([self.audioRecorder isRecording]){
            [self.audioRecorder stop];
        }
        self.audioRecorder = nil;

        if ([self.audioPlayer isPlaying]){
            [self.audioPlayer stop];
        }
        self.audioPlayer = nil;
    }

In the viewDidLoad method of our view controller, we attempt to instantiate an object of type AVAudioRecorder and assign it to the audioRecorder property that we declared earlier in the .h file of the same view controller.
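One detail the listing leaves open: if the view unloads before the five-second delay fires, the request scheduled with performSelector:withObject:afterDelay: is still pending. A hedged sketch of cancelling it in viewDidUnload, before the audioRecorder property is set to nil (an addition, not part of the original listing):

    /* Cancel the delayed stop request, if it has not fired yet */
    [NSObject cancelPreviousPerformRequestsWithTarget:self
                                             selector:@selector(stopRecordingOnAudioRecorder:)
                                               object:self.audioRecorder];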
We are using an instance method called audioRecordingPath to determine the NSString representation of the local URL where we want to store our recording. This method is implemented like so:

    - (NSString *) audioRecordingPath{

        NSString *result = nil;

        NSArray *folders =
          NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                              NSUserDomainMask,
                                              YES);

        NSString *documentsFolder = [folders objectAtIndex:0];

        result = [documentsFolder
                  stringByAppendingPathComponent:@"Recording.m4a"];

        return result;
    }

The return value of this method is the Documents path of your application with the name of the destination file appended to it. For instance, if the Documents path of your application is:

    /var/mobile/Applications/ApplicationID/Documents/

the destination audio recording path will be:

    /var/mobile/Applications/ApplicationID/Documents/Recording.m4a

When instantiating our AVAudioRecorder, we pass a dictionary for the settings parameter of the audio recorder's initialization method, as explained before. This dictionary is constructed by our audioRecordingSettings instance method, implemented in this way:

    - (NSDictionary *) audioRecordingSettings{

        NSDictionary *result = nil;

        /* Let's prepare the audio recorder options in the dictionary.
           Later we will use this dictionary to instantiate an audio
           recorder of type AVAudioRecorder */
        NSMutableDictionary *settings = [[NSMutableDictionary alloc] init];

        [settings setValue:[NSNumber numberWithInteger:kAudioFormatAppleLossless]
                    forKey:AVFormatIDKey];

        [settings setValue:[NSNumber numberWithFloat:44100.0f]
                    forKey:AVSampleRateKey];

        [settings setValue:[NSNumber numberWithInteger:1]
                    forKey:AVNumberOfChannelsKey];

        [settings setValue:[NSNumber numberWithInteger:AVAudioQualityLow]
                    forKey:AVEncoderAudioQualityKey];

        result = [NSDictionary dictionaryWithDictionary:settings];

        return result;
    }

Five seconds after the recording starts successfully in the viewDidLoad method of the view controller, we call the stopRecordingOnAudioRecorder: method, implemented like so:

    - (void) stopRecordingOnAudioRecorder:(AVAudioRecorder *)paramRecorder{
        /* Just stop the audio recorder here */
        [paramRecorder stop];
    }

Now that we have asked our audio recorder to stop recording, we wait for its delegate messages to tell us when the recording has actually stopped. It is best not to assume that the stop instance method of AVAudioRecorder stops the recording instantly. Instead, I recommend that you wait for the audioRecorderDidFinishRecording:successfully: delegate method (declared in the AVAudioRecorderDelegate protocol) before proceeding.
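If you want an extra sanity check at that point, you can also ask NSFileManager whether a file was actually written to the expected path. A small sketch using the audioRecordingPath helper above (an addition, not part of the recipe's code):

    /* Verify that the recorder produced a file on disk */
    if (![[NSFileManager defaultManager]
            fileExistsAtPath:[self audioRecordingPath]]){
        NSLog(@"No recording was found on disk.");
    }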
When the audio recording has actually stopped, we attempt to play what was recorded:

    - (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)recorder
                               successfully:(BOOL)flag{

        if (flag){

            NSLog(@"Successfully stopped the audio recording process.");

            /* Let's try to retrieve the data for the recorded file */
            NSError *playbackError = nil;
            NSError *readingError = nil;

            NSData *fileData =
              [NSData dataWithContentsOfFile:[self audioRecordingPath]
                                     options:NSDataReadingMapped
                                       error:&readingError];

            /* Form an audio player and make it play the recorded data */
            self.audioPlayer = [[AVAudioPlayer alloc]
                                initWithData:fileData
                                       error:&playbackError];

            /* Could we instantiate the audio player? */
            if (self.audioPlayer != nil){

                self.audioPlayer.delegate = self;

                /* Prepare to play and start playing */
                if ([self.audioPlayer prepareToPlay] &&
                    [self.audioPlayer play]){
                    NSLog(@"Started playing the recorded audio.");
                } else {
                    NSLog(@"Could not play the audio.");
                }

            } else {
                NSLog(@"Failed to create an audio player.");
            }

        } else {
            NSLog(@"Stopping the audio recording failed.");
        }

        /* Here we don't need the audio recorder anymore */
        self.audioRecorder = nil;

    }

After the audio player finishes playing the recording (if it does so successfully), the audioPlayerDidFinishPlaying:successfully: delegate method will be called on the delegate object of our audio player. We implement this method like so (it is defined in the AVAudioPlayerDelegate protocol):

    - (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player
                           successfully:(BOOL)flag{

        if (flag){
            NSLog(@"Audio player stopped correctly.");
        } else {
            NSLog(@"Audio player did not stop correctly.");
        }

        if ([player isEqual:self.audioPlayer]){
            self.audioPlayer = nil;
        } else {
            /* This is not our player */
        }

    }

As explained in Recipe 9.2, when playing audio files using AVAudioPlayer, we also need to handle interruptions, such as incoming phone calls, when deploying our application on an iOS device and before releasing the application on the App Store:

    - (void)audioPlayerBeginInterruption:(AVAudioPlayer *)player{
        /* The audio session has been deactivated here */
    }

    - (void)audioPlayerEndInterruption:(AVAudioPlayer *)player
                             withFlags:(NSUInteger)flags{
        if (flags == AVAudioSessionInterruptionFlags_ShouldResume){
            [player play];
        }
    }

Instances of AVAudioRecorder must also handle interruptions, just like instances of AVAudioPlayer. These interruptions can be handled as explained in Recipe 9.4.

See Also

Recipe 9.2; Recipe 9.4

9.4 Handling Interruptions While Recording Audio

Problem

You want your AVAudioRecorder instance to be able to resume recording after an interruption, such as an incoming phone call.

Solution

Implement the audioRecorderBeginInterruption: and audioRecorderEndInterruption:withFlags: methods of the AVAudioRecorderDelegate protocol in the delegate object of your audio recorder, and resume the recording process by invoking the record instance method of your AVAudioRecorder when the interruption has ended:

    - (void)audioRecorderBeginInterruption:(AVAudioRecorder *)recorder{
        NSLog(@"Recording process is interrupted");
    }

    - (void)audioRecorderEndInterruption:(AVAudioRecorder *)recorder
                               withFlags:(NSUInteger)flags{
        if (flags == AVAudioSessionInterruptionFlags_ShouldResume){
            NSLog(@"Resuming the recording");
            [recorder record];
        }
    }

Discussion

Just like audio players (instances of AVAudioPlayer), audio recorders of type AVAudioRecorder receive delegate messages whenever the audio session associated with them is deactivated because of an interruption. The two methods mentioned in this recipe's Solution are the best places to handle such interruptions. In the case of an interruption to the audio recorder, you can invoke the record instance method of AVAudioRecorder after the interruption to continue the recording process. Be aware, however, that the new recording will overwrite the previous one, and all data that was recorded before the interruption will be lost.
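If that partial audio matters to your application, one workaround (not part of this recipe) is to leave the original file untouched and continue recording into a second file, then deal with the two pieces yourself later. A rough sketch, assuming the audioRecorder property and the audioRecordingSettings helper from Recipe 9.3; audioRecordingContinuationPath is a hypothetical helper that would return a different file name, for example Recording-2.m4a:

    - (void)audioRecorderEndInterruption:(AVAudioRecorder *)recorder
                               withFlags:(NSUInteger)flags{
        if (flags == AVAudioSessionInterruptionFlags_ShouldResume){
            /* Keep the partially recorded file and continue into a
               second, hypothetical file instead of overwriting it */
            NSURL *continuationURL =
              [NSURL fileURLWithPath:[self audioRecordingContinuationPath]];
            NSError *error = nil;
            self.audioRecorder = [[AVAudioRecorder alloc]
                                  initWithURL:continuationURL
                                     settings:[self audioRecordingSettings]
                                        error:&error];
            self.audioRecorder.delegate = self;
            if ([self.audioRecorder prepareToRecord]){
                [self.audioRecorder record];
            }
        }
    }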
It is very important to bear in mind that when the delegate of your audio recorder receives the audioRecorderBeginInterruption: message, the audio session has already been deactivated, so you cannot simply resume the recording at that point. Only after the interruption has ended should you invoke the record instance method of your AVAudioRecorder to resume recording.

9.5 Playing Audio over Other Active Sounds

Problem

You either want to put other applications in silent mode while you play audio, or play your audio on top of other applications' audio playback (if any).

Solution

Use audio sessions to set the type of audio category your application uses.

Discussion

The AVAudioSession class was introduced in the AV Foundation framework. Every iOS application has one audio session. This audio session can be accessed using the sharedInstance class method of the AVAudioSession class, like so:

    AVAudioSession *audioSession = [AVAudioSession sharedInstance];

After retrieving an instance of the AVAudioSession class, you can invoke the setCategory:error: instance method of the audio session object to choose among the different categories available to iOS applications. The different values that can be set as the audio session category of an application are listed here:

AVAudioSessionCategorySoloAmbient
This category is exactly like the AVAudioSessionCategoryAmbient category, except that it stops the audio playback of all other applications, such as the iPod. When the device is put into silent mode, your audio playback is paused. This also happens when the screen is locked. This is the default category that iOS chooses for an application.

AVAudioSessionCategoryRecord
This stops other applications' audio (e.g., the iPod) and also will not allow your application to initiate audio playback (e.g., using AVAudioPlayer). You can only record audio in this mode. Using this category, calling the prepareToPlay instance method of AVAudioPlayer will return YES, and the play instance method will return NO. The main UI will function as usual. The recording of your application will continue even if the iOS device's screen is locked by the user.

AVAudioSessionCategoryPlayback
This category silences other applications' audio playback (such as the audio playback of the iPod application). You can then use the prepareToPlay and play instance methods of AVAudioPlayer to play a sound in your application. The main UI thread will function as normal. The audio playback will continue even if the screen is locked by the user and even if the device is in silent mode.

AVAudioSessionCategoryPlayAndRecord
This category allows audio to be played and recorded at the same time in your application. It stops the audio playback of other applications when your audio recording or playback begins. The main UI thread of your application will function as normal. The playback and the recording will continue even if the screen is locked or the device is in silent mode.

AVAudioSessionCategoryAudioProcessing
This category can be used for applications that do audio processing, but not audio playback or recording. With this category set, you cannot play or record any audio in your application. Calling the prepareToPlay and play instance methods of AVAudioPlayer will return NO. Audio playback of other applications, such as the iPod, will also stop if this category is set.

AVAudioSessionCategoryAmbient
This category will not stop the audio from other applications, but it will allow you to play your audio over the audio being played by other applications, such as the iPod. The main UI thread of your application will function normally. The prepareToPlay and play instance methods of AVAudioPlayer will return YES. The audio being played by your application will stop when the user locks the screen.
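Whichever category you pick, the call itself is made on the shared session object, and you can also activate the session explicitly so that the category takes effect. The following is a minimal sketch rather than code from the recipe; the choice of AVAudioSessionCategoryPlayback is purely illustrative:

    NSError *sessionError = nil;
    AVAudioSession *session = [AVAudioSession sharedInstance];

    /* Any of the categories listed above can be used here */
    if (![session setCategory:AVAudioSessionCategoryPlayback
                        error:&sessionError]){
        NSLog(@"Could not set the category: %@",
              [sessionError localizedDescription]);
    }

    /* Activating the session puts the chosen category into effect */
    if (![session setActive:YES error:&sessionError]){
        NSLog(@"Could not activate the audio session: %@",
              [sessionError localizedDescription]);
    }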
The silent mode silences the audio playback of your application only if your application is the only application playing audio. If you start playing audio while the iPod is playing a song, putting the device in silent mode does not stop your audio playback.

To give you an example of using AVAudioSession, let's start an audio player that plays its audio file over other applications' audio playback. We will begin with the .h file of a view controller:

    #import <UIKit/UIKit.h>
    #import <AVFoundation/AVFoundation.h>

[...]

... playing. We will do this in the viewDidLoad method of our view controller:

    - (void)viewDidLoad {
        [super viewDidLoad];

        NSError *audioSessionError = nil;
        AVAudioSession *audioSession = [AVAudioSession sharedInstance];

        if ([audioSession setCategory:AVAudioSessionCategoryAmbient
                                error:&audioSessionError]){
            NSLog(@"Successfully set the audio session.");
        } else {
            NSLog(@"Could not set the audio session");
        }

        dispatch_queue_t ...

[...]

... of the AVAudioSession class in the viewDidLoad instance method of our view controller to set the audio category of our application to AVAudioSessionCategoryAmbient, in order to allow our application to play audio files over other applications' audio playback.

9.6 Playing Video Files

Problem

You would like to be able to play video files in your iOS application. ...

[...]

... framework on iOS Simulator, but the Contacts database on the simulator is empty by default. If you want to run the examples in this chapter on iOS Simulator, first populate your address book (on the simulator) using the Contacts application. I have populated my iOS Simulator's contacts database with three entries, as shown in Figure 10-2.

[Figure 10-2: Contacts added to iOS Simulator]

[...]

    ... arrayWithObject:thirdSecondThumbnail];

        /* Ask the movie player to capture this frame for us */
        [self.moviePlayer requestThumbnailImagesAtTimes:requestedThumbnails
                                             timeOption:MPMovieTimeOptionExact];

    } else {
        NSLog(@"Failed to instantiate the movie player.");
    }
    }

You can see that we are asking the movie player to capture the frame at the third second into the movie. Once this ...

[...]

... with iTunes while the picker was being displayed to the user. Immediately, your mediaPickerDidCancel: delegate message will be called as well.

CHAPTER 10
Address Book

10.0 Introduction

On an iOS device, the Contacts application allows users to add to, remove from, and manipulate their address book. The address book can be a collection of people and groups. Each person ...

[...]

... In the Build Phases tab, find and expand the Link Binary with Libraries box and press the + button, located at the bottom left corner of that box.

5. In the list that gets displayed, select AddressBook.framework and press the Add button (see Figure 10-1).

[Figure 10-1: Adding the AddressBook framework to our app]

After you've added the framework to your application, whenever you want to use address-book-related ...
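The preview cuts the sentence off here; the usual companion step (not shown in this excerpt) is to import the framework's header in any file that calls the address book C functions:

    #import <AddressBook/AddressBook.h>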
[...]

Bear in mind that the MPMoviePlayerController class does not work in iPhone Simulator. You need to run this code on a real device and check the results for yourself.

See Also

Recipe 9.7

9.7 Capturing Thumbnails from a Video File

Problem

You are playing a video file using an instance of the MPMoviePlayerController class and would like to capture a screenshot from the movie at ...

[...]

    ... *buttonStopPlaying;

    @end

When our view loads up, we will then instantiate these two buttons and place them on our view:

    - (void)viewDidLoad {
        [super viewDidLoad];

        self.view.backgroundColor = [UIColor whiteColor];

        self.buttonPickAndPlay = [UIButton
                                  buttonWithType:UIButtonTypeRoundedRect];
        self.buttonPickAndPlay.frame = CGRectMake(0.0f, 0.0f, 200, 37.0f);
        self.buttonPickAndPlay.center = ...

[...]

        ... addSubview:self.buttonPickAndPlay];

        self.buttonStopPlaying = [UIButton
                                  buttonWithType:UIButtonTypeRoundedRect];
        self.buttonStopPlaying.frame = CGRectMake(0.0f, 0.0f, 200, 37.0f);
        self.buttonStopPlaying.center = CGPointMake(self.view.center.x,
                                                    self.view.center.y + 50);
        [self.buttonStopPlaying setTitle:@"Stop Playing"
                                forState:UIControlStateNormal];
        [self.buttonStopPlaying addTarget:self
                                   action:@selector(stopPlayingAudio) ...

[...]

... does not work with delegates; instead, it relies on notifications. This allows for a very flexible decoupling between the system libraries and the applications that iOS programmers write. For classes such as MPMoviePlayerController, we start listening for notifications that get sent by instances of that class. We use the default notification center and add ourselves as an ...
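The excerpt stops mid-sentence, but the general observer pattern it is describing looks roughly like the following sketch. This is not the book's listing; the moviePlayer property and the moviePlayerFinished: selector are assumptions, and the MediaPlayer framework is assumed to be linked and imported:

    /* Somewhere after creating self.moviePlayer (an MPMoviePlayerController),
       register for its "playback finished" notification */
    [[NSNotificationCenter defaultCenter]
        addObserver:self
           selector:@selector(moviePlayerFinished:)
               name:MPMoviePlayerPlaybackDidFinishNotification
             object:self.moviePlayer];

    /* The matching handler */
    - (void) moviePlayerFinished:(NSNotification *)notification{
        /* Stop observing this player once playback has finished */
        [[NSNotificationCenter defaultCenter]
            removeObserver:self
                      name:MPMoviePlayerPlaybackDidFinishNotification
                    object:[notification object]];
        NSLog(@"The movie player finished playing.");
    }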