I'm looking forward to replacing ALAssetsLibrary with the Photos framework in my app.
I can retrieve photos, collections, and assets just fine (and even write them back), but I don't see anywhere to access a photo's metadata (the {Exif}, {TIFF}, {GPS} dictionaries, etc.).
ALAssetsLibrary has a way to do this. UIImagePickerController has a way. Photos must have one too.
I see that PHAsset has a location property that covers the GPS dictionary, but I want access to all the metadata, including faces, orientation, exposure, ISO, and much more.
Apple is currently at beta 2. Perhaps more APIs are still to come?
UPDATE
There is no official way to do this using only the Photos API.
However, you can read the metadata after downloading the image data. There are a couple of ways to do this, using either PHImageManager or PHContentEditingInput.
The PHContentEditingInput approach requires less code and doesn't require importing ImageIO. I've wrapped it in a PHAsset category.
If you request a content editing input, you can get the full image as a CIImage, and CIImage has a properties attribute: a dictionary containing the image's metadata.
Sample Swift code:
    let options = PHContentEditingInputRequestOptions()
    options.networkAccessAllowed = true // download asset metadata from iCloud if needed
    asset.requestContentEditingInputWithOptions(options) { contentEditingInput, _ in
        if let url = contentEditingInput?.fullSizeImageURL,
           let fullImage = CIImage(contentsOfURL: url) {
            print(fullImage.properties)
        }
    }
Sample Objective-C code:
    PHContentEditingInputRequestOptions *options = [[PHContentEditingInputRequestOptions alloc] init];
    options.networkAccessAllowed = YES; // download asset metadata from iCloud if needed
    [asset requestContentEditingInputWithOptions:options
                               completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
        CIImage *fullImage = [CIImage imageWithContentsOfURL:contentEditingInput.fullSizeImageURL];
        NSLog(@"%@", fullImage.properties.description);
    }];
You will get the {Exif}, {TIFF}, {GPS}, and other dictionaries you are after.
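Once you have that properties dictionary, the sub-dictionaries live under braced keys. In real code you would look them up through the ImageIO constants (kCGImagePropertyExifDictionary and friends); the literal strings below are those constants' raw values, used here only to keep the sketch dependency-free, and the helper name is mine, not part of any API:

```swift
// ImageIO stores each metadata family under a braced key; these literals are
// the raw values of kCGImagePropertyExifDictionary, kCGImagePropertyTIFFDictionary,
// and kCGImagePropertyGPSDictionary.
let exifKey = "{Exif}"
let tiffKey = "{TIFF}"
let gpsKey  = "{GPS}"

/// Pull the familiar sub-dictionaries out of a CIImage.properties-style dictionary.
func metadataSubDictionaries(of properties: [String: Any]) -> [String: [String: Any]] {
    var result: [String: [String: Any]] = [:]
    for key in [exifKey, tiffKey, gpsKey] {
        if let sub = properties[key] as? [String: Any] {
            result[key] = sub
        }
    }
    return result
}
```

Top-level entries such as PixelWidth or Orientation sit directly in the outer dictionary and are not touched by this helper.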
A better solution that I found, and which works for me, is:
    PHImageRequestOptions *reqOptions = [[PHImageRequestOptions alloc] init];
    reqOptions.networkAccessAllowed = YES; // download from iCloud if needed
    [[PHImageManager defaultManager] requestImageDataForAsset:photoAsset
                                                      options:reqOptions
                                                resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
        CIImage *ciImage = [CIImage imageWithData:imageData];
        NSLog(@"Metadata : %@", ciImage.properties);
    }];
I thought I'd share some code for reading metadata using the ImageIO framework together with the Photos framework. You must request the image data using a PHCachingImageManager:
@property (strong) PHCachingImageManager *imageManager;
Request the image and use its data to create a metadata dictionary:
    - (void)metadataReader {
        PHFetchResult *result = [PHAsset fetchAssetsInAssetCollection:self.myAssetCollection options:nil];
        [result enumerateObjectsAtIndexes:[NSIndexSet indexSetWithIndex:myIndex]
                                  options:NSEnumerationConcurrent
                               usingBlock:^(PHAsset *asset, NSUInteger idx, BOOL *stop) {
            [self.imageManager requestImageDataForAsset:asset
                                                options:nil
                                          resultHandler:^(NSData *imageData, NSString *dataUTI, UIImageOrientation orientation, NSDictionary *info) {
                NSDictionary *metadata = [self metadataFromImageData:imageData];
                NSLog(@"Metadata: %@", metadata.description);
                NSDictionary *gpsDictionary = metadata[(NSString *)kCGImagePropertyGPSDictionary];
                if (gpsDictionary) {
                    NSLog(@"GPS: %@", gpsDictionary.description);
                }
                NSDictionary *exifDictionary = metadata[(NSString *)kCGImagePropertyExifDictionary];
                if (exifDictionary) {
                    NSLog(@"EXIF: %@", exifDictionary.description);
                }
                UIImage *image = [UIImage imageWithData:imageData scale:[UIScreen mainScreen].scale];
                // assign image wherever you need...
            }];
        }];
    }
Convert the NSData into metadata:
    - (NSDictionary *)metadataFromImageData:(NSData *)imageData {
        CGImageSourceRef imageSource = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
        if (imageSource) {
            NSDictionary *options = @{(NSString *)kCGImageSourceShouldCache : @NO};
            CFDictionaryRef imageProperties = CGImageSourceCopyPropertiesAtIndex(imageSource, 0, (__bridge CFDictionaryRef)options);
            CFRelease(imageSource);
            if (imageProperties) {
                // CFBridgingRelease transfers ownership to ARC; a plain __bridge cast
                // followed by CFRelease would return a dangling reference.
                return (NSDictionary *)CFBridgingRelease(imageProperties);
            }
        }
        NSLog(@"Can't read metadata");
        return nil;
    }
This has the overhead of fetching the image, so it isn't as fast as enumerating your assets or collections, but it's at least something.
PhotoKit limits metadata access to the properties of PHAsset (location, creationDate, favorite, hidden, modificationDate, pixelWidth, pixelHeight, ...). The reason, I suspect, is that with the introduction of iCloud Photo Library the image may not be on the device, so the full metadata isn't available. The only way to get the complete EXIF/IPTC metadata is to first download the original image from iCloud (if it isn't available locally) and then use ImageIO to extract it.
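To make that last point concrete, here is a minimal Swift sketch of the two-step approach, assuming iOS 13+/macOS 10.15+ where requestImageDataAndOrientation(for:options:resultHandler:) replaced the older requestImageData variant: ask PHImageManager for the original bytes with network access allowed, then hand them to ImageIO. The function names are mine, not part of any API.

```swift
import Photos
import ImageIO

// Extract the full ImageIO property dictionary ({Exif}, {TIFF}, {GPS}, ...)
// from raw image bytes.
func metadata(from imageData: Data) -> [String: Any]? {
    guard let source = CGImageSourceCreateWithData(imageData as CFData, nil) else { return nil }
    let options = [kCGImageSourceShouldCache: false] as CFDictionary
    return CGImageSourceCopyPropertiesAtIndex(source, 0, options) as? [String: Any]
}

// Fetch the original bytes for an asset -- downloading the iCloud original
// if it is not on the device -- and log its metadata.
func readFullMetadata(for asset: PHAsset) {
    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = true // required for iCloud-only originals
    PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { data, _, _, _ in
        if let data = data, let properties = metadata(from: data) {
            print(properties)
        }
    }
}
```

Setting isNetworkAccessAllowed is what permits the iCloud download described above; without it, assets whose originals live only in iCloud come back with nil data.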