iOS FD SDK v2.0
Installation
With .xcframework file
Drag and drop the .xcframework file into the Frameworks, Libraries, and Embedded Content section of the project settings.
Disable Bitcode: in Build Settings, set Enable Bitcode to No.
Import the framework: import ParavisionFD
With CocoaPods
Coming soon!
With Swift Package Manager
Coming soon!
Usage
Loading image to memory
Before running detection, the image to be analyzed has to be loaded into memory. For example, when
receiving an image in the PHPickerViewController delegate method:
func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
    dismiss(animated: true, completion: nil)
    guard let firstResult = results.first else { return }
    if firstResult.itemProvider.canLoadObject(ofClass: UIImage.self) {
        firstResult.itemProvider.loadObject(ofClass: UIImage.self) { reading, error in
            if let pickedImage = reading as? UIImage {
                // Perform detection with UIImage
            } else {
                // Handle error
            }
        }
    } else {
        // Handle error
    }
}
Another way to perform detection is with a CMSampleBuffer delivered by the
AVCaptureVideoDataOutputSampleBufferDelegate method of an AVCaptureSession video data output:
func captureOutput(_: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    // Perform detection with CMSampleBuffer
}
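For completeness, below is a minimal sketch of wiring up an AVCaptureVideoDataOutput so that the delegate method above receives camera frames. This is standard AVFoundation setup rather than part of the SDK, and the CameraFeed class name is illustrative.
import AVFoundation

final class CameraFeed: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let output = AVCaptureVideoDataOutput()

    func start() throws {
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }
        // Deliver frames to this object on a background queue
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
        if session.canAddOutput(output) { session.addOutput(output) }
        session.startRunning()
    }

    func captureOutput(_: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        // Pass sampleBuffer to the detector here
    }
}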
Bounding boxes
To detect bounding boxes on a photo, all you need is a UIImage or CMSampleBuffer containing an image
loaded into memory. The detector can be initialized ad hoc, as in the example, or cached and reused for
multiple images.
func detectBboxes(in image: UIImage) {
    guard let detection = PNBoundingBoxDetector() else {
        // Handle error
        return
    }
    do {
        let result = try detection.getBoundingBoxes(image: image) // or buffer: for CMSampleBuffer
        for bbox in result { // CGRect
            // Handle bounding box
        }
    } catch {
        // Handle error
    }
}
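As noted above, the detector can also be cached and reused across images. A minimal sketch, assuming only the documented PNBoundingBoxDetector API (the FaceDetectionService wrapper is illustrative):
final class FaceDetectionService {
    // Initialize once and reuse for every image or frame
    private let detector = PNBoundingBoxDetector()

    func boundingBoxes(in image: UIImage) -> [CGRect] {
        guard let detector = detector else { return [] }
        return (try? detector.getBoundingBoxes(image: image)) ?? []
    }
}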
Quality and acceptability
To estimate the quality and acceptability of the faces in a photo, you need to provide, in addition to the
image, a set of bounding boxes represented by the CGRect type.
func estimateQuality(for boundingBoxes: [CGRect], image: UIImage) {
    guard let qualityEstimator = PNQualityEstimator() else {
        // Handle error
        return
    }
    do {
        let qualities = try qualityEstimator.getQualities(image: image, boundingBoxes: boundingBoxes)
        let qualitiesAndBboxes = zip(qualities, boundingBoxes)
        // Parse result
    } catch {
        // Handle error
    }
}
The result of the quality estimator will always be an array of the same length as the input bounding box
array, with the quality/acceptability values at the same indexes as their corresponding bounding boxes.
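As an example of parsing that result, the zipped pairs can be filtered to keep only faces whose acceptability passes a threshold. This is a sketch; the 0.5 threshold is an arbitrary assumption, not an SDK recommendation.
func acceptableFaces(for boundingBoxes: [CGRect], image: UIImage) -> [CGRect] {
    guard let qualityEstimator = PNQualityEstimator() else { return [] }
    guard let qualities = try? qualityEstimator.getQualities(image: image, boundingBoxes: boundingBoxes) else { return [] }
    // Keep bounding boxes whose acceptability score passes the (assumed) threshold
    return zip(qualities, boundingBoxes)
        .filter { $0.0.acceptability > 0.5 }
        .map { $0.1 }
}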
Landmarks
To detect the landmarks of the faces in a photo, you need to provide, in addition to the image, a set of
bounding boxes represented by the CGRect type.
func detectLandmarks(for boundingBoxes: [CGRect], image: UIImage) {
    guard let landmarkDetection = PNLandmarkDetector() else {
        // Handle error
        return
    }
    do {
        let landmarks = try landmarkDetection.getLandmarks(image: image, boundingBoxes: boundingBoxes)
        let landmarksAndBboxes = zip(landmarks, boundingBoxes)
        // Parse result
    } catch {
        // Handle error
    }
}
The result of the landmark detector will always be an array of the same length as the input bounding box
array, with the landmark values at the same indexes as their corresponding bounding boxes.
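Putting the three steps together, a full detection pass over a single image could look like the sketch below, which uses only the APIs documented in the Types section and keeps error handling minimal.
func analyze(image: UIImage) {
    guard let detector = PNBoundingBoxDetector(),
          let landmarkDetector = PNLandmarkDetector(),
          let qualityEstimator = PNQualityEstimator() else {
        // Handle initialization error
        return
    }
    do {
        // 1. Detect face bounding boxes (sorted from largest to smallest)
        let boxes = try detector.getBoundingBoxes(image: image)
        guard !boxes.isEmpty else { return }

        // 2. Detect landmarks and estimate quality for the same boxes
        let landmarks = try landmarkDetector.getLandmarks(image: image, boundingBoxes: boxes)
        let qualities = try qualityEstimator.getQualities(image: image, boundingBoxes: boxes)

        // 3. Combine per-face results by index (result arrays match the box array's length)
        for (index, box) in boxes.enumerated() {
            let faceLandmarks = landmarks[index] // PNLandmarks
            let faceQuality = qualities[index]   // PNQuality
            // Use box, faceLandmarks.nose, faceQuality.quality, ...
        }
    } catch {
        // Handle error
    }
}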
Types
PNBoundingBoxDetector
CLASS
PNBoundingBoxDetector
public class PNBoundingBoxDetector
Bounding box detector. Finds face bounding boxes on a provided image.
Methods
init()
public convenience init?()
Convenience init. Returns nil only if internal module initialization fails.
getBoundingBoxes()
public func getBoundingBoxes(image: UIImage? = nil, buffer: CMSampleBuffer? = nil, data: Data? = nil) throws -> [CGRect]
Bounding box detection process.
Parameter image: UIImage initialized with the image that's intended to be analyzed (alternatively parameter buffer: or data:).
Returns: An array of bounding boxes detected on the image. If no face is detected, the array will be empty. The array is sorted by face size (from largest to smallest).
Throws: Can throw in case of any undesired internal behaviour.
Parameters
Name      Description
image     UIImage initialized with the image that's intended to be analyzed.
buffer    CMSampleBuffer frame from camera.
data      Image Data loaded from file.
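As an illustration of the data: parameter, detection can also run on raw image Data loaded from a file. This is a sketch; the helper function and file URL are illustrative.
func detectFromFile(at url: URL) throws -> [CGRect] {
    guard let detector = PNBoundingBoxDetector() else { return [] }
    // Load the image bytes and pass them via the data: parameter
    let imageData = try Data(contentsOf: url)
    return try detector.getBoundingBoxes(data: imageData)
}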
PNLandmarks
STRUCT
PNLandmarks
public struct PNLandmarks: Equatable
Face landmarks. Represents mouth, nose and eye landmarks.
Properties
eyeRight
public let eyeRight: CGPoint
Right eye landmark coordinates in iOS coordinate system.
eyeLeft
public let eyeLeft: CGPoint
Left eye landmark coordinates in iOS coordinate system.
nose
public let nose: CGPoint
Nose landmark coordinates in iOS coordinate system.
mouthRight
public let mouthRight: CGPoint
Right mouth landmark coordinates in iOS coordinate system.
mouthLeft
public let mouthLeft: CGPoint
Left mouth landmark coordinates in iOS coordinate system.
Methods
init(eyeRight:eyeLeft:nose:mouthRight:mouthLeft:)
public init(eyeRight: CGPoint, eyeLeft: CGPoint, nose: CGPoint, mouthRight: CGPoint, mouthLeft: CGPoint)
Default initializer.
asCGPointArray()
public func asCGPointArray() -> [CGPoint]
Convenience method for representing landmarks as an array of CGPoints.
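For example, asCGPointArray() makes it easy to process all five landmark points in one pass, e.g. when drawing them onto a graphics context. The drawing helper below is illustrative.
func drawLandmarks(_ landmarks: PNLandmarks, in context: CGContext) {
    // Draw a small dot at each of the five landmark points
    for point in landmarks.asCGPointArray() {
        let dot = CGRect(x: point.x - 2, y: point.y - 2, width: 4, height: 4)
        context.fillEllipse(in: dot)
    }
}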
PNLandmarkDetector
CLASS
PNLandmarkDetector
public class PNLandmarkDetector
Landmark detector. Finds face landmarks on a provided image, given bounding boxes as input.
Methods
init()
public convenience init?()
Convenience init. Returns nil only if internal module initialization fails.
getLandmarks(boundingBoxes:)
public func getLandmarks(image: UIImage? = nil, buffer: CMSampleBuffer? = nil, data: Data? = nil, boundingBoxes: [CGRect], withFrontality: Bool = false) throws -> [PNLandmarks]
Landmark detection process.
Parameter image: UIImage initialized with the image that's intended to be analyzed (alternatively parameter buffer: or data:).
Parameter boundingBoxes: Previously detected bounding boxes. These don't necessarily have to be a result of PNBoundingBoxDetector, but they should represent an actual face on the image.
Parameter withFrontality: If true, frontality will also be calculated.
Returns: An array of PNLandmarks objects corresponding to the given bounding boxes.
Throws: Can throw in case of any undesired internal behaviour.
Parameters
Name            Description
image           UIImage initialized with the image that's intended to be analyzed.
buffer          CMSampleBuffer frame from camera.
data            Image Data loaded from file.
boundingBoxes   Previously detected bounding boxes. These don't necessarily have to be a result of PNBoundingBoxDetector, but they should represent an actual face on the image.
withFrontality  If true, frontality will also be calculated.
PNQualityEstimator
CLASS
PNQualityEstimator
public class PNQualityEstimator
Quality estimator. Calculates face image quality and acceptability on a provided image, given bounding
boxes as input.
Methods
init()
public convenience init?()
Convenience init. Returns nil only if internal module initialization fails.
getQualities(boundingBoxes:)
public func getQualities(image: UIImage? = nil, buffer: CMSampleBuffer? = nil, data: Data? = nil, boundingBoxes: [CGRect]) throws -> [PNQuality]
Quality estimation process.
Parameter image: UIImage initialized with the image that's intended to be analyzed (alternatively parameter buffer: or data:).
Parameter boundingBoxes: Previously detected bounding boxes. These don't necessarily have to be a result of PNBoundingBoxDetector, but they should represent an actual face on the image.
Returns: An array of PNQuality objects corresponding to the given bounding boxes.
Throws: Can throw in case of any undesired internal behaviour.
Parameters
Name            Description
image           UIImage initialized with the image that's intended to be analyzed.
buffer          CMSampleBuffer frame from camera.
data            Image Data loaded from file.
boundingBoxes   Previously detected bounding boxes. These don't necessarily have to be a result of PNBoundingBoxDetector, but they should represent an actual face on the image.
PNQuality
STRUCT
PNQuality
public struct PNQuality
Face quality. Contains quality and acceptability scores in the [0,1] range.
Properties
quality
public let quality: Float
Quality value in [0,1] range.
acceptability
public let acceptability: Float
Acceptability value in [0,1] range.
Methods
init(quality:acceptability:)
public init(quality: Float, acceptability: Float)
Default initializer.
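When several faces are detected, the quality score can be used, for instance, to pick the best face. A sketch relying on the index correspondence between bounding boxes and quality results described above:
func bestFace(boundingBoxes: [CGRect], qualities: [PNQuality]) -> (box: CGRect, quality: PNQuality)? {
    // Pair each bounding box with its quality score and keep the highest-quality face
    return zip(boundingBoxes, qualities)
        .max(by: { $0.1.quality < $1.1.quality })
        .map { (box: $0.0, quality: $0.1) }
}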
Release notes
v2.0
Date
01 Mar 2023
Description
The monolithic SDK has been split into three separate SDKs: Face Detection (iOS FD SDK), Face
Recognition (iOS FR SDK), and Liveness 2D RGB (iOS FL SDK).
API changes have been applied in order to align the SDK with other Paravision SDKs:
[Link] replaced [Link]
[Link] replaced [Link]
[Link] replaced [Link]