public class CameraDetector extends Detector implements CameraHelper.OnCameraHelperEventListener
Modifier and Type | Class and Description |
---|---|
static interface | CameraDetector.CameraEventListener: Reports events related to the handling of the Android Camera by CameraDetector. |
static class | CameraDetector.CameraType: This enumeration is used to specify which camera to use during recording. |
Nested classes/interfaces inherited from class Detector: Detector.FaceListener, Detector.ImageListener
Constructor and Description |
---|
CameraDetector(android.content.Context context, CameraDetector.CameraType cameraType, android.view.SurfaceView cameraPreviewView): Creates a CameraDetector. |
Modifier and Type | Method and Description |
---|---|
void | onFrameAvailable(byte[] frame, int width, int height, Frame.ROTATE rotation) |
void | onFrameSizeSelected(int width, int height, Frame.ROTATE rotation) |
void | reset(): Resets the baselines used to measure facial expressions and emotions. |
void | setCameraType(CameraDetector.CameraType type): Indicates which device camera to use. |
void | setMaxProcessRate(float maxFramesPerSecond): Sets the maximum processing rate at which to operate, in frames per second (FPS). |
void | setOnCameraEventListener(CameraDetector.CameraEventListener listener) |
void | setSendUnprocessedFrames(boolean sendUnprocessedFrameFlag): When the SDK is in control of the camera and the SDK frame rate is lower than the camera frame rate, some frames will not be processed for expressions by the SDK. |
void | start(): Initiates processing of frames received from the device's camera. |
void | stop(): Stops processing frames received from the device's camera, and releases the camera to allow its use by other apps. |
Methods inherited from class Detector: getDetectAnger, getDetectAttention, getDetectBrowFurrow, getDetectBrowRaise, getDetectChinRaise, getDetectContempt, getDetectDisgust, getDetectEngagement, getDetectEyeClosure, getDetectFear, getDetectInnerBrowRaise, getDetectJoy, getDetectLipCornerDepressor, getDetectLipPress, getDetectLipPucker, getDetectLipSuck, getDetectMouthOpen, getDetectNoseWrinkle, getDetectSadness, getDetectSmile, getDetectSmirk, getDetectSurprise, getDetectUpperLipRaise, getDetectValence, getPercentFaceDetected, isRunning, setDetectAllEmotions, setDetectAllExpressions, setDetectAnger, setDetectAttention, setDetectBrowFurrow, setDetectBrowRaise, setDetectChinRaise, setDetectContempt, setDetectDisgust, setDetectEngagement, setDetectEyeClosure, setDetectFear, setDetectInnerBrowRaise, setDetectJoy, setDetectLipCornerDepressor, setDetectLipPress, setDetectLipPucker, setDetectLipSuck, setDetectMouthOpen, setDetectNoseWrinkle, setDetectSadness, setDetectSmile, setDetectSmirk, setDetectSurprise, setDetectUpperLipRaise, setDetectValence, setFaceListener, setImageListener, setLicensePath, setLicenseStream
public CameraDetector(android.content.Context context, CameraDetector.CameraType cameraType, android.view.SurfaceView cameraPreviewView)

Creates a CameraDetector. This constructor should be called in the onCreate() method of its host Activity or Fragment. Note: using the specified SurfaceView is much more efficient than using the returned frames in the Detector.ImageListener.onImageResults(List, Frame, float) callback to display camera images.

Parameters:
context - the application's context.
cameraType - an enumerated value indicating which camera (front or back) to use.
cameraPreviewView - a SurfaceView to use as a camera preview. Note that the camera will stretch its images to fit the entire SurfaceView, so it is the responsibility of the developer to size the SurfaceView to have the same aspect ratio as the returned camera images. The OnCameraEventListener interface reports the selected camera frame size. See the Affectiva SDK Developer Guide for an example of how to correct the aspect ratio of the SurfaceView.

As of SDK 2.0, it is no longer possible to submit a null value for the SurfaceView; the Android API requires a Surface for its camera to function. See the Affectiva SDK Developer Guide for an example of how to occlude the SurfaceView if you do not want it to be shown on screen. Please do not register for the SurfaceHolder.Callback interface belonging to this SurfaceView, as that interface is managed by the SDK.

Throws:
NullPointerException - if context or cameraPreviewView is null.
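The constructor notes above make the developer responsible for sizing the SurfaceView to the same aspect ratio as the returned camera frames. As a minimal, SDK-independent sketch of that sizing arithmetic (PreviewSizer and fitPreview are hypothetical names, not part of the Affectiva API):

```java
// Sketch: compute the largest width/height that fits inside a parent view
// while preserving the camera frame's aspect ratio. Hypothetical helper,
// not part of the Affectiva SDK.
class PreviewSizer {
    /** Returns {width, height} for the preview, fitted inside the parent. */
    static int[] fitPreview(int parentW, int parentH, int frameW, int frameH) {
        double frameAspect = (double) frameW / frameH;
        double parentAspect = (double) parentW / parentH;
        if (parentAspect > frameAspect) {
            // Parent is wider than the frame: height limits the preview.
            return new int[] { (int) Math.round(parentH * frameAspect), parentH };
        }
        // Parent is taller (or equal): width limits the preview.
        return new int[] { parentW, (int) Math.round(parentW / frameAspect) };
    }
}
```

The returned width and height would then be applied to the SurfaceView's layout parameters once the selected camera frame size has been reported.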
public void setMaxProcessRate(float maxFramesPerSecond)

Sets the maximum processing rate at which to operate, in frames per second (FPS). Default: 5 frames per second.

Parameters:
maxFramesPerSecond - the maximum rate, in frames per second, at which the SDK will process frames; must be greater than or equal to zero.

Throws:
IllegalArgumentException - if maxFramesPerSecond
is negative.

public void setSendUnprocessedFrames(boolean sendUnprocessedFrameFlag)

When the SDK is in control of the camera, if the SDK frame rate is lower than the camera frame rate, there will be frames that are not processed for expressions by the SDK.

Parameters:
sendUnprocessedFrameFlag - set this to true to receive unprocessed video frames.

public void start()
Initiates processing of frames received from the device's camera; processing continues until stop() is called.

Overrides:
start in class Detector

Throws:
LicenseException - if no license or an invalid license was provided; see Detector.setLicensePath(String).
AffdexException - if the detector did not initialize successfully.
IllegalStateException - if the Camera cannot be opened.

public void stop()
Stops processing frames received from the device's camera, and releases the camera to allow its use by other apps.

Overrides:
stop in class Detector

Throws:
IllegalStateException - if called before Detector.start().

See Also:
Detector.getPercentFaceDetected()
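Taken together, setMaxProcessRate and setSendUnprocessedFrames (described above) determine which camera frames are analyzed: when the camera delivers frames faster than the processing cap, the surplus frames skip analysis, and are delivered to the app only if the unprocessed-frames flag is set. A rough, SDK-independent sketch of what such a cap means (FrameThrottle is a hypothetical class, not the SDK's internal implementation, and its behavior at a cap of exactly 0 FPS is an assumption):

```java
// Sketch of a frame-rate throttle: decides, per frame timestamp, whether a
// frame would be analyzed under a max-FPS cap. Hypothetical illustration;
// the choice that a 0 FPS cap processes nothing is an assumption.
class FrameThrottle {
    private final long minIntervalMillis;
    private long lastProcessedMillis;
    private boolean hasProcessed = false;

    FrameThrottle(float maxFramesPerSecond) {
        if (maxFramesPerSecond < 0) {
            throw new IllegalArgumentException("maxFramesPerSecond must be >= 0");
        }
        this.minIntervalMillis = maxFramesPerSecond == 0
                ? Long.MAX_VALUE
                : (long) (1000f / maxFramesPerSecond);
    }

    /** Returns true if a frame arriving at this timestamp gets analyzed. */
    boolean shouldProcess(long timestampMillis) {
        if (minIntervalMillis == Long.MAX_VALUE) {
            return false; // 0 FPS cap: every frame stays unprocessed
        }
        if (!hasProcessed || timestampMillis - lastProcessedMillis >= minIntervalMillis) {
            hasProcessed = true;
            lastProcessedMillis = timestampMillis;
            return true;
        }
        return false; // skipped: delivered only if unprocessed frames are requested
    }
}
```

With the default cap of 5 FPS, at most one frame per 200 ms window is analyzed; a 30 FPS camera would then yield roughly 25 unprocessed frames per second.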
public void setCameraType(CameraDetector.CameraType type)
public void reset()
Overrides:
reset in class Detector

See Also:
Detector.start()

public void setOnCameraEventListener(CameraDetector.CameraEventListener listener)
public void onFrameAvailable(byte[] frame, int width, int height, Frame.ROTATE rotation)

Specified by:
onFrameAvailable in interface CameraHelper.OnCameraHelperEventListener

public void onFrameSizeSelected(int width, int height, Frame.ROTATE rotation)

Specified by:
onFrameSizeSelected in interface CameraHelper.OnCameraHelperEventListener
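Both callbacks above report a Frame.ROTATE value alongside the frame's width and height; a 90 or 270 degree rotation swaps the effective width and height once the frame is displayed upright. A small SDK-independent sketch (Rotation is a stand-in enum for illustration, since Frame.ROTATE's constants are not shown on this page):

```java
// Sketch: compute the upright dimensions of a frame after applying its
// reported rotation. Rotation is a stand-in enum; the real type is the
// SDK's Frame.ROTATE.
class FrameGeometry {
    enum Rotation { NONE, CW_90, CW_180, CW_270 }

    /** Returns {width, height} as they appear once the rotation is applied. */
    static int[] uprightSize(int width, int height, Rotation rotation) {
        switch (rotation) {
            case CW_90:
            case CW_270:
                return new int[] { height, width }; // quarter turns swap the axes
            default:
                return new int[] { width, height };
        }
    }
}
```

A portrait-mode device commonly receives landscape camera frames with a quarter-turn rotation, so the upright size is the one to use when sizing the preview.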
Copyright © 2015. All rights reserved.