Building a Face Recognition Platform with Spring Boot + SeetaFace6
Preface
Several recent projects needed face recognition. The previous approach integrated the Baidu Cloud API, but some projects are now deployed and used on internal networks. Considering integration complexity, licensing fees, and other factors, I decided to build the service myself based on open-source components, to guarantee its stability and reliability.
Project repository: https://gitee.com/code2roc/fastface
Design
After researching and comparing several options, I settled on a SeetaFace6 + Spring Boot approach, which integrates seamlessly into existing applications.
SeetaFace6 is the latest open-source commercial release from SeetaTech. It provides the core face recognition capabilities (face detection, landmark localization, face recognition) and adds liveness detection, quality assessment, and age/gender estimation.
Official repository: https://github.com/SeetaFace6Open/index
The integration uses tracy100's SDK wrapper, which supports JDK 8 through JDK 14 on both Windows and Linux. There are no extra deployment steps: just use the jar and write business code. It also ships Spring-ready bean wrappers that work out of the box.
Official repository: https://github.com/tracy100/seetaface6SDK
The system's goal is the basic feature set: face registration, face comparison, and face search.
Implementation
Adding the jar dependency
<dependency>
    <groupId>com.seeta.sdk</groupId>
    <artifactId>seeta-sdk-platform</artifactId>
    <scope>system</scope>
    <version>1.2.1</version>
    <systemPath>${project.basedir}/lib/seetaface.jar</systemPath>
</dependency>
Registering the bean objects
FaceDetectorProxy is the face detection bean; it detects whether an image contains a face.
FaceRecognizerProxy is the face comparison bean; it computes the similarity of two faces.
FaceLandmarkerProxy is the face landmark bean; it locates facial landmarks, with both 5-point and 68-point models supported.
@Configuration
public class FaceConfig {

    @Value("${face.modelPath}")
    private String modelPath;

    @Bean
    public FaceDetectorProxy faceDetector() throws FileNotFoundException {
        SeetaConfSetting detectorPoolSetting = new SeetaConfSetting(
                new SeetaModelSetting(0, new String[]{modelPath + File.separator + "face_detector.csta"}, SeetaDevice.SEETA_DEVICE_CPU));
        return new FaceDetectorProxy(detectorPoolSetting);
    }

    @Bean
    public FaceRecognizerProxy faceRecognizer() throws FileNotFoundException {
        SeetaConfSetting recognizerPoolSetting = new SeetaConfSetting(
                new SeetaModelSetting(0, new String[]{modelPath + File.separator + "face_recognizer.csta"}, SeetaDevice.SEETA_DEVICE_CPU));
        return new FaceRecognizerProxy(recognizerPoolSetting);
    }

    @Bean
    public FaceLandmarkerProxy faceLandmarker() throws FileNotFoundException {
        SeetaConfSetting landmarkerPoolSetting = new SeetaConfSetting(
                new SeetaModelSetting(0, new String[]{modelPath + File.separator + "face_landmarker_pts5.csta"}, SeetaDevice.SEETA_DEVICE_CPU));
        return new FaceLandmarkerProxy(landmarkerPoolSetting);
    }
}
Before these beans can be used, the native library must be loaded locally, specifying CPU or GPU mode:
LoadNativeCore.LOAD_NATIVE(SeetaDevice.SEETA_DEVICE_CPU)
Face detection
public FaceEnum.CheckImageFaceStatus getFace(BufferedImage image) throws Exception {
    SeetaImageData imageData = SeetafaceUtil.toSeetaImageData(image);
    SeetaRect[] detects = faceDetectorProxy.detect(imageData);
    if (detects.length == 0) {
        return FaceEnum.CheckImageFaceStatus.NoFace;
    } else if (detects.length == 1) {
        return FaceEnum.CheckImageFaceStatus.OneFace;
    } else {
        return FaceEnum.CheckImageFaceStatus.MoreFace;
    }
}
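SeetafaceUtil.toSeetaImageData is supplied by the SDK, but conceptually it just repacks the image pixels into the tightly packed BGR byte layout that the native SeetaFace engine expects. A minimal sketch of that conversion (the BgrConverter class name is mine, not part of the SDK):

```java
import java.awt.image.BufferedImage;

public class BgrConverter {
    // Repack a BufferedImage into a row-major, 3-channel BGR byte array,
    // the pixel layout the native SeetaFace engine consumes.
    public static byte[] toBgr(BufferedImage image) {
        int w = image.getWidth(), h = image.getHeight();
        byte[] data = new byte[w * h * 3];
        int i = 0;
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                int rgb = image.getRGB(x, y);            // packed as 0xAARRGGBB
                data[i++] = (byte) (rgb & 0xFF);         // B
                data[i++] = (byte) ((rgb >> 8) & 0xFF);  // G
                data[i++] = (byte) ((rgb >> 16) & 0xFF); // R
            }
        }
        return data;
    }
}
```

The real helper additionally fills in the width, height, and channel fields of SeetaImageData; the byte order above is the important detail, since passing RGB data where BGR is expected silently degrades detection accuracy.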
Face comparison
public FaceEnum.CompareImageFaceStatus compareFace(BufferedImage source, BufferedImage compare) throws Exception {
    float[] sourceFeature = extract(source);
    float[] compareFeature = extract(compare);
    if (sourceFeature != null && compareFeature != null) {
        float calculateSimilarity = faceRecognizerProxy.calculateSimilarity(sourceFeature, compareFeature);
        System.out.printf("similarity: %f\n", calculateSimilarity);
        if (calculateSimilarity >= CHECK_SIM) {
            return FaceEnum.CompareImageFaceStatus.Same;
        } else {
            return FaceEnum.CompareImageFaceStatus.Different;
        }
    } else {
        return FaceEnum.CompareImageFaceStatus.LostFace;
    }
}
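The CHECK_SIM threshold is compared against the score that calculateSimilarity returns for the two feature vectors. A common way to score two embeddings (and an assumption about what the SDK does internally, not a statement of its implementation) is cosine similarity, where scores near 1.0 indicate the same person:

```java
public class FaceSimilarity {
    // Cosine similarity between two feature vectors in [-1, 1];
    // higher means more alike. Both vectors must have equal length.
    public static float cosine(float[] a, float[] b) {
        if (a.length != b.length) {
            throw new IllegalArgumentException("feature length mismatch");
        }
        double dot = 0, normA = 0, normB = 0;
        for (int i = 0; i < a.length; i++) {
            dot += a[i] * b[i];
            normA += a[i] * a[i];
            normB += b[i] * b[i];
        }
        if (normA == 0 || normB == 0) return 0f;
        return (float) (dot / (Math.sqrt(normA) * Math.sqrt(normB)));
    }
}
```

Because the score distribution depends on the recognizer model, CHECK_SIM should be tuned empirically against known same-person and different-person image pairs rather than hard-coded from intuition.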
Face landmarks and feature extraction
private float[] extract(BufferedImage image) throws Exception {
    SeetaImageData imageData = SeetafaceUtil.toSeetaImageData(image);
    SeetaRect[] detects = faceDetectorProxy.detect(imageData);
    if (detects.length > 0) {
        SeetaPointF[] pointFS = faceLandmarkerProxy.mark(imageData, detects[0]);
        return faceRecognizerProxy.extract(imageData, pointFS);
    }
    return null;
}
Face database
- Register
public long registFace(BufferedImage image) throws Exception {
    long result = -1;
    SeetaImageData imageData = SeetafaceUtil.toSeetaImageData(image);
    SeetaRect[] detects = faceDetectorProxy.detect(imageData);
    if (detects.length > 0) {
        SeetaPointF[] pointFS = faceLandmarkerProxy.mark(imageData, detects[0]);
        result = faceDatabase.Register(imageData, pointFS);
        faceDatabase.Save(dataBasePath);
    }
    return result;
}
- Query
public long queryFace(BufferedImage image) throws Exception {
    long result = -1;
    SeetaImageData imageData = SeetafaceUtil.toSeetaImageData(image);
    SeetaRect[] detects = faceDetectorProxy.detect(imageData);
    if (detects.length > 0) {
        SeetaPointF[] pointFS = faceLandmarkerProxy.mark(imageData, detects[0]);
        long[] index = new long[1];
        float[] sim = new float[1];
        result = faceDatabase.QueryTop(imageData, pointFS, 1, index, sim);
        if (result > 0) {
            float similarity = sim[0];
            if (similarity >= CHECK_SIM) {
                result = index[0];
            } else {
                result = -1;
            }
        }
    }
    return result;
}
- Delete
public long deleteFace(long index) throws Exception {
    long result = faceDatabase.Delete(index);
    faceDatabase.Save(dataBasePath);
    return result;
}
Extension
face-api.js is also integrated to implement simple open-mouth and head-shake liveness checks. The accuracy is not very high, so treat it as a reference option only.
Official repository: https://github.com/justadudewhohacks/face-api.js
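An open-mouth check of this kind usually compares the vertical lip gap against the mouth width from the detected landmarks. A server-side sketch of that idea in Java (the class, the four-point interface, and the 0.5 threshold are all illustrative assumptions, not face-api.js APIs):

```java
public class MouthOpenCheck {
    // A landmark point in image coordinates.
    public static class Point {
        final double x, y;
        public Point(double x, double y) { this.x = x; this.y = y; }
    }

    // Mouth aspect ratio: vertical lip gap divided by mouth width.
    // With a 68-point model these would come from the inner-lip
    // top/bottom points and the two mouth corners.
    public static double mouthAspectRatio(Point top, Point bottom, Point left, Point right) {
        double height = Math.hypot(top.x - bottom.x, top.y - bottom.y);
        double width = Math.hypot(left.x - right.x, left.y - right.y);
        return width == 0 ? 0 : height / width;
    }

    // The 0.5 cutoff is illustrative; tune it against real captures.
    public static boolean isMouthOpen(Point top, Point bottom, Point left, Point right) {
        return mouthAspectRatio(top, bottom, left, right) > 0.5;
    }
}
```

Because the ratio is scale-invariant, the same threshold works at different distances from the camera; a head-shake check would instead track the horizontal movement of the nose landmark across frames.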
Loading the models
Promise.all([
    faceapi.loadFaceDetectionModel('models'),
    faceapi.loadFaceLandmarkModel('models')
]).then(startAnalysis);

function startAnalysis() {
    console.log('Models loaded successfully!');
    var canvas1 = faceapi.createCanvasFromMedia(document.getElementById('showImg'));
    faceapi.detectSingleFace(canvas1).then((detection) => {
        if (detection) {
            faceapi.detectFaceLandmarks(canvas1).then((landmarks) => {
                console.log('Model warm-up call succeeded!');
            });
        }
    });
}
Opening the camera
<video id="video" muted playsinline></video>
function AnalysisFaceOnline() {
    var videoElement = document.getElementById('video');
    // Check whether the browser supports the getUserMedia API
    if (navigator.mediaDevices && navigator.mediaDevices.getUserMedia) {
        navigator.mediaDevices.getUserMedia({ video: { facingMode: "user" } }) // request the video stream
            .then(function(stream) {
                videoElement.srcObject = stream; // attach the stream to the <video> element
                videoElement.play();
            })
            .catch(function(err) {
                console.error("Failed to access the camera:", err);
            });
    } else {
        console.error("Your browser does not support the getUserMedia API");
    }
}
Capturing frames and computing landmarks
function vedioCatchInit() {
    video.addEventListener('play', function() {
        function captureFrame() {
            if (!video.paused && !video.ended) {
                // Size the canvas for the video frame
                canvas.width = 200;
                canvas.height = 300;
                // Draw the current video frame onto the canvas
                context.drawImage(video, 0, 0, canvas.width, canvas.height);
                // The canvas content could also be converted to a data URL here
                //outputImage.src = canvas.toDataURL('image/png');
                // and sent to the server for further processing
                faceapi.detectSingleFace(canvas).then((detection) => {
                    if (detection) {
                        faceapi.detectFaceLandmarks(canvas).then((landmarks) => {
                        });
                    } else {
                        console.log("no face");
                    }
                });
                // Schedule the next capture to keep grabbing frames
                setTimeout(captureFrame, 100); // capture a frame every 100 ms
            }
        }
        captureFrame(); // start capturing frames
    });
}
