We can display all of the points, only the even points, or only the odd points.
If we enable the "FaceUpdated" block, we will only see the points corresponding to the lips. I used these points for the lips:
416,311,312,313,14,83,82,81,192,79,97,91,180,87,16,317,404,325,309
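To illustrate the filtering described above outside of App Inventor's blocks, here is a minimal Python sketch. The landmark list, coordinates, and function name are illustrative assumptions; the real app does this with blocks, and only the lip index list comes from the post.

```python
# Hypothetical sketch of the point-filtering logic described above.
# `landmarks` stands in for the extension's full list of mesh points;
# the coordinates here are dummy values for illustration only.

# Lip point indices, taken from the post above.
LIP_POINTS = {416, 311, 312, 313, 14, 83, 82, 81, 192, 79, 97, 91,
              180, 87, 16, 317, 404, 325, 309}

def select_points(landmarks, mode="all"):
    """Return (index, point) pairs for mode 'all', 'even', 'odd', or 'lips'."""
    if mode == "even":
        return [(i, p) for i, p in enumerate(landmarks) if i % 2 == 0]
    if mode == "odd":
        return [(i, p) for i, p in enumerate(landmarks) if i % 2 == 1]
    if mode == "lips":
        return [(i, p) for i, p in enumerate(landmarks) if i in LIP_POINTS]
    return list(enumerate(landmarks))

# Example with dummy (x, y) coordinates for a 468-point mesh:
mesh = [(float(i), float(2 * i)) for i in range(468)]
lips = select_points(mesh, "lips")
```

Each display mode is just a different filter over the same indexed list, which is why switching modes in the app only changes which points get drawn.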
This application draws two blue balls on the eyes of the face (LeftEyeTop and RightEyeTop).
It also draws a yellow ball on the nose (point 2).
In the VideoUpdated event we check whether the position of the nose (point 2, the yellow ball) has changed. If it has not changed, the phone will vibrate.
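The VideoUpdated check can be sketched in Python like this. The threshold value and the frame-to-frame comparison are my assumptions about how "has not changed" might be decided; in App Inventor the actual vibration would come from a Sound component block.

```python
# Hypothetical sketch of the VideoUpdated logic: compare the nose
# position (point 2) between frames and signal a vibration when it
# has not moved. The threshold is an illustrative assumption to
# absorb tiny detection jitter.

def make_nose_monitor(threshold=1.0):
    previous = {"pos": None}  # nose position from the last frame

    def on_video_updated(nose_xy):
        """Return True (i.e. vibrate) when the nose has not moved."""
        still = False
        if previous["pos"] is not None:
            dx = nose_xy[0] - previous["pos"][0]
            dy = nose_xy[1] - previous["pos"][1]
            # Treat sub-threshold motion as "not changed".
            still = (dx * dx + dy * dy) ** 0.5 < threshold
        previous["pos"] = nose_xy
        return still

    return on_video_updated

check = make_nose_monitor()
check((100.0, 200.0))      # first frame: nothing to compare yet
check((100.2, 200.1))      # barely moved -> would vibrate
check((130.0, 200.1))      # clearly moved -> no vibration
```

Keeping the previous position in a variable and updating it on every event is the same pattern the blocks version would use with a global variable.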
Hi @ewpatton,
I think there is a problem here: with the Front Camera it works perfectly, but with the Back Camera, when the model moves to one side, the mesh moves in the opposite direction.
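One common workaround for this kind of mirroring, sketched below in Python, is to flip each point's x coordinate across the canvas width when the back camera is in use. The canvas width and the `use_back_camera` flag are illustrative assumptions, not part of the extension's API; whether this is the right fix here is for the extension author to confirm.

```python
# Hypothetical mirroring workaround for the back-camera behaviour
# described above: reflect x horizontally across the canvas width.

def correct_point(x, y, canvas_width, use_back_camera):
    """Mirror the x coordinate when the back camera is in use."""
    if use_back_camera:
        return (canvas_width - x, y)
    return (x, y)

correct_point(100, 50, 320, True)    # back camera: x is mirrored
correct_point(100, 50, 320, False)   # front camera: unchanged
```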
Great, but could we track the hand and fingers, so that moving the fingers moves balls, or connect with Arduino to move a robotic arm or something like that?
I am developing a Face Emotion Detector application for my final-year project. Currently I am using the "Personal Image Classifier", but it is not effective. Can anyone guide me toward a more effective approach?
Can I detect face emotion using the Face recognition extension? Any other ideas are welcome, please suggest.