I played a game called Watch Dogs, which featured a similar system that also scanned faces and helped stop crimes. But one of the main ideas of that game is that such a system is not perfect, because anyone could use the information for evil.
That's something the team is aware of, I think; they will not allow such abuse of the technology's advantages. The main purpose of the project is to give this technology a new face in the market and make users feel more secure and comfortable with it.
Developers, could you comment on how the data will be protected? Or will everyone be able to access information about any person?
Faceter is going to improve privacy, not break it.
Today people are recorded on video hundreds of times a day, and their faces are saved in various libraries. This is an inevitable consequence of digitalisation, and no one can change it.
Faceter will come to market in accordance with local regulations on personal data protection.
This is a vast research area, and Faceter is not going to ignore it.
Faceter doesn't store any information about the person behind the face, the way Facebook does.
Faceter creates a numeric hash of your face, a neural map, and nothing else.
Privacy is a key feature of this technology.
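Faceter has not published its model or storage format, so the following is only a minimal illustrative sketch of that idea in Python: a face image is reduced to a numeric vector (the "neural map") plus a hash of that vector, and the photo itself is never kept. The embedding function here is a stand-in stub, not the real network, and all names are made up for the example.

```python
import hashlib
import numpy as np

def face_embedding(face_pixels: np.ndarray) -> np.ndarray:
    """Stand-in for a neural network that maps a face image to a
    fixed-length feature vector (the 'neural map'). A real system
    would run a trained face-recognition model here; this stub uses
    a fixed random projection so the example is runnable."""
    rng = np.random.default_rng(seed=0)
    projection = rng.standard_normal((128, face_pixels.size))
    return projection @ face_pixels.ravel()

def face_record(face_pixels: np.ndarray) -> dict:
    """What would be stored: the numeric vector plus a hash used as an ID.
    The original image is never kept, so the record cannot be turned
    back into a photo or a name."""
    embedding = face_embedding(face_pixels)
    digest = hashlib.sha256(embedding.tobytes()).hexdigest()
    return {"id": digest, "embedding": embedding.round(4).tolist()}

# Example with a dummy 32x32 grayscale "face"
record = face_record(np.ones((32, 32)))
print(record["id"][:16], len(record["embedding"]))
```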
You will have two methods to store your data.
First, storage in the secure private Faceter cloud, where each customer's data is stored encrypted under their own account, with a private key per account. Second, storage on the customer's side, for users who want full control over their database.
Both solutions make each customer's database proprietary; it cannot be used by third parties without the customer's consent.
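Faceter has not published its storage protocol, so this is just a hedged sketch of the per-account-key idea using a generic symmetric cipher from Python's cryptography library; the key handling and record format are assumptions for illustration only.

```python
from cryptography.fernet import Fernet

# Illustrative only: one symmetric key per customer account.
# If the key stays with the customer, neither the cloud operator nor
# any third party can read the stored records without consent.
account_key = Fernet.generate_key()      # the customer's private key
vault = Fernet(account_key)

record = b'{"id": "a1b2c3", "embedding": [0.12, -0.87, 1.05]}'
ciphertext = vault.encrypt(record)       # what actually sits in storage

# Only the key holder can turn the stored blob back into the record.
assert vault.decrypt(ciphertext) == record
```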
So your family's faces will be securely stored in your own database.