Top 2 Apps Similar to Eyeplus - Your home in your eyes

Third Eye - Intruder Detection 1.4
Mirage Stacks
Find out who tries to access your mobile. Catch your friends & family red-handed.
Eye Type 1.0
IEEEmadC
The Eye Type application allows users to input text to a mobile device using only the movements of their eyes. The application was developed for the IEEEmadC 2015 (Mobile Application Development Contest) http://ieeemadc.org/ and won the Computer Society Special Award!

Developer: Evangelos, Greece, IEEE Region 8
Link to demonstration video: https://youtu.be/6KOoxkY7KBc

APPLICATION DESCRIPTION

This mobile application aspires to be a step towards ubiquitous gaze tracking using handheld devices. The ultimate goal is to assist people with ALS, locked-in syndrome, tetraplegia, or any other people who can only move their eyes in using their handheld device to communicate, browse the Internet, or facilitate other everyday tasks.

The Eye Type application consists of a visual keyboard whose keys are "clicked" by estimating the user's gaze (line of sight). Instead of using the traditional QWERTY keyboard layout, which would make it very hard to accurately determine which key the user is looking at, the application's main layout comprises several large keys. Text is composed using predictive text input, which depends on the combination of the keys clicked (each containing a specific subset of letters). The visual keyboard layout consists of four keys at the corners of the screen containing the letters of the alphabet equally distributed (hence making them "ambiguous" keys) and three control keys (to accept a word, correct mistypings, and scroll through the predicted words).

DESCRIPTION OF USE

The Eye Type application works optimally on large-screen devices such as tablets. The user is asked to keep his head still during the eye-typing session; to facilitate this, an object such as a large book may be used as a chin rest. The illumination must be adequate so that the eyes of the user are correctly detected. The user presses the 'calibrate' button to calibrate the system and can then start typing using their gaze.
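The four ambiguous keys plus a dictionary-based engine work much like T9-style disambiguation: each clicked key narrows the set of words whose letters fall in the clicked groups. Below is a minimal sketch of that idea in Python; the letter grouping and the word list are illustrative assumptions, not the app's actual data.

```python
# Four "ambiguous" keys, each holding roughly a quarter of the alphabet.
# This particular grouping is an assumption for illustration.
KEYS = ["abcdefg", "hijklm", "nopqrst", "uvwxyz"]
KEY_OF = {ch: i for i, group in enumerate(KEYS) for ch in group}

def key_sequence(word):
    """Map a word to the sequence of ambiguous keys that would type it."""
    return tuple(KEY_OF[ch] for ch in word.lower())

def predict(sequence, dictionary):
    """Return every dictionary word whose key sequence matches the clicks."""
    return [w for w in dictionary if key_sequence(w) == tuple(sequence)]

# Toy dictionary standing in for the app's real word list.
dictionary = ["eye", "gaze", "type", "tablet"]
print(predict(key_sequence("eye"), dictionary))  # → ['eye']
```

When several words share a key sequence, the engine would rank them (e.g. by frequency) and offer the rest as the "list of suggestions" mentioned in the description.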
During the calibration phase, green dots appear successively at certain points on the screen, followed by red shrinking dots. The green dots remain visible for ~1 second and aim to prepare the user to direct his gaze towards that point. When the red shrinking dots appear, the positions of the eyes of the user are captured. Thus, in case the user needs to blink, he should do so when the green dots appear.

Once the calibration session is completed, the typing view containing the visual keyboard automatically appears, and a magenta-colored circle indicates the position of the gaze estimations. The user directs his gaze onto the keys containing the letters he wants to input. A button on the screen is considered clicked upon fixation on it for a specific time interval (~1 second) and momentarily turns green. A disambiguation/predictive text engine predicts the desired word and allows the user to either accept it or select an alternative suggestion. As the user continues to type, the predictive text engine attempts to determine which word the user means to input and also offers alternative predictions (a list of suggestions).
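The fixation-based "click" described above can be sketched as a dwell timer: a key counts as pressed once the gaze estimate has stayed on it for the dwell interval. The class below is an assumption-laden sketch (the ~1 second threshold comes from the description; the interface, key names, and repeat-click behavior are hypothetical).

```python
DWELL_SECONDS = 1.0  # ~1 second fixation, per the app description

class DwellClicker:
    """Track which key the gaze is on and fire a click after a dwell."""

    def __init__(self, dwell=DWELL_SECONDS):
        self.dwell = dwell
        self.current_key = None   # key currently under the gaze
        self.enter_time = None    # when the gaze entered that key

    def update(self, key, timestamp):
        """Feed the key under the gaze at `timestamp` (seconds).

        Returns the key when a click fires, otherwise None.
        """
        if key != self.current_key:
            # Gaze moved to a different key (or off the keyboard): restart.
            self.current_key = key
            self.enter_time = timestamp
            return None
        if key is not None and timestamp - self.enter_time >= self.dwell:
            # Click fires; reset the timer so holding the gaze repeats
            # the click only after another full dwell interval.
            self.enter_time = timestamp
            return key
        return None
```

In use, gaze samples (e.g. at camera frame rate) would be mapped to the key under the estimated gaze point and fed to `update`; the magenta circle and the key turning green would be drawn from the same state.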