The UI/UX of the SREngine SDK for iPhone 3GS is now almost finalized.
SREngine recognizes scenes with camera input only.
As you can see in the movie above, SREngine searches in real time: 1. without a server, 2. without markers/barcodes, 3. without location/orientation sensors.
Here is some additional information.
Several UI elements help the user estimate whether an object in the live view frame could be a target.
I haven't named them yet, but let me call them, from the left: 1. SRIndicator, 2. SRCandidate, 3. SRConsole.
The indicator provides the user with brief status information.
A higher blue bar indicates that the object in the live view frame may be a candidate.
A higher red bar indicates that the object may not be a candidate, or fails the search conditions: for example, it is too monotonous, solid-colored, or too complex. You also get a high red bar when the camera shakes.
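To make the indicator behavior above concrete, here is a small hypothetical sketch of how blue/red bar levels could be derived from frame statistics. The SREngine SDK is closed, so the function name, the feature counts, and the thresholds are all my own assumptions, not the real implementation.

```python
# Hypothetical sketch of the indicator logic: names and thresholds are
# invented for illustration and are not from the SREngine SDK.

def indicator_levels(keypoint_count, blur_score):
    """Map simple frame statistics to blue/red bar levels (0.0-1.0).

    keypoint_count: number of features detected in the live view frame.
    blur_score: 0.0 (sharp) to 1.0 (heavy camera shake / motion blur).
    """
    # Too few features (monotonous / solid-colored surface) or too many
    # (overly complex scene) make a poor candidate: red bar goes to max.
    if keypoint_count < 20 or keypoint_count > 2000:
        red = 1.0
    else:
        red = blur_score  # camera shake also raises the red bar
    # The blue bar rises when the frame looks like a usable candidate.
    blue = 0.0 if red >= 1.0 else min(keypoint_count / 500.0, 1.0) * (1.0 - blur_score)
    return blue, red
```

For example, a solid-colored wall (few keypoints) maxes out the red bar, while a sharp, feature-rich frame raises the blue bar.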
You also get a red frame if SREngine detects an object that is invalid for searching (see left Fig.).
The right Fig. shows SREngine searching and picking up candidates onto the thumbnail image panel.
The user can tap a candidate to complete the SREngine search manually.
I'm going to improve this UI/UX, because the candidate may switch just before the user taps it.
A UI like Mac OS X's Exposé might work well.
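One possible fix for the "candidate switches just before the tap" problem is hysteresis: only replace a slot on the thumbnail panel when a new candidate beats the current one by a clear margin, so the panel stays stable under small score fluctuations. This is a sketch of that idea only; the data shapes, the `HYSTERESIS` value, and the function name are my assumptions, not the SDK's API.

```python
# Hypothetical sketch: stabilize the candidate thumbnail panel with a
# score hysteresis so entries do not flicker between frames.

HYSTERESIS = 0.15  # assumed margin; not a real SDK value

def update_panel(panel, new_candidates):
    """Both arguments map a slot index to a (name, score) tuple.

    A slot is only replaced when the newcomer's score exceeds the
    current occupant's score by HYSTERESIS, keeping the panel stable
    long enough for the user to tap a candidate.
    """
    for slot, (name, score) in new_candidates.items():
        current = panel.get(slot)
        if current is None or score > current[1] + HYSTERESIS:
            panel[slot] = (name, score)
    return panel
```

A marginally better match (say, 0.65 vs. 0.6) leaves the panel unchanged; only a clearly better one (0.8 vs. 0.6) swaps the thumbnail.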
The console messages give you concrete information about the object and the device status.
Furthermore, the reliability of a search result is shown as a score, both during and after the search (see left and above Fig.).
The left Fig. shows SREngine finishing its search and displaying the result, because the score is high enough.
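The score-based completion described above can be sketched as a simple threshold check: keep searching and reporting the live score until one frame's score is high enough to finish automatically. The threshold value and function shape are assumptions for illustration, not the real SDK behavior.

```python
# Hypothetical sketch of score-threshold search completion; the
# threshold is an assumed value, not from the SREngine SDK.

SCORE_THRESHOLD = 0.85  # assumed confidence needed to auto-complete

def search_step(scores):
    """Given per-frame match scores for the current best candidate,
    return (frame_index, score) once a score clears the threshold,
    or None to keep searching (the console keeps showing the score)."""
    for frame, score in enumerate(scores):
        if score >= SCORE_THRESHOLD:
            return frame, score
    return None
```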
Finally, a note about the 'Annotation Button'.
This button is for third-party vendors who provide AR applications built with the SREngine SDK.
The SREngine SDK provides only the AR phase.
The SREngine SDK includes an Xcode project template, so developers can get started very quickly and easily.