Here at kooaba we are always keen on testing the boundaries of the current state of the art in image recognition. In cooperation with ETH Zurich, we thus create prototypes of novel applications of our image recognition services.
One field of such applications is Augmented Reality. In a nutshell, augmented reality overlays digital information on the real world, e.g. as seen “through” the screen of a mobile phone. Recently, several applications that rely on GPS and compass information to achieve this effect have appeared on the market. However, these solutions have several shortcomings:
* it is difficult to select relevant data for display: elements that are not actually visible from the user’s position are often still labeled on the screen
* only stationary objects, i.e. objects bound to a fixed location, can be shown, since the system relies on GPS in the first place
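To see where these limitations come from, here is a minimal sketch of how a GPS/compass-only AR view typically places a label: the bearing from the user to a point of interest is compared with the compass heading, and the difference becomes a horizontal screen offset. All names and parameters below are illustrative assumptions, not code from an actual AR app. Note that nothing in this computation knows whether the object is actually visible (it may be occluded), and the point of interest must have fixed coordinates.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial bearing (degrees clockwise from north) from point 1 to point 2."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def label_x(user_lat, user_lon, poi_lat, poi_lon, heading_deg,
            screen_width=320, fov_deg=60.0):
    """Horizontal pixel position for the label, or None if the point of
    interest lies outside the camera's (assumed) field of view."""
    # Signed angle between the compass heading and the bearing to the POI,
    # wrapped into [-180, 180).
    delta = (bearing_deg(user_lat, user_lon, poi_lat, poi_lon)
             - heading_deg + 180.0) % 360.0 - 180.0
    if abs(delta) > fov_deg / 2:
        return None
    # Map the angular offset linearly onto the screen width.
    return screen_width / 2 + delta / (fov_deg / 2) * (screen_width / 2)
```

For example, a point of interest due east of the user appears centered on screen only when the phone is also pointing east; occlusion by a building in between is invisible to this model.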
The combination with image recognition allows for overcoming these challenges. We therefore asked Aleksander Slater, a Master’s student at ETH, to implement an iPhone Augmented Reality system. It connects to our kooaba recognition system for media covers, as well as to the landmark recognition engine recently developed at ETH. The result is shown in the video below:
Note how both (stationary) landmarks and (non-stationary) media covers can be recognized and displayed on the screen. A click on the label leads to a page with related information. The recognized objects are stored in a history. (We currently have over 7 million media cover items in the kooaba database and hundreds of thousands of landmarks in our landmark recognition prototype at ETH.)
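Conceptually, the recognition-based loop replaces coordinates with image content: each camera frame is matched against the indexed items, and the result carries the object’s identity, a link, and its position within the frame. The sketch below illustrates this flow; `RecognitionResult` and the `recognizer` callable are hypothetical names standing in for the actual kooaba backend, not its real API.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class RecognitionResult:
    item_id: str    # identifier of the matched media cover or landmark
    title: str      # label text drawn on screen
    info_url: str   # page opened when the label is tapped
    bbox: tuple     # (x, y, w, h) of the object within the frame

class AROverlay:
    """Keeps a history of recognized objects and produces on-screen labels."""

    def __init__(self, recognizer: Callable[[bytes], Optional[RecognitionResult]]):
        # `recognizer` maps a frame to a result (or None on no match);
        # in the real system this would query the recognition service.
        self.recognizer = recognizer
        self.history: list[RecognitionResult] = []

    def process_frame(self, frame: bytes) -> Optional[RecognitionResult]:
        result = self.recognizer(frame)
        if result is not None:
            self.history.append(result)  # remembered for later browsing
        return result
```

Because the label is anchored to what is actually seen in the frame, occluded objects are never labeled, and movable items such as media covers work just as well as fixed landmarks.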
The next video shows the same concept (objects are replaced by pictures, since we didn’t have the Golden Gate Bridge in front of our house) and even combines it with OCR and translation capabilities.
This is all running on an iPhone already today, pretty much in real time (the video was recorded on the iPhone Emulator, just to get better quality). Let us know if you would like to have it on your device😉