
Showing posts with label Sensors. Show all posts

Sensors on Google Glass

Google recently shared the Linux kernel source for the firmware running on the Google Glass. Having grown tired of watching online reviews with the monotonous "Glass this, glass that" voice commands, I was curious about the support for other forms of input (read: gestures). A quick peek into the innards of the kernel source revealed quite a lot...
 

FACT1. Google Glass runs on Texas Instruments OMAP4430.

Nothing revolutionary. A major win for TI (FWIW), considering that it has nonchalantly quit the mobile-SoC market citing a low RoI. This was already known, as someone who actually had a Google Glass ran adb on it and found out.

FACT2. Google Glass has a built-in Accel, Gyro & Compass. 

Invensense MPU6050 = 3-axis gyro + 3-axis accelerometer.
Asahi Kasei AKM8975 = 3-axis geomagnetic sensor (compass).
Combining facts 1 and 2 we can see that the device spec for SoC and sensors perfectly matches the popular Samsung Galaxy S2 (variant I9100G).

Rather than having independent ICs for both, the Google Glass uses MPU9150. Invensense MPU9150 is a single SiP which contains MPU6050 and AK8975 ICs within. This is fully hardware-compatible with existing MPU6050 board designs with the additional benefit of... (as Invensense quotes on its website) "...providing a simple upgrade path and making it easy to fit on space constrained boards." Perfect for Google Glass.
Refer: arch/arm/mach-omap2/board-notle.c line:1710
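For a feel of what the raw MPU-9150 samples look like, here is a quick sketch (in Python, not from the kernel source) that converts the 16-bit two's-complement register values into physical units, using the datasheet sensitivities for the default full-scale ranges (±2 g, ±250 °/s, and the AK8975's ~0.3 µT/LSB):

```python
# Illustrative sketch, not from the Glass kernel source: converting raw
# 16-bit MPU-9150 samples to physical units, using the datasheet
# sensitivities for the default full-scale ranges.

ACCEL_LSB_PER_G = 16384.0   # MPU-6050 core, +/-2 g range
GYRO_LSB_PER_DPS = 131.0    # +/-250 deg/s range
MAG_UT_PER_LSB = 0.3        # AK8975 compass, approx. 0.3 uT per count

def to_signed16(raw):
    """Interpret a 16-bit register value as two's complement."""
    return raw - 0x10000 if raw & 0x8000 else raw

def accel_g(raw):
    return to_signed16(raw) / ACCEL_LSB_PER_G

def gyro_dps(raw):
    return to_signed16(raw) / GYRO_LSB_PER_DPS

def mag_ut(raw):
    return to_signed16(raw) * MAG_UT_PER_LSB

# A device at rest should read ~1 g on the vertical axis:
print(accel_g(16384))   # 1.0
print(accel_g(0xC000))  # -1.0 (0xC000 is -16384 in two's complement)
```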

FACT3. Google Glass has a "glasshub"

I stumbled upon this by accident as I was searching for the sensor drivers. The glasshub appears to be an external micro-controller that communicates with the OMAP4430 over I2C. This is the hardware that supports the "wink" command. Strangely enough, it supports up to 20 winks! Looks like someone didn't learn their lesson from triple and quadruple mouse-click designs. On the other hand, this will be most essential when someone attempts to write a Google Glass app to detect seizures. Forward thinking as always, Google.

The glasshub also reports IR data and proximity (not sure about the underlying hardware though).
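The wink counting itself happens inside the glasshub firmware, but the grouping logic is presumably the same as for multi-clicks: consecutive winks within a timeout window form one gesture. A hypothetical sketch (all timings invented):

```python
# Hypothetical sketch of multi-wink grouping, analogous to multi-click
# handling: consecutive winks within a timeout window are grouped into
# one gesture. The real counting happens inside the glasshub firmware;
# the timeout here is invented.

WINK_TIMEOUT_S = 0.5   # max gap between winks in one gesture (assumed)
MAX_WINKS = 20         # the driver supports gestures of up to 20 winks

def group_winks(timestamps):
    """Group a sorted list of wink timestamps into gesture counts."""
    gestures = []
    count = 0
    last = None
    for t in timestamps:
        if last is not None and t - last > WINK_TIMEOUT_S:
            gestures.append(min(count, MAX_WINKS))
            count = 0
        count += 1
        last = t
    if count:
        gestures.append(min(count, MAX_WINKS))
    return gestures

# Two quick winks, a pause, then a single wink:
print(group_winks([0.0, 0.3, 2.0]))  # [2, 1]
```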

 

FACT4. Google Glass has a "Proximity" sensor.

Not to be confused with the "glasshub", there is another independent module, the LiteON LTR-506ALS. A very good sign: this IC is extremely customisable when it comes to thresholds, IR pulse frequency, pulse count and poll rate. Maybe, just maybe, we could hack the whole setup into a rudimentary IR remote. While used primarily for ambient light sensing, it also supports proximity sensing. This means that we can have the Google Glass detect our finger/hand swipes in front of our face. Quite possibly the most exciting tech of the lot, as it will provide the illusion of being able to actually handle the projected images.
Refer: arch/arm/mach-omap2/board-notle.c line:1727
       drivers/input/misc/ltr506als.c
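To illustrate what such swipe detection could look like in software, here is a toy sketch. The threshold and sample counts are made-up values; the real LTR-506ALS does its thresholding in hardware:

```python
# Illustrative sketch: detecting a hand-swipe as a short "pulse" in the
# proximity readings. Threshold and pulse length are made-up values;
# the LTR-506ALS lets you tune thresholds, IR pulse frequency, pulse
# count and poll rate in hardware.

NEAR_THRESHOLD = 100      # raw proximity counts (assumed)
MAX_SWIPE_SAMPLES = 5     # longer pulses are a "hold", not a swipe

def detect_swipes(samples):
    """Return the number of swipe gestures in a list of proximity samples."""
    swipes = 0
    run = 0  # consecutive samples above threshold
    for s in samples:
        if s >= NEAR_THRESHOLD:
            run += 1
        else:
            if 0 < run <= MAX_SWIPE_SAMPLES:
                swipes += 1
            run = 0
    return swipes

# Two quick passes of the hand, then a long hover (ignored):
print(detect_swipes([5, 120, 130, 8, 6, 150, 9,
                     200, 210, 220, 230, 240, 250, 4]))  # 2
```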


Overall, quite a good amount of "sensory" tech inside for me to play with. Me so excited. ;-) ;-) wink wink
Hey Google, Can I haz a Google Glass?

nGPS : Location fix without GPS

NOTE: Skip this post if you do NOT live on planet earth.

This is one of the ideas that I hit upon while preparing for my talk, Sensors on Android @ DroidCon2011. It is an unusual application of the on-board sensors present on most Android devices. Due to lack of time, I was unable to present it in much detail during the talk. So here goes...

nGPS (NO GPS) is a way of obtaining a location fix without using any GPS, AGPS, Wi-Fi Positioning and cell-site triangulation technologies.

Why would anyone want to use nGPS?
- Pure GPS-based systems take up to 10 minutes for the first fix.
- AGPS and Wi-Fi positioning require an active data connection.
- Cell-site triangulation requires network coverage.

So, without any of these technologies at our disposal, how do we obtain a "location fix", i.e. a latitude-longitude pair representing our current position? The answer lies in the magnetic-field sensor.

The Earth's magnetic field, as measured by a magnetic sensor on the Earth's surface, is a combination of several magnetic fields generated by various sources. These fields interact with each other, and the net resultant is what the magnetic sensor measures.

World Magnetic Model (WMM)

Major contributors to the magnetic field:
+ Conducting, fluid outer core.
+ Earth's crust and upper mantle.
+ Electrical currents in the atmosphere.
+ Local magnetic interference.

By filtering out the local magnetic interference from nearby electronic/electrical devices, we are left with a unique magnetic-field signature at each place on Earth. The WMM aims to provide an accurate estimate of this field. A device with a magnetic sensor can measure the components of this field; by comparing them with the WMM values for the Earth's field, one can identify the latitude/longitude of the present location.
Android contains built-in support for the WMM via the GeomagneticField class, which uses the WMM internally to provide an estimated magnetic field at any given point on Earth at a given time. The important thing to note is that this class accepts a location (along with altitude and time) and returns the expected magnetic field at that position, at that particular altitude and instant of time.
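GeomagneticField itself only runs on Android, but a pure dipole model gives the flavour of this forward lookup. As a rough, well-known approximation, the total field magnitude grows from about 30 µT at the (geomagnetic) equator to about 60 µT at the poles:

```python
import math

# Rough stand-in for the forward lookup: a pure dipole model of the
# Earth's field. B(lat) = B0 * sqrt(1 + 3*sin(lat)^2), with B0 ~ 30 uT
# at the geomagnetic equator. The real WMM is far more detailed.

B0_UT = 30.0

def dipole_field_ut(lat_deg):
    """Approximate total field magnitude (uT) at a geomagnetic latitude."""
    lam = math.radians(lat_deg)
    return B0_UT * math.sqrt(1.0 + 3.0 * math.sin(lam) ** 2)

print(round(dipole_field_ut(0), 1))   # 30.0 at the equator
print(round(dipole_field_ut(90), 1))  # 60.0 at the pole
```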

Determining the location using the GeomagneticField class requires some reverse-lookup trickery on our part. More on that in another post.
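As a teaser, the basic idea of the reverse lookup can be sketched as a search over the model: measure the field, then find the location whose predicted field best matches. A real implementation would query GeomagneticField over a latitude/longitude grid and match all three field components; this toy version uses a dipole model and a 1-D latitude search:

```python
import math

# Toy sketch of the "reverse lookup": search the model for the latitude
# whose predicted field magnitude best matches the measurement. A real
# implementation would query GeomagneticField over a lat/lon grid and
# match the full 3-component field vector.

def model_field_ut(lat_deg):
    """Dipole approximation of total field magnitude (uT)."""
    lam = math.radians(lat_deg)
    return 30.0 * math.sqrt(1.0 + 3.0 * math.sin(lam) ** 2)

def reverse_lookup_lat(measured_ut, step_deg=0.5):
    """Brute-force search for the latitude matching a measured field."""
    best_lat, best_err = None, float("inf")
    lat = -90.0
    while lat <= 90.0:
        err = abs(model_field_ut(lat) - measured_ut)
        if err < best_err:
            best_lat, best_err = lat, err
        lat += step_deg
    return best_lat

# Prints +/-45.0: field magnitude alone cannot tell the hemispheres
# apart, which is why a real lookup matches the full field vector.
print(reverse_lookup_lat(model_field_ut(45.0)))
```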

UPDATE: A recent talk on Sensors and Location-based services on Android at blr-droid meetup#12 features nGPS among other things.

Sensors on Android @ DroidCon2011

Here is the talk I presented @ DroidCon2011
 

Download Sensors on Android @ DroidCon2011

Gathered lots of inspiration talking (and listening) to several bright minds @DroidCon2011. Will be posting a "few" of them here. So subscribe to TheCodeArtist or take a quick peek here.

Update:  Here is the complete video of the talk Sensors on Android at DroidCon-2011.


 

Proximity Sensor on Android Gingerbread

The proximity sensor is common on most smartphones, i.e. the ones that have a touchscreen. This is because the primary function of a proximity sensor is to disable accidental touch events, the most common scenario being the ear coming in contact with the screen and generating touch events while on a call.
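The policy can be sketched as a tiny state machine: while the sensor reports "near", touch events are dropped. This is purely a hypothetical illustration of the behaviour, not the actual Android implementation:

```python
# Hypothetical sketch of the policy the platform implements: while the
# proximity sensor reports "near" during a call, touch events are
# dropped (and the screen blanked) so the ear cannot trigger them.
# Distance values are illustrative; many Android proximity sensors
# only report a binary near/far.

FAR_CM = 5.0  # anything closer than this counts as "near" (assumed)

class TouchFilter:
    def __init__(self):
        self.near = False

    def on_proximity(self, distance_cm):
        self.near = distance_cm < FAR_CM

    def on_touch(self, x, y):
        """Return the touch event, or None if it should be suppressed."""
        return None if self.near else (x, y)

f = TouchFilter()
f.on_proximity(0.0)         # ear against the screen
print(f.on_touch(10, 20))   # None - suppressed
f.on_proximity(5.0)         # phone away from the face
print(f.on_touch(10, 20))   # (10, 20)
```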

Sensors on Android Gingerbread

Android 2.3 (codename Gingerbread) was officially released amidst huge hype and fanfare last week and BOY O BOY!! people sure are queuing up to have a peek. Sensors were among the most hyped-about subsystems.