Interpreting Sensor Data in Android

Wednesday May 9th 2018 by Chunyen Liu

Determine what sensors your mobile device has, and then learn to leverage them.

You have probably been enjoying the conveniences provided by the sensors on your mobile device without even noticing their existence. Examples are everywhere: the screen automatically switches between portrait and landscape modes, brightness adjusts to the current lighting conditions, maps show your position, and so on.

Not all sensors are available on all mobile devices, however. Therefore, we first need to figure out which sensors exist on a device and what services they provide. Then we can start collecting raw data values from the supported sensors and interpreting them for our software needs. Fortunately, the Android platform provides APIs for many commonly used sensors, grouped into three main categories: motion, environmental, and position. According to the official documentation, the motion category includes accelerometers, gravity sensors, gyroscopes, and rotational vector sensors; the environmental category includes barometers, photometers, and thermometers; and the position category includes orientation sensors and magnetometers.

Exactly What Sensors Are Available on Your Mobile Device?

There are so many different types of sensors and manufacturers out there. I am sure you are as curious as I am to find out what your mobile device has. For this purpose, we can query Android's SensorManager with the catch-all selection type Sensor.TYPE_ALL, as in Listing 1. You can see from Figure 1 that the result includes sensor vendor, manufacturer, resolution, delay, and so on. My mobile phone is a Nexus 6P, and it has 24 hardware and software sensors installed. Yours may show a very different list, depending on what type of mobile device you have.

public class ShowSupportedSensors extends Activity {
   private SensorManager mSensorManager;
   private TextView mTextView;

   @Override
   protected void onCreate(Bundle savedInstanceState) {
      super.onCreate(savedInstanceState);

      setContentView(R.layout.sensors);
      mTextView = (TextView)findViewById(R.id.tvsensors);

      mSensorManager = (SensorManager)getSystemService
         (SENSOR_SERVICE);

      List<Sensor> sensors = mSensorManager.getSensorList
         (Sensor.TYPE_ALL);
      // Start at index 0 so the first sensor is not skipped.
      StringBuilder s = new StringBuilder();
      for (int i = 0; i < sensors.size(); i++) {
         s.append("[" + (i + 1) + "] " + sensors.get(i) + "\n");
      }

      mTextView.setText(s.toString());
   }
}

Listing 1: Inquiring All Available Sensors

List of Sensors and Properties
Figure 1: List of Sensors and Properties
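Listing 1 dumps every sensor via toString(). If you only care about one sensor's capabilities, getDefaultSensor() combined with the individual Sensor getters is more direct. A minimal sketch, assuming it runs inside an Activity as in Listing 1 (the log tag is illustrative):

```java
// Query a single sensor's properties instead of dumping the whole list.
SensorManager manager =
   (SensorManager) getSystemService(SENSOR_SERVICE);
Sensor accel = manager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
if (accel != null) {
   Log.d("SensorDemo", "Name: " + accel.getName()
      + ", Vendor: " + accel.getVendor()
      + ", Resolution: " + accel.getResolution()
      + ", Max range: " + accel.getMaximumRange()
      + ", Power (mA): " + accel.getPower());
}
```

getDefaultSensor() returns null when the device has no sensor of the requested type, which is also how the later listings test for availability.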

Sensor Framework and Programming Template

Android's sensor support lives in the android.hardware package, which provides the classes and interfaces that serve two main purposes. First, they can be used to detect which sensors exist and what capabilities they support. Second, for the sensors that do exist, you can monitor their events and retrieve data from them. The key programming objects we will pay the most attention to regarding sensors are Sensor, SensorManager, SensorEvent, and SensorEventListener.

As of this writing, these are the sensor types supported by the Android API: TYPE_ACCELEROMETER, TYPE_AMBIENT_TEMPERATURE, TYPE_GRAVITY, TYPE_GYROSCOPE, TYPE_LIGHT, TYPE_LINEAR_ACCELERATION, TYPE_MAGNETIC_FIELD, TYPE_ORIENTATION, TYPE_PRESSURE, TYPE_PROXIMITY, TYPE_RELATIVE_HUMIDITY, TYPE_ROTATION_VECTOR, and TYPE_TEMPERATURE. (Note that TYPE_ORIENTATION and TYPE_TEMPERATURE are deprecated in favor of newer alternatives.) In Listing 2, we outline the generic sensor programming template using one of the listed types; developing software with most other sensor types is very similar except for the returned values. We select the sensor type TYPE_PROXIMITY in the template and check its existence at runtime. Then, we implement a SensorEventListener with the activity, making sure it is registered and unregistered at the appropriate times, in onResume() and onPause(). onSensorChanged() is where the raw sensor data is retrieved. In this specific case, we also print out the range value reported by your proximity sensor. For demonstration purposes, the software responds with a background color change, as in Figure 2 and Figure 3.

public class ProximitySensorActivity extends Activity implements
         SensorEventListener {
   private SensorManager mSensorManager;
   private Sensor mProximitySensor;
   private LinearLayout mLinearLayout;
   private TextView mTextView;

   @Override
   public final void onCreate(Bundle savedInstanceState) {
      super.onCreate(savedInstanceState);

      setContentView(R.layout.proximitysensor);
      mLinearLayout = (LinearLayout)findViewById(R.id.layout);
      mTextView = (TextView)findViewById(R.id.tvsensors);

      mSensorManager = (SensorManager) getSystemService
         (Context.SENSOR_SERVICE);

      mProximitySensor = mSensorManager.getDefaultSensor
         (Sensor.TYPE_PROXIMITY);
      if (mProximitySensor == null) {
         Toast.makeText(getApplicationContext(),
            "Proximity sensor is not available.",
            Toast.LENGTH_LONG).show();
         finish();
      }
   }

   @Override
   public final void onAccuracyChanged(Sensor sensor,
         int accuracy) {
      // Do something here if sensor accuracy changes.
   }

   @Override
   public final void onSensorChanged(SensorEvent event) {
      float maxRange = mProximitySensor.getMaximumRange();
      if (event.values[0] < maxRange) {
         mLinearLayout.setBackgroundColor(Color.YELLOW);
         mTextView.setText("Proximity Sensor:\n\nNear (distance < "
            + maxRange + " cm)");
      } else {
         mLinearLayout.setBackgroundColor(Color.GREEN);
         mTextView.setText("Proximity Sensor:\n\nFar (distance >= "
            + maxRange + " cm)");
      }
   }

   @Override
   protected void onResume() {
      super.onResume();
      mSensorManager.registerListener(this, mProximitySensor,
         SensorManager.SENSOR_DELAY_NORMAL);
   }

   @Override
   protected void onPause() {
      super.onPause();
      mSensorManager.unregisterListener(this);
   }
}

Listing 2: Proximity Sensor Example

Proximity Sensor Out of Range
Figure 2: Proximity Sensor Out of Range

Proximity Sensor Within Range
Figure 3: Proximity Sensor Within Range
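Listing 2 registers its listener with SENSOR_DELAY_NORMAL. The third argument to registerListener() controls how often onSensorChanged() fires and is worth choosing deliberately, because faster rates cost more power. A sketch of the alternatives, using the same fields as Listing 2:

```java
// Built-in rate constants trade latency for power, from fastest to
// slowest: SENSOR_DELAY_FASTEST, SENSOR_DELAY_GAME, SENSOR_DELAY_UI,
// SENSOR_DELAY_NORMAL.
mSensorManager.registerListener(this, mProximitySensor,
   SensorManager.SENSOR_DELAY_UI);

// On newer platform versions, a desired period in microseconds is also
// accepted; here, roughly five updates per second. Either way the rate
// is only a hint -- events may arrive faster or slower.
mSensorManager.registerListener(this, mProximitySensor, 200000);
```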

Another Example with Orientation Sensors

With the sensor template from the previous section, we can pick another type of sensor to illustrate how similar the implementation can be. As in Listing 3, we are going to use the raw data from Sensor.TYPE_ORIENTATION, whose first value represents the angle around the z-axis in degrees; in other words, the azimuth. We then draw two lines on the view in OrientationSensorView to indicate the angle, as in Listing 4. The result is in Figure 4. Please note that if you want a more accurate value for a compass app, you should derive it from Sensor.TYPE_ACCELEROMETER and Sensor.TYPE_MAGNETIC_FIELD, as recommended by Android's official documentation.

public class OrientationSensorActivity extends Activity implements
      SensorEventListener {
   private SensorManager mSensorManager;
   private Sensor mOrientationSensor;
   private OrientationSensorView mOrientationSensorView;

   @Override
   public final void onCreate(Bundle savedInstanceState) {
      super.onCreate(savedInstanceState);

      setContentView(R.layout.orientationsensor);
      mOrientationSensorView = (OrientationSensorView)findViewById
         (R.id.osv);

      mSensorManager = (SensorManager)getSystemService
         (Context.SENSOR_SERVICE);

      mOrientationSensor = mSensorManager.getDefaultSensor
         (Sensor.TYPE_ORIENTATION);
      if (mOrientationSensor == null) {
         Toast.makeText(getApplicationContext(),
            "Orientation sensor is not available.",
            Toast.LENGTH_LONG).show();
         finish();
      }
   }

   @Override
   public final void onAccuracyChanged(Sensor sensor,
         int accuracy) {
      // Do something here if sensor accuracy changes.
   }

   @Override
   public final void onSensorChanged(SensorEvent event) {
      // Azimuth (angle around the z-axis) in degrees.
      float azimuth = event.values[0];
      mOrientationSensorView.applyOrientationSensorData(azimuth);
   }

   @Override
   protected void onResume() {
      super.onResume();
      mSensorManager.registerListener(this, mOrientationSensor,
         SensorManager.SENSOR_DELAY_NORMAL);
   }

   @Override
   protected void onPause() {
      super.onPause();
      mSensorManager.unregisterListener(this);
   }
}

Listing 3: Orientation Sensor Example

public class OrientationSensorView extends View {
   private Paint mPaint;
   private int mAngle = 0;

   public OrientationSensorView(Context context) {
      super(context);
      init();
   }

   public OrientationSensorView(Context c, AttributeSet attrs) {
      super(c, attrs);
      init();
   }

   private void init() {
      mPaint = new Paint();
      mPaint.setColor(Color.BLUE);
      mPaint.setAntiAlias(true);
      mPaint.setStrokeWidth(3);
      mPaint.setStyle(Paint.Style.STROKE);
      mPaint.setTextSize(80);
   }

   @Override
   protected void onDraw(Canvas canvas) {
      super.onDraw(canvas);

      int cx = getMeasuredWidth() / 2;
      int cy = getMeasuredHeight() / 2;
      int r = getMeasuredWidth() / 3;
      // Negate the azimuth so the needle counter-rotates against the
      // device, like a compass needle, and convert degrees to radians.
      double ang = -mAngle / 180.0 * Math.PI;

      mPaint.setColor(Color.BLUE);
      canvas.drawCircle(cx, cy, r, mPaint);

      mPaint.setColor(Color.RED);
      // Fixed reference line pointing straight up (angle 0).
      canvas.drawLine(cx, cy, (float)(cx + r * Math.sin(0)),
         (float)(cy - r * Math.cos(0)), mPaint);
      // Needle indicating the current azimuth.
      canvas.drawLine(cx, cy, (float)(cx + r * Math.sin(ang)),
         (float)(cy - r * Math.cos(ang)), mPaint);
      canvas.drawText("Azimuth: " + mAngle, cx, cy, mPaint);
   }

   public void applyOrientationSensorData(float angle) {
      mAngle = (int)angle;
      invalidate();
   }
}

Listing 4: Orientation Sensor View
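The needle geometry in Listing 4 is easy to check in isolation. Below is a plain-Java restatement of the endpoint math from onDraw() (the class and method names are ours, not part of the listings). Negating the azimuth makes the needle swing opposite to the device's rotation, as a real compass needle would, and the y term is subtracted because screen coordinates grow downward.

```java
public class CompassNeedle {
   // Computes the needle endpoint for an azimuth in degrees, mirroring
   // OrientationSensorView.onDraw(): negate the azimuth, convert to
   // radians, then project onto a circle of radius r around (cx, cy).
   public static float[] needleEndpoint(float cx, float cy, float r,
         float azimuthDegrees) {
      double ang = -azimuthDegrees / 180.0 * Math.PI;
      return new float[] {
         (float) (cx + r * Math.sin(ang)),
         (float) (cy - r * Math.cos(ang))
      };
   }

   public static void main(String[] args) {
      // Azimuth 0 (facing north): the needle points straight up.
      float[] p = needleEndpoint(200f, 200f, 100f, 0f);
      System.out.println(p[0] + " " + p[1]);   // 200.0 100.0
      // Azimuth 90 (facing east): the needle swings to screen-left,
      // i.e. toward west, which is where north now lies.
      p = needleEndpoint(200f, 200f, 100f, 90f);
      System.out.println(Math.round(p[0]) + " " + Math.round(p[1]));
   }
}
```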

Orientation Sensor in Action
Figure 4: Orientation Sensor in Action
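As noted above, the official documentation recommends deriving the azimuth from the accelerometer and magnetometer rather than the deprecated orientation sensor. A minimal sketch of that approach, assuming an activity registered as a listener for both Sensor.TYPE_ACCELEROMETER and Sensor.TYPE_MAGNETIC_FIELD (the field names here are illustrative):

```java
// Keep copies of the latest readings from both sensors.
private final float[] mAccelValues = new float[3];
private final float[] mMagnetValues = new float[3];

@Override
public void onSensorChanged(SensorEvent event) {
   if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
      System.arraycopy(event.values, 0, mAccelValues, 0, 3);
   } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
      System.arraycopy(event.values, 0, mMagnetValues, 0, 3);
   }

   float[] rotation = new float[9];
   float[] orientation = new float[3];
   // Combine both readings into a rotation matrix; returns false when
   // the readings are unusable (e.g., the device is in free fall).
   if (SensorManager.getRotationMatrix(rotation, null,
         mAccelValues, mMagnetValues)) {
      SensorManager.getOrientation(rotation, orientation);
      // orientation[0] is the azimuth in radians; convert to degrees
      // before handing it to the view.
      float azimuth = (float) Math.toDegrees(orientation[0]);
      mOrientationSensorView.applyOrientationSensorData(azimuth);
   }
}
```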

Conclusion

Today's mobile devices have become a major part of people's daily lives, and most have built-in sensors that help users become aware of their surroundings in one way or another. In this tutorial, we covered the three main sensor categories supported by the Android platform: motion, environmental, and position. Because not all devices are equipped with all sensors, we started with an example of how to detect their availability. Then, we outlined the generic framework for developing sensor-specific software. Interpreting the raw data collected through these sensors also played an important part in this tutorial. Finally, a code walk-through of a more practical example illustrated what we learned. Source code for the tutorial examples is available for download in the references section.

References

About the Author

Author Chunyen Liu has been a software veteran in Taiwan and the United States. He is a published author of 50+ articles and 100+ tiny apps, a software patentee, a technical reviewer, and a winner of programming contests held by ACM, IBM, and Sun. He holds advanced degrees in Computer Science with 20+ graduate-level classes. On the non-technical side, he is enthusiastic about the Olympic sport of table tennis, being a USA-certified umpire, certified coach, certified referee, and categorized event winner at State Games and the US Open.