“Sensors” is Android’s overall term for ways that Android can detect elements of the physical world around it, from magnetic flux to the movement of the device. Not all devices will have all possible sensors, and other sensors are likely to be added over time. In this chapter, we will explore the general concept of Android sensors and how to receive data from them.
Note, however, that this chapter will not get into details of detecting movement via the accelerometer, etc.
Understanding this chapter requires that you have read the core chapters, particularly the chapter on threads. Having experience with other system-service-and-listener patterns, such as fetching locations with LocationManager, is helpful but not strictly required.
When fetching locations from LocationManager, you do not have dedicated APIs per location-finding technology (e.g., GPS vs. WiFi hotspot proximity vs. cell-tower triangulation vs. …). Instead, you work with a LocationManager system service, asking for locations using a single API, where location technologies are identified by name (e.g., GPS_PROVIDER).
Similarly, when working with sensors, you do not have dedicated APIs to get sensor readings from each sensor. Instead, you work with a SensorManager system service, asking for sensor events using a single API, where sensors are identified by name (e.g., TYPE_LINEAR_ACCELERATION).
Note, though, that there are some dedicated methods on SensorManager to help you interpret some of the sensors, particularly the accelerometer. However, those are merely helper methods; getting at the actual accelerometer data uses the same APIs that you would use to, say, access the barometer for atmospheric pressure.
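For example, SensorManager offers helpers such as getRotationMatrix() and getOrientation() for turning raw accelerometer and magnetometer readings into a device orientation. A minimal sketch, where the gravity and geomagnetic arrays are assumed to hold the most recent readings from those two sensors:

// Sketch only: "gravity" and "geomagnetic" are assumed to be float[]
// copies of the latest TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD values
float[] rotation=new float[9];
float[] orientation=new float[3];

if (SensorManager.getRotationMatrix(rotation, null, gravity, geomagnetic)) {
  SensorManager.getOrientation(rotation, orientation);
  // orientation[0] is the azimuth in radians, relative to magnetic north
}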
Usually, when working with sensors, you want to find out about changes in the sensor reading over a period of time. For example, in a driving game, where the user holds their device like a steering wheel and uses it to “turn” their virtual car, you need to know information about acceleration and positioning so long as game play is going on.
Hence, when you request a feed of sensor readings from SensorManager, you will specify a desired rate at which you should receive those readings. You do that by specifying an amount of delay in between readings; Android will drop sensor readings that arrive before the delay period has elapsed.
There are four standard delay periods, defined as constants on the SensorManager class:

- SENSOR_DELAY_NORMAL, which is what most apps would use for broad changes, such as detecting a screen rotating from portrait to landscape
- SENSOR_DELAY_UI, for non-game cases where you want to update the UI continuously based upon sensor readings
- SENSOR_DELAY_GAME, which is faster (less delay) than SENSOR_DELAY_UI, to try to drive a higher frame rate
- SENSOR_DELAY_FASTEST, which is the “firehose” of sensor readings, without delay

The more sensor readings you get, the faster your code has to be for using those readings, lest you take too long and starve your thread of time to do anything else. This is particularly important given that you receive these sensor events on the main application thread, and therefore the time you spend processing these events is time unavailable for screen updates. Hence, choose the slowest rate that you can that will give you acceptable granularity of output.
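For example, a driving game might ask for the faster game-oriented rate when registering its listener (registration itself is covered later in this chapter; listener and sensor here are placeholders):

// Sketch only: "listener" is a SensorEventListener and "sensor" is the
// Sensor being monitored; SENSOR_DELAY_GAME trades battery life for a
// higher rate of readings
mgr.registerListener(listener, sensor, SensorManager.SENSOR_DELAY_GAME);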
Sensors are event-driven. You cannot ask Android for the value of a sensor at a point in time. Rather, you register a listener for a sensor, then process the sensor events as they come in. You can unregister the listener when you are done, either because you have the reading that you need, or the user has done something (like move to another activity) that indicates that you no longer need the sensor events.
To demonstrate this, we will examine the Sensor/Monitor sample application, which will list all of the available sensors, plus show the incoming readings from a selected sensor.
The gateway to the sensor roster on the device is the SensorManager system service. You obtain one of these by calling getSystemService() on any Context, asking for the SENSOR_SERVICE, and casting the result to be a SensorManager, as seen in the onCreate() method of our MainActivity:
mgr=(SensorManager)getSystemService(Context.SENSOR_SERVICE);
There are sensor types, and then there are sensors. You might think that there would be a one-to-one mapping between these. In truth, the way the SensorManager API is set up, there might be more than one sensor for a given type. Regardless, somewhere along the line, you will need to identify the Sensor that you want to work with.
The most common pattern, if you know the type of sensor that you want, is to call getDefaultSensor() on SensorManager, supplying the type of the sensor (e.g., TYPE_ACCELEROMETER, TYPE_GYROSCOPE), where the type names are constants defined on the Sensor class. If there is more than one possible Sensor for that type, Android will give you the “default” one, which is usually a reasonable choice.
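For instance, if all you need is the default accelerometer, a minimal sketch using the mgr SensorManager from above might be:

// Sketch: getDefaultSensor() returns null if the device has no sensor of
// the requested type, so check before using the result
Sensor accel=mgr.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);

if (accel != null) {
  // safe to request readings from this sensor
}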
Another approach, and the one used by this sample application, is to call getSensorList() on SensorManager, which returns a List of all Sensor objects available on this device. The sample’s MainActivity has a getSensorList() that returns this list, after a bit of manipulation:
@Override
public List<Sensor> getSensorList() {
  List<Sensor> unfiltered=
      new ArrayList<Sensor>(mgr.getSensorList(Sensor.TYPE_ALL));
  List<Sensor> result=new ArrayList<Sensor>();

  for (Sensor s : unfiltered) {
    if (Build.VERSION.SDK_INT < Build.VERSION_CODES.KITKAT
        || !isTriggerSensor(s)) {
      result.add(s);
    }
  }

  Collections.sort(result, new Comparator<Sensor>() {
    @Override
    public int compare(final Sensor a, final Sensor b) {
      return(a.toString().compareTo(b.toString()));
    }
  });

  return(result);
}
Android 4.4 introduced some “trigger sensors”: sensors that are designed to deliver a single reading, then automatically become unregistered. This sample app is designed to display results from more traditional sensors that provide ongoing readings. So, getSensorList() calls an isTriggerSensor() method on API Level 19+ devices and throws out sensors that are trigger sensors. The isTriggerSensor() method simply checks the sensor type against a list of trigger sensors:
@TargetApi(Build.VERSION_CODES.KITKAT)
private boolean isTriggerSensor(Sensor s) {
  int[] triggers=
      { Sensor.TYPE_SIGNIFICANT_MOTION, Sensor.TYPE_STEP_DETECTOR,
        Sensor.TYPE_STEP_COUNTER };

  return(Arrays.binarySearch(triggers, s.getType()) >= 0);
}
The reason for isolating isTriggerSensor() into a separate method, rather than having the array of sensor types as a static final array, is that these sensor types are not available in all Android versions. Having the array of sensor types as a static final data member would require putting the @TargetApi annotation on the entire class, which is unwise if the class will be used on older devices. This way, we can isolate the newer-API code into a dedicated method, with a more locally-scoped @TargetApi annotation.
To get sensor events, you need a SensorEventListener. This is an interface, calling for two method implementations:

- onAccuracyChanged(), where you are informed about a significant change in the accuracy of the readings that you are going to get from the sensor
- onSensorChanged(), where you are passed a SensorEvent representing one of those readings

To receive events for a given Sensor, you call registerListener() on the SensorManager, supplying the Sensor, the SensorEventListener, and one of the SENSOR_DELAY_* values to control the rate of events. Later on, you need to call unregisterListener(), supplying the same SensorEventListener, to break the connection.

Failing to unregister the listener is bad. The sensor subsystem is oblivious to things like activity lifecycles, and so if you leak a listener, not only will you perhaps leak the component that registered the listener, but you will continue to get sensor events until the process is terminated. As active sensors do consume power, users will not appreciate the battery drain your leaked listener will incur.
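For example, an activity that only needs readings while it is in the foreground might pair those calls with lifecycle methods. This is a sketch, not part of the sample app; mgr, listener, and sensor are assumed to be fields set up elsewhere:

@Override
protected void onResume() {
  super.onResume();

  // start receiving readings only while we are in the foreground
  mgr.registerListener(listener, sensor, SensorManager.SENSOR_DELAY_UI);
}

@Override
protected void onPause() {
  // stop the readings (and the battery drain) when we leave the foreground
  mgr.unregisterListener(listener);

  super.onPause();
}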
The List of Sensor objects from that getSensorList() method shown previously will be used to populate a ListView. When the user taps on a Sensor in the list, an onSensorSelected() method is called on the MainActivity. Here, we unregister our listener (a SensorLogFragment that we will discuss more in a bit), in case we were registered for a prior Sensor choice, before registering for the newly-selected Sensor:
@Override
public void onSensorSelected(Sensor s) {
  mgr.unregisterListener(log);
  mgr.registerListener(log, s, SensorManager.SENSOR_DELAY_NORMAL);
  log.init(isXYZ(s));
  panes.closePane();
}
We will discuss the remainder of the onSensorSelected() method a bit later in this chapter.

Since SensorLogFragment implements SensorEventListener — so we can use it with registerListener() — we need to implement onAccuracyChanged() and onSensorChanged():
@Override
public void onAccuracyChanged(Sensor sensor, int accuracy) {
  // unused
}

@Override
public void onSensorChanged(SensorEvent e) {
  Float[] values=new Float[3];

  values[0]=e.values[0];
  values[1]=e.values[1];
  values[2]=e.values[2];
  adapter.add(values);
}
Once again, we will get into the implementation of onSensorChanged() a bit later in this chapter.

The big thing to note now about onSensorChanged(), though, is that the SensorEvent object comes from an object pool and gets recycled. It is not safe for you to hold onto this SensorEvent object past the call to onSensorChanged(). Hence, you need to do something with the data in the SensorEvent, then let go of the SensorEvent itself, so that instance can be used again later. This is to help prevent excessive garbage collection, particularly for low-delay requests for sensor readings (e.g., SENSOR_DELAY_FASTEST).
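A simple way to honor that rule is to copy out whatever you need before returning, as in this sketch (the sample app does the equivalent by copying into a Float array, as you will see below):

@Override
public void onSensorChanged(SensorEvent event) {
  // copy the reading so nothing references the recycled SensorEvent
  float[] snapshot=event.values.clone();

  // ...use or store snapshot, not event or event.values...
}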
The key piece of data in the SensorEvent object is values. This is a float array containing the actual sensor reading; its length and the meaning of its elements vary by sensor. For example:
- For an accelerometer (TYPE_ACCELEROMETER), the first three elements of the array represent the reported acceleration, in m/s², along the X, Y, and Z axes respectively (X = out the right side of the device, Y = out the top edge of the device, Z = out the screen towards the user’s eyes)
- TYPE_PRESSURE uses the first element of the values array to report the barometric pressure in millibars
- TYPE_LIGHT uses the first element of the values array to report the light level in lux
- And so on.
The SensorEvent documentation contains instructions on how to interpret these events on a per-sensor-type basis.
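For instance, a listener that is registered for more than one sensor could branch on the sensor type before interpreting values (a sketch, not part of the sample app):

@Override
public void onSensorChanged(SensorEvent event) {
  switch (event.sensor.getType()) {
    case Sensor.TYPE_ACCELEROMETER: {
      float x=event.values[0]; // m/s² along the X axis
      float y=event.values[1]; // m/s² along the Y axis
      float z=event.values[2]; // m/s² along the Z axis
      // ...
      break;
    }
    case Sensor.TYPE_PRESSURE: {
      float millibars=event.values[0]; // barometric pressure
      // ...
      break;
    }
    case Sensor.TYPE_LIGHT: {
      float lux=event.values[0]; // ambient light level
      // ...
      break;
    }
  }
}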
That being said, sensors can be roughly divided into two groups:

- those that report values along three axes, such as TYPE_ACCELEROMETER, TYPE_GRAVITY, TYPE_GYROSCOPE, TYPE_LINEAR_ACCELERATION, and TYPE_MAGNETIC_FIELD
- those that report a single value, such as TYPE_PRESSURE and TYPE_LIGHT
The isXYZ() method on MainActivity simply returns a boolean indicating whether this particular Sensor is one that uses all three axes (true) or not (false). As the roster of sensors has changed over the years, it also does some checks based on API level:
@TargetApi(Build.VERSION_CODES.KITKAT)
private boolean isXYZ(Sensor s) {
  switch (s.getType()) {
    case Sensor.TYPE_ACCELEROMETER:
    case Sensor.TYPE_GRAVITY:
    case Sensor.TYPE_GYROSCOPE:
    case Sensor.TYPE_LINEAR_ACCELERATION:
    case Sensor.TYPE_MAGNETIC_FIELD:
    case Sensor.TYPE_ROTATION_VECTOR:
      return(true);
  }

  if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN_MR2) {
    if (s.getType() == Sensor.TYPE_GAME_ROTATION_VECTOR
        || s.getType() == Sensor.TYPE_GYROSCOPE_UNCALIBRATED
        || s.getType() == Sensor.TYPE_MAGNETIC_FIELD_UNCALIBRATED) {
      return(true);
    }
  }

  if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT) {
    if (s.getType() == Sensor.TYPE_GEOMAGNETIC_ROTATION_VECTOR) {
      return(true);
    }
  }

  return(false);
}
Overall, this sample app uses the SlidingPaneLayout first seen back in the chapter on large-screen support. We have two fragments, in a master-detail pattern, where the “master” will be a list of all available sensors, and the “detail” will be a log of sensor readings from a selected sensor.

Our layout (res/layout/activity_main.xml) wires in a SensorsFragment (master) and a SensorLogFragment (detail) in a SlidingPaneLayout:
<android.support.v4.widget.SlidingPaneLayout xmlns:android="http://schemas.android.com/apk/res/android"
  android:id="@+id/panes"
  android:layout_width="match_parent"
  android:layout_height="match_parent">

  <fragment
    android:id="@+id/sensors"
    android:name="com.commonsware.android.sensor.monitor.SensorsFragment"
    android:layout_width="300sp"
    android:layout_height="match_parent"/>

  <fragment
    android:id="@+id/log"
    android:name="com.commonsware.android.sensor.monitor.SensorLogFragment"
    android:layout_width="400dp"
    android:layout_height="match_parent"
    android:layout_weight="1"/>

</android.support.v4.widget.SlidingPaneLayout>
The SensorsFragment is reminiscent of CountriesFragment from the SlidingPaneLayout variant of the EU4You sample. The biggest differences are that we use a SensorListAdapter to represent the list of sensors, that we use getSensorList() on our SensorsFragment.Contract interface to retrieve the model data, and that we call onSensorSelected() on the contract to report selections:
package com.commonsware.android.sensor.monitor;

import android.hardware.Sensor;
import android.os.Bundle;
import android.view.View;
import android.widget.ListView;
import java.util.List;

public class SensorsFragment extends
    ContractListFragment<SensorsFragment.Contract> {
  static private final String STATE_CHECKED=
      "com.commonsware.android.sensor.monitor.STATE_CHECKED";
  private SensorListAdapter adapter=null;

  @Override
  public void onActivityCreated(Bundle state) {
    super.onActivityCreated(state);

    adapter=new SensorListAdapter(this);
    getListView().setChoiceMode(ListView.CHOICE_MODE_SINGLE);
    setListAdapter(adapter);

    if (state != null) {
      int position=state.getInt(STATE_CHECKED, -1);

      if (position > -1) {
        getListView().setItemChecked(position, true);
        getContract().onSensorSelected(adapter.getItem(position));
      }
    }
  }

  @Override
  public void onListItemClick(ListView l, View v, int position, long id) {
    l.setItemChecked(position, true);
    getContract().onSensorSelected(adapter.getItem(position));
  }

  @Override
  public void onSaveInstanceState(Bundle state) {
    super.onSaveInstanceState(state);

    state.putInt(STATE_CHECKED, getListView().getCheckedItemPosition());
  }

  interface Contract {
    void onSensorSelected(Sensor s);

    List<Sensor> getSensorList();
  }
}
SensorListAdapter illustrates another approach for handling the difference in “activated” row support. The EU4You samples used an activated style to apply the “activated” support on Android 3.0 and higher. Here, our custom ArrayAdapter subclass dynamically chooses between android.R.layout.simple_list_item_activated_1 (an activated-capable built-in row layout) and the classic android.R.layout.simple_list_item_1, based upon API level:
package com.commonsware.android.sensor.monitor;

import android.hardware.Sensor;
import android.os.Build;
import android.view.View;
import android.view.ViewGroup;
import android.widget.ArrayAdapter;
import android.widget.TextView;

class SensorListAdapter extends ArrayAdapter<Sensor> {
  SensorListAdapter(SensorsFragment sensorsFragment) {
    super(sensorsFragment.getActivity(), getRowResourceId(),
          sensorsFragment.getContract().getSensorList());
  }

  @Override
  public View getView(int position, View convertView, ViewGroup parent) {
    View result=super.getView(position, convertView, parent);

    ((TextView)result).setText(getItem(position).getName());

    return(result);
  }

  private static int getRowResourceId() {
    // simple_list_item_activated_1 was added in API Level 11
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.HONEYCOMB) {
      return(android.R.layout.simple_list_item_activated_1);
    }

    return(android.R.layout.simple_list_item_1);
  }
}
We also have to override getView(), as our model is Sensor, whose toString() is not what we want, so we have to manually populate the list row with getName() instead.
SensorLogFragment is another ListFragment. In particular, though, we set it up for TRANSCRIPT_MODE_NORMAL, which means that Android will automatically scroll the ListView to the bottom if we add new rows to the list and the user has not scrolled up in the list to view past data:
@Override
public void onActivityCreated(Bundle state) {
  super.onActivityCreated(state);

  getListView().setTranscriptMode(ListView.TRANSCRIPT_MODE_NORMAL);
}
However, we do not initialize our ListAdapter in onActivityCreated(), as we might normally do. Instead, we have a dedicated init() method, to be called by MainActivity, where we set up the SensorLogAdapter and keep track of whether the Sensor that we are logging is designed to report three-dimensional values (isXYZ is true) or not:
void init(boolean isXYZ) {
  this.isXYZ=isXYZ;
  adapter=new SensorLogAdapter(this);
  setListAdapter(adapter);
}
The init() method, in turn, is called by onSensorSelected() of MainActivity. Hence, whenever the user taps on a sensor, we set up a fresh log. MainActivity can call init() because it retrieved our SensorLogFragment in onCreate(), stashing it in a log data member:
@Override
protected void onCreate(Bundle savedInstanceState) {
  super.onCreate(savedInstanceState);
  setContentView(R.layout.activity_main);

  mgr=(SensorManager)getSystemService(Context.SENSOR_SERVICE);

  log=
      (SensorLogFragment)getFragmentManager().findFragmentById(R.id.log);

  panes=(SlidingPaneLayout)findViewById(R.id.panes);
  panes.openPane();
}
Our onSensorChanged() method in SensorLogFragment copies the values from the SensorEvent into a separate Float array that becomes a row in our list’s model data:
@Override
public void onSensorChanged(SensorEvent e) {
  Float[] values=new Float[3];

  values[0]=e.values[0];
  values[1]=e.values[1];
  values[2]=e.values[2];
  adapter.add(values);
}
SensorLogAdapter uses the isXYZ value to determine how it should format the rows: for three-axis sensors, it shows all three values plus their combined magnitude; otherwise, it shows just the first Float from the array:
class SensorLogAdapter extends ArrayAdapter<Float[]> {
  public SensorLogAdapter(SensorLogFragment sensorLogFragment) {
    super(sensorLogFragment.getActivity(),
          android.R.layout.simple_list_item_1,
          new ArrayList<Float[]>());
  }

  @SuppressLint("DefaultLocale")
  @Override
  public View getView(int position, View convertView, ViewGroup parent) {
    TextView row=
        (TextView)super.getView(position, convertView, parent);
    String content=null;
    Float[] values=getItem(position);

    if (isXYZ) {
      content=
          String.format("%7.3f / %7.3f / %7.3f / %7.3f",
                        values[0],
                        values[1],
                        values[2],
                        Math.sqrt(values[0] * values[0] + values[1]
                            * values[1] + values[2] * values[2]));
    }
    else {
      content=String.format("%7.3f", values[0]);
    }

    row.setText(content);

    return(row);
  }
}
The rest of MainActivity simply manages the SlidingPaneLayout, much like the EU4YouSlidingPane sample did.

When the user taps on a sensor in the list, we get a log of readings:
Figure 882: SensorMonitor, On a Nexus 10, Showing Gravity Readings While Being Wiggled by the Author
API Level 19 (Android 4.4) added a new feature to the sensor subsystem: batched sensor events. Now, registerListener() can take a maximum batch latency in microseconds, and Android may elect to deliver events to you delayed by up to that amount of time. The objective is to reduce the power draw of the sensors, for sensor hardware that supports this sort of batching behavior. Not all hardware does, in which case your requested batch latency will be ignored.
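A sketch of the batched form, assuming the same mgr, listener, and sensor as in the earlier sketches (the fourth parameter is the maximum report latency, in microseconds):

// Sketch: on API Level 19+, the four-parameter registerListener() lets
// Android batch events and deliver them up to one second late (1,000,000
// microseconds here), if the sensor hardware supports batching
mgr.registerListener(listener, sensor,
                     SensorManager.SENSOR_DELAY_NORMAL, 1000000);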