Google Awareness API: Introduction and Usage
Preface
When developing an app, we sometimes need to monitor a lot of system state, for example whether headphones are plugged in, or what the user is currently doing (running, walking, standing still, and so on). By detecting these states we can add a lot of thoughtful behavior; for example, when we detect that headphones have been plugged in, we can open the music player and get ready to play music. As an aside, if you are a long-time Android user you may know an app called Tasker: the user defines a set of conditions, and once those conditions are triggered, Tasker performs the actions the user has specified. In short: if this, then that, or IFTTT.
If you're interested in Tasker, it's worth reading up on it separately.
Today let's look at how Google helps us implement this kind of IFTTT behavior.
Introduction
What is the Google Awareness API for?
The official site summarizes it like this:
A unified sensing platform enabling applications to be aware of multiple aspects of a user's context, while managing battery and memory health.
Put more plainly, my understanding is this: through the APIs it provides, you can obtain information about the user's current context, such as geographic location, weather, and activity, and when you call these APIs the system has already taken battery and memory usage into account for you, so you don't have to manage that yourself.
What are its features?
- Many signals, one API
- High quality data
- Smart battery savings
What information does it provide?
The Google Awareness API exposes seven kinds of context signals:
- Time
- Location (latitude and longitude)
- Places (place types, e.g. park, store)
- Activity (walking, running, cycling, etc.)
- Beacons (nearby beacons matching the namespaces you specify)
- Headphones
- Weather
These seven signals can be combined with one another; in other words, you can require several states at once and only run your action when all of them are triggered.
How do you use it?
Google provides two ways:
- Fence API
The name is hard to translate nicely; my understanding is that you combine conditions (or use just one), and when the phone meets the conditions you set, your application receives a callback in which you handle your logic. I'll provide a demo below.
- Snapshot API
Also hard to translate; the official description is: fetch the current, concrete data for one of the seven signals. In other words, with the Snapshot API you can query the instantaneous, detailed state of a given signal.
With the introduction out of the way, let's start using it.
Usage
Below I'll build a demo that detects whether headphones are plugged in.
1. Configure build.gradle
implementation 'com.google.android.gms:play-services-awareness:16.0.0'
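For reference, a minimal sketch of where this line goes in the module-level build.gradle (assuming the google() Maven repository is already configured for the project):
dependencies {
    // Awareness ships as part of Google Play services
    implementation 'com.google.android.gms:play-services-awareness:16.0.0'
}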
2. Add or select a project in the Google API Console
URL: Google API
After the project is created, under Application restrictions choose Android apps and enter your app's package name and SHA-1 fingerprint; under API restrictions select the Awareness API.
Once that's set up, put the generated key value into the manifest file:
<meta-data android:name="com.google.android.awareness.API_KEY" android:value="YOUR API KEY"/>
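For orientation, a minimal sketch of where this tag lives in AndroidManifest.xml (the package name here is a made-up placeholder):
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.awarenessdemo">
    <application>
        <!-- The API key meta-data must be a direct child of <application> -->
        <meta-data
            android:name="com.google.android.awareness.API_KEY"
            android:value="YOUR API KEY"/>
    </application>
</manifest>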
Note: if your project also needs location or Nearby features, you have to enable the corresponding APIs in the console as well; the steps are similar to enabling the Awareness API.
3. Set permissions
<uses-permission android:name="android.permission.INTERNET"/>
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION"/>
<uses-permission android:name="com.google.android.gms.permission.ACTIVITY_RECOGNITION"/>
Add only the permissions your project actually needs; the official documentation lists which permission each signal requires (for example, location-based fences and snapshots need ACCESS_FINE_LOCATION, and activity detection needs com.google.android.gms.permission.ACTIVITY_RECOGNITION).
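Also note that ACCESS_FINE_LOCATION is a dangerous permission, so on Android 6.0+ it has to be requested at runtime as well. A minimal sketch, assuming androidx.core is on the classpath (the request code is arbitrary):
import android.Manifest;
import android.content.pm.PackageManager;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;

// Inside the Activity, e.g. called from onCreate():
private static final int REQUEST_LOCATION = 1;

private void ensureLocationPermission() {
    // Only prompt if the permission has not been granted yet.
    if (ContextCompat.checkSelfPermission(this, Manifest.permission.ACCESS_FINE_LOCATION)
            != PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(this,
                new String[]{Manifest.permission.ACCESS_FINE_LOCATION},
                REQUEST_LOCATION);
    }
}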
4. Call the API (using the Fence API)
public class MainActivity extends AppCompatActivity {
private static final String TAG = "MainActivity";
private static final String FENCE_RECEIVER_ACTION = "fence_receiver_action";
// Create a fence.
AwarenessFence headphoneFence = HeadphoneFence.during(HeadphoneState.PLUGGED_IN);
// Declare variables for pending intent and fence receiver.
private PendingIntent myPendingIntent;
private MyFenceReceiver myFenceReceiver;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
Intent intent = new Intent(FENCE_RECEIVER_ACTION);
myPendingIntent = PendingIntent.getBroadcast(this, 0, intent, 0);
myFenceReceiver = new MyFenceReceiver();
registerReceiver(myFenceReceiver, new IntentFilter(FENCE_RECEIVER_ACTION));
}
@Override
protected void onResume() {
super.onResume();
// Register the fence to receive callbacks.
// The fence key uniquely identifies the fence.
Awareness.getFenceClient(this).updateFences(new FenceUpdateRequest.Builder()
.addFence("headphoneFenceKey", headphoneFence, myPendingIntent)
.build()).addOnSuccessListener(new OnSuccessListener<Void>() {
@Override
public void onSuccess(Void aVoid) {
Log.i(TAG, "Fence was successfully registered.");
}
}).addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(@NonNull Exception e) {
Log.e(TAG, "Fence could not be registered: " + e.getLocalizedMessage());
}
});
}
@Override
protected void onStop() {
if (myFenceReceiver != null) {
unregisterReceiver(myFenceReceiver);
myFenceReceiver = null;
}
super.onStop();
}
// Handle the callback on the Intent.
public class MyFenceReceiver extends BroadcastReceiver {
@Override
public void onReceive(Context context, Intent intent) {
FenceState fenceState = FenceState.extract(intent);
if (TextUtils.equals(fenceState.getFenceKey(), "headphoneFenceKey")) {
switch (fenceState.getCurrentState()) {
case FenceState.TRUE:
Log.i(TAG, "Headphones are plugged in.");
Toast.makeText(context, "Headphones are plugged in.", Toast.LENGTH_LONG).show();
break;
case FenceState.FALSE:
Log.i(TAG, "Headphones are NOT plugged in.");
Toast.makeText(context, "Headphones are NOT plugged in.", Toast.LENGTH_LONG).show();
break;
case FenceState.UNKNOWN:
Log.i(TAG, "The headphone fence is in an unknown state.");
Toast.makeText(context, "The headphone fence is in an unknown state.", Toast.LENGTH_LONG).show();
break;
}
}
}
}
}
To summarize, there are three steps:
- Register a broadcast receiver
- Build and register the fence(s) for the states you want to detect
- Handle the fence state changes in the receiver
Besides listening to a single state like the headphones above, we can also combine fences freely.
The Awareness API supports AND, OR, and NOT.
Example 1:
// Create the primitive fences.
AwarenessFence walkingFence = DetectedActivityFence.during(DetectedActivityFence.WALKING);
AwarenessFence headphoneFence = HeadphoneFence.during(HeadphoneState.PLUGGED_IN);
// Create a combination fence to AND primitive fences.
AwarenessFence walkingWithHeadphones = AwarenessFence.and(
walkingFence, headphoneFence
);
This example sets the condition: the user is walking and headphones are plugged in.
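Registering a combined fence works exactly like registering a single one; here is a sketch that reuses the myPendingIntent from the demo above, with a hypothetical fence key:
// Register the combined fence under its own key; state changes arrive on the same PendingIntent.
Awareness.getFenceClient(this).updateFences(new FenceUpdateRequest.Builder()
        .addFence("walkingWithHeadphonesKey", walkingWithHeadphones, myPendingIntent)
        .build());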
Example 2:
double currentLocationLat; // current location latitude
double currentLocationLng; // current location longitude
long nowMillis = System.currentTimeMillis();
long oneHourMillis = 1L * 60L * 60L * 1000L;
AwarenessFence orExample = AwarenessFence.or(
AwarenessFence.not(LocationFence.in(
currentLocationLat,
currentLocationLng,
100.0,  // radius in meters
0L)),   // minimum dwell time in milliseconds
TimeFence.inInterval(nowMillis + oneHourMillis, Long.MAX_VALUE));
This example triggers when the user has moved more than 100 meters away from the current location, or an hour has passed from now.
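One more Fence API detail worth knowing: when a fence is no longer needed (for example in onStop() or onDestroy()), it can be unregistered by the key it was added with. A sketch:
// Unregister a fence by key so it stops firing callbacks.
Awareness.getFenceClient(this).updateFences(new FenceUpdateRequest.Builder()
        .removeFence("headphoneFenceKey")
        .build());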
That covers the Fence API; now let's look at how to use the Snapshot API.
Example 1:
// Pulling headphone state is similar, but doesn't involve analyzing confidence.
Awareness.getSnapshotClient(this).getHeadphoneState()
.addOnSuccessListener(new OnSuccessListener<HeadphoneStateResponse>() {
@Override
public void onSuccess(HeadphoneStateResponse headphoneStateResponse) {
HeadphoneState headphoneState = headphoneStateResponse.getHeadphoneState();
boolean pluggedIn = headphoneState.getState() == HeadphoneState.PLUGGED_IN;
Log.i(TAG, "Headphones are " + (pluggedIn ? "plugged in" : "unplugged");
}
})
.addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(@NonNull Exception e) {
Log.e(TAG, "Could not get headphone state: " + e);
}
});
The example above checks whether headphones are currently plugged in.
Example 2:
// Each type of contextual information in the snapshot API has a corresponding "get" method.
// For instance, this is how to get the user's current Activity.
Awareness.getSnapshotClient(this).getDetectedActivity()
.addOnSuccessListener(new OnSuccessListener<DetectedActivityResponse>() {
@Override
public void onSuccess(DetectedActivityResponse dar) {
ActivityRecognitionResult arr = dar.getActivityRecognitionResult();
// getMostProbableActivity() is good enough for basic Activity detection.
// To work within a threshold of confidence,
// use ActivityRecognitionResult.getProbableActivities() to get a list of
// potential current activities, and check the confidence of each one.
DetectedActivity probableActivity = arr.getMostProbableActivity();
int confidence = probableActivity.getConfidence();
String activityStr = probableActivity.toString();
Log.i(TAG, "Activity: " + activityStr
+ ", Confidence: " + confidence + "/100");
}
})
.addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(@NonNull Exception e) {
Log.e(TAG, "Could not detect activity: " + e);
}
});
The example above gets the user's current activity along with the corresponding confidence level.
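The other snapshot signals follow the same pattern; for example, here is a sketch of querying the current weather (this assumes ACCESS_FINE_LOCATION has already been granted at runtime, otherwise the call fails):
Awareness.getSnapshotClient(this).getWeather()
        .addOnSuccessListener(new OnSuccessListener<WeatherResponse>() {
            @Override
            public void onSuccess(WeatherResponse weatherResponse) {
                Weather weather = weatherResponse.getWeather();
                // Temperature can be read in Celsius or Fahrenheit.
                Log.i(TAG, "Temperature: " + weather.getTemperature(Weather.CELSIUS) + " C");
            }
        })
        .addOnFailureListener(new OnFailureListener() {
            @Override
            public void onFailure(@NonNull Exception e) {
                Log.e(TAG, "Could not get weather: " + e);
            }
        });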
Notes
- The phone must have Google Play services installed (see the sketch after this list for a quick availability check).
- The samples on the official site are out of date; refer to the example project on GitHub instead.
- In my testing I needed a VPN the entire time to reach Google services.
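For the first note, a quick way to verify Play services availability before calling any Awareness API; a sketch using GoogleApiAvailability:
// Returns ConnectionResult.SUCCESS when Google Play services is present and up to date.
int status = GoogleApiAvailability.getInstance().isGooglePlayServicesAvailable(this);
if (status != ConnectionResult.SUCCESS) {
    Log.w(TAG, "Google Play services unavailable, status = " + status);
}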
Finally
The reason I wrote this article is that yesterday I read a 少数派 (sspai) article that mentioned the Awareness API, and out of curiosity I decided to learn more about it.
Are there any apps already using the Awareness API?
Of course. The one I currently know of is a live wallpaper app that uses this API:
Vortex - Data Driven Live Wallpaper