Friday, October 31, 2014

Tips for Optimizing Battery Power on iOS Devices - Location Manager

Turn off location services when not in use
This guideline is obvious but worth repeating. Unless the application is a turn-by-turn navigation app, it doesn't need to get location updates very often. After getting the location once, the application can turn off location monitoring. 

Use the significant-change location service instead of the standard location service whenever possible: 
This enables the app to leave location services running while using only a little power. It is recommended for apps that need to track changes in the user's location but don't need the higher precision offered by the standard location service. 
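
A minimal sketch of switching to the significant-change service when it is available, falling back to the standard service otherwise (manager setup shown for completeness):

CLLocationManager *manager = [[CLLocationManager alloc] init];
manager.delegate = self;

if ([CLLocationManager significantLocationChangeMonitoringAvailable]) {
    // Low-power monitoring: updates arrive only on cell-tower-level movement
    [manager startMonitoringSignificantLocationChanges];
} else {
    // Fall back to the standard location service
    [manager startUpdatingLocation];
}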

Use lower-resolution values for the desired accuracy: 
Using the highest accuracy makes the device power up additional hardware to determine the location. Unless the app needs accuracy within a few metres, it should not use values such as kCLLocationAccuracyBest or kCLLocationAccuracyNearestTenMeters for the desiredAccuracy property. Also note that providing the value kCLLocationAccuracyThreeKilometers doesn't prevent the system from returning more accurate data; most of the time Core Location can return data accurate to within 100 metres or so. 
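
For example, assuming manager is the app's CLLocationManager instance, a coarse configuration could look like this (the 500 m distance filter is only an illustrative value):

manager.desiredAccuracy = kCLLocationAccuracyHundredMeters;
manager.distanceFilter  = 500.0;   // metres; skip callbacks for small movements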

Turn off location monitoring if the accuracy doesn't improve
Check the accuracy of the incoming updates, and if it doesn't improve over a period of time, turn off the updates and retry later; this saves power. 
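
A minimal sketch of this, assuming the class keeps a startDate property set when updates begin; the 100 m threshold, the 10 s window and the scheduleLocationRetry helper are illustrative assumptions:

- (void)locationManager:(CLLocationManager *)manager
     didUpdateLocations:(NSArray *)locations {
    CLLocation *location = [locations lastObject];
    BOOL goodEnough = (location.horizontalAccuracy > 0 &&
                       location.horizontalAccuracy <= 100.0);
    BOOL timedOut   = ([[NSDate date] timeIntervalSinceDate:self.startDate] > 10.0);

    if (goodEnough || timedOut) {
        [manager stopUpdatingLocation];
        if (!goodEnough) {
            [self scheduleLocationRetry];   // hypothetical helper: try again later
        }
    }
}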

Specify the activity type of the app
Letting Core Location know what type of activity is associated with the app (e.g. whether it is an automobile navigation app or a fitness app) helps the location manager determine the most appropriate time to pause location updates when the app is in the background. This can improve the battery life of the device.
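
For instance, a fitness-style app might hint Core Location like this (a sketch; pick the CLActivityType that matches the app):

manager.activityType = CLActivityTypeFitness;   // or CLActivityTypeAutomotiveNavigation, etc.
manager.pausesLocationUpdatesAutomatically = YES;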

Allow location manager to defer the delivery of location updates when your app is in the background. 

When the app can’t do anything useful with the location updates it receives from the location manager - other than log the info and go back to sleep - allow the location manager to defer those updates until they are meaningful to the app.
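
A sketch of deferring updates, typically requested from the didUpdateLocations callback while in the background; the 1 km / 5 minute thresholds are arbitrary examples:

if ([CLLocationManager deferredLocationUpdatesAvailable]) {
    [manager allowDeferredLocationUpdatesUntilTraveled:1000.0   // metres
                                               timeout:300.0];  // seconds
}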

references: 
https://developer.apple.com/library/ios/documentation/UserExperience/Conceptual/LocationAwarenessPG/CoreLocation/CoreLocation.html

Android Signing apps using Android Studio

Signing in Debug mode 
In debug mode, apps are signed with a debug certificate generated by the SDK tools. This certificate has a private key with a known password, which enables the developer to run and debug the app without entering the password every time a change is made to the application. 

Android Studio and the Eclipse IDE automatically sign the APK when the app is run from these tools, but a debug-signed build cannot be distributed. 

Signing in release mode. 
In release mode, the developer signs the app with their own certificate. Below are the steps:

- Create a keystore: A keystore is a binary file that contains a set of private keys. The developer must keep the keystore in a secure place. 
- Create a private key: A private key represents the entity to be identified with the app, such as a person or a company. 
- Build the project: Generate the unsigned APK of the app. 
- Sign the app: Using the private key, generate the signed version of the app. 

After the above steps, the app can be distributed in the store. 

Signing wearable apps
When publishing wearable apps, the developer needs to package the wearable app inside a handheld app, and both must be signed. 

In Android Studio, there is a utility to create the keystore and store the key and password in it. 
It can be accessed via Build > Generate Signed APK.
On the Signed APK wizard, click Create new.
Store the keystore at a known private path on the system, and provide the key alias and passwords to store in it. 

This creates the signed APK. 

References:

Thursday, October 30, 2014

iOS Location Monitoring Accuracy values

When requesting location updates, the application can specify the desired accuracy using any of the constants below.

Below are the values; we will use kCLLocationAccuracyHundredMeters for WFF. 

extern const CLLocationAccuracy  kCLLocationAccuracyBestForNavigation;
extern const CLLocationAccuracy  kCLLocationAccuracyBest;
extern const CLLocationAccuracy  kCLLocationAccuracyNearestTenMeters;
extern const CLLocationAccuracy  kCLLocationAccuracyHundredMeters;
extern const CLLocationAccuracy  kCLLocationAccuracyKilometer;
extern const CLLocationAccuracy  kCLLocationAccuracyThreeKilometers;

In terms of battery conservation, kCLLocationAccuracyThreeKilometers is the most efficient and kCLLocationAccuracyBestForNavigation is the worst. 

Also, in the location callback we will use the horizontalAccuracy of the returned location, along with the kCLLocationAccuracyHundredMeters setting, and we will check that the accuracy is < 400 metres before processing the update further. 
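
A sketch of that accuracy gate in the delegate callback (processLocation: is a hypothetical handler):

- (void)locationManager:(CLLocationManager *)manager
     didUpdateLocations:(NSArray *)locations {
    CLLocation *location = [locations lastObject];
    // A negative horizontalAccuracy means the fix is invalid; >= 400 m is too coarse
    if (location.horizontalAccuracy < 0 || location.horizontalAccuracy >= 400.0) {
        return;
    }
    [self processLocation:location];
}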


References: 

Wednesday, October 29, 2014

iOS 8.0 Pushing a Local Notification or Remote Notification with a Custom Action

To show the notification actions an app has defined, categorised and registered, the app needs to receive a remote notification or schedule a local notification. In the remote notification case, we need to include the category identifier in the payload: 

{
    "aps" : {
        "alert" : "You’re invited!",
        "category" : "INVITE_CATEGORY"
    }
}

For local notifications, the code is something like below 

UILocalNotification *notification = [[UILocalNotification alloc] init];
// ...
notification.category = @"INVITE_CATEGORY";
[[UIApplication sharedApplication] scheduleLocalNotification:notification];

Handling Notification actions 
If the user doesn't tap on a specific action (actions are available from iOS 8.0), the system calls didReceiveLocalNotification or didReceiveRemoteNotification as usual; this is the case when the application is in the foreground. If the app was in the background and the user performs one of the above actions, didFinishLaunchingWithOptions will be called, passing the notification details in the launch options dictionary. 

To handle the actions from a notification (available in iOS 8.0), the application needs to implement one of the methods below, depending on the type of notification it uses:

application:handleActionWithIdentifier:forLocalNotification:completionHandler:
application:handleActionWithIdentifier:forRemoteNotification:completionHandler:

The code is something like below (the remote notification variant; the local one differs only in the third parameter): 

- (void)application:(UIApplication *)application
handleActionWithIdentifier:(NSString *)identifier
 forRemoteNotification:(NSDictionary *)notification
      completionHandler:(void (^)())completionHandler {

    if ([identifier isEqualToString:@"ACCEPT_IDENTIFIER"]) {
        [self handleAcceptActionWithNotification:notification];
    }

    // Must be called when finished
    completionHandler();
}

references: 
https://developer.apple.com/library/ios/documentation/NetworkingInternet/Conceptual/RemoteNotificationsPG/Chapters/IPhoneOSClientImp.html

Tuesday, October 28, 2014

iOS 8.0 Registering Notification Actions


To use notification actions, an app needs to define the actions, group them into categories, and register them. 
Below is the code to do this 

UIMutableUserNotificationAction *acceptAction =
    [[UIMutableUserNotificationAction alloc] init];

// Define an id string to be passed back to your app when you handle the action
acceptAction.identifier = @"ACCEPT_IDENTIFIER";

// Localized string displayed to the user
acceptAction.title = @"Accept";

// If you need to show UI, choose foreground
acceptAction.activationMode = UIUserNotificationActivationModeBackground;

// Destructive actions display in red
acceptAction.destructive = NO;

// Does the action require authentication?
acceptAction.authenticationRequired = NO;


The activationMode property tells iOS whether it should launch the app in the foreground or the background when the user responds to the notification. If the activation mode is set to UIUserNotificationActivationModeBackground, the app is given a few seconds to run. If the destructive property is NO, the action button appears blue; if it is YES, it appears red. If authenticationRequired is set to YES and the device is locked when the user acts on the notification, the OS will ask the user for the passcode. When the activation mode is UIUserNotificationActivationModeForeground, authenticationRequired is treated as YES. Once the actions are defined, they need to be grouped into categories. The code for this is like below: 

// First create the category
UIMutableUserNotificationCategory *inviteCategory =
    [[UIMutableUserNotificationCategory alloc] init];

// Identifier to include in your push payload and local notification
inviteCategory.identifier = @"INVITE_CATEGORY";

// Add the actions to the category and set the action context
[inviteCategory setActions:@[acceptAction, maybeAction, declineAction]
                forContext:UIUserNotificationActionContextDefault];

// Set the actions to present in a minimal context
[inviteCategory setActions:@[acceptAction, declineAction]
                forContext:UIUserNotificationActionContextMinimal];

NSSet *categories = [NSSet setWithObjects:inviteCategory, alarmCategory, nil];

UIUserNotificationType types = UIUserNotificationTypeBadge |
                               UIUserNotificationTypeAlert |
                               UIUserNotificationTypeSound;

UIUserNotificationSettings *settings =
    [UIUserNotificationSettings settingsForTypes:types categories:categories];

[[UIApplication sharedApplication] registerUserNotificationSettings:settings];

References:
https://developer.apple.com/library/ios/documentation/NetworkingInternet/Conceptual/RemoteNotificationsPG/Chapters/IPhoneOSClientImp.html

iOS 8.0 Registering, Scheduling, and handling User notifications


The biggest change in iOS 8.0 is that the application needs to register the types of notifications it intends to deliver. The system then gives the user the ability to limit the types of notifications the app displays. 
A new method, registerUserNotificationSettings:, is available on UIApplication. If the app doesn't register any notification types, the system delivers all remote notifications to the app silently, that is, without displaying any user interface; alternatively, the app can explicitly declare that it will not display any user interface for notifications. 

The code below shows how to register for notifications: 

UIUserNotificationType types = UIUserNotificationTypeBadge |
                               UIUserNotificationTypeAlert |
                               UIUserNotificationTypeSound;
UIUserNotificationSettings *mySettings =
    [UIUserNotificationSettings settingsForTypes:types categories:nil];
[[UIApplication sharedApplication] registerUserNotificationSettings:mySettings];

Passing the categories is important for handling specific actions for notifications; here it is specified as nil, so custom actions won't be available for the notification. 
When the registerUserNotificationSettings: API is called, the application presents a dialog asking the user for permission to present the types of notifications the app registered. After the user replies, iOS calls back the UIApplicationDelegate's application:didRegisterUserNotificationSettings: method, passing a UIUserNotificationSettings object that specifies the types of notifications the user allows. The application can also check the current notification settings by calling currentUserNotificationSettings. From iOS 8.0 onwards, registerUserNotificationSettings: applies to both local and remote notifications, and it deprecates the earlier registerForRemoteNotificationTypes: method. 

Registering for Remote notifications 
The remote notification registration is similar to the pre-iOS 8.0 versions; a minimal sketch of these steps follows the list. 

1. Register the notification types the app supports using registerUserNotificationSettings:
2. Register to receive the APNs token using registerForRemoteNotifications 
3. Store the device token returned on successful registration, and handle errors gracefully 
4. Forward the device token to the app's push provider 
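
A minimal sketch of steps 1-4 (sendTokenToPushProvider: is a hypothetical helper standing in for the app-specific upload in step 4):

- (BOOL)application:(UIApplication *)application
didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    UIUserNotificationType types = UIUserNotificationTypeBadge |
                                   UIUserNotificationTypeAlert |
                                   UIUserNotificationTypeSound;
    UIUserNotificationSettings *settings =
        [UIUserNotificationSettings settingsForTypes:types categories:nil];
    [application registerUserNotificationSettings:settings];   // step 1
    [application registerForRemoteNotifications];              // step 2
    return YES;
}

- (void)application:(UIApplication *)application
didRegisterForRemoteNotificationsWithDeviceToken:(NSData *)deviceToken {
    [self sendTokenToPushProvider:deviceToken];                 // steps 3 and 4
}

- (void)application:(UIApplication *)application
didFailToRegisterForRemoteNotificationsWithError:(NSError *)error {
    NSLog(@"Push registration failed: %@", error);              // handle gracefully
}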

Handling local and remote notifications 
Handling notifications is the same as in earlier versions of iOS, that is (see the sketch after this list): 

- If the app was not running in the foreground, the OS takes care of presenting the notification to the user. 
- When the user taps the notification or one of its action buttons, the app is launched with didFinishLaunchingWithOptions and the payload is passed in as an argument.

- When the app was running in the foreground while receiving the notification, the didReceiveRemote/LocalNotification method is called. 
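
A sketch of both delivery paths (handleNotificationPayload: is a hypothetical handler):

- (BOOL)application:(UIApplication *)application
didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    // Launched because the user tapped a remote notification or one of its actions
    NSDictionary *payload = launchOptions[UIApplicationLaunchOptionsRemoteNotificationKey];
    if (payload) {
        [self handleNotificationPayload:payload];
    }
    return YES;
}

// Delivered while the app is running in the foreground
- (void)application:(UIApplication *)application
didReceiveRemoteNotification:(NSDictionary *)userInfo {
    [self handleNotificationPayload:userInfo];
}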

references:
https://developer.apple.com/library/ios/documentation/NetworkingInternet/Conceptual/RemoteNotificationsPG/Chapters/IPhoneOSClientImp.html

Monday, October 27, 2014

Android Lollipop - What's new

Notification Changes 
- Responding to notifications from the lock screen. 
- Ability to filter out notifications based on priority, and/or time-based filtering of notifications where only priority notifications get through during that time. 
- Incoming calls won't interrupt when watching or playing. => Possibly a note for the Softphone app; currently it takes over the screen when there is an incoming SIP call. 
- Ability to prioritise notifications from apps => Possibly we need to look at a way to provide priority and ranking for our notifications. 

Security: 
- Automatic security enabled on data stored on the device => We need to see if anything 
- SELinux security enforcement -> looks like this is user based access privileges 

Device Sharing:
- Device can be shared without sharing the data. Seems much like Desktop multi user modes or logins => We need to see how the app can be available or restricted based on multiple logins
- Screen pinning : pin the screen (not sure if possible to pin screen of an app) so that another user can access only that content 

Connectivity: 
- Improved network handoffs - continuing VoIP calls without interruption when switching between cellular and WiFi -> a much interesting feature for the P2G app. 
- Power-efficient scanning of BLE devices -> something around Beacons and Wearables can be planned based on this; BLE peripheral mode is also available (e.g. an Android L device can act as a health monitor and transmit this data to another BLE device).

Runtime and Performance
- ART is an entirely new runtime, improving application performance and responsiveness.
- Up to 4x performance improvements. 
- Smoother UI for complex, visually rich applications. 
- Compacting of background apps and services so you can do more at once. 

Support for 64-bit devices, like the Nexus 9, brings desktop-class CPUs to Android. 

- Supports 64-bit SoCs using ARM, x86, and MIPS-based cores 
- Ships 64-bit native apps like Chrome, Gmail, Calendar, Google Play Music, and more 
- Pure Java language apps run as 64-bit apps automatically. 

Media
 Bolder graphics and improved audio, video and camera capabilities 

- Lower latency audio input ensuring that music and communication applications that have strict delay requirements provide an amazing realtime experience. 
- Multi-channel audio stream mixing means professional audio applications can now mix up to eight channels, including 5.1 and 7.1 channels. 
- USB audio support means one can plug USB microphones, speakers, and a myriad of other USB audio devices like amplifiers and mixers into an Android device. 
- OpenGL ES 3.1 and the Android Extension Pack bring Android to the forefront of mobile graphics, putting it on par with desktop and console class performance. 

There is a new professional range of photography features in Android Lollipop that lets the user: 

- Capture full-resolution frames at around 30 fps 
- Support raw formats like YUV and Bayer RAW 
- Control capture settings for sensor, lens, and flash per individual frame 
- Capture metadata like noise models and optical information 

State-of-the-art video technology with support for HEVC to allow UHD 4K video playback, tunnelled video for high-quality video playback on Android TV, and improved HLS support for streaming. 

OK Google 
- Even if the screen is off, user can say OK Google on devices with digital signal processing support such as Nexus 6 and 9. 
- Talk to Google on the go to get quick answers, send a text, get directions, and more. 

Android TV
Support for Living room devices 

- User interface adapted for the living room
- Less browsing, more watching with personalised recommendations for content like movies and TV shows 
- Voice search for Google Play, YouTube and supported apps, so users can say what they want to see 
- Console-style Android gaming on the user's TV with a gamepad. 
- Cast the user's favourite entertainment apps to the big screen with Google Cast support for Android TV devices. 

Accessibility
- Enhanced low vision and colour blind capabilities. 
- Adjust display to improve colour differentiation 

New 68+ Languages
15 new additional languages are included, among them Indian languages such as Malayalam, Kannada, Marathi, Tamil and Telugu. 

Device Set up
Tap & Go: set up a new Android device by just tapping it against an older one (requires NFC). 
Whenever there is a new Android phone or tablet, the user can automatically bring over apps from Google Play from any of their older Android devices. 

References

Sunday, October 26, 2014

Android Service - A bird's eye view

A service is an application component representing either an application's desire to perform a longer-running operation while not interacting with the user, or to supply functionality for other applications to use. Each service class must have a corresponding declaration in its package's manifest file, AndroidManifest.xml. A service can be started with Context.startService or Context.bindService. 

Services, like other application objects, run in the main thread of their hosting process. This means that if the service is going to do any CPU-intensive work (such as MP3 playback) or blocking operations (such as networking), it should spawn its own thread in which to do that work. The IntentService class is available as a standard implementation of Service that has its own thread where it schedules its work. 

Starting a service is as simple as creating an intent and calling the startService method. 

Intent intent = new Intent(this,HelloService.class);
startService(intent);

This results in a call to the service's onStartCommand method. 
If the service is not already running, the system creates it by calling onCreate and then calls onStartCommand. 
In order to communicate back to an Activity, the service can make use of an Intent, and the activity can be a broadcast receiver of those intents. 

As an example, if a location-checking service needs to run continuously in the background and report when there is a significant location change, it can listen for location changes in the background. When the service finds a significant location change, it can pass that information to the Activity by sending an intent like the one below: 

intent = new Intent(BROADCAST_ACTION);
intent.setAction("com.test.locationbroadcast");
intent.putExtra("Latitude", location.getLatitude());
intent.putExtra("Longitude", location.getLongitude());
intent.putExtra("Provider", location.getProvider());

sendBroadcast(intent);

References:
http://developer.android.com/guide/components/services.html

Friday, October 24, 2014

The Navdy experience:

It is pretty cool that a device can project the contents of a mobile device right in front of the driver, just above the steering wheel. The demo video posted on the site is really cool. 
Below are a few notes on this: 

- It works on Android and iPhone devices:
It extends the apps that are already available on the phone; no service plans are required. Wondering how it extends the already existing apps. 

- Touchless Gestures 
The user can use gestures to control the behaviour, such as swiping left to reject a call or dismiss a notification. 

- Voice Recognition 
The device also supports voice recognition; as described, the commands are similar to the ones used in Google Now or Siri, such as "compose new tweet" or "call mom". 

- Notifications 
Any notification on the phone (text, social, etc.) can be displayed, read aloud, or disabled entirely. The user can decide what appears and when. There are even parental controls to keep teens safe. 

- Split Windows 
Navigation doesn't disappear when a call comes in; the current window is split to show navigation and the call together. 

Works with any music app

The technical specs of the device have a few interesting items: 

- IR camera for touchless gesture control 
- Accelerometer, e-compass, ambient light sensor
- WiFi (802.11 b/g/n), Bluetooth 4.0/LE

- Dual-core processor running Android 4.4 

References:
https://www.navdy.com/

Sunday, October 19, 2014

iOS 8.0 LocationManager related changes

When I thought of writing a location manager test app, as usual I wrote the following few lines and tried to test them: 

if (m_foregroundLocationManager == nil)
{
    self.m_foregroundLocationManager = [[CLLocationManager alloc] init];
}
self.m_foregroundLocationManager.delegate = self;
[self.m_foregroundLocationManager startUpdatingLocation];

But this resulted in the error below: 
is depending on legacy on-demand authorization, which is not supported 

Googling around led to a few interesting facts. 

The application needs to have either of the two keys below in its Info.plist. If not, the location manager won't start. 

NSLocationWhenInUseUsageDescription
NSLocationAlwaysUsageDescription 

Each key takes a string value, which the iOS framework displays when either of the APIs below is called to request permission from the user: 

[self.locationManager requestWhenInUseAuthorization];
[self.locationManager requestAlwaysAuthorization];

If the app's main functionality is something done in the foreground and only some secondary functionality runs in the background, then both keys can be added to the plist.
But if background location is part of the main functionality, then request always authorisation. 
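
A minimal sketch of the resulting iOS 8 flow, assuming NSLocationWhenInUseUsageDescription has been added to the Info.plist (the respondsToSelector: check keeps it working on iOS 7 too):

if ([self.m_foregroundLocationManager
        respondsToSelector:@selector(requestWhenInUseAuthorization)]) {
    [self.m_foregroundLocationManager requestWhenInUseAuthorization];   // iOS 8+
}
[self.m_foregroundLocationManager startUpdatingLocation];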

Below are the location services that need the always-allow authorisation: 

- Significant location change
- Boundary crossing 
- Background location updates (e.g. Fitness, Navigation apps) 
- iBeacons 
- Visited Locations (iOS 8.0+)
- Deferred location updates 

All these location services have the power to wake the app from the suspended or terminated state when a location event occurs. If the application is given only when-in-use authorisation, these services won't be available to it. 

However, in iOS 8.0, the application can request the system to present a local notification to the user when the user crosses a set location boundary (geofence). When the user crosses the boundary, the system presents the local notification on behalf of the app without waking it up. If the user chooses to open the notification, the app will wake up and can receive the boundary information. 
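
A sketch of such a region-triggered local notification; the coordinates, radius and identifier are illustrative values:

CLCircularRegion *region =
    [[CLCircularRegion alloc] initWithCenter:CLLocationCoordinate2DMake(37.33, -122.03)
                                      radius:100.0
                                  identifier:@"office"];

UILocalNotification *notification = [[UILocalNotification alloc] init];
notification.alertBody = @"You have arrived.";
notification.region = region;              // iOS 8: fired by the system on boundary crossing
notification.regionTriggersOnce = YES;
[[UIApplication sharedApplication] scheduleLocalNotification:notification];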

These are the basics of Location changes in the iOS 8.0 version. 

References:

Monday, October 13, 2014

Adding Environment Variables on Mac

Open Terminal
Type cd ~ (this will take you to your home directory)
Type touch .profile (this will create a hidden file named .profile)
Type open -e .profile (this will open the file you just created in TextEdit)
In the file, type export PATH=${PATH}:/pathToTheAndroidSdkFolder/android-sdk-mac_86/platform-tools
Save the file, close TextEdit, quit Terminal, and relaunch Terminal

NOTE: By creating an environment variable you won’t need to cd to the Android/tools folder every time you want to run ADB

Android Running the app on Device

Even though I have loaded many apps onto devices earlier, I could not recall the steps, nor could I find a blog post of my own on this. Below are a few notes on loading a map application (the template application in Android Studio) onto a Spice phone running KitKat. 

Since my device was an Android KitKat phone, I did the below: 

- Settings -> About Phone, and tapped the Build number 7 times. This showed the Developer Options entry in the main Settings menu. 

On selecting Developer Options and connecting the device, it asked to confirm the computer's RSA key fingerprint, and that was it: Android Studio was able to detect the device and install the app onto it. 
Below is the screenshot showing this. 




References:

Sunday, October 12, 2014

iOS UIWindow concepts

Based on some reading, there seem to be three levels for UIWindow: 

UIWindowLevelNormal 
UIWindowLevelStatusBar
UIWindowLevelAlert 

Just as a cool note, UIWindow objects are just other UIViews, and their display order is controlled by the windowLevel property. 
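
A small sketch of this: an extra window placed above the status bar just by raising its windowLevel (in real code a strong reference to the window must be kept somewhere):

UIWindow *overlayWindow = [[UIWindow alloc] initWithFrame:[UIScreen mainScreen].bounds];
overlayWindow.windowLevel = UIWindowLevelStatusBar + 1;   // above normal and status-bar levels
overlayWindow.rootViewController = [[UIViewController alloc] init];
overlayWindow.hidden = NO;   // ordering is decided by windowLevel, not by add order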


To test out these concepts, I created a sample application, which is available at this link. The application does the below: 

- A button that shows an alert
- A button that shows up an action sheet
- A text field that can show the keypad and the copy/select-all control options 
- A button that can resign the text field as first responder. 

The application prints the current list of windows on a timer, and for each window it prints the items below: 

- Window object 
- Window level 
- Whether it is the key window 


The table below gives an idea of the number of windows the iOS framework places. 



References:

Saturday, October 11, 2014

Mobile Push automation


Three possibilities for push are: 

Schedule-based - when there is a known upcoming event, say Black Friday, a push message can be scheduled. 
Real-time - based on user activity, e.g. when the user is going into a specific section of the application. 

One important item to discuss is the first 30 days after the app is initially installed.

From iOS 7.1 onwards, the app doesn't have to be open or in the background to be able to receive iBeacon messages. It can be in a closed state as well. 

One interesting thing is that when the user gets a targeted notification or beacon message, the application can open directly to that page, and the page data can be HTML that is opened in a web view (UIWebView) to show the content directly there. 

Android SDK - Installing on MAC


I decided to do some Android development on the Mac machine, and below are a few things I did for that. 

Installing the Eclipse IDE 
The IDE is available as a Mac-specific download. 

If you don't want to install the Eclipse IDE, then below are the options: 

1. Android Studio install 
2. Standalone SDK tools 

To set up the ADT Eclipse bundle, below are the steps 

- Download the Eclipse ADT bundle from the link https://developer.android.com/sdk/index.html, which will give a file with the name adt-bundle-.zip; save it to an appropriate location, such as a Development directory. 
- Open the adt-bundle-/eclipse/ directory and launch Eclipse. 

Interestingly, the Eclipse ADT link did not work and kept coming back to the index page. This forced me to install Android Studio, which is based on the IntelliJ IDE. It has almost all the features that Eclipse offers except NDK support. Since I was not planning to do any NDK development, though there are a few projects with me that need it, I decided to go ahead with the Studio install. 

This downloaded a .dmg file, which I then copied into the Applications folder. Launching the IDE asked me to install the Java 6 runtime, which wasn't present as the machine was new, so I went ahead and installed it. 

I captured the first run of the IDE and the screenshot is below. Except for Android L, I selected all the latest versions of the SDKs, such as 4.4.2, 4.3.1 and 4.2.x. As I know from past experience, this is a humongous download and takes a lot of time. 

That's mostly it; the IDE is installed. It felt a little slow on my 8 GB RAM Mac Pro, but I was able to use the emulator after creating a device in the AVD manager. 

References:

Installing Previous versions of Xcode

Having installed Xcode 6, there was no way to develop apps using the iOS 7.1 SDK. This was needed because the App Store apps were not yet ready to move to Xcode 6.0. 

By following the steps below, I could install Xcode 5.1.1 and achieve this.


2. Go to the Additional Tools section 

This shows the downloads in pages; around the 2nd page from the last is Xcode 5.1.1. 

Installed this. Just a note: there are a lot of downloads available for developers to install. 

References: 

Sunday, October 5, 2014

Google Voice Engine tests

The V2 API for Google speech recognition seems to use the URL https://www.google.com/speech-api/v2/recognize
The output has the following characteristics: 

output: json (xml is not supported) 
lang: any locale 
key: the developer should get the key from the Developer Console 
app: optional parameter; passing this returns some additional return values 
client: another optional parameter; applications normally seem to use chrome as the value 

An example can be given like this: 

First of all, we need a recording utility, and that can be obtained using the following few commands.

Install SoX 
On OS X with Homebrew installed: 
brew install sox 

Recording Audio

rec --encoding signed-integer --bits 16 --channels 1 --rate 16000 test.wav

Send the request
curl -X POST --data-binary @"audio/hello.wav" --header 'Content-Type: audio/l16; rate=16000;' 'https://www.google.com/speech-api/v2/recognize?output=json&lang=en-us&key='

When Google is 100% sure about the transcription, the response is something like below: 

{
   "result":[
      {
         "alternative":[
            {
               "transcript":"good morning Google how are you feeling today"
            }
         ],
         "final":true
      }
   ],
   "result_index":0
}

When Google is unsure, the response will be something like below: 

{
  "result":[
    {
      "alternative":[
        {
          "transcript":"this is a test",
          "confidence":0.97321892
        },
        {
          "transcript":"this is a test for"
        }
      ],
      "final":true
    }
  ],
  "result_index":0
}

References


https://github.com/gillesdemey/google-speech-v2

Wednesday, October 1, 2014

Installing Command line tools in Xcode

Each version of Xcode has a different mechanism for installing the command line tools. The latest Xcode as of this writing, Xcode 6.0, and even Xcode 5.1.1, do not have the command line tools install facility in the Xcode interface; instead they need to be installed via the terminal. The command below helps in this regard. 

xcode-select --install 

This brings up the command line tools install prompt, as in the screenshot below. 




Prior to Xcode 5.0, this option was available in the Xcode preferences.