Tuesday, September 30, 2014

Sharing iOS device screen on Desktop computer

To project the device screen onto a desktop computer, we can use a tool called iTools. 

http://itools-for-windows.en.softonic.com

Download it, connect the device to the PC using the standard iOS cable (e.g. a Lightning cable), and a Desktop option appears on the left panel of the software. 

Though the output is a little pixelated, it works well. 

Sunday, September 28, 2014

iOS Window Coordinate System

As part of the project, I had to make a panel, which is added on top of the window, scroll up as the keyboard slides up. 

Understanding the Window coordinate system is important for this. 

The main concept is that the window's coordinate system is defined only for portrait mode. In all other orientations, the coordinates are still expressed with respect to the portrait orientation. 

When the keyboard appears, the value of the keyboard rectangle in the various orientations is as below. 

Note that the orientation values in iOS are defined as below. The names are given with respect to the home button and hence may generate a bit of confusion, but it all becomes clear once you see the picture below.




Portrait: keyboardRect is (0.000000, 760.000000, 768.000000, 264.000000)
This is easy to understand: the keyboard's x is 0 and its y is 760 from the top left of the window; the width is the device screen width (768), and the height of the keyboard is 264. 

Landscape Left: keyboardRect is (416.000000, 0.000000, 352.000000, 1024.000000)
This orientation is when the device is rotated once to the right, i.e. the home button is on the left. In this case, (0,0) is at the top right of the device, hence x is 416 and y is 0; the width is 352 and the height becomes 1024. 

Upside Down: keyboardRect is (0.000000, 0.000000, 768.000000, 264.000000)
When the device is upside down, the (0,0) point is at the bottom right of the device, hence x and y become 0, the width becomes 768, and the height becomes 264. 

Landscape Right: keyboardRect is (0.000000, 0.000000, 352.000000, 1024.000000)
When the device is in this orientation, the (0,0) coordinate is at the bottom left of the device; the width becomes 352 and the height becomes 1024. 

A general observation from this: if the keyboard rect's height is greater than its width, the device is in a landscape orientation, and if the height is less than the width, it is in a portrait orientation. 


The view centre in the various orientations was: 

Portrait: panel center is :(384.000000,512.000000)
Landscape Left : panel center is :(512.000000,384.000000)
Upside Down: panel center is :(384.000000,512.000000)
Landscape Right:  panel center is :(512.000000,384.000000)

Basically, the panel was placed at the centre. 

Now, the panel has a text field at the bottom. When the keyboard comes up, the panel needs to move up so that the text field stays visible. 

It is interesting to note that the easiest way to adjust a view's y position is to change its center. 


Once we compute the amount by which the keyboard overlaps the view, we can simply shift the center up by that height. 
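A minimal sketch of this approach (assuming the panel is exposed as a `panel` property on the controller; the keyboard notification key and the rect conversion are standard UIKit, while the adjustment logic here is illustrative):

```objc
// Registered elsewhere with NSNotificationCenter for UIKeyboardWillShowNotification.
- (void)keyboardWillShow:(NSNotification *)note
{
    // The keyboard frame arrives in window (portrait-oriented) coordinates.
    CGRect kbWindowRect = [note.userInfo[UIKeyboardFrameEndUserInfoKey] CGRectValue];

    // Convert into the panel's superview coordinates so that the
    // orientation-dependent window coordinates no longer matter.
    CGRect kbRect = [self.panel.superview convertRect:kbWindowRect fromView:nil];

    // Amount by which the keyboard overlaps the panel.
    CGFloat overlap = CGRectGetMaxY(self.panel.frame) - CGRectGetMinY(kbRect);
    if (overlap > 0) {
        // Shift the panel up by adjusting its center.
        CGPoint c = self.panel.center;
        self.panel.center = CGPointMake(c.x, c.y - overlap);
    }
}
```

A matching handler for UIKeyboardWillHideNotification would restore the original center.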



Monday, September 22, 2014

Enable Airplay using iOS device

The simple steps below let the iPad connect to an Apple TV and mirror its screen. 

The images mentioned below are available on Google Drive at the location below: 

https://drive.google.com/folderview?id=0B0ZgwdHsnw1bS3BlSlpIeFZmMHc&usp=sharing

1. Turn on the Apple TV 
2. Unlock the iPad screen 
3. Swipe up from the bottom of the screen to bring up the quick control panel (IMG 0994.png) 
4. Tap on AirPlay and select the Apple TV (IMG 0996.PNG) 
5. Enable Mirroring (IMG 0995.PNG) 


Sunday, September 21, 2014

XMPP Signalling with Google Server for Roster, vCard & Presence

The diagram below shows the details of the XMPP interaction with the Google server.

Storyboard hands-on

If the requirement is an app with many screens, then storyboards can help reduce the amount of glue code between the screens. Instead of using a separate xib file for each view controller, the app can use a single storyboard that contains the designs of all the view controllers and the relationships between them. 

Below are a few advantages storyboards have over xib files. 

1. Better overview of the overall app screens and the connections between them. 
2. The storyboard describes the transitions between the various screens. These transitions are called segues, and they are created by Control-dragging from one view controller to the next. This reduces the code needed to connect the screens. 
3. Storyboards make working with table views a lot easier with the new prototype cells and static cells features. Table views can be designed almost entirely in the storyboard editor, with little or no setup code. 
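As a minimal illustration of the segue mechanism, the source controller can hand data to the destination in the standard prepareForSegue: callback (the segue identifier, destination class and properties below are made up for the example):

```objc
// Called just before a segue fires; instead of navigation glue code,
// the source controller only passes the data the destination needs.
- (void)prepareForSegue:(UIStoryboardSegue *)segue sender:(id)sender
{
    if ([segue.identifier isEqualToString:@"ShowDetail"]) {
        // DetailViewController and selectedItem are hypothetical names.
        DetailViewController *detail = segue.destinationViewController;
        detail.selectedItem = self.selectedItem;
    }
}
```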

As someone who prefers writing UI code programmatically, neither xib nor storyboard was my first choice. However, to get an idea of how it works, the sections below dive into creating a sample app. 

In Xcode 5.0, storyboards are enabled by default. Creating a new project therefore produces a ViewController class and two storyboard files, one for iPhone and another for iPad. Adding controls to a storyboard is similar to working with xib files: just drag and drop. The official terminology for a view controller in a storyboard is a scene. On iPhone, only one scene is visible at a time; on iPad, multiple scenes can be visible at once, e.g. in a master-detail layout. 

Next, I decided to add a button to the view controller as a test. This was as easy as dragging and dropping the component. After doing this, the UI looked something like below. 

In a xib-based application, the MainWindow.xib file contained a top-level UIWindow object, a reference to the app delegate, and one or more UIViewControllers. When the app's UI is put in a storyboard, MainWindow.xib is not used. 

When a storyboard application is built, the application delegate must inherit from UIResponder and must have a UIWindow property. The Info.plist file declares the main storyboard file name under the key UIMainStoryboardFile; the default one Xcode creates is Main. When this setting is present, UIApplication loads the named storyboard automatically, instantiates the first UIViewController from that storyboard, and puts its view into a new UIWindow. 

Now, to add a tab bar controller, just drag and drop one from the object library onto the storyboard. After doing this, the existing view controller with the button still remained the launch point. The tab bar controller comes with two view controllers by default. Note that the original view controller had an arrow indicating it is the initial view controller; this can be changed by dragging the arrow to the tab bar controller. 

Note that even though two view controllers were created in the storyboard, no corresponding UIViewController class appeared in the source file list. If needed, a UIViewController subclass can be created and attached by selecting the controller in the storyboard and setting its custom class name. 
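For reference, a controller defined in a storyboard can also be instantiated in code (the storyboard name and identifier below are assumptions for the example):

```objc
UIStoryboard *storyboard = [UIStoryboard storyboardWithName:@"Main" bundle:nil];

// "SettingsVC" must match a Storyboard ID set in the identity inspector.
UIViewController *settingsVC =
    [storyboard instantiateViewControllerWithIdentifier:@"SettingsVC"];
[self presentViewController:settingsVC animated:YES completion:nil];
```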

references: 
http://www.raywenderlich.com/50308/storyboards-tutorial-in-ios-7-part-1

Including XMPP framework in a sample app

In the iPhone XMPP sample app included in the framework, the XMPP group contains files and folders that can be easily copied to another project. The folders include Authentication, Categories, Core, Extensions, Utilities, and Vendor. There is also an XMPPFramework.h file, which basically includes the other framework and header files. To integrate the XMPP framework into the sample app, follow the steps below. 

Copy the above folders into an XMPP folder inside the XMPP sample app folder. Then drag and drop the XMPP folder into the sample app project to create an XMPP folder group. 
The application should also link the required frameworks such as CFNetwork, CoreData, SystemConfiguration, CoreLocation, and Security.

The rest of the job is to set up the XMPP core and implement the delegate callbacks as necessary. The sample app project files that demonstrate the integration can be found here. To run the sample app, just follow the steps below. 

Sample App project can be downloaded from: 
https://drive.google.com/file/d/0B0ZgwdHsnw1bYVh2c1lvNmxucW8/edit?usp=sharing

1. Extract and launch the XMPPSample.xcodeproj file 
2. Open AppDelegate.h and change JABBER_USER_NAME & JABBER_PASSWORD to a valid jabber service user name and password, e.g. a Google Talk ID and password. 
3. Run the sample and wait 2-3 seconds; it brings up the roster with images and presence status, categorised into UITableView sections. 

A screenshot of the running sample with my gtalk id can be seen at: 
https://drive.google.com/file/d/0B0ZgwdHsnw1bZU5ITVdxUjUzSjA/edit?usp=sharing

Note: The sample is very basic and may not show any errors if the jabber user name or password is wrong. The sample is inspired by the iPhoneXMPP sample available in the XMPPFramework project. A new sample was created just to demonstrate how easy it is to integrate the framework into a new project. As it turned out, it is pretty easy. 
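The core of such an integration is roughly the following (a sketch based on XMPPFramework's stream API; the JID, host, and password are placeholders, and error handling is omitted):

```objc
// Somewhere during app startup, e.g. in the app delegate:
- (void)setupStream
{
    self.xmppStream = [[XMPPStream alloc] init];
    self.xmppStream.myJID = [XMPPJID jidWithString:@"user@gmail.com"];
    self.xmppStream.hostName = @"talk.google.com";

    // Receive delegate callbacks on the main queue.
    [self.xmppStream addDelegate:self delegateQueue:dispatch_get_main_queue()];

    NSError *error = nil;
    [self.xmppStream connectWithTimeout:XMPPStreamTimeoutNone error:&error];
}

- (void)xmppStreamDidConnect:(XMPPStream *)sender
{
    // The password would come from the app's configuration.
    [sender authenticateWithPassword:@"password" error:nil];
}

- (void)xmppStreamDidAuthenticate:(XMPPStream *)sender
{
    // Announce availability so contacts see us online and presence flows in.
    [sender sendElement:[XMPPPresence presence]];
}
```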

Saturday, September 20, 2014

XMPP learning - Framework Main components



The XMPPFramework project started out in 2008 as an implementation of RFC 3920. Common features layered on top of XMPP include rosters, capabilities, and hundreds of XEPs, but this initial version had only a handful of features such as messaging, presence and IQ. Though the core was written in Objective-C, it took contributions from many developers to cover the full functionality of XMPP. 

The framework has two parts 

1. The xmpp core
2. The extensions (roster, XEPs, optional supporting utilities, etc.)

The xmpp core is an implementation of RFC 3920. An important point: XMPP should not be confused with chat. XMPP is a generic protocol that can be used for many purposes; companies use it for tasks such as home automation and delivering code-blue alarms to nurses in hospitals. XMPP stands for eXtensible Messaging and Presence Protocol. 

XMPP core 
The main files in the core are 
XMPPStream, XMPPParser, XMPPJID, XMPPElement, XMPPIQ, XMPPMessage, XMPPPresence, XMPPModule, XMPPLogging, XMPPInternal. 

The heart of the framework is the XMPPStream class. This is the class the application primarily interacts with, and the class that all extensions and custom code plug into. 
XMPPParser is an internal class used by XMPPStream. 

XMPPJID provides an immutable JID (Jabber Identifier) implementation. It supports parsing JIDs and extracting the various parts of a JID in various forms. It conforms to the NSCopying protocol so that JIDs may be used as keys in an NSDictionary, and it also conforms to the NSCoding protocol. 
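For example (the JID string here is made up):

```objc
// Parse a full JID and pull out its parts.
XMPPJID *jid = [XMPPJID jidWithString:@"alice@example.com/iphone"];

NSString *user     = [jid user];      // @"alice"
NSString *domain   = [jid domain];    // @"example.com"
NSString *resource = [jid resource];  // @"iphone"
NSString *bare     = [jid bare];      // @"alice@example.com"

// Because XMPPJID conforms to NSCopying, it can be used directly as a key.
NSMutableDictionary *rosterInfo = [NSMutableDictionary dictionary];
rosterInfo[jid] = @"available";
```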

XMPPElement is the base class for the 3 primary XMPP elements: XMPPIQ, XMPPMessage, and XMPPPresence. 

XMPPModule provides a foundation class for optional, pluggable extensions. If a developer writes application-specific code, it can be registered with the stream and declared to receive events. Whether implementing a standard XEP or making application-specific extensions pluggable, the developer builds atop XMPPModule. 

XMPPLogging provides a very flexible, fast and powerful logging framework. 

XMPPInternal covers internal details of the core and various advanced low-level extensions. 


Experiments with XMPP framework

If we do a search on GitHub, we can see a number of XMPP frameworks available. robbiehanson's framework is the one I chose to analyse first, as it was the first one listed in the search results. 

Note that we won't get the download link without logging into the Git repository. 

Once logged in, the code can be downloaded directly. The project has some dependencies, which are explained below. 
Even though the framework has several dependencies, they don't need to be checked out separately; they are downloaded when the XMPPFramework is checked out, and the dependency files are available in the Vendor folder. 

The first dependency is CocoaLumberjack, the logging framework used in the project. Lumberjack doesn't have any sub-dependencies. 

The next dependency is CocoaAsyncSocket, the low-level networking code used by the framework; again, this is included in the Vendor folder. It requires the CFNetwork framework and also Apple's Security framework. 

The next dependency is KissXML. Since Apple did not include NSXML in iOS, KissXML serves as the replacement. This again is available inside the Vendor folder. 
KissXML uses libxml2 internally. 

The 4th dependency is libidn, whose purpose is to encode and decode internationalized domain names. The library contains a generic stringprep implementation; profiles for Nameprep, iSCSI, SASL, XMPP and Kerberos V5 are included, as are Punycode and ASCII-Compatible Encoding (ACE). libidn is a static library, compiled as a fat library covering many architectures. When compiling, the linker includes only the portions of the code actually used, even though the size of the library file itself remains the same. 

Since we are preparing a new project, the following folders need to be added: 

Authentication
Categories 
Core 
Utilities 

In addition, we need to add libresolv.dylib, which contains the DNS resolution functions. 

That's pretty much it. The project files contained within the folder compiled immediately without any hassle. If the Jabber ID is given as a Gmail ID and the password as the Gmail password, it fetches the list of users and lists their presence information. 




Friday, September 12, 2014

Dissecting the ImageInverter Extension WWDC sample

In this sample, the main view controller AAPLImageShareVC.m uses a storyboard and displays an image. There is a share button; the code run when it is tapped is below.

UIActivityViewController *activityController = [[UIActivityViewController alloc] initWithActivityItems:@[[imageView image]] applicationActivities:nil]; 
UIActivityViewController was discussed in detail in the AirDrop programming post. It basically shows the list of applications that can handle the data. With iOS 8.0, extensions also get listed here. 

When an extension is selected, the image is passed as an extension item to the invoked extension. 

The code in the extension is like below 

The UIViewController has an extensionContext, and we can get the extension item from it like below: 

NSExtensionItem *imageItem = [self.extensionContext.inputItems firstObject];
// now get the item provider from the extension object

NSItemProvider *imageItemProvider = [[imageItem attachments] firstObject];

Now we can check whether the item provider conforms to the image UTI:

[imageItemProvider hasItemConformingToTypeIdentifier:(NSString *)kUTTypeImage]

imageItemProvider has a method loadItemForTypeIdentifier:options:completionHandler: which loads the image from the item provider and delivers it in a completion block. 
Once the image is retrieved, it can be set on the view so that the extension displays it. 

Now, when the extension is done, the application can post the data back to the caller of the extension as below. 

NSExtensionItem *extensionItem = [[NSExtensionItem alloc] init];
[extensionItem setAttributedTitle:[[NSAttributedString alloc] initWithString:@"Inverted Image"]];
[extensionItem setAttachments:@[[[NSItemProvider alloc] initWithItem:[imageView image]
                                               typeIdentifier:(NSString *)kUTTypeImage]]];
[self.extensionContext completeRequestReturningItems:@[extensionItem] completionHandler:nil];

To cancel without performing any action, the code below can be used: 


[self.extensionContext cancelRequestWithError:[NSError errorWithDomain:@"ImageInverterErrorDomain" code:0 userInfo:nil]];


Monday, September 8, 2014

Using ANDLineChartView

ANDLineChartView is an easy-to-use view-based class for displaying animated line charts. The source code is available on GitHub, and the setup is a bit different compared to other chart SDKs; it can be cracked open as described below. 
As per the instructions, we need to run pod install to install the Podfile dependencies. But this resulted in the message below: 
-bash: pod: command not found 
The pod utility was apparently not installed. To resolve this, CocoaPods needs to be installed. 
Even though there is a previous blog post about installing CocoaPods, it was almost forgotten since no real-life samples were done with it. Considering this a good opportunity, I looked at the instructions on the CocoaPods site (see references); the summary is below. 
In short, CocoaPods uses a Podfile, a text file listing all the dependencies. CocoaPods resolves the dependencies between libraries, fetches the resulting source code, then links it together in an Xcode workspace to build the project. 

Installation of the cocoa pods is like below. 

CocoaPods is built with Ruby and is installable with the default Ruby available on OS X, using the gem command below: 

sudo gem install cocoapods 

This seemed to hang for a long time, and finally came back with the error below: 

ERROR:  Error installing cocoapods:
activesupport requires Ruby version >= 1.9.3.

One blog post mentioned that executing the statement below could resolve the issue: 

rvm install 1.9.3

However, doing this resulted in "rvm: command not found". 

The two steps below, which worked for most people, were tried next: 

rm -rf ~/.rvm 
curl -L https://get.rvm.io | bash -s stable 

But this did not fully resolve the problem, and had to do the below as well. 

source ~/.rvm/scripts/rvm 
type rvm | head -n 1 

the output was 

rvm is a function 
The suggestion is to add source ~/.rvm/scripts/rvm to the ~/.bash_profile file; for the time being this was not done. 

Now trying again with the command rvm install 1.9.3

This produced a system-memory-not-available message, as the hard disk was running low on space, and resulted in the issue below: 

Error running 'requirements_osx_port_libs_install apple-gcc42 libyaml libksba',
showing last 15 lines of /Users/retheesh/.rvm/log/1410182167_ruby-1.9.3-p547/package_install_apple-gcc42_libyaml_libksba.log
To report a bug, follow the instructions in the gu

Sunday, September 7, 2014

Streaming


Streaming is a method of serving or delivering video and audio content over the internet. The need to accommodate various device screen dimensions has complicated the streaming workflow. Adaptive bitrate streaming offers viewers different quality levels on different devices. The encoder creates multiple versions of the video, which are then delivered to the server. When the viewer first clicks play, the server delivers a manifest file, containing information about the available streams, to the player. Based on the player's feedback and bandwidth, the server switches stream quality every few seconds to provide a steady, non-stop playback experience. To deliver the streams to various devices, one would typically need separate servers and containers. 

Some smart solutions use one streaming server and one transcoder. The transcoder engine creates different qualities of streaming video, and the streaming engine repackages them into different streaming formats. Such a streaming engine needs only one input, which could be a software or hardware HD audio/video source. The major encoding options include RTMP, which is free software, and enterprise-grade MPEG-TS hardware. An encoding engine can deliver to a variety of endpoints such as Flash, HLS (HTTP Live Streaming) or Silverlight. Some products also offer a DVR add-on so that users can pause, rewind, or resume live streams, which is called a catch-up TV service. This basically gives a time-shifted viewing experience. 

Another option is to stream from IP cameras that use H.264 encoding. The server can re-broadcast to viewers on any screen. A transcoder add-on can then handle non-H.264 cameras to create single or adaptive bitrate H.264 streams consumable on any screen. 


Thursday, September 4, 2014

Creating the first Mac application

The basic concepts are below: 

- The starting point is the same as for an iOS application: File -> New Project, then select OS X and Cocoa Application. 
- Save the project and just run it; it shows an empty Mac application. 

- One difference between iOS and Mac applications is that the window is fully resizable. 
- A Mac application can have more than one window, and windows can be minimised, etc. 

The empty view that shows up when the app runs is a window. 

Now it is time to add views to it. To do this, choose File / New File and give a name such as FirstAppViewController. 
If the controller is created with an associated xib file, components can simply be dragged into the xib file.

Even though we created a new controller, it doesn't get added to the window automatically. Code like the below needs to be added to the app delegate:

self.firstAppViewController = [[FirstAppViewController alloc] initWithNibName:@"FirstAppViewController" bundle:nil];
[self.window.contentView addSubview:self.firstAppViewController.view];
self.firstAppViewController.view.frame = ((NSView *)self.window.contentView).bounds;

After this, the application showed the window like below! 




Wednesday, September 3, 2014

CIF (Common Intermediate Format)

This is also known as FCIF (Full CIF). It is a format used to standardise the horizontal and vertical resolutions, in pixels, of YCbCr sequences in video signals, commonly used in video teleconferencing systems. It was first proposed in the H.261 standard. 

The CIF image sizes were specifically chosen to be multiples of macroblocks (i.e. 16x16 pixels) because of the way discrete cosine transform (DCT) based video compression and decompression is handled. So, for example, a CIF image (352x288) consists of 22x18 macroblocks.

SIF (Source Input Format) is practically identical to CIF, but comes from MPEG-1 rather than the ITU standards. SIF on 525-line (NTSC) systems is 352x240, and on 625-line (PAL) systems it is 352x288, identical to CIF. SIF and 4SIF are commonly used in video conferencing systems. 

DCIF means Double CIF and was introduced as a compromise between CIF and 4CIF that is more balanced (in terms of horizontal and vertical resolution) and better suited to common CCTV equipment. 


What is Facebook App Center


This is a Facebook initiative to let users find social apps. The App Centre is available on both the iOS and Android platforms. Canvas apps, as well as mobile and web apps that use Facebook Login, can be listed. 

To have an app appear on the Facebook page, the developer needs to create an app detail page. Once the app is created, it will be available in Facebook search. If the app gets a good number of ratings, it becomes eligible for listing in the App Centre. 

The main eligibility criterion is the technique used to build the app. The app must be one of the following: 

- A Canvas app 
- An iOS or Android app that uses the Facebook Login SDK for iOS or Android
- An app that is a website or mobile website that uses Facebook login

There are a number of guidelines for creating such an application; they are listed in the link mentioned in the references section. 

The review process is similar to the iOS app review, with the difference that a Facebook app needs to qualify via enough ratings to get picked up for the approval process. Submission can happen only after enough ratings have been obtained. 

Even if the app is rejected during the approval process, it will still be searchable and findable on Facebook. 

There are many benefits for app developers if the app is listed in the App Centre. Some of these are below: 

- user feedback on the app

- streamlined in-app purchases, etc. 

References: