Saturday, February 28, 2015

WebSocket - An Initial Investigation

Thought to try something with WebSocket, and landed on this page: https://github.com/elabs/mobile-websocket-example/blob/master/README.md

Downloaded all the source files; the project has Android, iOS, and browser clients, plus a Ruby-based server.

My Ruby setup was not complete, it seems, so I first had to install Bundler:

sudo gem install bundler

With that installed, from the server directory I just ran the command "bundle install", which installs the WebSocket server's dependencies:

bundle install
Fetching gem metadata from https://rubygems.org/...........
Fetching version metadata from https://rubygems.org/..
Fetching dependency metadata from https://rubygems.org/..
/Library/Ruby/Gems/2.0.0/gems/bundler-1.8.3/lib/bundler.rb:317: warning: Insecure world writable dir /usr/local in PATH, mode 040777
Installing eventmachine 1.0.3
Installing http_parser.rb 0.5.3
Installing em-websocket 0.4.0


To start the server, just run "ruby server.rb".

Now, connect from the browser client that is provided along with the server code.


administrators-MacBook-Pro-3:server retheesh$ ruby server.rb
[[:initialize]]

[[:receive_data,
  "GET / HTTP/1.1\r\nHost: localhost:8080\r\nConnection: Upgrade\r\nPragma: no-cache\r\nCache-Control: no-cache\r\nUpgrade: websocket\r\nOrigin: null\r\nSec-WebSocket-Version: 13\r\nUser-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/40.0.2214.115 Safari/537.36\r\nAccept-Encoding: gzip, deflate, sdch\r\nAccept-Language: en-US,en;q=0.8\r\nSec-WebSocket-Key: 8EWR5n2GnRKyGWuX5RSDuA==\r\nSec-WebSocket-Extensions: permessage-deflate; client_max_window_bits\r\n\r\n"]]

[[:accepting_ws_version, 13]]

[[:upgrade_response,
  "HTTP/1.1 101 Switching Protocols\r\nUpgrade: websocket\r\nConnection: Upgrade\r\nSec-WebSocket-Accept: VsXGAOPgndpkYnXSFZ7pb9CKbZc=\r\n\r\n"]]

[[:sending_frame, :text, "1 connected"]]

[[:receive_data,
  "\x81\xFE\x00\x84\xAD %\x83\xE5EI\xEF\xC2\x00C\xF1\xC2M\x05\xCE\xC2ZL\xEF\xC1A\n\xB6\x83\x10\x05\xAB\xE0AF\xEA\xC3TJ\xF0\xC5\e\x05\xCA\xC3T@\xEF\x8DmD\xE0\x8Dov\xA3\xF5\x00\x14\xB3\xF2\x11\x15\xDC\x9C\t\x05\xC2\xDDPI\xE6\xFAEG\xC8\xC4T\n\xB6\x9E\x17\v\xB0\x9B\x00\r\xC8\xE5th\xCF\x81\x00I\xEA\xC6E\x05\xC4\xC8CN\xEC\x84\x00f\xEB\xDFOH\xE6\x82\x14\x15\xAD\x9D\x0E\x17\xB1\x9C\x14\v\xB2\x9C\x15\x05\xD0\xCCFD\xF1\xC4\x0F\x10\xB0\x9A\x0E\x16\xB5"]]

References: 

Android - Relative Layout

Relative layout is a view group that displays child views in relative positions. The position of each view can be specified relative to its siblings or to the parent.
The reference documentation on this can be found at the link below: http://developer.android.com/guide/topics/ui/layout/relative.html
There are many layout parameters available; they are listed at http://developer.android.com/reference/android/widget/RelativeLayout.LayoutParams.html

Below is a sample which shows this.

The tv_sample is made to appear to the right of the tv_helloworld.

Note that we cannot specify a circular dependency. For example, if one view is already positioned relative to tv_sample, we cannot also give it android:layout_toLeftOf="@+id/tv_sample", as that would become a circular dependency.
With the below XML, the screenshot from the device was like this below:

<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools" android:layout_width="match_parent"
    android:layout_height="match_parent" android:paddingLeft="@dimen/activity_horizontal_margin"
    android:paddingRight="@dimen/activity_horizontal_margin"
    android:paddingTop="@dimen/activity_vertical_margin"
    android:paddingBottom="@dimen/activity_vertical_margin" tools:context=".MainActivity">

    <TextView
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:id="@+id/tv_username"
        android:layout_alignParentLeft="true"
        android:layout_alignParentStart="true" />

    <EditText
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:id="@+id/et_username"
        android:layout_alignParentRight="true"
        android:layout_alignParentTop="true"
        android:hint="Enter Username"/>

    <TextView
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:id="@+id/tv_password"
        android:layout_below="@+id/tv_username"
        android:paddingTop="25dp"
        android:layout_alignParentLeft="true" />

    <EditText
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:id="@+id/et_password"
        android:layout_alignParentRight="true"
        android:hint="Enter Password"
        android:layout_below="@+id/et_username"/>

    <Button
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Login"
        android:id="@+id/button"
        android:layout_below="@+id/et_password"
        android:layout_alignParentRight="true"/>

</RelativeLayout>



Screenshots: 

 


As many developers say, it is difficult to build any kind of complicated UI with relative layout.
I was trying to specify the layout widths with this layout, but have been unable to get it working so far.

References: 

Thursday, February 26, 2015

iOS Auto Layout - Yet another example

Thought to create an application having a browser-like view, with an address field, a Go button, and a web view. Below is how the final finished product looks.


Learnings from this are:

When I tried to give spacing between the text field and the button, it gave warnings. That was because only the height constraint was given for the button, not the width constraint. This appears to have confused the layout engine and was making the items go awry.



references:
self


iOS 8.0 Auto layout learning

For this demo, the project is named SizeClassDemo, and the size classes property is left enabled.
Dragged and dropped a label and two buttons onto the main storyboard, which also uses the size classes.
This looked fine in the storyboard, but when previewing at the various screen resolutions, the right buttons did not quite look good.
Also dragged in a button and a text view. The view looked like below:

Next, with the Label selected, pressed on the I-beam.

A few notes from experience while doing this:
- We can see all the constraints added to a UI component using the second button from the right in the right-side tool area
- By default, the storyboard accounts for the status bar and leaves space for it; in landscape mode, the status bar doesn't show up
- There is a 16 px margin on the edges. In order to remove that, place -16 as the left value and +16 as the right value, and add these as constraints
- Whenever we update a constraint, unless we also update the frame using the triangular icon option, it will show a warning in orange

Yay! My first app with constraints is ready.






References: 

Facebook - Webtool for exploring Graph APIs

One of my friends brought to my notice this beautiful tool, which can be used for experimenting with the Graph APIs. The link is given in the references. Just as a query is given via the SDK APIs, it can be entered on this page as well, which shows the JSON responses.



References:
https://developers.facebook.com/tools/explorer/145634995501895/?method=GET&path=me%3Ffields%3Did%2Cname&version=v2.2&

Saturday, February 21, 2015

Android Creating a CustomView from Layout XML file

This is as easy as declaring the ViewGroup in a layout XML file, then creating a class and inflating that layout.

The inflation code is like below.

public class FloatingView extends RelativeLayout {
    private ImageView launcherButtonImgView;
    private ImageView closeButtonImgView;

    public FloatingView(Context context) {
        super(context);
        init();
    }

    public FloatingView(Context context, AttributeSet attrs) {
        super(context, attrs);
        init();
    }

    public FloatingView(Context context, AttributeSet attrs, int defStyle) {
        super(context, attrs, defStyle);
        init();
    }

    private void init() {
        inflate(getContext(), R.layout.floating_view_layout, this);
        this.launcherButtonImgView = (ImageView)findViewById(R.id.thumbnail);
        this.closeButtonImgView = (ImageView)findViewById(R.id.thumbnail2);
    }
}



The entire source code is available at

https://drive.google.com/folderview?id=0B0ZgwdHsnw1bfllsWjZNZWcwTC13Vmk3T09tbHRSd1hUXzFST1Q1cFptRmk1ZjR5eWdvYk0&usp=sharing

References:
http://trickyandroid.com/protip-inflating-layout-for-your-custom-view/
http://stackoverflow.com/questions/16022615/how-to-load-programmatically-a-layout-xml-file-in-android

Friday, February 20, 2015

Android play Short Sound Clips


As part of an app, I had to play a short sound. One possibility is to use SoundPool.

Below is a code snippet for this:

SoundPool soundPool;
HashMap<Integer, Integer> soundPoolMap;
int soundID = 1;
Button sound1;

@Override
public void onCreate(Bundle savedInstanceState)
{
    super.onCreate(savedInstanceState);
    setContentView(R.layout.main);
    soundPool = new SoundPool(4, AudioManager.STREAM_MUSIC, 100);
    soundPoolMap = new HashMap<Integer, Integer>();
    soundPoolMap.put(soundID, soundPool.load(this, R.raw.click, 1));
    sound1 = (Button) findViewById(R.id.Beaver);
    sound1.setOnClickListener(new View.OnClickListener()
    {
        public void onClick(View v)
        {
            AudioManager manager = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
            float curVolume = manager.getStreamVolume(AudioManager.STREAM_MUSIC);
            float maxVolume = manager.getStreamMaxVolume(AudioManager.STREAM_MUSIC);
            float leftVolume = curVolume / maxVolume;
            float rightVolume = curVolume / maxVolume;
            int priority = 1;
            int noLoop = 0;
            float normalPlaybackRate = 1f;
            soundPool.play(soundID, leftVolume, rightVolume, priority, noLoop, normalPlaybackRate);
        }
    });
}
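A side note on the left/right volume values above: SoundPool.play expects volumes in the 0.0-1.0 range, while getStreamVolume/getStreamMaxVolume return ints, so the snippet stores them in floats before dividing; dividing the raw ints would truncate to 0 or 1. A plain-Java sketch of the normalization (class and method names are mine):

```java
public class VolumeMath {
    // Normalize a stream volume into the 0.0-1.0 range that SoundPool.play expects.
    static float normalizedVolume(int current, int max) {
        if (max <= 0) {
            return 0f; // guard against division by zero
        }
        // Cast before dividing; (current / max) on ints would truncate.
        return (float) current / (float) max;
    }

    public static void main(String[] args) {
        System.out.println(normalizedVolume(5, 10)); // prints 0.5
    }
}
```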

References:

Thursday, February 19, 2015

Android - Showing Overlay Window on top of all activities

This can be achieved using SYSTEM_ALERT_WINDOW. As per the API documentation, this should be used by only very few apps;
these windows are intended for system-level interaction.

Code looks something like this 

public class OverlayViewService extends Service
{
    private WindowManager windowManager;
    private ImageView overlayViewImage;

    @Override
    public IBinder onBind(Intent intent)
    {
        return null;
    }

    @Override
    public void onCreate()
    {
        super.onCreate();
        windowManager = (WindowManager) getSystemService(WINDOW_SERVICE);
        overlayViewImage = new ImageView(this);
        overlayViewImage.setImageResource(R.drawable.overlay_image);
        WindowManager.LayoutParams params = new WindowManager.LayoutParams(
                WindowManager.LayoutParams.WRAP_CONTENT,
                WindowManager.LayoutParams.WRAP_CONTENT,
                WindowManager.LayoutParams.TYPE_PHONE,
                WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE,
                PixelFormat.TRANSLUCENT
        );

        params.gravity = Gravity.TOP | Gravity.LEFT;
        params.x = 0;
        params.y = 100;
        windowManager.addView(overlayViewImage, params);
    }

    @Override
    public void onDestroy()
    {
        super.onDestroy();
        if (overlayViewImage != null) windowManager.removeView(overlayViewImage);
    }
}

Now start the service like below:

startService(new Intent(context, OverlayViewService.class));

References:

Wednesday, February 18, 2015

iOS - Google Chrome browser launch URLs


Looks like it is possible to display the app name as the navigation back button if the Chrome browser is installed on the device.

Did a test in the My TWC app by passing the URL scheme like below:

    [[UIApplication sharedApplication]openURL:url2];


This opens up the google.com URL in the Chrome browser and shows the app name (passed in the x-source parameter of the scheme) in the browser, like in the screenshot here.


We can also utilise this feature in the following way:

if (chromeInstalled) // checked using the canOpenURL API
{
    // url2 here is the googlechrome-x-callback:// form of the URL
    [[UIApplication sharedApplication] openURL:url2];
}
else
{
    NSURL *url2 = [NSURL URLWithString:@"http://www.google.com"];
    [[UIApplication sharedApplication] openURL:url2];
}

References
https://developer.chrome.com/multidevice/ios/links

Android Enable Offline Mode Speech Recognition


It looks like not all devices support offline-mode speech input.

On my phone, I could find these settings under

Settings -> Voice Search -> Offline Speech Recognition

But even after enabling offline speech recognition, it was not working well.

References:

Tuesday, February 17, 2015

Android Speech to Text

Android comes with default inbuilt support for speech to text. Below are a few simple steps to include this feature:

Step 1: Create a recognizer Intent, with the action and extras such as:

ACTION_RECOGNIZE_SPEECH - takes the user's speech input and returns it to the same activity
LANGUAGE_MODEL_FREE_FORM - consider the input as free-form English
EXTRA_PROMPT - a text prompt shown to the user when it is time to speak


Step 2: Receiving the speech response
Once the speech input is done, we have to catch the response in onActivityResult and take the appropriate action.

private void promptSpeechInput()
{
    Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
    intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL, RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
    intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE, Locale.getDefault());
    intent.putExtra(RecognizerIntent.EXTRA_PROMPT, getString(R.string.speech_prompt));
    try
    {
        startActivityForResult(intent, REQ_CODE_SPEECH_INPUT);
    }
    catch (ActivityNotFoundException ex)
    {
        ex.printStackTrace();
    }
}

@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data)
{
    super.onActivityResult(requestCode, resultCode, data);
    switch (requestCode)
    {
        case REQ_CODE_SPEECH_INPUT:
        {
            if (resultCode == RESULT_OK && data != null)
            {
                ArrayList<String> result = data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS);
                txtSpeechInput.setText(result.get(0));
            }
            break;
        }
    }
}


References:

Monday, February 16, 2015

Software As a Service (SaaS)

SaaS is a software distribution model in which applications are hosted by a vendor or service provider and made available to customers over a network, usually the internet.

SaaS is becoming an increasingly relevant delivery model as the underlying technologies that support web services and service-oriented architecture (SOA) mature and newer development approaches, such as Ajax, become popular. Meanwhile, broadband service has become increasingly available to support user access from more areas around the world.

SaaS is closely related to the ASP (Application Service Provider) and on-demand computing software delivery models. IDC identifies two slightly different delivery models for SaaS.

Benefits of SaaS include

- Easier Administration
- Automatic updates and patch management
- Compatibility: all users will have the same version of the software
- Easier collaboration, for some reason
- Global accessibility

The traditional model of software distribution, in which software is purchased and installed on personal computers, is sometimes referred to as software as a product. 


References:

Saturday, February 14, 2015

iOS 8.0 Auto layout applications


In iOS 8.0, a concept called the universal storyboard was introduced, which means we can create the same storyboard for iPhone and iPad.
We can turn this off by unchecking the size classes option.

One nice thing noticed about this is the Preview feature in the Assistant editor, which lets the developer see how the UI looks at different resolutions.
This can be opted into by clicking "Show the Assistant Editor"; by default it is set to Automatic, so set it to Preview instead.
We can add more simulators to it and see how things look across device models and screen resolutions.

For the tutorial's purposes: drag a label onto the storyboard, and to align it to the centre, Ctrl-drag to the 4 edges of the screen and select the trailing edge to the container margin.
Likewise, do the same for all the edges.

However, this seems to be equivalent to centring vertically and horizontally in the container.

References:

Friday, February 13, 2015

Android: Running Camera2Video Application


Wanted to further investigate how to record video via the phone's video recorder. The Camera2Video application is one provided with the Android samples, and it is already an Android Studio project. However, trying to run it, I did not seem to have the Android SDK Build-tools; it opened up the Android SDK tools and installed them.

Installing Archives:
  Preparing to install archives
  Installing Android SDK Build-tools, revision 21.1.1
    Installed Android SDK Build-tools, revision 21.1.1
  Done. 1 package installed.

This sample uses a min SDK of 21, and I have only an older KitKat device, so it was not possible to run it.

references:

Tuesday, February 10, 2015

Android Video Recording


The intent was to provide a camcorder view in an activity. For this, created an activity first, and then in it a CamcorderView which shows the camera preview and can record on a tap action.

The camcorder view works with Android SurfaceView. The class declaration is something like below:

public class CamcorderView extends SurfaceView implements SurfaceHolder.Callback
{
    MediaRecorder recorder;
    SurfaceHolder holder;

    public CamcorderView(Context context, AttributeSet attrs)
    {
        super(context, attrs);
        holder = getHolder();
        holder.addCallback(this);
        recorder = new MediaRecorder();
        recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        recorder.setVideoSource(MediaRecorder.VideoSource.DEFAULT);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP);
    }

    public void surfaceCreated(SurfaceHolder holder)
    {
        if (recorder != null)
        {
            recorder.setOutputFile(outputFile);
            recorder.setPreviewDisplay(holder.getSurface());
            try
            {
                recorder.prepare();
            }
            catch (IOException e)
            {
                e.printStackTrace();
            }
        }
    }

    public void startRecording()
    {
        recorder.start();
    }

    public void stopRecording()
    {
        recorder.stop();
        recorder.release();
    }
}


However, though the video and audio were captured without problems, when playing the recorded file the picture came out green and no video was visible, though the audio was clear. Need to investigate the reason for this.

Finally, found the below code to be working:

recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.DEFAULT);
recorder.setVideoSource(MediaRecorder.VideoSource.DEFAULT);
CamcorderProfile cpHigh = CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH);
recorder.setProfile(cpHigh);
recorder.setOutputFile(outputFile);
recorder.setMaxDuration(500000); // 500 seconds
recorder.setMaxFileSize(10000000); // approximately 10 megabytes

References
self

Android View vs SurfaceView - a deep dive


View 

If the application doesn't require a significant amount of processing or frame-rate speed (perhaps a chess game, a snake game, or another slowly-animated application), then the developer can consider creating a custom view component and drawing with a Canvas in View.onDraw.

This can be done by extending the View class and defining the onDraw() callback method. This method is called by the Android framework, and only as necessary. Each time the application is ready to draw something, it should call the invalidate() method.

One important thing to note: in order to call invalidate from a thread other than the application's UI thread, the application should use the postInvalidate() method instead.

Surface View
The SurfaceView is a special subclass of View that offers a dedicated drawing surface within the view hierarchy. The aim is to offer this drawing surface to an application's secondary thread, so that the application isn't required to wait until the system's view hierarchy is ready to draw. Instead, a secondary thread that has a reference to the SurfaceView can draw to its own Canvas at its own pace.

To begin using SurfaceView, below are the tasks involved:

1. Create a class that extends SurfaceView.
2. Implement SurfaceHolder.Callback in this class (this interface gives callbacks for when the surface is created, changed, or destroyed).
3. Define a thread class inside this new class that can be used for updating the SurfaceView.
4. Instead of handling the surface object directly, the application should handle it via a SurfaceHolder. The application can get hold of this by calling the getHolder() method, then add a listener using the addCallback() method.

In order to draw to the surface's Canvas from the secondary thread, the application must pass the thread the SurfaceHolder, call lockCanvas(), draw onto the canvas, and once done with drawing, call the unlockCanvasAndPost() method, passing the canvas object.

References: 


Monday, February 9, 2015

Android Canvas & Drawables Basics


The Android framework provides a set of 2D-drawing APIs that allow an application to render its own custom graphics onto a canvas, or to modify existing views to customise their look and feel. There are two ways to draw 2D graphics:

1. Draw graphics or animations into a View object from the layout. In this manner, the drawing of graphics is handled by the system's normal view-hierarchy drawing process; the application just defines the graphics to go inside the view.

2. Draw the application graphics directly onto a Canvas. This way the application itself calls the appropriate class's onDraw() method (passing the canvas), or one of the Canvas draw() methods. In doing this, the application is also in control of any animation.

Option 1 is suitable when we want to draw simple graphics that do not change dynamically and are not part of a performance-intensive game.

Option 2 is suitable when the application needs to re-draw itself regularly. Applications such as video games should draw onto their own canvas. There is more than one way to draw onto the application's own canvas:

- In the same thread as the UI Activity: create a custom View component in the layout, call invalidate(), and then handle the onDraw() callback.

- On a separate thread: manage a SurfaceView and perform draws to its canvas as fast as the thread is capable (the application does not need to call invalidate()).

References: 
http://developer.android.com/guide/topics/graphics/2d-graphics.html

Difference Between Android SurfaceView and regular View

The main difference between SurfaceView and regular View is that SurfaceView can be updated by a background thread. 


- A SurfaceView has a dedicated surface buffer, while all Views share one surface buffer that is allocated by ViewRoot. In other words, a SurfaceView costs more resources.

- A SurfaceView cannot be hardware accelerated (as of JB 4.2), while 95% of operations on a normal View are HW accelerated using OpenGL ES.

- More work must be done to create a customised SurfaceView: you need to listen for the surfaceCreated/surfaceDestroyed events, create a render thread, and, more importantly, synchronise the render thread and the main thread. To customise a regular View, however, all we need to do is override the onDraw method.

- The timing of updates is different. The normal View update mechanism is constrained and controlled by the framework: the application calls view.invalidate() in the UI thread or view.postInvalidate() from another thread to indicate to the framework that the view should be updated. However, the view won't be updated immediately; it waits until the next VSYNC event arrives. (An easy way to think of VSYNC is as the mechanism that achieves better smoothness.) A SurfaceView, by contrast, can be rendered at any time we wish. However, I can hardly tell if that is an advantage, since the display is also synchronised with VSYNC, as stated previously.

- A SurfaceView may take more resources.

References:

Saturday, February 7, 2015

My First Applescript

Just tried to execute some AppleScript code, which opens the OpenOffice application and starts a slide show.

#!/usr/bin/osascript

-- Name of the presentation
set presentationName to "T-Notifications.pptx"

-- Create a full path the presentation
set myPresentation to (("/Users/rr/Downloads/") & presentationName)

-- launch PowerPoint and bring the application to the front
tell application "OpenOffice"
activate
-- open the presentation in PowerPoint
open myPresentation
-- run the presentation, looping until the Mac is shut down
-- set «class lUSp» of «class SSSt» of «class AAPr» to true
-- «event sPPTRSsH» «class SSSt» of «class AAPr»

end tell

Just copy this into a text editor and save it as a .scpt file. Open it in the AppleScript Editor and run it. The script looks for the presentation in the Mac's Downloads directory.

References:
self.

Thursday, February 5, 2015

is 911 & 1911 different ?

Looks like they are indeed different: 1911 is for something like when a firearm is involved, and a 1911 call gets the highest priority.

References:

Wednesday, February 4, 2015

Adapter Design Pattern

The adapter design pattern is one of the structural design patterns and is used so that two unrelated interfaces can work together. The object that joins these unrelated interfaces is called the adapter.

A good example is a wall socket and a mobile charger. A regular wall socket produces 120 V, but the mobile charger expects a much lower voltage, so the charger acts as an adapter between the two.

Normally there will be a default implementation of a functionality and if a change is required an adapter is created. 

There are two implementations for adapters:

1. One that extends the existing implementation and also implements the adapter interface.
2. One that wraps the existing implementation and also implements the adapter interface.

Both provide the same output.
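As a minimal plain-Java sketch of the two variants (all class names here are mine, carrying on the socket/charger example; the 120 V to 3 V step-down is just an illustrative number):

```java
// Existing implementation: a wall socket producing 120 V.
class WallSocket {
    public int volts() { return 120; }
}

// Target interface the client code expects.
interface Charger {
    int outputVolts();
}

// Variant 1: class adapter - extends the existing implementation
// and implements the adapter (target) interface.
class ClassAdapter extends WallSocket implements Charger {
    public int outputVolts() { return volts() / 40; } // step 120 V down to 3 V
}

// Variant 2: object adapter - wraps the existing implementation
// and implements the adapter (target) interface.
class ObjectAdapter implements Charger {
    private final WallSocket socket;
    ObjectAdapter(WallSocket socket) { this.socket = socket; }
    public int outputVolts() { return socket.volts() / 40; }
}

public class AdapterDemo {
    public static void main(String[] args) {
        Charger a = new ClassAdapter();
        Charger b = new ObjectAdapter(new WallSocket());
        System.out.println(a.outputVolts() + " " + b.outputVolts()); // prints: 3 3
    }
}
```

Both adapters satisfy the same Charger interface and produce the same output.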

Another real-world example may be a Logger. There could be a network logger, a file logger, and a console logger. Which logger to use can be specified in a property file, and using the Java class loader we can load the implementation at run time:

Class aClass = classLoader.loadClass("com.mytest.FileLogger");
LoggerImpl limpl = (LoggerImpl) aClass.newInstance();

References:

Tuesday, February 3, 2015

Audio Recording On MAC

The steps for audio recording on Mac are as simple as below:

1. Create a capture session
2. Begin configuration of the capture session
3. Get the audio device
4. Create a capture input with the audio device
5. Add the created capture device input to the capture session
6. Initialise an AVCaptureAudioDataOutput
7. Add the AVCaptureAudioDataOutput to the capture session
8. Set the sample buffer delegate on the AVCaptureAudioDataOutput
9. Now create an asset writer with a local file path and the type as audio
10. Now create an AVAssetWriterInput with the output audio settings
11. Add the AVAssetWriterInput to the asset writer
12. Now commit the configuration of the capture session
13. Call startRunning on the capture session
14. Call startWriting on the asset writer

The flow sequence diagram for this is like below:



References:
My own test app.
 

Monday, February 2, 2015

Mongo DB - What is it ?



The word mongo was derived from the word humongous. MongoDB is an open-source document database and the leading NoSQL database, written in C++.

Below are few features of Mongo DB:

Document-oriented storage:
JSON-style documents with dynamic schemas offer simplicity and power.

Full Index Support:
Index on any attribute just like we usually do. 

Replication & High availability 
Mirror across LAN and WANs for scale and peace of mind

Auto-Sharding
Scale horizontally without compromising functionality

Querying 
Rich document based queries

Fast - In-Place updates 
Atomic modifiers for contention-free performance 

Map/Reduce
Flexible aggregation and data processing 

GridFS 
Store files of any size without complicating the stack. 

Mongo DB Management service
Manage MongoDB on the cloud infrastructure of developers choice 


References: 

Sunday, February 1, 2015

How Payment Gateways work ?

I was curious to know how to incorporate payment functionality into a website or app, and whether there is any specific mobile API. Below are a few notes about the overall process with payment gateways such as CCAvenue.


CCAvenue is a leading payment gateway service provider, authorised by Indian financial institutions to appoint sub-merchants. CCAvenue seems to accept and validate internet payments via credit card, debit card, net banking, ATM-cum-debit card, mobile payment, and cash card modes from end customers in real time.


Below is an overview of what CCAvenue does.


Also, it seems there are Android and iOS SDKs available which do the same processing on mobile devices as in-app payments.

references:

http://www.ccavenue.com/inapp_payments.jsp

What is Android Lollipop Material Design



Android 5.0 brings material design to Android and provides an expanded UI toolkit for integrating the new design patterns easily into apps.

New 3D views let developers assign a z-level to raise elements off the view hierarchy and cast real-time shadows, even as they move.

Built-in activity transitions take the user seamlessly from one state to another with beautiful animated motion. The material theme adds transitions for activities, including the ability to use shared visual elements across transitions.

Ripple animations are available for buttons, checkboxes, and other touch controls in the app.

Developers can now also define vector drawables in XML and animate them in a variety of ways. Vector drawables scale without losing definition, so they are perfect for single-colour in-app icons.

A new system-managed processing thread called RenderThread keeps animations smooth even when there are delays in the main UI thread.

References: 


Recording in MAC application


In order to get the devices that are acting as input devices, the below method can be used:

[AVCaptureDevice devices]

This returns an array of the devices that are currently connected and available for capture; the array contains all devices that were connected at the time the method was called. The application is also supposed to observe AVCaptureDeviceWasConnectedNotification and AVCaptureDeviceWasDisconnectedNotification, to know when the availability of devices changes.

The array, when the code was run, contained two devices:

__NSArrayI 0x600000033b40>(
,

assuming one is for audio and the other is for video.

Let's say we want to capture audio; in that case, the code will be like below:

//create a capture session
AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
//now we need to begin configuring the capture session
[captureSession beginConfiguration];
//as part of the configuration, add the input device
AVCaptureDeviceInput *audioInput = [[AVCaptureDeviceInput alloc] initWithDevice:audioDevice error:nil];
[captureSession addInput:audioInput];
//now add the output
AVCaptureAudioDataOutput *audioDataOutput = [[AVCaptureAudioDataOutput alloc] init];
[captureSession addOutput:audioDataOutput];
//create an audio queue
dispatch_queue_t dipAudioQueue = dispatch_queue_create("AudioQueue", NULL);
//now ask the AVCaptureAudioDataOutput to call back to the application on this queue. Since the captured output is delivered on the queue, the main thread doesn't get blocked. However, if there is any delay in finishing the didOutputSampleBuffer callback, new samples will be dropped.

[audioDataOutput setSampleBufferDelegate:self queue:dipAudioQueue];