Thursday, August 1, 2019

Combinations of all numbers in List

Recently I was asked this seemingly innocent problem: given an array of numbers, generate all combinations of indices of all possible lengths. For example, if there is an array of length 3, the result should be the combinations of all indices of length 1, 2 and 3. The answer should look like the one below.

{{0},{1},{2},{0,1},{0,2},{1,2},{0,1,2}}

The problem at first seemed extremely simple, but as I got down to writing code for it, it became clear very quickly that it is not that simple a problem. The final solution is not very complex: we start with all the sets of length 1, and then for every higher length we create new sets by prefixing each smaller index to every solution of the previous length. Here is the approach that I finally came up with.
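A minimal sketch of that approach in Java (the class and method names here are illustrative):

import java.util.ArrayList;
import java.util.List;

public class IndexCombinations {

  // Returns all combinations of the indices 0..n-1, of every length from 1 to n.
  public static List<List<Integer>> allCombinations(int n) {
    List<List<Integer>> result = new ArrayList<>();

    // Length-1 combinations: {0}, {1}, ..., {n-1}.
    List<List<Integer>> previous = new ArrayList<>();
    for (int i = 0; i < n; i++) {
      List<Integer> single = new ArrayList<>();
      single.add(i);
      previous.add(single);
    }
    result.addAll(previous);

    // For each higher length, prefix every smaller index to the combinations
    // of the previous length; prefixing only indices smaller than the current
    // first element avoids duplicates.
    for (int length = 2; length <= n; length++) {
      List<List<Integer>> current = new ArrayList<>();
      for (List<Integer> combo : previous) {
        for (int prefix = 0; prefix < combo.get(0); prefix++) {
          List<Integer> extended = new ArrayList<>();
          extended.add(prefix);
          extended.addAll(combo);
          current.add(extended);
        }
      }
      result.addAll(current);
      previous = current;
    }
    return result;
  }

  public static void main(String[] args) {
    // Prints [[0], [1], [2], [0, 1], [0, 2], [1, 2], [0, 1, 2]] for n = 3.
    System.out.println(allCombinations(3));
  }
}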


Wednesday, September 19, 2018

Finally clause in Java

Recently I was asked this question: "Is finally guaranteed to always be called in Java programs? What are the cases when finally will not be called?"

My first answer was that it will always be called. Of course, that was a simplistic answer, and once I thought about it for a while, some more thoughts came to my mind.

  1. In a healthy VM, finally is guaranteed to always be called. At least this is what I thought.
  2. If somebody abruptly terminates the node, the JVM will have no time to call finally.
  3. If somebody kills the JVM with a SIGKILL, the JVM will have no time to call finally.
Now I started thinking: is there a scenario where finally will not be called even if conditions 2 and 3 did not occur?

One of the calls that comes to mind is System.exit(). Its behavior is very similar to abruptly terminating the JVM, so I was keen to understand whether it will call finally. As I looked at the finally specification, I found the following statement.

The finally block always executes when the try block exits. This ensures that the finally block is executed even if an unexpected exception occurs. 
When one looks at System.exit(), the following information is found.

Terminates the currently running Java Virtual Machine. The argument serves as a status code; by convention, a nonzero status code indicates abnormal termination. This method calls the exit method in class Runtime. This method never returns normally.
The key point here is that System.exit() never returns. If it never returns, the try block gets no chance to exit, and if the try block does not exit, the JVM cannot call finally. Finally, as they say, the proof of the pudding is in the eating, so I decided to try it with the latest VM.
public class FinallyCode {

  public static void main(String[] args) {
    try {
      System.out.println("Try is terminating normally.");
    }
    finally {
      System.out.println("Finally is called.");
    }
  }
}

And the output is as below.
Try is terminating normally.
Finally is called.
As we can see, this is the expected behavior: once the normal flow of the try block is complete, the code in finally is called.
The next bit is when there is an exception in the normal flow of try block.

public class FinallyCode {

  public static void main(String[] args) {
    try {
      System.out.println("Try is terminating after throwing NullPointerException.");
      throw new NullPointerException();
    }
    finally {
      System.out.println("Finally is called.");
    }
  }
}

And the output is as below.
Try is terminating after throwing NullPointerException.
Finally is called.
Exception in thread "main" java.lang.NullPointerException
 at FinallyCode.main(FinallyCode.java:6)
We can see that when an exception is thrown in the flow of the try block, finally is still called when the try block exits.
public class FinallyCode {

  public static void main(String[] args) {
    try {
      System.out.println("Try is terminating after calling System.exit().");
      System.exit(0);
    }
    finally {
      System.out.println("Finally is called.");
    }
  }
}

And the output is as below.
Try is terminating after calling System.exit().
Just to be doubly sure, I tried with a status code other than 0 in System.exit.
public class FinallyCode {

  public static void main(String[] args) {
    try {
      System.out.println("Try is terminating after calling System.exit().");
      System.exit(1);
    }
    finally {
      System.out.println("Finally is called.");
    }
  }
}

And the output is as below.
Try is terminating after calling System.exit().
That leaves us with the small matter of what one needs to do when System.exit is called. Java defines a shutdown hook that can call a user-defined function when the VM is terminating.

A shutdown hook is simply an initialized but unstarted thread. When the virtual machine begins its shutdown sequence it will start all registered shutdown hooks in some unspecified order and let them run concurrently. When all the hooks have finished it will then run all uninvoked finalizers if finalization-on-exit has been enabled. Finally, the virtual machine will halt. Note that daemon threads will continue to run during the shutdown sequence, as will non-daemon threads if shutdown was initiated by invoking the exit method.
Once the shutdown sequence has begun it can be stopped only by invoking the halt method, which forcibly terminates the virtual machine.
Once the shutdown sequence has begun it is impossible to register a new shutdown hook or de-register a previously-registered hook. Attempting either of these operations will cause an IllegalStateException to be thrown.
So, how do we define a shutdown hook? Look at the code below.
public class FinallyCode {

  public static void main(String[] args) {
    try {
      Runtime.getRuntime().addShutdownHook(new Thread() {
        public void run() {
          System.out.println("Shutdown hook is called.");
        }
      });
      System.out.println("Try is terminating after calling System.exit().");
      System.exit(5);
    }
    finally {
      System.out.println("Finally is called.");
    }
  }
}
The output is as below.
Try is terminating after calling System.exit().
Shutdown hook is called.

So the learnings from this small experiment are as follows.


  1. When the virtual machine is working normally, finally is guaranteed to be called, unless System.exit is called in the normal flow of execution.
  2. When System.exit is called, since the function never returns, finally is not called.
  3. If the JVM is terminated through SIGKILL or through a hardware reset, finally is not called.

Monday, September 17, 2018

Unique Absolute Numbers

Recently I was asked a question: given an array of sorted integers, count the number of unique absolute values without using any additional space. Here is what I came up with. Comment if you have a better approach. I believe the complexity of this function is O(n).
private static int countAbsoluteNumbers(int[] array) {

  Integer lastNumberFromLeft = null;
  int count = 0;
  int innerIndex = array.length - 1;
  for (int i = 0; i < innerIndex + 1; ++i) {
    if (lastNumberFromLeft == null
        || Math.abs(array[i]) != Math.abs(lastNumberFromLeft)) {
      if (array[i] < 0) {
        lastNumberFromLeft = array[i];
      } else {
        lastNumberFromLeft = -array[i];
      }
      ++count;
      Integer lastNumberFromRight = null;
      for (; innerIndex > i; --innerIndex) {
        if (Math.abs(array[innerIndex]) > Math.abs(lastNumberFromLeft)) {
          if (lastNumberFromRight == null
              || Math.abs(array[innerIndex]) != lastNumberFromRight) {
            ++count;
          }
          lastNumberFromRight = array[innerIndex];
        } else if (Math.abs(array[innerIndex]) != Math.abs(lastNumberFromLeft)) {
          break;
        }
      }
    }
  }
  return count;
}

Friday, July 6, 2018

Cloud Applications: The case for realtime monitoring

Most cloud applications depend on tools like Splunk or its competitors for monitoring. These tools have become an integral part of any cloud application in production. Here I am going to talk about situations where these tools are not sufficient to provide the type of monitoring that you desire.
There are situations when you want to monitor specific workflows. For example, you want to monitor a customer order as it proceeds through different steps, or you want to monitor what a specific customer, or a set of customers, is doing on your applications. You may want to monitor all the requests that are coming in from a particular IP address in real time for debugging purposes.
Most organizations use things like a Trace Id or Tracker Id to trace all the logs that belong to a specific request, but there are issues with that. These ids can be effective for requests that originated through an external interaction like a REST endpoint call or a Kafka message, but you cannot synchronize them with background activities that your system may be performing to complete the workflow.
Tools like Splunk go through a process of log creation and parsing, and it is a batch process, so they are really not very effective for real-time monitoring. I believe that we need a tool that can help us with real-time monitoring in an effective way without becoming an overhead for the application itself. I call these services contextual monitoring services.
The idea of this service is pretty simple: as developers are writing applications, they are adding logs. Depending on what is of interest, the developers could push some of these logs to the contextual monitoring service. The only difference is that these logs are tied to a context, which may be a User Id, Order Id, Partner Id, IP Address or any other identifier that you wish to use. We will not log all the requests through this mechanism, only the small subset of requests that you might be interested in.
The diagram below describes a mechanism through which we can implement a system for real-time monitoring of the applications.


Basic Architecture of Real-time Monitoring
The real-time monitoring system needs to have the following basic functions.
Real-time monitoring use case
We need a capability to create and delete a context in the form of REST calls. We don't want to log all the messages all the time. The messages will only be logged when a context exists for the particular id against which messages are being logged.
We also need to support the case of a context being related to another context. Let's understand it with an example. Take the case of a retail marketplace. When a supplier updates the status of an order as shipped, this information only has an order id as context, but it needs to be tied to a user as well. So when a context for a user id is created, that user's order status is also logged.
The following diagram shows how the context is used: you can create a context, create a relationship, destroy a context and log messages.
Using Context
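To make those operations concrete, a hypothetical client interface for such a contextual monitoring service could look like the sketch below; the names (ContextualMonitor and its methods) are illustrative only, not an existing API.

public interface ContextualMonitor {

  // Creates a context for the given identifier, e.g. a user id or an order id.
  void createContext(String contextType, String contextId);

  // Relates one context to another, e.g. an order id to a user id, so that
  // messages logged against the order also surface for the user.
  void relateContexts(String parentType, String parentId,
                      String childType, String childId);

  // Logs a message only if a context currently exists for the identifier.
  void log(String contextType, String contextId, String message);

  // Destroys the context; subsequent messages for this id are dropped.
  void destroyContext(String contextType, String contextId);
}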

The following diagram describes how the messages are logged.
Logging message
Here we intend to use Kafka as a backend for logging the messages. This gives us the ability to keep the messages in a Kafka queue for a period of time until they are available for our use.
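As an illustration of the logging path, a minimal producer keyed by the context id might look like the sketch below; the topic name "context-logs" and the class name are assumptions, not part of any existing service.

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class ContextLogProducer {

  private final KafkaProducer<String, String> producer;

  public ContextLogProducer(String bootstrapServers) {
    Properties props = new Properties();
    props.put("bootstrap.servers", bootstrapServers);
    props.put("key.serializer",
        "org.apache.kafka.common.serialization.StringSerializer");
    props.put("value.serializer",
        "org.apache.kafka.common.serialization.StringSerializer");
    producer = new KafkaProducer<>(props);
  }

  // Keying each record by the context id keeps all messages for one
  // context on the same partition, and therefore in order.
  public void log(String contextId, String message) {
    producer.send(new ProducerRecord<>("context-logs", contextId, message));
  }
}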
This service will help us build functionality that lets us take real-time action based on the behavior of the application.

Saturday, May 26, 2018

The microservices challenge

It is 2018 and everything being built on the cloud has to be a microservices architecture. Not so long ago, one could build a monolithic service, deploy it behind an elastic load balancer and be set. Once a REST call was received by the service, it could either return a success or a failure based on what happened in the processing of the request.
But that architecture had its limitations and issues that led people to look at microservices architecture. As microservices become commonplace, we are faced with unique challenges that need to be resolved for a microservices based cloud deployment.
Let's look at an example. Let's say we are building a School Management System. A typical school management system would have the following modules within it.

A sample of school management system
Now, let's take a very simple use case on this system. A parent whose child has been admitted into the school wants to pay the fees and confirm the admission. It would probably look something like the flow below.
Confirming admission sample flow

Now, let's assume each of the lines in the above picture is a separate microservice. When the parent tries to confirm the admission by paying the fees, they interact with the Admissions microservice, which provides the details of the fees that need to be paid. In turn, the Accounts microservice calls the payment gateway interface and processes the payment.
It is possible that there is a transient error in the call between the Accounts and Payment Gateway microservices. The likelihood of such errors is higher with microservices because every call across microservices is a network call. There are two options to handle such situations.

  1. The error from the payment gateway is captured and returned to the user, and the user is asked to try after some time. This causes a bad user experience because we are returning an error to the end user without knowing the reason for the error. The user may think that there is something wrong with their credit card or bank account while the error may be just a network failure.
  2. If we are not sure about the reason for the error, another option is to pass the request on to a dead-letter-queue service. The whole purpose of the dead-letter-queue service is to retry the request based on a configured retry-count and retry-timeout. This makes sure we inform the customer asynchronously only in cases of a genuine customer error (a sketch of such a retry worker follows this list).
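A minimal sketch of the dead-letter-queue retry loop described in point 2; DeadLetterQueueWorker, RetryableRequest, retryCount and retryTimeoutMillis are illustrative names, not an existing API.

public class DeadLetterQueueWorker {

  public interface RetryableRequest {
    // Returns true when the downstream call finally succeeds.
    boolean execute();
  }

  private final int retryCount;
  private final long retryTimeoutMillis;

  public DeadLetterQueueWorker(int retryCount, long retryTimeoutMillis) {
    this.retryCount = retryCount;
    this.retryTimeoutMillis = retryTimeoutMillis;
  }

  // Retries the request up to retryCount times, waiting retryTimeoutMillis
  // between attempts. Returns false if every attempt failed, at which point
  // the customer can be informed asynchronously.
  public boolean process(RetryableRequest request) throws InterruptedException {
    for (int attempt = 0; attempt < retryCount; attempt++) {
      if (request.execute()) {
        return true;
      }
      Thread.sleep(retryTimeoutMillis);
    }
    return false;
  }
}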
Modules including dead letter queue

We now have another microservice which takes care of retrying the failed request. Assuming the initial fee payment request to the payment gateway was unsuccessful, the modified flow of requests will look like the one below.
Fee payment with dead letter queue
We can clearly see that this flow is more resilient and takes care of scenarios where a REST call between microservices may fail. The next big question is: under what conditions do requests need to be sent to the dead letter queue? To understand this, let's look at the HTTP error codes and try to figure out what they really mean.

The HTTP error codes depicting failures are defined in the 4XX and 5XX series. Let's evaluate them one by one. As per the standards, the 4XX error codes denote situations where the client has erred, while 5XX denotes situations where the server has erred.

4XX

400 Bad Request, 411 Length Required, 412 Precondition Failed, 413 Request Entity Too Large, 414 Request-URI Too Long, 415 Unsupported Media Type, 416 Requested Range Not Satisfiable, 417 Expectation Failed -- Normally these errors are caused solely by a bad request body. But in a microservices architecture, the request bodies are composed by the microservices themselves. So unless we are talking about a customer-facing microservice where the request is coming from an external client or user, this is most probably caused by some interface confusion between microservices. There is nothing that a client or user can do about this request. We send this request to the DLQ.

401 Unauthorized, 403 Forbidden -- We need to understand how we are authenticating requests across microservices. If the authentication is being done with client/user credentials, then this is a fatal error and we can return it to the client/user. If it is service-to-service authentication, we will have to handle it internally and we should send the request to the DLQ.

404 Not Found, 405 Method Not Allowed, 406 Not Acceptable, 410 Gone -- If the request is coming from a user/client, we can return the error back to the user/client. If it is a request between two microservices, we need to send the request to the DLQ.

407 Proxy Authentication Required -- Our application will see this error only in calls between microservices, and we need to send this request to the DLQ.

408 Request Timeout -- This denotes some service being down; the request needs to be sent to the DLQ.

409 Conflict -- This error is generated because of an inconsistent state of the system. This needs to be sent to the DLQ.

5XX

500 Internal Server Error -- This is a transient error and should be sent to the DLQ.

501 Not Implemented, 502 Bad Gateway, 503 Service Unavailable, 504 Gateway Timeout, 505 HTTP Version Not Supported -- If the request was received from an external client/user, this error is returned back to the client/user; otherwise the request should be sent to the DLQ and retried once the error is resolved.
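To summarize the rules above as code, a routing helper of this kind could decide whether to return a failure to the caller or hand it to the dead letter queue; the class and enum names are illustrative and the mapping roughly follows the interpretation above, not any particular framework.

public final class FailureRouter {

  public enum Action { RETURN_TO_CALLER, SEND_TO_DLQ }

  // Decides what to do with a failed downstream call.
  public static Action classify(int status, boolean requestFromExternalCaller) {
    switch (status) {
      case 407:  // proxy authentication between services
      case 408:  // timeout, some service is down
      case 409:  // inconsistent state of the system
      case 500:  // transient server error
        return Action.SEND_TO_DLQ;
      default:
        // Remaining 4XX and 5XX codes: return them only if an external
        // client/user issued the request, otherwise retry via the DLQ.
        return requestFromExternalCaller ? Action.RETURN_TO_CALLER
                                         : Action.SEND_TO_DLQ;
    }
  }
}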

Once we communicate an accurate understanding of each of the error codes to the team members and return proper error codes, a microservices architecture can be made more resilient with an architectural block like a dead letter queue.

Thursday, May 10, 2018

The agile conundrum

The agile manifesto has been around for quite some time and is hailed as the biggest revolution since the stored-program concept. My personal experiences around agile never felt like a revolution. If one looks at the sparsely available data related to project success rates in IT, one finds that agile projects did have higher success rates, but not so high that they could be called revolutionary.

So I decided to look at the agile manifesto again.
  1. Our highest priority is to satisfy the customer through early and continuous delivery of valuable software.
  2. Welcome changing requirements, even late in development. Agile processes harness change for the customer's competitive advantage.
  3. Deliver working software frequently, from a couple of weeks to a couple of months, with a preference to the shorter timescale.
  4. Business people and developers must work together daily throughout the project.
  5. Build projects around motivated individuals. Give them the environment and support they need, and trust them to get the job done.
  6. The most efficient and effective method of conveying information to and within a development team is face-to-face conversation.
  7. Working software is the primary measure of progress.
  8. Agile processes promote sustainable development. The sponsors, developers, and users should be able to maintain a constant pace indefinitely.
  9. Continuous attention to technical excellence and good design enhances agility.
  10. Simplicity--the art of maximizing the amount of work not done--is essential.
  11. The best architectures, requirements, and designs emerge from self-organizing teams.
  12. At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behavior accordingly.
As we carefully look at the above principles of the agile manifesto, there can't be any quarrel about 1, 3, 6, 7, 8, 9, 10 and 12. These are just good principles for any project team and have nothing to do with agile versus traditional. Let's look at the other items in the agile principles.

Changing Requirements
I agree that there is a need for software teams to adapt to changing requirements, but I would not go as far as to say that we need to welcome requirement change. Agile or traditional, any change in requirements does cause disturbance to the ongoing work. Agile teams may be better placed to handle that change, but the adaptability comes from the underlying software architecture and design rather than from the agile process per se. I have seen good agile teams whose underlying architecture and design resulted in the complete team being thrown off course as soon as a requirement change was encountered.
Business and Developers must work together daily
In most of the agile teams that I have worked on, business people can't afford to work with developers on a daily basis, and hence the voice of the business is carried to developers through the product managers. The original intent of "the individual with the problem" working together with "the person who can code" is not achieved anyway. There is still content lost in translation.
Motivated Individuals
I think this is the most important aspect of any agile team. It also overpowers all the other principles of the agile manifesto. If I can gather a group of motivated individuals, I don't really need to worry about most of the other stuff. If the team is motivated to build the right thing that the customer wants, all our problems would be solved. If I were running a non-agile, traditional project and I had a set of motivated individuals, I would still most probably end up with a successful project.
Best architecture, requirements, designs emerge from self-organizing teams
I think there is very little difference between a team in chaos and a self-organizing team. When a team delivers the best architecture, requirements, and design, we call it a self-organizing team; otherwise it is labeled as chaos. I think this principle of the agile manifesto cannot be operationalized, because if we give a team the freedom to self-organize, whether they are effective or have descended into chaos will only be known once they produce their output, and by that time it is too late.
In my view, the biggest problem facing the agile manifesto today is that it has become almost akin to a religion. People fight battles over whether something should be called an Epic, Story or Task. People fight over what can be discussed in a standup. It is almost as if the process has become the most important aspect and nobody is worried about the end result.
Most people working in agile teams think that agile is all about coding; the thinking required to build a good product is considered a waste of time. People build bad software while proudly claiming "we will refactor it later". In the name of the backlog list, visibility in project tracking is completely lost. It has almost become voodoo.
I believe the point of the agile manifesto was to eliminate activities that were not adding anything positive to the product. For example, there was absolutely no point in writing an interface control document, because C headers or Java interfaces could explain interfaces better and were easier to manage. But unfortunately, it has been taken as an excuse to eliminate activities that were actually contributing to product quality. In the end, we have not made the gains that agile should have given us.

Monday, May 7, 2018

Writing Secure Code

Security is the paramount issue when deploying a product over the internet. Here is a set of principles for writing secure code that I have gathered from different places on the internet.


  1. Any functionality you have built can be used by anybody with an ulterior motive. Design functionality/endpoint/service/microservice with a mindset that guards against the most rogue user of that functionality.
  2. All user input coming your way should be assumed to be coming from a most rogue user of the functionality. Follow the steps defined below. 
    1. Sanitize
    2. Validate
    3. Execute
    4. Display feedback
  3. If you have defined endpoints and provided a web client or an app, don't assume that is the only method that will be used to access your endpoints.
  4. API keys are as important, if not more important, than usernames and passwords. Guard them accordingly.
  5. Any place you are using API keys or passwords, think about what you will do if they are exposed. How will you handle key rotation? Particularly if you have a device as your client, it should be capable of handling key rotation or a forced password change.
  6. Passwords and API keys should not be part of the code or committed to configuration management systems.
  7. If you are using encryption with keys (e.g. AES-256), think about where you will store the keys themselves. Think of Hardware Security Modules.
  8. Unless your occupation is security research, don't fall to the temptation of designing your own encryption algorithm. Security by obfuscation is just a bad idea.
  9. Before storing any data, think about what you intend to do with it. Any customer data stored in your systems has the potential to be leaked. Don't store a piece of data that you have no need for.
  10. Pay attention to filtering rules for your logging and audit your logs for any accidental sensitive data reaching the logs.
  11. Heed warnings from compilers, lint, and other analysis tools. A sprint is done only when all the warnings have been removed.
  12. In this day and age, don't use HTTP. Stick to HTTPS with at least TLS 1.2.
  13. Have a policy on employee turnover. Many times, ex-employees may be the biggest source of data leakage.
  14. Specifically design and test against injection risks. Minimize native SQL queries as much as possible (see the sketch after this list).
  15. Specifically design and test against broken authentication and authorization. Many systems suffer from issues such as users assuming identities of other users, users able to authorize themselves for higher roles etc. Take the help of a security researcher if you don't have staff on the team.
  16. Each piece of data collected and stored should be specifically annotated with the level of privacy and encryption required. Detailed thought related to the type of data needs to be given right at the design stage.
  17. Auditing sensitive data is mandatory. The system needs to have an audit system built that can help us retrieve sufficient breadcrumbs in case of a customer complaint about data compromise.
These are some of the items that I could think of. Comment with your thoughts.
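As a small illustration of item 14, the sketch below uses a parameterized JDBC query instead of string concatenation; the table and column names are hypothetical.

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class UserLookup {

  // Bind the untrusted value as a parameter; never concatenate it into the SQL string.
  public static String findUserName(Connection conn, String email) throws SQLException {
    String sql = "SELECT name FROM users WHERE email = ?";
    try (PreparedStatement stmt = conn.prepareStatement(sql)) {
      stmt.setString(1, email);
      try (ResultSet rs = stmt.executeQuery()) {
        return rs.next() ? rs.getString("name") : null;
      }
    }
  }
}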


What are some fundamentals of security every developer should understand?
OWASP Top 10 - 2017

Saturday, January 6, 2018

Why I finally believe Microsoft is a changed company

The media has been talking about how Microsoft is a changed company. I never believed that story. I have seen many large companies trying to change and eventually fail. There are examples of Motorola, Bell Labs, HP, Nokia; all of them trying to change but failing at it.

Yesterday one of my friends introduced me to this new app called "SMS Organizer". The name could have been better, but leaving that aside, it looks like the typical thing you expect the Googles and Apples of the world to do, not something expected from Microsoft at all. The fact that they built an app that solves a key problem and released it on Android indicates that it is a changed company.

The app itself is fantastic; it takes an "Inbox by Gmail" approach to SMS. I believe this is something that Google should have done years ago, but they chose to ignore it. Following are the key features of the app.

  • You have Bill reminders right on top
  • Messages are categorized as personal, promotional, transactional. 
  • You can block SMS messages. 
  • You can put rules to delete messages like OTP and promotions to be deleted after a certain period. 
  • You can backup your SMS messages to Google Drive
It is a great app. I am still playing with it to see if it completely replaces the native messaging app, but as of now, it looks promising.

Tuesday, August 1, 2017

Why most organizations are wrong about AI

Everybody has joined the AI or ML bandwagon. The corridor talk seems to be about words like Deep Learning, CNN, TensorFlow and other related jargon. As companies devote more and more resources to machine learning, the results don't seem to be matching up to the investment.
The biggest reason behind that seems to be the fact that machine learning is not really about the underlying algorithm; it is about having the data in a form that can be used.
For any business to use the data that they are collecting for their ML solutions, they first need to define what inferences they want to draw. Once they know what inferences they want to draw, they need to look at the data and decide whether those inferences can be drawn from the data they collected. If the answer to that question is yes, an effort needs to be started to accurately tag the data and define the training set for their data.
All this is a lot of work, and even then the end result may not be worth the investment. Organizations need to understand that ML and AI are more about the data they collect and less about the fancy algorithm they can cook up.

Sunday, September 27, 2015

Using google blobstore through android client

Recently I have been working on an Android project that required me to store some images in the cloud. Upon reading through the Google documentation, I learnt that the best place to store dynamic images is the Google Blobstore service. Unfortunately for me, it required some iterations to figure out how to use this service, so I thought I would document it here to help others.

It is important to understand how the Blobstore workflow works. Following is the typical workflow.


  • The client first creates an upload URL where the image will be uploaded. Since it is hard (not possible) to do a decent authentication scheme around the Blobstore upload itself, I decided to use an App Engine endpoint to create the upload URL for me. The code for the App Engine endpoint implementation is very simple.

    @ApiMethod(name = "getUploadURL")
    public UploadURL getUploadURL() {

        BlobstoreService blobstoreService = BlobstoreServiceFactory.getBlobstoreService();
        return new UploadURL(blobstoreService.createUploadUrl("/_ah/uploads"));
    }
The argument to createUploadUrl is the suffix that will be added to your App Engine base URL. We need to understand that the actual image upload is unauthenticated, i.e. anybody who has the URL can upload an image, but the endpoint itself is authenticated, and hence nobody except authenticated users would know the URL to upload anything.
  • Now that we have the URL, the first thing we need to do is to convert the image into a byte array that we can upload. I have created the following utility function to do just that. Since I don't want to store full-sized images, the function takes a Bitmap, scales it and converts it to a byte array. The image is scaled with its actual aspect ratio, with the width capped at PRODUCT_IMAGE_WIDTH.
    // PRODUCT_IMAGE_WIDTH is a constant defined elsewhere in the class,
    // e.g. private static final int PRODUCT_IMAGE_WIDTH = 640;
    public static byte[] getByteArrayFromBitmap(Bitmap bitmap, boolean scale) {

        Bitmap scaledBitmap = bitmap;
        if (scale) {
            scaledBitmap = scale(bitmap);
        }
        ByteArrayOutputStream stream = new ByteArrayOutputStream();
        scaledBitmap.compress(Bitmap.CompressFormat.PNG, 100, stream);
        return stream.toByteArray();
    }

    public static Bitmap scale(Bitmap source) {
        int w = source.getWidth();
        double factor = 1.0;
        if (w > PRODUCT_IMAGE_WIDTH) {
            // Scale down so the width becomes PRODUCT_IMAGE_WIDTH,
            // preserving the aspect ratio.
            factor = ((double) PRODUCT_IMAGE_WIDTH) / ((double) w);
        }
        int dw = (int) (factor * source.getWidth());
        int dh = (int) (factor * source.getHeight());
        return Bitmap.createScaledBitmap(source, dw, dh, false);
    }
  • Now that we have the image as a byte array, we upload it from the client using this URL. Since Google has removed the old HttpClient, I had to figure out how to use OkHttp, which seems to be the new HTTP client to use. It was really not very hard; it is just that documentation around OkHttp is very sparse and it takes some trial and error to make this work. On the server side we set a response header so that the client can receive the cloud key that is generated by the blob service. We need to know this key to retrieve the image later.
            String uploadURL = api.getUploadURL().execute().getUrl();
            OkHttpClient httpClient = new OkHttpClient();
            String filename = UUID.randomUUID().toString() + ".png";
            RequestBody body = new MultipartBuilder("image-part-name")
                    .type(MultipartBuilder.FORM)
                    .addFormDataPart("file", filename,
                            RequestBody.create(MediaType.parse("image/png"), imageByteArray))
                    .build();
            Request request = new Request.Builder().url(uploadURL).post(body).build();
            Response response = httpClient.newCall(request).execute();
            String cloudKey = response.header("X-MyApplication-Blob-Cloud-key");
            // We can store this cloud key in the cloud store entity with the other data
            // related to the image so that we can retrieve it later.
  • Let's look at the implementation on the Blobstore side. We basically need to create a servlet to handle the upload request; the doPost of the servlet should handle the upload. With some trial and error, I figured out that the Blobstore API looks for a multipart payload with the image byte array in a part named "file". You can also provide the byte array's media type; otherwise it is deduced from the filename extension that you provide as part of the value.
    private final BlobstoreService blobstoreService =
            BlobstoreServiceFactory.getBlobstoreService();

    @Override
    public void doPost(HttpServletRequest req, HttpServletResponse res)
            throws ServletException, IOException {

        Map<String, List<BlobKey>> blobs = blobstoreService.getUploads(req);
        List<BlobKey> blobKeys = blobs.get("file");
        if (blobKeys == null || blobKeys.isEmpty()) {
            // Nothing was uploaded under the "file" part; no key to return.
        } else {
            res.setHeader("X-MyApplication-Blob-Cloud-key", blobKeys.get(0).getKeyString());
        }
    }
  • Retrieving the image is also very simple. You need another servlet to do that. In my case I have a servlet with a GET method that returns the image.
    @Override
    public void doGet(HttpServletRequest req, HttpServletResponse res)
            throws IOException {

        BlobKey blobKey = new BlobKey(req.getParameter("blob-key"));
        blobstoreService.serve(blobKey, res);
    }
As we can see, it is quite easy to build a workflow where the image is stored in the Blobstore and the other metadata is stored in the Google cloud store along with the cloud key. The image can then be retrieved based on the other metadata and a lookup from the cloud store.

Tuesday, September 9, 2014

Why I think Java is bad for computer industry

I have felt this for quite a long time, and after multiple events that have happened in the past, I have come to the conclusion that Java should not be the first language of anybody who intends to learn programming. Even regular Java programmers should take time off and code in some other language once in a while.

Why am I saying this? Over the last many years, I have run a programming competition for the employees of the companies where I have worked. The participants are individuals with significant experience in programming. We generally allow people to code in C/C++ and Java.

Most of the problems that we design for these competitions have a standard statement written in them. 
The input is read from standard input till EOF and the output should be written to standard output.

Almost every year I am asked for clarifications related to this statement in many different forms.

  1. My program reads from a file standard-input.txt and writes to standard-output.txt
  2. My program reads one line at a time and then you have to run it again 
  3. Standard input doesn't have EOF
Almost always these clarifications come from experienced Java programmers. This leads me to believe that people who start programming in languages that have an extremely rich set of libraries forget the basic constructs of the language, of programming and of the operating system. These are basic constructs that I would expect everybody to know. Even otherwise, it is just a simple Google query away. Since most of these people are experienced programmers, their presumption is that the question must have a typo, so they don't bother looking it up.
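For reference, reading standard input until EOF in Java needs nothing beyond the standard library; a minimal sketch that simply echoes its input:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class EchoUntilEof {

  public static void main(String[] args) throws IOException {
    BufferedReader in = new BufferedReader(new InputStreamReader(System.in));
    String line;
    // readLine() returns null once standard input reaches EOF
    // (Ctrl-D on a terminal, or the end of a piped file).
    while ((line = in.readLine()) != null) {
      System.out.println(line);
    }
  }
}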

There, it is off my chest now. I can breathe properly.

Saturday, December 14, 2013

Security concern: NDTV android app


The latest release of the NDTV Android app is asking for full access to your calendar. Given that it is an app from a news site, what's the point of asking for complete access to the calendar on your phone? What do they expect to do with your calendar? I am not going to install this upgrade.

Saturday, December 7, 2013

Accessory Review : Sony DSC-QX100

I finally got a chance to review the Sony DSC-QX100, what Sony calls a smartphone-attachable lens-style camera. Here is my review of the accessory (if you want to call it that) after a couple of hundred shots. Let me first explain how it works.
What it is
The device is essentially a full-blown camera without a viewfinder and without most of the controls. The only controls available on the camera (lens) body are the shutter and zoom control.
How does it work
The QX100 connects to your smartphone wirelessly. It can use NFC for authentication, or one can use the password provided on the camera to authenticate; it then uses Wi-Fi to connect, send the control commands and retrieve photos. I used the QX100 with an Xperia Z and it worked after one failed attempt at connection.
The app suggested by Sony to control the camera is Sony PlayMemories for Mobile. I had to search for the app on the Play Store and install it on my phone. One would think that Sony would make the lives of its consumers easier by providing some kind of QR code in the package so that finding the app becomes easy.
Setting Up
Setting up the camera is easy; I could do it with very little time and effort. But it takes 10 to 30 seconds, depending on the whims and fancies of the camera and the app, to connect to the QX100. This becomes a huge issue because one reason I carry a camera phone is that I want to take spur-of-the-moment photos. This camera makes it impossible to do that. By the time you take out the camera, mount it on the phone, turn the camera on, start the app and then ask it to connect, it is easily half a minute and the spur of the moment has passed.

Storage
You have the option to use only the camera storage, the phone storage or both. The QX100 has a slot for an SD card; I put in a 32GB SD card and it worked without a problem. If you set up the app to simultaneously copy photos to both the camera and the mobile phone, the turnaround time for the next photo increases significantly. It takes almost 10 seconds to transfer a full-size photo from the camera to the mobile phone, and that becomes a huge issue. I tried that and eventually disabled the functionality of copying the photos to the camera. Another huge problem for me was that the Sony PMM app stores the photos somewhere in the internal memory of the phone. This causes two problems. The first is that you can very quickly run out of space on the phone, and the second is that I have set up Google+ to upload photos to Picasa automatically; this behavior of PMM breaks that functionality and makes life very difficult.
Other Annoyances
Given that the lens is specifically designed for mobile phones, it is surprising that it does not support location tagging at all. One of the main reasons why I take photos from a mobile phone is that I want to location-tag them. The reason seems to be that the Sony Camera Remote API, which was probably designed for its other line of Wi-Fi-enabled cameras, does not support passing location information to the camera while clicking, probably because the camera can't handle it. I think this is a very stupid design. Even if a third-party software vendor were to write an app that replaces Sony PMM, they can't really add location without going through hoops to do it.
The zoom-in/out button on the lens is so close to the shutter button that, when I was using it, I would accidentally end up clicking while trying to zoom the lens in or out.
The grip of the camera when mounted on the phone is also very awkward. You almost don't trust the clip that holds the lens to the phone, and since the only type of viewfinder is the phone's LCD, it is impossible to take any kind of picture in sunny areas. You practically can't see anything.
The other very significant problem is that, because of the way the camera is mounted on the Xperia line of phones, it obstructs the flash, and the PMM app also does not have any way to fire the flash while you are using the app, making it absolutely useless after sunset or in dark rooms.
Picture Quality
I found the picture quality of this camera definitely better than the stock camera on the Xperia Z, but not by a magnitude that made any difference. Below are two pictures from the Xperia Z and the QX100 for comparison. Some portions of the picture actually look better in the Xperia picture.
Sony Xperia Z Picture
DSC-QX100 Picture
As we can see in the pictures above, the DSC-QX100 is slightly better; colors are more accurate. Find below a set of images that were taken using this camera, for you to decide for yourself.


Friday, May 17, 2013

Apps that I use


Here is a list of applications that I have downloaded from the Android Market and use.

  • Adobe Reader -- For reading PDF documents.
  • Amazon Kindle -- eBooks from Amazon
  • Audible -- Audio books
  • Google Authenticator -- Two factor authentication for Google services
  • Barcode Scanner -- Really essential app, can use the camera to scan any barcode. This app is also used by many other apps for that functionality.
  • Booking.com -- App for hotel reservation and keeping track of your reservations
  • Bubble -- A surface level testing application
  • CallTrack -- Records your calls to Google Calendar
  • Calorie Counter -- A client for weight losers' social network http://www.fatsecret.com
  • CamScanner -- Very useful app for creating PDFs out of whiteboard discussions and using the camera as a scanner.
  • Cardio Trainer -- Can record your running sessions along with a google map
  • Citibank India -- Citibank online banking app
  • Cleartrip -- Mobile app for cleartrip.com
  • Compass -- A regular direction compass, has analog and digital settings. Pretty useful
  • Concur -- Keeping track of reservations
  • ConvertPad -- Fantastic Unit conversion app
  • Currents -- App for creating magazines (like flipboard). Currently used as a replacement for Reader
  • CWT To Go -- My company's travel agent is CWT, so I use this to keep track for official travels
  • Dictionary -- Mobile app for dictionary.com
  • doubleTwist -- Alternate music player, mostly used for a very good podcast search service
  • Drive -- App for google drive
  • Epocrates -- To make sure that the doctor is not messing up with you
  • Facebook for Android -- What would be life without facebook
  • File Commander -- To access the storage in the phone directly
  • Finance -- Client for Google Finance
  • Flipboard
  • Goggles -- Nifty app that can decipher things based on their pictures, Just point the camera and it would tell you what it is.
  • Google+
  • HDFC Bank -- Very Nice app for online banking
  • IMDb -- All about movies
  • LIC Mobile -- Keeping track of your policies
  • Linked In
  • Lookout -- Security service
  • Voice -- Client for google Voice. I have it but can't use it in India
  • Layar -- Augmented Reality app, overlays stuff on google map. Pretty cool
  • Listen -- Podcast finder
  • Locale -- Very nifty app to change the settings of your phone based on time or your location.
  • My Tracks -- Records your movements. Very useful to share directions with others.
  • My Backup -- Backs up all the user data on the SD card. Useful when changing phones.
  • Places Directory -- Another cool app from Google, nice to find places around you.
  • QuickOffice -- Official version integrates very well with Google Drive
  • Seesmic -- A twitter client
  • Shazam -- Can decipher song details by listening to it.
  • Skype
  • SMS Backup+ -- Backs up all your SMS messages to your Gmail account under a specified label.
  • Twitter -- Client for twitter
  • Unit Converter -- Converts pretty much any unit to any other unit
  • Ustream Broadcaster -- Broadcast yourself. Pretty good
  • Voice Recorder -- Record voice
  • Yahoo Mail -- That yahoo email account that I never use
Let me know if any of you people out there use any other app that you find useful. I can try it out.

Sony Xperia Z -- A Review

Recently I acquired a Sony Xperia Z, and here are my impressions of the phone.
  • The industrial design of the phone is really good. I am a long-time Motorola user, and compared to the industrial design of Motorola, I could never get the same feel from phones made by LG or Samsung. I was forced to look at a manufacturer other than Motorola because they have pretty much walked out of the Indian market and there is nothing new from them. Sony's industrial design is really slick. The phone is put together very well. You don't get a plasticky feel from the phone and it feels robust.
    Benchmark Sony Xperia Z
  • The underlying hardware and operating system perform well. Here is the benchmark from the Quadrant app. The only phone that I have seen give better numbers than this one is the Galaxy S4.
  • I was earlier using a Motorola Razr XT910. With the number of apps that I was using, I saw that the phone was running very low on RAM, so for my type of usage, this phone helps with its 2 GB of RAM.
  • There is a bunch of crapware installed by Sony on the phone; most of it is meaningless and I would have liked not to have it. There are apps like PlayNow, Smart Connect, Socialife, Sony LIV, Sony Music, Sony Select, Xperia Link and Xperia Privilege. Many of these apps may be useful for people who have more Sony gear at home, since they are all about these devices interworking, but most of them are completely useless for me.
  • The speaker on the phone is completely useless. It is on the side of the phone and barely audible. It was a complete surprise for me, since I expected Sony to at least have a handle on audio. The sound quality of a phone call is terrible. The speakerphone can't be used at all; for it to even be audible, you have to keep the phone vertical with its side facing towards you. If you intend to make many calls with the phone, invest in a Bluetooth or wired headset. The interesting part is that the call voice quality is good with any headset, so it is just a problem with the phone.
    Sony Xperia Z Speeds Airtel Bangalore
  • The data connection itself works fine and the speeds delivered are pretty decent. I tested it with the Airtel HSPA network in Bangalore and got 6 Mbps uploads and 1.7 Mbps downloads.
In summary, for a phone that costs close to Rs. 39,000, the phone does not meet expectations. Basic features like call quality and speaker quality are well below expectations. Sony also does not have a good car dock, which is surprising given the premium nature of the device.

Saturday, March 23, 2013

Apps for Scanning and OCR

CamScanner
Goggles
One of the needs that I have faced while working in an office is to capture the content drawn on a whiteboard and OCR it. The best app that I have found for that is CamScanner. Please scan the QR code on the left to install this app.
This app is the best one for taking pictures of documents or whiteboards. It takes the picture, cleans it up and then adjusts the perspective of the picture as well.
Translate
One can take multiple pictures and make a PDF document out of many pictures. There is a basic version available for free, and one can also buy a paid version with additional features, which include things like cloud sync and some other stuff.
The app also allows one to share the document over many mediums like email, Evernote, Drive or any other service that you may have installed that can handle PDF documents.
Once we have taken a picture, the next step sometimes is to perform OCR on the picture. Here one of the old-time favorite applications from the Google stable, Goggles, comes in handy. Google Goggles, along with Translate, offers OCR and translation services as well. Give these apps a spin.