Jan 30, 2015

Suggesting a new story type: "Irritant"

In the scrum world, we typically divide our stories into bugs, features and chores. I find this classification limiting; it does not address all the work that needs to be done to deliver an awesome product.

The problem is not that some task cannot be classified into one of these categories. The problem is that these categories are typically owned by different people. Product mainly focuses on features, QA focuses on bugs, and chores are a catch-all, generally used to plan refactoring-like tasks.

There is one group of people with no place in these categories: the product's end users. What matters to them are the little, little things throughout the application which are neither features, nor bugs, nor chores, but which make a big impact on how users perceive your product. I call these 'Irritants'.

Irritants are the confusing messages, the non-highlighted action button, the convoluted menu structure, the UI redundancy or lack of it, and all those little things which make using the product that much more painful.
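To make the idea concrete, here is a minimal sketch in Python of what a backlog could look like if 'Irritant' sat alongside the usual story types. The StoryType enum and Story class are hypothetical names invented for this post, not any particular tracker's API.

    from dataclasses import dataclass
    from enum import Enum


    class StoryType(Enum):
        FEATURE = "feature"    # owned by product
        BUG = "bug"            # owned by QA
        CHORE = "chore"        # catch-all, e.g. refactoring
        IRRITANT = "irritant"  # owned, in spirit, by the end user


    @dataclass
    class Story:
        title: str
        story_type: StoryType


    backlog = [
        Story("Export report to CSV", StoryType.FEATURE),
        Story("Crash when uploading a 0-byte file", StoryType.BUG),
        Story("Upgrade the ORM to the latest version", StoryType.CHORE),
        # the little things that are neither features, bugs nor chores
        Story("Error message 'Error 37' tells the user nothing", StoryType.IRRITANT),
        Story("Primary action button looks identical to Cancel", StoryType.IRRITANT),
    ]

    # planning view: make sure every iteration reserves room for irritants
    irritants = [s for s in backlog if s.story_type is StoryType.IRRITANT]
    print(f"{len(irritants)} irritants waiting to be removed")

The point is less the code than the planning habit: if irritants have their own label, they show up in every iteration instead of hiding inside someone else's category.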

I come across products filled with irritants every day. Some of our most popular products today will disappear in the months and years to come mainly because of their failure to address this key story type. We focus on pixel-perfect designs and microsecond load times, and debate Java vs PHP, but what I believe will really delight our users is removing these 'Irritants'.

Jul 26, 2014

Thought exercise: Artificial Intelligence

The only part that I understand about Artificial Intelligence is that it is an attempt to create computers that can think and learn like humans or animals. AI has been talked about for many decades now, and sci-fi movies stretch our imagination with its possibilities.

Having been a programmer for over a decade now, I realize computers are pretty dumb; they need to be fed every bit of work that they do. The permutations and combinations they can handle are, for all practical purposes, almost always very limited.

When I think of how our brain thinks, recollects, remembers and analyzes, I feel our microprocessor is fundamentally different from that of a computer. Things which are really simple for computers, like math, are quite complicated for humans. And things such as image processing, which are quite simple for humans, are much more difficult for computers.

Which leads me to think that AI might require a completely different kind of microprocessor. The assembly-level instructions for an AI processor would need to be fundamentally different from what they are today.

We think of everything in terms of images or flashes. An image, big or small, is the same thing for our mind. We can instantly associate one with another. An image could be made up of a thousand other images, and we can instantly separate those out and connect them with all the other images we have. We don't learn numbers and alphabets as ASCII codes but as images and their variations.

The processor which would create AI would need to have its basic instructions in the same paradigm. A bit in today's microprocessor would be equivalent to an image in an AI processor. There is no concept of bitmaps in an AI processor. All images then are fundamentally vector images, or image bits.

The second fundamental element we would need in this AI processor is the ability to infinitely link images to other images and parts thereof. Hence, when we search for something, we don't do a linear search but a multi-dimensional search which goes deep into this network of interconnected images.
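As a rough illustration of these two ideas, here is a sketch in Python. It is purely illustrative; the ImageBit type and associative_search function are names invented for this post, not any real instruction set. The basic unit is a linked image rather than a binary bit, and retrieval spreads through the links instead of scanning linearly.

    from collections import deque
    from dataclasses import dataclass, field


    @dataclass
    class ImageBit:
        """Imagined basic unit of the AI processor: not a binary bit but a
        whole image, linked to arbitrarily many other images."""
        name: str
        links: set = field(default_factory=set)  # names of associated image bits

        def link(self, other: "ImageBit") -> None:
            # association is mutual and unbounded
            self.links.add(other.name)
            other.links.add(self.name)


    def associative_search(memory: dict, start: str, target: str) -> bool:
        """Fan out through the network of interconnected images from a
        starting impression, rather than scanning memory linearly."""
        seen, frontier = {start}, deque([start])
        while frontier:
            current = frontier.popleft()
            if current == target:
                return True
            for neighbour in memory[current].links:
                if neighbour not in seen:
                    seen.add(neighbour)
                    frontier.append(neighbour)
        return False


    # build a tiny "memory" of interlinked images
    memory = {name: ImageBit(name) for name in ("beach", "sand", "castle", "flag")}
    memory["beach"].link(memory["sand"])
    memory["sand"].link(memory["castle"])
    memory["castle"].link(memory["flag"])

    print(associative_search(memory, "beach", "flag"))  # True: beach -> sand -> castle -> flag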

Moreover, vision is just one of the many senses that we think with. The AI processor would similarly need to process sound, taste, smell and touch, each as fundamental bits of our AI microprocessor. With these low-level AI instruction sets available to us, programming an intelligent being would become increasingly fathomable.
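A tiny extension of the same sketch (again, hypothetical names only) shows how each sense could contribute the same kind of fundamental unit, linked freely across modalities:

    from dataclasses import dataclass, field
    from enum import Enum


    class Modality(Enum):
        SIGHT = "sight"
        SOUND = "sound"
        SMELL = "smell"
        TASTE = "taste"
        TOUCH = "touch"


    @dataclass
    class Percept:
        """Same idea as the image bit, generalised: each sense contributes
        its own fundamental unit, and units link freely across senses."""
        name: str
        modality: Modality
        links: set = field(default_factory=set)


    # a smell can recall an image, which can recall a sound
    coffee_smell = Percept("fresh coffee", Modality.SMELL)
    cafe_image = Percept("corner cafe", Modality.SIGHT)
    espresso_hiss = Percept("espresso machine hiss", Modality.SOUND)

    coffee_smell.links.add(cafe_image.name)
    cafe_image.links.update({coffee_smell.name, espresso_hiss.name})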

Silicon brought a breakthrough in binary computing. We probably need to look to some other source to bring this breakthrough in AI computing. If computing power keeps increasing at its current rate, we should reach the tipping point pretty soon where we are able to construct the Babbage engine of AI. I believe that would open an entirely new horizon in the computing era, a step into true artificial intelligence.


Jun 5, 2014

Plan - Do - Track - Deliver

While browsing through my archives, I came across this Project Time Tracker I had developed many years back while working at UCLA. We had zero systems for tracking any kind of work, tasks or people and so I hacked this one up for use with my own team. 



Above is a screenshot of it (it's the only thing I have now), which hints at the feature set it provided then.