Facial recognition is now mainstream

Facial recognition is now mainstream and available as a service. We (Strateq) are using Amazon's engine.

These two articles really sum it up well:



From a software engineering point of view, facial recognition has two values:

1. Automatic identification of a user from a biometric property (which is a non-transferable token).

2. As part of computer vision, which enables software to trigger events and behaviour from observation of a scene, rather than waiting for human input or a preprogrammed trigger.
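To make the first value concrete, here is a minimal sketch of biometric identification: matching a probe face embedding against enrolled users by cosine similarity. The embeddings, user ids and threshold here are hypothetical placeholders; a managed engine like Amazon's returns comparable similarity scores for you.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def identify(probe, enrolled, threshold=0.9):
    """Return the user id of the best match above threshold, else None."""
    best_id, best_score = None, threshold
    for user_id, embedding in enrolled.items():
        score = cosine_similarity(probe, embedding)
        if score >= best_score:
            best_id, best_score = user_id, score
    return best_id

# Toy enrolled gallery (real embeddings are hundreds of dimensions).
enrolled = {
    "alice": [0.9, 0.1, 0.2],
    "bob": [0.1, 0.95, 0.3],
}
print(identify([0.88, 0.12, 0.21], enrolled))  # matches "alice"
```

The key property is the threshold: an unknown face that matches no one above it yields no identification, which is what makes the biometric a non-transferable token rather than a guess.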

Case study in Machine Learning

Machine learning is very different from big data; this diagram illustrates it well.

What machine learning can do is teach us rules that we can reuse to allow for automation at the speed and scale of computers and software. The story from Ars Technica below illustrates this very well.

Vicarious AI describes an algorithm they created that is able to take minimal training and easily handle CAPTCHAs:

“They modeled the structure of their AI on information we’ve gained from studying how the mammalian visual cortex processes images. In the visual cortex, different groups of neurons recognize features like edges and surfaces (and others identify motions, which aren’t really relevant here).”
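As a toy illustration of that edge-detecting layer (and emphatically not Vicarious's algorithm), here is a minimal sketch of a horizontal-difference filter that responds strongly where pixel intensity jumps, much like an edge-sensitive cell in the visual cortex:

```python
def detect_vertical_edges(image):
    """Apply a simple [-1, 1] horizontal-difference filter to each row.
    Large responses mark intensity jumps, i.e. vertical edges."""
    return [[row[x + 1] - row[x] for x in range(len(row) - 1)] for row in image]

# A tiny image: dark on the left, bright on the right.
image = [
    [0, 0, 255, 255],
    [0, 0, 255, 255],
]
responses = detect_vertical_edges(image)
print(responses)  # the jump from 0 to 255 lights up at index 1 in each row
```

Stacking many such feature detectors, and letting later layers combine their responses, is the hierarchy the quote describes.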


Thinking about the infamous Google manifesto

I think the author of the now infamous Google manifesto failed to get three things:

1. Stereotypes exist because patterns exist, but any individual could very well be part of a different minority, so everyone should be given a chance despite the fact that majority patterns exist.

2. Diversity programs and deliberate attempts to give these minorities access to ladders of opportunity may be imperfect, but they serve the important function of reaching out to a highly disadvantaged group to take away that friction of beginning, or barrier to entry.

3. We need diversity for representation. Every time we do something that shapes the future of humanity, for example the life-changing algorithms of Google, we need to ensure we have a good representation of stakeholders.

Here's a great read

I'm a woman in computer science. Let me ladysplain the Google memo to you. – Vox


The DARPA framework for understanding AI

The framework describes three waves:

1. Handcrafted Knowledge – or the Knowledge Management era.

2. Statistical Learning – or the neural net or machine learning era.

3. Contextual Adaptation – where deep learning systems will construct their own explanatory models for classes of real-world phenomena.

If you notice the trend, we are moving from human abstraction to machine abstraction powered only by human math and training.


DNA Asia’s Story on the Strateq Transformation

DNA Asia published a great story about the Transformation of the Strateq Group (formerly known as Kompakar) on the 28th of April, 2017, shortly after the AWS Summit in KL,


where I stood on the stage and told the audience about the transformation of this 34-year-old IT service company. Strateq is a Malaysian-origin multinational, now growing in the US, Malaysia, Hong Kong, China, Singapore & Thailand, with a few more countries in the pipeline. It specialises in the vertical industries of healthcare, downstream oil and gas, and disaster recovery and business continuity, with its two shared-service bureaus of cloud computing and big data analytics. I joined them in Nov 2016 as the Group Chief Innovation Officer to help with its transformation with, and as, an AWS partner.

My thoughts on the MIT Solve Talks at Google Hosted by Kara Miller on Healthcare

MIT Solve Talks at Google Hosted by Kara Miller
Video: https://youtu.be/MpMrK2nm3pU

Streamed live on 28 Oct 2015


  • Rushika Fernandopulle, CEO of Iora Health
  • Denny Ausiello, Chairman Emeritus at Mass General Hospital
  • Heidi Williams, MIT

About Solve: “Learn about MIT’s initiative that asks extraordinary people to work together to find solutions to the extraordinarily hard problems facing our global community.” http://solve.mit.edu

My take-home message was that healthcare needs to consider building Healthcare Operating System Platforms that leverage Big Data and the other new technologies of this digital era to integrate between sources of data, the practice and the patients. This will allow collaboration between clinicians and patients, both in the provision of care and in research.

My Observations about the speakers

  1. Heidi has a very typical data scientist and epidemiologist perspective, cautious and tempered by data.
  2. Denny has a very typical experienced practitioner perspective, interested in reform but cautious about hype around revolutions.
  3. Rushika has a next-generation medical concierge or patient advocate kind of perspective.
  4. The host Kara does not have deep enough knowledge of the domain to truly leverage the panel's knowledge, but she does a good job. The panel, however, don't always take her cues, as evidenced in the Watson comments.
  5. The overall theme and consensus is the need to leverage new IT capabilities to provide more holistic lifetime medical records to clinicians at the point of care, so that evidence-based medicine can become the norm of practice.

My Observations about the issues

  1. Issue raised at 5:35: the US spends the biggest percentage of GDP on healthcare but does not get quality measures as good as some other nations that spend less.
  2. At 7:50 it's established that the driver of inflated healthcare cost in the US is waste. One driver of waste is fee-for-service, which skews incentives towards provisioning unnecessary procedures and therapies.
  3. The story from 17:54 by Rushika demonstrates the issue of fee-for-service procedure pushing and upselling, which needs a better and stronger primary care doctor to prevent. At the heart of the issue is the lack of a patient advocate and the information asymmetry.
  4. Comments from Denny from 23:06 onwards show the need for good data science and epidemiology to avoid inaccurate misconceptions and generalisations.
  5. Comments from Denny at 26:42 bring up the culture of the US, and how it rejects ‘benign neglect’, where accepting less is acceptable; in the US, patients are vigilant and expect more. He goes on to say that medicine today is probably only about 50% evidence-based; the other 50% of blindness is caused by lack of information.
  6. Key point from Denny at 28:00: “The quality and quantity we get from our patients at the point of care is quite random and episodic”. It is in this absence of information that a physician may resort to basing their decisions on their personal experience rather than the nature of evidence that is available.
  7. 3:35 Kara: “how do we get from 50% to 80%?” Denny: “The quality of that information has to be guarded under more continuous and presymptomatic ways”. He goes on to say that we need to use the digital tools of today to capture all phenotype information to provide clinicians with the information needed to make informed decisions. “We need a complete retake on how we garner this information, how do we partner with our patients not only in clinical care but in discovery, and then how we annotate that information to give a much more evidence-based and scientific base to medicine.”
  8. 34:35 Rushika: “The right way to do this obviously is to get a tonne of data in from when people are living their normal life; we have to figure out how we interpret that data, how do we pick signals out of the noise, and turn that into action”
  9. When Kara talks about IBM’s Watson at 36:20, Denny responds that Watson is good for dealing with structured data but not unstructured data. I know IBM’ers who will jump at this statement, but I think Denny’s point here is that until Watson can be part of ingesting and consuming data from the points of care and make sense of it despite its lack of structure and ontology, it will be relegated to studying journals and already structured medical knowledge, and correlating that to post-structured content created by clinicians.
  10. Denny paints a picture from 37:20 of a scenario where we are able to process the data glut and turn it into a data resource that includes journaling and participation from patients, then turned into knowledge and actions. The market now is full of apps that are commodities, not prioritised for goals of precision and not integrated into the overall patient record. “We need a fully integrated and holistic system”
  11. 39:49 Heidi points out that IT has failed to be the magic bullet to solve issues as promised.
  12. 40:52 Rushika explains that the reason for this failure of IT to deliver is rooted in the fact that much of the systems built were pivoted on billing, and with that focus, the ROI and gains were focused on billing optimisation. They therefore sought to make doctors structure their input, turning their documentation from a simple note in plain English into 50 clicks of forms, severely driving down productivity.
  13. 42:40 Denny: “we were all trained to diagnose disease and treat disease and its progression to ultimately death; we are the only profession in the world that doesn’t know its gold standard, we can’t diagnose wellness”. To drive wellness and engage patients, we need to work on defining what wellness looks like.
  14. Comments until 48:00 on the theme of holistic planning for policy makers, and factoring in the social aspect of health, to be able to meet the public policy goals they typically have.
  15. 49:00 Denny: “Partnerships with patients, not just in care, but also in discovery”. Behaviour science is a science, and it's something to master to modify behaviour. Read Social Physics http://goo.gl/IGPaJw
  16. 57:29 Rushika describes Iora as building an operating system for healthcare instead of an EHR, a link between technology and people. Not billing but collaborative care. “Technology in the context of relationship”
  17. 58:56 Denny: “Integrated Healthcare Systems” – “Integration depends on people not machines, BUT machines, toolkits and skillsets can enhance much of that, and we would be foolish, living in such a technologically advanced era, in not taking advantage of that”

Coming to grips with the great Digital Revolution: Part 1 – Describing some parts, but never the Elephant

Describing Parts of the Elephant 


There is that familiar parable of the blind men and the elephant, originating from the Indian subcontinent. Let me use the Jain version to make my point:

“A Jain version of the story says that six blind men were asked to determine what an elephant looked like by feeling different parts of the elephant’s body. The blind man who feels a leg says the elephant is like a pillar; the one who feels the tail says the elephant is like a rope; the one who feels the trunk says the elephant is like a tree branch; the one who feels the ear says the elephant is like a hand fan; the one who feels the belly says the elephant is like a wall; and the one who feels the tusk says the elephant is like a solid pipe.

A king explains to them: “All of you are right. The reason every one of you is telling it differently is because each one of you touched a different part of the elephant. So, actually, the elephant has all the features you mentioned.””

I use this story to make this point: everyone who understands the promise and possibilities of the emerging technologies in Artificial Intelligence, Robotics, Internet of Things, etc. can only talk about some trends and patterns – no one can credibly predict how all these forces will converge, or what the changes and the revolution they bring about will look like. In a sense, we are all describing the various attributes of the Elephant, but none of us have seen the Elephant in its glorious entirety. It's here though; it's so big and its presence cannot be ignored.

The Second Machine Age 

One framework I like to use when considering the problem is the idea of platforms: that there are certain technological breakthroughs that, when they occur, suddenly accelerate progress from the regular slow drudgery of linear growth to an almost instantaneous burst of exponential growth.


In their influential book “The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies”, Messrs Brynjolfsson and McAfee chart this phenomenon. The Economist sums up the book very well:

“Innovation has always driven advances in mankind’s standard of living, from agriculture to electricity. Information technology, the authors argue, is quantitatively and qualitatively different. It is, thanks to Moore’s law, exponential: its effects, barely perceptible for the first few decades, are turning explosive. It is also digital. Formerly complex tasks can be mastered then reproduced and distributed at almost no cost. Finally, it is recombinant, merging separate, existing innovations and innovators through networks and crowdsourcing.”  

These spikes occur in a phenomenon that some have come to call Platform moments. Deloitte explains this as a series of events:

1. The cost-to-performance ratio of the three building blocks of compute, storage and bandwidth has improved exponentially over the last few years
2. Innovation is built on these building blocks and drives up value and competitiveness, so as more and more attempt to leverage this phenomenon, the effect of point 1 is further amplified
3. Innovations start to coalesce and combine to form platforms and ecosystems that then drive innovation as a whole, bringing exponential change and disruption
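The "barely perceptible for decades, then explosive" character of this dynamic can be sketched in a few lines; the start values and doubling rate below are illustrative only:

```python
def linear_growth(start, step, periods):
    """Steady drudgery: the same fixed gain each period."""
    return [start + step * p for p in range(periods + 1)]

def exponential_growth(start, rate, periods):
    """Moore's-law-style compounding: multiply by `rate` each period."""
    return [start * rate ** p for p in range(periods + 1)]

lin = linear_growth(1, 1, 10)
exp = exponential_growth(1, 2, 10)
print(lin[-1], exp[-1])  # 11 vs 1024 after ten doublings
```

For the first few periods the two curves are barely distinguishable (2 vs 2, 4 vs 8); by the tenth the compounding curve has left the linear one behind by two orders of magnitude, which is exactly the spike the Platform-moment framing describes.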


So in a nutshell, the slow upward trajectory of progress of individual components begins to build up into platforms that can suddenly launch us into accelerated progress that seems so sudden, and out of nowhere. It is with this framework that I will attempt to describe some parts of the elephant and then predict some of the possible platforms and spikes that could come, but by no means will I pretend to be able to describe the elephant. That would be as absurd as being an old Mongolian soldier on my horse (before the invention of the automobile) trying to predict the future of electric cars.



Sentience vs Artificial Intelligence 

There is Hollywood-induced baggage to the term Artificial Intelligence, which assumes that switching on a computer or robot means switching on Sentience. Wikipedia defines Sentience as:

“Sentience is the ability to feel, perceive, or experience subjectively. Eighteenth-century philosophers used the concept to distinguish the ability to think (reason) from the ability to feel (sentience). In modern Western philosophy, sentience is the ability to experience sensations (known in philosophy of mind as “qualia”). In Eastern philosophy, sentience is a metaphysical quality of all things that requires respect and care. The concept is central to the philosophy of animal rights, because sentience is necessary for the ability to suffer, and thus is held to confer certain rights.”

The conventional definition of AI, though, when you key in “define Artificial Intelligence” in Google, is:

“Artificial Intelligence is the theory and development of computer systems able to perform tasks that normally require human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages.”

It is a gigantic leap from creating rational subroutines and emulating human cognitive functions to having a machine that is sentient. That's why, to avoid doubt on a popular level, I would rather use the term Cognitive Computing than AI.

The mystery of consciousness

So the AI we have today is still a long way off from the sentient beings of our SciFi tradition. Our systems at present produce models and processes that simulate human cognition and function in very specific scenarios. Sometimes we anthropomorphise human interfaces and systems to make the user feel as if they were interacting with a sentient being. As for having a sentient machine, that will only happen after we have solved the great mystery of ‘human consciousness’, though many today question if it really exists, or if it is just an illusion of our internal dialogue machine that constantly emulates consciousness to enable our sense of self and others, and to communicate with one another, because we are fundamentally a social species. The solution in the movie Transcendence was to image various states of consciousness in an fMRI and reproduce them digitally, despite not being able to define what consciousness was, and then have it fused to a state-of-the-art Cognitive Engine.


Debate: On the Extinction of CIOs

Speaking at IDC CIO Magazine Conference 2015. We had an interesting dialogue on the challenges we face and if we are becoming obsolete


I will be on the debate team at the CIO Magazine Conference on 23/04/2015.

The question will be: “Soon, CIOs will become only operational and it will be the CXOs that will be the technology decision-makers!” Agree or disagree?

Here is how I plan to answer:

Saturation of IT 

1. IT is a way of life; digital immigrants are still alive, but the concept of migration has passed

2. Consumer IT is now more important than Enterprise IT, because people bring their talent, capital and assets with them to work, and much of these are embedded in some sort of IT

3. The concept of an IT specialist is quickly becoming archaic and somewhat quaint

4. In a sense, connectivity, data and processing will become utilities like water and electricity; no one will care how they reach them, they will only complain when they don't, and try to get away with paying as little as they can

Software Engineering and Computer Science 

1. Behind the utility there is serious science to keep the flow efficient, relevant and secure

2. Some services eventually will be recognised as critical national infrastructure and funded from the public coffers

3. Others will be platforms on which SaaS and BPaaS offerings will keep being rolled out to create new opportunities for revenue

4. No matter how popular the notion of democratisation and decentralisation of IT becomes, there will always be a real science and engineering behind it that differentiates the professionals from the rest, and the challenge will be one of funding for this talent and these services

Evolution of the CIO 

1. Some CIOs are purveyors of information and counsellors to the King – the one who knows where to find the answers and anticipates the questions. This type will always be needed and will strengthen themselves with data science and big data. They're also the type likely to move on to be CEO, or be chosen to be an acting one when CEOs need to be replaced.

2. Some CIOs are operational mavens and lubricants of process – these will make the transition to be COO.

3. Some CIOs are covert business development and strategy generals – these will enjoy the new world of social and marketing possibilities and move on to be the new generation of CMOs. Marketing is a discipline that will soon transition from art to science and when it does, it will have more IT running it. For example, Google is the world’s biggest advertising firm by revenue today.

4. Some CIOs are seen as witch doctors, the ones who understand tech and deal with those impossible and socially inept programmers. These guys are at risk of losing scope and value. Some, in highly technical industries, will always be needed and may change their title to CTO to better reflect their role.

Who will choose Technology 

1. I think it's a given that as technology becomes popular and less mysterious, many other stakeholders apart from IT will want to be part of the decision, and it will no longer be the sole domain of the CIO

2. The CIO, though, must continue to play the role of architect, making sure the company chooses technology that is efficient, relevant and secure

Herding Cats: Rethinking our change management strategy

Herding Cats 


It has been repeated ad nauseam that “working in healthcare IT is like herding cats”, a reference to the challenges faced in the change management of clinicians and other supporting actors in the provision of care. In their paper “Herding Cats: The Challenges of EMR Vendor Selection [1]”, Doctors McDowell & Michelson remind us that in the case of migrating to an EMR:

“In some instances, the process may represent only an incremental change in a partially developed computerised EMR. In other cases, it comes closer to a revolution, as it is part of a complete overhaul of a minimally computerised medical record system. In the latter circumstance, the implementation of the EMR involves much more than simply automation of preexisting processes. Strategically it requires analysis of, and change to, the underlying clinical information processes.” 

In other words, it requires a change to the actual practice of care, and naturally there will be resistance from your clinicians. That is why the authors rather cheekily allude to the herding of cats in their title.

The Value of Information 


Fundamental to the practice of medicine is the medical record. The practice had already evolved to codify knowledge, to track patients as individuals, as part of a cohort or on an epidemiological scale, and to track studies and research before the information revolution – so the question is often set up wrongly, as “do we need to go paperless?”. The real question should be: how much faster, more collaborative, more comprehensive, more simultaneously accessible, and more persistent and available do we need our medical record to be? Popular culture is saturated enough by IT for all to understand the value of information through software, so while everyone likes looking up information on a computer and the added benefits of analytics it affords, many dislike the disruptive nature of the EMR: making providers change their workflows, be more disciplined with documentation, and do things in certain methodologies or process steps. Resistance then often comes not from a hatred of screens or keyboards, but from the intrusiveness of having someone else dictate your methodology and process.

Rethinking our Change Management Strategy 


The conventional wisdom is usually to engage clinicians at the very beginning, then buy some monolithic application that does everything from billing claims to medical records, and struggle with integration to the myriad of ancillary life science software that already exists, such as medical imaging, pharmacy management & laboratory systems. What ends up happening is either a paralysis in choosing a system, or a fallout with clinicians who lost the vote and then resist the change that the chosen software will bring to their work.

I have from my experience adopted a different approach. Let me give you a high-level overview to get you thinking:

0. Build a decent IT department with real IT experts, because no matter what you choose, the foundation underlying everything is IT.

1. The Content Management Phase. Let those experts work on a data integration strategy – how to build a complete 360 view of a patient, across operations, finance and medicine, that the different stakeholders can use at the point of need to access all the information they need about the patient they're attending to. This will involve the digitisation of legacy records, from scanned images to Optical Character Recognition, and patching in the digital information that already exists. What they will end up building is a digitisation bureau and a data warehouse that will be able to provide a consolidated patient record to any application you choose to use later.

2. The Analytics & Automation Phase. Avoid talking about new workflows and processes; rather, begin by providing clinicians and operational staff more and more access to patient-centered information at their convenience, on their computers and mobile devices, in a secure and reliable fashion. Quickly churn out analytics from this data warehouse, such as basic measures of outcomes and productivity, or even commercial insights such as revenue drivers, performing departments and the efficiency of different support services.

3. The New Workflows & Standards Phase. Only change the method of input and data capture with new workflows and tools after steps 0 to 2 are established. The added benefit of having completed step 1 is that you now have a bigger selection of applications that can be used, since all of them tap the record from the common data warehouse. Many experienced healthcare people reading this will protest that this can't be done, but honestly, if step 0 is done correctly, we won't need to have this debate.
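As a minimal sketch of what the Content Management Phase produces, here is a toy consolidation of patient records from several source systems into a single 360 view. All the system names and fields are hypothetical; a real data warehouse would also handle identity matching, provenance and conflicts.

```python
from collections import defaultdict

def build_patient_360(*sources):
    """Merge per-system patient records (each a list of dicts keyed by
    'patient_id') into one consolidated view per patient."""
    consolidated = defaultdict(dict)
    for source in sources:
        for record in source:
            pid = record["patient_id"]
            for key, value in record.items():
                if key != "patient_id":
                    consolidated[pid][key] = value
    return dict(consolidated)

# Hypothetical feeds: a lab system, a billing system and digitised notes.
lab = [{"patient_id": "P1", "hba1c": 6.1}]
billing = [{"patient_id": "P1", "last_invoice": "2014-05-02"}]
notes = [{"patient_id": "P1", "scanned_note": "post-op review"}]

view = build_patient_360(lab, billing, notes)
print(view["P1"])  # one record drawing on all three systems
```

The point of doing this phase first is visible in the shape of the output: any application chosen later reads one consolidated record instead of integrating with each source system separately.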



[1] “Herding Cats: The Challenges of EMR Vendor Selection” by Samuel W. McDowell, PhD, Regi Wahl, and James Michelson, MD, in the Journal of Healthcare Information Management, Vol. 17, No. 3