The Product-Market Fit of AI Technologies
Influencing disruptions in ever-changing retail, warehouse and health industries
A strategy is a roadmap that steers an organisation towards its objectives. From marketing strategies to product roadmaps, success is cemented by how such plans are executed. Along the way, much has to be negotiated and metrics traded off against one another, yielding either a tactical remedy or a strategic panacea. While it is the team that does the work, the vision of the leadership ought to encompass and adapt to the various weathers of corporate ups and downs. This whole process, more than the technical façade of a product, determines whether you become a unicorn or whether the company reaps dividends from its strategy. It defines whether you have revenues flowing in or you are a forgotten start-up that failed to attract funding, met an early demise through poor product-market fit or did not transform quickly enough to the needs of the market. Whether you are a start-up or a billion-dollar company, your success hangs by a thread on the delicate balance of leadership, vision, operations, technology and, ultimately, the product.
Artificial Intelligence (AI) strategies are convenient to write; operationalising them is not. In a series of musings, I intend to touch upon how digital strategies can be aligned with business strategy. Specifically, I will cover AI use cases and machine learning theory, organisational culture (talent, structure and ways of working), assets (technology, data, capabilities) and ecosystem (partnerships and M&A). In this article, I broach three industry segments that have been creating a stir: retail, warehousing and health. We will discuss how machine learning technologies, such as computer vision (CV), natural language processing (NLP), reinforcement learning (RL) and robotics, can influence the disruptions in these industries.
With COVID-19 disrupting several industries, it is reflection time for sectors like retail, which have invariably suffered from punctuated lockdowns and reduced footfall. Here, technological disruption is inevitable for the survival of brick-and-mortar stores. The consumer's dilemma of why they should bother going to a retail storefront instead of buying commodities online is going to become even more acute. AI technologies will prove to be a game-changer in reviving the retail industry (see Artefact 1).
For a start, consumers need the same personalised experience in-store as they have with an online retailer like Amazon, Walmart or Kroger. As soon as I walk into a store, a computer vision system recognises me, a virtual agent on my phone geolocates me and recalls all of my favourite purchases, and the visit starts to mimic an online shopping experience. With computer vision technologies getting more sophisticated every month, use cases can be as simple as determining the perfect size for a dress, spotting shoplifters by their uncanny moves or identifying shelves that need restocking. NLP, on the other hand, lets us build knowledge graphs of our customer base. Gone are the days of vanilla customer segmentation: customer spend has become dependent on ensemble behaviour, Instagram and YouTube influencers have popped up in hordes, and so have the digital marketing strategies of the retail industry. It takes the full weight of NLP to understand such a customer base. These use cases fall into the perceive and analyse buckets of an AI value chain; action, in the form of recommendation, is the third strand. Here, as soon as I buy items at a high-street grocer, should I be nudged to visit the nearby store donning haute couture, or the wine merchant that is part of my weekly ritual? This is the classic trade-off between exploration (go to a new store) and exploitation (stick with one's risk-averse habit) witnessed in reinforcement learning. It is about time the offline sibling of online recommendations became the lifeblood of the physical store.
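To make the exploration-versus-exploitation trade-off concrete, here is a minimal epsilon-greedy bandit sketch in Python. The store names and average-spend figures are purely hypothetical illustrations, not data from any retailer:

```python
import random

def epsilon_greedy(avg_reward, epsilon=0.1, rng=random):
    """Pick a store to recommend: mostly exploit the best-known
    store, occasionally explore another one at random."""
    if rng.random() < epsilon:
        return rng.choice(list(avg_reward))      # explore: try a new store
    return max(avg_reward, key=avg_reward.get)   # exploit: stick with the habit

# Hypothetical average spend (reward) per recommended store
stores = {"grocer": 42.0, "wine_merchant": 55.0, "boutique": 12.0}

random.seed(0)  # reproducible simulation
picks = [epsilon_greedy(stores, epsilon=0.1) for _ in range(1000)]
# With epsilon = 0.1, the bulk of recommendations exploit the
# best-known store, while a small fraction explore alternatives.
```

In a real deployment the average rewards would be updated online from observed customer behaviour, and contextual bandits or full RL would replace this static sketch.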
Just as companies have corporate social responsibility (CSR), consumers are increasingly becoming responsible: they are health-conscious, calorie-watching, buying local and going green. Ask yourself: will the meat in your bourguignon taste different when you know it is from a local farm (sustainable sourcing) rather than a supermarket with hard-to-trace ingredients? We are humans; perception matters! Can retail use this trend to reinvent itself in the post-COVID world?
As soon as a pallet arrives in a warehouse, computer-vision-aided cameras can count the parcels constituting the pallet, while an NLP-guided document ingestion system verifies that count against the purchase order that was raised. As the pallet is relocated from the landing bay to a pre-defined rack by vision-guided autonomous robots, a task scheduler using a sophisticated machine learning algorithm distributes the 'right' amount of work between humans and robots. An NLP-based speech system informs the warehouse assistant not only of his next assignment but also reminds him to pick up groceries on his way home. Video surveillance means gone are the days when you had to tap your identity card to register your presence on the shop floor. George Orwell would have been relieved to know that, thanks to action recognition, medical aid reaches a colleague on the warehouse floor when she has fallen down a staircase. Knowing how many workers and robots to schedule for a task, and dynamically load-balancing work when logistics fail between a storage warehouse and a dispatch facility, becomes the forte of bleeding-edge reinforcement learning. Predict, prevent and adapt is the slogan that will touch different facets of the warehouse of the future (see Artefact 2).
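The 'right amount of work' problem above is where RL shines, but even a simple greedy baseline conveys the idea. Below is a minimal longest-processing-time load balancer in Python that assigns each task to the currently least-loaded worker; the task durations and worker names are hypothetical:

```python
import heapq

def balance(tasks, workers):
    """Greedily assign each task (duration in minutes) to the
    currently least-loaded worker, human or robot alike."""
    heap = [(0.0, w) for w in workers]  # (current load, worker)
    heapq.heapify(heap)
    assignment = {w: [] for w in workers}
    for duration in sorted(tasks, reverse=True):  # longest task first
        load, worker = heapq.heappop(heap)
        assignment[worker].append(duration)
        heapq.heappush(heap, (load + duration, worker))
    return assignment

# Hypothetical pallet-moving tasks and a mixed human/robot crew
tasks = [30, 20, 20, 10, 10, 5]
crew = ["robot_1", "robot_2", "assistant_amy"]
plan = balance(tasks, crew)
loads = {w: sum(d) for w, d in plan.items()}
```

An RL scheduler would go further, learning to re-balance dynamically as task durations drift and logistic failures occur, rather than assuming durations are known up front.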
A lot of companies are in the unique position of being able to build AI products for the warehouse industry and litmus-test them in their own warehouses. Collaborative robots (cobots) are a burgeoning industry in their own right, with humans and robots collaborating to put together an order; in future, we will see fewer cobots and more autonomous bots that work on a task with little or no human intervention. Such autonomous bots can range from multi-agent collaborative squadrons to soft robots with increased tactile flexibility. The melting pot of robotics and decision-making, in the form of reinforcement learning, is instrumental for companies that want to be front-runners in this field. Time is of the essence, with many big technology organisations eyeing the commercial success of RL given its recent academic and engineering laurels. Going further, we need to contemplate whether supply chain operations can be organically developed as a service, in a similar vein to cloud services. The marriage of capacity planning, task execution, computer-vision-based inspection, voice bots and robots that are not only autonomous but also equipped with end effectors can go a long way towards establishing such a business model, incrementally.
As a technologist and a recent senior executive in the insurance industry, I often feel health and insurance go hand in hand. NLP-aided claims fraud detection products, and IoT (Internet of Things) devices that scrutinise your health every minute of the day to dynamically calculate your health insurance premium, are no longer things of the future. Primary healthcare systems and insurers have been deploying virtual assistants, in the hope of reducing the involvement of expensive radiologists and general practitioners so that low-risk ailments can be handled by non-human bots. In a similar vein to the retail and warehouse industries, artificial intelligence has touched the health industry at its core: from fundamental work on predicting protein folding, generative-model-based drug discovery and vision-aided radiology and microscopy, to virtual bots that can, with some likelihood, indicate the ailment we are suffering from or might suffer in future. Predictive analytics is slowly mutating into prescriptive analytics, where automated reasoning technologies are fed insights from computer vision and NLP-based analysis. Reasoning based on post-diction with generative models is helping us understand the repercussions of medical intervention. Augmenting the intelligence of an already specialised doctor through such reasoning algorithms is the dream of modern medicine.
With increased pressure on the healthcare system, COVID-19 is catalysing the development of robotic surgery. Routine surgeries that have been postponed disturb the patient-to-doctor ratio. Robotic surgeries, like their counterparts in product replenishment using robot end effectors, require a union of computer vision and robotics. A small mistake by the robotic surgeon can cost a person their life, which enforces the construction of a precise and fault-tolerant automated system.
As a variety of companies establish themselves in the healthcare vertical, they ought to build upon one fundamental pillar: operational expertise. As in a warehouse, nurses and doctors have to orchestrate tasks around a hospital; beds need to be filled as per demand, and the optimal quantity of coronavirus vaccines needs to be ordered so that the number of vaccine shots administered daily can be maximised. Logistics is also a challenge: medical supplies that have to be kept at a pre-determined temperature and handled with care require operational expertise that has already been battle-tested by a variety of organisations in other verticals.
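The vaccine-ordering question is a classic newsvendor problem from operations research: order too little and shots are missed, order too much and doses expire. A minimal empirical-quantile sketch in Python follows; the demand figures and cost ratios are hypothetical:

```python
import math

def newsvendor_quantity(demand_samples, underage_cost, overage_cost):
    """Order the empirical demand quantile implied by the cost of
    running short (underage) versus wasting doses (overage)."""
    critical_ratio = underage_cost / (underage_cost + overage_cost)
    samples = sorted(demand_samples)
    index = math.ceil(critical_ratio * len(samples)) - 1
    return samples[max(index, 0)]

# Hypothetical daily vaccine demand observed over two weeks
demand = [80, 95, 100, 110, 90, 105, 120, 85, 100, 115, 95, 100, 110, 90]
# Missing a shot is assumed costlier than wasting a dose
order = newsvendor_quantity(demand, underage_cost=3.0, overage_cost=1.0)
```

Because the underage cost dominates, the critical ratio is 0.75 and the model orders towards the upper end of observed demand, which matches the intuition that a missed vaccination is worse than a wasted dose.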
Regardless of industry, our success rests on innovation, surgical execution of strategy and, most importantly, customer adoption.
I intend to write posts that are either (a) of a mathematical nature (like the one on differential geometry) or (b) about commercial use cases of machine learning (like the current post). I will try to interleave one of each to keep the blog a tad balanced for different audience segments.