Big Data: Lessons Learnt

How the pandemic exposed both sides of Big Data – hits and misses

The COVID-19 pandemic proved to be a touchstone for Big Data and Artificial Intelligence (AI). A wide range of analytics technologies was marshalled in the medical response to the virus – in hospitals, in lockdown areas, in the home, and in laboratories. AI was used extensively by governments and health organizations for diagnostics and to help contain the spread of COVID-19. However, not everything was a hit, and AI failed in many ways too.

We are currently on the threshold of exponential growth in data volume, a large part of which will come from mobile devices worldwide. It is estimated that by 2026, more than 6 billion people will be consuming over 226 EB (exabytes) of mobile data per month via smartphones, laptops and a multitude of new devices. Big Data feeds the analytics engines, so the two are inevitably connected. The key requirement is the availability of very large volumes of data to build the forecasting models that support proactive, real-time decisions.
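For a sense of scale, that estimate works out to just under 38 GB per user per month. A quick back-of-the-envelope check (a minimal sketch; treating an exabyte in decimal units, 10^9 GB, is an assumption about the estimate's units):

```python
# Back-of-the-envelope check of the 2026 mobile-data projection:
# 226 exabytes per month shared across ~6 billion users.

EXABYTE_GB = 1_000_000_000  # 1 EB = 10^9 GB (decimal units, assumed)

total_eb_per_month = 226
users = 6_000_000_000

gb_per_user_per_month = total_eb_per_month * EXABYTE_GB / users
print(f"~{gb_per_user_per_month:.1f} GB per user per month")  # ~37.7 GB
```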

Because the virus was a novel one, there were no existing data to feed the AI algorithms, and this is where AI failed. Most AI applications learn from large amounts of data, and with a "novel" virus, this information was in short supply. What was available did not provide an accurate picture, and that picture changed over time as more information was collected and correlated. This once-in-a-century event proved to be a reality check for Big Data and for the expectation that AI would be the holy grail for all our problems – an extremely important realization.

The pandemic was thus a litmus test for Big Data experts all over the world. However, eight months into the pandemic, there were few success stories to show. It is now clear that we cannot accurately track a contagious disease in real time, nor accurately predict where it is headed.

Why Big Data Analytics failed

  • First, as people were unsure about what to expect from Big Data Analytics (BDA), the target was vaguely defined. Even experts sometimes forget their limitations in handling so much data, overestimate their capacity, and try to answer too many questions.
  • Second, data for such a purpose need to be accumulated globally to identify geographical hotspots and make predictions. But that involves an impossible number of variables, and there has always been a lack of coordination in collecting and combining data across countries. Privacy and security laws in different countries were also a concern.
  • Third, there is no denying that too much useless data is collected. This is a general problem, possibly due to growing ambition and an overestimation of statistical and technological capacity.
  • Fourth, routine software packages are never adequate for analyzing big data, and their results are often incorrect. The added disadvantage in modelling such a pandemic is that nobody knows the exact dynamics of the disease. Existing epidemiological models from past epidemics were pressed into service for predictions (see the sketch after this list), so failure was bound to happen.
  • Fifth, current computational equipment is inadequate for handling millions of variables and billions of data points; a new data engine is needed, and quantum computing is expected to be that engine. According to analyst firm TBR, 2021 would witness significant discoveries that push quantum computing to the forefront – and COVID-19 may just provide the right boost.
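
To see why models borrowed from past epidemics disappoint with a novel virus, consider the classic SIR compartmental model referenced in the fourth point. The sketch below is a minimal, illustrative simulation (the parameter values are assumptions, not COVID-19 estimates); it shows how a small error in the transmission rate, precisely the quantity unknown for a new pathogen, swings the predicted peak enormously.

```python
# Minimal SIR (Susceptible-Infected-Recovered) simulation, the textbook
# epidemiological model referenced above. Parameter values are illustrative
# assumptions, not COVID-19 estimates; the point is how sensitive the
# forecast is to them when a virus is too new for data to pin them down.

def sir_peak(beta, gamma, population=1_000_000, infected0=100, days=120):
    s, i, r = population - infected0, infected0, 0
    peak = i
    for _ in range(days):
        new_infections = beta * s * i / population  # daily new cases
        new_recoveries = gamma * i                  # daily recoveries
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        peak = max(peak, i)
    return peak

# A modest shift in the transmission rate changes the peak dramatically:
for beta in (0.20, 0.25, 0.30):
    print(f"beta={beta:.2f} -> peak infections ~{sir_peak(beta, gamma=0.1):,.0f}")
```

With a ten-day infectious period (gamma = 0.1), nudging beta from 0.20 to 0.30 roughly doubles the predicted peak. When neither parameter is known for a new virus, the forecast is little better than a guess.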

Where Big Data scored

  • In response to the pandemic, over 500 clinical trials of potential COVID-19 treatments and interventions began worldwide. These trials drew on a living database that compiled and curated data from trial registries and other sources, helping medical and public health experts predict disease spread, find new treatments and plan for clinical management. Big Data and analytics, combined with AI technologies, were paramount for responding proactively on a global scale.
  • The healthcare and life sciences vertical has seen the application of Big Data in hospital management, care delivery, workforce planning, cost-structure optimization, insurance claims processing, device monitoring, and improving care through better customer service. IBM, SAS, Splunk, Hitachi Vantara (previously Pentaho), and SAP are some of the key technology providers with clients in this space.
  • US pharmaceutical giant Pfizer, whose COVID-19 mRNA vaccine was approved for emergency supply by the UK health authorities, has been using Big Data extensively to drive its research and drug-discovery initiatives. Whether through sensors that deliver health data to clinicians or mobile apps that support patients through treatments, Pfizer is leveraging new methods of engagement and data capture. It has invested nearly $1.8 billion towards these efforts and signed strategic partnerships with leading AI providers such as XtalPi, CytoReason and IBM.

Governments used Big Data extensively

Governments too relied heavily on BDA during the pandemic to find ways to track, trace and diagnose the infected. In Taiwan, for example, BDA helped integrate the health insurance database with the immigration and customs database, allowing authorities to track infected patients, make early diagnoses and successfully reduce the spread. In China, the pandemic drove companies such as Huawei, Alibaba and Baidu to accelerate BDA and AI innovation in the healthcare sector to fight COVID-19. South Korea leveraged Big Data to determine how many test kits needed to be produced to meet demand. Contact tracing also played a major part in curbing the spread of COVID-19, particularly in East Asia.
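
Mechanically, the Taiwan example boils down to joining records from separate agencies on a shared identifier. The sketch below illustrates the idea with pandas; the table layouts, column names and values are invented for illustration and do not describe Taiwan's actual systems.

```python
import pandas as pd

# Hypothetical records: schemas and values are invented for illustration,
# not a description of Taiwan's actual government databases.
health_claims = pd.DataFrame({
    "citizen_id": ["A01", "A02", "A03"],
    "recent_symptoms": ["fever", "none", "cough"],
})
travel_records = pd.DataFrame({
    "citizen_id": ["A01", "A03"],
    "arrived_from": ["high-risk region", "low-risk region"],
})

# Integrating the two databases on a shared identifier lets clinicians see
# travel history alongside symptoms at the point of care.
merged = health_claims.merge(travel_records, on="citizen_id", how="left")
flagged = merged[(merged["recent_symptoms"] != "none")
                 & (merged["arrived_from"] == "high-risk region")]
print(flagged)  # symptomatic patients with recent high-risk travel
```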

China’s surveillance culture, too, depended on AI in response to COVID-19. Automated thermal scanners in train stations measured body temperatures. Any person with a high temperature was detained by health officials and tested for the virus. If the test came back positive, every other passenger was alerted so they could quarantine themselves.

China also used its millions of security cameras – generally installed for security and tracking citizens – to enforce quarantine orders. Mobile phone data was also used to track movements.

The pandemic's push to Big Data

Data and analytics have always played a critical role in business decision-making, but their importance was thrown into sharp relief in 2020. COVID-19 gave senior executives and decision-makers a vivid reminder that agile, data-driven decision-making enables enterprises to respond to fast-changing market conditions and competitive threats.

Per analyst firm Frost & Sullivan, market revenue from BDA reached $14.85 billion in 2019 and was expected to expand at a CAGR of 28.9% to reach $68.09 billion by 2025. The market is driven by organizations realizing the operational advantages of using BDA to make better-informed decisions and improve data preparation, and by the ability of data discovery and visualization (DDV) tools to help organizations better target consumers and measure the effectiveness of their marketing campaigns.
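
The two figures are internally consistent: compounding the 2019 revenue at 28.9% for the six years to 2025 lands almost exactly on the projection, as this quick check shows.

```python
# Verify the Frost & Sullivan projection: CAGR compounds as V = P * (1 + r)^n.
principal = 14.85   # $B, 2019 revenue
rate = 0.289        # 28.9% CAGR
years = 6           # 2019 -> 2025

projected = principal * (1 + rate) ** years
print(f"${projected:.2f}B")  # ~$68.1B, matching the $68.09B projection
```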

While some enterprises are retrenching, others have seized on the outbreak as an opportunity to accelerate transformational change. Technology priorities driven by the C-suite amid coronavirus include improving efficiency and cost-cutting, understanding evolving business processes, and accelerating digital transformation. All of these rely on data, along with agility and pragmatism, to make rapid decisions as socioeconomic conditions evolve.

Companies like Amazon Web Services, Google Cloud and others offered researchers free access to open datasets and analytics tools to help them develop COVID-19 solutions faster. Verizon launched a big data coronavirus search engine – an open-source program that searches over 50,000 academic articles and surfaces highly relevant COVID-19 research.
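
The mechanics of such a literature search engine can be approximated with standard text retrieval. Below is a minimal sketch using TF-IDF and cosine similarity; the three stand-in abstracts are invented placeholders for the roughly 50,000 indexed articles, and this is not the actual program's code.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Stand-in corpus; a real system would index ~50,000 article abstracts.
abstracts = [
    "Clinical outcomes of remdesivir in hospitalized COVID-19 patients",
    "Transmission dynamics of SARS-CoV-2 in household settings",
    "mRNA vaccine efficacy against novel coronavirus variants",
]

# Turn each abstract into a TF-IDF vector, ignoring common English words.
vectorizer = TfidfVectorizer(stop_words="english")
doc_vectors = vectorizer.fit_transform(abstracts)

# Rank documents by cosine similarity to the user's query.
query_vector = vectorizer.transform(["vaccine efficacy coronavirus"])
scores = cosine_similarity(query_vector, doc_vectors).ravel()
for score, abstract in sorted(zip(scores, abstracts), reverse=True):
    print(f"{score:.3f}  {abstract}")
```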
