All posts by admin

5 Tips to Select the Best PG Diploma in Management in 2021

Do you want to become a business consultant or strategist? Do you aspire to climb the corporate ladder effortlessly and become a renowned manager in your company? Do you wish to start your own business and be a successful entrepreneur? A PG diploma in management can prepare you with the knowledge and skills to succeed in these roles. A PGDM program is a great way to start and fast-track your career, or to manage and grow a business.

For those looking to explore the many aspects of business and management, a PG diploma in management program is the gateway to a world of opportunities. So, here are a few tips to select the best PG diploma in management in 2021.

How to select the best PG Diploma Management course in 2021

Be clear on your career goals

A diploma in management is one of the most sought-after programs of all time. Selecting the best PG diploma in management in 2021 helps you prepare for a career path in business that may stretch across different industries or sectors. There are many roles and specializations to choose from, offering a wide range of career options and employment opportunities across different sectors. So, you need to be clear on your career goals and on the role you would want to specialize in. As a top business school in India, Praxis offers specialization opportunities in traditional streams like Finance, Marketing, HR, Systems, and Operations. Thus, being clear on your goals is the most basic step to follow before selecting the best PG diploma in management in 2021.

Select the course with experienced trainers

The next step after you have defined your career goals is to pick the medium for your learning. Although online courses have become extremely popular and more institutes and companies now offer them, traditional classes remain better suited to young learners who are yet to join the workforce. Classroom-based learning offers more ways to interact and helps students gain a deeper understanding of the curriculum, with longer-lasting retention of the material. In short, nothing can beat the effectiveness of traditional learning. Also, make sure that the trainers for the course are well experienced so that you have a better learning experience.

Make sure the course content is in response to the changing demand of the industry

This is one of the most important things to keep in mind while selecting the best PG diploma in management in 2021. The best PG diploma in management in 2021 is the one that offers both practical and theoretical learning experience. Praxis offers extensive classroom training that trains the students to take up industry-specific roles in response to the changing demands of the industry. In addition to classroom training, the Praxis philosophy requires that there is intense industry interaction at various levels, in terms of live projects, site-visits, hands-on experience, and visiting faculty lectures and workshops.

Mind the placement statistics

Placement statistics are one of the important things you need to look at before finalizing a diploma in management. This is a crucial step that decides your future. Only institutes with good placement statistics will help you secure your future. Selecting a reputed institute like Praxis, which has excellent placement statistics, will help you get placed in a good company with a good salary package.

Connect with Alumni

Speaking to alumni about the institute and course is the best way to know in depth about the PG diploma in management at that institute. Connecting with alumni helps you learn everything about the institute, course, tutors, learning environment, the practical ratio of the curriculum, fees, placement records, etc. Praxis’s PGPM alumni now hold top positions in prestigious companies including Vodafone Idea Limited, Standard Chartered Bank, Kellogg’s, HSBC global analytics, and VISA.

As a premier business school in India, Praxis offers a 2-year fully residential PGDM program. This program aims to combine the art and science of theoretical learning with the virtues of practical training. The program is approved by the All India Council for Technical Education (AICTE, Ministry of Human Resource Development, Government of India). The program at Praxis is, on the one hand, rooted in the principles of academic rigor and discipline, and, on the other, designed to offer multiple touch-points with the industry. We also have a well-structured campus placement program that ensures interview opportunities with the most significant companies in the field.

Image by: https://pixabay.com/users/stocksnap-894430/

Do Data Reveal All?

Competing trial regimens for the COVID vaccine reveal new perspectives on the philosophy of data analysis

The COVID-19 vaccines have finally been launched. After a frenzied race against time and market, a handful of pharma companies came up with three or four effective vaccine candidates on which civilization is pinning its hopes. The process was not without its share of controversies, and a lot has circulated in the media about how several stages might have been compressed to meet near-impossible deadlines dictated by policymakers.

To dispel doubts and worries, all manufacturers have been highlighting data derived from trial findings in support of the efficacy rates of their candidate vaccines. There was a slew of such announcements in November 2020, as preparations for marketing the vaccines were going on in full swing.

Data is the only reliable index in any scientific research, and in drug trials its role is crucial. While any data on the COVID-19 vaccines would be epoch-making, the current trial regimens have revealed new perspectives on the philosophy of data analysis. And the way the efficacy numbers were presented in the press releases can provide valuable lessons for managers for whom data-based decisions are part of the daily grind. Let us consider the top three insights gathered in the process.

  1. The size of the database may not have any bearing on analytics

Pfizer and BioNTech announced in early November that their vaccine candidate had displayed over 90% effectiveness in randomized controlled trials. As per the data released by the two firms, more than 43,000 volunteers from various backgrounds had taken part in the trials – which looked like a convincing figure covering a wide sample population. The rate of effectiveness was also remarkable, because WHO guidelines require just a 50% success rate for a vaccine to be termed effective. Thus, more than 90% effectiveness across 43,000 people looks like a piece of data that indicates great success. However, data interpretation is really not that straightforward, as our discussion will reveal.

Let us consider how the efficacy percentage was calculated. The following steps were involved:

  1. Counting how many in the group actually vaccinated contracted the COVID-19 infection.
  2. Counting how many in the placebo group contracted the COVID-19 infection.
  3. Dividing the first count by the second.
  4. Subtracting the quotient from 1. The result is the efficacy rate.

In the Pfizer-BioNTech trials, 8 vaccinated people contracted the infection, compared to 86 in the placebo group. Thus, 8/86 = 0.093 – which, subtracted from 1, comes to 0.907, or 90.7%. That 0.7 stands for the “more” part in “more than 90% effective”.
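For readers who want to replicate the arithmetic, here is a minimal sketch in Python. It assumes, as the calculation above does, that the vaccinated and placebo arms are of roughly equal size; the function name is ours, not Pfizer’s.

```python
def vaccine_efficacy(cases_vaccinated: int, cases_placebo: int) -> float:
    """Efficacy = 1 - (cases in the vaccinated arm / cases in the placebo arm),
    a reasonable approximation when both arms are roughly the same size."""
    return 1 - cases_vaccinated / cases_placebo

# The interim Pfizer-BioNTech figures cited above: 8 vs 86 confirmed cases
print(f"{vaccine_efficacy(8, 86):.1%}")  # prints 90.7%
```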

Let us not forget that in the above calculation the total sample size – however large it might be – had no bearing on the efficacy rate. It was all about the number of persons who actually got infected. In this study, we were dealing with a total of 94 (8 + 86) confirmed cases among 43,000 people. The efficacy of the vaccine would have turned out exactly the same if the number of infected people had remained the same – and been distributed likewise among the vaccinated and placebo groups – in a total sample size of just 200 people!

Of course, the findings still hold good. A ratio of 8/86 in a randomized controlled trial is nearly impossible merely due to chance factors – it had definitely been achieved through the vaccine. We just intend to highlight the fact that a huge sample size may look and sound reassuring, but in reality it may not have any mathematical relevance to the outcome.

  2. Exact numbers may not lend extra credence to data

Into the second week of November, it was the turn for the Moscow-based Russian manufacturer of the Sputnik V vaccine to release their trial results. The Gamaleya National Research Centre for Epidemiology and Microbiology announced an efficacy rate of 92% in a trial involving 40,000 volunteers.

Close on its heels, US pharmaceutical major Moderna declared trial results for their vaccine candidate too – it was 94.5% effective in a trial involving a sample size of over 30,000 people.

In both these announcements, the point to be noted is the exactitude of expression: 92% and 94.5% – not over 90 or above 94 – unlike “more than 90%” as in the Pfizer-BioNTech announcement. It is a general human tendency to assume that anything precise and exact is closer to “the truth” or “the ultimate”. Thus, announcements with precise percentages might carry greater credence and reliability, whether or not that is warranted.

While this in no way undermines the scientific achievements of Gamaleya or Moderna, it goes to confirm that “how” numbers are presented can flavour the message being communicated. In business communications, and especially in promotional material, this strategy is routinely employed. However, precision is not always synonymous with perfection – but it can very well serve as a tool for persuasion. So much so that in our present example, the Belgian newspaper De Standaard went ahead to write that “the candidate vaccine of the American biotech company Moderna works even better than that of Pfizer” – although no such claim had ever been made by the company itself.

Too much precision can be overwhelming; precise figures impress and dazzle. That can cloud the human ability to interpret data in its correct perspective, which is crucial for making informed decisions. A far more balanced approach while dealing with estimates is to make a clear distinction between precisely reported numbers and high-quality data – judging both on merit and in context.

  3. Better not to work your way backwards while analysing data

In the last week of November 2020, AstraZeneca came forward with findings from their own vaccine trials. Their sample size involved more than 11,000 people and the efficacy rate of their vaccine was announced to be 70%. This might surprise us because it sounds too low when compared to the previous figures. However, the AstraZeneca study used two different dosing regimens, out of which the half-dose regimen administered to a subset of 2,741 participants, proved to be 90% effective – nearly at par with the other candidates we had mentioned.

But does it really? AstraZeneca later confirmed that the 90% estimate was based on 33 reported cases of infection in the half-dose group, of which 3 had received the actual vaccine and the other 30 were on placebo. Overall, the AstraZeneca trial reported 131 confirmed infections – leading to the 70% efficacy.

Further revelations emerged: it turned out that the half-dose regimen was not intentional but rather an inadvertent mistake by a participating partner. Moreover, AstraZeneca admitted that they had combined results from two differently designed trials in two geographies – the UK and Brazil – which, again, is not standard practice. It appears that the company had tried their best to proceed with the analysis despite the bloopers and salvage the situation.

This situation is a perfect example of a dilemma that data analysts often face. They can either formulate a hypothesis and then gather data and analyse it to test whether the initial hypothesis holds true, or they can first collect whatever data is available and then structure a hypothesis based on the analysis findings. For scientific experiments, the first approach is more or less universally followed. The second approach can radically increase the chances of false positives, leading to erroneous conclusions. This latter approach is more tempting, but it leaves the door wide open to chance and will not hold its ground against the time-tested former approach.
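To see why the second approach inflates false positives, consider a minimal simulation sketch. This is our own illustration, not drawn from any of the trials above: we assume a completely ineffective “vaccine” and then go looking, after the fact, for a subgroup in which it appears at least 50% effective.

```python
import numpy as np

rng = np.random.default_rng(0)

def chance_of_spurious_subgroup(n_trials=2000, n_subgroups=20,
                                group_size=500, infection_rate=0.05):
    """Simulate trials of a vaccine with NO real effect. For each trial,
    inspect many post-hoc subgroups and report how often at least one of
    them looks '50% or more effective' purely by chance."""
    hits = 0
    for _ in range(n_trials):
        for _ in range(n_subgroups):
            cases_vaccinated = rng.binomial(group_size, infection_rate)
            cases_placebo = rng.binomial(group_size, infection_rate)
            if cases_placebo > 0 and 1 - cases_vaccinated / cases_placebo >= 0.5:
                hits += 1
                break
    return hits / n_trials

# With 20 post-hoc subgroups of 500 people per arm, a useless vaccine still
# 'works' in some subgroup in roughly a fifth of the simulated trials.
print(chance_of_spurious_subgroup())
```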

5 Tips to Select the Best Cyber Security Course in 2021

The coronavirus pandemic has not only disrupted the global economy and social and political systems but also created an enormous challenge for businesses worldwide. Dependency on digital tools and digital infrastructure has heightened as people and employees try to stay connected to each other. But with the explosion of digital technologies and tools, the risk of malicious cyber-attacks increases day by day. As companies continue to support remote working conditions, cybersecurity remains one of the enterprise’s highest priorities.

For those looking to advance their cybersecurity careers or break into the field, taking a cybersecurity course is the gateway to enter the world of opportunities. So, here are a few tips for choosing the best Cyber Security course in 2021.

5 Tips to select the Best Cyber Security Course in 2021:

Define your career goals

Cybersecurity is one of the hottest industries in the world right now. To defend against cyberattacks and security breaches, top organizations are willing to pay a lot for cyber analysts who can protect their data. There are a lot of roles and specializations in the industry. Some of the highest-paid roles in cybersecurity include cryptographer, information security officer, security assessor, security engineer, penetration tester, forensics expert, and security administrator. So, it is important to be clear on your career goals and the role you would want to specialize in. This is the most basic step you need to follow before choosing the best cybersecurity course in 2021.

Shortlist the courses with the best trainer

After you’re clear on your long-term goals, you need to pick a medium for learning. Although in today’s digital age, most of the students prefer online courses for learning cybersecurity, nothing can beat the effectiveness of traditional classroom learning. Offline learning has a certain edge over online courses and the best way to learn cybersecurity would be through an offline course. Also, you need to make sure that the trainers and tutors have the right experience and expertise. This way, the cybersecurity course you take would be much more effective and you can learn easily.

Make sure the course content is updated to the latest industry standards

This is one of the most important things to keep in mind while selecting the best cybersecurity course in 2021. You need to find a course that not only covers the theoretical aspects of the concept but also includes practical sessions for the best learning experience. The course should also be regularly updated to the latest industry standards so that you’ll be equipped with the latest technology and industry knowledge. That is what Praxis Business School endeavors to bring to you – the top cybersecurity course that combines the art and science of theoretical learning with the virtues of practical training.

Check the placement statistics

At this point, you would probably have finalized the list of institutes and cybersecurity courses that are the best of the best. Now comes a crucial step: you need to check the placement statistics and reviews of those institutes. Taking a cybersecurity course at a reputed institute like Praxis, which has excellent placement statistics, will help you get placed in a good company with a good salary package. So, make sure that you check the placement reviews before applying for a cybersecurity course at any institute.

Speak to Alumni

Speaking to alumni about the institute and course is the best way to know in depth about the cybersecurity course at that institute. Connecting and interacting with the institute’s alumni can help you learn everything about the course – course details, the practical ratio of the curriculum, mentor experience, learning process, and placement records. This way you can select the best cyber security course in 2021.

As a premier business school in India, Praxis offers a 9-month Full-Time Post Graduate Program in Cyber Security (PGP CS), delivered from a state-of-the-art dedicated facility at Salt Lake Sector 5 in Kolkata, with an integrated high-speed internet-enabled advanced cybersecurity lab and fully equipped digital classrooms. Praxis has forged extensive industry partnerships with CISCO, Fortinet, ISACA (Kolkata Chapter), British Standards Institution, and Infosec Foundation to make the program relevant and effective. We also have a well-structured campus placement program that ensures interview opportunities with the most significant companies in the field.

5 Tips to Select the Best Data Science Course in 2021

Data science has evolved exponentially over the past decade and has now become the backbone of many organizations across the world. Despite having begun their careers with a basic idea of what the industry is all about, most professionals are still left with an instinct to further hone their skills. As a newbie, you might bump into too many options, making it difficult to figure out which data science course to pick.

So, you want to select the best data science course, but you’re not sure which area of expertise is right for you? Here are a few tips for choosing the best data science course in 2021 for you.

Tips to select the best data science course in 2021:

1. Define your goals

Data science is a growing industry and there are a lot of roles and specializations in the industry. Some of the growing roles include data engineer, data analyst, data visualization expert, machine learning expert, data architect, etc. So, it is important to know what you’re looking for. This is the most basic step you need to follow before choosing the right data science course for you.

Check our article on: Skills Required for a Data Scientist Job in 2021

2. Consider the experience of the institute or the trainer

After you’re clear on the roles, you need to pick a medium for learning. Today, most students prefer online learning. But offline learning has a certain edge over online courses since it provides the authentic experience of classroom learning. So, after you’ve shortlisted the courses, check the experience of the institute or trainer beforehand, as it helps you gain the right industry knowledge.

3. Ensure that practical sessions are included

You need to find a course that not only covers the theoretical aspects of the concepts but also includes practical sessions for a holistic data science learning experience. It is crucial to practice the concepts you’re learning along the way so that you pick up skills faster. That is what Praxis Business School endeavours to bring to you – a data science course that combines the art and science of theoretical learning with the virtues of practical training. This way, you’ll learn things faster and be industry-ready in no time.

4. Check the Placement Reviews

By now, you would probably have finalized the list of data science courses that pique your interest. Now comes one of the most important steps: you need to check the placement status and reviews of the institute. Taking a data science course at a reputed institute like Praxis, which has excellent placement reviews, will help you get placed in a good company. So, make sure that you check the placement reviews before applying for any course.

5. Speak to Alumni 

This is one of the most effective ways to know in-depth about the data science course and the institute. Interacting with the institute’s alumni can help you gain the best insight into the course. You’ll know everything you need to know about the course details, the practical ratio of the curriculum, mentor experience, placement record, and the learning process. This way you can zero in on the data science course that best suits your goals. 

 

To make students and freshers industry-ready, Praxis is providing a specialized data science course. As a premier business school in India, Praxis offers a 9-month full-time post-graduate program in Data Science. With our vast experience in business education, we offer students both the time to understand the complex theory and practice of data science concepts and the guidance from knowledgeable faculty who are available on campus for mentoring. We also have a well-structured campus placement program that ensures interview opportunities with the most significant companies in the field.

 

DALL-E: From Caption to Image

What connects a twentieth-century surrealist artist, a Pixar-animated film from 2008 and an AI-backed neural engine? The richest man in the world, of course.

Salvador Dali was a surrealist painter born in Spain in 1904. A little over a century later, in 2008, Pixar Animation Studios, a subsidiary of the Walt Disney Studios, released an animated film about a lonely robot left behind on Earth, called Wall-E. Thirteen years later, the Elon Musk-backed AI laboratory OpenAI has brought the two together in ‘DALL-E’ – a portmanteau formed by the juxtaposition of ‘Dali’ and ‘Wall-E’.

The Dali of the AI world

DALL-E is essentially a piece of software from the labs of OpenAI that manages to generate images from a short caption alone. The neural engine behind the application is a 12-billion-parameter model trained on a large dataset of images paired with their captions, drawn from the wide abyss of the internet. The images it generates are quirky and innovative, ranging from armchairs in the shape of avocados to baby radishes in tutus walking dogs. In a blog post published very recently, OpenAI writes: “We’ve found that it [DALL-E] has a diverse set of capabilities, including creating anthropomorphized versions of animals and objects, combining unrelated concepts in plausible ways, rendering text, and applying transformations to existing images.”

Source: OpenAI blog

The ingenuity behind DALL-E lies in the fact that it is the first neural engine of its kind that can coherently generate images while relying solely on text inputs. While there are several AI or machine learning-based image and video generators in the market, none has the ability to produce images from captions alone. In general, the production of synthetic images and videos has gained much popularity over the recent past – leading to the creation of several ‘deepfakes’, for example. These generally use Generative Adversarial Networks (GANs), which employ two neural networks to carry out their processes.

According to a report from CNBC USA, “OpenAI acknowledged that DALL-E has the ‘potential for significant, broad societal impacts,’ adding that it plans to analyze how models like DALL-E ‘relate to societal issues like economic impact on certain work processes and professions, the potential for bias in the model outputs, and the longer-term ethical challenges implied by this technology.’”

OpenAI on a roll

The release of DALL-E from the house of OpenAI comes only a few months after the launch of GPT-3, currently regarded as the world’s most advanced natural language processing (NLP) AI software. GPT-3 is a language-generation tool capable of producing high-quality, human-like text – even impressively writing its own news articles, short stories and poetry.

OpenAI writes: “Like GPT-3, DALL-E is a transformer language model. It receives both the text and the image as a single stream of data containing up to 1280 tokens and is trained using maximum likelihood to generate all of the tokens, one after another. This training procedure allows DALL-E to not only generate an image from scratch, but also to regenerate any rectangular region of an existing image that extends to the bottom-right corner, in a way that is consistent with the text prompt.” As an extension of the GPT-3 engine, DALL-E is an adept Text-to-Image system that has been trained not just on text, but on images as well.
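A toy sketch of what “a single stream of data” means in practice is given below. This is purely illustrative and is not OpenAI’s code: the image vocabulary size, the stand-in model and the caption tokens are all our assumptions; only the 1280-token budget comes from the passage quoted above.

```python
import numpy as np

rng = np.random.default_rng(0)
IMAGE_VOCAB = 8_192      # assumed size of the image-token vocabulary
MAX_TOKENS = 1280        # combined text + image budget quoted by OpenAI

def toy_next_token_distribution(stream):
    """Stand-in for the trained transformer: returns a probability
    distribution over the image vocabulary given the tokens so far."""
    logits = rng.normal(size=IMAGE_VOCAB)
    weights = np.exp(logits - logits.max())
    return weights / weights.sum()

def generate_image_tokens(caption_tokens, n_image_tokens=1024):
    """Autoregressive decoding: caption tokens and image tokens share one
    stream, and each image token is sampled conditioned on everything
    generated before it, one token at a time."""
    stream = list(caption_tokens)
    budget = min(n_image_tokens, MAX_TOKENS - len(stream))
    for _ in range(budget):
        probs = toy_next_token_distribution(stream)
        stream.append(int(rng.choice(IMAGE_VOCAB, p=probs)))
    return stream[len(caption_tokens):]   # image tokens, to be decoded into pixels

caption = [12, 407, 3391]                 # a hypothetical tokenised caption
image_tokens = generate_image_tokens(caption)
print(len(image_tokens))                  # 1024 tokens, e.g. a 32 x 32 grid
```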

From the standpoint of giving artificial engines creativity through the ability to coherently blend concepts together, this is a great step in the right direction. According to Neil Lawrence, former director of machine learning at Amazon, DALL-E looks like a ‘very impressive’ engine that accurately demonstrates the ability of AI-based models “to store information about our world and generalize in ways that humans find very natural.”

We have been writing captions from images for a long time; it seems the time has now come to create images out of our captions!

To find out more, you can visit: https://openai.com/blog/dall-e/

The Mathematical Road to AI

Experts feel that one major reason why AI talent is in short supply may be the lack of mathematics training in undergraduate IT courses

Artificial Intelligence (AI) is now the big craze, as AI-based applications are infiltrating every industry. Their scope is as varied as innovation itself, and the goal is always to improve business outcomes. No one doubts the phenomenal potential that AI offers. Every day, we wake up to the news of yet another exciting AI-enabled innovation being pressed into service. The time has arrived when these innovations are no longer considered niche concepts; rather, most of them are reaping rich rewards. However, despite the demand, there is a pitiful dearth of trained AI engineers all over the globe. What may be the reason behind it? Industry experts have pointed to a possibility that sounds pretty straightforward but can have far-reaching consequences – both for the AI industry and for future curricula in tech institutes.

It is now common knowledge that the requirement for AI engineers is astonishingly huge. Going by ballpark estimates, the required figures run into several millions the world over, based on the current and projected scope of Artificial Intelligence across industries. And there is need for candidates from every corner of the AI spectrum – from AI theory, to writing AI code, to developing Machine Learning algorithms and implementing them, and even building AI-compatible hardware products on which these algorithms would learn. Add to this the fact that it is not a localised requirement – every technology-aspiring nation needs each of these skills – and the demand is only going to spike in the coming years. However, the actual supply is pitifully low – perhaps a few hundred thousand AI professionals all over the globe!

Machine Learning is at the heart of AI. It involves a lot of parameters, statistics, calculus, linear algebra, and other related domains; in short, it involves mathematics of a high order. And experts fear that this, precisely, is where the problem lies. But first, let us dive deeper to understand the backdrop.

Usually, software engineers are trained in the basics of general coding in their academic days, after which they go on to pick up skills in specific programming languages and platforms, as and when required, throughout their professional lives. This lifelong learning is based on project requirements, and such upskilling is in sync with their foundational training as software programmers. This is effective and has been the typical operating procedure in resourcing for IT projects all these years. IT professionals are always ready for this, because technologies and programming languages keep evolving – and what is standard today becomes extinct a few years later when an upgrade appears. Any average programmer with formal undergraduate training can develop working knowledge of the required language in a few months.

This is where things turn different for AI skills. They are not something that can be picked up with a crash course in the relevant programming language. They are about mathematical acumen to a great extent, for which there is really no shortcut. Undergraduate computer science curricula focus on system design, coding and algorithm formulation – but understandably not on mathematics. They don’t need to, and students who aspire to become IT professionals are mostly not the ones who would be interested in profound mathematical studies. However, without mathematics there can be no serious Machine Learning programming. This gap is gradually dawning upon industry experts, and they mostly agree that the lack of AI talent is directly related to the level of mathematics skills required for it – and there will not be any quick fix for this.

Tech institutions globally are waking up to this lacuna as more and more computer science students want to learn AI and the industry demand for AI professionals skyrockets with each passing day. Undergraduates now realize that mathematics is integral to machine learning and they want the requisite training. It is now up to the educators to make the necessary changes in their course curricula. For example, Columbia University has started a data science institute where there is a judicious mix of mathematics, programming and software applications to build AI products. Data science programs with a very specific focus on training for AI and Machine Learning are gaining popularity.

Another approach that some experts suggest is splitting programming courses to cater to student interests. As Sameer Maskey – adjunct Assistant Professor at Columbia University and the Founder of FuseMachines, an advanced machine learning company that builds software robots for automated customer servicing – recently explained in a media interview: “we are starting to see… mini-programs where it might not be a two year master’s program but a one year program to sort of do hyper-focused courses on machine learning, deep learning, and computer vision, natural language processing….[T]his kind of a mix of full-on master’s programs and data science… would be good to create more talent, to build more AI applications.”

Let us hope that this trend soon picks up in the Indian academic scenario too – where, till now, we only have a handful of private institutions that offer extensive courses in Data Science.

Back to Office

A smarter office driven by data

As the COVID-19 vaccine begins to be rolled out worldwide, there’s a clear trend of employees eager to return to the office. According to a recent survey of 2,033 office workers worldwide by real estate firm JLL, despite all the ways the coronavirus pandemic has normalized working from home, three in four workers hope to return to an office at some point in the future. However, it will be a new kind of office that employees return to. The pandemic will have a lasting impact on company operations and office spaces. From autonomous cleaning devices to tighter cybersecurity measures, technologies are influencing how the office could look during the reopening process and beyond.

Health & safety key concern

Data supports the view that employees want to return to a workplace that values their health and safety. According to a study that Envoy commissioned from Wakefield Research, 94% of people would like to spend at least one day a week in the office. However, 73% of employees currently fear going into the office and 75% would consider quitting if their employers downplayed COVID risks. This is unleashing new opportunities to create smart offices embedded with sensors, collecting vital data about workplace utilization and employee health and safety, and providing the foundation for a data-driven organization.

Most research points out that the future of workspaces will be a hybrid of WFH and working from offices. A mix of in-office and remote work options is likely to maximize employee and organizational performance. Employees want choice and freedom in where they work, but few want to work outside the office exclusively. There are clear downsides to this pandemic-induced WFH period. Office workers feel disconnected from corporate culture, personal wellbeing has suffered, and employees feel that they’ve had fewer opportunities to learn, especially through informal mentoring.

The coffee-shop office

According to real-estate consulting firm Cushman & Wakefield, the workplace of the future will be an ecosystem of multiple options for workers. The first option may continue to be the core office where most learning, mentoring, team connection and collaboration occur. For many workers, their home may now be a viable second option for working on a regular basis. And workers may have the flexibility to choose third options like local community hubs (e.g., coffee shops, the local library, etc.), on-demand event spaces, co-working spaces, retail spaces and suburban “spoke” offices. These third places may appeal to employees for a variety of reasons – for example, a spoke office might be more conveniently located than the core office, and it might offer a better social outlet than home. Companies may need to help manage these options for their employees, even offer several “office pod” options, and provide the ability to book spaces on any given day.

A robust technology platform a pre-condition

This hybrid workspace model – the central office and pod work architecture – will require a robust technology platform to connect employees, partners, customers and the myriad devices making up the Internet of Things (IoT) ecosystem. From measuring employee health parameters, productivity, efficiency and infrastructure utilization to enabling remote delivery of services and providing a collaboration and communication platform, technology will be the driving force in creating this new workspace.

The entire office infrastructure will be redesigned from the entry lobby to the cafeteria to ensure employee health and safety and in the process capture vital data. As a building’s first point of contact — and first line of defence — entrances and lobbies are poised for a revamp in policies and procedures when it comes to fighting the spread of COVID-19. The criteria for authorized occupants could include pandemic-specific considerations like employee schedules, health indicators, and contact tracing.

Contact-less technologies

A handful of companies are exploring gesture detection technologies to help individuals command elevators without touching buttons. Doors using PIN entries for access can be replaced by low-touch or touchless entry. Low-touch or touchless entry using smartphones or “wave to unlock” solutions require less hassle than certain traditional methods, such as key-cards, which can also be easily duplicated. By using Wi-Fi, LTE, or Bluetooth, the latest mobile solutions can unlock doors while still in a bag or pocket.

Biometrics would be a more permanent fixture in the process of authorizing entry of employees or visitors. Given the richness of biometric data, including key health data, companies may explore anonymizing it for use cases such as planning for an upcoming flu season or triggering alerts when multiple employees come to work with elevated temperatures. However, there will be significant challenges in widely implementing this technology. User acceptance and public opinion are major obstacles, as many people view biometrics as an invasion of privacy and are reluctant to adopt the technology.

Redefining roles in a remote-first world

Overall, a reduced need for occupancy will allow some companies to downsize, decentralize, or redistribute space into smaller offices to create hubs closer to where people live.  This will also reduce commute times for many employees. An entirely remote workforce seems unlikely for most companies, with an office providing intangible benefits such as social connection, collaboration, and innovation. Additionally, there are still a variety of roles that require being physically present in an office. Beyond COVID, it’s likely that employees may have more flexible schedules and work situations that do not require them to be in the office every day.

Tech Trends: 2021 – Part 2

Geopolitics, Big Tech regulations and Blockchain would be burning issues across the technology landscape in 2021

The technology landscape for 2021 will be dominated not only by scientific advancements, but also by geopolitical currents as tensions between the US, China and Russia increase; by stricter global regulatory environments amid concerns over data privacy and unprecedentedly sophisticated cyber-attacks; and by governments’ attempts to curb the overwhelming influence of Big Tech, which threatens to rival the power of the state itself – a friction that is bound to grow in the coming years.

Geopolitical tensions

The latest round of massive state-sponsored hacking into US government institutions, including the Pentagon, and hundreds of Fortune 500 companies – allegedly carried out by Cozy Bear, the hacking arm of Russia’s foreign intelligence service, the SVR, and known as the Software Supply Chain Attack – has sparked another diplomatic row between the two countries. When US President-elect Joe Biden assumes office in mid-January 2021, he’ll have to respond with some tough measures, as otherwise he’ll be viewed as going soft on Russia.

China, however, is a different ball game altogether. The US has for several years accused it of stealing technology and of building copy-cat businesses such as Baidu, Alibaba and Tencent while disallowing US social media companies from entering its markets. The outgoing US president Donald Trump retaliated by preventing any US or global company that uses US technology to manufacture semiconductors from selling to any Chinese company. This has severely impaired China’s technological ambitions to rival the US.

The move has forced Huawei, China’s biggest telecom company, to curb its ambition to lead the mobile handset market. Far more serious is the impact on its plans to win the global 5G race. The US, the UK, several European countries and India are pulling Huawei equipment out of their networks. While Biden has not detailed a specific Huawei strategy, he has said he will put global cooperation at the centre of efforts to counter China’s tech offensive: “To win the competition for the future against China or anyone else, the United States must sharpen its innovative edge and unite the economic might of democracies around the world,” Biden wrote in a piece outlining his foreign policy in Foreign Affairs in March. He said the U.S. needs to “get tough” to counter intellectual property theft and state subsidies that give China an “unfair advantage.”

Seismic changes in Semiconductor industry – China’s big vulnerability

The semiconductor industry is in the throes of a major reset. The next five years will see new chip architectures to handle the spread of deep learning and to store and process the explosive growth in sensor-generated real-time data at the edge. There will be new programmable networking chips for 5G data centres and an intensifying drive by China to create a semiconductor supply chain free of US design and production technology by 2030. The Communist country realizes that its ambition to lead the world in technology hinges on the availability of advanced semiconductors; and this is where it is most vulnerable.

There will be breakthroughs in the use of 3D layering of integrated circuits and chipset packaging to keep Moore’s law going, without suffering from Moore’s second law (which states that the cost of a semiconductor chip fabrication plant doubles every four years). Chips modelled on the human brain’s meshing of processing and memory will address the problem of using up to 80% of a processor’s time and energy moving data to and from storage.

A proliferation of embedded micro data centres will drive increasing numbers and varieties of progressively autonomous connected IoT devices. These range from smart traffic lights and autonomous vehicles to wearable biosensors and augmented reality (AR) headsets.

Realizing that its access to US technologies will be cut off, China is now putting a multi-year counterstrategy in place to upgrade its digital infrastructure and wean itself, as far and as quickly as possible, off reliance on US software and hardware inputs. China and the US will embark on a porous, complex, and protracted decoupling over the coming decade.

A $1.4 trillion state program has been announced to support R&D in key enabling technologies over the next five years. These include semiconductors, artificial intelligence (AI), robotics, 5G and 6G, data centres and cloud, supercomputing, quantum computing technologies, and low-earth orbit satellites. However, without access to the most advanced semiconductors (5-3 nanometres) from TSMC (Taiwan Semiconductor Manufacturing Company), which fall under the US ban because TSMC uses US machinery to make the chips, China will find its plans difficult to achieve. Currently, its own semiconductor capabilities are only at 14 nanometres, far less powerful than what TSMC – which produces 50% of world supplies – manufactures.

China’s most significant chip companies, led by Huawei’s HiSilicon subsidiary and SMIC, depend on US-developed electronic design automation (EDA) software and production equipment. The latter is fundamental to any chip-making plant, including those operated by the world’s leading chipmaker, TSMC, to which Chinese companies have to turn to make their most advanced chips. The biggest obstacle is that China suffers from a severe deficit of skilled managers and engineers. The semiconductor skills shortfall is estimated at 400,000 suitably qualified professionals.

Blockchain to become mainstream

The COVID-19 pandemic has amplified the need for technologies that help improve trust in data and assets, remove operational inefficiencies from business processes and boost the resilience of supply chains – areas where blockchain/DLT (Distributed Ledger Technology) plays a key role alongside AI (Artificial Intelligence), IoT (Internet of Things) and data analytics. It should further spur market activity and adoption across industries, including the least digitized ones. For example, agri-tech start-up GrainChain and global payment giant Mastercard have recently partnered to ‘upgrade’ soft commodity markets with blockchain capabilities and boost supply-chain resilience and sustainability, as well as fair trade in these markets, across the US and Latin America, and potentially further geographies.

Blockchain/DLT will be among the topmost priorities for organizations. Although mainstream adoption is still in the early stages, more than two-thirds of companies surveyed by 451 Research indicate that blockchain/DLT will be among their organization’s top five strategic priorities in the next three years.

In 2020 there were a number of announcements about significant investments, as well as prominent partnerships and deployments across a breadth of use cases addressing critical business and industry-level challenges – all pointing to a maturing market that is gaining strategic importance. The 451 survey showed that 68% of respondents are confident about blockchain/DLT being in their organization’s top five strategic priorities in the next three years. This number is 84% among the most data-driven companies – those organizations that self-report they are making nearly all their strategic decisions based on data.

IBM and Oracle upgraded their blockchain-as-a-service platforms – both underpin some prominent live business networks – to accommodate evolving customer demand, and new platform offerings were launched, including IT service provider First Genesis’ cloud-native Xenese platform and Intellect EU’s Catalyst Blockchain Platform.

Notable partnership announcements include VMware becoming an investor in smart-contract firm Digital Asset, as well as the Eclipse Foundation teaming up with the IOTA team (IOTA is the first distributed ledger built for the “Internet of Everything” – a network for exchanging value and data between humans and machines) to drive commercial adoption of the permission-less IOTA distributed ledger protocol, in areas such as digital identity and decentralized marketplaces. Blockchain software house ConsenSys took over JPMorgan’s Quorum protocol and launched the ConsenSys Quorum brand, which consolidates the company’s enterprise Ethereum protocol technology into a single offering.

Tighter Regulations for Big Tech

Big Tech has grown too big and has been in the crosshairs of governments across the world for a while now. The complaints against these companies range from influencing citizens on political issues to being anti-competitive and having an adverse impact on innovation, as they gobble up any company coming up with an innovation that threatens their dominance. The fact that Big Tech has grown even stronger financially during COVID-19, when most other businesses were suffering, has drawn some unwelcome attention from regulators.

On 6 October 2020, the US House Judiciary Committee’s Antitrust Subcommittee released its investigation into the digital economy, reporting that the four main tech companies – Amazon, Apple, Facebook and Google – were all monopolies in specific markets, an accusation the four companies rejected. There is now a growing consensus worldwide regarding the need for better regulation of the digital economy, and the next debate will be on which tools and remedies must be used to improve competition, especially as global companies continue to diversify while leveraging their core market strengths.

Even the Chinese authorities have become wary of Alibaba’s growing influence in every aspect of life in the Mainland and Hong Kong. They have cracked down on the company and stopped its affiliate Ant from launching what would have been the world’s largest IPO on the Shanghai Stock Exchange. The IPO would have valued it at more than $310 billion, making it worth more than major US investment banks such as Goldman Sachs (GS) and Morgan Stanley (MS). The company sells an array of financial products in China, and its payments arm, Alipay, is the country’s biggest payments platform.

One of the trends noticed is that tech companies are acting more and more like nation-states, sometimes being stronger than many countries, and at other times believing they should be left alone because they ultimately benefit the common good. For instance, in the same week as the US House report, Facebook released its own report saying that breaking it up would be impossible and would cost billions of dollars, and while the company is allowed to defend its interests, it should not be the one deciding how it should operate. Companies have to be more open and realise that regulation will necessarily be a part of their day-to-day operations, otherwise it will leave regulators with only one option to ensure competition: structural separation or breaking them up.

The Year of the Robot

Whatever else the year 2020 may or may not be reminisced for, it will assuredly be remembered as nothing short of seminal for the world of robotics. From treating COVID-19 patients to dancing, herding sheep and even making perfect omelettes – robots did it all this past year, with recent developments promising a rather fruitful future in the offing as well.

Dancing Doctor

United States-based engineering and robotics design company Boston Dynamics decided to end 2020 with a bang – with all its robots participating in a dance ensemble set to the 1962 classic ‘Do You Love Me?’. The video offers a mind-boggling depiction of Boston Dynamics’ state-of-the-art Atlas robots working together to coordinate synchronised dance moves.

Image 1: Dancing Robots; Courtesy: Boston Dynamics

It stunningly demonstrates the transition of the Atlas humanoid robots through the past decade – from barely walking in 2013 to pulling off full-blown acrobatic moves in 2020. Boston Dynamics currently has only one commercially available robot – Spot – the robot dog that has been seen ‘herding sheep in New Zealand’ and ‘working on a Norwegian oil rig’. This, however, is in no way the extent of Spot’s powers.

‘Doctor’ Spot has also been deployed to remotely measure patients’ vital signs whilst maintaining a distance of over six feet. It is currently in use to ‘safely triage contagious COVID-19 patients’ (NewAtlas).

Image 2: Dr Spot; Courtesy: Boston Dynamics

According to NewAtlas, “Some patient vital signs are easier than others to measure remotely. Body temperature, for example, can be gathered relatively simply using an infrared camera. In this instance, the research team developed algorithms to more accurately use infrared camera data to measure body and skin temperature by incorporating factors such as ambient temperature and distance from the patient.”

Further analyses – such as tracking a patient’s breathing rate by calculating temperature changes in a face mask, and using monochrome cameras tuned to specific wavelengths to detect patients’ pulse and blood oxygen levels – have, at least hypothetically, made human monitoring a completely contactless affair.

Home Improvement

Dubbed the world’s first “mobile robotic blocklaying machine and system”, Australian robotics company Fastbrick Robotics’ (FBR) Hadrian X has been flexing its giant telescopic arm since 2015 – and today it is being deployed in Western Australia to complete the walls of its first display home as part of a residential development.

Image 3: FBR Hadrian-X in full flow; Courtesy: Fastbrick Robotics

It works by laying bricks through a large telescopic boom that mounts on top of a truck or an excavator. “By feeding the system a 3D CAD model of a house, the robot can then go to work placing bricks, along with the mortar and adhesive needed to hold it all together.”

According to NewAtlas, “not too long ago, the Hadrian X was capable of laying around 85 blocks an hour, but the team has made significant improvements to its control software that first saw that rate jump to 150 blocks an hour, and then last month to more than 200. This was considered a demonstration of the skills needed to compete with traditional bricklaying services, and now the team is seeing how its machine fares as part of a real-world construction team.”

It is not just building houses; robots are now tending to them too. Enter: the Yardroid. Dubbed an ‘intelligent landscaping robot’, it uses artificial intelligence to autonomously handle certain tedious gardening tasks, rolling “along on tracks, with water, herbicide and pesticide chambers in the back, lawn-mowing blades on the underside, and a gimbal-stabilized pivoting turret in the front.”

The Yardroid uses computer vision and AI systems to autonomously mow lawns (even without perimeter wires, unlike most contemporaries), spot plants for watering on a schedule, identify weeds and pests on sight and respond with its inbuilt herbicide and pesticide facilities, and ‘discourage’ certain unwanted pests, such as raccoons, with squirts of water.

 

Image 4: The Yardroid demonstrating its pest-killing action; Courtesy: Yardroid and Whirly Max Inc.

 

 

AI Wave Goes Quantum

AI makes iconic Quantum Chemistry breakthrough

It was back in 1925 that an Austrian-Irish physicist named Erwin Schrödinger first postulated a linear partial differential equation that not only described the wave function of a quantum mechanical system but also completely transformed the way we knew and understood quantum chemistry at the time. Nearly a century on, we have now managed to create an Artificial Intelligence system that can successfully solve the iconic equation of its own accord.
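For reference, the equation in question, in its time-independent form, looks deceptively compact; all the difficulty hides in the many-electron wave function Ψ (here H-hat is the molecular Hamiltonian and E the energy):

```latex
\hat{H}\,\Psi(\mathbf{r}_1,\dots,\mathbf{r}_N) = E\,\Psi(\mathbf{r}_1,\dots,\mathbf{r}_N)
```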

Waves of a Quantum Nature

“Central to both quantum chemistry and the Schrödinger equation is the wave function – a mathematical object that completely specifies the behaviour of the electrons in a molecule. The wave function is a high-dimensional entity, and it is therefore extremely difficult to capture all the nuances that encode how the individual electrons affect each other. Many methods of quantum chemistry in fact give up on expressing the wave function altogether, instead attempting only to determine the energy of a given molecule. This, however, requires approximations to be made, limiting the prediction quality of such methods.” (SciTechDaily)

The reason this is such significant news, of course, is that solving the equation is a rather thankless task. In essence, the goal of quantum chemistry is to predict the “chemical and physical properties of molecules based solely on the arrangement of their atoms in space.” While this is something that solving the Schrödinger equation gives us, it is a resource-intensive and time-consuming affair, requiring adept use of laboratory experiments and technical knowhow.

This reasserts the importance of the work being carried out by the scientists at Freie Universität Berlin, making it all the more commendable. “Escaping the usual trade-off between accuracy and computational cost is the highest achievement in quantum chemistry,” according to Dr Jan Hermann of Freie Universität Berlin, who designed several key features of the method in the study. “As yet, the most popular such outlier is the extremely cost-effective density functional theory. We believe that deep ‘Quantum Monte Carlo’, the approach we are proposing, could be equally, if not more, successful. It offers unprecedented accuracy at a still acceptable computational cost.”

Enter: PauliNet

The use of a deep neural network to represent wave functions is a rather novel affair. It is essentially an artificial intelligence-based system that can compose a wave function from relatively simple mathematical building blocks to appropriately predict the complex arrangement of electrons around the nucleus of an atom. Called ‘PauliNet’ in honour of Austrian physicist Wolfgang Pauli, the AI system utilises the idea behind his famous ‘Exclusion Principle’: the neural network architecture representing the wave function must change sign when two electrons are exchanged – akin to the antisymmetry of real-life electronic wave functions.
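A toy illustration of that sign-change requirement (not PauliNet itself, whose architecture is far richer) is a Slater-determinant-style construction: build a matrix of single-particle orbitals evaluated at each electron’s position and take its determinant. Swapping two electrons swaps two rows, which flips the sign. The orbitals and positions below are purely illustrative.

```python
import numpy as np

def antisymmetric_wavefunction(electron_positions, orbitals):
    """Toy antisymmetric 'wave function': the determinant of the matrix of
    orbitals evaluated at each electron position. Exchanging two electrons
    exchanges two rows of the matrix, so the determinant changes sign."""
    matrix = np.array([[phi(r) for phi in orbitals] for r in electron_positions])
    return np.linalg.det(matrix)

# two toy one-dimensional 'orbitals' and two electrons
orbitals = [lambda r: np.exp(-r**2), lambda r: r * np.exp(-r**2)]
r1, r2 = 0.3, 1.1

psi = antisymmetric_wavefunction([r1, r2], orbitals)
psi_swapped = antisymmetric_wavefunction([r2, r1], orbitals)
print(psi, psi_swapped)   # same magnitude, opposite sign
```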

SciTechDaily writes: “Up to now, it has been impossible to find an exact solution for arbitrary molecules that can be efficiently computed. But the team at Freie Universität has developed a deep learning method that can achieve an unprecedented combination of accuracy and computational efficiency. AI has transformed many technological and scientific areas, from computer vision to materials science.” “We believe that our approach may significantly impact the future of quantum chemistry,” says Professor Frank Noé, who led the team effort. The results were published in the reputed journal Nature Chemistry.

There is, however, still a while to go until this system reaches industrial application. Yet, the authors agree that this is ‘fundamental research’ offering “a fresh approach to an age-old problem in the molecular and material sciences”, and they are “excited about the possibilities it opens up.”

Acknowledgement: “Deep-neural-network solution of the electronic Schrödinger equation” by Jan Hermann, Zeno Schätzle and Frank Noé, 23 September 2020, Nature Chemistry.