The Buzz Behind Business AI

As relentless technological progress in data science transforms modern businesses, let's look at the focus areas in business AI.

That data-driven technologies are now the cornerstone of modern business is a mystery to none. What the relentless march of technological progress in data science ensures, however, is that “the boundaries of what is possible are constantly being redrawn, spawning new behaviours, trends and buzzwords.”

Ironically, then, this is a cornerstone that isn't set in stone one bit.

Buzzword: as-a-service

Data science today can, to an extent, be considered the parent of business AI as we know it. The ability of machines to ‘learn’ through built-in cognitive decision-making capabilities is one of the most exciting developments in global technology today. But little business use can come of it unless it is packaged and marketed as such, which is precisely why AI-as-a-service platforms are set to boom this coming decade.

Traditionally, the adoption of AI has more often than not been hindered by exorbitantly high upfront costs: setting up adequate infrastructure, building skills, and managing regulatory compliance. Consequently, major technologies such as artificial neural networks and machine learning have largely been restricted to the realms of academic institutions or big businesses.

A major turning point, in this regard, emerged hand-in-hand with the growth of cloud-based service platforms. What these platforms (such as Oracle Cloud or Microsoft Azure) offer is ready-made infrastructure in data centres, consumed as plug-and-play applications by individual users. This allows for the remote deployment of data-driven solutions such as recommendation engines, automated marketing services, or predictive maintenance, making AI accessible even to small organisations with limited budgets.
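One of the solutions mentioned above, a recommendation engine, can be sketched in a few lines. This is a minimal user-based collaborative filter over invented ratings data, not the implementation any particular cloud platform ships; it simply illustrates the kind of logic such a plug-and-play service wraps up.

```python
from collections import defaultdict
from math import sqrt

# Hypothetical ratings data: user -> {item: rating on a 1-5 scale}.
ratings = {
    "alice": {"laptop": 5, "mouse": 4, "desk": 1},
    "bob":   {"laptop": 4, "mouse": 5, "chair": 2},
    "carol": {"desk": 5, "chair": 4, "lamp": 5},
}

def cosine_sim(a, b):
    """Cosine similarity between two users' rating vectors."""
    common = set(a) & set(b)
    if not common:
        return 0.0
    dot = sum(a[i] * b[i] for i in common)
    norm_a = sqrt(sum(v * v for v in a.values()))
    norm_b = sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b)

def recommend(user, k=2):
    """Score items the user hasn't rated, weighted by how similar
    each other user is, and return the top-k item names."""
    scores = defaultdict(float)
    for other, their_ratings in ratings.items():
        if other == user:
            continue
        sim = cosine_sim(ratings[user], their_ratings)
        for item, rating in their_ratings.items():
            if item not in ratings[user]:
                scores[item] += sim * rating
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("alice"))  # → ['chair', 'lamp']
```

An AI-as-a-service platform hides this behind an API endpoint, with the model retrained on the customer's own data rather than a hard-coded dictionary.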

Currently, although usability for most smaller businesses is often restricted to automating repetitive processes such as language translation or data entry, the future potential of these platforms in enabling data-driven decision-making, setting smarter strategic targets, and developing more innovative products is unquestionable.
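The data-entry automation mentioned above can be sketched with a toy example. This rule-based parser is a stand-in (real services use ML models for messier inputs), and the order-line format is invented, but the shape of the task is the same: unstructured text in, structured records out.

```python
import re

# Hypothetical free-text order lines, the kind of rote data entry
# a small business might automate.
LINE = re.compile(r"(?P<qty>\d+)\s*x\s*(?P<item>[\w ]+?)\s*@\s*\$(?P<price>[\d.]+)")

def parse_orders(text):
    """Turn free-text order lines into structured records."""
    records = []
    for match in LINE.finditer(text):
        qty = int(match.group("qty"))
        records.append({
            "item": match.group("item").strip(),
            "qty": qty,
            "total": qty * float(match.group("price")),
        })
    return records

orders = parse_orders("3 x office chair @ $49.99\n2 x desk lamp @ $15.50")
print(orders)
```

A cloud AI service would replace the regular expression with a trained extraction model, but expose much the same interface to the caller.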

Buzzword: Content

Customarily, one wouldn't expect much debate on hearing that creativity is a centrally human endeavour above all else. But then again, these are the roaring ’20s, and the prospect of a robot-authored novel competing for the Nobel Prize in Literature hardly seems impracticable any more.

Esteemed technologist Bernard Marr, in this regard, opines: “We’ve had AI-created art, music, and even computer programs, and until recently, it’s generally been seen as a curiosity. However, the ability of AI systems to continuously improve – as well as the development of more sophisticated machine learning algorithms such as generative adversarial networks (GANs) – mean that machines are increasingly giving us a run for our money when it comes to creativity.”

Whilst competing for Nobel Prizes may indeed be a while off, less ambitious endeavours such as writing product descriptions, creating highlight videos for sports events, or transcribing video are becoming rather common occurrences for our AI counterparts. “The one huge advantage that AI has here over human creatives”, Marr writes, “is that the speed it can work at means it can far more efficiently produce targeted, personalized content. Product descriptions on websites can be tailored for the person that the AI predicts will be reading them, and adverts (or even movies) could have a personalized soundtrack, algorithmically created to appeal to a specific individual.”
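The tailored product descriptions Marr describes can be illustrated with a deliberately simple sketch. Everything here is invented: the segments, the templates, and the stand-in "prediction" function, which a real system would replace with a trained model and generative copywriting.

```python
# Hypothetical copy templates, one per predicted reader segment.
TEMPLATES = {
    "budget":      "{name} — everyday reliability at {price}.",
    "performance": "{name}: top-tier specs for demanding work ({price}).",
}

def predicted_segment(user):
    """Stand-in for an ML model that predicts the reader's segment
    from their behaviour; here it just checks average spend."""
    return "performance" if user.get("avg_spend", 0) > 500 else "budget"

def describe(product, user):
    """Render the product description tailored to the predicted reader."""
    return TEMPLATES[predicted_segment(user)].format(**product)

laptop = {"name": "Aero 14 laptop", "price": "$1,299"}
print(describe(laptop, {"avg_spend": 900}))  # performance copy
print(describe(laptop, {"avg_spend": 120}))  # budget copy
```

The point is structural: the same product yields different copy per reader, and the speed advantage Marr notes comes from generating each variant on demand rather than by hand.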

Buzzword: Edge

There’s living on the cloud, and then there’s decentralised living on its edge. Rather than shipping every piece of data to an off-site data centre and consuming the results through local APIs and dashboards, edge computing moves the computational heavy lifting to the closest possible point to where the data is generated and needed.

Marr writes: “Applications for edge computing exist in high-concept technology use-cases such as self-driving cars – where the cars themselves need to be able to make a decision on whether they are in a dangerous situation and should take evasive action, without having to send everything they know off to a data center and wait for the result to come back.” Decisions are thus taken quicker than ever before – and bandwidth preserved.
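The latency-and-bandwidth trade Marr describes can be sketched with a toy example: a device that acts on every sensor reading locally and forwards only the anomalies upstream. The readings, threshold, and stand-in "cloud" log are all invented for illustration.

```python
import random

CLOUD_LOG = []  # stands in for a hypothetical upstream data-centre endpoint

def edge_filter(readings, threshold=90.0):
    """Make the decision on-device for every reading; transmit only
    the anomalous ones, so most data never leaves the edge."""
    alerts = 0
    for value in readings:
        if value > threshold:          # local, low-latency decision
            alerts += 1
            CLOUD_LOG.append(value)    # rare upstream transmission
        # normal readings are handled (or discarded) on-device
    return alerts

random.seed(42)
readings = [random.uniform(60, 100) for _ in range(1000)]
alerts = edge_filter(readings)
saved = 100 * (1 - alerts / len(readings))
print(f"{alerts} of {len(readings)} readings forwarded ({saved:.0f}% bandwidth saved)")
```

A self-driving car's evasive manoeuvre is this pattern at higher stakes: the decision happens where the data is, and only summaries travel.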

The hills are well and truly abuzz with the sound of intelligence. Artificial, and otherwise.
