The GenAI paradox: Why Indian IT firms ban it while preparing to sell it
While most of the industry is treading cautiously as the ChatGPT era enters its second year, one sector—human capital management—is eagerly embracing it
Abhirami G, 31 Oct 2023
While many companies find great business value in generative AI’s proposition, most of them are feeling their way in the dark
For IT services firms, the priority is building use cases that convince their clients to adopt the technology
Hallucination—GenAI model output that’s nonsensical or false—and lack of data privacy are among key concerns that need to be addressed
Recruiters have seen big gains since adoption. At the end of the day, though, humans continue to be the ultimate authority
November marks a year since ChatGPT’s public debut. And it’s been a whirlwind
of progress for the viral artificial intelligence (AI) sensation, including the
recent update that gives users access to all of its tools—essentially rendering many
third-party products obsolete. During this time, tech enthusiasts and businesses
have transitioned from being awestruck to timid adopters to bold experimenters.
Indian IT companies, which integrate new technologies for businesses and build
applications on them, are talking about generative AI (GenAI) but seem to be
walking gingerly towards it. During the just-concluded September-quarter results
announcements, GenAI figured in their communication, but no one offered specifics.
“In the last six months, every client has inquired about GenAI. But in the last six
months, we also haven’t found a single client who is ready to use GenAI,” said a
senior executive at IT giant Tata Consultancy Services (TCS). The likes of TCS,
along with its rivals HCL Technologies and Infosys, know that a freely available
technology such as GenAI won’t do much for their relative competitiveness or
margins unless they build and sell “transformative solutions, which use GenAI”.
In fact, “clueless” is how some describe the general business response to GenAI in India.
It’s not that clients don’t understand how GenAI could be transformative for them,
said Sandeep Chaudhary; it’s that they aren’t internally ready for it. As
chief executive officer (CEO) of the human-capital management company
Peoplestrong, Chaudhary has a panoramic view of how IT and global companies
with captive centres in India are hiring for GenAI.
“Since GenAI is still in a nascent stage, we can’t vouch for the accuracy of
products yet. However, we can make an intelligent case to companies regarding
their utility and value,” said Sunny Singh, an AI specialist at HCL. After the
second quarter results, Managing Director C Vijayakumar had said HCL was
working on around 100 sub-$1 million GenAI projects.
AI has existed for decades. But GenAI is considered such a major leap forward
because it has greatly democratised the technology for the general public and
businesses. Applications that were once limited to large companies with years’
worth of stored data are now accessible to even smaller and newer companies.
As an example, Vini Khabya, co-founder of the business-to-business (B2B) talent
intelligence company Unberry, highlights how it is now possible for any company
to use GenAI to analyse and shortlist resumes based on a given job description.
Earlier, this was possible with traditional AI only for larger companies since they
had a large repository of previously received resumes to train their AI models on.
Now, though, large language models, or LLMs, are already trained on a very large
dataset. This makes the functionality accessible to businesses irrespective of their
size and age.
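To make the mechanics concrete, here is a minimal sketch of what LLM-based resume screening can look like, assuming access to OpenAI's chat completions API; the prompt, model choice, and scoring scale are illustrative assumptions, not Unberry's actual pipeline.

import openai  # pip install openai

def score_resume(job_description: str, resume_text: str) -> str:
    """Ask an LLM to rate how well a resume matches a job description."""
    prompt = (
        "You are a recruitment assistant. Rate the following resume against "
        "the job description on a scale of 1 to 10 and justify the score in "
        "two sentences.\n\n"
        f"Job description:\n{job_description}\n\nResume:\n{resume_text}"
    )
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
        temperature=0,  # keep the scoring as consistent as possible
    )
    return response["choices"][0]["message"]["content"]

# A recruiter would run this across every incoming resume and shortlist the
# ones the model scores highly, with a human making the final call.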
Beerud Sheth, the CEO of Gupshup—the US-based messaging platform founded in
India—attributes the expected AI-driven $194-billion bonanza in global
e-commerce sales this holiday season (November and December) to GenAI and
LLMs, which empower marketers and retailers to “gain a better understanding of
their customers”.
That said, it costs a pretty penny to run these models, and, in some cases, they also
pose security risks. In fact, the latter has led many companies, including IT service
providers like Cognizant, Infosys, and HCL, as well as consumer-electronics giants
such as Samsung and Apple, to ban the general use of ChatGPT within their
organisations.
Paul Daugherty, the chief technology officer (CTO) of IT consulting major
Accenture, believes by this time next year, companies will shift from
“experimenting to scaling”, and GenAI will reshape enterprises’ core processes
and workflows.
Indian service providers are under peer pressure.
Firms, functions, finances
A logistics major, for instance, is creating a GenAI-powered executive assistant for
its CEO. One that can answer specific queries and requires less “button pressing”
than earlier systems built using “business intelligence based on the available data”.
It’s an assistant that can answer how the company’s products are performing
vis-à-vis its competitors.
Another example is an electricity service provider that wants a tool to automate
customer inquiry responses, particularly during peak load periods.
So, the bandwagon effect aside, companies in India also want to integrate GenAI
due to the early efficiency gains they see when compared to pre-LLM processes,
said Saibal Samaddar, head of AI transformation at Infosys Consulting.
Earlier in September, industry giants like Reliance Industries, TCS, and Infosys
joined forces with the US chip powerhouse Nvidia to create a massive AI
infrastructure and upskill their entire workforce.
Infosys has also partnered with Google to train 20,000 of its employees on Google
Cloud’s GenAI solutions. Others like Microsoft and Accenture, too, have launched
their GenAI-focussed certification courses in India this year.
As integrators, IT companies are aware that different clients are at different stages
of tech readiness. “It’s important that companies are willing to adopt the
technology right from the executive level down to the ground level. It is not
enough for the technology to be just a CIO [chief information officer] or CTO’s
mandate,” said Samaddar.
No wonder IT companies are only taking baby steps.
For instance, the primary areas in which HCL is building its projects include
chatbots, summarisation, email generation, and translation. It’s here that they have
seen the most success. “The customers don’t come up with use cases for GenAI.
They don’t come to us and say, ‘We want to integrate LLMs into these aspects of
our processes’. Rather, we have to come up with these solutions for them,” said
Singh.
As service providers, Samaddar stressed, companies must understand GenAI as a
business tool rather than just a technological one. “It’s like the difference between
cloud and analytics. Cloud is purely a technical solution that allows companies to
store large amounts of data. While cloud transformation can be done by
technology, analytics requires an understanding of business and the specific
domain.”
Kickoff quirks: Privacy and hallucinations
Every other day, a new way to use GenAI pops up somewhere around the world.
Earlier in October, one of the world’s finest robotics companies, Boston Dynamics,
integrated GenAI into its robot and turned it into a city tour guide. Fancy.
But Indian companies, not without reason, harbour reservations. Data privacy,
after all, is a key concern when considering the use of GenAI-based solutions.
When a user feeds data into an LLM-based chatbot like ChatGPT, it can save and
use that data to get smarter. This can be a problem. (Interestingly, although
OpenAI makes this clear on its FAQs page and in its Privacy Policy, ChatGPT
itself, when asked about its data-use policy, denies that it is trained on
conversation data.)
Plus, sometimes, these AI models make up stuff—termed “hallucinations”—when
they don’t know the answer.
The big fear is that if a user puts their employer’s confidential data into the
chatbot, it might not stay private. For instance, if a programmer copy-pastes their
company’s proprietary code into ChatGPT to ask for help fixing an error, that code
could end up being seen by others, said Jainendra Tiwari, a data scientist at
Cognizant. The data entered by the programmer can potentially be recovered from
the LLM.
At Cognizant, only select employees are allowed to use ChatGPT. Access has to
be formally requested and is granted mainly to those working on GenAI projects
for clients or doing internal research and development.
Yet there are always ways to slip past the velvet rope, such as using ChatGPT
on personal devices or through the Bing Chat feature in the Microsoft Edge browser.
Companies are tackling these issues differently. The likes of Infosys, for instance,
are exploring a method called retrieval augmented generation (RAG) to address
data privacy and hallucination concerns. RAG enhances the accuracy of an LLM’s
responses by adding information from external sources.
Most companies opt for RAG when using GenAI because it’s the most effective
way to blend a company’s confidential data with a language model.
For example, if an Infosys employee wants to know their entitlement to leave days,
they can’t rely on a generic chatbot like ChatGPT since the answer depends on
internal company rules. Also, ChatGPT operates with data that hasn’t been updated
since 2021, making RAG essential for obtaining current and accurate responses.
(From September onward, premium ChatGPT users have been able to access the
Browse option, which lets them pull in up-to-date information.)
Samaddar sketched a useful diagram on the back of a tray liner from Burger King
to explain how this works.
In Samaddar’s scenario, sandwiching Infosys’s proprietary data and the client’s
specific data between the LLM and the user shields the information from going
straight into the LLM and provides the user with a more accurate answer.
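What that sandwiching looks like in practice can be shown with a minimal sketch: pull the most relevant passages out of a private document store and place them in the prompt next to the user's question, so the model answers from supplied context rather than from whatever it absorbed during training. The toy policy documents, keyword-based retrieval, and helper names below are illustrative assumptions, not Infosys's actual implementation.

from typing import List

# A toy in-memory "document store" standing in for a company's private data.
POLICY_DOCS = [
    "Employees with under 5 years of service accrue 18 leave days per year.",
    "Employees with 5 or more years of service accrue 24 leave days per year.",
    "Unused leave days lapse on 31 December and cannot be carried forward.",
]

def retrieve(query: str, docs: List[str], top_k: int = 2) -> List[str]:
    """Naive retrieval: rank documents by keyword overlap with the query.
    Production systems use vector embeddings, but the principle is the same."""
    words = set(query.lower().split())
    ranked = sorted(docs, key=lambda d: len(words & set(d.lower().split())), reverse=True)
    return ranked[:top_k]

def build_prompt(query: str, docs: List[str]) -> str:
    """Sandwich the retrieved company data between the instructions and the
    question, so the LLM answers from the supplied context instead of guessing."""
    context = "\n".join(retrieve(query, docs))
    return (
        "Answer using only the context below. If the answer is not in the "
        f"context, say so.\n\nContext:\n{context}\n\nQuestion: {query}"
    )

# The assembled prompt is what actually gets sent to the LLM of choice.
print(build_prompt("How many leave days am I entitled to?", POLICY_DOCS))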
Another way companies handle this is by turning to GenAI tools integrated into
cloud computing platforms such as Microsoft Azure and Amazon Web Services
(AWS).
Platform wars
This September, AWS launched Amazon Bedrock for custom AI applications,
while Microsoft’s partnership with OpenAI grants access to GPT-4 and more
through Azure.
The idea here is that, in these closed-off environments, sensitive data stays
securely within the cloud ecosystem. These cloud providers offer users a
sandbox-like space where they can pick a foundation model and fine-tune it with their own
data. It’s even possible to incorporate RAG into these setups, allowing the
inclusion of company-specific information.
Microsoft and AWS also offer developer tools driven by GenAI, such as GitHub
Copilot and Amazon CodeWhisperer, dubbed “AI pair programmers”. They can
automatically fill in lines of code as developers type; or, if a developer
describes what they want in a comment, the tool can generate the code for them.
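For a sense of what that looks like in practice, here is a hypothetical Python exchange: the developer writes only the comment and the function signature, and the assistant fills in the body (the invoice-with-GST example is invented purely for illustration).

# The developer types the comment and signature; the "AI pair programmer"
# suggests the body.

# Compute the total invoice value with 18% GST applied
def invoice_total(amounts: list[float], gst_rate: float = 0.18) -> float:
    subtotal = sum(amounts)           # suggested completion
    return subtotal * (1 + gst_rate)  # suggested completion

print(invoice_total([1200.0, 850.5]))  # 2419.59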
Earlier this year, The Ken wrote about how tech giant Microsoft is taking on
AWS and Google Cloud Platform (GCP) in India’s cloud war, thanks to GPT-4, the
latest model behind ChatGPT.
That said, neither AWS nor Microsoft has rolled out its GenAI services in India.
As a result, their clients run these services on servers in the US or Europe,
racking up region-specific costs. Shuffling data from Indian servers to the
relevant overseas ones is also par for the course.
Also, since GenAI services are an add-on to their cloud subscriptions, the real
bottom line for companies is the fee for the foundation model, which can range
from $0.0004 to $0.10 per 1,000 tokens, depending on the model used.
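To put those per-token rates in perspective, here is a rough back-of-the-envelope estimate in Python; the daily usage figures below are assumed purely for illustration.

def monthly_cost(tokens_per_request: int, requests_per_day: int,
                 price_per_1k_tokens: float, days: int = 30) -> float:
    """Estimate the monthly foundation-model bill in US dollars."""
    total_tokens = tokens_per_request * requests_per_day * days
    return total_tokens / 1000 * price_per_1k_tokens

# A customer-support bot handling 5,000 requests a day at roughly 1,500 tokens each:
print(monthly_cost(1500, 5000, 0.0004))  # cheapest models: about $90 a month
print(monthly_cost(1500, 5000, 0.10))    # priciest models: about $22,500 a month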
The GenAI in human capital’s bottle
In an atmosphere of scepticism and tepid optimism, with most companies taking
baby steps, there exists one sector in India that is almost wholeheartedly excited
about GenAI.
It goes by the name of human capital management (HCM), responsible for talent
orchestration for and within organisations.
At a time when Indian employees fret more about losing their jobs to AI than their
peers in the US, the UK, and Germany, what sets HCM apart is its ability to flaunt
GenAI’s perks right from the get-go.
For one, recruiters—both as individuals and within HCM firms—have started
using GenAI to automate a range of tasks. For instance, by feeding parameters
such as role, skillset, location, and experience level, they are able to generate
precise job descriptions.
The process then extends to sourcing resumes from platforms such as LinkedIn and
job boards such as Monster and Naukri. An LLM can sift through resumes to
match the job description. This is where GenAI truly shines, said Khabya.
Besides, as pointed out by Chaudhary, GenAI assists in candidate screening
through tailored assessment tests and evaluations based on their work experience.
It also streamlines the onboarding of candidates.
AI contains multitudes
So far, large corporations have relied on applicant tracking systems or pre-LLM
tools. But now, even smaller companies can tap into GenAI without being limited
by a shortage of past resume data to train their models.
Most people from hiring companies who spoke with The Ken agreed that all of this
is turning GenAI into a sweet deal for them in terms of saving time and money.
Beyond efficiency, tapping into GenAI for the hiring process also tackles the
thorny issue of personal biases. “A human interview is actually the most
unscientific way of hiring anyone. Our experience does not necessarily translate
into knowledge or wisdom,” said Chaudhary. “Unfortunately, it translates into
biases.”
But what about the IT firms: are they also jumping on the bandwagon for their
hiring needs?
It’s a mixed bag. Take HCL, which is kind of dipping its toes in the pool. It’s not a
full-blown enterprise-level frenzy just yet, but Singh has a hunch that the company
might just crank it up a notch in the coming year.
That said, GenAI also has its share of hiccups, even within the HCM sector.
One major concern is GenAI’s potential to produce results that lack relevance to a
user’s specific industry. To tackle this issue, Chaudhary and Khabya underscored
the need to insert tailored data layers between the LLM and the user.
To that end, Khabya also highlighted the need to define industry-specific skills, as
the LLM’s understanding of them might differ from actual expectations. In
essence, phrases like “problem-solving skills” may mean different things in IT and
media roles, and “creativity” varies widely across fields.
Therefore, the final decision will, in most cases, continue to rest with humans.
Human-provided context will remain the key ingredient of GenAI’s adoption,
because certain roles, such as leadership positions, are still too distinctive
for GenAI to manage.
And it is these leaders, mostly hired by humans, who stand at the forefront of an
evolving ChatGPT era. They hold the key to turning the current hunger among
companies into an insatiable appetite for GenAI: shifting from experimentation to
scalable solutions and taking the tech from the boardroom to the operational core.
(The story has been updated to accurately reflect Saibal Samaddar’s designation
as head of AI transformation at Infosys Consulting)
Edited by Sumit Chakraborty