AI has the potential to raise productivity over the next decade, but we're still some way from it transforming the enterprise in the short term.

Although bullish on the prospects for AI to automate many work activities, McKinsey acknowledges it'll take several decades for this to happen at any scale. CIOs and other executive leaders should keep this in mind amid the hype and wild claims made by many vendors and consultants. There are several reasons why meaningful AI deployments within the enterprise will take longer than many imagine.

Complexity of human work

It's been estimated that the average person makes 2,000 decisions every hour. While many of these decisions are routine and require little thought, others are far more complex and nuanced. At work, we efficiently process multiple inputs in real time, weighing safety, social norms, the needs of our colleagues and employer, as well as accuracy and strategic goals. At the same time, we can communicate these decisions orally, in writing, and through gestures, using multiple systems and workflows. While computing technologies and better access to data may have helped businesses make better routine, low-value decisions, anything more complex still requires human input and oversight. An organization's reputation lives or dies by the decisions made within it; once lost, it is difficult, and often impossible, to regain. While chatbots will take over many functions currently performed by human-powered call centers, they will operate within tightly defined parameters, including their data inputs and the answers they can give.

AI hallucinations

The problem of AI hallucinations, where a large language model (LLM) presents authentic-looking but made-up results, shouldn't be underestimated for enterprise AI deployments. It's been estimated that the hallucination rate for ChatGPT is between 15% and 20%, an unacceptable figure for business-critical decision-making. Hallucinations can be reduced within enterprise deployments by fine-tuning LLMs on private data that has been verified. Further improvements can be made by restricting queries to proven prompts, as well as by incorporating open source tools such as LangKit and Guardrails, or proprietary products like Galileo (a simple illustration of this pattern appears below). These tools and frameworks are still in the early stages of development, and users will need to experiment with multiple approaches and solutions. It'll be several years at least before established and trusted methods for reducing hallucinations to acceptable levels are widely available.

Changing habits and workflows

While consumer adoption of new technologies such as smartphones and social media happens rapidly, enterprise adoption is usually much slower. Workflows, user training, and technological path dependency act as brakes on the deployment of new hardware and software. Cloud computing, common data formats, and APIs have lowered these barriers to an extent, but they remain significant. A recent Gartner survey revealed that 45% of customer service reps (CSRs) have avoided adopting new technologies, choosing instead to rely on legacy systems and tools.
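To make the "tightly defined parameters" idea from the hallucinations discussion above more concrete, here is a minimal, illustrative sketch of a guarded chatbot response flow. It is not a production implementation: the call_llm function is a placeholder for whatever LLM client an organization uses, the FAQ content is invented, and the crude word-overlap check merely stands in for the kind of output validation that tools such as LangKit, Guardrails, or Galileo are designed to provide.

```python
# Illustrative sketch only: constrain an LLM chatbot to verified material and
# fall back to a human agent when the reply cannot be grounded in it.
# call_llm, the FAQ content, and the overlap threshold are all assumptions.

VERIFIED_FAQ = {
    "refund_policy": "Refunds are available within 30 days of purchase.",
    "support_hours": "Support is available Monday to Friday, 9am to 5pm.",
}

PROMPT_TEMPLATE = (
    "Answer the customer question using ONLY the verified material below.\n"
    "If the material does not contain the answer, reply exactly: ESCALATE.\n\n"
    "Verified material:\n{context}\n\nQuestion: {question}\nAnswer:"
)

FALLBACK = "I'm not able to answer that; let me connect you with a colleague."


def _grounded(reply: str, context: str, min_overlap: float = 0.6) -> bool:
    """Crude grounding check: most words in the reply should appear in the
    verified material. Real deployments would use semantic validators instead."""
    words = [w.lower().strip(".,!?") for w in reply.split()]
    if not words:
        return False
    known = {w.strip(".,!?") for w in context.lower().split()}
    hits = sum(1 for w in words if w in known)
    return hits / len(words) >= min_overlap


def answer_customer(question: str, call_llm) -> str:
    """Answer only from verified content; escalate anything outside it."""
    context = "\n".join(VERIFIED_FAQ.values())
    prompt = PROMPT_TEMPLATE.format(context=context, question=question)
    reply = call_llm(prompt).strip()  # call_llm is a placeholder for any LLM client
    if reply == "ESCALATE" or not _grounded(reply, context):
        return FALLBACK
    return reply
```

The design choice worth noting is that the model is only ever asked to answer from verified material, and any reply that cannot be traced back to that material is replaced with a handoff to a human agent, mirroring how enterprise deployments keep chatbots within acceptable risk limits.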
Cloud computing pioneer and Box CEO Aaron Levie recently expressed doubts that AI will have a significant impact on digital transformation initiatives in the short term: "I think we're so early on for any kind of operational task stream with any level of efficacy to be able to replace even 10 minutes of what a real person does," he says.

What next for enterprise AI

The initial hype and excitement over generative AI are starting to wane, and more realistic expectations are emerging. Traffic to the ChatGPT website fell by almost 10% from May to June, with users spending 9% less time on the site. It's becoming apparent that making practical use of these new tools within the enterprise will require considerable customization and investment. The complexity and subtlety of much human work, and the need for organizations to maintain consumer trust, are behind this realization. However, it'd be a foolish business that didn't start on this journey, because the potential rewards are significant.