Smaller, open-source AI large language models offer enterprises the chance to leverage their own data assets for innovation and competitive advantage.

For the last 30 years, the dream of collecting, managing and making use of an organization's knowledge assets has never been truly realized. Systems for sharing information across the enterprise have grown more sophisticated, but they haven't taken the next step of turning the information that resides in digital files into usable knowledge. Data exists in ever larger silos, but real knowledge still resides in employees.

The rise of large language models (LLMs), however, is starting to make true knowledge management (KM) a reality. These models can extract meaning from digital data at a scale and speed beyond the capabilities of human analysts. The 2023 State of the CIO survey reveals that 71% of CIO respondents anticipate greater involvement in business strategy over the next three years, with 85% saying they are becoming more digital- and innovation-focused. Applying LLMs to an organization's knowledge assets has the potential to accelerate these trends.

Less is More

OpenAI's ChatGPT and DALL-E 2 generative AI (GenAI) models have revolutionized how we think about AI and what it can do. From writing poems to creating images, it's staggering how a computer can create new content from a few simple prompts. However, the scale of the LLMs behind these services is vast, and they are expensive for OpenAI to run. GPT-3 was trained on over 45 terabytes of text data using more than a thousand GPUs over 34 days, at a cost of almost $5 million in compute power. In 2022, OpenAI lost $540 million despite having raised $11.3 billion in funding rounds. Clearly, this scale of operation and its costs are beyond most organizations wanting to develop their own LLMs.
However, the AI future for many enterprises lies in building and adapting much smaller models based on their own internal data assets. Rather than relying on APIs from firms such as OpenAI, with the risk of uploading potentially sensitive data to third-party servers, new approaches are allowing firms to bring smaller LLMs in-house. Techniques for shrinking model parameter counts, along with new languages such as Mojo and AI frameworks such as PyTorch, significantly reduce the compute resources and time needed to run AI workloads.

Open is Better

Just as the web is built on open-source software and protocols, it's likely that many enterprise AI initiatives will be built on open-source models such as LLaMA and freely available techniques such as LoRA. According to a recently leaked Google memo, "The barrier to entry for training and experimentation has dropped from the total output of a major research organization to one person, an evening, and a beefy laptop." These barriers will only become lower, and the results better, allowing startups and enterprises to build niche models focused on the specific needs of particular businesses and workflows.

From GenAI to SynthAI

Central to these developments is the move from AI systems that create new content from simple prompts to models that are trained on an enterprise's internal data and designed to generate usable insights and recommendations. LLMs such as ChatGPT often produce believable results, but it's not clear how the data fed into the model was used, or whether the answers it gives are true or hallucinations. The recent case of a New York lawyer who used ChatGPT to generate court filings, citing what he presumed were real precedents to support his client's claim, showed the dangers of relying on GenAI outputs: despite looking like genuine citations, six of the cases listed never took place.
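The LoRA technique mentioned above is what makes fine-tuning on a "beefy laptop" plausible: the large pretrained weight matrices are frozen, and only a small low-rank update is trained. A minimal sketch of the idea in plain numpy (shapes, values and the scaling hyperparameter are illustrative, not taken from any real model):

```python
import numpy as np

# LoRA (Low-Rank Adaptation) sketch: instead of updating a large frozen
# weight matrix W, train two small matrices A and B whose product B @ A
# forms a low-rank update to W. All numbers here are illustrative.
rng = np.random.default_rng(0)

d_out, d_in, rank = 512, 512, 8          # rank << d_in
W = rng.standard_normal((d_out, d_in))   # frozen pretrained weights
A = rng.standard_normal((rank, d_in))    # trainable down-projection
B = np.zeros((d_out, rank))              # trainable up-projection (init 0)
alpha = 16                               # scaling hyperparameter

def adapted_forward(x):
    # y = W x + (alpha / rank) * B (A x); only A and B receive gradients
    return W @ x + (alpha / rank) * (B @ (A @ x))

x = rng.standard_normal(d_in)
y = adapted_forward(x)

full_params = W.size                     # 512 * 512 = 262,144
lora_params = A.size + B.size            # 8 * 512 + 512 * 8 = 8,192
print(f"trainable params: {lora_params} vs full fine-tune: {full_params}")
```

Training roughly 3% of the parameters is what collapses the compute bill; because B starts at zero, the adapted model is initially identical to the base model and only drifts as A and B are trained.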
Silicon Valley venture capital firm a16z recently outlined its belief that the future of AI in the workplace lies not with general-purpose LLMs like ChatGPT but with more focused models designed to address specific business needs. It calls this SynthAI: models trained on proprietary data sets and optimized for discrete purposes, such as resolving customer support issues, summarizing market research results and creating personalized marketing emails. Applying the SynthAI approach to better managing a firm's data assets is a natural evolution of this next stage in the AI revolution. Consulting firm BCG has applied it to 50 years' worth of archives, largely reports, presentations and data collected from surveys and client engagements. Previously, employees could only run a keyword search across these files and then read through each document to check its relevance. Now the system provides usable answers to their questions. The knowledge management dream is becoming a reality.
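The shift from keyword lookup to usable answers rests on retrieval: documents are embedded as vectors and ranked by similarity to the question, and the best matches are fed to the model as grounding. A toy sketch of the ranking step (a simple bag-of-words embedding and hypothetical documents stand in for the learned embeddings and proprietary archives a real system would use):

```python
import math
from collections import Counter

# Hypothetical archive: in practice these would be decades of reports
# and survey data, embedded with a learned model rather than word counts.
docs = {
    "report_a": "customer churn rose after the pricing change in europe",
    "report_b": "survey results show strong demand for support automation",
    "report_c": "marketing emails personalized by segment lifted open rates",
}

def embed(text):
    # Bag-of-words "embedding": token -> count
    return Counter(text.lower().split())

def cosine(u, v):
    dot = sum(u[t] * v[t] for t in u if t in v)
    norm = math.sqrt(sum(c * c for c in u.values())) * \
           math.sqrt(sum(c * c for c in v.values()))
    return dot / norm if norm else 0.0

def search(query, k=1):
    # Rank every document by similarity to the query, return the top k
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(docs[d])), reverse=True)
    return ranked[:k]

print(search("which survey covered customer support automation demand"))
```

Unlike a keyword search, which returns every file containing any query term for a human to sift through, similarity ranking surfaces the single most relevant document, which can then be passed to an LLM to phrase the actual answer.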