
AI Transformation with a Corporate LLM

Jul 03, 2024

10 mins read

In the first article, I described the process of developing the simplest PoC for an AI Sales Email Auto Response tool, using a basic foundation LLM to emulate a corporate LLM. However, with a corporate LLM built as a pre-trained model, you can achieve much more: it can be integrated into a number of corporate processes, either in manual mode (as a corporate ChatGPT prompt) or in a fully automated mode that can transform your operations.


Use Cases for Corporate LLM

Although conversational AI is the most common use case, corporate LLMs have much more to offer.

  • Conversational AI:
    • Customer support and service (AI chatbots)
    • Sales and marketing (sales request AI auto-response and lead scoring)
    • Internal communication (AI Slack bots)
    • Translation and localization
    • Recruiting (AI recruiting).
  • Workflow automation:
    • Content creation, review and management (For texts, images, and video)
    • Human resources and recruiting (AI CV processing)
    • Knowledge management.
  • Discriminative AI (as a part of workflow automation):
    • Data analysis and business intelligence
    • Compliance and risk management.
  • Predictive analytics:
    • Trend forecasting
    • Business outcome prediction.
  • Information retrieval:
    • Document retrieval (AI case studies search)
    • Legal documentation review (based on your standard sets of NDA, MSA, and SOW).

Our AI Employees

Once we had finished and deployed the Leobit Corporate LLM, it became quite easy to roll out new solutions on top of it. Now we have several AI employees built on top of the Leobit corporate LLM.


Further Architectural Changes

I developed a basic proof of concept (PoC) using Google Sheets, Google Apps Script, and the OpenAI API in just one week. Despite its simplicity, it was a fully functional product that we began using in production. However, it was later completely rewritten and re-architected by the Leobit R&D team to achieve a more robust architecture.

  • ChatGPT API replacement: The ChatGPT API was first replaced by Google Gemini and then moved to Azure OpenAI, hosting a custom pre-trained Leobit Corporate LLM
  • Extended knowledge base: The LLM was expanded with additional Leobit knowledge base content, like project case studies, presentations, and much more
  • Migration to C#: The spreadsheet logic was migrated to C# and deployed as Azure Functions
  • Leo’s new identity: Leo got a new picture from our Leobit Design Studio and an official position in the company as a VP of Innovation 🙂
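To make the original PoC concrete, here is a minimal Python sketch of its core loop: read incoming sales emails, draft a reply with an LLM, and store the draft next to the email. This is an illustration, not the actual implementation (which lived in Google Apps Script against a Google Sheet); the function and field names (`generate_reply`, `process_inbox`, the `email`/`reply` columns) are invented for the sketch, and the LLM call is stubbed out.

```python
# Sketch of the PoC's core loop. In the real PoC, generate_reply sent
# the email body to the OpenAI API with a system prompt describing the
# company; here it is stubbed so the example runs offline.

def generate_reply(email_body: str) -> str:
    """Stub for the LLM call (hypothetical name)."""
    return f"[draft reply to: {email_body[:30]}...]"

def process_inbox(rows: list[dict]) -> list[dict]:
    """Fill in a draft reply for every row that doesn't have one yet."""
    for row in rows:
        if not row.get("reply"):
            row["reply"] = generate_reply(row["email"])
    return rows

rows = [{"email": "Hi, do you build .NET teams?", "reply": ""}]
processed = process_inbox(rows)
print(processed[0]["reply"])
```

The same loop works whether the rows come from a spreadsheet, a mailbox, or a database table, which is why the migration to C#/Azure Functions could keep the logic intact.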

Some Insights Regarding OpenAI API, Google Gemini & Azure OpenAI Service

Now, let’s dive into the main features of the most popular LLMs in the market. 

OpenAI API

Here are several insights from our experience with OpenAI API:

  • It’s the fastest way to start with GPT/AI integration — you can set it up in just a few hours
  • The PoC used a simple, standard, non-customized model, but fine-tuning options are available for the OpenAI API
  • OpenAI supports multiple languages, including English, Polish, and Ukrainian
  • OpenAI GPT-4 API isn’t the same as ChatGPT-4. For instance, the OpenAI API can’t search the internet. In functionality it feels closer to GPT-3.5, but it is still GPT-4
  • You’re limited to just 3 requests per minute
  • The OpenAI API is relatively cheap, but remember, it’s not free. For hundreds of calls per day, it’s affordable, but for tens of thousands, you may want to consider alternatives
  • Hopefully, it is now cheaper with the release of GPT-4o.
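A low request-per-minute limit like the one mentioned above can be handled with simple client-side throttling before each API call. Below is a minimal sketch; the `RateLimiter` class is illustrative (it is not part of the OpenAI SDK), and the clock and sleep functions are injectable so the logic can be tested without real waiting.

```python
import time
from collections import deque

class RateLimiter:
    """Allow at most `limit` calls per sliding `window` seconds."""

    def __init__(self, limit: int = 3, window: float = 60.0,
                 clock=time.monotonic, sleep=time.sleep):
        self.limit = limit
        self.window = window
        self.clock = clock
        self.sleep = sleep
        self.calls = deque()  # timestamps of recent calls

    def wait(self):
        now = self.clock()
        # Drop timestamps that have aged out of the window.
        while self.calls and now - self.calls[0] >= self.window:
            self.calls.popleft()
        if len(self.calls) >= self.limit:
            # Sleep until the oldest call leaves the window.
            self.sleep(self.window - (now - self.calls[0]))
        self.calls.append(self.clock())
```

Call `limiter.wait()` right before each API request; the fourth request within a minute will block until the window frees up instead of failing with a 429.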

Google Gemini

Google also has its powerful AI tool, and these are the essential facts to know about it:

  • Gemini (previously Bard) is free in its Nano version. Initially, it lagged behind OpenAI, but the version released in March 2024 was comparable to OpenAI
  • It was much faster than OpenAI while providing comparable inference quality. However, this was likely because we used Gemini Nano, which has only 6 billion parameters compared to 175 billion in GPT-3
  • Gemini was a great, fast, and free alternative to OpenAI API, but customization seemed limited, so we migrated to Azure
  • Gemini has good integration with Google Workspace products.

Azure OpenAI Service

Azure also has its AI-powered tool comparable to OpenAI and Gemini. These are the most important things to know about it:

  • This service isn’t available from the admin UI (unlike, say, Azure Virtual Machines); you’ll have to request access through Azure support
  • Azure OpenAI is an excellent choice for fine-tuned models. You only pay for the operating time of the instance, not for individual calls
  • Azure OpenAI provides GPT-4o, GPT-4, GPT-3.5, Codex, and DALL-E models, meaning you can also generate images with it
  • It integrates perfectly with other Azure services like Power Platform (BI) and Cognitive Services (AI API)
  • The Azure infrastructure offers a vast array of tools. For instance, you can provide a corporate UI similar to ChatGPT for your fine-tuned model or expose your model through a virtual avatar (like Victoria Shi, the Ukrainian AI-generated foreign ministry spokesperson; note that Victoria isn’t an LLM, she only reads the text she is given). You can read more about the Azure AI services we cover here.
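For orientation, a request to an Azure OpenAI chat deployment can be built as a plain REST call. The endpoint shape below follows Azure OpenAI's documented pattern; the resource name, deployment name, API key, and `api-version` value are placeholders for this sketch, so check your own Azure portal for the actual values.

```python
import json

def build_chat_request(resource: str, deployment: str, api_key: str,
                       messages: list[dict],
                       api_version: str = "2024-02-01"):
    """Assemble URL, headers, and JSON body for an Azure OpenAI
    chat-completions call (no network access here)."""
    url = (f"https://{resource}.openai.azure.com/openai/"
           f"deployments/{deployment}/chat/completions"
           f"?api-version={api_version}")
    headers = {"api-key": api_key, "Content-Type": "application/json"}
    body = json.dumps({"messages": messages})
    return url, headers, body

# Hypothetical resource and deployment names, for illustration only.
url, headers, body = build_chat_request(
    "my-resource", "corporate-llm", "SECRET-KEY",
    [{"role": "user", "content": "What services does the company offer?"}])
```

Note that, unlike the OpenAI API, authentication uses an `api-key` header and the model is addressed by your deployment name rather than a model name.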

Evolution of a Corporate LLM and a Corporate Knowledgebase

Our strategy was clear — a corporate LLM that could answer any question as effectively as a Leobit human representative.

This model is built now, but the fine-tuning process is continuous. The model itself changed how we viewed internal digital artifacts and why we generate them. The observation is simple: now we create new company artifacts that were missing as training data for our model. For example, if some important fact is unknown to the AI, we update internal documents so that the corporate model can answer that question upon retraining. This represents a totally different outlook on information — documents that once seemed enormous, boring, and somewhat useless are now crucially important sources for AI training. In many cases, humans don’t need to read huge documents; they can just ask pre-trained AI questions as they arise.

Our Leobit Corporate Rules, for instance, consist of several related documents with hundreds of pages. Now, you can simply ask LeoSlack about vacation policy, reporting rules, or promotion and salary review criteria, and it will provide the answers you need.

Improving the system will be an ongoing process, but it won’t require much development time. Primarily, it relies on the expertise of an LLM Business Analyst to refine the knowledge base and prompt instructions.

AI Tools on top of Leobit’s Corporate LLM

Later, we realized that Leobit’s AI should consist of several separate instances, each instructed for its own particular tasks:

  • Internal Leobit AI Model
  • External Leobit AI Model
  • Image Generation Model

These three models have been trained on different datasets and given different instructions. For image generation, LLMs aren’t used, so to generate corporate images, we worked with specialized models like diffusion models, GANs, and other text-to-image models. Azure OpenAI Service supports DALL-E 2 from OpenAI, which delivers impressive results.
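Conceptually, splitting the AI into separate instances means each request must be routed to the right model with the right instructions. A minimal sketch of such a dispatcher is below; the instance keys, system instructions, and routing rule are all invented for illustration, since the source doesn't describe the real routing logic.

```python
# Hypothetical system instructions per model instance.
INSTRUCTIONS = {
    "internal": "Answer using internal company policies; employees only.",
    "external": "Answer using publicly available company information only.",
    "image": "Generate a corporate-styled image from the prompt.",
}

def route(task: str, audience: str = "external") -> tuple[str, str]:
    """Pick a model instance and its system instruction.

    Image tasks go to the image model; text tasks split by audience,
    so internal data never leaks into external-facing answers.
    """
    if task == "image":
        key = "image"
    elif audience == "employee":
        key = "internal"
    else:
        key = "external"
    return key, INSTRUCTIONS[key]
```

The point of the split is isolation: the external instance is simply never given the internal training data or instructions, which is a stronger guarantee than asking one shared model to withhold information.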

Having built three custom AI models, we are going to roll out about 20 different solutions using them. Five are out already, and we’re tweaking the rest. Training a model is an ongoing process, so even the ones we’ve released are still being fine-tuned.

The possibilities for implementing Large Language Models (LLMs) are endless. These solutions can be adapted across various industries. The top sectors that stand to gain significantly from incorporating custom pre-trained corporate LLMs include legal services, education, finance and banking, healthcare, retail, and technology.


LLM Landscape

GPT and Gemini aren’t the only LLMs available, nor are they the largest. Many other LLMs come from different vendors, some with a much larger number of parameters, like Megatron-Turing from NVIDIA/Microsoft (530 billion parameters) or PaLM from Google (540 billion parameters). Others are tailored for specific tasks, like Codex from OpenAI and GitHub Copilot from GitHub/Microsoft, which specialize in writing code.

For the Chinese language, there are several models: PanGu-α from Huawei (200 billion parameters), Z-Code from Alibaba (70 billion parameters), and Wu Dao 2.0 from Beijing Academy (1,750 billion parameters). However, only PanGu-α is available through an API.

Returning to Gemini, it has three versions: Gemini Nano with 6 billion parameters, Gemini Pro with 60 billion, and Gemini Ultra with 540 billion, but the free version is only available for Gemini Nano.

GPT-5 is projected to be released in July 2024 and is expected to have almost three times as many parameters as GPT-3 (500 billion compared to 175 billion). Note that the number of parameters for GPT-4 hasn’t been publicly disclosed (it’s likely larger than 175 billion, but possibly the same). You can now speak with GPT-4o (released in May 2024) almost as fluently as with a human, ask it to read you a poem, or have it create and sing a song. While it is not yet an artificial general intelligence (AGI), its current capabilities are impressive.

As you can see, the landscape of LLMs is quite complex and diverse, so choosing the right model can be a challenging task. At Leobit, we run R&D projects with each of the major vendors (OpenAI, Microsoft, NVIDIA, AWS, and Meta) in order to better advise our customers on the preferred choice for their specific needs.


Corporate LLM Implementation & Privacy/Security Concerns

Initial implementation of a corporate LLM doesn’t have to be complex or difficult. While building an ideal LLM can be challenging, even a simple implementation can be more than sufficient for many conversational AI and discriminative AI applications within corporate workflow automation. In many cases, you might not even need a fine-tuned model; using a basic foundation model or a RAG (Retrieval-Augmented Generation) approach to query your data is quite straightforward. Even smaller companies can implement it on their own if they have an IT department. However, it is always best to consult with companies specializing in AI/LLM, like Leobit.
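To show why the RAG approach is straightforward, here is a minimal offline sketch: retrieve the most relevant snippet from a small knowledge base and prepend it to the prompt. A real system would use an embedding model and a vector store; a toy bag-of-words cosine similarity stands in here so the example runs without any API, and the two policy documents are invented for the sketch.

```python
import math
from collections import Counter

# Tiny stand-in for a corporate knowledge base (invented content).
DOCS = [
    "Employees get 20 vacation days per year; requests go through HR.",
    "Salary reviews happen twice a year, based on performance goals.",
]

def vectorize(text: str) -> Counter:
    """Toy embedding: word-count vector (real RAG uses an embedding model)."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def retrieve(question: str) -> str:
    """Return the knowledge-base snippet most similar to the question."""
    qv = vectorize(question)
    return max(DOCS, key=lambda d: cosine(qv, vectorize(d)))

def build_prompt(question: str) -> str:
    """Ground the LLM call in retrieved context instead of fine-tuning."""
    return f"Context: {retrieve(question)}\n\nQuestion: {question}"
```

The foundation model then answers from the retrieved context, so the knowledge base can be updated without retraining anything.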

Some organizations are resistant to intensive usage of Gen AI due to privacy and security concerns or algorithmic biases. Although these concerns are quite serious for some industries, many of them can be mitigated. For example, you can start by applying AI in areas where a certain level of bias and hallucination can be tolerated, as attempts to fully eliminate them may be inefficient. Conversational AI in sales and HR is a good example of where this technology can be applied, and it’s worth noting that these systems will improve over time.

For better data privacy, you can use AI Models as a Service, such as Azure OpenAI Service. If that’s not sufficient, you can run open-source LLMs as privately hosted AI models.


Summary

The fact that AI tools are already fully covering some parts of the corporate routine is fabulous.

The main takeaway is that whether you believe it or not, AI is here and ready to take human jobs. AI is already doing impressive things, and we may be amazed by its capabilities in a year — it’s better to be prepared. 

Many companies haven’t implemented a corporate LLM yet, although such a system could significantly enhance their operations or elevate their product, taking a step forward toward AI transformation. It’s inevitable: either you build LLMs along with corporate AI tools now to stay ahead of competitors, or you’ll miss the boat when it’s already too late.

If you have any questions, I’d be happy to answer them. Please connect with me on LinkedIn.

If you’re looking to implement AI/LLM for your company, submit the form below or read more about our AI/ML & Generative AI Services. We’ll be happy to assist you with consulting and AI development.
