
On September 16th, 2025, our teams hosted an internal conference titled “A Journey Through Generative AI”. The goal was to demystify generative artificial intelligence, explain its core concepts, and share real-world applications.

Led by Roxane Jouseau and Hichame Haichour, AI experts from Novencia, the session brought together teams and enthusiasts to explore topics such as Large Language Models (LLMs): their architecture, how they are trained, and how they can be applied to real business problems.

Understanding the Basics of Generative AI

The three levels of Artificial Intelligence

- ANI (Artificial Narrow Intelligence): AI focused on a single, narrow task.

- AGI (Artificial General Intelligence): AI with human-like reasoning ability.

- ASI (Artificial Super Intelligence): AI that surpasses human capabilities (still theoretical).

How do LLMs work?

Large Language Models (LLMs) rely on:

- Machine learning (supervised and self-supervised),

- Embeddings to represent language in vectors,

- The Transformer architecture, based on the attention mechanism to capture context and meaning.
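To make the attention mechanism concrete, here is a minimal sketch of scaled dot-product self-attention in NumPy. The dimensions and weight matrices are arbitrary toy values, not those of any real model, and a production Transformer adds multiple heads, masking, and learned parameters:

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence of embeddings.

    X: (seq_len, d_model) token embeddings. Wq/Wk/Wv project them to
    queries, keys, and values; each output row is a context-aware blend
    of the value vectors, weighted by query-key similarity.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # pairwise relevance
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))                          # 4 tokens, 8-dim embeddings
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8): one context-aware vector per token
```

This is the core operation that lets the model weigh every token against every other token when building its representation of context.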

Training an LLM: A Colossal Challenge

Building a model like GPT-4 or Gemini requires:

- Thousands of high-performance GPUs,

- Massive compute time (weeks or even months),

- Huge financial investments, often in the tens of millions of dollars.

Example: GPT-5 and Grok 3 highlight the challenges of scalability, cost, and energy consumption.

Evaluating an LLM: Towards More Reliability

Evaluation methods

- Benchmarking with reference datasets,

- LLM-as-a-judge, where one model evaluates another.
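In its simplest form, benchmarking reduces to scoring model outputs against a fixed set of reference answers. A toy sketch follows; the exact-match metric and sample data are illustrative assumptions, and real benchmarks use much larger datasets and richer metrics:

```python
def exact_match_accuracy(predictions, references):
    """Fraction of model answers that exactly match the reference answer,
    after lowercasing and stripping surrounding whitespace."""
    norm = lambda s: s.strip().lower()
    hits = sum(norm(p) == norm(r) for p, r in zip(predictions, references))
    return hits / len(references)

preds = ["Paris", "4", "the transformer"]
refs  = ["paris", "5", "The Transformer"]
print(exact_match_accuracy(preds, refs))  # 2 of 3 match -> 0.666...
```

LLM-as-a-judge replaces this rigid string comparison with a second model that grades each answer for quality and accuracy, which handles open-ended responses that exact matching cannot.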

Why honesty matters

New research shows the importance of models being able to admit uncertainty when they don’t know an answer.

Some LLM evaluators now outperform humans in consistency and coherence, making them powerful tools for quality assurance.

Real-World Use Case: Automating Configuration File Generation

During the session, we explored a concrete example: automating JSON file generation for a data quality tool.

Two approaches were tested:

  1. Prompt Augmentation (RAG): enriching prompts with contextual documents,

  2. Supervised fine-tuning: adapting an open-source model for a specific business use case.
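The first approach, prompt augmentation, can be sketched as follows. The keyword-overlap retriever and the sample documents here are illustrative stand-ins: a real RAG pipeline would use embedding-based vector search over the tool's actual documentation:

```python
import re

def tokens(text):
    """Lowercased word tokens, ignoring punctuation."""
    return set(re.findall(r"\w+", text.lower()))

def retrieve(query, documents, k=1):
    """Naive retrieval: rank documents by word overlap with the query."""
    q = tokens(query)
    scored = sorted(documents,
                    key=lambda d: len(q & tokens(d)),
                    reverse=True)
    return scored[:k]

def augment_prompt(query, documents):
    """Prepend the most relevant context documents to the user question."""
    context = "\n".join(retrieve(query, documents))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

docs = [
    "Field 'customer_id' must be a non-null string.",
    "Timestamps use ISO 8601 with a UTC offset.",
]
prompt = augment_prompt("What type is customer_id?", docs)
print(prompt)
```

The enriched prompt grounds the model in the relevant specification before it generates the JSON configuration, reducing hallucinated field names and types.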

Outcome: faster configuration setup and improved data quality.

What’s Next for Generative AI?

Upcoming trends to watch include:

- Multimodal LLMs (text, image, audio, video),

- Autonomous AI agents able to reason and execute tasks,

- Optimization and quantization to reduce inference costs,

- New approaches to AI safety and reliability.
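Quantization, one of the optimization techniques above, trades a little precision for much cheaper storage and inference. A minimal sketch of symmetric per-tensor int8 quantization, with toy weight values chosen for illustration:

```python
import numpy as np

def quantize_int8(weights):
    """Symmetric per-tensor int8 quantization: map floats into [-127, 127]
    using a single scale factor derived from the largest absolute weight."""
    scale = np.abs(weights).max() / 127.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 representation."""
    return q.astype(np.float32) * scale

w = np.array([0.42, -1.3, 0.07, 0.9], dtype=np.float32)
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)
print(np.max(np.abs(w - w_hat)))  # worst-case error is at most scale / 2
```

Storing int8 instead of float32 cuts memory four-fold; production schemes refine this with per-channel scales and quantization-aware training.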

Conclusion

This conference helped our teams build a stronger understanding of Generative AI fundamentals and how they translate into real-world use cases. By combining theory with practice, we aim to spread AI culture across teams and spark new collaborations.


Frequently Asked Questions

What is Generative AI?

Generative AI is a branch of artificial intelligence that can create new content such as text, images, audio, or code by learning patterns from large datasets.

What are LLMs (Large Language Models)?

LLMs are advanced machine learning models trained on massive text datasets. They use transformer architecture to understand context and generate human-like language.

How are LLMs evaluated?

LLMs are tested through benchmarking (fixed datasets) and LLM-as-a-judge, where one AI model evaluates another’s output for quality and accuracy.

What’s the future of Generative AI?

The future lies in multimodal models, autonomous AI agents, and energy-efficient optimization techniques that make AI more scalable and reliable.

