Scaling Intelligence: How FSN Capital Uses GenAI to Transform the PE Value Chain



Every company has processes that have long relied on manual research and fragmented data. But Benjamin Grether, who leads Digital Value Creation at FSN Capital, is proving that generative AI can turn these cumbersome processes into a scalable engine for smarter decisions.

His talk at the AI & Data Convention offered rare insights into how AI is becoming a game changer across the private equity value chain – and how that is transferable to the general finance function.

Active ownership and digital value creation in PE through AI

Private equity firms take differing approaches to how they manage and create value in their portfolio companies, but FSN Capital’s speaker Benjamin Grether made a particularly apt comparison to explain the firm’s focus on digital capabilities in value creation: “PE is like house flipping, but with companies. In this analogy, a Digital Value Creation Manager is like the electrician who is converting a house into a smart home, which can then be sold for a higher price.” In his role, he is responsible for driving data-driven decision making and helping companies build the required infrastructure and tools – not just at individual companies, but across an entire investment portfolio.

In his presentation at the AI & Data Convention, organized by Horváth, he showed how artificial intelligence – specifically, large language models (LLMs) – is becoming a key tool in making strategic M&A scalable.

The pain point: too much data, too little structure

Private equity has never had a bigger role in shaping the business landscape. Today, 88 percent of companies with more than $100 million in annual recurring revenue are private, and the private market is growing twice as fast as its public counterpart. Yet sourcing these hidden champions – especially in the small and mid-cap segments – remains a manual and resource-intensive effort.

For larger companies, reliable public data is usually available. But for SMBs, it’s a different story: company databases are inconsistent, reporting is limited, and the information that does exist, for example on company websites, is largely unstructured. Traditional deal sourcing approaches in PE firms – such as sending teams of analysts to screen hundreds of websites – can’t keep pace with the scale and speed required. Something had to change.

The breakthrough: from URLs to structured intelligence

That change came with advances in large language models. The key realization: with AI, one can now extract relevant information from company websites, classify companies based on strategic fit, and even estimate missing financials. The process runs in multiple iterations until quality targets are met, but generally follows a three-step schema.

Step 1: Input and extraction

The starting point is a long list of company URLs (tens of thousands to hundreds of thousands of them). An LLM is prompted to extract information relating to a specific investment thesis for each of these companies. For example, the system might look for:

  • Number of employees
  • Focus areas (e.g. smart building, energy efficiency)
  • Customer types (e.g. B2B)
  • Ownership indicators (e.g. family-owned, succession mentions)

Thanks to recent model developments like the introduction of Gemini 1.5, which supports much larger input sizes than previous versions, entire websites can now be processed in one pass. This unlocks a level of nuance and contextual understanding that simple keyword scraping could never achieve.
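Under the hood, such an extraction pass amounts to a prompt-and-parse step. The following is a minimal Python sketch, not FSN Capital's actual setup: the field names are illustrative, and the hypothetical `call_llm` helper is stubbed with a canned answer so the pipeline shape stays runnable.

```python
import json

# Fields tied to a sample investment thesis; the names are illustrative.
EXTRACTION_PROMPT = """Extract the following fields from the company website
text and answer strictly as JSON: employee_count, focus_areas,
customer_types, ownership_hints. Use null for fields that are not mentioned.

Website text:
{page_text}
"""

def call_llm(prompt: str) -> str:
    """Stub standing in for a real long-context LLM API call; it returns a
    canned JSON answer so the sketch runs without network access."""
    return json.dumps({
        "employee_count": 120,
        "focus_areas": ["smart building", "energy efficiency"],
        "customer_types": ["B2B"],
        "ownership_hints": ["family-owned"],
    })

def extract_company_profile(page_text: str) -> dict:
    """One extraction pass: feed the full page text to the model and parse
    its JSON answer into a structured record."""
    raw = call_llm(EXTRACTION_PROMPT.format(page_text=page_text))
    return json.loads(raw)

profile = extract_company_profile(
    "ACME GmbH builds smart-building controls for B2B clients ..."
)
```

In production, `call_llm` would wrap the provider's API, and the JSON parsing would need validation and retries, since models occasionally return malformed output.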

Step 2: Classification as a follow-up pass

After extracting relevant information from company websites, the next phase involves a second iteration: classification. This step organizes companies based on the extracted features and a list of pre-defined categories. It thus enables a more structured analysis and comparison across potential investment opportunities.
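This second pass can itself be another LLM prompt over the extracted record; the deterministic sketch below, with made-up category names and rules, keeps the idea concrete and runnable.

```python
# Pre-defined categories would come from the investment thesis; these
# category names and rules are invented for the sketch. In practice this
# pass can also be a second LLM call over the extracted record.
CATEGORIES = {
    "smart_building": lambda p: "smart building" in p.get("focus_areas", []),
    "b2b": lambda p: "B2B" in p.get("customer_types", []),
    "succession_candidate": lambda p: "family-owned" in p.get("ownership_hints", []),
}

def classify(profile: dict) -> list:
    """Map an extracted company record onto the pre-defined categories."""
    return [name for name, rule in CATEGORIES.items() if rule(profile)]

labels = classify({
    "focus_areas": ["smart building", "energy efficiency"],
    "customer_types": ["B2B"],
    "ownership_hints": [],
})
```

Keeping classification as a separate pass, rather than folding it into extraction, makes it cheap to re-run when the category list changes without re-scraping any websites.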

Step 3: Estimating the unknown

For key figures like revenue or EBITDA, where hard data is missing, the system applies proxy-based estimation. For instance, the number of employees, size of customer projects, or the presence of pricing information can all serve as indirect indicators. While not as accurate as audited figures, these estimates are often directionally reliable within domain-specific bounds – enabling better prioritization of targets.
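A proxy rule of this kind can be as simple as headcount times a sector benchmark. The benchmark figures below are invented for the sketch and would in reality come from domain-specific comparables.

```python
# Illustrative revenue-per-employee benchmarks in EUR; the numbers are
# made up for this sketch, not real sector data.
SECTOR_REV_PER_EMPLOYEE = {
    "smart_building": 180_000,
    "default": 150_000,
}

def estimate_revenue(employee_count, sector="default"):
    """Proxy-based estimate: headcount times a sector benchmark.
    Returns None when no proxy is available rather than guessing."""
    if employee_count is None:
        return None
    per_head = SECTOR_REV_PER_EMPLOYEE.get(
        sector, SECTOR_REV_PER_EMPLOYEE["default"]
    )
    return employee_count * per_head

estimate = estimate_revenue(120, "smart_building")  # 120 x 180,000 EUR
```

Because the output is only directionally reliable, such estimates are best used to rank and prioritize targets, not to value them.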

What makes this approach powerful is its scalability: the same system can be applied across hundreds or thousands of potential targets with minimal marginal effort, and without diluting insight quality.

Embedding AI across the PE value chain

Beyond deal sourcing, PE companies can use AI across every stage of the investment lifecycle:

  1. In fundraising, AI tools assist in screening potential investors and in answering detailed investor questionnaires on fund setup, investment strategy, ESG impact, etc.
  2. In deal sourcing, the discussed example applies.
  3. In due diligence, generative models summarize legal and financial documents, highlighting inconsistencies or missing elements.
  4. In value creation, identifying, mapping, and implementing high-impact AI use cases across the portfolio boosts value and creates opportunities to substantially reposition companies in terms of their business model, their value proposition, or their efficiency.
  5. In exit processes, AI can enhance exit readiness by automating compliance processes such as SOX controls and by preparing documents for the data room.

A centralized digital team can then support the rollout and adoption of these tools – ensuring that AI doesn’t remain siloed in innovation labs, but becomes a standard part of how the firm operates.

From PE to finance: what controllers can take away

Although the case presented was specific to private equity, the underlying approach has clear implications for controllers and finance professionals. Many finance teams are sitting on troves of unstructured or semi-structured data: supplier websites, invoice attachments, PDF contracts, customer feedback, even ERP comment fields. These data sources are rich in insight – but practically unusable with traditional tooling.

The presentation offered a blueprint for turning that data into actionable intelligence:

  • Go beyond chatbots: use LLMs to automate data-extraction processes, pulling key metrics or descriptors from web or document sources into structured form.
  • Apply classification logic to segment data based on relevant criteria.
  • Estimate missing data with proxy rules, especially when trying to model or forecast outcomes based on incomplete records.

For instance, a controller might employ this approach to analyze supplier websites for ESG indicators or to assess compliance risks when direct data is unavailable.
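As a minimal illustration of that controller use case, the sketch below flags which ESG pillars a supplier page mentions at all. The indicator keywords are assumptions; a production setup would curate them with the sustainability team and likely use an LLM extraction pass, as in the sourcing example.

```python
# Illustrative ESG indicator keywords; these lists are assumptions made
# for the sketch, not a vetted taxonomy.
ESG_INDICATORS = {
    "environment": ["iso 14001", "carbon neutral", "renewable energy"],
    "social": ["fair wages", "code of conduct"],
    "governance": ["whistleblower", "compliance officer"],
}

def esg_flags(page_text: str) -> dict:
    """Flag which ESG pillars a supplier page mentions at all."""
    text = page_text.lower()
    return {
        pillar: any(keyword in text for keyword in keywords)
        for pillar, keywords in ESG_INDICATORS.items()
    }

flags = esg_flags(
    "We are ISO 14001 certified and operate a whistleblower hotline."
)
```

Even this crude scan turns unusable free text into a structured yes/no table that a BI dashboard or risk model can consume directly.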

As the speaker put it: “Revisit use cases you ruled out last year. The models have improved significantly in terms of scalability and quality. You’ll be surprised what’s possible now.”

An important lesson is this: LLMs are not just better search engines – they are engines for transforming unstructured data into structured data. And once structure is in place, the rest of the finance toolbox – BI dashboards, planning models, scenario tools – can do what they do best.

Conclusion: 3x, 10x, Infinity

At the end of the session, the speaker returned to his opening slide – a simple one showing three symbols: 3x, 10x, ∞.

  • 3x PE target return through portfolio value creation
  • 10x productivity in sourcing the right deals through AI
  • ∞ - the open horizon of what becomes possible when AI is embedded into strategy and operations

For finance leaders and controllers, the message is clear: AI isn’t just about replacing manual work. It’s about reframing what’s possible – by combining structured thinking with unstructured data and turning insight into action at scale.