
Machine Learning Patents from Germany

A comprehensive analysis of German innovation in neural networks, deep learning, knowledge-based AI, quantum computing, and pattern recognition — from early foundations to industrial scale.

Data: EPO PATSTAT Global, Autumn 2025 · Period: 2014–2024 · Created: February 2026 · Author: mtc.berlin
  • 7,732 patent families (10,932 applications total)
  • 16x growth 2014–2021 (98 to 1,595 families)
  • #1 sub-area: Neural Nets, 4,427 families (57%)
  • #1 applicant: Bosch, 1,835 families
  • #1 filing route: USA, 3,611 applications (33%)

Key Takeaways

Eight insights for decision-makers

Automotive dominates. 8 of the top 15 patent filers are automotive companies or suppliers — led by Bosch (1,835 families), followed by Siemens, SAP, and VW.

Paradigm shift to domestic-first. Until 2021, the USPTO was the #1 filing destination. From 2022, German applicants file more at DPMA and via PCT than directly in the US.

Quantum Computing surging. From 2 families in 2014 to 176 in 2023 (88x), making it the fastest-growing ML sub-area by far — still small but rapidly emerging.

SAP breaks the mold. The only top filer where general ML and knowledge-based systems outweigh Neural Networks — reflecting its enterprise AI and business-process focus.

Healthcare is #2 application domain. G16H (Healthcare IT, 445 families) + A61B (Diagnosis, 417) = 862 families, driven by Siemens Healthcare and Siemens Healthineers.

Strong China connection. Huawei (128 joint families) and Tsinghua University (45) are among the top international R&D partners for German ML innovators.

Siemens Healthcare leads citation impact. 25.1 citations per family — the highest density of any large filer. 6 of the 10 most-cited German ML families are from Siemens Healthcare/Healthineers.

1,015 "hidden" ML patents. A multi-layer search found 13% more families — patents using ML that are classified under their application domain (image processing, control systems, automotive) rather than under G06N.

Filing Activity

Annual patent families filed by German applicants in ML/AI classifications, 2014–2024. A family may appear in multiple years if it has filings in different years.

Inflection point 2017: Filing activity more than doubled from 155 families (2016) to 355 (2017), coinciding with the global deep learning wave triggered by breakthroughs in image recognition and natural language processing. Growth continued to 1,595 families in 2021; since then, activity has plateaued at roughly 1,400–1,600 families per year (1,603 in 2023), which likely reflects maturation rather than decline.

* 2024 data is incomplete due to the 18-month publication lag.

Data table: Filing Activity 2014–2024
Year | Families | Granted | Co-apps
2014 | 98 | 72 | 50
2015 | 113 | 87 | 23
2016 | 155 | 108 | 13
2017 | 355 | 250 | 55
2018 | 654 | 435 | 94
2019 | 1,165 | 707 | 152
2020 | 1,347 | 637 | 239
2021 | 1,595 | 518 | 321
2022 | 1,388 | 293 | 261
2023 | 1,603 | 166 | 256
2024* | 800 | 43 | 103

Technology Landscape

Distribution across ML/AI sub-areas (IPC G06N main groups + G06F18)

Sub-Area | IPC | Families
Neural Networks / Deep Learning | G06N3 | 4,427
Machine Learning (general) | G06N20 | 2,931
Knowledge-based / Expert Systems | G06N5 | 1,049
Pattern Recognition | G06F18 | 672
Mathematical / Probabilistic | G06N7 | 482
Quantum Computing | G06N10 | 461
Other AI | G06N99 | 148

Note: Families may appear in multiple sub-areas if classified under more than one IPC main group. Sub-area totals therefore exceed the 7,732 unique families.

Neural Networks dominate with over half of all families (4,427). The Quantum Computing segment (461 families) is still small but grew from 2 families in 2014 to 176 in 2023 — an 88x increase, making it the fastest-growing sub-area by far.

Data table: ML Sub-Areas
Sub-Area | IPC | Families | Share
Neural Networks / Deep Learning | G06N3 | 4,427 | 57.3%
Machine Learning (general) | G06N20 | 2,931 | 37.9%
Knowledge-based / Expert Systems | G06N5 | 1,049 | 13.6%
Pattern Recognition | G06F18 | 672 | 8.7%
Mathematical / Probabilistic | G06N7 | 482 | 6.2%
Quantum Computing | G06N10 | 461 | 6.0%
Other AI | G06N99 | 148 | 1.9%

Technology Evolution

How the ML technology mix has shifted over time

Paradigm shift: In 2014, Knowledge-based systems (42 families) led ahead of Neural Networks (24). By 2017, Neural Networks had surged past all other areas, and by 2023 they accounted for 842 families, a 35x increase in under a decade. Pattern Recognition (G06F18) only became a substantial category from 2019, coinciding with the revised IPC 2019.01 edition. Quantum Computing became a measurable force from 2020 onward.

Data table: Technology Evolution 2014–2024
Year | Neural Nets | ML | Knowledge | Pattern | Quantum | Math
2014 | 24 | 28 | 42 | 0 | 2 | 24
2015 | 28 | 43 | 41 | 4 | 0 | 23
2016 | 48 | 60 | 37 | 7 | 1 | 32
2017 | 183 | 137 | 60 | 20 | 1 | 44
2018 | 389 | 235 | 108 | 26 | 6 | 65
2019 | 760 | 420 | 157 | 101 | 25 | 83
2020 | 870 | 494 | 180 | 91 | 51 | 81
2021 | 933 | 575 | 193 | 133 | 95 | 74
2022 | 812 | 501 | 149 | 76 | 91 | 45
2023 | 842 | 557 | 147 | 166 | 176 | 33
2024* | 427 | 263 | 80 | 85 | 95 | 14

Neural Network Architectures & Learning Methods

Breakdown of the 4,427 neural network families by IPC subgroup (G06N3)

Architecture Types

Learning Paradigms

Auto-encoders (141) and CNNs (138) lead the specific architecture types, while Generative Networks / GANs (94) and Recurrent Networks / RNNs (73) follow. Among learning methods, Supervised learning (149) dominates, but Reinforcement Learning (89), Transfer Learning (66), and Federated Learning (57) show strong emerging interest. The Federated Learning segment is particularly noteworthy for privacy-preserving AI applications in automotive and healthcare.

Data table: Neural Network Architectures
Architecture | Families
Architecture / topology | 1,392
Combinations of nets | 391
Hardware implementation | 176
Auto-encoder / Enc-Dec | 141
CNN (Convolutional) | 138
GAN (Generative) | 94
RNN (Recurrent) | 73
Probabilistic / stochastic | 69
Knowledge-based NN | 59
Federated / distributed | 57
Data table: Learning Paradigms
Method | Families
Learning methods (gen.) | 2,377
Backpropagation | 178
Supervised learning | 149
Unsupervised learning | 97
Reinforcement learning | 89
Transfer learning | 66
Adversarial learning | 61
Architecture mod. (NAS) | 59
Semi-/self-supervised | 40
Meta-learning | 40

Top Patent Leaders

The 15 most active German ML patent filers by family count

Data table: Top 15 Patent Leaders
# | Applicant | Families | Primary Focus | Sector
1 | Robert Bosch GmbH | 1,835 | Neural Networks (72%) | Automotive / IoT
2 | Siemens AG | 791 | Neural Networks + Knowledge-based | Industrial / Digital
3 | SAP SE | 688 | ML general + Knowledge-based | Enterprise Software
4 | Siemens Healthcare GmbH | 363 | Neural Networks (68%) | Medical Imaging
5 | Volkswagen AG | 246 | Neural Networks (66%) | Automotive
6 | BMW AG | 245 | Neural Networks (59%) | Automotive
7 | IBM Deutschland GmbH | 217 | Other / cross-cutting | Technology / R&D
8 | NEC Europe Ltd | 169 | Neural Networks (58%) | IT / Telecom
9 | ZF Friedrichshafen AG | 163 | Neural Networks (83%) | Automotive Supplier
10 | Siemens Healthineers AG | 139 | Neural Networks (71%) | Medical Devices
11 | Fraunhofer-Gesellschaft | 128 | Neural Networks (67%) | Research
12 | Porsche AG | 106 | Neural Networks (62%) | Automotive
13 | Mercedes-Benz Group AG | 99 | ML general (60%) | Automotive
14 | Continental Automotive | 90 | Neural Networks (75%) | Automotive Supplier
15 | Audi AG | 81 | Neural Networks (66%) | Automotive

Robert Bosch dominates with 1,835 families — more than the next two combined (Siemens + SAP = 1,479). The automotive sector accounts for 8 of the top 15 filers. SAP is the outlier: it is the only top filer where general ML and Knowledge-based systems outweigh Neural Networks, reflecting its enterprise and business-process focus. ZF Friedrichshafen has the highest Neural Network concentration (83%) of any top filer.

Application Domains

Where is German ML being applied? Top co-occurring IPC subclasses on ML families

Computer Vision (G06V, 1,365 families) and Image Processing (G06T, 1,103) together form the largest application area: combined, they appear on more ML families than any single ML sub-area. Vehicle Control (B60W, 746) and Traffic Control (G08G, 355) confirm the automotive dominance. Healthcare (G16H + A61B = 862) is the second-largest application domain, driven by Siemens Healthcare and Siemens Healthineers. Robotics (B25J, 133) and Speech Recognition (G10L, 164) round out the industrial landscape.

Data table: Application Domains
IPC | Domain | Families
G06F | Digital Data Processing | 2,420
G06V | Computer Vision | 1,365
G06T | Image Processing | 1,103
G06K | Data Reading | 1,051
G05B | Control Systems | 839
B60W | Vehicle Control | 746
G06Q | Business / Admin | 634
G16H | Healthcare IT | 445
A61B | Diagnosis / Surgery | 417
H04L | Digital Transmission | 362
G08G | Traffic Control | 355
B60R | Vehicle Fittings | 296
G01S | Navigation / Radar | 285
G01N | Materials Analysis | 243
H04W | Wireless Comms | 217
G10L | Speech Recognition | 164
B25J | Robotics | 133

Geographic Filing Strategy

Where German ML innovators seek patent protection (application-level counts by filing office)

Total Filings by Office

Jurisdiction Trends Over Time

Note: WO/PCT is not a target market but an international filing route that typically leads to national/regional phase entries (EP, US, CN, etc.). A family may therefore be counted at multiple offices.

Dramatic geographic shift: In 2014, the US Patent Office was the #1 filing destination (67 apps vs. 17 DE). By 2022, the German Patent Office (DPMA) overtook the USPTO (456 vs. 404), and by 2023, domestic filings led all routes (600 DE vs. 404 WO vs. 313 US). This reflects both a strategic shift toward PCT filings for broader international coverage and a growing domestic-first approach. South Korea (KR) is a consistent secondary market, particularly for automotive-related ML.

Data table: Geographic Filing Trends
Year | US | DE | EP | WO | KR
2014 | 67 | 17 | 20 | 19 | 5
2015 | 65 | 27 | 20 | 13 | 6
2016 | 94 | 31 | 32 | 11 | 8
2017 | 203 | 85 | 62 | 43 | 11
2018 | 380 | 159 | 113 | 65 | 24
2019 | 577 | 327 | 261 | 125 | 35
2020 | 627 | 388 | 330 | 251 | 44
2021 | 714 | 389 | 357 | 345 | 72
2022 | 404 | 456 | 353 | 309 | 69
2023 | 313 | 600 | 372 | 404 | 60
2024* | 167 | 184 | 103 | 342 | 35

International R&D Network

Non-German co-applicants on co-filed ML families (nb_applicants > 1). Includes both external partners and international subsidiaries of German companies.

Top International Co-Applicants

External Partners vs. Subsidiaries

Partner | Country | Families | Type
IBM Corp | US | 220 | External
Huawei Technologies | CN | 128 | External
Siemens Corp | US | 59 | Subsidiary
NEC Corp | JP | 46 | External
Siemens Ltd China | CN | 45 | Subsidiary
Tsinghua University | CN | 45 | Academic
Bosch India (R&D) | IN | 26 | Subsidiary
IBM China | CN | 25 | External
Carnegie Mellon Univ | US | 24 | Academic
Nanyang Tech Univ | SG | 21 | Academic
F. Hoffmann-La Roche | CH | 14 | External
Intel Corp | US | 11 | External

IBM is the #1 external R&D partner for German ML applicants with 220 joint families. Huawei (128 families) is the second-largest — a strong China connection reflecting joint R&D operations. International subsidiaries (Siemens US/China, Bosch India) represent global R&D coordination within German multinationals. Academic collaborations are notable: Tsinghua University (45), Carnegie Mellon (24), and Nanyang Tech (21) span three continents of research partnerships.

Who Works With Whom

Co-applicant pairs on German ML patent families — grouped by corporate parent to consolidate name variants

University – Industry Pairs

Industry Partner | University / Research | Families
Robert Bosch | Tsinghua University (CN) | 45
Robert Bosch | Uni Freiburg (DE) | 37
Robert Bosch | Carnegie Mellon (US) | 24
Continental | Nanyang Tech (SG) | 19
Eleqtron | Uni Siegen (DE) | 11
FZ Juelich | RWTH Aachen (DE) | 10
Robert Bosch | Leibniz Uni Hannover (DE) | 6
Carl Zeiss Meditec | TU Muenchen (DE) | 5
Toyota | Max Planck Informatik (DE) | 5
Fraunhofer | Uni des Saarlandes (DE) | 5

Industry – Industry Pairs

Company A | Company B | Families
Audi | Volkswagen | 15
Porsche | Volkswagen | 15
Audi | Porsche | 14
Siemens | Siemens Healthcare | 12
CARIAD | Robert Bosch | 8
CARIAD | Volkswagen | 7
BASF | Robert Bosch | 5

Bosch is the collaboration king — partnering with Tsinghua (45 families), Uni Freiburg (37), and Carnegie Mellon (24), spanning three continents. The VW Group shows intense internal knowledge-sharing between Audi, Porsche, VW, and their software unit CARIAD. A notable emerging pair: Eleqtron x Uni Siegen (11 families) represents the quantum computing startup ecosystem. The Bosch-Tsinghua axis alone accounts for more co-filings than the entire VW Group's internal collaboration.

Grant Rates by Filing Office

Percentage of applications that have been granted, by patent office. Note: Grant rates for recent filings (2022+) are artificially low due to examination pendency.

Office | Applications | Granted | Grant Rate
US (USPTO) | 3,611 | 2,027 | 56.1%
DE (DPMA) | 2,663 | 586 | 22.0%
EP (EPO) | 2,023 | 437 | 21.6%
WO (PCT) | 1,927 | 61 | 3.2%*
KR (KIPO) | 369 | 92 | 24.9%
AU (IP Australia) | 52 | 17 | 32.7%
FR (INPI) | 49 | 30 | 61.2%

* PCT/WO is a filing route, not a granting office. The 3.2% reflects national phase entries that were granted while still tracked under the WO application.

At 56.1%, the USPTO's grant rate is more than double the EPO (21.6%) and DPMA (22.0%) rates. This largely reflects examination speed: more of the US filings have already reached a final decision, while many European applications are still pending. The near-identical DPMA and EPO rates suggest consistent European examination standards. Because many recent filings (2022–2024) are still under examination, all of these rates will rise over time.
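The grant rates in the table are straightforward ratios of granted to filed applications. A minimal sketch recomputing three of them from the table's own counts:

```python
# Recompute grant rates from the (applications, granted) counts in the table.
# Only the three largest offices are shown; numbers are taken from the report.
offices = {
    "US": (3611, 2027),
    "DE": (2663, 586),
    "EP": (2023, 437),
}

# Grant rate = granted / applications, expressed as a percentage
rates = {office: round(100 * granted / apps, 1)
         for office, (apps, granted) in offices.items()}

print(rates)  # {'US': 56.1, 'DE': 22.0, 'EP': 21.6}
```

Note that the denominators include 2022–2024 filings that are still pending, which is why these rates are lower bounds.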

Citation Impact

Which German ML patent families and applicants generate the most downstream citations?

Most Cited Applicants

Citation Impact vs. Portfolio Size

Applicant | Cited Families | Total Citations | Avg/Family
Siemens Healthcare | 332 | 8,338 | 25.1
Robert Bosch | 921 | 6,970 | 7.6
SAP SE | 553 | 6,692 | 12.1
Siemens AG | 514 | 5,634 | 11.0
Siemens Healthineers | 94 | 1,955 | 20.8
NEC Europe | 121 | 1,691 | 14.0
Fraunhofer | 66 | 1,019 | 15.4
Volkswagen | 147 | 1,179 | 8.0
IBM Deutschland | 159 | 1,044 | 6.6
BASF | 38 | 511 | 13.4

Most Cited Individual Families

# | DOCDB Family | Applicant | Year | Citations
1 | 55969747 | Siemens AG / Siemens Healthcare | 2015 | 484
2 | 60888377 | Autonomos GmbH | 2017 | 399
3 | 59581846 | Siemens Healthcare / Healthineers | 2017 | 339
4 | 60327326 | Robert Bosch GmbH | 2016 | 311
5 | 67616469 | Rudolf Schreiner | 2019 | 269
6 | 55910782 | Siemens Healthcare GmbH | 2015 | 245
7 | 58558973 | Siemens Healthcare GmbH | 2017 | 236
8 | 48626420 | European Molecular Biology Lab | 2014 | 232
9 | 52727129 | Fraunhofer-Gesellschaft | 2015 | 216
10 | 60677707 | Siemens Healthcare GmbH | 2017 | 200

Siemens Healthcare dominates citation impact with 25.1 citations per family — the highest citation density of any large German ML filer. 6 of the top 10 most-cited families belong to Siemens Healthcare/Healthineers, mostly in medical imaging AI. Autonomos GmbH (an autonomous driving startup) holds the #2 most-cited family (399 citations). BASF (13.4 avg) and Fraunhofer (15.4 avg) punch above their portfolio size, suggesting high-quality, foundational patents. In contrast, Bosch's large portfolio (921 cited families) shows a more applied, incremental pattern at 7.6 citations per family.

Beyond Core Classification: Hidden ML Patents

1,015 additional patent families use ML/neural networks but are classified only under their application domain — invisible to a standard G06N search

A standard patent search for machine learning uses IPC/CPC codes like G06N (AI computing models) and G06F18 (pattern recognition). But many patents that use ML as a method are classified purely under their application domain, with a CPC tag indicating "using neural networks" or "using machine learning." These patents never appear in a core G06N search.

By searching 53 application-domain CPC codes (e.g., G06T2207/20084 "Artificial neural networks for image analysis," G05B13/027 "Adaptive control using neural networks," F02D41/1405 "Engine control using neural networks"), we found 1,015 additional German patent families that our core search missed — a 13% increase in total coverage.
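The layer logic reduces to set arithmetic over DOCDB family IDs: the application-domain layer minus the core layer yields the hidden families, and the uplift is that difference relative to the core count. A minimal sketch with invented IDs (in the report, the two sets come from the IPC and CPC queries):

```python
# Illustrative sketch of the multi-layer search logic using set arithmetic
# over DOCDB family IDs. The IDs below are made up for demonstration.

def combine_layers(core: set, app_domain: set) -> dict:
    """Merge the core-IPC layer with the CPC application-domain layer."""
    hidden = app_domain - core          # families missed by the core search
    combined = core | app_domain        # total unique coverage
    return {
        "core": len(core),
        "hidden": len(hidden),
        "combined": len(combined),
        "uplift_pct": round(100 * len(hidden) / len(core), 1),
    }

# Toy example: 8 core families, 3 app-domain families of which 2 are new
core = {101, 102, 103, 104, 105, 106, 107, 108}
app_domain = {103, 201, 202}
stats = combine_layers(core, app_domain)
print(stats)  # hidden = 2, combined = 10, uplift = 25.0%
```

With the report's real numbers, the same arithmetic gives 1,015 hidden families on a core of 7,732, i.e. the 13% uplift cited above.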

Hidden Families by Year

Hidden Families by Industry

Who Files "Hidden" ML Patents?

# | Applicant | Hidden Families | Likely Domain
1 | Siemens Healthcare GmbH | 234 | Medical imaging (G06T/G06V with NN tags)
2 | Siemens Healthineers AG | 121 | Medical imaging
3 | Robert Bosch GmbH | 116 | Image processing, automotive
4 | Siemens AG | 39 | Industrial image processing
5 | Carl Zeiss Microscopy GmbH | 30 | Microscopy image analysis
6 | BMW AG | 22 | Image processing, driving
7 | Carl Zeiss Meditec AG | 21 | Ophthalmic imaging
8 | Continental Autonomous Mobility | 20 | Autonomous driving vision
9 | Volkswagen AG | 19 | Image processing
10 | Audi AG | 16 | Image processing

92% of hidden ML families are in image processing (933 of 1,015): patents that use neural networks for image analysis but are classified under G06T or G06V rather than G06N. Siemens Healthcare alone accounts for 355 hidden families (234 plus 121 from Healthineers), reflecting its deep medical imaging AI work. The hidden count is also growing faster than the core count, from 4 families in 2014 to 303 in 2023, lifting the hidden share of the combined total from about 4% to 19%. The "true" German ML patent landscape is therefore larger than what classification-only searches reveal.

Methodology

Data source, classification scope, and analytical approach

Multi-Layer Search Strategy

This report uses a three-layer search strategy to maximize coverage:

  • Layer 1: Core IPC — G06N (all AI/ML subgroups) + G06F18 (pattern recognition) = 7,732 families
  • Layer 2: CPC App-Domain — 53 CPC codes tagging ML/NN usage in other fields = 1,015 additional families
  • Layer 3: Title Keywords — Title search for "machine learning," "neural network," "deep learning," etc. = 1,683 families (high overlap with Layer 1)

Sections 1-8 use Layer 1 for consistency. Section "Beyond Core Classification" presents Layer 2 findings.

Data Source

EPO PATSTAT Global, Autumn 2025 edition, accessed via Google Cloud BigQuery (project: patstat-mtc, dataset: patstat). Analysis date: February 2026.

Applicant Filter

German applicants identified via person_ctry_code = 'DE' in tls206_person, joined via tls207_pers_appln with applt_seq_nr > 0. This captures German entities filing worldwide, not just applications filed at DPMA.
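The filter described above can be expressed as a single join across the PATSTAT person and person-application link tables. The sketch below builds such a query as a BigQuery SQL string; the table and column names (tls206_person, tls207_pers_appln, person_ctry_code, applt_seq_nr) are those cited in the text, tls201_appln is the standard PATSTAT application table, and the exact query used for the report may differ in detail:

```python
# Sketch of the applicant filter as a BigQuery SQL string. This is an
# illustration of the described logic, not the report's exact query.

DATASET = "patstat-mtc.patstat"  # project/dataset named in the Data Source section

QUERY = f"""
SELECT COUNT(DISTINCT a.docdb_family_id) AS families
FROM `{DATASET}.tls201_appln` AS a
JOIN `{DATASET}.tls207_pers_appln` AS pa
  ON pa.appln_id = a.appln_id
JOIN `{DATASET}.tls206_person` AS p
  ON p.person_id = pa.person_id
WHERE p.person_ctry_code = 'DE'   -- German entities...
  AND pa.applt_seq_nr > 0         -- ...acting as applicant, not merely inventor
"""

print(QUERY)
```

Because the filter acts on the applicant's country rather than the filing office, it captures German entities filing worldwide, exactly as the text describes.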

Family Counting

The headline KPI (7,732) counts each DOCDB family exactly once. In annual trend charts, a family is assigned to each year in which it has a filing, so year totals may sum to more than the unique count (1,495 families have filings in 2+ years).
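The two counting conventions can be illustrated with a toy example. Each filing is a (docdb_family_id, filing_year) pair; the data below is invented:

```python
# Toy illustration of the two counting conventions described above:
# a headline count of unique families vs. per-year trend counts.
from collections import defaultdict

filings = [
    (1, 2019), (1, 2020),   # family 1 has filings in two years
    (2, 2020),
    (3, 2020), (3, 2020),   # two filings, same family, same year
]

# Headline KPI style: each DOCDB family counted exactly once
unique_families = len({fam for fam, _ in filings})

# Trend-chart style: a family is assigned to every year it files in
families_per_year = defaultdict(set)
for fam, year in filings:
    families_per_year[year].add(fam)
per_year = {y: len(fams) for y, fams in sorted(families_per_year.items())}

print(unique_families)  # 3
print(per_year)         # {2019: 1, 2020: 3} -> year totals sum to 4 > 3
```

Family 1 appears in both 2019 and 2020, so the year totals (4) exceed the unique count (3), mirroring how the report's 1,495 multi-year families inflate annual sums past 7,732.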

Citation Counting

Citations are counted at the publication level via tls212_citation, then aggregated to the family level. Citation counts favor older families (more time to accumulate citations).
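The roll-up from publication level to family level is a simple grouped count. A minimal sketch with invented publication IDs and citation lists:

```python
# Toy illustration of publication-level citations aggregated to family level.
# Publication-to-family mapping and citation list are invented.
from collections import Counter

pub_to_family = {"P1": 10, "P2": 10, "P3": 20}   # P1, P2 belong to family 10
citations = ["P1", "P1", "P2", "P3"]              # each entry: one forward citation

# Map each cited publication to its family, then count per family
family_citations = Counter(pub_to_family[p] for p in citations)

print(dict(family_citations))  # {10: 3, 20: 1}
```

Summing over all publications of a family is why older families, with more publications and more time in circulation, accumulate higher counts.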

Stack

PATSTAT BigQuery + patstat-mcp (custom MCP server) + Claude AI for analysis and visualization. All SQL queries are included and reproducible.

Scope Limitations

  • Some German subsidiaries of foreign companies (e.g., IBM Deutschland, NEC Europe) are included because person_ctry_code = 'DE'. These represent genuine R&D conducted in Germany.
  • A single family may be counted in multiple ML sub-areas if it carries IPC codes from more than one main group.
  • WO/PCT is an international filing route, not a target market — families filed via PCT will also appear in national/regional phase entries.
  • 2023–2024 data is incomplete due to the ~18-month publication delay in PATSTAT Autumn 2025.
  • Patent data measures invention disclosure, not market impact or technological quality.
  • The "hidden" Layer 2 families are likely an undercount — only 53 of the most clearly labeled application-domain CPC codes were searched.
Glossary — Patent Terms Explained
DOCDB Family
A grouping of patent applications covering the same invention across different countries. One DOCDB family = one invention, regardless of how many national applications exist.
IPC (International Patent Classification)
A hierarchical classification system maintained by WIPO. The main ML-related codes are G06N (computing models based on specific computational models) and G06F18 (pattern recognition).
CPC (Cooperative Patent Classification)
A more granular classification jointly maintained by the EPO and USPTO. Used here for the "hidden ML patents" analysis via application-domain-specific neural network tags.
PATSTAT
The European Patent Office's worldwide patent statistical database, containing bibliographic and legal data on over 100 million patent documents.
PCT (Patent Cooperation Treaty)
An international filing route (WO applications) that allows applicants to seek protection in multiple countries through a single initial filing. Not a granting office.
Grant Rate
The percentage of applications that have been granted by a patent office. Rates for recent years are artificially low because many applications are still under examination.
Co-application
A patent application filed by multiple applicants (nb_applicants > 1), indicating collaborative R&D or joint ownership of the invention.
Citation
A reference from one patent publication to another. Forward citations (being cited by later patents) are used as a proxy for technological impact and influence.
Neural Network (NN)
A computing model inspired by biological neural networks, classified under IPC G06N3. Includes sub-types like CNNs, RNNs, GANs, and auto-encoders.
Hidden ML Patent
A patent that uses machine learning or neural networks as a method but is classified only under its application domain (e.g., image processing, engine control) rather than under the core ML classification G06N.

All SQL queries and the complete data basis are available for download.

Like what you see?

This report was built with a fully reproducible pipeline: EPO PATSTAT Global on BigQuery, a custom MCP server, and Claude AI for analysis and visualization. Everything is open and auditable — the SQL queries are included.

Custom Patent Report: Need a similar analysis for your technology area, company, or research question? We build tailored patent intelligence reports.
The Toolchain: Want to run your own AI-powered patent analyses? We help you set up PATSTAT on BigQuery with MCP server integration.