Machine Learning Patents from Germany
A comprehensive analysis of German innovation in neural networks, deep learning, knowledge-based AI, quantum computing, and pattern recognition — from early foundations to industrial scale.
Key Takeaways
Eight insights for decision-makers
Automotive dominates. 8 of the top 15 patent filers are automotive companies or suppliers — led by Bosch (1,835 families), followed by Siemens, SAP, and VW.
Paradigm shift to domestic-first. Until 2021, the USPTO was the #1 filing destination. In 2022 the DPMA overtook it, and by 2023 PCT filings had overtaken direct US filings as well.
Quantum Computing surging. From 2 families in 2014 to 176 in 2023 (88x), making it the fastest-growing ML sub-area by far — still small but rapidly emerging.
SAP breaks the mold. The only top filer where general ML and knowledge-based systems outweigh Neural Networks — reflecting its enterprise AI and business-process focus.
Healthcare is #2 application domain. G16H (Healthcare IT, 445 families) + A61B (Diagnosis, 417) = 862 families, driven by Siemens Healthcare and Siemens Healthineers.
Strong China connection. Huawei (128 joint families) and Tsinghua University (45) are among the top international R&D partners for German ML innovators.
Siemens Healthcare leads citation impact. 25.1 citations per family — the highest density of any large filer. 5 of the 10 most-cited German ML families are from Siemens Healthcare/Healthineers.
1,015 "hidden" ML patents. A multi-layer search found 13% more families — patents using ML that are classified under their application domain (image processing, control systems, automotive) rather than under G06N.
Filing Activity
Annual patent families filed by German applicants in ML/AI classifications, 2014–2024. A family may appear in multiple years if it has filings in different years.
Inflection point 2017: Filing activity more than doubled from 155 families (2016) to 355 (2017), coinciding with the global deep learning wave triggered by breakthroughs in image recognition and natural language processing. Growth continued to a peak of 1,595 families in 2021. The 2022–2023 plateau at ~1,400–1,600 families likely reflects maturation rather than decline.
* 2024 data is incomplete due to the 18-month publication lag.
Data table: Filing Activity 2014–2024
| Year | Families | Granted | Co-apps |
|---|---|---|---|
| 2014 | 98 | 72 | 50 |
| 2015 | 113 | 87 | 23 |
| 2016 | 155 | 108 | 13 |
| 2017 | 355 | 250 | 55 |
| 2018 | 654 | 435 | 94 |
| 2019 | 1,165 | 707 | 152 |
| 2020 | 1,347 | 637 | 239 |
| 2021 | 1,595 | 518 | 321 |
| 2022 | 1,388 | 293 | 261 |
| 2023 | 1,603 | 166 | 256 |
| 2024* | 800 | 43 | 103 |
Technology Landscape
Distribution across ML/AI sub-areas (IPC G06N main groups + G06F18)
| Sub-Area | IPC | Families |
|---|---|---|
| Neural Networks / Deep Learning | G06N3 | 4,427 |
| Machine Learning (general) | G06N20 | 2,931 |
| Knowledge-based / Expert Systems | G06N5 | 1,049 |
| Pattern Recognition | G06F18 | 672 |
| Mathematical / Probabilistic | G06N7 | 482 |
| Quantum Computing | G06N10 | 461 |
| Other AI | G06N99 | 148 |
Note: Families may appear in multiple sub-areas if classified under more than one IPC main group. Sub-area totals therefore exceed the 7,732 unique families.
Neural Networks dominate with over half of all families (4,427). The Quantum Computing segment (461 families) is still small but grew from 2 families in 2014 to 176 in 2023 — an 88x increase, making it the fastest-growing sub-area by far.
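The 88x figure can be checked directly against the G06N10 counts restated from the Technology Evolution data table; a minimal sketch:

```python
# G06N10 (Quantum Computing) family counts by filing year,
# restated from the Technology Evolution data table.
quantum = {2014: 2, 2015: 0, 2016: 1, 2017: 1, 2018: 6,
           2019: 25, 2020: 51, 2021: 95, 2022: 91, 2023: 176}

# Growth factor between the first and last complete years.
growth_factor = quantum[2023] / quantum[2014]
print(f"2014 -> 2023: {growth_factor:.0f}x")  # 2014 -> 2023: 88x
```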
Data table: ML Sub-Areas
| Sub-Area | IPC | Families | Share |
|---|---|---|---|
| Neural Networks / Deep Learning | G06N3 | 4,427 | 57.3% |
| Machine Learning (general) | G06N20 | 2,931 | 37.9% |
| Knowledge-based / Expert Systems | G06N5 | 1,049 | 13.6% |
| Pattern Recognition | G06F18 | 672 | 8.7% |
| Mathematical / Probabilistic | G06N7 | 482 | 6.2% |
| Quantum Computing | G06N10 | 461 | 6.0% |
| Other AI | G06N99 | 148 | 1.9% |
Technology Evolution
How the ML technology mix has shifted over time
Paradigm shift: In 2014, Knowledge-based systems (42 families) led ahead of Neural Networks (24). By 2017, Neural Networks had surged past all other areas, and by 2023 they accounted for 842 families — 35x growth in under a decade. Pattern Recognition (G06F18) appears as a distinct category only from 2019, reflecting the reclassification of pattern recognition subject matter out of G06K into G06F18. Quantum Computing became a measurable force from 2020 onward.
Data table: Technology Evolution 2014–2024
| Year | Neural Nets | ML | Knowledge | Pattern | Quantum | Math |
|---|---|---|---|---|---|---|
| 2014 | 24 | 28 | 42 | 0 | 2 | 24 |
| 2015 | 28 | 43 | 41 | 4 | 0 | 23 |
| 2016 | 48 | 60 | 37 | 7 | 1 | 32 |
| 2017 | 183 | 137 | 60 | 20 | 1 | 44 |
| 2018 | 389 | 235 | 108 | 26 | 6 | 65 |
| 2019 | 760 | 420 | 157 | 101 | 25 | 83 |
| 2020 | 870 | 494 | 180 | 91 | 51 | 81 |
| 2021 | 933 | 575 | 193 | 133 | 95 | 74 |
| 2022 | 812 | 501 | 149 | 76 | 91 | 45 |
| 2023 | 842 | 557 | 147 | 166 | 176 | 33 |
| 2024* | 427 | 263 | 80 | 85 | 95 | 14 |
Neural Network Architectures & Learning Methods
Breakdown of the 4,427 neural network families by IPC subgroup (G06N3)
Architecture Types
Learning Paradigms
Among the specific architecture types, Auto-encoders (141) and CNNs (138) lead, followed by Generative Networks / GANs (94) and Recurrent Networks / RNNs (73). Among learning methods, Supervised learning (149) dominates, but Reinforcement Learning (89), Transfer Learning (66), and Federated Learning (57) show strong emerging interest. The Federated Learning segment is particularly noteworthy for privacy-preserving AI applications in automotive and healthcare.
Data table: Neural Network Architectures
| Architecture | Families |
|---|---|
| Architecture / topology | 1,392 |
| Combinations of nets | 391 |
| Hardware implementation | 176 |
| Auto-encoder / Enc-Dec | 141 |
| CNN (Convolutional) | 138 |
| GAN (Generative) | 94 |
| RNN (Recurrent) | 73 |
| Probabilistic / stochastic | 69 |
| Knowledge-based NN | 59 |
| Federated / distributed | 57 |
Data table: Learning Paradigms
| Method | Families |
|---|---|
| Learning methods (gen.) | 2,377 |
| Backpropagation | 178 |
| Supervised learning | 149 |
| Unsupervised learning | 97 |
| Reinforcement learning | 89 |
| Transfer learning | 66 |
| Adversarial learning | 61 |
| Architecture mod. (NAS) | 59 |
| Semi-/self-supervised | 40 |
| Meta-learning | 40 |
Top Patent Leaders
The 15 most active German ML patent filers by family count
Data table: Top 15 Patent Leaders
| # | Applicant | Families | Primary Focus | Sector |
|---|---|---|---|---|
| 1 | Robert Bosch GmbH | 1,835 | Neural Networks (72%) | Automotive / IoT |
| 2 | Siemens AG | 791 | Neural Networks + Knowledge-based | Industrial / Digital |
| 3 | SAP SE | 688 | ML general + Knowledge-based | Enterprise Software |
| 4 | Siemens Healthcare GmbH | 363 | Neural Networks (68%) | Medical Imaging |
| 5 | Volkswagen AG | 246 | Neural Networks (66%) | Automotive |
| 6 | BMW AG | 245 | Neural Networks (59%) | Automotive |
| 7 | IBM Deutschland GmbH | 217 | Other / cross-cutting | Technology / R&D |
| 8 | NEC Europe Ltd | 169 | Neural Networks (58%) | IT / Telecom |
| 9 | ZF Friedrichshafen AG | 163 | Neural Networks (83%) | Automotive Supplier |
| 10 | Siemens Healthineers AG | 139 | Neural Networks (71%) | Medical Devices |
| 11 | Fraunhofer-Gesellschaft | 128 | Neural Networks (67%) | Research |
| 12 | Porsche AG | 106 | Neural Networks (62%) | Automotive |
| 13 | Mercedes-Benz Group AG | 99 | ML general (60%) | Automotive |
| 14 | Continental Automotive | 90 | Neural Networks (75%) | Automotive Supplier |
| 15 | Audi AG | 81 | Neural Networks (66%) | Automotive |
Robert Bosch dominates with 1,835 families — more than the next two combined (Siemens + SAP = 1,479). The automotive sector accounts for 8 of the top 15 filers. SAP is the outlier: it is the only top filer where general ML and Knowledge-based systems outweigh Neural Networks, reflecting its enterprise and business-process focus. ZF Friedrichshafen has the highest Neural Network concentration (83%) of any top filer.
Application Domains
Where is German ML being applied? Top co-occurring IPC subclasses on ML families
Computer Vision (G06V, 1,365 families) and Image Processing (G06T, 1,103) together form the leading application area — combined, they appear on more ML families than any other domain pair. Vehicle Control (B60W, 746) and Traffic Systems (G08G, 355) confirm the automotive dominance. Healthcare (G16H + A61B = 862) is the second-largest application domain, driven by Siemens Healthcare and Siemens Healthineers. Robotics (B25J, 133) and Speech Recognition (G10L, 164) round out the industrial landscape.
Data table: Application Domains
| IPC | Domain | Families |
|---|---|---|
| G06F | Digital Data Processing | 2,420 |
| G06V | Computer Vision | 1,365 |
| G06T | Image Processing | 1,103 |
| G06K | Data Reading | 1,051 |
| G05B | Control Systems | 839 |
| B60W | Vehicle Control | 746 |
| G06Q | Business / Admin | 634 |
| G16H | Healthcare IT | 445 |
| A61B | Diagnosis / Surgery | 417 |
| H04L | Digital Transmission | 362 |
| G08G | Traffic Control | 355 |
| B60R | Vehicle Fittings | 296 |
| G01S | Navigation / Radar | 285 |
| G01N | Materials Analysis | 243 |
| H04W | Wireless Comms | 217 |
| G10L | Speech Recognition | 164 |
| B25J | Robotics | 133 |
Geographic Filing Strategy
Where German ML innovators seek patent protection (application-level counts by filing office)
Total Filings by Office
Jurisdiction Trends Over Time
Note: WO/PCT is not a target market but an international filing route that typically leads to national/regional phase entries (EP, US, CN, etc.). A family may therefore be counted at multiple offices.
Dramatic geographic shift: In 2014, the US Patent Office was the #1 filing destination (67 apps vs. 17 DE). By 2022, the German Patent Office (DPMA) overtook the USPTO (456 vs. 404), and by 2023, domestic filings led all routes (600 DE vs. 404 WO vs. 313 US). This reflects both a strategic shift toward PCT filings for broader international coverage and a growing domestic-first approach. South Korea (KR) is a consistent secondary market, particularly for automotive-related ML.
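The crossover year can be derived mechanically from the filing-trend data, restated here from the table below (2024 omitted as incomplete):

```python
# Annual application counts at the USPTO (us) and DPMA (de),
# restated from the Geographic Filing Trends table; 2024 is
# excluded because the data is incomplete.
us = {2014: 67, 2015: 65, 2016: 94, 2017: 203, 2018: 380,
      2019: 577, 2020: 627, 2021: 714, 2022: 404, 2023: 313}
de = {2014: 17, 2015: 27, 2016: 31, 2017: 85, 2018: 159,
      2019: 327, 2020: 388, 2021: 389, 2022: 456, 2023: 600}

# First year in which domestic DPMA filings overtake direct US filings.
crossover = min(year for year in us if de[year] > us[year])
print(crossover)  # 2022
```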
Data table: Geographic Filing Trends
| Year | US | DE | EP | WO | KR |
|---|---|---|---|---|---|
| 2014 | 67 | 17 | 20 | 19 | 5 |
| 2015 | 65 | 27 | 20 | 13 | 6 |
| 2016 | 94 | 31 | 32 | 11 | 8 |
| 2017 | 203 | 85 | 62 | 43 | 11 |
| 2018 | 380 | 159 | 113 | 65 | 24 |
| 2019 | 577 | 327 | 261 | 125 | 35 |
| 2020 | 627 | 388 | 330 | 251 | 44 |
| 2021 | 714 | 389 | 357 | 345 | 72 |
| 2022 | 404 | 456 | 353 | 309 | 69 |
| 2023 | 313 | 600 | 372 | 404 | 60 |
| 2024* | 167 | 184 | 103 | 342 | 35 |
International R&D Network
Non-German co-applicants on co-filed ML families (nb_applicants > 1). Includes both external partners and international subsidiaries of German companies.
Top International Co-Applicants
External Partners vs. Subsidiaries
| Partner | Country | Families | Type |
|---|---|---|---|
| IBM Corp | US | 220 | External |
| Huawei Technologies | CN | 128 | External |
| Siemens Corp | US | 59 | Subsidiary |
| NEC Corp | JP | 46 | External |
| Siemens Ltd China | CN | 45 | Subsidiary |
| Tsinghua University | CN | 45 | Academic |
| Bosch India (R&D) | IN | 26 | Subsidiary |
| IBM China | CN | 25 | External |
| Carnegie Mellon Univ | US | 24 | Academic |
| Nanyang Tech Univ | SG | 21 | Academic |
| F. Hoffmann-La Roche | CH | 14 | External |
| Intel Corp | US | 11 | External |
IBM is the #1 external R&D partner for German ML applicants with 220 joint families. Huawei (128 families) is the second-largest — a strong China connection reflecting joint R&D operations. International subsidiaries (Siemens US/China, Bosch India) represent global R&D coordination within German multinationals. Academic collaborations are notable: Tsinghua University (45), Carnegie Mellon (24), and Nanyang Tech (21) anchor research partnerships in Asia and North America.
Who Works With Whom
Co-applicant pairs on German ML patent families — grouped by corporate parent to consolidate name variants
University – Industry Pairs
| Industry Partner | University / Research | Families |
|---|---|---|
| Robert Bosch | Tsinghua University (CN) | 45 |
| Robert Bosch | Uni Freiburg (DE) | 37 |
| Robert Bosch | Carnegie Mellon (US) | 24 |
| Continental | Nanyang Tech (SG) | 19 |
| Eleqtron | Uni Siegen (DE) | 11 |
| FZ Juelich | RWTH Aachen (DE) | 10 |
| Robert Bosch | Leibniz Uni Hannover (DE) | 6 |
| Carl Zeiss Meditec | TU Muenchen (DE) | 5 |
| Toyota | Max Planck Informatik (DE) | 5 |
| Fraunhofer | Uni des Saarlandes (DE) | 5 |
Industry – Industry Pairs
| Company A | Company B | Families |
|---|---|---|
| Audi | Volkswagen | 15 |
| Porsche | Volkswagen | 15 |
| Audi | Porsche | 14 |
| Siemens | Siemens Healthcare | 12 |
| CARIAD | Robert Bosch | 8 |
| CARIAD | Volkswagen | 7 |
| BASF | Robert Bosch | 5 |
Bosch is the collaboration king — partnering with Tsinghua (45 families), Uni Freiburg (37), and Carnegie Mellon (24), spanning three continents. The VW Group shows intense internal knowledge-sharing between Audi, Porsche, VW, and their software unit CARIAD. A notable emerging pair: Eleqtron x Uni Siegen (11 families) represents the quantum computing startup ecosystem. The Bosch-Tsinghua axis alone (45 families) rivals the VW Group's entire internal collaboration (51 co-filed families across its brand and CARIAD pairs).
Grant Rates by Filing Office
Percentage of applications that have been granted, by patent office. Note: Grant rates for recent filings (2022+) are artificially low due to examination pendency.
| Office | Applications | Granted | Grant Rate |
|---|---|---|---|
| US (USPTO) | 3,611 | 2,027 | 56.1% |
| DE (DPMA) | 2,663 | 586 | 22.0% |
| EP (EPO) | 2,023 | 437 | 21.6% |
| WO (PCT) | 1,927 | 61 | 3.2%* |
| KR (KIPO) | 369 | 92 | 24.9% |
| AU (IP Australia) | 52 | 17 | 32.7% |
| FR (INPI) | 49 | 30 | 61.2% |
* PCT/WO is a filing route, not a granting office. The 3.2% reflects national phase entries that were granted while still tracked under the WO application.
The USPTO leads with a 56.1% grant rate — more than double the European (21.6%) and German (22.0%) rates. Much of this gap reflects examination timing: the USPTO's historically faster processing of software-related patents means more US filings have already completed prosecution. The DPMA and EPO rates are nearly identical (22.0% vs. 21.6%), suggesting consistent European examination standards. Many recent filings (2022–2024) are still under examination, so all of these rates will rise over time.
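The grant rates in the table are simple granted-to-applications ratios; a sketch that reproduces them from the figures above:

```python
# (applications, granted) per filing office, from the table above.
offices = {
    "US": (3611, 2027), "DE": (2663, 586), "EP": (2023, 437),
    "WO": (1927, 61),   "KR": (369, 92),   "AU": (52, 17),
    "FR": (49, 30),
}

# Grant rate = granted / applications, as a percentage.
rates = {office: round(100 * granted / apps, 1)
         for office, (apps, granted) in offices.items()}
print(rates["US"], rates["DE"], rates["WO"])  # 56.1 22.0 3.2
```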
Citation Impact
Which German ML patent families and applicants generate the most downstream citations?
Most Cited Applicants
Citation Impact vs. Portfolio Size
| Applicant | Cited Families | Total Citations | Avg/Family |
|---|---|---|---|
| Siemens Healthcare | 332 | 8,338 | 25.1 |
| Robert Bosch | 921 | 6,970 | 7.6 |
| SAP SE | 553 | 6,692 | 12.1 |
| Siemens AG | 514 | 5,634 | 11.0 |
| Siemens Healthineers | 94 | 1,955 | 20.8 |
| NEC Europe | 121 | 1,691 | 14.0 |
| Volkswagen | 147 | 1,179 | 8.0 |
| IBM Deutschland | 159 | 1,044 | 6.6 |
| Fraunhofer | 66 | 1,019 | 15.4 |
| BASF | 38 | 511 | 13.4 |
Most Cited Individual Families
| # | DOCDB Family | Applicant | Year | Citations |
|---|---|---|---|---|
| 1 | 55969747 | Siemens AG / Siemens Healthcare | 2015 | 484 |
| 2 | 60888377 | Autonomos GmbH | 2017 | 399 |
| 3 | 59581846 | Siemens Healthcare / Healthineers | 2017 | 339 |
| 4 | 60327326 | Robert Bosch GmbH | 2016 | 311 |
| 5 | 67616469 | Rudolf Schreiner | 2019 | 269 |
| 6 | 55910782 | Siemens Healthcare GmbH | 2015 | 245 |
| 7 | 58558973 | Siemens Healthcare GmbH | 2017 | 236 |
| 8 | 48626420 | European Molecular Biology Lab | 2014 | 232 |
| 9 | 52727129 | Fraunhofer-Gesellschaft | 2015 | 216 |
| 10 | 60677707 | Siemens Healthcare GmbH | 2017 | 200 |
Siemens Healthcare dominates citation impact with 25.1 citations per family — the highest citation density of any large German ML filer. 5 of the top 10 most-cited families belong to Siemens Healthcare/Healthineers, mostly in medical imaging AI. Autonomos GmbH (an autonomous driving startup) holds the #2 most-cited family (399 citations). BASF (13.4 avg) and Fraunhofer (15.4 avg) punch above their portfolio size, suggesting high-quality, foundational patents. In contrast, Bosch's large portfolio (921 cited families) shows a more applied, incremental pattern at 7.6 citations per family.
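The Avg/Family column is total citations divided by cited families; a quick check for three filers from the table above:

```python
# (cited families, total citations) per applicant, from the table above.
impact = {
    "Siemens Healthcare": (332, 8338),
    "SAP SE": (553, 6692),
    "Robert Bosch": (921, 6970),
}

# Average citations per cited family, rounded as in the table.
avg_citations = {name: round(total / cited, 1)
                 for name, (cited, total) in impact.items()}
print(avg_citations)  # {'Siemens Healthcare': 25.1, 'SAP SE': 12.1, 'Robert Bosch': 7.6}
```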
Beyond Core Classification: Hidden ML Patents
1,015 additional patent families use ML/neural networks but are classified only under their application domain — invisible to a standard G06N search
A standard patent search for machine learning uses IPC/CPC codes like G06N (AI computing models) and G06F18 (pattern recognition). But many patents that use ML as a method are classified purely under their application domain, with a CPC tag indicating "using neural networks" or "using machine learning." These patents never appear in a core G06N search.
By searching 53 application-domain CPC codes (e.g., G06T2207/20084 "Artificial neural networks for image analysis," G05B13/027 "Adaptive control using neural networks," F02D41/1405 "Engine control using neural networks"), we found 1,015 additional German patent families that our core search missed — a 13% increase in total coverage.
Hidden Families by Year
Hidden Families by Industry
Who Files "Hidden" ML Patents?
| # | Applicant | Hidden Families | Likely Domain |
|---|---|---|---|
| 1 | Siemens Healthcare GmbH | 234 | Medical imaging (G06T/G06V with NN tags) |
| 2 | Siemens Healthineers AG | 121 | Medical imaging |
| 3 | Robert Bosch GmbH | 116 | Image processing, automotive |
| 4 | Siemens AG | 39 | Industrial image processing |
| 5 | Carl Zeiss Microscopy GmbH | 30 | Microscopy image analysis |
| 6 | BMW AG | 22 | Image processing, driving |
| 7 | Carl Zeiss Meditec AG | 21 | Ophthalmic imaging |
| 8 | Continental Autonomous Mobility | 20 | Autonomous driving vision |
| 9 | Volkswagen AG | 19 | Image processing |
| 10 | Audi AG | 16 | Image processing |
92% of hidden ML families are in image processing (933 of 1,015) — patents that use neural networks for image analysis but are classified under G06T or G06V rather than G06N. Siemens Healthcare alone accounts for 355 hidden families (234 + 121 Healthineers), reflecting their deep medical imaging AI work. The hidden patent count is growing faster than the core count: from 4 families in 2014 to 303 in 2023, raising the hidden share from roughly 4% to roughly 16% of the combined total. This means the "true" German ML patent landscape is larger than what classification-only searches reveal.
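The coverage uplift and the 2023 hidden share follow directly from the counts in this section:

```python
# Layer 1 (core) and Layer 2 (hidden) family counts from this section.
core_total, hidden_total = 7_732, 1_015
core_2023, hidden_2023 = 1_603, 303

# Extra coverage relative to the core search, as a percentage.
coverage_uplift = round(100 * hidden_total / core_total, 1)                   # 13.1
# Hidden families as a share of the combined 2023 total.
hidden_share_2023 = round(100 * hidden_2023 / (core_2023 + hidden_2023), 1)  # 15.9
print(coverage_uplift, hidden_share_2023)  # 13.1 15.9
```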
Methodology
Data source, classification scope, and analytical approach
Multi-Layer Search Strategy
This report uses a three-layer search strategy to maximize coverage:
- Layer 1: Core IPC — G06N (all AI/ML subgroups) + G06F18 (pattern recognition) = 7,732 families
- Layer 2: CPC App-Domain — 53 CPC codes tagging ML/NN usage in other fields = 1,015 additional families
- Layer 3: Title Keywords — Title search for "machine learning," "neural network," "deep learning," etc. = 1,683 families (high overlap with Layer 1)
Sections 1-8 use Layer 1 for consistency. Section "Beyond Core Classification" presents Layer 2 findings.
Data Source
EPO PATSTAT Global, Autumn 2025 edition, accessed via Google Cloud BigQuery (project: patstat-mtc, dataset: patstat). Analysis date: February 2026.
Applicant Filter
German applicants identified via person_ctry_code = 'DE' in tls206_person, joined via tls207_pers_appln with applt_seq_nr > 0. This captures German entities filing worldwide, not just applications filed at DPMA.
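A minimal sketch of how the applicant filter combines with the Layer 1 classification scope in one query, using the standard PATSTAT table names mentioned above. The IPC-symbol matching (PATSTAT pads symbols with internal spaces, hence the space-stripping for G06F18) is an assumption and may need adjusting against the actual schema:

```python
# Sketch of the Layer 1 family count: German applicants x core IPC codes.
# Table/column names follow the standard PATSTAT schema; the REPLACE
# trick for IPC symbols is an assumption about the stored symbol format.
layer1_query = """
SELECT COUNT(DISTINCT a.docdb_family_id) AS families
FROM tls201_appln a
JOIN tls209_appln_ipc i   ON i.appln_id = a.appln_id
JOIN tls207_pers_appln pa ON pa.appln_id = a.appln_id AND pa.applt_seq_nr > 0
JOIN tls206_person p      ON p.person_id = pa.person_id
WHERE p.person_ctry_code = 'DE'
  AND (i.ipc_class_symbol LIKE 'G06N%'
       OR REPLACE(i.ipc_class_symbol, ' ', '') LIKE 'G06F18%')
"""
print(layer1_query)
```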
Family Counting
The headline KPI (7,732) counts each DOCDB family exactly once. In annual trend charts, a family is assigned to each year in which it has a filing, so year totals may sum to more than the unique count (1,495 families have filings in 2+ years).
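The two counting modes can be illustrated with a toy sketch (family IDs and years are made up):

```python
from collections import defaultdict

# Illustrative (docdb_family_id, filing_year) pairs; family 101 has
# filings in two years, as 1,495 real families do.
filings = [(101, 2019), (101, 2020), (102, 2020), (103, 2021)]

# Headline-KPI style: each family counted exactly once.
unique_families = len({family for family, _ in filings})  # 3

# Trend-chart style: a family contributes to every year it files in.
by_year = defaultdict(set)
for family, year in filings:
    by_year[year].add(family)
yearly_sum = sum(len(families) for families in by_year.values())  # 4 > 3
```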
Citation Counting
Citations are counted at the publication level via tls212_citation, then aggregated to the family level. Citation counts favor older families (more time to accumulate citations).
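The publication-to-family aggregation can be sketched as follows (publication and family IDs are made up):

```python
from collections import Counter

# Illustrative forward-citation counts per publication, plus the
# publication-to-family mapping; two publications belong to family 1001.
citations_per_pub = {"pub_a": 5, "pub_b": 3, "pub_c": 7}
pub_to_family = {"pub_a": 1001, "pub_b": 1001, "pub_c": 1002}

# Roll publication-level counts up to the DOCDB family.
family_citations = Counter()
for pub, n_citations in citations_per_pub.items():
    family_citations[pub_to_family[pub]] += n_citations

print(dict(family_citations))  # {1001: 8, 1002: 7}
```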
Stack
PATSTAT BigQuery + patstat-mcp (custom MCP server) + Claude AI for analysis and visualization. All SQL queries are included and reproducible.
Scope Limitations
- Some German subsidiaries of foreign companies (e.g., IBM Deutschland, NEC Europe) are included because person_ctry_code = 'DE'. These represent genuine R&D conducted in Germany.
- A single family may be counted in multiple ML sub-areas if it carries IPC codes from more than one main group.
- WO/PCT is an international filing route, not a target market — families filed via PCT will also appear in national/regional phase entries.
- 2023–2024 data is incomplete due to the ~18-month publication delay in PATSTAT Autumn 2025.
- Patent data measures invention disclosure, not market impact or technological quality.
- The "hidden" Layer 2 families are likely an undercount — only 53 of the most clearly labeled application-domain CPC codes were searched.
Glossary — Patent Terms Explained
- DOCDB Family
- A grouping of patent applications covering the same invention across different countries. One DOCDB family = one invention, regardless of how many national applications exist.
- IPC (International Patent Classification)
- A hierarchical classification system maintained by WIPO. The main ML-related codes are G06N (computing arrangements based on specific computational models) and G06F18 (pattern recognition).
- CPC (Cooperative Patent Classification)
- A more granular classification jointly maintained by the EPO and USPTO. Used here for the "hidden ML patents" analysis via application-domain-specific neural network tags.
- PATSTAT
- The European Patent Office's worldwide patent statistical database, containing bibliographic and legal data on over 100 million patent documents.
- PCT (Patent Cooperation Treaty)
- An international filing route (WO applications) that allows applicants to seek protection in multiple countries through a single initial filing. Not a granting office.
- Grant Rate
- The percentage of applications that have been granted by a patent office. Rates for recent years are artificially low because many applications are still under examination.
- Co-application
- A patent application filed by multiple applicants (nb_applicants > 1), indicating collaborative R&D or joint ownership of the invention.
- Citation
- A reference from one patent publication to another. Forward citations (being cited by later patents) are used as a proxy for technological impact and influence.
- Neural Network (NN)
- A computing model inspired by biological neural networks, classified under IPC G06N3. Includes sub-types like CNNs, RNNs, GANs, and auto-encoders.
- Hidden ML Patent
- A patent that uses machine learning or neural networks as a method but is classified only under its application domain (e.g., image processing, engine control) rather than under the core ML classification G06N.
All SQL queries and the complete data basis are available for download.
Like what you see?
This report was built with a fully reproducible pipeline: EPO PATSTAT Global on BigQuery, a custom MCP server, and Claude AI for analysis and visualization. Everything is open and auditable — the SQL queries are included.