Hungary Implements New Regulations for Artificial Intelligence, Focusing on High-Risk Systems
Hungary has recently enacted legislation to regulate Artificial Intelligence (AI) systems, aligning with the forthcoming EU AI Act. The new law establishes a framework for the development, deployment, and oversight of AI, with a particular emphasis on high-risk applications.
The legislation empowers the government and the relevant minister to develop detailed procedural rules and to establish a testing environment for AI systems. It also outlines potential fines for non-compliance of up to EUR 35 million or 7% of global annual turnover, with a maximum fine of HUF 13.3 billion reserved for the most severe violations – specifically, prohibited AI practices. Supervision of the financial sector, however, will remain with the Hungarian National Bank (MNB), in accordance with EU standards.
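As an illustration only, the fine ceilings above can be expressed as a short calculation. The "higher of EUR 35 million or 7% of turnover" rule follows the EU AI Act's general pattern; the HUF 13.3 billion figure is the domestic cap quoted above, and the turnover and exchange-rate values used below are hypothetical.

```python
# Illustrative sketch of the fine ceilings described in the article.
# EUR 35 million / 7% of turnover follows the EU AI Act's general pattern;
# HUF 13.3 billion is the domestic cap quoted in the article.

EUR_CAP = 35_000_000          # EUR 35 million absolute ceiling
HUF_CAP = 13_300_000_000      # HUF 13.3 billion cap for the gravest violations

def max_fine_eur(global_turnover_eur: int) -> int:
    """Higher of the fixed EUR cap and 7% of global annual turnover."""
    return max(EUR_CAP, global_turnover_eur * 7 // 100)

def max_fine_huf(global_turnover_eur: int, eur_huf_rate: int) -> int:
    """EU-level ceiling converted to HUF, limited by the domestic cap."""
    return min(max_fine_eur(global_turnover_eur) * eur_huf_rate, HUF_CAP)
```

For a firm with EUR 1 billion in annual turnover, the turnover-based cap (EUR 70 million) exceeds the fixed EUR 35 million; at an assumed 400 HUF/EUR that would be HUF 28 billion, so the domestic HUF 13.3 billion ceiling binds.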
A key component of the new regulations focuses on “high-risk” AI systems, which will require a conformity assessment before being placed on the market or used within the EU. This assessment includes mandatory documentation covering aspects such as the system’s purpose, functions, risk and data management, human oversight, and cybersecurity, and may also require a quality management system review.
Examples of AI systems categorized as high-risk include:
Biometric identification and categorization (e.g., facial recognition, fingerprinting)
Systems used in critical infrastructure (energy, transport)
AI applications in education and vocational training
Tools for employment, HR, and workflow management (e.g., employee selection, performance assessment)
Systems impacting access to essential public services (e.g., social subsidies, lending)
AI used in law enforcement and forensic applications
Applications related to migration, border management, and asylum
Systems supporting the judiciary and democratic processes (e.g., court decision support)
The Ministry of National Economy has also issued a draft decree concerning the designation and operation of the organizations responsible for the conformity assessment of these high-risk AI systems. The EU AI Act defines different “modules” determining the type of conformity procedure, ranging from documentation review to on-site audits and full quality management system investigations. The draft decree mandates that designated conformity assessment organizations maintain adequate liability insurance, with a minimum coverage level confirmed during accreditation. This insurance covers potential liabilities arising from incorrect certification or failures by the assessment institution.
Administrative service fees for accreditation are set at HUF 250,000 per accredited area, with an additional HUF 20,000 per product group and module. Changes to registered data will incur a fee of HUF 20,000. These fees will be allocated solely to cover the personnel and material costs associated with the designation procedures.
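As an illustration only, the fee schedule above can be expressed as a small calculation. The assumption that each product-group/module combination adds one HUF 20,000 item is ours for the sketch; the decree's exact counting rule may differ.

```python
# Illustrative sketch of the accreditation fee schedule quoted in the article.
# Assumption: each product-group/module combination adds one HUF 20,000 item.

AREA_FEE = 250_000    # HUF per accredited area
ITEM_FEE = 20_000     # HUF per product group and module
CHANGE_FEE = 20_000   # HUF per change to registered data

def accreditation_fee(areas: int, product_group_module_items: int) -> int:
    """Total administrative service fee in HUF for an application."""
    return areas * AREA_FEE + product_group_module_items * ITEM_FEE
```

Under this reading, an application covering two areas with five product-group/module items would cost HUF 600,000.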
The implementation of these regulations comes amid growing debate over the use of AI, notably in areas such as law enforcement. Recent discussions, as highlighted by HVG.hu, focus on the potential contradictions and limitations of AI applications, even as exemptions exist for national security purposes.
The appointment of Marcell Biró as national security advisor is also noted in relation to these developments. The legislation also references the Hungarian Academy of Sciences, the Hungarian Academy of Arts, the HUN-REN Hungarian Research Network, and the Hungarian Rectors' Conference.