Hungary Implements New Regulations for Artificial Intelligence, Focusing on High-Risk Systems
Hungary has recently enacted legislation to regulate Artificial Intelligence (AI) systems, aligning with the forthcoming EU AI Act. The new law establishes a framework for the development, implementation, and oversight of AI, with a particular emphasis on high-risk applications.
The legislation empowers the government and the relevant minister to develop detailed procedural rules and establish a test environment for AI systems. It also outlines potential fines for non-compliance, reaching up to EUR 35 million or 7% of global annual turnover, with the maximum fine of HUF 13.3 billion reserved for the most severe violations, namely prohibited AI practices. However, supervision of the financial sector will continue to fall under the purview of the Hungarian National Bank (MNB), in accordance with EU standards.
A key component of the new regulations focuses on “high-risk” AI systems, which will require a prior conformity assessment before being placed on the market or used within the EU. This assessment includes mandatory documentation covering aspects such as the system’s purpose, functions, risk and data management, human supervision, and cybersecurity, and may also require a quality management system review.
Examples of AI systems categorized as high-risk include:
Biometric identification and categorization (e.g., facial recognition, fingerprinting)
Systems used in critical infrastructure (energy, transport)
AI applications in education and vocational training
Tools for employment, HR, and workforce management (e.g., employee selection, performance assessment)
Systems impacting access to essential public services (e.g., social subsidies, lending)
AI used in law enforcement and forensic applications
Applications related to migration, border management, and asylum
Systems supporting the judiciary and democratic processes (e.g., court decision support)
The Ministry of National Economy has also issued a draft decree concerning the designation and operation of the organizations responsible for conformity assessment of these high-risk AI systems. The EU AI Act defines different “modules” that determine the type of conformity assessment procedure, ranging from documentation review to on-site audits and full quality management system investigations. The draft decree mandates that designated conformity assessment organizations maintain adequate liability insurance, with a minimum coverage level confirmed during accreditation. This insurance will cover potential liabilities arising from incorrect certification or failures by the assessment institution.
Administrative service fees for accreditation are set at HUF 250,000 per accredited area, with an additional HUF 20,000 per product group and module. Changes to registered data will incur a fee of HUF 20,000. These fees will be allocated solely to cover the personnel and material costs associated with the designation procedures.
The implementation of these regulations comes amidst growing debate surrounding the use of AI, notably in areas like law enforcement. Recent discussions, as highlighted by HVG.hu, focus on the potential contradictions and limitations of AI applications, even as loopholes exist for national security purposes.
The appointment of Biró Marcell as national security advisor is also noted in relation to these developments. The legislation also references the Hungarian Academy of Sciences, the Hungarian Academy of Arts, the HUN-REN Hungarian Research Network, and the Hungarian Rectors’ Conference.