Neurotechnologies in the AI Act: Moving away from the Neurorights Debate
DOI: https://doi.org/10.17561/tahrj.v26.9853
Keywords: Neurotechnology, Artificial Intelligence, AI systems, Prohibited AI Systems, High-risk AI systems, Neurorights
Abstract
This paper examines the potential application of the European Union’s Artificial Intelligence Act (Regulation (EU) 2024/1689) to neurotechnologies (NTs), which increasingly rely on artificial intelligence (AI) for functions such as neuroimaging, brain-computer interfaces, and neurostimulation. In light of growing ethical and human rights concerns, including the risks of inferring private thoughts, manipulating behavior, and undermining individual autonomy, the study evaluates how the AI Act’s provisions on prohibited and high-risk AI systems apply to AI-assisted NTs. It analyzes four categories of prohibited practices and their potential application to NTs: subliminal, manipulative or deceptive techniques; criminal offence risk assessment and prediction; emotion recognition in workplaces or educational institutions; and biometric categorisation based on sensitive characteristics. It also considers high-risk classifications under Annexes I and III of the AI Act, with particular attention to medical device regulations and profiling activities. The findings suggest that, while the AI Act does not apply directly to NTs, its regulation of the AI systems that enable or enhance NT functions exerts a substantial regulatory impact on them. This framework may mitigate many NT-related risks, indicating that calls for creating new “neurorights” may be unnecessary.
References
ABI-RACHED, J. M. (2008). ‘The implications of the new brain sciences’. European Molecular Biology Organization Reports, 9(12), 1158-1162.
AFP (2023). AI-supercharged neurotech threatens mental privacy: UNESCO, France 24, 13 July 2023. (Accessed: 14 August 2025). Available at: https://www.france24.com/en/live-news/20230713-ai-supercharged-neurotech-threatens-mental-privacy-unesco
AHARONI, E., et al. (2013). ‘Neuroprediction of future rearrest’. Proceedings of the National Academy of Sciences, 110, 6223–6228. https://doi.org/10.1073/pnas.1219302110
AHMED, N., et al. (2023). ‘A systematic survey on multimodal emotion recognition using learning algorithms’. Intelligent Systems with Applications, 17, 200171, 1-19. https://doi.org/10.1016/j.iswa.2022.200171
ALFIHED, S., et al. (2024). ‘Non-Invasive Brain Sensing Technologies for Modulation of Neurological Disorders’. Biosensors, 14(7), 1-23. https://doi.org/10.3390/bios14070335
ALIMARDANI, M. and HIRAKI, K. (2020). ‘Passive Brain-Computer Interfaces for Enhanced Human-Robot Interaction’. Frontiers in Robotics and AI, 7, Article 125, 1-12. https://doi.org/10.3389/frobt.2020.00125
ALMADA, M. and PETIT, N. (2025). ‘The EU AI Act: Between the Rock of Product Safety and the Hard Place of Fundamental Rights’. Common Market Law Review, 62, 85-120.
ALOUI, K., et al. (2018). ‘Using brain prints as new biometric feature for human recognition’. Pattern Recognition Letters, 113, 38-45. https://doi.org/10.1016/j.patrec.2017.10.001
AMERICAN PSYCHOLOGICAL ASSOCIATION (2018). APA Dictionary of Psychology. (Accessed: 2 November 2025). Available at: https://dictionary.apa.org/
ANDORNO, R. (2023). Neurotecnologías y derechos humanos en América Latina y el Caribe: Desafíos y propuestas de política pública. UNESCO Office for Latin America and the Caribbean. Available at: https://unesdoc.unesco.org/ark:/48223/pf0000387079 (Accessed: 7 August 2025).
ANSEDE, M. (2025). ‘Rafael Yuste, neuroscientist: ‘We have to avoid a fracture in humanity between people who have cognitive augmentation and those who do not’. EL PAÍS. (Accessed 14 May 2025). Available at: https://english.elpais.com/science-tech/2025-01-18/rafael-yuste-neuroscientist-we-have-to-avoid-a-fracture-in-humanity-between-people-who-have-cognitive-augmentation-and-those-who-do-not.html
BARDHAN, A. (2023). ‘Mind-Control Gaming Isn’t Sci-Fi, It’s Just Science’. Kotaku. (Accessed: 22 May 2025). Available at: https://kotaku.com/virtual-reality-oculus-headset-meta-vr-video-game-ui-1850938750
BELKACEM, A. N., et al. (2023). ‘On Closed-Loop Brain Stimulation Systems for Improving the Quality of Life of Patients with Neurological Disorders’. Frontiers in Human Neuroscience, 17, 1085173. https://doi.org/10.3389/fnhum.2023.1085173
BELTRAN DE HEREDIA RUIZ, I. (2023). Inteligencia artificial y neuroderechos: la protección del yo inconsciente de la persona, Aranzadi: Navarra.
BERMÚDEZ, J. P., et al. (2023). ‘What Is a Subliminal Technique? An Ethical Perspective on AI-Driven Influence’. Conference: 2023 IEEE International Symposium on Ethics in Engineering, Science, and Technology (ETHICS), 1-10. https://doi.org/10.1109/ETHICS57328.2023.10155039
BHIDAYASIRI, R. (2024). ‘The Grand Challenge at the Frontiers of Neurotechnology and its Emerging Clinical Applications’. Frontiers in Neurology, 15:1314477, 17 January. https://doi.org/10.3389/fneur.2024.1314477
BIOMETRICS INSTITUTE (n.d.). Physiological and Behavioural Biometrics. (Accessed: 11 July 2025). Available at: https://www.biometricsinstitute.org/physiological-and-behavioural-biometrics/
BIRD & BIRD (2025). European Union Artificial Intelligence Act: a guide. 7 April.
BLITZ, M. J. (2017). Searching Minds by Scanning Brains. Palgrave Studies in Law, Neuroscience, and Human Behavior. Palgrave Macmillan, Cham. https://doi.org/10.1007/978-3-319-50004-1_3
BOQUIEN, M. (2024). ‘Difference Between an AI System and an AI Model’. Dastra Blog, 17 July. (Accessed: 1 August 2025). Available at: https://www.dastra.eu/en/article/difference-between-an-ai-system-and-an-ai-model/57721
BORDA, L., et al. (2023). ‘Automated calibration of somatosensory stimulation using reinforcement learning’. Journal of NeuroEngineering Rehabilitation, 20, 131. https://doi.org/10.1186/s12984-023-01246-0
BOSTROM, N. and SANDBERG, A. (2009). ‘Cognitive Enhancement: Methods, Ethics, Regulatory Challenges’. Sci Eng Ethics, 15, 311–341. https://doi.org/10.1007/s11948-009-9142-5
BUBLITZ, J. CH. (2024a). ‘Neurotechnologies and Human Rights: Restating and Reaffirming the Multi-Layered Protection of The Person’. The International Journal of Human Rights, 28(5), 782-807. https://doi.org/10.1080/13642987.2024.2310830
BUBLITZ, J. CH. (2024b). ‘Banning Biometric Mind Reading: The Case for Criminalising Mind Probing’. Law, Innovation and Technology, 16(2), 432-462. https://doi.org/10.1080/17579961.2024.2392934
BUBLITZ, J. CH., et al. (2024). ‘Implications of the novel EU AI Act for Neurotechnologies’. Neuron, 112, 3013-3016. https://doi.org/10.1016/j.neuron.2024.08.011
BUBLITZ, J. CH. and LIGTHART, S. (2024). ‘The new regulation of non-medical neurotechnologies in the European Union: overview and reflection’. Journal of Law and the Biosciences, 11(2), July-December, lsae021, 1-15. https://doi.org/10.1093/jlb/lsae021
BUBLITZ, J. CH., et al. (2025). ‘Brain Stimulation May Be a Subliminal Technique Under the European Union's Artificial Intelligence Act’, European Journal of Neuroscience, 61(8), e70115, 1-4. https://doi.org/10.1111/ejn.70115
BUCKHOLTZ, J.W. and FAIGMAN, D.L. (2014). ‘Promises, promises for neuroscience and law’. Current Biology, 24(18), R861–R867
BUITEN, M. C. (2019). ‘Towards Intelligent Regulation of Artificial Intelligence’. European Journal of Risk Regulation, 10(1), 41-59. https://doi.org/10.1017/err.2019.8
CALIFORNIA LEARNING RESOURCE NETWORK (2025) Is AI a Software? CLRN, 21 April. (Accessed: 29 July 2025). Available at: https://www.clrn.org/is-ai-a-software/
CAMBRIDGE DICTIONARY (2025). Intention. (Accessed: 9 July 2025). Available at: https://dictionary.cambridge.org/dictionary/english/intention
CANNARD, C., et al. (2020). ‘Chapter 16 - Self-health monitoring and wearable neurotechnologies’, in RAMSEY, N. F.; MILLÁN, J. DEL R. (eds.). Handbook of Clinical Neurology: Brain-Computer Interfaces, 168. Amsterdam: Elsevier, 207–232. https://doi.org/10.1016/b978-0-444-63934-9.00016-0
CARON, J. F. (2018). A Theory of the Super Soldier: The Morality of Capacity-Increasing Technologies in the Military. Manchester University Press.
CATLEY, P. and CLAYDON, L. (2015). ‘The use of neuroscientific evidence in the courtroom by those accused of criminal offenses in England and Wales’. Journal of Law and the Biosciences, 2, 510–549. https://doi.org/10.1093/jlb/lsv025
CHAMBERLAIN III, V. D. (2023). ‘A Neurotechnology Framework to Analyze Soldier Enhancement Using Invasive Neurotechnology’. U.S. Naval War College, Newport, RI. (Accessed: 9 July 2025). Available at: https://apps.dtic.mil/sti/trecms/pdf/AD1209159.pdf
CHANDRABHATLA, A. S., et al. (2023). ‘Landscape and Future Directions of Machine Learning Applications in Closed-Loop Brain Stimulation’. NPJ Digital Medicine, 6(79), 1-13. https://doi.org/10.1038/s41746-023-00779-x
CHURCHLAND, P. S. and CHURCHLAND, P. M. (2012). ‘What are beliefs?’ in KRUEGER, F. and GRAFMAN, J. (eds.). The Neural Basis of Human Belief Systems (1st ed.). Taylor and Francis. Available at: https://www.perlego.com/book/1685250/the-neural-basis-of-human-belief-systems-pdf
COCKRELL SCHOOL OF ENGINEERING (2024). Universal Brain-Computer Interface Lets People Play Games with Just Their Thoughts. (Accessed: 22 May 2025). Available at: https://cockrell.utexas.edu/news/archive/9841-universal-brain-computer-interface-lets-people-play-games-with-just-their-thoughts
COUNCIL OF EUROPE [CoE] & OECD (2021). Neurotechnologies and Human Rights Framework: Do We Need New Rights? – Rapporteurs’ Report of the Round Table. Strasbourg: CoE & OECD. (Accessed: 9 August 2025). Available at: https://rm.coe.int/round-table-report-en/1680a969ed
CRISTOFORI, I. and GRAFMAN, J. (2017). ‘Neural Underpinnings of the Human Belief System’ in ANGEL, H-F., et al. (eds.). Processes of Believing: The Acquisition, Maintenance, and Change in Creditions, New Approaches to the Scientific Study of Religion 1, 111-123. https://doi.org/10.1007/978-3-319-50924-2_8
DE KOGEL, C.H. and WESTGEEST, E. J. M. C. (2015). ‘Neuroscientific and behavioral genetic information in criminal cases in the Netherlands’. Journal of Law and the Biosciences, 2(3), 580–605. https://doi.org/10.1093/jlb/lsv024
DELFIN, C., et al. (2019). ‘Prediction of recidivism in a long-term follow-up of forensic psychiatric patients: incremental effects of neuroimaging data’. PLoS One, 14(5), 1-21. https://doi.org/10.1371/journal.pone.0217127
DOUGLAS, T. (2014). ‘Criminal Rehabilitation Through Medical Intervention: Moral Liability and the Right to Bodily Integrity’. Journal of Ethics, 18, 101–122. https://doi.org/10.1007/s10892-014-9161-6
DUFFY, C. (2024). ‘First Neuralink human trial subject can control a computer mouse with brain implant, Elon Musk says’. CNN. Available at: https://edition.cnn.com/2024/02/20/tech/first-neuralink-human-subject-computer-mouse-elon-musk/index.html (Accessed: 15 May 2025).
EU NETWORK OF INDEPENDENT EXPERTS ON FUNDAMENTAL RIGHTS (EU NIEFR) (2006). Commentary of the Charter of Fundamental Rights of The European Union, June 2006. (Accessed: 20 July 2025). Available at: https://sites.uclouvain.be/cridho/documents/Download.Rep/NetworkCommentaryFinal.pdf
EU OMBUDSMAN (2024). Case 157/2023/VB, opened on 15 March 2023; Decision on 25 April. (Accessed: 15 August 2025). Available at: https://www.ombudsman.europa.eu/en/case/en/63216
SPANISH DATA PROTECTION AGENCY & EUROPEAN DATA PROTECTION SUPERVISOR [AEPD & EDPS] (2024). TechDispatch on Neurodata. Luxembourg: Publications Office of the European Union. (Accessed: 1 November 2025). Available at: https://www.aepd.es/guides/neurodata-aepd-edps.pdf
EUROPEAN COMMISSION (2020). White Paper: On Artificial Intelligence – A European Approach to excellence and trust. Brussels. EU Doc. COM(2020) 65 final. (Accessed: 20 July 2025). Available at: https://commission.europa.eu/documents_en?prefLang=es&f%5B0%5D=document_title%3Awhite%20paper%20artificial%20intelligence
EUROPEAN COMMISSION (2021). Questions and Answers: Application of Regulation on Medical Devices – EU rules to ensure safety of medical devices, Press Corner Q&A 21/2619, Brussels, 26 May. Available at: https://ec.europa.eu/commission/presscorner/api/files/document/print/en/qanda_21_2619/QANDA_21_2619_EN.pdf (Accessed: 29 July 2025)
EUROPEAN COMMISSION (2022). Implementing Regulation (EU) 2022/2347 of 1 December 2022 laying down rules for the application of Regulation (EU) 2017/745 of the European Parliament and of the Council as regards reclassification of groups of certain active products without an intended medical purpose. Official Journal of the European Union L 311/94, 2.12.2022.
EUROPEAN COMMISSION (2025). ANNEX to the Communication to the Commission Approval of the content of the draft Communication from the Commission - Commission Guidelines on prohibited artificial intelligence practices established by Regulation (EU) 2024/1689 (AI Act), 4.2.2025, C (2025) 884 final.
EUROPEAN LAW INSTITUTE (2024). The concept of ‘AI system’ under the new AI Act: Arguing for a Three-Factor Approach. Vienna.
EUROPEAN PARLIAMENT AND COUNCIL (2016). Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation). Official Journal of the European Union, L 119, 1–88.
EUROPEAN PARLIAMENT AND COUNCIL (2024). Regulation (EU) 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down harmonised rules on artificial intelligence and amending Regulations (EC) No 300/2008, (EU) No 167/2013, (EU) No 168/2013, (EU) 2018/858, (EU) 2018/1139 and (EU) 2019/2144 and Directives 2014/90/EU, (EU) 2016/797 and (EU) 2020/1828 (Artificial Intelligence Act) (Text with EEA relevance).
EUROPEAN PARLIAMENTARY RESEARCH SERVICE [EPRS] (2024). The protection of mental privacy in the area of neuroscience. Brussels: European Parliament. (Accessed: 14 August 2025). Available at: https://www.europarl.europa.eu/RegData/etudes/STUD/2024/757807/EPRS_STU(2024)757807_EN.pdf
EUROPEAN UNION (2007). Explanations relating to the Charter of Fundamental Rights. Official Journal of the European Union, C 303, 17–35. (Accessed: 14 August 2025). Available at: https://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=OJ:C:2007:303:0017:0035:EN:PDF
EUROPEAN UNION AGENCY FOR FUNDAMENTAL RIGHTS (2018). FRA 2018 focus – Big data and fundamental rights. (Accessed: 19 July 2025). Available at: https://fra.europa.eu/sites/default/files/fra_uploads/fra-2018-focus-big-data_en.pdf
EU-STARTUPS (n.d.) ‘NextMind’, EU-Startups. (Accessed: 15 May 2025). Available at: https://www.eu-startups.com/directory/nextmind/
EVANS, N. (2012). ‘Emerging Military Technologies: A Case Study in Neurowarfare’ in TRIPODI, P.; WOLFENDALE, J. (eds.). New Wars and New Soldiers: Military Ethics in the Contemporary World. England: Ashgate Publishing Company / Routledge, 105-116.
FAIGMAN, D.L., et al. (2014). ‘Group to Individual (G2i) Inference in Scientific Expert Testimony’. University of Chicago Law Review, 81(2), 417-480.
FARAH, M. J., et al. (2014). ‘Functional MRI-based Lie Detection: Scientific and Societal Challenges’. Nature Reviews Neuroscience, 15, 123-131. https://doi.org/10.1038/nrn3665
FARAHANY, N. A. (2023). ‘Neurotech at Work’. Harvard Business Review, March–April. (Accessed: 16 May 2025). Available at: https://hbr.org/2023/03/neurotech-at-work
FARINA, M. and LAVAZZA, A. (2022). ‘Memory Modulation Via Non-invasive Brain Stimulation: Status, Perspectives, and Ethical Issues’. Frontiers in Human Neuroscience, 16, 1-6. https://doi.org/10.3389/fnhum.2022.826862
FARISCO, M. and PETRINI, C. (2014). ‘On the stand. Another episode of neuroscience and law discussion from Italy’. Neuroethics. 7, 243–245. https://doi.org/10.1007/s12152-013-9187-7
FRANK, M., et al. (2017). ‘Using EEG-Based BCI Devices to Subliminally Probe for Private Information’. WPES '17: Proceedings of the 2017 on Workshop on Privacy in the Electronic Society, 133-136. https://doi.org/10.1145/3139550.3139559
GEETHA, A.V., et al. (2024). ‘Multimodal Emotion Recognition with Deep Learning: Advancements, Challenges, and Future Directions’. Information Fusion, 105, 102218, 1-38. https://doi.org/10.1016/j.inffus.2023.102218
GLENN, A. L. and RAINE, A. (2014). ‘Neurocriminology: Implications for the punishment, prediction and prevention of criminal behaviour’, Nature Reviews Neuroscience, 15, 54–63. https://doi.org/10.1038/nrn3640
GOU, N., et al. (2021). ‘Identification of violent patients with schizophrenia using a hybrid machine learning approach at the individual level’, Psychiatry Research, 306, 114294, 1-9. https://doi.org/10.1016/j.psychres.2021.114294
GRILLNER, S., et al. (2016). ‘Worldwide Initiatives to Advance Brain Research’. Nature neuroscience, 19(9), 1118–1122. https://doi.org/10.1038/nn.4371
HAFNER, M. (2019). ‘Judging homicide defendants by their brains: An empirical study on the use of neuroscience in homicide trials in Slovenia’. Journal of Law and the Biosciences, 6(1), 226–254. https://doi.org/10.1093/jlb/lsz006
HAIN, D., et al. (2023). Unveiling the Neurotechnology Landscape: Scientific Advancements, Innovations and Major Trends. Paris: UNESCO. https://doi.org/10.54678/OCBM4164
HALKIOPOULOS, C., et al. (2025). ‘Advances in Neuroimaging and Deep Learning for Emotion Detection: A Systematic Review of Cognitive Neuroscience and Algorithmic Innovations’. Diagnostics, 15(456), 1-85. https://doi.org/10.3390/diagnostics15040456
HASLACHER, D., et al. (2024). ‘AI for brain-computer interfaces’ in IENCA, M.; STARKE, G. Developments in Neuroethics and Bioethics. Academic Press, 3-28. https://doi.org/10.1016/bs.dnb.2024.02.003
HASSABIS, D., et al. (2017). ‘Neuroscience-Inspired Artificial Intelligence’. Neuron, 95(2), 245-258. (Accessed: 01 November 2025). Available at: https://www.cell.com/neuron/fulltext/S0896-6273(17)30509-3
HOLBROOK, C., et al. (2016). ‘Neuromodulation of Group Prejudice and Religious Beliefs’. Social Cognitive and Affective Neuroscience, 11(3), 387-394. https://doi.org/10.1093/scan/nsv107
IACOBONI, M., et al. (2007). ‘This Is Your Brain on Politics’. New York Times, November 11.
IEEE Brain (n.d.). Neurotechnologies: The Next Technology Frontier, IEEE Brain. (Accessed: 14 August 2025). Available at: https://brain.ieee.org/topics/neurotechnologies-the-next-technology-frontier/
IENCA, M. and ANDORNO, R. (2017). ‘Towards new human rights in the age of neuroscience and neurotechnology’. Life Sciences, Society and Policy, Vol. 13, No. 5, pp. 1-27. https://doi.org/10.1186/s40504-017-0050-1
IENCA, M. and MALGIERI, G. (2022). ‘Mental data protection and the GDPR’. Journal of Law and Biosciences, 9(1), lsac006, 1-19. https://doi.org/10.1093/jlb/lsac006
INDEPENDENT HIGH-LEVEL EXPERT GROUP ON AI SET UP BY THE EUROPEAN COMMISSION (AI HLEG) (2019). Ethics Guidelines for Trustworthy AI. European Commission: Brussels.
INTER-AMERICAN JURIDICAL COMMITTEE, ORGANIZATION OF AMERICAN STATES (OAS) (2023). Inter-American Declaration of Principles on Neuroscience, Neurotechnologies, and Human Rights, CJI-RES. 281 (CII-O/23) corr.1. (Accessed: 14 August 2025). Available at: https://www.oas.org/en/sla/iajc/docs/CJI-RES_281_CII-O-23_corr1_ENG.pdf
INTERNATIONAL NEUROMODULATION SOCIETY (2021). Conditions That May Be Treated with Neuromodulation. Available at: https://www.neuromodulation.com/conditions (Accessed: 7 August 2025).
INTERNATIONAL NEUROMODULATION SOCIETY (2023). About Neuromodulation. Available at: https://www.neuromodulation.com/about-neuromodulation (Accessed: 7 August 2025).
ISO/IEC (2022). Information technology — Artificial intelligence — Artificial intelligence concepts and terminology (ISO/IEC 22989:2022). First edition, July 2022. Geneva: International Organization for Standardization.
JOHNSON, S. (2017). ‘This Company Wants to Gather Student Brainwave Data to Measure Engagement’. EdSurge, 26 October. (Accessed: 16 May 2025). Available at: https://www.edsurge.com/news/2017-10-26-this-company-wants-to-gather-student-brainwave-data-to-measure-engagement
JULIÀ-PIJOAN, M. (2020). Proceso penal y (neuro) ciencia: una interacción desorientada. Una reflexión acerca de la neuropredicción. Madrid: Marcial Pons.
JULIÀ-PIJOAN, M. (2024). La computarización del derecho, a partir del proceso y de los procedimientos judiciales. Madrid: Dykinson, S. L.
JWA, A. S. and POLDRACK, R. A. (2022). ‘Addressing privacy risk in neuroscience data: from data protection to harm prevention’. Journal of Law and the Biosciences, 9(2). https://doi.org/10.1093/jlb/lsac025
KAMITANI, Y. and TONG, F. (2005). ‘Decoding the visual and subjective contents of the human brain’. Nature Neuroscience, 8, 679–685. https://doi.org/10.1038/nn1444
KAPLAN, J. T., et al. (2007). ‘Us versus Them: Political Attitudes and Party Affiliation Influence Neural Response to Faces of Presidential Candidates’. Neuropsychologia, 45(1), 55-64. https://doi.org/10.1016/j.neuropsychologia.2006.04.024
KAPLAN, J. T., et al. (2016). ‘Neural Correlates of Maintaining One’s Political Beliefs in the Face of Counterevidence’. Sci Rep, 6, 39589, 1-11. https://doi.org/10.1038/srep39589
KHAILI, M. A., et al. (2023). ‘Deep Learning Applications in Brain Computer Interface Based Lie Detection’. IEEE 13th Annual Computing and Communication Workshop and Conference (CCWC). https://doi.org/10.1109/ccwc57344.2023.10099109
KIEHL, K.A., et al. (2018). ‘Age of Gray Matters: Neuroprediction of Recidivism’, NeuroImage: Clinical, 19, 813–823. https://doi.org/10.1016/j.nicl.2018.05.036
KLONOVS, J., et al. (2013). ‘ID Proof on the Go: Development of a Mobile EEG-Based Biometric Authentication System’. IEEE Vehicular Technology Magazine, 8(1), 81–89. https://doi.org/10.1109/mvt.2012.2234056
KO, L. W., et al. (2017). ‘Sustained attention in real classroom settings: An EEG study’. Frontiers in Human Neurosciences, 11(388), 1-10. https://doi.org/10.3389/fnhum.2017.00388
KOSAL, M. and PUTNEY, J. (2023). ‘Neurotechnology and International Security: Predicting commercial and military adoption of brain-computer interfaces (BCIs) in the United States and China’. Politics and the Life Sciences, 41(1), 81-103. https://doi.org/10.1017/pls.2022.2
KROL, L. R. and ZANDER, T. O. (2017). ‘Passive BCI-based neuroadaptive systems,’ in Proceedings of the 7th Graz Brain-Computer Interface Conference 2017 (Graz: GBCIC). https://doi.org/10.3217/978-3-85125-533-1-46
KUBANEK, J., et al. (2020). ‘Remote, Brain Region–Specific Control of Choice Behavior with Ultrasonic Waves’. Science Advances, 6(21), 1-9.
KUNZ, E. M., et al. (2025). ‘Inner Speech in Motor Cortex and Implications for Speech Neuroprostheses’. Cell, 188, 1-16. https://doi.org/10.1016/j.cell.2025.06.015
LIGTHART, S., et al. (2021). ‘Forensic Brain-Reading and Mental Privacy in European Human Rights Law: Foundations and Challenges’. Neuroethics, 14, 191-203. https://doi.org/10.1007/s12152-020-09438-4
LIU, N. H., et al. (2013). ‘Improving Driver Alertness Through Music Selection Using a Mobile EEG to Detect Brainwaves’. Sensors, 13, 8199–8221. https://doi.org/10.3390/s130708199
LUCCHIARI, C., et al. (2019). ‘Editorial: Brain Stimulation and Behavioral Change’. Frontiers in Behavioral Neuroscience, 13(20), 1-3. https://doi.org/10.3389/fnbeh.2019.00020
MERIKLE, P. M., et al. (2001). ‘Perception without awareness: Perspectives from Cognitive Psychology’. Cognition, 79(1-2): 115-134. https://doi.org/10.1016/s0010-0277(00)00126-8
MIYAWAKI, Y., et al. (2008). ‘Visual image reconstruction from human brain activity using a combination of multiscale local image decoders’. Neuron, 60(5), 915-929. PMID: 19081384. https://doi.org/10.1016/j.neuron.2008.11.004
MOORE, T. E. (1982). ‘Subliminal Advertising: What You See Is What You Get’. Journal of Marketing, Vol. 46, No. 2, pp. 38-47. https://doi.org/10.2307/3203339
MORE, V., et al. (2023). ‘Using Motor Imagery and Deep Learning for Brain-Computer Interface in Video Games’. 2023 IEEE World AI IoT Congress (AIIoT), Seattle, WA, USA, 0711-0716. https://doi.org/10.1109/AIIoT58121.2023.10174453
MORENO, J., et al. (2022). ‘The Ethics of AI-Assisted Warfighter Enhancement Research and Experimentation: Historical Perspectives and Ethical Challenges’. Frontiers in Big Data, 5(978734), 1-13. https://doi.org/10.3389/fdata.2022.978734
MUNYON, CH. (2018). ‘Neuroethics of Non-primary Brain Computer Interface: Focus of Potential Military Applications’. Frontiers in Neuroscience, 12(696), 1-4. https://doi.org/10.3389/fnins.2018.00696
MUSE (2025). Muse™ EEG-Powered Meditation & Sleep Headband. (Accessed: 15 May 2025). Available at: https://choosemuse.com
NEURALINK (n.d). Neuralink. (Accessed: 15 May 2025). Available at: https://neuralink.com/
NUFFIELD COUNCIL ON BIOETHICS (2013). Novel Neurotechnologies: Intervening in the Brain. London: Nuffield Council on Bioethics. https://cdn.nuffieldbioethics.org/wp-content/uploads/Novel-neurotechnologies-report.pdf
OECD (2024). Explanatory Memorandum on the Updated OECD Definition of an AI System. OECD Artificial Intelligence Papers. March, No. 8.
OECD (2025). Recommendation of the Council on Responsible Innovation in Neurotechnology. OECD/LEGAL/0457; (Accessed: 30 July 2025). Available at: https://legalinstruments.oecd.org/en/instruments/oecd-legal-0457
ONCIUL, R., et al. (2025). ‘Artificial Intelligence and Neuroscience: Transformative Synergies in Brain Research and Clinical Applications’, Journal of Clinical Medicine, 14, Article 550. (Accessed: 01 november 2025). Available at: https://doi.org/10.3390/jcm14020550
PATEL, S. H. and AZZAM, P. N. (2005). ‘Characterization of N200 and P300: Selected Studies of the Event-Related Potential’. International Journal of Medical Sciences, Vol. 2, No. 4, pp. 147-154. https://doi.org/10.7150/ijms.2.147
PAUL-ZIEGER, R. (2024). EU MDR Conformity Assessment Options for Medical Devices: Determining the proper path to CE marking for your products. EmergobyUL.com White Paper, May. (Accessed: 30 July 2025). Available at: https://www.emergobyul.com/sites/default/files/2024-12/EU-MDR-Conformity-Assessment-Whitepaper.pdf
PEARSON, H. (2006). ‘Lure of Lie Detectors Spooks Ethicists’. Nature, 441, 918-919.
PELLEY, R. (2024). ‘‘Use the Force, Rich!’ Can You Really Play Video Games with Your Mind?’. The Guardian. (Accessed: 22 May 2025). Available at: https://www.theguardian.com/games/article/2024/aug/09/can-you-really-play-video-games-with-your-mind
POLDRACK, R. A., et al. (2018). ‘Predicting Violent Behavior: What Can Neuroscience Add?’. Trends in Cognitive Sciences, 22, 111–123. https://doi.org/10.1016/j.tics.2017.11.003
PRESS, G. (2017). ‘Artificial Intelligence (AI) Defined’. Forbes, August 27. (Accessed: 12 July 2025). Available at: https://www.forbes.com/sites/gilpress/2017/08/27/artificial-intelligence-ai-defined/
QVORTRUP, M. (2024). The Political Brain: The Emergence of Neuropolitics. CEU Press Perspectives.
RAINEY, S., et al. (2020). ‘Brain Recording, Mind‑Reading, and Neurotechnology: Ethical Issues from Consumer Devices to Brain‑Based Speech Decoding’. Science and Engineering Ethics, 26, 2295–2311. https://doi.org/10.1007/s11948-020-00218-0
RAINEY, S., et al. (2020a). ‘Is the European Data Protection Regulation sufficient to deal with emerging data concerns relating to neurotechnology?’. Journal of Law and the Biosciences, 7(1), January-June, 1-19. https://doi.org/10.1093/jlb/lsaa051
RAZQUIN, M. M. (2024). ‘Sistemas de IA prohibidos, de alto riesgo, de limitado riesgo, o de bajo o nulo riesgo’. Revista de Privacidad y Derecho Digital, 34, 172-235.
ROELFSEMA, P. R., et al. (2018). ‘Mind Reading and Writing: The Future of Neurotechnology’. Trends in Cognitive Sciences, 22(7), 598-610. https://doi.org/10.1016/j.tics.2018.04.001
SCHALK, G., et al. (2024). ‘Translation of neurotechnologies’. Nature Reviews Bioengineering, Vol. 2, pp. 637-652. https://doi.org/10.1038/s44222-024-00185-2
SEITZ, R. J. (2017). ‘Beliefs and Believing as Possible Targets for Neuroscientific Research’ in ANGEL, H-F., et al. (eds.). Processes of Believing: The Acquisition, Maintenance, and Change in Creditions, 1. Springer: New Approaches to the Scientific Study of Religion, 69-81. https://doi.org/10.1007/978-3-319-50924-2_8
SHIH, J. J., et al. (2012). ‘Brain-Computer Interfaces in Medicine’. Mayo Clinic Proceedings, Vol. 87, Issue 3, pp. 268-279. https://doi.org/10.1016/j.mayocp.2011.12.008
SKEEM, J. L. and MONAHAN, J. (2011). ‘Current Directions in Violence Risk Assessment’. Current Directions in Psychological Science, 20(1), 38-42. https://doi.org/10.1177/0963721410397271
STEVENSON, A. (ed.) (2015). Oxford Dictionary of English. 3rd edn. Oxford: Oxford University Press. (Accessed: 5 June 2025). Available at: https://www.oxfordreference.com
SURIANARAYANAN, CH., et al. (2023). ‘Convergence of Artificial Intelligence and Neuroscience towards the Diagnosis of Neurological Disorders – A Scoping Review’. Sensors, 23(6), 3062, 1-29. https://doi.org/10.3390/s23063062
TANG, J., et al. (2023). ‘Semantic Reconstruction of Continuous Language from Non-Invasive Brain Recordings’. Nature Neuroscience, Vol. 26, pp. 858-866. https://doi.org/10.1038/s41593-023-01304-9
TORTORA, L., et al. (2020). ‘Neuroprediction and A.I. in Forensic Psychiatry and Criminal Justice: A Neurolaw Perspective’. Frontiers in Psychology, 11, 220. https://doi.org/10.3389/fpsyg.2020.00220
TÜV AI-LAB (2024). Technical Assessment of High-Risk AI Systems: State of Play and Challenges. TÜV AI-Lab. (Accessed: 1 Aug. 2025). Available at: https://www.tuev-lab.ai/fileadmin/user_upload/AI_Lab/TUEV_AI_Lab_Whitepaper_Technical_Assessment_of_AI_Systems.pdf
UC DAVIS HEALTH (2024), New Brain-Computer Interface Allows a Man with ALS to ‘Speak’ Again. (Accessed: 1 Aug. 2025). Available at: https://health.ucdavis.edu/news/headlines/new-brain-computer-interface-allows-man-with-als-to-speak-again/2024/08
UNESCO (n.d.). Artificial intelligence. UNESCO. (Accessed 7 May 2025). Available at: https://www.unesco.org/en/artificial-intelligence
UNESCO (2025). Draft Recommendation on the Ethics of Neurotechnology. Paris: United Nations Educational, Scientific and Cultural Organization. (Accessed: 9 August 2025). Available at: https://unesdoc.unesco.org/ark:/48223/pf0000394866
UNITED NATIONS HUMAN RIGHTS COUNCIL [UNHRC] (2024). Impact, opportunities and challenges of neurotechnology with regard to the promotion and protection of all human rights. United Nations. Report A/HRC/57/61. (Accessed 9 August 2025). Available at: https://docs.un.org/en/A/HRC/57/61
VAN DONGEN, J. D. M., et al. (2025). ‘Neuroprediction of Violence and Criminal Behaviour Using Neuro-Imaging Data: From Innovation to Considerations for Future Directions’. Aggression and Violent Behavior, 80(102008), 1-14. https://doi.org/10.1016/j.avb.2024.102008
VIBRE (n.d.). Analyzing Brain Data to Reduce Accidents In High-Risk Industries. (Accessed: 16 May 2025). Available at: https://vibre.io/en/
VOLL, C. (2025). ‘The Science of EEG + fNIRS: Why Combining These Technologies Enhances Mental Fitness’. Muse Blog, 18 March. (Accessed: 15 May 2025). Available at: https://choosemuse.com/blogs/news/the-science-of-eeg-fnirs-why-combining-these-technologies-enhances-mental-fitness
YU, T., et al. (2022). ‘Prediction of Violence in Male Schizophrenia Using sMRI, Based on Machine Learning Algorithms’, BMC Psychiatry, 22(676), 1-7. https://doi.org/10.1186/s12888-022-04331-1
YUSTE, R. (2022). ‘Rafael Yuste: Let’s Act Before It’s Too Late’. The UNESCO Courier. [Accessed 14 May 2025]. Available at: https://courier.unesco.org/en/articles/rafael-yuste-lets-act-its-too-late
YUSTE, R., et al. (2017). ‘It’s Time for Neuro-Rights: New Human Rights for the Age of Neurotechnology’. Horizons, 18, 154-164.
ZHOU, M. H., et al. (2025). ‘Bird’s Eye View of Artificial Intelligence in Neuroscience’. AI in Neuroscience, 1(1), 16-41. https://doi.org/10.1089/ains.2024.0001
ZOHNY, H., et al. (2023). ‘The Mystery of Mental Integrity: Clarifying Its Relevance to Neurotechnologies’. Neuroethics, 16(20), 1-20. https://doi.org/10.1007/s12152-023-09525-2
License
Copyright (c) 2025 Miguel Angel Elizalde Carranza

This work is licensed under a Creative Commons Attribution 4.0 International License.