Preprint Alert: Empirical Legal Studies and the Reception of EU Law by Domestic Courts

I am pleased to share a forthcoming book chapter titled Empirical Legal Studies and the Reception of EU Law by Domestic Courts: A Critical Examination. The chapter, which will appear in a volume on EU Empirical Legal Studies edited by Daniel Naurin, Urska Sadl, and Jan Zglinski (Cambridge University Press), critically assesses empirical legal scholarship on how domestic courts engage with EU law, notably in the context of the preliminary ruling mechanism under Article 267 TFEU. While canvassing the literature, I discuss some of the theoretical, methodological and data limitations that have shaped the field to date and offer pathways for future research.

The preprint can be found on SSRN: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4991569 and on ResearchGate: https://www.researchgate.net/publication/385023040_Empirical_Legal_Studies_and_the_Reception_of_EU_Law_by_Domestic_Courts_A_Critical_Examination.

Stay tuned for more details and a publication date!


Paper Alert: Examining the Effects of Mood and Emotional Valence on the Creation of False Autobiographical Memories


In a new study with Ahmad Shahvaroughi and Henry Otgaar, we investigate the relationship between mood and the formation of false autobiographical memories. Using an innovative blind implantation method, we find that while mood does not significantly affect false memory creation, emotional valence does: false beliefs and memories were implanted in 6% to 35% of participants, and negative events led to significantly higher false memory formation than positive ones. These findings underscore the importance of careful handling of memory in therapeutic settings, where false traumatic memories could inadvertently be implanted.
Read the full paper on SSRN: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4974308 and on ResearchGate.


AI and Privacy: Should Europe Be Wary of Regulatory Taliban?

I rarely write about regulatory questions, and I am deliberately putting this one in provocative terms. So let me be clear: I firmly believe that privacy and AI regulations are necessary. The principles underpinning EU regulations – from the GDPR to the EU AI Act – are fundamentally sound. But the problem I want to point out here has less to do with the legislation itself than with what has been developing around it and the perverse asymmetries it has been generating for European research and, if we are to believe the Draghi Report, for European small and medium-sized businesses.

Most of these regulations were designed to keep Big Tech – predominantly US companies – in check. These firms have been voraciously collecting data from European citizens while deploying generative AI applications and popular chatbots. Yet the same stringent rules apply with equal force to scientific research, SMEs and even charities. Complying with these regulations is demanding, often costly and time-consuming. And this is where we run into the first asymmetry. Big Tech can rely on armies of top-notch lawyers who understand not only the law but also the technology, and who excel at risk evaluation – and they represent only a small overhead spread over a giant business operation. Academic researchers and small businesses don't enjoy such legal resources. They often struggle to comply or to obtain GDPR approval. Jurists on the university bodies in charge of approving ethical and GDPR compliance can be surprisingly unfamiliar with the protocols, data-collection platforms and technologies now used in advanced research.

But this would not be so bad if these regulations had not spurred a regulatory culture, promoted by a large EU commentariat, that is excessively focused on the dangers and threats associated with new technologies. Privacy is invoked at every turn, often legitimately, but sometimes in more questionable ways, with many adverse consequences for European scientists. Conducting legal AI research in Europe is very hard, when not outright impossible, because many judiciaries hide behind the GDPR to block access to the large corpora of decisions researchers need. The fear of facing complaints has made universities risk-averse. And then you have the young, inexperienced jurist on the ethics board or working for the local regulator who has persuaded himself that the modest research proposal he is asked to review must be the next Cambridge Analytica.

It is not rare for even a very low-risk scientific proposal to take weeks of wrangling to get approved. I have seen instances where researchers eventually had to ask anonymous participants to consent to data collection three times: once to join the online platform that recruited them in the UK; a second time to give their GDPR consent; and, finally, a third time to give their ethical consent to data collection (a nuance surely lost on 99.9% of participants).

The problem is not just that researchers end up devoting inordinate amounts of time to convincing university reviewers that their work is compliant (which comes on top of all the academic bureaucracy – reporting, time-sheeting, submitting data management plans – that keeps inflating and distracting scientists from the actual research). To avert problems with finicky reviewers, researchers often choose to do less. Don't collect demographics if they are not strictly necessary for the study or if doing so is likely to delay approval! Master's students interested in conducting experiments involving human participants face the real risk that GDPR and ethical review will prevent them from graduating in a timely fashion.

There are no empirical studies of this phenomenon. But I do see scientific areas – such as psychological and behavioural research – where it is clearly hurting European science. Because studies come with fewer covariates, there is less room for exploratory analysis and the generation of new hypotheses. The same may be occurring in medical research: as studies collect less information about patient characteristics, the resulting data will inevitably offer fewer possibilities to detect interaction effects or understand rare complications.

Meanwhile, Big Tech and multinationals are moving and processing tons of data, often in ways that raise red flags, but as part of complex and opaque business operations that tend to elude the attention of regulators. Talking with business people, I am often astonished at what Big Tech and multinationals dare to do with data. This points to another asymmetry: the regulations we are talking about here are much easier to enforce on academics, small charities and companies running far less complex operations. The popularity of their apps and the attendant network effects afford the Big Tech giants a huge bargaining chip with which to force you and me to hand over our data. Academics obviously don't have this leverage, but they can still feel at the mercy of frivolous accusations of breaching privacy rights.

Within the European regulatory bubble, the fixation on dangers and threats, some phantasmagorical (e.g. the claim, made in a petition circulating in Belgium last year, that people will commit suicide if we let them talk freely to LLM-based chatbots), has overshadowed the need for more discussion of plain enforcement and of the compliance and enforcement asymmetries that are hurting Europe's research and long-term economic prosperity.

What is the solution? What we should try first is to change the culture and mindset that have developed around the legislation, rather than tinkering with the legislation itself. Let's prioritize enforcement and the real risks. And let's make the lives of European scientists and researchers easier and more productive!
