Make it (net)work! - MLA2S Networking Seminar #06
Tuesday, 23 September 2025
News from MLA2S and Make it (net)work!
Claudius Krause (HEPHY Vienna, ÖAW)
Nicki Holighaus (Acoustics Research Institute, Austrian Academy of Sciences)
Claus Trost (Erich Schmid Institute of Materials Science of the Austrian Academy of Sciences)
Kati Heinrich (IGF | ÖAW)
Jan Odstrčilík (IMAFO)
13:00 - 13:30
Including a short round-table news update.
Centering social responsibility in the development, application and governance of ML/AI
Stefan Strauß (ITA)
Astrid Mager (ITA)
Doris Allhutter (ITA)
Fabian Fischer (ITA)
Michael Nentwich (ITA)
13:30 - 14:10
Technology Assessment (TA) studies the potentials and challenges of machine learning and AI technologies with regard to ethical questions and the social values of equality, privacy, democracy, sustainability, human autonomy, and security. To showcase ITA’s portfolio of ML/AI-related research, this presentation introduces four projects that demonstrate the institute’s impact in inter- and transdisciplinary research, technology policy and society. The study ‘Generative AI and Democracy’ examines the opportunities of AI for democracy and political debate, providing the Austrian parliament with policy advice on how to prevent misinformation and fake news. The international research project ‘Auto-Welf’ analyses automated welfare provision across European welfare systems and examines the implications of algorithms and AI for the future of European citizens and societies. ‘A.I.D – Artificial Intelligence and the Shaping of Democracy’ focuses on the social consequences of AI in education and develops teaching materials for various educational institutions. Finally, our cooperation with interdisciplinary computer scientists explores the normativity of ML practices with regard to fairness, accountability and transparency.
Discussion
14:10 - 14:20
Break to walk a few steps and get some fresh air.
14:20 - 14:30
Governing Machines: From Moderation to Alignment
Charlotte Spencer-Smith (CMC)
14:30 - 15:10
This talk presents early work from the CMC research track "Platform Governance in the AI Era". Generative AI tools like ChatGPT are quickly changing the information ecosystem, echoing transformations brought about by social media. This talk compares two ways of managing risks: content moderation, which sets rules for what users can post online, and AI alignment, which sets rules for how AI systems themselves behave. Both are systems for deciding what is acceptable or harmful in digital spaces, but alignment goes further by governing the outputs of non-human actors. The talk discusses how challenges familiar from social media, such as harmful content, cultural bias and reliance on hidden human labour, are now reappearing in AI. As such, it is not surprising that early initiatives to regulate AI are informed by paradigms from social media regulation. However, AI alignment poses novel challenges to these paradigms, specifically deep opacity and uncertain controllability. Ultimately, the question is not only how to prevent harm, but how to hold to account the normative assumptions embedded in these systems.
Discussion
15:10 - 15:20
Networking with refreshments / further discussions
15:20 - 16:45