Summer Semester 2025 (SoSe 2025)

Embodied Conversational Agents in Social Virtual Reality - Detail View

Basic Data
Course type: Project
Weekly contact hours (SWS): 10
Course number: 425110009
Max. participants: 6
Semester: SoSe 2025
Assigned module:
Expected participants: 2
Frequency: one-time
Hyperlink:
Language: English


Assigned Persons
Fröhlich, Bernd, Prof. Dr. rer. nat. (responsible)
Zöppig, Tony Jan
Schott, Ephraim, Master of Science
Degree Programs
Degree / Program, exam regulation version (PV) / Credit points
M. Sc. Computer Science and Media (M.Sc.), PV 11: 15
B. Sc. Medieninformatik (B.Sc.), PV 29: 15
B. Sc. Medieninformatik (B.Sc.), PV 11: 15
B. Sc. Medieninformatik (B.Sc.), PV 16: 15
B. Sc. Medieninformatik (B.Sc.), PV 17: 15
M. Sc. Computer Science for Digital Media (M.Sc.), PV 18: 15
M. Sc. Human-Computer Interaction (M.Sc.), PV 19: 12/18
B. Sc. Informatik (B.Sc.), PV 2020: 12
M. Sc. Computer Science for Digital Media (M.Sc.), PV 2020: 12
Assignment to Institutions
Virtual Reality and Visualization
Faculty of Media
Content
Description

The growing capabilities of large language models and text-to-speech technology are expanding the use of conversational agents. In virtual environments, these agents can be represented as animated avatars, leading to more lively and engaging VR experiences, e.g., when using them as guides in virtual museums. However, communication with these agents in VR relies entirely on speech, as traditional text input is no longer an option.

In this project, we aim to extend our research group's previous work to multi-user settings (i.e., social virtual reality), making interactions more natural and intuitive while integrating multiple agents into virtual worlds. The key challenge is to ensure smooth communication between multiple VR users and multiple AI agents without overwhelming or confusing conversations. We will investigate how users can initiate interactions with agents without explicitly selecting a specific agent, and how they can receive additional information in an intuitive way.

To explore these aspects, you will build on our existing integrations of OpenAI’s language models and animated avatars in Unity3D. You will investigate different modalities such as pointing, looking, and speaking to select agents, and experiment with ways to playfully engage VR users in immersive experiences.

During this project, you will learn to design multi-user virtual reality applications and deepen your understanding of virtual reality development. Furthermore, you will explore advanced multi-modal interaction concepts, gain practical experience using OpenAI’s API, and enhance your proficiency with Unity3D and C#.

Prior experience with C#, Unity3D, and multi-user VR is strongly advised. Additionally, familiarity with rigging and character animations will be beneficial. Equipment for VR development will be provided.
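As a purely hypothetical illustration (not part of the course materials) of how pointing and looking could be fused to decide which agent a user is addressing, one common strategy is to score each agent by the angular offset between the user's gaze and pointing rays and the direction to that agent, then pick the agent with the smallest weighted score. A minimal language-agnostic sketch, with made-up agent names and an assumed gaze weight:

```python
import math

def angle_between(v1, v2):
    # Angle in radians between two 3D direction vectors.
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    return math.acos(max(-1.0, min(1.0, dot / (n1 * n2))))

def select_agent(head_pos, gaze_dir, point_dir, agents, gaze_weight=0.6):
    # Score each agent by a weighted sum of the angular offsets between
    # the user's gaze/pointing rays and the direction to the agent;
    # the agent with the smallest score is treated as the addressed one.
    # gaze_weight=0.6 is an arbitrary assumption for this sketch.
    best, best_score = None, float("inf")
    for name, pos in agents.items():
        to_agent = tuple(p - h for p, h in zip(pos, head_pos))
        score = (gaze_weight * angle_between(gaze_dir, to_agent)
                 + (1 - gaze_weight) * angle_between(point_dir, to_agent))
        if score < best_score:
            best, best_score = name, score
    return best

# Hypothetical scene: two agents, user at the origin looking down +z.
agents = {"guide": (0.0, 1.6, 3.0), "curator": (2.0, 1.6, 1.0)}
print(select_agent((0, 1.6, 0), (0, 0, 1), (0, 0, 1), agents))  # → guide
```

In a Unity3D implementation the same idea would use head and controller poses from the XR rig instead of hard-coded vectors; speech cues (e.g., an agent's name in the utterance) could be folded into the score as a third term.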

Note

Time and place will be announced at the project fair.

Prerequisites

Solid programming/scripting experience (e.g., C#, C++, Python). Experience with Unity is very helpful. Experience with rigging and character animation may be beneficial. Successful completion of the Virtual Reality course is recommended.

Target Group

B.Sc. Medieninformatik / Informatik

M.Sc. Computer Science and Media / Computer Science for Digital Media

M.Sc. Human-Computer Interaction



Structure Tree
The course was found 10 times in the SoSe 2025 course catalog:
Projects
Master
Bachelor
Projects
Research Project 1
Research Project 2
Project
