Leveraging Biometric Feedback in Game Development for Adaptive and Personalized Horror Experiences
Keywords
- Personalized Fear Responses
- Neural Feedback
- Visual Noise
- Model Skewing
- Intonalist Orchestration
- Real-Time Sensor Data
- Dynamic Shader Code
- Horror Game Design
- Biometric Data
- Psychophysiological Reactions
- Immersive Media
- Neuroaesthetics
- Adaptive Horror Experiences
Research Question
How can real-time neural and physiological feedback be used to dynamically adjust horror stimuli to personalize and amplify individual fear responses in an immersive video and gaming environment?
Hypothesis
Specific audiovisual stimuli, such as visual noise, model skewing, and intonalist orchestration, can be dynamically adjusted in real time to elicit personalized fear responses, as measured by neural and physiological sensors. This hypothesis can be tested by analyzing how effectively different forms of horror stimuli provoke fear in an individual. Once a player's fear triggers are identified from their biometric data, that information can be applied in game development, allowing the game's environment and stimuli to adapt dynamically and creating a more personalized, intense experience tailored to the player's unique fear responses.
Description
Modern horror games constantly push the boundaries of conventional gaming through advancements like virtual reality. However, horror VR experiences can be inconsistent, as certain visual or audio styles resonate more strongly with some players than others. This project proposes a solution: personalizing VR horror experiences by leveraging biometric feedback.
The core idea is to develop a pre-game test that collects individual fear response data. This data can then be used to automatically tailor the visual and audio elements of a horror game. The pre-game test will isolate different stimuli, such as visual noise, model distortion, and audio effects, to elicit personalized fear responses. These responses will be measured using physiological sensors like heart rate and sweat detection. Once the individual’s fear triggers are identified, this data will be used in-game to dynamically adjust shader code, sound design, and even game design to create a tailored, adaptive horror experience.
The theoretical framework draws from heart rate response theory, dynamic content adaptation, and scalable biometric frameworks. The methodological approach involves three phases: 1) developing a calibration system with distinct test scenes, 2) collecting and analyzing data using the SENS-GSR Grove-GSR Skin Current Sensor and the PIM438 MAX30105 Breakout - Heart Rate Sensor, and 3) implementing real-time adaptation systems to adjust stimulus intensity based on the player’s biometric responses.
This research contributes to the existing knowledge base in interactive media design, horror psychology, and adaptive gaming systems. The innovative aspects include integrating sweat detection alongside heart rate data, developing a multi-dimensional stimulus control framework, and implementing a calibration-based personalization approach. The expected outcomes include demonstrating the effectiveness of personalized horror experiences and identifying the most impactful biological metrics for fear response.
Methodological Framework & Technical Implementation
Our methodology is predicated on the following user pipeline:
- User takes a pre-game test with specific fear stimuli
- User's fears are isolated and interpreted
- Data is used to alter a given game in various ways
​
The system continuously monitors biometric data, such as heart rate and sweat levels, while users navigate through various test scenarios. Each scenario focuses on a specific category of fear: sound, color, visual distortions, or model distortions. The system begins by establishing a baseline from the user's resting biometric rates. As users move through designated zones, it compares their real-time biometric data against this baseline. When readings exceed predetermined thresholds during specific fear tests, the system identifies that particular element as a legitimate fear trigger for the user. This process generates a personalized fear profile based on which elements most effectively induced fear responses compared to the baseline readings.
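The zone-by-zone comparison above can be sketched in a few lines of Python. The data shapes, margin values, and function name are illustrative assumptions for this sketch, not the project's actual code:

```python
def build_fear_profile(baseline, zone_readings, hr_margin=1.15, gsr_margin=1.20):
    """Flag a zone's stimulus as a fear trigger when its readings exceed
    the resting baseline by a fixed margin (margins are illustrative)."""
    profile = {}
    for zone, samples in zone_readings.items():
        mean_hr = sum(s["bpm"] for s in samples) / len(samples)
        mean_gsr = sum(s["gsr"] for s in samples) / len(samples)
        # A zone counts as a trigger if either signal clears its threshold.
        profile[zone] = (mean_hr > baseline["bpm"] * hr_margin
                         or mean_gsr > baseline["gsr"] * gsr_margin)
    return profile
```

A real implementation would also weight how far each signal exceeded its threshold, rather than recording a simple yes/no per zone.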
​
Process
This system is designed for testing and calibration purposes, with the goal of creating a dynamic fear-driven gameplay experience in the final game. Biometric data collected from various test scenes is analyzed to fine-tune how the game responds to the player’s emotional state in real time.
​
Phase 1: Test Scene Data Collection
​
Step 1: Data Collection
The system continuously monitors the player’s physiological responses using two key sensors:
- Galvanic Skin Response (GSR) Sensor: Measures skin conductivity, which increases when the player sweats more. This provides a direct indicator of stress or fear by outputting a voltage that reflects sweat levels.
- Heart Rate Sensor (MAX30105): Detects heartbeats by tracking voltage spikes. By calculating the time intervals between these spikes, it determines beats per minute (BPM). A significant increase in BPM signals heightened anxiety.
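The BPM calculation described above, turning spike timestamps into a rate via inter-beat intervals, can be sketched as follows (the function name and input format are assumptions for illustration):

```python
def bpm_from_spikes(spike_times_s):
    """Estimate beats per minute from the timestamps (in seconds) of
    detected voltage spikes, using the mean inter-beat interval."""
    if len(spike_times_s) < 2:
        return None  # need at least two spikes to measure an interval
    intervals = [b - a for a, b in zip(spike_times_s, spike_times_s[1:])]
    mean_interval = sum(intervals) / len(intervals)
    return 60.0 / mean_interval
```

Averaging over several intervals smooths out jitter from single noisy detections.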
​
Step 2: Baseline Establishment
Baselines for heart rate and GSR are established during the initial calibration scene and periodically recalibrated in neutral calibration scenes placed between fear-inducing scenes.
- Heart Rate Baseline: Average BPM recorded during calm conditions.
- GSR Baseline: Average skin conductivity when the player is relaxed.
These baselines serve as reference points to detect elevated stress responses. Thresholds for fear detection are dynamically adjusted based on the baselines.
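A minimal sketch of the calibration step, computing both baselines from a calm scene and deriving thresholds relative to them; the margin multipliers are illustrative assumptions:

```python
def calibrate(samples, hr_margin=1.15, gsr_margin=1.20):
    """Compute resting baselines from calibration-scene samples and derive
    fear-detection thresholds as fixed multiples of those baselines."""
    base_bpm = sum(s["bpm"] for s in samples) / len(samples)
    base_gsr = sum(s["gsr"] for s in samples) / len(samples)
    return {
        "bpm": base_bpm,
        "gsr": base_gsr,
        "bpm_threshold": base_bpm * hr_margin,
        "gsr_threshold": base_gsr * gsr_margin,
    }
```

Because the thresholds are expressed relative to each player's own baseline, the same margins adapt automatically to players with naturally high or low resting rates.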
​
Step 3: Real-Time Monitoring and Thresholding
- Heart Rate Monitoring: Current BPM is compared to the baseline. A significant increase signals a fear response.
- GSR Monitoring: A spike in GSR voltage beyond the baseline threshold indicates heightened stress.
- Dynamic Threshold Application: The system applies a dynamic threshold to reduce false positives caused by brief fluctuations.
​
Step 4: Calibration Scenes for Reset
Neutral calibration scenes placed between fear-specific scenes serve to:
- Allow the player's physiological state to return to baseline.
- Collect updated baselines for heart rate and GSR.
- Recalibrate thresholds for accurate fear detection in subsequent scenes.
This reset process ensures that residual stress from a previous fear-inducing scene does not carry over into subsequent scenes, yielding clean, precise data for fine-tuning the system.
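The periodic recalibration could be implemented as a rolling window that is refilled during each neutral scene, so older, possibly stress-contaminated samples age out. The window size here is an arbitrary illustrative choice:

```python
from collections import deque

class BaselineTracker:
    """Rolling baseline that is updated during neutral calibration scenes;
    old samples fall out of the window as new calm samples arrive."""

    def __init__(self, window=30):
        self.samples = deque(maxlen=window)

    def add_calibration_sample(self, bpm):
        self.samples.append(bpm)

    @property
    def baseline(self):
        return sum(self.samples) / len(self.samples)
```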
​
Phase 2: Unity Fear Detection and Scene-Specific Triggers
The Unity Fear Manager Script receives real-time biometric data from the Python script and compares it to the threshold. When the data exceeds the threshold, the system triggers fear events based on the type of stimulus in the current scene.
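One plausible shape for the Python-to-Unity handoff is a small JSON datagram per sample sent over UDP on localhost; the port number, field names, and function name are assumptions for this sketch, not the project's confirmed protocol:

```python
import json
import socket

def send_biometrics(sock, bpm, gsr, addr=("127.0.0.1", 5065)):
    """Serialize one biometric sample as JSON and send it to the game
    over UDP. Port 5065 and the message shape are illustrative choices."""
    payload = json.dumps({"bpm": bpm, "gsr": gsr}).encode("utf-8")
    sock.sendto(payload, addr)
    return payload
```

UDP suits this use case because each sample supersedes the last: a dropped datagram costs nothing, and there is no connection state to manage between the sensor script and the game.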
Fear Categories Tested in Scenes
1. Sound Fear
Sound manipulation techniques are applied to create an unsettling auditory environment:
- Slowed Sound: Reduces playback speed, making ambient noises ominous.
- Reversed Sound: Plays familiar sounds backward to induce disorientation.
- Reverb: Adds unnatural echo, making sounds feel distant or hollow.
When the player’s biometric data surpasses the threshold, a sound fear event is logged, and the intensity of sound manipulation increases.
​
2. Visual Fear
Four types of glitches are tested to simulate digital or analog errors:
- Camera Glitch: Distorts the camera with abrupt perspective shifts.
- Pixel Glitch: Pixelates parts of the screen into blocky artifacts.
- Analog Glitch: Mimics old TV static and horizontal line distortions.
- Digital Glitch: Introduces tearing and color shifts to simulate corrupted video.
If two or more glitches occur simultaneously, a combination glitch event is logged, escalating visual disturbance. For example:
- Camera + Pixel Glitch causes simultaneous shaking and pixelation, intensifying disorientation.
​
3. Color Fear
Post-processing effects dynamically alter the game’s color palette:
- Chromatic Aberration: Creates color fringes around objects.
- Bloom: Intensifies bright areas, creating a glowing effect.
- Color Grading: Shifts the overall tone to unsettling hues.
- Lens Distortion: Warps the screen, bending it at the edges.
- Auto Exposure: Causes rapid brightness fluctuations.
When multiple effects are applied together (e.g., Chromatic Aberration + Bloom), a combination color fear event is triggered, heightening visual tension.
​
4. Model Skew Fear
This scene focuses on real-time deformation of 3D objects based on the player’s stress levels:
- Skewing: Stretches objects along one axis.
- Twisting: Rotates objects around a central axis.
As biometric data surpasses the threshold, the system intensifies skewing and twisting, distorting familiar objects and creating a disorienting environment.
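The stress-to-deformation mapping could be as simple as scaling the amount by which the biometric reading exceeds the threshold, clamped to a maximum. All constants below are illustrative assumptions:

```python
def skew_intensity(current, threshold, max_excess=40.0, max_skew=0.5):
    """Map how far a biometric reading exceeds its threshold onto a
    deformation factor in [0, max_skew]; constants are illustrative."""
    excess = max(0.0, current - threshold)
    return min(excess / max_excess, 1.0) * max_skew
```

The resulting factor would feed the shader or vertex transform that performs the actual skewing and twisting in-engine.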
​
Step 5: Feedback Loop and Recovery
After a fear event is triggered, the system continuously monitors the player’s biometric data to detect when they return to baseline. Once the data normalizes:
- The game initiates a recovery phase, gradually reducing the intensity of fear stimuli.
- This ensures a balanced experience by alternating between periods of intense fear and calm, preventing the player from becoming overwhelmed.
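The gradual reduction could be a per-tick geometric decay of the stimulus intensity once the player's readings have normalized; the decay rate and floor below are illustrative assumptions:

```python
def recovery_step(intensity, decay=0.9, floor=0.01):
    """One tick of the recovery phase: decay the fear-stimulus intensity
    geometrically toward zero, snapping to zero below a small floor."""
    intensity *= decay
    return 0.0 if intensity < floor else intensity
```

Geometric decay gives a fast initial drop followed by a gentle tail, so the scene relaxes noticeably without cutting off abruptly.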
​
Phase 3: Integration into Actual Gameplay
​
Step 6: Dynamic Gameplay Adaptation
- Fear Profiling: The system analyzes biometric data collected from the test scenes to build a fear profile for each player. This profile identifies which stimuli (sound, visual, color, or model skew) are most effective at inducing fear. During gameplay, this profile guides the dynamic adaptation of the environment.
- Dynamic Stimulus Triggering: In actual gameplay, the system continuously monitors biometric data. When the player's stress level exceeds the threshold:
  - If sound fear was dominant in testing, slowed, reversed, or echoed sounds are introduced.
  - If visual glitches were most effective, they are triggered in combination to heighten disorientation.
  - Color changes and model skewing are dynamically applied to increase tension in moments of heightened anxiety.
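Selecting which stimuli to trigger from the fear profile reduces to ranking the categories by measured response strength; the score format and function name here are assumptions for illustration:

```python
def dominant_fears(profile_scores, top_n=2):
    """Rank stimulus categories by how strongly the player responded in
    testing and return the most effective ones to drive adaptation."""
    ranked = sorted(profile_scores, key=profile_scores.get, reverse=True)
    return ranked[:top_n]
```

During gameplay, a fear event would then pull effects from the returned categories first, falling back to the others only for variety.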
Conclusion
The test scenes and calibration processes are essential for refining the system before its implementation in the actual game. By combining real-time biometric data processing with dynamically triggered fear stimuli, the system creates a highly immersive and personalized horror experience. The inclusion of calibration scenes ensures precision in fear detection throughout gameplay, while the subdivision of fears—sound, visual, color, and model skew—offers varied and unpredictable stimuli. This biofeedback-driven approach enhances tension, ensuring that the final game delivers a heightened, responsive, and engaging horror experience, tailored to each player’s unique emotional state.