Labs & Technologies

Explore our different labs and technological platforms


ASIL: ….

Maker Space: …. 

Multisensory displays – Spatial audio

Speakers: ASIL is equipped with a room-encompassing audio system consisting of 80 Martin Audio speakers. Under the hood, this system is powered by 12 OTTOCANALI 4K4 DSP+D amplifiers and a 64-channel Barco IOSONO core, synced with a 48 kHz Nanosync. The Dante protocol allows very flexible routing of all sound sources and destinations within the audio network.

Spatial audio playback is supported up to 7th-order ambisonics (IEM Plug-in Suite, SPAT, …), alongside wave field synthesis (Barco IOSONO), binaural audio (tracked open-back headphones) and room reverb simulation (EVERTims).
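To illustrate what ambisonic encoding involves, here is a minimal first-order sketch (the lab supports up to 7th order; first order keeps the example short). The ACN channel order and SN3D normalisation are assumptions, as is the hypothetical `encode_foa` helper — none of this is the IEM or SPAT API:

```python
import math

def encode_foa(sample, azimuth_deg, elevation_deg):
    """Encode one mono sample into first-order ambisonics.

    Assumes ACN channel order [W, Y, Z, X] and SN3D normalisation.
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    w = sample                                # omnidirectional component
    y = sample * math.sin(az) * math.cos(el)  # left-right dipole
    z = sample * math.sin(el)                 # up-down dipole
    x = sample * math.cos(az) * math.cos(el)  # front-back dipole
    return [w, y, z, x]

# A source straight ahead (azimuth 0, elevation 0) only excites W and X.
print(encode_foa(1.0, 0.0, 0.0))  # → [1.0, 0.0, 0.0, 1.0]
```

A real decoder then maps these channels onto the 80-speaker layout; the encoding step above is what plugins such as the IEM suite perform per source.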

Virtual Reality: two wireless HTC Vive Pro headsets allow untethered free roaming on an 8 m by 8 m surface, enabling compelling and interactive virtual reality experiences. In addition to the two wireless systems, ASIL has 2 spare wired HTC Vive Pro kits and 2 HTC Vive Pro Eye kits at its disposal.

Augmented Reality: two Microsoft HoloLens 2 headsets allow for an untethered mixed reality experience.

Projection: an acoustically transparent projection screen, 7 m wide by 4 m high, combined with a Barco 7k4 4K projector, allows for an encompassing visual experience in ASIL.

Motion Tracking

Full-body Motion Capture: both the ASIL and Maker Space labs are equipped with a Qualisys motion capture system. The ASIL system consists of 14 fixed infrared and 4 fixed RGB cameras, allowing a precision of approximately 1 mm in an 8 m by 8 m by 3 m volume, while the Maker Space uses 9 Miqus cameras for a 5 m by 4 m space. Furthermore, both systems are extendable with 4 extra mobile infrared cameras and 1 extra mobile RGB camera, allowing even more precise tracking in certain regions.
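As a hypothetical illustration of working with streamed marker data, the sketch below measures the distance between two labelled markers — for instance to check a calibration wand of known length against the roughly 1 mm tracking precision. The marker names and coordinates are invented, and this is not the Qualisys SDK:

```python
import math

def marker_distance(a, b):
    """Euclidean distance between two 3D marker positions (coordinates in mm)."""
    return math.dist(a, b)

# Hypothetical frame: two wand markers nominally 500 mm apart.
frame = {"wand_a": (100.0, 200.0, 1500.0), "wand_b": (600.0, 200.0, 1500.0)}
print(marker_distance(frame["wand_a"], frame["wand_b"]))  # → 500.0
```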

Eye gaze tracking in VR: Vive Pro Eye precision eye tracking powered by Tobii. It tracks and interprets eye movements to enable lifelike interactions, better manage GPU workload, and simplify input and navigation.

Eye gaze tracking in XR: HoloLens 2 enables a new level of context and human understanding within the holographic experience by giving developers access to information about what the user is looking at.
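Raw gaze samples are usually segmented into fixations and saccades before analysis. A minimal dispersion-threshold sketch in the spirit of the I-DT algorithm — the threshold value, coordinates and helper name are all hypothetical:

```python
def is_fixation(gaze_points, max_dispersion=0.02):
    """Return True if gaze samples stay within a dispersion threshold.

    gaze_points: list of (x, y) normalized gaze coordinates.
    max_dispersion: (max_x - min_x) + (max_y - min_y) limit, I-DT style.
    """
    xs = [p[0] for p in gaze_points]
    ys = [p[1] for p in gaze_points]
    dispersion = (max(xs) - min(xs)) + (max(ys) - min(ys))
    return dispersion <= max_dispersion

steady = [(0.500, 0.500), (0.505, 0.498), (0.502, 0.503)]
saccade = [(0.10, 0.10), (0.60, 0.55)]
print(is_fixation(steady))   # → True
print(is_fixation(saccade))  # → False
```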


Electroencephalography (EEG): hyperscanning (dual-subject) setup with 2 x 64-channel EEG systems by ANT Neuro, plus an Emotiv EPOC X.
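Hyperscanning analyses often relate the two subjects' recordings channel by channel. A minimal sketch computing the Pearson correlation between two matched channels — the sample values are invented, and real pipelines would of course work on filtered, epoch-aligned data:

```python
import math

def pearson(x, y):
    """Pearson correlation between two equal-length signals."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical samples from the same channel on two subjects.
subject_a = [1.0, 2.0, 3.0, 4.0]
subject_b = [2.0, 4.0, 6.0, 8.0]
print(round(pearson(subject_a, subject_b), 6))  # → 1.0
```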

fNIRS: the fNIRS setup allows for larger body movement than traditional neuroimaging techniques, making it a particularly effective tool in pediatric research.

EKG: monitor heart rate using custom sensors (real-time) or a biometric belt (…)

Breathing: a chest compression and expansion sensor determines breathing frequency.

GSR: monitors sweating activity that is reflective of the intensity of our emotional state, otherwise known as emotional arousal.
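The biosensors above deliver raw signals that are typically reduced to summary measures. For example, heart rate can be derived from detected EKG R-peak timestamps — a minimal sketch, with invented peak times and a hypothetical helper name:

```python
def heart_rate_bpm(r_peak_times):
    """Mean heart rate (beats/min) from R-peak timestamps in seconds."""
    intervals = [b - a for a, b in zip(r_peak_times, r_peak_times[1:])]
    mean_rr = sum(intervals) / len(intervals)  # mean R-R interval in seconds
    return 60.0 / mean_rr

# R-peaks every 0.8 s correspond to 75 beats per minute.
print(round(heart_rate_bpm([0.0, 0.8, 1.6, 2.4, 3.2]), 3))  # → 75.0
```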

Sensors and scanners

Matterport 3D environment scanning: 3D space capture that transforms real-life spaces into immersive digital twin models.

3D object scanning: create 3D scans of static objects for use in VR and XR.

Azure Kinect Volumetric Scanning: create 3D scans of moving objects as a timeline of point clouds
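One way to represent such a volumetric recording is as a sequence of timestamped point-cloud frames. The sketch below shows that structure and derives the capture frame rate from the timestamps — the data layout and helpers are assumptions for illustration, not the Azure Kinect SDK:

```python
from dataclasses import dataclass

@dataclass
class PointCloudFrame:
    """One captured frame: a timestamp plus a list of (x, y, z) points in metres."""
    timestamp_s: float
    points: list

def frame_rate(frames):
    """Average frame rate of a recorded point-cloud timeline."""
    span = frames[-1].timestamp_s - frames[0].timestamp_s
    return (len(frames) - 1) / span

# Hypothetical 3-second recording at 30 fps with a single static point.
recording = [PointCloudFrame(t / 30.0, [(0.0, 0.0, 1.0)]) for t in range(90)]
print(round(frame_rate(recording)))  # → 30
```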

Force plates: ….


Audio Recording:

Musical instruments

EMS Synthi 100


Old-school equipment






Technical backend

Acoustically controlled labs: both labs (ASIL & Maker Space) are acoustically treated to reach a theoretical reverberation time of 0.5 seconds, the equivalent of a professional recording studio. The acoustic treatment focuses on reducing reflections so spatial audio sounds at its best.
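The quoted reverberation time can be related to room volume and absorption via Sabine's formula, RT60 = 0.161 · V / A, where A sums each surface's area times its absorption coefficient. A sketch with a hypothetical 8 x 8 x 3 m room and invented absorption values (not the labs' actual measurements):

```python
def sabine_rt60(volume_m3, surface_absorptions):
    """Sabine reverberation time: RT60 = 0.161 * V / A.

    surface_absorptions: iterable of (area_m2, absorption_coefficient) pairs.
    """
    a = sum(area * alpha for area, alpha in surface_absorptions)  # total absorption
    return 0.161 * volume_m3 / a

# Hypothetical 8 x 8 x 3 m room (192 m^3) with assumed coefficients.
surfaces = [
    (64.0, 0.30),  # floor
    (64.0, 0.30),  # ceiling
    (96.0, 0.25),  # walls: (8 + 8 + 8 + 8) * 3 m
]
print(round(sabine_rt60(192.0, surfaces), 2))  # → 0.5
```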

Flexible trussing system: ASIL has a flexible trussing system, making custom setups possible.

Synchronisation: ….

High-speed network: all computers and devices are linked over 10 Gbit copper connections, allowing fast transfer of large data streams between devices.
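For a feel of what the link rate means in practice, a back-of-the-envelope sketch — the 90% efficiency figure and the file size are assumptions, not measurements:

```python
def transfer_time_s(size_gb, link_gbit_s=10.0, efficiency=0.9):
    """Approximate time to move a file over the lab network.

    size_gb: payload size in gigabytes (1 GB = 8 Gbit).
    efficiency: assumed fraction of line rate achieved after protocol overhead.
    """
    return size_gb * 8.0 / (link_gbit_s * efficiency)

# A hypothetical 45 GB capture session at 90% of 10 Gbit/s takes about 40 s.
print(transfer_time_s(45.0))  # → 40.0
```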

Interconnectivity: connected to Urgent, Zwijnaarde, Minard, Vooruit and Wintercircus.

Internal data storage: GDPR-compliant, high-performance data storage with off-site backup.

Computing cluster: high-performance server for statistical analysis and machine learning in R, Matlab and Python.


Hardware: The Art and Science Interaction Lab has been engineered as a modular research facility for interaction research. Considerable effort has gone into creating a stable, integrated technological ecosystem that enables a wide variety of technological systems to be combined and to interact. Although we don't want to limit ourselves to particular technologies, the matrix on the right depicts the current technological ecosystem in place in the Art and Science Interaction Lab.

Partners: ….

Blackbox, Focusrite RedNet, Qualisys, Barco, IOSONO, Martin Audio, ANT Neuro, Emotiv, RME, AKG, HTC, Microsoft HoloLens, Azure Kinect


Software: Ableton, Max/MSP, Matlab, R, Steam, Unity, Unreal, Houdini

Protocols: OSC, SMPTE, Dante, LSL, …
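As an example of one of these protocols, an OSC message can be packed with nothing but the standard library: address and type-tag strings are null-terminated and padded to 4-byte boundaries, and numeric arguments are big-endian. A minimal sketch for a single float argument (the address is hypothetical):

```python
import struct

def osc_message(address, value):
    """Pack a minimal OSC message with one float32 argument."""
    def pad(s):
        # OSC strings are null-terminated, then padded to a multiple of 4 bytes.
        b = s.encode("ascii") + b"\x00"
        return b + b"\x00" * (-len(b) % 4)
    return pad(address) + pad(",f") + struct.pack(">f", value)

msg = osc_message("/fader/1", 0.5)
print(len(msg))  # → 20  (12-byte address + 4-byte type tags + 4-byte float)
```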