ETH Zürich
2 Mar 2024
Job Information
- Organisation/Company
- ETH Zürich
- Research Field
- Computer science » Other
- Engineering » Electrical engineering
- Engineering » Other
- Researcher Profile
- First Stage Researcher (R1)
- Country
- Switzerland
- Application Deadline
- 29 May 2024 – 21:59 (UTC)
- Type of Contract
- Temporary
- Job Status
- Full-time
- Hours Per Week
- 41
- Is the job funded through the EU Research Framework Programme?
- Not funded by an EU programme
- Is the Job related to staff position within a Research Infrastructure?
- No
Offer Description
PhD student in Multi-modal Input Decoding from Wearable Sensors (Human-Computer Interaction, Augmented Reality)
The Sensing, Interaction & Perception Lab at ETH Zurich is looking for another PhD student. Our research will focus on multi-modal signal processing for learning-based input decoding. Signals will originate from embedded sensors on wearable devices, including IMUs (motion sensors), eye gaze, microphones, EMG (myography), and PPG (optical heart-rate sensors). We will build novel learning-based recognizers and interaction techniques based thereon, and we will study how users interact through these novel interfaces.
Suitable backgrounds
- electrical engineering/signal processing
- computational interaction/human-computer interaction
- machine learning on time series
Key requirements for your application
Owing to the high number of generic applications we receive, please note that:
- We will only respond to applications for this role submitted through this platform. Applications through email will be ignored.
- Your motivation letter must address the specifics of this position, especially how your experience relates to multi-modal signal processing and/or ML-based time-series processing for input decoding.
- We will not be able to advance candidates to an interview if no experience with signals or sensors is evident.
Project background
Multi-modal signal processing is gaining traction for interactive purposes thanks to a multitude of sensor modalities in today’s mobile and wearable devices. As wearable technology becomes more prevalent, there’s a clear need to improve interaction mechanisms. Wearable devices, with their integrated sensors, offer a platform to detect and interpret human behavior and actions.
The research conducted as part of the PhD will focus on the analysis of multi-modal signals from the sensors in mobile and wearable devices. Candidates will work on novel processing methods and interaction techniques, contributing to the evolution of user-device interactions.
Applications of the developed interactions and detection techniques will be in Augmented Reality, Virtual Reality, and mobile computing in general.
Job description
- Literature review of machine learning techniques for signal and time-series analysis (specifically IMU, EMG, PPG, microphone, gaze), including Multivariate Time Series Analysis, Sequence Modeling, Dimensionality Reduction, Anomaly Detection, Temporal Pattern Recognition, Feature Extraction and Engineering, and Prediction Models.
- Method development for machine learning-based signal processing, including predictive coding, contrastive learning, augmentation approaches, and multimodal learning from auxiliary inputs.
- Technique development in computational interaction (input decoding, Bayesian decoding, hidden Markov models, combinatorial optimization).
- Review and further development of state-of-the-art efforts in active learning, transfer learning with user-specific finetuning, and online learning.
- Experiment design for validating the methods, both offline on recorded datasets and online in empirical user studies.
- Present research findings at academic conferences and seminars, engaging with the wider scientific community for feedback and knowledge exchange.
- Collaboration with others on the research team to integrate developed methods into broader project objectives, contributing to shared goals, interdisciplinary learning, and applications that involve end users and patients.
- Later on: explore practical applications of the novel methods in Augmented Reality and mobile scenarios (e.g., in combination with Microsoft HoloLens or Apple Vision Pro).
Your profile
ETH requirements:
- written and spoken fluency in English
- an excellent master’s degree (MSc, MEng, or equivalent) in Computer Science or Electrical Engineering
Requirements for the position:
- strong interpersonal and communication skills
- experience in signal processing or machine learning for time series
- understanding of multimodal data processing
- optional: sensing/electrical engineering background for any combination of {IMU, EMG, gaze, PPG, microphone} signal acquisition
- optional: Computational Interaction/Bayesian Input Decoding
- optional but useful: experience in 3D programming (Unity/C#, Unreal Engine, …)
Prior experience in conducting user evaluations is useful but not a must.
We offer
We offer an exciting environment and team to study in and work with. Beyond the lab, ETH Zürich has several internationally recognized research groups dedicated to interactive systems, Human-Computer Interaction, AR/VR, and machine learning. In our research, we often collaborate with other groups and departments as well as with several other institutions and companies in Switzerland and abroad.
Working, teaching and research at ETH Zurich
We value diversity
In line with our values, ETH Zurich encourages an inclusive culture. We promote equality of opportunity, value diversity and nurture a working and learning environment in which the rights and dignity of all our staff and students are respected. Visit our Equal Opportunities and Diversity website to find out how we ensure a fair and open environment that allows everyone to grow and flourish.
Curious? So are we.
Please submit your complete application through the online application portal:
- a short motivation letter (≤ 1 page)
- curriculum vitae (PDF)
- university transcript of records (bachelor’s and master’s)
- short overview of your experiences with multi-modal signals
- short overview of your experiences with machine learning
- if available: short overview of your experiences in computational interaction
- contact details of 1–2 academic referees
- a link to your GitHub profile and/or your portfolio/website
Please note that we exclusively accept applications submitted through our online application portal. Applications via email or postal services will not be considered.
Applications will be evaluated on a rolling basis. The position remains open for as long as this job ad is listed, and the ad will be delisted once the position has been filled. We are looking to fill the position as soon as possible, with a start date in Spring 2024 (and well before Fall 2024).
If your questions are not answered in this post, please direct them to [email protected].
About ETH Zürich
ETH Zurich is one of the world’s leading universities specialising in science and technology. We are renowned for our excellent education, cutting-edge fundamental research and direct transfer of new knowledge into society. Over 30,000 people from more than 120 countries find our university to be a place that promotes independent thinking and an environment that inspires excellence. Located in the heart of Europe, yet forging connections all over the world, we work together to develop solutions for the global challenges of today and tomorrow.
Requirements
- Research Field
- Computer science
- Years of Research Experience
- 1 – 4
- Research Field
- Engineering
- Years of Research Experience
- 1 – 4
Additional Information
- Website for additional job details
- https://academicpositions.com
Work Location(s)
- Number of offers available
- 1
- Company/Institute
- ETH Zürich
- Country
- Switzerland
- City
- Zurich
- Postal Code
- 8006
- Street
- Rämistrasse 101
Where to apply
- Website
- https://academicpositions.com/ad/eth-zurich/2024/phd-student-in-multi-modal-inp…
Contact
- City
- Zurich
- Website
- https://ethz.ch/en.html
- Postal Code
- 8006