Yasemin Bekiroglu
Yasemin Bekiroglu is a Researcher in the Automatic Control research group. She completed her Ph.D. at the Royal Institute of Technology (KTH) in 2012. Her research focuses on data-efficient learning from multisensory data for robotics applications. She received the Best Paper Award at the IEEE International Conference on Robotics and Automation for Humanitarian Applications (RAHA) in 2016 and the Best Manipulation Paper Award at the IEEE International Conference on Robotics and Automation (ICRA) in 2013, and was a finalist for the CoTeSys Cognitive Robotics Best Paper Award at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) in 2013. She serves as a reviewer for robotics conferences and journals.

Showing 35 publications
Simultaneous Tactile Exploration and Grasp Refinement for Unknown Objects
Visual and Tactile 3D Point Cloud Data from Real Robots for Shape Modeling and Completion
Benchmarking Protocol for Grasp Planning Algorithms
Dynamic grasp and trajectory planning for moving objects
Shape Modeling based on Sparse Gaussian Process Implicit Surfaces
Evaluating the Quality of Non-Prehensile Balancing Grasps
A Database for Reproducible Manipulation Research: CapriDB - Capture, Print, Innovate
Towards advanced robotic manipulation for nuclear decommissioning
Hierarchical Fingertip Space: A Unified Framework for Grasp Planning and In-Hand Grasp Adaptation
Probabilistic Consolidation of Grasp Experience
Analytic Grasp Success Prediction with Tactile Feedback
Active Exploration Using Gaussian Random Fields and Gaussian Process Implicit Surfaces
Learning Predictive State Representation for In-Hand Manipulation
Learning to Disambiguate Object Hypotheses through Self-Exploration
Hierarchical Fingertip Space for Synthesizing Adaptable Fingertip Grasps
Grasp Moduli Spaces, Gaussian Processes and Multimodal Sensor Data
Learning of Grasp Adaptation through Experience and Tactile Sensing
Grasp Moduli Spaces and Spherical Harmonics
What's in the Container? Classifying Object Contents from Vision and Touch
Enhancing Visual Perception of Shape through Tactile Glances
Predicting Slippage and Learning Manipulation Affordances through Gaussian Process Regression
A Probabilistic Framework for Task-Oriented Grasp Stability Assessment
Grasp Stability from Vision and Touch
Learning Task- and Touch-based Grasping
Integrating Grasp Planning with Online Stability Assessment using Tactile Sensing
Learning Tactile Characterizations Of Object- And Pose-specific Grasps
Assessing grasp stability based on learning and haptic data
Joint Observation of Object Pose and Tactile Imprints for Online Grasp Stability Assessment
Learning grasp stability with tactile data and HMMs
Learning grasp stability based on haptic data
Showing 1 research project
Dexterous robot assistant for everyday physical object manipulation