Research

My research interests are primarily in robotic grasping and manipulation. I am interested in designing and integrating robots and tactile sensors that leverage the sense of touch to perform tasks more effectively. I believe robots will be useful in our homes, assisting us with everyday tasks.

See a complete list of my publications

A GIF image showing a simulated robot on the left side inserting a peg and on the right side a real robot inserting the same peg in real life.

IndustReal: Transferring Contact-Rich Assembly Tasks from Simulation to Reality

Bingjie Tang*, Michael A. Lin*, Iretiayo Akinola, Ankur Handa, Gaurav S. Sukhatme, Fabio Ramos, Dieter Fox, Yashraj Narang

Robotic assembly is a longstanding challenge, requiring contact-rich interaction and high precision and accuracy. Many applications also require adaptivity to diverse parts, poses, and environments, as well as low cycle times. In other areas of robotics, simulation is a powerful tool to develop algorithms, generate datasets, and train agents. However, simulation has had a more limited impact on assembly. We present IndustReal, a set of algorithms, systems, and tools that solve assembly tasks in simulation with reinforcement learning (RL) and successfully achieve policy transfer to the real world. Specifically, we propose 1) simulation-aware policy updates, 2) signed-distance-field rewards, and 3) sampling-based curricula for robotic RL agents. We use these algorithms to enable robots to solve contact-rich pick, place, and insertion tasks in simulation. We then propose 4) a policy-level action integrator to minimize error at policy deployment time. We build and demonstrate a real-world robotic assembly system that uses the trained policies and action integrator to achieve repeatable performance in the real world. Finally, we present hardware and software tools that allow other researchers to fully reproduce our system and results.
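The policy-level action integrator can be understood as accumulating the policy's incremental corrections onto a persistent target, rather than re-issuing each action relative to the current measured state, so small steady-state errors are driven out over time. A minimal 1-D sketch (the function names, gain, and toy policy are illustrative, not taken from the paper):

```python
import numpy as np

def run_with_action_integrator(policy, initial_target, steps, step_scale=0.01):
    """Accumulate the policy's incremental actions onto a persistent target.

    Instead of commanding `current_state + action` at every step (which can
    let steady-state error persist), the integrator keeps its own target
    and adds each scaled action to it.
    """
    target = float(initial_target)
    targets = []
    for _ in range(steps):
        action = policy(target)        # policy outputs a small correction
        target += step_scale * action  # integrate the correction
        targets.append(target)
    return targets

# Toy policy: always push the target toward a goal at 1.0
toy_policy = lambda t: np.sign(1.0 - t)
traj = run_with_action_integrator(toy_policy, initial_target=0.0, steps=50)
```

With the toy policy above, the integrated target ratchets monotonically toward the goal, which is the behavior the integrator is meant to provide at deployment time.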

A GIF image showing a robot arm with whisker sensors moving between free-standing objects and tracking light contacts along their contours.

Whisker-Inspired Tactile Sensing for Contact Localization on Robot Manipulators

Michael A. Lin, Emilio Reyes, Jeannette Bohg and Mark Cutkosky

This work presents the design and modelling of whisker-inspired sensors that attach to the surface of a robot manipulator to sense its surroundings through light contacts. We obtain a sensor model through a calibration process that applies to both straight and curved whiskers. We then propose a sensing algorithm that uses Bayesian filtering to localize contact points, combining the robot's accurate proprioceptive sensing with sensor readings from the deflections of the whiskers. Our results show that the algorithm tracks contact points with sub-millimeter accuracy, outperforming a baseline method. Finally, we demonstrate the sensor and perception method in a real-world system in which a robot moves between free-standing objects and uses the whisker sensors to track contacts while tracing object contours.
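The Bayesian filtering step can be illustrated with a simple histogram filter over candidate contact positions along the whisker: a prior belief is multiplied by the likelihood of the measured deflection under each candidate position, then renormalized. The linear deflection model and noise level below are stand-ins for the calibrated whisker model in the paper:

```python
import numpy as np

def bayes_update(belief, positions, measured_deflection, deflection_model, sigma=0.05):
    """One Bayesian measurement update for contact location along a whisker.

    belief:           prior probability over candidate contact positions
    deflection_model: maps a contact position to an expected deflection
                      (a stand-in for the calibrated model in the paper)
    """
    expected = deflection_model(positions)
    likelihood = np.exp(-0.5 * ((measured_deflection - expected) / sigma) ** 2)
    posterior = belief * likelihood
    return posterior / posterior.sum()

# Illustrative model: deflection grows linearly toward the whisker tip
positions = np.linspace(0.0, 1.0, 101)                   # normalized arc length
belief = np.full_like(positions, 1.0 / len(positions))   # uniform prior
belief = bayes_update(belief, positions, measured_deflection=0.6,
                      deflection_model=lambda p: p)
estimate = positions[np.argmax(belief)]                  # MAP contact location
```

Repeating this update as the robot moves, with the belief propagated using the arm's proprioception, is the essence of the tracking scheme; the actual sensor model and motion update are more involved.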

A GIF image showing a robot gripper moving inside a cabinet and making contact with four spice jars quickly, and then grasping one of the spice jars.

Exploratory Hand: Leveraging Safe Contact to Facilitate Manipulation in Cluttered Spaces

Michael A. Lin, Rachel Thomasson, Gabriela Uribe, Hojung Choi and Mark R. Cutkosky

We present a new gripper and exploration approach that uses an exploratory finger with very low reflected inertia to probe and grasp objects quickly and safely in unstructured environments. Equipped with sensing and force control, the gripper allows a robot to leverage contact information to accurately estimate object locations with a particle filtering algorithm, and to grasp objects under location uncertainty using a contact-first approach. This publication is still under review, so it is not yet available.
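A particle filter for this setting maintains a cloud of hypotheses about the object's location and reweights them whenever a contact is felt. A minimal 1-D sketch (the Gaussian measurement model, noise values, and seed are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_update(particles, weights, contact_position, noise=0.01):
    """Reweight and resample object-location particles after a contact is
    felt at `contact_position` (simple Gaussian model, illustrative only)."""
    likelihood = np.exp(-0.5 * ((particles - contact_position) / noise) ** 2)
    weights = weights * likelihood
    weights /= weights.sum()
    # Resample to concentrate particles where the object likely is,
    # with a little jitter to keep the particle set diverse
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    particles = particles[idx] + rng.normal(0.0, noise / 5, size=len(particles))
    weights = np.full(len(particles), 1.0 / len(particles))
    return particles, weights

# Start with high uncertainty over a 0.5 m span, then feel a contact at 0.30 m
particles = rng.uniform(0.0, 0.5, size=2000)
weights = np.full(2000, 1.0 / 2000)
particles, weights = particle_filter_update(particles, weights, contact_position=0.30)
estimate = particles.mean()
```

After one contact the particle cloud collapses near the contact point, which is why even brief, safe touches are informative enough to plan a grasp under uncertainty.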

A robot arm with a custom designed 2-DOF wrist that wears a white sleeve with soft pneumatic sensors. The robot is reaching into a fridge drawer to retrieve a pear.

A Stretchable Tactile Sleeve for Reaching into Cluttered Spaces

Alexander M. Gruebele, Michael A. Lin, Dane Brouwer, Shenli Yuan, Andrew Zerbe and Mark R. Cutkosky

We present a highly conformable, stretchable sensory skin made entirely of soft components. The skin uses pneumatic taxels and stretchable channels to conduct pressure signals to off-board MEMS pressure sensors. It resolves forces down to 0.01 N and responds to vibrations up to 200 Hz. We apply the skin to a 2-degree-of-freedom robotic wrist with intersecting axes for manipulation in constrained spaces, and show that it has sufficient sensitivity and bandwidth to detect the onset of sliding as the robot contacts objects. We demonstrate the skin in object acquisition tasks in a tightly constrained environment where extraneous contacts are unavoidable.

Previous Work

A user wearing a HoloLens inserting a needle into phantom tissue.

HoloNeedle: Augmented Reality Guidance System for Needle Placement Investigating the Advantages of Three-dimensional Needle Shape Reconstruction

Michael A. Lin, Alexa F. Siu, Jung Hwa Bae, Mark R. Cutkosky and Bruce L. Daniel (2018)

An augmented reality guidance system for needle placement in tissue using needle shape reconstruction and sensing.

A user wearing a HoloLens looking at holograms of 3D medical imaging of a patient's breast overlayed on the patient

A Mixed-Reality System for Breast Surgical Planning

Stephanie L. Perkins, Michael A. Lin, Subashini Srinivasan, Amanda J. Wheeler, Brian A. Hargreaves and Bruce L. Daniel (2017)

We have developed a mixed-reality system that projects a 3D “hologram” of images from a breast MRI onto a patient using the Microsoft HoloLens. The goal of this system is to reduce the number of repeated surgeries by improving surgeons’ ability to determine tumor extent. We are conducting a pilot study in patients with palpable tumors that tests a surgeon’s ability to accurately identify the tumor location via mixed-reality visualization during surgical planning.

Four figures: bottom left showing a medical needle with strain sensors embedded, top left shows the experiment target, top right a haptic device that displays needle forces to the user and middle shows the entire experiment setup.

Display of Needle Tip Contact Forces for Steering Guidance

Jung Hwa Bae, Christopher J. Ploch, Michael A. Lin, Bruce L. Daniel and Mark R. Cutkosky (2016)

An MR-compatible biopsy needle stylet is instrumented with optical fibers that provide information about contact conditions between the needle tip and organs or hard tissues such as bone or tumors. This information is rendered via a haptic display that uses ultrasonic motors to convey directional cues to users. Lateral haptic cues at the fingertips improve the targeting accuracy and success rate in penetrating a prostate phantom.

Photo of a haptic device attached to a Phantom Omni haptic manipulator and a user holding the device.

The Effect of Manipulator Gripper Stiffness on Teleoperated Task Performance

Michael A. Lin, Samuel B. Schorr, Iris Yan and Allison M. Okamura (2015)

The absence of environment force sensing in robot-assisted minimally invasive surgery makes it challenging for surgeons to apply controlled forces without damaging patient tissue. One way to help modulate grip force is to use a passive spring that resists the closing of the master-side gripper of the teleoperated system. To investigate the effect of this spring stiffness, we developed a haptic device that can render programmed gripper stiffnesses. We conducted a study in which subjects used the device to teleoperate a Raven II surgical robotic system in a pick-and-place task. We found that increasing the gripper stiffness reduced the forces applied at the slave side.