GRASP REU Site Oral Presentations – Summer 2012

August 9, 2012 @ 1:00 pm - 3:00 pm

GRASP REU Site Oral Presentations
Thursday, August 9, 2012
Wu and Chen Auditorium
1:00 – 3:00 p.m.

Welcome by Katherine J. Kuchenbecker and Max Mintz, GRASP REU Site Co-Directors

1:00 p.m.     Stephen Dodds

Rising Senior in Electronics Engineering at the University of Nebraska at Omaha

Advised by Dr. Vijay Kumar; mentored by Dr. Ed Steager and Denise Wong

Developing Tracking Algorithms For Multiple Microrobots

Abstract: Microrobotics is an exciting research area, made possible by recent technological advances in microstructure manufacturing and observation. Tracking the microrobots is an essential element of quantitative observation and control. Most existing tracking systems for microstructures are limited to tracking only one object, or robot, at a time. In this paper I introduce two methods for tracking multiple microrobots. Contour tracking was implemented first and its limitations were explored; feature tracking was implemented second to address some of those limitations. I show that feature tracking remains accurate even when one robot overlaps another, when a robot drifts partially off the screen, or when clusters of bacteria collide with the robot, and that contour tracking becomes less accurate under these conditions.
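As a rough illustration of the first method, here is a minimal contour-tracking sketch in Python with OpenCV, assuming grayscale microscope frames in which robots appear darker than the background; all function and parameter names are illustrative, not taken from the presented work:

```python
# Minimal sketch of contour-based multi-robot tracking (illustrative,
# not the presented implementation). Assumes dark robots on a light
# background in a grayscale frame.
import cv2
import numpy as np

def track_by_contours(frame, prev_centroids, max_jump=30.0):
    """Threshold the frame, extract contour centroids, and match each
    previous robot position to its nearest new centroid."""
    _, mask = cv2.threshold(frame, 0, 255,
                            cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 0:
            centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    # Greedy nearest-neighbor association: this is exactly what breaks
    # down when robots overlap or leave the frame, motivating the
    # feature tracker described in the abstract.
    updated = []
    for px, py in prev_centroids:
        if not centroids:
            updated.append((px, py))  # keep last known position
            continue
        d = [np.hypot(cx - px, cy - py) for cx, cy in centroids]
        i = int(np.argmin(d))
        updated.append(centroids.pop(i) if d[i] < max_jump else (px, py))
    return updated
```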

1:15 p.m.     Lowell Fluke

Rising Junior in Applied Mathematics at Harvard University

Advised by Dr. Kostas Daniilidis and Dr. Vijay Kumar; mentored by Nicu Stiurca and Dr. Koushil Sreenath

Autonomous Quadrotor Navigation With Visual Servoing

Abstract: Among miniature aerial vehicles (MAVs), quadrotor helicopters are becoming the most popular research platform due to their versatility and their ease of maintenance and design. I consider the problem of autonomously flying a quadrotor using one on-board camera, with the goal of performing tasks such as flying through an opening and picking up objects. Autonomous quadrotor flight using a monocular on-board camera is useful for exploration, rescue, and surveillance in unknown environments, and this paper explores the image-based visual servoing (IBVS) control scheme. While previous approaches have attempted to estimate the position of the MAV before control is implemented, or to build a 3D model of the environment for planning, IBVS executes real-time control within the image space of the on-board camera. While IBVS is faster and less computationally expensive, it has disadvantages, such as the need to maintain a view of the desired object. I perform MATLAB simulations to explore IBVS on a quadrotor and propose methods for testing this system.
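For context, the core of classic IBVS (following Chaumette and Hutchinson) fits in a few lines; this sketch uses Python rather than MATLAB, and the feature depths and gain are illustrative assumptions:

```python
# Minimal sketch of the classic IBVS control law v = -lambda * L^+ * e,
# which drives image-feature error to zero directly in image space.
import numpy as np

def interaction_matrix(x, y, Z):
    """2x6 interaction (image Jacobian) matrix for a normalized
    image point (x, y) at depth Z."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def ibvs_velocity(features, desired, depths, gain=0.5):
    """Camera velocity command (vx, vy, vz, wx, wy, wz) computed from
    current and desired normalized image features."""
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(features, depths)])
    e = (np.asarray(features) - np.asarray(desired)).ravel()
    return -gain * np.linalg.pinv(L) @ e

# Illustrative usage: four tracked points, all assumed 2 m away.
v = ibvs_velocity(features=[(0.1, 0.1), (-0.1, 0.1), (0.1, -0.1), (-0.1, -0.1)],
                  desired=[(0.2, 0.2), (-0.2, 0.2), (0.2, -0.2), (-0.2, -0.2)],
                  depths=[2.0] * 4)
```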

1:30 p.m.     Rocky Foster

Rising Senior in Mathematics at the University of Maryland, Baltimore County

Advised by Dr. Alejandro Ribeiro; mentored by Jim Stephan

Robust Fluctuating Computer Networks with Application to Robotic Systems

Abstract: In this paper, maintaining link-to-link and end-to-end communications over a stochastic network of robots is discussed. Using predetermined probabilities gathered from previous trials, simulations are developed to test methods and models of end-to-end communication systems. This research centers on a variation of TCP/UDP_IP communication systems. The TCP/UDP_IP communication arises from the wireless communication of the robots and the question of how to deal with packet loss and transfer. The variation comes in because the probabilities must be recalculated frequently rather than assumed to form a static system.
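A toy Monte Carlo sketch of the underlying idea, with illustrative link probabilities and an assumed random-walk drift model rather than anything from the presented simulations, might look like this:

```python
# Toy sketch: end-to-end delivery over a chain of robot-to-robot links
# whose success probabilities drift over time, so they must be tracked
# and re-estimated rather than treated as a static system.
import random

def simulate(p_links, n_packets=10000, drift=0.02):
    """Estimate end-to-end delivery rate while each link probability
    performs a small bounded random walk between packets."""
    delivered = 0
    p = list(p_links)
    for _ in range(n_packets):
        if all(random.random() < pi for pi in p):
            delivered += 1
        # Links fluctuate; a real system would re-estimate these
        # probabilities on a frequent basis, as the abstract notes.
        p = [min(0.99, max(0.5, pi + random.uniform(-drift, drift)))
             for pi in p]
    return delivered / n_packets

print(simulate([0.95, 0.9, 0.97]))  # observed rate for a 3-hop chain
```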

1:45 p.m.     Tre Glover

Rising Sophomore in Computer Engineering at the University of Maryland, Baltimore County

Advised by Dr. Camillo J. Taylor; mentored by David Isele

Optimization Methods with Landmark Based Robot Navigation

Abstract: This paper investigates the ability to navigate from one position to another based on common landmarks visible from both positions. A scheme generates the direction in which to travel between the two positions using the landmarks. A mapping method also enables the navigation of longer distances by linking multiple local navigation processes. To reduce the distance traveled and to make the process more resilient to common errors, two other methods were implemented. Together, these methods form a single algorithm for landmark-based navigation.
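One standard way to recover the relative displacement between two positions from landmarks seen in both is a 2-D rigid alignment (Kabsch/Procrustes); the sketch below illustrates that idea and is not necessarily the scheme used in this work:

```python
# Sketch: align the same landmarks as observed from two positions to
# recover the rigid transform relating the two viewpoints.
import numpy as np

def relative_pose(landmarks_a, landmarks_b):
    """Given the same N landmarks in frame A and frame B (N x 2 arrays),
    return R and t with B ~= R @ A + t; t encodes the relative
    displacement of the two viewpoints."""
    A = np.asarray(landmarks_a, float)
    B = np.asarray(landmarks_b, float)
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)          # 2x2 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:           # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cb - R @ ca
    return R, t
```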

2:00 p.m.     Raven Hooper

Rising Sophomore in Electrical and Computer Engineering at Temple University

Advised by Dr. Katherine J. Kuchenbecker; mentored by Heather Culbertson

Real-Time User Interface for the Haptic Camera

Abstract: When you drag a tool across an object, you can feel many of the properties of its texture, including roughness, hardness, bumpiness, and stickiness. The vibrations of the tool are responsible for a large portion of the feel of an object. The feel of real objects can be captured with the Haptic Camera, which records normal force, scanning speed, and tool vibrations as the user drags the tool across the textured surface. The Haptic Camera device consists of a magnetic tracking sensor, a force-torque sensor, and two 2-axis accelerometers held together in a pen-like covering. The tracker has six degrees of freedom and uses x, y, z coordinates to find the position of the tool. The ATI Nano17 force sensor measures the normal force of the Haptic Camera when it is pressed downward on an object. To obtain accurate data recordings, users need to record vibrations from the textures at several scanning speeds of up to 400 mm/s and several normal forces of up to 4 N. However, in the previous system, the user was not provided with any real-time indication of the speeds and forces used during the recording. It was impossible to tell whether the user had sufficiently varied scanning speed and normal force until after the recording sessions, which increased the amount of time and data required. I designed a feedback screen that plots scanning speed versus normal force. To provide a second visual indicator of force, an LED was attached to the ATI Nano17 force sensor so that it lights up according to how hard the user presses downward with the Haptic Camera.
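A minimal sketch of such a feedback screen, with a hypothetical read_sample() standing in for the real sensor stream, could be built with matplotlib:

```python
# Sketch of a live speed-versus-force display; read_sample() is a
# placeholder for the Haptic Camera's actual data source.
import random
import matplotlib.pyplot as plt

def read_sample():
    """Placeholder: return one (speed in mm/s, force in N) measurement."""
    return random.uniform(0, 400), random.uniform(0, 4)

plt.ion()
fig, ax = plt.subplots()
ax.set_xlim(0, 400)
ax.set_ylim(0, 4)
ax.set_xlabel("scanning speed (mm/s)")
ax.set_ylabel("normal force (N)")
for _ in range(200):
    speed, force = read_sample()
    ax.plot(speed, force, ".", color="tab:blue")
    plt.pause(0.01)  # the growing scatter shows which speed/force
                     # combinations the user has not yet covered
```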

2:15 p.m.     Liz Miller

Rising Sophomore in Robotics Engineering at Worcester Polytechnic Institute (WPI)

Advised by Dr. Vijay Kumar; mentored by Yash Mulgaonkar

Optimizing Quadrotor Performance through Modeling and Scaling

Abstract: With recent work in unmanned autonomous vehicles focusing mostly on computer vision and environment mapping, this project focuses specifically on improving quadrotor frames through material and scale modeling. Making quadrotors structurally optimal is important for increasing performance and agility. The research primarily involved modeling the Hummingbird quadrotor frame and assigning variables to represent scale factors. Using the relationships between moment of inertia, scale, and material selection, designers gain a better sense of how to construct the best frame for responsive quadrotors.
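The flavor of these scaling relationships can be seen in a back-of-the-envelope sketch: under uniform geometric scaling by a factor s with a fixed material, mass grows as s^3 and moment of inertia as s^5, so agility per unit torque falls quickly with size. The baseline number below is illustrative, not a Hummingbird specification:

```python
# Back-of-the-envelope scaling: I = integral of r^2 dm, so scaling all
# lengths by s (same material) scales dm by s**3 and r**2 by s**2,
# giving I proportional to s**5.
def scaled_inertia(I0, s):
    """Moment of inertia of a uniformly scaled frame (same material)."""
    return I0 * s ** 5

I0 = 2.0e-3  # kg m^2, illustrative baseline frame inertia
for s in (0.5, 1.0, 2.0):
    print(f"scale {s}: I = {scaled_inertia(I0, s):.2e} kg m^2")
```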

2:30 p.m.     Chase Pair

Rising Senior in Mechanical Engineering at the University of Texas at Arlington

Advised by Dr. Vijay Kumar; mentored by Justin Thomas and Dr. Koushil Sreenath

Obtaining Tension Measurements for Dynamic Quadrotor Load Sharing and Control

Abstract: To enable dynamic control of the position and attitude of a shared load carried by two quadrotors, we focus on obtaining measurements of the tension in the cables connecting each quadrotor to the load. Toward this end, we show that the combined system of two quadrotors and cables is differentially flat and use this property to design dynamic trajectories. This will enable aggressive maneuvers while carrying the load, such as swinging the load and the quadrotors through obstacles.
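In the static, planar special case, the cable tensions follow from a 2x2 equilibrium solve; the geometry and mass below are illustrative, while the talk's differentially flat treatment generalizes this to full dynamic trajectories:

```python
# Planar statics sketch: for a point load hung from two cables, the
# tensions satisfy T1*u1 + T2*u2 = m*(a + g), where u1 and u2 are unit
# vectors from the load toward each quadrotor.
import numpy as np

def cable_tensions(u1, u2, mass, accel=(0.0, 0.0), g=9.81):
    """Solve the 2x2 system for (T1, T2) in a plane
    (x horizontal, y vertical)."""
    U = np.column_stack([u1, u2])
    rhs = mass * (np.asarray(accel) + np.array([0.0, g]))
    return np.linalg.solve(U, rhs)

u1 = np.array([-0.5, np.sqrt(3) / 2])  # 60 deg up-left toward quad 1
u2 = np.array([0.5, np.sqrt(3) / 2])   # 60 deg up-right toward quad 2
print(cable_tensions(u1, u2, mass=0.5))  # ~2.83 N each for a 0.5 kg load
```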

2:45 p.m.     Julie Walker (in absentia)

Rising Junior in Mechanical Engineering at Rice University

Advised by Dr. Katherine J. Kuchenbecker; mentored by Will McMahan and Jennifer Hui

Haptic Feedback for Remote Palpation

Abstract: During open procedures, surgeons often palpate tissue with their fingers to determine its stiffness and explore for lumps of harder material, such as tumors. However, surgeons performing minimally invasive laparoscopic or robotic surgery use long thin tools, so they cannot directly touch the patient with their fingers. This project presents a fingertip haptic feedback device that seeks to replicate some of the touch sensations felt during palpation. The design focuses on replicating the feel of tissue stiffness (normal force), texture (vibrations), and the sensation of making and breaking contact with surfaces. These signals are measured by a sensorized palpation device, such as the SynTouch Biomimetic Tactile (BioTac) sensor. While holding the user’s finger in place, the feedback device uses a motor and a linkage to lift a surface up to the fingertip and apply the measured forces and vibrations.  Simultaneously, the force distribution detected by the sensor is displayed visually on a computer screen. In this way, the device allows one to see and feel the difference between hard and soft materials remotely. Attaching the touch sensor to a robotic surgery instrument and the feedback device to the control console would give surgeons the ability to palpate without directly touching the patient.
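The core of the rendering loop can be pictured as a simple force-to-displacement mapping; everything in this sketch, including the virtual-spring gain, is a hypothetical placeholder rather than the device's actual control law:

```python
# Hypothetical sketch: reproduce a sensed fingertip normal force by
# commanding a linkage displacement through a virtual-spring model.
def motor_command(sensed_force_n, k_spring=200.0, max_travel_m=0.01):
    """Convert a sensed normal force (N) into a linkage displacement (m)
    via x = F / k, clipped to the device's travel range."""
    x = sensed_force_n / k_spring
    return max(0.0, min(max_travel_m, x))

print(motor_command(1.5))  # 1.5 N -> 0.0075 m of linkage travel
```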


Many thanks to the advisors, mentors, colleagues, staff, GRASP Lab, and larger Penn community for helping make the first year of the GRASP REU Site such a success.  We are especially indebted to Charity Payne and Robert Parajon for their excellent work in running the program this summer.  Congratulations to all eight 2012 Participants!
