Team Formation and Project Ideas
Project 1: Project Ideas and Team Formation

 

Please use this page to form your teams for project 1. Each team should have four people with no more than one Industrial Design student.

  • If you want to join a project already listed, place your name and email under the project description.
  • If you have your own project idea, or have already formed a team around your own idea, please follow the template below and add your project and team members to this wiki page.
  • If you are an industrial design student, please include (ID) after your name.
  • If a project already has four people, but you are still interested, please add your name under the Alternates section - in case someone else switches to another project.
  • If more than four people are interested in a topic, it is possible to form two groups who will each pursue different aspects of that project idea.  

 

NOTE: When you edit this page, you will lock out all other members of the class until you save your edits. Please edit quickly and save, and don't bogart the editing lock! Do not hit edit and then start composing your project abstract. Please write your content offline, and then hit edit and paste it into this wiki.

 

  • Do not "steal the lock" from another editor. Please wait your turn.
  • Never delete another student's name.
  • If the project is full, add your name to the end of the list.
  • ID students: Please put (ID) next to your name, regardless of your sign-up position.
  • If you remove your name from a project, please move everyone below you up one spot.

 


Title: ubiDroid

 

Abstract: Android has emerged as the most popular software platform for mobile phones. Its success lies in an innovative design that decouples application development from the system-level software and hardware platform. Technical features such as a register-based Java virtual machine and kernel-level changes to Linux accommodate a mobile device's inherent power and resource constraints, and this has spurred an explosion of applications in the Android market. However, one aspect is missing: Android is designed around applications running on a single device, even though it now powers several other product segments, including Google TV, Google Glass, and Google's driverless cars. This project addresses the challenges of letting Android applications utilize nearby devices and cross device boundaries for enhanced, immersive user experiences.
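
As a rough illustration of the first challenge (discovering nearby devices an application could borrow), here is a minimal Python sketch assuming a simple UDP-broadcast announcement scheme; the real project would implement something like this inside Android's application framework, and the port and message format here are invented:

    import json
    import socket

    PORT = 50007  # arbitrary port chosen for this sketch

    def announce(name, capabilities):
        """Broadcast this device's name and capabilities to the LAN."""
        msg = json.dumps({"name": name, "caps": capabilities}).encode()
        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(msg, ("<broadcast>", PORT))
        s.close()

    def listen(timeout=5.0):
        """Collect announcements from nearby devices for `timeout` seconds."""
        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        s.bind(("", PORT))
        s.settimeout(timeout)
        nearby = {}
        try:
            while True:
                data, addr = s.recvfrom(4096)
                nearby[addr[0]] = json.loads(data.decode())  # who offers what
        except socket.timeout:
            pass
        finally:
            s.close()
        return nearby

    # Example: a tablet announces its display so a phone app could extend onto it.
    # announce("living-room-tablet", ["display", "speakers"])
    # print(listen())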

 

We believe that the project can be extended to develop next generation multi-device applications as project 2 for the course.

 

Team Members:

1. Ketan Bhardwaj : ketanbj AT gatech.edu

2. Dipanjan Sengupta : dsengupta6 AT gatech.edu

3. Tuan Anh Nguyen : tuananh AT cc.gatech.edu 

4. Susmita Gorai : sgorai3 AT gatech.edu

 

Alternate Team Members:

5. 

6. 

 

 

Title: Dog to Human Interactions 1: FIDO (Facilitating Interactions for Dogs with Occupations)

 

Sponsor: Melody Jackson melody.moore@cc.gatech.edu , Thad Starner thad@gatech.edu , Clint Zeagler clintzeagler@gatech.edu

 

 

Abstract: This project will focus on using textile and off-the-shelf interfaces to explore uses for on-dog body input. It should include a small dog user interface test for each of an array of input types. For more information please contact any of the sponsors.

 

Team Members:

1. Sarah Eiring (ID): sjeiring@gatech.edu

2. Alex Martin: amartin37@gatech.edu

3. Yash Kshirsagar: yash.ksagar@gatech.edu

4. Adil Delawalla: adelawalla@gatech.edu

 


Title: Dog to Human Interactions 2: FIDO (Facilitating Interactions for Dogs with Occupations)

 

Sponsor: Melody Jackson melody.moore@cc.gatech.edu , Thad Starner thad@gatech.edu , Clint Zeagler clintzeagler@gatech.edu

 

 

Abstract: This project will focus on using textile and off-the-shelf interfaces to explore uses for on-dog body input. It should include a small dog user interface test for each of an array of input types. For more information please contact any of the sponsors.

 

Team Members:

1. Ryan Hollis: rhollis7@gatech.edu

2. Vincent Martin: vincent.martin@gatech.edu

3. Wendy Blount: wendyblount@yahoo.com

4. Giancarlo Valentin: giancarlo@gatech.edu

 


Title: Dog Activity Recognition: FIDO (Facilitating Interactions for Dogs with Occupations)

 

Sponsor: Melody Jackson melody.moore@cc.gatech.edu , Thad Starner thad@gatech.edu , Clint Zeagler clintzeagler@gatech.edu

 

 

Abstract: This project will focus on using a wearable device to capture dog motions and activities and on recognizing certain activities as input. We expect the group will use accelerometers to collect such data, but the group might also use electronic textile techniques.

 

Team Members:

1. Joseph Halton: jhalton3@gatech.edu

2. "Brian" Tan Sun: briansun@gatech.edu 

3. Saagar Patel: sp00252@gatech.edu

4. Minsik Bang: mbang3@gatech.edu 

 

 


Title: Wearable Sensors to Detect Suit Clearance and Pressure

Sponsor: NASA

Mentor Names: Mai Lee Chang, Rajulu Sudhakar, Christopher Reid  

Mentor Orgs: SF3 – Habitability and Human Factors 

Mentor Emails: mai.l.chang@nasa.gov; sudhakar.rajulu-1@nasa.gov; christopher.r.reid@nasa.gov    

Mentor Phones: Chang (281-483-068); Sudhakar (281-483-3725); Reid (281-483-7811)

 

Abstract: 

Background

Currently, crewmembers suffer spacesuit injuries to the shoulders, upper back, chest, and other parts of the body. Most of the injuries occur during training, such as extravehicular activity (EVA) training in the Neutral Buoyancy Lab (NBL). The roles of suit size and body size need to be better understood in order to appropriately design future spacesuits. One approach is to measure the clearance and pressure between the body and the suit.

 

Problem Statement

The primary goal is to design wearable sensors to detect space clearance in a spacesuit for the following body parts listed in order of preference (most to least):  shoulder region, back, arms, chest, legs, and neck. The secondary goal is to design wearable sensors to detect the pressure between the suit and the body parts previously listed. The sensors must be able to detect the clearance (and pressure) during both static and dynamic positions. In addition, the design must be non-invasive (does not interfere with the crewmember’s movements/activities) and safe to use. Data can be post-processed.

 

Preliminary Requirements

-        Be able to detect space clearance during both static and dynamic positions in the shoulder region, back, arms, chest, legs, and neck.

-        Be non-invasive and safe.

 

Deliverables

The team should deliver and demonstrate a functioning wearable design that detects space clearance in a suit, addressing the problem and requirements stated above. Documentation (including diagrams, pictures, and videos) of the design process should also be provided to allow us to replicate the design.

 

How it relates to NASA’s work

The results of this project will serve as a starting point as our organizations and others design future spacesuits for human spaceflight. The benefits and drawbacks of the wearable prototype will be considered against existing methods. 

 

Team Members:

1. 

2. 

3. 

4. 

 

Alternate Team Members:

5.

6. 

 


Title: Wearable Controls for a Rover

Sponsor: NASA

Mentor Names: Mai Lee Chang, Jennifer Rochlis  

Mentor Orgs: SF2 – Systems Management and Integration; SF3 – Habitability and Human Factors 

Mentor Emails: mai.l.chang@nasa.gov, jennifer.l.rochlis@nasa.gov  

Mentor Phones: Chang (281-483-068); Rochlis (281-483-1718)

 

Abstract: 

Background

Currently, teleoperating a robot involves a stationary work station with multiple computer screens and input devices. New methods for gesture control are under investigation but rely on external viewing systems.  Both paradigms require the operator to remain in a fixed position; however future surface exploration missions may require teleoperating a robot during an extravehicular activity (EVA) using lightweight and mobile equipment.  As astronauts and robots will be working more closely with each other, novel and reliable methods to control robots that allow for greater mobility and less equipment are needed.

 

Problem Statement

Design wearable controls, worn on any part of the body and using any type of sensors, to control a four-wheeled rover-class robot. The rover must be controlled in real time (no post-processing), and the operator must be able to demonstrate navigating the rover through an obstacle course, controlling both the speed and direction of the robot's motion.

 

Preliminary Requirements

-        All controls must be worn on the body (no external cameras, no Kinect).

-        Be able to control two different types of steering:  Ackerman and crab.

-        Navigate the rover through an obstacle course (real or virtual).

 

Deliverables

The team should deliver and demonstrate a functioning wearable controller that navigates a rover through an obstacle course, addressing the problem and requirements stated above. Documentation (including diagrams, pictures, and videos) of the design process should also be provided to allow us to replicate the design.

 

How it relates to NASA’s work

The results of this project will serve as a starting point as our organizations and others design future robotic control systems for human spaceflight. The benefits and drawbacks of the wearable prototype will be considered against existing methods. 

 

Team Members:

1. John Bieniek jbieniek3@gatech.edu

2. Aditya Desai: adesai33@gatech.edu

3. (ID): Simon Turgeon sturgeon3@gatech.edu

4. Ryan O'Shaughnessy ryan.oshag@gatech.edu

 

Alternate Team Members:

5.  

6. 

 


Title: Wearable Controls for Intra-vehicular Activities (IVA) Operations

Sponsor: NASA

Mentor Name: Shelby Thompson 

Mentor Org: SF3 – Habitability and Human Factors Branch 

Mentor Email: Shelby.g.thompson@nasa.gov 

Mentor Phone: 281.244.8701 (office)

 

Abstract: 

Background

In microgravity, it is inconvenient, if not impossible in some cases, to operate traditional input devices (e.g., a mouse). Currently, crewmembers on the International Space Station (ISS) use the touchpad on a laptop to view timelines or procedures for IVA (intra-vehicular activity) operations such as hardware maintenance. This can be very cumbersome and increases the chance for error, since the crewmember must stop what they are doing and go over to the computer to scroll to the next step of a procedure. In the past, our lab has examined several wearable technologies for EVA (extra-vehicular activity) operations, most notably cuff-type interfaces. However, little work has been done with wearable technologies for IVA operations.

 

Problem Statement

Design a stand-alone garment that integrates controls that could be used to operate an interface remotely. The garment could be only controls or could integrate an interface for full autonomous interaction. Such an autonomous garment would be very beneficial to medical operations or maintenance activities.

 

Preliminary Requirements

  • A workable set of controls must be integrated into a wearable garment for human-computer interaction (HCI).
  • Must demonstrate that the controls do not interfere with the crewmember's movements through selection of the least invasive locations (e.g., sleeve, body).
  • A desirable feature would be an integrated wrist display that is lightweight and small, so as not to interfere with arm movements during a task.

 

Deliverables

The team should deliver and demonstrate a functioning garment that addresses the problem and requirements stated above. Documentation (including diagrams, pictures, or video) of the component integration process should also be provided. Ownership of the prototype will be negotiated.

 

How it relates to NASA’s work

This project will serve as a proof of concept for wearable technologies as an appropriate alternative for HCI. This work will benefit the suite of alternative technologies for HCI under investigation, such as gesture and voice commanding.

 

Team Members:

1. Abhishek Nandakumar abhishekn@gatech.edu

2. "Chloe" Hongyu Xie hxie34@gatech.edu

3. (ID): Yoni Kaplan (yoni.kaplan@gatech.edu)

4. Mason Foster mason.foster@gatech.edu

 

Alternate Team Members:

5. Eric K Chiu (ID) echiu3@gatech.edu 

6. Hunter Clarke: hclarke3@gatech.edu 

7. Xiao "Nikita" Xiong: xxiong6@gatech.edu

8. Balasubramanyam "Bala" Ganapathi (balasubramanyam@gatech.edu)

 


 

Title: Tactile Display Garments

Sponsor: NASA

Mentor Names: Oron Schmidt, Cory Simon 

Mentor Org: EV3 – Human Interface Branch

 Mentor Emails: oron.l.schmidt@nasa.gov, cory.l.simon@nasa.gov  

Mentor Phones: Schmidt: 281-244-8471, Simon: 281-483-1722

 

Abstract:  

Background

The Human Interface Branch is exploring novel approaches to presenting information to astronauts in space suits and in shirt-sleeve environments such as the International Space Station. Tactile displays have several attractive benefits (non-visual channel, rapid cognition, spatially distributed), but more research is needed into the methods of presenting information, and the mechanics of placing tactors on the body.

 

Problem Statement

Design and manufacture a shirt, pants, and socks that contain tactors that can be used for tactile display research at the Johnson Space Center. Wires to control tactors should be integrated into the garment, but other electrical components such as the processor, wireless communication module, and battery do not need to be integrated (their placement should be considered).

 

Preliminary Requirements

-       Garments must be designed to ensure that all tactors have the best chance of maintaining contact with the skin and therefore being detected by the wearer.

-       While the final goal is to use this technology in microgravity, the garments must be designed for use in one-G test environments in which the wearer is sitting, lying on his/her back, standing, or walking.

-       The team will need to work with the project mentor to decide on tactor placement, number of wires needed, and target wearer size.

 

Deliverables

The team should deliver four separate components – a shirt, pants, and two socks – that address the problem statement and requirements above. The project mentor will keep the delivered products for use in tactile displays research and development.

 

How it relates to NASA’s work

Once the garments are received, engineers in the Human Interface Branch will integrate wireless communication and control hardware to create fully functional tactile display garments. The garments will then be used to carry out research and development in tactile displays, with the goal of applying knowledge gained to future human spaceflight missions.

 

Team Members:

1.

2. 

3.  

4.

 

Alternate Team Members:

5.

6.


 

Title: In-suit Moisture Management

Sponsor: NASA

Mentor Name: Lindsay Aitchison                                 

Mentor Org: EC5 – Space Suit Systems 

Mentor Email: lindsay.t.aitchison@nasa.gov 

Mentor Phone: 281-483-8657

 

Abstract: 

Background

The primary job of a space suit is to protect astronauts from the harsh environment of space – vacuum, extreme temperatures, radiation, and micro-debris – and it does an excellent job of isolating crew from these external hazards.  However, once you isolate the crew from external factors, you also need to address thermal balance inside the suit.  When working hard, astronauts can generate a lot of heat, which can lead to sweaty skin and undergarments; these in turn can cause bacterial growth and minor skin irritation, including chafing and fingernail loss in the gloves.  The suit currently combats the moisture issue by cooling the crew through conductive heat transfer and by circulating air through the suit via the air inlet at the helmet and air-return ducts at the upper arms and ankles.

 

Problem Statement

Design an elbow length glove/sleeve that can be worn inside a space suit to transport moisture from the hands to the upper arm vent ducts without releasing free water drops into the suit. 

 

Preliminary Requirements

  • The garment must be generic in size – not a custom item for each crewmember.
  • The garment shall not require additional power to move the moisture.
  • The moisture movement system must function in microgravity.
  • The garment must be comfortable to wear continuously for up to 8 hours with intermittent bursts of high-intensity activity.

 

Deliverables

The team should deliver and demonstrate a functioning garment that addresses the problem and requirements stated above. Documentation (including diagrams, pictures, or video) of the component integration process should also be provided. I would like to keep the prototype for user evaluations and future reference.

 

How it relates to NASA’s work

The results of this project will be included in the technical library associated with the requirements for next generation gloves being developed for exploration missions under the High Performance EVA Glove Project.  The results will also be shared with the EMU Program office as an option to address the fingernail delamination issue with the current space suit design.


 

Team Members:

1.

2.

3.

4.

Alternate Team Members:

5.

6.


Title: Garments for Body Position Monitoring and Gesture Recognition

Sponsor: NASA

Mentor Name: Cory Simon 

Mentor Org: EV3 – Human Interface Branch 

Mentor Email: cory.l.simon@nasa.gov 

Mentor Phone: 281-483-1722

 

Abstract: 

Background

Research is currently underway into flexible stretch/bend sensors that can be integrated into textiles. The Human Interface Branch is interested in applying this basic functionality to create garments that detect the wearer’s body position and gestures. Monitoring body position is important for ground-based analysis of spacecraft and habitat design, and gesture recognition may be a useful control mechanism for future Martian or Lunar surface operations.

 

Problem Statement

Integrate multiple stretch/bend sensors and accompanying wiring into a full body garment to provide low profile, comfortable detection of wearer body position. The student team should verify the stretch sensors provide varying electrical output as the wearer moves and, if technically capable, implement wireless communication to a base station. If the team is not able to implement these functions, the mentor will advise the team on wiring needed so the functions can be added after the garment is delivered.

 

Preliminary Requirements

-       Sensors must be integrated in a manner that allows the wearer to lay down, sit, stand, walk, and move comfortably between positions

-       Stretch sensors should be placed at locations which enable detection of the widest variety of body positions

 

Deliverables

The team should deliver and demonstrate a single garment that addresses the problem statement and requirements above. Documentation (including diagrams, pictures, or video) of the component integration process and an explanation of how sensor values correlate to various body positions should also be provided. I would like to keep the prototype for user evaluations and future reference.
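
As a toy illustration of the requested explanation of how sensor values correlate to body positions, here is a hedged Python sketch that maps raw stretch-sensor readings to a joint angle through a calibration table; all readings and angles below are invented:

    import numpy as np

    # Calibration pairs recorded while the wearer holds known elbow angles:
    adc_readings = np.array([210, 340, 480, 630, 770])   # raw sensor values
    joint_angles = np.array([0, 45, 90, 135, 180])       # degrees of flexion

    def reading_to_angle(raw):
        """Linearly interpolate a raw reading into an estimated joint angle."""
        return float(np.interp(raw, adc_readings, joint_angles))

    print(reading_to_angle(555))  # ~112 degrees for this made-up calibration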

 

How it relates to NASA’s work

If necessary, engineers in the Human Interface Branch will integrate wireless communication hardware with the garment to create a functional system. The results of this project will be used for workspace and task evaluations and as a proof of concept for low-profile wearable gesture recognition. 

 

Team Members:

1. Cameron Hord cameron.hord@gatech.edu

2. Norma Easter neaster3@gatech.edu

3. (ID): Mauricio Uruena, juruena6@gatech.edu

4. Sahithya Baskaran, sahithya@gatech.edu

 

Alternate Team Members:

5. 

6. 

 


 

Title: Real Time Fabric Stretch/Shape Visualization for Inflatable Structures

Sponsor: NASA

Mentor Name: Doug Litteken      

Mentor Org: ES2 

Mentor Email: douglas.litteken@nasa.gov 

Mentor Phone: 281-483-0574

 

Abstract: 

Background

Inflatable structures are being developed by NASA and commercial partners to create a unique low-mass, high-volume vehicle for in-space use. These structures use a metallic internal core and a layered fabric exterior, which inflates once in orbit. The exterior layers include a polymer bladder, a woven Kevlar structural layer, Nomex micro-meteoroid protection layers, and Beta cloth atomic-oxygen protection layers. The fabric exterior layers of the structure are folded and packed around the internal cylindrical core before launch. Once in space, the layers are unfolded and the structure is inflated.

 

Problem Statement

The fabric sections of the vehicle are designed and built for nominal inflation. On the ground, with test articles, we can visually see and fix any issues that arise from inflation. In space, however, we cannot walk around the vehicle to fix any snags or problems. Therefore, it is very important that we understand how the vehicle inflates and whether or not it maintains its proper shape, as expected. The crew should be able to monitor the exterior fabric layers of the vehicles with a series of networked sensors to give a real-time visual 3D representation of the vehicle’s shape.

 

Preliminary Requirements

  • Sensors/hardware need to be installed prior to packing and must be able to operate in both the packed and unpacked (inflated) configurations.
  • Similarly, the sensors/visualization should be able to demonstrate both an "unpacking/opening" situation and a fully inflated, damage-monitoring situation.
  • The sensors should be able to pick up stretch of at least 10%.

 

Deliverables

  • Detailed concepts and designs in a written/oral report.
  • A small-scale mockup and demonstration is ideal, but not required.
  • Hardware can be kept by the university, but there may be an opportunity to use it on NASA hardware during an actual test.

 

How it relates to NASA’s work

This sensor visualization system will help the NASA JSC Inflatable Structures project through its numerous tests and evaluations for flight readiness. It will be implemented immediately in ground-based testing and in flight certification tests. It may also be directly applicable to upcoming commercial inflatable structure missions. It has benefits from a structures and crew-safety standpoint that will greatly increase the feasibility and effectiveness of inflatable structures in space.

 

Team Members:

1.

2.

3.

4.

Alternate Team Members:

5.

6.


 

Title: Can You Hear Me Now?

 

Abstract:

Create a system to detect and recognize when a pre-recorded word or phrase is spoken, give non-auditory cues when the phrase is recognized, and help identify where the speaker might be located.  Follow-on work includes miniaturization of the signal-processing algorithms, an analysis of the energy vs. accuracy trade-off, and optimizing for both.  Conductive textiles, distributed microprocessors, and radio technology may also be incorporated during the design and prototyping of the system, potentially building it into an on-body network.
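
One plausible starting point for the detection piece, sketched in Python with invented signals and thresholds; a real detector would likely use more robust features (e.g., MFCCs) than raw-sample correlation, but the pipeline is the same:

    import numpy as np

    def normalize(x):
        x = x - x.mean()
        return x / (np.linalg.norm(x) + 1e-9)

    def detect(template, audio, threshold=0.7):
        """Return sample offsets where the pre-recorded template matches."""
        t = normalize(template)
        hits = []
        for i in range(0, len(audio) - len(t), len(t) // 4):  # 75% overlap hop
            window = normalize(audio[i:i + len(t)])
            if float(np.dot(t, window)) > threshold:
                hits.append(i)  # here: trigger the vibration / visual cue
        return hits

    # Fake test: the "phrase" buried in noise should be found near offset 8000.
    rng = np.random.default_rng(0)
    phrase = rng.standard_normal(2000)
    stream = rng.standard_normal(16000) * 0.1
    stream[8000:10000] += phrase
    print(detect(phrase, stream))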

 

Team Name:

MUChachos

 

Team Members:

1. Leah Denham ldenham3@gatech.edu

2. Taylor Wrobel taylor.wrobel@gatech.edu

3. Jesse Rosalia jesse.rosalia@gatech.edu

4. Atom Raiff araiff3@gatech.edu

 


Title: The exploration of using non-toxic conductive ink as a temporary “tattoo” within ubiquitous computing

Sponsor: Ceara Byrne

Abstract: The intent of the project is to explore the possibilities of using temporary conductive ink "tattoos" within ubiquitous computing.  This implies the reusability of the same "stamp" across multiple people.  Existing research is typically healthcare-related and consists of using temporary tattoos to place sensors, such as RFID, potentiometers, and dielectric elastomeric actuators, on the body.  Alternatively, non-toxic conductive ink has been used in an educational manner for basic electronics, such as lighting LEDs and simple buttons.  An opportunity exists to combine the two areas and explore possibilities including, but not limited to, the realms of:

  • Biometrics / biomechanical detection / health monitoring systems
  • Simple and literal on-body “wearable” circuitry
  • Gesture recognition
  • Virtual interfaces
  • Sign language

 

[Project note:  This project does not have a specific direction or application just yet and is not limited to the ideas provided above.  These ideas were produced in a very short brainstorming session and are intended to be refined.  The project could examine whether the same tattoo provides unique readings from person to person (based on conductivity), thereby providing "unique IDs", or whether it could be used for getting into concerts or football games while providing other personalized functionality.  Another possibility is to build on research by Mika Satomi and Hannah Perner-Wilson on deconstructing various simple circuits into on-body wearable circuits.  Additionally, there are research possibilities for using conductive ink as a means of aiding gesture recognition, virtual interfaces, and sign language.]

 

Team Members:

1. (ID) Ceara Byrne: ceara.byrne@gatech.edu

2. (HCI) William Stuart Michelson: stuart.michelson@gtri.gatech.edu

3. (HCI) Chia-Chi Lee: clee45@gatech.edu

4. (CS) Nisha Lad: nisha.lad@gatech.edu

 

Alternate Team Members:

5. 

6. 


Title: Cycle Atlanta

Sponsor: Dr. Christopher Le Dantec and Dr. Kari Watkins

Abstract: Cycle Atlanta is a smartphone app for recording bicycle trips in Atlanta. When people use the app, they give transportation planners at the City of Atlanta the data they need to make Atlanta a better place to ride. Currently the app provides basic route information to the City; we would like to expand its capabilities to enable different forms of sensing and recording the quality of the ride. Using phone sensors, can we determine road quality and the location of potholes? Are there simple and robust interactions that cyclists can use to flag locations while riding? We are looking for a group to prototype these interactions for possible inclusion in a future version of the app.
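
A hedged sketch of the pothole-sensing idea in Python (window size and threshold are invented and would need tuning on real rides): flag stretches of a trip where vertical-acceleration variance spikes well above the ride's typical roughness:

    import numpy as np

    def pothole_candidates(z_accel, gps_points, window=50, k=4.0):
        """Return GPS points whose acceleration window is unusually rough.

        z_accel    : vertical acceleration samples (m/s^2), one per GPS point
        gps_points : list of (lat, lon) aligned with z_accel
        """
        z = np.asarray(z_accel, dtype=float)
        starts = range(0, len(z) - window, window)
        win_std = {i: z[i:i + window].std() for i in starts}
        baseline = np.median(list(win_std.values()))  # typical road roughness
        return [gps_points[i + window // 2]           # center of rough window
                for i, s in win_std.items() if s > k * baseline]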

 

Team Members:

1. Parisa Khanipour Roshan (khanipour@gatech.edu) 

2. Andrew Darrohn (a_darrohn@gatech.edu)

3. Nate Osborne (nosborne@gatech.edu)

4. Sruthi Padala (spadala3@gatech.edu)

 

Alternate Team Members:

5. Samrat Ambadekar (samrat.ambadekar@gatech.edu)

6. 

 


Title: PauseBuds

Sponsor: Scott Gilliland

Abstract:  Build smart earbuds with the ability to recognize when they're in an ear. Integrate with a smartphone to use the in-ear sensor as an input. For example: both earbuds in = full-volume music; one earbud in = half volume; earbuds out = music paused.
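
The abstract's mapping is essentially a three-state policy; a minimal Python sketch, assuming a hypothetical player object with volume and pause controls:

    def update_playback(left_in, right_in, player):
        """Apply the PauseBuds policy: 2 buds = full, 1 bud = half, 0 = pause."""
        buds_in = int(left_in) + int(right_in)
        if buds_in == 2:
            player.set_volume(1.0)   # both in: full volume
            player.play()
        elif buds_in == 1:
            player.set_volume(0.5)   # one in: half volume
            player.play()
        else:
            player.pause()           # none in: pause playback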

Team Members:

1. Vikram Somu (vs19@gatech.edu)

2. Nathan Bayudan (nbayudan3@gatech.edu)

3. Rae Luetschwager (ID) (rluetschwager3@gatech.edu)

4. Thurston Sandberg (tsandberg3@gatech.edu)

 

Alternate Team Members:

5. 

 

 


 

Title: Buzz-In, Buzz-Out

Sponsor: Scott Gilliland

Abstract:  Build a system to enable the checkout of devices or the temporary activation of equipment using an RFID tag. For example, using a BuzzCard to allow a researcher to check out an Arduino board, or to turn on a table saw only if the person has had training on the table saw.
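
A minimal Python sketch of the authorization logic, with invented tag UIDs and a placeholder enable() hook standing in for whatever relay or lock the real system would drive:

    PERMISSIONS = {
        "90145AB2": {"arduino", "table_saw"},   # hypothetical BuzzCard UIDs
        "90145F77": {"arduino"},
    }

    def on_badge_scan(tag_uid, equipment, enable):
        """Enable `equipment` only if this tag is trained/authorized for it."""
        if equipment in PERMISSIONS.get(tag_uid, set()):
            enable(equipment)        # e.g., close the relay powering the saw
            return True
        return False                 # deny and log; equipment stays off

    print(on_badge_scan("90145F77", "table_saw", enable=lambda e: None))  # False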

Team Members:

1. Se Hoon Shon (sehoon@gatech.edu)

2. Jinhyun Kim (jkim608@gatech.edu)

3. David Leber (dleber3@gatech.edu)

4. Yohanes Suliantoro (ysuliantoro3@gatech.edu)

 

Alternate Team Members:

5. 

6.  

 


 

Title: Microcontroller WiFi

Sponsor: Scott Gilliland

Abstract:  Embedded WiFi modules have gotten much cheaper (in the $15-20 range now), but they are not well supported by open-source libraries. Most modules either contain the entire TCP/IP stack internally (which limits what they can do) or need proprietary support (drivers or documentation) that is not available to the average hardware hacker. Build an Arduino-compatible (or generic SPI-compatible) WiFi shield and write a driver to use that module with an existing microcontroller TCP/IP stack (such as uIP). Scott can provide several hours of guidance on this one, as he wants to see it done right.

Team Members:

1. Abhinav Narain (nabhinav3@gatech.edu)

2.

3.

4.

 

Alternate Team Members:

5.

6.  

 


 

Title: LED Dive Mask interface

Sponsor: Scott Gilliland

Abstract:  Build a waterproof electronic interface to tell dolphin researchers what direction a dolphin sound is coming from. The current idea is to use a strip of individually controlled LEDs around the diver's mask to indicate left/right. This will need to work with some existing hardware, so we can give the group some guidance on integration (such as which watertight connectors to use and how the mask should communicate).
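
Assuming the existing hardware reports a bearing for each detected sound, the mask-side mapping could be as simple as this Python sketch (the strip length and field of view here are invented placeholders):

    NUM_LEDS = 16  # assumed strip spanning -90 (left) to +90 (right) degrees

    def bearing_to_led(bearing_deg):
        """Return the index of the LED to light for a sound at this bearing."""
        clamped = max(-90.0, min(90.0, bearing_deg))
        frac = (clamped + 90.0) / 180.0      # 0.0 = far left, 1.0 = far right
        return min(NUM_LEDS - 1, int(frac * NUM_LEDS))

    print(bearing_to_led(-90), bearing_to_led(0), bearing_to_led(90))  # 0 8 15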

Team Members:

1. Celeste Mason  celeste.m@gatech.edu

2. Miles Gantt       mgantt7@gatech.edu

3. Ali Halim           ahalim7@gatech.edu

4. Scott Franks      sfranks3@gatech.edu

 

 

 


 

Title: Wild Dolphin video camera

Sponsor: Scott Gilliland

Abstract:  Build a waterproof, battery-powered device to float on the surface of the ocean, above some dolphins and dolphin researchers, to capture high-resolution wide-angle video and high-sample-rate audio. This would allow the dolphin researchers to have an 'overhead' view of the interactions with the dolphins for later review. We can help the group by giving them details of the use cases of the device, as well as tips about waterproofing and high-frequency audio capture.

Team Members:

1.

2.

3.

4. 

 

Alternate Team Members:

5. 

6.  

 


Title: Programming ubicomp environments through the lens of augmented reality

 

Abstract:

The user experience and the development environment for applications of the personal computer exist within the 2-dimensional graphical user interface. In contrast, the user experiences ubiquitous computing applications in the 3-dimensional physical world. However, the dominant development environment for these ubicomp experiences remains the 2-dimensional graphical user interface. We propose the use of a tablet-based mixed GUI/augmented reality interface to enable end users to visualize, appropriate, and program input/output devices and services in the environment. The goal of our project is to enable end-users to build ubiquitous computing applications without writing code. 

 

Imagine a user walks into a room that has been set up with various devices (microphone, Kinect sensor, temperature sensor, motion sensor, LCD screen, etc.) and wishes to appropriate these available devices and configure logic and interactions within the space. Using augmented reality as a lens to visualize device state and capabilities, the user would be able to on-the-fly program a number of events (input) and reactions (output) to occur within the environment. 

 

  • We have two other PhD students (not in the class) interested in this problem and they will be working with us on this project.
  • We are interested in publishing this work. We are targeting the UbiComp 2013 - Pervasive and Ubiquitous Computing Conference (http://www.ubicomp.org/ubicomp2013/) deadline at the end of this semester (March 22nd).
  • We are also considering submitting this project to the Convergence Innovation Competition (March 27th).

 

Team Members:

 

1. Aman Parnami (aparnami3@gatech.edu)

2. Gabriel Reyes (greyes@gatech.edu)

3. Haozhe Li (Stan) (haozhe@gatech.edu)

 


Title: Wearable Camera Device for Activity Recognition Applications

Sponsor: Edison Thomaz (ethomaz AT gatech.edu)

Abstract: The goal of this project is to develop a small camera device that could be easily attached to a pair of glasses. The camera should take first-person point-of-view photos at regular intervals and send those photos to a smartphone app using Bluetooth LE. A device like this would support an emerging angle on activity recognition research where human activities are inferred from first-person point-of-view images. One of the primary objectives of this project would be to develop an open platform, in contrast to existing systems that are proprietary (e.g. Looxcie) or not yet available (e.g. Google Glass).

 

Team Members:

1. 

2.

3.

4.

 

Alternate Team Members:

5.

6. 

 


 

Title: Painting by Robots

Sponsor: Prof. Frank Dellaert (frank.dellaert AT cc.gatech.edu)

Abstract: An end-effector for a robot that will allow it to paint. The idea is to use a color printhead from a commercially available inkjet printer and reverse-engineer the control signals that need to be sent in order to produce patterns of colored ink. I have an Arduino-driven robotic arm that can move the print head around, but it would be a separate project to actually put the two together.

 

Team Members:

1. 

2.

3.

4.

 

Alternate Team Members:

5.

6. 

 


 

Title: Scalable Inexpensive Robot Controller PCB

Sponsor: Prof. Frank Dellaert (frank.dellaert AT cc.gatech.edu)

Abstract: I would be very interested in a robot controller + Arduino + Bluetooth combination, in order to create small Arduino-based robots for 3630 in Spring 2013. The ideal PCB that I want does not exist. The goal would be to produce 100 of them by January.

 

Team Members:

1.

2. 

3. 

4. 

 

Alternate Team Members:

5.

6. 

 


 

Title: iPhone Case Robot – Version 1

Sponsor: Prof. Frank Dellaert (frank.dellaert AT cc.gatech.edu)

Abstract: An iPhone case that would double as a small mobile robot. It would have an Arduino-compatible processor, motors and a motor controller, and an interface to the iPhone. To keep things cheap, this could be done through the audio connection. We would 3-D print the prototype case. This could become a Kickstarter after the course.

 

Team Members:

1. 

2. 

3. 

4. 

 

Alternate Team Members:

5.

6. 

 


 

Title: iPhone Robot – Version 2

Sponsor: Prof. Frank Dellaert (frank.dellaert AT cc.gatech.edu)

Abstract: A similar project, but now the phone would be lying down, communicating with a small robot base via its screen, with the Arduino using several phototransistors as the receiving end. The iPhone app would be programmed with Processing.js.
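
One way to make the screen-to-phototransistor link robust is a self-clocking code. Here is a hedged Python sketch of Manchester-encoding bytes into black/white screen frames; the polarity convention and framing are design choices the team would make, not part of the project spec:

    def manchester_frames(data: bytes):
        """Yield 0/1 screen-brightness values, two frames per data bit."""
        for byte in data:
            for bit in range(7, -1, -1):        # most significant bit first
                if (byte >> bit) & 1:
                    yield 1                     # bright half-period...
                    yield 0                     # ...then dark = logical 1
                else:
                    yield 0                     # dark half-period...
                    yield 1                     # ...then bright = logical 0

    print(list(manchester_frames(b"\xA5")))     # 0xA5 = 10100101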

 

Team Members:

1. 

2.

3.

4.

 

Alternate Team Members:

5. 

6. 

 


Title: Environment sensors notify an ambient display at home.

Sponsor: Aware Home Research Initiative

 

Background: Among the myriad electronic devices that exist today, our homes have plenty of brains but very little smarts. Sure, we have thermostats that maintain temperatures and lights that come on and shut off automatically, but we still don't have a device that ties such technologies together with appropriate context and presents them in a meaningful way to the average consumer. Cars have done this sort of thing for a long time: start your engine and your car will gently remind you to put on your seatbelt; engine maintenance due? On comes that notification light; car doors unlocked? It will nag you until you remedy the situation. These notifications don't get in the way of using your vehicle, yet they provide important and useful information without the user having to actively seek it out. We aim to bring similar non-obtrusive yet useful notification and ambient-awareness technology to the home.

 

Abstract: Leverage inexpensive sensor technologies in a home environment to collect ambient data, collate this data, and provide relevant information via simple, unobtrusive notifications on a dedicated device designed to blend seamlessly into the home.
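
A minimal Python sketch of the collate-and-notify layer; the sensor names, thresholds, and messages are all invented placeholders for whatever the team actually deploys:

    RULES = [
        ("water the plants",  lambda s: s.get("soil_moisture", 1.0) < 0.2),
        ("close the windows", lambda s: s.get("outdoor_temp_c", 20) < 5 and s.get("window_open", False)),
        ("air is stuffy",     lambda s: s.get("co2_ppm", 400) > 1200),
    ]

    def ambient_notifications(sensor_state):
        """Return the messages the ambient display should currently show."""
        return [msg for msg, fires in RULES if fires(sensor_state)]

    print(ambient_notifications({"co2_ppm": 1500, "window_open": True}))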

 

Team Members:

1. Mudit Gupta (muditg@gatech.edu)

2. Nitya Noronha (nitya.noronha@gatech.edu)

3. James Hallam jhallam@gatech.edu

4. Philip Smith (psmith44@gatech.edu)

 

Alternate Team Members:

5. 

6. 

 


 

Title: Digikits

Sponsor: Dr. James Clawson (jamer AT cc.gatech.edu)

Abstract:  Design and build an on-body ensemble of wearable devices that enables the subtle sending and receiving of one bit of information. The ensemble should be designed to support private communication between a pair of individuals in public settings. I want a team to build a suite of devices (bracelet, watchband, earrings, necklace, earbud, garment, etc.), all capable of sending and receiving a single signal (vibration, tone, temperature change, etc.). The design challenge is that the sending and receiving of the signal has to be done subtly, without alerting other members of the group. The building challenge is getting the electronics to work in various form factors without making the devices look like "technology"; the devices should look like something a person could comfortably wear. A workshop paper is attached with more details.

 

Team Members:

1.  Perron Jones (pjones35@gatech.edu)

2. JP Hamilton (ID): jphamilton@gatech.edu

3. Lamine Sissoko (lsissoko3@gatech.edu)

4. Sanat Rath, sanat@gatech.edu

 

Alternate Team Members:

5.

6. 

 


 

Title: Errors in touchscreen typing

Sponsor: Dr. James Clawson (jamer AT cc.gatech.edu)

Abstract: Typing on mobile phones with physical buttons is well studied. For this project you will build a key-logging application to run on a touchscreen phone that is capable of outputting log files. You will conduct a user study recruiting 12 participants, each typing for 15-20 twenty-minute typing sessions. You will analyze the logs and look for common patterns in the errors. If we find common patterns, we can compare those errors to the errors made when typing with two thumbs on a physical keyboard.
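
For the log analysis, a standard text-entry metric is the minimum string distance (MSD) error rate between the presented and transcribed phrases; a small self-contained Python sketch (the log format itself is not specified here and is up to the team):

    def edit_distance(a, b):
        """Classic Levenshtein distance via dynamic programming."""
        prev = list(range(len(b) + 1))
        for i, ca in enumerate(a, 1):
            cur = [i]
            for j, cb in enumerate(b, 1):
                cur.append(min(prev[j] + 1,               # deletion
                               cur[j - 1] + 1,            # insertion
                               prev[j - 1] + (ca != cb))) # substitution
            prev = cur
        return prev[-1]

    def msd_error_rate(presented, transcribed):
        dist = edit_distance(presented, transcribed)
        return dist / max(len(presented), len(transcribed))

    print(msd_error_rate("the quick brown fox", "teh quick brwn fox"))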

 

Team Members:

1. Sundararajan Sarangan (sundar@gatech.edu)

2. Amritha Arakali (amritha.arakali@gatech.edu)

3. Sneha Bharath (sbharath3@gatech.edu) 

4. Abhinav Narain (nabhinav3@gatech.edu)

 

Alternate Team Members:

5. 

6. 

 


 

 

Title: Dyslexia detection from users' typing

Sponsor: Dr. James Clawson (jamer AT cc.gatech.edu)

Abstract: For this project you will build a key logging application (or use my existing Blackberry key logger) to run on a mobile phone that is capable of outputting log files. Run a text entry study recruiting dyslexic participants. Analyze the types of mistakes they make. Determine if there are common mistakes. Design an algorithm to automatically detect dyslexic errors.

 

Team Members:

1.Shashank Raghu (sraghu@gatech.edu)

2.

3.

4.

 

Alternate Team Members:

5.

6. 

 


 

 

Title: Neighborhood level air quality sensing

Sponsor: Dr. James Clawson (jamer AT cc.gatech.edu)

Abstract: Investigate the potential of building/purchasing air quality sensors and installing them throughout a small neighborhood in southeast Atlanta (3 miles from Tech). Once the sensing infrastructure is in place, the team will aggregate data from the sensors to produce a real-time air quality map for the neighborhood. The neighborhood is interested in determining whether a large intermodal transit facility nearby is adversely affecting the air quality. This project has financial support from the neighborhood and political support from both the neighborhood and some state officials. Collecting air quality data and presenting those data in a format that can be easily understood by a non-technical audience is the goal of this project.
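
A hedged sketch of turning scattered sensor readings into a neighborhood-wide map value at any query point, using inverse-distance weighting in Python (coordinates and readings below are invented):

    def idw(query, sensors, power=2.0):
        """sensors: list of ((x, y), reading). Returns interpolated reading."""
        qx, qy = query
        num = den = 0.0
        for (x, y), reading in sensors:
            d2 = (x - qx) ** 2 + (y - qy) ** 2
            if d2 == 0:
                return reading               # query sits exactly on a sensor
            w = 1.0 / d2 ** (power / 2)
            num += w * reading
            den += w
        return num / den

    sensors = [((0, 0), 12.0), ((1, 0), 30.0), ((0, 1), 18.0)]
    print(idw((0.5, 0.5), sensors))          # blended air-quality estimate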

 

Team Members:

1. Amrutha Krishnan (akrishnan40@gatech.edu) 

2. Subrai Pai - salil.pai@gatech.edu

3. Sagar Savla

4. Vipul Thakur

 

Alternate Team Members:

 

 


 

Title: Community garden sensing

Sponsor: Dr. James Clawson (jamer AT cc.gatech.edu)

Abstract: For this project, the team will build or purchase moisture sensors to be deployed outdoors for several weeks. The team will aggregate data from the sensors and display status updates to garden members when they enter the garden. The team will also build a location-based messaging app that lets garden members communicate with each other asynchronously in the garden. The app should be able to relay information to the end user, such as "plot 17 needs to be watered", as well as convey messages from gardeners, such as "Sally: I'm out of town until Tuesday, please water my plot if it needs to be taken care of. Thanks!" This project could also focus on the design and deployment of a decay sensor to monitor the status of the community compost pile, which would need to alert the gardeners when the pile becomes too hot or is producing too much methane or other gases. Details of exactly what this sensor needs to do can be determined in the early stages of the project. The compost sensor may, but need not, be included or act as the focus of this project.

 

Team Members:

1.

2.

3.

4.

 

Alternate Team Members:

5.

6. 

 


 

Title: Order Picking Team 1

Sponsor: Prof. Thad Starner

Abstract: About 750,000 warehouses worldwide distribute approximately $1 trillion in goods per year.  About 60% of the total operational cost of these warehouses is order picking.  Most orders are still picked by hand, often using paper pick lists.  In his December 2012 dissertation, Dr. Hannes Baumann showed that using a wearable computer with a head-up display (HUD) could improve the speed of order picking by 40% and virtually eliminate errors.  Using the experimental design from our CHI 2010 paper, directly compare four picking guidance systems: pick-by-HUD, pick-by-light, pick-by-tablet, and pick-by-paper pick list.  The first part of the project will be building the picking systems themselves, based on the previous work.  The second part will be running a 16-participant within-subject user study.  The goal is a new CHI paper.
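
For the 16-participant within-subject study, condition order would normally be counterbalanced; a small Python sketch of a balanced Latin square over the four guidance systems (one standard construction for an even number of conditions, not necessarily the exact design of the CHI 2010 paper):

    CONDITIONS = ["HUD", "light", "tablet", "paper"]

    def balanced_latin_square(n):
        """Rows = condition orderings; works for even n. Across rows, each
        condition appears once per position and precedes each other
        condition equally often."""
        first, lo, hi = [0], 1, n - 1
        while len(first) < n:                 # pattern: 0, 1, n-1, 2, n-2, ...
            first.append(lo); lo += 1
            if len(first) < n:
                first.append(hi); hi -= 1
        return [[(c + r) % n for c in first] for r in range(n)]

    square = balanced_latin_square(len(CONDITIONS))
    for p in range(16):                       # cycle the 4 orders over 16 people
        order = square[p % len(square)]
        print(p + 1, [CONDITIONS[i] for i in order])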

 

NOTE THAT THIS PROJECT CAN BE DIVIDED INTO MANY PROJECTS

POSSIBLE SUBPROJECTS:  

1) implement pick-by-light using microcontrollers  

2) implement Kinect sensing of picks

3) implement pick-by-HUD  

4) re-create user study

 

Team Members:

1. Anhong Guo (guoanhong@gatech.edu)

2. Xuwen Xie (xuwen.xie@gatech.edu)

3. Xiaohui Luo (luoxiaohui@gatech.edu)

4. Shashank Raghu (sraghu@gatech.edu)

 

 

Alternate Team Members:

5. 

6. 

 


Title: Order Picking Team 2

Sponsor: Prof. Thad Starner

Abstract: About 750,000 warehouses worldwide distribute approximately $1 trillion in goods per year.  About 60% of the total operational cost of these warehouses is order picking.  Most orders are still picked by hand, often using paper pick lists.  In his December 2012 dissertation, Dr. Hannes Baumann showed that using a wearable computer with a head-up display (HUD) could improve the speed of order picking by 40% and virtually eliminate errors.  Using the experimental design from our CHI 2010 paper, directly compare four picking guidance systems: pick-by-HUD, pick-by-light, pick-by-tablet, and pick-by-paper pick list.  The first part of the project will be building the picking systems themselves, based on the previous work.  The second part will be running a 16-participant within-subject user study.  The goal is a new CHI paper.

 

NOTE THAT THIS PROJECT CAN BE DIVIDED INTO MANY PROJECTS

POSSIBLE SUBPROJECTS:  

1) implement pick-by-light using microcontrollers  

2) implement Kinect sensing of picks

3) implement pick-by-HUD  

4) re-create user study

 

 

Team Members:

1. Saad Ismail - sismail3@gatech.edu

2. Joseph Simoneau - simoneau@gatech.edu

3. 

4. 

 

Alternate Team Members:

5.

6. 


Title: Fixing the Seams: Urban Infrastructure 

Sponsor: Caleb Southern (caleb.southern AT gatech.edu)

Abstract: Have you ever paid attention to the civic infrastructure that makes our cities run? Most people have not, at least until there is a problem which makes the invisible infrastructure visible and apparent. Develop a smartphone app and back end infrastructure that allows citizens to report infrastructure problems quickly and easily as they observe them while on the go. Infrastructure issues include, but are not limited to, broken streetlights, potholes, clogged storm drains, broken traffic/crosswalk signals and controls, etc. Think about Prof. Starner’s “two-second rule” for quick and easy mobile interactions. Think about the context (storm drains in the rain, streetlights at night). Project 2 could expand this project to work with the City of Atlanta to make this a real and functional mobile service. Check out seeclickfix.com and other previous work, and make sure you add an original contribution that moves the research in this design space forward.

 

Team Members:

1. 

2.

3.

4.

 

 


 

Title: Sonification of Human Movement

Sponsor: Dr. Shinohara - shinohara@gatech.edu

Possible other sponsor: Bruce Walker

Abstract

Come up with some way to create sound through human movement. Create a wearable device with sensors (I'm thinking accelerometers, but someone else might have a better idea) that determine which notes are played, or whether the movement creates a higher pitch or louder volume. It would be cool if someone could wear this thing and then dance around to create some sort of music. Music is subjective here.

I, Erin Hennessy, started this last semester but haven't gotten very far. I need someone with programming and/or electronics expertise. My strengths are: ideas, sewing, and enthusiasm/dancing. I have done some things with Arduino and took Zeagler's Wearable Products class. Dr. Shinohara is in the Applied Physiology department and wants this project to come to fruition. Bruce Walker, director of the Sonification Lab, also wants something for athletes to wear that would provide auditory feedback on their performance. This would be especially helpful for non-sighted athletes.
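
As one concrete, hedged starting point (a Python sketch with placeholder numbers to tune with dancers): map accelerometer magnitude onto a pentatonic scale, so that any movement lands on a consonant note, and let movement energy also drive loudness:

    PENTATONIC = [60, 62, 64, 67, 69, 72]     # MIDI note numbers (C pentatonic)

    def accel_to_note(ax, ay, az, rest=9.8, span=10.0):
        """Map acceleration (m/s^2) to a (midi_note, velocity) pair."""
        magnitude = (ax * ax + ay * ay + az * az) ** 0.5
        energy = min(1.0, abs(magnitude - rest) / span)   # 0 = still, 1 = wild
        note = PENTATONIC[int(energy * (len(PENTATONIC) - 1))]
        velocity = int(40 + energy * 87)                  # MIDI velocity 40-127
        return note, velocity

    print(accel_to_note(0.0, 3.0, 14.0))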

 

Team Members 

1. Erin Hennessy - echennessy@gatech.edu; echennessy@gmail.com. 

2. Brandon Flood - b.flood@gatech.edu

3. Gregory Koutrelakos (CS gkoutrelakos3@gatech.edu)

4. Robert Thackston - rthackston3@gatech.edu

 

 

 

Title: RFID Controlled Outlet (Seeking an ID Major)

Sponsor: Ryan Fahsel (ryan.fahsel@gatech.edu) and Colin Gray (colin.gray@gatech.edu)

Abstract: We would like to control access to power outlets via RFID. This could be used to help limit tool use to only people who are trained on that specific tool. We would be happy to talk more about the project if you have any questions.

 

Deliverables:

-authenticated power outlets with a locking mechanism (to prevent switching plugs)

-RFID authentication system

 

Team Members 

1. Ryan Fahsel - ryan.fahsel@gatech.edu

2. Colin Gray - colin.gray@gatech.edu

3. Ramya Ramakrishnan - rramakrishnan3@gatech.edu

4.

 

 

Title: Tracking Everyday Family Arousal Levels: An Exploration

Sponsor: Siemens Foundation

Mentor Orgs: NSF Expeditions

 

Abstract

Family members face everyday challenges and stresses that can easily go unnoticed and have a different impact on each person’s happiness and physical and mental health. 

 

Here we propose building a wearable sensor and tablet-based video recording system for helping families better monitor, manage, and co-regulate their emotional states. If family members can better anticipate one another's emotional levels, then they may be able to better empathize with and support one another, leading happier, healthier lives.

 

Background

Bracelet-worn electrodermal activity (EDA) sensors have long been used to measure changes in people's non-verbal emotional states, such as laughing; however, the signal is less useful for identifying the type of behavior (happiness looks like embarrassment, etc.). Here we seek to supplement this feedback with synchronized video recording on a tablet to help differentiate these emotional states.

 

If successful, we will be able to provide families with comfortable, wearable sensors to help family members better understand and adapt to each other's emotional states, improving the quality of interactions during everyday activities: reading a book to a child before bed, helping children with homework, and watching TV on the couch.

 

Deliverables

- wearable sensor for measuring emotional arousal levels, i.e., electrodermal activity (EDA)

- tablet based video recording & playback application for reviewing EDA feedback in the context of everyday activities 

 

Team Members:

1. Jon Bidwell: bidwej@gatech.edu

2. Ivan Riobo: ivan.riobo@gatech.edu

3. Cheng Zhang: chengzhang@gatech.edu

4. Emily Keen (ID) : ekeen3@mail.gatech.edu

 

(seeking industrial designer or electrical engineer)

 


Title: Mobile Music Touch: Stenography Learning

Sponsor: David Quigley (CCG)

Abstract: (NOTE: The project group may change this abstract.) This project group is leveraging the Mobile Music Touch infrastructure to help people learn abstract manual dexterity skills. In particular, this group will develop a system that allows a user to watch a video and feel the vibrations corresponding to the words spoken in the video. This will include translating a transcript of a video into the corresponding fingers used to type the words, as well as augmenting the hardware design of the glove to allow for more input to the user.

 

Team Members:

1. Samrat Ambadekar (samrat.ambadekar@gatech.edu)

2.      

3.      

4. 

 

Alternate Team Members:

5. 

6.

 


Title: Mobile Music Touch: Active Piano Practice

Sponsor: David Quigley (CCG)

** Looking for at least one more member **

Abstract: (NOTE: The project group may change this abstract.) This project group is leveraging the Mobile Music Touch infrastructure to help people learn abstract manual dexterity skills. In particular, this group will augment the wearable glove with the ability to sense when a finger presses a key on a piano and to distinguish which finger made the strike. The group will test several possible avenues of detection and determine which are the most feasible and reliable. The primary concerns will be the accuracy of detection and interference for the wearer.

 

Team Members:

1. Xinyan Yan(voidpointer@gatech.edu)

2.                

3. 

4. 

 

Alternate Team Members:

5.

6.


Title: MAGIC Summon

Sponsor: Daniel Kohlsdorf  (dkohlsdorf AT googlemail.com) 

Abstract: The group will develop and evaluate a system that can synthesize motion gestures that will not false trigger in everyday life and present these gestures to the user. To do so, they will collect a database of everyday movements using the Microsoft Kinect and/or accelerometers. Based on that data, they will develop a method based on Symbolic Aggregate approXimation (SAX) [1] that can check for false positives and generate new gestures. The synthesized gestures will then be presented as a "video" to the user; one representation could be a virtual avatar that performs the synthesized gesture.
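
For reference, a tiny Python sketch of the SAX discretization described in [1] (segment count and alphabet size are choices the team would tune; breakpoints shown are the standard equiprobable Gaussian cuts for an alphabet of 4):

    import numpy as np

    BREAKPOINTS = [-0.67, 0.0, 0.67]           # equiprobable cuts, alphabet = 4

    def sax(series, n_segments, alphabet="abcd"):
        x = np.asarray(series, dtype=float)
        x = (x - x.mean()) / (x.std() + 1e-9)  # z-normalize
        paa = [seg.mean() for seg in np.array_split(x, n_segments)]  # PAA
        return "".join(alphabet[int(np.digitize(v, BREAKPOINTS))] for v in paa)

    # Similar movements get similar strings, so a gesture synthesizer can
    # search for strings far from everything in the everyday-life database.
    print(sax([0, 1, 2, 3, 4, 5, 6, 7], 4))    # rising ramp -> "abcd"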

 

Hints:

[1] Eamonn Keogh's SAX page: http://cs.gmu.edu/~jessica/sax.htm/.

[2] Ask me for the unpublished paper where we (Thad and I) did the same thing for touchpad gestures.

[3] False Positive Testing:  Daniel Kohlsdorf, Thad Starner, Daniel Ashbrook: MAGIC 2.0: A Web Tool for False Positive Prediction and Prevention for Gesture Recognition Systems , FG' 11, 2011 .

[4] In case you need to control the avatar and don't know how, there is an introduction to inverse kinematics at: http://freespace.virgin.net/hugo.elias/models/m_ik2.htm.

 

Team Members:

1. 

2. 

3. 

4. 

 

Alternate Team Members:

5. 

6.


Title: Sign language recognition with the Kinect: American Sign Language (ASL) test

Sponsor: Dr. Harley Hamilton  (hjh AT cc.gatech.edu); Prof. Thad Starner (thad AT gatech.edu) 

Abstract: This project will develop a Kinect program to recognize signing to be used in an ASL assessment for children. All video data has already been collected. 

 

Team Members:

1. 

2. 

3. 

4. 

 

Alternate Team Members:

5. 

6.

 


Title: Sign language recognition with the Kinect: Game

Sponsor: Dr. Harley Hamilton  (hjh AT cc.gatech.edu); Prof. Thad Starner (thad AT gatech.edu) 

Abstract: This project will develop a simple prototype of a Kinect game in which the user will interact with the game using sign language. It can be very simple such as a platform game that the user can control with the signs RUN, JUMP, FLY, CLIMB. We can supply the video data. Other game formats are possible if the team has a good idea.

 

Team Members:

1. Samrat Ambadekar (samrat.ambadekar@gatech.edu)

2. 

3. Norma Easter neaster3@gatech.edu

4. 

 

Alternate Team Members:

5.  

 


Title: HyTech Racing - Formula Race car sensor collation and analysis

Sponsor: N/A. Student idea.

Abstract:  There is a GT student club called HyTech Racing that builds a formula hybrid race car and participates in various competitions around the country. Currently, they are ramping up work on v2 of their car. This time, they have various sensors all over the car that collect data (suspension, tire pressure, fuel status, battery indicators, etc.). Currently, these sensors' data is not collated, and there is no proper mechanism for analysis. By using Arduinos, these sensors can be connected and their data beamed over WiFi, live, while the car is on the lap. This data can then be analyzed to help the team improve the car further. For this course, the data collation would be the first project and the analysis the second.

 

We are looking for a student with an Industrial Design background.

 

Team Members:

1. 

2. 

3. 

4.

 

Alternate Team Members:

5.

6.

 

 


 

Title: Interactive eTextile Knee Brace For Physical Therapy 

 

Abstract: This project will work with physical therapists to design and prototype a low cost interactive knee brace for use in home rehabilitation exercises after surgery or injury.

 

Team Members:

1. Joe Gonzales jgonzales8@gatech.edu

2. David Muñoz davidmunoz@gatech.edu

3. Andy Pruett apruett3@gatech.edu

4. Graceline "Racel" Williams racel.williams@gatech.edu

 


Title: Undergrad Research Assistance (NOT a class project)

Sponsor: Dr. Harley Hamilton  (hjh AT cc.gatech.edu) 

WANTED: Undergraduate Research Assistant to collect images and video clips for the SMARTSign dictionary, a mobile English to American Sign Language dictionary. The person will also do some basic video editing to join the sign language video with the images/video clips. No experience necessary. Go to www.cats.gatech.edu/SmartSign/production/index.html to see a prototype. Search for the word "cat" or "run" to see examples. On a PC, use the Chrome browser. It should work on any iOS or Android device.

 


 

Shashank Raghu (sraghu@gatech.edu)
