Archive for the 'Final Report' Category
Three Musketeers – Prototype II Final Report: Laser Tag


The idea is to create a system that allows the residents to play a movement-based game similar to laser tag. Using GPS and a digital compass, we track each player's position and heading. That information is used when a player wants to ‘fire’ upon another player. All of the information is communicated via WiFi to a central server, which processes the data and outputs gameplay statistics.
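To make the ‘fire’ check concrete, here is a minimal sketch (not the team's actual code) of how the server might decide whether a shot hits: compute the bearing from shooter to target from the two GPS positions, then compare it against the shooter's compass heading. The function names and the 10° tolerance are illustrative assumptions.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def is_hit(shooter_heading, shooter_pos, target_pos, tolerance=10.0):
    """True if the target lies within +/- tolerance degrees of the shooter's heading."""
    b = bearing_deg(*shooter_pos, *target_pos)
    # Wrap the angular difference into [-180, 180] before comparing.
    diff = abs((b - shooter_heading + 180) % 360 - 180)
    return diff <= tolerance
```

A wider tolerance would make the game more forgiving, and a maximum range check could be layered on in the same way.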



The device consists of an Arduino Uno that collects data from a GPS and a magnetometer and sends the data wirelessly to a router using a WiFly Shield. Most of this is housed inside a box custom-designed for the Arduino.

The Arduino code is a simple program that collects data from the GPS and magnetometer and serves it as an HTML page to any client that requests it.

The server program is coded in Python 2.7 using the following libraries: Pygame, Twisted, Zope.Interface, and PGU-0.1.8 (a Pygame GUI toolkit). The program uses Twisted’s reactor as its base, which periodically calls PGU’s event handler: the reactor handles networking and real-time scheduling, while the event handler handles GUI interaction and rendering. A flow chart of this activity is shown below.

This program is built on the core structure of Andrew’s multiplayer game from another class, which provided an asynchronous reactor with networking and GUI support. A simple functional GUI was added, and data is retrieved from the Arduino at two-second intervals.
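The actual server uses Twisted's reactor; the stdlib-only sketch below illustrates the same periodic-callback pattern (poll the Arduino, then hand the result to the GUI handler, every two seconds), with the HTTP fetch and the PGU call stubbed out. All names here are illustrative, not the team's code.

```python
import sched
import time

POLL_INTERVAL = 2.0  # seconds, matching the two-second interval in the report

def make_poller(fetch, handle_gui, interval=POLL_INTERVAL, rounds=None):
    """Schedule `fetch` + `handle_gui` every `interval` seconds.

    `rounds` bounds the number of iterations (useful for testing); with
    rounds=None the poller reschedules itself forever, like a reactor loop.
    """
    s = sched.scheduler(time.monotonic, time.sleep)
    state = {"count": 0}

    def tick():
        state["count"] += 1
        handle_gui(fetch())          # fetch sensor page, then update the GUI
        if rounds is None or state["count"] < rounds:
            s.enter(interval, 1, tick)  # reschedule, like LoopingCall

    s.enter(0, 1, tick)
    return s, state
```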

Here is the code:


Cost Report:

Component Qty Unit Cost Total Cost Comments
WiFly Shield
Arduino Uno
Compass Module – HMC6352
Resistor assortment
TOTAL     $142.11


In the first prototype we created a device that tracks position and heading, but the measurements were not accurate enough, so we replaced the offending parts. In the second prototype we broadcast the data wirelessly to a router, which can then forward it to a server. Beyond the expected scope of Prototype II, we have also created a program that provides a limited GUI and receives the data from the router. We are currently having trouble establishing reliable communication with the device, but it appears to be a coding issue. From here we need to set up reliable two-way communication between the server and the device, build more client devices, and implement the gameplay functionality.

Out of Time – Prototype II Final Report: Cool TEC

The Cool TEC Pad



For our design project we talked with residents at the C.P.R.F. facility, and nearly everyone we spoke with said they would find several uses for a pad that could be heated or cooled. They also said that offering different styles of pads would be a great addition to the design. Based on these comments, we decided on a unit that circulates temperature-controlled water through a pad the clients can use. After considering several design options, we split the system into two parts: the first contains the water circulation components and the temperature control devices; the second is a pad that the user can place wherever it is needed.


The hot/cold pad we designed is a temperature-controlled water circulation pad. The temperature adjustment device is a thermoelectric cooler (also called a Peltier cooler): when electricity is applied, one side gets cold while the other side gets hot. A Peltier was chosen for the simplicity of its design and its lack of moving parts. Our research showed that, to work efficiently, the unit needs something on its hot side to pull heat away; without it, the cooling effect is lost to the heat. To solve this we purchased a liquid CPU cooling unit for the hot side. The CPU cooler was rated to handle the heat we would be dealing with, and it saved us the time it would take to develop a hot-side cooling system of our own.

To carry the cold away from the cold side of the Peltier, we fabricated a water block from the heat sink of an old computer; ready-made water blocks would have put us over budget. The top was cut off of the heat sink, and acrylic plates were epoxied to the top and sides to contain the water. To circulate water through the block, we drilled 3/8″ holes on both sides and welded copper tubing into each hole for connecting the water tubing. After connecting the tubing, we attached a submersible water pump to circulate the water through the system.

Once this was finished we focused on a power supply. We chose a computer power supply and found one that supplies up to 18 A at a regulated 12.0 V, sufficient for all the components of the project. A custom or more specific supply will be required in the future, but this one was readily available and free.
We disassembled the power supply to remove all unwanted wires and soldered on the wires we would use to power the unit. Finally, we attached a pad we found through online research that efficiently delivers the desired temperature to the user. This pad was designed for use in veterinary hospitals, where it helps control animal body temperature after surgery.


 Thermoelectric Cooler (Datasheet)                   CPU Cooler


         Water Circulation Pads                               Water Pump



Peltier Cross-Section

 Project Design Overview


Cost Report:

Component Qty Unit Cost Total Cost Comments
Thermoelectric Cooler 1 $10.95 $10.95 Several different options available
Water Pump 1 $11.50 $11.50
CPU Cooler 1 $54.99 $55.99 Bought used
Water Circulation Pad #121218 1 $32.00 $32.00 Used for veterinary medical services
TOTAL     $110.44


As much as it helped us in the final stretch, we are going to try to avoid using epoxy in our final project (at least for leak prevention). We believe that most pieces of the final device will have to be fabricated from scratch; these pieces already exist commercially, but budget constraints will prevent us from buying everything. Because we are focusing on water circulation, we will design the final device for in-home use rather than portability. This will allow us to purchase a larger Peltier cooler and provide better heating/cooling capability.

We believe next semester will show much more progress.  Now that we know what we want to do and how we want to do it, we can hit the ground running.



Team 9 – Prototype II Final Report: Remote Piano Pedal Controller


Remote controlled piano pedal

Project Overview


The project is designed to return dynamic control of a piano pedal to someone who has lost the use of their legs. The technology available at this time uses solenoid-operated systems; solenoids operate in an on/off manner and don’t allow for the subtle movements a professional piano player executes.

    An infrared sensor reads the player’s analog input, which drives a cam that delivers an analog response to the piano pedal.


Positioning feedback mechanism

Figure 1 shows the mechanism designed to produce positioning feedback. The red gears connect the cam to a potentiometer at a 1:1 ratio. A potentiometer works like a volume control knob: it takes a 5 V signal and returns a value of 0–1023 depending on how far the ‘volume’ is turned. As the motor turns the cam, the potentiometer turns with it, and this reading is delivered to the Arduino at input A0. The red gears were printed on a 3D printer from a pattern created in Google SketchUp, and the white hub was turned on the shop lathe. A felt clutch allows the potentiometer to slip when it reaches its minimum and maximum limits, so the unit can be easily calibrated to a wide variety of pedal heights.


Rotary actuation mechanism (Cam)


Figure 2 shows the cam that pushes the pedal down. The vertical linear range delivered by the cam is 0 to 2 inches, which allows the system to be used in a variety of settings with very little adjustment required. Assuming normal piano design and setup, pedal depression ranges from 0.75 to 2.0 inches. Within this range, roughly 30–40% of the stroke at the top is free travel: the felt damper is moving toward the string but is not yet close enough to have an effect on it. Similarly, roughly 25% of the stroke at the bottom has the damper in complete contact with the string, where further depression does not alter the deadening effect. That leaves a 35–45% region in the middle of the stroke where the player uses the feathering effect the cam delivers.


Infrared user interface


Figure 3 shows the infrared interface the player uses to manipulate the system. The sensor sends out an infrared pulse and uses the reflected response to determine distance. Its output is an analog voltage that the Arduino reads at input A1 as an integer in the range of approximately 50–600. This reading is used together with the reading from the positioning mechanism to move the piano pedal to the desired position. The sensor is shown here mounted to the system frame, but in practice it can be placed virtually anywhere based on the user’s needs; for example, it could be mounted on the side of the user’s chair to read the distance to the user’s thigh.


Arduino micro-controller


Figure 4. The reading received at input A1 from the infrared sensor represents the player’s desired pedal position. This value (50–600) is scaled and compared with the value (0–1023) received at input A0, which represents the pedal’s current position: the command value is the desired position minus the current position. A command value greater than 25 sends current through +M2, driving the motor forward; as the motor moves forward, the cam moves down, depressing the pedal and changing the potentiometer reading. When the two values agree, the current ceases and the motor stops. Similarly, a command value less than -25 sends current through –M2, driving the motor backwards until the two values match.
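The comparison above amounts to a simple deadband controller. The sketch below is illustrative Python, not the actual Arduino sketch; in particular, the linear rescaling of the 50–600 IR reading onto the potentiometer's 0–1023 range is an assumption about how the two readings are made comparable.

```python
DEADBAND = 25  # counts; matches the threshold described in the report

def scale_ir(ir):
    """Map the IR reading (about 50-600) onto the potentiometer's 0-1023 range."""
    return (ir - 50) * 1023 // 550

def motor_command(ir_raw, pot_value):
    """Return +1 (forward), -1 (backward), or 0 (hold) for the H-bridge."""
    error = scale_ir(ir_raw) - pot_value  # desired position minus current position
    if error > DEADBAND:
        return 1    # drive +M2: cam moves down, depressing the pedal
    if error < -DEADBAND:
        return -1   # drive -M2: cam moves up, releasing the pedal
    return 0        # within the deadband: stop the motor
```

The deadband keeps the motor from hunting back and forth around the target position.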


Arduino code



Wiring Diagram


Demo Video


Cost Report:

Component Qty Unit Cost Cost per 100 units Min Qty Comments
Arduino Microprocessor 1 $45.70 $4,570.00 100 Item may be purchased online
12 V DC Motor 1 $44.75 $4,475.00 100 Item may be purchased online
H-bridge motor shield 1 $20.00 $2,000.00 100 Item may be purchased online
Teflon Cam 1 $40.00 $4,000.00 100 Custom-order machined item
Infrared Proximity Sensor 1 $14.95 $1,495.00 100 Item may be purchased online
Gears 2 $2.00 $200.00 100 Similar gears available online
Potentiometer 1 $1.55 $155.00 100 Item may be purchased online
TOTAL   $170.95 $17,095.00


The project appears to solve the problem as anticipated. A clutch system was added to the initial design to protect the gears; this addition also lets the end user calibrate the system very easily. In addition, the pedal controller needs no adaptation to be a viable solution for people who are unable to use a sewing machine pedal or any other type of linear accelerator pedal.

The next phase of development will include:

  • tightening up parameters to fine-tune the sensitivity and speed control;
  • designing a quick-installation mechanism;
  • improving the aesthetics of the finished device;
  • developing circuitry to replace the functionality of the Arduino interface and bring the unit price down.

[Team Name] – Prototype II Final Report: [Title of Project]


CPRF staff contacted us and requested a hazardous weather warning light system to keep their community informed and safe during bad weather. Our solution uses RGB LEDs controlled by Arduinos connected to a master server. The lighting system is designed to be modular and will eventually be mounted on three towers so it is easily seen throughout the CPRF community. By varying the voltage sent to each color of the Light-Emitting Diode (LED), we can control the LED color. The three colors correspond to different levels of weather severity: in a severe weather condition, the red LED turns on to urge residents to take shelter; the blue LED makes the community aware of a weather watch; and the green LED turns on when the warning and/or watch is cleared. These color assignments may be modified at a later date.

Our hazardous weather warning light system is a helpful addition to existing weather alert systems. Members of CPRF community who would be otherwise unaware will be able to stay informed about dangerous weather.

Parts used for this project are as follows:

RGB LED array: a single LED is small and by itself would not be bright enough to see from a distance. The solution is to combine multiple LEDs in an array, which can be viewed far more easily.

Arduino microprocessor: recognizes input signals and turns on the various colors of the array.

Load resistors: limit the amount of current flowing to the LEDs, preventing them from burning out.

Computer: provides remote input to the Arduino.

Box: holds the Arduino and power supply.



The hazardous weather warning system is designed to control red, green, and blue LEDs. The RGB LED array will be mounted in a tower and controlled from a server connected to the Arduino. To achieve this we implemented a driver circuit that controls the amount of current going to the LEDs. A 32 V transformer provides DC power to each color of the array; each color has a MOSFET controlling the current flow, and the Arduino signals these switches to turn the array on. To prevent burnout, a load resistor is used in conjunction with each MOSFET.
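The severity-to-color logic might look like the following sketch. This is illustrative Python, not the deployed Arduino code; the full-on/full-off duty cycles and the `write_pwm` callback are assumptions standing in for the Arduino pin writes that gate the MOSFETs.

```python
# Severity levels from the report: red = warning (take shelter),
# blue = watch, green = all clear. Duty cycles (0-255) are illustrative.
COLORS = {
    "warning": (255, 0, 0),
    "watch":   (0, 0, 255),
    "clear":   (0, 255, 0),
}

def set_array(level, write_pwm):
    """Drive the three MOSFET gates for the given severity level.

    `write_pwm(pin, duty)` stands in for the Arduino's analogWrite call.
    """
    r, g, b = COLORS[level]
    for pin, duty in (("R", r), ("G", g), ("B", b)):
        write_pwm(pin, duty)
```

Keeping the mapping in one table makes it easy to change the color assignments later, as the report anticipates.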



Disassembled LED array                                                                       Driver Circuit











Final housing and controller


Array in green mode


Array in blue mode

Cost Report:




Component Unit Cost Total Cost Min Qty
Mounting Box

We are able to operate the RGB LED array from our computer, controlling each individual color. The project can be further expanded and improved by modifying the Arduino’s input methods; because the system is modular, these inputs could take a variety of forms, including email/SMS control or other kinds of remote input.

Group 3 – Prototype II Final Report: Battery Range Meter


Wheelchair users come across many issues in their day-to-day lives. One of them is the limited capacity of the battery on the wheelchair they use. Battery capacities differ for many reasons, including size, age, and how well the battery is maintained. As a result, users have a hard time knowing how long they can drive before they have to recharge. All wheelchairs on the market show some kind of battery level, but how does that translate into real life? They do not display how far the wheelchair can travel before it has to be recharged.

For our project, we are addressing this issue by building a device that can display the traversable distance remaining on a charge. With this in mind, we did some research and began working toward a solution. We rely on the battery voltage to measure the charge: we collected data on how far our test chair can travel and how the voltage drops as the charge depletes, and we now read the battery voltage at a given time and calculate the distance remaining on the current charge.

The major components we are using:

  1. Arduino Uno – R3
  2. Basic 16×2 Character LCD – Black on Green 5V
  3. Parallax 28500-RT PMB-648 GPS SiRF Internal Antenna


Batteries and electric wheelchairs are not made equal. To get accurate distance measurements for the wheelchair we used, data had to be collected. We spent all day in a park running down the battery of our chair and taking voltage measurements along the way. We started with the voltage of a full charge and took measurements roughly every two miles to track the change.

To translate this into a usable equation for judging the distance remaining, we had to narrow our limits to exclude the full-charge and dead-battery voltage measurements. To accommodate this, we don’t allow the meter to display more than 18 miles or fewer than 2. Once those outliers were removed, we used Excel to find a best-fit curve.
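As a rough illustration of this step, the sketch below fits a straight line to voltage/distance samples and clamps the result to the 2–18 mile window. The team's Excel best-fit curve may well be nonlinear, and the sample data in the test is hypothetical, not their measurements.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = m*x + b (a stand-in for Excel's trendline)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return m, my - m * mx

def miles_remaining(volts, m, b):
    """Evaluate the fit, clamped to the 2-18 mile display window from the report."""
    return min(18.0, max(2.0, m * volts + b))
```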

LCD Diagram:

The LCD on this version of the project is a basic 16 by 2 LCD.  This means it can display 16 characters by 2 rows.  The connections for it are shown in the diagram below.


LCD Tutorial and Source Code


  1. Connect Gnd and RW to the potentiometer and ground
  2. Connect Vcc to the 5V supply
  3. Connect Vs to the potentiometer
  4. Connect E to pin 11
  5. Connect D4-D7 to D5-D2 respectively
  6. Connect V+ to the 5V supply
  7. Connect V- to ground

GPS Diagram:

To add more functionality to the device, a GPS module is also included. The GPS adds a number of features and possibilities: the LCD displays the speed in MPH and the time. Screen real estate on the 16 by 2 LCD is very limited, so this is all we could display for now. The next version will use a bigger LCD and include the date, direction, and even coordinates. The GPS is very easy to connect; the circuit diagram is below.


GPS Tutorial and Source Code


  1. Connect the yellow wire to port 6 on the Arduino.
  2. Connect the red wire to the 5V supply.
  3. Connect the black wire to ground.

Final Diagram:



  1. Put a 1K and a 10k resistor in series.
  2. Connect the positive side of the 10k to the 24V battery on the chair.
  3. Connect the negative terminal of the 1k resistor to the ground.
  4. Connect the negative terminal of the battery to ground.
  5. Connect a wire between the 1k and 10k resistor to A0.
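With this divider, the 24 V battery is scaled down to about 2.2 V at A0, safely inside the Arduino's 0–5 V ADC range. The conversion back to battery voltage can be sketched as follows (illustrative Python mirroring what the Arduino sketch would do; the 5 V reference and 10-bit range are standard Uno assumptions):

```python
R_TOP = 10_000    # ohms, between the battery's positive terminal and A0
R_BOTTOM = 1_000  # ohms, between A0 and ground
ADC_REF = 5.0     # Arduino Uno default analog reference, volts
ADC_MAX = 1023    # 10-bit ADC full-scale reading

def battery_voltage(adc_reading):
    """Recover the battery voltage from the divided voltage seen at A0."""
    v_a0 = adc_reading * ADC_REF / ADC_MAX       # voltage at the divider tap
    return v_a0 * (R_TOP + R_BOTTOM) / R_BOTTOM  # undo the 1:11 division
```

A fully charged 24 V battery puts roughly 2.18 V on A0, an ADC reading of about 446.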



Source Code:

 There are comments to give you an idea of what each section of the code is responsible for.


Cost Report:



Component Unit Cost Total Cost Min Qty
PMB-648 GPS SiRF Internal Antenna
1K ohm, 1/4 watt resistor
10K ohm, 1/4 watt resistor
Parallax 2×16 Serial LCD (Backlit)
Arduino Compatible UNO Rev3 Development Board
Dual Stereo Potentiometer

Measuring the voltage output of a battery is a great way to get an idea of its remaining capacity, but it is far from precise. This project gives an estimate of the mileage but does not take into account the battery’s condition or the chair’s output; it assumes the battery is in good health and that you are moving at maximum speed on a flat surface. In the future, we hope to refine our circuit by adding other factors, such as the current draw, to give a more accurate reading. Currently, our equation only works on the wheelchair we tested; the end goal is to generalize these equations to work with all models of wheelchairs and different types of batteries.

The LCD is another limitation.  Using a 16 by 2 LCD really limits what you can display to the customer at one time.  The plan is to add a larger touch screen to the project.  It will show a lot more information including options for different units of measurement and even allow for input from the user.  We are considering adding an SD card with map data so the GPS will actually place you on a map.

The battery meter originally used an AttoPilot current/voltage breakout to record the voltage. After the chip was fried for the second time, we switched to a simple voltage divider for the demonstration. Since there have been multiple issues with the breakout, it will not be used in future circuits.

This project was first developed on an Arduino Leonardo. When connecting the GPS, nothing was ever received by the Arduino. After switching to the Arduino Uno, everything worked perfectly, so the Arduino Leonardo may not be fully compatible with this GPS module.

On The Fly – Prototype II Final Report: Home Automation Framework


The Home Automation Framework project is a web application that can remotely control the electronic devices in one’s home using a web interface on a smartphone, tablet or desktop computer. The project facilitates ease of access and remote control of electronic devices for those who are busy, have a disability, or are not physically in the location of the device. Examples of such electronic devices may include lights, security cameras, electric door latches, TVs and computers.
Our initial prototype utilizes the small and inexpensive Raspberry Pi as a base station that wirelessly communicates with modules located around the home. The RPi is connected to the internet and is running an Apache web server. The web server provides access to an HTML/CSS interface with buttons for the user to press. Pressing those buttons sends AJAX commands to a Python script on the RPi, which then communicates wirelessly to either turn an outlet on/off or open and close window blinds.
Currently the RPi communicates with the outlet modules using an existing commercial product that we modified for our prototype, but this component will later be replaced with our own custom hardware. The blinds are controlled using an early version of our new wireless hardware, which utilizes an nRF24L01+ wireless module and an ATtiny85. One of the next hurdles for our project is building a circuit to deliver 5 V and 3.3 V DC power from a 120 V AC wall socket. This needs to be very small and will be used in our outlet modules.

Description of Major Components

Server: A Raspberry Pi is connected to the internet and runs an Apache web server, which serves the HTML and CSS files that build the interface in the user’s browser. The Apache WSGI plugin is installed, allowing the server to run Python code in response to the AJAX requests created when the user presses buttons in the interface. The Python code handles the GPIO port switching that controls the Stanley RF remote control and the nRF24L01+ module. The web interface and Python code are hosted in a GitHub repository.
Stanley Remote Controlled Outlets: For our initial prototype, we utilized an existing commercial product as a proof of concept. The Stanley product consists of a remote control with 3 on and off buttons, and three outlet switches. Each outlet switch is plugged into the wall, with a lamp or other AC device plugged into it. The buttons on the remote turn the outlets on or off. For our project, we took apart the remote, soldered wires onto the buttons, and are simulating button presses using the GPIO pins on the Raspberry Pi. This component will later be replaced with our own custom hardware, an early version of which is powering our blinds module.
ATtiny Blinds Opener: An ATtiny85-powered board that sits on top of the blinds, controlling a servo to actuate the tilt rod, which causes an open and close movement of the blinds. This is controlled by the base station using nRF24L01+ modules. The code for this module is posted in the GitHub repository.
WebApp: The WebApp consists of icons that the user would click or tap, which will result in the devices being turned on or off. See image on the right.
ATtiny: On our custom hardware adapter, we are using the high-performance, low-power Atmel 8-bit AVR RISC-based microcontroller (the ATtiny85), running the Arduino bootloader. The 8-pin package is small enough to be easily enclosed in our modules, while still providing enough pins to communicate through the nRF24L01+ modules. These will be used with transistors/relays to switch outlets on and off, and can drive any other custom modules (such as our blinds) using PWM.

Wireless Connector: The nRF24L01+ is a small, cheap (less than $5), 2.4 GHz wireless communication module. It’s capable of up to 2 Mbps transfer rates, and has a range suitable for indoor operation. These are used in our project to facilitate communication between the base station and other modules.

Description of Implementation

  • Raspberry Pi Server:

  1. Install Raspbian on the SD card.
  2. Boot up the Pi and log in with SSH.
  3. Install apache2
  4. Install mod_wsgi
  5. Install gpio-admin
  6. Download jQuery to /var/www
  7. Write index.html and main.css to create a basic web interface (in /var/www) which will send AJAX requests to be processed by the python script.
  8. Write python script (with extension .wsgi, and matching the WSGI spec) to process the query sent from the web interface and toggle the appropriate GPIO pins.
  9. Make wire harness to connect the GPIO pins to a breadboard.
  10. Connect GPIO pins to the leads from the Stanley remote control.
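The step-8 WSGI script could look something like the minimal sketch below. The `pin` and `state` query parameters are hypothetical names (the actual parameter scheme isn't shown in the report), and the real script drives gpio-admin-exported pins, which is stubbed out here.

```python
from urllib.parse import parse_qs

def toggle_pin(pin, state):
    # Placeholder: the real script writes to gpio-admin-exported GPIO pins.
    return f"pin {pin} -> {state}"

def application(environ, start_response):
    """WSGI entry point called by mod_wsgi for each AJAX request."""
    query = parse_qs(environ.get("QUERY_STRING", ""))
    pin = query.get("pin", ["?"])[0]
    state = query.get("state", ["off"])[0]
    body = toggle_pin(pin, state).encode()
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]
```

A request like `/control.wsgi?pin=17&state=on` would then toggle one outlet.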

  • Arduino Blinds Opener: 

    The Arduino blind opener’s components were described above. The steps to assemble it were as follows:

  1. Assemble the hardware as described in the below circuit diagram
  2. Load the Raw-IR-Decoder-for-Arduino code onto the Arduino. The code is posted in our GitHub repository linked above.
  3. While the program is running, open a serial view of the Arduino’s output.
  4. Using any IR remote, push a button to be assigned an Arduino function. In this case we used an HP Media Remote from an HP dv2000 laptop, but any remote works.
  5. We assigned two buttons, up and down, to the open and close functions in the Arduino.
  6. The serial view will output a properly formatted C array of integers. This array is the code the remote sends, as captured by the Arduino.
  7. Edit the IR-commander code. There is a library file called ircodes.h; in that file, paste the code output by the Raw-IR-Decoder, just like the examples already there.
  8. In the ircommander.ino file there is a main loop that listens for an IR pulse. By calling IRcompare(numberpulses, [NameOfCode], sizeof([NameOfCode])/4), we check whether the received code matches a saved code.
  9. Every time we match a saved code, we execute a function; in this case, we output 0 or 180 to the servo, which opens or closes the blinds.
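Step 8's IRcompare boils down to a tolerance match over pulse timings. The sketch below is a simplified Python rendition of that idea, not the library's exact algorithm; the 20% tolerance is an assumed fuzz factor.

```python
TOLERANCE = 0.20  # fractional fuzz; IR timing is never bit-exact between presses

def pulses_match(measured, stored, tolerance=TOLERANCE):
    """True if each measured pulse is within `tolerance` of the stored pulse."""
    if len(measured) != len(stored):
        return False
    return all(abs(m - s) <= s * tolerance for m, s in zip(measured, stored))
```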

  • Alternate Blinds Opener: 

    An alternate method we used to control the blinds opener was a direct connection to a GPIO pin on the Raspberry Pi. The web interface implements a different button type that controls the blinds, opening and closing them through the servo; it simply uses a different function than the light-switch buttons.


Raspberry Pi-Controlled Remote Lamp Actuator:


Raspberry Pi Circuit Diagram

ATTiny-Controlled Servo & Blinds Implementation:




Arduino Blinds Actuator Circuit Board:




Cost Report:

Item Cost
RF Receiver $5
ATtiny $3
Actuator Case $3
Relay $2
Power Supply $2
Raspberry Pi $35
Raspberry Pi Case $5
High Power RF Transmitter $25
Power Supply $2
Ethernet Cable $2
TOTAL $84



Lessons Learned

We successfully implemented wireless communication between an Arduino, an ATtiny, and the Raspberry Pi server, controlled via our web interface. We were also able to control a set of blinds with a servo and to turn devices on and off using the remote devices we’d obtained. However, the overall project has not yet reached the desired level of integration, due to difficulties in figuring out how to transform 120 V AC power to 5 or 3.3 V DC. The devices can communicate wirelessly, but we haven’t been able to create our own power adapters.
In retrospect, while it was great to get the initial boost of energy from figuring out how to take apart the Stanley remote devices and control them in new and different ways, it might be considered something of a detour from our main goals of doing similar work ourselves from scratch.
Another effect of using the Stanley remote devices was that it made us overconfident in our ability to build our own version of it. We gained an important boost in enthusiasm at the cost of reduced man hours in developing our final product.
We also discovered our own difficulties dividing the workload among ourselves. We had multiple meetings where we would take turns doing something, tackling tasks in series, instead of a more practical parallel approach.


We had a fantastic Prototype I, after which we had ambitious goals for Prototype II. We managed to build working proofs of concepts for finishing the prototype, but a few key components proved to be more of a challenge than anticipated. We predict that Prototype III will be much more stable and functional.

#7 Team Bravo Squad – Prototype II Final Report: EmergiText


The project goal is to develop a device that monitors the state of a powered wheelchair’s battery and automatically sends a text message with location data to a preconfigured emergency contact when the battery runs out of power. Additionally, a button placed within reach can be used to manually send a message with location data whenever the user needs to. The device helps those in powered wheelchairs maintain their freedom to travel while having a safety system in place should issues arise. Because it is directly connected to the wheelchair, it can’t be forgotten (like a phone can) or run out of power at the wrong time (it runs from the main battery, with an independent backup for when the battery dies). The second prototype was intended to support sending a text message with GPS coordinates to a contact on command.


We started by extending the first prototype with a GSM shield (and appropriate antenna). Hookup-wise it’s pretty easy: just plug the shield into the Arduino (after soldering on headers), then hook the rest up as before. We did start experimenting with using the button as an interrupt (instead of direct power control), but the shield uses pins 2 and 3 to communicate by default, and those happen to be the only external-interrupt pins on the Arduino Uno. So, to use the button as an interrupt, we planned to cut the headers from the shield’s pins 2 and 3 and wire them to the Arduino’s pins 7 and 8, freeing the interrupt pins for the button. However, we ran into several issues with the GSM shield and never managed to actually send a message through it, so we never went to the effort of rerouting the pins.

Because of these issues, we decided to pivot and switch to the Android platform. Within two days we had an app that roughly fulfilled our objectives for the second prototype. The application was very hastily put together by people who had never done Android programming before, so the code is a bit ugly and will need an overhaul before further use, but it works well enough to test things like how the messages will look. With the prototype app, you can select a message and a contact, and a text will be sent to the selected contact with GPS coordinates appended to the selected message.
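The message format, appending GPS coordinates to the selected canned message, can be sketched as follows; the exact wording and coordinate precision are assumptions, not the app's actual output.

```python
def build_text(message, lat, lon):
    """Append GPS coordinates to the selected canned message.

    Five decimal places of latitude/longitude is roughly 1 m of precision,
    which is plenty for locating a stranded wheelchair.
    """
    return f"{message} Location: {lat:.5f}, {lon:.5f}"
```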

Although the goal of the project is not to develop a traditional Android app, but rather to use the Android platform to build a standalone device, we get a robust UI framework for free. We therefore plan on having a decent interface both for customers whose needs an app alone would satisfy and for initially setting up the standalone device.

App mockups: 



The current prototype interface isn’t much to look at or very intuitive, but it works:



The Arduino-based code was set up to be modular, with each component encapsulated in its own class with easy-to-use methods. Had we not run into so many hardware problems, we would have had a rather flexible and easy-to-read framework for our project code; alas, it never got to be utilized to its full extent or receive much testing. To build it, add the contents of the archive (the two folders) to the root sketchbook folder the Arduino IDE is set up to use; the IDE should then be able to find the libraries and compile.

Arduino Sketchbook


The Android code is mostly pieces of different tutorials and sample code glued together into a functional state. Our goal was not a finely crafted app, since we were doing it quickly; we just wanted something working so we could test various aspects of the project. For those with little to no Android or Java experience, it will probably be rather confusing to even move around the file structure, but if you drop the folder into your projects folder in Eclipse (or similar) you should be able to find things.

Android Code





Cost Report:

These are not exactly production-line costs (which would involve custom PCBs), but if someone were to make our Arduino prototype in large numbers (and buy everything from SparkFun), the prices would be something like the following:

Component Qty Unit Cost Total Cost Min Qty Comments
Arduino Uno 1 23.96 23.96 100
Venus GPS 1 39.96 39.96 100
GPS Antenna 1 9.56 9.56 100
Cellular Shield with SM5100B 1 79.96 79.96 100
GSM Antenna 1 6.36 6.36 100
Small Breadboard 1 4.76 4.76 100
Misc Switch 1 1.56 1.56 100
TOTAL     $166.12


For the Android prototype, any Android phone with a screen, GPS, and cell network (texting) would work. The phone used for the main prototype testing and demonstration, an HTC Droid Incredible, can be had for ~$150 on eBay. One could work with a manufacturer to get an existing phone board cut down (no camera, no attached screen, no case, no speakers, etc.), likely saving a lot compared to a custom board, but since we pivoted late in the semester, we haven’t worked out those kinds of costs.



The initial prototype systems built on the Arduino were somewhat painful experiences. We ran into numerous hardware problems that, while likely trivial, we just didn’t have the time to properly handle. Issues were bound to come up; anything wireless can be rather difficult to work with. Since what we were after could be considered a stripped-down smartphone, we should have recognized earlier that a system designed with this kind of work in mind would be a better choice than doing everything ourselves. With Android we get a robust telephony stack and easy access to just about every tool we need (GPS, contact management, data storage, etc.). It should ease development considerably and allow us to move more quickly.

In the future, we need to build an accessory to read the state of the main battery and receive a button press signal from the user. For monitoring the main battery, we can either build a device that utilizes the audio plug as an analog-in port, or build a device utilizing the Android Open Accessory Protocol that would communicate over USB.
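Either accessory route ends with the same software problem: turning a raw analog reading into a battery voltage. A sketch of that conversion (the ADC resolution, reference voltage, and divider ratio below are all assumptions for illustration, not measured values from our hardware):

```python
# Hypothetical scaling constants -- real values depend on the
# accessory circuit actually built (audio-jack ADC or AOA board).
ADC_MAX = 1023        # 10-bit ADC full-scale reading
V_REF = 3.3           # ADC reference voltage, volts
DIVIDER_RATIO = 4.0   # resistor divider scaling battery voltage into ADC range

def battery_voltage(adc_reading):
    """Convert a raw ADC reading into an estimated main-battery voltage."""
    return adc_reading / float(ADC_MAX) * V_REF * DIVIDER_RATIO

# With these constants, a reading of 930 corresponds to a 12 V battery
print(round(battery_voltage(930), 2))
```

The phone-side app would then threshold this value to warn the user when the main battery runs low.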

Team Alpha – Prototype II Final Report: TrueView Camera System


Team Alpha: True View Camera System

Prototype II Final Report


                The Cerebral Palsy Research Foundation specializes in support for patients with unique needs. When we visited the foundation with the senior design class, we spoke with the residents to find out ways that technology could help with their everyday lives. The concept we eventually decided to pursue, after discussion with the residents, was a device that would allow them to see into the areas that were most difficult to view while operating their power chairs. We noticed that a lot of the residents had issues seeing to the sides and to the rear of their chairs when operating them. The new lithium-powered chairs are actually quite powerful, and we were surprised to hear that residents have put holes in their walls and damaged their personal property in these types of accidents. This is when we decided to work on the True View Camera System.

                The True View Camera System is a powerful rear camera designed specifically for power chairs that allows full viewing of items behind the chair while it is being operated. The camera has the unique ability to quickly and safely detect objects on screen, immediately lock on to the nearest object, and give the user a warning when coming within a dangerous distance. The camera uses infrared pulse technology to give distance detection accurate to within centimeters, all while providing a clean, high-quality image. The camera can even detect and render objects in pure darkness thanks to its infrared detect-and-render software. The True View Camera is the only system on the market designed specifically for power chair use and will set the standard for viewing safety.


                The prototype that we built expands upon the concepts that we started in Prototype I. In the Prototype I report, we described in detail how to set up and install the Freenect libraries for use with a Kinect. Since the last prototype, we have done more work with the OpenNi framework. OpenNi is another open-source community with support for the Xbox Kinect on Linux systems. Our goal with this prototype was to establish a standardized integrated development suite, and we found that OpenNi had a great deal of support when used in conjunction with Processing. Processing is a cross-platform development suite with many useful built-in functions for visual display activities. OpenNi supports devices that utilize natural interaction between humans and computers, and given the scope of our project and our desire to use Processing as a one-stop source for our coding needs, it was a natural choice.

                The prototype uses a standard off-the-shelf Xbox Kinect. As we mentioned in the last report, do not use the Microsoft Kinect made for developers. There is currently no Linux support for the developer Kinect, so it is much less versatile than what our project demands. The next item you will need is an Arduino Uno board, which we used to control our servos. It is also recommended that you use a Linux-based system for the setup, since there is currently more support for the Kinect there. Processing and OpenNi are supported on Windows and Mac as well, and we have had success on all three platforms, but for the sake of this prototype description we will focus on Linux. The tutorial that we describe here assumes a comfortable working knowledge of Linux.

                The first step of project setup is to install the Processing development suite. We used the 64-bit version and have had the most success with it to date, so that is our current recommendation. For information on downloading and installing the Processing suite:

Processing Development Suite

                As we mentioned earlier, you may download and install the Linux or Windows version of Processing for this project, but we recommend Linux. The Mac version should work as well, but we have not yet concluded whether there is sufficient support for OpenNi. There will be further updates on Mac as we continue our project.

                Once Processing is installed, you will need to make sure that the OpenNi libraries are available for the code to work. There is an excellent tutorial on installing the OpenNi package and driver on the OpenNi wiki page.

OpenNi Driver and Library Installation

                The help page goes into depth on how to install the 64-bit package that we need and how to move the libraries into the proper directory for either Windows or Linux. We are currently using the Processing 2.0 suite, so you may use the directions corresponding to that version. We have also gotten our code to work on earlier versions without noticeable difference. One thing we should note for Windows users: be sure to install the OpenNi driver for your camera, as the default driver installed by Microsoft will not work. There is another tutorial for installing the Windows driver located here:

OpenNi Driver Install for Windows

                This will guide you in making sure your system is not using the default Microsoft driver. If your system is not specifically using the OpenNi driver, you will not be able to run this project. We were able to successfully run our program on Windows 7 and Windows 8. On Windows 8, there is an additional issue: the operating system does not want to use an unsigned driver, as is the case with the OpenNi Kinect driver. Here is yet another resource on how to get around this issue in case you run into it:

Windows 8 – Disable Driver Signature Enforcement 

                The Windows 8 system ran the project perfectly once driver signature enforcement was turned off, but it does require a little extra effort.

                Now that Processing is installed, you will be ready to run the code. Essentially, our program starts a loop in which a full-color RGB camera feed is displayed on screen and each pixel’s depth is scanned to find the closest object. The closest distance is then converted to feet so that the user can see on screen how far away the object is. The code also draws a green dot on the screen corresponding to the nearest object. With the current settings, the code then flashes a warning banner when an object is closer than 3 feet, to alert the user that they are within a dangerous distance of it. While these actions are happening, the Processing code writes the coordinates of the closest object to the Arduino board, which then sends those values to the servos. The servos use these coordinates to lock on and track the nearest object so that the item in question is always in focus and in the center of the display. If the user moves away from the dangerously close object, the camera resumes normal viewing operation at the default coordinates in the middle of the x and y axes. If Processing is installed correctly and the libraries are in place, you may try opening the project via Processing here:

Dropbox Code Repository – Kinect Object Detection

Object Detection

Nearest Object Detected


                The link contains our pde project file for 64-bit Processing. You may also open a new sketch in Processing and copy and paste the code to run the project. One thing that we noticed: if you receive an error about a null array pointer upon running the project, you probably don’t have the OpenNi driver installed for the Kinect. We plan to add error-trapping methods in the future to alert the user if the Kinect is not detected. If run properly, you will see the servos move in response to the movement of the nearest object to the Kinect. If you have any issues getting access to the link for the code or have general questions about its use, please contact Eric Robinson:
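The per-frame closest-object search described above is straightforward to express outside Processing. Here is a language-agnostic sketch in Python of the same logic (our actual project code is the Processing sketch linked above; the array layout and the 0-means-no-reading convention are assumptions):

```python
MM_PER_FOOT = 304.8   # millimeters per foot, for the on-screen readout
WARN_FEET = 3.0       # warning banner threshold from the current settings

def closest_object(depth, width):
    """Scan a flattened depth map (values in mm, 0 = no reading) for the
    nearest pixel. Returns ((x, y), distance_in_feet, warn_flag), or None
    if no pixel had a valid reading."""
    best_i, best_mm = None, None
    for i, mm in enumerate(depth):
        if mm > 0 and (best_mm is None or mm < best_mm):
            best_i, best_mm = i, mm
    if best_i is None:
        return None
    x, y = best_i % width, best_i // width   # back to pixel coordinates
    feet = best_mm / MM_PER_FOOT
    return (x, y), feet, feet < WARN_FEET

# Tiny 3x2 "depth map": the nearest valid reading is 800 mm at pixel (2, 0)
print(closest_object([1500, 0, 800, 2500, 900, 1200], width=3))
```

In the real sketch, the returned (x, y) is what gets written over serial to the Arduino so the servos can center on the object.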

               The only item that we do not currently have fully documented is the mounting of the Kinect. Currently our prototype is mounted with two standard servos, one for the x axis and one for the y axis, both written to via Processing. We have a wooden mounting platform for the servos to support the weight of the camera. We will be formalizing a design for the mounting system soon, once we fully debug the movement and decide upon the proper position for chair mounting. For more information about mounting and servo assembly, please contact Seth Schoming: Our goal is to have a mount that will be universal to power chair systems, will sit at about mid-back level, and will be as sturdy and inconspicuous as possible. The range of the camera movement via servos will then be decided upon, as there may be limitations on the x and y movement due to mounting position.


Power Chair Camera Mount

Proposed Camera Mounting Position




Cost Report:

Component Qty Unit Cost Total Cost Comments
Microsoft XBox Kinect 1 150.00 150.00 Xbox version only
Microsoft Kinect Power Adaptor 1 8.00 8.00 Not included with standard Xbox Kinect
Arduino Uno Micro Controller 1 25.00 25.00  
Bread Board: Circuits Project Board 1 10.00 10.00 Standard project board will suffice
Servo Motor 2 5.00 10.00  
TOTAL     203.00




                The project concepts are working very much as expected as far as object detection is concerned. Our testing shows that the system is quite accurate at distance detection and quite useful for alerting when an item is getting too close to the camera. We were impressed not only with the accuracy of the camera but also with the speed at which objects can be detected; the camera would track and give a warning even when items like a ball were thrown towards it. The only issue we are still contending with regarding distance detection is using trigonometry to determine the distance more accurately. We can get an accurate distance from the camera to an item, but the actual distance from the chair to the item may be skewed by the angle at which the camera views the object.
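The planned correction is simple geometry: if the camera's mounting height is known, the slant (line-of-sight) distance the Kinect reports can be converted to a horizontal chair-to-object distance. A sketch of a Pythagorean version, assuming the object sits near floor level (the mounting height here is a made-up example, not our measured value):

```python
import math

def ground_distance(slant_ft, camera_height_ft):
    """Estimate the horizontal distance from the chair to an object given
    the camera's slant distance to it and the camera's mounting height,
    assuming the object is near floor level."""
    if slant_ft < camera_height_ft:
        # Slant distance shorter than the mount height: object is
        # effectively at the base of the chair.
        return 0.0
    return math.sqrt(slant_ft ** 2 - camera_height_ft ** 2)

# A 5 ft slant reading from a camera mounted 3 ft up is only 4 ft away
# horizontally -- a 20% overestimate if left uncorrected.
print(round(ground_distance(5.0, 3.0), 2))  # -> 4.0
```

An angle-based variant (slant distance times the cosine of the camera's tilt below horizontal) would give the same result once the mount position is finalized.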

                I feel that overall our decision to finally go with Processing, Linux, and OpenNi is also paying off quite well. There is a great deal of research already done in the open-source community for projects involving distance detection, and we seem to be moving at leaps and bounds compared to what we were achieving before. The only thing I would have changed was to come to this conclusion sooner. I feel that we didn’t research enough before getting hands-on with this project, which led to some minor setbacks. We initially purchased the developer Kinect, which was our first mistake, and we assumed that the Microsoft SDK was the best to use. We eventually found that there just isn’t the support we wanted for either that device or its SDK.

                The project overall has been on track even with the setbacks. The next prototype will correct the distance calculation using trigonometry. We would also like to add touchscreen buttons to our interface to activate a “night vision” mode. Night vision will essentially be the infrared rendering view, allowing object detection and tracking at night along with a monochrome view of the objects on screen. The final item after these upgrades will be to establish a secure mount for the camera system. We feel that these upgrades are well within reach and will be fully completed within our deadlines.

Shade Technologies – Prototype II Final Report: Motorized Wheelchair Canopy


Our Motorized Wheelchair Canopy intends to help those people living with disabilities that confine them to their motorized wheelchairs and need shelter at times of inclement weather, such as rain or extreme heat. Although manual wheelchair canopies do exist, no type of motorized wheelchair canopy currently exists in today’s market for purchase. People living with disabilities that confine them to their motorized wheelchairs often have difficulties deploying a manual wheelchair canopy when needed, and express interest in purchasing motorized wheelchair canopies to improve their quality of living. Our Motorized Wheelchair Canopy intends to address these difficulties.

Our Motorized Wheelchair Canopy will consist of a motorized canopy connected to a corresponding control mounted onto a motorized wheelchair. Our device will use a momentary toggle switch to obtain input from the user, controlling the direction and magnitude of extension/retraction of the motorized canopy, and will use a battery to power the canopy.


Prototype diagram for Motorized Wheelchair Canopy


Motorized Wheelchair Canopy closed


Motorized Wheelchair Canopy open

Our Motorized Wheelchair Canopy Prototype II shows basic functionality of a simulated wheelchair canopy (a canopy with metal arcs as structure, connected to a prototype base, simulates a canopy attached to a motorized wheelchair). Our user control (a momentary toggle switch) connected to our Adafruit Motor Shield on our Arduino Uno, programmed with the appropriate code, allows the user to control the direction and magnitude of extension/retraction of the simulated wheelchair canopy. Using the controls, the user can rotate one metal arc while the other metal arc stays stationary. The detailed steps below show the progression of this Motorized Wheelchair Canopy Prototype II.

  1. Programmed the Arduino Uno with the following Arduino Uno code.
  2. Placed the Adafruit Motor Shield on top of the Arduino Uno, ensuring that the Power Jumper was in place.
  3. Connected the momentary toggle switch and potentiometer to the Adafruit Motor Shield on top of the Arduino using various connector electrical wires, as shown under “Diagrams”.
  4. Attached the 24V DC gear motor to position M1 of the Adafruit Motor Shield and the 12V DC power supply to the Arduino Uno using various connector electrical wires to verify that the motor works as expected.
  5. Constructed a 20″ prototype base using various 1/2″ pieces of wood, wood screws, wood nails, wood glue, and black paint, and mounted the 24V DC gear motor using 5/16″ screw eyes.
  6. Attached one 4.9″ turntable wheel onto the 24V DC gear motor and attached another 4.9″ turntable wheel (level with the previously attached one) onto the opposite side of the prototype base using one 10-24 2″ combination-head bolt, two 10-24 nuts, and two #10 flat washers.
  7. Manually formed three 1/8″ x 4′ weldable steel rounds into three metal arcs (10″ short radius).
  8. Mounted one metal arc to the two 4.9″ turntable wheels attached to the prototype base using two 1 1/4″ plate staples, four 10-24 combination-head bolts, four 10-24 nuts, and four 10-24 washers.
  9. Drilled holes into opposite sides of one end of the prototype base to insert and secure the second metal arc using wood glue, and let it set overnight to ensure stability.
  10. Created the Motorized Wheelchair Canopy pattern using parchment paper, cut out the patterns on yellow waterproof utility fabric, and assembled the pattern using yellow thread and 5/8″ Velcro.
  11. Threaded the remaining metal arc into the Motorized Wheelchair Canopy to create stability.
  12. Verified that the Motorized Wheelchair Canopy Prototype II works as expected using the functional diagram shown in “Diagrams”.
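The control logic in the Arduino code (step 1) amounts to mapping the toggle switch and potentiometer to a signed motor speed. A minimal sketch of that mapping in Python (the 10-bit potentiometer range and 0-255 speed range are assumptions based on typical Arduino/Motor Shield setups, not a transcription of our sketch):

```python
def motor_command(switch_pos, pot_reading, adc_max=1023, max_speed=255):
    """Map the momentary toggle switch (-1 = retract, 0 = off, +1 = extend)
    and the potentiometer reading (0..adc_max) to a signed motor speed.

    The potentiometer sets the magnitude; the switch sets the direction.
    """
    speed = pot_reading * max_speed // adc_max
    return switch_pos * speed

print(motor_command(+1, 1023))  # full-speed extend: 255
print(motor_command(-1, 512))   # roughly half-speed retract: -127
print(motor_command(0, 1023))   # switch centered, motor off: 0
```

On the real hardware, the sign would pick the motor direction (FORWARD/BACKWARD) and the magnitude would be handed to the shield as the speed value each time through the loop.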


Circuit diagram for Motorized Wheelchair Canopy (please refer to link for more detailed circuit diagram of Adafruit Motor Shield used)


Functional diagram for Motorized Wheelchair Canopy


Cost Report

Component Quantity Unit Cost Total Cost Comments
Arduino Uno 1 $25.88 $25.88  
Adafruit Motor Shield 1 $19.50 $19.50  
Momentary toggle switch 1 $1.95 $1.95  
Potentiometer 1 $0.75 $0.75 Any resistance
24V DC gear motor 1 $9.75 $9.75 Includes 4.9″ turntable wheels
1/8″ x 4′ weldable steel rounds 3 $1.97 $5.91  
1 1/4″ plate staples 2 $1.59 $3.18  
10-24 3/4″ combination-head bolts and nuts 1 $0.82 $0.82  
10-24 2″ combination-head bolt and nuts 1 $0.82 $0.82  
#10 flat washers 1 $0.89 $0.89  
5/16″ screw eyes 1 $0.99 $0.99  
1″ wood furring strip 1 $1.49 $1.49  
1/2″ wood screws 1 $0.82 $0.82  
12V DC power supply 1 $18.75 $18.75  
Electrical wire 1 $11.62 $11.62  
Heat shrink tubing 1 $2.16 $2.16  
Electrical connectors 1 $1.75 $1.75  
Yellow utility fabric 3 $6.99 $20.97  
Yellow 1/2″ bias tape 2 $1.99 $3.98  
Yellow 7/8″ bias tape 2 $1.99 $3.98  
5/8″ x 30″ Velcro 4 $3.99 $15.96  
TOTAL     $151.92


Prototype II of our Motorized Wheelchair Canopy meets our expectations for what we envisioned. It allows a user to control both the direction and magnitude of extension/retraction of a simulated wheelchair canopy. We were able to create a motorized wheelchair canopy with more functionality than our previous Prototype I, with better user controls and sensors to prevent overextension/overretraction. We also implemented our actual canopy and gave it a look similar to a market product.

Prototype II of our Motorized Wheelchair Canopy did present some difficulties. We ran into trouble finding the exact parts that we needed. In future implementations, we would like to use smaller turntable wheels or custom-made wheels, a more powerful gear motor that would be able to handle the weight of a wheelchair canopy, and a larger toggle switch for easy use by our clients at CPRF. We also ran into a lot of difficulty constructing our canopy from scratch because of inexperience with sewing; we may consider purchasing a canopy instead.

Marz – Prototype II Final Report: Campus App


We wanted to design a tool that helps campus-going individuals find places within their university and organize their college tasks. The Campus App is a web service we are implementing to help university-goers with many different aspects of campus happenings. It has a map of the campus, which shows the user’s location as well as other locations of interest. It will integrate with school schedules and events, providing convenient social organization for any user. Integration with the schedule will allow the user to see where and when their classes are happening. The app will also show various school-posted and community-posted classifieds – housing, tutoring, sales, and others.

At first we were thinking about coding pretty heavily in Android (directly coding the app to handle transactions), but we realized that platform portability would be much better served by implementing a web server. The app can then serve nearly as a glorified bookmark to a webpage that offers the services of the Campus App. In the future this will make it much easier to move over to supporting the iPhone and other platforms.

We have no hardware being designed at this stage of prototyping. Our main goal here was to get a working server that provides working interaction with your phone/computer. The system was implemented by means of a free web host service and tested entirely on our phones and computers. The software is composed of JavaScript, PHP, and a small amount of SQL and CSS. See the description section below for details of the back-end.


The source code can be downloaded at

We started out by creating a website using a free web host, located at , which allows the user to interface with the scripts that run the look/feel of the application, as well as send queries to the back-end database, which houses all of the important info like GPS markers and event listings.


Selecting the map option brings up a list of possible points of interest to display on the map. After choosing one of these categories, the relevant markers are displayed, along with the user’s location on the map. This is done by querying our database for the lists of stored latitudes and longitudes for, e.g., the parking lots.
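The marker lookup amounts to a SELECT filtered by category. A small Python sketch of the same idea over in-memory rows (the column layout and sample points are made up for illustration; the real query runs as SQL against our hosted database):

```python
# Hypothetical rows as they might come back from a markers table:
# (category, name, latitude, longitude) -- this schema is an assumption.
MARKERS = [
    ("parking", "Lot 5", 37.7196, -97.2935),
    ("food", "RSC Food Court", 37.7190, -97.2929),
    ("parking", "Lot 12", 37.7210, -97.2951),
]

def markers_for(category):
    """Return the (name, lat, lng) points to plot for one map category,
    mirroring a SELECT-by-category query against the database."""
    return [(name, lat, lng) for cat, name, lat, lng in MARKERS if cat == category]

print(markers_for("parking"))
```

The front-end JavaScript then drops each returned coordinate pair onto the map alongside the user's own location.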



Selecting the classifieds module from the main menu expands another menu for the different types of ads you can find posted. By selecting one of these, say housing, a concise list view appears with dates and details of posted ads. Each list entry is a link to the description view. To post into the classifieds section, the user must log in to their account in our database. If the user does not have an account, there is a prompt to set one up, including an e-mail verification process. Once the user logs in with credentials matching those in our database, they can post ads and events.

Here is a screencap of a sample post description. Users can reply to existing ads and get in contact with the original poster.



The events module is very similar functionally to the classifieds module. It features the same type of list view and description view sections as well as a section for student posted events. The entry section has more fields corresponding to database entries for date, time, location, etc.


Attached is a flowchart of interactions with our online campus application. 

Cost Report:

Our project has only involved software research and development, all of which we were able to complete without any investments.


After getting a rough foundation complete with Prototype I, we were able to make the application more modular. This allowed us to focus on the implementation of the classifieds and events modules and the planning of others. We found another application that did a sports news feed for WSU, and several others that were rough implementations for other schools, but none of these were all-encompassing like we’re hoping ours can become by slowly adding more modules. As we continue on in the project, it’s likely that we’ll have the foundation set up so that each of us can focus on the implementation of a module. This will allow us to complete the rest of the loose ends much more quickly. Upcoming modules will include sports feeds, course listings, schedules, book listings from the school bookstore, and others. We will try to implement these in a consistent way that uses the same back-end for administrators to upload content to the website. In this fashion, our product becomes much easier to adapt for use by different universities. Any customizations of skins, modules, and other look/feel aspects will be readily accessible for us to tailor to what any given university might be looking for.