Archive for the 'Final Report' Category
Prototype I Final Report Assignment

Put your complete, standalone documentation for your first prototype here. Documentation should include complete functional and technical description, including but not limited to pictures, flow chart, circuit diagrams, and code.

Due midnight before October 23.

Group 9 – Prototype I Final Report: Remote Piano Pedal Controller

Introduction:

     As a group, we feel that everyone should be able to enjoy the things they love to do despite their limitations. Our project is designed to enable people who are unable to use their legs to operate the pedals of a piano, and to do so by simply attaching a motion sensor to any moving part of their body. Depending on how the user moves, the motion sensor will send a signal to the Arduino microprocessor, which will in turn signal the motor to rotate. The motor will be connected to a cam attached to the pedals of the piano, causing a pedal to move either up or down as desired. The parts used in this project include: an Arduino microprocessor, a 12 VDC motor, a potentiometer, a motor shield, a cam, and an accelerometer, infrared detector, or other input device.

 

Description:

     The system has been tested using two potentiometers. One potentiometer represents the user interface; the second is attached to the shaft of the motor. When adjustments are made to the user interface, the potentiometer sends a signal to the Arduino, which is read as a value between 0 and 1023. This value is compared with the value of the potentiometer connected to the motor shaft. If the interface value is lower than the motor value, the motor is prompted to rotate backward; the movement of the motor adjusts the second potentiometer downward until the two numbers match, at which point the motor stops and waits for another signal. Likewise, when the interface value is higher than the motor value, the motor is driven forward, changing the value of the potentiometer until the values agree. In this way the position of the motor is tracked, and its tendency to drift is avoided.
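     The logic of this loop is simple enough to sketch in a few lines of Arduino code. The pin assignments and dead-band value below are illustrative assumptions, not our actual wiring or tuning:

const int interfacePin = A0;  // user-interface potentiometer
const int motorPosPin  = A1;  // potentiometer on the motor shaft
const int dirPin       = 12;  // H-bridge direction pin (assumed)
const int enablePin    = 3;   // H-bridge enable/PWM pin (assumed)
const int deadBand     = 8;   // ignore small differences to reduce chatter

void setup() {
  pinMode(dirPin, OUTPUT);
  pinMode(enablePin, OUTPUT);
}

void loop() {
  int target  = analogRead(interfacePin);  // 0-1023
  int current = analogRead(motorPosPin);   // 0-1023

  if (abs(target - current) <= deadBand) {
    analogWrite(enablePin, 0);    // values agree: stop and wait
  } else {
    digitalWrite(dirPin, target > current ? HIGH : LOW);  // forward or backward
    analogWrite(enablePin, 255);  // drive until the values match
  }
}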

     Currently, an infrared distance-measuring sensor is being used to provide the input signal. Its values range from 0 to 590 and are scaled to match the motor potentiometer's range. The input values are smoothed by a capacitor installed at the sensor's output.
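     Scaling the sensor's range onto the potentiometer's range is a one-liner with Arduino's built-in map(); the analog pin here is a placeholder:

int readScaledTarget() {
  int raw = analogRead(A2);          // IR sensor reading, roughly 0-590
  return map(raw, 0, 590, 0, 1023);  // rescale to the motor pot's range
}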

     The interface between the user input and the motor is the Arduino microprocessor, which is loaded with a program that executes the logic. A motor shield is used because the Arduino supplies only 5 volts while 12 volts are required to run the motor; additionally, the polarity of the motor inputs must be reversed in order to drive the motor in both directions. Both needs are met by the H-bridge motor shield. A wiring diagram is included below.

(Embedded video demo.)

Diagrams:

Reflection:

     In the design of this project, a servo system for a DC motor has been built, and code has been developed that allows the motor to interface with an external input. A variety of input sources have been explored. Initially we investigated an accelerometer, but it did not give us usable feedback; an inclinometer was tried, and again the output was not adequate. The infrared sensor is now supplying usable, if not ideal, values, though its readings tend to jump around, causing the motor to 'chatter'. Sending the output through a capacitor buffers this effect somewhat.

     Overheating of the H-bridge chips is still a problem we are dealing with; a heat sink will be installed in the next phase of design to overcome this challenge. Additionally, increased current to the motor will make it move faster and produce greater torque. The final design of the cam is still underway.

 


Group 3 – Prototype I Final Report: Battery Range Meter

 

Introduction:


Wheelchair users come across many issues in their day-to-day lives. One of these is the limited capacity of the battery on the wheelchair they use. Battery capacities differ for many reasons, including size, age, and how well the battery is maintained. Because of this, users have a hard time knowing how long they can drive before they have to recharge. All wheelchairs on the market show some kind of battery level, but how does that translate into real life? They do not display how far the wheelchair can travel before it has to recharge.

For our project, we are trying to address this issue by building a device that can display the miles remaining on a charge. With this intention, we did some research and set out to find a solution. Currently, we are relying on the battery voltage to measure the battery charge. We have collected data on the distance our chair can travel and how the voltage drops as the charge is used. We read the battery voltage at a given time and calculate the distance remaining on the current charge.

The major components we are using:

  1. Arduino Leonardo
  2. AttoPilot Voltage and Current Sense Breakout – 90A
  3. Basic 16×2 Character LCD – Black on Green 5V
  4. Car Adapter USB Power Supply – 5VDC 650mA

Description:

Batteries and electric wheelchairs are not made equal. In order to get accurate distance measurements for the wheelchair we used, data had to be collected. We spent all day in a park running down the battery of our chair and taking voltage measurements along the way. We started with the voltage of a full charge and collected measurements roughly every two miles to see the change. Here is our data:

Miles    Voltage
0        25.4
1.9      25.0
3.9      24.8
5.9      24.6
7.9      24.4
10       24.2
12       23.9
14       23.6
16       23.3
18       22.8
20       20.0

In order to translate this into an equation we can use to judge the distance remaining, we had to narrow our limits to exclude the full-charge and dead-voltage measurements. To accommodate this, we don't allow the meter to display more than 18 miles or fewer than 2. Once those points were removed, we used Excel to find a best-fit curve. The equation uses x as the voltage and y as the miles remaining.
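The best-fit equation itself appeared as an image and is not reproduced here, so the coefficients in the sketch below are placeholders; only the 2-to-18-mile clamp follows the report:

const float FIT_A = 0.0;  // placeholder: slope from the Excel best-fit curve
const float FIT_B = 0.0;  // placeholder: intercept from the best-fit curve

float milesRemaining(float voltage) {
  float miles = FIT_A * voltage + FIT_B;  // y = miles, x = voltage
  if (miles > 18.0) miles = 18.0;         // never display more than 18 miles
  if (miles < 2.0)  miles = 2.0;          // or fewer than 2
  return miles;
}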

 

The Circuit:

To connect the LCD to the Arduino, you can follow the diagram below. The circuit and a full tutorial on how to use the LCD can be found here: http://www.arduino.cc/en/Tutorial/LiquidCrystal

 

 

To connect the AttoPilot to the Arduino:

AttoPilot    Peripheral
V            Arduino "A0"
GND          Arduino "GND"
Vin+         Battery "+"
GND          Battery "-"

Code sample and a demo on how to use it: http://dlnmh9ip6v2uc.cloudfront.net/datasheets/Sensors/Current/CurrentSenseDemo.pde

When you are done, your entire circuit should look something like the following.

 

In order to power the Arduino, attach the 12V-24V DC to USB converter to the battery and plug it in using your standard USB to MicroUSB cable.

The Code:

Attached is our source code with comments to tell you what the code does.

 BatteryMeter
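Since the attachment cannot be inlined here, the following is a hedged sketch of the read-and-convert step only. The AttoPilot scale factor is our assumption from SparkFun's documentation for the 90A board (about 63.69 mV out per battery volt); verify it against your own board:

const float ADC_REF    = 5.0;      // Leonardo ADC reference, volts
const float V_PER_VOLT = 0.06369;  // assumed AttoPilot 90A voltage scale

float readBatteryVolts() {
  float vOut = analogRead(A0) * ADC_REF / 1023.0;  // voltage at the V pin
  return vOut / V_PER_VOLT;                        // back out the battery voltage
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  Serial.println(readBatteryVolts());  // battery volts; feed into milesRemaining()
  delay(1000);
}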

Reflection:

Measuring the voltage output of a battery is a good way to get an idea of its remaining capacity, but it is far from precise. This project gives an estimate of the mileage but does not take into account the battery's condition or the chair's load; it assumes the battery is in good health and that you are moving at maximum speed on a flat surface.

In the future, there are many improvements we can make. We hope to refine our circuit by adding other factors, for example the current draw, to give a more accurate reading. Currently, our equation only works on the wheelchair we tested; coming up with ways to generalize these equations to work with all models of wheelchairs and different types of batteries will be the end goal.

At the last minute, our AttoPilot circuit was fried. In order to have a demo to show, we created a simple circuit that measures the voltage across different sets of resistors, and we use this voltage to display a mileage. We attempted to use a potentiometer to get a full range of values for the simulated battery charge, but the voltage was too unstable to get a reading. We also attempted to build a voltage divider through which we could read the charge of an actual battery; again, the voltage varied too much for the Arduino to get a stable reading.


JEL Group Final Report

Introduction:

Our problem is difficult to express briefly, so first we must explain our client Josh's situation. Josh is wheelchair-bound and unable to communicate verbally or physically. To communicate he uses a computer system (link), and he navigates using a motorized wheelchair. Unfortunately, because he cannot reliably control his motor functions, he is only able to consistently manipulate one input device at a time via his knee, so he is forced to choose between communication and propulsion. To use his computer system he presses a Bluetooth switch mounted on his chair near his knee, which acts as an I/O device for the computer. Unfortunately, this device can only communicate with his computer system, and a separate switch is needed for him to control the motion of his wheelchair, so the switches must be exchanged. This exchange is beyond our client's capability, leaving him reliant upon the awareness and willingness of others to perform it.

What we are trying to do is extend the freedom of our client by creating a Bluetooth smart switch that can intercept the information sent by our client's Bluetooth switch and delegate how that information is used. Our switch decides how to handle the information through predefined 'modes', which we call chair mode and board mode. In chair mode, our switch merely acts as a relay, sending the information it receives from the client's Bluetooth switch directly to his computer system. In board mode, it still intercepts the data sent from the Bluetooth switch, but it ignores the computer system and instead sends data to the wheelchair. This will allow our client to utilize both his computer system and the mobility of the chair with a single well-made switch.

Description:

Currently we are pursuing two avenues that we hope will lead to a solution. The first is a purely wireless avenue that utilizes a credit-card-sized Linux-based computer called a Raspberry Pi (Raspberry Pi Official Site), which has two USB ports that we will use for Bluetooth capability. We will write preliminary Bluetooth programs in Python that interface with our client's switch and computer system, and we will make the Pi intelligent enough to switch between board and chair mode. We are utilizing the Pi because it meets our size requirement of 4x4x4 inches, consumes far less power than our limit of 5 V and 1 A, and is only $35 for a fully powered computer. The second avenue is a semi-wireless solution that uses a circuit designed by our team member Long that incorporates an Arduino Uno.

In our Raspberry Pi solution we have, so far, only been required to build software that uses Bluetooth to interact with Josh's Bluetooth switch. Currently the only successful software we have built performs the task of 'discovering' nearby Bluetooth devices.

    Before we were able to build anything we needed to find out what kind of services our client's Bluetooth switch provided; we utilized several Linux tools to perform this task (link). We were also able to obtain the Bluetooth MAC address, clock offset, and class from our client's device. This information will make building the rest of the Bluetooth software much simpler.
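    Since our Python code cannot be linked yet (see the note at the end of this report), here is a comparable inquiry-scan sketch against the BlueZ C API that the Linux tools we used are built on; it is an illustrative stand-in, not our actual code. Compile with -lbluetooth:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>
#include <bluetooth/bluetooth.h>
#include <bluetooth/hci.h>
#include <bluetooth/hci_lib.h>

int main(void) {
  char addr[19] = {0}, name[248] = {0};
  int max_rsp = 255;

  int dev_id = hci_get_route(NULL);  // first available Bluetooth adapter
  int sock = hci_open_dev(dev_id);
  if (dev_id < 0 || sock < 0) { perror("opening socket"); exit(1); }

  inquiry_info *ii = (inquiry_info *)malloc(max_rsp * sizeof(inquiry_info));
  // 8 x 1.28 s inquiry; IREQ_CACHE_FLUSH ignores previously cached devices
  int num_rsp = hci_inquiry(dev_id, 8, max_rsp, NULL, &ii, IREQ_CACHE_FLUSH);

  for (int i = 0; i < num_rsp; i++) {
    ba2str(&(ii + i)->bdaddr, addr);  // Bluetooth MAC address
    memset(name, 0, sizeof(name));
    if (hci_read_remote_name(sock, &(ii + i)->bdaddr, sizeof(name), name, 0) < 0)
      strcpy(name, "[unknown]");
    printf("%s  %s\n", addr, name);
  }

  free(ii);
  close(sock);
  return 0;
}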

In our second, semi-wireless solution we have built a circuit, designed by our team member Long, that incorporates an Arduino Uno. A diagram for the circuit can be found at the following link: Uno Diagram

    This circuit leverages the fact that our client's Bluetooth switch has a wired input. The three LEDs are used to represent different modes, keeping our client informed of what the device is doing. The switch, S1 in the diagram, will be the client's new switch, whose input is monitored by the Arduino Uno. The two modules in the diagram represent our client's Bluetooth switch and a direct connection to the input of the wheelchair's motor. The Uno selects which module to communicate with according to the mode it is in: in chair mode the Arduino is hard-wired, in a sense, to the wheelchair motor, while in board mode it is connected to the wired input of the Bluetooth switch, which triggers the Bluetooth switch to communicate with our client's computer system via a wireless Bluetooth serial port.
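    A hypothetical sketch of this mode logic is below. The pin numbers are our assumptions, and the mechanism for toggling between modes is omitted because it is not yet fixed:

const int switchPin = 2;  // S1, the client's switch (assumed: grounds the pin when pressed)
const int chairOut  = 8;  // line toward the wheelchair motor module
const int boardOut  = 9;  // line toward the Bluetooth switch's wired input
const int chairLed  = 5;  // mode-indicator LEDs
const int boardLed  = 6;

bool chairMode = true;    // mode-toggle mechanism omitted; not yet decided

void setup() {
  pinMode(switchPin, INPUT_PULLUP);
  pinMode(chairOut, OUTPUT);
  pinMode(boardOut, OUTPUT);
  pinMode(chairLed, OUTPUT);
  pinMode(boardLed, OUTPUT);
}

void loop() {
  bool pressed = (digitalRead(switchPin) == LOW);

  // Relay the press only to the module selected by the current mode.
  digitalWrite(chairOut, (chairMode && pressed) ? HIGH : LOW);
  digitalWrite(boardOut, (!chairMode && pressed) ? HIGH : LOW);

  // Show the active mode on the indicator LEDs.
  digitalWrite(chairLed, chairMode ? HIGH : LOW);
  digitalWrite(boardLed, chairMode ? LOW : HIGH);
}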

Reflection:

Our project has not taken the usual path. Our preliminary difficulties were largely conceptual: just conveying to each team member, as well as our instructors, the magnitude of the impact our project would have on our client's life took several weeks. After that we were burdened with several unknowns pertaining to the feasibility of intercepting information from proprietary devices, while concurrently having to study the protocols used by Bluetooth implementations. On top of all this, we were only able to meet our client recently and experiment with his Bluetooth switch and computer system. This meeting was fruitful: we were able to obtain the Bluetooth address, class, and clock offset, as well as all the services provided by our client's Bluetooth switch.

Despite the challenges encountered thus far and those that remain, we feel that we have made consistent, if slow, progress. We have had many difficulties but never pursued any options that were dead ends. Given a little more time, we are confident that our device will not only be a success but will also change the life of our client and positively impact his caregivers.

NOTE:
I have attempted to upload the Bluetooth Python code and Uno code to the Cratel site so that I could link to them here, but have been unsuccessful (for security reasons, according to the post-submission page). I have been attempting this for over four hours and have decided to give up. If we can get that problem fixed, I will link our code here.

ALPHA TEAM – Prototype I Final Report: THE TRUEVIEW POWER SYSTEM

 

Introduction:

            One of the issues we discovered at the Cerebral Palsy Research Foundation was that those with limited mobility who rely on a power chair have trouble seeing obstacles behind and to the sides of the chair. The residents expressed concerns about the safety of those around them and actually described damage to their homes as a result of this issue. Some may not realize that the new lithium-powered chairs are actually quite powerful and can easily put holes in the sheet rock of a home. Even during our short time at the CPRF introduction meeting we could see this frustration first hand, as someone was almost run into by a resident who was backing up. Even at low speeds, the possibility of hitting objects behind the power chair is always present. This single issue was voiced with the most concern among all the ideas we presented to the CPRF residents, and we found it to be the most pertinent issue to investigate.

            Our solution to this problem is a versatile camera system, mountable on all popular models of power chair, with the capability to see objects behind, to the sides of, and in front of the chair. Our plan incorporated a front-mounted camera that can rotate at all angles to see obstacles to each side, as well as above and below, as the user moves forward. The front camera can also see objects that may be on the ground, which was of great concern to power chair users. In addition to the increased visibility this camera system provides, we also wanted to give a distance warning to users who are backing up and cannot determine how close an object is. This would be similar to the systems used in vehicles, but much more finely tuned and discreet, suited to power chair users in close quarters such as those found in a home. The system would also incorporate a user-friendly graphical interface that allows the user to switch between front and rear views and clearly displays distances and warnings of objects or people in their path. This is how we came up with the concept of the TrueView camera system for power chairs.

            The materials that we used for this prototype:

  • Microsoft Kinect
  • Arduino
  • Servos
  • High Definition Web Camera
  • Ubuntu Linux OS
  • Laptop

            The initial prototype currently has a working distance-detection graphical user interface that utilizes the Microsoft Kinect's powerful infrared emitter. The Kinect provides a live feed of whatever is behind the user and will display the distance to any object when the mouse is moved to the corresponding pixel on the screen. The prototype currently displays the distance of any object in meters, out to 6 meters and beyond, and will detect objects as close as half a meter away, with the potential to work even closer with further code evaluation. We also have a fully interactive front camera servo mount that can move a camera up, down, left, and right via a swiping gesture on a standard laptop touchpad. This allows easy interactive movement of the front camera and will eventually be implemented on the rear camera when we introduce more robust distance-detection upgrades.

Description:

            The first step that we took on our concept design was to select an appropriate camera and a software development kit to run the functions that we need. We decided upon the Microsoft Kinect for its powerful set of features. The Microsoft Kinect is an off the shelf item that may be purchased from many different retailers. See Amazon’s description here:

XBox Kinect

            You also need to make sure that you have the power accessory that enables you to plug into a laptop for development:

XBox Kinect Power Accessory

            This is important as the normal Xbox Kinect usually does not come with this.

            We then needed to select the best development kit we could to get distance detection and an accurate live feed. We eventually decided upon the Freenect library, an open-source library dedicated to Kinect functions on Linux. A further description of Freenect may be found here:

http://www.freenect.com/

            We found its ease of use and widely available function library superior to the Microsoft Development Kit we originally planned to use for the Kinect. It is highly recommended to install the Freenect libraries from the Ubuntu Software Market; we simply searched for Freenect in the market and installed all related items in the results. We have also provided links below to documentation describing how to install everything you need for this project.

            Once we decided to use Freenect and committed to Ubuntu Linux as a development platform, we wanted to start with an open-source interface that would provide a live camera feed and display on-screen distances based on what the camera was looking at. We finally decided upon RGBDemo, a collection of Kinect demonstration programs for Linux that uses the depth functions built into the Freenect library. Please see the main page:

http://labs.manctl.com/rgbdemo/

Distance Detection

   

            RGBDemo has quite a large assortment of demo programs and hence quite a bit of code associated with it. For best results on Ubuntu Linux, use the following command in the terminal to download the code:

            git clone --recursive git://github.com/nburrus/rgbdemo.git

            You may be prompted to install “git” via the “apt-get install git” command in order to download this code. The clone will download everything you need to start viewing the RGBDemo software. The code provided is mostly C++, so working knowledge of C++ programming is recommended for navigating the source files that will be downloaded. The next challenge will be compiling the code. Instructions for compilation of this software may be found here:

http://labs.manctl.com/rgbdemo/index.php/Documentation/Compiling

            One note we must make is that you will have to change one of the install libraries to get everything you need. The compiling instructions above say to install “libglut3-dev”, but that package is outdated and we had to use “freeglut3-dev” instead. This hasn't been corrected in the compiling instructions yet but is crucial. Also, the instructions say that installing “PCL” is optional, but we highly recommend going ahead and installing PCL from the following web site:

http://pointclouds.org/downloads/linux.html

            Be sure to use the Ubuntu Linux install directions to install PCL once you navigate to this page. This library seemed to be necessary in order to compile correctly without errors. The directions provided for compilation were otherwise used successfully as described above.

Kinect Project Setup

            Be sure to have the Kinect plugged into the wall with the power adaptor and connected via USB to the computer you are using before starting RGBDemo. Once you have followed the directions above for downloading, installing, and compiling, the RGBDemo executables will be in the “build/bin” folder of the RGBDemo directory. From that directory, the command to run the distance-detection user interface is “./RGBD-Viewer”. For any further questions or comments about installing and running this demo, please contact Eric Robinson at robinsonen@gmail.com. Please remember that the tutorials and instructions here assume at least an intermediate knowledge of Linux operating systems. For further questions about the commands used to install and update Linux packages, please see the Ubuntu forums here:

http://www.ubuntu.com/support

            The servo assembly is controlled by user input from the mouse. We followed a tutorial found on YouTube to get the servos to communicate with an interface developed in Processing. The two servos are connected to an Arduino that is programmed to mimic the x-y coordinates of the mouse: one servo receives the x-component and the other the y-component. When connected properly, the servos move to match the movement of the mouse. The tutorial, with links to the code for the Arduino as well as Processing, can be found here:

http://www.youtube.com/watch?v=jkDEXX06A-g
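            The Arduino side of that link can be sketched in a few lines: Processing sends the cursor's x and y (scaled to 0-180) over serial, and each servo mirrors one axis. The pins and the two-byte protocol here are our assumptions, not the tutorial's exact code:

#include <Servo.h>

Servo panServo, tiltServo;

void setup() {
  Serial.begin(9600);
  panServo.attach(9);    // x-axis servo (assumed pin)
  tiltServo.attach(10);  // y-axis servo (assumed pin)
}

void loop() {
  if (Serial.available() >= 2) {  // wait for an x,y pair from Processing
    int x = Serial.read();        // 0-180, scaled by the Processing sketch
    int y = Serial.read();
    panServo.write(x);
    tiltServo.write(y);
  }
}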

 

           Here is a video of our servo assembly in action:

http://www.youtube.com/watch?v=_IOP4DcRJSI


Reflection:

            Our first prototype was a step toward a fully functioning detector of objects behind people in wheelchairs who have a hard time seeing anything behind them. We really haven't run into many issues except with the Kinect. The main issue is that we ordered a developer Kinect online without realizing it wouldn't work on Linux, so we had to return it. This may hurt our final project because the developer Kinect had a near-mode that helped it see closer objects; the Kinect we are using can only detect objects as close as half a meter. Also, the Kinect has a motor to move it, but it's not very powerful and can easily break, so we decided not to use it. Instead we are using the servos to move the Kinect. This has added more work for us, but in the end it will give the consumer a better product. Our goal for the next prototype is to get the Kinect and the servos working simultaneously, which will allow us to detect objects 180 degrees behind a wheelchair (or anything else that needs to detect the nearest object). Our biggest challenge right now is getting the Arduino and the Kinect software to interact successfully. Once we get them working together, it's just a matter of adding code to make our final product as user friendly as possible.

 


 

Team 12 – Prototype I Final Report: Home Automation Project

Introduction

The Home Automation Framework project is a web application that can remotely control the electronic devices in one’s home using a web interface on a smartphone, tablet or desktop computer. The project facilitates ease of access and remote control of electronic devices for those who are busy, have a disability, or are not physically in the location of the device. Examples of such electronic devices may include lights, security cameras, electric door latches, TVs and computers.

Our initial prototype utilizes the small and inexpensive Raspberry Pi as the web server, together with an Arduino microprocessor. A web application was built using Python, in which the user can tap or click an icon to turn a lamp off and on as well as open and close window blinds. Developing the prototype involved dismantling an existing remote control product and soldering wires from it to GPIO pins on the Raspberry Pi, so that our program can drive the pins and thereby remotely control any devices plugged into the product's outlet adapters. In addition, we used a servo in conjunction with an Arduino to control the blinds.

Materials

Materials used in this project include the following:

  1. Raspberry Pi ($35)
  2. Stanley Remote Control Device (with RF-controlled power sockets)
  3. Arduino Uno
  4. Servo Motor
  5. Infrared Remote
  6. LEDs
  7. Breadboard
  8. Connecting wires
  9. Horizontal Venetian Blinds
  10. Resistors
  11. Header pin
  12. Lamp

Description of Major Components

Server: A Raspberry Pi is connected to the internet and runs an Apache web server, which serves HTML and CSS files to build the interface in the user's browser. The Apache WSGI plugin is installed, allowing the server to run Python code when the user presses buttons in the interface. The Python code handles the GPIO port switching that drives the Stanley RF remote control. The web interface and Python code are hosted in a GitHub repository.

Stanley Remote Controlled Outlets: For our initial prototype, we utilized an existing commercial product as a proof of concept. The Stanley product consists of a remote control with 3 on and off buttons, and three outlet switches. Each outlet switch is plugged into the wall, with a lamp or other AC device plugged into it. The buttons on the remote turn the outlets on or off. For our project, we took apart the remote, soldered wires onto the buttons, and are simulating button presses using the GPIO pins on the Raspberry Pi. This component will later be replaced with our own custom hardware.

Arduino Blinds Opener: An Arduino Uno device that sits on top of the blinds has a servo actuating the tilt rod, causing an open and close movement of the blinds, as shown in the video below. The Arduino & Raspberry Pi code is posted in a GitHub repository.

WebApp: The WebApp consists of icons that the user would click or tap, which will result in the devices being turned on or off. See image on the right.

Description of Implementation

  • Raspberry Pi Server:
    1. Install Raspbian on the SD card.
    2. Boot up the Pi and log in with SSH.
    3. Install apache2
    4. Install mod_wsgi 
    5. Install gpio-admin
    6. Download jQuery to /var/www
    7. Write index.html and main.css to create a basic web interface (in /var/www) which will send AJAX requests to be processed by the python script.
    8. Write python script (with extension .wsgi, and matching the WSGI spec) to process the query sent from the web interface and toggle the appropriate GPIO pins.
    9. Make wire harness to connect the GPIO pins to a breadboard.
    10. Connect GPIO pins to the leads from the Stanley remote control.
  • Arduino Blinds Opener: The Arduino blind opener’s components were described above. The steps to assemble it were as follows:
    1. Assemble the hardware as described in the below circuit diagram
    2. Load the Raw-IR-Decoder-for-Arduino code onto the Arduino. The code is posted in our Github repository linked above.
    3. While the program is running, open a serial view of the output of the Arduino.
    4. Using any IR remote, push a button to be assigned an Arduino function. In this case, we used an HP Media Remote from a HP dv2000 laptop, but any remote works.
    5. We used two buttons, up and down, to assign to the open and close functions in the Arduino.
    6. The serial view will output a properly formatted C array of integers. This array is the code the remote sends, as is captured by the Arduino.
    7. Edit the IR-commander code. There is a library file called ircodes.h; in that file, paste the array that was output by the Raw-IR-decoder sketch, just like the examples already in there.
    8. In the ircommander.ino file, there is a main loop that listens for an IR pulse. By calling the function IRcompare(numberpulses, [Name of code], sizeof([Name of code])/4), we check whether the code matches a saved code (see the sketch after this list).
    9. Every time we match a saved code, we execute a function. In this case, we output 0 or 180 to the servo, which controls the blinds.
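Steps 7-9 condense to a fairly small main loop. The sketch below is a hypothetical view of that loop: it assumes it lives inside Adafruit's IR-commander sketch (which provides listenForIR() and IRcompare()), and the signal arrays and servo pin are placeholders, not our actual captured codes:

#include <Servo.h>

// Placeholders: paste the arrays printed by Raw-IR-decoder (step 6) here,
// one per remote button, as described for ircodes.h in step 7.
int UpSignal[]   = {0};
int DownSignal[] = {0};

Servo blindsServo;

void setup() {
  blindsServo.attach(9);             // servo on the tilt rod (assumed pin)
}

void loop() {
  // listenForIR() and IRcompare() come from the surrounding IR-commander code.
  int numberpulses = listenForIR();  // blocks until a remote button is pressed

  if (IRcompare(numberpulses, UpSignal, sizeof(UpSignal) / 4)) {
    blindsServo.write(180);          // open the blinds
  } else if (IRcompare(numberpulses, DownSignal, sizeof(DownSignal) / 4)) {
    blindsServo.write(0);            // close the blinds
  }
}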


Diagrams

Raspberry Pi-Controlled Remote Lamp Actuator:

Raspberry Pi Circuit Diagram

Arduino-Controlled Servo & Blinds Implementation:

Arduino Blinds Actuator Circuit Board:

Venetian Blinds Controller (utilizing Arduino) - Demo


Light Controlled from iPhone (via Raspberry Pi) – Demo


Challenges

We encountered a few challenges in creating this initial prototype of the Home Automation Framework. One was the need for meticulous care in soldering wires onto the header pins to be connected to the breadboard, establishing a functional circuit to the Raspberry Pi. This took some time, as several team members had never soldered before and wanted to participate in the hands-on learning opportunity. Some joints thus were not soldered properly, which led to loose connections in the circuit and no signal from the Raspberry Pi. The wires were re-soldered and the breadboard connections remade until the circuit was eventually successful.

In addition, during testing (in which we utilized an LED) to ensure the Python code was written properly and the connections were made appropriately, the light failed to glow because it was connected backwards. This was a perfect opportunity to remind ourselves (or learn!) that LEDs have a polarity that must be considered when connecting them in a circuit. We also weren't originally sure whether the output from the Raspberry Pi was active high or active low, but this too we were able to figure out by trial and error.

Plans for Expansion

Future work plans include expanding control of the blinds to commands sent to the Raspberry Pi via the user's phone, and controlling other home appliances and devices such as an electric door latch and a television. The team has put a lot of effort into this project; meetings were scheduled and attended by each member when possible. In general, the first prototype was successful, each team member made it possible, and great teamwork was exhibited in this initial prototype of our Home Automation Framework.

Reflection

We were pleased with our success on this initial prototype; we accomplished our goal of being able to control a light remotely via commands sent to the Raspberry Pi. In addition, we were able to control a set of blinds with an Arduino and a servo motor, a process we plan to extend in the future to be accomplished by use of commands sent to the Raspberry Pi as well.

Our group members had varying levels of experience with coding in high-level languages such as Python and technical experience in areas like soldering. The more experienced members of our group did a great job of helping everyone learn and contribute. Even so, we all faced the normal challenges associated with a project of this type, from discovering why logic for code wasn’t working, to switched wires, to being mistaken about the polarity of LEDs. Overall, it was a great experience that we look forward to building on in the coming weeks with our next prototype.

Infinite Loop – Prototype I Final Report: Hazardous Weather Warning System
 

Introduction:

The LED warning system notifies local residents of emergencies within the area. Different LED colors allow residents to differentiate between warnings. Each shelter building has a person in charge of unlocking the door for residents to enter. Once that person is in the room, they flip a switch for a specific warning defined by the facility. For example, red could mean there is a severe thunderstorm warning, and it would be illuminated on top of each building via LED.

Our goal with the prototype is simply that: to create a single LED light that illuminates a certain color when a switch is flipped, with each individual switch producing a different color. This is the simplest way to start solving the problem presented to us.

Parts Used:

     1 Arduino Uno

     1 RGB LED

     3 Toggle Switches

     3 1K Ohm Resistors

     1 Breadboard

     1 Dell XPS

     Misc Wire

Powered by the laptop, throws of the toggle switches turn the LED different colors, and leaving multiple switches in the on position produces even more colors. The LED is set up with a common anode and three (3) cathodes.

(The long prong in the above image is the common anode.)

Description:

We originally started by simply hard-wiring the LED and completing the circuits for each cathode to produce different colors. As pictured above, the common anode receives +5 V and the cathodes are grounded. The +5 V and ground come from the Arduino, which is powered via USB from the computer. Each wire to a cathode also contains one (1) 1 kΩ resistor. To switch between colors we complete the circuit on the ground side. The results are in the pictures below.

     Blue, Clear, Cyan (left to right)

Once we achieved this we moved on to adding switches to the different cathodes in order to control them in a more manageable manner.  Inserting the switches between the ground and cathodes is the only thing that changed from above.

 

     Blue, Green, Red (left to right)

 

Wiring Diagram

Reflection:

The prototype for our project has been simple enough so far that problems have yet to arise. Once we move to program-controlled LED lighting via email and/or text message, we believe it will become more complicated. Most of what we have learned involves how RGB LEDs work and how they switch between colors.
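Our prototype is hard-wired, but as a pointer toward the program-controlled version, here is a minimal Arduino sketch for driving a common-anode RGB LED from three pins. The pin numbers and the color-to-warning mapping are assumptions for illustration:

const int redPin = 9, greenPin = 10, bluePin = 11;

// Common anode: pulling a cathode pin LOW completes that color's circuit.
void setColor(bool red, bool green, bool blue) {
  digitalWrite(redPin,   red   ? LOW : HIGH);
  digitalWrite(greenPin, green ? LOW : HIGH);
  digitalWrite(bluePin,  blue  ? LOW : HIGH);
}

void setup() {
  pinMode(redPin, OUTPUT);
  pinMode(greenPin, OUTPUT);
  pinMode(bluePin, OUTPUT);
}

void loop() {
  setColor(true, false, false);  // red, e.g. a severe thunderstorm warning
  delay(1000);
  setColor(false, true, true);   // cyan: lighting two cathodes mixes colors
  delay(1000);
}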



Marz – Prototype I Final Report: Campus App

 

Introduction:

We wanted to design a tool that helps campus-goers find places within their university and organize their college tasks. The Campus App is a smartphone application that uses fine GPS data to provide a map of campus for the user. It shows users where they are currently located, as well as buildings and interesting locations on campus that they might be looking for. It will integrate with school schedules and events, providing convenient social organization for any user.

At first we were thinking about coding pretty heavily for Android (directly coding the app to handle transactions), but we realized that platform portability would be much better served by implementing a web server. The app can then act as little more than a glorified bookmark to a webpage that offers the services of the Campus App. In the future this will make it much easier to move over to supporting the iPhone and other platforms.

No hardware is being designed at this stage of prototyping. Our main goal here was to get a working server that provides GPS and other interaction with your phone or computer. The system was implemented on a free web host and tested entirely on our phones and computers. The software is composed of JavaScript, PHP, and a small amount of SQL and CSS. See the description section below for details of each software file.

Description:

We created an account on a free web host that provided MySQL and PHP. A table named school was created in the database to store basic information about each school.

id    abbrev    name                        latitude    longitude    zoom
1     wsu       Wichita State University    37.7192     -97.2931     16
2     KSU       Kansas State University     39.1998     -96.581      15
3     HU        Harvard University          42.37242    -71.114      16

We then created five files: index.php, menu.php, map.php, javascript.js, and style.css. A meta tag for “viewport” is included in the header of each page so it scales properly in a mobile device's browser. The ID of the school is passed around in the URL, which allows each page to load data and features specific to that school.

index.php

Queries the database table school to gather a list of all the schools, and then outputs a link to menu.php for each school.

menu.php

Displays a set of buttons such as map, sports, events, courses, book store, etc. Only one of them works right now: the map button. Each button will point to a module, which is implemented as a separate page; for example, map.php is the maps module. In each module, menu.php is loaded in the background and hidden, which allows the menu to be toggled open quickly without any load time. This is accomplished with JavaScript and CSS.

Menu with the phone tilted horizontally; the buttons tile to fit the width.

Menu with the phone tilted vertically; the buttons tile to fit the width.

map.php

Uses the Google Maps API to display a map that is automatically centered on the school using the latitude and longitude stored in the school table. The zoom level from the table sets how closely zoomed in the map starts; after all, some schools are larger than others and require different zoom levels to display the entire campus at load time. A callback function that sets a marker at the GPS coordinates provided by HTML5 Geolocation is called every 5 seconds to update the user's position on the map.

javascript.js

Stores all the JavaScript for the web site, including a function that allows the menu to be pulled up on any module instantly without loading a new page.

style.css

Stores all the CSS for the website which is used to format the page into sections, and control the look and feel of the text and other elements.

The source code can be downloaded at http://marz.yzi.me/Marz_website.rar

Diagrams:

Flowchart of a user entering the campus map section of our website. map.php invokes the Google Maps scripts to get a map at the latitude, longitude, and zoom for the school that was selected. A high-precision marker is dropped onto the user's location.

Reflection:

The first prototype for our project, being purely programming, was not tough to complete. Zane was able to get the server up and running in a timely manner. We did not have to make our own map of campus since we just incorporated an already-made map from Google. The only problem that we have had so far is that sometimes the GPS location is not as precise as we would like it to be. The interface was simple to design for the first prototype and serves as a foundation for other additions. In future prototypes, we will be adding extra functionality to the app besides just GPS location on campus. There will be buttons for sporting events on campus, a course catalog, access to the book store, and many other features. These features, along with the actual hardware part of the project (unique informational kiosks around campus) will be further developed in the coming prototype(s).

Shade Technologies – Prototype I Final Report: Motorized Wheelchair Canopy

Introduction

Our Motorized Wheelchair Canopy intends to help people whose disabilities confine them to motorized wheelchairs and who need shelter in times of inclement weather, such as rain or extreme heat. Although manual wheelchair canopies do exist, no motorized wheelchair canopy is currently on the market. People confined to motorized wheelchairs often have difficulty deploying a manual canopy when needed, and they express interest in purchasing motorized canopies to improve their quality of living. Our Motorized Wheelchair Canopy intends to address these difficulties.

Our device will consist of a motorized canopy connected to a corresponding control mounted on the motorized wheelchair. It will use a rocker switch to obtain input from the user, controlling the direction and magnitude of extension/retraction of the canopy, and will use a battery to power the motor.

Materials Used for Motorized Wheelchair Canopy Prototype I

  • 1 Arduino Uno
  • 1 Adafruit Motor Shield
  • 1 solderless breadboard
  • 1 4-pin push button
  • 1 6-pin 2-position switch
  • 2 1kΩ resistor
  • 1 24V DC gear motor
  • Various connector electrical wires
  • 2 4.9″ turntable wheels
  • 2 1 1/4″ plate staples 
  • 4 10-24 3/4″ combination-head bolts
  • 1 10-24 2″ combination-head bolt
  • 6 10-24 nuts
  • 6 10-24 washers
  • 2 5/16″ screw eyes
  • Various 1/2″ wood pieces
  • Various wood screws and nails
  • Wood glue
  • 1 12V DC power supply

Description

Our Motorized Wheelchair Canopy Prototype I shows basic functionality of a simulated wheelchair canopy (metal arcs attached to a prototype base simulate a canopy attached to a motorized wheelchair). Our user controls (a 4-pin push button and a 6-pin 2-position switch), connected to our Arduino Uno programmed with appropriate code, allow the user to control the direction and magnitude of extension/retraction of the simulated canopy: one metal arc rotates while the other stays stationary. The detailed steps below show the progression of this Motorized Wheelchair Canopy Prototype I.

  1. Programmed Arduino Uno with the following Arduino Uno code (a hypothetical sketch of this kind of control code appears after this list).
  2. Placed Adafruit Motor Shield on top of Arduino, ensuring that Power Jumper in place.
  3. Built 4-pin push button and 6-pin 2-position switch circuit using solderless breadboard, 1kΩ resistors, and various connector electrical wires as shown in following breadboard diagram under “Diagrams”.
  4. Attached 24V DC gear motor to position M1 of Adafruit Motor Shield and 12V DC power supply to Arduino Uno using various connector electrical wires to verify that motor works as expected.
  5. Constructed 20″ prototype base using various 1/2″ pieces of wood, wood screws, wood nails, and glue and mounted 24V DC gear motor using 5/16″ screw eyes.
  6. Attached 1 4.9″ turntable wheel onto 24V DC gear motor and attached 1 4.9″ turntable wheel (level to previously attached) onto opposite side of prototype base using 1 10-24 2″ combination-head bolt, 2 10-24 nuts, and 2 10-24 washers.
  7. Formed 2 1/8″ aluminum rods into 2 metal arcs (10″ short radius) manually.
  8. Mounted 1 metal arc to 2 4.9″ turntable wheels attached to prototype base using 2 1 1/4″ plate staples, 4 10-24 combination-head bolts, 4 10-24 nuts and 4 10-24 washers.
  9. Drilled holes into opposite sides of 1 side of prototype base to insert and secure other metal arc using wood glue.
  10. Set overnight to ensure stability.
  11. Verified that Motorized Wheelchair Canopy Prototype I works as expected.
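The actual sketch is linked in step 1; what follows is a hypothetical reconstruction of that kind of control code using Adafruit's AFMotor library for the original Motor Shield. The pin choices and speed value are our assumptions:

#include <AFMotor.h>

AF_DCMotor canopyMotor(1);     // 24V DC gear motor on port M1
const int runButtonPin = 2;    // 4-pin push button: drive while held (assumed pin)
const int directionPin = 3;    // 2-position switch: extend vs. retract (assumed pin)

void setup() {
  pinMode(runButtonPin, INPUT);  // pulled down externally through the 1 kΩ resistors
  pinMode(directionPin, INPUT);
  canopyMotor.setSpeed(200);     // magnitude of extension/retraction speed
}

void loop() {
  if (digitalRead(runButtonPin) == HIGH) {
    // The 2-position switch selects the direction; the button drives the arc.
    canopyMotor.run(digitalRead(directionPin) == HIGH ? FORWARD : BACKWARD);
  } else {
    canopyMotor.run(RELEASE);    // stop when the button is released
  }
}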

Side view of completed Motorized Wheelchair Canopy Prototype I

 

Top view of completed Motorized Wheelchair Canopy Prototype I

Diagrams

Prototype diagram for Motorized Wheelchair Canopy

 

Breadboard diagram connected to Adafruit Motor Shield

 

Circuit diagram of breadboard connected to Adafruit Motor Shield

Reflection

Prototype I of our Motorized Wheelchair Canopy meets our expectations for what we envisioned: it allows a user to control both the direction and magnitude of extension/retraction of a simulated wheelchair canopy. For Prototype II, we look forward to creating a motorized wheelchair canopy with even more functionality by adding better user controls and sensors to prevent overextension/overretraction. Later on, we look forward to scaling up and implementing the Motorized Wheelchair Canopy on an actual motorized wheelchair.

Prototype I of our Motorized Wheelchair Canopy did offer us some difficulties. We ran into trouble finding the exact parts we needed. In future implementations, we would like to use smaller turntable wheels or custom-made wheels, a more powerful gear motor that would be able to handle the weight of a real wheelchair canopy, and a large rocker switch for easy use by our clients at CPRF. We also need to plan for a battery-powered configuration as opposed to the 12V DC power supply that we currently use.

Out of Time – Prototype I Final Report: Hot/Cold Peltier Pad

 

Introduction:

For our senior design project we have decided to work with the C.P.R.F. to design a simple heating and cooling pad that will make the clients at the C.P.R.F. more comfortable in everyday life. We debated several different designs and ways to accomplish our goal. After much research we decided to use a Peltier thermo-electric cooling module whose temperature will be regulated by an Arduino Uno. This unit will be attached to a water pump that pumps the temperature-controlled water through a pad that the user can place anywhere. We approached the residents at C.P.R.F. with our design idea and received some wonderful feedback. Many of the residents we talked to were thoroughly excited to see a working product, and they also gave us some specifications that we are trying to incorporate into our design.

Description:

We have used a 40x40 mm Peltier cooler attached to a power supply. A computer CPU water cooler is attached to the hot side of the Peltier to keep the unit at its best operating temperature. To the cold side we have attached a CPU block that pulls the cold from the unit to a computer fan, which blows the cool air out, providing evidence that our unit will be capable of functioning the way we plan.

Parts Used:

  • 40mmx40mm Peltier Thermo-Electric Cooling Module
  • Corsair High-Performance Liquid CPU Cooler
  • 12.0v DC Power Supply
  • Intel CPU Heat sink
 

Performance:

We collected performance data over a 25-minute period. Keep in mind this was using a power supply that could not provide the peak power the Peltier unit is rated for.

Time (min)    Hot Side (°F)    Cold Side (°F)
0             75               75
1             72               79
2             80               72
3             78               74
6             82               71
8             80               68
10            82               66
12            84               66
16            86               62
18            84               63
21            85               66
23            85               64
     

Diagrams:

 

 Prototype Mark I

 

Reflection:

When we first started the project we had a basic idea in mind, and through several tests and extensive research we have come up with a working unit. The unit so far does not incorporate the water pump or the water circulation pad. We hope that our first prototype will perform well enough using only air circulation that the true potential of this device will be obvious to the residents of CPRF. We still have a ton of work to do getting the Arduino code fully functional and implemented into our design. We also need to find a more powerful power supply, as our existing DC supply only provides ~42 watts of power (6 A @ 7 V). The current Peltier unit performs extremely well, and only at 30% of its rated power, but an upgrade to a more powerful unit might prove necessary.
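The Arduino control code is still in progress, but the direction we are heading might look like the following sketch: PWM on a logic-level MOSFET (per the bildr link below) driving the Peltier, with a crude on/off threshold standing in for the PID control we intend to borrow from the linked libraries. The sensor type, pins, and target temperature are assumptions:

const int mosfetGatePin = 9;   // RFP30N06LE gate on a PWM pin (assumed)
const int tempSensorPin = A0;  // e.g. a TMP36 analog temperature sensor (assumed)
const float targetF = 66.0;    // desired cold-side temperature (assumed)

float readTempF() {
  float volts = analogRead(tempSensorPin) * 5.0 / 1023.0;
  float tempC = (volts - 0.5) * 100.0;  // TMP36: 10 mV/°C with a 500 mV offset
  return tempC * 9.0 / 5.0 + 32.0;
}

void setup() {
  pinMode(mosfetGatePin, OUTPUT);
}

void loop() {
  // Crude bang-bang control; a PID loop is the planned refinement.
  analogWrite(mosfetGatePin, readTempF() > targetF ? 255 : 0);
  delay(1000);
}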

 

Resources:

http://garagelab.com/profiles/blogs/how-to-use-a-peltier-with-arduino

http://www.benchtest.com/wcooler1.html

https://sites.google.com/site/arduinotempcontrol/

http://www.dansdata.com/pelt.htm

http://www.amazon.com/Brushless-Submersible-Water-Ideal-cooling/dp/B0051HI3OM/ref=pd_sbs_indust_1

http://www.amazon.com/Corsair-Cooling-Hydro–High-Performance-CWCH60/dp/B004MYFOE2/ref=sr_1_1?s=electronics&ie=UTF8&qid=1348797987&sr=1-1&keywords=Corsair+Cooling+Hydro-Series+All-in-One+High-Performance+Liquid+CPU+Cooler+CWCH60

http://www.paragonmed.com/warmsys.shtml

http://bildr.org/2012/03/rfp30n06le-arduino

http://en.wikipedia.org/wiki/PID_controller

https://github.com/br3ttb/Arduino-PID-Library

https://github.com/osPID/osPID-Hardware

http://www.brewpi.com

http://www.homebrewtalk.com/f51/temperature-monitoring-control-arduino-155408