Cin-D LOU

Related Pages: Cin-D LOU, Cin-D LOU EyeLids, Cin-D LOU EyeLids2, Cin-D LOU EyeLids3, Cin-D LOU head, Cin-D LOU/laptop, Detroit Droids



Hive13 Project: Cin-D LOU House Bot
Image: Cin-D.jpg
Status: Inactive
Start Date: 1/1/2012

December 2012 Update: Members Jon and Craig stand in and wave their arms to show how Cin-D LOU's Kinect 3D vision performs real-time recognition and reconstruction of their skeleton joint positions.

CinD.kinect2s.png

There has been a lot of behind-the-scenes work on the Cin-D LOU project these past months, mostly focused on person recognition and eyeball/head tracking. Visible results will start showing up in the coming weeks. This has all been enabled by the Kinect.

If you don't know the Kinect story: it is a Microsoft product, a peripheral originally developed exclusively for the Xbox 360 video game system. This USB-connected infrared camera provides a distance/depth image for objects placed in its field of view. It is the result of many years of academic research in computer vision, with hardware developed by the Israeli company PrimeSense in close cooperation with Microsoft's software and algorithms. The Microsoft Kinect launched on November 4, 2010, and was a major commercial success: it sold 8 million units in its first 60 days on the market, earning a Guinness World Record as the fastest-selling consumer electronics device in history.

On the same day as the Kinect's release, the NYC open-source hardware company Adafruit announced a $2,000 bounty for the first person to produce open-source drivers that would let anyone access the Kinect's data. When Microsoft initially reacted negatively to the bounty, Adafruit increased it to $3,000. Six days later, on November 10, Hector Martin claimed the bounty with the first public version of a working driver (http://www.adafruit.com/blog/2010/11/10/we-have-a-winner-open-kinect-drivers-released-winner-will-use-3k-for-more-hacking-plus-an-additional-2k-goes-to-the-eff/). He then joined Josh Blake's OpenKinect project, which continues to improve and maintain the drivers to this day.

Member Craig got a Kinect on November 27, 2010 and started the HIVE Kinect project (http://wiki.hive13.org/Kinect).

More recently, Jim got a copy of the MAKE book "Making Things See" by Greg Borenstein (http://www.amazon.com/Making-Things-See-Processing-MakerBot/dp/1449307078). The hack and the story have made it into the mainstream media. Craig dusted off the Kinect and made it work as described in the book. Jim is still making head, neck, and eyeball parts on the laser cutter. Stay tuned...
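
For a flavor of what those open drivers provide, here is a minimal sketch that grabs one depth frame. It assumes the OpenKinect Python bindings (the "freenect" module) are installed and a Kinect is plugged in over USB:

    # Minimal sketch: grab one depth frame via libfreenect.
    # Assumes the OpenKinect "freenect" Python bindings are installed.
    import freenect

    def grab_depth_frame():
        # One 640x480 frame of raw 11-bit depth values.
        depth, _timestamp = freenect.sync_get_depth()
        return depth

    if __name__ == "__main__":
        depth = grab_depth_frame()
        # Raw values grow with distance; 2047 means "no reading".
        print("center-pixel raw depth:", depth[240, 320])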


See the page about making the eyelid frame parts: http://wiki.hive13.org/index.php/Cin-D_LOU_EyeLids


Cin-D LOU is the HIVE's house bot project: an autonomous robot (5'-2" tall) soon(?) to be able to self-navigate around the HIVE space, with some initial (and ever-growing?) ability for interaction. This project is our version of a home-built C-3PO-like protocol droid.

We are using a motorized wheelchair as the modular base unit. The initial "skin" has a nominal female form, but can be swapped out for a male or androgynous form.

Our project team is an informal group of like-minded folks. We hold our weekly team meeting on Tuesdays, before/after the regular business meeting, and make progress each week. Individuals and small groups take on specific tasks, and interested bystanders can follow the progress and contribute as things evolve. The purpose is to learn and have fun!

The project has four initial areas of effort.

(1) Locomotion - Thanks to Jon, we now have a slightly used, model 1122 Jazzy motorized wheelchair from Pride Mobility Products (http://www.pridemobility.com/resourcecenter/Downloads/Product_Owners_Manuals/product_owners_manuals.asp). The base came in with years of use on it and is fairly heavy with its batteries and power circuits. We are evaluating different control schemes for recognizing position and for path planning and execution. The HIVE's MS Xbox Kinect might be one input, plus others that are TBD.

Cin-D.loco.01.jpg Cin-D.loco.02.jpg Cin-D.loco.03.jpg Cin-D.loco.04.jpg

Above are some photos taken while stripping down and cleaning up the motorized wheelchair base. Formula 409, WD-40, and elbow grease do wonders on grime, rust, and nasty hairballs wrapped around axles. It has been re-assembled in stripped-down form and is ready for further hacking: safety bumpers, line-tracking navigation parts, and everything else.

The project team is currently looking into different schemes to tap into the joystick and/or power board to get computer control of the drive motions. As always, others out on the web are already doing similar things (a motor-driver sketch follows these links):

http://www.youtube.com/watch?v=XyPG9KN0ryI

http://www.youtube.com/watch?v=RpuQj7ocyyM

http://www.youtube.com/watch?v=y1PPM_9QF2k

http://www.dimensionengineering.com/products/sabertooth2x25

https://www.noisebridge.net/wiki/Noise-Bot

http://cwhatidid.com/view/?t=Build%20a%20Robot%20From%20A%20Power%20Wheelchair&itemParent=294

http://myrobotnstuff.blogspot.com.au/2012/07/circuit-for-wheelchair-robot.html
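
As one hedged illustration of the motor-driver route, here is a minimal sketch for the Sabertooth 2x25 linked above, in its "simplified serial" mode (one command byte per motor). The port name and 9600-baud DIP-switch setting are assumptions, not decisions:

    # Sketch only: Sabertooth 2x25 in simplified serial mode.
    # Port name and baud rate are assumptions; match the DIP switches.
    import serial

    def set_motors(link, m1, m2):
        # Motor 1: 1 = full reverse, 64 = stop, 127 = full forward.
        # Motor 2: 128 = full reverse, 192 = stop, 255 = full forward.
        # A zero byte is the emergency stop for both motors.
        link.write(bytes([m1]))
        link.write(bytes([m2]))

    if __name__ == "__main__":
        with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as link:
            set_motors(link, 96, 224)  # ~half speed forward, both motors
            set_motors(link, 64, 192)  # stop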

Once we get computer control of the motion, our first thought is to start with a simple line-following strategy like those in the links below (a bare-bones proportional-control sketch follows the list):

https://www.google.com/search?q=line+following+robot&tbm=isch

http://www.nxtprograms.com/line_follower/steps.html

http://online.physics.uiuc.edu/courses/phys405/P405_Projects/Fall2005/Robot_project_jaseung_.pdf

http://en.wikipedia.org/wiki/Mobile_robot#Line-following_robot

http://ikalogic.com/tut_line_sens_algo.php

http://www.tombot.net/beam/linefollowingrobot.html

http://www.inpharmix.com/jps/PID_Controller_For_Lego_Mindstorms_Robots.html

http://webdelcire.com/wordpress/archives/619

http://www.pololu.com/docs/0J21/7.c
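
In the spirit of the PID write-ups above, the core idea fits in a few lines. This is a sketch under stated assumptions: read_line_position() is a hypothetical stand-in for whatever line sensor we end up mounting, and the gain is a guess to be tuned on the real base.

    # Proportional-only line following, per the links above.
    # read_line_position() is hypothetical; here it simulates a sensor.
    KP = 0.8          # proportional gain; tune on the real robot
    BASE_SPEED = 0.4  # fraction of full speed

    def read_line_position():
        # Hypothetical sensor: offset in [-1, +1], 0 = line centered,
        # positive = line is off to the robot's right.
        return 0.1  # simulated constant drift

    def wheel_speeds():
        error = read_line_position()
        correction = KP * error
        # Line to the right: speed up the left wheel to steer back onto it.
        return BASE_SPEED + correction, BASE_SPEED - correction

    if __name__ == "__main__":
        left, right = wheel_speeds()
        print("left=%.2f right=%.2f" % (left, right))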

We want to quickly graduate to autonomous navigation in an open space and are currently investigating the open-source Willow Garage Robot Operating System (ROS) capabilities; a minimal ROS node sketch follows the link below.

http://www.willowgarage.com/pages/software/overview
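
To make that concrete, here is a minimal sketch of a ROS node that streams velocity commands on the conventional cmd_vel topic. The topic name and the existence of a wheelchair base driver that consumes it are assumptions:

    # Sketch: stream Twist velocity commands at 10 Hz on cmd_vel.
    # Assumes a base driver node exists to translate these into
    # wheelchair motor commands.
    import rospy
    from geometry_msgs.msg import Twist

    def creep_forward():
        rospy.init_node("cind_lou_base_test")
        pub = rospy.Publisher("cmd_vel", Twist, queue_size=1)
        rate = rospy.Rate(10)
        msg = Twist()
        msg.linear.x = 0.1  # m/s, a very slow forward creep
        while not rospy.is_shutdown():
            pub.publish(msg)
            rate.sleep()

    if __name__ == "__main__":
        creep_forward()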

(2) Arm and Hand - Here we're thinking of using laser-cut acrylic for the fingers, hand, wrist, forearm, elbow, upper arm, and shoulder joints. Planned actuation comes from small forearm-mounted DC motors with leadscrews driving tendons. Stepper motor drives with gear reduction are anticipated at the elbow and shoulder. The ability to execute programmed motion sequences like pointing, waving, presenting a drink, making a fist bump, or shaking hands, without (or with) feedback, is TBD using networked Arduinos or such for control.
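
One way to picture those programmed motion sequences is as timed keyframes of joint angles replayed to the Arduinos. Everything here is an assumption for illustration only: the port, the joint names, and the one-line "joint:angle" serial protocol.

    # Sketch of a keyframed motion sequence (a wave), replayed over
    # serial to a hypothetical Arduino joint controller.
    import time
    import serial

    WAVE = [  # (seconds from start, {joint name: degrees})
        (0.0, {"shoulder": 90, "elbow": 45}),
        (0.5, {"elbow": 90}),
        (1.0, {"elbow": 45}),
        (1.5, {"elbow": 90}),
    ]

    def play(sequence, port="/dev/ttyACM0"):
        with serial.Serial(port, 115200, timeout=1) as link:
            start = time.time()
            for t, joints in sequence:
                time.sleep(max(0.0, start + t - time.time()))
                for name, angle in joints.items():
                    # Assumed protocol: one "name:angle" command per line.
                    link.write(("%s:%d\n" % (name, angle)).encode())

    if __name__ == "__main__":
        play(WAVE)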

(3) Head - Here we've got a styrofoam head with a vacuum-formed replaceable female face as a possible starting point. There are some initial mechanical designs for eyeballs (with blue LED pupils) that track with up/down and left/right motion, plus working eyelids and eyebrows: http://www.youtube.com/watch?v=uYYqycOWH5g Mouth and speech capabilities are to be determined. Neck functions to turn and nod the head would be included. Again, we anticipate the ability to execute programmed motion sequences (nodding up/down for yes, shaking left/right for no, and eye and head movements during motion tracking) without (or with) feedback. Here is the start: http://wiki.hive13.org/Cin-D_LOU_head
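
For the eye/head tracking piece, the geometry is simple enough to sketch: map a tracked target's pixel position in the Kinect image to pan/tilt angles for the eye mechanism. The field-of-view numbers are the Kinect's nominal specs; everything else is illustrative:

    # Sketch: pixel position of a tracked target -> eye pan/tilt angles.
    FOV_H = 57.0  # Kinect depth camera horizontal field of view, degrees
    FOV_V = 43.0  # vertical field of view, degrees

    def target_to_eye_angles(px, py, width=640, height=480):
        # Returns (pan, tilt) in degrees; (0, 0) means straight ahead.
        pan = (px / float(width) - 0.5) * FOV_H
        tilt = (0.5 - py / float(height)) * FOV_V
        return pan, tilt

    if __name__ == "__main__":
        # A target up and to the right of image center.
        print(target_to_eye_angles(480, 160))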

(4) Body - Here we're looking for the HIVE artist types to make a simple, light, quick body space frame from X-Acto-knife-cut or laser-cut cardboard or other layers, using the Autodesk 123D Make capabilities at http://www.123dapp.com/make. This would give us an egg-crate or open-grid structure with the outer body profile and cavities inside for mounting the necessary controls and conduit paths.

Finally, if you have read this far, here is one of many inspirational robot sites out on the web:

http://www.takanishi.mech.waseda.ac.jp/top/research/wabian/index.htm