Wednesday, November 30, 2011

PID (proportional integral derivative)

So, first, I will have to say I am as much a student of how to write a PID as everyone else on the team.  I taught myself the P last year and attempted the I after States.  It ran, but I did not get much of an improvement.  After reading the "PID How To" again, I am not sure I was doing it correctly.

Here is a nice link telling us how to do it.  I think the MATH junkies on the team should push me to make some time to get this done on some cold winter weekend.  We could use it next year!

"PID How To"

You can Google PID and NXT and FLL and find tons of info.  Too much, really.  Here is a great example of a PID written in NXT-G.  In past years I have looked at it and it was beyond me.  But I suspect we have learned a lot more this year and might be able to use it to help us write the I part of the code.

PID using NXT-G 2.0

A great video showing what a PID can do for an NXT if you know how to write the code.  Maybe some day I'll figure it out.  Hopefully one of you will beat me to it.  Maybe we should have a line following contest this winter.
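For the curious, here is a bare-bones sketch of what the three letters compute on each pass of the loop.  It is plain Python with a made-up target and made-up gains (KP, KI, KD), nothing tuned for a real robot; our actual code would be NXT-G blocks.  The P term reacts to the current error, the I term to the accumulated error, and the D term to how fast the error is changing.

# Minimal PID sketch -- illustration only, numbers are not tuned.
TARGET = 50                    # assumed reading on the line edge
KP, KI, KD = 0.25, 0.01, 0.1   # made-up gains

integral = 0
last_error = 0

def pid_correction(reading):
    """Return a steering correction from a single light reading."""
    global integral, last_error
    error = TARGET - reading          # P: how far off are we right now?
    integral += error                 # I: error accumulated over time
    derivative = error - last_error   # D: how fast the error is changing
    last_error = error
    return KP * error + KI * integral + KD * derivative

# Quick check with a few fake readings (no robot needed):
for reading in (50, 60, 70, 55, 45):
    print(reading, round(pid_correction(reading), 2))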



More PID reference material.
One - You may not need a PhD but you need a Master's degree.  Not helpful for me.
Two - Interesting Stop Motion Video in Spanish with English subtitles.
Three

Table Lights and Kelvin as a new Measure.

It is no secret that the Yappin' Yodas have struggled with light sensors.  It has been a two year struggle and we are only just beginning to get some traction.  The list of variables and the built-in limitations of the NXT kit really make it a challenge.

One area that this coach investigated is the lighting.  Each year we go to an event and the lighting on the table looks almost blue vs. the warm yellow at our house.  Last year at States, light sensor driven programs that worked flawlessly at home failed.  Folks are still sore over that.  But, to reinforce persistence, we still fight with them.

Further, rookie teams are not even advised to put a light over the home practice mat.  To me, if you are using light sensors, that is just one more variable when you get to the competition.  Supposedly, a well shielded light sensor will do well regardless of ambient light.  I am still from Missouri on that one.

I looked at the lamps used at Bishop Brady and I was told by a judge they were spec'ed by USFirst.  They were GE T8 32W Daylight 6500K tubes, versus our old lamps, GE T12 Ecolux 40W Chroma, which have a color temperature of about 5000K.  For $8 I was able to find a 6500K tube in T12, so that is what we use now.  If nothing else, we are trying to isolate the variables.

To learn more about Kelvin and color temperature go HERE. And if you really want to see the marketing science behind lighting, read this GE brochure.  If nothing else, it makes you realize there is real science behind EVERYTHING around you.

Tuesday, November 29, 2011

NHPR podcast from Bishop Brady re FLL.

I am sure this link will expire at some point, but for now here it is if you want to share it with anyone.

NHPR "Word of Mouth" segment

Congrats Yappin' Yodas!

Coach

Score sheet for reference

I figure by putting it here we can find it in the future....FAST!



Brain vs. Brawn in 13 seconds

After dinner I decided to dig up last year's MyBlock with our proportional line follower routine.  I opened Justin's "Baby Fish" and copied it to a blank or 'new' file.  I copied the loops from last year's proportional line follower (PLF) and pasted them into this new file.  Merged it together, redefined the variables since they don't copy over, and POOF!  Less than 5 runs to get things right and it worked three times in a row.  See video.

Notes:  
Original file lacked BURM.  Fixed.  Ahem!
Original file had very high power levels (90) causing wheel slip.  All reduced to 60. Ahem!
Changed the really long run from 1900 to about 1130 degrees...using the old NXT RB to test for the sweet spot.
I played with the gain value, which was set at 0.3.  Gain, or "correction factor," is a difficult concept.  Recall last year we wanted to "cool" or "modulate" the correction factor?  0.4 was too sloppy and could lose the line.  0.2 was a bit too wiggly as it fought to stay on the EXACT edge of the line.  I settled on 0.25 gain, but 0.3 might have been a better number (see the sketch after these notes).  Upped the power from 20 to 30 on the PLF motor block so it would go faster along the line.  We might even go higher if you want to test it.  I was doing a quick and dirty proof of concept.  You guys can polish it up.
Changed the loop which contains the PLF logic to a 15 second loop which will hold it on the wall.
That's about it.
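For reference, here is roughly what that gain experiment boils down to, sketched in plain Python.  The target of 50, the base power of 30, and the sample reading are placeholders; the real PLF is an NXT-G MyBlock, so this just shows the arithmetic behind it.  The larger the gain, the larger the correction for the same error.

# Proportional line follower arithmetic -- placeholder numbers only.
TARGET = 50        # the 50% intensity edge we aim for
BASE_POWER = 30    # power on the PLF motor block

def motor_powers(reading, gain):
    """Split power between the two drive motors based on the error."""
    error = reading - TARGET
    correction = gain * error
    return BASE_POWER + correction, BASE_POWER - correction

# Same reading, different gains:
for gain in (0.2, 0.25, 0.3, 0.4):
    print(gain, motor_powers(70, gain))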

"Dry run" or "pre pen to paper" (or fingers to keyboard) considerations:  Aka, never "ready fire aim!"
I guess the pre-engineering was deciding which sensor to use.  Since we calibrate P1, our preference should be to use P1.  There were two options for P1: either the North or the South edge of the line.  Based on the objective of not spilling the purple bacteria, the decision was easy.  Read: we got lucky.  But it is something smart folks look at first.  In theory, there were four options: P1 on the North or South edge, or P2 on the North or South edge.  A quick eyeball with the bot on the mat and voila.


Coach's Coaching Comments:
13 seconds of Brains vs. 7 seconds of Brawn.  I guess I don't understand the lack of interest in re-using earlier learnings.  You want to build upon what you learn every day, whether in robotics or in life.  Even if you are not sure you learned it correctly, you MUST have the courage to test your learnings.  Sometimes failing is the best way to learn.  Avoiding testing what you have learned will cause you a great deal of heartache down the road.

Enjoy the video and be the best you can be! Maybe after States we can look at a more robust line follower and add the I and D of PID.


Travails of the dreaded light sensors Part II

After you left last night I tried to get a good video to add to Part I.  But, after all of Max's work on the new ATB5 program, it would not work when put into Yoda11.  It would not work when put into PinkBacteria either.  But, by itself, it would work.  I reviewed the code and could find nothing that would cause this problem.  I even posted on the FLL forum asking for help.  I will spare you the digital ink.  You can look it up if you want.

For some reason this morning I ran PinkBacteria without the rake attachment.  AND IT WORKED!  Huge HMMMMMMM?  I ran Yoda11 without the rake and it AGAIN WORKED!  Another big HMMMMMMM??  I again looked at the code, and ArmDown has nothing to do with anything related to motor B or C or any sensor port.  It made no sense!

Have you figured it out yet???  What took me 3 hours to sort out is as clear as the nose on your face and mine!

Think MECHANICAL!  Think GRAVITY.  Think about the LEVER or about the MECHANICAL ADVANTAGE the rake has on the bot if not 100% supported by the mat.  Stop blaming the program!!

Getting closer?


Well, it would seem that not 100% of the rake's weight was off the bot.  This pulled LS1 closer to the mat, so it was not getting a good read.  MB_MAX is currently looking for something > 99 intensity.  A tad extreme, but it does seem to be working off of the brown mat.  You can change it if you want.  We learned last year that when the LS touches the mat, the reading is ZERO because the light generator is effectively turned off.  Even a white mat touching the light sensor will still read ZERO.

So, see the video below.  Time to see if the Table attachment causes us similar pain.


Monday, November 28, 2011

Travails of the dreaded light sensors Part I

Let us review a bit.

We have a few things which are causing us headaches.  Headaches only because we have to REALLY use our grey matter.  AKA, use our brains.

Last year we followed a black and white edge.  Recall the proportional line follower and our desire to be at 50% intensity.  And, the lines we used to align to were surrounded by WHITE!  Well, this year things are different.

The two major killers for us this year are:
  • We are using the crude NXT-G calibration block, which is limited to calibrating just one port.  All the other ports have to use the "master" port's calibration.  We have seen this cause inaccuracies.  For reference, we calibrate P1, so P2 has to live with whatever P1 read.
  • The mat has COLOR, and the white and black areas are small targets relative to last year's lines, which impacts the viability of our ATL (Advance To Line) strategy.
So, we really have to think outside the box.  I believe we assumed we could simply re-use last year's code and did not give it much more thought.  Myself included.  But, when it failed to work, we needed to LOOK and not just react in frustration.

What we have done:

Perhaps I helped a little bit too much, but I wrote a program to read BOTH the intensity and the raw light sensor readings from each light sensor (or port) to the display.  This was HIGHLY educational for the team.  The team could now SEE how some areas of the brown would give us 0 intensity.  Madness, but if you are willing to put up with the crude NXT-G calibration block, these are the cards you are dealt, and we must learn how to play them to win.

On Monday we found we could get a 100 intensity light reading on brown for PinkBacteria.  So, we went for a "sniff for black first" strategy.  Today, we seem to find 15 intensity on brown.  HUH??!?!?!  This is still a bit concerning, but tonight I feel like we were really watching and thinking, more than on Monday.

Very odd stuff, but I am not discouraged.  At least now we UNDERSTAND what is eating our lunch.  I can live with that much more easily than resigning ourselves to being unable to understand, or even correctly identify, the variables causing us pain.  You need to understand this too! If you don't, ASK QUESTIONS!

As of tonight we have at least three options.

Option A (MB_ATB4):

This program sniffs for black first on brown.  We had black set at 20 intensity or lower.  It would find values less than 20 and turn one motor...Lost Advance To Black (LATB).  But....after tweaking the threshold down to 3, it works great for the brown, light blue and dark blue.

Option B (MB_ATB5) Now MB_Max

Because of the insanity we were having with 20 intensity failing, Max made an ATB5 which sniffs for white first, like 90 intensity!  And then it advances to black, etc. etc.  It too worked well.  (Both approaches are sketched below, after Option C.)

Option C (Use both depending on which works best on the various 'non white' backgrounds)

We may find that one or the other works better on the various backgrounds and use different MBs for each program.  Testing is required.
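To make the two strategies concrete, here is a rough sketch of the decision logic only, in plain Python.  The 3 and 90 thresholds are the ones mentioned above; the list of readings, the function names, and the bogus-zero-on-brown example are invented for illustration, and the real MyBlocks are NXT-G loops driving a motor.

# Sketch of the two "advance to black" strategies -- logic only.
BLACK = 3     # Option A's tweaked-down black threshold
WHITE = 90    # Option B sniffs for white first

def option_a(readings):
    """Stop at the first reading at or below the black threshold."""
    for i, r in enumerate(readings):
        if r <= BLACK:
            return i      # index where we would stop the motor
    return None           # never saw black

def option_b(readings):
    """Require a white reading first, then stop at the next black."""
    seen_white = False
    for i, r in enumerate(readings):
        if r >= WHITE:
            seen_white = True
        elif seen_white and r <= BLACK:
            return i
    return None

# Fake run: brown, a bogus 0 on brown, brown, white stripe, then black.
fake_run = [15, 0, 20, 95, 100, 40, 2]
print(option_a(fake_run), option_b(fake_run))   # -> 1 6

On this fake run, Option A stops on the bogus 0 in the brown, while Option B waits for white and stops on the real black, which is exactly the insanity ATB5 was written to dodge.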

So, at the risk of making this way too confusing, we now UNDERSTAND what is going on. We may not like it, but we understand.  If we were really good, we'd log the two sensors' readings during the run, then analyze the data and use it to help us decide what threshold to use for match 2.  My point: we need to watch what happens and know what to look for.

See Part II for more drama.

Tuesday, November 22, 2011

Memorable design concept

Before I deleted this video from GoogleDocs I thought I would post it on YouTube for the blog.  Simplicity in 9 seconds.  Great work guys, keep it up.  We called this the "Lunar Module" concept.  Who says indices are limited to being used only in base?


Brian Davis' excellent post

On the FLL forum I found a great post with videos of programming tips from Brian Davis.  They are longish and best taken one at a time when you don't have distractions.  All sorts of nuggets.  Even some cool math!

Forum Link  (The appetizer)

Direct YouTube link (The main course)
 
The reward of good programming  (Dessert)



More than coach has time to digest at this red hot moment.  But I would like to know how he addresses calibrating two light sensors.  I have not taken the time to dig out that nugget.  It could be very useful to us if it is not overly complex.

Sunday, November 20, 2011

Blaue Ratte

This is how it's done by 6:30pm :P

Can we do this first instead of the fish?

-Justin

Take two....with a Move block replacing the Motor block.  Using GOM Player you can see it is smoother.  And now we hit the ball on the way back, just in case it is still there.

Breakthrough. MOVE vs MOTOR Block learning!



As I was helping a rookie team program, I went back to our favorite mentor, the Cougar Robotics website, to ensure I was correct on one point.  But I found I was incorrect.  It was the old Move vs. Motor block discussion.  It has been the Yappin' Yodas' rule to always use a Motor block when a single motor is being used to turn.  Yet I reviewed the Cougar Bots' presentation and they use a single-motor Move block to effect turns.  Hummm #1.


We believed the primary difference was....


Move blocks have an "internal" PID (Proportional Integral Derivative) correction formula; by using the rotation sensors on each motor, the Move block ensures both motors complete the specified degrees.

The Motor block does not have this correction, so to avoid confusion we always used a Motor block when turning with one motor and then switched back to a Move block when driving straight with both motors.
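As a rough illustration of the synchronization idea only (this is not LEGO's actual formula, just the concept, with invented numbers): the block can compare the two rotation counts and trim power toward whichever wheel has fallen behind.

# Concept sketch of straight-line sync -- not LEGO's real code.
def synced_powers(base, deg_b, deg_c, gain=0.5):
    """Slow the wheel that is ahead, speed up the one that is behind."""
    diff = deg_b - deg_c          # positive if B has turned farther
    return base - gain * diff, base + gain * diff

# B has crept 4 degrees ahead of C:
print(synced_powers(60, 504, 500))   # -> (58.0, 62.0)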



As I did more research I read what LEGO Engineering had to say, and it seems they suggest using the Move block when precision is your objective.  And if that is not enough of a "where have you been?", check out this beautifully presented piece on the difference between the two.  Hummm #2.



This just goes to show how we can all learn from each other, or in this case, "unlearn" what we have believed to be true and "relearn" a better approach.

So, which would be better in our Advance To Line program?  A Motor or a Move?  Hummm #3.


Mechanical advantage....and the Fridge Truck

Recall we had explored an option to LIFT the fridge truck with our motor.  In short, the cool magnet holder had promise, but the motor did not have the torque needed to lift this heavy table element.  We opted not to explore gears given the time constraint.

Well, Brian Davis had some simple gearing ideas that were of interest and might be usable without too much modification of our existing attachments.  We gave one a try, and with the battery at 6.7V we accomplished the following lift.  It still needs some exploration.

Two useful resources for gears.
Worm gear casings
Tutorial on gears.....the above video uses a 12-tooth gear driving a 20-tooth gear...what is the mechanical advantage???  (more than 1, less than 2)  How can we get more?
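A quick check on that question, using the tooth counts from the video.  The second calculation is a hypothetical compound example, just to show one way to get more.

# Mechanical advantage of a gear pair = driven teeth / driving teeth.
print(20 / 12)                 # 12-tooth driving a 20-tooth -> about 1.67

# Ratios multiply when stages are compounded (hypothetical example):
print((20 / 12) * (20 / 12))   # two such stages -> about 2.78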

ATL and light sensor variability discussion

Advance to Line...or Advance to White (ATW), as we call it now.  The program relies on two light sensors.  It is important that these two light sensors measure reflective values identically.  But, with any sensor, LEGO or otherwise, there will be variability.  Put two thermometers next to each other and tell me if they give exactly the same reading!

We have another obstacle: the NXT-G 2.0 software's built-in calibration block only lets us calibrate one port, one sensor.  So, if we calibrate P3, those values will also be used for our second light sensor that may be plugged into P2.  That's not optimal.  So we have manufacturing variability plus a software limitation.

What other variables can you think of?  Ambient light, brown vs blue, non solid dark color, flash bulbs, battery power, solar flares????  Our challenge as engineers is to reduce the impact of variables on our robot.

Raw values vs. calibrated values in NXT-G.  I read that the light sensors have a raw value range of 0 to 1024, where dark is 0 and bright is 1024.  Personally, I have taken a brick, put in this little program with the brick connected to the computer, and watched the reflection value.  I still get 0 to 100.  I put it up to a light bulb and I get the max value of 100.  If you can figure out RAW, please let me know.  I am assuming the test program below ensures I get a "non-calibrated" reflectance value.  Educate me if I am wrong.

(Update:  OK, we figured out how to get RAW values.  You need to expand the data ports; you will find an 'Intensity' plug, which is on a 0 to 100 scale, and you will also see RAW.  Wire this raw data plug to your screen output and poof!  We now know where to get raw LS readings!)
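My working assumption, which I have not verified against LEGO's firmware, is that the calibration block simply rescales the raw reading between the stored dark and light raw values.  Here is that assumption sketched in plain Python with made-up raw numbers.  If it is roughly right, it also explains the P1/P2 problem: apply P1's endpoints to a slightly different P2 sensor and the same surface lands on a different percentage.

# ASSUMPTION ONLY: calibrated intensity as a rescale of the raw value
# between the stored dark/light endpoints.  All numbers are made up.
def calibrated_intensity(raw, raw_dark, raw_light):
    """Map a raw reading onto 0-100 using the calibration endpoints."""
    pct = 100 * (raw - raw_dark) / (raw_light - raw_dark)
    return max(0, min(100, pct))   # clamp to the 0-100 range

# Same physical surface, two sensors that read slightly differently,
# both forced to use the first sensor's endpoints:
print(calibrated_intensity(520, 400, 650))   # ~48
print(calibrated_intensity(560, 400, 650))   # ~64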

Experiment on XMAS1.

Justin and I tested the calibration program we used at the last regional.  I have not examined it closely to ensure it is correct; that is your job.  But here are some results after testing on the brown-to-white transition in front of pink bacteria.

Uncalibrated (UNC)
              dark    light    Threshold
Left            52      63       57.5
Right           53      66       59.5

Post calibration block (CAL)
              dark    light
Left             0      78
Right           24      98
Variance        24      20

Add some new light sensor shades Justin and I made.....

Uncalibrated (UNC)
              dark    light    Threshold
Left            50      63       56.5
Right           57      66       61.5

Post calibration block (CAL)
              dark    light
Left             0      82
Right           14     100
Variance        14      18
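For the record, the Threshold column is just the midpoint of dark and light, and the Variance row appears to be the right-minus-left difference.  A quick Python check against the first data set:

# Reproduce the uncalibrated thresholds and post-calibration variance
# from the first data set above.
def threshold(dark, light):
    return (dark + light) / 2

print(threshold(52, 63), threshold(53, 66))   # -> 57.5 59.5

cal_left, cal_right = (0, 78), (24, 98)
print(tuple(r - l for l, r in zip(cal_left, cal_right)))   # -> (24, 20)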

I'm just reporting the data; I'm not going to draw any conclusions.  That is your job.  Due to some of the "oddness," it would be good to repeat the testing a few times and see if we get the same results.  I would caution that the current light sensor calibration program might report the left sensor when it is actually the right.  So you don't want to jump to conclusions about why the bot tended to line up a little to the left before it went forward for pink bacteria.  You would want to dig into the calibration program and follow the wires on the bot before you start down that road.

Why the exploration or quest to understand?

Well, I encourage all of you to practice thinking about what happened vs. just getting mad at what happened.  That is a VERY HARD skill to learn; just ask an adult!  Those who think about why things happen, rather than just reacting to things that happen, are going to do well in life.  Focus the anger and frustration on figuring out why something failed.  And, with Google, forums, books, the built-in help in the software, and mentors, I can assure you most problems can be solved if you put in the effort!

But do we have time?  I'd like to explore using the File Access block to store readings and calculate a threshold value for each light sensor port, eliminating one of our variables.  It is unclear to me that shielding the light sensors buys us that much on our home mat.  It might buy us more on the competition mat.  Hummmm.

Reference material for this post.
Brian Davis' excellent tips document on how to use File Access blocks.
Some forum discussion on independently calibrating light sensor ports.
(12/11)  Another reference item to explore using File Access block.

Here are the light sensor shields in place.  You can't even see when the orange light turns on.  The team can decide if the design could benefit from shading in the back.


Click on either picture for full size image.

A picture from below.  It would seem to be in the clear and relatively strong.  I am sure it can be made stronger.  It will change the weight distribution on the robot a little.  We'll have to see if it causes the robot to behave differently due to weight.