01100100 01100101 01110110 00100000 01110101 01110000 01100100 01100001 01110100 01100101 00001010
(that's binary for dev update)
It moves! A poorly recorded demonstration of our manual controller option
Today is the day of our project alpha presentations. We've been hard at work all weekend to get our robot into a presentable state. We're excited to demonstrate what we've been working on for the past few weeks, and a lot has been done since our last update.
Getting Tangled in Threads
One of our goals was to make movement controls somewhat real-time for our robot. Doing so was a huge issue for us because we needed a way to override an infinite loop of "move forward" commands with a "turn left" command. While a button is pressed, we want to send constant requests for the robot to move while simultaneously listening for a new action that briefly halts the robot and starts a completely new command. This is all done with the Thread class.
Well, we didn't totally remember how to use threads at first, so we pulled some old labs with interacting threads from CS332 (Operating Systems). It was mostly a syntax refresher, but one meaningful takeaway from the labs is to store the threads you create in a thread-safe object, such as a Vector. With some tinkering, every new button press now creates a new thread and terminates the old one. It's important to keep only one thread running at a time; otherwise we get some nasty race conditions between two active threads (I got yelled at by Abby for trying to get the robot to move forward and backward at the same time).
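Here's a rough sketch of the one-active-thread idea, in plain Java. The class and method names are ours for illustration (not the actual app code), and logging to a thread-safe list stands in for actually sending requests to the robot:

```java
import java.util.Collections;
import java.util.List;
import java.util.Vector;

public class DriveController {
    private Thread current; // the single active movement thread
    // Thread-safe log standing in for "requests sent to the robot"
    private final List<String> sent = Collections.synchronizedList(new Vector<>());

    // A button press interrupts the old movement loop, waits for it to
    // finish, then starts a new loop repeating the new command.
    public synchronized void press(String command) {
        stop();
        current = new Thread(() -> {
            while (!Thread.currentThread().isInterrupted()) {
                sent.add(command); // stand-in for a network request
                try {
                    Thread.sleep(50); // pacing between repeated requests
                } catch (InterruptedException e) {
                    return; // interrupted mid-sleep: exit the loop
                }
            }
        });
        current.start();
    }

    // Halt the active movement thread, if any, and wait for it to die,
    // so two threads never drive the robot at once.
    public synchronized void stop() {
        if (current != null) {
            current.interrupt();
            try {
                current.join();
            } catch (InterruptedException ignored) {
            }
            current = null;
        }
    }

    public List<String> log() {
        return sent;
    }
}
```

Interrupting and then joining the old thread before starting the new one is what prevents the forward-and-backward-at-the-same-time race.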
It Understands Us!
Our current robot setup demo with voice recognition and refined app UI
Voice recognition is functional on our robot now! This is done using a SpeechRecognizer object (with an API provided thanks to the Android Studio gods) and a handful of intents. It wasn't too difficult to implement. All Android devices have a speech-to-text converter provided by Google that works with an online dictionary Google maintains. Since we have the misfortune of dealing with Augustana's strict network, we were lucky to find out that most Android devices also have offline English voice recognition. This completely avoids the need to pair up another device to Abby's wifi hotspot.
To tie this in with our robot controls, we created a new layout activity in Android Studio that has a microphone option and a stop button. Upon tapping the microphone, a command is spoken to the phone. Only five words are currently accepted in our dictionary: "move", "back", "left", "right", and "stop". These words directly correlate with the words used to command our robot. After a command has been spoken, the app checks if the word is an accepted term. If it is, the command is reprinted on the screen and the robot receives the message via a socket object.
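The dictionary check itself is simple. A minimal sketch (class and method names are ours, not from the actual app):

```java
import java.util.Set;

public class CommandFilter {
    // The five words the app currently accepts
    private static final Set<String> ACCEPTED =
            Set.of("move", "back", "left", "right", "stop");

    // Returns true if the recognized word should be sent to the robot
    public static boolean isValid(String recognized) {
        return recognized != null
                && ACCEPTED.contains(recognized.trim().toLowerCase());
    }
}
```

Anything the recognizer hears that isn't one of the five accepted words simply gets dropped before it reaches the socket.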
Our only caveat is that sending a voice command such as "move" will not also end the "move" thread when a new command is called. To combat that, a stop must be called between actions to prevent thread races. To save our precious vocal cords from saying "stop" so often (and to avoid the lag of sending an extra command), we implemented a stop button that can be pressed instead of voiced in between every command.
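The stop-between-actions rule can be sketched as a small helper that pads a sequence of commands with "stop" wherever two movement commands would otherwise run back to back (again, our own illustrative code, not the app's):

```java
import java.util.ArrayList;
import java.util.List;

public class StopInserter {
    // Insert a "stop" between consecutive movement commands so the old
    // movement thread is always halted before a new one starts.
    public static List<String> withStops(List<String> commands) {
        List<String> out = new ArrayList<>();
        for (String c : commands) {
            boolean needsStop = !out.isEmpty()
                    && !out.get(out.size() - 1).equals("stop")
                    && !c.equals("stop");
            if (needsStop) {
                out.add("stop");
            }
            out.add(c);
        }
        return out;
    }
}
```

So "move" followed by "left" becomes "move", "stop", "left", which is exactly what the stop button does for us by hand.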
Our Alpha App
Just a quick show of our current app design. It's pretty rough at the moment, so bear with us. Hopefully we'll have something more elegant to look at soon, but what we currently have is just enough to get the job done.
And Lastly, Looking into the Future
Some things we want to look at for the future of our robot (and for the final product) include mounting a phone on the robot to use its camera and stream video to another Android device. After that, we might add some image processing, e.g. face recognition.
Another possibility is making our robot artificially intelligent. With the robot dev team having taken the A.I. course in the past, we could find a way to work some of that into our robot.